An experiment is a success if you learn something valuable from it. But when the results of an experiment give you insight, can you trust that insight?
Much has been written about confirmation bias, the tendency of the human brain to reject information that contradicts expectations and accept information that confirms them. A lesser-known cousin of confirmation bias is the Hawthorne Effect: the tendency of people to change their behavior because they know they are being watched. The Hawthorne Effect gets its name from a series of factory experiments in the 1920s, but it has proved enduring and tricky to avoid. If people know they are part of an experiment or some other change, they may consciously or subconsciously behave differently and skew the outcome of the experiment.
So, if we’re hardwired for these biases, what are the potential antidotes? One tactic is to design experiments that minimize psychological pressure on participants. Ideally, people should not know about the experiment at all; where that cannot be avoided, anonymity reduces the feeling of being watched. Experimenters should also review the underlying data and facts (ideally observing experiments live) and, most importantly, check back regularly after experiments are over. Have the results been maintained?
How often do we check back on the results of experiments, and do they hold up over time? Revisit the results and see how you and your team can continue to learn and find value.