Yes, it’s a thing. A thing where Facebook makes videos about science and posts them on Facebook. Yay! Science! By Facebook! It’ll be great!
Today’s vid told me that memories are inherited in your DNA because your parents’ DNA changes in response to different environmental stimuli. There was even a mouse experiment that proved it! OMG this totally explains why I get, like, really panicky in the presence of, like, books. Because my Mum was once given a series of electric shocks every time she opened a book!
I don’t even need to do any research at all to spot the first problem: epigenetic transmission of behaviours is about methylation. It does not change the DNA itself. If it did I’d be a fucking legend cricket player who could successfully load a dishwasher.
Actually, the mouse experiment is kind of interesting. There have been many experiments and much research into methylation and epigenetics, but that mouse experiment stands out because of one thing: the results were improbably perfect. This article pulls the experiments apart in a reasonably straightforward way that I’d probably understand in its entirety if I wasn’t such a fabulous cricketer:
An article reporting statistical evidence for epigenetic transfer of learned behavior has important implications, if true. With random sampling, real effects do not always result in rejection of the null hypothesis, but the reported experiments were uniformly successful. Such an outcome is expected to occur with a probability of 0.004.
0.004. Those are pretty small odds. The article basically takes a series of guesses as to how the reported results could have lined up so amazingly, coincidentally, completely with the researchers’ hypothesis, but what it makes clear is how research design is often quite shonky. Obviously drug companies edit out their failures, but I was a bit surprised to read this article detailing all the ways in which people bugger it up in other fields too:
How could the findings of Dias and Ressler (2014) have been so positive with such low odds of success? Perhaps there were unreported experiments that did not agree with the theoretical claims; perhaps the experiments were run in a way that improperly inflated the success and type I error rates, which would render the statistical inferences invalid. Researchers can unintentionally introduce these problems with seemingly minor choices in data collection, data analysis, and result interpretation. Regardless of the reasons, too much success undermines reader confidence that the experimental results represent reality. Even if some of the effects prove to be real, the findings reported in Dias and Ressler (2014) likely overestimate the effect magnitudes because unreported unsuccessful outcomes usually indicate a smaller effect than reported successful outcomes.
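If you want to see where a number like 0.004 comes from, the logic is simple enough to sketch: estimate each experiment’s statistical power (the probability it detects a true effect), then multiply those powers together to get the chance that every single experiment succeeds. Here’s a minimal illustration in Python; the power values are hypothetical round numbers I’ve picked for clarity, not the ones the article actually estimated from the reported effect sizes.

```python
import math

# Hypothetical per-experiment power values: the probability that each
# experiment, run honestly, rejects the null hypothesis when the effect
# is real. (The actual critique estimates power from reported effect
# sizes; these are illustrative placeholders.)
powers = [0.5] * 8

# Assuming the experiments are independent, the probability that ALL of
# them come up successful is the product of their powers.
p_all_successful = math.prod(powers)

print(round(p_all_successful, 4))  # prints 0.0039
```

So even with eight experiments that each have a coin-flip chance of working, a clean sweep should only happen about 0.4% of the time. When a paper reports nothing but successes at those odds, something other than luck is usually going on.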
Next stop: chaos theory, closed loop control systems and my fucking car.