Hardly a day goes by without the media reporting the latest discoveries in medical research. While in some cases the media present the facts as they are and give a faithful account of a new study's results, they often slip into sensationalism, or distort, sometimes deliberately, the study's findings. Given the reports you are used to seeing, you may be surprised to learn that scientific studies in no way allow a hypothesis to be confirmed or refuted with certainty. In reality, medical research is based on probabilities (the chance that a given event will or will not occur), not on certainties.
The Right Examples
Take the example of a recent study in which 70% of patients who received the treatment saw their condition improve significantly. This percentage gives an idea of the probability of improvement among treated patients, but it does not make it possible to predict with certainty the outcome of treatment for any given patient. If a treatment has been shown to work in 7 out of 10 patients, you will not know whether it works for you until you try it. This is an important point for evidence-based medicine in the pharmaceutical field.
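The distinction above, between a predictable group-level rate and an unpredictable individual outcome, can be sketched with a small simulation. This is an illustrative toy model, not from the study itself; the function name, patient count, and 70% figure are taken or assumed for the example.

```python
import random

def simulate_treatment(n_patients: int, p_improve: float, seed: int = 0) -> float:
    """Simulate n_patients, each improving independently with probability p_improve.

    Returns the observed fraction of patients whose condition improved.
    """
    rng = random.Random(seed)  # seeded for reproducibility
    improved = sum(rng.random() < p_improve for _ in range(n_patients))
    return improved / n_patients

# Across a large group, the observed rate settles close to 70%...
group_rate = simulate_treatment(10_000, 0.70)

# ...but a single patient's outcome is just one coin flip, improved or not,
# and cannot be predicted in advance from the 70% figure alone.
one_patient_improved = random.Random(42).random() < 0.70
```

Running this shows `group_rate` landing near 0.70 for a large sample, while `one_patient_improved` is simply `True` or `False`: exactly the gap between what a study's percentage tells you and what it can promise any individual.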
On the other hand, if another treatment produced the desired effects in 9 out of 10 patients, you know you are more likely to benefit from it. How many times have you read or heard headlines like these?
The Cancer Issues
"This Drug Causes Cancer", "A Certain Food Prevents Dementia" or "A Certain Activity Makes You Lose Weight"? Now, here is what should really make headlines: "This Drug Could Cause Cancer in Mice"; "Such a Food Could Prevent Dementia Among Middle-Class Norwegians Who Take Aspirin"; "Such an Activity, When Paired with a Healthy Diet, Causes Certain Types of People to Lose Weight".
- The point is, most health-related information is extremely complex. Journalists do not necessarily have the time to read and analyze research articles in their entirety and may rely on brief press releases to prepare their stories. Besides, some of them lack the skills or experience to critically examine the complex data scientists present in these articles. It is very difficult to determine how reliable all the reports on a given subject are, since they diverge to such an extent that they can sometimes seem contradictory.
- So, headlines one day say that antidepressants increase the risk of suicide in adolescents, and the next day that these drugs cause a decrease, not an increase, in the frequency of suicides in this population. When in doubt about the validity of a story, it may be useful to ask the following questions.
- Is the report intended as promotion?
- Does it sink into sensationalism to capture the public's interest?
- Does it contain enough data to assess its claims?
- Does it offer different perspectives, or is it limited to a single point of view?
- Does it present the opinions of independent experts who were not involved in the study?

If a report is particularly biased (whether very positive or very negative), you should automatically doubt the validity of the information it contains. Likewise, if the study's results are not placed in context, or are not presented as probabilities, it is wise to question the information presented.