YouTube advertises big brands alongside fake cancer cure videos

YouTube’s algorithm promotes fake cancer cures in a number of languages and the site runs adverts for major brands and universities next to misleading videos, a BBC investigation has found.


Searching YouTube across 10 languages, the BBC found more than 80 videos containing health misinformation – mainly bogus cancer cures. Ten of the videos found had more than a million views. Many were accompanied by adverts.


The unproven “cures” often involved consuming specific substances, such as turmeric or baking soda. Juice diets or extreme fasting were also common themes. Some YouTubers advocated drinking donkey’s milk or boiling water. None of the so-called cures offered are clinically proven to treat cancer.


Appearing before the fake cancer cure videos were adverts for well-known brands including Samsung, Heinz and Clinique.


YouTube’s advertising system means that both the Google-owned company and the video makers are making money from the misleading clips.


In January, YouTube announced it would be “reducing recommendations of borderline content and content that could misinform users in harmful ways—such as videos promoting a phony miracle cure for a serious illness.”


But the company said the change would initially only affect recommendations of a very small set of videos in the United States, and would not apply in languages other than English.


The BBC search covered English, Portuguese, Russian, Arabic, Persian, Hindi, German, Ukrainian, French and Italian.


We found, for example, that in Russian, a simple search for “cancer treatment” led to videos advocating drinking baking soda. Watching these videos in turn led to recommendations for other unproven “treatments” such as carrot juice or extreme fasting.


Erin McAweeney, a research analyst at the Data & Society institute, explained that because YouTube’s algorithm recommends videos similar to the one you have just watched, it is continuously “carving a path” from one video to the next, regardless of the credibility of the advice offered within.


“Someone can start out on a credible video and be suggested to watch a juice cure video next. A recommendation system doesn’t know credible from non-credible content,” McAweeney says.
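
McAweeney’s point can be illustrated with a small, purely hypothetical sketch. This is not YouTube’s actual system; the videos, tags and scoring below are invented for the example. It shows a recommender that ranks candidates only by topical overlap with the video just watched, so a credible starting point can still lead straight to non-credible content.

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    tags: set        # topical tags used for similarity
    credible: bool   # known here for the demo, but never consulted by recommend()

# A toy catalogue of invented videos.
CATALOGUE = [
    Video("Oncologist explains chemotherapy", {"cancer", "treatment", "medicine"}, True),
    Video("Baking soda 'cancer cure'", {"cancer", "treatment", "baking soda"}, False),
    Video("Juice cleanse to 'beat' cancer", {"cancer", "treatment", "juice"}, False),
    Video("Cooking with turmeric", {"turmeric", "cooking"}, True),
]

def recommend(just_watched, catalogue, k=2):
    """Rank the other videos purely by tag overlap with the one just watched."""
    candidates = [v for v in catalogue if v is not just_watched]
    return sorted(candidates,
                  key=lambda v: len(v.tags & just_watched.tags),
                  reverse=True)[:k]

watched = CATALOGUE[0]  # start on a credible cancer-treatment video
for rec in recommend(watched, CATALOGUE):
    print(rec.title, "(credible)" if rec.credible else "(not credible)")
```

Real systems use far richer signals, such as watch time and engagement, but the point stands: unless credibility is an explicit input, similarity alone keeps “carving a path” through related content, credible or not.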


YouTube has said that its recommendation system – which has been accused of leading users down rabbit holes of conspiracy theories and radicalisation – will change to suggest credible and trustworthy videos to people who are watching ones that might not be.


YouTube’s Community Guidelines ban harmful content including: “Promoting dangerous remedies or cures: content which claims that harmful substances or treatments can have health benefits.”


Many of the fake cancer cures the BBC found, such as juicing, were not in themselves harmful, but could indirectly damage a cancer sufferer’s health – for instance, if patients neglected conventional medical approaches in favour of the so-called cures.