
How YouTube Spreads COVID Disinformation

Image Credit: Noun Project


290,000 Americans have already died from COVID-19, yet people protest wearing masks and are skeptical of vaccinations. How does disinformation manipulate people into risky behavior? Who gains from spreading disinformation that might result in avoidable deaths? Why does YouTube spread disinformation?

Background

This blog is based on discussions with Dr. Nitin Agarwal of UA Little Rock, an expert in disinformation. His research is supported by the U.S. National Science Foundation (NSF), the Office of Naval Research (ONR), the Army Research Office (ARO), the Defense Advanced Research Projects Agency (DARPA), the Air Force Research Laboratory (AFRL), and the Department of Homeland Security (DHS). Nitin directs the Collaboratorium for Social Media and Online Behavioral Studies (COSMOS), which researches how disinformation spreads, and participates in the national Tech Innovation Hub launched by the U.S. Department of State’s Global Engagement Center to defeat foreign-based propaganda.


Understand the threat

The Cyber Kill Chain model, developed by Lockheed Martin, describes the steps an adversary must complete to achieve their objectives. The same model helps recognize disinformation-based social engineering attacks designed to manipulate public opinion.
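To make the model concrete, here is a minimal Python sketch that maps the kill chain’s seven stages onto a hypothetical disinformation campaign. The stage names come from Lockheed Martin’s model; the campaign examples are illustrative assumptions, not findings from COSMOS.

```python
# Lockheed Martin's seven Cyber Kill Chain stages, paired with a hypothetical
# disinformation campaign. The right-hand examples are illustrative only.

KILL_CHAIN = [
    ("Reconnaissance",        "Profile target audiences and their grievances"),
    ("Weaponization",         "Produce sensational videos tailored to those biases"),
    ("Delivery",              "Upload to YouTube and seed links across social media"),
    ("Exploitation",          "Hook viewers with emotionally charged claims"),
    ("Installation",          "Convert viewers into subscribers and sharers"),
    ("Command and Control",   "Steer the audience through recommended playlists"),
    ("Actions on Objectives", "Shift public opinion, e.g. against masks or vaccines"),
]

for step, (stage, example) in enumerate(KILL_CHAIN, start=1):
    print(f"{step}. {stage}: {example}")
```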

Spreading disinformation

Social media is ideally suited to spreading disinformation: stories can be planted without the due diligence of professional journalists and then amplified through bots and paid workers. How big is YouTube’s role? Its recommendation algorithm determines about 70% of what people watch. YouTube makes money by selling ads, so the longer a viewer stays on the platform, the more money it makes, and it earns just as much serving ads next to disinformation videos as it does next to truthful content. Disinformation tends to be sensational and to play to people’s biases, making it more likely to hook viewers.
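A back-of-the-envelope sketch makes the economic point explicit. The rates below (ads_per_hour, revenue_per_ad) are made-up assumptions, not YouTube’s actual figures; what matters is that nothing in the formula depends on whether the content is true.

```python
# Toy ad-revenue model: revenue scales with watch time alone.
# Rates are illustrative assumptions, not real YouTube economics.

def ad_revenue(watch_minutes: float, ads_per_hour: float = 4.0,
               revenue_per_ad: float = 0.01) -> float:
    """Revenue depends only on minutes watched; truthfulness never enters."""
    return watch_minutes / 60.0 * ads_per_hour * revenue_per_ad

# A million minutes of watch time pays the same either way.
print(f"${ad_revenue(1_000_000):,.2f} from 1M minutes of truthful content")
print(f"${ad_revenue(1_000_000):,.2f} from 1M minutes of disinformation")
```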

How YouTube picks videos to recommend

YouTube, like other social media firms, makes money by selling ads and by collecting information on users that can be sold to marketers. The longer people stay on YouTube, the more ads they can be served. To keep them watching, YouTube recommends videos based on engagement signals: how many times a video has already been viewed and how many likes and comments it has received. Disinformation actors know this and manipulate the algorithm into recommending their videos by artificially inflating those signals with bots and paid workers, as sketched below.
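Here is a toy sketch of engagement-driven ranking. The weights are invented for illustration and are not YouTube’s actual algorithm, but they show how inflating views, likes, and comments lifts a video’s recommendation score.

```python
# Toy engagement-driven ranking. The weights are illustrative assumptions,
# not YouTube's real algorithm.

def recommendation_score(views: int, likes: int, comments: int) -> float:
    """Combine engagement signals into a single ranking score."""
    return 1.0 * views + 50.0 * likes + 100.0 * comments

organic = recommendation_score(views=2_000, likes=40, comments=10)
# The same video after a bot farm inflates every signal:
boosted = recommendation_score(views=150_000, likes=4_000, comments=900)

print(f"organic score: {organic:,.0f}")   # organic score: 5,000
print(f"boosted score: {boosted:,.0f}")   # boosted score: 440,000
```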

Disinformation videos can be detected from telltale signals: a large number of videos posted by a single account (dozens per day) and sudden spikes in views, subscribers, and comments.
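A minimal detection sketch based on those two signals might look like the following; the thresholds (24 uploads per day, a 10x view spike) are illustrative assumptions, not COSMOS’s actual detection criteria.

```python
# Flag an account as suspicious if it uploads "dozens" of videos per day
# or its latest daily view count spikes far above its recent average.
# Thresholds are illustrative assumptions.

def is_suspicious(uploads_per_day: float, daily_views: list,
                  spike_factor: float = 10.0) -> bool:
    if uploads_per_day >= 24:          # "dozens per day"
        return True
    if len(daily_views) >= 2:
        baseline = sum(daily_views[:-1]) / (len(daily_views) - 1)
        if baseline > 0 and daily_views[-1] / baseline >= spike_factor:
            return True
    return False

print(is_suspicious(uploads_per_day=30, daily_views=[100, 120, 110]))   # True
print(is_suspicious(uploads_per_day=2,  daily_views=[100, 120, 5000]))  # True
print(is_suspicious(uploads_per_day=1,  daily_views=[100, 120, 130]))   # False
```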

Follow the (blood) money

Blood money comes at the cost of someone else’s life. Beyond political motivations, where is the money in spreading disinformation?

Take Away

Shouting ‘FIRE!’ in a crowded building may land you in court, but some vendors will still sell the shouter a megaphone. The high-tech equivalent is a platform like YouTube that profits from distributing disinformation that may harm the public. There has to be more regulation in the public interest.

Firms advertising on YouTube should also verify that their ads are being seen by real humans and not just bots.
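As a rough illustration, an advertiser auditing its own impression logs might start with simple heuristics like the ones below. The log fields and the bot tells (near-zero watch time, scripted user agents) are hypothetical assumptions, not a real ad-platform API.

```python
# Hypothetical impression log audit. Field names and heuristics are
# assumptions for illustration, not a real ad-platform interface.

impressions = [
    {"ip": "203.0.113.7",  "watch_seconds": 31, "user_agent": "Mozilla/5.0"},
    {"ip": "198.51.100.2", "watch_seconds": 0,  "user_agent": "python-requests/2.25"},
    {"ip": "198.51.100.2", "watch_seconds": 0,  "user_agent": "python-requests/2.25"},
]

def looks_like_bot(imp: dict) -> bool:
    # Near-zero watch time and scripted user agents are common bot tells.
    return imp["watch_seconds"] < 1 or "python-requests" in imp["user_agent"]

human = [imp for imp in impressions if not looks_like_bot(imp)]
print(f"{len(human)} of {len(impressions)} impressions look human")
```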

Deepak
DemLabs




DemLabs is a project of the Advocacy Fund



