Wired Magazine Headline: How YouTube Can Rewrite the Past and Shape an Election
Subheadline: Philippine researcher Fatima Gaw says the platform has become a hub for pro-Marcos historical revisionism.
Note: The recent election in the Philippines shows that the social media giant YouTube did nothing meaningful to stop the disinformation and misinformation campaigns run on behalf of the Marcos family, the same family that killed dissidents and stole a large share of the Filipino people's common wealth.
The following is an interview with a researcher who followed this election closely and who tried to warn YouTube, and intercede with the company, about how it was going to throw this election to a young authoritarian whose campaign was busy erasing his family's sordid past and reconstructing a favorable narrative.
This interview is worth reading in transcript form. Let's hope Trump and Cambridge Analytica are not studying this playbook.
p.s. Ron Watkins, one of the two men behind 8chan and 8kun and widely believed to have written much of QAnon's content, was thrown out of the Philippines for running a porn website and server. Watkins is running for office in Arizona and has never been held accountable for all the lives he destroyed in America with his ad-libbed QAnon conspiracy theories, which gullible people ate up.
Knowing the Philippines well, it would not surprise me if he lent a hand in some way to Marcos. This is just speculation. But these are two sociopathic men with a lust for power, and both have used nefarious methods to help throw elections: Watkins in the USA with QAnon, and Marcos now back at the home base in the Philippines.
Have you seen YouTube take down any of the videos?
No, that’s actually the most frustrating part. Early in the election season, they said, “We’re going to really be serious in making sure that the election is fair and free.” But when it comes to actually taking action on the content on the platform, there’s really nothing happening, nothing meaningful. Even the historical disinformation I flagged two years ago is still there. In fact, because those channels were not taken down, their 500,000 subscribers have now grown to 2 million. So there’s this exponential gain on these channels and videos because they were left untouched by the platform.
If videos are popular they can get brand sponsorships. And because they have a lot of subscribers and they’re talking about a very salient topic, there are lots of views. And that’s paid for by YouTube—they’re kind of paying for disinformation.
[YouTube’s Ivy Choi says that the platform removes offensive content “as quickly as possible” and that it removed more than 48,000 videos in the Philippines during Q4 2021 for violating its Community Guidelines. YouTube says it is reviewing the specific channels flagged by WIRED, but that it reviews all of the channels in the YouTube Partner Program and removes those that don’t comply with its policies.]
Is this like, say, the right-wing or alt-right YouTube channels in the US?
It’s not like the alt-right network in the US, where you’ll see influencers making guest appearances on each other’s shows. What we’ve seen is that they echo the same narratives, but they don’t want to be technically associated with each other, because if videos are flagged for violating YouTube’s policies or community standards, it’s easier to take down the whole network when the channels are visibly connected.
Their connection is more subtle and algorithmic—they’re not mentioning each other per se. What YouTube will do is take the videos on a case-by-case basis. But even if you take down one or two videos, there are still hundreds left. Regardless of whether they mention each other, they are recommending each other. So if you’re watching, you would see the same people, the same message, referencing the same events and narrative. We see a lot of reposting, where one channel will repost the content of another influencer, but it’s a different kind of amplification. And if you take down one video but it has been reposted elsewhere, it will still exist on the platform nonetheless.
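[A rough sketch of what this repost-and-recommendation dynamic looks like in data terms: if you log which videos surface alongside which, you can build a co-recommendation graph and see that removing a single flagged video barely shrinks the connected cluster. The video names and edges below are invented for illustration only; they are not taken from Gaw's research or from YouTube data.]

```python
import networkx as nx  # pip install networkx

# Hypothetical "recommended alongside" pairs observed while browsing;
# real data would come from crawling watch pages, which is not shown here.
co_recommendations = [
    ("marcos_gold_myth", "martial_law_golden_age"),
    ("martial_law_golden_age", "bbm_debate_recap"),
    ("bbm_debate_recap", "marcos_gold_myth"),
    ("marcos_gold_myth", "repost_gold_myth_channel2"),  # repost of the same video
    ("repost_gold_myth_channel2", "martial_law_golden_age"),
]

G = nx.Graph()
G.add_edges_from(co_recommendations)
print("videos in cluster before takedown:", G.number_of_nodes())

# Simulate YouTube removing one flagged video, case by case.
G.remove_node("marcos_gold_myth")

# The reposted copy and the rest of the network remain connected,
# so the narrative keeps circulating on the platform.
largest = max(nx.connected_components(G), key=len)
print("videos still reachable after takedown:", len(largest))
```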
What do you mean when you say the videos are algorithmically connected?
We don’t know how YouTube’s algorithm works, and it changes all the time, but we can infer that there are things that signal to the algorithm that certain topics are connected. So in my Marcos disinformation research, you see that posters use the same keywords in their video titles, the same tags, signaling to the algorithm they’re talking about the same topics. They categorize themselves as “news, politics, or educational content,” even if they’re not educational content at all. They belong to the same self-reported genre, so they probably would be grouped together and recommended to each other. It’s also the timing of when they release the videos, around an event like a presidential debate.
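[As a hedged illustration of the signals Gaw describes, the sketch below scores how strongly two videos' metadata overlap: shared title keywords, shared tags, the same self-reported category, and uploads timed close together around an event. All of the metadata values are hypothetical and the scoring is a simplification; YouTube's actual recommendation features are not public.]

```python
from datetime import datetime

def jaccard(a, b):
    """Overlap between two sets of keywords or tags, from 0 to 1."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def metadata_similarity(v1, v2, hours_window=48):
    """Crude similarity score built from the signals described in the interview."""
    title_sim = jaccard(v1["title"].lower().split(), v2["title"].lower().split())
    tag_sim = jaccard(v1["tags"], v2["tags"])
    same_category = 1.0 if v1["category"] == v2["category"] else 0.0
    hours_apart = abs((v1["uploaded"] - v2["uploaded"]).total_seconds()) / 3600
    timing = 1.0 if hours_apart <= hours_window else 0.0  # e.g. around a debate
    return (title_sim + tag_sim + same_category + timing) / 4

# Hypothetical metadata for two videos from different channels.
video_a = {
    "title": "Marcos truth martial law golden age history",
    "tags": {"marcos", "history", "martial law", "philippines"},
    "category": "News & Politics",
    "uploaded": datetime(2022, 2, 3, 20, 0),  # evening of a presidential debate
}
video_b = {
    "title": "The real history of the Marcos golden age",
    "tags": {"marcos", "golden age", "history"},
    "category": "News & Politics",
    "uploaded": datetime(2022, 2, 4, 1, 30),
}

print(f"metadata similarity: {metadata_similarity(video_a, video_b):.2f}")
```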