The world’s largest video platform will start showing text from the crowdsourced site alongside videos that feature popular conspiracy theories.
YouTube has a new plan to help combat fake news on its platform.
Susan Wojcicki, CEO of the Google-owned video streaming giant, said that it will begin featuring text from Wikipedia and other third-party sources alongside videos focused on popular internet conspiracy theories.
“There are many benefits of text,” the executive told a crowd of SXSW attendees on Tuesday during an onstage discussion with Wired editor-in-chief Nicholas Thompson, who had seemed skeptical that the world’s largest video platform would look to promote text-based information. “As much as we love video, we want to make sure video and text work together.”
Wikipedia, like YouTube, is a predominantly user-generated platform that is itself susceptible to false information.
Wojcicki said that YouTube will focus on a list of the most well-known conspiracy theories. While she was onstage, she showed an example featuring a video about the moon landing, which some conspiracy theorists believe didn’t really happen. The feature, which will also include links to the third-party sites, is expected to roll out in the coming months.
YouTube has been looking for other ways to crack down on false or inappropriate content after receiving criticism over a number of high-profile instances of fake news on its site. The platform has taken pains to promote videos from trusted news sources, though conspiracies still sometimes make their way to the top of search results or onto the website’s trending page, as happened when a video calling Parkland, Florida, shooting survivor and activist David Hogg a crisis actor began trending.
To address some of the content moderation challenges that have plagued YouTube over the last year, Wojcicki announced in December that the company would grow its content moderation teams to 10,000 people in 2018 to better review the videos posted to its platform.
On Tuesday, Wojcicki explained that YouTube’s algorithms will be able to catch more content that shouldn’t be on the platform than human reviewers can, but she added that YouTube’s human moderators won’t go away. “We’ll always need humans,” she said, noting that YouTube is able to remove the majority of violent extremist content with machines. “We need humans to review it and make sure that it’s being done correctly.”
Part of YouTube’s challenge is balancing its role as an open technology platform with its responsibility to moderate the content uploaded into its ecosystem. That dilemma has become especially relevant over the last year, as YouTube has faced advertiser revolts over everything from inappropriate videos to exploitative children’s content.
Wojcicki acknowledged onstage that it is “tricky” for YouTube to figure out how to deliver fair and accurate information to users as quickly as possible. “What this year has shown me is how important it is for us to be able to … deliver the right information to people at the right time,” she said.
Asked at the beginning of her hourlong talk about how her philosophy about YouTube has changed in her four years there, Wojcicki said that she has started to think about it like a library “because of the sheer amount of video that we have” and the “ability for people to learn and look up information.”
The exec also touched on content moderation, admitting that YouTube can “get a whole lot better with comments.” But she said that comments are integral to the platform because of the culture around how YouTube stars relate to their fans. Said Wojcicki: “What differentiates YouTube [from TV] is it can be this two-way conversation.”