YouTube is a place where anyone in the world with an Internet connection can access a seemingly endless stream of video content. Missed last night’s game? Catch the highlights in seconds. Love cats? Behold a montage of felines knocking things off tables again and again.
But when taking an honest look at YouTube’s impact on our world, we can’t pretend it’s all Zion dunks and cat videos. Experts have raised some real concerns about the site. Do YouTube’s algorithms keep us from seeing differing points of view? In a sea of content, how do we separate what’s reliable from what’s not? And, although the Momo challenge turned out to be a hoax, how can we prevent the spread of harmful, violent content?
Let’s look at some of these concerns, and see what experts have to say about each.
Escaping the echo chamber
YouTube is undoubtedly one of the most influential social media platforms of all time. The website reports that its more than one billion users watch over one billion hours of content daily.
What’s helped make it such a success is its business model. While most of the content on the site is free for users, it’s important to remember that YouTube is a money-making business. Companies pay YouTube to show ads on its videos, and outside of a relatively new subscription option, this is how the site makes money.
This model is the same for virtually every social networking site. For the most part, it’s a good thing. I mean, who would still use Facebook, Twitter, and Instagram if they had to pay a monthly subscription fee?
Because revenue is flowing in from the ads on YouTube, the site is incentivized to keep you watching videos for as long as possible. To do this, YouTube developed algorithms that track the kinds of content you tend to watch and suggest more content like it in various places on the site. This adds some convenience to the user experience, but it also makes it easier to end up in what experts call an echo chamber.
Basically, when YouTube’s algorithms feed you only content that aligns with your interests, political opinions, and so on, you’re less likely to have your views challenged and more likely to become entrenched in your beliefs. It’s as if you’re in a room where all you hear are your own ideas echoed back to you.
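To see how this can happen, here’s a minimal Python sketch of the idea behind content-based recommendation. The video titles, topic tags, and the `recommend` function are all made up for illustration; YouTube’s real system is vastly more complex. Still, the core feedback loop is visible: the more you watch of one topic, the more of that topic you’re shown.

```python
from collections import Counter

# Hypothetical catalog: each video is tagged with a single topic.
CATALOG = {
    "Cat Knocks Over Vase": "cats",
    "Top 10 Dunks of the Night": "sports",
    "Kittens vs. Cucumbers": "cats",
    "Buzzer Beater Compilation": "sports",
    "Why Cats Purr": "cats",
}

def recommend(watch_history, n=3):
    """Suggest the n unwatched videos whose topics best match what
    the user already watched -- a toy content-based filter."""
    topic_counts = Counter(CATALOG[v] for v in watch_history)
    unwatched = [v for v in CATALOG if v not in watch_history]
    # Rank unwatched videos by how often the user watched that topic.
    return sorted(unwatched,
                  key=lambda v: topic_counts[CATALOG[v]],
                  reverse=True)[:n]

# A user who has only watched cat videos gets... more cat videos.
history = ["Cat Knocks Over Vase", "Kittens vs. Cucumbers"]
print(recommend(history, n=1))  # ['Why Cats Purr']
```

Notice that nothing in the sketch is malicious: it simply optimizes for “more of what you already like,” and the echo chamber emerges as a side effect.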
Fortunately, this is something researchers are looking into, studying ways to integrate algorithms that recommend more diverse content—and balancing that with the need to keep users watching videos, plus, of course, ads.
False and troubling content
Beyond the echo chamber, YouTube must deal with the issue of keeping its content in check.
A 2018 Pew Research study found that 64 percent of users say they sometimes encounter videos that seem obviously false or untrue while using the site, while 60 percent said they sometimes encounter videos that show people engaging in dangerous or troubling behavior. And 61 percent of parents say they have encountered content on YouTube that they felt was unsuitable for children.
With these figures in mind, how do we know we can trust what we watch on YouTube—and that it won’t be violent or graphic? This is a question virtually every social media platform is asking itself—and scrambling to address.
Currently, YouTube has a moderating system in place that uses both algorithmic and human moderation. In the past couple of years, the site has made some real efforts to beef up its moderation, hiring more human staff to screen videos for violent or inappropriate content. Unfortunately, increasing the number of people screening content hasn’t, on its own, fully addressed these concerns (and it has raised mental health concerns for those tasked with sitting through hours and hours of violent content).
Some say there are problems with how YouTube set up the guidelines that determine what types of content are and aren’t allowed to be shared. Some YouTube channels have complained that videos which, in their view, don’t violate the rules have been taken down or demonetized (meaning the channel can no longer run ads on the video and earn a share of the profits).
While the site has made some progress updating its algorithms to ensure hateful or violent videos don’t show up as suggested content, for the time being, at least, YouTube is largely relying on its own users to help keep the platform a safe place to learn, share, and be entertained.
It’s only getting deeper
Even as we work to address these concerns, the plot is thickening. Consider this question: how do we identify the content of a video as fake when it’s coming straight from the mouth of a reputable source, or at least appears to be? This is the unique challenge posed by “deep fakes.”
Essentially, these are videos that feature a computer-generated version of someone saying or doing things they have never said or done, made possible by machine learning. It’s been said of deep fakes that they are “where truth comes to die.” So, yeah, you could say this is serious stuff. Think of the power that comes with being able to control the words coming out of the mouths of our world’s most powerful leaders.
But don’t panic just yet. Just as some are making use of new technology to do harm, some of our brightest minds are hard at work finding ways to combat deep fakes. In fact, a group of researchers from the State University of New York at Albany has already come up with a way to expose deep fakes using neural networks to track eye blinking.
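The researchers’ actual detector is a trained neural network, but the intuition behind it is simple enough to sketch. People at rest blink roughly 15–20 times per minute, while early deep fakes often barely blinked at all, because the still photos they were trained on almost always showed open eyes. Here’s an illustrative Python sketch of that intuition; the `looks_like_deep_fake` heuristic and its threshold are invented for this example, and a real system would analyze the video frames themselves rather than take a blink count as input.

```python
# Typical resting blink rate for adults is roughly 15-20 blinks per
# minute; early deep fakes often fell far below this.
NORMAL_BLINKS_PER_MINUTE = 15

def looks_like_deep_fake(blink_count, video_seconds, threshold=0.3):
    """Flag a clip as suspicious if its blink rate falls far below
    normal. A toy heuristic, not the researchers' actual detector."""
    blinks_per_minute = blink_count / (video_seconds / 60)
    return blinks_per_minute < threshold * NORMAL_BLINKS_PER_MINUTE

# One blink in a 60-second clip is far below the ~4.5/minute cutoff...
print(looks_like_deep_fake(blink_count=1, video_seconds=60))   # True
# ...while 16 blinks per minute looks like a real person.
print(looks_like_deep_fake(blink_count=16, video_seconds=60))  # False
```

Of course, once detectors like this became known, deep-fake makers started adding realistic blinking, which is exactly why the cat-and-mouse game described below demands both new technology and new policy.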
Deep fakes could turn out to be the biggest challenge social media sites like YouTube have ever faced. Successfully addressing it will depend upon how quickly we can create new policies and technologies that can meet the challenge.
This is what we do
If you’re feeling a bit pessimistic, you’re probably not alone.
These are some serious topics with the potential to impact the lives of not just millions of people, but billions. Keep this in mind, though: some of the technology at play here may be uncharted territory, but as humans, solving complex problems is kind of our thing.
And solving these kinds of problems will require brilliant, motivated people from all over the world working together. That’s why it’s so important for gifted and talented kids just like you to be able to pursue their passions, explore new topics, and find their place in a community of brilliant minds.