The Interview Series
September 18, 2020

YouTube Algorithms and Violence Against Girls: Insights from Dr. Kyra Gaunt

Dr. Kyra Gaunt is an ethnomusicologist at the University at Albany. She shares insights on racism and sexploitation on YouTube—and offers advice on how parents and kids can navigate the online world.

Dr. Kyra Gaunt—or Professor G as she’s known to her students at the University at Albany—is an ethnomusicologist and the author of The Games Black Girls Play: Learning the Ropes from Double-Dutch to Hip-Hop. She was chosen as a member of the 2009 inaugural class of “world-changing” TED Fellows and offers professional consulting services and testimony in state and federal court cases as a federally certified social media expert. Her work has contributed to the emerging field of Black girlhood studies and hip-hop feminism, and her upcoming book, PLAYED: Twerking at the Intersection of Music, YouTube, and Violence Against Black Girls, takes a deep dive into the ways music reproduces racism and sexploitation online.

In part two of our interview, Dr. Gaunt discussed some of her work on YouTube, streaming and music—and how algorithms specifically affect children and Black girls. Below are Dr. Gaunt’s insights on each topic, as told to our Chief Growth Officer, Brittany Skolovy.

On inappropriate content and sexploitation on YouTube:

I had done various projects with people through Twitter for International Women’s Day—across five different continents, curating stuff for little books that I created. I ran across a video that featured Nicki Minaj. It’s Nicki Minaj, so it was pretty explicit.

Now I don’t have kids, but I think kids should be protected from that. YouTube is a general audience platform, so I decided to file a protest with the FCC, because back then I didn’t know that social media platforms were monitored by the FTC and not the FCC.

I spent a few months building a case for filing complaints, and as I got going, I found out there was no monitoring of user-generated content. Then 2013 was a year of twerking. Nicki Minaj was twerking in her video, and I wondered if girls were twerking on YouTube, too. I discovered that there are lots of girls under 13 twerking from their bedrooms. Then I noticed, particularly for Black girls, that the comments below those videos were highly sexualized.

There would be girls who were eight or nine years old. We’d study the way that they marketed themselves. The name of the video might say “eight-year-old twerking.” And I thought that was weird—an eight-year-old talking about themselves in the third person. The other videos generally had names like “me and my cousin,” or “me and my sister dancing,” but to do it in the third person was odd as far as psychological age goes.

What we began to discover was that those videos were often being downloaded and uploaded by male users to monetize their own channel, while still slut-shaming the subjects in the video. One user had 400 videos in a playlist titled “tween twerking.” It felt like digital sex trafficking. I realized there was no one monitoring this or addressing the harm that it’s doing to girls from a sexual violence standpoint.

Or even the boys: there was an eight-year-old white boy who was in his bedroom rapping to these sexist lyrics, doing this cultural ventriloquism of objectifying women. You know, anybody who wants to exploit him could come along and say, “Wow, you are the next Eminem. Why don’t you meet me at the 7-Eleven? I’m going to be Scooter to your Justin Bieber.” Once I realized that the FCC wasn’t in charge—but there were community guidelines—I would flag the videos and learn that process. And I never heard back from YouTube. Never.

On algorithms, the attention economy and monetization:

My book is about the intersection of twerking, the way the music is being monetized through streaming, and the way artists will exploit girls’ games in order to gain attention in the attention economy. There’s this halo effect for artists and girls because they know when Cardi B and Megan Thee Stallion release a song, there’s a whole slew of algorithms that are going to pull their videos in. That’s the monetization of music. YouTube is profiting off of the content of very young girls. And before last year when the fine came from the FTC, they were practicing plausible deniability.

Another issue is the violence against women, and thereby, the violence against girls. The interface looks very much like free porn. And the way people engage with it is like porn. That means that for these very young girls, their first encounter in the public sphere is people treating them like prostitutes. Even though they don’t engage with it, of course, they’re reading it.

So now I have a whole ecology that I’m trying to write about. It’s like a complex web. It’s taken me a long time, sometimes with a lot of the same kind of trauma that content moderators experience, especially because you can’t help anybody. You see a 30-year-old man commenting under one of the videos, approaching a 12-year-old with classic grooming language.

I’m trying to figure out how we can shed light on this and get people to start coming forward. But I have no idea how to do that, especially among marginalized girls who are already feeling inundated with stigmatization, hypersexualization, hyper-invisibility and hyper-visibility. So, I’ve been thinking of ways to get them to create content that is more empowering, like taking those same twerking videos of girls and writing new lyrics and their own music. Or we could work on finding a more productive way to shift their lens and practices to make content that is more healing instead of self-objectifying.

On the FTC fine and YouTube Kids:

Although the $170 million fine from the FTC was a good thing to call attention to the problems on YouTube, it was 0.002% of the valuation of Alphabet. It was like a slap on the wrist for them. But they redesigned the YouTube Kids app with some age gating, which was nice. The problem I worry about is the same one that came up when YouTube Kids first launched—and that is that they’re pulling from the same archive of data and mistakes will be made. Kids will encounter sexist, racist and homophobic videos, and cartoons that are actually violent. They will seep in occasionally because it’s only algorithms monitoring them.

And since the comments are disabled and you can’t download or share videos that include children under 13, there are problems in the other direction. There is a scholar who does inherent bias work at Harvard. Because her name is Dolly Chugh, the algorithm thinks her videos are kids’ videos. You can’t download them and there are no comments. And given this moment with Black Lives Matter, engagement would be very valuable. But because this filter is not human, it doesn’t know how to learn the difference between “dolly” the toy and Dolly the person.

People are aware that they can game the system. I can name something that’s violent or insidious or pernicious “Dolly” to get it into the YouTube Kids platform.

If you’d like to learn more from Dr. Gaunt, check out her website!

