Report: Instagram’s video app recommends possible child abuse videos

Instagram appears to be recommending graphic videos to potential underage users.

IGTV, Instagram’s long-form video app that launched in June, reportedly recommended “sexually suggestive footage of young girls” and a video of genital mutilation during a three-week study conducted by Business Insider.

Two of the graphic videos, which reportedly had reached more than 1 million views, were taken down by Instagram five days after Business Insider reported them to the app. They were also reported to police by a British children’s charity.

Business Insider monitored the “Popular” and “For You” tabs on IGTV with multiple accounts, including an anonymous account created for the study that stated the user was 13. The site discovered a video, titled “Hot Girl Follow Me,” that showed what appeared to be a preteen girl in the process of removing her top just before the video ended.

The comments on that video and a similar one reportedly included angry users asking why the video had been recommended. Others described it as “superb” or “sexy.”

Another video discovered by Business Insider showed a penis being operated on by a motorized saw. Videos also showed strange footage of a monkey touching a crying baby and a “video of a woman pulling something long and bloody out of her nose.”

In a statement to Business Insider, IGTV said it had removed the videos and apologized to those who had viewed them.

“We care deeply about keeping all of Instagram—including IGTV—a safe place for young people to get closer to the people and interests they care about,” a spokesperson said.

“We have Community Guidelines in place to protect everyone using Instagram and have zero tolerance for anyone sharing explicit images or images of child abuse. … We take measures to proactively monitor potential violations of our Community Guidelines and just like on the rest of Instagram, we encourage our community to report content that concerns them. We have a trained team of reviewers who work 24/7 to remove anything which violates our terms.”

Instagram did not comment when asked by Business Insider how the algorithm works and why it was recommending such graphic videos. As of earlier this week, the accounts that posted the sexually suggestive videos were still active.

Instagram is owned by Facebook, which has had numerous issues with fake accounts spreading fake news and has convinced both liberal and conservative bases in the U.S. that the company is biased against them.

YouTube, which IGTV wants to compete with, has also faced extraordinary problems with its algorithms. Inappropriate videos featuring children and conspiracy theory videos flood the platform every time there’s a national tragedy.

The Business Insider report details the process for its investigation and what it found. Read the report here.

H/T CNET

The post Report: Instagram’s video app recommends possible child abuse videos appeared first on The Daily Dot.
