The Mozilla Foundation, the maker of the Firefox browser, just launched a site featuring 28 user-submitted stories detailing incidents where YouTube’s recommendation algorithm led people toward disturbing videos.

Alphabet-owned YouTube has received a lot of flak this year over radicalization, pedophilia, and other offensive content. This is a significant issue, as 70 percent of YouTube’s viewing time comes from recommendations.

YouTube has previously denied suggestions that its algorithms deliberately promote extremist or harmful content because it increases viewing time or otherwise benefits the business. The company has tried to address the issue by showing “warning labels” and “knowledge panels” containing trustworthy information on videos that push misinformation and conspiracy theories.

But the truth appears to be a different story.

The #YouTubeRegrets project by The Mozilla Foundation aims to highlight this issue and urge YouTube to change its practices. The site features YouTube horror stories detailing incidents where YouTube’s recommendation algorithm displayed bizarre and horrifying videos that users had no interest in. Some of these recommendations feature conspiracies, racism, and even violence.

[Image: YouTube horror stories. Credit: The Mozilla Foundation]

The Mozilla Foundation pointed out that YouTube hasn’t given researchers the data needed to verify the company’s own claim that it has reduced recommendations of misinformation and harmful content by 50 percent.

With this initiative, Mozilla hopes to push YouTube to make its recommendation process more transparent. Hopefully, it will also help cut down on some of the most offensive videos on the Internet.