If you still use Facebook, you are probably familiar with its algorithmically produced "year in review" videos, along with the ones celebrating friendship anniversaries and birthdays: slideshows of your posts set to cheerful music. According to researchers at the National Whistleblower Center, Facebook's algorithms have reportedly generated similar promotional content for extremist organizations, including a business page for Al-Qaeda.

According to The Associated Press, the social media giant automatically created an animated, bubbly video on a local "business" page for Al-Qaeda, and the extremist content "auto-generated" by the platform included a celebratory jihadist video. While investigating the matter, the whistleblower also found auto-generated content for white supremacist groups and self-identified Nazis.

Al-Qaeda's local business page, "automatically" generated by Facebook's algorithm, had over 7,400 likes, and reports say it gave the group "valuable data" for recruiting. Facebook's tools built the page from job descriptions users had listed on their profiles, and it also copied the flags, images, and branding used by the extremist group.
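To see how that kind of auto-generation can misfire, consider a minimal, purely hypothetical sketch in Python. None of the names, fields, or thresholds below come from Facebook; the point is only the shape of the failure, where pages are created by grouping profiles on a free-text employer field with no review step between user input and the published page.

```python
from collections import defaultdict

# Hypothetical sketch of unreviewed page auto-generation.
# All names here are illustrative assumptions, not Facebook's code.

PAGE_THRESHOLD = 3  # create a page once this many profiles share an employer

def auto_generate_pages(profiles):
    """Group profiles by a free-text employer field and emit a page for
    every employer that clears the threshold -- including names a human
    reviewer would immediately flag."""
    by_employer = defaultdict(list)
    for profile in profiles:
        employer = profile.get("employer", "").strip()
        if employer:
            by_employer[employer].append(profile)

    pages = []
    for employer, members in by_employer.items():
        if len(members) >= PAGE_THRESHOLD:
            pages.append({
                "title": employer,                       # copied verbatim from user input
                "cover_photo": members[0].get("photo"),  # reuses user-uploaded imagery
                "likes": 0,
            })
    return pages

# Demo: three profiles listing the same banned organization as employer
profiles = [
    {"employer": "Al-Qaeda", "photo": "flag.jpg"},
    {"employer": "Al-Qaeda"},
    {"employer": "Al-Qaeda"},
]
print(auto_generate_pages(profiles))  # a page is published, no review involved
```

The missing step is the point: in a pipeline like this, nothing between the user-entered text and the published page checks the name against a list of designated terrorist organizations.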


Facebook blames imperfections in its algorithm

"After making heavy investments, we are detecting and removing terrorism content at a far higher success rate than even two years ago," Facebook said in a statement to the AP, conceding that its hate speech filters are not perfect. "We don't claim to find everything and we remain vigilant in our efforts against terrorist groups around the world."

Meanwhile, this supposedly more sophisticated system for removing terrorism content failed to recognize hundreds of personal profiles named after terrorist leaders, as well as the "Al-Qaeda" business page the whistleblowers identified on Facebook during their study.

The five-month study closely monitored the profiles of 3,000 people linked to organizations the US government has designated as terrorist groups. It found that many of these groups, including Al-Qaeda and the Islamic State group, were "openly" active on Facebook, and that the network's algorithms were continuously generating new content for them, including "celebration" and "memories" videos.

The whole infrastructure is fundamentally flawed

The study found that Facebook's automated systems removed only 38% of posts prominently featuring hate symbols, while its auto-generation features kept contributing content to pages with thousands of likes. It also found that extremists sometimes used Facebook's own tools to curate their media, which meant their images went completely unnoticed.
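One reason curated imagery can slip past detection is illustrated by the deliberately naive sketch below, which assumes matching against a blocklist by exact cryptographic hash. Production systems instead use perceptual fingerprints such as PhotoDNA, which Farid helped develop, precisely because exact hashing is this brittle: any re-encoding, crop, or overlay produces a new hash, so an edited copy of a known extremist image no longer matches.

```python
import hashlib

# Illustrative sketch, not Facebook's actual pipeline: checking images
# against a blocklist of exact SHA-256 digests. The digest below is a
# placeholder value, not a real blocklist entry.

BANNED_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_banned(image_bytes: bytes) -> bool:
    """Exact-match check: returns True only for byte-identical copies."""
    return hashlib.sha256(image_bytes).hexdigest() in BANNED_HASHES

original = b"...original image bytes..."
edited = original + b"\x00"  # a single changed byte yields a new hash

print(is_banned(original))  # True only if this exact file's hash is listed
print(is_banned(edited))    # always False for the edited copy, even if the
                            # original was banned
```

The study's numbers suggest that even more robust fingerprinting misses a large share of prominently displayed hate symbols, which is the gap extremists exploit when they curate their own media with the platform's tools.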

"The whole infrastructure is fundamentally flawed," said Hany Farid, a digital forensics expert at UC Berkeley. The more pressing challenge, he said, is that there is "little appetite to fix it" because Facebook and other social media networks are reluctant to take responsibility for material on their platforms, as doing so would open up "a whole can of worms."