Meta confirms Instagram issue flooding users with violent and sexual Reels



Meta has admitted to CNBC that Instagram is experiencing an error that floods users’ accounts with Reels videos that its algorithms would not normally surface. “We are fixing an error that caused some users to see content in their Instagram Reels feed that should not have been recommended,” the company told the news organization. “We apologize for the mistake.” Users have taken to social media platforms to ask whether others have also been flooded recently with Reels containing violent and sexual topics. One Reddit user said their Reels page was flooded with school shootings and killings.

Others said they were shown gore videos, such as stabbings, beheadings and castrations, as well as nudity, porn and outright rape. Some say they are still seeing such videos, even though they have enabled Sensitive Content Control. Social media algorithms are designed to show you videos and other content similar to what you usually watch, read, like or interact with. In this case, however, Instagram is showing graphic videos even to people who have not interacted with similar Reels, and sometimes even after the user has taken the time to click “Not interested” on a Reel with violent or sexual content.

A Meta spokesperson did not tell CNBC what exactly the error was, but some of the videos people reported seeing should not have been on Instagram in the first place, based on the company’s own policies. “To protect users … we remove the most graphic content and add warning labels to other graphic content so that people are aware it may be sensitive or disturbing before they click through,” the company’s policy reads. Meta’s rules also state that it removes “real photographs and videos of nudity and sexual activity.”

This article originally appeared on Engadget at https://www.engadget.com/apps/Meta-confirms-instagram-ssesue-mates-flooding-users-with- andrrs
