YouTube Keeps Recommending Similar Videos When You “Dislike” One, Research Shows

When YouTube’s powerful recommendation algorithm steers viewers toward videos they don’t want, the company urges them to click the “Dislike” or “Not Interested” buttons, which are meant to tell the site’s software to stop surfacing that kind of content.

But a study released on Tuesday by Mozilla, the organisation behind the Firefox web browser, found that those buttons do little to remove unwanted videos from the personalised recommendations YouTube serves to viewers.

For instance, according to Mozilla, clicking the “Not Interested” button prevented only 11% of recommendations for similar videos. The “Dislike” button fared barely better, at 12%. The most effective control was the “Don’t recommend this channel” button, though even that worked only 43% of the time. To judge whether two videos were similar, Mozilla worked with research assistants from the University of Exeter to build a machine learning model that compares factors such as subject matter and political stance.
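
Mozilla’s report doesn’t detail how its similarity model works, but the underlying idea of scoring how alike two videos are can be sketched briefly. The Python below is a hypothetical illustration, not Mozilla’s method: it embeds video titles with an off-the-shelf sentence-transformer and treats a high cosine similarity as a match; the model name and the 0.75 threshold are assumptions.

    # Hypothetical similarity check; NOT Mozilla's actual model.
    # Embedding model choice and threshold are assumptions.
    from sentence_transformers import SentenceTransformer, util

    model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model

    def videos_are_similar(title_a: str, title_b: str, threshold: float = 0.75) -> bool:
        """Embed two video titles and compare them by cosine similarity."""
        emb_a, emb_b = model.encode([title_a, title_b], convert_to_tensor=True)
        return util.cos_sim(emb_a, emb_b).item() >= threshold

    # A rejected video versus a later recommendation:
    print(videos_are_similar(
        "Flat earth proof compilation: why the globe is a lie",
        "Top 10 flat earth arguments scientists can't answer",
    ))

In practice, a model like Mozilla’s would likely weigh metadata such as channel, topic tags and transcripts, not titles alone.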

Data from more than 20,000 users who voluntarily downloaded Mozilla’s RegretsReporter, a browser extension that grants Mozilla’s researchers access to people’s YouTube behaviour, served as the basis for this study. The project intends to make the opaque recommendation system employed by the Google-owned video site, which has been accused of directing users toward conspiracy theories, false information, and extremist content, more transparent.

In addition to the RegretsReporter data, Mozilla surveyed more than 2,700 users who had downloaded the extension about their experiences trying to influence YouTube’s recommendations. The largest group of respondents, nearly 40%, said they didn’t feel YouTube’s controls had any effect on the recommendations they were shown.

The main takeaway, said Becca Ricks, a senior researcher at Mozilla who co-authored the study, is that the data validated some of people’s concerns about not being in control. “Generally speaking, many undesirable videos do pass through,” she said.

Elena Hernandez, a YouTube spokesperson, disputed the report’s claims. “Mozilla’s research doesn’t take into account how our systems actually work, and as a result it’s difficult for us to extract many insights,” Hernandez said in a statement. “We provide users control over their recommendations, including the option to prevent a certain video or channel from being recommended to them in the future. It’s important to note that our controls don’t eliminate entire topics or points of view, because doing so can create echo chambers for viewers.”

YouTube’s algorithms recommend content on the site’s homepage, inside the video player, and after a video finishes playing. Each suggestion is tailored to the individual viewer, drawing on signals such as their viewing history, channel subscriptions and location. The recommendations can be harmless, such as another live performance by the band someone is watching, but critics assert they can also steer viewers toward fringe and harmful content.

YouTube’s “Dislike” button has generated debate before. After YouTube stopped displaying dislike counts on videos in November of last year, users criticised the company, angry that hiding the tally prevented viewers from publicly registering disapproval of a video.

For the study, users who installed the RegretsReporter extension saw a new “Stop recommending” button, built by Mozilla, in the top left corner of recommended videos. Clicking it triggered one of YouTube’s own feedback controls, such as “Remove from watch history” or “Dislike,” letting Mozilla test each of the available options. (For some users, the button triggered no control at all, serving as the experiment’s control group.)
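
The report doesn’t publish the assignment logic behind this setup, but the randomised design it describes can be sketched as follows. Everything here (the arm names, the hashing scheme, the helper function) is an assumption for illustration, not Mozilla’s actual code: each participant is deterministically mapped to one feedback action, with a do-nothing control arm.

    # Hypothetical sketch of assigning users to feedback arms;
    # arm names and scheme are assumptions, not Mozilla's code.
    import hashlib

    ARMS = [
        "dislike",                 # trigger YouTube's "Dislike"
        "not_interested",          # trigger "Not Interested"
        "dont_recommend_channel",  # trigger "Don't recommend this channel"
        "remove_from_history",     # trigger "Remove from watch history"
        "control",                 # button shown, but no feedback sent
    ]

    def assign_arm(user_id: str) -> str:
        """Deterministically map a user to one arm via a stable hash."""
        digest = hashlib.sha256(user_id.encode()).hexdigest()
        return ARMS[int(digest, 16) % len(ARMS)]

    def send_feedback(video_id: str, action: str) -> None:
        print(f"would trigger {action!r} on video {video_id}")  # hypothetical helper

    def on_stop_recommending_clicked(user_id: str, video_id: str) -> None:
        arm = assign_arm(user_id)
        if arm == "control":
            return  # control group: log the click, take no action
        send_feedback(video_id, action=arm)

Deterministic hashing keeps each user in the same arm across sessions, which is what lets a study like this compare arms over weeks of viewing.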

Mozilla first released the RegretsReporter extension in September 2020. Tuesday’s report is the second major piece of research to come out of the tool. The first set of findings, published in July of last year, found that YouTube’s algorithm sometimes recommends videos that violate the site’s own content rules.

Mozilla has scrutinised YouTube’s algorithm for years. In 2019, the organisation awarded a fellowship to Guillaume Chaslot, a former YouTube engineer and vocal critic of the company, to fund his research into the platform’s AI systems. Two years ago, Mozilla funded TheirTube, a project that lets users see how YouTube’s recommendations can differ depending on a viewer’s ideology.

YouTube’s handling of content has drawn fresh attention of late. At a Senate Homeland Security hearing last week, senior executives from Twitter, TikTok and YouTube testified about the posts and videos published on their platforms. Neal Mohan, YouTube’s chief product officer, fielded questions about harassment and hate speech. A day later, Mohan announced changes to YouTube’s policies on violent extremism, prohibiting content that recruits for or solicits support for extremist organisations, even when it isn’t tied to known terrorist groups.
