
2/19/2019

YouTube announced it will no longer recommend "conspiracy" videos to users!




In a significant policy change, YouTube said on Friday that it planned to stop recommending "conspiracy theory" videos. The announcement comes in the form of a test that will initially affect only users in the United States. In an effort to curb "borderline content," the streaming site will recommend videos involving conspiracy theories less often. YouTube says that these videos spread misinformation and often rope viewers into extremist behavior.


By Kalhan Rosenblatt
YouTube has announced that it will no longer recommend videos that "come close to" violating its community guidelines, such as conspiracy theory videos or medically inaccurate videos.
On Saturday, a former engineer for Google, YouTube's parent company, hailed the move as a "historic victory."
The original blog post from YouTube, published on Jan. 25, said that videos the site recommends, usually after a user has viewed one video, would no longer lead just to similar videos and instead would "pull in recommendations from a wider set of topics."
For example, if a person watches one video showing a recipe for snickerdoodles, they may be bombarded with suggestions for other cookie recipe videos. Until the change, the same scenario applied to conspiracy videos.
YouTube said in the post that the action is meant to "reduce the spread of content that comes close to — but doesn’t quite cross the line of — violating" its community policies. The examples the company cited include "promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11."

The change will not affect the videos' availability. And if users are subscribed to a channel that, for instance, produces conspiracy content, or if they search for it, they will still see related recommendations, the company wrote.
Guillaume Chaslot, a former Google engineer, said that he helped to build the artificial intelligence used to curate recommended videos. In a thread of tweets posted on Saturday, he praised the change.
"It's only the beginning of a more humane technology. Technology that empowers all of us, instead of deceiving the most vulnerable," Chaslot wrote.
Chaslot described how, prior to the change, a user watching conspiracy theory videos was led down a rabbit hole of similar content, which was the intention of the AI he said he helped build.
According to Chaslot, the goal of YouTube's AI was to keep users on the site as long as possible in order to promote more advertisements. When a user was enticed by multiple conspiracy videos, the AI not only became biased by the content the hyper-engaged users were watching, it also kept track of the content that those users were engaging with in an attempt to reproduce that pattern with other users, Chaslot explained.
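The feedback loop Chaslot describes can be illustrated with a toy model: a recommender that ranks videos purely by accumulated watch time will drift toward whatever a small group of hyper-engaged users binges on. This is a minimal sketch under that assumption; the video names, users, and scoring rule are all hypothetical and are not YouTube's actual system.

```python
# Toy sketch of an engagement-maximizing recommender feedback loop.
# Assumption: videos are ranked solely by total watch seconds, so a few
# binge-watchers can dominate what everyone else gets recommended.
from collections import defaultdict


def update_scores(scores, watch_log):
    """Add each logged view's watch seconds to that video's score."""
    for user, video, seconds in watch_log:
        scores[video] += seconds
    return scores


def recommend(scores, n=2):
    """Return the n highest-engagement videos, recommended to all users."""
    return sorted(scores, key=scores.get, reverse=True)[:n]


scores = defaultdict(float)

# A couple of hyper-engaged users binge conspiracy content for hours,
# while many casual users each watch a few minutes of other videos.
watch_log = [
    ("binger1", "conspiracy_a", 3600),
    ("binger1", "conspiracy_b", 3600),
    ("binger2", "conspiracy_a", 3000),
    ("casual1", "cookie_recipe", 300),
    ("casual2", "news_clip", 240),
]

update_scores(scores, watch_log)
print(recommend(scores))  # the binged conspiracy videos top the rankings
```

Even though casual users outnumber the bingers, the watch-time objective lets the bingers' pattern set the recommendations shown to everyone, which is the bias Chaslot says the change is meant to counter.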




