The routine of the Facebook moderators who have to view violent posts


A group of friends sets fire to a dog in the street; teenagers are forced into oral sex; a girl with a razor blade announces her own suicide on a live video stream; a newborn is beaten in its cradle by a relative.

Facebook has a department that monitors publications with hate speech, violence and inappropriate images.

The list is endless. It is the worst of what can be seen on Facebook, watched for eight hours a day, Monday to Friday, in exchange for a minimum wage.

Sergio, a young Brazilian who asked not to be identified, lived this routine for almost a year, until he quit his job reviewing reports of violence and hate speech on the Portuguese-language pages of the social network.

Since then, he says, he has become a "colder and more unfeeling" person in his life outside the internet.

"I would watch a live video in case someone was about to kill themselves," he explains.

His job was to decide, as quickly as possible, whether the aggressive posts reported by other users were tolerable or crossed the limits set by Facebook.

In his office, the target for each reviewer was to assess 3,500 reported photos, videos and texts a day: more than seven per minute, or one every 8.5 seconds.
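
As a quick sanity check of those figures, here is a minimal back-of-the-envelope calculation: a sketch that assumes the eight-hour shift described above, with results consistent with the article's rounded numbers.

```python
# Back-of-the-envelope check of the moderation pace described above,
# assuming the eight-hour, Monday-to-Friday shift the article mentions.
DAILY_TARGET = 3_500         # reported photos, videos and texts per reviewer
SHIFT_SECONDS = 8 * 60 * 60  # one eight-hour shift

per_minute = DAILY_TARGET / (SHIFT_SECONDS / 60)
seconds_per_item = SHIFT_SECONDS / DAILY_TARGET

print(f"{per_minute:.1f} items per minute")        # ~7.3: "more than seven"
print(f"{seconds_per_item:.1f} seconds per item")  # ~8.2 s, close to the quoted 8.5
```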

"It's impossible not to make human errors at that rate," says Sergio, who now works as a freelancer and decided to erase his tracks on the social network after getting to know it "from the inside."


Routine

According to Sergio, the working life of those who do this job for Mark Zuckerberg's empire has nothing to do with the common image we have of Silicon Valley offices.

Content moderators had to review one post every 8.5 seconds.

In a building with long rows of computers spread over several floors, Sergio and roughly 500 colleagues from all over the world spent their days assessing reports of pedophilia, nudity, necrophilia, suicide, murder, harassment, threats, weapons, drugs and violence against animals, in more than ten languages.

According to the former worker, in these review centers of the planet's most-used social network, mobile phones are prohibited, meal and bathroom breaks are monitored, and employment contracts provide for fines and legal action in the event of information leaks.

"It was like a call center, but without the phones. People were there to serve the customer: in this case, Facebook and all its users," he says.

On his computer, Sergio had access to an "alternative" timeline that showed only posts reported by other users, in random order, along with a menu of possible violations.

Moderators saw only the name of the post's author, without access to full profiles. Their mission was to remove the post, ignore it, or send it up for higher-level review, something that happened especially in cases of suicide or pedophilia, which in turn were forwarded to the authorities.
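
That three-way decision (remove, ignore, or escalate) can be pictured as a small triage routine. The sketch below is purely illustrative, not Facebook's actual tooling, and every name in it is hypothetical:

```python
from dataclasses import dataclass
from enum import Enum, auto

class Action(Enum):
    REMOVE = auto()    # the post crosses the network's limits
    IGNORE = auto()    # the report is judged tolerable
    ESCALATE = auto()  # sent up for higher-level review

@dataclass
class ReportedPost:
    author_name: str        # moderators saw only the author's name
    content: str
    alleged_violation: str  # chosen from the menu of possible violations

# Per the article, these categories went to higher-level review,
# and from there to the authorities.
ESCALATED = {"suicide", "pedophilia"}

def triage(post: ReportedPost, crosses_limits: bool) -> Action:
    """Route a reported post: escalate the gravest categories,
    remove what crosses the limits, ignore what is tolerable."""
    if post.alleged_violation in ESCALATED:
        return Action.ESCALATE
    return Action.REMOVE if crosses_limits else Action.IGNORE
```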


The moderators

The content reviewers, according to Sergio, are usually young professionals living abroad or unable to find work in their own fields. Most do not last a year in the position.

The pressure to meet targets surfaced, according to Sergio, in recurring meetings with supervisors.

"There were regular reports on moderation targets. The bosses sometimes seemed upbeat and tried to motivate us by saying we had saved X people from suicides or assaults that month," he says.

Moderators usually do not last more than a year in this job.

"But they also always said that keeping our jobs depended on meeting the daily targets, and they cited other offices with better results than ours; we never knew how long our office would last," he adds.

A week ago, announcing a 47% year-on-year increase in Facebook's revenue, which for the first time in its history surpassed the US$10 billion mark in a single quarter, Mark Zuckerberg promised to invest in "people and technology to identify bad behavior and remove false news, hate speech and other problematic content" from the network.

In May, Facebook's head of Global Policy, Monika Bickert, described the work of reviewers like Sergio in a text about the challenges of content moderation.

"They face an obstacle: understanding context. It is hard to judge the intent behind one post, or the risk implied in another. If someone posts a violent video of a terrorist attack, will it inspire people to imitate the violence, or to speak out against it? If someone writes a joke about suicide, is it a mere comment or a cry for help?"

The Brazilian confirms the difficulties, but says he could not discuss decisions with his superiors. "There was no room to think critically; the work had to be automatic and fast. It was a matter of following the manual, pressing the button and not asking many questions," he says.

Contacted by the BBC before the publication of this story, Facebook responded that it had decided "not to comment" on the former worker's statements.

The questions sent to the company requested information on working hours, targets and contracts for employees and subcontractors, as well as data on any psychological support offered to workers exposed to violent posts.


Deadened sensitivity

The most recent data from Facebook indicate that the network deletes nearly 300,000 posts each month.

"Seeing graphic content all day makes you lose your sensitivity to certain things, especially nudity. There were so many naked selfies, close-ups of penises, vaginas and nipples, that pornography lost its appeal for me," he says.

The moderators receive guidance on Facebook's new policies on posts, such as sharing images of women breastfeeding their babies.

As there is no consensus among different countries' laws on offensive content or hate speech online (phrases that can lead to imprisonment in places like Germany may be protected as free expression by the US Constitution), Facebook has created its own rules, treating race and ethnicity, religion, gender and sexual orientation as "protected categories".

Under the network's rules, reviewers are instructed to delete "any direct attack on people" based on these categories.

As for violence, the network stipulates that images of public interest, "such as human rights abuses or acts of terrorism," be kept when the posts express disapproval or raise awareness.

Posts shared "for the sadistic pleasure of celebrating or praising violence," on the other hand, should be deleted. Many cases caused disagreement among employees.
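
Read together, those published criteria compose into a simple decision rule. The sketch below is just one hypothetical reading of them, not Facebook's actual rules engine, and all field names are invented for illustration:

```python
from dataclasses import dataclass

# The "protected categories" named in Facebook's own rules.
PROTECTED_CATEGORIES = {"race and ethnicity", "religion",
                        "gender", "sexual orientation"}

@dataclass
class Post:
    is_direct_attack: bool         # "any direct attack on people"
    targeted_category: str | None  # which protected category, if any
    is_violent_image: bool
    celebrates_violence: bool      # shared "for the sadistic pleasure..."

def should_delete(post: Post) -> bool:
    # Direct attacks based on a protected category are deleted.
    if post.is_direct_attack and post.targeted_category in PROTECTED_CATEGORIES:
        return True
    # Violent images of public interest are kept when the post expresses
    # disapproval or raises awareness, deleted when shared to celebrate
    # or praise the violence.
    if post.is_violent_image:
        return post.celebrates_violence
    return False
```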


Shock

According to Sergio, what most shocked his colleagues was the cruelty of posts showing attacks on animals.

"One of them showed a slaughtering machine: a rope tied to a motor, with a cow tied to the other end. The rope pulled, and the cow was torn apart alive," he says.

Excessive exposure to violent images has already led Facebook moderators to develop anxiety disorders, sexual problems and panic attacks, according to several media reports.

The Brazilian, however, says he was affected to a lesser extent by the content of the images.

The social network has millions of users around the world.

"First, because I was born in Brazil, and our references for violence tend to be more aggressive than European ones, for example," he says.

"Also because I saw videos of brutality against children, minorities and animals every day, and you end up getting used to the images. For me, human cruelty in words, supporting attacks, preaching hate, mocking the victims in the comments, was always much worse."

The trauma led Sergio to shut down his own Facebook profile.

"So as not to get stuck in the bubbles, in the echo chambers where people hear only their own voices and those of people who agree with them, I decided to isolate myself," says the Brazilian.

"I didn't want to become one of those people who showed up in the reports."

Source: BBC