Death, animal abuse and humiliation: this is what Facebook workers see

You may have wondered at some point who moderates the content that users upload to Facebook. Is it the social network's own employees? Is it a bot? The answer is neither: external companies contracted by Facebook are responsible for monitoring content that may be harmful to the network's huge community.

This is hardly surprising, since monitoring everything that users around the world share daily is no easy task. Far from it. Several teams of moderators work together to review all the content uploaded to the network. All of it, without exception.

The agonizing life of a moderator

In a report published by The Verge, Casey Newton spoke with three former workers of one of these external companies, Cognizant. As you might expect, their stories are frightening.

They describe the least friendly face of the social network: the one where certain users publish daily posts inciting hatred against immigrants, women or the LGBTQ community, videos or photographs of murders, or even child pornography.

According to the three former employees, nothing and nobody had prepared them for what they would see at the offices of one of Cognizant's branches in Tampa, Florida. Each day they were required to watch up to 400 videos, for at least 15 seconds apiece, before they could flag one as inappropriate. Even so, they often had to watch the same content again after it had been deleted, since in most cases it is re-uploaded from other accounts as soon as it is removed from the network.

To make matters worse, the level of stress in these jobs was notorious. On top of facing real atrocities on the social network, the teams had to maintain an accuracy rate of at least 98%. In other words, their reviews could not exceed an error rate of 2% when deciding whether content violated the network's rules. Otherwise, companies of this kind risk not having their contract with Facebook renewed.

Published on TuExperto on
2019-06-19 14:28:49

Author:
Raúl García

