Instagram's algorithm is built to show as many pictures as possible of scantily dressed men and women, claims a new study by AlgorithmWatch, a German research organization that tries to decipher how the algorithms shaping the content shown to social network users are built, in general and in the EU in particular.
The researchers – Judith Duportail, Nicolas Kayser-Bril, Kira Schacht, and Édouard Richard – developed a plugin for the Chrome and Firefox browsers that regularly opens the Instagram feed. The four recruited 26 volunteers who installed the extension, giving it a view of their feeds from February through May. Volunteers were asked to follow a sample of 37 professional content creators – 14 men and 23 women. The extension searched the first posts in the feed for images from the selected content creators, and whenever they appeared, it reported this to the researchers.
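To illustrate how such an add-on could work in principle, here is a minimal TypeScript sketch of a content script that scans the first posts in a feed for tracked creators and reports only their appearance. The selectors, handles, and endpoint are hypothetical assumptions for illustration, not Instagram's real markup or the researchers' actual code.

```typescript
// Hypothetical sketch of the add-on's core check: scan the posts currently
// visible in the feed and record which of the tracked creators appear.
// Selectors, handles, and the report endpoint are illustrative assumptions.

const TRACKED_CREATORS = new Set(["creator_a", "creator_b", "creator_c"]); // hypothetical handles

interface FeedReport {
  timestamp: string;
  creatorsSeen: string[]; // tracked creators found among the first posts
}

function scanFeed(maxPosts = 20): FeedReport {
  // Assumption: each post is an <article> whose header links to the author's profile.
  const posts = Array.from(document.querySelectorAll("article")).slice(0, maxPosts);
  const creatorsSeen = new Set<string>();

  for (const post of posts) {
    const authorLink = post.querySelector<HTMLAnchorElement>("header a[href^='/']");
    if (!authorLink) continue;
    const handle = authorLink.getAttribute("href")?.replace(/\//g, "") ?? "";
    if (TRACKED_CREATORS.has(handle)) creatorsSeen.add(handle);
  }

  return { timestamp: new Date().toISOString(), creatorsSeen: [...creatorsSeen] };
}

// Only the appearance of tracked creators is reported; nothing else about the feed.
function reportIfRelevant(report: FeedReport): void {
  if (report.creatorsSeen.length === 0) return;
  void fetch("https://example.org/report", { // placeholder endpoint
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(report),
  });
}

reportIfRelevant(scanFeed());
```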
The idea for the study, according to the researchers, came from two business Instagram users, one an entrepreneur and the other a writer. Both had noticed that photos they upload in minimal attire receive far more likes and reactions than images of a more ‘professional’ nature.
During the four-month study period, the researchers analyzed the posts produced by the selected creators that appeared in the volunteers’ feeds. Of 1,737 posts containing 2,400 images, 21% (326 images) showed shirtless men or women in swimsuits – but those posts accounted for 30% of the posts presented to volunteers. Broken down by gender, the researchers found that posts showing women in revealing attire appeared 54% more often than their share of the uploaded posts, and photos of shirtless men appeared 28% more often. In contrast, food and landscape photos appeared 60% less often than their share of the photos uploaded to the sampled accounts.
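To make the comparison concrete, the sketch below computes a simple over-representation ratio (share shown divided by share uploaded, minus one) from the 21% and 30% figures above. The formula is an assumption used for illustration; the gender-specific 54% and 28% figures come from the researchers’ own analysis, not from this calculation.

```typescript
// Over-representation of a category: how much more (or less) often it appears
// in volunteers' feeds than its share of the uploaded posts would predict.
// Formula assumed here for illustration: (share shown / share uploaded) - 1.

function overRepresentation(shareUploaded: number, shareShown: number): number {
  return shareShown / shareUploaded - 1;
}

// Figures from the article: ~21% of uploaded posts showed skin,
// but such posts made up ~30% of the posts volunteers were shown.
const skinPosts = overRepresentation(0.21, 0.3);
console.log(`Skin-showing posts: ~${(skinPosts * 100).toFixed(0)}% over-represented`); // ~43%

// Food and landscape photos, by contrast, were shown far less than uploaded.
const scenery = overRepresentation(1.0, 0.4); // illustrative shares giving the reported -60%
console.log(`Scenery posts: ${(scenery * 100).toFixed(0)}% (under-represented)`);
```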
According to the researchers, the data leave no doubt that Instagram’s algorithm prioritizes revealing images – what Kayser-Bril called ‘soft porn’ – which pushes users who want to ensure maximum exposure to upload more pictures of this kind.
Facebook, for its part, said: “This research is flawed in a number of ways and demonstrates a lack of understanding of how Instagram works. We rank posts in your feed based on the content and accounts you have shown interest in, rather than on arbitrary factors such as the presence of swimwear.” However, the researchers noted that a patent filed by Facebook in 2011 and published in 2015 described exactly that, stating that “(the algorithm) may identify the number of people in the photo, (…) the sex of the people in the photo, (…) faces of social network users, (…) and estimate how scantily the people in the picture are dressed by identifying large areas of skin color.”
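The patent describes a simple heuristic; the sketch below illustrates the general idea using a well-known textbook skin-tone rule applied to raw pixel data. It is an assumption for illustration only, not Facebook’s actual method or thresholds.

```typescript
// Rough illustration of the patent's idea: estimate "state of undress" by the
// fraction of pixels that fall in a crude skin-tone range. The RGB thresholds
// are a common textbook heuristic, not thresholds used by any real system.

function skinPixelFraction(pixels: Uint8ClampedArray): number {
  // `pixels` is RGBA data, e.g. from CanvasRenderingContext2D.getImageData().data
  let skin = 0;
  const total = pixels.length / 4;

  for (let i = 0; i < pixels.length; i += 4) {
    const [r, g, b] = [pixels[i], pixels[i + 1], pixels[i + 2]];
    const maxC = Math.max(r, g, b);
    const minC = Math.min(r, g, b);
    // Classic RGB skin-tone rule: warm, bright, red-dominant pixels.
    const isSkin =
      r > 95 && g > 40 && b > 20 &&
      maxC - minC > 15 &&
      Math.abs(r - g) > 15 &&
      r > g && r > b;
    if (isSkin) skin++;
  }
  return total === 0 ? 0 : skin / total;
}

// Under such a heuristic, a high fraction of skin-coloured pixels would suggest
// a scantily dressed subject; it says nothing about the actual content of the image.
```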
The researchers are now looking to expand the study and recruit additional volunteers to install the extension. They emphasize that the information collected is not personally identifiable, and that the extension reports only the images from the selected accounts that appear in the feed. Each time it finds such images, it also reports which other accounts the user follows, but the account names are encrypted in such a way that the researchers cannot tell which accounts are involved – only whether two users who follow similar accounts see the same images from the accounts used as test cases.
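The article does not specify the scheme, but a salted one-way hash is a common way to achieve this kind of protection. The sketch below, with an assumed salt and SHA-256, shows how hashed follow lists could be compared for overlap without revealing the underlying account names; it is an illustration, not the researchers’ actual implementation.

```typescript
import { createHash } from "crypto";

// Hash an account name so the raw handle never leaves the volunteer's device.
// A per-study salt (assumed here) hinders trivial dictionary lookups while
// keeping hashes comparable across volunteers.
const STUDY_SALT = "example-study-salt"; // illustrative value

function hashAccount(handle: string): string {
  return createHash("sha256").update(STUDY_SALT + handle.toLowerCase()).digest("hex");
}

// Researchers can measure how similar two volunteers' follow lists are
// without ever learning the underlying account names.
function followOverlap(followsA: string[], followsB: string[]): number {
  const hashedA = new Set(followsA.map(hashAccount));
  const hashedB = followsB.map(hashAccount);
  const shared = hashedB.filter((h) => hashedA.has(h)).length;
  return shared / Math.max(hashedA.size, new Set(hashedB).size, 1);
}

console.log(followOverlap(["travel_blog", "chef_anna"], ["chef_anna", "gym_life"])); // 0.5
```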