Fake “Likes” Cost Only a Few Dollars, Researchers Say
“A very #merry Christmas for all,” Margrethe Vestager, Europe’s top antitrust enforcer, wrote on Facebook last December. Her post drew 144 likes.
A few months later, as an experiment, researchers paid a company a few dollars to draw attention to her good wishes. Within 30 minutes, the post had 100 more likes. The researchers saw similar results with a holiday post on Ms. Vestager’s Instagram account and a Christmas tweet from Vera Jourova, another senior European official, whose work includes policing automated bots and other methods of manipulating social media platforms, according to a report published on Friday by researchers at the NATO Strategic Communications Center of Excellence. With a small amount of money, the researchers found, virtually anyone can hire a company to get more likes, comments and clicks.
The group bought engagement on the posts of prominent politicians such as Ms. Vestager and Ms. Jourova.
After four weeks, more than 80 percent of the fake clicks remained, the researchers said. And virtually all of the accounts used to generate the clicks were still active three weeks after the researchers reported them to the companies.
The report highlights the challenges that Facebook, YouTube and Twitter face as they try to combat online misinformation and other forms of manipulation. But it also draws new attention to an often overlooked weakness of internet platforms: the companies that sell clicks, likes and comments on social media networks. Many of those companies are based in Russia, according to the researchers. Because the social networks’ software ranks posts in part by the amount of engagement they generate, the paid activity can give posts more prominence.
“We spend a lot of time thinking about how to regulate the social media companies, but not so much about how to regulate the social media manipulation industry,” said Sebastian Bay, one of the researchers who worked on the report. “We need to consider whether this is something that should be allowed, and, perhaps more important, to be aware of how widespread it is.”
From May to August, the researchers tested the social networks’ ability to handle the manipulation-for-hire industry. They said they had identified hundreds of providers of social media manipulation services, many with significant sales, and made purchases from 16 of them.
“The openness of this industry is striking,” the report says. “Manipulation service providers advertise openly on major platforms.”
The researchers bought engagement on more than a hundred posts on Facebook, Instagram, Twitter and YouTube. They encountered “little to no resistance,” Mr. Bay said.
After the purchases, the researchers identified nearly 20,000 accounts that had been used to manipulate the social media platforms, and reported a sample of them to the internet companies. Three weeks later, more than 95 percent of the reported accounts were still active online.
The researchers directed most of the purchased clicks to social media accounts they had created for the experiment. But they also tested some verified accounts, such as Ms. Vestager’s, to see whether those were better protected. They were not, the researchers said.
To limit their influence on real conversations, the researchers said, they bought engagement only on politicians’ posts that were at least six months old and contained nonpolitical messages.
The researchers found that the big tech companies were not equally bad at removing manipulation. Twitter identified and removed more of it than the others, the researchers found; on average, half of the likes and retweets bought on Twitter were eventually removed, they said.
Facebook, the world’s largest social network, was best at blocking the creation of accounts under false pretenses, but it rarely removed the inauthentic content itself.
Instagram, which is owned by Facebook, was the easiest and cheapest to manipulate. YouTube, the researchers found, was the worst at removing inauthentic accounts and the most expensive to manipulate. The researchers reported 100 accounts used for manipulation in their test to each of the social media companies, and YouTube was the only one that suspended none of them; it offered no explanation.
Samantha Bradshaw, a researcher at the Oxford Internet Institute, a department of Oxford University, said social media manipulation could also distort political debate.
“Fake engagement, whether produced by automated or real accounts, can skew the perceived popularity of a candidate or issue,” Ms. Bradshaw said. “If these strategies are used to amplify disinformation, conspiracy and intolerance, social media can exacerbate polarization and distrust within society.”
Ms. Bradshaw, who reviewed the report independently, said one reason the accounts might not have been removed was that “they could be, in part, real people who are paid small amounts of money to like or comment on posts.” That strategy, she pointed out, makes it much harder for the platforms to act.
Still, she said, the companies could do more to track and monitor accounts connected to manipulation services. And they could suspend or remove accounts after multiple instances of suspicious, inauthentic behavior.
“Examining fake engagement is important, because it is not only fake accounts that pollute the environment,” Ms. Bradshaw said. “Real people with real accounts can produce inauthentic behavior, which distorts online discourse and generates virality.”
Released on Fri, 06 Dec 2019 05:00:08 +0000