Covid-19: People who get info from Facebook more likely to be anti-vaccine – study


People who source information from Facebook are more likely to be opposed to the Covid-19 vaccination, an unpublished study has found.
University of Canterbury student Amy Morahan researched the effect traditional media and Facebook have on attitudes and intentions towards the vaccine by surveying a nationwide sample of 484 people.
The study was done in August and September this year, and the initial findings – before the study was peer-reviewed – have been made available.
Morahan showed the participants seven pro-vaccination messages and three anti-vaccination messages.
Participants reported seeing the positive messages more frequently on traditional media platforms such as newspapers and online news sites.
Conversely, Morahan found the more people sourced vaccine information from Facebook, the more likely they were to hold a negative view of the vaccine.
“The more knowledge you have from Facebook, the less likely you are going to get the vaccine.”
The research also showed no demographic pattern among the 25 per cent of respondents who did not want to get vaccinated.
Morahan was surprised that age and education levels did not make a difference to those holding negative views, saying it was easy to assume less educated people would be more likely to be against the vaccine.
“That makes the effects of the media source even more important.”
Morahan is now compiling the data and will release a final report in February next year. She hoped the research would help communicators effectively reach vaccine-hesitant audiences.
She began the research after noticing a consistent pattern: positive messaging about vaccinations in traditional media versus negative posts on Facebook.
She said the findings were worrying because people had no way to verify messages on the social media platform.
“Anyone can say what they like.”
Back in March, Facebook employees revealed they had found a way to help stop the spread of misinformation about the virus through posts on its platform.
By altering how posts about vaccines were ranked in people’s newsfeeds, researchers at the company found they could curtail the misleading information users saw about Covid-19 vaccines and instead surface posts from legitimate sources such as the World Health Organisation.
“Given these results, I’m assuming we’re hoping to launch ASAP,” one Facebook employee wrote, responding to the internal memo about the study.
Instead, Facebook shelved some of the study’s suggestions. Other changes were not made until April, with critics saying they believed the tech giant was worried the changes might hurt company profits.
“Why would you not remove comments? Because engagement is the only thing that matters,” said Imran Ahmed, CEO of the Centre for Countering Digital Hate, an internet watchdog group in the United Kingdom.
“It drives attention and attention equals eyeballs and eyeballs equal ad revenue.”
However, in an emailed statement, Facebook said it had made “considerable progress” this year in downgrading vaccine misinformation in users’ feeds.
© 2021 Stuff Limited