The sinister truth behind Facebook Reactions

By now you have seen the new Facebook reaction buttons. You may have thrown a giggle a friend’s way or expressed anger at any story featuring Donald Trump.

They seem like such a great idea, right? You no longer have to “like” a post when your real sentiment is “Sorry you’re going through this, I’m here for you.” So far, so user-centric. Thanks, Facebook, for listening to what your user base wants.


But there is a hidden agenda that we should be aware of.

Facebook exists to make money. It’s a publicly listed company accountable to its shareholders. Primarily, it makes money through advertising. And the reason advertisers love Facebook is how precisely it lets them target their marketing. In the days of advertising past, a broad-brush approach was used: you threw a message at a large volume of faceless people and hoped it stuck to some of them.

The advent of social media has given advertisers a way to be much more selective. Which would you rather do? Spend lots of money on an ad that will be seen by vast numbers of people who may or may not be interested? Or smaller amounts on ads that will be seen by a select group of people made up exclusively of your target market? It’s a no-brainer for most. And the more information advertisers can glean about the people they are targeting, the better for them.

You already know that the ads on Facebook are targeted towards you. Sometimes they get it right and sometimes they get it really wrong. I have friends who have been bombarded with “lose the baby weight” ads a few days after giving birth. People with professional qualifications who are urged to enrol in degrees they already hold. People in loving relationships being shown ads for dating sites (curiously, this seems to happen regularly after people announce engagements).

Targeted advertising isn’t easy – you need to know a person’s attitudes as well as their interests to get it right. The Like button was never a great measurement of attitude or sentiment. People “liked” things they didn’t really like.

Now we have six ways to respond rather than one: Love, Haha, Wow, Sad and Angry – and Like still remains for the traditionalists. Six very different sentiments that are now easy to measure and that give an accurate picture of how people are reacting to posts.


I imagine this wasn’t cheap to implement. I imagine a great deal of testing sat behind this idea. I imagine Facebook is seeking a return on its investment. I know advertisers are already salivating at the prospect of this new data (as recently reported by Wired). As Facebook users, we need to ask how this information is going to be used – and, in particular, how the vulnerable will be protected.

What do my social interactions tell Facebook advertisers? That I’m a married 37-year-old mother with a weakness for pretty clothes, that I feel ashamed of the way this country treats refugees, that I write a blog, that I like creative things and pictures of my friends’ kids. I don’t really mind if that’s used to target my advertising. To be honest, I’d be happier to see ads for lovely dresses and campaigns to improve conditions for refugees than ads for weight loss or dating agencies.

But what if I’m not a 37-year-old woman with a penchant for shoes? What if I’m a 15-year-old girl struggling with my body image and desperate to lose weight, no matter how dangerous the method offered?

What if I’m a disenfranchised 14-year-old boy who struggles to fit in, seeking a place online where I will be welcomed and feel valued, no matter what the content of that place might be?

What if I’m a person with racist and sexist ideas that I look to the internet to validate for me?

What if I’m dabbling in some dangerous ideologies and am exactly the kind of person unscrupulous groups are trying to recruit?

What do the reaction buttons mean then?

Facebook has shown again and again that it’s either unable to protect people or not particularly interested in doing so. It has only recently changed its policies to expressly allow photographs of breastfeeding women. In the past such photos were removed while pictures promoting violence against women were deemed not to breach community standards.

Last year, we saw Clementine Ford banned for exposing the men who had attacked her on social media. The men were not banned. We can assume that Facebook is not going to come to the rescue of the vulnerable in a timely way. It’s potentially unrealistic to expect Facebook to keep tabs on everything that is shared. It relies on people reporting posts as inappropriate. And that has its own problems.

Let’s imagine an advertiser posts a joke in slightly questionable taste. A portion of the audience finds it funny (and uses the Haha reaction) and a portion expresses anger (and uses the Angry reaction). The advertiser now knows who it can push and who it can’t. Who has a tolerance for discrimination and who does not. Who is going to report certain ads and who is not.

This leads into dangerous territory, where negative attitudes can come to be accepted as normal, without challenge. It’s particularly dangerous when this advertising is targeted at younger people. If you continually see or read a thing, you tend to accept it. You’re certainly not going to report it.

You know who I tend to see in my feed? People with very similar ideals to my own. Until I read the comments under news stories, I am well protected from views different from my own. My views tend to veer to the left side of politics. I imagine that those with different political views are also virtually surrounded by echoes of their own agenda. So we are part of this huge social network, but siloed by our attitudes.

Exposure to the same thing over and over can lead you to believe that your views are the right ones and shared by the majority, even though this may be a fallacy. The Facebook reactions are going to enable the further narrowing of these silos. And I’m not sure that’s a great thing. Particularly when Facebook advertising is not limited to people selling things, but extends to people promoting (sometimes dangerous) ideas.

I can see how reactions will be an incredibly useful tool for advertisers. I can see how they can be used positively between friends. I just question whether this tool and the data it will create will always be used in a conscionable way. I wonder how vulnerable people will be protected. I wonder just how many of my emotions Facebook really needs to be privy to.

How do you feel about the new Facebook emotions?

Written By

A self-confessed geek and lover of all things digital, Robyna started her professional life as a software developer before moving into IT management and consulting. Her excitement about technology has grown since the rise of social media, and she now helps professionals and firms build a strong online presence. She also writes at the Mummy & the Minx about keeping your mojo during motherhood, drinks a lot of coffee and makes her own clothes.

3 Comments

  • You know, I never would have realised this myself. Thanks for pointing it out. It’s good to know all the marketing and media tricks out there. I have a feeling I get the wool pulled over my eyes every day with this sort of thing.

  • Very thought-provoking post, Robyna! It is a very important discussion the Facebook community needs to have so that we cease to be so manipulated. I guess this is just another thing for parents to educate their children about.

  • Facebook used to be fun. When it was just about posting a status comment or a photo. But now that every keystroke is analysed, contents of photos are automatically identified and so on, I find it too creepy. Users signing up these days have nowhere near the privacy settings that older users do. I know because I made a fake new account to test it. Try it yourself.

    I run a business and I am forever being told by Facebook of the amazingly targeted power of their advertising platform, and the incredible depth of customer information I can benefit from… at a price. No thanks!
