In January 2018 Facebook announced plans to reshape the content and priority of posts on its News Feed. Zuckerberg said that spam, fake news, and clickbait content from marketers and publishers would be minimised, while priority would be given to relevant posts from friends and groups.
The aim is to restore Facebook's original focus, "establishing meaningful connections", while profiting from the value this adds to the image of the social network and the company itself.
For the first time, Mark Zuckerberg is making a major decision that goes against one of his long-held beliefs: any change to the network must have the goal of improving engagement. This move, he concedes, will likely lead to people spending less time on the site. (Chris Baraniuk, 2018)¹
Facebook has long been researching ways to improve how people interact with its main social media platform (let's not forget that WhatsApp, Oculus, and Instagram, among other companies, are also owned by Facebook).
At the heart of Facebook Research is the acknowledgement that its products are far from perfect. Building safer and more inclusive designs is always the order of the day for everyone working at Facebook.
Social networks need to be constantly redesigned to meet the needs of their audience. To maintain its status (and profits), the social network colossus needs to be concerned about its users.
Sometimes this means collecting data from users through the various tools we interact with while on the platform. In the past few years, the company has put a lot of effort into understanding and diminishing self-censorship.
Self-censorship is the act of preventing oneself from speaking. Important in face-to-face communication, it is unsurprising that it manifests in communications mediated through social networking sites. On these venues, self-censorship may be caused by artifacts unique to, or exacerbated by, social media. (Das and Kramer, 2013)²
Facebook believes that self-censorship on social media, while not the greatest of modern-day issues, is still one worth reflecting on. An insight into this behaviour can be found in an article written by Sauvik Das, a Ph.D. student at Carnegie Mellon and former software engineering intern at Facebook, together with Facebook data scientist Adam Kramer.
The research focuses on last-minute self-censorship: the behaviour of a user refraining from sharing a post or comment, "filtering a thought after it has been formed and expressed, but before it has been shared"².
The study revealed that 71% of the 3.9 million Facebook users in their sample self-censored content at least once over the course of 17 days. The behaviour could be attributed to two main factors: people's perception of their social media audience, and the multiple identities a user needs to manage when sharing online to "totally distinct social circles"².
Furthermore, the data indicates a prominent gender and age gap in self-censorship. Men were found to censor more than women, as were users with more opposite-sex friends. On the other hand, younger users generally censor less; an exception are users with a higher percentage of older friends, who censor more.
Additional data and conclusions from Das and Kramer can be found in the full paper².
The paper also reports how, and what kind of, data is collected. Facebook claims that when users type into a text box, nothing but a binary record of whether that post was published gets collected. The text you type is not sent back to Facebook's servers.
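As a hypothetical illustration of that claim, the logging could work roughly as sketched below. This is not Facebook's actual code; the function name and the five-character threshold for counting a draft as "entered" are assumptions for the example (the paper describes a similar minimum-length criterion).

```python
def make_censorship_record(draft_text: str, was_shared: bool) -> dict:
    """Build the only datum that would leave the client: a binary
    censored/shared flag. The draft text itself never appears in
    the record, matching the privacy claim described above."""
    # Assumption: a draft only "counts" once it passes a small
    # length threshold, so stray keystrokes are ignored.
    if len(draft_text) <= 5:
        return {}
    # Record a single boolean: was the draft ultimately published?
    return {"censored": not was_shared}
```

For example, a typed-but-abandoned draft would yield `{"censored": True}`, a published one `{"censored": False}`, and a draft of five characters or fewer would produce no record at all; in every case the words themselves stay on the client.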
For some, even the remote ability to collect such data poses a threat to user privacy. At the same time, we would all benefit from a more open and diverse platform where thoughts are shared freely, without the constant fear of spamming one's entire circle of friends.
“…Facebook considers your thoughtful discretion about what to post as bad, because it withholds value from Facebook and from other users. Facebook monitors those unposted thoughts to better understand them, in order to build a system that minimizes this deliberate behavior.” (Jennifer Golbeck, 2013)³
I sometimes find myself self-censoring thoughts online for fear of not being true to my persona. And while I, like many others, would prefer to know which of my online movements are tracked by the websites I visit, I would also like to see a change in the image and content of modern social media. I suspect we all hope that happens soon.
Image Source: Pixabay.
- Baraniuk, C. (2018). Facebook plans major news feed changes. [online] BBC News. Available at: http://www.bbc.co.uk/news/technology-42657621.
- Das, S. and Kramer, A. (2013). Self-Censorship on Facebook. [ebook] Available at: https://research.fb.com/wp-content/uploads/2016/11/self-censorship-on-facebook.pdf.
- Golbeck, J. (2013). Facebook Wants to Know Why You’re Self-Censoring Your Posts. [online] Slate Magazine. Available at: http://www.slate.com/articles/technology/future_tense/2013/12/facebook_self_censorship_what_happens_to_the_posts_you_don_t_publish.html.
Francesco Imola is a London-based musician, weekend photographer, and current Sound Design student at the University of Greenwich.