
Fake News and Misinformation on Social Media


Audiences are often controversial. Current controversies mentioned in the course include the Western export of media globally, the gullibility of audiences to “fake news”, the democratic value of audience participation, the productivity or exploitability of fans, taking audience reception into account in content regulation, and more.

Drawing on sources and ideas discussed in the course, you should critically examine the claim that news audiences are vulnerable to misinformation on social media. By disentangling the competing claims and evidence about implied and actual audiences for “fake news”, explain why audiences can be controversial and whether it matters.

From the Arab Spring to Trump’s election, “fake news” has been at the centre of controversy around the world. Wardle and Derakhshan (2017) argue that the consequences of rumours and falsified content on social media include mistrust and confusion; yet it is highly problematic to say that audiences are vulnerable to “fake news”. There are two reasons for this problematisation: first, the term “fake news” itself hides a whole host of assumptions and complexities and, second, there are layers of structural complexity and contextualisation around the process of audience engagement on social media that should be taken into account. For example, the major social media corporations can algorithmically manipulate users’ feeds and the flow of information.


This essay examines audiences’ vulnerability to “fake news” on social media as a controversy; the notion of the active audience is also treated as controversial. The thesis of this essay is that social media audiences are not vulnerable to “fake news” but that, at the same time, major social media providers should be regulated and continuously monitored. To unpack this argument, the essay starts by conceptualising “fake news”. It then contrasts theories describing active audiences, such as Jenkins’ (2013) participatory culture theory, with theories painting more vulnerable audiences, such as the spiral of silence in an online context (Soffer and Gordoni, 2018).

This essay concludes that while social media firms provide their audiences with a more open and engaging platform, there is still a need for long-term solutions that regulate these firms’ misconduct, especially with reference to “fake news” (LSE T3 report, 2018). A policy implication of these long-term solutions is the risk of harming freedom of speech online. Findings are applied to empirical work from Egypt, the UK and the USA to ensure a cross-national approach.

Define: “fake news” and misinformation

“Fake news” is a symptom of a much wider systemic challenge around the accuracy and credibility of information and the way that people – socially, politically and economically – handle the threats and opportunities of new communication technologies, especially social media. Firstly, as suggested by Beckett (2017), one ought to recognise that “fake news” is at its heart a range of misinformation that can take the form of:

  1. False connection: when headlines or visuals do not support the content;
  2. False context: when genuine content is shared with false contextual information;
  3. Manipulated content: when genuine information or imagery is manipulated to deceive;
  4. Satire: when there is no intention to harm but potential to fool;
  5. Misleading content: misleading use of information to frame an issue or an individual;
  6. Imposter content: when genuine sources are impersonated;
  7. Fabricated content: new content that is 100% false, designed to deceive and do harm.

The term “fake news” can be associated with the five giant evils of the information crisis: confusion, cynicism, fragmentation, irresponsibility and apathy. There is a concern that the term “fake news” is used by politicians and powerful elites to describe simply what they disagree with or feel threatened by (Wardle and Derakhshan, 2017). Others are concerned that using the term “fake news” oversimplifies the issue of information pollution online (Tambini, 2018).

Alternatively, a growing number of scholars use terms other than “fake news” to refer to information disorders: mis-, dis- and mal-information. Using the dimensions of harm and falseness, the LSE Commission on Truth, Trust and Technology (2018) report describes the differences between these three types of information: mis-information is when false information is shared but no harm is meant; dis-information is when false information is knowingly shared to cause harm; and mal-information is when real or actual information is shared to cause harm, often by moving information designed to stay private into the public sphere.
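
To make the two dimensions behind this taxonomy concrete, the sketch below maps falseness and intent to harm onto the three categories. It is a purely illustrative toy example; the function and label names are hypothetical and not drawn from the LSE report.

    # Illustrative sketch only: a toy mapping of the two dimensions (falseness,
    # intent to harm) onto the mis-/dis-/mal-information categories described
    # above. Function and label names are hypothetical, not from the LSE report.

    def classify_information_disorder(is_false: bool, intends_harm: bool) -> str:
        """Return the information-disorder category for the given dimensions."""
        if is_false and not intends_harm:
            return "mis-information"  # false, but no harm meant
        if is_false and intends_harm:
            return "dis-information"  # false and knowingly shared to cause harm
        if intends_harm:
            return "mal-information"  # genuine, but shared to cause harm
        return "information"          # true and benign: not an information disorder

    if __name__ == "__main__":
        print(classify_information_disorder(is_false=True, intends_harm=False))   # mis-information
        print(classify_information_disorder(is_false=False, intends_harm=True))   # mal-information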

Certain governments have become so worried about “fake news” that they have introduced punitive legislation or threatened to force companies to open their applications to the security services. Germany passed a law, the Netzwerkdurchsetzungsgesetz, threatening fines of up to €50m if platforms fail to remove illegal content, including hate speech or “fake news” (Tambini and Moore, 2018). The UK Home Secretary said that police and intelligence agencies should be given access to encrypted messaging services like WhatsApp (Sparrow, 2017). This was justified by the need to tackle any spread of information that affects national security or harms the public. It is argued that the deliberate spread of false or harmful content on social media, as with dis- and mal-information, leads to political polarisation and intensive consumption of news from partisan media environments.

Problem: algorithms and claims of political bias on social media

One of the reasons the crisis of misinformation on social media has become so prominent in recent years is the controversy over political communication. This controversy can be linked to the use of algorithms that target, downgrade or otherwise influence how implied audiences interact with information in a biased manner. According to the Johns Hopkins Guide to Digital Media (2018), an algorithm is a finite sequence of instructions, rules, or linear steps designed to guarantee that the agent performing the sequence will reach a particular pre-defined outcome; on social media, that outcome is often the delivery of content to an implied, pre-defined group of people. Implied audiences lurk behind a host of homogenizing synonyms (market, public, users, citizens, people) and nominalized processes (diffusion, adoption, culture, practice, mediation, identity, change) that mask their agency, diversity, life contexts and interests at stake (Livingstone, 2007). At election time, different parties’ campaigns use audiences’ personal data to target specific groups of voters (Gillespie, 2018).
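
As a concrete illustration of such a finite sequence of rules, the sketch below filters a set of user profiles down to a pre-defined target group. It is a minimal, hypothetical example: the field names and sample data are invented, and it does not represent how any real platform’s advertising system is implemented.

    # Hypothetical sketch of rule-based audience targeting: a finite sequence of
    # inclusion rules is applied to user profiles to reach a pre-defined group.
    # Field names and sample data are invented for illustration only.

    from dataclasses import dataclass

    @dataclass
    class UserProfile:
        age: int
        region: str
        interests: set

    def build_target_audience(users, min_age, max_age, region, required_interest):
        """Apply each rule in turn and keep only the profiles that pass all of them."""
        audience = []
        for user in users:
            if not (min_age <= user.age <= max_age):
                continue                       # rule 1: age band
            if user.region != region:
                continue                       # rule 2: location
            if required_interest not in user.interests:
                continue                       # rule 3: inferred interest
            audience.append(user)
        return audience

    if __name__ == "__main__":
        users = [
            UserProfile(34, "London", {"politics", "football"}),
            UserProfile(52, "Leeds", {"gardening"}),
            UserProfile(29, "London", {"politics"}),
        ]
        targets = build_target_audience(users, 18, 40, "London", "politics")
        print(len(targets))  # 2: only the London users aged 18-40 interested in politics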

Online data-intensive approaches greatly increase campaign message targeting capabilities, and these are now central to the political communication strategies used by the major UK parties. Facebook alone has 40 million registered users in the UK and offers a very effective reporting tool (LSE T3 Report, 2018). Facebook also employed a team of human curators to guide the algorithms that selected advertisements and news for its Trending Topics section. This revealed “the role of humans in what was assumed to be an automated feature, but it also raised questions about the amount of judgment they exercised over the Trending Topics that appeared” (DeVito, 2017: p5). The recent controversy over Cambridge Analytica’s use of Facebook users’ data to target political campaigning in the US shows the risks associated with such applications, exacerbated in that particular case by the absence of consent for the use of personal data (House of Commons Science and Technology Committee, 2018). Facebook itself was not able to identify any incidents of bias, let alone systemic slanting, but the perception of a problem opened up opaque practices to broad criticism and prompted internal reactions by Facebook to repair and protect its public reputation.

The Reuters Institute Digital News Report (2018) found that partisan news providers on social media are said to have played a part in bringing Donald Trump to power in the US and in mobilising support for Jeremy Corbyn in the UK. The narrowness of these partisan providers separates them from established news sites like Fox News and Mail Online, which also have a reputation for partisan political coverage but tend to cover the full range of news (world news, sport, entertainment). Algorithms help partisan social media accounts reach audiences who share their political and ideological bias, creating networks of like-minded parties. For example, Breitbart is a far-right syndicated American news provider, and over 80% of its audience identify as right-wing. In the UK, around 75% of the audience of The Canary, a left-wing news provider, identify as left-wing, though from a smaller base.
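
One way to picture how a ranking rule can concentrate partisan content among like-minded audiences is the toy sketch below, which orders items by how closely their inferred political leaning matches the user’s. The leaning scores, source labels and function name are hypothetical and do not describe any real platform’s feed algorithm.

    # Toy illustration of like-minded ranking: items whose inferred leaning is
    # closest to the user's are shown first. Scores run from -1.0 (left) to
    # +1.0 (right); all values and names are invented for illustration.

    def rank_for_user(user_leaning, items):
        """Order items by closeness of their leaning to the user's leaning."""
        return sorted(items, key=lambda item: abs(item["leaning"] - user_leaning))

    if __name__ == "__main__":
        feed_items = [
            {"source": "far-right outlet", "leaning": 0.9},
            {"source": "centrist outlet", "leaning": 0.0},
            {"source": "far-left outlet", "leaning": -0.9},
        ]
        for item in rank_for_user(user_leaning=0.8, items=feed_items):
            print(item["source"])
        # Output order: far-right, centrist, far-left. A strongly right-leaning
        # user sees the most ideologically similar content first, one mechanism
        # by which networks of like-minded audiences form.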

In May 2018, a UK parliamentary committee report was published calling on the new Centre for Data Ethics and Innovation to examine algorithmic biases on social media. It argues that, where appropriate, individuals ought to be able to challenge significant algorithmic decisions that affect them, with transparency tools used to determine the scope for doing so. It also calls on the Government to provide better oversight of private sector algorithms that use public sector datasets, and to look at how to monetise these datasets to improve outcomes across Government. This debate surrounding algorithms on social media, the intentions behind their use and their regulation in decision-making, can be framed within a larger question: that of audience gullibility.

Controversy: are audiences vulnerable to “fake news”?

Jenkins (2008) argues that the general public is aware of the coming changes in news and that industry leaders acknowledge the important role ordinary consumers can play, not just in accepting convergence, but in actually driving the process. Jenkins’ (2008) notion is that online communities are far from gullible, as they are now focal points for criticism of companies that they feel have violated their trust. These online communities provide the means for their members to express their distrust of the news media and their discontent with political changes. Jenkins (2014) notes that there are many current examples of people taking skills acquired in playful ways on the web and turning them towards politics, thus becoming political activists. On social media, Jenkins (2014) says, the ability to create videos and send them out has shaped political discourse in many cases around the world.

In Egypt, Jenkins (2014) exemplifies this point: Twitter permitted several diasporic and interconnected publics to chime in and produce, through the storytelling conventions of repetition (retweeting) and reinforcement, a collective chant of a revolution in the making, well before the movement itself had resulted in regime reversal (and some would argue that the movement still has not produced the comprehensive regime reversal they were hoping for). These forms of affective involvement can be key in connecting energies and helping reflexively drive movements forward. But they can also entangle publics in ongoing loops of engaged passivity.

Jenkins (2014) also notes that people make many active decisions when spreading media, whether simply passing content to their social network or posting a mash-up video. Active social media audiences have shown a remarkable ability to circulate advertising slogans and jingles against the companies that produced them, or to hijack popular news stories to express an entirely different interpretation from that of their authors. Nonetheless, such critical engagement with news on social media can be prevented, manipulated or distorted by the use of algorithms by the large market players for commercial and political purposes. Even in the case of “fake news”, Jenkins says, social media users are regarded as contributors to the spread of misinformation rather than as vulnerable users lacking basic critical media literacy skills.

Nonetheless, this notion of the active audience is itself controversial. Some scholars have taken a more moderate position than Jenkins’s neo-liberal approach. For example, Sonia Livingstone (2007) argues:

Is an active audience alert, attentive and original? Is he or she politically active or subversive? Does the active audience represent anything other than a challenge to the straw person of the ‘passive audience’? To reject the extreme, do-what-you-will-with-the-text model of the active audience is not necessarily to reject a vigilant, attentive and creative audience, and nor is it necessarily to accept a habitual, unimaginative one.

Livingstone (2007) sheds light on the complexities of audiences’ activities. By 2018, scholars were offering more theories about these complexities, calling for regulation to improve the conduct of news consumption on these platforms. Media texts, like other texts, are multi-layered, subject to conventional and generic constraints, open in their meanings, providing multiple yet bounded paths for the viewer. Tambini and Moore (2018) argue that, when it comes to social media audiences, further layers of contextualisation around the process of audience engagement should be taken into account. One example, from a technological context, is the algorithmic manipulation of social media content by Google and Facebook to target different users. The notion of audience participation or activity that Jenkins (2008, 2014) wrote about is thereby systemically challenged.

Another example that sheds light on the complexities surrounding social media engagement with news was proposed by Oren Soffer and Galit Gordoni (2018), who examined the mode of public expression in online and offline settings via the theory of the spiral of silence. Soffer and Gordoni (2018) adopted alternative measures, with regard to three major issues on the Israeli agenda, comparing online expression with expression in public as in the traditional theory. They found virtually no difference in audience activism between news sites and offline settings, as perceived support for one’s opinion by the majority had a positive and significant effect on the willingness to express an opinion in the online sphere. More pessimistically, the explanatory role of fear of isolation and of not being supported was much greater in the online sphere than in face-to-face contexts. This raises questions about news on social media and whether users actively engage with it.

Besides theories, recent empirical evidence may also reflect the need for more regulation and media literacy skills. Statistics suggest that a majority of citizens (both students and adults) lack the capacity to correctly differentiate “fake news” from verified content (Tambini, 2018). The Reuters Institute for the Study of Journalism (2018) looked at how news is being consumed in a range of countries around the globe and found that fewer than a quarter (23%) say they trust the news they find on social media, let alone engage with it. Over half (54%) agree or strongly agree that they are concerned about what is real and fake on the internet. This is highest in countries like Brazil (85%), Spain (69%), and the United States (64%), where polarised political situations combine with high social media use. This controversy over the gullibility of social media audiences has been addressed most directly by regulators.

Solution: regulation and why it matters

These information problems on social media can weaken democracy and fail to serve the interests of citizens (Allcott and Gentzkow, 2017). The growing sense of a ‘crisis’ has pushed governments, parliaments and public authorities in the UK and beyond to respond. Through the Communications Act 2003, the United Kingdom formed the Office of Communications (Ofcom) as a principled, (almost) sector-wide regulator, funded by industry, to replace multiple regulators of diverse provenance and practices; this was widely welcomed as a constructive response to the emerging challenges of a converged, global media market. Its primary duty – to further the interests of citizens and consumers – along with responsibilities in relation to public service broadcasting, universal service provision for broadband, the management of spectrum, and much more, with the intriguing addition of the promotion of media literacy, gave rise to new hopes.

One of Ofcom’s missions is to study adults’ media use and attitudes. Ofcom looks at media use, understanding and how these have changed over time. For instance, Ofcom’s Adults’ Media Lives report – a longitudinal ethnographic study tracking a small number of individuals and their evolving relationships with digital media – provides useful insights into users’ literacy and critical skills. The more that information and communication technologies become central to modern society, the more imperative it is to identify and manage the development of the skills and abilities required to use them (Livingstone and Thumim, 2003). The concept of media literacy is accordingly being extended from its traditional focus on print and audio-visual media to encompass the internet and social media in particular.

However, Ofcom’s role is easier in an offline setting, especially with regard to news and audience vulnerability to political messages. For example, there is a ban on political advertising on television and radio in the UK, and political parties are given some free airtime to communicate directly with the electorate (Allcott and Gentzkow, 2017). Such a standard is arguably very difficult to maintain in an online setting. The content of online political advertising in the UK remains largely unregulated, as political advertising is not covered by the Advertising Standards Authority (ASA).

Ofcom (2018) was critical of unregulated political advertising on social media and of the lack of transparency in the data and algorithms used by social media providers and their advertising agencies. While Facebook was asked to increase its monitoring and identification of political advertising to enforce its existing guidelines, these efforts lack transparency and are no substitute for publicly accountable regulation. Regulators themselves have recognised this and have called for more reform. In 2018, the Electoral Commission called for increased transparency about who is paying for online political advertisements on social media websites. In such cases, power over meaning is captured by organisations that lack transparency and accountability.

Transparency is even more necessary when it comes to social media’s use of data. Ofcom (2018) evidence suggests that while people may be aware that their data are being used to allow third parties to tailor messages to them, they know less about data brokerage systems, the involvement of third parties, and how the data are used or monetised. In future, the shortfall in transparency in how individuals are targeted by political parties on social media could be tackled and more data made public. Without intervening in the media environment through policy and regulation, we risk burdening and blaming individuals for the problems of the digital environment.

Some countries, such as Egypt, have intensified their social media regulations and gone beyond regulating the service providers to include individuals. This increases the risk of harming freedom of speech and dissent on social media. In 2018, Egypt’s parliament passed a controversial law that allows the state to regulate social media users (BBC, 2018). Under the law, any personal social media account, blog or website with more than 5,000 followers is considered a media outlet and thus subject to media laws. A number of opposition activists have been arrested in recent months on charges of spreading false news online. Tens of thousands of people have been detained in Egypt since 2013, when the military overthrew Mohammed Morsi, Egypt’s first democratically elected president, following mass protests against his rule.

It is clear that the issues of misinformation, disinformation and mal-information are being handled differently in the UK compared with Egypt. This geopolitical dimension of regulating social media is under-theorised, and more theoretical work is needed. While Britain’s Ofcom avoids blaming individuals for the issue of “fake news” on social media, calling instead for more transparency about who is funding political content, individuals under the Egyptian legal system are blamed for the spread of “fake news” and are subjected to tough laws, including arrest for spreading political content online. Independent regulators and academics, rather than authorities and oppressive regimes, should be the ones to decide how the conduct of news on social media is regulated.

In conclusion, this essay has critically engaged with the controversial notion that the audience is gullible to “fake news”. It does not suggest that the audience is gullible, yet it argues that social media manipulation can result in misinforming, disinforming, or mal-informing audiences. It started by explaining the term “fake news” before discussing the use of algorithms by social media firms for political and commercial purposes. Relevant theoretical debates about the audience’s gullibility or participatory nature were applied to cases from the US, the UK and Egypt. Optimistic views of social media news audiences were contrasted with a more cautious approach to this new matter. This essay recommends that, in future, independent regulatory bodies such as Ofcom should shoulder more duties and responsibilities to guarantee better information flow and reliability on social media. The risk of over-regulating social media and thus harming free speech has also been discussed and considered as a potential implication. Finally, applying contemporary theories to empirical cases of “fake news” on social media demonstrates the need for further theory to address current legislative challenges.

Bibliography:

  • Allcott, H. and Gentzkow, M. (2017) ‘Social media and “fake news” in the 2016 election’, Journal of Economic Perspectives, 31(2), 211–36. Available at www.nber.org/papers/w23089 [accessed 3 January 2018].
  • Beckett, C. (2017) ‘“Fake news”: the best thing that’s happened to journalism’, POLIS: Journalism and Society at the LSE, 11 March 2017. Available at: eprints.lse.ac.uk/76568/
  • DeVito, M.A. (2017) ‘From editors to algorithms: A values-based approach to understanding story selection in the Facebook news feed’, Digital Journalism, 5(6), 753–73.
  • Hintz, A., Dencik, L. and Wahl-Jorgensen, K. (2019) Digital citizenship in a data society, Cambridge: Polity.
  • Evans, D. and Schmalensee, R. (2016) Matchmakers: The New Economics of Multisided Markets. Boston, MA: Harvard Business Review Press.
  • European Parliament (2018) ‘Parliament adopts its position on digital copyright rules’, 12 September. Available at: www.europarl.europa.eu/news/en/press-room/20180906IPR12103/parliament-adopts-its-position-on-digital-copyright-rules [accessed 14 December 2018].
  • Gillespie, T. (2018) Custodians of the internet, New Haven, CT: Yale University Press. 
  • House of Commons Science and Technology Committee (2018) Algorithms in decision-making. Available at: https://publications.parliament.uk/pa/cm201719/cmselect/cmsctech/351/351.pdf [accessed 12 December 2018].
  • Livingstone, S. (1998) ‘Audience research at the crossroads: the “implied audience” in media theory’, European Journal of Cultural Studies, 1(2).
  • Livingstone, S., and Thumim, N. (2003). Assessing the media literacy of UK adults: A review of the academic literature. Report commissioned by the Broadcasting Standards Commission/ Independent Television Commission/National Institute of Adult and Continuing Education. March, 2003.
  • Ryan, M.-L., Emerson, L. and Robertson, B.J. (eds) (2018) The Johns Hopkins Guide to Digital Media. Baltimore, MD: Johns Hopkins University Press.
  • Mayer, J. (2018) ‘Journalists: Defend your work through action, not just with editorials.’ Available at: https://medium.com/trusting-news/journalists-defend-journalism-through-action-not-just-with-editorials-304387197446 [accessed 4 January 2019].
  • Ofcom (2018) ‘Eight in ten internet users have concerns about going online’, 18 September. Available at: www.ofcom.org.uk/about-ofcom/latest/features-and-news/eight-in-ten-have-online-concerns [accessed 4 January 2019].
  • Ofcom (2018) News consumption in the UK: 2018, Jigsaw Research. Available at: www.ofcom.org.uk/__data/assets/pdf_file/0024/116529/news-consumption-2018.pdf [accessed 30 December 2018].
  • Sunstein, C. (2017) #Republic: Divided democracy in the age of social media, Princeton, NJ: Princeton University Press.
  • Tambini, D. (2018) ‘Social media power and election legitimacy’, in M. Moore and D. Tambini (eds) Digital dominance, New York: Oxford University Press, pp 265–93.
  • Mayer, J. (2018) ‘How Russia helped swing the election for Trump’, The New Yorker, 1 October. Available at: www.newyorker.com/magazine/2018/10/01/how-russia-helped-to-swing-the-election-for-trump [accessed 1 October 2018].
  • Wardle, C. and Derakhshan, H. (2017) Information disorder: Toward an interdisciplinary framework for research and policy making, Council of Europe, DGI (2017)09, p 5. Available at: https://rm.coe.int/information-disorder-toward-an-interdisciplinary-framework-for-researc/168076277c  [accessed 2 August 2018].

 
