Digital Propaganda: The Tyranny of Ignorance

The existence of propaganda is inexorably bound to the nature of communication and communications technology. Mass communication by citizens in the digital age has been heralded as a means to counter elite propaganda; however, it also provides a forum for misinformation, aggression and hostility. The extremist group Britain First has used Facebook as a way to propagate hostility towards Muslims, immigrants and social security claimants in the form of memes, leading to a backlash from sites antithetical to their message. This article provides a memetic analysis, which addresses persuasion, organisation, political echo chambers and self-correcting online narratives, arguing that propaganda can be best understood as an evolving set of techniques and mechanisms which facilitate the propagation of ideas and actions. This allows the concept to be adapted to fit a changing political and technological landscape and to encompass both propaganda and counter-propaganda in the context of horizontal communications networks.


Introduction
Propaganda's enigmatic position within scholarship has been noted by a diverse range of scholars. Corner (2007) goes as far as to suggest that this confusion has made the concept too politically loaded to be analytically useful, while others such as Miller (2015; Miller and Sabir, 2012) have argued that the marginalisation of propaganda as an object of study has coincided with an increase in the deployment and sophistication of propaganda by democratic states, and that euphemising propaganda with terms such as psyops, public relations and strategic communication is problematic in the face of its routine connection with violence and coercion. O'Shaughnessy (2004) attempted to construct a unified definition through discussion of the various approaches within the literature but, like Corner, fell back on specific techniques of persuasion as a means to define the concept.

Memes, memetics and propaganda
A meme, in its original sense, relates to any form of culture with the potential for unitary replication. The term was coined by Richard Dawkins (2006a) in an attempt to provide a theoretical basis for an evolutionary theory of culture, and as a way of demonstrating that the gene was not unique in its relationship to evolutionary processes, but one example of a specific type of transmittable information (he refers to them as 'replicators') whose capacity to make nearly accurate copies of themselves underpins the evolutionary process. This was later developed in more detail by the philosopher Daniel Dennett (1996, 2003) as an evolutionary model for culture. Although initial hopes for a science of memetics largely evaporated around the middle of the 2000s with the death of the discipline's only dedicated journal (Edmonds, 2005), interest has been maintained in relation to so-called 'Internet memes', which include shared online jokes and other digital material which are subject to forms of evolutionary change (Bauckhage et al., 2013; Christensen, 2011; Rintel, 2013; Shifman, 2012; Shifman and Thelwall, 2009; Ward, 2008; Weng et al., 2012).
Aside from the wildly different scope of these two uses of the term, the most significant difference in definition is that academic memes are memes simply by virtue of their capacity to replicate as a coherent unit, whilst an Internet meme is often defined in relation to its popularity (see Sparkes-Vian, 2015). There are numerous recognisable variations on the Internet meme, of which the infographic is amongst the most prominent, usually taking the form of a captioned image, with specific image and text formulations such as the black border/white text/image and the 'keep calm and carry on' memes. They spread with variations in the text and images but are nevertheless recognisable as variants of the original meme (IMD, 2014; Rintel, 2013).
The use of memetics in order to analyse propaganda is initially promising because of the relationship between propaganda and the notion of ideological propagation. Memetic theory deals specifically with the capacity of ideas and practices to replicate throughout and between different cultural environments (Blackmore, 1999; Dawkins, 2006a); propaganda can also be understood in such terms, with the dissemination of ideas and actions throughout culture a unifying theme within many disparate approaches to the topic (Bakir et al., 2015; Ellul, 1973; Herman and Chomsky, 1988; Miller, 2015). It is particularly useful when discussing digital propaganda, where peer-to-peer replication of ideas through networks can lessen the significance of centralised organisational structures, while maintaining or even accelerating the propagation of ideas (Castells, 2012). Bakir et al. (2015) note that a variety of scholars have taken what they refer to as an 'a-critical' approach to propaganda analysis, as opposed to critical scholars who have focused on the abuse of propaganda, in particular its abuse by the powerful. Whilst propaganda, like many of its euphemisms, has acquired negative connotations as a consequence of that abuse, it has also been acknowledged that assuming a negative definition can lead to 'propaganda by allies' being excused or ignored (Corner, 2007; Miller, 2015; Miller and Sabir, 2012). The memetic definition takes a neutral stance, not in order to marginalise the significance of the abuse of propaganda, but to avoid cluttering the definition of an elusive and evolving concept with the subjectivity of moral judgement. The ethical evaluation of propaganda and its techniques should rightly be considered a subject for scholarly investigation, but a neutral definition focused on the facilitation of ideological propagation allows for a nuanced understanding of the ethical issues associated with its use.
This preference for nuance is of particular importance online, where power structures can be both dispersed and hidden, including from those spreading propaganda. It is for this reason that the memetic approach also rejects simple binary or basic spectrum categorisations of good/bad or black/grey/white propaganda (see Jowett and O'Donnell, 1986) based on a single criterion such as deception, which is too inflexible to cope with the evolutionary nature of both propaganda and morality.
Instead of using a simple good/bad dichotomy or gradient of deception for each propagandistic meme, each technique of propaganda (defined as a means by which the propagation of a meme can be facilitated) should be subject to ethical scrutiny. As a meme may contain multiple propaganda techniques, its ethical status is likely to be more complex within such a system than in one with only three categories along a single spectrum of deception. As with Bakir et al.'s (2015) model, the necessary ethical evaluation of propaganda can address issues such as coercion, deception and incentivisation, but it also allows for additional ethical considerations to be incorporated into the model as they arise within an analysis. This adaptability within the memetic model stems from the way in which memetics treats the rhetorical techniques which improve an individual meme's chances of replication. Propaganda techniques are regarded as memes in their own right: unitary replicators which can be passed from person to person, not least through the pedagogical processes of teaching PR and marketing. As a consequence, the techniques of propaganda are subject to evolutionary pressures and liable to adapt and change in response to them. This understanding allows propaganda itself to be viewed as an evolving body of techniques for the propagation of memes, with new variations emerging in response to changes in the political and technological environment, including the other memes replicating within it. In the same way, ethical concerns surrounding the propagation of ideas may change depending on political, social and technological context (see Sparkes-Vian, 2015; Sparkes-Vian as Vian, 2011 for further discussion of these ideas).
The relationship between meme and environment is characterised as a dialectical one, in which the body politic is created and defined by the ideas within it, whilst the ideas within the body politic have their replication facilitated or hindered by the proliferation of other compatible or incompatible memes within their surroundings. In keeping with both Miller's (2015; Miller and Sabir, 2012) and Ellul's (1973) conceptions of propaganda, actions, which can also be treated as memes, are included in this model as well; violence and coercion, for example, which Miller and Sabir (2012) identify as integrally linked to some forms of state propaganda, can be understood as replicable propaganda techniques. In fact, they document extensively the way techniques such as causing pain through placing prisoners in forced stress positions have been codified and replicated as part of the arsenal of state repression. By removing central co-ordination as a definitional aspect of propaganda, the memetic model allows for a comparison with techniques such as the use of vehicles as weapons, which can be emulated without the need for a military training programme.
In the context of the Britain First case study, relevant memetic environments include the structure of Facebook as a network, its infrastructural mechanisms for spreading data throughout the network and its tendency to facilitate political echo chambers and 'trench warfare' communication dynamics (see Bright, 2016; Karlsen et al., 2017; Shin and Thorson, 2017). This is in addition to the political environment of UK political discourse and public opinion, including views expressed by more mainstream politicians and within traditional media, which would contribute to the selection pressures placed on the replication of memes within any contemporary British case study.

Methodology
The methodology for this investigation was developed from a larger scale study on digital propaganda in the 2010 election (Sparkes-Vian, 2015) which dealt with both 'academic' memetic analysis and Internet memes. The analysis in this paper is focused mainly on images (as opposed to text and video), allowing for better direct comparison with later 'reaction memes' intended to critique and satirise original posts by Britain First. The data corpus for this study included 30 individual memes from Britain First's Facebook page spanning the period 1 March to 7 April 2015 and 10 memes selected from the various anti-Britain First pages during the same time period. 1 As the analysis here is qualitative, individual memes were chosen for detailed analysis based on relevant themes which appeared throughout the data sample and the surrounding discussions between and within the two communities. The memes selected for detailed analysis represent 'typical' memes in terms of structure and content, based on a larger collection of the 450 most recent images collected from Britain First's Facebook page on 15 May 2015. However, the analysis should not be treated as necessarily representative of all communication by these groups, but rather as a means to explore specific patterns of discourse which appeared throughout the period when the data was gathered.
Qualitative memetics employs an analytical 'toolkit' of concepts from the literature on memetics in order to conduct a systematic analysis of memes and their environment. It was developed as a memetic adaptation of Jager and Maier's (2009) Foucaultian approach to Critical Discourse Analysis (CDA) and is intended to be an adaptive set of tools with which to address different research questions. The following 'tools' were utilised within the case study analysis:

a) Disaggregate the memeplex into its constituent alleles. Does the alteration of an allele change the meaning of the meme?

An allele in this context is a smaller constituent part of a larger meme which could theoretically be altered or replaced with an alternative; for example, the content of text or the picture used within an infographic. The alteration of an allele can subvert the nature of a propagandistic meme.

b) Is this a 'copy the product', or 'copy the instructions' meme?

Copy the product memes are replicated in their entirety, for example by sharing or cutting and pasting a picture or piece of text. Copy the instructions memes replicate by reproducing a meme (sometimes with alterations) from a specific or implicit set of instructions. It has been argued by Blackmore (1999) that such memes are likely to be more stable and long-lasting than copy the product memes.

c) Take the meme's eye view. What opportunities for replication exist within this environment?

The meme's eye view is a thought experiment which looks at a meme's environment in terms of opportunities for replication and pressures for alteration, in order to understand the spread and evolution of particular ideas, actions or texts. This is often done by endowing the meme with a pseudo-agency and treating it as though it 'intends' to replicate. The purpose is to understand the relationship between a meme and the environment in which it replicates.

d) Identify the propaganda techniques used to facilitate memetic replication within the data corpus. Have they been replicated from elsewhere?

There are a large number of techniques mentioned in the literature on propaganda which can facilitate a meme's replication, from the use of emotive rhetoric to linguistic devices such as paired contrasts and tripartite lists (Atkinson, 1984; Jowett and O'Donnell, 1986; Lee and Lee, 1939; O'Shaughnessy, 2004; Pratkanis and Aronson, 1991), in addition to the more violent and coercive techniques identified by Miller and Sabir (2012). These can be deliberately passed on via pedagogical processes but also copied instinctively, even by those without formal training in propaganda. Identifying such techniques can be useful in terms of a) understanding the degree of professionalism with which propaganda is being produced, and b) evaluating the utility of specific techniques for the replication of memes within a specific environment or with respect to a specific case study.

Case Study: Britain First
Britain First is an extremist organisation and minor political party which splintered from the British National Party (BNP) in 2011 under the leadership of Jim Dowson. Dowson left the BNP in 2010 after a disagreement with Nick Griffin over the enforcement of a court order prohibiting the party from disallowing membership on grounds of race. He was also accused of sexual harassment by a female BNP activist (Stewart, 2010). Dowson left Britain First in 2014, passing the leadership to another former BNP activist, Paul Golding, amidst newspaper reports that the two had clashed over Golding's support for the tactic of invading mosques and harassing worshippers (Dearden, 2014b; Sommerlad, 2014). This is a technique which can be compared to the use of terror and torture as a form of propaganda, discussed by Miller and Sabir (2012), but deployed here by an extremist political group; the specific use of intimidation and harassment by Britain First against Muslims has been a notable aspect of Golding's leadership. Although they are a marginal power in themselves, many of the grievances exploited by Britain First include the same hostility towards Islam and immigration which makes up recurring themes within UK tabloid press coverage (Taylor, 2014).
As a political party, the group has made little headway, largely lending their support to the United Kingdom Independence Party (UKIP) in most constituencies (Britain First, 2015a; Collins, 2015). However, their social media following, especially on Facebook, has successfully outstripped all of the major UK political parties, with over 720,000 page likes and an estimated online reach of over 20 million (Collins, 2015). 2 The page itself publishes a combination of nationalistic, anti-immigration and Islamophobic rhetoric, militarism, promotion of Britain First merchandising and links to other right-wing groups such as the Knights Templar International (Britain First, 2015b). The use of the Internet by racist groups is a longstanding issue, and can often involve deceptive and covert forms of propaganda in which the origins and agenda of those involved are concealed (Daniels, 2009), something which makes the kind of counter-propaganda advocated (and anticipated) by Mason (2013a) considerably more difficult to do successfully.
In the case of Britain First, whilst their identity and agenda are not covert, they do deny the racist nature of their position, arguing instead that it is pro-British, anti-immigration and anti-Muslim, which they contend is not a racist position because it attacks a religion, not a race, and that "the only people [they] hate are the white leftwing politicians and journalists who are wrecking our beautiful country" (Britain First, 2015a, emphasis in original). It is a sobering statement, given that during the trial of the white supremacist Thomas Mair, it was reported that he had repeatedly shouted 'Britain first' during his murder of 'white, left-wing politician' Jo Cox (Walker, 2016). The organisation also denies the existence of racism, because the word was 'invented by a communist mass murderer', Leon Trotsky, to silence European opposition to 'multi-culturalism' (Britain First, 2015a).
Often these narratives are embodied in infographics; these combinations of image and text have in turn given rise to a series of critical Facebook pages which produce various kinds of anti-fascist counter-propaganda, many of them specifically targeting Britain First. Some of these groups can be seen as part of a wider category of online activists who use humour to counter extremist right-wing messages, such as the English Disco Lovers (2015), who organise disco-themed flash mobs in protest against the far-right English Defence League (2015). Others use more informative memes, seeking to expose the deceptive statements made by Britain First and to encourage people to take action against them. They are joined in this goal by non-specific anti-fascist activists such as Tom Clark, author of the blog Another Angry Voice (Clark, 2015), which uses a combination of infographic memes and more detailed articles to encourage people to oppose online fascistic propaganda. These responses are examples of the kind of corrective counter-narrative that people such as Mason (2013a) and Zittrain (2008) identify as providing an effective bulwark against deceptive propaganda in the Internet age. However, there are limitations to this form of counter-propaganda, not least that posed by wider negative stereotyping and hostility surrounding media depictions and public understanding of Islam, immigration and the welfare state. Britain First is at the far end of the political spectrum in terms of their extremity, but they are by no means alone in terms of the groups they have chosen as targets for their hostility.

Discourses Surrounding Immigrants, Islam and Welfare
One of the underlying assumptions of those who have championed digital communication is that the democratisation of discourse provided by social media should have a positive impact on public communication, including providing a bulwark against deceptive propaganda by elite groups (Castells, 2012; Mason, 2013b). However, the analysis of Britain First's Facebook page revealed a plethora of examples of deceptive, racist propaganda used to perpetuate their nationalist ideology and promote themselves as a group. The inaccurate stereotypes targeting mainly Muslims, immigrants and social security claimants intersect with surrounding narratives, which also reflect wider discourses on these topics propounded by the right-wing tabloid press and more mainstream political parties (BBC News, 2015; Bhatia, 2014; Harkins and Lugo-Ocando, 2015; Taylor, 2014).
From the meme's eye view, the specific memes produced and shared by Britain First are replicating within a society replete with other memes which also attack the groups they are demonising, something that can be expected to facilitate the replication of memes that form part of the same ideological narrative. The process is similar to the replication of 'pre-propaganda' or sociological propaganda, which Ellul (1973) identifies as a prerequisite for the perpetuation of 'true' or active propaganda, but without the need for a central, coordinating authority, or any direct collusion between Britain First and the tabloid press.

Figure 1 represents a typical example of a Britain First meme which picks up on some of these themes. As a 'copy the product' meme, its success is measured in terms of its specific replication with absolute accuracy (although potentially alongside additional detractions) via Facebook's like and share features. It was shared 77,266 times, received 25,789 likes and approximately 2,100 comments, which was the highest level of replication identified within this corpus. Although it should not be assumed that all of these interactions were endorsing the views expressed in the meme, within the context of Facebook's network each of them did serve to propagate it, as interactions with posts on Facebook will appear in the newsfeeds of others on the network.
Looking at this meme in terms of identifiable alleles (i.e. aspects of the meme which can be picked apart and swapped for others whilst remaining at least somewhat recognisable as a copy of the original), there are three distinct text groups and a background image, with each of the text groups using different fonts and effects. In keeping with its status as a 'copy the product meme' these do not conform to recognised infographic patterns from other Internet memes such as the black border/image/text or 'keep calm' formulations (IMD, 2014). It is possible that an implicit set of instructions can be seen in the use of different fonts and effects for the text groups, as this is a pattern replicated within some of the satirical anti-Britain First memes, but if so these are instructions which have not been widely copied and may be limited to this specific case study.
Despite the lack of a professional context or organising body, recognisable propaganda techniques from within the literature can be identified: for example, the use of paired contrast, comparing then and now, and the othering discourse of 'us' and 'them' (Atkinson, 1984). This othering is implicitly directed at either immigrants or foreigners, as demonstrated by the phrase 'our own country', indicating that the dangerous and violent others who are provided with income and shelter by the state are not part of that group, as indicated by the use of the word 'them'. The reference to 'kill[ing] us in our own country' relates to discourses surrounding terror, and in the context of Britain First it can also be seen to refer specifically to the murder of the soldier Lee Rigby by Michael Adebolajo and Michael Adebowale (BBC News, 2014b), who claimed the action was in retaliation for the murder of Muslims by British soldiers engaged in military action in Muslim countries. As such, the meme incorporates three of the group's favoured villains (Muslims, immigrants/foreigners and social security claimants) and excludes them from the 'us' of respectable British citizenry.
The phrase 'LIKE AND SHARE IF YOU ARE DISGUSTED' represents a common technique used within Britain First propaganda memes and an established one within digital propaganda more generally. In terms of older propaganda techniques, it combines the 'Bandwagon' notion of encouraging people to join in and replicate an idea which is already popular with the 'Testimonial' of the endorsement of a friend or acquaintance (Lee and Lee, 1939). However, it can also be seen as a specific technique, 'Copy Me', which has arisen specifically within the web 2.0 era as a consequence of the ease with which information can be shared throughout a distributed network. Whereas a centralised network such as television provides a one-to-many broadcast mechanism, with priority access given to those in power, within distributed networks (which still prioritise recognisable individuals), power is no longer a prerequisite for access and peer-to-peer replication becomes an avenue for memes to exploit.
The use of 'copy me' techniques can be seen at once as a positive, and more democratic, development within the evolution of propaganda, and as a potentially dangerous and deceptive one. Encouraging supporters to propagate a message on your behalf in this way helps to sidestep the distrust many feel towards authority figures, especially established politicians (Sparkes-Vian, 2015). This is another theme frequently highlighted within Britain First posts, which often exhibit hostility towards mainstream political figures irrespective of party (Britain First, 2015b). This mistrust in mainstream politics has continued, can be seen across the political spectrum and is by no means limited to the UK. In the US, both the Donald Trump and Bernie Sanders campaigns played on this narrative, and the more recent resurgence of the UK left with the rise of Jeremy Corbyn to the leadership of the Labour Party has also focused on the primacy of the general population over the elite. The 'copy me' technique is particularly well adapted to this political environment, as well as to the ever-increasing popularity of social networking (Greenwood et al., 2016). The facilitation of direct peer-to-peer communication by these networks reduces the organisational and administrative burden on single sources of propaganda, and allows ideas to flow from less powerful organisations as well as those already powerful. In the case of the ideas flowing from Britain First, this can be considered a cause for concern to those who appreciate the benefits of a pluralistic, multicultural society.
'Copy me' techniques also provide the opportunity to obscure the origins of a piece of information, something which is problematic in mainstream politics, where enthusiastic supporters become the face of a campaign whilst more traditional campaign strategists and spin doctors remain in charge (see discussion in Sparkes-Vian, 2015). When dealing with groups such as Britain First, the use of 'copy me' techniques in this manner can function in a similar way to the use of cloaked websites as discussed by Daniels (2009), obscuring the nature of the organisation that originally posted the meme and thus gaining likes and shares from those who would not otherwise agree with many of its views. This is the case with Britain First's use of Lee Rigby's image against the express wishes of his family, and their use of the Royal British Legion's poppy symbol and the royal crest (BBC News, 2014a; Dearden, 2014a).
The capacity of information to propagate effectively without the need for excessive, expensive administrative control at the centre is one of the reasons that social networking has been hailed as a democratising force for communication and a means whereby centralised or elite propaganda can be challenged (Castells, 2012; Gerbaudo, 2012; Mason, 2013a, 2013b). However, this example demonstrates another side to this form of communication: the replication of extremist views and false or misleading information which resonate with widespread social ignorance. It also raises questions about the notion that propaganda should specifically be 'organised' communication. Much of the information which passes through Britain First is shared from other places and replicated virally without centralised organisation (Britain First, 2015). This is a common distribution method for virally-grown movements on both the left and right, and though Britain First do have a hierarchical structure and identifiable leaders, movements on the left such as the Indignadas/os in Spain and the global Occupy movement have specifically rejected the notion of leadership. Although some, such as Paolo Gerbaudo (2012), have disputed the true horizontalism of such movements as a whole, the replication of individual pieces of viral propaganda can be generated ad hoc by an individual without need for any form of explicit coordination, even if copies are also replicated via more clearly organised distribution points such as that provided by Britain First.

Figure 2 further highlights a theme also present in Figure 1: the (inaccurate) claim that foreign nationals and immigrants are more likely than their British counterparts to be in receipt of social security support. Like Figure 1, it is a 'copy the product' meme with no obvious pattern of substitutable alleles which can be related to more widespread Internet memes.
The use of copy the product memes is common but not universal on Britain First's page, which also includes occasional examples of the 'black border/image' style meme and a variation of 'Je Suis Charlie' in the wake of the attack on the French magazine (Britain First, 2015b; IMD, 2014). Figure 2 was considerably less successful than Figure 1, amassing only 2,878 likes and 1,886 shares; however, it does illustrate something significant in relation to the counter-propaganda which Britain First inspired. Errors in spelling and grammar, such as the grammatically incorrect use of 'is' in Figure 2, are a recurring theme amongst contributors to Britain First's page, and are repeatedly lampooned by their detractors. The use of humour as a propaganda technique can be an effective facilitator for replication, although the appeal is likely to be restricted to those who are already hostile to Britain First's message (Shifman, 2012; Sparkes-Vian, 2015), in keeping with the scholarly literature on political echo chambers, which are a notable feature of digital communication (Bright, 2016; Karlsen et al., 2017).
The popularity of Britain First and the propagation of the narratives espoused in their memes can be understood in terms of propaganda but not always in terms of persuasion. The automatic conflation of propaganda with persuasive speech is problematic, not because propaganda is never persuasive, but because it ignores propaganda's role in reaffirming beliefs which are already held by the individuals consuming it. The creation of structured ideological groups with an inside and outside (in memetic terms this is referred to as an institutional memetic construct or institutional memeplex; Blackmore, 1999; Dawkins, 2006b) can create distinct memetic environments in which the selection pressure on specific memes is altered by the ready acceptance of the basic premises of the ideology (Sparkes-Vian, 2015). This can be seen as an effect not only of ideological proliferation, but also of Facebook as a social network, which can cause the clustering of political opinion as a consequence of the relationship between ideology, social networks and friendship groups (McPherson et al., 2001). The presence of this echo chamber effect relies on the tendency of ideologically committed individuals towards confirmation bias when consuming media, something more frequently noted in conservative groups and political extremes than in more liberal ones (Boutyline and Willer, 2017). However, given the presence of oppositional groups actively critiquing Britain First, the more useful comparison for this study may be Karlsen et al.'s (2017) analysis of digital 'trench warfare' dynamics, which notes the significance of 'disconfirmation bias'; that is, exposure to oppositional views may actually reinforce a person's strongly held political stance.
When examined through the meme's eye view, it appears that Britain First's posts are not necessarily serving as persuasive tools but as affirmations of ideas which are already popular. For example, a substantial majority of the UK population (77%) believe that immigration is too high and should be reduced, with 55% saying the reduction should be substantial (Kitchen, 2009). The presence of such narratives suggests that, if Britain First advocates exist within a political echo chamber, it may be a more expansive one than that of a single Facebook group. Rather than trying to persuade, memes such as Figures 1 and 2 utilise the presence of anti-immigration and racist narratives to facilitate their own replication. Moreover, despite suggestions (Mason, 2013a; Zittrain, 2008) that the online 'hive mind' should produce truth and weed out disinformation, the presence of such narratives counteracts the importance of truth as a selection pressure, as topics such as immigration and welfare are areas of mass ignorance. For example, research conducted by University College London (Dustmann and Frattini, 2013) shows that immigrants are in fact 45% less likely to receive benefits than their British counterparts and 3% less likely to live in supported housing. Additional research for the Royal Statistical Society (2013) shows that those polled consistently overestimated the level of migration (believing it to be double the actual figure even when unregistered migration was accounted for), as well as overestimating the number of Muslims, the level of benefit fraud and the portion of the social security budget spent on Jobseeker's Allowance relative to pensions. It is noteworthy that these areas of ignorance coincide with trends in reporting identified within the right-wing press, further emphasising the potential of such memes to resonate widely beyond the specific group of extremists identified within this case study (Taylor, 2014).
This is an environment in which some of the narratives embedded within Britain First's memes already resonate with a significant portion (and in some cases a majority) of the population, and their codification in easily replicable memes merely facilitates the replication of such ideas. The argument that in a digital environment 'propaganda becomes flammable' (Mason, 2013a) therefore appears problematic, and the digital world appears to merely exacerbate already existing ignorance. However, explicit racism of the kind often present on Britain First's page is less widely acknowledged, with two-thirds of the population denying any racial prejudice and only 3% admitting to significant prejudice (NatCen, 2014), and perceptions of racist attitudes by minority groups also becoming less prevalent (Kitchen, 2009), although an increase in reported hate crime was noted after the EU referendum, suggesting this trend may be changing (Forster, 2016). Even Britain First deny that their position is racist, in a manner comparable with practices by US far-right groups discussed by Daniels (2009), and they also attempt to promote themselves using more benign images praising the royal family and the military (Britain First, 2015b), which can find a receptive audience outside of those willing to admit to socially unacceptable levels of prejudice. It is therefore less than surprising that, in keeping with trends noticeable in both political and apolitical studies on Internet memes, the propaganda produced by Britain First spawned a selection of critical response memes which can be analysed as a form of counter-propaganda. In such an environment, counter-propaganda memes attacking Britain First have also found many opportunities for replication within the Facebook network.

Satire and Critique as Counter-Propaganda
In their study of the diffusion of an Internet meme in the form of an online joke, Shifman and Thelwall (2009) noted that the meme they were studying developed a distinctive 'call-response' pattern. In that particular case, as the initial joke revolved around stereotypical gender roles, the pattern took the form of a male/female call-response. In Sparkes-Vian's (2015) analysis of Internet memes in the 2010 election, a similar call-response pattern was noted on a much smaller scale with reference to major party memes, in which initial propaganda images constructed by major parties were co-opted and adapted to propagate anti-party messages. In the case of the Conservative Party's billboard campaign posters, the satirical response from www.mydavidcameron.com (Singer, 2010) arguably became a more successful meme than the original propaganda posters, with a wide variety of anti-Conservative iterations proliferating online. Where such interactions span political boundaries there is a comparison to be made with Karlsen et al.'s (2017) observations of trench warfare dynamics between online groups: each side ridicules the other from a position of certainty, more interested in the reactions of their supporters than their adversaries.
Although a more detailed, quantitative analysis would be needed to confirm this, the initial evidence suggests that the Britain First memes have also followed a call-response pattern and that counter-propaganda memes by anti-fascist groups have been less successful at proliferating than Britain First's initial posts. A number of Facebook pages, such as Exposing Britain First and Report Britain First, have emerged in opposition to the group. The tactics of these anti-Britain First groups include the organised reporting of Britain First memes deemed to have broken Facebook's terms and conditions, as well as the use of satirical and critical memes which attack either general themes or specific posts from Britain First's page. In terms of the first tactic, there appears to have been a certain amount of success. Specifically, these groups targeted the practice of using anti-animal abuse memes as a means to raise funds for the group and spread Britain First's reach beyond the scope of the far right (Clark, 2015). These memes typically took the form of an image of an injured animal captioned with slogans such as 'Like and Share if you are against animal cruelty' 3 (Clark, 2014). The graphic nature of the images allowed those opposed to Britain First to request their removal on the grounds that they breached Facebook's community standards, a tactic which did not work when complaining about demonstrably false and aggressive memes about Muslims, social security claimants or migrants (Report Britain First, 2015).
The use of an external coercive authority in order to halt the spread of memes can be seen as a form of citizen-generated censorship, in this case deployed against a deceptive and manipulative form of propaganda consistent with more negative definitions of the term, such as those proposed by Bakir et al. (2015) and Miller and Sabir (2012). However, the appeal to a central authority to counter such propaganda highlights the very different set of power relations in play when discussing the digital propaganda of non-elite and extremist groups, as opposed to that produced by the state. As a practice, this action is comparable to other legalistic tactics such as the pursuit of Britain First under copyright legislation for the use of the Royal British Legion poppy symbol and the Crown (BBC News, 2014a; Dearden, 2014a). Using these tactics as a means to limit the spread of a contrary meme can also be considered an imitable propaganda technique, and one which Britain First have used themselves in the organised submission of complaints to Channel 4 over a docudrama that criticised UKIP (Saul, 2015). Given the nature of the material that anti-fascist groups are seeking to ban and the organisation they are campaigning against, the role of coercion, cast as an unambiguous negative when discussed by Bakir et al. (2015) and Miller and Sabir (2012), may need to be revisited and problematised. The moral ambiguity is further complicated in this instance, because the centralised authority from which anti-fascist groups were requesting censorship is Facebook, an undemocratic corporate entity which has itself been criticised for unethical behaviour (Arthur, 2014; Solon, 2016).
The use of satire is a common theme amongst groups producing anti-Britain First memes, in particular frequent allusions to the poor grammar and spelling which are a common feature of Britain First's output. This is widely evident in page names such as Britain Furst and Britian First and is also a common feature of memes distributed by Fuck Britain First. For example, Figure 3 combines poor grammar with a comically exaggerated version of British nationalism in relation to 'English elephants', designed to highlight the absurdity of the narratives within Britain First's memes. The use of satire as a technique for propagation is easily understood from the meme's eye view. Racist narratives such as those perpetuated by Britain First still have a substantial pool of receptive minds in which to replicate and can therefore spread without the need for much centralised coordination. However, the political environment is heterogeneous and also contains many minds which are actively hostile to the ideas Britain First espouses. This opens up a replication opportunity for memes of counter-propaganda such as Figure 3, which use the increased replicator power of humour to subvert Britain First's message. The meme also includes the instruction to provide 'help and money', as a common criticism of Britain First by critical commentators is their heavy promotion of merchandising (including the use of the Crown and the British Legion's poppy, which drew lawsuits) in order to perpetuate their extremist ideology (Clark, 2015; Exposing Britain First, 2015; Report Britain First, 2015).
The final tactic used by the counter-propaganda sites is possibly the least successful; it involves the generation of explanatory memes which highlight inconsistencies in claims made in specific Britain First posts. This is a tactic used in particular by Report Britain First, for example, creating memes which question Britain First's claims to have slept on the streets in solidarity with homeless veterans by analysing the times at which their photos were taken. In keeping with the analysis of Internet memes undertaken by Shifman (2012), the satirical counter-propaganda memes were considerably more popular, with typical analytical memes attracting only a handful of shares and likes; in Shifman's study, humour was a common indicator of popularity, whilst political memes had a more difficult time getting replicated. Interestingly, these analytical memes are the only ones which clearly seek to persuade, with the majority of the propaganda memes from both Britain First and their detractors seeking instead to appeal to a base of existing sympathisers. The lack of success in replication when discussing explicitly persuasive memes could be explained by the entrenchment of political ideas, as this can result in disconfirmation bias (Karlsen et al., 2017), although it could also be that the very high level of specificity exhibited by these memes makes them less effective as replicators.
In general, persuasion, to the extent that it is present at all within this exchange, comes in the form of persuading those already convinced of the rightness or wrongness of Britain First's position to replicate their propaganda or the counter-propaganda which seeks to undermine and discredit it. Whilst the memes from the satirical sites and the censorship tactics have elements of 'copy the instructions' to them, the analytical memes tend to be fairly specific to individual instances of Britain First propaganda, rather than following a more longstanding imitable pattern. They are not, however, sufficiently prolific to provide an effective counter-narrative against the original posts.

Conclusion
The memes distributed via Britain First's Facebook page have propagated a selection of extremist narratives, notably in connection with race and Islam, with considerable success. This may seem surprising given that over two-thirds of the British public deny having any racial prejudice (NatCen, 2014); however, the specific areas in which they perpetuate memes tap into views and narratives which are widespread throughout contemporary UK culture. Islamophobic, racist and anti-welfare narratives persist, even in the face of evidence that they are often based on false assumptions and widespread ignorance. Despite hopeful assertions by some scholars, the activities of Britain First suggest that, in some cases, deceptive propaganda championed by elite groups such as Conservative politicians and right-wing newspapers can in fact be supported and exacerbated by digital means rather than automatically challenged, a fact which has become all too clear since this case study was undertaken, in light of the rise of the American far right and the presidency of Donald Trump.
Although there is considerable evidence that groups opposed to Britain First have sought to challenge them, the proliferation of counter-propaganda against the group has not been as effective as the propagation of the group's page itself. In addition, despite some successes in appealing to Facebook over breaches of the site's terms and conditions, much of the counter-propaganda shares with Britain First's own posts a sense of preaching to the converted, a pattern consistent with research into digital echo chambers and 'trench warfare'. A call-response pattern emerges in which Britain First generates memes appealing to those who are critical of immigration and multiculturalism, and groups opposed to Britain First produce material which subverts and satirises those memes for the benefit of those critical of Britain First and its political agenda. In both instances, it is notable that while each meme can be understood in terms of propaganda, the only aspects which seek to persuade are the calls to action, sometimes to attend demonstrations, but more frequently simply to perpetuate the propaganda of each group. The only examples from this case study of memes which genuinely sought to persuade people to change their ideas were those which critically addressed memes originally posted by Britain First and challenged their content. These were the least successful at replicating, suggesting a need to qualify and contextualise the role of persuasion in defining propaganda in this context.
Similarly, although Britain First's organisation and Facebook page can be seen as evidence of the organised propagation of ideas, the distribution of specific memes online is much more chaotic and requires little in the way of traditional organisational structures, although their presence can certainly facilitate the spread of specific memes by congregating like-minded people together in a space where they can easily share them throughout their own networks. Like persuasion, organisation is a possible but not necessary aspect of digital propaganda, whereas propagation remains central to understanding the spread of ideas and ideological memes online. It is also worth emphasising that whilst the digital environment is one in which ideas, and the actions involved in creating and disseminating ideas, can spread, it is not an isolated, separate universe. The murder of the MP Jo Cox by a far-right extremist linked to Britain First serves as a stark reminder that where the right memes meet the wrong mind, the consequences can be of great concern to those invested in the peaceful functioning of the body politic.

Funding
This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.

Notes
1. All figures for numbers of shares, likes and comments, including page likes are accurate as of 15 May 2015 and may not match those in screenshots taken at later dates.