Propaganda Explained

Propaganda is a form of communication that is aimed at influencing the attitude of a community toward some cause or position. Propaganda is usually repeated and dispersed over a wide variety of media in order to create the desired result in audience attitudes.

As opposed to impartially providing information, propaganda, in its most basic sense, presents information primarily to influence an audience. Propaganda often presents facts selectively (thus possibly lying by omission) to encourage a particular synthesis, or uses loaded messages to produce an emotional rather than rational response to the information presented. The desired result is a change of the attitude toward the subject in the target audience to further a political agenda. Propaganda can be used as a form of political warfare.

While the term propaganda has acquired a strongly negative connotation by association with its most manipulative and jingoistic examples, propaganda in its original sense was neutral, and could refer to uses that were generally benign or innocuous, such as public health recommendations, signs encouraging citizens to participate in a census or election, or messages encouraging persons to report crimes to the police, among others.

Etymology

The term is not pejorative in origin and its political sense dates back to World War I.[1]

Types

Defining propaganda has always been a problem. The main difficulties have involved differentiating propaganda from other types of persuasion, and avoiding an "if they do it then that's propaganda, while if we do it then that's information and education" biased approach. Garth Jowett and Victoria O'Donnell have provided a concise, workable definition of the term: "Propaganda is the deliberate, systematic attempt to shape perceptions, manipulate cognitions, and direct behavior to achieve a response that furthers the desired intent of the propagandist."[2] More comprehensive is the description by Richard Alan Nelson: "Propaganda is neutrally defined as a systematic form of purposeful persuasion that attempts to influence the emotions, attitudes, opinions, and actions of specified target audiences for ideological, political or commercial purposes through the controlled transmission of one-sided messages (which may or may not be factual) via mass and direct media channels. A propaganda organization employs propagandists who engage in propagandism—the applied creation and distribution of such forms of persuasion."[3]

Both definitions focus on the communicative process involved, or more precisely on the purpose of the process, and allow "propaganda" to be considered objectively and then interpreted as positive or negative behavior depending on the perspective of the viewer or listener.

Propaganda is generally an appeal to emotion, not intellect. It shares techniques with advertising and public relations, each of which can be thought of as propaganda that promotes a commercial product or shapes the perception of an organization, person, or brand. In post–World War II usage the word "propaganda" more typically refers to political or nationalist uses of these techniques or to the promotion of a set of ideas, since the term had gained a pejorative meaning. The resulting avoidance of the term eventually reached politics itself, with "political marketing" and other designations substituted for "political propaganda".

Propaganda was often used to influence opinions and beliefs on religious issues, particularly during the split between the Roman Catholic Church and the Protestant churches. Propaganda has become more common in political contexts, in particular to refer to certain efforts sponsored by governments and political groups, but also, often, by covert interests. In the early 20th century, propaganda was exemplified in the form of party slogans. Also in the early 20th century, the term was used by the founders of the nascent public relations industry to describe their activities. This usage died out around the time of World War II, as the industry started to avoid the word, given the pejorative connotation it had acquired.

Literally translated from the Latin gerundive as "things that must be disseminated", in some cultures the term is neutral or even positive, while in others it has acquired a strong negative connotation. The connotations of the term "propaganda" can also vary over time. For example, in Portuguese- and some Spanish-speaking countries, particularly in the Southern Cone, the word "propaganda" usually refers to the most common manipulative medium: "advertising".

In English, "propaganda" was originally a neutral term used to describe the dissemination of information in favor of any given cause. During the 20th century, however, the term acquired a thoroughly negative meaning in western countries, representing the intentional dissemination of often false, but certainly "compelling" claims to support or justify political actions or ideologies. This redefinition arose because both the Soviet Union and Germany's government under Hitler admitted explicitly to using propaganda favoring, respectively, communism and Nazism, in all forms of public expression. As these ideologies were repugnant to liberal western societies, the negative feelings toward them came to be projected into the word "propaganda" itself.

Roderick Hindery argues[4] that propaganda exists on the political left and right and in mainstream centrist parties. Hindery further argues that debates about most social issues can be productively revisited in the context of asking "what is or is not propaganda?" Not to be overlooked is the link between propaganda, indoctrination, and terrorism/counterterrorism. He argues that threats to destroy are often as socially disruptive as physical devastation itself.

Propaganda also has much in common with public information campaigns by governments, which are intended to encourage or discourage certain forms of behavior (such as wearing seat belts, not smoking, not littering, and so forth). Again, the emphasis in propaganda is more political. Propaganda can take the form of leaflets, posters, and TV and radio broadcasts, and can also extend to any other medium. In the case of the United States, there is also an important legal distinction between advertising (a type of overt propaganda) and what the Government Accountability Office (GAO), an arm of the United States Congress, refers to as "covert propaganda".

Journalistic theory generally holds that news items should be objective, giving the reader an accurate background and analysis of the subject at hand. On the other hand, advertising has evolved from traditional commercial advertisements to include a new type in the form of paid articles or broadcasts disguised as news. These generally present an issue in a very subjective and often misleading light, primarily meant to persuade rather than inform. Normally they use only subtle propaganda techniques rather than the more obvious ones used in traditional commercial advertisements. If the reader believes that a paid advertisement is in fact a news item, the message the advertiser is trying to communicate will be more easily "believed" or "internalized".

Such advertisements are considered obvious examples of "covert" propaganda because they take on the appearance of objective information rather than that of propaganda, which is misleading. Federal law specifically mandates that any advertisement appearing in the format of a news item must state that the item is in fact a paid advertisement.

The propagandist seeks to change the way people understand an issue or situation for the purpose of changing their actions and expectations in ways that are desirable to the interest group. Propaganda, in this sense, serves as a corollary to censorship, in which the same purpose is achieved not by filling people's minds with approved information, but by preventing people from being confronted with opposing points of view. What sets propaganda apart from other forms of advocacy is the willingness of the propagandist to change people's understanding through deception and confusion rather than through persuasion and understanding. The leaders of an organization know the information to be one-sided or untrue, but this may not be true for the rank-and-file members who help to disseminate the propaganda.

More in line with the religious roots of the term, it is also used widely in the debates about new religious movements (NRMs), both by people who defend them and by people who oppose them. The latter pejoratively call these NRMs cults. Anti-cult activists and countercult activists accuse the leaders of what they consider cults of using propaganda extensively to recruit followers and keep them. Some social scientists, such as the late Jeffrey Hadden, and CESNUR affiliated scholars accuse ex-members of "cults" who became vocal critics and the anti-cult movement of making these unusual religious movements look bad without sufficient reasons.

Propaganda is a powerful weapon in war; it is used to dehumanize and create hatred toward a supposed enemy, either internal or external, by creating a false image in the mind. This can be done by using derogatory or racist terms, avoiding some words or by making allegations of enemy atrocities. Most propaganda wars require the home population to feel the enemy has inflicted an injustice, which may be fictitious or may be based on facts. The home population must also decide that the cause of their nation is just.

Propaganda is also one of the methods used in psychological warfare, which may also involve false flag operations. The term propaganda may also refer to false information meant to reinforce the mindsets of people who already believe as the propagandist wishes. The assumption is that, if people believe something false, they will constantly be assailed by doubts. Since these doubts are unpleasant (see cognitive dissonance), people will be eager to have them extinguished, and are therefore receptive to the reassurances of those in power. For this reason propaganda is often addressed to people who are already sympathetic to the agenda. This process of reinforcement uses an individual's predisposition to self-select "agreeable" information sources as a mechanism for maintaining control.

Propaganda can be classified according to the source and nature of the message. White propaganda generally comes from an openly identified source and is characterized by gentler methods of persuasion, such as standard public relations techniques and one-sided presentation of an argument. Black propaganda is identified as being from one source but is in fact from another. This is most commonly done to disguise the true origins of the propaganda, be it from an enemy country or from an organization with a negative public image. Grey propaganda is propaganda without any identifiable source or author. A major application of grey propaganda is making enemies believe falsehoods using straw arguments: as phase one, to make someone believe "A", one releases as grey propaganda "B", the opposite of "A". In phase two, "B" is discredited using some straw man. The enemy will then assume "A" to be true.

In scale, these different types of propaganda can also be defined by the potential of true and correct information to compete with the propaganda. For example, opposition to white propaganda is often readily found and may slightly discredit the propaganda source. Opposition to grey propaganda, when revealed (often by an inside source), may create some level of public outcry. Opposition to black propaganda is often unavailable and may be dangerous to reveal, because public cognizance of black propaganda tactics and sources would undermine or backfire against the very campaign the black propagandist supported.

Propaganda may be administered in insidious ways. For instance, disparaging disinformation about the history of certain groups or foreign countries may be encouraged or tolerated in the educational system. Since few people actually double-check what they learn at school, such disinformation will be repeated by journalists as well as parents, thus reinforcing the idea that the disinformation item is really a "well-known fact", even though no one repeating the myth is able to point to an authoritative source. The disinformation is then recycled in the media and in the educational system, without the need for direct governmental intervention on the media. Such permeating propaganda may be used for political goals: by giving citizens a false impression of the quality or policies of their country, they may be incited to reject certain proposals or certain remarks or ignore the experience of others. See also: black propaganda, marketing, advertising.

Techniques

See also: Doublespeak, Cult of personality and Factoid.

Common media for transmitting propaganda messages include news reports, government reports, historical revision, junk science, books, leaflets, movies, radio, television, and posters. Less common nowadays are letter-post envelopes, examples of which survive from the time of the American Civil War (Connecticut Historical Society, Civil War Collections, Covers). In principle, anything that appears on a poster can be produced at a reduced scale on a pocket-style envelope with proportions corresponding to the poster. In the case of radio and television, propaganda can exist in news, current-affairs, or talk-show segments, as advertising or public-service announcement "spots", or as long-running advertorials. Propaganda campaigns often follow a strategic transmission pattern to indoctrinate the target group. This may begin with a simple transmission such as a leaflet dropped from a plane or an advertisement. Generally these messages will contain directions on how to obtain more information, via a web site, hot line, radio program, et cetera (as is also seen in selling, among other goals). The strategy intends to move the individual from information recipient to information seeker through reinforcement, and then from information seeker to opinion leader through indoctrination.

A number of techniques based in social psychological research are used to generate propaganda. Many of these same techniques can be found under logical fallacies, since propagandists use arguments that, while sometimes convincing, are not necessarily valid.

Some time has been spent analyzing the means by which propaganda messages are transmitted. That work is important, but it is clear that information dissemination strategies become propaganda strategies only when coupled with propagandistic messages. Identifying these messages is a necessary prerequisite to studying the methods by which they are spread. Below are a number of techniques for generating propaganda:

    Ad hominem
  • A Latin phrase that has come to mean attacking one's opponent, as opposed to attacking their arguments.
    Ad nauseam
  • This argument approach uses tireless repetition of an idea. An idea, especially a simple slogan, that is repeated enough times, may begin to be taken as the truth. This approach works best when media sources are limited or controlled by the propagator.
    Appeal to authority
  • Appeals to authority cite prominent figures to support a position, idea, argument, or course of action.
    Appeal to fear
  • Appeals to fear seek to build support by instilling anxiety and panic in the general population; for example, Joseph Goebbels exploited Theodore Kaufman's Germany Must Perish! to claim that the Allies sought the extermination of the German people.
    Appeal to prejudice
  • Using loaded or emotive terms to attach value or moral goodness to believing the proposition. Used in biased or misleading ways.
    Bandwagon
  • Bandwagon and "inevitable-victory" appeals attempt to persuade the target audience to join in and take the course of action that "everyone else is taking".
    Inevitable victory
  • Invites those not already on the bandwagon to join those already on the road to certain victory. Those already or at least partially on the bandwagon are reassured that staying aboard is their best course of action.
    Join the crowd
  • This technique reinforces people's natural desire to be on the winning side. This technique is used to convince the audience that a program is an expression of an irresistible mass movement and that it is in their best interest to join.
    Beautiful people
  • The type of propaganda that deals with famous people or depicts attractive, happy people. This makes other people think that if they buy a product or follow a certain ideology, they too will be happy or successful.
    Big Lie
  • The repeated articulation of a complex of events that justify subsequent action. The descriptions of these events have elements of truth, and the "big lie" generalizations merge and eventually supplant the public's accurate perception of the underlying events. After World War I, the German "stab in the back" explanation of the cause of their defeat became a justification for Nazi re-militarization and revanchist aggression.
    Black-and-white fallacy
  • Presenting only two choices, with the product or idea being propagated as the better choice. For example: "You're either with us, or against us...."
    Classical conditioning
  • All vertebrates, including humans, respond to classical conditioning. That is, if object A is always present when object B is present, and object B causes a physical reaction (e.g., disgust, pleasure), then when presented with object A alone we will experience the same feelings.
    Cognitive dissonance
  • People desire to be consistent. Suppose a pollster finds that a certain group of people hate his candidate for senator but love actor A. The campaign uses actor A's endorsement of the candidate to change people's minds, because people cannot tolerate inconsistency: they are forced either to dislike the actor or to like the candidate.
    Common man
  • The "plain folks" or "common man" approach attempts to convince the audience that the propagandist's positions reflect the common sense of the people. It is designed to win the confidence of the audience by communicating in the common manner and style of the target audience. Propagandists use ordinary language and mannerisms (and clothe their message in face-to-face and audiovisual communications) in attempting to identify their point of view with that of the average person. For example, a propaganda leaflet may make an argument on a macroeconomic issue, such as unemployment insurance benefits, using everyday terms: "Given that the country has little money during this recession, we should stop paying unemployment benefits to those who do not work, because that is like maxing out all your credit cards during a tight period, when you should be tightening your belt."
    Cult of personality
  • A cult of personality arises when an individual uses mass media to create an idealized and heroic public image, often through unquestioning flattery and praise. The hero personality then advocates the positions that the propagandist desires to promote. For example, modern propagandists hire popular personalities to promote their ideas and/or products.
    Demonizing the enemy
  • Making individuals from the opposing nation, from a different ethnic group, or those who support the opposing viewpoint appear to be subhuman (e.g., the Vietnam War-era term "gooks" for soldiers of the National Front for the Liberation of South Vietnam, aka Vietcong or "VC"), worthless, or immoral, through suggestion or false accusations. Dehumanizing is also a term used synonymously with demonizing; the latter usually serves as an aspect of the former.
    Diktat
  • This technique hopes to simplify the decision making process by using images and words to tell the audience exactly what actions to take, eliminating any other possible choices. Authority figures can be used to give the order, overlapping it with the Appeal to authority technique, but not necessarily. The Uncle Sam "I want you" image is an example of this technique.
    Disinformation
  • The creation or deletion of information from public records, with the purpose of making a false record of an event or of the actions of a person or organization, including outright forgery of photographs, motion pictures, broadcasts, and sound recordings as well as printed documents.
    Door-in-the-face technique
  • Is used to increase a person's latitude of acceptance. For example, if a salesperson wants to sell an item for $100 but the public is only willing to pay $50, the salesperson first offers the item at a higher price (e.g., $200) and subsequently reduces the price to $100 to make it seem like a good deal.
    Euphoria
  • The use of an event that generates euphoria or happiness, or using an appealing event to boost morale. Euphoria can be created by declaring a holiday, making luxury items available, or mounting a military parade with marching bands and patriotic messages.
    Fear, uncertainty and doubt
  • An attempt to influence public perception by disseminating negative and dubious or false information designed to undermine the credibility of an opponent's beliefs.
    Flag-waving
  • An attempt to justify an action on the grounds that doing so will make one more patriotic, or in some way benefit a country, group or idea the targeted audience supports.
    Foot-in-the-door technique
  • Often used by recruiters and salesmen. For example, a member of the opposite sex walks up to the victim and pins a flower on them or gives them a small gift. The victim thanks them and has now incurred a psychological debt to the perpetrator. The perpetrator eventually asks for a larger favor (e.g., a donation or to buy something far more expensive). The unwritten social contract between the victim and perpetrator causes the victim to feel obligated to reciprocate by agreeing to do the larger favor or buy the more expensive gift.
    Glittering generalities
  • Glittering generalities are emotionally appealing words that are applied to a product or idea, but present no concrete argument or analysis. This technique has also been referred to as the PT Barnum effect.
    Half-truth
  • A half-truth is a deceptive statement, which may come in several forms and includes some element of truth. The statement might be partly true, the statement may be totally true but only part of the whole truth, or it may utilize some deceptive element, such as improper punctuation, or double meaning, especially if the intent is to deceive, evade, blame or misrepresent the truth.
    Labeling
  • A euphemism is used when the propagandist attempts to increase the perceived quality, credibility, or credence of a particular ideal. A dysphemism is used when the intent is to discredit, diminish the perceived quality, or hurt the perceived righteousness of the mark. By creating a "label", "category", or "faction" of a population, it is much easier to make an example of these larger bodies, because the propagandist can uplift or defame the mark without actually incurring legal defamation. Example: "liberal" is a dysphemism intended to diminish the perceived credibility of a particular mark. By taking a displeasing argument presented by a mark, the propagandist can quote that person and then attack "liberals" in an attempt both to (1) create a political battle-ax of unaccountable aggression and (2) diminish the quality of the mark. If the propagandist uses the label on too many perceivably credible individuals, the word can be muddied by broadcasting bad examples of "liberals" into the media. Labeling can be thought of as a subset of guilt by association, another logical fallacy.
    Latitudes of acceptance
  • If a person's message lies outside the bounds of acceptance for an individual or group, most techniques will engender psychological reactance (simply hearing the argument will make the message even less acceptable). There are two techniques for increasing the bounds of acceptance. First, one can take an even more extreme position that will make more moderate positions seem more acceptable; this is similar to the door-in-the-face technique. Alternatively, one can moderate one's own position to the edge of the latitude of acceptance and then, over time, slowly move toward the previously unacceptable position.[5]
    Love bombing
  • Used to recruit members to a cult or ideology by having a group of individuals cut off a person from their existing social support and replace it entirely with members of the group who deliberately bombard the person with affection in an attempt to isolate the person from their prior beliefs and value system—see Milieu control.
    Lying and deception
  • Lying and deception can be the basis of many propaganda techniques, including ad hominem arguments, the Big Lie, defamation, door-in-the-face, half-truth, name-calling, or any other technique that is based on dishonesty or deception. For example, many politicians have been found to frequently stretch or break the truth.
    Managing the news
  • According to Adolf Hitler, "The most brilliant propagandist technique will yield no success unless one fundamental principle is borne in mind constantly - it must confine itself to a few points and repeat them over and over."[6][7] This idea is consistent with the principle of classical conditioning as well as the idea of "staying on message."
    Milieu control
  • An attempt to control the social environment and ideas through the use of social pressure.
    Name-calling
  • Propagandists use the name-calling technique to incite fears and arouse prejudices in their hearers, with the intent that the bad names will cause hearers to construct a negative opinion about a group or set of beliefs or ideas that the propagandist wants hearers to denounce. The method is intended to provoke conclusions about a matter apart from impartial examination of the facts. Name-calling is thus a substitute for rational, fact-based arguments against an idea or belief on its own merits.[8]
    Obfuscation, intentional vagueness, confusion
  • Generalities are deliberately vague so that the audience may supply its own interpretations. The intention is to move the audience by use of undefined phrases, without analyzing their validity or attempting to determine their reasonableness or application; the audience draws its own interpretations rather than being presented with an explicit idea. In trying to "figure out" the propaganda, the audience forgoes judgment of the ideas presented, even though their validity, reasonableness, and application could still be considered.
    Obtain disapproval or Reductio ad Hitlerum
  • This technique is used to persuade a target audience to disapprove of an action or idea by suggesting that the idea is popular with groups hated, feared, or held in contempt by the target audience. Thus if a group that supports a certain policy is led to believe that undesirable, subversive, or contemptible people support the same policy, then the members of the group may decide to change their original position. This is a form of faulty logic, where A is said to include X, and B is said to include X; therefore, A = B.
    Operant conditioning
  • Operant conditioning involves learning through imitation. For example, watching an appealing person buy products or endorse positions teaches a person to buy the product or endorse the position. Operant conditioning is the underlying principle behind the Ad Nauseam, Slogan and other repetition public relations campaigns.
    Oversimplification
  • Favorable generalities are used to provide simple answers to complex social, political, economic, or military problems.
    Pensée unique
  • Enforced reduction of discussion by use of overly simplistic phrases or arguments (e.g., "There is no alternative to war.")
    Quotes out of context
  • Selectively editing quotes to change meanings - political documentaries designed to discredit an opponent or an opposing political viewpoint often make use of this technique.
    Rationalization (making excuses)
  • Individuals or groups may use favorable generalities to rationalize questionable acts or beliefs. Vague and pleasant phrases are often used to justify such actions or beliefs.
    Red herring
  • Presenting data or issues that, while compelling, are irrelevant to the argument at hand, and then claiming that it validates the argument.
    Repetition
  • This is the repeating of a certain symbol or slogan so that the audience remembers it. This could be in the form of a jingle or an image placed on nearly everything in the picture/scene.
    Scapegoating
  • Assigning blame to an individual or group, thus alleviating feelings of guilt from responsible parties and/or distracting attention from the need to fix the problem for which blame is being assigned.
    Slogans
  • A slogan is a brief, striking phrase that may include labeling and stereotyping. Although slogans may be enlisted to support reasoned ideas, in practice they tend to act only as emotional appeals. Opponents of the US's invasion and occupation of Iraq use the slogan "blood for oil" to suggest that the invasion and its human losses were undertaken to access Iraq's oil riches. On the other hand, supporters who argue that the US should continue to fight in Iraq use the slogan "cut and run" to suggest withdrawal is cowardly or weak.
    Stereotyping
  • This technique attempts to arouse prejudices in an audience by labeling the object of the propaganda campaign as something the target audience fears, hates, loathes, or finds undesirable. For instance, reporting on a foreign country or social group may focus on the stereotypical traits that the reader expects, even though they are far from being representative of the whole country or group; such reporting often focuses on the anecdotal. In graphic propaganda, including war posters, this might include portraying enemies with stereotyped racial features.
    Straw man
  • A straw man argument is an informal fallacy based on misrepresentation of an opponent's position. To "attack a straw man" is to create the illusion of having refuted a proposition by substituting a superficially similar proposition (the "straw man"), and refuting it, without ever having actually refuted the original position.
    Testimonial
  • Testimonials are quotations, in or out of context, especially cited to support or reject a given policy, action, program, or personality. The reputation or the role (expert, respected public figure, etc.) of the individual giving the statement is exploited. The testimonial places the official sanction of a respected person or authority on a propaganda message. This is done in an effort to cause the target audience to identify itself with the authority or to accept the authority's opinions and beliefs as its own.
    Third party technique
  • Works on the principle that people are more willing to accept an argument from a seemingly independent source of information than from someone with a stake in the outcome. It is a marketing strategy commonly employed by public relations (PR) firms that involves placing a premeditated message in the "mouth of the media." The third party technique can take many forms, ranging from the hiring of journalists to report on the organization in a favorable light, to using scientists within the organization to present their perhaps prejudicial findings to the public. Frequently, astroturf groups or front groups are used to deliver the message.

    Foreign governments, particularly those that own marketable commercial products or services, often promote their interests and positions through the advertising of those goods, because the target audience is not only largely unaware of the forum as a vehicle for foreign messaging but also willing to receive the message while in a mental state of absorbing information from advertisements during television commercial breaks, while reading a periodical, or while passing by billboards in public spaces. A prime example of this messaging technique is advertising campaigns to promote international travel. While advertising foreign destinations and services may stem from the typical goal of increasing revenue by drawing more tourism, some travel campaigns carry the additional or alternative intended purpose of promoting good sentiments, or improving existing ones, among the target audience towards a given nation or region. It is common for advertising promoting foreign countries to be produced and distributed by the tourism ministries of those countries, so these ads often carry political statements and/or depictions of the foreign government's desired international public perception. Additionally, a wide range of foreign airlines and travel-related services which advertise separately from the destinations themselves are owned by their respective governments; examples include, though are not limited to, the Emirates airline (Dubai), Singapore Airlines (Singapore), Qatar Airways (Qatar), China Airlines (Taiwan/Republic of China), and Air China (People's Republic of China). By depicting their destinations, airlines, and other services in a favorable and pleasant light, countries market themselves to populations abroad in a manner that could mitigate prior public impressions. See also: soft power.

    Thought-terminating cliché
  • A commonly used phrase, sometimes passing as folk wisdom, used to quell cognitive dissonance.
    Transfer
  • Also known as association, this technique involves projecting the positive or negative qualities of one person, entity, object, or value onto another to make the second more acceptable or to discredit it. It evokes an emotional response, which stimulates the target to identify with recognized authorities. Often highly visual, the technique frequently uses symbols (e.g. swastikas) superimposed over other visual images (e.g. logos). These symbols may be used in place of words.
    Selective truth
  • Richard Crossman, the British Deputy Director of the Psychological Warfare Division (PWD) of the Supreme Headquarters Allied Expeditionary Force (SHAEF) during the Second World War, said: "In propaganda truth pays... It is a complete delusion to think of the brilliant propagandist as being a professional liar. The brilliant propagandist is the man who tells the truth, or that selection of the truth which is requisite for his purpose, and tells it in such a way that the recipient does not think he is receiving any propaganda... The art of propaganda is not telling lies, but rather selecting the truth you require and giving it mixed up with some truths the audience wants to hear."[9]
    Unstated assumption
  • This technique is used when the idea the propagandist wants to plant would seem less credible if explicitly stated. The concept is instead repeatedly assumed or implied.
    Virtue words
  • These are words in the value system of the target audience that produce a positive image when attached to a person or issue. Peace, happiness, security, wise leadership, freedom, "The Truth", etc. are virtue words. Many see religiosity as a virtue, so associating a person or issue with it can be affectively beneficial. The use of virtue words is considered an instance of the transfer technique.

    Models

    Social Psychology

    The field of social psychology includes the study of persuasion. Social psychologists may be trained as sociologists or as psychologists, and the field encompasses many theories of and approaches to persuasion. For example, communication theory points out that people can be persuaded by the communicator's credibility, expertise, trustworthiness, and attractiveness. The elaboration likelihood model, as well as heuristic models of persuasion, suggests that a number of factors (e.g., the degree of interest of the recipient of the communication) influence the degree to which people allow superficial factors to persuade them. Nobel Prize-winning psychologist Herbert Simon argued that people are cognitive misers: in a society of mass information, people are forced to make decisions quickly and often superficially rather than deliberately.

    Social cognitive theories suggest that people have inherent biases in the way they perceive the world, and that these biases can be used to manipulate them. For example, people tend to attribute others' misfortune (e.g., poverty) to the person while downplaying external factors (e.g., being born into poverty); this bias is referred to as the fundamental attribution error. Confirmation bias refers to people's tendency to stick with their original theories even when evidence shows those theories to be incorrect (see list of cognitive biases). Propaganda frequently plays upon such existing biases to achieve its ends. For example, the illusion of control refers to people's seemingly innate desire to believe they can and should control their lives, and propagandists frequently argue their point by claiming that the other side is attempting to take away your control. Republicans frequently claim that Democrats are attempting to control you by imposing big government on your private life and taking away your spending power through higher taxes, while Democrats frequently argue that they are reining in big corporations that attempt to influence elections with money and power and to take away your job, health care, and so on. According to bipartisan analysis, these claims are frequently untrue.[10]

    Role theory holds that an idea can be made to seem appropriate by associating it with a role. For example, the advertising agency Leo Burnett Worldwide used the Marlboro Man to persuade men that Marlboro cigarettes were part of being a cool, risk-taking, rebellious cowboy, fearless in the face of the threat of cancer. The campaign quadrupled sales of the cigarettes. Of course, smoking has nothing to do with being a cowboy or a rebel; this is a fantasy, but the campaign's success is consistent with the tenets of role theory. In fact, at least three actors who played the Marlboro Man died of lung cancer.

    Herman and Chomsky's propaganda model

    The propaganda model is a theory advanced by Edward S. Herman and Noam Chomsky that alleges systemic biases in the mass media and seeks to explain them in terms of structural economic causes.

    First presented in their 1988 book Manufacturing Consent: the Political Economy of the Mass Media, the propaganda model views the private media as businesses selling a product — readers and audiences (rather than news) — to other businesses (advertisers) and relying primarily on government and corporate information and propaganda. The theory postulates five general classes of "filters" that determine the type of news that is presented in news media: Ownership of the medium, the medium's Funding, Sourcing of the news, Flak, and Anti-communist ideology.

    The first three (ownership, funding, and sourcing) are generally regarded by the authors as the most important. Although the model was based mainly on a characterization of United States media, Chomsky and Herman believe the theory is equally applicable to any country that shares the basic economic structure and organizing principles the model postulates as the cause of media biases. After the Soviet Union disintegrated, Chomsky suggested that terrorism and Islam would replace anti-communism as the fifth filter.

    Ross' epistemic merit model

    The epistemic merit model is a method for understanding propaganda conceived by Sheryl Tuttle Ross and detailed in her 2002 article for the Journal of Aesthetic Education, "Understanding Propaganda: The Epistemic Merit Model and Its Application to Art".[11] Ross developed the model out of concern about narrow, misleading definitions of propaganda. She contrasted her model with the ideas of Pope Gregory XV, the Institute for Propaganda Analysis, Alfred Lee, F. C. Bartlett, and Hans Speier. Insisting that each of their respective discussions of propaganda is too narrow, Ross proposed her own definition.

    To discuss propaganda properly, Ross argues, one must consider a threefold communication model: that of Sender-Message-Receiver. "That is... propaganda involve[s]... the one who is persuading (Sender) [who is] doing so intentionally, [the] target for such persuasion (Receiver) and [the] means of reaching that target (Message)." Four conditions must hold for a message to be considered propaganda. First, propaganda involves the intention to persuade. Second, propaganda is sent on behalf of a sociopolitical institution, organization, or cause. Third, the recipient of propaganda is a socially significant group of people. Finally, propaganda is an epistemic struggle to challenge others' thoughts.

    Ross claims that it is misleading to say that propaganda is simply false, or that it necessarily involves lying, since the propagandist often believes in what he or she is propagandizing. In other words, it is not necessarily a lie if the person who creates the propaganda is trying to persuade you of a view that they actually hold. "The aim of the propagandist is to create the semblance of credibility." That is, propagandists appeal to an epistemology that is weak or defective.

    Throughout history, those who have wished to persuade have used art to get their message out. This can be accomplished by hiring artists for the express aim of propagandizing, or by investing a previously non-political work with new meaning. Therefore, Ross states, it is important to consider "the conditions of its making [and] the conditions of its use."

    History

    Ancient propaganda

    Propaganda has been a human activity as far back as reliable recorded evidence exists. The Behistun Inscription (c. 515 BC) detailing the rise of Darius I to the Persian throne is viewed by most historians as an early example of propaganda. The Arthashastra written by Chanakya (c. 350 - 283 BC), a professor of political science at Takshashila University and a prime minister of the Maurya Empire in ancient India, discusses propaganda in detail, such as how to spread propaganda and how to apply it in warfare. His student Chandragupta Maurya (c. 340 - 293 BC), founder of the Maurya Empire, employed these methods during his rise to power.[12] The writings of Romans such as Livy (c. 59 BC - 17 AD) are considered masterpieces of pro-Roman propaganda. Another example of early propaganda is the 12th century work, The War of the Irish with the Foreigners, written by the Dál gCais to portray themselves as legitimate rulers of Ireland.

    Propaganda during the Reformation

    See main article: Propaganda during the Reformation.

    Propaganda during the Reformation, helped by the spread of the printing press throughout Europe, and in particular within Germany, made new ideas, thoughts, and doctrine available to the public in ways that had never been seen before the sixteenth century. The printing press was invented in approximately 1450 and quickly spread to other major cities around Europe; by the time the Reformation was underway in 1517, there were printing centers in over 200 of the major European cities.[13] These centers became the primary producers of both Reformation works by the Protestant Reformers and anti-Reformation works put forth by the Roman Catholics.

    19th and 20th centuries

    Gabriel Tarde's Laws of Imitation (1890) and Gustave Le Bon's The Crowd: A Study of the Popular Mind (1895) were two of the first codifications of propaganda techniques, and they influenced many later writers, including Sigmund Freud. Hitler's Mein Kampf is heavily influenced by Le Bon's theories. The journalist Walter Lippmann also worked on the subject in Public Opinion (1922), as did the American advertising pioneer and founder of the field of public relations Edward Bernays, a nephew of Freud, who wrote the book Propaganda (1928).[14]

    During World War I, President Woodrow Wilson recruited Lippmann and Bernays into the Creel Committee (the Committee on Public Information), whose mission was to sway popular opinion in favor of entering the war on the side of the United Kingdom. The Creel Committee provided themes for speeches by "four-minute men" at public functions, and also encouraged censorship of the American press. After World War I, the word propaganda acquired a growing negative connotation. This was due in part to the 1920 book How We Advertised America: The First Telling of the Amazing Story of the Committee on Public Information that Carried the Gospel of Americanism to Every Corner of the Globe,[15] which overstated the impact of the Creel Committee and the power of propaganda. The Committee was so unpopular that after the war, Congress closed it down without providing funding to organize and archive its papers.

    The war propaganda campaign of Lippmann and Bernays produced within six months so intense an anti-German hysteria as to permanently impress American business (and Adolf Hitler, among others) with the potential of large-scale propaganda to control public opinion. Bernays coined the terms "group mind" and "engineering consent", important concepts in practical propaganda work. The documentary film The Century of the Self by Adam Curtis documents the immense influence of these ideas on public relations and politics throughout the twentieth century.

    The current public relations industry is a direct outgrowth of Lippmann's and Bernays' work and is still used extensively by the United States government. Bernays himself ran a very successful public relations firm for much of the first half of the twentieth century. World War II saw continued use of propaganda as a weapon of war, both by Hitler's propagandist Joseph Goebbels and by the British Political Warfare Executive, as well as by the United States Office of War Information.

    Edward Bernays had a major influence on the world of propaganda and created many campaigns that employed it. In his book Propaganda, Bernays stated frankly how he viewed the practice:[16]

    The conscious and intelligent manipulation of the organized habits and opinions of the masses is an important element in democratic society. Those who manipulate this unseen mechanism of society constitute an invisible government which is the true ruling power of our country.
    We are governed, our minds are molded, our tastes formed, our ideas suggested, largely by men we have never heard of. This is a logical result of the way in which our democratic society is organized. Vast numbers of human beings must cooperate in this manner if they are to live together as a smoothly functioning society.

    In the early 2000s, the United States government developed and freely distributed a video game known as America's Army. The stated intention of the game is to encourage players to become interested in joining the U.S. Army.

    Russian revolution

    See also: Propaganda in the Soviet Union, Agitprop and Socialist realism. Russian revolutionaries of the 19th and 20th centuries distinguished two different aspects covered by the English term propaganda. Their terminology included two terms: Russian: агитация (agitatsiya), or agitation, and Russian: пропаганда, or propaganda; see agitprop. Agitprop is not, however, limited to the Soviet Union: before the October Revolution it was considered one of the fundamental activities of any Marxist activist, and the importance of agitprop in Marxist theory may still be observed today in Trotskyist circles, which insist on the importance of leaflet distribution.

    Soviet propaganda meant dissemination of revolutionary ideas, teachings of Marxism, and theoretical and practical knowledge of Marxist economics, while agitation meant forming favorable public opinion and stirring up political unrest. These activities did not carry negative connotations (as they usually do in English) and were encouraged. To expand the reach of state propaganda, the Bolsheviks actively used trains, aircraft, and other means of transportation.

    Joseph Stalin's regime built the Tupolev ANT-20, the largest fixed-wing aircraft of the 1930s, exclusively for this purpose. Named after the famous Soviet writer Maxim Gorky, who had recently returned from fascist Italy, it was equipped with a powerful radio set called "Voice from the sky", printing and leaflet-dropping machinery, radio stations, a photographic laboratory, a film projector with sound for showing movies in flight, a library, and more. The aircraft could be disassembled and transported by rail if needed. The giant aircraft set a number of world records.

    Nazi Germany

    See main article: Nazi propaganda. Most propaganda in Germany was produced by the Ministry of Public Enlightenment and Propaganda. Joseph Goebbels was placed in charge of this ministry shortly after Hitler took power in 1933. All journalists, writers, and artists were required to register with one of the Ministry's subordinate chambers for the press, fine arts, music, theatre, film, literature, or radio.

    The Nazis believed in propaganda as a vital tool for achieving their goals. Adolf Hitler, Germany's Führer, was impressed by the power of Allied propaganda during World War I and believed that it had been a primary cause of the collapse of morale and the revolts on the German home front and in the Navy in 1918 (see also: Dolchstoßlegende). Hitler met nearly every day with Goebbels to discuss the news, and Goebbels would obtain Hitler's thoughts on the subject. Goebbels then met with senior Ministry officials to pass down the official Party line on world events. Broadcasters and journalists required prior approval before their works were disseminated. Along with posters, the Nazis produced a number of films and books to spread their beliefs.

    Empire of Japan

    See main article: Japanese propaganda during World War II.

    America in World War II

    See main article: American propaganda during World War II.

    Britain in World War II

    See main article: British propaganda during WWII.

    Cold War propaganda

    See also: Eastern Bloc information dissemination and Propaganda in the Soviet Union. The United States and the Soviet Union both used propaganda extensively during the Cold War. Both sides used film, television, and radio programming to influence their own citizens, each other, and Third World nations. The United States Information Agency operated the Voice of America as an official government station. Radio Free Europe and Radio Liberty, which were in part supported by the Central Intelligence Agency, provided grey propaganda in news and entertainment programs to Eastern Europe and the Soviet Union respectively. The Soviet Union's official government station, Radio Moscow, broadcast white propaganda, while Radio Peace and Progress broadcast grey propaganda. Both sides also broadcast black propaganda programs during periods of special crisis.

    In 1948, the United Kingdom's Foreign Office created the IRD (Information Research Department), which took over from wartime and slightly post-war departments such as the Ministry of Information and dispensed propaganda via various media such as the BBC and publishing.

    The ideological and border dispute between the Soviet Union and People's Republic of China resulted in a number of cross-border operations. One technique developed during this period was the "backwards transmission," in which the radio program was recorded and played backwards over the air. (This was done so that messages meant to be received by the other government could be heard, while the average listener could not understand the content of the program.)

    When describing life in capitalist countries, in the US in particular, propaganda focused on social issues such as poverty and anti-union action by the government. Workers in capitalist countries were portrayed as "ideologically close". Propaganda claimed rich people from the US derived their income from weapons manufacturing, and claimed that there was substantial racism or neo-fascism in the US.

    When describing life in Communist countries, western propaganda sought to depict an image of a citizenry held captive by governments that brainwash them. The West also created a fear of the East, by depicting an aggressive Soviet Union. In the Americas, Cuba served as a major source and a target of propaganda from both black and white stations operated by the CIA and Cuban exile groups. Radio Habana Cuba, in turn, broadcast original programming, relayed Radio Moscow, and broadcast The Voice of Vietnam as well as alleged confessions from the crew of the USS Pueblo.

    George Orwell's novels Animal Farm and Nineteen Eighty-Four are virtual textbooks on the use of propaganda. Though not set in the Soviet Union, these books are about totalitarian regimes that constantly corrupt language for political purposes. These novels were, ironically, used for explicit propaganda. The CIA, for example, secretly commissioned an animated film adaptation of Animal Farm in the 1950s with small changes to the original story to suit its own needs.[17]

    Revolution in Central and Eastern Europe

    During the democratic revolutions of 1989 in Central and Eastern Europe, the propaganda poster was an important weapon in the hands of the opposition. Printed and hand-made political posters appeared on the Berlin Wall, on the statue of St. Wenceslas in Prague, and around the unmarked grave of Imre Nagy in Budapest, and they played an important role in the democratic changes.

    Yugoslav wars

    During the Yugoslav wars, propaganda was used as a military strategy by the governments of the Federal Republic of Yugoslavia and of Croatia.

    Propaganda was used to create fear and hatred, and particularly to incite the Serb population against the other ethnicities (Bosniaks, Croats, Albanians, and other non-Serbs). Serb media made great efforts to justify, revise, or deny the mass war crimes committed by Serb forces against Bosniaks and other non-Serbs during the Yugoslav wars.

    According to the ICTY verdicts against Serb political and military leaders, during the Bosnian war, the propaganda was a part of the Strategic Plan by Serb leadership, aimed at linking Serb-populated areas in Bosnia and Herzegovina together, gaining control over these areas and creating a sovereign Serb nation state, from which most non-Serbs would be permanently removed. The Serb leadership was aware that the Strategic Plan could only be implemented by the use of force and fear, thus by the commission of war crimes.

    Croats likewise used propaganda against Serbs throughout the wars, and against Bosniaks during the 1992–1994 Croat-Bosniak war, which was part of the larger Bosnian War. During the Lašva Valley ethnic cleansing, Croat forces seized television broadcasting stations (for example at Skradno) and created their own local radio and television to carry propaganda; they also seized public institutions, raised the Croatian flag over public buildings, and imposed the Croatian dinar as the unit of currency. During this time, Busovača's Bosniaks were forced to sign an act of allegiance to the Croat authorities, fell victim to numerous attacks on shops and businesses, and gradually left the area out of fear that they would become the victims of mass crimes. According to the ICTY Trial Chamber in the Blaškić case, Croat authorities created a radio station in Kiseljak to broadcast nationalist propaganda. A similar pattern was applied in Mostar and in Gornji Vakuf (where Croats created a radio station called Radio Uskoplje).

    Local propaganda efforts in the parts of Bosnia and Herzegovina controlled by the Croats were supported by Croatian daily newspapers such as Večernji list and by Croatian Radiotelevision, especially by the controversial reporters Dijana Čuljak and Smiljko Šagolj, who are still blamed by the families of the Bosniak victims in the Vranica case for inciting the massacre of Bosnian POWs in Mostar by broadcasting a report about alleged terrorists, arrested by Croats, who had victimized Croat civilians. The bodies of the Bosnian POWs were later found in the Goranci mass grave. Croatian Radiotelevision presented the Croat attack on Mostar as a Bosnian Muslim attack on Croats in alliance with the Serbs. According to the ICTY, in the early hours of May 9, 1993, the Croatian Defence Council (HVO) attacked Mostar using artillery, mortars, heavy weapons, and small arms. The HVO controlled all roads leading into Mostar, and international organisations were denied access. Radio Mostar announced that all Bosniaks should hang a white flag from their windows. The HVO attack had been well prepared and planned.

    During the ICTY trials of Croat war leaders, many Croatian journalists appeared as defence witnesses, attempting to relativize the war crimes committed by Croatian troops against non-Croat civilians (Bosniaks in Bosnia and Herzegovina and Serbs in Croatia). During the trial of general Tihomir Blaškić (later convicted of war crimes), Ivica Mlivončić, a Croatian columnist for Slobodna Dalmacija, tried to defend Blaškić by presenting a number of claims from his book Zločin s pečatom about an alleged genocide against Croats (most of them unproven or false), which the Trial Chamber considered irrelevant to the case. After the conviction, he continued to write in Slobodna Dalmacija against the ICTY, portraying it as a court against Croats and making the chauvinistic claim that the ICTY cannot be unbiased because it is financed by Saudi Arabia (Muslims).[18] [19]

    Afghan War

    In the 2001 invasion of Afghanistan, psychological operations tactics were employed to demoralize the Taliban and to win the sympathies of the Afghan population. At least six EC-130E Commando Solo aircraft were used to jam local radio transmissions and transmit replacement propaganda messages. Leaflets were also dropped throughout Afghanistan, offering rewards for Osama bin Laden and other individuals, portraying Americans as friends of Afghanistan, and emphasizing various negative aspects of the Taliban. One leaflet showed a picture of Mullah Mohammed Omar in a set of crosshairs with the words "We are watching." This technique has proved rather ineffective at producing long-term opinion change, given current political and social conditions in Afghanistan.

    Iraq War

    The United States and Iraq both employed propaganda during the Iraq War. The United States aimed campaigns at the American people to justify the war, while using similar tactics to bring down Saddam Hussein's government in Iraq.[20]

    Iraqi Propaganda

    The Iraqi insurgency's plan was to gain as much support as possible by using violence as a propaganda tool.[21] Inspired by the Vietcong's tactics,[22] insurgents used rapid movement to keep the coalition off balance, and by conveying their messages through low-technology means they were able to gain support.[23] Graffiti slogans on walls and houses praised the virtues of various group leaders while condemning the Iraqi government. Others used flyers, leaflets, articles, and self-published newspapers and magazines to get their point across.

    Insurgents also produced CDs and DVDs and distributed them in the very communities that the Iraqi and U.S. governments were trying to influence.[24] The insurgents' advertisements cost a fraction of what the U.S. was spending on ads aimed at the same people in Iraq, with much more success. In addition, the insurgents established an Arabic-language television station to transmit information to the people of Iraq about what they characterized as the rumors and lies the Americans were spreading about the war.

    American Propaganda in Iraq

    To achieve their aim of a moderate, pro-Western Iraq, U.S. authorities were careful to avoid conflicts with Islamic culture that would provoke passionate reactions from Iraqis, but differentiating between "good" and "bad" Islam has proved challenging for the U.S.

    The U.S. implemented a form of black propaganda by creating radio stations, such as Radio Tikrit, that disseminated pro-American information while purporting to be run by supporters of Saddam Hussein. In another instance of black propaganda, revealed by the New York Times in 2005, the U.S. paid Iraqi newspapers to publish articles written by American troops, presented as unbiased, authentic accounts.[25] The article stated that the Lincoln Group had been hired by the U.S. government to create the propaganda, though the group was later cleared of any wrongdoing.

    The U.S. was more successful with the Voice of America campaign, an old Cold War tactic that exploited people's desire for information. While the information given to the Iraqis was truthful, the station faced a high degree of competition from opposing outlets once censorship of the Iraqi media was lifted with the removal of Saddam from power.[26]

    In November 2005, the Chicago Tribune and the Los Angeles Times alleged that the United States military had manipulated news reported in Iraqi media in an effort to cast a favorable light on its actions while demoralizing the insurgency. Lt. Col. Barry Johnson, a military spokesman in Iraq, said the program was "an important part of countering misinformation in the news by insurgents", while a spokesman for former Defense Secretary Donald H. Rumsfeld said the allegations of manipulation were troubling if true. The Department of Defense confirmed the existence of the program.

    Propaganda aimed at Americans

    The extent to which the US government engaged in propaganda aimed at its own people is a matter of discussion. In the book Selling Intervention & War, Jon Western argued that President Bush was "selling the war" to the public.[27]

    President George W. Bush gave a talk at the Athena Performing Arts Center at Greece Athena Middle and High School Tuesday, May 24, 2005 in Rochester, NY. About half way through the event Bush said, "See in my line of work you got to keep repeating things over and over and over again for the truth to sink in, to kind of catapult the propaganda."

    People formed their initial reactions to the War on Terror early on, but as increasingly biased and persuasive information circulated, Iraq as a whole came to be portrayed negatively.[28] America's goal was to remove Saddam Hussein from power in Iraq, citing allegations of possible weapons of mass destruction and of links to Osama bin Laden.[29] Video and picture coverage in the news showed shocking and disturbing images of torture and other abuses committed under the Iraqi government.

    People's Republic of China

    See main article: Propaganda in the People's Republic of China.

    Republic of China (Taiwan)

    See main article: Propaganda in the Republic of China.

    North Korea

    See main article: Propaganda in North Korea. Every year, a state-owned publishing house releases several cartoons (called geurim-chaek in North Korea), many of which are smuggled across the Chinese border and, sometimes, end up in university libraries in the United States. The books are designed to instill the Juche philosophy of Kim Il-sung (the ‘father’ of North Korea)—radical self-reliance of the state. The plots mostly feature scheming capitalists from the United States and Japan who create dilemmas for naïve North Korean characters.

    War in Somalia

    See main article: Propaganda in the War in Somalia.

    Mexican drug cartels

    Drug cartels have engaged in propaganda and psychological campaigns to influence their rivals and those within their area of influence. They use banners, or "narcomantas", to threaten their rivals, and some cartels hand out pamphlets and leaflets to conduct public relations campaigns. They have been able to control the information environment by threatening journalists, bloggers, and others who speak out against them, and they have elaborate recruitment strategies targeting young adults. They have successfully branded the word "narco", which has become part of Mexican culture: there are music, television shows, literature, beverages, food, and architecture that have all been branded "narco".[30] [31]

    Children

    Of all the potential targets of propaganda, children are the most vulnerable, because they are the least equipped with the critical reasoning and contextual comprehension needed to determine whether a message is propaganda. This vulnerability is rooted in developmental psychology: the close attention children pay to their environment as they develop their understanding of the world leads them to absorb propaganda indiscriminately. Children are also highly imitative: studies by Albert Bandura, Dorothea Ross, and Sheila A. Ross in the 1960s indicated that children are susceptible to filmed representations of behaviour. Television is therefore of particular interest with regard to children's vulnerability to propaganda.

    Another vulnerability of children is the theoretical influence that their peers have over their behaviour. According to Judith Rich Harris's group-socialization theory, children learn most of what is not genetically inherited from their peer groups rather than from their parents. The implication is that if a peer group can be indoctrinated through propaganda at a young age to hold certain beliefs, the group will perpetuate the indoctrination itself, since new members will adapt their beliefs to fit the group's.

    To a degree, socialization, formal education, and standardized television programming can be seen as using propaganda for the purpose of indoctrination. The use of propaganda in schools was highly prevalent during the 1930s and 1940s in Germany, as well as in Stalinist Russia.

    Anti-Semitic propaganda for children

    In Nazi Germany, the education system was thoroughly co-opted to indoctrinate German youth with anti-Semitic ideology. This was accomplished through the National Socialist Teachers League, to which 97% of all German teachers belonged in 1937 and which encouraged the teaching of “racial theory.” Picture books for children such as Don’t Trust A Fox in A Green Meadow Or the Word of A Jew, The Poisonous Mushroom, and The Poodle-Pug-Dachshund-Pinscher were widely circulated (over 100,000 copies of Don’t Trust A Fox... were distributed during the late 1930s) and depicted Jews as devils, child molesters, and other morally charged figures. Slogans such as “Judas the Jew betrayed Jesus the German to the Jews” were recited in class.[32] Even mathematics was affected: the National Socialist Essence of Education recommended propagandistic math problems.

    Tomorrow's Pioneers

    Tomorrow's Pioneers (Arabic: رواد الغد; also The Pioneers of Tomorrow) is a children's program broadcast since April 13, 2007 on the official Palestinian Hamas television station, Al-Aqsa TV (Arabic: قناة الأقصى). The program deals with many aspects of life that Palestinian children face. Assoud (Arabic: اسود; also rendered as Assud), a Bugs Bunny-like rabbit character whose name means 'lion', was introduced after his brother Nahoul, the previous co-host, died of illness.[33]

    When asked why he is called Assoud ('lion') when Arnoub ('rabbit') would be more appropriate, Assoud explains that "a rabbit is a term for a bad person and coward. And I, Assoud, will finish off the Jews and eat them."[34] Before Nahoul's death, Assoud lived in Lebanon; he returned "in order to return to the homeland and liberate it." In episode 113, Assoud hinted that he will be replaced by a tiger when he is martyred.


    Further reading

    Books

    • Altheide, David L. & Johnson, John M. Bureaucratic Propaganda. Boston: Allyn and Bacon, Inc. (1980)
    • Brown, J.A.C. Techniques of Persuasion: From Propaganda to Brainwashing. Harmondsworth: Pelican (1963)
    • Chomsky, Noam and Herman, Edward. Manufacturing Consent: The Political Economy of the Mass Media. New York: Pantheon Books. (1988)
    • Cole, Robert. Propaganda in Twentieth Century War and Politics (1996)
    • Cole, Robert, ed. Encyclopedia of Propaganda (3 vol 1998)
    • Combs, James E. & Nimmo, Dan. The New Propaganda: The Dictatorship of Palaver in Contemporary Politics. White Plains, N.Y. Longman. (1993)
    • Cull, Nicholas John, David Culbert, and David Welch, eds. Propaganda and Mass Persuasion: A Historical Encyclopedia, 1500 to the Present (2003)
    • Cunningham, Stanley, B. The Idea of Propaganda: A Reconstruction. Westport, Conn.: Praeger. (2002)
    • Ellul, Jacques. Propaganda: The Formation of Men's Attitudes. Trans. Konrad Kellen & Jean Lerner. New York: Knopf, 1965. New York: Random House/ Vintage 1973
    • Kingsbury, Celia Malone. For Home and Country: World War I Propaganda on the Home Front (University of Nebraska Press; 2010; 308 pages). Describes propaganda directed toward the homes of the American homefront in everything from cookbooks and popular magazines to children's toys.
    • Lasswell, Harold D. Propaganda Technique in World War I. Cambridge, Mass: The M.I.T. Press. (1971)
    • Le Bon, Gustave, The Crowd: a study of the Popular Mind (1895)
    • MacArthur, John R. Second Front: Censorship and Propaganda in the Gulf War. New York: Hill and Wang. (1992)
    • Marlin, Randal. Propaganda & The Ethics of Persuasion. Orchard Park, New York: Broadview Press. (2002)
    • McCombs M. E. & Shaw, D. L. (1972). The agenda-setting function of mass media. Public Opinion Quarterly, 36, 176-87.
    • Linebarger, Paul M. Psychological Warfare. International Propaganda and Communications. ISBN 0-405-04755-X (1948)
    • Pratkanis, Anthony & Aronson, Elliot. Age of Propaganda: The Everyday Use and Abuse of Persuasion. New York: W.H. Freeman and Company. (1992)
    • Rutherford, Paul. Endless Propaganda: The Advertising of Public Goods. Toronto: University of Toronto Press. (2000)
    • Rutherford, Paul. Weapons of Mass Persuasion: Marketing the War Against Iraq. Toronto: University of Toronto Press. (2004)
    • Snow, Nancy. Propaganda, Inc.: Selling America's Culture to the World. New York, NY: Seven Stories Press. (2010)
    • Sproule, J. Michael. Channels of Propaganda. Bloomington, IN: EDINFO Press. (1994)
    • Stauber, John, and Rampton, Sheldon. Toxic Sludge Is Good for You! Lies, Damn Lies and the Public Relations Industry. Monroe, Maine: Common Courage Press, 1995.



    Notes and References

    1. Online Etymology Dictionary, "propaganda", http://www.etymonline.com/index.php?term=propaganda
    2. Garth Jowett and Victoria O'Donnell, Propaganda and Persuasion, 4th ed. Sage Publications, p. 7
    3. Richard Alan Nelson, A Chronology and Glossary of Propaganda in the United States (1996) pp. 232-233
    4. Hindery, Roderick R., Indoctrination and Self-deception or Free and Critical Thought? (2001)
    5. "Unacceptable message", http://www.jamescmccroskey.com/publications/36.htm
    6. Joel H. Spring, Pedagogies of Globalization: The Rise of the Educational Security State. Psychology Press, 2006. ISBN 9780805855579. p. 60.
    7. Hilmar Hoffmann, John Broadwin, and Volker R. Berghahn, The Triumph of Propaganda: Film and National Socialism, 1933–1945. Berghahn Books, 1997. ISBN 9781571811226. p. 140.
    8. "Propaganda Techniques", http://mason.gmu.edu/~amcdonal/Propaganda%20Techniques.html
    9. Scot Macdonald, Propaganda and Information Warfare in the Twenty-First Century: Altered Images and Deception Operations. Taylor & Francis, 2007. ISBN 9780415771450. p. 35.
    10. FactCheck.org, http://www.factcheck.org/
    11. Ross, Sheryl Tuttle. "Understanding Propaganda: The Epistemic Merit Model and Its Application to Art." Journal of Aesthetic Education, Vol. 36, No.1. pp. 16–30
    12. Boesche, Roger. "Kautilya’s Arthasastra on War and Diplomacy in Ancient India", The Journal of Military History 67 (p. 9–38), January 2003.
    13. Mark U. Edwards, Printing Propaganda and Martin Luther 15; Louise W. Holborn, “Printing and the Growth of a Protestant Movement in Germany from 1517 to 1524”, Church History, 11, no. 2 (1942), 123.
    14. About Edward Bernays' book chapter, http://home.bway.net/drstu/chapter.html
    15. Rogers, E.M. (1994). A history of communication study: A biographical approach. New York, NY: The Free Press.
    16. Bernays, Edward. Propaganda (1928)
    17. "The cartoon that came in from the cold", The Guardian, http://film.guardian.co.uk/features/featurepages/0,4120,908925,00.html
    18. Slobodna Dalmacija, "Najveći donator Haaškog suda je Saudijska Arabija" ["The biggest donor to the Hague Tribunal is Saudi Arabia"], http://www.hsp1861.hr/vijesti1/011228im.htm
    19. Igor Lasić, "Izlog izdavačkog smeća" ["A showcase of publishing rubbish"], http://www.aimpress.ch/dyn/pubs/archive/data/200110/11005-004-pubs-zag.htm
    20. Altheide, David L. "War and Mass Mediated Evidence." Cultural Studies — Critical Methodologies 9 (2009): 14-22.
    21. Garfield, Andrew. "The U.S. Counter-propaganda Failure in Iraq." Middle East Quarterly 14 (2007): 23-32.
    22. Schleifer, Ron. "Reconstructing Iraq: Winning the Propaganda War in Iraq." Middle East Quarterly (2005): 15-24.
    23. Garfield, Andrew. "The U.S. Counter-propaganda Failure in Iraq." Middle East Quarterly 14 (2007): 24
    24. Garfield, Andrew. "The U.S. Counter-propaganda Failure in Iraq." Middle East Quarterly 14 (2007): 26
    25. Shah, Anup. "Iraq War Media Reporting, Journalism and Propaganda." August 1, 2007; accessed May 12, 2009.
    26. Goldstein, Sol. "A Strategic Failure: American Information Control Policy in Occupied Iraq." Military Review 88.2 (Mar. 2008): 58-65.
    27. Thrall, A. Trevor. "A Review of: "Weapons of Mass Deception: The Uses of Propaganda in Bush's War on Iraq, by Sheldon Rampton and John Stauber Weapons of Mass Persuasion: Marketing the War Against Iraq, by Paul Rutherford Selling Intervention & War: The Presidency, the..." Political Communication 24.2 (Apr. 2007): 202-207.
    28. John, Sue Lockett, et al. "Going Public, Crisis after Crisis: The Bush Administration and the Press from September 11 to Saddam." Rhetoric & Public Affairs 10.2 (Summer 2007): 195-219.
    29. O'Shaughnessy, Nicholas. "Weapons of Mass Seduction: Propaganda, Media and the Iraq War." Journal of Political Marketing 3.4 (2004): 79-104. America: History & Life.
    30. O'Connor, Mike. "Analysis: A PR department for Mexico's narcos." GlobalPost, November 5, 2010; accessed March 28, 2012.
    31. Beckhart, Sarah. "The Narco Generation." AL DÍA, Woodrow Wilson International Center for Scholars' Mexico Institute, February 21, 2011; accessed March 28, 2012.
    32. Mills, Mary. "Propaganda and Children During the Hitler Years." Jewish Virtual Library. http://www.jewishvirtuallibrary.org/jsource/Holocaust/propchil.html
    33. "Assoud Arrives." Tomorrow's Pioneers, Al-Aqsa TV, February 1, 2008 (series 1, episode 11).
    34. Nissan Ratzlav-Katz, "PA TV Bunny Rabbit Threatens to 'Eat the Jews'", Arutz Sheva, February 12, 2008 (6 Adar 5768).