Author: admin
Date published: 13.02.2020


The Trump Campaign's Disinformation Tactics

If you think disinformation only happens in other countries, think again. The Trump campaign machine known as the 'death star' is already in action.

The Atlantic Outlines The Trump Campaign's Disinformation Tactics | Velshi & Ruhle | MSNBC

https://www.youtube.com/watch?v=8YjcsUNf1dU


Stephanie Ruhle is joined by Atlantic staff writer McKay Coppins to break down how this disinformation machine is already shaping the 2020 election. Weighing in: Former Congresswoman Donna Edwards and Political Consultant Shermichael Singleton.


One day last fall, I sat down to create a new Facebook account. I picked a forgettable name, snapped a profile pic with my face obscured, and clicked “Like” on the official pages of Donald Trump and his reelection campaign. Facebook’s algorithm prodded me to follow Ann Coulter, Fox Business, and a variety of fan pages with names like “In Trump We Trust.” I complied. I also gave my cellphone number to the Trump campaign, and joined a handful of private Facebook groups for MAGA diehards, one of which required an application that seemed designed to screen out interlopers.

The president’s reelection campaign was then in the midst of a multimillion-dollar ad blitz aimed at shaping Americans’ understanding of the recently launched impeachment proceedings. Thousands of micro-targeted ads had flooded the internet, portraying Trump as a heroic reformer cracking down on foreign corruption while Democrats plotted a coup. That this narrative bore little resemblance to reality seemed only to accelerate its spread. Right-wing websites amplified every claim. Pro-Trump forums teemed with conspiracy theories. An alternate information ecosystem was taking shape around the biggest news story in the country, and I wanted to see it from the inside.

The story that unfurled in my Facebook feed over the next several weeks was, at times, disorienting. There were days when I would watch, live on TV, an impeachment hearing filled with damning testimony about the president’s conduct, only to look at my phone later and find a slickly edited video—served up by the Trump campaign—that used out-of-context clips to recast the same testimony as an exoneration. Wait, I caught myself wondering more than once, is that what happened today?

As I swiped at my phone, a stream of pro-Trump propaganda filled the screen: “That’s right, the whistleblower’s own lawyer said, ‘The coup has started …’” Swipe. “Democrats are doing Putin’s bidding …” Swipe. “The only message these radical socialists and extremists will understand is a crushing …” Swipe. “Only one man can stop this chaos …” Swipe, swipe, swipe.

I was surprised by the effect it had on me. I’d assumed that my skepticism and media literacy would inoculate me against such distortions. But I soon found myself reflexively questioning every headline. It wasn’t that I believed Trump and his boosters were telling the truth. It was that, in this state of heightened suspicion, truth itself—about Ukraine, impeachment, or anything else—felt more and more difficult to locate. With each swipe, the notion of observable reality drifted further out of reach.

What I was seeing was a strategy that has been deployed by illiberal political leaders around the world. Rather than shutting down dissenting voices, these leaders have learned to harness the democratizing power of social media for their own purposes—jamming the signals, sowing confusion. They no longer need to silence the dissident shouting in the streets; they can use a megaphone to drown him out. Scholars have a name for this: censorship through noise.

After the 2016 election, much was made of the threats posed to American democracy by foreign disinformation. Stories of Russian troll farms and Macedonian fake-news mills loomed in the national imagination. But while these shadowy outside forces preoccupied politicians and journalists, Trump and his domestic allies were beginning to adopt the same tactics of information warfare that have kept the world’s demagogues and strongmen in power.

Every presidential campaign sees its share of spin and misdirection, but this year’s contest promises to be different. In conversations with political strategists and other experts, a dystopian picture of the general election comes into view—one shaped by coordinated bot attacks, Potemkin local-news sites, micro-targeted fearmongering, and anonymous mass texting. Both parties will have these tools at their disposal. But in the hands of a president who lies constantly, who traffics in conspiracy theories, and who readily manipulates the levers of government for his own gain, their potential to wreak havoc is enormous.

The Trump campaign is planning to spend more than $1 billion, and it will be aided by a vast coalition of partisan media, outside political groups, and enterprising freelance operatives. These pro-Trump forces are poised to wage what could be the most extensive disinformation campaign in U.S. history. Whether or not it succeeds in reelecting the president, the wreckage it leaves behind could be irreparable.

THE DEATH STAR

The campaign is run from the 14th floor of a gleaming, modern office tower in Rosslyn, Virginia, just outside Washington, D.C. Glass-walled conference rooms look out on the Potomac River. Rows of sleek monitors line the main office space. Unlike the bootstrap operation that first got Trump elected—with its motley band of B-teamers toiling in an unfinished space in Trump Tower—his 2020 enterprise is heavily funded, technologically sophisticated, and staffed with dozens of experienced operatives. One Republican strategist referred to it, admiringly, as “the Death Star.”

Presiding over this effort is Brad Parscale, a 6-foot-8 Viking of a man with a shaved head and a triangular beard. As the digital director of Trump’s 2016 campaign, Parscale didn’t become a household name like Steve Bannon and Kellyanne Conway. But he played a crucial role in delivering Trump to the Oval Office—and his efforts will shape this year’s election.

In speeches and interviews, Parscale likes to tell his life story as a tidy rags-to-riches tale, embroidered with Trumpian embellishments. He grew up a simple “farm boy from Kansas” (read: son of an affluent lawyer from suburban Topeka) who managed to graduate from an “Ivy League” school (Trinity University, in San Antonio). After college, he went to work for a software company in California, only to watch the business collapse in the economic aftermath of 9/11 (not to mention allegations in a lawsuit that he and his parents, who owned the business, had illegally transferred company funds—claims that they disputed). Broke and desperate, Parscale took his “last $500” (not counting the value of three rental properties he owned) and used it to start a one-man web-design business in Texas.

Parscale Media was, by most accounts, a scrappy endeavor at the outset. Hustling to drum up clients, Parscale cold-pitched shoppers in the tech aisle of a Borders bookstore. Over time, he built enough websites for plumbers and gun shops that bigger clients took notice—including the Trump Organization. In 2011, Parscale was invited to bid on designing a website for Trump International Realty. An ardent fan of The Apprentice, he offered to do the job for $10,000, a fraction of the actual cost. “I just made up a price,” he later told The Washington Post. “I recognized that I was a nobody in San Antonio, but working for the Trumps would be everything.” The contract was his, and a lucrative relationship was born.

Over the next four years, he was hired to design websites for a range of Trump ventures—a winery, a skin-care line, and then a presidential campaign. By late 2015, Parscale—a man with no discernible politics, let alone campaign experience—was running the Republican front-runner’s digital operation from his personal laptop.

Parscale slid comfortably into Trump’s orbit. Not only was he cheap and unpretentious—with no hint of the savvier-than-thou smugness that characterized other political operatives—but he seemed to carry a chip on his shoulder that matched the candidate’s. “Brad was one of those people who wanted to prove the establishment wrong and show the world what he was made of,” says a former colleague from the campaign.

Perhaps most important, he seemed to have no reservations about the kind of campaign Trump wanted to run. The race-baiting, the immigrant-bashing, the truth-bending—none of it seemed to bother Parscale. While some Republicans wrung their hands over Trump’s inflammatory messages, Parscale came up with ideas to more effectively disseminate them.

The campaign had little interest at first in cutting-edge ad technology, and for a while, Parscale’s most valued contribution was the merchandise page he built to sell MAGA hats. But that changed in the general election. Outgunned on the airwaves and lagging badly in fundraising, campaign officials turned to Google and Facebook, where ads were inexpensive and shock value was rewarded. As the campaign poured tens of millions into online advertising—amplifying themes such as Hillary Clinton’s criminality and the threat of radical Islamic terrorism—Parscale’s team, which was christened Project Alamo, grew to 100.

Parscale was generally well liked by his colleagues, who recall him as competent and intensely focused. “He was a get-shit-done type of person,” says A. J. Delgado, who worked with him. Perhaps just as important, he had a talent for ingratiating himself with the Trump family. “He was probably better at managing up,” Kurt Luidhardt, a consultant for the campaign, told me. He made sure to share credit for his work with the candidate’s son-in-law, Jared Kushner, and he excelled at using Trump’s digital ignorance to flatter him. “Parscale would come in and tell Trump he didn’t need to listen to the polls, because he’d crunched his data and they were going to win by six points,” one former campaign staffer told me. “I was like, ‘Come on, man, don’t bullshit a bullshitter.’ But Trump seemed to buy it.” (Parscale declined to be interviewed for this story.)

James Barnes, a Facebook employee who was dispatched to work closely with the campaign, told me Parscale’s political inexperience made him open to experimenting with the platform’s new tools. “Whereas some grizzled campaign strategist who’d been around the block a few times might say, ‘Oh, that will never work,’ Brad’s predisposition was to say, ‘Yeah, let’s try it.’” From June to November, Trump’s campaign ran 5.9 million ads on Facebook, while Clinton’s ran just 66,000. A Facebook executive would later write in a leaked memo that Trump “got elected because he ran the single best digital ad campaign I’ve ever seen from any advertiser.”

Though some strategists questioned how much these ads actually mattered, Parscale was hailed for Trump’s surprise victory. Stories appeared in the press calling him a “genius” and the campaign’s “secret weapon,” and in 2018 he was tapped to lead the entire reelection effort. The promotion was widely viewed as a sign that the president’s 2020 strategy would hinge on the digital tactics that Parscale had mastered.

Through it all, the strategist has continued to show a preference for narrative over truth. Last May, Parscale regaled a crowd of donors and activists in Miami with the story of his ascent. When a ProPublica reporter confronted him about the many misleading details in his account, he shrugged off the fact-check. “When I give a speech, I tell it like a story,” he said. “My story is my story.”

DISINFORMATION ARCHITECTURE

In his book This Is Not Propaganda, Peter Pomerantsev, a researcher at the London School of Economics, writes about a young Filipino political consultant he calls “P.” In college, P had studied the “Little Albert experiment,” in which scientists conditioned a young child to fear furry animals by exposing him to loud noises every time he encountered a white lab rat. The experiment gave P an idea. He created a series of Facebook groups for Filipinos to discuss what was going on in their communities. Once the groups got big enough—about 100,000 members—he began posting local crime stories, and instructed his employees to leave comments falsely tying the grisly headlines to drug cartels. The pages lit up with frightened chatter. Rumors swirled; conspiracy theories metastasized. To many, all crimes became drug crimes.

Unbeknownst to their members, the Facebook groups were designed to boost Rodrigo Duterte, then a long-shot presidential candidate running on a pledge to brutally crack down on drug criminals. (Duterte once boasted that, as mayor of Davao City, he rode through the streets on his motorcycle and personally executed drug dealers.) P’s experiment was one plank in a larger “disinformation architecture”—which also included social-media influencers paid to mock opposing candidates, and mercenary trolls working out of former call centers—that experts say aided Duterte’s rise to power. Since assuming office in 2016, Duterte has reportedly ramped up these efforts while presiding over thousands of extrajudicial killings.

The campaign in the Philippines was emblematic of an emerging propaganda playbook, one that uses new tools for the age-old ends of autocracy. The Kremlin has long been an innovator in this area. (A 2011 manual for Russian civil servants favorably compared their methods of disinformation to “an invisible radiation” that takes effect while “the population doesn’t even feel it is being acted upon.”) But with the technological advances of the past decade, and the global proliferation of smartphones, governments around the world have found success deploying Kremlin-honed techniques against their own people.

In the United States, we tend to view such tools of oppression as the faraway problems of more fragile democracies. But the people working to reelect Trump understand the power of these tactics. They may use gentler terminology—muddy the waters; alternative facts—but they’re building a machine designed to exploit their own sprawling disinformation architecture.

Central to that effort is the campaign’s use of micro-targeting—the process of slicing up the electorate into distinct niches and then appealing to them with precisely tailored digital messages. The advantages of this approach are obvious: An ad that calls for defunding Planned Parenthood might get a mixed response from a large national audience, but serve it directly via Facebook to 800 Roman Catholic women in Dubuque, Iowa, and its reception will be much more positive. If candidates once had to shout their campaign promises from a soapbox, micro-targeting allows them to sidle up to millions of voters and whisper personalized messages in their ear.
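
To make the mechanics of that paragraph concrete: the sketch below is not the campaign’s actual software or any platform’s real ad API. It is a minimal illustration, in Python, using made-up voter records and placeholder ad copy, of the general pattern described: define narrow audience segments as predicates, then pair each segment with its own tailored message.

```python
# Illustrative sketch only: hypothetical voter records and placeholder ad copy,
# showing how micro-targeting slices an audience into narrow niches and
# serves each niche a different message.

from dataclasses import dataclass


@dataclass
class Voter:
    name: str
    city: str
    religion: str
    gender: str
    age: int


# Made-up audience data (not real people).
voters = [
    Voter("A. Smith", "Dubuque", "Catholic", "F", 54),
    Voter("B. Jones", "Des Moines", "None", "M", 31),
    Voter("C. Ruiz", "Dubuque", "Catholic", "F", 47),
]

# Each "segment" is a matching rule plus the message shown to voters who match it.
segments = [
    (lambda v: v.religion == "Catholic" and v.gender == "F" and v.city == "Dubuque",
     "Tailored ad A: appeal aimed at Catholic women in Dubuque"),
    (lambda v: v.age < 35,
     "Tailored ad B: appeal aimed at younger voters"),
]


def target(voters, segments):
    """Yield (voter, message) pairs; the first segment a voter matches wins."""
    for v in voters:
        for matches, message in segments:
            if matches(v):
                yield v, message
                break


for voter, message in target(voters, segments):
    print(f"{voter.name}: {message}")
```

Real campaign systems work against platform ad tools and far richer consumer data, but the underlying logic is the same: the smaller and more precisely defined the segment, the more pointed the message it can be shown.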

Parscale didn’t invent this practice—Barack Obama’s campaign famously used it in 2012, and Clinton’s followed suit. But Trump’s effort in 2016 was unprecedented, in both its scale and its brazenness. In the final days of the 2016 race, for example, Trump’s team tried to suppress turnout among black voters in Florida by slipping ads into their News Feeds that read, “Hillary Thinks African-Americans Are Super Predators.” An unnamed campaign official boasted to Bloomberg Businessweek that it was one of “three major voter suppression operations underway.” (The other two targeted young women and white liberals.)

The weaponization of micro-targeting was pioneered in large part by the data scientists at Cambridge Analytica. The firm began as part of a nonpartisan military contractor that used digital psyops to target terrorist groups and drug cartels. In Pakistan, it worked to thwart jihadist recruitment efforts; in South America, it circulated disinformation to turn drug dealers against their bosses.

The emphasis shifted once the conservative billionaire Robert Mercer became a major investor and installed Steve Bannon as his point man. Using a massive trove of data it had gathered from Facebook and other sources—without users’ consent—Cambridge Analytica worked to develop detailed “psychographic profiles” for every voter in the U.S., and began experimenting with ways to stoke paranoia and bigotry by exploiting certain personality traits. In one exercise, the firm asked white men whether they would approve of their daughter marrying a Mexican immigrant; those who said yes were asked a follow-up question designed to provoke irritation at the constraints of political correctness: “Did you feel like you had to say that?”

Christopher Wylie, who was the director of research at Cambridge Analytica and later testified about the company to Congress, told me that “with the right kind of nudges,” people who exhibited certain psychological characteristics could be pushed into ever more extreme beliefs and conspiratorial thinking. “Rather than using data to interfere with the process of radicalization, Steve Bannon was able to invert that,” Wylie said. “We were essentially seeding an insurgency in the United States.”

Cambridge Analytica was dissolved in 2018, shortly after its CEO was caught on tape bragging about using bribery and sexual “honey traps” on behalf of clients. (The firm denied that it actually used such tactics.) Since then, some political scientists have questioned how much effect its “psychographic” targeting really had. But Wylie—who spoke with me from London, where he now works for H&M, as a fashion-trend forecaster—said the firm’s work in 2016 was a modest test run compared with what could come.

“What happens if North Korea or Iran picks up where Cambridge Analytica left off?” he said, noting that plenty of foreign actors will be looking for ways to interfere in this year’s election. “There are countless hostile states that have more than enough capacity to quickly replicate what we were able to do … and make it much more sophisticated.” These efforts may not come only from abroad: A group of former Cambridge Analytica employees has formed a new firm that, according to the Associated Press, is working with the Trump campaign. (The firm has denied this, and a campaign spokesperson declined to comment.)

After the Cambridge Analytica scandal broke, Facebook was excoriated for its mishandling of user data and complicity in the viral spread of fake news. Mark Zuckerberg promised to do better, and rolled out a flurry of reforms. But then, last fall, he handed a major victory to lying politicians: Candidates, he said, would be allowed to continue running false ads on Facebook. (Commercial advertisers, by contrast, are subject to fact-checking.) In a speech at Georgetown University, the CEO argued that his company shouldn’t be responsible for arbitrating political speech, and that because political ads already receive so much scrutiny, candidates who choose to lie will be held accountable by journalists and watchdogs.

Shady political actors are discovering how easy it is to wage an untraceable whisper campaign by text message.

