Decorum Will Save Us, You Moron

You’re a professional, you idiot. You’ve worked your entire adult life to achieve respect in the field of your choice and you were just getting used to seeing the term “expert” in your bio when the cataclysm-that-is-the-Internet came along and turned your dreams of contributing knowledge and insight to the progress of humanity into lowbrow waking nightmares. Now you spend your days online, which is to say that you’re physically somewhere in the world, everyone is somewhere, but your mind isn’t focused on your immediate surroundings, your walls, your floors, your children, you negligent imbecile. Instead, you’re emotionally elsewhere, pulsing thoughts with millions of other people’s thoughts, people who are equally there-but-not-there; people who’ve shown up, uninvited, just like you; people who’re anonymous and not of your choosing, who may or may not know things, and you engage them randomly depending on your mood and the scant remainder of your finite daily willpower, you unfathomably stupid jerk. You’re interacting with people you would never have crossed paths with if not for the internet, which would be a miracle and a blessing except that, day after day, hour after hour, they predominantly require you to correct their falsehoods and wrongheaded assumptions, and hold them accountable for their basest, most repellent prejudices, and you oblige them, you dumbass! You throw down facts and historical context, insights and morally sound thinking, but you don’t just leave it there, do you, you mindless cretin. No, you do what you never dreamed you’d do in all of those years of reputation-building that led to now: you publicly call a complete stranger an ass clown. In the absence of inspirational leadership and national identity, you cave, you schmuck, and sink to the lowest possible rung of human interaction, the rung where cruel, douchebag power mongers hang out.
From down there, everything you shout into the ether sounds small, and less than, and angry, you worthless P.O.S. And all of the people who turn to you for expertise and role-modeling, they follow you down there because that’s where you’ve chosen to live, you slob. You’re leading alright! Leading millions of people down the drain, you weak, visionless, soft-brained piece of excrement.

If there were a shred of evidence, even a single study, that proved that calling people morons and idiots and low I.Q. imbeciles had the effect of motivating them to reconsider their views, or bettered the lives of their victims, or just got them to plain shut up, then your tweets would make perfect sense. You’d be a hero, you pathetic dope. Instead, time and again, the only reliable outcomes from insulting someone are fist-fights, exponentially more hate, and a swell of entitlement to the festering rage and alienation that sits pent up in all of us, ready to savage the next person who ticks us off and feel damn justified doing it. Well done! You did that! Way to be, you fetid piece of garbage.

Did it ever occur to you that the reason we need a police force is for the comparatively tiny number of people who can’t police themselves? That the vast majority of us choose not to break the law for the same reason we choose not to insult people — not because we’re masochists and we get off giving assholes a pass, but because we aspire to civility? Do you realize that anarchy is just one decision away? That what makes each of us better than the worst of our society is the ability to control our impulses, you deplorable reject? That you’re following the lead of the guy who separates kids from parents and holds them all in cages every time you call someone a moron, you stinking piece of trash? Did you stop just once to consider that no amount of fact-checking and moral-vetting matters if it’s accompanied by a black eye, you total, complete and utter dipshit?

What to do instead. Mmmm, I don’t know, jackass. Maybe climb back out of the sewer. Try deleting the insults and sticking to the facts and perspective. It’s surprising how powerful the truth is when the messenger’s pre-teen angst isn’t butted up to it like a rapey locker room weasel. If someone says something so ridiculously stupid that you’re tempted to call them a name, take the high road instead, or zip it. Be a study in contrast. Don’t jump in the ring.

Meanwhile, if, deep down, you really do care about elevating public dialogue (and let’s be honest, that’s not too challenging these days, especially for you) then get creative. You’re an expert. You’re a role model. People are listening. React intelligently in the online space when someone behaves ignorantly or monstrously. You’re not an average joe who occasionally slips. You have a platform. Be accountable to it. People will amplify whatever you do, so set an example. Forget #BeBest. Just #BeBetter. Remember that person you wanted to be when you started out in life? #BeHer. #BeHim. #BeYou.

Connecting the Dots

Al Gore’s Inconvenient Sequel

Activist Heather Heyer said, “If you’re not outraged, you’re not paying attention.” Incredibly, it took her murder in a public space in broad daylight to make people pay attention to organized American white nationalists. Even then, her death wasn’t sufficient to galvanize substantive action on domestic terror. The president’s refusal to condemn her killers became the focus of the news cycle, thus shifting the public’s outrage away from a dire national threat, and proving once again that motivated people are easily immobilized without the guidance of a good, well-informed leader.

The Inconvenient Sequel to An Inconvenient Truth doesn’t mention the alt-right or white supremacy, but two centuries of white Western economic dominance over the world have certainly left their mark. While “the West” includes a diverse mix of races, it is white men who led the charge of industrialization and technological advancement with devastating environmental consequences. The deeply upsetting conversation about the environmental crisis often glosses over the fact that older wealthy white men would have to give up substantial economic gains in order to lead a course-correction for the entire planet. Instead, the powerful few are pitted against millions who will be adversely affected by climate change for generations to come and they are using their limitless resources to disinform the world and downplay the dangers.

The images of melting glaciers and floods throughout An Inconvenient Sequel are disturbing, but to an informed viewer the most panic-inducing sections of the movie should be the round table negotiations between world leaders. The magnitude of political star power that shows up for working-level environmental policy meetings is alarming. While the agreement reached at the 2016 UN Climate Change Conference is presented in the film as a triumph, it should strike fear into the hearts of every global citizen. The unprecedented cooperation which occurred to make that agreement happen is damning evidence that we’re facing an imminent existential threat.

Al Gore is no longer a controversial figure. His presence is almost Christ-like now. He’s a mouthpiece for the planet, a voice for millions of people who have no political power in the face of this unfolding man-made catastrophe. Gore doesn’t do much explaining in this film. We simply follow him around the world and watch how he responds to questions about what’s happening. He looks fatigued and worried. He speaks in short bursts of truth. No one has a justification for ignoring reality that he can’t refute in a few words. When Christiana Figueres, Executive Secretary of the UNFCCC, entreats him to bring India — the 1.3 billion people of India — on board with the Paris Accords, Gore makes a phone call to the CEO of SolarCity and an economic carrot materializes. The urgency of our situation is evident in the staggeringly short distance between nightmare and hope, that distance being the reach of one man, Gore.

Figueres closes the Paris climate conference with an announcement that 194 countries signed the Paris Agreement. The jubilation onscreen is heartbreaking in light of what we now know will follow — an alt-right sympathizer will take power in the White House. He will refuse to acknowledge the global cooperation and sacrifice needed to save the planet. He will withdraw America from the Paris Agreement and derail our best hope of reversing climate change, thus exhibiting the hallmark decision-making of denialism and white American exceptionalism.

Gore says American democracy has been “hacked” by corporations. He’s adamant that the government is not acting in the best interests of the people. Given his personal and very public journey to bring climate change to light, there’s no reason to doubt him. He asks viewers to “connect the dots” but in truth he has connected them for us. All we need to do is watch the film and let that truth wash over us.

An Inconvenient Sequel is in theaters now.

Legislate the Internet, Don’t Rewrite It

George F. Cram (1842–1928) — Cram’s Unrivaled Family Atlas of the World, Chicago IL. Lithograph color print. Diagram of the Principal High Buildings of the Old World

A response to Walter Isaacson’s “How To Fix The Internet” (The Atlantic).

In an article published in The Atlantic this week, Walter Isaacson laid out his vision for “how to fix the internet.” The problem, he says, is Trolls. Bugs. Lack of tracking. He believes anonymity has “poisoned civil discourse, enabled hacking, permitted cyberbullying and made email a risk.” His solution is to do away with anonymity, thereby offering himself as the mouthpiece for every Silicon Valley titan with deep pockets and a hunger for data.

I’ve written on how we civilize technology before, on the challenges we face with each shift forward in technology, whether it’s ships, trains, radio transmitters or nuclear energy. The trajectory involves a series of near-misses while we get the hang of our shiny new toy. When cars were first invented there were no laws to govern driving. As cars proliferated, accidents increased. Now we legislate everything about car and road safety, down to the driver’s decision to wear a seatbelt. There are fines for not wearing one. If the trouble with internet technology is bad behavior, why not address the behavior?

What Isaacson skims over in his trolling lament is that the worst trolls on the internet are the very people he thinks should solve the trolling problem. Huge media companies like Facebook shamelessly collect their users’ data and sell it. Anonymity is not permitted on Facebook because the company can’t use you, can’t parse your information into demographics and ad bins, if they don’t know who you are. Similarly, the “trust” relationship built into place by search engines like Google is merely a handshake agreement that the company won’t “be evil.” That doesn’t mean Google deletes your search history. As we saw in the 2016 election, “evil” is a word that’s up for interpretation in our society. We, as users of Google, don’t know who is deciding what’s evil at any given time. Isaacson wants users to lose anonymity but notably makes no mention of tech companies and their addiction to opacity. In Isaacson’s future world, users are the biggest losers.

Isaacson offers logical suggestions for what a safe internet might include but how he gets there is the sales pitch of the century. Certainly, it’s important to institute payment for work. We don’t need a new internet for that. I’ve been pitching companies like Medium on this concept for years. “Find a way to pay your writers, even one cent per read, and you will revolutionize the publishing industry.” “A pay model is the only way forward to maintain the integrity of what is published and read.” Medium could institute a pay-model today. What Isaacson misses is that companies and sites most users rely on for information offer their services for free so that they can take whatever consumer data they want in return. The internet hasn’t evolved naturally into a pay model because the people currently making big bucks off of internet technology are also in charge of its design. There are no checks and balances built into the governing of the internet. This does not mean we do away with internet privacy. It means we legislate it.

To revolutionize the internet, the Googles and Facebooks would have to become industry-leading pay-model services. In a pay-model service, user-consumers would lose anonymity to the company offering the service (via their credit card), but maintain privacy in whatever further capacity they wished while using the service. It would be no different than walking into a Starbucks and ordering a latte. Give the barista your own name or someone else’s, pay with cash or credit, hide your face behind sunglasses or don’t…at the end of the day, you’re physically standing in the store and someone can deal with you if you cause a disturbance. As long as you’re a peaceful coffee drinker you still have agency to determine your level of privacy. The same is true of a paying customer online.

Finally, and this is perhaps the most important omission in Isaacson’s piece, there is presently a massive power struggle underway between government and technology elites — specific, powerful individuals within broader industries. Both groups are greedy for data. One group wants to retard technology in order to maintain control over its electorate. The other group wants to advance technology so fast it will maintain control over its creations and, by extension, its users. The electorate and users are one and the same. The bad seeds among us exist whether anonymity is built into the internet or not. They exist in government, they exist in boardrooms and they exist in chatrooms. It is persistent abuses of power that promote toxicity. Unless government and technology elites find a way to work together for the betterment of society as a whole, that toxicity will continue no matter what internet protocols are put in place.

Tribeca, Vaxxed, and Credibility

The fundamental growing pain of the Information Age is distrust.

I don’t want medical information from Del Bigtree, producer of Vaxxed and a former producer for the Dr. Phil-created show The Doctors. Sadly, millions of Americans listen to people like Bigtree because faux medical shows run on free television and are endorsed by celebrities like Oprah. For this reason, Vaxxed must be addressed.

I also don’t want medical information from ABC News after listening to the questions posed to Bigtree by their segment reporter during their unedited 10-minute interview prior to the release of Vaxxed. She asked general rather than science-based questions and subsequently ran a piece focusing on celebrity-non-medical-professional Robert De Niro. Sadly, millions more Americans get their medical information from ratings-chasing sources such as these.

The confluence of too much information and a massive shift in newspaper revenue streams means many journalists have cut the corner of agnosticism and taken the shortcut to opinion. Opinions sell faster and better than impartial news because they provide an extra service. The public is overwhelmed by the sheer scope of information out there. The layperson’s response to information overload has been to confer trust on opinionated individuals in the media, whether those individuals have any expertise or credentials or not. (Dr. Phil has a master’s degree in experimental psychology. Millions of people are unwittingly participating in his experiments.)

The underlying problem is this: Everything Del Bigtree says in his interview about the way our institutions are supposed to work is correct. His logic about our broken system lends disproportionate weight to his unrelated thoughts about vaccines. Donald Trump is presently enjoying the same path to success. People are habituated to follow the breadcrumbs of rational-sounding speakers, even if their only rational thoughts are to voice obvious grievances. However, it no longer goes without saying — just because people are right about the way the system is broken doesn’t make them right about anything else.

Our refusal as a society to properly fund journalism by embracing “free” information on the internet is directly responsible for proliferating misinformation.

Distrust of our institutions has ultimately fostered an environment where people distrust professionals. The majority of us are not doctors, haven’t attended medical school, and therefore rely on trained doctors for good/best information. When trust in that system breaks down, the next line of defense is journalism. When trust in that system breaks down, whistleblowers come forward. When trust in whistleblowers breaks down, you have millions of people basing important medical decisions on uneducated readings of partial and/or decontextualized information online or on television. In the case of vaccines, this creates unnecessary dangers and has already led to unnecessary deaths.

To be extra clear: shaming people for their refusal to vaccinate is profoundly unhelpful. Shaming people for looking for explanations and answers…also profoundly unhelpful. Shaming people for blatantly not doing their jobs is completely acceptable.

To that end, I’d like to publicly shame the writers at mainstream media outlets who pressured the Tribeca Film Festival to pull Vaxxed from their line-up, not because I think the film has an ounce of validity (…how could I know? I haven’t seen it…), but because we have a problem with people not vaccinating their children. When film critics and science writers suppress a film that illustrates a real problem, namely broken trust in our institutions, they feed the narrative on both sides of the vaccine issue (Andrew Wakefield’s a quack/Andrew Wakefield’s being suppressed) and perpetuate a serious problem. A journalist’s job is to convey the necessary facts in order to resolve the issue. When journalists publicly decline to see a film AND assert it is quackery, they squander what little trust remains in the institution of reportage.

If the answer to our vaccine problem is as simple as debunking a quack doctor, then journalists should sit through a two-hour movie, wade through the information yet again, debunk the father of this misinformation and demonstrate to a skittish public that no stone has been left unturned. Journalists should do this not because Vaxxed has any validity, but because anti-vaxxers think it does, and those people are not vaccinating their children. The number of people who will see Vaxxed is negligible compared to the millions of people who will read a widely shared takedown piece. The stronger the case science journalists and film reviewers make against a film like Vaxxed, the sooner this issue will be resolved.

If journalists can’t make a strong enough case for this problem to be resolved — and I doubt they can because the task is too big; a “strong enough” case today entails renewing people’s trust in the entire healthcare system. We’re that far down the path of suspicion — then the issue should continue to be treated with skepticism while a second case is made for the public to accept and weigh the alternatives: potential return of deadly disease versus potential vaccine-autism links. There is no third option at present. “Waiting” for a different vaccine is equivalent to not vaccinating and carries consequences. You vaccinate or you don’t. Personally, I encourage people to do as much investigation of the diseases they aren’t vaccinating against as they do of the vaccines. That precious airtime spent looking at Robert De Niro’s headshot should be filled with information on what happens when we don’t prevent preventable diseases. (I expect he would agree.)

This issue will continue to worsen until we respectfully acknowledge that people’s trust in their institutions is broken, and behave accordingly. Yelling at people to trust something never works. The vaccine debate, like so many debates cropping up across the country, came about due to systemic distrust. The way forward is for institutions to demonstrate their trustworthiness, not their disdain, and to give the public a free, considered, informed alternative to Dr. Phil and his ilk.

HOW WE CIVILIZE TECHNOLOGY

Living in Asia in the late 90s, I spent time in countries that were then considered “developing” economies. Textbooks were filled with prognostications about the potential growth and downfall of these places but no bar chart captured the terrifying hilarity of driving an hour outside of Seoul at high speed in a brand new sedan on unpaved roads with only potholes and feral animals to navigate by. Technology was tangibly out of sync with infrastructure. When something blocked the road, drivers veered onto the front steps of houses to get around it. Parking was wherever you felt like it, and parked cars were often rendered inaccessible due to other people’s feelings about parking. Disagreements were resolved the old-fashioned way, with pointing, yelling, and threat of fists. Over time, enough pedestrians became casualties and enough expensive tires were blown in potholes that laws became necessary, as did the paving of roads. The automobile is no less amazing because society set a speed limit. We mitigate and retard technology where it threatens and outpaces us. This is how we civilize our innovations.

The most poignant irony of the Information Age is the internet’s role in restructuring our relationship to politics. In CITIZENFOUR, Edward Snowden avowed his intent to end the tyranny of the snooping government, but technocratic paternalism is equally invasive and it’s built into the digital realm. Complicated legal documents pop up at the outset of a business relationship and people with no legal background are conditioned to move ahead with a trust us one-click “Agree.” Our relationship to intelligent technology is best portrayed by the routine updates we tacitly agree to without reading or understanding what they entail. I Agree to whatever you’re about to load onto my phone or into my computer, agree to what you think is best for this device and my use of it, agree without stipulation, agree without working knowledge, agree because not agreeing seems time-wasting and foolish and questioning is beyond my technical ability. I always agree with you because everyone else is agreeing with you so it must be okay. I always agree with you because I don’t know why I should disagree.

This habitual agreement has proved deadly to the exchange of real information. The technocracy devised the fastest, most appealing method for securing a user, and internet users subsequently became desensitized to the act of giving away their rights. The repetitive process has numbed healthy suspicion of any organization that demands legal agreement to a loss of personal agency. Those internet service agreements are not there to protect individuals, they are documents created by expensive legal teams to ensure a company has no responsibility to the consumer. If these statements aren’t disturbing enough, stretch them to apply to the government in the shocking months and years after 9/11. The PATRIOT Act was the federal government’s service agreement, and the majority of the American people agreed to it without understanding what they were signing away.

Fourteen years on, perhaps the greatest misstep in rectifying our mistake is to begin with privacy. Loss of privacy is an end result. Privacy can be protected, it can be violated, but it cannot be given. That notion is a falsehood born of Victorian manners — I’ll give you some privacy — which preempt uncomfortable directives: Leave the room. Get off the line. Turn your head. Don’t read my emails. I need my privacy. The sci-fi notion of “mindreading” is terrifying precisely because it violates the only space on earth that belongs entirely to us. When we communicate with people, through talking, writing, or touch, we consciously extend that private space to include others. A violation of private space is a form of mindreading. In building society around the digital world, we’ve ceded a massive amount of private space to move in safely. The only recourse to learning your boyfriend has read your journal is to hide it in a new place, but the only recourse to discovering people can hack your emails is to stop writing anything sensitive or private at all. By necessity, we’ve retreated inward. Our truly private worlds are almost entirely interior now. That loss of intimacy has already alienated us from one another. Unable to safely extend a hand or share a thought, our knowledge of people stops with avatars and public text. We can’t know people’s deeper feelings and they can’t know ours. There’s nowhere safe to talk. We are alienated.

In Citizenfour, Glenn Greenwald asked Edward Snowden why he would risk imprisonment — the obliteration of privacy. In doing so, Greenwald identified the one circumstance where personal agency is taken away. That the cyber debate revolves around the give and take of privacy tells us that we’re already in a prison of sorts. To get out, we need to reestablish laws and agreement. Not the tacit agreement of accepting free stuff in exchange for unknown costs but overt agreement and expectation of legal recourse if our rights are violated. As political theorist Stephen Krasner observed in the early 1980s: “The Constitution is a document more concerned with limiting than enhancing the power of the state.” Modern lawmakers violated this precept into extinction with the USA PATRIOT Act. There’s no current expectation that the present government will give up the Patriot Act of their own volition, and no reason to believe the public has the will to make them. This is where most people drop out of the resistance movement and succumb to prison life.

The other misstep in solving the puzzle is a myopic focus on the future. Pew Research Center’s Net Threats survey asked over 1400 technology experts to predict “the most serious threats to the most effective accessing and sharing of content on the Internet.” With so much focus on forecasting, we’re overlooking a wealth of facts in the present. Ask a South Korean mother living 20 miles from the DMZ in 1997 what the most serious threat to her children’s lives was and most Americans would have predicted a doomsday fear of war with the North. However, it’s just as likely she would have said: “See that black sedan driving 50mph over my front doormat…?” Attention-grabbing headlines often obliterate imminent dangers. Public discussion leapfrogs over what we could solve today because no one wants to dig in and do the unglamorous work of painting a dotted line down the center of the road. (Put another way: Why isn’t Pew asking these 1400 experts to identify today’s most solvable problem and offer a specific solution? That’s 1400 solutions right there.)

If technology is responsible for creating a state of alienation then the government is guilty of capitalizing on that alienation. When politicians appeal to people’s confusion over new technology, they perpetuate a dangerous myth that people can separate themselves from the digital age. Lindsey Graham’s opinion on cyber surveillance is useless if he doesn’t understand how Americans use email or why they might be upset that those emails are intercepted and read by government officials. Perhaps he’d like to turn his diary over to the CIA and see how that feels. His vote on privacy legislation would certainly be made with the necessary wisdom.

America is a world leader in computer technology and innovation. Every member of Congress, and certainly the next president, should be knowledgeable about computer technology. America’s elite governing body must be prepared to debate cyber. My 90-year-old grandmother has been sending emails for years and she has a Facebook account. If United States senators can’t keep up with her computing skills then they don’t belong anywhere near the Capitol. The most important action Americans can take is to vote for cybersmart House and Senate representatives in upcoming elections.

As backwards as Washington seems, cybersmart politicians do exist. It’s clear from Hillary Clinton’s decision to house computer servers in her home during her tenure at State that she’s knowledgeable about cyber. Despite her public statement, Clinton’s use of personal servers has nothing to do with convenience and everything to do with security. Clinton owns her data. She also possesses depth of knowledge about what goes on in the intelligence community. I expect that is what drove her to take control of her privacy. If she wants to do the country a great service, in or out of the White House, she should make cyber legislation her top priority and level the playing field for citizens everywhere. It would unite the country to speak plainly about the state of our internet. Honest talk about cyber surveillance from a public figure who can speak to both sides of the debate would be a huge step forward for the country.

What will hopefully become apparent, to decision makers and citizens alike, is that both sides of the ideological struggle derive their power from the online participation of citizens. The present situation has left people with nowhere to turn for trustworthy leadership. The conditions that permitted fascism’s spread after World War I — post-war malaise, financial struggles, political distrust — tamp down people’s natural resistance to incremental loss of agency. The circumstances that facilitated the rapid creation of totalitarian governments in previously liberal, rational societies are cropping up exactly one century later. The situation is again ripe for Machtergreifung, or power-grab.

Democratic European societies once made a desperate attempt to escape their status quo by funding unstable third parties with disastrous consequences. We are now seeing many radical ideas thrown into the mix, some backed by logical process, others attempting to shake people out of rhetoric fatigue. Reboot the Government! Reboot the Bible! Reboot the Brain! Drop one letter from those slogans and we’re deep in A.I. territory. Bill Gates, Elon Musk, Stephen Hawking and their ilk proclaim their fear of the dark side of artificial intelligence with increasing regularity. We should be afraid too. There’s no precedent for the power vacuum created by a flaccid Congress and a disproportionately wealthy technology sector. This situation could pave the way for the first artificially intelligent leader. The engineering is getting there, and the rest would be…history.

Excerpted from a longform analysis of historical, theoretical and political factors in the ongoing “cyber” debate. Full piece here— https://medium.com/@paintedbird/the-information-game-aee16ecdfd0d

The Information Game

…or How to Think About Cyber

There’s a gut-wrenching scene at the climax of the World War II biopic The Imitation Game. Alan Turing and the codebreakers at Bletchley Park decrypt a German cable and suddenly they know the enemy’s plan to attack Allied ships and, incredibly, all attacks for the foreseeable future. Their celebration is short-lived. Turing grasps the ephemeral nature of their discovery and has a sickening epiphany: To win the war they can’t tip off the Germans that they’ve decoded Enigma. Instead they must simulate ignorance by choosing strategic victories and sacrificing the rest of their men. Panic sets in. One of the codebreakers has a brother serving aboard a targeted convoy. He begs his colleagues to use what they know to spare his brother’s life but Turing is resolved. Their secret must be concealed at the highest cost. The ensuing choices haunted the intelligence community long after the war was won.

Over the last 14 years, Americans have been conscripted into an information war. Individual privacy is now incidental to the objectives of government and technocratic elites, and vulnerable to the exploits of criminals and extremists. The battle for control over the digital space is a gloves-off, civil-liberties-be-damned free-for-all. To reestablish trust in our oldest institutions, it’s necessary to parse the steps that led to the present situation and decrypt the objectives of contemporary leaders and policymakers.

RED FLAGS

Nearly a century after fascism took root in Europe and Nazism flourished in Germany, the question is still asked with incredulity: Why did German citizens permit and participate in genocide? There will never be a satisfactory answer to the moral question of why, but there is a clear beginning in the circumstances of how. The rise of fascism in post-World War I Europe began with a confluence of domestic troubles in Italy: a financial crisis, concomitant economic hardship, grief over millions of Italian war casualties, widespread dissatisfaction with political parties that failed to deliver on promises, and a perceived threat to financial security from a foreign (Communist) ideology.

Onto this stage stepped Benito Mussolini, a staunch nationalist and war veteran whose preoccupation with violence inspired the formation of an army of uniformed “Blackshirts” — unemployed youth, funded by the middle and upper classes, who assassinated opposition leaders, suppressed and destroyed opposition newspapers, and marched on the capital to take power in 1922. “A Brief History of the Western World” summarizes Italian fascism thus:

“In the beginning, as Mussolini himself admitted, [fascism] was largely a negative movement: against liberalism, democracy, rationalism, socialism, and pacifism…[Italians] had been cast adrift, let down by failed hopes of progress and happiness. Faceless in a mass society, they also felt alienated from themselves. The Fascists found an answer to this emptiness by arousing extreme nationalism….The fascist myth rejected the liberal reliance on reason and replaced it with a mystical faith. Stridently anti-intellectual, it held that the “new order” would spring from the conviction of the “heart.” Fascists therefore looked upon intellectuals as…suspicious characters…. Most ordinary Italians accepted Fascism with enthusiasm. The individual who formerly felt alone and unneeded, enjoyed a new sense of “belonging.”

The rise of fascism in Italy took less than six years from invention to political dominance. Fostered by comparable conditions in neighboring countries, the ideology spread across Europe and fatefully intersected with the political ascent of Adolf Hitler in Germany. The Germans have a word for Hitler’s rise to Fuehrer: Machtergreifung — Macht, meaning power, and ergreifen, to grab or seize. Like Mussolini, Hitler headed up a violent army of unemployed youth and committed illegal acts to dissuade and undermine his opponents, but it was the power vacuum created by ineffective German leadership that paved the way for the Third Reich and Nazism.

*

Flag of the Soviet Union

A second world war and one Pax Americana later, the world was pumped with Cold War adrenaline. In 1962, nuclear superpowers bumbled their way into a stand-off and lucked their way out of the unthinkable during thirteen days of diplomatic posturing over Cuba. The rapid advancement of nuclear technology meant there was no room for error, yet error upon error was made. In effect, American leadership failed the test but passed the class. America and Russia skated by on their shared basic values, but the crisis taught no lessons on how to face an adversary with profoundly different goals, specifically those rooted in tribal conflict and revenge.

In the aftermath of America’s nuclear showdown, political theorist Graham Allison published his seminal work “Conceptual Models and the Cuban Missile Crisis.” It would form the foundation of American foreign policy. Allison defined three distinct methods for understanding policy outcomes: the rational policy model (foreign governments behave rationally in relation to their goals), the organizational-process model (the military typically wants X, the bureaucracy typically wants Y, and historically they have relationship N to each other, so the outcome will predictably be Z), and the bureaucratic politics model, where shapeshifting factors such as interpersonal conflicts, bureaucratic inertia, and availability of resources act on each other to influence foreign policy outcomes. Government elites strongly favored the bureaucratic model as conventional wisdom that would shape American foreign policy for decades to come.

Political theorist Stephen Krasner reassessed Allison’s models, first in 1972, and later at the height of the “first” Cold War. He was troubled that President Kennedy, subcabinet members, and scholars from top public policy programs in the 1960s wholly adopted the bureaucratic approach, where outcomes were viewed as an evolving compromise of inputs. Krasner identified the fundamental flaw in the model as giving elite decision-makers a blanket excuse for their failures. Specifically, he reframed bureaucratic-politics thinking as a biased framework for blaming policy errors on the “self-serving interests of the permanent government,” where elected officials were viewed as powerless to corral the government “machine.” He summarized the infinite loop of accountability thus:

Bureaucracy is a machine, and “[machines] cannot be held responsible for what they do, nor can the men caught in their workings.”

This is a stunning double entendre for the Information Age.


DIGITAL DICTATORSHIP AND WARRING ELITES

Rights and privacy are dictated by an elite group of decision makers who control the laws (Government) and the digital infrastructure (Technocracy). Internet usage and hardware purchases now constitute a “vote.” The government and technology sectors each employ roughly 1% (3–4 million people) of the American population. The percentage of top-level decision-makers, technicians and analysts within those fields is assumed to be less than .01% of the American public and is therefore elite. The technocratic elite lumps Anonymous hackers in with tech CEOs, and the government elite includes members of all branches of government and political influencers with monetary or legislative sway. Since both elites invest billions of dollars successfully marketing themselves to society, the benefits they provide are widely known and will not be discussed here. Instead, the focus is the encrypted cost of advancement. Decoding the costs reveals which services and policies are truly beneficial, and to whom.

*

The Technocracy

The history of the government’s relationship with computer technology is long and complicated. Perhaps only one fact is universally accepted: Al Gore did not invent the internet. Contrary to popular folklore, he never claimed to invent the internet. Gore’s words were twisted, the transcripts are widely available, and he was subsequently defended by two of the “fathers of the internet,” Vint Cerf and Robert Kahn, as deserving “significant credit for his early recognition of the importance of what has become the Internet.” The urban legend illustrates the strange paradox of the Age of Information. Even with unprecedented access to the truth, millions of people are often misinformed.

Internet development began in the 1960s, coalesced into its broadly used TCP/IP iteration in the early 1980s, was commercialized through that decade and came into its own in the early 1990s with the introduction of the World Wide Web, the universally accepted infrastructure for data exchange on the internet. Web engineering is credited to Tim Berners-Lee’s 1989 proposal at CERN. It was developed over the next few years and made free to the public in 1993. Anecdotally, this snippet from the then-definitive International Law Anthology, enumerating issues then confronting global governing bodies, reveals the digitally unsophisticated world that received this new technology:

Global Communications: The earliest topics in this burgeoning field were international postal services and the laying of submarine cables. The invention of radio, television, and facsimile and modem communications technology, have led to explosive growth in this area of international regulation. Jamming and counter-jamming of another nation’s radio wave frequencies, channel regulation, remote sensing, and stationary satellite transmission are matters of intense interest. There is a move toward international broadcast standards and transmission quality. But there are also countervailing pressures against freedom of information, with some nations (and religious groups) desiring the suppression of international telecommunications relating to the advocacy of war or revolution, criticism of governmental officials or policies, regulation of commercial messages, and materials depicting real or fictional violence or pornography. — Anthony D’Amato, “Domains of International Law,” International Law Anthology

It reads like a mid-century newspaper clipping but that passage was published in 1994. Bill Clinton was president.

Twenty years later, Laura Poitras’s Oscar-winning documentary CITIZENFOUR is more than an exceptional historical record. The film is also a primer for technocratic culture and ideology. In June 2013, after months of anonymous communications, National Security Agency contractor Edward Snowden sat down face-to-face with Poitras and Guardian journalist Glenn Greenwald in a Hong Kong hotel room. Snowden spoke eloquently and fluently about the values at the root of his dangerous undertaking to leak classified documents detailing secret surveillance programs run by the United States government.

From CITIZENFOUR:

Glenn Greenwald: So, why did you decide to do what you’ve done?

Edward Snowden: For me, it all comes down to state power against the people’s ability to meaningfully oppose that power. I’m sitting there every day getting paid to design methods to amplify that state power. And I’m realizing that if the policy switches that are the only thing that restrain these states were changed you couldn’t meaningfully oppose these. You would have to be the most incredibly sophisticated technical actor in existence. I’m not sure there’s anybody, no matter how gifted you are, who could oppose all of the offices and all of the bright people, even all of the mediocre people out there with all of the tools and all of their capabilities. And as I saw the promise of the Obama Administration be betrayed and walked away from and, in fact, actually advance the things that had been promised to be curtailed and reined in and dialed back, actually got worse. Particularly drone strikes…That really hardened me to action.

GG: If your self interest is to live in a world in which there is maximum privacy, doing something that could put you in prison in which your privacy is completely destroyed as sort of the antithesis of that, how did you reach the point where that was a worthwhile calculation for you?

ES: I remember what the internet was like before it was being watched and there has never been anything in the history of man that’s like it. You could have children from one part of the world having an equal discussion where they were granted the same respect for their ideas in conversation with experts in the field from another part of the world on any topic anywhere any time all the time, and it was free and unrestrained and we’ve seen the chilling of that, the cooling of that, the changing of that model toward something in which people self-police their own views and they literally make jokes about ending up on “the list” if they donate to a political cause or if they say something in a discussion. It’s become an expectation that we’re being watched. Many people I’ve talked to have mentioned that they’re careful about what they type into search engines because they know it’s being recorded and that limits the boundaries of their intellectual exploration. I’m more willing to risk imprisonment, or any other negative outcome personally than I am willing to risk the curtailment of my intellectual freedom, and that of those around me whom I care for equally as I do for myself. Again, that’s not to say that I’m self-sacrificing because I feel good in my human experience to know that I can contribute to the good of others.

[transcription from video]

It’s striking that Snowden didn’t say “privacy” in his mission statement. Greenwald framed the debate with the question many of us would ask after hearing that we’re being surveilled, and subsequent news reports by outlets across the globe widely referred to “privacy.” It’s unclear whether Greenwald and Poitras heard more of Snowden’s thoughts in which he raised the issue of privacy himself, but on camera he doesn’t say the word. He advocated an unmonitored internet from the vantage point of someone who is highly skilled at protecting his own privacy. He recollected the realization, at his NSA desk, that before too long he — a member of the tech elite — would be technologically outpaced and unable to protect his privacy. The technocracy was losing ground to the government.

Society owes Edward Snowden an enormous debt for his decision to blow the whistle on the NSA at great personal risk. To be clear: he enabled a profoundly necessary conversation to begin. However, his poetic description of the unrestrained nature of intellectual advancement is technocratic rhetoric for a digital utopia that never existed. As compelling and passionate as he is, Snowden made several incorrect assertions that should be dispelled in the interest of productive discussion.

First, there have been many inventions in the history of man like the internet, including the space shuttle, the airplane, the telephone, and the galleon, all of which brought people together across vast distances at previously unmatched speeds to have discussions and exchange knowledge. Mankind went through periods of adjustment to those profound changes in infrastructure and we will navigate this one as well. Innovation is not unprecedented. This invention will mature beyond its makers and it must assimilate to the needs of civilization, not the other way around.

Second, the children can still spend their days online talking to experts as equals if they want to (though it’s doubtful they do). Invoking chilled children and cooled innocence is misleading rhetoric when it’s primarily adults who spend their time staring at a screen. Further, the tech industry pushes expensive gadgets and software for kids but, as highlighted by the New York Times’ “Steve Jobs Was a Low-Tech Parent,” many technocrats strictly limit gadget exposure for their own families because they’re aware of the harmful effects of internet and technology use on young minds. Teenage youth are a more complicated issue with regard to internet freedom, which is especially clear in the case of ISIL’s recruiting techniques, but Snowden wasn’t referring to Muslim children discussing ideas with expert terrorists across the globe. He wasn’t lamenting privacy incursions on thugs. In fact, he didn’t acknowledge the grey areas of internet freedom at all.

The most important falsehood in Snowden’s statement, and the core message of the technocratic ideology, is that the internet was once and should always be free. This is a seductive idea, especially to people with good computing skills and entrepreneurial leanings, but it is patently untrue. Getting online requires expensive hardware and infrastructure that is designed and sold by the same community that dominates the internet through technical expertise.

For the last 20 years the technology industry has hard-sold hardware to citizens, corporations and governments alike, along with software that seamlessly replaced or supplanted infrastructure for everything from financial transactions and brick-and-mortar stores to research and even face-to-face meetings. The technocracy orchestrated one of the greatest heists in history by amassing “free” content from writers and established media publications trying to maintain their brands with a millennial generation that wasn’t taught to pay people for their time, research, and intellectual work. As a final insult to “freedom,” tech companies undertook the systematic repackaging of users’ private information as data useful for advertising, which they bundle and sell to whomever they choose at a profit. (The word “user” rather than “customer” has always implied a barter arrangement, but it is rarely spelled out exactly what is being given and gotten. You open a social media account once, perhaps only use it for an hour or a day, but the service provider owns your personal information forever and can sell it many times over.)

In 2015, Apple, Microsoft, Google, IBM and Samsung rank in the top ten of Forbes’ World’s Most Valuable Brands, and 11 more technology companies are in the top 100. Six of the world’s 20 richest billionaires are computer technology elite. All of that free internet has paid for mansions and private educations. There’s nothing wrong with companies and people making money off of this invention — America is a proudly capitalist society — but perpetuating myths about intellectual freedom while raging against government misuse of personal data is hypocritical and misleading.

If it appears I’ve misinterpreted Snowden’s meaning entirely, breathe easy. It’s clear that Snowden’s “free internet” refers to freedom of thought, communication and information, not freedom of goods and services. However, the cyber conversation can’t bifurcate those billions of dollars from the billions of devices and trillions of gigabytes of data. Doing so hides the massively lucrative business objectives behind fun, sometimes addictive, products. If technocrats truly want a free, unrestrained internet they’re now rich enough to forgo that pile of money, make cheap hardware, set chaos-legitimizing rules (First Rule of Internet: There are no rules) and enforce the entropy. I doubt they’d have billions of takers, and no one would be typing a credit card number into a chaos box.

*

Screenshot from the Department of Justice website

The Government

Spying, surveillance and covert activity have always been part of America’s security and defense apparatus; that activity just wasn’t legal. Illegality was at the heart of clandestine work, making it extremely risky and therefore far more carefully considered by those commissioning it and those undertaking it. The legalization of amoral behavior came about in the weeks after 9/11 because, ostensibly, the president and his cabinet wanted the freedom to openly plan illegal activity without fear of legal repercussions. The PATRIOT Act inoculated government officials from risk and, many would say, ethical pause. What followed was a confident, risk-free expansion of intelligence infrastructure with no meaningful oversight and no endgame.

A nation that was once gripped by the unraveling of Richard Nixon now shrugs off revelations of CIA agents breaking into Senate Intelligence Committee computers in 2014. Government workers have spied on elected officials before, but today the public digests these incidents with a vague assumption that all criminal behavior by the government has a footnoted legal justification somewhere. These stories translate as infighting among elites. Fourteen years of the PATRIOT Act have conditioned Americans to expel what little outrage they can muster in a matter of days and then go limp. The groups taking legal action against injustices are typically news or special interest organizations with a financial or moral dog in the fight and powerful legal teams to back them. (The latest New York Times op-ed piece from Wikipedia’s Jimmy Wales and the AP’s lawsuit against the State Department over Hillary Clinton’s emails are two cases in 2015 alone.) Even with funded legal representation, there’s a pervasive sense that their effort is futile. For all of the flagrant rights abuses, the government’s tracks are papered over by the PATRIOT Act.

One way to step off the merry-go-round is to take a page from Alan Turing’s estimable problem-solving approach and look at what isn’t happening in our everyday lives. Government elites have made several huge assumptions on our behalf and, in light of Edward Snowden’s unspooling NSA leaks, it’s worth revisiting their decisions after seeing the results. The government uses negative hypotheses to great effect (if we don’t renew the PATRIOT Act…) and so can the people whose rights are in the balance.

What isn’t being done with NSA-collected data?

Potentially, the important stuff. Through indiscriminate data-collection, the NSA is extensively aware of wrongdoing by the American people, corporations, government agencies and officials. We don’t need Edward Snowden’s evidence to know this is true. Daily news stories show that digital communications include sexually harassing emails in the workplace, threats of murder or violence, faxed paper trails of embezzlement, proof of premeditated theft, telephonic recordings of gender and race discrimination, and documented personal indiscretions by public officials. The American government inadvertently nets evidence of myriad criminal acts, both domestic and foreign. It then employs people to sift through these stores looking for some lawbreakers, but not others. When intelligence officers stumble upon criminal or threatening activity that doesn’t serve their objectives, do they look the other way to conceal their methods? It’s conceivable, even probable, that actual lives have been lost to inaction rooted in concealment. What happens in situations like these? What do the numbers look like on paper — lives lost or ruined versus casualties from terrorist attacks? The legal ramifications are mind-boggling but the ethical question is straightforward: Is a government obligated to protect its people or its objectives?

What else isn’t being done with NSA surveillance data? For all of their time spent sweating over Apple’s Xcode, the U.S. government didn’t stop the Tsarnaev brothers, the French government didn’t stop the Charlie Hebdo murderers, and the U.K. government isn’t stopping thousands of teenagers from leaving the country, unaccompanied, to join ISIL. Most disturbing was the story of three teenaged girls who left the U.K. in February and may have been aided by a western spy in transit, forcing us to question why governments aren’t helping their most vulnerable citizens return to safety (and whether they may be using them as unsuspecting spy assets instead). With the Snowden data we have proof that our individual rights, and lives, are considered a worthy sacrifice to what the government deems “the greater good.” When spy agencies might be risking the lives of teenagers in the name of future terrorist attack victims, it’s clear government objectives no longer align with the values of the citizens they work for.

What if we don’t have the internet?

When Lindsey Graham weighed in on Hillary Clinton’s email debacle on Meet the Press with an I’ve-never-sent-an-email statement, he pumped a figurative fist of defiance. He’s a loud, proud Luddite in the new millennium. However, ask him where he does his banking, whether he gets money from the ATM, uses a cellphone, watches cable television, or has ever read the news online and he’ll be forced to admit he’s got a digital footprint. His televised statement gives him credibility with the anti-technology demo, the people who are done with all the smart talk and just want to love America with all of their hearts [see: Fascism, precursor to]. The only people alive today who aren’t consciously reliant on cyber technology are toddlers. The rest of the modern world communicates regularly online and is increasingly aware that public officials lack cyber expertise.

But what if we did live in Lindsey Graham’s la-la-land and didn’t have a digital footprint? A world without the internet is inconceivable today, but that world existed only two decades ago. In that time we traded infrastructure for more than just privacy. What we save in time and gain in information should be weighed against what we spend in dollars to participate in the digitized world.

A sliver of the data shows that in 2014, 177 million smartphones were sold in North America, amounting to $71 billion in sales. Globally, 1.3 billion smartphones were sold. Add to that PC, tablet and cellphone sales, software sales, internet and cellphone service contracts…Americans pay a lot of money to go about their daily lives. This is not to suggest we should shun progress and innovation, but we should know what we’re getting for our money. We aren’t getting shiny new laws for the digital infrastructure we depend on. Our brightest technological minds unwittingly innovated a cyber-police state and elected officials aren’t knowledgeable enough, or confident enough, to walk back what technology wrought. For a country that leads the world in cyber technology, many of our legislators are tech-dumb to the point of ridiculousness. The fatal mistake would be to insist we can separate ourselves from the infrastructure of modern society by never sending an email. Politicians like Graham sell that idea because it sounds freeing [See: Paternalism, Fascism’s sweet-faced uncle] but they’re diverting attention from the pressing issue of lawmaking because they clearly have no idea where to begin. The gridlock in Congress might not be gridlock at all. Perhaps our representatives are simply confused about how to hit “Send.”

Finally, who doesn’t control personal data?

If the answer to this question isn’t obvious yet then it’s worth stepping into the nearest bathroom and checking out the wall above the sink. (Or ask Hillary Clinton. She gets it.) In military jargon, intelligence refers to strategically useful information. Information becomes intelligence when it has an application, and that application is determined by whoever finds, reads, assesses and controls the information. To grasp how important this seemingly obvious statement is, consider the juxtaposition of Director of National Intelligence James Clapper and former NSA contractor Edward Snowden, two men inside the same intelligence apparatus, in control of the same information, who found starkly different uses for it.

From this we must conclude that, within the government, a select group of officials and contractors control our information and they each have specific objectives in mind. Then we must acknowledge that almost none of us can articulate what those individuals’ objectives are so we don’t know if we agree with them. As internet-reliant citizens, we play the odds every time we connect digitally, not knowing which side of the numbers game we’re on. To use the analogy of WWII Britain, are we the majority at home or the unsuspecting brothers on targeted convoys? None of us can answer this question because the government elite draws up the map in secret. To the extent that events unfold in a manner we agree with and our lives aren’t negatively affected, we can only say we got lucky.


Loading screenshot of Google’s Virtual Library project

HOW WE CIVILIZE TECHNOLOGY

Living in Asia in the late 90s, I spent time in countries that were then considered “developing” economies. Textbooks were filled with prognostications about the potential growth and downfall of these places, but no bar chart captured the terrifying hilarity of driving an hour outside of Seoul at high speed in a brand-new sedan on unpaved roads, with only potholes and feral animals to navigate by. Technology was tangibly out of sync with infrastructure. A blocked road sent drivers veering onto the front steps of houses. Parking was wherever you felt like it, and parked cars were often rendered inaccessible due to other people’s feelings about parking. Disagreements were resolved the old-fashioned way with pointing, yelling, and threat of fists. Over time, enough pedestrians became casualties and enough expensive tires were blown in potholes that laws became necessary, as did the paving of roads. The automobile is no less amazing because society set a speed limit. We mitigate and retard technology where it threatens and outpaces us. This is how we civilize our innovations.

The most poignant irony of the Information Age is the internet’s role in restructuring our relationship to politics. Snowden avowed his intent to end the tyranny of the snooping government, but technocratic paternalism is equally invasive and it’s built into the digital realm. Complicated legal documents pop up at the outset of a business relationship and people with no legal background are conditioned to move ahead with a trust-us, one-click “Agree.” Our relationship to intelligent technology is best portrayed by the routine updates we tacitly agree to without reading or understanding what they entail. I Agree to whatever you’re about to load onto my phone or into my computer, agree to what you think is best for this device and my use of it, agree without stipulation, agree without working knowledge, agree because not agreeing seems time-wasting and foolish and questioning is beyond my technical ability. I always agree with you because everyone else is agreeing with you so it must be okay. I always agree with you because I don’t know why I should disagree.

This habitual agreement has proved deadly to the exchange of real information. The technocracy devised the fastest, most appealing method for securing a user, and internet users subsequently became desensitized to the act of giving away their rights. The repetitive process has numbed healthy suspicion of any organization that demands legal agreement to a loss of personal agency. Those internet service agreements are not there to protect individuals; they are documents created by expensive legal teams to ensure a company has no responsibility to the consumer. If these statements aren’t disturbing enough, stretch them to apply to the government in the shocking months and years after 9/11. The PATRIOT Act was the federal government’s service agreement, and the majority of the American people agreed to it without understanding what they were signing away.

Fourteen years on, perhaps the greatest misstep in rectifying our mistake is to begin with privacy. Loss of privacy is an end result. Privacy can be protected, it can be violated, but it cannot be given. That notion is a falsehood born of Victorian manners — I’ll give you some privacy — which preface uncomfortable directives: Leave the room. Get off the line. Turn your head. Don’t read my emails. I need my privacy. The sci-fi notion of “mindreading” is terrifying precisely because it violates the only space on earth that belongs entirely to us. When we communicate with people, through talking, writing, or touch, we consciously extend that private space to include others. A violation of private space is a form of mindreading. In building society around the digital world, we’ve ceded a massive amount of private space in order to move safely. The only recourse to learning your boyfriend has read your journal is to hide it in a new place, but the only recourse to discovering people can hack your emails is to stop writing anything sensitive or private at all. By necessity, we’ve retreated inward. Our truly private worlds are almost entirely interior now. That loss of intimacy has already alienated us from one another. Unable to safely extend a hand or share a thought, our knowledge of people stops with avatars and public text. We can’t know people’s deeper feelings and they can’t know ours. There’s nowhere safe to talk. We are alienated.

When Glenn Greenwald asked Edward Snowden why he would risk imprisonment — the obliteration of privacy — Greenwald identified the one circumstance where personal agency is taken away. That the cyber debate revolves around the give and take of privacy tells us that we’re already in a prison of sorts. To get out, we need to reestablish laws and agreement. Not the tacit agreement of accepting free stuff in exchange for unknown costs, but overt agreement and an expectation of legal recourse if our rights are violated. As Stephen Krasner observed: “The Constitution is a document more concerned with limiting than enhancing the power of the state.” Modern lawmakers violated this precept into extinction with the PATRIOT Act. There’s no indication that our present government will give up the PATRIOT Act of its own volition, and no reason to believe the public has the will to make it. This is where most people drop out of the resistance movement and succumb to prison life.

The other misstep in solving the puzzle is our obsession with predicting the future. Pew Research Center’s Net Threats survey of over 1400 technology experts asked them to predict “the most serious threats to the most effective accessing and sharing of content on the Internet.” But with so much emphasis on forecasting, we’re overlooking today’s storm. If you’d asked a South Korean mother living 20 miles from the DMZ in 1997 what the most serious threat to her children’s lives was, most Americans would have expected her answer to be a doomsday scenario of war with the North. However, it’s just as likely she would have said: “See that black sedan driving 50mph over my front doormat…?” Attention-grabbing headlines often obscure imminent dangers. Public discussion leapfrogs over what we could solve today because no one wants to dig in and do the unglamorous work of painting a dotted line down the center of the road. (Why isn’t Pew asking these 1400 experts to identify today’s most solvable problem and offer a specific solution? That’s 1400 solutions right there.)

If technology is responsible for creating a state of alienation then the government is guilty of capitalizing on that alienation. When politicians appeal to people’s confusion over new technology, they perpetuate a dangerous myth: that people can separate themselves from the digital age. Lindsey Graham’s opinion on cyber surveillance is useless if he doesn’t understand how Americans use email or why they might be upset that those emails are intercepted and read by government officials. Perhaps he’d like to turn his diary over to the CIA and see how that feels. Then his vote on privacy legislation would certainly be made with the necessary wisdom.

America is a world leader in computer technology and innovation. Every member of Congress, and certainly the next president, should be knowledgeable about computer technology. America’s elite governing body must be prepared to debate cyber. My 90-year-old grandmother has been sending emails for years and she has a Facebook account. If senators can’t keep up with her rudimentary computing skills then they don’t belong anywhere near the Capitol. The most important action Americans can take is to vote for cybersmart House and Senate representatives in upcoming elections.

As backwards as Washington seems, cybersmart politicians do exist. It’s clear from Hillary Clinton’s decision to house computer servers in her home during her tenure at State that she’s knowledgeable about cyber. Despite her public statement, Clinton’s use of personal servers has nothing to do with convenience and everything to do with security. Clinton owns her data. She also possesses depth of knowledge about what goes on in the intelligence community, and I expect that is precisely what drove her to take control of her privacy. If she wants to do the country a great service, in or out of the White House, she should make cyber legislation her top priority and level the playing field for citizens everywhere. It would unite the country to speak plainly about the state of our internet. Honest talk about cyber surveillance from a public figure who can speak to both sides of the debate would be a huge step forward for the country.

What will hopefully become apparent, to decision makers and citizens alike, is that both sides of the ideological struggle derive their power from the online participation of citizens. The present situation has left people with nowhere to turn for trustworthy leadership. The conditions that permitted fascism’s spread — post-war malaise, financial struggles, political distrust — tamp down people’s natural resistance to incremental loss of agency. The circumstances that facilitated the rapid creation of totalitarian governments in previously liberal, rational societies are cropping up again a century later. The situation is again ripe for Machtergreifung.

Democratic European societies once made a desperate attempt to escape their status quo by funding unstable third parties with disastrous consequences. We are now seeing many radical ideas thrown into the mix, some backed by logical process, others attempting to shake people out of rhetoric fatigue. Reboot the Government! Reboot the Bible! Reboot the Brain! Drop one letter from those slogans and we’re deep in A.I. territory. Bill Gates, Elon Musk, Stephen Hawking and their ilk proclaim their fear of the dark side of artificial intelligence with increasing regularity. We should be afraid too. There’s no precedent for the power vacuum created by a flaccid Congress and a disproportionately wealthy technology sector. This situation could pave the way for the first artificially intelligent leader. The engineering is getting there, and the rest would be…history.


CONCLUSION

At the end of The Imitation Game, when the Germans have been defeated and the war declared a victory, the British codebreakers sit around a table to be dismissed. They are solemn and alienated from one another because of secrecy, spying, suspicion, and lying, though they each believe their transgressions were the morally responsible thing to do. They’re ordered by their government to keep yet another secret — deny everything they know and deny they know each other. The path they’re on has no exit and no truth. They’re in a prison of past decisions and will be for the rest of their lives. However, the circumstances that created their prison are the opposite of America’s situation today. In WWII the British government was desperate. The enemy was winning. Their strategy wasn’t clandestine by design but by circumstance, and the British public was spared the burden of deciding whom to sacrifice.

Today we’re faced with governments and corporations that spy, lie, classify decision-making, and manipulate online users. These conditions are self-perpetuating. There is no definitive endgame in the shapeshifting political narratives and money-making schemes except to exert more power over the online space. To reclaim the space for public privacy, we must take the messages we’re being sent and decrypt the puzzle ourselves. Whether your bias is to fault the system or the individuals who make decisions within it, both are responsible for mistakes, and both hold the keys to solving the puzzle. The trick is to look at what isn’t there, and to ask why something is free.

Obama’s Other War


At the Oscars last weekend, Sean Penn presented Mexican director Alejandro Inarritu with the Best Picture Oscar and a joke. “Who gave this son of a bitch his green card?” His comment hurt people but it was important that he said it. With a seemingly off-the-cuff remark he reminded billions of people worldwide that for the fifth year in a row America’s most celebrated film came about because America is both a temporary and permanent home to talented hard-working foreigners. Acknowledging Inarritu without acknowledging how he came to make his much beloved film is the ugly habit that perpetuates a damaging fiction of American life.

Rudy Giuliani tried his hand at the same topic last week. His swipe was serious where Penn’s was a joke but both men drew similar reactions in the media. Giuliani blessed us with something approximating the cliché second act plot twist in a romantic comedy when he announced at a GOP fundraising dinner that President Obama doesn’t love America. “He doesn’t love you, and he doesn’t love me.” Giuliani clarified in a follow-up interview that Obama is a patriot but that doesn’t mean he loves his country. (Relationships are so complicated.)

Giuliani and Penn are both savvier than their comments suggest. There’s a political message buried in the birthright narrative that Americans are finally on track to demystify. Beneath the fabled veneer of an all-American childhood is the reality that there is no uniformity to the American way of life. This is a vast, complex, multicultural democracy. Between cities, towns, states, timezones, and even between parents and children, there are stark differences in American upbringings and American lives. Anyone who defines America by his own experience is describing a culture of one.

When Giuliani stated that “[Obama] wasn’t brought up the way you were brought up and I was brought up…” he got one important fact right. Giuliani and the President have different backgrounds. Giuliani was born and raised in New York. Obama belongs to a group of Americans (and other nationalities) who spent part of their childhood as expatriates. The term Third Culture Kids (now Third Culture Individuals or TCIs) arose in the 1950s to describe American children of expatriate military, foreign service, missionary and business families. Modern surveys on TCIs paint an interesting portrait of the “global citizen” with hallmark traits of linguistic adeptness, creativity, and excellent observational skill, but the salient characteristic of a TCI is multiculturalism.

There are various interpretations of multiculturalism (as one would expect) but the predominant tenet is to preserve and respect cultural and ethnic differences. Coexistence is the goal, rather than dominance by one culture. In extreme situations, rejection of multiculturalism is the justification for genocide. In more moderate societies, cultural intolerance plays out in the economic realm, when minorities suffer without access to the same opportunities as the majority, the armed, or the wealthy. Although the world has always struggled to accommodate cultural differences, modern civilization presents a unique confluence of culture, technology and mobility. In 2015, a person can physically travel to anywhere in the world within 24 hours and can virtually connect with nearly 50% of the world’s population in mere seconds via the internet. This unprecedented proximity of cultures is not optional food for thought. The interconnected world requires multicultural leadership.

Thus, multiculturalism is the key to America’s future. America’s power as a world leader is predicated on a thriving world to lead. (Translation for Team Giuliani: You can’t be great if you’re the only country at the table.) Leadership means fully grasping both your own potential and the potential of those you lead, and those you compete against. A quarterback is nothing without his team, and a great quarterback is the first to acknowledge the talent and efforts of his opponents after the game. Why? The point of competition is to test your skills, not to marginalize others. A victory over the weak is profoundly unexceptional. Collaboration is the unspoken foundation of competition. The ethics of sportsmanship assume that we make each other better by giving our best effort and playing our hardest and fairest. Countries are no different than teams in this regard. America needs the world as much as the world needs us.

As America’s quarterback, Obama has struggled to find his footing. He took office with the expectation that people were rolling up their sleeves beside him, yet his message of multiculturalism and his invitations for all views at the table were met with increasingly virulent suspicion from opponents and supporters alike. The people who voted him into office were unprepared for the “otherness” of his ideology and they reacted by withdrawing. Obama was equally at a loss for how to allay people’s fears. He doggedly stuck with the tactics that won him the presidency — the explanations of inclusion and the promises of cultural prosperity — but the through line of the story wasn’t conveyed: that change begins at the top, but real change happens within the people. This dynamic is at the heart of America’s culture crisis. If we resolve it, we will be exceptional for doing so.

In the meantime, the communication breakdown has spiraled into the worst gridlock in Congressional history and Obama has lost the trust of his constituents. The quarterback is nothing without his team… It’s irrefutable that Obama loves America. He has served as our president for six years, and counting. (If that service isn’t enough for Giuliani then he’s taking the definition of love to a whole new level. I kind of want to go there just to see what I’ve been missing.) However, Obama likely sees one intractable barrier to America’s limitless potential: just like the rest of the world, America is a population of Rudy Giulianis. Not the Giuliani who makes racially insensitive remarks, or spouts provocative political rhetoric, but the broadly-drawn Giuliani who is fearful and suspicious of “other” and demands reassurance that he is exceptional. There’s a Giuliani in each of us and he is at odds with multiculturalism.

The most powerful act of cultural evolution Americans can hope to achieve today is to embrace their diversity. A global community resides within American borders. Acceptance of American diversity as our new millennium identity is a conscious act of self-education, of stepping beyond familiar terrain and learning about the people who reside steps, minutes, or miles away. To fully grasp our potential we need to know each other. With one click we can learn about, connect with, and even see people far beyond our own borders.

We have in Obama a president who is uniquely suited to help us balance a multitude of views. A TCI president was an intelligent choice to grow America’s position as world leader in the Information Age. Obama possesses both an abiding love of this country and a deep understanding of the riches of the world at large. His espousal of multicultural views is exceptional in the canon of presidential rhetoric. However, for this president to be effective Americans must actively embrace diversity on an individual level with self-directed hiring practices, awareness of conflicts, and learned skill at resolving intercultural differences. No policy will make the difference. Obama can only lead by example.

Sean Penn knew there was a high probability he’d be handing the Best Picture Oscar to Alejandro Inarritu, his Mexican friend and colleague. I expect the “green card” comment was a calculated statement, not a flippant joke. Ever the activist, Penn took it on the chin for progress and invited people to hate him for the sake of opening dialogue. The hurt people feel at the mention of “green card” is not because Penn said the words, but because of what those words have come to signify in daily American life: other, different, less than, less American. Wasn’t brought up the way you were brought up and I was brought up. Doesn’t love you, and doesn’t love me. That story is out of date. The new romantic tale of America is one where we take our love to the next level and learn to embrace who we want to be, a society of tolerant, peacefully coexisting people who draw on our vast, diverse strengths, each of us different, all of us equal.

Satire, Foreign Policy and the Sony Hack


Personally, I would prefer to live in a world where Seth Rogen and James Franco aren’t our foreign policy drivers. Everyone who works at Sony probably feels the same way right now, and quite a few busy people at the State Department, too. North Korea is a loose cannon with a long history of erratic foreign and domestic policies, but the aftermath of the Sony hack has seen America making equally temperamental choices. America is playing down to a lunatic’s level and ignoring lessons it might have learned from 9/11. The notion that America’s free speech is being messed with because The Interview is in distribution limbo is the kind of histrionic overstatement that citizens of a superpower make when they don’t have an accurate self-image.

Prior to the hacking incident, I saw a trailer for The Interview and had a visceral reaction: putting this film out is a terrible idea. I work as a screenwriter now, but my college degree was earned at Georgetown’s School of Foreign Service with a specialty in comparative studies of Asia and Europe. My thesis was on power in the Asian region. I lived and traveled extensively in Asia. From an admittedly dated knowledge base, I feel confident saying that anyone who thinks they won’t get a response from North Korea for depicting the bloody assassination of its leader, images that will be exported globally through the American marketing and distribution machine, is truly living in a fantasy world. If the tables were turned and a film studio in an adversarial country depicted the violent assassination of our leader as comedy and, most importantly, had the power to share that film worldwide, we’d be disgusted and outraged. America has resources and official diplomatic channels to respond to that sort of propaganda attack. We’d start by demanding an apology. In the case of The Interview, America is the perpetrator and we’ve gone after an isolated, unstable dictatorship. Sony foolishly picked a fight with a cornered, rabid dog and dragged the entire country into the alley with them. America has no choice now but to stand behind a questionable film on principle. This is not a strong position.

Satire has a goal. It’s not toothless. Americans frequently, maddeningly blur the line between satire and bad behavior. In the worst cases, racism, misogyny and hate are passed off as comedy. In the middling cases, comedy promotes the status quo, which generally isn’t a good thing. For material to be satirical the writers must have a firm grasp of the issues, be skillful at self-examination, and have the goal of shifting people’s perceptions toward greater clarity. The South Park series comes to mind as an example of great satirical writing, as does The Simpsons. Tropic Thunder was an incredible satire of the film industry, with an edgy script that pushed far beyond discomfort into outright offense and insult. Those writers put Hollywood under the microscope and dissected it with aplomb.

In contrast, bad behavior is poking fun at something — a person, an idea, a philosophy, a moral precept — without self-examination. While I don’t know Rogen or Franco personally and I have not watched The Interview, I struggle to be optimistic that Rogen has written a politically self-aware satire of America’s relationship to North Korea. I really enjoyed Rogen’s frat comedy Neighbors, and his upcoming Sausage Party sounds like it will keep his fans happy, but they’re two of many reasons I expect The Interview is no Catch-22 or Dr. Strangelove. The synopsis reads like a couple of stoner writers thought “dictators are stupid and wouldn’t it be funny if…” Well, the answer is no. America assassinating the leader of a foreign country isn’t funny at all and we shouldn’t be in the position of defending it as humorous or entertaining. Now we’re stuck promoting an image overseas that we’ll wield our considerable power in defense of our right to spend Christmas Day laughing at Kim Jong-un’s dismemberment at our hands. The film is a propaganda attack on North Korea’s sovereignty, intentional or otherwise, and one that America really doesn’t want to instigate. There are too many other fires burning.

In touting the release of The Interview as a symbol of our right to say or do anything we want, the American public is trading free speech for common sense and confusing comedy with xenophobia. Further, the aftermath of the initial data dump generated an ugly public conversation about celebrity emails and then about censorship and the perceived cowardice of the victims of the attack. In this way, the public and the media abetted the attackers. To suggest that Sony is “caving” or “capitulating” to people who are threatening violence to their employees and the general public is essentially to say that Sony should ignore their hostage situation. Until Sony is “released” or has outside protection, the company has no way to push back against their attacker. “Free speech” as a concept is not remotely in danger. Individuals and a company are in danger. Sony employees have already been terribly compromised by this cyberattack, and they’re under continued threat. Sony made a mistake with this film, but the company needs the country’s support to get through the situation. It’s important to grasp how effective we could be in pushing back against cyberattacks if we’re all on the same page. Instead, the hackers have forced us to get behind The Interview, a movie that promotes a threatening image of American foreign policy. No one wants to be in that situation. That’s the precedent we don’t want to set.

People who worry about the future of free speech in this country can rest easy. The fallout from The Interview potentially has more long-term positive effects on free speech than negative ones once the danger is over. For one thing, our awareness of how to wield American power in a technologically interconnected world will be greatly increased. We can learn from these mistakes. The film industry needed a recalibration in how it assesses its output and true reach. While this incident may make the Hollywood community fearful initially, the way the country stands behind Sony and deals with the hackers will ultimately embolden executives and talent to make smarter, sharper political films once they’ve shored up their vulnerabilities. Defiance is the backbone of change.

9/11 threw America into a state of fear that divided us. We continue to be divided, and easily distracted. It’s time to regroup so we can address crises like these successfully. America’s power lies dormant in a unified voice we’ve forgotten. Without it, we continue to be vulnerable to even the weakest dictators.

The Unknown Known



America’s Addiction to Black and White Thinking

Errol Morris (off camera): If the purpose of the war is to get rid of Saddam Hussein, why can’t they just assassinate him? Why did you have to invade his country?

Donald Rumsfeld: Who is “they”?

Morris: Us!

Rumsfeld: You said “they”! You didn’t say “we.”

Morris: Well, we. I will rephrase it. Why do we have to do that.

Rumsfeld: We don’t assassinate leaders of other countries.

Morris: Well, Dora Farms, we’re doing our best.

Rumsfeld: That was an active war.

(Transcription)


Young children think in black-and-white terms. Something is good or it’s bad, the answer to all questions is a variation of “yes” or “no,” and anything more complicated results in frustration. Once kids master basic reality, they move on to complexity. Someone who does a bad thing, like kick you in the shin, isn’t necessarily a bad person, and someone who does a good thing, like offer you candy, can’t automatically be trusted. Grasping that life is nuanced is a rite of passage that signals maturity and readiness for greater responsibility.

If media and culture are taken as a fuzzy reflection of American tastes, it’s evident that the country needs its leaders to be heroes or bad guys. The public seems largely indifferent to anyone who doesn’t enthrall or disgust it. People demand “the truth” when it’s staring them in the face; they just can’t see it for all of those pesky conflicting details. Such is the case with Errol Morris’s fascinating documentary, The Unknown Known.

The film’s subject is Donald Rumsfeld, the charismatic politician with an uncanny knack for finding a seat in the Oval Office during nearly every political crisis since the 1970s. A Princeton graduate and Navy pilot, he was elected to Congress in 1962 at age 30 and went on to hold several posts in the Nixon administration (during which time he hired Dick Cheney as his assistant). In 1973, he became the U.S. Ambassador to NATO, a fortuitous departure from Washington that allowed him to emerge unscathed from Nixon’s disgraceful exit and return to the White House as Gerald Ford’s Chief of Staff. He subsequently became the youngest Secretary of Defense in history, then moved into the private sector as CEO, and later Chairman, of pharmaceutical company G.D. Searle. George Shultz asked him to return to public service following the Beirut Barracks attack that killed hundreds of American soldiers in 1983, and Rumsfeld set off on a fact-finding mission as Ronald Reagan’s Special Envoy to the Middle East. He was also notably a lead contender for Reagan’s Vice Presidential running mate, although the position ultimately went to future president George H.W. Bush. Most people know Rumsfeld best for his final tour in Washington as George W. Bush’s Secretary of Defense from 2001 to 2006, the guy who took America into prolonged, unsuccessful wars in Afghanistan and Iraq in the years following the 9/11 terrorist attacks. In 2014, Rumsfeld is home. Our troops are not.

The Unknown Known doesn’t present any groundbreaking information on history or the six decades Rumsfeld has been in and out of public office. Instead, it recounts what we already know and, in the process, presents an inconvenient truth about American culture: Americans habitually disavow the known knowns of power and democracy when they dislike an outcome. When a country is debating whether a war or a leader was good or bad, honest or evil, it’s safe to say no substantive lessons will be learned.

In his New York Times supplementary pieces, Morris says that The Unknown Known was his quest to pin Donald Rumsfeld down and get some answers to the quagmire of the Iraq war (never mind that Rumsfeld famously stated he “doesn’t do” quagmires.) The entire press corps and media establishment could not accomplish Morris’s goal during Rumsfeld’s ill-fated second stint as Secretary of Defense but this is insufficient evidence for Morris that his objective is a fool’s errand.

For reasons that aren’t disclosed in the film, Morris is a biased, bordering on hostile, interviewer. Perhaps he felt obliged to play the role of interrogator due to the mistreatment of detainees on Rumsfeld’s watch, or maybe he grew frustrated as his questions sent him in familiar circles. No matter the reason, the film is both riveting and disturbing as it illustrates our cultural addiction to black and white thinking, even when we’re prepared and committed to dig for answers.

While American soldiers are still hard at work over in Afghanistan, Iraq, and now the entire Levant, Morris frames Rumsfeld’s account of sending troops to the Middle East as utterly lacking intelligence or rationality. For some, any war lacks intelligence and rationality. Morris’s line of questioning seems to place him in that category. His interview with Rumsfeld plays out as two opposing ideologues who have no interest in understanding each other, only in being understood. An award-winning documentarian with a sterling reputation, Morris indirectly reveals the crux of our national dilemma: We already know everything we need to know here. The real question is: What are we prepared to do about it?

To make the best possible decision, it’s worth laying the cards on the table and playing an open hand.


Photo Credit: Tech. Sgt. Cherie A. Thurlby, U.S. Air Force. (Released)

Known #1: Rumsfeld Left a Prolific Paper Trail

It’s apparent during the film that Rumsfeld is a guy who strictly adhered to the rules and expected others to do the same. He was “dumped” by Nixon in 1973 for not being crooked enough. He was willing to stake his job on corralling Condoleezza Rice when he perceived her to be overstepping her bounds as National Security Advisor. Most notably, he left a paper trail of memos he estimates to be in the “millions” that communicated his thoughts throughout his years of public service. This isn’t a guy who seems to have anything to hide, yet that’s evidence no one wants to acknowledge because it runs counter to the bad guy theme.

Morris raises the topic of Richard Nixon’s proclivity for self-recording. Rumsfeld offers the explanation that perhaps the wayward president felt everything he said was valuable. Morris asks if Rumsfeld knows of any president since who made similar recordings. Rumsfeld says he does not and suggests that people tend not to make the same errors as their predecessors but instead make “original” mistakes. This thinking highlights the key to Rumsfeld’s confidence and serves to make him a compelling figure. While Morris tries to draw a comparison between Rumsfeld’s millions of memos and Nixon’s omnipresent tape recorder, it’s the differences that are most striking. Rumsfeld’s memos were overt and, by his own description, his primary tool of communication with his staff. They were “working documents.” For sheer bulk and publicity, Rumsfeld’s “snowflakes” seem like a precursor to the cult of selfies more than an echo of Nixon’s paranoia. Further, being working documents, the snowflakes were intended to be fluid, even ephemeral. No matter how we choose to interpret the facts, Rumsfeld offers plenty of evidence that he’s working above ground, which contrasts sharply with the vacuum of available memos authored by Dick Cheney, Karl Rove, and others in the G. W. Bush Administration.

Morris observes that Shakespeare portrayed historical conflicts as entirely hinging on infighting and personality struggles between individuals in power. This comment follows Rumsfeld’s refutation of any meaningful strife between himself and George H.W. Bush in the years leading up to a presidential candidate fork in the road. It’s interesting that Morris opts to leverage a playwright who dramatized monarchies for the pleasure of a monarch against Rumsfeld’s feathery dismissal of personality politics in American democracy. The blueprints for America’s polarization are on display here: The imagination of those not in power (Morris, Shakespeare) leans toward notions of infinite unchecked aggression, while an erstwhile decision-maker in the most powerful government in the world (Rumsfeld) privileges the process and the system, and grasps the limitations of what one man can actually do with his aspirations in a democracy. Nixon abused his power, was caught and ejected. Rumsfeld recorded his thoughts and intentions publicly and stands by those thoughts today. Morris goes after Rumsfeld as though Rumsfeld is hiding in plain sight, but his questions assume that Rumsfeld has something to hide in the first place. Is it possible the only thing Rumsfeld concealed was his ambition? And if so, based on his track record, did he even conceal it?

How could a man who is so transparent have duped the public into war? In a dictatorship, people are cowed into submission and given no choices, but in a democracy it isn’t rational to claim that an entire population was mystified. A handful of representatives cannot lead a vast, multifaceted, democratically empowered people so far afield of their purported values. Asserting otherwise is an abdication of responsibility, and getting to the truth of what America wrought in the aftermath of 9/11 will require accountability on all sides of the equation. Democracy is predicated on power residing in the people.


Photo credit: Bio. A&E Television Networks, 2014. Web. 19 Nov. 2014.

Known #2: Saddam Hussein Did Not Have Weapons of Mass Destruction

A telling exchange unfolds when Rumsfeld discusses the hunt for and eventual capture of Saddam Hussein. His subordinates asked him if he wanted to talk to Hussein. Rumsfeld declined. In the film, Rumsfeld says the person he would have liked to talk to after all was said and done was Tariq Aziz. Aziz was Hussein’s Deputy Prime Minister and right-hand man, whom Rumsfeld first met on his travels through the Middle East in the 1980s. The two men spent hours in conversation and Rumsfeld found Aziz to be a rational guy. He says he would’ve wanted Aziz to explain what alternative approach might have worked to get the Iraqis to “behave rationally.”

With all of the additional evidence we now have about the Iraq War, it’s odd that Morris doesn’t take this opportunity to reframe Rumsfeld’s perspective here. Granted, Morris would have to sympathize with our enemy, a dictator, to point out that the United States was the irrational actor. Saddam Hussein perpetrated unforgivable violence on his own people, but his misreading of his situation vis-à-vis America in 2003 is entirely understandable. America did to Iraq what it should have done to Pakistan if it was serious about invading countries that harbored terrorists connected to 9/11. Hussein couldn’t possibly have anticipated the actions of an irrational superpower. America stepped onto the world stage and presented fabricated “evidence” of a nuclear bombmaking program Hussein did not have as justification for starting an illegitimate war.

In playground vernacular that Rumsfeld might appreciate, America got kicked, so it turned around and kicked someone weaker, harder.

Why Morris doesn’t put this to Rumsfeld speaks to how far back we need to go to sort out the context for our decisions, and how deep we need to wade into the dark questions of what we’ve done. The risk is that we’ll be forced to absolve a few bad actors of wrongdoings, that we’ll show characters like Hussein to be less evil or crazy than we need them to be to feel okay with ourselves. Should America forgive itself for acting irrationally after 9/11 and pursuing revenge in lieu of justice? Eventually, but that forgiveness can only come with acknowledgment of its mistakes. That Morris doesn’t take Rumsfeld to that place, a place so many of us want to go, is a missed opportunity to find consensus between polarized Americans. Morris won’t let Rumsfeld off the hook and Rumsfeld won’t ask to be released. It’s gridlock that is breaking the country.


Known #3 – Rumsfeld Was Not the Chief Architect of the Iraq War

Morris: When you’re in a position like Secretary of Defense, do you feel you are actually in control of history, or that history is controlling you?

Rumsfeld: Neither. Obviously you don’t control history, and you are failing if history controls you.

This is an excellent answer. It is representative of almost all of Rumsfeld’s answers to Morris’s questions. Morris gives Rumsfeld two options: Are you a megalomaniac or a pawn? Do you believe you control history, or that it controls you? Rumsfeld responds by exposing the black-or-white supposition buried in Morris’s question but doesn’t go further by revealing a third option as he sees it. Both of these guys are smart enough to come up with a third option, even one they might agree on, but neither sticks his neck out to hazard one. They’re both too vexed by a need to be understood. (Answerer: Here’s why I did this. Questioner: Here’s why I hate it.)

The question remains why Rumsfeld didn’t guide the country toward a bit of soul-searching in advance of going to war. If history isn’t going to have us by the tail, if we’re not going to “fail” by Rumsfeld’s definition, we needed to pause and collectively ensure that our actions were informed by history, but also by values, ethics, and newly formed goals in a scary new landscape. With so much knowledge of political decision-making under his belt, Rumsfeld would have been an ideal person to help us ask: Will we feel better about 9/11 after one, two, three…or thirteen years of war? Will we feel avenged by the death of more Americans, and foreign innocents? With the years now passed and the death toll so high, the answer is definitively no. We will not.

As I watched Rumsfeld lay bare his methods of decision-making and politicking, I thought of the complexity of holding high office. Bush and Cheney knew what they were doing when they brought “Rummy” into the administration in 2000. In hindsight, the obvious reason to choose Rumsfeld out of a pool of highly qualified candidates was his willingness to serve the country’s leadership, to voice his opinions in his area of expertise, when requested, and then make decisions and follow directives without looking beyond the well-defined boundaries of his domain. He was not in charge of intelligence gathering, as he points out. Intelligence combined with Rumsfeld’s suggested policy of “ridding the world of terrorism by going after states that harbor terrorists” formed the case for invading Iraq. Yet, Rumsfeld admits to Morris that he heard of the decision to go to war with Iraq from the Vice President in front of the Saudi ambassador. Rumsfeld wasn’t exactly in the driver’s seat.

In 1983, Rumsfeld toured the Middle East as Special Envoy and sent “cables” back to Washington, including the now-famous “Swamp” memo to Secretary of State George Shultz. Morris asks Rumsfeld to read it aloud for the camera. In the film, Rumsfeld’s words are heard over images, presented as one continuous paragraph. In fact, it is a series of excerpts. On my first viewing I mistakenly thought this was a reading of the entire memo, but when I searched for the document I found it was much longer and more involved than its presentation. The following is a transcription from the film. I’ve added ellipses to show where the memo breaks:

Rumsfeld (voiceover): I suspect we ought to lighten our hand in the Middle East. … We should move the framework away from the current situation where everyone is telling us everything is our fault and is angry with us to a basis where they are seeking our help. In the future we should never use U.S. troops as a peacekeeping force. … We’re too big a target. Let the Fijians or New Zealanders do that. And keep reminding ourselves that it is easier to get into something than it is to get out of it. … I promise you will never hear out of my mouth the phrase “The U.S. seeks a just and lasting peace in the Middle East.” There is little that is just and the only things I’ve seen that are lasting are conflict, blackmail and killing.

Oddly, Morris omits the final two words of the memo after “killing.” On the page, Rumsfeld finishes “ — not peace.” Peace is on Rumsfeld’s mind, or it is at least part of the government’s agenda. He does not see it as viable in the Middle East, and the memo lays out his sense that America shouldn’t participate there without an invitation, and only in a limited capacity. The entire 8-page memo provides incredible insight into the region at that time and the key players.

Given his strongly worded assessment, it seems unlikely Rumsfeld would have waged a full-blown war there, even 20 years later, if left to his own devices in the aftermath of 9/11. Military action would undoubtedly have been part of any president’s response – Special Forces operations to find Bin Laden would have been on every agenda post-9/11 – but there’s no line drawn, A to B, that indicates a full-scale invasion of two Middle Eastern countries would have been at the top of Rumsfeld’s list of priorities. I was left with the leaden feeling that if Rumsfeld had been Reagan’s chosen running mate three decades ago, rather than G.H.W. Bush, the country might not have gone to Iraq in the first place with President Rumsfeld at the helm. While Americans may not like the ambiguity, it’s worth remembering that it’s possible to be a warmonger without waging an actual war.


Known #4 – American Values Have Shifted

This final card is either the Joker or the ace in our deck. Values are abstract and thus difficult to define, but the discussion is crucially important because it illuminates the context for our decisions. A ramping up of self-centric thinking over the last several decades has led to a pronounced shift in our concept of civic duty. Personal power captures our imaginations more routinely than national achievement. Steve Jobs is a god, Warren Buffett is a guru, and Beyoncé is America’s “Queen Bey.” America openly worships successful individuals, which is surprising behavior from a country whose defining political victory was independence from a monarchy. Yet in 2014, the public seems eager to bow to a handful of individuals without considering the toxic system that makes those lucky few excessively rich or powerful.

One increasingly common fast track to notoriety comes through social media attacks on “the establishment.” This phenomenon and its accompanying philosophy are evident in the example of 23-year-old Twitter activist Suey Park, who garnered national headlines earlier this year with a call to cancel The Colbert Report based on a tweet that offended her. Despite her 23,000 followers at the time, she described her social justice activism in the New Yorker as follows:

“There’s no reason for me to act reasonable, because I won’t be taken seriously anyway,” she said. “So I might as well perform crazy to point out exactly what’s expected from me.”

The implications of this statement are that the system is too powerful to be dealt with rationally, that an audience of 23,000 people is not substantial enough to warrant personal accountability, and that mirroring the irrationality of the system is preferable to joining it with an intention to make it better. Park is not alone in her reasoning. Online discourse is full of marginalized citizens expressing anger and helplessness in the face of perceived injustices. It’s not a stretch to expect that this notion of personal power will lead to a future society replete with irrational actors.

To balance this bleak picture, the example of NSA leaker Edward Snowden comes to mind. Snowden worked within the system and took it upon himself to countermand the National Security Agency’s entire playbook. He maintains that trying to change the intelligence-gathering machine via proper channels would have failed. An examination of the fallout from Snowden’s intel-bomb proves him right. While the media responded with a full-throated cry to label Snowden a “hero” or a “traitor,” the real questions are dead in the water. Who among us is glad to know what our elected officials have been up to and what, if anything, needs to change now? Are we still okay with the Patriot Act, a set of legal procedures that we knowingly permitted our elected officials to enact more than a decade ago? Does a system of covert government controls that once alleviated our fear in the aftermath of 9/11 still serve us? (Based on the current state of this discussion, I’d wager that another terrorist attack will answer these questions for us before we, as citizens, take substantive action to resolve them.) There is no resolution to this story yet. Snowden might be a hero or a traitor, or both. What is clear is his sacrifice of personal freedom for a political principle, and this makes his choice compelling where online ranting is not.

Running counter to these rogue actors and the “i” generation are Donald Rumsfeld and his contemporaries. Serving was the ideal most uttered by Rumsfeld’s generation, specifically serving one’s country, not serving oneself. Upholding the system and “doing their part” was a common refrain in mid-20th century speeches, even as the baby boomers protested the Vietnam War. Ask not what your country can do for you… Today, “service” and “sacrifice” aren’t words you hear many 20-somethings use in everyday speech, and the reasons for that may be rooted in America’s shifted values. Our culture celebrates the notion of making millions of dollars, ostensibly to avoid sacrifice and service of any kind.

Thousands of engineers, designers, marketers, lawyers, and inventors do, in fact, serve in the shadow of “cool” and “awesome” visionaries like Elon Musk, while our government bureaucracy, military outfits and private sector corporations are lazily referred to as necessary evils without a charismatic figurehead to sell the public on their personified goodness.

How people believe their service is perceived has become an integral part of individual identity.

Thus, feeling good about a decision is now as important as, or more important than, the decision itself. What any leader will tell you is that you often can’t feel good about your decisions because you’re rarely choosing between black-and-white options. You’re paid to make complex decisions that almost always sacrifice one desired outcome for another. Holding out for the perfect, feel-good option results in paralysis. No one feels good about dropping bombs unless they’re blind to the risks. Nothing in The Unknown Known indicates Rumsfeld was blind.

President George W. Bush answers a question about Osama bin Laden during a media opportunity held after meeting with Defense Secretary Donald H. Rumsfeld and the National Security team at the Pentagon on Sept. 16, 2001. DoD photo by Tech. Sgt. Cedric H. Rudisill, U.S. Air Force. (Released)

To that end, Rumsfeld was the ideal figurehead for America’s military in 2001, the perfect pitch person to take the country to war — a guy with decades of experience who loved to spar with the press, whose obvious enjoyment of life, and power, emboldened a bewildered, bereft public to strike back after they’d been hit. Make no mistake, we all had a desire to hit back. It is disingenuous to deny having those feelings in the aftermath of 9/11, and yet in our black-and-white discourse we pigeonhole ourselves into “pacifist” and “warmonger” camps, effectively taking the complexity of crafting an appropriate response to the 9/11 attacks off the table. Rumsfeld was in the Pentagon when an airplane flew into the building, as close as any person could be to the physical attack. He describes the pieces of the airplane strewn across the grass, and footage shows him carrying his injured personnel away from the building. I wondered why Morris didn’t ask Rumsfeld about his personal feelings toward the enemy or whether he desired revenge. Morris clearly agrees with Shakespeare: history is teeming with flawed people in power, driven by emotion, acting on impulse and motivated by greed, so why not invite Rumsfeld to reflect in a personal capacity? Is it because that would make him more relatable? More human?

Still from HBO’s Ghosts of Abu Ghraib

What becomes evident from The Unknown Known is the disconnect between Rumsfeld’s understanding of his role in a new millennium and the exponential increase of personal power over the last several decades that made him singularly responsible in the public’s eyes for guiding the country toward military action in the Middle East. His willingness to talk to the press during the Bush Administration was clearly a function of his enjoyment of interaction, but he did not fully grasp the character of his audience, the general public, nor that of his soldiers, some of whom perpetrated horrific abuses on our prisoners of war and proudly catalogued photographic evidence of said abuse to share with friends. For all of his time spent imagining potentially terrible outcomes, Rumsfeld’s understanding of personal power did not stretch to include rogue actors in his own army. Equally blind, these rogue actors had no understanding of how their actions weakened the United States and, in Rumsfeld’s view, gave power to the terrorists.

Rumsfeld: “I testified before the House, testified before the Senate, tried to figure out how everything happened. When a ship runs aground, the captain is generally relieved.”

[cut]

Rumsfeld: “You don’t relieve your presence?” [sic; does he mean presidents?]

[cut]

Rumsfeld: “And I couldn’t find anyone that I thought it would be fair and responsible to pin the tail on, so I sat down and wrote a second letter of resignation. I still believe to this day that I was correct and it would have been better, better for the administration, and the Department of Defense, and better for me, if the Department could have started fresh with someone in the leadership position.”

Morris: “So you wish it had been accepted.”

Rumsfeld: “Yes.”

The Abu Ghraib scandal occurred in 2003. Bush kept Rumsfeld on as Secretary of Defense until 2006. What the American public should then ask is: Why wasn’t his resignation accepted? In light of everything that went wrong with our military action after 9/11, why would Bush keep him on?


CONCLUSION

Even as Rumsfeld gamely debates Morris, compliantly reads his own memos aloud and lays out events with candor, he is portrayed unequivocally as “the bad guy.” Yet, for all of my personal horror at the decisions that were made during Bush’s administration and Rumsfeld’s tenure at DoD, I came away from the film with a clear sense that Rumsfeld was a rational actor who understood the scope and limits of his role and, in fact, did his best to uphold the values of the system. Many Americans did, and do, agree with his actions. While many of us see the system as corrupt, and the Bush Administration as manipulative, it is notable that Bush was reelected in 2004. It wasn’t until the country finally registered a change of heart over the war in the 2006 midterm elections and shifted power to the Democrats that Bush was forced to make a change; Rumsfeld was out.

It turns out Rumsfeld’s untold crime might be that he sticks to what he knows. He doesn’t read legal briefs (“I’m not a lawyer”) and he’s not a detective or a policeman. He recounts the day he took over the Chief of Staff office in the Ford Administration and discovered a locked safe in one of the cupboards. It had belonged to Nixon’s Chief of Staff, H.R. Haldeman, and remained unopened in the office through Alexander Haig’s short tenure. Rumsfeld asked his then-assistant Dick Cheney to dispense with it through a proper chain of evidence without opening it. Investigating crimes and bringing people to justice were not his areas of expertise. In another era, this blinders-on approach to work was respected. Today, anyone with access to the internet and a search engine is a doctor, lawyer, psychologist, or war strategist for a day, and Rumsfeld’s dispatching of the safe signals a refusal to get his hands dirty. However, Rumsfeld is of an earlier generation. He followed protocol and got on with his job.

The unspoken aspects of Rumsfeld’s interview are surprisingly easy to miss. There is so much information and history presented, and so much energy in the back and forth between subject and interviewer, that it only came to me later how terribly sad it must be for someone as ambitious as Rumsfeld to end his career the way he did. The personal failure he expresses in the film is dwarfed by the magnitude of torture memos, detainee abuses, and evidence of his ineffectiveness in controlling the military. That Rumsfeld doesn’t mention this sadness, or elicit sympathy for his personal losses, is characteristic of the stoicism of his generation. It’s easy to loathe someone who represents failure, manipulation, abuse of power, and death, but I had a strong feeling after watching this film that he was being dehumanized as penance for our mistakes as well as for his. It troubled me that Morris didn’t pave the way for Rumsfeld to be human on camera. Someone has to go first. Because I inherently relate to Morris’s anger over the war, I hold him to a higher moral standard. He represents me in this film, and I wanted him to offer détente.

Morris’s final question to Rumsfeld is “Why are you doing this? Why are you talking to me?” Rumsfeld responds that it’s a vicious question, and in light of his compliance with Morris’s format, it is exactly that: a vicious question. Whether you agree or disagree with Rumsfeld’s decisions, he’s a public figure with political aspirations who left a paper trail millions of miles long and willingly explains its contents. He admits to fretting over complex choices that had to be made but didn’t permit self-doubt to paralyze him. The opposite complaint is lodged against the current administration. There is no winning this either/or war. Careful consideration is a habit we hope for in our leaders because it’s the appropriate way to deal with complexity. Morris ends up looking like he would prefer Rumsfeld were more dogmatic than thoughtful. It’s a trap of the quagmire Morris waded into. The quagmire of quagmires.

What Rumsfeld likely knows, and the American public can’t stomach, is that he did the job he was appointed to do and therefore any apology would ring hollow. Americans elected the wrong president to lead them in the aftermath of a brutal attack. George W. Bush only needed the public’s grief to justify a war, nothing else. Morris supplies evidence, unsolicited, to support the notion that Rumsfeld wouldn’t necessarily have taken us to full-blown war of his own volition: the Swamp memo from all of those years ago. This is potentially as transparent as politics gets. The American public sought revenge after 9/11. Rumsfeld served us.

It’s difficult to say “Rumsfeld isn’t evil” when you look at the photographs from Abu Ghraib. Taking away the “evil” moniker will make some feel that the abuse and deaths he caused, through orders, through policy, and through mistakes, are less honored or less properly remembered. Personally, I think the opposite is true. Mischaracterizing a leader so that we feel better about blaming him is exactly the mistake we made to get us into the Iraq war in the first place. We expanded upon Hussein’s “evilness” to justify our actions. By making that mistake again in analyzing our own leaders, we’re dishonoring the scarred and the dead.

Either Americans were deceived by members of the Bush Administration (which I think is true), or the government was relying on intelligence that turned out to be false (as Rumsfeld, Colin Powell, and others have averred, which I also think is true). However, Americans’ willful self-deception is the greatest crime of all. That Rumsfeld refuses the role of bad guy or hero in his narrative is what seems to confound Americans most. They want an apologetic villain or a delusional king but they have no apparent capacity to see Rumsfeld for what he is: a decision-maker whose rise to power meant that his failures were amplified and spectacular. This is not absolution, it is a statement of events. The question we should ask ourselves now is: Can Donald Rumsfeld and Errol Morris share a national identity? The answer has far-reaching implications for the future of the country.

Photo credit: Zachary Roberts for the Village Voice

The country is in the crosswalk arguing passionately over the correct place to stand, in the black or in the white. Sidewalks are gray, and frankly that’s where we should be doing our arguing if we want to avoid another tragedy. Americans would best use their time, then, by moving past the desire to elicit apologies from their would-be villains. Placing blame and bestowing forgiveness won’t ameliorate the gnawing guilt over so many fallen and wounded soldiers and civilians. Instead, people should take a hard look at where they might participate in remedying or, better yet, rebuilding a broken system. To do that effectively, people with conflicting views will have to listen to each other. Morris and Rumsfeld begin that process in The Unknown Known by sitting down to talk.