On Scottish “Independence”

First Minister of Scotland Alex Salmond, in preparation for next September’s independence referendum, released a 670-page white paper last month outlining the course an independent Scotland would take on the world stage. It’s a bold, perhaps overly optimistic document that envisions easy continuity of relations with the European Union, NATO, and the United Kingdom. The Scottish government pitches independence as the nation’s chance to set a different path from its southern neighbor:

Independence means that the decisions about Scotland that are currently taken by governments at Westminster – often by governments that have been rejected by the majority of people in Scotland – will be taken here instead.

This is true, in a way. But this claim sidesteps how much influence over economic, foreign, and defense policy will still be wielded by non-Scottish actors.

Let’s assume Salmond and the SNP get everything they want. By sharing the pound sterling with Westminster, Scotland would be wedding much of its economic policy to that of a foreign government. (Gordon Brown once called the SNP plan “self-imposed colonialism.”) Like that of every other member state, Scottish foreign policy would be heavily shaped by the country’s role in the European Union. Scottish businesses and industries would still be fully subject to all manner of EU economic and trade regulations. Holyrood would still abide by the rulings of the European Court of Human Rights. Scottish defense policy would be determined largely by Scotland’s role in NATO. As the head of a newly independent government, Salmond may discover that he has simply swapped Westminster for Brussels and Washington. This can hardly be called independence.

A better solution would be to structurally overhaul the United Kingdom itself. This idea is not new. One hundred years ago, British political discourse was dominated by the question of Irish Home Rule: should Ireland have its own parliament within the United Kingdom? Proponents hoped that a sub-national parliament in Ireland would bridge the gulf between nationalists and unionists. Eventually, Parliament passed the Third Home Rule Bill, only to see it delayed by the start of World War I and eventually scrapped after Irish nationalists rose up in 1916. Modern Scottish independence activists, conscious of the historical symbolism, scheduled the referendum for September 18th, 2014 — one hundred years to the day after the Third Home Rule Bill became law.

Irish Home Rule sought to address systemic flaws in British governance, but others saw the logic behind local parliaments and suggested still more extensive reforms. In 1912, Winston Churchill, then First Lord of the Admiralty, proposed what he called “Home Rule All Around” to federalize the United Kingdom:

Another great reason for the settlement of the Irish question in the present Parliament and for disposing of the Home Rule controversy now, while we have the full opportunity presented, is that the ground is thereby cleared for the consideration of claims of self-government for other parts of the United Kingdom besides Ireland.

[…] I spoke of the establishment of a federal system in the United Kingdom, in which Scotland, Ireland, and Wales, and, if necessary, parts of England, could have separate legislative and Parliamentary institutions, enabling them to develop, in their own way, their own life according to their own ideas and needs in the same way as the great and prosperous States of the American Union…

Churchill’s proposal envisioned a United Kingdom divided into between ten and twelve parts, each with its own local parliament for local concerns. Ireland, Scotland, and Wales would each gain a parliament; the other seven to nine would be carved from England, perhaps corresponding to the seven ancient Anglo-Saxon kingdoms known as the Heptarchy. Above them all would sit the national parliament in London, which would then be free to focus on national matters.

I am perhaps at an unfortunate age for making a prophecy. I am ceasing to belong to the young men who dream dreams and I have not yet joined the ranks of the old men who see visions; still I will run the risk of prophecy and tell you that the day will most certainly come — many of you will live to see it — when a federal system will be established in these Islands which will give Wales and Scotland the control within proper limits of their own Welsh and Scottish affairs, which will free the Imperial Parliament from the great congestion of business by which it is now pressed, and which will resound and conduce to the contentment and well-being of all our people.

He wasn’t far off. Scotland and Wales gained their own regional legislatures in 1998 under Tony Blair’s Labour government. (Northern Ireland also gained, lost, and regained one over the course of the 20th century; its current Assembly was established by the Good Friday Agreement.) But England, in whole or in part, still lacks a legislature exclusively dedicated to English issues. Those issues are instead debated in the British Parliament, where Scottish, Welsh, and Northern Irish MPs can debate and vote on them.

The solution to the English question is as simple as it is obvious: a devolved English parliament, separate from the Parliament of the United Kingdom and coequal to its Scottish, Welsh, and Northern Irish counterparts. More broadly, such a parliament would reshape the debate over Scottish independence and British sovereignty. No longer would devolution be seen as a stop-gap measure to appease separatists, but rather as a genuine and more equitable means of governance.

An independent Scotland would still be inextricably linked to and shaped by the United Kingdom, the European Union, and other forces beyond its control. In many ways, the Union will still exist whether Scotland wants it or not. Why not work instead to make sure it’s a good one?


Happy Einstein Day!

We shape our heroes, and then they shape us. That’s why we have a Martin Luther King, Jr. Day, after all. We want to instill a reverence for racial equality and peaceful, nonviolent protest in both ourselves and in the generations that follow us. It’s a worthy, universal aspiration to create national heroes to venerate and draw inspiration from.

Which is why it’s utterly absurd that we have a day devoted to Italian explorer and indigenous mass murderer Christopher Columbus.

Schoolchildren are taught that Columbus braved the violent North Atlantic seas and heroically landed in the New World, opening up the vast lands for European colonization. From that was born the United States of America, the greatest country in human history and the last best hope for democracy on Earth. Worth celebrating, right?

But teachers usually leave out the darker aspects of Columbus’ story. As colonial governor of Hispaniola, Columbus pursued policies that killed thousands of indigenous people and enslaved many more with impunity. His actions set the precedent for the brutal Spanish colonial rule that would guide European colonization of the New World for generations to come — with implications that would last well into the present day.

As if this weren’t enough, Columbus didn’t even really discover America; you can’t “discover” something if millions of people already live there. Nor was he the first European to set foot in the New World! That honor goes to Scandinavian explorer Leif Eriksson, who helped settle a Viking colony in the New World in the 11th century. It’s like celebrating Buzz Aldrin for being the first man on the Moon while simultaneously ignoring Aldrin’s legacy of slaughtering and enslaving the native lunar inhabitants.

It’s clear to me that we need a new mid-October holiday, so I set about scanning the historical record. My first impulse was to celebrate John Brown’s raid on Harper’s Ferry, which began on October 16th, but I tabled that one after realizing his birthday would be preferable (which, unfortunately, isn’t in October). Fortunately, a far better option soon presented itself, one that preserves a celebration of immigration without glorifying the horrors of colonial oppression.

Picture it: Princeton, New Jersey. The date: October 17th, 1933. As Adolf Hitler takes power in Nazi Germany, a man flees the hatred and persecution now coursing through the Old World’s veins and takes refuge in the New. His generation is replete with brilliant scientific minds, but in both scientific discovery and raw intellect, he already towers as far above them as they tower above a kindergartner. In six years, he would write a letter alerting Franklin D. Roosevelt to the Nazi atomic bomb program; in eleven years, he would call for the abolition of nuclear weapons and become a symbol of peaceful scientific progress for the benefit of all humanity.

He is Albert Einstein, the greatest scientist of our age.  He changed our understanding of the universe just by thinking about it. He lent his global fame to the fight for African-American civil rights in the age of segregation and lynching. He advocated for world peace as a tangible goal instead of just an abstract ideal. Einstein symbolizes everything America could be and should be and must be: a refuge for the oppressed and the exiled, a beacon for freedom and justice, and a vibrant center of human and scientific achievement.

So happy Albert Einstein Day, everyone! Thank you to all the immigrants who have made America great!

Twelve Years

This is an unusual 9/11 anniversary for me. I turned twenty-four a few weeks ago, so this twelfth anniversary means that tomorrow I’ll have lived more of my life in the post-9/11 world than in the pre-9/11 one.

I had lived on the West Coast my entire life until recently, so I didn’t see the World Trade Center get hit live. My mom came in and shook me awake as soon as she’d woken up herself and told me, “America is under attack.” I couldn’t even process it until I got downstairs and saw them replaying footage of smoke and flames. By the time the sun rose in Nevada, both towers had fallen, the Pentagon was on fire, and thousands of my people were dead. I went to sleep in one world and woke up in another.

I remember the entire school talking about it, despite knowing nothing about it. (I was in seventh grade at the time.) I remember the principal using the school announcements to tell us we were safe and had nothing to worry about. I remember kids chattering excitedly at recess to hide their fear. We declared triumphantly that we were going to war with whoever did it. We had no sense of the gravity of what that meant. Having known nothing else, our faith in the swift, unyielding power of American hegemony was still absolute.

I remember getting home and finding out my birthday present from my aunt had finally arrived: a LEGO set whose details I can’t remember. My birthday’s in late August, but she lives in Hawaii, so it always took longer to get there. I built it while George W. Bush addressed the nation from the Oval Office. He spoke of “huge structures collapsing” and how the attacks “have filled us with disbelief, terrible sadness, and a quiet, unyielding anger.” I don’t think, even today, I ever felt anger about the attacks. It remained too abstract. The disbelief and terrible sadness, however, were endless.

That’s what fills my memories of the aftermath too. I remember the numbness most clearly: putting on a brave face when everyone else is around, then losing it the moment you’re alone. The only other time I’ve felt that was after Newtown. And I was 3,000 miles away from everything. I didn’t even know anyone who lived on the East Coast back then. What right had I to grieve?

But most of all, I remember the changes. I first flew again a month later. Reno’s airport was small then and the security increase was barely noticeable, but a connecting flight took us through Los Angeles. There, the main concourse still had men in body armor armed with automatic rifles standing every fifty paces. The absurdity of it all — did we expect an entire battalion to attack the terminal in broad daylight in the middle of California? — struck me as much in the seventh grade as it does now that I’m twenty-four. Soon it became commonplace.

I wish I hadn’t woken up that morning. Every anniversary, I hope that I’ll open my eyes in my old childhood bedroom, that I’ll still be twelve years old, and that the last twelve years will all have been a dream. But I know better. It’s a nightmare from which the whole world is still trying to wake.

Tragedy and Memory

An article in The Nation today by Robert Scheer made the astounding claim that “August 6 marks 68 years since the United States committed what is arguably the single gravest act of terrorism that the world has ever known.” The act, of course, is the atomic bombing of Hiroshima and, three days later, of Nagasaki. To Scheer’s credit, he included the modifier “arguably.” Yet even that hedge seems insufficient given the magnitude of the statement.

From the start, even the definition of terrorism is problematic. The scholar Bruce Hoffman traces the term’s origins to the French Revolution. Since then, it has described Osama bin Laden and al-Qaeda, Nathan Bedford Forrest and the Ku Klux Klan, Gerry Adams and the Provisional Irish Republican Army, and countless other armed groups. Not all those labeled as terrorists fit the popular conception: for leading the armed wing of the African National Congress against apartheid South Africa, Nelson Mandela spent 27 years in prison. The State Department only lifted his terrorist designation after the revered statesman had won the Nobel Peace Prize and won South Africa’s first multiracial election. “Terrorism means the deliberate targeting of innocent civilians, and targeted [Hiroshima and Nagasaki] were,” Scheer states. If only it were that simple.

Despite the thousands of man-hours put into the subject by political scientists and legal scholars around the world, there is no universally agreed-upon definition of terrorism. Unlike other forms of political violence such as war and rebellion, what constitutes terrorism is not a fixed constant but an emotionally charged, subjective, and imprecise category at best. Almost all scholars, however, agree on one characteristic: it is violence carried out by non-state actors against civilian populations. While tens of thousands of Japanese civilians died in the atomic bombings, they died in the course of a declared armed conflict between the United States of America and the Empire of Japan. The laws of war are the more appropriate framework for examining them. Neither an international tribunal nor an American court has addressed the atomic bombings in this light, although a post-war Japanese court did rule them to be war crimes. War crimes and terrorism, however, are two different things.

Scheer’s usage also reflects a fundamental shift in Western political thought, in which “terrorism” is increasingly applied to all manner of violent actions that do not strictly (or even loosely) meet the political science definition. Culturally, the term no longer simply means armed violence by non-state actors against civilian populations for political reasons, or any other semantic permutation. Terrorism has become a super-crime, elevated beyond the mere misdemeanors and felonies composing it into an existential societal burden. Only into this darkest of categories can the atomic bombings of Hiroshima and Nagasaki fall, according to Scheer, and we must all bear some collective guilt for them. “As a nation,” Scheer writes, wagging a finger at an audience overwhelmingly born after August 6, 1945, “we excel at obliterating reminders of our own failings.”

But context also matters. As U.S. forces drew closer to the Japanese archipelago, the enemy garrisons on Tarawa, Peleliu, Luzon, and Iwo Jima demonstrated only increasing resilience. Roughly 100,000 Japanese soldiers dug into the mountainsides of Okinawa, the last stronghold before the Home Islands, and traded their lives for some 60,000 American casualties. American war planners extrapolated from those losses when estimating the human cost of Operation Downfall, the codename for the planned Allied invasion of the Japanese Home Islands. Had it been executed, it would have been the largest amphibious military operation of all time, dwarfing even the Normandy landings in size, scope, and scale. Some 900,000 Japanese soldiers stood ready to defend their home, along with millions of civilians conscripted into the reserves and often armed with nothing more than farm implements.

The war planners’ estimates varied from branch to branch, but few foresaw fewer than 100,000 Allied fatalities and a quarter-million Allied casualties in the first stage alone. Estimates that factored in the mass mobilization of the Japanese people as guerrillas and the widespread use of airplanes, boats, and midget submarines as kamikazes (at least 10,000 planes had been prepared) projected dramatically higher casualties for both the Allies and Japan. The latter was expected to suffer almost unconscionable losses: civilian casualty figures of five to ten million were not unexpected, even in the most conservative projections.

(None of this was idle speculation, either. In 1945 the War Department manufactured 500,000 Purple Heart medals in anticipation of the vast casualties Downfall would bring. That stock has yet to be depleted today, even after every battle and every war the United States has fought since 1945.)

With the country’s industrial base and population already devastated by relentless Allied air raids — more Japanese citizens died in the March 10, 1945 firebombing of Tokyo than at either Hiroshima or Nagasaki — Downfall presented a truly existential threat to the Japanese nation. With so many millions of lives in the balance, American and Japanese alike, we can see the calculus that led Harry Truman to authorize the atomic bombings, even if we disagree with it.

None of this precludes the idea that there are dark, shameful chapters in American history. White settlers and soldiers presided over the forced relocation of Native American tribes, warring with those who resisted. A slaver aristocracy in the South plunged the United States into civil war because the nation had elected a president who thought the enslavement of four million black men, women, and children was immoral. More than a hundred thousand Japanese-Americans were interned during World War II by order of Franklin D. Roosevelt, forsaken by Congress and even the courts of law. The United States of America was founded on great ideals, but its people and leaders have frequently failed to live up to them.

But to single out Hiroshima above all others, to point at it and say, “Yes, this is the worst that humanity has ever done,” seems hollow. What does such a sweeping statement say about the other blood-soaked chapters of World War II? Shall we compare the tens of thousands who died at Hiroshima to the quarter-million who died in the Rape of Nanking, where Japanese soldiers raped, tortured, and murdered Chinese civilians for six weeks, or to the tens of thousands who perished in the flames of Dresden? Shall we then stack those corpses against those from Germany’s forced starvation of millions of Soviet citizens on the Eastern Front, or against the Wehrmacht’s horrific years-long siege of Leningrad and the carnage at Stalingrad? Must we rank atrocities and tragedies like some sinister Olympics, duly awarding medals of shame to those whose nations have most thoroughly and efficiently brutalized their fellow human beings?

Historians will never cease debating the atom bomb’s role in ending the worst war humanity has ever fought, nor should they. Future generations may find Truman’s decision justified and necessary, unwarranted and unforgivable, or perhaps something more complex than either. We can only hope that they learn from the horrors their forefathers faced. May they never take for granted the luxury that allows them to set one tragedy above another for transitory argumentative gain.

Housekeeping

I haven’t posted much on this blog over the past month. Fortunately, it was for a good reason. Four weeks ago, I applied for a summer fellowship at BuzzFeed and was asked to contribute some content on my own for their evaluation. My posts ranged from subjects close to me, like Star Trek, Game of Thrones, analyses of North Korean military forces, and awful Twitter accounts, to topics with a broader appeal like sad koalas and cinemagraphs.

I’ve been looking for work ever since I returned from studying abroad in France last August and so far it’s been formulaic. I wake up in the morning, look at relevant job postings online, and then figure out whether I’d have a decent chance at each one. Unpaid internships are out — moving to the East Coast is one thing; doing it without an income is another — but any paid position for which I meet the stated qualifications and at which I’d be good is fair game. Some days I don’t find anything. When I do, I hammer out a fresh cover letter, rearrange the bullet points on my resume as needed, then email it to whatever address the posting requires.

Then there’s just silence. No job application I’ve sent since August has elicited a phone call, e-mail, or other response of any kind.

Aside from the occasional question or passing remark, I’ve avoided writing or tweeting at length about being unemployed. I didn’t want to complain or commiserate when I’m just one among millions, many of whom are in worse situations than mine. But mostly I’ve avoided the subject out of shame. I follow and interact with some outstanding people on Twitter and through this blog. You’re probably one of them. For a kid from nowhere, living in the forested mountains of northern Nevada and dreaming of moving to a major city like New York or Washington, those interactions are a lifeline I’ve clutched tightly over the past ten months. The last thing I’d ever want is to give you a bad impression of myself or to let you infer anything from my inability to find a job.

So, throughout the past month, I avoided telling anyone aside from a close friend or two that my BuzzFeed posts were part of a job application process. There were other reasons as well. A few BuzzFeed employees follow me on Twitter and I wanted neither to make things awkward nor to appear like I was pressuring them by campaigning for myself. I also didn’t want people to redistribute my posts solely out of support for me personally or to campaign for me. (To my surprise and deep gratitude, a few people on Twitter did so spontaneously.) Only when a person or two saw my links and thought I had begun working there — I don’t think many people know how easily you can sign up for an account and start posting things — did I correct them to avoid misrepresenting myself. I wanted to prove to BuzzFeed that I could make my work go viral on its own merits and that I could be a valuable member of their team. And after almost a year in this jobless wilderness, I also wanted to prove it to myself.

Unemployment, with its crushingly repetitive cycle of applying for jobs and hearing nothing back, takes a toll. After a few weeks, your optimism fades, replaced by disappointment. Months later, you feel the sting of self-doubt and a creeping sense of shame. And as autumn turns to winter and spring turns to another summer without success, you lose faith in yourself almost completely. Persistent joblessness is as much a mental and emotional paralytic as it is an economic condition.

BuzzFeed removed the fellowship application from its site on May 1. Since I’ve seen indications that others have already learned they were accepted for the fellowship, it looks like I wasn’t. I’d be lying if I said I wasn’t saddened by this. It’s a great fellowship at an emerging media superpower and an incomparable opportunity to be part of building something special. For me personally, it was also a chance to move east, away from the cold mountains of Nevada and into the greatest of American cities. I could, at long last, get my life started.

This experience was different, though. By giving applicants the chance to demonstrate their skills, BuzzFeed allowed me to elevate myself above my origins. With every post I made, I became less like a faceless resume or a few paragraphs on LinkedIn and more like a human being with depth and substance. So I worked even harder. I taught myself new skills, like how to make GIFs and curate dozens of images. I spent the past four weeks searching for and creating content that people would actually (hopefully) want to read. Since BuzzFeed distributes user-submitted content like mine, almost two hundred thousand people actually did. That’s never happened to me before. I may have fallen short, but the chance to prove myself was a thoroughly edifying experience. And I’m thankful for it.

Joel Stein would likely dismiss my desire for more than silence from potential employers as self-absorption, symbolic of all that’s wrong with my generation. TIME even sweepingly proclaimed on the cover for Stein’s article that millennials like me “are lazy, entitled narcissists who still live with their parents.” The claim’s breadth makes its absurdity clear. When I realized I wouldn’t be working for BuzzFeed this summer, I didn’t feel the anger that springs from undeserved entitlement. Nor was my reaction entirely selfless: I’m simply saddened to have missed out on a great opportunity.

I can’t move to the East Coast just yet. But what I can do is move past this temporary setback, continue blogging and tweeting, and hope that all my future job applications receive what BuzzFeed gave me and what millennials rightfully deserve: a fair chance.

I’ve Seen Your Boobs!

Where to begin with Seth MacFarlane’s Oscars performance last night? Do I start by condemning the part where he made a joke sexualizing 9-year-old actress Quvenzhané Wallis to her face in front of her family? Maybe the part where he applauded women for getting the flu to lose weight? “Looking good,” he said, with thorough creepiness. There’s always his joke about how the Jews run Hollywood — long-standing anti-Semitic slurs are good for a laugh, after all.

I could start with the dumbest bit of all: “We Saw Your Boobs.” As if dehumanizing an audience of women on one of the most important nights of their careers weren’t enough, MacFarlane didn’t care about context, either; Jodie Foster’s topless scene in The Accused was a rape scene, after all. Objectification of women is nothing new in our society, so he also brought in a gay men’s chorus to spice things up. Did you know gay men aren’t sexually aroused by a woman’s breasts? That’s comedy, right?

Of course, if you didn’t find this funny, you don’t “get” comedy. Look at Rob Delaney’s feed, after all! He offends people all the time! Surely his well-thought-out, socially-conscious tweets are the same thing as MacFarlane’s juvenile dehumanizations, right? Why can’t you take a joke? Why can’t you laugh when the humor isn’t inherent, but rather derived from forcing you to be uncomfortable and upset? What’s wrong with you?

Most obnoxiously, there are those who will undoubtedly defend MacFarlane’s performance as free speech, which it was. Misogynists, homophobes, and racists often seek refuge beneath our most sacred right when condemned, mistakenly equating freedom of speech with freedom from criticism. MacFarlane is of course free to say whatever he likes and offend whomever he wants, and I am free not to watch him do it. The Academy, for its part, is also free: to hire someone else next time.