Why I Choose to Live in Wayne National Forest

TO THE POINT

Our current system is like an abandoned parking lot. Asphalt was laid, killing life and turning everything into a homogenous blackness, a dead sameness. The levers of maintaining this have broken down. No one is coming to touch up the asphalt. In abandoned parking lots, cracks form and life grows from the cracks.

All these riots, environmental catastrophes, food crises, occupations of land by protestors, and various breakdowns in daily life are cracks in the asphalt. What will spring from the cracks depends on what seed is planted within them. Beautiful flowers could grow. Weeds could grow.

Modern rich nations have walled themselves in. Colonized India was a world apart from Britain. The United States exists an ocean away from the places it drone strikes. Citizenship acts as a tool of ethnic cleansing. The world, according to the new nationalists, will be a checkerboard of racially homogenous governments with swords continuously drawn. The rich nations will now literally wall themselves in, ensure their “racial purity”, and steal from the poorer nations until the end of days. At least, this is the future envisioned by the Trump/Bannon regime. This is the future governments everywhere seem to be carrying us toward, a divided people screaming in joy or anger.

The continued and accelerated fracking of Wayne National Forest, Ohio’s only national forest, fits perfectly into this worldview, this way of governing the cracks. The power of this world, and of the world our rulers wish to realize, depends on fracking wells, oil rigs, pipelines, and energy infrastructure in general. To oppose this infrastructure is to oppose this system, to take the cracks as our starting point.

I am living in Wayne National Forest in hopes of, first and foremost, protecting the forest. I hope to crack the asphalt and plant a flower.

Everyone is welcome to join the occupation, beginning on May 12th. Everyone is welcome to visit. Everyone is welcome to participate, in one way or another, in this land defense project.

EXTENDED

Some conclude the election of Trump signals an end of the left. The opinion seems rushed, and forces could still push for a revitalization, but if it is true, then good riddance.

Those of the left are preoccupied with flaunting ego. Taking up their various labels (communist, socialist, anarchist, Trotskyist) seems more about themselves than any revolutionary project. The labelling urge is bureaucratic. Leftists have done themselves no favors talking like politicians. Their endless meetings bear all the marks of officialdom and red tape. Distant from daily life, they alienate those who truly seek a new world. At most meetings, little more is accomplished than an agreement to keep having meetings. This is a hallmark of bureaucracy.

Rally after rally displays the same dead tactics and strategies. Standing on the sidewalk, holding signs, and chanting slogans at buildings will never bring change. These events only pose a threat when a variety of activity occurs, when people stop listening to the activists. This could be anything from smashing up cop cars to a group of musicians playing spur of the moment.

Supervisors hate the unplanned.

If change is sought, then an understanding of the ruling structure is vital. Understanding the current arrangement requires a grasp of history. History reveals how the present came to be, and that recognition provides the basis for comprehending our current world.

The first known civilization sprang up in modern-day Iraq around 6,000 years ago. This did not occur because humans became smarter or more physically fit. Modern humans evolved physically around 100,000 years ago and mentally around 40,000 years ago. The five main qualities of civilization are: 1. City life 2. Specialized labor 3. Control by a small group of resources beyond what is needed to survive, leading to 4. Class rank and 5. Government. This is still the order we face today.

Before civilization’s ascendancy, humans organized life in various ways. One was the hunter-gatherer band. These were groups of 100 or fewer, usually with no formal leadership and no differences in wealth or status. These groups were mobile, never staying in one spot more than temporarily. Again, it was not due to stupidity that these people did not develop more civilized ways of living. One could argue the hunter-gatherer life promotes a general knowledge while modern society encourages a narrow, yet dense, knowledge.

Agriculture and animal domestication led to farming villages and settled life. With this came the “Trap of Sedentism.” After a few generations of village life people forgot the skills needed to live nomadically and became dependent upon the village. In general, people worked harder and longer to survive while close quarters with each other and animals increased illness. With greater access to food, the population increased.

Chiefdoms were another form of pre-civilized living. These ranked societies placed various clans differently on the pecking order, with everyone governed by a chief. The chief controlled whatever food was produced above what the village needed to survive; the chief controlled the surplus. These societies came the closest to civilized living patterns.

Agriculture’s surplus allowed more people to feast than the hunter-gatherer band could. With more people working the fields and tinkering with technology came innovation, and with innovation a larger surplus. This larger surplus allowed for continued population growth. This cycle proceeded up to the birth of civilization and accelerated after it.

Economists have advertised the story of “barter” for a very long time, perhaps because it is so vital to their domain of study. The narrative is as follows: John owns 3 pairs of boots but needs an axe, and Jane has 2 axes but needs a pair of boots. The two trade with each other to get what they want, each trying to get the upper hand in the trade. The massive problem with this story is that it is false.

Adam Smith, an economist from the late 1700s, popularized this tale and made it the basis of economics. He asserted one would find barter where money did not exist, in all cases, and pointed to aboriginal Americans as an example. When Europeans came to conquer the continent, they did not find a land of barter where money was nonexistent.

Barter took place between strangers and enemies. Within the village, one found different forms of distribution. One place may have had a central hub that people added to and took from. Another may have had free gifts flowing between people. To revisit our John and Jane example, John takes Jane’s axe, and Jane knows that when she needs something of John’s he will let her have free use of it. What we never find happening is barter.

This is important because the barter folktale convinces people our present system is a reasonable development. If humanity’s natural propensity is to barter, then money and profitable exchange seem like an evident progression. This is not to say that barter is “unnatural”, as it came from the heads and relationships of people, but that it is not the only game in town. If it is not the only game in town, and there are a multitude of ways humanity could and has organized itself, then the current system can’t be justified as the necessary development of human nature.

So, for most of human history impersonal government power did not exist. Communities were self-sufficient and relationships were equal and local. The rise of civilization and government changed this. Dependency and inequality marked associations and the few held power over the many.

Surplus food put some in a position where they did not need to work for their survival. While most still obtained resources from the earth and survived on their labor, a few extracted supplies from the many. This small group became the wealthy ruling class and controlled the allocation of production excess. The basic relationship here is parasitic.

The smart parasite practices restrained predation: it does not use up all of the host’s energy, maintaining the host’s life in order to continue its own survival. The smartest parasite defends its host. Rulers learned to protect the workers for this reason, and in the process increased these laborers’ dependence on them. The increasing population developed into cities, and problems of coordination arose with more people living in a single space. The ruling parasites organized social life to maintain their control of the surplus and, at the same time, rationalized the city to solve problems of communication and coordination.

State power emerged from large scale infrastructural projects as well, specifically irrigation. Irrigation is a way of diverting water from the source to fields. Large scale irrigation endeavors required thousands of people and careful utilization of raw materials. Undertaking such a plan required a small group with the technical know-how to control what labor was done, how and when it was done, how much material was needed, when and how it was used, and utilize these same networks of influence for future repairs. Large infrastructure and complex city life increased the dependency of producers on rulers.

The city is the basis of civilization. The city, simply defined, is land where too many people exist for it to be self-sufficient. It requires continuous resource importation to keep alive a large population that cannot live off the soil. This impersonal power, whose structures don’t change, is based on mindless expansion outside of the city in search of resources. War, of course, is the most efficient way to grab these resources when one city’s importation search runs head on with another’s, or with people who live in the way of what is sought. Conquering existed before civilization yet became perfected within its system.

Emperors emerged to rule the masses, gaining prestige from war prowess. Forming empires, these leaders ushered in a new form of rule through large territories gained in conquest. Peasants who controlled their own land and were not controlled by feudal lords came into contact with government only once a year. Politics was centralized in the palace. Ruling families might change, but this did not affect peasant lives. Without modern surveillance technologies and police institutions it was virtually impossible to continuously govern every piece of land. Peasants organized their villages on their own. The only time they saw their government was when the army was sent to collect taxes. This all changed with the rise of the nation-state and mass politics.

Feudalism was based on loyalty to the King and land distribution by the King to obedient lords. Lords, in turn, granted parts of their land to vassals under similar conditions of obedience. Governing authority was decentralized. The King was the ultimate feudal lord, but could only flex on those lords who held land directly from him. The entire system depended on the lords’ willingness to obey or the ability of the King to rally enough troops to crush the disobedient. This system was basically moneyless, relying on rents paid in food and other goods flowing up the feudal pyramid. This changed with increased commercial activity.

Buying and selling began to replace rents, with power beginning to shift to merchants and urban commercial activity in general. This change made it possible for Kings to tax those within their domain in money. Doing so required centralization, which undermined the feudal relation of each lord controlling his own land. Any further development of commercial activity would strengthen the monarchy over the nobility.

Changes in warfare required taxes and the creation of a permanent army. Before, Kings would call upon their lords who would rally their vassals to the King’s will. Feudal armies were small, unreliable, and war was local. With Kings increasing their revenue, they were able to hire foreign mercenaries and pay a small permanent army. If other Kings did not want to be conquered, they conformed or died. With a permanent army came a need to increase taxation for maintenance, further undermining feudalism.

Kingly taxation of the populace established a direct link between the highest governing authority and the lowest on the power chain. This completely undermined the rule of lords and centralized power into national monarchies. The primary concern of these nations was that people consented to taxes.

Another way the nation-state emerged was through city-state infighting. Dictators would rise within the city to calm civil war taking place between the rich and poor. These dictators would conquer more land and become princes. When these carved out territories fell apart, cities and other units would try to conquer each other to fill the power vacuum. Eventually, consolidation would happen and usually with the help of mercenaries. Since mercenaries held it all together, whoever controlled the national treasury had power.

When vast empires fell apart, specifically in the Middle East, there arose smaller governing units. These smaller units were concerned with conquering and so had to develop militaries. To do this they taxed the population and could only do so if the people consented, meaning they had to provide services and other incentives. Politics moved out of the palace to everywhere. The nation-state gave birth to mass-politics.

The nation-state is totalitarian by nature. It must care about what its population is doing. Government presence went from once a year at tax time to being a constant. Laws upon laws developed, strictly regulating the life of the people within the borders of the nation. The daily life of the people was now bound together with the health and viability of the system. Here, we find the international system of nation-states and the world market.

Peasants no longer grew food, ate it, and had a surplus. Now, they sold their food on the market, which the nation-state taxed, in exchange for money and used this money to buy food and pay taxes. Urban centers made goods for a taxable wage and the goods they made could be taxed. Imported goods from other nations could be taxed as well. Truly, all of daily life was absorbed into the system. People’s continued consent and work within new market parameters called forth the totalitarian nature of the nation-state.

Economic development led to restructuring. Small craftsmen went out of business when factory production was able to make, and therefore sell, goods cheaper and faster. These craftsmen found themselves doing unskilled and semiskilled labor on the factory floor. Before, production was individual. Those who produced a good also owned the shop and tools, so it made sense that they got all the money earned. Factory production saw creation become social, with many helping to make the goods, while payment stayed individual, with factory owners who contributed no labor gaining all the profit for simply owning the tools and the building. This is the same parasitic relationship found throughout all of civilization, just with new roles and new ways for the ruling class to live off the labor of the many.

The workers movement developed in response to this, made up of various left ideologies, from Marxian communism to anarchism. Regardless of ideological preference, the idea was the same. The factory was the kernel of the new world. People had been separated from the land and each other through borders, style of work, race, and a number of other things. The factory got all these different types of people together and under the same conditions. The more the factory spread, the more people were united by their similar exploitation. Eventually, they would rise up and usher in a new world based on freedom and equality.

There were problems with this. People were united in their separation. It took the imposition of an ethic by the workers movement, that all these different types of people should identify first and foremost as workers, for collective action to take place. The workers did not all have similar interests. A young white single male has very different concerns from a single black immigrant mother, even when they work in the same factory. Obviously, government leaders and factory owners utilized these differences to their own advantage by privileging some groups over others. The slogan “An Injury to One is an Injury to All” was based more on faith than fact.

The workers movement saw the factory’s mass employment with hope as well. With massive profits, owners would reinvest this money in machines and other tools. Needing people to work the new equipment, they hired. Selling more products, created more efficiently, led to more profits, and the cycle continued. As the factory system expanded, it was believed capitalism was bringing about its own collapse. More people were being united by a common condition, that of the worker, and eventually their false separations would subside. They would see each other as the same, regardless of creed or color, see their true enemy in the factory owners and their government, and revolt.

For this reason, the workers movement advocated the expansion of the factory in a policy called “proletarianization.” When the Bolshevik Communists came to power in Russia, their main concern was to industrialize the nation for this purpose, similar to the rise of Communist governments elsewhere. One could ask the obvious question: Would spreading the factory system and the working-class condition really bring its end? Would spreading the plantation system and the slave condition end slavery, or strengthen it?

If Trump is the end of the left, good riddance.

The conditions that brought about the original workers movement have changed, yet the left seems blind to this or resorts to mental gymnastics. For starters, the current economy is deindustrializing in America and post-industrial worldwide. Even in current industrial powerhouses like China and India, the employment rates and growth of an earlier period are no longer found. For the United States, Europe, and the West in general, there is no real industrial manufacturing base. This type of work only happens in the colonized world or in prison. It’s only sweatshops of various types in different spaces.

In fact, it may even be fallacious to speak of a “colonized” world. The nation-state seems not to matter anymore. A new, global system has developed. Transnational corporations organize social life, almost everywhere, to operate for the creation of value. Every Facebook post made and every online search informs advertisers and helps businesses adapt their products. The spending habits tracked on your debit card help companies know who you are and what kind of products you like. One’s interaction with the current world contributes to value creation. In other words, production has moved from the workplace to all of life, a shift made possible only by modern communication technology and the new post-industrial economy.

The workers now are not the same as those of the past in this country or in countries similarly situated. The left, when admitting that things have changed, will then perform backflips to also claim nothing has changed. The service sector has come to dominate, yet the left holds its orientation to be exactly the same as in the factory era. I was discussing this with a Trotskyist friend who worked a service sector job at a burrito joint. Since workers were still paid a wage, he claimed, the form of capitalist exploitation had not changed.

Taking the example of the burrito joint, the harvesting of the lettuce, tomatoes, and other food items used to make those burritos most likely occurred in an underdeveloped country or was done by migrants or prisoners in this country. Those workers receive wages much lower (usually) than those in the service sector, and their labor is more vital to the economic set-up than that of those performing easily automated service jobs. If they did not harvest the food, my Trotskyist friend would have no lettuce to put on someone’s burrito. Building a burrito is not the same as building a highway, a car, or a skyscraper, or harvesting fields. No kernel of a new world can be seen within this type of work, other than something akin to a psychiatric ward.

So, how will a better world be brought about? I think anyone who believes they know the answer to this question is arrogant and needs to come back down to earth. I certainly do not know the answer. I will provide some thoughts to help answer this question.

Every single revolution has failed. The French Revolution, the American Revolution, the Russian Revolution, the list goes on: all have failed to usher in a world where the few no longer dominate the many. To hang on to these past conceptions of revolution is to condemn the next one to defeat. This means a rethinking of fundamental questions is needed.

What does revolutionary action succeed at doing?

First and foremost, it succeeds at establishing a set of values within a subversive context. Courageousness is a good thing to find in the hearts of people, yet the soldier who goes to fight and die is “courageous.” The last goal of revolutionary action is to get people to join the armed forces. An insurrectionary act affirms notions of justice, courage, honor, right and wrong, freedom, kindness, empathy, etc. that completely negate the selfishness, materialism, and overall toxicity of the dominant values.

This is where anarchists who fetishize violence miss the mark. Simply put, just because we burn everything to the ground does not mean people stop being assholes. This is not to say, however, that these values won’t get affirmed in riots and the like. Who could say those in Ferguson, Baltimore, and many other places were not courageous, with deep notions of justice, right and wrong, and freedom? These values can also be affirmed through wise grandparents going on a hike with their grandson, a teacher who treats her students as equals, a victim who stands up to their bully, a group of musicians playing carefree, a rope swing and a group of good people, graffiti, sharing a smoke, stealing from Walmart, fighting mobilized Nazis, and many other ways.

Revolutionary action does not just happen at a march or political meeting. I’d go so far as to argue it happens at these places less often.

Secondly, it succeeds in taking space to keep these values and energy going. It takes space and organizes the shared life within it in a completely new way. It may even be wrong to describe this as “organized.”

When hegemonic powers fall apart, power REALLY does go back to individual people. Depending on how we relate to each other, flowers or weeds could grow. What seed is planted in the cracks?

How is power disrupted?

From here, we can look to the most interesting struggle to occur in the United States in many years: Standing Rock. For all its problems, the Standing Rock resistance highlighted some important things. Power is found in infrastructure. The construction of the pipeline only strengthens the world of pipelines and oil dependency. These constructions, from oil pipelines to highways to electrical systems to fracking equipment, help keep this world running. Those of us who went to Standing Rock and stayed with a certain group in Sacred Stone saw the banner: “Against the Pipeline and Its World!”

Standing Rock had one camp that sat in the path of the pipeline to block construction, until forcibly removed by the police, and camps across the river. This struggle blocked the construction of a world it did not want to see and built the one it did want right in the space it captured. It had its own food supply, water supply, etc. It had its own logistical system, outside of government and business. It relied on the power of people.

During the Occupy movement, it seemed natural for those in Oakland to block the port. The port brought in commodities to be sold, benefitting the rich and propping up the system. It seemed like common sense for revolutionaries in Egypt to take Tahrir Square, the center of activity, block main roads, stopping people from shopping and working, and burn police stations. In fact, focusing on Tahrir Square misses the blocked roads and burnt police stations all across Egypt.

The reflex seems to be to block the flows of this world and construct new ones, to block one form of life and build many new forms.

Why do revolutions fail?

There is no good answer to this.

One reason revolt fails to materialize (among many) is that activity gets pacified by liberals. This, again, could be seen at Standing Rock, where those who were part of “Spirit Camps” put their bodies between police and “Warrior Camps”, telling them to demobilize, leave the conflict they had initiated, and pray. It can also be seen when liberals unmask covered protesters trying to push things further, or even pepper spray them when they nonviolently damage property.

Following this, one of the most inspiring revolutions of the last 100 years was snuffed out by revolutionaries giving up their power, believing it was strategic. During the Spanish Civil War, workers in Barcelona, Aragon, and other urban and rural places took over the land and factories, abolished the government and money, and armed themselves. They subordinated themselves to Republican government authority in the belief they could win the fight against the fascists by doing this.

What was shown from this is that the Republican government was no more capable of fighting fascists than autonomous armed workers. The workers should have trusted no one but themselves; instead they were repressed into falling in line by both Republican and Communist henchmen. Both of these forces reintroduced market mechanisms and money, government authority, and other ways the few rule over the many. Contrary to their claims, these efforts did not make fighting the war any more efficient, and in some ways, especially the reintroduction of market forces into the food supply, they made things much worse. In the end, the fascists still won.

The problem here is viewing the conflict in purely military terms instead of as a social war. By falling in line with Republican government and military command, those in Barcelona and other places allowed those authorities to organize social life and simply laid the groundwork for its fascist organization. Their self-organization should never have been sacrificed.

When revolutionaries forget their struggle is more than a military confrontation, they become exactly what they are fighting against. They become their enemy. They also miss inspiring movements due to fetishizing combat. We heard the left praise the fight of Kurdish women in Rojava against ISIS, and justifiably so, yet heard nothing about the grassroots councils that have sprung up and continue to survive all across Syria in spite of a horrible civil war. Where the Assad dictatorship’s control collapsed, these councils took on the role of providing electricity, distributing food and water, healing the sick and injured, and whatever else is necessary for life.

I have spoken mainly in generalities, attempting to explain my reasoning adequately without overcomplicating things or boring the reader.

Over 700 acres of the Wayne National Forest have been auctioned off to the oil and gas industry with hydrofracturing intentions. The Wayne is not new to gas and energy exploitation, yet this is a new and intensified maneuver in the war on Ohio’s only national forest. The plan from the Bureau of Land Management is to continue resource extraction until it’s all gone and The Wayne is dead. Some people will make a profit, though…

I will live in Wayne National Forest, in a long-term occupation starting on May 12th, in hopes of changing this tide. While it would be interesting for this to fit into some wider narrative of struggle, and in some ways it naturally does, that is not my main concern. My main concern is stopping the energy industry’s continued attack on the forest.

To anyone who has resonated with what’s been written, who sees this battle as their battle, and who believes they can help, PLEASE GET INVOLVED.

EVERYONE IS WELCOME TO COME.

To read:

– Affirming Gasland by the creators of the documentary Gasland

– 1984 by George Orwell

– The Madman: His Parables and Poems by Kahlil Gibran

– The Great Divorce by C.S. Lewis

– The Worst Mistake in the History of the Human Race by Jared Diamond

– What is Civilization? by John Haywood (found in The Penguin Historical Atlas of Ancient Civilizations)

– Debt by David Graeber

– To Our Friends by The Invisible Committee

To watch:

– Gasland

– Gasland 2

The Strange Persistence of Guilt

Those of us living in the developed countries of the West find ourselves in the tightening grip of a paradox, one whose shape and character have so far largely eluded our understanding. It is the strange persistence of guilt as a psychological force in modern life. If anything, the word persistence understates the matter. Guilt has not merely lingered. It has grown, even metastasized, into an ever more powerful and pervasive element in the life of the contemporary West, even as the rich language formerly used to define it has withered and faded from discourse, and the means of containing its effects, let alone obtaining relief from it, have become ever more elusive.

This paradox has set up a condition in which the phenomenon of rising guilt becomes both a byproduct of and an obstacle to civilizational advance. The stupendous achievements of the West in improving the material conditions of human life and extending the blessings of liberty and dignity to more and more people are in danger of being countervailed and even negated by a growing burden of guilt that poisons our social relations and hinders our efforts to live happy and harmonious lives.

I use the words strange persistence to suggest that the modern drama of guilt has not followed the script that was written for it. Prophets such as Friedrich Nietzsche were confident that once the modern Western world finally threw off the metaphysical straitjacket that had confined the possibilities of all previous generations, the moral reflexes that had accompanied that framework would disappear along with them. With God dead, all would indeed be permitted. Chief among the outmoded reflexes would be the experience of guilt, an obvious vestige of irrational fear promulgated by oppressive, life-denying institutions erected in the name and image of a punitive deity.

Indeed, Nietzsche had argued in On the Genealogy of Morality (1887), a locus classicus for the modern understanding of guilt, that the very idea of God, or of the gods, originated hand-in-hand with the feeling of indebtedness (the German Schuld—“guilt”—being the same as the word for “debt,” Schulden).1 The belief in God or gods arose in primitive societies, Nietzsche speculated, out of dread of the ancestors and a feeling of indebtedness to them. This feeling of indebtedness expanded its hold, in tandem with the expansion of the concept of God, to the point that when the Christian God offered itself as “the maximal god yet achieved,” it also brought about “the greatest feeling of indebtedness on earth.”

But “we have now started in the reverse direction,” Nietzsche exulted. With the “death” of God, meaning God’s general cultural unavailability, we should expect to see a consequent “decline in the consciousness of human debt.” With the cultural triumph of atheism at hand, such a victory could also “release humanity from this whole feeling of being indebted towards its beginnings, its prima causa.” Atheism would mean “a second innocence,” a regaining of Eden with neither God nor Satan there to interfere with and otherwise corrupt the proceedings.2

This is not quite what has happened; nor does there seem to be much likelihood that it will happen, in the near future. Nietzsche’s younger contemporary Sigmund Freud has proven to be the better prophet, having offered a dramatically different analysis that seems to have been more fully borne out. In his book Civilization and Its Discontents (Das Unbehagen in der Kultur), Freud declared the tenacious sense of guilt to be “the most important problem in the development of civilization.” Indeed, he observed, “the price we pay for our advance in civilization is a loss of happiness through the heightening of the sense of guilt.”3

Such guilt was hard to identify and hard to understand, though, since it so frequently dwelled on an unconscious level, and could easily be mistaken for something else. It often appears to us, Freud argued, “as a sort of malaise [Unbehagen], a dissatisfaction,”4 for which people seek other explanations, whether external or internal. Guilt is crafty, a trickster and chameleon, capable of disguising itself, hiding out, changing its size and appearance, even its location, all the while managing to persist and deepen.

This seems to me a very rich and incisive description, and a useful starting place for considering a subject almost entirely neglected by historians: the steadily intensifying (although not always visible) role played by guilt in determining the structure of our lives in the twentieth and twenty-first centuries. By connecting the phenomenon of rising guilt to the phenomenon of civilizational advance, Freud was pointing to an unsuspected but inevitable byproduct of progress itself, a problem that will only become more pronounced in the generations to come.

Demoralizing Guilt

Thanks in part to Freud’s influence, we live in a therapeutic age; nothing illustrates that fact more clearly than the striking ways in which the sources of guilt’s power and the nature of its would-be antidotes have changed for us. Freud sought to relieve in his patients the worst mental burdens and pathologies imposed by their oppressive and hyperactive consciences, which he renamed their superegos, while deliberately refraining from rendering any judgment as to whether the guilty feelings ordained by those punitive superegos had any moral justification. In other words, he sought to release the patient from guilt’s crushing hold by disarming and setting aside guilt’s moral significance, and re-designating it as just another psychological phenomenon, whose proper functioning could be ascertained by its effects on one’s more general well-being. He sought to “demoralize” guilt by treating it as a strictly subjective and emotional matter.

Health was the only remaining criterion for success or failure in therapy, and health was a functional category, not an ontological one. And the nonjudgmental therapeutic worldview whose seeds Freud planted has come into full flower in the mainstream sensibility of modern America, which in turn has profoundly affected the standing and meaning of the most venerable among our moral transactions, and not merely matters of guilt.

Take, for example, the various ways in which “forgiveness” is now understood. Forgiveness is one of the chief antidotes to the forensic stigma of guilt, and as such has long been one of the golden words of our culture, with particularly deep roots in the Christian tradition, in which the capacity for forgiveness is seen as a central attribute of the Deity itself. In the face of our shared human frailty, forgiveness expresses a kind of transcendent and unconditional regard for the humanity of the other, free of any admixture of interest or punitive anger or puffed-up self-righteousness. Yet forgiveness rightly understood can never deny the reality of justice. To forgive, whether one forgives trespasses or debts, means abandoning the just claims we have against others, in the name of the higher ground of love. Forgiveness affirms justice even in the act of suspending it. It is rare because it is so costly.

In the new therapeutic dispensation, however, forgiveness is all about the forgiver, and his or her power and well-being. We have come a long way from Shakespeare’s Portia, who spoke so memorably in The Merchant of Venice about the unstrained “quality of mercy,” which “droppeth as the gentle rain from heaven” and blesses both “him that gives and him that takes.”5 And an even longer way from Christ’s anguished cry from the cross, “Forgive them, for they know not what they do.”6 And perhaps even further yet from the most basic sense of forgiveness, the canceling of a monetary debt or the pardoning of a criminal offense, in either case a very conscious suspension of the entirely rightful demands of justice.

We still claim to think well of forgiveness, but it has in fact very nearly lost its moral weight by having been translated into an act of random kindness whose chief value lies in the sense of personal release it gives us. “Forgiveness,” proclaimed the journalist Gregg Easterbrook writing at Beliefnet, “is good for your health.”7 Like the similar acts of confession or apology, and other transactions in the moral economy of sin and guilt, forgiveness is in danger of being debased into a kind of cheap grace, a waiving of standards entirely, standards without which such transactions have little or no moral significance. Forgiveness only makes sense in the presence of a robust conception of justice. Without that, it is in real danger of being reduced to something passive and automatic and flimsy—a sanctimonious way of saying that nothing really matters very much at all.

The Infinite Extensibility of Guilt

The therapeutic view of guilt seems to offer the guilt-ridden an avenue of escape from its power, by redefining guilt as the result of psychic forces that do not relate to anything morally consequential. But that has not turned out to be an entirely workable solution, since it is not so easy to banish guilt merely by denying its reality. There is another powerful factor at work too, one that might be called the infinite extensibility of guilt. This proceeds from a very different set of assumptions, and is a surprising byproduct of modernity’s proudest achievement: its ceaselessly expanding capacity to comprehend and control the physical world.

In a world in which the web of relationships between causes and effects yields increasingly to human understanding and manipulation, and in which human agency therefore becomes ever more powerful and effective, the range of our potential moral responsibility, and therefore of our potential guilt, also steadily expands. We like to speak, romantically, of the interconnectedness of all things, failing to recognize that this same principle means that there is almost nothing for which we cannot be, in some way, held responsible. This is one inevitable side effect of the growing movement to change the name of our geological epoch from the Holocene to the Anthropocene—the first era in the life of the planet to be defined by the effects of the human presence and human power: effects such as nuclear fallout, plastic pollution, domesticated animals, and anthropogenic climate change. Power entails responsibility, and responsibility leads to guilt.

I can see pictures of a starving child in a remote corner of the world on my television, and know for a fact that I could travel to that faraway place and relieve that child’s immediate suffering, if I cared to. I don’t do it, but I know I could. Although if I did so, I would be a well-meaning fool like Dickens’s ludicrous Mrs. Jellyby, who grossly neglects her own family and neighborhood in favor of the distant philanthropy of African missions. Either way, some measure of guilt would seem to be my inescapable lot, as an empowered man living in an interconnected world.

Whatever donation I make to a charitable organization, it can never be as much as I could have given. I can never diminish my carbon footprint enough, or give to the poor enough, or support medical research enough, or otherwise do the things that would render me morally blameless.

Colonialism, slavery, structural poverty, water pollution, deforestation—there’s an endless list of items for which you and I can take the rap. To be found blameless is a pipe dream, for the demands on an active conscience are literally as endless as an active imagination’s ability to conjure them. And as those of us who teach young people often have occasion to observe, it may be precisely the most morally perceptive and earnest individuals who have the weakest common-sense defenses against such overwhelming assaults on their over-receptive sensibilities. They cannot see a logical place to stop. Indeed, when any one of us reflects on the brute fact of our being alive and taking up space on this planet, consuming resources that could have met some other, more worthy need, we may be led to feel guilt about the very fact of our existence.

The questions involved are genuine and profound; they deserve to be asked. Those who struggle most deeply with issues of environmental justice and stewardship are often led to wonder whether there can be any way of life that might allow one to escape being implicated in the cycles of exploitation and cruelty and privilege that mark, ineluctably, our relationship with our environment. They suffer from a hypertrophied sense of guilt, and desperately seek some path to an existence free of it.

In this, they embody a tendency of the West as a whole, expressed in an only slightly exaggerated form. So excessive is this propensity toward guilt, particularly in the most highly developed nations of the Western world, that the French writer Pascal Bruckner, in a courageous and brilliant recent study called The Tyranny of Guilt (in French, the title is the slightly different La tyrannie de la pénitence), has identified the problem as “Western masochism.” The lingering presence of “the old notion of original sin, the ancient poison of damnation,” Bruckner argues, holds even secular philosophers and sociologists captive to its logic.8

For all its brilliance, though, Bruckner’s analysis is not fully adequate. The problem goes deeper than a mere question of alleged cultural masochism arising out of vestigial moral reflexes. It is, after all, not merely our pathologies that dispose us in this direction. The pathologies themselves have an anterior source in the very things that make us proudest: our knowledge of the world, of its causes and effects, and our consequent power to shape and alter those causes and effects. The problem is perfectly expressed in T.S. Eliot’s famous question “After such knowledge, what forgiveness?”9 In a world of relentlessly proliferating knowledge, there is no easy way of deciding how much guilt is enough, and how much is too much.

Stolen Suffering

Notwithstanding all claims about our living in a post-Christian world devoid of censorious public morality, we in fact live in a world that carries around an enormous and growing burden of guilt, and yearns—sometimes even demands—to be free of it. About this, Bruckner could not have been more right. And that burden is always looking for an opportunity to discharge itself. Indeed, it is impossible to exaggerate how many of the deeds of individual men and women can be traced back to the powerful and inextinguishable need of human beings to feel morally justified, to feel themselves to be “right with the world.” One would be right to expect that such a powerful need, nearly as powerful as the merely physical ones, would continue to find ways to manifest itself, even if it had to do so in odd and perverse ways.

Which brings me to a very curious story, full of significance for these matters. It comes from a New York Times op-ed column by Daniel Mendelsohn, published on March 9, 2008, and aptly titled “Stolen Suffering.”10 Mendelsohn, a Bard College professor who had written a book about his family’s experience of the Holocaust, told of hearing the story of an orphaned Jewish girl who trekked 2,000 miles from Belgium to Ukraine, surviving the Warsaw ghetto, murdering a German officer, and taking refuge in forests where she was protected by kindly wolves. The story had been given wide circulation in a 1997 book, Misha: A Mémoire of the Holocaust Years, and its veracity was generally accepted. But it was eventually discovered to be a complete fabrication, created by a Belgian Roman Catholic named Monique De Wael.11

Such a deception, Mendelsohn argued, is not an isolated event. It needs to be understood in the context of a growing number of “phony memoirs,” such as the notorious child-survivor Holocaust memoir Fragments, or Love and Consequences, the putative autobiography of a young mixed-race woman raised by a black foster mother in gang-infested Los Angeles.12 These books were, as Mendelsohn said, “a plagiarism of other people’s trauma,” written not, as their authors claimed, “by members of oppressed classes (the Jews during World War II, the impoverished African-Americans of Los Angeles today), but by members of relatively safe or privileged classes.” Interestingly, too, he noted that the authors seemed to have an unusual degree of identification with their subjects—indeed, a degree of identification approaching the pathological. Defending Misha, De Wael declared, astonishingly, that “the story is mine…not actually reality, but my reality, my way of surviving.”13

What these authors have appropriated is suffering, and the identification they pursue is an identification not with certifiable heroes but with certifiable victims. It is a particular and peculiar kind of identity theft. How do we account for it? What motivates it? Why would comfortable and privileged people want to identify with victims? And why would their efforts appeal to a substantial reading public?

Or, to pose the question even more generally, in a way that I think goes straight to the heart of our dilemma: How can one account for the rise of the extraordinary prestige of victims, as a category, in the contemporary world?

I believe that the explanation can be traced back to the extraordinary weight of guilt in our time, the pervasive need to find innocence through moral absolution and somehow discharge one’s moral burden, and the fact that the conventional means of finding that absolution—or even of keeping the range of one’s responsibility for one’s sins within some kind of reasonable boundaries—are no longer generally available. Making a claim to the status of certified victim, or identifying with victims, however, offers itself as a substitute means by which the moral burden of sin can be shifted, and one’s innocence affirmed. Recognition of this substitution may operate with particular strength in certain individuals, such as De Wael and her fellow hoaxing memoirists. But the strangeness of the phenomenon suggests a larger shift of sensibility, which represents a change in the moral economy of sin. And almost none of it has occurred consciously. It is not something as simple as hypocrisy that we are seeing. Instead, it is a story of people working out their salvation in fear and trembling.

The Moral Economy of Sin

In the modern West, the moral economy of sin remains strongly tied to the Judeo-Christian tradition, and the fundamental truth about sin in the Judeo-Christian tradition is that sin must be paid for or its burden otherwise discharged. It can neither be dissolved by divine fiat nor repressed nor borne forever. In the Jewish moral world in which Christianity originated, and without which it would have been unthinkable, sin had always had to be paid for, generally by the sacrificial shedding of blood; its effects could never be ignored or willed away. Which is precisely why, in the Christian context, forgiveness of sin was specifically related to Jesus Christ’s atoning sacrifice, his vicarious payment for all human sins, procured through his death on the cross and made available freely to all who embraced him in faith. Forgiveness has a stratospherically high standing in the Christian faith. But it is grounded in fundamental theological and metaphysical beliefs about the person and work of Christ, which in turn can be traced back to Jewish notions of sin and how one pays for it. It makes little sense without them. Forgiveness, or expiation, or atonement—all of these concepts promising freedom from the weight of guilt are grounded in a moral transaction, enacted within the universe of a moral economy of sin.

But in a society that retains its Judeo-Christian moral reflexes but has abandoned the corresponding metaphysics, how can the moral economy of sin continue to operate properly, and its transactions be effectual? Can a credible substitute means of discharging the weight of sin be found? One workable way to be at peace with oneself and feel innocent and “right with the world” is to identify oneself as a certifiable victim—or better yet, to identify oneself with victims. This is why the Mendelsohn story is so important and so profoundly indicative, even if it deals with an extreme case. It points to the way in which identification with victims, and the appropriation of victim status, has become an irresistible moral attraction. It suggests the real possibility that claiming victim status is the sole sure means left of absolving oneself and securing one’s sense of fundamental moral innocence. It explains the extraordinary moral prestige of victimhood in modern America and Western society in general.

Why should that be so? The answer is simple. With moral responsibility comes inevitable moral guilt, for reasons already explained. So if one wishes to be accounted innocent, one must find a way to make the claim that one cannot be held morally responsible. This is precisely what the status of victimhood accomplishes. When one is a certifiable victim, one is released from moral responsibility, since a victim is someone who is, by definition, not responsible for his condition, but can point to another who is responsible.

But victimhood at its most potent promises not only release from responsibility, but an ability to displace that responsibility onto others. As a victim, one can project onto another person, the victimizer or oppressor, any feelings of guilt he might harbor, and in projecting that guilt lift it from his own shoulders. The result is an astonishing reversal, in which the designated victimizer plays the role of the scapegoat, upon whose head the sin comes to rest, and who pays the price for it. By contrast, in appropriating the status of victim, or identifying oneself with victims, the victimized can experience a profound sense of moral release, of recovered innocence. It is no wonder that this has become so common a gambit in our time, so effectively does it deal with the problem of guilt—at least individually, and in the short run, though at the price of social pathologies in the larger society that will likely prove unsustainable.

Grievance—and Penitence—on a Global Scale

All of this confusion and disruption to our most time-honored ways of handling the dispensing of guilt and absolution creates enormous problems, especially in our public life, as we assess questions of social justice and group inequities, which are almost impossible to address without such morally charged categories coming into play. Just look at the incredible spectacle of today’s college campuses, saturated as they are with ever-more-fractured identity politics, featuring an ever-expanding array of ever-more-minute grievances, with accompanying rounds of moral accusation and declarations of victimhood. These phenomena are not merely a fad, and they did not come out of nowhere.

Similar categories also come into play powerfully when the issues in question are ones relating to matters such as the historical guilt of nations and their culpability or innocence in the international sphere. Such questions are ubiquitous, as never before.

In the words of political scientist Thomas U. Berger, “We live in an age of apology and recrimination,” and he could not be more right.14 Guilt is everywhere around us, and its potential sources have only just begun to be plumbed, as our understanding of the buried past widens and deepens.

Gone is the amoral Hobbesian notion that war between nations is merely an expression of the state of nature. The assignment of responsibility for causing a war, the designation of war guilt, the assessment of punishments and reparations, the identification and prosecution of war crimes, the compensation of victims, and so on—all of these are thought to be an essential part of settling a war’s effects justly, and are part and parcel of the moral economy of guilt as it now operates on the national and international levels.

The heightened moral awareness we now bring to international affairs is something new in human history, stemming from the growing social and political pluralism of Western democracies and the unprecedented influence of universalized norms of human rights and justice, supported and buttressed by a robust array of international institutions and nongovernmental organizations ranging from the International Criminal Court to Amnesty International.

In addition, the larger narratives through which a nation organizes and relates its history, and through which it constitutes its collective memory, are increasingly subject to monitoring and careful scrutiny by its constituent ethnic, linguistic, cultural, and other subgroups, and are responsive to demands that those histories reflect the nation’s past misdeeds and express contrition for them. Never has there been a keener and more widespread sense of particularized grievances at work throughout the world, and never have such grievances been able to count on receiving such a thorough and generally sympathetic hearing from scholars and the general public.

Indeed, it is not an exaggeration to say that one could not begin to understand the workings of world politics today without taking into account a whole range of morally charged questions of guilt and innocence. How can one fully understand the decision by Chancellor Angela Merkel to admit a million foreign migrants a year into Germany without first understanding how powerfully the burden of historical guilt weighs upon her and many other Germans? Such factors are now as much a part of historical causation and explanation as such standbys as climate, geography, access to natural resources, demographics, and socioeconomic organization.

There is no disputing the fact, then, that history itself, particularly in the form of “coming to terms with” the wrongs of the past and the search for historical justice, is becoming an ever more salient element in national and international politics. We see it in the concern over past abuses of indigenous peoples, colonized peoples, subordinated races and classes, and the like, and we see it in the ways that nations relate their stories of war. Far from being buried, the past has become ever more alive with moral contestation.

Perhaps the most impressive example of sustained collective penitence in human history has come from the government and people of Germany, who have done so much to atone for the sins of Nazism. But how much penitence is enough? And how long must penance be done? When can we say that the German people—who are, after all, an almost entirely different cast of characters from those who lived under the Nazis—are free and clear, and have “paid their debt” to the world and to the past, and are no longer under a cloud of suspicion? Who could possibly make that judgment? And will there come a day—indeed, has it already arrived, with the nation’s backlash against Chancellor Merkel’s immigration blunders?—when the Germans have had enough of the Sisyphean guilt which, as it may seem to them, they have been forced by other sinful nations to bear, and begin to seek their redemption by other means?

Who, after all, has ever been pure and wise enough to administer such postwar justice with impartiality and detachment, and impeccable moral credibility? What nation or entity at the close of World War II was sufficiently without sin to cast the decisive stone? The Nuremberg and Tokyo war crimes trials were landmarks in the establishment of institutional entities administering and enforcing international law. But they also were of questionable legality, reflecting the imposition of ad hoc, ex post facto laws, administered by victors whose own hands were far from entirely clean (consider the irony of Soviet judges sitting in judgment of the same kinds of crimes their own regime committed with impunity)—indeed, victors who might well have been made to stand trial themselves, had the tables been turned, and the subject at hand been the bombing of civilian targets in Hiroshima and Dresden.

Or consider whether the infamous Article 231 in the Treaty of Versailles, assigning “guilt” to Germany for the First World War, was not, in the very attempt to impose the victor’s just punishment on a defeated foe, itself an act of grave injustice, the indignity of which surely helped to precipitate the catastrophes that followed it. The assignment of guilt, especially exclusive guilt, to one party or another may satisfy the most urgent claims of justice, or the desire for retribution, but may fail utterly the needs of reconciliation and reconstruction. As Elazar Barkan bluntly argued in his book The Guilt of Nations, “In forcing an admission of war guilt at Versailles, rather than healing, the victors instigated resentment that contributed to the rise of Fascism.”15 The work of healing, like the work of the Red Cross, has a claim all its own, one that is not always compatible with the utmost pursuit of justice (although it probably cannot succeed in the complete absence of such a pursuit). Nor does such an effort to isolate and assign exclusive guilt meet the needs of a more capacious historical understanding, one that understands, as Herbert Butterfield once wrote, that history is “a clash of wills out of which there emerges something that no man ever willed.”16 And, he might have added, in which no party is entirely innocent.

So once again we find ourselves confronting the paradox of sin that cannot be adequately expiated. The deeply inscribed algorithm of sin demands some kind of atonement, but for some aspects of the past there is no imaginable way of making that transaction without creating new sins of equivalent or greater dimension. What possible atonement can there be for, say, the institution of slavery? It is no wonder that the issue of reparations for slavery surfaces periodically, and probably always will, yet it is simply beyond the power of the present or the future to atone for the sins of the past in any effective way. Those of us who teach history, and take seriously the moral formation of our students, have to consider what the takeaway from this is likely to be. Do we really want to rest easy with the idea that a proper moral education needs to involve a knowledge of our extensive individual and collective guilt—a guilt for which there is no imaginable atonement? That this is not a satisfactory state of affairs would seem obvious; what to do about it, particularly in a strictly secular context, is another matter.

Again, the question arises whether and to what extent all of this has something to do with our living in a world that has increasingly, for the past century or so, been run according to secular premises, using a secular vocabulary operating within an “immanent frame”—a mode of operation that requires us to be silent about, and forcibly repress, the very religious frameworks and vocabularies within which the dynamics of sin and guilt and atonement have hitherto been rendered intelligible. I use the term “repress” here with some irony, given its Freudian provenance. But even the irreligious Freud did not envision the “liberation” of the human race from its religious illusions as an automatic and sufficient solution to its problems. He saw nothing resembling a solution. Indeed, it could well be the case, and paradoxically so, that just at the moment when we have become more keenly aware than ever of the wages of sin in the world, and more keenly anxious to address those sins, we find ourselves least able to describe them in those now-forbidden terms, let alone find moral release from their weight. Andrew Delbanco puts it quite well in his perceptive and insightful 1995 book The Death of Satan:

We live in the most brutal century in human history, but instead of stepping forward to take the credit, the devil has rendered himself invisible. The very notion of evil seems to be incompatible with modern life, from which the ideas of transgression and the accountable self are fast receding. Yet despite the loss of old words and moral concepts—Satan, sin, evil—we cannot do without some conceptual means for thinking about the universal human experience of cruelty and pain…. If evil, with all its insidious complexity, escapes the reach of our imagination, it will have established dominion over us all.17

So there are always going to be consequences attendant upon the disappearance of such words, and they may be hard to foresee, and hard to address. “Whatever became of sin?” asked the psychiatrist Karl Menninger, in his 1973 book of that title. What, in the new arrangements, can accomplish the moral and transactional work that was formerly done by the now-discarded concepts? If, thanks to Nietzsche, the absence of belief in God is “the notional condition of modern Western culture,” as Paula Fredriksen argues in her study of the history of the concept of sin, doesn’t that mean that the idea of sin is finished too?18

Yes, it would seem to mean just that. After all, “sin” cannot be understood apart from a larger context of ideas. So what happens when all the ideas that upheld “sin” in its earlier sense have ceased to be normatively embraced? Could not the answer to Menninger’s question be something like Zarathustra’s famous cry: “Sin is dead and we have killed it!”?

Sin is a transgression against God, and without a God, how can there be such a thing as sin? So the theory would seem to dictate. But as Fredriksen argues, that theory fails miserably to explain the world we actually inhabit. Sin lives on, it seems, even if we decline to name it as such. We live, she says, in the web of culture, and “the biblical god…seems to have taken up permanent residence in Western imagination…[so much so that] even nonbelievers seem to know exactly who or what it is that they do not believe in.”19 In fact, given the anger that so many nonbelievers evince toward this nonexistent god, one might be tempted to speculate whether their unconscious cry is “Lord, I do not believe; please strengthen my belief in your nonexistence!” Such was Nietzsche’s genius in communicating how difficult an achievement a clean and unconditional atheism is, a conundrum that he captured not by asserting that God does not exist, but that God is dead. For the existence of the dead constitutes, for us, a presence as well as an absence. It is not so easy to wish that enduring presence away, particularly when there is the lingering sense that the presence was once something living and breathing.

What makes the situation dangerous for us, as Fredriksen observes, is not only the fact that we have lost the ability to make conscious use of the concept of sin but that we have also lost any semblance of a “coherent idea of redemption,”20 the idea that has always been required to accompany the concept of sin in the past and tame its harsh and punitive potential. The presence of vast amounts of unacknowledged sin in a culture, a culture full to the brim with its own hubristic sense of world-conquering power and agency but lacking any effectual means of achieving redemption for all the unacknowledged sin that accompanies such power: This is surely a moral crisis in the making—a kind of moral-transactional analogue to the debt crisis that threatens the world’s fiscal and monetary health. The rituals of scapegoating, of public humiliation and shaming, of multiplying morally impermissible utterances and sentiments and punishing them with disproportionate severity, are visibly on the increase in our public life. They are not merely signs of intolerance or incivility, but of a deeper moral disorder, an Unbehagen that cannot be willed away by the psychoanalytic trick of pretending that it does not exist.

The Persistence of Guilt

Where then does this analysis of our broken moral economy leave us? The progress of our scientific and technological knowledge in the West, and of the culture of mastery that has come along with it, has worked to displace the cultural centrality of Christianity and Judaism, the great historical religions of the West. But it has not been able to replace them. For all its achievements, modern science has left us with at least two overwhelmingly important, and seemingly insoluble, problems for the conduct of human life. First, modern science cannot instruct us in how to live, since it cannot provide us with the ordering ends according to which our human strivings should be oriented. In a word, it cannot tell us what we should live for, let alone what we should be willing to sacrifice for, or die for.

And second, science cannot do anything to relieve the guilt weighing down our souls, a weight to which it has added appreciably, precisely by rendering us able to be in control of, and therefore accountable for, more and more elements in our lives—responsibility being the fertile seedbed of guilt. That growing weight seeks opportunities for release, seeks transactional outlets, but finds no obvious or straightforward ones in the secular dispensation. Instead, more often than not we are left to flail about, seeking some semblance of absolution in an incoherent post-Christian moral economy that has not entirely abandoned the concept of sin but lacks the transactional power of absolution or expiation without which no moral system can be bearable.

What is to be done? One conclusion seems unavoidable. Those who have viewed the obliteration of religion, and particularly of Judeo-Christian metaphysics, as the modern age’s signal act of human liberation need to reconsider their dogmatic assurance on that point. Indeed, the persistent problem of guilt may open up an entirely different basis for reconsidering the enduring claims of religion. Perhaps human progress cannot be sustained without religion, or something like it, and specifically without something very like the moral economy of sin and absolution that has hitherto been secured by the religious traditions of the West.

Such an argument would have little to do with conventional theological apologetics. Instead, it would draw from empirical realities regarding the social and psychological makeup of advanced Western societies. And it would fully face the fact that, without the support of religious beliefs and institutions, one may have no choice but to accept the dismal prospect envisioned by Freud, in which the advance of human civilization brings not happiness but a mounting tide of unassuaged guilt, ever in search of novel and ineffective, and ultimately bizarre, ways to discharge itself. Such an advance would steadily diminish the human prospect, and render it less and less sustainable. It would smother the energies of innovation that have made the West what it is, and fatally undermine the spirited confidence needed to uphold the very possibility of progress itself. It must therefore be countered. But to be countered, it must first be understood.

Endnotes

  1. The discussion that follows is drawn from the second essay in Friedrich Nietzsche, On the Genealogy of Morality, ed. Keith Ansell-Pearson, trans. Carol Diethe (Cambridge, England: Cambridge University Press, 2006), 35–67. First published 1887. I here take note of the fact that any discussion of guilt per se runs the risk of conflating different meanings of the word: guilt as a forensic or objective term, guilt as culpability, is not the same thing as guilt as a subjective or emotional term. It is the difference between being guilty and feeling guilty, a difference that is analytically clear, but often difficult to sustain in discussions of particular instances.
  2. Ibid., 61–62.
  3. Sigmund Freud, Civilization and Its Discontents, trans. James Strachey (New York, NY: Norton, 2005), 137, 140. First published 1930.
  4. Ibid., 140.
  5. William Shakespeare, The Merchant of Venice, Act 4, Scene 1, lines 184–205; see e.g., Stanley Wells and Gary Taylor, eds., The Oxford Shakespeare: The Complete Works, second edition (Oxford, England: Oxford University Press, 2005), 473.
  6. Luke 23:34 (Revised Standard Version).
  7. Gregg Easterbrook, “Forgiveness is Good for Your Health,” Beliefnet, n.d., http://www.beliefnet.com/wellness/health/2002/03/forgiveness-is-good-for-your-health.aspx. Accessed 5 January 2017.
  8. Pascal Bruckner, The Tyranny of Guilt: An Essay on Western Masochism, trans. Steven Rendall (Princeton, NJ: Princeton University Press, 2010), 1–4.
  9. T.S. Eliot, “Gerontion,” line 34, in The Complete Poems and Plays: 1909–1950 (Orlando, FL: Harcourt Brace Jovanovich, 1971), 22. The poem was first published in 1920.
  10. Daniel Mendelsohn, “Stolen Suffering,” New York Times, March 9, 2008, WK12, http://www.nytimes.com/2008/03/09/opinion/09mendelsohn.html?_r=0.
  11. The book was Misha: A Mémoire of the Holocaust Years (Boston, MA: Mount Ivy Press, 1997), and the author published it under the name Misha Defonseca. According to the Belgian newspaper Le Soir, De Wael was the daughter of parents who had collaborated with the Nazis: see David Mehegan, “Misha and the Wolves,” Off the Shelf (blog), Boston Globe, March 3, 2008, http://www.boston.com/ae/books/blog/2008/03/misha_and_the_w.html.
  12. Binjamin Wilkomirski, Fragments: Memories of a Wartime Childhood (New York, NY: Schocken, 1997); Margaret B. Jones, Love and Consequences: A Memoir of Hope and Survival (New York, NY: Riverhead, 2008).
  13. In a final twist of the case, in May 2014 the Massachusetts Court of Appeals ruled that De Wael had to forfeit the $22.5 million in royalties she had received for Misha. Quotation from Lizzie Dearden, “Misha Defonseca: Author Who Made Up Holocaust Memoir Ordered to Repay £13.3m,” The Independent, May 12, 2014, http://www.independent.co.uk/arts-entertainment/books/news/author-who-made-up-bestselling-holocaust-memoir-ordered-to-repay-133m-9353897.html; additional details from Jeff D. Gorman, “Bizarre Holocaust Lies Support Publisher’s Win,” Courthouse News Service, May 8, 2014, http://www.courthousenews.com/2014/05/08/67710.htm.
  14. Thomas U. Berger, War, Guilt, and World Politics after World War II (New York, NY: Cambridge University Press, 2012), 8.
  15. Elazar Barkan, The Guilt of Nations: Restitution and Negotiating Historical Injustices (Baltimore, MD: Johns Hopkins University Press, 2000), xxxiii.
  16. Herbert Butterfield, The Whig Interpretation of History (New York, NY: Norton, 1965), 45–47.
  17. Andrew Delbanco, The Death of Satan: How Americans Have Lost the Sense of Evil (New York, NY: Farrar, Straus and Giroux, 1995), 9.
  18. Paula Fredriksen, Sin: The Early History of an Idea (Princeton, NJ: Princeton University Press, 2012), 149.
  19. Ibid.
  20. Ibid., 150.

Wilfred M. McClay is G.T. and Libby Blankenship Chair in the History of Liberty and director of the Center for the History of Liberty at the University of Oklahoma.

Reprinted from The Hedgehog Review 19.1 (Spring 2017). This essay may not be resold, reprinted, or redistributed for compensation of any kind without prior written permission. Please contact The Hedgehog Review for further details.

In Search Of Ourselves

From yoga retreats to mindfulness meditation, finding ourselves is in vogue. Yet what we expect to find remains a mystery, and quite who finds whom is a puzzle in itself. Might there be no one there to find? Is the process of self-discovery a desirable and vital goal, or just advertising spin?

The Panel

Philosopher and author of The Singularity David Chalmers joins explorer Ed Stafford and award-winning novelist Joanna Kavenna to discover what lies within.

https://iai.tv/video/in-search-of-ourselves

Speakers

  • David Chalmers
    Formulator of the hard problem of consciousness, philosopher of mind David Chalmers is the author of The Singularity
  • Joanna Kavenna
    Winner of the Orange First Novel prize, Kavenna’s works include A Field Guide to Reality, The Ice Museum and Inglorious. Her journalism has appeared in the Lond…
  • David Malone
    David Malone is a director and presenter of BBC and Channel 4 documentaries exploring the history and philosophy of science. His work includes Testing God and h…
  • Ed Stafford
    Ed Stafford is an English explorer and Guinness World Record holder for being the first human ever to walk the length of the Amazon River.

William Powell, author of counterculture manifesto ‘The Anarchist Cookbook’, dies at 66

An enraged 19-year-old, William Powell holed up in the bowels of the New York Public Library and pored over every shred of mayhem he could find — declassified military documents, Army field guides, electronics catalogs, insurrectionist pamphlets, survivalist guidebooks.

The material formed the bedrock for “The Anarchist Cookbook,” a crude though clever how-to book for aspiring terrorists, troublemakers and would-be revolutionaries.

Published as the Vietnam War continued to boil and the Summer of Love faded in the distance, the book became a bestseller and an instant manifesto of dissent in America, as ubiquitous in a college dorm room as a Che Guevara poster or a copy of the “Whole Earth Catalog.”

But as the decades passed, Powell came to see the book as a misstep, a vast error in judgment.

Confronted late in life by the makers of the documentary “American Anarchist,” Powell seemed to buckle at the thought that his book had been tied to Columbine, the Oklahoma City bombing, and a litany of other atrocities.

But if there was blood on his hands, he didn’t fully acknowledge it.

“I don’t know the influence the book may have had on the thinking of the perpetrators of these attacks, but I cannot imagine it was positive.”

Long an expatriate, Powell died of a heart attack July 11 during a vacation with his wife, children and grandchildren in Halifax, Canada. His death became public only when it was noted in the closing credits of “American Anarchist,” which premiered Friday. It was also disclosed on a Facebook page devoted to Powell’s work as a special education teacher in Africa and Asia. He was 66.

“The Anarchist Cookbook,” which has sold at least 2 million copies — printed, downloaded or otherwise — and remains in publication, was originally a 160-page book that offered a nuts-and-bolts overview of weaponry, sabotage, explosives, booby traps, lethal poisons and drug making. Illustrated with crude drawings, it informed readers how to make TNT and Molotov cocktails, convert shotguns to rocket launchers, destroy bridges, behead someone with piano wire and brew LSD.

The book came with a warning: “Not for children or morons.”

In a foreword, Powell advised that he hadn’t written the book for fringe militant groups of the era like the Weathermen or Minutemen, but for the “silent majority” in America, those he said needed to learn the tools for survival in an uncertain time. Powell himself was worried about being drafted and was an outspoken critic of the Vietnam War and President Nixon.

“This book is for anarchists — those who feel able to discipline themselves — on all subjects (from drugs to weapons to explosives) that are currently illegal or suppressed in this country,” he wrote.

Critics brushed the book off as both “reckless” and “pointless”; the FBI took note but decided any intervention would only stoke further interest in the book. Activists associated with militant groups branded it a transparent attempt to profit off the discord in America.

Powell said he received death threats and retreated to Vermont. He held only one press conference after the book was published, and even that was interrupted when someone hurled stink bombs toward him.

In more recent publications, the book appears to have grown shorter and readers on Amazon have complained that it has been heavily edited. One reader said he was gravely disappointed to find out that a recipe for napalm had been cut from the book.

Powell eventually found a more conventional life, returning to college, earning a master’s degree in English, becoming a teacher, getting married and raising a family. He also led a nomadic life, teaching special needs children as he roamed the world with his wife and children, traveling from China to Tanzania.

The book itself never made him rich. He conceded years later that the copyright had been held from the start by the book’s original publisher, Lyle Stuart Inc., and that at best he had made $50,000 off the book.

Powell said he became a Christian and found himself increasingly uncomfortable with the book, which had tailed him like a shadow, sometimes standing in the way of a job or testing a friendship. In the late 1970s, he asked the publisher to take “The Anarchist Cookbook” out of publication. His request was rejected.

The author did, though, add a cautionary note to would-be buyers on Amazon, condemning his own book as “a misguided product of my adolescent anger.” He said the book should no longer be in print. He stopped short of urging people not to buy it, though his feelings were clear.

“The central idea to the book was that violence is an acceptable means to bring about political change,” he wrote. “I no longer agree with this.”

In 2013 he wrote a first-person story for The Guardian, again expressing remorse for the book and noting that he had more than atoned for it with decades of teaching and public service in the poorest and least developed countries in the world. He concluded that as a teen, he had accepted the notion that violence could be used to prevent violence.

“I had fallen for the same irrational pattern of thought that led to U.S. military involvement in both Vietnam and Iraq,” he wrote. “The irony is not lost on me.”

On a Facebook remembrance page, filled with condolences and fond memories from students, fellow teachers and family members, there is no obvious mention of the book that made him noteworthy. There is, though, evidence Powell had carved out a far different reputation in the classroom.

“If I have made any difference or had any impact on students’ lives since I began teaching overseas, it is because Bill was the catalyst,” wrote Kenny Peavy. “He was the first one to take the time to truly see.”

steve.marble@latimes.com

twitter.com/stephenmarble

Resist the Internet


So far, in my ongoing series of columns making the case for implausible ideas, I’ve fixed race relations and solved the problem of a workless working class. So now it’s time to turn to the real threat to the human future: the one in your pocket or on your desk, the one you might be reading this column on right now.

Search your feelings, you know it to be true: You are enslaved to the internet. Definitely if you’re young, increasingly if you’re old, your day-to-day, minute-to-minute existence is dominated by a compulsion to check email and Twitter and Facebook and Instagram with a frequency that bears no relationship to any communicative need.

Compulsions are rarely harmless. The internet is not the opioid crisis; it is not likely to kill you (unless you’re hit by a distracted driver) or leave you ravaged and destitute. But it requires you to focus intensely, furiously, and constantly on the ephemera that fills a tiny little screen, and experience the traditional graces of existence — your spouse and friends and children, the natural world, good food and great art — in a state of perpetual distraction.

Used within reasonable limits, of course, these devices also offer us new graces. But we are not using them within reasonable limits. They are the masters; we are not. They are built to addict us, as the social psychologist Adam Alter’s new book “Irresistible” points out — and to madden us, distract us, arouse us and deceive us. We primp and perform for them as for a lover; we surrender our privacy to their demands; we wait on tenterhooks for every “like.” The smartphone is in the saddle, and it rides mankind.

Which is why we need a social and political movement — digital temperance, if you will — to take back some control.

“Temperance?” you might object, with one eye on the latest outrage shared by your co-partisans on social media. “You mean, like, Prohibition? For something everyone relies on for their daily work and lives, that’s the basis for our economic — hang on, I just need to ‘favorite’ this tweet …”

No, not like Prohibition. Temperance doesn’t have to mean teetotaling; it can simply mean a culture of restraint that tries to keep a specific product in its place. And the internet, like alcohol, may be an example of a technology that should be sensibly restricted in custom and in law.

Of course it’s too soon to fully know (and indeed we can never fully know) what online life is doing to us. It certainly delivers some social benefits, some intellectual advantages, and contributes an important share to recent economic growth.

But there are also excellent reasons to think that online life breeds narcissism, alienation and depression, that it’s an opiate for the lower classes and an insanity-inducing influence on the politically-engaged, and that it takes more than it gives from creativity and deep thought. Meanwhile the age of the internet has been, thus far, an era of bubbles, stagnation and democratic decay — hardly a golden age whose customs must be left inviolate.

So a digital temperance movement would start by resisting the wiring of everything, and seek to create more spaces in which internet use is illegal, discouraged or taboo. Toughen laws against cellphone use in cars, keep computers out of college lecture halls, put special “phone boxes” in restaurants where patrons would be expected to deposit their devices, confiscate smartphones being used in museums and libraries and cathedrals, create corporate norms that strongly discourage checking email in a meeting.

Then there are the starker steps. Get computers — all of them — out of elementary schools, where there is no good evidence that they improve learning. Let kids learn from books for years before they’re asked to go online for research; let them play in the real before they’re enveloped by the virtual.

Then keep going. The age of consent should be 16, not 13, for Facebook accounts. Kids under 16 shouldn’t be allowed on gaming networks. High school students shouldn’t bring smartphones to school. Kids under 13 shouldn’t have them at all. If you want to buy your child a cellphone, by all means: In the new dispensation, Verizon and Sprint will have some great “voice-only” plans available for minors.

I suspect that versions of these ideas will be embraced within my lifetime by a segment of the upper class and a certain kind of religious family. But the masses will still be addicted, and the technology itself will have evolved to hook and immerse — and alienate and sedate — more completely and efficiently.

But what if we decided that what’s good for the Silicon Valley overlords who send their kids to a low-tech Waldorf school is also good for everyone else? Our devices we shall always have with us, but we can choose the terms. We just have to choose together, to embrace temperance and paternalism both. Only a movement can save you from the tyrant in your pocket.

REVIVAL Resurrecting the Process Church of the Final Judgment

Who is seeking to destroy all esoteric religious movements, starting with The Process Church of the Final Judgment? The Process was the most fascinating and innovative cult of the 1960s, then vanished for four decades before being virtually reborn using information technology.

Revival seems to be fiction, yet it’s based on fact and explores the implications of the internet, and the disintegration of conventional faiths. As reported in the author’s anthropological study, Satan’s Power, the Process was polytheistic, asserting the union of Jehovah with Lucifer, and the unity of Christ with Satan. Each Process member was a fragment of a god, with a corresponding personality trait: Jehovah = Discipline, Lucifer = Liberation, Christ = Unification, Satan = Separation.

Before the first page of this book, the computer magician who resurrected the Process Church was murdered. Was this man Christ?

Christianity may be the opposite of what it seems, a Satanic plot that subconsciously preaches, “Release the fiend that lies dormant within you, for he is strong and ruthless, and his power is far beyond the bounds of human frailty. Come forth in your savage might, rampant with the lust of battle, tense and quivering with the urge to strike, to smash, to split asunder all that seek to detain you.” Can the surviving Processeans achieve the hopes expressed in their blessing: “May the life-giving water of the Lord Christ and the purifying fire of the Lord Satan bring the presence of love and unity into this assembly”?

Author William Sims Bainbridge earned his doctorate in sociology from Harvard University in 1975 and has published about 300 articles and written or edited 40 books in a variety of scientific fields. Currently, he is Co-Director, Cyber-Human Systems (Human-Centered Computing) at the National Science Foundation.

http://feralhouse.com/revival/

End-to-End Encryption 101

And do the Vault 7 Revelations Mean Encryption Is Useless?

If you’ve used the internet at any point since May 2013, you’ve probably heard that you should use encrypted communications. Edward Snowden’s revelation that the National Security Agency logs all of our calls, texts, and emails sparked a surge in the development and use of encryption apps and services. Only a few years later, encryption is widely used for daily communication. If you use any of these encryption tools, you’ve probably also heard the phrase “end-to-end encryption,” or “E2EE.” The name seems straightforward enough: end-to-end means content is encrypted from one endpoint (generally your phone or computer) to another endpoint (the phone or computer of your message’s intended recipient). But what level of security does this promise for you, the user?
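To make the idea concrete, here is a minimal Python sketch of the end-to-end principle, using the PyNaCl library. It is only an illustration of public-key encryption between two endpoints, not the actual protocol used by Signal, WhatsApp, or any other messenger (those add forward secrecy, key verification, and much more); the message text and variable names are invented for the example.

```python
# Minimal sketch of the end-to-end idea using PyNaCl (libsodium bindings).
# Real messengers layer on forward secrecy, identity verification, etc.
from nacl.public import PrivateKey, Box

# Each endpoint generates its own keypair; private keys never leave the device.
sender_key = PrivateKey.generate()
recipient_key = PrivateKey.generate()

# The sender encrypts to the recipient's *public* key.
sending_box = Box(sender_key, recipient_key.public_key)
ciphertext = sending_box.encrypt(b"meet at the usual place at noon")

# A relay server in the middle only ever sees this opaque blob.
print(bytes(ciphertext).hex()[:48], "...")

# Only the recipient's private key (paired with the sender's public key,
# which also authenticates the sender) can recover the plaintext.
receiving_box = Box(recipient_key, sender_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"meet at the usual place at noon"
```

The point of the sketch is simply that anything sitting between the two endpoints, including the service’s own servers, handles only the ciphertext.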

Since the beginning of Trump’s administration, the US Customs and Border Protection (CBP) has stepped up its invasions of travelers’ privacy. The CBP has been demanding that both US citizens and visitors log into their phones and laptops and hand them over to the CBP for inspection. They’ve also demanded that travelers provide their passwords or log into their social media accounts. Travelers who don’t comply face the threat of being denied entry.

Yesterday, WikiLeaks published a trove of leaked CIA documents, including knowledge of security vulnerabilities and exploits that the CIA paid for and kept secret from the general public. Now that this information has leaked, it’s no longer just the CIA that knows these vulnerabilities—it’s everyone. The New York Times and others misreported that the CIA had broken the encryption in apps like Signal and WhatsApp, when in fact what the CIA did was target and compromise specific people’s Android devices.

In short, this revelation confirms the importance of using end-to-end encrypted communications, which hinder state-level actors from performing broad spectrum dragnet surveillance. E2EE is still important.

Many reports around Vault 7 have given the impression that encrypted apps like Signal have been compromised. In fact, the compromise is at the device level—at the endpoint. There is no reason to believe the encryption itself does not work.

Limitations: Plaintext Endpoints

First, it’s important to understand that if you can read a message, it is plaintext—that is, no longer encrypted. With end-to-end encryption, the weak links in the security chain are you and your device, and your recipient and their device. If your recipient can read your message, anyone with access to their device can also read it. An undercover cop could read your message over your recipient’s shoulder, or the police could confiscate your recipient’s device and crack it open. If there is any risk of either of these unfortunate events taking place, you should think twice before sending anything you wouldn’t want to share with the authorities.

This particular limitation is also relevant to the recent “Vault 7” reveals, which demonstrate how apps like Signal, WhatsApp, and Telegram may not be useful if an adversary (like the CIA) gains physical access to your device or your contact’s device and is able to unlock it. Many reports around Vault 7 have been somewhat misleading, giving the impression that the apps themselves have been compromised. In this case, the compromise is at the device level—at the endpoint. The encryption itself is still good.

Limitations: Targeted Surveillance

Considering that you can’t control the security conditions of your message’s recipient, you should consider the possibility that any message you send them might be read. While rare, there are cases of state powers targeting people with direct surveillance. In these cases, targets may be working with malware-infected devices intended to log all of their incoming and outgoing communications. This compromise functions at the endpoint level, rendering E2EE useless against these specific adversaries. Because it is difficult to know whether you (or your message recipient) are the target of this type of attack, it is always best to default to not sending overly-sensitive information via digital communications. Currently, such attacks appear to be rare, but one should never take risks needlessly.

The third thing you should know about E2EE is that it doesn’t necessarily protect your metadata. Depending on how communications are transmitted, logs may still show the time and size of communication, as well as the sender and recipient. Logs may also show the location of both sender and recipient at the time of communication. While this is not typically enough to land someone in jail on its own, it can be useful in proving associations between people, establishing proximity to crime scenes, and tracking communication patterns. All these pieces of information are useful in establishing larger behavioral patterns in cases of direct surveillance.
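As a hypothetical illustration of that limitation, here is roughly the kind of record a relaying server could still log about an end-to-end encrypted message; every field name and value below is invented for the example and does not reflect any particular service’s schema.

```python
# Hypothetical illustration: even when the message body is end-to-end
# encrypted, a relay server can typically still observe metadata like this.
# Every field and value here is invented for the example.
from datetime import datetime, timezone

server_side_log_entry = {
    "sender": "+1-555-0100",            # who sent the message
    "recipient": "+1-555-0199",         # who received it
    "sent_at": datetime.now(timezone.utc).isoformat(),
    "ciphertext_size_bytes": 1184,      # roughly reveals message length
    "sender_ip": "203.0.113.7",         # can imply physical location
    "body": "<opaque ciphertext>",      # the only part E2EE protects
}

print(server_side_log_entry)
```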

So… Why?

So, if end-to-end encryption doesn’t necessarily protect the content of your communications, and still gives up useful metadata, what’s the point of using it?

One of the most important things E2EE does is ensure that your data never hits someone else’s servers in a readable form. Since end-to-end encryption starts from the moment you hit “send” and persists until it hits your recipient’s device, when a company—like Facebook—is subpoenaed for your logged communications, they do not have any plaintext content to give up. This puts the authorities in a position in which if they wish to acquire the content of your communications, they are forced to spend a significant amount of time and resources attempting to break the encryption. In the United States, your right to a speedy trial may render this evidence useless to prosecutors, who may not be able to decrypt it quickly enough to please a judge.

Mass Surveillance

Another purpose E2EE serves is to make dragnet surveillance by the National Security Agency and other law enforcement agencies much more difficult. Since there is no point in the middle at which your unencrypted communications can be grabbed, what is grabbed instead is the same encrypted blocks of text available by subpoena. Dragnet surveillance is generally conducted by collecting any available data and subjecting it to automated sorting rather than individual analysis. The use of encryption prevents algorithmic sifting for content, thus making this process much more difficult and generally not worthwhile.

Stingrays

In addition to NSA’s data collection, federal and state law enforcement agencies around the country have, and frequently use, cell site simulators known as “IMSI catchers” or “Stingrays.” IMSI catchers pretend to be cell towers in order to trick your phone into giving up identifying information, including your location. Cell site simulators also grab and log your communications. As with other methods of interception, encryption means that what is retrieved is largely useless, unless the law enforcement agency is willing to go to the trouble to decrypt it.

Encryption At Rest

In addition to using end-to-end encryption to protect the content of your messages while they’re being sent, you can use full-disk encryption to protect your information while it’s stored on your device. Proper full-disk encryption means that all of the information on your device is indecipherable without your encryption key (usually a passphrase), creating a hardened endpoint which is much more difficult to compromise. Although encrypting your endpoints is not necessarily protection against some of the more insidious methods of surveillance, such as malware, it can prevent adversaries who gain possession of your devices from pulling any useful data off of them.
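As a rough sketch of the passphrase-derived-key idea behind encryption at rest, the snippet below uses Python’s `cryptography` package to encrypt a small piece of data under a key stretched from a passphrase. Real full-disk encryption (FileVault, LUKS, BitLocker, or a phone’s built-in storage encryption) operates at the operating-system or device level rather than in application code, so treat this only as an illustration; the passphrase and contents are placeholders.

```python
# Illustrative sketch of passphrase-based encryption at rest using the
# `cryptography` package. Real full-disk encryption (LUKS, FileVault,
# BitLocker) works at the OS / block-device level, not like this.
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def key_from_passphrase(passphrase: bytes, salt: bytes) -> bytes:
    """Stretch a passphrase into a 32-byte urlsafe-base64 key for Fernet."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(passphrase))


salt = os.urandom(16)  # stored alongside the ciphertext; not secret
key = key_from_passphrase(b"correct horse battery staple", salt)

# Encrypt the data before it is written to disk.
token = Fernet(key).encrypt(b"contents of a sensitive note")

# Without the passphrase (and salt), the stored token is indecipherable;
# with them, the same key can be re-derived and the data recovered.
recovered = Fernet(key_from_passphrase(b"correct horse battery staple", salt)).decrypt(token)
assert recovered == b"contents of a sensitive note"
```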

End-to-end encryption is by no means a magical shield against surveillance by nation states or malicious individuals, but Vault 7 highlights how using it can help force a procedural shift from dragnet surveillance to resource-intensive targeted attacks. When paired with good sense, encrypted devices, and other security practices, E2EE can be a powerful tool for significantly reducing your attack surface. Consistent, habitual use of end-to-end encryption can nullify many lower-tier threats and may even cause some higher-level adversaries to decide that attacking you is simply not worth the effort.

— By Elle Armageddon

A Deconstruction of Love: Mary and Percy Shelley Edition

In honor of Valentine’s Day, Flavorwire is deconstructing a few famous pop-culture romances. Here’s our first effort, on Mary and Percy Shelley.

You know, people present the romance of Mary and Percy Shelley as one of history’s great love stories. Possibly because it involved a sudden elopement, and sudden elopements seem so romantic in theory. But they are often just flat-out crazy in practice.

To consider, for starters: Mary (then Wollstonecraft Godwin) was 16 when she snuck off to meet her future husband, Percy Shelley. They were acquainted through her father, but that rendezvous would be their first tryst. And it would happen in a graveyard. Taken alone, that would be ghoulish enough, as first dates go. But this particular graveyard had, as one of its inhabitants, Mary’s own famous feminist mother, Mary Wollstonecraft. The symbolism of standing over your dead mother as you profess your love for a hot young poet: tough to ignore. Though the drama of the atmosphere certainly made it easier to ignore that the hot young poet was already married, and expecting a child.

Shelley was 21 himself, so not that much older and certainly not much wiser. He almost instantly slept with Mary (possibly in the cemetery, Miranda Seymour’s recent biography speculates, though who knows), and within two weeks had announced his affections to Mary’s father. Shockingly, the elder Godwin, who had counted Shelley a friend, was less than pleased at these developments. He tried to place himself between Shelley and the “fair and spotless fame of my young child.” He was foiled by his other daughters, one of whom, Jane, was helping run love notes back and forth between the lovers. Jane had also been a chaperone on that night (or those nights) in the graveyard, though she was more accomplice than hall monitor, obviously.

Meanwhile, Shelley told his wife, Harriet, in a letter that he would continue to be friendly but that she must consider Mary’s own suffering, and “the tyranny which is exercised upon her,” meaning, it seems, Godwin’s control over his daughter. The man had balls, telling his pregnant wife to sympathize with his mistress of perhaps a month’s standing. He began to make financial plans to leave Harriet, plans no doubt made easier by the fact that Shelley was an endless borrower from his father’s estate.

Never one to stick to practicalities, Shelley was not deterred by all this planning from threatening suicide by laudanum on at least two occasions. At least once, he tried to induce Mary herself to do the same in a sort of misbegotten Romeo-and-Juliet drama. Finally, he hatched a plan that they would leave and go abroad, and Mary dashed to the end of a street after a chaise, trailing Jane too, both of them running in black silk ball gowns. It was under two months after the first night in the cemetery, and the thing had already gone all to hell. Though Shelley was happy, and even toying with the idea of eventually having Harriet join them when the trio settled in continental Europe.

The thing he might not have known then, as they rushed to Dover to sail across the English channel, was that Mary was already pregnant.

She would eventually miscarry that child. Bad childbirths became a theme of her life, possibly one reflected in the fleshy grotesqueness of Frankenstein, as Ruth Franklin has speculated. Of the five pregnancies she’d have in the eight years she’d be with Shelley, only one child would actually survive.

In fact, most things, after the pretty elopement, went sour. Jane, who continued to live with the couple for some time, changed her name to Claire and promptly began to deeply annoy Mary by carrying on what was at least an emotional affair with Shelley.

Harriet Shelley committed suicide a couple of years later.

Godwin’s former financial stability began to evaporate, and he reconciled with his daughter just in time to become a financial burden on her.

Shelley’s interest in Mary herself began to wane.

And then Shelley died in a sailing accident, and his family refused to support Mary or the one surviving child of the alliance, Percy Florence.

Appropriately, though, for someone whose romantic life hit its peak in a graveyard, she managed to keep his heart, cut from the decaying body they took from the sea; the body itself Shelley’s friends later burned on a pyre. (The apocrypha holds that Shelley was a bit afraid of being buried alive, and had often hoped for cremation.) After Mary died, Seymour notes, what remained of the heart was found among her things, wrapped in silk. People like to think of this as a melodramatic testament to the lastingness of their love.

That, I think, is a bit of a teenager’s interpretation. Let me offer another counterintuitive reading, one which I have no real evidence for — but, I think, with a life of melodramatic boyfriends behind me, I do have some authority to offer. See, as for myself, I like to think there were nights when she saw keeping his heart imprisoned in a drawer so long after he was gone as a sort of payback, for his taking her own at such a young age. There is a level on which it all sounds like revenge.

A photographer edits out our smartphones to show our strange and lonely new world

Are you reading this on a handheld device? There’s a good chance you are. Now imagine how you’d look if that device suddenly disappeared. Lonely? Slightly crazy? Perhaps next to a person being ignored? As we are sucked in ever more by the screens we carry around, even in the company of friends and family, the hunched pose of the phone-absorbed seems increasingly normal.

US photographer Eric Pickersgill has created “Removed,” a series of photos to remind us of how strange that pose actually is. In each portrait, electronic devices have been “edited out” (removed before the photo was taken, from people who’d been using them) so that people stare at their hands, or the empty space between their hands, often ignoring beautiful surroundings or opportunities for human connection. The results are a bit sad and eerie—and a reminder, perhaps, to put our phones away.


Photos courtesy of Eric Pickersgill.

When you think of global financial hubs, Dublin doesn’t immediately spring to mind. Indeed, it currently ranks 30th in the Z/Yen Group’s index of financial centers (pdf). But the Irish capital is poised to rise in the rankings, thanks to Brexit.

Since the UK voted to leave the European Union, Britain’s many financial firms have explored setting up new subsidiaries—or even moving headquarters—elsewhere in the bloc, in order to keep a foothold in the EU’s single market. This has set off a battle between European cities to try and tempt banks, insurers, funds and the like.

The minutes of the Irish central bank’s last meeting (pdf), in December, revealed the details of a roundtable discussion a deputy governor held with financial industry players. On the inbound interest from UK financial firms:

There had been significant levels of interest in authorisations sought for new businesses looking to relocate from the UK. The levels of interest were larger than had been initially anticipated.

Dublin is well positioned to attract Brexit-related relocations for several reasons, including that it’s a short hop from London, shares the same language, and levies some of the lowest tax rates in Europe. That makes the city less of an underdog in the scramble for Brexit business against established EU financial hubs like Frankfurt, Luxembourg, and Paris.

But Dublin recently complained to the European Commission about the sharp tactics other cities are using to lure British firms. Eoghan Murphy, the minister in charge of promoting Dublin’s financial centre, told Reuters that the cities are being “very aggressive” to get banks and other financial firms to relocate, but Ireland isn’t interested in “brass plating.” Rather than just setting up a token operation with a brass plate on the door, Dublin expects “the mind and management of the entity” to be in Ireland, according to the central bank minutes.

Banks in London have repeatedly warned that they will shift staff from London to elsewhere in the EU in order to maintain their legal ability to provide services across the region. Some estimates from City lobby groups suggest that up to 70,000 financial services jobs may move away from London. British prime minister Theresa May tried to charm bankers in Davos into believing that the economic prospects for post-Brexit Britain are bright, but some have pressed on with their relocation plans regardless.

The size of London’s massive financial industry could make it hard for any other city, including Dublin (paywall), to cope with a big influx. Ireland’s central bank is struggling to hire and retain enough staff to deal with the potential growth in authorizations (and subsequent supervision) of relocated firms. At the end of 2016, the central bank had almost 100 fewer full-time staff than it was authorized to hire. It is trying to boost its headcount by another 10% this year, which the governor has called a “challenging target.”

The central bank is not the only Irish institution straining to staff up after Brexit. Last year, there was a 40% increase in applications for Irish passports from Brits, which has led to extended waiting times.