Friday, February 24, 2017

Our Miserable 21st Century -- Commentary Magazine

On the morning of November 9, 2016, America’s elite—its talking and deciding classes—woke up to a country they did not know. To most privileged and well-educated Americans, especially those living in its bicoastal bastions, the election of Donald Trump had been a thing almost impossible even to imagine. What sort of country would go and elect someone like Trump as president? Certainly not one they were familiar with, or understood anything about.
Whatever else it may or may not have accomplished, the 2016 election was a sort of shock therapy for Americans living within what Charles Murray famously termed “the bubble” (the protective barrier of prosperity and self-selected associations that increasingly shields our best and brightest from contact with the rest of their society). The very fact of Trump’s election served as a truth broadcast about a reality that could no longer be denied: Things out there in America are a whole lot different from what you thought.
Yes, things are very different indeed these days in the “real America” outside the bubble. In fact, things have been going badly wrong in America since the beginning of the 21st century.
It turns out that the year 2000 marks a grim historical milestone of sorts for our nation. For whatever reasons, the Great American Escalator, which had lifted successive generations of Americans to ever higher standards of living and levels of social well-being, broke down around then—and broke down very badly.
The warning lights have been flashing, and the klaxons sounding, for more than a decade and a half. But our pundits and prognosticators and professors and policymakers, ensconced as they generally are deep within the bubble, were for the most part too distant from the distress of the general population to see or hear it. (So much for the vaunted “information era” and “big-data revolution.”) Now that those signals are no longer possible to ignore, it is high time for experts and intellectuals to reacquaint themselves with the country in which they live and to begin the task of describing what has befallen it since the dawn of the new century.

II

Consider the condition of the American economy. In some circles it is still widely believed, as one recent New York Times business-section article cluelessly insisted before the inauguration, that “Mr. Trump will inherit an economy that is fundamentally solid.” But this is patent nonsense. By now it should be painfully obvious that the U.S. economy has been in the grip of deep dysfunction since the dawn of the new century. And in retrospect, it should also be apparent that America’s strange new economic maladies were almost perfectly designed to set the stage for a populist storm.
Ever since 2000, basic indicators have offered oddly inconsistent readings on America’s economic performance and prospects. It is curious and highly uncharacteristic to find such measures so very far out of alignment with one another. We are witnessing an ominous and growing divergence between three trends that should ordinarily move in tandem: wealth, output, and employment. Depending upon which of these three indicators you choose, America looks to be heading up, down, or more or less nowhere.
From the standpoint of wealth creation, the 21st century is off to a roaring start. By this yardstick, it looks as if Americans have never had it so good and as if the future is full of promise. Between early 2000 and late 2016, the estimated net worth of American households and nonprofit institutions more than doubled, from $44 trillion to $90 trillion. (SEE FIGURE 1.)
Although that wealth is not evenly distributed, it is still a fantastic sum of money—an average of over a million dollars for every notional family of four. This upsurge of wealth took place despite the crash of 2008—indeed, private wealth holdings are over $20 trillion higher now than they were at their pre-crash apogee. The value of American real-estate assets is near or at all-time highs, and America’s businesses appear to be thriving. Even before the “Trump rally” of late 2016 and early 2017, U.S. equities markets were hitting new highs—and since stock prices are strongly shaped by expectations of future profits, investors evidently are counting on the continuation of the current happy days for U.S. asset holders for some time to come.
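As a quick check of that per-family average, assuming a 2016 U.S. population of about 320 million (an assumed figure, since the text gives only the aggregate):

\[
\frac{\$90\ \text{trillion}}{320\ \text{million persons}} \approx \$281{,}000\ \text{per person}, \qquad 4 \times \$281{,}000 \approx \$1.1\ \text{million per notional family of four.}
\]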
A rather less cheering picture, though, emerges if we look instead at real trends for the macro-economy. Here, performance since the start of the century might charitably be described as mediocre, and prospects today are no better than guarded.
The recovery from the crash of 2008—which unleashed the worst recession since the Great Depression—has been singularly slow and weak. According to the Bureau of Economic Analysis (BEA), it took nearly four years for America’s gross domestic product (GDP) to re-attain its late 2007 level. As of late 2016, total value added to the U.S. economy was just 12 percent higher than in 2007. (SEE FIGURE 2.) The situation is even more sobering if we consider per capita growth. It took America six and a half years—until mid-2014—to get back to its late 2007 per capita production levels. And in late 2016, per capita output was just 4 percent higher than in late 2007—nine years earlier. By this reckoning, the American economy looks to have suffered something close to a lost decade.
But there was clearly trouble brewing in America’s macro-economy well before the 2008 crash, too. Between late 2000 and late 2007, per capita GDP growth averaged less than 1.5 percent per annum. That compares with the nation’s long-term postwar 1948–2000 per capita growth rate of almost 2.3 percent, which in turn can be compared to the “snap back” tempo of 1.1 percent per annum since per capita GDP bottomed out in 2009. Between 2000 and 2016, per capita growth in America has averaged less than 1 percent a year. To state it plainly: With postwar, pre-21st-century rates for the years 2000–2016, per capita GDP in America would be more than 20 percent higher than it is today.
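The “more than 20 percent” counterfactual follows from simple compound-growth arithmetic. As a rough sketch, using the rates quoted above (roughly 2.3 percent postwar versus roughly 1 percent actual) over the 16 years from 2000 to 2016:

\[
\left(\frac{1.023}{1.010}\right)^{16} \approx 1.23,
\]

that is, per capita output would be on the order of 23 percent higher than it is today.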
The reasons for America’s newly fitful and halting macroeconomic performance are still a puzzlement to economists and a subject of considerable contention and debate.1 Economists are generally in consensus, however, in one area: They have begun redefining the growth potential of the U.S. economy downwards. The U.S. Congressional Budget Office (CBO), for example, suggests that the “potential growth” rate for the U.S. economy at full employment of factors of production has now dropped below 1.7 percent a year, implying a sustainable long-term annual per capita economic growth rate for America today of well under 1 percent.
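The step from 1.7 percent potential growth to a per capita figure implicitly subtracts population growth. As a sketch, assuming U.S. population growth of roughly 0.7 to 0.8 percent a year (an assumption; the text does not supply the figure):

\[
1.7\% - 0.75\% \approx 0.95\%\ \text{per capita, per year,}
\]

which lands below 1 percent, and lower still under any further downward revision of the potential-growth estimate.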
Then there is the employment situation. If 21st-century America’s GDP trends have been disappointing, labor-force trends have been utterly dismal. Work rates have fallen off a cliff since the year 2000 and are at their lowest levels in decades. We can see this in the Bureau of Labor Statistics (BLS) estimates of the civilian employment rate, the jobs-to-population ratio for adult civilian men and women. (SEE FIGURE 3.) Between early 2000 and late 2016, the overall work rate for Americans age 20 and older underwent a drastic decline, plunging by almost 5 percentage points (from 64.6 percent to 59.7 percent). Unless you are a labor economist, you may not appreciate just how severe a falloff in employment such numbers attest to. Postwar America never experienced anything comparable.
From peak to trough, the collapse in work rates for U.S. adults between 2008 and 2010 was roughly twice the amplitude of what had previously been the country’s worst postwar recession, back in the early 1980s. In that previous steep recession, it took America five years to re-attain the adult work rates recorded at the start of 1980. This time, the U.S. job market has as yet, in early 2017, scarcely begun to claw its way back up to the work rates of 2007—much less back to the work rates from early 2000.
As may be seen in Figure 3, U.S. adult work rates never recovered entirely from the recession of 2001—much less the crash of ’08. And the work rates being measured here include people who are engaged in any paid employment—any job, at any wage, for any number of hours of work at all.
On Wall Street and in some parts of Washington these days, one hears that America has gotten back to “near full employment.” For Americans outside the bubble, such talk must seem nonsensical. It is true that the oft-cited “civilian unemployment rate” looked pretty good by the end of the Obama era—in December 2016, it was down to 4.7 percent, about the same as it had been back in 1965, at a time of genuine full employment. The problem here is that the unemployment rate only tracks joblessness for those still in the labor force; it takes no account of workforce dropouts. Alas, the exodus out of the workforce has been the big labor-market story for America’s new century. (At this writing, for every unemployed American man between 25 and 55 years of age, there are another three who are neither working nor looking for work.) Thus the “unemployment rate” increasingly looks like an antique index devised for some earlier and increasingly distant war: the economic equivalent of a musket inventory or a cavalry count.
By the criterion of adult work rates, by contrast, employment conditions in America remain remarkably bleak. From late 2009 through early 2014, the country’s work rates more or less flatlined. So far as can be told, this is the only “recovery” in U.S. economic history in which that basic labor-market indicator almost completely failed to respond.
Since 2014, there has finally been a measure of improvement in the work rate—but it would be unwise to exaggerate the dimensions of that turnaround. As of late 2016, the adult work rate in America was still at its lowest level in more than 30 years. To put things another way: If our nation’s work rate today were back up to its start-of-the-century highs, well over 10 million more Americans would currently have paying jobs.
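That figure can be reproduced by applying the lost percentage points to the adult population. As a rough sketch, assuming a civilian population age 20 and older of roughly 240 million in 2016 (an assumed base, not stated in the text):

\[
(64.6\% - 59.7\%) \times 240\ \text{million} \approx 4.9\% \times 240\ \text{million} \approx 12\ \text{million jobs,}
\]

comfortably “well over 10 million.”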
There is no way to sugarcoat these awful numbers. They are not a statistical artifact that can be explained away by population aging, or by increased educational enrollment for adult students, or by any other genuine change in contemporary American society. The plain fact is that 21st-century America has witnessed a dreadful collapse of work.
For an apples-to-apples look at America’s 21st-century jobs problem, we can focus on the 25–54 population—known to labor economists for self-evident reasons as the “prime working age” group. For this key labor-force cohort, work rates in late 2016 were down almost 4 percentage points from their year-2000 highs. That is a jobs gap approaching 5 million for this group alone.
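The same arithmetic reproduces this gap. As a sketch, assuming a 25–54 population of roughly 125 million (again an assumed base):

\[
4\% \times 125\ \text{million} \approx 5\ \text{million jobs.}
\]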
It is not only that work rates for prime-age males have fallen since the year 2000—they have, but the collapse of work for American men is a tale that goes back at least half a century. (I wrote a short book last year about this sad saga.2) What is perhaps more startling is the unexpected and largely unnoticed fall-off in work rates for prime-age women. In the U.S. and all other Western societies, postwar labor markets underwent an epochal transformation: after World War II, work rates for prime-age women surged, and continued to rise—until the year 2000. Since then, they too have declined. Current work rates for prime-age women are back to where they were a generation ago, in the late 1980s. The 21st-century U.S. economy has been brutal for male and female laborers alike—and the wreckage in the labor market has been sufficiently powerful to cancel, and even reverse, one of our society’s most distinctive postwar trends: the rise of paid work for women outside the household.
In our era of no more than indifferent economic growth, 21st-century America has somehow managed to produce markedly more wealth for its wealthholders even as it provided markedly less work for its workers. And trends for paid hours of work look even worse than the work rates themselves. Between 2000 and 2015, according to the BEA, total paid hours of work in America increased by just 4 percent (as against a 35 percent increase for 1985–2000, the 15-year period immediately preceding this one). Over the 2000–2015 period, however, the adult civilian population rose by almost 18 percent—meaning that paid hours of work per adult civilian have plummeted by a shocking 12 percent thus far in our new American century.
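The 12 percent figure is simply the ratio of the two growth factors just cited:

\[
\frac{1.04}{1.18} \approx 0.88,
\]

a decline of roughly 12 percent in paid hours of work per adult civilian.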
This is the terrible contradiction of economic life in what we might call America’s Second Gilded Age (2000—). It is a paradox that may help us understand a number of overarching features of our new century. These include the consistent findings that public trust in almost all U.S. institutions has sharply declined since 2000, and that growing majorities hold that America is “heading in the wrong direction.” It also helps explain why overwhelming majorities of respondents in public-opinion surveys continue to tell pollsters, year after year, that our ever-richer America is still stuck in the middle of a recession. The mounting economic woes of the “little people” may not have been generally recognized by those inside the bubble, or even by many bubble inhabitants who claimed to be economic specialists—but they proved to be potent fuel for the populist fire that raged through American politics in 2016.

III

So general economic conditions for many ordinary Americans—not least of these, Americans who did not fit within the academy’s designated victim classes—have been rather more insecure than those within the comfort of the bubble understood. But the anxiety, dissatisfaction, anger, and despair that rage within our borders today are not wholly a reaction to the way our economy is misfiring. On the nonmaterial front, it is likewise clear that many things in our society are going wrong and yet seem beyond our powers to correct.
Some of these gnawing problems are by no means new: A number of them (such as family breakdown) can be traced back at least to the 1960s, while others are arguably as old as modernity itself (anomie and isolation in big anonymous communities, secularization and the decline of faith). But a number have roared down upon us by surprise since the turn of the century—and others have redoubled with fearsome new intensity since roughly the year 2000.
American health conditions seem to have taken a seriously wrong turn in the new century. It is not just that overall health progress has been shockingly slow, despite the trillions we devote to medical services each year. (Which “Cold War babies” among us would have predicted we’d live to see the day when life expectancy in East Germany was higher than in the United States, as is the case today?)
Alas, the problem is not just slowdowns in health progress—there also appears to have been positive retrogression for broad and heretofore seemingly untroubled segments of the national population. A short but electrifying 2015 paper by Anne Case and Nobel Economics Laureate Angus Deaton documented a mortality trend that had gone almost unnoticed until then: rising death rates for middle-aged U.S. whites. By Case and Deaton’s reckoning, death rates rose slightly over the 1999–2013 period for all non-Hispanic white men and women 45–54 years of age—but they rose sharply for those with high-school degrees or less, and for this less-educated grouping most of the rise in death rates was accounted for by suicides, chronic liver diseases and cirrhosis, and poisonings (including drug overdoses).
Though some researchers, for highly technical reasons, suggested that the mortality spike might not have been quite as sharp as Case and Deaton reckoned, there is little doubt that the spike itself has taken place. Health has been deteriorating for a significant swath of white America in our new century, thanks in large part to drug and alcohol abuse. All this sounds a little too close for comfort to the story of modern Russia, with its devastating vodka- and drug-binging health setbacks. Yes: It can happen here, and it has. Welcome to our new America.
In December 2016, the Centers for Disease Control and Prevention (CDC) reported that for the first time in decades, life expectancy at birth in the United States had dropped very slightly (to 78.8 years in 2015, from 78.9 years in 2014). Though the decline was small, it was statistically meaningful—rising death rates were characteristic of males and females alike; of blacks and whites and Latinos together. (Only black women avoided mortality increases—their death levels were stagnant.) A jump in “unintentional injuries” accounted for much of the overall uptick.
It would be unwarranted to read too much portent into a single year’s mortality changes; slight annual drops in U.S. life expectancy have occasionally been registered in the past, too, followed by continued improvements. But given other developments we are witnessing in our new America, we must wonder whether the 2015 decline in life expectancy is just a blip, or the start of a new trend. We will find out soon enough. It cannot be encouraging, though, that the Human Mortality Database, an international consortium of demographers who vet national data to improve comparability between countries, has suggested that health progress in America essentially ceased in 2012—that the U.S. gained on average only about a single day of life expectancy at birth between 2012 and 2014, before the 2015 turndown.
The opioid epidemic of pain pills and heroin that has been ravaging and shortening lives from coast to coast is a new plague for our new century. The terrifying novelty of this particular drug epidemic, of course, is that it has gone (so to speak) “mainstream” this time, effecting breakout from disadvantaged minority communities to Main Street White America. By 2013, according to a 2015 report by the Drug Enforcement Administration, more Americans died from drug overdoses (largely but not wholly opioid abuse) than from either traffic fatalities or guns. The dimensions of the opioid epidemic in the real America are still not fully appreciated within the bubble, where drug use tends to be more carefully limited and recreational. In Dreamland, his harrowing and magisterial account of modern America’s opioid explosion, the journalist Sam Quinones notes in passing that “in one three-month period” just a few years ago, according to the Ohio Department of Health, “fully 11 percent of all Ohioans were prescribed opiates.” And of course many Americans self-medicate with licit or illicit painkillers without doctors’ orders.
In the fall of 2016, Alan Krueger, former chairman of the President’s Council of Economic Advisers, released a study that further refined the picture of the real existing opioid epidemic in America: According to his work, nearly half of all prime working-age male labor-force dropouts—an army now totaling roughly 7 million men—currently take pain medication on a daily basis.
We already knew from other sources (such as BLS “time use” surveys) that the overwhelming majority of the prime-age men in this un-working army generally don’t “do civil society” (charitable work, religious activities, volunteering), or for that matter much in the way of child care or help for others in the home either, despite the abundance of time on their hands. Their routine, instead, typically centers on watching—watching TV, DVDs, Internet, hand-held devices, etc.—and indeed watching for an average of 2,000 hours a year, as if it were a full-time job. But Krueger’s study adds a poignant and immensely sad detail to this portrait of daily life in 21st-century America: In our mind’s eye we can now picture many millions of un-working men in the prime of life, out of work and not looking for jobs, sitting in front of screens—stoned.
But how did so many millions of un-working men, whose incomes are limited, manage en masse to afford a constant supply of pain medication? OxyContin is not cheap. As Dreamland carefully explains, one main mechanism today has been the welfare state: more specifically, Medicaid, Uncle Sam’s means-tested health-benefits program. Here is how it works (we are with Quinones in Portsmouth, Ohio):
[The Medicaid card] pays for medicine—whatever pills a doctor deems that the insured patient needs. Among those who receive Medicaid cards are people on state welfare or on a federal disability program known as SSI. . . . If you could get a prescription from a willing doctor—and Portsmouth had plenty of them—Medicaid health-insurance cards paid for that prescription every month. For a three-dollar Medicaid co-pay, therefore, addicts got pills priced at thousands of dollars, with the difference paid for by U.S. and state taxpayers. A user could turn around and sell those pills, obtained for that three-dollar co-pay, for as much as ten thousand dollars on the street.
In 21st-century America, “dependence on government” has thus come to take on an entirely new meaning.
You may now wish to ask: What share of prime-working-age men these days are enrolled in Medicaid? According to the Census Bureau’s SIPP survey (Survey of Income and Program Participation), as of 2013, over one-fifth (21 percent) of all civilian men between 25 and 55 years of age were Medicaid beneficiaries. For prime-age people not in the labor force, the share was over half (53 percent). And for un-working Anglos (non-Hispanic white men not in the labor force) of prime working age, the share enrolled in Medicaid was 48 percent.
By the way: Of the entire un-working prime-age male Anglo population in 2013, nearly three-fifths (57 percent) were reportedly collecting disability benefits from one or more government disability programs. Disability checks and means-tested benefits cannot support a lavish lifestyle. But they can offer a permanent alternative to paid employment, and for growing numbers of American men, they do. The rise of these programs has coincided with the death of work for larger and larger numbers of American men not yet of retirement age. We cannot say that these programs caused the death of work for millions upon millions of younger men: What is incontrovertible, however, is that they have financed it—just as Medicaid inadvertently helped finance America’s immense and increasing appetite for opioids in our new century.
It is intriguing to note that America’s nationwide opioid epidemic has not been accompanied by a nationwide crime wave (excepting of course the apparent explosion of illicit heroin use). Just the opposite: As best can be told, national victimization rates for violent crimes and property crimes have both reportedly dropped by about two-thirds over the past two decades.3 The drop in crime over the past generation has done great things for the general quality of life in much of America. There is one complication from this drama, however, that inhabitants of the bubble may not be aware of, even though it is all too well known to a great many residents of the real America. This is the extraordinary expansion of what some have termed America’s “criminal class”—the population sentenced to prison or convicted of felony offenses—in recent decades. This trend did not begin in our century, but it has taken on breathtaking enormity since the year 2000.
Most well-informed readers know that the U.S. currently has a higher share of its populace in jail or prison than almost any other country on earth, that Barack Obama and others talk of our criminal-justice process as “mass incarceration,” and that well over 2 million men were in prison or jail in recent years.4 But only a tiny fraction of all living Americans ever convicted of a felony is actually incarcerated at this very moment. Quite the contrary: Maybe 90 percent of all sentenced felons today are out of confinement and living more or less among us. The reason: the basic arithmetic of sentencing and incarceration in America today. Correctional release and sentenced community supervision (probation and parole) guarantee a steady annual “flow” of convicted felons back into society to augment the very considerable “stock” of felons and ex-felons already there. And this “stock” is by now truly enormous.
One forthcoming demographic study by Sarah Shannon and five other researchers estimates that the cohort of current and former felons in America very nearly reached 20 million by the year 2010. If its estimates are roughly accurate, and if America’s felon population has continued to grow at more or less the same tempo traced out for the years leading up to 2010, we would expect it to surpass 23 million persons by the end of 2016 at the latest. Very rough calculations might likewise suggest that America’s population of non-institutionalized adults with a felony conviction somewhere in their past had almost certainly broken the 20 million mark by the end of 2016. A little more rough arithmetic suggests that about 17 million men in our general population have a felony conviction somewhere in their CV. That works out to one of every eight adult males in America today.
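The extrapolation from 20 million to 23 million is consistent with roughly linear growth. As a sketch, assuming the pre-2010 trend works out to an increase of about half a million persons per year (an assumption about the underlying tempo, not given in the study as quoted):

\[
20\ \text{million} + 6\ \text{years} \times 0.5\ \text{million per year} = 23\ \text{million by the end of 2016.}
\]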
We have to use rough estimates here, rather than precise official numbers, because the government does not collect any data at all on the size or socioeconomic circumstances of this population of 20 million, and never has. Amazing as this may sound and scandalous though it may be, America has, at least to date, effectively banished this huge group—a group roughly twice the total size of our illegal-immigrant population and an adult population larger than that in any state but California—to a near-total and seemingly unending statistical invisibility. Our ex-cons are, so to speak, statistical outcasts who live in a darkness our polity does not care enough to illuminate—beyond the scope or interest of public policy, unless and until they next run afoul of the law.
Thus we cannot describe with any precision or certainty what has become of those who make up our “criminal class” after their (latest) sentencing or release. In the most stylized terms, however, we might guess that their odds in the real America are not all that favorable. And when we consider some of the other trends we have already mentioned—employment, health, addiction, welfare dependence—we can see the emergence of a malign new nationwide undertow, pulling downward against social mobility.
Social mobility has always been the jewel in the crown of the American mythos and ethos. The idea (not without a measure of truth to back it up) was that people in America are free to achieve according to their merit and their grit—unlike in other places, where they are trapped by barriers of class or the misfortune of misrule. Nearly two decades into our new century, there are unmistakable signs that America’s fabled social mobility is in trouble—perhaps even in serious trouble.
Consider the following facts. First, according to the Census Bureau, geographical mobility in America has been on the decline for three decades, and in 2016 the annual movement of households from one location to the next was reportedly at an all-time (postwar) low. Second, as a study by three Federal Reserve economists and a Notre Dame colleague demonstrated last year, “labor market fluidity”—the churning between jobs that among other things allows people to get ahead—has been on the decline in the American labor market for decades, with no sign as yet of a turnaround. Finally, and not least important, a December 2016 report by the “Equality of Opportunity Project,” a team led by the formidable Stanford economist Raj Chetty, calculated that the odds of a 30-year-old’s earning more than his parents at the same age were now just 51 percent: down from 86 percent 40 years ago. Other researchers who have examined the same data argue that the odds may not be quite as low as the Chetty team concludes, but agree that the chances of surpassing one’s parents’ real income have been on the downswing and are probably lower now than ever before in postwar America.
Thus the bittersweet reality of life for real Americans in the early 21st century: Even though the American economy still remains the world’s unrivaled engine of wealth generation, those outside the bubble may have less of a shot at the American Dream than has been the case for decades, maybe generations—possibly even since the Great Depression.

IV

The funny thing is, people inside the bubble are forever talking about “economic inequality,” that wonderful seminar construct, and forever virtue-signaling about how personally opposed they are to it. By contrast, “economic insecurity” is akin to a phrase from an unknown language. But if we were somehow to find a “Google Translate” function for communicating from real America into the bubble, an important message might be conveyed:
The abstraction of “inequality” doesn’t matter a lot to ordinary Americans. The reality of economic insecurity does. The Great American Escalator is broken—and it badly needs to be fixed.
With the election of 2016, Americans within the bubble finally learned that the 21st century has gotten off to a very bad start in America. Welcome to the reality. We have a lot of work to do together to turn this around.

Friday, February 3, 2017

Davey Morrison, Sacrament meeting talk Jan 2017

Here's my talk from church today. I had to make a bunch of edits and change a lot of things around to fit the time constraint, so here's the director's cut:
In Doctrine & Covenants 88:118, we are told, "Seek ye diligently and teach one another words of wisdom; yea, seek ye out of the best books words of wisdom; seek learning, even by study and also by faith."
I was asked today to speak on our ward conference theme, the power of stories and of storytelling. I would like to approach this topic with these words from the Doctrine and Covenants in mind. How can we seek and how can we teach wisdom from the best books? How can we seek learning by both study and by faith?
In college and in my work since college, I've spent a lot of time studying stories and storytelling. In the BYU film program, in addition to telling our own stories, we watched films and read books and studied different methods of interpretation and criticism--ways of understanding the stories we are told. I want to talk about a few of those today, and how they apply more broadly to gospel principles.
First: What's a story? The most basic definition of "story" is something with a beginning, a middle, and an end. By choosing where something begins and where it ends (and by selecting what we include in the middle), we create stories--even true stories are created in this way. Fiction or non-fiction, we are creating stories all the time as we access, reconstruct, and retell memories, ideas, events, stories we have heard and stories from our imaginations, stories about our lives, our relationships, our careers, our day--even the Plan of Salvation is a story, with a beginning in the pre-existence, a middle in mortality, and an end in the eternities. Stories are containers for meaning, and telling stories is an act of creation. And creation is one of the ultimate characteristics of godliness.
But a storyteller can only do half the work. God created the heavens and the earth; then, on the sixth day, God created men and women to experience, take part in, take care of, nurture and add to that creation--to recreate. In all communication, there is both a sender and a receiver, and we take part in creation when we hear stories, understand them, interpret them, build on them, and liken them to ourselves. We believe that our ultimate divine destiny is to become like God--a creator.
In 1978, Hugh Nibley published a collection of essays called On the Timely and the Timeless. The first two methods of interpretation I want to talk about have to do with these concepts: The timely and the timeless. First, the timeless. In high school, many of us learned to read for theme. This is a method of looking for clues in the text--looking for how the author develops the story and the characters, for motifs and ideas and language and images and other stylistic clues the author uses to build a thematic argument or ask a thematic question. This is a structural understanding of story--we create meaning through the linear development of ideas across beginning, middle, and end. We can "construct" our own reading of the text in the same way the text itself is constructed--by drawing out clues and building them into an argument.
This is a useful approach to take when we study the scriptures. While individual passages or verses may jump out at us as powerful or meaningful, we should also consider these passages in the context of the whole. How does the book of Genesis begin and end? What about The Book of Mormon? What are some of the thematic throughlines, and how are they developed? Who are the characters, and how do they change? What clues can we find to support our reading? These tools can lead us to the timeless ideas, themes, and questions at the heart of any text we encounter, including and especially a text as rich and layered with meaning as the Bible, Book of Mormon, Doctrine and Covenants, or Pearl of Great Price.
Some stories, like some of Jesus' parables, may be difficult to understand, but have a clear message. Others are more ambiguous. The Book of Job is a debate about the nature of God and the problem of evil. Why does God allow bad things to happen to good people? At the end of the book, after a long debate between Job and his friends, God appears, but God offers no easy answers. Instead, God responds to Job's questions with more questions:
"Where wast thou when I laid the foundations of the earth? declare, if thou hast understanding. Who hath laid the measures thereof, if thou knowest? or who hath stretched the line upon it? Whereupon are the foundations thereof fastened? or who laid the corner stone thereof; When the morning stars sang together, and all the sons of God shouted for joy?"
God offers no answers, only the self-evident fact of God's existence in the midst of Job's suffering.
Next, the timely. An historical or cultural reading of a text situates it within the time and place of its writing. A book like Uncle Tom's Cabin, for example, may read as antiquated, simplistic, even offensive in its understanding of race and racism today, but its place as a revolutionary document in American history is undeniable. We may also draw on our understanding of the author's biography, the things we know about his or her life experiences and personal beliefs, along with the rest of their writing, to inform our understanding of a story. The works of 19th century Russian novelist Fyodor Dostoevsky, with their obsessive focus on crime and punishment, deep-seated dread, and miraculous redemption, gain added depth and dimension when we learn that Dostoevsky was, at age 28, wrongfully imprisoned, sentenced to death, and even led to the firing squad before being reprieved at the literal last moment.
We can and should take these reading skills into our scripture study. When and where and how were these stories written? To whom? By whom? What messages were they trying to communicate? What messages might they be unintentionally communicating? We believe scripture to be the word of God as far as it is translated correctly. We also believe prophets to be fallible men, inspired, but capable of error. A greater understanding of the time and place in which scripture was written, along with its intended rhetorical purpose, may help us navigate these complicated books as combinations of history, inspiration, prophecy, poetry, storytelling, and so on.
We may also read a text phenomenologically. Phenomenological inquiry is based on the premise that, because language is metaphor--a set of agreed-upon symbols to communicate ideas--meaning in a text is never entirely fixed: each of us as readers brings our own meaning to the text, creating meaning through our engagement with it. Each of us, writer and reader, is reaching for some spiritual truth that always lies just beyond our grasp--truth that may be sublingual, or superlingual--using the imperfect tool of language to touch it, like blind men feeling an elephant and trying to describe it. This incompleteness of language is the subject of the story of the Tower of Babel; Joseph Smith once described it as “the little narrow prison almost as it were total darkness of paper pen and ink and a crooked broken scattered and imperfect language.” Because of this imperfection, our reading is, by necessity, colored by what we bring to it--our personal life history, experiences, cultural context, set of values, and spiritual impressions--all as valid and as potentially valuable as authorial intent. This kind of reading might be what Nephi calls "likening the scriptures unto ourselves." We can, should, and can't help but do this with all stories, scriptural or otherwise. I imagine if we asked everyone in the room what their favorite book or movie is, we would hear a lot of different answers. Even more interesting would be to follow up with, "Why?" Hearing what a story means to someone is always an exciting way of connecting with that person, and offers insight into their life experiences, their values, and their beliefs.
Hannah and Her Sisters is my favorite movie. It is a movie that came along at just the right time in my life. One of the characters, Mickey Sachs, worries he may have a brain tumor. Mickey goes in for further tests and soon learns he is healthy and everything is fine. However, this brush with mortality sends him into an existential tailspin. Mickey begins questioning everything--what does anything matter if one day he's going to die? Like Job, he searches for meaning in the face of mortality's inevitable suffering. Mickey quits his job working on a TV show and turns to religion and philosophy for the first time in his life, trying on different religions like one might try on different pairs of pants--seeing what fits, what's comfortable, which cut he likes best. Nothing brings him consolation. Finally, in the midst of a suicidal depression, Mickey goes to the movies. He sees some singing and dancing on the screen, and he realizes that even if he doesn't have all the answers, he does have an opportunity to enjoy and appreciate the life he's been given. As with Job, the questions are not necessarily resolved; they are transcended. Mickey realizes, in a sense, the teaching of Jesus from Luke 17, that "The kingdom of God cometh not with observation: Neither shall they say, Lo here! or, lo there! for, behold, the kingdom of God is within you." While Mickey's questions about what comes after this life aren't answered for him, he does find God's mercy and a sense of peace and joy in appreciating the life he has. This was a message I needed to hear when I first stumbled upon the movie, and I love it still, both because it continues to resonate with me and because it reminds me of that personal connection the first time I saw it. I'm sure if I asked you about one of your favorite stories, you would share a similarly personal experience.
Finally, there is the redemptive reading. The definition of a redemptive reading is perhaps best explained by example. In October 1944, Austrian neurologist Viktor Frankl entered the Auschwitz concentration camp in German-occupied Poland. He would spend the next six months moving from one camp to another, witnessing and experiencing unthinkable horror, losing his wife and his parents. In 1945, the war ended and Frankl was freed. The following year, he wrote about his experiences in the book Man's Search for Meaning. "Everything," he writes, "can be taken from a man but one thing: the last of the human freedoms--to choose one's attitude in any given set of circumstances, to choose one's own way." Confronted by the worst of humanity, Frankl still managed to construct a positive philosophy through the work of storytelling, through the deliberate construction of meaning out of seemingly chaotic pain and horror, redeeming his experience through a constructive reading of his life. It would be difficult to conceive of a more powerful example of a redemptive reading.
We have discussed four types of reading: The formal and structural, the historical and biographical, the phenomenological, and the redemptive. Finally, I would like to turn more deeply to an example from the scriptures, to see how all of these different modes of interpretation together can provide us greater context for and unlock deeper meanings within a text.
The Book of Mormon as an historical record ends in death, bloodshed, rape, murder, cannibalism, and warfare. The book is, as writer Terryl Givens has pointed out, a story of "sibling jealousies...culminating in a tragic and genocidal finale painfully deferred until the record's final pages." The story ends with Moroni, the sole Nephite survivor, wandering the earth, hunted, with the preservation of a sacred record and a people's history resting squarely on his forsaken shoulders.
Yet the final chapters of the Book of Mormon contain some of the most deeply optimistic passages in any book of scripture. As he concludes the record, in the midst of despair and desolation, Moroni returns to the document's fundamental thematic throughline--the inherently redemptive hope at the heart of Christian faith. Writing, apparently, to a post-Restoration readership (the only audience to which he had access), Moroni recounts in chapter 7 one of the sermons of his father, Mormon:
"For I remember the word of God, which saith by their works ye shall know them; for if their works be good, then they are good also. For behold, God hath said a man being evil cannot do that which is good; for if he offereth a gift, or prayeth unto God, except he shall do it with real intent it profiteth him nothing." (Moroni 7:5-6)
Moroni, still recounting his father's sermon, goes on to very clearly delineate that which is of God from that which is not, saying:
"Wherefore, take heed, my beloved brethren, that ye do not judge that which is evil to be of God, or that which is good and of God to be of the devil." (Moroni 7:14)
There is an unmistakable, black-and-white kind of clarity to this formulation—“a man being evil cannot do that which is good.” There is a fundamental, mathematical precision here; gray area is entirely eliminated. There is good and there is evil. Every man and woman must make a choice, both in his or her actions (in the gift he or she offereth) and in the judgment of the offerings of others. No man can serve two masters.
But this potentially polarized line of thinking is complicated by introducing an implicit spiritual phenomenology into the equation—there is the sender, but there is also the receiver. And, as always, “by their fruits [we] shall know them” (Moroni 7:5). Moroni writes:
"Behold, that which is of God inviteth and enticeth to do good continually; wherefore, every thing which inviteth and enticeth to do good, and to love God, and to serve him, is inspired of God." (Moroni 10:13)
If these are the fruits by which we know them, then the burden of meaning lies in the receiver’s interpretation of the “gift” to an equal or greater degree than it lies in the sender’s intention—different gifts invite and entice different receivers to “do good, to love God, and to serve him”; as we know, what yields fruit for one may not yield the same fruit for another. The stories that have been important and meaningful to you may be and often are different from the stories that have been important and meaningful to me. To liken it to and expound upon Jesus' parable, some of our soils have the nutrients necessary to grow one kind of seed and others have the nutrients needed to grow another kind.
In John Milton’s Paradise Lost, Satan makes the case that, “The mind is its own place, and in itself can make a Heav’n of Hell, a Hell of Heav’n.” This might be a dangerous notion if we hope to construct a theology based upon the power of the sender, and there's a reason that Milton gives the line to the devil. Yet Joseph Smith expressed this very same sentiment centuries later when he said, “And if we go to hell, we will turn the devils out of doors and make a heaven of it.” Joseph’s vision of the hereafter is radically physical—a heaven with roads and architecture and dimensions—yet it remains rooted in a spiritual state of mind; like all else in creation, Zion becomes physicalized only after a pre-existing spiritual genesis. Jesus' kingdom of God is within us—in other words, a state of mind (as Milton’s Satan suggested), a spiritual state. Heaven is a kind of tuning in to the governing creative powers of the universe, a sense of alignment that provides the necessary groundwork for Joseph’s grand visions of city-building. Once we are of “one heart and one mind,” with “no poor among [us]” (Moses 7:18), the construction can commence. Zion is built with bricks and mortar—but the kingdom of heaven begins and ends within us.
If that's the case, then we can find no more humbling, no more inspiring, and no more moving an act of redemptive readership in the Book of Mormon than Moroni’s conclusion to his father's sermon at the end of chapter seven. Surrounded by death, brutality, and apostasy, in constant fear for his life, writing to an audience who wouldn't be born for centuries to come, Moroni has every external reason to be unhappy—but look at the words he chooses to record: “And now, my brethren, how is it possible that ye can lay hold upon every good thing?” (Moroni 7:20).
With no one left to talk to, his family and friends murdered, Moroni still sees this fundamentally optimistic question as a fitting conclusion to a grim, painful record of his people's history: “How is it possible that ye can lay hold upon every good thing?” In the midst of a hell, Moroni fashions a heaven through the transformative lens of the Spirit. There is more good to be found in the world—even a fallen, utterly infernal world—“than we can ever lay hold upon.” In one of the final chapters of the Book of Mormon, Moroni concludes by providing another testament of Jesus Christ and another testimony of the Atonement.
With this rhetorical question, Moroni bears testimony of the redemptive, healing power of Jesus—and the redemptive, healing power that lies within us when we access Him. Moroni is not physically translated to another plane of existence, but his spiritual eyes are opened to the entirely unexpected beauty around him; the power of Christ’s Atonement is perhaps not that it provides a clean slate, but that it does something more extraordinary—it redeems. It doesn’t remove our negative experiences or past mistakes or indiscretions, but it transforms all experiences into good and constructive ones. Jesus' Atonement is the ultimate power of story--through experiencing each of our stories individually, He is able to understand us, and it is this empathic understanding we seek when we yearn for redemption. This is the central faith and hope of Christianity, a faith and hope that turns a hell into a heaven (and the absence of which can make a heaven hell)—it is the kingdom of God that lies within us.
It is my hope that we can all access the divine creative power within ourselves and the ultimate power of Christ's Atonement to reshape, reframe, and redeem our own stories--to make sense out of the pain, sin, and shortcoming we all inevitably experience in this life. I believe this gift is readily accessible to any of us at any time in our lives if we are willing to open our hearts to the power of Jesus' love.