Books
-
I’m Your Man: The Life of Leonard Cohen
This book succeeds as a biography better than almost any other I’ve read. All Leonard Cohen questions are answered. All questions about the times in which he lived are answered. “What Leonard Cohen means to me (the author)” is touched upon lightly and then put down. As far as I can tell, no relevant musical or poetic detail is omitted, including my long-running question of “Why did he shift away from the guitar in the 1980s?”
Highly recommended for anyone into Leonard Cohen – well written and very informative.
Things that surprised me
- He spent much more time in the Buddhist monastery than I originally thought
- Far more drug use, especially amphetamines, than I would have thought, particularly later in life (most people grow out of that sort of thing as they get older; Leonard grew into it)
- His work ethic and perfectionism were quite impressive
- The reason he moved away from guitar-based folk was not some artistic “growth” but a musical writer’s block regarding guitar accompaniment. His synthesizer accompaniment was not blocked, so he rolled with that.
- His youth and his old age lasted for long stretches; his middle age was quite short
Highlights
Many years later Edgar H. Cohen would go on to write Mademoiselle Libertine: A Portrait of Ninon de Lanclos, a biography published in 1970 of a seventeenth-century courtesan, writer and muse whose lovers included Voltaire and Molière, and who, after a period in a convent, emerged to establish a school where young French noblemen could learn erotic technique.
Leonard did not cry at the death of his father; he wept more when his dog Tinkie died a few years later. “I didn’t feel a profound sense of loss,” he said in a 1991 interview, “maybe because he was very ill throughout my entire childhood. It seemed natural that he died. He was weak and he died. Maybe my heart is cold.”
Chapter Two of the hypnotism manual might have been written as career advice to the singer and performer Leonard would become. It cautioned against any appearance of levity and instructed, “Your features should be set, firm and stern. Be quiet in all your actions. Let your voice grow lower, lower, till just above a whisper. Pause a moment or two. You will fail if you try to hurry.”
Since the age of thirteen Leonard had taken to going out late at night, two or three nights a week, wandering alone through the seedier streets of Montreal. Before the Saint Lawrence Seaway was built the city was a major port, the place where all the cargo destined for central North America went to be offloaded from oceangoing freighters and put on canal boats and taken up to the Great Lakes or sent by rail to the West. At night the city swarmed with sailors, longshoremen and passengers from the cruise ships that docked in the harbor, and welcoming them were countless bars, which openly flouted the law requiring that they close at three A.M.
Lorca was a dramatist and a collector of old Spanish folk songs as well as a poet, and his poems were dark, melodious, elegiac and emotionally intense, honest and at the same time self-mythologizing. He wrote as if song and poetry were part of the same breath. Through his love for Gypsy culture and his depressive cast of mind he introduced Leonard to the sorrow, romance and dignity of flamenco. Through his political stance he introduced Leonard to the sorrow, romance and dignity of the Spanish Civil War. Leonard was very pleased to meet them both.
Over the subsequent years, whenever interviewers would ask him what drew him to poetry, Leonard offered an earthier reason: getting women. Having someone confirm one’s beauty in verse was a big attraction for women, and, before rock ’n’ roll came along, poets had the monopoly. But in reality, for a boy of his age, generation and background, “everything was in my imagination,” Leonard said. “We were starved. It wasn’t like today, you didn’t sleep with your girlfriend. I just wanted to embrace someone.”
In the summer of 1950, when Leonard left once again for summer camp—Camp Sunshine in Sainte-Marguerite—he took the guitar with him. Here he would begin playing folk songs, and discover for the first time the instrument’s possibilities when it came to his social life. You were still going to summer camp at age fifteen? “I was a counselor.
There were a lot of the Wobbly songs—I don’t know if you know that movement? A Socialist international workers union. Wonderful songs. ‘There once was a union maid / Who never was afraid / Of goons and ginks and company finks / And deputy sheriffs that made the raid . . . No you can’t scare me I’m stickin’ with the union.’ Great song.”
Leonard was clearly enthused. Some fifty years after his stay at Camp Sunshine he could still sing the songbook by heart from beginning to end.*
At the second lesson, the Spaniard started to teach Leonard the six-chord flamenco progression he had played the day before, and at the third lesson Leonard began learning the tremolo pattern. He practiced diligently, standing in front of a mirror, copying how the young man held the guitar when he played. His young teacher failed to arrive for their fourth lesson. When Leonard called the number of his boardinghouse, the landlady answered the phone. The guitar player was dead, she told him. He had committed suicide.
The streets around McGill University were named for august British men—Peel, Stanley, McTavish—its buildings constructed by solid, stony Scotsmen in solid Scottish stone.
Had someone told you the British Empire was run from McGill, you’d be forgiven for believing them; in September 1951, when Leonard started at McGill on his seventeenth birthday, it was the most perfect nineteenth-century city-within-a-city in North America.
The general attitude to bilingualism at that time was not a lot different, if less deity-specific, from that of the first female governor of Texas, Ma Ferguson: “If the English language was good enough for Jesus Christ, it’s good enough for everybody.”
Fraternities and presidencies might appear surprisingly pro-establishment for a youth who had shown himself to have Socialist tendencies and a poetic inclination, but Leonard, as Arnold Steinberg notes, “is not antiestablishment and never was, except that he has never done what the establishment does. But that doesn’t make him antiestablishment.
But in 1952, between his first and second years, Leonard formed his first band with two university friends, Mike Doddman and Terry Davis. The Buckskin Boys was a country and western trio (Mort had not yet taken up the banjo or it might have been a quartet), which set about cornering the Montreal square-dance market.
Mostly, though, he played guitar—alone, in the quadrangle, at the frat house, or anywhere there was a party. It wasn’t a performance; it was just something he did. Leonard with a guitar was as familiar a sight as Leonard with a notebook.
Leonard, even before he started to write his own stuff, was relentless. He would play a song, whether it was ‘Home on the Range’ or whatever, over and over and over all day, play it on his guitar and sing it. When he was learning a song he would play it thousands of times, all day, for days and days and weeks, the same song, over and over, fast and slow, faster, this and that. It would drive you crazy. It was the same when he started to write his own stuff. He still works that way. It still takes him four years to write a lyric because he’s written twenty thousand verses or something.”
That sense of a lost Eden, of something beautiful that did not work out or could not last, would be detectable in a good deal of Leonard’s work.
“I felt that what I wrote was beautiful and that beauty was the passport of all ideas,” Leonard would say in 1991.
Leonard liked the Beats. They did not return the sentiment. “I was writing very rhymed, polished verses and they were in open revolt against that kind of form, which they associated with the oppressive literary establishment. I felt close to those guys, and I later bumped into them here and there, although I can’t describe myself remotely as part of that circle.”10 Neither did he have any desire to join it.
or writing about himself, as he did when one professor, knowing when he was beaten, allowed Leonard to submit a term paper on Let Us Compare Mythologies.
He applied to the U.S. Department of the Interior Bureau of Indian Affairs in Washington, DC, for a teaching position on a reservation. The bureau, oddly, had little use for a Jewish poet from Montreal with electro-cycle turret lathe skills.
Recalls Aviva Layton, who went to Leonard’s first night with Irving to give moral support, “I don’t remember him reading poetry, I remember him singing and playing the guitar. He perched himself on a high, three-legged stool and he sang—his own songs. That magic that he had, whatever it was, you could see it there at these performances.”
Survival, in discussions of the mystery and motivations of Leonard Cohen, has tended to be left in the corner clutching an empty dance card while writers head for the more alluring sex, God and depression and haul them around the dance floor. There is no argument that between them these three have been a driving force in his life and work. But what served Leonard best was his survival instinct.
As Leonard walked through the town, he noticed that there were no cars. Instead there were donkeys, with a basket hung on either side, lumbering up and down the steep cobblestone streets between the port and the Monastery of the Prophet Elijah. It might have been an illustration from a children’s Bible.
On a small island with few telephones and little electricity, therefore no television, the ferry provided their news and entertainment, and their contact with the outside world.
The ritual, routine and sparsity of this life satisfied him immensely. It felt monastic somehow, except this was a monk with benefits; the Hydra arts colony had beaten the hippies to free love by half a decade.
He and I both carry komboloi—Greek worry beads; only Greek men do that. The beads have nothing to do with religion at all—in fact one of the Ancient Greek meanings of the word is ‘wisdom beads,’ indicating that men once used them to meditate and contemplate.”
He quoted himself saying, in his familiar partly humorous, partly truthful fashion, “I shouldn’t be in Canada at all. I belong beside the Mediterranean. My ancestors made a terrible mistake. But I have to keep coming back to Montreal to renew my neurotic affiliations.”
In this drab, run-down part of the Lower East Side, it looked like somebody had bombed a rainbow.
Alighting in Aberdeen, Trocchi made his way to London, where he registered as a heroin addict with the National Health Service and obtained his drug legally.
Leonard had the assistance, or at least the companionship, of a variety of drugs. He had a particular liking for Maxiton, generically dexamphetamine, a stimulant known outside of pharmaceutical circles as speed. He also had a fondness for its sweet counterpoint Mandrax, a hypnotic sedative, part happy pill, part aphrodisiac, very popular in the UK. They were as handsome a pair of pharmaceuticals as a hardworking writer could wish to meet; better yet, in Europe they could still be bought over the counter. Providing backup was a three-part harmony of hashish, opium and acid (the last of these three still legal at that time in Europe and most of North America).
That same year, her former partner became the first black person to be imprisoned under Britain’s Race Relations Act—a statute originally passed to protect immigrants from racism—after calling for the shooting of any black woman seen with a white man; Bacal is white.
The end of De Freitas/X/Malik’s story came in 1975, when he was hanged for murder. The Trinidad government ignored pleas for clemency from people in the U.S., UK and Canada, many of them celebrities. They included Angela Davis, Dick Gregory, Judy Collins and Leonard Cohen.
Leonard also argued to keep the title. It would appeal, he wrote, to “the diseased adolescents who compose my public.”
He put the albums on, Nadel wrote, “to the chagrin of everyone” besides Leonard, who listened “intently, solemnly” and announced to the room “that he would become the Canadian Dylan.”
At his decree, their singer and songwriter Lou Reed, a short, young, Jewish New Yorker, shared the spotlight with a tall, blond German in her late twenties. Nico, said Lou Reed, “set some kind of standard for incredible-looking people.”
While Dylan was babysitting her son, Ari—the result of her brief affair with the French movie star Alain Delon—Dylan wrote the song “I’ll Keep It with Mine,” which he gave to Nico.
“You’re Leonard Cohen, you wrote Beautiful Losers,” which nobody had read, it only sold a few copies in America. And it was Lou Reed.
Nico told Leonard she liked younger men and did not make an exception. Her young man du jour, her guitar player, was a fresh-faced singer-songwriter from Southern California, barely eighteen years old, named Jackson Browne. A surfer boy crossed with an angel, his natural good looks appeared unnatural alongside the cadaverous Warhol and his black-clad entourage.
bumped into Jim Morrison a couple of times but I did not know him well. And Hendrix—we actually jammed together one night in New York. I forget the name of the club, but I was there and he was there and he knew my song ‘Suzanne,’ so we kind of jammed on it.” You and Hendrix jammed on “Suzanne”? What did he do with it? “He was very gentle. He didn’t distort his guitar. It was just a lovely thing.
With “Suzanne” being such a powerful song and Collins such an evangelical cheerleader for its writer, Leonard was getting attention too, including from John Hammond, the leading A & R man at America’s foremost record company, Columbia.
Leonard knew how he wanted to sound, or at least how he did not want to, but as an untrained musician he lacked the language to explain it. He could not play as well as the session musicians, so he found them intimidating.
In 1964 Joni quit art school to be a folksinger, moving to Toronto and the coffeehouses around which the folk scene revolved. In February 1965 she gave birth to a daughter, the result of an affair with a photographer. A few weeks later she married folksinger Chuck Mitchell and gave the baby up for adoption. The marriage did not last. Joni left, taking his name with her, and moved into Greenwich Village, where she was living alone in a small hotel room when she met Leonard.
Any close inspection of Mitchell’s songs pre- and post-Leonard would seem to indicate that he had some effect on her work. Over the decades, Leonard and Joni have remained friends.
“He was tentative and earnest, very unpolished,” says Montreal music critic Juan Rodriguez. Nancy Bacal concurs. “He was horrified, just frozen. He told me, looking out at these people, how could he just become this other person?
producer and music publisher named Jeff Chase whom Mary Martin thought might prove helpful was brought in, and somewhere in the process Leonard appeared to have somehow signed over the songs to him.
When Fields walked into the room, he found the two women “pasting sequins one at a time in a coloring book,” an activity pursued after the age of seven only if a person is on speed.
He was involved and yet not involved—which described his general dealings with the Warhol set. They were more to his taste than the hippie scene on the West Coast that had begun to infiltrate New York: “There seemed to be something flabby about the hippie movement. They pulled flowers out of public gardens. They put them in guns, but they also left their campsites in a mess. No self-discipline,” he said.
This time, when Leonard arrived at Studio B for the first session, there were no musicians waiting for him, just his young producer and the two union-mandated engineers. (“Producers could only talk,” says Simon. “Unless you were in the union, you were strictly forbidden from touching any equipment, mics, mixing board, etc.”)
Three weeks and four sessions later, Leonard nailed “So Long, Marianne,” a song he had recorded more than a dozen times with two producers and with two different titles. In total, since May 1967 Leonard had recorded twenty-five original compositions with John Hammond and John Simon. Ten of these songs made it onto Leonard’s debut album. Four would be revisited on his second and third albums, and two would appear as bonus tracks on the Songs of Leonard Cohen reissue in 2003 (“Store Room” and “Blessed Is the Memory”).
As my friend Leon Wieseltier said, ‘It has the delicious quality of doneness.’
one thing Leonard said he liked about Greece was that he could get Ritalin there—a stimulant widely used for both narcolepsy and hyperactivity—without a prescription. Crill told Leonard that he had stopped taking acid since some of the manufacturers started cutting it with Ritalin. “Leonard said, ‘Oh, I really loved that.’ He said it was very good for focus.”
Said Leonard, “I always think of something Irving Layton said about the requirements for a young poet, and I think it goes for a young singer, too, or a beginning singer: ‘The two qualities most important for a young poet are arrogance and inexperience.’ It’s only some very strong self-image that can keep you going in a world that conspires to silence everyone.”
“Suzanne,” the opening song, appears to be a love song, but it is a most mysterious love song, in which the woman inspires a vision of Jesus, first walking on the water, then forsaken by his father, on the Cross. “So Long, Marianne,” likewise, begins as a romance, until we learn that the woman who protects him from loneliness also distracts him from his prayers, thereby robbing him of divine protection. The two women in “Sisters of Mercy,” since they are not his lovers, are portrayed as nuns. (Leonard wrote the song during a blizzard in Edmonton, Canada, after encountering two young girl backpackers in a doorway. He offered them his hotel bed and, when they fell straight to sleep, watched them from an armchair, writing, and played them the song the next morning when they woke.)
After his short promotional trip to London, Leonard returned to New York and the Chelsea. He checked into room 100 (which Sid Vicious and Nancy Spungen would later make notorious) and propped his guitar in the corner and put his typewriter on the desk.
Leonard thought Scientology, for all its snake oil, had “very good data.”9 He signed up for auditing.
Since Bob Johnston was, as always, busy in the studio, he sent Charlie Daniels to pick them up. In 1968 Daniels wasn’t the Opry-inducted, hard-core country star with the big beard and Stetson, but a songwriter and session musician—fiddle, guitar, bass and mandolin.
One night Daniels called Johnston and asked if he could get him out of jail—it was advance planning; Daniels was about to get into a fistfight with a club owner. Johnston hollered down the phone that he should “get the fuck out to Nashville,” and he did. Johnston had kept him busy ever since, playing on albums by Johnny Cash, Marty Robbins, Bob Dylan, and now Leonard Cohen.
Johnston taped Leonard singing ten songs. Five would appear on Songs from a Room; one would be put aside for the third album, Songs of Love and Hate; and four have never as yet been released: “Baby I’ve Seen You,” “Your Private Name,” “Breakdown” and “Just Two People”
Other singer-poets are obscure, but generally the feeling comes through that an attempt is being made to reach to a heart of meaning. But Cohen sings with such lack of energy that it’s pretty easy to conclude that if he’s not going to get worked up about it, why should we.”
In 1962, when Roshi was fifty-five, just a kid with a crazy dream, he left Japan for Los Angeles to establish the first Rinzai center in the U.S.
Then, “he locked himself in one of the suites for hours and listened to the music and read the books he had his chauffeur go out and buy of Leonard’s. He came out and said he at least felt I was leaving him for someone worthwhile.”
But Leonard no longer attended the Scientology Center. Disenchantment had set in, as well as anger that the organization had begun to exploit his name. Leonard had “gone clear”; he had a certificate confirming him as a “Senior Dianetic, Grade IV Release.”4 “I participated in all those investigations that engaged the imagination of my generation at that time,” said Leonard. “I even danced and sang with the Hare Krishnas—no robe, I didn’t join them, but I was trying everything.”
It was May 4, 1970, the day of the Kent State massacre in the U.S., and, as some kind of convoluted antiauthority peace gesture, Leonard decided to start the second half of the show by clicking his heels twice and giving the Nazi salute. He had come back onstage to lighted matches and a long standing ovation, but the mood changed instantly.
Leonard took Cornelius, Johnston and Donovan to meet a friend in London who—he told them—had the best acid anywhere. “It was called Desert Dust and it was like LSD-plus,” says Cornelius. “You had to take a needle—a pin was too big—and touch your tongue with this brown dust, and with as much as you could pick up on the end of that needle you were gone, sixteen hours, no reentry.” Ample supplies were purchased and consumed; it would get to where the tour manager made them all hold hands at the airport as they walked to the plane so that he would not lose anyone—“a big conga line,” Donovan says, “with everybody just singing along.”
A review of the show in Billboard described him as “nervous” and “lifeless.” Wrote Nancy Erlich, “He works hard to achieve that bloodless vocal, that dull, humorless quality of a voice speaking after death. And the voice does not offer comfort or wisdom; it expresses total defeat. His art is oppressive.”
“Leonard said, ‘I want to play mental asylums,’ ” says Johnston. And just like he’d done when Johnny Cash told him he wanted to play prisons, Johnston said, “Okay,” and “booked a bunch of them.” Despite appearances, the Henderson (closed now, due to funding cuts) was a pioneering hospital with an innovative approach to the treatment of personality disorders. It called itself a therapeutic community and the patients residents.
The artists who played that year included the Who, the Doors, Miles Davis, Donovan and Ten Years After. Leonard had the slot before last on the fifth and final day, after Jimi Hendrix and Joan Baez and before Richie Havens.
Tension had been rising at the festival for days. The promoters had expected a hundred and fifty thousand people but half a million more turned up, many with no intention of paying.
Thirty-nine years later the spellbinding performance was released, along with Lerner’s footage, on the CD/DVD Leonard Cohen: Live at the Isle of Wight 1970
Leonard’s depression begged to differ. Says Suzanne, “Of course, it can feel like a dark room with no doors. It’s a common experience of many people, especially with a creative nature, and the more spiritual the person, the closer to the tendency resembling what the church called acedia”—a sin that encompassed apathy in the practice of virtue and the loss of grace.
He was at the Chelsea, having what might have been a somber, one-man bachelor party.
Leonard tried to get in contact, but he says, “I was just too late.” She had killed herself three days before. Leonard was mentioned in her suicide note. He published her letter on the album sleeve, he said, because she had always wanted to be published and no one would do so.
Leonard said in 1974. “I am committed to the survival of the Jewish people”7), but also bravado, narcissism and, near the top of the list, desperation to get away. “Women,” he said, “only let you out of the house for two reasons: to make money or to fight a war,”8 and in his present state of mind dying for a noble cause—any cause—was better than this life he was living as an indentured artist and a caged man.
“Who by Fire” had been directly inspired by a Hebrew prayer sung on the Day of Atonement when the Book of Life was opened and the names read aloud of who will die and how. Leonard said he had first heard it in the synagogue when he was five years old, “standing beside my uncles in their black suits.”
“We were all kids. I was twenty-two and I had never played a concert before such a big audience, and I’ve never been on tour with a guy who’s revered like he was. In Europe Leonard was bigger than Dylan—all the shows were sold out—and he had the most sincere, devoted, almost nuts following.
Serious poetry lovers don’t get violent but, boy, there was some suicide watches going on, on occasion. There were people who Leonard meant life or death to. I’d see girls in the front row”—women outnumbered men three to one in the audiences, by Lissauer’s count—“openly weep for Leonard and they would send back letters and packages.
The only guy I’ve seen who drew better-looking women than Leonard Cohen was probably Charles Bukowski. These women were all dressed up in seventies style and hanging on Leonard’s every word, during the show and afterwards.”
Leonard was also hungry for hunger. This domestic life had caused him to put on weight and what he needed was to be empty. As he wrote in Beautiful Losers, “If I’m empty then I can receive, if I can receive it means it comes from somewhere outside of me, if it comes from outside of me I’m not alone. I cannot bear this loneliness . . .”—a loneliness deeper than anything that the ongoing presence of a woman and children could relieve.
John Miller replaced Lissauer as musical director, the rest of the band consisting of Sid McGinnis, Fred Thaylor and Luther Rix. Leonard’s new backing singers were Cheryl Barnes (who three years later would appear in the film of the musical Hair) and a nineteen-year-old Laura Branigan (who three years later would sign to Atlantic and become a successful solo pop artist).
“So,” says John Lissauer, “the famous missing album. I have the rough mixes but the master tapes just disappeared. Marty culled the two-inch tapes from both studios. He never returned my calls and Leonard didn’t return my calls. Maybe he was embarrassed. I didn’t find out what happened for twenty-five years. I heard this from a couple of different sources. Marty managed Phil Spector and Spector had not delivered on this big Warner Bros. deal; they got a huge advance, two million dollars, and Marty took his rather hefty percentage, but Phil didn’t produce any albums. So Warner Bros. go to Marty, ‘He comes up with an album or we get our money back.’ So Marty said, ‘I know what to do. Screw this Lissauer project, I’ll put Phil and Leonard together.’ ” Which is what he did.
His records were “Phil Spector” records, the artists and musicians merely bricks in his celebrated “Wall of Sound”—the name that was given to Spector’s epic production style. It required battalions of musicians all playing at the same time—horns bleeding into drums bleeding into strings bleeding into guitars—magnified through tape echo. With this technique Spector transformed pop ballads and R & B songs, like “Be My Baby,” “Da Doo Ron Ron” and “Unchained Melody” into dense, clamorous, delirious minisymphonies that captured in two and a half exquisite minutes the joy and pain of teenage love.
Leonard was dressed for work, wearing a suit and carrying a briefcase. He looked, Dan Kessel recalls, “like a suave, continental Dustin Hoffman.”
“In the final moment,” Leonard said, “Phil couldn’t resist annihilating me. I don’t think he can tolerate any other shadows in his darkness.”
after his initial misgivings about fatherhood he had taken to it seriously, and his friends say he was grief-stricken at being separated from
Once the book was completed, the public had come sharply back into focus. One big reason for this was that Leonard was running out of money. If Leonard lived like a celebrity, if he’d had a yacht or a cocaine habit, it might be easier to understand. But though he did not spend much money on himself, he still had expenses: Suzanne, the children, Roshi’s monastery and various friends whom he supported financially in one way or another. The majority of Leonard’s income came from his songs, not his books, and five years had passed since his last album.
Lissauer came to the conclusion that Leonard had reached a point in his songwriting where he had “run out of ideas as a guitar player. There were certain things he could do with his guitar playing, but this dopey Casio did things that he couldn’t on his guitar and made it possible for him to approach songwriting in a different way.”
Writing songs was certainly proving torturously difficult for Leonard again. But this cheesy little two-octave keyboard that Leonard seemed so fond of gave him a whole new set of rhythms to work with, and he found he was able to come up with things he could never have created with six strings and what he called his “one chop.”
The first song to feature Leonard playing his Casio was the new album’s opening track, “Dance Me to the End of Love.” The seed of the song was something Leonard had read about an orchestra of inmates in a concentration camp, who were forced by the Nazis to play as their fellow prisoners were marched off to the gas chambers. As a testimonial to Leonard’s way with words and a romantic melody, it would go on to become a popular song at weddings.
Love is there to help your loneliness, prayer is to end your sense of separation with the source of things.”
“Hallelujah” took Leonard five years to write. When Larry “Ratso” Sloman interviewed him in 1984, Leonard showed him a pile of notebooks, “book after book filled with verses for the song he then called ‘The Other Hallelujah.’ ” Leonard kept around eighty of them and discarded many more. Even
after the final edit, Leonard kept two different endings for “Hallelujah.” One of them was downbeat: “It’s not somebody who’s seen the light / It’s a cold and it’s a broken hallelujah.” The other had an almost “My Way” bravado: “Even though it all went wrong / I’ll stand before the Lord of Song / With nothing on my tongue but Hallelujah.” Bob Dylan said he preferred the second version, which was the one Leonard finally used on the album, although he would return to the darker ending at various concerts.
Dylan showed Leonard his new song “I and I.” Leonard asked how long it took him to write, and Dylan said fifteen minutes. Leonard showed Dylan “Hallelujah.” Impressed, Dylan asked how long it took Leonard to write it. “A couple of years,” said Leonard, too embarrassed to give the true answer. Sloman,
It is an intensely moving song, intimate and fragile, and sung in a voice that had deepened with age. Lissauer noted that it had dropped four semitones since he and Leonard had last worked together.
Leonard remembers, “Walter Yetnikoff said, ‘Leonard, we know you’re great, we just don’t know if you’re any good.’
Zembaty’s Polish version of Leonard’s adaptation of “The Partisan” had become an unofficial anthem of the Solidarity movement.
Famous Blue Raincoat was released in 1987. It featured nine songs,* including a few that Judy Collins had previously covered (“Bird on the Wire,” “Joan of Arc,” “Famous Blue Raincoat”) and a few that Warnes—like Collins in the past—would release before Leonard had recorded his own versions.
You’re stuck with the consequences of your actions, but in your work you can go back.”1 He had left behind him, he said, a “shipwreck of ten or fifteen years of broken families and hotel rooms for some kind of shining idea that my voice was important, that I had a meaning in the cosmos. . . . Well, after enough lonely nights you don’t care whether you have a meaning in the cosmos or not.”
Another Cohen-Robinson cowrite made it onto this album. On a visit to her house, he had handed her a sheet of verses—a litany of world-weary wisdom and cynicism—and asked her if she could write a melody. She did, and it became the song “Everybody Knows.”
When Marty Machat died on March 19, 1988, aged sixty-seven, Lynch took various files on Leonard from the offices of Machat & Machat that the lawyers said could be taken legally, including documents relating to the publishing company that Marty Machat had set up for Leonard. Lynch took the files to L.A., where she set up shop and began making herself as indispensable to Leonard as Marty had once been. At one point Leonard and Kelley became lovers. Eventually she became his manager.
Prince Charles, whose charity the concert benefited, was also a Leonard Cohen fan.
I’m Your Man had outsold all of his earlier albums.
Leonard appears to have remained good friends with many of his former lovers, remarkably few of whom seem to bear him any ill will.
But as Roshi told him, “You can’t live in God’s world. There are no restaurants or toilets.”
An old Eastern European adage says that a man should pray once before going to sea, twice before going to war and three times before getting married, but when it came to the last of the three, Leonard never seemed to stop praying.
During the four months Adam spent in the hospital, Leonard stayed there, keeping vigil. He would sit in the room quietly, watching his son, who remained in a coma. Sometimes he would read aloud to him from the Bible. When Adam finally regained consciousness, his first words to his father were, “Dad, can you read something else?”
Early on in their relationship, Rebecca was “whining about the various pain I had, my childhood, and this and that. And Leonard is the best listener, but at a certain point he said, ‘I understand, it must have been really terrible for you, Rebecca, having had to grow up poor and black.’ ” Rebecca laughed.
In his acceptance speech at the 1992 Juno Awards ceremony, Leonard deadpanned, “It’s only in a country like this that I could win a best vocalist award.”
“The light,” Leonard explained, “is the capacity to reconcile your experience, your sorrow, with every day that dawns. It is that understanding, which is beyond significance or meaning, that allows you to live a life and embrace the disasters and sorrows and joys that are our common lot. But it’s only with the recognition that there is a crack in everything. I think all other visions are doomed to irretrievable gloom.”
He was returning to the place where he had moved quietly, with no announcement, a few months before, not long after the last date of the Future tour. A small, bare hut on a mountain, where he had chosen to live as the servant and companion of an old Japanese monk.
Leonard became expert at rustling up soups. At the age of sixty-one, he would earn a certificate from San Bernardino County that qualified him to take work as a chef, waiter or busboy.
Rinzai monks, Leonard liked to boast, were “the Marines of the spiritual world”4 with a regimen “designed to overthrow a twenty-year-old.”
By the midsummer of 1993, when the tour was finally over, Leonard and Rebecca’s engagement was too.
When I finished my tour in 1993 I was approaching the age of sixty; Roshi was approaching ninety. My old teacher was getting older and I hadn’t spent enough time with him, and my kids were grown and I thought it was an appropriate moment to intensify my friendship and my association with the community.”
The old man, now approaching his ninetieth birthday, instructed Leonard that he wanted a traditional, open-pyre cremation. If Leonard would like to, Roshi said, he could keep one of his bones.
Among the uninvited guests, in Kigen’s words, was “a beautiful young lady who came up one evening and was wearing rags and feathers, literally. ‘Where’s Leonard? I’m here for Leonard.’
Leonard particularly enjoyed creating art on a computer. He just liked computers.
His interest in Macs started early on, thanks in part to the Apple company giving away free computers to select Canadian writers—among them Leonard, Irving Layton and Margaret Atwood—and sending tutors to their homes to show them how to use them.
He thought he had read somewhere “that the brain cells associated with anxiety can die as you get older,”39 although the general intelligence is that depression worsens with age.
Leonard had left the monastery with around two hundred and fifty songs and poems in various states of completion.
In Canada, meanwhile, where new ways of honoring Leonard were still, miraculously, being found,
Leonard returned in September, at the request of Trudeau’s children, to be a pallbearer at Trudeau’s funeral.
Kelley took care of Leonard’s business affairs—good, reliable Kelley, not simply his manager but a close friend, almost part of the family; he even employed Kelley’s parents. Leonard, who took little interest in such things, had given Lynch broad power of attorney over his finances. He trusted her enough to have named her in his living will as the person responsible in an extreme medical circumstance for giving the order as to whether he should live or die. Lynch had been there almost continuously during the making of Dear Heather and they had been in regular contact since the album was completed, just as they always were, and Kelley had said nothing about any financial problems.
He repeated the same understatement to the media once the lawsuits began and the story went public. And what a strange story it would turn out to be, one with a tangled plot whose cast of characters included a SWAT team, financiers, a tough-talking parrot, Tibetan Buddhists and Leonard’s lover Anjani’s ex-husband.
To have been redeemed from depression in his old age only to have to spend it in an eternity of legal and financial paperwork was a cosmic joke so black as to test even Leonard’s famous gallows humor.
Leonard had wanted to walk away from the whole thing, but the lawyers said he couldn’t. They told him that a lot of the missing money had been in retirement accounts and charitable trust funds, which left Leonard liable for large tax bills on the sums withdrawn and no money with which to pay them. It was no good telling the IRS that he had not been the one who had made the withdrawals; they needed proof. Which was why Leonard was sitting at his desk with Anjani and Lorca, in the house he had been forced to mortgage in order to pay his legal bills, grimly going through stacks of financial statements and e-mails.
Kory and Rice explained to Leonard that a case could probably be made that between ten and thirteen million dollars had been improperly taken. “That stunned him,” says Kory. “It stunned me.”
In 2001, Kelley, Greenberg and Westin orchestrated the sale of Leonard’s future record royalties to Sony/ATV for $8 million. After various cuts, Leonard apparently netted $4.7 million, according to documents later filed in Los Angeles Superior Court.
At that meeting, Kory held out the possibility of a reasonable settlement if Kelley would disclose what had happened to all the money. The alternative, he said, would be serious litigation and ultimately the destruction of her life as she knew it. Her response, Kory said, was “Hell will freeze over before you find out what happened to the money. It was my money.”
By Lynch’s account, the police took her on a long drive, interrogating her en route about her friendship with Phil Spector (who had been freed on $1 million bail while awaiting trial for murder). The journey ended at a hospital across town, where Lynch was taken to the psychiatric ward. She claimed that she was involuntarily drugged and held in the hospital for twenty-four hours, and that during this time Steve Lindsey filed for and subsequently won custody of their son. Lynch believed that Leonard and Kory were behind the whole episode, as well as several other strange things she claimed had happened to her following the hostage incident, such as being rear-ended by a Mercedes and threatened by a mysterious man.
Lynch’s subsequent accounts, related in thousands upon thousands of words she posted on the Internet, involved long, elaborate conspiracies, in which Phil Spector’s murder trial seemed to feature frequently and in which Lynch claimed to be a scapegoat in a scheme devised to hide Leonard’s lavish spending and tax fraud. Rather than fight Leonard in court, Kelley did so in cyberspace. Wherever Leonard was mentioned online and there was a space for comments, she left them, and not in brief. She sent innumerable lengthy e-mails to Leonard and his friends, family, musicians, associates and former girlfriends, as well as to the police, the district attorney, the media, the Buddhist community and the IRS.
Lynch had ignored Leonard’s lawsuit, including requests for discovery, and he was frustrated by her ability to avoid any accountability, even in litigation. But once a court issues the writ, Rice explained, the person who filed it can take it to the sheriff’s office and ask for officers to go with him to where his property is being held and take it back.
On a rainy October morning at nine A.M., Rice and her paralegal showed up, unannounced, at Lynch’s house in Mandeville Canyon with two armed sheriffs in riot gear, to search the house and garage and take possession of Leonard’s documents per the court order. The sheriffs emerged with one box after another.
Lynch, who continued her ceaseless assault of blogs and e-mails full of accusations and invective, also began to make threatening phone calls—to Leonard, to Kory and to friends and associates from various places across the U.S.
She thanked the millions of her fellow countrymen who failed to buy his early poetry books and novels, “because without that he might not have turned to songwriting.”
“Who Do You Really Remember” catalogs various deaths—his dog, his uncles and aunts, his friends—that occurred between his father’s death, when Leonard was nine, and his mother’s, when he was forty-three.
describes a conversation with the ghost of a dead friend, conducted while Leonard was on the twenty-year-old speed he’d found in the pocket of an old suit.
In the abbreviated, six-line version of his poem “Not a Jew” he asserts that he remains unswervingly Jewish. In “One of My Letters” he signs off not with “L. Cohen” but with his Jewish and his Buddhist names, Jikan Eliezer.
Finley remembers that he had talked about marriage “as an opportunity to be of service to another human being, an opportunity for the deepest human transformation, because you’re so deep in the presence of another human being. Which takes work, it takes mindfulness, it takes commitment, it takes discipline.
In his first conversation with Leonard, the rabbi had asked him, “You’re a Buddhist priest, how does that square with Judaism?” It was the same question Leonard had been asked by the press when he was ordained a monk; he had answered it in his poem “Not a Jew.” Leonard answered Finley that it did not have to square; Buddhism was nontheistic and Roshi was a great man with a great mind. “Leonard made it very clear to me that it had nothing to do with his religion, nor his beliefs. As we got to know each other better, I was delighted to see that he is a very learned Jew. He’s deeply well-read, very committed to understanding Kabbalah and—in a very similar way that I do—is using the Kabbalah not so much as a theology but as spiritual psychology and a way to mythically represent the Divine. If you understand that human consciousness is basically symbolic, then one has to find some kind of symbol system that most closely articulates one’s understanding of all the levels of reality.”
He gets the inner ethos of brokenness and healing and the tragedy of the human condition, in that we’re not particularly well suited for this life but you still have to find your way through.”
Blue Alert, the album Leonard and Anjani had worked on together, was released, as was Book of Longing, in May 2006.
In a few weeks’ time Roshi would be one hundred years old, and yet here he still was, the constant in Leonard’s life, the good friend, the wise father figure who disciplined and indulged him and never left, not even when Leonard had left him.
When he quit smoking, Leonard had promised himself he could start again when he reached seventy-five. He blamed his abstinence from cigarettes for the loss of the two lowest notes in his vocal range, even if in truth they had only ever been audible to certain mammals and devoted female fans. His voice now was deeper than it had ever been.
He had to stop counting how many tribute albums there were—more than fifty by this point, from twenty different countries.
because it was recorded by his first and most stalwart champion, was Democracy: Judy Collins Sings Leonard Cohen—from 2004,
Major artists were increasingly making their money from touring, charging considerably higher ticket prices than under the old system, when concerts existed to promote album sales.
Rob Hallett was getting anxious. Leonard had been rehearsing for at least four months now and all he had was bills. “About a million dollars later, I started panicking. Then Leonard said, ‘Okay, come and see the rehearsals.’ ”
Leonard, the band and crew, and Kory and Hallett arrived several days early so that they could rehearse some more in the theater, five, six hours a day.
He took a deep breath; one lesson he had learned from his years at the monastery was to “stop whining.”
But here he stood in the spotlight in his sharp suit, fedora and shiny shoes, looking like a Rat Pack rabbi, God’s chosen mobster.
Leonard sang as if he had come to this place alone to tell all these people in the seats, individually, a secret.
That his choices leaned toward the more stirring, later songs than the naked early ones was perhaps in part an old man’s delicacy, but more likely because they worked better with a large band, and Leonard needed a large band to drown out the noise of doubt.
They played for almost three hours that night, with a short intermission—and no one played three-hour shows, certainly not a man in his seventies who had not sung more than a handful of songs in succession on a stage in a decade and a half.
At another of the concerts, two young women rushed the stage, prompting Leonard to comment wryly, or wistfully, or both, as security gently led them off, “If only I were two years younger.”
This was quite a change from Leonard Cohen tours in the past, which had been fueled by cigarettes and alcohol or the drug du jour. (By the end of his last tour, with The Future, Leonard had been smoking two packs a day and drinking three bottles of Château Latour before every show.)
“As the Irish say, with the help of God and two policemen, [it] may last a year and a half, or two.”
Michael Eavis was. The dairy farmer who founded the UK’s biggest and best-loved rock festival had been trying to get Leonard to agree to play there, he said, “for almost forty years.”
Songs of Leonard Cohen came with two old songs released for the first time: “Store Room” and “Blessed Is the Memory,” which were recorded during the 1967 sessions and shelved.* The reissued Songs from a Room also had two additional songs, the previously unheard versions of “Bird on the Wire” (titled “Like a Bird”) and “You Know Who I Am” (titled “Nothing to One”) that Leonard recorded with David Crosby before making the album with Bob Johnston.
It was a reflection of Leonard’s growing confidence onstage that he premiered more new material on the 2009 U.S. tour, “Feels So Good” and “The Darkness.” The set list, remarkably, had continued to expand, now featuring more than thirty songs.
The new decade began with “Hallelujah” at the top of the iTunes download charts in 2010—the version Justin Timberlake and Matt Morris sang on the Hope for Haiti telethon—and the first of a new slew of awards.
Then, while doing a Pilates exercise, he threw out his back—a spinal compression injury, the doctors told him, that would take four to six months of physical therapy to fix. Leonard insisted he was fine. His friends say he was not, that he was in great pain and could barely move. The tour was postponed. Since he was stuck in one place, Leonard thought he might as well do something. He began recording a new album.
“A sublime experience,” said Leonard, staying just long enough to have his photograph taken with an arm around Taylor Swift and to tell Rolling Stone that his new album, “God willing, will be finished next spring.”
These past three years on the road, with their three-hour shows and two-hour sound checks, sometimes barely a day off in between, had been more than rigorous, but much as Leonard had said of Roshi’s monastery, “once you get the hang of it, you go into ninth gear and kind of float through it all.”
On April 1, as he donned his monk’s robes to visit Roshi, who was celebrating his 104th birthday, Leonard learned that he had won the prestigious Glenn Gould Prize.
After a long stretch of contentment with his synthesizer, Leonard found himself returning to the guitar, playing it on four of the tracks. His guitar on “Crazy to Love You” takes the listener back to his earliest albums, in particular to Songs from a Room.
Leonard had returned, at least part-time, to his old job of driving Roshi around, running errands and taking him food; Roshi had become quite fond of Leonard’s chicken soup. Roshi, weeks away from his 105th birthday, was still working;
He doesn’t think too much about the future, he says, other than looking forward to the promise he made himself to take up smoking again on his eightieth birthday.
While we sat drinking at the small kitchen table, which was pushed up against the wall, by an open window through which a cool breeze blew, he asked how things were going with the book—a book, I should add, that he did not ask me to write and did not ask to read, neither of which appeared to inhibit his support.
(Biographers always lament the ones who got away, and I was sad not to have added Joni Mitchell, Jennifer Warnes and Phil Spector to this list. I tried. )
Jarkko Arjatsalo, founder and overseer of LeonardCohenFiles.com—Leonard calls him “the General Secretary of the party”—to whose website Leonard contributes;
Most of all, thank you, Leonard Cohen, for being so considerate as to choose the second I hit puberty to release your first album, for continuing to move and enlighten me with your music and words ever since, for permitting me to out you as a ukulele player, and for living a remarkable life that has run me ragged these past few years.
It was only his duties to Roshi—which now regularly included driving him back and forth to doctors’ appointments—that kept him off the road. Roshi, by this point, was a hundred and five years old.
Once again it was completed at remarkable speed: nine months. Popular Problems, Leonard announced, would be released on his eightieth birthday.
Leonard dedicated Popular Problems to Roshi, who had died in a Los Angeles hospital on July 27, 2014, age one hundred and seven.
All the space that’s left when the passing of time takes away everything—friends, family, libido, his taste for alcohol, his health—there’s nothing left to fill it but work. So Leonard lit a cigarette and worked.
But though his mind was still sharp, Leonard’s body betrayed him. Time and touring had taken their toll. No more skipping onto a stage or falling to his knees; he had multiple compression fractures of the spine. He was also fighting cancer. Immobilized by pain, in the words of a man of soldierly habits, he was “confined to barracks.”
“There were hilarious, esoteric arguments fueled by medical marijuana,” Adam said, “episodes of blissful joy that sometimes lasted hours, where we’d listen to one song on repeat like teenagers.”
Leonard died at home in his sleep on November 7, 2016, following a fall in the middle of the night. He was buried three days later, according to his wishes, in a plain pine box next to his parents in the Shaar Hashomayim Cemetery in Montreal.
I’ve worked at my work / I’ve slept at my sleep / I’ve died at my death / And now I can leave.
-
Wilderness by Rennie Sparks
From My Notion template
What It’s About
A weird collection of observations that bounce back and forth between fact and fiction
How I Discovered It
I like the band the author is in
Thoughts
It was definitely mixed and somewhat repetitive. There were several interesting insights to be found though.
What I Liked About It
I like Weird Fiction – and observations that bounce back and forth between fancy and reality
What I Didn’t Like About It
It was a tad repetitive
Highlights
Much of this book was inspired by my own hunting and dreaming while foraging in the wilderness of dusty books, vague notions and the endless trails of our great Internet.
What a quiet world that would be! Everything governed by the softest movements of the littlest things. In the stillness of a grove of trees — what great orchestral melodies might finally be revealed?
Mary Sweeney, the Wisconsin Window Smasher, may have lived in just such a world of secret revelation. In the 1890’s Mary was arrested more than a hundred times for attempting to smash plate-glass windows in various small towns across the state. She was a wife and mother, a former school marm, who periodically ran away from her family in St. Paul, boarded a train for Wisconsin and then would be caught again, “indulging in her wild sport” according to The Badger State Banner. Mary’s method of smashing windows was a strange one. She would repeatedly throw her satchel at a window until it shattered and she was often caught before completing her task. It’s as if she had no plans to smash a window, but had simply gotten off a train and then, upon being confronted with a pane of glass, became so upset she could not help but throw her bag again and again at the terrible sight.
Our tendency is, of course, to believe that Mary was simply delusional. What, though, if the opposite is true? Consider for a moment that we may all be ‘sane’ simply because we are blind. Are there things inside a pane of glass that we are blessed not to see? Would we all stop in mid-sentence and begin throwing things at the very air if our blinders were suddenly removed? Such is the experience of larvae born deep within a tree trunk. Some larvae actually tunnel for years within the dark wood, gradually changing shape as they travel through the tree, heading outward toward light they have never seen. How amazed such creatures must be when, at last transformed into beetle or moth, they emerge into the air and spread wings they never knew they’d grown. Can you imagine your own world suddenly made shining, weightless and stretching infinitely in all directions?
It’s a mercy that most of us remain as blind to such things as the larvae lost within their dark world. Those cursed to peer out beyond our tree trunk before they’ve grown wings may resort to tin foil hats and the smashing of windows in an effort to forget what they have seen.
Those who hear and see too much may end up like the shining trout thrown back to the river after being hooked and pulled into the air. This sad fish can never return happily to the waves. Such fish tell fabulous tales of another world directly above the water. They speak of a hell-realm of suffocating brightness where their once-weightless bodies suddenly sunk and flopped upon the dirt as enormous demons pricked and pulled at them merely for sport. Alas, these hook-scarred Cassandras are doomed ever to speak the truth but never to be believed. We swim away from them into the comforting darkness of our muddy pond and do our best never to catch their eye again.
The wolf got along fine with dogs and with a house full of cats. Still, however tame and friendly this ‘husky’ was, other dogs knew at once she was not a dog. They might eventually accept her, but their first glance at her was often tinged with the same shocked alarm you and I might feel upon spotting a Neanderthal strolling through the supermarket.
It is scary because it is transcendent and other-worldly — it offers entrance into a larger, more mysterious realm waiting just outside the window. The white wolves are emissaries from a lost empire beyond and surrounding ours, completely forgotten to our waking selves but still accessible if we dare to take the leap into the branches outstretched within our dreams. Poor little Serge was too frightened to leap and his cowardice haunted him for the rest of his sad life.
Consider this: what if the big, bad, black wolf has also been trying to draw you out into the wider world all these years he has chased you? Has he been huffing and puffing all these centuries not to blow your house down, but simply to get you to come outside and look at the stars? Does he long to gobble you up simply to give you a chance to finally see through his eyes the dark forest full of shining light?
And so, dear reader, as you live out your allotted time inside your lonely cage of flesh as I do in mine — can we believe beyond the three minutes of our favorite song that we run together across this arid plain?
Does it bother the turtle that it will never see its own shell? Or does the turtle simply take it on faith that it is born into a world where invisible help is always at the ready?
John Audubon, in fact, killed thousands of birds in order to find perfectly-shaped corpses to pose for his famous water-colors. Once you realize this truth his work is ruined. All you can see in those finely-detailed illustrations are the limp, oddly-bent necks, the empty eyes and the frozen wings all pulled into a terrible facsimile of life with wires and sticks.
Audubon once buried a dead rat in a flower pot with only its tail protruding from the dirt and presented it to a friend as a ‘rare flower’. That’s the kind of joke you can only laugh at, I suppose, after you’ve killed at least a hundred birds. I don’t feel like laughing about any of this.
Termites build nests that are narrow, mound-like structures. These nests can, nevertheless, rise as high as twenty feet tall. Considering the size of these little creatures such buildings are the human equivalent of a 180-storey skyscraper built by hand, brick by brick—yet termites only live about two years.
Termites can communicate with each other by banging their heads on the floor of their tunnels, but they also have a far more deeply-felt level of communication that is somehow transmitted through their queen. Scientists actually call it a ‘group soul’. Deep within this teeming maze of non-stop activity, the queen lies motionless in her dark bower yet somehow without moving a muscle she orchestrates all movement around her.
Even Thoreau, who wisely said that the mark of a man was his ability to leave things alone, still felt the need to write a great deal about all the things he was thinking about leaving alone. Where are the Thoreaus that keep silent? Where are the ones who simply let the light of the moon fall upon them without feeling the need to comment on its beauty? These sages may be all around us, but they leave no trace save for a few stray footprints in the dirt.
I suspect then that the reason I find the shark’s glance so frightening is not that it triggers some primordial fear of attack, but rather that the shark’s face reminds me of something I’d rather forget — that this world is by nature a world of hunters and prey, where all are born to be eaten by something else. Our world may be a bloodthirsty place, but it is dispassionately so and thus neither good nor evil. That’s what’s so unsettling — realizing that good and evil are concepts that don’t mean a thing when it comes to a turtle eating a snail or a snail eating a flower or even a shark eating a person.
There’s a good book written on the subject called The Tibetan Book of the Dead. You might say it’s one of the most irrational books ever written given that it is written specifically as a self-help book for the newly dead.
The gold that these mystics dreamed of isolating from baser materials like lead and copper and quartz was not physical gold, but a golden state of purity from which they believed all things emanated. This invisible, perfect prima materia was called the Philosopher’s Stone.
Alas, the salamander is deaf and none of these little creatures will ever hear a Blind Willie Johnson song save by pressing their skin against the speaker and letting the vibrations fill their small bodies. I have never caught a salamander doing such a thing and so I have to believe that either they only listen to music very late at night in houses that are burning down or that they have found other ways to transcend pain besides listening to the Blues.
Rabbits only dare venture out at dusk and dawn because at those half-lit times they are best camouflaged and can still hopefully see a predator approaching. Even so they are constantly watching in all directions as they eat their grass and clover. They’re right to do so. Most rabbits die in the jaws of something.
I was so excited to spot these little creatures that I got out of my car and tried to sneak towards them. Of course they all immediately dove underground. It’s a strange but sad fact: one of the best ways to watch wildlife is by staying inside your car.
Is this why dogs run away during lightning storms? Is this why moths gather around a light bulb? Is this where the astronauts were taking me when they carried me from my bed as a child?
We have, in fact, a long history of saying otherwise about the crow. Our opinion of crows is so low that we call a group of crows a ‘murder’ and a group of ravens an ‘unkindness’. Groups of owls, on the other hand (a bird far more deserving of suspicion) we insist on describing politely as a ‘parliament’.
Think back again to the plague doctor in his crow mask and remember: it was medieval Europe’s penchant for killing cats (they suspected the whole species of devilry) that left an unchecked rat population to spread plague-infested fleas wherever they went.
Crocodiles have survived as a species for at least one hundred million years. They are actually one of the few creatures alive now that once shared the earth with dinosaurs. What legends do they tell among themselves of those ancient giants and of the mysterious cataclysm that turned an empire of titans into a pile of bones?
Others assumed that those with blonde hair were living ghosts, unnatural phantoms not meant to walk the earth.
It is sometimes the case, for example, that a person found dazed and disheveled at the side of the highway remembers only an owl swooping in front of their car right before they crashed the car into a tree. Only under hypnosis does the confused victim suddenly recall a flying saucer that had disabled the car engine with a beam of light. They remember now being levitated skyward and examined by aliens with strange machinery before being returned to their car with some four hours of missing time and a false memory of a swooping bird stuck in their head.
The Jains, however, don’t make themselves miserable over the unavoidable suffering their survival must cause — from the things we kill and eat to the bacteria destroyed by our guts and the viruses fought by our blood. Instead Jains focus on feeling empathy with the hunger that drives all life — the lust that has driven the world ever since the first one-celled creature divided itself in two and suddenly found itself facing a delicious looking stranger.
-
The Prize by Daniel Yergin
The Book in 3 Sentences
- The Prize is a detailed history of oil, from about 1870 to the present day. It tells the story of the 20th century and makes a lot of its politics much more explainable. I was forced to reevaluate several long-held positions of mine.
Impressions
A masterful work of scholarship and storytelling.
How I Discovered It
Dwarkesh Patel and his podcast
Who Should Read It?
Anyone looking to understand the recent past
Quotes
The first is the rise and development of capitalism and modern business.
The second theme is that of oil as a commodity intimately intertwined with national strategies and global politics and power.
A third theme in the history of oil illuminates how ours has become a “Hydrocarbon Society” and we, in the language of anthropologists, “Hydrocarbon Man.”
Gasoline was then only an almost useless by-product, which sometimes managed to be sold for as much as two cents a gallon, and, when it could not be sold at all, was run out into rivers at night.
Total world oil consumption grew almost 30 percent between 1990 and 2008—from 67 million to 86 million barrels per day. Over the same time, oil demand in India more than doubled and in China, more than tripled.
George Bissell, who, more than anybody else, was responsible for the creation of the oil industry.
Oil was hardly unfamiliar to mankind. In various parts of the Middle East, a semisolid oozy substance called bitumen seeped to the surface through cracks and fissures, and such seepages had been tapped far back into antiquity—in Mesopotamia, back to 3000 B.C.
Some of these seepages, along with escaping petroleum gases, burned continuously, providing the basis for fire worship in the Middle East.
From the seventh century onward, the Byzantines had made use of oleum incendiarum—Greek fire. It was a mixture of petroleum and lime that, touched with moisture, would catch fire; the recipe was a closely guarded state secret. The Byzantines heaved it on attacking ships, shot it on the tips of arrows, and hurled it in primitive grenades. For centuries, it was considered a more terrible weapon than gunpowder.
Salt “boring,” or drilling, had been developed more than fifteen hundred years earlier in China, with wells going down as deep as three thousand feet. Around 1830, the Chinese method was imported into Europe and copied.
They were convinced of the need and the opportunity. But to whom would they now entrust this lunatic project? Their candidate was one Edwin L. Drake, who was chosen mainly by coincidence. He certainly brought no outstanding or obvious qualifications to the task.
Concerned about the frontier conditions and the need to impress the “backwoodsmen,” the banker sent ahead several letters addressed to “Colonel” E. L. Drake. Thus was “Colonel” Drake invented, though a “colonel” he certainly was not. The stratagem worked. For a warm and hospitable welcome was received by “Colonel” E. L. Drake, when, in December of 1857, he arrived, after an exhausting journey through a sea of mud, on the back of the twice-weekly mail wagon, in the tiny, impoverished village of Titusville, population 125, tucked into the hills of northwestern Pennsylvania. Titusville was a lumber town, whose inhabitants were deeply in debt to the local lumber company’s store. It was generally expected that the village would die when the surrounding hills had all been logged and that the site would then be reclaimed by the wild.
Meanwhile, other wells were drilled in the neighborhood, and more rock oil became available. Supply far outran demand, and the price plummeted. With the advent of drilling, there was no shortage of rock oil. The only shortage now was of whiskey barrels, and they soon cost almost twice as much as the oil inside them.
“rock oil emits a dainty light; the brightest and yet the cheapest in the world; a light fit for Kings and Royalists and not unsuitable for Republicans and Democrats.”
Yet all the wells thus far were modest producers and had to be pumped. That changed in April 1861, when drillers struck the first flowing well, which gushed at the astonishing rate of three thousand barrels per day. When the oil from that well shot into the air, something ignited the escaping gases, setting off a great explosion and creating a wall of fire that killed nineteen people and blazed on for three days. Though temporarily lost in the thunderous news of the week before—that the South had fired on Fort Sumter, the opening shots of the Civil War—the explosion announced to the world that ample supplies for the new industry would be available.
In less than two years one memorable well generated $15,000 of profit for every dollar invested.
“The whole place,” said one visitor, “smells like a corps of soldiers when they have the diarrhoea.”
Pithole returned to silence and to the wilderness. A parcel of land in Pithole that sold for $2 million in 1865 was auctioned for $4.37 in 1878.
Even then, at the age of twenty-six, John D. Rockefeller already made a forbidding impression. Tall and thin, he struck others as solitary, taciturn, remote, and ascetic. His unbending quietness—combined with the cold, piercing blue eyes set in an angular face with a sharp chin—made people uneasy and fearful. Somehow, they felt, he could look right through them.
Rockefeller was born in 1839 in rural New York State, and lived almost a full century, until 1937.
The son’s character was already set at a young age—pious, single-minded, persistent, thorough, attentive to detail, with both a gift and a fascination for numbers, especially numbers that involved money. At the age of seven, he launched his first successful venture—selling turkeys.
Mathematics was the young Rockefeller’s best subject in high school. The school stressed mental arithmetic—the ability to do calculations quickly in one’s head—and he excelled at it.
At the beginning, Rockefeller also established another principle, which he religiously stuck to thereafter—to build up and maintain a strong cash position. Already, before the end of the 1860s, he had built up sufficient financial resources so that his company would not have to depend upon the bankers, financiers, and speculators on whom the railways and other major industries had come to rely. The cash not only insulated the company from the violent busts and depressions that would drive competitors to the wall, but also enabled it to take advantage of such downturns.
His relationship with the remote Rockefeller was to lead Flagler to another adage: “A friendship founded on business is better than a business founded on friendship.”
Many years later, after having made one great fortune with Rockefeller, Flagler set off on a second conquest, the development of the state of Florida. He would build the railways down the east coast of Florida, all the way to the Keys, in order to open up what he called the “American Riviera,” and was to found both Miami and West Palm Beach.
Standard, however, did not stop with rebates. It also used its prowess to win “drawbacks.” A competing shipper might pay a dollar a barrel to send his oil by rail to New York. The railroad would turn around and pay twenty-five cents of that dollar back, not to the shipper, but to the shipper’s rival, Standard Oil! That, of course, gave Standard, which was already paying a lower price on its own oil, an additional enormous financial advantage against its competitors. For what this practice really meant was that its competitors were, unknowingly, subsidizing Standard Oil. Few of its other business practices did as much to rouse public antipathy toward Standard Oil as these drawbacks—when eventually they became known.
Rockefeller was both strategist and supreme commander, directing his lieutenants to move with stealth and speed and with expert execution. It was no surprise that his brother William categorized relations with other refiners in terms of “war or peace.”
By 1879, the war was virtually over. Standard Oil was triumphant. It controlled 90 percent of America’s refining capacity. It also controlled the pipelines and gathering system of the Oil Regions and dominated transportation.
A Pennsylvania grand jury indicted Rockefeller, Flagler, and several associates for conspiracy to create a monopoly and injure competitors. A vigorous effort was made to extradite Rockefeller to Pennsylvania. He was alarmed enough to exact a promise from the Governor of New York not to approve any extradition order, and the attempt eventually failed.
It was the stockholders of Standard Oil, not Standard Oil itself, who owned shares in the other firms. At that time, corporations themselves could not own stock in other corporations. The shares were held in “trust,” not for the Standard Oil Company of Ohio, but on behalf of the stockholders of that corporation.
The company also used an extraordinary system of corporate intelligence and espionage to keep track of market conditions and competitors. It maintained a card catalog of practically every buyer of oil in the country, showing where virtually every barrel shipped by independent dealers went—and where every grocer, from Maine to California, obtained his kerosene.
From there the entire enterprise was directed, starting with the Executive Committee, its membership being whoever was in town that day.
Later in life, he recited a little rhyme from memory:
A wise old owl lived in an oak,
The more he saw the less he spoke,
The less he spoke, the more he heard,
Why aren’t we all like that old bird?
But they warned against poor quality and impure oils, which were responsible for “those terrible explosions.” In the mid-1870s, five to six thousand deaths a year were attributed to such accidents.
Yet it was not a complete monopoly, not even in refining. Somewhere around 15 to 20 percent of oil was sold by competitors, and the directors of Standard were willing to live with that. Control of upwards of 85 percent of the market was sufficient for Standard to maintain the stability it cherished.
To many producers and independent refiners Standard Oil was the Octopus, out to grasp all competitors, “body and soul.” And to those throughout the oil industry who suffered from Rockefeller’s machinations—from the ceaseless commercial pressures and the “good sweatings,” from the duplicity and secret arrangements—he was a bloodless monster, who hypocritically invoked the Lord as he methodically set about destroying people’s livelihoods and even their lives in his pursuit of money and mastery.
Yet, whereas many of the other robber barons amassed their wealth by speculation, stock and financial manipulation, and outright fraud—cheating their stockholders—Rockefeller built his fortune by taking on a youthful, wild, unpredictable, and unreliable industry, and relentlessly transforming it according to his own logic into a highly organized, far-flung business that satisfied the basic hunger for light around the world.
Among the most promising markets for the “new light” was the vast Russian empire, which was beginning to industrialize, and for which artificial light had a special importance. The capital city, St. Petersburg, was so far north that, in the winter, it had barely six hours of daylight.
Baku was the territory of the “eternal pillars of fire” worshiped by the Zoroastrians. Those pillars were, more prosaically, the result of flammable gas, associated with petroleum deposits, escaping from the fissures in porous limestone.
In a very few years, Russian oil was to take on and even surpass American oil, at least for a time; and this Swede, Ludwig Nobel, would become “the Oil King of Baku.”
To make matters worse, severe winter weather precluded the shipment of kerosene on the Caspian between October and March, with the result that many refiners simply shut down for half the year.
Even parts of the empire were inaccessible; in the city of Tiflis (now Tbilisi), it was cheaper to import kerosene from America, 8,000 miles away, than from Baku, 341 miles to the east.
While Ludwig Nobel’s patience and determination did not abate in the face of the never-ending obstacles, physically he was worn out. In 1888 at the age of fifty-seven, the Oil King of Baku died of a heart attack while vacationing on the French Riviera. Some of the European newspapers confused the Nobel brothers and instead reported the death of Alfred. Reading his own premature obituaries, Alfred was distressed to find himself condemned as a munitions maker, the “dynamite king,” a merchant of death who had made a huge fortune by finding new ways to maim and kill. He brooded over these obituaries and their condemnations, and eventually rewrote his will, leaving his money for the establishment of the prizes that would perpetuate his name in a way that would seem to honor the best in human endeavor.
At the time, Marcus and Samuel Samuel were the only British Jews prominent in the trade with the Orient.
Mark was paid five pounds a week and was further rewarded by constant long-range interference, carping, criticism, and insults from his uncles.
Standard Oil’s agents were too late; Samuel’s kerosene was everywhere. Thus, Standard could not cut prices in one market and subsidize them by raising prices elsewhere.
The customers were expected to use old Standard Oil tin cans. But they did not. Throughout the Far East, Standard’s blue oil tins had become a prized mainstay of the local economies, used to construct everything from roofing to birdcages to opium cups, hibachis, tea strainers, and egg beaters. They were not about to give up such a valuable product. The whole scheme was now threatened—not by the machinations of 26 Broadway or by the politics of the Suez Canal, but by the habits and predilections of the peoples of Asia. A local crisis was created in each port, as the kerosene went unsold, and despairing telegrams began to flow into Houndsditch.
But then in 1893, the year after the coup, all—both business and social—seemed for naught. Samuel became seriously ill; his physician diagnosed cancer and gave him no more than six months to live. The prediction was to prove slightly off the mark—by some thirty-four years. Still, the threat of imminent death did motivate Samuel to put his business affairs into a somewhat more orderly form.
Samuel had, however, one serious fault as a businessman. Unlike his rival, Rockefeller, he lacked talent for organization and administration. Where Rockefeller had an instinct for order, Samuel had an addiction to improvisation. For him, organization was an afterthought; he ran everything out of his hat, which made his continuing success all the more astonishing.
The rapid rise of Russian production, the towering position of Standard Oil, the struggle for established and new markets at a time of increasing supplies—all were factors in what became known as the Oil Wars. In the 1890s, there was a continuing struggle involving four rivals—Standard, the Rothschilds, the Nobels, and the other Russian producers. At one moment, they would be battling fiercely for markets, cutting prices, trying to undersell one another; at the next, they would be courting one another, trying to make an arrangement to apportion the world’s markets among themselves; at still the next, they would be exploring mergers and acquisitions.
With great effort, slowed not only by rivalries but by a cholera epidemic that gripped Baku, the Rothschilds, joined by the Nobels, did succeed in getting all the Russian producers to agree to form a common front, as a prelude to a grand negotiation with Standard. But despite its 85 to 90 percent control of American oil, Standard could not deliver the critical missing element, the independent American refiners and producers, to the grand scheme, and the proposed agreement collapsed.
But Samuel rejected it. He wanted to keep the independent identity of his enterprise and his fleet, flying the flag of M. Samuel and Company, and he wanted it all to remain British. For it was British success on British terms on which he was intent, not integration into an American entity.
All sorts of obstacles had to be overcome, including the arrival of almost three hundred marauding pirates from another part of Sumatra,
Its directors and management knew how Standard Oil had operated in America—buying up shares in offending competitors quietly, and then putting them out of action. To forestall such a stratagem, the directors of Royal Dutch created a special class of preference stock, the holders of which controlled the board. To make acquisition even more difficult, admission to this exclusive rank was by invitation only. One of Standard’s agents unhappily reported that Royal Dutch would never merge with the American company. It was not merely a “sentimental barrier” on the part of the Dutch that blocked the way, he said; there was a practical matter, as well. The managers of Royal Dutch greatly enjoyed receiving 15 percent of the company’s profits.
All three of those sources—kerosene, gas and candles—had the same serious problems; they produced soot, dirt, and heat; they consumed oxygen; and there was always the danger of fire. For that last reason, many buildings, including Gore Hall, the library of Harvard College, were not illuminated at all.
For him, invention was not a hobby, it was a business.
Edison immediately applied himself to the question of commercializing his invention, and in the process, created the electric generation industry. He even worked very carefully to price electricity so that it would be highly competitive—at exactly the equivalent of the town gas price of $2.25 per thousand cubic feet. He built a demonstration project in Lower Manhattan, whose territory just happened to include Wall Street. In 1882, standing in the office of his banker, J. P. Morgan, Edison threw a switch, starting the generating plant and opening the door not only on a new industry but on an innovation that would transform the world. Electricity offered superior light, it needed no attention from its user, and it was hardly resistible where available. By 1885, 250,000 light bulbs were in use; by 1902, 18 million.
The natural gas industry had to shift its markets to heating and cooking, while the United States market for kerosene, the staple of the oil industry, leveled out and was increasingly restricted to rural America.
Nevertheless, in the United States, as well as in Europe, the horseless carriage quickly captured the minds of entrepreneurial inventors. One such person was the chief engineer of the Edison Illuminating Company in Detroit, who quit his job so that he could design, manufacture, and sell a gasoline-powered vehicle that he named after himself—the Ford.
Still, there were doubts about the ruggedness and reliability of the car. Those questions were laid to rest, once and for all, by the San Francisco earthquake of 1906. Two hundred private cars were pressed into service for rescue and relief, fueled by fifteen thousand gallons of gasoline donated by Standard Oil. “I was skeptical about the automobile previous to the disaster,” said the acting chief of the San Francisco fire department, who commanded three cars for round-the-clock work, “but now give it my hearty endorsement.”
The growth of the automobile industry was phenomenal. Registrations in the United States rose from 8,000 in 1900 to 902,000 in 1912. In a decade, the automobile went from a novelty to a familiar practicality, changing the face and mores of modern society. And it was all based on oil.
In addition to gasoline, a second major new market for petroleum was developing with the growth in use of fuel oil in the boilers of factories, trains, and ships.
The Los Angeles boom fizzled by the end of the 1860s, severely tarnishing the prospects for California. Professor Silliman’s reputation was hurt even more. Indeed, so great was the humiliation and disgrace that Silliman, heretofore one of the preeminent figures in American science, was forced to resign his professorship of chemistry at Yale.
The dominant producer in California was Union Oil (now Unocal), the only major American corporation outside of Standard Oil to have maintained a continuous independent existence since 1890 as a major integrated oil company.
Though California had by the turn of the century emerged as a major oil province, it was far from the rest of the nation, isolated, and its external markets were in Asia and not east of the Rockies where most of the citizens of the United States happened to live. California might as well have been another country from a business point of view.
Tents, lean-tos, shacks, saloons, gambling houses, whorehouses—all sprang up in Beaumont to serve the various needs of the lusting population. According to one estimate, Beaumont drank half of all the whiskey consumed in Texas in those early months. Fighting was a favorite pastime. There were two or three murders a night, sometimes more. Once sixteen bodies were dredged out of a local river, their throats slit, the victims of a night’s mayhem.
Samuel had recently rechristened his rapidly growing company Shell Transport and Trading—again, like the names of his tankers, in honor of his father’s early commerce in seashells.
So, when the news from Spindletop reached London, it immediately set off frantic and comical efforts by Shell, first to find out where Beaumont was—it could not be found in the office atlas at all—and then to make contact with Guffey.
A new language was even born on the hill, for it was at Spindletop that a “well borer” first became known as a “driller,” a skilled helper as a “roughneck,” and a semiskilled helper as a “roustabout.” A cash-short “shoe-stringer” would “poor boy” a well by splitting his interest with his crew, the landowner, his supply house, his boardinghouse owner, his favorite saloon keeper and, if need be, his most cherished madam.
But his uncle Andrew had instilled in him the lesson that such was not the way to run a serious business. Rather, the aim should be to integrate—to control every stage of operations. “The real way to make a business out of petroleum,” said Andrew, was “to develop it from end to end; to get the raw material out of the ground, refine it, manufacture it, distribute it.” Any other way, and one was at the mercy of Standard Oil.
William Mellon engineered a reorganization of Guffey Petroleum and Gulf Refining that resulted in the Gulf Oil Corporation. It was now resolutely a Mellon company.
The syndicate was led by James Hogg, the three-hundred-pound ex-governor and progressive champion of Texas. The former governor was also a tough businessman: “Hogg’s my name,” he once explained, “and hog’s my nature.”
For the capital he needed to develop his leases, Cullinan turned to Lewis H. Lapham, a New Yorker who owned U.S. Leather, the centerpiece of the leather trust, and John W. Gates, a flamboyant Chicago financier known as “Bet-a-Million” Gates because of his willingness to make a wager on anything.
Among those working for Cullinan were Walter B. Sharp, who had drilled Patillo Higgins’s first unsuccessful attempt on Spindletop in 1893 and was now a premier driller, and another expert driller named Howard Hughes, Sr.
But the smaller oil producers, raising the specter of a new oil trust, managed to turn the proposed deal into the hottest issue in the Texas legislature; the chief lobbyists for each side even ended up having a very public fist fight in a hotel lobby in Austin.
But, in terms of overall market shares in oil products in the United States, Standard’s position of overwhelming dominance was receding. Its control of refining capacity declined from over 90 percent in 1880 to only 60 to 65 percent by 1911.
In Kansas, the governor pushed a scheme to build a state-owned oil refinery, which would compete with Standard’s and would be staffed by penitentiary inmates.
When Rockefeller testified in one of the Ohio suits, he was so unforthcoming that a New York newspaper headlined, “John D. Rockefeller Imitates a Clam.”
In 1893, he came down with a stress-related disease, alopecia, which not only caused him a good deal of physical distress, but also robbed him of all his hair—which, afterward, he sought to remedy variously with a skullcap or a wig.
Still, Rockefeller began to distance himself, and finally, by 1897, he had—not yet sixty years of age—stepped aside, turning administrative leadership over to one of the other directors, John D. Archbold.
Tarbell devoted herself to career and never married, though later in life she was to become a celebrant of family life and an opponent of women’s suffrage.
After John Archbold, H. H. Rogers was the most senior and powerful director of Standard Oil, as well as a prominent speculator in his own right. He was responsible for Standard’s pipeline and natural gas interests. But Rogers’s own interests did not end with business. In one of the great services to American letters, he had, a decade earlier, taken control of Mark Twain’s tangled and bankrupt finances, put them right, and thereafter managed and invested the famous author’s money so that Twain could, as Rogers instructed him, “stop walking the floors.”
Twain came and went as he pleased from Rogers’s office at 26 Broadway, and sometimes lunched with the “gentlemen upstairs” in their private dining room.
He asked Twain to find out what kind of history. Twain was also a friend of McClure’s, and he inquired of the publisher. One thing led to another, and Twain ended up arranging for Ida Tarbell to meet Rogers. She now had her connection.
Over the next two years, she met regularly with Rogers. She would be ushered in one door and out another; company policy forbade visitors to encounter one another.
Theodore Roosevelt embodied the progressive movement. The youngest man ever to enter the White House up to that time, he was forever bursting with energy and enthusiasm.
With equal passion, Roosevelt embraced reform causes of all sorts—from the mediation of the Russo-Japanese War to the promotion of simplified spelling. For the former he received the Nobel Peace Prize in 1906. As to the latter, in the same year, he sought to have the Government Printing Office adopt three hundred simplified spellings of familiar words—for instance, “dropt” for “dropped.” The Supreme Court refused to accept such simplifications in legal documents, but Roosevelt steadfastly kept to them in his own private letters.
Roosevelt ordered the hundred thousand dollars returned, and thereupon, in a burst of publicity, promised every American what became his slogan, a “square deal.” Whether the money was ever actually returned was another question. Attorney General Philander Knox told Roosevelt’s successor, William Howard Taft, that, when he had walked into Roosevelt’s office one day in October 1904, he had heard the President dictating a letter directing the return of the money to the Standard Oil Company. “Why, Mr. President, the money has been spent,” Knox said. “They cannot pay it back—they haven’t got it.” “Well,” replied Roosevelt, “the letter will look well on the record, anyhow.”
Over a course of more than two years, 444 witnesses gave testimony, and 1,371 exhibits were introduced. The full record was to cover 14,495 pages bound in twenty-one volumes. The Chief Justice of the Supreme Court later described the transcript as “inordinately voluminous . . . containing a vast amount of conflicting testimony relating to innumerable, complex, and varied business transactions, extending over a period of nearly forty years.”
How large was the judgment, he asked? “The maximum penalty, I believe—twenty-nine million dollars,” replied Rockefeller. Then, as an afterthought, he added, “Judge Landis will be dead a long time before this fine is paid.” With that single outburst, he resumed his golf, seemingly unperturbed, and went on to play one of the best games of his life. Indeed, Landis’s judgment was eventually overturned.
The company transported more than four-fifths of all oil produced in Pennsylvania, Ohio, and Indiana. It refined more than three-fourths of all United States crude oil; it owned more than half of all tank cars; it marketed more than four-fifths of all domestic kerosene and was responsible for more than four-fifths of all kerosene exported; it sold to the railroads more than nine-tenths of all their lubricating oils. It also sold a vast array of by-products—including 300 million candles of seven hundred different types. It even deployed its own navy—seventy-eight steamers and nineteen sailing vessels. How was all this to be dismembered?
Standard Oil was divided into several separate entities. The largest of them was the former holding company, Standard Oil of New Jersey, with almost half of the total net value; it eventually became Exxon—and never lost its lead. Next largest, with 9 percent of net value, was Standard Oil of New York, which ultimately became Mobil. There was Standard Oil (California), which eventually became Chevron; Standard Oil of Ohio, which became Sohio and then the American arm of BP; Standard Oil of Indiana, which became Amoco; Continental Oil, which became Conoco; and Atlantic, which became part of ARCO and then eventually of Sun.
Moreover, cracked gasoline actually had a much better antiknock value than natural gasoline, which meant more power and allowed for higher-compression engines.
The parts would soon be worth more than the whole. Within a year of the dissolution of Standard Oil, the value of the shares of the successor companies had mostly doubled; in the case of Indiana, they tripled. Nobody came out of this better or richer than the man who owned a quarter of all the shares, John D. Rockefeller. After the break-up, because of the increase in the price of the various shares, his personal worth rose to $900 million (equivalent to $9 billion today).
And Marcus Samuel was to become the most vociferous proponent of the conversion of shipping from coal to oil.
That historic development had actually begun in a small way in the 1870s, when ostatki, as the waste residue from kerosene refining was called in Russia, was first successfully used to fuel ships on the Caspian Sea. Pure necessity drove this innovation: Russia had to import coal from England, a very expensive proposition, and wood was scarce in many areas of the empire. Subsequently, the new Trans-Siberian Railway began to use oil fuel, supplied by Samuel’s syndicate through Vladivostok, rather than coal or wood.
Another British oil man, who would from time to time encounter Samuel on horseback, observed with some acuity that Samuel rode his horse much as he rode his vast business, always looking as though he were about to fall off, but never quite doing so.
Much later, his inspirational advice to young men starting out was, “You will go a long way in business if you train yourself to be able to appraise figures almost as rapidly and as shrewdly as a good judge of character can sum up his fellow-men.”
“Simplicity rules everything worth while, and whenever I have been up against a business proposition which, after taking thought, I could not reduce to simplicity, I have realized that it was hopelessly wrong and I have let it alone.”
One of his responsibilities was to interview personally every lunatic who was to be certified insane at the Mansion House, and some were to think he spent more time with the lunatics than he did with the oil men.
Limping along, with collapse in the air, Shell was barely able to pay 5 percent dividends, while Royal Dutch’s were at the rates of 65 percent, 50 percent, and then in 1905, an immensely satisfying 73 percent.
Russia’s industrial economy had gone through stupendous growth under the favorable policies of Count Sergei Witte, the powerful finance minister from 1892 to 1903. Trained as a mathematician, Witte had risen from a position as a lowly railroad administrator to become the master of the Russian economy by sheer ability—a most unusual means of ascent in the Czarist empire.
Witte was truly an exception, a man of great talents in a government populated by people of little ability.
The alumni included a still more important figure, a young Georgian, a former seminarian and son of a shoemaker. His name was Joseph Djugashvili, though he operated in the underground under the name “Koba”—Turkish for “Indomitable.” Only later did he begin to call himself Joseph Stalin.
The Russo-Japanese War began in January 1904 with Japan’s successful surprise attack against the Russian fleet at Port Arthur. Thereafter, the Russian forces lurched from one military disaster to the next, culminating in the burial at sea of the entire Russian fleet at the Battle of Tsushima.
When the news reached Baku, the oil workers again went out on strike. Government officials, fearful of revolution, provided arms to the Moslem Tatars, who rose up to massacre and mutilate Christian Armenians, including the leaders of the oil industry. A legend arose afterward about one of the wealthiest Armenian oil men, one Adamoff. A crack shot, he stationed himself on the balcony of his house, and with the aid of his son, held off a siege for three days, until finally he was killed, the house set fire, and his forty dependents either burned to death or dismembered.
Strikes and open rebellion spread again throughout the empire in September and October of 1905. In the Caucasus, it was race and ethnic conflict, and not socialism, that drove events. Tatars rose up once more in an attack on the oil industry throughout Baku and its environs, intent on killing every Armenian they could find, setting fire to buildings where Armenians had taken refuge, pillaging every piece of property on which they could lay their hands.
The news from Baku had a profound effect on the outside world. Here, for the first time, a violent upheaval had interrupted the flow of oil, threatening to make a vast investment worthless.
In October 1905, the Czar granted, albeit completely against his will and grain, a constitutional government, which included a Parliament, the Duma. Though the revolution was over, the oil region remained in turmoil. The oil workers of Baku elected Bolshevik deputies to the Duma; Nobel’s chief in Batum was murdered in the street. In 1907, strikes swept through Baku, again threatening to become a general strike, while the Czar stupidly undermined the constitution that might ultimately have preserved him and his dynasty. Also in 1907, the Bolsheviks sent Stalin back to Baku, where he directed, organized, and as he said, fomented “unlimited distrust of the oil industrialists” among the workers.
Meanwhile, the Russian government unwisely raised internal transport tariffs to help satisfy the ravenous appetites of its treasury. The result was to increase further the price of Russian oil products on the world market, making them even less competitive. Its price advantage had turned into a disadvantage. Increasingly, Russian oil was a residual, to be bought when other petroleum was not available.
That way, the Rothschilds transformed their uncertain and insecure Russian assets into substantial holdings in a rapidly growing, diversified international company with outstanding prospects.
The acquisition of the Rothschilds’ interests, in turn, gave the Group a globally balanced portfolio of production—53 percent from the East Indies, 17 percent from Rumania, and 29 percent from Russia. Obviously, there was significant risk going into Russia. But the advantages from integrating this additional output into its worldwide system were immediate. As to the risks, time would tell.
And Persia itself—or Iran, as it would be known from 1935 onward—would emerge into a prominence on the world stage that it had not enjoyed since the days of the ancient Persian and Parthian empires.
He was an investor, a speculator, a putter-together of syndicates, not a manager, and he was looking for a new investment. The prospect of petroleum in Persia attracted him, he was again willing to take a chance, and, in so doing, he would become the founder of the oil industry of the Middle East.
The Shah, Muzaffar al-Din, was “merely an elderly child,” in the words of Hardinge, the British minister.
The population was abysmally lacking in technical skills, and indeed, the hostility of the terrain was more than matched by the hostility of the culture toward Western ideas, technology, and presence.
To the Azeris, even the introduction of the lowly wheelbarrow was startling, a major innovation.
The British government would “regard the establishment of a naval base or of a fortified port in the Persian Gulf by any other power as a very grave menace to British interests, and we should certainly resist it with all the means at our disposal.” This declaration, said a delighted Lord Curzon, Viceroy of India, was “our Monroe Doctrine in the Middle East.”
That meant, first of all, building a Navy to rival Britain’s. As the Kaiser himself declared, “Only when we can hold out our mailed fist against his face will the British lion draw back.”
Winston Churchill was the nephew of the Duke of Marlborough and son of the brilliant but erratic Lord Randolph Churchill and his beautiful American wife, Jennie Jerome. He had entered Parliament as a Conservative in 1901, at age twenty-six. Three years later, he bolted from the Tory party over the question of free trade and crossed over to the Liberals. His political conversion did not impede his progress.
Churchill’s great gamble was to push for conversion to oil before the supply problem had been solved. He eloquently summarized the issue: “To build any large additional number of oil-burning ships meant basing our naval supremacy upon oil. But oil was not found in appreciable quantities in our islands. If we required it we must carry it by sea in peace or war from distant countries. We had, on the other hand, the finest supply of the best steam coal in the world, safe in our mines under our own land. To commit the Navy irrevocably to oil was indeed ‘to take arms against a sea of troubles.’ ”
“They have killed 15 men in experiments with oil engines and we have not killed one! And a d——d fool of an English politician told me the other day that he thinks this creditable to us.”
On August 1, Germany declared war on Russia and mobilized its armies. At 11:00 P.M. on August 4, after Germany had ignored a final British ultimatum against violating Belgium’s neutrality, Churchill flashed a message to all of His Majesty’s ships: “COMMENCE HOSTILITIES AGAINST GERMANY.” The First World War had begun.
Horses were still the basis of planning at the outbreak of the war—one horse for every three soldiers. Moreover, the reliance on horses greatly complicated the problems of supply, for each horse required ten times as much food as each man.
Once night had fallen, each taxi was crammed with soldiers—under the personal watch of General Gallieni, who noted, with a mixture of amusement and understatement, “Well, at least it’s not commonplace.”
Thousands and thousands of troops were rushed to the critical point on the front by Gallieni’s taxicabs.
Yet what was needed was not necessarily wanted. Entrenched opponents in the high command of the British Army did not take the idea seriously and did everything they could to squelch it. Indeed, it might well have died altogether had it not been taken up and championed by Winston Churchill. The First Lord of the Admiralty appreciated military innovation and was outraged at the failure of the Army and the War Office to begin developing such vehicles.
And, in the face of the Army’s resistance, Churchill doled out Navy funds for the continuing research needed to develop the new vehicle. Reflecting the Navy’s temporary sponsorship, the new machine was known as the “land cruiser” or the “landship.” Churchill called it the “caterpillar.” To maintain secrecy, it needed a code name while it was being tested and transported, and various names—among others, the “cistern” and the “reservoir”—were considered. But finally it became known by another of its code names—the “tank.”
When the German High Command declared in October 1918 that victory was no longer possible, the first reason it gave was the introduction of the tank.
It was rightly said after the war that the victory of the Allies over Germany was in some ways the victory of the truck over the locomotive.
By July 1915, every machine that had been in the air at the outbreak of the war, less than a year earlier, had become obsolete.
By the last months of the struggle, the speed of the most advanced aircraft had more than doubled, to over 120 miles per hour.
Yet, despite the central role that the naval rivalry had played in leading the two countries to war, the Grand Fleet and the High Seas Fleet met only once in major engagement—at the Battle of Jutland on May 31, 1916. The outcome of that legendary encounter has been debated ever since. The German fleet was victorious in a tactical sense, succeeding as it did in escaping from a trap. But, strategically, the British won, for they were able to dominate the North Sea for the rest of the war and keep the German fleet penned up in its home bases.
Anglo-Persian’s most important step was the purchase from the British government of one of the largest petroleum distribution networks in the United Kingdom, a company called British Petroleum.
Now, with its acquisition of British Petroleum, Anglo-Persian acquired not only a major marketing system, but also what would subsequently prove a most useful name.
In the middle of the night at the end of January 1915, the plant in Rotterdam was disassembled, part by part, each piece numbered and camouflaged, and then carried to the docks and loaded onto a Dutch freighter.
Still, the effects of the submarine attacks were large and quickly felt. Tonnage lost in the first half of 1917 was twice that lost in the comparable period in 1916. Between May and September, Standard Oil of New Jersey lost six tankers, including the brand new John D. Archbold.
In 1914, the United States had produced 266 million barrels—65 percent of total world output. By 1917, output had risen to 335 million barrels—67 percent of world output. Exports accounted for a quarter of total U.S. production, with the bulk going to Europe. Now that access to Russian oil had been closed off by war and revolution, the New World had become the oil granary for the Old; altogether, the United States was to satisfy 80 percent of the Allies’ wartime requirement for petroleum.
In January 1918, the Fuel Administration ordered almost all industrial plants east of the Mississippi to close for a week in order to free fuel for hundreds of ships filled with war materials for Europe that were immobilized in East Coast harbors for want of coal. Thereafter, the factories were ordered to remain closed on Mondays to conserve coal. “Bedlam broke loose,” observed Colonel Edward House, Woodrow Wilson’s political confidant. “I have never seen such a storm of protest.”
The number of cars in use had almost doubled between 1916 and 1918.
Those who challenged Norton-Griffiths or stood in his way were overwhelmed by the sheer force of his personality. If that proved insufficient, he would deliver a powerful kick or pull out his revolver and shout, “I don’t speak your blasted language.”
They began to seek access to Baku petroleum in March 1918 with the Treaty of Brest-Litovsk, which ended hostilities between Germany and revolutionary Russia. However, the Turks, the ally of Germany and Austria, had already begun to advance toward Baku. Fearing that success by their ally would lead to the wanton destruction of the oil fields, the Germans promised the Bolsheviks that they would try to restrain the Turks in exchange for oil. “Of course, we agreed,” said Lenin. Joseph Stalin, who by then had emerged as one of the leading Bolsheviks, telegraphed the Bolshevik Baku Commune, which controlled the city, ordering it to comply with this “request.” But the local Bolsheviks were in no mood to go along. “Neither in victory nor in defeat will we give the German plunderers one drop of oil produced by our labor,” they replied.
Sir Maurice Hankey, the extremely powerful secretary of the War Cabinet, wrote to Foreign Secretary Arthur Balfour that, “oil in the next war will occupy the place of coal in the present war, or at least a parallel place to coal. The only big potential supply that we can get under British control is the Persian and Mesopotamian supply.” Therefore, Hankey said, “control over these oil supplies becomes a first-class British war aim.”
But the newly born “public diplomacy” had to be considered. In early 1918, to counter the powerful appeal of Bolshevism, Woodrow Wilson had come out with his idealistic Fourteen Points and a resounding call for the self-determination of nations and peoples after the war. His own Secretary of State, Robert Lansing, was appalled by the President’s broadside. The call for self-determination, Lansing was sure, would result in many deaths around the world. “A man, who is a leader of public thought, should beware of intemperate or undigested declarations,” he said. “He is responsible for the consequences.”
“I do not care under what system we keep the oil,” he said, “but I am quite clear it is all important for us that this oil should be available.” To help make sure this would happen, British forces, already elsewhere in Mesopotamia, captured Mosul after the armistice was signed with Turkey.
As the chairman of Burmah remarked, “We couldn’t very well haggle or bargain” with Churchill. Burmah’s officers worried about how to pay the money, since if the recipient of such a large fee was not disclosed on the books, the auditors would not approve. Finally it was decided to set up a secret account.
But before anything further could be done, there was an outside intervention. Baldwin called a snap general election at the end of 1923, and Churchill, the job not yet done, resigned his commission, returned the initial fee, and charged back into his natural and beloved fray, politics.
The Undersecretary of the Treasury wrote to Charles Greenway, chairman of Anglo-Persian, that the government “have no intention of departing from the policy of retaining these shares.” The minister responsible for the Treasury was the new Chancellor of the Exchequer, none other than the newest convert to Conservatism, Winston Churchill.
The leaders of engineering and scientific geology shared the fear. The director of the United States Bureau of Mines predicted in 1919 that “within the next two to five years the oil fields of this country will reach their maximum production, and from that time on we will face an ever-increasing decline.”
American eyes fastened on the Middle East, particularly Mesopotamia, under British mandate. But the door was manifestly not open there. When two Standard Oil of New York geologists slipped into the territory, the British civil commissioner handed them over to the chief of police of Baghdad.
But his career as a king was not yet over. The British needed a monarch for Iraq, another new state, this one to be formed out of three former provinces of the Turkish empire.
What Churchill, then the head of the Colonial Office, wanted was an Arab government, with a constitutional monarch, that would be “supported” by Britain under League of Nations mandate. It would be cheaper. So Churchill chose the out-of-work Faisal as his candidate. Summoned from exile, he was crowned King of Iraq in Baghdad in August 1921. Faisal’s brother Abdullah—originally destined for the Iraqi throne—was instead installed as king “of the vacant lot which the British christened the Amirate of Transjordan.”
The minority Sunni Arabs held political power, while the Shia Arabs were by far the most numerous. To complicate things further, the Jews were the largest single group among inhabitants of Baghdad, followed by Arabs and Turks.
The first three days, the convoy managed five and two thirds miles an hour—“not quite so good,” said Eisenhower, “as even the slowest troop train.”
Having left Washington on July 7, the caravan did not arrive in San Francisco until September 6, where the drivers were greeted with a parade, followed by a speech by the Governor of California, who compared them to the “Immortal Forty-Niners.” Eisenhower was looking ahead. “The old convoy,” he recalled, “had started me thinking about good, two-lane highways.”
By 1929, 78 percent of the world’s autos were in America.
In that year, there were five people for each motor vehicle in the United States, compared to 30 people per vehicle in England and 33 in France, 102 people per vehicle in Germany, 702 in Japan, and 6,130 people per vehicle in the Soviet Union.
The transformation of America into an automotive culture was accompanied by a truly momentous development: the emergence and proliferation of a temple dedicated to the new fuel and the new way of life—the drive-in gasoline station.
In the infancy of the auto age, some retailers experimented with gasoline wagons that delivered fuel from house to house. That idea never really caught on, partly because of the frequency with which the wagons tended to explode.
The number of drive-in gasoline stations, specifically, had grown from perhaps 12,000 in 1921 to about 143,000 in 1929.
The stations were everywhere—big city street corners, main streets in small towns, country crossroads. East of the Rockies, such facilities were called “filling stations”; west of the Rockies, they were known as “service stations.”
Competition forced the oil companies to develop trademarks to assure national brand identification. They became the icons of a secular religion, providing drivers with a feeling of familiarity, confidence, and security—and of belonging—as they rolled along the ever-lengthening ribbons of roads that crossed and crisscrossed America.
Gas stations were also the source for what one expert described as “uniquely American contributions to the development and growth of cartography”—the oil company road map.
By 1920, Shell of California was providing free uniforms to attendants and paying for up to three launderings a week. It prohibited the attendants from reading magazines and newspapers while on duty, and its manual forbade the accepting of tips: “Air and water service is a gratuity which you are expected to render the public, showing no distinction as to whether the individual is a Shell customer or not.” By 1927, the “service station salesmen,” as they were called, were expected to ask the customer, “Can I check the tires for you?” They were also forbidden to allow “personal opinions and prejudices” to get in the way of service: “Salesmen should be careful in their attendance upon Oriental and Latin classes of customers and refrain from using broken English in conversation with them.”
Teapot Dome in Wyoming, named for the shape of a geological structure, was one of three oil fields (the other two were in California) that had been set aside as “naval oil reserves” by the Taft and Wilson Administrations as one result of the pre–World War I debate about converting the U.S. Navy from coal to oil.
When Warren G. Harding, chosen as the Republican candidate because among other reasons he “looked like a President,” won the White House in 1920, he sought, like any good politician, to appeal to both sides in the resource debate, celebrating “that harmony of relationship between conservation and development.”
“It would have been possible to pick a worse man for Secretary of the Interior,” the conservationist added, “but not altogether easy.”
“This isn’t the first time that this rumor has come to me,” he said, “but if Albert Fall isn’t an honest man, I’m not fit to be President of the United States.” Both propositions were soon tested to their limit.
Sinclair was sentenced to prison for six and a half months for contempt both of court and of the Senate. On his way to jail, he stopped to attend a board meeting of the Sinclair Consolidated Oil Corporation, where the other directors formally tendered him “a public vote of confidence.” Doheny was judged innocent and never went to jail, leading one Senator to complain, “You can’t convict a million dollars in the United States.”
The “rule of capture” had continued to govern the industry’s operations since its early days in western Pennsylvania, and it had repeatedly been sanctioned by the courts, based upon the English common law regarding migratory wild beasts and game. To some property owners who complained to one court that their oil was being drawn off by their neighbors, the justices had scant solace to offer: “Only go and do likewise.” Because of the rule, every operator everywhere in the United States put down his wells and produced as rapidly as he could, draining not only the oil under his own property but also that under his neighbor’s property, before his neighbor drained his own.
In turn, he declared that an “oil man is a barbarian with a suit on.”
Thus, though the demand for gasoline increased, the demand for crude oil did not grow at the same rate, adding to the rising surplus.
To his son, he added, “Do not hesitate for one second to be in opposition to your colleagues or in overriding their decisions. No business can be a permanent success unless its head is an autocrat—of course the more disguised by the silken glove the better.”
But Pearson also owed his position in Mexico to Díaz’s cold political calculation. “Poor Mexico,” the dictator was supposed to have once remarked, “so far from God and so close to the United States.”
Mexico quickly became a major force in the world oil market. The quality of its crudes was such that they were mainly refined into fuel oil, which competed directly with coal for industrial, railway, and shipping markets. By 1913, Mexican oil was even being used on Russian railroads.
Until 1884, resources in the country beneath the ground, in the “subsoil,” had belonged first to the crown and then to the nation. The regime of Porfirio Díaz had altered that legal tradition, giving over ownership of subsoil resources to the farmers and ranchers and the other surface landowners, who, in turn, welcomed foreign capital, which eventually controlled 90 percent of all oil properties. One of the major objectives of the revolution had been the restoration of the principle of national ownership of those resources.
The oil companies, for their part, felt increasingly vulnerable and endangered, which led to reduced investment and a rapid retreat in terms of activity and personnel. The effects quickly registered on output, which plummeted, and Mexico soon ceased being a world oil power.
His poor country needed revenues if it was to develop economically, and if he was to become rich. The two objectives blended as one. Revenues meant foreign capital. Oil was Gómez’s opportunity; but he shrewdly recognized that, in order to lure foreign investors, he would have to guarantee a stable political and fiscal environment.
As late as 1929, Shell protected the cabins of its tractors with several layers of a special cloth, dense enough to stop Indian arrows.
The La Rosa strike confirmed that Venezuela could be a world-class producer. The discovery inaugurated a great oil frenzy. Over a hundred groups, mostly American, but some British, were soon active in the country. They extended from the largest companies down to independent oil men like William F. Buckley, who obtained a concession to build an oil port.
Development proceeded at breakneck speed. In 1921, Venezuela produced just 1.4 million barrels. By 1929, it was producing 137 million barrels, thus making it second only to the United States in total output.
Jersey and the Nobels started intense negotiations—despite the strong possibility that the Nobels were trying to sell properties they might no longer own. That risk became more real in April 1920, when the Bolsheviks recaptured Baku and promptly nationalized the oil fields. The British engineers who worked in Baku were thrown into prison, while some of the “Nobelites” were to be put on trial as spies. Yet, so attractive was the deal if the Bolsheviks failed, and so strong the conviction that they would, that Jersey and the Nobels continued their discussions. In July 1920, less than three months after the nationalization, the deal was consummated. Standard Oil bought controlling rights to half of the Nobel oil interests in Russia at what was definitely a bargain basement price—$6.5 million down, with a commitment of up to another $7.5 million. In exchange, Standard gained control over at least one-third of Russian oil output, 40 percent of refining, and 60 percent of the internal Russian market. But, notwithstanding what the Western oil men wanted to believe, the risk was very high indeed—and all too evident. What if the new Bolshevik regime did, after all, survive? Having already nationalized the oil fields, it might operate them itself or put them on the international auction block.
The Soviet oil industry, virtually dormant from 1920 to 1923, thereafter revived quickly, helped by imports of large amounts of Western technology, and the USSR soon reentered the world market as an exporter.
Privately, Teagle was bitter about how the whole matter had been handled. It was the classic business problem of not enough time, of the day never being long enough for long-term thinking.
“I am so glad that nothing came of these Soviet deals,” he wrote to Teagle. “I feel that everybody will regret at some time that he had anything to do with these robbers, whose only aim is the destruction of all civilization and the re-establishment of brute force.”
Vacuum’s president observed that American businessmen and farmers were busily selling cotton and other products to Russia. “Is it more unrighteous,” he asked, “to buy from Russia than to sell to it?” That would be a long-persisting question.
The Jersey board decided to adopt a neutral stance—neither to seek a contract with the Soviets nor to participate in a boycott. Riedemann summed up the matter in the autumn of 1927. “Personally,” he said, “I have buried Russia.” If so, it would prove to be a lively corpse, as growing volumes of Soviet oil entered a sated world market.
He had a silky-smooth complexion, quite unusual for a man his age, which he attributed to eating carrots.
In sheer audacity, few could have matched the General Lee Development Company. Two promoters discovered a certain Robert A. Lee, a descendant of General Robert E. Lee, and prevailed upon him to tell investors around the country, “I would rather lead you and a thousand others to financial independence than to have won Fredericksburg or Chancellorsville.”
Doc Lloyd had provided Dad Joiner with a description of the geology of the East Texas region. To say it was misleading would be an understatement; it was totally incorrect, fabricated. Lloyd was what was called a “trendologist”; he drew up a map of the major oil fields of the United States, showing trend lines from all of them intersecting in East Texas. But Doc Lloyd did one memorable thing; he told Joiner exactly where to drill, when almost everybody thought the idea was completely ridiculous.
A crewman became so excited that he pulled a pistol from his pocket and started firing into the oil spray in the sky. Three men quickly jumped him and wrestled the gun away. One spark could have ignited the volatile escaping gas, causing the well to explode, killing everybody on the spot.
Later, he achieved notoriety as a patron of right wing causes, a promoter of health foods, and an inveterate enemy of white flour and white sugar.
At the beginning of August 1931, while Federal judges were considering the constitutionality of Oklahoma’s prorationing laws, the governor, “Alfalfa Bill” Murray, proclaimed a state of emergency, declared martial law, and ordered the state militia to take control of the major oil fields.
Their holiday entourage included secretaries, typists, and advisers, who were housed in a specially secured cottage seven miles away.
For, by 1928, a Soviet company, Russian Oil Products, was the fourth-largest importer into the United Kingdom.
They were acting under an American law called the Webb-Pomerene Act of 1918, which allowed U.S. companies to do abroad what the antitrust laws did not permit them to do at home—come together in a combination—so long as the combination’s activities took place exclusively outside the United States.
That same year, an observer of the oil industry, noting the intensification of political and economic nationalism in Europe, summed it all up very simply: Operations in the oil business in Europe, he said, “are 90 percent political and 10 percent oil.” The same seemed to be true in the rest of the world.
“It has often been said,” observed a visitor, “that the Shah’s greatest achievement is his victory over the Mullahs.”
At the age of eighteen, having already worked as a tax collector, a printer’s devil, and a jailkeeper, he enlisted in the Mexican Revolution. Recognized for his valor, his self-contained modesty, and his leadership, he was a general by the age of twenty-five and became a protégé of Plutarco Calles, the jefe máximo, the “maximum chief” of the revolution.
Washington could already see the unsettling effects from the British-led embargo and the efforts to close off traditional markets to Mexico. Nazi Germany became Mexico’s number-one petroleum customer (and at discount prices or on barter terms), with Fascist Italy next. Japan became a major customer as well. Japanese companies were also exploring for oil in Mexico and were discussing the construction of a pipeline from the oil fields across the country to the Pacific.
All of this meant that, in a military crisis, production in the Latin American countries would be essential to Britain, not only “because of their size of production, but because they are favourably placed from a sea transport point of view.”
But inescapably, nationalism had to make some concessions to economic reality. In the aftermath of the expropriation, not only was the promised wage hike indefinitely postponed, but wages were, in fact, cut.
And his later amassing of immense wealth was no less remarkable for a ruler who, in his early days, could carry his entire national treasury in the saddlebags of a camel.
The Saudi dynasty had been established by Muhammad bin Saud, the emir of the town of Dariya in the Nejd, the plateau in central Arabia, in the early 1700s. There he took up the cause of a spiritual leader, Muhammad bin Abdul Wahab, who espoused a stern puritanical version of Islam that would become the religious mortar for the dynasty and its state. The Saudi family, allied with the Wahabis, began the rapid program of conquest that within half a century carried them to domination of much of the Arabian Peninsula.
For his part, Abdul Rahman had two goals: to reestablish the Saudi dynasty as master of Arabia, and to make universal the Wahabi branch of Sunni Islam. His son, Ibn Saud, would be the instrument to both ends.
To commemorate the consolidation, the name of the realm was changed in 1932 from the “Kingdom of the Hejaz and Nejd and Its Dependencies” to the name by which it is known today—Saudi Arabia.
The companion was an Englishman, a former official in the Indian Civil Service, who had set himself up as a merchant in Jidda and had, just a few months earlier, converted to Islam under the tutelage of Ibn Saud. The King had personally given him his Islamic name, Abdullah. But his real name was Harry St. John Bridger Philby, known as Jack to his English friends, and now, perhaps, remembered best as the eccentric father of one of the most notorious double agents of the twentieth century, Harold “Kim” Philby, who became the head of anti-Soviet counterespionage in British intelligence, while actively spying for the Soviets. He might well have taken lessons from his father on how to play multiple roles. Indeed, many years later, on reading Kim Philby’s own account of his years as a double agent, the retired court interpreter for Ibn Saud could only marvel that Kim was “a true replica of his father.”
All cars traveling that route should carry at least five passengers as “only five persons could get cars out of sand.”
Suleiman was certainly the most important man in the kingdom outside the royal family. He carried an enormous workload that was based upon an accounting system for public finances that he had invented and that only he could understand.
He needed money urgently—among other things, in order to pay the Cambridge University fees for his son, Kim. For his services, Socal agreed to give Jack Philby one thousand dollars a month for six months, plus bonuses both upon the signing of a concession contract and on the discovery of oil. Thus, Kim Philby was able, after all, to pursue his studies at Cambridge, where he took the first steps on the way to becoming a Soviet spy.
Care had been taken that all the coins bore the likeness of a male English monarch, and not Queen Victoria, which, it was feared, would have devalued them in the male-dominated society of Saudi Arabia.
The Great Depression had more generally crippled the economies of Kuwait and the other sheikhdoms. So bad had conditions become that slaveowners along the Arab coast were selling off their African slaves at a loss, to avoid the maintenance costs.
He attributed the “wonderful victory,” at least in good part, to an individual he decided was the most popular man in England, the American ambassador Andrew Mellon—the former U.S. Treasury Secretary and scion of the family that controlled Gulf Oil.
Every item that the geologists, engineers, and construction workers needed—be it equipment or food—had to be brought in over a supply line that stretched back to the port of San Pedro, near Los Angeles.
The Japanese Army now had the desired pretext to launch an attack on Chinese forces, which it proceeded to do without delay. The Manchurian Affair had begun, marking the entry into an era of Japanese history they were to call, when it was all over, the Valley of Darkness.
But the strength of the opposition was brought home a few months later, when a youth, enraged at Japan’s cooperation with the United States and Britain, shot Hamaguchi at a railway station in Tokyo. He never fully recovered, and died in 1931. With him perished the spirit of cooperation, and instead, a new cult of ultranationalism—bolstered by “government by assassination”—took hold.
The fishing fleet, which was one of the main sources of Japan’s food, was ordered to give up oil and instead to depend exclusively upon wind power!
That night in The Hague, Walden and Elliott began working out what to do when the Japanese invasion came. The two men wasted little time in implementing their new plans. As a first step, all German, Dutch, and Japanese employees in the Indies who were of doubtful loyalty were dismissed. Plans were prepared for the destruction of Stanvac’s refinery and oil wells—but rather openly, as a deterrent to the Japanese.
Fearful that a beleaguered Britain would withdraw its own forces from the Far East, Washington made a fateful decision; it transferred the American fleet from its base in Southern California to Pearl Harbor on the island of Oahu in Hawaii. Since the fleet was, at the time, already on maneuvers near Hawaii, the move was accomplished with a minimum of fanfare. One purpose was to stiffen British resolve. The other was to serve as a deterrent to Tokyo.
The British now wanted to find a way to halt the flow of oil. They feared that if Japan did build large stockpiles, it would become relatively immune to any economic sanctions. Still, Roosevelt and Hull resisted cutting the flow.
To be sure, Hull had a startling advantage during all these talks. Thanks to the code-breaking operation known as “Magic,” the United States and Britain had cracked “Purple,” the top-secret Japanese diplomatic code. Thus, Hull was able to read, before the meetings with Nomura, Tokyo’s instructions to the ambassador and, afterward, Nomura’s reports. Hull played his part adroitly, never giving any hint of knowing more than he was supposed to know.
He believed that the Japanese were a chosen people and that they had a special mission in Asia. He would do his duty. “It’s out of the question!” he exclaimed. “To fight the United States is like fighting the whole world. But it has been decided. So I will do my best. Doubtless I will die.”
Yet, early in 1941, despite the secrecy, U.S. Ambassador Grew heard from Peru’s minister to Tokyo about a rumor that Japan was planning an attack on Pearl Harbor. Grew reported it to Washington, where it was immediately discounted. American officials simply could not believe—then or in the months following—that such an audacious assault was even possible. Moreover, officials in the Navy and State departments were astonished that an ambassador of Grew’s caliber could take seriously such an obviously ridiculous story.
With some support from the Navy, Prince Konoye, the Prime Minister, raised the possibility of a summit meeting with Roosevelt. Perhaps he could appeal directly to the American President. Konoye was even willing to try to jettison the Axis alliance with Hitler in order to reach an understanding with the Americans.
So, for the time being, Roosevelt, with his talent for ambiguity, neither agreed to nor rejected such a meeting.
With winter almost at hand in Tokyo, the Japanese authorities retaliated by cutting off all supplies of heating oil to the American and British embassies.
A Japanese diplomat arrived in Washington the third week of November to present the list of demands. To Secretary of State Hull, it read like an ultimatum. There was another arrival of Japanese origin in Washington that week: an intercepted “Magic” message of November 22, informing Nomura that American agreement to Tokyo’s latest proposals had to be received by November 29 at the very latest, for “reasons beyond your ability to guess.” For, “after that, things are automatically going to happen.”
Most of the American officials seemed to have forgotten—or never knew—that Japan’s great victory in the Russo-Japanese War had begun with a surprise attack on the Russian fleet at Port Arthur.
But in those tense months leading up to the attack, the clear signals were lost in the “noise”—the maze of complex, confusing, contradictory, competing, and ambiguous pieces of information. After all, there were also many indications that the Japanese were about to attack the Soviet Union.
Yamamoto himself might well have taken one more chance, but he was thousands of miles away, monitoring events from his flagship, off Japan. The commander of the Hawaiian task force, Chuichi Nagumo, was a far more cautious man; indeed, he had actually opposed the entire operation. Now, despite the entreaties of his emboldened officers and much to their chagrin, he did not want to send planes back to Hawaii, for a third wave, to attack the repair facilities and the oil tanks at Pearl. His luck had been so enormous that he did not want to take more risks. And that, along with the sparing of its aircraft carriers, was America’s only piece of good fortune on that day of devastation.
It was a strategic error with momentous reverberations. Every barrel of oil in Hawaii had been transported from the mainland. If the Japanese planes had knocked out the Pacific Fleet’s fuel reserves and the tanks in which they were stored at Pearl Harbor, they would have immobilized every ship of the American Pacific Fleet, and not just those they actually destroyed. New petroleum supplies would only have been available from California, thousands of miles away. “All of the oil for the Fleet was in surface tanks at the time of Pearl Harbor,” Admiral Chester Nimitz, who became Commander in Chief of the Pacific Fleet, was later to say. “We had about 4½ million barrels of oil out there and all of it was vulnerable to .50 caliber bullets. Had the Japanese destroyed the oil,” he added, “it would have prolonged the war another two years.”
Mussolini well knew that a shut-off of petroleum supplies would paralyze the Italian military. While his armies advanced, throwing poison gas against the hapless Ethiopians, he resorted to every form of bluff and bluster to intimidate the League.
The question of foreign supplies and sanctions was very much on Hitler’s mind. For he was on the eve of a critical move. The next month, March 1936, he boldly remilitarized the Rhineland, on the border with France, in violation of treaty agreements. It was the first time that he flexed his muscles on the international front, taking what afterward he was to call his gravest risk—the forty-eight hours that were “the most nerve-racking in my life.” He waited to be challenged, but the Western powers did nothing to stop him. The gamble had paid off. The pattern was to be repeated.
The anti-Nazi chairman of the managing board, Carl Bosch, the man who had made the deal with Standard Oil, was pushed aside, while most of the other members of the managing board who did not already belong to the Nazi party fell all over each other in their rush to sign up.
The campaign in the West actually improved Germany’s oil position, for German troops captured oil stocks considerably in excess of the fuel they had expended in the invasions.
From the very start, the capture of Baku and the other Caucasian oil fields was central to Hitler’s concept of his Russian campaign. “In the economic field,” one historian has written, “Hitler’s obsession was oil.”
If the oil of the Caucasus—along with the “black earth,” the farmlands of the Ukraine—could be brought into the German empire, then Hitler’s New Order would have within its borders the resources to make it invulnerable.
To support his plans, Hitler propounded his own bizarre calculations: that the number of German casualties in a war with Russia would be no greater than the number of workers tied up in the synthetic fuel industry. So there was no reason not to go ahead.
Warnings of the impending invasion came from many sources—Americans, British, other governments, his own spies—but Stalin resolutely refused to believe them. Scarcely hours before the invasion, a dedicated German communist defected from a German Army unit and slipped over to the Soviets with word of what was about to happen. Stalin suspected a trick and ordered the man shot.
Just after 3:00 A.M. the German Army, three million men strong, with 600,000 motor vehicles and 625,000 horses, struck along a wide front. The German onslaught caught the Soviet Union completely off guard and put Stalin into a nervous collapse that lasted several days.
The numbers were beyond comprehension; six to eight million Soviet soldiers were killed or captured in the first year of war, and still new men were thrown into battle.
The only thing standing in the way of Germany’s exploitation of Russian oil was the requirement to capture it.
The Germans had captured Russian oil supplies as they had done with French supplies, but this time to no avail, for Russian tanks ran on diesel, which was useless to the German panzer units, which ran on gasoline. Panzer divisions were sometimes at a standstill for several days at a time in the Caucasus, while they waited for fresh supplies. Trucks carrying oil could not catch up because they too had run short of fuel. Finally, in desperation, the Germans took to transporting oil supplies on the backs of camels.
He also had a considerable talent for improvisation, and not only in terms of tactics. Early in his campaign, Rommel ordered a number of “dummy tanks” built at workshops in Tripoli, which were then mounted on Volkswagens in order to frighten the British into thinking his armored divisions were much larger than in fact they were.
On the other side of the line, panic was building in Cairo. The British were burning their documents, Allied personnel were being squeezed into cattle trains for a hasty evacuation, and Cairo merchants were hurriedly replacing the photographs of Churchill and Roosevelt in their shop windows with those of Hitler and Mussolini.
Montgomery would afterward be criticized for being too cautious. But, as one German general would later say, “he is the only Field-Marshal in this war who won all his battles.”
The legend had fallen, and in March 1943 Rommel, now regarded by Hitler as defeatist, was removed from command of the Afrika Korps.
Three days later, a group of army officers tried to assassinate Hitler, but failed. Rommel was suspected both of involvement in the conspiracy and of plotting a separate surrender in the West to the Allies.
In Rommel’s papers, collected after his death, he left a hard-earned epitaph for the role of supply, and in particular of oil, in the age of mobile warfare. “The battle is fought and decided by the Quartermasters before the shooting begins,” he wrote, looking back on El Alamein.
Speer drove the slack out of the German economy. The two and a half years after his initial appointment would see a more than threefold increase in the production of aircraft, weapons, and ammunition, and a nearly sixfold increase in tanks. And these remarkable production records were being set at the same time that Allied forces were carrying out an extensive if not particularly successful strategic bombing campaign against a variety of German targets, such as the aviation industry and railway depots and ball-bearings factories. German industrial production was still rising; indeed it registered its highest level of the entire war in June 1944.
By 1944, according to one estimate, a third of the total work force in the German synthetic fuels industry, throughout the Reich, was slave labor.
Just before Allied invaders forced Italy out of the war, the German military had seized its oil stocks, adding substantially to its own reserves.
Jet fighters, a new German innovation that would have given the Luftwaffe a significant advantage, were being introduced into operational squadrons in the autumn of 1944. But there was no fuel to train the pilots, or indeed even to get the planes into the air.
Only when Russian soldiers were almost directly above his underground bunker, on the doorstep of the now-ruined Chancellery that Speer had designed for him, did Hitler commit suicide. He left orders that his body be doused in gasoline and burned so that it would not fall into the hands of the hated Slavs. There was enough gasoline at hand to carry out that final order.
Occasionally, they could still spot the red glow from Balikpapan high in the sky. They had done their work well; four decades of industrial creation had been destroyed in less than a day.
Premier Tojo bragged that Hong Kong had fallen in eighteen days, Manila in twenty-six, and Singapore in seventy. A “victory fever” gripped the country; the stunning military successes spawned such a runaway stock market boom in the first part of 1942 that the government had to intervene to dampen it down.
The contrast between the Army’s and Navy’s top commanders was enormous. General Douglas MacArthur, although a strategist of great shrewdness, was also egoistic, bombastic, and imperious. At one meeting during the war, after three hours of listening to MacArthur, Franklin Roosevelt told an aide, “Give me an aspirin . . . . In fact, give me another aspirin to take in the morning. In all my life nobody has ever talked to me the way MacArthur did.” For his part, Admiral Chester Nimitz was a soft-spoken, unassuming team player, who, when waiting word on the outcome of a battle, would practice on his pistol range or toss horseshoes right outside his office. “It simply was not in him to make sweeping statements or to give colorful interviews,” noted one correspondent.
Or, as one history of Japanese military operations put it, “The shortage of liquid fuel was Japan’s Achilles’ heel.”
The specific weakness was the vulnerability of Japanese shipping to submarines.
But only in late 1943 did the Japanese begin to give serious attention to the protection of shipping against submarines, including the establishment of convoys. Their efforts were inadequate and incomplete. “When we requested air cover,” one convoy commander said ruefully, “only American planes showed up.”
Of Japan’s total wartime steel merchant shipping, some 86 percent was sunk during the conflict and another 9 percent so seriously damaged as to be out of action by the time the war ended. Less than 2 percent of American naval personnel—the submariners—were responsible for 55 percent of the total loss.
And as the shortage worsened in 1945, navigation training was eliminated altogether; pilots were simply to follow their leaders to targets. Few were expected to return.
The Japanese had methodically calculated that, whereas eight bombers and sixteen fighters were required to sink an American aircraft carrier or battleship, the same effect could be achieved by one to three suicide planes. Not only was the pilot sure to cause more damage if he crashed his plane, not only would his commitment and willingness to die unnerve an enemy who could not comprehend the mentality of such an act, but—since he was not going to return—his fuel requirement was cut in half.
The American leaders knew that Japan’s fighting capability was crumbling, but they saw no sign that its fighting spirit was fading.
Yet so appalling were Japan’s circumstances and so great the shock of the atomic bombs, made worse by the new Soviet threat, that those seeking to end the war finally prevailed over the intense opposition from the military.
The cost of what Tojo and his collaborators had launched was enormous. The Pacific War in its entirety claimed upwards of 20 million lives, including about two-and-a-half million Japanese. Now, in 1945, Tojo’s own life hung in the balance, not because his self-inflicted wound was inevitably fatal, which it was not, but because of the difficulty, first, in locating a suitable doctor, and then in finding an ambulance that had any gasoline in its tank. So widespread was the fuel shortage that it proved easier to find an American doctor than an ambulance with gasoline. But finally a vehicle with sufficient fuel was located, and it arrived at Tojo’s house two hours after he had shot himself. Tojo was carried off to the hospital and nursed back to health. The next year he went on trial as a war criminal, was found guilty, and was in due course executed.
A few months after the war began, before France had been overrun by the Germans, the British and French governments, seeking to replicate what had been carried out in World War I, had jointly offered to pay Rumania $60 million to destroy its oil fields, and thus prevent the Germans from taking the output. But the two sides could never agree on a price, the deal was never struck, and Rumanian oil went, as feared, to the Germans. The destruction was left to be done by Allied bombers, much later in the war.
As Roosevelt was later to explain, “Old Dr. New Deal” had to call in his partner “Dr. Win-the-War.” And what Dr. New Deal had found unpalatable and unhealthy about Big Oil—its size and scale, its integrated operations, its self-reliance, its ability to mobilize capital and technology—was exactly what Dr. Win-the-War would prescribe as the urgent medicine for wartime mobilization.
The Germans also gained two other very significant advantages. They changed their code procedures, so that the British lost the ability to read the U-boat signals; and at the same time, they broke the ciphers that governed the movement of the Anglo-American convoys. The resulting destruction of Allied shipping was appalling. Once again looming before the Allies was one of their greatest fears: the choking off of the absolutely essential oil supplies to Britain from the Western Hemisphere.
In addition, Admiral Doenitz introduced Milchkuhs (“milk cows”), large underwater supply ships that could deliver diesel fuel as well as fresh food to the U-boats.
In May 1943 alone, 30 percent of the U-boats at sea were lost.
After all, about half the total tonnage shipped from the United States during the war was oil.
The Quartermaster Corps calculated that when an American soldier went overseas to fight, to support him in his job he required sixty-seven pounds of supplies and equipment, of which half was petroleum products.
He knew the importance of creating a legend about himself—be it the two revolvers, one pearl-handled, that Patton wore on his hips, or the nickname “Old Blood-and-Guts” that he had bestowed on himself in his unsuccessful bid in the 1930s to become Commandant of Cadets at West Point.
By 1941, Ibn Saud was once more confronting a stark financial crisis. The King had to face the harsh reality. As he explained to an American in 1942, “The Arabs have the religion, but the Allies have the money.”
His group was also to include one hundred live sheep, but after some negotiation, the number was reduced to just seven in light of the sixty days’ worth of provisions, including frozen meat, on board the American ship. Ibn Saud spurned the offer of the commodore’s cabin and slept instead on deck, in an improvised tent made of canvas, stretched over the forecastle, and furnished with Oriental carpets and one of the King’s own chairs.
Then, in February 1946, the Anglo-American Petroleum Agreement ran into a new problem. Its chief sponsor, Harold Ickes, got into a bitter scrap with Harry Truman over the President’s proposed appointment of Edwin Pauley, a California oil man, as Undersecretary of the Navy. Ickes, as had been his wont under Roosevelt, submitted his resignation. And a long good-bye it was—more than six pages typed, single-spaced. “It was the kind of letter sent by a man who is sure that he can have his way if he threatens to quit,” Truman later said. But Ickes had made a mistake; Truman was not Roosevelt. He accepted Ickes’s resignation tersely and with alacrity and delight. Ickes requested six weeks to wind up the many things that only he personally could attend to; Truman gave him two days to clean out his desk.
Ibn Saud was now in his mid-sixties, blind in one eye, and failing in health. His personal force and drive had created and held together the kingdom. But what would happen when that force was gone? He had sired upwards of forty-five sons, of whom thirty-seven were thought to be living, but would that be a factor for stability or for conflict and disorder?
At war’s end, their IPC shares reverted to both CFP and Gulbenkian. But then in late 1946, Jersey and Socony took up the concept of “supervening illegality” with what could only be called extreme enthusiasm. In their view, the whole IPC agreement was no longer in effect.
Shortly after the war, Stalin interrogated his petroleum minister, Nikolai Baibakov (who subsequently was to be in charge of the Soviet economy for two decades—until 1985, when Mikhail Gorbachev replaced him).
In England, the River Thames froze at Windsor. Throughout Britain, coal was in such short supply that power stations had to be shut down, and electricity to industry was either reduced greatly or cut off entirely. Unemployment abruptly increased six times over, and British industrial production was virtually halted for three weeks—something German bombing had never been able to accomplish.
But the emergence of a Jewish state, along with the American recognition that followed, threatened more than transit rights for the pipeline. Ibn Saud was as outspoken and adamant against Zionism and Israel as any Arab leader. He said that Jews had been the enemies of Arabs since the seventh century. American support of a Jewish state, he told Truman, would be a death blow to American interests in the Arab world, and should a Jewish state come into existence, the Arabs “will lay siege to it until it dies of famine.”
In 1948, Britain, at wit’s end, gave up its mandate and withdrew its Army and administration, plunging Palestine into anarchy. On May 14, 1948, the Jewish National Council proclaimed the state of Israel. It was recognized almost instantly by the Soviet Union, followed quickly by the United States. The Arab League launched a full-scale attack. The first Arab-Israeli war had begun.
Ibn Saud could certainly have canceled the concession, but at considerable risk. For Aramco was the sole source of his rapidly rising wealth, and the broader relationship with the United States provided the basic guarantee of Saudi Arabia’s territorial integrity and independence. Ever suspicious of the British, the King feared that London might be sponsoring a new coalition to champion the Hashemites, as it had done after World War I, enabling the Hashemites—whom Ibn Saud had driven from Mecca only two decades earlier—to recapture the western part of his country.
Ibn Saud found he could distinguish between Aramco, a purely commercial enterprise owned by four private companies, and the policy of the U.S. government elsewhere in the region. When other Arab countries declared that Saudi Arabia should cancel the concession to retaliate against the United States and prove its allegiance to the Arab cause, Ibn Saud replied that oil royalties helped to make Saudi Arabia “a stronger and more powerful nation, better to assist her neighboring Arab states in resisting Jewish pretensions.”
Construction proceeded on Tapline. It was finished in September 1950. Two more months were required to fill the line, and in November, the oil began to arrive at Sidon in Lebanon, the terminus on the Mediterranean, where it was picked up by tankers for the last leg of the journey to Europe. Tapline’s 1,040 miles would replace 7,200 miles of sea journey from the Persian Gulf through the Suez Canal. Its annual throughput was the equivalent of sixty tankers in continuous operation from the Persian Gulf, via the Suez Canal, to the Mediterranean. The oil it carried would fuel the recovery of Europe.
A new Federal agency, the National Security Resources Board, made a similar argument in a major policy review in 1948; importing large amounts of Middle Eastern oil would allow a million barrels per day of Western Hemisphere production to be shut in, in effect creating a military stockpile in the ground—“the ideal storage place for petroleum.”
As far back as the mid-1890s, operators were drilling off piers near Santa Barbara, but the wells produced no more than one or two barrels per day.
“Practical men, who believe themselves to be quite exempt from any intellectual influences,” John Maynard Keynes once said, “are usually the slaves of some defunct economist.” When it came to oil, the “practical men” included not only the businessmen that Keynes had in mind, but also kings, presidents, prime ministers, and dictators—as well as their ministers of oil and finance. Ibn Saud and the other leaders of the time, as well as the various potentates since, were under the thrall of David Ricardo, a fantastically successful stockbroker in late-eighteenth-century and early-nineteenth-century England. (Among other things, he made a killing on Wellington’s defeat of Napoleon at Waterloo.) By origin a Jew, Ricardo became a Quaker, then a learned member of the House of Commons, and was one of the founding fathers of modern economics. He and Thomas Malthus, his friend and intellectual rival, constituted between themselves the successor generation to Adam Smith.
He was also convinced that implacable resistance would probably be not only expensive, but also futile. Better to help create the new order, in Pratt’s view, than be a victim of it.
The Neutral Zone was the two thousand or so square miles of barren desert that had been carved out by the British in 1922 in the course of drawing a border between Kuwait and Saudi Arabia. In order to accommodate the Bedouins, who wandered back and forth between Kuwait and Saudi Arabia and for whom nationality was a hazy concept, it was agreed that the two countries would share sovereignty over the area. If every system has within it the seeds of its own destruction, then it was in the Neutral Zone—and in the way its oil rights were parceled out—that the erosion began that would eventually lead to the end of the postwar petroleum order.
The welder’s cowboy hat had provided reason enough to strike up a conversation, and the welder soon found himself a guest in Dasman Palace in Kuwait City, where he garnered lasting gratitude in the water-starved principality by adjusting the palace’s plumbing so that water use was cut 90 percent.
By age fifty-five, he had had his second facelift and was dyeing his hair a funny kind of reddish-brown, all of which gave him a rather wizened, embalmed look.
By the end of the 1950s, Getty was the seventh-largest marketeer of gasoline in the United States. Fortune magazine announced in 1957 that he was America’s richest man and its sole billionaire.
The Saudis did their research; they had even, unbeknownst to Aramco, retained their own adviser on American tax law, and to their delight, they learned about a most interesting and intriguing provision in American tax laws that would leave Aramco whole. It was called the “foreign tax credit.”
Reza Shah had thereafter brought order to the fractious country, begun modernizing it at a pell-mell rate, and had subjugated the powerful mullahs, whom both father and son regarded as dangerous and deadly enemies from the Middle Ages.
Alarmed by rapid German advances in Russia and North Africa, the Allies feared a pincer that would converge in Iran. They deposed Reza Shah, who had shown friendliness and sympathy toward the Nazis, and replaced him with his son, only twenty-one at the time.
Only one thing really united the country—hatred of foreigners and, in particular, the British. Never had so much malevolence been attributed to a so rapidly declining power. The English were regarded as almost supernatural devils, controlling and manipulating the entire nation.
Prime Minister Razmara did not know what to do. Finally, in a speech to Parliament in March 1951, he came out against nationalization. Four days later, as he was about to enter Tehran’s central mosque, he was assassinated by a young carpenter, who had been entrusted by Islamic terrorists with the “sacred mission” of killing the “British stooge.”
Mohammed Mossadegh would completely dominate the drama of the next two years. He would slyly outwit everyone—foreign oil companies, the American and British governments, the Shah, his own domestic rivals. He himself was a man of evident contradictions. Cosmopolitan, educated as a lawyer in France and Switzerland, he was fiercely nationalistic, antiforeigner, and obsessed by his opposition to the British. The son of a high bureaucrat and a great-grandson of a Shah from the preceding dynasty, Mossadegh was an aristocrat with extensive landholdings, including a 150-family village that belonged to him personally. Yet he took on the mantle of reform, republicanism and rabble-rousing, appealing to and mobilizing the urban masses.
British suspicions were further stirred when a junketing young Congressman, Representative John F. Kennedy, son of the former American ambassador to London, stopped in Tehran and suggested to the British ambassador that, if no settlement emerged, “it would be a good thing for American concerns to step into the breach.”
On September 25, 1951, Mossadegh gave the last remaining British employees at Abadan exactly one week to clear out. A few days later, Ayatollah Kashani declared a special national holiday—“a day of hatred against the British Government.”
Field control of what was called “Operation Ajax” was vested in the CIA’s Kermit Roosevelt, grandson of Theodore Roosevelt.
Operation Ajax created “a situation and an atmosphere in Tehran that forced the people to choose between an established institution, the monarchy, and the unknown future offered by Mossadeq.”
The other irascible old man who played a major role in the Iranian crisis did not fare nearly as well. Mohammed Mossadegh was put on trial by the reinstated Shah, delivered impassioned speeches in his own defense, and spent three years in prison. He lived out the rest of his days under house arrest on his estate, continuing his experiments with homeopathic medicines, much as he had done, three decades earlier, when the Shah’s father put him under house arrest.
Ferdinand de Lesseps, a Frenchman ever after celebrated as “the Great Engineer.” In fact, he was no engineer at all, though he was a man of other considerable accomplishments—as a diplomat, entrepreneur, and promoter. And his talents did not end there. At the age of sixty-four, he married a woman of twenty, and then, forthwith, proceeded to father twelve children.
The Americans believed that the relics of colonialism were an enormous handicap for the West in its struggle with communism and the Soviet Union.
But the coup de grace came when Senate Republicans told Dulles that foreign aid could be approved for only one of the two “neutralist” leaders slated for assistance: Tito of Yugoslavia, or Nasser of Egypt. But not both. Dulles chose Tito. Eisenhower confirmed the decision. The British were in accord. On July 19, 1956, Dulles canceled the proposed Aswan Dam loan, taking Nasser and the World Bank by surprise.
In an appalling blunder, the British and French never really factored the calendar of the American Presidential election into their calculations.
That same month, with the Suez crisis still brewing, Robert Anderson, a wealthy Texas oil man who was much admired by Eisenhower, made a secret trip to Saudi Arabia as the President’s personal emissary. The objective was to get the Saudis to apply pressure on Nasser to compromise. In Riyadh, Anderson warned King Saud and Prince Faisal, the Foreign Minister, that the United States had made great technical advances that would lead to sources of energy much cheaper and more efficient than oil, potentially rendering Saudi and all Middle Eastern petroleum reserves worthless. The United States might feel constrained to make this technology available to the Europeans if the canal were to be a tool of blackmail. And what might this substitute be, asked King Saud. “Nuclear energy,” replied Anderson. Neither King Saud nor Prince Faisal, who had done some reading on nuclear power, seemed impressed, nor did they show any worry about the ability of Saudi oil to compete in world energy markets. They dismissed Anderson’s warning.
Some later said that such a condition could, literally, slowly poison the mind. To make matters worse, Eden thereafter was on drugs for his stomach pain, as well as on stimulants (apparently amphetamines) to counter the effects of the painkillers. The interaction and side effects of these various medications were not then well known. Eden struck others as very agitated. The dosage of both sets of drugs had to be increased considerably after Nasser’s seizure of the canal. In early October, Eden collapsed and was rushed to a hospital with a 106 degree fever. Though he was back in command for much of October, he continued to show signs of ill health and was put on a regimen of ever-heavier medication.
IN THE PARLANCE of the oil industry, a giant oil field is called an “elephant.”
Proven world oil reserves in the noncommunist world increased from 62 billion barrels in 1948 to 534 billion barrels in 1972, almost a ninefold rise.
Out of every ten barrels added to free world oil reserves between 1948 and 1972, more than seven were found in the Middle East.
Many years later, one of his aides would recall, “Anybody who worked with him would go into the fire for him, although you couldn’t really explain why.”
(In 1954, Anglo-Iranian, taking up the name of the subsidiary it had acquired in World War I, had rechristened itself British Petroleum.)
And now a notable technological innovation, the cheap transistor radio, was carrying his rousing voice to the poor masses throughout the Arab world, making him a hero everywhere.
Syria joined Egypt to form the United Arab Republic, seemingly the first step in the realization of his dream of pan-Arabism. The apparent merger, ominously, brought together the two countries which—with the Suez Canal in Egypt and the Saudi and Iraqi pipelines passing through Syria—dominated the transit routes for Middle Eastern petroleum.
Crowds surged through the streets, holding aloft huge photographs of Nasser, along with live squirming dogs, which represented the Iraqi Royal Family. King Faisal II himself was beheaded by troops that stormed the palace. The Crown Prince was shot, and his hands and feet were hacked off and carried on spikes through the city. His mutilated body, along with those of a number of other officials, was dragged through the streets, and then hung from a balcony at the Ministry of Defense.
He wrote articles for exile newspapers and took up woodworking, but more than anything else, he devoted himself to the study of the oil industry.
Venezuela was a relatively high-cost producer, at about eighty cents a barrel, according to one estimate, compared to twenty cents among the Persian Gulf producers. So Venezuela would inevitably be at a disadvantage in an out-and-out production race. It would lose market share. Venezuela, thus, had a very good reason to seek to persuade the Middle Eastern producers to raise their taxes on the companies and thus the cost of their oil.
He came with a proposal to create a Western Hemisphere oil system, but one that would be run by the governments, not the oil companies. Under it, Venezuela would, as a nation, be given a quota—a guaranteed share of the U.S. market. No longer would it be the prerogative of the companies to decide from which producing country to bring in petroleum. What Pérez Alfonzo was asking was not so bizarre; after all, he could point out, it was exactly the way the American sugar quota system worked— each country had its share. But, then, oil was not sugar.
With so much attention and energy focused on this power struggle, and in the absence of a single dominating personality, Tariki was able to shape policy with considerable autonomy in an absolutely critical area, the one that happened to generate the kingdom’s entire wealth.
Between 1955 and 1960, Soviet oil output actually doubled, and by the end of the 1950s, the Soviet Union had displaced Venezuela as the second-largest oil producer in the world, after the United States.
Indeed, a political competition soon developed between Saudi Arabia and Egypt, which culminated in their proxy war in Yemen.
Those wanting to make an overseas call to the United States had to fly to Rome to do it.
Progress by the geologists in the field was impeded by obstacles they had never encountered before: an estimated three million land mines left over from World War II. Geologists and oil field workers were not infrequently injured or killed by undetected mines. The companies formed mine detection and clearance squads, and in due course, some of the Germans who had laid the mines for General Rommel were recruited to remove them.
Between 1960 and 1969, the market price for oil fell by 36 cents a barrel, a drop of 22 percent. Correcting for inflation, the fall was even steeper—a 40 percent decline.
The companies even switched from a Western to an Iranian calendar in order to push more production into that particular year.
Between 1948 and 1972, consumption tripled in the United States, from 5.8 to 16.4 million barrels per day—unprecedented except when measured against what was happening elsewhere. In the same years, demand for oil in Western Europe increased fifteen times over, from 970,000 to 14.1 million barrels per day. In Japan, the change was nothing less than spectacular; consumption increased 137 times over, from 32,000 to 4.4 million barrels per day.
To any manufacturer worried about the continuity of his production line, to a utility manager anxious about his ability to meet electricity requirements in the dead of winter, Lewis’s fiery rhetoric and the militancy of his United Mine Workers constituted a powerful invitation to find a substitute for coal.
Part of the reason for oil’s victory over coal was environmental, especially in Britain. London had long suffered from “Killer Fogs” as the result of pollution from coal burning, particularly the open fires in houses. So thick were those fogs that confused motorists literally could not find their way home to their own streets and instead would drive their cars onto lawns blocks away from their own houses. Whenever the fogs descended, London’s hospitals would fill with people suffering from acute respiratory ailments. In response, “smokeless zones” were established where the burning of coal for home heating was banned, and in 1957 Parliament passed the Clean Air Act, which favored oil.
The bands of public transportation, primarily rail, which had bound Americans to the relatively high-density central city, snapped in the face of the automotive onslaught, and a great wave of suburbanization spread across the land.
As William Levitt explained, “No man who owns his own house and lot can be a Communist. He has too much to do.”
Altogether, between 1950 and 1976, the number of Americans living in central cities grew by 10 million, the number in suburbs by 85 million. By 1976, more Americans lived in suburbs than in either central cities or rural areas.
The first all-enclosed, climate-controlled mall made its appearance near Minneapolis in 1956.
America had become a drive-in society. In Orange County, California, it was possible to attend religious services sitting in your car at the “world’s largest drive-in church.”
There were no environmental impact studies in those days, no anti-development litigation, only the sense that in America you could get important things done quickly, and the whole job, from the first planning to the last toll booth, was accomplished in less than two years.
Even in their living rooms, oil became part of the lives of Americans. Upwards of 60 million of them were entertained each week by a situation comedy called The Beverly Hillbillies, which became an instantaneous hit when it took to the airwaves in 1962 and was the number-one-rated show for a couple of years.
BP found the job of reorganizing transportation so complex that it gave up using computers—it could not write the programs quickly enough—and went back to pencil and paper.
As for those trucks and tractors on the banks of the Yukon River, the oil companies spent millions of dollars keeping their engines tuned up and just plain warm, waiting for the day.
Yet Alaska and the North Sea had another common bond: Though their reserves were in very difficult places, physically, they were not in unstable places, politically.
When it came to deal-making, there have been few in the entire twentieth century to rival Armand Hammer.
His father, Dr. Julius Hammer, was not only a practicing physician and drug manufacturer, but also a partisan of the left; he had met Lenin in Europe in 1907 and was one of the founders of the American Communist party. Armand did not share his father’s socialist tendencies; he was interested in making money and doing deals—in short, a capitalist.
Hammer renewed his contacts with the Soviet Union under Nikita Khrushchev and ended up as a go-between for five Soviet General Secretaries and seven U.S. Presidents. His access to the Kremlin was unique. He was virtually the only person who could tell Mikhail Gorbachev firsthand about Lenin, who had died a decade before Gorbachev’s birth. As late as 1990, at age ninety-two, Hammer was still the active chairman of Occidental, and loyal stockholders continued to sing his praises.
On the night of August 31–September 1, 1969, a senior Libyan military officer, finding himself unexpectedly awakened in his own bedroom by a junior officer, told the insistent young man that he was too early; the coup was scheduled for a few days later. Alas, for the senior officer, this was a different coup.
A group of radical young officers, led by the charismatic Muammar al-Qaddafi, beat all the others to the punch, including their military superiors, who had scheduled their own coup for just three or four days later.
Thereafter, a bit reassured, he switched back, on each morning’s commute from Paris, to his own more familiar Gulfstream II, with its cork-lined bedroom. He would arrive back in Paris at 2:00 A.M., and would be off again by 6:00 A.M. Throughout his life, he had had a remarkable capacity to nap under all kinds of conditions, and he put that ability to very good use on the flights.
King Faisal of Saudi Arabia was not among them. He hated Israel and Zionism as much as any Arab leader. He was sure there was a Zionist-communist plot to take over the Middle East; he told both Gamal Abdel Nasser and Richard Nixon that the Israelis were the real paymasters for radical Palestinian terrorists. Yet
Saudi Arabia had, at last, graduated to the position once held by Texas; the desert kingdom was now the swing producer for the entire world.
Sadat’s gamble paid off, and the enormity of the surprise of the Arab attack would be for the Israelis what Pearl Harbor had been thirty-two years earlier for Americans. Afterward, the Israelis would ask themselves how they could have been caught so completely off guard. The signals were all so clear. But those signals were not so easily extracted from the noise of contradictory information mixed with deliberate deception, especially when complacency and overconfidence had taken hold.
Nixon also assured Saqqaf and the other foreign ministers that, despite his Jewish origins, Kissinger “was not subject to domestic, that is to say Jewish, pressures.” He went on to add, “I can see that you are concerned about the fact that Henry Kissinger is a Jewish-American. A Jewish-American can be a good American, and Kissinger is a good American. He will work with you.” Kissinger was writhing with embarrassment and anger as the President made his gratuitous remark, but Saqqaf was nonplussed. “We are all Semites together,” Saqqaf deftly replied.
Throughout the clash of arms in the Middle East and the weeks of crisis over oil, one key actor was otherwise preoccupied. Richard Nixon was thoroughly entangled in the series of events that escalated from what he called a “third-rate burglary” into the unprecedented series of Watergate scandals, at the center of which was the President himself. The United States had seen nothing remotely like it since Teapot Dome.
In the weeks that followed, though Nixon would temporarily depart his own personal crisis and weave in and then out again of the world crisis, effective control over American policy was lodged in the hands of Henry Kissinger, who, in addition to being Special Assistant for National Security, had also just been appointed Secretary of State.
A half dozen of the most senior American national security officials were summoned to a hurriedly called late-night emergency meeting in the White House Situation Room. Nixon himself was not awakened for the meeting on the advice of Alexander Haig, who told Kissinger that the President was “too distraught” to join them.
Telling the Cabinet about Simon’s new post, Nixon likened it to Albert Speer’s position as armaments overlord in the Third Reich. Had Speer not been given the power to override the German bureaucracy, Nixon explained, Germany would have been defeated far earlier. Simon was somewhat discomfited by the comparison.
Watergate, Yankelovich added, “has bred a widespread feeling of gloom about the state of the nation,” and, as a direct result, public confidence that “things are going well in the country” had plummeted from 62 percent in May 1973 to a mere 27 percent in late November 1973.
It was difficult to get a political decision on anything on an inter-agency basis. There was no real decision-making apparatus in Washington—other than Henry Kissinger.”
At this point, Heath lost his temper. “You know perfectly well that I can’t put it in writing,” he boomed. After all, he was the great champion of British entry into the European Community and of cooperation with the Europeans. “Then I won’t do it,” Drake replied with absolute firmness.
The posted price had been raised from $1.80 in 1970 to $2.18 in 1971 to $2.90 in mid-1973 to $5.12 in October 1973— and now to $11.65. Thus,
His haughty stance, in the midst of shortages and huge price increases, would cost him dearly only a few years later, when he desperately needed friends.
To be sure, there were significant variations among countries in Europe. The French and the British were the most keen to distance themselves from the United States and to court the producers; the Germans, less so; and the Dutch, by contrast, resolute in their commitment to traditional alliances. Some of the Europeans emphasized that they had large and immediate interests to protect. “You only rely on the Arabs for about a tenth of your consumption,” French President Georges Pompidou bluntly told Kissinger. “We are entirely dependent upon them.”
Watergate, however, was an ever-present reality, and Nixon’s behavior on the trip struck some as stark. Sitting with the Israeli Cabinet in Tel Aviv, he suddenly announced that he knew the best way to deal with terrorists. He sprang to his feet and, with an imaginary submachine gun in his hand, pretended to gun down the entire Cabinet, Chicago style, making a “brrrrr” sound as he did so. The Israelis were perplexed and a little concerned. In Damascus, Nixon told Syria’s Assad that the Israelis should be pushed back until they fell off the cliff, and to underscore the point, he made strange, chopping gestures. Thereafter, in subsequent meetings with other Americans, Assad would insist on recalling Nixon’s exposition.
Why, the Saudis asked, were the Americans so obsessed with the Shah? In August 1975, the U.S. ambassador to Riyadh reported to Washington that Zaki Yamani had said that “the talk of eternal friendship between Iran and the United States was nauseating to him and other Saudis. They knew the Shah was a megalomaniac, that he was highly unstable mentally, and that if we didn’t recognize this there must be something wrong with our powers of observation.” Yamani sounded a warning. “If the Shah departs from the stage, we could also have a violent, anti-American regime in Teheran.”
While Yamani was widely known as “Sheikh,” the title was in his case an honorific, assumed by prominent commoners, which he was.
he also tried always to think long term, as befitted the representative of a country with a small population and one-third of the world’s oil resources.
The Western world, he believed, was afflicted by the curse of short-term thinking, the inevitable result of democracy. By nature, Yamani was also cautious and calculating. “I can’t bear gambling,” he said in 1975, when he was at his apogee. “Yes, I hate it. It rots the soul. I’ve never been a gambler. Never.” In oil politics, he insisted, he never gambled. “It’s always a calculated risk. Oh, I calculate my risks well. And when I take them, it means I’ve taken all necessary precautions to reduce them to the minimum possible. Almost to zero.”
as the Kuwaiti knelt before the King, the nephew stepped forward and fired several bullets into Faisal’s head, killing him almost instantaneously. Afterward, some said that the murder was the revenge for the nephew’s brother, who had been slain a decade earlier in the course of leading a fundamentalist attack on a television station to protest the first broadcast of a woman’s voice in the kingdom.
Then in December of that year, the international terrorist known as “Carlos,” a fanatic Marxist from Venezuela, led five other terrorists in an attack on a ministerial meeting in the OPEC Building on Karl Lueger Ring in Vienna. Three people were killed in the first few minutes. The terrorists took the oil ministers and their aides hostage, and eventually embarked on a harrowing air journey, flying first to Algiers, then to Tripoli, and then back to Algiers, threatening all the way to kill the ministers.
Internationally, however, the consistent central objective of U.S. policy was to build stability back into the price and let inflation wear it down.
Ultimately, the deal fell through. But the Soviet officials did have a fabulous time at Disney World.
by the mid-1970s, Iran was responsible for fully half of total American arms sales abroad.
But inflation was increasing at a more rapid rate, and as had been anticipated, eroding the real price. By 1978, the oil price, when adjusted for inflation, was about 10 percent below what it had been in 1974, immediately after the embargo. In short, by restricting the increases to those two relatively small ones, the real oil price was in fact lowered somewhat. Oil was no longer cheap by any means, but neither had the price, as many feared it would, gone through the roof.
It was trade following the flag—the enormous credibility and respect enjoyed by the United States. The American passport was truly a laissez-passer—a safeguard. Then that began to fade. I could feel it everywhere. It was the ebbing of American power—the Romans retreating from Hadrian’s Wall. I tell you, I could feel it everywhere.”
Now, it seemed, expensive and insecure oil was going to constrain, stunt, or even eradicate economic growth. Who knew what the social and political consequences would be? Yet the risks loomed large, for one of the great lessons of the miserable decades between the two world wars was how central was economic growth to the vitality of democratic institutions. The
With a governmental system much more impervious than other Western countries to such outside interveners as environmentalists, France within a few years would outstrip all the others in its commitment to nuclear power.
electricity generation would, by the early 1980s, become one of the major markets in the West lost to oil,
The various programs gave rise to much wasted motion, endless Congressional hearings, and so much work for attorneys as to comprise one of the great “lawyer’s relief” programs of the century.
In Schlesinger’s mind, the number-one priority was to find a way to let domestic oil, which was under price controls, rise to world market prices so that consumers could react to correct price signals. The current system blended the price-controlled domestic oil and the higher-priced imported oil into the final price that consumers paid, which really meant that the United States was subsidizing imported oil. Thus, the Carter program promulgated a procedure to end price controls on domestically produced oil through a “crude oil equalization tax.” There was some irony here, as it had been the Republican Administration of Richard Nixon that had originally imposed price controls in August 1971, and it was now a Democratic Administration that was trying to lift them.
So bitter was the argument that Schlesinger was moved to observe during the Senate-House conference meetings on natural gas, “I understand now what Hell is. Hell is endless and eternal sessions of the natural gas conference.”
In any event, exploration in most of the OPEC countries was foreclosed because of nationalization, and there was a rather strong presupposition that, if a company had success in other developing countries, the fruits would be seized before they could be ingested, leaving only small pieces and bits for the company.
To make matters worse, the population was growing faster than the economy—one out of every two Mexicans was under the age of fifteen—and 40 percent of the workforce was either unemployed or underemployed. In the months before López Portillo actually took over, conditions were so bad that there were even rumors of a possible military coup.
The end result was often a strong tendency toward consensus, even if the consensus completely changed its tune every couple of years.
So, at a New Year’s Eve banquet, he rose to offer a memorable toast. “Iran, because of the great leadership of the Shah, is an island of stability in one of the more troubled areas of the world,” he said. “This is a great tribute to you, Your Majesty, and to your leadership, and to the respect and the admiration and the love which your people give you.” On that strong and hopeful note, the President and the Shah welcomed the momentous New Year of 1978.
exasperation was increasing with Khomeini’s own harsh attacks on the Shah’s government, which were being circulated clandestinely in cassettes throughout Iran.
A middle manager or a civil servant in Tehran spent up to 70 percent of his salary on rent.
His denunciations from exile in Iraq were cast in the rhetoric of blood and vengeance; he seemed to be driven by an unadulterated anger of extraordinary intensity, and he himself became the rallying point for the growing discontent.
American intelligence on Iran was constrained. As the United States became more dependent on the Shah, there was less willingness to risk his ire by trying to find out what was happening among the opposition that he despised. In Washington, there were surprisingly few people with the requisite analytic skills on Iran.
The fact of the matter was that the Shah did have cancer, specifically a form of leukemia, which French doctors had first diagnosed in 1974. But the seriousness of the illness was kept for several years from both the Shah and his wife.
Denied refuge in Kuwait, Khomeini went to France and established himself and his entourage in a suburb of Paris.
he installed a military government. This was his last chance, but he put a weak general in charge. The general immediately suffered a heart attack and never asserted authority.
The cacophony from the United States certainly confused the Shah and his senior officials, undermined their calculations, and drastically weakened their resolve. And no one in Washington knew how sick the Shah was.
the expatriates assumed that their exit was only temporary, a matter of weeks or months at the most, until order was restored. Thus, they were strictly limited to only two suitcases. They left their houses intact, with pretty much everything in place, for their return. They faced a quandary similar to that of the oil men who had been forced by Mossadegh to leave Abadan in 1951—what to do with their dogs, which they could not bring with them. Since they did not know how long they would be gone, they did what their predecessors had done: took their dogs out back of their houses and either shot them or clubbed them to death.
Then he boarded his plane and left Tehran for the last time, carrying with his luggage a casket of Iranian soil.
But on February 1, Khomeini arrived back in Tehran in a chartered Air France 747. Seats on the plane had been sold to Western reporters to finance the flight, while Khomeini himself spent the trip resting on a carpet on the floor of the first-class cabin.
A group of nurses took to gathering outside his window to chant “Death to Americans.”
Religious fundamentalism wed to feverish nationalism had caught the Western world by surprise. Though it was still incomprehensible and unfathomable, one of its driving forces was obvious: a rejection of the West and of the modern world. That recognition led to an icy, pervasive fear.
A particular barrel of oil could take ninety days to travel from a wellhead in the Gulf through the refining and marketing system into underground storage at a gasoline station.
In sum, the panic buying to build inventories more than doubled the actual shortage and further fueled the panic. That was the mechanism that drove the price from thirteen to thirty-four dollars a barrel.
Some sixty-three Americans—the skeleton staff that had remained at the embassy after the personnel had been scaled down from the fourteen hundred officials of the Shah’s time—were taken hostage by a large, rowdy, violent band of zealots who were thereafter known to the world as “students.”
Remarkably, it was only at the end of September 1979, more than eight months after he was forced to leave Iran, that senior American officials first learned that the Shah was seriously ill, and only on October 18 did they discover that it was cancer.
Thus was provided the impetus, and pretext, for the invasion of the embassy. Perhaps it was only intended, originally, as a sit-in, but it soon turned into an occupation and a mass kidnaping, as well as a bizarre circus, complete with vendors in front of the embassy selling revolutionary tape cassettes, shoes, sweatshirts, hats, and boiled sugar beets. The occupiers even took to answering the embassy phone, “Nest of spies.”
Ironically, with its late-night programs on the hostage crisis, ABC finally found a way to compete successfully against Johnny Carson and the Tonight Show
After the hostages were taken, the dying Shah and his entourage would quickly and apologetically leave the United States, spending their final hours before departure in pathetic isolation in a psychiatric ward, complete with barred windows, on an American Air Force base. They went next to Panama and then back to Egypt, where the wasted Shah finally died in July of 1980, a year and a half after his flight from Tehran. No one really cared.
The Saudis also feared that their own position would be damaged in another way: that the price increases would destroy consumers’ confidence in oil and, thus, stimulate long-lasting competition to OPEC oil, as well as large-scale development of alternative fuels. That would be particularly threatening to a country with huge oil reserves, the life of which would extend far into the twenty-first century.
OPEC met again in Algiers in June 1980. The Saudis, now joined by the Kuwaitis, tried yet again to put an end to the bazaar in the oil market and to stabilize prices—and again to no avail. The average oil price was thirty-two dollars per barrel, almost three times what it had been a year and a half earlier.
Hostility between Iraq and Iran was long-standing; some would find the present struggle merely a latter-day manifestation of the conflicts that went back almost five thousand years, to the very beginning of recorded civilization in the Fertile Crescent, when the soldiers from Mesopotamia, in what is now modern Iraq, and Elam, in what is now modern Iran, had habitually slaughtered one another.
Khomeini himself was filled with a hatred of the Iraqi regime and a burning desire for revenge for his treatment at its hands. His ire was focused on the President, Saddam Hussein. Certainly, Hussein had proved himself a champion conspirator in the considerable history of Ba’thist conspiracies.
And the influence of Khayr Allah Talfah himself continued to be felt. In 1981, the government printing house distributed a pamphlet by Talfah. Its title gave some idea of the thrust of his political thought: Three Whom God Should Not Have Invented: Persians, Jews, and Flies
Once asked to list his enemies, Khomeini replied: “First, the Shah, then the American Satan, then Saddam Hussein and his infidel Ba’th Party.” Khomeini and his circle saw the secular, socialist Ba’thists as implacable enemies of their own creed and attacked Ba’thism as “the racist ideology of Arabism.” As if all that was not bad enough, Khomeini had even worse to say; he denounced Hussein as a “dwarf Pharaoh.”
Iraq could strike hard at Iran, topple Khomeini, put an end to the Shiite revolutionary threat to Iraq, and assert sovereignty over the Shatt-al-Arab waterway, protecting Iraq’s oil position.
Children were even used to clear minefields for the far more valuable and much rarer tanks, and thousands of them died.
Their gain was OPEC’s loss, and the demand for OPEC oil fell. As a result, OPEC’s output in 1981 was 27 percent lower than the 1979 output, and in fact was the lowest it had been since 1970.
Coal staged a massive comeback in electricity generation and industry. Nuclear power also made a rapid entry into electricity generation. In Japan, liquefied natural gas increased its share in the energy economy and in electricity generation. All this meant, around the world, that oil was being ejected from some of its most important markets and was rapidly losing ground. Its share of the market for total energy in the industrial countries declined from 53 percent in 1978 to 43 percent by 1985.
Altogether, by 1985, the United States was 25 percent more energy efficient and 32 percent more oil efficient than it had been in 1973.
Those three trends—the collapse in demand, the relentless buildup of non-OPEC supply, and the Great Inventory Dump—reduced the call on OPEC by something like 13 million barrels per day, a fall of 43 percent from the levels of 1979! The
In 1983, competition continued to mount rapidly in the oil market. The British sector of the North Sea alone, which had not even started producing until 1975, was now producing more than Algeria, Libya, and Nigeria combined, and still more North Sea oil would be coming on stream.
As was said, sometimes with approval and sometimes with horror, oil was becoming “just another commodity.”
the absolute volume of Maine potatoes produced each year was also dropping. As a result, the Maine potato futures contract was running into trouble. In 1976, and again in 1979, scandals hit the potato contract, including the mortifying failure of delivered stocks of potatoes to pass inspection in New York City. The exchange, under pressure, abruptly terminated trading in Maine potatoes and was itself threatened with extinction.
A senior executive of one of the majors dismissed oil futures “as a way for dentists to lose money.”
Deregulation of an industry removes protection and increases competitive pressure and, thus, typically results in consolidations, spinoffs, takeovers, and a variety of other corporate changes. Oil, completely deregulated in the United States by 1981, was no exception.
On one level, his campaign, for that was what it was, represented the revenge of the independent oil man on the hated majors.
His father was a landman, who acquired leases from farmers and packaged and sold them to oil companies.
The high point of earnings for Saudi Arabia was $119 billion in 1981. By 1984, its revenues had fallen to $36 billion, and they would fall further, to $26 billion, in 1985.
On graduating from Yale in 1948, Bush had passed up the obvious jobs on Wall Street for someone of his background; after all, his father had been a partner in Brown Brothers, Harriman, before becoming Senator from Connecticut. Then, having failed to be called back after a job interview with Procter and Gamble, he packed up his red 1947 Studebaker and set off for Texas, first Odessa, then its neighbor Midland, which would soon be calling itself the “oil capital of West Texas.” He began at the bottom, as a trainee charged with painting pumping equipment, and then graduated to itinerant salesman, driving from rig to rig, inquiring of the customer what size drill bit he needed and what kind of rock he was drilling through, and then asking for the order.
Bush was an Easterner, with what some would have called a patrician background, but he was not entirely atypical. There was a noble tradition of Easterners coming to seek their fortunes in Texas oil, beginning with the Mellons and the Pews at Spindletop, and continuing through what Fortune magazine once called the “swarm of young Ivy Leaguers” who, Bush among them, in the post–World War II years had “descended on an isolated west Texas oil town”— Midland—“and created a most unlikely outpost of the working rich” as well as “a union between the cactus and the Ivy.” It was not coincidental that the best men’s store in Midland, Albert S. Kelley’s, dressed its customers almost exactly the way that Brooks Brothers did.
Bush quickly mastered the skills of the independent oil man, flying off to North Dakota in atrocious weather to try to buy royalty interests from suspicious farmers, combing courthouse records to find out who owned the mineral rights adjacent to new discoveries, arranging for a good drilling rig crew as quickly and as cheaply as possible—and, of course, making the pilgrimage back East to round up money from investors.
The would-be reconstituted Republican party was under assault from the right, and at one point Bush had to defend himself against the charge from the John Birch Society that his father-in-law was a communist—on grounds that the gentleman happened to be publisher of the unfortunately named woman’s magazine Redbook
Unlike Jimmy Carter, who made energy the centerpiece of his Administration, Ronald Reagan was determined to make it a footnote. The energy crisis resulted mainly, he maintained, from regulation and the misguided policies of the United States government. The solution was to get the government out of energy and return to “free markets.”
Then, one evening, a week after the meeting, Yamani was back in Riyadh at dinner with friends when he received a phone call advising him to turn on the television news. An item at the end of the broadcast reported tersely and without any adornment that Ahmed Zaki Yamani had been “relieved” of his post as oil minister. That was the way he learned he had been fired. Yamani had been in the job twenty-four years, a good, long run in any position anywhere. Still, it was an abrupt, embarrassing, and disconcerting end to a quarter-century career.
By the spring of 1988, Iraq, making use of chemical weapons, was manifestly winning. And Iran’s ability and will to carry on the war were fading fast. Its economy was in shambles. The defeats were draining support for the Khomeini regime. Volunteers, fervent or otherwise, were no longer forthcoming. War weariness gripped the country; in one month, 140 Iraqi missiles fell on Tehran alone.
One month short of eight years after it began, the Iran-Iraq War ended in a stalemate, though one that favored Iraq. As far as Baghdad was concerned, it had won the war, and it now intended to be the dominant political power in the Gulf, and one of the world’s major oil powers.
DURING THE SUMMER of 1990, the world was still in the euphoria over the end of the Cold War and the new, more peaceful world that it portended. For 1989 had certainly been the annus mirabilis—the miracle year—in which the international order had been remade. The East-West confrontation was over. The communist regimes in Eastern Europe had collapsed, along with the Berlin Wall itself, the great symbol of the Cold War. The Soviet Union was in the midst of a profound transformation arising not only from political and economic change but also from the eruption of long-repressed ethnic nationalisms.
Saddam Hussein’s objectives seemed clear: to dominate the Arab world, to gain hegemony over the Persian Gulf, to make Iraq into the predominant oil power—and ultimately to turn Greater Iraq into a global military power.
The Iran-Iraq War, which Saddam Hussein had launched, had cost the country half a million deaths and serious casualties and had ended in a stalemate. Yet a nation of eighteen million was continuing to support a million-man army.
To justify his actions, Hussein offered a plethora of rationales. He claimed that Kuwait rightfully belonged to Iraq and that the Western imperialists had snatched it away. Actually, Kuwait’s origins went back to 1756, two decades before the United States declared its independence, and certainly much before the beginnings of modern Iraq, which was knitted together in 1920 out of three provinces that had been part of the Ottoman Empire for four centuries and, for several centuries before that, had been outlying provinces of various other empires.
Fearful that Saudi Arabia might well be next on Hussein’s list, many countries hurriedly sent military forces into the region. American forces were by far the largest component, reflecting guarantees that went back to Harry Truman’s letter to Ibn Saud in 1950.
The absorption of Kuwait could start Iraq on the path to becoming a new superpower. Eleven years earlier, four out of five of the major producers of the Persian Gulf had been pro-Western. With Kuwait absorbed into Iraq, there would be only two friendly producers.
The air war continued for a month, with systematic attacks on Iraqi command and control centers and a broad range of military and strategic targets. The biggest surprise, at least for the U.S. Air Force, was not that the coalition aircraft and missiles thoroughly knocked out Iraq’s air defense capability, but that they did it so easily and so quickly, with so minimal a loss of aircraft.
Exxon and Mobil— once Standard Oil of New Jersey and Standard Oil of New York—became ExxonMobil.
Saudi Aramco—the successor to Aramco, now state-owned—remains by far the largest upstream oil company in the world, single-handedly producing about 10 percent or more of the world’s entire oil with a massive deployment of technology and coordination.
the U.S. Senate rejected the Kyoto treaty by a vote of 95 to 0. There were three main concerns. The first was the impact of CO2 restrictions on the overall economy and economic growth. The second specifically concerned restrictions on coal, from which half of the nation’s electricity was generated. And the third was that the treaty would require cutbacks from the industrial countries, but not developing countries.
The industrial world is twice as energy efficient as it was in the 1970s.
-
American Revolutions by Alan Taylor
From my Notion template – the money quote from the entire book is:
the revolution began, rather than culminated, a long, slow, and incomplete process of creating an American identity and nation.
The Book in 3 Sentences
- A very, very detailed account of the Western Hemisphere in the American revolutionary period. What stood out was the detailed accounting of disease and of environmental factors, both political and natural, and a very good account of what the world was like at that point. It was a worthy successor to American Colonies.
How I Discovered It
The other Alan Taylor books.
Who Should Read It?
Anyone who enjoys world and American history
How the Book Changed Me
- The top things were the concept of wars as a line item (possible then, not so much now), and a very good look at who the players of the revolution were at the time, their incentives, histories, and so forth. America was a much more muddled place then than it is now.
- Also – another reminder that Machiavelli basically nailed it with his division of a country into the people, the aristocracy, and the monarch. That was very much in play in this time period.
Highlights
Only by the especially destructive standards of other revolutions was the American more restrained. During the Revolutionary War, Americans killed one another over politics and massacred Indians, who returned the bloody favors. Patriots also kept one-fifth of Americans enslaved, and thousands of those slaves escaped to help the British oppose the revolution. After the war, 60,000 dispossessed Loyalists became refugees. The dislocated proportion of the American population exceeded that of the French in their revolution. The American revolutionary turmoil also inflicted an economic decline that lasted for fifteen years in a crisis unmatched until the Great Depression of the 1930s. During the revolution, Americans suffered more upheaval than any other American generation, save that which experienced the Civil War of 1861 to 1865.
the revolution began, rather than culminated, a long, slow, and incomplete process of creating an American identity and nation.
In 1775, Benjamin Franklin recalled, “I never had heard in any Conversation from any Person drunk or sober, the least Expression of a Wish for a Separation, or Hint that such a Thing would be advantageous to America.”
By writing of the American Revolution as pitting “Americans” against the British, historians prematurely find a cohesive, national identity. If we equate Patriots with Americans, we recycle the canard that anyone who opposed the revolution was an alien at heart. We also read American nationalism backwards, obscuring the divisions and uncertainties of the revolutionary era. This book refers to the supporters of independence as Patriots and to the opponents as Loyalists, but many more people wavered in the middle, and all were Americans.
Unable to restrain settlers, American leaders needed to help them. By leading, rather than slowing, the process of Indian dispossession, the federal government could gain influence in the West. Jefferson
But the American Revolution generated many conflicting meanings, and some Americans kept alive an alternative, broader vision of revolution that might lead to a “new birth of freedom” in a later generation.
As the imperial wars became global, the “King-in-Parliament” needed cooperation from colonial governments. In a key compromise, the Crown accepted assemblies elected by property-holders as responsible for setting taxes and appropriations within each colony. Most
Rather than thinking of themselves as a distinct new people in America, colonists proudly claimed the status of Britons who lived west of the Atlantic.
As the protector of his subjects and their rights, this king warranted allegiance. Colonists idealized the king as their champion against their Catholic enemies, the French and Spanish, for politics and religion were entangled in colonial culture. By comparison, colonists felt little fondness for Parliament, where they had no representatives. Proud of their British liberty, the colonists looked south and north to see their supposed inferiors in the more authoritarian Spanish and French empires.
Rival empires measured their strength by the range and number of their Indian allies.
Unable to suppress them, the colonial government sought to contain the maroons by paying them bounties for returning more recently escaped slaves.
To the north on the Atlantic coast of the continent, South Carolina (founded in 1670), North Carolina (1712), and Georgia (1733) also produced commodities for export. Too far
In 1701, feeling cheated by the Carolina traders, some Santee natives tried to take their deerskins directly to England by crossing the Atlantic in dugout canoes. Of course, they underestimated the distance and powerful ocean swells, which swamped their open canoes. They were rescued by passing sailors but then promptly sold into slavery: it did not pay to cross Carolinians, whom, a pious visitor insisted, “walk[ed] the straight path to hell.”
In the late seventeenth and early eighteenth centuries, the Chesapeake colonists imported thousands of enslaved Africans, who comprised 40 percent of the region’s population by 1750.
In 1713, the British took Acadia from the French and renamed the colony Nova Scotia.
prospering farms attracted immigrants and promoted a higher birth rate and longer life expectancy than in the sickly West Indies to the south, or the colder, northern outposts of the British Empire. As a result, in 1750 four-fifths of British Americans lived in the thirteen colonies of the temperate latitudes on the continent. Those colonies had 1.5 million people compared to a mere 60,000 in French Canada and 10,000 in Louisiana.
After 1700, British America imported 1,500,000 slaves: more than four times the number of white immigrants. The massive escalation of the slave trade produced an unprecedented, transatlantic displacement of people. A brutal business, the slave trade killed a tenth of the enslaved, primarily from disease, during transit across the Atlantic from West Africa. The survivors then suffered the shock of enslavement in a strange and distant land. Separated from friends and kin, they were ordered about in a new language and brutally punished if they balked, resisted, or tried to escape. New masters put the enslaved to work on colonial farms and plantations raising crops for export. Arriving with many distinct languages and ethnic identities, they gradually created new cultures as African Americans.
Three-quarters of the newly enslaved landed in the West Indies, where the sugar plantations were especially profitable but lethal.
Although the West Indies imported more slaves, by 1775 the British mainland colonies had more living slaves, because of their healthier conditions. In continental British America, a fifth of the people were enslaved, with the largest numbers in the Carolina and Chesapeake colonies. Slavery proved less profitable in New England and the middle colonies, where farmers could not raise the more lucrative southern crops: rice, tobacco, indigo, and sugar. The enslaved comprised only 2 percent of the population in New England and 8 percent in the middle colonies, but slavery was legal in every colony, and only Quakers questioned the enslavement of Africans.
By 1775, Spanish America had more free than enslaved blacks because the Catholic Church promoted manumission in wills, and imperial policy allowed slaves to purchase freedom by hiring out for wages.
Georgia, early settlers bristled when their paternalistic colonial government initially banned slavery as a security threat. The dissidents prevailed under the revealing slogan of “Liberty and Property without restrictions,” for they insisted that white men became fully free by owning blacks. The British offered just one of several formulas for freedom and race in the Americas.
Nothing in the colonies could compare to the teeming metropolis of London, the largest city in Europe and home to 750,000 people.
Thanks to the swelling volume of trade, the colonial economy grew faster than did Britain’s. From just 4 percent of England’s gross domestic product in 1700, the colonial economy blossomed to 40 percent by 1770, assuming greater importance to the empire.
Only the poorest and most rustic people wore homespun; everyone else donned clothing made from imported textiles.
In the mainland colonies, the genteel comprised about one in every twenty free people; the rest were common.
Per the law of coverture, a British colonial woman passed by marriage from legal dependence on her father to reliance on a husband, losing her last name and gaining no civil rights.
A colonial woman usually married in her late teens or early twenties and bore seven to ten children.
On both sides of the Atlantic, Britons insisted that their constitution preserved the liberties of subjects better than in any other realm on earth. Unlike the later American Federal Constitution, the British constitution was not a written document but, instead, a consensus understanding of political institutions and legal precedents.
Conventional thinking insisted that Britain enjoyed a “mixed constitution,” which balanced the three elements of any civilized society: the one (a monarch), the few (aristocrats), and the many (common people).
Common rioters called themselves “regulators,” for they sought to regulate the law rather than destroy it.
Washington surrendered on July 4, unaware that the date would later assume a happier meaning for him. Fortunately for Washington, the French wanted to prevent a full-scale war, so they disarmed the Virginians and sent them home.
Resistance crumbled, and the victors tortured and executed more than 100 rebels, decorating crossroads with impaled heads as intimidating examples to other slaves. Given a terrible scare, Jamaican planters clung more closely to the empire, petitioning for more redcoats to garrison the colony.
Britons gloated over their global triumphs, but the French exited the war with a leaner and more effective empire, while Britain took on vast debts and expensive new responsibilities. The leading French negotiator, the Duc de Choiseul, boasted that he had burdened the British with future woes.
Within a dozen years of the peace of 1763, a global conflict would erupt from a surprising source: a rebellion by thirteen British colonies along the Atlantic Seaboard of North America. A Briton later recognized that his empire’s triumph in the Seven Years War had borne bitter fruit: “What did Britain gain by the most glorious and successful war on which she ever engaged? A height of Glory which excited the Envy of the surrounding nations and . . . an extent of empire we were equally unable to maintain, defend or govern.” Because of that triumph, the empire would reap a revolution in British America.
Victory had not come cheap, for the conflict nearly doubled the British national debt from a prewar £74 million to a postwar £133 million. During the mid-1760s, servicing that debt consumed £5 million of the empire’s annual budget of £8 million. The government also needed £360,000 annually to sustain 10,000 troops to garrison the conquests in North America.
During the war, British officers and officials had discovered just how prosperous the colonists had become. Yet on a per capita basis, the colonists paid only 1 shilling in tax directly to the empire compared to 26 shillings per capita paid in England.
British hubris was on a collision course with inflated American expectations of a partnership in the empire. Colonists considered themselves “free-born Englishmen” of tried and true loyalty to their king. As British subjects in America, colonial leaders expected the same rights as their counterparts in the mother country. They balked at the label “American” as implying cultural degradation by association with Indians and enslaved Africans.
If denied equality with Britons, colonists feared that they sat on the slippery slope to dependence, dispossession, and even (they claimed) enslavement.
the eighteenth century, the population grew faster in British America than in the mother country. Thanks to abundant land, early marriages, and healthy conditions, the number of colonists doubled every twenty-five years. In 1751, Benjamin Franklin calculated that, within a century, “the greatest number of Englishmen will be on this side of the Water.”
Britons began to think of the colonists as aliens—as Americans—far sooner than did colonists like Franklin who dreamed of a shared empire of equals.
Between 1760 and 1775, 30,000 English, 40,000 Scots, and 55,000 northern Irish (a total of 125,000) moved to British America. During the same period, 12,000 Germans and 85,000 enslaved Africans joined them, raising the total to 222,000. On average, 15,000 immigrants arrived annually, a tripling of the prewar rate.
In February 1764, 500 armed Paxton Boys marched on Philadelphia to intimidate the leaders of Pennsylvania into adopting harsher measures against Indians. On the outskirts of the city, the vigilantes agreed to return home when promised a redress of their grievances. The colony allocated more money for frontier defense and offered bounties for Indian scalps: $134 for a man, $130 for a woman, and $50 for a child. The bounties promoted indiscriminate Indian hunting, for the scalp of a peaceable native told no tales, paid as well, and cost fewer pains to take.
But colonists longed to crush and dispossess native peoples, if only the British would get out of the way.
The proclamation shocked colonists who, after helping to conquer Canada, had expected the British to help them dispossess the Indians in the West.
To further discourage speculative hoarding of wild lands, the Crown required an annual “quit-rent” of 6 pence per acre, which a productive plantation could better afford than could idle lands.
The rest of Florida remained native country largely settled by Seminoles: Creek Indians who had moved south and developed a new identity during the preceding generation.
The warm climate, long growing season, and supposed fertility allured well-connected Britons, including the colony’s governor, General James Grant, and Lord Adam Gordon, who noted that promotional literature had “sett us all Florida mad.”
In 1768, he settled them seventy-five miles south of St. Augustine at a place ominously known as Mosquito Inlet. Changing the name to New Smyrna failed to fool the millions of biting insects who bled the newcomers.
Shocked by harsh conditions and brutal overseers, the servants rebelled and fled, but Governor Grant’s troops captured and returned them. Grant sentenced three to die but pardoned one who agreed to execute the other two. Within seven years of arrival, three-quarters of the immigrants died from a combination of disease, heat prostration, and brutal treatment.
In a single decade, New York’s population more than doubled, from 80,000 in 1761 to 168,000 in 1771.
North Carolina’s population increased sixfold between 1750 and 1775, while Georgia’s grew by a factor of fourteen.
Husband fled north to Pennsylvania, where he adopted the fitting alias of “Tuscape Death.”
Tall, strong, bold, and profane, Ethan Allen led the Green Mountain Boys. Allen denounced the “great state and magnificence” of New York’s “junto of land thieves,” who sought to exploit hardworking settlers by demanding premium prices for frontier land. He insisted that justice favored possession “sealed and confirmed with the Sweat and Toil of the Farmer.” Confronting “Yorkers,” Allen declared “that his name was Ethan Allen, Captain of that Mob, and his authority was his arms, pointing to his gun, that he and his companions were a Lawless Mob, their Law being Mob Law.” After burning a Yorker’s farm, Allen told him: “God Damn your Governour, Laws, King, Council, and Assembly.” Try as he might, Tryon could never catch and hang the Green Mountain Boys, as he had done to the North Carolina regulators. Allen would outlast the governor and lead Vermont to independence during the coming revolution.
Britons considered Indians blessed with perfect freedom or cursed with virtual anarchy.
Unlike most settlers, who insisted on hard and harsh boundaries between the races, Indians believed that they could convert white captives because natives regarded identities as cultural and fluid rather than biological and fixed.
In 1773, the Virginia Gazette insisted, “Not even a second Chinese wall, unless guarded by a million soldiers, could prevent the settlement of the Lands on the Ohio.”
Unable to restrain settlers, British officers hoped that Indians would take bloody revenge on intruders and murderers. In 1765, John Stuart assured Creeks, “We will set up Marks and if any white People settle beyond them we shall never enquire how they came to be Killed.”
To avert war, Hillsborough directed Virginia’s governor to void all surveys and reject all applications for land in the contested zone. Hillsborough also compelled Johnson to reconvene the Haudenosaunee during the summer of 1770 to renounce the cession west of the Great Kanawha, thereby depriving Virginian speculators of 10 million acres in what is now Kentucky. The frustrated Virginians included Patrick Henry, Thomas Jefferson, and George Washington, who resented the empire’s restrictions on their western ambitions. Their anger grew as squatters began to occupy lands denied to speculators. They also feared competition from an immense rival speculation launched in England by Benjamin Franklin.
In London, Franklin recruited wealthy aristocrats and bankers with political clout by promising them fabulous profits from selling fertile lands to settlers. The British partners included Thomas and Richard Walpole, the wealthy and powerful nephews of a late prime minister. The American investors included Franklin’s son, William, who was the royal governor of New Jersey, Sir William Johnson and his leading deputy, George Croghan, and the wealthy Philadelphia merchants Samuel and Thomas Wharton.
General Gage declared, “I wish most sincerely that there was neither Settler nor Soldier in any part of the Indian country.” Instead, the military withdrawal accelerated settlement. By 1774, 50,000 settlers lived beyond the old Proclamation Line, but Gage washed his British hands of the bloodshed that they would provoke: “Let them feel the Consequences, [for] we shall be out of the Scrape.”
a speculator hoped to preempt prime locations from the competition. But overlapping claims soon rendered the Ohio Valley a lawyer’s paradise and a farmer’s lament.
They were led by Richard Henderson, a backcountry judge notorious for greed. Offended by his treatment of squatters and debtors, regulators had burned his house, barn, and stables in 1770. One former debtor, Daniel Boone, proved more forgiving, entering a partnership with Henderson in 1774. A veteran hunter, Boone knew the best routes over the mountains to the finest lands in Kentucky. Folklore casts Boone as a nature-loving refugee from settled civilization; in fact, he helped land speculators fill the forest with farmers.
Most of the settlers were young men of little property and less reputation. Henderson described them as “a set of scoundrels who scarcely believe in God or fear a devil.”
By 1775, the British Empire had lost all credibility and influence in the Ohio Valley. Imperial authority shrank into a few, small forts scattered along the Great Lakes to the north. When
In June 1774, Parliament endorsed Carleton’s proposals by passing the Quebec Act, which broke with the British constitutional tradition that had excluded Catholics from government. Unlike the Irish, Canadian Catholics could own land and serve on the governing council.
While encouraging common buyers, the new policy also promised greater revenue for the Crown: a doubly damning prospect for colonial gentlemen who speculated in lands and sought to keep the empire weak within the colonies.
that his predecessor had taken the wrong side in the colonial land disputes. Replacing Tryon as North Carolina’s governor, Josiah Martin initially agreed that the regulators had gotten their just deserts, but a tour of the backcountry changed his mind. Writing to Hillsborough, Martin explained, “My progress through this Country, My Lord, hath opened my eyes exceedingly.” He concluded that the farmers had “been provoked by insolence and cruel advantages taken of the people’s ignorance by mercenary tricking Attornies, Clerks and other little Officers.” When “the wretched people” turned to regulation, crafty speculators and county officials used “artful misrepresentations” to deploy the government against common folk.
Turning the tables, Martin sought to build popular support for the empire by mollifying backcountry farmers. In effect, he played the populist card against the leading men who had supported Tryon. Martin pardoned regulators, sacked extortionate officials, and demanded restitution from embezzling sheriffs. A delighted regulator concluded that, because Martin had “given us every satisfaction,” the county cabals “hate [him] as bad as we hated Tryon.” Throughout British America, regulators wishfully believed that a just king favored them against local elites. Martin gave substance to that legend of the protecting king. During the coming revolution, most North Carolina regulators would either support the empire or at least avoid helping the Patriots. On the other hand, North Carolina’s leading Patriots had supported Tryon’s suppression of the regulation.
In April 1766, hundreds of regulators marched on New York City, seeking to free their jailed leader, Samuel Monroe. They turned back when blocked by the city’s militia commanded by merchants and lawyers who also had led the seaport protests against British taxes. A British officer wryly noted that “Sons of Liberty” were “great opposers to these rioters. They are of opinion that no one is entitled to riot but themselves.”
Cherishing the king rather than Parliament, colonial leaders imagined the empire as a federated body of legislatures united only by a shared monarch. A royal governor noted that colonists claimed to live in “perfect States, not otherwise dependent upon Great Britain than by having the same King.” They even petitioned the king to reclaim the power to veto Parliament’s laws: a power that had lapsed during the preceding half century.
To justify resistance, colonists cited political writings by British critics of Parliament. During the 1720s, John Trenchard and Thomas Gordon published eloquent essays, known as Cato’s Letters, in a London newspaper. Suspicious of all power as selfish and malevolent, Trenchard and Gordon insisted that government officials chronically imperiled liberty: “Power is like fire; it warms, scorches, or destroys according as it is watched, provoked, or increased.” By accumulating power, corrupt officials became rich by impoverishing people with heavy taxes. To preserve property and the liberty that it sustained, people needed closely to watch and strictly to limit power. Rather than blame the king, the critics accused his ministers of distorting the will of a benevolent monarch, who sustained the world’s freest constitution. Trenchard and Gordon declared, “We have a constitution that abhors absolute power; we have a king that does not desire it; and we are a people that will never suffer it.”
They insisted that farmers made the best citizens because they sustained “virtue,” which meant the sacrifice of private interests to benefit the community. Such virtue abounded where people were relatively equal in owning productive farmland (or artisans’ shops). The ideal society maximized the number of modest property holders but denied political rights to anyone without independence: women, children, servants, slaves, and wage laborers. Virtue’s corrosive opposite was corruption, the use of money and luxury to render people dependent on others.
Patriots detected a malevolent plot by British leaders to “enslave” the colonists by imposing new taxes and regulations. Boston’s town meeting insisted that “a deep-laid and desperate plan of imperial despotism has been laid, and partly executed, for the extinction of all civil liberty.” That rhetoric struck Britons as so irrational that it must cover a colonial conspiracy by reckless demagogues out to destroy the empire by seeking American independence. Neither plot existed save in the powerful imaginations of political opponents who distrusted one another. A rare moderate on imperial issues, Benjamin Franklin, lamented, “To be apprehensive of chimerical dangers, to be alarmed at trifles, to suspect plots and deep designs where none exist, to regard as mortal enemies those who are really our nearest and best friends, and to be very abusive, are all symptoms of this distemper.”
The Royal Navy added patrols along the North American coast to search merchant ships for smuggled goods. Entitled to a share of what they confiscated, naval officers were highly motivated to seize suspected ships.
To minimize pushback from the colonists, Grenville set their stamp tax at a relatively low level: only two-thirds of what Britons paid. He also stipulated that the money would remain in the colonies to fund the military, and he appointed leading colonists to the lucrative positions selling stamps. Surely, he reasoned, colonists would pay the new tax with no more than the grumbling that had met the Sugar Act. Instead, the stamp tax horrified the colonists, who were suffering from a postwar depression in trade. At war’s end, the empire’s military expenditures dried up in the mainland colonies as the British removed troops, sailors, and their subsidies. Meanwhile, demobilized soldiers and sailors poured into the seaports, depressing wages paid for labor. Facing ruin for want of customers, merchants sued their many debtor-customers, who stood to lose their shops and farms for want of specie. Hard-pressed colonists bristled at paying another new imperial tax, no matter how small.
“The rights of parts and individuals must be given up when the safety of the whole shall depend on it . . . in return for the protection received against foreign enemies.” Patriots, however, refused to sacrifice any rights for the benefit of the whole empire.
Hutchinson’s greatest foe was Samuel Adams, the son of a Boston brewer. Unlike the wealthy and restrained Hutchinson, Adams possessed only moderate means but a fierce focus on his political goals. In contrast to his dapper and slender rival, Adams was stocky and shabbily dressed. “I glory in being what the world calls a poor Man,” Adams wrote. Secretive, patient, and cautious, he cultivated popularity as the basis for power. Instead of putting on airs, Adams carefully learned the names and views of shipwrights and other artisans. A leader in Boston’s town meeting, Adams secured appointment as a local tax collector, where he became more popular by neglecting to collect from his neediest supporters. A political rival characterized Adams as “by no means remarkable for brilliant abilities” but “equal to most men in popular intrigue, and the management of a faction. He eats little, drinks little, sleeps little, thinks much, and is most decisive and indefatigable in pursuit of his objects.” Adams aptly described his political strategy as to “keep the attention of his fellow citizens awake to their grievances; and not suffer them to be at rest, till the causes of their just complaints are removed.”
Rather than denounce all of the rich as a predatory class, Patriots encouraged laboring people to focus their animus more narrowly on a few gentlemen who seemed especially menacing because of their imperial connections. Those connections became a liability as people blamed their economic woes on Parliament’s new taxes.
Far from seeking seats in the distant Parliament, colonists simply argued that only their own assemblies could tax them. Pennsylvania’s assemblymen asserted that colonists were “entitled to all the Liberties, Rights and Privileges of his Majesty’s Subjects in Great-Britain,” and it was “the inherent Birth-right . . . of every British Subject, to be taxed only by his own Consent, or that of his legal Representatives.” In Virginia, a fiery young assemblyman, Patrick Henry, declared “that the General Assembly of this Colony have the only and sole exclusive Right and Power to lay Taxes and Impositions upon the Inhabitants of this Colony.”
Loyal to the king, colonists wanted him to intervene and overrule Parliament.
British leaders had expected little resistance from the diverse mainland colonies with their deep suspicions of one another. But the Stamp Act touched a raw nerve, their aversion to taxes imposed by Parliament, so the crisis generated unprecedented intercolonial communication and cooperation.
Despite the Declaratory Act, colonists exulted in news of the repeal and felt renewed pride in an empire apparently ruled by a benevolent king.
The Stamp Act Crisis taught the colonists how to frustrate British measures by combining protest resolutions by elite writers with violent intimidation by common mobs and economic boycotts by everyone. The three forms of resistance worked together. Boycotts required a common front, which intimidation and ostracism helped to produce. In turn, published arguments by leading Patriots vindicated the boycotters and bully boys as defending colonial liberty against a plot by British tyrants.
Boston’s leading Patriots organized a club known as “the Sons of Liberty.” By early 1766, similar groups had appeared as far south as Georgia. The leaders were respectable tradesmen and merchants, but most had achieved wealth rather than inherited it, and they often worked beside common journeymen and apprentices in their shops or on their wharves.
Familiarity with common men helped to mobilize hundreds of laborers and sailors for mass meetings, parades, and protests. Sons of Liberty proudly declared that their meetings drew “all ranks and condition,” which was a radical development in a political culture where genteel leaders had long excluded the “rabble” from gatherings by the “respectable.”
But their actions tended to discredit the colonial regime as impotent to keep order. The Sons of Liberty increasingly had to fill a vacuum of authority that they helped to create.
A new consensus denied that colonists should pay any tax, external as well as internal, if levied by Parliament.
Washington noted that many a gentleman feared that “an alteration in the System of my living, will create suspicions of a decay in my fortune, & such a thought the world must not harbour.”
The Sons of Liberty sought to shame, isolate, and ruin as “enemies to their country” anyone who violated the boycott.
British officials marveled at the power of extra-legal committees and mobs to compel colonists to forsake their beloved consumer goods.
In the name of liberty, Patriots suppressed free speech, broke into private mail, and terrorized their critics. In Boston in October 1769, a defiant conservative printer, John Mein, revealed that some Sons of Liberty, including John Hancock, covertly imported goods while exploiting the boycott to drive smaller competitors out of business. Mein’s revelations threatened to discredit the boycott, so a mob of a thousand men chased him through the streets, yelling, “Kill him; kill him.” Mein escaped and sailed away for England. In his absence, Hancock bought up debts owed by Mein and used them to seize control of, and shut down, his offending newspaper. Patriots believed only in the liberty of their press.
Boston mobs continued to menace redcoats, bully importers of British goods, and tar and feather customs informers.
On the night of March 5, 1770, about fifty men and boys gathered to harass seven soldiers guarding the custom house on Boston’s main street by hurling snowballs, chunks of ice, sticks, and rocks while yelling “Kill them.” The captain in charge tried to restrain his men, but they feared for their lives. One fired and the rest followed suit, hitting eleven colonists, five of whom died. Flocking to the scene, hundreds of angry colonists threatened to kill the captain and his men, so Hutchinson ordered them away to Castle William. Patriot propagandists turned the tragedy into “the Boston Massacre.” Three-quarters of the town’s inhabitants attended the mass funeral of the victims.
The troops also pulled down a nearby “liberty pole”: a tall post erected as a rallying point for Patriots. When the Sons gathered to counterattack, a massive brawl erupted. Getting the worst of it, soldiers retreated to their fort. While singing “God Save the King,” the Sons of Liberty erected a massive new pole: eighty feet tall, cased in protective iron bands, and topped by a weather-vane inscribed with the word “Liberty.”
Balking at the expense of sending more troops to coerce the colonists, in April 1770, Parliament repealed the Townshend Duties, save for a duty on tea meant to maintain Parliament’s right to tax colonists.
The tax on tea yielded scant revenue because most colonists consumed cheaper tea smuggled in from the Dutch East Indies (now Indonesia).
But the Tea Act angered colonial merchants, who stood to lose their profitable business in smuggled tea. They denounced the act as a plot to seduce Americans to sell their liberty for the tea of a British monopoly. This line of attack revived opposition to any tea tax, no matter how small.
In Philadelphia, a Patriot newspaper denounced the chests of tea as “filled with poverty, oppression, slavery, and every other hated disease.” Patriots threatened to turn any colonial collaborators over to “The Committee for Tarring and Feathering,” which persuaded most merchants to reject the company’s tea.
Reversing Patriot rhetoric, Lord Buckinghamshire argued that the issue had become whether the British “were to be free, or slaves to our colonies.”
By crushing resistance in Boston, British leaders sought to save their empire. If deprived of its American colonies, the empire would, they feared, collapse, exposing Britain to domination by the French and Spanish.
Washington insisted that the British meant to “make us as tame & abject Slaves as the Blacks we Rule over with such arbitrary Sway.”
During the 1760s, the colonists imported 365,000 slaves: more than in any preceding ten-year period.
Most of the enslaved lived in the southern, plantation colonies, accounting for 90 percent of the population in the West Indies, 60 percent in South Carolina, and 40 percent in Virginia. But slavery was legal throughout the colonies, including the northern seaports. In Boston in 1761, 1,000 of the city’s 15,000 people were enslaved and only 18 black residents were free. In the mainland colonies as a whole, the enslaved comprised a fifth of the population.
In a particularly obtuse performance, Richard Henry Lee, a wealthy Patriot in Virginia, had his slaves parade around a county courthouse, carrying banners which denounced Parliament’s taxes as “chains of slavery.”
The English writer Samuel Johnson was appalled that colonists likened Parliament’s small new taxes to slavery: “How is it that we hear the loudest yelps for liberty among the drivers of negroes?”
Defying efforts to exclude them, blacks joined seaport mobs that attacked officials in the name of liberty. An escaped slave of mixed African and Indian ancestry, Crispus Attucks, died in “the Boston Massacre.”
The legend of a liberating king reflected news of a recent antislavery ruling by England’s highest court in the celebrated case of Somerset v. Stewart. A former customs officer in Boston, Charles Stewart, had brought a slave, James Somerset, to England in 1769. When Stewart subsequently tried to send Somerset away to Jamaica, the slave sued for his freedom with the support of an abolitionist attorney who argued that the enslaved became free when brought to England. In 1772, the chief justice, Lord Mansfield, ruled that slavery had no just basis in “natural law” or in English common law, so it required “positive law,” a statute passed by Parliament, to legitimate the system in England: “It’s so odious, that nothing can be suffered to support it, but positive law.” For want of such a statute, Mansfield ruled that Stewart could not force Somerset to leave the mother country for renewed slavery in a colony. Although technically narrow, Mansfield’s ruling became broadly interpreted as upholding the legal maxim that any slave became free upon setting foot in England, deemed the great land of liberty.
In 1774, most colonists still hoped to stay in the empire while compelling Parliament to rescind the Coercive Acts.
Rural Patriots saw their own grim potential future in the British exploitation of Irish peasants, who lacked political rights, suffered from heavy taxes, and paid rising rents to absentee landlords.
Refusing commissions issued by Gage as governor, rural militia officers answered to local committees and prepared for armed resistance by stockpiling munitions and drilling their young men as “minutemen,” who could rally at a moment’s notice.
But Patriots had hardened their stance against Parliament. To the old insistence that Parliament could not tax colonists, they added that Parliament had no power to legislate for the colonies in any way, not even to regulate trade. Regarding the king as their sole link to the empire, the Patriots would still accept royal governors but would heed no British legislature as superior to any of their own.
For true governance, they wanted only a shared king and their colonial legislatures.
Linked to provincial congresses in each colony and ultimately to Congress, the committees provided new political structures, which initiated a revolution beyond what the delegates had anticipated.
Becoming de facto local governments, the committees took over collecting taxes and managing the militia. Many committees even enforced price controls, inspected merchants’ books, and shunned violators until they publicly confessed and mended their ways.
Acting in the name of liberty, the committees suppressed speech by their critics. The Philadelphia Committee insisted that “no person has the right to the protection of a community or society he wishes to destroy . . . by speeches and writings” which “aid and assist our enemies.” Patriots broke open mail to read letters from suspected critics of Congress. A merchant begged his correspondents to censor their words because “the temper of the people is such that misconstructions are put on the most innocent expressions” by “those who call themselves the Assertors of American Freedom.”
Critics feared that the social upheaval would culminate in a violent anarchy destructive to all property and social order. In New York, Reverend Samuel Seabury concluded, “If we must be enslaved, let it be by a KING at least, and not by a parcel of upstart, lawless committee-men. If I must be devoured, let me be devoured by the jaws of a lion and not gnawed to death by rats and vermin.”
Which is better—to be ruled by one tyrant three thousand miles away or by three thousand tyrants not a mile away? —MATHER BYLES, a Loyalist, 1774
To defeat the British, Patriot gentlemen needed to maximize their popular support, so they promised enhanced respect and opportunities for common men. The Patriot movement created many new leadership roles as committeemen, provincial legislators, and militia officers, which common men often filled.
Merit rather than connections would reap wealth and leadership. But like the paternalism favored by Loyalists, Patriots described an elusive ideal that they often compromised in practice. Interests and connections persisted in the new order, as in the old empire, but in a more egalitarian guise. Those contradictions disgusted some common people, who preferred the elitism of leading Loyalists as more honest and transparent.
The British population of 11 million greatly outnumbered the 2.5 million mainland colonists, and a fifth of the latter were slaves more apt to support the British than help the Patriots. The free colonists were also bitterly divided, for a fifth remained loyal to the empire.
By seeking a commander from Virginia, John and Samuel Adams hoped to consolidate southern support for the siege of Boston. As the largest, most populous, and most powerful colony, Virginia was critical to the Patriot coalition.
Tall, strong, and erect, he looked the masculine part of a commander. Reserved and dignified, he was popular with the other delegates, who preferred to hear the sound of their own voices. “He possessed the Gift of Silence,” recalled John Adams, who did not.
Because rifles were slower to load and quicker to jam, most soldiers, American as well as British, fought as infantry with smoothbore muskets. Only by firing in massed volleys could troops compensate for the relative inaccuracy of their muskets. To achieve group cohesion and a rapid pace of fire required months of drilling, which did not sit well with Americans.
Britain retained the loyalty of half of their empire in America, for fourteen colonies rejected the rebellion embraced by thirteen. While the Patriot cause prospered in the thickly settled mainland colonies, it withered along the northern and southern margins, where Anglophones were few and needed British military aid. East and West Florida, for example, were sparsely settled, depended on British subsidies, and relied on redcoats for protection against Indians and the Spanish in nearby Louisiana. Both colonies also lacked newspapers and active legislatures (West Florida’s assembly did not meet for six years after 1772) to agitate the public.
The empire could pressure West Indian assemblies by threatening to withdraw troops, while it had to send redcoats to cow the mainland colonists.
Southern mainland planters also sustained slavery, but they felt more threatened than protected by the British Empire. They also felt more secure in resisting the naval power of the empire because of the vast hinterland of their continental setting. Where whites were less than a tenth of the West Indian population, they accounted for 60 percent of the people in the southern mainland colonies. North American planters could rally thousands of common whites as militiamen to watch slaves and resist British rule. West Indians, however, could neither resist the redcoats (and Royal Navy) nor do without their protection.
Dunmore’s threat to free and arm the enslaved began as a bluff, meant to spook Virginians into passivity, for he did not wish to ruin the plantation economy of a colony that the British hoped to recover. But the Patriot coup in Williamsburg called that bluff, forcing Dunmore to convert his bluster into black soldiers. He also recognized the military potential of the many runaway slaves who sought haven on his warships during the summer and fall.
Rather than recognize that runaways indicted the slave system, Virginians concocted their own wishful legend: that Britons lured away slaves to resell them in the West Indies, where they would suffer far more than in Virginia. If the British were frauds instead of liberators, the enslaved should cling to their masters as protectors rather than flee from them as exploiters. Masters assembled their slaves to warn of their West Indian peril and invite their renewed commitment to servitude in Virginia.
In the South, the enslaved sought a greater revolution, for they meant to “Alter the World” and regarded Britons, rather than Patriots, as the better champions of true liberty. Although the British performance as liberators lagged far behind the wishful hopes of the enslaved, they could find no better ally.
The catastrophe in Canada had cost 5,000 Patriot soldiers, either dead or captured. A British officer mocked the Patriots for offering “a resistance . . . as flimsy & absurd as were their Motives for taking up Arms against their Sovereign.”
Many moderate Patriots as well as Loyalists still revered the mixed constitution of Britain and cherished the commercial benefits of the empire. In April 1776, Franklin observed, “The Novelty of the Thing deters some, the Doubt of Success others, [and] the vain Hope of Reconciliation, many.” But he also noted “a rapid Increase of the formerly small Party who were for an independent Government.”
That shift in opinion owed much to an unlikely man, Thomas Paine. Hard-drinking, self-educated, cranky, and restless, Paine had accomplished little during the previous thirty-seven years of his checkered life. The son of a poor artisan, Paine had lost his job as an excise tax collector in England in 1774, the same year that his marriage crumbled and creditors auctioned his paltry household goods to pay his debts. At rock bottom, Paine sought a new start by migrating to the colonies. Arriving in Philadelphia in November 1774, he embraced the Patriot cause and became the chief writer and editor of the Pennsylvania Magazine. His forceful prose impressed Dr. Benjamin Rush, a leading Patriot who recruited Paine to publish a political pamphlet against reconciliation. On January 10, 1776, in Philadelphia, Paine published Common Sense, which became the most powerful pamphlet in American history. The first edition of 1,000 copies sold out within two weeks. By June, reprints raised the total to 150,000 copies: a phenomenal impact for a public of only 2.5 million people, a fifth of them slaves. Many more colonists read excerpts from Common Sense in their local newspapers or heard it read aloud in taverns and streets. Except for the Bible, no written work had ever been so widely read and discussed in British America.
For Paine, style was also substance, for he sought to constitute a new readership: a broad and engaged public for a republican revolution. He insisted that common people should no longer defer to gentlemen in politics. Aptly titled, Common Sense spoke to and for common people.
Paine pushed for immediate independence, a union of thirteen states, and republican governments for those states. All three goals broke dramatically with past experience and received wisdom. No colonies in the Americas had yet revolted from their mother empire; past bickering by the colonies augured poorly for a union; and almost all former republics in Europe had been small, contentious, and short-lived. In a daring stroke, Paine argued that Americans could triumph by combining all three gambles: on independence, union, and republic. Seeking one alone would certainly fail, but the combination would prove invincible. If united in a righteous cause, he insisted, Americans could crush the corrupt mercenaries of a royal tyrant.
He elevated the Patriot struggle in utopian and universal terms. By winning republican self-government, Americans could create an ideal society of peace, prosperity, and equal rights. That conspicuous success would, in turn, inspire common people throughout the world to seek freedom either through revolution at home or by migrating to America, “an asylum for mankind.” Paine concluded, “The cause of America is in a great measure the cause of all mankind. . . . The birth-day of a new world is at hand.”
Americans now take this soaring rhetoric for granted, but it was new and radical for colonists who had long felt self-conscious about their parochial and provincial status.
Loyalists retorted in their own pamphlets, principally James Chalmers’s Plain Truth, which defended the mixed constitution as “the pride and envy of mankind.” But Loyalist writers struggled to reach readers because Patriots seized and burned most of the opposition’s pamphlets.
Adams delighted in the “high tone and the flights of Oratory with which [the Declaration] abounded.” He later added, “The Declaration of Independence I always considered as a Theatrical Show.” Congress forwarded printed copies to state officials with directions to proclaim and circulate. At 1,337 words, it fit on one long, printed page—a broadside—which proved convenient for posting on the doors of public buildings and the walls of taverns. Newspapers reprinted the Declaration, magistrates proclaimed it at county courthouses, and clergymen announced it from their pulpits. Washington had his officers read the Declaration to their assembled troops.
In New York City on July 9, Patriots toppled the great equestrian statue of George III and melted its lead to make 40,000 bullets to shoot at redcoats. In that blow for liberty, the Patriots employed slaves to tear down the statue.
By declaring independence, Congress gave the conflict greater clarity and raised its stakes. No longer were Patriots absurdly fighting in the king’s name against his Parliament. Instead, they defended an American union of republican states. A Delaware man observed, “I could hardly own the King and fight against him at the same time, but now these matters are cleared up. Heart and hand shall move together.”
More than 700 people signed a “declaration of dependence” affirming their loyalty to the empire. In a proclamation, Howe promised to restore the “free enjoyment of their Liberty and Properties upon the true Principles of the Constitution.”
At least 5,000 men accepted British pardons. The defectors included Richard Stockton, a signer of the Declaration of Independence. Washington grimly understood that the people “will cease to depend upon or support a force, from which no protection is given to them.”
The Patriots suffered no dead and only three wounded to take 948 prisoners in Washington’s first significant battlefield victory of the war.
But Washington understood better than Howe that victory hinged on who could endure a long, hard, and bitter struggle.
The British writer Horace Walpole described Burgoyne as a “vain, very ambitious man with a half-understanding which was worse than none.”
While Howe won the showy battles, Washington was winning a war of attrition as the British lost men whom they could ill afford to replace. His friend General Nathanael Greene noted: “We cannot conquer the British force at once, but they cannot conquer us at all. The limits of the British government in America are their out-sentinels,” for they lacked enough committed Loyalists to hold the ground that Howe passed over.
An adept political infighter, Washington built a powerful “interest” among officers and in Congress. Underestimating Washington was a fool’s errand.
Two thousand men, nearly a fifth of the army, perished that winter from a debilitating combination of filth, exposure, malnutrition, and disease.
Admiring Washington’s persistent soldiers, Steuben marveled that no European army would have held together under such suffering.
The Patriot diplomats in Paris included Benjamin Franklin, a famed scientist and talented writer, who made the most of his international celebrity. During the early 1770s in London, Franklin had taken pains to appear wealthy and genteel, but in Paris he shrewdly cultivated a new persona as the plain and honest American of simple but dignified clothes and manners. He deftly appealed to the French fashion for romanticizing America as a land of purer simpletons, early versions of Jerry Lewis. Taken with Franklin, the French mass-produced his image on engravings, medallions, busts, and statuettes.
Shocked by France’s alliance with the Patriots, Lord North sought reconciliation with the Patriots in early 1778. Responding to North’s urgent lobbying, Parliament reluctantly rescinded the Coercive Acts of 1774 and promised to exempt colonists from parliamentary taxation forever (save for duties to regulate trade). In return, Patriots would have to demobilize their army, abandon the French alliance, and restore colonial governments with royal governors.
If offered in 1774, the concessions would have resolved the American crisis short of war. By 1778, however, Patriots would accept no dependence, however limited, on the empire. Clinging to independence and the French alliance, Congress refused to meet with the Carlisle commissioners.
Morris’s threats seemed credible thanks to a real terrorist, a young Scot named James Aitken but commonly known as “John the Painter” because John was his alias and he worked as a house painter. Aitken had spent two years in British America, initially as an indentured servant in Virginia, but he had returned to England in the spring of 1775 just as the war began. Short, slim, hard-drinking, and prickly, he wanted revenge against the many English who mocked his poverty and stutter. He hoped to return to a hero’s welcome in America after burning Royal Navy dockyards in Britain. In 1776, he visited Paris to pitch his plan to Silas Deane, who provided some encouragement and money. In late 1776 and early 1777, Aitken set fires that damaged the Portsmouth dockyard and the city of Bristol, which spread public terror via the newspapers through England. Arrested in January, he was tried, convicted, and executed in March at the age of twenty-four. Hoisted in an iron gibbet, his rotting body hung for years as a warning to others at the entrance to Portsmouth Harbor on the Channel coast.
The entry of the French and Spanish (and later the Dutch) escalated the conflict into a world war. Facing attacks on colonies and shipping around the globe, the British no longer could concentrate troops and warships in North America to suppress the rebellion. Instead, they had to divert military resources to defend Gibraltar and Minorca, the West Indian sugar colonies, the slaving entrepot at Senegal, and the East India Company’s holdings in India.
A more dreaded draft consigned men to the Continental Army for three or more years. Draftees received a notice: “This is to inform you [that you] are this evening drafted as one of the Continental men to go to General Washington’s headquarters, and you must go or find an able bodied man in your Room, or pay a fine of twenty pounds in law[ful] money in twenty-four hours.” A prosperous conscript paid the fine or hired a substitute, but poor men had no choice but to serve or run away.
A French officer noted that Continental regiments “were composed entirely of vagabonds and paupers; no enticement or trick could force solid citizens to enlist as regulars.” Apprentices, transients, beggars, drunks, slaves, and indentured immigrants abounded among the new recruits, as communities thrust military duty on marginal men. A Continental officer described most of his recruits as “only Food for Worms . . . hungry, lean faced Villains.”
For want of gold and silver, Congress and the states printed millions of paper dollars, which rapidly depreciated toward worthlessness. A pound of beef cost 4 cents in 1777 but $1.69 three years later.
Soldiers also could band together to go on strike, which officers called “mutiny.” During mutinies, enlisted men refused to do duty until satisfied for their grievances, or they could march away as regiments bound for home. Rare in the early, active years of the northern war, mutinies became more common between 1779 and 1781. Soldiers usually mutinied during the winter or spring: the hungriest seasons of their service. Rather than embroiling the entire army, mutinies were sporadic and involved regiments from the same state. Not yet thinking of themselves as Americans, soldiers lacked bonds and trust across state lines. Officers relied on the animosities and prejudices between states to suppress mutinies. During the spring of 1780, when Connecticut regiments went on strike, Pennsylvania troops forced them to submit.
The post-1776 Continental Army belied the myth of heroic citizen-soldiers putting down the plow to pick up their muskets and win the war. In fact, a small regular army of poor men sustained the Patriot cause by enduring years of hard duty and public neglect.
Women even collected urine in chamber pots to help produce saltpeter, an essential but scarce component for making gunpowder.
Feeding Arnold’s resentments, Peggy urged him to seek greater justice and rewards with the British. She helped him open a covert correspondence with Major John André, who had been her friend during the British occupation of Philadelphia and later became General Clinton’s spymaster. Through intermediaries, Arnold, André, and Clinton reached a deal in August 1780, when Arnold obtained a new command: of West Point, the pivotal fortification in the Patriot defenses on the Hudson River north of New York City. In return for betraying the garrison and embracing Loyalism, Arnold would receive a whopping £20,000 payment, a lifetime pension of £500 annually, and a commission and command as a major general in the British service. Clinton hoped that such a high-profile defection would discredit Congress and initiate a cascading series of conversions that would doom the revolution. With Arnold’s help, Clinton predicted that “the Rebellion would end suddenly in a Crash.”
Delivering himself but not his post, Arnold received a cut-rate price from the British: £6,000 instead of £20,000. Still, he made more money from his service than did any other American general during the war.
Left behind by her fleeing husband, Peggy gave a masterful, wailing performance of utter shock and raving depression, which fooled Washington and his gallant officers, who pronounced her innocent of everything but great beauty. She left West Point for her father’s home in Philadelphia but later slipped away to join her husband in New York.
Clinton had hoped that Arnold’s betrayal would discredit the Patriot cause and promote mass defections. Instead, the Patriots spun Arnold’s betrayal as the consummate, but selfish and isolated, act of treason.
Revolutions breed civil wars: triangular struggles in which two sides compete for civilian support. British leaders wishfully believed that most colonists were Loyalists temporarily cowed by a minority of brazen and bullying rebels. “I may safely assert,” General Howe reported, “that the insurgents are very few, in comparison with the whole of the people.” Although committed Patriots were a minority outside of New England in 1775, Howe confused the colonial majority, which preferred to remain neutral, with committed Loyalists, who, in fact, were a minority even smaller than the Patriots.
John Adams famously calculated that, at the start of the revolution, “We were about one third Tories, one third timid and one third true Blue.” Adams overestimated the Loyalist proportion, which probably peaked early at a fifth of the population or about half the proportion of Patriots. He also undercounted the wavering, who comprised about two-fifths of the people.
Patriots demanded far more from common people than Parliament ever had.
We should not suggest that individuals made quick and definitive decisions based on political principles. Some did, but many more committed slowly, reluctantly, and provisionally. In one North Carolina county, the vaguely divided militiamen decided to cast their united lot based on the result of a fistfight between a Loyalist and a Patriot.
In the most common pattern, Loyalists belonged to local minorities who feared living in a republican society where a distrusted majority could dominate them. Unpopular and harassed by their neighbors, the Scots in North Carolina and Virginia rallied to the king as their protector. Anglicans and Quakers dreaded falling under the sway of their more numerous Congregationalist and Presbyterian rivals, who were early and staunch Patriots.
During the period 1774 to 1776, Patriots seized control of almost all the printing presses and militia units. As in other revolutions, a committed and organized minority led the way, demanded that others follow, and punished those who balked. In a hard fight against a powerful empire, Patriots refused to tolerate doubters and critics in their midst.
After partially scalping Brown, they carted him through the streets for public ridicule. By abusing and shaming Loyalists as despised outsiders, rituals of intimidation helped draw wavering neighbors into the Patriot ranks.
Thanks to superior communication and coordination across counties and colonies, Patriots could bring overwhelming force to bear on pockets of disaffection.
Meanwhile, farms and families suffered from the incarceration of a father or son. When released, a Loyalist often returned home to find it looted and burned and his family scattered. Some courts had Loyalists branded on the face or cut off their ears so that Patriots could recognize enemies in their midst.
A Patriot officer confessed, “Any Army, even a friendly one, if any can be called so, are a dreadful Scourge to any People—you cannot conceive what Devastation and Distress mark their Steps.” General Greene considered it “impossible to carry on a war without oppressing the inhabitants.”
And Patriots were masterful at turning enemy brutalities into vivid stories to demonize the British. Possessing less skilled polemicists and fewer printing presses, Loyalists faltered in the competition to convert atrocities into compelling propaganda.
For hanging Loyalists after quick, mock trials, Colonel Charles Lynch of Virginia turned his name into a verb.
While the invaders drew upon runaways for recruits, Patriots would have to reserve men to guard their slaves, reducing the troops available to resist invasion. With the threat of blacks and Indians, the British hoped to intimidate southern Patriot leaders into quick submission. John Stuart, the British Indian agent, had noted, “Nothing can be more alarming” to southern whites “than the Idea of an Attack from Indians and Negroes.”
In late 1778, General Clinton began the southern invasion by sending 3,000 troops in ships from New York to Georgia. The weakest and most vulnerable of Patriot states, Georgia had a long frontier with native peoples, and the small free population of about 18,000 feared their 15,000 slaves.
By the end of the war, blacks comprised a tenth of the Continental Army, serving at a rate double that of their proportion in the northern population.
Despite his private qualms about the plan, Washington similarly assured John, “I must confess that I am not at all astonished at the failure of your Plans. That Spirit of Freedom, which at the commencement of this contest would have gladly sacrificed everything to the attainment of its object, has long since subsided and every selfish Passion has taken its place.” The long, hard war had amplified Americans’ vices at the expense of their virtue.
His men were hungry and unnerved by the summer heat, leading Gates to lament, “Our Army is like a dead whale upon the Sea Shore—a monstrous Carcass without Life or Motion.”
Where British atrocities stiffened Patriots’ resistance, their cruelties spooked the Loyalists, a difference which predicted who would ultimately win the partisan war in the backcountry. A Patriot exulted, “The tories, who after the defeat of Genl. Gates had a full range, are chased from their homes, hunted thro’ the woods and shot with as much indifference as you would a buck.”
Despite the setback on his western flank, Cornwallis pressed deep into North Carolina in pursuit of the Continentals under a new commander: Nathanael Greene. A self-educated and genial former blacksmith from Rhode Island, Greene had a surprising self-confidence that impressed his men and superiors, especially Washington, who became his great patron. No general in the war, not even Washington, could do more with less than Greene, a talent which served the Patriots very well indeed in the embattled South.
The raiders captured munitions and seven state legislators, including Daniel Boone, who represented Virginia’s county in Kentucky.
Plain folk bristled when wealthy planters exempted themselves, their sons, and overseers from militia duty so that they could stay home to guard the enslaved. A common farmer complained, “The Rich wanted the Poor to fight for them, to defend their property, whilst they refused to fight for themselves.”
Desperate for fodder and food, both armies stole from civilians, ruining hundreds of farms and impoverishing thousands of people. On both sides, officers reasoned that any resources left behind would only fall to the enemy. But the worst looting and killing derived from the vicious competition of irregular partisans. Rarely paid, they sought compensation in plunder, which discouraged morality and restraint. They assassinated foes, executed prisoners, and looted and burned the homes of civilians caught in the middle. A Patriot partisan recalled victory in a skirmish where “Not one [Loyalist was] taken prisoner—for that occurred but seldom, the rifle usually saved us that trouble.” Both sides applied to whites the cruelties previously practiced in frontier war against Indians.
At night, both sides attacked houses to kill helpless, sleeping foes in their beds or to haul them out for brutal floggings that could become fatal. Hundreds of desperate families became “Outlyers,” abandoning their homes to live in hovels secreted in forests or swamps. A North Carolinian reported, “Fear of being called into the militia has driven many to hide in the woods, [and] as they have nothing on which to live they resort to highway robbery.” Bandit gangs added to the chaos by preying upon everyone.
Initially so promising, the British bid to conquer the South withered as they alienated most of the white people, including many formerly inclined to Loyalism. The superiority of Patriot partisans also swayed the wavering majority. Cornwallis complained, “Colonel [Francis] Marion had so wrought on the minds of the people, partly by the terror of his threats and cruelty of his punishments, and partly by the promise of plunder, that there was scarcely an inhabitant between the Santee and Pedee [rivers], that was not in arms against us.” Marion’s mood was not improved when Loyalists captured and executed his son.
Patriot raiders burned fifteen more towns in March 1781. Burning out Indians proved wildly popular with southern settlers, who rallied to the Patriots and despised the British. Cherokees served the Patriot cause better as enemies than they could have as allies.
This violent redistribution of the enslaved disrupted their families and communities—dividing relatives and friends forever. By flirting with Indians, bandits, and runaways, the British unwittingly alienated most white southerners, who defended private property and white supremacy by rallying to the Patriots.
Although unreliable when thrust into combat with regulars, Patriot militiamen were plenty good at suppressing their Loyalist counterparts.
Despite some showy, early victories, the British forces were big enough to spread chaos but far too small to restore peace.
The Patriot cause merged a frontier hunger for Indian land with a dread of British power. Colonists feared confinement within a boundary patrolled by savage warriors allied to a domineering empire.
In the east, where Indians lived in poverty and small enclaves enveloped by settlements, Patriots demanded their armed support. Bowing to that pressure, southern New England’s Indians sent men to fight and die for the Patriot cause. Indeed, a greater proportion of enclave natives served in Patriot forces than did their white neighbors. Similarly situated in the Carolinas, Catawbas fought beside, and caught slaves for, the Patriots. By serving as allies, eastern natives sought Patriot protection for their enclaves. Instead, they suffered especially heavy casualties, which emboldened white neighbors to grab more native land.
No matter how little the British did to help Indian raids, they bore the blame so skillfully cast by Patriot writers.
Renowned as fertile, Kentucky attracted hundreds of settlers in 1775–1776, just as revolution disrupted the empire.
Exploiting Henderson’s unpopularity, Virginia’s Patriot governor, Patrick Henry, organized Kentucky into counties, appointed magistrates and militia officers, and promised generous land grants at bargain prices.
The Spanish had to proceed carefully because of their weakness in Louisiana. Sprawled along the entire western side of the Mississippi River, the colony stretched across the Great Plains to the Rocky Mountains to include about 828,000 square miles. But Spain had only 500 soldiers and 30,000 colonists: about an eighth of the native numbers within that vast expanse.
Most of the colonists lived in two modest clusters on Louisiana’s eastern margin in the Mississippi Valley. The main cluster occupied the lower river around the capital and seaport at New Orleans. In upper Louisiana, a secondary set of settlements emerged at the confluence of the Mississippi, Missouri, and Ohio Rivers (in the future state of Missouri) with St. Louis as the primary town. Most of the colonists were Francophones who had lingered after the colony’s transfer from French to Spanish rule in 1763. Enslaved Africans comprised the second-largest group. The Spanish-speaking minority consisted primarily of newcomers from the Canary Islands, who settled along the river below New Orleans.
His armed force included more than 400 free blacks, whom the Spanish more readily employed as soldiers than did either Britons or Patriots.
With the conquest of West Florida, Spain dominated the entire Gulf of Mexico.
During the 1770s, Spanish officials tripled their sales tax and applied it, for the first time, to coca, which Peruvian natives depended on to cope with the cold and thin air of their mountainous region.
Although well educated as a Jesuit, José Gabriel Condorcanqui Noguera claimed descent from the Incan royal line and adopted the regal name of Tupac Amaru.
Gálvez cracked in 1769 under the strain of fighting elusive Indians on the northern frontier. One night, Gálvez burst from his tent to announce a plan to “destroy the Indians in three days simply by bringing 600 monkeys from Guatemala, dressing them like soldiers, and sending them against” the natives. He then assumed, in succession, the identities of Moctezuma, the king of Sweden, St. Joseph, and finally God. But none of them could defeat the Indians. The viceroy decided that Gálvez needed to rest and recuperate. Two years later, he sailed back to Spain and won promotion to Secretary of the Indies, the cabinet post responsible for American policy. Madness in America was no bar to power in Spain.
Faster and nimbler than bison, mounted men armed with bows could maneuver and attack with deadly skill. By killing more bison, the Great Plains natives became better fed, clothed, and housed. Enriched by meaty protein, they raised the tallest children on the continent, taller even than the relatively well-fed people of the United States. The alluring combination of horses and bison drew more native nations to relocate onto the Great Plains. Coming from either the Rocky Mountains to the west or the Mississippi Valley to the east, the newcomers included Osages, Comanches, Cheyennes, Arapahos, Blackfeet, and the especially numerous peoples known to others as the Sioux and to themselves as Lakotas or Dakotas.
Moving southeastward out of the Rocky Mountains, Comanches drove out, killed, or captured their predecessors on the southern plains, primarily peoples known to outsiders as Apaches. Determined to keep their enemies weak, Comanches blocked efforts by French gun traders to push through to deal with desperate Apaches. By 1800, Comanches had grown in number to 20,000—twice as many as all other native peoples on the southern plains and more than the Hispanics of Texas and New Mexico.
Smallpox was deadlier for natives than for Euro-Americans, and people in the close quarters of an earth-lodge village suffered greater losses than did the nomads of dispersed and transient encampments. Often an epidemic affected everyone in a village, so that victims had no one to care for them.
By scalping their diseased foes, victors unwittingly carried away the virus to afflict their own friends and kin. The epidemic probably halved the native population of the Great Plains between 1779 and 1783.
A commanding admiral tried to coordinate his vessels by means of signal flags from his “flagship,” but the signals often produced confusion during battle.
He obtained a commission from Congress in 1776 and raided the Scottish port of Whitehaven in April 1778.
The French and British valued their West Indian colonies more than anything on the North American continent. Sugar cane raised by slave labor enriched planters and the merchants who transported sugar and rum to Europe. Taxes on slave imports and sugar exports generated greater revenue for the rival empires than any other colonial trade.
For the rest of the conflict, the empire sent more reinforcements to the Caribbean than to North America.
The 15 percent annual loss of troops to disease in the West Indies exceeded the 6 percent of those serving in New York or the 1 percent in Canada.
The French and Spanish could sustain larger forces in the Caribbean in part because they more readily enlisted and armed free blacks. These empires also had more free blacks to recruit because the Catholic church and their laws encouraged manumissions.
More pragmatic in tincturing their racism, the Spanish and French recognized that an armed and intermediary caste of free blacks tended to secure, rather than imperil, the slave system.
Cornwallis also crippled British prospects by accepting surrender terms that denied his Loyalist troops the protection of prisoner-of-war status. Unlike the redcoats, Loyalists taken at Yorktown risked harsh punishment, even execution, as deserters and traitors by the Patriots. News of Cornwallis’s indifference shocked and demoralized Loyalists elsewhere.
Giddy with relief, the British public erupted in celebration at the news, particularly the capture of de Grasse and his massive flagship, which multiplied the impression made by the victory. Jamaica’s merchants, planters, and legislators funded an octagonal stone temple to house a neoclassical statue of the heroic Rodney. Medals, ballads, poems, pottery, and prints celebrated him as the greatest of British admirals. “Rodney Forever” became the hit song that summer in London.
Instead of the looter who helped to lose America, Rodney became the hero who saved the more important colonies in the West Indies. The Crown promoted him to the aristocracy as a baronet and awarded a lifetime, annual pension of £2,000. His former critics in Parliament joined in extolling the admiral. Lacking such acclaim, Sir Henry Clinton (rather than Rodney) bore the blame for the debacle at Yorktown because empires need scapegoats as well as heroes.
During the 1760s and 1770s, India grew in importance to the empire as a prime source of commerce and revenue. While native princes controlled the interior, the British held four colonies on the coast: Bombay on the west and Bihar, Bengal (including the overall capital of Calcutta), and Madras on the eastern shore. The empire entrusted their management to the East India Company, which appointed an overall governor-general. Unlike in North America, British settlers were few and redundant in distant and crowded India, so several hundred officials and soldiers relied on thousands of native collaborators to run the colonies. Based in seaports, the East India Company projected power into the interior through military alliances with some princes at the expense of others in a divide-and-conquer strategy. Claiming paternalistic motives, officials insisted that British rule in India protected the Hindu majority from brutal Muslim princes and rapacious French interlopers.
Weak and dependent on French aid, Congress dutifully instructed its diplomats “to undertake nothing in the negotiations for peace or truce” without informing French leaders to secure their “concurrence, and ultimately to govern yourselves by their advice and opinion.” To further restrain Adams, Congress added Benjamin Franklin, John Jay, Henry Laurens, and Thomas Jefferson to the peace commission (but Jefferson declined to go).
By keeping the British in Canada, Vergennes also hoped to perpetuate their frictions with the Americans, who would then have to cling to their alliance with France. “But you will agree, Monsieur,” Vergennes assured a French diplomat, “that this way of thinking ought to be an impenetrable secret from the Americans. It would be a crime that they would never pardon.” But Vergennes would soon find that Americans were his superiors at deception.
In theory, Ireland was an independent kingdom, but its monarch was the British king, and his ministers controlled civil, military, and judicial appointments. Thousands of occupying redcoats dominated the Irish and served as a reserve for deployment overseas in an emergency. Although Catholics comprised three-quarters of the people, they were denied political and property rights. Most were poor laborers and peasants who rented small farms and cottages from Anglican landlords. Presbyterian dissenters prevailed in Ulster (in northern Ireland), and they also resented domination by the Anglican elite based in Dublin. Anglicans owned 90 percent of the land and held almost all government offices, military commissions, and seats in the Irish Parliament, which had less autonomy than a colonial assembly.
Of course, Adams resented the restraining instructions from Congress: “Those Chains I will never wear.” Wearied by Adams’s insistent fears, Franklin characterized him as “always an honest Man, often a wise one, but sometimes and in some things absolutely out of his Senses.”
In late October, Adams and Jay decided to deal directly with the British without consulting Vergennes. To their surprise, Franklin accepted that negotiating strategy. By pursuing distinct and secret negotiations with the British, the commissioners sought the best possible deal, but they did so in defiance of their instructions from Congress.
The war also doubled France’s national debt, creating a fiscal crisis that compelled the king to summon the long-suspended Estates-General, the French parliament. In 1789, that parliament initiated a revolution that destroyed the French monarchy and sucked Europe into a massive new war, which proved especially disastrous for the Spanish.
Likening the empire to an edifice, Lord Macartney declared that without America “the building not only looks much better but is a great deal stronger.”
West Indian planters denounced new imperial restrictions on the slave trade and slavery as attacks on the rights of property. Many regretted that they had not joined the revolution by their fellow slaveholders in North America. But the islanders had lacked that option given their dependence on the British market for their sugar and on the Royal Navy for protection. They were stuck when the empire abolished the slave trade in 1807 and emancipated the slaves during the 1830s.
In one late skirmish, John Laurens died in a reckless charge into a Loyalist ambush outside Charles Town. The Patriot champion of black enlistment and emancipation was shot by runaway slaves in the British service.
Militia victories culminated in many executions, and the heads of dead maroons decorated poles erected outside county courthouses in Georgia and South Carolina.
At the end of the war, Virginia had 236,000 slaves, up from the 210,000 at the start.
Thousands of former slaves sought havens within the British Empire, initially in Nova Scotia and later in West Africa.
Hamilton urged Washington to lead the army’s demand for the nationalist program, but the general balked, lest he “open the flood Gates of Civil discord” that would “deluge our rising Empire in Blood.” Washington preferred to wait and trust the states eventually to do right by soldiers and officers.
If Congress ever did make good on the certificates, most payments would benefit the speculators who bought them from desperate soldiers in 1783.
But civilians filled the hall and gallery of the statehouse to watch Washington resign. He aptly described his performance as theatrical: “Nothing now remains, but for the actors of this mighty Scene to preserve a perfect unvarying consistency of character through the very last act; to close the Dramma with applause, and retire from the Military Theatre.” An impressed congressman praised the “solemn and affecting spectacle.” Washington projected a dignified and selfless devotion to the republican cause: a precious rarity among his squabbling countrymen. John Adams later described Washington’s performance as “Shakespearean” and praised his “Excellence in Dramatic Exhibitions.”
He got support from London’s abolitionists, led by Granville Sharp, who organized the Sierra Leone Company to found a West African colony as a haven for free blacks. The abolitionists funded Peters to recruit black colonists with assurances of generous land grants and political equality. The company leaders promised, “No distinction is to be made, between them & the Whites.”
Alluringly named Freetown, the colonial capital had, by 1796, four hundred homes on a grid of nine streets. The black colonists relied upon an emotional, evangelical faith that distinguished them from their white neighbors, who were often transported criminals. A governor noted, “While the white inhabitants are roaring with strong drink at one end [of Freetown], the Nova Scotians are roaring out hymns at the other.”
That ground proved slippery. When Burke discharged a former Loyalist and accused horse thief for lack of evidence, a furious mob rushed in, seized the suspect, and hanged him in front of the courthouse door. This postwar violence troubled some elite Patriots as a threat to social order. Nathanael Greene denounced the “intolerance to persecute men for opinions which, but twenty years before, had been the universal belief of every class of society.”
Eight thousand refugees went to Britain, which one called “The Isle of Liberty and Peace.” But most of them struggled with drenching rain, bitter poverty, a high cost of living, cultural disorientation, and British prejudices against Americans as cunning cheats.
About a third of the Maritime Loyalists drifted back to the United States during the late 1780s or early 1790s, after the hatreds of the civil war had cooled.
The revolution led to virtually free land for settlers in British Canada while rendering land more expensive in the United States. Burdened by immense public debts incurred to wage the war, federal and state governments sold vast tracts of frontier land to speculators, who could immediately pay large sums in cash.
Parliament also funded the salaries for the colony’s executive and judicial officers, which ensured the Crown’s control over its servants. This arrangement reflected another British lesson drawn from the thirteen rebelling colonies: that elected assemblies had used their power over salaries to sway Crown officials. In the Canadas after the war, the British government paid those salaries to prolong colonial dependence on the empire.
Thanks to this British largesse, the Canadian colonists paid lower taxes than did Americans in the republican states. In 1794, Upper Canada’s landholders paid only 5 shillings in tax for every £100 in assessed property: a rate of one-quarter of one percent and a fifth of the tax rate in New York. An ironic consequence of the revolution was that the Patriot “winners” had to bear higher postwar taxes to finance their war debts.
Lord Thurlow explained that the British “have given them more civil liberty, without political liberty.” In sum, the British worked to deny the colonists both the civil motives and the political means for agitation.
Under the Articles, the states conceded only a few, limited powers to Congress: to wage war, conduct diplomacy, and arbitrate disputes between them. To satisfy small states, each delegation cast a single vote, so little Rhode Island’s vote matched that of vast Virginia. On the major issues of war and peace, only a supermajority of nine states (out of thirteen) could commit the confederacy. The Articles required ratification by all thirteen states to become operative, and any future amendment also required unanimity, which discouraged any change.
Thomas Jefferson drafted the first “Northwest Ordinance,” which Congress provisionally adopted in April 1784. It subdivided the federal domain into ten territories and stipulated that, once any one had 20,000 free citizens, they could convene a convention to draft a republican constitution and send a delegate to Congress. When the settler population reached the threshold of the smallest original state, Rhode Island, the territory could join the union as an equal partner in its powers and share in the national debt. While holding out future statehood, the ordinance bought time for the federal government to sell land within the territories.
To help the Ohio Company attract respectable settlers from New England, the new ordinance outlawed slavery in the Northwest Territory. To reconcile southern congressmen, the ordinance also mandated a fugitive slave law, requiring territorial settlers to return runaways from the slave states. The ordinance also implicitly kept open more southern territories to slavery. The combination of a partial restriction on slavery’s expansion with a fugitive slave law set a precedent deemed fundamental to preserving the union.
Regarding Congress as hapless and hopeless, settlers fought their own war against Indians in the Ohio Valley. In 1786, Benjamin Logan and George Rogers Clark led Kentucky militiamen in massive raids that destroyed seven Shawnee villages, although their chiefs had cooperated with federal treaty councils. One of them, Molunthy, received an American flag, which he flew over his village as a source of protection. Instead, Kentuckians seized the chief and smashed his skull with a tomahawk. After blowing his body to pieces with gunpowder, they burned his village and its crops. While smashing Shawnees, the raids also discredited the federal government that had promised to protect them.
The growing settlements also alarmed their imperial neighbors: the British to the north in Canada and the Spanish to the southwest in Louisiana. From just 12,000 in 1783, Kentucky’s population exploded to 73,000 in 1790.
Another Spanish governor regarded the settlers as “distinguished from savages only in their color, language, and the superiority of their depraved cunning and untrustworthiness.”
In effect, the Spanish tried to build a settler fire wall to keep away more hostile settlers from the American orbit. During the 1780s and early 1790s, about 20,000 Americans moved to Louisiana. Some newcomers, including Daniel Boone, settled near St. Louis, but most developed farms and plantations around Natchez, on the east bank of the lower Mississippi. “May God keep us Spanish,” wrote one grateful settler.
McGillivray warned against gambling that Americans could become good Spanish subjects. He insisted that “filling up your country with those accursed republicans is like placing a common thief as a guard on your door and giving him the key.”
In 1784, Robert Morris resigned as superintendent of finance for want of any money to superintend.
Franklin agreed, “Our States are on the point of separation, only to meet hereafter for the purpose of cutting one another’s throats.”
During the revolution, however, they faced growing pressure from common voters, who often favored ambitious men of newly gained wealth. In 1786, the French minister to the United States reported, “Although there are no nobles in America, there is a class of men denominated ‘gentlemen,’ who by reason of their wealth, their talents, their education, their families, or the offices they hold, aspire to a preeminence which the people refuse to grant them.” Gentlemen cast their new rivals as ignorant, vulgar, and greedy men aspiring to more than their due. Robert Morris complained that republican politics favored “vulgar Souls whose narrow Optics can see but the little Circle of selfish Concerns.” This sounded obtuse coming from a man who had gotten rich by manipulating government contracts, subsidies, and land grants.
Common folk also bore the greatest sacrifices and hardships of the war. As the conflict dragged on, they resented their increasing burdens from taxes and militia service. They blamed leaders for waging a rich man’s war by making it a poor man’s fight.
By disrupting traditional networks of trade and political influence, the war created opportunities for daring and ambitious men to wheel and deal their way to new wealth and greater influence. They profited from military contracts and speculation in the confiscated estates of exiled Loyalists.
But Clinton repeatedly won reelection, governing the state for nearly twenty years by consolidating his popularity as the farmer’s champion. Common voters appreciated that Clinton celebrated their way of life instead of displaying the traditional elitism within his grasp.
Alexander Hamilton rose in the approved fashion to win acceptance into the New York elite. Although born poor and illegitimate in the West Indies, Hamilton secured a wealthy patron, college education, high-status wife (Schuyler’s daughter), and the mores and manners of his new peers. Brilliant, short, elegant, and handsome, Hamilton wore a self-assurance that belied his origins. After serving as an officer in the army, he became a lawyer and politician who championed the interests of wealthy merchants and great landlords, obscuring his early poverty. Gentility as well as wealth jointly qualified the proper leader.
To defend liberty, Patriots dispersed power among many legislators and rendered them collectively superior to the governor. Most governors lost their powers to appoint officials and veto laws. No state constitution allowed a governor to prorogue a legislature.
While granting sovereignty to “the people,” the state constitutions insisted that only men with property could vote or hold office. As in the colonial era, a voter had to own enough real estate to support his family, which meant having either a farm or a shop. Because small farms and workshops abounded, about two-thirds of white men qualified to vote. Patriot leaders insisted that only propertied men possessed the “independence” from a landlord or employer required to make judicious political decisions.
The new constitution broadened the electorate to include any man who had resided for at least a year in the state and had paid any tax. Where only two-thirds of men could vote under the colonial property requirement, 90 percent qualified as a result of the new taxpayer standard.
Distrusting human nature, they regarded inaction (or “gridlock” as we would put it today) as preferable to the hyperactive legislation of popular government.
It was folly, he insisted, to deprive the wealthy of all institutional power, for then they would subvert the government—as they were busy doing in Pennsylvania. If concentrated instead in a senate, elite power would become more conspicuous, inviting scrutiny and restraint from the lower house and common voters.
In a stroke of political genius, conservatives packaged a separation of powers as the essence of true republicanism. In New Hampshire, a conservative insisted that the different branches of a properly complex government should, by watching one another, “become centinels in behalf of the people to guard against every possible usurpation.”
The long, hard war had devastated the American economy. Roaming armies and frontier raiders uprooted thousands of people by destroying their farms, plantations, and towns. At least 25,000 Americans died in military service, usually of disease. As a percentage of the population, the mortality exceeded every American conflict but the Civil War of the 1860s.
Economic historians find a 30 percent decline in national income between 1774 and 1790: a decline which they characterize as “America’s greatest income slump ever,” and an “economic disaster.”
To reduce inflation, radical Patriots favored strict price controls, the confiscation and sale of Loyalist property, and tender laws that punished anyone who refused to take paper money in payment. Radicals blamed inflation on rich merchants who allegedly hoarded scarce commodities and sold them only when paid premium prices. The growing wealth of merchants like Robert Morris bred angry resentment among poor men, who blamed “greedy Muckworms.”
Women often took the lead in riots, gathering to break open merchants’ warehouses to seize sugar, coffee, and flour. Rioters left behind paper money at the price they deemed just. According to Abigail Adams, in July 1777 in Boston, a hundred women “assembled with a cart and trucks” to break open the warehouse of “an eminent, wealthy, stingy Merchant (who is a Batchelor).” They beat him until he gave up his keys. While they carted away his coffee, “a large concourse of Men stood [by as] amazed, silent Spectators.”
In early October, renewed disturbances led thirty armed conservatives, including Robert Morris and James Wilson, to hole up in Wilson’s mansion, which became known as “Fort Wilson.” Radicals and conservatives exchanged gunfire, killing men on both sides. Just as common militiamen broke down the front door, a genteel militia unit, the City Light Horse, arrived to save the mansion and its defenders. The authorities dared not prosecute anyone on either side for the violence. The Fort Wilson riot deepened conservative hatred for the Pennsylvania constitutional regime, which apparently tolerated economic regulation by mobs.
To fund those payments, most states levied taxes at unprecedented rates. In 1786, for example, taxes in Massachusetts were at least four times higher than before the war. The taxes were regressive, with the poor land of common farmers paying at a higher rate than the vast tracts held by land speculators. Many states relied on poll taxes levied at the same rate on every man, poor and rich. Recalling the prewar protests over small British taxes, some rural people complained, “Our Grievances Ware Less Real and more Ideal then they are Now.”
The postwar recession multiplied lawsuits for debts, and court judgments also demanded payment in hard money. If a debtor failed to pay, sheriffs seized his livestock and land for auctions, where property sold for a fraction of its appraised value. If a farmer hid his livestock, a creditor could cast him in jail as a hostage for the debt. A gloomy debtor insisted that New England had become “a country, which holds out nothing but poverty and prisons.” Farmers dreaded losing title to their farms to become the dependent tenants of aristocratic landlords.
The legislature, however, badly represented rural interests because towns had to pay the costs of their representatives, so fifty-two poor towns economized by declining to send any. Naturally, the representatives of wealthier towns saw no reason to heed their petitions.
Across the country, conservatives overreacted to Shays’s Rebellion, which they improbably described as a leveling movement meant to abolish all debts and confiscate great property to benefit the poor. Henry Knox assured Washington, “Their creed is that the property of the United States has been protected from confiscation . . . by the joint exertion of all, and therefore ought to be the common property of all.”
Some states offered more limited measures to delay or reduce taxes or allow payments of debts or taxes in farm produce rather than in scarce cash.
Although no state, save Pennsylvania, was particularly democratic, conservatives did not like to compromise when their property was at stake. They regarded state constitutions as pretty on paper but failures in practice, for too many alleged demagogues won legislative seats by pandering for popularity. Dr. Benjamin Rush detected a new pathology in American minds, a disease he named “Anarchia” and described as “the excess of the passion for liberty . . . which could not be removed by reason, nor restrained by government.” By 1787, conservatives concluded that the revolution had gone too far.
As he saw it, the problem was that too many people could use state governments to shut their purses to him while they reached into his pocket.
The eldest son of the wealthiest man in a Virginia county, Madison inherited hundreds of acres, scores of slaves, and the leisure time to read widely. Attending the College of New Jersey (now Princeton), Madison studied moral philosophy and graduated in 1772. Returning to Virginia, he became active in his county’s Committee of Safety and the colony’s Provincial Congress. Scholarly, sickly, astute, and shy, he worked best in a library or on a committee rather than on the stump in an election. Joining Congress in 1779, he grew frustrated with regional rivalries and the weak confederacy.
Madison defied conventional thinking, which insisted that a republic could only work on a small scale, no larger than a state, so that the legislators could know their constituents and feel the local consequences of their laws. The diverse climates, landscapes, ethnicities, and classes in the vast United States seemed to defy unification in one superrepublic. Past republics had worked best on the relatively intimate scale of a city, a province, or a canton, which enjoyed relative homogeneity. Madison retorted that no American state enjoyed homogeneity and harmony, least of all the smallest one, Rhode Island.
In the New York delegation, Clinton assigned two clients to keep a close eye on the third member, Hamilton.
Through self-discipline and hard service, he became known as the consummate republican leader of virtue, dignity, and honor. Sensitive to his public audience, Washington weighed every word, gesture, and decision with painstaking care. His vaunted reputation gave him great political leverage, but he dreaded losing his impeccable aura with any public controversy.
The fifty-five delegates were mature, wealthy, and well-educated gentlemen from prestigious families. Aside from two Catholics, all were Protestants, and 70 percent belonged to the relatively elitist Episcopal Church. None adhered to the more plebeian evangelical churches. At a time when few men (and no women) had access to higher education, twenty-nine of the delegates held college degrees. Professionally, they were a mix of lawyers, merchants, great planters, and landlords. Twenty-five owned slaves. Aside from the wealthy Pennsylvania merchants, the richest delegates came from Virginia and South Carolina, where they possessed vast plantations worked by scores of the enslaved. Most had federal experience as congressmen (forty-two) or Continental officers (thirty). No delegate embodied the perspective of the American majority: the hinterland farmers of modest means.
To discourage snoopers, delegates kept the hall’s windows and doors shut through the heat and humidity of a Philadelphia summer. By excluding public feedback, they sought frank discussions. Madison later insisted, “No Constitution would ever have been adopted by the convention if the debates had been public.”
A Connecticut delegate declared that Hamilton’s plan “had been praised by everybody” but “supported by none.” The delegates could not afford to embrace a controversial British-style mixed constitution. Citing an ancient precedent, Pierce Butler of South Carolina concluded, “We must follow the example of Solon who gave the Athenians not the best Government he could devise but the best they would receive.”
The heated debates over representation exposed a second, even more dangerous fault line between northern and southern states. Slaves comprised less than 4 percent of the northern population compared to 40 percent in the South.
Some northern delegates argued that only free citizens should count in allocating seats in the House of Representatives. Gouverneur Morris denounced the inequity that the inhabitant of Georgia and South Carolina who goes to the Coast of Africa, and in defiance of the most sacred laws of humanity, tears away his fellow creatures from their dearest connections and damns them to the most cruel bondage, shall have more votes in a government instituted for the protection of the rights of mankind, than the citizen of Pennsylvania or New Jersey who views with a laudable horror so nefarious a practice.
In August, another heated debate erupted over continuing the import slave trade. On this issue, southern delegates divided. Seeking to expand their operations, planters in the Lower South demanded continued imports from Africa. The Upper South’s leaders, however, believed that they had a surplus and could profit by sales to the Lower South. Banning import competition would enhance those profits, but principle also played a role with some delegates, particularly Madison, who insisted that importing more slaves would “dishonor” the nation.
Saving the union and promoting profits ultimately mattered more than any moral principles against bondage for others. Thanks to this compromise, the Lower South would import another 200,000 Africans as slaves during the next twenty years.
Delegates longed to believe that slavery eventually would wither without Americans having to make difficult sacrifices.
They cleverly claimed the popular name of “Federalists,” which previously had meant a supporter of strong states and a weak confederation. Better still, the newly minted “Federalists” cast their critics as mere contrarians called “Anti-Federalists,” whom they characterized as cunning demagogues bent on cheating their creditors. Winning that propaganda battle, the Federalists created labels that stuck with the press and historians ever since.
Anti-Federalists pointed out that the constitution created a national government with greater power than Britain had ever exercised over the colonists.
An American national identity emerged later, slowly, painfully, and partially. It would follow from that constitution rather than lie behind its creation.
Only in Rhode Island did Anti-Federalists have the wit to call the Federalists’ bluff by holding a more direct appeal to the people: a referendum, where Rhode Islanders voted against the constitution by ten to one.
In the national campaign, Federalists enjoyed better networks for sharing information and political strategy. As congressmen, Philadelphia delegates, or Continental officers, many had worked with one another across state lines. A Connecticut Anti-Federalist lamented that the Federalists “have got almost all the best Writers (as well as speakers) on their side.” A New York Anti-Federalist noted, “The great easily form associations,” but “the poor and middling class form them with difficulty.”
Cities hosted most of the nation’s newspapers, and their editors usually agreed with their advertisers and subscribers, who were overwhelmingly Federalists. Eighty of the nation’s ninety-two newspapers tilted toward the Federalists, favoring their essays, particularly the celebrated series, The Federalist, written by Hamilton (51 essays), Madison (29), and Jay (5). Because postmasters doubled as merchants or printers, most supported Federalism, so Anti-Federalist letters and pamphlets often got lost or delayed in the mail. Thanks to distortions by press and post, readers could gain the misleading impression that the Federal Constitution was wildly popular elsewhere. Northerners read that Patrick Henry had endorsed the constitution when, in fact, he was leading the opposition in Virginia.
In the spring, two more states ratified overwhelmingly: Maryland in April and South Carolina in May. In both states, the apparent inevitability of national ratification had deflated opposition.
On June 21, New Hampshire narrowly provided that ninth state, but it was a minor prize compared to two major holdouts: New York and Virginia. A union without them would be painfully incomplete and likely to falter. Both favored Anti-Federalism, for their voters preferred to live in strong states within a weak union. Both states also had especially able, clever, and committed Anti-Federalist leaders: New York’s Governor George Clinton and Virginia’s Patrick Henry, whom Jefferson considered “the greatest orator that ever lived.” Considering his political clout invincible, Jefferson once observed to Madison, “What we have to do I think is devoutly pray for his death.” Clinton feared losing the state’s robust revenues from taxing imports, for his popularity derived from minimizing internal taxes. Henry dreaded that northerners would dominate a powerful nation and could threaten southern slavery.
A stronger union would benefit Virginia, he argued, more than any other state. After three weeks of debate, Madison’s dogged optimism trumped Henry’s gloomy eloquence.
As the new administration’s prime mover, Alexander Hamilton cultivated “a principle of strength and stability in the organization of our government, and vigor in its operations.”
The great American nightmare was that foreign powers would exploit the internal divisions of a tenuous union.
Washington dreaded the new office as inevitably contentious and potentially fatal to his cherished reputation. He felt like “a culprit who is going to the place of his execution,” as he exchanged “a peaceful abode for an Ocean of difficulties.” Washington ensured the Constitution’s early survival by administering the new government with careful deliberation, good judgment, and public dignity. John Adams later recalled Washington as “the best actor of Presidency We have ever had.”
Washington went along with the quasi-regal style, acting with cold formality while wearing a dignified, dark suit accessorized with a ceremonial sword. He rode in a richly decorated coach drawn by six white horses and attended by four servants in orange-and-white livery. During public appearances, officials called him “Your Excellency,” and bands sometimes played “God Save the King.”
Rather than alarm southern leaders, Madison also avoided a philosophical preface on human equality featured in the bills of rights adopted by most states. Consequently, no one called Madison’s slate of amendments “the Bill of Rights” until the 1860s.
Madison’s amendments did give North Carolinians enough political cover to enter the union in November 1789. Rhode Islanders remained stubbornly independent until Congress put economic screws to the little republic by barring its trade with the United States in May 1790. At the end of that month a special state convention narrowly voted to rejoin the United States, which once again had thirteen states. The union added a fourteenth state with Vermont’s admission a year later.
Madison’s amendments protected only the free, leaving untouched the slavery suffered by a fifth of the American people. In February 1790, Pennsylvania Quakers and other antislavery activists submitted two petitions to Congress. The petitioners included an aged and dying Benjamin Franklin, who had owned slaves but had developed new scruples.
Overriding state legislatures and courts, the federal law authorized masters or their agents to pursue and seize alleged runaways in another state. Entrusting adjudication to any magistrate, the law denied trial by jury or the protections of habeas corpus to the accused runaways. Tacitly permitting kidnappings, so long as the victims were black, the law empowered southern agents who profited by hunting down and seizing supposed runaways in northern states.
Brash and energetic, he meddled in every department, including foreign affairs, to Jefferson’s disgust. Attentive to details and the big picture, Hamilton designed both policy and the bureaucracy to implement it. By deploying patronage, he built support in Congress and the states. A critic grumbled, “Mr. Hamilton is all powerful and fails in nothing which he attempts.”
Locating the temporary capital in Philadelphia created a problem for slaveholders, who by state law could keep a slave in Pennsylvania for no more than six months. Washington brought a few household slaves to Philadelphia but shuttled them back to Mount Vernon every few months to keep them from claiming freedom. “I wish to have it accomplished,” he explained, “under [some] pretext that may deceive both them and the Public.”
During the fall of 1791, Arthur St. Clair led 1,400 federal soldiers deep into Indian country. Before dawn on November 4, about 1,000 warriors surprised the sleeping troops in their poorly guarded camp. Quickly overwhelmed, they broke and fled, suffering 630 dead in the greatest single victory won by Indians over Americans during their long history of conflict. The victors stuffed the mouths of the enemy dead with soil to mock their fatal lust for Indian land.
By 1800, half a million Americans lived west of the Appalachian Mountains, three times as many as in 1790. The newcomers spilled over the western border into Louisiana, where Spanish officials reluctantly welcomed them for want of any way to stop them.
Federalists regarded the regulation as a rebellion that threatened the new constitutional order. They insisted that “busy and restless sons of anarchy” were bringing “us back to those scenes of humiliation and distress from which the new Constitution has so wonderfully extricated us.”
Standing down, the regulators stayed home rather than confront overwhelming force. Federal troops arrested twenty supposed rebels, whom they hauled back to Philadelphia for trial. They included Herman Husband, an aged religious mystic and a refugee from the bloody suppression of the regulation in North Carolina in 1772. After a year in a frigid jail, Husband was released, but he died of pneumonia on his journey home.
The American Revolution had helped to provoke the French Revolution by generating a massive war debt that the crown could not finance and by setting a republican precedent for replacing that monarchy.
Initially, almost all Americans welcomed the revolutionary creation of a sister republic by their wartime allies. Americans hoped that republicanism would become a moral contagion that could liberate the entire world from kings and aristocrats. Americans celebrated French victories, sang their revolutionary songs, waved the tricolored revolutionary flag, wore tricolored cockades on their hats, and even donned red French liberty caps. In Boston, an angry audience rioted and demolished a theater after concluding that a play had mocked a French character.
As the French and British escalated their naval warfare, both powers pressured the United States for assistance. The British counted on America’s dependence on British imports and its vulnerability to the superior might of the Royal Navy. The French expected American gratitude for their help in defeating the British during the War of the American Revolution. The Washington administration opted for neutrality because the deeply indebted and politically divided United States could ill afford a new conflict. In April 1793, Washington issued a proclamation of neutrality, barring Americans from assisting the French as privateers or filibusterers.
By continuing to celebrate the French Revolution, Republicans terrified Federalists, who dreaded that their rivals meant to copy the bloody French Jacobins. Cultivating a xenophobic American patriotism, Federalists disdained foreign revolutions as perversions. They cast the Republicans as traitors in league with “foreign disorganizers.” Federalists claimed that only the United States could sustain a stable republic—and only so long as they kept European ideas and conspirators away.
British repression accelerated Irish migration to the United States, where most of the newcomers voted for Republicans as the enemies of their enemies, the British. Disgusted Federalists denounced the newcomers as “United Irishmen, Free Masons, and the most God-provoking Democrats on this side of Hell.” Adopting a nativist position, Federalists derided most immigrants as too ignorant, poor, violent, and brutish to become citizens. A Federalist journalist declared that “every United Irishman ought to be hunted from the country, as much as a wolf or a tyger.” A Federalist congressman insisted, “The time is now come when it will be proper to declare that nothing but birth shall entitle a man to citizenship in this country.”
Federalists also blamed the French Revolution for instigating a massive and bloody slave revolt in the Caribbean. The greatest slave revolt in the Americas erupted on the evening of August 22, 1791, later known as the “Night of Fire.” On the northern plain of the French West Indian colony of Saint-Domingue, enslaved Africans began to kill overseers and masters and torched the buildings and cane fields of a thousand plantations. The revolt caused an international sensation because Saint-Domingue was the wealthiest colony in the Caribbean, for its 465,000 slaves produced richer crops of coffee and sugarcane than in the entire British West Indies. About the size of Maryland, Saint-Domingue occupied the western third of the island of Hispaniola, with the Spanish holding the eastern two-thirds as their colony of Santo Domingo. Because of brutal work conditions and tropical diseases, slave deaths exceeded births in the colony, so the planters imported thousands to replace the corpses. As a consequence, most of the enslaved in Saint-Domingue were Africans by birth.
The French sought to rally thousands of black men to defend the French West Indies while promoting slave revolts in the British West Indies.
Rejecting the racial radicalism of the French Revolution, Napoleon resolved to restore French control and plantation slavery in Saint-Domingue.
Their new country, named “Haiti,” became the second new nation, after the United States, to win sovereignty in the Americas.
To fend off that storm, Jefferson favored emancipating and deporting Virginia’s slaves over the course of two generations. But his fellow southerners rejected his plan as too expensive and economically debilitating.
A few fearful slaves sought to save themselves by revealing the plot to their masters, who alerted militia officers. Called into service, militiamen patrolled the roads and arrested suspects. The trials commenced on September 11, and executions began the next day.
Jefferson also tried to pay down the national debt that Hamilton had designed for perpetuity. During their twelve years in power, the Federalists had increased that debt from $76 million to $83 million; during his eight years as president, Jefferson reduced it to $57 million.
In 1811, the United States spent only $1 per capita, a mere twenty-fifth of the public expenditures in Great Britain. A French traveler reported that Jefferson’s administration was “neither seen [n]or felt.”
Hamilton noted that “the courage and obstinate resistance made by black inhabitants” had destroyed the invaders and induced Napoleon to cut his losses in America by selling Louisiana to the United States at a bargain price in 1803. Three years earlier, Napoleon had extorted that vast colony from the Spanish king.
Jefferson feared becoming entrapped by Marshall’s quick mind and rigorous logic. The new president assured another judge, “So great is his sophistry [that] you must never give him an affirmative answer, or you will be forced to grant his conclusions. Why, if he were to ask me whether it were daylight or not, I’d reply, ‘Sir, I don’t know, I can’t tell.’”
Jefferson held the presidency for eight years, while Marshall served as Chief Justice for thirty-four years thanks to a life tenure provided by the Federal Constitution. Marshall participated in more than a thousand decisions, writing over half of them. Thanks to that prodigious output during a critical generation in the consolidation of the federal union, Marshall had an impact at least comparable to Jefferson’s. The president complained that Federalists had “retired into the Judiciary as a stronghold.”
By 1819, Jefferson and his Republican successor, James Madison, had appointed most of the Supreme Court justices, but the justices had come around to Marshall’s Federalist philosophy, to Jefferson’s dismay.
But any political victory is temporary. Like a kaleidoscope, we continue in every generation to make new combinations of clashing principles derived from the enduring importance and incompleteness of our revolution. The revolution remains embedded as selective memory in every contemporary debate.
Strained by revolution and migration, old hierarchies had to be reinvented in a new republic. Although excluded from formal politics, poorer colonists and women could protest violently as urban mobs and rural regulators.
Once limited to the governing elite, gentility spread to the middle class. Prosperous tradesmen, shopkeepers, and farmers read more, adopted better manners, and wore nicer clothes. Teacups and saucers, carpets, clocks, and looking glasses proliferated in their homes as they entertained more visitors. Watching closely, they measured one another for signs of refinement or vulgarity. A dread of scorn compelled attention to every detail of appearance and performance.
Numerous in the Northeast and Middle Atlantic, the outposts of middle-class gentility were scarce in the West and scarcest of all in the South, where most rural folk preserved a plain style of life.
In 1786, Jefferson pitched a secular and public system of education for Virginia. He reasoned that “the tax which will be paid for this purpose is not more that the thousandth part of what will be paid to [the] kings, priests, and nobles who will rise up among us if we leave the people in ignorance.” But most citizens preferred to keep their taxes low, so Virginia’s legislators rejected Jefferson’s plan.
Comprehensive, public school systems did not emerge beyond New England until the 1820s and 1830s and only in the other northern states.
The colonies had supported only nine colleges, which each year collectively graduated fewer than 200 students, most of them from the genteel elite. After the war, reformers sought to expand colleges and recruit more middle-class students. By 1815, the United States sustained thirty-three colleges, but that growth barely kept up with the exploding population. Many rustics still distrusted colleges as training grounds for potential aristocrats.
The early republic promoted more self-education than public education as many people exploited the greater availability of print. Thanks to rudimentary education at home, about three-quarters of free American adults were literate: one of the highest rates in the world.
The 69 post offices of 1788 mushroomed to 903 in 1800, when the postal system carried nearly 2 million copies of newspapers to far-flung readers, giving the United States the largest and widest circulation in the world.
Foreign visitors marveled at the number and influence of American newspapers, which became pivotal to republican politics, for shrewd editors could make or break candidates. The French visitor Alexis de Tocqueville noted “Only a newspaper can put the same thought at the same time before a thousand readers.” But many moralists worried that newspapers put the wrong thoughts in too many minds, spreading slander rather than insights. In 1807, Jefferson concluded, “Nothing can now be believed which is seen in a newspaper. . . . The man who never looks into a newspaper is better informed than he who reads them.” A free press might threaten, rather than bolster, the fragile new republic.
By seeking European approval, Americans kept their art derivative and prolonged their cultural dependency.
Before 1820, American publishers preferred to pirate books and articles from Britain rather than publish American writers. English texts were more fashionable and less expensive for publishers who, by law, had to pay royalties to American authors but not to foreign writers. Susanna Rowson’s novel Charlotte Temple (1791) sold very well, but she reaped few profits because American printers pirated it from the London edition. Because of that unintended consequence of copyright law, no American novelist could live by her or his work until the 1820s.
“If government can answer for individuals at the day of judgement, let me be controlled by it in religious matters; otherwise, let men be free.”
In 1789, a Methodist preacher described “the true disciple of Christ” as “meek in heart, thirsting after holiness, crucified with Christ, and dead to the world.”
While prioritizing spirituality, evangelicals did not withdraw from the world, for they expected believers to work hard and honestly—and then support church and charity. Methodism’s founder, John Wesley, exhorted, “Earn all you can, save all you can, give all you can.”
Conservatives argued that a republic could not survive without the virtuous and moral citizens promoted by state-mandated financial support for religion. In 1784, Richard Henry Lee defended a religious establishment in Virginia: “The experience of all times shows Religion to be the guardian of morals—and he must be a very inattentive observer in our Country, who does not see that avarice is accomplishing the destruction of religion, for want of a legal obligation to contribute something to its support.”
From 1784 to 1786, Madison deftly outmaneuvered the state’s governor, Patrick Henry, who wanted a mandatory tithe paid by all taxpayers to support incorporated churches. In the key move, Madison persuaded Presbyterians that they would lose out to Anglicans under Henry’s proposal. Madison boasted to Jefferson, “The mutual hatred of these sects has been much inflamed. . . . I am far from being sorry for it.” Sectarian rivalry helped Madison make the case for dissolving the church-state alliance.
Virginia’s example proved slowly contagious. None of the new states created after the revolution provided for a church establishment, and those of the old states gradually toppled: New York in 1777; Virginia in 1786; South Carolina in 1790; Maryland in 1810; Connecticut in 1818; New Hampshire in 1819; and Massachusetts in 1833.
They promoted an emotional, physical style of worship that often provoked convulsions, known as “the jerks,” in listeners absorbed in spiritual ecstasy. This visceral worship appealed to settlers coping with the hardships of frontier life and skeptical of college learning.
To curry favor with France during the revolution, the states had repealed colonial laws barring Catholics from voting and holding office. Catholics made the most of their new legal equality to compete for souls through persuasion.
Protestant theology shifted with the more egalitarian times. During the colonial era, almost all Protestant churches clung to Calvinist doctrines, which emphasized inherent human depravity and utter dependence on divine grace for salvation. The postwar competition for believers led most denominations instead to emphasize individual free will in seeking and gaining salvation. Instead of dwelling on God the all-powerful father, preachers emphasized the mix of the human and divine in Jesus. Under the pressure of competition, most mainstream churches—Presbyterian, Congregationalist, and Lutheran—adopted the new emphasis on outreach, free will, and your own personal Jesus. They founded missionary organizations to compete for believers in the West, where the nation’s future emerged through expansion.
Visitors from Europe marveled that America’s many denominations and lack of a church establishment led to a broader and more intense interest in Christianity. A higher proportion of Americans attended church and adhered to religious values than in any other country of Christendom. Most American households owned a Bible—in stark contrast to Europe.
After 1800, most Americans regarded republicanism and Protestantism as mutually reinforcing, for both preached individual choice and encouraged voluntary association.
By shattering religious establishments, he had sought the triumph of secular thinking. Instead, disestablishment created space for an increasingly evangelical public culture.
In only one state, New Jersey, did some women gain the right to vote. In 1776, the state constitution neglected to specify citizenship as male, instead defining voters as “all free inhabitants” who met property and residence requirements—which qualified widows and spinsters (but not married women). Some eligible women exploited that oversight to vote during the 1780s, which led the New Jersey legislature explicitly to accept the practice in a 1790 election law that referred to a voter as “he or she.” During the 1790s, most widows preferred Federalist candidates rather than Republicans, who favored expanding rights for common white men while limiting those of women.
Republicanism opened all forms of social inequality and domination to criticism in the name of individual rights. More young people had sex before marriage in defiance of their parents. Growing steadily during the eighteenth century, premarital pregnancy peaked during the revolutionary generation at a third of brides. Throwing up their hands, magistrates stopped the traditional prosecutions of young people for fornication. Instead, new mothers sued presumed fathers for support if they had not married yet.
Most states also made it easier for women to seek divorces if abandoned or brutalized by their husbands. More couples simply and cheaply “self-divorced” as one bolted from the other to seek happiness outside of marriage. In one of many elopement notices published by jilted men, a Virginian advertised, “Whereas, my wife Annie Dixon has revolted from my bed, and refuses copulation . . . I will not pay any of the debts she contracts.”
Aside from Quakers, who rejected slavery as a sin, almost all colonists had regarded human bondage as natural and immutable. In the traditional society of British America, slavery was but the lowest rung in a social hierarchy of dependency, below white indentured servants, tenant farmers, women and children, and wage laborers.
John Jay headed New York’s antislavery society, but he continued to buy slaves and freed them only once he felt fully compensated by years of their hard labor. Although George Mason criticized slavery, he clung to his own slaves and, with consummate insensitivity, named one of them “Liberty.”
Jefferson agreed that southerners were “zealous for their own liberties, but trampling on those of others.” Whites especially cherished their own freedom because they denied it to the enslaved.
Most northern states gradually emancipated their slaves through laws designed to soften the blow for masters. In 1780, Pennsylvania’s legislature declared slavery “disgraceful to any people, and more especially to those who have been contending in the great cause of liberty themselves.” But the law freed no slaves then alive, only those born after the law and only once they turned twenty-eight years old. No slave, if properly registered by a master, could become free until 1808. A slave born a day before the act remained enslaved for life. Pennsylvania still had some elderly slaves until 1847, when the state finally and fully abolished the system.
The laws sought primarily (and very gradually) to free northern states of slavery, rather than to ensure freedom for the enslaved. Lawmakers meant to save taxpayers from having to compensate masters for their lost property. Instead, young slaves would pay for their eventual freedom by working without compensation into their mid-twenties. Economic historians calculate that this young labor recouped to masters 95 percent of the slaves’ market value.
Northern legislators acted more from a distaste for slavery than from empathy for the enslaved. Legal loopholes often enabled masters to sell slaves to southern buyers before freedom claimed them.
In the northern states, gradual emancipation laws increased the free black population from fewer than 1,000 in 1775 to nearly 50,000 in 1810. But the northern states still had 27,000 slaves that year because the gradual process of emancipation remained incomplete. The freed tended to gravitate from the countryside into the cities, where they found greater comfort and safety in numbers. By 1810, nearly a third of New York State’s blacks lived in New York City, and two-fifths of black Pennsylvanians resided in Philadelphia. By providing growing numbers for runaways to hide in, black neighborhoods pressured northern masters to sell early freedom to their remaining slaves. In the greatest boon of freedom, by 1820 most northern black couples could live together and keep their children.
Northern racism intensified as the free black population grew. Tocqueville noted, “The prejudice of race appears to be stronger in the states that have abolished slavery than in those where it still exists.”
Slaves could join their masters in genteel parlors and public conveyances, but free blacks faced exclusion. Even the antislavery societies of New York and Pennsylvania barred blacks from joining.
The manumitters included Washington, who freed his slaves in a last will and testament. Most manumitters, however, were Quakers or evangelicals with religious scruples.
The postrevolutionary manumissions modestly increased the free black population in Virginia from 2,000 in 1782 (1 percent of all black people) to 20,000 (7 percent) in 1810. Maryland’s free black population grew from 4 percent of blacks in 1755 to 20 percent in 1810.
Jefferson and Madison challenged Virginia’s colonial laws of “entail and primogeniture,” which had enabled a great landowner to bind his heirs never to subdivide or alienate any part of a plantation (including its slaves). Aristocratic in design, entail and primogeniture preserved great estates through the generations at the expense of all the children save the first-born son. Reformers denounced the colonial inheritance laws as the tyranny of a past generation overriding the rights of the living to buy and sell land and slaves as they wished.
A minor crop during the colonial era, cotton became profitable and widespread in the Lower South after the invention in 1793 of Eli Whitney’s cotton gin: a machine to separate valuable fibers from their husks.
South Carolina’s cotton exports soared from 9,840 pounds in 1790 to 6,425,000 pounds just ten years later. By 1810, exports surged to 50 million pounds, and planters scrambled to buy more slaves from Chesapeake traders.
In barns and secluded spots, they whipped backs and inflicted “cat-hauling”: dragging a cat by the tail along the bare back of a trussed-up victim.
From 700,000 in 1790, the number of enslaved doubled to 1.5 million in 1820. As foreign imports faded after 1807, natural increase accounted for most of the population growth. Between 1790 and 1860, slave traders and migrants herded over a million slaves south and west from the Chesapeake to expand southern society to the Mississippi and beyond.
Postrevolutionary America was the first society premised on individualism (for and by the free). Only in the United States did most men think of themselves as lone actors making free choices that determined success or failure.
In sum, the revolution generated clashing contagions, of slavery and liberty, and pitted them against one another.
Constitutional crises flared in every generation as one region or another threatened secession until those threats became real and bloody in 1860–1861.
-
Unsong by Scott Alexander – Reviewed
What It’s About
Not easy to describe, but NASA accidentally breaks mathematics and reveals the angels who control reality
How I Discovered It
The Rationalist community
Thoughts
One of the funnier books I’ve ever read – unique in concept and delivery.
What I Liked About It
Pretty much everything, very readable
What I Didn’t Like About It
Nothing really, though there were a lot of characters.
Who Would Like It?
Everyone
Related Books
Star Maker by Olaf Stapledon
Quotes
This is the kabbalah. The rest is just commentary. Very, very difficult commentary, written in Martian, waiting to devour the unwary.
In the Book of Job, God takes an innocent man and afflicts him with various curses. First He kills all Job’s cattle, sheep, and camels. Then He kills all Job’s servants, sons, and daughters. Then He covers Job from head to toe with boils and leprous sores. But—and this is crucial—He never asks Job, “Oh, great, and how do you like it?”
I won’t say I had gazed upon it bare, exactly, but in the great game of strip poker every scholar plays against the universe, I’d gotten further than most.
That was why you needed to know the rules. God is awesome in majesty and infinite in glory. He’s not going to have a stupid name like GLBLGLGLBLBLGLFLFLBG.
the clouds were forming ominous patterns, and Tuesdays had stopped happening. The Tuesdays were the most worrying part. For the past three weeks, people all over the world had gone to sleep on Monday and woken up Wednesday. Everything had been in order. The factories had kept running. Lawns had been mowed. Some basic office work had even gotten done. But of the preceding twenty-four hours, no one had any memories.
The Names of God are long, apparently meaningless, and hard to remember. I don’t know who first figured out that if you sing them to a melody, they’ll stick with you longer, but so they do. That’s why we call it choir practice, why I’m choir director, why the people who learn the Names are called Singers and Cantors. The twenty of us joined together in song.
“In a different part of the Talmud,” I said, “Rabbi Akiva gives a different explanation. He says that even the Heaven-bound righteous have a few sins, and since those sins won’t be punished in Heaven, they have to be punished here on Earth. Therefore, the righteous suffer on Earth. But even the Hell-bound wicked have a few virtues. And since those virtues won’t be rewarded in Hell, they have to be rewarded here on Earth. Therefore, the wicked prosper on Earth. Then people ask why the righteous suffer and the wicked prosper, and it looks like a mystery, but it actually makes total sense.”
But humans can’t leave well enough alone, so we got in the Space Race, tried to send Apollo 8 to the moon, crashed into the crystal sphere surrounding the world, and broke a huge celestial machine belonging to the Archangel Uriel that bound reality by mathematical laws. It turned out keeping reality bound by mathematical laws was a useful hack preventing the Devil from existing. Break the machinery, and along with the Names of God and placebomancy and other nice things we got the Devil back.
The Bible is silent on the subject, but Rabbi Klass of Brooklyn points out that during the 420 years of the Second Temple, there were three hundred different High Priests, even though each High Priest was supposed to serve for life. Clearly, High Priests of Israel had the sorts of life expectancies usually associated with black guys in horror movies.
The kabbalistic meaning of news is “the record of how the world undoes human ambitions.”
It had taken a kabbalistic rearrangement of the Midwest’s spatial coordinate system that rendered roads there useless, plus a collapse of technology so profound that airplanes were only able to fly if Uriel was having a really good day, plus the transformation of the Panama Canal into some sort of conduit for mystical energies that drove anyone in its vicinity mad—but America had finally gotten its act together and created a decent rail system.
I tend to think of the Central Valley as a nightmarish stretch of endless farms inhabited by people who, while not exactly dead, could hardly be called living.
ATTENTION. DUE TO A SCALE BACK IN COVERAGE, THE MORAL ARC OF THE UNIVERSE NO LONGER BENDS TOWARD JUSTICE. WE APOLOGIZE FOR THE INCONVENIENCE.
(Mark Twain once said, “There is something fascinating about science. One gets such wholesale returns of conjecture out of such a trifling investment of fact.” I think he would have liked Kabbalah.)
The hardest hit were the atheists. They’d spent their whole lives smugly telling everyone else that God and the Devil were fairy tales and really wasn’t it time to put away fairy tales and act like mature adults, and then suddenly anyone with a good pair of binoculars can see angels in the sky. It was rough.
In 1972, the President, Mr. Kissinger, and several other high officials took an unexpected trip to Yakutsk, where they opened full diplomatic relations with Hell. Nixon and Thamiel agreed to respect the boundary at the Bering Strait and cooperate economically and militarily against their mutual enemy.
Kissinger was lauded, but the real praise fell on Nixon, whose stern anti-Communist stance had given him the moral credentials he needed to forcefully defend his action. Thus the saying that sprang up in the wake of the trip: “Only Nixon can go to Hell.”
“Look,” Nixon told Kissinger, in one of the most damning tapes. “Everything I did, I did for the love of this country, I did it to fight Communism. But [expletive deleted] God isn’t going to see it that way. He’s going to be too soft to realize what had to be done. And I’m going to end up burning in [expletive deleted] for all eternity. Why the [expletive deleted] did I ever let you convince me to sign an alliance with [expletive deleted]?” “The idea behind the alliance was sound,” Kissinger answered. “We did not entirely understand how things stood at the time, but even if we had, I would have made the same suggestion. Brezhnev was getting too strong, especially with the Vietnamese and the South American communist movements. We did what we had to do. If the good Lord disagrees with me, I will be happy to point out His tactical errors.” “[expletive deleted] easy for you to say!” said the President. “You can talk anybody into anything. But I’m the one whose [expletive deleted] soul is on the line. Doesn’t the Bible say something about that? What use is it to something something the world if it costs you your soul? Something [expletive deleted] hippie dippy like that? I’m breaking the alliance. There’s no other choice.”
Spies reported that the Other King had been gravely wounded, a Fisher King wound that never healed, his mind intact but his body hopelessly mangled.
“I know your True Name,” he said. “You are Gadiriel, the Lady of Los Angeles, the maker of golems. The angel of celebrity and popularity and pretense. This is your work.” When Reagan next spoke, it was with a lilting feminine voice, one with a faint undertone of amusement. “Come, Jalaketu ben Kokab. Let’s go somewhere a little more private.”
He has loosed the fateful lightning of His terrible swift sword. His truth is marching on.” “I have read a fiery Gospel writ in burnished rows of steel,” said Jalaketu. “As ye deal with my contemners, so with you my grace shall deal. Let the hero, born of woman, crush the serpent with his heel.”
Reagan’s voice shifted back to the masculine, drawling register that would entrance millions. “Guess we lost track of the time out here! Well, Jala, are you ready? Let’s go make America great again!”
And Mexico is starting to industrialize really heavily—like, more heavily than any country has ever industrialized in all of history. Turns out communism works just fine when there are no individuals. The two countries start to prepare for war.
UM, said Uriel. I THINK OF GOD AS SORT OF INTERPLAY BETWEEN THE LOGICAL AND MORAL CONCEPTS OF NECESSITY, WHICH CAUSES UNIVERSES TO EXIST AND CONTAIN THE POTENTIAL FOR HOLINESS. I AM NOT SURE HE IS REALLY THE KIND OF ENTITY THAT GETS INVOLVED IN REAL ESTATE NEGOTIATIONS.
Fast cars! Fast women! Fastidious adherence to the precepts of the moral law! —Steven Kaas
There was Trump Hotel, whose etymology traced back to triumph and thence to thriambos, the orgiastic rites of the pagan gods of chaos.
“Tell me, Ms. Lowry, you’re a writer, what would be an appropriate message to put on a card for a letterbomb?” Valerie thought for a second. “How about—condolences on the recent death in your family?”
Hell made the first offer. US recognition of all Thamiel’s outstanding territorial conquests, including Russia, Alaska, Canada, and the US north of Colorado and west of the Mississippi—even Salish, which Hell had never actually managed to conquer. In exchange he would disarm all but a token remnant of his ICKMs. If not, he would nuke the United States, and let Reagan decide whether to launch a useless retaliation that would kill hundreds of thousands of innocents but allow the demons to recoalesce after a few months. Reagan made a counteroffer: not doing any of that. And if Hell used any nuclear weapons, he would nuke the whole world, destroying all human life. Thamiel’s goal, he said, was to corrupt humanity and make them suffer. Piss off the United States, and they would knock humanity beyond all corruptibility and pain forever. Some would go to Hell, others to Heaven, and that would be the end of that for all time. Mutually assured destruction was the only way that anyone had ever prevented nuclear war, and sometimes that meant threatening something terrible in the hopes that your enemy didn’t want it either. Reagan gambled everything on the idea that the Devil didn’t want a final end to all sin.
The ability of a vast empire to subsidize heroin stores was no match for the ability of addicts to want more heroin.
This man here on my right side is the incomparable Clark Deas, my trusted lieutenant. Comes to us all the way from Ireland, where he used to engage in ‘republican activity’ up in the parts where that means something a little more decisive than voting for tax cuts. Had his own splinter group for a while, the Deas IRA, which like all good splinter groups spent 95% of the time fighting people on its own side and the other 5% catching unrelated people in crossfires. With His Majesty’s finest breathing down his back, he joined millions of his countrymen in crossing the Atlantic to a promised land of wealth and freedom where all the policemen are blind and deaf and the streets are paved with plastic explosives.”
You say you have problems as great as my own
I am forced to admit it is true
But the thing is that my problems happen to me
Whereas yours only happen to you.
you’re going to make a certain mistake. It’s an easy mistake to make. You’re going to hate evil. And you’re going to think that’s enough. That it’s the same as loving goodness. It isn’t. It’s nowhere close. It will lead you to Hell—whether as tenant or landlord.
Asher spoke the Incendiary Name again. It hit home, and Brenda Burns went up in a conflagration of nominative determinism.
Ana wondered exactly what kind of a priest he was. Apparently the type who would agree to join an expedition to hunt down God if they paid him enough. Probably not Pope material.
THE REASON EVIL EXISTS IS TO MAXIMIZE THE WHOLE COSMOS’ TOTAL SUM GOODNESS. SUPPOSE WE RANK POSSIBLE WORLDS FROM BEST TO WORST. EVEN AFTER CREATING THE BEST, ONE SHOULD CREATE THE SECOND-BEST, BECAUSE IT STILL CONTAINS SOME BEAUTY AND HAPPINESS. THEN CONTINUE THROUGH THE SERIES, CREATING EACH UNTIL REACHING THOSE WHERE WICKEDNESS AND SUFFERING OUTWEIGH GOOD. SOME WORLDS WILL INCLUDE MUCH INIQUITY BUT STILL BE GOOD ON NET. THIS IS ONE SUCH.
-
American Colonies by Alan Taylor
From my Notion template
The Book in 3 Sentences
- A wonderful, incredibly detailed look at American history from the Big Bang to 1770 (or so). Taylor goes very, very, very deep into the particulars and surfaces a lot of hidden insight and many factors I would never have considered, like European weeds and agriculture. Slavery and disease were given their proper (i.e. enormous) importance in the telling of the story. The book would have been better as five shorter books (it is very packed with detail), but I have never come across a book that tells the complete story better than this one – by which I mean how the Native American tribes interacted with each other, how the European powers interacted with each other, and how the Native Americans interacted with the Europeans.
How I Discovered It
I read American Republics by the same author.
Who Should Read It?
Anyone interested in history.
How the Book Changed Me – AKA Random Observations
- I had never thought of the impact of European weeds and livestock on the natural environment – Taylor documents this very important phenomenon very well
- The balance of disease (for lack of a better term) greatly favored the Europeans, i.e. the diseases they brought were far deadlier and more numerous than the ones they got
- Spain seemingly set itself up to be an evil, villainous empire with no redeeming virtues in modern eyes
- There was a lot more religiously based human sacrifice in the Americas than I would have thought.
- Slavery was instituted everywhere slavery was profitable. The Europeans found more ways to make it profitable but the institution was well entrenched before colonization.
- Many of the Indian tribes were recent creations – tribes devastated by European arms, disease and displacement formed new tribes that I had always assumed had been around forever.
- From reading the book (the author doesn’t say this anywhere, but it seems like a reasonable guess) – the Europeans caused an 80% reduction in the population of any group they came into contact with
- Taylor does a very good job of explaining the role of time in all of these events. What I had previously thought of as “Euro group X came over and drove out Indian tribe Y and that’s why Virginia exists” turned into “Over a 120-year period there was a complex series of highly contingent events involving multiple generations of people, their leaders and their incentives”
- The role of the West Indies is fully explained, including why it had such a disproportionate share of money, protection and slaves. The moral depravity and suffering was truly epic.
- My instinctive bias against South Carolina is now more reasoned
- The concepts of “Native American” and “European” are about 80% modern projection. No one back then thought that distinction was as meaningful as we do now
- The entire colonization process is Garrett Hardin’s first law of ecology: “You cannot do only one thing”. The arrival of the Europeans, with their diseases, livestock, crops, weeds and weapons, had profound impacts on everyone, whether they interacted with the Europeans or not. A change in disease risk, guns, horses or the balance of power somewhere affects everyone everywhere
- Too many more to list
Summary + Notes
After about 1640, the great majority of free colonists were better fed, clothed, and housed than their common contemporaries in England, where half the people lived in destitution.
Between 1492 and 1776, North America lost population, as diseases and wars killed Indians faster than colonists could replace them.
Until lumped together in colonial slavery, the African conscripts varied even more widely in their ethnic identities, languages, and cultures.
Most diverse of all were the so-called Indians. Divided into hundreds of linguistically distinct peoples, the natives did not know that they were a common category until named and treated so by the colonial invaders.
In these cultural and environmental encounters, the various peoples were not equal in power. In most (but not all) circumstances, the European colonizers possessed tremendous ecological, technological, and organizational advantages, which demanded disproportionate adjustments by the Indians in their way and the Africans in their grasp.
In the colonies, that difference grew stronger over the generations as British America developed an especially polarized conception of race in tandem with greater political power for common whites. Unlike the French and the Spanish, the British colonies relied in war primarily on local militias of common people, rather than on professional troops. That increased the political leverage of common men as it involved them in frequent conflicts with Indians and in patrolling the slave population. In those roles, the ethnically diverse militiamen found a shared identity as white men by asserting their superiority defined against Indians and Africans conveniently cast as brutish inferiors.
Once race, instead of class, became the primary marker of privilege, colonial elites had to concede greater social respect and political rights to common white men.
Reading the United States back in time and geography to frame the colonial story has the distorting effect known as “teleology”: making all events lead neatly to a determined outcome, in the colonial case to the American Revolution and its republic. Teleology costs us a sense of the true drama of the past: the “contingency” of multiple and contested possibilities in a place where, and time when, no one knew what the future would bring. As late as 1775, few British colonists expected to frame an independent country. And very few Hispanics and fewer Indians wished for incorporation within such a nation.
In fact, it would be difficult (and pointless) to make the case that either the Indians or the Europeans of the early modern era were by nature or culture more violent and “cruel” than the other. Warfare and the ritual torture and execution of enemies were commonplace in both native America and early modern Europe. Without pegging Europeans as innately more cruel and violent, we should recognize their superior power to inflict misery.
Almost all early explorers and colonizers marveled at the natural abundance they found in the Americas, a biodiversity at odds with the deforestation and extinctions that the Europeans had already wrought in most of their own continent. Colonization transformed the North American environment, which had already experienced more modest changes initiated by the native occupation.
Dental, genetic, and linguistic analysis reveals that most contemporary Native Americans are remarkably homogeneous and probably descend from a few hundred ancestors who came to North America within fifteen thousand years of the present
reaching Labrador and Greenland by about twenty-five hundred years ago.
Note From Steve: Interesting to see if this holds up. That is very late.
Through some combination of climatic change and the spread of highly skilled hunters, almost all of the largest mammals rapidly died out in the Americas. The extinctions comprised two-thirds of all New World species that weighed more than one hundred pounds at maturity—including the giant beaver, giant ground sloth, mammoth, mastodon, and horses and camels. It is ironic that horses and camels first evolved in North America and migrated westward into Asia, where they were eventually domesticated, while those that remained in the Americas became extinct. The giant bison died out, leaving its smaller cousin, the buffalo, as the largest herbivore on the Great Plains. Of the old, shaggy great beasts, only the musk oxen survived and only in the more inaccessible reaches of the arctic.
Obtaining more to eat, more reliably, they resumed their population growth. The more local and eclectic Archaic way of life could sustain about ten times as many people on a given territory as could the Paleolithic predation on herds of great beasts.
Archaic Indians also began to modify the environment to increase the yields of plants and animals that sustained them.
Gender structured work roles: men were responsible for fishing and hunting while women harvested and prepared wild plants. In general, men’s activities entailed wide-ranging travel and the endurance of greater exposure and danger, while women’s activities kept them close to the village, where they bore and raised children.
The native peoples of North America spoke at least 375 distinct languages by 1492.
The Indians of central Mexico pioneered the three great crops of North American horticulture: maize, squashes, and beans.
The new horticulture also promoted economic differentiation and social stratification as the food surplus enabled some people to specialize as craftsmen, merchants, priests, and rulers.
The skeletons of early farmers reveal a want of sufficient salt or protein, episodes of early childhood malnutrition, and an overall loss of stature. Moreover, the denser populations of horticultural villages facilitated the spread of communicable diseases, principally tuberculosis, which was less common among dispersed hunter-gatherers.
Rather than horticulture, the most significant development for these people was their adoption of the bow and arrow after about A.D. 500.
The largest pueblo, at Chaco Canyon, required thirty thousand tons of sandstone blocks, stood four stories tall, and contained at least 650 rooms.
Founded in 1300, Acoma is probably the longest continuously inhabited community within the United States. Other
Like the people of central Mexico, the Mississippians regarded the sun as their principal deity, responsible for the crops that sustained their survival; they considered their chiefs as quasi-sacred beings related to the sun; and they practiced human sacrifice. When a chief died, his wives and servants were killed for burial beside him, as companions for the afterlife.
Within a century, European diseases, supplemented by European violence, killed most of the Mississippian peoples and transformed the world of the survivors.
The urban centers tended to collapse within two centuries of their peak, which obliged their inhabitants either to relocate or to revert to a more decentralized and less hierarchical mode of life, which allowed the recovery of wild plants, animals, and soils. Because native peoples more promptly felt the negative consequences of their local abuse of nature (relative to Europeans), they more quickly shifted to alternative environmental strategies.
Lacking horses and oxen, native North Americans knew the wheel only in Mesoamerica as a toy.
Consequently, in the North America of 1492, only the Aztecs of Mexico constituted an imperial power capable of governing multiple cities and their peoples by command.
compared with Europeans, the natives of America carried a more limited and less deadly array of pathogenic microbes.
By contrast, the Europeans of 1492 were the heirs to an older and more complex array of domesticated plants and animals developed about nine thousand years ago at the eastern end of the Mediterranean. The European mode of agriculture featured domesticated mammals—sheep, pigs, cattle, and horses—endowing their owners with more fertilizer, mobility, motive power, animal protein, and shared disease microbes. Building on a long head start and the power of domesticated mammals, the Europeans had, over the centuries, developed expansionist ambitions, systems of written records and communication, the maritime and military technology that permitted global exploration and conquest, and (unwittingly) a deadly array of diseases to which they enjoyed partial immunities.
Indians understood that humans could live only by killing fish and animals and by clearing trees for fields, but they had to proceed cautiously. Natives usually showed restraint, not because they were ecologically minded in the twentieth-century sense, but because spirits, who could harm people, lurked in the animals and plants. A healthy fear of the spirits limited how the Indians dealt with other forms of life, lest they reap some supernatural counterattack. Offended spirits might hide away the animals or the fish, afflict the corn crop, or churn up a devastating windstorm. Any success in hunting, fishing, or cultivating had to be accepted with humility, in recognition that the fruits of nature were provisional gifts from temperamental spirits.
Indian animism should not be romantically distorted into a New Age creed of stable harmony. In fact, the natives regarded the spiritual world as volatile and full of tension, danger, and uncertainty. To survive and prosper, people had to live warily and opportunistically.
The logic of restraint was animist rather than ecological—but that restraint tended to preserve a nature that sustained most native communities over many generations.
The Christian alienation of spirit from nature rendered it supernaturally safe for Europeans to harvest all the resources that they wanted from nature, for they offended no spirits in doing so. In wild plants and animals, the colonizers simply saw potential commodities: items that could be harvested, processed, and sold to make a profit.
A French priest in Acadia noted of the Indians, “They are never in a hurry. Quite different from us, who can never do anything without hurry and worry; worry, I say, because our desire tyrannizes over us and banishes peace from our actions.”
By offering such moral criticism, however, Christians helped to preserve a capitalist society from consuming itself. Indeed, without some moral counterweight and some sense of a higher purpose, capitalist competition degenerates into a rapacious, violent kleptocracy. Without a God, the capitalist is simply a pirate, and markets collapse for want of a minimal trust between buyers and sellers. The seventeenth-century English minister Thomas Shepard aptly commented that self-interest was a “raging Sea which would overwhelm all if [it] have not bankes.” Shepard did not wish to abolish self-interest, merely to strengthen its restraining banks. Christianity provided the banks that permitted capitalist enterprise to persist, prosper, and expand into the Americas.
A sixteenth-century Italian physician marveled “that I was born in this century in which the whole world became known; whereas the ancients were familiar with but a little more than a third part of
During the 1550s the explorer Jean de Léry reported that America was so “different from Europe, Asia and Africa in the living habits of its people, the forms of its animals, and, in general, in that which the earth produces, that it can well be called the new world.”
But the differences began to diminish as soon as they were recognized. The invasion by European colonists, microbes, plants, and livestock eroded the biological and cultural distinctions formerly enforced by the Atlantic Ocean. Newly connected, the two “worlds,” old and new, became more alike in their natures, in their combinations of plants and animals.
The environmental revolution worked disproportionately in favor of the Europeans and to the detriment of the native peoples, who saw their numbers dwindle. Although never under the full control of the colonizers, the transformation enhanced their power by undermining the nature that indigenous communities depended upon. Colonization literally alienated the land from its native inhabitants. In particular, the colonizers accidentally introduced despised weeds, detested vermin, and deadly microbes. All three did far more damage to native peoples and their nature than to the colonists. While exporting their own blights, the European colonizers imported the most productive food plants developed by the Indians. The new crops fueled a population explosion in seventeenth- and eighteenth-century Europe. Part of that growth then flowed back across the Atlantic to resettle the Americas as European colonies.
The long and usually secure trade routes of the Muslim world reached from Morocco to the East Indies and from Mongolia to Senegal. Within that range, Muslim traders benefited from the far-flung prevalence of Arabic as the language of law, commerce, government, and science.
Inspired by their literary fantasies, European visionaries longed to reach the Far East to enlist their peoples and wealth for a climactic crusade against Islam. As a fabulous land that could fulfill Europeans’ dreams, eastern Asia (and especially China) rendered the intruding barrier of the Muslim world all the more frustrating.
In 1469 the marriage of Queen Isabella and Prince Ferdinand united Aragon and Castile to create “Spain.”
But, with more greed than consistency, the Iberians also enslaved Guanche who had converted to Christianity in the vain hope of living peaceably beside their invaders.
In their invasion of the small and long-isolated Canaries, the Iberians reaped the perverse advantage of their relatively large population located at a nexus of commercial exchange, which made for an especially diverse and regularly reinforced pool of diseases.
So complete was the cultural destruction that only nine sentences of the Guanche language have survived.
At first, most of the slaves were Guanche, but they inconveniently and rapidly died from the new diseases. To replace the dead, the colonists imported Africans to work the sugar plantations. West African societies had long enslaved war captives and convicted criminals for sale to Arab traders, who drove them in caravans across the Sahara to the Mediterranean. This
By turning native peoples into commodities, for sale as plantation slaves, the invaders developed a method for financing the further destruction of their resistance. In the Atlantic islands, the newcomers also pioneered the profitable combination of the plantation system and the slave trade. In the fifteenth-century Atlantic islands (and principally the Canaries), we find the training grounds for the invasion of the Americas.
For in 1492 no one in Europe had any idea that the next islands farther west lay close to two immense continents inhabited by millions of people.
In 1498 Vasco da Gama exploited that discovery to enter and cross the coveted Indian Ocean, the gateway to the trade riches of the East. The profits kept the Portuguese focused on the southern and eastward route to Asia, leaving the westward route largely unguarded for their Spanish rivals to explore by default.
Spain pioneered transatlantic voyages, thanks to the aggressive ambition, religious mysticism, and navigational prowess of the Genoese mariner Christopher Columbus. In popular histories and films, Columbus appears anachronistically as a modernist, a secular man dedicated to humanism and scientific rationalism, a pioneer who overcame medieval superstition. In fact, he was a devout and militant Catholic who drew upon the Bible for his geographic theories. He also owned, cherished, and heavily annotated a copy of The Travels of Marco Polo, which inspired his dreams of reaching the trade riches and the unconverted souls of East Asia. Columbus hoped to convert the Asians to Christianity and to recruit their bodies and their wealth to assist Europeans in a final crusade to crush Islam and reclaim Jerusalem. Such a victory would then invite Christ’s return to earth to reign over a millennium of perfect justice and harmony.
What deterred Europeans from sailing due west for Asia was not a fear of sailing off the edge of the world but, instead, their surprisingly accurate understanding that the globe was too large.
Exploiting the trade winds, he turned west into the open ocean and had clear, easy sailing, reaching a new land after just thirty-three days.
But Columbus supposed that all of the islands belonged to the East Indies and lay near the mainland of Asia. Although the native inhabitants (the Taino) were unlike any people he had ever seen or read about, Columbus insisted that they were “Indians,” a misnomer that has endured.
Thanks to the newly invented printing press, word of Columbus’s voyage and discovery spread rapidly and widely through Europe. Eagerly
With the assistance of the pope, the Spanish and the Portuguese negotiated the 1494 Treaty of Tordesillas, which split the world of new discoveries by drawing a north-south boundary line through the mid-Atlantic west of the Azores.
In 1495 he shipped 550 captives to Spain for sale to help pay for his expedition. Because most died during the voyage or within a year of arrival from exposure to European diseases, Columbus had to abandon the project of selling Indians in Spain. Instead, he distributed Indian captives among the colonists to work on their plantations and to serve as sex slaves.
Violent mutinies and more violent reprisals by Columbus induced the monarchs to revoke his executive authority in 1500.
Although displaced as governor, Columbus continued to serve the Spanish as a maritime explorer. In 1498 and 1502 his third and fourth transatlantic voyages revealed long stretches of the South and Central American coast. Nonetheless, to his death in 1506, Columbus stubbornly insisted that all of his discoveries lay close to the coast of Asia.
A year later, Amerigo Vespucci, a Genoese mariner who alternated between Spanish and Portuguese employ, explored enough of the coast of South America to deem it a new continent. Consequently, European map-makers began to call the new land by a variant of his first name—America.
Although Columbus had not reached Asia, he did find the substance of what he sought: a source of riches that would, in the long term, enable European Christendom to grow more powerful and wealthy than the Muslim world.
With the Canaries as their colonial model, the Spanish aggressively modified Hispaniola, introducing new crops, especially sugarcane, and new animals, including cattle, mules, sheep, horses, and pigs.
colonization rapidly destroyed the Taino people of Hispaniola. In 1494 a Spaniard reported that more than 50,000 Taino had died, “and they are falling each day, with every step, like cattle in an infected herd.” From a population of at least 300,000 in 1492, the Taino declined to about 33,000 by 1510 and to a mere 500 by 1548. The great missionary friar Bartolomé de Las Casas mourned the virtual extermination “of the immensity of the peoples that this island held, and that we have seen with our own eyes.”
In sum, the natives suffered from a deadly combination of microparasitism by disease and macroparasitism by Spanish colonizers, preying upon native labor. Although not genocidal in intent—for the Spanish preferred to keep the Taino alive and working as tributaries and slaves—the colonization of Hispaniola was genocidal in effect.
In any given locale, the first wave of epidemics afflicted almost every Indian. Within a decade of contact, about half the natives died from the new diseases. Repeated and diverse epidemics provided little opportunity for native populations to recover by reproduction. After about fifty years of contact, successive epidemics reduced a native group to about a tenth of its precontact numbers.
Most scholars now gravitate to the middle of that range: about fifty million Indians in the two American continents, with about five million of them living north of Mexico.
Apparently only one major disease, venereal syphilis, passed from the Americas into Europe with the returning explorers and sailors. If so, syphilis exacted a measure of revenge on behalf of the native women raped by the invaders.
The Europeans died in far greater numbers when they tried to colonize sub-Saharan Africa, where they did encounter relatively novel and especially virulent tropical diseases, principally falciparum malaria and yellow fever. Unwittingly, the Europeans imported those African diseases into the American tropics and subtropics with the slaves brought to work on their plantations. Those African maladies then added to the epidemics that devastated the Native Americans.
In effect, the Old World diseases benefited from a much larger pool of potential hosts. Passing to and fro, these pathogens gradually strengthened the immunities of the disease-embattled peoples of the Old World, rendering them deadly carriers when they passed into places where those diseases were not endemic.
By living in filth, urban Europeans paid a high price in steady losses to endemic disease and occasional exposure to new epidemics. But they also rendered themselves formidable carriers of diseases to distant and cleaner peoples with far less experience with so many pathogens.
North American natives domesticated only one mammal, the dog, which rarely shares diseases with its best friends.
One disease often weakened a victim for a second to kill. For example, many Indians barely survived smallpox only to succumb to measles, pneumonia, or pleurisy.
For want of healthy people to tend the sick, to fetch food and water and keep fires going, many victims died of starvation, dehydration, or exposure.
Neither sixteenth-century natives nor colonizers knew about the existence of microbes, much less that some caused disease. Instead, both assumed that the epidemics manifested some violent disruption of supernatural power. Colonists interpreted the diseases as sent by their God to punish Indians who resisted conversion to Christianity. Indians blamed the epidemics on sorcery practiced by the newcomers. When the native shamans failed to stop or cure a disease, they became discredited as ineffectual against the superior sorcery of the newcomers, who survived epidemics that slaughtered the natives.
During the sixteenth and seventeenth centuries, the colonizers did not intentionally disseminate disease. Indeed, they did not yet know how to do so. Especially during the sixteenth century, the colonizers valued Indian bodies and souls even more than they coveted Indian land. They needed Indians as coerced labor to work on mines, plantations, ranches, and farms. And Christian missionaries despaired when diseases killed Indians before they could be baptized.
Prior to 1820, at least two-thirds of the twelve million emigrants from the Old to the New World were enslaved Africans rather than free Europeans. Most
From a population of 5 million in 1492, the inhabitants of Great Britain surged to 16 million by 1800, when another 5 million Britons already lived across the Atlantic. The
The demographic and colonial history of Africa offers an instructive contrast to North America. Despite inferior firepower, until the nineteenth century the Africans more than held their own against European invaders because African numbers remained formidable. Unlike the Native Americans, the Africans did not dwindle from exposure to European diseases, with which they were largely familiar. On the contrary, African tropical diseases killed European newcomers in extraordinary numbers until the development of quinine in the nineteenth century.
Native Americans had developed certain wild plants into domesticated hybrids that were more productive than their Old World counterparts. Measured as an average yield in calories per hectare (a hectare is ten thousand square meters, the equivalent of 2.5 acres), cassava (9.9 million), maize (7.3 million), and potatoes (7.5 million) all trump the traditional European crops: wheat (4.2 million), barley (5.1 million), and oats (5.5 million). By introducing the New World crops to the Old World, the colonizers dramatically expanded the food supply and their population.
In Europe, maize and potatoes endowed farmers with larger yields on smaller plots, which benefited the poorest peasants. It took at least five acres planted in grain to support a family, but potatoes could subsist three families on the same amount of land.
In effect, maize and potatoes extended the amount of land that Europeans could cultivate either to feed themselves or to produce fodder for their cattle.
During the eighteenth century, the potato first gained its close association with Ireland, and Irish numbers grew from 3 million in 1750 to 5.25 million in 1800. The Irish then became vulnerable to any blight that devastated their potato crop. When such a blight struck during the 1840s, thousands starved to death and millions fled overseas, primarily to North America.
Other European animals hitched along to the Americas despite the colonizers’ best efforts to prevent it. These included the European rats, which were larger and more aggressive than their North American counterparts. Hated parasites on crops and granaries, the rats were skilled stowaways in almost every wooden ship.
Today botanists estimate that 258 of the approximately 500 weed species in the United States originated in the Old World.
By a mix of design and accident, the newcomers triggered a cascade of processes that alienated the land, literally and figuratively, from its indigenous people.
In sum, native peoples and their nature experienced an invasion not just of foreign people but also of their associated livestock, microbes, vermin, and weeds.
To justify their own imperialism, the rival Europeans elaborated upon some very real Spanish atrocities to craft the notorious and persistent “Black Legend”: that the Spanish were uniquely cruel and far more brutal and destructive than other Europeans in their treatment of the Indians.
Alternating brutal force with shrewd diplomacy, Cortés won support from the native peoples subordinated by the Aztecs.
Every year it hosted public ritual human sacrifices of captured people, their chests cut open and their still-beating hearts held up to the sun.
The population of about 200,000 dwarfed the largest city in Spain, Seville, which had only 70,000 inhabitants. Accustomed to the din, clutter, and filth of European cities, Spaniards marveled at the relative cleanliness and order of the Aztec metropolis.
The conquistadores certainly benefited from the technological superiority of Spanish weaponry. Because sixteenth-century guns, known as arquebuses, were crude, heavy, inaccurate, and slow to reload, only a few conquistadores carried them (Cortés’s force of six hundred men had only thirteen guns). Instead, most relied on steel-edged swords and pikes and crossbows. Although essentially late-medieval, this steel weaponry was far more durable and deadly than the stone-edged swords, axes, and arrows of the natives. And
Spanish military technology also exploited horses and war dogs (mastiffs), both of which were new and stunning to Indians. Although most conquistadores fought on foot, the few with horses proved especially dreadful to the natives,
The conquistador expeditions were private enterprises led by independent military contractors in pursuit of profit. The commander ordinarily obtained a license from the crown, which reserved a fifth of the plunder and claimed sovereign jurisdiction over any conquered lands. Known as an adelantado, the holder of a crown license recruited and financed his own expedition, with the help of investors who expected shares in the plunder. Developed in the course of the reconquista and applied to the Canaries, the adelantado system reflected the crown’s chronic shortage of men and money.
Greed was a prerequisite for pursuing the hard life of a conquistador. Cortés meant to be disingenuous when he assured the Aztecs, “I and my companions suffer from a disease of the heart which can be cured only with gold.” Of course, he was more profoundly right than he realized.
After all, the conquistadores scrupulously adhered to the Spanish law of conquest by reading the requerimiento, which ordered defiant Indians immediately to accept Spanish rule and Christian conversion. If the Indians ignored this order, they deserved the harsh punishments of a “just war.” The requerimiento announced, “The resultant deaths and damages shall be your fault, and not the monarch’s or mine or the soldiers.” Attending witnesses and a notary certified in writing that the requerimiento had been read and ignored, justifying all the deaths and destruction that followed. The cruel absurdity of reading the requerimiento in a language alien to Indians was apparent to many Spanish priests if not to the conquistadores.
During the 1530s the leading conquistadores either died fighting one another over the spoils of conquest, as did Pizarro in Peru, or were forced into retirement by the crown, which was the fate of Cortés in 1535.
Hungry, overworked, and dislocated, the natives of Mexico were especially vulnerable to disease. The native population dwindled from a pre-conquest ten million natives to about one million by 1620.
By the 1570s the number of emigrant women had increased but remained less than a third of the total. As a result, the male emigrants usually took wives and concubines among the Indians, producing mixed offspring known as mestizos.
One investigative report on a viceroy of Peru ran to 49,555 pages.
Unfortunately, any colonial request for crown instructions required at least a year for an answer, given the slow pace of transatlantic shipping and the bureaucratic inertia in Spain. One despairing viceroy complained, “If death came from Madrid, we should all live to a very old age.”
Between 1500 and 1650 the Spanish shipped from America to Europe about 181 tons of gold and 16,000 tons of silver.
After the relative price stability of the fifteenth century, Europeans experienced a fivefold rise in prices during the sixteenth century. Laboring people especially suffered from the inflation, because the cost of living rose faster than their wages.
In 1523 much of the gold stolen by Cortés from the Aztecs and shipped homeward was restolen by French pirates in the Atlantic. During
Because conquistadores lived as parasites off the native produce of the invaded regions, they could not linger where the Indians did not practice horticulture. Fields of maize attracted conquistadores, and their absence deterred them.
He and his companions had trekked across much of North America, from the swamps of Florida to the coast of Texas and then through the deserts, mountains, and valleys of northern Mexico. Along the way, Cabeza de Vaca endured a searing double transformation, first from conquistador to slave, and then from slave to sacred healer.
The passage of nearly five centuries has rendered the sixteenth-century peoples even more culturally alien from us than they were to one another.
As their societies shrank and relocated, they became less complex, diminishing the power of the chiefs. In most places there were simply no longer enough people to raise the agricultural tribute necessary to sustain a costly and elaborate elite.
In the depopulated valleys, forests and wildlife gradually reclaimed the abandoned maize and bean fields, while the refugees farmed the less fertile but safer hills. The resurgent wildlife included bison, common in the southeast by 1700 but never sighted by Soto’s conquistadores 160 years before. Far from timeless, the southeastern forest of the eighteenth century was wrought by the destructive power of a sixteenth-century European expedition. Soto had created an illusion of a perpetual wilderness where once there had been a populous and complex civilization.
The new confederations exemplified the widespread process of colonial “ethnogenesis”—the emergence of new ethnic groups and identities from the consolidation of many peoples disrupted by the invasion of European peoples, animals, and microbes.
In fact, after 1700 most North American Indian “tribes” were relatively new composite groups formed by diverse refugees coping with the massive epidemics and collective violence introduced by colonization.
To make a vivid and intimidating example, Coronado ordered one hundred captured warriors burned to death at the stake.
In frustration and fury, the Spaniards tortured and strangled their Pueblo guide, who confessed the plot to lead the Spanish astray where they might get lost and die.
In an ironic reversal of the usual colonial process, the wrecks endowed a native people with gold, silver, and slaves, for the Calusa Indians scavenged the hulks for the shiny metals and enslaved the castaway sailors.
In 1673 the governor of Cuba confessed, “It is hard to get anyone to go to San Agustín because of the horror with which Florida is painted. Only hoodlums and the mischievous go there from Cuba.”
Conversion, however, came at a cultural cost. The priests systematically ferreted out and burned the wooden idols cherished by the natives, banned their traditional ball game, and enforced Christian morality, which required marriage, monogamy, and clothing that covered female breasts.
Conversion bought safety from Spanish muskets but not from Spanish microbes.
Caught in a double squeeze of high costs and small income, the New Mexicans had the lowest standard of living of any colonists in North America.
Never more than 1,000 during the seventeenth century, the colonists remained greatly outnumbered by the Indians, despite the epidemics that reduced Pueblo numbers from 60,000 in 1598 to 17,000 in 1680.
Because the Pueblo peoples already lived in permanent, compact horticultural villages, it was relatively easy to create a mission simply by adding a church, a priest or two, and a few soldiers.
Indeed, many Pueblo hoped that a military alliance with the Spanish would protect both from the nomadic warrior bands—Apache and Ute—of the nearby mountains and Great Plains.
The priests also stood out among the other Hispanics because they rarely raped Indian women and preferred their vow of poverty to the accumulation of gold.
In their theatricality, celibacy, endurance of pain, and readiness to face martyrdom the priests manifested an utter conviction of the truth and power of their God.
Consequently, the priests were in a state of probation as the Pueblo tried to determine whether they benefited or suffered from the Christian power over the spirit world. No matter how successful in getting a church built and hundreds baptized, every priest lived in the shadow of violent death. If the epidemics increased, natives who had seemed docile could conclude that their priests were dangerous sorcerers who must be killed. Of the approximately one hundred Franciscans who served in New Mexico during the seventeenth century, forty died as martyrs to their faith.
The missionaries also encouraged restive soldiers to imprison one governor for nine months and to assassinate a second.
Previously lacking any common language and identity, the Pueblo peoples obtained both—as Spanish became a common second language and as they developed shared grievances against a set of exploiters. Both developments improved their ability to unite against the colonizers.
Especially appealing to men outraged at the Franciscan attack on polygamy, Popé promised each warrior a new wife for every Hispanic he killed.
Popé encouraged the Pueblo to restore their native names and to reverse their baptisms by plunging into the Rio Grande in a ceremony of purification. He declared Christian marriages dissolved and polygamy restored. To replace the churches, the Indians restored their sacred kivas. Popé urged forsaking everything Hispanic, including the new crops and domesticated livestock, but most Pueblo found these too useful to relinquish. Selective in adapting Hispanic culture, the Indians were equally selective in rejecting it.
Entangled in alliances with Indians, European traders often felt compelled to assist native wars that complicated and slowed their pursuit of profit. From the Indian perspective, the French came, in the words of historian Allan Greer, “not as conquering invaders, but as a new tribe negotiating a place for itself in the diplomatic webs of Native North America.” In those webs, the Indians negotiated from a position of strength.
Because Indians voluntarily performed the hard work of hunting the animals and treating their furs, traders could immediately profit in America without the time, trouble, expense, and violence of conquering Indians to reorganize their labor in mines and plantations.
The natives also adapted alcohol to their own purposes. At first, they balked at the novel taste and disorienting effect, but eventually they developed a craving. Drinking as much and as rapidly as they could, the Indians got drunk as a short cut to the spiritual trances that had previously required prolonged fasting and exhaustion. Alcohol also offered a tempting release of aggressions, ordinarily repressed with great effort and much stress, because Indian communities demanded the consistent appearance of harmony. Regarding alcohol as an animate force, natives believed that drinkers were not responsible for their violent actions. Initially appealing and apparently liberating, alcohol became profoundly destructive once it became common and cheap. In drink, natives lashed out with knives and hatchets, killing their own people far more often than the colonial suppliers of their new drug. Fortunately, during the seventeenth century, the natives’ access to alcohol remained limited and sporadic, permitting only occasional binges.
Occasionally the more ruthless mariners interrupted trade to kidnap Indians as human commodities. Taken to Europe, they were put on profitable display as curiosities and trained to assist future voyages as interpreters. Eager for a voyage home, the captives shrewdly told their captors what they wanted to hear, promising to reveal gold and silver and friendly Indians eager for Christianity. Unfortunately, European diseases consigned most of the captives to European graves before European fantasies could take them home.
By the mid-seventeenth century, the trade goods were sufficiently common that the northeastern Algonquian peoples had forsaken their stone tools and weapons—and the craft skills needed to produce them. If cut off from trade, natives faced deprivation, hunger, and destruction by their enemies.
Although the fur trade pitted the Indians against one another in destructive competition, no people could opt out of the intertwined violence and commerce. As a matter of life and death, every native people tried to attract European traders and worked to keep them away from their Indian enemies.
Combining the talents of trader, soldier, cartographer, explorer, and diplomat, Champlain recognized that French success in Canada depended upon building an alliance with a network of native peoples.
The introduction of firearms revolutionized Indian warfare as the natives recognized the uselessness of wooden armor and the folly of massed formations. Throughout the northeast, the Indians shifted to hit-and-run raids and relied on trees for cover from gunfire.
Natives feared that their dead would linger about the village, inflicting disease and misfortune unless appeased with loud and expressive mourning. To draw the bereaved out of their agony and to encourage dead spirits to proceed to their afterlife, neighbors staged condolence rituals with feasts and presents. The best present of all was a war captive meant to replace the dead.
Captive men more often faced death by torture, especially if they had received some crippling wound. Inflicting death as slowly and painfully as possible, the Iroquois tied their victim to a stake, and villagers of both genders and all ages took turns wielding knives, torches, and red-hot pokers systematically to torment and burn him to death. The ceremony was a contest between the skills of the torturers and the stoic endurance of the victim, who manifested his own power, and that of his people, by insulting his captors and boasting of his accomplishments in war. After the victim died, the women butchered his remains, cast them into cooking kettles, and served the stew to the entire village, so that all could be bound together in absorbing the captive’s power. By practicing ceremonial torture and cannibalism, the Iroquois promoted group cohesion, hardened their adolescent boys for the cruelties of war, and dramatized their contempt for outsiders.
Although horrifying to European witnesses, the torments of northeastern torture had their counterparts in early modern Europe, where thousands of suspected heretics, witches, and rebels were publicly tortured to death: burned at the stake, slowly broken on a wheel, or pulled apart by horses. The seventeenth century was a merciless time for the defeated on either side of the Atlantic.
In these ceremonies, the chiefs presided as the kinfolk of a killer gave presents to the relatives of the victim. Delivery and acceptance restored peace and broke the cycle of revenge killings.
Seventeenth-century Europeans regarded non-Europeans as socially and culturally inferior—but not as racially incapable of equality. Lacking a biological concept of race, seventeenth-century Europeans did not yet believe that all people with a white skin were innately superior to all of another color. European elites primarily perceived peoples in terms of social rank rather than pigmentation.
Rather than compel Indians to learn French and relocate into new mission towns, the Jesuits mastered the native languages and went into their villages to build churches.
One priest returned to the Huron after having survived capture and torture by the Iroquois, losing most of his fingers. Because the Huron cherished stoicism under torture as the ultimate test of manhood, they honored this priest. One Huron remarked, “I can neither read nor write, but those fingers which I see cut off are the answer to all my doubts.”
As the Jesuits gathered a following, they demanded more cultural concessions from their Huron converts. The Jesuits denounced torture and ritual cannibalism, premarital sex, divorce, polygamy, and the traditional games, feasts, and dances. An
During the assaults, Jesuit priests hurriedly baptized all they could reach before they too were hacked or burned to death. By 1650 the Huron villages had all been destroyed
Moreover, during the mid-sixteenth century, the English were preoccupied with the conquest and colonization of Ireland.
Later in the century, success in Ireland emboldened English leaders to extend their colonial ambitions across the Atlantic to the region they called Virginia, named in honor of their queen, Elizabeth I, a supposed virgin. Between 1580 and 1620 the English applied the name to the entire mid-Atlantic coast between Florida and Acadia.
Unlike the authoritarian kings of France and Spain, Queen Elizabeth had to share national power with the aristocracy and gentry, who composed the bicameral national legislature known as Parliament.
Although a narrow system of government by our standards, the English constitution was extraordinarily open and libertarian when compared with the absolute monarchies then developing in the rest of Europe. Consequently, it mattered greatly to the later political culture of the United States that England, rather than Spain or France, eventually dominated colonization north of Florida.
Probably about half the rural peasantry lost their lands between 1530 and 1630.
Addressing propertied Englishmen, the colonial promoters announced that they had an easy solution for England’s social woes: exported to a new colony in Virginia, the idle and larcenous poor could be put to work raising commodities for transport to, and sale in, England.
Contrary to the Black Legend, the English treated the Irish no better than the Spanish treated the Guanche, and they offered no prospect of fairer play for the Indians of Virginia. Indeed, the conquest and colonization
At last, in August 1590, White returned to Roanoke with a relief expedition to find the settlement mysteriously abandoned with no signs of attack by either the Indians or the Spanish. The lone clue was carved into a tree—the word “Croatoan,” the name of a nearby island. But the fearful and impatient English mariners refused to venture through the dangerously shallow waters to Croatoan to investigate. Sailing away in pursuit of Spanish treasure ships, the mariners abandoned any surviving colonists to their still mysterious fate.
After retreating to Croatoan and failing to contact a passing ship, the surviving colonists probably headed north to Chesapeake Bay to execute their original plan. They apparently found haven in an Indian village. In 1607, when English colonists reached Chesapeake Bay, some Indians reported that white people had recently lived nearby as refugees in a native village. Unfortunately, the village had run afoul of a powerful chieftain, Powhatan, who killed all the refugees.
Neither house nor furnishings provided opportunities for the conspicuous consumption that helped determine status in England.
But the Algonquians recoiled in horror at the prospect of adopting a European way of life that would obligate their men to forsake war and, instead, adopt the female role of agricultural laborer.
One starving colonist killed and ate his wife, for which he was tried, convicted, and burned at the stake.
Between 1607 and 1622 the Virginia Company transported some 10,000 people to the colony, but only 20 percent were still alive there in 1622.
In England, birth and wealth had screened the gentlemen from manual labor, while the vagrants, for want of employment, had learned to survive by begging and stealing.
Indeed, the company adopted a “head-right system” that awarded land freely to men with the means to pay for their own passage (and that of others) across the Atlantic. Such emigrants received fifty acres apiece, and another fifty acres for every servant or relative brought at their own expense. Servants were also entitled to fifty acres each, if and when they survived their terms of indenture—which afforded them new incentive to emigrate. As private property owners, rather than company employees, the colonists showed much greater initiative and effort in cultivating the corn, squash, and beans that ensured their subsistence. But to prosper, they still needed a commercial crop to market in England.
the annual mortality rate remained about 25 percent until mid-century.
The Virginians developed the strategy, practiced in subsequent colonial wars, of waiting until just before corn harvest to attack and destroy the Indian villages and their crops, consigning the natives to a winter and spring of exposure and starvation.
During the seventeenth century, the English developed two types of colonial governments: royal and proprietary. Relatively few until the eighteenth century, the royal colonies belonged to the crown. Initially more numerous, the proprietary colonies belonged to private interests.
And, as the promoters had predicted, the Chesapeake absorbed thousands of poor laborers considered redundant and dangerous in England.
Their alliance became both easier and more essential at the turn of the century, when the great planters switched their labor force from white indentured servants to enslaved Africans. Class differences seemed less threatening as both the common and great planters became obsessed with preserving their newly shared sense of racial superiority over the African slaves.
In both Chesapeake colonies, the distant crown (for Virginia) or lord proprietor (for Maryland) had to share power with the wealthiest and most ambitious colonists. They refused to pay taxes unless authorized by their own elected representatives in a colonial assembly. Governors who defied the local elite faced obstruction and risked rebellion.
This decentralization of power stood in marked contrast to the Spanish and the French colonies, which permitted neither elected assemblies nor individual liberties.
Indeed, widows were few and their status brief in colonies where women were in such short supply and in such great demand for remarriage.
The husband also supervised and disciplined his dependents: wife, children, and servants. If a servant, child, or wife killed his or her master, the law considered the culprit guilty of “Petit Treason” as well as murder.
But the authorities also held the patriarchs responsible for the misconduct of their dependents. In 1663, a Virginia county court rebuked and punished both a maidservant, for public insolence, and her master, for failing to control her “scolding.”
The planters also needed regularly to clear new fields with axes, for after three years of cropping, the cultivated lands lost their fertility, and the planter had to clear another field to allow the old to lie fallow.
Given the short life expectancy of all Chesapeake laborers, planters wisely preferred to buy English indentured servants for four or five years rather than purchase the more expensive lifelong slaves from Africa. In 1650 enslaved Africans numbered only three hundred, a mere 2 percent of the Chesapeake population.
English servants composed at least three-quarters of the emigrants to the Chesapeake during the seventeenth century: about 90,000 of the 120,000 total.
Given that a sturdy beggar could never anticipate obtaining land in England, the colony offered an opportunity unavailable at home. Of course, that opportunity required men and women to gamble their lives in a dangerous land of hard work and deadly diseases.
Before 1640, most indentured servants endured harsh but short lives in the Chesapeake. Having staked their health in pursuit of farms, most lost their gamble, finding graves before their terms expired.
In part, health improved as many new plantations expanded upstream into locales with fresh running streams, away from the stagnant lowlands, which favored malaria, dysentery, and typhoid fever.
The “seasoned” acquired a higher level of immunity, which they passed on to their offspring.
The entry costs of tobacco planting were modest: a set of hand tools, a year’s provisions, a few head of cattle and pigs, some seed, and about fifty acres of land.
At any given time, a planter cultivated only about a tenth of his farm, leaving most of his domain heavily forested.
The common people ate with their fingers, sharing a bowl and drinking from a common tankard, both passed around the table. They usually ate a boiled porridge of corn, beans, peas, and pork, washed down with water or cider. Most colonists had plenty to eat, in contrast to their past in both England and the early years of the colony. By moving to the Chesapeake, the common colonist sacrificed comfort and life expectancy for an improved diet and the pride and autonomy of owning land.
During the 1660s, new imperial regulations worsened the tobacco glut by requiring colonists to ship their tobacco exclusively to England in English ships.
An assemblyman received 150 pounds of tobacco in pay per day in session—about five times in value what was paid to his counterpart and contemporary in colonial Massachusetts. Governor Berkeley annually collected a salary of £1,000. To put that in perspective, most emigrants mortgaged at least four years of their working lives to pay the £6 cost of a transatlantic passage, and a small planter was fortunate to clear £3 annually over and above expenses.
In pity for himself, Governor Berkeley complained, “How miserable that man is that Governes a People wher[e] six parts of seaven at least are Poore, Endebted, Discontented, and Armed.”
Determined to enjoy the perquisites and rewards of a hierarchical society, Bacon and his lieutenants intended no egalitarian revolution.
Although Bacon attacked a royal governor, he did not seek independence from England. In 1676 no Virginian imagined that independence was feasible or desirable.
In stark contrast to those of Berkeley’s day, Virginia’s eighteenth-century assemblymen cultivated popularity by conspicuously opposing taxes, infuriating a succession of royal governors with instructions to secure a revenue for imperial defense.
One royal governor denounced the assemblymen for striving “to recommend themselves to the populace upon a received opinion among them, that he is the best Patriot that most violently opposes all Overtures for raising money.”
By reducing taxes, the Virginia gentry reinvented themselves and Virginia politics, transferring the odium of parasitism and tyranny to the royal governor. This dramatically reversed the role that the crown had claimed in 1677 as the putative defender of the common planter.
Visitors found eighteenth-century Virginians both exceptionally hospitable and genial but shallow and materialistic.
The gentry cultivated popular favor by mastering the genteel public style known as “condescension”: a gentleman’s ability to treat common people affably without sacrificing his sense of superiority.
Held at the county courthouse, the election was public, with each voter individually stepping forward to voice his vote, for recording by a clerk. By such performances, common voters showed gratitude for past favors and solicited future goodwill from their favored gentleman. Upon receiving a vote, the candidate politely thanked the voter, displaying the condescending gratitude of a true gentleman worthy of high office.
At the end of the seventeenth century, slaves became a better investment, as servants became scarcer and more expensive: £25 to £30 for a lifelong slave compared well with £15 to purchase just four years of a servant’s time.
The slave numbers surged from a mere 300 in 1650 to 13,000 by 1700, when Africans constituted 13 percent of the Chesapeake population. During the early eighteenth century, their numbers and proportion continued to grow, reaching 150,000 people and 40 percent by 1750.
The planters shifted from servants to slaves for economic reasons, but that change incidentally improved their security against another rebellion by angry freedmen.
More commonly, masters permitted slaves to acquire and manage their own property, primarily a few chickens, hogs, cattle, and small garden plots of maize and tobacco. By accumulating and selling property, dozens of early slaves purchased their freedom and obtained the tools, clothing, and land to become common planters. Because the colonial laws did not yet forbid black progress, the black freedmen and women could move as they pleased, baptize their children, procure firearms, testify in court, buy and sell property, and even vote. Some black men married white women, which was especially remarkable given their scarcity and high demand as wives for white men. A few black women took white husbands.
The most successful and conspicuous black freedman, Anthony Johnson, acquired a 250-acre tobacco plantation and at least one slave. With apparent impunity, Johnson boldly spoke his own mind to his white neighbors, telling one meddler: “I know myne owne ground and I will worke when I please and play when I please.” When white neighbors lured away his slave, Johnson went to court, winning damages and the return of his property. That the authorities supported an African against whites and upheld his right to own slaves reveals that slavery and racism had not yet become inseparably intertwined in the Chesapeake. That a black man would own a slave also indicates that getting ahead in planter society was more important to Johnson than any sense of racial solidarity with his fellow Africans in Virginia.
In 1680, Virginia prescribed thirty lashes on the bare back of any black slave who threatened or struck any white person, which invited poor whites to bully slaves with impunity, creating a common sense of white mastery over all blacks.
Raping a slave was not a crime but marrying her was. In 1705 the law subjected any minister who conducted an interracial marriage to a fine of ten thousand pounds of tobacco. A white man who married a free black or a white woman who slept with any black man faced six months in prison and a £10 fine.
Dreading reenslavement, the descendants of Anthony Johnson fled from Virginia, where their grandfather had been a respected freeholder able to defeat whites in lawsuits.
Where most Chesapeake settlers were poor and short-lived indentured servants, New England attracted primarily “middling sorts” who preserved their freedom because they could pay their own way across the Atlantic.
Puritan values helped the colonists prosper in a demanding land. In the process, they developed a culture that was both the most entrepreneurial and the most vociferously pious in Anglo-America. Contrary to the declension model promoted by some historians, the increasing commercialism of New England life at the end of the seventeenth century derived from Puritan values rather than manifested their decay.
Begun as an epithet, “Puritan” persists in scholarship to name the broad movement of diverse people who shared a conviction that the Protestant Reformation remained incomplete in England.
A Puritan explained, “God sent you unto this world as unto a Workhouse, not a Playhouse.”
Puritans longed to purify the churches by ousting all conspicuous sinners and by inviting members to monitor one another for consistent morality and sound theology. This zeal, however, dismayed most English people, who preferred Anglicanism and the traditional culture characterized by church ales, Sunday diversions, ceremonial services, inclusive churches, and deference to the monarch.
The first Puritan emigrants consisted of 102 Separatists, subsequently called the Pilgrims. In 1620 they crossed the Atlantic in the ship Mayflower to found a town named Plymouth on the south shore of Massachusetts Bay. Beneficiaries of a devastating epidemic that had recently decimated the coastal Indians, the Plymouth colonists occupied an abandoned village with conveniently cleared fields.
Once in Massachusetts, the company leaders established the most radical government in the European world: a republic, where the Puritan men elected their governor, deputy governor, and legislature (known as the General Court). Until his death in 1649, John Winthrop almost always won annual reelection as governor.
Because the Puritans prepared for the next world by their moral life in this one, their rhetoric yoked together material aspiration and the pursuit of salvation. It is anachronistic for us to separate the two.
Purely economic motives, however, would have dispatched few people to cold, distant, and rocky New England. English people could more cheaply, easily, and certainly improve their material circumstances by moving to the nearby and booming Netherlands, which welcomed skilled immigrants.
The Puritans understood in spiritual terms many causes that we might define as “economic.” They interpreted the wandering beggars, increased crime, cloth trade depression, and famines as divine afflictions meant to punish a guilty land that wallowed in sin.
Battling the prevailing Atlantic winds and currents, the slow-moving vessels usually took eight to twelve weeks to cross. Few of the Puritans, who were mostly artisans and farmers, or their wives and children had traveled by ship. On board the standard vessel, about one hundred passengers shared the cold, damp, and cramped hold with their property, including some noisy and rank livestock.
First, most English Puritans persisted at home, waiting to see how God would treat both the mother country and the New England experiment. Second, the New England emigration represented only 30 percent of all the English who crossed the Atlantic to the various colonies during the 1630s. Many more people emigrated to the Chesapeake and the West Indies. Third, the Great Migration was brief, for emigration declined to a trickle after 1640, amounting to only seven thousand for the rest of the century.
At mid-century, the New England sex ratio was six males for every four females, compared with four males for every female in the Chesapeake. Greater balance encouraged a more stable society and a faster population growth.
In 1700 less than 2 percent of New England’s inhabitants were slaves, compared with 13 percent for Virginia and 78 percent for the English West Indies. Compared with the rest of the empire, New England possessed an unusually homogeneous colonial population and culture: free, white, and transplanted English.
Relative to the Chesapeake, the New England environment demanded more labor and provided smaller rewards, but it also permitted longer and healthier lives. In contrast to the Chesapeake tidewater with its long, hot, and humid summers and low topography, New England was a northern and hilly land with a short growing season and faster-flowing rivers and streams, which discouraged the malaria and dysentery that afflicted southern planters. In New England, people who survived childhood could expect to live to about seventy; in the Chesapeake, only a minority survived beyond forty-five.
Because New England had the most decentralized and popularly responsive form of government in the English empire, royalists despised the region as a hotbed of “republicanism.”
Puritan parents rarely dictated marriage partners to their children, but they could veto choices that seemed unwise.
New England women could also more easily obtain divorce when abandoned or sexually betrayed by their husbands. Historian Cornelia Dayton concludes that the effort “to create the most God-fearing society” tended “to reduce the near-absolute power that English men by law wielded over their wives.”
In effect, seventeenth-century New England and the English West Indies developed in tandem as mutually sustaining parts of a common economic system. Each was incomplete without the other. New English freedom depended on West Indian slavery.
By 1700, Boston alone had fifteen shipyards, which produced more ships than the rest of the English colonies combined. Indeed, Boston ranked second only to London as a shipbuilding center in the empire.
Seizing upon New England’s reputation in the mother country as a den of Puritan heretics and hypocrites, English economic interests called for an end to New England’s virtual autonomy within the empire.
As God’s favored people, they considered themselves the heirs to the ancient Israelites of the Old Testament. If they honored his wishes, God would bestow health and abundance upon them in this world. But should they deviate from his will in any way, God would punish them as rebels—more severely than he chastised common pagans, like the Indians.
In 1650, Massachusetts had one minister for every 415 persons, compared with one per 3,239 persons in Virginia.
The average New English churchgoer heard about seven thousand sermons in the course of his or her lifetime. To train an orthodox Puritan ministry for so many churches, Massachusetts founded Harvard College in 1636—the first such institution in English America.
The remaining sticklers for the old purity bolted to join the Baptists, a Separatist denomination that rejected infant baptism in favor of adult baptism as an initiation to full membership.
The most sensational cases involved male sex with animals. In 1642 the New Haven authorities suspected George Spencer of bestiality when a sow bore a piglet that carried his resemblance. He confessed and they hanged both Spencer and the unfortunate sow. New Haven also tried, convicted, and executed the unfortunately named Thomas Hogg for the same crime.
No Catholics, Anglicans, Baptists, or Quakers need come to New England (except to exceptional Rhode Island). All dissenters were given, in the words of one Massachusetts Puritan, “free Liberty to keep away from us.”
By drawing dissidents out of Massachusetts and Connecticut, the Rhode Island settlements helped to maintain orthodoxy in the two major Puritan colonies. Although the orthodox leaders of Massachusetts and Connecticut despised Rhode Island, they benefited from it as a safety valve for discontents who would otherwise fester in their midst.
The authorities pardoned witches who confessed and testified against others, but persistent denial consigned the convicted witch to public execution by hanging. Contrary to popular myth and previous European practice, the New English did not burn witches at the stake.
Witchcraft was also plausible because some colonists did dabble in the occult to tell fortunes and to cure, or inflict, ills (but there is little reason to believe that such “cunning folk” worshiped Satan). Moreover, occult beliefs are self-fulfilling. Anthropologists have repeatedly found that people
Communities and authorities disproportionately detected witchcraft in women who seemed angry and abrasive, violating the cultural norm celebrating female modesty. Women constituted both the majority of the accusers and 80 percent of the accused.
Because it was no easy matter to prove witchcraft, juries usually found innocence. The New English prosecuted ninety-three witches but executed only sixteen—until 1692, when a peculiar mania at Salem dramatically inflated the numbers.
The jeremiad exhorted listeners to reclaim the lofty standards and pure morality ascribed to the founders of New England. Paradoxically, the popularity of the genre attested to the persistence, rather than the decline, of Puritan ideals in New England. Determined to live better, the laity longed for the cathartic castigation of the jeremiad. And the ministry complied with eloquence and zeal. But English Puritans often took the jeremiads at face value, confirming their unduly low estimation of New England.
In 1679 the Boston synod of ministers denounced frontier settlers who succumbed to “an insatiable desire after Land and worldly Accommodations, yea, so as to forsake Churches and Ordinances, and to live like Heathen, only that so they might have Elbow-room enough in the world.”
The squashes and pumpkins spread out along the ground, discouraging the appearance of weeds between the maize plants and preserving moisture by shielding the earth from the sun. The interwoven roots strengthened the plants against the winds, and the cornstalks provided convenient poles for the climbing bean vines. In return, beans drew nitrogen from the air for fixing in the soil, partially compensating for the maize, which was nitrogen-depleting. The combination of plants also provided a balanced diet, because the beans offered protein and an amino acid, lysine, that when eaten with corn releases the corn’s protein.
To facilitate their hunting and gathering, the Indians also set fire to the forest beyond their fields. The aboriginal fires were less intense and destructive than the American forest fires of the present day. Because our own society suppresses fire, contemporary forests accumulate, over the years, large quantities of deadwood and dry brush. When a fire does ignite and escapes control, it is explosive, spreading rapidly and destructively up into the forest canopy to consume mature trees. The seventeenth-century Indians managed more modest fires. Because their fires were kindled twice a year, in both spring and fall, they found only the limited amount of deadwood and brush that had accumulated in the interim. Such fires spared the tall and thick mature trees with a dense bark, shaping a relatively open forest of many large trees and few small ones. Noting the effect, if not always the cause, colonists marveled at their ability to ride freely between immense trees through long stretches of the forest.
With fire the Indians shaped and sustained a forest that suited their needs. Regular burning favored large hardwoods, many of which yielded edible nuts. The relatively open forest also made it easier for hunters to see and pursue game. The regular burning diminished mice, fleas, and parasites that troubled people or the game that they ate. The fires also fertilized the forest floor and opened patches of sunlight. Both effects promoted ground-hugging plants, especially grasses and berries, which sustained a larger deer herd, to the ultimate benefit of their human hunters.
Because hospitality and generosity were fundamental duties, violators reaped shame and ridicule. No one went hungry in an Indian village unless all starved. With so little to steal and so little need to, theft was virtually unknown and no one locked a wigwam.
The colonists extorted wampum from the southern New England Indians and then shipped it to Maine to procure furs for shipment to England. In great and growing English demand, the furs helped finance the New English debts.
In effect, the Puritan colonies ran a protection racket that compelled native bands to purchase peace with wampum.
Lacking a collective identity as “Indians,” the natives continued to think of themselves as members of particular bands and tribes—which rendered them all vulnerable to colonial manipulation and domination.
In 1670 the 52,000 New England colonists outnumbered the Indians of southern New England by nearly three to one.
Above all, the missionaries exhorted the Indians to adopt the Puritan pace and mode of work, which meant long days of agricultural labor. Insisting upon the gendered division of labor favored by the English, the missionaries urged the Indian men to forsake hunting and fishing in favor of farming. The Indian women were supposed to withdraw from the cornfields to tend the home and to spin and weave cloth, just as New English women did.
The New English called the bloodiest Indian war in their history King Philip’s War, after the Wampanoag sachem named Metacom but known to the New English as King Philip.
The Indians’ mastery of the flintlock deprived the colonists of the technological edge they had enjoyed in the Pequot War.
During 1675 the colonists could rarely find and attack their more mobile and elusive foes. As a result, many settlers succumbed to the temptation to attack, plunder, and kill those Indians they could easily locate: the praying town Indians.
Because about a third of the natives in southern New England assisted the colonists, King Philip’s War became a civil war among the Indians.
In the late seventeenth century, tourists did not visit Plymouth to see the now celebrated rock (which was then unidentified). Instead, they gaped at Metacom’s skull. One visitor, the famous minister Cotton Mather, angrily wrenched off and took away Metacom’s jawbone, completing his silencing.
Tried and convicted, Tift suffered a traitor’s painful death, pulled apart by horses.
Tobacco was valuable to the empire—indeed, more precious than all other mainland produce combined—but sugar was king. Sugar could bear the costs of long-distance transportation (and the purchase of slaves by the thousand) because it was in great and growing European demand to sweeten food and drink.
Lacking cities and gold but possessing a fearsome reputation, the Caribs were the sort of Indians that the Spanish had learned to avoid.
The West Indies received over two-thirds of the English emigrants to the Americas between 1640 and 1660.
More English emigrants settled in the West Indies (44,000) than in the Chesapeake (12,000) and New England colonies (23,000) combined.
As positive incentives decayed after 1635, masters resorted more frequently and more brutally to punishment. They contemptuously referred to their servants as “white slaves” and applied the whip to drive and punish them—language and measures unthinkable in England.
By preindustrial standards, the sugar planter ran a large and complex operation that combined agriculture and manufacturing. He needed at least one mill to crush juice out of the cane, a boiling house to clarify and evaporate the juice into brown sugar crystals, a curing house to drain out the molasses and dry the sugar, a distillery to convert the molasses into rum, and a warehouse to store the barreled sugar until he could ship it to Europe.
Because cut cane spoiled unless processed within a few hours, the harvesting, milling, and boiling required close synchronization and quick work. Field gangs cut the ripe canes by hand with curved knives and carted the stalks to the mill, for prompt grinding between rollers turned by wind or cattle. Crushed from the cane, the juice had to be boiled within a few hours, before it could ferment. Boiling in a succession of copper kettles hung over a furnace evaporated the water, leaving a golden-brown sugar known as muscovado, which the planters packed into immense thousand-pound hogshead barrels and shipped to Europe, for further refinement there into white sugar for sale to consumers. Making muscovado also generated a cheap by-product, molasses, which could be rendered more valuable by distilling it into rum. Inexpensive to make, rum became the principal alcohol sold and consumed in the English empire.
By 1660, Barbados made most of the sugar consumed in England and generated more trade and capital than all other English colonies combined.
Despite its small scale, by 1660 Barbados had 53,000 inhabitants—a density of 250 persons per square mile, which rose to 400 by the end of the century. In 1700 the human concentration on Barbados was four times greater than in England.
Because white men could more easily escape to pass as free on another island or aboard a pirate ship, planters increasingly saw an advantage in employing only permanent slaves of a distinctive color immediately and constantly identified with slavery.
By 1660, Barbados had become the first English colony with a black and enslaved majority: 27,000 compared with 26,000 whites.
The growing slave population depended on increased slave imports, for the Barbadian slaves died faster than they could reproduce. Although the planters brought 130,000 Africans into Barbados between 1640 and 1700, only 50,000 remained alive there at the dawn of the new century.
Invariably, some reckless, frightened, or greedy slave alerted a master to the impending danger. Such reports kept the planters on edge and produced brutal retribution upon the suspected. In the first major alarm, in 1675, the planters executed thirty-five suspects; at least six of them were burned alive at the stake. The slave woman who revealed the conspiracy received her freedom from the colonial government, which compensated her master.
This English refusal to convert slaves diverged sharply from the practice of French, Spanish, and Portuguese masters, who felt religiously and legally bound to promote the Catholic initiation of every soul, while they exploited the body. Only the Quaker minority challenged the ban at Barbados on converting the slaves. For this, they were considered dangerous radicals, and the government fined them about £7,000, executed one, and ordered their meetinghouse nailed shut.
Once a land of apparent promise for common tobacco planters, Barbados had become the domain of sugar grandees and their African slaves.
During the 1640s, they had increased their exposure to deadly diseases by importing slaves bearing new pathogens from Africa: principally yellow fever and malaria, which became the greatest killers of Barbadians, free and slave. The Africans also introduced and shared hookworm, yaws, guinea worm, leprosy, and elephantiasis.
Fewer than fifteen hundred Spanish colonists and their slaves occupied part of the south coast in 1655, when their weakness attracted an English invasion and occupation.
When buccaneers blew into town after a successful raid, Port Royal earned its reputation as the wickedest place in the English-speaking world: the Sodom of the West Indies. But paradoxical Port Royal also astonished visitors by hosting four churches (Anglican, Presbyterian, Quaker, and Catholic) and a synagogue for the more pious colonists. With 2,900 inhabitants in 1680, Port Royal was the third-largest town in English America, behind only Bridgetown on Barbados and Boston in New England.
During the 1690s the crown dispatched a new governor with instructions to oust the buccaneers from Jamaica, which proved easier to accomplish in the wake of Sir Henry Morgan’s death in 1688. Suffering from cirrhosis of the liver, the heavy-drinking Sir Henry sought relief from an African folk doctor. But his treatments—injections of urine and an all-body plaster of moist clay—only hastened Morgan’s death.
In 1660, Jamaica had seemed big enough for both small and great planters, but by the end of the century it became the English colony most dominated by great planters and their slaves.
At the end of the seventeenth century, white emigrants from the West Indies, particularly Barbados, carried the seeds of that society to the southern mainland by founding the new colony of Carolina.
During the 1670s, West Indian planters established a new colony on the Atlantic seaboard north of Florida but south of the Chesapeake. Called Carolina to honor King Charles II, the new colony included present-day North and South Carolina and Georgia.
In their treaties with native peoples, the colonists insisted upon the return of all fugitive slaves as the price of peace and trade. As a further incentive, Carolina paid bounties to Indians who captured and returned runaways, at the rate of a gun and three blankets for each.
To secure Carolina from Spanish attack and accelerate its economic development, the Lords Proprietor needed to attract more colonists quickly. The Lords offered the incentives most alluring to English settlers of the late seventeenth century: religious toleration, political representation in an assembly with power over public taxation and expenditures, a long exemption from quitrents, and large grants of land. The Lords Proprietor assured religious tolerance to everyone but atheists (who hardly existed anywhere in the seventeenth century), promising even Jews the liberty to practice their faith. To discourage violent religious disputes, the Lords forbade “any reproachful, reviling, or abusive Language” against the faith of another.
The average Carolina freedman accumulated more than 350 acres of land before death.
The northern part of the colony was a detached cluster of settlements on Albemarle Sound, near Virginia. Founded by Virginians during the 1650s, these settlements resented their inclusion in Carolina and resisted, sometimes violently, the collection of quitrents and customs duties by proprietary officials. In 1691 the Lords Proprietor mollified the Albemarle Sound colonists by establishing “North Carolina” as a distinct government with its own assembly and deputy governor.
The division left Charles Town the capital of “South Carolina,” which the Goose Creek Men dominated. Arrogant and Anglican, the Goose Creek Men stifled the policy of religious toleration. In 1702, the assembly barred non-Anglicans from holding political office and established the Church of England as the colony’s official, tax-supported church. The Lords Proprietor accepted the restrictive new legislation, abandoning their principal supporters in the colony, the religious dissenters.
As in the Chesapeake, the common and the great planters of Carolina established a white racial solidarity that, in politics, trumped their considerable differences in wealth and power.
Carolina’s early leaders concluded that the key to managing the local Indians was to recruit them as slave catchers by offering guns and ammunition as incentive.
The Carolina trader benefited from the native custom of providing wives to welcome newcomers.
Consequently, the Carolinians exported most of the Indian captives to the West Indies, especially Barbados, trading them for Africans, who were then brought back to work the Carolina plantations. The exchange rate of two Carolina Indian slaves for one African reflected the shorter life expectancy of the enslaved native.
Florida’s Indian population collapsed from about 16,000 in 1685 to 3,700 in 1715, and the missions shrank to a few in the immediate vicinity and partial security of San Agustín.
Carolina’s cattle raisers developed the practices later perfected on a grand scale in the American West, including cattle branding, annual roundups, cow pens, and cattle drives from the interior to the market in Charles Town.
In Carolina the black herdsmen became known as “cowboys”—apparently the origin of that famous term.
The colony rewarded with freedom any black who killed an enemy in time of war.
Enjoying a protected market within the empire for both rice and indigo, Carolina planters became the wealthiest colonial elite on the Atlantic seaboard—and second only to the West Indians within the empire.
The Carolina planters were even more gracious, polite, genteel, and lavish than the gentlemen of Virginia. Competing for status, the Carolina planters vied to serve the best wines, to display fine silverware and furniture, to appear in silk clothing, and to muster servants dressed in livery.
An elite Carolinian conceded, “We eat, we drink, we play, and shall continue to until everlasting flames surprise us.”
Harsh working conditions and the disease-ridden lowland environment produced a slave mortality in excess of the birthrate.
White men were required to bear firearms to church, to deter the blacks from rebelling on a Sunday.
The authorities employed torture to obtain confessions, which led to executions, sometimes by hanging but usually by burning at the stake.
Georgia’s founders included merchants, landed gentry, and Anglican ministers. They hoped to alleviate English urban poverty by shipping “miserable wretches” and “drones” to a new southern colony, where hard work on their own farms would cure indolence. By this moral alchemy, people who drained English charity would become productive subjects working both to improve themselves and to defend the empire on a colonial frontier.
Farther upriver he located the town of Ebenezer, as a haven for German Lutherans recently evicted from a Catholic principality.
Moreover, black slavery made manual labor seem degrading to free men, which discouraged exertion by common whites, who aspired, instead, to acquire their own slaves to do the dirty work.
Wanting settlers willing to labor and capable of bearing arms, the Georgia Trustees favored many compact farms worked by free families, instead of larger but fewer plantations dependent upon enslaved Africans. To mandate their vision, the founders restricted most new settlers to fifty-acre tracts—about an eighth of the size of a Carolina plantation—and the trustees forbade the importation or possession of slaves.
The trustees rejected the slave system so fundamental and profitable to the rest of the empire. Driven by concerns for military security and white moral uplift, the antislavery policy expressed neither a principled empathy for enslaved Africans nor an ambition to emancipate slaves elsewhere.
To discourage litigation and agitation, the founders also banned lawyers from practicing in the new colony.
During the late 1730s and early 1740s, the trustees lifted the bans on lawyers, liquor, and large landholdings—but held firm against slavery and an assembly.
The Georgia dissidents rallied behind the revealing slogan “Liberty and Property without restrictions”—which explicitly linked the liberty of white men to their right to hold blacks as property. Until they could own slaves, the white Georgians considered themselves unfree. Such reasoning made sense in an eighteenth-century empire where liberty was a privileged status that almost always depended upon the power to subordinate someone else.
In 1751 the trustees capitulated, permitting slavery and surrendering Georgia to the crown.
From about 3,000 whites and 600 blacks in 1752, Georgia’s population surged to 18,000 whites and 15,000 blacks in 1775.
More fertile and temperate than New England, but far healthier than the Chesapeake, the mid-Atlantic region was especially promising for cultivating grain, raising livestock, and reproducing people.
The acquisition of New Netherland (which had swallowed up New Sweden) would also close the gap between the Chesapeake and New England, promoting their mutual defense against other empires and the Indians.
By virtue of their especially indulgent charters, the New England colonies were virtually independent of crown authority. Answering to no external proprietors, the New English developed republican regimes where the propertied men elected their governors and councils, as well as their assemblies, and where much decision-making was dispersed to the many small towns.
The colonial arrangement seemed designed for many separate surrenders rather than for collective defense.
During the early seventeenth century, the Netherlands emerged as an economic and military giant, out of all proportion to its confined geography and small population of 1.5 million (compared with 5 million English and 20 million French).
While the other European states were developing authoritarian and centralized monarchies, the Dutch opted for a decentralized republic dominated by wealthy merchants and rural aristocrats.
European intellectuals also gravitated to Amsterdam because the Dutch allowed greater latitude to new ideas. The great seventeenth-century philosophers René Descartes, John Locke, and Benedict de Spinoza all emigrated to escape intolerance in their own countries.
After 1640 most of the slaves sent to the Americas went in Dutch rather than Portuguese vessels, enriching the merchants of Amsterdam rather than those of Lisbon.
A Dutch flotilla intercepted and captured the entire Spanish treasure fleet homeward bound from the Caribbean in 1628. The loss of the ships and 200,000 pounds of silver virtually bankrupted the Spanish crown and enormously enriched the Dutch investors in the attacking fleet.
Beginning with Henry Hudson in 1609, Dutch merchants annually sent ships across the Atlantic and up the Hudson River to trade for furs with the Indians. Seventeenth-century ships could ascend the river 160 miles, as far as the future Albany, a greater distance than was possible on any other river on the Atlantic seaboard.
In 1625, the Dutch founded the fortified town of New Amsterdam on Manhattan Island at the mouth of the river. Possessing the finest harbor on the Atlantic seaboard, New Amsterdam served as the colony’s largest town, major seaport, and government headquarters.
Colonists’ roving pigs and cattle invaded cornfields, provoking the natives to kill and eat the livestock—which, of course, outraged the settlers.
Some were Swedes, but most came from Finland, then under Swedish rule. Skilled at pioneer farming in heavily forested Sweden and Finland, the colonists adapted quickly to the New World and introduced many frontier techniques that eventually became classically “American,” including the construction of log cabins.
A zealous Calvinist, Governor Stuyvesant joined the Dutch Reformed clergy in urging a new policy meant to keep Jews as well as other Protestants out of New Netherland. But the Dutch West India Company consistently defended tolerance as best for business, reminding Stuyvesant of “the large amount of capital which [Jews] still have invested in the shares of this company.” The Jews remained, enjoying more freedom in New Netherland than in any other colony.
As in New England, the emigrants were primarily family groups of modest means and farmer or artisan status, rather than the indentured, unmarried, and young men who prevailed in the early Chesapeake and West Indies.
In New Netherland, women also enjoyed greater legal rights and economic opportunities than did their sisters in the English colonies. In contrast to English women, Dutch wives kept their maiden names, which reflected their more autonomous identity by law. Unlike the “coverture” of English common law, the Dutch legal code (derived from Roman law) did not deprive married women of their legal identity and their rights to own property. If a wife survived her husband, she received half of the property, while the other half went to their heirs—significantly better than the one-third allowed widows by English law.
Between 1661 and 1664, 383 women conducted or faced lawsuits in the courts at New Amsterdam.
But if religious conflict and economic misery sufficed to push colonial emigration, the French would have triumphed over both the English and the Dutch. The further difference was that, unlike France, England permitted its discontented freer access to its overseas colonies and greater incentives for settling there.
Begun in 1651 and strengthened in 1660 and 1663, the Navigation Acts had three fundamental principles. First, only English ships could trade with any English colony. The acts defined as English any ship built within the empire, owned and captained by an English subject, and sailed by a crew at least three-quarters English.
Confronting and overcoming more resistance there, the English plundered indiscriminately and sold the captured Dutch garrison into servitude in Virginia.
The king agreed in 1680 to grant the younger Penn 45,000 square miles west of the Delaware River as the colony of Pennsylvania (“Penn’s Woods”).
But as a young man and against his father’s wishes, Penn had converted to Quakerism, then an especially mystical, radical, and persecuted form of Protestantism.
Renouncing formal prayers, sermons, and ceremony of any sort, Quakers met together as spiritual equals and sat silently until the divine spirit inspired someone, anyone, to speak. Although they rejected a specially educated and salaried ministry, certain especially devout and articulate laymen (and women) served as “Public-Friends,” itinerant preachers supported by voluntary contributions.
In contrast to the Puritan emphasis on sacred scripture, Quakers primarily relied on mystical experience to find and know God. The Quakers sought an “Inner Light” to understand the Bible, which they read allegorically rather than literally. More than a distant divinity or an ancient person, their Jesus Christ was fundamentally here and now and eternal: the Holy Spirit potentially dwelling within every person. Anyone truly awakened by that Spirit could thereafter live in sanctity.
Penn was both a devout Quaker and an ingrained elitist, both highly principled and habitually condescending. A tireless crusader for religious toleration, Penn traveled widely as a preacher, in Germany and Holland as well as Great Britain.
Penn’s financial interest also argued for hastening development by welcoming every productive emigrant. In founding a colony, Penn meant to enhance rather than to sacrifice his fortune. In promising a “Free Colony,” he did not offer free land, for he meant to profit by selling real estate and by collecting annual quitrents. He explained, “Though I desire to extend religious freedom, yet I want some recompense for my trouble.”
Pennsylvania commenced after the local natives had plunged in numbers and power from multiple epidemics, prolonged exposure to the alcohol of Dutch and Swede traders, and destructive raids by both the Iroquois Five Nations and the Chesapeake colonists.
During the late seventeenth and early eighteenth centuries, many native peoples fled from mistreatment in other colonies to settle in Pennsylvania. Penn’s government welcomed Shawnees from South Carolina, the Nanticoke and Conoy of Maryland, the Tutelo from Virginia, and some Mahicans from New York. One refugee explained to the Quakers, “The People of Maryland do not treat the Indians as you & others do, for they make slaves of them & sell their Children for Money.”
Penn consented to their division in 1704 into the distinct colonies of Pennsylvania and Delaware, with separate legislatures but a common governor appointed by their proprietor.
Living beyond his means and donating generously to support Quaker meetings and Public Friends, Penn accumulated the debts that would consign him to an English debtors’ prison in 1707.
Neither any single ethnic group nor any particular religious denomination enjoyed a majority in any middle colony.
In the mid-eighteenth century, a German immigrant reported, “They have a saying here: Pennsylvania is heaven for farmers, paradise for artisans, and hell for officials and preachers.”
James II regarded the American colonies as cash cows meant to fund a more authoritarian crown. Endowed with a larger colonial revenue, the crown could dispense with Parliament, which was constitutionally necessary to levy taxes within England.
Although routine in southern colonies (but poorly collected), quitrents were novel and provocative in New England. Because English folk regarded secure real estate as fundamental to their liberty, status, and prosperity, the colonists felt horrified by the sweeping and expensive challenge to their land titles.
In a bold and desperate gamble, William invaded England as a preemptive strike to capture that realm for a Dutch alliance. Aided by collusion in the disaffected English army and navy, William crossed the Channel and landed without resistance in November.
Whigs, called the transfer of power a “Glorious Revolution,” which they creatively depicted as a spontaneous uprising by a united English people. In fact, the revolution was fundamentally a coup spearheaded by a foreign army and navy.
In all of its reforms, the crown favored the local oligarchies of great planters and merchants, rather than any colonial longing for democracy (which was not evident).
By 1694 the English sustained an army of 48,000 subjects plus 21,000 German mercenaries.
Formerly the bulwark against unpopular taxes and crown power, Parliament became the great collection agency for the new monarch, a Protestant succession, and a transatlantic empire. Formerly the lightest-taxed people in Europe, the English joined the French and the Dutch as the most heavily taxed.
In stark contrast to France, England built a fiscal-military state without submitting to the despotism of an absolutist monarchy.
Despite their numerical superiority, the English colonists suffered repeated defeats as New France mustered small but effective combinations of royal troops, Canadian militia, and Indians to raid and destroy frontier settlements in New York and New England. In response, the English tried to invade Canada both by land from Albany via Lake Champlain and by sea via the St. Lawrence River, but both invasions were expensive and humiliating failures.
Neutrality did not bring a universal peace to Iroquoia. On the contrary, peace to the north and west obliged the Iroquois to find enemies elsewhere, for they remained committed to mourning wars to sustain their numbers, their spiritual power, and their warrior ethos. A colonist noted that “if you go to persuade them to live peaceably” the Iroquois “will answer you, that they cannot live without war.”
And after 1707, the Scots outnumbered the English as emigrants to the colonies.
Pirates took a special pride in their ability to eat, drink, dance, gamble, and whore with abandon, in a style that they called “living well.” Although unstable and dangerous, piracy proved intoxicating and addictive.
In a colonial world divided between masters and servants, the pirates defined freedom as their own opportunity to prey upon others.
By 1716 colonial authorities estimated that at least two thousand Anglo-American pirates were operating in the West Indies and along the Atlantic seaboard. They found havens on the unsettled islands of the Bahamas and in the secluded inlets of the Carolinas.
In 1688 the crown captured about 3 percent of the national income as taxes; by 1715 that had tripled to 9 percent of an enlarged economy.
Viewing the French as an “other,” the British characterized them as economically backward, religiously superstitious, culturally decadent, aggressively militarist, and broken to despotic rule. By inverse definition, the British saw themselves as especially enlightened by commerce, individual liberties, the rule of law, and a Protestant faith.
Despite the proliferation of British shipping, the overall number of emigrants declined in the early eighteenth century from its seventeenth-century peak.
The new recruitment invented America as an asylum from religious persecution and political oppression in Europe—with the important proviso that the immigrants had to be Protestants.
Formerly the great colonial entrepôt, Boston slipped to third, behind Philadelphia and New York, by 1760.
The colonies sustained a modest increase in productivity per capita of at least 0.3 and perhaps 0.5 percent annually. Although not much by the standards of our time, this growth rate was impressive for a preindustrial economy. Indeed, the colonies grew more rapidly than any other economy in the eighteenth century, including the mother country. In 1700 the colonial gross domestic product was only 4 percent of England’s; by 1770 it had blossomed to 40 percent, as the colonies assumed a much larger place within the imperial economy.
Indeed, the wealth of colonial regions varied directly and positively with the number of slaves. The West Indian planters lived in the greatest luxury because they conducted the harshest labor system with the greatest number of slaves. Next, in both wealth and slavery, came South Carolina, followed by the Chesapeake and the middle colonies. At the other extreme of the imperial spectrum, New England had the lowest standard of living and the fewest slaves. But even without many slaves, a common farmer or artisan lived better in New England than in the mother country. Slavery explained some, but not all, of the colonial prosperity. Access to abundant farmland accounted for the difference.
The muster rolls for colonial military regiments recorded heights, revealing that the average colonial man stood two or three inches taller than his English counterpart. Stature depends upon nutrition, and especially protein, so the superior height of free colonists attested to their better diet, especially rich in meat and milk. On average, the tallest colonists were southern planters—those who profited most from African slavery and Indian land.
Because appearances mattered so much in regulating status and credit, colonists wished to see themselves, and to be seen by others, as something more than rude rustics.
The genteel performed constantly for one another, ever watching and ever watched for the proper manners, conversation, dress, furnishings, and home. Every action, every statement, every object was on display and subject to applause or censure.
Of course, the common folk could never fully match the consumption and taste of the colonial elite of great planters, merchants, and lawyers. Indeed, the common emulation constantly drove the gentility to reiterate their superior status by cultivating more expensive tastes in the most current fashions.
In addition to goods, the swelling volume of British shipping carried emigrants across the Atlantic. Relatively few, however, were English: only 80,000 between 1700 and 1775, compared with 350,000 during the seventeenth century. The decline is especially striking because after 1700 the colonies became cheaper and easier to reach by sea and safer to live in. But push prevailed over pull factors in colonial emigration.
In England, crime surged with every peace as thousands of unemployed and desperate people stole to live. The inefficient but grim justice of eighteenth-century England imposed the death penalty for 160 crimes, including grand larceny, which was loosely defined as stealing anything worth more than a mere shilling.
Between 1718 and 1775, the empire transported about fifty thousand felons, more than half of all English emigrants to America during that period. The transported were overwhelmingly young, unmarried men with little or no economic skill: the cannon fodder of war and the jail fodder of peace. About 80 percent of the convicts went to Virginia and Maryland, riding in the English ships of the tobacco trade. Convicts provided a profitable sideline for the tobacco shippers, who had plenty of empty cargo space on the outbound voyage from England. At about a third of the £35 price of an African male slave, the convict appealed to some planters as a better investment.
Once in the colonies, the Ulster Scots gravitated to the frontier, where land was cheaper, enabling large groups to settle together. Their clannishness helped the emigrants cope with their new setting, but it also generated frictions with the English colonists. Feeling superior to the Catholic Irish, the Ulster Scots bitterly resented that so many colonists lumped all the Irish together. In 1720 some Ulster Scots in New Hampshire bristled that they were “termed Irish people, when we so frequently ventured our all, for the British crown and liberties against the Irish Papists.” As a compromise, they became known in America as the Scotch-Irish.
Outnumbering the English emigrants, the 100,000 Germans were second only to the Scots as eighteenth-century immigrants to British America.
The average Pennsylvania farm of 125 acres was six times larger than a typical peasant holding in southwestern Germany, and the colonial soil was more fertile, yielding three times as much wheat per acre. Lacking princes and aristocrats or an established church, Pennsylvania demanded almost no taxes, and none to support someone else’s religion. And Pennsylvania did not conscript its inhabitants for war.
The German emigrant trade developed a relatively attractive form of indentured servitude adapted to the needs of families. Known as “redemptioners,” the Germans contracted to serve for about four to five years. Unlike other indentured servants, the redemptioner families had to be kept together by their employers and not divided for sale. Most contracts also gave the emigrant family a grace period of two weeks, upon arrival in Pennsylvania, to find a relative or acquaintance who would purchase their labor contract. Often arranged by prior correspondence, these deals afforded the emigrants some confidence in their destination and employer. If the two-week period passed, the redemptioners became subject to sale by the ship captain.
But he exaggerated a tad, for the overall death rate for the voyage was only about 3 percent, a bit better than the 4 percent rate for convicts and far better than the 10 to 20 percent suffered by enslaved Africans. Germans probably risked more by staying at home in the path of the next European war.
Highly literate, the Pennsylvania Germans also sustained a vibrant press that produced German-language almanacs, books, and a newspaper.
Swiss emigrant Esther Werndtlin denounced her new home, Pennsylvania: “Here are religions and nationalities without number; this land is an asylum for banished sects, a sanctuary for all evil-doers from Europe, a confused Babel, a receptacle for all unclean spirits, an abode of the devil, a first world, a Sodom, which is deplorable.”
With German votes, the Quaker party retained control over the Pennsylvania assembly, to the dismay of the Scotch-Irish, who felt ignored and maligned by the new coalition. Clustered on the frontier, the Scotch-Irish especially resented the refusal of the Quakers and Germans, who dwelled safely and prosperously around Philadelphia, to fund a frontier militia to attack the Indians. Feeling abandoned by the Pennsylvania government, the Scotch-Irish resolved to fight the natives on their own harsh terms. In killing Indians, the Scotch-Irish could vent their political resentments without overtly confronting the Germans and the Quakers.
In 1737, Thomas Penn and James Logan conducted the “Walking Purchase,” perhaps the most notorious land swindle in colonial history—which is saying a great deal. Unable to stop invading squatters, the local Lenni Lenape band agreed to relinquish a tract that would be bounded by what a man could walk around in thirty-six hours. Of course, the Lenni Lenape expected to lose only a modest parcel, but Logan and Penn had made elaborate preparations to maximize their purchase. They employed scouts to blaze a trail, and they trained three runners. On the appointed September day, the runners astonished and infuriated the Lenni Lenape by racing around a tract of nearly twelve hundred square miles, including most of their homeland.
During the eighteenth century, the British colonies imported 1.5 million slaves—more than three times the number of free immigrants.
The slave trade diminished the inhabitants of West Africa, who declined from 25 million in 1700 to 20 million in 1820. At least two million people died in slave-raiding wars and another six million captives went to the New World as slaves. That demographic loss hampered economic development, rendering West Africa vulnerable to European domination during the nineteenth century.
During the eighteenth century, the British seized a commanding lead in the transatlantic slave trade, carrying about 2.5 million slaves, compared with the 1.8 million borne by the second-place Portuguese (primarily to Brazil) and the 1.2 million transported by the third-place French.
During the eighteenth century at least one-third of the slaves died within three years of their arrival on the island of Barbados.
On the coast of West Africa, the sojourning Britons suffered from the dank humidity, fierce heat, and frequent torrential rains. They also died by the hundreds from tropical diseases, for Africa reversed the immunological advantage that Europeans enjoyed as colonizers in more temperate climes.
Popular myth has it that the Europeans obtained their slaves by attacking and seizing Africans. In fact, the shippers almost always bought their slaves from African middlemen, generally the leading merchants and chiefs of the coastal kingdoms. Determined to profit from the trade, the African traders and chiefs did not tolerate Europeans who foolishly bypassed them to seize slaves on their own initiative. And during the eighteenth century the Africans had the power to defeat Europeans who failed to cooperate. Contrary to the stereotype of shrewd Europeans cheating weak and gullible natives, the European traders had to pay premium, and rising, prices to African chiefs and traders, who drove a hard bargain. During the 1760s, traders paid about £20 per slave, compared with £17 during the 1710s.
The Europeans exploited and expanded the slavery long practiced by Africans. Some slaves were starving children sold by their impoverished parents. Others were debtors or criminals sentenced to slavery. But most were captives taken in wars between kingdoms or simply kidnapped by armed gangs.
Although they did not directly seize slaves, the European traders indirectly promoted the wars and kidnapping gangs by offering premium prices for captives.
As guns became essential for defense, a people had to procure them by raiding on behalf of their suppliers, lest they instead participate in the slave trade as victims. By the end of the century, the British alone were annually exporting nearly 300,000 guns to West Africa.
About a quarter of the captives died along the way from some combination of disease, hunger, exhaustion, beatings, and suicide.
Once the ship set sail, the slaves entered the notorious “middle passage” across the Atlantic to colonial America.
The European crews exposed the slaves to smallpox, measles, gonorrhea, and syphilis. And the Africans brought along their own diseases to exchange with the crew: yellow fever, dengue fever, malaria, yaws, and especially a bacillary dysentery (a gastrointestinal disorder) known as the “bloody flux.”
The greatest uprising racked Jamaica in 1760, killing ninety whites. Ruthless repression then killed four hundred blacks; most were burned at the stake, belying the eighteenth century’s reputation as an “Age of Enlightenment.”
But, in an effort to sustain their own cultural space, northern blacks developed an annual ritual festival known as “Negro election day,” when they gathered to drink, feast, play, and dance. The festivities culminated with the raucous election of local kings, governors, and judges, who acted throughout the next year as arbitrators of disputes within the black community.
On the sugar islands, slaves outnumbered whites by more than three to one.
Often the urban, skilled, and favored slaves were lighter-skinned mulattoes, the offspring of white masters and their female slaves. Adopting colonial words, ways, and clothes, the urban slaves usually felt little solidarity with the more numerous and African-born field hands of the rice and indigo plantations. But when frustrated in their aspirations for still greater freedom and privilege, the urban slaves could become especially formidable plotters against their masters.
Chesapeake slaves also lived in sufficient concentrations to find marriage partners and bear children, in contrast to many northern slaves. Consequently, natural increase swelled the Chesapeake slave population, which enabled the planters to reduce their African imports after 1750. Thereafter, creole slaves predominated in the Chesapeake.
In 1780 the black population in British America was less than half the total number of African emigrants received during the preceding century, while the white population exceeded its emigrant source by three to one, thanks especially to the healthy conditions in New England and the middle colonies.
And although some English dissenters, principally the Quakers, did seek in America a general religious freedom, many more emigrants wanted their own denomination to dominate, to the prejudice of all others. Indeed, at the end of the seventeenth century, most colonies offered less religious toleration than did the mother country.
And unlike other colonial regions, New England had plenty of official clergymen to fill the many pulpits. Most were graduates of Harvard (founded in 1636) or Yale (1701). Indeed, New England struck visitors as the most conspicuously devout and religiously homogeneous region in British North America. The New English towns enforced a Sabbath that restricted activity to the home and church, imposing arrests and fines on people who worked, played, or traveled on Sunday. An English visitor found the New England Sabbath “the strictest kept that ever I saw.”
As Kay so unpleasantly learned, an establishment tended to increase the power of colonial elites over the church rather than the power of the church over the colonists.
In addition to the many denominational divisions, colonial churches were developing an internal rift between evangelicals and rationalists.
Favoring critical and empirical inquiry, the rationalists slighted the traditional foundations of Christian faith: scriptural revelation and spiritual experience. The rationalists instead found guidance in the science that depicted nature as the orderly and predictable operation of fundamental and discernible “laws,” such as Isaac Newton’s explication of gravity. Christian rationalism held that God created the natural universe and thereafter never interfered with its laws. God seemed less terrifying as learned people reinterpreted epidemics, earthquakes, and thunderbolts as “natural” rather than as direct interventions of divine anger. The Reverend Andrew Eliot, a New England Congregationalist, explained, “There is nothing in Christianity that is contrary to reason. God never did, He never can, authorize a religion opposite to it, because this would be to contradict himself.”
Discarding the Calvinist notion of an arbitrary and punishing God, the rationalists worshiped a benign, predictable, forgiving, and consistent deity who rewarded good behavior with salvation, but who expected common people to defer to the learned and authoritative men at the top of the social hierarchy.
During his 1739–41 tour from Maine to Georgia, Whitefield furthered transatlantic and intercolonial integration by becoming the first celebrity seen and heard by a majority of the colonists.
Whitefield stirred controversy by blaming rationalist ministers for neglecting their duty to seek, experience, and preach conversion. He charged, “The generality of preachers talk of an unknown and unfelt Christ. The reason why congregations have been dead is, because they had dead men preaching to them.” Such rebukes divided the ministry, inspiring some to adopt Whitefield’s spontaneous, impassioned, evangelical style while hardening others in opposition.
“It was a very frequent thing to see an house full of outcries, faintings, convulsions and such like, both with distress, and also with admiration and joy.” The Old Lights called the outbursts “enthusiasm,” then a pejorative term that meant human madness, at best, or Satan’s manipulation, at worst. The Reverend Ezra Stiles commented that “multitudes were seriously, soberly and solemnly out of their wits.”
Where the New Lights championed the uninhibited and disruptive flow of divine grace by inspired itinerants, the Old Lights regarded Christianity as a stable faith that needed barricades against intrusive innovations.
In defying the established authority of minister and magistrate, the radical evangelicals championed individualism, a concept then considered divisive and anarchic.
The radical evangelicals sought to include every person in conversion, regardless of gender, race, and status, but they worked to exclude from church membership anyone they deemed unconverted by the New Birth.
Governor William Gooch denounced the itinerants for seeking “not liberty of conscience but freedom of speech.” His distinction was important and revealing. Gooch and other elitists accepted “liberty of conscience” as the passive persistence of longstanding denominational loyalties, but they dreaded “freedom of speech” for inviting people to rethink their allegiances, which seemed likely to disrupt social harmony. By this reasoning, Presbyterian preachers should limit their preaching to their traditional constituencies in Scots and Scotch-Irish settlements, rather than roam into other parishes to recruit Anglican defectors.
Used to reading character from external appearances, the Virginia Anglicans regarded the Baptists as somber and melancholy people, for they wore dark and plain clothing, cut their hair short, and wove their faith into every conversation. But their external sobriety and austerity covered a more emotional, intimate, and supportive community for worship. Gathered together, they shared their despair and ecstasy in a manner discouraged by ridicule in the highly competitive and gentry-dominated society of Anglican Virginia. Addressing one another as “brother” and “sister,” the Baptists conducted an egalitarian worship that contrasted with the hierarchical seating and service of the Anglican churches. The Baptists even welcomed slaves into their worship as “brothers” and “sisters,” and encouraged some to become preachers. To break down worldly pride and build solidarity, Baptist services included extensive physical contact: laying on of hands, the exchange of the “kiss of charity,” and ritual foot-washing. A visceral distaste for such intimate contact with ordinary people discouraged gentlemen and ladies from becoming Baptists. Appealing primarily to common planters and some slaves, the Baptists drew them together while drawing them away from the gentry.
By calling upon converts to desert their Anglican churches, the Baptists threatened a foundation of Virginia society: the expectation that everyone in a parish would worship together in the established church supervised by the county gentry. Baptists also discouraged the public amusements that had long demonstrated the gentry’s leadership as the finest dancers and the owners of the best racehorses and gamecocks. Landon Carter bitterly complained that the Baptists were “quite destroying pleasure in the Country; for they encourage ardent Prayer; strong & constant faith, & an intire Banishment of Gaming, Dancing, & Sabbath-Day Diversions.” The withdrawal of common evangelicals from public diversions and Anglican services implicitly rebuked the gentry and parsons for leading worldly lives.
Rigorously enforcing the laws against itineracy, Anglican magistrates whipped and jailed dozens of unlicensed preachers. Far from avoiding or resisting confrontation, the Baptists welcomed opportunities to endure persecution conspicuously for their faith. In 1771 a county sheriff and a posse of gentry tried to break up a Baptist meeting by pulling the preacher, John Waller, from the stage to inflict twenty lashes with a horsewhip. In Waller’s words, the congregation gathered around the whipping to sing psalms “so that he Could Scarcely feel the Stripes.” Released, Waller “Went Back singing praise to God, mounted the Stage & preached with a Great Deal of Liberty.” For evangelicals, to preach with “Liberty” meant to channel the Holy Spirit…
In 1758 the Philadelphia Yearly Meeting also barred Quaker slaveholders from church leadership, and in 1776 it disowned them from membership. In colonies premised on slavery, the Quakers became the lone denomination to seek abolition systematically.
Conversion on their own terms brought them a new source of discipline to resist the worst vices of the dominant society. In particular, converted Indians reduced the alcohol consumption that rendered enclave Indians so poor, indebted, and exploited by their colonial neighbors. By creating their own local congregations, enclave Indians also limited the cultural control of outsiders.
In sum, the Great Awakening accelerated a religious dialectic that pulled seekers and their congregations between the spiritual hunger to transcend the world and the social longing for respect in…
In 1627, after nearly two decades of colonization, Quebec still had only eighty-five French colonists.
Most of the female emigrants came from an orphanage in Paris and were known as filles du roi (“daughters of the king”). In addition to paying their passage, the crown provided a cash marriage dowry: an alluring incentive for orphan girls lacking family money.
This growth was too little too late to compete with the swelling number of English colonists, who numbered 234,000 whites plus 31,000 enslaved Africans by 1700.
That restrictive policy deprived Canada of an especially promising set of colonists, the Protestant minority known as Huguenots, who resembled the English Puritans in their Calvinist faith and middling status as artisans, shopkeepers, and merchants.
A visitor commented that a Canadian needed glass eyes, a brass body, and brandy for blood to endure the bitter cold. When winter at last receded, warm weather unleashed tormenting clouds of mosquitoes and blackflies—denser and fiercer than any in Europe.
Habitants took pride in their regular consumption of meat and white bread, which few French peasants could afford. Thanks to small, tight houses and plentiful firewood, the New French also kept warmer in the winter, despite its rigors and duration. And in contrast to their French relatives, the New French could afford horses, another cherished mark of higher status among peasants. Finally, the Canadian habitant enjoyed privileges of hunting and fishing—both of which were environmentally and legally denied to the peasants in crowded, depleted, and hierarchical France, where the aristocrats monopolized the limited supply of game.
Adapting to the cold, the habitants transformed winter into a cherished season of festive visiting, facilitated by horse-drawn sleighs, known as carioles.
At death, the widow inherited half the assets (and debts), while the children obtained the other half—a better split than the one-third that English widows ordinarily received.
Because each novice had to pay a substantial dowry to enter a convent, most came from seigneurial or mercantile families. By paying convent dowries to place some daughters, parents could vest most of the family estate in fewer heirs, especially an elder son.
To govern New France, the crown appointed three rival officials: a military governor-general, a civil administrator known as the intendant, and a Catholic bishop. The three were supposed to cooperate to enforce crown orders while competing for crown favor by jealously watching one another for corruption, heresy, and disloyalty.
But the number of army commissions, civil offices, and fur trade licenses lagged behind the proliferating children of seigneurial families. Inhibited from entering trade by their code of nobility, growing numbers dwelled in genteel indolence and poverty. In 1737 a priest reported that many seigneurial families were “as poor as artists and as vain as peacocks.” Charlevoix noted, “There is a great fondness for keeping up one’s position, and nearly no one amuses himself by thrift. Good cheer is supplied, if its provision leaves means enough to be well clothed; if not, one cuts down on the table in order to be well dressed.” Appearances mattered in New France.
After 1700, hard labor, rapid reproduction, and peace with the Iroquois brought greater security, prosperity, and development to the valley. From 15,000 in 1700, the population grew to 52,000 by 1750. The amount of cleared and cultivated land, the size of the wheat harvest, and the number of mills all tripled.
Far more readily than their English or Dutch competitors, the French traders married native women, which proved critical to their persistent predominance in the fur trade of the Great Lakes country. Indian women overcame their initial dislike of the pale and bearded French as ugly. Owing to war losses, Indian men had become relatively scarce, and the coureurs de bois offered their wives and Indian kin privileged access to the coveted trade goods of Europe. Over the generations, these relationships produced a distinctive mixed-blood people known as the métis, who spoke multiple languages, lived in their own villages, and acted as intermediaries between their French and Indian relatives.
The Indians accepted the terminology only because they understood it very differently, for they did not have patriarchal families. In their matrilineal kinship systems, mothers and uncles had far more authority than did fathers. The natives happily called the French their “fathers” in the expectation that they would behave like Indian fathers: indulgent, generous, and weak. Among Indians, a father gave much more than he received.
From Carolina’s success and Florida’s failure, the French concluded that a commerce in guns better secured native support than did missionaries. Determined to compete with the Carolina traders, the French in Louisiana wooed the Indians with trade goods, especially firearms.
Because few French volunteered to colonize distant and alien Louisiana, the company relied on military conscripts and convicted criminals (a mix of vagrants, blasphemers, thieves, smugglers, tax evaders, political prisoners, and prostitutes). To a far greater degree than in Canada, the French used Louisiana as a penal colony, which further undermined its reputation. In 1720 a colonial official complained, “What can one expect from a bunch of vagabonds and wrong-doers in a country where it is harder to repress licentiousness than in Europe?”
To sow antipathies, the French conspicuously employed especially trusted blacks in their militias sent to fight the Indians. A few particularly courageous black soldiers won their freedom as a reward. On the other hand, colonial leaders periodically punished rebel slaves by turning them over to Indians for burning to death. A French priest said that the executions “inspired all the Negroes with a new horror of the Savages, … which will have a beneficial effect in securing the safety of the Colony.”
To maintain the racial divisions essential to Louisiana’s security, the officers relied on Indians and blacks to track down and punish deserters. Military tribunals often specified that insubordinate soldiers be flogged by a black man.
By contrast, the British colonists reserved such treatment exclusively for their African property. To execute convicted whites, Louisiana employed a black man, Louis Congo, who drove a hard bargain for his services as executioner: freedom for himself and his wife, a plot of land, a steady supply of alcohol, and generous fees levied in pounds of tobacco—ten for a flogging or branding, thirty for a hanging, and forty for breaking on the wheel or burning alive.
The Natchez people preserved substantial elements of the Mississippian culture, including ceremonial mounds, painted and carved temples, and powerful chiefs who, in death, were honored with the human sacrifice of their servants.
Harsh experience had taught them that any people cut off from the gun trade faced destruction by their native enemies. Consequently, they considered any cessation of trade or escalation of prices to be acts of hostility, demanding war.
By combining Hispanic horses with French guns, many native bands reinvented themselves as buffalo-hunting nomads, which brought them unprecedented prosperity and power.
For want of sufficient water and because of the prevailing high winds, only a few species of trees, primarily cottonwood and willow, grew on the Great Plains, and only along the narrow, sheltered margins beside the permanent rivers. Instead of trees, hardy and drought-resistant grasses covered most of the Great Plains.
The bison flesh abounded in protein with relatively little fat, and the internal organs supplied many vitamins and minerals. Cut into thin strips and dried in the hot summer sun, the meat could be preserved for months and even years.
The dried dung, known as “buffalo chips,” served as fuel on the treeless plains.
Those ties rendered band membership highly fluid, as the dissatisfied could readily shift into another band where they had relatives.
By conscripting the Pueblo to raid the nomads, the Hispanics further alienated them from one another. The raids procured the one paying commodity in New Mexico: slaves.
Today the predominant image of the American Indian is a warrior and buffalo hunter, wearing an eagle-feather bonnet and riding across the Great Plains. We imagine that the mounted warrior defended a timeless, deeply rooted way of life, independent of the European invasion of America. In fact, the association of Great Plains Indians with the horse is relatively recent and depended upon the colonial intrusion. Although horses first evolved in North America, before spreading eastward into Asia and Europe about twelve thousand years ago, they had become extinct in this continent by about ten thousand years ago. During the sixteenth century, the horse returned to North America as a domesticated animal kept by the Hispanic colonists.
The great material benefits fed into a new psychology, a sense of liberation from old limits into an intoxicating sense of speed, power, and range—an offering of both security and immense, open possibility.
There was a conspicuous exception to the general pattern: on the upper Missouri River some Hidatsa bands broke away westward, abandoning horticulture to become nomads, assuming a new identity as the Crow.
Especially numerous, the Lakota totaled some 25,000 people in 1790. Their own word lakota means “allies,” but their foes, including the French, called them the Sioux, which meant “enemies.”
In sum, most of the Indian peoples we now associate with the Great Plains were relative newcomers who arrived during the eighteenth century.
In 1800 a trader on the northern plains marveled at the abundant buffalo and remarked, “This is a delightful country, and were it not for perpetual wars, the natives might be the happiest people on earth.”
Once an advantage, the concentration in villages became a deadly liability as the villagers suffered disproportionately from the contagious epidemics. As their numbers dwindled, the horticulturalists could no longer effectively defend many of their villages, much less their claim to the surrounding buffalo herds. Because the more mobile and dispersed nomads suffered smaller losses to the epidemics, they grew in relative power as the villagers waned.
The greater rewards of successful manhood came at a high price, for Great Plains warriors led shorter lives of increased violence. Because so many males died in their youth or prime, women outnumbered men, which encouraged polygamy by the most successful warriors.
In 1760 only about 1,200 colonists lived in Texas, nearly half of them (580) at San Antonio.
As they became distinctive from the other Apaches, these composite and increasingly prosperous western bands became known to the Hispanics as the Apache de Navihu, which soon became shortened to Navajo.
In 1769, Galvez cracked under the strain of a formidable rebellion by the Seri and Pima Indians in Sonora (which included southern Arizona). One morning he bolted from his tent to announce a plan to “destroy the Indians in three days simply by bringing 600 monkeys from Guatemala, dressing them like soldiers, and sending them against Cerro Prieto,” a Seri stronghold. Galvez proceeded to assume the identity of Moctezuma, the king of Sweden, Saint Joseph, and finally God. The concerned viceroy of New Spain recalled Galvez to Mexico City, where he slowly recovered his mental health; sent home, he later rose to higher office in Spain.
Usually locked by ice, Hudson Bay was accessible by ships only during two months in the summer.
Like most fur-trading enterprises, the Hudson’s Bay Company preferred to provide guns rather than missionaries, from a conviction that Christianity ruined hunters.
But the British colonists dissipated their numerical advantage by their division into fourteen distinct mainland colonies (Nova Scotia was the fourteenth, neglected by historians who speak of only thirteen).
Making a virtue of their small colonial population, the French usually kept their promises not to intrude new settlements on Indian lands.
In sum, by 1750 the Indians faced a greater threat of settler invasion and environmental transformation from the numerous and aggressive English than from the few and more generous French.
Embarked on his first command, Washington promptly displayed his inexperience. Although superior French numbers were building Fort Duquesne, Washington foolishly attacked and destroyed a small French patrol. Understandably upset, the main French force and their Indian allies surrounded Washington’s camp, a crude stockade that he had built in a swamp surrounded by high ground. When it began to rain heavily, his soldiers wallowed in water as the French and Indians fired on them from the hills. Compelled to surrender on July 4…
Although politically expedient, Pitt’s policy was financially reckless: by augmenting the monstrous public debt, Pitt saddled the colonists and Britons with a burden that would violently disrupt the empire after the war.
The conquest of Canada cost the British empire about £4 million, more than ten times what the French spent to defend it.
The collapse of New France was dreadful news to the Indians of the interior. No longer could they play the French and the British off against one another to maintain their own independence, maximize their presents, and ensure trade competition.
reckless Carolina settlers invaded Cherokee lands and poached their deer. Some especially ruthless frontiersmen killed Cherokee to procure scalps to collect the large bounties offered by the colony of Virginia. It was impossible to tell a Cherokee scalp from that of a hostile Shawnee—and far easier to kill an unsuspecting people than one prepared for war. The £50 bounty for an adult scalp allured settlers who rarely could make that much in a year. They rationalized that all Indians were their enemies, if not immediately, then inevitably.
The natives also felt a new commonality as Indians, above and beyond their traditional tribal and village identities. This Pan-Indian sensibility emerged from the teaching of a new set of religious prophets, led by a Lenni Lenape named Neolin. Adapting Christian ideas selectively to update native traditions, the prophets proclaimed a double creation: one for all Indians, the other for whites. In defense of their own divinely ordained way of life, Indians were supposed to resist colonial innovations, especially the consumption of alcohol and the cession of lands.
Although spared from massacre, a third of the Indian refugees died of smallpox contracted while crowded in their Philadelphia barracks.
That shocking conflict between the colonies and the mother country developed from strains initiated by winning the Seven Years War.
in 1763 imperial taxation averaged twenty-six shillings per person in Britain, where most subjects were struggling, compared with only one shilling per person in the colonies, where most free people were prospering.
Paradoxically, by protesting British taxation, the colonists affirmed their cherished identity as liberty-loving Britons, as they rallied behind the most cherished proposition of their shared political culture: that a free man paid no tax unless levied by his own representatives.
Colonists were quick to speak of “slavery” because they knew from their own practice on Africans where unchecked domination ultimately led. The conspicuous presence of slavery rendered liberty the more dear to the colonial owners of human property.
The free colonists intently defended their property rights because property alone made men truly independent and free.
European leaders increasingly concluded that wealth and power accrued to nations that discovered and analyzed new information.
The traders primarily sought sable, the premier fur-bearing mammal of Siberia. At first, the Russians marketed their furs in western Europe, but in 1689 they opened an even more lucrative trade with China, via the Siberian border town of Kyakhta, where the Russians obtained, in return, Chinese porcelains, teas, and silks.
As the French depended upon Indian hunters to harvest beaver, the Russians relied on Siberian tribal peoples to kill sable. Living in many bands of highly mobile hunter-gatherers with animist beliefs, the native Siberians resembled their distant kin the Inuit and the Indians of subarctic Canada.
In their reliance on tribute rather than trade to capture native labor, the Russians resembled the Spanish conquistadores of Mexico rather than the French traders in Canada.
When they submitted, the Siberians became exposed to deadly new diseases and a debilitating new dependency on alcohol, a combination that devastated their population.
Like the French and the English, leading Russians longed to believe that they could easily establish an American empire by appearing before the Indians as kinder and gentler colonizers. Subscribing to the “Black Legend” of peculiar Spanish brutality, the Russians predicted that the American Indians would welcome them as liberators.
Instead of welcoming the Russians as liberators, the local Tlingit Indians ambushed and destroyed two small boats filled with fifteen men sent to probe the shallow waters. In alarm, Chirikov promptly sailed back to Kamchatka, bearing neither hostages nor tribute.
The naturalist Georg Steller kept alive and busy observing, killing, dissecting, and naming wildlife previously unknown to Europeans, including Steller’s eagle, Steller’s jay, Steller’s white raven, and Steller’s sea cow, the last an immense northern manatee unique to the western Aleutians. Steller’s sea cows were, when mature, thirty-five feet long and exceeded four tons. Steller and the other survivors endured by hunting sea cows and sea otters and by grubbing for roots with sufficient vitamin C to ease their scurvy.
To accumulate sufficient furs for a profit, the voyages were long: at least two years and as many as six.
The Aleut divided into castes of chiefs, commoners, and slaves (principally war captives). The chiefs enjoyed larger dwellings and more prominent burials, with executed slaves as their companions in the afterworld.
the victors held the native women and children for ransom, while releasing the Aleut men to fill a large quota of furs (which took months). Once the furs were delivered, the promyshlenniki released the children and the women. In the interim, the Russians exploited the Aleut women as sex slaves. Upon departing, the traders left behind venereal diseases and some trade goods—wool, beads, knives, and hatchets—in token payment for the sea otter pelts.
From a contact population of about 20,000, the Aleut dwindled to only 2,000 by 1800.
As a precaution, the Spanish crown ordered the colonization of California to secure the unguarded northwestern door to precious Mexico. The Spanish divided California into southern “Baja California” (now in Mexico) and northern “Alta California” (approximately the present state of California).
Much larger and more complex than Baja, Alta California extended eleven hundred miles, contained about 100 million acres, and included the most spectacular topography and greatest environmental range of any region in North America.
In 1768 about 300,000 natives dwelled in Alta California: an especially impressive number given that only a few practiced horticulture.
In sum, much of the California landscape was subtly anthropogenic (human influenced) long before colonizers arrived with their own even more demanding system of manipulating nature, which they called civilization.
These human-tended landscapes sustained larger numbers of plants and animals and were healthier than today’s forests in California. Indeed, for lack of regular fires, contemporary forests are crowded with small trees, cluttered with deadwood, infested with pests, and vulnerable to destruction by huge and catastrophic fires.
Because the land belonged collectively to the villagers rather than to individuals, there was no market in land, no buying and selling of real estate.
Like most native cultures, the California Indians had powerful shamans but weak chiefs.
Although greatly outnumbered, the Hispanics possessed an intimidating monopoly of horses and guns, as well as a formal command structure.
proved surprisingly successful as economic enterprises, becoming self-sustaining in food by 1778. In 1775 the missions had only 427 head of cattle, but these grew phenomenally to at least 95,000 by 1805.
In 1769 the California coast between San Diego and San Francisco had a native population of 72,000, which declined to just 18,000 by 1821.
Cook’s reconnaissance facilitated subsequent British colonization of Australia, which began with the arrival of 723 convicts at Botany Bay in early 1788.
Lacking metallurgy, the natives exercised their ingenuity in crafting tools and weapons from wood and stone. They did occasionally recover bits of iron, mostly nails, from driftwood that apparently originated with Japanese and, perhaps, Spanish wrecks. Cherishing this metal, they longed to obtain more.
Dependent upon a bountiful but volatile nature, the Hawaiians maintained their harmony with the supernatural by worshiping an array of divine spirits, each manifesting some aspect of their environment. In particular, Lono dispensed the nourishing rain, while Ku had to be propitiated with human sacrifice to secure victory in war.
men and women ate different foods and ate apart from one another. Only men could eat pig and only chiefs could eat dog.
Other taboos prohibited women from fishing and forbade menstruating women even to enter a river.
The well-fed natives also had the leisure time to compete violently for prestige. The victors in their endemic warfare collected numerous slaves and the skulls of the dead for prominent display in their villages.
Experienced traders and devoted to property, the raincoast peoples belied the classic stereotype of naive natives easily cheated by European traders bearing a few beads. Although eager to get metal knives, chisels, and arrowheads, the Moachat drove a hard bargain for their pelts and salmon. An expedition scientist noted that the raincoast natives were “very keen traders, getting as much as they could for everything they had; always asking for more, give them what you would.” The expedition artist John Webber had to pay for the right to draw the interior of a Nootka house.
Attentive to ancient tradition as well as new technology, Kamehameha ritually sacrificed defeated chiefs to Ku.
By investing in children’s souls instead of the sea otter trade, the Spanish ensured their own long-term irrelevance in the north Pacific.
-
The Anxious Generation by Jonathan Haidt
From my Notion template
The Book in 3 Sentences
- An in-depth look at how cell phones and social media are changing the younger generations. It’s helpfully packaged with remediation strategies.
Impressions
On the whole it was good, though quite repetitive, and Haidt’s Substack had already spoiled several of its points before the book actually came out.
How I Discovered It
I’ve read Haidt’s other books.
Who Should Read It?
Parents of children under 18
How the Book Changed Me
My immediate action was to cut down on Marleigh’s already very rationed phone time. The book also reaffirmed and extended my current belief that a proper relationship with technology is both symbiotic and adversarial, and it clarified my notions of anomie and one’s relationship with modern society as it is, not as it should be.
Summary + Notes
By designing a firehose of addictive content that entered through kids’ eyes and ears, and by displacing physical play and in-person socializing, these companies have rewired childhood and changed human development on an almost unimaginable scale.
While the reward-seeking parts of the brain mature earlier, the frontal cortex—essential for self-control, delay of gratification, and resistance to temptation—is not up to full capacity until the mid-20s, and preteens are at a particularly vulnerable point in development.
Gen Z became the first generation in history to go through puberty with a portal in their pockets that called them away from the people nearby and into an alternative universe that was exciting, addictive, unstable, and—as I will show—unsuitable for children and adolescents.
They spent far less time playing with, talking to, touching, or even making eye contact with their friends and families, thereby reducing their participation in embodied social behaviors that are essential for successful human development.
So even while parents worked to eliminate risk and freedom in the real world, they generally, and often unknowingly, granted full independence in the virtual world, in part because most found it difficult to understand what was going on there, let alone know what to restrict or how to restrict it.
My central claim in this book is that these two trends—overprotection in the real world and underprotection in the virtual world—are the major reasons why children born after 1995 became the anxious generation.
A few notes about terminology. When I talk about the “real world,” I am referring to relationships and social interactions characterized by four features that have been typical for millions of years:
- They are embodied, meaning that we use our bodies to communicate, we are conscious of the bodies of others, and we respond to the bodies of others both consciously and unconsciously.
- They are synchronous, which means they are happening at the same time, with subtle cues about timing and turn taking.
- They involve primarily one-to-one or one-to-several communication, with only one interaction happening at a given moment.
- They take place within communities that have a high bar for entry and exit, so people are strongly motivated to invest in relationships and repair rifts when they happen.
In contrast, when I talk about the “virtual world,” I am referring to relationships and interactions characterized by four features that have been typical for just a few decades:
- They are disembodied, meaning that no body is needed, just language. Partners could be (and already are) artificial intelligences (AIs).
- They are heavily asynchronous, happening via text-based posts and comments. (A video call is different; it is synchronous.)
- They involve a substantial number of one-to-many communications, broadcasting to a potentially vast audience. Multiple interactions can be happening in parallel.
- They take place within communities that have a low bar for entry and exit, so people can block others or just quit when they are not pleased. Communities tend to be short-lived, and relationships are often disposable.
- No smartphones before high school. Parents should delay children’s entry into round-the-clock internet access by giving only basic phones (phones with limited apps and no internet browser) before ninth grade (roughly age 14).
- No social media before 16. Let kids get through the most vulnerable period of brain development before connecting them to a firehose of social comparison and algorithmically chosen influencers.
- Phone-free schools. In all schools from elementary through high school, students should store their phones, smartwatches, and any other personal devices that can send or receive texts in phone lockers or locked pouches during the school day. That is the only way to free up their attention for each other and for their teachers.
- Far more unsupervised play and childhood independence. That’s the way children naturally develop social skills, overcome anxiety, and become self-governing young adults.
Adults in Gen X and prior generations have not experienced much of a rise in clinical depression or anxiety disorders since 2010,[21] but many of us have become more frazzled, scattered, and exhausted by our new technologies and their incessant interruptions and distractions.
For most of the parents I talk to, their stories don’t center on any diagnosed mental illness. Instead, there is an underlying worry that something unnatural is going on, and that their children are missing something—really, almost everything—as their online hours accumulate.
We found important clues to this mystery by digging into more data on adolescent mental health.[5] The first clue is that the rise is concentrated in disorders related to anxiety and depression, which are classed together in the psychiatric category known as internalizing disorders. These are disorders in which a person feels strong distress and experiences the symptoms inwardly. The person with an internalizing disorder feels emotions such as anxiety, fear, sadness, and hopelessness. They ruminate. They often withdraw from social engagement.
In contrast, externalizing disorders are those in which a person feels distress and turns the symptoms and responses outward, aimed at other people. These conditions include conduct disorder, difficulty with anger management, and tendencies toward violence and excessive risk-taking. Across ages, cultures, and countries, girls and women suffer higher rates of internalizing disorders, while boys and men suffer from higher rates of externalizing disorders.[6] That said, both sexes suffer from both, and both sexes have been experiencing more internalizing disorders and fewer externalizing disorders since the early 2010s.[7]
Anxiety is related to fear, but is not the same thing. The diagnostic manual of psychiatry (DSM-5-TR) defines fear as “the emotional response to real or perceived imminent threat, whereas anxiety is anticipation of future threat.”[12] Both can be healthy responses to reality, but when excessive, they can become disorders.
Cognitively, it often becomes difficult to think clearly, pulling people into states of unproductive rumination and provoking cognitive distortions that are the focus of cognitive behavioral therapy (CBT), such as catastrophizing, overgeneralizing, and black-and-white thinking. For those with anxiety disorders, these distorted thinking patterns often elicit uncomfortable physical symptoms, which then induce feelings of fear and worry, which then trigger more anxious thinking, perpetuating a vicious cycle.
So whatever happened in the early 2010s, it hit preteen and young teen girls harder than any other group. This is a major clue. Acts of intentional self-harm in figure 1.4 include both nonfatal suicide attempts, which indicate very high levels of distress and hopelessness, and NSSI, such as cutting. The latter are better understood as coping behaviors that some people (especially girls and young women) use to manage debilitating anxiety and depression.
Millennial teens, who grew up playing in that first wave, were slightly happier, on average, than Gen X had been when they were teens. The second wave was the rapid increase in the paired technologies of social media and the smartphone, which reached a majority of homes by 2012 or 2013. That is when girls’ mental health began to collapse, and when boys’ mental health changed in a more diffuse set of ways.
Smartphones are very different. They connect you to the internet 24/7, they can run millions of apps, and they quickly became the home of social media platforms, which can ping you continually throughout the day, urging you to check out what everyone is saying and doing. This kind of connectivity offers few of the benefits of talking directly with friends. In fact, for many young people, it’s poisonous.
People don’t get depressed when they face threats collectively; they get depressed when they feel isolated, lonely, or useless. As I’ll show in later chapters, this is what the Great Rewiring did to Gen Z.
college students, activism, and flourishing.[43] Yet more recent studies of young activists, including climate activists, find the opposite: Those who are politically active nowadays usually have worse mental health.[44] Threats and risks have always haunted the future, but the ways that young people are responding, with activism carried out mostly in the virtual world, seem to be affecting them very differently compared to previous generations, whose activism was carried out mostly in the real world.
Intriguingly, a child’s brain is already 90% of its full size by around age 5.
Children can only learn how to not get hurt in situations where it is possible to get hurt, such as wrestling with a friend, having a pretend sword fight, or negotiating with another child to enjoy a seesaw when a failed negotiation can lead to pain in one’s posterior, as well as embarrassment. When parents, teachers, and coaches get involved, it becomes less free, less playful, and less beneficial. Adults usually can’t stop themselves from directing and protecting.
but information doesn’t do much to shape a developing brain. Play does. This relates to a key CBT insight: Experience, not information, is the key to emotional development. It is in unsupervised, child-led play where children best learn to tolerate bruises, handle their emotions, read other children’s emotions, take turns, resolve conflicts, and play fair. Children are intrinsically motivated to acquire these skills because they want to be included in the playgroup and keep the fun going.
Even if the content on these sites could somehow be filtered effectively to remove obviously harmful material, the addictive design of these platforms reduces the time available for face-to-face play in the real world. The reduction is so severe that we might refer to smartphones and tablets in the hands of children as experience blockers.
The two that are most relevant for our discussion of social media are conformist bias and prestige bias.
In a real-life social setting, it takes a while—often weeks—to get a good sense for what the most common behaviors are, because you need to observe multiple groups in multiple settings. But on a social media platform, a child can scroll through a thousand data points in one hour (at three seconds per post), each one accompanied by numerical evidence (likes) and comments that show whether the post was a success or a failure. Social media platforms are therefore the most efficient conformity engines ever invented. They can shape an adolescent’s mental models of acceptable behavior in a matter of hours,
But humans have an alternative ranking system based on prestige, which is willingly conferred by people to those they see as having achieved excellence in a valued domain of activity, such as hunting or storytelling back in ancient times.
Language learning is the clearest case. Children can learn multiple languages easily, but this ability drops off sharply during the first few years of puberty.[32] When a family moves to a new country, the kids who are 12 or younger will quickly become native speakers with no accent, while those who are 14 or older will probably be asked, for the rest of their lives, “Where are you from?”
For girls, the worst years for using social media were 11 to 13; for boys, it was 14 to 15.
By building physical, psychological, and social competence, it gives kids confidence that they can face new situations, which is an inoculation against anxiety.
I’ll refer to BIS as defend mode. For people with chronic anxiety, defend mode is chronically activated.
thrilling experiences have anti-phobic effects.
Sandseter and Kennair analyzed the kinds of risks that children seek out when adults give them some freedom, and they found six: heights (such as climbing trees or playground structures), high speed (such as swinging, or going down fast slides), dangerous tools (such as hammers and drills), dangerous elements (such as experimenting with fire), rough-and-tumble play (such as wrestling), and disappearing (hiding, wandering away, potentially getting lost or separated). These are the major types of thrills that children need. They’ll get them for themselves unless adults stop them—which we did in the 1990s. Note that video games offer none of these risks, even though games such as Fortnite show avatars doing all of them.
Our goal in designing the places children play, she says, should be to “keep them as safe as necessary, not as safe as possible.”
Children were getting less time to play, but they suddenly got more time with their time-starved parents?
The Australian psychologist Nick Haslam originated the term “concept creep,” [48] which refers to the expansion of psychological concepts in recent decades in two directions: downward (to apply to smaller or more trivial cases) and outward (to encompass new and conceptually unrelated phenomena). You can see concept creep in action by observing the expansion of terms like “addiction,” “trauma,” “abuse,” and “safety.” For most of the 20th century, the word “safety” referred almost exclusively to physical safety. It was only in the late 1980s that the term “emotional safety” began to show up at more than trace levels in Google’s Ngram viewer. From 1985 to 2010, at the start of the Great Rewiring, the term’s frequency rose rapidly and steadily, a 600% increase.
Mammal babies therefore have a long period of dependence and vulnerability during which they must achieve two goals: (1) develop competence in the skills needed for adulthood, and (2) don’t get eaten. The best way to avoid getting eaten is generally to stick close to Mom. But as mammals mature, their experience-expectant brains need to wire up by practicing skills such as running, fighting, and befriending. This is why young mammals are so motivated to move away from Mom to play, including risky play. The psychological system that manages these competing needs is called the attachment system.
As I noted in chapter 2, the human brain reaches 90% of its adult size by age 5, and it has far more neurons and synapses at that moment than it will have in its adult form.
If a child goes through puberty doing a lot of archery, or painting, or video games, or social media, those activities will cause lasting structural changes in the brain, especially if they are rewarding. This is how cultural experience changes the brain, producing a young adult who feels American instead of Japanese, or who is habitually in discover mode as opposed to defend mode.
In fact, smartphones and other digital devices bring so many interesting experiences to children and adolescents that they cause a serious problem: They reduce interest in all non-screen-based forms of experience.
Are screen-based experiences less valuable than real-life flesh-and-blood experiences? When we’re talking about children whose brains evolved to expect certain kinds of experiences at certain ages, yes. A resounding yes. Communicating by text supplemented by emojis is not going to develop the parts of the brain that are “expecting” to get tuned up during conversations supplemented by facial expressions, changing vocal tones, direct eye contact, and body language. We can’t expect children and adolescents to develop adult-level real-world social skills when their social interactions are largely happening in the virtual world.
In the real world, it often matters how old you are. But as life moved online, it mattered less and less.
A country that is large, secular, and diverse by race, religion, and politics may not be able to construct shared rites of passage that are full of moral guidance, like the Apache sunrise ceremony. Yet despite our differences, we all want our children to become socially competent and mentally healthy adults who are able to manage their own affairs, earn a living, and form stable romantic bonds. If we can agree on that much, then might we be able to agree on norms that lay out some of the steps on that path?
“Daddy, can you take the iPad away from me? I’m trying to take my eyes off it but I can’t.” My daughter was in the grip of a variable-ratio reinforcement schedule administered by the game designers, which is the most powerful way to take control of an animal’s behavior short of implanting electrodes in its brain.
In this chapter, I describe the four foundational harms of the new phone-based childhood that damage boys and girls of all ages: social deprivation, sleep deprivation, attention fragmentation, and addiction.
By the early 2010s, our phones had transformed from Swiss Army knives, which we pulled out when we needed a tool, to platforms upon which companies competed to see who could hold on to eyeballs the longest.
First and foremost, in 2009, Facebook introduced the “like” button and Twitter introduced the “retweet” button. Both of these innovations were then widely copied by other platforms, making viral content dissemination possible. These innovations quantified the success of every post and incentivized users to craft each post for maximum spread, which sometimes meant making more extreme statements or expressing more anger and disgust.[8] At the same time, Facebook began using algorithmically curated news feeds, which motivated other platforms to join the race and curate content that would most successfully hook users.
By the early 2010s, social “networking” systems that had been structured (for the most part) to connect people turned into social media “platforms” redesigned (for the most part) in such a way that they encouraged one-to-many public performances in search of validation, not just from friends but from strangers.
Children and adolescents, who were increasingly kept at home and isolated by the national mania for overprotection, found it ever easier to turn to their growing collection of internet-enabled devices, and those devices offered ever more attractive and varied rewards. The play-based childhood was over; the phone-based childhood had begun.
Putting it all together, the Great Rewiring and the dawn of the phone-based childhood seem to have added two to three hours of additional screen-based activity, on average, to a child’s day, compared with life before the smartphone.
These numbers vary somewhat by social class (more use in lower-income families than in high-income families), race (more use in Black and Latino families than in white and Asian families[13]), and sexual minority status (more use among LGBTQ youth; see more detail in this endnote).
In 2020, we began telling everyone to avoid proximity to any person outside their “bubble,” but members of Gen Z began socially distancing themselves as soon as they got their first smartphones.
The Great Rewiring devastated the social lives of Gen Z by connecting them to everyone in the world and disconnecting them from the people around them.
Teens need more sleep than adults—at least nine hours a night for preteens and eight hours a night for teens.
It makes intuitive sense. A study by Jean Twenge and colleagues of a large U.K. data set found that “heavy use of screen media was associated with shorter sleep duration, longer sleep latency, and more mid-sleep awakenings.”[37] The sleep disturbances were greatest for those who were on social media or who were surfing the internet in bed.
In other words, when your sleep is truncated or disturbed, you’re more likely to become depressed and develop behavioral problems. The effects were larger for girls.
In short, children and adolescents need a lot of sleep to promote healthy brain development and good attention and mood the next day. When screens are allowed in bedrooms, however, many children will use them late into the night—especially if they have a small screen that can be used under the blanket. The screen-related decline of sleep is likely a contributor to the tidal wave of adolescent mental illness that swept across many countries in the early 2010s.
When you add it all up, the average number of notifications on young people’s phones from the top social and communication apps amounts to 192 alerts per day, according to one study.[42] The average teen, who now gets only seven hours of sleep per night, therefore gets about 11 notifications per waking hour, or one every five minutes.
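The per-hour figure follows directly from the two numbers quoted above; here is a minimal sketch of the arithmetic (Python, purely illustrative, using only the figures cited in the book):

```python
# Re-deriving the notification rate quoted above (illustrative only; the
# 192 alerts/day and 7 hours of sleep are the figures cited in the book).
alerts_per_day = 192
sleep_hours = 7
waking_hours = 24 - sleep_hours              # 17 waking hours

alerts_per_waking_hour = alerts_per_day / waking_hours
minutes_between_alerts = 60 / alerts_per_waking_hour

print(f"{alerts_per_waking_hour:.1f} alerts per waking hour")    # ~11.3
print(f"one alert every {minutes_between_alerts:.1f} minutes")   # ~5.3
```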
Thanks to the tech industry and its voracious competition for the limited resource of adolescent attention, many members of Gen Z are now living in Kurt Vonnegut’s dystopia.
They found that performance was best when phones were left in the other room, and worst when phones were visible, with pocketed phones in between.
Figure 5.3. The Hooked model. From Nir Eyal’s 2014 book, Hooked: How to Build Habit-Forming Products. In the book, Eyal warned about the ethical implications of misusing the model in a section titled “The Morality of Manipulation.”
The loop starts with an external trigger, such as a notification that someone commented on one of her posts. That’s step 1, the off-ramp inviting her to leave the path she was on. It appears on her phone and automatically triggers a desire to perform an action (step 2) that had previously been rewarded: touching the notification to bring up the Instagram app. The action then leads to a pleasurable event, but only sometimes, and this is step 3: a variable reward. Maybe she’ll find some expression of praise or friendship, maybe not. This is a key discovery of behaviorist psychology: It’s best not to reward a behavior every time the animal does what you want. If you reward an animal on a variable-ratio schedule (such as one time out of every 10 times, on average, but sometimes fewer, sometimes more), you create the strongest and most persistent behavior.
What the Hooked model adds for humans, which was not applicable for those working with rats, was the fourth step: investment. Humans can be offered ways to put a bit of themselves into the app so that it matters more to them. The girl has already filled out her profile, posted many photos of herself, and linked herself to all of her friends plus hundreds of other Instagram users.
At this point, after investment, the trigger for the next round of behavior may become internal. The girl no longer needs a push notification to call her over to Instagram. As she is rereading a difficult passage in her textbook, the thought pops up in her mind: “I wonder if anyone has liked the photo I posted 20 minutes ago?”
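To make the “variable reward” step concrete, below is a minimal sketch of a variable-ratio schedule like the one described above. The 1-in-10 average payoff is the book’s own example; the function name check_app is my illustrative stand-in, not anything from the book or from any real platform’s code.

```python
import random

def check_app(p_reward=0.10):
    """One pass through the loop: the action (checking the app) sometimes pays off.

    A 10% average payoff approximates the 1-in-10 variable-ratio schedule
    mentioned above; the exact probability is illustrative.
    """
    return random.random() < p_reward   # variable reward: unpredictable payoff

random.seed(0)                          # reproducible demo
checks = 10_000
rewards = sum(check_app() for _ in range(checks))
print(f"{rewards} rewarding checks out of {checks:,}")   # roughly 1,000

# Because any individual check *might* pay off, the behavior persists even
# through long dry stretches, which is the hallmark of variable-ratio schedules.
```

The fourth step, investment (profile, photos, follower ties), is what eventually makes the trigger internal: no notification is needed once the user has enough of themselves stored in the app.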
We know that Facebook intentionally hooked teens using behaviorist techniques thanks to the Facebook Files—the trove of internal documents and screenshots of presentations brought out by the whistleblower Frances Haugen in 2021. In one chilling section, a trio of Facebook employees give a presentation titled “The Power of Identities: Why Teens and Young Adults Choose Instagram.” The stated objective is “to support Facebook Inc.–wide product strategy for engaging younger users.” A section titled “Teen Fundamentals” delves into neuroscience, showing the gradual maturation of the brain during puberty, with the frontal cortex not mature until after age 20.
Unfortunately, when an addicted person’s brain adapts by counteracting the effect of the drug, the brain then enters a state of deficit when the user is not taking the drug. If dopamine release is pleasurable, dopamine deficit is unpleasant. Ordinary life becomes boring and even painful without the drug. Nothing feels good anymore, except the drug. The addicted person is in a state of withdrawal, which will go away only if she can stay off the drug long enough for her brain to return to its default state (usually a few weeks).
Lembke says that “the universal symptoms of withdrawal from any addictive substance are anxiety, irritability, insomnia, and dysphoria.”[57] Dysphoria is the opposite of euphoria; it refers to a generalized feeling of discomfort or unease. This is basically what many teens say they feel—and what parents and clinicians observe—when kids who are heavy users of social media or video games are separated from their phones and game consoles involuntarily. Symptoms of sadness, anxiety, and irritability are listed as the signs of withdrawal for those diagnosed with internet gaming disorder.
Most obviously, those who are addicted to screen-based activities have more trouble falling asleep, both because of the direct competition with sleep and because of the high dose of blue light delivered to the retina from just inches away, which tells the brain: It’s morning time! Stop making melatonin!
Certainly, these digital platforms offer fun and entertainment, as television did for previous generations. They also confer some unique benefits for specific groups such as sexual minority youth and those with autism—where some virtual communities can help soften the pain of social exclusion in the real world.
However, unlike the extensive evidence of harm found in correlational, longitudinal, and experimental studies, there is very little evidence showing benefits to adolescent mental health from long-term or heavy social media use.[66] There was no wave of mental health and happiness breaking out around the world in 2013, as young people embraced Instagram. Teens are certainly right when they say that social media gives them a connection with their friends, but as we’ve seen in their reports of increasing loneliness and isolation, that connection does not seem to be as good as what it replaced.
A second reason why I am skeptical of claims about the benefits of social media for adolescents is that these claims often confuse social media with the larger internet. During the COVID shutdowns I often heard people say, “Thank goodness for social media! How would young people have connected without it?” To which I respond: Yes, let’s imagine a world in which the only way that children and adolescents could connect was by telephone, text, Skype, Zoom, FaceTime, and email, or by going over to each other’s homes and talking or playing outside. And let’s imagine a world in which the only way they could find information was by using Google, Bing, Wikipedia, YouTube,[67] and the rest of the internet, including blogs, news sites, and the websites of the many nonprofit organizations devoted to their specific interests.
A third reason for skepticism is that the same demographic groups that are widely said to benefit most from social media are also the most likely to have bad experiences on these platforms. The 2023 Common Sense Media survey found that LGBTQ adolescents were more likely than their non-LGBTQ peers to believe that their lives would be better without each platform they use.[69] This same report found that LGBTQ girls were more than twice as likely as non-LGBTQ girls to encounter harmful content related to suicide and eating disorders. Regarding race, a 2022 Pew report found that Black teens were about twice as likely as Hispanic or white teens to say they think their race or ethnicity made them a target of online abuse.[70] And teens from low-income households ($30,000 or less) were twice as likely as teens from higher-income families ($75,000 or higher) to report physical threats online (16% versus 8%).
We need to develop a more nuanced mental map of the digital landscape. Social media is not synonymous with the internet, smartphones are not equivalent to desktop computers or laptops, PacMan is not World of Warcraft, and the 2006 version of Facebook is not the 2024 version of TikTok.
Time with friends dropped further because of COVID restrictions, but Gen Z was already socially distanced before COVID restrictions were put in place.
Around 2013, psychiatric wards in the United States and other Anglo countries began to fill disproportionately with girls.
There is a clear, consistent, and sizable link[7] between heavy social media use and mental illness for girls,[8] but that relationship gets buried or minimized in studies and literature reviews that look at all digital activities for all teens.
Taken as a whole, the dozens of experiments that Jean Twenge, Zach Rausch, and I have collected[15] confirm and extend the patterns found in the correlational studies: Social media use is a cause of anxiety, depression, and other ailments, not just a correlate.
This meant that they made eye contact less frequently, laughed together less, and lost practice making conversation. Social media therefore harmed the social lives even of students who stayed away from it.
These group-level effects may be much larger than the individual-level effects, and they are likely to suppress the true size of the individual-level effects.[18] If an experimenter assigns some adolescents to abstain from social media for a month while all of their friends are still on it, then the abstainers are going to be more socially isolated for that month. Yet even still, in several studies, getting off social media improves their mental health. So just imagine how much bigger the effect would be if all of the students in 20 middle schools could be randomly assigned to give up social media for a year, or (more realistically) to put their phones in a phone locker each morning, while 20 other middle schools served as the control group. These are the kinds of experiments we most need in order to examine group-level effects.
Agency arises from striving to individuate and expand the self and involves qualities such as efficiency, competence, and assertiveness. Communion arises from striving to integrate the self in a larger social unit through caring for others and involves qualities such as benevolence, cooperativeness, and empathy.
The two motives are woven together in changing patterns across the life course, and that weaving is particularly important for adolescents who are developing their identities. Part of defining the self comes from successfully integrating into groups; part of being attractive to groups is demonstrating one’s value as an individual with unique skills.[30] Researchers have long found that boys and men are more focused on agency strivings while girls and women are more focused on communion strivings.
It was bad enough when I was growing up in the 1970s and 1980s, when girls were exposed to airbrushed and later photoshopped models. But those were adult strangers; they were not a girl’s competition. So what happened when most girls in a school got Instagram and Snapchat accounts and started posting carefully edited highlight reels of their lives and using filters and editing apps to improve their virtual beauty and online brand? Many girls’ sociometers plunged, because most were now below what appeared to them to be the average. All around the developed world, an anxiety alarm went off in girls’ minds, at approximately the same time.
These tuning apps gave girls the ability to present themselves with perfect skin, fuller lips, bigger eyes, and a narrower waist (in addition to showcasing the most “perfect” parts of their lives).[38] Snapchat offered similar features through its filters, first released in 2015, many of which gave users full lips, petite noses, and doe eyes at the touch of a button.
Girls are especially vulnerable to harm from constant social comparison because they suffer from higher rates of one kind of perfectionism: socially prescribed perfectionism, where a person feels that they must live up to very high expectations prescribed by others, or by society at large.[39] (There’s no gender difference on self-oriented perfectionism, where you torture yourself for failure to live up to your own very high standards.) Socially prescribed perfectionism is closely related to anxiety; people who suffer from anxiety are more prone to it. Being a perfectionist also increases your anxiety because you fear the shame of public failure from everything you do. And, as you’d expect by this point in the story, socially prescribed perfectionism began rising, across the Anglosphere nations, in the early 2010s.
Striving to excel can be healthy when it motivates girls to master skills that will be useful in later life. But social media algorithms home in on (and amplify) girls’ desires to be beautiful in socially prescribed ways, which include being thin. Instagram and TikTok send them images of very thin women if they show any interest in weight loss, or beauty, or even just healthy eating. Researchers for the Center for Countering Digital Hate created a dozen fake accounts on TikTok, registered to 13-year-old girls, and found that TikTok’s algorithm served them tens of thousands of weight-loss videos within a few weeks of joining the platform.
The researchers also noted that “social comparison is worse” on Instagram than on rival apps. Snapchat’s filters “keep the focus on the face,” whereas Instagram “focuses heavily on the body and lifestyle.”
Boys are also more interested in watching stories and movies about sports, fighting, war, and violence, all of which appeal to agency interests and motivations. Traditionally, boys have negotiated who is high and who is low in social status based in part on who could dominate whom if it came to a fight, or who can hurl an insult at whom without fear of violent reprisal. But because girls have stronger communion motives, the way to really hurt another girl is to hit her in her relationships.
Researchers have found that when you look at “indirect aggression” (which includes damaging other people’s relationships or reputations), girls are higher than boys—but only in late childhood and adolescence.
Studies confirm that as adolescents moved their social lives online, the nature of bullying began to change. One systematic review of studies from 1998 to 2017 found a decrease in face-to-face bullying among boys but an increase among girls, especially among younger adolescent girls.
They found that happiness tends to occur in clusters. This was not just because happy people seek each other out. Rather, when one person became happier, it increased the odds that their existing friends would become happier too. Amazingly, it also had an influence on their friends’ friends, and sometimes even on their friends’ friends’ friends. Happiness is contagious; it spreads through social networks.
The second twist was that depression spread only from women. When a woman became depressed, it increased the odds of depression in her close friends (male and female) by 142%. When a man became depressed, it had no measurable effect on his friends.
But on social media, the way to gain followers and likes is to be more extreme, so those who present with more extreme symptoms are likely to rise fastest, making them the models that everyone else locks onto for social learning. This process is sometimes known as audience capture—a process in which people get trained by their audiences to become more extreme versions of whatever it is the audience wants to see.[59] And if one finds oneself in a network in which most others have adopted some behavior, then the other social learning process kicks in too: conformity bias.
The recent growth in diagnoses of gender dysphoria may also be related in part to social media trends. Gender dysphoria refers to the psychological distress a person experiences when their gender identity doesn’t align with their biological sex. People with such mismatches have long existed in societies around the world. According to the most recent diagnostic manual of psychiatry,[68] estimates of the prevalence of gender dysphoria in American society used to indicate rates below one in a thousand, with rates for natal males (meaning those who were biological males at birth) being several times higher than for natal females. But those estimates were based on the numbers of people who sought gender reassignment surgery as adults, which was surely a vast underestimate of the underlying population. Within the past decade, the number of individuals who are being referred to clinics for gender dysphoria has been growing rapidly, especially among natal females in Gen Z.[69] In fact, among Gen Z teens, the sex ratio has reversed, with natal females now showing higher rates than natal males.
Sexual predation and rampant sexualization mean that girls and young women must be warier, online, than most boys and young men. They are forced to spend more of their virtual lives in defend mode, which may be part of the reason that their anxiety levels went up more sharply in the early 2010s.
The clinical psychologist Lisa Damour says that regarding friendship for girls, “quality trumps quantity.” The happiest girls “aren’t the ones who have the most friendships but the ones who have strong, supportive friendships, even if that means having a single terrific friend.”[82] (She notes that this is true for boys as well.)
When teens as a whole cut back on hanging out and doing things together in the real world, their culture changed. Their communion needs were left unsatisfied—even for those few teens who were not on social media.
Two major categories of motivations are agency (the desire to stand out and have an effect on the world) and communion (the desire to connect and develop a sense of belonging). Boys and girls both want each of these, but there is a gender difference that emerges early in children’s play: Boys choose more agency activities; girls choose more communion activities. Social media appeals to the desire for communion, but it often ends up frustrating
The net effect of this push-pull is that boys have increasingly disconnected from the real world and invested their time and talents in the virtual world instead. Some boys will find career success there, because their mastery of that world can lead to lucrative jobs in the tech industry or as influencers. But for many, though it can be an escape from an increasingly inhospitable world, growing up in the virtual world makes them less likely to develop into men with the social skills and competencies to achieve success in the real world.
They calm their anxieties by staying inside, but the longer they stay in, the less competent they become in the outside world, fueling their anxiety about the outside world. They are trapped.
A world with too much supervision and not enough risk is bad for all children, but it seems to be having a larger impact on boys.
But around 2010, something unprecedented started happening: Both sexes shifted rapidly toward the pattern traditionally associated with females. There has been a notable increase in agreement with items related to internalizing disorders (such as “I feel that I can’t do anything right”) for both sexes, with a sharper rise among girls as you can see in figure 7.2. At the same time, agreement with items related to externalizing disorders (such as “how often have you damaged school property on purpose?”) plummeted for both sexes, more sharply for boys. By 2017, boys’ responses looked like those from girls in the 1990s.
By 2015, many boys found themselves exposed to a level of stimulation and attention extraction that had been unimaginable just 15 years earlier.
In previous decades, the main way for heterosexual boys[33] to get a look at naked girls was through what we’d now consider very low-quality pornography—printed magazines that could not be sold to minors. As puberty progressed and the sex drive increased, it motivated boys to do things that were frightening and awkward, such as trying to talk to a girl, or asking a girl to dance at events organized by adults.
When we look at daily users or users for whom porn has become an addiction that interferes with daily functioning, the male-female ratios are generally more than five or 10 to one.
Porn separates the evolved lure (sexual pleasure) from its real-world reward (a sexual relationship), potentially making boys who are heavy users turn into men who are less able to find sex, love, intimacy, and marriage in the real world.
Prevalence estimates vary,[58] but one 2016 study found that 1 or 2% of adult gamers qualify as having gaming addiction, 7% are problematic gamers, 4% are engaged gamers, and 87% are casual gamers.
As Peter Gray and other play researchers point out, one of the most beneficial parts of free play is that kids must act as legislators (who jointly make up the rules) and as judges and juries (who jointly decide what to do when rules appear to be violated). In most multiplayer video games, all of that is done by the platform. Unlike free play in the real world, most video games give no practice in the skills of self-governance.
Video games also deliver far less of the anti-phobic benefits of risky play. Video games are disembodied. They are thrilling in their own way, but they can’t activate the kind of physical fear, thrill, and pounding heart that riding a roller coaster, or playing full-court basketball, or using hammers to smash things at an adventure playground can give. Jumping out of planes, having knife fights, and getting brutally murdered are just things that happen dozens of times each day for boys playing Fortnite or Call of Duty. They do not teach boys how to judge and manage risks for themselves in the real world.
Boys thrive when they have a stable group of reliable friends, and they create their strongest and most durable friendships from being on the same team or in a stable pack, facing risks or rival teams. Virtual packs create weaker bonds, although today’s increasingly lonely boys cling to them and value them because that’s all they have. That’s where their friends are, as Chris told me.
Drawing on data that was just becoming available as governments began to keep statistics, he noted that in Europe the general rule was that the more tightly people are bound into a community that has the moral authority to restrain their desires, the less likely they are to kill themselves.
The phone-based life produces spiritual degradation, not just in adolescents, but in all of us.
But there’s another vertical dimension, shown as the z axis coming out of the page. I called it the divinity axis because so many cultures wrote explicitly that virtuous actions bring one upward, closer to God, while base, selfish, or disgusting actions bring one downward, away from God and sometimes toward an anti-divinity such as the Devil. Whether or not God exists, people simply do perceive some people, places, actions, and objects to be sacred, pure, and elevating; other people, places, actions, and objects are disgusting, impure, and degrading (meaning, literally, “brought down a step”).
Conversely, witnessing people behaving in petty, nasty ways, or doing physically disgusting things, triggers revulsion. We feel pulled “down” in some way. We close off and turn away. Such actions are incompatible with our elevated nature. This is how I’m using the word “spiritual.” It means that one endeavors to live more of one’s life well above zero on the z axis. Christians ask, “What would Jesus do?” Secular people can think of their own moral exemplar. (I should point out that I am an atheist, but I find that I sometimes need words and concepts from religion to understand the experience of life as a human being. This is one of those times.)
In the rest of this chapter, I’ll draw on wisdom from ancient traditions and modern psychology to try to make sense of how the phone-based life affects people spiritually by blocking or counteracting six spiritual practices: shared sacredness; embodiment; stillness, silence, and focus; self-transcendence; being slow to anger, quick to forgive; and finding awe in nature.
This is one of the founding insights of sociology: Strong communities don’t just magically appear whenever people congregate and communicate. The strongest and most satisfying communities come into being when something lifts people out of the lower level so that they have powerful collective experiences. They all enter the realm of the sacred together, at the same time. When they return to the profane level, where they need to be most of the time to address the necessities of life, they have greater trust and affection for each other as a result of their time together in the sacred realm. They are also happier and have lower rates of suicide.
People who live only in networks, rather than communities, are less likely to thrive.
Living in a world of structureless anomie makes adolescents more vulnerable to online recruitment into radical political movements that offer moral clarity and a moral community, thereby pulling them further away from their in-person communities.
DeSteno notes that synchronous movement during religious rituals is not only very common; it is also an experimentally validated technique for enhancing feelings of communion, similarity, and trust, which means it makes a group of disparate individuals feel as though they have merged into one.
Sports are not exactly spiritual, but playing them depends on some of spirituality’s key ingredients for bonding people together, like coordinated and collective physical movement and group celebrations. Research consistently shows that teens who play team sports are happier than those who don’t.
One of the fundamental teachings of the Buddha is that we can train our minds.
This is why many religions have monasteries and monks. Those seeking spiritual growth are well served by separating themselves from the noise and complexity of human interactions, with their incessant words and profane concerns. When people practice silence in the company of equally silent companions, they promote quiet reflection and inner work, which confers mental health benefits. Focusing your attention and meditating have been found to reduce depression and anxiety.
You stand on the platform and post content to influence how others perceive you. It is almost perfectly designed to crank up the DMN to maximum and pin it there. That’s not healthy for any of us, and it’s even worse for adolescents.
Social media is a fountain of bedevilments. It trains people to think in ways that are exactly contrary to the world’s wisdom traditions: Think about yourself first; be materialistic, judgmental, boastful, and petty; seek glory as quantified by likes and followers. Many users may believe that the implicit carrots and sticks built into platforms like Instagram don’t affect them, but it’s hard not to be affected unconsciously. Unfortunately, most young people become heavy users of social media during the sensitive period for cultural learning, which runs from roughly age 9 to 15.
The Tao Te Ching lists “ideas of right and wrong” as a bedevilment. In my 35 years of studying moral psychology, I have come to see this as one of humanity’s greatest problems: We are too quick to anger and too slow to forgive.
But I believe his point was that the mind, left to its own devices, evaluates everything immediately, which shapes what we think next, making it harder for us to find the truth. This insight is the foundation of the first principle of moral psychology, which I laid out in The Righteous Mind: Intuitions come first, strategic reasoning second. In other words, we have an immediate gut feeling about an event, and then we make up a story after the fact to justify our rapid judgment—often a story that paints us in a good light.
From a spiritual perspective, social media is a disease of the mind. Spiritual practices and virtues, such as forgiveness, grace, and love, are a cure.
In 2003, Dacher Keltner and I published a review paper on the emotion of awe in which we argued that awe is triggered by two simultaneous perceptions: first, that what you are looking at is vast in some way, and, second, that you can’t fit it into your existing mental structures.[30] That combination seems to trigger a feeling in people of being small in a profoundly pleasurable—although sometimes also fearful—way. Awe opens us to changing our beliefs, allegiances, and behaviors.
The great evolutionary biologist E. O. Wilson said that humans are “biophilic,” by which he meant that humans have “the urge to affiliate with other forms of life.”
Yet one of the hallmarks of the Great Rewiring is that children and adolescents now spend far less time outside, and when they are outside, they are often looking at or thinking about their phones. If they encounter something beautiful, such as sunlight reflected on water, or cherry blossoms wafting on gentle spring breezes, their first instinct is to take a photograph or video, perhaps to post somewhere. Few are open to losing themselves in the moment as Yi-Mei did.
As for our children, if we want awe and natural beauty to play a larger role in their lives, we need to make deliberate efforts to bring them or send them to beautiful natural areas. Without phones.
Soon before his death in 1662, the French philosopher Blaise Pascal wrote a paragraph often paraphrased as “there is a God-shaped hole in every human heart.”
It matters what we expose ourselves to. On this the ancients universally agree. Here is Buddha: “We are what we think. All that we are arises with our thoughts.”[37] And here is Marcus Aurelius: “The things you think about determine the quality of your mind. Your soul takes on the color of your thoughts.”[38] In a phone-based life, we are exposed to an extraordinary amount of content, much of it chosen by algorithms and pushed to us via notifications that interrupt whatever we were doing. It’s too much, and a lot of it pulls us downward on the divinity dimension. If we want to spend most of our lives above zero on that dimension, we need to take back control of our inputs. We need to take back control of our lives.
Awe in nature may be especially valuable for Gen Z because it counteracts the anxiety and self-consciousness caused by a phone-based childhood.
called back when a safety issue was discovered. After the Titanic sank in 1912, its two sister ships were pulled out of service and modified to make them safer. When new consumer products are found to be dangerous, especially for children, we recall them and keep them off the market until the manufacturer corrects the design.
Parents face collective action problems around childhood independence too. It was easy to send kids out to play back when everyone was doing it, but in a neighborhood where nobody does that, it’s hard to be the first one. Parents who let their children walk or play unchaperoned in a public place face the risk that a misguided neighbor will call the police, who may refer the case to Child Protective Services, who’d then investigate them for “neglect” of their children. Each parent decides that it’s best to do what every other parent is doing: Keep kids supervised, always, even if that stunts the development of all children.
If Instagram were to make a real effort to block or expel underage users, it would lose those users to TikTok and other platforms. Younger users are particularly valuable because the habits they form early often stick with them for life, so companies need younger users to ensure robust future usage of their products.
What is the right age of internet adulthood? Note that we are not talking about the age at which children can browse the web or watch videos on YouTube or TikTok. We’re talking only about the age at which a minor can enter into a contract with a company to use the company’s products. We’re talking about the age at which a child can open an account on YouTube or TikTok and begin uploading her own videos and getting her own highly customized feed, while giving her data to the company to use and sell as it says it will do in its terms of service.
We expect liquor stores to enforce age limits. We should expect the same from tech companies.
Parents should have a way of marking their child’s phones, tablets, and laptops as devices belonging to a minor. That mark, which could be written either into the hardware or the software, would act like a sign that tells companies with age restrictions, “This person is underage; do not admit without parental consent.”
Governments at all levels, from local to federal, could support this transition by allocating funds to pay the small cost of buying phone lockers or lockable pouches for any school that wants to keep phones out of students’ pockets and hands during the school day.
Yet in some U.S. states, such as Connecticut, the law said a child should never be left alone in public before the age of 12, meaning that 11-year-olds needed babysitters. Indeed, a Connecticut mom was arrested for letting her 11-year-old wait in the car while she ran into the store.
Tech companies can be a major part of the solution by developing better age verification features, and by adding features that allow parents to designate their children’s phones and computers as ones that should not be served by sites with minimum ages until they are old enough. Such a feature would help to dissolve multiple collective action problems for parents, kids, and platforms.
Voss says that when he walks into a school without a phone ban, “It’s kind of like the zombie apocalypse, and you have all these kids in the hallways not talking to each other. It’s just a very different vibe.”
In other words, the phone ban ameliorates three of the four foundational harms of the phone-based childhood: attention fragmentation, social deprivation, and addiction. It reduces social comparison and the pull into the virtual world. It generates communion and community.
(Some parents object that they need to be able to reach their children immediately in case of an emergency, such as a school shooting. As a parent I understand this desire. But a school in which most students are calling or texting their parents during an emergency is likely to be less safe than a school in which only the adults have phones and the students are listening to the adults and paying attention to their surroundings.[6])
The value of phone-free and even screen-free education can be seen in the choices that many tech executives make about the schools to which they send their own children, such as the Waldorf School of the Peninsula, where all digital devices—phones, laptops, tablets—are prohibited. This is in stark contrast with many public schools that are advancing 1:1 technology programs, trying to give every child their own device.[9] Waldorf is probably right.
The “digital divide” is no longer that poor kids and racial minorities have less access to the internet, as was feared in the early 2000s; it is now that they have less protection from
“It seems small. But in the moment, when I saw her get on the bus and it drove away, I felt really important to her, important to someone.” That’s what was so new to her. At last, instead of feeling needy, she was needed.
We should all be aghast that the average American elementary school student gets only 27 minutes of recess a day.[19] In maximum-security federal prisons in the United States, inmates are guaranteed two hours of outdoor time per day.
The key thing to understand about “loose parts” playgrounds is that kids have control over their environment. They have agency. Playgrounds with fixed structures can hold kids’ attention only so long. But loose parts keep kids’ attention for hours, allowing them to build not only forts and castles but also focus, compromise, teamwork, and creativity.
Kids will take on responsibility for their safety when they are actually responsible for their safety, rather than relying on the adult guardians hovering over them.
The Let Grow Project is another activity that seems to reduce anxiety. It is a homework assignment that asks children to “do something they have never done before, on their own,” after reaching agreement with their parents as to what that is. Doing projects increases children’s sense of competence while also increasing parents’ willingness to trust their children and grant them more autonomy.
New parents lost access to local wisdom and began to rely more on experts.
After school is for free play. Try not to fill up most afternoons with adult-supervised “enrichment” activities. Find ways that your children can just hang out with other children such as joining a Play Club (see chapter 11), or going to each other’s homes after school. Friday is a particularly good day for free play because children can then make plans to meet up over the weekend. Think of it as “Free Play Friday.”
The cure for such parental anxiety is exposure. Experience the anxiety a few times, taking conscious note that your worst fears did not occur, and you learn that your child is more capable than you had thought. Each time, the anxiety gets weaker. After five days of our son walking to school, we stopped watching his blue dot. We got more comfortable with his ability to navigate the city, and soon its subway system.
Delay the opening of social media accounts until 16. Let your children get well into puberty, past the most vulnerable early years, before letting them plug into powerful socializing agents like TikTok or Instagram. This
Encourage more and better off-base excursions with friends. Let your teen hang out at a “third place” (not home or school) like the Y, the mall, the park, a pizzeria—basically, a place where they can be with their friends, away from adult supervision. Otherwise, the only place they can socialize freely is online.
Rely more on your teen at home. Teens can cook, clean, run errands on a bicycle or public transit, and, once they turn 16, run errands using a car. Relying on your teen is not just a tool to instill work ethic. It’s also a way to ward off the growing feeling among Gen Z teens that their lives are useless.
Encourage your teen to find a part-time job.
Find ways for them to nurture and lead. Any job that requires guiding or caring for younger children is ideal, such as a babysitter, camp counselor, or assistant coach. Even as they need mentors themselves, they can serve as a mentor to younger kids. Helping younger kids seems to turn on an empathy switch and a leadership gene.
Take a gap year after high school. Many young people go directly to college without any sense of what else is out there.
Risking a serious injury for no good reason is dumb. But some risk is part of any hero’s journey, and there’s plenty of risk in not taking the journey too.
Their parents got smartphones too. Those smartphones gave parents a new superpower that they did not have in the era of flip phones: the ability to track their children’s movements at every moment.
Whether we think of the phone as “the world’s longest umbilical cord” or as an “invisible fence,” childhood autonomy plummeted when kids started carrying them.
I didn’t set out to write this book. In late 2021, I began writing a book on how social media was damaging American democracy. My plan was to begin with a chapter on the impact of social media on Gen Z, showing how it disrupted their social lives and caused a surge of mental illness. The rest of the book would analyze how social media disrupted society more broadly. I’d show how it fragmented public discourse, Congress, journalism, universities, and other foundational democratic institutions. But when I finished writing that first chapter—which became chapter 1 of this book—I realized that the adolescent mental health story was so much bigger than I had thought. It wasn’t just an American story, it was a story playing out across many Western nations.
Until someone finds a chemical that was released in the early 2010s into the drinking water or food supply of North America, Europe, Australia, and New Zealand, a chemical that affects adolescent girls most, and that has little effect on the mental health of people over 30, the Great Rewiring is the leading theory.
In part 4, I offered dozens of suggestions, but the four foundational reforms are:
- No smartphones before high school
- No social media before 16
- Phone-free schools
- Far more unsupervised play and childhood independence
If a community enacts all four, they are likely to see substantial improvements in child and adolescent mental health within two years.
You shouldn’t have to compete for your students’ attention with the entire internet.
Growing up in the virtual world promotes anxiety, anomie, and loneliness. The Great Rewiring of Childhood, from play-based to phone-based, has been a catastrophic failure. It’s time to end the experiment. Let’s bring our children home.
-
Eumeswil by Ernst Junger
From my Notion book template
What It’s About
Part philosophical ramble, part science fiction, all world building – Eumeswil is a book of fiction posing as the diary/notes/ruminations of a full-time history grad student and part-time bartender to the tyrant (the term tyrant is used descriptively, not pejoratively) of a city state somewhere in North Africa in the far, far future. This weird proximity to both history in the past (via his education) and history in the making (the tyrant does most of his business with his underlings at the bar) spawns weird ruminations and insights into the relationship between the individual, the state, and society.
Junger also creates one of his signature concepts, namely “The Anarch” – defined as “The Anarch is to the anarchist what the monarch is to the monarchist.” A more useful definition would be something like “An Anarch is someone who is unaffected inwardly by government and society, even if outwardly affected.” If that sounds like a stoic “sage,” i.e. one who has mastered stoicism, then you’re pretty close. The main difference between the sage and the anarch is that the sage concerns himself with serenity, emotional control, and happiness, whereas the anarch resists all influence. Hey, Germans!
How I Discovered It
Amazon was kind enough to suggest it to me as “Something you might like”
Thoughts
Reading this book took me quite a long time – I found myself reading and then rereading whole pages just to make sure I was understanding things correctly. The Kindle feature of highlighting a word to get its definition was extremely helpful with this book.
As I write this I’m struck by the utter absence of any Old or New Testament influence in the book. Biblical influence, in one way or another, seeps into almost any moral lesson or redemption arc (that’s what we’re used to, as if there were no other way to do moral lessons or redemption arcs – the Bible as literature and all that). There was none of that in Eumeswil. Instead there were lots of lessons from Greek and Roman mythology, and a smattering of Norse. Classical Roman and Greek leaders were raised to near-mythological levels as well. The most prominent example was the parallel between the narrator’s discovery of letters from his dad pressuring his mom to have an abortion and the myth of Zeus, Rhea, and Kronos.
A thought I had was that Junger is a classically educated writer, but one educated on classics other than the Bible.
This book was yet another example of the “fully formed observer looks out at the world” genre I’m so fond of.
What I Didn’t Like About It
The biggest negative about the book was that pretty much nothing happens – Eumeswil is 99% world building, 1% plot. Other massive negatives are what I can only presume are unavoidable translation problems and a mind-boggling use of archaic terms (which is odd for a sort-of sci-fi novel written in the 1970s). I’m normally a fan of such things, but this was a lot, even for me.
Who Would Like It?
I liked it, but I’m not sure I can think of anyone else who would. It appealed to my sense of rationalism and my fondness for magical realism in fiction, dystopias, works by people born before 1920, and legions of baroque historical references.
Related Books
On the Marble Cliffs by Ernst Junger, Memoirs of a Superfluous Man by Albert Jay Nock
Highlights
But if an utterance begins with a lie, so that it has to be propped up by more and more lies, then eventually the structure collapses. Hence my suspicion that Creation itself began with a fraud. Had it been a simple mistake, then paradise could be restored through evolution. But the Old Man concealed the Tree of Life.
The Condor sets great store by visual acuteness: seldom does a candidate who wears glasses stand a chance with him.
So one can comfortably let time pass—time itself provides enjoyment. Therein, presumably, lies the secret of tobacco—indeed, of any lighter drug.
The Condor feels that the presence of women, whether young or old, would only promote intrigue. Still, it is hard to reconcile the rich diet and leisurely life-style with asceticism.
When we look back, our eyes alight on graves and ruins, on a field of rubble. We are then inveigled by a mirage of time: while believing that we are advancing and progressing, we are actually moving toward that past. Soon we will belong to it: time passes over us. And this sorrow overshadows the historian.
Among the animals, he says, the bees have rediscovered this kinship. Their mating with the flowers is neither a forward nor a backward step in evolution, it is a kind of supernova, a flashing of cosmogonic eros in a favorable conjunction. Even the boldest thinking has not yet hit on that, he says; the only things that are real are those that cannot be invented.
The goal was the copper flasks in which King Solomon had jailed rebellious demons. Now and again, the fishermen who cast their nets in the El-Karkar Sea would haul up one of these flasks in their catches. They were closed with the seal of Solomon; when they were opened, the demon spurted forth as smoke that darkened the sky.
This emir, the conqueror of Northwest Africa, may be regarded as their prototype. His Western features are unmistakable; of course, we must bear in mind that the distinctions between races and regions vanish on the peaks. Just as people resemble one another ethically, indeed become almost identical, when approaching perfection, so too spiritually. The distance from the world and from the object increases; curiosity grows and with it the desire to get closer to the ultimate secrets, even amid great danger. This is an Aristotelian trait. One that makes use of arithmetic.
Evil becomes all the more dreadful the longer it is deprived of air.
The loss of perfection can be felt only if perfection exists.
As the word is weighed by the poet, so, too, must the deed be weighed by the historian—beyond good and evil, beyond any conceivable ethics.
I contented myself, as I have mentioned, with shaking my head; it is better, especially among men, for emotions to be guessed rather than verbalized.
These are the suspicions with which two sorts of faculty members operate here: they are either crooks disguised as professors or professors posing as crooks in order to gain popularity. They try to outdo one another in the race for infamy.
Basically, it is beauty that he serves. Power and riches should be its thralls.
To be sure, extremely importunate persecutorial types thrive in our putrid lagoon. “Each student is a viper nursed in the bosom,” Vigo once said to me in a gloomy moment when speaking about Barbassoro, who, granted, belongs more to the species of purebred rats.
He has an instinct for conformity and for irresistible platitudes, which he stylizes in a highbrow manner. He can also reinterpret them, depending on which way the wind is blowing.
No salvation comes from exhumed gods; we must penetrate deeper into substance.
A man who knows his craft is appreciated anywhere and anytime. This is also one of the means of survival for the aristocrat, whose diplomatic instinct is almost irreplaceable.
They cut their finest figure in their obituaries. As survivors, they soon become unpleasant again.
Distinctions must be drawn here: love is anarchic, marriage is not. The warrior is anarchic, the soldier is not. Manslaughter is anarchic, murder is not. Christ is anarchic, Saint Paul is not. Since, of course, the anarchic is normal, it is also present in Saint Paul, and sometimes it erupts mightily from him. Those are not antitheses but degrees. The history of the world is moved by anarchy. In sum: the free human being is anarchic, the anarchist is not.
The anarch can lead a lonesome existence; the anarchist is sociable and must get together with peers.
The positive counterpart of the anarchist is the anarch. The latter is not the adversary of the monarch but his antipode, untouched by him, though also dangerous. He is not the opponent of the monarch, but his pendant. After all, the monarch wants to rule many, nay, all people; the anarch, only himself.
According to Thales, the rarest thing he encountered in his travels was an old tyrant.
My mother died young, during my early school years. I regarded the loss as a second birth, an expulsion into a brighter, colder foreign land—this time consciously.
My mother had been the world for me; she gradually became a person.
When I could no longer be thought away, he tackled me physically. I do not wish to go into detail. In any case, while floating in the amniotic fluid, I was menaced with dangerous adventures, like Sindbad the Sailor. He tried to get at me with poisons and sharp instruments and also with the help of an accomplice on the medical faculty. But my mother stuck by me, and that was my good fortune.
The ancients depicted time as Cronus, who eats his own children. As a Titan, the father devours his engendered son; as a god, he sacrifices him. As a king, he squanders him in the wars that he instigates. Bios and myth, history and theology offer any number of examples. The dead return not to the father, but to the mother.
When he swaggers, I sometimes feel like reminding him of the map room and the tricks he harassed my mother with. She sheltered me from him in her cavern just as Rhea shielded her Zeus against the gluttonous Cronus.
There are truths that we must hush if we are to live together; but you cannot knock over the chessboard.
The person who teaches us how to think makes us lords over men and facts.
Bruno, too, considers the situation in Eumeswil favorable: the historical substance is used up. Nothing is taken seriously now except for the gross pleasures and also the demands of everyday life. The body social resembles a pilgrim who, exhausted by his wanderings, settles down to rest. Now images can come in. These ideas also had a practical meaning for my work.
Florence was enough for a Machiavelli.
Once, people got fed up with pure dynamics, and so technology declined in the larger areas. This was matched on the other side by its plutonian concentration in the hands of a small, now autonomous personnel.
Although an anarch, I am not anti-authoritarian. Quite the opposite: I need authority, although I do not believe in it. My critical faculties are sharpened by the absence of the credibility that I ask for. As a historian, I know what can be offered.
In the animal kingdom, there are parasites that clandestinely hollow out a caterpillar. Eventually, a mere wasp emerges instead of a butterfly. And that is what those people do with their heritage, and with language in particular, as counterfeiters; that is why I prefer the Casbah, even from behind my counter.
I am curious by nature; this is indispensable for the historian. A man is a born historian or else he is boring.
I consider it poor historical form to make fun of ancestral mistakes without respecting the eros that was linked to them. We are no less in bondage to the Zeitgeist; folly is handed down, we merely don a new cap.
I therefore would not resent my genitor for merely believing in a fallacy; no one can help that. What disturbs me is not error but triteness, the rehashing of bromides that once moved the world as grand utterances. Errors can shake the political world to its very core; yet they are like diseases: in a crisis, they can accomplish a great deal, and even effect a cure—as hearts are tested in a fever. An acute illness: that is the waterfall with new energies. A chronic illness: sickliness, morass. Such is Eumeswil: we are wasting away—of course, only for lack of ideas; otherwise, infamy has been worthwhile.
Thus, it is the language of a man who knows what he wants and who transfers this wanting to others: Dico: “I speak”; dicto: “I speak firmly, dictate.” The t concentrates.
The Domo said, “Whatever a man does in bed or even in a stable is his own business; we do not interfere. Bien manger, bien boire, bien foutre—by giving our blessing to all that, we relieve the police and the courts of an enormous workload. This way, aside from lunatics and gross criminals, we only have to deal with do-gooders, who are more dangerous.
Tyranny must value a sound administration of justice in private matters. This, in turn, increases its political authority. The latter rests on equality, to which tyranny sacrifices freedom. Tyranny is intent on overall leveling, which makes it akin to rule by the people. Both structures produce similar forms. They share a distaste for elites that nurture their own language and recognize themselves in it; poets are even hated.
The idea of the Eternal Return is that of a fish that wants to jump out of the frying pan. It falls on the stove plate.
As I have already said, I have nothing against authority, nor do I believe in it. Rather, I need authority, for I have a conception of greatness. That is why, although not without skepticism here too, I associate with the top rank.
We play on slanting chessboards. If some day his pontiffs—and I do not doubt it—topple the Condor, then Eumeswil will once again celebrate liberazione—the transition, that is, from visible to anonymous power. For a long time now, soldiers and demagogues have been spelling one another.
How, then, shall I classify the Condor? Among the tyrants—though not to be doubted, it says little. According to linguistic usage, tyrants find a more fertile soil in the West and despots in the East. Both are unbounded, but the tyrant follows certain rules, the despot his cravings. That is why tyranny is bequeathed more easily, though at most to a grandchild. The bodyguard is likewise more reliable, as is one’s own son. Despite profound disagreements, Lycophron, the son of Periander, rebels against his father only in spirit but not in deed.
Such is the role of the anarch, who remains free of all commitments yet can turn in any direction.
Gullibility is the norm; it is the credit on which states live: without it, even their most modest survival would be impossible.
Tiberius is remarkable for his character; the sheer fact that he, virtually as a private citizen, could hold on to the reins for such a long time verges on witchcraft.
I was reckless enough to broach this topic at the family table, only to reap an answer worthy of my genitor: namely, that the invention of the phonograph has rendered such speculations null and void. The inventor was, I believe, an especially disagreeable American, a disciple of Franklin’s named Edison.
Action is more easily emulated than character; this is borne out by the bromidic reiterations in world history.
The special trait making me an anarch is that I live in a world which I “ultimately” do not take seriously. This increases my freedom; I serve as a temporary volunteer.
The world civil war changed values. National wars are fought between fathers, civil wars between brothers. It has always been better to fall under the father’s hand than into the brother’s; it is easier being an enemy of another nation than another class.
For the anarch, little is changed when he strips off a uniform that he wore partly as fool’s motley, partly as camouflage. It covers his spiritual freedom, which he will objectivate during such transitions. This distinguishes him from the anarchist, who, objectively unfree, starts raging until he is thrust into a more rigorous straitjacket.
Now, I am not putting down fear. It is a foundation of physicality, indeed of physics. If the ground wobbles or if the house so much as threatens to collapse, one looks for the door. This, too, creates a selection—say, of those people who did not fall into the trap. In this respect, Odysseus is one of our greatest models—the whiffer par excellence. Fear is primary: the instinctive whiffing of danger. It is joined by caution, then canniness and also cunning. Odysseus’ caution is so extraordinary because he also has courage and curiosity. He is the harbinger of Western man’s intellect, boldness, and inquiring mind.
“Dear friend, where have you been? We haven’t seen you in ages.” “I’ve been living.”
Man is a rational being who does not like sacrificing his safety to theories. Placards come and go, but the wall they are pasted on endures. Theories and systems pass over us in the same way.
Incidentally, I notice that our professors, trying to show off to their students, rant and rail against the state and against law and order, while expecting that same state to punctually pay their salaries, pensions, and family allowances, so that they value at least this kind of law and order. Make a fist with the left hand and open the right hand receptively—that is how one gets through life. This was easier under the tribunes; it is also one reason for my dear brother’s nostalgia for their splendor. Yet he himself helped to saw off their branch.
The political trend is always to be observed, partly as a spectacle, partly for one’s own safety. The liberal is dissatisfied with every regime; the anarch passes through their sequence—as inoffensively as possible—like a suite of rooms. This is the recipe for anyone who cares more about the substance of the world than its shadow—the philosopher, the artist, the believer.
The last time must have been after the Second World War—that is, after the final triumph of the technician over the warrior.
There is simply nothing new in the cosmos; otherwise the universe would not deserve its name.
The difference will be obvious when I go to my forest shack while my Lebanese joins the partisans. I will then not only hold on to my essential freedom, but also gain its full and visible enjoyment. The Lebanese, by contrast, will shift only within society; he will become dependent on a different group, which will get an even tighter hold on him.
The partisan operates on the margins; he serves the great powers, which arm him with weapons and slogans. Soon after the victory, he becomes a nuisance. Should he decide to maintain the role of idealist, he is made to see reason.
As I have said, I have nothing to do with the partisans. I wish to defy society not in order to improve it, but to keep it at bay no matter what.
As for the do-gooders, I am familiar with the horrors that were perpetrated in the name of humanity, Christianity, progress. I have studied them. I do not know whether I am correctly quoting a Gallic thinker: “Man is neither an animal nor an angel; but he becomes a devil when he tries to be an angel.”
The partisan wants to change the law, the criminal break it; the anarch wants neither. He is not for or against the law. While not acknowledging the law, he does try to recognize it like the laws of nature, and he adjusts accordingly.
The difference is that the forest fleer has been expelled from society, while the anarch has expelled society from himself. He is and remains his own master in all circumstances.
Incidentally, most revolutionaries suffer from not having become professors. The Domo knows this, too: once, at the night bar, I heard him telling the Condor: “We’ll make him a professor—that should take him off our backs.”
I began with the respect that the anarch shows toward the rules. Respectare as an intensive of respicere means: “to look back, think over, take into account.” These are traffic laws. The anarchist resembles a pedestrian who refuses to acknowledge them and is promptly run down. Even a passport check is disastrous for him. “I never saw a cheerful end,” as far back as I can look into history. In contrast, I would assume that men who were blessed with happiness—Sulla, for example—were anarchs in disguise.
Cadmo, to enlighten me, often takes me along to his “Storm Companions.” I am not really welcome there—perhaps they even regard me as an agent of the Domo, who, by the by, knows about their meetings but considers them irrelevant, indeed almost useful. “A barking dog never bites.”
The true historian is more of an artist, especially a tragedian, than a man of science.
Let me repeat that I prefer the history of cultures to the history of states. That is where humanity begins and ends. Accordingly, I value the history of royal courts and even back courts over that of politics and parties. History is made by people and at most regulated by laws; that is why it is so inexhaustible with surprises.
Intellectual rank was no longer to be identified by a mastery of language. The result is a banal chitchat defective in both its heights and its depths.
Similarly, when elites have grown rare or shrunk down to a few individuals, the clear, unadulterated word convinces the uneducated man—indeed, precisely him, the non-miseducated man. He senses—and this puts his mind at ease—that the ruler still observes rules despite his power. Caesar non supra grammaticos. A solace in periods of decline.
The large-scale demagogue, who turned up when the planet Pluto was discovered, dabbled in painting just as Nero did in singing. He persecuted painters whose works he did not like. He dabbled in other areas, too—for instance, as a strategist who doomed many people, but was technically perfect; as a chauffeur in all directions, who eventually had himself cremated with the help of gasoline. His outlines melt into insignificance; the torrent of numbers wipes them out. The pickings are slim for both the historian and the anarch. Red monotony, even in the atrocities.
The anarch thinks more primitively; he refuses to give up any of his happiness. “Make thyself happy” is his basic law. It is his response to the “Know thyself” at the temple of Apollo in Delphi. These two maxims complement each other; we must know our happiness and our measure.
At times, I suspect the Condor of hoping to turn Eumeswil into a small-scale Florence; he would then have his Machiavelli in the Domo.
Transcendence is the side track of reason. The world is more miraculous than as depicted by sciences and religions. Only art has any inkling of it.
One error of the anarchists is their belief that human nature is intrinsically good. They thereby castrate society, just as the theologians (“God is goodness”) castrate the Good Lord. This is a Saturnian trait.
will content myself with his maxim: “Primal image is image and mirror image.” His actual stratagem was to reduce the Platonic idea to phenomenon, thereby reanimating matter, which had been emasculated by abstract thinking. A miracle, he said, could not be expected from above or from the future—say, from a world spirit ascending from level to level; despite its variable elements, he said, a miracle always remains the same, in every blade of grass, in every pebble.
A little generosity is worth more than a lot of administration. The tribunes were redistributors; they raised the prices of bread for the poor in order to make them happy with their ideas—say, by building extravagant universities whose jobless graduates became a burden to the state (hence once again to the poor) and never touched another hammer. The pauper, so long as he does not think parasitically, wishes to see as little government as possible, no matter what pretexts the state may use. He does not want to be schooled, vaccinated, or conscripted; all these things have senselessly increased the numbers of the poor, and with them, poverty.
I stand before the mirror and view Emanuelo: clothing, physical appearance, smile, and movements must be casual and pleasant. It is important—we can learn this from women—to look the way others picture us in their wishes.
Like many young men with time on their hands, he occupied his mind with the “perfect crime”—about which he also had a theory.
I have noticed that a cat will turn up her nose at a piece of meat if I hand it to her, but she will devour it with gusto if she has “stolen” it. The meat is the same, but the difference lies in the predator’s delight in recognizing itself.
Opposition is collaboration; this was something from which Dalin, without realizing it, could not stay free. Basically, he damaged order less than he confirmed it. The emergence of the anarchic nihilist is like a goad that convinces society of its unity.
In Eumeswil, abortion is one of the actions that are punishable but not prosecuted. They include, among other things, gambling, smoking opium,
A demonological literature à la The Witches’ Hammer still exists, but underground. Whenever it has an effect, whenever it turns virulent, one can assume other causes—above all, a cosmic angst in search of objects.
“The hunter has companions, but tillage brought slavery, killing became murder. Freedom ended; the game was driven away. In Cain a descendant of the primal hunter was resurrected, his avenger, perhaps. Genesis supplies only a rumor about all this. It hints at Yahweh’s bad conscience regarding the slayer.”
Otherwise the Inuits were thoroughly corrupted by dealing with the whalers, who, next to the sandalwood skippers, were notoriously the worst villains ever to plow the seas. From them, they had learned how to smoke, drink, and gamble. They gambled away their dogs, boats, weapons, and also their wives; a woman might change hands five times in a single night.
But this did not seem to be Attila’s point. His guiding thought in that discussion (which, as we recall, concerned abortion) was, more or less: It is reprehensible to delegate a misdeed. The hunter takes his son to the mother’s grave and kills him. He does not assign the task to anyone else—not his brother, not the shaman; he carries it out himself.
My father hounded me when my life was frailest. This may be our most exquisite time. My mother concealed me from him in her womb, like Rhea hiding Zeus in the grotto of Ida to shield him from the clutches of a voracious Cronus. Those are monstrous images; they make me shudder—conversations between matter and time. They lie as erratic boulders, uninterpreted, beneath the surveyed land.
Such are the standards in Eumeswil, a fellah society that periodically suffers moral harassment from demagogues until generals come and insert an artificial spine.
And revolutions lose their charm if they become permanent fixtures. Tyrannicide, the killing of the tyrannus absque titulo, presumes the existence of underdogs of quality.
The Casbah has a rule that an execution must be done by hand and that blood must flow. Criminals are decapitated, politicals shot. The public viewing is guaranteed, but limited.
Above all, I believe, Salvatore owed his life to the Domo’s secret sympathy with criminals. I notice that his head begins swaying almost benevolently whenever the conversation turns to major felonies. This happens less with fraud and property offenses than with armed robbery and violence, which have stirred the imagination since time immemorial. In spreading terror, the forces they unleash confirm the ruler and his justice. Such observations could support theories that power per se is evil.
“Most offenses can be taken care of quickly and painfully with a flogging. Who would not prefer that to a longer incarceration? Everyone is unanimous on this issue—the culprit, the judge, the opinio publica. Certain offenses simply cry for a flogging. It clears the air. While the deterrent effect may be arguable for capital punishment, it is beyond all question for corporal punishment. Besides, the latter makes reparation possible—compensation makes more sense for pain than for false imprisonment.”
The rulers change, the prisons abide; they are even overcrowded with each new regime.
Protection against aerial landings was assured by permanently revolving projectiles, which had come down to our era along with other remnants of the age of high technology.
The selection of inmates for the individual islands has led to sociological experiments. But, whatever the mixture of deportees, the initial “anything goes” situation soon developed into an authoritarian system.
The original and semi-mythical Brutus killed the last Roman king, his historical descendant killed the first caesar—both with their own hands. One commenced and one concluded the five-hundred-year history of the republic.
But when something that was already boring in the editorials read at breakfast is passed off as elite wisdom, then you feel annoyed.
The anarch is no individualist either. He wishes to present himself neither as a Great Man nor as a Free Spirit. His own measure is enough for him; freedom is not his goal; it is his property. He does not come on as a foe or reformer: one can get along with him nicely in shacks or in palaces. Life is too short and too beautiful to sacrifice it for ideas, although contamination is not always avoidable. But hats off to the martyrs.
The Domo has a sharp eye for anything concerning greetings and clothing, and rightly so, for therein lies the start of insubordination. If a man is not reprimanded for leaving his top button open, he will soon walk in naked.
At first blush, the anarch seems identical with the anarchist in that both assume that man is good. The difference is that the anarchist believes it while the anarch concedes it. Thus, for the anarch it is a hypothesis, for the anarchist an axiom. A hypothesis must be confirmed in each individual case; an axiom is unshakable. It is followed by personal disappointments. Hence, the history of anarchism is a series of schisms. Ultimately, the individual remains alone, a despairing outcast.
So much for the transmission of texts and their combination. The Tower of Babel was dismantled brick by brick, quantified, and rebuilt. A question-and-answer game leads to the upper stories, the chambers, the details of its appointments. This suffices for the historian who practices history as a science.
A conversation with someone who introduces himself as a realist usually comes to a vexatious end. He has a limited notion of the thing, just as the idealist does of the Idea or the egoist of the self. Freedom is labeled. This also holds for the anarchist’s relationship to anarchy.
In a town where thirty anarchists get together, they herald the smell of fires and corpses. These are preceded by obscene words. If thirty anarchists live there without knowing one another, then little or nothing happens; the atmosphere improves.
As in everyone, as in all of us, the anarch is also concealed in the anarchist—the latter resembling an archer whose arrow has missed the bull’s-eye.
Above all, the anarch must not think progressively. That is the anarchist’s mistake; he thereby lets go of the reins.
Merlino, one of the disillusioned, hit the nail on the head: “Anarchism is an experiment.”
Taking part in civil but not national wars is consistent with anarchist logic.
This revolution is bizarre in that throughout the European countries where it took place, it achieved the exact opposite of its goals, thereby damming up the world torrent for nearly a hundred years. The reasons have been examined from different vantage points. In medicine, such a process is known as maladie de relais: a disease providing new impulses—in this case, say, Bismarck and Napoleon III.
The anarch can face the monarch unabashedly; he feels like an equal even among kings. This basic mood affects the ruler; he senses the candid look. This produces a mutual benevolence favorable to conversation.
Spain is one of the great strongholds of reactionism, just as England is a bulwark of liberalism, Sicily of tyranny, Silesia of mysticism, and so forth. “Blood and soil”—this inspired muttonheads, who amused blockheads.
State capitalism is even more dangerous than private capitalism because it is directly tied to political power. Only the individual can succeed in escaping it, but not the group.
The most obvious things are invisible because they are concealed in human beings; nothing is harder to evince than what is self-evident. Once it is uncovered or rediscovered, it develops explosive strength. Saint Anthony recognized the power of the solitary man, Saint Francis that of the poor man, Stirner that of the only man. “At bottom,” everyone is solitary, poor, and “only” in the world.
This recalls a certain philosopher’s judgment of solipsism: “An invincible stronghold defended by a madman.”
Now just what are the cardinal points or the axioms of Stirner’s system, if one cares to call it that? There are only two, but they suffice for thorough reflection: 1. That is not My business. 2. Nothing is more important than I.
It is especially difficult to tell the essential from that which is similar to and indeed seems identical with it. This also applies to the anarch’s relation to the anarchist. The latter resembles the man who has heard the alarm but charges off in the wrong direction.
The milk of human kindness has gone sour; no Cato will make it fresh again. Besides, any present time is grim; that is why better times are sought partly in the past, partly in the future.
It makes no difference to me whether Eumeswil is ruled by tyrants or demagogues. Any man who swears allegiance to a political change is a fool, a facchino for services that are not his business. The most rudimentary step toward freedom is to free oneself from all that. Basically each person senses it, and yet he keeps voting.
Two steps, or rather leaps, could get me out of the city in which evolution has run its course.
a human being is revealed more in his lies than in his banal truth—his measure is his wishful thinking.
Sometimes the warrior caste is disempowered by the demos or by the senate and it then migrates to remote territories. That is how the motherland gets rid of its agitated minds, aristocrats, and reactionaries; in those areas, as in nature reserves, they can wage old-fashioned wars against nomads and mountain tribes. Adventures in service. On the other hand, they can turn dangerous when they, like Caesar, create their Gaul or, like an Iberian general named Franco, return with their legionnaires during a crisis.
“The heir to the Last Man is not the primitive, but the zombie.”
-
My Confession by Samuel Chamberlain
The Book in 3 Sentences
- This book is a bizarre travel memoir mostly concerning the Mexican-American War. The author gets kicked out of Boston at a young age and rambles around the country, more or less witnessing war crimes and fighting in cavalry skirmishes. It serves as a dramatic lesson in how the world has changed, if nothing else.
Summary + Notes
Now while I was ready to forgive the sinner for his insult to me, I felt it was my Christian duty to punish him for his blasphemy.
Thus I lost confidence in woman’s love, and faith in religion, and went forth shunned as if I was another Cain.
She was sorry I was a Yankee, but when I assured her that I had never made a wooden Nutmeg or peddled a wooden Clock in my life she thought better of me. She
made an objection to having his Sable majesty ride inside, but I was verdant to Southern customs. A young Virginian, the master of the Negro, got into a rage and swore, “that the boy was worth twelve hundred dollars, and doggone his buttons if he would allow him to catch his death a cold for all the cursed Yankees that ever wore Store Clothes.”
the frightened inmates thought the whole house was on fire. I cried out that the fire was in the roof and seeing a row of Fire Buckets hanging in the Hall, I threw them down, rushed with two to the well, filled them, and run up the Stairs, asking one of the teachers to see they were all filled and brought up.
We formed a plan to elope to the North, and without waiting for the tie to be severed that bound her to Laboyce we would marry and be happy for life!
Our company, the Alton Guards, elected our own officers, as did all the other volunteers.
The Company was composed of the floating population of a Mississippi River town, wild reckless fellows, excellent material for soldiers, but requiring strict discipline to curb their lawless spirits.
The Rangers were the Scouts of our Army and a more reckless, devil-may-care looking set, it would be impossible to find this side of the Infernal Regions. Some
Take them altogether, with their uncouth costumes, bearded faces, lean and brawny forms, fierce wild eyes and swaggering manners, they were fit representatives of the outlaws which made up the population of the Lone Star State.
The warm body was carried out, sawdust was sprinkled over the bloodstained floor, Glanton carefully wiped his knife on the leather sleeve of his jacket, and matters in the Bexar Exchange resumed their usual course.
We went for each other, and he very foolishly run onto the point of my “Arkansas toothpick” and was badly cut for his want of judgement. I was seized by the guard, old Spanish irons were placed on me, and I was thrust into the “Callaboose,” a room about twenty feet square, inhabited by a very select society of Indians, Texans, Horsethieves, Murderers and the vilest characters of the lawless frontier.
This family placed me under the greatest obligations by their extreme kindness.
But I resisted and triumphed and the honor of the house of Ritter suffered not at my hands.
strolling under the shade of the sheltering woods. Katherine lay reclining in my arms, her arms pressed around me as of old, and I—well, my nature is too volcanic to play the Joseph too often!
Here I have listened to thrilling stories of Napoleon’s campaigns, related by an old cavalryman of fifty years’ service who had served in Italy, Egypt and in the Russian campaign, and at the age of seventy was still a vigorous soldier in the United States service.
On seeing us the “Rackensackers” broke ranks, and surrounded us yelling and whooping like Indians. Their officers had no control over them, and only our bold front saved our defenseless prisoners from being massacred by these brave chivalric sons of the South. Finding they could not butcher our charge, they went off at a jump to find other victims. Woe to the cripples and sick women who fell in their way, for their cruelty was only exceeded by their insubordination.
No man of any spirit and ambition would join the “Doughboys” and go afoot, when he could ride a fine horse and wear spurs like a gentleman.
No one was punished for this outrage; General Wool, in a general order, reprimanded the Arkansas Cavalry, but nothing more was done. The direct cause of the massacre was the barbarous murder of a young man belonging to the Arkansas Regiment. But this murder was undoubtedly committed in retaliation for the outrages committed on the women of the Agua Nueva ranch by the volunteers on Christmas day.
Most of them were wild reckless young fellows, with the most inflated ideas of their own personal prowess and a firm belief that their own State could whip the world and Mexico in particular. This independence of character, and self-confidence was fatal to their efficiency as soldiers. Many of them were duelists and desperados of the frontier, quite famous in their own locality as fighting men, to whom the wholesome restraints of discipline seemed tyranny in its worst form. The battles of the Alamo, San Jacinto and Mier, with the exploits of their demigods Crockett, Travis, and Bowie, caused them to religiously believe that a dozen Southern gentlemen armed with the Kentucky rifle and that southern institution, the Bowie Knife, could travel all over Mexico.
They took no care of their arms—not one Carbine in fifty would go off—and most of their Sabres were rusted in their scabbards. This shameful state of affairs seemed to have no remedy; the War was a southern democratic one, and ex-Governor Yell of the great and sovereign State of Arkansas, and ex-Senator Marshall, of the immaculate and still greater State of Kentucky, were men of too much importance to take advice, much less orders, from a little Yankee general like Wool. “We come here to fight sir!
Sergeant Gorman was reduced to the ranks for seeing a Ghost.
Under the cliffs at the pass the Surgeon and his assistants were busy preparing amputating tables.
The air was so clear we could see every movement: The Infantry knelt down, the Cavalry lowered their lances and uncovered, and their colors drooped as the benedictions were bestowed. This ceremony offered a striking contrast to conditions in our lines; there was not a Chaplain in our army!
I heard General Taylor say, “Steady boys! Steady for the honor of Old Mississippi!”
The Mexicans had a heavy battery of three guns, manned by Irish deserters from our army. These desperadoes were organized as a battalion known as the Battalia San Patricio, or Legion of Saint Patrick; the commander was the notorious Reilly, who ranked as a Colonel in the Mexican Army.
The gallant Colonels, not having time to settle their debate, decided to act independently, so when the enemy was within five hundred yards, Marshall gave the order to “Fire!” and Colonel Yell cried out, “Hold! Don’t fire until they are nearer!” The consequence was, some fired, others did not, but all turned and fled excepting Colonel Yell and a few officers of both regiments. Colonel Yell was killed—pierced by lance thrusts in the mouth and breast—and Marshall was senior beyond all dispute! Captain Porter of Arkansas and Adjutant Vaughan of Kentucky were also slain. Our column gave a wild Hurrah and charged the foe in the flank, taking them by surprise, and at a disadvantage.
On examining his body it was discovered that the shot which broke his thigh bone was fired by his own men (there being Buckshot in it). This was considered accidental, but believed otherwise, as battles often decide private grievances, as well as those of nations.
I halted at a spring and found my good steed apparently as fresh and as lively as when we set out. I raised up his head and gave him a drink of the whiskey (he was a regular old soldier), took some myself, let him drink at the spring, in which I bathed my head, and then tightening the saddle girth I was off again.
The guerillars, if possible, were guilty of worse acts than the Rangers, and the conflict was no longer war but murder, and a disgrace to any nation calling itself Christian. Our officers became disgusted with the many revolting acts committed by volunteers and Rangers, and no reports were ever made of these cruel raids.
This “Yankee” regiment was essentially an Irish one, the best material in the world to make infantry of, but requiring great efficiency on the part of the officers to enforce discipline. Unfortunately,
Visions of prize money flitted through our brains when a dignified little yellow-faced man, dressed in a suit of Nankeen, cut English fashion, came from the cuartel and stuck a pole surmounted by the Union Jack of England in one of the piles and, in the most pompous manner, informed our officers the silver was the property of Her Majesty Queen Victoria, and that the United States Government would be held to a strict accountability if it was molested! How potent is the power of Great Britain! Here thousands of miles away from all apparent power of that nation a miserable little cockney, with only the insignia of his country’s greatness, defies and threatens three hundred of Uncle Sam’s roughest riders. I believe that one of the Silver Pigs was sequestered by a graceless artillery officer, who not having the fear of Her Majesty’s displeasure, hid one in one of his guns, and thus it was brought to camp.
was already far gone in love; wild schemes flittered through my brain to adopt her as a sister, but alas!—man proposes and God disposes—platonic attachment between a wild Dragoon not yet out of his teens, and a young, passionate daughter of Mexico was an impossibility.
They were tried by a Court Martial, fifty sentenced to be hanged, the rest to dig the graves of their executed comrades, and “to receive two hundred lashes on the bare back, the letter D to be branded on the cheek with a red hot iron, to wear an iron yoke weighing eight pounds with three prongs, each one foot in length, around the neck, to be confined to hard labor, in charge of the guard during the time the army should remain in Mexico, and then to have their heads shaved and be drummed out of camp.”
During the war many of the females of the country had proved firm friends of “Los Gringos,” and we were often indebted to them for valuable information regarding the movements of the enemy, their own countrymen. Our fair female friends showed the utmost contempt for the weak dissolute “greasers,” and were public in their outspoken admiration of the stalwart frames, fair skins, blue eyes, and the kind and courteous demeanor of Los Barbarianos del Norte. This feeling was not confined to the lower classes; the señoritas ricas and the “doñas puros Castillanas” of the towns shared it with the poblanas and margaritas of the villages.
As might be supposed, this did not increase the love of the hombres for us, or render the position of the “Yankedos” now that their protectors were leaving the country, a pleasant one. They suffered fearful outrages from the returned Mexican soldiery and the ladrones of the country—they were violated, ears cut off, branded with the letters “U.S.” and in some cases impaled by the cowardly “greasers,” who thus wreaked their vengeance on defenseless women.
Through her influence I obtained the position of wagon master, at sixty-five dollars per month and two rations—a much better arrangement than the $7 a month I had been receiving as a Dragoon.
The bearer of this, Miss Ellen Ramsey, is desirous of going to California, and I have recommended you to her as a suitable party for her to contract a ‘Scotch marriage’ with, to enable her to do so. She will explain all. Yours, &c, Hugh Elmsdale.” This extraordinary epistle was written by a friend of mine, a clerk in the commissary Dept.
At Santa Cruz de Rosales, about 60 miles from Chihuahua, I sketched a monument built to commemorate a victory over the Comanches, who terrorize the country.
Colonel Washington, Majors Graham and Rucker gave the fair bride a chaste salute and the happy couple departed, hand in hand, to the bridegroom’s home, i.e., his tent.
Glanton had made two raids in the Indian country, with but small profit, and had met with considerable loss. There was in camp drying thirty-seven of those disgusting articles of trade, Apache scalps, cut with the right ear on, to prevent fraud, as some Indians have two circles to their hair.
Holden’s lecture no doubt was very learned, but hardly true, for one statement he made was “that millions of years had witnessed the operation producing the result around us,” which Glanton with recollections of the Bible teaching his young mind had undergone said “was a d——d lie.”
The Great Canyon of the Colorado at last!
I am satisfied that we were the first white men who ever saw the Great Canyon from this point. What is very singular in regard to it is that the cut is not through mountains, but through a level plain, with mountains rising above it from three to twelve thousand feet.
Their fields are irrigated by a system of canals from the Gila, the women doing the work of the fields while the men take care of the children and do the weaving.
-
The Origins of Woke by Richard Hanania
From my Notion template
The Book in 3 Sentences
- Hanania examines the legal underpinnings of what we now call “Wokeness” and is kind enough to define what he means by the term. Then he goes into an extreme amount of detail explaining his theories. TLDR – there is a lot of money and power in wokeness due to the exact way the civil rights laws are written (light on text, heavy on bureaucracy) – it is NOT postmodernism. Also – Hanania in book form is much less Hanania than his Twitter feed or Substack.
How I Discovered It
Hanania’s Substack.
Who Should Read It?
Right-wingers who care about being correct about our confusing modern times.
How the Book Changed Me
The single biggest thing I took from the book was that what we call Modern Wokeness is not the end result of postmodernism, endless fashion, etc., but rather the result of how civil rights laws are written and have been interpreted. It’s the Jones Act of political movements.
Quotes and Highlights
For so many public intellectuals and politicians to be anti-woke but indifferent to civil rights law struck me as similar to worrying about global warming but not bothering to know anything about energy policy. Of course, something changed in the mid-2010s.
Opponents of wokeness sometimes say that “facts don’t care about your feelings.” But the federal judiciary does.
Yet on issues related to race, gender, and sexual orientation, the country has consistently moved left, toward institutions emphasizing classification based on identity, a results-based approach to seeking out equality between groups, and the stamping out of dissent from liberal orthodoxy.
For the purposes of this book, we can say that wokeness has three central pillars. The belief that disparities equal discrimination: Practically any disparity that appears to favor men over women, or whites over non-whites, is caused by some combination of past and present discrimination. Disparities that favor women over men or non-whites over whites are either ignored or celebrated. This includes not only material outcomes like differences in income or representation in high-status professions but “disparities in thought,” or stereotypes about different groups.
Speech restrictions: In the interest of overcoming such problematic disparities, speech needs to be restricted, particularly speech that suggests that they are caused by factors other than discrimination or that stereotypes are true. Human resources (HR) bureaucracy: In the interest of overcoming disparities and regulating speech, a full-time bureaucracy is needed to enforce correct thought and action.
If a person believes that discrimination is the primary cause of disparities but not that there should be speech restrictions to enforce that idea, we generally just call them a liberal instead of woke.
Title IX, which as a matter of statutory text simply banned discrimination in government-funded educational institutions and programs, has been used to micromanage the sex lives of college students.
Government got into the business of social engineering, while outsourcing much of the enforcement of its mandates and regulations to the private sector.
Wokeness resembles civil rights law more than it does Protestantism or the writings of any postmodern philosopher, and we can look at the historical and legal record to understand the motivations of those who made that law.
particular problem for the idea that wokeness came from the university is the fact that identity politics had to originally be forced upon much of higher education by Washington, with the Department of Health, Education, and Welfare originally coercing schools like Columbia and UC Berkeley to adopt quota-based faculty hiring during the early 1970s.9 The government mandates came first, and the ideology later.
Long before wokeness was a cultural phenomenon, it was law.
Enforcement is not carried out by the state itself, but mostly outsourced to trial lawyers and the human resources industry, at the expense of private institutions, which end up absorbing much of the cost of and backlash to political correctness.
The members of Congress who voted for the Civil Rights Act believed that they were dismantling a caste system in the South that was sustained by intentional and conscious private and state-backed discrimination. They did not see the bill as a way to remake American society, redistribute wealth, or destroy capitalism.
Once again, there was more intellectual honesty on both the right and the left than there was in the center.
While “diversity” is certainly an idea, it is not one that can claim any kind of intellectual depth or historical pedigree. It was basically the creation of one judge acting out of either political timidity or intellectual laziness.
The fact that feminist and LGBT dogma contradict each other is a problem for logicians and political philosophers but not for the law or the psychology of true believers.
All of the contradictions noted above can be explained by understanding wokeness as the name we give to a collection of beliefs that one must hold for legal and psychosocial reasons, without any mechanism to ensure logical consistency built into the system. We do not look for logical consistency in an act of Congress that we know was the product of logrolling, compromise, and debate between various factions. The creation of a cultural phenomenon is even more complicated than a piece of federal legislation, and how it is lived and experienced is even less likely to have a close relationship with any philosophical text.
Like an act of Congress, wokeness can similarly be seen as a “logrolled” set of cultural beliefs.
This means that the whole project of seeking a grand philosophical explanation for wokeness relies on a conceptual mistake, likely rooted in the need of intellectuals to exaggerate their importance.
defined wokeness in terms of three pillars—disparities equal discrimination, speech controls, and HR bureaucracy—these beliefs and practices should be seen less as a philosophical doctrine with its own impeccable inner logic than as a political program that has emerged from a combination of factors such as interest group lobbying, mass emotional sentiment, and bureaucrats seeking to increase their power.
Why, despite a war on terror that led to the victimization of Muslims both at home and abroad, do we see so little organized political activity among Muslim Americans relative to politicians and activists who identify with artificial categories like Hispanics and AAPI? The wokeness-as-law perspective can help one understand all of this and much else.
The entire concept of merit—whether measured through standardized tests or other forms of academic achievement—is treated with suspicion, as schools close down gifted programs and more universities drop SAT requirements, putting increasing emphasis on subjective measures like extracurriculars that have even worse class and political biases than the practices being eliminated.
The public health profession was discredited among wide swaths of the population when much of the community recommended lockdowns during the Covid-19 pandemic but made an exception for protests against racism.
Despite the fact that a similar rise in homicide did not occur in other nations that were also suffering from the pandemic, the media sought to place the blame on disruptions related to Covid-19 instead of the Black Lives Matter movement.
The state is so intertwined with the rest of life that it makes little sense to treat culture and politics as separate forces in a modern society.
We see hints of what may come in the rise of corporations, like Coinbase, Substack, and Basecamp, that are explicitly disavowing political activism, even in the absence of any changes in civil rights law.
This is the story of civil rights law. It is likely that few, if any, members of Congress at the time would have believed that the bill signed by President Johnson would ultimately force police departments to lower their physical fitness standards to accommodate women, much less make employers subscribe to theories about the malleability and subjectivity of gender that had yet to be invented.
but ultimately text on paper passed by two large and divided legislative bodies has proved no match for the machinations of permanently placed bureaucrats and judges.
What changed in the mid-twentieth century? It would be surprising if the rise of television did not play a role, as it both nationalized politics and provided footage that increased sympathy for the plight of black southerners. In 1950, only 9 percent of American homes had a television. This number rose to 65 percent in 1955, and 87 percent in 1960.
The year after the CRA was passed, President Johnson signed EO 11246, which, as amended throughout the years, has become the basis of the modern affirmative action in contracting regime.
It created what would come to be called the Office of Federal Contract Compliance Programs (OFCCP), located within the Labor Department. In 1967, Johnson added “sex” to its prohibited categories, and Obama included “sexual orientation” and “gender identity” in 2014. While
Today, about a quarter of the American workforce is employed by a government contractor.
Affirmative action is required for every employer with fifty employees that does at least $50,000 worth of business a year with the federal government, and every subcontractor with at least $10,000 in business.
Race and sex are to be determined by self-identification, with the employer prohibited from overruling an individual’s selection, although visual classification is acceptable under certain conditions.20
From the contractor’s perspective, all they can know for certain is that they must go through the motions, and that hiring and promoting more minorities and women will be less likely to get them in trouble.
Each business must be aware of the racial dynamics in its own community, forcing the private sector into the dishonest project of creating identity-obsessed institutions that simultaneously champion equal treatment.
At various points throughout the debate over the Civil Rights Act, critics of the bill expressed concern that it might do x. In response, supporters of the bill would say, “no, it won’t do x,” and the two sides would agree to a compromise that involved entering a clause into the bill in effect saying that “x is prohibited.” Usually within a decade, the EEOC and the federal courts would do x anyway.
If this sounds like a judge making up the law to fit his own political preference, that is because that is exactly what it is. At the very least, Justice Blackmun should be credited for his candor.
which showed an acknowledgment “that constant change is the order of our day and that the seemingly reasonable practices of the present can easily become the injustices of the morrow.”
We have therefore moved from bans on explicit discrimination to practically any behavior or speech potentially offensive to women.
Explicit quotas are preferable to the current system in that they could potentially place limits on discrimination, leave more room for merit, and provide clarity on what is and isn’t allowed. They would also be simpler to administer, lead to less bureaucracy, and not require ideological litmus tests in the form of “diversity statements,” which increasingly are required in university hiring. What we have instead is a system where civil rights law serves as the skeleton key of the left.
As of 2019, among those twenty-five and older, 40 percent of whites had a bachelor’s degree or higher, compared to 52 percent of Asians, 26 percent of blacks, and 19 percent of Hispanics. Clearly any employer that requires a BA or postgraduate degree could be accused of engaging in a practice that has a disparate impact on the latter two groups under the four-fifths rule. Unlike with cognitive tests, though, employers have seldom, if ever, gotten in trouble for requiring college degrees, even when the kind of credential necessary to be hired or promoted has no connection to the profession in question.
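For readers who haven’t seen the rule spelled out: under the EEOC’s four-fifths rule, a selection rate for any group below 80 percent of the highest group’s rate counts as evidence of adverse impact. A minimal sketch in Python (mine, not the book’s), assuming the degree-attainment figures above stand in for selection rates when an employer requires a BA:

```python
# Illustrative check of the EEOC "four-fifths rule" (80% threshold is the rule itself;
# the assumption that degree-attainment rates approximate selection rates is mine,
# following the passage's own reasoning about BA requirements).
attainment = {"Asian": 0.52, "White": 0.40, "Black": 0.26, "Hispanic": 0.19}

highest = max(attainment.values())  # rate of the group with the highest rate
for group, rate in attainment.items():
    ratio = rate / highest
    verdict = "adverse impact indicated" if ratio < 0.8 else "no adverse impact indicated"
    print(f"{group}: {ratio:.2f} of the highest rate -> {verdict}")
```

Run mechanically, this flags blacks and Hispanics (the two groups the passage mentions) and, as a side effect of comparing everyone to the highest-rate group, whites relative to Asians as well.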
If it seems that our culture has built an elaborate ranking system of races, genders, and “traumas,” it is because our legal system did it first.
When most people think about what types of people are affiliated with universities, they usually think of professors and students. That impression is dated, as higher education has been taken over by professional managers who neither teach nor do research. Yale currently has about as many administrators and managers as it does students.1 Many new employees have job titles that did not exist only a few decades before. As of 2020, Ohio State University employed 132 administrators with “diversity” or “equity” in their job titles at the cost of $13.4 million.
None of this would matter all that much if civil rights law wasn’t also self-financing, the second reason for the existence of a robust human resources industry.
Finally, there is the “best practices” doctrine, through which an institution can defend itself by showing that it is behaving in accordance with industry norms. Employers must pay attention not only to what judges and bureaucrats think but to the things that other corporations are doing to address discrimination. This creates an arms race, which helps explain why practices that once seemed absurd can become common.
The results show the creation of an entire industry. In 1968, only 1 in 558 American workers were employed in human resources. By 2021, that number had risen to 1 in 102, including 1 in 184 men and 1 in 68 women. In his 1941 book The Managerial Revolution, James Burnham argued that the world was witnessing a shift from a system where capitalists comprised the ruling class to one in which they were being replaced by a managerial elite that controlled the means of production.
But vagueness wrapped in jargon is the great trick of civil rights law.
That would be an arbitrary standard, but adding more words has the effect of only making the rule look more exact and precise, while in effect doing no such thing.
A traditionally “strong” state can issue mandates that are clear, do not undergo transformations over time through judicial and bureaucratic procedures, are enforced through one part of the national government, and are of uncontested legitimacy. The government of France is held up as an example of a strong state, one that has been able to create a quota for hiring handicapped employees and has laws regarding employment that are stable and enforced exclusively through the Ministry of Labor. In contrast, the American state is “weak.” It does not mandate quotas; in fact, it explicitly bans them. Instead, government contractors have “goals” and “timetables” they set themselves, and all large employers must be on the lookout for “disparate impact” in a world where everything has a disparate impact. Enforcement is also highly decentralized. In the private sector, an employer may face negative consequences through a lawsuit filed by a private party, an investigation through the EEOC, or, if they have a federal contract, via the Department of Labor or the agency that the firm is directly dealing with. Firms may also face pressures at the state or local level. Public institutions such as schools similarly can face individual lawsuits or investigations and threats that funding from Washington will be cut
Due to Christiansburg, however, we now have an asymmetry in which plaintiffs have a right to recover attorney’s fees if they win, but defendants must swallow the costs of defending themselves even when courts have determined they have done nothing wrong.
Under the Obama administration, it was normal practice for the Justice Department to reach settlements with corporations that required them to pay money to left-wing activist groups, therefore providing funding to the administration’s political allies without having to go through Congress.21 Civil rights law implements a relatively small tax on corporations that has a massive effect in terms of creating an entire industry of lawyers, activists, and human resources professionals.
In 2020, there were about 1.5 million businesses in the US with at least fifteen or more employees, the threshold to be covered under Title VII of the Civil Rights Act and the ADA.
Civil rights law, through its vagueness, works in a similar way. Each corporation has an incentive to seem “less discriminatory” than others, which in effect means adopting fads out of academia or the HR industry and having to engage in ever more blatant forms of reverse discrimination.
“Woke capital,” which often refers to corporations taking left-wing stances on identity-related issues, is a natural response to a system that rewards this kind of virtue signaling.
Data from the Department of Education shows that while the number of K–12 teachers in the US increased by 8 percent from 2000 to 2017, the number of administrators increased by 75 percent.
The federal relations director of the Association of American Universities once put forward what he called “administrative clone theory,” in which every new form of federal spending comes with a new federal office to administer the money, and then “clones” of the department are created at each university.
In the 1950s, the field of human resources barely existed. Over the next decades, it would grow into a massive community, today comprising around 1 percent of the workforce.
The government decides which categories are relevant to public life, and which are not.
As it turned out, it was easier to create a race than it was to disestablish one.
Americans are usually asked to choose a “race” and an “ethnicity,” with Hispanic or Latino being the only kind of “ethnicity” officially recognized.
In the next year, the SBA would reject Iranians and Arabs for inclusion, and conclude that the category of “Asian” stopped at the Afghanistan-Pakistan border for the purposes of government classification in this area.38 Although it doesn’t appear to have been given much thought, further up north, Central Asians were and remain considered whites for the purposes of government classification, with the status of Uzbeks being the subject of a 2008 SBA hearing that was settled by the petitioner being declared disadvantaged on nonracial grounds.
The EEOC originally denied Poles official minority designation on the grounds that there was no room left on their form, and including them might lead to demands for similar treatment of “Italians, Yugoslavs, Greeks, etc.”42 More extensive efforts were made on behalf of American Jews. Reports from the Truman administration on civil rights gave them substantially more attention than European ethnics, and the Eisenhower administration listed Jews as one group that could be voluntarily reported on by employers in the “other minorities” category.
The idea that the Civil Rights Act would ban employment discrimination based on sex started out as an attempt by a southern segregationist to kill the bill.
Moreover, much of the increase in LGBT identity appears to be among those who engage in only heterosexual behavior, indicating that we are arguably witnessing more of a social contagion of identity than a situation where greater tolerance has allowed more individuals to live as their authentic selves. Again, as with changes in gender relations, it is difficult to prove a causal effect of government policy, though it would be surprising if it had none. Nonetheless, with regard to gender and sexual identity issues, we see the same story of social engineering evident in the way we think about and classify individuals according to race.
Businesses find themselves having to adopt policies that are less tolerant of flirtation and other forms of organic, healthy interactions between men and women, meaning that government has in effect legalized all sexual behavior that goes on behind closed doors, while also carving out an exception for when two individuals work together—in which case it has problematized every step in the process to get to that point.
How could one area of law have such disastrous downstream effects in so many different areas of life? By way of analogy, this question can be answered by noting that one might be skeptical of a claim that there is a medicine that cures a large number of ailments, while being more ready to believe that there is a poison with a large number of negative health effects. Like the human body, society is an extremely complex system, which means that there are many more potentially harmful interventions than there are beneficial ones.
The difficulty created by civil rights law is so well known that it is referred to as the “validity-diversity trade-off”: the better a metric is for predicting job performance, the larger its disparate impact.
At the risk of oversimplification, we may divide American governance into four eras. From the Founding to the presidency of Andrew Jackson, there was the era of elite rule. Then came the spoils system, which was ended, albeit imperfectly, with the Pendleton Civil Service Reform Act of 1883 and the formation of the Civil Service Commission. The era of meritocratic hiring lasted just under a century. Since 1978, when the Civil Service Commission was abolished, if not earlier, we have been living in the racial spoils era, where government maintains impersonal standards but seeks to distribute jobs across various official categories.
Interestingly, one way to potentially get around the problem is through explicit quotas. In the first decades of the EEOC’s war on testing, employers had begun to “race-norm” exams, which simply meant giving extra points to individual blacks and Hispanics by only comparing their scores to those of the same ethnic group. With that method, instead of eliminating standards, one can at least find the most qualified people from each race. The EEOC was in favor of this practice as a means to achieve equal representation, at one point actually prosecuting a Tennessee company for not giving extra points to black applicants.
personnel need to meet certain physical standards. Unfortunately for the left, almost any non-negligible physical fitness standard that men have to achieve is going to exclude practically all women.
In the workplace, however, the practical impact of civil rights law can be to create an environment in which only the left-wing position is permitted, and any employer who thinks otherwise is opening himself up to legal liability.
Human relations are complex, so much so that, according to the social brain hypothesis, the reason we have such high levels of cognitive ability in the first place is because intelligence is necessary to navigate and manage our social relations.
Capitalism does not guarantee optimal societal outcomes—such a thing is too much to hope for. But it aggregates information in a way like no other process on earth, and produces a result that, if not perfect, continuously builds on previous improvements and makes people’s lives better.
This is why the standardization of the American workplace that resulted from civil rights law has likely had such disastrous effects on productivity. A series of practices, such as structured interviews, the deemphasizing of tests, and HR departments managing social relations did not emerge necessarily because they reflected the best ways to run a business. Rather, they emerged as a compromise between market pressures that reward productivity and aggregate human preferences, on the one hand, and arbitrary government fiats aimed at achieving demographic parity while hiding what they are doing, on the other.
Civil rights law is killing experimentation at work, with implications for the rest of life. As a matter of simple logic, more diversity within institutions will lead to less diversity between them.
If individuals desire a sexless, androgynous, and sanitized workplace free of anything that might cause offense, the market will create such spaces.
In other words, major American institutions are required to declare within the same sentence both that they do not discriminate and that they practice affirmative action.
Some have argued that Trumpian lies play a social role in binding the right—that by expressing belief in falsehoods that are clearly absurd, followers of the former president show their loyalty to him.56 They have failed to notice that the same can be said regarding lies couched in legalese or academic jargon. Trumpian lies at least make clear the rules of the game and delineate the sides. They at the very least do not insult anyone’s intelligence through obfuscation.
Civil rights law declares some practices related to sex and race unacceptable and others mandatory, restricting personal freedom and harming economic efficiency. It prevents creative destruction in the economic and social realms and taste-based discrimination, while making life more difficult for certain “unofficial minorities,” including the neuro-atypical, the socially inept, the highly religious, and the hypermasculine.
The economist Robin Hanson asks us to imagine political debate as a tug-of-war, with each side pulling on one side of the rope. If one wants to have an unusually high level of influence, the best strategy is to pull the rope sideways—that is, take a position not clearly aligned with either side of the political spectrum.1
For a Reaganite or libertarian, using government power to roll back the excesses of civil rights law is no more philosophically problematic than reducing environmental regulations or lowering taxes. Doing so is not only something libertarians shouldn’t feel uncomfortable about, it is something they should actively support.
Sometimes when different factions of the elite agree on a policy approach, they can exclude alternative viewpoints that might resonate with the masses. This was easier to do before the fragmentation of the media landscape. In the 1960s, national politics was covered on television by three major news stations.
The fact that the increase in crime was overwhelmingly concentrated precisely where Great Society legislation aimed to help—that is, in the inner cities—helped further discredit the liberal project, as did the rise of race riots in those same communities.
Nixon in particular adopted the language of conservatives, but he was a centrist on domestic policy whose primary interest was in foreign affairs.15 As he focused most intently on geopolitical issues surrounding Vietnam, China, and the Cold War, at home Nixon gave moderate staffers a dominant policy role while letting conservatives handle speechwriting and PR.
In the Philadelphia Plan, government construction contractors were first held to a “goals-and-timetables requirement” to hire more minorities, with Nixon having personally lobbied members of Congress on behalf of the policy in December 1969 as a way to pit civil rights organizations and labor unions against one another and split the Democratic coalition.
And while partisan polarization now prevents Republicans from working with Democrats to expand civil rights law, education polarization ensures that they find it difficult to move policy in their preferred direction.
Republicans after taking the House in 1994 found that abolishing affirmative action split their own caucus while uniting Democrats, tilting the playing field in favor of the latter despite there being majority support for the conservative position in the country as a whole.
In few areas is the mainstream press less trustworthy than on issues of identity, as can be seen in recent years in various supposed hate crimes that journalists have championed being exposed as hoaxes, and the narratives about police shootings that they credulously reported on that turned out to unravel over the course of time.
This story teaches us something important about policymaking. There can be a practice that 100 percent of people think should be banned, but it can remain legal if no one thinks about the issue or brings it to the attention of the public.
While the left still outnumbers the right in number of committed lawyers, nonprofits, and activists on its side, conservatives have built enough of a critical mass to be effective and are now well represented in the judiciary, with Republican presidents having appointed most federal judges as of 2022.
know that there is a practical way to fight back. Civil rights law, like other legal areas, is something of a battle of attrition between bureaucrats and lawyers. Progress depends not only on getting judges and bureaucrats to see things in the way one prefers, but also on galvanizing enough members of what is sometimes called “the managerial class” within and outside of government to build upon legal victories and blunt the impact of defeats.
The rise of environmental, social, and governance (ESG) investing, in which Wall Street firms acting as corporate shareholders push for diversity in hiring and promotions as well as other left-wing causes, represents an acute threat to American ideals based in the private sector, and may make a purely libertarian approach to fighting wokeness unrealistic.
Instead of more sweeping bills, a Republican-controlled Congress could nibble around the edges of civil rights law, passing legislation that attracts little public attention but may have major effects on the incentive structures faced by bureaucrats, lawyers, and potential plaintiffs and defendants in court cases. Topics like federal jurisdiction, whether a plaintiff can get attorney’s fees and how much, and the burden of proof in different kinds of actions arouse little in the way of mass sentiment; nevertheless, as seen in the rise of certain kinds of lawsuits in the aftermath of the CRA of 1991, they can have a profound impact on how the law is practiced and its societal effects.
Moreover, if there are measures to enforce an anti-wokeness policy agenda through the courts, they should be pursued. The 2021 Florida bill creating a cause of action against schools that teach critical race theory is a good example of this.30 It even includes attorney’s fees for parents who successfully file suits, mimicking federal civil rights law.
Unlike the federal government, a state exercises direct control over its higher education system, and there is little reason not to go to war with the diversity bureaucracy, with the ultimate aim of getting universities out of the business of social engineering or taking a side in the culture war.
Asking what should be done to fight wokeness while in power is somewhat like asking where to get water from the ocean. Opportunities abound.
The right hates wokeness, but its failures have resulted from seeing the phenomenon as simply a cultural trend or class marker rather than as a left-wing mode of bureaucratic governance.
This conclusion has two purposes. First, it is to show that wokeness is not as strong as it looks. Analogies to religious faith—which carry with them the implicit argument that the phenomenon may last for thousands of years—rest on a weak foundation.
Populations changing their national loyalties is more common than the adoption of new religious faiths; it has been noted that when Russia moved into eastern Ukraine in early 2022 it found much less support among the local population than it had only eight years before.
Wokeness thus has no history of surviving without state support. In fact, even with state support, and with practically unlimited rhetorical backing from elite institutions, it still struggles to win hearts and minds. Wokeness remains mostly a political loser for the left, which is why it obfuscates on issues like critical race theory and the fact that civil rights law in its current form all but requires speech restrictions and racial quotas. Wokeness does not appear to be able to motivate its adherents to make the extreme kinds of sacrifices that are the hallmarks of true religious faith. It can’t even convince liberals to keep their kids in inner-city public schools.
France provides a counterexample to the American model of the management of race and gender issues. Its laws generally ban the state from collecting data on the race, religion, and ethnicity of individuals.7 This means that, much to the chagrin of some American liberals, France cannot have disparate impact standards, state-enforced affirmative action, or even programs targeted at a group with a particular ancestry. It is perhaps not a coincidence that in a country that bans the kind of data collection necessary to enforce woke policies, we see much more resistance to wokeness as a cultural force among the political elite.
Thus if we see homosexuality becoming more acceptable in most or all countries, is this a natural consequence of modernity, or just a sign that Hollywood and the State Department are everywhere? Nonetheless,
Wokeness can be understood as a series of recurring moral panics backed up by state power.
This demonstrates that there is not always a strong correlation between how much energy surrounds a public policy debate and how important it ends up being.
As it turns out, moral panics only become a permanent part of life when they are backed up by state power and lead to the creation of new laws and bureaucracies.
Few people think too deeply about the connection between law and popular culture—music, art, and TV shows. Yet even in the freest societies, law shapes culture, which means that it cannot help but drive popular entertainment.
This is why many of the most critically acclaimed TV shows of recent years have been set in either the distant past, fantasy universes, or the criminal underworld, where there is less pressure for politically correct stories that obviate natural differences between men and women and insert unrealistic levels of ethnic diversity that distract from the ability to find inspiration in a work.
In the relationship between culture and law, the arrow of causation does not flow in one direction.
To even ask which causal arrow has greater weight is likely asking too much of social scientists. At the same time, historical research, by looking at the order of events, can make the case that many of the ideas fundamental to wokeness were part of law before they were part of American culture. In other words, there is a striking resemblance between assumptions of civil rights laws that go back to the 1970s and cultural ideas and forces that have come to ascendance much more recently.
Among its many other goals, this book argues against our tendency to mistake salience for importance. Wokeness inspires passions on both sides. While debates over hot-button issues are often dismissed as insignificant by those who have an aesthetic or ideological commitment to the idea that politics should mostly be about economic issues, Americans, just like other people, have made clear that they care deeply about what kind of culture they live in. Yet for half a century now the culture war has been an asymmetric fight, with one side able to inspire a critical mass of bureaucrats and activists who do their work far from public attention, and the other doing little more than encouraging and reflecting mass discontent without much impact.