  • Adages, Guitar

    Quote of the day – from YT

    From some video I didn’t bookmark

    I’m only good at two things: playing guitar and hiding my dependence on antipsychotic drugs. And I’m all out of clozapine…

  • Adages

    Marleigh’s line of the day –

    While waiting for a parent-teacher conference and looking up at one of the circular air vents in an otherwise dark place

    It’s like a hole in the emptiness

  • BigThink, Culture, Subcultures, Uncategorized

    Some Recent Recalibrations

    Thought One
    The rabidly political have continued to make incremental gains in dominating the political discussion space and have somehow made it even worse than it was. However, the non-rabidly political have made great strides in separating themselves from the rabidly political, either by splitting off from existing groups or by getting better at changing subjects and avoiding topics. That sounds trivial, but in my very anecdotal experience the non-rabidly political have undergone some major social skill upgrades. It’s halfway to real-life hellbanning.

    Thought Two
    The very online (producing and consuming) and the not very online will be the main political divide in the coming years. I was thinking (for no obvious reason) about Cal Newport (Deep Work) and James Scott (Seeing Like a State). Newport and Scott make interesting points about identity creation and maintenance, albeit from different angles. In my formulation, their points converge on the notion that identity creation is hard, time consuming, and bends toward coherence and legibility. The online identity will become much more uniform, owing to the greater volume of standardized influences. The not very online will maintain much more nuanced and varied identities.

    Pair that with Thought One above: the very online will become highly visible and legible to the not very online, while the not very online will become confusing, and then invisible, to the very online.

  • Adages

    Quote of the day – podcast edition

    From a long digression about the economic and demographic ailments of the world, and the relatively good position of the United States – summed up as

    We’re definitely the leper with the most fingers

  • Books, History

    American Colonies by Alan Taylor

    From my Notion template

    The Book in 3 Sentences

    1. A wonderful, incredibly detailed look at American history from the Big Bang to 1770 (or so). Taylor goes very, very, very deep into the particulars and surfaces a lot of hidden insight and many factors I would never have considered, like European weeds and agriculture. Slavery and disease were given their proper (i.e. enormous) importance in the telling of the story. The book would have been better as five shorter books (it is very packed with detail), but I have never come across a book that tells the complete story (by which I mean how the Native American tribes interacted with each other, how the European powers interacted with each other, and how the Native Americans interacted with the Europeans) better than this one.

    How I Discovered It

    I read American Republics by the same author.

    Who Should Read It?

    Anyone interested in history.

    How the Book Changed Me – AKA Random Observations

    • I had never thought of the impact of European weeds and livestock on the natural environment – Taylor documents this very important phenomenon very well
    • The balance of disease (for lack of a better term) greatly favored the Europeans, i.e. the ones they brought were far deadlier and more numerous than the ones they got
    • Spain seemingly set itself up to be an evil, villainous empire with no redeeming virtues in modern eyes
    • There was a lot more religion-based human sacrifice in the Americas than I would have thought.
    • Slavery was instituted everywhere slavery was profitable. The Europeans found more ways to make it profitable but the institution was well entrenched before colonization.
    • Many of the Indian tribes were recent creations – tribes devastated by European arms, disease, and displacement formed very new tribes that I had always assumed had been around forever.
    • From reading the book (the author doesn’t say this anywhere, but it seems like a reasonable guess) – the Europeans caused an 80% reduction in the population of any group they came into contact with
    • Taylor does a very good job of explaining the role of time in all of these events. What I had previously thought of as “Euro group X came over and drove out the Indian tribe Y and that’s why Virginia exists” turned into “Over a 120-year period there was a complex series of highly contingent events involving multiple generations of people, their leaders, and incentives”
    • The role of the West Indies is fully explained, including why it had such a disproportionate share of money, protection and slaves. The moral depravity and suffering was truly epic.
    • My instinctive bias against South Carolina is now more reasoned
    • The concepts of “Native American” and “European” are about 80% modern projection. No one back then thought that distinction was as meaningful as we do now
    • The entire colonization process is Garrett Hardin’s first law of ecology: “You cannot do only one thing.” The arrival of the Europeans, their diseases, livestock, crops, weeds, and weapons had profound impacts on everyone, whether they interacted with the Europeans or not. A change in disease risk, guns, horses, or the balance of power somewhere affected everyone everywhere
    • Too many more to list

    Summary + Notes

    After about 1640, the great majority of free colonists were better fed, clothed, and housed than their common contemporaries in England, where half the people lived in destitution.

    Between 1492 and 1776, North America lost population, as diseases and wars killed Indians faster than colonists could replace them.

    Until lumped together in colonial slavery, the African conscripts varied even more widely in their ethnic identities, languages, and cultures.

    Most diverse of all were the so-called Indians. Divided into hundreds of linguistically distinct peoples, the natives did not know that they were a common category until named and treated so by the colonial invaders.

    In these cultural and environmental encounters, the various peoples were not equal in power. In most (but not all) circumstances, the European colonizers possessed tremendous ecological, technological, and organizational advantages, which demanded disproportionate adjustments by the Indians in their way and the Africans in their grasp.

    In the colonies, that difference grew stronger over the generations as British America developed an especially polarized conception of race in tandem with greater political power for common whites. Unlike the French and the Spanish, the British colonies relied in war primarily on local militias of common people, rather than on professional troops. That increased the political leverage of common men as it involved them in frequent conflicts with Indians and in patrolling the slave population. In those roles, the ethnically diverse militiamen found a shared identity as white men by asserting their superiority defined against Indians and Africans conveniently cast as brutish inferiors.

    Once race, instead of class, became the primary marker of privilege, colonial elites had to concede greater social respect and political rights to common white men.

    Reading the United States back in time and geography to frame the colonial story has the distorting effect known as “teleology”: making all events lead neatly to a determined outcome, in the colonial case to the American Revolution and its republic. Teleology costs us a sense of the true drama of the past: the “contingency” of multiple and contested possibilities in a place where, and time when, no one knew what the future would bring. As late as 1775, few British colonists expected to frame an independent country. And very few Hispanics and fewer Indians wished for incorporation within such a nation.

    In fact, it would be difficult (and pointless) to make the case that either the Indians or the Europeans of the early modern era were by nature or culture more violent and “cruel” than the other. Warfare and the ritual torture and execution of enemies were commonplace in both native America and early modern Europe. Without pegging Europeans as innately more cruel and violent, we should recognize their superior power to inflict misery.

    Almost all early explorers and colonizers marveled at the natural abundance they found in the Americas, a biodiversity at odds with the deforestation and extinctions that the Europeans had already wrought in most of their own continent. Colonization transformed the North American environment, which had already experienced more modest changes initiated by the native occupation.

    Dental, genetic, and linguistic analysis reveals that most contemporary Native Americans are remarkably homogeneous and probably descend from a few hundred ancestors who came to North America within fifteen thousand years of the present

    reaching Labrador and Greenland by about twenty-five hundred years ago.

    Note from Steve: Interesting to see if this holds up; that is very late.

    Through some combination of climatic change and the spread of highly skilled hunters, almost all of the largest mammals rapidly died out in the Americas. The extinctions comprised two-thirds of all New World species that weighed more than one hundred pounds at maturity—including the giant beaver, giant ground sloth, mammoth, mastodon, and horses and camels. It is ironic that horses and camels first evolved in North America and migrated westward into Asia, where they were eventually domesticated, while those that remained in the Americas became extinct. The giant bison died out, leaving its smaller cousin, the buffalo, as the largest herbivore on the Great Plains. Of the old, shaggy great beasts, only the musk oxen survived and only in the more inaccessible reaches of the arctic.

    Obtaining more to eat, more reliably, they resumed their population growth. The more local and eclectic Archaic way of life could sustain about ten times as many people on a given territory as could the Paleolithic predation on herds of great beasts.

    Archaic Indians also began to modify the environment to increase the yields of plants and animals that sustained them.

    Gender structured work roles: men were responsible for fishing and hunting while women harvested and prepared wild plants. In general, men’s activities entailed wide-ranging travel and the endurance of greater exposure and danger, while women’s activities kept them close to the village, where they bore and raised children.

    The native peoples of North America spoke at least 375 distinct languages by 1492.

    The Indians of central Mexico pioneered the three great crops of North American horticulture: maize, squashes, and beans.

    The new horticulture also promoted economic differentiation and social stratification as the food surplus enabled some people to specialize as craftsmen, merchants, priests, and rulers.

    The skeletons of early farmers reveal a want of sufficient salt or protein, episodes of early childhood malnutrition, and an overall loss of stature. Moreover, the denser populations of horticultural villages facilitated the spread of communicable diseases, principally tuberculosis, which was less common among dispersed hunter-gatherers.

    Rather than horticulture, the most significant development for these people was their adoption of the bow and arrow after about A.D. 500.

    The largest pueblo, at Chaco Canyon, required thirty thousand tons of sandstone blocks, stood four stories tall, and contained at least 650 rooms.

    Founded in 1300, Acoma is probably the longest continuously inhabited community within the United States.

    Like the people of central Mexico, the Mississippians regarded the sun as their principal deity, responsible for the crops that sustained their survival; they considered their chiefs as quasi-sacred beings related to the sun; and they practiced human sacrifice. When a chief died, his wives and servants were killed for burial beside him, as companions for the afterlife.

    Within a century, European diseases, supplemented by European violence, killed most of the Mississippian peoples and transformed the world of the survivors.

    The urban centers tended to collapse within two centuries of their peak, which obliged their inhabitants either to relocate or to revert to a more decentralized and less hierarchical mode of life, which allowed the recovery of wild plants, animals, and soils. Because native peoples more promptly felt the negative consequences of their local abuse of nature (relative to Europeans), they more quickly shifted to alternative environmental strategies.

    Lacking horses and oxen, native North Americans knew the wheel only in Mesoamerica as a toy.

    Consequently, in the North America of 1492, only the Aztecs of Mexico constituted an imperial power capable of governing multiple cities and their peoples by command.

    Compared with Europeans, the natives of America carried a more limited and less deadly array of pathogenic microbes.

    By contrast, the Europeans of 1492 were the heirs to an older and more complex array of domesticated plants and animals developed about nine thousand years ago at the eastern end of the Mediterranean. The European mode of agriculture featured domesticated mammals—sheep, pigs, cattle, and horses—endowing their owners with more fertilizer, mobility, motive power, animal protein, and shared disease microbes. Building on a long head start and the power of domesticated mammals, the Europeans had, over the centuries, developed expansionist ambitions, systems of written records and communication, the maritime and military technology that permitted global exploration and conquest, and (unwittingly) a deadly array of diseases to which they enjoyed partial immunities.

    Indians understood that humans could live only by killing fish and animals and by clearing trees for fields, but they had to proceed cautiously. Natives usually showed restraint, not because they were ecologically minded in the twentieth-century sense, but because spirits, who could harm people, lurked in the animals and plants. A healthy fear of the spirits limited how the Indians dealt with other forms of life, lest they reap some supernatural counterattack. Offended spirits might hide away the animals or the fish, afflict the corn crop, or churn up a devastating windstorm. Any success in hunting, fishing, or cultivating had to be accepted with humility, in recognition that the fruits of nature were provisional gifts from temperamental spirits.

    Indian animism should not be romantically distorted into a New Age creed of stable harmony. In fact, the natives regarded the spiritual world as volatile and full of tension, danger, and uncertainty. To survive and prosper, people had to live warily and opportunistically.

    The logic of restraint was animist rather than ecological—but that restraint tended to preserve a nature that sustained most native communities over many generations.

    The Christian alienation of spirit from nature rendered it supernaturally safe for Europeans to harvest all the resources that they wanted from nature, for they offended no spirits in doing so. In wild plants and animals, the colonizers simply saw potential commodities: items that could be harvested, processed, and sold to make a profit.

    A French priest in Acadia noted of the Indians, “They are never in a hurry. Quite different from us, who can never do anything without hurry and worry; worry, I say, because our desire tyrannizes over us and banishes peace from our actions.”

    By offering such moral criticism, however, Christians helped to preserve a capitalist society from consuming itself. Indeed, without some moral counterweight and some sense of a higher purpose, capitalist competition degenerates into a rapacious, violent kleptocracy. Without a God, the capitalist is simply a pirate, and markets collapse for want of a minimal trust between buyers and sellers. The seventeenth-century English minister Thomas Shepard aptly commented that self-interest was a “raging Sea which would overwhelm all if [it] have not bankes.” Shepard did not wish to abolish self-interest, merely to strengthen its restraining banks. Christianity provided the banks that permitted capitalist enterprise to persist, prosper, and expand into the Americas.

    A sixteenth-century Italian physician marveled “that I was born in this century in which the whole world became known; whereas the ancients were familiar with but a little more than a third part of…”

    During the 1550s the explorer Jean de Léry reported that America was so “different from Europe, Asia and Africa in the living habits of its people, the forms of its animals, and, in general, in that which the earth produces, that it can well be called the new world.”

    But the differences began to diminish as soon as they were recognized. The invasion by European colonists, microbes, plants, and livestock eroded the biological and cultural distinctions formerly enforced by the Atlantic Ocean. Newly connected, the two “worlds,” old and new, became more alike in their natures, in their combinations of plants and animals.

    The environmental revolution worked disproportionately in favor of the Europeans and to the detriment of the native peoples, who saw their numbers dwindle. Although never under the full control of the colonizers, the transformation enhanced their power by undermining the nature that indigenous communities depended upon. Colonization literally alienated the land from its native inhabitants. In particular, the colonizers accidentally introduced despised weeds, detested vermin, and deadly microbes. All three did far more damage to native peoples and their nature than to the colonists. While exporting their own blights, the European colonizers imported the most productive food plants developed by the Indians. The new crops fueled a population explosion in seventeenth- and eighteenth-century Europe. Part of that growth then flowed back across the Atlantic to resettle the Americas as European colonies.

    The long and usually secure trade routes of the Muslim world reached from Morocco to the East Indies and from Mongolia to Senegal. Within that range, Muslim traders benefited from the far-flung prevalence of Arabic as the language of law, commerce, government, and science.

    Inspired by their literary fantasies, European visionaries longed to reach the Far East to enlist their peoples and wealth for a climactic crusade against Islam. As a fabulous land that could fulfill Europeans’ dreams, eastern Asia (and especially China) rendered the intruding barrier of the Muslim world all the more frustrating.

    In 1469 the marriage of Queen Isabella and Prince Ferdinand united Aragon and Castile to create “Spain.”

    But, with more greed than consistency, the Iberians also enslaved Guanche who had converted to Christianity in the vain hope of living peaceably beside their invaders.

    In their invasion of the small and long-isolated Canaries, the Iberians reaped the perverse advantage of their relatively large population located at a nexus of commercial exchange, which made for an especially diverse and regularly reinforced pool of diseases.

    So complete was the cultural destruction that only nine sentences of the Guanche language have survived.

    At first, most of the slaves were Guanche, but they inconveniently and rapidly died from the new diseases. To replace the dead, the colonists imported Africans to work the sugar plantations. West African societies had long enslaved war captives and convicted criminals for sale to Arab traders, who drove them in caravans across the Sahara to the Mediterranean.

    By turning native peoples into commodities, for sale as plantation slaves, the invaders developed a method for financing the further destruction of their resistance. In the Atlantic islands, the newcomers also pioneered the profitable combination of the plantation system and the slave trade. In the fifteenth-century Atlantic islands (and principally the Canaries), we find the training grounds for the invasion of the Americas.

    For in 1492 no one in Europe had any idea that the next islands farther west lay close to two immense continents inhabited by millions of people.

    In 1498 Vasco da Gama exploited that discovery to enter and cross the coveted Indian Ocean, the gateway to the trade riches of the East. The profits kept the Portuguese focused on the southern and eastward route to Asia, leaving the westward route largely unguarded for their Spanish rivals to explore by default.

    Spain pioneered transatlantic voyages, thanks to the aggressive ambition, religious mysticism, and navigational prowess of the Genoese mariner Christopher Columbus. In popular histories and films, Columbus appears anachronistically as a modernist, a secular man dedicated to humanism and scientific rationalism, a pioneer who overcame medieval superstition. In fact, he was a devout and militant Catholic who drew upon the Bible for his geographic theories. He also owned, cherished, and heavily annotated a copy of The Travels of Marco Polo, which inspired his dreams of reaching the trade riches and the unconverted souls of East Asia. Columbus hoped to convert the Asians to Christianity and to recruit their bodies and their wealth to assist Europeans in a final crusade to crush Islam and reclaim Jerusalem. Such a victory would then invite Christ’s return to earth to reign over a millennium of perfect justice and harmony.

    What deterred Europeans from sailing due west for Asia was not a fear of sailing off the edge of the world but, instead, their surprisingly accurate understanding that the globe was too large.

    Exploiting the trade winds, he turned west into the open ocean and had clear, easy sailing, reaching a new land after just thirty-three days.

    But Columbus supposed that all of the islands belonged to the East Indies and lay near the mainland of Asia. Although the native inhabitants (the Taino) were unlike any people he had ever seen or read about, Columbus insisted that they were “Indians,” a misnomer that has endured.

    Thanks to the newly invented printing press, word of Columbus’s voyage and discovery spread rapidly and widely through Europe.

    With the assistance of the pope, the Spanish and the Portuguese negotiated the 1494 Treaty of Tordesillas, which split the world of new discoveries by drawing a north-south boundary line through the mid-Atlantic west of the Azores.

    In 1495 he shipped 550 captives to Spain for sale to help pay for his expedition. Because most died during the voyage or within a year of arrival from exposure to European diseases, Columbus had to abandon the project of selling Indians in Spain. Instead, he distributed Indian captives among the colonists to work on their plantations and to serve as sex slaves.

    Violent mutinies and more violent reprisals by Columbus induced the monarchs to revoke his executive authority in 1500.

    Although displaced as governor, Columbus continued to serve the Spanish as a maritime explorer. In 1498 and 1502 his third and fourth transatlantic voyages revealed long stretches of the South and Central American coast. Nonetheless, to his death in 1506, Columbus stubbornly insisted that all of his discoveries lay close to the coast of Asia.

    A year later, Amerigo Vespucci, a Florentine mariner who alternated between Spanish and Portuguese employ, explored enough of the coast of South America to deem it a new continent. Consequently, European map-makers began to call the new land by a variant of his first name—America.

    Although Columbus had not reached Asia, he did find the substance of what he sought: a source of riches that would, in the long term, enable European Christendom to grow more powerful and wealthy than the Muslim world.

    With the Canaries as their colonial model, the Spanish aggressively modified Hispaniola, introducing new crops, especially sugarcane, and new animals, including cattle, mules, sheep, horses, and pigs.

    Colonization rapidly destroyed the Taino people of Hispaniola. In 1494 a Spaniard reported that more than 50,000 Taino had died, “and they are falling each day, with every step, like cattle in an infected herd.” From a population of at least 300,000 in 1492, the Taino declined to about 33,000 by 1510 and to a mere 500 by 1548. The great missionary friar Bartolomé de Las Casas mourned the virtual extermination “of the immensity of the peoples that this island held, and that we have seen with our own eyes.”

    In sum, the natives suffered from a deadly combination of microparasitism by disease and macroparasitism by Spanish colonizers, preying upon native labor. Although not genocidal in intent—for the Spanish preferred to keep the Taino alive and working as tributaries and slaves—the colonization of Hispaniola was genocidal in effect.

    In any given locale, the first wave of epidemics afflicted almost every Indian. Within a decade of contact, about half the natives died from the new diseases. Repeated and diverse epidemics provided little opportunity for native populations to recover by reproduction. After about fifty years of contact, successive epidemics reduced a native group to about a tenth of its precontact numbers.

    Most scholars now gravitate to the middle of that range: about fifty million Indians in the two American continents, with about five million of them living north of Mexico.

    Apparently only one major disease, venereal syphilis, passed from the Americas into Europe with the returning explorers and sailors. If so, syphilis exacted a measure of revenge on behalf of the native women raped by the invaders.

    The Europeans died in far greater numbers when they tried to colonize sub-Saharan Africa, where they did encounter relatively novel and especially virulent tropical diseases, principally falciparum malaria and yellow fever. Unwittingly, the Europeans imported those African diseases into the American tropics and subtropics with the slaves brought to work on their plantations. Those African maladies then added to the epidemics that devastated the Native Americans.

    In effect, the Old World diseases benefited from a much larger pool of potential hosts. Passing to and fro, these pathogens gradually strengthened the immunities of the disease-embattled peoples of the Old World, rendering them deadly carriers when they passed into places where those diseases were not endemic.

    By living in filth, urban Europeans paid a high price in steady losses to endemic disease and occasional exposure to new epidemics. But they also rendered themselves formidable carriers of diseases to distant and cleaner peoples with far less experience with so many pathogens.

    North American natives domesticated only one mammal, the dog, which rarely shares diseases with its best friends.

    One disease often weakened a victim for a second to kill. For example, many Indians barely survived smallpox only to succumb to measles, pneumonia, or pleurisy.

    For want of healthy people to tend the sick, to fetch food and water and keep fires going, many victims died of starvation, dehydration, or exposure.

    Neither sixteenth-century natives nor colonizers knew about the existence of microbes, much less that some caused disease. Instead, both assumed that the epidemics manifested some violent disruption of supernatural power. Colonists interpreted the diseases as sent by their God to punish Indians who resisted conversion to Christianity. Indians blamed the epidemics on sorcery practiced by the newcomers. When the native shamans failed to stop or cure a disease, they became discredited as ineffectual against the superior sorcery of the newcomers, who survived epidemics that slaughtered the natives.

    During the sixteenth and seventeenth centuries, the colonizers did not intentionally disseminate disease. Indeed, they did not yet know how to do so. Especially during the sixteenth century, the colonizers valued Indian bodies and souls even more than they coveted Indian land. They needed Indians as coerced labor to work on mines, plantations, ranches, and farms. And Christian missionaries despaired when diseases killed Indians before they could be baptized.

    Prior to 1820, at least two-thirds of the twelve million emigrants from the Old to the New World were enslaved Africans rather than free Europeans.

    From a population of 5 million in 1492, the inhabitants of Great Britain surged to 16 million by 1800, when another 5 million Britons already lived across the Atlantic.

    The demographic and colonial history of Africa offers an instructive contrast to North America. Despite inferior firepower, until the nineteenth century the Africans more than held their own against European invaders because African numbers remained formidable. Unlike the Native Americans, the Africans did not dwindle from exposure to European diseases, with which they were largely familiar. On the contrary, African tropical diseases killed European newcomers in extraordinary numbers until the development of quinine in the nineteenth century.

    Native Americans had developed certain wild plants into domesticated hybrids that were more productive than their Old World counterparts. Measured as an average yield in calories per hectare (a hectare is ten thousand square meters, the equivalent of 2.5 acres), cassava (9.9 million), maize (7.3 million), and potatoes (7.5 million) all trump the traditional European crops: wheat (4.2 million), barley (5.1 million), and oats (5.5 million). By introducing the New World crops to the Old World, the colonizers dramatically expanded the food supply and their population.

    In Europe, maize and potatoes endowed farmers with larger yields on smaller plots, which benefited the poorest peasants. It took at least five acres planted in grain to support a family, but potatoes could subsist three families on the same amount of land.

    In effect, maize and potatoes extended the amount of land that Europeans could cultivate either to feed themselves or to produce fodder for their cattle.

    During the eighteenth century, the potato first gained its close association with Ireland, and Irish numbers grew from 3 million in 1750 to 5.25 million in 1800. The Irish then became vulnerable to any blight that devastated their potato crop. When such a blight struck during the 1840s, thousands starved to death and millions fled overseas, primarily to North America.

    Other European animals hitched along to the Americas despite the colonizers’ best efforts to prevent it. These included the European rats, which were larger and more aggressive than their North American counterparts. Hated parasites on crops and granaries, the rats were skilled stowaways in almost every wooden ship.

    Today botanists estimate that 258 of the approximately 500 weed species in the United States originated in the Old World.

    By a mix of design and accident, the newcomers triggered a cascade of processes that alienated the land, literally and figuratively, from its indigenous people.

    In sum, native peoples and their nature experienced an invasion not just of foreign people but also of their associated livestock, microbes, vermin, and weeds.

    To justify their own imperialism, the rival Europeans elaborated upon some very real Spanish atrocities to craft the notorious and persistent “Black Legend”: that the Spanish were uniquely cruel and far more brutal and destructive than other Europeans in their treatment of the Indians.

    Alternating brutal force with shrewd diplomacy, Cortés won support from the native peoples subordinated by the Aztecs.

    Every year it hosted public ritual human sacrifices of captured people, their chests cut open and their still-beating hearts held up to the sun.

    The population of about 200,000 dwarfed the largest city in Spain, Seville, which had only 70,000 inhabitants. Accustomed to the din, clutter, and filth of European cities, Spaniards marveled at the relative cleanliness and order of the Aztec metropolis.

    The conquistadores certainly benefited from the technological superiority of Spanish weaponry. Because sixteenth-century guns, known as arquebuses, were crude, heavy, inaccurate, and slow to reload, only a few conquistadores carried them (Cortés’s force of six hundred men had only thirteen guns). Instead, most relied on steel-edged swords and pikes and crossbows. Although essentially late-medieval, this steel weaponry was far more durable and deadly than the stone-edged swords, axes, and arrows of the natives.

    Spanish military technology also exploited horses and war dogs (mastiffs), both of which were new and stunning to Indians. Although most conquistadores fought on foot, the few with horses proved especially dreadful to the natives.

    The conquistador expeditions were private enterprises led by independent military contractors in pursuit of profit. The commander ordinarily obtained a license from the crown, which reserved a fifth of the plunder and claimed sovereign jurisdiction over any conquered lands. Known as an adelantado, the holder of a crown license recruited and financed his own expedition, with the help of investors who expected shares in the plunder. Developed in the course of the reconquista and applied to the Canaries, the adelantado system reflected the crown’s chronic shortage of men and money.

    Greed was a prerequisite for pursuing the hard life of a conquistador. Cortés meant to be disingenuous when he assured the Aztecs, “I and my companions suffer from a disease of the heart which can be cured only with gold.” Of course, he was more profoundly right than he realized.

    After all, the conquistadores scrupulously adhered to the Spanish law of conquest by reading the requerimiento, which ordered defiant Indians immediately to accept Spanish rule and Christian conversion. If the Indians ignored this order, they deserved the harsh punishments of a “just war.” The requerimiento announced, “The resultant deaths and damages shall be your fault, and not the monarch’s or mine or the soldiers.” Attending witnesses and a notary certified in writing that the requerimiento had been read and ignored, justifying all the deaths and destruction that followed. The cruel absurdity of reading the requerimiento in a language alien to Indians was apparent to many Spanish priests if not to the conquistadores.

    During the 1530s the leading conquistadores either died fighting one another over the spoils of conquest, as did Pizarro in Peru, or were forced into retirement by the crown, which was the fate of Cortés in 1535.

    Hungry, overworked, and dislocated, the natives of Mexico were especially vulnerable to disease. The native population dwindled from a pre-conquest ten million natives to about one million by 1620.

    By the 1570s the number of emigrant women had increased but remained less than a third of the total. As a result, the male emigrants usually took wives and concubines among the Indians, producing mixed offspring known as mestizos.

    One investigative report on a viceroy of Peru ran to 49,555 pages.

    Unfortunately, any colonial request for crown instructions required at least a year for an answer, given the slow pace of transatlantic shipping and the bureaucratic inertia in Spain. One despairing viceroy complained, “If death came from Madrid, we should all live to a very old age.”

    Between 1500 and 1650 the Spanish shipped from America to Europe about 181 tons of gold and 16,000 tons of silver.

    After the relative price stability of the fifteenth century, Europeans experienced a fivefold rise in prices during the sixteenth century. Laboring people especially suffered from the inflation, because the cost of living rose faster than their wages.

    In 1523 much of the gold stolen by Cortés from the Aztecs and shipped homeward was restolen by French pirates in the Atlantic.

    Because conquistadores lived as parasites off the native produce of the invaded regions, they could not linger where the Indians did not practice horticulture. Fields of maize attracted conquistadores, and their absence deterred them.

    He and his companions had trekked across much of North America, from the swamps of Florida to the coast of Texas and then through the deserts, mountains, and valleys of northern Mexico. Along the way, Cabeza de Vaca endured a searing double transformation, first from conquistador to slave, and then from slave to sacred healer.

    The passage of nearly five centuries has rendered the sixteenth-century peoples even more culturally alien from us than they were to one another.

    As their societies shrank and relocated, they became less complex, diminishing the power of the chiefs. In most places there were simply no longer enough people to raise the agricultural tribute necessary to sustain a costly and elaborate elite.

    In the depopulated valleys, forests and wildlife gradually reclaimed the abandoned maize and bean fields, while the refugees farmed the less fertile but safer hills. The resurgent wildlife included bison, common in the southeast by 1700 but never sighted by Soto’s conquistadores 160 years before. Far from timeless, the southeastern forest of the eighteenth century was wrought by the destructive power of a sixteenth-century European expedition. Soto had created an illusion of a perpetual wilderness where once there had been a populous and complex civilization.

    The new confederations exemplified the widespread process of colonial “ethnogenesis”—the emergence of new ethnic groups and identities from the consolidation of many peoples disrupted by the invasion of European peoples, animals, and microbes.

    In fact, after 1700 most North American Indian “tribes” were relatively new composite groups formed by diverse refugees coping with the massive epidemics and collective violence introduced by colonization.

    To make a vivid and intimidating example, Coronado ordered one hundred captured warriors burned to death at the stake.

    In frustration and fury, the Spaniards tortured and strangled their Pueblo guide, who confessed the plot to lead the Spanish astray where they might get lost and die.

    In an ironic reversal of the usual colonial process, the wrecks endowed a native people with gold, silver, and slaves, for the Calusa Indians scavenged the hulks for the shiny metals and enslaved the castaway sailors.

    In 1673 the governor of Cuba confessed, “It is hard to get anyone to go to San Agustín because of the horror with which Florida is painted. Only hoodlums and the mischievous go there from Cuba.”

    Conversion, however, came at a cultural cost. The priests systematically ferreted out and burned the wooden idols cherished by the natives, banned their traditional ball game, and enforced Christian morality, which required marriage, monogamy, and clothing that covered female breasts.

    Conversion bought safety from Spanish muskets but not from Spanish microbes.

    Caught in a double squeeze of high costs and small income, the New Mexicans had the lowest standard of living of any colonists in North America.

    Never more than 1,000 during the seventeenth century, the colonists remained greatly outnumbered by the Indians, despite the epidemics that reduced Pueblo numbers from 60,000 in 1598 to 17,000 in 1680.

    Because the Pueblo peoples already lived in permanent, compact horticultural villages, it was relatively easy to create a mission simply by adding a church, a priest or two, and a few soldiers.

    Indeed, many Pueblo hoped that a military alliance with the Spanish would protect both from the nomadic warrior bands—Apache and Ute—of the nearby mountains and Great Plains.

    The priests also stood out among the other Hispanics because they rarely raped Indian women and preferred their vow of poverty to the accumulation of gold.

    In their theatricality, celibacy, endurance of pain, and readiness to face martyrdom the priests manifested an utter conviction of the truth and power of their God.

    Consequently, the priests were in a state of probation as the Pueblo tried to determine whether they benefited or suffered from the Christian power over the spirit world. No matter how successful in getting a church built and hundreds baptized, every priest lived in the shadow of violent death. If the epidemics increased, natives who had seemed docile could conclude that their priests were dangerous sorcerers who must be killed. Of the approximately one hundred Franciscans who served in New Mexico during the seventeenth century, forty died as martyrs to their faith.

    The missionaries also encouraged restive soldiers to imprison one governor for nine months and to assassinate a second.

    Previously lacking any common language and identity, the Pueblo peoples obtained both—as Spanish became a common second language and as they developed shared grievances against a set of exploiters. Both developments improved their ability to unite against the colonizers.

    Especially appealing to men outraged at the Franciscan attack on polygamy, Popé promised each warrior a new wife for every Hispanic he killed.

    Popé encouraged the Pueblo to restore their native names and to reverse their baptisms by plunging into the Rio Grande in a ceremony of purification. He declared Christian marriages dissolved and polygamy restored. To replace the churches, the Indians restored their sacred kivas. Popé urged forsaking everything Hispanic, including the new crops and domesticated livestock, but most Pueblo found these too useful to relinquish. Selective in adapting Hispanic culture, the Indians were equally selective in rejecting it.

    Entangled in alliances with Indians, European traders often felt compelled to assist native wars that complicated and slowed their pursuit of profit. From the Indian perspective, the French came, in the words of historian Allan Greer, “not as conquering invaders, but as a new tribe negotiating a place for itself in the diplomatic webs of Native North America.” In those webs, the Indians negotiated from a position of strength.

    Because Indians voluntarily performed the hard work of hunting the animals and treating their furs, traders could immediately profit in America without the time, trouble, expense, and violence of conquering Indians to reorganize their labor in mines and plantations.

    The natives also adapted alcohol to their own purposes. At first, they balked at the novel taste and disorienting effect, but eventually they developed a craving. Drinking as much and as rapidly as they could, the Indians got drunk as a short cut to the spiritual trances that had previously required prolonged fasting and exhaustion. Alcohol also offered a tempting release of aggressions, ordinarily repressed with great effort and much stress, because Indian communities demanded the consistent appearance of harmony. Regarding alcohol as an animate force, natives believed that drinkers were not responsible for their violent actions. Initially appealing and apparently liberating, alcohol became profoundly destructive once it became common and cheap. In drink, natives lashed out with knives and hatchets, killing their own people far more often than the colonial suppliers of their new drug. Fortunately, during the seventeenth century, the natives’ access to alcohol remained limited and sporadic, permitting only occasional binges.

    Occasionally the more ruthless mariners interrupted trade to kidnap Indians as human commodities. Taken to Europe, they were put on profitable display as curiosities and trained to assist future voyages as interpreters. Eager for a voyage home, the captives shrewdly told their captors what they wanted to hear, promising to reveal gold and silver and friendly Indians eager for Christianity. Unfortunately, European diseases consigned most of the captives to European graves before European fantasies could take them home.

    By the mid-seventeenth century, the trade goods were sufficiently common that the northeastern Algonquian peoples had forsaken their stone tools and weapons—and the craft skills needed to produce them. If cut off from trade, natives faced deprivation, hunger, and destruction by their enemies.

    Although the fur trade pitted the Indians against one another in destructive competition, no people could opt out of the intertwined violence and commerce. As a matter of life and death, every native people tried to attract European traders and worked to keep them away from their Indian enemies.

    Combining the talents of trader, soldier, cartographer, explorer, and diplomat, Champlain recognized that French success in Canada depended upon building an alliance with a network of native peoples.

    The introduction of firearms revolutionized Indian warfare as the natives recognized the uselessness of wooden armor and the folly of massed formations. Throughout the northeast, the Indians shifted to hit-and-run raids and relied on trees for cover from gunfire.

    Natives feared that their dead would linger about the village, inflicting disease and misfortune unless appeased with loud and expressive mourning. To draw the bereaved out of their agony and to encourage dead spirits to proceed to their afterlife, neighbors staged condolence rituals with feasts and presents. The best present of all was a war captive meant to replace the dead.

    Captive men more often faced death by torture, especially if they had received some crippling wound. Inflicting death as slowly and painfully as possible, the Iroquois tied their victim to a stake, and villagers of both genders and all ages took turns wielding knives, torches, and red-hot pokers systematically to torment and burn him to death. The ceremony was a contest between the skills of the torturers and the stoic endurance of the victim, who manifested his own power, and that of his people, by insulting his captors and boasting of his accomplishments in war. After the victim died, the women butchered his remains, cast them into cooking kettles, and served the stew to the entire village, so that all could be bound together in absorbing the captive’s power. By practicing ceremonial torture and cannibalism, the Iroquois promoted group cohesion, hardened their adolescent boys for the cruelties of war, and dramatized their contempt for outsiders.

    Although horrifying to European witnesses, the torments of northeastern torture had their counterparts in early modern Europe, where thousands of suspected heretics, witches, and rebels were publicly tortured to death: burned at the stake, slowly broken on a wheel, or pulled apart by horses. The seventeenth century was a merciless time for the defeated on either side of the Atlantic.

    In these ceremonies, the chiefs presided as the kinfolk of a killer gave presents to the relatives of the victim. Delivery and acceptance restored peace and broke the cycle of revenge killings.

    Seventeenth-century Europeans regarded non-Europeans as socially and culturally inferior—but not as racially incapable of equality. Lacking a biological concept of race, seventeenth-century Europeans did not yet believe that all people with a white skin were innately superior to all of another color. European elites primarily perceived peoples in terms of social rank rather than pigmentation.

    Rather than compel Indians to learn French and relocate into new mission towns, the Jesuits mastered the native languages and went into their villages to build churches.

    One priest returned to the Huron after having survived capture and torture by the Iroquois, losing most of his fingers. Because the Huron cherished stoicism under torture as the ultimate test of manhood, they honored this priest. One Huron remarked, “I can neither read nor write, but those fingers which I see cut off are the answer to all my doubts.”

    As the Jesuits gathered a following, they demanded more cultural concessions from their Huron converts. The Jesuits denounced torture and ritual cannibalism, premarital sex, divorce, polygamy, and the traditional games, feasts, and dances.

    During the assaults, Jesuit priests hurriedly baptized all they could reach before they too were hacked or burned to death. By 1650 the Huron villages had all been destroyed.

    Moreover, during the mid-sixteenth century, the English were preoccupied with the conquest and colonization of Ireland.

    Later in the century, success in Ireland emboldened English leaders to extend their colonial ambitions across the Atlantic to the region they called Virginia, named in honor of their queen, Elizabeth I, a supposed virgin. Between 1580 and 1620 the English applied the name to the entire mid-Atlantic coast between Florida and Acadia.

    Unlike the authoritarian kings of France and Spain, Queen Elizabeth had to share national power with the aristocracy and gentry, who composed the bicameral national legislature known as Parliament.

    Although a narrow system of government by our standards, the English constitution was extraordinarily open and libertarian when compared with the absolute monarchies then developing in the rest of Europe. Consequently, it mattered greatly to the later political culture of the United States that England, rather than Spain or France, eventually dominated colonization north of Florida.

    Probably about half the rural peasantry lost their lands between 1530 and 1630.

    Addressing propertied Englishmen, the colonial promoters announced that they had an easy solution for England’s social woes: exported to a new colony in Virginia, the idle and larcenous poor could be put to work raising commodities for transport to, and sale in, England.

    Contrary to the Black Legend, the English treated the Irish no better than the Spanish treated the Guanche, and they offered no prospect of fairer play for the Indians of Virginia.

    At last, in August 1590, White returned to Roanoke with a relief expedition to find the settlement mysteriously abandoned with no signs of attack by either the Indians or the Spanish. The lone clue was carved into a tree—the word “Croatoan,” the name of a nearby island. But the fearful and impatient English mariners refused to venture through the dangerously shallow waters to Croatoan to investigate. Sailing away in pursuit of Spanish treasure ships, the mariners abandoned any surviving colonists to their still mysterious fate.

    After retreating to Croatoan and failing to contact a passing ship, the surviving colonists probably headed north to Chesapeake Bay to execute their original plan. They apparently found haven in an Indian village. In 1607, when English colonists reached Chesapeake Bay, some Indians reported that white people had recently lived nearby as refugees in a native village. Unfortunately, the village had run afoul of a powerful chieftain, Powhatan, who killed all the refugees.

    Neither house nor furnishings provided opportunities for the conspicuous consumption that helped determine status in England.

    But the Algonquians recoiled in horror at the prospect of adopting a European way of life that would obligate their men to forsake war and, instead, adopt the female role of agricultural laborer.

    One starving colonist killed and ate his wife, for which he was tried, convicted, and burned at the stake.

    Between 1607 and 1622 the Virginia Company transported some 10,000 people to the colony, but only 20 percent were still alive there in 1622.

    In England, birth and wealth had screened the gentlemen from manual labor, while the vagrants, for want of employment, had learned to survive by begging and stealing.

    Indeed, the company adopted a “head-right system” that awarded land freely to men with the means to pay for their own passage (and that of others) across the Atlantic. Such emigrants received fifty acres apiece, and another fifty acres for every servant or relative brought at their own expense. Servants were also entitled to fifty acres each, if and when they survived their terms of indenture—which afforded them new incentive to emigrate. As private property owners, rather than company employees, the colonists showed much greater initiative and effort in cultivating the corn, squash, and beans that ensured their subsistence. But to prosper, they still needed a commercial crop to market in England.

    The annual mortality rate remained about 25 percent until mid-century.

    The Virginians developed the strategy, practiced in subsequent colonial wars, of waiting until just before corn harvest to attack and destroy the Indian villages and their crops, consigning the natives to a winter and spring of exposure and starvation.

    During the seventeenth century, the English developed two types of colonial governments: royal and proprietary. Relatively few until the eighteenth century, the royal colonies belonged to the crown. Initially more numerous, the proprietary colonies belonged to private interests.

    And, as the promoters had predicted, the Chesapeake absorbed thousands of poor laborers considered redundant and dangerous in England.

    Their alliance became both easier and more essential at the turn of the century, when the great planters switched their labor force from white indentured servants to enslaved Africans. Class differences seemed less threatening as both the common and great planters became obsessed with preserving their newly shared sense of racial superiority over the African slaves.

    In both Chesapeake colonies, the distant crown (for Virginia) or lord proprietor (for Maryland) had to share power with the wealthiest and most ambitious colonists. They refused to pay taxes unless authorized by their own elected representatives in a colonial assembly. Governors who defied the local elite faced obstruction and risked rebellion.

    This decentralization of power stood in marked contrast to the Spanish and the French colonies, which permitted neither elected assemblies nor individual liberties.

    Indeed, widows were few and their status brief in colonies where women were in such short supply and in such great demand for remarriage.

    The husband also supervised and disciplined his dependents: wife, children, and servants. If a servant, child, or wife killed his or her master, the law considered the culprit guilty of “Petit Treason” as well as murder.

    But the authorities also held the patriarchs responsible for the misconduct of their dependents. In 1663, a Virginia county court rebuked and punished both a maidservant, for public insolence, and her master, for failing to control her “scolding.”

    The planters also needed regularly to clear new fields with axes, for after three years of cropping, the cultivated lands lost their fertility, and the planter had to clear another field to allow the old to lie fallow.

    Given the short life expectancy of all Chesapeake laborers, planters wisely preferred to buy English indentured servants for four or five years rather than purchase the more expensive lifelong slaves from Africa. In 1650 enslaved Africans numbered only three hundred, a mere 2 percent of the Chesapeake population.

    English servants composed at least three-quarters of the emigrants to the Chesapeake during the seventeenth century: about 90,000 of the 120,000 total.

    Given that a sturdy beggar could never anticipate obtaining land in England, the colony offered an opportunity unavailable at home. Of course, that opportunity required men and women to gamble their lives in a dangerous land of hard work and deadly diseases.

    Before 1640, most indentured servants endured harsh but short lives in the Chesapeake. Having staked their health in pursuit of farms, most lost their gamble, finding graves before their terms expired.

    In part, health improved as many new plantations expanded upstream into locales with fresh running streams, away from the stagnant lowlands, which favored malaria, dysentery, and typhoid fever.

    The “seasoned” acquired a higher level of immunity, which they passed on to their offspring.

    The entry costs of tobacco planting were modest: a set of hand tools, a year’s provisions, a few head of cattle and pigs, some seed, and about fifty acres of land.

    At any given time, a planter cultivated only about a tenth of his farm, leaving most of his domain heavily forested.

    The common people ate with their fingers, sharing a bowl and drinking from a common tankard, both passed around the table. They usually ate a boiled porridge of corn, beans, peas, and pork, washed down with water or cider. Most colonists had plenty to eat, in contrast to their past in both England and the early years of the colony. By moving to the Chesapeake, the common colonist sacrificed comfort and life expectancy for an improved diet and the pride and autonomy of owning land.

    During the 1660s, new imperial regulations worsened the tobacco glut by requiring colonists to ship their tobacco exclusively to England in English ships.

    An assemblyman received 150 pounds of tobacco in pay per day in session—about five times in value what was paid to his counterpart and contemporary in colonial Massachusetts. Governor Berkeley annually collected a salary of £1,000. To put that in perspective, most emigrants mortgaged at least four years of their working lives to pay the £6 cost of a transatlantic passage, and a small planter was fortunate to clear £3 annually over and above expenses.

    In pity for himself, Governor Berkeley complained, “How miserable that man is that Governes a People wher[e] six parts of seaven at least are Poore, Endebted, Discontented, and Armed.”

    Determined to enjoy the perquisites and rewards of a hierarchical society, Bacon and his lieutenants intended no egalitarian revolution.

    Although Bacon attacked a royal governor, he did not seek independence from England. In 1676 no Virginian imagined that independence was feasible or desirable.

    In stark contrast to those of Berkeley’s day, Virginia’s eighteenth-century assemblymen cultivated popularity by conspicuously opposing taxes, infuriating a succession of royal governors with instructions to secure a revenue for imperial defense.

One royal governor denounced the assemblymen for striving “to recommend themselves to the populace upon a received opinion among them, that he is the best Patriot that most violently opposes all Overtures for raising money.”

    By reducing taxes, the Virginia gentry reinvented themselves and Virginia politics, transferring the odium of parasitism and tyranny to the royal governor. This dramatically reversed the role that the crown had claimed in 1677 as the putative defender of the common planter.

Eighteenth-century Virginians were both exceptionally hospitable and genial, but also shallow and materialistic.

Virginia gentlemen maintained their standing by mastering the genteel public style known as “condescension”: a gentleman’s ability to treat common people affably without sacrificing his sense of superiority.

    Held at the county courthouse, the election was public, with each voter individually stepping forward to voice his vote, for recording by a clerk. By such performances, common voters showed gratitude for past favors and solicited future goodwill from their favored gentleman. Upon receiving a vote, the candidate politely thanked the voter, displaying the condescending gratitude of a true gentleman worthy of high office.

    At the end of the seventeenth century, slaves became a better investment, as servants became scarcer and more expensive: £25 to £30 for a lifelong slave compared well with £15 to purchase just four years of a servant’s time.

    The slave numbers surged from a mere 300 in 1650 to 13,000 by 1700, when Africans constituted 13 percent of the Chesapeake population. During the early eighteenth century, their numbers and proportion continued to grow, reaching 150,000 people and 40 percent by 1750.

    The planters shifted from servants to slaves for economic reasons, but that change incidentally improved their security against another rebellion by angry freedmen.

    More commonly, masters permitted slaves to acquire and manage their own property, primarily a few chickens, hogs, cattle, and small garden plots of maize and tobacco. By accumulating and selling property, dozens of early slaves purchased their freedom and obtained the tools, clothing, and land to become common planters. Because the colonial laws did not yet forbid black progress, the black freedmen and women could move as they pleased, baptize their children, procure firearms, testify in court, buy and sell property, and even vote. Some black men married white women, which was especially remarkable given their scarcity and high demand as wives for white men. A few black women took white husbands.

    The most successful and conspicuous black freedman, Anthony Johnson, acquired a 250-acre tobacco plantation and at least one slave. With apparent impunity, Johnson boldly spoke his own mind to his white neighbors, telling one meddler: “I know myne owne ground and I will worke when I please and play when I please.” When white neighbors lured away his slave, Johnson went to court, winning damages and the return of his property. That the authorities supported an African against whites and upheld his right to own slaves reveals that slavery and racism had not yet become inseparably intertwined in the Chesapeake. That a black man would own a slave also indicates that getting ahead in planter society was more important to Johnson than any sense of racial solidarity with his fellow Africans in Virginia.

In 1680, Virginia prescribed thirty lashes on the bare back of any black slave who threatened or struck any white person, which invited poor whites to bully slaves with impunity, creating a common sense of white mastery over all blacks.

    Raping a slave was not a crime but marrying her was. In 1705 the law subjected any minister who conducted an interracial marriage to a fine of ten thousand pounds of tobacco. A white man who married a free black or a white woman who slept with any black man faced six months in prison and a £10 fine.

    Dreading reenslavement, the descendants of Anthony Johnson fled from Virginia, where their grandfather had been a respected freeholder able to defeat whites in lawsuits.

    Where most Chesapeake settlers were poor and short-lived indentured servants, New England attracted primarily “middling sorts” who preserved their freedom because they could pay their own way across the Atlantic.

    Puritan values helped the colonists prosper in a demanding land. In the process, they developed a culture that was both the most entrepreneurial and the most vociferously pious in Anglo-America. Contrary to the declension model promoted by some historians, the increasing commercialism of New England life at the end of the seventeenth century derived from Puritan values rather than manifested their decay.

    Begun as an epithet, “Puritan” persists in scholarship to name the broad movement of diverse people who shared a conviction that the Protestant Reformation remained incomplete in England.

    A Puritan explained, “God sent you unto this world as unto a Workhouse, not a Playhouse.”

    Puritans longed to purify the churches by ousting all conspicuous sinners and by inviting members to monitor one another for consistent morality and sound theology. This zeal, however, dismayed most English people, who preferred Anglicanism and the traditional culture characterized by church ales, Sunday diversions, ceremonial services, inclusive churches, and deference to the monarch.

    The first Puritan emigrants consisted of 102 Separatists, subsequently called the Pilgrims. In 1620 they crossed the Atlantic in the ship Mayflower to found a town named Plymouth on the south shore of Massachusetts Bay. Beneficiaries of a devastating epidemic that had recently decimated the coastal Indians, the Plymouth colonists occupied an abandoned village with conveniently cleared fields.

    Once in Massachusetts, the company leaders established the most radical government in the European world: a republic, where the Puritan men elected their governor, deputy governor, and legislature (known as the General Court). Until his death in 1649, John Winthrop almost always won annual reelection as governor.

    Because the Puritans prepared for the next world by their moral life in this one, their rhetoric yoked together material aspiration and the pursuit of salvation. It is anachronistic for us to separate the two.

    Purely economic motives, however, would have dispatched few people to cold, distant, and rocky New England. English people could more cheaply, easily, and certainly improve their material circumstances by moving to the nearby and booming Netherlands, which welcomed skilled immigrants.

    The Puritans understood in spiritual terms many causes that we might define as “economic.” They interpreted the wandering beggars, increased crime, cloth trade depression, and famines as divine afflictions meant to punish a guilty land that wallowed in sin.

    Battling the prevailing Atlantic winds and currents, the slow-moving vessels usually took eight to twelve weeks to cross. Few of the Puritans, who were mostly artisans and farmers, or their wives and children had traveled by ship. On board the standard vessel, about one hundred passengers shared the cold, damp, and cramped hold with their property, including some noisy and rank livestock.

    First, most English Puritans persisted at home, waiting to see how God would treat both the mother country and the New England experiment. Second, the New England emigration represented only 30 percent of all the English who crossed the Atlantic to the various colonies during the 1630s. Many more people emigrated to the Chesapeake and the West Indies. Third, the Great Migration was brief, for emigration declined to a trickle after 1640, amounting to only seven thousand for the rest of the century.

    At mid-century, the New England sex ratio was six males for every four females, compared with four males for every female in the Chesapeake. Greater balance encouraged a more stable society and a faster population growth.

    In 1700 less than 2 percent of New England’s inhabitants were slaves, compared with 13 percent for Virginia and 78 percent for the English West Indies. Compared with the rest of the empire, New England possessed an unusually homogeneous colonial population and culture: free, white, and transplanted English.

    Relative to the Chesapeake, the New England environment demanded more labor and provided smaller rewards, but it also permitted longer and healthier lives. In contrast to the Chesapeake tidewater with its long, hot, and humid summers and low topography, New England was a northern and hilly land with a short growing season and faster-flowing rivers and streams, which discouraged the malaria and dysentery that afflicted southern planters. In New England, people who survived childhood could expect to live to about seventy; in the Chesapeake, only a minority survived beyond forty-five.

    Because New England had the most decentralized and popularly responsive form of government in the English empire, royalists despised the region as a hotbed of “republicanism.”

Puritan parents rarely dictated marriage partners to their children, but they could veto choices that seemed unwise.

    New England women could also more easily obtain divorce when abandoned or sexually betrayed by their husbands. Historian Cornelia Dayton concludes that the effort “to create the most God-fearing society” tended “to reduce the near-absolute power that English men by law wielded over their wives.”

    In effect, seventeenth-century New England and the English West Indies developed in tandem as mutually sustaining parts of a common economic system. Each was incomplete without the other. New English freedom depended on West Indian slavery.

    By 1700, Boston alone had fifteen shipyards, which produced more ships than the rest of the English colonies combined. Indeed, Boston ranked second only to London as a shipbuilding center in the empire.

    Seizing upon New England’s reputation in the mother country as a den of Puritan heretics and hypocrites, English economic interests called for an end to New England’s virtual autonomy within the empire.

    As God’s favored people, they considered themselves the heirs to the ancient Israelites of the Old Testament. If they honored his wishes, God would bestow health and abundance upon them in this world. But should they deviate from his will in any way, God would punish them as rebels—more severely than he chastised common pagans, like the Indians.

    In 1650, Massachusetts had one minister for every 415 persons, compared with one per 3,239 persons in Virginia.

The average New English churchgoer heard about seven thousand sermons in the course of his or her lifetime. To train an orthodox Puritan ministry for so many churches, Massachusetts founded Harvard College in 1636—the first such institution in English America.

    The remaining sticklers for the old purity bolted to join the Baptists, a Separatist denomination that rejected infant baptism in favor of adult baptism as an initiation to full membership.

    The most sensational cases involved male sex with animals. In 1642 the New Haven authorities suspected George Spencer of bestiality when a sow bore a piglet that carried his resemblance. He confessed and they hanged both Spencer and the unfortunate sow. New Haven also tried, convicted, and executed the unfortunately named Thomas Hogg for the same crime.

    No Catholics, Anglicans, Baptists, or Quakers need come to New England (except to exceptional Rhode Island). All dissenters were given, in the words of one Massachusetts Puritan, “free Liberty to keep away from us.”

    By drawing dissidents out of Massachusetts and Connecticut, the Rhode Island settlements helped to maintain orthodoxy in the two major Puritan colonies. Although the orthodox leaders of Massachusetts and Connecticut despised Rhode Island, they benefited from it as a safety valve for discontents who would otherwise fester in their midst.

    The authorities pardoned witches who confessed and testified against others, but persistent denial consigned the convicted witch to public execution by hanging. Contrary to popular myth and previous European practice, the New English did not burn witches at the stake.

Witchcraft was also plausible because some colonists did dabble in the occult to tell fortunes and to cure, or inflict, ills (but there is little reason to believe that such “cunning folk” worshiped Satan). Moreover, occult beliefs are self-fulfilling.

    Communities and authorities disproportionately detected witchcraft in women who seemed angry and abrasive, violating the cultural norm celebrating female modesty. Women constituted both the majority of the accusers and 80 percent of the accused.

    Because it was no easy matter to prove witchcraft, juries usually found innocence. The New English prosecuted ninety-three witches but executed only sixteen—until 1692, when a peculiar mania at Salem dramatically inflated the numbers.

The jeremiad exhorted listeners to reclaim the lofty standards and pure morality ascribed to the founders of New England. Paradoxically, the popularity of the genre attested to the persistence, rather than the decline, of Puritan ideals in New England. Determined to live better, the laity longed for the cathartic castigation of the jeremiad. And the ministry complied with eloquence and zeal. But English Puritans often took the jeremiads at face value, confirming their unduly low estimation of New England.

    In 1679 the Boston synod of ministers denounced frontier settlers who succumbed to “an insatiable desire after Land and worldly Accommodations, yea, so as to forsake Churches and Ordinances, and to live like Heathen, only that so they might have Elbow-room enough in the world.”

    The squashes and pumpkins spread out along the ground, discouraging the appearance of weeds between the maize plants and preserving moisture by shielding the earth from the sun. The interwoven roots strengthened the plants against the winds, and the cornstalks provided convenient poles for the climbing bean vines. In return, beans drew nitrogen from the air for fixing in the soil, partially compensating for the maize, which was nitrogen-depleting. The combination of plants also provided a balanced diet, because the beans offered protein and an amino acid, lysine, that when eaten with corn releases the corn’s protein.

    To facilitate their hunting and gathering, the Indians also set fire to the forest beyond their fields. The aboriginal fires were less intense and destructive than the American forest fires of the present day. Because our own society suppresses fire, contemporary forests accumulate, over the years, large quantities of deadwood and dry brush. When a fire does ignite and escapes control, it is explosive, spreading rapidly and destructively up into the forest canopy to consume mature trees. The seventeenth-century Indians managed more modest fires. Because their fires were kindled twice a year, in both spring and fall, they found only the limited amount of deadwood and brush that had accumulated in the interim. Such fires spared the tall and thick mature trees with a dense bark, shaping a relatively open forest of many large trees and few small ones. Noting the effect, if not always the cause, colonists marveled at their ability to ride freely between immense trees through long stretches of the forest.

    With fire the Indians shaped and sustained a forest that suited their needs. Regular burning favored large hardwoods, many of which yielded edible nuts. The relatively open forest also made it easier for hunters to see and pursue game. The regular burning diminished mice, fleas, and parasites that troubled people or the game that they ate. The fires also fertilized the forest floor and opened patches of sunlight. Both effects promoted ground-hugging plants, especially grasses and berries, which sustained a larger deer herd, to the ultimate benefit of their human hunters.

    Because hospitality and generosity were fundamental duties, violators reaped shame and ridicule. No one went hungry in an Indian village unless all starved. With so little to steal and so little need to, theft was virtually unknown and no one locked a wigwam.

    The colonists extorted wampum from the southern New England Indians and then shipped it to Maine to procure furs for shipment to England. In great and growing English demand, the furs helped finance the New English debts.

    In effect, the Puritan colonies ran a protection racket that compelled native bands to purchase peace with wampum.

    Lacking a collective identity as “Indians,” the natives continued to think of themselves as members of particular bands and tribes—which rendered them all vulnerable to colonial manipulation and domination.

    In 1670 the 52,000 New England colonists outnumbered the Indians of southern New England by nearly three to one.

    Above all, the missionaries exhorted the Indians to adopt the Puritan pace and mode of work, which meant long days of agricultural labor. Insisting upon the gendered division of labor favored by the English, the missionaries urged the Indian men to forsake hunting and fishing in favor of farming. The Indian women were supposed to withdraw from the cornfields to tend the home and to spin and weave cloth, just as New English women did.

    The New English called the bloodiest Indian war in their history King Philip’s War, after the Wampanoag sachem named Metacom but known to the New English as King Philip.

    The Indians’ mastery of the flintlock deprived the colonists of the technological edge they had enjoyed in the Pequot War.

    During 1675 the colonists could rarely find and attack their more mobile and elusive foes. As a result, many settlers succumbed to the temptation to attack, plunder, and kill those Indians they could easily locate: the praying town Indians.

    Because about a third of the natives in southern New England assisted the colonists, King Philip’s War became a civil war among the Indians.

    In the late seventeenth century, tourists did not visit Plymouth to see the now celebrated rock (which was then unidentified). Instead, they gaped at Metacom’s skull. One visitor, the famous minister Cotton Mather, angrily wrenched off and took away Metacom’s jawbone, completing his silencing.

    Tried and convicted, Tift suffered a traitor’s painful death, pulled apart by horses.

    Tobacco was valuable to the empire—indeed, more precious than all other mainland produce combined—but sugar was king. Sugar could bear the costs of long-distance transportation (and the purchase of slaves by the thousand) because it was in great and growing European demand to sweeten food and drink.

    Lacking cities and gold but possessing a fearsome reputation, the Caribs were the sort of Indians that the Spanish had learned to avoid.

The West Indies received over two-thirds of the English emigrants to the Americas between 1640 and 1660.

More English colonists lived in the West Indies (44,000) than in the Chesapeake (12,000) and New England colonies (23,000) combined.

    As positive incentives decayed after 1635, masters resorted more frequently and more brutally to punishment. They contemptuously referred to their servants as “white slaves” and applied the whip to drive and punish them—language and measures unthinkable in England.

    By preindustrial standards, the sugar planter ran a large and complex operation that combined agriculture and manufacturing. He needed at least one mill to crush juice out of the cane, a boiling house to clarify and evaporate the juice into brown sugar crystals, a curing house to drain out the molasses and dry the sugar, a distillery to convert the molasses into rum, and a warehouse to store the barreled sugar until he could ship it to Europe.

    Because cut cane spoiled unless processed within a few hours, the harvesting, milling, and boiling required close synchronization and quick work. Field gangs cut the ripe canes by hand with curved knives and carted the stalks to the mill, for prompt grinding between rollers turned by wind or cattle. Crushed from the cane, the juice had to be boiled within a few hours, before it could ferment. Boiling in a succession of copper kettles hung over a furnace evaporated the water, leaving a golden-brown sugar known as muscovado, which the planters packed into immense thousand-pound hogshead barrels and shipped to Europe, for further refinement there into white sugar for sale to consumers. Making muscovado also generated a cheap by-product, molasses, which could be rendered more valuable by distilling it into rum. Inexpensive to make, rum became the principal alcohol sold and consumed in the English empire.

    By 1660, Barbados made most of the sugar consumed in England and generated more trade and capital than all other English colonies combined.

    Despite its small scale, by 1660 Barbados had 53,000 inhabitants—a density of 250 persons per square mile, which rose to 400 by the end of the century. In 1700 the human concentration on Barbados was four times greater than in England.

    Because white men could more easily escape to pass as free on another island or aboard a pirate ship, planters increasingly saw an advantage in employing only permanent slaves of a distinctive color immediately and constantly identified with slavery.

    By 1660, Barbados had become the first English colony with a black and enslaved majority: 27,000 compared with 26,000 whites.

    The growing slave population depended on increased slave imports, for the Barbadian slaves died faster than they could reproduce. Although the planters brought 130,000 Africans into Barbados between 1640 and 1700, only 50,000 remained alive there at the dawn of the new century.

    Invariably, some reckless, frightened, or greedy slave alerted a master to the impending danger. Such reports kept the planters on edge and produced brutal retribution upon the suspected. In the first major alarm, in 1675, the planters executed thirty-five suspects; at least six of them were burned alive at the stake. The slave woman who revealed the conspiracy received her freedom from the colonial government, which compensated her master.

    This English refusal to convert slaves diverged sharply from the practice of French, Spanish, and Portuguese masters, who felt religiously and legally bound to promote the Catholic initiation of every soul, while they exploited the body. Only the Quaker minority challenged the ban at Barbados on converting the slaves. For this, they were considered dangerous radicals, and the government fined them about £7,000, executed one, and ordered their meetinghouse nailed shut.

    Once a land of apparent promise for common tobacco planters, Barbados had become the domain of sugar grandees and their African slaves.

    During the 1640s, they had increased their exposure to deadly diseases by importing slaves bearing new pathogens from Africa: principally yellow fever and malaria, which became the greatest killers of Barbadians, free and slave. The Africans also introduced and shared hookworm, yaws, guinea worm, leprosy, and elephantiasis.

    Fewer than fifteen hundred Spanish colonists and their slaves occupied part of the south coast in 1655, when their weakness attracted an English invasion and occupation.

    When buccaneers blew into town after a successful raid, Port Royal earned its reputation as the wickedest place in the English-speaking world: the Sodom of the West Indies. But paradoxical Port Royal also astonished visitors by hosting four churches (Anglican, Presbyterian, Quaker, and Catholic) and a synagogue for the more pious colonists. With 2,900 inhabitants in 1680, Port Royal was the third-largest town in English America, behind only Bridgetown on Barbados and Boston in New England.

    During the 1690s the crown dispatched a new governor with instructions to oust the buccaneers from Jamaica, which proved easier to accomplish in the wake of Sir Henry Morgan’s death in 1688. Suffering from cirrhosis of the liver, the heavy-drinking Sir Henry sought relief from an African folk doctor. But his treatments—injections of urine and an all-body plaster of moist clay—only hastened Morgan’s death.

In 1660, Jamaica had seemed big enough for both small and great planters, but by the end of the century it became the English colony most dominated by great planters and their slaves.

    At the end of the seventeenth century, white emigrants from the West Indies, particularly Barbados, carried the seeds of that society to the southern mainland by founding the new colony of Carolina.

In the 1670s, West Indian planters established a new colony on the Atlantic seaboard north of Florida but south of the Chesapeake. Called Carolina to honor King Charles II, the new colony included present-day North and South Carolina and Georgia.

    In their treaties with native peoples, the colonists insisted upon the return of all fugitive slaves as the price of peace and trade. As a further incentive, Carolina paid bounties to Indians who captured and returned runaways, at the rate of a gun and three blankets for each.

    To secure Carolina from Spanish attack and accelerate its economic development, the Lords Proprietor needed to attract more colonists quickly. The Lords offered the incentives most alluring to English settlers of the late seventeenth century: religious toleration, political representation in an assembly with power over public taxation and expenditures, a long exemption from quitrents, and large grants of land. The Lords Proprietor assured religious tolerance to everyone but atheists (who hardly existed anywhere in the seventeenth century), promising even Jews the liberty to practice their faith. To discourage violent religious disputes, the Lords forbade “any reproachful, reviling, or abusive Language” against the faith of another.

The average Carolina freedman accumulated more than 350 acres of land before death.

A detached cluster of settlements on Albemarle Sound, near Virginia, had been founded by Virginians during the 1650s. These settlements resented their inclusion in Carolina and resisted, sometimes violently, the collection of quitrents and customs duties by proprietary officials. In 1691 the Lords Proprietor mollified the Albemarle Sound colonists by establishing “North Carolina” as a distinct government with its own assembly and deputy governor.

    The division left Charles Town the capital of “South Carolina,” which the Goose Creek Men dominated. Arrogant and Anglican, the Goose Creek Men stifled the policy of religious toleration. In 1702, the assembly barred non-Anglicans from holding political office and established the Church of England as the colony’s official, tax-supported church. The Lords Proprietor accepted the restrictive new legislation, abandoning their principal supporters in the colony, the religious dissenters.

    As in the Chesapeake, the common and the great planters of Carolina established a white racial solidarity that, in politics, trumped their considerable differences in wealth and power.

    Carolina’s early leaders concluded that the key to managing the local Indians was to recruit them as slave catchers by offering guns and ammunition as incentive.

    The Carolina trader benefited from the native custom of providing wives to welcome newcomers.

    Consequently, the Carolinians exported most of the Indian captives to the West Indies, especially Barbados, trading them for Africans, who were then brought back to work the Carolina plantations. The exchange rate of two Carolina Indian slaves for one African reflected the shorter life expectancy of the enslaved native.

Florida’s Indian population collapsed from about 16,000 in 1685 to 3,700 in 1715, and the missions shrank to a few in the immediate vicinity and partial security of San Agustín.

Carolina cattle herding developed the practices later perfected on a grand scale in the American West, including cattle branding, annual roundups, cow pens, and cattle drives from the interior to the market in Charles Town.

    In Carolina the black herdsmen became known as “cowboys”—apparently the origin of that famous term.

    The colony rewarded with freedom any black who killed an enemy in time of war.

    Enjoying a protected market within the empire for both rice and indigo, Carolina planters became the wealthiest colonial elite on the Atlantic seaboard—and second only to the West Indians within the empire.

The Carolina gentry became even more gracious, polite, genteel, and lavish than the gentlemen of Virginia. Competing for status, the Carolina planters vied to serve the best wines, to display fine silverware and furniture, to appear in silk clothing, and to muster servants dressed in livery.

    An elite Carolinian conceded, “We eat, we drink, we play, and shall continue to until everlasting flames surprise us.”

Harsh working conditions and the disease-ridden lowland environment produced a slave mortality in excess of the birthrate.

White men were required to bear firearms to church, to deter the blacks from rebelling on a Sunday.

    The authorities employed torture to obtain confessions, which led to executions, sometimes by hanging but usually by burning at the stake.

Georgia’s founders were merchants, landed gentry, and Anglican ministers. They hoped to alleviate English urban poverty by shipping “miserable wretches” and “drones” to a new southern colony, where hard work on their own farms would cure indolence. By this moral alchemy, people who drained English charity would become productive subjects working both to improve themselves and to defend the empire on a colonial frontier.

Farther upriver Oglethorpe located the town of Ebenezer, as a haven for German Lutherans recently evicted from a Catholic principality.

    Moreover, black slavery made manual labor seem degrading to free men, which discouraged exertion by common whites, who aspired, instead, to acquire their own slaves to do the dirty work.

Seeking colonists willing to labor and capable of bearing arms, the Georgia Trustees wanted many compact farms worked by free families, instead of larger but fewer plantations dependent upon enslaved Africans. To mandate their vision, the founders restricted most new settlers to fifty-acre tracts—about an eighth of the size of a Carolina plantation—and the trustees forbade the importation or possession of slaves.

The Georgia Trustees rejected the slave system so fundamental and profitable to the rest of the empire. Driven by concerns for military security and white moral uplift, the antislavery policy expressed neither a principled empathy for enslaved Africans nor an ambition to emancipate slaves elsewhere.

    To discourage litigation and agitation, the founders also banned lawyers from practicing in the new colony.

    During the late 1730s and early 1740s, the trustees lifted the bans on lawyers, liquor, and large landholdings—but held firm against slavery and an assembly.

    The Georgia dissidents rallied behind the revealing slogan “Liberty and Property without restrictions”—which explicitly linked the liberty of white men to their right to hold blacks as property. Until they could own slaves, the white Georgians considered themselves unfree. Such reasoning made sense in an eighteenth-century empire where liberty was a privileged status that almost always depended upon the power to subordinate someone else.

In 1751 the trustees capitulated, permitting slavery and surrendering Georgia to the crown.

    From about 3,000 whites and 600 blacks in 1752, Georgia’s population surged to 18,000 whites and 15,000 blacks in 1775.

    More fertile and temperate than New England, but far healthier than the Chesapeake, the mid-Atlantic region was especially promising for cultivating grain, raising livestock, and reproducing people.

    The acquisition of New Netherland (which had swallowed up New Sweden) would also close the gap between the Chesapeake and New England, promoting their mutual defense against other empires and the Indians.

    By virtue of their especially indulgent charters, the New England colonies were virtually independent of crown authority. Answering to no external proprietors, the New English developed republican regimes where the propertied men elected their governors and councils, as well as their assemblies, and where much decision-making was dispersed to the many small towns.

The colonial arrangement seemed designed for many separate surrenders rather than for collective defense.

    During the early seventeenth century, the Netherlands emerged as an economic and military giant, out of all proportion to its confined geography and small population of 1.5 million (compared with 5 million English and 20 million French).

    While the other European states were developing authoritarian and centralized monarchies, the Dutch opted for a decentralized republic dominated by wealthy merchants and rural aristocrats.

    European intellectuals also gravitated to Amsterdam because the Dutch allowed greater latitude to new ideas. The great seventeenth-century philosophers René Descartes, John Locke, and Benedict de Spinoza all emigrated to escape intolerance in their own countries.

    After 1640 most of the slaves sent to the Americas went in Dutch rather than Portuguese vessels, enriching the merchants of Amsterdam rather than those of Lisbon.

A Dutch flotilla intercepted and captured the entire Spanish treasure fleet homeward bound from the Caribbean in 1628. The loss of the ships and 200,000 pounds of silver virtually bankrupted the Spanish crown and enormously enriched the Dutch investors in the attacking fleet.

    Beginning with Henry Hudson in 1609, Dutch merchants annually sent ships across the Atlantic and up the Hudson River to trade for furs with the Indians. Seventeenth-century ships could ascend the river 160 miles, as far as the future Albany, a greater distance than was possible on any other river on the Atlantic seaboard.

    In 1625, the Dutch founded the fortified town of New Amsterdam on Manhattan Island at the mouth of the river. Possessing the finest harbor on the Atlantic seaboard, New Amsterdam served as the colony’s largest town, major seaport, and government headquarters.

    Colonists’ roving pigs and cattle invaded cornfields, provoking the natives to kill and eat the livestock—which, of course, outraged the settlers.

    Some were Swedes, but most came from Finland, then under Swedish rule. Skilled at pioneer farming in heavily forested Sweden and Finland, the colonists adapted quickly to the New World and introduced many frontier techniques that eventually became classically “American,” including the construction of log cabins.

    A zealous Calvinist, Governor Stuyvesant joined the Dutch Reformed clergy in urging a new policy meant to keep Jews as well as other Protestants out of New Netherland. But the Dutch West India Company consistently defended tolerance as best for business, reminding Stuyvesant of “the large amount of capital which [Jews] still have invested in the shares of this company.” The Jews remained, enjoying more freedom in New Netherland than in any other colony.

    As in New England, the emigrants were primarily family groups of modest means and farmer or artisan status, rather than the indentured, unmarried, and young men who prevailed in the early Chesapeake and West Indies.

    In New Netherland, women also enjoyed greater legal rights and economic opportunities than did their sisters in the English colonies. In contrast to English women, Dutch wives kept their maiden names, which reflected their more autonomous identity by law. Unlike the “coverture” of English common law, the Dutch legal code (derived from Roman law) did not deprive married women of their legal identity and their rights to own property. If a wife survived her husband, she received half of the property, while the other half went to their heirs—significantly better than the one-third allowed widows by English law.

    Between 1661 and 1664, 383 women conducted or faced lawsuits in the courts at New Amsterdam.

    But if religious conflict and economic misery sufficed to push colonial emigration, the French would have triumphed over both the English and the Dutch. The further difference was that, unlike France, England permitted its discontented freer access to its overseas colonies and greater incentives for settling there.

    Begun in 1651 and strengthened in 1660 and 1663, the Navigation Acts had three fundamental principles. First, only English ships could trade with any English colony. The acts defined as English any ship built within the empire, owned and captained by an English subject, and sailed by a crew at least three-quarters English.

    Confronting and overcoming more resistance there, the English plundered indiscriminately and sold the captured Dutch garrison into servitude in Virginia.

The king agreed in 1680 to grant the younger Penn 45,000 square miles west of the Delaware River as the colony of Pennsylvania (“Penn’s Woods”).

    But as a young man and against his father’s wishes, Penn had converted to Quakerism, then an especially mystical, radical, and persecuted form of Protestantism.

    Renouncing formal prayers, sermons, and ceremony of any sort, Quakers met together as spiritual equals and sat silently until the divine spirit inspired someone, anyone, to speak. Although they rejected a specially educated and salaried ministry, certain especially devout and articulate laymen (and women) served as “Public-Friends,” itinerant preachers supported by voluntary contributions.

    In contrast to the Puritan emphasis on sacred scripture, Quakers primarily relied on mystical experience to find and know God. The Quakers sought an “Inner Light” to understand the Bible, which they read allegorically rather than literally. More than a distant divinity or an ancient person, their Jesus Christ was fundamentally here and now and eternal: the Holy Spirit potentially dwelling within every person. Anyone truly awakened by that Spirit could thereafter live in sanctity.

    Penn was both a devout Quaker and an ingrained elitist, both highly principled and habitually condescending. A tireless crusader for religious toleration, Penn traveled widely as a preacher, in Germany and Holland as well as Great Britain.

    Penn’s financial interest also argued for hastening development by welcoming every productive emigrant. In founding a colony, Penn meant to enhance rather than to sacrifice his fortune. In promising a “Free Colony,” he did not offer free land, for he meant to profit by selling real estate and by collecting annual quitrents. He explained, “Though I desire to extend religious freedom, yet I want some recompense for my trouble.”

    Pennsylvania commenced after the local natives had plunged in numbers and power from multiple epidemics, prolonged exposure to the alcohol of Dutch and Swede traders, and destructive raids by both the Iroquois Five Nations and the Chesapeake colonists.

    During the late seventeenth and early eighteenth centuries, many native peoples fled from mistreatment in other colonies to settle in Pennsylvania. Penn’s government welcomed Shawnees from South Carolina, the Nanticoke and Conoy of Maryland, the Tutelo from Virginia, and some Mahicans from New York. One refugee explained to the Quakers, “The People of Maryland do not treat the Indians as you & others do, for they make slaves of them & sell their Children for Money.”

    Penn consented to their division in 1704 into the distinct colonies of Pennsylvania and Delaware, with separate legislatures but a common governor appointed by their proprietor.

    Living beyond his means and donating generously to support Quaker meetings and Public Friends, Penn accumulated the debts that would consign him to an English debtors’ prison in 1707.

    Neither any single ethnic group nor any particular religious denomination enjoyed a majority in any middle colony.

    In the mid-eighteenth century, a German immigrant reported, “They have a saying here: Pennsylvania is heaven for farmers, paradise for artisans, and hell for officials and preachers.”

    James II regarded the American colonies as cash cows meant to fund a more authoritarian crown. Endowed with a larger colonial revenue, the crown could dispense with Parliament, which was constitutionally necessary to levy taxes within England.

    Although routine in southern colonies (but poorly collected), quitrents were novel and provocative in New England. Because English folk regarded secure real estate as fundamental to their liberty, status, and prosperity, the colonists felt horrified by the sweeping and expensive challenge to their land titles.

    In a bold and desperate gamble, William invaded England as a preemptive strike to capture that realm for a Dutch alliance. Aided by collusion in the disaffected English army and navy, William crossed the Channel and landed without resistance in November.

The Whigs called the transfer of power a “Glorious Revolution,” which they creatively depicted as a spontaneous uprising by a united English people. In fact, the revolution was fundamentally a coup spearheaded by a foreign army and navy.

    In all of its reforms, the crown favored the local oligarchies of great planters and merchants, rather than any colonial longing for democracy (which was not evident).

    By 1694 the English sustained an army of 48,000 subjects plus 21,000 German mercenaries.

    Formerly the bulwark against unpopular taxes and crown power, Parliament became the great collection agency for the new monarch, a Protestant succession, and a transatlantic empire. Formerly the lightest-taxed people in Europe, the English joined the French and the Dutch as the most heavily taxed.

    In stark contrast to France, England built a fiscal-military state without submitting to the despotism of an absolutist monarchy.

    Despite their numerical superiority, the English colonists suffered repeated defeats as New France mustered small but effective combinations of royal troops, Canadian militia, and Indians to raid and destroy frontier settlements in New York and New England. In response, the English tried to invade Canada both by land from Albany via Lake Champlain and by sea via the St. Lawrence River, but both invasions were expensive and humiliating failures.

Neutrality did not bring a universal peace to Iroquoia. On the contrary, peace to the north and west obliged the Iroquois to find enemies elsewhere, for they remained committed to mourning wars to sustain their numbers, their spiritual power, and their warrior ethos. A colonist noted that “if you go to persuade them to live peaceably” the Iroquois “will answer you, that they cannot live without war.”

    And after 1707, the Scots outnumbered the English as emigrants to the colonies.

Pirates took a special pride in their ability to eat, drink, dance, gamble, and whore with abandon, in a style that they called “living well.” Although unstable and dangerous, piracy proved intoxicating and addictive.

    In a colonial world divided between masters and servants, the pirates defined freedom as their own opportunity to prey upon others.

    By 1716 colonial authorities estimated that at least two thousand Anglo-American pirates were operating in the West Indies and along the Atlantic seaboard. They found havens on the unsettled islands of the Bahamas and in the secluded inlets of the Carolinas.

    In 1688 the crown captured about 3 percent of the national income as taxes; by 1715 that had tripled to 9 percent of an enlarged economy.

    Viewing the French as an “other,” the British characterized them as economically backward, religiously superstitious, culturally decadent, aggressively militarist, and broken to despotic rule. By inverse definition, the British saw themselves as especially enlightened by commerce, individual liberties, the rule of law, and a Protestant faith.

    Despite the proliferation of British shipping, the overall number of emigrants declined in the early eighteenth century from its seventeenth-century peak.

    The new recruitment invented America as an asylum from religious persecution and political oppression in Europe—with the important proviso that the immigrants had to be Protestants.

    Formerly the great colonial entrepôt, Boston slipped to third, behind Philadelphia and New York, by 1760.

The colonial economy achieved a modest increase in productivity per capita of at least 0.3 and perhaps 0.5 percent annually. Although not much by the standards of our time, this growth rate was impressive for a preindustrial economy. Indeed, the colonies grew more rapidly than any other economy in the eighteenth century, including the mother country. In 1700 the colonial gross domestic product was only 4 percent of England’s; by 1770 it had blossomed to 40 percent, as the colonies assumed a much larger place within the imperial economy.

    Indeed, the wealth of colonial regions varied directly and positively with the number of slaves. The West Indian planters lived in the greatest luxury because they conducted the harshest labor system with the greatest number of slaves. Next, in both wealth and slavery, came South Carolina, followed by the Chesapeake and the middle colonies. At the other extreme of the imperial spectrum, New England had the lowest standard of living and the fewest slaves. But even without many slaves, a common farmer or artisan lived better in New England than in the mother country. Slavery explained some, but not all, of the colonial prosperity. Access to abundant farmland accounted for the difference.

    The muster rolls for colonial military regiments recorded heights, revealing that the average colonial man stood two or three inches taller than his English counterpart. Stature depends upon nutrition, and especially protein, so the superior height of free colonists attested to their better diet, especially rich in meat and milk. On average, the tallest colonists were southern planters—those who profited most from African slavery and Indian land.

    Because appearances mattered so much in regulating status and credit, colonists wished to see themselves, and to be seen by others, as something more than rude rustics.

    The genteel performed constantly for one another, ever watching and ever watched for the proper manners, conversation, dress, furnishings, and home. Every action, every statement, every object was on display and subject to applause or censure.

    Of course, the common folk could never fully match the consumption and taste of the colonial elite of great planters, merchants, and lawyers. Indeed, the common emulation constantly drove the gentility to reiterate their superior status by cultivating more expensive tastes in the most current fashions.

    In addition to goods, the swelling volume of British shipping carried emigrants across the Atlantic. Relatively few, however, were English: only 80,000 between 1700 and 1775, compared with 350,000 during the seventeenth century. The decline is especially striking because after 1700 the colonies became cheaper and easier to reach by sea and safer to live in. But push prevailed over pull factors in colonial emigration.

    In England, crime surged with every peace as thousands of unemployed and desperate people stole to live. The inefficient but grim justice of eighteenth-century England imposed the death penalty for 160 crimes, including grand larceny, which was loosely defined as stealing anything worth more than a mere shilling.

    Between 1718 and 1775, the empire transported about fifty thousand felons, more than half of all English emigrants to America during that period. The transported were overwhelmingly young, unmarried men with little or no economic skill: the cannon fodder of war and the jail fodder of peace. About 80 percent of the convicts went to Virginia and Maryland, riding in the English ships of the tobacco trade. Convicts provided a profitable sideline for the tobacco shippers, who had plenty of empty cargo space on the outbound voyage from England. At about a third of the £35 price of an African male slave, the convict appealed to some planters as a better investment.

    Once in the colonies, the Ulster Scots gravitated to the frontier, where land was cheaper, enabling large groups to settle together. Their clannishness helped the emigrants cope with their new setting, but it also generated frictions with the English colonists. Feeling superior to the Catholic Irish, the Ulster Scots bitterly resented that so many colonists lumped all the Irish together. In 1720 some Ulster Scots in New Hampshire bristled that they were “termed Irish people, when we so frequently ventured our all, for the British crown and liberties against the Irish Papists.” As a compromise, they became known in America as the Scotch-Irish.

    Outnumbering the English emigrants, the 100,000 Germans were second only to the Scots as eighteenth-century immigrants to British America.

    The average Pennsylvania farm of 125 acres was six times larger than a typical peasant holding in southwestern Germany, and the colonial soil was more fertile, yielding three times as much wheat per acre. Lacking princes and aristocrats or an established church, Pennsylvania demanded almost no taxes, and none to support someone else’s religion. And Pennsylvania did not conscript its inhabitants for war.

The German emigrant trade developed a relatively attractive form of indentured servitude adapted to the needs of families. Known as “redemptioners,” the Germans contracted to serve for about four to five years. Unlike other indentured servants, the redemptioner families had to be kept together by their employers and not divided for sale. Most contracts also gave the emigrant family a grace period of two weeks, upon arrival in Pennsylvania, to find a relative or acquaintance who would purchase their labor contract. Often arranged by prior correspondence, these deals afforded the emigrants some confidence in their destination and employer.

    But he exaggerated a tad, for the overall death rate for the voyage was only about 3 percent, a bit better than the 4 percent rate for convicts and far better than the 10 to 20 percent suffered by enslaved Africans. Germans probably risked more by staying at home in the path of the next European war.

    Highly literate, the Pennsylvania Germans also sustained a vibrant press that produced German-language almanacs, books, and a newspaper.

    Swiss emigrant Esther Werndtlin denounced her new home, Pennsylvania: “Here are religions and nationalities without number; this land is an asylum for banished sects, a sanctuary for all evil-doers from Europe, a confused Babel, a receptacle for all unclean spirits, an abode of the devil, a first world, a Sodom, which is deplorable.”

    With German votes, the Quaker party retained control over the Pennsylvania assembly, to the dismay of the Scotch-Irish, who felt ignored and maligned by the new coalition. Clustered on the frontier, the Scotch-Irish especially resented the refusal of the Quakers and Germans, who dwelled safely and prosperously around Philadelphia, to fund a frontier militia to attack the Indians. Feeling abandoned by the Pennsylvania government, the Scotch-Irish resolved to fight the natives on their own harsh terms. In killing Indians, the Scotch-Irish could vent their political resentments without overtly confronting the Germans and the Quakers.

    In 1737, Thomas Penn and James Logan conducted the “Walking Purchase,” perhaps the most notorious land swindle in colonial history—which is saying a great deal. Unable to stop invading squatters, the local Lenni Lenape band agreed to relinquish a tract that would be bounded by what a man could walk around in thirty-six hours. Of course, the Lenni Lenape expected to lose only a modest parcel, but Logan and Penn had made elaborate preparations to maximize their purchase. They employed scouts to blaze a trail, and they trained three runners. On the appointed September day, the runners astonished and infuriated the Lenni Lenape by racing around a tract of nearly twelve hundred square miles, including most of their homeland.

    During the eighteenth century, the British colonies imported 1.5 million slaves—more than three times the number of free immigrants.

    The slave trade diminished the inhabitants of West Africa, who declined from 25 million in 1700 to 20 million in 1820. At least two million people died in slave-raiding wars and another six million captives went to the New World as slaves. That demographic loss hampered economic development, rendering West Africa vulnerable to European domination during the nineteenth century.

    During the eighteenth century, the British seized a commanding lead in the transatlantic slave trade, carrying about 2.5 million slaves, compared with the 1.8 million borne by the second-place Portuguese (primarily to Brazil) and the 1.2 million transported by the third-place French.

    During the eighteenth century at least one-third of the slaves died within three years of their arrival on the island of Barbados.

    On the coast of West Africa, the sojourning Britons suffered from the dank humidity, fierce heat, and frequent torrential rains. They also died by the hundreds from tropical diseases, for Africa reversed the immunological advantage that Europeans enjoyed as colonizers in more temperate climes.

    Popular myth has it that the Europeans obtained their slaves by attacking and seizing Africans. In fact, the shippers almost always bought their slaves from African middlemen, generally the leading merchants and chiefs of the coastal kingdoms. Determined to profit from the trade, the African traders and chiefs did not tolerate Europeans who foolishly bypassed them to seize slaves on their own initiative. And during the eighteenth century the Africans had the power to defeat Europeans who failed to cooperate. Contrary to the stereotype of shrewd Europeans cheating weak and gullible natives, the European traders had to pay premium, and rising, prices to African chiefs and traders, who drove a hard bargain. During the 1760s, traders paid about £20 per slave, compared with £17 during the 1710s.

    The Europeans exploited and expanded the slavery long practiced by Africans. Some slaves were starving children sold by their impoverished parents. Others were debtors or criminals sentenced to slavery. But most were captives taken in wars between kingdoms or simply kidnapped by armed gangs.

    Although they did not directly seize slaves, the European traders indirectly promoted the wars and kidnapping gangs by offering premium prices for captives.

    As guns became essential for defense, a people had to procure them by raiding on behalf of their suppliers, lest they instead participate in the slave trade as victims. By the end of the century, the British alone were annually exporting nearly 300,000 guns to West Africa.

    About a quarter of the captives died along the way from some combination of disease, hunger, exhaustion, beatings, and suicide.

    Once the ship set sail, the slaves entered the notorious “middle passage” across the Atlantic to colonial America.

    The European crews exposed the slaves to smallpox, measles, gonorrhea, and syphilis. And the Africans brought along their own diseases to exchange with the crew: yellow fever, dengue fever, malaria, yaws, and especially a bacillary dysentery (a gastrointestinal disorder) known as the “bloody flux.”

    The greatest uprising racked Jamaica in 1760, killing ninety whites. Ruthless repression then killed four hundred blacks; most were burned at the stake, belying the eighteenth century’s reputation as an “Age of Enlightenment.”

    But, in an effort to sustain their own cultural space, northern blacks developed an annual ritual festival known as “Negro election day,” when they gathered to drink, feast, play, and dance. The festivities culminated with the raucous election of local kings, governors, and judges, who acted throughout the next year as arbitrators of disputes within the black community.

    On the sugar islands, slaves outnumbered whites by more than three to one.

    Often the urban, skilled, and favored slaves were lighter-skinned mulattoes, the offspring of white masters and their female slaves. Adopting colonial words, ways, and clothes, the urban slaves usually felt little solidarity with the more numerous and African-born field hands of the rice and indigo plantations. But when frustrated in their aspirations for still greater freedom and privilege, the urban slaves could become especially formidable plotters against their masters.

    Chesapeake slaves also lived in sufficient concentrations to find marriage partners and bear children, in contrast to many northern slaves. Consequently, natural increase swelled the Chesapeake slave population, which enabled the planters to reduce their African imports after 1750. Thereafter, creole slaves predominated in the Chesapeake.

    In 1780 the black population in British America was less than half the total number of African emigrants received during the preceding century, while the white population exceeded its emigrant source by three to one, thanks especially to the healthy conditions in New England and the middle colonies.

    And although some English dissenters, principally the Quakers, did seek in America a general religious freedom, many more emigrants wanted their own denomination to dominate, to the prejudice of all others. Indeed, at the end of the seventeenth century, most colonies offered less religious toleration than did the mother country.

    And unlike other colonial regions, New England had plenty of official clergymen to fill the many pulpits. Most were graduates of Harvard (founded in 1636) or Yale (1701). Indeed, New England struck visitors as the most conspicuously devout and religiously homogeneous region in British North America. The New English towns enforced a Sabbath that restricted activity to the home and church, imposing arrests and fines on people who worked, played, or traveled on Sunday. An English visitor found the New England Sabbath “the strictest kept that ever I saw.”

    As Kay so unpleasantly learned, an establishment tended to increase the power of colonial elites over the church rather than the power of the church over the colonists.

    In addition to the many denominational divisions, colonial churches were developing an internal rift between evangelicals and rationalists.

    Favoring critical and empirical inquiry, the rationalists slighted the traditional foundations of Christian faith: scriptural revelation and spiritual experience. The rationalists instead found guidance in the science that depicted nature as the orderly and predictable operation of fundamental and discernible “laws,” such as Isaac Newton’s explication of gravity. Christian rationalism held that God created the natural universe and thereafter never interfered with its laws. God seemed less terrifying as learned people reinterpreted epidemics, earthquakes, and thunderbolts as “natural” rather than as direct interventions of divine anger. The Reverend Andrew Eliot, a New England Congregationalist, explained, “There is nothing in Christianity that is contrary to reason. God never did, He never can, authorize a religion opposite to it, because this would be to contradict himself.”

    Discarding the Calvinist notion of an arbitrary and punishing God, the rationalists worshiped a benign, predictable, forgiving, and consistent deity who rewarded good behavior with salvation, but who expected common people to defer to the learned and authoritative men at the top of the social hierarchy.

    During his 1739–41 tour from Maine to Georgia, Whitefield furthered transatlantic and intercolonial integration by becoming the first celebrity seen and heard by a majority of the colonists.

    Whitefield stirred controversy by blaming rationalist ministers for neglecting their duty to seek, experience, and preach conversion. He charged, “The generality of preachers talk of an unknown and unfelt Christ. The reason why congregations have been dead is, because they had dead men preaching to them.” Such rebukes divided the ministry, inspiring some to adopt Whitefield’s spontaneous, impassioned, evangelical style while hardening others in opposition.

    “It was a very frequent thing to see an house full of outcries, faintings, convulsions and such like, both with distress, and also with admiration and joy.” The Old Lights called the outbursts “enthusiasm,” then a pejorative term that meant human madness, at best, or Satan’s manipulation, at worst. The Reverend Ezra Stiles commented that “multitudes were seriously, soberly and solemnly out of their wits.”

    Where the New Lights championed the uninhibited and disruptive flow of divine grace by inspired itinerants, the Old Lights regarded Christianity as a stable faith that needed barricades against intrusive innovations.

    In defying the established authority of minister and magistrate, the radical evangelicals championed individualism, a concept then considered divisive and anarchic.

    The radical evangelicals sought to include every person in conversion, regardless of gender, race, and status, but they worked to exclude from church membership anyone they deemed unconverted by the New Birth.

    Governor William Gooch denounced the itinerants for seeking “not liberty of conscience but freedom of speech.” His distinction was important and revealing. Gooch and other elitists accepted “liberty of conscience” as the passive persistence of longstanding denominational loyalties, but they dreaded “freedom of speech” for inviting people to rethink their allegiances, which seemed likely to disrupt social harmony. By this reasoning, Presbyterian preachers should limit their preaching to their traditional constituencies in Scots and Scotch-Irish settlements, rather than roam into other parishes to recruit Anglican defectors.

    Used to reading character from external appearances, the Virginia Anglicans regarded the Baptists as somber and melancholy people, for they wore dark and plain clothing, cut their hair short, and wove their faith into every conversation. But their external sobriety and austerity covered a more emotional, intimate, and supportive community for worship. Gathered together, they shared their despair and ecstasy in a manner discouraged by ridicule in the highly competitive and gentry-dominated society of Anglican Virginia. Addressing one another as “brother” and “sister,” the Baptists conducted an egalitarian worship that contrasted with the hierarchical seating and service of the Anglican churches. The Baptists even welcomed slaves into their worship as “brothers” and “sisters,” and encouraged some to become preachers. To break down worldly pride and build solidarity, Baptist services included extensive physical contact: laying on of hands, the exchange of the “kiss of charity,” and ritual foot-washing. A visceral distaste for such intimate contact with ordinary people discouraged gentlemen and ladies from becoming Baptists. Appealing primarily to common planters and some slaves, the Baptists drew them together while drawing them away from the gentry.

    By calling upon converts to desert their Anglican churches, the Baptists threatened a foundation of Virginia society: the expectation that everyone in a parish would worship together in the established church supervised by the county gentry. Baptists also discouraged the public amusements that had long demonstrated the gentry’s leadership as the finest dancers and the owners of the best racehorses and gamecocks. Landon Carter bitterly complained that the Baptists were “quite destroying pleasure in the Country; for they encourage ardent Prayer; strong & constant faith, & an intire Banishment of Gaming, Dancing, & Sabbath-Day Diversions.” The withdrawal of common evangelicals from public diversions and Anglican services implicitly rebuked the gentry and parsons for leading worldly lives.

    Rigorously enforcing the laws against itineracy, Anglican magistrates whipped and jailed dozens of unlicensed preachers. Far from avoiding or resisting confrontation, the Baptists welcomed opportunities to endure persecution conspicuously for their faith. In 1771 a county sheriff and a posse of gentry tried to break up a Baptist meeting by pulling the preacher, John Waller, from the stage to inflict twenty lashes with a horsewhip. In Waller’s words, the congregation gathered around the whipping to sing psalms “so that he Could Scarcely feel the Stripes.” Released, Waller “Went Back singing praise to God, mounted the Stage & preached with a Great Deal of Liberty.” For evangelicals, to preach with “Liberty” meant to channel the Holy Spirit…

    In 1758 the Philadelphia Yearly Meeting also barred Quaker slaveholders from church leadership, and in 1776 it disowned them from membership. In colonies premised on slavery, the Quakers became the lone denomination to seek abolition systematically.

    Conversion on their own terms brought them a new source of discipline to resist the worst vices of the dominant society. In particular, converted Indians reduced the alcohol consumption that rendered enclave Indians so poor, indebted, and exploited by their colonial neighbors. By creating their own local congregations, enclave Indians also limited the cultural control of outsiders.

    In sum, the Great Awakening accelerated a religious dialectic that pulled seekers and their congregations between the spiritual hunger to transcend the world and the social longing for respect in it.

    In 1627, after nearly two decades of colonization, Quebec still had only eighty-five French colonists.

    Most of the female emigrants came from an orphanage in Paris and were known as filles du roi (“daughters of the king”). In addition to paying their passage, the crown provided a cash marriage dowry: an alluring incentive for orphan girls lacking family money.

    This growth was too little too late to compete with the swelling number of English colonists, who numbered 234,000 whites plus 31,000 enslaved Africans by 1700.

    That restrictive policy deprived Canada of an especially promising set of colonists, the Protestant minority known as Huguenots, who resembled the English Puritans in their Calvinist faith and middling status as artisans, shopkeepers, and merchants.

    A visitor commented that a Canadian needed glass eyes, a brass body, and brandy for blood to endure the bitter cold. When winter at last receded, warm weather unleashed tormenting clouds of mosquitoes and blackflies—denser and fiercer than any in Europe.

    Habitants took pride in their regular consumption of meat and white bread, which few French peasants could afford. Thanks to small, tight houses and plentiful firewood, the New French also kept warmer in the winter, despite its rigors and duration. And in contrast to their French relatives, the New French could afford horses, another cherished mark of higher status among peasants. Finally, the Canadian habitant enjoyed privileges of hunting and fishing—both of which were environmentally and legally denied to the peasants in crowded, depleted, and hierarchical France, where the aristocrats monopolized the limited supply of game.

    Adapting to the cold, the habitants transformed winter into a cherished season of festive visiting, facilitated by horse-drawn sleighs, known as carioles.

    At death, the widow inherited half the assets (and debts), while the children obtained the other half—a better split than the one-third that English widows ordinarily received.

    Because each novice had to pay a substantial dowry to enter a convent, most came from seigneurial or mercantile families. By paying convent dowries to place some daughters, parents could vest most of the family estate in fewer heirs, especially an elder son.

    To govern New France, the crown appointed three rival officials: a military governor-general, a civil administrator known as the intendant, and a Catholic bishop. The three were supposed to cooperate to enforce crown orders while competing for crown favor by jealously watching one another for corruption, heresy, and disloyalty.

    But the number of army commissions, civil offices, and fur trade licenses lagged behind the proliferating children of seigneurial families. Inhibited from entering trade by their code of nobility, growing numbers dwelled in genteel indolence and poverty. In 1737 a priest reported that many seigneurial families were “as poor as artists and as vain as peacocks.” Charlevoix noted, “There is a great fondness for keeping up one’s position, and nearly no one amuses himself by thrift. Good cheer is supplied, if its provision leaves means enough to be well clothed; if not, one cuts down on the table in order to be well dressed.” Appearances mattered in New France.

    After 1700, hard labor, rapid reproduction, and peace with the Iroquois brought greater security, prosperity, and development to the valley. From 15,000 in 1700, the population grew to 52,000 by 1750. The amount of cleared and cultivated land, the size of the wheat harvest, and the number of mills all tripled.

    Far more readily than their English or Dutch competitors, the French traders married native women, which proved critical to their persistent predominance in the fur trade of the Great Lakes country. Indian women overcame their initial dislike of the pale and bearded French as ugly. Owing to war losses, Indian men had become relatively scarce, and the coureurs de bois offered their wives and Indian kin privileged access to the coveted trade goods of Europe. Over the generations, these relationships produced a distinctive mixed-blood people known as the métis, who spoke multiple languages, lived in their own villages, and acted as intermediaries between their French and Indian relatives.

    The Indians accepted the terminology only because they understood it very differently, for they did not have patriarchal families. In their matrilineal kinship systems, mothers and uncles had far more authority than did fathers. The natives happily called the French their “fathers” in the expectation that they would behave like Indian fathers: indulgent, generous, and weak. Among Indians, a father gave much more than he received.

    From Carolina’s success and Florida’s failure, the French concluded that a commerce in guns better secured native support than did missionaries. Determined to compete with the Carolina traders, the French in Louisiana wooed the Indians with trade goods, especially firearms.

    Because few French volunteered to colonize distant and alien Louisiana, the company relied on military conscripts and convicted criminals (a mix of vagrants, blasphemers, thieves, smugglers, tax evaders, political prisoners, and prostitutes). To a far greater degree than in Canada, the French used Louisiana as a penal colony, which further undermined its reputation. In 1720 a colonial official complained, “What can one expect from a bunch of vagabonds and wrong-doers in a country where it is harder to repress licentiousness than in Europe?”

    To sow antipathies, the French conspicuously employed especially trusted blacks in their militias sent to fight the Indians. A few particularly courageous black soldiers won their freedom as a reward. On the other hand, colonial leaders periodically punished rebel slaves by turning them over to Indians for burning to death. A French priest said that the executions “inspired all the Negroes with a new horror of the Savages, … which will have a beneficial effect in securing the safety of the Colony.”

    To maintain the racial divisions essential to Louisiana’s security, the officers relied on Indians and blacks to track down and punish deserters. Military tribunals often specified that insubordinate soldiers be flogged by a black man.

    By contrast, the British colonists reserved such treatment exclusively for their African property. To execute convicted whites, Louisiana employed a black man, Louis Congo, who drove a hard bargain for his services as executioner: freedom for himself and his wife, a plot of land, a steady supply of alcohol, and generous fees levied in pounds of tobacco—ten for a flogging or branding, thirty for a hanging, and forty for breaking on the wheel or burning alive.

    The Natchez people preserved substantial elements of the Mississippian culture, including ceremonial mounds, painted and carved temples, and powerful chiefs who, in death, were honored with the human sacrifice of their servants.

    Harsh experience had taught them that any people cut off from the gun trade faced destruction by their native enemies. Consequently, they considered any cessation of trade or escalation of prices to be acts of hostility, demanding war.

    By combining Hispanic horses with French guns, many native bands reinvented themselves as buffalo-hunting nomads, which brought them unprecedented prosperity and power.

    For want of sufficient water and because of the prevailing high winds, only a few species of trees, primarily cottonwood and willow, grew on the Great Plains, and only along the narrow, sheltered margins beside the permanent rivers. Instead of trees, hardy and drought-resistant grasses covered most of the Great Plains.

    The bison flesh abounded in protein with relatively little fat, and the internal organs supplied many vitamins and minerals. Cut into thin strips and dried in the hot summer sun, the meat could be preserved for months and even years.

    The dried dung, known as “buffalo chips,” served as fuel on the treeless plains.

    Those ties rendered band membership highly fluid, as the dissatisfied could readily shift into another band where they had relatives.

    By conscripting the Pueblo to raid the nomads, the Hispanics further alienated them from one another. The raids procured the one paying commodity in New Mexico: slaves.

    Today the predominant image of the American Indian is a warrior and buffalo hunter, wearing an eagle-feather bonnet and riding across the Great Plains. We imagine that the mounted warrior defended a timeless, deeply rooted way of life, independent of the European invasion of America. In fact, the association of Great Plains Indians with the horse is relatively recent and depended upon the colonial intrusion. Although horses first evolved in North America, before spreading eastward into Asia and Europe about twelve thousand years ago, they had become extinct in this continent by about ten thousand years ago. During the sixteenth century, the horse returned to North America as a domesticated animal kept by the Hispanic colonists.

    The great material benefits fed into a new psychology, a sense of liberation from old limits into an intoxicating sense of speed, power, and range—an offering of both security and immense, open possibility.

    There was a conspicuous exception to the general pattern: on the upper Missouri River some Hidatsa bands broke away westward, abandoning horticulture to become nomads, assuming a new identity as the Crow.

    Especially numerous, the Lakota totaled some 25,000 people in 1790. Their own word lakota means “allies,” but their foes, including the French, called them the Sioux, which meant “enemies.”

    In sum, most of the Indian peoples we now associate with the Great Plains were relative newcomers who arrived during the eighteenth century.

    In 1800 a trader on the northern plains marveled at the abundant buffalo and remarked, “This is a delightful country, and were it not for perpetual wars, the natives might be the happiest people on earth.”

    Once an advantage, the concentration in villages became a deadly liability as the villagers suffered disproportionately from the contagious epidemics. As their numbers dwindled, the horticulturalists could no longer effectively defend many of their villages, much less their claim to the surrounding buffalo herds. Because the more mobile and dispersed nomads suffered smaller losses to the epidemics, they grew in relative power as the villagers waned.

    The greater rewards of successful manhood came at a high price, for Great Plains warriors led shorter lives of increased violence. Because so many males died in their youth or prime, women outnumbered men, which encouraged polygamy by the most successful warriors.

    In 1760 only about 1,200 colonists lived in Texas, nearly half of them (580) at San Antonio.

    As they became distinctive from the other Apaches, these composite and increasingly prosperous western bands became known to the Hispanics as the Apache de Navihu, which soon became shortened to Navajo.

    In 1769, Galvez cracked under the strain of a formidable rebellion by the Seri and Pima Indians in Sonora (which included southern Arizona). One morning he bolted from his tent to announce a plan to “destroy the Indians in three days simply by bringing 600 monkeys from Guatemala, dressing them like soldiers, and sending them against Cerro Prieto,” a Seri stronghold. Galvez proceeded to assume the identity of Moctezuma, the king of Sweden, Saint Joseph, and finally God. The concerned viceroy of New Spain recalled Galvez to Mexico City, where he slowly recovered his mental health; sent home, he later rose to higher office in Spain.

    Usually locked by ice, Hudson Bay was accessible by ships only during two months in the summer.

    Like most fur-trading enterprises, the Hudson’s Bay Company preferred to provide guns rather than missionaries, from a conviction that Christianity ruined hunters.

    But the British colonists dissipated their numerical advantage by their division into fourteen distinct mainland colonies (Nova Scotia was the fourteenth, neglected by historians who speak of only thirteen).

    Making a virtue of their small colonial population, the French usually kept their promises not to intrude new settlements on Indian lands.

    In sum, by 1750 the Indians faced a greater threat of settler invasion and environmental transformation from the numerous and aggressive English than from the few and more generous French.

    Embarked on his first command, Washington promptly displayed his inexperience. Although a superior French force was building Fort Duquesne, Washington foolishly attacked and destroyed a small French patrol. Understandably upset, the main French force and their Indian allies surrounded Washington’s camp, a crude stockade that he had built in a swamp ringed by high ground. When it began to rain heavily, his soldiers wallowed in water as the French and Indians fired on them from the hills, and Washington was compelled to surrender on July 4.

    Although politically expedient, Pitt’s policy was financially reckless: by augmenting the monstrous public debt, Pitt saddled the colonists and Britons with a burden that would violently disrupt the empire after the war.

    The conquest of Canada cost the British empire about £4 million, more than ten times what the French spent to defend it.

    The collapse of New France was dreadful news to the Indians of the interior. No longer could they play the French and the British off against one another to maintain their own independence, maximize their presents, and ensure trade competition.

    Reckless Carolina settlers invaded Cherokee lands and poached their deer. Some especially ruthless frontiersmen killed Cherokee to procure scalps to collect the large bounties offered by the colony of Virginia. It was impossible to tell a Cherokee scalp from that of a hostile Shawnee—and far easier to kill an unsuspecting people than one prepared for war. The £50 bounty for an adult scalp allured settlers who rarely could make that much in a year. They rationalized that all Indians were their enemies, if not immediately, then inevitably.

    The natives also felt a new commonality as Indians, above and beyond their traditional tribal and village identities. This Pan-Indian sensibility emerged from the teaching of a new set of religious prophets, led by a Lenni Lenape named Neolin. Adapting Christian ideas selectively to update native traditions, the prophets proclaimed a double creation: one for all Indians, the other for whites. In defense of their own divinely ordained way of life, Indians were supposed to resist colonial innovations, especially the consumption of alcohol and the cession of lands.

    Although spared from massacre, a third of the Indian refugees died of smallpox contracted while crowded in their Philadelphia barracks.

    That shocking conflict between the colonies and the mother country developed from strains initiated by winning the Seven Years War.

    In 1763 imperial taxation averaged twenty-six shillings per person in Britain, where most subjects were struggling, compared with only one shilling per person in the colonies, where most free people were prospering.

    Paradoxically, by protesting British taxation, the colonists affirmed their cherished identity as liberty-loving Britons, as they rallied behind the most cherished proposition of their shared political culture: that a free man paid no tax unless levied by his own representatives.

    Colonists were quick to speak of “slavery” because they knew from their own practice on Africans where unchecked domination ultimately led. The conspicuous presence of slavery rendered liberty the more dear to the colonial owners of human property.

    The free colonists intently defended their property rights because property alone made men truly independent and free.

    European leaders increasingly concluded that wealth and power accrued to nations that discovered and analyzed new information.

    The traders primarily sought sable, the premier fur-bearing mammal of Siberia. At first, the Russians marketed their furs in western Europe, but in 1689 they opened an even more lucrative trade with China, via the Siberian border town of Kaikhta, where the Russians obtained, in return, Chinese porcelains, teas, and silks.

    As the French depended upon Indian hunters to harvest beaver, the Russians relied on Siberian tribal peoples to kill sable. Living in many bands of highly mobile hunter-gatherers with animist beliefs, the native Siberians resembled their distant kin the Inuit and the Indians of subarctic Canada.

    In their reliance on tribute rather than trade to capture native labor, the Russians resembled the Spanish conquistadores of Mexico rather than the French traders in Canada.

    When they submitted, the Siberians became exposed to deadly new diseases and a debilitating new dependency on alcohol, a combination that devastated their population.

    Like the French and the English, leading Russians longed to believe that they could easily establish an American empire by appearing before the Indians as kinder and gentler colonizers. Subscribing to the “Black Legend” of peculiar Spanish brutality, the Russians predicted that the American Indians would welcome them as liberators.

    Instead of welcoming the Russians as liberators, the local Tlingit Indians ambushed and destroyed two small boats filled with fifteen men sent to probe the shallow waters. In alarm, Chirikov promptly sailed back to Kamchatka, bearing neither hostages nor tribute.

    The naturalist Georg Steller kept alive and busy observing, killing, dissecting, and naming wildlife previously unknown to Europeans, including Steller’s eagle, Steller’s jay, Steller’s white raven, and Steller’s sea cow, the last an immense northern manatee unique to the western Aleutians. Steller’s sea cows were, when mature, thirty-five feet long and exceeded four tons. Steller and the other survivors endured by hunting sea cows and sea otters and by grubbing for roots with sufficient vitamin C to ease their scurvy.

    To accumulate sufficient furs for a profit, the voyages were long: at least two years and as many as six.

    The Aleut divided into castes of chiefs, commoners, and slaves (principally war captives). The chiefs enjoyed larger dwellings and more prominent burials, with executed slaves as their companions in the afterworld.

    The victors held the native women and children for ransom, while releasing the Aleut men to fill a large quota of furs (which took months). Once the furs were delivered, the promyshlenniki released the children and the women. In the interim, the Russians exploited the Aleut women as sex slaves. Upon departing, the traders left behind venereal diseases and some trade goods—wool, beads, knives, and hatchets—in token payment for the sea otter pelts.

    From a contact population of about 20,000, the Aleut dwindled to only 2,000 by 1800.

    As a precaution, the Spanish crown ordered the colonization of California to secure the unguarded northwestern door to precious Mexico. The Spanish divided California into southern “Baja California” (now in Mexico) and northern “Alta California” (approximately the present state of California).

    Much larger and more complex than Baja, Alta California extended eleven hundred miles, contained about 100 million acres, and included the most spectacular topography and greatest environmental range of any region in North America.

    In 1768 about 300,000 natives dwelled in Alta California: an especially impressive number given that only a few practiced horticulture.

    In sum, much of the California landscape was subtly anthropogenic (human influenced) long before colonizers arrived with their own even more demanding system of manipulating nature, which they called civilization.

    These human-tended landscapes sustained larger numbers of plants and animals and were healthier than today’s forests in California. Indeed, for lack of regular fires, contemporary forests are crowded with small trees, cluttered with deadwood, infested with pests, and vulnerable to destruction by huge and catastrophic fires.

    Because the land belonged collectively to the villagers rather than to individuals, there was no market in land, no buying and selling of real estate.

    Like most native cultures, the California Indians had powerful shamans but weak chiefs.

    Although greatly outnumbered, the Hispanics possessed an intimidating monopoly of horses and guns, as well as a formal command structure.

    The missions proved surprisingly successful as economic enterprises, becoming self-sustaining in food by 1778. In 1775 the missions had only 427 head of cattle, but these grew phenomenally to at least 95,000 by 1805.

    In 1769 the California coast between San Diego and San Francisco had a native population of 72,000, which declined to just 18,000 by 1821.

    Cook’s reconnaissance facilitated subsequent British colonization of Australia, which began with the arrival of 723 convicts at Botany Bay in early 1788.

    Lacking metallurgy, the natives exercised their ingenuity in crafting tools and weapons from wood and stone. They did occasionally recover bits of iron, mostly nails, from driftwood that apparently originated with Japanese and, perhaps, Spanish wrecks. Cherishing this metal, they longed to obtain more.

    Dependent upon a bountiful but volatile nature, the Hawaiians maintained their harmony with the supernatural by worshiping an array of divine spirits, each manifesting some aspect of their environment. In particular, Lono dispensed the nourishing rain, while Ku had to be propitiated with human sacrifice to secure victory in war.

    Men and women ate different foods and ate apart from one another. Only men could eat pig and only chiefs could eat dog.

    Other taboos prohibited women from fishing and forbade menstruating women even to enter a river.

    The well-fed natives also had the leisure time to compete violently for prestige. The victors in their endemic warfare collected numerous slaves and the skulls of the dead for prominent display in their villages.

    Experienced traders and devoted to property, the raincoast peoples belied the classic stereotype of naive natives easily cheated by European traders bearing a few beads. Although eager to get metal knives, chisels, and arrowheads, the Moachat drove a hard bargain for their pelts and salmon. An expedition scientist noted that the raincoast natives were “very keen traders, getting as much as they could for everything they had; always asking for more, give them what you would.” The expedition artist John Webber had to pay for the right to draw the interior of a Nootka house.

    Attentive to ancient tradition as well as new technology, Kamehameha ritually sacrificed defeated chiefs to Ku.

    By investing in children’s souls instead of the sea otter trade, the Spanish ensured their own long-term irrelevance in the north Pacific.

    Comments Off on American Colonies by Alan Taylor
  • Books

    The Anxious Generation by Jonathan Haidt

    From my Notion template

    The Book in 3 Sentences

    1. An in-depth look at how cell phones and social media are changing the younger generations. It’s helpfully packaged with remediation strategies.

    Impressions

    On the whole it was good, though it was quite repetitive, and Haidt’s Substack had already spoiled several of its points before the book actually came out.

    How I Discovered It

    I’ve read Haidt’s other books.

    Who Should Read It?

    Parents of children under 18

    How the Book Changed Me

    My immediate action was to cut down on Marleigh’s already very rationed phone time. The book also reaffirmed and extended my belief that a proper relationship with technology is both symbiotic and adversarial, and it clarified my notions of anomie and of one’s relationship with modern society as it is, not as it should be.

    Summary + Notes

    By designing a firehose of addictive content that entered through kids’ eyes and ears, and by displacing physical play and in-person socializing, these companies have rewired childhood and changed human development on an almost unimaginable scale.

    While the reward-seeking parts of the brain mature earlier, the frontal cortex—essential for self-control, delay of gratification, and resistance to temptation—is not up to full capacity until the mid-20s, and preteens are at a particularly vulnerable point in development.

    Gen Z became the first generation in history to go through puberty with a portal in their pockets that called them away from the people nearby and into an alternative universe that was exciting, addictive, unstable, and—as I will show—unsuitable for children and adolescents.

    They spent far less time playing with, talking to, touching, or even making eye contact with their friends and families, thereby reducing their participation in embodied social behaviors that are essential for successful human development.

    So even while parents worked to eliminate risk and freedom in the real world, they generally, and often unknowingly, granted full independence in the virtual world, in part because most found it difficult to understand what was going on there, let alone know what to restrict or how to restrict it.

    My central claim in this book is that these two trends—overprotection in the real world and underprotection in the virtual world—are the major reasons why children born after 1995 became the anxious generation.

    A few notes about terminology. When I talk about the “real world,” I am referring to relationships and social interactions characterized by four features that have been typical for millions of years:

    • They are embodied, meaning that we use our bodies to communicate, we are conscious of the bodies of others, and we respond to the bodies of others both consciously and unconsciously.
    • They are synchronous, which means they are happening at the same time, with subtle cues about timing and turn taking.
    • They involve primarily one-to-one or one-to-several communication, with only one interaction happening at a given moment.
    • They take place within communities that have a high bar for entry and exit, so people are strongly motivated to invest in relationships and repair rifts when they happen.

    In contrast, when I talk about the “virtual world,” I am referring to relationships and interactions characterized by four features that have been typical for just a few decades:

    • They are disembodied, meaning that no body is needed, just language. Partners could be (and already are) artificial intelligences (AIs).
    • They are heavily asynchronous, happening via text-based posts and comments. (A video call is different; it is synchronous.)
    • They involve a substantial number of one-to-many communications, broadcasting to a potentially vast audience. Multiple interactions can be happening in parallel.
    • They take place within communities that have a low bar for entry and exit, so people can block others or just quit when they are not pleased. Communities tend to be short-lived, and relationships are often disposable.

    • No smartphones before high school. Parents should delay children’s entry into round-the-clock internet access by giving only basic phones (phones with limited apps and no internet browser) before ninth grade (roughly age 14).
    • No social media before 16. Let kids get through the most vulnerable period of brain development before connecting them to a firehose of social comparison and algorithmically chosen influencers.
    • Phone-free schools. In all schools from elementary through high school, students should store their phones, smartwatches, and any other personal devices that can send or receive texts in phone lockers or locked pouches during the school day. That is the only way to free up their attention for each other and for their teachers.
    • Far more unsupervised play and childhood independence. That’s the way children naturally develop social skills, overcome anxiety, and become self-governing young adults.

    Adults in Gen X and prior generations have not experienced much of a rise in clinical depression or anxiety disorders since 2010,[21] but many of us have become more frazzled, scattered, and exhausted by our new technologies and their incessant interruptions and distractions.

    For most of the parents I talk to, their stories don’t center on any diagnosed mental illness. Instead, there is an underlying worry that something unnatural is going on, and that their children are missing something—really, almost everything—as their online hours accumulate.

    We found important clues to this mystery by digging into more data on adolescent mental health.[5] The first clue is that the rise is concentrated in disorders related to anxiety and depression, which are classed together in the psychiatric category known as internalizing disorders. These are disorders in which a person feels strong distress and experiences the symptoms inwardly. The person with an internalizing disorder feels emotions such as anxiety, fear, sadness, and hopelessness. They ruminate. They often withdraw from social engagement.

    In contrast, externalizing disorders are those in which a person feels distress and turns the symptoms and responses outward, aimed at other people. These conditions include conduct disorder, difficulty with anger management, and tendencies toward violence and excessive risk-taking. Across ages, cultures, and countries, girls and women suffer higher rates of internalizing disorders, while boys and men suffer from higher rates of externalizing disorders.[6] That said, both sexes suffer from both, and both sexes have been experiencing more internalizing disorders and fewer externalizing disorders since the early 2010s.[7]

    Anxiety is related to fear, but is not the same thing. The diagnostic manual of psychiatry (DSM-5-TR) defines fear as “the emotional response to real or perceived imminent threat, whereas anxiety is anticipation of future threat.”[12] Both can be healthy responses to reality, but when excessive, they can become disorders.

    Cognitively, it often becomes difficult to think clearly, pulling people into states of unproductive rumination and provoking cognitive distortions that are the focus of cognitive behavioral therapy (CBT), such as catastrophizing, overgeneralizing, and black-and-white thinking. For those with anxiety disorders, these distorted thinking patterns often elicit uncomfortable physical symptoms, which then induce feelings of fear and worry, which then trigger more anxious thinking, perpetuating a vicious cycle.

    So whatever happened in the early 2010s, it hit preteen and young teen girls harder than any other group. This is a major clue. Acts of intentional self-harm in figure 1.4 include both nonfatal suicide attempts, which indicate very high levels of distress and hopelessness, and NSSI (nonsuicidal self-injury), such as cutting. The latter are better understood as coping behaviors that some people (especially girls and young women) use to manage debilitating anxiety and depression.

    Millennial teens, who grew up playing in that first wave, were slightly happier, on average, than Gen X had been when they were teens. The second wave was the rapid increase in the paired technologies of social media and the smartphone, which reached a majority of homes by 2012 or 2013. That is when girls’ mental health began to collapse, and when boys’ mental health changed in a more diffuse set of ways.

    Smartphones are very different. They connect you to the internet 24/7, they can run millions of apps, and they quickly became the home of social media platforms, which can ping you continually throughout the day, urging you to check out what everyone is saying and doing. This kind of connectivity offers few of the benefits of talking directly with friends. In fact, for many young people, it’s poisonous.

    People don’t get depressed when they face threats collectively; they get depressed when they feel isolated, lonely, or useless. As I’ll show in later chapters, this is what the Great Rewiring did to Gen Z.

    Earlier studies had found a positive relationship among college students, activism, and flourishing.[43] Yet more recent studies of young activists, including climate activists, find the opposite: Those who are politically active nowadays usually have worse mental health.[44] Threats and risks have always haunted the future, but the ways that young people are responding, with activism carried out mostly in the virtual world, seem to be affecting them very differently compared to previous generations, whose activism was carried out mostly in the real world.

    Intriguingly, a child’s brain is already 90% of its full size by around age 5.

    Children can only learn how to not get hurt in situations where it is possible to get hurt, such as wrestling with a friend, having a pretend sword fight, or negotiating with another child to enjoy a seesaw when a failed negotiation can lead to pain in one’s posterior, as well as embarrassment. When parents, teachers, and coaches get involved, it becomes less free, less playful, and less beneficial. Adults usually can’t stop themselves from directing and protecting.

    But information doesn’t do much to shape a developing brain. Play does. This relates to a key CBT insight: Experience, not information, is the key to emotional development. It is in unsupervised, child-led play where children best learn to tolerate bruises, handle their emotions, read other children’s emotions, take turns, resolve conflicts, and play fair. Children are intrinsically motivated to acquire these skills because they want to be included in the playgroup and keep the fun going.

    Even if the content on these sites could somehow be filtered effectively to remove obviously harmful material, the addictive design of these platforms reduces the time available for face-to-face play in the real world. The reduction is so severe that we might refer to smartphones and tablets in the hands of children as experience blockers.

    The two that are most relevant for our discussion of social media are conformist bias and prestige bias.

    In a real-life social setting, it takes a while—often weeks—to get a good sense for what the most common behaviors are, because you need to observe multiple groups in multiple settings. But on a social media platform, a child can scroll through a thousand data points in one hour (at three seconds per post), each one accompanied by numerical evidence (likes) and comments that show whether the post was a success or a failure. Social media platforms are therefore the most efficient conformity engines ever invented. They can shape an adolescent’s mental models of acceptable behavior in a matter of hours.

    But humans have an alternative ranking system based on prestige, which is willingly conferred by people to those they see as having achieved excellence in a valued domain of activity, such as hunting or storytelling back in ancient times.

    Language learning is the clearest case. Children can learn multiple languages easily, but this ability drops off sharply during the first few years of puberty.[32] When a family moves to a new country, the kids who are 12 or younger will quickly become native speakers with no accent, while those who are 14 or older will probably be asked, for the rest of their lives, “Where are you from?”

    For girls, the worst years for using social media were 11 to 13; for boys, they were 14 to 15.

    By building physical, psychological, and social competence, it gives kids confidence that they can face new situations, which is an inoculation against anxiety.

    I’ll refer to BIS (the behavioral inhibition system) as defend mode. For people with chronic anxiety, defend mode is chronically activated.

    Thrilling experiences have anti-phobic effects.

    Sandseter and Kennair analyzed the kinds of risks that children seek out when adults give them some freedom, and they found six: heights (such as climbing trees or playground structures), high speed (such as swinging, or going down fast slides), dangerous tools (such as hammers and drills), dangerous elements (such as experimenting with fire), rough-and-tumble play (such as wrestling), and disappearing (hiding, wandering away, potentially getting lost or separated). These are the major types of thrills that children need. They’ll get them for themselves unless adults stop them—which we did in the 1990s. Note that video games offer none of these risks, even though games such as Fortnite show avatars doing all of them.

    Our goal in designing the places children play, she says, should be to “keep them as safe as necessary, not as safe as possible.”

    Children were getting less time to play, but they suddenly got more time with their time-starved parents?

    The Australian psychologist Nick Haslam originated the term “concept creep,” [48] which refers to the expansion of psychological concepts in recent decades in two directions: downward (to apply to smaller or more trivial cases) and outward (to encompass new and conceptually unrelated phenomena). You can see concept creep in action by observing the expansion of terms like “addiction,” “trauma,” “abuse,” and “safety.” For most of the 20th century, the word “safety” referred almost exclusively to physical safety. It was only in the late 1980s that the term “emotional safety” began to show up at more than trace levels in Google’s Ngram viewer. From 1985 to 2010, at the start of the Great Rewiring, the term’s frequency rose rapidly and steadily, a 600% increase.

    Mammal babies therefore have a long period of dependence and vulnerability during which they must achieve two goals: (1) develop competence in the skills needed for adulthood, and (2) don’t get eaten. The best way to avoid getting eaten is generally to stick close to Mom. But as mammals mature, their experience-expectant brains need to wire up by practicing skills such as running, fighting, and befriending. This is why young mammals are so motivated to move away from Mom to play, including risky play. The psychological system that manages these competing needs is called the attachment system.

    As I noted in chapter 2, the human brain reaches 90% of its adult size by age 5, and it has far more neurons and synapses at that moment than it will have in its adult form.

    If a child goes through puberty doing a lot of archery, or painting, or video games, or social media, those activities will cause lasting structural changes in the brain, especially if they are rewarding. This is how cultural experience changes the brain, producing a young adult who feels American instead of Japanese, or who is habitually in discover mode as opposed to defend mode.

    In fact, smartphones and other digital devices bring so many interesting experiences to children and adolescents that they cause a serious problem: They reduce interest in all non-screen-based forms of experience.

    Are screen-based experiences less valuable than real-life flesh-and-blood experiences? When we’re talking about children whose brains evolved to expect certain kinds of experiences at certain ages, yes. A resounding yes. Communicating by text supplemented by emojis is not going to develop the parts of the brain that are “expecting” to get tuned up during conversations supplemented by facial expressions, changing vocal tones, direct eye contact, and body language. We can’t expect children and adolescents to develop adult-level real-world social skills when their social interactions are largely happening in the virtual world.

    In the real world, it often matters how old you are. But as life moved online, it mattered less and less.

    A country that is large, secular, and diverse by race, religion, and politics may not be able to construct shared rites of passage that are full of moral guidance, like the Apache sunrise ceremony. Yet despite our differences, we all want our children to become socially competent and mentally healthy adults who are able to manage their own affairs, earn a living, and form stable romantic bonds. If we can agree on that much, then might we be able to agree on norms that lay out some of the steps on that path?

    “Daddy, can you take the iPad away from me? I’m trying to take my eyes off it but I can’t.” My daughter was in the grip of a variable-ratio reinforcement schedule administered by the game designers, which is the most powerful way to take control of an animal’s behavior short of implanting electrodes in its brain.

    In this chapter, I describe the four foundational harms of the new phone-based childhood that damage boys and girls of all ages: social deprivation, sleep deprivation, attention fragmentation, and addiction.

    By the early 2010s, our phones had transformed from Swiss Army knives, which we pulled out when we needed a tool, to platforms upon which companies competed to see who could hold on to eyeballs the longest.

    First and foremost, in 2009, Facebook introduced the “like” button and Twitter introduced the “retweet” button. Both of these innovations were then widely copied by other platforms, making viral content dissemination possible. These innovations quantified the success of every post and incentivized users to craft each post for maximum spread, which sometimes meant making more extreme statements or expressing more anger and disgust.[8] At the same time, Facebook began using algorithmically curated news feeds, which motivated other platforms to join the race and curate content that would most successfully hook users.

    By the early 2010s, social “networking” systems that had been structured (for the most part) to connect people turned into social media “platforms” redesigned (for the most part) in such a way that they encouraged one-to-many public performances in search of validation, not just from friends but from strangers.

    Children and adolescents, who were increasingly kept at home and isolated by the national mania for overprotection, found it ever easier to turn to their growing collection of internet-enabled devices, and those devices offered ever more attractive and varied rewards. The play-based childhood was over; the phone-based childhood had begun.

    Putting it all together, the Great Rewiring and the dawn of the phone-based childhood seem to have added two to three hours of additional screen-based activity, on average, to a child’s day, compared with life before the smartphone.

    These numbers vary somewhat by social class (more use in lower-income families than in high-income families), race (more use in Black and Latino families than in white and Asian families[13]), and sexual minority status (more use among LGBTQ youth; see more detail in this endnote).

    In 2020, we began telling everyone to avoid proximity to any person outside their “bubble,” but members of Gen Z began socially distancing themselves as soon as they got their first smartphones.

    The Great Rewiring devastated the social lives of Gen Z by connecting them to everyone in the world and disconnecting them from the people around them.

    Teens need more sleep than adults—at least nine hours a night for preteens and eight hours a night for teens.

    It makes intuitive sense. A study by Jean Twenge and colleagues of a large U.K. data set found that “heavy use of screen media was associated with shorter sleep duration, longer sleep latency, and more mid-sleep awakenings.”[37] The sleep disturbances were greatest for those who were on social media or who were surfing the internet in bed.

    In other words, when your sleep is truncated or disturbed, you’re more likely to become depressed and develop behavioral problems. The effects were larger for girls.

    In short, children and adolescents need a lot of sleep to promote healthy brain development and good attention and mood the next day. When screens are allowed in bedrooms, however, many children will use them late into the night—especially if they have a small screen that can be used under the blanket. The screen-related decline of sleep is likely a contributor to the tidal wave of adolescent mental illness that swept across many countries in the early 2010s.

    When you add it all up, the average number of notifications on young people’s phones from the top social and communication apps amounts to 192 alerts per day, according to one study.[42] The average teen, who now gets only seven hours of sleep per night, therefore gets about 11 notifications per waking hour, or one every five minutes.
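    The arithmetic behind that figure is easy to verify; here is a quick back-of-the-envelope sketch of mine (not Haidt’s), using only the numbers quoted above:

```python
# Back-of-the-envelope check of the notification figures quoted above.
ALERTS_PER_DAY = 192   # average daily notifications cited in the study
SLEEP_HOURS = 7        # average teen sleep per night cited above

waking_hours = 24 - SLEEP_HOURS                   # 17 waking hours
alerts_per_waking_hour = ALERTS_PER_DAY / waking_hours
minutes_between_alerts = 60 / alerts_per_waking_hour

print(f"{alerts_per_waking_hour:.1f} alerts per waking hour")    # ~11.3
print(f"one alert every {minutes_between_alerts:.1f} minutes")   # ~5.3
```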

    Thanks to the tech industry and its voracious competition for the limited resource of adolescent attention, many members of Gen Z are now living in Kurt Vonnegut’s dystopia.

    They found that performance was best when phones were left in the other room, and worst when phones were visible, with pocketed phones in between.

    Figure 5.3. The Hooked model. From Nir Eyal’s 2014 book, Hooked: How to Build Habit-Forming Products. In the book, Eyal warned about the ethical implications of misusing the model in a section titled “The Morality of Manipulation.”

    The loop starts with an external trigger, such as a notification that someone commented on one of her posts. That’s step 1, the off-ramp inviting her to leave the path she was on. It appears on her phone and automatically triggers a desire to perform an action (step 2) that had previously been rewarded: touching the notification to bring up the Instagram app. The action then leads to a pleasurable event, but only sometimes, and this is step 3: a variable reward. Maybe she’ll find some expression of praise or friendship, maybe not. This is a key discovery of behaviorist psychology: It’s best not to reward a behavior every time the animal does what you want. If you reward an animal on a variable-ratio schedule (such as one time out of every 10 times, on average, but sometimes fewer, sometimes more), you create the strongest and most persistent behavior.
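
    To make the "variable reward" step concrete, here is a minimal sketch of a variable-ratio schedule (my own illustration, not code from Eyal or Haidt), assuming roughly a 1-in-10 payoff per check. The point is that rewards arrive unpredictably even though the long-run rate is fixed, which is what makes the checking habit so persistent:

    ```python
    import random

    def check_app(p_reward=0.1):
        """One 'check the app' action; pays off with probability p_reward (assumed 1 in 10)."""
        return random.random() < p_reward

    random.seed(42)  # fixed seed so the example is reproducible

    # Simulate 100 checks on a variable-ratio schedule.
    rewards = [check_app() for _ in range(100)]
    print(sum(rewards), "rewards in 100 checks")  # long-run average is ~10 per 100

    # The number of checks between rewards is irregular; that unpredictability
    # is the behaviorist point quoted above.
    gaps, last = [], -1
    for i, rewarded in enumerate(rewards):
        if rewarded:
            gaps.append(i - last)  # checks since the previous reward (from the start, for the first)
            last = i
    print("checks between rewards:", gaps)
    ```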

    What the Hooked model adds for humans, which was not applicable for those working with rats, was the fourth step: investment. Humans can be offered ways to put a bit of themselves into the app so that it matters more to them. The girl has already filled out her profile, posted many photos of herself, and linked herself to all of her friends plus hundreds of other Instagram users.

    At this point, after investment, the trigger for the next round of behavior may become internal. The girl no longer needs a push notification to call her over to Instagram. As she is rereading a difficult passage in her textbook, the thought pops up in her mind: “I wonder if anyone has liked the photo I posted 20 minutes ago?”

    We know that Facebook intentionally hooked teens using behaviorist techniques thanks to the Facebook Files—the trove of internal documents and screenshots of presentations brought out by the whistleblower Frances Haugen in 2021. In one chilling section, a trio of Facebook employees give a presentation titled “The Power of Identities: Why Teens and Young Adults Choose Instagram.” The stated objective is “to support Facebook Inc.–wide product strategy for engaging younger users.” A section titled “Teen Fundamentals” delves into neuroscience, showing the gradual maturation of the brain during puberty, with the frontal cortex not mature until after age 20.

    Unfortunately, when an addicted person’s brain adapts by counteracting the effect of the drug, the brain then enters a state of deficit when the user is not taking the drug. If dopamine release is pleasurable, dopamine deficit is unpleasant. Ordinary life becomes boring and even painful without the drug. Nothing feels good anymore, except the drug. The addicted person is in a state of withdrawal, which will go away only if she can stay off the drug long enough for her brain to return to its default state (usually a few weeks).

    Lembke says that “the universal symptoms of withdrawal from any addictive substance are anxiety, irritability, insomnia, and dysphoria.”[57] Dysphoria is the opposite of euphoria; it refers to a generalized feeling of discomfort or unease. This is basically what many teens say they feel—and what parents and clinicians observe—when kids who are heavy users of social media or video games are separated from their phones and game consoles involuntarily. Symptoms of sadness, anxiety, and irritability are listed as the signs of withdrawal for those diagnosed with internet gaming disorder.

    Most obviously, those who are addicted to screen-based activities have more trouble falling asleep, both because of the direct competition with sleep and because of the high dose of blue light delivered to the retina from just inches away, which tells the brain: It’s morning time! Stop making melatonin!

    Certainly, these digital platforms offer fun and entertainment, as television did for previous generations. They also confer some unique benefits for specific groups such as sexual minority youth and those with autism—where some virtual communities can help soften the pain of social exclusion in the real world.

    However, unlike the extensive evidence of harm found in correlational, longitudinal, and experimental studies, there is very little evidence showing benefits to adolescent mental health from long-term or heavy social media use.[66] There was no wave of mental health and happiness breaking out around the world in 2013, as young people embraced Instagram. Teens are certainly right when they say that social media gives them a connection with their friends, but as we’ve seen in their reports of increasing loneliness and isolation, that connection does not seem to be as good as what it replaced.

    A second reason why I am skeptical of claims about the benefits of social media for adolescents is that these claims often confuse social media with the larger internet. During the COVID shutdowns I often heard people say, “Thank goodness for social media! How would young people have connected without it?” To which I respond: Yes, let’s imagine a world in which the only way that children and adolescents could connect was by telephone, text, Skype, Zoom, FaceTime, and email, or by going over to each other’s homes and talking or playing outside. And let’s imagine a world in which the only way they could find information was by using Google, Bing, Wikipedia, YouTube,[67] and the rest of the internet, including blogs, news sites, and the websites of the many nonprofit organizations devoted to their specific interests.

    A third reason for skepticism is that the same demographic groups that are widely said to benefit most from social media are also the most likely to have bad experiences on these platforms. The 2023 Common Sense Media survey found that LGBTQ adolescents were more likely than their non-LGBTQ peers to believe that their lives would be better without each platform they use.[69] This same report found that LGBTQ girls were more than twice as likely as non-LGBTQ girls to encounter harmful content related to suicide and eating disorders. Regarding race, a 2022 Pew report found that Black teens were about twice as likely as Hispanic or white teens to say they think their race or ethnicity made them a target of online abuse.[70] And teens from low-income households ($30,000 or less) were twice as likely as teens from higher-income families ($75,000 or higher) to report physical threats online (16% versus 8%).

    We need to develop a more nuanced mental map of the digital landscape. Social media is not synonymous with the internet, smartphones are not equivalent to desktop computers or laptops, Pac-Man is not World of Warcraft, and the 2006 version of Facebook is not the 2024 version of TikTok.

    Time with friends dropped further because of COVID restrictions, but Gen Z was already socially distanced before COVID restrictions were put in place.

    Around 2013, psychiatric wards in the United States and other Anglo countries began to fill disproportionately with girls.

    There is a clear, consistent, and sizable link[7] between heavy social media use and mental illness for girls,[8] but that relationship gets buried or minimized in studies and literature reviews that look at all digital activities for all teens.

    Taken as a whole, the dozens of experiments that Jean Twenge, Zach Rausch, and I have collected[15] confirm and extend the patterns found in the correlational studies: Social media use is a cause of anxiety, depression, and other ailments, not just a correlate.

    This meant that they made eye contact less frequently, laughed together less, and lost practice making conversation. Social media therefore harmed the social lives even of students who stayed away from it.

    These group-level effects may be much larger than the individual-level effects, and they are likely to suppress the true size of the individual-level effects.[18] If an experimenter assigns some adolescents to abstain from social media for a month while all of their friends are still on it, then the abstainers are going to be more socially isolated for that month. Yet even still, in several studies, getting off social media improves their mental health. So just imagine how much bigger the effect would be if all of the students in 20 middle schools could be randomly assigned to give up social media for a year, or (more realistically) to put their phones in a phone locker each morning, while 20 other middle schools served as the control group. These are the kinds of experiments we most need in order to examine group-level effects.

    Agency arises from striving to individuate and expand the self and involves qualities such as efficiency, competence, and assertiveness. Communion arises from striving to integrate the self in a larger social unit through caring for others and involves qualities such as benevolence, cooperativeness, and empathy.

    The two motives are woven together in changing patterns across the life course, and that weaving is particularly important for adolescents who are developing their identities. Part of defining the self comes from successfully integrating into groups; part of being attractive to groups is demonstrating one’s value as an individual with unique skills.[30] Researchers have long found that boys and men are more focused on agency strivings while girls and women are more focused on communion strivings.

    It was bad enough when I was growing up in the 1970s and 1980s, when girls were exposed to airbrushed and later photoshopped models. But those were adult strangers; they were not a girl’s competition. So what happened when most girls in a school got Instagram and Snapchat accounts and started posting carefully edited highlight reels of their lives and using filters and editing apps to improve their virtual beauty and online brand? Many girls’ sociometers plunged, because most were now below what appeared to them to be the average. All around the developed world, an anxiety alarm went off in girls’ minds, at approximately the same time.

    These tuning apps gave girls the ability to present themselves with perfect skin, fuller lips, bigger eyes, and a narrower waist (in addition to showcasing the most “perfect” parts of their lives).[38] Snapchat offered similar features through its filters, first released in 2015, many of which gave users full lips, petite noses, and doe eyes at the touch of a button.

    Girls are especially vulnerable to harm from constant social comparison because they suffer from higher rates of one kind of perfectionism: socially prescribed perfectionism, where a person feels that they must live up to very high expectations prescribed by others, or by society at large.[39] (There’s no gender difference on self-oriented perfectionism, where you torture yourself for failure to live up to your own very high standards.) Socially prescribed perfectionism is closely related to anxiety; people who suffer from anxiety are more prone to it. Being a perfectionist also increases your anxiety because you fear the shame of public failure from everything you do. And, as you’d expect by this point in the story, socially prescribed perfectionism began rising, across the Anglosphere nations, in the early 2010s.

    Striving to excel can be healthy when it motivates girls to master skills that will be useful in later life. But social media algorithms home in on (and amplify) girls’ desires to be beautiful in socially prescribed ways, which include being thin. Instagram and TikTok send them images of very thin women if they show any interest in weight loss, or beauty, or even just healthy eating. Researchers for the Center for Countering Digital Hate created a dozen fake accounts on TikTok, registered to 13-year-old girls, and found that TikTok’s algorithm served them tens of thousands of weight-loss videos within a few weeks of joining the platform.

    The researchers also noted that “social comparison is worse” on Instagram than on rival apps. Snapchat’s filters “keep the focus on the face,” whereas Instagram “focuses heavily on the body and lifestyle.”

    Boys are also more interested in watching stories and movies about sports, fighting, war, and violence, all of which appeal to agency interests and motivations. Traditionally, boys have negotiated who is high and who is low in social status based in part on who could dominate whom if it came to a fight, or who can hurl an insult at whom without fear of violent reprisal. But because girls have stronger communion motives, the way to really hurt another girl is to hit her in her relationships.

    Researchers have found that when you look at “indirect aggression” (which includes damaging other people’s relationships or reputations), girls are higher than boys—but only in late childhood and adolescence.

    Studies confirm that as adolescents moved their social lives online, the nature of bullying began to change. One systematic review of studies from 1998 to 2017 found a decrease in face-to-face bullying among boys but an increase among girls, especially among younger adolescent girls.

    They found that happiness tends to occur in clusters. This was not just because happy people seek each other out. Rather, when one person became happier, it increased the odds that their existing friends would become happier too. Amazingly, it also had an influence on their friends’ friends, and sometimes even on their friends’ friends’ friends. Happiness is contagious; it spreads through social networks.

    The second twist was that depression spread only from women. When a woman became depressed, it increased the odds of depression in her close friends (male and female) by 142%. When a man became depressed, it had no measurable effect on his friends.

    But on social media, the way to gain followers and likes is to be more extreme, so those who present with more extreme symptoms are likely to rise fastest, making them the models that everyone else locks onto for social learning. This process is sometimes known as audience capture—a process in which people get trained by their audiences to become more extreme versions of whatever it is the audience wants to see.[59] And if one finds oneself in a network in which most others have adopted some behavior, then the other social learning process kicks in too: conformity bias.

    The recent growth in diagnoses of gender dysphoria may also be related in part to social media trends. Gender dysphoria refers to the psychological distress a person experiences when their gender identity doesn’t align with their biological sex. People with such mismatches have long existed in societies around the world. According to the most recent diagnostic manual of psychiatry,[68] estimates of the prevalence of gender dysphoria in American society used to indicate rates below one in a thousand, with rates for natal males (meaning those who were biological males at birth) being several times higher than for natal females. But those estimates were based on the numbers of people who sought gender reassignment surgery as adults, which was surely a vast underestimate of the underlying population. Within the past decade, the number of individuals who are being referred to clinics for gender dysphoria has been growing rapidly, especially among natal females in Gen Z.[69] In fact, among Gen Z teens, the sex ratio has reversed, with natal females now showing higher rates than natal males.

    Sexual predation and rampant sexualization mean that girls and young women must be warier, online, than most boys and young men. They are forced to spend more of their virtual lives in defend mode, which may be part of the reason that their anxiety levels went up more sharply in the early 2010s.

    The clinical psychologist Lisa Damour says that regarding friendship for girls, “quality trumps quantity.” The happiest girls “aren’t the ones who have the most friendships but the ones who have strong, supportive friendships, even if that means having a single terrific friend.”[82] (She notes that this is true for boys as well.)

    When teens as a whole cut back on hanging out and doing things together in the real world, their culture changed. Their communion needs were left unsatisfied—even for those few teens who were not on social media.

    Two major categories of motivations are agency (the desire to stand out and have an effect on the world) and communion (the desire to connect and develop a sense of belonging). Boys and girls both want each of these, but there is a gender difference that emerges early in children’s play: Boys choose more agency activities; girls choose more communion activities. Social media appeals to the desire for communion, but it often ends up frustrating that desire.

    The net effect of this push-pull is that boys have increasingly disconnected from the real world and invested their time and talents in the virtual world instead. Some boys will find career success there, because their mastery of that world can lead to lucrative jobs in the tech industry or as influencers. But for many, though it can be an escape from an increasingly inhospitable world, growing up in the virtual world makes them less likely to develop into men with the social skills and competencies to achieve success in the real world.

    They calm their anxieties by staying inside, but the longer they stay in, the less competent they become in the outside world, fueling their anxiety about the outside world. They are trapped.

    A world with too much supervision and not enough risk is bad for all children, but it seems to be having a larger impact on boys.

    But around 2010, something unprecedented started happening: Both sexes shifted rapidly toward the pattern traditionally associated with females. There has been a notable increase in agreement with items related to internalizing disorders (such as “I feel that I can’t do anything right”) for both sexes, with a sharper rise among girls as you can see in figure 7.2. At the same time, agreement with items related to externalizing disorders (such as “how often have you damaged school property on purpose?”) plummeted for both sexes, more sharply for boys. By 2017, boys’ responses looked like those from girls in the 1990s.

    By 2015, many boys found themselves exposed to a level of stimulation and attention extraction that had been unimaginable just 15 years earlier.

    In previous decades, the main way for heterosexual boys[33] to get a look at naked girls was through what we’d now consider very low-quality pornography—printed magazines that could not be sold to minors. As puberty progressed and the sex drive increased, it motivated boys to do things that were frightening and awkward, such as trying to talk to a girl, or asking a girl to dance at events organized by adults.

    When we look at daily users or users for whom porn has become an addiction that interferes with daily functioning, the male-female ratios are generally more than five or 10 to one.

    Porn separates the evolved lure (sexual pleasure) from its real-world reward (a sexual relationship), potentially making boys who are heavy users turn into men who are less able to find sex, love, intimacy, and marriage in the real world.

    Prevalence estimates vary,[58] but one 2016 study found that 1 or 2% of adult gamers qualify as having gaming addiction, 7% are problematic gamers, 4% are engaged gamers, and 87% are casual gamers.

    As Peter Gray and other play researchers point out, one of the most beneficial parts of free play is that kids must act as legislators (who jointly make up the rules) and as judges and juries (who jointly decide what to do when rules appear to be violated). In most multiplayer video games, all of that is done by the platform. Unlike free play in the real world, most video games give no practice in the skills of self-governance.

    Video games also deliver far less of the anti-phobic benefits of risky play. Video games are disembodied. They are thrilling in their own way, but they can’t activate the kind of physical fear, thrill, and pounding heart that riding a roller coaster, or playing full-court basketball, or using hammers to smash things at an adventure playground can give. Jumping out of planes, having knife fights, and getting brutally murdered are just things that happen dozens of times each day for boys playing Fortnite or Call of Duty. They do not teach boys how to judge and manage risks for themselves in the real world.

    Boys thrive when they have a stable group of reliable friends, and they create their strongest and most durable friendships from being on the same team or in a stable pack, facing risks or rival teams. Virtual packs create weaker bonds, although today’s increasingly lonely boys cling to them and value them because that’s all they have. That’s where their friends are, as Chris told me.

    Drawing on data that was just becoming available as governments began to keep statistics, he noted that in Europe the general rule was that the more tightly people are bound into a community that has the moral authority to restrain their desires, the less likely they are to kill themselves.

    The phone-based life produces spiritual degradation, not just in adolescents, but in all of us.

    But there’s another vertical dimension, shown as the z axis coming out of the page. I called it the divinity axis because so many cultures wrote explicitly that virtuous actions bring one upward, closer to God, while base, selfish, or disgusting actions bring one downward, away from God and sometimes toward an anti-divinity such as the Devil. Whether or not God exists, people simply do perceive some people, places, actions, and objects to be sacred, pure, and elevating; other people, places, actions, and objects are disgusting, impure, and degrading (meaning, literally, “brought down a step”).

    Conversely, witnessing people behaving in petty, nasty ways, or doing physically disgusting things, triggers revulsion. We feel pulled “down” in some way. We close off and turn away. Such actions are incompatible with our elevated nature. This is how I’m using the word “spiritual.” It means that one endeavors to live more of one’s life well above zero on the z axis. Christians ask, “What would Jesus do?” Secular people can think of their own moral exemplar. (I should point out that I am an atheist, but I find that I sometimes need words and concepts from religion to understand the experience of life as a human being. This is one of those times.)

    In the rest of this chapter, I’ll draw on wisdom from ancient traditions and modern psychology to try to make sense of how the phone-based life affects people spiritually by blocking or counteracting six spiritual practices: shared sacredness; embodiment; stillness, silence, and focus; self-transcendence; being slow to anger, quick to forgive; and finding awe in nature.

    This is one of the founding insights of sociology: Strong communities don’t just magically appear whenever people congregate and communicate. The strongest and most satisfying communities come into being when something lifts people out of the lower level so that they have powerful collective experiences. They all enter the realm of the sacred together, at the same time. When they return to the profane level, where they need to be most of the time to address the necessities of life, they have greater trust and affection for each other as a result of their time together in the sacred realm. They are also happier and have lower rates of suicide.

    People who live only in networks, rather than communities, are less likely to thrive.

    Living in a world of structureless anomie makes adolescents more vulnerable to online recruitment into radical political movements that offer moral clarity and a moral community, thereby pulling them further away from their in-person communities.

    DeSteno notes that synchronous movement during religious rituals is not only very common; it is also an experimentally validated technique for enhancing feelings of communion, similarity, and trust, which means it makes a group of disparate individuals feel as though they have merged into one.

    Sports are not exactly spiritual, but playing them depends on some of spirituality’s key ingredients for bonding people together, like coordinated and collective physical movement and group celebrations. Research consistently shows that teens who play team sports are happier than those who don’t.

    One of the fundamental teachings of the Buddha is that we can train our minds.

    This is why many religions have monasteries and monks. Those seeking spiritual growth are well served by separating themselves from the noise and complexity of human interactions, with their incessant words and profane concerns. When people practice silence in the company of equally silent companions, they promote quiet reflection and inner work, which confers mental health benefits. Focusing your attention and meditating have been found to reduce depression and anxiety.

    You stand on the platform and post content to influence how others perceive you. It is almost perfectly designed to crank up the DMN to maximum and pin it there. That’s not healthy for any of us, and it’s even worse for adolescents.

    Social media is a fountain of bedevilments. It trains people to think in ways that are exactly contrary to the world’s wisdom traditions: Think about yourself first; be materialistic, judgmental, boastful, and petty; seek glory as quantified by likes and followers. Many users may believe that the implicit carrots and sticks built into platforms like Instagram don’t affect them, but it’s hard not to be affected unconsciously. Unfortunately, most young people become heavy users of social media during the sensitive period for cultural learning, which runs from roughly age 9 to 15.

    The Tao Te Ching lists “ideas of right and wrong” as a bedevilment. In my 35 years of studying moral psychology, I have come to see this as one of humanity’s greatest problems: We are too quick to anger and too slow to forgive.

    But I believe his point was that the mind, left to its own devices, evaluates everything immediately, which shapes what we think next, making it harder for us to find the truth. This insight is the foundation of the first principle of moral psychology, which I laid out in The Righteous Mind: Intuitions come first, strategic reasoning second. In other words, we have an immediate gut feeling about an event, and then we make up a story after the fact to justify our rapid judgment—often a story that paints us in a good light.

    From a spiritual perspective, social media is a disease of the mind. Spiritual practices and virtues, such as forgiveness, grace, and love, are a cure.

    In 2003, Dacher Keltner and I published a review paper on the emotion of awe in which we argued that awe is triggered by two simultaneous perceptions: first, that what you are looking at is vast in some way, and, second, that you can’t fit it into your existing mental structures.[30] That combination seems to trigger a feeling in people of being small in a profoundly pleasurable—although sometimes also fearful—way. Awe opens us to changing our beliefs, allegiances, and behaviors.

    The great evolutionary biologist E. O. Wilson said that humans are “biophilic,” by which he meant that humans have “the urge to affiliate with other forms of life.”

    Yet one of the hallmarks of the Great Rewiring is that children and adolescents now spend far less time outside, and when they are outside, they are often looking at or thinking about their phones. If they encounter something beautiful, such as sunlight reflected on water, or cherry blossoms wafting on gentle spring breezes, their first instinct is to take a photograph or video, perhaps to post somewhere. Few are open to losing themselves in the moment as Yi-Mei did.

    As for our children, if we want awe and natural beauty to play a larger role in their lives, we need to make deliberate efforts to bring them or send them to beautiful natural areas. Without phones.

    Shortly before his death in 1662, the French philosopher Blaise Pascal wrote a paragraph often paraphrased as “there is a God-shaped hole in every human heart.”

    It matters what we expose ourselves to. On this the ancients universally agree. Here is Buddha: “We are what we think. All that we are arises with our thoughts.”[37] And here is Marcus Aurelius: “The things you think about determine the quality of your mind. Your soul takes on the color of your thoughts.”[38] In a phone-based life, we are exposed to an extraordinary amount of content, much of it chosen by algorithms and pushed to us via notifications that interrupt whatever we were doing. It’s too much, and a lot of it pulls us downward on the divinity dimension. If we want to spend most of our lives above zero on that dimension, we need to take back control of our inputs. We need to take back control of our lives.

    Awe in nature may be especially valuable for Gen Z because it counteracts the anxiety and self-consciousness caused by a phone-based childhood.

    called back when a safety issue was discovered. After the Titanic sank in 1912, its two sister ships were pulled out of service and modified to make them safer. When new consumer products are found to be dangerous, especially for children, we recall them and keep them off the market until the manufacturer corrects the design.

    Parents face collective action problems around childhood independence too. It was easy to send kids out to play back when everyone was doing it, but in a neighborhood where nobody does that, it’s hard to be the first one. Parents who let their children walk or play unchaperoned in a public place face the risk that a misguided neighbor will call the police, who may refer the case to Child Protective Services, who’d then investigate them for “neglect” of their children. Each parent decides that it’s best to do what every other parent is doing: Keep kids supervised, always, even if that stunts the development of all children.

    If Instagram were to make a real effort to block or expel underage users, it would lose those users to TikTok and other platforms. Younger users are particularly valuable because the habits they form early often stick with them for life, so companies need younger users to ensure robust future usage of their products.

    What is the right age of internet adulthood? Note that we are not talking about the age at which children can browse the web or watch videos on YouTube or TikTok. We’re talking only about the age at which a minor can enter into a contract with a company to use the company’s products. We’re talking about the age at which a child can open an account on YouTube or TikTok and begin uploading her own videos and getting her own highly customized feed, while giving her data to the company to use and sell as it says it will do in its terms of service.

    We expect liquor stores to enforce age limits. We should expect the same from tech companies.

    Parents should have a way of marking their child’s phones, tablets, and laptops as devices belonging to a minor. That mark, which could be written either into the hardware or the software, would act like a sign that tells companies with age restrictions, “This person is underage; do not admit without parental consent.”

    Governments at all levels, from local to federal, could support this transition by allocating funds to pay the small cost of buying phone lockers or lockable pouches for any school that wants to keep phones out of students’ pockets and hands during the school day.

    Yet in some U.S. states, such as Connecticut, the law said a child should never be left alone in public before the age of 12, meaning that 11-year-olds needed babysitters. Indeed, a Connecticut mom was arrested for letting her 11-year-old wait in the car while she ran into the store.

    Tech companies can be a major part of the solution by developing better age verification features, and by adding features that allow parents to designate their children’s phones and computers as ones that should not be served by sites with minimum ages until they are old enough. Such a feature would help to dissolve multiple collective action problems for parents, kids, and platforms.

    Voss says that when he walks into a school without a phone ban, “It’s kind of like the zombie apocalypse, and you have all these kids in the hallways not talking to each other. It’s just a very different vibe.”

    In other words, the phone ban ameliorates three of the four foundational harms of the phone-based childhood: attention fragmentation, social deprivation, and addiction. It reduces social comparison and the pull into the virtual world. It generates communion and community.

    (Some parents object that they need to be able to reach their children immediately in case of an emergency, such as a school shooting. As a parent I understand this desire. But a school in which most students are calling or texting their parents during an emergency is likely to be less safe than a school in which only the adults have phones and the students are listening to the adults and paying attention to their surroundings.[6])

    The value of phone-free and even screen-free education can be seen in the choices that many tech executives make about the schools to which they send their own children, such as the Waldorf School of the Peninsula, where all digital devices—phones, laptops, tablets—are prohibited. This is in stark contrast with many public schools that are advancing 1:1 technology programs, trying to give every child their own device.[9] Waldorf is probably right.

    The “digital divide” is no longer that poor kids and racial minorities have less access to the internet, as was feared in the early 2000s; it is now that they have less protection from it.

    “It seems small. But in the moment, when I saw her get on the bus and it drove away, I felt really important to her, important to someone.” That’s what was so new to her. At last, instead of feeling needy, she was needed.

    We should all be aghast that the average American elementary school student gets only 27 minutes of recess a day.[19] In maximum-security federal prisons in the United States, inmates are guaranteed two hours of outdoor time per day.

    The key thing to understand about “loose parts” playgrounds is that kids have control over their environment. They have agency. Playgrounds with fixed structures can hold kids’ attention only so long. But loose parts keep kids’ attention for hours, allowing them to build not only forts and castles but also focus, compromise, teamwork, and creativity.

    Kids will take on responsibility for their safety when they are actually responsible for their safety, rather than relying on the adult guardians hovering over them.

    The Let Grow Project is another activity that seems to reduce anxiety. It is a homework assignment that asks children to “do something they have never done before, on their own,” after reaching agreement with their parents as to what that is. Doing projects increases children’s sense of competence while also increasing parents’ willingness to trust their children and grant them more autonomy.

    New parents lost access to local wisdom and began to rely more on experts.

    After school is for free play. Try not to fill up most afternoons with adult-supervised “enrichment” activities. Find ways that your children can just hang out with other children, such as joining a Play Club (see chapter 11) or going to each other’s homes after school. Friday is a particularly good day for free play because children can then make plans to meet up over the weekend. Think of it as “Free Play Friday.”

    The cure for such parental anxiety is exposure. Experience the anxiety a few times, taking conscious note that your worst fears did not occur, and you learn that your child is more capable than you had thought. Each time, the anxiety gets weaker. After five days of our son walking to school, we stopped watching his blue dot. We got more comfortable with his ability to navigate the city, and soon its subway system.

    Delay the opening of social media accounts until 16. Let your children get well into puberty, past the most vulnerable early years, before letting them plug into powerful socializing agents like TikTok or Instagram.

    Encourage more and better off-base excursions with friends. Let your teen hang out at a “third place” (not home or school) like the Y, the mall, the park, a pizzeria—basically, a place where they can be with their friends, away from adult supervision. Otherwise, the only place they can socialize freely is online.

    Rely more on your teen at home. Teens can cook, clean, run errands on a bicycle or public transit, and, once they turn 16, run errands using a car. Relying on your teen is not just a tool to instill work ethic. It’s also a way to ward off the growing feeling among Gen Z teens that their lives are useless.

    Encourage your teen to find a part-time job.

    Find ways for them to nurture and lead. Any job that requires guiding or caring for younger children is ideal, such as a babysitter, camp counselor, or assistant coach. Even as they need mentors themselves, they can serve as a mentor to younger kids. Helping younger kids seems to turn on an empathy switch and a leadership gene.

    Take a gap year after high school. Many young people go directly to college without any sense of what else is out there.

    Risking a serious injury for no good reason is dumb. But some risk is part of any hero’s journey, and there’s plenty of risk in not taking the journey too.

    Their parents got smartphones too. Those smartphones gave parents a new superpower that they did not have in the era of flip phones: the ability to track their children’s movements at every moment.

    Whether we think of the phone as “the world’s longest umbilical cord” or as an “invisible fence,” childhood autonomy plummeted when kids started carrying them.

    I didn’t set out to write this book. In late 2021, I began writing a book on how social media was damaging American democracy. My plan was to begin with a chapter on the impact of social media on Gen Z, showing how it disrupted their social lives and caused a surge of mental illness. The rest of the book would analyze how social media disrupted society more broadly. I’d show how it fragmented public discourse, Congress, journalism, universities, and other foundational democratic institutions. But when I finished writing that first chapter—which became chapter 1 of this book—I realized that the adolescent mental health story was so much bigger than I had thought. It wasn’t just an American story, it was a story playing out across many Western nations.

    Until someone finds a chemical that was released in the early 2010s into the drinking water or food supply of North America, Europe, Australia, and New Zealand, a chemical that affects adolescent girls most, and that has little effect on the mental health of people over 30, the Great Rewiring is the leading theory.

    In part 4, I offered dozens of suggestions, but the four foundational reforms are:

    • No smartphones before high school
    • No social media before 16
    • Phone-free schools
    • Far more unsupervised play and childhood independence

    If a community enacts all four, they are likely to see substantial improvements in child and adolescent mental health within two years.

    You shouldn’t have to compete for your students’ attention with the entire internet.

    Growing up in the virtual world promotes anxiety, anomie, and loneliness. The Great Rewiring of Childhood, from play-based to phone-based, has been a catastrophic failure. It’s time to end the experiment. Let’s bring our children home.

    Comments Off on The Anxious Generation by Jonathan Haidt
  • Music,  RIP

    RIP Kinky Friedman

    Farewell to the Kinkster – I saw him at a book signing once in Alpharetta in the mid-1990s and was never able to see him play live, but for a time he was my favorite act.

    He was similar in many ways to Tom Lehrer, though I would say that for Friedman the music came first and the comedy second, whereas for Lehrer it was the reverse. The originality was very, very high.

    Obit here

    Comments Off on RIP Kinky Friedman