Age and development are concepts central to contemporary Western understandings of children's growth and to the way industrialized societies have been organized since approximately the middle of the nineteenth century. If the notion of development offers a map to social and cultural constructions of maturity, the physiological and psychological characteristics accompanying chronological age are the signposts of notable change along its path. By the late twentieth century, the developmental stages of juvenile maturation were thought to proceed in sequence through eight distinguishable but overlapping stages: from early infancy to later infancy; to early childhood and then middle childhood; to pre-, early, middle, and later ADOLESCENCE.
The so-called modern family as described by PHILIPPE ARIÈS in Centuries of Childhood was characterized (in part) by the degree to which parents were alive to these phases of their children's growth, as opposed to earlier, more rudimentary, divisions. Overall, the enlarging discernment of predictable chronological and developmental changes in the child reflected the "sentimentalization" of childhood; that is, children were increasingly regarded as requiring special care and attention by adults and were believed to occupy a stage of life precious to their formation individually and collectively as future adults. The sentimentalization of children occurred on a broad scale and was combined with the creation of institutions–notably universal schooling–that were sponsored, with a few exceptions, by the emerging urban middle class of industrialized Europe and North America. They were later applied by the state at all levels of the social structure in these societies.
The dawning concern with children's development can be traced in part to the ancient recognition of the ages of life (or "ages of man"), a concept acknowledged in Western culture from at least the sixth century B.C.E. According to Ariès, Ionian formulations of the ages of man ultimately found their way centuries later into Byzantine writings and, subsequently, into what he called "scientific vulgarizations" of the sixteenth century. During the 1500s the popular understanding of human biology, according to Ariès, derived from the notion of a "universal system of correspondences," that is, the belief that there is a symbolism in numbers that binds natural phenomena into relations or "correspondences" with one another. (Accordingly, the ages of man, of which there were seven, were believed to parallel the number of planets observed in the night sky.)
The seven ages of man, from birth to death, were depicted iconographically beginning in the fourteenth century as the seven "steps of the ages," which enjoyed popular currency as a means of visualizing human aging. Ascending from the left and descending to the right, these representations begin with a child on a hobbyhorse (the "age of toys"). On the next, boys are depicted as learning to read and girls as spinning
yarn. This is succeeded by the "age of love"–scenes of boys and girls walking together, or of wedding celebrations. At the summit is the age of war and chivalry, where a man bearing arms is pictured. Then decline: represented on the next (and lower) stair are "men of law, science, or learning"; just below, the "old bearded scholar" sitting by the fire; and finally, infirmity and death.
While children occupied the first two stairs on the steps of the ages, Ariès contended that it was not until the seventeenth century that the idea of childhood was recognized as a stage of life in the West, at which point it began commanding particular attention from parents and society's institutions. Ariès argued that this change could be seen in the depictions of children that became common in the 1600s, when they were shown as individuals with characteristic childlike features, clothing, and accoutrements. Indeed, Ariès felt it was the increased effort to differentiate children from adults and the physical separation of children from adult society that defined family life as "modern" and suggested a seemingly permanent departure from child-treatment practices that had been common since ancient times.
Since Ariès, historians have looked much more closely for clues to test the validity of his chief claim: that childhood did not exist before the seventeenth century, or more precisely, that nothing like the contemporary fixation on children's well-being and growth flourished before the 1600s. They have discovered a much greater attentiveness to childhood and to the stages of children's emergence from physical, material, and psychological dependence on adults than had been appreciated by Ariès.
Most notably, Shulamith Shahar found that medieval medical works, didactic literature, and moral treatises not only recognized several stages in human life but commonly divided childhood itself into three stages: infantia, pueritia, and adolescentia. Further, most authorities referred to what is now termed the postadolescent phase as juventus. Each of these stages implied fairly uniform age groupings.
Infantia lasted from birth to about age seven. Within this stage an early phase lasted from birth to roughly age two, the point when the child has all of its teeth and can walk. Some writers detected a second substage that ended around age five, when the child's speech is perfected.
Pueritia lasted from age seven to age twelve for girls and to age fourteen for boys, which recognized the differing physiological maturation of girls and boys. This stage, the so-called age of reason, was marked by the capacity of children to distinguish between right and wrong. And yet this potential was accompanied by a supposed proclivity for sin beginning around the age of seven, which was seen as corresponding with the growth of the child's intellect. Still, some religious moralists argued that such reasoning was not commonly accessible to children until the age of ten or ten-and-a-half. Thus, a substage was spliced into pueritia that suggested, again, a more acute sensitivity to children's aptitudes than was once thought to have been the case. Hence, children under the age of fourteen usually were not considered accountable for crimes, were not liable for oaths, were not subject to penance for sexual sins, and generally performed a lighter penance than adults for sins to which they confessed. Further, young people in this stage of life were observed to be very impressionable, both moody and carefree, to crave sleep and food, to prefer the company, praise, and admonitions of their peers to those of adults, and to be immodest about their bodies. Like moderns, the medieval sages noted the onset of PUBERTY at age twelve for girls and age fourteen for boys.
While there was wide consensus among writers about the beginning of adolescentia, there was little agreement about when this third stage concluded. Some set its end at the age of twenty-one, others at twenty-five, twenty-eight, thirty, or as late as thirty-five. Many, as intimated by Ariès, divided the ages into multiples of seven, which meant that this stage began straightforwardly at fourteen and ended at twenty-one. Legally, adulthood could include the rights of males and females to marry, own, inherit, and transmit property, bear witness at legal or ecclesiastical proceedings, and be fully accountable under the laws of the land. While statutes varied dramatically across the European continent, Roman law remained a powerful influence throughout the Middle Ages. Under Roman law, twenty-five was the age of adulthood, the age at which males assumed full rights and responsibilities.
Notable in Shahar's view is that while writers during the Middle Ages addressed themselves to both girls and boys in their musings on the earliest phases of child life, separate treatments are devoted to girls and boys in discussions of pueritia. Boys, it was urged, should be schooled. Girls, on the other hand, might be advised to learn how to perform more homely tasks, yet they received decreasing attention in this literature and, according to Shahar, were "almost overlooked in discussions of transitions to full adulthood" (p. 30).
The relative newness of the concept of childhood–the foundation of Ariès's claims about the supposed modernity of family life after the seventeenth century and for the eventual articulation of children's observed physical, mental, moral, and emotional development–was modified if not overturned by Shahar and others. Nonetheless, just as the bases of Ariès's inferences were distrusted for having been drawn from the upper strata of Western culture (think of the seven steps, which depicted a child on a hobbyhorse or boys learning to read, for instance), so also were his critics guilty of invoking the formulations of a social elite in studying the early modern era. Until the nineteenth century the great mass of people were illiterate; recommendations to parents to educate their sons could therefore be aimed only at parents who were themselves literate and who saw the necessity of LITERACY for sons who, too, would circulate among other literate men. Popular literacy would spread in the wake of the PROTESTANT REFORMATION after the sixteenth century, as the individual's ability to interpret scripture was critical to personal salvation in nascent Protestantism. Yet it was presumed at that time, and for centuries to follow, that parents, especially fathers, would be responsible for instructing their children in the basics of literacy. Therefore, the formation of schools for the masses, so crucial to spreading ideas about age and development, still lay far in the future. How prevalent, then, was sensitivity to gradations of chronological age and development in the everyday lives of common people? In other words, how common were these ideas among those who constituted the great bulk of humankind, rather than in the observations and judgments of jurists, moralists, and philosophers?
Certainly in agrarian societies children's physical development was not lost on households that needed the labor of every hand. Work and need were always in greater abundance than the capacity to meet them. The ability of children to do small chores was helpful, but eagerly anticipated was the day when a young person could perform the tasks of an adult. In societies in which child mortality was high, disease shortened the adult life span, and illness, accident, or misfortune truncated so many working lives, the individual physical capacities of children must have been contemplated with the passing of each season. It is reasonable to believe that parents and kin were at least grossly attuned to the bodily and emotional changes of young ones, even if without the kind of sensitivity to children's growth that prevails in modern life. Nonetheless, the arc of children's growth as a social, rather than a physical, phenomenon only faintly resembled what we are accustomed to today.
One of the most significant reasons for this is that the productive bases of the household lay in agrarian rather than industrial pursuits. A striking difference between agrarian societies and later industrial ones is that agrarian societies did not recognize adolescence as a separate physical and psychological stage in an individual's advance toward adulthood. The addition of adolescence as a recognized life stage would not become common until the beginning of the twentieth century.
The absence of widespread recognition of adolescence in agrarian societies is arrestingly evident in two descriptions of children's coming of age. The first is historical and is offered by John Demos about children in colonial Massachusetts in the seventeenth century:
Once the child had begun to assume an adult role and style, around the age of six or seven, the way ahead was fairly straightforward. Development toward full maturity could be accomplished in a gradual, piecemeal, and largely automatic fashion.… Here was no "awkward" age–but rather the steady lengthening of a young person's shadow, and the whole instinctive process through which one generation yielded imperceptibly to its successor. (p. 150)
The second description is rendered by educator and ethnographer Leonard Covello about attitudes toward young people's development as expressed by southern Italian immigrants to the United States during the 1920s. Notice the similarity to Demos's observations: again, growing up is described as a process of incremental, seamless emergence into adult society:
Under the southern Italian cultural patterns, all children were useful and effective members of their families from an early age [five or six]. As the child became older and increased in physical strength and experience, in judgment and dependability, he performed more numerous and more difficult tasks.… There were no sharp age divisions; each shaded into the older and younger. So general was the pattern of life where children fitted into family life and its economy that all people were divided into two groups: children and adults. There was no adolescent group.… There were helpless infants and playful tots, young men and women, feeble folk, but there was never a group of adolescents. (pp. 270, 288-289)
As John Gillis points out, until the industrial era relations between the generations within the household were arranged spatially rather than temporally. That is, one's status derived not from achieving a certain age but from one's relation to the head of the household. In a very real sense, as Gillis says, until one became the head of one's own household, a man or a woman was ever a boy or a girl. Still, in a functional sense, a boy's or girl's maturation propelled him or her toward full membership in the economy of household relations at a steady rate, and so their subordinate status was also surely provisional and relative.
This all began to change, however, with the reorganization of production initiated by industrialization, which in turn stimulated the rapid growth of cities. Cities and their industries were fed by the migration of people, primarily from the countryside–whether the hinterlands from which these migrations issued were local, regional, national, or intercontinental in nature. Swelling nineteenth-century cities revealed two developments relevant here: the growing degree of household dependence on waged labor and the extent to which children's time, in particular, was unstructured. Children on farms were of no concern to anyone except their own households; children present on city streets and underoccupied, however, made apparent the social costs of production's reorganization under burgeoning industrialism.
As capitalist economies grew and their commodities came increasingly from the expansion of mechanized industries, cycles of prosperity and depression became more frequent, making unemployment a recurrent feature of urban working-class life. Cities concentrated unemployment and poverty and so rendered their victims both visible and anonymous. Despite the eagerness of civic leaders in Europe and, especially, the United States to make individual family heads accountable for their own and their children's penury, eventually it was seen that some accommodation must be made for children and youths whose time and activities were underutilized. During the nineteenth century a connection would be drawn between the presumed moral effects of unoccupied time for children and its broader social consequences: a child of tender age enrolled in the "street school" of idleness, it was perceived, would inevitably become a soul-hardened adult through frequent encounters with poverty and its twin, vice, both familiar escorts on the road to a life of crime. While the histories of social welfare in Europe and North America are variegated and complex, one common solution to children's exposure to idleness and its reputed effects was schooling.
Schooling, of course, owed its beginnings to ancient civilizations, and in the Middle Ages schools were instituted for the purpose of ecclesiastical recruitment and training. But again, the proportion of children (boys) educated for this purpose was minuscule. Popular schooling, which came to be seen as a panacea for the widening plagues of child unemployment and unsupervised time, could only develop if education were free. The establishment of mass schooling, which was to inculcate age consciousness to an unprecedented degree, was contingent upon two occurrences: the first was to gather all children into schools of some kind, and the second was to arrange schools internally so that children would be kept in school for as long as was desirable for them individually and for the greater social good. The first examples of efforts to attract the great majority of children (typically children of the emerging working class and of the urban poor) into schools for the purpose of instilling morality and the fundaments of literacy were the "charity," "Sunday," and "infant" schools of the late 1700s and early 1800s in Western Europe and the United States.
In the United States, each of these examples in its way inspired the creation of the COMMON SCHOOL, which offered education at no or low cost to children of all social (if not racial) backgrounds, of all ages, and of both genders. The school class, which sorted pupils by age and achievement into cohorts, was pioneered in early sixteenth-century London. Yet the necessity for widespread adoption of this device (grading) was not realized until it became clear that common schools were problematic for both the working class and the middle class. Working-class parents needed their children to contribute to household income by being employed; middle-class parents desired a greater degree of training that would allow their children to master the skills necessary to become accountants, teachers, clerks, businessmen, or professionals. These skills, which included verbal and quantitative facility, familiarity with the arts and letters, and fluency in a foreign language, were transportable from industry to industry. By arranging the public school curriculum sequentially during the second half of the nineteenth century, grading endowed the student who progressed through it with great usefulness in an economy in which proprietorship was rapidly being eclipsed by salaried employment. Because grading tended to standardize the curriculum wherever it was implemented, it also certified a level and range of competency at each grade.
Still, pressure on working-class children to quit school to meet the household's need for income or labor was irresistible, in times of economic depression and prosperity alike. Therefore, it was decided that school attendance must be made not only free but also compulsory if the majority of children were to be educated for an extended period of their lives. The establishment of COMPULSORY SCHOOL ATTENDANCE and free, graded schools in the United States took many decades to accomplish. Compulsory school attendance was first instituted by Massachusetts in 1852. The last state to pass a compulsory school attendance law was Mississippi in 1918. Of course, compulsory education legislation was not effective without complementary reinforcement by child labor laws, and most forms of CHILD LABOR were not prohibited throughout the United States until the mid-1930s. Similarly, the grading of schools began in Boston in 1847 and spread throughout the northeastern and midwestern United States but was not pervasive until the 1940s.
Nonetheless, where effectively enforced, compulsory school attendance reversed the "wealth flow" between parents and children. By forcing households to withhold their children's earning power from the sphere of commerce, it altered children's status, and parent-child relations were transformed from a spatial to a temporal orientation. That is, in agrarian societies children typically worked for the benefit and well-being of the entire household and of the patriarch in particular; with compulsory school attendance and child labor laws, children could not legally be put to work until the age of fourteen, which curtailed their value as assets to the household economy.
At the same time, new literary and religious sentiments took hold among the middle class that led to children becoming "sacralized." That is, childhood came to be viewed as a special state, and children were increasingly seen as a resource to be preserved, cherished, and celebrated. When this view was imposed on the working class through school requirements, the value of children within the household shifted from their utility as earners to objects of parental sentiment, and assumed a quality that Viviana A. Zelizer has called "pricelessness." One consequence of this was that at the conclusion of the nineteenth century children's BIRTHDAYS began to be celebrated annually for the first time in history. Some societies in the past had recognized specific chronological markers, such as turning age fifteen or age sixteen, but the habit of commemorating the arrival of each new year in the life of the child betokened a novel, sentimental regard for the passing of time. The accrual of experience and competence, which so enhanced child utility in the family wage economy, came at the expense of innocence and thus was viewed as a bittersweet process in the era of "priceless" children.
At the most basic level, the experience of grading revealed the extent to which children's expanding physical and intellectual aptitudes could be captured statistically and mapped onto entire groups of children. "Child accounting," which included statistics on enrollment, attendance, absenteeism, tardiness, and withdrawal from school, was implemented in most northeastern and midwestern urban school systems by the latter decades of the nineteenth century. Age-grade tables, which documented the degree to which children's ages in individual schools corresponded to specified grade levels, were developed by the end of the century as well. All of these indicators implicitly measured the school's pedagogical effectiveness, but they also betrayed the spreading conviction that children's maturation could be normatively defined and so, too, subjected to corrective management. The peer group, previously more socially porous and tolerant of age span, was tightened to exclude children whose ages were too far above or below those set down by the cohorts created and maintained through the age-graded school class.
Was this sense of a schedule imposed on children by age-grading straightforwardly internalized by them, or did children's increasing awareness of themselves as being in or out of step with their classmates emerge through a process of give-and-take between the school's reinforcing structure and conceptions of development growing up alongside this new apparatus? It is important to recall that the philosophical convictions that undergirded compulsory school attendance, the prohibition of child labor, and a dawning awareness of children's biological and mental development were rooted in ENLIGHTENMENT notions of children's plasticity as growing creatures. English philosopher JOHN LOCKE's essays on the necessity for parental watchfulness and constant correction of the child's natural instincts were grounded in the idea of human perfectibility, which suggested the opposite possibility: that children left unmonitored and untutored would fall well short of this goal. Eighteenth-century French writer and philosopher JEAN-JACQUES ROUSSEAU's writings on child rearing instead promoted yielding to the child's natural proclivities as the best guide to parenting. While Rousseau was effectively an antidote to Locke, both views shared the assumption that children's growth revealed a progression of changes, for ill or good. A scientific basis for this conception was stimulated by Charles Darwin's The Descent of Man (1871), which spawned a generation of studies that sketched out parallels between the course of early human history and the phases of the child's development. Nineteenth-century English philosopher Herbert Spencer added to this the notion of human progress.
Thus, the course of child study by the end of the 1800s was committed to charting the progressive appearance of cognitive, emotional, and motor functions in children from birth to adulthood. It is possible that novel theories of child development and new opportunities to observe the mass of children arranged in cohorts enabled both educators and the new discipline of psychology to formulate and test the idea of stages of child maturation precisely because children were now sorted in a way that made it possible to generalize about development with some accuracy for the first time.
The arrival of adolescence as a widely acknowledged stage in human growth signaled the successful linkage of ideas about aging and development current by the beginning of the twentieth century; and again it was schooling that provided this connection. For without schooling, the significance of adolescence would always be overshadowed by the imperatives of the family economy in working-class households. American educator and psychologist G. STANLEY HALL, who popularized the "discovery" of adolescence in 1904, when only 7 percent of all seventeen-year-olds graduated from high school, provided the rationale for this institutionalized cultural space in children's development. HIGH SCHOOL expanded exponentially thereafter: by 1940 almost half of all seventeen-year-olds earned diplomas nationally.
For the remainder of the twentieth century the nascent discipline of academic psychology vacillated on the question of whether nature or nurture is more important to children's development. Behaviorists such as American psychologist JOHN WATSON insisted on the critical place of environmental influence in children's development, while the rival view stressed the significance of biological and genetic blueprints for human maturation. American psychology for the first half of the twentieth century was Watsonian in orientation, but this perspective was broken by the ascendance of Swiss psychologist JEAN PIAGET, who proposed a more synthetic interplay between genetic endowment and social environment.
Apart from this preoccupying (and ongoing) issue, however, two other trends arose: (1) The stages of development from birth through adolescence were divided and subdivided as specialization within the discipline proceeded; and (2) there was a tendency to privilege one stage of development over others as the "focal period." Infancy, middle childhood, and adolescence have been stressed variously as the periods of development that impart lasting critical effects on later outcomes for the individual. This line of inquiry led developmental psychologists to cast their eyes to later stages of human development and aging to try to untangle cause-and-effect relationships between stages of maturation.
Hall's disciples, such as Frederic Thrasher, drew on observations of boys' development to justify the creation of a host of organizations ancillary to schooling, such as the PLAYGROUND MOVEMENT, youth athletics, scouting, and other after-school activities designed to engage children and youths in adult-supervised settings. Child and youth recreational activities adopted the idea of age groupings as an article of faith in organizing athletic competition, reflected in a wide range of activities, from New York's Police Athletic League early in the 1900s to the most popular participatory sport for boys, Little League Baseball, at midcentury. By the 1920s, athletic competition was commonplace in American high schools, and they, too, arranged contestants into graduated ability and age groups, typically dividing them into junior varsity and varsity teams.
American educator and philosopher JOHN DEWEY and his followers promoted the idea of schooling as preparation for life. The purpose of education was not just to direct intellectual inquiry but to promote "desired social ends" as well (Cahan, p. 157). While the aims of Dewey's philosophy of PROGRESSIVE EDUCATION were varied and only ever implemented in a limited way, the idea of child development was a central feature and enabled school reformers to promote the extension of schooling for virtually all young people through the adolescent years by creating a curriculum that addressed their differentiated educational and social needs.
A by-product of this philosophy was the concept of "social promotion," which solidified the connections between the social and educative functions of schooling by aligning chronological age, achievement levels, and the cohort more loosely through secondary school. Social promotion legitimized advancement from one grade level to the next by pointing to its positive social effects on child development, as compared with the outcomes of promotion based strictly on academic merit. By the early twentieth century the academic-merit system was seen as detrimental to both the educational and social needs of the individual. On the face of it, social promotion, which peaked as an educational policy during the 1970s, would seem to mitigate the heightened consciousness of age induced by age-grading, but it actually reinforced the perceived necessity for taut coordination between chronological age and cognitive and emotional development. The child's, and then adolescent's, concept of self had become so tightly bound up with the progress of his or her age cohort that developmental progress itself was seen as being fostered by keeping up socially, if not educationally, with one's cohort.
Not coincidentally, sociologists and historians during the 1960s and 1970s began to trace the broader outlines of the normative movement from childhood to adulthood in American society. Inspired at least in part by the popular perception that young people were taking longer to come of age–that is, to assume the roles and responsibilities of adult status–they set out to determine whether social norms involving transitions to adulthood had altered in the period between the mid-1800s and the mid-1900s. What they found was that the age at which the final transition to adulthood (family formation) occurred in the mid-twentieth century was roughly the same as it had been at the midpoint of the nineteenth century. However, the experience of coming of age–the turbulence one felt in moving from childhood to adulthood, especially in adolescence and the postadolescent years–was actually amplified by the rise of institutions to regulate these transitions. Studies by John Modell and others noted that the establishment of universal schooling during the mid-nineteenth century had created a roughly uniform starting line from which children entered and left school, entered the full-time work force, left their parents' household, married, and began their own families. Whereas before the late nineteenth century a young person could occupy more than one of these states at the same time (i.e., be both at school and working, or working, married, and living in one's parents' household simultaneously), by the early to middle twentieth century the transitions between them became much more tightly sequenced and separate. Widely observed age norms appeared to coordinate the individual's transition from childhood to adulthood, and the consequences of mixing these states had become (at least until the 1940s) detrimental to one's life chances.
Marrying and remaining in one's parental household, for instance, had come to be regarded as unusual, where it had once been an expected condition in agrarian societies in which the first-born male would eventually inherit the property of the household head. This was no longer the case once children were schooled to the age of fourteen (sixteen by the 1930s) and then worked until they could afford to establish their own households, marry, and procreate.
By the 1950s, however, this sequencing of transitions began to come apart as marriage ages dropped in the wake of World War II and schooling was extended past the age of sixteen, with larger numbers of young people entering and finishing college. To pay for higher education, many college students had to work while attending classes, and it was not unusual for some students to start their own households, marry, and even begin bearing and raising children while enrolled in college. In short, by the 1960s and 1970s a once highly ordered set of transitions to adulthood overlapped extensively. This meant that, on the one hand, individuals had wide latitude in the choices they could make about completing school, starting work, and getting married; on the other hand, they experienced greater anxiety, both individually and socially, about making and ordering these choices.
Since the 1970s several trends affecting the early life course have reversed: ages of first marriage for women and men returned to their traditional averages, the middle twenties and late twenties, respectively; the average age of women at the birth of their first child increased; the birth rate plummeted; and a new trend, cohabitation, rose steadily during the last quarter of the century, suggesting that men and women no longer felt compelled to remain living with their parents immediately before establishing their own households and could defer both marriage and family formation. By the late twentieth century the early life courses of women and men had grown more similar. Where previously early life-course transitions had been based on the assumption of a male breadwinner, male and female education levels evened out during the last quarter of the century, and the wage gap between them narrowed somewhat.
If the tightly sequenced, non-overlapping set of transitions to adulthood attested to a high degree of age consciousness at midcentury, did their unraveling intimate a loosening concern with age? Historians and social critics see mixed signs. On the one hand, there appears to be less age segregation than there was earlier in the century, when, for instance, universities admitted few undergraduates over the ages of eighteen or nineteen. Older Americans have increasingly reentered the ranks of college students, and educational "retooling" in a rapidly changing economy has become a widely accepted practice for people who may change careers several times in their lives. It has also become more common for women to return to the workforce after bearing and rearing children. On the other hand, a YOUTH CULTURE flourished in the 1960s, fed by the expansion of higher education after World War II and the influx of the so-called BABY BOOM GENERATION into U.S. universities; it glorified youthfulness to an unprecedented degree. This preoccupation with the concerns of youth and with images of youthfulness was intimately connected, for the first time, with consumer products for a mass audience, and it accompanied the abandonment of the previous generation's mortal fear of spending. The trend only gained momentum during the late twentieth century as this large cohort aged, creating new markets for consumers–whether truly young or young at heart–to express their feelings of affiliation with the young.
See also: Child Development, History of the Concept of; Child Psychology; Life Course and Transitions to Adulthood; Theories of Childhood.
Ariès, Philippe. 1962. Centuries of Childhood: A Social History of Family Life. Trans. Robert Baldick. New York: Vintage Books.
Cahan, Emily D. 1994. "John Dewey and Human Development." In A Century of Developmental Psychology, ed. Ross D. Parke et al. Washington, DC: American Psychological Association.
Caldwell, John C. 1980. "Mass Education as a Determinant of the Timing of Fertility Decline." Population and Development Review 6 (June): 225-255.
Chudacoff, Howard P. 1989. How Old Are You? Age Consciousness in American Culture. Princeton, NJ: Princeton University Press.
Cole, Michael, and Sheila R. Cole. 2000. The Development of Children, 3rd ed. New York: Scientific American Books.
Covello, Leonard. 1967. The Social Background of the Italo-American School Child: A Study of the Southern Italian Family Mores and their Effect on the School Situation in Italy and America. Leiden, Netherlands: E. J. Brill.
Cunningham, Hugh. 1990. "The Employment and Unemployment of Children in England, c. 1680-1851." Past and Present 126 (February): 115-150.
Cunningham, Hugh. 1995. Children and Childhood in Western Society since 1500. New York: Longman.
Demos, John. 1970. A Little Commonwealth: Family Life in Plymouth Colony. New York: Oxford University Press.
Elder, Glen H. 1974. Children of the Great Depression: Social Change in Life Experience. Chicago: University of Chicago Press.
Fass, Paula S. 1977. The Damned and the Beautiful: American Youth in the 1920's. New York: Oxford University Press.
Kessen, William, ed. 1965. The Child. New York: John Wiley and Sons.
Labaree, David F. 1997. How to Succeed in School without Really Learning: The Credentials Race in American Education. New Haven, CT: Yale University Press.
Lassonde, Stephen. 1996. "Learning and Earning: Schooling, Juvenile Employment, and the Early Life Course in Late Nineteenth-Century New Haven." Journal of Social History 29 (Summer): 839-870.
Levine, David. 1987. Reproducing Families: The Political Economy of English Population History. New York: Cambridge University Press.
Modell, John. 1989. Into One's Own: From Youth to Adulthood in the United States, 1920-1975. Berkeley: University of California Press.
Modell, John, Frank F. Furstenberg, Jr., and Theodore Hershberg. 1981. "Social Change and Transitions to Adulthood in Historical Perspective." In Philadelphia: Work, Space, and Group Experience in the 19th Century, ed. Theodore Hershberg. New York: Oxford University Press.
Parke, Ross D., Peter A. Ornstein, John J. Rieser, et al. 1994. "The Past as Prologue: An Overview of a Century of Developmental Psychology." In A Century of Developmental Psychology, ed. Ross D. Parke et al. Washington, DC: American Psychological Association.
Shahar, Shulamith. 1990. Childhood in the Middle Ages. New York: Routledge.
Vinovskis, Maris. 1995. "Historical Development of Age Stratification in Schooling." In his Education, Society, and Economic Opportunity: A Historical Perspective on Persistent Issues. New Haven, CT: Yale University Press.
Zelizer, Viviana A. 1985. Pricing the Priceless Child: The Changing Social Value of Children. New York: Basic Books.