Although parents and students of human development have been observing children for millennia, researchers in America and Europe began to conduct systematic studies of childhood behavior around the turn of the twentieth century. Prior attempts to codify normal development had been published as diaries that described the behavior of a single child, usually the son or daughter of the author. In 1787, for example, German philosopher Dietrich Tiedemann documented the growth of a child's intellectual abilities; a century later, German physiologist Wilhelm Preyer authored elaborate essays that described the development of both the embryo and the young child. In 1877, even British naturalist Charles Darwin published the observations he had gathered during his son's first two years.
During the late nineteenth century, American psychologists conducted the first objective evaluations of large groups of children. These researchers were committed to egalitarianism; that is, they held the idealistic hope that most children could become responsible adults if their early family experiences had been optimal. Yet the psychologists' studies revealed dramatic variation among children in their intellectual ability, school achievement, and character. The troubling results motivated the researchers not only to document the magnitude of the variation, but also to attempt to explain why so many children had not attained a minimum proficiency level.
Most psychologists working during this first phase of systematic inquiry held five implicit premises that were consistent with scientific thought during the eighteenth and nineteenth centuries. The first was the belief that any changes in the psychological properties of children occurred gradually rather than abruptly. This belief was consonant both with the views of eighteenth-century mathematicians and philosophers and with Darwin's conviction that evolution was a gradual process.
The second premise was that a child's psychological traits were due in large part to the profile of rewards and punishments administered by adults, especially parents. This assumption rested on the belief that children acted in order to maximize pleasure or to minimize pain, a view asserted two hundred years earlier by philosopher JOHN LOCKE and promoted in the 1920s by the American behaviorist JOHN WATSON. According to this premise, actions that brought the child pleasure would be strengthened and repeated while actions that brought pain would be weakened and discontinued. American parents were told, and many believed, that their treatment of their child would determine his or her personality, talents, and character.
At the end of the nineteenth century, however, the American psychologist James Mark Baldwin represented the views of a growing minority of researchers who recognized the importance of reasoning, language, and symbolism in a child's development. He suggested that the influences of pleasure and pain were ascendant only during infancy. As children matured after age two, Baldwin asserted, they began to distinguish between right and wrong and implicitly asked, "What should I do?"
During the early decades of the twentieth century, however, Watson's behavioristic principles continued to be favored over Baldwin's theories. Scientists and journalists published accounts that described the inferior academic achievements of European immigrants to the United States. These authors attributed the immigrant children's poor school performance and deviant behavior to their inherited propensities. Not only did this fatalistic explanation trouble the egalitarians, but it also motivated other researchers to deny the importance of biological factors and enthusiastically emphasize the role of social experience.
Besides political tensions related to immigration, a second reason for the continuing popularity of behaviorism was psychology's status as a new scientific discipline. Psychologists wanted to present their field to biologists and physicists as an experimental and rigorous science, distinct from philosophy and pruned of all metaphysics. Theories that emphasized the conditioning of habits lent themselves to elegant empirical demonstration and appealed to young faculty who were beginning careers in developmental psychology. Thus, by the late 1920s, the learning of new habits through conditioning and the proper application of reward and punishment had become the primary way to explain both the appearance of universal characteristics and the variation in these and other features. This behavioristic view persisted until the middle of the twentieth century.
The third premise favored a strong connection between childhood habits and moods and those of adulthood, asserting that behavior and emotions acquired during the first years of a child's life could be preserved indefinitely. Intellectual retardation that hampered the acquisition of reading and arithmetic, and asocial habits that led to a delinquent career, were two symptoms that caused societal concern. Some commentators claimed that the intellectual profile of every adult had its origins in infancy. Others warned parents not to take their young children to the movies because the film's scenes might be overly stimulating and thus produce an undesirable trait years later. A faith in connectedness was consistent with egalitarian principles, however, for it implied that if one could arrange similar growth-enhancing experiences for all infants and young children, every citizen could attain an ideal profile of abilities, beliefs, and emotions. Both the premises of gradual change and connectedness were consonant with eighteenth-century scientists' attraction to historicism, the belief that in order to understand any phenomenon, one had to know its complete history.
The fourth premise was that the mother represented the most important influence on the child's growth. Although ancient Roman, medieval, and Renaissance scholars all believed that the father played the more important role, John Locke and subsequent thinkers insisted that it was the mother's relation to the child, and her socialization practices, that had greater potency. This assertion appealed to Americans because of the enhanced significance of American women in the families that had left the East Coast to settle in the Appalachians and the Midwest. Middle-class women in European cities were less necessary than those among the pioneers who were settling throughout Tennessee, Kentucky, Ohio, Indiana, and Illinois. These isolated families required a woman's labor, loyalty, and affection in order to survive.
The final premise, an affirmation of John Locke's declaration that children love liberty, assumed that a child's freedom was the most important quality to nurture. This imperative was hidden in psychological essays on the importance of PLAY and the encouragement of personal autonomy. One commentator suggested that an infant who protested the mother's removal of a nursing bottle was showing the first sign of a defiance of authoritarian control that should be encouraged. The most popular developmental textbooks of the 1930s stated that children should be emancipated from parental control and allowed to free themselves from a close emotional attachment to their family. This idealization of individual freedom was one reason why children's play was a popular topic of research. It seemed obvious to many that when a child was playing he was maximally free, and according to one expert, child's play was the foundation of American democracy.
It is possible to discern five historical eras in the study of children over the last century. The first, from approximately 1900 to 1925, is distinguished by the study of differences among children in intellectual ability and character, motivated largely by concern for the many immigrant children who were failing in school and committing crimes. Chicago established the first JUVENILE COURT for delinquents in 1899, and in the following decade, the Judge Baker Children's Center in Boston was the first institution to attempt a scientific study of the causes of DELINQUENCY.
The second phase, which occupied the next twenty-five years and was theoretically consonant with the first, was marked by the influence of Freudian theory. Psychoanalytic ideas seemed to be intuitively correct to large numbers of psychiatrists, psychologists, and educated parents. The latter brooded over whether they should breast- or bottle-feed, when to wean their nursing infant, how to TOILET TRAIN, whether to SLEEP with the child, and how to handle the child's sexual curiosity. SIGMUND FREUD's notions were popular with the public because he left intact most of the nineteenth-century views of human nature, altering only the less essential features.
For example, nineteenth-century scientists believed that humans varied in the amount of energy available for psychic activity. Each person's brain was supposed to possess a fixed amount of energy, and psychological symptoms could appear if the individual depleted this resource. George Beard, a neurologist, coined the term neurasthenia in 1869 to describe individuals who experienced tension, depression, and insomnia because their brains ran out of energy. Freud accepted the popular understanding that each person inherited a fixed amount of energy, but he attributed the depletion of energy to the repression of libidinal instincts rather than to excessive mental work. A person who used energy to repress sexual impulses would therefore have less energy available for adaptive work.
Another popular belief Freud exploited was that early experiences influenced personality development and, therefore, the possibility of acquiring symptoms. Freud accepted the significance of early childhood, but he made the improper socialization of sexual impulses, rather than obedience, the major cause of symptoms. Freud took advantage of the popular belief that excessive bouts of sexual pleasure were dangerous and that frequent MASTURBATION or an obsession with sex could cause insanity or mental retardation.
The final feature of nineteenth-century thought was the belief that physical therapeutic interventions, such as cold baths, herbs, and electrical stimulation, could alleviate psychological problems. Freud substituted psychological therapies, insisting that patients could gain insight into the causes of their repression by telling their therapist their deepest thoughts.
The third phase is characterized by the cognitive revolution, which was initiated by American linguist Noam Chomsky's radical critique of the behaviorists' interpretation of language acquisition and continued by Swiss psychologist JEAN PIAGET's extensive research. The growing dissatisfaction with the demonstrated limitations of conditioning theory rendered child psychologists receptive to Piaget's rejection of the conditioning assumptions and his emphasis on the child's autonomous behavior. Piaget replaced Watson's passive child with one who is cognitively active in acquiring knowledge, initially through manipulations of objects and, later, through the manipulations of ideas. Echoing Baldwin, Piaget insisted that the child was continually trying to construct the most coherent understanding of an event. Surprisingly, even though he adopted a stage theory, Piaget was loyal to the doctrines of gradualism and connectedness, and he minimized the importance of brain maturation. Although Piaget acknowledged that each infant was born with some sensorimotor functions, he wished to award biology as little power as possible. Some scholars have speculated that Piaget made encounters with the environment, rather than biology, the primary sculptor of growth because he wanted to base human morality on a history of experiences.
The fourth phase is defined by the research of British psychiatrist JOHN BOWLBY, who introduced the concept of infant attachment. In this sense, attachment is an emotional connection to the person who cares for the infant, created by the infant's pleasant experiences in the caretaker's presence and by a reduction in distress when she returns. The broad interest in Bowlby's speculations on infant attachment was partially due to the large numbers of American mothers who had begun placing their infants and young children in surrogate care in order to join the work force after World War II. This new social arrangement violated the normative nineteenth-century conception of a mother who remained at home to care for her brood of children. The public was receptive to a wise scholar who believed that the young infant should develop an emotional attachment to a single caretaker. Bowlby's presumption that these early attachments represented the hub around which a person's life revolved promised to reduce the tensions caused by greater geographic mobility, a higher percentage of working mothers, and the increasingly strained relationships among people and between citizens and their community.
Although many nineteenth-century observers would have understood Bowlby's theories, and probably agreed with them, few would have written three books on attachment, because the idea seemed as obviously true as the fact that the sky is blue. Bowlby's conclusions became newsworthy in the last half of the twentieth century, however, because historical events had led many to question the inevitability of maternal devotion to the child and children's love for their parents. Newspaper headlines that described parental abuse and adolescent children killing their parents undermined the nineteenth-century faith in the naturalness of parental love. Citizens were saddened by these new conditions and were eager to hear a psychiatrist declare that the love between child and parent was a requisite for psychological health.
Every society needs some transcendental theme to which its citizens can be loyal. In the past, the existence of God, the beauty and utility of knowledge, and the sanctity of faithful romantic love were among the most sacred ideas in the American ethic. The facts of modern life had made it difficult for many Americans to remain loyal to those ideals. The sacredness of the bond between mother and infant persisted as one of the last beliefs that remained unsullied. The large number of books and magazine articles on the attachment of infant to mother, and on the necessity of skin-to-skin bonding with the mother in the first postnatal hours, generated strong emotion, suggesting that something more than scientific fact was prompting the discussion. If an infant could be cared for by any concerned adult, the biological mother was expendable, and one of the few remaining ethical imperatives would be at risk.
Biology has returned to the study of children during the last two decades as a result of elegant discoveries in genetics, molecular biology, and neuroscience. The enthusiasm for biological influences assumes two forms. It is represented first by descriptions of the biologically prepared competencies of infants and young children in the opening years of life. These include infants' attentional preferences for certain kinds of stimuli (e.g., an attraction to contour, motion, and curvature); the enhancement of memory and the appearance of imitation later in the first year; and the emergence of language, a moral sense, and self-consciousness in the second year. Each of these developments is inevitable as long as children live in a world of objects and people. None requires the regimen of rewards and punishments that behaviorists had described as essential a century earlier.
A second form of biological influence involves the study of human temperaments, which the American psychiatrists Alexander Thomas and Stella Chess reintroduced to researchers during the late 1950s. Neuroscientists speculated on the reasons for variation among infants in traits such as irritability, activity, or fearfulness. These speculations centered on inherited variation in the neurochemistry of the brain. The social environment was assumed to influence each infant's temperament to produce the personality of the older child.
With these advances, developmental psychology has come full circle; the early researchers' diaries also emphasized the common psychological properties that emerge in all children who grow up exposed to people and objects. We have learned, however, that the time of emergence of each of these competencies corresponds closely to maturational events in the brain. Both biological and experiential influences contribute to growth; consequently, an attempt to synthesize both forces, necessary for complete understanding, will dominate research and theory in the decades to come. This synthesis will require one vocabulary to describe the biological events and another to describe the psychological phenomena. A behavior, thought, or feeling is the final product of a series of cascades that begins with an external event, thought, or spontaneous biological change. The forms that make up each succeeding cascade have to be described with a distinct vocabulary. Genes, neurons, and children require distinct predicates because each has unique functions: genes mutate, neurons inhibit, and children act. Developing an understanding of these issues will dominate the future of child psychology.
Bowlby, John. 1969. Attachment and Loss: Vol. 1. Attachment. New York: Basic Books.
Cairns, R. B. 1998. "The Making of Developmental Psychology." In Handbook of Child Psychology: Vol. 1, 5th edition, ed. R. M. Lerner and W. Damon. New York: Wiley.
Freud, Sigmund. 1957. A General Selection From the Works of Sigmund Freud. New York: Liveright.
Kagan, Jerome. 1983. "Classifications of the Child." In Handbook of Child Psychology: Vol. 1, 4th edition, ed. W. Kessen and P. H. Mussen. New York: Wiley.
Piaget, Jean. 1951. Play, Dreams and Imitation in Childhood. New York: Norton.