What makes us human? | An incomplete but operative definition
What does it mean to be human?
I still remember that Halloween evening; my father’s hand resting on top of my head as I lay next to his immobilized body in the ICU of the Los Angeles Veterans’ Affairs Hospital. After several merciless weeks in and out of intensive care and countless tests which yielded little to no information on my father’s deteriorating condition, all I could do was be next to him, be still, be present, despite the crushing agony that came with witnessing the better part of me lose more life with each passing breath.
I still remember, as a child, falling asleep to the consoling sounds of his Earth-tremoring snoring reverberating down the wooden hallway of my aunt’s house. My father was my backbone. He left his job as an engineer to be a full-time, stay-at-home dad, and he was, undoubtedly, the Ultimate Dad, reluctantly defying gender norms despite his conservative, military upbringing; the creative Dad who learned how to sew an “ice skating dress” together for my first competition because he could not afford to buy one; the problem-solving Dad who made ice skating dresses so ugly that people felt bad enough to give me their old ones; the prideful Dad who lived under his wife’s sister’s roof for several years, giving up his full-time income to be a full-time father.
I can still feel the rapid palpitations of my heart as I ran out of my father’s hospital room, aggressively calling for a nurse to attend to him after watching him gasp for air. Then, his eyes rolled to the back of his head. The only nurse I came across, visibly irritated by my plea, curtly asked me to wait in the room, saying she would call the doctor for me. The obedient, other-serving side of my personality ever so reliably kicked in, and I returned to his side, questioning whether what I had witnessed warranted such alarm. But after several minutes of carefully scrutinizing my father’s irregular breathing and his lack of responsiveness to my voice, I decided to call for assistance again, this time with a strengthened determination. Turned away a second time by the same nurse, who casually sat at her desk staring at a computer screen, I instinctively began to panic and sought help from anyone else. Unfortunately, no one else was around. Within a few moments, a litany of “code blue” alarms rang, one after another, throughout the hallways from four different patients’ rooms. A swarm of women and men in white coats ran into the room furthest down the corridor. Four doctors stood at the door and watched as two others desperately attempted to resuscitate the dying patient. I watched as they failed. A nurse responding to the call hurried past me, and I immediately called to him.
“Please! Can you look at my dad? I think he’s dying!”
The nurse ran into my father’s room, checked his vitals, and immediately pushed the metal emergency “code blue” button behind his bed. I had never seen anyone actually push that button before. I never want to again. The sound of his alarm cascaded between four others; a cacophony that fell and ultimately drowned beneath a mountain of desperation. By the time they reached my father, it was too late. He died the next day.
That evening, I witnessed two Veterans die within an hour, and my father was one of them. They did not have to die that night. Their fate was written into being by a less-than-adequate institution: poor staffing, breaches in protocol, or perhaps the absence of a protocol in the first place. And ultimately, even the glaring disparities in the VA’s protocols are only symptomatic of the deeper, more insidious ideological problems its massive bureaucracy faces.
After my father’s death, I plunged myself into a black hole of endless statistics and stories from the frontlines about medical malpractice and institutional passivity in an attempt to understand where responsibility lay, and the more I learned, the more familiar the terrain became: the short-term solutions, the failure to evolve systems, the uniformity of treatment, the unquestioned adulation for authority, the idealization of obedience, the systematic desensitization of people, the moral and ethical absenteeism, the bureaucratic convolution, and the abdication of humanity in appealing to the status quo. Upon reflection, I realized the problems in public health mirrored, almost exactly, the problems in public education, and such instantiations are made pervasive in every other institution connected to them: our justice system, prison system, political system, our social system.
It took nearly four years for me to forgive myself for my father’s death; for not initially ringing the alarms amidst my father’s silent cries for help; for not responding in a manner, in retrospect, that might have extended his life such that he would be sitting beside me today. Yet, my process left me with more questions than answers. What stopped me from turning around and demanding assistance? What prevented me from running over to the crowd of doctors less than 200 feet away and begging them for help? What prevented me from demanding a change in behavior from the nurse, or from pushing the emergency button myself? In the same way, what impedes those in service from prioritizing people over protocols? What obstructs those in positions of power and influence from challenging archaic, inefficient, and sometimes inhumane institutional practices in the creation of new, improved systems? Perhaps the answers to our inhumanity lie in our humanity itself.
On Humanity
It is so easy to forget; to be ignore-ant; to be surprised by the immense ingenuity, intelligence, and compassion that exists in all of us. We’ve all likely seen the YouTube videos. Is it really a surprise when a homeless person (a label which we forget is situated in transience and not permanence) sits down and, like a virtuoso, expertly and emotively plays a piano on the street? Or is it more remarkable that a body of people could be so mystified by the sight? Or, when a person of modest means accepts money offered to them, buys food, and shares it with friends (unaware that a not-so-forthcoming videographer rejects the idea that a less privileged person would be so generous, or even have a right to privacy)? Or when a teenage girl invents a vaccine for a dangerous virus, ultimately saving millions of lives, is it really that astonishing? Or is it more astonishing how our entrenched, reductive, deficit-based ideologies continue to incapacitate our ability to recognize that “they,” like “us,” are human, with all of the creative, critical, communicative, communal faculties that come with the name? It is indeed remarkable, but for a different reason, and it is these deficits in our thinking that must be challenged in our endeavor to understand and extend humanity to and from ourselves.
As an educator, I’ve come to recognize the fundamentality of learning and discussing humanity as concept, action, philosophy, and framework. We cannot tackle dehumanization of any kind, be it in public or private institutions of health, education, or criminal justice, without an operating definition of what it means to be human. In the absence of such intellectual explorations, our inner worlds devolve into moral flatlands to be dominated by the strongest, loudest, most intimidating of ideologues and demagogues, bought and sold to the highest bidders, through acts which can only be regarded as violence. When we are told what is human, or rather, who is human, we abdicate the responsibility toward inquiry embedded within each of us to determine the world for ourselves.
So, let us ask together, what makes us human? And perhaps more importantly, what implications will our definitions for a time have on the way we “do” teaching and learning, education, and life itself?
Defining the Human
What makes us human? The question is both a scientific and philosophical one; empirical and socially constructed. In many ways, it tasks us to identify our uniqueness above other life forms, not only with a proclivity, but a hypersensitivity toward discrepancy. Quite evidently, there are vast differences between humans and other living organisms on the planet; yet, there are likely more similarities, particularly the deeper we explore, at the molecular, atomic, and subatomic levels that make up our parts. In the scope of sentience, any perceptions of superiority are humanity’s deficits made visible.
Though I will not be dedicating this article to sentience itself, it is important to recognize that defining the human is much like carving out an arbitrary, microscopic piece of a muddled, marked-up, living chalkboard, breathing and moving, full of evolutionary scribbles and scratches created over the course of billions of years, with infinite, incoherent connections and disconnections. The human being is that small, indiscriminate piece of chalkboard, connected to the whole of the universe. Our existence, our nature, our humanity is not easily distinguishable from the parts it is made of and connected to. So we must move forward with a careful cognizance that our definitions, which beg for black-and-white comparisons and easily identifiable divisions, are mere veneers of a deeper reality: we are one, connected by a web of mutuality, part of an even greater whole. Any answer we come to together will and should be an answer for a time, because there are no “perfect” answers, true in all space and time. Like a friend we meet and spend time with along this movement of life, we will find comfort in our answers for a time, and continue to perfect them as our knowledge grows.
The Limitations of Definition
Scientists have long struggled to disambiguate human genetic and phenotypic manifestations from those of other forms of life. Characteristics often referred to as the prodigious cornerstones of humanity, such as language, social organization, cooperation, bipedalism, and large brains, often lose their remarkability upon further examination. Bipedalism, for example, is not an exclusively human form of locomotion. Birds, kangaroos, and even chimpanzees (our closest primate relatives) manage to move on two feet regularly or occasionally. The answer to what makes us human is not a definitive one; rather, the characteristics most notably affixed to humans differ in their potentialities, frequencies, and sophistication, and perhaps manifest themselves with greater profundity in comparison to other species.
While distinctions can be made between human and non-human organisms, further distinctions can be constructed within the human species itself. Take the earlier example of bipedalism, an evolutionary peculiarity: many anthropologists agree it is a signature feature of humans; however, it does not follow that one who travels by wheelchair, or is born without functional legs, is less human. Similarly, a human being without the capacity to speak is not necessarily less human. Instead, we understand the transactional nature of the question, “What makes us human?”, as both pattern identification within large populations and social construction and reconstruction; that is, the empirical and the imagined.
In 1758, Carl Linnaeus, the Swedish botanist who devised our modern system of classifying species, termed us Homo sapiens, which means “wise man.” When it was later revealed that there were multiple branches in our human evolution, we felt a need to disaffiliate ourselves from Neanderthals, then considered a subspecies of Homo sapiens, and became Homo sapiens sapiens, the wisest of the wise. This is an example of how we are not simply constructing definitions of humanity; we are constantly reconstructing them, and these reconstructions are not necessarily based only on the empirical, but also on the social. If and when the lines blur, it is because they are being created as we go.
Evolutionary anthropologist Robert Sussman settles the discrepancies in the distribution of behaviors by describing the “toti-potentiality” of different species:
Each population of a species should have the potential to perform the total repertoire of the species, but the distribution of behaviors of any population might display a statistical representation of the total repertoire. We should be able to compare the toti-potentiality of different species and compare the repertoire of different populations within the same species.
Sussman asserts that humans are distinct in behavioral toti-potentiality from the rest of the animal kingdom in three major ways: symbolic behavior, language, and culture. Symbolic behavior is the ability to “imagine things that don’t exist” (p. 185). We can ponder the past and future in alternate realities or worlds. Language is the unique, temporally malleable vehicle with which we elaborate upon our symbols and impress them upon others. And finally, culture is the ability to pass on learned behaviors and world-views to future generations. Ultimately, it is the toti-potentiality of our species that is the subject of definitive analysis. Thus, we are identifying the macro-potentialities of humans against the idea(l) the definition itself creates for individuals.
The Smithsonian Institution identified six primary traits that evolved over the course of six million years and led to the development of modern humans: bipedalism; tool-making; anatomical adaptiveness to differing climates; increased brain size; social life; and symbolic thought, particularly in relation to language. Our analyses will focus on these key traits, which evolved symbiotically and influenced one another over the course of human evolution, and which can be rendered into more inclusive categories of toti-potentiality: language and communication, social adaptiveness through collaboration, cognitive agency or critical thinking, and creativity. These faculties are extensions of broader mental capacities: mental time travel, theory of mind, and language.
Mental Time Travel
Following the genetic revolution of the 1980s, our close ancestral relationship with other primates became nearly impossible to dismiss, particularly with the discovery that the human genome is nearly 99% identical to that of chimpanzees. To put this into perspective, the genomes of rats and mice differ more from each other than ours does from the chimpanzee’s (Pollard, 2012). Yet, despite our near-identical genomes, the differences between our ape cousins and us are difficult to ignore. For better or for worse, humans can call into question their very existence, conceive of alternative realities, create and command the environments they inhabit, and nurse into existence the socio-cultural systems and traditions for future generations to accept or contest. Humans demonstrate a depth in consciousness unlike any other species on Earth.
Dunbar (2007) argues that the essential difference between humans and other species lies in our capacity to imagine and conceive of alternate realities.
What we can do, and they cannot, is to step back from the real world and ask: could it have been otherwise than we experience it? In that simple question lies the basis for everything that we would think of as uniquely human (p. 38).
Essentially, humans demonstrate an ability to transcend temporal fixedness; that is, we can escape reality as it is now and enter an inner, imagined world that calls upon our personal past or an invented future. Corballis and Suddendorf (2007) describe this adaptation as mental time travel, a function of episodic memory, which enables our capacity to envisage future episodes and mentally relive events in the past.
Perhaps the most extensive research on mental time travel in nonhuman species comes from work with birds that cache food they later recover from stored locations. Nearly all birds in the crow family store and hide food for later use, and some relocate those stores when threatened, a behavior known as re-caching, which suggests they are calling upon memories of the past and planning for an uncertain future. For example, scrub jays select food locations according to what they have stored in a location previously, while re-caching when they have been observed by other birds, presumably to avoid the possibility of their food being stolen in the future (Clayton et al., 2003). These observations of scrub jays provide some convincing evidence of mental time travel. However, additional studies suggest their ability to recover food and re-cache may not be a form of remembering and avoiding future threat, but a domain-specific, instinctive mechanism. Van der Vaart and colleagues (2012) argue that the scrub jay behavior may be due to the stress that comes with the presence of other birds. Mental time travel remains a primarily human phenomenon.
Tool manufacture is also used as evidence for mental time travel in nonhuman species. New Caledonian crows construct hook-tools out of twigs and pandanus leaves to aid prey capture and extract grubs from holes (Hunt, 2000). Similarly, chimpanzees, bonobos, and orangutans sometimes improvise objects for tool use, dislodging rocks or branches as intimidation or defensive tools, cleaning body parts with leaves, or using stones to crack open nuts. Still, there is little evidence that they construct tools for some future purpose. In Oakley’s (1961) study of chimpanzee tool use, he wrote, “tool making occurred only in the presence of a visible reward, never without it. In the chimpanzee, mental range seems to be limited to present situations, with little conception of the past or future” (p. 187). Evidence of this capacity in nonhuman species thus pales in comparison to humans, whose manufacture of tools is clearly positioned for future use and re-use.
Dan Falk writes in his book, In Search of Time, “to be human is to be aware of the passage of time; no concept lies closer to the core of our consciousness.” This ability to draw upon our autobiographical experiences and emotions from the past matched with the ability to imagine and envision future events is indispensable to all that we recognize as human. He continues, “Without it, there would be no planning, no building, no culture; without an imagined picture of the future, our civilization would not exist.” The brain is a foretelling machine that uses environmental input to make predictions about the future, envisaging scenarios as they may occur.
From this elasticity with which we experience time, we witness the emergence of higher-order thinking (Murray, 2003), allowing humans to process and update information crucial to survival itself while operating outside of a biologically automated and predictably fixed set of behaviors. The psychologist Endel Tulving argues that mental time travel, also known as chronesthesia, enables us to create and pass cultural knowledge on to future generations, including how to plant seeds, provide the dead with grave goods, make and keep records, formally educate the young, create gods and please them, and explore the stars. Mental time travel likely provided our ancestors with an evolutionary advantage in the struggle for survival, as remembering and imagining future scenarios ultimately increased their fitness: distinguishing between friends and foes, evolving food-gathering processes, and developing tools that worked well while discarding others. Chronesthesia emancipated humans from their own biological wiring, allowing us to remember, understand, apply, analyze, evaluate, and create plans for the future by permitting us to ask questions like why? Why not? When? How? And, what if? Chronesthesia may be the gateway to critical thinking and questioning itself.
The propensity to entertain ideas of alternate realities, to question current realities, and to play with the physical world in the creation of new realities is closely tied to curiosity. Curiosity, from the Latin root cura, or care, essentially means to care to learn. Humans demonstrate an unparalleled curiosity. We search for knowledge not out of mere necessity; rather, we explore and seek explanations out of sheer curiosity. From a very young age, toddlers explore the world around them, actively engaging with it, touching, feeling, taking things apart, fitting the pieces together, wandering off into new spaces, intently observing and becoming captivated by everything around them. And throughout history, curiosity, which often manifests itself through the act of questioning where language is accessible, is responsible for the very quests that have allowed us to explore micro universes and land on the moon.
The act of questioning requires an awareness of one’s own thoughts in tandem with the knowledge that there is more to be known. Questioning is disruptive in nature, for to question is to recognize that the environment not only forms the individual, but that the individual, too, forms the environment. This dialectic empowers the individual to re-cognize the plasticity of circumstance and maneuver beyond a perceived biological rigidity.
For instance: What are the lights in the sky? Why does the sun rise and fall every day? Where is last Tuesday? Does bacon taste good on an oatmeal cookie? What is consciousness? Can humans fly? What would happen if I threw a rock into the water? Do potatoes have feelings? What makes us human? Are humans the only animals that are curious?
A study by Laurie Santos conducted at Yale University offers some insight into curiosity in other primates. When presented with a mysterious machine, capuchin monkeys attempted to reason out its workings only in the presence of a reward. In contrast, young children demonstrated causal reasoning regardless of the presence of an external reward. Similar studies have been conducted with monkeys to understand animal causal cognition, with comparable results, including the “blicket detector” test, in which toys placed on a Styrofoam box emitted visual and auditory cues when triggered. Monkeys sought out combinations of toys to activate the device in the presence of food rewards but ceased their investigations when the food dispenser was detached from the machine. Contrastingly, young children in the same test explored regardless of immediate rewards. Though it is possible that the monkeys would demonstrate curiosity in other contexts, there is little existing evidence of curiosity in non-human primates or non-primates.
Taken together, the propensity to learn and seek explanation (curiosity), to question the environment (critical thinking) and explore alternative realities, while subsequently manipulating the external world (creativity), are not only inseparable qualities, but also together, fundamentally human practices.
Theory of Mind and Sociality
Mental time travel, that is, the ability to imagine (or draw images of) other futures, is not the only capacity that precedes the things we commonly identify as human. Another basic feature, working in concert with chronesthesia, is theory of mind, the ability to imagine others’ internal worlds and thought processes.
According to the Oxford Handbook of Philosophy and Cognitive Science (2012), theory of mind refers to “the cognitive capacity to attribute mental states to self and others” (p. 2). There is a growing consensus among developmental and comparative psychologists that the ability to recognize and imagine the mind states of other people (Dunbar, 2007) is a defining characteristic of humans. Also referred to as “mentalizing,” “mind-reading,” or “folk psychology,” theory of mind enables individuals to inhabit, describe, and conjure up the inner worlds of others, animate or inanimate, understanding the world from a different vantage point. From it, humans are able to recognize beliefs, intents, desires, emotions, and perspectives different from their own.
Simon Baron-Cohen (1985) identified that infants, by 7 to 9 months, are able to identify directions of attention in others, a critical precursor to theory of mind. In other words, infants recognize the selective direction of another’s gaze, noticing when attention is on them or not. By the age of four, most children are able to recognize false beliefs in others. An experiment conducted by Wimmer and Perner (1983) tested children between the ages of 3 and 9, telling them a story about a boy named Maxi whose mother brought home chocolate to make a cake. In the story, Maxi observes his mother put the chocolate in the blue cupboard and then goes outside to play. While Maxi is outside, his mother moves the chocolate from the blue cupboard to the green cupboard. The children are asked which cupboard Maxi will think the chocolate is in. The story is acted out using dolls and matchboxes to support their understanding. While 3- to 4-year-olds mostly failed the test, pointing out the actual position of the chocolate instead of where Maxi thought it would be, 4- to 5-year-olds were able to point out where the chocolate was from Maxi’s perspective. Between the ages of 4 and 6, children are able to simulate the mind states of others, engaging in fictive play like tea parties with dolls, pretending empty cups are full, and actualizing imagined realities through the inhabiting of alternate perspectives.
There remains extensive debate as to whether or not nonhumans can attribute mental states to others (Van Der Vaart & Hemelrijk, 2012). A wealth of studies conducted with apes and monkeys suggests that both chimpanzees and macaques hold the capacity to understand what others know, but not when others hold false beliefs. But perhaps more importantly, whether or not nonhuman species have the capacity for theory of mind matters less than the depth and intentionality with which humans inhabit and explore alternate mind states. For theory of mind is the mental toolkit which renders possible the vast scope of human behaviors associated with imagining the inner worlds of others: empathy, compassion, altruism, communication, cooperation, collaboration, symbolic thought, imitation, teaching, learning, language, society, and culture; while also rendering possible their counterparts: sadism, dishonesty, intimidation, division, false characterizations and stereotype threats, and abuse of power.
Seyfarth and Cheney (2012) suggest that a fully developed theory of mind, that is, the understanding that one’s own beliefs, intentions, desires, and knowledge can differ from others’, in tandem with the motivation to share information with others, were both necessary precursors to language. Like humans, monkeys and apes have sophisticated perceptual systems and can recognize other individuals’ motives and anticipate what they may do next. However, they cannot recognize what another individual knows or when another holds a false belief. Studies suggest they are limited in the awareness of their own thoughts, showing little evidence of critical problem solving or imaginative behavior. In contrast, young children display not only an awareness of their own thoughts and ‘what if’ behavior, but are further motivated to share them with others. Theory of mind and the motivation to share information with others are, according to Seyfarth and Cheney, not only what make us human, but together embody the anomalous properties that gave rise to language itself.
Whiten (1999) refers to this fundamentally human anomaly of language as a product of our “deep social mind.” He asserts that humans are more deeply social than any other species on Earth, with the ability to penetrate the cognition of other individuals through everyday psychology. These complex interpretations are further complicated by culture, which in its vast variations stems from our ability to imitate, acquire, and transmit mental representations to others. Therefore, mind reading, imitation, and culture are the synergistic precursors that made it possible for language to become one of the most significant tools ever created by humankind (Whiten, 1999; Cartmill & Brown, 2012).
Language
Human language is ostensibly one of the most distinguishing features and perhaps one of the most ubiquitous technologies of our species (Bickerton, 2009; Pinker, 1994). It is estimated there are between 5,000 and 7,000 spoken languages in the world, rich with complexity and the potential to be encoded into auditory, visual, and tactile modalities like writing, sign, braille, or even whistling. While it is true that almost all animate organisms communicate with one another, like birds that sing, or dolphins that emit sonar signals, or even trees that release tannins when harmed, no other organism’s communication system stands as far apart from the rest as human language does.
Gentilucci and Corballis (2007) describe language as a composition of symbols, audible, visual, and tactile, that bear little resemblance to the objects, actions, or ideas they represent. For example, when we look at the English letter T, there is no implicit or inherent connection to the sound it makes; yet, when we come upon it in a word, we recognize its function and make the corresponding sound, “tuhh.” Subsequent vocalizations with the letter in context support the formulation of words, which also hold meaning and evolve over time. Through shared perceptions of these symbols, the artifacts of human cognition are transmitted from one person to another. Even as you read the words on this page, the ideas I have generated in my brain and made cohesive through language itself are being transported from my head, into words, onto this page, met by your eyes, and into your brain for analysis, interpretation, and critique. This, evolution’s masterful version of telepathy, made possible via symbolic thought, has allowed for the social, cultural, and technological innovations that we see all around us today.
However, there are examples of other primates that communicate symbolically. Nim Chimpsky, a chimpanzee taken at birth from his mother to live with a human family, successfully learned over 125 signs in American Sign Language. Other animal language studies have obtained similar results. Yet, because non-human primate vocal production is highly constrained, their vocalizations are restricted to a small repertoire of calls. In comparison, the human larynx at birth looks very similar to that of a non-human primate, yet it develops significantly in the first two years, enabling more precise vocalizations and allowing for a greater diversity in vocal repertoire.
Ultimately, the ability to learn, to build upon one another’s innovations, to socially and culturally evolve into newer, more inclusive or exclusive iterations of ourselves, has been rendered possible by the co-evolution of and co-interaction between theory of mind and mental time travel. Furthermore, language, as a product of these basic cognitive faculties, is the vehicle with which our collective historical inventory is projected and reflected upon ourselves time and again through storytelling, arranging our sense of identity, direction, and place within the whole of the universe. Our reality as we know it is a collaborative construction made up partly of the real, the interpreted, the inspired, and the imagined; an interplay between self and other, whose inner worlds are re-presented and transferred from mind to mind, over and over again.
The Resulting Social Adaptiveness of Cooperation
As they traveled down a mountain road, three monks, carrying nothing more than a large pot, came upon a war-torn village. The villagers, suspicious of the strangers, covered the food they had available and hid in their homes. The monks decided to make “stone soup,” comprised of nothing more than water and stones. The villagers, curious as to what the monks were cooking with nothing but stones, began approaching the monks. The monks convinced the villagers to share some of their spices, vegetables, and other ingredients, and together they feasted on a meal that fed the entire village, ultimately teaching the villagers the meaning of happiness.
The old folk story of Stone Soup with its many variations continues to teach young children all over the world the importance of sharing and the joy that comes with doing so. But where does this proclivity toward pro-social, cooperative, collaborative behavior come from? Is there something inherently human about it?
Over the last two million years, humans developed a unique, incredible ability to learn from each other, making way for cumulative learning and cultural evolution. As a result, human psychology evolved to support larger, more cooperative societies. In fact, human pro-sociality and cooperation operate at a larger scale than in any other known species, and there is strong evidence of a human predisposition toward cooperative and collaborative behavior. Among other social mammals, cooperation is generally limited to relatives and shows little of the complexity we see in humans: division of labor, trade, or large-scale conflict (Boyd & Richerson, 2009). Even the most collaborative of nonhuman mammals cooperate chiefly within direct familial relationships.
Cooperation is generally defined as “costly behavior performed by one individual that increases the payoff of others” (Boyd & Richerson, 2009, p. 3283). Cross-cultural studies on cooperation suggest a strong human proclivity toward reciprocal and altruistic behavior. In the ultimatum game, which has been played all over the world, two players are shown a sum of money, for example, $10. The first player, the “proposer,” is instructed to offer any amount between $1 and $10 to the second player, the “responder.” The proposer may make only one offer, which the responder can accept or reject. If the responder accepts, the money is split accordingly; if the responder rejects, both players receive nothing. Because the game is played only once between two anonymous individuals, the standard assumption is that self-interested proposers would offer the smallest amount, $1, while self-interested responders would accept any positive amount. In actuality, this is not the case. Across many replications of this study with varying conditions and sums of money, proposers consistently offer responders substantial amounts, 50% on average, and responders routinely reject offers below 30%.
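The payoff logic of the ultimatum game can be sketched in a few lines of code. This is a minimal illustration only; the 30% rejection threshold is a hypothetical round number echoing the averages described above, not data from any particular study:

```python
def play_ultimatum(pot, offer, rejection_threshold):
    """Return (proposer_payoff, responder_payoff) for one round.

    The responder accepts any offer at or above their threshold
    (expressed as a fraction of the pot); a rejection leaves both
    players with nothing.
    """
    if offer >= rejection_threshold * pot:
        return pot - offer, offer
    return 0, 0

pot = 10
# Against a responder who rejects offers below 30% of the pot,
# a purely self-interested $1 lowball backfires: both get nothing.
print(play_ultimatum(pot, 1, 0.30))   # -> (0, 0)
# The empirically typical ~50% offer is accepted and shared.
print(play_ultimatum(pot, 5, 0.30))   # -> (5, 5)
```

The sketch makes the puzzle concrete: a responder who enforces a fairness threshold pays a real cost to punish lowball offers, which is exactly the behavior a purely self-interested model fails to predict.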
Cooperative, or collaborative, behavior is not unique to humans. It is widespread in nature, with many species living in social groups of both kin and non-kin. There is also a growing literature on animals being highly motivated to learn about other individuals, particularly when it comes to dominance and social ranking. These behaviors are evidenced not only in other primates, but also in hyenas and in birds and fish such as pinyon jays (Cheney, 2011). For example, subordinate baboons will groom dominant individuals in exchange for greater tolerance when accessing food sites, and female baboons will often groom other females, developing long-term bonds in the absence of immediate rewards. Yet humans are different in that their cooperative behavior is not limited to the stabilization of social order or to non-kin relationships. Our ability to cooperate and to coordinate our individual skills in pursuit of shared goals has brought about culture-shifting innovations like language, the printing press, symphonies, air travel, the Internet, and Wikipedia, and it has landed us on the moon. And while there are countless examples of cooperation among other animals, few can abandon pre-wired frameworks of behavior.
Humanity As Framework for Life and Education
Human beings are perhaps one of evolution’s most fascinating innovations. And while the act of defining ourselves can be a messy monstrosity of an academic and philosophical endeavor, it is perhaps the only way we can identify inhumanity when we encounter it. For defining humanity automatically brings into existence its inverse: dehumanization.
If essential aspects of our humanity can be identified by the outputs of our fundamental capacities, that is, the abilities that derive from mental time travel and theory of mind, then what makes us human can be specified by language, critical thought, deep social predilections, and imagination. We are critical, communicative, collaborative, and creative beings: the 4 C’s.
To be human is to be able to be critical, to question reality as it presents itself, and to ask what if and why. To be human is to contemplate realities that do not currently exist and to create circumstances that fit our imagined worlds. To be human is to be able to communicate ideas, sharing our visions, our complex inner and outer worlds, our understandings and our questions, receiving and relating them in a never-ending dialectical exchange. To be human is to be able to coordinate and collaborate with others in the transcendence and real-time convergence of the real and the imagined. Taken together, these potentialities make up the teaching and learning process. Teaching and learning is perhaps the most human enterprise of them all.
The word humanity names both an ideal and a process. Its noun, human, and its verb, humanize, allow for movement toward, or further away from, the ideals we construct by way of our definitions and conditioning. Just as we have the freedom to choose critical thought, we can also choose ignorance, indifference, and irrelevance, leaving any or all of our human potentialities unexamined and unpracticed. Depending on our directionality, deliberate or unintentional, our humanity itself, left unexamined, could come to be regarded as unconventional, uncanny, idealistic dogma.
Defining what it means to be human gifts us a lens through which we can understand and identify dehumanization not only in our institutions but, perhaps more importantly, in ourselves. Definitions direct and distinguish destinations for our ethical travels. They grant authority to parents, to teachers, to people. If we are unwilling to extend to others, especially those whose lives we significantly shape, a voice, a choice, time and space for critical thought and respectful dissent, and authentic opportunities to work WITH us and not FOR us, we risk falling short of our own definition of what it means to be human; short of our own ethical expectations.
So what, then, does it mean to dehumanize? Using this framework as a lens, dehumanization might be defined as the repression of the human qualities we’ve identified. Preventing human beings from questioning and examining the world for themselves would thus be a form of dehumanization, because it strips them of a fundamentally human occupation. Restricting the flow of information and preventing communication, or free speech, may likewise be dehumanizing, because it actively bars human beings from another such occupation. And taking away people’s right to choose their own lives, which may diverge from our own wishes for them, would also be dehumanizing, because it inhibits their right to live creatively, regardless of the social constructions or expectations we hold for them.
Questioning Our Ideologies as Educators
It is possible that for some of us, identifying what it means to be human, and what it means to dehumanize, comes easily and falls well short of controversial territory. But what happens when the same framework is layered atop everyday axioms? I recently read a list called Mom’s Rules, shared by over 12,000 people on social media. It read, “If I cook it, you eat it. If I buy it, you wear it. If I wash it, you put it away. If I clean it, you keep it clean. If I say bedtime, you say goodnight. If I say get off the phone, you hang up. If I say no, you don’t ask why. Because I’m the mom!”
Now, admittedly, I am not a mother, yet I have a great deal of respect for them, including my own. But I am a teacher. And I’ve heard similar rhetoric from teachers about students: that they are not allowed to question the authority of the teacher, for doing so would be considered “talking back.” In fact, the “because I said so” mentality is pervasive in education. Some of the most highly regarded learning environments I’ve encountered in schools are the ones where students do little to no talking, let alone question their teacher. This may be because many of our ideas about teaching and learning, about parenting, about authority, revolve around the unswerving convenience and unmitigated veneration of those we see as above us, and the simultaneous renunciation of our fundamentally human qualities. Children are expected not to question parents, students are expected not to question teachers, workers are expected not to question managers, and the list goes on. Indeed, merely suggesting a scenario in which a child says, “I am not to be questioned,” to an adult invokes great discomfort, yet its inverse does not. More stunningly, knowing why and wherefore is at the heart of learning, and it supports the growth of everyone involved because it necessitates a re-observation, a re-cognition, of the questioned.
I still remember my first year in the classroom. I was considered one of the most effective educators in the program and moved 3rd grade students from below-basic standing into proficiency with what appeared to be great ease. In fact, every two weeks, my students’ scores would skyrocket on post-examinations, and I gladly took all of the credit. But at what cost? Almost every day, my sternness made a child cry. Talking was rarely allowed in class unless a question was raised, and even then, students were given few, if any, opportunities to question or co-create the classroom we all shared. For many years, I fervently believed that teaching was an autonomous act, independent of student thoughts or opinions, grounded instead in my expertise and my ability to communicate it. It was all about me. It was inequitable and, conveniently, conventional.
My previous notions about teaching remain the status quo in many educational institutions and environments today. But countless studies in cognitive and behavioral psychology, as well as in education, converge on the finding that teaching, and parenting, in its most effective form is reciprocal, not autarchic. It is dialogical, not unilateral. It is collaborative, not teacher-centered. It is facilitative, not directive. It is democratic, not authoritarian. It is humanizing, not oppressive.
The purpose of our inquiry together is to be uncomfortable, to sit in dissonance long enough to move and readjust our notions about the world and ourselves, with freedom and intentionality, toward that which makes the deepest sense to us. But most importantly, it is to choose who and how we want to be as human beings toward those we’ve qualified as “others,” who may be as “other” as our hands are to our mind, or our histories to our present.
We perpetually rest in the wreckage of our humanity, but choosing to put it back together, to make cohesive again that which we unknowingly tear down, to question and re-cognize our truisms about it, is both our noblest vocation and greatest virtue. And it begins with the self, a definition, and a mirror.