Multi-Regional Evolution of the Human Mind and Brain: Hominoid, Australopithecus, H. habilis, H. erectus, the Neanderthal-Cro-Magnon Wars: Frontal Lobes, Angular Gyrus, Tool Technology, Language, Writing, Math

Rhawn Gabriel Joseph, Ph.D.

Until around 580 million years ago, the vast majority of life forms sojourning on Earth and beneath the seas were single-celled organisms and simple multi-celled creatures composed of fewer than 11 different cell types (Bottjer et al., 2006; Glaessner et al., 2010; Narbonne 2005; Narbonne and Gehling 2003; Shen et al., 2008). Until sufficient oxygen, silica, and calcium had been released and the oceans had become oxygenated, body and cell size were restricted, unable to expand or engage in strenuous physical activity. Larger bodies require skeletal support. Internal organs require skeletal protection. Moreover, in the absence of ozone, larger sized bodies would have been burnt by UV rays and would have ruptured. Therefore, beginning around 640 mya, once silica, calcium, and oxygen levels had increased and a protective (oxygen-initiated) ozone layer was established, creatures expanded in size, diversified, and grew spines, silica skeletal compartments, then silica-collagen skeletons, collagen-calcium skeletons, armor plates (sclerites), and small shells like those of brachiopods and snail-like molluscs (Matthews and Missarzhevsky, 1975; Mooi and Bruno, 2011; Butterfield 2003; Conway Morris 2003; Lin et al., 2006).

There ensued an explosion of life, with all manner of complex creatures appearing in every river, ocean, and stream. This vast explosion of bilateral metazoan diversity appeared multi-regionally throughout the oceans of the Earth within 5 to 10 million years (Levinton, 1992; Kerr, 1993, 1995). Over 32 phyla rapidly evolved, many with "modern" body plans still seen in animals today (Fortey et al., 1997; Valentine et al., 2011; Conway Morris 2000; Budd and Jensen 2000; Peterson et al. 2005). These included organisms with a hard tube-like outer skeleton consisting of calcium carbonate, and all manner of "small shelly" fauna (Anabarites, Protohertzina), as well as sponges and jellyfish, and later, mollusks, brachiopods, and the first arthropods (e.g. trilobites), which immediately sprouted legs. In fact, with no history of derivative ancestral forms, and over the course of just a few million years, all manner of complex life forms emerged, and many species were equipped with gills, intestines, joints, and modern eyes with retinas and fully modern optic lenses. Indeed, every phylum in existence today (as well as several which have since become extinct) emerged during the Cambrian Explosion, including the phylum Chordata and animals which would possess a rudimentary 2-layered brain consisting of a spinal cord, brainstem, midbrain, and limbic forebrain.

The evolution of fish, amphibians, reptiles, and then repto-mammals then ensued over the course of the next 300 million years. The repto-mammals were distinguished by the cerebral metamorphosis of the four- to five-layered cingulate gyrus, which stretched in an arc over the hypothalamus, hippocampus, and amygdala. True six-layered mammalian neocortex was probably also beginning to form in thin patches.

By 200 million years ago therapsids had already evolved from the repto-mammals, whereas by 150 to 100 million years ago, mammals began to evolve from and to slowly replace the therapsids, emerging multi-regionally on every continent. The evolution of mammals is distinguished by the six-layered neocortical mantle which sits atop and surrounds, like a shroud, the more ancient regions of the brain. It is this six-layered neocortex which would give rise to language, thought, and the rational mind.

Corresponding with the ascendance of mammals was the continued expansion of the brain and development of the six-layered telencephalon. Indeed, much of the new cortex had evolved for the specific purpose of serving the needs of the limbic system, and thus, these additional layers evolved out of the amygdala, hippocampus, and cingulate gyrus, thereby giving rise to the frontal, temporal, occipital, and parietal lobes.

For example, the medial and then the lateral (motor areas) of the frontal lobe were fashioned from the anterior cingulate gyrus, whereas portions of the amygdala and hippocampus had become increasingly cortical and had given rise to the inferior, medial, lateral, and superior temporal lobe (e.g. Sanides 1964). Hence, six to seven layers of cortex, i.e. neocortex, began to enshroud and form the cerebral hemispheres, which in turn conferred upon these creatures extraordinary powers of intelligence, foresight, planning, and communication. Being much more intelligent, mammalian predators were becoming a serious competitive threat to the more dim-witted dinosaurs.

By 75 million years ago the number of dinosaurs began to decline throughout most parts of the world (cf. Paul, 2012), with the possible exception of North America. However, about 65 million years ago a huge asteroid or meteor struck the planet near the Gulf of Mexico (Alvarez, 1986; Alvarez & Asaro, 2010; Hildebrand, 1991), and a vast number of (large sized) species died out in response to this cataclysm (Raup, 1991), particularly those who lived in North America; i.e. the dinosaurs. Apparently the explosion was so massive that animals for thousands of miles around were instantly incinerated, and so much dust was thrown into the air that the sunlight was blotted out, causing temperatures to plummet--thus killing off larger sized cold-blooded animals, and reducing the plant-food supply for many other creatures.

Nevertheless, mammals (including small primates), being warm blooded, and other smaller sized creatures were able to recover and apparently take competitive advantage of the situation. Hence, with the evolution of neocortically endowed mammalian Creodonta and related carnivores (Carnivora) the remaining dinosaurs could not compete and were completely eradicated. The Age of Mammals had begun.


With mammals and primates, who apparently diverged from the mammalian line some 70-100 million years ago (Colbert, 1980; Jarvik, 1980; Jerison, 1973; Romer, 1970), in ascendance, and exploring, exploiting, and gaining dominion over the Earth, the brain also underwent further adaptive alterations and evolutionary advances in structure and organization. For example, as these creatures now ruled the day as well as the night, and as they were advancing into every niche that had formerly been forbidden to them, there were tremendous expansions in the visual cortex and thus in the size of the occipital and temporal lobes--especially among the primates some 50 million years ago (Allman, 2010; Jerison, 2010). In fact, 50 million years ago the primate brain was already much larger than the brain of similar sized mammals.

Some of the first proto-primates were possibly little rat-like creatures with long snouts and whiskers who devoured insects. In this regard they were no match for the numerous mammalian predators who lurked everywhere. On the other hand, it is just as likely that the first primates were in fact much larger, perhaps equivalent to a medium sized dog. Nevertheless, it was presumably from one or several of these first primate stocks that monkeys, apes, and humans branched off and allegedly descended.

Some primate lines adapted to life on the ground, and these creatures flourished for almost 10 million years before dying out. By contrast, those primates who took to the trees flourished and rapidly adapted to living among the branches of the trees and the forest. They began to grow fingers, and their hands and feet became adapted for grasping. As the environment acts on gene selection, tremendous alterations occurred in the occipital and parietal lobes and in the frontal motor system, which was now significantly contributing to the control of the extremities as these creatures became increasingly adapted to a life in the trees. However, what has been referred to as "prefrontal" cortex--that is, the tissue anterior to the frontal motor areas--remained poorly developed (LeGros Clark, 1962).

The visual system was also modified in that the eyes had shifted from the sides of the head and became frontally placed which in turn resulted in stereoscopic vision, greater central visual acuity as well as improved hand-eye coordination (Allman, 2010). Like other mammals, these primates also continued to rely on olfactory and pheromonal communication; and as in mammals, the olfactory bulb remained large (Jerison, 1973, 2010).

It was presumably these widely dispersed tree-loving stocks that gave rise to monkeys in Africa, India, Asia, and the Americas about 40 million years ago (Leakey, 1988; Pfeiffer, 2013). One or several branches of this wide ranging stock of monkeys in turn gave rise to apes (hominoids) about 30 million years ago (Sibley & Ahlquist, 1984), with what would become chimpanzees and gorillas eventually appearing in Africa, and orangutans appearing in India and Asia. Presumably numerous branches from these varied primate-hominoid trunk lines diverged again, and yet again, giving rise to several possible hominid ancestral lines, one or several of which would eventually lead to the evolutionary metamorphosis of the first hominids.


As to the ancestors of the first hominids, there are several candidates. These include Dryopithecus and Sivapithecus, ape-like hominid/hominoids who emerged in Europe and India about 16 million years ago, as well as Ankarapithecus of Turkey, Ouranopithecus of Greece, and especially Ramapithecus, whose remains have been discovered in Africa, India, and Southwest China (Jurmain et al. 2010; Munthe et al. 2013). Ramapithecus stood about three feet tall, and had a low forehead, a flat wide nose, and a face shaped like a muzzle. Although he was a seed eater and tended to grind his food, Ramapithecus (or like-minded) males probably occasionally hunted, captured, and killed small game and possibly other primates by hand, whereas females and their young obtained the brunt of their food by gathering (Leakey & Lewin, 1977; Pfeiffer 2013). Ramapithecus, in fact, appears closely related to Dryopithecus and Sivapithecus, and may have given rise to Gigantopithecus, whose 8 million year old remains have been found in India, China, and Vietnam. Gigantopithecus in turn may have been the ancestor of those Australopithecines which emerged in these lands.

Evolutionary metamorphosis is most likely to occur when an organism is exposed to a multiplicity of changing environments, or where two divergent worlds meet; as the environment acts on gene selection. For the pre-hominid hominoids such as Dryopithecus, Sivapithecus, Ramapithecus, Ankarapithecus, Ouranopithecus, and Gigantopithecus, the netherworld of change occurred during a period in which parts of the planet were bathed in renewed warmth, causing the forests to shrink and giving rise to expanding savannas. It was during this period of climatic change, around 5-10 million years ago (Sibley & Ahlquist, 1984; Takahata et al. 1995), that the descendants of Ramapithecus or Gigantopithecus or Ankarapithecus or Ouranopithecus, or some other primate pre-hominid, underwent further evolutionary metamorphosis and gave rise to a variety of more advanced hominids, the descendants of which would come to include Australopithecus, H. habilis, H. erectus, and eventually Homo sapiens sapiens--an evolutionarily advanced being who would soon dominate and then threaten a good part of the planet's multiple life forms with death and extinction.


It may have been as recently as 5 million years ago that hominoids and hominids diverged from a common ancestor (Sibley & Ahlquist, 1984; Takahata et al. 1995), with the first pre-humans, comprising over a half dozen species of Australopithecus, emerging soon thereafter.

Australopithecus, however, was in many respects more ape than human, especially in regard to head and brain size (Conroy, 1998), their small semi-circular canals (inner ear), robust body build, conical chest, and curved feet. In fact, although Australopithecus (afarensis/africanus) had acquired the ability to walk on two legs and was well on the way to becoming human (e.g. Howells, 1997; Johanson, 1980; Johanson & Shreeve, 1989; Leakey, 1981), this pre-human was built like a chimpanzee, and retained chimp-like capabilities and limitations, including an ape-like maturational growth rate, and a propensity for climbing, eating in, and perhaps sleeping in trees (Fleagle, 1988; Stern & Susman, 2013; Wood, 2004). Similar traits may have been possessed by early H. habilis.

Like chimpanzees, it can also be assumed that Australopithecus (as well as H. habilis) employed elaborate vocalizations and facial, hand, arm, and body gestures to communicate. They likely developed long lasting sibling and mother-infant relationships, and used a variety of strategies for achieving dominance and forming coalitions. Australopithecines (and H. habilis) probably greeted their own kind with hugs, pats, and kisses, would hold and shake hands, engage in long periods of mutual grooming, seek reassurance by embracing, and were probably willing to risk their lives to help family members who were in distress or danger.

As is characteristic of chimpanzees (e.g., de Waal, 1989; Goodall, 1986, 2010; Nishida, 2010; Wrangham et al., 2012), female Australopithecus/H. habilis likely spent considerable time socializing and engaged in prolonged child care, with mother-son and especially mother-daughter bonds lasting a lifetime. Incessant mutual vocalizing and prolonged daily food gathering activities were probably characteristic (Joseph, 2011e).

By contrast male Australopithecines (and H. habilis) were probably more independent, though like chimps, they likely formed coalitions, as well as hunting or raiding parties in which they would kill other animals (White et al., 2011) or hominids from adjacent troops; and, on occasion, each other (Dart, 1949).

Australopithecus (as well as H. habilis) were sexually dimorphic, with the female weighing half as much as the male (Howells, 1997; Johanson & Shreeve, 1989; Leakey, 2004). All larger sized primates are sexually dimorphic and tend to live in multi-male, multi-female groups (Fedigan, 2012), with males competing for access to estrus females who may mate with numerous males. By contrast, smaller and similar sized primates are more likely to be monogamous (Fedigan, 2012). Hence, we can assume that monogamous sexual relations had not yet been established with the emergence of these pre-humans (see chapter 8).

Modern Human vs. Australopithecus Brain

It should be stressed, however, that there were numerous species of Australopithecus (as well as H. habilis), who shared the planet simultaneously from around 4 million to 2 million years B.P. In fact, Australopithecus (and H. habilis) appears to have "evolved" multi-regionally, emerging in Africa (see Grine, 1988; Leakey & Walker, 1988; Skelton & McHenry 2012) as well as China and Java (reviewed in Barnes, 1993). However, over the ensuing three million years, some species of Australopithecus died out whereas the descendants of others underwent a step-wise and progressive "evolution" with increasingly human-like species replacing their more primitive ancestors.

Hence, Ardipithecus ramidus, the "ground ape" (the remains of which are 4.4 million years old), was joined and/or replaced by A. anamensis (who emerged 4 million years B.P.), and A. anamensis was joined and/or replaced by A. afarensis (3.6 million years B.P.), who in turn was succeeded by A. africanus (3 million years B.P.), who in turn was joined and/or replaced by A. garhi (around 2.5 million years B.P.), who was then joined and then replaced by early H. habilis (2.2 million years B.P.).

However, various subtypes of H. habilis also appeared in Africa, as well as China (Dragon Hill) and Indonesia (reviewed in Barnes, 1993; Howells, 1997). Hence, H. habilis (the "handy man") seems to have evolved multi-regionally, perhaps as descendants of various species of Australopithecus who may have also evolved multi-regionally.

It must be emphasized, however, that there is no general or widespread agreement as to the various possible phylogenetic relationships shared by the wide variety of Plio-Pleistocene hominids so far discovered (Leakey, 2004; Skelton & McHenry 2012). Moreover, despite the apparent step-wise progression that appears to have led from A. anamensis to A. africanus to A. garhi (the remains of which were in fact found in the same fossil rich Afar Triangle where skeletons of A. afarensis were discovered), and although Australopithecus was later joined by and then succeeded by H. habilis, there is no general agreement, and it is not yet established, whether present-day humans descended from Australopithecus or H. habilis, or even whether H. habilis descended from Australopithecus (Grine, 1988; Howells, 1997; Johanson & Shreeve, 1989; Leakey & Walker, 1988; Leakey, 2004; Skelton & McHenry, 2012).

However, it has also been proposed that many of the supposed different subtypes of Australopithecus and/or H. habilis (e.g. H. rudolfensis, H. ergaster) and/or H. erectus may have evolved from different ancestral species, and that each in turn has given rise to separate branches of the human race which then evolved multi-regionally, including those which long ago became extinct, such as the Neanderthals (see chapter 4). For example, in 1863, Carl Vogt argued that racial differences "leads us back not to a common stem, to a single intermediate form between man and apes, but to manifold lines of succession, which were able to develop, more or less within local limits, from parallel lines of apes."

A more neurologically oriented view has been proposed by MacLean (2010). Based on enlargements in the frontal portion of the cranium, MacLean (2010) argues that modern humans (including the Cro-Magnon) may have descended from Homo habilis, who he believes descended from Australopithecus africanus, whereas Neanderthals evolved from a distinct branch of H. erectus who evolved from Australopithecus robustus. Indeed, the existence of the Neanderthal people, and the fact that they evolved separately but then coexisted in Europe and/or the Middle East with "early modern" and anatomically "modern" Upper Paleolithic peoples, can be considered recent proof for multi-regional evolution.

Some scientists, however, propose that it is only the descendants of H. erectus who evolved "multi-regionally," and that commonalities between present day races and previous species of humanity are due to interbreeding. Indeed, interbreeding is a likelihood, as so many divergent species coexisted for tens of thousands of years, including Australopithecus, H. habilis, and H. erectus, and then later, H. erectus, archaic, "early modern," and anatomically "modern" Paleolithic H. sapiens sapiens. Yet others argue for a direct line of descent, with all species of humanity originating in Africa from a single ancestor and then radiating outward, displacing and killing off those previous species who immigrated before them. The evidence supporting the "multi-regional" vs the "out-of-Africa" replacement arguments is discussed in chapters 4 & 5 and will be reviewed below.

Unlike those who had come before them, Australopithecus and Homo habilis were extremely advanced for they had learned to stand on their hind legs and to walk in an upright manner (Howells, 1997; Johanson & Shreeve, 1989; Leakey, 2004). Similarly, although curved and still used for climbing, their feet evolved so as to better accommodate standing, and they tended to walk and run on two legs rather than on all four as do apes and monkeys. In consequence, the arms and hands ceased to be weight bearers and were freed of the necessity of holding or hanging on to a tree branch in order to move about. In addition, the fingers and thumb underwent further modification and functional elaboration and they gained the ability to not only hold and grasp, but to explore and manipulate objects.

For example, by the time H. habilis had come to wander the Earth, the thumb had become longer and stronger, which enabled them to engage in complex acts involving a refined precision grasp, e.g., the construction and utilization of simple tools (see Hamrick & Inouye, 1995; McGrew, 1995; Susman, 1995, for discussion of supportive and contrary evidence). Correspondingly, the areas of the neocortex devoted to the representation of the hand, and in particular the fingers, increased (Richards, 1986), which also improved versatility in fine motor control and communication.

Because the hands and fingers had become more versatile they could also be used for tool construction and elaborate signaling among other primates so as to convey feelings of anger (such as by making a fist, pounding a tree, angrily swinging or throwing a tree branch), a desire to share (such as begging for a small piece of meat), to indicate friendliness or the desire to form a social bond, or to even soothe an angry neighbor by grooming his coat. In this regard, the neocortex subserving the manipulation of the hand (the superior parietal lobe) became increasingly adapted for social-emotional communication as well as exploration and manipulation.

As based on the fossil evidence, both Australopithecus and H. habilis were capable of creating and manipulating simple stone and wooden tools, just as chimpanzees use rocks, twigs, and leaves as tools. In fact, antelope leg bones which show the signs of cutting and pounding by sharp and blunt rocks were found in association with the remains of A. garhi, which in turn is evidence of deliberate tool use (White et al., 2011).

However, only the remains of H. habilis have been found in association with stone tools--some of which were discovered in the Omo valley of Ethiopia from a site dated to 2.4 million years, whereas yet others were found at Kada Gona and dated to 2.1 million years (Gowlett, 1986; Lewin, 1981). These stone tools, however, were little more than rocks that had been banged together in order to arrive at a desired shape, e.g., pebble choppers and stone flakes used for cutting. This tool-making tradition has been referred to as the Oldowan, and it prevailed for a million years.


Following on the heels of Australopithecus, and briefly sharing the planet with Homo habilis, were a wide range of quite different individuals collectively referred to as Homo erectus. Homo erectus were big and robust, with thick browridges, large teeth, and bulging shoulder muscles (Day, 1996; Potts, 1996; Rightmire, 2010), and ranged throughout Africa, Europe, Russia, Indonesia, and China from approximately 1.9 million until about 300,000 years ago, with a few isolated populations possibly hanging on, on the island of Java, until 27,000 years B.P. (Day, 1996; Potts, 1996; Rightmire, 2010; Swisher et al., 1996). Thus, H. erectus emerged almost immediately after H. habilis appeared upon the scene, such that they shared the planet simultaneously. Hence, H. erectus may well have been responsible for the demise of H. habilis and any remaining Australopithecines.

Nevertheless, early H. erectus was rather small brained, that is, as compared to modern H. sapiens sapiens, with a cranial capacity of about 800 to 937 cc (Tobias, 1971). It was not until the latter stages of H. erectus evolution that the brain became significantly enlarged, doubling in size as compared to that of Australopithecus (440 cc) and approaching within 15% of present-day humans (Conroy 1998; Potts, 1996; Rightmire, 2010; Tobias, 1971).

H. habilis vs. H. erectus vs. Modern Human Brain

Nevertheless, both early and late H. erectus were exceedingly intelligent and resourceful. H. erectus was the inventor of the hand ax and the Acheulean tool making industry, and may well have been constructing camp sites 1.8 million years ago. As with earlier hominids, these camps were established near rivers and lake shores, and served as semi-permanent as well as transitory sites where stone tools could be constructed and animals butchered (e.g., Isaac, 1971, 1981, 1982; Leakey, 1976, 1978).

Hence, in contrast to chimps and other primates, who eat as they range and forage, H. erectus was apparently returning to the camp site with food that had been gathered, scavenged, or slaughtered and butchered, so that it could be shared with their compatriots. They also utilized various earth pigments (ochre) for what were perhaps cosmetic or artistic purposes. In this regard, perhaps such individuals were beginning to experiment with individual creative and artistic expression.

It was during the later stages of Homo erectus development, about half a million years ago, that the human brain appears to have become significantly enlarged (Rightmire 2010; Tobias 1971), with some individuals sporting craniums with a cerebral capacity of up to 1100 cc. Likewise, around 500,000 years B.P., they were living in crude shelters and had established semi-permanent home bases (Clark & Harris 2013; Potts 1984). It was also during this time that big game hunting had its onset. Moreover, it may have been during this later period that the H. erectus female finally lost her estrus and became sexually receptive at all times (Joseph, 2011e; see chapter 8). The loss of the estrus cycle and full time sexual receptivity no doubt contributed significantly to the development of long term male-female relationships, which in turn may explain the development of the semi-permanent home base (Joseph, 2011e; see chapters 7, 8).

However, because a bigger brain comes in a bigger head, trends in the division of labor became more pronounced and an additional divergence in the mind and brain of man and woman may have resulted as males and females became increasingly specialized to perform certain tasks (Joseph, 1993, 2011e; chapter 7). Males increasingly engaged in big game hunting which coincided with improved visual-spatial perceptual and motor functions involving predominantly the right hand, whereas females increasingly spent a considerable amount of time engaged in child care, domestic tool making, and food gathering which coincided with improvements in bilateral fine motor control and which contributed to the development of language (Joseph, 1993, 2011e).


Presumably, H. erectus is the common ancestor for Neanderthals and modern humans. However, the fossil evidence suggests that H. erectus "evolved" multi-regionally (see chapter 4), and that Neanderthals and different subpopulations of "modern" humans, "evolved" from different subpopulations of H. erectus who were dwelling in Africa, the Caucasus, Indonesia, and China.

For example, the jaw of a H. erectus has been discovered in the Caucasus (Georgian Republic), dated to 1.9 million years (Howells, 1997). The skeletal remains of H. erectus (and associated stone tools) have also been discovered in Java, Indonesia (Pithecanthropus erectus), and near the Solo River (e.g. Solo Man, Java Man)--sites dated from 1.8 million to 700,000 B.P. (respectively). In addition, the remains of H. erectus (H. erectus pekinensis) have been discovered in Northern China (e.g. Peking Man) and in Zhoukoudian, Yuanmou, and Xihoudu, China--sites dated from 1.5 million years B.P., to 800,000 to 750,000, to 500,000 B.P. (Jia, 1975, 1980; Jurmain et al. 2010; Stanley, 1979, 1981; Wu & Wang, 2013).

Given that the earliest evidence of H. erectus (Homo ergaster) in Africa is dated to around 1.6 million years B.P. (Leakey & Walker, 1988), it is not likely that "Georgia Man," "Solo/Java Man," and "Peking Man" migrated out of Africa. In fact, ancestral forms, which include the remains of Gigantopithecus as well as H. habilis (Jia, 1980; Munthe et al. 2013), have been discovered in Java and China (Dragon Hill). Of equal importance, stone tools dated to 2 million years have been discovered at Dragon Hill, China. However, it was not until almost a million years later that H. erectus finally invaded Europe, as the remains of H. erectus have been found in Italy from a site dated to 800,000 B.P., whereas others have been found in Southern England (Howells, 1997).

Hence, contrary to the out-of-Africa scenario, it appears that H. erectus evolved multi-regionally, in Java, Asia, the Caucasus, and Africa, and/or immigrated from these distant lands to Africa (and then lastly, Europe) where H. habilis was still the dominant hominid. Indeed, if H. erectus did emerge first in China, Java, Georgia, and so on, and then spread to Africa, this may explain why the African H. habilis managed to hang on for almost half a million years before becoming extinct around 1.6 million years ago; i.e. at about the same time that H. ergaster appeared upon the scene.

Of course, the above scenario, although supported by the fossil evidence, is completely contrary to the Darwinian notion of a single line of descent. Moreover, it should be stressed that many of those who support the "multi-regional" view of evolution, hold a much more conservative view. That is, although arguing that "modern" humans evolved multi-regionally, many of these scientists accept the notion that H. erectus first emerged and migrated out-of-Africa.


About 500 thousand years ago the first primitive and archaic Homo sapiens began to appear in increasing numbers in North Africa, the Near East, and even southern England. Wherever they wandered they proved to be far more intelligent and resourceful than the last remnants of Homo erectus, who over the next two hundred thousand years were eradicated or simply died out as a species.

It has been logically surmised that H. sapiens ("the wise man") evolved from H. erectus. However, whether archaic humans first "evolved" in Africa, Asia, or Indonesia is subject to considerable debate. As noted above, there is considerable evidence which suggests Australopithecus, H. habilis, and H. erectus "evolved" multi-regionally.

Moreover, there is evidence which also indicates that not all H. sapiens evolved at the same time, in the same place, or from the same stock of H. erectus (Frayer et al. 1993; Moritz et al. 1987; Wolpoff, 1989); that is, it appears that H. sapiens evolved "multi-regionally."

As argued by Frayer, Wolpoff, Thorne, Smith, and Pope (1993, p. 17) "the best evidence of the fossil record indicates that modern humans did not have a single origin, or for that matter a number of independent origins, but rather it was modern human features that originated--at different times and in different places. This view also implies that certain features that distinguish some modern groups were developed very early in our history, after the exodus from Africa. It is also important to recognize that not all features common to modern humans emerged at precisely the same time."

Hence, just as there is evidence which suggests that different subspecies of Australopithecus, H. habilis, and H. erectus may have "evolved" from separate stocks of antecedent species, there is fossil evidence that "archaic" Homo sapiens may have "evolved" from at least three different branches of H. erectus as recently as 400,000 years ago, each of which in turn gave rise to at least three branches of human beings who first appeared and ranged throughout Africa, Europe (e.g., Petralona, Greece; Hungary; Germany), China (Hupei Province), the Middle East, and India. Moreover, there is evidence of evolutionary progression in each of these distant lands, for the remains of more advanced and modern appearing archaic ("early modern") H. sapiens have been discovered in China as well as in East Africa, from sites dated to 130,000 B.P. and 120,000 B.P., respectively (Barnes, 1993; Butzer, 1982; Grun et al. 2010; Howells, 1997; Rightmire, 1984). Hence, there appears to be evidence of multi-regional descent.


The "multi-regional" view is rejected by many scientists who hold to the Darwinian notion of a single line of descent. Although most also admit that European archaics (Neanderthals) evolved separately (which supports the multi-regional view) these scientists argue that modern humans originated in Africa, perhaps about 250,000 years ago, and then radiated outward replacing those who came before them including the European Neanderthals. That is, according to the out-of-Africa scenario, successive waves of hominids originated in Africa and then spread to China, Indonesia, etc., only to be followed by more advanced species who also radiated outward, killing off those who came before them. Presumably this out-of Africa pattern of dispersal and invasion was initiated by and the followed by successive species of Australopithecus, who were then followed (and killed off) by H. habilis, who were then followed (and killed off) by H. erectus, who were then followed (and killed off) by archaic H. sapiens, all of whom originated in Africa and all of whom must have followed the same well traveled route out of Africa to parts unknown.

Presumably, this same pattern of dispersal and invasion was followed by "early modern" and "modern" Paleolithic H. sapiens sapiens ("the wise man who knows he is wise"). According to the "replacement" theory, in venturing out of Africa and invading the Middle East and Europe, "early modern" and "modern" Paleolithic Homo sapiens sapiens "replaced," perhaps by killing off or out-competing, the more archaic H. sapiens (e.g. Neanderthals) who already dwelled in these coveted lands. Hence, presumably, due to population growth and other factors perhaps related to changes in culture, tool making, climate, or availability of game animals, "early modern" and then anatomically "modern" H. sapiens sapiens eventually pushed their way up through Northeastern Africa, and then into the Middle East, and then into Europe, China, Indonesia, Australia, and the Northeastern and Southwestern Americas, and so on, eradicating "inferior" species as they went, such that by 50,000 to 40,000 B.P. the Cro-Magnon and "modern" H. sapiens had come to occupy a considerable part of the Earth.

Hence (and broadly speaking), whereas the "multi-regional" view presents humans as evolving in multiple places over multiple time periods from multiple ancestral species, the "replacement" theorists argue that all hominids, including "modern" humans, evolved in Africa and, in proliferating and invading other lands, eradicated those who had come before them. Moreover, consistent with the belief in a single line of descent, the "replacement" theorists have argued, based on mitochondrial DNA, that all modern humans have descended from a single female ancestor in Africa about 250,000 years ago; i.e., the "Eve" hypothesis. Specifically, based on an analysis of a small fragment (610 base pairs) of the mitochondrial genome (which consists of about 16,500 base pairs), taken from 189 individuals, Stoneking and Cann (1989; Vigilant et al. 1991) have argued, based on the varying patterns in sequencing ("types"), that there is greater diversity within Africa than outside Africa. From these data they concluded that all humans must have descended from African ancestors.

However, the DNA evidence and "mutation rates" upon which this theory is based have been discredited and shown to be statistically invalid (Templeton, 2012; Wolpoff & Thorne, 1991). Indeed, human mitochondrial DNA consists of only 37 genes, circulates independently within the cell's cytoplasm, and is inherited only from the maternal line, and there is no evidence of evolution in the mitochondrial genome, although deleterious mutations are not uncommon. In fact, others have found, using the same data, that all humans could just as well have descended from ancestors who lived in New Guinea (Ruvolo & Swafford, 1993).

As per more recent data provided by Chu et al. (1998) regarding the origins of modern Chinese, it is noteworthy that although these researchers concluded that "genetic evidence does not support an independent origin of Homo sapiens in China," they also found that the peoples of Northern and Southern China clustered into distinct regional genetic populations that could be divided into even smaller separate genetic groups. Moreover, the East Asian populations they studied were genetically most closely related to Native Americans, followed by Australian aborigines and New Guineans, and most distantly related to Africans. Hence, these data could be interpreted to mean that anatomically "modern" humans originated in Australia or New Guinea, then migrated to southern Asia and then the Americas, whereas Africans remained in Africa.

The out-of-Africa model demands a single line of descent, which in turn is based on the single-seed theory of the origin and evolution of life on Earth. However, on the basis of genetics alone, a single line of descent appears to be unlikely (see chapters 3, 4). For example, if a new species were to emerge in a single line of descent (due to a chance mutation), with whom would that individual breed so as to perpetuate the "new" species?

In order to breed, and in order to prevent the "mutation" from being eliminated from the genome, and/or so as to insure that an intermediate (i.e. regressed) version of the new species would not be produced following breeding, an unrelated male and female--a potential breeding pair--would both have to experience this "random" "mutation" (which, unlike all other mutations, just happens to be adaptive) so as to pass on this mutation to their children, who in turn would have to find nearby, unrelated mates who also possess the very same randomly acquired mutation. A chance scenario such as this borders on the ridiculous and is statistically improbable; that is, unless this change in the genome was not random or a mutation, but was in fact programmed into the DNA of this species. Indeed, if an unrelated male and female both evolved this "mutation" so as to breed, then this would be evidence for multi-individual and thus multi-regional evolution, for the same exact genetic events would then be expected to occur throughout the species almost regardless of where its members lived; that is, depending on the environment, as the environment acts on gene selection (genes which exist prior to their selection). It has been argued (see chapters 3, 4; Joseph, 1997) that it is precisely because of a genetic commonality, that is, the existence of shared DNA-based genetic instructions, that new members of the species "evolve" multi-regionally, and thus come to possess almost identical physical and biological attributes.


Although the "Eve" hypothesis requires it, there is no evidence to suggest that "modern" or even "early modern" appearing Paleolithic humans had evolved by 250,000 years ago. Moreover, the first evidence for the evolution of "early modern" H. sapiens appears not in Africa but in China around 130,000 B.P., with similarly advanced species appearing in East Africa 10,000 years later. Moreover, the first evidence for the emergence of "modern" H. sapiens sapiens is not found in Africa, but in Australia. As based on thermoluminescence and optical dating techniques, "modern" H. sapiens had arrived in Australia at least 60,000 years ago and maybe as long ago as 75,000 years B.P, (Rhys Jones, cited by Morell 1995), 25,000 to 40,000 years before "modern" H. sapiens sapiens had emerged in Africa.

In fact, although separated from the mainland by nearly 100 kilometers of open water, "modern" H. sapiens sapiens not only appeared in Australia, perhaps as long ago as 75,000 B.P., but had established numerous settlements, were using ocher, and were fashioning complex tools as early as 60,000 B.P. These included grooved "waisted blades" which could be bound to a handle.

Given that these early Australians would have had to sail almost 100 kilometers across open ocean in order to reach Australia (southern Asia being the closest land mass), considerable intellectual prowess, and thus modernity, is evident. In fact, these early Australians were engaging in widespread agriculture 60,000 to 50,000 years ago, clearing forests with fire so that fruit-bearing trees could be planted (Miller et al., 2011). Because they were destroying these forests, and given their advanced hunting capabilities, more than 85% of Australia's land-dwelling megafauna became extinct almost simultaneously, 50,000 years ago (Miller et al., 2011). It would take 40,000 or more years before their African/Middle Eastern counterparts began to demonstrate similar capabilities.

Moreover, Asian "moderns" also appeared tens of thousands of years before their counterparts in Africa. In fact, after Australia, there is evidence that "modern" Paleolithic humans appeared in China by 67,000 B.P., and then Israel, Romania, and Bulgaria by 43,000 B.P., and then Iraq and Siberia by 40,000 B.P., and then Spain, France, and North East Africa by 35,000 B.P. (reviewed in Howell, 1997). In addition, there is evidence for modern human occupation in Brazil, Peru, Chile and North America as long ago as 50,000 to 30,000 B.P. (Bahn, 1993). In fact, there is evidence to suggest that humans appeared in the Americas perhaps 250,000 years ago--though this date is rejected by almost all anthropologists. Rather, what the evidence strongly indicates is that modern humans arrived in the America's by 50,000 to 40,000 years ago, and this is based on linguistic analysis (e.g. Johanna Nichols of U.C. Berkeley), the analysis of "Native American" DNA (e.g. Theodore Schurr of Emory University) and the discovering of complex tools and human bones in Monte Verde, Chile.

Obviously, these and other findings are not consistent with the out-of-Africa scenario, as "modern" humans were emerging outside of Africa in advance of "modern" humans inside of Africa. Rather, the data instead support the multi-regional view, and even an out-of-Asia view, of evolutionary metamorphosis.

Hence, similar to the step-wise, world-wide pattern of multi-regional, multi-phyletic metamorphosis which has characterized the progressive emergence and increased complexity of plants and animals (see chapters 4 and 5), the available evidence suggests that human "evolution" has unfolded multi-regionally in a step-wise, progressive fashion, with some lagging far behind and others being left behind altogether and becoming extinct.

Nevertheless, it must be stressed that those adhering to the "out-of-Africa" and mainstream Darwinian position of a single line of descent dispute, challenge, and reject the evidence for multi-regional evolution, and have offered up numerous plausible explanations so as to discount these findings. Moreover, many also vehemently reject the notion of "progress" and dispute any and all evidence which indicates that there has been a step-wise progression in complexity over the course of evolution. As argued by Harvard paleontologist and popular science writer Stephen Jay Gould (1988, p. 319), "progress is a noxious, culturally embedded, untestable, nonoperational, intractable idea that we must replace if we wish to understand the patterns of history."


Although the "replacement" vs the "multi-regional" theories appear to be at odds, as will be detailed below, these theories are (at least somewhat) mutually supportive; particularly if one considers that the "replacement" theory may accurately depict events that occurred "multi-regionally". For example, in Africa and Asia, pockets of each successive and more advanced species of Australopithecus may well have radiated outward and eradicated any and all contemporary ancestral species. And this same multi-regional pattern of evolution and "replacement" may have characterized the brief rein of H. habilis who in turn was eradicated (multi-regionally) over the course of just a few million years by H. erectus who also may have emerged multi-regionally and then radiated outward to cleanse the surrounding lands of any and all inferior ancestral species.

Likewise, based on the fossil evidence reviewed above, pockets of archaic, "early modern" and anatomically "modern" H. sapiens sapiens who "evolved" in the Far East, Asia and Africa, likely radiated and ventured outward from these pockets, in all directions; each successive species in turn "replacing" by killing off or out-competing, any and all inferior ancestral species of humanity who already dwelled in surrounding lands. In this regard, the "replacement" and "multi-regional" views of evolutionary development are complementary in that both views entail competition and the replacement of "inferior" species.

Indeed, it is exactly the above scenario which likely accounts for the demise of the Neanderthals--a people whose very existence strongly supports, if not verifies, the multi-regional theory. However, as will be detailed below, "modern" Paleolithic H. sapiens sapiens (be they of African, Asian, Georgian, or Australian origin) were provided a competitive advantage over archaics, including Neanderthals, due to the anatomical and functional expansion of the frontal and inferior parietal lobes and the evolution of the angular gyrus. That is, although Neanderthals and other archaic humans may have possessed the DNA and thus the genetic potential to someday evolve into modern humans, and although there was likely some interbreeding, they were eradicated by a neurologically superior race of people, the Cro-Magnon.


As will be detailed below, the frontal lobes and inferior parietal lobes have obviously expanded over the course of hominoid and human evolution. Hence, it has been theorized, based on cognitive, cerebral, and endocast comparisons, that the evolution and expansion of the frontal lobe and inferior parietal lobe, the angular gyrus in particular, have significantly contributed to the evolution of language, artistry, and tool and hunting technology, as well as the Middle to Upper Paleolithic transition and the eradication of the Neanderthals (Joseph 1993, 2011e).

As is well known, the frontal lobes serve as the "senior executive" of the brain, cognition, and personality, regulating information processing throughout the cerebrum (Fuster 1997; Joseph 1986a, 2011a; Passingham 1993; Selemon et al. 1995; Shallice & Burgess 1991; Stuss & Benson 1986), including, via Broca's area, the expression of speech. Hence, the evolution and expansion of the anterior portion of the brain would necessarily confer greater cognitive, linguistic and intellectual capabilities upon those so endowed.

Likewise, the angular gyrus of the inferior parietal lobe (IPL) plays an important role in language, as well as artistry, tool use and manipulation (Joseph 1982, 1993, 2011e). Hominoids (and other non-human mammals) lack an angular gyrus (Geschwind 1965) and their tool-making capabilities are limited to hammering with rocks, and throwing or manipulating leaves, sticks, and twigs.

Likewise, the tool-making tradition of H. habilis was exceedingly primitive, consisting of rocks that had been banged together in order to arrive at a desired shape. Hence, it can be concluded that H. habilis had not yet evolved an angular gyrus. Moreover, as the IPL/angular gyrus sits at the junction of the tactile, visual, and auditory association areas, and assimilates, sequentially organizes, and injects this material into the stream of language and thought, it can also be concluded (see below) that H. habilis had not yet evolved the ability to speak in a manner even remotely resembling the speech of modern humans--an impression that is also bolstered by their poorly developed (pre-) frontal lobe, a region that contains Broca's expressive speech area.


Over the course of mammalian evolution, the frontal lobes have significantly expanded in size and complexity, reaching their greatest degree of development in humans (Fuster, 1997). Moreover, it is apparent that over the course of human evolution, beginning with Australopithecus, the frontal lobes have continued to expand.

As noted above, hominids and hominoids (i.e. chimpanzees) apparently diverged from a common ancestral stock around 5 million years ago, with Australopithecus appearing soon thereafter. However, not only was Australopithecus more ape-like than human-like, but as based on cranial comparisons, it appears that the brain of the chimpanzee is similar in size and volume (390 cc on average, range = 282-500 cc, male/female mean of 410/380 cc) to that of Australopithecus afarensis, with a range of 385-450 cc (Ashton & Spence, 1958; Tobias, 1971; see also Holloway 1988). These similarities have important implications in regard to the evolution of the frontal lobe.

The human versus chimpanzee frontal lobe has been reported by Brodmann (1912) to comprise 36.3% versus 30.5% of the overall neocortical surface, whereas Tilney (1928) provides estimates of 47% versus 33% (see Figures 1 & 3). Blinkov and Glezer (1968) have estimated that the prefrontal and precentral regions comprise 32.8% of the human versus 22.1% of the chimpanzee cerebral hemispheres. As the chimpanzee frontal lobe comprises from 30.5% to 33% of the neocortical surface (whereas the human frontal lobe ranges from 36.3% to 47%), and as there is no evidence of a functional Broca's area in primates (see above), the same may be assumed of Australopithecus afarensis.
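For comparison, the three sets of estimates cited above can be set side by side. The following sketch (in Python, using only the percentages reported in the text) computes the human-to-chimpanzee ratio implied by each source:

```python
# Frontal-lobe proportion estimates (human %, chimpanzee %) as cited in the text.
# Brodmann and Tilney report percentage of overall neocortical surface;
# Blinkov & Glezer report the prefrontal + precentral share of the hemispheres.
estimates = {
    "Brodmann (1912)": (36.3, 30.5),
    "Tilney (1928)": (47.0, 33.0),
    "Blinkov & Glezer (1968)": (32.8, 22.1),
}

for source, (human, chimp) in estimates.items():
    # Ratio of the human to the chimpanzee frontal-lobe proportion.
    print(f"{source}: human {human}% vs chimpanzee {chimp}% "
          f"(ratio {human / chimp:.2f})")
```

Although the absolute figures differ by method, every estimate yields a human proportion roughly 1.2 to 1.5 times that of the chimpanzee.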

As noted, A. afarensis is the likely ancestor of A. africanus, who had become marginally more human-like, as reflected by slight physical and neurological advances. For example, Conroy (1998) and others (see Day 1986; Gibbons 1998; Tobias, 1971) have determined, as based on volumetric measurements, that the cranial capacity of Australopithecus africanus is slightly larger than that of the chimpanzee, ranging from 413 to 515 cc with a mean volume of 440 cc. Hence, it is apparent that the brain became larger in the transition from A. afarensis to A. africanus, and it can be assumed that some of this expansion took place within the frontal lobe.

Nevertheless, the frontal (and parietal and temporal) lobes of A. africanus (like those of A. afarensis) were probably more ape-like than human-like (Figures 1 & 2; see also Falk 1982, 2013a), and there is no evidence to suggest that they had "evolved" a functional Broca's area or an angular gyrus. In fact, it was not until the emergence of H. habilis that the frontal (and temporal and parietal) lobes began to enlarge and become more human-like (for related discussion see Holloway 1981, 2013a; Tobias 1971); though again, there is absolutely no evidence (other than that based on phrenology, see below) to suggest they had evolved a functional Broca's area.


Hominoids lack an angular gyrus (Geschwind 1965), and the neocortical tissue homologous to Broca's area does not subserve speech or vocalization. Although damage to Broca's area in humans results in a profound expressive aphasia, similar destruction in non-human primates has no effect on vocalization rate, the acoustical structure of primate calls, or social-emotional communication (Jurgens et al. 1982; Myers 1976).

Vocalization in non-human primates is unaffected following lesions to the left frontal lobe, as sound production is mediated by limbic and subcortical nuclei such as the cingulate gyrus and amygdala (Jurgens 1979, 1987, 2010, 2012; MacLean 2010; Ploog 2012; Robinson 1967, 1972), which act on the brainstem vocalization centers including the periaqueductal gray (Jurgens 2004), which in turn controls the oral musculature via, for example, the hypoglossal nerves. In fact, the hypoglossal canal, which enables the hypoglossal nerve and thus the cerebrum to control the tongue, is so tiny in these species that, as also confirmed by behavioral indices, they simply do not have the neurological capability to control the oral musculature sufficiently so as to produce speech.

Thus, among non-human primates, there is no functional, and thus no anatomical equivalent to the human Broca's area and there is absolutely no evidence to indicate they can produce human speech. Likewise, as early hominids possessed a chimpanzee-like brain and/or a poorly developed frontal lobe, and a tiny hypoglossal canal, and as there is no functional evidence that the angular gyrus had evolved, it can be concluded that they too probably lacked a functional Broca's area as well as an angular gyrus and were completely incapable of producing human-like speech.

In fact, whereas the brain of Australopithecus was similar to that of a chimpanzee, the brain of H. habilis was only somewhat larger than that of a male ape. Indeed, the brain of H. habilis is about half the size of the modern human brain (see Conroy 1998), i.e., 640 cc compared to 1350 cc. Again, given the simplistic and unchanging nature of their stone-flake "Oldowan" tool-making tradition, it is also unlikely that these hominids evolved an angular gyrus or a functional Broca's area.

Indeed, H. habilis possessed an exceedingly primitive vocal tract that could not have subserved modern expressive speech (Lieberman 1984). Rather, like the angular gyrus (see below), a functional Broca's area must have evolved in parallel with handedness and advances in tool technology and artistry, and in conjunction with increases in the size and volume of the temporal-parietal neocortex and the frontal lobe. Broca's expressive speech area, like other frontal lobe capacities, was acquired much later in human evolution (Aboitiz & Garcia 1997; Joseph 1993, 2011e).

Some authors, however, have claimed to have discovered an impression of Broca's area on the inside of the H. habilis' skull (Falk 2013b; Tobias 1987). These claims and counterclaims are not only exceedingly controversial (see Falk 2013a; Holloway 2013; Jerison 2010), but the phrenological methodology of determining functional landmarks based on an examination of the skull was rightly denounced as a pseudoscience over 100 years ago.

Claims to have discovered functional landmarks are in fact based on the bumps, grooves, and marks supposedly engraved on the hard inner surface of the skull by an exceedingly soft and gelatinous cerebral mass covered by a thick dural membrane. Although an endocast of the inner skull provides impressions regarding the location of major blood vessels and the gross size and position of the lobes of the brain (see Figures ), it is exceedingly difficult to discern or identify primary sulci and gyri, much less a "Wernicke's" or "Broca's" area (see Jerison 2010 for related discussion).

In fact, such identification is impossible: neurosurgeons cannot perform this feat even when palpating a living brain, and must employ an electrode to identify functional landmarks such as Broca's area. Moreover, although some people suffer strokes or other injuries which are precisely localized to "Broca's" or "Wernicke's" areas, they do not become aphasic, whereas those with lesions which spare the classic language areas may become profoundly aphasic (Dronkers 1993; Kertesz, Lesk & McCabe 1977). Expressive and receptive speech is sometimes represented in the right hemisphere (Joseph 1986b).

Hence, even if we choose to accept the dubious phrenological claims that H. habilis (or H. erectus, or archaic H. sapiens), possessed neural tissue that could be construed as "homologous" to Broca's or Wernicke's area, the possession of "homologous" anatomy does not necessarily imply homologous function, as demonstrated in non-human primates.

However, in the transition from H. habilis, to H. erectus, to archaic H. sapiens, to anatomically modern Upper Paleolithic H. sapiens sapiens (Blinkov and Glezer 1968; Joseph 1993; MacLean 2010; Tilney 1928; Weil 1929; Wolpoff 1980), the frontal lobes continued to expand in length and height, reaching their greatest degree of development with the evolution of the Cro-Magnon peoples. In fact, it is with the emergence of the Cro-Magnon that there is finally evidence that the angular gyrus and a functional Broca's area had evolved.

Broca's Area, Limbic Language, and the Upper Paleolithic Angular Gyrus

Given that the emergence of "modern" Upper Paleolithic humans is characterized by a cognitive "revolution" in tool, artistic, and hunting technology (Chauvet, Deschamps & Hillaire 1996; Gilman 1984; Leroi-Gourhan 1964), it appears that these cultural, technological, and cognitive achievements paralleled and were made possible, in part, via the emergence of the angular gyrus and the expansion of the frontal lobes--the evolution of which also resulted in the development of Broca's speech area and modern language (Joseph, 1993, 2011e).

Specifically, as vocalization in non-humans is mediated by limbic and brainstem structures, it has been argued that what has been referred to as "limbic language" (Joseph 1982; Jurgens 2010) became hierarchically represented, punctuated, sequenced, and reorganized by these neocortical speech areas, thereby giving rise to modern, grammatical, vocabulary-rich speech (Joseph 1982, 1988a, 1993, 2011d,e, 2012a). That is, due to the interconnections maintained between the frontal speech areas and the anterior cingulate, and between Wernicke's area and the amygdala, with the evolution of the angular gyrus and a functional Broca's area (and the pathways between them), auditory input and limbic output were hierarchically and neocortically subsumed and subjected to temporal sequencing, thus giving rise to grammatical, vocabulary-rich language (see below).

Specifically, coinciding with or at least following these transformations in the posterior speech areas, and due to its reception of auditory-linguistic signals, over the course of human evolution Broca's area was transformed from a hand area (as is the case in hominoids) to a speech area and thus began programming the primary motor areas and thus the oral laryngeal musculature for expressive speech. Hence, via connections with the cingulate and amygdala, what had been limbic language was thus hierarchically subsumed by the neocortex, thus giving rise to modern human language and conferring a tremendous intellectual advantage to those so endowed, i.e. the Cro-Magnon (Joseph, 2011e).

As will be detailed below, the evolution of the frontal lobes and angular gyrus parallels and accounts for the cultural, social, technological, and linguistic adaptations which characterize the Middle to Upper Paleolithic transition, including the evolution of language. Moreover, the expansion and evolution of these structures likely contributed to the demise of the Neanderthals, who were not as well endowed as their Upper Paleolithic, "Cro-Magnon" counterparts.


The human frontal lobes serve as the "Senior Executive" of the brain and personality, and are "interlocked" via converging and reciprocal connections with the limbic system, striatum, thalamus, and the primary, secondary, and multi-modality associational areas including Wernicke's area and the IPL. Through these interactional pathways, the frontal lobes are able to coordinate and regulate attention, memory, personality, and information processing throughout the neocortex so as to direct intellectual and cognitive processes.

As based on human (and animal) experimental and case studies, it is well established that the frontal lobes enable humans to plan for the future and to consider the consequences of certain acts, to formulate secondary goals, and to keep one goal in mind even while engaging in other tasks, so that one may remember and act on those goals at a later time. Selective attention, planning skills, and the ability to marshal one's intellectual resources so as to not only remember but achieve those goals, and the capacity to anticipate the future rather than living in the past, are capacities clearly associated with the frontal lobes (Fuster 1997; Joseph 1986a, 2011a; Stuss & Benson 1986).

In addition, the right and left frontal lobes respectively subserve the expression of emotional-melodic, and vocabulary-rich grammatical speech. Specifically, upon receiving converging impulses from the IPL and the language and auditory areas in the temporal lobes, Broca's area (and its emotional speech producing counterpart in the right frontal lobe) act on the immediately adjacent secondary and primary motor areas which control, regulate and program the oral laryngeal musculature (Foerster 1936; Fox 1995; Joseph 1982, 1988a, 2011e,f). Therefore, the ability to express one's thoughts, ideas, and emotions through complex speech is made possible by the frontal lobes.

Although endocasts should not be employed to localize functional landmarks such as Broca's area, they are useful for making gross determinations as to the overall size and configuration of the cerebrum and the lobes of the brain. In this regard, and as based on cranial comparisons, or endocasts using the temporal and frontal poles as reference points, it has been demonstrated that the brain has tripled in size, and that the frontal lobes have significantly expanded in length and height over the course of human evolution and during the Middle to Upper Paleolithic transition (Blinkov and Glezer 1968; Joseph 1993; MacLean 2010; Tilney 1928; Weil 1929; Wolpoff 1980), such that Cro-Magnon people were obviously superiorly endowed as compared to Neanderthals.

For example, it is apparent (see Figures 27, 28, 29) that the height of the frontal portion of the skull is greater in the six-foot-tall, anatomically modern Upper Paleolithic H. sapiens sapiens (Cro-Magnon) versus Neanderthal/archaic H. sapiens (see also Wolpoff 1980, Table 12.1; and Tilney, 1928). Hence, impoverished Neanderthal frontal lobe development and expanded Cro-Magnon frontal lobe capacity is indicated. Indeed, the characteristic "sloping forehead" was an obvious limiting factor in archaic and Neanderthal frontal lobe development. In fact, the Cro-Magnon brain was significantly larger than the Neanderthal brain, with volumes ranging from around 1,600 to 1,880 cc compared with 1,033 to 1,681 cc for Neanderthals.
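The volume figures cited above can be compared directly. A minimal sketch (Python, using only the ranges given in the text; the midpoint comparison is illustrative only, since no distributional data are cited):

```python
# Cranial-capacity ranges (cc) as cited in the text.
cro_magnon = (1600, 1880)
neanderthal = (1033, 1681)

def midpoint(rng):
    """Midpoint of a (low, high) range."""
    low, high = rng
    return (low + high) / 2

cm, nt = midpoint(cro_magnon), midpoint(neanderthal)
print(f"Cro-Magnon midpoint:  {cm:.0f} cc")
print(f"Neanderthal midpoint: {nt:.0f} cc")
print(f"Difference: {cm - nt:.0f} cc, i.e. {100 * (cm / nt - 1):.0f}% larger")
```

Note that the lower end of the Cro-Magnon range falls within the Neanderthal range, so the comparison is one of central tendency rather than a categorical separation between the two populations.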

The differential evolution of the Cro-Magnon vs the Neanderthal frontal lobe (and angular gyrus) is also apparent as based on paleo-neurological and neuropsychological analysis of tool and hunting technology, artistic and symbolic development, and social organization in the Upper vs the Middle Paleolithic (Joseph 1993). As will be detailed below, the angular gyrus probably emerged and Broca's area probably became fully functional during the Middle to Upper Paleolithic transition; evolutionary developments which likely contributed to the demise of the Neanderthals.

Since Cro-Magnons shared the planet with Neanderthals during overlapping time periods, and coupled with the evidence reviewed below, it certainly seems reasonable to assume that the expansion and evolution of the frontal lobe and angular gyrus provided these people with an obvious competitive advantage, as they clearly dwarfed the Neanderthals in all aspects of cultural, intellectual, social, linguistic and technological achievement. Hence, endowed with a bigger brain and expanded frontal and IPL/angular gyral capacity, the cognitively, linguistically, technologically and intellectually superior "Cro-Magnons" and other "modern" Upper Paleolithic peoples probably engaged in widespread ethnic cleansing and exterminated the rather short (5 ft. 4 in.), slope-headed, heavily muscled Neanderthals, eradicating all but hybrids from the face of the Earth some 35,000 to 28,000 years ago.


It is important to note that there continues to be considerable debate as to Neanderthal cognitive and linguistic capability as well as their ultimate fate. From a "replacement" standpoint, it has been argued that Neanderthals died out and/or were killed off and thus replaced (see Nitecki & Nitecki, 2004). Others have argued that, despite the considerable differences between these peoples and the Cro-Magnon in height, stature, cranial development, and so on, Neanderthals suddenly "evolved" into anatomically modern humans over the course of approximately 5,000 years.

However, when considered not just from a cultural but from a neurological perspective, the evidence supports a "replacement" scenario. Although there was likely some interbreeding, as intermediate forms have been discovered, the Neanderthals were probably subject to widespread genocide and were exterminated as a species; a consequence of an inability to successfully compete due to inferior cultural, cognitive, and frontal and parietal lobe evolutionary development (Joseph, 1993).

Unfortunately, as there are no Neanderthal brains in existence, this competitive disadvantage must be inferred or deduced neuropsychologically from the archaeological record and from an examination of cranial configuration. As noted, it is apparent that the height and length of the frontal portion of the skull are greater in moderns and "early moderns" vs Neanderthals. Similarly, a gross photographic examination of various Neanderthal vs "modern" endocasts (taken from Smith, 1982: 673; and from Tilney, 1928: 919, and Figures 389-400, and 417-422), and the data provided by Tilney (1928: 923), strongly suggests that the Cro-Magnon as contrasted with the Neanderthal (superior and inferior-orbital) frontal lobe has in fact expanded (Joseph, 1993; MacLean, 2010).

By contrast, whereas the Neanderthal frontal lobe is not as well developed, the occipital and superior parietal areas are larger in length and breadth (Wolpoff, 2011). However, these posterior regions of the brain are concerned with visual analysis and positioning the body in space (see chapters 20, 22). As male and female Neanderthals spent a considerable amount of their time engaged in hunting activities (see below), scanning the environment for prey and running and throwing in visual space were more or less ongoing concerns. A large occipital and superior parietal lobe would reflect these activities.

Nevertheless, this is not to suggest that the "modern" occipital and superior parietal lobes have shrunk. Rather, it appears that the visual cortex was displaced into the medial regions of the brain --which is typical of the modern human cerebrum (chapter 22)-- whereas the superior parietal lobe became reorganized, as the environment acts on gene selection, thereby giving rise to the angular gyrus of the inferior parietal lobe.

Given the functional significance of the anterior (non-motor) frontal lobe (chapter 19), and its role in planning, foresight, and creative and intellectual activities, and the contribution of the IPL to tool making and language (see chapter 11 and below), the Neanderthals, with their inferior cognitive and cultural capabilities and poorly developed frontal and inferior parietal lobule, would have been no match for the superiorly endowed Cro-Magnon. Hence, they were likely "replaced" by another species of humanity which evolved in another region of the world--yet another example of multi-regional evolution. However, if considered from a "multi-regional" perspective, it could also be surmised that Neanderthals might have possessed the necessary genes that would have enabled them to someday "evolve" into modern humans; that is, if they had not been exterminated by the Cro-Magnon.


Brain size alone tells us little about function, whereas action and behavioral accomplishment speak volumes as to cerebral, cognitive, and intellectual functional integrity and capability. Moreover, it could certainly be argued that what appears to be impoverished Neanderthal frontal lobe development is in fact an illusion created by their massive brow ridge. But this does not appear to be the case.

Even if we forgo the temptation to make dubious claims based on the size and shape of various endocasts, and ignore the somewhat flattened, sloping Neanderthal forehead and the evolutionary record which indicates a comparative and considerable impoverishment in Middle Paleolithic cultural, tool making, and symbolic accomplishment, there is physical evidence which suggests Neanderthal brain development was characterized by significant limitations in plasticity and learning capability (e.g. chapter 28 for related discussion).

For example, based on Neanderthal pubic morphology and speculations on gestation length, it appears that their infants were born at an advanced stage of maturity and development (Rosenberg, 1988; Wolpoff, 1980). This would not be unusual, as the chimpanzee, our closest living primate relative, is born with a brain that is almost 50% of adult size, whereas newborn human brain volume and size is less than 25% of the adult's (Passingham, 1982; Sacher & Staffeldt, 1974).

If Neanderthals were born at an advanced state of development, then like other rapidly maturing creatures, their brains would have been more "hard wired". Although this might have enabled them to respond to environmental emergencies without being hindered by prolonged helplessness, their ability to modify behavior patterns over the course of development would have been significantly hampered, at least as compared to modern humans.

Moreover, given that the frontal and inferior parietal lobes take from seven to well over twenty years to functionally mature in present day humans, rapid brain development and maturity would preclude functional growth and elaboration in these regions. Consider, for example, that the more rapidly developing chimpanzee brain is completely lacking an angular gyrus (see Geschwind, 1966), whereas the majority of its frontal lobe is devoted to motor functions.

In fact, from an educational (enriched vs deprived) perspective alone, given that Neanderthals lived on average to age 40, compared with age 60 for Cro-Magnon, and the amount of time males and females spent engaged in hunting (see below), the Neanderthal young would not have had aged relatives or grandparents to care for and train them, and their mothers would not have had the time to educate them. As detailed in chapter 28, these culturally deprived conditions would have also exerted pronounced deleterious influences on the developing brain.

However, even if the Neanderthal brain did not mature more rapidly than modern humans, and we ignore early death rates and time spent hunting, it is still clear that they were not as well endowed neurologically and thus cognitively and intellectually, and were thus destined to lose the human race.


According to the interpretation of the data that will be presented here, variant populations of H. erectus evolved multi-regionally and occupied parts of Asia, the Middle East, Europe, and Africa. From some of these separate branches of humanity, archaic humans also evolved multi-regionally. However, as the environment acts on gene selection, differences in these varied environments in turn affected the pace and nature of evolutionary development, such that European archaics (i.e. Neanderthals) and those H. erectus living in Java failed to "evolve." In the case of the Neanderthals, due to the extreme arctic cold and unique environmental conditions that characterized Middle Paleolithic glacial Europe and the Middle East, they became geographically, genetically, and socially isolated, and appear to have evolved differently and more slowly than those archaic Homo sapiens who were living in Africa and Asia under different environmental and climatic conditions.

As has now been well established, the environment acts on gene selection, and environmental influences, particularly during infancy and childhood, can exert drastic and significant influences on neurochemistry, neural development, and functional and structural interconnections throughout the brain, including the neocortex and limbic system (see chapters 28, 29). Behavior, emotional expression, perception, intelligence, neuronal growth and size, dendritic arborization, and neurochemistry are correspondingly affected.

If generation after generation are exposed to similar adverse and intellectually limiting environmental influences, such as characterized the Neanderthal occupation of glacial Europe during the Middle Paleolithic, the impact would have been even more profound; affecting not just gene selection, but cerebral functional development across generations. Given that the racial-cultural groups who "evolved" in Africa, Asia, etc., over the last two million years were exposed to different environmental experiences and challenges for tens of thousands of years, it might be expected that their brains, and thus their cultural and cognitive capabilities, would have also "evolved" or at least developed somewhat differently.


These "racial" cerebral/cognitive evolutionary differences are evident by 100,000 years B.P. and include post-cranial anatomy, tool construction, hunting techniques, dental wear, the size and functional capacity of the frontal lobe, and the evolution of the IPL and angular gyrus. For example, in Africa, "archaic" H. sapiens with their characteristic sloping frontal cranium have been discovered buried in caves and rock shelters on the Klasies River of South Africa, and at Laetoli in Tanzania, at sites dated around 120,000 B.P. (Butzer, 1982; Grun et al. 2010; Rightmire, 1984). However, over the next 30,000 years the frontal cranium increases in size and "early modern" Middle Paleolithic H. sapiens begin to appear in Africa and the Middle East. In Asia, these evolutionary events appear to have been accelerated and to have taken place 10,000 years earlier.

Nevertheless, by 90,000 B.P. these "early modern" peoples begin to appear in the Middle East, as Homo sapiens with much more advanced features, including individuals with a rounded frontal cranium, have been unearthed from Qafzeh in the Middle East (Schwarcz et al. 1988). As these peoples apparently evolved from archaic H. sapiens, it is reasonable to assume the frontal lobes increased in size --based on an examination of the cultural record, as well as the frontal cranium.

However, these "early modern" H. sapiens were not only more advanced than "archaics" but were superior to the Neanderthals who were living in Europe as well as those living in and occupying nearby Middle Eastern sites (Mellars, 1989; Stringer, 1988), albeit at a much later date --which again is evidence for multi-regional evolution. Indeed, classic Neanderthal skulls, with their large brows, flattened foreheads and thus reduced frontal lobes, have been found in the Middle East (i.e. Kebara cave) at sites dated as recently as 60,000 B.P. (Bar-Yosef, 1989). Therefore the 90,000 year old "early modern" Qafzeh skulls are more advanced and show a greater degree of frontal development than the 60,000 year old Neanderthal skulls located nearby. Moreover, a Neanderthal skull from Saint-Cesaire in Western France, dated from between 33,000 to 35,000 B.P., also displays the characteristically reduced frontal lobe cranial features.

However, beginning as early as 75,000 B.P. (as is evident in Australia) "early modern" Paleolithic H. sapiens apparently had evolved into "modern" H. sapiens; individuals who sported fully modern cranial and post-cranial features and who began emerging throughout Asia, Africa, the Middle East, and who began invading Europe. For example, also unearthed in Western France are the 30,000 to 34,000 year old skeletal remains of a Cro-Magnon displaying the bulging frontal cranium that is characteristic of "modern" humans (see Mellars, 1989). This strongly suggests that Cro-Magnon and Neanderthals occupied adjacent (or even the same) territories at about the same time, or that a tremendous expansion suddenly occurred in frontal lobe capacity such that Neanderthals "evolved" into modern humans within 5,000 years.


As noted, the frontal lobes serve as the "Senior Executive" of the brain and personality, and together with the motor areas make up almost half of the cerebrum (see chapter 19). In contrast, the "archaic" H. sapiens and Neanderthal frontal lobe makes up about a third, as a larger proportion of the brain appears to be occipital lobe and visual cortex.

Because the modern frontal lobe is so extensive and highly developed, and as different frontal regions have evolved at different time periods, are organized differently, and have different neuroanatomical connections, they are concerned with different functions (chapter 19; see also Fuster 1997; Joseph, 2011a). For example, about one third of the frontal lobe, i.e. the motor areas, is concerned with initiating, planning, and controlling the movement of the body and fine motor functioning. It is this part of the "archaic" and Neanderthal frontal lobe that appears to be most extensively developed.

The "orbital frontal lobes" act to inhibit and control motivational and emotional impulses arising from within the limbic system (chapter 19). Via orbital frontal interconnections with the limbic system it is possible for emotions to be represented as ideas, and for ideas to trigger emotions. An examination of the "archaic" H. sapiens and Neanderthal orbital area (i.e. endocasts) suggests a relative paucity of development.

The more recently evolved anterior (pre-) frontal lobe and the lateral frontal convexity are highly important in regulating the transfer of information to the neocortex, and are involved in perceptual filtering, and exerting steering influences on the neocortex so as to direct attention and intellectual processes. That is, the anterior half of the frontal lobes act to mediate and coordinate information processing throughout the brain by continually sampling, monitoring, inhibiting and thus controlling and regulating perceptual, cognitive, and neocortical activity.

Moreover, social skills, planning skills, the formation of long range goals, the ability to marshal one's resources so as to achieve those goals, and the capacity to consider and anticipate the future, rather than living in the past, as well as develop alternative problem solving strategies and consider a multiple range of ideas simultaneously, are capacities clearly associated with frontal lobe functional integrity.

Hence, an individual is able to not only anticipate the future and the consequences of certain acts, but can formulate and plan secondary goals which depend on the completion of one's initially planned actions. Indeed, the capacity to decide to do something later, to remember and do it later, and to dream and fantasize and to visualize the future as pure possibility are made possible via the frontal lobes (chapter 19).

Conversely, when the frontal lobes have been damaged, or when the "prefrontal" lobes have been disconnected from the rest of the brain (such as following pre-frontal lobotomy), status seeking, social concern, foresight, initiative, and emotional, motivational, intellectual, conceptual, problem solving and organizational skills are negatively impacted. Frontal lobe damage or surgical disconnection of the pre-frontal lobe reduces one's ability to profit from experience, to anticipate consequences, or to learn from errors by modifying future behavior.

There is a reduction in creativity, dreaming, abstract reasoning, symbolic thinking, and the capacity to synthesize interrelated ideas into concepts or to grasp situations in their entirety. Interests of a social or intellectual nature are diminished, or, with severe damage, abolished. Planning skills, long range goal formation, concern for the future, clothing, personal adornments, symbolism, social status, the thoughts of others, and personal identity matter little. In addition, language functioning is typically disrupted, with damage to the left frontal lobe, i.e. Broca's area, resulting in severe reductions in speech output, and with right frontal injuries producing excessive and delusional speech (chapter 19).


Creatures such as sharks, teleosts, amphibians, and reptiles are completely lacking any semblance of what could be construed as a "frontal lobe," and behavior in these species is mediated by brainstem, limbic and striatal structures (MacLean 2010). Likewise, over the course of early (i.e. fetal and neonatal) human development, behavior is largely mediated by the brainstem with minimal neocortical participation (Joseph 2011c). In fact, as per the frontal lobe, human ontogeny appears to replicate phylogeny and parallels hominid development.

The human frontal lobes take over ten to 20 years to reach adult levels of maturity and development. Due to frontal lobe immaturity, and as the frontal lobes act to regulate and inhibit emotional, attentional, perceptual and behavioral activity, children, therefore, often behave in an emotional, impulsive, thoughtless manner, are easily distracted, and have difficulty inhibiting or guiding their behavior in respect to achieving long term goals. They tend to live in the "here and now" and are often ruled by their immediate desires and emotional impulses.

Similarly, following frontal lobe injury, or when the "prefrontal" lobes have been disconnected from the rest of the brain (such as following pre-frontal lobotomy), inhibitory function, concern for long term consequences, foresight, initiative, and emotional and organizational skills are negatively impacted, and the patient may become exceedingly childish, disinhibited, impulsive, and distractible (Fuster 1997; Joseph 1986a, 1988a, 2011a; Luria 1980; Passingham 1993; Stuss & Benson 1986). These deficits include a reduced ability to profit from experience, to anticipate consequences, or to learn from errors by modifying future behavior, such that the same action may be performed again and again (i.e. perseveration).

Therefore, although frontal lobe damage is not synonymous with frontal lobe immaturity, there are nevertheless obvious parallels, even in regard to perseveration. Children tend to repeatedly ask the same questions, prefer to be read the same stories again and again, and can derive considerable enjoyment from watching the same cartoons or movies hour after hour, day after day, and week after week.

Hence, inhibitory functions, and flexibility in thinking, creativity, complex concept formation, symbolism, logic, planning, foresight, insight, and organization skills are directly associated with the functional integrity of the frontal lobes. By contrast, although capable of thinking about "tomorrow" or events hours into the future, those with immature, poorly developed, or injured frontal lobes tend to live in the "here and the now".



The ability to consider problems from multiple perspectives, as well as the capacity to express personal identity or status via the creation and wearing of personal adornments, coincides with the evolution of modern Homo sapiens sapiens during the Upper Paleolithic. In contrast Neanderthals, archaics, and other peoples of the Middle Paleolithic, constructed and made essentially the same tools over and over again for perhaps 50,000 years, until around 35,000 B.P., with little variation or consideration of alternatives (Binford, 1982; Gowlett, 1984; Mellars, 1989) -a clear frontal lobe deficiency referred to as "perseveration."

Conversely, with the advent of the Upper Paleolithic and "modern" (Cro-Magnon) H. sapiens sapiens, the capacity to visualize multiple possibilities and to use natural contours and shapes in order to create not just tools but a variety of implements, decorations, and objects, came into being; items which could also be employed for multiple purposes (Leroi-Gourhan, 1964, 1982).

Neanderthals tended to create simple tools that served a single purpose (Hayden, 1993). There is also good evidence to indicate that the Neanderthals (but not the "early moderns") of the Middle Paleolithic tended to live in the "here and now," with little ability to think about or consider the distant future (Binford, 1973, 1982; Dennell, 2013; Mellars, 1989, 1996).

For example, in summing up some of the major behavioral differences between those of the Upper vs Middle Paleolithic, Binford (1982, p.178) argues that "early adaptations appear to be based on tactics which do not require much planning ahead, that is, beyond one or two days...and...the ability to anticipate events and conditions not yet experienced was not one of the strengths of" those of the Middle Paleolithic.

This is also apparent from a comparison of differences in hunting strategies employed by Neanderthals vs modern and even "early modern" humans. For example, Neanderthals, "early moderns" and "modern" Paleolithic H. sapiens sapiens all hunted large animals (Chase 1989; Lieberman & Shea 2004; Marean & Kim 1998; Mellars 1989, 1996; Rolland 1989; Shea 1998); which should not be surprising given that even male chimpanzees and baboons stalk, kill, dismember, and consume a variety of animals. However, just as there is evidence over the course of hominid evolution that large game animals increasingly served as human prey, there is also considerable data which indicate that the Cro-Magnon and other Upper Paleolithic H. sapiens, as well as "early moderns," utilized a far superior and more effective strategy as compared to the Neanderthals (Chase, 1989; Lieberman & Shea, 2004).

For example, Lieberman and Shea (2004, p. 317) argue that "early moderns" from Qafzeh utilized a pattern of "circular mobility" which involved moving residential camps in a recurrent annual cycle. They note that "such a strategy puts a premium on a group's ability to monitor the availability of distant resources," and that it increases their capacity to position themselves "near the highest-quality resources at the most advantageous time to collect them." In contrast, the Kebara Neanderthals who lived in the same region, albeit 30,000 years later, appear to have employed a much less efficient "radiating" strategy, which results in a "threefold increase in the frequency of hunting trips." Indeed, in many respects, the Neanderthals engaged in opportunistic hunting, such as stampeding animals over cliffs.

Similarly, Mellars (1989, p. 357) argues the Neanderthals appear to have employed a "pattern of exploitation that was in certain respects significantly less logistically organized than that practiced by many of the later, Upper Paleolithic communities in the same regions" (see also Chase, 1989). The same could be said when comparing them to the "early moderns" (Lieberman and Shea, 2004); which reflects directly on frontal lobe functioning; i.e. "early moderns" and modern humans are more greatly endowed.

These differences associated with frontal lobe functional integrity are also apparent from analyses of the faunal assemblages from Upper vs Middle Paleolithic sites throughout Europe. According to Chase (1989) and Mellars (1989: 357), as contrasted with the Neanderthals (Middle Paleolithic), the hunting strategies of the Cro-Magnon and other "moderns" "appear to reflect a high degree of planning, foresight, and logistic organization in the structuring of hunting activities, which involved a clear ability to predict and anticipate the movement of the reindeer herds."

As is well known, Upper Paleolithic (Cro-Magnon and other "modern" H. sapiens) social and organizational skills were exceedingly well developed and sophisticated. In contrast, Neanderthal social organization skills seem to have been exceedingly primitive, and they appear to have been somewhat socially isolated.

Neanderthals also tended to be quite mobile, occupying various sites for only short time periods, and their "home bases" tended to lack any form of sophisticated organization or structure (Mellars, 1989, 1996). Of course, population densities during the Middle Paleolithic were also quite low (Whallon, 2009), which in turn would have limited their opportunities for inter-social exchange; thus reducing selective pressures to evolve these particular skills -which are associated with the frontal lobes as well as the limbic system (see chapters 13, 19).

As based on these data and arguments, functionally the frontal lobes appear to be comparatively better developed in Cro-Magnon and even "early modern" H. sapiens. If "moderns" and Neanderthals occupied similar or adjacent territories, these differences in frontal lobe capacity would have conferred upon "moderns" a tremendous intellectual advantage over their less well endowed competitors.

Progressive frontal lobe development would have also led to refinements and vast improvements in social skills and social organization, symbolic and artistic expression, and hunting and related technology. These are all characteristics associated with the Upper Paleolithic and much less so with the Middle Paleolithic period of evolutionary development.

As neatly summed up by an ardent defender of Neanderthal cognitive capabilities (Hayden, 1993, p. 139), "as a rule, there is no evidence of private ownership or food storage, no evidence for the use of economic resources for status or political competition, no elaborate burials, no ornaments or other status display items, no skin garments requiring intensive labor to produce, no tools requiring high energy investments, no intensive regional exchange for rare items like sea shells or amber, no competition for labor to produce economic surpluses and no corporate art or labor intensive rituals in deep cave recesses to impress onlookers and help attract labor."

Neanderthal burial.

Neanderthals tended to live in the here and now with little concern for the future, social status, the thoughts of others, or long term consequences -typical features of reduced frontal lobe functioning (see chapter 19).


Neanderthals owe their name to their remains and fossils first being discovered in a deep gorge known as the Neanderthal (or Neander Valley), Germany, in 1856. However, their remains and those of other archaics can be found throughout Europe, Russia, and Iraq, as far south as the jungles of Zimbabwe, and as far east as China. Hence, archaic H. sapiens lived in a variety of (generally cold) climates, though it seems that they were eventually forced out of these other environs as the climate began to warm, and began to concentrate in the even colder climates of Paleolithic Europe. As noted, because of the arctic-like conditions and the scarcity of edible plants, Neanderthals took up a hunting and scavenging way of life and displayed a willingness to eat almost anything on four or two legs--including other Neanderthals.

In one cave (unearthed in 1939, and opened after 60,000 years), a deep chamber was discovered which housed a single skull which was surrounded by a ring of stones. The man had died via a blow to the temple, and like other such body-less skulls, the foramen magnum had been enlarged presumably so that the brains could be scooped out and possibly eaten. Presumably this skull had been preserved as a "trophy" or perhaps for ritualistic purposes, for Neanderthals had a propensity for killing cave bears and then preserving their heads in stone chests which would be placed at the entrance to their dwelling cave.

Although the Neanderthals were able to throw rocks and spears, in many ways their hunting techniques were little different from those of wolves. As such, rather than stalking their prey, they would ambush and stampede the animals so that those who were weakest could be most easily caught and dispatched --including other Neanderthals. In one site, dated to over 100,000 years B.P., Neanderthals decapitated eleven of their fellow Neanderthals, smashed their faces beyond recognition, and enlarged the base of each skull (the foramen magnum) so that the brains could be scooped out whole and presumably eaten. Even the skulls of children were treated in this fashion.

Neanderthals were not a very tidy people, as they lived amongst their own filth. Their living space was littered with food that was thrown to the floor of their cave dwellings, where it then began to rot, only to later become part of the fossilized record. In fact, they would throw the bones and carcasses of their fellow Neanderthals into the refuse pile. In one cave, a collection of over 20 Neanderthals was found mixed up with the remains of other animals and refuse. Hence, with the presumed exception of "friends," mates, children, and family, Neanderthals not only saw one another as a potential meal, but their humanness must typically have been completely disregarded.

Like modern humans, Neanderthals were a violent, murderous breed, as the remnants of their skeletons, preserved for so many eons, attest, for many of their fossils still betray the cruel ravages of deliberately and violently inflicted wounds. Interestingly, from an analysis of at least some of these skeletons which met violent ends, it appears that the wounds were inflicted to the left side of the body; the side that would be pierced by a right handed opponent.

Neanderthal Stone tools


Due perhaps to the changing seasons or the depletion of game, or perhaps because of what must have become an unbearable stench of rotting food and the decay of corpses mixed with the refuse, the Neanderthals tended to move their encampments from time to time within what appear to have been very large territories, extending perhaps more than 100 miles from border to border. Again, Neanderthals spent much of their time hunting due to the scarcity of vegetable matter and because they were not that skilled at catching prey.

Curiously, it appears that Neanderthal men outnumbered the females by about 10%. This is surprising for among present day humans, a greater number of females are born, and a greater number of females survive as males begin dying off at a faster rate from birth onward. Hence, it can be presumed that Neanderthals not only displayed a propensity for killing and eating one another, but they must have systematically engaged in female infanticide.

Nevertheless, despite their propensity for killing and eating one another, including the tendency to throw away and mix the remains with rotting food, Neanderthals were also capable of what might be called "love" --at least for family, as there is evidence that they took care of, or at least assisted, those who had been injured or maimed, enabling them to live many more years despite their grievous injuries. In fact, one Neanderthal male, whose skeleton shows he was about age 45 when he died, had been cared for over a number of years following profoundly crippling injuries. His right arm had atrophied, his lower arm and hand had apparently fallen off, and his left eye socket, his right shoulder, his collarbone and both legs were badly injured. Moreover, and as will be detailed below, this love appears to have continued beyond death, as Neanderthals sometimes buried their dead, even sprinkling the bodies with flowers.

Neanderthal burial. The body and skull were covered with flowers before burial.

Neanderthal skull/brain (top) compared to modern skull/brain (bottom).


Despite the poverty of frontal lobe development, archaic H. sapiens maintained a very primitive cultural and social life and apparently developed spiritual beliefs regarding the dead and their souls. Indeed, it is these early humans who apparently first practiced complex mortuary rites as they buried infants, children, and adults with grave offerings, and animal bones; a practice that would become increasingly complex with the evolution of "early moderns" over the ensuing 30,000 years (see chapter 9 for an extensive review of this and related literature).

Thus archaic H. sapiens and "early moderns" were carefully buried in Qafzeh, near Nazareth, and in the Mt. Carmel, Mugharet es-Skhul caves on the Israeli coast over 90,000 to 98,000 years ago (McCown 1937; Smirnov 1989; Trinkaus 1986). This includes a Qafzeh mother and child who were buried together, and another infant who was buried holding the antlers of a fallow deer across his chest. In a nearby and equally old site (i.e. Skhul), yet another was buried with the mandible of a boar held in his hands, whereas an adult had stone tools placed by his side (Belfer-Cohen & Hovers 2012; McCown 1937). Thus it is quite clear that humans have been burying and presumably weeping over their dead, and perhaps preparing them for a journey to the Hereafter, for over 100,000 years.

However, in this regard, archaic and modern humans were and are little different from the Neanderthals, who also engaged in complex religious rituals. For example, Neanderthals were buried in sleeping positions with the body flexed or lying on its side; surrounded by goat horns placed in a circle; with reindeer vertebrae, animal skins, stone tools, red ochre, and flowers; with large bovine bones above the head; with limestone blocks placed on top of the head and shoulders, or beneath the head like a pillow; and with heads severed, coupled with evidence of ritual decapitation, facial bone removal, and cannibalism. Moreover, Neanderthals apparently buried a bear at Regourdou, and at Drachenloch they buried stone "cists" containing bear skulls (Kurten 1976); hence, "the clan of the cave bear."

Of course, the fact that these Neanderthals were buried does not necessarily imply that they held a belief in God. Rather, what is indicated is that they had strong feelings for the deceased and were perhaps preparing them for a journey to the Hereafter or the land of dreams--hence the presence of stone tools, the sleeping position, and stone pillows. Throughout the ages, dreams have commonly been thought to be the primary medium in which gods and humans interact (Campbell 1988; Jung 1945, 1964). Dreams therefore served as a doorway, a portal of entry to the spirit world.

The possibility that these ancient humans believed the dead (or their souls) might return and cause harm is also suggested by the (admittedly controversial) evidence of ritual decapitation, and the placement of heavy stones upon the body. This suggests they believed in ghosts, souls, or spirits, and a continuation of "life" after death, and therefore took necessary precautions to prevent certain souls from being released from the body or returning to cause mischief.

Similarly, the buried animal skulls and bones imply a degree of ritual symbolism, which, when coupled with the grave offerings and positioning of the body, certainly seems to imply that the Neanderthals were capable of very intense emotions and feelings ranging from love to perhaps spiritual and superstitious awe (a function of the limbic system and inferior temporal lobe). When coupled with the evidence reviewed above (and below), there thus seems to be good reason to assume that Neanderthals maintained spiritual and mystical belief systems involving perhaps the transmigration of the soul, and the horrors, fears, and hopes that accompany such feelings and beliefs.

The fact that Middle Paleolithic peoples (archaics, "early moderns," Neanderthals) and those of the Upper Paleolithic ("moderns" and Cro-Magnon) all buried their dead with grave offerings indicates that all groups shared a certain commonality in regard to that region of the brain (in fact, the only region of the brain) that has been implicated in the generation of fear, love, intense emotions, and religious and spiritual beliefs: the limbic system (e.g., amygdala) and inferior temporal lobe (Bear 1979; d'Aquili & Newberg 1993; Gloor 1986, 2012; Horowitz et al. 1968; Jaynes 1976; Joseph 1982, 2012a, 2011d; MacLean 2010; Penfield & Perot 1963; Rolls 2012).

It could thus be concluded that the inferior temporal lobe (as well as the amygdala) may have been as developed in "archaics", Neanderthals, and "early moderns", as in Upper Paleolithic (Cro-Magnon and other "modern") H. sapiens. The evolution of these cerebral nuclei in turn made it possible not only to experience, but attribute spiritual or religious significance to certain actions and objects (see chapter 9).

For example, in addition to burial and mortuary practices, one of the first signs of exceedingly ancient religious symbolism is the discovery of an engraved "cross" that is perhaps between 60,000 and 100,000 years old. Regardless of time and culture--among the Aztecs, American Indians, Romans, Greeks, Africans, Cro-Magnons, and modern Christians alike--the cross consistently appears in a mystical context and/or is attributed tremendous cosmic or spiritual significance (Campbell, 1988; Jung 1964; Sitchin, 2010).

The entrance to the underground Upper Paleolithic cathedral: the Chauvet cave. Note the sign of the cross. Reprinted from Chauvet et al. (1996), Dawn of Art: The Chauvet Cave. Harry N. Abrams, New York.


Given that there are neurons which fire selectively in response to specific geometric visual shapes (e.g. faces, hands, triangles), and that these exist largely within the inferior temporal lobe (Desimone & Gross 1979; Gross et al. 1972; Richmond et al. 2013; Richmond et al. 1987), it could therefore be assumed that "cross" neurons, as well as "mystical/religious" feeling neurons (or neural networks), had probably evolved by 100,000 years ago.

Indeed, along the neocortical surface of the inferior temporal lobe (and within the amygdala) are dense neuronal fields containing neurons that fire selectively in response to visual images of faces, hands, eyes, and complex geometric shapes, including crosses (Gross et al. 1972; Richmond et al. 2013, 1987; Rolls 1984, 2012; Ursin & Kaada 1950). The ability to recognize faces, geometric shapes, and social-emotional nuances is dependent on these specialized temporal lobe and amygdala neurons that respond selectively to these stimuli (Gross et al. 1972; Richmond et al. 2013, 1987; Rolls 1984). However, since neurons in the amygdala and inferior temporal lobe are also multimodally responsive and subserve almost all aspects of emotion, including religious feeling, it is possible for faces and geometric symbols to become infused with emotional, mystical, and religious significance.

For example, heightened emotional activity within these nuclei could result in feelings of fear, foreboding, or religious awe, as well as activation of neural networks that respond selectively to crosses, such that emotional and spiritual significance is attributed to objects such as "crosses." Similar explanations could be offered in regard to the spiritual significance attributed to triangles (i.e. pyramids) and circles. In fact, along with crosses, triangles and circles were etched on Cro-Magnon cave walls over 30,000 years ago (Chauvet, et al., 1997; Leroi-Gourhan 1964). However, there is no similar evidence of symbolic or creative thought among Neanderthals--which in turn is a function of their poorly developed frontal lobes, a structure directly implicated in abstract and symbolic thinking.

Hence, although the evidence for Neanderthal burial practices is indicative of a well-developed temporal lobe, amygdala, and hippocampus, this cannot be taken as evidence for high levels of frontal lobe functioning, as the frontal lobes are not implicated in the generation of religious affective states.

On the other hand, the evolution of the temporal lobe, the superior temporal lobe in particular, may well have contributed to the functional evolution of the frontal lobes, Broca's area in particular. As will be detailed below, this would have been accomplished indirectly, following the evolution of the angular gyrus, a structure that is coextensive with Wernicke's area and which projects directly into the left lateral convexity of the frontal lobe. Once the angular gyrus evolved, Broca's area, which in hominoids serves as a hand manipulation area, became reorganized so as to manipulate the motor areas subserving human speech.

As will be detailed below, the evolution of the angular gyrus, a functional Broca's area, and thus language--and the lack of such capacities among archaics, H. erectus, H. habilis, and Australopithecus--is evident based on an analysis of the evolution of artistry, tool technology, and handedness.


Language, gesturing, and fine motor control are dominated by the left cerebral hemisphere and the right hand in approximately 90% of present-day humans. Moreover, as has been well established, manual activity, right handedness, and expressive speech are directly related not only neuroanatomically but behaviorally, which is why speech is often accompanied by hand gestures.

Indeed, the frontal motor "hand area" is adjacent to Broca's speech area; an association which in part explains and accounts for the disruptive effects of speech production on the ability to simultaneously sequence, position, or maintain stabilization of the hands (Hicks 1975; Kimura & Archibald 1974; Kinsbourne & Cook 1971); i.e. dual neural/behavioral activation of the hand/speech areas results in competitive interference. Although there are exceptions (Joseph 1986b), handedness and language are intimately linked. However, this association only slowly evolved, and required the activation of the hands coupled with vocalization, and the reorganization of Broca's area from a hand area to a speech area, in order to forge this linkage.

It has been suggested that among lower primates the right hand is more frequently used for postural support, i.e. grasping, hanging, and holding tree branches and other objects (MacNeilage, 1993), whereas the left hand tends to be employed for gross manual manipulation tasks. Among apes, however, the forelimbs are less involved in postural support, which in turn frees the right hand for other purposes.

MacNeilage (1993) argues that in "very general terms" handedness may have shifted from left to right over the course of primate evolution, such that prosimians tend to be left handed when engaged in acts requiring manual prehension, whereas this left-sided preference is weaker in monkeys except in regard to tasks with strong visual-spatial demands. Hence, according to MacNeilage (1993), right hand preference in monkeys and apes begins to be seen in tasks involving manipulation and auditory discrimination.

However, contrary to the claims of MacNeilage (1993), McGrew and Merchant (2012, p. 118) have reported that chimpanzees show "no sign of a right-hand bias in object manipulation during tool use... if anything, the pooled data suggest a leftward tendency. Conversely, for the visually guided movement of reaching, the tendency seems to be to the right and not the left." Indeed, it appears that apes, monkeys and prosimians show little or no population bias toward preferentially using the right or left hand (see also Lehman, 1993).

Although there is no convincing evidence for handedness among apes or monkeys, trends in the lateralization of vocalization-perception are evident (Hauser, 1997; Hauser & Anderson, 2004; Peterson & Jusczyk, 1984). For example, as based on reaction time, Japanese macaques respond to vocalizations presented to the right ear/left hemisphere significantly quicker than to those presented to the left ear/right hemisphere (Peterson et al. 1978). In addition, destruction of the left temporal lobe in non-human primates significantly disrupts (but does not abolish) performance on auditory discrimination tasks (Heffner & Heffner 1984, 2010; Hupfer et al., 1977; Schwarz & Tomlinson, 2010). Among humans, similar damage results in a profound receptive aphasia and a complete loss of the ability to comprehend language; i.e. Wernicke's receptive aphasia.

Although rudimentary trends in regard to hemispheric lateralization for "language" perception may have emerged during the course of hominoid evolution (as is apparent in modern day non-human primates), handedness (and left hemisphere language representation) was only gradually acquired over the span of hominid evolution. Hence, around two million years ago perhaps up to 60-70% of Australopithecines favored the right extremity (Dart 1953). By 1.6 million years ago, perhaps as many as 70-80% of H. habilis had developed similar right handed inclinations (Toth 2013).

However, it was not until about 100,000 years ago that perhaps 80% to 90% of archaic humans had become right handed (Cornford 1986). Indeed, this is evident from the stone knife marks left on Neanderthal teeth, and from the left-sided wounds and injuries found on their remains--which suggest a right handed attacker.

These findings suggest that the left hemisphere became slowly organized over the course of evolution so as to mediate control over the hand. Once hand dominance was established, the left hemisphere also became selectively organized for performing temporal sequential tasks involving the hands, such as tool making.

As will be detailed below, the parietal lobe is considered a "lobe of the hand" (Critchley 1953; Hyvarinen 2012; Kaas 2013; Lynch 1980; Mountcastle et al. 1975, 1980), whereas the angular gyrus/IPL contains the motor engrams responsible for the programming of complex temporal and sequential hand and finger movements, including those involved in tool making and utilization. Hence, given the above, it can be assumed that the angular gyrus probably slowly evolved over the course of the last two million years, in parallel with the establishment of handedness and hand-related activities as reflected in the evolution of tool technology. Given these trends, and the association between right handedness, the left hemisphere, and language, it can also be assumed that the neural substrate not only for preferential hand use and tool making, but for modern human speech production and perception, also gradually arose over the course of the last two million years.


Based on a considerable body of evidence, it has been theorized that right handedness, at least in humans, is a direct consequence of the comparatively earlier maturation of the left frontal somatomotor areas, and of the fact that during fetal brain development the left corticospinal tract descends into the brainstem and crosses over at the medullary pyramids, thereby establishing brainstem and spinal-motor interconnections in advance of the right hemisphere (Joseph 1982). The corticospinal (pyramidal) tract originates in the somatomotor areas of the frontal and parietal lobes (see chapters 19, 20), and these areas program hand use and fine motor control. Therefore, among the majority of "modern" humans, because of this earlier maturation, it is the right hand/left hemisphere which becomes dominant for gesturing, grasping, tool making, sewing, and communicating. Hence, the majority of humans are more likely to spontaneously activate the right rather than the left half of the body when gesturing or speaking. The right hand, in fact, appears to serve as a kind of motor extension of language and thought, insofar as it often accompanies or acts at the behest of linguistic impulses and even emphasizes certain aspects of speech.

Motor functioning and hand use are dependent on sensory (e.g. proprioceptive, kinesthetic) and visual feedback, which in turn is provided by the parietal lobe (chapter 20). For example, were it not for the sensory feedback from the muscles and joints (information which is transmitted to the parietal lobes and then relayed to the motor areas), movement would become clumsy and uncoordinated; i.e. a person would not know where their limbs were in space and in relation to one another. Since the parietal lobes and the frontal motor areas are richly interconnected, they serve in many ways as a single neocortical unit, i.e. sensorimotor cortex (Luria, 1980), which thus programs and guides motor behaviors.

The parietal lobes (areas 5 and 7) contain neurons which guide hand movements in visual space, including neurons which fire when reaching for, grasping, or manipulating various objects (Critchley, 1953; Hyvarinen, 1982; Kaas, 1993; Lynch, 1980; Mountcastle et al., 1975, 1980). Moreover, in contrast to the visual areas in the inferior temporal and occipital lobes, the parietal lobe responds to visual input from the periphery and lower visual fields--the regions in which the arms, hands, and feet are most likely to come into view (Joseph 1993). Therefore, when one is engaged in activities involving the hands, the parietal lobes become activated not just due to hand movement, but due to visual feedback: the parietal lobes are watching and visually guiding the hands, such as when gesturing, manipulating some object, or constructing a tool.

In contrast, the temporal and occipital lobes are more concerned with identifying and observing whatever target the hands may be aiming toward or manipulating. In this manner the parietal and temporal lobes interact in regard to aiming, throwing, and identifying relevant targets. For example, when throwing, the hand and upper arm often leave the lower visual field and enter the upper visual field, which is the domain of the temporal lobe, which in turn is observing the target.

However, because they receive visual and somatomotor feedback, the parietal lobes are also uniquely situated to learn and memorize hand-movement related behaviors, such as gesturing, manipulating some object, or constructing a tool (Joseph 1993, 2011e). And, because a right handed individual is more likely to use the right hand for tool making and the left hand for holding the tool, it is the left parietal lobe (which monitors the right lower visual field and controls the right hand) which would be more significantly affected by, and involved in, temporal-sequential manipulative activities.

Hence, as the environment acts on gene selection, over the course of human evolution the parietal (and temporal-occipital) lobes have expanded, increased in dendritic density, and formed intimate and overlapping interconnections, and thus new capacities (e.g. Juliano, Eslin, & Tommerdahl 2004; Ramachandran 1993); all of which have affected the adjoining sensory association areas, including Wernicke's area and the somato-motor hand areas. In consequence, corresponding to the evolution of handedness and tool making capabilities, the left (and right) superior parietal lobule as well as the superior posterior temporal lobe greatly expanded in size, thereby creating a large portion of the multi-modal neocortical tissue that would give rise to the angular gyrus, which is located at the junction of the parietal, occipital, and temporal lobes.

Therefore, once the preference for right hand motor control became sufficiently pronounced, the left IPL/angular gyrus (and frontal lobe) continued to evolve, and proficiency in temporal sequential and fine motor control increased, as did language-related capabilities--as Wernicke's area and the angular gyrus are coextensive. Tool making slowly evolved beyond the Oldowan/Acheulean traditions, and increasingly complex and sophisticated weapons and hunting implements were invented.


The angular gyrus of the left hemisphere contains the "motor engrams" necessary for the performance of complex temporal sequential movements, including those involved in tool use and manufacture (see chapters 11, 20). In this regard, the angular gyrus/IPL is also a "lobe of the hand." Hence, with the evolution of the angular gyrus, the ability to use the fingers and the hand, particularly the right hand, in tasks requiring a series of sequential steps, including counting, also evolved. In consequence, if the left cerebral angular gyrus/IPL were severely injured, mathematical ability would be abolished, as would the capacity to perform tasks involving temporal-sequential movements; a condition referred to as apraxia.

Consider the simple steps necessary to make a pot of instant coffee. Obtaining the coffee container, filling the pot and heating the water, filling a cup with coffee grounds, and then pouring the hot water into the cup, and so on, are just a few of the many steps that must be performed in a highly interrelated sequence. Take just one of these steps and perform it out of order, and one destroys the overall integrity of what one was attempting to accomplish; e.g. cold water in the cup, heat the empty pot, drink the water, pour in the coffee grounds, etc. This is exactly what occurs with apraxia.

With IPL/angular gyrus injuries, not only would the individual be unable to make a pot of coffee or retrieve a cigarette and then strike a match in the correct sequence, but they might be unable to put on their clothes--much less sew them together. Similarly, individuals lacking an angular gyrus/IPL, or those with injuries to this region, would be unable to fashion complex tools--much less utilize them in a complex temporal sequence.

Paleolithic Goddess: Venus de Brassempouy.

Moreover, because related sequential skills, such as math and counting abilities, are first acquired in relation to the hand, e.g., counting with the fingers and by pointing (chapter 11), damage to the IPL/angular gyrus can result not only in apraxia and a loss of mathematical ability (acalculia), but in an inability to recognize one's fingers (finger agnosia). When coupled with anomia and dyslexia (abnormalities also associated with IPL/angular gyral injuries), these disturbances are collectively referred to as Gerstmann's syndrome.


Non-human primates lack handedness and complex tool making or tool using capabilities. This is because the ability to make or utilize complex tools is dependent on the IPL/angular gyrus and motor areas of the frontal lobe. Hominoids lack a human-like Broca's area, as well as an angular gyrus, though they are endowed with an inferior-superior parietal lobe (areas 7b and 7ip) which, as noted, contains neurons that guide hand movements, including grasping and manipulating. Hence, although they lack an angular gyrus, hominoids such as chimpanzees make and use simple tools fashioned from rocks, leaves, and sticks (Boesch & Tomasello, 1998; Goodall 1986, 2010; McGrew & Marchant 2012). Likewise, they can produce a variety of vocalizations which have specific meanings. Hence, although H. habilis and Australopithecus were using rocks as simple stone tools some 2.4 to 2.6 million years ago, this does not indicate that they had evolved an angular gyrus or a human-like Broca's expressive speech area.

Moreover, although perhaps as many as 60% of Australopithecines, 70% of H. habilis, and at least 80% of archaic H. sapiens may have been right handed (Cornford, 1986; Dart, 1953; Toth, 2013), given the rather unvarying and still simplistic Oldowan/Acheulean/Mousterian stone tool technologies associated with these groups, there is still no evidence that these species had evolved an angular gyrus--though certainly trends in this direction are evident. Nevertheless, it was not until the Upper Paleolithic and the appearance of anatomically "modern" Paleolithic humans, including the Cro-Magnon, that tool making literally became an art, evolving beyond the use of rock and stone as complex multifaceted features were incorporated in tool construction. It is at this stage of evolutionary development that we have clear functional and neuropsychological evidence for the evolution of the angular gyrus of the IPL.

However, this is not to imply that this evolutionary acquisition occurred independently of cultural/environmental influences, and/or that all humans became similarly endowed at about the same time. That is, it is unlikely that cerebral "evolution" has ceased (see chapter 4). Nor does it appear that all humans are equally endowed with angular gyral and, in particular, frontal neocortical tissue or capabilities. Indeed, from a functional perspective, and based on differences in impulse control, long term planning skills, and so on, it could be argued that some racial groups are probably differentially endowed, functionally and perhaps neuroanatomically--a consequence of racial-evolutionary differences in environmental, climatic, geological, and cultural influences across time and generations.

Nevertheless, it was the evolution of the Cro-Magnon, the angular gyrus, and expansions in the frontal lobe which provided the neurological foundations for tool design and construction, the ability to sew and even wear clothes, and the capacity to create art and pictorial language in the form of drawing, painting, sculpting, and engraving. It is the evolution of these tissues which enabled human beings not only to create visual symbols but to talk about them, and to create verbal symbols in the form of written language (Joseph 1993, 2011e; Strub & Geschwind, 2013). The parietal lobes, in fact, not only guide and observe hand movements, but comprehend gestures, including those that produce written symbols, and thus made possible the evolution of reading and writing.


Paleolithic Spearthrower


The Right and Left Parietal Lobe

Although humans possess two parietal lobes, they do not perform exactly the same functions. The left parietal lobe is more concerned with temporal sequences, grammar, gestural communication, and language, including writing, spelling, and the production of signs such as in American Sign Language (Joseph, 1993; Kimura, 1993; Poizner et al. 2010; Strub & Geschwind, 2013).

The right parietal lobe is more concerned with guiding the body as it moves through space, and with the analysis, manipulation, and depiction of spatial relations, such as through drawing, carpentry, masonry, throwing, aiming, and determining spatial relationships, as well as painting and art, or even sewing together or putting on one's clothes (see chapters 10, 11, 20). If the right parietal lobule were injured, an individual might suffer from constructional, visual-spatial, and dressing apraxia (see chapter 10).

As such, the two parietal lobes are concerned with different aspects of language, motor control, visual-spatial analysis, and the art of gestural communication. Indeed, it was the evolution of these two differently functioning parietal tissues (as well as Broca's area in the left frontal lobe), and their subsequent harmonious interaction, that made possible complex, grammatically correct, written and spoken language, as well as visual artistry and the ability to draw, and to run, aim, and throw with accuracy (e.g. move through visual space and dispatch a distant animal via bow or spear).


As noted, apes and monkeys do not possess an angular gyrus (Geschwind, 1966). Rather (based on an analysis of tool technology), the angular gyrus of the IPL may not have begun to truly evolve until about 100,000 B.P., and then only among a select group of perhaps early modern humans living in central Africa (e.g. modern day Zaire) and in selected localities elsewhere.

For example, double-row barbed spears and bone points have been discovered in eastern Zaire from sites dated to approximately 75,000 to 90,000 B.P (Brooks et al. 1995; Yellen et al. 1995). These bone tools not only required a complex series of steps for their construction but their manufacture indicates these peoples were able to recognize that bone could serve as a workable plastic medium that could be employed for a variety of purposes.

In this respect the Neanderthals certainly lagged behind their "early modern" and "modern" counterparts in inferior parietal development, as similar tools of equal complexity do not appear in European sites until about 35,000 B.P.

As noted, experience influences brain structure and functional organization. Neanderthals were not only more limited in regard to environmental opportunity, but they possessed relatively short distal limbs, which in turn restricted arm movement. Coupled with limitations in the overall configuration of their glenohumeral joint surfaces, this reduced their ability to accurately throw rocks or other hunting implements (Churchill & Trinkaus, 2010). These experiential limitations would have negatively impacted neurological development.

In contrast, Cro-Magnon and other "modern" humans of the Upper Paleolithic possessed a much rounder humerus, which is indicative of more frequent and efficient throwing. Hence, although Neanderthals utilized throwing spears (Anderson-Gerfaud, 1989; Thieme, 1997), these same weapons would have been wielded with considerably more efficiency and accuracy, and at a greater distance, in the hands of "modern" H. sapiens, including Cro-Magnon (Churchill & Trinkaus, 2010).

In addition, due to the relatively large pubic length in the Neanderthal population (Rosenberg, 1988; Trinkaus, 1984), they would not have been able to run as fast or move as rapidly or efficiently through space as compared to Cro-Magnons. These restrictions in functional flexibility and thus sensory feedback would have placed the Neanderthals at an additional disadvantage.

In that the IPL of the right hemisphere is concerned with the hand and the movement of the body in space, and thus the analysis of body-spatial relationships, it could be inferred that these post-cranial limitations on Neanderthal aiming, throwing, and running, also directly reflect on the functional capacity and evolutionary development of this region of the cerebrum. In that experience can in turn influence brain growth and development as well as gene selection, and because the brain functions in accordance with the "use it or lose it" principle, the Neanderthal parietal lobe would have been negatively impacted due to diminished experiential opportunity (chapter 28).

As compared to "modern" and Cro-Magnon peoples, these physical and experiential restrictions would have resulted in an accelerated rate of neuronal "pruning" and cellular atrophy, and related perceptual, behavioral, and cognitive deficiencies among the Neanderthal peoples, thus placing them at a significant disadvantage.


Although there is some evidence that some Middle Paleolithic groups made simple bone tools (Hayden, 2004), particularly those who lived in Zaire (Brooks et al. 1995; Yellen et al. 1995), the Middle/Upper Paleolithic transition is characterized by the creation of complex bone tools, the appearance of the sewing needle, and the creation of personal adornments: carefully shaped beads of bone, ivory, and animal teeth; animal engravings; perforated shells which were presumably traded or transported over long distances; and statuettes, drawings, and paintings of animal and female figures; i.e. creative and utilitarian endeavors that are mediated by the angular gyrus/IPL of the left and right cerebral hemispheres. The fact that art and tool making became exceedingly more complex during the Upper Paleolithic, and with the appearance of "modern" humans, can be directly attributed to the expansion of the left and right inferior parietal lobes, and the left frontal motor areas controlling fine hand movement.


However, this is not to suggest that Neanderthals were completely devoid of tool making capabilities, for their tool kit, albeit consisting largely of rocks, included "blade" technologies and hafted points (see Hayden, 2004; Lieberman & Shea, 2004; Shea, 1989). Even so, other than the fact that considerable effort was put into their construction, the same tools were made the same way, over and over again, for perhaps 200,000 years, until around 35,000 B.P.

Moreover, unlike the 90,000 year old blades discovered in eastern Zaire, and particularly the tools associated with the Cro-Magnon peoples, Middle Paleolithic Neanderthal tools were predominantly "use-specific" and thus served, for the most part, a unidimensional purpose (Hayden, 2004). In fact, similar to children, the Neanderthals tended to use their mouths for manipulative tasks (Molnar, 1972; Trinkaus, 2012).

Specifically, Neanderthals would use their mouths for grasping and holding objects, as well as for chewing items such as hides in order to soften them and make them more pliable. Indeed, as the environment acts on gene selection, this may explain the large, long, and wide outthrust shape of their faces, their very large mouths and huge jaws, and the very large front teeth they possessed.

Like carnivorous animals, these people would also use their mouths to tear food apart, albeit with the assistance of a stone knife. For example, a Neanderthal would stuff meat into the mouth and then take a knife and cut off whatever protruded, holding the end of the meat with one hand and cutting with the other. Sometimes, however, they would actually run the knife across their own teeth, leaving a number of scratch marks. The direction of angulation of these scratches, i.e. the direction of wear on their teeth, indicates that the Neanderthals were primarily right handed.

Although the Neanderthals used stone "knives," it is not until the rise of the Upper Paleolithic that highly complex blade technologies and completely new, diverse, and multifaceted tool (Aurignacian) technologies became the norm. Moreover, with the Upper Paleolithic peoples came the capacity to impose form, to visualize multiple possibilities, and to use natural contours and shapes in order to create not just tools but a variety of implements, decorations, and objects, including complex representational and mobile art, complex scaffolding to support cave artists, and the sewing needle (Leroi-Gourhan, 1964, 1982)--all of which require an angular gyrus/IPL and a motor cortex capable of controlling fine hand and finger movements, not only so that these implements could be fashioned but so that they could be employed correctly.

In contrast, there is no evidence of a sewing needle or complex tool construction among Neanderthal populations during the Middle Paleolithic, and the capacity to visualize possibilities in regard to shape and form was comparatively absent as well (Binford, 1982; Mellars, 1989, 1996); that is, until the very end of their reign and only among a small subpopulation (see below). The failure to invent complex tools and especially the sewing needle certainly suggests an inability to utilize such implements, which in turn speaks volumes regarding their level of angular gyrus/IPL development. Neanderthals utilized only the most simplistic of methods for hide or buckskin preparation, and then only during the very end of the Middle Paleolithic (Anderson-Gerand, 1989). As noted, the inability to make or even correctly put on clothing is directly associated with IPL abnormalities.


There is now some highly controversial evidence of what appears to be an intermediate stage of tool construction that may have been practiced by "late" Neanderthals during the last stages of the Middle to Upper Paleolithic transition, i.e. the Chatelperronian, including bone tools, blades, and ornaments which are in some respects similar to implements associated with the tradition referred to as Aurignacian. Moreover, the skeletal remains of Neanderthals, and of those intermediate between Neanderthals and "modern" H. sapiens, have been found in association with these Aurignacian-like tools and ornaments in at least two sites, at Arcy (Hublin, Spoor, Braun, Zonneveld, & Condemi 1996) and Saint-Césaire, western France (Allsworth-Jones 1989).

Hence the controversy, as the Aurignacian tool making industries are typically attributed to anatomically modern Upper Paleolithic peoples (Gambier 1989; Stringer 2014). In fact, despite the presence of these intermediate forms, there is some debate as to the identity of those responsible for the Chatelperronian tool industries, which overlapped with and may have predated the Aurignacian. That is, some scientists dispute the notion that Neanderthals could have created the Chatelperronian implements and vehemently reject any possibility that they could have created Aurignacian tools.

By contrast, d'Errico et al. (1998, p. 2) argue that "Neanderthals may have been the producers of all the pre-Aurignacian Upper Paleolithic technocomplexes of Western and Central Europe." The implications are important, for this implies that by 34,000 B.P. "late" Neanderthals may have acquired the skills to design and construct simple bone tools and blade technologies, as well as personal ornaments including beads, awls, pins, and rings (Bahn 1998; d'Errico et al. 1998). This also raises the possibility that they may have begun to develop Aurignacian tools, as both tool making industries overlapped and existed simultaneously for a period of several centuries in several regions of northern Spain and western France.

Mellars (1989, 1998), however, argues that the Chatelperronian and Aurignacian industries are associated with two different populations, i.e. Neanderthals and Cro-Magnons respectively. In support of his position, Mellars (1989, 1998) points out that the Chatelperronian disappeared from France and Spain about 30,000 years ago and was replaced by Aurignacian technologies--events which coincide with the extinction of Neanderthals and their replacement by Cro-Magnon.

Moreover, because they overlapped, it has been argued that the Chatelperronian may have been acquired in trade and cultural exchange (Allsworth-Jones 1989) or even mimicry (White 1993; see also Mellars 1989, 1998) since Neanderthals and Upper Paleolithic peoples simultaneously shared the planet for up to 10,000 or more years (Harold 1989; Hublin et al. 1996; Karavanic et al. 1998; Mellars 1989).

Cultural exchange, though possible, seems unlikely as the Chatelperronian tool industries differ from the Aurignacian (d'Errico et al. 1998). Hence, it would appear that by 34,000 B.P. the "late" Neanderthals of Arcy and Saint-Césaire either independently created these items, or that the differences between the Chatelperronian and Aurignacian tool industries are a natural consequence of the alterations which would be induced in the process of mimicry.

Nevertheless, even if we accept the mimicry hypothesis, this would still indicate that the "late" Neanderthals of Arcy and Saint-Césaire possessed the necessary technological skills for creating simplistic bone and ivory implements, which would suggest they may have evolved an intermediate angular gyrus. Indeed, trends in this techno-social-cultural direction are also suggested by the 400,000 year old wooden "spears" which were found in Schoningen (Thieme, 1997), as well as by evidence of Middle Paleolithic Neanderthal burial practices and the tendency of some groups to paint their cave dwellings with red ochre.

On the other hand, there is no evidence for technological accomplishments similar to those of Arcy or Saint-Césaire among other "late" Neanderthal populations, such as those living in Spain as late as 30,000 to 28,000 B.P. (Mellars 1998). In this regard, although at least some populations of "late" Neanderthals may have possessed the necessary skills for constructing simple bone implements, not all Neanderthals employed these skills, which again suggests that mimicry may have been the impetus for these creations.

Again, however, regardless of which position we accept, from a paleoneurological perspective the evidence and arguments briefly reviewed above can be interpreted to indicate that at least some populations of "late" Neanderthals may have evolved a rudimentary angular gyrus which enabled them to imitate or create simple bone and ivory implements. That is, the Chatelperronian technology would seem to represent an intermediate stage of frontal and angular gyrus development, and thus continuity in archaic to "early modern" to "modern" human cerebral development. This progression is also evident from an examination of the paleoarchaeological evidence from Africa and the Middle East, and is also consistent with the multi-regional view of evolution in which different subgroups emerge at different times depending on changing environmental conditions which in turn induce a step-wise progression in metamorphosis.

For example, in addition to the 90,000 year old barbed points from eastern Zaire (Brooks et al. 1995; Yellen et al. 1995), carefully shaped, incised, and notched bones have been recovered from sites near the Klasies River Mouth (Singer & Wymer, 1982), and it is from South African sites that the remains of "early modern" H. sapiens (a transitional species) have been discovered (Rightmire, 1984). In addition, a perforated Conus shell buried with an archaic H. sapiens infant in Border Cave on the Swaziland border, as well as notched rib fragments and 7 split-tusk "daggers," have also been recovered (Volman, 1984). These sites have been dated between 90,000 and 98,000 B.P. However, nothing similar has yet been recovered from any Middle Paleolithic (Neanderthal) sites in Europe.

In the Middle East, further advances in tool and blade technology, including the development of decorated personal ornaments, have been discovered in sites associated with "modern" and "early modern" H. sapiens, dated between 40,000 and 47,000 B.P. (Clark & Lindly, 1989; Marks, 1989). However, it is not until around 35,000 to 38,000 B.P. that tool making and related artistic and symbolic production appears to explode (White, 1989), with even further advances occurring from 20,000-15,000 years B.P. and of course up to the present.

Hence, there is evidence of a progression in mental and cognitive capability that is reflected in tool and artistic technology, and which is also associated with progressive evolutionary advances in the inferior parietal as well as frontal lobes. If that is the case, then "late" Neanderthals may well have evolved the capacity to vocalize a word poor, somewhat aggrammatical form of language.

However, in that Upper Paleolithic tool technology is also characterized by complex and highly standardized blade and tool technologies (Jelinek, 1989; Leroi-Gourhan, 1964), and given the precision as well as the specific temporal-sequential steps that characterized tool construction (Leroi-Gourhan, 1964), it thus appears that the basic motoric neural templates that subserve not just temporal-sequential tool construction but language expression do not become fully evident until this time period. That is, although there is evidence for a step-wise progression in frontal (including Broca's area) and angular gyrus evolution, these structures do not appear to have become fully functionally evolved until the emergence of the Cro-Magnon people. Indeed, as will be detailed below, once the angular gyrus had evolved (as represented by the emergence of the Aurignacian tool technology), this in turn induced a functional reorganization of Broca's area, which came to subserve expressive speech.

Cro-Magnon Paleolithic burial in sleeping position.


The environment acts on gene selection, which in turn acts on the environment, which acts on gene selection. Thus a complex feedback system exists that links group and individual experience with gene expression. Hence, trends become pronounced tendencies, and trends and tendencies eventually become characterological traits in successive generations and species when the genome is repeatedly influenced by similar, albeit more complex, environmental input. Thus we see that over the course of human evolution right handedness and language lateralization became increasingly pronounced.

Hence, cross-generational environmental and experiential factors are key to understanding the evolution of the neurological foundation for language, including the angular gyrus and a functional Broca's area. Although these factors are related to the hand, more importantly they are a function of the way in which the hand has been employed and for what tasks. Clearly, as modern human language is characterized not only by a complex and varied vocabulary but by its temporal-sequential grammatical structure, the key to the evolution of language and its underlying neuroanatomical foundation is those tasks which are temporal sequential and which may have been coupled with vocalizing; i.e. food gathering and domestic tool construction.

The basic skills necessary in the gathering of vegetables, fruits, seeds, berries and the digging of roots include the ability to engage in fine and rapid, often bilateral, temporal-sequential physical maneuvers with the arms, hands, and particularly the fingers. For almost all of human history, human females have been the traditional gatherers (Dahlberg, 1981; Joseph, 2011e; Martin & Voorhies, 1975; Murdock & Provost, 1981; Zilman, 1981).

However, in grubbing for roots and bulbs, the gatherer would need a digging stick which they probably had to periodically sharpen by using stone flakes. In fact, the first tools created by Hominids were probably digging sticks employed by female gatherers (Joseph, 2011e).

The gatherer would also carry a hammerstone for cracking nuts and for grinding the various produce collected during the day. In addition to food preparation, clothes had to be fashioned out of hides, and these too are tasks associated with women (Joseph, 2011e; Niethammer, 1977).

Thus her duties probably included cleaning the hides via the use of a scraper, drying and curing the skin over the smoke of a fire, and then using a knife or cutter to make the general desired shape, and then a punch to make holes through which leather straps or vine can be passed so as to create a garment that could keep out the cold. By Cro-Magnon times they were weaving and using a needle to sew garments together.

Thus, in addition to gathering, women made tremendous use of tools and may have been the first tool makers. Indeed, in current and recent hunting and gathering groups (e.g. the American Indians), these and related "domestic" tasks are almost exclusively associated with "women's" work (Joseph, 2011e; Niethammer, 1977). With the exception of hunting implements (probably fashioned exclusively by males), it is females and not males who make and use tools (Joseph, 2011e; Niethammer, 1977). Similarly, among apes, female chimps generally use tools much more frequently than males (see McGrew & Marchant 2012).

In tool making, technique is essential if the same implement is to be fashioned again and again (Bradshaw & Rogers 2012; Greenfield 2012). The basic skills necessary in tool construction include the ability to engage in fine and rapid, temporal-sequential maneuvers with the arms, hands, and particularly the fingers. Certain tools are made in a step-wise, sequential manner, with specific movements, and with a certain degree of muscular power and considerable precision. To make and utilize tools involving a precision grip requires that the manufacturer not only have a hand capable of such feats (Bradshaw & Rogers 2012; Greenfield 2012; Marzke 1997; Toth 2013), but a brain that can control this hand and which can use foresight and planning in order to carry out all the steps involved in the implement's manufacture.

Due to selective pressures and the survival and breeding of those who were successful at these activities, the left half of the brain, which controls the right hand, became increasingly adapted for fine motor control including temporal-sequencing be it for the purposes of tool making or for gathering. In this regard, it is noteworthy that fine motor skills, such as those involving rapid temporal sequencing are abilities at which females tend to excel as compared to males (Broverman, et al. 1968).

As females have engaged in gathering for time periods much longer than males, coupled with their possible role in tool manufacturing and tool use (e.g. skinning, clothes making, etc.), and the fact that speech production would not have been restricted but encouraged (as compared to the silent hunters), it might be assumed that the neural substrate for the temporal-sequential and grammatical aspects of what would become spoken language developed earlier and to a greater extent in the brains of women; particularly in the areas subserving and controlling hand movements and somesthetic functioning (the parietal lobes). In fact, this is exactly the case; females demonstrate a decided superiority over males in regard to many aspects of language (see chapter 7).

Although women, through tool making and gathering, may have acquired and developed language and fine motor temporal sequencing abilities at an earlier point and to a greater extent than men (which is certainly true of modern women; see below), males would have also directly benefited from this acquisition via genetic inheritance, and because each male would also have a mother who would talk to him and teach him language. Woman provided him with the fruit of linguistic knowledge and what would become linguistic consciousness (see chapter 7).


Apes and monkeys do not utilize grammar and are not dependent on complex temporal sequencing in order to make their needs known.

In fact, although apes and monkeys employ gestures in order to communicate, they are incapable of producing grammatically correct and sophisticated sign language (ASL), despite extensive training (reviewed in Joseph, 1993; Premack and Premack, 1988). Presumably, this is because ASL is dependent on the functional integrity of the IPL/angular gyrus, which is why in deaf humans, damage to the left parietal lobe can induce severe receptive gestural aphasia (Kimura, 1993; Poizner et al. 2010) and anomic aphasia in hearing adults.

However, whereas vocal communication in humans is emotional, word rich, and organized in temporal sequences of grammatical word units, vocal communication in non-human primates and mammals is purely social and emotional, and tends to have a non-sequential organization (Aich et al. 2010; Hauser 1997; Joseph 1993; Premack & Premack 2013; Robinson 1973), consisting of moans, screams, barks, grunts, pants, and pant-hoots (Dunbar & Dunbar 1975; Erwin 1975; Fedigan 2012; Goodall 1986, 2010; Hauser 1997). Non-human animals do not employ sound units in order to communicate, and the role of the neocortex is relatively minimal in this regard (Jurgens et al. 1982; Myers 1976). Rather, vocalization is largely a function of the limbic system, and is thus referred to as "limbic language" (Joseph 1982, 2012a).

Limbic language is "innate" in the sense that these emotional sounds are produced beginning soon after birth, and are vocalized by human infants born deaf, and non-human primates who are reared in isolation with surgically muted mothers. These emotional sounds are also understood cross-culturally (see chapters 10, 15). However, over the course of development, these emotional vocalizations are molded and shaped by experience and/or become associated with specific objects, individuals, or dangerous situations.

For example, as demonstrated by Cheney and Seyfarth (1978), vervet monkeys employ three distinct emotional calls to signal the presence of eagles, vs snakes, vs leopards. Experienced adults respond to these calls by looking up (eagle), looking down (snake), or climbing up a tree (leopard), depending on which call is uttered--even if the sound is produced from a tape recorder. Likewise infants reared in isolation also respond with generalized alarm when they hear these sounds. However, as these isolated animals grow older, they cannot differentiate between snake, vs eagle, vs leopard warning-calls, and are as likely to look up as look down as climb a tree. Experience plays a role in associating these emotional sounds to specific individuals or dangers.

Whether produced by a human or a non-human mammal/primate, these emotional sounds are generated by the limbic system; yet over the course of development (and evolution) they also come to be hierarchically represented within the neocortex.

Nevertheless, as noted above, there are trends among non-human primates toward some degree of reliance on neocortical tissue and the superior temporal lobe in regard to the comprehension of species-specific sounds. Ninety percent of primate auditory cortex neurons are activated by species-specific calls (Newman & Wollberg 1973), whereas destruction of the left superior temporal lobe disrupts the ability to make sound discriminations. There is also some evidence to suggest that asymmetries in the planum temporale are apparent even in chimpanzees (Gannon 1998), and that in other hominoids and monkeys the left hemisphere is dominant for the perception of primate vocalizations (Hauser & Anderson 2004; Peterson & Jusczyk 1984; Peterson et al. 1978). Presumably, left hemisphere dominance for vocalization perception and expression gradually increased in the transition from Australopithecus, to H. habilis, to H. erectus, to Neanderthals, to Cro-Magnon.

As noted, the angular gyrus is a direct extension not only of the superior parietal lobe but of the superior temporal auditory areas. The auditory neocortex also receives projections from the superior parietal lobe and amygdala and is multi-modally responsive (Pandya 1995; Pandya & Yeterian 2013). However, destruction of the primate left superior temporal lobe and primary and association auditory areas (which in humans would include Wernicke's receptive speech area) does not render these creatures "aphasic," and their capacity to detect and recognize species-specific calls is not affected (Dewson et al. 1975; Heffner & Heffner, 1984; Hupfer et al. 1977).

However, destruction of this area can affect their ability to make fine auditory discriminations between similar sounds--which is also the case with humans. In contrast to humans, however, an analysis of the spectral response patterns of auditory cortex neurons in monkeys (Macaca mulatta) indicates a complete absence of pitch and pitch tone sensitivity (Schwarz & Tomlinson, 2010). This is because, in the primate brain, these sounds are processed by subcortical nuclei, i.e. the limbic system (see chapter 15), which in turn explains why "language" in these creatures (like most of our hominid ancestors) is predominantly social and emotional (Erwin, 1980; Fedigan, 2012; Goodall, 1986, 2010), and lacking in grammatical or temporal sequential organization (Premack and Premack, 1988).

"Language" functioning and comprehension is thus preserved with extensive destruction of the primate "Wernicke's" area; i.e. the left superior temporal lobe because language in these species in mediated by the limbic system.

In humans, limbic language has become hierarchically represented at the level of the neocortex and is subject to temporal sequencing by the angular gyrus. Hence, destruction of the human left superior temporal lobe results in profound disturbances of linguistic comprehension, i.e. Wernicke's aphasia; and an impaired capacity to discern the individual units of speech and their temporal order (see chapter 11). In contrast to non-human animals in which language is limbic and non-sequential, the sounds of complex human language must be separated into discrete interrelated linear units or they will be perceived as a blur, or even as a foreign language.

Hence, a patient with Wernicke's aphasia may perceive a spoken sentence such as the "pretty little kitty" as the "lipkitterlitty" (chapter 11). They also may have difficulty establishing the boundaries of phonetic (confusing "love" for "glove") and semantic (cigarette for ashtray) auditory information. Grammatical speech is always severely impaired; all of which, again is a function of the unique role of the angular gyrus in the evolution of human language.

Over the course of evolution, the angular gyrus began imposing temporal sequences on auditory input. Specifically, the increasing dominance of the left superior temporal lobe for auditory comprehension, coupled with the establishment of right handedness and refinements in tool technology, eventually gave rise to expansions in the temporal-parietal (and frontal) lobes, including the planum temporale (e.g. compare Gannon 1998, LeMay & Geschwind 1976; Geschwind & Levitsky 1968), thereby forming the angular gyrus. Once the angular gyrus evolved, it began imposing temporal sequences on and manipulating the auditory input processed by Wernicke's area.

Likewise, as the angular gyrus and the frontal lobe evolved and expanded, the neural pathways linking the IPL with the frontal lobe greatly increased in density (Aboitiz & Garcia 1997). Hence, once the angular gyrus had evolved, the posterior speech areas became linked with the anterior speech areas thereby promoting the functional evolution of Broca's area which thus gained hierarchical dominance over the oral-laryngeal musculature and the limbic and subcortical vocalization centers including the anterior cingulate and the midbrain periaqueductal gray.


The IPL/angular gyrus, therefore, coupled with handedness and evolutionary advances in sequencing capabilities, not only promoted the functional evolution of Broca's and Wernicke's areas, but serves as a nexus which came to link the anterior and posterior auditory areas, thereby forming a "language axis" (Joseph 1982, 1988a, 1993; Joseph et al. 1984). That is, the cingulate-medial frontal-Broca pathways (subserving sound production) came to be linked with the amygdala-Wernicke's pathway (subserving sound reception) at the level of the neocortex. Thus, limbic (emotional) language became hierarchically represented at the level of the neocortex such that sound production and perception became punctuated by temporal sequences, thus promoting speech unit production and perception.

Prior to the evolution of the angular gyrus and a functional Broca's area, sound production was most likely under the exclusive control of the amygdala, cingulate, and medial frontal lobes, which acted on the midbrain periaqueductal gray in order to vocalize (chapter 15). Indeed, as noted, among non-human primates destruction of the left frontal operculum (i.e. "Broca's area") does not affect vocalization rate, the acoustical structure of their calls, or social-emotional communication (Jurgens et al. 1982; Myers 1976), whereas humans become severely aphasic. Rather, among hominoids and monkeys, the tissue homologous to Broca's area serves the hand (Gentilucci et al. 1988; Rizzolatti et al. 1988) but not speech (Jurgens et al. 1982; Luschei & Goldberg 1981; Myers 1976).

Likewise, prior to the evolution of the angular gyrus, auditory perception in archaic and earlier species of humanity, was probably almost exclusively associated with the brainstem, inferior colliculi, thalamus, and the amygdala including amygdala-derived temporal lobe neocortex, the superior temporal lobe.

Since there is no evidence for complex tool technology or an angular gyrus among Australopithecus, H. habilis, H. erectus, archaic H. sapiens, or Neanderthals, it thus appears that "modern" human linguistic abilities probably did not fully emerge until the evolution of the Cro-Magnon peoples. As in non-human primates, the language possessed by these earlier hominids, including Neanderthals, was likely emotional, limbic in origin, word-poor, and aggrammatical. Until the very end of their reign, Neanderthals and other ancient hominids simply lacked the temporal sequential capabilities, and thus the neurological foundation, to produce vocabulary-rich, complex grammatical speech. In fact, environmental and evolutionary pressures experienced by Neanderthals would have hindered and suppressed the development of "modern" speech.


Australopithecus skull/brain compared to Human skull/brain.


Over the course of hominid and human evolution, the expansion of the frontal lobe has paralleled expansions in the temporal lobe (see Figures 1 & 3) and, presumably, the IPL. The temporal lobes appear to have lengthened in both an anterior-inferior and superior-posterior direction, thus contributing to the development of the superior temporal lobe and planum temporale, including the angular gyrus/IPL (e.g., Aboitiz & Garcia 1997; Joseph, 1993, 2011e; Wilkins & Wakefield 1995), which is, in part, coextensive with Wernicke's area and extensively connected with Broca's area and the hand area in the frontal lobe. Presumably, the frontal lobe expanded and functionally evolved, in part, as a result of the increased input from these newly evolved posterior neocortices. For example, in comparing non-human primates and humans (e.g. Aboitiz & Garcia 1997; see Petrides & Pandya 1988; Preuss & Goldman-Rakic 1991), it is evident that as the frontal lobe expanded and the IPL gave rise to the angular gyrus, there has been a significant increase in the density of axonal interconnections between the frontal lobe and the IPL. In humans, this pathway has become a thick fiber bundle, the arcuate fasciculus.

If the frontal lobes were denied input from the posterior neocortex, such as due to a lesion of the angular gyrus or the arcuate fasciculus, the patient might suffer conduction aphasia and anomic aphasia, such that speech becomes empty and there results a profound word finding difficulty; a consequence also of Broca's area becoming disconnected from the posterior speech areas. Not only might speech become severely affected, but the patient might become apraxic or paretic, and lose the ability to perform temporal sequential motor acts or to engage in fine motor movements, including those involved in the construction and utilization of complex tools.

This is because the IPL/angular gyrus, and the primary and secondary somesthetic receiving areas, supply input and sensory feedback, and act to program the motor areas (De Renzi & Lucchetti 1988; Heilman et al. 1982; Kimura 1993; Strub & Geschwind 2013) including the oral-facial musculature (Huang et al. 1989) and Broca's expressive speech area (Goodglass & Kaplan 1982; Joseph 1982 1988a, 2011e, Kimura 1993).

Hence, the frontal lobes are not only interlocked with the primary, secondary, and association areas, but are dependent upon the reception of input from the posterior neocortex, the parietal lobe and angular gyrus/Wernicke's area in particular, especially in regard to fine motor activity and the expression of vocabulary-rich and grammatical human speech. Indeed, it is the angular gyrus (in conjunction with Wernicke's area) which provides not only grammar, but vocabulary and the units of speech.

The angular gyrus sits at the junction of the occipital, parietal, and posterior-superior temporal lobes, and is believed to assimilate the converging associations received from the adjoining association areas (Joseph 1982, 1986a; Joseph et al., 1984; Pandya & Yeterian 2013). That is, the angular gyrus, in association with the frontal lobes, assimilates different associations, creating complex concepts and providing auditory equivalents to non-verbal associations, thereby forming multi-modal concepts and linguistic categories, including words and names. Thus, auditory associations can be matched with visual or somesthetic impressions, which enables humans to visualize what they hear or touch, or to name or describe what they touch or see.

Indeed, the IPL/angular gyrus and Broca's area in the frontal lobe are critically involved in naming, word finding, and grammatical speech organization; the angular gyrus, in concert with Wernicke's area, transmits, via the arcuate fasciculus, grammatically organized words and sentences to Broca's speech area. Upon receiving this input, Broca's area (in conjunction with the parietal lobe; Huang et al. 1989) organizes the immediately adjacent oral and laryngeal motor areas (Foerster 1936; Fox 1995; LeBlanc 2012; Petersen et al. 1988, 1989) in order to vocalize, enunciate, and speak. Thus, both Broca's area and the parietal lobe become activated during oral activity.

With the evolution of the angular gyrus, humans therefore gained the ability to match auditory with visual and somesthetic impressions, and to manipulate not just the external environment via the hand, but the internal environment and the oral-laryngeal musculature, so as to punctuate and impose sequences on auditory, visual, and somesthetic impressions which are then injected into the stream of language and thought. Hence, just as external objects could be subjected to sequential manipulation, which resulted in the creation and use of complex multi-faceted tools, vocalization and auditory perception became subject to sequencing and punctuation, thereby producing units of speech.

As noted, increased input likely contributed to the expansion of the frontal lobes, including the functional evolution of Broca's area. That is, over the course of evolution, Broca's area appears to have become functionally reorganized in response to motor-related auditory impulses relayed from the IPL/angular gyrus. Initially this increased input may have been directly related to the development of right handedness and right hand motor control. For example, the frontal motor-hand area is immediately adjacent to and intimately interconnected with the primary motor areas mediating oral, laryngeal, and mandibular movements (Penfield & Roberts 1959; Woolsey 1958), including Broca's area (Fox 1995; Penfield & Roberts 1959; Joseph 1982, 1988a, 2011e). Again, in monkeys, the tissue "homologous" to Broca's area is directly involved in manual activity but does not subserve vocalization (Jurgens et al. 1982; Luschei & Goldberg 1981; Myers 1976). Hence, as proficiency in the control over fine motor movements increased secondary to the development of the angular gyrus and right handedness, and as the angular gyrus/Wernicke's area is also concerned with language functioning, presumably Broca's area ceased to control and manipulate the hand and instead became reorganized so as to program the oral-laryngeal motor areas and thus the oral musculature thereby producing units of speech--as is the case with present day humans.


The human female displays clear language superiorities as compared to the human male, even learning language and expanding her vocabulary more quickly, and speaking more rapidly (Joseph, 1993, 2011e). As detailed in chapter 7, this language superiority is a direct consequence of her ancestral history as food gatherer and provider of child care. Female gatherers (unlike silent male hunters) can chatter to their hearts' content. Gathering, as well as tool making, is in fact directly related to the acquisition of the neurological foundations for language (Joseph 2012b, 1993, 2011e).

By contrast, hunting does not promote linguistic development, as the successful hunter must be silent. Wolves, wild dogs, and lionesses spend a considerable part of each day tracking and hunting, and there is no evidence of speech among these creatures. Rather, although a hunter may throw his spear with the right hand (which is stronger and which can also be directed by the right hemisphere, via bilateral control over gross movements), he requires excellent visual-spatial skills and must maintain long periods of silence in order to be successful in these endeavors.

Aspects of hunting would also put a premium on parietal lobe and right cerebral cognitive development as tracking, aiming, throwing, geometric analysis of spatial relationships, as well as environmental sound analysis, are also directly related to the functional integrity of the right half of the brain (Guiard et al. 2013; Haaland and Harrington, 2010; Joseph, 1988a; see chapter 10). In modern humans, it is the right half of the brain which mediates visual-spatial perceptual functioning, including the ability to aim and throw a spear; but not grammatical speech. And, males consistently demonstrate superior visual spatial, maze learning, tracking, aiming, and related non-verbal skills, as compared to females (Joseph, 2011e).


Among the Cro-Magnon and "modern" H. sapiens, where hunting was the center of religious and artistic life, 60-80% of the diet consisted of fruits, nuts, grains, honey, roots and vegetables (Prideaux, 1973), which were probably gathered by the females. Even in the great majority of the few modern hunting and gathering societies in existence today, women are the gatherers and main providers of food, whereas spoils from the hunt account for only about 35% of the diet (Dahlberg, 1981; Martin & Voorhies, 1975; Murdock & Provost, 1981; Zilman, 1981).

In contrast, male and female Neanderthals, including their young children, were predominantly hunters and meat eaters, as is apparent from analyses of the striation patterns along the surface of their teeth (Fox and Perez-Perez, 1993; Molnar, 1972), and as based on the faunal remains from their hunting sites (Lieberman & Shea, 2004). That is, a diet high in vegetation exerts a highly abrasive influence on the teeth, and this is not the case with Neanderthals. Moreover, glacial (Middle Paleolithic) Europe was very meager in plant food resources (Gamble, 1986). Therefore, silence, and not speech production, would have been emphasized among these Neanderthal hunting populations.

It is noteworthy, however, that although Neanderthals may have lived during a rather cool period, the "modern" H. sapiens from Qafzeh lived during a rather warm phase and did not spend as much time engaged in hunting (see Lieberman and Shea, 2004). Thus these two populations would have adapted differently, with the "early moderns" given a competitive advantage in regard to acquiring temporal-sequential and language skills, as there was less pressure to remain silent and more time available to devote to non-hunting activities.

Unlike "early modern" and Upper Paleolithic hunter gatherers, Neanderthals spent three times as much time hunting and scavenging for meat (Lieberman & Shea, 2004). Thus they would have also had to practice protracted periods of silence, which would not contribute to the development of speech. Moreover, hunting was an activity that apparently consumed a considerable amount of Neanderthal female energy, as is evident from their relative lack of sexual dimorphism (as compared to males and females of the Upper Paleolithic) and as based on post-cranial analysis of the upper extremities (Ben-Itzhak, Smith and Bloom, 1988; Jacobs, 2013). That is, Neanderthal females engaged in activities similar to those of the men: hunting for meat.

Hunting and scavenging for meat do not require language or temporal sequential skills but rely on the functional integrity of the right half of the brain. Given the lack of evidence for a well-developed inferior parietal lobe, the paucity of evidence to indicate complex cognitive or tool making capabilities, and the somewhat socially isolated manner in which they lived, it could be assumed that the Neanderthals, who lived predominantly as meat eaters and not gatherers, would not have been likely to develop temporal-sequential or language skills.

Coupled with Lieberman's (2012; Lieberman et al. 2012) analysis of the supralaryngeal airways, and his discussion of the contrary data provided by Arensburg et al. (2010) regarding the Kebara Neanderthal hyoid bone (which, if accurate, refutes his theory), there thus appears to be both anatomical and functional evidence that this population was not capable of complex language production. As argued by Lieberman (2012, 409), "Neanderthals lacked the anatomical prerequisites for producing unnasalized speech and the vowels [i], [u], and [a]. To speak any human language unimpaired requires the ability to produce these vowels and unnasalized speech, and Neanderthals did not have this ability. They would also have been unable to produce velar consonants such as [k] and [g]; these consonants are almost universally present in human languages."

It is noteworthy, however, that Lieberman and colleagues (2012) argue that the supralaryngeal airways of "early modern" H. sapiens, including those from Skhul V, are essentially similar to those of modern humans; which is consistent (in regard to temporal sequencing and thus language) with the notion that this population evolved an angular gyrus in advance of Neanderthals. However, as also noted above, it appears that "late" Neanderthals (which may have been a Cro-Magnon/Neanderthal hybrid) also evolved at least a rudimentary angular gyrus, at least as based on evidence of a Chatelperronian tool technology. This raises the possibility that "late" Neanderthals may have evolved the capacity to produce a word-poor, agrammatical language.


Clearly, from a review of the available functional, neuropsychological, cultural, technological, linguistic, and behavioral neurological data, it appears that the evolution of modern H. sapiens sapiens is characterized by advances in frontal lobe and inferior parietal/angular gyrus development. It also appears that Neanderthals flourished and continued to proliferate in Europe, Palestine, and Iraq from almost 130,000 to about 30,000 years ago, but were completely wiped off the face of the Earth, most likely by the Cro-Magnon, who in successive waves of humanity increasingly encroached, invaded, and probably raped, enslaved, and killed them off; ushering in the Upper Paleolithic and "replacing" them in the process.

It is likely that a similar process of invasion and neurological competition took place among H. erectus, and the "archaic" and "modern" H. sapiens sapiens in the Far East and Asia. It is also possible that the Cro-Magnon and Asian "moderns" may have interbred in the Middle East, and elsewhere (see Frayer et al. 2012, for related discussion).

As to those who claim that Neanderthals, H. erectus, and even H. habilis had long ago acquired speech and modern language, these claims are disputed by the tool technology employed by these populations and other indices indicating a paucity of frontal and angular gyrus development. And, as to those who rely on phrenology to claim evidence for Broca's and Wernicke's area in archaic H. sapiens and H. habilis, it must be emphasized that those making these claims are in fact claiming to have identified secondary "sulci" and secondary "gyri" based on an examination of a mold conforming to the shape of the inside of a skull, and not the brain! Falk, Holloway, and colleagues, have in fact never seen a cast of a Neanderthal, H. habilis or Australopithecine brain and are merely speculating based on questionable methodology regarding the bumps on the inside of a skull.

Even those involved in making these claims have exchanged accusations of improper methodology, shoddy scholarship, and arbitrarily identifying, and even guessing as to, the position of various landmarks on these endocasts (see Falk, 2013b; Holloway 1981).

As stated by Holloway (1981), "Falk's use of L/H indices is unusual.... Twisting the arguments by injudicious use of indices" which "camouflages" important facts. Oddly, in attacking Falk, Holloway (1981) claims that his data is based on an endocast collection which "is not carefully selected."

Neanderthal (top) vs CroMagnon Skull (bottom)

Given the above, there appears to be no credible evidence that H. habilis, H. erectus, or Neanderthal possessed a linguistically functional Broca's or Wernicke's area and/or were capable of modern human temporal-sequential, complex grammatical speech (however, see Arensburg et al. 2010).

Again, even if H. habilis, H. erectus, and Neanderthal possessed a Broca's and/or a Wernicke's area, this does not indicate that these brain regions were linguistically functional, for similar landmarks can be discerned on chimpanzee and even dog brains. As detailed above, in non-human primates and other creatures these brain areas serve a non-language function. However, given the evidence of a transitional tool and implement making tradition among a few isolated populations of Neanderthals some 34,000 years ago, it is possible that "late" Neanderthals may have developed a limited, word-poor vocabulary.

However, although these people appear to have been eradicated and to have lost out to the rest of humanity due to inferior cognitive and cerebral capabilities, intermediate Neanderthal/Cro-Magnons were likely produced due to rape or mutual sex between these two species of humanity. Indeed, this may explain the development of the Chatelperronian tool technology by "late" Neanderthals, some of whom appear to have been intermediate between Neanderthals and Cro-Magnon. Cross breeding may also explain the recent discovery (by Joao Zilhao) of the 24,500 year old skeletal remains of a four year old (Cro-Magnon/Neanderthal) boy who was buried with strings of marine shells and painted with red ocher. That is, although possessing the typical Cro-Magnon cranium and facial characteristics, his short legs and stocky build were Neanderthal. If this and other intermediate types were not produced secondary to cross breeding, the only other explanation is that, due to climatic changes, and as the environment acts on gene selection, some surviving subpopulations of Neanderthals may have begun to evolve into "modern" humans.

That seems unlikely, however, as the genetic evidence indicates that Neanderthals are unrelated to modern Europeans, but are obviously related to each other.


"And it came to pass when men began to multiply on the face of the earth, and daughters were born unto them, that the sons of God saw the daughters of men, that they were fair, and they took them wives of all which they chose... There were giants in the earth in those days; and also after that, when the sons of God came in unto the daughters of men, and they bare children to them, the same became mighty men which were of old, men of renown." Genesis 6.1-4.


The Cro-Magnon were a very handsome people with thin hips, broad shoulders, aquiline noses, prominent chins, small even teeth, high rounded foreheads and with brains almost a third larger than those of the average woman and man today. There was nothing ape-like or Neanderthal about these people. The men stood 6 ft tall, though the women were somewhat smaller and more delicate. Compared to those who came before them, and those who came after, and until the advent of the 20th century, these people were giants. The origins of these peoples, however, are unknown as there are no transitional forms that link them with "early modern" peoples who were decidedly more archaic in appearance. Perhaps their origins may be found in the myths of Genesis; that the Cro-Magnon are descended from the sons of God who bred with the daughters of man.

Nor is it known what became of these people and the civilization they created, as it appears that they and their cities simply disappeared some 10,000 years ago, perhaps washed away by the great floods that followed the rather sudden onset and ending of the last ice age. According to the written records of the Sumerian people, whose own civilization arose 6,000 years ago, the Cro-Magnon peoples and their great cities were in fact destroyed by a cosmic upheaval, when the planet Venus snaked through the heavens and careened close to the Earth, plunging the planet into an icy darkness that was then followed by cataclysmic floods, some 4,000 years before their own time. Plato, in fact, twice tells a similar tale, of floods and rising seas that destroyed this ancient technologically advanced civilization, some 9,000 years before his own time.

And not just the Sumerians and Plato, but the Egyptians, and the ancient Mayas of Central America, tell similar tales of ancient cities and civilizations ruled by the descendants of the gods. And according to the Mayas, all was destroyed some 10,000 years before their own time, when the planet Venus snaked through the heavens and careened close to our planet, enveloping the Earth in terrible floods.

Although 10,000 or so years ago the planet was in fact plunged into a sudden ice age which ended rather abruptly, causing the seas to rise by almost 400 feet, surely the Sumerians, Egyptians, Plato, and the Mayas are mistaken about great cities and civilizations, and are merely weaving fanciful tales.

As noted, some 35,000 years ago, the Cro-Magnon sported a brain that was almost a third larger than that of a modern 6 foot man; i.e. 1800 cc vs 1350 cc. Moreover, these peoples were accomplished artists, musicians, craftsmen, sorcerers, and extremely talented hunters, fishermen, and highly efficient gatherers. These peoples rather suddenly developed tools and weapons that had never before been seen, and had learned how to make and bake pottery and construct clay figurines, as well as construct kilns and burn coal so as to fire and mold their creations.
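As a quick arithmetic check on the cranial capacities just cited (the figures are the text's; the snippet is merely illustrative), 1800 cc relative to 1350 cc works out to roughly a 33% difference, consistent with "almost a third larger":

```python
# Illustrative check of the cranial-capacity comparison cited above.
# Figures (1800 cc Cro-Magnon, 1350 cc modern) are taken from the text.
cro_magnon_cc = 1800
modern_cc = 1350

percent_larger = (cro_magnon_cc - modern_cc) / modern_cc * 100
print(f"Cro-Magnon brain volume was {percent_larger:.1f}% larger")  # 33.3% larger
```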


From the time of Homo erectus, humans had utilized fire to keep warm, to provide light, to cook their food, and to ward off animals. However, the Cro-Magnon learned over 20,000 years ago how to make fire using the firestone: iron pyrite, which was repeatedly struck with a flint, making sparks which could easily ignite brush. They also created the first rudimentary blast furnaces, which were capable of emitting enormous amounts of heat so as to fire clay. They did this by digging a tiny tunnel from the bottom of the hearth, which allowed air to be drawn in. Indeed, 25,000 years ago these people were making fire hardened ceramics and clay figures of animals and of females with bulging buttocks and breasts.

Moreover, many of these figures were shaped so that they tapered into points, so that they could be stuck into the ground or into some other substance, either for ornamental or supernatural purposes (i.e., household goddesses or fertility figures such as earth mothers). In fact, much of the art produced, be it finely crafted "laurel leafs" or other artistic masterpieces, served ritual, spiritual, and esthetic functions: art that was meant to be looked at, owned and admired, and used for trade, as jewelry and household decorations, and as highly prized possessions.

Likewise, the first musical instruments were also created by these people. In their crudest form these may have consisted of wooden drums. However, 25,000 years ago they had also created the first tiny flutes and whistles.

These peoples were also the first to weave baskets, and the first to use needle and thread in order to make finely fitted clothes which were carefully and deftly sewn together. Unlike all those who had come before them they decorated their clothes and tools and weapons with elaborate designs and symbols, and within their underground cathedrals they left behind elaborate and complex paintings, some of which were almost 3-dimensional. Indeed, these peoples demonstrated an esthetic artistic awareness and mastery which equals the ability of any living artist today.

The artistic development of the Cro-Magnon has been divided into several periods, beginning with the Aurignacian period (35,000-50,000 years B.P.), which was followed by the Magdalenian period (20,000-35,000 years B.P.).

During the initial stages of the Aurignacian, drawings were often small in size and the use of color was simpler--an impression, however, which may be due to erosion. By 35,000 years ago, beginning with the Aurignacian-Magdalenian transition, there is considerable emphasis on color and shading, and an incredible degree of attention is paid to anatomical detail, including muscular expression.

Thirty-five thousand years ago, Cro-Magnon were painting animals not only on walls but on ceilings, utilizing rich yellows, reds, and browns in their paintings and employing the actual shape of the cave walls so as to conform with and give life-like dimensions, including the illusion of movement, to the creatures they were depicting. Indeed, many of their engravings on bones and stones show a complete mastery of geometric awareness, and they often used the natural contours of the cave walls, including protuberances, to create a 3-dimensional effect.

For example, a rounded haunch of a bison might be drawn over a stone protuberance, thus creating a 3-D effect. Hence, the drawing or carving often became a harmonious, or rather an organic, part of the object or tool upon which it was depicted. The Cro-Magnon drew and painted scenes in which animals mated, defecated, fought, charged, fled, and died from wounds inflicted by hunters, thus recreating the scenes of everyday life. Moreover, most of the animals were drawn to scale, that is, depicted in their actual size; and all this, 35,000 years ago (e.g. Chauvet, et al., 1997).

Like those who came before them, the Cro-Magnon obtained their colors from natural earth pigments, such as ocher, a clay that contains a variety of iron minerals. However, whereas Neanderthals and H. habilis apparently had a fondness for red, the Cro-Magnon learned to separate and mix these pigments so as to create a variety of hues and colors. In order to mix these substances and to arrive at the correct consistency, a variety of lubricants, such as blood, urine, vegetable and fruit juices, animal fat, and the contents of eggs, were employed. The separate colors were then mixed in various hollowed out rocks and shells. The Cro-Magnon artist would also use a brush, or his or her fingers. In fact, they used a variety of different brushes, which enabled the artist to create different shades and strokes. In some cases the artist simply blew the paint onto the drawing via a tubular bone, thus making a mist-like spray.

The Cro-Magnon artists had also invented abstract impressionism, as many of their paintings and artworks were exceedingly abstract, surrealistic, and/or comprised of geometric forms and concentric shapes and ovals which in turn formed abstract versions of animals or women. Indeed, they displayed an artistic mastery equal to that of any modern master, including Picasso. Picasso, upon gazing at these Paleolithic masterpieces, felt compelled to complain that in 30,000 years "we have learned nothing new. We have invented nothing." Indeed, the geometric and angular form of representation employed by these Paleolithic Masters appears again and again throughout history and is found in Egyptian, Sumerian, and even early Greek art.

In fact, these were the first people to paint and etch what today might be considered "cheesecake." That is, they drew and painted slim, shapely, naked and nubile young maidens in various positions of repose (see chapter 8).

The Cro-Magnon artist, like modern artists today, made tiny sketches on slabs of stone, copies of what they intended to paint. After they had finished with their painting, they would obliterate the sketch by smearing it with mud or ocher so that a new drawing could be made. These cave painters also had to rely on artificial light, and utilized little lamps of hollowed stone that were filled with grease that would slowly burn.

It is noteworthy that over thousands of years, the Cro-Magnon artist often drew and painted over existing drawings and paintings, including those which were hidden away in deep recesses that were extremely difficult to get to. That they readily drew and painted over existing paintings indicates that the location of the painting, and not the painting itself, was often of particular significance, particularly in that many paintings were in out-of-the-way places where one had to crawl long distances through tiny spaces and along rather tortuous routes to get to them. Hence, these particular spots must have been viewed as having some sort of mystical, spiritual, or at least ritualistic significance.

It is often thought that the paintings and other figures that filled their caves and underground cathedrals were not developed purely for esthetic enjoyment but as part of various magical rituals. Indeed, the Cro-Magnon were probably the first people to engage in magic and sorcery and may have been the first to develop notions of God and otherworldly and supernatural spirits. Hence, these rituals were probably developed to aid in the hunt and to appease the whims of the gods. In fact, the Cro-Magnons obviously believed in an after-life, and buried their dead with flowers, clothing, beads, head bands, necklaces, weapons, and offerings of food.

Their achievements were not merely limited to art, ceramics, music, and spirituality, as they constantly experimented with and created new inventions such as the sewing needle, pointed burins, highly efficient cutters and scrapers, and the spear thrower, which seems to have first made its appearance about 20,000 years ago. This device consisted of a spear that was fitted into a long hooked rod, about 1-2 feet in length. Via the spear thrower a Cro-Magnon hunter or warrior could toss his weapon an incredible distance and at a tremendous velocity, thus greatly enhancing his killing power and range. In effect, the spear thrower acted as an extension of a man's arm and enabled him to almost double the distance over which he could throw a spear. Hence, he could easily impale and kill an animal or another man standing anywhere from 70 to 150 yards away. The spear thrower was used not only by the Cro-Magnon in Europe, but by the Cro-Magnon in the Arctic and in the Americas (by the Eskimo and Paleo Indians).

Like all their tools, the spear throwers were elaborately decorated with fine carvings, etchings, drawings, and paintings of animals such as horses, deer, bison, birds, and fish. These tools and weapons were also made from a variety of substances, such as reindeer antler.

These people also realized that a spear covered with barbs, harpoon style, would do much more damage than a smooth point. However, even with barbs, animals often were not killed outright, and often the mortally wounded beast would run some distance before becoming weak enough to be overcome and killed. Hence, the Cro-Magnon created blood grooves along the sides of their bone spearheads so that blood could gush more efficiently from the wound, thus speeding the process of dying.

In addition, perhaps 20,000 years ago the first bows and arrows apparently came into widespread use, and the arrows appear to have been feathered so as to stabilize their flight. With the creation of the bow, the hunter could now remain completely hidden, for if he missed with his first shot the animal would not even know he was there (so long as he stayed downwind and out of sight). The hunter could now shoot again and again.

The Cro-Magnon were the greatest hunters of their time, and unlike their ancestors they were able to hunt and kill antelope, bison, wild horses, reindeer, mammoths, and even lions and bears. They were also good trappers and fishermen and took birds, small animals, and fish in abundance. In fact, they may have been the first true fishermen and constantly harvested the abundant game living in the lakes, rivers, and seas. These people utilized nets, a trident-shaped spear, as well as a baited hook which would become lodged in the throat of the fish. They also developed the "trap," a rawhide or thick vine noose which was attached to a bent sapling that was tethered to some object such as a huge rock. The noose would be laid where the animal was expected to walk. Any animal or human that tripped the tether holding the sapling would be snared by the loop, which would then yank the unfortunate creature into the air, where it would hang, only to die exposed to the elements or to be later beaten to death.

The Cro-Magnon people likely utilized iron and engaged in metal tool making. Unfortunately, such devices would not have withstood the ravages of time. However, the flint points they utilized were in fact superior to iron or steel in their cutting power and could penetrate more deeply when thrown at some unfortunate animal or human. These people applied some type of solvent to these blades to make them more durable, as flint tends to be somewhat brittle and can break.

The making of tools, weapons, and art objects required the development of extremely fine motor control and sensory, stereognostic perception. For in creating such objects, the artist or tool maker must know just how much pressure to apply, in which direction, and with what force, as well as which tool to use. They must be very sensitive to the planes and angles of the object's structure and must then work with the finest precision.

In contrast, the Neanderthals and those who came before them simply knocked two stones together so as to sharpen a rock, or they chipped flakes away from rocks and used these as cutting tools and weapons. The typical Neanderthal tool kit consisted of up to 60-70 different items (horizontal scrapers, blunted and double-edged rock knives).

In contrast, diversity in tool making is the hallmark of the Cro-Magnon. Twenty-five thousand years ago, a typical tool kit consisted of well over 125 items (e.g., knives for cutting and whittling, stone saws, chisels, perforators for making holes, needles, scrapers for bone and others for skin, pounding slabs, etc.), and many of these tools were attached to wood, bone, and antler handles, and/or were made of these materials, including ivory.

Ivory can also be steamed and bent so that specific shapes can be molded. Of course, the types of tools constructed were in turn determined by the environment and climate in which these people lived. For example, among those who lived in Europe, deer would drop their antlers every season, which then lay upon the ground for the taking. In parts of Siberia and other very cold climates, wood was scarce and thus bone or ivory was employed much more frequently.

Predominantly it was the men who were the hunters, whereas the women engaged in the gathering of vegetables, fruits, seeds, and berries and the digging of roots. To aid them on their daily shopping trips, the Cro-Magnon women carried large leather purses (or pouches) and/or baskets into which they could deposit their goods. The women also carried large flat bone knives up to 9 inches in length.

These people and their descendants became so proficient at gathering, foraging, and hunting that they were able to settle year round in villages. In fact, the Cro-Magnon built houses of wood and stone that were large enough to easily shelter up to 20-25 adults and children at a time, and which might be anywhere from 50 or more feet wide and 20 or more feet long. These were not merely makeshift accommodations that could be moved at whim, for the houses were set on foundations that were sunk 2-3 feet into the ground. These houses also contained bedrooms, common living areas (or living rooms), kilns and fireplaces, as well as stone storage vaults where meat and other perishables could be easily stored for weeks at a time.

By 15,000 years ago, they were already living in small cities of thousands of people; cities surrounded by woodland and small farms consisting of wild wheats that the women likely planted. In fact, stone sickles and grinding stones were in use 20,000 years ago, which allowed for the harvesting and milling of wild and domestic grains. Moreover, they may have made beer from the grain and may well have discovered that wine could be produced from the fermented grape. Hence, these people invented civilization over 20,000 years ago; which is exactly what the oldest written records currently in existence patiently explain (see chapter 1).

However, not all Cro-Magnon were city dwellers. Many made their homes out of animal skins that were sewn together, thus forming tents held up by poles and anchored to the ground by stones, bones, or wooden posts.

Nevertheless, the massive efficiency with which these people were able to hunt, gather, and forage, as well as plant and harvest their own grains, resulted not only in a very well rounded and healthy diet but in increased leisure time. Indeed, these people may well have arrived at a 3 day work week 35,000 years ago; leisure time that could be devoted to the development of other pursuits and interests, such as the acquisition of material goods and wealth and the seeking of knowledge for its own sake.

Given the accomplishments of these peoples 20,000 to 35,000 years before our own time, coupled with the fact that their brains were almost a third larger than those of a present day woman or man of equal size, it is perhaps not unreasonable to wonder whether the "myths" regarding great cities and civilizations that existed before the deluge may have a bit of truth to them. Perhaps they are not myths at all?

The Sumerians, for example, distinguished between myth and fact, for they always began their tales of the marvelous in the following way: "Once upon a time..." The same is not true regarding Sumerian accounts of five great cities that they claim had been erected thousands of years before the flood, cities by the names of Eridu, Badtibira, Larak, Sippar, and Shuruppak.

In fact, the Sumerians divided history into two periods: "Before the flood" (Lam abubi) and "After the flood" (Arki abubi), with their own civilization, some 6,000 years ago (in what today is Iraq), belonging to the latter. According to the Sumerians these predeluvial civilizations were incredibly scientifically advanced, and were ruled by mighty kings who erected great temples, pyramids, and a lion-headed sphinx, whose predeluvial perfection was unsurpassed by those civilizations that arose after the deluge. Indeed, according to the oldest written records available to us, these great kingdoms and city states were completely inundated and washed away, leaving only distant memories of a long lost golden age (Hammerly-Dupuy, 1988; Kramer, 1981; Woolley, 1965); a golden age that included a land of the pyramids and a sphinx, west of the Garden of Eden.

Although badly weathered by water erosion, a great sphinx still stands in a land called Egypt; a great sphinx which is around 13,000 (or more) years in age, and which suffered the bulk of its erosion when inundated by torrential rains and flooding (Schoch, 2012). The great sphinx was constructed thousands of years before the rise of ancient Egypt; erected by a predeluvial civilization.

It has also been claimed that the three pyramids of Giza were constructed before the deluge, which in turn may explain their utter perfection versus the (comparatively) shoddy nature of those copies erected by the ancient Egyptians. Although there are claims and counter claims regarding who built the three pyramids of Giza, even the ancient kings of Egypt refer to them as being quite ancient. In fact, ancient Arabian historians explain that the Pyramids were created before the great flood:

"Saurid...one of the kings of Egypt prior to the Great Flood, who resided in the great city of Amusus... was the builder of two of the great pyramids....Three hundred years before the Great Flood, Saurid had a dream where the Earth turned upside down with all its people, the people fled in a blind rush, and the stars fell down..."(al-Maqrizi, 1911).

According to al-Maqrizi (1911), all the king's advisors had the same dream and predicted the end of civilization. So, Saurid, the pre-deluvial King of Egypt, decided to build the pyramids in order to serve as a great museum and library where all the world's knowledge could be preserved.

As all who visit or who have studied these grand monuments can attest, the three pyramids are not only the oldest and architecturally superior to all those built by the ancient Egyptians, but they are completely un-Egyptian. Unlike all other Egyptian edifices, the inner walls of the pyramids of Giza are completely barren of hieroglyphics or, in fact, any markings whatsoever. Coupled with their utter perfection, this suggests that they were built by a different, technologically advanced civilization. In fact, although the grandest and oldest of them all is attributed to Khufu, Khufu himself indicates that the Great Pyramid and the Sphinx had been constructed long before his own time.

As noted, the Sumerians, Babylonians, the ancient Jews, Greeks, and Romans, and in the Western half of the world, the ancient Olmecs, Mayas and Aztecs, and in the Eastern half, ancient China, all tell of an incredible catastrophe that occurred little more than 12,000 years ago; a cosmic calamity that took on the specter of a horrible war in heaven, a war of the worlds involving Venus--that terrible serpent--which shook the kingdoms and destroyed the cities of the Earth, and which was followed by earthquakes, volcanic activity, terrible rains, floods, then arctic cold.

Although we could dismiss these stories as fantasy and myth, the geological evidence and the historical record indicate that around 12,000 years ago temperatures dropped precipitously and the last ice age began quite suddenly, followed by a rapid increase in temperatures and terrible floods--floods that would have washed away not just all signs of civilization, but, as with modern day floods, the top layers of the earth itself, leaving behind only the bedrock of ancient ocean floors from half a million years ago.

We also know that some momentous event resulted in the flash freezing of innumerable creatures, including the woolly mammoth. Indeed, woolly mammoths and other animals have been found frozen solid, with absolutely no sign of decay, with grass in their mouths and food in their bellies--carcasses dated to 12,000 years ago. They had obviously been drenched in an avalanche of water and then flash frozen alive. As is evident from their diet and the food in their mouths, these calamitous events took place in a most inviting environment that just moments before had been warm and lush and teeming with life.

Supporting the possibility of a sudden inundation of water is the fact that the glaciers from the last ice age were not established uniformly in the way advancing ice sheets would be expected to spread; as if the Earth had swept through an ocean of water drifting in space, leaving some regions high and dry, and others drenched only to flash freeze.

For example, the ice sheets in India were located along the Equator and extended upward toward the Himalayas, rather than from North to South as might be expected. Conversely, the ice sheet in Africa extended from the Equator toward the South pole, which again is the opposite of what might have been expected. In fact, the Americas, Europe, parts of Africa, India and Siberia were covered, whereas the rest of Asia and everything north of Siberia, including the North Pole, was unaffected. There was almost a randomness to it. This is reminiscent of the manner in which some flu viruses spread; which, some scientists have proposed, may be due to the Earth orbiting through regions of contagion and passing through cosmic viral clouds.

However, after a period in which the Earth was tossed to and fro, this last glacial period just as suddenly ended. The melting glaciers released wave upon tidal wave of an ocean of rushing water which not only swept away all in its path, but which resulted in sea levels rising anywhere from 40 to 400 feet; evidence which is supported not just by the geological record but by ancient records, including those compiled by the Sumerians, Babylonians, Greeks, Olmecs, and Mayas. Peoples throughout the world, and across divergent and distant cultures, all speak in hushed and somber tones about the Great Flood (e.g. Dundes, 1988). This cataclysmic event was so momentous in the history of woman and man that it is even detailed by the Greeks, the Mayas and Aztecs, as well as in Genesis and the even more ancient records of the Babylonians and Sumerians (Horcasitas, 1988; Kramer, 1981; Woolley, 1965).

Again, the Greek philosopher-scientist Plato twice made now famous note of this worldwide tragedy, stating that the greatest civilizations of the past had been submerged and crushed beneath the seas 9,000 years before his own time. But of course, this is myth.


As to how and where they first emerged, we do not know, and our only source as to their demise are ancient "myths." Nevertheless, unlike those who came before them, the Cro-Magnon were an intellectually and neurologically superior breed of humanity, for the frontal lobes had increased significantly in size and they had evolved a new neocortical structure; i.e., the angular gyrus. Hence, due to these neurological developments, technology and innovation also mushroomed, including, over 35,000 years ago, the ability to think in temporal sequences.

These are all capabilities associated with the functional integrity of the IPL/angular gyrus as well as developmental advances in the right and left cerebral hemispheres and the frontal lobes. However, it is the evolution of the angular gyrus which not only gave rise to the eventual development of visual-pictorial imagery and temporal-sequential tool making technology, but to the fashioning of signs; what would become written language.

The evolution of the angular gyrus enabled human beings to create complex visual-artistic symbols some 35,000 to 40,000 years ago, and to later modify these symbols in the form of written signs and then written language (Joseph 1993). Indeed, the complex and detailed paintings left in the deep and forgotten recesses of ancient caves indicate that Homo sapiens sapiens of the Upper Paleolithic were capable of telling stories and making signs which are still comprehensible and pregnant with meaning 40 millennia later.

Although we can debate the merits and purposes for which these pictorial displays were created (magic, religious, instructive), they nevertheless represent the first form of language that had been formed (written) by the hand. Body movements and gestures had now become adapted for conveying meaning in the form of pictures; the result of well crafted, delicate, and precise finger, hand, and arm movements, the graphic impressions of which were engraved, painted, and carved in stone.

Humans of the Upper Paleolithic, like modern westernized humans, initially utilized single pictures, such as a red hand to indicate a clear, readily understood message, such as "stop", or "do not enter." In most American cities, this same "hand" symbol is used at intersections to indicate if and when someone may cross a street. Moreover, Upper Paleolithic humans also utilized abstract symbols, the meanings of which are not at all clear, though some scholars have associated them with sex, or fertility, or male vs female signs, and so on. However, maybe these symbols were actually a primitive form of writing; the likes of which first appeared over 30,000 years ago. Whether these peoples, thousands of years later, actually invented writing, we do not know.

However, the earliest preserved evidence detailing the evolution of written symbols comes from ancient Sumer, around 6,000 or more years ago. In Sumer (a land where ancient Babylon would be established) as elsewhere, the first forms of "writing" were pictorial--a tradition that was already in use at least 30,000 years before their time; i.e. during the Upper Paleolithic.

Like those of the Upper Paleolithic, Sumerians initially utilized single pictorial symbols to convey specific messages, such as a "lion" to indicate "watch out, lions around," or that of a "grazing gazelle" so as to inform others that good hunting could be found in the vicinity (Chiera, 1966; Kramer 1981; Woolley 1965). Although the Cro-Magnons sometimes used single two-step pictorial displays to indicate a sequence of action, as based on available evidence, it was not until the time of the Sumerians and the Egyptians that people began to skillfully employ a series of pictures to represent not just actions, but abstract as well as concrete ideas.

For example, prior to the Sumerians and the rise of the Egyptian civilization, a depiction of a lion or a man might indicate the creature itself, or the nature-God that was incarnate in its form. The Sumerians and Egyptians then took symbolizing to its next evolutionary step. They strung these symbols together.

For example, by drawing a "foot," or a "mouth," or an "eye," they could indicate the idea "to walk," or to "eat," and to "look" or "watch out." By combining these pictures they were able to indicate complex messages, such that "so and so" had to "to walk" to a certain place where "gazelles" might be found in order "to eat," but they would have to keep an "eye" out for "lions".

This was a tremendous leap, for earlier in Sumerian and Egyptian civilization, to indicate a complex message such as the above would have required elaborate pictorial detail of entire bodies engaged in particular actions. Although beautiful to behold, and easily understood by all, this was a very cumbersome and time consuming process.

The next step in the evolution of writing was the depiction not just of actions and ideas, but sounds and names. As argued by Edward Chiera (1966), if they wanted to write a common Sumerian name such as "kuraka," they would draw objects which contained the sounds they wanted to depict, such as a "mountain" which was pronounced "kur," and "water" which was read as "a," and then a "mouth" which was pronounced "ka." By combining them together one was able to deduce the sound or name that the writer desired.

Likewise, over 5,300 years ago, the Egyptians were making similar literary leaps--either independently as a function of simultaneous mental metamorphosis, or due to cultural diffusion. For example, when "writing" the name of the city Ba-set, the ancient Egyptians would put together separate symbols, such as a throne, known as a "Ba," and a stork, known as a "set." Each symbol, therefore, stood for a consonant, and consonants make up syllables.

This was a tremendous leap in abstract thinking and in the creation of writing, for now visual symbols became associated with sound symbols and one could now not only look at pictures and know what was meant, but what the symbols sounded like as well. In this manner, writing, although still in pictorial form, became much more precise. Now ideas, actions, and words, and thus complex concepts could be conveyed.

Eventually pictures also came to be placed in temporal sequences and became employed for accounting purposes, including the tabulation of taxes. Hence this was a highly efficient means of communication, though it still remained quite cumbersome, which required that further steps in symbolic thought be invented. The Sumerians met this challenge by inventing wedge-shaped cuneiform characters which gradually began to replace the older pictographs and ideographs (Chiera, 1966; Kramer 1981; Woolley 1965). That is, these pictures were transformed into a series of wedge-shaped lines which were read from left to right.

Initially these characters resembled the pictures they were destined to replace. However, as the use of cuneiform continued to evolve, pictorial details were gradually minimized, until finally these characters lost all pictorial relationship to that which they were meant to describe. In fact, a similar form of writing was also appearing in Neolithic Europe (see Figures), in what is today Romania and the Balkans (Vlassa 1963)--similarities which may be due to cultural links with ancient Sumer or Mesopotamia. Among the ancient Egyptians, a similar mode of written expression was also developing, which was eventually transformed into hieroglyphics. However, in Egypt pictures remained an essential feature of their writing until the very end of their civilization, about 2,000 years ago.

The Sumerians, Babylonians, and the later appearing Assyrians (all of whom lived in what is today Iraq) did not develop an alphabet, although they were able to depict vowels, consonants, and complex ideas and sounds via cuneiform (Chiera, 1966). In fact, even when these people were later introduced to foreign-devised alphabets, they resisted this innovation.

Hence, from complex visual images, to images that were placed in sequences, to sequences of wedge-shaped lines which were read from left to right, modern writing slowly evolved. Moreover, be it the writing of the Egyptians, the Sumerians, or modern day Americans, the process of reading and writing remains essentially similar. Both require the interaction of brain areas involved in visual and auditory analyses, as well as the evolution of the angular gyrus, which enabled visual signals to be matched with sounds so that auditory equivalents could be conjured up. In this manner people are able to not only look at a visual symbol, but recognize it as a "word" and know what it sounds like as well.


Three hundred thousand years ago someone took a piece of red ocher pigment and sharpened it, presumably so as to mark something (Pfeiffer 2013). On what surface they drew, and what the nature of the composition may have been, we do not know. We can only guess that it served some symbolic purpose, or it may have served only to make a mark.

Three hundred thousand years ago someone took the rib of an ox and carved a series of geometric double arches on it (Pfeiffer 2013). Was he or she just doodling, or was this a common form of artistic expression even in those lost days and forgotten nights? Again, we do not know.

Sixty thousand years ago Neanderthals were painting their caves red, and by the time they were overrun by the Cro-Magnon people twenty thousand years later, geometric patterns, designs and doodles soon graced many a wall and cavern (Leroi-Gourhan 1964; Prideaux 1973). However, it was not until about twenty thousand years ago that people began leaving marks on rocks and walls suggesting that they may have been keeping track of, or counting something. Perhaps the phases of the moon, or the number of animals killed? No one knows.

Just as we have no idea when the first complex sentence was spoken, or the first words were written, the point at which human beings first began to count or to measure the geometric properties of the land or the surrounding universe remains a mystery. However, it is also evident that the Cro-Magnons had developed an advanced awareness of geometric principles tens of thousands of years ago.

Geometry and the first forms of spoken and written pictorial language appear to be naturally related to the functional integrity of the right half of the brain (see chapter 10) and the parietal lobe, which is exceedingly concerned with visual-spatial relations. It is likely, however, that the first (temporal-sequential) mathematical concepts were promulgated by the left cerebral hemisphere and parietal lobe, and like writing, were related to hand use. That is, one first counts on one's fingers, then learns to count by pointing with the fingers at objects one wishes to add together, and later grasps a pen or pencil to make the marks and signs which indicate the numbers used and their summations; actions which are guided and observed by the parietal lobes.

The decimal system is clearly an outgrowth of this reliance on the fingers for counting, for this system is based on the concept of tens. Even the counting systems employed by the ancients of Meso-America were digit-based, with the exception that they used a base of 20, as they apparently counted their toes as well.
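The difference between finger-counting in tens and the Meso-American base of 20 amounts to writing numbers with twenty digit-values instead of ten. A minimal sketch (the function name is ours, for illustration only):

```python
def to_base(n, base):
    """Return the digits of n in the given base, most significant first."""
    digits = []
    while n > 0:
        digits.append(n % base)   # extract the lowest-order digit
        n //= base                # shift the remaining digits down
    return digits[::-1] or [0]

print(to_base(100, 10))  # [1, 0, 0]  -> one hundred, zero tens, zero ones
print(to_base(100, 20))  # [5, 0]    -> five twenties
```

The same quantity takes fewer digit-positions in base 20, since each position carries twice the range of values.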

It has been postulated that human beings first became concerned with geometry and numbers with the advent of agriculture and wide scale farming, the only remaining early evidence of which appears over ten thousand years ago, in Jericho, after the last great flood. That is, geometry and math were presumably employed in order to count crops as well as survey fields. In fact, around 8,000 years ago, the community that presumably first introduced farming into northern and Western Europe, the LBK culture (Linearbandkeramik), constructed pottery incised with geometric and curved lines that appear to depict field boundaries and rows of crops.

On the other hand, other than its use in art, geometry may well have first been employed to survey the heavens: visual-space. It was due to geometric-heavenly concerns that many of the ancients considered geometry to be the math of the gods and of divine origin. Perhaps this is why almost all ancient temples and buildings were not only constructed according to complex geometric principles, but oriented in regard to certain celestial configurations, including the temples of ancient Sumer. The Sumerians were exceedingly knowledgeable about complex geometric principles (Kramer 1981; Woolley 1965), as were the Egyptians (Breasted 1909; Gardiner 1961; Wilson 1951).

Although it is apparent that the Sumerians were also familiar with and utilized a decimal system, and by 4000 years ago the Babylonians had developed the fundamental laws of mathematics, both cultures nevertheless relied on a sexagesimal system for their complicated calculations because it was far superior to the decimal (Chiera 1966; Kramer 1981; Woolley 1965). For example, whereas the decimal system can be factored by 2, 5, and 10, the 60 unit sexagesimal system can be factored by 2, 3, 4, 5, 6, 10, 12, 15, and so on.
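The advantage described above can be checked directly by listing the whole-number divisors of each base. A minimal sketch, for illustration:

```python
def divisors(n):
    """Return all whole-number divisors of n, in ascending order."""
    return [d for d in range(1, n + 1) if n % d == 0]

print(divisors(10))  # [1, 2, 5, 10]
print(divisors(60))  # [1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, 60]
```

Sixty divides evenly in twelve ways where ten divides in only four, which is what made the sexagesimal system so convenient for fractions and partitioning.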

The sexagesimal system is also clearly related to the geometry of space and the partition of what they considered the cosmic, or divine circle; an activity that many ancient and recent cultures have indulged in and also considered divine. Thus, when the cosmic circle or the heavens are equally divided into four quadrants, e.g. North, East, West, South, this forms the sign of a "cross." This is the same cross that most cultures have also deemed to be divine and celestial in origin (chapter 9).

However, the creation of the "divine" four, or three, represents only the most rudimentary features of the sexagesimal system. For example, a circle can be divided into 360 degrees and so on, and these principles can be applied not just to surveys of the heavens but architecture. Via the complicated permutations made possible via the sexagesimal system, the Sumerians, the Babylonians, the Egyptians, the Greeks, and those living in ancient Meso-America were able to make very precise calculations of angular, object, and mathematical relations, and to create temples and buildings, the likes of which today could only be designed, built and fitted together using extremely precise tools and advanced, computerized measuring devices.

It is this same sexagesimal system that is employed in the measurement of time; i.e. 60 seconds and 60 minutes. Similarly, the first calendars were created in the same manner, the Sumerians dividing the circle into 12 parts in accordance with their beliefs regarding the sacred celestial nature of the number "12."

That is, the Sumerians, Egyptians and the Babylonians were well aware that the sun, and not the Earth, was at the center of the solar system. They also realized that the Earth was one of several planets and that all traveled around the sun. They postulated the presence of 10 planets (one of which may have been a moon), plus the moon that circles the Earth, which, when coupled with the sun, equalled the sum of 12.

It is this same Sumerian "12" which makes the 12 hours of the day and the night (the 24 hour day), and is retained in the form of the 12 months and the 12 houses of the Zodiac (Kramer 1981; Woolley 1965). The ancient Egyptians essentially adapted this system for designing their own calendar, and in Meso-America an almost identical calendar system was devised by the Mayas.

However, the Babylonians (and probably the Sumerians before them) took the decimal and 60 unit sexagesimal system one step further and invented a way to write these numbers in a temporal sequence, the grammatical order of which revealed the value of the sum (Kramer 1981; Woolley 1965). In this manner, thanks to the Sumerian-Babylonians, when one writes 4254, it is clear that the first "4" is a thousand times greater than the last "4". Finally, when the ancient Hindus appeared on the scene, the concept of "nothing" was formulated and thus "zero" came into being.
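Place-value notation of this kind can be illustrated by decomposing 4254 into its positional digit values. A sketch for illustration (Babylonian numerals used base 60, but the principle is identical; the function name is ours):

```python
def place_values(n, base=10):
    """Decompose n into (digit, positional value) pairs, most significant first."""
    digits = []
    while n > 0:
        digits.append(n % base)   # lowest-order digit first
        n //= base
    # pair each digit with the power of the base its position represents
    return [(d, base ** i) for i, d in reversed(list(enumerate(digits)))]

print(place_values(4254))
# [(4, 1000), (2, 100), (5, 10), (4, 1)] -> the first 4 is worth 1000x the last
```

The position alone determines each digit's worth, which is precisely why a placeholder for "nothing" (zero) eventually became indispensable.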

Just as written language soon came to be organized in a non-pictorial series of temporal sequences, so too did the understanding of the cosmos, geometry, time, and numbers. These tremendous intellectual and creative achievements, however, like language, were dependent on the functional integrity of the inferior parietal lobe (as well as other neural structures, such as those located within the frontal and temporal lobes and the thalamus), for with the destruction of this tissue, one's sense of space, geometry, written language, and math may be abolished.


The ability to engage in complex conversational speech probably remained severely limited until the appearance of the Cro-Magnon and anatomically modern Homo sapiens sapiens in Asia and Australia, well over 60,000 years ago. Indeed, these were the first people to inhabit this planet who possessed the intelligence as well as the laryngeal structures necessary to produce language and to think and communicate via the use of symbols and signs. This is evident not only from their tool technology and artistic achievements, but from the structure of their oral and nasal cavities; the comparatively larger and longer pharynx gave the Cro-Magnon people the ability to enunciate, structure, shape, form, and project sounds very rapidly over a wide distance.

Nevertheless, these capacities were acquired very slowly and over millions of years, the development of which corresponded to evolutionary changes in the organization of the human mind and brain. For example, over the course of human evolution, with the increasing adaptation of the left hemisphere for fine motor and language mediation, and with the evolution of reading, writing, and math, older, non-language functions formerly associated with the left half of the brain were displaced, diminished, and/or lost their neocortical representation. That is, they were presumably crowded out by language related abilities including reading, writing and math (Joseph, 1986b, 2012b, 1993).

This is because there is only so much neocortical space available and these new functions drove out the old. This is not to imply that prior to this development the two halves of the brain were identical in functional representation, for this was certainly not the case (e.g. Hamilton & Vermeire, 1988; Hauser 1993; Hopkins, et al. 2010; Pohl, 2013).

As noted, among non-human primates, the left temporal lobe is dominant over the right in regard to the perception and comprehension of communicative vocalizations. Moreover, in primates the right hemisphere also displays the rudiments of lateralized functional specialization, similar to that of modern humans. For example, baboons and rhesus macaques display a right hemisphere advantage in identifying and discriminating musical chords and pure tones, as well as for visual-spatial, form, and facial and facial-emotional discrimination, recognition, and expression (Hamilton & Vermeire, 1988; Hauser 1993; Hopkins, et al. 2010; Pohl, 2013). Humans show a similar (albeit much more pronounced) right cerebral specialization (chapter 10).

As per functional crowding, consider for example that those who are highly educated vs those of low socio-economic and educational status show differential patterns of cerebral specialization, such that those with university training are more clearly lateralized (Alvarez & Fuentes 2004). Moreover, primates with "language training" demonstrate a greater right hemisphere advantage in visual-spatial analysis than those without (Hamilton & Vermeire, 1988), suggesting a crowding effect; that is, non-language functions formerly associated with the left hemisphere are displaced due to the limited amount of neocortical space available. Hence, over the course of evolution, as language related functions increased in importance, their neocortical representation increased, thereby giving rise to differential hemispheric specialties.

For example, it has been repeatedly demonstrated (e.g. Dennis & Whitaker 1976; Feldman et al. 2012; Joseph, 1986b; Kurthen et al. 2012; Novelly & Joseph, 2013) that if the left hemisphere is damaged early in infancy (thus reducing neocortical space), language functions will migrate and take over vacant space within the right hemisphere. However, when this occurs, capacities normally associated with the right half of the cerebrum are diminished due to the effects of functional crowding. That is, language takes over cortical space devoted to later appearing right cerebral functions, which in turn suffer due to lack of representation.

However, not all aspects of language will migrate, relocate, or functionally redevelop. Hence, the syntactic (temporal-sequential) components of speech may be deficient (Eisele & Aram, 2004; Dennis & Whitaker, 1976) and language development may lag (Feldman et al. 2012; Marchman et al. 1991). Presumably this is due to innate hemispheric differences in motor control as related to language expression.

Conversely, if the right cerebrum is damaged early in life, many of its capacities can be, in part, acquired by the left hemisphere (Novelly & Joseph, 2013), in which case language begins to suffer. There is only so much cortical space available and there is much competition for representation.

With the evolution and development of language, therefore, not just the human left hemisphere but the right half of the brain underwent further functional metamorphosis and developed and evolved. Thus the right and left half of the brain became increasingly differentially specialized which enabled humans to accomplish twice as much and which in turn gave birth to language, the art of creativity, as well as what has been referred to as "neuroses" (Joseph, 2012b). That is, with the appearance of the angular gyrus, and the evolution of the frontal lobe, language, profound artistic expression, self-consciousness and right and left hemisphere functional specialization (in addition to that of the limbic system, midbrain and brainstem), a schism had formed in the psyche of woman and man and between the right and left half of the brain. The human mind was now subject to fragmentation and prone to the development of psychosis, neuroses, and intra-psychic conflicts.

Copyright: 1996, 2000, 2010, 2018 - Rhawn Joseph, Ph.D.