a bonobo humanity?

‘Rise above yourself and grasp the world’ – attributed to Archimedes

Posts Tagged ‘human origins’

stuff on human ancestry 1: the australopithecines, mostly


All the evolution we know of proceeds from the vague to the definite

C S Peirce

some of these depictions are more vague than others – we’re definitely not there yet


I was in a bookshop yesterday, where I picked up a copy of Yuval Noah Harari’s book Sapiens and had a gander at the back cover. I read one sentence, which went something like ‘100,000 years ago there were at least six species of Homo sapiens, now there is only one.’ Or maybe it was just ‘six species of Homo’. It resonated with me, because it’s been a while since I’ve researched and written about the ever-fascinating topic of human origins, a topic that resurfaced for me recently on reading an essay, ‘Lucy on the earth in stasis’, by Stephen Jay Gould in his 1996 collection Dinosaur in a haystack. The essay promoted his ‘punctuated equilibrium’ view of evolution, reporting that Australopithecus afarensis appeared to be the only hominin type in existence for a period of almost a million years, from approximately 4.9 million years ago to 4 million years ago, after which there was a relatively rapid radiation of hominin species. I could only take the essay on trust, but I maintained the thought that I should investigate whether this claim still held, some twenty-three years later. And that, further, I should investigate whether we were any clearer about our descent, as the last surviving species of that apparent radiation.

And by the way, for my education’s sake I need to straighten out the difference between hominids and hominins. We humans are both, apparently. The hominids, or great apes, include four genera: Pongo, the orang-utans, of which there are three extant species; Pan, of which there are two species, chimps and bonobos; Gorilla (two species), and Homo, of which there’s only one extant species, but many extinct ones including Neanderthals. The term ‘hominid’ has broadened over time. The term ‘hominin’ is more restrictive, referring only to those species ancestral or related to humans, since the split from the chimp and bonobo line. This explains, I hope, why we are both hominids and hominins. Clearly, though, I should stick to the term hominin for this post, or series of posts.

Anyway, I was surprised to read this claim about the state of human play 100,000 years ago. The old Bill Bryson question, How do they know that? came to mind, but I also felt skeptical, as I seemed to remember that the number was smaller – possibly dependent on whether you were a lumper or a splitter.

We know of course that our closest living relatives are (equally) chimps and bonobos, and the latest dating of our divergence from their line is 4 to 7 million years ago (according to Wikipedia, but Gould put it at 6 to 8 mya, and this video from the American Museum of Natural History gives it more ‘precisely’ at 7 mya, and another Wikipedia article gives the figure as 6.5 to 5.5 mya, so who knows?) There are a couple of possibilities for our last shared ancestor – Sahelanthropus tchadensis and Orrorin tugenensis – but their more or less competing claims are mired in uncertainty, due to the extreme sparsity of material. It may well be that neither of them fits the bill.
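Just for fun, here’s a back-of-the-armchair way of seeing how little these published estimates agree – a short Python sketch using only the figures quoted above, with each estimate treated as a range (the AMNH point estimate becomes a zero-width range):

```python
# Chimp-bonobo/human divergence estimates quoted above, in millions of years ago
estimates = {
    'Wikipedia': (4.0, 7.0),
    'Gould': (6.0, 8.0),
    'AMNH video': (7.0, 7.0),               # a point estimate, zero-width range
    'Wikipedia (2nd article)': (5.5, 6.5),
}

def common_window(ranges):
    """Return the (low, high) interval consistent with every estimate, or None."""
    lo = max(low for low, high in ranges)
    hi = min(high for low, high in ranges)
    return (lo, hi) if lo <= hi else None

print(common_window(estimates.values()))                     # None
print(common_window([(4.0, 7.0), (6.0, 8.0), (5.5, 6.5)]))   # (6.0, 6.5)
```

So the four sources taken together are mutually inconsistent – no single date satisfies them all – though dropping the AMNH point estimate leaves a narrow window of 6 to 6.5 mya. Which rather supports the ‘so who knows?’ conclusion.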

When they look at the evidence from early hominins, researchers are particularly interested in signs of bipedalism, which have been argued to exist in S tchadensis due to the placement of its foramen magnum (the hole in the skull through which the spinal cord passes) towards the front, beneath the skull, rather than towards the back as in other apes – though this placement has been disputed, quelle surprise. In any case, these earliest hominins evolved during the Pliocene epoch into the definitely bipedal australopithecines. The bipedal adaptation is so important to the emergence of Homo sapiens that it has been the subject of a great deal of speculation, hypothesis and argumentation. It’s likely that there were a variety of converging factors that favoured this trait’s development. For example, it provided a wider visual field, especially on the ground; it left the hands free to grasp and carry food; it enabled long-distance running, and it reduced the expenditure of energy. However, bipedalism appears to have been a slow development, and early australopithecines such as A afarensis likely spent a lot of time in trees. This is supported by anatomical features such as longer arm-bones, curved fingers, a shallow rib-cage and strong clavicular anchors for brachiation (swinging from the arms among branches).

Over time there were anatomical changes favouring bipedality. These included greater robustness of the ankle and knee joints, and changed positioning of the foramen magnum, the femur and the spine, to support changes to the centre of gravity. But the changes which have had the most long-lasting, even at times dire, effects have been those to the pelvic region. The strengthening and widening of this region, including the ilium, ischium and pubis, to support an upright stance, has to a serious degree compromised the process of childbirth. It’s been observed that australopithecines share with modern humans a sexual dimorphism relating to the lumbar vertebrae, allowing the spinal curvature of females to become more pronounced during pregnancy, which helps to better distribute the weight of the unborn child, to reduce fatigue and to maintain stability of posture for the mother. However, the changed shape of the pelvis and the consequent narrowing of the birth canal resulted in what has become known as ‘the obstetrical dilemma’. Unlike virtually every other mammalian species, humans face major difficulties and dangers in childbirth, which require others – midwives or other medical professionals – to assist in the process (for example, neonatal rotation is often necessary for safe delivery). A ‘solution’ to this dilemma, which appears to have evolved over time, is a comparatively short gestation period – the time spent in the womb – giving a greater opportunity for both mother and child to survive the birth. This of course leads to a longer period of childhood dependence, as the infant brain continues to develop outside the womb. Apparently, a modern human baby is born with approximately 25% of full brain development, compared to 45-50% in other primates. Brain size at birth is limited by the obstetrical dilemma, and greater neoteny is the result.

Encephalisation, which refers to a growth in brain size or mass relative to body size, is now seen as a later development in the human story than bipedalism. Brain size in general has become very questionable as a measure of complex evolutionary development – witness those smart corvids – and it’s worth noting that the Neanderthal brain was on average larger than ours. What’s important, though, is brain structure – something we can’t really examine in our ancestors. However, it is reasonable to assume that our much larger brain size compared to australopithecines is largely due to growth in the temporal lobes and the prefrontal cortex. In fact, all regions have grown, including the cerebellum, traditionally associated with fine motor control and balance, but more recently connected with cognitive function and language.

But let me return to the hunt for the hominin links from the other great apes to Homo sapiens. In the mid-nineties, two new examples of early hominins were discovered, Australopithecus anamensis and Ardipithecus ramidus. I’m guessing that Gould didn’t know about these discoveries when he wrote his essay, as they seem to have punctured his punctuated equilibrium thesis, at least as regards hominins. Anyway, the anamensis species is believed to have lived from about 4.2 to 3.8 million years ago, and the Ardipithecus ramidus specimens have been dated to around 4.4 million years ago, but interestingly, A afarensis, the principal subject of Gould’s essay, is now believed to have lived from 3.9 million years ago to 2.9 million years ago – that’s a million years after Gould’s stated range. The australopithecines first came to our attention in 1925 when Raymond Dart described Australopithecus africanus from specimens found in South Africa. A africanus is a more gracile type, and may well be in the direct line to humans, though there’s been a lot of dispute about the dating and classifying of different specimens. A africanus is generally thought to be a more recent species than A afarensis, another gracile type. So maybe we can link A africanus back to A afarensis, which in turn can be linked back to S tchadensis, with some intermediate missing links. But then there’s another recently discovered species, Australopithecus sediba, which has been dated to around 2 million years ago and is thought to be a transitional species between A africanus and either Homo habilis (which some prefer to describe as Australopithecus habilis) or Homo erectus. Another gracile species discovered in the nineties, A garhi, dating to about 2.5 million years ago, also seems to fit as a species connecting Australopithecus and Homo.
From what I’m reading, the fragmentary nature of these finds, together with obvious questions as to whether particular specimens are typical of whole species (type specimens are often juveniles, which might not be such a good idea), are the main barriers to pinning down the precise lines of succession. That’s why every new discovery is such a treasure. 
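Since so much hangs on whether these dating ranges even line up, here’s a rough sketch – using only the dates quoted above, and treating the single-date species as zero-width ranges – that checks which candidate ancestor-descendant pairs could actually have coexisted. Temporal overlap is of course nowhere near proof of ancestry, but a gap means intermediate populations are missing from the record:

```python
# Hominin species ranges quoted above, as (older bound, younger bound)
# in millions of years ago
ranges = {
    'A anamensis': (4.2, 3.8),
    'A afarensis': (3.9, 2.9),
    'A garhi': (2.5, 2.5),     # known from a single date
    'A sediba': (2.0, 2.0),    # known from a single date
}

def overlap(a, b):
    """True if two (older, younger) ranges share any stretch of time."""
    return min(a[0], b[0]) >= max(a[1], b[1])

print(overlap(ranges['A anamensis'], ranges['A afarensis']))  # True - they share 3.9-3.8 mya
print(overlap(ranges['A afarensis'], ranges['A garhi']))      # False - a 0.4 million year gap
```

So on these dates A anamensis and A afarensis could have met, while the gap between A afarensis and A garhi is exactly the sort of hole that every new treasure of a discovery might fill.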

I haven’t mentioned Ardipithecus or Paranthropus as yet. In the nineties specimens were found in the Afar triangle in East Africa, and classified as Ardipithecus ramidus (around 4.4 million years ago, with uncertain evidence of bipedality, and some evidence of reduced sexual dimorphism) and Ardipithecus kadabba (about 5.6 mya, possibly an ancestor of A ramidus, but known from only a few teeth and bones – the type specimen being a bit of mandible with an attached molar). It’s possible, according to some researchers, that Ardipithecus, Orrorin, and Sahelanthropus all belong to the same genus. 

I’ll have a look at Paranthropus, apparently a more robust distant cousin of ours, then move forward to the Homo genus, next time. 


Australopithecus Evolution (video), by Henry the PaleoGuy, 2019

Seven million years of human evolution (video), American Museum of Natural History, 2018

Written by stewart henderson

October 23, 2019 at 5:16 pm

Should we be lumpers or splitters over our hominid ancestors?


D4500 on the right


As I’m overwhelmed and a bit stressed by work issues, I’ve not posted here for a while, or to be precise I’ve got three or four posts going which I’ve not been able to finish. So I’ve decided to throw something down and push it out today no matter what.

A fascinating post on the John Hawks blog was brought to my attention by Butterflies and Wheels. He goes into much detail on an issue that has fascinated me, in my dilettantish way. My general reading on human ancestry, which turned up names such as Homo erectus, Homo habilis, Homo ergaster, Homo rudolfensis, Homo heidelbergensis et al, together with the information that the remains of these hominids or hominins were scanty and their precise identities disputed, made me wonder from my distant armchair whether they all represented different species or just variants of the one. Of course I have no expertise at all, and I don’t know the difference between a species and a subspecies, but my reading did make me aware that this was a genuine issue amongst paleoanthropologists.

The Hawks post, which takes its departure from a paper published on a recently revealed specimen of Homo erectus, goes into some detail on all this. The cranial specimen, D4500, from Dmanisi in Georgia, is the best-preserved of any so far discovered. The writers of the paper take the opportunity to put forward the view that the early Homo finds, such as D4500 and remains from the Malapa fossil site in South Africa, and by inference a number of others, represent a single lineage, a view with which Hawks largely concurs. So there, I told you so.

Of course Hawks goes into a lot of detail, and expresses his views with the diffidence we generally find in true scientists, but I’m delighted to find my vague sense of things so thoroughly supported. I must be a lumper, but of course I’m prepared to change my mind at the slightest change in the winds of research. Now I just need to work out where all those australopithecines fit into the general picture, without moving too far from my armchair, of course.

Written by stewart henderson

November 4, 2013 at 4:26 pm

how to debate William Lane Craig, or not – part 9, concluding remarks



Now I want to make some final remarks about the debate process and the way it can be manipulated, and some general remarks about the growth of atheism.

I’ve taken some time to respond to Dr Craig’s arguments, and I could’ve taken longer, but I didn’t consider all of them worthy of an elaborate response. In any case I’ve taken a lot longer than twenty minutes for my overall response, and that’s as it should be. To make a claim is generally easier and less time-consuming than to refute a claim, and it has always been thus, and Dr Craig knows that very well. This is probably why Dr Craig insists on setting the agenda and why he always claims that, if every one of his points isn’t refuted in 20 minutes, he wins.  This is essentially a modified version of the infamous ‘Gish gallop’, in which the opponent has little hope of addressing all the erroneous elements embedded in every point in the allotted time, so he or she (but actually I don’t recall a female ever debating Dr Craig) has no choice but to select two or three points to focus on. This allows Dr Craig to claim a very dubious ‘victory’ for the points that aren’t addressed. Hopefully in pointing this out, I’ve helped you to see the limited relevance of the time-constrained debate format in answering these big questions.

Now, I want to focus finally on the growth of the non-religious trend in the west. I recall hearing Dr Craig in an interview stating that only 2% of the US population was atheist. He probably got this figure from the 2009 ARIS report, the American Religious Identification Survey, which did indeed find that some 1.6% of surveyed American adults self-identified as atheist or agnostic. However the same report found that some 15% of Americans identified as having no religion. Make of that what you will. That same report also found that, in 2008, some 76% of Americans identified as Christians, compared with 86% in 1990. The report concludes that:

‘The challenge to Christianity in the U.S. does not come from other religions but rather from a rejection of all forms of organized religion’.

A more recent 2012 study by the Pew Forum on Religion & Public Life reports:

The number of Americans who do not identify with any religion continues to grow at a rapid pace. One-fifth of the U.S. public – and a third of adults under 30 – are religiously unaffiliated today, the highest percentages ever in Pew Research Center polling.
In the last five years alone, the unaffiliated have increased from just over 15% to just under 20% of all U.S. adults. Their ranks now include more than 13 million self-described atheists and agnostics (nearly 6% of the U.S. public), as well as nearly 33 million people who say they have no particular religious affiliation (14%).

The USA, however, is a lot more religious than other western nations. My own country, Australia, is I think more typical in its profile. In Australia’s 2011 census, the non-religious category amounted to 22.3% of the whole, the fastest-growing category by far, and considering that 8.6% of the population chose not to answer the question, and that a substantial proportion of those would be non-religious, it’s probable that more than a quarter of the population would identify as non-religious. Some 61% of Australians now identify as Christians, compared to around 84% in the early seventies, and the figure has been falling more rapidly in recent years. Figures from Great Britain and Canada are much the same, with rapid growth in the non-religious categories in recent years.
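That ‘more than a quarter’ estimate can be spelled out in a few lines – a sketch only, using the census figures just quoted, and resting on the assumption that the non-respondents lean at least somewhat more non-religious than those who answered:

```python
# 2011 Australian census figures quoted above, as percentages of the whole population
non_religious = 22.3   # identified as non-religious
no_answer = 8.6        # chose not to answer the religion question
answered = 100 - no_answer

# If non-respondents resembled respondents, the true non-religious share would be:
proportional = non_religious * 100 / answered
print(round(proportional, 1))   # 24.4 - just under a quarter

# What share of non-respondents would need to be non-religious to pass 25%?
needed = (25 - non_religious) / no_answer
print(round(needed * 100))      # 31
```

So on a straight proportional split the figure falls just short of a quarter; the claim only requires the silent 8.6% to be about 31% non-religious – a modest skew, though a skew nonetheless.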

Yet in spite of all this evidence, Dr Craig scoffs at the challenges to his theism and dismisses atheists as intellectual lightweights. He even likes to make the claim that atheists have been using the same arguments for the last 300 years and that all their arguments have been quashed. This amuses me, because this is exactly what any number of atheist philosophers have been saying about theists and their arguments. And I have to say, having read a few essay collections on the existence of god, I’ve always thought that atheists had by far the best arguments – but then, I would, wouldn’t I?

The difficulty that Dr Craig and his cronies must face is this. If he has all the best arguments, why are the majority of philosophers – trained analytical thinkers – non-believers, even in his own country? Why is it that non-belief is growing far more rapidly among the most educated than among the least educated? Why is it that millions and millions and millions of people, in Australia, Europe, North America and Japan, are comfortably rejecting Christianity and religion? Is there a virus going around? Have people dumbed down from the glorious days of pre-Enlightenment Christendom? Now, don’t get me wrong, I don’t want to shake Dr Craig out of his smug complacency – not that this would be possible – but I do want to pose that question to you, the audience. What has changed over the past half-century? I’m not saying that I know the answer myself, though I have my speculations on that question, which I won’t share with you today. But let me be clear that there is a change under way.

Dr Craig, as I say, has spoken of 300 years of atheism. The writer Jack Miles has written about how galling it must be for atheists that the term has been around for a couple of thousand years, with still only a minority of followers. But Miles has misrepresented the situation. A couple of thousand years ago there were very few people, mostly intellectuals, who scoffed at the religious superstitions of their fellows. Epicurus, Seneca, Lucretius, these were largely isolated individuals, islands in a sea of theism, or at least deism. The term atheist in fact began to be bandied about with the rise of Christianity. The Christians called the Pagans atheists, and the Pagans called the Christians atheists, and in a sense both sides were correct, because each side refused to believe in the only god or gods worth worshipping, according to the other side. Of course to modern observers, neither side was atheist.

Atheism as a ‘movement’ is of far more recent vintage. Isolated individuals cropped up again in the eighteenth century – Jean Meslier, Baron d’Holbach, Hume, Diderot and a few others – but many of the Enlightenment and early nineteenth century critics of Christianity, such as Voltaire, Paine, and the American founding fathers, were deists. Even in the late 19th century, the great voices of atheism, such as Robert Ingersoll, were largely voices in the wilderness, though the intellectual claims of atheism were advanced by many philosophers, such as Jeremy Bentham and J S Mill, who simply ignored the ethical claims of religion completely, as have most moral philosophers since their time.

But it’s really only in the twentieth century, and the latter half of it, that atheism has become commonplace. This is a trend that I cannot see being reversed, in a world where knowledge – of our universe, of our psychology, and of our human origins – expands on a daily basis. Religious belief is becoming outmoded and, to many, positively embarrassing in its simplistic claims about good and evil, sin and redemption, and gods as lords over us, to be worshipped and feared and so forth. Of course we live in a multi-speed polity, as far as the absorption of new ideas is concerned, and we will long continue to have our backward-facing Islamists, our Haredi Jews and our Amish-style Christian sects, but they will not be among the world’s movers and shakers.

So to return to Dr Craig and his crusade against the world’s atheists. None of his arguments withstands much scrutiny but he will never admit this and he will go on repeating them, unbent and unbowed until, if I may quote the bard, ‘second childishness and mere oblivion’ puts a stop to the farce. I mentioned earlier the flat-earthers who filled halls only 150 years ago with their speeches against the round-earth conspiracy. Not one of those flat-earthers ever admitted he was wrong. Every last one of them went to their deaths proclaiming their ‘truths’ with just as much confidence as when they started out. Creationists never change their minds either, or very rarely. They just die. And they’re not replaced, or the replacement rate is unable to match the death rate, and so the species eventually dies out. This has been the fate of the flat-earthers. It will happen to the creationists too, though it’ll take a little longer, and as to those who in future want to take up the cause of Dr Craig or his later incarnations, you’ll no doubt find the going increasingly tough, and the potential audience increasingly indifferent. The real world is becoming just too interesting to keep focusing on rehashed arguments about done and dusted worldviews.

Go in peace, and thanks for listening.




Nobody loves me, everybody hates me, think I’ll go and eat worms

Long ones short ones fat ones skinny ones

Worms that squiggle and squirm

That’s called a kids’ song, or a campfire song, and in some versions the words are different, but that’s how I learned it in the wolf cubs as an eight-year-old, and the words often come back to me when, as quite often happens, I find that nobody loves me and everybody hates me. This is the case at present so I was heartened by watching a doco this morning on worms, and I thought I’d cheer myself by writing about them rather than eating them.

I’m talking earthworms here, just to narrow things down. The longest worm that we know of (not an earthworm) is the bootlace worm, Lineus longissimus, of the phylum Nemertea, specimens of which grow as long as 55 metres – though they’re stretchy, so that might be cheating. As for earthworms, Australia’s regarded as a hotspot of wormy diversity, according to wormologists, with the giant Gippsland earthworm, Megascolides australis, coming in as one of the biggest at up to 3 metres long, and over an inch (about 2.5 centimetres) in diameter. You could base more than a couple of hefty meals on a critter that size, but sadly they’re a threatened species, another casualty of human encroachment on habitat. In fact, a great many of Australia’s 1000 or so known native earthworm species are in the same position, but for obvious reasons they don’t get the same attention as bilbies and potoroos.

As every gardener knows, worms are much valued for the way they transform the soil, providing new opportunities for the growth and development of plants. They also aerate the soil – letting in air, releasing carbon dioxide – with their burrowing activities. They don’t simply become two if they’re cut in half, though they can regenerate a chopped-off tail. They’re delicate and can be easily broken if pulled at, and in fact they have tiny gripping hairs, called setae, all over their bodies, which make them especially hard to pull out of the ground, as if you’d want to. Like me, they’re hermaphrodites (I think that’s why everybody hates me) and they breed by stretching alongside each other and exchanging sperm, a process that often lasts for many hours.

Okay, I’m not a hermaphrodite, but I may as well be, and a two-headed one at that.

Worms make great food for birds, platypuses and the occasional intrepid toddler, and their excreta, aka castings, the end-product of incessant organic digestion, are taken up by plants, being full of such goodies as phosphorus, nitrogen, calcium and magnesium. They like and need moisture, and in fact the giant earthworm can be detected by the underground squelching and gurgling created by its activities.

The basic worm anatomical structure, whether you’re talking land or sea, has been around a very long time, and has obviously proved very effective and enduring. It’s believed that one of the earliest known chordates, the ocean-living Pikaia gracilens, incorporated the beginnings of a backbone into its worm-like body some 500 million years ago, making it a possible forerunner of the vertebrates. That makes worm-eating a form of cannibalism. In fact, eating itself is a form of cannibalism and we really should stop.

Let’s look at how earthworms get around. The direction of their movement is a response to light and to soil chemistry as it impacts on skin cells. They move by expanding and contracting their muscles, anchoring themselves as they go with their setae, which they put out and retract as they go. Skin secretions help to bind the soil around them, easing their burrowing passage. Like us, they move a lot more sluggishly (probably not a good choice of words) in the cold weather.

So, that’s it for worms, for now. I’ve opened a few cans of them in my time, but I’ve always been reluctant to examine the contents. See how I’ve changed.

a dish of mopane worms - a fave from Zimbabwe


Written by stewart henderson

January 30, 2013 at 10:09 am

is there life on mars?


good question, Davie

Back in 1975, NASA sent two space probes to Mars. Their landers touched down on the Martian surface less than a year later. The Viking 1 lander remained operational for more than six years, Viking 2 for three and a half. During this time, biological experiments were conducted on Martian soil. As far as the general public is concerned, the results of these tests were negative, though for those in the know, it wasn’t quite that simple. Not that there was any great conspiracy or cover-up; the consensus amongst the cognoscenti was that the evidence tilted much more towards no-life than towards life, at least for the minute samples examined.

It seems, though, that exobiologists have long been intrigued by some of the findings in a particular batch of experiments, known as the Labelled Release experiments. As this Wikipedia article describes, these experiments involved a soil sample being inoculated with a weak aqueous nutrient solution, tagged with radioactive carbon-14 – hence ‘labelled’ – so that any metabolisation of the nutrients by micro-organisms in the soil, if any, would show up as a release of radioactive gas. The nutrients were of the type produced in the famous Miller-Urey experiments of the fifties. The first trial of these experiments produced surprisingly positive results. In fact, both the Viking probes produced initially positive results from different soil samples, one with a sample of surface soil exposed to sunlight, the other with a sample from beneath a rock. However, when the tests were repeated later, they produced negative results. Many other different types of biological tests were carried out during this mission, all of them yielding negative results. So it was all very inconclusive and mysterious.

Fast forward to April 2012, when a report was released by an international team of scientists suggesting that, after thorough analysis of the Labelled Release data, ‘extant microbial life on Mars’ may have been detected.

Researchers long ago abandoned the idea of multicellular life currently existing on Mars. Conditions for the maintenance of such life forms may have existed there billions of years ago – the Viking orbiters found evidence of erosion and the possible remains of river valleys – but those conditions have changed, though some have argued that the soil coloration and recent detection of silicate minerals indicate more recent signs of water, vegetation and microbial activity. All of this is highly contentious, but all good fun, and indicates that more research is required.

In 2008, a robotic spacecraft landed on Mars, in the polar region, and remained operational for about six months. The Phoenix lander had two principal objectives, to test for any history of water in the region, and to search for anything organic in the surrounding regolith [the surface layer of broken rock and soil affected by wind or water]. Preliminary data revealed perchlorate, an acid-derived salt, in the soil, which wasn’t a good sign, as perchlorates are strong oxidants. On the other hand, perchlorate can act as an ‘anti-freeze’, lowering the freezing point of water. Generally, though, the pH levels of the tested soil, and its salinity, were benign from a biological perspective. CO2 and bound water were also detected.

We’ve only minutely scratched a few surface points of a huge beast, you might say. What we’ve found isn’t too promising, but it’s enough to keep us wanting to investigate further, just to make sure, or to know more. After all, there’s still plenty to learn about the surface of our own planet. Recently, for example, we learned how perchlorates can be formed from soils with highly concentrated salts, in the presence of UV and sunlight. Chloride is converted to perchlorate in the process, which has been reproduced in the lab. Only in 2010 were soils with high concentrations of perchlorate discovered over a large section of Antarctica.

Between August 6 and August 20, that’s to say in two or three weeks’ time, the Mars Science Laboratory [MSL, also known as ‘Curiosity’] will land on Mars and look for further signs, past or present, of biological activity. It’s likely that whatever is discovered, not just in terms of life itself, but in terms of conditions for life, will be hotly debated. This Wikipedia article, covering the whole life-on-Mars search and debate, includes this intriguing para:

The best life detection experiment proposed is the examination on Earth of a soil sample from Mars. However, the difficulty of providing and maintaining life support over the months of transit from Mars to Earth remains to be solved. Providing for still unknown environmental and nutritional requirements is daunting. Should dead organisms be found in a sample, it would be difficult to conclude that those organisms were alive when obtained.

True enough, but even if dead, what a revelation it would be. Extra-terrestrial death means extra-terrestrial life, and so very very close to home in the great vastness of the universe. Another blow to our uniqueness, what terrible fun.

Written by stewart henderson

July 25, 2012 at 7:08 pm

more on human ancestry


Homo erectus

This is a fantastic time to be interested in human origins.

[Ed Yong, ‘Not Exactly Rocket Science’]

The post linked to above, and probably plenty of others I could’ve linked to, makes my previous post on human origins look decidedly amateurish. But after all, I’m an amateur. And that’s all good, I’m just trying to educate myself here.

The Yong post doesn’t concern itself with how far back we can trace our distinctly human ancestry, but it does refer to the multi-regional model of origins, which I carelessly mentioned in my post, and it focuses on the latest analyses of our relations with Neanderthals, through computer simulations of the spread of populations and examination of the Neanderthal genome. It also mentions a recently discovered type of archaic human called the Denisovans, with whom we also interbred. This sounds confusing, as you might think we  were archaic humans when we bred with them, but apparently not – or perhaps not quite so archaic as Denisovans.

It seems from my reading that ‘archaic’ here means a form that’s no longer extant, rather than a form that’s less developed or more ‘primitive’. Evidence of the Denisovans comes from a finger bone found in Denisova cave in Siberia, only in March 2010. Mitochondrial DNA from this hominin bone, which dated back 41,000 years, suggested that it was distinct from both Neanderthals and modern humans. Next, a team examined and sequenced the nuclear genome of this hominin [all from a wee finger bone], aided by the fact that DNA is better preserved in cold climates. As we know, the Neanderthal genome and the modern human genome have also been sequenced, and a comparison of results has shown that the Denisovans and the Neanderthals shared a divergent branch from the line leading to modern African humans. The branch diverged from the lineage some 800,000 years ago, with Denisovans and Neanderthals diverging from each other some 640,000 years ago.

That so much can now be gleaned from such scant fossil finds does tend to excite. Only last year another early species of Homo was identified, H gautengensis, through analysis of a specimen found back in the seventies in the Sterkfontein caves of South Africa. This species is believed to predate H habilis, emerging more than two million years ago and dying out about 600,000 years ago. It was big-toothed and small-brained, little more than three feet (about a metre) tall, and weighing around 50kg. It was bipedal on the ground but probably spent most of its time in trees, and it lived largely if not entirely on vegetable matter. Can we identify such a creature as human? It probably lacked speech or any language capacity. We can’t analyse its DNA, but anatomical and other research suggests it is a close relative of H sapiens, if not a direct ancestor.

These recent discoveries of possibly or probably new lineages are due to a convergence of new techniques and accumulated knowledge – we know better where to look and what to look for. More discoveries await analysis, and previous finds await reappraisal. It looks as if our Homo ancestry will get a lot bushier as a result. There’s a bit of a language issue here. Homo sapiens are clearly human beings, and they’re direct descendants of another Homo species. H erectus is the most likely candidate, but we’re far from sure. In any case, at the point of divergence there would’ve been little noticeable difference between our species and its immediate ancestor. This difficulty about beginnings is compounded by our naming the whole genus Homo, Latin for ‘man’, in the gender-neutral sense. The genus is more than 2 million years old, our species may only be about 200,000 years old. Many would argue that H gautengensis, given the description given above, is not human. Others would argue that it is, or at least that it is proto-human [though this is problematic as it suggests a human ‘prototype’, a rather teleological term]. So the original question – how long have humans been around? – depends both on how we define ‘human’, and on what we can properly infer from the data. For example, though we might be able to infer much about the lifestyle of H gautengensis [I don’t know how they managed to work out its probable diet] from modern analyses, we might never be able to know how we’d have reacted if we’d met a specimen. Would we see recognition in its eyes? Could we have befriended it [or him, or her]? How would we have communicated? It’s more likely, of course, that both groups would’ve exhibited in-group/out-group hostility, but even so, it’s hard to imagine what that hostility would’ve felt like, with its admixture of recognition, curiosity and wonder.

In any case, for those who might want to argue that  the species H sapiens and H sapiens alone is truly human, there are more complications. In 1997 remains were found in Ethiopia of a probable subspecies of H sapiens, since named H sapiens idaltu, dating back 160,000 years. The remains consisted of three craniums, and who knows how many other remains are yet to be found, of this and other subspecies. So now we prefer to call ourselves H sapiens sapiens, but more of that another time.

Written by stewart henderson

September 15, 2011 at 9:53 pm

Posted in anthropology
