Archive for the ‘evidence’ Category
stuff on human ancestry 1: the australopithecines, mostly
All the evolution we know of proceeds from the vague to the definite
C S Peirce

I was in a bookshop yesterday, where I picked up a copy of Yuval Noah Harari’s book Sapiens and had a gander at the back cover. I read one sentence, which went something like ‘100,000 years ago there were at least six species of Homo sapiens, now there is only one.’ Or maybe it was just ‘six species of Homo’. It resonated with me, because it’s been a while since I’ve researched and written about the ever-fascinating topic of human origins, a topic that resurfaced for me recently on reading an essay, ‘Lucy on the earth in stasis’ by Stephen Jay Gould in his 1996 collection Dinosaur in a haystack. The essay promoted his ‘punctuated equilibrium’ view of evolution, as it reported that Australopithecus afarensis appeared to be the only hominin type in existence for a period of almost a million years, from approximately 4.9 million years ago to 4 million years ago, after which there was a relatively rapid radiation of hominin species. I could only take the essay on trust, but I made a mental note to investigate whether this claim still held, some twenty-three years later, and, further, whether we were any clearer about our descent, as the last surviving species of that apparent radiation.
And by the way, for my education’s sake I need to straighten out the difference between hominids and hominins. We humans are both, apparently. The hominids, or great apes, include four genera: Pongo, the orang-utans, of which there are three extant species; Pan, of which there are two species, chimps and bonobos; Gorilla (two species); and Homo, of which there’s only one extant species, but many extinct ones including Neanderthals. The term ‘hominid’ has broadened over time. The term ‘hominin’ is more restrictive, referring only to those species ancestral or related to humans since the split from the chimp and bonobo line. This explains, I hope, why we are both hominids and hominins. Clearly, though, I should stick to the term hominin for this post, or series of posts.
Anyway, I was surprised to read this claim about the state of human play 100,000 years ago. The old Bill Bryson question, How do they know that? came to mind, but I also felt skeptical, as I seemed to remember that the number was smaller – possibly dependent on whether you were a lumper or a splitter.
We know of course that our closest living relatives are (equally) chimps and bonobos, and the latest dating of our divergence from their line is 4 to 7 million years ago (according to Wikipedia, but Gould put it at 6 to 8 mya, this video from the American Museum of Natural History gives it more ‘precisely’ at 7 mya, and another Wikipedia article gives the figure as 6.5 to 5.5 mya, so who knows?). There are a couple of possibilities for our last shared ancestor – Sahelanthropus tchadensis and Orrorin tugenensis – but their more or less competing claims are mired in uncertainty, due to the extreme sparsity of material. It may well be that neither of them fits the bill.
When they look at the evidence from early hominins, researchers are particularly interested in signs of bipedalism, which have been argued to exist in S tchadensis due to the placement of its foramen magnum (the hole in the skull through which the spinal cord passes) towards the back – though this placement has been disputed, quelle surprise. In any case, these earliest hominins evolved during the Pliocene epoch into the definitely bipedal australopithecines. The bipedal adaptation is so important to the emergence of Homo sapiens that it has been the subject of a great deal of speculation, hypothesis and argumentation. It’s likely that there were a variety of converging factors that favoured this trait’s development. For example, it provided a wider visual field, especially on the ground; it left the hands free to grasp and carry food; it enabled long-distance running; and it reduced the expenditure of energy. However, bipedalism appears to have been a slow development, and early australopithecines such as A afarensis likely spent a lot of time in trees. This is supported by anatomical features such as longer arm-bones, curved fingers, a shallow rib-cage and strong clavicular anchors for brachiation (swinging from the arms among branches).
Over time there were anatomical changes favouring bipedality. These included greater robustness of the ankle and knee joints, and changed positioning of the foramen magnum, the femur and the spine, to support changes to the centre of gravity. But the changes which have had the most long-lasting, even at times dire effects, have been those to the pelvic region. The strengthening and widening of this region, including the ilium, ischium and pubis, to support an upright stance, has to a serious degree compromised the process of childbirth. It’s been observed that australopithecines share with modern humans a sexual dimorphism relating to the lumbar vertebrae, allowing the spinal curvature of females to become more pronounced during pregnancy, which helps to better distribute the weight of the unborn child, to reduce fatigue and to maintain stability of posture for the mother. However, the changed shape of the pelvis and the consequent narrowing of the birth canal resulted in what has become known as ‘the obstetrical dilemma’. Unlike virtually every other mammalian species, humans face major difficulties and dangers in childbirth, which require others – midwives or other medical professionals – to assist in the process (for example, neonatal rotation is often necessary for safe delivery). A ‘solution’ to this dilemma, which appears to have evolved over time, is a comparatively short gestation period – the time spent in the womb – giving a greater opportunity for both mother and child to survive the birth. This of course leads to a longer period of childhood dependence, as the child’s brain continues to develop outside the womb. Apparently, a modern human baby is born with approximately 25% of full brain development, compared to 45-50% in other primates. Brain size at birth is limited by the obstetrical dilemma, and greater neoteny is the result.
Encephalisation, which refers to a growth in brain size or mass relative to body size, is now seen as a later development in the human story than bipedalism. Brain size in general has become very questionable as a measure of complex evolutionary development – witness those smart corvids – and it’s worth noting that the Neanderthal brain is on average larger than ours. What’s important, though, is brain structure – something we can’t really look at vis-à-vis our ancestors. However, it is reasonable to assume that our much larger brain size compared to australopithecines is largely due to growth in the temporal lobes and the prefrontal cortex. In fact all regions have grown, including the cerebellum, traditionally associated with fine motor control and balance, but more recently connected with cognitive function and language.
But let me return to the hunt for the hominin links from the other great apes to Homo sapiens. In the mid-nineties, two new examples of early hominins were discovered, Australopithecus anamensis and Ardipithecus ramidus. I’m guessing that Gould didn’t know about these discoveries when he wrote his essay, as they seem to have punctured his punctuated equilibrium thesis, at least as regards hominins. Anyway the A anamensis species is believed to have lived from about 4.2 to 3.8 million years ago, and the A ramidus specimens have been dated to around 4.4 million years ago, but interestingly, A afarensis, the principal subject of Gould’s essay, is now believed to have lived from 3.9 million years ago to 2.9 million years ago – that’s a million years after Gould’s stated range. The australopithecines first came to our attention in 1925 when Raymond Dart described Australopithecus africanus from specimens found in South Africa. A africanus is a more gracile type, and may well be in the direct line to humans, though there’s been a lot of dispute about the dating and classifying of different specimens. A africanus is generally thought to be a more recent species than A afarensis, another gracile type. So maybe we can link A africanus back to A afarensis, which in turn can be linked back to S tchadensis, with some intermediate missing links. But then there’s another recently discovered species, Australopithecus sediba, which has been dated to around 2 million years ago and is thought to be a transitional species between A africanus and either Homo habilis (which some prefer to describe as Australopithecus habilis) or Homo erectus. Another gracile species discovered in the nineties, A garhi, dating to about 2.5 million years ago, also seems to fit as a species connecting Australopithecus and Homo. From what I’m reading, the fragmentary nature of these finds, together with obvious questions as to whether particular specimens are typical of whole species (type specimens are often juveniles, which might not be such a good idea), are the main barriers to pinning down the precise lines of succession. That’s why every new discovery is such a treasure.
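Since so much of this turns on overlapping date ranges, here’s a minimal sketch (Python, purely illustrative) that collects the rough estimates quoted above and reports which species would have coexisted on those numbers. The A kadabba and A africanus spans are my own ballpark additions, since the sources date them less precisely.

```python
# Approximate hominin date ranges in millions of years ago (Mya),
# written as (older bound, younger bound). All figures are rough and contested.
species_ranges = {
    "Ardipithecus kadabba": (5.8, 5.2),        # assumed span around ~5.6 Mya
    "Ardipithecus ramidus": (4.5, 4.3),        # around ~4.4 Mya
    "Australopithecus anamensis": (4.2, 3.8),
    "Australopithecus afarensis": (3.9, 2.9),
    "Australopithecus africanus": (3.3, 2.1),  # assumed span
    "Australopithecus garhi": (2.6, 2.4),      # around ~2.5 Mya
    "Australopithecus sediba": (2.0, 1.9),     # around ~2 Mya
}

def coexisted(a, b):
    """True if two (older, younger) ranges in Mya overlap."""
    (a_old, a_young), (b_old, b_young) = a, b
    return a_old >= b_young and b_old >= a_young

names = list(species_ranges)
for i, first in enumerate(names):
    for second in names[i + 1:]:
        if coexisted(species_ranges[first], species_ranges[second]):
            print(f"{first} overlaps with {second}")
```

On these figures there’s no million-year stretch with just one hominin on the scene – A anamensis overlaps A afarensis, which in turn overlaps A africanus – which is Gould’s punctured thesis made mechanical.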
I haven’t mentioned Ardipithecus or Paranthropus as yet. In the nineties specimens were found in the Afar triangle in East Africa, and classified as Ardipithecus ramidus (around 4.4 million years ago, with uncertain evidence of bipedality, and some evidence of reduced sexual dimorphism) and Ardipithecus kadabba (about 5.6 mya, possibly an ancestor of A ramidus, but known from only a few teeth and bones – the type specimen being a bit of mandible with an attached molar). It’s possible, according to some researchers, that Ardipithecus, Orrorin, and Sahelanthropus all belong to the same genus.
I’ll have a look at Paranthropus, apparently a more robust distant cousin of ours, then move forward to the Homo genus, next time.
References
Australopithecus Evolution (video), by Henry the PaleoGuy, 2019
Seven million years of human evolution (video), American Museum of Natural History, 2018
https://en.wikipedia.org/wiki/Human_evolution
https://en.wikipedia.org/wiki/Australopithecus
https://en.wikipedia.org/wiki/Australopithecus_afarensis
https://en.wikipedia.org/wiki/Australopithecus_africanus
https://en.wikipedia.org/wiki/Australopithecus_sediba
https://en.wikipedia.org/wiki/Australopithecus_anamensis
https://en.wikipedia.org/wiki/Ardipithecus
http://humanorigins.si.edu/evidence/human-fossils/species/ardipithecus-ramidus
On Massimo Pigliucci on scientism: part 1 – what is science?

I’ve written a couple of posts on scientism (all references below), which is for some reason a topic that always gets me exercised. So a recent brief interview with the philosopher Massimo Pigliucci, on the Point of Inquiry podcast, has set me back on the wagon. This blog post will be a piece by piece analysis of (some bits of) the interview.
I’ll begin with the Point of Inquiry host Kavin Senapathy’s intro, in which she gives a definition of scientism as:
this idea that the scientific method is the only worthwhile way of answering questions, and that any question that can’t be tackled using science is therefore unimportant or frivolous, and this often seems to apply to areas of social or political concern. In practice, those with a scientific approach try to colonise other areas of expertise and call them science. So this is really an ideology
So scientism is an ideology (and Pigliucci agrees with this later in the interview). I must say I’m skeptical of both terms, but let me focus for now on ‘ideology’. I recall, during a meeting of secular and religious humanists, an old bloke beside me describing atheism as an ideology. The term’s often abused, and almost invariably used as a put-down. Only the other day, our former PM, John Howard, not known for his scientific literacy, complained that the recent federal election was marred by ‘climate change ideology’, by which he clearly meant the view that anthropogenic global warming is an issue.
More important here, though, is the attempt to define scientism, which makes me wonder if scientism is really a thing at all. The problem for me here is that it’s obvious that any area of ‘social or political concern’ will benefit from rigorous thought, or inference, based on various forms of evidence. Whether you want to call it science or not isn’t, for me, a major issue. For example, a state’s immigration policy would best be based on a range of concerns and analyses about its population, its resources, its productivity, its degree of integration, its previous experience of immigration, its relations with neighbours, the needs and aspirations of the immigrants, and so on. These factors can’t simply be intuited (though politicians generally do base their decisions on intuition, or ideology), but whether such analysis rises to the level of science doubtless depends on how you define science. However, it would clearly benefit from science in the form of number-crunching computer technology – always bearing in mind the garbage-in-garbage-out caveat.
So, it’s not about ‘colonising’ – it’s about applying more rigour, and more questioning, to every area of human activity. And this is why ‘scientism’ is often a term of abuse used by the religious, and by ‘alternative medicine’ and ‘new age’ aficionados, who are always more interested in converts than critiques.
Returning to the interview, Pigliucci was asked first off whether it’s a common misconception among skeptics that there’s a thing called ‘the scientific method’:
Yes I think it is, and it’s actually a common misconception among scientists, which is more worrisome. If you pick up a typical science textbook… it usually starts out with a short section on the scientific method, by which they usually mean some version of… the nomological-deductive model. The idea is that science is based firstly on laws… the discovery of laws of nature, and ‘deductive’ means that mostly what is done is deduction, the kind of inferential reasoning that mathematicians and logicians do. But no scientists have ever used this model, and philosophers of science have debated the issue over the last century or so, and now the consensus among such philosophers is that scientists do whatever the hell works…
(I’ve ‘smoothed out’ the actual words of Pigliucci here and elsewhere, but I believe I’ve represented his ideas accurately). I found this an extraordinary confession, by a philosopher of science, that after a century of theorising, philosophers have failed abysmally in trying to define the parameters of the scientific process. I’m not sure if Pigliucci understands the significance, for his own profession, of what he’s claiming here.
I have no problems with Pigliucci’s description that scientists ‘do what works’, though I think there’s a little more to it than that. Interestingly, I read a few books and essays on the philosophy of science way back in my youth, before I actually started reading popular science books and magazines, and once I plugged into the world of actual scientific experimentation and discovery I was rarely tempted to read that kind of philosophy again (mainly because scientists and science writers tend to do their own practical philosophising about the field they focus on, which is usually more relevant than the work of academic philosophers). I came up, years ago, with my own amateur description of the scientific process, which I’ll raise here to the status of Universal Law:
Scientists employ an open-ended set of methods to arrive at reliable and confirmable knowledge about the world.
So, while there’s no single scientific method, methodology is vital to good science, for hopefully obvious reasons. Arriving at this definition doesn’t require much in the way of philosophical training, so I rather sympathise with those, such as Neil deGrasse Tyson, Sam Harris and Richard Dawkins, who are targeted by Pigliucci as promoters or practitioners of scientism (largely because they feel much in the philosophy of science is irrelevant to their field). But first we really need to get a clearer view of what Pigliucci means by the term. Here’s his attempt at a definition:
Scientism is the notion that some people apply science where either it doesn’t belong or it’s not particularly useful. So, as betrayed by the ‘ism’, it’s an ideology. It’s the notion that it’s an all-powerful activity and that all interesting questions should be reducible to scientific questions. If they’re not, if science can’t tell you anything, then either the question is uninteresting or incoherent. This description of scientism is generally seen as a critique, though there are some who see scientism as a badge of honour.
Now I must say that I first came across scientism in this critical sense while watching a collection of speeches by Christians and pro-religion philosophers getting stuck into ye olde ‘new atheism’ (see the references below). Their views were of course very defensive, and not very sophisticated IMHO, but the charge of scientism was clearly being used to shelter religious beliefs, which cover everything from morality to cosmology, from any sort of critique. There was also a lot of bristling about scientific investigations of religion, which raises the question, I suppose, as to whether anthropology is a science. It’s obvious enough that some anthropological analyses are more rigorous than others, but again, I wouldn’t lose any sleep over such questions.
But the beauty of the scientific quest is that every ‘answer’ opens up new questions. Good science is always productive of further science. For example, when we reliably learned that genes and their ‘mutations’ were the source of the random variation essential to the Darwin-Wallace theory of evolution, myriad questions were raised about the molecular structure of genes, where they were to be found, how they were transferred from parents to offspring, how they brought about replication and variation, and so forth. Science is like that, the gift that keeps on giving, turning ‘unknown unknowns’ into ‘known unknowns’ on a regular basis.
I’ve read countless books of ‘popular’ science – actually many of them, such as Robert Sapolsky’s Behave, James Gleick’s The information, and Oliver Morton’s Eating the sun, are fiendishly complex, so not particularly ‘popular’ – as well as a ton of New Scientist, Scientific American and Cosmos magazines, and no mention has been made of ‘the scientific method’ in any of them, so Pigliucci’s claim that many scientists believe in some specific method just doesn’t ring true to me. But let me turn to some more specific critiques.
When Sam Harris wrote The Moral Landscape… he wrote in an endnote to the book that by science he meant any kind of reasoning that is informed by facts. Well, by that standard when my grandmother used to make mushroom risotto for me on Sundays, she was using science, because she was reasoning about what to do, based on factual experience. Surely that doesn’t count as science [laughing]… Even if you think of ‘food science’ as a science, that’s definitely not what my grandmother was doing. It’s this attempt to colonise other areas of expertise and call them science…
In my view Pigliucci disastrously misses the point here. Making a delicious risotto is all about method, as is conducting an effective scientific experiment. It’s not metaphorical to say that every act of cooking is a scientific experiment – though of course if you apply the same method to the same ingredients, McDonald’s-style, the experimental element diminishes pretty rapidly. Once someone, or some group, works out how to make a delicious mushroom risotto (I’m glad Pigliucci chose this example as I’ve cooked this dish countless times myself!) they can set down the recipe – usually in two parts, ingredients and method – so that it can be more or less replicated by anyone. Similarly, once scientists and technologists work out how to construct a functioning computer, they can set down a ‘computer recipe’ (components and method of construction) so that it can be mass-produced. There’s barely any daylight between the two processes. The first bread-makers arguably advanced human technology as much as did the first computer-makers.
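To push the analogy (a toy sketch only – the quantities and steps below are invented, not a real recipe), a dish really does decompose into the same two parts as a program: data and an ordered procedure, replicable by anyone who follows them:

```python
from dataclasses import dataclass

@dataclass
class Recipe:
    name: str
    ingredients: dict[str, str]  # ingredient -> quantity (the 'components')
    method: list[str]            # ordered steps (the 'method')

mushroom_risotto = Recipe(
    name="mushroom risotto",
    ingredients={
        "arborio rice": "300 g",
        "mushrooms": "250 g",
        "stock": "1 litre",
        "onion": "1",
        "parmesan": "50 g",
    },
    method=[
        "soften the onion",
        "toast the rice",
        "add stock a ladle at a time, stirring",
        "stir through the mushrooms and parmesan",
    ],
)

def execute(recipe: Recipe) -> None:
    # 'Running' the recipe: apply each step in order, like a program.
    for i, step in enumerate(recipe.method, 1):
        print(f"step {i}: {step}")

execute(mushroom_risotto)
```

Of course a human cook, unlike a CPU, improvises at every step – which is where the experimental element comes back in.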
I have quite a bit more to say, so I’ll break this essay into two parts. More soon.
References – apart from the first and the last, these are all to pieces written by me.
Point of Inquiry interview with Massimo Pigliucci
Discussion on scientific progress and scientism, posted April 2019
A post about truth, knowledge and other heavy stuff, posted March 2013
politics and science need to mix, posted August 2011
On supervenience, posted January 2011
Roger Scruton and the atheist ‘fashion’, posted January 2011
a critique of Jonathan Rée’s contribution, posted January 2011
Marilynne Robinson tries her hand at taking on ‘new atheism’, posted January 2011
a few thoughts on libertarianism
Libertarianism is like Leninism: a fascinating, internally consistent political theory with some good underlying points that, regrettably, makes prescriptions about how to run human society that can only work if we replace real messy human beings with frictionless spherical humanoids of uniform density (because it relies on simplifying assumptions about human behavior which are unfortunately wrong). I don’t know who wrote this.

Aren’t libertarians a lovely lot?
I might look more closely at some libertarian philosophy later, but for now I want to critique the kind of standard libertarianism I’ve heard from politicians and bloggers.
Well, okay, I’ll start with a philosopher, Robert Nozick, whose much-vaunted/pilloried book Anarchy, State, and Utopia I tried to read in the eighties. I found it pretty indigestible, and essentially learned from others that his argument depended rather too much on one principle – the human right of individuals to certain positive and negative freedoms, but especially negative ones, like the right to be left largely alone, to make their own decisions, for example, about how to contribute to the greater good. The book ended up advocating for a minimalist state, in which everyone gets to create their own communities of kindred spirits, organically grown. A cornucopia of utopias. The kind of state that, ummm, like, doesn’t exist anywhere. That’s the problem. Utopia is definable as a society that only exists in fantasy.
And then there’s the exaltation of the individual. This is the problem I’ve encountered with every libertarian I’ve read or viewed – and I’m quite glad I’ve rarely had any personal encounters with them.
If I did, here would be my response. Homo sapiens are the most socially constructed mammals on the planet. Language has massively facilitated this, and in turn has become our most powerful social product. Common languages have created civilisations, and this has allowed us to dominate the planet, for better or worse. And civilisation requires, or just is, organised social structure. That’s to say, a state, that eternal bogey-man of the libertarian.
This entity, the state, has shaped humans for millennia. Today, we owe (largely) to the state the clothes we wear, the food we eat, the education we’re hopefully still having, the jobs we’ve had and lost, the houses we live in, the cars we used to drive, and the good health we increasingly enjoy. That’s why, it seems to me, we owe it to ourselves to make the state we live in as good as we can make it, in terms of health, safety, opportunity, support, pleasure and self-improvement, for all its members.
It seems to me we have to work with what exists instead of trying to invent utopias – because, obviously, one person’s utopia is another’s nightmare. What exists today is a variety of states, some clearly better than others. The minimalist states are among the worst, and they’re understandably called failed states. There is no effectively functioning minimalist state on the planet, a fact that many libertarians blithely ignore. Their emphasis on individual liberty seems to me the product of either beggar-thy-neighbour selfishness or starry-eyed optimism about natural affinities.
Again, I turn to the USA, my favourite whipping-state. This hotbed of libertarians has not blossomed as it could, considering its booming economy. From this distance, it seems a sad and often stomach-turning mixture of white-collar fraudsters and chronically disadvantaged, over-incarcerated victims, and good people who largely accept this as the status quo. The you-can-achieve-anything mantra of the American Dream generally sees individuals as blank slates who can best fulfil their potential when pulled from the rubble of the coercive state. Or State, as many libertarians prefer.
It didn’t take my recent reading of Robert Sapolsky’s Behave, a superb overview of human behaviour and its multifarious and interactive underpinnings, or Steven Pinker’s earlier The Blank Slate, to realise that this was a dangerous myth. It was always screamingly obvious to me – from my observation of the working-class milieu of my childhood, the variety of skills my classmates displayed and the problems they faced from the outset, together with my readings of more privileged worthies and their patrician connections (Bertrand Russell on the knee of William Gladstone always comes irritatingly to mind) – that there has never been anything like a level playing field for exhibiting and making the most of whatever qualities we’re gifted with or are motivated to cultivate and improve.
So this is the problem: we’re not free to undo what has been ‘done to us’ – the parents we have, the country (or century) we’re born in, the traumas and joys we’ve experienced in the womb, our complex genetic inheritance and so forth. All of these things are connected to a much wider world and a past over which we have no control. They shape us (into go-getting libertarians or bleeding-heart liberals or whatever) much more than we’re generally prepared to admit. And these shaping forces, since the emergence of civilisation and that sometimes messily organised unit called the state, are profoundly social. And even if we’re not talking about western civilisation it’s the same – it takes a village to raise a child.
These shaping forces aren’t necessarily bad or good, they just are. But all in all we should be glad they are. The social brain is the brightest, most complex brain, and such brains wouldn’t have developed if the individual was sacrosanct, in receipt of the right to be ‘left alone’. Civilisation is surely the most impressive achievement of human evolution, and as Ralph Adolphs of Caltech puts it, ‘no component of our civilization would be possible without large-scale collective behavior’.
The state, of course, has its drawbacks, as do all large-scale multifaceted administrative entities. The ancient Greek city-states produced a host of brilliant contributors to their own esteem as well as to the world history of drama, philosophy, mathematics and history itself, in spite of being built on slavery and denying any equitable role to women. But even there, the (probably few) slaves who worked in the most enlightened households would’ve benefitted from the collective, and the women, however officially treated, were surely just as involved and brainy as the men.
As society has grown increasingly complex we as individuals have grown in proportion, as have our individual delusions of grandeur. At least in some cases. What the best of us should have learned, though, is that a rich, diverse, dynamic society, which cannot but be organised, produces the best offerings to its children. Diminishing the state by refusing to contribute to it actually diminishes and impoverishes the self, diminishes connection and the recognition of collective value. This raises the rather large point that the self isn’t what most people think it is – an autonomous, self-actuated entity. Instead, it is driven by complex social inputs from the very start, indeed from long before it came into being. Just as events from long before a crow is born, or even conceived, will go a long way in determining how that adult crow behaves.
Yet the myth of the individual, autonomous self is a live one, and it’s what drives most libertarians. In so far as people see themselves as self-actualising, they will argue the same for others, and absolve themselves from responsibility for others’ failures, mistakes or incapacities. Such attitudes significantly play down disadvantages of background, and even reduce exposure to those differences. Since everyone has the choice to be as successful as me (according to my own measure of success), why should I waste time hanging out with losers? By that measure, to suggest that silver-spoon libertarians would willingly provide support to disadvantaged communities is as unrealistic as expecting Donald Trump to hang out with the construction workers on his trumpy towers.
In some respects, libertarianism represents the opposite pole to communism, on a continuum that stretches into complete delusion at both ends. There have never been any actual, functioning communist or libertarian states. Both are essentially abstract ideologies, which take little account of the science of evolved human behaviour. When we do take account of that science, we find it is fiendishly complex, with the individual as a unit being driven and shaped by social dependencies, connections and responsibilities, which are generally vital to that individual’s well-being. In western democratic societies, apart from family and workplace organisations, we have government, which includes, in Australia, councils, states and a federation of states. It all sounds terribly complex and web-like, and some apparently see it as ‘the enemy of individual liberty’, but in fact it’s the web of civilised human life, which we’ve all contributed to creating, and it’s a pretty impressive web – though more impressive in some places than in others. I think the best thing we can do is try to improve it rather than trying to extricate ourselves from it. In any case, doing so – I mean, removing ourselves from organised society – just won’t work, and fundamentally misunderstands the nature of our evolved humanity.
The battle for justice, part 1: some background to the case
A prosecution should not proceed if there is no reasonable prospect of a conviction being secured. This basic criterion is the cornerstone of the uniform prosecution policy adopted in Australia.
from ‘The decision to prosecute’, in ‘Statement of prosecution policy and guidelines’, Director of Public Prosecutions, South Australia, October 2014

not this movie, unfortunately
I rarely focus on myself on this blog, but now I feel I have to. Today I lost my job because of something that happened to me about 12 years ago. So the next I don’t know how many posts will be devoted to my battle for justice, in the hope that it may help others in a similar situation. Of course I also find that writing is my best solace, as well as my best weapon. I have no financial resources to speak of, all I have is a certain amount of nous.
Between 2003-4 and 2010 I was a foster carer, under the aegis of Anglicare. Over that period I fostered six boys, with naturally varying success.
So why did I become a foster carer? I simply saw an ad on a volunteering website. I was being pushed to do some work, which I’ve always been reluctant to do, being basically a reclusive bookworm who loves to read history, science, everything that helps to understand what humans are, where they came from, where they’re going. And I hate it when work interferes with that! But having come from what for me was a rather toxic family background – trying to shut myself off from screaming fights between parents, being accused by my mother, the dominant parent, of being a sneak and a liar and ‘just like your father’ (her worst insult), being physically and mentally abused by both parents (though never sexually), and having run away from home regularly in my teen years – I imagined that, as a survivor, I could offer something which might work for at least some of these kids: a hands-off, non-bullying environment which would be more equal in terms of power than many foster-care situations. Call me naive…
Mostly, this approach worked. I did have to get heavy now and then of course, but not for long, so I always managed to stay on good terms with my foster-kids, as I have more recently with my students. This was even the case with the lad who accused me of raping him.
Let me describe the case as briefly as possible. A fifteen-year-old boy was in my care in September 2005. He was much more of a handful than the previous two boys I’d looked after, and when I lost my temper with him during a school holiday trip to Victor Harbor, he took it out on me by claiming to his mother, with whom he spent his weekends, that I’d punched him on the back of the head. This was false, but his mother took the matter to the police, and the boy was immediately taken out of my care.
After an internal review conducted by Anglicare I was cleared of any wrongdoing, to their satisfaction at least, and another boy was placed in my care. Then, sometime in early 2006, this boy was secretly whisked out of my care, and I was informed by Anglicare that a serious allegation had been made against me. I was in shock, naturally thinking this new boy had also accused me of some kind of violence, but I was finally informed by the Anglicare social worker who’d been overseeing my placements that ‘it isn’t your new foster-kid’. The penny dropped more or less immediately that it was the same boy who’d accused me of hitting him. This boy, as far as I was aware, was now living happily with his mum.
I was left in limbo for some time, but eventually I received a message from the police to go to the Port Adelaide police station. There I was asked to sit down in an office with two police officers, and informed that I was under arrest for rape.
I was somewhat taken aback haha, and I don’t recall much of the conversation after that, but I think it went on for a long time. I do remember one key question: if the boy’s lying, why would he make such an allegation? I had no answer: I was unable to think clearly, given the situation. But later that night, after my release on bail, an answer came to me, which might just be the right one. When the boy was in my care, the plan was to reconcile him with his mother, who put him in care in the first place because she couldn’t cope with him. I knew his mother, as I met her every weekend for handover. She was highly strung and nervous, and it seemed likely she was again having trouble coping with full-time care. Quite plausibly, she was threatening to return him to foster care, which he wouldn’t have wanted. She allowed him to smoke, she allowed him to hang out with his mates, and her environment was familiar to him. To him, I would’ve seemed boringly bookish and unadventurous. What’s more, his claim that I’d hit him had worked perfectly for him, getting him exactly where he wanted. Why not shut the door on foster care forever, by making the most extreme claim?
I don’t really know if this sounds preposterous to an impartial reader, but this answer to the riddle struck me as in keeping with what I knew of the boy’s thinking, and it was backed up by a remark he made to me, which soon came back to haunt me. He said ‘my mum’s friend told me that all foster carers are child molesters…’. It was the kind of offhand remark he’d often make, but it was particularly striking in light of something I was told later by my lawyer. Apparently, the boy didn’t tell his mother directly that I’d raped him, he’d told a friend of his mother, who’d then told her.
So, after the sleepless night following my arrest, I felt confident that I knew the answer to the key police question. I typed it up and took it forthwith to the Port Adelaide station (I didn’t trust the mail). How utterly naive of me to think they’d be grateful, or interested! I received no response.
So I obtained a lawyer through legal aid, or the Legal Services Commission. At the time I was dirt poor: I’d received a stipend as a foster carer, but that had stopped. Otherwise I worked occasionally as a community worker or English language teacher, mostly in a voluntary role. From the moment I was charged I spent many a sleepless night imagining my days in court, heroically representing myself of course, exposing contradictions and confabulations, citing my spotless record, my abhorrence of violence of all kinds, etc, etc. So I was a bit miffed when my lawyer told me to sit tight and do nothing, say nothing, and to leave everything to him. Standard procedure, presumably. The case passed from hearing to hearing (I don’t know if that’s the word – at least there were several court appearances), over a period of more than a year, and every time I expected it to be dismissed, since I knew there was no evidence. It had to be dismissed, there could be no other possibility. The only reason it had become a court matter in the first place, it seemed to me, was the absolute enormity of the allegation. But how could this possibly be justified? But I had to admit, the boy had, more or less accidentally, stumbled on the perfect crime to accuse me of – a crime committed months before, where there could be no visible evidence one way or another… It was all very nerve-wracking. And I was very annoyed at the fact that the DPP (the Office of the Director of Public Prosecutions) seemed to have different lawyers representing it at every court appearance, and mostly they behaved as if they’d only been handed the brief minutes before.
Finally I arrived at the lowest point so far – an arraignment. I didn’t know this (my last) appearance would be an arraignment and I didn’t know what that was. I just expected yet another appearance with a handful of yawning court officials and lawyers in attendance. Instead I found a packed courtroom.
Arraignment is a formal reading of a criminal charging document in the presence of the defendant to inform the defendant of the charges against him or her. In response to arraignment, the accused is expected to enter a plea.
In Australia, arraignment is the first of eleven stages in a criminal trial, and involves the clerk of the court reading out the indictment. (WIKIPEDIA)
The reason the courtroom was packed is that several arraignments are processed in the same courtroom on the same day, so there were several accused there with their friends and families. Unfortunately, I was solo. When my turn came, I was taken out to the holding cells and brought in – some kind of ceremonial – to the dock. The charge was read out (I’d already been given the ‘details’ by my lawyer, so I barely listened to it), I was asked to plead, and the judge told the court, to my utter amazement, that I was adjudged to have a case to answer.
So it was perhaps even more amazing that, a week or two after that appearance, the case was dropped.
three problems with Islamic society, moderate or otherwise
As a teacher of English to foreign students, I have a lot of dealings with Moslems, mostly male. I generally get on very well with them. Religion doesn’t come up as an issue, any more than with my Chinese or Vietnamese students. I’m teaching them English, after all. However, it’s my experience of the views of a fellow teacher, very much a moderate Moslem, that has caused me to write this piece, because those views seem to echo much that I’ve read about online and elsewhere.
1. Homosexuality
It’s well known that in such profoundly Islamic countries as Saudi Arabia and Afghanistan, there’s zero acceptance of homosexuality, to the point of claiming it doesn’t exist in those countries. Its ‘non-existence’ may be due to the fact that its practice incurs the death penalty (in Saudi Arabia, Yemen, Mauritania, Iran and Sudan), though such penalties are rarely carried out – except, apparently, in Iran. Of course, killing people in large numbers would indicate that there’s a homosexual ‘problem’. In other Moslem countries, homosexuals are merely imprisoned for varying periods. And lest we feel overly superior, take note of this comment from a very informative article in The Guardian:
Statistics are scarce [on arrests and prosecutions in Moslem countries] but the number of arrests is undoubtedly lower than it was during the British wave of homophobia in the 1950s. In England in 1952, there were 670 prosecutions for sodomy, 3,087 for attempted sodomy or indecent assault, and 1,686 for gross indecency.
This indicates how far we’ve travelled in a short time, and it also gives hope that other nations and regions might be swiftly transformed, but there’s frankly little sign of it as yet. Of course the real problem here is patriarchy, which is always and everywhere coupled with homophobia. It’s a patriarchy reinforced by religion, but I think if we in the west were to try to put pressure on these countries and cultures, we’d succeed more through criticising their patriarchal attitudes than their religion.
Having said this, it just might be that acceptance of homosexuality among liberal Moslems outside of their own countries (and maybe even inside them) is greater than it seems to be from the vibes I’ve gotten from the quite large numbers of Moslems I’ve met over the years. A poll taken by the Pew Research Center has surprised me with its finding that 45% of U.S. Moslems accept homosexuality (in 2014, up from 38% in 2007), more than is the case among some Christian denominations, and the movement towards acceptance aligns with a trend throughout the U.S. (and no doubt all other western nations), among religious and non-religious alike. With greater global communication and interaction, the diminution of poverty and the growth of education, things will hopefully improve in non-western countries as well.
2. Antisemitism and the Holocaust
I’ve been shocked to hear, more than once, Moslems blithely denying, or claiming as exaggerated, the events of the Holocaust. This appears to be a recent phenomenon, which obviously bolsters the arguments of many Middle Eastern nations against the Jewish presence in their region. However, it should be pointed out that Egypt’s President Nasser, a hero of the Moslem world, told a German newspaper in 1964 that ‘no person, not even the most simple one, takes seriously the lie of the six million Jews that were murdered [in the Holocaust]’. More recently Iran has become a particular hotspot of denialism, with former President Ahmadinejad making a number of fiery speeches on the issue. Most moderate Islamic organisations, here and elsewhere in the west, present a standard line that the Shoah was exactly as massive and horrific as we know it to be, but questions are often raised about the sincerity of such positions, given the rapid rise of denialism in the Arab world. Arguably, though, this denialism isn’t part of standard anti-semitism. Responding to his own research into holocaust denialism among Israeli Arabs (up from 28% in 2006 to 40% in 2008), Sammy Smooha of Haifa University wrote this:
In Arab eyes disbelief in the very happening of the Shoah is not hate of Jews (embedded in the denial of the Shoah in the West) but rather a form of protest. Arabs not believing in the event of Shoah intend to express strong objection to the portrayal of the Jews as the ultimate victim and to the underrating of the Palestinians as a victim. They deny Israel’s right to exist as a Jewish state that the Shoah gives legitimacy to. Arab disbelief in the Shoah is a component of the Israeli-Palestinian conflict, unlike the ideological and anti-Semitic denial of the Holocaust and the desire to escape guilt in the West.
This is an opinion, of course, and may be seen as hair-splitting with respect to anti-semitism, but it’s clear that these counterfactual views aren’t helpful as we try to foster multiculturalism in countries like Australia. They need to be challenged at every turn.

Amcha, the Coalition for Jewish Concerns, holds a rally in front of the Iranian Permanent Mission to the United Nations in response to Iranian President Mahmoud Ahmadinejad’s threats against Israel and denial of the Holocaust, Monday, March 13, 2006 in New York. (AP Photo/Mary Altaffer)
3. Evolution
While the rejection, and general ignorance, of the Darwin-Wallace theory of evolution – more specifically, natural selection from random variation – may not be the most disturbing feature of Islamic society, it’s the one that most nearly concerns me as a person keen to promote science and critical thinking. I don’t teach evolution of course, but I often touch on scientific topics in teaching academic English. A number of times I’ve had incredulous comments on our relationship to apes (it’s more than a relationship!), and as far as I can recall, they’ve all been from Moslem students. I’ve also come across various websites over the years, by Moslem writers – often academics – from Turkey, India and Pakistan whose anti-evolution and anti-Darwin views degenerate quickly into fanatical hate-filled screeds.
I won’t go into the evidence for natural selection here, or an explanation of the theory, which is essential to all of modern biology. It’s actually quite complex when laid out in detail, and it’s not particularly surprising that even many non-religious people have trouble understanding it. What bothers me is that so many Moslems I’ve encountered don’t make any real attempt to understand the theory, but reject it wholesale for reasons not particularly related to the science. They’ve used the word ‘we’ in rejecting it, so that it’s impossible to even get to first base with them. This raises the question of the teaching of evolution in Moslem schools (and of course, not just Moslem schools), and whether and how much this is monitored. One may argue that non-belief in evolution, like belief in a flat earth or other specious ways of thinking, isn’t so harmful given a general scientific illiteracy which hasn’t stopped those in the know from making great advances, but it’s a problem when being brought up in a particular culture stifles access to knowledge, and even promotes a vehement rejection of that knowledge. We need to get our young people on the right page not in terms of a national curriculum but an evidence-based curriculum for all. Evidence has no national boundaries.
Conclusion – the problem of identity politics
The term identity politics is used in various ways, but I feel quite clear about my own usage here. It’s when your identity is so wrapped up in a political or cultural or religious or class or caste or professional grouping that it trumps your own independent critical thinking and analysis. The use of ‘we think’ or ‘we believe’ is the red flag for these attitudes, though of course this usage isn’t always overt or conscious. The best and probably only way to deal with this kind of thinking is through constructive engagement, drawing people out of the groupthink intellectual ghetto through argument, evidence and invitations to reconsider (or consider for the first time), and if that doesn’t work, firmness regarding the evidence-based view, together with keeping future lines of communication open. They say you should keep your friends close and your enemies closer, and it’s a piece of wisdom that works on a pragmatic and a humane level. And watch out for that firmness, because the evidence is rarely fixed.

Education too is important. As an educator, I find that many students are open to the knowledge I have to offer, and are sometimes animated and inspired by it, regardless of their background. The world’s an amazing place, and students can be captivated by its amazingness, if it’s presented with enthusiasm. That can lead to explorations that can change minds. Schools are, or can be, places where identity politics can fragment, as peers from different backgrounds converge and clash, sometimes in a constructive way. We need to watch for and combat the echo-chamber effect of social media, a new development that often reinforces false and counter-productive ideas – and encourages mean-spirited attacks on faceless adversaries. Breaking down walls and boundaries, rather than constructing them, is the best solution. Real interactions rather than virtual ones, and thinking about the background and humanity of the other before leaping into the fray (I’m beginning to sound saintlier than I’ve ever really been – must be the Ha Ji-won influence!)
the strange world of the self-described ‘open-minded’ part two
That such a huge number of people could seriously believe that the Moon landings were faked by a NASA conspiracy raises interesting questions – maybe more about how people think than anything about the Moon landings themselves. But still, the most obvious question is the matter of evidence.
Philip Plait, from ‘Appalled at Apollo’, Chapter 17 of Bad Astronomy

the shadows of astronauts Dave Scott and Jim Irwin on the Moon during the 1971 Apollo 15 mission – with thanks to NASA, which recently made thousands of Apollo photos available to the public through Flickr
So as I wrote in part one of this article, I remember well the day of the first Moon landing. I had just turned 13, and our school, presumably along with most others, was given a half-day off to watch it. At the time I was even more amazed that I was watching the event as it happened on TV, so I’m going to start this post by exploring how this was achieved, though I’m not sure that this was part of the conspiracy theorists’ ‘issues’ about the missions. There’s a good explanation of the 1969 telecast here, but I’ll try to put it in my own words, to get my own head around it.
I also remember being confused at the time, as I watched Armstrong making his painfully slow descent down the small ladder from the lunar module, that he was being recorded doing so, sort of side-on (don’t trust my memory!), as if someone was already there on the Moon’s surface waiting for him. I knew of course that Aldrin was accompanying him, but if Aldrin had descended first, why all this drama about ‘one small step…’? – it seemed a bit anti-climactic. What I didn’t know was that the whole thing had been painstakingly planned, and that the camera recording Armstrong was lowered mechanically, operated by Armstrong himself. Wade Schmaltz gives the low-down on Quora:
The TV camera recording Neil’s first small step was mounted in the LEM [Lunar Excursion Module, aka Lunar Module]. Neil released it from its cocoon by pulling a cable to open a trap door prior to exiting the LEM that first time down the ladder.
Neil Armstrong, touching down on the Moon – an image I’ll never forget
the camera used to capture Neil Armstrong’s descent
As for the telecast, Australia played a large role. Here my information comes from Space Exploration Stack Exchange, a Q and A site for specialists as well as amateur space flight enthusiasts.
Australia was one of three continents involved in the transmissions, but it was the most essential. Australia had two tracking stations, one near Canberra and the other at the Parkes Radio Observatory west of Sydney. The others were in the Mojave Desert, California, and in Madrid, Spain. The tracking stations in Australia had a direct line on Apollo’s signal. My source quotes directly from NASA:
The 200-foot-diameter radio dish at the Parkes facility managed to withstand freak 70 mph gusts of wind and successfully captured the footage, which was converted and relayed to Houston.
Needless to say, the depictions of Canberra and Sydney aren’t geographically accurate here!
And it really was pretty much ‘as it happened’, the delay being less than a minute. The Moon is only about a light-second away, but there were other small delays in relaying the signal to TV networks for us all to see.
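For anyone who wants to check the light-second figure, the arithmetic is a two-liner – a back-of-envelope sketch, bearing in mind that the Earth–Moon distance actually varies between roughly 356,000 and 407,000 km over the Moon’s orbit:

```python
MOON_DISTANCE_KM = 384_400         # average Earth-Moon distance
SPEED_OF_LIGHT_KM_S = 299_792.458  # speed of light in a vacuum

one_way_delay = MOON_DISTANCE_KM / SPEED_OF_LIGHT_KM_S
print(f"one-way signal delay: {one_way_delay:.2f} seconds")  # ~1.28 s
```

The rest of that under-a-minute delay was ground-side, in converting and relaying the signal to the networks.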
So now to the missions and the hoax conspiracy. But really, I won’t be dealing with the hoax stuff directly, because frankly it’s boring. I want to write about the good stuff. Most of the following comes from the ever-more reliable Wikipedia – available to all!
The ‘space race’ between the Soviet Union and the USA can be dated quite precisely. It began in July 1955, when the USA announced plans to launch a satellite – a craft that would orbit the Earth. Days later, the Soviet Union announced identical plans, and was able to carry them out a little over two years later. The world was stunned when Sputnik 1 was launched on October 4 1957. Only a month later, Laika the Muscovite street-dog was sent into orbit in Sputnik 2 – a certain-death mission. The USA got its first satellite, Explorer 1, into orbit at the end of January 1958, and later that year the National Aeronautics and Space Administration (NASA) was established under Eisenhower to encourage peaceful civilian developments in space science and technology. However the Soviet Union retained the initiative, launching its Luna program in late 1958, with the specific purpose of studying the Moon. The whole program, which lasted until 1976, cost some $4.5 billion and its many failures were, unsurprisingly, shrouded in secrecy. The first three Luna rockets, intended to land, or crash, on the Moon’s surface, failed on launch, and the fourth, later known as Luna 1, was given the wrong trajectory and sailed past the Moon, becoming the first human-made satellite to take up an independent heliocentric orbit. That was in early January 1959 – so the space race, with its focus on the Moon, began much earlier than many people realise, and though so much of it was about macho one-upmanship, important technological developments resulted, and vital observations were made, including measurements of energetic particles in the outer Van Allen belt. Luna 1 was the first spaceship to achieve escape velocity, the principal barrier to landing a vessel on the Moon.
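Since escape velocity is doing real work in that last sentence, here’s the standard Newtonian formula in a quick sketch – nothing Luna-specific, just Earth’s mass and radius:

```python
import math

# Escape velocity from a body's surface: v = sqrt(2 * G * M / r)
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24  # Earth's mass, kg
R_EARTH = 6.371e6   # Earth's mean radius, m

v_escape = math.sqrt(2 * G * M_EARTH / R_EARTH)
print(f"escape velocity: {v_escape / 1000:.1f} km/s")  # ~11.2 km/s
```

Any craft that reaches about 11.2 km/s near the Earth’s surface, as Luna 1 did, won’t fall back – which is why it ended up orbiting the Sun instead.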
After another launch failure in June 1959, the Soviets successfully launched the rocket later known as Luna 2 in September that year. Its crash landing on the Moon was a great success, which the ‘communist’ leader Khrushchev was quick to ‘capitalise’ on during his only visit to the USA immediately after the mission. He handed Eisenhower replicas of the pennants left on the Moon by Luna 2. And there’s no doubt this was an important event, the first planned impact of a human-built craft on an extra-terrestrial object, almost 10 years before the Apollo 11 landing.
The Luna 2 success was followed only a month later by the tiny probe Luna 3’s flyby of the far side of the Moon, which provided the first-ever pictures of its more mountainous terrain. However, these two missions formed the apex of the Luna enterprise, which experienced a number of years of failure until the mid-sixties. International espionage perhaps? I note that James Bond began his activities around this time.

the Luna 3 space probe (or is it H G Wells’ time machine?)
The Luna program wasn’t the only one being financed by the Soviets at the time, and the Americans were also developing programs. Six months after Laika’s flight, the Soviets successfully launched Sputnik 3, the fourth successful satellite after Sputnik 1 & 2 and Explorer 1. The important point to be made here is that the space race, with all its ingenious technical developments, began years before the famous Vostok 1 flight that carried a human being, Yuri Gagarin, into space for the first time, so the idea that the technology wasn’t sufficiently advanced for a moon landing many years later becomes increasingly doubtful.

the first Dalek? Sputnik 3
Of course the successful Vostok flight in April 1961 was another public relations coup for the Soviets, and it doubtless prompted Kennedy’s speech to the US Congress a month later, in which he proposed that “this nation should commit itself to achieving the goal, before this decade is out, of landing a man on the Moon and returning him safely to the Earth.”
So from here on in I’ll focus solely on the USA’s moon exploration program. It really began with the Ranger missions, which were conceived (well before Kennedy’s speech and Gagarin’s flight) in three phases or ‘blocks’, each with different objectives and with increasingly sophisticated system design. However, as with the Luna missions, these met with many failures and setbacks. Ranger 1 and Ranger 2 failed on launch in the second half of 1961, and Ranger 3, the first ‘block 2 rocket’, launched in late January 1962, missed the Moon due to various malfunctions, and became the second human craft to take up a heliocentric orbit. The plan had been to ‘rough-land’ on the Moon, emulating Luna 2 but with a more sophisticated system of retrorockets to cushion the landing somewhat. The Wikipedia article on this and other missions provides far more detail than I can provide here, but the intensive development of new flight design features, as well as the use of solar cell technology, advanced telemetry and communications systems and the like really makes clear to me that both competitors in the space race were well on their way to having the right stuff for a manned moon landing.
I haven’t even started on the Apollo missions, and I try to give myself a 1500-word or so limit on posts, so I’ll have to write a part 3! Comment excitant!
The Ranger 4 spacecraft was more or less identical in design to Ranger 3, with the same impact-limiter – made of balsa wood! – atop the lunar capsule. Ranger 4 came through preliminary testing with flying colours, the first of the Rangers to do so. However the mission itself was a disaster, as the on-board computer failed; no useful data was returned, and none of the preprogrammed actions, such as solar panel deployment and high-gain antenna utilisation, took place. Ranger 4 finally impacted the far side of the Moon on 26 April 1962, becoming the first US craft to reach another celestial body. Ranger 5 was launched in October 1962, at a time when NASA was under pressure due to the many failures and technical problems, not only with the Ranger missions but with the Mariner missions, Mariner 1 (designed for a flyby mission to Venus) having been a conspicuous disaster. Unfortunately Ranger 5 didn’t improve matters, with a series of on-board and on-ground malfunctions. The craft missed the Moon by a mere 700 kilometres. Ranger 6, launched well over a year later, was another conspicuous failure, as its sole mission was to send high-quality photos of the Moon’s surface before impact. Impact occurred, and overall the flight was the smoothest one yet, but the camera system failed completely.
There were three more Ranger missions. Ranger 7, launched in July 1964, was the first completely successful mission of the series. Its mission was the same as that of Ranger 6, but this time over 4,300 photos were transmitted during the final 17 minutes of flight. These photos were subjected to much scrutiny and discussion, in terms of the feasibility of a soft landing, and the general consensus was that some areas looked suitable, though the actual hardness of the surface couldn’t be determined for sure. Miraculously enough, Ranger 8, launched in February 1965, was also completely successful. Again its sole mission was to photograph the Moon’s surface, as NASA was beginning to ready itself for the Apollo missions. Over 7,000 good quality photos were transmitted in the final 23 minutes of flight. The overall performance of the spacecraft was hailed as ‘excellent’, and its impact crater was photographed two years later by Lunar Orbiter 4. And finally Ranger 9 made it three successes in a row, and this time the camera’s 6,000 images were broadcast live to viewers across the United States. The date was March 24, 1965. The next step would be that giant one.
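As an aside, those figures imply a fairly brisk transmission rate for mid-sixties hardware – a rough calculation from the numbers quoted above, ignoring how the rate varied during each descent:

```python
# (photos transmitted, duration of the final transmission in seconds)
missions = {
    "Ranger 7": (4300, 17 * 60),
    "Ranger 8": (7000, 23 * 60),
}

for name, (photos, seconds) in missions.items():
    print(f"{name}: ~{photos / seconds:.1f} photos per second")
```

That’s roughly four to five photos a second, right up to the moment of impact.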

A Ranger 9 image showing rilles – long narrow depressions – on the Moon’s surface