Archive for the ‘curiosity’ Category
reading matters 1

The universe within by Neil Turok (theoretical physicist extraordinaire)
Content hints
– Massey Lectures, magic that works, the ancient Greeks, David Hume and the Scottish Enlightenment, James Clerk Maxwell, quantum mechanics, entanglement, expanding and contracting universes, the square root of minus one, mathematical science in Africa, Paul Dirac, beauty and knowledge, the vitality of uncertainty, Mary Shelley, quantum computing, digital and analogue, Richard Feynman, science and humanity, humility, education, love, collaboration, creativity and thrill-seeking.
the strange world of the self-described ‘open-minded’ – part three, Apollo
In 2009, a poll held by the United Kingdom’s Engineering & Technology magazine found that 25% of those surveyed did not believe that men landed on the Moon. Another poll found that 25% of 18- to 25-year-olds surveyed were unsure that the landings happened. There are subcultures worldwide which advocate the belief that the Moon landings were faked. In 1977 the Hare Krishna magazine Back to Godhead called the landings a hoax, claiming that, since the Sun is 93,000,000 miles away, and “according to Hindu mythology the Moon is 800,000 miles farther away than that”, the Moon would be nearly 94,000,000 miles away; to travel that span in 91 hours would require a speed of more than a million miles per hour, “a patently impossible feat even by the scientists’ calculations.”
From ‘Moon landing conspiracy theories’ , Wikipedia

Time magazine cover, December 1968
Haha, just for the record, the Sun is nearly 400 times further from us than the Moon, but who’s counting? So now to the Apollo moon missions, and because I don’t want this exploration to extend to a fourth part, I’ll be necessarily but reluctantly brief. They began in 1961 and ended in 1975, and they included manned and unmanned space flights (none of them were womanned).
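Just for fun, both bits of arithmetic can be checked in a few lines (a rough sketch; the distances are the approximate figures quoted above):

```python
# Rough check of the distances and the hoaxers' arithmetic
# (all figures approximate, as quoted in the passage above).

sun_distance_miles = 93_000_000    # Earth-Sun, approx.
moon_distance_miles = 239_000      # Earth-Moon, approx.

# The Sun really is nearly 400 times further away than the Moon:
print(round(sun_distance_miles / moon_distance_miles))  # about 389

# The hoax claim: a ~94-million-mile trip in 91 hours would indeed
# need over a million miles per hour. The arithmetic is fine; it's
# the mythological Moon distance that's wrong by a factor of ~400.
claimed_trip_miles = 94_000_000
trip_hours = 91
print(round(claimed_trip_miles / trip_hours))  # just over a million mph
```

In other words, the Back to Godhead argument is internally valid but starts from a premise that is off by orders of magnitude.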
But… just one more general point. While we may treat it as inevitable that many people prefer to believe in hoaxes and gazillion-dollar deceptions, rather than accept facts as soundly evidence-based as their own odd existences, it seems to me a horrible offence in this case (as in many others), both to human ingenuity and to the enormous cost, not only in labour spent but in lives lost. So we need to fight this offensive behaviour, point people to the evidence, and not let them get away with their ignorance.
The Apollo program was conceived in 1960 during Eisenhower’s presidency, well before Kennedy’s famous mission statement. It was given impetus by Soviet successes in space. It involved the largest commitment of financial and other resources in peacetime history. The first years of research, development and testing involved a number of launch vehicles, command modules and lunar modules, as well as four possible ‘mission modes’. The first of these modes was ‘direct ascent’, in which the spacecraft would be launched and operated as a single unit. Finally, after much analysis, debate and lobbying, the mode known as Lunar Orbit Rendezvous (LOR) was adopted. The early phases of the program were dogged by technical problems, developmental delays, personal clashes and political issues, including the Cuban missile crisis. Kennedy’s principal science advisor, Jerome Wiesner, was solidly opposed to manned missions.
I can’t give a simple one-by-one account of the missions, as the early unmanned missions weren’t simply named Apollo 1, 2 etc. They were associated strongly with the Saturn launch vehicles, and the Apollo numbering system we now recognise was only established in April 1967. The Apollo 4 mission, for example, is also known as AS-501, and was the first unmanned test flight of the Saturn V launcher (later used for the Apollo 11 launch). Three Apollo/Saturn unmanned missions took place in 1966 using the Saturn IB launch vehicle.
The manned missions had the most tragic of beginnings, as is well known. On January 27, 1967 the three designated astronauts for the AS-204 spaceflight, which they themselves had renamed Apollo 1 to commemorate the first manned flight of the program, were asphyxiated when a fire broke out during a rehearsal test. No further attempt at a manned mission was made until October 1968. In fact, the whole program was grounded after the accident for ‘review and redesign’, with an overall tightening of hazardous procedures. In early 1968 the Lunar Module was given its first unmanned flight (Apollo 5). The flight had been delayed a number of times due to problems and inexperience in constructing such a module. The test run wasn’t entirely successful, but was successful enough to clear the module for future manned flights. The following, final unmanned mission, Apollo 6, suffered numerous failures, but went largely unnoticed due to the assassination of Martin Luther King on the day of the launch. Its problems nevertheless helped NASA to apply fixes which improved the safety of all subsequent missions.
And so we get to the first successful manned mission, Apollo 7. Its aim was to test the Apollo CSM (Command & Service Module) in low Earth orbit, and it put American astronauts in space for the first time in almost two years. It was also the first of the three-man missions and the first to be broadcast from within the spaceship. Things went very well in technical terms, a relief to the crew, who were only given this opportunity due to the deaths of the Apollo 1 astronauts. There were some minor tensions between the astronauts and ground staff, due to illness and some of the onboard conditions. They spent 11 days in orbit, and space food, though on the improve, was far from ideal.
Apollo 8, launched only two months later in December, was a real breakthrough, a truly bold venture, as described in Earthrise, an excellent documentary of the mission made in 2005 (the astronauts were the first to witness Earthrise from the Moon). The aim, clearly, was to create a high-profile event designed to capture the world’s attention, and to eclipse the Soviets. As the documentary points out, the Soviets had stolen the limelight in the space race – ‘the first satellite, the first man in orbit, the first long duration flight, the first dual capsule flights, the first woman in space, the first space walk’. Not to mention the first landing of a human-built craft on the Moon itself.

One of the world’s most famous photos, Earthrise, taken by astronaut William Anders on Christmas Eve, 1968
The original aim of the mission was to test the complete spacecraft, including the lunar module, in Earth orbit, but when the lunar module was declared unready, a radical change of plan was devised, involving an orbit of the Moon without the lunar module. Apollo 8 orbited the Moon ten times at close quarters (110 km above the surface) over a period of 20 hours. During the orbit the crew made a Christmas Eve telecast, the most watched program ever up to that time. Do yourself a favour and watch the doco. The commentaries of the astronauts’ wives are memorable, and put the moon hoaxers’ offensiveness in sharp relief.
By comparison with Apollo 8, the Apollo 9 mission (March ’69) was a modest affair, if that’s not too insulting. This time the complete spacecraft for a Moon landing was tested in low Earth orbit, and everything went off well, though space walking proved problematic, as it often had before for both American and Soviet astronauts, due to space sickness and other problems. With Apollo 10 (May ’69) the mission returned to the Moon in a full dress rehearsal of the Apollo 11 landing. The mission set some interesting records, including the fastest speed ever reached by a manned vehicle (39,900 km/h during the return flight from the Moon) and the greatest distance from home ever travelled by humans (due to the Moon’s elliptical orbit, and the fact that the USA was on the ‘far side of the Earth’ when the astronauts were on the far side of the Moon).
I’ll pass by the celebrated Apollo 11 mission, which I can hardly add anything to, and turn to the missions I know less – that’s to say almost nothing – about.
Apollo 12, launched in November 1969, was a highly successful mission, in spite of some hairy moments due to lightning strikes at launch. It was, inter alia, a successful exercise in precision targeting, as it landed a brief walk away from the Surveyor 3 probe, sent to the Moon two and a half years earlier. Parts of the probe were taken back to Earth.
The Apollo 13 mission has, for better or worse, come to be the second most famous of all the Apollo missions. It was the only aborted mission of those intended to land on the Moon. An oxygen tank exploded just over two days after launch in April 1970, and just before entry into the Moon’s gravitational sphere. This directly affected the Service Module, and it was decided to abort the landing. There were some well-documented hairy moments and heroics, but the crew managed to return safely. Mea culpa, I’ve not yet seen the movie!
Apollo 14, launched at the end of January 1971, also had its glitches but landed successfully. The astronauts collected quite a hoard of moon rocks and did the longest moonwalk ever recorded. Alan Shepard, the mission commander, added his Moon visit to the accolade of having been the first American in space ten years earlier. At 47, he remains the oldest man to have stepped on the Moon. The Apollo 15 mission was the first of the three ‘J missions’, involving a longer stay on the Moon. With each mission there were improvements in instrumentation and capability. The best known of these was the Lunar Roving Vehicle, first used on Apollo 15, but that mission also deployed a gamma-ray spectrometer, a mass spectrometer and a laser altimeter to study the Moon’s surface in detail from the command module. Apollo 16 was another successful mission, in which the geology of the Moon’s surface was the major focus. Almost 100 kg of rock were collected, and it was the first mission to visit the ‘lunar highlands’. The final mission, Apollo 17, also set records for the longest Moon stay, the longest total time spent on moonwalks, the largest samples returned and the longest lunar orbit. And so the adventure ended, with high hopes for the future.
I’ve given an incredibly skimpy account, and I’ve mentioned very few names, but there’s a ton of material out there, particularly on the NASA site of course, and documentaries aplenty, many of them a powerful and stirring reminder of those heady days. Some 400,000 technicians, engineers, administrators and other service personnel worked on the Apollo missions, many of them working long hours, experiencing many frustrations, anxieties, and of course thrills. I have to say, as an internationalist by conviction, I’m happy to see that space exploration has become more of a collaborative affair in recent decades, and may that collaboration continue, defying the insularity and mindless nationalism we’ve been experiencing recently.

a beautiful image of the International Space Station, my favourite symbol of global cooperation
Finally, to the moon hoaxers and ‘skeptics’. What I noticed on researching this – I mean it really was obvious – was that in the comments to the various docos I watched on YouTube, they had nothing to say about the science and seemed totally lacking in curiosity. It was all just parroted, ‘arrogant’ denialism. The science buffs, on the other hand, were full of dizzy geekspeak on technical fixes, data analysis and the potential for other missions, e.g. to Mars. In any case I’ve thoroughly enjoyed this little trip into the Apollo missions and the space race, in which I’ve learned a lot more than I’ve presented here.
Did Freud ever pass his orals?

Freud died of epithelioma from sticking too many cigars in his mouth, but he doesn’t strike me as the orally-fixated dependent type
A young person I know is studying psychology, probably for the first time, and she informed me of the stages of early childhood psychological development she has been told about – oral, anal, phallic, latency and genital. I’d certainly heard of the first two of these, but not so much of the others. A quick squiz at the lists of Dr Google led me to Freudian psychosexual theory, which naturally raised my sceptical antennae. And yet, despite my limited parental experience, I’ve noted that babies do like to put things in their mouths a lot (the oral stage is supposed to extend from birth to 1–2 years), sometimes to their great detriment. So, personality-wise, is the oral stage a real thing, and does it really give way to the anal stage, etc? I’m using the oral stage here to stand for all the stages in the theory/hypothesis.
These stages were posited by Freud as central to his hypothesis of psychosexual development – though how the phallic stage is experienced by girls is an obvious question. His view was that our childhood development was a matter of fixation, at various periods, on ‘erogenous zones’. After the oral stage, children supposedly switch to an anal stage, which lasts to 3 years of age – presumably on average. These switches might be delayed, or brought on earlier, in individual cases, and sometimes an individual might get stuck at a particular stage, denoting psychosexual problems.
So how real are these stages? Are some more real than others? What is the experimental evidence for them, do they exist in other primates, and if they exist, then why? What purpose do they serve?
It seems that Freud, and perhaps also his followers, built up a whole system around these stages and how individuals are more or less influenced by any one, or a combination, in the development of their adult personalities. Since the degree of influence of these different stages, and the way they’ve combined in each individual, is pretty well impossible to recover, the theory looks to be unfalsifiable. There is also the problem that psychologists can usually only track back from the adult’s personality to speculate about early childhood influences, which looks like creating a circular argument. For example, if an individual presents as an overly trusting, dependent personality, this may be cited as evidence of fixation at the oral stage of development, because children fixated at this stage are believed to develop these personalities in later life. The only way out of this impasse, it seems to me, is to define this oral stage (or any other stage) more carefully, so that we can accurately identify children who have experienced a prolonged or fixated oral stage, and then return to them to observe how their personalities have developed.
Of course there are other problems with the theory. There needs to be a clearer explanation, it seems to me, of how these apparently erogenously-related stages are marked into personality traits in later life. The relationship between an obsession with putting things in your mouth, or sucking, licking or otherwise craving and enjoying oral sensations, and a dependent, trusting personality, is by no means obvious. In fact, some might go as far as to say that, prima facie, it makes about as much sense as an astrologically-based account of personality.
Perhaps if we look at the oral stage, or claims about it, more closely, we’ll find something of an explanation. In this description, we learn that the libido, or life force, gets fixated in the oral stage in more than one way, leading to an ‘oral receptive personality’ and an ‘oral aggressive personality’. The first type, which is a consequence of a delayed or overly fixated oral stage, is trusting and dependent, the second is dominating and aggressive, due largely to a curtailed oral stage, apparently. Those who experienced a longer oral stage in childhood are supposedly more likely to be smokers and nail-biters as adults, though I’m not sure how this relates to being a dependent or trusting personality.
In any case this hardly takes us further in terms of evidence, and it’s worth noting that the site in which this is mooted is described as ‘integrated sociopsychology’. Dr Steven Novella, in the most recent episode of The Skeptics’ Guide to the Universe, warned that terms such as ‘integrative’, ‘functional’ and ‘holistic’ placed before ‘medicine’ are a red flag indicating a probably bogus approach. I suspect the same goes for psychology. Obviously the website’s author is a Freudian, and he makes this statement as to evidence:
What is undoubtedly disturbing to the ‘Freud-bashers’ is how much evidence has accumulated over the years to say that, in broad terms at least, if not always in detail, Freud’s observations pretty much stand up so many years later.
However, other psychology sites I’ve looked at, which don’t appear to me to be particularly Freud-bashing, have pointed to the lack of evidence as the principal problem for Freud’s stages. Of course the major problem is how to test for the ‘personality effect’ of these stages. Again I think of astrology – someone dedicated to astrological causation can always account for personality ‘deviations’ in terms of cusps and conjunctions and ascendants and the like, and this would surely also be the case for the confounding influences of our various cavities and tackle, so to speak.
Some 20 years ago a paper by Fisher & Greenberg (1996) suggested that Freud’s stages and other aspects of his early childhood writings should be scientifically examined as separate hypotheses, in a sort of piecemeal fashion. Unfortunately I can find little indication that evidence has been found for the oral stage as a marker for later personality development – or even looked for. This is probably because most scientists in the field – experimental psychologists – have little interest in these Freudian hypotheses, and little funding would be available for testing them. The studies would surely have to be longitudinal, with a host of potentially confounding factors accounted for, and the end results would hardly be likely to convince other early childhood specialists.
I’ve said the theory looks to be unfalsifiable, but I’m not quite prepared to say outright that it is. It seems to me that the oral stage, with its obvious association with breast-feeding, and the obvious association between prolonged breast-feeding and dependence, at least in popular culture, is the one most amenable to testing. The later Oedipus/Electra complexes, associated I think with the phallic stage, seem rather too convoluted and caveat-ridden to be seriously testable. I must admit to a residual fondness for some of Freud’s theories of development though, however unscientific they might be. Though I was never interested in the strict form of the Oedipus complex, because my father was by far the weaker of my parents, I felt it offered some insight into relations with the dominant parent – struggle, rivalry, attempts to overthrow. I also agreed with his general view that early childhood is absolutely crucial to our subsequent psychological development, and I found his ego, id and superego hypotheses enlightening and fascinating. Polymorphous perversity, sublimation and the pervasive influence of libido also tickled my fancy a lot.
I think it’s fair to say that Freud has had a greater influence on popular culture than on science, but it has been a profound influence, and overall a positive one. The term ‘observations’, rather than theories, seems better to describe his contributions. In writing about the libido and the pleasure principle, inter alia, he accepted our instinctive animal nature, and gave us ideas about how to both harness it and overcome it. Notions like the id and the superego seemed to give us fresh ways to think about desire, discipline and control. His ideas and concepts tapped into stuff that was very personal to us in our individual struggles, and his universalising tendencies helped us, I think, to look sympathetically at the struggles of others. Libido itself was a banner-word that helped release us from the straitjacket of earlier sexual thinking – or avoidance thereof.
It’s also probably unfair to expect from Freud’s pioneering work anything like the scientific rigour we expect and really need from psychology today. Certainly he was far too firm about the rightness of his most speculative work – I read The Interpretation of Dreams as an ideas-hungry teenager and was impressed with its first-half demolition of previous dream theories, but the second-half presentation of his own theory struck me even then as ludicrously weak, though it had the definitely positive effect of putting me off dream-interpreters for life (a dream that can be interpreted is a dream not worth having, and that’s their greatest gift to us). It’s more what he drew attention to that counts. His concept of the unconscious doesn’t really cut it today, but he made us start thinking of unconscious motivations in general, and much else besides. I’ve never been to an analyst, but I think one benefit of the psychoanalytic movement is to help us realise that there’s no normality and that we all carry baggage of guilt, anger, fear and frustration. For all its failings, his was a humanising enterprise.
preliminary remarks preliminary to a voyage

follow the thick blue line
I’ve been working desultorily on a number of blog pieces which I’m struggling to finish, partly because they’re hard work but also because the excitement and stress is building for my maiden voyage overseas, not counting my barely-brain-developed boat-trip to Australia from Southampton aged 5 – memories include a camel train on the banks of the Suez, being rescued from drowning in the ship’s pool, and being befriended by an older kid which mainly involved being chased around the decks a lot. So from this day forth I’m devoting this blog to the trip, lots of short sharp and shiny shite, around 500 words daily, though I’m unlikely to keep to that limit, seriously.
So I’m not yet packed and wondering about the Aus$, which they say is rising, and that’s good for OS travel. I’ve been described – though only by one person, my travelling companion – as a Scottish mothpurse, and my main stressor is definitely $$$$ – sadly I don’t have the symbol for euros on my keyboard. I think the recent rise means cheapie flights but ours was paid for long ago. The current Aus$ buys .68 in euros and I’ve no idea whether that’s good or bad or better than it was, whenever was was. Anyhow nothing to be done so let’s change the subject to my moustache. I thought it’d be a fine frivolity to grow one for the trip, something Frenchy and chic and daft, but after about four days’ growth it’s looking more Hitler than Charles Boyer, who was too chic to sport a tache anyway, and besides I’ve never liked them. At least my hair’s grown salt’n pepper with age, and seriously short on pepper, so it’ll be prominent as frost on a silver dust bush, and a change is as good as a haircut so I’ll leave it growing for now.
I’m at the frantically-seeking-advice stage. Got my first-ever passport (had to become an Australian citizen, which made me feel like a fraud come congratulations time), a money-belt, an international connector thingy. Downloaded Skype for myself and my travelling companion (though I won’t be using it, having no friends and family), had it explained to me that Messenger through Facebook is the cheapest form of communication – would desperately love to have an extra TC, aged about 13, to keep me straight on smartphone technostuff etc. Told to wear stockings on the flight, against DVT, which I may not, and have found, hopefully, the right advice against aerosinusitis, aka plane brain, which had me folded over my seatbelt on a recent flight to Melbourne. Still have to photocopy my passport, do some house-cleaning and catfood-buying for my house-sitter, and other things I can’t remember. My mind’s blanking out unpredictably so I’m sure to stuff something majorly up, but my TC’s coming over tomorrow to help with the packing and share the stress.
Okay the itinerary. A 14-day cruise or thereabouts down the Danube-Main-Rhine from Budapest to Amsterdam, after which a two-night stopover and then a train to Paris for a week’s stay on the île Saint-Louis, the walls of our cosy pied-à-terre lapped by the Seine, plus ou moins. Then down the tunnel and two nights in once-swinging London, and then, hurly-burly done, back to the serenity and quiet contemplation of home. On verra.
what should a vegan’s pet eat, and other immortal questions
Jacinta: So here’s a question – if vegans have pets – say a cat or a dog – do they feed them only vegetables?
Canto: I don’t know, I suppose it would depend on the vegan…
Jacinta: Shouldn’t it depend on the pet? Cats and dogs are carnivores aren’t they? So it would be a form of cruelty to deprive them of meat. Might even be murder.
Canto: We don’t extend murder to the killing of other animals.
Jacinta: Many vegans do.
Canto: Good point. I once read an article by a vegan philosopher, who gets out of those problems by declaring that using animals as pets is unethical. A form of slavery, I suppose.
Jacinta: So, we free the pets? Along with the cows, the sheep, the donkeys, the camels, the water buffalos, the horses, the chooks and pigeons and all those other creatures we’ve used and abused so horridly?
Canto: Well, from memory – I’ll never be able to hunt out the article – he didn’t address the issue of those animals already under captivity of one sort or another. He was simply wanting to argue on general principles that using animals for our personal benefit was unethical.
Jacinta: Even if it benefits the animal?
Canto: Well I suppose the argument would be that even a well-treated slave is still a slave.
Jacinta: But if you free a dog, say, what would happen to it? You’re actually throwing it out of its home, it has nowhere else to go. And I believe that there’s historical evidence that dogs, and probably cats too, have adapted to live with humans. That it was their choice, in a sense. Like pigeons in the city getting fat on leftover bits of hamburger, with no obvious ill-effects. Do pigeons get diabetes?
Canto: Well there’s an obvious difference between scavenging pigeons and pets. Pets don’t choose to become pets. I think that’s the way the argument would run. Unfortunately there are a lot of current pets who would suffer from being set free, but that’s not the issue.
Jacinta: I think I see. We look after the pets we’ve got, then bury them and don’t have any more. And this wouldn’t mean the end of all dogs because there are plenty of strays – scavengers – to maintain the species. And no more enforced ‘pedigree’ breeding – I’d be all for that. But there’s a problem – in order to get rid of all the pets, you have to stop them breeding and that would mean desexing them – a gross interference of their right to reproduce. And if you allow them to reproduce, you must surely bear responsibility for their offspring as your home is theirs. You’re caught in a trap, you can’t walk out, because you love them babies too much.
Canto: You’re looking at it all from a practical perspective, which is all fine and good and relevant, but I think the issue for this philosopher was, I think – judging from him being a vegan – that all such usage of animals – pets as cuddly toys, dolphins as trained performers, horses and camels as pack animals, etc, not to mention farming them for slaughter – is unethical. What do you think of that as a general principle?
Jacinta: I don’t think it holds up, because species take advantage of other species all the time, and not just by preying on them. Sharks have their remoras, we have lice more or less specially adapted to us, roses have their aphids, in fact everywhere you look you have species making use of other species. And presumably being a vegan he marks a strict boundary between animal and vegetable and in reality that’s quite a fuzzy boundary, like with coral. And what about insects, what’s the vegan take on that?
Canto: Presumably negative – they have eyes and antennae and feelings of some sort.
Jacinta: Yes, well it’s a step too far I think. Yes we have a moral responsibility to avoid causing undue suffering….
Canto: Well what about this argument. Because we can survive – and indeed thrive – on only plants, we should do so. I mean, you’re talking about species that, say, are mostly carnivorous – that won’t survive if their food supply dries up. Sharks, for example, they can’t just become vegan, they’ve adapted to a very specific diet. We on the other hand are omnivores, we can dispense with certain varieties of food, including meat, and still live healthy lives, perhaps.
Jacinta: Hmmm, that’s definitely a more difficult question. I do believe that being omnivores, or being very adaptable in our diet has stood us in very good stead in the past, like in the last major ice age when we almost died out apparently. So I’m wondering whether confining our diet might not expose us to greater risks…
Canto: It may not even mean confining our diet – we could synthesise many of the proteins and other nutrients we nowadays get from meat. We’ve already done that, probably.
Jacinta: Well I’ve heard they’re still a long way from synthesising anything that really has the nutrients as well as the texture, flavour, odour and je ne sais quoi of meat. At under about $200,000.
Canto: And if they achieved that feat, and got it down to competitive prices, would you go vego?
Jacinta: Well of course – I’d have no reason not to. I just don’t think it’ll happen in my lifetime.
Canto: But let’s say for argument’s sake that it does – would you feed this synthetic stuff to your pet cat?
Jacinta: Ah so we come full circle. Yes I would, since it would be more or less chemically identical to meat.
Canto: But animals that have adapted to become carnivores have also adapted to become hunters. They go together. Haven’t you turned your cat from its proper course in life?
Jacinta: No, she became removed from her ‘proper course’, if there is such a thing, by becoming my pet, whether by her choice or mine, or the choice of her ancestors. Likely she will keep up her hunting skills, catching flies and insects and mice and small birds, if she can. And she will benefit from being my friend, as I will benefit from being hers. Like all good friends, we’ll use each other for our own purposes, which we hope will be, and will try to make, mutually beneficial.
Canto: Okay, no further questions your excellency.
Pourquoi science? – inter alia
So as I approach my sixtieth year I’m in a mood to reflect on my largely wasted, dilettantish life (at least seen from a certain perspective… ).
It seems to me that my two older siblings and I were largely the products of benign neglect, if that’s not too unfair to my parents, who seemed largely pre-occupied with their – highly dysfunctional – relationship with each other. Anyway this neglect had its advantages and disadvantages, and it was offset by at least one key decision of my mother (by far the dominant parent). She had us taken to the local library once a fortnight to borrow books, and there were always books aplenty in the house, including at least two sets of encyclopaedias. So from the age of six or seven until I left home, the local libraries became a haven.
From almost the beginning though I felt a difference between learning, which was a thrill, and school, which I suffered in silence. My first strong memory of school comes from grade one, when I was five or six. My teacher asked me to read from our class reader and I had to tell her that I’d forgotten to bring it from home. She blew up at me. ‘You’ve forgotten it again! What’s the matter with you? How many times have I told you,’ etc etc. I was extremely humiliated. I was learning that I was vague, forgetful, disorganised, and it was all too true. Shortly after this, I arrived at school and discovered I’d forgotten my reader again. I was so scared I hid in the bushes until break time, when I rejoined the class unnoticed, apparently (though probably not). I remember the sense of being defiant and tricksterish.
It’s funny that I’m now a teacher who checks students’ homework and has to admonish those who don’t do it, because as a kid in primary school and later in high school, when the issue loomed much larger, I never did any homework. Not once, ever. I even got caned for it in high school. And suffered endless screaming fits from my mother when the matter was reported back to her. I remember many sleepless nights fretting about how to survive the next day’s questioning, but still I was unable or unwilling to comply. I spent a lot of my school days staring out the window, daydreaming of freedom. One day I watched a tiny bird – a hummingbird, I thought, but we have no hummingbirds in Australia – hovering a bit driftily above some bushes, for ages and ages. What an ability, what a perspective it had! And yet it felt constrained to hover there. Maybe only humans could free themselves from these ‘natural’ constraints.
I concocted an idea for a novel, which I confided to my sister, of schoolkids rising up and throwing out the teachers, establishing an ‘independent state’ school – an idea I probably took from Animal Farm. She was very enthusiastic, probing me on the details, assuring me it would be a best-seller, I would become famous. I became briefly obsessed with contemplating and planning the takeover – the secret meetings, the charismatic leader, the precisely organised tactics, the shock and dismay of our former masters, the nationwide reaction – but of course I soon stumbled over the outcome. Surely not Animal Farm again?
I learned over time that Elizabeth, our town, was the most working-class electorate in South Australia, with the largest percentage of labor voters in the state, and possibly even the country. Of course, one had to take pride in being the biggest or the most of anything, but what did it mean to be working-class? Was it a good or a bad thing? Was our family more or less working-class than our neighbours? I was discovering that interesting questions led to more questions, rather than to answers. That, as Milan Kundera wrote, the best questions didn’t have answers, or at least not final ones. Of course, the provisional answer seemed to be that it wasn’t good to be working class, or middle class, or upper class, but to move beyond such limitations. But I was learning, through my library reading, which increasingly consisted of Victorian English literature for some reason, that class wasn’t so easy to transcend.
I continued to struggle as my schooling moved towards the pointy end. Classmates were dropping out, working in factories, getting their first cars. I was wagging school a lot, avoiding the house, sleeping rough, drinking. My older brother started an economics degree at university – probably the first person in the history of my parents’ families to do so, as the prospect of university education was opened up to the great unwashed – but I was unlikely to be the second. I recall wagging it one afternoon, walking to the end of my street, where the city of Elizabeth came to an abrupt end, and wandering through the fields and among the glasshouses of the Italian market gardeners, armed with my brother’s hefty economics textbook, and getting quite excited over the mysteries of supply and demand.
And so it went – I left school, worked in a factory here, a factory there, went on the dole, worked in an office for a while, got laid off, another factory, moved to the city, shared houses with art students, philosophy students, mathematics nerds (whom I loved), wrote volumes of journals, tried to write stories, ritually burned my writings, read philosophy, had regular bull sessions about all the really interesting things that young people obsess about and so on and on. And I haven’t even mentioned sex.
I’d always been hopelessly shy with the opposite sex and wrote myself off as eternally poor and inadequate, but I loved girls and fantasised endlessly. I felt guilty about it, not because I thought it immoral – I never had any moral qualms about sex, which made it all the more easy to dismiss religions, which all seemed to be obsessed with regulating or suppressing it. I felt guilty because sexual daydreaming always seemed the lazy option. Like Proust’s Swann, I would tire easily from thinking too much, especially as those great questions never had any easy or final answers. So I would give up and indulge my fantasies, and even the occasional unrequited or unrealistic passion for a real female acquaintance. I remember hearing of a celebrated mathematician who would wander homeless around the USA, I think it was, couchsurfing at the homes of mathematical colleagues male and female, inspiring them to collaborate with him on mathematical papers, so that he held a record for the most papers published in peer-reviewed journals. An attractive female colleague laughed at the idea of an affair with him, because apparently everyone knew he was entirely asexual, and had never been heard to even mention sex in his life… Could this be true, I wondered, and if so, how could I create for myself a brain like his? It seemed to me that Aristotle was right: the pleasure derived from certain types of contemplation is greater than sexual pleasure (though dog knows I’d hate to forgo sex). I’d experienced this myself, grappling with something in Wittgenstein, reading a passage over and over until an insight hit me and set me pacing around my bedroom all night long talking to myself. But maybe it was all bullshit.
So now to get to the heart of the matter – pourquoi science? As a youngster I read novels, and sometimes works of history – one of my first big adult books was a very good biography of Richard III, which I read at 14, and which came flooding back when Richard’s body was miraculously discovered recently. But I never read science. At school I quickly lost track of physics and mathematics, while always being vaguely aware of how fundamental they were. Through philosophy in my early twenties I started to regain an interest, but generally I’d resigned myself to being on the arts side of the great divide.
One book, or one passage in a book, changed this. The book was Der Zauberberg, or The Magic Mountain, by Thomas Mann, which I read in 1981. This was the story of Hans Castorp, a young man in his mid-twenties, as I was when I read it. As a tubercular patient, he was sent to a sanitarium in the Alps for a period of enforced idleness, where he encountered a number of more or less interesting characters and was encouraged to grapple with some more or less interesting ideas. Wrapped up on his loggia, he was reading some books on fundamental science, and fell into contemplation, and in a passage of some fifteen pages he asked himself two fundamental questions, both of which branched off into a whole series of sub-questions (or so I remember it). They were: What is life? and What is matter? And there was something about the way Mann animated this Castorp character, as ordinary a fellow as myself, that made me identify with his questioning and his profound wonder. It just flipped a switch in me. These were the questions. They could easily fill several lifetimes. No reason ever to be bored again.
I immediately went out and bought my first ever science magazine, Scientific American, and throughout the eighties I bought each monthly issue and read it cover to cover, not always understanding it all of course, but gradually building up a general knowledge. Later I switched to New Scientist, and nowadays I read the fine Australian magazine Cosmos, as well as listening to science podcasts and reading the odd blog. I’m far from being a scientist, and I’ll never have more than a passing knowledge – but then, that’s all that even the most brilliant scientist can hope for, as Einstein well knew.
But here’s the thing – and I’ll expand on this in my next post. It’s not science that’s interesting – science is just a collection of tools. What’s interesting is the world. Or the universe, or everything. It’s the curiosity, and the questions, and the astonishing answers that raise so many more questions. For example – what is matter? Our investigations into this question have revealed that we know bugger all about the stuff. And when we were young, as a species, we thought we knew it all!
Next time, I’ll focus more deeply on science itself, its meaning and its detractors.
a change of focus, and Charlie Darwin’s teenage fantasies
“bashful, insolent; chaste, lustful; prating, silent; laborious, delicate; ingenious, heavy; melancholic, pleasant; lying, true; knowing, ignorant; liberal, covetous, and prodigal”
Michel de Montaigne, ‘Myself’
I was sitting at my computer with the ABC’s ‘Rage’ on in the background when on came a video by an artist who’s taken the moniker ‘Montaigne’, and how could I not be attracted? Good luck to her. I first stumbled on the original Montaigne decades ago, and like thousands before and since, I was fairly blown away. He’s been an inspiration and a touchstone ever since, and to think I’m now approaching his age at his death. One thing he wrote has always stayed with me, and I’ll misquote in the Montaignian tradition, being more concerned with the idea than the actual words – something like ‘I write not to learn about myself, but to create myself’. This raises the importance of writing, of written language, to an almost ridiculous degree, and I feel it in myself, as I’ve sacrificed much to my writing, such as it is. Certainly relationships, friendships, career – but I was always bad at those. All I have to show for it is a body of work, much of it lost, certainly before the blogosphere came along, the blogosphere that retains everything, for better or worse.
The New Yorker captures the appeal of Montaigne well. He wasn’t an autobiographical writer, in that he didn’t dwell on the details of his own life, but as a skeptic who trusted little beyond his own thoughts, he provided a fascinating insight into a liberal and wide-ranging thinker of an earlier era, and he liberated the minds of those who came later and were inspired by his example, including moi, some 400 years on. So, I’d like to make my writings a bit more Montaignian in future (I’ve been thinking about it for a while).
I’ve been focussing mainly on science heretofore, but there are hundreds of bloggers better qualified to write about science than me. My excuse, now and in the future, is that I’m keen to educate myself, and science will continue to play a major part, as I’m a thorough-going materialist and endlessly interested in our expanding technological achievements and our increasing knowledge. But I want to be a little more random in my focus, to reflect on implications, trends, and my experience of being in this rapidly changing world. We’ll see how it pans out.
Reading the celebrated biography of Charles Darwin by Adrian Desmond and James Moore, I was intrigued by some remarks in a letter to his cousin and friend, William Darwin Fox, referring to the ‘paradise’ of Fanny and Sarah Owen’s bedrooms. This was 1828, and the 19-year-old Darwin, already an avid and accomplished beetle collector and on his way to becoming a self-made naturalist, was contemplating ‘divinity’ studies at Cambridge, having flunked out of medicine in Edinburgh. Fanny was his girlfriend at the time. These bedrooms were
‘a paradise… about which, like any good Mussulman I am always thinking… (only here) the black-eyed Houris… do not merely exist in Mahomets noddle, but are real substantial flesh and blood.’
It’s not so much the sensual avidity shown by the 19-year-old that intrigues me here, but the religious attitude (and the fascinating reference to Islam). For someone about to embark on a godly career – though with the definite intention of using it to further his passion for naturalism – such a cavalier treatment of religion, albeit the wrong one, as ‘inside the noddle’, is quite revealing. But then Darwin’s immediate family, or the males at least, were all quasi-freethinkers, unlike his Wedgwood cousins. Darwin never took the idea of Holy Orders seriously.
a plague of mysteries
I’m writing this because of some remarks made in the workplace which – well, let’s just say they set my sceptical antennae working overtime. They were claims made about the bubonic plague, of all things.
Bubonic plague, dubbed the Black Death throughout European history, is a zoonotic disease, which means it spreads from species to species – in this case from rodents to humans via fleas. Actually there are three types of ‘black death’ plagues, all caused by the enterobacterium Yersinia pestis, the others being the septicemic plague and the pneumonic plague. Other zoonotic diseases include ebola and influenza. Flea-borne infections generally attack the lymphatic system, as does bubonic plague. The term ‘bubonic’ comes from the Greek word for groin, and the most well-known symptoms of the disease were ‘buboes’, grotesque swellings of the glands in the groin and armpit.
It wasn’t called the Black Death for nothing (the blackness was necrotising flesh). It’s estimated that half the European population was wiped out by it in the 14th century. If untreated, up to two-thirds of those infected will be dead within four days. With modern antibiotic treatments, the mortality rate is of course greatly reduced. The broad-spectrum antibiotic streptomycin has proved very effective. Of course treatment should be immediate if possible, and prophylactic antibiotics should be given to anyone in contact with the infected.
The plague is first known to have struck Europe in the sixth century, at the time of Justinian. The Emperor actually caught the disease but recovered after treatment. It’s believed that the death toll was very high, but little detail has been recorded. The fourteenth century outbreak appears to have originated in Mongolia, from where it spread through Mongol incursions into the Crimea. An estimated 25 million died in this outbreak from 1347 to 1352. More limited outbreaks occurred in later centuries, and the last serious occurrences in Europe were in Marseille in 1720, Messina (Sicily) in 1743, and Moscow in 1770. However it emerged again in Asia in the nineteenth century. Limited for some time to south-west China, it slowly spread from Hong Kong to India, where it killed millions of people in the early twentieth century. Infected rats were inadvertently transported to other countries by trading vessels, resulting in outbreaks in Hawaii and Australia. By 1959, when worldwide casualties dropped to under 200 annually, the World Health Organisation was able to declare the disease under control, but there was another outbreak in India in 1994, causing widespread panic and over 50 deaths.
So that’s a very brief history of the rise and fall of bubonic plague, but I’m interested in looking at early treatments and the discovery of its cause. For the fact is that, even in 1900, when the plague first came to Australia, there was no clear consensus among the experts as to its means of transmission, with many believing that it spread through contact with the infected. However, a growing body of evidence was showing a connection with epizootic infection in rats, and as it happened, work done by the Australian bacteriologists Frank Tidswell, William Armstrong and Robert Dick – working under Chief Medical Officer John Ashburton Thompson in a new Sydney public health department, itself established as a direct result of the plague outbreaks in Sydney from 1900 to 1925 – contributed substantially to the modern understanding of Yersinia pestis and its spread from rats to humans. This Australian work was another step forward in the germ theory of disease, first suggested by the French physician Nicolas Andry in 1700, and built upon by many experimental and speculative savants over the next 150 years. The great practical success of John Snow’s work on cholera, followed by the researches of Louis Pasteur and Robert Koch, established the theory as mainstream science, but zoonotic infections, especially indirect ones where the infection passes from one species to another by means of a vector, have always been tricky to work out.
In fact it was in Hong Kong that the Yersinia pestis bacterium was identified as the culprit. An outbreak of plague occurred there in the 1890s, and Alexandre Yersin, a bacteriologist who had worked under both Pasteur and Koch, was invited to research the disease. He identified the bacterium in June 1894, at about the same time as a Japanese researcher, Kitasato Shibasaburo. The cognoscenti recognise that both men should share the honour of discovery.
What is fascinating, though, is that the spread of plague from Asia in the 1890s to various ports of the world in the early 20th century was very different from the spread of earlier pandemics. Did this have anything to do with science or human practices? Well, what follows is drawn from by far the most comprehensive analysis of the disease I’ve found online, Samuel Cohn’s ‘Epidemiology of the Black Death and successive waves of plague’, in the Cambridge journal Medical History.
Cohn’s research and analysis casts credible doubt on the whole plague story, specifically the assumption that we’re dealing with one disease, from the sixth century through to modern outbreaks. He recounts the standard story of three separate pandemics, in the sixth century with a number of recurrences, ditto in the fourteenth century, and in the nineteenth. However, the epidemiology of the most recent pandemic, definitely attributed to Y pestis and its carrier the Oriental rat flea, Xenopsylla cheopis, is substantially different from that of pandemics one and two, a fact which, according to Cohn, has been obscured by inaccurate analysis of the records. Cohn’s own analysis, it must be said, is exhaustive, with 30 pages of references in a 68-page online essay. He doesn’t have a solution as to what caused the earlier pandemics, but he asks some cogent questions. For my own understanding’s sake, I’ll try to summarise the issues in sections.
speed of transmission
Pandemic 3, if we can call it that, was a much slower mover than the previous two. It seems to have sprung up in China’s Yunnan province, from where it reached Hong Kong in 1894. It was noted in the early 20th century that Y pestis was travelling overland at a speed of only 12 to 15 kilometres a year. This can be explained by the fact that Y pestis is a disease mainly of rats, though other rodents can also be infected, and rats don’t move far from their home territories. At this rate pandemic 3, even in a world of railways, cars and dense human populations, would have taken some 25 years to cover the distance that pandemic 1 covered in 3 months. Pandemic 1 made its first appearance in an Egyptian port in 541 and quickly spread around the Mediterranean from Iberia to Anatolia. Within two years of its first occurrence it had reached the wastelands of Ireland and eastern Persia. Pandemic 2, believed to have originated in India, China or the Russian steppes, made its first European appearance in Messina, Sicily, in 1347. Within three years it had impacted most of continental Europe, and had even reached Greenland. The fastest overland travel recorded for plague occurred in 664 (pandemic 1), when it took only ninety-one days to travel 385 kilometres from Dover to Lastingham (4.23 km a day) – far faster than anything seen from Y pestis since its discovery in 1894. Pandemic 2’s speed was similar, as Cohn details it:
like the early medieval plague, the “second pandemic” was a fast mover, travelling in places almost as quickly per diem as modern plague spreads per annum. George Christakos and his co-researchers have recently employed sophisticated stochastic and mapping tools to calculate the varying speeds of dissemination and areas afflicted by the Black Death, 1347–51, through different parts of Europe at different seasons. They have compared these results to the overland transmission speeds of the twentieth-century bubonic plague and have found that the Black Death travelled at 1.5 to 6 kilometres per day—much faster than any spread of Yersinia pestis in the twentieth century. The area of Europe covered over time by the Black Death in the five years 1347 to 1351 was even more impressive. Christakos and his colleagues maintain that no human epidemic has ever shown such a propensity to cover space so swiftly (even including the 1918 influenza epidemic). By contrast to the spread of plague in the late nineteenth and twentieth centuries the difference is colossal: while the area of Europe covered by the Black Death was to the 4th power of time between 1347 and 1351, that of the bubonic plague in India between 1897 and 1907 was to the 2nd power of time, a difference of two orders of magnitude.
All of which raises the question – why was pandemic 3 so much slower than the others? Could it be that Y pestis wasn’t the cause of the earlier pandemics?
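The size of that gap is easy to check from the figures already quoted. A minimal sketch (Python, illustrative only – all the numbers are the ones cited above, and the comparison ignores seasonal pauses and variation in terrain):

```python
# Comparing the overland transmission speeds quoted in the post.

# Pandemic 1, 664 CE: Dover to Lastingham, 385 km in 91 days
medieval_speed = 385 / 91               # km per day, about 4.23

# "Third pandemic" (Y. pestis, early 20th century observations):
# roughly 12 to 15 kilometres per YEAR overland
modern_low = 12 / 365                   # km per day
modern_high = 15 / 365

print(f"Medieval plague:  {medieval_speed:.2f} km/day")
print(f"Modern Y. pestis: {modern_low:.3f} to {modern_high:.3f} km/day")
print(f"The medieval spread was roughly "
      f"{medieval_speed / modern_high:.0f}x to "
      f"{medieval_speed / modern_low:.0f}x faster")
```

Even taking the generous end of the modern range, the seventh-century spread comes out around a hundred times faster per day – which is Cohn’s point in a nutshell.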
mode of transmission
We know that Y pestis is a disease of rats, and we know that the Black Death was all about rats, so that’s an obvious connection, no? Well, according to Cohn, what we think we know is just wrong. ‘… no scholar has found any evidence, archaeological or narrative, of a mass death of rodents that preceded or accompanied any wave of plague from the first or second pandemic.’ I must say I found this incredible when I first read it, yet Cohn seems to have investigated the sources thoroughly.
Cohn notes that:
while plague doctors of “the third pandemic” discovered to their surprise that the bubonic plague of the late nineteenth and twentieth centuries was rarely contagious, contemporaries of the first suggest a highly contagious person-to-person disease. Procopius, Evagrius, John of Ephesus, and Gregory of Tours characterized the disease as contagious and, in keeping with this trait, described it as clustering tightly within households and families; the evidence from burial sites supports their claims.
Pandemic 2 made the word contagium popular among the general public, and the incredible speed of transmission became one of the principal signs of the Black Death, differentiating it, for example, from smallpox, which had some similar physical characteristics. This contagion suggests person to person contact, more typical of pneumonic plague, which is highly infectious and can be transmitted through coughing and sneezing. A later chronicler of pandemic 2, Richard Mead, writing in the 1700s, advised against crowding plague sufferers in hospitals, as it ‘will promote and spread the Contagion’. However, those treating pandemic 3 noted, to their surprise, that plague wards were the safest places to be, and that this particular plague rarely took on the pneumonic form.
Cohn notes that the earlier pandemics were often associated with famine. For example in Alexandria and Constantinople in 618 and 619 famine preceded the plague and appeared to spark it into life. However, pandemic 3, definitely caused by Y Pestis, tended not to thrive in situations of dearth and was instead fed by increased yields. Such yields lead to higher rat populations, and higher rates of possibly infected rat fleas and so higher rates of transmission to humans.
death rates
According to contemporary accounts the first pandemic wiped out entire regions, decimating the inhabitants of cities and the countryside through which it so swiftly passed. These accounts are backed up by archaeological and other evidence. It’s pretty clear that millions died in the second pandemic too. Compare this to the third pandemic, which spread so slowly and was limited to coastal areas and even just shipping docks. In temperate zones, this last pandemic resulted in deaths in the hundreds, with never more than 3% of an affected population dying.
symptoms
Although few contemporary records describe the signs or symptoms of plague for pandemic 1, those that do (and Cohn cites six different ancient authors) are in general agreement in their descriptions of ‘swellings in the groin, armpits, or on the neck just below the ear’, the classic symptoms of bubonic plague. Procopius of Caesarea also observed that victims’ bodies were covered in black pustules or lenticulae. Pandemic 2, which begins with the Black Death of 1347-52, is marked, on the other hand, by extensive records, both professional and popular – writings about it were amongst the first forms of popular literature.
range and seasonality
Another problem for the view that this has all been the doing of Y pestis is that pandemics 1 and 2 could strike all year round, but generally settled into a pattern of prevailing in summer in the southern Mediterranean and the Near East, which is not the best season for the flea vector X cheopis. The seasonal cycle of modern plague is quite different, and the range is much more limited.
So all this opens up a mystery. Scientists are agreed that we don’t have a clear-cut story of Y pestis causing horrific disease through rats and fleas over millennia (archaeological and other evidence suggests that rats were scarce in 14th century Europe), but they’re much in disagreement about what the real story might be. If not Y pestis, then maybe a hemorrhagic virus, such as the one that causes ebola. Such viruses are notorious for their rapid transmission, their resurgences and their high mortality rates. Pneumonic plague, the more infectious, lung-infecting form of plague, may also be implicated, but this doesn’t appear to agree with most of the described symptoms of pandemics 1 and 2. Other types of fleas, not associated with rats, as well as lice, are also being considered as possible vectors. Some geneticists believe that a variant of Y pestis may have been responsible. It looks as if genetic analysis is the most likely pathway to finding a solution.
This article got started, as I wrote at the beginning, because someone keen on naturopathy said something about bubonic plague in our staff room. Some plant she brought in, which had great anti-oxidant properties (she clearly hasn’t kept up with the latest findings on anti-oxidants) was also a cure for bubonic plague, or maybe it was a variant of the plant, and the person who discovered the secret of its healing properties died suddenly (presumably not from plague) and the secret was lost to us for centuries…
on a big jet plane
This morning I did something I’ve never done in my entire adult life, and I’m nearly 58. I got into an aeroplane which went into the sky. It took me to Melbourne from Adelaide. From there I caught another plane to Canberra, where I’m writing this in the city’s YHA.
I was anxious about this flight. I have a guilty secret: I’m an addict of Air Crash Investigations, so I’m semi-expert on the many things that can go wrong on an aircraft and I’ve had very little experience of a plane arriving safely at its destination.
I did travel on a plane at 14, from Adelaide to Kangaroo Island and back again – about an hour’s travel all up. Today’s journeys weren’t much longer, but of course it’s the take offs and landings that are the major killers.
I do realise that air travel is the safest mode available. I’m about the only person I know who hasn’t travelled by plane dozens of times without being the worse for it, but that’s not much consolation when you strap yourself into your tight little economy seat and note how flimsy everything looks, how thin the barrier between yourself and the outside air – air which, I soon learn, is 37,000 feet above solid ground.
While we were walking through one of those moveable corridors that led directly to the aircraft’s front door I could see the pilot and his apprentice (had he earned his Ps?) chatting in the cockpit (strange word for a space designed to bring thousands of travellers through thousands of kilometres of high sky). I was shocked at how vulnerable they looked sitting there so prominently forward in what I think is called the nose-cone, which looked horribly fragile, like a glass egg that could be cracked by any passing bird. I’d expected something more like the bridge of the starship Enterprise, or that mysterious intergalactic vessel that Carl Sagan peered out of in Cosmos.
I was also a bit shocked at how bus-like the interior was, with its densely packed seating and narrow central aisle. Of course this was no jumbo jet – do they use that term nowadays? – but even so… and then I was shocked again, as we taxied to the runway, that I could feel the bumps on the road, as if we really were in a taxi with suspension issues. Through the window I could see the plane’s right wing bouncing and shuddering. It wasn’t screwed on properly! I was having a little joke with myself, but I wasn’t amused. I glanced around at the other passengers. One was reading a magazine, another was yawning ostentatiously. I had a book in my lap – Will Storr’s The heretics: adventures with the enemies of science – but this time it was just for show. It just wouldn’t do to behave like a gawping schoolboy, though that was exactly what I was doing. And to be fair to my benumbed self, even the sad circumstances of schizophrenics and Morgellons sufferers seemed to pale in comparison to my life and death situation.
We moved off from the airport lights into the pre-dawn dimness. I wasn’t going to see much of this takeoff, I’d have to rely on feeling. Someone over the intercom was saying, in his most reassuring voice, that the weather in Melbourne was pretty dismal, suggesting problems with the landing. Oh my. On the runway, everything suddenly got loud. The rockets had launched, or something, and then we were off the ground, I could tell by the lights falling away beneath me.
Dawn was breaking. Soon I could see clearly the mass of Lake Alexandrina, with Lake Albert attached like a suckling pup. I knew it well from so many maps, and I thought of those great mapmakers Jim Cook and Matt Flinders, how amazed they would’ve been at seeing such grand features, that would’ve taken them weeks to survey, set before them in an instant. But then the plane veered off, tilting at an angle that no bus would ever survive, and again I glanced around at my fellow passengers to check if it was okay to panic. All was blandness, and when the plane finally righted itself I gazed down – and due to the cloud cover I had to look down as near as perpendicular as possible to see much land at all – at a whole array of fascinating but unrecognisable features. I tried to fix them in my memory so I could check them on a map later – I love maps. But would they appear on a map? Were we still flying over South Australia or had we crossed the border? Was I being too obsessional? Of what use would be such knowledge? Well, bearing in mind Bertrand Russell’s nice essay on useless knowledge, I had some thoughts on airlines doing a running commentary on the sights and scenes on the ground, synced to flight-paths, one for each side of the street, so to speak, and played through headphones, which you could take or leave; but the logistics of it, considering variations of flight-path and speed of flight, and the probable lack of interest, considering the bored or otherwise absorbed expressions of my fellow passengers, would be too much for cost-conscious airlines.
Within a few minutes I was shocked – yet again – to hear that we’d soon be arriving in Melbourne. Someone said over the intercom that conditions remained miserable and that, hopefully, everything would be okay. There was more tilting and veering, and I tried to make out the familiar shape of Port Phillip Bay but we were too close to the ground. In any case I soon became concerned with something altogether different, something which was much more of a problem on my return flight to Adelaide (I’m writing the rest of this up at home, three more flights later). My ears began to ache, building up to some intensity until suddenly there was an unblocking, like the burst of a bubble, and only then did I realise that the pain was localised to one ear. After that, all was fine, but on the return trip there was no bubble-burst, and the pain reached an excruciating level, leaving me moaning and whimpering and desperate for relief. The problem was, of course, aerosinusitis, which I’ll deal with in my next post.
The lego blocks of the CBD came and went on the window screen and I could soon see the airstrips of Tullamarine. The landing was slightly bumpy but nothing untoward, and I was looking forward to a pleasant coffee break and possibly breakfast in ‘Melbourne’, before the connecting flight to Canberra.
No way José. A quick check of our tickets (yes we really didn’t check them before this) told us that the other flight was leaving just as we were arriving. How could they do this to us? But if we ran or – don’t panic – walked very fast, we just might… then we noticed it wasn’t a departure but a check-in time, yet even so… And in fact, after some long striding through long stretches of airport we got there just in time for boarding. Thank god I didn’t need a toilet break, and it was just as well we didn’t have an hour to spare considering airport prices – the medium latte I bought at Adelaide airport, which I had to gulp down just before boarding, cost me $5.30, an all-time record.
The Canberra trip was anti-climactic, in spite of the bogey word ‘turbulence’, so much featured on Air Crash Investigations. Not only was I a vastly more experienced traveller, but this time there was nothing to see landwise, nothing but whiter-than-white clouds from horizon to horizon, like a fluffy Antarctica. Only as we descended below the cloud line near Canberra – and this flight was even shorter than the first one – did I get to see something familiar, the forested slopes of the Snowies, where once I did some memorable bush-walking, attacked by march flies and leeches and coming face-to-face, for a fleeting instant, with a black snake.
After a near-perfect landing, nothing more to report, my innocence of flying had slipped away forever. How ironic that Virgin airlines should deprive me of my virginity in this area. From now on I can blend in with all the rest, almost without pretending. There’s something almost sad about it, a tiny loss of identity, or a replacement for some part of me that I’m not quite sure about. But hey, we all know the self is an illusion.
what does curiosity actually mean?
You might say that Philip Ball has performed a curious task with his book, Curiosity. He’s taken this term, which we moderns might take for granted, and examined what intellectuals and the public have made of it down through the ages – with a particular focus on that wobbly symbol of the seventeenth century British scientific enlightenment, the Royal Society. I’ve been spending a bit of time in the seventeenth century lately, what with Dava Sobel’s book on the struggle to measure longitude, Matthew Cobb’s book on the untangling of the problem of eggs and sperm and conception, not to mention Bill Bryson’s lively treatment of Hooke, Leeuwenhoek and cells and protozoa in A Short History of Nearly Everything.
That century, with some of its most interesting actors, including Francis Bacon, René Descartes, William Harvey, Jan Swammerdam, Nicolas Steno, Johann Komensky (aka Comenius), Samuel Butler, Thomas Hobbes, Robert Hooke, Robert Boyle, Antonie van Leeuwenhoek, Thomas Shadwell, Margaret Cavendish and Isaac Newton, represented a great testing period for science and its reception by the public. Curiosity has always had its enemies, and still does, as evidenced by some Papal pronouncements of recent years, but in earlier, more universally religious times, knowledge and its pursuit were treated with great wariness and suspicion, a suspicion sanctioned by the Biblical tale of the fall. The Catholic Church had risen to a position of great power in the west, though the revolting Lutherans, Anglicans, Calvinists and their ilk had spoiled the party somewhat, and England in particular, having grown in pride and prosperity during the Elizabethan period, was flexing its muscles and exercising its grey matter in exciting new ways. The sense of renovation was captured by the versatile Bacon, with works like the Novum Organum (New Method), The New Atlantis and The Advancement of Learning.
In the past I’ve described curiosity and scepticism as the twin pillars of the scientific mindset, but they’re really more like a pair of essential forces that interact and modify each other. Scepticism without curiosity is just pure negativity and nihilism; curiosity without scepticism is directionless and naive.
But perhaps that’s overly glib. What, if any, are the limits of curiosity, and when is it a bad thing? It killed the cat, after all.
The word derives from the Latin ‘cura’, meaning care. Think of the word ‘curator’. However, if you think of one of the most curious works of the ancients, Pliny the Elder’s Natural History, you’d have to say, from a modern perspective, that little care was taken to separate truth from fiction in his massive and sometimes bizarre collection of curios. This sort of unfiltered inclusivity in collecting ‘facts’ and stories goes back at least to Herodotus, the ‘father of lies’ as well as of history, and it goes forward to medieval bestiaries and herbaria. These collections of the weird and wonderful were, of course, not intended to be scientific in the modern sense. The term ‘science’ wasn’t in currency and no clear scientific methodologies had been elaborated. As to curiosity, it certainly wasn’t a fixed term, and after the political establishment of Christianity, it was more often than not seen in a negative light. ‘We want no curious disputation after possessing [i.e. accepting the truth of] Jesus Christ’, wrote Tertullian in the early Christian days. Another early Christian, Lactantius [c240-c320], explained that the reason Adam and Eve were created last was so that they’d remain forever ignorant of how their god created everything else. That was how it was intended to be. Modern creationists follow this tradition – God did it, we don’t know how and we don’t really care.
Fast forward to Francis Bacon, who still, in the early 17th century, had to contend with the view of curiosity as a sinful extravagance, a view that had dominated Europe for almost a millennium and a half. Bacon had quite a pragmatic, almost business-like view of curiosity as a tool to benefit humanity. The ‘cabinet of curiosities’ was becoming well established in his time, and Bacon advised all monarchs, indeed all rich and powerful men, to maintain one, well sorted and labelled, as if to do so would be magically empowering. The problem with these cabinets, though, was that there was little understanding about the relations between entities and articles. That’s to say, there was little that was modernly scientific about them. Their objects were largely unrelated rarities and oddities, having only one thing in common, that they were ‘curious’. Bacon recognised that this wouldn’t quite do, and tried to point a way forward. He didn’t entirely succeed, but – small steps.
Ball’s book is at pains to correct, or at least provide nuance to, the standard view of Bacon as initiator of and father-figure to the British scientific enlightenment. In fact, Bacon may have been a Rosicrucian, and his utopian New Atlantis describes a more or less priestly caste of technical experts, living and working in Solomon’s House, and keeping their arts and knowledge largely under wraps, like the alchemists and mages of earlier generations. Bacon, with his government connections and his obvious ambition to be benefited by as well as benefiting the state, was concerned to harness knowledge to productivity and profit, and those who see science largely as a coercion of nature have cursed him for it ever since. Mining and metallurgy, engineering and manufacturing were his first subjects, but he also imagined great changes in agriculture – the breeding of plants, fruits and flowers, as well as animals, to create ‘super-organisms’, in and out of season, for our benefit and delight. The art and science of the kitchens of Solomon’s House produces superior dishes, as well as wines and other beverages, and printing and textiles have advanced greatly, with new fabrics, papers, dyes and machinery. Even the weather is subject to manipulation, with rain, snow and sunshine under the control of the savants. The details of all these advancements are kept vague, of course (and here’s where Bacon’s insistence on ‘secret knowledge’ plays to his advantage, a point not sufficiently noted by Ball in his need to connect Bacon with the alchemist-magicians of the past), but what is represented here is promise, a faith in human ingenuity to improve on the products of the natural world.
In focusing on all these benefits, Bacon manages largely to sidestep the religious aversion to curiosity as a form of intellectual avarice. However, Bacon and his more curious compatriots were never too far from the magical dark arts. Few intellectuals of this period, for example, would have dismissed alchemy out of hand, in spite of Chaucer’s delicious mockery of it over 200 years before, or Ben Jonson’s more contemporaneous take in The Alchemist. What differentiated Bacon was an interest in system, however vaguely adumbrated, and a harnessing of this system to the interests of the state.
Bacon tried to interest James I in a state-sponsored proto-scientific institution, but this got nowhere, largely because he couldn’t devise anything like a practical program for such an entity. A generation or two after his death, though, after a civil war, a brief republic and a restoration, the Royal Society was formed under the more or less indifferent patronage of Charles II. Bacon was seen as its guiding spirit, and there was an expectation, or hope, that its members would be virtuosi, a term then in currency. As Ball explains:
The virtuoso was ‘a rational artist in all things’… meaning the arts as well as the sciences, pursued methodically with a scientist’s understanding of perspective, anatomy and so forth. (It is after all in the arts that the epithet ‘virtuoso’ survives today.) The virtuoso was permitted, indeed expected, to indulge pure curiosity: to pry into any aspect of nature or art, no matter how trivial, for the sake of knowing. There was no sense that this impulse need be harnessed and disciplined by anything resembling a systematic program, or by an attempt to generalise from particulars to overarching theories.
Charles II, in spite of having some scientific pretensions, paid scant attention to his own Society, and neglected to fund it. What was perhaps worse for the Society was his amused approval of a hit play of the time, Thomas Shadwell’s The Virtuoso, which satirised the Society through its central character, Sir Nicholas Gimcrack. The play, as well as many criticisms of the Society’s practices by the likes of the philosopher Thomas Hobbes and the aristocratic Margaret Cavendish (Duchess of Newcastle-upon-Tyne), presented another kind of negativity vis-à-vis unbridled curiosity, more modern, if not more pointed, than the old religious objections.
The play-goer first encounters Sir Nicholas Gimcrack lying on a table making swimming motions. He tells his visitors that he’s learning to swim, but they are dubious about his method. His response:
I content myself with the speculative part of swimming; I care not for the practick. I seldom bring anything to use; tis not my way. Knowledge is my ultimate end.
This was the updated criticism. Pointless observations and experiments, leading nowhere and of no practical use. Gimcrack appears to have been based on Robert Hooke, one of the Royal Society’s most brilliant members, who was suitably enraged on viewing the play. Shadwell mocked Hooke’s prized invention, the air pump, intended to create a vacuum for the purpose of observing objects inserted into it, and he presented a jaundiced view of Gimcrack, through the dialogue of his niece, as ‘a sot that has spent two thousand pounds in microscopes to find out the nature of eels in vinegar, mites in a cheese, and the blue of plums.’ These were all examined in Hooke’s ground-breaking and breath-taking work Micrographia.
Most of Shadwell’s mockery hasn’t stood the test of time, but he was far from the only one who targeted the practices and the approach of the Society and of ‘virtuosi’, sometimes with humour, sometimes with indignation. Their criticisms are worth examining, both for what they reveal of the era, and for their occasional relevance today. Many of them seem totally misplaced – mocking the ‘weighing of air’, which they naturally saw as the weighing of nothing, or the examining, through the newish tool the microscope, of a gnat’s leg. It should be recalled that Hooke, through his microscopic investigations, was the first to highlight and to name the individual cell. Yet it was a common criticism of the era, due largely to ignorance of the interconnectedness of all things that the scientifically literate now take for granted, that these explorations were simply time-wasting dilettantism. The philosophical curmudgeon Thomas Hobbes, for example, firmly believed that experiments couldn’t produce significant truths about the world. It seems that the general public, who didn’t have access to such things, saw microscopes and telescopes as magical devices which didn’t so much reveal new worlds as create them. If they couldn’t be verified with one’s own eyes, how could these visions be trusted? And there was the old religious argument that we weren’t meant to see them, that we should keep to our god-given limitations.
Generally speaking, as Ball describes it, though the criticisms and misgivings weren’t so clearly religious as they had been, they centred on a suspicion of unrestrained curiosity and questioning, which might lead to an undermining of the social order (a big issue after the recent upheavals in England), and to atheism (they were on the money with that one). They had a big impact on the Royal Society, which struggled to survive in the late seventeenth and early eighteenth centuries. It’s worth noting, too, that the later eighteenth-century Enlightenment on the continent was much more political and social than scientific.
But rather than try to analyse these criticisms, I’ll provide a rich sample of them, without comment. None of them are ‘representative’, but together they give a flavour of the times, or of the more conservative feeling of the time.
[Is there] anything more Absurd and Impertinent than a Man who has so great a concern upon his Hands as the Preparing for Eternity, all busy and taken up with Quadrants, and Telescopes, Furnaces, Syphons and Air-pumps?
John Norris, Reflections on the conduct of human life, 1690
Through worlds unnumber’d though the God be known,
‘Tis ours to trace him only in our own….
The bliss of man (could pride that blessing find)
Is not to act or think beyond mankind;
No powers of body or of soul to share,
But what his nature and his state can bear.
Why has not a man a microscopic eye?
For this plain reason, man is not a fly.
Say what the use, were finer optics giv’n,
T’inspect a mite, not comprehend the heav’n? …
Then say not man’s imperfect, Heav’n in fault;
Say rather, man’s as perfect as he ought:
His knowledge measur’d to his state and place,
His time a moment, and a point his space.
Alexander Pope, An Essay on Man
There are some men whose heads are so oddly turned this way, that though they are utter strangers to the common occurrences of life, they are able to discover the sex of a cockle, or describe the generation of a mite, in all its circumstances. They are so little versed in the world, that they scarce know a horse from an ox; but at the same time will tell you, with a great deal of gravity, that a flea is a rhinoceros, and a snail an hermaphrodite.
… the mind of man… is capable of much higher contemplations [and] should not be altogether fixed upon such mean and disproportionate objects.
Joseph Addison, The Tatler, 1710
But could Experimental Philosophers find out more beneficial Arts then our Fore-fathers have done, either for the better increase of Vegetables and brute Animals to nourish our bodies, or better and commodious contrivances in the Art of Architecture to build us houses… it would not onely be worth their labour, but of as much praise as could be given to them: But as Boys that play with watry Bubbles, or fling Dust into each others Eyes, or make a Hobby-horse of Snow, are worthy of reproof rather then praise, for wasting their time with useless sports; so those that addict themselves to unprofitable Arts, spend more time then they reap benefit thereby… they will never be able to spin Silk, Thred, or Wool, &c. from loose Atomes; neither will Weavers weave a Web of Light from the Sun’s Rays, nor an Architect build an House of the bubbles of Water and Air… and if a Painter should draw a Lowse as big as a Crab, and of that shape as the Microscope presents, can any body imagine that a Beggar would believe it to be true? but if he did, what advantage would it be to the Beggar? for it doth neither instruct him how to avoid breeding them, or how to catch them, or to hinder them from biting.
[Inventors of telescopes etc] have done the world more injury than benefit; for this art has intoxicated so many men’s brains, and wholly employed their thoughts and bodily actions about phenomena, or the exterior figures of objects, as all better arts and studies are laid aside.
Margaret Cavendish, Observations upon Experimental Philosophy, 1666
[A virtuoso is one who] has abandoned the society of men for that of Insects, Worms, Grubbs, Maggots, Flies, Moths, Locusts, Beetles, Spiders, Grasshoppers, Snails, Lizards and Tortoises….
To what purpose is it, that these Gentlemen ransack all Parts both of Earth and Sea to procure these Triffles?… I know that the desire of knowledge, and the discovery of things yet unknown is the pretence; but what Knowledge is it? What Discoveries do we owe to their Labours? It is only the discovery of some few unheeded Varieties of Plants, Shells, or Insects, unheeded only because useless; and the knowledge, they boast so much of, is no more than a Register of their Names and Marks of Distinction only.
Mary Astell, The character of a virtuoso, 1696
There are many other such comments, very various, some attempting to be witty, others indignant or contemptuous, and some quite astute – the Royal Society did have more than its share of dabblers and dilettantes, and was far from being simply ‘open to talents’ – but for the most part the criticisms haven’t dated well. You won’t see The Virtuoso in your local playhouse in the near future. Wide-ranging curiosity, mixed with a big dose of scepticism and critical analysis of what contemporary knowledge provides, has proved itself many times over in the development of scientific theory and an ever-expanding world view, taking us very far from the supposedly ‘better arts and studies’ the seventeenth-century pundits thought we should be occupied by. It has also made us realise that the science that has flowed from curiosity has mightily informed those ‘better arts and studies’, which can perhaps best be summarised by the four Kantian questions: Who are we? What do we know? What should we do? And what can we hope for?