Waiting for Gestalt

By Gene Wilburn

Gestalt (ge STALT). A word meaning, roughly, the moment when the brain perceives with clarity that the whole of a system is greater than the sum of its parts, and everything clicks into one awareness. One can have a gestalten moment. But can one achieve a gestalten existence?

When I was coming of age intellectually at university in the early to mid 1960s, there were a number of explorations of the mind making the rounds. Existentialism, the sometimes bleak philosophy that arose strongly in Paris after the Nazi occupation at the end of World War II, was alive and well. Sartre, Beauvoir, and Camus were still publishing and there was something compelling in the message that you’re responsible for who you become, creating a personal integrity in the face of the meaninglessness and absurdity of the universe. This is, of course, an oversimplification.

Along with the primary existential philosophers came “Theatre of the Absurd,” a literary form of existentialism, perhaps best seen in the play by Samuel Beckett, Waiting for Godot, in which “logical construction and argument give way to irrational and illogical speech and its ultimate conclusion, silence.” [Wikipedia, “Theatre of the Absurd”]

Another prevailing line of thought came from the field of psychology, in the form of Abraham Maslow’s “Hierarchy of Needs” with “self-actualization” at the top of the pyramid. In its wake people were self-actualizing all over the place, or at least that’s what they professed. It certainly launched a full-blown pop psychology business and fuelled New-Age-style thinking before “New Age” had even become a term.

A different branch of psychology, from Germany, had earlier in the century introduced Gestalt Theory, a holistic psychology that seemed to imply that if you could attain a gestalt with yourself and your environment, you could flow through it with understanding, and perhaps appreciation, in the way that listening to a symphony is an experience that transcends the individual notes of the musical score.

Looking back on this fifty years later, I think existentialism has held up rather well, especially when augmented with a generous helping of late Roman-style stoicism. Maslow’s hierarchy of needs still has a sound feel to it, though there is a sense that Western society, as a whole, has slipped down the pyramid a bit in this era of anti-enlightenment, anti-science populism.

But the one that still teases my mind is gestalt theory. At the turning of each decade I’ve been waiting for that gestalten moment when everything would click into place and I would reach an understanding — “Because something is happening here / But you don’t know what it is / Do you, Mister Jones?” [Bob Dylan, “Ballad of a Thin Man”]

The problem is, how does one achieve gestalt when everything keeps changing?

The Impact of the 1960s

I emerged from the 1950s like most boys who had reached their teens by the start of the 1960s, interested in cars, playing basketball, grooving to the week’s Top-10 radio, and thinking about going to university after high school. In other words, I was as cookie-cutter naive as one could be.

It was the folk music era which, in my relative isolation, I took to be the music of the Kingston Trio, Limelighters, Chad Mitchell Trio, Burl Ives, and that new group on the radio, Peter, Paul and Mary. It was when I heard Joan Baez sing a couple of old ballads like “Barbara Allen” that I began to perceive a different kind of folk music that was less slick and more personal. Back then it was just music I liked. Later it would change me.

My intellectual life began when I went to university where I first majored in engineering. It was a tough study, but I was getting by, being moderately good at math and logic. There was, however, a problem. I enjoyed learning folk music more than studying STEM subjects, and the lyrics of Bob Dylan and Phil Ochs left me questioning what I was doing. I bought a guitar, learned a fistful of chords, and taught myself to sing and play the songs that were haunting me.

My taste in folk music had also led me to discover the Weavers, Pete Seeger, Woody Guthrie, Cisco Houston, and a rich vein of black blues singers from Big Bill Broonzy and the Rev. Gary Davis to Mississippi John Hurt. I loved all these voices of the people.

I couldn’t square my study of engineering with my awareness of what was happening. The civil rights movement in the American South highlighted the unjust treatment of black people. President Kennedy had been assassinated, then Martin Luther King, then Robert Kennedy. There was a strange, unpopular war being waged in Vietnam.

Things were changing, blowing in the wind, as it were, and the gestalt of the time was changing with it. I switched my major to English and my minor to French, and began studying literature with its plays, novels, poems, and essays. In French classes, we frequently read the existentialists Sartre and Camus. I studied philosophy, social history, and art history. I met and became friends with dozens of like-minded individuals, some male, some female, some straight, some gay, a few who were black or Hispanic, all of whom shared a passion for literature, art, philosophy, and music. I had found my people.

Something happens to your mind when you embrace the Humanities — something that comes as a series of epiphanies that raises your consciousness into new realms of thought and feeling resulting from contact with the great writers, poets, playwrights, philosophers, artists, and musicians of all eras. It’s intoxicating and exhilarating and, as Thomas Wolfe proclaimed in the title of his novel, You Can’t Go Home Again. You’re changed.

You reach for a higher kind of gestalt, the gestalt of the modestly well-educated. You begin to read the New York Times, The New Yorker, The New York Review of Books, Le Monde, The Times (London), The Guardian, Harper’s, Atlantic Monthly, The Globe and Mail, and university quarterlies. You listen to folk music, cool jazz, classical music, and opera. You see Verdi in the same tradition as Shakespeare, and taste the richness of Old English in Beowulf and the delightful Middle English of Geoffrey Chaucer.

It’s a heady experience, all in all, but the question always arises: what are you going to do with all this when you head out into the “real” world?

One Pill Makes You Larger, and One Pill Makes You Small

For one gestalten period it seemed as if the world had changed. The war in Vietnam was vigorously opposed, campus radicalism was on the rise, and hair got longer. The folk music I’d grown up with was woven into a new kind of rock music and the voices of Joni Mitchell, Grace Slick, Janis Joplin, and Crosby, Stills, Nash, and Young filled the airwaves, along with new bands like the Doors, Led Zeppelin, Grateful Dead, Jefferson Airplane, Santana, and Frank Zappa.

Alan Watts taught us about Zen, the tarot deck came back into fashion, and decorated VW vans filled with flower children with headbands, victory signs, peace medallions, and bloodshot eyes were common sights.

Among the reading favourites were One Flew Over the Cuckoo’s Nest, Been Down So Long It Looks Like Up to Me, Catch-22, The Vedas and The Upanishads, The Teachings of Don Juan, The I Ching, and The Whole Earth Catalog.

Everyone was for “getting back to nature,” and many communes were started, most of them ending in failure. From the Broadway musical Hair to massive rock concerts, it was assumed that the Age of Aquarius was upon us. The Mexican poet Octavio Paz described it as an “explosion of consciousness.”

It’s sometimes said that if you remember the 60s, you weren’t really there. My own memory of the time is patchy, with psychedelically coloured gaps and an enduring sense of mysticism. But, like many, I didn’t see how it was sustainable. In the words of the Jefferson Airplane, “You are the Crown of Creation / And you have no place to go.”

The Origin of Species

The flower-power era couldn’t last, of course, because someone has to pay the bills. I trimmed my hair, picked up a degree in library science, and took a job. Through sheer good fortune I ended up as Head Librarian at the Royal Ontario Museum, in Toronto. It was there that I began hanging out with ornithologists, palaeontologists, mammalogists, geologists, mineralogists, ichthyologists, and entomologists, as well as archaeologists. It has shaped my thinking to this day. I had encountered the gestalt of scientific thinking and research.

One of the curators, a palynologist (one who studies modern and ancient pollens), challenged me with the question: “Have you read Darwin’s Origin of Species?” Being a lit major, I hadn’t, so I decided to give it a go.

What surprised me the most was how clear Darwin’s Victorian prose was. I was mesmerized by the concept of “descent with modification” or as it came to be known, “evolution.” Shortly after reading Origin, a new volume by Stephen Jay Gould passed through the library — a collection of essays entitled Ever Since Darwin. I gave this a read and subsequently read every book of essays Gould produced, culled from his monthly column in Natural History.

As a newly minted amateur naturalist and birder I became hooked on reading science books written for the general public. The ’60s mantra “all is one” took on a philosophically material interpretation when I studied how the universe started, how suns ignited and planets formed, and how, on this one we call Earth, life sparked and evolved, going through great periods of diversity, extinction, more diversity, more extinction, and so on, leading eventually to a group of suddenly sapient simians. As Carl Sagan pointed out, we are made from the remnants of star dust, and every living thing on the planet is related.

My readings in science and science history led me to reaffirm the existentialist theme that life can be heaven or hell, but human beings mean very little in the face of the universe. I shed any last remnants of religion. Materially, we are bodies that live and die, each of us randomly sorted into different situations, different cultures, different countries, and it’s these things that shape our sense of who we are.

There are people for whom science is enough. To paraphrase Darwin, there’s a grandeur to this concept of life and its descent with modification through time and its tangled branches and the sudden bursts of evolution that Gould referred to as “punctuated equilibrium.” This is a gestalt that most naturalists come to feel through their observation of life’s many remarkable species.

But is science alone enough to sustain the human spirit, or psyche, that je ne sais quoi that some people call a “soul”? Perhaps, and perhaps not, depending on the individual. What science does, for me, is to throw into relief all the amazing works of mankind, from art, history, philosophy, literature, and music to the increasing technological achievements that accompanied the industrial revolution.

By the time I had begun to assimilate this naturalistic view, information technology was picking up the pace. Television, radio, newspapers and other media shaped us and moulded us in ways that perhaps only Marshall McLuhan could sort out. But that was merely a preface to things to come: the computer revolution.

Bits, Bytes, and Qubits

From the late 70s onward the computer revolution picked up momentum until it reached nearly Biblical proportions: “And in that time a great change came across the land” [my paraphrase]. Computing became personal, portable, and profoundly ubiquitous.

Like others, I joined the revolution, pivoting my career from librarianship to Information Technology (IT). From the earliest whimsical days that included an ad in Byte Magazine for dBASE II, entitled “dBASE II vs The Bilge Pump,” to the corporate adoption of personal computers as strategic tools in the workplace, to the computer (aka smartphone) in one’s pocket or purse, a virtual Pandora’s box of consequences was unleashed.

My work involved setting up workstations, email servers, database servers, storage servers, web servers, and firewalls, with a little programming tossed in for spice. I enjoyed decades of computing projects and by the time I retired, in 2006, the industry had progressed from 8-bit personal computers such as the Apple II, to 64-bit powerhouses running Microsoft Windows, MacOS, Linux, iOS, Android, and a few dozen lesser-known operating systems. Smartphones and tablets had become almost a birthright.

Computing begat digital photography, streaming audio and video, automobile electronics, appliance electronics, social networks, and, with lesser success, self-driving cars. I now listen to streaming music, watch streaming videos, and get my news and opinion pages from the Internet.

On another level, machine learning (ML) has grown and penetrated the Internet to such a degree that one can examine a product on Amazon and see ads for it within hours on Facebook. Privacy has suffered. The Internet, invented for the purpose of sharing scientific information, developed a dark side, the extent of which is still being assessed — surveillance, phishing attacks, the hacking of personal information, and possibly enough manipulation to sway elections.

The pace is still swift and the increasingly successful bids to harness quantum computing (whose basic unit of information is called a qubit) will likely bring unforeseen changes. Nothing stands still.

End Game

“You can’t stop the future. You can’t rewind the past. The only way to learn the secret is to press play” ~ Jay Asher, Thirteen Reasons Why

In my retirement, I’ve once again become a student. I read incessantly, both fiction and nonfiction, I take the occasional online course, and I think, if not profoundly, at least genuinely. It aids thinking to have a philosophical framework against which to compare one’s thoughts, and I continue to find the challenge of existentialism worthwhile for this. It’s an honest philosophy, derived from the human spirit looking at an irrational, uncaring, absurd universe and deciding to carve out a personal meaning for being human. It’s a difficult challenge (never underestimate existential angst) but it’s more open and honest than clinging to a set of values, liberal or conservative, derived from those around us.

I’m beginning to understand why Camus used the story of Sisyphus to highlight the challenge. In the Greek myth, Sisyphus was condemned to roll a huge boulder to the top of a hill. Every time he reached the top, the boulder would roll back to the bottom and he was required to repeat the procedure, for eternity. “Camus claims that when Sisyphus acknowledges the futility of his task and the certainty of his fate, he is freed to realize the absurdity of his situation and to reach a state of contented acceptance. With a nod to the similarly cursed Greek hero Oedipus, Camus concludes that ‘all is well,’ indeed, that ‘one must imagine Sisyphus happy.’” [Wikipedia, “The Myth of Sisyphus”]

It would be neat and tidy, at this final stage of my life, to wrap up my thoughts with a pretty bow attached, but I’m unable to do so. There have always been random elements in our story that change the story itself: a colliding meteor, a world war, an economic depression, climate change, the overthrowing of the monarchy and aristocracy, the re-establishment of a wealthy set of plutocrats, the place you were born, the family you emerged from, the schools you attended, the number of freedoms, or lack thereof, of the prevailing government, and, not least, who you fall in love with. It is difficult to piece all this together into a holistic understanding. I am, in my final years, still waiting — waiting for gestalt.

 

On My 72nd Year: My Ten Fundamental Beliefs

By Gene Wilburn

“I just dropped in to see what condition my condition was in” ~ Kenny Rogers

Every year around birthday time (June 10 for me), I like to take stock of what I believe in. Where do I fit with the cosmos? What are my bedrock, fundamental assumptions? This year’s thinking mirrors very closely what I’ve thought for several years, but age has perhaps lent these beliefs more clarity.

Let’s start at the beginning. As Terry Pratchett once wrote, “In the beginning, there was nothing, which exploded.” For each of us our cosmology starts somewhere, and for me it starts with the Big Bang, which I’m told was not really so much an explosion as an expansion — a very dramatic expansion in which a primordial soup of plasma emerged that was so hot not even atoms could form. As it expanded it created space and time. The universe was born. About 13.7 billion years ago, if our measurements are correct.

If it helps you to believe that this was a result of God breathing across the waters, so be it — we each have our favourite narratives. The question of how something can come of nothing is a profound one, and physicists have some thoughts about this: that there really is no such thing as nothing. Particles and antiparticles evidently come into and out of existence billions of times per second and usually annihilate one another, but, at least once, it is possible that particles accumulated faster than antiparticles, forming a singularity, and, well, boom!

So, that’s belief number one: the universe came into existence. This had implications. Chemistry was born. Eventually, as the plasma cooled, quarks combined into protons and neutrons, and later, hydrogen atoms formed. Concentrate enough hydrogen atoms together and what do you get? Fusion. The birth of stars. Then, over time, some stars die in a spectacular explosion called a supernova in which most of the rest of the naturally-occurring chemical elements are created, spewed forth as stellar dust and ice. As these clouds of stellar material concentrate and condense, new stars form, with planets around them. One of these we call the sun, and the planet we live on, which we call earth, formed about 4.5 billion years ago, give or take a million or two.

Which brings me to belief number two: Out of inorganic matter, life began. How is still unsolved, but researchers are exploring the tantalizing possibilities of RNA and other life-critical molecules developing in places like deep-sea vents and evolving into a self-replicating thing that we might call first life, or even protolife — ancestors to the prokaryotes, or single-celled organisms without a nucleus, like bacteria and archaea. Somewhere along the line a bit of luck (for us) happened: two prokaryotes combined to form nucleated cells, which we call eukaryotes.

Which brings me to belief number three: life evolved. Over great periods of time, eukaryotes developed along plant and animal lines, oxygenating the atmosphere, and eventually some pioneering life forms ventured from the oceans to the lands to colonize the barren geology of earth, turning it into large organic ecosystems.

Belief number four is that, for the most part, the universe is random. It is not willed, or fated, or progressive, though randomness can lead to increased complexities. Using some basic structural parts, nature evolved through random genetic changes into extraordinarily rich organic landscapes and seascapes, filled with the plants and animals of its day. The Cambrian Explosion, various extinctions, and random events, such as comets crashing into the earth, diverted the path of life several times, until, after eons, the great age of reptiles was over and land mammals had the chance to fill the empty ecological niches.

Belief number five is that the emergence of human beings, in the form of Homo sapiens, was not preordained. We’re a branch of primates that evolved in particular ways to adjust to our environment and we had cousin species, Neanderthals, Denisovans, etc., who did the same. We’ve only been on the planet a short while, in geological terms, but we’ve become a new force. After nearly perishing from extinction ourselves, we made it, and we developed a complex brain that would allow us to discover agriculture, mathematics, and art. Not to mention learning how to sew reindeer hide into warm clothing for the Arctic.

Belief number six is that we originated in and emigrated from Africa. Down deep, we’re all Africans and, living in a very warm climate, we were all probably dark skinned because extra melanin in the skin protects against overpowering UV radiation. Those of us who migrated to colder climates lost some of our melanin because lighter skin absorbs the weaker sunlight of those latitudes better and not as much protection against UV radiation is required.

Belief number seven is that we developed into a language-oriented species that loves narratives. We acquired strong imaginations to accompany the impressive encyclopedic knowledge of our environment we learned through hunting and gathering. Tales around the campfire, stories of our ancestors, legends, myths, and, of course, gods. We’re a species that wants to participate in its own narrative, even if that narrative is unreliable.

Belief number eight is that this narrative of human history is important to study in all its facets, including science, the humanities, and the arts and music. Physically, humans haven’t evolved much in the past 100,000 years or so, but mentally we’ve evolved through many great civilizations in ways that are fascinating and that contributed to our rise as a species.

Belief number nine is that, mentally, we went through our ‘teen’ years between Galileo and Einstein. We began to mature toward mental adulthood by rigorously questioning, observing, measuring, and testing our premises. We bent the planet to our wants and needs with an industrial and scientific revolution.

Belief number ten is that we’ve reached, borrowing from an Arthur C. Clarke title, Childhood’s End. We now possess the ability to destroy the planet for mankind, as well as other species. As adults we must learn to be stewards of our planet and treat it with respect.

I realize I’ve said little about the human condition itself. That is left to explore and think about in the context of the first ten beliefs, but I suspect it will always remain a personal, and sometimes communal, journey for each of us. This is why I write essays — to see what I think about life. We are all part of the overall narrative of mankind and each of us expresses it in a personal way.

To be sure, if we reach mental adulthood intact, the greatest history of Homo sapiens could just be starting. As we look at our current state, we see automation trending ever forward, with artificial intelligence waiting in the wings. We may create a new kind of life form. We may, if wise, develop eco-friendly attitudes about our home planet and change how we obtain energy and food. Our journey as mental adults has just begun. We may even prosper, as long as a random comet doesn’t smash into us again, and as long as we don’t self destruct. Fingers crossed, humankind!

Where is Heaven?

By Gene Wilburn

“They can keep their heaven. When I die, I’d sooner go to Middle Earth” ~ George R.R. Martin

The mythic concept of heaven varies from culture to culture and religion to religion, as does the concept of hell. Homer gave us a Greek perspective of the Underworld in the Odyssey — a murky kind of place where even the famous Greek heroes end up in residence. The Egyptians, with their sky gods and interest in an afterlife, thought of heaven as some kind of physical place beyond the known universe, and the Book of the Dead describes the process a soul must go through to reach it. The Assyrians believed the afterlife was located below our world and one of their words for it translates as the “Great Below.” Everyone went there, regardless of status, and it seems to have been more of a handy place to store ghosts than any kind of paradise.

The Christian myth of heaven is the one I grew up knowing about, though “knowing about” is a precarious phrase. It’s a particularly unclear myth. Stereotyped as a place of fluffy clouds, angels playing harps, and an entry gate made of pearl, it’s always portrayed as stupefyingly mind-numbing. One thing is clear though: its direction is “up.” Certain Christian religious figures “ascended” into it. The mediaeval view places it beyond the firmament, the known universe. And it’s there that selected believers go and meet all their lost loved ones, who presumably made it as well. A more salubrious place than the Greek underworld, perhaps, or the Assyrian ghost closet, and not as strenuous as the Egyptian journey to the afterlife. A comfort myth if ever there was one.

But the thing about the mediaeval world view is that the “known universe” was a much smaller and more modest place than the one we live in today. Earth was the centre of that universe and the firmament was fixed and revolved around the earth — with the exception of “the wanderers,” or planets, and alarming things like comets. Heaven was just beyond the visible firmament and some mediaeval drawings portray an observer lifting the edge of the firmament and peeking out at heaven beyond. Everything in its place and God in his (always masculine) heaven.

This cosmological view took a huge hit with a paradigm shift that started with Copernicus in the 16th Century, with his heliocentric proposal, and was refined and modified through the incredibly accurate naked-eye astronomical observations of Tycho Brahe, and embraced in the writings of astronomer Johannes Kepler. The 17th Century was the time of the greatest early astronomical breakthroughs, through the telescopic observations of Galileo Galilei (Jupiter had “moons”!) and the brilliant insights of Isaac Newton, who provided the laws of physics and mathematics that accurately accounted for observable planetary motions. The Church was not very happy with all of this, of course, and forced Galileo to recant publicly, though he never did so in private.

In one sense, I think of modern science as starting with Galileo because he used a scientific instrument, the telescope, to investigate the night sky. The telescope was followed by the microscope, and extended instrumental observation and measurement gradually became the established norm for exploring the fabric of the universe. It wasn’t even called science at first. It was usually referred to as “natural philosophy.”

It wasn’t until the early 20th Century that astronomers began to suspect that our galaxy, the Milky Way, was not the full extent of the universe and that the blobs showing on photographic plates, sometimes referred to as “nebulae,” might themselves be galaxies. With better and bigger telescopes and some brilliant analysis, it was not only determined that they were, in fact, galaxies like our own, but we could even measure their distance from ours. The universe turned out to be immensely huge, with, as Carl Sagan used to say, “billions and billions” of galaxies. Not only that, but continued observations indicated that the universe was expanding. And if it was expanding, what was it expanding from? Enter the Big Bang.

The telescope, as both instrument of the scientific revolution, and a symbol thereof, is central to the recent book, Intellectual Curiosity and the Scientific Revolution, by Toby E. Huff, 2011, a highly readable, well-researched study of why the scientific revolution started in the West, and not in China or the Islamic world.

Telescopes were still at the forefront when I was a kid in the 1950s — the “big” telescope of the day was the huge 200-inch Hale reflector at Mount Palomar, in California. The newspapers frequently printed black-and-white images taken with the famous instrument and there was great ferment over the firmament, so to speak. Exciting times.

It was in regard to this great telescope that I first encountered a fissure between science and religion. I was a run-of-the-mill Protestant kid who attended church with some regularity, but a good friend and classmate of mine came from a more fundamentalist stream of belief. His church, located in a nearby town, was hosting a “revival meeting” that featured two travelling evangelical preachers of fiery words and demeanour, and my friend’s family invited me along to attend one of the evening sermons. I was nervous about this because it was one of those congregations my family referred to as the “Holy Rollers” and its members were prone to burst into tongues and exclamations during the service.

But curiosity, sometimes my friend, sometimes my enemy, overcame my misgivings and I went. At this point we come back to the concept of heaven, for the heaven of this congregation was the “Kingdom of Heaven” returning to create a “New Earth” for the “Chosen,” wrapped up in some kind of event called “the Rapture.” The Christian Bible is an odd thing, at best. Most of it is a disjointed anthology of Hebrew religious writings, some of it dating back to the time of the Egyptians and the Sumerians. Added to that is an appendix called the New Testament, and the appendix to that is a very strange thing called the “Book of Revelation,” which reads like it was composed by a person, or an entire commune, on a very bad acid trip (“don’t take any of the brown acid,” as the Woodstock announcer warned). It’s so weird and vague that you can read into it pretty much whatever you want, but it seems to be the energy source for many fundamentalists.

But the two preachers, in their admittedly riveting, entertaining, and electric theatre, told a blatant lie. They said they had visited the great telescope at Mount Palomar, and that while the “scientists” were called away from the tour for a few minutes, the preachers peeked through the telescope, up into the skies, and they saw, with their own eyes, the Kingdom of Heaven descending toward the earth. The congregation was practically swooning, but I was gobsmacked that a man of religion would lie. I knew that you didn’t look through the telescope at all. You made photographic plates using it and studied the plates. And that started me thinking hard about heaven. If there is such a place, where is it?

There is no physical plane we know of, above or below, that can accommodate such a mythic landscape. Another dimension? That’s pretty science-fictiony and since it offers no method of verification, it’s no better than saying it’s located in some kind of vast, eternal Rubik’s Cube. Or that it fits somehow into Terry Pratchett’s L-space, though this might be pleasing to Argentine writer Jorge Luis Borges, who said, “I have always imagined that Paradise will be a kind of library.”

As an armchair philosopher, I get mentally cranky about things that offer no method of verification, nor any proof of existence. I’m open to evidence, but I don’t see any place in the known universe where a heaven might fit in. So, as a humanist, I can only conclude that heaven, and hell, are actually situated within the living, earthly creatures called humans. The terms represent sublime good times, and ghastly bad times, in the lives of individuals, much of it determined by where we live, our general place in society, and events often beyond our control. To the residents of Aleppo, for instance, any place that’s not being bombed might seem like heaven. The city itself can, without exaggeration, be called a hell.

In the end, I think heaven is a state of mind and that mythic heavens are little more than works of the imagination. And, as many pundits over the years have observed, if there really turns out to be such a place, I doubt many of my friends will be there anyway, so it holds little attraction for me.

 

Atlantis Farewell

I grew up in the 1950s, a time when researchers were still searching for Bridey Murphy and field-and-stream magazines still reported sightings of Bigfoot. Flying Saucer Magazine could be found at the drugstore magazine counter, and ESP was thought to have been a proven phenomenon at Duke University.

Concurrently we were administered first the Salk polio vaccine, and then the Sabin replacement. The mid-20th Century was a time when science was coming on strong but beliefs in the strange and supernatural persisted deeply in the general culture of the day.

I liked science, just as I liked Mickey Mantle, but it resided in a place alongside the prophetic dreams of Edgar Cayce and the local church. As an uncritical child, I believed pretty much in everything, including the lost continent of Atlantis. The very fact that I have to provide links to some of these references shows the distance between today and then. Today most of this material has been relegated to the pages of the National Enquirer.

Science, in the meantime, has thrived and grown and has to a large degree undermined the realm of UFOs, yeti, and the supernatural. In places the local church is still thriving, but the numbers are dwindling every year as the congregations age and the true believers leave this life.

By the end of the 20th Century we were well on our way toward a new Age of Enlightenment, if by enlightenment we mean dispelling beliefs that have no provable basis in fact.

By this point in the 21st Century science and scientific thinking have become even more pervasive, despite outbreaks, particularly in the United States, of anti-scientific undercurrents such as anti-vaccination movements and anti-evolution legislation. Frustrating though they may be to a rational person, these movements will die out as surely as the search for Bridey Murphy. There is nothing real to sustain them.

I don’t mind this sea-change in perspective because I vowed, many years ago, not to believe in things for which there was no good evidence. In my case, this included religion.

However, one area I see fading away causes me some lamentation: the receding of mythology as a force of the psyche. Through the 60s and 70s it seemed that Carl Jung, with his archetypes, and popularizers of Jungian thinking, like Joseph Campbell, had somehow tapped into the wellspring of the human spirit. Mythic stories run deep through our emotions and often lead us to a feeling of epiphany. Of all the things I’ve given up from my youth, mythology is the hardest. I don’t mean myth as story—which is eternally fascinating—but myth as something fundamental to the human psyche.

But as brain science knowledge spreads, it seems the concept of a psyche is little more than a brain construct—a side effect of consciousness. Of course there is pushback against this kind of materialistic thinking, usually in the form of a Cartesian brain/mind split, but it’s difficult for a rational person to see much basis in fact for this view. Nonetheless the idea of a brain/mind split reaches far back in Western philosophy and is hard to shake off simply because it’s been with us for so long.

Like the measles, however, belief in a psyche can be inoculated against by sticking with the hard evidence. In time it will be seen to be as mythical as the lost continent of Atlantis—a pretty, captivating story, but unsustainable as a model of reality.

Dispensing with old beliefs, especially cherished ones, is difficult. Yet the rewards of maintaining evidence-based reasoning outweigh the pain of parting with wish-fulfillment, faith-based beliefs. It cleanses thinking and prepares one for the real world. There comes a time when it’s necessary, for good mental hygiene, to say farewell to Atlantis.

Solar Flares and Twitter Follies

So, it was an odd week. During an intense solar flare there were widespread reports of huge aurora borealis displays, often seen much farther south than usual. We might have been able to catch a glimpse of these but for one thing. We had a week of socked-in cloud cover with intermittent rain and snow. No hope of seeing the sky.

I started physiotherapy for a neck problem seemingly caused by normal aging and arthritis — an impinged nerve that sends pain into my shoulder, down into my arm, and tingles all the way to my fingers. The exercises my therapist has given me are straight out of Treat Your Own Neck, by Robin McKenzie — a book he recommends highly.

A news item from this week is a head-scratcher: British Tourists Arrested on American Terror Charges. In their Twitter comments the couple joked that they were going to “destroy America” (British slang meaning to party hard) and “dig up Marilyn Monroe” (a quote from Family Guy, an American comedy show).

For his Twitter jokes, Leigh Van Bryan, 26, was handcuffed and kept under armed guard in a cell with Mexican drug dealers for 12 hours after landing in Los Angeles with pal Emily Bunting. Both had their luggage searched for spades and shovels.

Lessons learned? Big Brother is watching and BB has absolutely no sense of humour (even American-grown humour). The couple were barred from entering the U.S.

On a positive note, I just received an Amazon delivery: Garner’s Modern American Usage. I’ve already enjoyed spending over an hour opening the book at random and reading the passages. For anyone who loves the English language and enjoys usage and grammar discussions, this book is highly recommended.

Intellectual Curiosity and the Scientific Revolution: Book Review

Intellectual Curiosity and the Scientific Revolution: A Global Perspective, by Toby E. Huff. Cambridge University Press, 2010. 368p.

This work, by Toby E. Huff, is a perspective on the explosion of scientific knowledge in 17th Century Europe and England, and a probing look at why this happened in the West and not in China, India, or the Islamic states.

Using one invention in particular — the telescope — Huff documents how this instrument revolutionized the study of the heavens and how Galileo and others used it to explore the skies and make detailed observations that led to a confirmation of a heliocentric system posited by Copernicus. The paradigm shift away from a spherical, geocentric system electrified the science of the day.

Huff then traces the migration of the telescope into China, India and Islamic countries. In China Jesuit missionaries not only introduced the telescope, they also translated into Chinese the findings of Galileo and others. A few court astrologers (astronomers) were interested, but mainly to help refine the prediction of heavenly events. At the time, China held a flat-earth view of the cosmos and hadn’t even moved to a spherical, geocentric view. The new knowledge was not incorporated into Chinese study. Huff then traces parallel stories from India and the world of Islam.

And it wasn’t just in astronomy. Huff also traces the 17th-Century development of knowledge about forces and dynamics, and about medical anatomy, noting how European scientists from the 12th Century onward had been dissecting animals and human corpses, thereby learning about the interior anatomy of the body. Islamic faith considered dissection a form of desecration and it wasn’t allowed. Similar obstacles were in place in India and China.

Huff then examines the differences in culture between the West and the rest of the world. Europe and England had more or less autonomous universities and, due to the rise of Protestantism and its goal of having each person read the Bible personally, a steady rise in literacy. Literacy in the rest of the world was circumscribed and schools in China, India, and Islamic countries were mainly set up to teach traditional wisdom and religious law.

The West was living in a time of great intellectual curiosity and had had a tradition, based on Aristotle, of hands-on experimenting to discover the truths of things. Again, there was no corresponding climate in the rest of the world.

Huff’s work is academic, with copious footnotes and an extensive bibliography. He in no way suggests that the West was superior to the other cultures of the day, only that it was culturally different. China, India, and the Islamic states had access to the instruments developed in the West, and to the knowledge that was being discovered and disseminated, but their societies didn’t have the requisite intellectual curiosity to pursue the new knowledge.

Intellectual Curiosity and the Scientific Revolution is one of the best science books I’ve read in a while. I recommend it to anyone interested in the history of science and especially the emergence of science in the 17th Century and the culmination of many threads of knowledge in the publications of Newton.

More Trouble for Darwin

When Darwin published On the Origin of Species in 1859, he suspected it would cause a storm of protest and indignation from religious quarters. He was right. Evolutionary studies, along with the new geological studies of the 19th century, brought the first awareness of “deep time,” as Stephen Jay Gould would later call it. They suggested that the earth was older, incredibly older, than had been previously thought. The evidence, corroborated by scientists then and since, has supported the hypothesis and shown that the Bible of the Christian church isn’t a reliable guide to the history of the planet.

Worse, from a Victorian point of view, was the evidence that man, along with the great apes, had descended from a common ancestor, through a long process of natural selection. One measured in millions of years for our branch alone.

Darwin had a hard time of it publicly and was lampooned in the newspapers of the day. But continued studies over the next 150 years, plus a new understanding of genetics, have shown that, with minor exceptions, Darwin got it right. For this he is justly honoured as one of the great figures of science.

Among scientists of natural history, the theory of evolution fits the facts, full stop. Nothing scientific has ever been put forward to challenge this point of view, and the so-called “gaps” pointed to by those who don’t want to believe the facts have been closed one by one as more discoveries have been unearthed. The fossil record and the genetic record are both consistent with evolution having taken place in the deep time of the planet Earth. To most scientifically literate people, the theory of evolution is as solid as the theory of gravity.

But there are still those who resist facing the facts. While visiting a cave in Arkansas last spring, I innocently asked our guide how long it took for the magnificent large limestone formation to form. She replied, “It depends on whether you believe in the ‘millions’ theory or the ‘thousands’ theory of the earth. I’m in the ‘thousands’ camp so I’d say a few thousand years.”

The “thousands” theory? This derives from Bishop James Ussher in the 17th century, who speculated that the date of the Biblical Creation could be determined by calculating the lifespans of Old Testament patriarchs. Ussher’s conclusion was that the earth began on October 23, 4004 B.C. This totally Biblical calculation has somehow survived into present times, within sects of fundamentalist Christians who believe therefore that the earth is some 6000 or so years old, despite scientific (and rather obvious) evidence to the contrary.

What people choose to believe as an article of faith, rather than reason, is a basic right in the Western world. There are people who believe in the efficacy of quartz crystals and “power spots” as well. The problem begins when beliefs such as these spill out from personal and congregational spaces into public spaces.

You’d think that 150 years of solid evidence for the evolution of life on our planet, including our own evolution into Homo sapiens, would be sufficient reason for having it taught in schools. Yet there are still fundamentalist Christian lobbyists who want it taught alongside something they call “intelligent design.” Judge John Jones III ruled in the Dover, Pa., case in 2005 that “intelligent design, by its very nature, is a religious belief, not a scientific fact or theory, and therefore should not be taught in schools.” Intelligent Design is a tarted-up name for Creationism — an attempt to give it scientific trappings.

But the debate continues. A recent Washington Times article, “On teaching evolution: New year, old fight,” reports that “at least two U.S. states in 2012 will consider bills that downplay the notion man evolved from animals and call for Charles Darwin’s famous theory to be taught as just that – one possible explanation, not the definitive answer.” Alongside Intelligent Design, that is.

Rep. Gary Hopper of New Hampshire is quoted as saying, “I want the problems with current theories to be presented so that kids understand that science doesn’t really have all the answers. They are just guessing.”

Guessing? If this is any indication of how some people think science works, it’s clear that we need more, not less, teaching of science and scientific literacy. Certainly science doesn’t have all the answers. That’s the nature of science. A scientist, like a good detective, follows the evidence, wherever it leads. In fact science is based on challenging the evidence. Whenever a new study emerges, other scientists try to pick it apart. If it withstands the challenges and is replicated by other scientific studies, a consensus forms around the results. If, eventually, evidence points to something entirely different, then previous views are updated and a new consensus is arrived at. In brief, science is self-correcting.

Religious faith does not operate this way. It instead harbours the concept of “immutable” truths. Which is not to say that a scientist can’t be a religious person. It’s just that he or she doesn’t confuse the two “magisteria,” as Stephen Jay Gould called the different mental spaces of science and religion.

That we live in an age of science is indisputable. And the scientific consensus is that Darwin got it right. There are no evidence-based challenges to evolutionary theory, only faith-based ones. To become good, participating citizens of a scientific world, students need to be taught how science works and not have their publicly-funded science studies entangled with the religious beliefs of fundamentalist Christian (or, for that matter, Islamic) faith.

In a modern world, church and state must be kept separate. Science must be taught in the context of following the evidence, not of comparing it to religious beliefs. To do anything else would be a disservice to the students.

Taking Stock: Facing 2012

I hope you all had a good Christmas and New Year season!

Traditionally, New Year’s Day is a time for resolutions that will largely be unkept in the months that follow, so I’ll refrain from making any. Besides, some of them are ongoing no matter what time of year: lose weight, exercise more, write more.

Looking back to 2011, I’ve had a MacBook Air (11″) for a year now and it’s so slick and useful it still feels new. As such it’s an incentive to get down to the task of writing just so I can use it. I enjoy my technologies, but it’s been a long time since one has stayed so fresh. Kudos to Apple for another brilliant design and execution.

There are rumours of a new iPad in the works sometime in 2012. If it turns out to be true I might be ready to pick one up. I gave my previous one to Marion after getting the more writer-friendly MacBook Air, but I confess I miss the iPad experience. I get a miniature version of it with my iPhone 4 but it’s not the same without the large viewing screen.

As I’ve mentioned in the past, I’m a fan of podcasts and I’d like to pay tribute to my two favourites: I Should Be Writing, by Mur Lafferty, and Brain Science Podcast, by Dr. Ginger Campbell. You ladies have allowed me to listen in on hours of intelligent conversation. Thank you.

I have a couple of directions I may take my writing in 2012. One idea I’ve been kicking around is putting together a series of personal essays into a Kindle book. The other is to write on a couple of subjects that interest me, but as extended feature articles that could be published as Kindle Shorts.

I don’t have any special photo projects in mind for the year. I’m content to carry a camera around with me and take shots of this and that as I see things. I plan to post a new photo on my Flickr photostream every day, if possible. The camera in my iPhone 4 increases my odds of meeting this goal.

One of the things I may do more of in 2012 is post short reviews of books I’ve read. My current reading is Intellectual Curiosity and the Scientific Revolution, by Toby E. Huff. I’m about 25% into it and already it’s shaping up as the best science book I’ve read in the past year.

Currently listening to The Harrow & The Harvest, by Gillian Welch. Indispensable if you like a traditional folk sound.

My other two goals for the New Year are to study more philosophy and mathematics. I’m nearly ready to tackle my Algebra II course and I have a good Teaching Company Great Courses series, Modern Intellectual Tradition: From Descartes to Derrida, that I’ve started. Staying intellectually active is less a goal than a deep-seated need. I suspect it’s the same for you.

I look forward to seeing and hearing from friends in 2012. May your 2012 be a wonderful year.

Tracking Science

I’ve been a science buff since grade school so it’s natural I want to follow recent developments in science. The trouble is, it’s an impossible task. Not even scientists can keep up, even within their specialty.

So the best I can do is track some of the highlights of science, technology, and medicine that make it into the news. The task is made easier through the use of Twitter, by adding science and technology Twitter feeds to my account. I’m currently following 45 sci-tech feeds and receiving up to 100 tweets a day. I expect the number of feeds to increase as I discover more of them.

Although Twitter is a great help in keeping up to date on things sci-tech, it’s a burden to read through the daily feeds. I spend a lot (too much) time on the Internet as it is but there are more feeds than I can easily track. As a consequence I was missing a lot of interesting stories.

What I needed was an easier way to track the stories. I found a good compromise solution by using Paper.li, a nifty service that allows you to create a daily online newspaper from your Twitter feeds. Paper.li lets you set up your newspaper on a once-a-day or twice-a-day publishing schedule, and you can set the times of publication.

Using this service I’ve created Gene’s Sci-Tech Daily, a twice-a-day paper, and it’s proven a helpful way to track news items. It keeps me up to date on the hunt for the Higgs boson, the voyage of Curiosity to Mars, the potential cloning of woolly mammoths, reviews of new tech items such as the Kindle Fire, and the latest in the Apple-Samsung lawsuits. The paper misses some stories, of course, but on the whole it seems to vacuum up the main items.

Feel free to bookmark the paper and use it if you want a pleasant way to track science and technology news. Drop me a line if you find it useful.

A New Way of Walking

Everybody’s talking ’bout a new way of walking
Do you want to lose your mind?
Walk right in, sit right down
Daddy let your mind roll on
— Rooftop Singers, “Walk Right In”

I remember an incident from the late 70s. At the time I was Head Librarian at the Royal Ontario Museum and my main reference and cataloguing duties were with the museum’s science departments. As a result of this, I got first look at most of the new acquisitions, which included Scientific American reprints. One of the reprints was on bipedalism, and one of its articles laid out the mechanics of walking upright.

The reprint focused on early hominids and what was required for them to walk on two legs. We now think bipedalism was a very early evolutionary development and that some of our ancestors who walked upright were apes with craniums no bigger than a chimp’s. In other words, bipedalism goes a long way back.

What I recall the most, after reading the reprint, was how for the next few days afterward I got stoned on watching people walk. It was as if I were witnessing bipedal walking for the first time. I could see the mechanics in action, and the beautiful flow of balance and energy efficiency. (Of course pretty girls made the observation additionally interesting.) It was as if encountering a new idea for the first time, then seeing it applied everywhere.

All this came back to me sharply a few months ago when, getting out of bed in the morning, I’d step on my right foot and gasp at the sudden, sharp pain in the heel. Yikes, what was this? I thought at first I’d bruised it badly somehow, but the heel didn’t, well, heal. It hurt worse and worse as the days went on. So much so I had to grab a cane to walk any distance, and even that was painful.

It turned out I was “blessed” with a common condition called plantar fasciitis, an inflammation of the plantar fascia in the foot. For which there is no quick or easy cure. I started walking less and icing my foot at least once a day. It significantly reduced my walking radius which in turn impacted my photography. My doctor told me to be patient, and that I might benefit from custom orthotics.

I did the next best thing. I hobbled to The Running Room where I found generic orthotic arch support inserts. As soon as I tried a pair, the relief was instant. Not a cure, but it made putting weight on my foot somewhat less painful. I bought them, transferred them from shoe to shoe in all the shoes I wore, and continued the ice treatments. My doc also prescribed an anti-inflammatory that helped with the pain.

As a result I began, gradually, to walk more easily. But with a difference. Whereas previously I would hit my heel down hard as I walked, I began to shift my downstride more to the middle of my foot. I didn’t do this consciously — it simply hurt less to walk that way. But it felt awkward, for a while.

Today as I was walking, relatively pain free, I realized I had a new way of walking. That the small muscles in my legs, ankles, and feet had adjusted to the new stride, and that I was walking very comfortably. Now there are people who posit that walking in shoes is unnatural and that shoes rob us of the natural gait we evolved. This I don’t know the answer to. Perhaps.

All I know is that I felt comfortable, almost floating, and that my mind was rolling on.