NOTES FROM UNDER THE TABLE: The Snoring of the Machines
Illustration: Nicholas Christakis, James Fowler; Marco Derksen, Flickr, CC BY-NC 2.0 license


BY James Hopkin

Put your machines to sleep more often, leave your phones at home, wander out in search of you know not what, but wander out anyway, and discover something that can’t be shown on a screen


The Snoring of the Machines (and the birth of ghost-modernism)

During the summer, I visited the Pergamon Museum in Berlin and stood before the Ishtar Gate (575 BC), once part of the walls of Babylon, and alluding to the Mesopotamian, pre-Christian creation myth of Ishtar and Marduk. Of course, what a sight, what a timeless vision of an entrance, a true beginning, enough to set you trembling! But what’s this? People weren’t looking, not really looking, certainly not patiently observing, standing in awe and wonder, astonished by their own sudden receptivity, amazed to have found such an otherworldly apparition in the middle of a pulsing European capital. No, instead they were taking out cameras and phones, pointing and clicking away, studying not the ancient wonder in front of them but the tiny screens on the back of their gadgets, thus instantly media-ting, de-mystifying, making safe (not to mention slaughtering the aura of) this extraordinary scene, and, most importantly for them, converting it into a virtual format which they could then photofuck, infantilise, and post to an uninterested mass of friends – “look where I was today” (though I wasn’t even looking) – before filing it away under “Summer, 2010”.

Don’t you despair?

In his novel Homo Faber, the Swiss writer Max Frisch wrote: “technology… the knack of so arranging the world that we don’t have to experience it.”

That was in 1957.

And here’s Zadie Smith, writing recently in The New York Review of Books: “When a human being becomes a set of data on a website like Facebook, he or she is reduced. Everything shrinks. Individual character. Friendships. Language. Sensibility. In a way it’s a transcendent experience: we lose our bodies, our messy feelings, our desires, our fears… [but] our denuded networked selves don’t look more free, they just look more owned.”

She gave up that social networking site (in which we are all as happy as fish in an ocean of marketing piranhas), stating that it is “falsely jolly, fake-friendly, self-promoting, slickly disingenuous” and “the greatest distraction from work I’ve ever had”.

Don’t you agree? Internetic distractions pose the greatest threat to cogent narratives, to patient observation, to a deeper sense of our selves. And by cogent narratives, I mean there’s also a very real threat to language itself: sms-speak, media-babble, the consonant- and number-heavy jargon of virtual correspondence.

“You Are Not a Gadget” is a book by the computer scientist and artist (the best of both worlds?) Jaron Lanier. He writes, “Different media designs stimulate different potentials in human nature. We shouldn’t seek to make the pack mentality as efficient as possible. We should instead seek to inspire the phenomenon of individual intelligence.”

Culture 2.0? The best or worst thing to happen to culture since the Enlightenment? Are you telling me the Kindle is the new Caxton? Certainly, the fascinating (and often time-wasting) availability of information on the net requires a shrewd intelligence to navigate. Or are we all just doing what everyone else is doing, moving – and process is one of the key words of Culture 2.0 – not towards self-awareness but towards self-exposure, virtual exhibitionism, even an evacuation of the self: all of me is out there, online? And it’s instant. No room in the dark for the growth of a soul. Just as the darkrooms of conventional photography are now considered somewhat otiose.

“A person is a mystery in broad daylight,” wrote Sartre.

But not in our culture of confession, of an almost cretinous will to divulge.

Yet as Max Frisch writes, this time in his novel I’m Not Stiller, a wonderful excavation of the many selves that constitute a single identity: “You can put anything into words, except your own life.”

When I did a spell as Guest Professor of Literature at the University of Leipzig, I taught a postgraduate class on the American writer Thomas Pynchon. Two of the students turned up with Wikipedia print-outs about the author, which one of them read out by way of a response to The Crying of Lot 49. When I told them that their university library held a large section of Pynchon criticism and that I would not accept Wikipedia as postgraduate research (and certainly not as the only research), they made a decision: they dropped out of the class.

Is this the generation of Culture 2.0?

OK, before you start calling me a Luddite, a Romantic, a cynic, or just plain nostalgic, let’s go back a little. I was born in Cambridge, around the time the home computer was invented in that city, and I grew up with Sir Clive Sinclair’s innovations: the ZX80, the ZX81, the ZX Spectrum. I’m as easy with computers as anyone. I use them to aid me in my work, but I refuse to be in thrall to them. That said, I wholeheartedly support the technological advances that link the world and increase literacy, the availability and immediacy of news (see, for example, the coverage of the UK’s recent student protests shot by the students themselves), the arts, and the flow of information. I know, too, how much these advances have, to give just one example, helped independent filmmakers to produce cheap but high-quality films and documentaries, a good many of which challenge the status quo, media-marketing dictates, political exploitation, inequality, the hidden stories of war and so on. But, of course, there’s a concomitant danger.

In The Guardian, Sean O’Hagan, writing of the current boom in low-budget, high-quality documentaries, observed: “That is perhaps the reason why its [the documentary’s] boundaries are currently being stretched – to keep up with the increasing unreality of the real world.”

Which takes me back to Jaron Lanier: “Information systems,” he writes, “need to have information in order to run, but information underrepresents reality”. Yet vehement supporters of Culture 2.0 claim that information systems extend reality.

So which reality are we living in? And whose? And how far does it go? Can Culture 2.0 lead, as many believe, to a more efficient, fairer democracy (Democracy 2.0?) and if so, what are these innovators promising next? The end of world hunger and poverty 2.0?

More likely, Culture 2.0 is simply a concept, slipped in while stressing continuity, when, in effect, it’s a digitalised postmodernism, the labyrinth replaced by a web.

If that’s the case, then we are really going to need some great minds, writers, artists, filmmakers, mythopoeic thinkers to help us make sense of our plight, to help us find our way around this all-encompassing web, torn as we will be between real and virtual worlds, and in dire need of synthesising our real and virtual selves in the search for an identity that does not deny but rather encourages the flourishing of our individualism and intelligence. And isn’t this need in many ways redolent of the inception of modernism after the First World War, an era facing rapid advances in communications, film and travel, while trying to recover spiritual values after the inhumane slaughter of that conflict and its corollary, economic uncertainty? So, after the postmodern glorification of fragments and “anything goes” (and the welcome collapse of borders, regimes, exclusivity), and during war and terrorism and the endless siege of the media and marketing machines that have coerced us into spending more money than we possess, don’t we need a few visionaries, a – and here I coin a phrase, a re-born concept, but one to bloom in its own way for our age – ghost-modernism (a spectral reincarnation of modernism, plus the better aspects of pomo and the internetic age) to put some of the pieces together, including our selves, virtually or otherwise, before “everything’s gone”?

Culture 2.0 is not the end of culture, nor is it a new beginning. The ancient portals remain. The beauty of a creation myth is that it is cyclical: it comes round again. If we’re to prosper in the current e-poch, ghost-modernism (recovering spiritual values and mythopoeic structures, coherent narratives, the mystery of life and of the self, the art of slow and steady perception, not to mention a more European outlook) is, for me, the way to bridge our lives between the real and the virtual world. And there’s something you can do alone in the non-virtual world, too: put your machines to sleep more often, leave your phones at home, wander out in search of you know not what, but wander out anyway, and discover something – a vision, a moment, a mystery, yes, an experience – that can’t be shown on a screen.

Then keep it to yourself.

Or has E M Forster’s “only connect” been replaced, without appeal and at whatever cost, by “only connectivity”?

© James Hopkin 09.11.10