January 12, 2011

Caught in an information rip?

There’s so much information out there that journalists are starting to code in languages like R or Python to tease out patterns and spikes in data sets. I’m not sure how many of these data journalists there are yet. But there’s no question there are stories to be found (search ‘data journalist’ and you’ll find pages of guides to the emerging field).
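
If you’re curious what that actually looks like, here’s a minimal sketch in Python using the pandas library – one rough way to flag spikes in a series of daily counts. The file name and column names are invented for the example, not taken from any real newsroom.

    # Toy spike detector: flag days where a value jumps well above its recent average.
    # "daily_counts.csv", "date" and "count" are hypothetical names for illustration.
    import pandas as pd

    df = pd.read_csv("daily_counts.csv", parse_dates=["date"])
    df = df.sort_values("date").set_index("date")

    window = 30  # look back over roughly a month of rows
    rolling_mean = df["count"].rolling(window).mean()
    rolling_std = df["count"].rolling(window).std()

    # Call anything more than three standard deviations above the recent mean a spike.
    df["spike"] = df["count"] > rolling_mean + 3 * rolling_std
    print(df[df["spike"]])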

And more and more, we’re becoming obsessed with information itself. Think about the flow of stories and observations and thoughts on Twitter. The number of blogs covering data visualisation and infographics (though that trend seems to have peaked). The way we use spatial apps in a crisis like the Queensland floods. And in the world of science, we’re building up massive troves of data.

It turns out searching is one of our primary drives.

In 1954, psychologist James Olds put electrodes in rats’ brains for an experiment. By accident, he discovered that if the probe was put in the lateral hypothalamus of the brain, and the rat was allowed to press a lever and stimulate its own electrodes, it would press the lever until it collapsed. Ever since, people have assumed that the lateral hypothalamus is the brain’s ‘pleasure centre’.

But that didn’t make sense to neuroscientist Jaak Panksepp.

The animals, he writes in Affective Neuroscience: The Foundations of Human and Animal Emotions, were “excessively excited, even crazed.” The rats were in a constant state of sniffing and foraging. Some of the human subjects described feeling sexually aroused but didn’t experience climax. Mammals stimulating the lateral hypothalamus seem to be caught in a loop, Panksepp writes, “where each stimulation evoked a reinvigorated search strategy”.

It is an emotional state Panksepp tried many names for: curiosity, interest, foraging, anticipation, craving, expectancy. He finally settled on seeking. Panksepp has spent decades mapping the emotional systems of the brain he believes are shared by all mammals, and he says, “Seeking is the granddaddy of the systems.” It is the mammalian motivational engine that each day gets us out of the bed, or den, or hole to venture forth into the world. It’s why, as animal scientist Temple Grandin writes in Animals Make Us Human, experiments show that animals in captivity would prefer to have to search for their food than to have it delivered to them.

For humans, this desire to search is not just about fulfilling our physical needs. Panksepp says that humans can get just as excited about abstract rewards as tangible ones. He says that when we get thrilled about the world of ideas, about making intellectual connections, about divining meaning, it is the seeking circuits that are firing.

That’s from an Emily Yoffe piece in Slate on why our brains are hard-wired to love Google, Twitter and all those other things.

So that’s why I wade through writers’ blogs and scan feeds and pore through books and magazines. It’s powerful, very useful, and probably not particularly healthy. But is it just the way things are?

Maybe. Kristin Alford blogged yesterday about her search for ‘flow’ amid the interruptions.

Attaining flow requires sustained thinking and the creative application of our skills and knowledge to solve new and difficult problems – mastering a difficult turn for an ice skater, finding the right phrase for a poet. But when we reach a state of flow, we barely notice the time pass and gain great satisfaction.

How do we create opportunities for sustained thinking and flow in our connected world with the rush of Twitter, status updates on Facebook, hitting receive on emails and the incessant ping ping of messages on our devices? How do we break old unproductive habits associated with connection?

Kristin’s solution, via Jack Cheng at the excellent A List Apart, was to think about the habit fields surrounding your work environment:

On the one hand it should be a place of quiet contemplation and flow. On the other hand it is also where we check messages, talk on the phone, pay bills, research on the internet. If you check Twitter first thing in the morning and then regularly during the day, it becomes a habit associated with that space.

Cheng noted that he deleted a certain Twitter client because he would find himself absent-mindedly clicking the shortcut key without realising. I do this regularly when I switch between applications, finding myself on Tweetdeck with no previous intention of viewing it. It has become an instinctive habit, muscle memory.

Cheng now sits in a different chair for Twitter and email, saving his desk for actual work – in his case writing, designing and coding. Physically changing the space has reduced the social media habit field at his desk.

I’m not sure sitting at a different desk, using a different computer, is really practical for me. But the issue of ‘habit fields’ forming around these heavily used locations really chimes with me. As I mentioned in the comments of Kristin’s post, I think this is something we’re all struggling to get right. Twitter and whatever’s next are so good at taking us to good things, but the by-product is a Pavlovian response in our reward centres: constant refreshing, scanning and instinctive interacting. I take time out to read books, long articles and papers, and to write down thoughts, observations and ideas every day, but I’m going to have to think more about those habit fields.

It definitely puts a different slant on neuroscientist David Eagleman’s prediction in The Guardian that within the next 25 years we’ll be jacking information streams directly into our brains with William Gibson-esque machine interfaces.

I’d like to imagine we’ll have robots to do our bidding. But I predicted that 20 years ago, when I was a sanguine boy leaving Star Wars, and the smartest robot we have now is the Roomba vacuum cleaner. So I won’t be surprised if I’m wrong in another 25 years. Artificial intelligence has proved itself an unexpectedly difficult problem.

Will we have reached the singularity – the point at which computers surpass human intelligence and perhaps give us our comeuppance? We’ll probably be able to plug information streams directly into the cortex for those who want it badly enough to risk the surgery.

That idea of sentient computers and the singularity rewiring our ideas of humanity appears to be crystallising. Tim Flannery’s fascinating interview with Robyn Williams on ABC’s Science Show laid out the Internet, the flow of information around the planet and the increased interdependence of all of us as steps towards the development of a superorganism – he calls it Gaia – and as he describes it, it definitely calls to mind the ant colonies described by entomologist EO Wilson.

It’s a debate that’s picked up a lot of steam. Flannery’s comments were editorialised by Tim Blair in the Daily Telegraph and anonymously in The Australian.

Come on down, Gaia!

Hey, if the big guy can take out Australian citizenship by this morning and hold a cricket bat, we’ve got a job for him at the SCG.

It’s obviously speculative, but there’s a lot more to this than meets the eye – and it really taps into what we’re increasingly understanding about how complex systems work, and how Earth’s system works. Here’s a response from three scientists on the role of what’s increasingly being called Earth Systems Science.

A critical feature of Earth System Science is to recognise that human activities now form a major interactive part of the functioning and evolution of the entire planet. This is a significant departure from the past where humans have been studied separately from the environment around us. We have been regarded as villains impacting the planet’s natural systems, and victims suffering from the way the planet reacts, for example through changing climate.

This new approach means that the natural science of global environmental change must be linked with social science, economics and the humanities, that is, “global environmental change” must become “global change”.

There are real risks when we become so interconnected and tapped into the global flow of data. Personal risks that stop us from being creative and able to deliver on our promise. Global risks, where crises like the financial meltdown brought on by sub-prime lending can cascade around the world through our interconnected financial systems. And fundamental challenges to Earth’s inhabitants, like the mass extinction of Australia’s biodiversity.

But there’s also the very real chance of good – the ability to understand each other, to have richer, far more fulfilling experiences, to come up with new ideas, to solve some of these big problems. Finding the balance is crucial.

  • leesawatego

    RT @matt_levinson: Caught in an information rip? http://ideas.fortunegrey.com/2011/01/12/

  • http://bridge8.wordpress.com/ Kristin Alford

    So excited reading this and seeing the other connections apart from what I wrote yesterday.

    The idea of seeking is something I hadn’t picked up on. And am right now (thanks to @willozap) pondering the link between deep thinking and creative flow. Perhaps seeking, thinking, creating all require different modes, and different interactions with the digital.

  • howthebodyworks

    RT @matt_levinson: Caught in an information rip? http://ideas.fortunegrey.com/2011/01/12/

  • ScientistMags

    @matt_levinson Where would you like me to comment? On the blog?

  • http://twitoaster.com/country-au/matt_levinson/ matt_levinson

    @ScientistMags that’d be a great place!

  • http://philosophicallydisturbed.wordpress.com/ Magdeline Lum

    I joined Twitter to fulfill my desire to keep up with the latest news on the Iran elections in 2009. That was a while ago now. I’m still there and have lost hours on the service.

    I can spot people that do nothing more than create 140 characters of wit with the sole purpose of being retweeted endlessly. I have noticed that I have become impatient with real-time coverage of events. Sometimes real time is not fast enough, and it’s not like anything can travel faster than real time. I find myself at times searching out other points of view on the one situation. This is a wonderful thing to do, but I have begun to ask myself why.

    These days I click through on links from trusted sources before retweeting them instantly. After all, why bother broadcasting something that you haven’t understood or, worse still, haven’t taken the time to read. Lately I’ve slipped in a few links to YouTube videos. I have seen some replies and RTs made in less time than the length of the video. You just have to wonder whether anyone actually pays attention anymore, or indeed what paying attention now means.

    I agree. There is less comprehension happening. I have seen so many arguments over Twitter from people on the same side of the debate. I admit to being in some heated conversations on Twitter, and they have centred around one pedantic fact that usually means nothing in day-to-day conversation. It’s as if the entire message and its meaning are lost.

  • ScientistMags

    @matt_levinson Done. Commented. Hope it makes sense.

  • http://twitter.com/woodenpalace Cam Webb

    I remember being told about the ‘habit field’ over 10 years ago during PhD orientation, when we were told to have a designated workspace for thesis writing so that our head would always be in the right place. What rubbish, I thought. However, the ‘habit field’ is something I struggle with more and more, and I’m sure it has an impact on me. I am quickly approaching a time when writing manuscripts at my work desk is near impossible. Between the typical work/laboratory distractions that are ever present, the online distractions of ‘information seeking’ add to the loss of productivity through interruption (although there is no doubt that having almost any publication at my fingertips is extremely useful). I can still be productive, but I find it hard to be creative/critical without the uninterrupted time to reflect on what I’m writing. I’m thinking of setting aside a few hours a week just for manuscript writing, when I take the laptop down to the library away from the lab/online distractions, and see if I can set up a more productive/creative ‘habit field’.
