
I’m on a plane to Boston to attend a workshop sponsored by the XPrize Foundation, hosted by MIT and the renowned futurist Ray Kurzweil, head of Singularity University. The workshop is being held to discuss the merits of creating a $10M XPrize to accelerate the development of direct brain-computer interfaces, or BCI for short.

I’ve always thought BCI was a great idea. I’m not the first to call it the “next step in human evolution”.  I’ve thought, “wouldn’t it be empowering to be able to surf the web with your thoughts?” When this vision of BCI becomes a reality, you would have all the knowledge of the world even closer than your fingertips, since the knowledge would be inside your head, instantly accessible, just like your ‘native’ thoughts and memories.

This optimism has prompted me to engage in research to make BCI a reality, as described in this recent Computerworld article, in which I was (mis)quoted as saying that people will be interacting with computers via chips implanted in their brains by the year 2020.  It will likely take longer than that, given the technical hurdles and the regulatory approval process, but there don’t appear to be any show-stoppers that would prevent it from happening within the next 20 or 30 years at the outside.

Now, after spending a little over a month as an active user of Twitter, I’ve got serious reservations about the wisdom of this vision of the future. Don’t get me wrong. Twitter is an amazing service. But it’s nothing like surfing the web.  Surfing the web is a ‘pull’ activity: I’m in control of what I’m looking for and what I’m reading. In contrast, Twitter is a ‘push’ service. Once I’ve set up a list of people to follow (I’ve got about 80 now), posts and links come at me relentlessly.

The perpetual stream of information wouldn’t be a big deal, except for one simple fact – it’s addicting. I’m fascinated by the information in a large fraction of the links the people I follow are posting. As a result, I find myself spending hours reading their posts, and when I’m done there is a whole new series of tweets, and the process repeats.  Better semantic filtering technology seemingly wouldn’t help, either. It’s not that I’m overwhelmed by crap – there is too much interesting stuff to read, and better filters would probably just point out more stuff that I’m missing, making the whole thing even more addicting!

I’m no slacker, and I’m usually a very self-disciplined person – just ask my friends, family and co-workers. But the stream of information coming at me from Twitter is so interesting and so distracting that it is hard to focus on other things.  I don’t think I could describe the experience as well as Jim Stogdill (@stogdill) has done in his post Skinner Box? – There’s an App for That. I’ll just quote a couple of passages, but anyone interested in the addictive side of Twitter, and what it can do to your thinking ability, should read it in its entirety.

In describing his conflicted attitude towards Twitter, Jim says:

I can either drink liberally from the fire hose and stimulate my intellect with quick-cutting trends, discoveries, and memes; but struggle to focus. Or I can sign off, deactivate, and opt out. Then focus blissfully and completely on the rapidly aging and increasingly entropic contents of my brain, but maybe finish stuff. Stuff of rapidly declining relevance.

This rings so true with me, and like Jim, I sometimes find it hard to willfully opt out – the stream is just too enticing. As Jim observes, we’re like rats in a Skinner box, self-stimulating with our reward of choice: real-time information.  We’re ‘digital stimulusaholics’. Jim goes on to say:

For the last couple of years I’ve jacked in to this increasing bit rate of downloadable intellectual breadth and I’ve traded away the slow conscious depth of my previous life. And you know what? Now I’m losing my self. I used to be a free standing independent cerebral cortex. My own self. But not any more. Now I’m a dumb node in some uber-net’s basal ganglia. Tweet, twitch, brief repose; repeat. My autonomic nervous system is plugged in, in charge, and interrupt ready while the gray wrinkly stuff is white knuckled from holding on.

What if Twitter is turning us into mindless cogs in a big machine, and the machine turns out to be dumb?  As Jim describes it:

What if the singularity already happened, we are its neurons, and it’s no smarter than a C. elegans worm?

Now imagine just how much more addicting the stream would be if it were coming at us in real time through a two-way link hooked directly to our brains. Sure, there would be an ‘off’ button – responsible scientists (like me!) will make sure of that.  But would anyone be able to push it?  I’m far from certain.

So I’m stuck in a difficult position.

Tomorrow I’m meeting with Mr. Singularity himself, Ray Kurzweil, and a bunch of other proponents of brain-computer interfaces to brainstorm about offering a big cash XPrize to the first group to make high-bandwidth BCI a reality.  And I’m thinking it may not be such a good idea for the future of humanity.

I expect Kurzweil will argue that merging our slow, squishy brains with our machines is the only option we have, and that rather than turning our brains to mush, it will jack them up to run thousands of times more efficiently than they do today, since transistors are so much faster than neurons.

Recent studies have shown that humans aren’t very good at multi-tasking, and paradoxically, people who multi-task the most are worse at it than people who usually focus on one thing and only occasionally multi-task.  So much for the learning / brain-plasticity argument that ‘we’ll adapt’.

Perhaps our brains could be reconfigured to be better at multi-tasking if augmented with silicon?  Perhaps with a BCI we could read an article and talk to our spouse at the same time. How weird would that be?  And with such a significant change in my cognition, would I still feel like me?  Would it feel like there was more than one of me?  Talk about schizophrenia!

Call me a conservative, but I know enough about the brain and human psychology to realize that it maintains its hold on reality by a rather tenuous rope, carefully woven from many strands over millennia by evolution. That rope is bound to get seriously frayed if we try to jack up our neural wiring to run many times faster, or to be truly multi-threaded in the time frame Kurzweil is talking about for the singularity, i.e. 2030 to 2050.

But on the other hand, one might conclude we’re damned if we do and damned if we don’t.  Whether we like it or not, things aren’t slowing down. The amount of information in the stream is doubling every year. If instead of jacking in with BCI we take the conservative route and leave our brains alone, the Twitter experience suggests we’re likely to be sucked into the increasingly voluminous and addicting flood of information, left with only our meager cognitive skill set to cope with the torrent.  I’m afraid our native, relatively feeble minds may not stand a chance against the selfish memes lurking in the information stream.

Sigh. Maybe I’m over-reacting…  If I don’t chicken out, I’ll try to bring up these concerns during the BCI XPrize discussions starting tomorrow.  I may even tweet about it. The official hashtag for the workshop is #bcixprize. Just click the link to follow along – it should be fascinating…

Updates:

  1. Here is another interesting perspective by Todd Geist on what it might be like to be a small part of a global information network, like the organisms on Pandora in the movie Avatar.  As I pointed out in my comment to Todd’s post, the difference between Pandora’s creatures and humans is that they had millions of years of evolution to cope with direct mind-to-mind linkages, while it’s happening to us in the course of at most a few generations.
  2. Here is a skeptical perspective on the whole idea of the singularity.


Dean Pomerleau
@deanpomerleau
