
We may have just witnessed an important milestone in the awakening of the web.

While this point may be controversial, I contend that future exponential growth of the digital economy will eventually require getting humans out of the loop. If computing power continues to double every 18 months in accordance with Moore's Law, utilizing all those cycles will eventually require computers to talk directly to other computers, without the goal of assisting, informing, or entertaining human beings.

Why? Because human population is virtually flat and limits to human cognition mean there is only so much digital content people can effectively digest.

According to a recent University of California study, the average US citizen consumes an estimated 34 gigabytes of data daily, mostly in the form of TV and video games. Collectively, American households consumed 3.6 zettabytes of information of all kinds in 2008, the researchers estimated. While this seems like a lot, and is likely to continue growing for some time as video resolution increases, our appetite for bytes will inevitably flatten out, particularly if we continue to get more of our information through mobile devices.

If machine-to-machine communication must eventually pick up the slack in demand for ever-increasing bandwidth, how and when will that happen, and what form will it take?

To some degree it is happening already. For quite some time there has been a shaky alliance between news aggregators (e.g. Google News) and machine-driven decision tools, best exemplified by automated financial trading systems. The widely reported United Airlines incident last year showed just how risky this combination can be. For anyone who missed it, United Airlines stock plummeted from $12 to $3, losing 75% of its value over the course of a few minutes on September 8, 2008, and in the process wiped out over $1B in shareholder value.

Insider trading?  Nope.

It turns out the trigger was a small mistake by a Florida newspaper that accidentally reran a story from 2002 about UAL's bankruptcy without a date, making it appear to be fresh news. Within a minute, the automated scanning system of Google News, which visits more than 7,500 news sites every 15 minutes, found the story and, thinking it new, added it to its breaking news stream. An employee at Bloomberg financial news saw the story and rebroadcast it to thousands of readers, many of whom follow United Airlines. Within minutes United's stock tanked, largely as a result of automated trading programs that saw the price dropping and sold the stock to prevent additional losses.
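The automated selling described above amounts to a simple stop-loss rule run at machine speed. Here is a minimal sketch of that kind of rule; the function name and the 10% threshold are hypothetical illustrations, not any broker's real API:

```python
# Toy sketch of an automated stop-loss rule of the kind that amplified
# the UAL drop. All names and thresholds here are invented for
# illustration.

def stop_loss_triggered(price_history, threshold=0.10):
    """Return True (sell) when the latest price has fallen more than
    `threshold` below the recent peak."""
    peak = max(price_history)
    latest = price_history[-1]
    drawdown = (peak - latest) / peak
    return drawdown > threshold

prices = [12.0, 11.5, 9.0, 6.0, 3.0]   # UAL's slide on Sept. 8, 2008
print(stop_loss_triggered(prices))     # a 75% drawdown triggers a sell
```

When thousands of programs run rules like this against the same price feed, each sale pushes the price lower and triggers the next program's sale, which is the runaway feedback loop the incident exposed.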

Once the mistake was cleared up and trading resumed, UAL's stock recovered most of the $1B it had lost, but the incident was an important lesson for the burgeoning industry of automated news scanning and financial trading. What went wrong during the United Airlines incident was a combination of human error and runaway automation that both propagated and acted upon the mistake.

You could try to blame the human element of the equation, since without the human error of resurrecting an out-of-date story, the incident would never have happened. But Scott Moore, head of Yahoo News, hit the nail on the head when he said:

This is what happens when everything goes on autopilot and there are no human controls in place or those controls fail.

Now, in what could be an important (but potentially risky) step further, we are beginning to see computers acting as both the producers and consumers of content, without a human in the loop. In this case it is called computational journalism, and it consists of content generated by computers for the express purpose of consumption by other computers.

Academics at Georgia Tech and Duke University have been speculating about computational journalism for some time. But now, the folks at Thomson Reuters, the world’s largest news agency, have made the ideas a reality with a new service they call NewsScope. A recent Wired article has a good description of NewsScope:

NewsScope is a machine-readable news service designed for financial institutions that make their money from automated, event-driven, trading. Triggered by signals detected by algorithms within vast mountains of real-time data, trading of this kind now accounts for a significant proportion of turnover in the world’s financial centres.

Reuters’ algorithms parse news stories. Then they assign “sentiment scores” to words and phrases. The company argues that its systems are able to do this “faster and more consistently than human operators”.

Millisecond by millisecond, the aim is to calculate “prevailing sentiment” surrounding specific companies, sectors, indices and markets. Untouched by human hand, these measurements of sentiment feed into the pools of raw data that trigger trading strategies.
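The word-and-phrase scoring described in the excerpt above can be sketched in a few lines. The lexicon, weights, and averaging rule below are invented for illustration; Reuters' actual models are proprietary:

```python
# Minimal sketch of headline "sentiment scoring": assign each known
# word a score, then average to get a prevailing sentiment. Lexicon
# and weights are hypothetical.

SENTIMENT = {"bankruptcy": -3.0, "losses": -2.0, "plummeted": -2.5,
             "growth": 2.0, "record": 1.5, "profit": 2.5}

def prevailing_sentiment(headline):
    """Average the scores of known words; 0.0 means neutral/unknown."""
    words = [w.strip(".,!?") for w in headline.lower().split()]
    scores = [SENTIMENT[w] for w in words if w in SENTIMENT]
    return sum(scores) / len(scores) if scores else 0.0

print(prevailing_sentiment("UAL files for bankruptcy, losses mount"))
```

A real system would aggregate such scores per company across thousands of stories per second, but the core idea of reducing prose to a tradable number is exactly this simple, which is part of what makes it risky.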

One can easily imagine that with machines deciding what events are significant and what they mean, and other machines using that information to make important decisions, we have the makings of an information ecosystem that is free of human input or supervision. A weather report suggesting a hurricane may be heading towards Central America could be interpreted by automated news scanners as a risk to the coffee crop, causing automated commodity trading programs to bid up coffee futures. Machines at coffee-producing companies could see the price jump and trigger the release of stockpiled coffee beans onto the market, all without a human hand in the whole process. Machines will be making predictions and acting on them in what amounts to a fully autonomous economy.
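The coffee scenario above is an event-driven chain of three independent machines. Here is a toy sketch of that chain; every function, keyword trigger, and threshold is invented for illustration:

```python
# Hypothetical sketch of the machine-to-machine chain imagined above:
# a news event triggers a trading signal, which triggers a producer's
# inventory rule. No step involves a human.

def news_scanner(report):
    # Flag hurricane reports mentioning coffee as a supply risk.
    text = report.lower()
    return "hurricane" in text and "coffee" in text

def futures_market(price, supply_risk):
    # Crude model: supply-risk news pushes the futures price up 20%.
    return price * 1.2 if supply_risk else price

def producer(price, reference_price=100.0):
    # Release stockpiled beans when futures exceed the reference price.
    return "release stockpile" if price > reference_price else "hold"

report = "Hurricane forecast may threaten Central American coffee crop"
price = futures_market(100.0, news_scanner(report))
print(producer(price))  # the whole chain runs end to end on its own
```

Each component is individually sensible, but chained together they form exactly the kind of unsupervised feedback path that turned a stale newspaper story into a $1B stock collapse.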

This could be an alternative route to the Global Brain I previously envisioned as the end result of the TweetStream application. By whichever route we get there (and there are likely others yet to be identified), the emergence of a viable, worldwide, fully-automated information exchange network will represent an historic moment. It will be the instant our machines no longer depend entirely on humans for their purpose. It will be a critical milestone in the evolution of intelligence on our planet, and a potentially very risky juncture in human history.

The development of NewsScope appears to be an important step in that direction. We live in interesting times.

1/4/09 Update

Thomson Reuters, the developers of NewsScope, today acquired Discovery Logic, a company whose motto is "Turning Data into Knowledge". Among its several products is Synapse, designed to help automate the technology transfer of government-sponsored healthcare research by the NIH Office of Technology Transfer (OTT). They describe Synapse as:

An automated system to match high-potential technologies with potential licensees. Using Synapse, OTT now uses a “market-push” approach, locating specific companies that are most likely to license a certain technology and then notifying the companies of the opportunity.

Using the same product, OTT also found it could successfully perform "market pull," in which OTT can identify multiple technologies in its inventory in which a company may find interest.
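At its core, this kind of matching can be done by scoring the textual similarity between a technology description and each company's profile. Here is a toy sketch using Jaccard similarity over keywords; the company names, descriptions, and scoring method are all hypothetical, since Discovery Logic's actual approach is not public:

```python
# Toy sketch of "market-push" matching in the spirit of Synapse:
# rank companies against a technology description by keyword overlap.
# All data and the scoring method are invented for illustration.

def jaccard(a, b):
    """Jaccard similarity between the word sets of two strings."""
    a, b = set(a.lower().split()), set(b.lower().split())
    return len(a & b) / len(a | b)

technology = "implantable glucose sensor for diabetes monitoring"
companies = {
    "Acme Diagnostics": "diabetes monitoring devices and glucose assays",
    "Orbital Optics":   "satellite imaging lenses",
}

# "Market push": rank potential licensees by similarity and notify
# the best matches of the licensing opportunity.
ranked = sorted(companies,
                key=lambda c: jaccard(technology, companies[c]),
                reverse=True)
print(ranked[0])  # → Acme Diagnostics
```

"Market pull" is the same computation run in the other direction: hold one company profile fixed and rank the technologies in OTT's inventory against it.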

Apparently Reuters isn't interested in automating just the generation and dissemination of news, but technology transfer as well.

Yesterday I read the latest survey by John Brockman of Edge.org. This year, John asked 125 leading thinkers one simple question:

“What game-changing scientific ideas and developments do you expect to live to see?”

Respondents included Richard Dawkins, Freeman Dyson, Dan Dennett and Brian Eno. Each wrote a few paragraphs describing their vision of the 'Big Thing' likely to happen that will be a game-changer (good or bad) for humanity.

The results were fascinating, and well worth reading in their entirety, either in Brockman's soon-to-be-released book, This Will Change Everything, or online at Edge's World Question Center 2009.

For people who don't want to wait for the book release later this month, and who don't have the time to read all 125 essays, I took a couple hours to read and categorize the responses, to get a sense of what big developments the visionaries as a group are expecting to happen. Here is a histogram of how frequently the contributors mentioned various topics. There are more than 125 votes since several of the contributors mentioned more than one idea or development.

[Histogram: 2009 Survey - What Will Change Everything?]

The results were pretty surprising. By far the most frequently cited developments that these experts think will dramatically change our lives were advances in brain science and brain-computer interfaces (BCI). Biotech and genetic engineering came in second, followed by artificial intelligence/robotics.

I had expected climate change to be at the top, but it was tied for 4th. Perhaps the scientifically-minded participants in the survey figure humanity will get a handle on climate change via a combination of new energy technology and geo-engineering. Maybe they think climate change won't be a global catastrophe after all, but more like the Y2K glitch.

My personal favorite, the emergence of global consciousness (perhaps through some kind of singularity transition) was the 7th most frequently mentioned development that could change everything.


Ever feel like you're part of a big machine?

This blog is an exploration of what being part of a collective might mean for each of us as individuals, and for society.

What is it that is struggling to emerge from the convergence of people and technology?

How can each of us play a role, as a thoughtful cog in the big machine?

Dean Pomerleau