U.S. Supercomputer Ranks First: For the first time since November 2009, a supercomputer from the United States has been ranked number one on the list of the world’s top supercomputers. IBM's Sequoia has been given the honor at the International Supercomputing Conference in Hamburg, Germany.
The machine churns through data at an impressive 16.32 petaflops, or Pflop/s (quadrillions of floating-point operations per second). It resides at the Department of Energy’s Lawrence Livermore National Laboratory and pulled ahead of Japan's Fujitsu “K Computer,” which had held first place for the past two years. That machine slipped to the number two slot, processing at 10.51 Pflop/s.
Sequoia is used primarily to run nuclear weapons simulations, but scientists also use it to process data related to astronomy, energy, the human genome and climate change. via Top500 Supercomputer Sites
Math Helps Computers Get Sarcasm: Your computer may be clever, but it's unlikely to get your sarcasm. Psychologists from Stanford University hope to change that with a quantitative theory of pragmatics. Basically, the theory adds a mathematical dimension to the way speech is converted into numbers -- a computer's language. The researchers devised a mathematical equation to predict human behavior and the likelihood that a speaker is referring to a particular object -- or being sarcastic. Their equation is based on the real-life responses of 745 people who took an online survey. The researchers hope the equation could help computers understand language better, help treat people with language disorders and maybe even make those computerized customer service attendants a little less aggravating. via Futurity
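To make the idea concrete, here is a toy sketch of pragmatic inference as Bayesian reasoning -- my own illustration, not the Stanford team's actual equation, with invented objects and words: a listener scores how likely each object is, given the word a speaker chose to describe it.

```python
# Each object is described by the words that literally apply to it.
# (All names here are made up for illustration.)
objects = {
    "blue_circle":  {"blue", "circle"},
    "blue_square":  {"blue", "square"},
    "green_square": {"green", "square"},
}

def word_applies_to(word):
    """How many objects a word truthfully describes."""
    return sum(1 for feats in objects.values() if word in feats)

def speaker_prob(word, obj):
    """Assume a speaker picks a true word in proportion to its
    specificity: the fewer objects it applies to, the more
    informative (and likely) it is."""
    if word not in objects[obj]:
        return 0.0
    norm = sum(1.0 / word_applies_to(w) for w in objects[obj])
    return (1.0 / word_applies_to(word)) / norm

def listener_prob(obj, word):
    """Bayes' rule: P(object | word) is proportional to
    P(word | object) * P(object), with a uniform prior here."""
    scores = {o: speaker_prob(word, o) for o in objects}
    total = sum(scores.values())
    return scores[obj] / total if total else 0.0

# Hearing just "blue", this listener bets on the blue square (0.6 vs 0.4):
# a speaker who meant the blue circle would probably have said the
# uniquely identifying "circle" instead.
```

The point of the sketch is that a single equation over word frequencies can capture a pragmatic, almost social judgment -- the kind of reasoning a sarcasm detector would also need.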
In Isaac Asimov's "Foundation" series, the future of masses of people can be predicted with "psychohistory," a method of predicting future political and social trends, using a device called the "Prime Radiant." In the 1950s, there wasn't the math or the computational power available to make such a thing reality. Now there might be.
Supercomputers, such as the Nautilus at the University of Tennessee's Center for Remote Data Analysis and Visualization, may have brought the world closer to Asimov's vision, though it is still early days. The key is seeking patterns in massive amounts of data and being able to visualize them. Kalev Leetaru, assistant director for text and digital media analytics at the University of Illinois Urbana-Champaign, did just that.
Leetaru used a database of 100 million news articles spanning the period from 1979 to early 2011. The data is from the Open Source Center and Summary of World Broadcasts, both set up by the U.S. and British intelligence agencies to monitor what amounts to nearly every news source in the world and translate them into nuanced English. By analyzing the text in the news stories and the tone -- whether they were largely positive or negative -- Leetaru found patterns emerging that seemed to line up with major periods of unrest. For example, in Egypt, the tone of news articles about Mubarak grew increasingly negative as the protests grew, until eventually Mubarak resigned.
It isn't just the tone of the articles, however; it's also the change in tone over time. According to Leetaru's findings, Saudi Arabia's government has remained in power because the tone of news coverage there has been just as negative in the past, whereas in Tunisia and Egypt it hit new lows. Leetaru notes that many country experts on Egypt said Mubarak would likely ride out the uprising, as he had done before.
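A toy sketch of that "new low" detection -- not Leetaru's actual pipeline; the word lists and articles below are invented -- scores each article by counting positive versus negative words, averages by month, and flags months whose tone falls below every earlier month:

```python
# Tiny illustrative sentiment lexicons (invented, not from the study).
POSITIVE = {"stable", "growth", "reform", "progress"}
NEGATIVE = {"protest", "crackdown", "anger", "crisis"}

def tone(article):
    """Tone in [-1, 1]: positive minus negative word counts,
    normalized by the number of sentiment words found."""
    words = article.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return (pos - neg) / max(pos + neg, 1)

def monthly_tone(articles_by_month):
    """Average tone per month, e.g. {'2011-01': -0.4, ...}."""
    return {m: sum(map(tone, arts)) / len(arts)
            for m, arts in articles_by_month.items()}

def new_lows(series):
    """Months whose average tone is lower than all earlier months."""
    months = sorted(series)
    lows, floor = [], series[months[0]]
    for m in months[1:]:
        if series[m] < floor:
            floor = series[m]
            lows.append(m)
    return lows
```

On this logic, a country whose coverage is consistently negative (like the Saudi case described above) produces few new lows, while a sharp slide -- as in Egypt -- lights up month after month.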
Another pattern the supercomputer was able to tease out was evidence of Osama bin Laden living in Pakistan. It did so by checking how often his name was recorded in association with the country. Visualized as lines on a map (pictured above) connecting the cities mentioned in stories that also referenced bin Laden, a pattern emerged that centered on northern Pakistan -- within a couple of hundred miles of Islamabad.
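The geographic inference rests on a simple primitive: count how often each city appears in articles that also mention the target name. A minimal sketch of that counting, with an invented city list and snippets (the real analysis ran over millions of translated articles):

```python
from collections import Counter

# Invented city gazetteer for illustration.
CITIES = {"abbottabad", "islamabad", "peshawar", "kabul", "london"}

def city_cooccurrence(articles, target):
    """Rank cities by how often they appear in articles that also
    mention the target name."""
    counts = Counter()
    for text in articles:
        words = set(text.lower().replace(",", " ").replace(".", " ").split())
        if target in words:
            counts.update(words & CITIES)
    return counts.most_common()
```

Cities mentioned only in articles that never name the target drop out entirely, which is how the signal concentrates on one region of the map.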
All this is possible because the supercomputer can seek patterns in networks with 100 trillion connections and 10 billion nodes (or actors). An ordinary computer, Leetaru says, can only look at small parts of the data at a time, and even running many in parallel wouldn't do the job, because the memory required to map a network balloons as the number of connections grows. Only a supercomputer could do it, and it was getting time on the machine (140,000 processor-hours, or about a week with the whole system running at once) that really enabled Leetaru to draw the conclusions he did.
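To see the scale problem, here is a back-of-envelope sketch -- my own rough arithmetic with assumed storage sizes, not figures from the article:

```python
NODES = 10_000_000_000           # 1e10 actors
EDGES = 100_000_000_000_000      # 1e14 connections

# A dense adjacency matrix needs one bit for every pair of nodes.
dense_bytes = NODES * NODES / 8

# Even a compact edge list, at two 8-byte node IDs per connection:
sparse_bytes = EDGES * 16

def human(n):
    """Format a byte count with decimal (power-of-1000) units."""
    for unit in ("B", "KB", "MB", "GB", "TB", "PB", "EB"):
        if n < 1000:
            return f"{n:.1f} {unit}"
        n /= 1000
    return f"{n:.1f} ZB"

print(human(dense_bytes))    # 12.5 EB -- exabytes, beyond any single machine
print(human(sparse_bytes))   # 1.6 PB -- petabytes, still thousands of
                             # desktops' worth of RAM
```

Even the most compact representation of the full network runs to petabytes, which is why sampling small pieces on an ordinary machine misses the global patterns.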
The technology isn't able to predict events precisely yet. Leetaru likens it to the early days of weather forecasting -- at one point, not much better than a guess, but now reliable enough to base decisions on. It won't predict individual actions, but it might be able to say what the reaction to something like the self-immolation of Mohamed Bouazizi in Tunisia would be.
It's obvious why intelligence agencies and militaries would be very interested. Asked if they had called, Leetaru said, "I can't answer that."
The brain is an enormously complex machine whose processing power is unmatched by even the most advanced supercomputer. In fact, as evidenced by IBM’s famed ‘cat brain’ project, the best supercomputers are required just to simulate the basic behavior of biological neural circuits.
A major difference between the brain and a supercomputer is that the brain handles memory and processing at the same time -- on what is essentially a parallel circuit. But even the best computers are built with physically separate ‘memory’ and ‘processing’ components. The back-and-forth transmission of data between these two subsystems makes the machines inherently non-parallel.
However, this month in the scientific journal Advanced Materials, a study led by David Wright of the University of Exeter in England has shown how semiconductor ‘phase-change materials’ have just the right properties to build a machine that can store memory and process information at the same time.
The curious property of phase-change materials -- which can be designed to melt at one temperature and solidify at another, releasing enormous amounts of energy in the process -- is not new. In fact, it's used in optical storage technology such as Blu-ray discs.
In this latest application, phase-change materials are used to build computationally sophisticated processors. The team built a 'phase-change processor' that could perform arithmetic operations such as addition, subtraction, multiplication and division. They hope that artificial neurons built from these processors will mimic the behavior of biological neurons more closely than conventional simulations do.
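A purely illustrative software model -- not the Exeter device itself -- of how a phase-change cell can do accumulator-style arithmetic: each electrical pulse nudges the material further toward its crystalline (low-resistance) state, so a number can be written as a pulse count and two numbers added by pulsing twice.

```python
class PhaseChangeCell:
    """Toy model of a multi-level phase-change memory cell."""

    def __init__(self, levels=16):
        self.levels = levels      # distinguishable crystallinity levels
        self.state = 0            # fully amorphous (high resistance)

    def pulse(self, n=1):
        """Partially crystallize the cell n steps (saturates at the top)."""
        self.state = min(self.state + n, self.levels - 1)

    def reset(self):
        """A strong melt-quench pulse returns the cell to amorphous."""
        self.state = 0

    def read(self):
        """Resistance falls as crystallinity rises; read back the level."""
        return self.state

def add(a, b, cell):
    """Add two small numbers by accumulating pulses in one cell."""
    cell.reset()
    cell.pulse(a)
    cell.pulse(b)
    return cell.read()
```

The key point the toy captures is that storage and computation happen in the same physical element -- the cell both remembers the running total and performs the addition.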
Last night, IBM's Watson computer won the final round of the three-day Man V. Machine Jeopardy! competition. At the beginning of the show, the humans were fierce, proving that they could buzz in faster than Watson even when the machine knew the answer. But by the time Final Jeopardy came around, Watson was ahead and was able to decipher the clue: "William Wilkinson's 'An Account of the Principalities of Wallachia and Moldavia' inspired this author's most famous novel" and provide the question, "Who is Bram Stoker?"
Both human competitors, Ken Jennings and Brad Rutter, got it correct as well. But I give Jennings additional kudos for his humorous parenthetical, "I for one welcome our new computer overlords." Nice job. The final scores were:
But although Watson won the competition, humans still prevailed. After all, Watson was designed by a team of dedicated researchers who spent years building out its 90 servers and customizing the hundreds of algorithms that produce precise answers. A tremendous amount of science has gone into developing this machine. In this video from IBM, you can hear more about it, but essentially Watson combines natural language processing, machine learning, knowledge representation and reasoning, and analytics. When it's given a task, like coming up with a question to a Jeopardy! clue, many computer processors work together in parallel to come up with the solution.
On Jeopardy!, Watson provided those solutions in as little as 3 seconds. But what is this computer's future in our world? In short, such a computer can extract intelligence from the information overload (overload, not overlord) we experience daily in our lives.
In this video from IBM, project researchers describe how a computer system like Watson could be capable of reading an unlimited number of documents, understanding the information and completely retaining it. Now, imagine that you could ask Watson a question and receive an answer in the time it takes to press a Jeopardy! buzzer.
Financial companies could use a computer like Watson to read and analyze news reports, market reports, trade publications, world events, blogs -- you name it -- and extract meaningful information for investors or business owners. A computer like Watson could also be used in healthcare to more accurately diagnose disease based on the patient's history as well as what's in the medical literature. A doctor could discover in seconds what the best treatment might be as well as the likely outcome.
I think Watson is a great achievement of our time. Congrats to everyone at IBM who helped make it happen.
Researchers at IBM's Zurich Labs have developed a prototype supercomputer called the Aquasar that uses a water-cooling principle to keep the system from overheating. The Aquasar is a normal-sized computer; there's nothing tiny about it. But IBM thinks that the water-cooling technology that's proven effective in this supercomputer could work just as well in a vastly smaller machine.
The processors in today's computers get very hot, and they have to be cooled off, usually by air. IBM found that using water to cool off a computer's processors is 4,000 times more efficient than using air.
In fact, up to 50 percent of an average air-cooled data center's energy consumption and carbon footprint today is just from powering the necessary cooling systems to keep the processors from overheating.
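The physics behind water's advantage can be sketched with textbook numbers: per unit volume, water absorbs several thousand times more heat than air for the same temperature rise. (The exact "4,000 times" figure IBM cites presumably also reflects system-level factors beyond raw heat capacity.)

```python
# Rough textbook values, not measurements from the Aquasar system.
WATER_DENSITY = 1000.0    # kg/m^3
WATER_CP      = 4186.0    # J/(kg*K), specific heat of liquid water
AIR_DENSITY   = 1.2       # kg/m^3, near room temperature
AIR_CP        = 1005.0    # J/(kg*K)

# Volumetric heat capacity: joules absorbed per cubic meter per kelvin.
water_vol_heat = WATER_DENSITY * WATER_CP
air_vol_heat   = AIR_DENSITY * AIR_CP

ratio = water_vol_heat / air_vol_heat
print(f"water carries ~{ratio:,.0f}x more heat per unit volume than air")
```

A given flow of water therefore removes the same heat as a vastly larger flow of air, which is what lets the cooling loop shrink down to the micro-channels described below.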
Dimos Poulikakos is the head of the Laboratory of Thermodynamics in New Technologies, ETH Zurich. His team of researchers worked with IBM to help develop Aquasar.
"With Aquasar, we make an important contribution to the development of sustainable high performance computers and computer system," he says in a press release from IBM. "In the future it will be important to measure how efficiently a computer is per watt and per gram of equivalent CO2 production."
In IBM's water-cooling system, the processors and several other parts in the computer are cooled with water that is no warmer than 140 degrees Fahrenheit. That's still pretty hot, but it's enough to keep the computers from overheating. As long as the processors remain well below 185 degrees Fahrenheit, the system will operate normally.
This water runs throughout the computer system in what IBM calls "micro-channel liquid coolers," but they are essentially tiny tubes full of water. Each processor in the computer has these tubes attached directly to it, so no processor in the system overheats.
IBM says Aquasar is almost 50 percent more efficient than the world's most powerful supercomputers.
Bruno Michel of IBM's Zurich Labs tells the BBC a supercomputer's energy efficiency is a lot more important than it used to be. "In the future, computers will be dominated by energy costs; to run a data center will cost more than to build it."
IBM says making supercomputers more energy efficient isn't just about saving money; it's also about helping the environment. About two percent of the world's energy is consumed by building and operating computer equipment.
Until recently, the most powerful supercomputer in the world could perform about 770 million computational operations per second for every watt of power. The Aquasar prototype clocked nearly half again as much, at 1.1 billion operations per second per watt. Now the task is to shrink it.
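The "nearly half again" comparison is easy to verify from the two figures quoted:

```python
# Efficiency figures as quoted above, in operations per second per watt.
prev = 770e6        # previous best
aquasar = 1.1e9     # Aquasar prototype

improvement = aquasar / prev
print(f"{improvement:.2f}x")   # about 1.43x, i.e. roughly 43% better
```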
Michel tells the BBC where IBM hopes to take the technology: "We currently have built this Aquasar system that's one rack full of processors. We plan that 10 to 15 years from now, we can collapse such a system into one sugar cube -- we're going to have a supercomputer in a sugar cube."
Here's how IBM explains their new supercomputer, in a video the company released:
Conserving energy is just as important as generating more renewable energy in terms of preventing climate change. The idea is that we use renewable energy like solar and wind where we can (at NFL stadiums and the White House, for example) and conserve energy as well.
This week Top500.org announced the world's fastest computers at the annual International Supercomputing Conference in Dresden, Germany. And boy, are they fast. So fast that in writing about them I was almost at a loss for ways to describe their capabilities. You can read my Tech 10 list here. We rely on supercomputers to do calculations that our brains--and laptops--can't do quickly enough. Among other things, they eliminate actual nuclear testing because they can handle the virtual modeling.
All those teraflops--and petaflops!--require serious power. Not just computing power, but physical electrical power. At the conference, a panel met earlier today to discuss what to do now that high-performance computing is reaching a stage where it costs about as much to keep the machines running ($7.2 billion) as it does to build them in the first place ($9.2 billion). The biggest players in the supercomputing world are starting with what they do best: data. This is the first time the Top500 list has included electrical usage for each system.
Horst Gietl, an executive consultant for the conference, says processor power consumption will be a big issue in the coming years. If we don't address it, he says, "The number one supercomputer in the year 2020 needs a nuclear power plant for its power supply." Either that, or a lot of cyclists.
Photo: IBM's Roadrunner supercomputer for Los Alamos National Laboratory, the fastest computer in the world. Credit: LeRoy N. Sanchez, Records Management, Media Services and Operations for LANL.