Petaflops are here

Status
Not open for further replies.

kmguru

Staff member
A handful of engineers at a lab in Poughkeepsie, N.Y., have assembled what they expect will become--at least for a while--the world's most powerful computer. The IBM Roadrunner likely will go down in history as the first computer to consistently crank out 1 petaflops--a quadrillion floating-point operations per second.

But the true significance of the IBM supercomputer--and many similar efforts gearing up around the world--might not be the milestone of cracking the petaflops barrier. The bigger impact may lie in what the creators and users of these powerful machines are learning about science and how to harness parallel computing.

Leading-edge supercomputers are making great strides in reducing cost and power consumption. But there is a growing gap between their theoretical performance and the amount of real work they can accomplish. The gap is due to the growing complexity of programming systems with many processors.



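That gap between theoretical peak and real work is essentially Amdahl's law: whatever fraction of a program stays serial caps the speedup, no matter how many processors you throw at it. A minimal sketch in Python, with made-up numbers:

```python
# Amdahl's law: why real work lags theoretical peak on big machines.
# The serial fraction and core counts below are illustrative only.

def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    """Maximum speedup when serial_fraction of the work can't be parallelized."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Even with only 5% serial code, 10,000 cores top out near a 20x speedup:
for n in (16, 1024, 10_000):
    print(n, round(amdahl_speedup(0.05, n), 1))
```

The striking part is how fast it saturates: going from 1,024 cores to 10,000 barely moves the needle once the serial fraction dominates.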
 
It's interesting to think about what new applications will come with these new levels of processing capability. I remember watching a TED talk about the history and trends of science, where the speaker extrapolated on where it's all going. He suggested that humans will soon be making most of their progress by running experiments in computer simulations. Sounds pretty plausible. You can do a lot more tweaking in virtual reality.
 
You may need a petaflop machine when Microsoft comes out with Windows 10 OS. :D

Seriously, we are a long way off from doing Matrix-type simulations or even a partial human-body simulation for drug effects, not to mention real weather forecasting.

Maybe someone will write an AI program and let the system learn from the internet....
 
It's interesting to think about what new applications will come with these new levels of processing capability. I remember watching a TED talk about the history and trends of science, where the speaker extrapolated on where it's all going. He suggested that humans will soon be making most of their progress by running experiments in computer simulations. Sounds pretty plausible. You can do a lot more tweaking in virtual reality.
My friend is a high-level chemist who works specifically on enzyme active sites. He said that, no, computers cannot yet model even these relatively simple reactions with good predictability. The only published paper claiming otherwise was eventually found to have been fabricated.
 
You may need a petaflop machine when Microsoft comes out with Windows 10 OS. :D

Seriously, we are a long way off from doing Matrix-type simulations or even a partial human-body simulation for drug effects, not to mention real weather forecasting.

Maybe someone will write an AI program and let the system learn from the internet....

Eh, I don't think we're that far off. Maybe a few decades or so. The thing about exponential improvement in processor capacity is that it's slow at first, but once it hits a critical point, the knee of the curve, things begin happening extremely quickly.
 
Yes, it is exponential... but only if that path stays on course: quad-core, then 8 cores, 16 cores, 32 cores in, say, 4 to 6 years....
 
Yes, it is exponential... but only if that path stays on course: quad-core, then 8 cores, 16 cores, 32 cores in, say, 4 to 6 years....

Processor capacity doubles on a relatively reliable schedule, but that doesn't mean the number of cores or the clock speed doubles with it. It's overall performance, measured in flops, that doubles.
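As rough back-of-the-envelope arithmetic (assuming an illustrative ~18-month doubling of aggregate flops, not a claim about any particular roadmap), going from 240 teraflops to a petaflop is only about two doublings:

```python
import math

# How many performance doublings separate a 240-teraflops system
# from 1 petaflops, and roughly how long that takes if aggregate
# flops double every ~18 months (an assumed, illustrative period)?

start_tflops = 240.0
target_tflops = 1000.0  # 1 petaflops

doublings = math.log2(target_tflops / start_tflops)
months = doublings * 18

print(round(doublings, 2), "doublings, about", round(months), "months")
```

So an early-2009 petaflop successor to a mid-2008 240-teraflops machine is aggressive but not crazy, if you believe the doubling assumption.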
 
The 2009 petaflop machine will likely be an evolution of SGI's NAS Technology Refresh supercomputer, also announced last week, and slated to be installed this summer. That system, a 20,480-core Altix ICE supercomputer, is designed to provide more than 240 teraflops of computing power. - HP wire

Capacity is all about adding cores....
 
The 2009 petaflop machine will likely be an evolution of SGI's NAS Technology Refresh supercomputer, also announced last week, and slated to be installed this summer. That system, a 20,480-core Altix ICE supercomputer, is designed to provide more than 240 teraflops of computing power. - HP wire

Capacity is all about adding cores....

Yeah, but other things are important too, like improving the design and materials of the processors. Adding cores is only one way to increase throughput. Nonetheless, it does appear that the future of computing lies in parallel processing.
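To give the parallel-processing point a concrete shape, here's a toy sketch using Python's standard multiprocessing module to split one big sum across worker processes. The chunk bounds and worker count are arbitrary illustrations, not anything from the article:

```python
# A toy taste of throughput-by-parallelism: split one big sum across
# worker processes using only the standard library.
from multiprocessing import Pool

def partial_sum(bounds):
    """Sum the integers in the half-open range [lo, hi)."""
    lo, hi = bounds
    return sum(range(lo, hi))

# Four arbitrary chunks covering 0..999,999.
chunks = [(0, 250_000), (250_000, 500_000),
          (500_000, 750_000), (750_000, 1_000_000)]

if __name__ == "__main__":
    with Pool(4) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total == sum(range(1_000_000)))  # same answer, four processes
```

Of course, this only pays off when the per-chunk work dwarfs the cost of shipping data between processes, which is exactly the programming-complexity gap the article is talking about.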
 
You may need a petaflop machine when Microsoft comes out with Windows 10 OS. :D

Seriously, we are a long way off from doing Matrix-type simulations or even a partial human-body simulation for drug effects, not to mention real weather forecasting.

Maybe someone will write an AI program and let the system learn from the internet....

still working on this... lol
 
Capacity is all about adding cores....
If that were true, we'd still be using 386 cores in our supercomputers, just billions of them.

Supercomputers have long involved thousands of CPUs. Roadrunner uses GHz Opterons, and each has GB of RAM.
 
Actually, you could use billions of 386s, like NASA used to do. But while pursuing that design, people simply improved the chips to 586s. Still, you won't see a single-core 1086.

There is not a whole lot of difference between adding cores and adding CPUs; that's splitting hairs. One is inside the chip, the other is outside. My statement still stands.
 