China’s New Tianhe-2 Supercomputer

John_Mann

Active Member
Local time
Today 4:24 AM
Joined
Feb 23, 2013
Messages
376
---
Location
Brazil
In a massive escalation of the supercomputing arms race, China has built Tianhe-2, a supercomputer capable of 33.86 petaflops — almost twice as fast as the US Department of Energy’s Titan, and topping the official Top 500 list of supercomputers by some margin. The US isn’t scheduled to build another large supercomputer until 2015, suggesting China will hold pole position for a long time to come. The computer has 32,000 Ivy Bridge Xeon CPUs and 48,000 Xeon Phi accelerator boards for a total of 3,120,000 compute cores, which are decked out with 1.4 petabytes of RAM. And of course the operating system is Linux.

http://www.extremetech.com/computin...-shocks-the-world-by-arriving-two-years-early
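As a quick sanity check on those figures (the 12-core-per-Xeon and 57-core-per-Phi counts below are my assumptions, consistent with the published Tianhe-2 node configuration, not something stated in the article):

```python
# Rough sanity check of the Tianhe-2 figures quoted above.
# Assumed per-chip core counts (not from the article): 12 cores per
# Ivy Bridge Xeon, 57 cores per Xeon Phi accelerator card.
xeons, phis = 32_000, 48_000
xeon_cores, phi_cores = 12, 57

total_cores = xeons * xeon_cores + phis * phi_cores
print(f"Total compute cores: {total_cores:,}")  # 3,120,000 -- matches the article

# "Almost twice as fast as Titan": Titan's ~17.6 PFLOPS Linpack score is my
# figure, not the article's.
print(f"Tianhe-2 / Titan: {33.86 / 17.6:.2f}x")  # ~1.92x
```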
 

Architect

Professional INTP
Local time
Yesterday 9:24 PM
Joined
Dec 25, 2010
Messages
6,691
---
I wonder who will get to exascale first.

Interesting to consider that a modern iPad is as powerful as the early Cray supercomputers.
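For a ballpark on that iPad-vs-Cray comparison (all of the numbers below are rough estimates on my part, not anything from this thread):

```python
# Back-of-the-envelope comparison; every figure here is approximate.
cray_1_gflops = 0.16   # Cray-1 (1976) peak, roughly 160 MFLOPS
cray_2_gflops = 1.9    # Cray-2 (1985) peak, roughly 1.9 GFLOPS
ipad_gflops = 70.0     # rough estimate for a 2012-era iPad GPU

print(f"iPad vs Cray-1: ~{ipad_gflops / cray_1_gflops:.0f}x")  # hundreds of times faster
print(f"iPad vs Cray-2: ~{ipad_gflops / cray_2_gflops:.0f}x")  # tens of times faster
```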
 

Duxwing

I've Overcome Existential Despair
Local time
Yesterday 11:24 PM
Joined
Sep 9, 2012
Messages
3,783
---
I wonder who will get to exascale first.

Interesting to consider that a modern iPad is as powerful as the early Cray supercomputers.

Unfortunately, sufficiently small semiconductors suffer from quantum tunnelling, with electrons leaping across the atoms-thick layers of copper and silicon, so computing power will eventually be directly proportional to volume.

-Duxwing
 

Architect

Professional INTP
Local time
Yesterday 9:24 PM
Joined
Dec 25, 2010
Messages
6,691
---
Unfortunately, sufficiently small semiconductors suffer from quantum tunnelling, with electrons leaping across the atoms-thick layers of copper and silicon, so computing power will eventually be directly proportional to volume.

Some physicists have estimated the computational limits of matter. A rock, for example, is a computer: it has a myriad of atoms (not unlike transistors) changing state. The computation is chaotic, though, and isn't doing anything useful other than being a rock.

At any rate, Kurzweil covers it in his book: raw matter does have a computational capacity, though at present we are far from reaching it. Additionally, I suspect the estimates for computational density are missing some tricks. Consider holographic computing: imagine our rock is composed of some regular crystalline structure. I believe the estimates Kurzweil cites only look at the phonon density. Why not treat it as an EM lattice? Then you get orders upon orders of magnitude more computation if the lattice responds to different frequencies.

Maybe that's naive, since I'm not an expert in these things, but I think it's a viable possibility. Anyhow, we have plenty of room to go. Just consider the human brain, which has, what, 10^19 cps? And that density and size gives us, us. Again, Kurzweil has it in his book.

Even with pedestrian silicon, exascale will be hit in 2015-2020.
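Putting those numbers side by side (1 exaFLOPS = 10^18 FLOPS; Tianhe-2's 33.86 PFLOPS is from the first post; the 10^19 cps brain figure is the estimate cited above, so treat all of this as order-of-magnitude only):

```python
# Order-of-magnitude comparison of the figures mentioned in this thread.
tianhe2_flops = 33.86e15   # Tianhe-2 Linpack result, from the first post
exascale_flops = 1e18      # 1 exaFLOPS
brain_cps = 1e19           # the Kurzweil-style brain estimate quoted above

print(f"Exascale / Tianhe-2: ~{exascale_flops / tianhe2_flops:.0f}x")   # ~30x to go
print(f"Brain estimate / Tianhe-2: ~{brain_cps / tianhe2_flops:.0f}x")  # ~300x on this estimate
```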
 

Duxwing

I've Overcome Existential Despair
Local time
Yesterday 11:24 PM
Joined
Sep 9, 2012
Messages
3,783
---
Some physicists have estimated the computational limits of matter. A rock, for example, is a computer: it has a myriad of atoms (not unlike transistors) changing state. The computation is chaotic, though, and isn't doing anything useful other than being a rock.

At any rate, Kurzweil covers it in his book: raw matter does have a computational capacity, though at present we are far from reaching it. Additionally, I suspect the estimates for computational density are missing some tricks. Consider holographic computing: imagine our rock is composed of some regular crystalline structure. I believe the estimates Kurzweil cites only look at the phonon density. Why not treat it as an EM lattice? Then you get orders upon orders of magnitude more computation if the lattice responds to different frequencies.

Maybe that's naive, since I'm not an expert in these things, but I think it's a viable possibility. Anyhow, we have plenty of room to go. Just consider the human brain, which has, what, 10^19 cps? And that density and size gives us, us. Again, Kurzweil has it in his book.

So how are we going to tap that?

Even with pedestrian silicon, exascale will be hit in 2015-2020.

Cool! But at what scale factor?

-Duxwing
 

John_Mann

Active Member
Local time
Today 4:24 AM
Joined
Feb 23, 2013
Messages
376
---
Location
Brazil
Yeah, it's a golden age for hardware. But software seems to be far behind. And we need more knowledge about how the human brain works in real time at the cell level.

IBM Watson uses only 80 teraflops. We could have built Watson in the '90s!

But maybe better hardware will enable us to run wild evolutionary programming. I don't know.
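Taking the 80-teraflop Watson figure above at face value (it's approximate, and peak FLOPS says nothing about the software side), the gap to Tianhe-2 illustrates how far hardware has run ahead:

```python
# How many Watson-scale systems fit inside Tianhe-2's Linpack number?
# The 80 TFLOPS Watson figure is the one quoted above, treated as approximate.
watson_tflops = 80.0
tianhe2_tflops = 33_860.0  # 33.86 PFLOPS

print(f"~{tianhe2_tflops / watson_tflops:.0f} Watson-scale systems")  # ~423
```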
 

Thurlor

Nutter
Local time
Today 3:24 PM
Joined
Jul 8, 2012
Messages
643
---
Location
Victoria, Australia
A rock, for example, is a computer: it has a myriad of atoms (not unlike transistors) changing state. The computation is chaotic, though, and isn't doing anything useful other than being a rock.

I'd come to this conclusion myself some time ago (though I always use the example of an ocean of water).

What would be required to determine whether or not there was an 'internal consistency' within the 'computational realm' of 'stuff'?
 

walfin

Democrazy
Local time
Today 12:24 PM
Joined
Mar 3, 2008
Messages
2,436
---
Location
/dev/null
Haha let's see if they can actually do anything useful with it.
 

Thurlor

Nutter
Local time
Today 3:24 PM
Joined
Jul 8, 2012
Messages
643
---
Location
Victoria, Australia
How much computational power is there in the average domesticated beehive?
 