
A Calculating Win for China's New Supercomputer

IRA FLATOW, HOST:

Every six months, one of my next guests ranks the 500 fastest computers in the world, the supercomputers, and back in November 2010, China took number one for the first time with a supercomputer called Milky Way 1. President Obama acknowledged China's feat in his State of the Union address a few months later and said we were facing a Sputnik moment.

PRESIDENT BARACK OBAMA: And now it's our turn. We know what it takes to compete for the jobs and industries of our time. We need to out-innovate, out-educate and out-build the rest of the world.

(APPLAUSE)

FLATOW: Now more than two years later, what are we doing? How have we done? Well, not so much. This week, the latest survey of supercomputers finds China again is on top with its Milky Way 2. That's a machine that's not just slightly better than America's best machine but about twice as fast as our runner-up.

How did we get here? We lost our competitive lead in innovation some time ago. Are we set to lose the race in supercomputing? And you think, well, it's just a computer. What does this say about the state of science and innovation in the U.S. in general? Countries like China and Japan are betting big time on science and technology with investments to match. Can we meet the challenge?

That's what we're going to be talking about with our guests. Horst Simon is editor for the Top500, also deputy lab director at the Lawrence Berkeley National Lab in Berkeley. Rick Stevens is associate lab director for computing, environment and life science research at Argonne National Laboratory outside of Chicago.

We're going to take your calls at 1-800-989-8255. You can also tweet us @scifri, that's @-S-C-I-F-R-I, to talk about China's supercomputer. Have we lost our edge? Can we get it back? The Milky Way 2. Stay with us. We'll be right back after this break.

(SOUNDBITE OF MUSIC)

FLATOW: I'm Ira Flatow; this is SCIENCE FRIDAY from NPR.

(SOUNDBITE OF MUSIC)

FLATOW: This is SCIENCE FRIDAY; I'm Ira Flatow. We're talking about China's new supercomputer, the fastest in the world, and what it means for American computing power and innovation. Do we have anything in the works to rival it? And why do we care? Why do we care that we don't have the fastest computer in the world? We're not the leader in innovation in the world anymore.

Well, we'll talk about those issues. My guests are Horst Simon, editor for the Top500. He's also deputy lab director at the Lawrence Berkeley National Lab in Berkeley, California. Rick Stevens, associate lab director for computing, environment and life science research at Argonne National Lab, that's just outside of Chicago. He's also professor of computer science at the University of Chicago. Welcome, gentlemen, welcome to SCIENCE FRIDAY.

HORST SIMON: Good morning.

RICK STEVENS: Good morning, Ira.

FLATOW: What is - good morning. Tell me what is so important about having the fastest computer.

SIMON: I think it's the symbolic value. It has, of course, some real practical applications, but it is indeed a race, and a lot of countries are in the race because it shows, on a certain objective scale, intellectual prowess and where a country stands. But beyond this symbolic value, I think there is some real technological know-how that is required to get there. And so the Chinese accomplishment of being number one again, for the second time, is really significant.

And I think right now the second time that (unintelligible), Milky Way 2, is in the number one spot is even more significant than the first time around, because the Chinese have demonstrated not only that they are in this race for the long term, but they have added to the new system a number of domestically homegrown technologies that show us they have made significant progress to be in that race.

FLATOW: They are using Intel processors, correct?

SIMON: That is correct. They're using Intel processors and also the Intel accelerators, the Xeon Phi technology. But what is important to notice is that the software system is all homegrown. They have their own Linux-type operating system, Kylin Linux; they have developed programming languages and tools and implemented them on the system. They've built their own interconnect technology. They've built their own front-end processors.

So except for the computational engine, this is a genuinely Chinese system. And the reason everybody's concerned about this is that the next step will be to replace the Intel processors with something that, in maybe two or three years, is also produced domestically in China. So that's why this is a big step forward.

FLATOW: Rick, you work at Argonne National Lab, where the supercomputer Mira lives. That's number five on the most recent list.

STEVENS: It's number five right now. It debuted at number three and then moved down the list. What's interesting...

FLATOW: Were you surprised - let me just ask you, were you surprised that the Chinese took the lead on this?

STEVENS: No, I wasn't surprised, because we had been watching them quite carefully over the last five or six years. And they were not at all secretive about their intent to build this class of machine.

FLATOW: And you have to invest money to do this, do you not?

STEVENS: Yeah, they're investing quite a lot. This machine is - we don't know the number exactly, but we estimate it's about twice as expensive as the machines that we've deployed so far.

FLATOW: What kind of money do you talk about here in building a machine like this?

STEVENS: It's probably on the order of about $300 million.

FLATOW: And the biggest U.S. machines are how many hundreds of million?

STEVENS: Less than 200.

FLATOW: So they set out to invest in becoming number one, then, in this, and putting money into it?

STEVENS: That's right. The technology that this Chinese machine is based on is current technology, and in between the first-generation Milky Way machine and the second generation, of course, the United States and Japan also took the lead for a short time. The U.S. recently deployed two machines in the top five based on the same technology.

So what this really represents, in addition to what Horst said about their interest in producing domestic capability, is that they're also willing to spend more than we are on a single machine.

FLATOW: Don't you expect, though, that there are secret supercomputers being used by our government that are much faster than we know about? You know, just take the surveillance of all the phone and data traffic by the NSA. Don't you think that takes a massive amount of computing power that somewhere there are computers that no one's talking about?

SIMON: Yes, I really think there is more computer power out there than is on the Top500 list. The Top500 list really reflects what's publicly known, what the sites and the vendors acknowledge. And we can take a previous guest here on the show, like Google, for example. Google certainly has huge compute power that's not represented on the list.

What's different, though, is that the applications of all the computers on the Top500 list are large-scale scientific modeling and simulation, and also modeling and simulation in an industrial sense. And these types of supercomputers have very different characteristics that often set them apart from what is used in, say, Internet computing or data processing. So it's a different race.

FLATOW: So here you're talking about specialized computers that could be really big.

SIMON: Yes, so a key ingredient for these computers is what I glossed over very briefly, this interconnect. And this technology makes it possible for, in the Milky Way 2 case, more than 3 million cores to actually cooperate and work on computing tasks. That's a very difficult process, and it requires not just great technology but also applications and software that can use that parallelism.
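
[For readers of this transcript, a minimal toy sketch of what "software that can use that parallelism" means in practice. This is an illustration only, not code from Milky Way 2 or any Top500 system; it simply splits a sum across worker processes with Python's standard multiprocessing module and combines the partial results, the same divide-compute-combine pattern that real parallel applications follow at vastly larger scale.]

    # Toy sketch only: not code from any real supercomputer.
    # Split the work across cores, compute the pieces independently,
    # then combine the partial results.
    from multiprocessing import Pool

    def partial_sum(bounds):
        """Work done independently on one core: sum of squares over one slice."""
        lo, hi = bounds
        return sum(i * i for i in range(lo, hi))

    if __name__ == "__main__":
        n = 10_000_000
        workers = 8  # a real machine would coordinate millions of cores
        step = n // workers
        chunks = [(w * step, n if w == workers - 1 else (w + 1) * step)
                  for w in range(workers)]
        with Pool(workers) as pool:
            total = sum(pool.map(partial_sum, chunks))  # combine partial results
        print(total)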

FLATOW: Rick, you testified on Capitol Hill a few weeks ago, to the House Science Committee, about what it would take to get to the next generation of supercomputing, exascale. Were they receptive? Did they say, well, we'll give you the money for that?

STEVENS: They didn't say they would give us the money, but they were quite receptive to the challenge. But in the U.S., this is a long-term activity: investing in the basic R&D and working cooperatively with many agencies and many computer vendors to keep pushing the technology forward.

You can't just get there by buying machines. We're going to have to dramatically improve power consumption, dramatically improve the performance of memory systems and so forth to do this. And so what I was talking about on the - to the House Science Committee was a strategy for doing that.

FLATOW: But it also takes money and a commitment.

SIMON: It will take - definitely take a commitment, eight to 10 years, and a large investment of money.

If I...

FLATOW: Go ahead, Horst.

SIMON: So Rick and I have worked on exascale since 2006, and what is somewhat frustrating seven years later is the slow pace at which this has progressed. Everybody seems to be very supportive and understands the strategic importance of high-performance computing and of getting to the exascale, in terms of its impact on science, in terms of national security and in terms of economic competitiveness. Yet in general we currently seem to have very slow going when it comes to large-scale science projects in the U.S.

FLATOW: Yeah, we've seen a lot of that going on in other places. Let's talk about this word exascale. Give us the progression of how - what you mean by exascale.

STEVENS: Well, exascale is 10 to the 18th operations per second, or 10 to the 18th bytes. The systems we have today are measured in petaFLOPS or petabytes. They're about 10 to 20 petaFLOPS, and the Chinese machine is about 30 petaFLOPS. Over the last 40 years or so, we've been able to increase the speed of the top supercomputers by about a factor of 1,000 every 10 years.

So we used to have machines that were measured in gigaFLOPS, a billion operations per second, and then teraFLOPS and then petaFLOPS, and now we're targeting exaFLOPS.
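
[A quick back-of-the-envelope check of the scales Stevens describes, assuming, as my own extrapolation rather than anything the guests said, that the historical rate of roughly a 1,000-fold increase every 10 years simply continues:]

    import math

    PETA = 10**15  # petaFLOPS = 10^15 floating-point operations per second
    EXA = 10**18   # exaFLOPS  = 10^18 operations per second

    current = 30 * PETA             # Milky Way 2, roughly 30 petaFLOPS
    speedup_needed = EXA / current  # about 33x more performance required
    # 1,000x per decade means a factor f takes about 10 * log_1000(f) years.
    years = 10 * math.log(speedup_needed, 1000)
    print(f"Need about {speedup_needed:.0f}x more, roughly {years:.1f} years at the historical rate")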

FLATOW: 1-800-989-8255. Let's go to the phones. Bill(ph) in Arcadia, California. Hi Bill.

BILL: Actually, that's Arcata, at the other end of the state.

FLATOW: Oh, I'm sorry.

BILL: I was interested to hear a discussion of the national security implications. It seemed quite a few moments into the show before I heard it mentioned. But we're in an interesting age now where cyber warfare is actually being manifested, and, you know, it represents a threat throughout our national infrastructure. And we're in a time when a massive battleship could easily be crippled by a properly written and inserted piece of code. So this may most importantly be a national security issue.

FLATOW: Horst, would you agree?

SIMON: Yes, although most of the supercomputers that we're talking about are used for national security applications in a different setting. For example, they are used for modeling and simulation of a nuclear stockpile, understanding new materials or fatigue in old materials, this type of application. But I agree with the caller that supercomputers in general have this national security application.

And the important thing to keep in mind is that the supercomputers on the Top500 list that Rick and I are talking about are, in a sense, the vanguard of computers. This is where new technology is tested out, where new technology is proven in a demanding environment by innovative and aggressive users, such as scientists. And once the technology survives this, it can migrate out and be useful to many other applications. So losing that leading edge is, I think, the concern I would have in terms of national security.

FLATOW: You mean just developing the technology, using it as sort of an experimental test bed to make a supercomputer?

STEVENS: Well, you can think of it this way, Ira, these top supercomputers are like time machines. They give us access to a capability that won't be broadly available for five to 10 years. So whoever has the time machine is able to do experiments, able to see into the future deeper and more clearly than those that don't have such machines.

FLATOW: I never heard it quite put that way. It's very nicely put. But people are concerned about cyber warfare and supercomputers possibly being involved in those. Is that a reason itself to develop supercomputers?

STEVENS: Well, in cybersecurity, often, the use of the computer is very specific, and while you can program general purpose machines, such as the ones we're talking about, to be both offensive engines in some sense or defensive engines, that's probably not the most efficient way to engage in cyber warfare. You would more likely use specialized devices.

FLATOW: 1-800-989-8255 is our number. We'll see if we can get a phone call or two in here. Let's go to Bill in Wayne, Pennsylvania. Hi, Bill.

BILL: Hi. How are you doing? Can you hear me OK?

FLATOW: Yeah.

BILL: So my question is really: Should we be focused on putting more and more chips and memory and an OS together to build a bigger and faster computer, or on looking at things like nanoscale or quantum computers, which might give us more bang for the buck further into the future? You can always put more and better and faster chips together. But isn't it really about the efficiency in how you look to the future? I mean, if you go back 50 years and look at what they were doing with, you know, the ENIAC and systems like that compared to where we are now, there's not that much of a difference.

We're still - you know, the hardware isn't that much of a difference. We should probably put our money into more advanced techniques at the fabrication level to get, you know, a thousandfold or greater increase in processing power rather than just throwing more chips at a box.

FLATOW: All right. Let me get an answer. Thanks for the call. Horst?

SIMON: Yeah. Well, I think we should do both. The caller is right that we do need to invest in the technologies that are sometimes called Beyond CMOS. Indeed, we are today still running with a computer architecture that is the heritage of the post-Second World War von Neumann architecture. So this scaling can probably be pushed through for a couple more generations of computer systems, but ultimately, yes, we do need to invest in new technology, such as quantum computing or other alternative technologies.

But we cannot stop supporting current high-end computing, because any of these new technologies are at least a decade away from reaching a production-ready, usable level. So we need to continue with the scaling up to the exascale that Rick and I have discussed.

FLATOW: I'm Ira Flatow. This is SCIENCE FRIDAY from NPR. In the few minutes left, you talked about what this says to the world about American innovation and the state of innovation. Can you talk a little bit more about the importance of that?

STEVENS: Well, certainly, it's a very visible sign of a nation's intent to participate in leading-edge science and technology development. Even though right now China is number one on the list, the United States has vastly more machines on the list. So you can argue that we're not falling behind in our sort of global commitment, but it's a very accurate barometer of our intent to invest to stay at that leading edge. Many, many disciplines require high-performance computing to make progress, whether it's designing new materials, designing better drugs, understanding fundamental science or designing better security for our soldiers; they all need high-performance computing. And so it's a ubiquitous technology.

FLATOW: And do you think that our people, our government, whoever would be responsible for helping fund or make sure that technology - who's ahead, do they understand this problem?

STEVENS: I think they do understand. I was just in Washington earlier this week, again, on the House side, and there was very general support. I think the challenge is simply a matter of budget priority: trading off investments in high-performance computing or advanced computing technology against other priorities. And that priority discussion really needs to take place.

FLATOW: Well, why not run it through the Pentagon? They seem to get a lot of money that even they're not asking for. I mean, if you want to get a high-tech project funded, say you need it for, you know, for the military.

STEVENS: Well, historically, DARPA has invested in this area. I think in the last five or six years or so DARPA has changed to be more focused on the war fighter directly and less on the broader infrastructure. So I think a national program is needed that combines the assets of DOE and Department of Defense and the National Science Foundation and other agencies to do this because everybody is in it together.

FLATOW: You do hold out hope that that's what's going to happen.

STEVENS: I think it's going to have to happen that way. We need to get a consensus in Washington to do this because it's so critical.

FLATOW: Horst, any last comments about where you see this happening?

SIMON: Well, I'm less optimistic unfortunately than Rick because I put my money in a bet against exascale happening before 2020. And the reason why I say this is because I've observed the slow pace of progress that we have made in the last three or four years since 2010. So I'm afraid that we are on a path to losing out, and this is not just in high-performance computing getting towards exascale, but across a wide range of science.

Just as Rick said, generally, the support is there, the understanding is there, the goodwill is there. But we live in a very budget-constrained world in the U.S. right now, whereas some of our friends in terms of science, but competitors when it comes to technology, are investing much more heavily.

FLATOW: Like China.

SIMON: Like China.

FLATOW: OK. You've got the last word, Horst. I want to thank both of you very much. Horst Simon is editor for the Top500. He's also deputy lab director at the Lawrence Berkeley National Lab in Berkeley. Rick Stevens is associate lab director for computing, environment and life science research at Argonne National Lab, that's outside of Chicago, and also professor of computer science at the University of Chicago. Thank you, gentlemen, for taking time to be with us.

STEVENS: Thank you.

FLATOW: You're welcome.

SIMON: Thank you.

FLATOW: We're going to take a short break, and we'll talk about - we're going to talk about food, food in the supermarket and maybe why it should have a date timestamp for you. It's a really interesting topic. We'll be right back after the break. Stay with us. I'm Ira Flatow. This is SCIENCE FRIDAY from NPR News. Transcript provided by NPR, Copyright NPR.