There’s a group based at Duke University that thinks it can out-do IBM in the quantum-computing game, and it just got another $15 million in funding from the U.S. government.
The National Science Foundation grant is helping underwrite a consortium led by professors Jungsang Kim and Ken Brown that’s previously received backing from the federal Intelligence Advanced Research Projects Activity.
Kim said the group is developing a quantum computer with “up to a couple dozen qubits” of computational power and reckons it’s a year or so from being operational. The word qubit is the quantum-computing world’s equivalent of conventional computing’s “bit” when it comes to gauging processing ability, and each additional qubit represents a doubling of that power.
“One of the goals of this [grant] is to establish the hardware so we can allow researchers to work on the software and systems optimization,” Kim said of the National Science Foundation grant the agency awarded on Aug. 6.
Two or three dozen qubits might not sound like a lot when IBM says it has built and tested a 50-qubit machine. But the Duke-led research group is approaching the problem from an entirely different angle.
The “trapped-ion” design it’s using could hold qubits steady in its internal memory for much longer than superconducting designs like those IBM is working on can manage, Brown said.
Superconducting designs — which operate at extremely cold temperatures — “are a bit faster” than trapped-ion ones and are the focus of “a much larger industrial effort,” Brown said.
That speed-versus-resilience tradeoff could matter because IBM says its machines can hold a qubit steady in memory for only up to about 90 microseconds. That means processing runs have to be short, on the order of no more than a couple of seconds total.
“One thing that’s becoming clear in the community is, the thing we need to scale is not just the number of qubits but also the quality of operations,” said Brown, who in January traded a faculty post at Georgia Tech for a new one at Duke. “If you have a huge number of qubits but the operations are not very good, you effectively have a bad classical computer.”
Kim added that designers working on quantum computers have to look for the same kind of breakthrough in thinking about the technology that the Wright brothers brought to the development of flight.
Just as the Wrights and other people working in the field in the late 19th and early 20th centuries figured out that mimicking birds was a developmental dead end, the builders of quantum computers “have to start with something that’s fundamentally quantum and build the right technology to scale it,” Kim said. “You don’t build quantum computers by mimicking classical computers.”
But for now, the government agencies that are subsidizing the field are backing different approaches and waiting to see what pans out.
The Aug. 6 grant is the third big one Kim’s lab has secured, building on awards from IARPA in 2010 and 2016 that together brought it about $54.5 million in funding. But in both those rounds of funding, teams from IBM were also among those getting awards from the federal agency, which funds what it calls “high-risk/high-payoff” research for the intelligence community.
The stakes are so high because quantum computing could become a breakthrough technology. It exploits the physics of subatomic particles in hopes of developing a machine that can process data that exists in multiple states at once, rather than the binary 1 or 0 of traditional computing.
IBM and the government aren’t the only heavy hitters involved. Google has a quantum-computing project of its own that’s grown with help from IARPA funding. Kim and other people involved in the Duke-led group have also formed a company called IonQ that’s received investment from Google and Amazon.
The Duke-led group also includes teams from the University of Maryland, the University of Chicago and Tufts University that are working on hardware, software and applications development, respectively, Duke officials say. Researchers from the University of New Mexico, MIT, the National Institute of Standards and Technology and the University of California-Berkeley are also involved.
Duke doesn’t have quantum computing all to itself in the Triangle, as in the spring IBM made N.C. State University part of its Q Network, a group of businesses, universities and government agencies that can use IBM’s quantum machines via the cloud.
But the big difference between the N.C. State and Duke efforts is that at State, the focus is on developing the future workforce and beginning to push software development, while at Duke it’s more fundamentally about trying to develop the technology itself.
Not that software is a side issue, mind.
“If I had a quantum computer with 60 qubits, I know there are algorithms I can run on it that I can’t simulate with my regular computers,” Brown said, explaining that the technology requires new thinking there, too. “That’s a weird place to be.”
The quantum project is important enough that Duke has backed it with faculty hires. Brown had been collaborating with Kim’s group for a while, but elected to move to Duke from Georgia Tech after Duke officials decided to conduct what Kim termed “a cluster hire” of quantum specialists.
Brown joined Kim in the Pratt School of Engineering’s electrical and computer engineering department. A search for someone to fill an endowed chair in physics continues.
Another professor involved, Iman Marvian, also joined the Duke faculty at the start of 2018 thanks to the university’s previously announced “quantitative initiative.” A quantum information theorist, Marvian received a joint appointment in physics and engineering. He came to Duke from MIT after a post-doctoral stint at the Boston-area school.