Archive for the ‘Quantum Computer’ Category
IBM plans to build a 1121 qubit system. What does this technology mean? – The Hindu
Posted: September 26, 2020 at 9:52 am
Last week, IBM said it will build Quantum Condor, a 1,121-qubit quantum computer, by the end of 2023. The company claims the system will be able to control the behaviour of atoms to run applications and help generate world-changing materials that could transform industries. IBM says its full-stack quantum computer can be deployed via the cloud and programmed from any part of the world.
The technology company is developing a super-fridge, internally codenamed Goldeneye, to house the computer. The 10-foot-tall and 6-foot-wide refrigerator is being designed for a million-qubit system.
What are qubits and quantum computers?
For certain classes of problems, quantum computers can process data exponentially faster than personal computers. They deploy non-intuitive methods, coupled with enormous amounts of computation, to tackle otherwise intractable problems. These machines operate using qubits, which are analogous to the bits in personal computers.
The similarity ends there. The way quantum machines solve a problem is very different from how a traditional machine does.
A classical computer works through a problem step by step: given a command, it attempts every possible move, one after another, turning back at dead ends, until it finds a solution.
Quantum computers instead exploit superposition. This allows them to exist in multiple states at once and to explore many candidate solutions simultaneously. Qubits, the fundamental units of data in quantum computing, are what enable these machines to compute this way.
In regular computers, a bit holds either a 0 or a 1, so a pair of bits can take one of four possible combinations: 00, 01, 10 or 11. Only one of those combinations can exist at a single point in time, which limits processing speed.
But in quantum machines, two qubits can represent those same four combinations, and all four can exist at the same time. This is what helps these systems run faster.
More generally, n qubits can represent 2^n states. So 2 qubits represent 4 states, 3 qubits 8 states, 4 qubits 16 states, and so on. Now imagine how many states IBM's 1,121-qubit system can represent.
An ordinary 64-bit computer would take hundreds of years to cycle through all of these combinations. And that's exactly why quantum computers are being built: to attack intractable problems and calculations that are practically impossible for classical computers.
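To make the exponential growth concrete, here is a minimal Python sketch (illustrative only, not from the article) that simply counts the 2^n basis states available to an n-qubit register.

```python
# A minimal sketch (illustrative, not from the article): n qubits have 2**n
# basis states, so the description of their joint state grows exponentially.
def num_states(n_qubits: int) -> int:
    return 2 ** n_qubits

for n in (2, 3, 4, 64):
    print(n, "qubits ->", num_states(n), "basis states")

# IBM's planned 1,121-qubit system: the state count has hundreds of digits.
print("1121 qubits ->", len(str(num_states(1121))), "decimal digits")
```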
To make such large and difficult calculations happen, the qubits need to be linked together through quantum entanglement. Entangled qubits, however far apart, are correlated in such a way that no one of them can be described without referencing the others.
Why are qubits difficult?
One of the key challenges of computing with qubits is the possibility of losing data during transitions. In addition, assembling qubits, and writing information to and reading it from them, is a difficult task.
The fundamental units demand special attention, including near-perfect isolation and temperatures held about one-hundredth of a degree above absolute zero. Despite strict monitoring, their highly sensitive nature means they can lose superposition from even the slightest disturbance. This makes programming very tricky.
Since quantum computers are programmed using a sequence of logic gates of various kinds, programmes need to run quickly before qubits lose coherence. The combination of superposition and entanglement makes this process a whole lot harder.
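As a rough illustration of what a sequence of logic gates looks like, here is a minimal numpy sketch (an illustrative toy, not how real quantum hardware is programmed): a Hadamard gate puts one qubit into superposition and a CNOT then entangles the pair into a Bell state.

```python
# Illustrative toy (not the article's code): a two-qubit program expressed as a
# sequence of logic gates acting on a state vector of 2**2 = 4 amplitudes.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)                                  # identity on the untouched qubit
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                # flips qubit 2 when qubit 1 is 1

state = np.array([1, 0, 0, 0], dtype=complex)  # start in |00>
state = np.kron(H, I) @ state                  # superposition on the first qubit
state = CNOT @ state                           # entangle the two qubits
print(np.round(state, 3))                      # ~[0.707, 0, 0, 0.707]: (|00> + |11>)/sqrt(2)
```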
Other companies building quantum computers
There has been a lot of interest in quantum computing in recent times. In 2016, IBM put the first quantum computer in the cloud. Google unveiled its Sycamore quantum processor last year and claimed it had achieved quantum supremacy.
This month, IBM released its 65-qubit IBM Quantum Hummingbird processor to IBM Q Network members, and the company is planning to surpass the 100-qubit milestone with its 127-qubit IBM Quantum Eagle processor next year. It is also planning to roll out a 433-qubit IBM Quantum Osprey system in 2022.
D-Wave systems, a Canada-based quantum computing company, launched its cloud service in India and Australia this year. It gives researchers and developers in these two countries real-time access to its quantum computers.
Honeywell recently outlined its quantum system, and other technology companies like Microsoft and Intel are also chasing commercialisation.
The ongoing experiments and analysis speak volumes about how tech companies view quantum computers as the next big breakthrough in computing.
Quantum computers will likely deliver tremendous speed, and will help in solving problems related to optimisation in defence, finance, and other industries.
IBM views the 1000-qubit mark as the point from where the commercialisation of quantum computers can take off.
Read the original here:
IBM plans to build a 1121 qubit system. What does this technology mean? - The Hindu
Inaugural OSA Quantum 2.0 Conference Featured Talks on Emerging Technologies – Novus Light Technologies Today
Posted: at 9:52 am
Published on 22 September 2020
The unique role of optics and photonics in driving quantum research and technologies was featured in presentations for the inaugural OSA Quantum 2.0 Conference held 14-17 September. The all-virtual event, presented concurrently with the 2020 Frontiers in Optics and Laser Science APS/DLS (FiO + LS) Conference, drew almost 2,500 registrants from more than 70 countries.
Live and pre-recorded technical presentations, on topics ranging from quantum computing and simulation to quantum sensing, were available to registrants across the globe at no cost. The conference engaged scientists, engineers and others addressing grand challenges in building a quantum science and technology infrastructure.
"The meeting succeeded in bringing together scientists from academia, industry and government labs in a very constructive way," said conference co-chair Michael Raymer of the University of Oregon, USA. "The high quality of the talks, along with the facilitation by the presiders and OSA staff, moves us closer to the goal of an open, global ecosystem for advancing quantum information science and technology."
Marissa Giustina, senior research scientist and quantum electronics engineer with Google AI Quantum, described the company's efforts to build a quantum computer in her keynote talk. Google's goal was to build a prototype system that could enter a space where no classical computer can go, at a size of about 50 qubits. To create a viable system, Giustina said, there must be strong collaboration between algorithm and hardware developers.
"Quantum Algorithms for Finite Energies and Temperatures" was the focus of a talk by Ignacio Cirac, director of the Theory Division at the Max Planck Institute of Quantum Optics and Honorary Professor at the Technical University of Munich. He described advances in quantum simulators for addressing problems with the dynamics of physical quantum systems. His recent work focuses on developing algorithms for use on quantum simulators to solve many-body problems.
Solutions to digital security challenges were the topic of a talk by Gregoire Ribordy, co-founder and CEO of ID Quantique, Switzerland. He described quantum security techniques, technology and strengths in his keynote talk titled "Quantum Technologies for Long-term Data Security." His work centers on the use of quantum-safe cryptography, quantum key distribution and commercially available quantum random number generators in data security.
Mikhail Lukin, co-director of the Harvard Quantum Initiative in Science and Engineering and co-director of the Harvard-MIT Center for Ultracold Atoms, USA, described progress towards quantum repeaters for long-distance quantum communication. He also discussed a new platform for exploring synthetic quantum matter and quantum communication systems based on nanophotonics with atom-like systems.
Conference-wide sponsors for the combined OSA Quantum 2.0 Conference and FiO + LS Conference included Facebook Reality Labs, Toptica Photonics and Oz Optics. Registrants interacted with more than three dozen companies in the virtual exhibit to learn about their latest technologies from instruments for quantum science and education to LIDAR and remote sensing applications.
Registrants can continue to benefit from conference resources for 60 days. Recordings of the technical sessions, the e-Posters Gallery and the Virtual Exhibit will be available on-demand on the FiO + LS website.
See more here:
Could Quantum Computing Progress Be Halted by Background Radiation? – Singularity Hub
Posted: September 1, 2020 at 10:55 am
Doing calculations with a quantum computer is a race against time, thanks to the fragility of the quantum states at their heart. And new research suggests we may soon hit a wall in how long we can hold them together thanks to interference from natural background radiation.
While quantum computing could one day enable us to carry out calculations beyond even the most powerful supercomputer imaginable, we're still a long way from that point. And a big reason for that is a phenomenon known as decoherence.
The superpowers of quantum computers rely on holding the qubits (quantum bits) that make them up in exotic quantum states like superposition and entanglement. Decoherence is the process by which interference from the environment causes them to gradually lose their quantum behavior and any information that was encoded in them.
It can be caused by heat, vibrations, magnetic fluctuations, or any host of environmental factors that are hard to control. Currently we can keep superconducting qubits (the technology favored by the field's leaders like Google and IBM) stable for up to 200 microseconds in the best devices, which is still far too short to do any truly meaningful computations.
But new research from scientists at Massachusetts Institute of Technology (MIT) and Pacific Northwest National Laboratory (PNNL), published last week in Nature, suggests we may struggle to get much further. They found that background radiation from cosmic rays and more prosaic sources like trace elements in concrete walls is enough to put a hard four-millisecond limit on the coherence time of superconducting qubits.
"These decoherence mechanisms are like an onion, and we've been peeling back the layers for the past 20 years, but there's another layer that, left unabated, is going to limit us in a couple years, which is environmental radiation," William Oliver from MIT said in a press release. "This is an exciting result, because it motivates us to think of other ways to design qubits to get around this problem."
Superconducting qubits rely on pairs of electrons flowing through a resistance-free circuit. But radiation can knock these pairs out of alignment, causing them to split apart, which is what eventually results in the qubit decohering.
To determine how significant of an impact background levels of radiation could have on qubits, the researchers first tried to work out the relationship between coherence times and radiation levels. They exposed qubits to irradiated copper whose emissions dropped over time in a predictable way, which showed them that coherence times rose as radiation levels fell up to a maximum of four milliseconds, after which background effects kicked in.
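A simple way to picture that saturation is to treat decoherence rates from independent sources as additive. The sketch below is an illustrative model only (the decay constant and starting coherence time are assumed numbers, not values from the Nature paper): as the irradiated copper's contribution decays away, the predicted coherence time levels off at the background limit.

```python
# Illustrative model (assumed numbers, not from the paper): decoherence rates
# from independent sources add, so coherence time saturates once the decaying
# copper source falls below the natural-radiation background.
T_BACKGROUND_LIMIT = 4e-3     # ~4 ms limit attributed to background radiation
HALF_LIFE_DAYS = 10.0         # assumed decay constant of the irradiated source

def coherence_time(days: float, t_source_initial: float = 2e-4) -> float:
    rate_source = (1.0 / t_source_initial) * 0.5 ** (days / HALF_LIFE_DAYS)
    rate_background = 1.0 / T_BACKGROUND_LIMIT
    return 1.0 / (rate_source + rate_background)   # rates add, times do not

for d in (0, 10, 30, 60, 120):
    print(f"day {d:3d}: coherence limited to ~{coherence_time(d) * 1e3:.2f} ms")
```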
To check if this coherence time was really caused by the natural radiation, they built a giant shield out of lead brick that could block background radiation to see what happened when the qubits were isolated. The experiments clearly showed that blocking the background emissions could boost coherence times further.
At the minute, a host of other problems like material impurities and electronic disturbances cause qubits to decohere before these effects kick in, but given the rate at which the technology has been improving, we may hit this new wall in just a few years.
"Without mitigation, radiation will limit the coherence time of superconducting qubits to a few milliseconds, which is insufficient for practical quantum computing," Brent VanDevender from PNNL said in a press release.
Potential solutions to the problem include building radiation shielding around quantum computers or locating them underground, where cosmic rays aren't able to penetrate so easily. But if you need a few tons of lead or a large cavern in order to install a quantum computer, that's going to make it considerably harder to roll them out widely.
It's important to remember, though, that this problem has only been observed in superconducting qubits so far. In July, researchers showed they could get a spin-orbit qubit implemented in silicon to last for about 10 milliseconds, while trapped ion qubits can stay stable for as long as 10 minutes. And MIT's Oliver says there's still plenty of room for building more robust superconducting qubits.
"We can think about designing qubits in a way that makes them rad-hard," he said. "So it's definitely not game-over, it's just the next layer of the onion we need to address."
View post:
Could Quantum Computing Progress Be Halted by Background Radiation? - Singularity Hub
Fermilab to lead $115 million National Quantum Information Science Research Center to build revolutionary quantum computer with Rigetti Computing,…
Posted: at 10:55 am
One of the goals of the Superconducting Quantum Materials and Systems Center is to build a beyond-state-of-the-art quantum computer based on superconducting technologies. The center also will develop new quantum sensors, which could lead to the discovery of the nature of dark matter and other elusive subatomic particles.
The U.S. Department of Energy's Fermilab has been selected to lead one of five national centers to bring about transformational advances in quantum information science as a part of the U.S. National Quantum Initiative, announced the White House Office of Science and Technology Policy, the National Science Foundation and the U.S. Department of Energy today.
The initiative provides the new Superconducting Quantum Materials and Systems Center funding with the goal of building and deploying a beyond-state-of-the-art quantum computer based on superconducting technologies. The center also will develop new quantum sensors, which could lead to the discovery of the nature of dark matter and other elusive subatomic particles. Total planned DOE funding for the center is $115 million over five years, with $15 million in fiscal year 2020 dollars and outyear funding contingent on congressional appropriations. SQMS will also receive an additional $8 million in matching contributions from center partners.
The SQMS Center is part of a $625 million federal program to facilitate and foster quantum innovation in the United States. The 2018 National Quantum Initiative Act called for a long-term, large-scale commitment of U.S. scientific and technological resources to quantum science.
The revolutionary leaps in quantum computing and sensing that SQMS aims for will be enabled by a unique multidisciplinary collaboration that includes 20 partners: national laboratories, academic institutions and industry. The collaboration brings together world-leading expertise in all key aspects: from identifying qubits' quality limitations at the nanometer scale, to fabrication and scale-up capabilities into multiqubit quantum computers, to the exploration of new applications enabled by quantum computers and sensors.
"The breadth of the SQMS physics, materials science, device fabrication and characterization technology, combined with the expertise in large-scale integration capabilities by the SQMS Center, is unprecedented for superconducting quantum science and technology," said SQMS Deputy Director James Sauls of Northwestern University. "As part of the network of National QIS Research centers, SQMS will contribute to U.S. leadership in quantum science for the years to come."
SQMS researchers are developing long-coherence-time qubits based on Rigetti Computing's state-of-the-art quantum processors. Image: Rigetti Computing
At the heart of SQMS research will be solving one of the most pressing problems in quantum information science: the length of time that a qubit, the basic element of a quantum computer, can maintain information, also called quantum coherence. Understanding and mitigating sources of decoherence that limit performance of quantum devices is critical to engineering in next-generation quantum computers and sensors.
"Unless we address and overcome the issue of quantum system decoherence, we will not be able to build quantum computers that solve new complex and important problems. The same applies to quantum sensors with the range of sensitivity needed to address long-standing questions in many fields of science," said SQMS Center Director Anna Grassellino of Fermilab. "Overcoming this crucial limitation would allow us to have a great impact in the life sciences, biology, medicine, and national security, and enable measurements of incomparable precision and sensitivity in basic science."
The SQMS Center's ambitious goals in computing and sensing are driven by Fermilab's achievement of world-leading coherence times in components called superconducting cavities, which were developed for particle accelerators used in Fermilab's particle physics experiments. Researchers have expanded the use of Fermilab cavities into the quantum regime.
"We have the most coherent (by a factor of more than 200) 3-D superconducting cavities in the world, which will be turned into quantum processors with unprecedented performance by combining them with Rigetti's state-of-the-art planar structures," said Fermilab scientist Alexander Romanenko, SQMS technology thrust leader and Fermilab SRF program manager. "This long coherence would not only enable qubits to be long-lived, but it would also allow them to be all connected to each other, opening qualitatively new opportunities for applications."
The SQMS Center's goals in computing and sensing are driven by Fermilab's achievement of world-leading coherence times in components called superconducting cavities, which were developed for particle accelerators used in Fermilab's particle physics experiments. Photo: Reidar Hahn, Fermilab
To advance the coherence even further, SQMS collaborators will launch a materials-science investigation of unprecedented scale to gain insights into the fundamental limiting mechanisms of cavities and qubits, working to understand the quantum properties of superconductors and other materials used at the nanoscale and in the microwave regime.
"Now is the time to harness the strengths of the DOE laboratories and partners to identify the underlying mechanisms limiting quantum devices in order to push their performance to the next level for quantum computing and sensing applications," said SQMS Chief Engineer Matt Kramer, Ames Laboratory.
Northwestern University, Ames Laboratory, Fermilab, Rigetti Computing, the National Institute of Standards and Technology, the Italian National Institute for Nuclear Physics and several universities are partnering to contribute world-class materials science and superconductivity expertise to target sources of decoherence.
SQMS partner Rigetti Computing will provide crucial state-of-the-art qubit fabrication and full stack quantum computing capabilities required for building the SQMS quantum computer.
"By partnering with world-class experts, our work will translate ground-breaking science into scalable superconducting quantum computing systems and commercialize capabilities that will further the energy, economic and national security interests of the United States," said Rigetti Computing CEO Chad Rigetti.
SQMS will also partner with the NASA Ames Research Center quantum group, led by SQMS Chief Scientist Eleanor Rieffel. Their strengths in quantum algorithms, programming and simulation will be crucial to use the quantum processors developed by the SQMS Center.
"The Italian National Institute for Nuclear Physics has been successfully collaborating with Fermilab for more than 40 years and is excited to be a member of the extraordinary SQMS team," said INFN President Antonio Zoccoli. "With its strong know-how in detector development, cryogenics and environmental measurements, including the Gran Sasso national laboratories, the largest underground laboratory in the world devoted to fundamental physics, INFN looks forward to exciting joint progress in fundamental physics and in quantum science and technology."
"Fermilab is excited to host this National Quantum Information Science Research Center and work with this extraordinary network of collaborators," said Fermilab Director Nigel Lockyer. "This initiative aligns with Fermilab and its mission. It will help us answer important particle physics questions, and, at the same time, we will contribute to advancements in quantum information science with our strengths in particle accelerator technologies, such as superconducting radio-frequency devices and cryogenics."
"We are thankful and honored to have this unique opportunity to be a national center for advancing quantum science and technology," Grassellino said. "We have a focused mission: build something revolutionary. This center brings together the right expertise and motivation to accomplish that mission."
The Superconducting Quantum Materials and Systems Center at Fermilab is supported by the DOE Office of Science.
Fermilab is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit science.energy.gov.
More here:
The future of artificial intelligence and quantum computing – Military & Aerospace Electronics
Posted: at 10:55 am
NASHUA, N.H. - Until the 21st Century, artificial intelligence (AI) and quantum computers were largely the stuff of science fiction, although quantum theory and quantum mechanics had been around for about a century. A century of great controversy, largely because Albert Einstein rejected quantum theory as originally formulated, leading to his famous statement, "God does not play dice with the universe."
Today, however, the debate over quantum computing is largely about when, not if, these kinds of devices will come into full operation. Meanwhile, other forms of quantum technology, such as sensors, already are finding their way into military and civilian applications.
"Quantum technology will be as transformational in the 21st Century as harnessing electricity was in the 19th," Michael J. Biercuk, founder and CEO of Q-CTRL Pty Ltd in Sydney, Australia, and professor of Quantum Physics & Quantum Technologies at the University of Sydney, told the U.S. Office of Naval Research in a January 2019 presentation.
On that, there is virtually universal agreement. But when and how remains undetermined.
For example, asked how and when quantum computing eventually may be applied to high-performance embedded computing (HPEC), Tatjana Curcic, program manager for Optimization with Noisy Intermediate-Scale Quantum devices (ONISQ) of the U.S. Defense Advanced Research Projects Agency in Arlington, Va., says it's an open question.
"Until just recently, quantum computing stood on its own, but as of a few years ago people are looking more and more into hybrid approaches," Curcic says. "I'm not aware of much work on actually getting quantum computing into HPEC architecture, however. It's definitely not mainstream, probably because it's too early."
As to how quantum computing eventually may influence the development, scale, and use of AI, she adds:
"That's another open question. Quantum machine learning is a very active research area, but is quite new. A lot of people are working on that, but it's not clear at this time what the results will be. The interface between classical data, which AI is primarily involved with, and quantum computing is still a technical challenge."
Quantum information processing
According to DARPA's ONISQ webpage, the program aims to exploit quantum information processing before fully fault-tolerant quantum computers are realized.
This quantum computer based on superconducting qubits is inserted into a dilution refrigerator and cooled to a temperature less than 1 Kelvin. It was built at IBM Research in Zurich.
"This effort will pursue a hybrid concept that combines intermediate-sized quantum devices with classical systems to solve a particularly challenging set of problems known as combinatorial optimization. ONISQ seeks to demonstrate the quantitative advantage of quantum information processing by leapfrogging the performance of classical-only systems in solving optimization challenges," the agency states. "ONISQ researchers will be tasked with developing quantum systems that are scalable to hundreds or thousands of qubits with longer coherence times and improved noise control."
Researchers will also be required to efficiently implement a quantum optimization algorithm on noisy intermediate-scale quantum devices, optimizing allocation of quantum and classical resources. Benchmarking will also be part of the program, with researchers making a quantitative comparison of classical and quantum approaches. In addition, the program will identify classes of problems in combinatorial optimization where quantum information processing is likely to have the biggest impact. It will also seek to develop methods for extending quantum advantage on limited size processors to large combinatorial optimization problems via techniques such as problem decomposition.
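In broad strokes, such hybrid schemes wrap a classical optimizer around a noisy quantum sampler. The sketch below is a generic illustration of that loop; it is not ONISQ's actual algorithm, and the "quantum" cost evaluation is a stand-in simulated with simple noisy arithmetic.

```python
# Generic sketch of a hybrid quantum-classical optimization loop (illustrative
# only; not ONISQ's algorithm). A classical optimizer proposes circuit
# parameters, a simulated noisy "quantum" evaluation scores them, and the loop
# keeps the best candidate found within a fixed budget.
import random

def noisy_quantum_cost(params):
    # Stand-in for running a parameterized circuit on noisy hardware and
    # measuring a combinatorial-optimization cost function.
    ideal = sum((p - 0.5) ** 2 for p in params)
    return ideal + random.gauss(0, 0.01)            # shot/hardware noise

def classical_optimizer(params, step=0.05):
    # Trivial random-search update standing in for a real classical optimizer.
    return [p + random.uniform(-step, step) for p in params]

params = [random.random() for _ in range(4)]
best, best_cost = params, noisy_quantum_cost(params)
for _ in range(200):                                # iteration budget
    candidate = classical_optimizer(best)
    cost = noisy_quantum_cost(candidate)
    if cost < best_cost:
        best, best_cost = candidate, cost
print("best parameters:", [round(p, 2) for p in best], "cost:", round(best_cost, 3))
```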
The U.S. government has been the leader in quantum computing research since the founding of the field, but that too is beginning to change.
"In the mid-90s, NSA [the U.S. National Security Agency at Fort Meade, Md.] decided to begin an open academic effort to see if such a thing could be developed. All that research has been conducted by universities for the most part, with a few outliers, such as IBM," says Q-CTRL's Biercuk. "In the past five years, there has been a shift toward industry-led development, often in cooperation with academic efforts. Microsoft has partnered with universities all over the world and Google bought a university program. Today many of the biggest hardware developments are coming from the commercial sector."
"Quantum computing remains in deep research, but there are hardware demonstrations all over the world. In the next five years, we expect the performance of these machines to be advanced to the point where we believe they will demonstrate a quantum advantage for the first time. For now, however, quantum computing has no advantages over standard computing technology. Quantum computers are research demonstrators and do not solve any computing problems at all. Right now, there is no reason to use quantum computers except to be ready when they are truly available."
AI and quantum computing
Nonetheless, the race to develop and deploy AI and quantum computing is global, with the world's leading military powers seeing them, along with other breakthrough technologies like hypersonics, as capable of making the first nation to successfully deploy them as dominant as the U.S. was following the first detonations of atomic bombs. That is especially true for autonomous mobile platforms, such as unmanned aerial vehicles (UAVs), interfacing with those vehicles' onboard HPEC.
Of the two, AI is the closest to deployment, but also the most controversial. A growing number of the world's leading scientists, including the late Stephen Hawking, warn that real-world AI could easily duplicate the actions of the fictional Skynet in the Terminator movie series. Launched with total control over the U.S. nuclear arsenal, Skynet became sentient and decided the human race was a dangerous infestation that needed to be destroyed.
"The development of full artificial intelligence could spell the end of the human race. Once humans develop artificial intelligence, it will take off on its own and redesign itself at an ever-increasing rate. Humans, who are limited by slow biological evolution, couldn't compete and would be superseded." - Stephen Hawking (2014)
Such dangers have been recognized at least as far back as the publication of Isaac Asimov's short story "Runaround" in 1942, which included his Three Laws of Robotics, designed to control otherwise autonomous robots. In the story, the laws were set down in 2058:
First Law: A robot may not injure a human being or, through inaction, allow a human being to come to harm.
Second Law: A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Whether it would be possible to embed and ensure unbreakable compliance with such laws in an AI system is unknown. But limited degrees of AI, known as machine learning, already are in widespread use by the military and advanced stages of the technology, such as deep learning, almost certainly will be deployed by one or more nations as they become available. More than 50 nations already are actively researching battlefield robots.
Military quantum computing
AI-HPEC would give UAVs, next-generation cruise missiles, and even maneuverable ballistic missiles the ability to alter course to new targets at any point after launch, recognize counter measures, avoid, and misdirect or even destroy them.
Quantum computing, on the other hand, is seen by some as providing little, if any, advantage over traditional computer technologies; by many as requiring cooling and size, weight and power (SWaP) improvements not possible with current technologies to make it applicable to mobile platforms; and by most as being little more than a research tool for perhaps decades to come.
Perhaps the biggest stumbling block to mobile platform-based quantum computing is cooling: it currently requires a cooling unit, held near absolute zero and the size of a refrigerator, to handle even a fractional piece of a quantum computer.
Military trusted computing experts are considering new generations of quantum computing for creating nearly unbreakable encryption for super-secure defense applications.
"A lot of work has been done and things are being touted as operational, but the most important thing to understand is this isn't some simple physical thing you throw in suddenly and it works. That makes it harder to call it deployable: you're not going to strap a quantum computer to a handheld device. A lot of solutions are still trying to deal with cryogenics and how do you deal with deployment of cryo," says Tammy Carter, senior product manager for GPGPUs and software products at Curtiss-Wright Defense Solutions in Ashburn, Va.
"AI is now a technology in deployment. Machine learning is pretty much in use worldwide," Carter says. "We're in a migration of figuring out how to use it with the systems we have. Quantum computing will require a lot of engineering work and demand may not be great enough to push the effort. From a cryogenically cooled electronics perspective, I don't think there is any insurmountable problem. It absolutely can be done, it's just a matter of decision making to do it, prioritization to get it done. These are not easily deployed technologies, but certainly can be deployed."
Given its current and expected near-term limitations, research has increased on the development of hybrid systems.
"The longer term reality is a hybrid approach, with the quantum system not going mobile any time soon," says Brian Kirby, physicist in the Army Research Laboratory Computational & Informational Sciences Directorate in Adelphi, Md. "It's a mistake to forecast a timeline, but I'm not sure putting a quantum computer on such systems would be valuable. Having the quantum computer in a fixed location and linked to the mobile platform makes more sense, for now at least. There can be multiple quantum computers throughout the country; while individually they may have trouble solving some problems, networking them would be more secure and able to solve larger problems."
"Broadly, however, a quantum computer can't do anything a practical home computer can't do, but it can potentially solve certain problems more efficiently," Kirby continues. "So you're looking at potential speed-up, but there is no problem a quantum computer can solve that a normal computer can't. Beyond the basics of code-breaking and quantum simulations affecting material design, right now we can't necessarily predict military applications."
Raising concerns
In some ways similar to AI, quantum computing raises nearly as many concerns as it does expectations, especially in the area of security. The latest Thales Data Threat Report says 72 percent of surveyed security experts worldwide believe quantum computing will have a negative impact on data security within the next five years.
At the same time, quantum computing is forecast to offer more robust cryptography and security solutions. For HPEC, that duality is significant: quantum computing can make it more difficult to break the security of mobile platforms, while simultaneously making it easier to do just that.
"Quantum computers that can run Shor's algorithm [leveraging quantum properties to factor very large numbers efficiently] are expected to become available in the next decade. These algorithms can be used to break conventional digital signature schemes (e.g. RSA or ECDSA), which are widely used in embedded systems today. This puts these systems at risk when they are used in safety-relevant long-term applications, such as automotive systems or critical infrastructures. To mitigate this risk, classical digital signature schemes used must be replaced by schemes secure against quantum computing-based attacks," according to the August 2019 proceedings of the 14th International Conference on Availability, Reliability & Security's "Post-Quantum Cryptography in Embedded Systems" report.
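For a sense of why Shor's algorithm threatens RSA-style cryptography, the sketch below shows only its classical post-processing step: once the period r of a^x mod N is known, the factors of N drop out of a gcd computation. The period-finding here is brute force, which is exactly the part a quantum computer would replace; the numbers are toy values for illustration.

```python
# Sketch of the classical half of Shor's algorithm (illustrative toy values).
# A quantum computer would supply the period r; everything else is ordinary
# arithmetic that a classical machine does easily.
from math import gcd

def find_period(a: int, n: int) -> int:
    # Quantum period-finding would replace this exponential-time loop.
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def shor_classical_part(n: int, a: int) -> tuple:
    r = find_period(a, n)
    if r % 2 == 1:
        raise ValueError("odd period; pick another base a")
    x = pow(a, r // 2, n)
    return gcd(x - 1, n), gcd(x + 1, n)   # nontrivial factors in the good case

print(shor_classical_part(15, 7))   # -> (3, 5)
```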
The security question is not quite so clean-cut as armor/anti-armor, but there is a developing bifurcation between defensive and offensive applications. On the defense side, deployed quantum systems are looked at to provide encoded communications. Experts say it seems likely the level of activity in China about quantum communications, which has been a major focus for years, runs up against the development of quantum computing in the U.S. The two aspects are not clearly one-against-one, but the two moving independently.
Google's quantum supremacy demonstration has led to a rush on finding algorithms robust against quantum attack. On the quantum communications side, the development of attacks on such systems has been underway for years, leading to a whole field of research based on identifying and exploiting quantum attacks.
"Quantum computing could also help develop revolutionary AI systems. Recent efforts have demonstrated a strong and unexpected link between quantum computation and artificial neural networks, potentially portending new approaches to machine learning. Such advances could lead to vastly improved pattern recognition, which in turn would permit far better machine-based target identification. For example, the hidden submarine in our vast oceans may become less-hidden in a world with AI-empowered quantum computers, particularly if they are combined with vast data sets acquired through powerful quantum-enabled sensors," according to Q-CTRL's Biercuk.
"Even the relatively mundane near-term development of new quantum-enhanced clocks may impact security, beyond just making GPS devices more accurate," Biercuk continues. "Quantum-enabled clocks are so sensitive that they can discern minor gravitational anomalies from a distance. They thus could be deployed by military personnel to detect underground, hardened structures, submarines or hidden weapons systems. Given their potential for remote sensing, advanced clocks may become a key embedded technology for tomorrow's warfighter."
Warfighter capabilities
The early applications of quantum computing, while not embedded on mobile platforms, are expected to enhance warfighter capabilities significantly.
Jim Clark, director of quantum hardware at Intel Corp. in Santa Clara, Calif., shows one of the company's quantum processors.
"There is a high likelihood quantum computing will impact ISR [intelligence, surveillance and reconnaissance], solving logistics problems more quickly. But so much of this is in the basic research stage. While we know the types of problems and general application space, optimization problems will be some of the first where we will see advantages from quantum computing," says Sara Gamble, quantum information sciences program manager at ARL.
Biercuk says he agrees: "We're not really sure there is a role for quantum computing in embedded computing just yet. Quantum computing is right now very large systems embedded in mainframes, with access by the cloud. You can envision embedded computing accessing quantum computing via the cloud, but they are not likely to be very small, agile processors you would embed in a SWaP-constrained environment."
"But there are many aspects of quantum technology beyond quantum computing; the combination of quantum sensors could allow much better detection in the field," Biercuk continues. "The biggest potential impact comes in the areas of GPS denial, which has become one of the biggest risk factors identified in every blueprint around the world. Quantum computing plays directly into this to perform dead reckoning navigation in GPS denial areas."
DARPA's Curcic also says the full power of quantum computing is still decades away, but believes ONISQ has the potential to help speed its development.
"The two main approaches industry is using are superconducting quantum computing and trapped ions. We use both of those, plus cold atoms [Rydberg atoms]. We are very excited about ONISQ and seeing if we can get anything useful over classical computing. Four teams are doing hardware development with those three approaches," she says.
"Because these are noisy systems, it's very difficult to determine if there will be any advantages. The hope is we can address the optimization problem faster than today, which is what we're working on with ONISQ. Optimization problems are everywhere, so even a small improvement would be valuable."
Beyond todays capabilities
As to how quantum computing and AI may impact future warfare, especially through HPEC, she adds: "I have no doubt quantum computing will be revolutionary and we'll be able to do things beyond today's capabilities. The possibilities are pretty much endless, but what they are is not crystal clear at this point. It's very difficult, with great certainty, to predict what quantum computing will be able to do. We'll just have to build and try. That's why today is such an exciting time."
Curtiss-Wright's Carter says he believes quantum computing and AI will be closely linked with HPEC in the future, once current limitations with both are resolved.
"AI itself is based on a lot of math being done in parallel for probability answers, similar to modeling the neurons in the brain: highly interconnected nodes and interdependent math calculations. Imagine a small device trying to recognize handwriting," Carter says. "You run every pixel of that through lots and lots of math, combining and mixing, cutting some, amplifying others, until you get a 98 percent answer at the other end. Quantum computing could help with that, and researchers are looking at how you would do that, using a different level of parallel math."
"How quantum computing will be applied to HPEC will be the big trick: how to get that deployed. Imagine we're a SIGINT [signals intelligence] platform (land, air or sea); there are a lot of challenges, such as picking the right signal out of the air, which is not particularly easy," Carter continues. "Once you achieve pattern recognition, you want to do code breaking to get that encrypted traffic immediately. Getting that on a deployed platform could be useful; otherwise you bring your data back to a quantum computer in a building, but that means you don't get the results immediately."
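The "lots and lots of math" Carter describes for the handwriting example boils down to repeated weighted sums. The numpy sketch below is a toy forward pass with random, untrained weights (all sizes and values are assumptions for illustration); it shows the structure of the parallel computation, not a working digit recognizer.

```python
# Toy forward pass (illustrative only): every input pixel is weighted, mixed,
# and squashed through layers until the network emits class probabilities.
import numpy as np

rng = np.random.default_rng(0)
pixels = rng.random(784)                       # a stand-in 28x28 "handwritten digit"
w1, b1 = rng.normal(size=(128, 784)) * 0.05, np.zeros(128)
w2, b2 = rng.normal(size=(10, 128)) * 0.05, np.zeros(10)

hidden = np.maximum(0, w1 @ pixels + b1)       # combine and mix inputs, cut some (ReLU)
logits = w2 @ hidden + b2
probs = np.exp(logits) / np.exp(logits).sum()  # softmax turns scores into probabilities
print("predicted digit:", probs.argmax(), "confidence:", round(float(probs.max()), 3))
```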
The technology research underway today is expected to show progress toward making quantum computing more applicable to military needs, but it is unlikely to produce major results quickly, especially in the area of HPEC.
"Trapped ions and superconducting circuits still require a lot of infrastructure to make them work. Some teams are working on that problem, but the systems still remain room-sized. The idea of quantum computing being like an integrated circuit you just put on a circuit board: we're a very long way from that," Biercuk says. "The systems are getting smaller, more compact, but there is a very long way to go to deployable, embeddable systems. Position, navigation and timing systems are being reduced and can be easily deployed on aircraft. That's probably where the technology will remain in the next 20 years; but, eventually, with new technology development, quantum computing may be reduced to more mobile sizes."
"The next 10 years are about achieving quantum advantage with the systems available now or iterations of them. Despite the acceleration we have seen, there are things that are just hard and require a lot of creativity," Biercuk continues. "We're shrinking the hardware, but that hardware still may not be relevant to any deployable system. In 20 years, we may have machines that can do the work required, but in that time we may only be able to shrink them to a size that can fit on an aircraft carrier: local code-breaking engines. To miniaturize this technology to put it on, say, a body-carried system, we just don't have any technology basis to claim we will get there even in 20 years. That's open to creativity and discovery."
Even with all of the research underway worldwide, one question remains dominant.
"The general challenge is it is not clear what we will use quantum computing for," notes Rad Balu, a computer scientist in ARL's Computational & Informational Sciences Directorate.
The rest is here:
The future of artificial intelligence and quantum computing - Military & Aerospace Electronics
Researchers Found Another Impediment for Quantum Computers to Overcome – Dual Dove
Posted: at 10:55 am
Keeping qubits stable will be pivotal to realizing the potential of quantum computing, and now researchers have discovered a new obstacle to that stability: natural radiation.
Natural or background radiation is produced by various sources, both natural and artificial. Cosmic rays produce natural radiation, for instance, and so do concrete buildings. It is surrounding us all the time, and so this poses something of an issue for future quantum computers.
After numerous experiments that modified the level of natural radiation around qubits, physicists could establish that this background noise does indeed push qubits off balance in a way that hinders them from operating properly.
"Our study is the first to show clearly that low-level ionizing radiation in the environment degrades the performance of superconducting qubits," says physicist John Orrell, from the Pacific Northwest National Laboratory (PNNL). "These findings suggest that radiation shielding will be necessary to attain long-sought performance in quantum computers of this design."
Natural radiation is by no means the most important or the only threat to qubit stability, which is known as coherence; everything from temperature variations to electromagnetic fields is able to mess with a qubit.
However, scientists say that if we're to attain a future where quantum computers are performing most of our advanced computing needs, then this hindrance from natural radiation is going to have to be addressed.
After the team that carried out the study was faced with issues regarding superconducting qubit decoherence, it decided to examine the possible problem with natural radiation. They discovered it breaks up a key quantum bond known as the Cooper pair of electrons.
"The radiation breaks apart matched pairs of electrons that typically carry electric current without resistance in a superconductor," says physicist Brent VanDevender, from PNNL. "The resistance of those unpaired electrons destroys the delicately prepared state of a qubit."
Regular computers can be disrupted by the same issues that impact qubits, but quantum states are a lot more delicate and sensitive. One of the reasons that we don't have authentic full-scale quantum computers at the moment is that there's no way yet to keep qubits stable for more than a few milliseconds at a time.
If we can build on that, the benefits when it comes to computing power could be gigantic: while classical computer bits can only be set as 1 or 0, qubits can be set as 1, 0, or both at the same time, a state known as superposition.
Researchers have managed to get it happening, but only for a very short period, and in an extremely controlled setting. The good news, however, is that scientists like those at PNNL are dedicated to the challenge of discovering how to make quantum computers a reality, and with the new finding, we know a bit more about what we have to overcome.
"Practical quantum computing with these devices will not be possible unless we address the radiation issue," says VanDevender. "Without mitigation, radiation will limit the coherence time of superconducting qubits to a few milliseconds, which is insufficient for practical quantum computing."
A paper detailing the research has been published in the journal Nature.
See original here:
Researchers Found Another Impediment for Quantum Computers to Overcome - Dual Dove
Quantum Cryptography Market Research Analysis Including Growth Factors, Types And Application By Regions From 2024 – Kentucky Journal 24
Posted: at 10:55 am
Overview:
Quantum cryptography is a new method for secret communications that provides the assurance of security of digital data. Quantum cryptography is primarily based on the usage of individual particles/waves of light (photons) and their essential quantum properties for the development of an unbreakable cryptosystem, primarily because it is impossible to measure the quantum state of any system without disturbing that system.
Request For Report Sample @ https://www.trendsmarketresearch.com/report/sample/9921
It is hypothetically possible that other particles could be used, but photons offer all the necessary qualities: their behavior is comparatively well understood, and they are the information carriers in optical fiber cables, the most promising medium for very high-bandwidth communications.
Quantum computing focuses on the growing computer technology built on the platform of quantum theory, which describes the nature and behavior of energy and matter at the quantum level. The prominence of quantum mechanics in cryptography is growing because quantum properties are being used extensively in the encryption of information. Quantum cryptography allows the transmission of the most critical data at the most secure level, which, in turn, propels the growth of the quantum computing market. Quantum computing has a huge array of applications.
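The core idea (measuring a quantum state disturbs it, so eavesdropping leaves traces) can be illustrated with a toy BB84-style key exchange. The Python sketch below is a simplified classical simulation for illustration only; real quantum key distribution involves photon sources, detectors and error-rate estimation that are not modeled here.

```python
# Toy BB84-style sketch (illustrative only): Alice encodes random bits in random
# bases, Bob measures in random bases, and only positions where the bases match
# become shared key material. Measuring in the wrong basis gives a random result,
# which is also why an eavesdropper introduces detectable errors.
import random

N = 16
alice_bits  = [random.randint(0, 1) for _ in range(N)]
alice_bases = [random.choice("+x") for _ in range(N)]   # + rectilinear, x diagonal
bob_bases   = [random.choice("+x") for _ in range(N)]

bob_bits = [
    bit if a_basis == b_basis else random.randint(0, 1)  # wrong basis -> random outcome
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases)
]

# Alice and Bob publicly compare bases (not bits) and keep only the matches.
sifted_key = [b for b, a, bb in zip(bob_bits, alice_bases, bob_bases) if a == bb]
print("sifted key:", sifted_key)
```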
Market Analysis:
According to Infoholic Research, the Global Quantum Cryptography Market is expected to reach $1.53 billion by 2023, growing at a CAGR of around 26.13% during the forecast period. The market is experiencing growth due to increasing data security and privacy concerns. In addition, the growing adoption of cloud storage and computing technologies is driving the market forward. However, low customer awareness about quantum cryptography is hindering the market growth. Rising demand for security solutions across different verticals is expected to create lucrative opportunities for the market.
Market Segmentation Analysis:
The report provides a wide-ranging evaluation of the market. It provides in-depth qualitative insights, historical data, and supportable projections and assumptions about the market size. The projections featured in the report have been derived using proven research methodologies and assumptions based on the vendors portfolio, blogs, whitepapers, and vendor presentations. Thus, the research report serves every side of the market and is segmented based on regional markets, type, applications, and end-users.
Countries and Vertical Analysis:
The report contains an in-depth analysis of the vendor profiles, which include financial health, business units, key business priorities, SWOT, strategy, and views; and competitive landscape. The prominent vendors covered in the report include ID Quantique, MagiQ Technologies, Nucrypt, Infineon Technologies, Qutools, QuintenssenceLabs, Crypta Labs, PQ Solutions, and Qubitekk and others. The vendors have been identified based on the portfolio, geographical presence, marketing & distribution channels, revenue generation, and significant investments in R&D.
Get Complete TOC with Tables and Figures @ https://www.trendsmarketresearch.com/report/discount/9921
Competitive Analysis
The report covers and analyzes the global quantum cryptography market. Various strategies, such as joint ventures, partnerships, collaborations, and contracts, have been considered. In addition, as customers are in search of better solutions, there is expected to be a rising number of strategic partnerships for better product development. There is likely to be an increase in the number of mergers, acquisitions, and strategic partnerships during the forecast period.
Companies such as Nucrypt, Crypta Labs, Qutools, and Magiq Technologies are the key players in the global Quantum Cryptography market. Nucrypt has developed technologies for emerging applications in metrology and communication. The company has also produced and manufactured electronic and optical pulsers. In addition, Crypta Labs deals in application security for devices. The company deals in Quantum Random Number Generator products and solutions and Internet of Things (IoT). The major sectors the company is looking at are transport, military and medical.
The report includes the complete insight of the industry, and aims to provide an opportunity for the emerging and established players to understand the market trends, current scenario, initiatives taken by the government, and the latest technologies related to the market. In addition, it helps the venture capitalists in understanding the companies better and to take informed decisions.
Regional Analysis
The Americas held the largest chunk of market share in 2017 and is expected to dominate the quantum cryptography market during the forecast period. The region has always been a hub for high investments in research and development (R&D) activities, thus contributing to the development of new technologies. The growing concerns for the security of IT infrastructure and complex data in America have directed the enterprises in this region to adopt quantum cryptography and reliable authentication solutions.
<<< Get COVID-19 Report Analysis >>> https://www.trendsmarketresearch.com/report/covid-19-analysis/9921
Benefits
The report provides an in-depth analysis of the global quantum cryptography market, aiming to reduce the time to market for products and services, reduce operational cost, and improve accuracy and operational performance. With the help of quantum cryptography, various organizations can secure their crucial information and increase productivity and efficiency. In addition, the solutions are proven to be reliable and improve scalability. The report discusses the types, applications, and regions related to this market. Further, the report provides details about the major challenges impacting the market growth.
Read more here:
Q-NEXT collaboration awarded National Quantum Initiative funding – University of Wisconsin-Madison
Posted: at 10:55 am
The University of Wisconsin-Madison solidified its standing as a leader in the field of quantum information science when the U.S. Department of Energy (DOE) and the White House announced the Q-NEXT collaboration as a funded Quantum Information Science Research Center through the National Quantum Initiative Act. The five-year, $115 million collaboration was one of five Centers announced today.
Q-NEXT, a next-generation quantum science and engineering collaboration led by the DOE's Argonne National Laboratory, brings together nearly 100 world-class researchers from three national laboratories, 10 universities including UW-Madison, and 10 leading U.S. technology companies to develop the science and technology to control and distribute quantum information.
"The main goals for Q-NEXT are first to deliver quantum interconnects, to find ways to quantum mechanically connect distant objects," says Mark Eriksson, the John Bardeen Professor of Physics at UW-Madison and a Q-NEXT thrust lead. "And next, to establish a national resource to both develop and provide pristine materials for quantum science and technology."
Q-NEXT will focus on three core quantum technologies:
Eriksson is leading the Materials and Integration thrust, one of six Q-NEXT focus areas that features researchers from across the collaboration. This thrust aims to: develop high-coherence materials, including for silicon and superconducting qubits, which is an essential component of preserving entanglement; develop a silicon-based optical quantum memory, which is important in developing a quantum repeater; and improve color-center quantum bits, which are used in both communication and sensing.
"One of the key goals in Materials and Integration is to not just improve the materials but also to improve how you integrate those materials together so that in the end, quantum devices maintain coherence and preserve entanglement," Eriksson says. "The integration part of the name is really important. You may have a material that on its own is really good at preserving coherence, yet you only make something useful when you integrate materials together."
Six other UW-Madison and Wisconsin Quantum Institute faculty members are Q-NEXT investigators: physics professors Victor Brar, Shimon Kolkowitz, Robert McDermott, and Mark Saffman, electrical and computer engineering professor Mikhail Kats, and chemistry professor Randall Goldsmith. UW-Madison researchers are involved in five of the six research thrusts.
"I'm excited about Q-NEXT because of the connections and collaborations it provides to national labs, other universities, and industry partners," Eriksson says. "When you're talking about research, it's those connections that often lead to the breakthroughs."
The potential impacts of Q-NEXT research include the creation of a first-ever National Quantum Devices Database that will promote the development and fabrication of next-generation quantum devices as well as the development of the components and systems that enable quantum communications across distances ranging from microns to kilometers.
"This funding helps ensure that the Q-NEXT collaboration will lead the way in future developments in quantum science and engineering," says Steve Ackerman, UW-Madison vice chancellor for research and graduate education. "Q-NEXT is the epitome of the Wisconsin Idea as we work together to transfer new quantum technologies to the marketplace and support U.S. economic competitiveness in this growing field."
Read more here:
Q-NEXT collaboration awarded National Quantum Initiative funding - University of Wisconsin-Madison
This Equation Calculates The Chances We Live In A Computer Simulation – Discover Magazine
Posted: at 10:55 am
The Drake equation is one of the more famous reckonings in science. It calculates the likelihood that we are not alone in the universe by estimating the number of other intelligent civilizations in our galaxy that might exist now.
Some of the terms in this equation are well known or becoming better understood, such as the number of stars in our galaxy and the proportion that have planets in the habitable zone. But others are unknown, such as the proportion of planets that develop intelligent life; and some may never be known such as the proportion that destroy themselves before they can be discovered.
Nevertheless, the Drake equation allows scientists to place important bounds on the numbers of intelligent civilizations that might be out there.
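For reference, the Drake estimate is just a product of those factors. The sketch below is a minimal Python illustration; the parameter values are arbitrary assumptions chosen only to show how the pieces multiply together, not figures from the article.

```python
# Illustrative Drake-equation sketch (parameter values are assumptions).
def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """N = R* * fp * ne * fl * fi * fc * L: civilizations detectable now."""
    return R_star * f_p * n_e * f_l * f_i * f_c * L

# Example: 1.5 stars/yr formed, all with planets, 2 habitable planets each,
# life on 10%, intelligence on 1%, detectable tech on 10%, lasting 10,000 yr.
print(drake(R_star=1.5, f_p=1.0, n_e=2, f_l=0.1, f_i=0.01, f_c=0.1, L=10_000))  # -> 3.0
```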
However, there is another sense in which humanity could be linked with an alien intelligence: our world may just be a simulation inside a massively powerful supercomputer run by such a species. Indeed, various scientists, philosophers and visionaries have said that the probability of such a scenario could be close to one. In other words, we probably are living in a simulation.
The accuracy of these claims is somewhat controversial. So a better way to determine the probability that we live in a simulation would be much appreciated.
Enter Alexandre Bibeau-Delisle and Gilles Brassard at the University of Montreal in Canada. These researchers have derived a Drake-like equation that calculates the chances that we live in a computer simulation. And the results throw up some counterintuitive ideas that are likely to change the way we think about simulations, how we might determine whether we are in one and whether we could ever escape.
Bibeau-Delisle and Brassard begin with a fundamental estimate of the computing power available to create a simulation. They say, for example, that a kilogram of matter, fully exploited for computation, could perform 10^50 operations per second.
By comparison, the human brain, which is also kilogram-sized, performs up to 10^16 operations per second. "It may thus be possible for a single computer the mass of a human brain to simulate the real-time evolution of 1.4 × 10^25 virtual brains," they say.
In our society, a significant number of computers already simulate entire civilizations, in games such as Civilization VI, Hearts of Iron IV, Humankind and so on. So it may be reasonable to assume that in a sufficiently advanced civilization, individuals will be able to run games that simulate societies like ours, populated with sentient, conscious beings.
So an interesting question is this: of all the sentient beings in existence, what fraction are likely to be simulations? To derive the answer, Bibeau-Delisle and Brassard start with the total number of real sentient beings, NRe; multiply that by the fraction with access to the necessary computing power, fCiv; multiply this by the fraction of that power devoted to simulating consciousness, fDed (because these beings are likely to be using their computers for other purposes too); and then multiply by the number of brains they could simulate, RCal.
The resulting equation is this, where fSim is the fraction of simulated brains:
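A form consistent with the description above (a reconstruction based on that description, not a verbatim quote from the paper) is:
fSim = (NRe × fCiv × fDed × RCal) / (NRe × fCiv × fDed × RCal + NRe) = (fCiv × fDed × RCal) / (fCiv × fDed × RCal + 1)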
Here RCal is the huge number of brains that fully exploited matter should be able to simulate.
The sheer size of this number, ~10^25, pushes Bibeau-Delisle and Brassard towards an inescapable conclusion. "It is mathematically inescapable from [the above] equation and the colossal scale of RCal that fSim ≈ 1 unless fCiv·fDed ≈ 0," they say.
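A minimal numerical sketch in Python, assuming the reconstructed formula above, makes the point concrete; the values chosen for fCiv × fDed are purely hypothetical:

# Illustrative only: evaluates the reconstructed fSim formula for a few
# hypothetical values of fCiv * fDed, with RCal ~ 10^25 as quoted above.
R_CAL = 1e25  # simulated brains per brain-mass of fully exploited matter

def f_sim(civ_times_ded, r_cal=R_CAL):
    # Fraction of sentient beings that are simulated, per the reconstructed formula.
    x = civ_times_ded * r_cal
    return x / (x + 1)

for product in (1e-2, 1e-10, 1e-20, 1e-26):
    print(f"fCiv*fDed = {product:.0e} -> fSim = {f_sim(product):.6f}")
# fSim stays at ~1.0 until fCiv*fDed drops below roughly 1/RCal, i.e. ~10^-25.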
So there are two possible outcomes. Either we live in a simulation or a vanishingly small proportion of advanced computing power is devoted to simulating brains.
It's not hard to imagine why the second option might be true. "A society of beings similar to us (but with a much greater technological development) could indeed decide it is not very ethical to simulate beings with enough precision to make them conscious while fooling them and keeping them cut off from the real world," say Bibeau-Delisle and Brassard.
Another possibility is that advanced civilizations never get to the stage where their technology is powerful enough to perform these kinds of computations. Perhaps they destroy themselves through war or disease or climate change long before then. There is no way of knowing.
But suppose we are in a simulation. Bibeau-Delisle and Brassard ask whether we might escape while somehow hiding our intentions from our overlords. They assume that the simulating technology will be quantum in nature. "If quantum phenomena are as difficult to compute on classical systems as we believe them to be, a simulation containing our world would most probably run on quantum computing power," they say.
This suggests it may be possible to detect our alien overlords, since they cannot measure the quantum nature of our world without revealing their presence. Quantum cryptography uses the same principle; indeed, Brassard is one of the pioneers of this technology.
That would seem to make it possible for us to make encrypted plans that are hidden from the overlords, such as secretly transferring ourselves into our own simulations.
However, the overlords have a way to foil this. All they need to do is rewire their simulation to make it look as if we are able to hide information, even though they are aware of it all the time. "If the simulators are particularly angry at our attempted escape, they could also send us to a simulated hell, in which case we would at least have the confirmation we were truly living inside a simulation and our paranoia was not unjustified..." conclude Bibeau-Delisle and Brassard, with their tongues firmly in their cheeks.
In that sense, we are the ultimate laboratory guinea pigs: forever trapped and forever fooled by the evil genius of our omnipotent masters.
Time for another game of Civilization VI.
Ref: arxiv.org/abs/2008.09275 : Probability and Consequences of Living Inside a Computer Simulation
Here is the original post:
This Equation Calculates The Chances We Live In A Computer Simulation - Discover Magazine
I confess, I’m scared of the next generation of supercomputers – TechRadar
Posted: at 10:55 am
Earlier this year, a Japanese supercomputer built on Arm-based Fujitsu A64FX processors snatched the crown of world's fastest machine, blowing incumbent leader IBM Summit out of the water.
Fugaku, as the machine is known, achieved 415.5 petaFLOPS on the popular High Performance Linpack (HPL) benchmark, almost three times the score of the IBM machine (148.5 petaFLOPS).
It also topped the rankings for Graph 500, HPL-AI and HPCG workloads - a feat never before achieved in the world of high performance computing (HPC).
Modern supercomputers are now edging ever-closer to the landmark figure of one exaFLOPS (equal to 1,000 petaFLOPS), commonly described as the exascale barrier. In fact, Fugaku itself can already achieve one exaFLOPS, but only in lower precision modes.
The consensus among the experts we spoke to is that a single machine will breach the exascale barrier within the next 6 - 24 months, unlocking a wealth of possibilities in the fields of medical research, climate forecasting, cybersecurity and more.
But what is an exaFLOPS? And what will it mean to break the exascale milestone, pursued doggedly for more than a decade?
To understand what it means to achieve exascale computing, it's important to first understand what is meant by FLOPS, which stands for floating point operations per second.
A floating point operation is any mathematical calculation (i.e. addition, subtraction, multiplication or division) that involves a number containing a decimal (e.g. 3.0 - a floating point number), as opposed to a number without a decimal (e.g. 3 - a binary integer). Calculations involving decimals are typically more complex and therefore take longer to solve.
An exascale computer can perform 10^18 (one quintillion, or 1,000,000,000,000,000,000) of these mathematical calculations every second.
For context, to equal the number of calculations an exascale computer can process in a single second, an individual would have to perform one sum every second for 31,688,765,000 years.
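The arithmetic behind that figure is easy to check; here is a short Python sketch (it uses a 365.25-day year, hence the small difference from the article's exact number):

# One exaFLOPS is 10^18 floating point operations per second.
EXA = 10**18
seconds_per_year = 365.25 * 24 * 3600                  # ~31.6 million seconds
print(f"{EXA / seconds_per_year:.3e} years")           # ~3.17e10, i.e. ~31.7 billion years

# For scale: 1 exaFLOPS = 1,000 petaFLOPS, so Fugaku's 415.5 petaFLOPS is a
# little over 40% of the way to the exascale barrier at full precision.
print(415.5 / 1000)                                    # 0.4155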
The PC I'm using right now, meanwhile, is able to reach 147 billion FLOPS (or 0.00000014723 exaFLOPS), outperforming the fastest supercomputer of 1993, the Intel Paragon (143.4 billion FLOPS).
This both underscores how far computing has come in the last three decades and puts into perspective the extreme performance levels attained by the leading supercomputers today.
The key to building a machine capable of reaching one exaFLOPS is optimization at the processing, storage and software layers.
The hardware must be small and powerful enough to pack together and reach the necessary speeds, the storage capacious and fast enough to serve up the data and the software scalable and programmable enough to make full use of the hardware.
For example, there comes a point at which adding more processors to a supercomputer will no longer affect its speed, because the application is not sufficiently optimized. The only way governments and private businesses will realize a full return on HPC hardware investment is through an equivalent investment in software.
Organizations such as the Exascale Computing Project (ECP) and the ExCALIBUR programme are interested in solving precisely this problem. Those involved claim a renewed focus on algorithm and application development is required in order to harness the full power and scope of exascale.
Achieving this delicate balance between software and hardware, in an energy-efficient manner and while avoiding an impractically low mean time between failures (MTBF) score (the time that elapses before a system breaks down under strain), is the challenge facing the HPC industry.
"15 years ago, as we started the discussion on exascale, we hypothesized that it would need to be done in 20 megawatts (MW); later that was changed to 40 MW. With Fugaku, we see that we are about halfway to a 64-bit exaFLOPS at the 40 MW power envelope, which shows that an exaFLOPS is in reach today," explained Brent Gorda, Senior Director HPC at UK-based chip designer Arm.
"We could hit an exaFLOPS now with sufficient funding to build and run a system. [But] the size of the system is likely to be such that MTBF is measured in a single-digit number of days, based on today's technologies and the number of components necessary to reach these levels of performance."
When it comes to building a machine capable of breaching the exascale barrier, there are a number of other factors at play, beyond technological feasibility. An exascale computer can only come into being once an equilibrium has been reached at the intersection of technology, economics and politics.
One could in theory build an exascale system today by packing in enough CPUs, GPUs and DPUs. But what about economic viability? said Gilad Shainer of NVIDIA Mellanox, the firm behind the Infiniband technology (the fabric that links the various hardware components) found in seven of the ten fastest supercomputers.
Improvements in computing technologies, silicon processing, more efficient use of power and so on all help to increase efficiency and make exascale computing an economic objective as opposed to a sort of sporting achievement.
According to Paul Calleja, who heads up computing research at the University of Cambridge and is working with Dell on the Open Exascale Lab, Fugaku is an excellent example of what is theoretically possible today, but is also impractical by virtually any other metric.
"If you look back at Japanese supercomputers, historically there's only ever been one of them made. They have beautifully exquisite architectures, but they're so stupidly expensive and proprietary that no one else could afford one," he told TechRadar Pro.
"[Japanese organizations] like these really large technology demonstrators, which are very useful in industry because it shows the direction of travel and pushes advancements, but those kinds of advancements are very expensive and not sustainable, scalable or replicable."
So, in this sense, there are two separate exascale landmarks: the theoretical barrier, which will likely be met first by a machine of Fugaku's ilk (a technological demonstrator), and the practical barrier, which will see exascale computing deployed en masse.
Geopolitical factors will also play a role in how quickly the exascale barrier is breached. Researchers and engineers might focus exclusively on the technological feat, but the institutions and governments funding HPC research are likely motivated by different considerations.
"Exascale computing is not just about reaching theoretical targets, it is about creating the ability to tackle problems that have been previously intractable," said Andy Grant, Vice President HPC & Big Data at IT services firm Atos, which is influential in the fields of HPC and quantum computing.
"Those that are developing exascale technologies are not doing it merely to have the fastest supercomputer in the world, but to maintain international competitiveness, security and defence."
"In Japan, their new machine is roughly 2.8x more powerful than the now second-place system. In broad terms, that will enable Japanese researchers to address problems that are 2.8x more complex. In the context of international competitiveness, that creates a significant advantage."
In years gone by, rival nations fought it out in the trenches or competed to see who could place the first human on the moon. But computing may well become the frontier at which the next arms race takes place; supremacy in the field of HPC might prove just as politically important as military strength.
Once exascale computers become an established resource - available for businesses, scientists and academics to draw upon - a wealth of possibilities will be unlocked across a wide variety of sectors.
HPC could prove revelatory in the fields of clinical medicine and genomics, for example, which require vast amounts of compute power to conduct molecular modelling, simulate interactions between compounds and sequence genomes.
In fact, IBM Summit and a host of other modern supercomputers are being used to identify chemical compounds that could contribute to the fight against coronavirus. The Covid-19 High Performance Computing Consortium assembled 16 supercomputers, accounting for an aggregate of 330 petaFLOPS - but imagine how much more quickly research could be conducted using a fleet of machines capable of reaching 1,000 petaFLOPS on their own.
Artificial intelligence, meanwhile, is another cross-disciplinary domain that will be transformed with the arrival of exascale computing. The ability to analyze ever-larger datasets will improve the ability of AI models to make accurate forecasts (contingent on the quality of data fed into the system) that could be applied to virtually any industry, from cybersecurity to e-commerce, manufacturing, logistics, banking, education and many more.
As explained by Rashid Mansoor, CTO at UK supercomputing startup Hadean, the value of supercomputing lies in the ability to make an accurate projection (of any variety).
"The primary purpose of a supercomputer is to compute some real-world phenomenon to provide a prediction. The prediction could be the way proteins interact, the way a disease spreads through the population, how air moves over an aerofoil or how electromagnetic fields interact with a spacecraft during re-entry," he told TechRadar Pro.
"Raw performance, such as the HPL benchmark, simply indicates that we can model bigger and more complex systems to a greater degree of accuracy. One thing that the history of computing has shown us is that the demand for computing power is insatiable."
Other commonly cited areas that will benefit significantly from the arrival of exascale include brain mapping, weather and climate forecasting, product design and astronomy, but its also likely that brand new use cases will emerge as well.
"The desired workloads and the technology to perform them form a virtuous circle. The faster and more performant the computers, the more complex problems we can solve and the faster the discovery of new problems," explained Shainer.
"What we can be sure of is that we will see the continuous needs, or ever-growing demands, for more performance capabilities in order to solve the unsolvable. Once this is solved, we will find the new unsolvable."
By all accounts, the exascale barrier will likely fall within the next two years, but the HPC industry will then turn its attention to the next objective, because the work is never done.
Some might point to quantum computers, which approach problem solving in an entirely different way to classical machines (exploiting symmetries to speed up processing), allowing for far greater scale. However, there are also problems to which quantum computing cannot be applied.
"Mid-term (10-year) prospects for quantum computing are starting to shape up, as are other technologies. These will be more specialized, where a quantum computer will very likely show up as an application accelerator for problems that relate to logistics first. They won't completely replace the need for current architectures for IT/data processing," explained Gorda.
As Mansoor puts it, "on certain problems even a small quantum computer can be exponentially faster than all of the classical computing power on earth combined. Yet on other problems, a quantum computer could be slower than a pocket calculator."
The next logical landmark for traditional computing, then, would be one zettaFLOPS, equal to 1,000 exaFLOPS or 1,000,000 petaFLOPS.
Chinese researchers predicted in 2018 that the first zettascale system will come online in 2035, paving the way for new computing paradigms. The paper itself reads like science fiction, at least for the layman:
To realize these metrics, micro-architectures will evolve to consist of more diverse and heterogeneous components. Many forms of specialized accelerators are likely to co-exist to boost HPC in a joint effort. Enabled by new interconnect materials such as photonic crystal, fully optical interconnecting systems may come into use.
Assuming one exaFLOPS is reached by 2022, 14 years will have elapsed between the creation of the first petascale and first exascale systems. The first terascale machine, meanwhile, was constructed in 1996, 12 years before the petascale barrier was breached.
If this pattern were to continue, the Chinese researchers' estimate would look relatively sensible, but there are firm question marks over the validity of zettascale projections.
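A naive extrapolation of that pattern (assuming, as the article does, that exascale arrives in 2022, and that each gap keeps widening by two years) lands in the same decade as the Chinese researchers' estimate:

# Milestone years follow the article: terascale 1996, petascale 2008,
# exascale assumed for 2022. The gaps were 12 and 14 years.
gaps = [2008 - 1996, 2022 - 2008]              # [12, 14]
next_gap = gaps[-1] + (gaps[-1] - gaps[-2])    # assume the gap widens again, to 16 years
print(2022 + next_gap)                         # 2038, in the same ballpark as 2035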
While experts are confident in their predicted exascale timelines, none would venture a guess at when zettascale might arrive without prefacing their estimate with a long list of caveats.
"Is that an interesting subject? Because to be honest with you, it's so not obtainable. To imagine how we could go 1000x beyond [one exaFLOPS] is not a conversation anyone could have, unless they're just making it up," said Calleja, when asked about the concept of zettascale.
Others were more willing to theorize, but equally reticent to guess at a specific timeline. According to Grant, the way zettascale machines process information will be unlike any supercomputer in existence today.
"[Zettascale systems] will be data-centric, meaning components will move to the data rather than the other way around, as data volumes are likely to be so large that moving data will be too expensive. Regardless, predicting what they might look like is all guesswork for now," he said.
It is also possible that the decentralized model might be the fastest route to achieving zettascale, with millions of less powerful devices working in unison to form a collective supercomputer more powerful than any single machine (as put into practice by projects such as SETI@home).
As noted by Saurabh Vij, CEO of distributed supercomputing firm Q Blocks, decentralized systems address a number of problems facing the HPC industry today, namely surrounding building and maintenance costs. They are also accessible to a much wider range of users and therefore democratize access to supercomputing resources in a way that is not otherwise possible.
"There are benefits to a centralized architecture, but the cost and maintenance barrier overshadows them. [Centralized systems] also alienate a large base of customer groups that could benefit," he said.
"We think a better way is to connect distributed nodes together in a reliable and secure manner. It wouldn't be too aggressive to say that, 5 years from now, your smartphone could be part of a giant distributed supercomputer, making money for you while you sleep by solving computational problems for industry," he added.
However, incentivizing network nodes to remain active for a long period is challenging and a high rate of turnover can lead to reliability issues. Network latency and capacity problems would also need to be addressed before distributed supercomputing can rise to prominence.
Ultimately, the difficulty in making firm predictions about zettascale lies in the massive chasm that separates present-day workloads and HPC architectures from those that might exist in the future. From a contemporary perspective, it's fruitless to imagine what might be made possible by a computer so powerful.
We might imagine zettascale machines will be used to process workloads similar to those tackled by modern supercomputers, only more quickly. But it's possible - even likely - the arrival of zettascale computing will open doors that do not and cannot exist today, so extraordinary is the leap.
In a future in which computers are 2,000+ times as fast as the most powerful machine today, philosophical and ethical debates surrounding the intelligence of man versus machine are bound to be played out in greater detail - and with greater consequence.
It is impossible to directly compare the workings of a human brain with that of a computer - i.e. to assign a FLOPS value to the human mind. However, it is not insensible to ask how many FLOPS must be achieved before a machine reaches a level of performance that might be loosely comparable to the brain.
Back in 2013, scientists used the K supercomputer to conduct a neuronal network simulation using open source simulation software NEST. The team simulated a network made up of 1.73 billion nerve cells connected by 10.4 trillion synapses.
While ginormous, the simulation represented only 1% of the human brain's neuronal network and took 40 minutes to replicate one second's worth of neuronal network activity.
However, the K computer reached a maximum computational power of only 10 petaFLOPS. A basic extrapolation (ignoring inevitable complexities), then, would suggest Fugaku could simulate circa 40% of the human brain, while a zettascale computer would be capable of performing a full simulation many times over.
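Written out, that back-of-envelope extrapolation looks like the sketch below; like the article, it scales linearly with peak FLOPS and ignores the fact that the K run was roughly 2,400 times slower than real time, along with every other complexity:

# K computer baseline: ~10 petaFLOPS simulated ~1% of the brain's network.
K_PFLOPS, K_FRACTION = 10, 0.01
FUGAKU_PFLOPS = 415.5
ZETTA_PFLOPS = 1_000_000          # 1 zettaFLOPS = 1,000,000 petaFLOPS

def brain_fraction(pflops):
    # Linear scaling from the K computer baseline (a deliberate oversimplification).
    return K_FRACTION * pflops / K_PFLOPS

print(f"Fugaku: ~{brain_fraction(FUGAKU_PFLOPS):.0%}")       # ~42% of the network
print(f"Zettascale: ~{brain_fraction(ZETTA_PFLOPS):,.0f}x")  # ~1,000 full brains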
Digital neuromorphic hardware (supercomputers created specifically to simulate the human brain) like SpiNNaker 1 and 2 will also continue to develop in the post-exascale future. Instead of sending information from point A to B, these machines will be designed to replicate the parallel communication architecture of the brain, sending information simultaneously to many different locations.
Modern iterations are already used to help neuroscientists better understand the mysteries of the brain and future versions, aided by advances in artificial intelligence, will inevitably be used to construct a faithful and fully-functional replica.
The ethical debates that will arise with the arrival of such a machine - surrounding the perception of consciousness, the definition of thought and what an artificial uber-brain could or should be used for - are manifold and could take generations to unpick.
The inability to foresee what a zettascale computer might be capable of is also an inability to plan for the moral quandaries that might come hand-in-hand.
Whether a future supercomputer might be powerful enough to simulate human-like thought is not in question, but whether researchers should aspire to bringing an artificial brain into existence is a subject worthy of discussion.
See the article here:
I confess, I'm scared of the next generation of supercomputers - TechRadar