Archive for the ‘Quantum Computing’ Category
Explained | The challenges of quantum computing – The Hindu
Posted: December 21, 2022 at 12:15 am
The story so far: The allure of quantum computers (QC) is their ability to take advantage of quantum physics to solve problems too complex for computers that use classical physics. The 2022 Nobel Prize for physics was awarded for work that rigorously tested one such phenomenon and paved the way for its applications in computing, which speaks to the contemporary importance of QCs. Several institutes, companies and governments have invested in developing quantum-computing systems, from software to solve various problems to the electromagnetic and materials science that goes into expanding their hardware capabilities. In 2021 alone, the Indian government launched a National Mission to study quantum technologies with an allocation of ₹8,000 crore; the army opened a quantum research facility in Madhya Pradesh; and the Department of Science and Technology co-launched another facility in Pune. Given the wide range of applications, understanding what QCs really are is crucial to sidestep the misinformation surrounding them and develop expectations that are closer to reality.
A macroscopic object like a ball, a chair or a person can be at only one location at a time; this location can be predicted accurately; and the object's effects on its surroundings can't be transmitted faster than the speed of light. This is the classical experience of reality.
For example, you can observe a ball flying through the air and plot its trajectory according to Newton's laws. You can predict exactly where the ball will be at a given time. If the ball strikes the ground, you will see it doing so in the time it takes light to travel through the atmosphere to you.
Quantum physics describes reality at the subatomic scale, where the objects are particles like electrons. In this realm, you can't pinpoint the location of an electron. You can only know that it will be present in a given volume of space, with a probability attached to each point in the volume: say, 10% at point A and 5% at point B. When you probe this volume, you might find the electron at point B. If you repeatedly probe this volume, you will find the electron at point B 5% of the time.
There are many interpretations of the laws of quantum physics. One is the Copenhagen interpretation, which Erwin Schrödinger popularised using a thought experiment he devised in 1935. There is a cat in a closed box with a bowl of poison. There is no way to know whether the cat is alive or dead without opening the box. In this time, the cat is said to exist in a superposition of two states: alive and dead. When you open the box, you force the superposition to collapse to a single state. The state to which it collapses depends on the probability of each state.
Similarly, when you probe the volume, you force the superposition of the electron's states to collapse to one depending on the probability of each state. (Note: This is a simplistic example to illustrate a concept.)
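A toy simulation makes the repeated-probing picture concrete. The sketch below is plain Python with made-up probabilities matching the example above; each "probe" simply samples one outcome:

```python
import random

# Hypothetical probabilities for finding the electron at two points,
# as in the example above (the remaining 85% is spread elsewhere).
probabilities = {"A": 0.10, "B": 0.05, "elsewhere": 0.85}

# Each "probe" collapses the superposition to a single outcome,
# sampled according to the probabilities.
outcomes = random.choices(
    population=list(probabilities),
    weights=list(probabilities.values()),
    k=10_000,
)
print("Fraction found at B:", outcomes.count("B") / len(outcomes))  # ~0.05
```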
The other phenomenon relevant to quantum computing is entanglement. When two particles are entangled and then separated by an arbitrary distance (even more than 1,000 km), making an observation on one particle, and thus causing its superposition to collapse, will instantaneously cause the superposition of the other particle to collapse as well. This phenomenon seems to violate the notion that the speed of light is the universe's ultimate speed limit. That is, the second particle's superposition will collapse in far less than the roughly three-thousandths of a second that light takes to travel 1,000 km. (Note: The many-worlds interpretation has been gaining favour over the Copenhagen interpretation. Here, there is no collapse, automatically removing some of these puzzling problems.)
The bit is the fundamental unit of a classical computer. Its value is 1 if a corresponding transistor is on and 0 if the transistor is off. The transistor can be in one of two states at a time (on or off), so a bit can have one of two values at a time, 0 or 1.
The qubit is the fundamental unit of a QC. It's typically a particle like an electron. (Google and IBM have been known to use transmons, in which pairs of bound electrons oscillate between two superconductors to designate the two states.) Some information is directly encoded on the qubit: if the spin of an electron is pointing up, it means 1; when the spin is pointing down, it means 0.
But instead of being either 1 or 0, the information is encoded in a superposition: say, 45% 0 plus 55% 1. This is entirely unlike the two separate states of 0 and 1 and is a third kind of state.
The qubits are entangled to ensure they work together. If one qubit is probed to reveal its state, so will some or all of the other qubits, depending on the calculation being performed. The computer's final output is the state to which all the qubits have collapsed.
One qubit can encode two states. Five qubits can encode 32 states. A computer with N qubits can encode 2^N states, whereas a computer with N transistors can only encode 2N states. So a qubit-based computer can access more states than a transistor-based computer, and thus access more computational pathways and solutions to more complex problems.
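The 32-states figure can be checked directly: building a five-qubit register as a tensor product yields a state vector with 2^5 = 32 amplitudes. A minimal numpy sketch, for illustration only:

```python
import numpy as np

plus = np.array([1, 1]) / np.sqrt(2)   # one qubit in an equal superposition

state = np.array([1.0])
for _ in range(5):                      # build a 5-qubit register
    state = np.kron(state, plus)

print(len(state))        # 32 amplitudes, i.e. 2**5 basis states
print(state[:4])         # each amplitude equals 1/sqrt(32)
```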
Researchers have figured out the basics and used QCs to model the binding energy of hydrogen bonds and simulate a wormhole model. But to solve most practical problems, like finding the shape of an undiscovered drug, autonomously exploring space or factoring large numbers, they face some fractious challenges.
A practical QC needs at least 1,000 qubits. The current biggest quantum processor has 433 qubits. There are no theoretical limits on larger processors; the barrier is engineering-related.
Qubits exist in superposition in specific conditions, including very low temperature (~0.01 K), with radiation-shielding and protection against physical shock. Tap your finger on the table and the states of the qubit sitting on it could collapse. Material or electromagnetic defects in the circuitry between qubits could also corrupt their states and bias the eventual result. Researchers are yet to build QCs that completely eliminate these disturbances in systems with a few dozen qubits.
Error-correction is also tricky. The no-cloning theorem states that it's impossible to perfectly clone the states of a qubit, which means engineers can't create a copy of a qubit's states in a classical system to sidestep the problem. One way out is to entangle each qubit with a group of physical qubits that correct errors. A physical qubit is a system that mimics a qubit. But reliable error-correction requires each qubit to be attached to thousands of physical qubits.
Researchers are also yet to build QCs that don't amplify errors when more qubits are added. This challenge is related to a fundamental problem: unless the rate of errors is kept under a certain threshold, more qubits will only increase the informational noise.
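Quantum states can't be copied, but the threshold intuition carries over from classical repetition codes. The sketch below is a classical analogy only, not a quantum code: three noisy copies plus a majority vote push the logical error rate from p down to roughly 3p², a win only while p stays small.

```python
import random

def noisy_copy(bit: int, error_rate: float) -> int:
    """Flip the bit with the given probability (a toy noise model)."""
    return bit ^ (random.random() < error_rate)

def majority(bits: list[int]) -> int:
    return int(sum(bits) > len(bits) / 2)

error_rate = 0.01          # per-physical-bit error probability (assumed)
logical_bit = 1
trials = 100_000
failures = sum(
    majority([noisy_copy(logical_bit, error_rate) for _ in range(3)]) != logical_bit
    for _ in range(trials)
)
# With p = 1%, the 3-copy vote fails only when 2+ copies flip:
# roughly 3*p**2 = 0.03%, far below the raw 1% error rate.
print(f"logical error rate: {failures / trials:.5f}")
```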
Practical QCs will require at least lakhs (hundreds of thousands) of qubits, operating with superconducting circuits that we're yet to build, apart from other components like the firmware, circuit optimisation, compilers and algorithms that make use of quantum-physics possibilities. Quantum supremacy itself (a QC doing something a classical computer can't) is thus at least decades away.
The billions being invested in this technology today are based on speculative profits, while companies that promise developers access to quantum circuits on the cloud often offer physical qubits with noticeable error rates.
The interested reader can build and simulate rudimentary quantum circuits using IBM's Quantum Composer in the browser.
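For readers who prefer code to the graphical Composer, the sketch below builds a comparable rudimentary circuit with Qiskit. It assumes the qiskit and qiskit-aer packages are installed (pip install qiskit qiskit-aer) and is illustrative rather than official IBM material:

```python
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

qc = QuantumCircuit(2, 2)
qc.h(0)          # put qubit 0 into an equal superposition
qc.cx(0, 1)      # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])

sim = AerSimulator()
counts = sim.run(transpile(qc, sim), shots=1000).result().get_counts()
print(counts)    # roughly half '00' and half '11', never '01' or '10'
```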
Read more:
Quantum Computing Will Change Our Lives. But Be Patient, Please – CNET
Posted: at 12:15 am
To hear some tell it, quantum computing progress will soon stall, ushering in a "quantum winter" when big companies ice their development programs and investors stop lavishing investments on startups.
"Winter is coming," Sabine Hossenfelder, a physicist and author working for the Munich Center for Mathematical Philosophy, said in a November video. "This bubble of inflated promises will eventually burst. It's just a matter of time."
There are signs she's right. In 2022, quantum computing hit a rough patch, with share prices plunging for the three publicly traded companies specializing in the potentially revolutionary technology. Startups seeking strength in numbers are banding together, a consolidation trend with eight mergers so far by the reckoning of Global Quantum Intelligence analysts.
But you'd have been hard pressed to find a whiff of pessimism at Q2B, a December conference about the business of quantum computing. Industry players showed continued progress toward practical quantum computers, Ph.D.-equipped researchers from big business discussed their work, and one study showed declining worries about a research and investment freeze.
"I don't think there will be a quantum winter, but some people will get frostbite," Global Quantum Intelligence analyst Doug Finke said at Q2B.
Quantum computing relies on the weird rules of atomic-scale physics to perform calculations out of reach of conventional computers like those that power today's phones, laptops and supercomputers. Large-scale, powerful quantum computers remain years away.
But progress is encouraging, because it's getting harder to squeeze more performance out of conventional computers. Even though quantum computers can't do most computing jobs, they hold strong potential for changing our lives, enabling better batteries, speeding up financial calculations, making aircraft more efficient, discovering new drugs and accelerating AI.
Quantum computing executives and researchers are acutely aware of the risks of a quantum winter. They saw what happened with artificial intelligence, a field that spent decades on the sidelines before today's explosion of activity. In Q2B interviews, several said they're working to avoid AI's early problem of being overhyped.
"Everyone talks about the AI winter," said Alex Keesling, CEO of quantum computer maker QuEra. "What did we learn? People are trying to adjust their messaging...so that we avoid something like the AI winter with inflated expectations."
Those quantum computing applications emerged over and over at Q2B, a conference organized by quantum computing software and services company QC Ware. Although quantum computers can handle only simple test versions of those examples so far, big companies like JP Morgan Chase, Ford Motor Co., Airbus, BMW, Novo Nordisk, Hyundai and BP are investing in R&D teams and proof-of-concept projects to pave the way.
The corporate efforts typically are paired with hardware and software efforts from startups and big companies like IBM, Google, Amazon, Microsoft and Intel with big bets on quantum computing. Underpinning the work is government funding for quantum computing research in the US, France, Germany, China, Australia and other countries.
While conventional computers perform operations on bits that represent either one or zero, quantum computers' fundamental data-processing element, called the qubit, is very different. Qubits can record combinations of zeros and ones through a concept called superposition. And thanks to a phenomenon called entanglement, they can be linked together to accommodate vastly more computing states than classical bits can store at once.
The problem with today's quantum computers is the limited number of qubits -- 433 in IBM's latest Osprey quantum computer -- and their flakiness. Qubits are easily disturbed, spoiling calculations and therefore limiting the number of possible operations. On the most stable quantum computers, there's still a better than one in 1,000 chance a single operation will produce the wrong results, an error rate that's disgracefully high compared with conventional computers. Quantum computing calculations typically are run over and over many times to obtain a statistically useful result.
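A back-of-the-envelope sketch shows why such error rates limit circuit depth. Applying the one-in-1,000 figure above per operation (an assumed, simplified noise model):

```python
# If each operation succeeds with probability 0.999, a circuit of
# depth d runs error-free with probability 0.999**d.
for depth in (10, 100, 1_000, 10_000):
    print(f"{depth:>6} operations -> {0.999**depth:.3f} chance of an error-free run")
```

At a few thousand operations, the chance of a clean run collapses toward zero, which is why results are accumulated statistically over many repeated runs.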
Today's machines are members of the NISQ era: noisy intermediate-scale quantum computers. It's still not clear whether such machines will ever be good enough for work beyond tests and prototyping.
But all quantum computer makers are headed toward a rosier "fault-tolerant" era in which qubits are better stabilized and ganged together into long-lived "logical" qubits that fix errors to persist longer. That's when the true quantum computing benefits arrive, likely five or more years from now.
Quantum computing faces plenty of challenges on the way to maturity. One of them is hype.
Google captured attention with its "quantum supremacy" announcement in 2019, in which its machine outpaced conventional computers on an academic task that didn't actually accomplish useful work. John Preskill, a Caltech physicist who's long championed quantum computing, has warned repeatedly about hype. Nowadays, companies are focused on a more pragmatic "quantum advantage" goal of beating a conventional computer on a real-world computing challenge.
The technology could be big and disruptive, and that piqued the interest of investors. Over the past 14 months, three quantum computer makers took their companies to the public markets, taking the faster SPAC, or special purpose acquisition company, route rather than a traditional initial public offering.
First was IonQ in October 2021, followed by Rigetti Computing in March and D-Wave Systems in August.
The markets have been unkind to technology companies in recent months, though. IonQ is trading at half its debut price, and D-Wave has dropped about three quarters. Rigetti, trading at about a 10th of its initial price, is losing its founding CEO on Thursday.
Although quantum computer startups haven't failed, some mergers indicate that prospects are rosier if teams band together. Among others, Honeywell Quantum Solutions merged with Cambridge Quantum to form Quantinuum in 2021; Pasqal merged with Qu&Co in 2022; and ColdQuanta -- newly renamed Infleqtion -- acquired Super.tech.
But the reality is that quantum computing hype isn't generally rampant. Over and over at Q2B, quantum computing advocates showed themselves to be measured in their predictions and guarded about promising imminent breakthroughs. Comments that quantum computing will be "bigger than fire" are the exception, not the rule.
Instead, advocates prefer to point to a reasonable track record of steady progress. Quantum computer makers have gradually increased the scale of quantum computers, improved their software and decreased the qubit-perturbing noise that derails calculations. The race to build a quantum computer is balanced against patience and technology road maps that stretch years into the future.
For example, Google achieved its first error correction milestone in 2022, expects its next in 2025 or so, then has two more milestones on its road map before it plans to deliver a truly powerful quantum computer in 2029. Other roadmaps from companies like Quantinuum and IBM are equally detailed.
And new quantum computing efforts keep cropping up. Cloud computing powerhouse Amazon, which started its Braket service with access to others' quantum computers, is now at work on its own machines too. At Q2B, the Novo Nordisk Foundation -- with funding from its Novo Nordisk pharmaceutical company -- announced a plan to fund a quantum computer for biosciences at the University of Copenhagen's Niels Bohr Institute in Denmark.
It's a long-term plan with an expectation that it'll be able to solve life sciences problems in 2035, said physicist Peter Krogstrup Jeppesen, who left a quantum computing research position at Microsoft to lead the effort.
"They really, really play the long game," said Cathal Mahon, scientific leader at the Novo Nordisk Foundation.
Some startups are seeing the frosty investment climate. Raising money today is more challenging, said Asif Sinay, chief executive of Qedma, whose error suppression technology is designed to help squeeze more power out of quantum computers. But he's more sanguine about the situation since he's not looking for investors right now.
Keeping up with technology roadmaps is critical for startups, said Duncan Stewart of the Business Development Bank of Canada, which has invested in quantum computing startups. One of them, Nord Quantique in Quebec, "will live or die based on whether they meet their technical milestones 18 months from now," he said.
But startup difficulties wouldn't cause a quantum winter, Quantinuum Chief Operating Officer Tony Uttley believes. Two scenarios that could trigger a winter, though, are if a big quantum computing company stopped its investments or if progress across the industry stalled, he said.
The quantum computing industry isn't putting all its eggs in one basket. Various designs include trapped ions, superconducting circuits, neutral atoms, electrons on semiconductors and photonic qubits.
"We are not close to a general purpose quantum computer that can perform commercially relevant problems," said Oskar Painter, a physicist leading Amazon Web Services' quantum hardware work. But even as a self-described cynical physicist, he said, "I'm very convinced we're going to get there. I do see the path to doing it."
Read the original post:
Quantum Computing Will Change Our Lives. But Be Patient, Please - CNET
Crossing the Quantum Frontier: ORNL, Quantum Science Center … – HPCwire
Posted: at 12:15 am
Dec. 19, 2022. Oak Ridge National Laboratory's next major computing achievement could open a new universe of scientific possibilities accelerated by the primal forces at the heart of matter and energy.
The world's first publicly revealed exascale supercomputer kicked off a new generation of computing in May 2022 when scientists at the U.S. Department of Energy's ORNL set a record for processing speed. As Frontier opens to full user operations, quantum computing researchers at ORNL and the DOE's Quantum Science Center, or QSC, continue working to integrate classical computing with quantum information science to develop the world's first functional quantum computer, which would use the laws of quantum mechanics to tackle challenges beyond even the fastest supercomputers in operation.
"We believe that quantum computers will be able to simulate quantum systems that are intractable to simulate with classical methods and thereby advance science that will be foundational for the future economy and national security of the U.S.," said Nick Peters, who leads ORNL's Quantum Information Science, or QIS, Section.
The year of that quantum milestone could be like none before, at least since 1947. That's when scientists at Bell Labs invented the transistor, the three-legged electronic semiconductor that ultimately replaced the cumbersome vacuum tubes relied on by computers of the previous generation. The leap in technology enabled the microchip, the electronic calculator and the computing revolution that followed.
Researchers believe they could be approaching a similar pivot point that would kick-start the quantum computing revolution and transform the world again this time with the potential for unprecedented computing horsepower and ultra-secure communications.
The DOE's Office of Science launched the QSC, a DOE National Quantum Information Science Research Center headquartered at ORNL, in 2020 in part to help speed toward those goals. The QSC combines resources and expertise from national laboratories, universities and industry partners, including ORNL, Los Alamos National Laboratory, Fermi National Accelerator Laboratory, Purdue University and Microsoft.
Any quantum revolution won't happen all at once.
"A lot of people anticipate we'll have a eureka moment when quantum computing takes over high-performance computing," said ORNL's Travis Humble, director of the QSC. "But real scientific progress usually happens slowly and incrementally, in stages you can measure over time. We may now be inching up on that tipping point when quantum computing offers an advantage and a quantum computer surpasses the classical computers we've relied on for so long.
"But it won't happen overnight, and it's going to take a lot of long, hard work."
The Quantum Shift
Quantum computing uses quantum bits, or qubits, to store and process quantum information. Qubits aren't like the bits used by classical computers, which can store only one of two potential values (0 or 1) per bit.
A qubit can exist in more than one state at a time by using quantum superposition, which allows combinations of distinct physical values to be encoded on a single object.
"Superposition is like spinning a coin on its edge," Peters said. "When it's spinning, the coin is neither heads nor tails."
A qubit stores information in a tangible degree of freedom, such as two possible frequency values. Superposition means the qubit, like the spinning coin, can exist in both frequencies at the same time. The superposition fixes the probability of measuring either of the two values, such as a coin's likelihood of landing on heads or tails.
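A minimal numpy sketch of that measurement rule, using an arbitrary 45/55 split for illustration: the squared magnitudes of the state's amplitudes give the outcome probabilities.

```python
import numpy as np

# A toy qubit state a|0> + b|1>; the squared magnitudes give the
# probabilities of each measurement outcome (they must sum to 1).
a, b = np.sqrt(0.45), np.sqrt(0.55)
state = np.array([a, b])
probs = np.abs(state) ** 2
print(probs)            # [0.45 0.55]
print(probs.sum())      # 1.0

# One measurement "slaps the coin": it returns a single classical bit.
outcome = np.random.choice([0, 1], p=probs)
print(outcome)
```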
The more qubits, the greater the possible superposition and degrees of freedom, for an exponentially larger quantum computational framework. That difference could fuel such innovations as vastly more powerful supercomputers, incredibly precise sensors and impenetrably secure communications.
But those superpowers come with a cost. Quantum superposition lasts only as long as a qubit remains unexamined. Only a finite amount of information can be extracted from a qubit once its measured.
"When you measure a qubit, you destroy the quantum state and convert it to a single bit of classical information," Peters said. "Think about the spinning coin. If you slap your hand down on the coin, it will be either heads or tails, so you get only one classical bit out of the measurement. The trick is to use qubits in the right way, such that the measurement turns into useful classical results."
Finding that trick could deliver huge payoffs. A quantum supercomputer, for example, could use the laws of quantum physics to probe fundamental questions of how matter and energy work, such as what makes certain materials act as superconductors of electricity. Questions like those have so far eluded the best efforts of scientists and existing supercomputing systems like Frontier, the first exascale supercomputer and fastest in the world, and its predecessors.
"A development like this would be such a shift as to be a new tool in the box that we theoretically could use to fix almost anything," Humble said.
But first scientists must answer basic questions about how to make that new tool work. A true quantum computer won't be like any computer that's ever come before.
"The great tension right now is this tightrope between quantum computing as an exciting new field of research and these tremendous technical challenges that we're not sure how to solve," said Ryan Bennink, who leads ORNL's Quantum Computational Science Group. "How do you even think about programming a quantum computer? Everything we know about programming is based on classical computers. That's why our understanding must be evolutionary. We're building on what others have done with quantum so far, one step at a time."
Those steps include projects supported by ORNL's Quantum Computing User Program, or QCUP. The program awards time on privately owned quantum processors around the country to support independent quantum study. The computers used aren't quite what quantum computing's advocates have in mind for the revolution.
"I wouldn't compare the quantum computers we have now with supercomputers," said Humble, who oversees QCUP. "These quantum computers are basically systems we experiment with to show how quantum mechanics can be used to perform simple calculations on test problems. Conventional computers can do most of these calculations easily. The researchers testing these machines are doing the best science to gain insight into how we can make quantum computing work for scientific discovery and innovation.
"For a future quantum supercomputer, we need a machine that meets a threshold of accuracy, reliability and sustainability that we just haven't seen yet."
Turning Down the Noise
The main obstacle for useful quantum simulations so far has been the relatively high error rate from noise degrading qubit quality. Those kinds of simulations won't be ready for prime time until scientists achieve the same level of real-world consistency and accuracy offered by standard supercomputers.
"Qubits acquire these errors just sitting there," Bennink said. "Every time we operate the quantum computer, we introduce error. When we read out the values generated by the calculations, we introduce more error. That doesn't mean the simulations are all wrong. We can perform some quantum simulations with an error rate of 1% per operation. But if we need to do 10,000 operations in a simulation, that's going to be more errors than we can fix. So right now, we're limited in the operations we can run before the amount of quantum noise renders the results useless. We need to get the error rate below a reliable threshold, preferably a tenth of a percent or lower."
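Bennink's arithmetic can be checked in a couple of lines (illustrative numbers only): the expected error count is just the operation count times the per-operation rate.

```python
# Back-of-the-envelope: expected errors = operations * per-operation rate.
operations = 10_000
for rate in (1e-2, 1e-3, 1e-4):
    print(f"rate {rate:.0e}: ~{operations * rate:.0f} expected errors in {operations} ops")
# Even at 0.1% per operation, 10,000 operations still accumulate ~10 errors,
# which is why the target thresholds keep being pushed lower.
```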
Researchers keep inching closer, study by study. A team led by the University of Chicago's Giulia Galli recently used an allocation from QCUP to simulate quantum spin defects in a crystal and balance the error rate to a level deemed acceptable for scientific use.
"The results were not perfect, but we were able to cut down the errors to such a point that the results became scientifically useful," Galli said.
Another QCUP study led by Argonne National Laboratory's Ruslan Shaydulin used quantum state tomography, which estimates the properties of a quantum state, to correct noise on a study using five qubits and reach a 23% improvement in quantum state fidelity.
"We achieved a much larger-scale validation on this hardware with more qubits than had been done before," Shaydulin said. "These results put us one step closer to realizing the potential of quantum computers."
As refinements continue, researchers suggest incorporating qubits into larger supercomputing systems might act as a bridge to fully quantum systems. That doesnt mean classical computers would go extinct.
"Ultimately, quantum computing will most likely become an essential element in high-performance computing, but it's unlikely to replace classical computing altogether," Peters said.
"I expect we'll see the rise of hybrid computing, where classical systems use quantum computing as an accelerator, similar to GPUs in a supercomputer like Frontier. I'd even expect hybrid systems to be the primary way we leverage the power of quantum computers as they mature. Algorithm-optimized quantum processors could help simulate parts of problems too challenging for purely classical machines until we find a seamless way to integrate both types of computing."
Connecting the Qubits
The next step from a true quantum computer would be a quantum network, and ultimately a quantum internet of such computers that would enable communication through qubits.
Efforts by quantum information scientists at ORNL seek to establish entanglement between remote quantum objects, a process that could be used for computing or for building quantum sensor networks. Along with classical communications, entanglement (in which two objects intertwine so closely that one can't be described independently of the other, no matter how far apart they are) enables distant users to move quantum information over a network by quantum teleportation, the transmission of a qubit from one place to another without physical travel through space.
"Entanglement is often the key resource needed to carry out a desired quantum application, and it needs to be done in an error-free or nearly error-free way," Peters said. "We tend to lose most of the qubits carrying this information as they're transmitted. That's a big challenge that requires us to develop quantum repeaters, which you could consider a special type of quantum computer, to correct for loss and other errors.
"Ultimately, we'll need to develop not just new technology but new concepts to make a quantum internet a reality, but entanglement is a necessary step."
Scientists at ORNL and the QSC have made cracking the code to that entanglement a top priority.
"We're not committed to a single type of quantum technology," Humble said. "We think there's value in variety. Two approaches have emerged as leading favorites so far, but we're open to all possibilities."
One approach focuses on harnessing dim beams of light to connect quantum machines for secure and lightning-fast communications.
A light particle, or photon, can exist in two frequencies at a time, like a qubit's ability to hold more than one value at once. Photons could be used as the vessels for encoding information that could then be transmitted across hundreds or thousands of miles (from a satellite to separate ground stations, for example) at the speed of light. A cryptographic key based on quantum-mechanical principles could be delivered and used to encrypt the messages for virtually unbreakable security.
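The sketch below is a purely classical toy model of this kind of quantum key exchange, loosely following the BB84 protocol's sift-on-matching-bases idea. All values are invented, and the physics that actually makes the scheme secure (an eavesdropper disturbing the photons) is not modelled:

```python
import random

n = 32  # number of photons sent (illustrative)

# Sender encodes random bits in randomly chosen bases (0 or 1).
bits = [random.randint(0, 1) for _ in range(n)]
send_bases = [random.randint(0, 1) for _ in range(n)]

# Receiver measures each photon in its own randomly chosen basis.
recv_bases = [random.randint(0, 1) for _ in range(n)]
# When bases match, the measured bit equals the sent bit; otherwise
# the outcome is random and the position is discarded ("sifting").
measured = [
    b if sb == rb else random.randint(0, 1)
    for b, sb, rb in zip(bits, send_bases, recv_bases)
]

key = [m for m, sb, rb in zip(measured, send_bases, recv_bases) if sb == rb]
print(f"shared key of {len(key)} bits:", "".join(map(str, key)))
```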
"We already know how to send these light particles over long distances (think about a TV or radio signal), so now we just need to figure out how to use their inherent properties to encode them and enable networked quantum computing," said Raphael Pooser, an ORNL quantum research scientist. "Photons are just pieces of electromagnetic field floating in space, like a pendulum swinging back and forth. That gives us good variables for computing because photons have an infinite number of possible values that would allow us to store large or small amounts of information."
Pairing Photons
As with most aspects of quantum computing, the theory's not easy to put into practice.
"The really difficult thing about photons is that they don't interact naturally," said Joe Lukens, senior director of quantum networking at Arizona State University and a frequent collaborator with ORNL. "As I've heard it stated simply, put two flashlights together, and you have two light beams, not a particle accelerator. The beams just fly past each other. From a computing perspective, you want your qubits to have a high degree of interaction to achieve that necessary entanglement."
ORNL's photonics researchers could be closing in on a way to bring that vision to life. The approach, known as frequency bin coding, focuses on using pulse shapers, which manipulate the frequencies of light waves, and phase modulators, which manipulate photonic oscillation cycles, to encode and entangle particles, imprint them on light beams and then transmit them over optical fiber.
A 2020 experiment by Lukens and fellow quantum researchers at Purdue University demonstrated the approach could be used to control frequency bin qubits in an arbitrary manner, laying the groundwork for the types of quantum operations needed in quantum networking.
"That's the basic building block of a quantum computing network," Lukens said. "If we put a pulse shaper and phase modulator back to back, in principle we could build any kind of quantum gate for a universal quantum computer."
A 2021 study led by ORNL successfully used photons to share entanglement among three quantum nodes in separate buildings linked by a quantum local area network.
"Now we need to figure out how to scale up," Lukens said. "There are still a lot of questions about the best path to a quantum network, but I think frequency-based photonics has a good shot."
Promising Platforms
The other main target of ORNL's quantum simulation research focuses on trapping and controlling ions, atoms charged by a loss of electrons. Each ion carries a positive charge that can be used to move the ion around in a radio-frequency trap. The quantum state of the ion can be controlled for quantum applications through such means as lasers and microwaves.
"One of the advantages of working with trapped ions is they're natural qubits," said Chris Seck, an ORNL quantum research scientist leading the ion-trap effort. "Each trapped ion of a specific species is identical (in the same environment), and the physics of trapping and manipulating their quantum states has been well understood for decades. That's part of what makes this such a promising platform."
ORNL has invested more than $3 million in its ion-trap efforts so far, mainly through the QSC.
"We're still starting up, and as with any new effort, especially one started just before the COVID-19 pandemic, there have been growing pains," Seck said. "We're excited about the possibilities for further exploration."
ORNL continues to expand its quantum efforts, including the creation of the QIS Section in 2021.
"The QIS section is home to ORNL's research groups devoted to developing the tools and techniques for quantum sensing, computing and networking," Peters said. "The QIS staff collaborate broadly across ORNL, in the region and across the U.S."
Researchers can't predict which strategy might lead to that quantum watershed moment or when it might come. Industry partners of the QSC have taken up other approaches. Discoveries could lead in directions yet to be considered.
"We're learning more every day about what works and what doesn't," Humble said. "It's akin to the late 1940s of computing, when the invention of the transistor didn't bring a digital revolution overnight. It was another decade before the invention of the microchip and even longer before we saw the rise of modern computers, cellphones and the internet. So we're prepared for a sustained commitment to develop quantum computing and the remarkable opportunities it affords."
The QSC, a DOE National Quantum Information Science Research Center led by ORNL, performs cutting-edge research at national laboratories, universities and industry partners to overcome key roadblocks in quantum state resilience, controllability, and ultimately the scalability of quantum technologies. QSC researchers are designing materials that enable topological quantum computing; implementing new quantum sensors to characterize topological states and detect dark matter; and designing quantum algorithms and simulations to provide a greater understanding of quantum materials, chemistry, and quantum field theories. These innovations enable the QSC to accelerate information processing, explore the previously unmeasurable, and better predict quantum performance across technologies. For more information, visit qscience.org.
UT-Battelle manages ORNL for the Department of Energy's Office of Science, the single largest supporter of basic research in the physical sciences in the United States. The Office of Science is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.
Source: ORNL
Link:
Crossing the Quantum Frontier: ORNL, Quantum Science Center ... - HPCwire
IBM and Algorithmiq Pushing AI Quantum Computing for Health Care – Datamation
Posted: at 12:15 am
IBM is one of the companies most focused on quantum computing and general artificial intelligence (AI). The advances made by IBM's Watson platform and the quantum computing team out of IBM Research are proof of that leadership.
IBM recently announced the massive Osprey, which is one of the most advanced quantum computers in the world. IBM also announced a partnership with Algorithmiq out of Finland that is developing a quantum simulation platform focused initially on health care and materials science.
The interesting result should be a significant improvement in related drug-discovery efforts that, given quantum computing's massive performance advantage on huge datasets, should help advance new drug development while significantly lowering side effects in the finished drugs.
The same problem that has plagued these efforts in the past, access to data, particularly from research hospitals, hasn't been fully mitigated. But federated and synthesized data efforts are slowly beginning to close those gaps and create the potential for that data to be available once a fully capable quantum computer can be spun up to the task.
Let's talk about quantum computers and how they could significantly change the world, and particularly health care, this week:
The first time I was introduced to IBM's Watson platform, it was focused almost exclusively on the medical industry. The M.D. who briefed me shared that, once that old instance of Watson had been trained, he entered a series of symptoms from a woman he'd worked with for years to identify her illness. It had taken him around three years of focused research to identify a list of potential illnesses.
In short, even though this was a rudimentary form of Watson at the time, it changed a multi-year process into one that could arguably have been done in minutes. For many patients, it could cure an illness that might never have been diagnosed, given how much effort that diagnosis would have required.
Medical AIs require massive amounts of data to do their job, because they have to focus on the deep learning (DL) side of AI, given the high variability of both people and illnesses. Side effects, unintended adverse consequences like addiction, and cost are all part of any effort to find an ideal medication to address a new or existing illness. Once mature and at sufficient size and scale, quantum computers will be able to deal with datasets far larger than we can realistically handle today, at speeds that today's conventional computers can't touch.
This quantum capability should give IBM a significant edge in a market where these massive datasets and fast results are required and make IBMs recent partnership with Algorithmiq critical to the successful future of the AI effort. In short, our ability to deal with a pandemic more effectively will likely be impacted by how mature this joint venture between the two companies is at that time. Once mature, it could be a medical game changer when it comes to developing better, safer, and more trustworthy medications.
IBM's leadership in AI and quantum computing was highlighted both by the announcement of the powerful quantum computer and the announcement that Finland's Algorithmiq would be partnering with IBM on drug discovery.
The combination of these two announcements showcases the very real near-term potential benefits for AI and quantum computing. Sometimes, having the right partner can lead to truly world-changing efforts. Finding a faster, better way to discover medications would go a long way to assuring longer lives and lowering our medical expenses over time.
Excerpt from:
IBM and Algorithmiq Pushing AI Quantum Computing for Health Care - Datamation
Classical vs. quantum computing: What are the differences? – TechTarget
Posted: at 12:15 am
As new technologies develop and gain traction, the public tends to divide into two groups: those who believe it will make an impact and grow, and those who don't. The former tends to be correct, so it is crucial to understand how future technologies differ from the status quo to prepare for their adoption en masse.
Classical computing has been the norm for decades, but in recent years, quantum computing has continued to rapidly develop. The technology is still in its early stages, but has existing and many more potential uses in AI/ML, cybersecurity, modeling and other applications.
It might be years before widespread implementation of quantum computing. However, exploring the differences between classical and quantum computing now will provide a useful grounding should the technology become more widespread.
Quantum computers typically must operate under more regulated physical conditions than classical computers because of quantum mechanics. Classical computers have less compute power than quantum computers and cannot scale as easily. They also use different units of data -- classical computers use bits and quantum computers use qubits.
In classical computers, data is processed in a binary manner.
Classical computers use bits (eight bits make up one byte) as their basic unit of data. Classical computers write code in a binary manner as a 1 or a 0. Simply put, these 1s and 0s indicate the state of on or off, respectively. They can also indicate true or false, or yes or no, for example.
This is known as serial processing, which is successive in nature, meaning one operation must complete before another one begins. Lots of computing systems use parallel processing, an expansion of classical processing, which can perform simultaneous computing tasks. Classical computers also return one result, because bits of 1s and 0s are repeatable due to their binary nature.
Quantum computing, however, follows a different set of rules. Quantum computers use qubits as their unit of data. Qubits, unlike bits, can be a value of 1 or 0, but can also be 1 and 0 at the same time, existing in multiple states at once. This is known as superposition, where properties are not defined until they are measured.
According to IBM, "Groups of qubits in superposition can create complex, multidimensional computational spaces," which enables more complex computations. When qubits become entangled, changes to one qubit directly affect the other, which makes information transfer between qubits much faster.
In classical computers, algorithms need a lot of parallel computations to solve problems. Quantum computers can account for multiple outcomes when they analyze data with a large set of constraints. The outputs have an associated probability, and quantum computers can perform more difficult compute tasks than classical computers can.
Most classical computers operate on Boolean logic and algebra, and power increases linearly with the number of transistors in the system -- the 1s and 0s. The direct relationship means in a classical computer, power increases 1:1 in tandem with the transistors in the system.
Because quantum computers' qubits can represent a 1 and 0 at the same time, a quantum computer's power increases exponentially in relation to the number of qubits. Because of superposition, the number of computations a quantum computer could perform is 2^N, where N is the number of qubits.
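A short loop makes the linear-versus-exponential contrast concrete (illustrative only; "classical scale" here stands in for the roughly 1:1 growth described above):

```python
# Power scaling: roughly linear in transistor count vs exponential
# in qubit count (2**N basis states for N qubits).
for n in (8, 16, 32, 64):
    print(f"N = {n:>2}: classical scale ~ {n}, quantum state space = {2**n:,}")
```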
Classical computers are well-suited for everyday use and normal conditions. Consider something as simple as a standard laptop. Most people can take their computer out of their briefcase and use it in an air-conditioned café or on the porch during a sunny summer day. In these environments, performance won't take a hit for normal uses like web browsing and sending emails over short periods of time.
Data centers and larger computing systems are more complex and sensitive to temperature, but still operate within what most people would consider "reasonable" temperatures, such as room temperature. For example, ASHRAE recommends A1 to A4 class hardware stays at 18 to 27 degrees Celsius, or 64.4 to 80.6 degrees Fahrenheit.
Some quantum computers, however, need to reside in heavily regulated and stringent physical environments. Some need to be kept near absolute zero, which is -273.15 degrees Celsius or -459.67 degrees Fahrenheit, although Quantum Brilliance recently developed the first room-temperature quantum computer.
The reason for the cold operating environments is that qubits are extremely sensitive to mechanical and thermal influences. Disturbances can cause the atoms to lose their quantum coherence -- essentially, the ability for the qubit to represent both a 1 and a 0 -- which can cause errors to computations.
Like most technologies, quantum computing poses opportunities and risks. While it might be a while before quantum computers really take off, start to have conversations with leadership and develop plans for quantum computing.
Organizations that don't plan on implementing quantum computing in their own business will still need to prepare for the external threats quantum computing might impose. Firstly, quantum computers can potentially crack even the most powerful and advanced security measures. For example, a motivated enough hacker could, in theory, use quantum computing to quickly break the cryptographic keys commonly used in encryption.
In addition, organizations that are considering quantum computers for their data centers or certain applications will have to prepare facilities. Like any other piece of infrastructure, quantum computers need space, electricity supply and resources to operate. Begin examining the options available to accommodate them. Look at budget, space, facility and staffing needs to begin planning.
See the article here:
Classical vs. quantum computing: What are the differences? - TechTarget
The state of quantum computing in 2023 – Verdict
Posted: at 12:15 am
The quantum computing market is expected to reach between $1 billion and $5 billion by 2025, according to GlobalData's Tech, Media and Telecom Predictions 2023 report. Quantum computing uses principles of quantum physics to store and compute data. Superposition describes the ability of a quantum bit (qubit) to exist in an on and off state simultaneously. Qubits must be isolated from their external environment to achieve a state of superposition known as coherence. This is where much of the scientific and technological challenge exists, as qubits often decohere before the computation is completed. Current quantum computers (QCs) are said to be in the noisy intermediate-scale quantum (NISQ) stage of development.
IBM is a world leader in quantum computing research and development. In November 2022, it unveiled its latest QC, Osprey. Boasting 433 qubits, Osprey is currently the highest-qubit-count QC in the world, more than triple IBM's previous record of the 127-qubit Eagle QC. IBM is on track to develop 4,000-qubit QCs by 2025, with a roadmap of 1,121 qubits in 2023 (Condor QC) and 1,386 qubits in 2024 (Flamingo QC). GlobalData predicts that full-scale commercial quantum computing will likely begin in 2027.
Big Tech firms have begun offering quantum-as-a-service (QaaS), notably Microsoft's Azure Quantum, which provides users with access to hardware from other companies such as IonQ, Toshiba, and Honeywell's Quantinuum. The QaaS market will continue to grow as more companies invest in quantum.
JP Morgan, Volkswagen, and Lockheed Martin are already investing in their quantum infrastructure in preparation for widespread adoption. These companies are well-positioned to benefit from a quantum advantage in financial modeling, process optimization, cybersecurity, and military research and development.
2023 will see the quantum capabilities gap continue to narrow between the US and China as the tech war intensifies. Of the 62 quantum computing start-ups listed on GlobalData's companies database, 29% were headquartered in the US, followed by the UK (13%) and China (10%). These countries will become hubs of activity in quantum computing, attracting both domestic and international investment.
Quantum computing is the latest arena of competition between the US and China as both countries strive for technological supremacy. Major US tech firms currently dominate quantum computing. In China, Alibaba leads ahead of Huawei and Baidu. Alibaba is at the forefront of China's quantum strategy; it has invested $15 billion into the DAMO science and technology research center and partnered with the Chinese Academy of Sciences.
Though currently estimated to be lagging five years behind the US in quantum computing, China surpasses the US in quantum satellite communications. The Chinese government has invested $10 billion into the construction of the National Laboratory for Quantum Information Science. When completed, its research focus will be on quantum technologies with direct military application. China also benefits from an autocratic economic model: it can pool resources from institutions, corporations, and the government, working collectively to achieve a single aim.
In contrast, US tech firms compete against each other. The US government is increasingly involving itself in quantum development. The US CHIPS and Science Act, which was signed into law in August 2022, details $153 million of domestic funding to support US quantum computing initiatives, including discovery, infrastructure, and workforce. This support package will be implemented between 2023 and 2027.
Read more here:
U Toronto and Fujitsu team use quantum-inspired computing to … – Green Car Congress
Posted: at 12:15 am
Researchers from the University of Toronto's Faculty of Applied Science & Engineering and Fujitsu have applied quantum-inspired computing to find the promising, previously unexplored chemical family of Ru-Cr-Mn-Sb-O2 as acidic oxygen evolution reaction catalysts for hydrogen production.
The best catalyst shows a mass activity eight times higher than state-of-the-art RuO2 and maintains performance for 180 h. A paper on their work appears in the journal Matter.
Scaling up the production of what we call green hydrogen is a priority for researchers around the world because it offers a carbon-free way to store electricity from any source. This work provides proof-of-concept for a new approach to overcoming one of the key remaining challenges, which is the lack of highly active catalyst materials to speed up the critical reactions.
Ted Sargent, senior author
Nearly all commercial hydrogen is produced from natural gas. The process produces carbon dioxide as a byproduct; if the CO2 is vented to the atmosphere, the product is known as grey hydrogen, but if the CO2 is captured and stored, it is called blue hydrogen. Green hydrogen is a carbon-free method that uses an electrolyzer to split water into hydrogen and oxygen gas. The low efficiency of available electrolyzers means that most of the energy in the water-splitting step is wasted as heat, rather than being captured in the hydrogen.
Researchers around the world are striving to find better catalyst materials that can improve this efficiency. Because each potential catalyst material can be made of several different chemical elements, combined in a variety of ways, the number of possible permutations quickly becomes overwhelming.
One way to do it is by human intuition, by researching what materials other groups have made and trying something similar, but that's pretty slow. Another way is to use a computer model to simulate the chemical properties of all the potential materials we might try, starting from first principles. But in this case, the calculations get really complex, and the computational power needed to run the model becomes enormous.
Jehad Abed, co-lead author
To find a way through, the team turned to the emerging field of quantum-inspired computing. They made use of the Digital Annealer, a tool that was created as the result of a long-standing collaboration between U of T Engineering and Fujitsu Research. This collaboration has also resulted in the creation of the Fujitsu Co-Creation Research Laboratory at the University of Toronto.
Digital Annealer (DA) is a computer architecture developed to solve large-scale combinatorial optimization problems rapidly using CMOS digital technology. DA is unique in that it uses a digital circuit design inspired by quantum phenomena and can solve problems that are very difficult and time-consuming or even impossible for classical computers to address.
Digital Annealer is inspired by quantum mechanics but, unlike quantum computers, does not require cryogenic temperatures. DA makes use of a method called annealing, named after the annealing process used in metallurgy: metal is heated to a high temperature and then slowly cooled, so that its structure settles into a lower-energy, more stable state.
Using the analogy of placing blocks in a box, in the classical computational approach, the blocks are placed in sequence. If a solution is not found, the process is restarted and repeated until a solution is found. With the annealing approach, the blocks are placed randomly and the entire system is shaken. As the shaking is gradually reduced, the system becomes more stable as the shapes quickly fit together.
DA is designed to solve fully connected quadratic unconstrained binary optimization (QUBO) problems and is implemented on CMOS hardware. The second-generation Digital Annealer expands the scale of problems that can be solved from the 1,024 bits of the first generation, launched in May 2018, to 8,192 bits and an increase in computational precision.
This leads to substantial gains in precision and performance for enhanced problem-solving and new applications, expanding by a factor of one hundred the complexity that the second-generation Digital Annealer can tackle now. Its algorithm is based on simulated annealing, but also takes advantage of massive parallelization enabled by the custom application-specific CMOS hardware.
The Digital Annealer is a hybrid of unique hardware and software designed to be highly efficient at solving combinatorial optimization problems. These problems include finding the most efficient route between multiple locations across a transportation network, or selecting a set of stocks to make up a balanced portfolio. Searching through different combinations of chemical elements to find a catalyst with desired properties is another example, and it was a perfect challenge for our Digital Annealer to address.
Hidetoshi Matsumura, senior researcher at Fujitsu Consulting (Canada)
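As an illustration of the class of problem the Digital Annealer targets, here is a plain simulated-annealing sketch for a small QUBO in Python. The matrix and cooling schedule are invented for illustration; this runs on an ordinary CPU and does not use Fujitsu's hardware or SDK:

```python
import math
import random

# Toy QUBO: minimize x^T Q x over binary vectors x. The 4x4 matrix
# below is made up purely for illustration.
Q = [
    [-1,  2,  0,  0],
    [ 0, -1,  2,  0],
    [ 0,  0, -1,  2],
    [ 0,  0,  0, -1],
]
n = len(Q)

def energy(x):
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

x = [random.randint(0, 1) for _ in range(n)]
best, best_energy = x.copy(), energy(x)
temperature = 2.0
for _ in range(5_000):
    candidate = x.copy()
    candidate[random.randrange(n)] ^= 1   # propose flipping one bit
    delta = energy(candidate) - energy(x)
    # Accept downhill moves always; accept uphill moves with a probability
    # that shrinks as the "temperature" anneals toward zero.
    if delta <= 0 or random.random() < math.exp(-delta / temperature):
        x = candidate
        if energy(x) < best_energy:
            best, best_energy = x.copy(), energy(x)
    temperature *= 0.999
print("best assignment:", best, "energy:", best_energy)
```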
In the paper, the researchers used a technique called cluster expansion to analyze an enormous number of potential catalyst material designs; they estimate the total as a number on the order of hundreds of quadrillions. For perspective, one quadrillion is approximately the number of seconds that would pass by in 32 million years.
Quantum annealers and similar quantum-inspired optimizers have the potential to provide accelerated computation for certain combinatorial optimization challenges. However, they have not been exploited for materials discovery because of the absence of compatible optimization mapping methods. Here, by combining cluster expansion with a quantum-inspired superposition technique, we lever quantum annealers in chemical space exploration for the first time. This approach enables us to accelerate the search of materials with desirable properties 10-50 times faster than genetic algorithms and Bayesian optimizations, with a significant improvement in ground state prediction accuracy.
Choubisa et al.
The results pointed toward a promising family of materials composed of ruthenium, chromium, manganese, antimony and oxygen, which had not been previously explored by other research groups.
The team synthesized several of these compounds and found that the best of them demonstrated a mass activity that was approximately eight times higher than some of the best catalysts currently available.
The new catalyst has other advantages too: it operates well in acidic conditions, which is a requirement of state-of-the-art electrolyzer designs. Currently, these electrolyzers depend on catalysts made largely of iridium, which is a rare element that is costly to obtain. In comparison, ruthenium, the main component of the new catalyst, is more abundant and has a lower market price.
The team aims to further optimize the stability of the new catalyst before it can be tested in an electrolyzer. Still, the latest work serves as a demonstration of the effectiveness of the new approach to searching chemical space.
I think what's exciting about this project is that it shows how you can solve really complex and important problems by combining expertise from different fields. For a long time, materials scientists have been looking for these more efficient catalysts, and computational scientists have been designing more efficient algorithms, but the two efforts have been disconnected. When we brought them together, we were able to find a promising solution very quickly. I think there are a lot more useful discoveries to be made this way.
Hitarth Choubisa, co-lead author
Resources
Hitarth Choubisa, Jehad Abed, Douglas Mendoza, Hidetoshi Matsumura, Masahiko Sugimura, Zhenpeng Yao, Ziyun Wang, Brandon R. Sutherland, Alán Aspuru-Guzik, Edward H. Sargent (2022) "Accelerated chemical space search using a quantum-inspired cluster expansion approach," Matter. doi: 10.1016/j.matt.2022.11.031
Originally posted here:
U Toronto and Fujitsu team use quantum-inspired computing to ... - Green Car Congress
Europe’s first-ever exascale supercomputer will launch in Germany … – TNW
Posted: at 12:15 am
JUPITER is set to become the first European supercomputer to make the leap into the exascale era. This means, itll be capable of performing more than an exaflop (or 1 quintillion) operations per second. In other words, the devices computing power willsurpassthat of 5 million laptops or PCs combined.
The European High Performance Computing Joint Undertaking (EuroHPC JU), which is behind the project, has now signed a hosting agreement with the Jülich Supercomputing Centre (JSC) in Germany, where JUPITER will be located.
Under the terms of the agreement, JUPITER (which stands for Joint Undertaking Pioneer for Innovative and Transformative Exascale Research) will be installed on the campus of the Forschungszentrum Jülich research institute in 2023. The machine will be operated by the JSC.
This new supercomputer will be backed by a €500 million budget, split equally between the EuroHPC JU and German federal and state sources.
JUPITER's remarkable power will support the development of high-precision models of complex systems. The machine will be used to analyse key societal issues in Europe, such as health, biology, climate, energy, security, and materials. It will also support intensive use of AI and analysis of enormous data volumes.
Experts expect the computer to improve research quality (while reducing costs), and integrate future technologies such as quantum computing. The device will be available to a wide range of European users in the scientific community, industry, and public sector.
Along with its outstanding computing power, JUPITER will feature a dynamic, modular architecture, which will enable optimal use of the various computing modules used during complex simulations. Notably, JUPITER has been designed as a green supercomputer and will be powered by green electricity, supported by a warm water cooling system. At the same time, its average power consumption is anticipated to be up to 15 megawatts, approximately six megawatts less than the US Frontier exascale supercomputer.
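Taken at face value, those figures imply an energy efficiency of roughly 67 gigaflops per watt; a rough check (assuming the headline one-exaflop rate at the quoted 15 megawatts):

flops = 1e18   # one exaflop, the machine's headline rate
watts = 15e6   # the anticipated average power draw
print(round(flops / watts / 1e9, 1), "gigaflops per watt")  # 66.7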
Upon completion, JUPITER will become the ninth (and best) supercomputer the EuroHPC JU has provided to Europe. Three are expected to be available shortly, and five are already operational. Among them is LUMI, which has been ranked the fastest in the EU and third fastest in the world.
Read more here:
Europe's first-ever exascale supercomputer will launch in Germany ... - TNW
2022 Year in Review – Caltech
Posted: at 12:15 am
As the end of 2022 quickly approaches, Caltech News looks back at our coverage of the research, discoveries, events, and experiences that shaped the Institute. Here are some highlights.
Caltech researchers used data gathered both in space by the Mars Reconnaissance Orbiter (MRO) and on the ground by the Mars Perseverance Rover to continue to probe the Red Planet's past and any potential signs it was previously hospitable to life. In January, MRO survey data revealed that liquid water was on Mars about a billion years longer than suspected. Meanwhile, Perseverance made a beeline across the floor of Jezero Crater during spring 2022, arriving at an ancient river delta in April. The delta is thought to be one of the best possible places to search for past signs of life; there, Perseverance found signs of past water along with evidence of possible organic compounds in the igneous rocks on the crater floor. After a few months at the delta, Perseverance project scientist Ken Farley announced in September the discovery of a class of organic molecules in two samples of mudstone rock collected from a feature called Wildcat Ridge. While these organic molecules can be produced through nonliving chemical processes, some of the molecules themselves are among the building blocks of life.
Not all eyes aimed toward space are set on Mars, however. New instruments and surveys provided insights related to other celestial bodies in our Milky Way galaxy, such as asteroids, and helped discover an abundance of planets outside of our solar system.
In March, the NASA Exoplanet Archive, an official catalog for exoplanets (planets that circle stars beyond our sun) housed at Caltech's IPAC astronomy center, officially hit a new milestone: 5,000 exoplanets.
Looking even farther out into the universe from planet Earth, Caltech researchers made several discoveries, including a tight-knit pair of supermassive black holes locked in an epic waltz, and a new "black widow" star system, spotted by the Zwicky Transient Facility (ZTF), in which a rapidly spinning dead star called a pulsar is slowly evaporating its companion.
Caltech's ZTF sky survey instrument, based at Palomar Observatory, had previously discovered the first known asteroid to circle entirely within the orbit of Venus. To honor the Pauma band of Indigenous peoples whose ancestral lands include Palomar Mountain, the ZTF team asked the band to name the asteroid. They chose 'Ayló'chaxnim, which means "Venus girl" in their native language of Luiseño.
And far closer to home, new faculty member and historian Lisa Ruth Rand set her sights on the debris we have left in Earth's orbit (and beyond), and what it can tell us about humanity and our evolving relationship with space.
Caltech astronomers continue to lead the way in the development of ever more powerful instruments for answering fundamental questions about our place in the universe. The new Keck Planet Finder, led by astronomer Andrew Howard, will take advantage of the W. M. Keck Observatory's giant telescopes to search for and characterize hundreds, and ultimately, thousands of exoplanets, including Earth-size planets that may harbor conditions suitable for life.
NASA has also selected the UltraViolet EXplorer (UVEX) proposal, led by astronomer Fiona Harrison, for further study. If selected to become a mission, UVEX would conduct a deep survey of the whole sky in ultraviolet light to provide new insights into galaxy evolution and the life cycle of stars. Harrison's current NASA mission, NuSTAR (Nuclear Spectroscopic Telescope Array), an X-ray telescope that hunts black holes, celebrated 10 years in space. Meanwhile, the development of NASA's SPHEREx (Spectro-Photometer for the History of the Universe, Epoch of Reionization and Ices Explorer), led by astronomer Jamie Bock, is forging ahead with a customized test chamber delivered this year to Caltech.
As new telescopes continue to come together, a venerable Caltech telescope is being taken apart atop Maunakea in Hawaii. The Caltech Submillimeter Observatory (CSO) received the final permits to begin its decommissioning process. Scientists plan to ultimately repurpose the telescope and put it back together in Chile.
Caltech's fundamental quest for understanding life and our origins also inspires many research efforts and innovations with the potential to improve human health and well-being.
Continuing work that began with the COVID-19 pandemic, Pamela Björkman and colleagues developed a new type of vaccine that protects against the virus that causes COVID-19 and closely related viruses, while Sarkis Mazmanian has shown how an imbalance of gut microbes can cause binge eating. Meanwhile, other researchers made real what would have seemed like science fiction only a few years ago: Caltech medical engineer Wei Gao created an artificial skin for robots that interfaces with human skin and allows a human operator to "feel" what the robot is sensing; chemical engineer Mikhail Shapiro engineered a strain of remote-controlled bacteria that seek out tumors inside the human body to deliver targeted drugs on command; and neuroscientist Richard Andersen and colleagues developed a brain-machine interface that can read a person's brain activity and translate it into the words the person was thinking, technology that may one day allow people with full-body paralysis to speak. Additionally, Caltech researchers created a "synthetic" mouse embryo, complete with brain and beating heart; completed a 20-year quest to decode one of the most complex and important pieces of machinery in our cells; and discovered how fruit flies' extremely sensitive noses help them find food.
In 2022, Caltech paid tribute to its long history of advances in sustainability and then looked forward to pioneering new initiatives and technologies that will reduce humanity's footprint on Earth's fragile environment. Through the newly launched Caltech Heritage Project, a series of oral histories published this year captured the pivotal role Caltech alumni played in the electric car revolution. Meanwhile, in April, Caltech hosted the Caltech Energy 10 (CE10) conference, bringing thought leaders to campus to chart a path toward achieving the Biden administration's stated goal to cut U.S. global warming gas emissions by 50 percent within the next 10 years.
Caltech researchers continue to contribute to research to generate cleaner energy, ranging from work in the laboratory of John Dabiri (MS '03, PhD '05) to optimize wind farms to efforts to create and commercialize technology for capturing carbon already released into the atmosphere (which earned a Caltech-based startup an XPrize Award).
On campus, Caltech began construction of the Resnick Sustainability Center, scheduled to open in 2024, which will bring together talent from across campus to tackle issues related to climate change and other human impacts on the natural environment. And as the year wraps up, the Space-based Solar Power Project is preparing to launch a demonstration into space to test three key elements of its ambitious plan to harvest solar energy in space (where there are no cloudy days) and beam it wirelessly down to Earth.
As the AI4Science Initiative continually demonstrates and the Caltech Science Exchange recently highlighted, artificial intelligence (AI) and machine learning (ML) have applications that reach every corner of campus. In 2022, AI was used to generate the first-ever picture of the black hole at the center of our own galaxy (only the second image of a black hole ever created), to pave the way to improve aircraft design, to help drones fly autonomously in real-weather conditions, and to fight COVID-19. This election year, researchers from Caltech discussed how machine learning can both combat misinformation and fight online bullying.
Caltech continues its role as a major hub of quantum research. The newly announced Dr. Allen and Charlotte Ginsburg Center for Quantum Precision Measurement will unite a diverse community of theorists and experimentalists devoted to understanding quantum systems and their potential uses (see a video about the new center). The 25th annual Conference on Quantum Information Processing, or QIP, the world's largest gathering of researchers in the field of quantum information, a discipline that unites quantum physics and computer science, was held in Pasadena for the first time and represented the first major collaboration between Caltech and the new AWS Center for Quantum Computing on campus.
Fundamental research in the quantum sciences charged ahead, with findings that included a quantum computer-based experiment to test theoretical wormholes and new demonstrations showing how graphene can be used in flexible and wearable electronics.
This year, members of the Caltech community received recognition for expanding the boundaries of scientific knowledge, but also for humanitarian endeavors and for blazing new educational and occupational paths for others to follow.
In March, Roman Korol, a Caltech graduate student, launched a project to collect and distribute humanitarian aid for families affected by the war in Ukraine.
In April, Jessica Watkins, who worked on the Mars Curiosity rover mission while a postdoc at Caltech, made history as the first Black woman on the International Space Station. From space, she hosted a live Q&A for Caltech students and faculty in Ramo Auditorium and reviewed a paper describing how geology on Mars works in dramatically different ways than on Earth.
In May, alumna Laurie Leshin (MS '89, PhD '95) assumed leadership of JPL, becoming its first female director.
In June, Carver Mead (BS '56, MS '57, PhD '60), one of the fathers of modern computing, received the 2022 Kyoto Prize for leading contributions to the establishment of the guiding principles for very large-scale integration systems design, which enables the basis for integrated computer circuits.
In October, Caltech alumnus John Clauser (BS '64) shared the 2022 Nobel Prize in Physics "for experiments with entangled photons, establishing the violation of Bell inequalities and pioneering quantum information science." The same month, Edward Stone retired as the project scientist for NASA's Voyager mission a half-century after taking on the role. Under his guidance, the Voyager probes explored the solar system's four gas-giant planets and became the first human-made objects to reach interstellar space, the region between stars containing material generated by the death of nearby stars. Also, Tracy Dennison began her term as the new Ronald and Maxine Linde Leadership Chair of the Division of the Humanities and Social Sciences.
In November, 50 years after they entered Caltech as the Institute's first Black female students, Karen Maples, MD (BS '76); Deanna Hunt (BS '76); and Lauretta Carroll (BS '77) reflected on the challenges and successes they experienced then and in the years that followed.
Throughout the year, the Institute took steps to implement new programs and bolster existing ones that underscore Caltech's guiding values, such as supporting students and postdoctoral scholars, creating a more inclusive environment, and celebrating and accounting for its history.
To create more opportunities for students and increase interdisciplinary research, Caltech created a new graduate education track that combines medical engineering and electrical engineering. To further boost interdisciplinary research and expand Caltech's prominence as a hub for mathematics, the Institute became the new home of the American Institute of Mathematics, an independent nonprofit organization funded in part by the National Science Foundation.
The Institute, which this year kicked off a partnership with the Carnegie Institution for Science, also became a charter member of SEA Change, an initiative of the American Association for the Advancement of Science that supports educational institutions as they systemically transform to improve diversity, equity, accessibility, and inclusion in science, technology, engineering, mathematics, and medicine.
The Institute expanded its Presidential Postdoctoral Fellowship, which supports efforts to diversify academia by recruiting and supporting promising postdoctoral scholars from underrepresented communities.
On campus, Caltech marked the dedication of the Grant D. Venerable House, honoring its namesake alumnus, who was the first Black undergraduate student to graduate from Caltech and an active student leader and athlete during his time on campus. It also celebrated the dedication of the Lee F. Browne Dining Hall, honoring the late Lee Franke Browne, a former Caltech employee and lecturer who dedicated his life and career to efforts that expanded students' access to STEM and who advanced human rights.
With the return of in-person events, the Institute was able to reestablish and strengthen ties to the local community through educational programs for area students, and through cultural events and lectures whose online components often reached even broader audiences across the world.
This year, the Institute celebrated the centennial of the Caltech Seismological Laboratory, marking an unparalleled century at the forefront of earthquake science and geophysics.
Caltech also celebrated the 100th anniversary of the Watson Lectures, which launched in 1922 as a way to benefit the public through education and outreach. Continuing that tradition, Caltech partnered with local schools to bring high school students to campus to see the lectures and engaged young students through other educational outreach programs, including the new Caltech Earthquake Fellows program and the Caltech Planet Finder Academy, both of which launched this year. Other programs designed to bolster science education for young students included Summer Research Connection, a program that invites high school students and teachers from Pasadena Unified School District and other nearby schools into Caltech laboratories, and the National Science Olympiad Tournament, which Caltech hosted this year for the first time, with Caltech students playing the main role in conducting the event.
For the campus community, TechFest returned to campus for the first time since the start of the COVID-19 pandemic, welcoming students with an in-person block party on Beckman Mall complete with games and fireworks.
Caltech's Public Programming was able to re-engage with the community through in-person events, including CaltechLive! events such as the performance of Nobuntu, a female a cappella quintet from Zimbabwe; and lectures from the Science Journeys, Movies that Matter and Behind the Book series that showcased such varied topics as a journey to the center of Jupiter, a discussion of the science of cooking, and how climate migration will reshape the world.
See the original post here:
VC Fund Nemesis Technologies To Add More Liquidity By Connecting Investors With Opportunities In AI, – Crowdfund Insider
Posted: at 12:15 am
View original post here: