Archive for the ‘Quantum Computing’ Category
Foresight Leads £2.2 Million Growth Capital Investment Into Cavero Quantum – The Quantum Insider
Posted: July 1, 2024 at 2:33 am
Insider Brief
PRESS RELEASE Foresight Group (Foresight), the leading listed infrastructure and regional private equity investment manager, has led a £2.2 million growth capital investment into Cavero Quantum Ltd (Cavero Quantum or the Company) alongside co-investor, Northern Gritstone.
Cavero Quantum, a University of Leeds spinout, has developed a new passwordless encryption technology for secure key generation and authentication. It is compatible with legacy hardware, requires little bandwidth and has the potential to be secure against cyber attacks by quantum computers.
The technology is attractive to a wide range of sectors and has immediate market application by replacing multi-factor authentication and one-time passwords with a high-security, frictionless, passwordless form of authentication.
Founded by Professor Ben Varcoe and Dr Frey Wilson, Cavero Quantum will use the funding to begin commercialising its technology and launch its first product. Ben will support Cavero Quantum alongside his existing role as Professor of Quantum Information Science at the University of Leeds, while Frey will become Chief Technology Officer.
As part of the investment, the founders will be supported by the appointment of James Trenholme, as CEO, and Andrew Wallace as Chair. James is an experienced software entrepreneur who has previously founded and exited an identity services start-up, whilst Andrew Wallace has significant deep-tech experience in quantum computing.
Foresight has invested alongside Northern Gritstone, an investment company dedicated to supporting ambitious science and technology-enabled businesses in the North of England, including through its venture-building program NG Studios powered by Deeptech Labs, in which Cavero Quantum participated earlier this year.
Cyber attacks are estimated to cost the global economy $7 trillion per year and are driving investment across the cyber security market, including in passwordless authentication, which is projected to be worth $17 billion per year by 2027. Demand for Cavero's solution is expected to further increase as existing cryptographic methods become more vulnerable to quantum computers.
Richard Ralph, Investment Manager at Foresight, said: "Whilst Cavero Quantum's technology is potentially revolutionary to quantum cryptography, it offers the potential for immediate improvements on existing cryptographic approaches due to its dual authentication and passwordless nature, thereby providing improved security against existing cyber attacks. The technology has been independently validated and we look forward to working with Ben, Frey, James, Andrew and Northern Gritstone to commercialise this innovative technology."
James Trenholme, CEO at Cavero Quantum, commented: "The technology that Ben, Frey and the experimental quantum science team at the University of Leeds have built really is ground-breaking. It's the first solution in the world that can replace security standards like ECDH without compromising on architecture and customer experience, keeping data safe for the future as quantum computing becomes the norm. It's an honour to lead Cavero Quantum. This is a great team, and I'm looking forward to building a great business together."
Duncan Johnson, Chief Executive Officer at Northern Gritstone, added: "Cavero Quantum's technology is applicable today and has the potential to allow individuals, businesses and nations to function safely in a post-quantum world. Spun out of the University of Leeds innovation ecosystem, one of Northern Gritstone's university partners, Cavero Quantum is an example of a world-class business of tomorrow built on the amazing science and technology that exists in the North of England today."
Professor Nick Plant, Deputy Vice-Chancellor: Research and Innovation, University of Leeds, said: "It is inspiring to see how the experimental quantum science team at Leeds has developed solutions for such a critical issue in online security. Cavero's technology will have a major impact on our global community, making sector-leading improvements and bringing financial savings to businesses. It is testament to the world-leading, innovative technology being driven by our region."
Leveraging quantum computing algorithms to solve complex optimization problems in logistics? – ET Edge Insights
Posted: at 2:33 am
The world is awash in data. According to International Data Corporation, the global datasphere is projected to grow to a staggering 175 zettabytes by 2025. This data deluge presents both opportunities and challenges. In logistics, a sector characterized by intricate networks and ever-evolving demands, optimizing operations to maximize efficiency and minimize costs has become paramount. This is where a nascent technology with revolutionary potential steps in: quantum computing.
Quantum computing represents a paradigm shift from the classical computers we rely on today. It harnesses the laws of quantum mechanics to unlock a new realm of computational power. Unlike classical bits, which are confined to either a 0 or 1 state, quantum bits, or qubits, can exist in a superposition of both states simultaneously. This phenomenon, along with entanglement, where qubits become linked and share a single fate, allows quantum computers to explore a vast number of possibilities concurrently, a power known as quantum parallelism.
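To make quantum parallelism concrete, here is a minimal NumPy sketch (ordinary linear algebra on a laptop, not a quantum computer) showing why the state space grows so quickly: describing n qubits placed in an equal superposition means tracking 2^n amplitudes at once.

```python
import numpy as np

# A single qubit is a length-2 complex vector; n qubits need a length-2**n vector.
def uniform_superposition(n):
    """State reached by applying a Hadamard gate to each of n qubits starting from |0...0>."""
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    state = np.array([1.0])
    for _ in range(n):
        state = np.kron(state, H @ np.array([1.0, 0.0]))
    return state

state = uniform_superposition(10)
print(len(state))                                  # 1024 amplitudes tracked at once
print(np.allclose(np.abs(state) ** 2, 1 / 1024))   # every 10-bit string is equally likely
```

Classically simulating this vector doubles in cost with every added qubit, which is exactly the scaling quantum hardware is intended to sidestep.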
This is a game-changer for tackling the complex optimization problems that plague the logistics industry. Here, traditional optimization algorithms often struggle due to the sheer volume of variables and the exponential increase in computation time as problem size grows. But quantum algorithms, leveraging the power of superposition and entanglement, promise to provide novel solutions. Industry giants such as IBM and DHL have begun proposing quantum solutions to logistics problems. DHL notes that since last-mile delivery costs account for 53% of total shipping costs, non-traditional solutions are needed to truly optimize costs.
The logistics landscape is riddled with optimization challenges. Consider the Traveling Salesman Problem (TSP): a salesman needs to visit a set of cities exactly once and return to the starting point, minimizing total travel distance. This is a complex task due to various constraints such as traffic, last-minute customer requests, and strict delivery windows. And as the number of cities increases, even the most powerful classical computers struggle to find the optimal route.
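As a rough illustration of that blow-up, the short Python sketch below (with made-up distances) solves a five-city TSP by brute force; the same approach would need 19! (roughly 1.2 x 10^17) route evaluations for just 20 cities.

```python
import itertools
import math

def tour_length(order, dist):
    """Length of a round trip that starts and ends at city 0."""
    path = (0,) + order + (0,)
    return sum(dist[a][b] for a, b in zip(path, path[1:]))

def brute_force_tsp(dist):
    """Try every ordering of the remaining cities; the number of routes grows as (n-1)!."""
    cities = range(1, len(dist))
    return min(itertools.permutations(cities), key=lambda o: tour_length(o, dist))

# Tiny symmetric distance matrix for five cities (illustrative numbers only).
dist = [
    [0, 2, 9, 10, 7],
    [2, 0, 6, 4, 3],
    [9, 6, 0, 8, 5],
    [10, 4, 8, 0, 6],
    [7, 3, 5, 6, 0],
]
best = brute_force_tsp(dist)
print(best, tour_length(best, dist))
print(math.factorial(19), "routes to check for a 20-city problem")
```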
Beyond route optimization, logistics companies also grapple with challenges in inventory management and demand forecasting, where they must balance inventory levels to meet fluctuating demand while minimizing holding costs. Additionally, fleet management and scheduling require optimizing schedules and routes for vehicles, taking into account factors like traffic, fuel efficiency, and driver availability. Moreover, the design of supply chain networks demands efficiency to minimize transportation costs and ensure timely delivery. Addressing these multifaceted challenges is crucial for maintaining smooth and cost-effective logistics operations.
Traditional approaches to these problems, such as linear programming and heuristics, often reach computational limits as problem complexity increases. This is where quantum computing algorithms come to the fore.
Several quantum algorithms hold immense potential for logistics optimization. Quantum Annealing, inspired by the physical process of annealing, tunnels through solution spaces to find the optimal state. The Variational Quantum Eigensolver (VQE) algorithm iteratively refines the state of qubits to find solutions to optimization problems. The Quantum Approximate Optimization Algorithm (QAOA) utilizes a series of quantum operations to tailor the search for optimal solutions. Although not directly an optimization algorithm, Grover's Algorithm offers a significant speedup for searching databases, potentially aiding in tasks like finding optimal routes or inventory locations. Together, these algorithms represent powerful tools for enhancing efficiency and effectiveness in logistics.
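As a hedged, purely classical analogue of the annealing idea, the sketch below runs simulated annealing over a tiny, made-up QUBO (quadratic unconstrained binary optimization) problem, the input format quantum annealers typically accept; a quantum annealer searches the same energy landscape but relies on quantum tunnelling rather than thermal bit flips.

```python
import math
import random

def qubo_energy(x, Q):
    """Energy x^T Q x of a binary assignment x under QUBO matrix Q (lower is better)."""
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def simulated_annealing(Q, steps=5000, t_start=2.0, t_end=0.01):
    """Classical annealing over single-bit flips with a geometrically cooling temperature."""
    n = len(Q)
    x = [random.randint(0, 1) for _ in range(n)]
    energy = qubo_energy(x, Q)
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)
        i = random.randrange(n)
        x[i] ^= 1                       # propose flipping one bit
        new_energy = qubo_energy(x, Q)
        if new_energy > energy and random.random() > math.exp((energy - new_energy) / t):
            x[i] ^= 1                   # reject the uphill move, flip back
        else:
            energy = new_energy         # accept the move
    return x, energy

# Toy QUBO: penalize picking site 0 or 1 (and especially both), reward picking site 2.
Q = [[1, 3, 0],
     [3, 1, 0],
     [0, 0, -2]]
print(simulated_annealing(Q))           # typically finds x = [0, 0, 1] with energy -2
```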
The efficiency of these algorithms lies in their ability to explore a multitude of potential solutions simultaneously, unlike their classical counterparts. This translates to significant reductions in computation time, particularly for problems with vast solution spaces.
Let's delve into how these algorithms can be applied to specific logistics challenges.
While the potential of quantum computing for logistics optimization is undeniable, there are challenges to overcome. Currently, quantum hardware is in its nascent stages of development. Tech giants such as IBM and Google have announced quantum roadmaps to reach 1 million qubits by 2030, a number necessary for most commercial purposes like supply chain-related operations. That number currently stands at only 5000 qubits.
Furthermore, qubit error rates remain high, and the number of controllable qubits in a single processor is limited. Integrating these algorithms with existing logistics software and workflows also requires significant development efforts.
For logistics companies, adopting quantum computing solutions will require a cost-benefit analysis along with investments in training personnel and developing the necessary infrastructure.
The potential benefits of leveraging quantum computing algorithms for logistics optimization are vast. While technical challenges remain, continued research and development hold the promise of unlocking a new era of efficiency and sustainability in the logistics sector. It is crucial for logistics companies to stay informed about advancements in quantum computing and consider pilot projects to explore its potential applications. By embracing this revolutionary technology, the logistics industry can navigate the complexities of the data-driven world and deliver a future of optimized operations, reduced costs, and a more sustainable global supply chain.
The Road to Error-Free Quantum Computing – AZoQuantum
Posted: at 2:33 am
Jun 24, 2024. Reviewed by Lexie Corner
In a study published in PeerJ Computer Science, Professor Kazuhiro Ogata and Assistant Professor Canh Minh Do of the Japan Advanced Institute of Science and Technology (JAIST) suggested using symbolic model checking to validate quantum circuits.
Quantum computing is a fast-developing technology that utilizes the principles of quantum physics to tackle complicated computational problems that are extremely difficult for classical computing.
To take advantage of quantum computing, researchers worldwide have created a large number of quantum algorithms that show notable gains over classical algorithms.
Creating these algorithms requires quantum circuits, which are models of quantum computation used to design and implement quantum algorithms before they are deployed on actual quantum hardware.
Quantum circuits consist of a series of quantum gates, measurements, and qubit initializations, among other operations. Quantum gates execute quantum computations by acting on qubits, the quantum counterparts of classical bits (0s and 1s), and manipulating the system's quantum states.
The output of a quantum circuit is a quantum state, which can be measured to yield classical outcomes with associated probabilities, on which further actions can be based. Since quantum computing is frequently counterintuitive and substantially distinct from classical computing, the likelihood of mistakes is significantly larger. As a result, it is critical to ensure that quantum circuits have the correct properties and perform as planned.
This can be accomplished using model checking, a formal verification approach used to ensure that systems meet desirable attributes. Although certain model checkers are specialized to quantum programs, there is a distinction between model-checking quantum programs and quantum circuits due to differences in representation and the absence of iterations in quantum circuits.
Considering the success of model-checking methods for verification of classical circuits, model-checking of quantum circuits is a promising approach. We developed a symbolic approach for model checking of quantum circuits using laws of quantum mechanics and basic matrix operations using the Maude programming language.
Canh Minh Do, Assistant Professor, Japan Advanced Institute of Science and Technology
Maude is a high-level specification/programming language based on rewriting logic that enables the formal definition and verification of complicated systems. It comes with a Linear Temporal Logic (LTL) model checker that determines if systems meet the necessary features. Maude also enables the development of exact mathematical models of systems.
Using the Dirac notation and the rules of quantum physics, the researchers formally defined quantum circuits in Maude as a set of quantum gates and measurement applications. They provided the system's intended properties and its initial state in LTL.
"By using a set of quantum physics laws and basic matrix operations formalized in our specifications, quantum computation can be reasoned in Maude." The researchers then automatically checked whether quantum circuits satisfied the required characteristics using the integrated Maude LTL model checker.
Using this method, several early quantum communication protocols, each with increasing complexity, were checked: Superdense Coding, Quantum Teleportation, Quantum Secret Sharing, Entanglement Swapping, Quantum Gate Teleportation, Two Mirror-image Teleportation, and Quantum Network Coding.
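To illustrate the kind of property such a checker establishes, here is an informal NumPy analogue (not the authors' Maude-based tool): a direct matrix computation confirming that superdense coding always returns the two classical bits Alice encoded on her half of a shared Bell pair.

```python
import numpy as np

I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
# CNOT with Alice's qubit (first) as control and Bob's qubit (second) as target.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

bell = np.array([1, 0, 0, 1]) / np.sqrt(2)     # shared pair (|00> + |11>)/sqrt(2)

def superdense_send(a, b):
    """Alice encodes bits (a, b) on her qubit; Bob decodes with CNOT then H and measures."""
    encode = np.eye(2)
    if b:
        encode = X @ encode
    if a:
        encode = Z @ encode
    state = np.kron(encode, I) @ bell           # Alice acts only on her own qubit
    state = np.kron(H, I) @ (CNOT @ state)      # Bob's decoding circuit
    return int(np.argmax(np.abs(state) ** 2))   # outcome is deterministic: index 2a + b

# Property check: every two-bit message is recovered exactly.
assert all(superdense_send(a, b) == 2 * a + b for a in (0, 1) for b in (0, 1))
print("superdense coding property holds")
```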
They discovered that the initial version of Quantum Gate Teleportation did not satisfy its desired property. Using this method, the researchers proposed a revised version and verified that it was correct.
These findings highlight the significance of the suggested novel technique for the verification of quantum circuits. However, the researchers highlight certain drawbacks of their strategy that need more investigation.
Dr. Do added, "In the future, we aim to extend our symbolic reasoning to handle more quantum gates and more complicated reasoning on complex number operations. We also would like to apply our symbolic approach to model-checking quantum programs and quantum cryptography protocols."
Verifying the expected functionality of quantum circuits will be extremely useful in the approaching era of quantum computing. In this context, the current technique is the first step toward a broader framework for verifying and specifying quantum circuits, opening the way for error-free quantum computing.
The study was supported by JST SICORP Grant Number JPMJSC20C2, Japan, and JSPS KAKENHI Grant Numbers JP23H03370, JP23K19959, and JP24K20757.
Do, C. M., et al. (2024). Symbolic model checking quantum circuits in Maude. PeerJ Computer Science. doi:10.7717/peerj-cs.2098
University of Gondar Scientists Say Quantum Computers Offer Promising Boost to Alzheimer’s Diagnosis – The Quantum Insider
Posted: at 2:33 am
Insider Brief
A team of scientists said an innovative ensemble deep learning model combined with quantum machine learning classifiers might improve the accuracy and efficiency of Alzheimer's disease (AD) classification, according to a study published in Nature.
The researchers, from the University of Gondar in Ethiopia, used the classifiers to investigate Alzheimer's disease, a chronic neurodegenerative disorder. Early diagnosis is crucial for timely intervention and treatment, potentially improving the quality of life for those affected. Traditional methods for diagnosing Alzheimer's have limitations in accuracy and efficiency, prompting researchers to explore advanced technologies, such as quantum computing.
Quantum Computing and Deep Learning
Quantum computing offers a promising alternative to classical machine learning approaches for various disease classification tasks. Quantum computers, while still under development, can theoretically process complex data and perform calculations at a much faster rate, leveraging quantum computing's unique potential to handle large datasets more efficiently and accurately.
The team leveraged this potential by developing a model that integrates deep learning architectures and quantum machine learning algorithms. This hybrid approach aims to enhance the precision and speed of Alzheimer's diagnosis.
The study used data from the Alzheimer's Disease Neuroimaging Initiative I (ADNI1) and Alzheimer's Disease Neuroimaging Initiative II (ADNI2) datasets. These datasets, comprising MRI brain images, were merged and pre-processed to form the basis of the proposed model. Key features were extracted using a customized version of VGG16 and ResNet50 models. These features were then fed into a Quantum Support Vector Machine (QSVM) classifier to categorize the data into four stages: non-demented, very mild demented, mild demented, and moderate demented.
The ensemble deep learning model combined the strengths of VGG16 and ResNet50, two deep learning architectures widely used for image recognition tasks. VGG16 is known for its simplicity and deep convolutional layers, while ResNet50 introduces residual connections that allow very deep networks to be trained without performance degradation. The QSVM classifier provided the computational power of quantum algorithms. This combination aimed to enhance the overall performance of the classification model.
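A rough sketch of such a pipeline is shown below. This is not the authors' code: off-the-shelf pretrained Keras backbones stand in for the customized VGG16 and ResNet50 feature extractors, random arrays stand in for pre-processed MRI slices, and a classical scikit-learn SVC stands in for the QSVM stage, which in the study ran on Qiskit with quantum hardware or a simulator.

```python
import numpy as np
from tensorflow.keras.applications import VGG16, ResNet50
from tensorflow.keras.applications.vgg16 import preprocess_input as vgg_prep
from tensorflow.keras.applications.resnet50 import preprocess_input as resnet_prep
from sklearn.svm import SVC

# Pretrained backbones used as fixed feature extractors (classification heads removed).
vgg = VGG16(weights="imagenet", include_top=False, pooling="avg")
resnet = ResNet50(weights="imagenet", include_top=False, pooling="avg")

# Random arrays stand in for MRI slices; labels 0-3 represent the four dementia stages.
images = (np.random.rand(8, 224, 224, 3) * 255).astype("float32")
labels = np.array([0, 1, 2, 3, 0, 1, 2, 3])

# Ensemble features: concatenate the VGG16 and ResNet50 embeddings for each image.
features = np.concatenate(
    [vgg.predict(vgg_prep(images.copy()), verbose=0),
     resnet.predict(resnet_prep(images.copy()), verbose=0)],
    axis=1,
)

# A classical SVM stands in here for the quantum support vector machine classifier.
clf = SVC(kernel="rbf").fit(features, labels)
print(clf.predict(features[:4]))
```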
Evaluation and Results
The performance of the proposed model was evaluated using metrics including accuracy, area under the curve (AUC), F1-score, precision, and recall. The results demonstrated that the ensemble model significantly outperformed several state-of-the-art methods in detecting Alzheimer's disease.
These results point to the superiority of the ensemble model with QSVM in accurately classifying AD stages from the merged ADNI dataset. Notably, the ResNet + QSVM model exhibited a 6% improvement in accuracy compared to the standalone ResNet model, while the proposed ensemble model showed 8.5% and 12.21% better results compared to other ensemble and SVM models, respectively.
The experiments were conducted using a Hewlett Packard Core i5, sixth-generation computer with 8 GB RAM, and a Google Colab Pro GPU. On the quantum side, the researchers relied on 5-qubit quantum hardware or a simulator, employing the QSVM model from the Qiskit library. This setup allowed for efficient processing and analysis of the MRI brain images, demonstrating the practical application of quantum computing in medical research.
Implications and Future Research
The study highlights the potential of combining quantum classifiers and ensemble learning to achieve effective outcomes in disease classification tasks. The integration of quantum machine learning classifiers with deep learning architectures can significantly improve the accuracy and efficiency of Alzheimer's disease diagnosis.
However, the researchers acknowledge the need for further studies to evaluate the practical implementation of this model within medical devices. Future research could focus on integrating the proposed model into real-world medical settings, providing a significant solution to support primary care for Alzheimer's disease, especially in cases where MRI scans are blurred or challenging to interpret.
The researchers include Abebech Jenber Belay, Yelkal Mulualem and Melaku Bitew Haile, all of the Department of Information Technology, College of Informatics, University of Gondar, Gondar, Ethiopia.
Chevron invests in quantum computing development for oil and gas market – WorldOil
Posted: March 9, 2024 at 2:40 am
(WO) OQC announced that Chevron Technology Ventures, part of Chevron Corporation, has joined its $100m Series B funding round.
Quantum computing in the oil and gas market is expected to grow at a CAGR of 37.9%, owing to the increasing demand for efficient optimization and simulation across the sector. Chevron's investment marks a significant move by a supermajor into the rapidly evolving field of quantum computing.
"OQC's development of the quantum computer has the potential to change the information processing landscape by merging the bounds of engineering and physics," said Jim Gable, Vice President, Innovation and President of Technology Ventures at Chevron. "This is the latest investment from our Core Energy Fund, which focuses on high-tech, high-growth startups and breakthrough technologies that could improve Chevron's core oil and gas business performance as well as create new opportunities for growth."
A quantum future for oil and gas. OQC's technology provides several potential groundbreaking opportunities for the oil and gas sector, including the development and optimization of catalysts and the efficiency of transportation and distribution networks. Quantum is anticipated to accelerate the oil and gas industry's discovery and development of new materials through the simulation of complex molecules to lower carbon products.
To realize this future, the oil and gas industry requires secure, accessible and powerful quantum computing that is integrated with existing high-performance computing. Prior to the launch of OQC Toshiko, quantum computers were only available in labs, making secure access for companies and integration with existing high-performance computing the largest barriers to wider business adoption of this groundbreaking technology.
Commenting on the news, Ilana Wisby, Chief Executive Officer at OQC, said, "Chevron's investment marks a significant milestone in harnessing quantum computing for the energy sector. We're excited to drive innovation and efficiency in exploration and renewables and pioneer enterprise-ready quantum in the energy sector."
Why the QPU Is the Next GPU – Built In
Posted: at 2:40 am
The computational demands of various sectors, such as drug discovery, materials science, and AI, are skyrocketing. Graphics processing units (GPUs) have been at the forefront of this journey, serving as the backbone for tasks demanding high parallel processing capabilities. Their integration into data centers has marked a significant advancement in computational technology.
As we push the boundaries of what's computationally possible, however, the limitations of GPUs become apparent, especially when facing problems that classical computing struggles to solve efficiently. Enter the quantum processing unit (QPU), a technology that promises not just to complement but potentially transcend the capabilities of GPUs, heralding a new era in computational science.
A quantum processing unit, or QPU, uses qubits and quantum circuit model architecture to solve problems that are too computationally intensive for classical computing. Its potential is analogous to the transformational impact the GPU had on computing in the 2000s.
The binary system is at the core of classical computing, with bits that exist in one of two states: zero or one. Through logic gates within the von Neumann architecture (an architecture that includes a CPU, memory, I/O, and data bus), this binary processing has propelled technological progress for decades. GPUs, enhancing this system, offer parallel processing by managing thousands of threads simultaneously, significantly outpacing traditional CPUs for specific tasks.
Despite their prowess, GPUs are still bound by the linear progression of classical algorithms and the binary limitation of bits, making some complex problems inefficient and energy-intensive to solve. A key reason for this linear progression limitation is that a classical algorithm can only process one possible solution at a time.
The integration of GPUs into data centers began in the late 1990s and early 2000s, initially focused on graphics rendering. NVIDIA's GeForce 256, released in 1999 and billed as the world's first GPU, marked a significant shift towards GPUs as programmable units rather than merely graphics accelerators. Their general-purpose computing potential was realized in the mid-2000s with NVIDIA's introduction of CUDA in 2006, enabling GPUs to handle computational tasks beyond graphics, such as simulations and financial modeling.
The democratization of GPU computing spurred its adoption for scientific computing and AI, particularly benefiting from GPUs' parallel processing capabilities. This led to wider use in research and high-performance computing, driving significant advancements in GPU architecture.
By the early 2010s, the demand for big data processing and AI applications accelerated GPU adoption in cloud services. This period also saw the rise of specialized AI data centers optimized for GPU clusters, enhancing the training of complex neural networks.
The 2020s have seen continued growth in GPU demand, driven by deep learning applications in natural language processing, computer vision, and speech recognition. Modern deep learning frameworks and the introduction of specialized AI accelerators, such as Google's TPU and NVIDIA's Tensor Core GPUs, underscore the critical role of GPUs in AI development and the evolving landscape of computational hardware in data centers.
Despite these developments, GPUs did not displace traditional CPUs. Rather, they ran side by side. We saw the rise of heterogeneous computing: the increasingly popular integration of GPUs with CPUs and other specialized hardware within a single system. This allows different processors to handle tasks best suited to their strengths, leading to improved overall efficiency and performance.
Quantum computing introduces a transformative approach to computing with the concept of qubits. Unlike classical bits, qubits can exist in a state of superposition, embodying both zero and one simultaneously. This characteristic, along with quantum entanglement, enables quantum computers to process information on a scale that classical machines can't match. Quantum gates manipulate these qubits, facilitating parallel processing across exponentially larger data sets.
Quantum gates are the fundamental building blocks of quantum circuits, analogous to logic gates in classical computing, but designed for operations on qubits instead of classical bits. Quantum gates manipulate the state of qubits according to the principles of quantum mechanics, enabling the execution of quantum algorithms. Some quantum gates operate only on a single qubit, whereas others operate on two or more qubits. Multi-qubit gates are critical to exploiting the entanglement and superposition properties of quantum computing.
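In matrix terms, a short NumPy sketch (generic linear algebra, not tied to any vendor's hardware) shows how a single-qubit gate is lifted onto a register with a Kronecker product, that gates are unitary, and that a two-qubit CNOT is what produces entanglement.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # single-qubit Hadamard gate
CNOT = np.array([[1, 0, 0, 0],                    # two-qubit controlled-NOT gate
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# A single-qubit gate acts on a two-qubit register via a Kronecker product with identity.
state = np.kron(H, np.eye(2)) @ np.array([1.0, 0.0, 0.0, 0.0])   # |00> -> |+0>
state = CNOT @ state                                              # -> (|00> + |11>)/sqrt(2)

# Quantum gates are unitary: U^dagger U = I.
print(np.allclose(CNOT.conj().T @ CNOT, np.eye(4)))               # True

# Schmidt rank above 1 means the two qubits no longer factor: they are entangled.
print(np.linalg.matrix_rank(state.reshape(2, 2)))                 # 2
```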
The quantum computing field is grappling with challenges like qubit stability and effective quantum error correction, however, which are crucial for achieving scalable quantum computing. Qubits are inherently fragile and can be affected by a variety of environmental conditions. Therefore, maintaining a stable qubit state is challenging, and researchers still must develop special techniques to detect and correct unwanted changes in the qubit state.
QPU technology is poised to revolutionize areas where classical computing reaches its limits. In drug discovery, for instance, QPUs could simulate molecular interactions at scales never before possible, expediting the creation of new therapeutics. Materials science could benefit from the design of novel materials with tailored properties. In finance, QPUs could enhance complex model optimizations and risk analysis. In AI, they could lead to algorithms that learn more efficiently from less data. QPUs are thus able to tackle problems that CPUs and GPUs cannot and never will, and thus open new frontiers of discovery and innovation.
Although GPUs have revolutionized data center operations, they also bring formidable challenges. The voracious GPU appetite for power generates significant heat, which demands sophisticated and often expensive cooling systems to maintain optimal performance levels. This not only increases the operational costs but also raises environmental concerns due to the high energy consumption required for both running the units and cooling them.
In addition to these physical constraints, the technological landscape in which GPUs operate is rapidly evolving. The constant need for updates and upgrades to accommodate new software demands and improve processing capabilities presents substantial logistical and financial hurdles. This strains resources and complicates long-term planning for data center infrastructure.
QPUs promise to address many of these challenges. QPUs perform computations in ways fundamentally different from classical systems. Specifically, the intrinsic ability of qubits to exist in multiple states simultaneously allows QPUs to tackle complex problems more effectively, reducing the need for constant hardware upgrades. This promises not only a leap in computational power but also a move towards more sustainable and cost-effective computing solutions, directly addressing the critical limitations faced by GPUs in today's data centers.
The journey toward QPU adoption in computational infrastructures is laden with hurdles, though. Achieving stable, large-scale quantum systems and ensuring reliable computations through quantum error correction are paramount challenges. Some types of quantum computers require special cooling and environmental conditions that are uncommon in data centers and thus require adaptation.
Additionally, the quantum software development field is in its infancy, necessitating the creation of new programming tools and languages. To make use of the quantum properties of QPUs, just translating classical algorithms is insufficient. Instead, we will need to invent new types of algorithms. Just like GPUs allow us to leverage parallel processing, QPUs allow us to execute code differently. Despite these obstacles, ongoing research and development are gradually paving the way for QPUs to play a central role in future computational tasks.
Today, QPU integration into broader computational infrastructures and their practical application in industry and research is still in the nascent stages. The development and commercial availability of quantum computers is growing, with several companies and research institutions demonstrating quantum advantage and offering cloud-based quantum computing services.
How close are QPUs to taking a prime position next to GPUs? In other words, if we were to compare the development of QPUs with the historical development of GPUs, what year would we be in now?
Drawing a parallel with the GPU timeline, the current stage of QPU integration closely mirrors the GPU landscape in the mid-2000s, when GPUs became general-purpose computing machines that were adopted for niche applications.
Given these considerations, the current stage of QPU integration might be analogous to the GPU industry around 2006-2007. That was a time of pivotal change, where the foundational technologies and programming models that would enable widespread adoption were just being established. For QPUs, the development of quantum algorithms, error correction techniques, and qubit coherence are akin to the early challenges faced by GPUs in transitioning to general-purpose computing.
In summary, although GPUs continue to play a critical role in advancing computational capacities, the integration of QPUs into data centers holds the promise of overcoming the operational and environmental challenges posed by current technologies. With their potential for lower power consumption, reduced heat output, and diminished need for frequent upgrades, QPUs represent a hopeful horizon in the quest for more efficient, sustainable, and powerful computing solutions. QPUs wont replace GPUs, just like GPUs did not eliminate classical CPUs. Instead, the data center of the future will include all three computing methods.
What is quantum computing good for? XPRIZE and Google offer cash for answers – Network World
Posted: at 2:40 am
The sponsors of a new $5 million prize want to boost the quantum computing industry by encouraging developers to write new algorithms to help the emerging technology solve real-world problems.
The new Quantum for Real-World Impact contest, from the XPRIZE Foundation, aims to speed the development of quantum computing algorithms focused on sustainability, health, and other societal issues. The three-year contest, sponsored by Google Quantum AI and the Geneva Science and Diplomacy Anticipator Foundation, wants to unleash the potential of quantum computing, according to the contest site.
Currently, quantum computers are not sufficiently advanced to solve real-world societal problems that classical computers cannot, the contest site says. However, as the technology advances, relatively few companies and university researchers are focused on translating quantum algorithms into real-world application scenarios and assessing their feasibility to address global challenges once sufficiently powerful hardware is available.
The new contest is crucial for the advancement of quantum computing, said Rebecca Krauthamer, co-founder and chief product officer at QuSecure, a vendor of quantum-resilient cybersecurity tools.
XPRIZE has a powerful history of pushing forward advancements in cutting-edge technology in spaceflight, conservation, advanced medicine, and more, she said. The contest signifies we're in a truly exciting time for quantum computing.
Quantum computing hardware development still has a significant road ahead, she added, but much of the innovation from the technology will come from new algorithms and the application of quantum computers to real-world problems.
The contest provides the recognition of the great potential of quantum computing for both commercial and societal gain, she added.
Contestants can write new algorithms to solve new problems using quantum computing, they can show how existing algorithms can be used to solve previously unknown applications of quantum computing, or they can show ways to reduce the computing resources needed for a quantum computer to work on already established algorithms or applications.
Examples of possible contest entries include:
The contest is a good starting point for quantum computing in business models, said Jim Ingraham, vice president of strategic research, EPB of Chattanooga, a power and telecommunications company that launched a quantum-powered network in late 2022. Commercialization is the next essential step for bringing quantum technologies out of the lab and into the real world, he said.
The EPB Quantum Network was another step forward, he added. The network provides access to the necessary proving ground for quantum technologists to show investment worthiness and commercial viability, he said. This is a necessary step to help companies, government agencies and researchers accelerate the development of their technologies.
The contest may assist companies that haven't found a way to profit from quantum computing innovation, added Lawrence Gasman, founder and president of Inside Quantum Technology, a quantum research firm.
It may bring in firms that could otherwise not survive, he said. This implies that the use of money is carefully vetted and only goes to firms that can make money in the short-to-medium term.
While quantum computing is not yet mainstream, that day is coming, said QuSecure's Krauthamer.
When you see a news headline stating that quantum computers have been used to solve a problem that you recognize, something like enhancing battery technology, optimizing financial portfolios, or improving greenhouse emissions, that's when you'll know that quantum computing has gone mainstream, she said. We will begin seeing these headlines more in the next couple of years.
3 Quantum Computing Stocks to Buy for Real-World Breakthrough – InvestorPlace
Posted: at 2:40 am
The quantum computing industry is experiencing significant growth, with advancements in both hardware and software making it a key consideration for organizations looking to invest in cutting-edge technology. To this end, we look at some of the top quantum computing stocks to buy as businesses utilize this next-gen technology across various industries.
Major tech players are increasingly interested in making significant investments in quantum computing to keep pace with rapid technological advancement and with customers' current demand for innovative computational solutions.
Drawing on data from the quantum market and insights from industry thought leaders gathered in the fourth quarter of 2023, the recent State of Quantum 2024 report noted the transition from theoretical exploration to practical application, highlighted by the emergence of full-stack quantum computer deliveries in national labs and quantum centers.
In 2022, venture investments in quantum technology soared to over $2 billion amid strong investor confidence in this burgeoning field. However, by 2023, these investments saw a sharp 50% drop, sparking debates about a potential quantum winter.
Industry experts argue the decline reflects broader venture capital trends and not a loss of faith in the quantum sector's prospects. Government funding has increasingly filled the gap private investors left, mitigating concerns over the investment slowdown.
The bottom line is the quantum industry is still advancing, albeit at a moderate pace. This emphasizes the need for realistic expectations and a sustained commitment to research and development. Despite the recent dip in investment, the sectors insiders remain cautiously optimistic about its future. This suggests the industry is far from stagnating.
Let's take a closer look at leading quantum computing stocks to buy.
Intel (NASDAQ:INTC), the semiconductor giant, is actively pursuing a turnaround strategy to regain its leadership in the technology industry. The plan involves a significant restructuring of its operations, investment in advanced chip manufacturing technologies and a renewed focus on innovation.
Among other things, Intel is pushing hard to develop its quantum computing products. The chipmaker introduced Tunnel Falls, a quantum computing chip leveraging the company's cutting-edge manufacturing techniques.
The company has collaborated with various government and academic research entities to facilitate the testing of Tunnel Falls. According to Intel, the new chip has a 95% yield rate across the wafer and voltage uniformity.
Quantum computing isn't the core focus of Intel's strategy to reclaim its semiconductor industry leadership. However, the initiative represents a potential growth area. Success in quantum computing research could position Intel as a key player in this innovative technology domain in the future. This could make Intel one of the top quantum computing stocks to buy.
Similarly to Intel, Alphabet (NASDAQ:GOOGL, NASDAQ:GOOG) is making significant strides in quantum computing through its subsidiary, Quantum AI. Focusing on developing quantum processors and algorithms, Google's parent company aims to harness quantum technology for breakthroughs in computing power.
Alphabet recently exceeded Q4 earnings expectations with a net income of $20.69 billion and a 13% revenue increase to $86.3 billion. Its advertising revenue of $65.52 billion slightly missed analyst projections.
While fighting Microsoft (NASDAQ:MSFT) on the AI front, Google has also ventured into the quantum computing realm with its proprietary quantum computing chips, Sycamore. In a strategic move, Google spun off its quantum computing software division into a standalone startup, SandboxAQ, in March 2022.
Its dominant position in search drives Google's foray into quantum computing. It aims to develop more efficient, faster and intelligent solutions. The company plays a crucial role in managing vast volumes of digital information. It can gain immensely by enabling various organizations to harness the transformative power of quantum computing and AI.
FormFactor (NASDAQ:FORM), a leading provider in the semiconductor industry, specializes in the design, development and manufacture of advanced wafer probe cards. These probe cards are essential for the electrical testing of semiconductor wafers before cutting them into individual chips.
FormFactor is strategically positioned within the quantum computing ecosystem through its semiconductor test and measurement solutions expertise. The company provides advanced systems essential for developing and testing quantum computing chips. These systems are designed to operate at extremely low temperatures, a fundamental requirement for quantum computing experiments where qubits must be maintained in a coherent state.
Its flagship products include precision engineering solutions like the Advanced Matrix series for high-density applications and the TouchMatrix series for touchscreen panels. FormFactor's products enable semiconductor manufacturers to perform reliable and accurate testing at various stages of the production process. This ensures the functionality and quality of the final semiconductor products.
Last month, FormFactor reported a modest top-line year-over-year increase of 1.3%, reaching $168.2 million. Looking ahead, expectations for the first quarter are aligned with the recent quarterly performance, with projected revenue of around $165 million.
On the date of publication, Shane Neagle did not hold (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.
Shane Neagle is fascinated by the ways in which technology is poised to disrupt investing. He specializes in fundamental analysis and growth investing.
Longer coherence: How the quantum computing industry is maturing – DatacenterDynamics
Posted: at 2:40 am
Quantum computing theory dates back to the 1980s, but it's really only in the last five to ten years or so that we've seen it advance to the point where it could realistically become a commercial enterprise.
Most quantum computing companies have been academic-led science ventures; companies founded by PhDs leading teams of PhDs. But, as the industry matures and companies look towards a future of manufacturing and operating quantum computers at a production-scale, the employee demographics are changing.
While R&D will always play a core part in every technology company, making quantum computers viable out in the real world means these startups are thinking about how to build, maintain, and operate SLA-bound systems in production environments.
This new phase in the industry requires companies to change mindset, technology, and staff.
At quantum computing firm Atom Computing, around 40 of the company's 70 employees have PhDs, many joining straight out of academia. This kind of academic-heavy employee demographic is commonplace across the quantum industry.
I'd venture that over half of our company doesn't have experience working at a company previously, says Rob Hays, CEO of Atom. So there's an interesting bridge between the academic culture versus the Silicon Valley tech startup; those are two different worlds and trying to bridge people from one world to the other is challenging. And it's something you have to focus and work on openly and actively.
Maturing from small startups into large companies with demanding customers and shareholders is a well-trodden path for hundreds of technology companies in Silicon Valley and across the world.
And quantum computing companies are getting there: the likes of IonQ, Rigetti, and D-Wave are already listed on the Nasdaq and New York Stock Exchange, although the latter two companies have had to deal at various times with the prospect of being de-listed due to low stock prices.
Most of the quantum companies DCD spoke to for this piece are undergoing a transition from pure R&D mode to a more operational and engineering phase.
When I first joined four years ago, the company was entirely PhDs, says Peter Chapman, IonQ CEO. We're now in the middle of a cultural change from an academic organization and moving to an engineering organization. We've stopped hiring PhDs; most of the people we're hiring nowadays are software, mechanical, and hardware engineers. And the next phase is to a customer-focused product company.
Chapman points to the hirings of the likes of Pat Tan and Dean Kassmann previously at Amazon's hardware-focused Lab126 and rocket firm Blue Origin, respectively as evidence of the company moving to a more product- and engineering-focused workforce.
2023 also saw Chris Monroe, IonQ co-founder and chief scientist, leave the company to return to academia at North Carolina's Duke University.
During the earnings call announcing Monroe's departure, Chapman said: Chris would be the first one to tell you that the physics behind what IonQ is doing is now solved. It's [now] largely an engineering problem.
Atom's Hays notes a lot of the engineering work that the company is doing to get ready for cloud services and applications is software-based, meaning the company is looking for software engineers.
We are mostly looking for people that have worked at cloud service providers or large software companies and have an interest in either learning or already some foundational knowledge of the underlying physics and science, he says. But we're kind of fortunate that those people self-select and find us. We have a pretty high number of software engineers who have physics undergrads and an extreme interest in quantum mechanics, even though by trade and experience they're software engineers.
On-premise quantum computers are currently rarities largely reserved for national computing labs and academic institutions. Most quantum processing unit (QPU) providers offer access to their systems via their own web portals and through public cloud providers.
But todays systems are rarely expected (or contracted) to run with the five-9s resiliency and redundancy we might expect from tried and tested silicon hardware.
Right now, quantum systems are more like supercomputers and they're managed with a queue; they're probably not online 24 hours, users enter jobs into a queue and get answers back as the queue executes, says Atom's Hays.
We are approaching how we get closer to 24/7 and how we build in redundancy and failover so that if one system has come offline for maintenance, there's another one available at all times. How do we build a system architecturally and engineering-wise, where we can do hot swaps or upgrades or changes with minimal downtime as possible?
Other providers are going through similar teething phases of how to make their systems, which are currently sensitive, temperamental, and complicated, enterprise-ready for the data centers of the world.
I already have a firm SLA with the cloud guys around the amount of time that we do jobs on a daily basis, and the timeframes to be able to do that, says Chapman. We are moving that SLA to 24/7 and being able to do that without having an operator present. It's not perfect, but it's getting better. In three or four years from now, you'll only need an on-call when a component dies.
Rigetti CTO David Rivas says his company is also working towards higher uptimes.
The systems themselves are becoming more and more lights out every quarter, he says, as we outfit them for that kind of remote operation and ensure that the production facilities can be outfitted for that kind of operation.
Manufacturing and repair of these systems is also maturing beyond the first PhD-built generations of quantum computers. These will never be mass-produced, but the industry needs to move away from one-off artisanal machines to a more production line-like approach.
A lot of the hardware does get built with the assistance of electronics engineers, mechanical engineers, says Atom's Hays, but much is still built by experimental physicists.
IonQ's Chapman adds: In our first-generation systems, you needed a physicist with a screwdriver to tune the machine to be able to run your application. But every generation of hardware puts more under software control.
Everywhere a screwdriver could be turned, there's now a stepper motor under software control, and the operating system is now doing the tuning.
Simon Phillips, CTO of the UK's Oxford Quantum Circuits, says OQC is focused on how it hires staff and works with partners to roll out QPUs into colocation data centers.
If we put 10 QPUs in 10 locations around the world, how do we do that without having an army of 100 quantum engineers on each installation?
And the first part of that starts with having a separate deployment team and a site reliability engineering team that can then run the SLA on that machine.
He adds: Not all problems are quantum problems. It can't just be quantum engineers; it's not scalable if it's the same people doing everything.
It's about training and understanding where the first and second lines of support sit, having a cascading system, and utilizing any smart hands so we can train people who already exist in data centers.
While the quantum startups are undergoing their own maturing process, their suppliers are also being forced to learn about the needs of commercial operators and what it means to deploy in a production data center.
For years, the supply chain, including for the dilution refrigerators that keep many quantum computers supercooled, has dealt with largely self-reliant academic customers in lab spaces.
Richard Moulds, general manager of Amazon Braket at AWS, told DCD the dilution refrigerator market is a cottage industry with few suppliers.
One of the main fridge suppliers is Oxford Instruments, an Oxford University spin-out from the late 1950s that released the first commercial dilution unit back in 1966. The other large incumbent, Bluefors, was spun out of what is now the Low Temperature Laboratory at Aalto University in Finland 15 years ago.
Prior to the quantum computing rush, the biggest change in recent years was the introduction of pulse tube technology. Instead of a cryostat inserted into a bath of liquid helium-4, quantum computers could now use a closed loop system (aka a dry fridge/cryostat).
This meant the systems could become smaller, more efficient, more software-controlled - and more user-friendly.
With the wet dilution fridge (or wet cryostat), you need two-floor rooms for ceiling height. You need technicians to top up helium and run liquefiers, you need to buy helium to keep topping up, says Harriet van der Vliet, product segment manager, quantum technologies, Oxford Instruments.
It was quite a manual process and it would take maybe a week just to pre-cool and that would not even be getting to base temperature.
For years, the fridges were the preserve of academics doing materials science; they were more likely to win a Nobel prize than be part of a computing contract.
Historically, it's been a lab product. Our customers were ultra-low temperature (ULT) experts; if anything went wrong, they would fix it themselves, says van der Vliet. Now our customers have moved from being simply academics to being commercial players who need user-friendly systems that are push button.
While the company declined to break out numbers, Oxford said it has seen a noticeable change in the customer demographic towards commercial quantum computing customers in recent years, but also a change in buying trends. QPU companies are more likely to buy multiple fridges at once, rather than a single unit every few years for an academic research lab.
The commercial part is growing for sure, adds David Gunnarsson, CTO at Bluefors. The company has expanded factory capacity to almost double production capabilities to meet growing demand.
There have been more and more attempts to create revenue on quantum computing technology. They are buying our systems to actually deploy or have an application that they think they can create money from. We welcome discussion with data centers so they can understand our technology from the cryogenics perspective.
And while the industry is working towards minimizing form factors as much as possible, for the foreseeable future the industry has settled on essentially brute force supercooling with bigger fridges. Both companies have released new dilution fridges designed for quantum computers.
Smaller fridges (and lower qubit-count systems) may be able to fit into racks, but most larger qubit-count supercooled systems require a much larger footprint than traditional racks. Bluefors' largest Kide system can cool around 1,000 qubits: the system is just under three meters in height and 2.5 meters in diameter, and the floor beneath it needs to be able to take about 7,000 kilograms of weight.
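As a back-of-the-envelope check using only the figures quoted above, that works out to roughly 1,400 kilograms per square meter of floor, which is why floor loading comes up at all in these conversations.

```python
import math

# Illustrative arithmetic only, based on the Kide dimensions and mass quoted above.
mass_kg = 7000
diameter_m = 2.5
footprint_m2 = math.pi * (diameter_m / 2) ** 2
print(f"footprint: {footprint_m2:.1f} m^2")                     # about 4.9 m^2
print(f"average load: {mass_kg / footprint_m2:.0f} kg/m^2")     # about 1430 kg/m^2
```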
It has changed the way we do our product, says Gunnarsson. They were lab tools before; uptime wasn't discussed much before. Now we are making a lot of changes to our product line to ensure that you can be more certain about what the uptime of your system will be.
Part of the uptime challenge suppliers face around fridges, an area where Gunnarsson notes there is still something of a mismatch, is the warm-up/cool-down cycle of the machines.
While previously the wet bath systems could take a week to get to the required temperatures, the new dry systems might only take a day or two each way. That is important, because cooling down and warming up cycles are effectively downtime; a dirty word when talking about service availability.
The speed with which you can get to temperature is almost as important as the size of the chip that you can actually chill, says AWS' Moulds. Today, if you want to change the device's physical silicon, you have got to warm this device up and then chill it back down again, that's a four-day cycle. That's a problem; it means machines are offline for a long time for relatively minor changes.
While this might not be an issue for in-operation machines (Rigetti CTO Rivas says its machines can be in service for months at a time, while Oxford Instruments says an OQC system was in operation non-stop for more than a year), the long warm-up/cool-down cycle is a barrier to rapid testing.
From a production perspective, the systems remain cold for a relatively long time, says Rivas. But we're constantly running chips through test systems as we innovate and grow capacity, and 48 hours to cool a chip down is a long time in an overall development cycle.
Oxford Instruments and Bluefors might be the incumbents, but there are a growing number of new players entering the fridge space, some specifically focusing on quantum computing.
The market has grown for dilution fridges, so there are lots more startups in the space as well making different cooling systems, says van der Vliet. There are many more players, but the market is growing.
I think it's really healthy that there's loads of players in the field, particularly new players who are doing things a little bit differently to how we've always done it.
The incumbents are well-placed to continue their lead in the market, but QPU operators are hopeful that competition will result in better products.
There will be genuine intellectual property that will emerge in this area and you'll definitely start to see custom designs and proprietary systems that can maintain temperature in the face of increasing power.
Atom's Hays notes that, for laser-based quantum systems, the lasers themselves are probably the largest constraint in the supply chain. Like the dilution fridges, these are still largely scientific technologies made by a handful of suppliers.
We need relatively high-powered lasers that need to be very quiet and very precise," he says. Ours are off the shelf, but they're semi-custom and manufacturer builds to order. That means that there's long lead times; in some cases up to a year.
He adds that many of the photonic integrated circuits are still relatively small - the size of nickels and dimes - but hopes they can shrink down to semiconductor size in future to help reduce the footprint.
For now, the quantum industry is still enjoying what might be the autumn of its happy-go-lucky academic days. The next phase may well lead to quantum supremacy and a new phase in high-performance computing, but it will likely lead to a less open industry.
I think it's nice that the industry is still sort of in that mode, says AWS' Moulds. The industry is still taking a relatively open approach to the development. We're not yet in the mode of everybody working in their secret bunkers, building secret machines. But history shows that once there's a clear opportunity, there's a risk of the shutters coming down, and it becoming a more cut-throat industry.
In the end, that's good for customers; it drives down costs and drives up reliability and performance. But it might feel a little bit brutal for some of the academics that are in the industry now.
Quantum Attack Protection Added to HP Business PCs – SecurityWeek
Posted: at 2:40 am
HP announced on Thursday that several of its business PCs now benefit from protection against quantum computer attacks thanks to a new security chip.
The tech giant said the 5th generation of its Endpoint Security Controller (ESC) chip, which is built into some of its computers, can protect the integrity of the device's firmware using quantum-resistant cryptography.
According to HP, the 5th generation ESC is currently available in ZBook Firefly, Power and Studio workstations; EliteBook 1000 series, 800 series and some 600 series notebooks; and some 400 series ProBook notebooks.
By embedding protection against quantum computer hacks at the chip level, HP is today setting a new standard in hardware and firmware security with our 5th generation ESC chip, HP said. By isolating the chip from the processor and OS, the ESC provides a hardware platform that reduces the risk of data breaches and improves productivity by preventing downtime.
While practical quantum computer attacks may still be at least a decade away, major tech companies have already started taking steps to ensure that the cryptography used in their products will be able to provide protection against quantum attacks when that day comes.
Apple, for instance, recently announced adding post-quantum encryption to iMessage to protect communications against quantum computing attacks.
Governments have also started taking steps to tackle the theoretical threats posed by quantum computing before they become a reality.
HP urges businesses to immediately start planning for the future and begin migrating their fleets. The company recommends identifying the highest priority use cases, finding out what technology providers are planning with regard to quantum protections, and creating a plan to ensure protection is rolled out in the required timeframe.