
Archive for the ‘Quantum Computer’ Category

Honeywell Wants To Show What Quantum Computing Can Do For The World – Forbes

Posted: August 14, 2020 at 11:51 pm


without comments

The race for quantum supremacy heated up in June, when Honeywell brought to market the world's highest-performing quantum computer. Honeywell claims it is more accurate (i.e., performs with fewer errors) than competing systems and that its performance will increase by an order of magnitude each year for the next five years.

Inside the chamber of Honeywell's quantum computer

"The beauty of quantum computing," says Tony Uttley, President of Honeywell Quantum Solutions, "is that once you reach a certain level of accuracy, every time you add a qubit [the basic unit of quantum information] you double the computational capacity. So as the quantum computer scales exponentially, you can scale your problem set exponentially."

Tony Uttley, President, Honeywell Quantum Solutions

Uttley sees three distinct eras in the evolution of quantum computing. Today, we are in the "emergent" era: you can start to prove what kinds of things work and what kinds of algorithms show the most promise. For example, the Future Lab for Applied Research and Engineering (FLARE) group of JPMorgan Chase published a paper in June summarizing the results of running complex mathematical calculations used in financial trading applications on the Honeywell quantum computer.

The next era Uttley calls "classically impractical": running computations on a quantum computer that typically are not run on today's (classical) computers because they take too long, consume too much power, and cost too much. Crossing the threshold from emergent to classically impractical is not very far away, he asserts, probably sometime in the next 18 to 24 months. "This is when you build the trust with the organizations you work with that the answer that is coming from your quantum computer is the correct one," says Uttley.

The companies that understand the potential impact of quantum computing on their industries are already looking at what it would take to introduce this new computing capability into their existing processes and what they need to adjust or develop from scratch, according to Uttley. These companies will be ready for the shift from emergent to classically impractical, which is going to be a binary moment, and they will be able to take advantage of it immediately.

The last stage of the quantum evolution will be "classically impossible": "You couldn't, in the timeframe of the universe, do this computation on the best-performing classical supercomputer, but you can on a quantum computer," says Uttley. He mentions quantum chemistry, machine learning, and optimization challenges (warehouse routing, aircraft maintenance) as applications that will benefit from quantum computing. But what shows the most promise right now are "hybrid [resources]: you do just one thing, very efficiently, on a quantum computer, and run the other parts of the algorithm or calculation on a classical computer." Uttley predicts that for the foreseeable future we will see co-processing, combining the power of today's computers with the power of emerging quantum computing solutions.

"You want to use a quantum computer for the more probabilistic parts [of the algorithm] and a classical computer for the more mundane calculations; that might reduce the number of qubits needed," explains Gavin Towler, vice president and chief technology officer of Honeywell Performance Materials Technologies. Towler leads R&D activities for three of Honeywell's businesses: Advanced Materials (e.g., refrigerants), UOP (equipment and services for the oil and gas sector), and Process Automation (automation, control systems, and software for all the process industries). As such, he is the poster boy for a quantum computing lead-user.

Gavin Towler, Vice President and Chief Technology Officer, Honeywell Performance Materials and Technologies

"In the space of materials discovery, quantum computing is going to be critical. That's not a might or a could be. It is going to be the way people do molecular discovery," says Towler. Molecular simulation is used in the design of new molecules, requiring the designer to understand quantum effects. These are intrinsically probabilistic, as are quantum computers, Towler explains.

An example he provides is a refrigerant Honeywell produces that is used in automotive air conditioning, supermarket refrigeration, and homes. As the chlorinated molecules in the refrigerants were causing the hole in the ozone layer, they were replaced by HFCs, which later turned out to be very potent greenhouse gases. Honeywell has already found a suitable replacement for the refrigerant used in automotive air conditioning, but is searching for similar solutions for other refrigeration applications. Synthesizing molecules in the lab that will prove to have no effect on the ozone layer or global warming, and will not be toxic or flammable, is costly. Computer simulation replaces lab work, but "ideally, you want to have computer models that will screen things out to identify leads much faster," says Towler.

This is where the speed of a quantum computer will make a difference, starting with simple molecules like the ones found in refrigerants, or in the solvents used to remove CO2 from processes prevalent in the oil and gas industry. "These are relatively simple molecules, with 10-20 atoms, amenable to being modeled with [today's] quantum computers," says Towler. In the future, he expects more powerful quantum computers to assist in developing vaccines and finding new drugs, polymers, and biodegradable plastics: things that contain hundreds and thousands of atoms.

There are three ways by which Towler's counterparts in other companies, the lead-users who are interested in experimenting with quantum computing, can currently access Honeywell's solution: running their programs directly on Honeywell's quantum computer; going through Microsoft Azure Quantum services; or working with two startups that Honeywell has invested in, Cambridge Quantum Computing (CQC) and Zapata Computing, both of which assist in turning business challenges into quantum computing and hybrid computing algorithms.

Honeywell brings to the emerging quantum computing market a variety of skills in multiple disciplines, with its decades-long experience with precision control systems possibly the most important one. "Any at-scale quantum computer becomes a controls problem," says Uttley, "and we have experience in some of the most complex systems integration problems in the world." These past experiences have prepared Honeywell to show what quantum computing can do for the world and to rapidly scale up its solution. "We've built a big auditorium but we are filling just a few seats right now, and we have lots more seats to fill," is how Uttley sums up this point in time in Honeywell's journey to quantum supremacy.

See the original post here:

Honeywell Wants To Show What Quantum Computing Can Do For The World - Forbes

Written by admin

August 14th, 2020 at 11:51 pm

Posted in Quantum Computer

Quantum Computing for the Next Generation of Computer Scientists and Researchers – Campus Technology

Posted: at 11:51 pm


without comments

C-Level View | Feature

A Q&A with Travis Humble

Travis Humble is a distinguished scientist and director of the Quantum Computing Institute at Oak Ridge National Laboratory. The institute is a lab-wide organization that brings together all of ORNL's capabilities to address the development of quantum computers. Humble is also an academic, holding a joint faculty appointment at the University of Tennessee, where he is an assistant professor with the Bredesen Center for Interdisciplinary Research and Graduate Education. In the following Q&A, Humble gives CT his unique perspectives on the advancement of quantum computing and its entry into higher education curricula and research.

"It's an exciting area that's largely understaffed. There are far more opportunities than there are people currently qualified to approach quantum computing." Travis Humble

Mary Grush: Working at the Oak Ridge National Laboratory as a scientist and at the University of Tennessee as an academic, you are in a remarkable position to watch both the development of the field of quantum computing and its growing importance in higher education curricula and research. First, let me ask about your role at the Bredesen Center for Interdisciplinary Research and Graduate Education. The Bredesen Center draws on resources from both ORNL and UT. Does the center help move quantum computing into the realm of higher education?

Travis Humble: Yes. The point of the Bredesen Center is to do interdisciplinary research, to educate graduate students, and to address the interfaces and frontiers of science that don't fall within the conventional departments.

For me, those objectives are strongly related to my role at the laboratory, where I am a scientist working in quantum information. And the joint work ORNL and UT do in quantum computing is training the next generation of the workforce that's going to be able to take advantage of the tools and research that we're developing at the laboratory.

Grush: Are ORNL and UT connected to bring students to the national lab to experience quantum computing?

Humble: They are so tightly connected that it works very well for us to have graduate students onsite performing research in these topics, while at the same time advancing their education through the university.

Grush: How does ORNL's Quantum Computing Institute, where you are director, promote quantum computing?

Humble: As part of my work with the Quantum Computing Institute, I manage research portfolios and direct resources towards our most critical needs at the moment. But I also use that responsibility as a gateway to get people involved with quantum computing: It's an exciting area that's largely understaffed. There are far more opportunities than there are people currently qualified to approach quantum computing.

The institute is a kind of storefront through which people from many different areas of science and engineering can become involved in quantum computing. It is there to help them get involved.

Grush: Let's get a bit of perspective on quantum computing why is it important?

Humble: Quantum computing is a new approach to the ways we could build computers and solve problems. This approach uses quantum mechanics, which supports the most fundamental theories of physics. We've had a lot of success in understanding quantum mechanics; it's the technology that lasers, transistors, and a lot of the things we rely on today were built on.

But it turns out there's a lot of untapped potential there: We could take further advantage of some of the features of quantum physics, by building new types of technologies.

Here is the original post:

Quantum Computing for the Next Generation of Computer Scientists and Researchers - Campus Technology

Written by admin

August 14th, 2020 at 11:51 pm

Posted in Quantum Computer

Quantum mechanics is immune to the butterfly effect – The Economist

Posted: at 11:51 pm


without comments

That could help with the design of quantum computers

Aug 15th 2020

IN RAY BRADBURY'S science-fiction story "A Sound of Thunder", a character time-travels far into the past and inadvertently crushes a butterfly underfoot. The consequences of that minuscule change ripple through reality such that, upon the time-traveller's return, the present has been dramatically changed.

The butterfly effect describes the high sensitivity of many systems to tiny changes in their starting conditions. But while it is a feature of classical physics, it has been unclear whether it also applies to quantum mechanics, which governs the interactions of tiny objects like atoms and fundamental particles. Bin Yan and Nikolai Sinitsyn, a pair of physicists at Los Alamos National Laboratory, decided to find out. As they report in Physical Review Letters, quantum-mechanical systems seem to be more resilient than classical ones. Strangely, they seem to have the capacity to repair damage done in the past as time unfolds.

To perform their experiment, Drs Yan and Sinitsyn ran simulations on a small quantum computer made by IBM. They constructed a simple quantum system consisting of qubits, the quantum analogue of the familiar one-or-zero bits used by classical computers. Like an ordinary bit, a qubit can be either one or zero. But it can also exist in superposition, a chimerical mix of both states at once.

Having established the system, the authors prepared a particular qubit by setting its state to zero. That qubit was then allowed to interact with the others in a process called quantum scrambling, which, in this case, mimics the effect of evolving a quantum system backwards in time. Once this virtual foray into the past was completed, the authors disturbed the chosen qubit, destroying its local information and its correlations with the other qubits. Finally, the authors performed a reversed scrambling process on the now-damaged system. This was analogous to running the quantum system all the way forwards in time to where it all began.

They then checked to see how similar the final state of the chosen qubit was to the zero-state it had been assigned at the beginning of the experiment. The classical butterfly effect suggests that the researchers' meddling should have changed it quite drastically. In the event, the qubit's original state had been almost entirely recovered. Its state was not quite zero but it was, in quantum-mechanical terms, 98.3% of the way there, a difference that was deemed insignificant. "The final output state after the forward evolution is essentially the same as the input state before backward evolution," says Dr Sinitsyn. "It can be viewed as the same input state plus some small background noise." Oddest of all was the fact that the further back in simulated time the damage was done, the greater the rate of recovery, as if the quantum system was repairing itself with time.
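
For readers who want to poke at the idea themselves, the protocol is simple enough to mock up on today's cloud-accessible machines. The sketch below, written in Python with the open-source Qiskit library, is purely illustrative: the entangling "scrambler" is a generic stand-in, not the circuit Drs Yan and Sinitsyn actually used.

    # Sketch of the scramble / damage / unscramble protocol (illustrative;
    # the scrambling circuit is a generic stand-in, not the paper's).
    from qiskit import QuantumCircuit

    n = 4  # a small register of qubits; qubit 0 starts in state zero

    scramble = QuantumCircuit(n)
    for q in range(n):
        scramble.h(q)          # put each qubit into superposition
    for q in range(n - 1):
        scramble.cx(q, q + 1)  # entangle neighbouring qubits

    protocol = QuantumCircuit(n, 1)
    protocol.compose(scramble, inplace=True)            # evolve "backwards in time"
    protocol.x(0)                                       # damage the chosen qubit
    protocol.compose(scramble.inverse(), inplace=True)  # run time forwards again
    protocol.measure(0, 0)  # how close is qubit 0 to its original zero?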

The mechanism behind all this is known as entanglement. As quantum objects interact, their states become highly correlated, or entangled, in a way that serves to diffuse localised information about the state of one quantum object across the system as a whole. Damage to one part of the system does not destroy information in the same way as it would with a classical system. Instead of losing your work when your laptop crashes, having a highly entangled system is a bit like having back-ups stashed in every room of the house. Even though the information held in the disturbed qubit is lost, its links with the other qubits in the system can act to restore it.

The upshot is that the butterfly effect seems not to apply to quantum systems. Besides making life safe for tiny time-travellers, that may have implications for quantum computing, too, a field into which companies and countries are investing billions of dollars. "We think of quantum systems, especially in quantum computing, as very fragile," says Natalia Ares, a physicist at the University of Oxford. That this result demonstrates that quantum systems can in fact be unexpectedly robust is an encouraging finding, and bodes well for potential future advances in the field.

This article appeared in the Science & technology section of the print edition under the headline "A flutter in time"

Read more:

Quantum mechanics is immune to the butterfly effect - The Economist

Written by admin

August 14th, 2020 at 11:51 pm

Posted in Quantum Computer

Major quantum computational breakthrough is shaking up physics and maths – The Conversation UK

Posted: at 11:51 pm


without comments

MIP* = RE is not a typo. It is a groundbreaking discovery and the catchy title of a recent paper in the field of quantum complexity theory. Complexity theory is a zoo of complexity classes, collections of computational problems, of which MIP* and RE are but two.

The 165-page paper shows that these two classes are the same. That may seem like an insignificant detail in an abstract theory without any real-world application. But physicists and mathematicians are flocking to visit the zoo, even though they probably don't understand it all, because it turns out the discovery has astonishing consequences for their own disciplines.

In 1936, Alan Turing showed that the Halting Problem, algorithmically deciding whether a computer program halts or loops forever, cannot be solved. Modern computer science was born. Its success gave the impression that soon all practical problems would yield to the tremendous power of the computer.

But it soon became apparent that, while some problems can be solved algorithmically, the actual computation will last long after our Sun has engulfed the computer performing the computation. Figuring out how to solve a problem algorithmically was not enough. It was vital to classify solutions by efficiency. Complexity theory classifies problems according to how hard it is to solve them. The hardness of a problem is measured in terms of how long the computation lasts.

RE stands for recursively enumerable, the problems for which a computer can confirm a "yes" answer in finite time (though it may run forever when the answer is "no"). It is the zoo. Let's have a look at some subclasses.

The class P consists of problems that a known algorithm can solve quickly (technically, in polynomial time). For instance, multiplying two numbers belongs to P, since long multiplication is an efficient algorithm to solve the problem. The problem of finding the prime factors of a number is not known to be in P; the problem can certainly be solved by a computer, but no known algorithm can do so efficiently. A related problem, deciding if a given number is prime, was in similar limbo until 2004, when an efficient algorithm showed that this problem is in P.

Another complexity class is NP. Imagine a maze. "Is there a way out of this maze?" is a yes/no question. If the answer is yes, then there is a simple way to convince us: simply give us the directions, we'll follow them, and we'll find the exit. If the answer is no, however, we'd have to traverse the entire maze without ever finding a way out to be convinced.

Yes/no problems for which, if the answer is yes, we can efficiently demonstrate that fact belong to NP. Any solution to a problem serves to convince us of the answer, and so P is contained in NP. Surprisingly, a million-dollar question is whether P = NP. Nobody knows.
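
To make the asymmetry concrete, here is a toy Python verifier (our illustration, not from the article). Checking a claimed escape route takes time proportional to the route's length, even though finding a route may require exploring the whole maze.

    def verify_escape(open_cells, path, exit_cell):
        # Toy check: each step must move exactly one cell and stay on
        # open cells (we skip checking the start cell, for brevity).
        for here, there in zip(path, path[1:]):
            if there not in open_cells:
                return False  # walked into a wall
            if abs(here[0] - there[0]) + abs(here[1] - there[1]) != 1:
                return False  # illegal jump
        return path[-1] == exit_cell  # O(len(path)): efficient to check

    maze = {(0, 0), (0, 1), (1, 1)}  # the open cells of a tiny maze
    print(verify_escape(maze, [(0, 0), (0, 1), (1, 1)], (1, 1)))  # True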

The classes described so far represent problems faced by a normal computer. But computers are fundamentally changing: quantum computers are being developed. If a new type of computer comes along and claims to solve one of our problems, how can we trust it is correct?

Imagine an interaction between two entities, an interrogator and a prover. In a police interrogation, the prover may be a suspect attempting to prove their innocence. The interrogator must decide whether the prover is sufficiently convincing. There is an imbalance; knowledge-wise the interrogator is in an inferior position.

In complexity theory, the interrogator is the person, with limited computational power, trying to solve the problem. The prover is the new computer, which is assumed to have immense computational power. An interactive proof system is a protocol that the interrogator can use in order to determine, at least with high probability, whether the prover should be believed. By analogy, these are crimes that the police may not be able to solve, but at least innocents can convince the police of their innocence. This is the class IP.

If multiple provers can be interrogated, and the provers are not allowed to coordinate their answers (as is typically the case when the police interrogate multiple suspects), then we get to the class MIP. Such interrogations, via cross-examining the provers' responses, provide the interrogator with greater power, so MIP contains IP.

Quantum communication is a new form of communication carried out with qubits. Entanglement, a quantum feature in which qubits remain spookily linked even if separated, makes quantum communication fundamentally different to ordinary communication. Allowing the provers of MIP to share entangled qubits leads to the class MIP*.

It seems obvious that entanglement between the provers could only serve to help them coordinate lies rather than assist the interrogator in discovering truth. For that reason, nobody expected that allowing it would make more computational problems verifiable. Surprisingly, we now know that MIP* = RE. This means that entangled provers behave wildly differently to ordinary ones.
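
Putting the classes mentioned so far on one line, with the final equality being the paper's new result:

    P ⊆ NP ⊆ IP ⊆ MIP ⊆ MIP* = RE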

In the 1970s, Alain Connes formulated what became known as the Connes Embedding Problem. Grossly simplified, this asked whether infinite matrices can be approximated by finite matrices. The new paper has now proved this isn't possible, an important finding for pure mathematicians.

In 1993, meanwhile, Boris Tsirelson pinpointed a problem in physics now known as Tsirelson's Problem. This concerned two different mathematical formalisms of a single situation in quantum mechanics, to date an incredibly successful theory that explains the subatomic world. Being two different descriptions of the same phenomenon, it was to be expected that the two formalisms were mathematically equivalent.

But the new paper now shows that they aren't. Exactly how they can both still yield the same results and both describe the same physical reality is unknown, but it is why physicists are also suddenly taking an interest.

Time will tell what other unanswered scientific questions will yield to the study of complexity. Undoubtedly, MIP* = RE is a great leap forward.

See more here:

Major quantum computational breakthrough is shaking up physics and maths - The Conversation UK

Written by admin

August 14th, 2020 at 11:51 pm

Posted in Quantum Computer

IEEE International Conference on Quantum Computing and Engineering (QCE20) Transitions to All-Virtual Event – PRNewswire

Posted: at 11:51 pm


without comments

The exciting QCE20 conference program features over 270 hours of programming. Each day the QCE20 conference, also known as IEEE Quantum Week, will virtually deliver 9-10 parallel tracks of world-class keynotes, workforce-building tutorials, community-building workshops, technical paper presentations, innovative posters, and thought-provoking panels through a digital combination of pre-recorded and live-streamed sessions. Attendees will be able to participate in live Q&A sessions with keynote speakers and panelists, paper and poster authors, as well as tutorial and workshop speakers. Birds of a Feather, Networking, and Beautiful Colorado sessions spice up the program between technical sessions. Recorded QCE20 sessions will be available on-demand until November 30.

"With our expansive technical program and lineup of incredible presentations from thought-leaders all over the globe, this is shaping up to be the quantum event of the year," said Hausi Mller, QCE20 General Chair, IEEE Quantum Initiative Co-Chair. "I encourage all professionals and enthusiasts to become a quantum computing champion by engaging and participating in the inaugural IEEE International Conference on Quantum Computing & Engineering (QCE20)."

Workshops and tutorials will be conducted according to their pre-determined schedule in a live, virtual format. The QCE20 tutorials program features 16 tutorials by leading experts aimed squarely at workforce development and training considerations, and 21 QCE20 workshops provide forums for group discussions on topics in quantum research, practice, education, and applications.

Ten outstanding keynote speakers will address quantum computing and engineering topics at the beginning and at the end of each conference day, providing insights to stimulate discussion for the networking sessions and exhibits.

QCE20 panel sessions will explore various perspectives of quantum topics, including quantum education and training, quantum hardware and software, quantum engineering challenges, fault-tolerant quantum computers, quantum error correction, quantum intermediate language representation, hardware-software co-design, and hybrid quantum-classical computing platforms. Visit Enabling and Growing the Quantum Industry to view the newest addition to the lineup.

Over 20 QCE20 exhibitors and sponsors, including Platinum sponsors IBM, Microsoft, and Honeywell, and Gold sponsors Quantropi and Zapata, will be featured Monday through Friday in virtual exhibit rooms offering numerous opportunities for networking.

QCE20 is co-sponsored by the IEEE Computer Society, IEEE Communications Society, IEEE Photonics Society, IEEE Council on Superconductivity, IEEE Electronics Packaging Society, IEEE Future Directions Quantum Initiative, and IEEE Technology and Engineering Management Society.

Register to be a part of the highly anticipated virtual IEEE Quantum Week 2020.

Visit qce.quantum.ieee.org for all program details, as well as sponsorship and exhibitor opportunities.

About the IEEE Computer Society The IEEE Computer Society is the world's home for computer science, engineering, and technology. A global leader in providing access to computer science research, analysis, and information, the IEEE Computer Society offers a comprehensive array of unmatched products, services, and opportunities for individuals at all stages of their professional career. Known as the premier organization that empowers the people who drive technology, the IEEE Computer Society offers international conferences, peer-reviewed publications, a unique digital library, and training programs. Visit http://www.computer.org for more information.

About the IEEE Communications Society The IEEE Communications Society promotes technological innovation and fosters creation and sharing of information among the global technical community. The Society provides services to members for their technical and professional advancement and forums for technical exchanges among professionals in academia, industry, and public institutions.

About the IEEE Photonics Society The IEEE Photonics Society forms the hub of a vibrant technical community of more than 100,000 professionals dedicated to transforming breakthroughs in quantum physics into the devices, systems, and products to revolutionize our daily lives. From ubiquitous and inexpensive global communications via fiber optics, to lasers for medical and other applications, to flat-screen displays, to photovoltaic devices for solar energy, to LEDs for energy-efficient illumination, there are myriad examples of the Society's impact on the world around us.

About the IEEE Council on Superconductivity The IEEE Council on Superconductivity and its activities and programs cover the science and technology of superconductors and their applications, including materials and their applications for electronics, magnetics, and power systems, where the superconductor properties are central to the application.

About the IEEE Electronics Packaging Society The IEEE Electronics Packaging Society is the leading international forum for scientists and engineers engaged in the research, design, and development of revolutionary advances in microsystems packaging and manufacturing.

About the IEEE Future Directions Quantum Initiative IEEE Quantum is an IEEE Future Directions initiative launched in 2019 that serves as IEEE's leading community for all projects and activities on quantum technologies. IEEE Quantum is supported by leadership and representation across IEEE Societies and OUs. The initiative addresses the current landscape of quantum technologies, identifies challenges and opportunities, leverages and collaborates with existing initiatives, and engages the quantum community at large.

About the IEEE Technology and Engineering Management Society IEEE TEMS encompasses the management sciences and practices required for defining, implementing, and managing engineering and technology.

SOURCE IEEE Computer Society

http://www.computer.org

Excerpt from:

IEEE International Conference on Quantum Computing and Engineering (QCE20) Transitions to All-Virtual Event - PRNewswire

Written by admin

August 14th, 2020 at 11:51 pm

Posted in Quantum Computer

6 new degrees approved, including graduate degrees in biostatistics and quantum information science: News at IU – IU Newsroom

Posted: at 11:51 pm


without comments

The Indiana University Board of Trustees has approved six new degrees, four of which are graduate level.

All of the new graduate degrees are on the Bloomington campus: master's and doctoral degrees in biostatistics, a Master of Science in quantum information science, and a Master of International Affairs.

Also approved were a Bachelor of Arts in theater, film and television at IUPUI and a Bachelor of Science in accounting at IU East.

The master's and doctoral degrees in biostatistics are offered by the Department of Epidemiology and Biostatistics in the School of Public Health-Bloomington. They will focus on rural public health issues and specialized areas in public health research, such as the opioid epidemic.

Biostatistics is considered a high-demand job field. Both degrees are intended to meet the labor market and educational and research needs of the state, which is trying to reduce negative health outcomes. Biostatisticians typically are hired by state and local health departments, federal government agencies, medical centers, medical device companies and pharmaceutical companies, among others.

The Master of Science in quantum information science will involve an intensive, one-year, multidisciplinary program with tracks that tie into physics, chemistry, mathematics, computer science, engineering and business. It's offered through the Office of Multidisciplinary Graduate Programs in the University Graduate School. The degree was proposed by the College of Arts and Sciences, the Luddy School of Informatics, Computing and Engineering, and the Kelley School of Business.

Most of the faculty who will teach the classes are members of the newly established IU Quantum Science and Engineering Center.

Students who earn the Master of Science in quantum information science can pursue careers with computer and software companies that are active with quantum computation, and national labs involved in quantum information science, among other opportunities.

The Master of International Affairs is a joint degree by the O'Neill School of Public and Environmental Affairs and the Hamilton-Lugar School of Global and International Studies. The degree is the first of its kind offered by any IU campus and meets student demand for professional master's programs having an international focus.

Featured components of the degree include the study of international relations and public administration. Graduates can expect to find employment in the federal government, such as the Department of State, the Department of Treasury or the U.S. intelligence community, or with private-sector firms in fields such as high-tech, global trade and finance.

The Bachelor of Arts in theater, film and television combines existing programs and provides them a more visible home in the School of Liberal Arts at IUPUI. The degree features three distinct concentrations:

Applied theater is a growing field that emphasizes and works with organizations around issues of social justice, social change, diversity and inclusion.

IU East's Bachelor of Science in accounting degree, offered through the School of Business and Economics, helps meet projected high demand in the accounting industry. It also will prepare students to take the certified public accountant or certified managerial accountant exams, or enter graduate programs in accounting or business.

Original post:

6 new degrees approved, including graduate degrees in biostatistics and quantum information science: News at IU - IU Newsroom

Written by admin

August 14th, 2020 at 11:51 pm

Posted in Quantum Computer

The race to building a fully functional quantum stack – TechCrunch

Posted: at 11:51 pm


without comments


Quantum computers exploit the seemingly bizarre yet proven nature of the universe: until a particle interacts with another, its position, speed, color, spin, and other quantum properties coexist simultaneously as a probability distribution over all possibilities, in a state known as superposition. Quantum computers use isolated particles as their most basic building blocks, relying on any one of these quantum properties to represent the state of a quantum bit (or qubit). So while classical computer bits always exist in a mutually exclusive state of either 0 (low energy) or 1 (high energy), qubits in superposition coexist simultaneously in both states, as 0 and 1.

Things get interesting at a larger scale, as QC systems are capable of isolating a group of entangled particles, which all share a single state of superposition. While a single qubit coexists in two states, a set of eight entangled qubits (or 8Q), for example, simultaneously occupies all 2^8 (or 256) possible states, effectively processing all these states in parallel. It would take 57Q (representing 2^57 parallel states) for a QC to outperform even the world's strongest classical supercomputer. A 64Q computer would surpass it by 100x (clearly achieving quantum advantage), and a 128Q computer would surpass it a quintillion times.
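
The arithmetic behind those figures is plain exponent bookkeeping; here is a minimal Python illustration of the numbers quoted above.

    def parallel_states(n_qubits):
        # n entangled qubits in superposition span 2**n basis states
        return 2 ** n_qubits

    print(parallel_states(8))                          # 256, as above
    print(parallel_states(64) // parallel_states(57))  # 128, consistent with
                                                       # the ~100x claim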

In the race to develop these computers, nature has inserted two major speed bumps. First, isolated quantum particles are highly unstable, and so quantum circuits must execute within extremely short periods of coherence. Second, measuring the output energy level of subatomic qubits requires extreme levels of accuracy that tiny deviations commonly thwart. Informed by university research, leading QC companies like IBM, Google, Honeywell and Rigetti develop quantum engineering and error-correction methods to overcome these challenges as they scale the number of qubits they can process.

Following the challenge to create working hardware, software must be developed to harvest the benefits of parallelism even though we cannot see what is happening inside a quantum circuit without losing superposition. When we measure the output value of a quantum circuits entangled qubits, the superposition collapses into just one of the many possible outcomes. Sometimes, though, the output yields clues that qubits weirdly interfered with themselves (that is, with their probabilistic counterparts) inside the circuit.

QC scientists at UC Berkeley, University of Toronto, University of Waterloo, UT Sydney and elsewhere are now developing a fundamentally new class of algorithms that detect the absence or presence of interference patterns in QC output to cleverly glean information about what happened inside.

A fully functional QC must, therefore, incorporate several layers of a novel technology stack, incorporating both hardware and software components. At the top of the stack sits the application software for solving problems in chemistry, logistics, etc. The application typically makes API calls to a software layer beneath it (loosely referred to as a compiler) that translates function calls into circuits to implement them. Beneath the compiler sits a classical computer that feeds circuit changes and inputs to the Quantum Processing Unit (QPU) beneath it. The QPU typically has an error-correction layer, an analog processing unit to transmit analog inputs to the quantum circuit and measure its analog outputs, and the quantum processor itself, which houses the isolated, entangled particles.
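
As a concrete, deliberately simplified illustration of that flow, the sketch below uses the open-source Qiskit library; the basis-gate set is an arbitrary assumption, and the hand-off to a real QPU is only indicated in a comment.

    from qiskit import QuantumCircuit, transpile

    # Application layer: express the problem as a circuit request.
    bell = QuantumCircuit(2, 2)
    bell.h(0)
    bell.cx(0, 1)
    bell.measure([0, 1], [0, 1])

    # "Compiler" layer: translate the request into gates the target QPU
    # supports (this particular basis-gate set is an illustrative assumption).
    compiled = transpile(bell, basis_gates=["rz", "sx", "cz"],
                         optimization_level=1)

    # The classical layer beneath would now stream `compiled` through the
    # QPU's error-correction and analog front end; on real hardware a
    # vendor backend object stands in for that hand-off.
    print(compiled)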

Read the original:

The race to building a fully functional quantum stack - TechCrunch

Written by admin

August 14th, 2020 at 11:51 pm

Posted in Quantum Computer

Toshiba Exits PC Business: 35 Years of IBM Compatible PCs – Electropages

Posted: at 11:51 pm


without comments

Recently, Toshiba announced that it would sell the remainder of its computer and laptop operations to Sharp after 35 years of working in the sector. Who is Toshiba, what products did Toshiba produce, and what will Toshiba look towards for its future endeavours?

Toshiba is a Japanese multinational conglomerate with its headquarters located in Minato, Tokyo. Toshiba provides a wide range of services and products in many industries, including semiconductors, discrete electronics, hard drives, printers, and quantum cryptography. Founded in 1890, the company has over 140,000 employees worldwide, with yearly revenue of ¥3.693 trillion and an operating income of ¥35.4 billion. Toshiba is arguably most known for its consumer-end products, including televisions, laptops, and flash memory.

One of the biggest challenges faced by early computer makers was creating a portable machine that would allow individuals to work while on the move. The difficulty came from a multitude of problems, including heavy batteries, bulky floppy drives, and CRT screens that could easily weigh into the tens of kilograms. The first portable computer, called the Osborne, was developed in 1981, but its reliance on a mains plug made the computer more of a luggable as opposed to a portable platform (a battery pack was available, but only as an optional add-on). While the Osborne was the world's first portable computer, the first IBM-compatible PC laptop was produced by Toshiba in 1985, and it offered MS-DOS 2.11 and integrated an Intel 80C88 4.7MHz processor, 256KB of RAM, an internal 3.5-inch floppy drive, and a 640 x 200 display. Weighing only 4.1kg, the Toshiba T1100 is considered the first mass-produced laptop computer and provided a standard that other manufacturers would quickly follow.

While Toshiba has a long history of producing PC-compatible computers and laptops, the recent fall in sales has led to Toshiba selling the remainder of its stake in Dynabook to Sharp. To better understand just how much sales have fallen: Toshiba was selling over 17 million computers in 2011, a figure that had dropped to just 1.9 million in 2017. This fall in sales resulted in Toshiba pulling out of the European market in 2016, but even this move did not help entirely. The exact reason for this reduction in sales cannot be attributed to any one cause, but the mass influx of mobile devices such as tablets and smartphones, as well as the introduction of cloud-based applications, means that tasks that would typically be done on a computer can now be done on much smaller, more convenient devices.

Consumer demand for laptops has soared in the last few months because of the Coronavirus pandemic and global lockdowns, but overall, the market for personal computers has been tough for quite a while. Only those who have managed to sustain scale and price (like Lenovo), or have a premium brand (like Apple) have succeeded in the unforgiving PC market, where volumes have been falling for years.

While the PC market is incredibly vast, it is only a small part of the business that Toshiba has specialised in. This year (2020), Toshiba announced its plans to launch quantum cryptography services, develop affordable solid-state LiDAR, and produce hydrogen fuel cells. Toshiba also continues to develop its other industrial sectors, including electronic storage (flash, HDDs, etc.), building systems (elevators), energy systems, infrastructure, and retail. Such a move by Toshiba makes sense when considering that quantum computers are starting to find real-world applications, governments around the world are trying to move towards green technologies, and the rapid increase in internet usage is putting a strain on data centres.


More here:

Toshiba Exits PC Business: 35 Years of IBM Compatible PCs - Electropages

Written by admin

August 14th, 2020 at 11:51 pm

Posted in Quantum Computer

IBM Z mainframes revived by Red Hat, AI and security – TechTarget

Posted: at 11:51 pm


without comments


Published: 13 Aug 2020

Mainframe systems could play a significant role in cybersecurity and artificial intelligence advancements in years to come, and IBM is investing in those areas to ensure its IBM Z mainframes have a stake in those growing tech markets.

IBM mainframe sales grew some 69% during the second quarter of this year, achieving the highest year-over-year percentage increase of any IBM business unit. Some industry observers attribute the unexpected performance to the fact that the z15, introduced a year ago, is still in its anticipated upcycle. Typically, mainframe sales level off and dip after 12 to 18 months, until the release of a new system. But that might not be the case this time around.

Ross Mauri, general manager of IBM's Z and LinuxOne mainframe business, discussed some of the factors that could contribute to sustained growth of the venerable system, including IBM's acquisition of Red Hat, the rise of open source software and timely technical enhancements.

Mainframe revenues in the second quarter were the fastest-growing of any IBM business unit, something analysts didn't expect to see again. Is this just the typical upcycle for the latest system or something else at work?

Ross Mauri: A lot of it has to do with the Red Hat acquisition and the move toward hybrid clouds. Consequently, mainframes are picking up new workloads, which is why you are seeing a lot more MIPS being generated. We set a record for MIPS in last year's fourth quarter.

How much of it has to do with the increase in Linux-based mainframes and the growing popularity of open source software?

Mauri: Yes, there is that plus all the more strategic applications [OpenShift, Ansible] going to the cloud. What also helped was our Capacity On Demand program going live in the second quarter, providing users with four times the [processor] capacity they had a year ago.

Some industries are in slumps, but online sales are up and that means credit card and banking systems are more active than normal. They liked the idea of being able to turn on 'dark' processors remotely.

Some analysts think mainframes are facing the same barrier Intel-based machines are with Moore's Law. Are you running out of real estate on mainframe chips to improve performance?

Mauri: What we have done is made improvements in the instruction set. So, with things like Watson machine learning, users can work to a pretty high level of AI, taking greater advantage of the hardware. We've not run out of real estate on the chips, or out of performance, and I don't think we will. If you think that, we will prove you wrong.

But with the last couple of mainframe releases, performance improvements were in the single digits, compared to the 30% to 40% performance improvements of Power systems.

Mauri: In terms of Z [series mainframes], they are running as fast as Power. We know where [mainframes] are going to be running in the future. As we move to deep learning inference engines in the future, you'll see more AI running on the system to help with fraud analytics and real-time transactions. We haven't played out our whole hand yet. The AI market is still nascent; we are very much at the beginning of it. For instance, we're not anywhere near what we can do with the security of the system.


We have started to put quantum encryption algorithms in the system already, to make sure security was sound given what's going on in the world of cybersecurity. You'll see us continue to invest more in the future when it comes to AI. We'll build on that machine learning base we have already.

Is IBM Research investigating other technologies that would sit between existing mainframes and quantum computers in terms of improving performance?

Mauri: Our [mainframe] systems group is working closely with the quantum team as well as with IBM Research. We are still in the research phase; no one's using them for production.

What we're exploring with IBM Research and clients is trying to determine what algorithms run well on a quantum computer for solving business problems and business processes that now run on mainframes. For instance, we're looking at big financial institutions where we can make use of quantum computers as closely coupled accelerators for the mainframe. We think it can greatly reduce costs and improve business processing speed. It's actually not that complex to do. We're doing active experiments with clients now.

What are you looking at to increase performance?

Mauri: We are looking at a whole range of options right now. We have something we do with clients called Enterprise Design Thinking where they are involved throughout an entire process to make sure we're not putting some technology in that's not going to work for them. We have been doing that since the z14 [mainframe].

Read more:

IBM Z mainframes revived by Red Hat, AI and security - TechTarget

Written by admin

August 14th, 2020 at 11:51 pm

Posted in Quantum Computer

Quantum computing will (eventually) help us discover vaccines in days – VentureBeat

Posted: May 17, 2020 at 10:41 pm


without comments

The coronavirus is proving that we have to move faster in identifying and mitigating epidemics before they become pandemics because, in today's global world, viruses spread much faster, further, and more frequently than ever before.

If COVID-19 has taught us anything, it's that while our ability to identify and treat pandemics has improved greatly since the outbreak of the Spanish Flu in 1918, there is still a lot of room for improvement. Over the past few decades, we've taken huge strides to improve quick detection capabilities. It took a mere 12 days to map the outer spike protein of the COVID-19 virus using new techniques. In the 1980s, a similar structural analysis for HIV took four years.

But developing a cure or vaccine still takes a long time and involves such high costs that big pharma doesn't always have an incentive to try.

Drug discovery entrepreneur Prof. Noor Shaker posited that "whenever a disease is identified, a new journey into the chemical space starts, seeking a medicine that could become useful in contending with the disease. The journey takes approximately 15 years and costs $2.6 billion, and starts with a process to filter millions of molecules to identify the promising hundreds with high potential to become medicines. Around 99% of selected leads fail later in the process due to inaccurate prediction of behavior and the limited pool from which they were sampled."

Prof. Shaker highlights one of the main problems with our current drug discovery process: the development of pharmaceuticals is highly empirical. Molecules are made and then tested, without our being able to accurately predict performance beforehand. The testing process itself is long, tedious, and cumbersome, and may not predict future complications that will surface only when the molecule is deployed at scale, further eroding the cost/benefit ratio of the field. And while AI/ML tools are already being developed and implemented to optimize certain processes, there's a limit to their efficiency at key tasks in the process.

Ideally, a great way to cut down the time and cost would be to transfer the discovery and testing from the expensive and time-inefficient laboratory process (in vitro) we utilize today to computer simulations (in silico). Databases of molecules are already available to us today. If we had infinite computing power, we could simply scan these databases and calculate whether each molecule could serve as a cure or vaccine for the COVID-19 virus. We would simply input our factors into the simulation and screen the chemical space for a solution to our problem.

In principle, this is possible. After all, chemical structures can be measured, and the laws of physics governing chemistry are well known. However, as the great British physicist Paul Dirac observed: "The underlying physical laws necessary for the mathematical theory of a large part of physics and the whole of chemistry are thus completely known, and the difficulty is only that the exact application of these laws leads to equations much too complicated to be soluble."

In other words, we simply don't have the computing power to solve the equations, and if we stick to classical computers, we never will.

This is a bit of a simplification, but the fundamental problem of chemistry is to figure out where electrons sit inside a molecule and calculate the total energy of such a configuration. With this data, one could calculate the properties of a molecule and predict its behavior. Accurate calculations of these properties will allow the screening of molecular databases for compounds that exhibit particular functions, such as a drug molecule that is able to attach to the coronavirus spike and attack it. Essentially, if we could use a computer to accurately calculate the properties of a molecule and predict its behavior in a given situation, it would speed up the process of identifying a cure and improve its efficiency.

Why are quantum computers much better than classical computers at simulating molecules?

Electrons spread out over the molecule in a strongly correlated fashion, and the characteristics of each electron depend greatly on those of its neighbors. These quantum correlations (or entanglement) are at the heart of the quantum theory and make simulating electrons with a classical computer very tricky.

The electrons of the COVID-19 virus, for example, must be treated in general as being part of a single entity having many degrees of freedom, and the description of this ensemble cannot be divided into the sum of its individual, distinguishable electrons. The electrons, due to their strong correlations, have lost their individuality and must be treated as a whole. So to solve the equations, you need to take into account all of the electrons simultaneously. Although classical computers can in principle simulate such molecules, every multi-electron configuration must be stored in memory separately.

Let's say you have a molecule with only 10 electrons (forget the rest of the atom for now), and each electron can be in two different positions within the molecule. Essentially, you have 2^10 = 1024 different configurations to keep track of, rather than just 10 electrons, which would have been the case if the electrons were individual, distinguishable entities. You'd need 1024 classical bits to store the state of this molecule. Quantum computers, on the other hand, have quantum bits (qubits), which can be made to strongly correlate with one another in the same way electrons within molecules do. So in principle, you would need only about 10 such qubits to represent the strongly correlated electrons in this model system.

The exponentially large parameter space of electron configurations in molecules is exactly the space qubits naturally occupy. Thus, qubits are much more adapted to the simulation of quantum phenomena. This scaling difference between classical and quantum computation gets very big very quickly. For instance, simulating penicillin, a molecule with 41 atoms (and many more electrons), would require 10^86 classical bits, or more bits than the number of atoms in the universe. With a quantum computer, you would only need about 286 qubits. This is still far more qubits than we have today, but certainly a more reasonable and achievable number. The COVID-19 virus outer spike protein, for comparison, contains many thousands of atoms and is thus completely intractable for classical computation. The size of proteins makes them intractable to classical simulation with any degree of accuracy, even on today's most powerful supercomputers. Chemists and pharma companies do simulate molecules with supercomputers (albeit not as large as the proteins), but they must resort to making very rough molecule models that don't capture the details a full simulation would, leading to large errors in estimation.
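
A quick back-of-the-envelope check of those numbers, in Python, under the article's own simplifying assumptions:

    def classical_bits(n_sites):
        # one configuration per pattern of occupied positions: 2**n of them
        return 2 ** n_sites

    def qubits_needed(n_sites):
        # qubits natively occupy that exponential space
        return n_sites

    print(classical_bits(10), qubits_needed(10))  # 1024 vs 10, as in the text
    print(2 ** 286 > 10 ** 86)                    # True: ~286 qubits span the
                                                  # ~1e86 penicillin estimate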

It might take several decades until a sufficiently large quantum computer capable of simulating molecules as large as proteins will emerge. But when such a computer is available, it will mean a complete revolution in the way the pharma and the chemical industries operate.

The holy grail, end-to-end in-silico drug discovery, involves evaluating and breaking down the entire chemical structures of the virus and the cure.

The continued development of quantum computers, if successful, will allow for end-to-end in-silico drug discovery and the discovery of procedures to fabricate the drug. Several decades from now, with the right technology in place, we could move the entire process into a computer simulation, allowing us to reach results with amazing speed. Computer simulations could eliminate 99.9% of false leads in a fraction of the time it now takes with in-vitro methods. With the appearance of a new epidemic, scientists could identify and develop a potential vaccine/drug in a matter of days.

The bottleneck for drug development would then move from drug discovery to the human testing phases including toxicity and other safety tests. Eventually, even these last stage tests could potentially be expedited with the help of a large scale quantum computer, but that would require an even greater level of quantum computing than described here. Tests at this level would require a quantum computer with enough power to contain a simulation of the human body (or part thereof) that will screen candidate compounds and simulate their impact on the human body.

Achieving all of these dreams will demand continued investment in the development of quantum computing as a technology. As Prof. Shohini Ghose said in her 2018 TED Talk: "You cannot build a light bulb by building better and better candles. A light bulb is a different technology based on a deeper scientific understanding." Today's computers are marvels of modern technology and will continue to improve as we move forward. However, we will not be able to solve this task with a more powerful classical computer. It requires new technology, more suited for the task.

(Special thanks to Dr. Ilan Richter, MD, MPH, for assuring the accuracy of the medical details in this article.)

Ramon Szmuk is a Quantum Hardware Engineer at Quantum Machines.

Link:

Quantum computing will (eventually) help us discover vaccines in days - VentureBeat

Written by admin

May 17th, 2020 at 10:41 pm

Posted in Quantum Computer




