Archive for the ‘Quantum Computer’ Category
SC20 Invited Speakers Tackle Challenges for the Earth, Its Inhabitants, and Our Security Using ‘More Than HPC’ – HPCwire
Posted: October 8, 2020 at 2:54 am
Oct. 5, 2020 – The Invited Talks for SC20 represent the breadth, depth and future outlook of technology and its societal and scientific impact. HPC has always played a critical role in advancing breakthroughs in weather and climate research. This year's invited talks extend this further to data-driven approaches, including biodiversity, geoscience, and quantum computing. Our speakers will also touch on responsible application of HPC and new technological developments to highlight the impact of this potent and versatile technology on a wide range of applications.
Hear these illustrious speakers during SC20 Invited Talks, Tuesday through Thursday, November 17–19.
Lorena Barba (George Washington University) will explore the need for trustworthy computational evidence through transparency and reproducibility. With the explosion of new computational models for vital research, including COVID-19, applications of such importance to society highlight the requirement of building trustworthy computational models. Emphasizing transparency and reproducibility has helped us build more trust in computational findings. How should we adapt our practices for reproducibility to achieve unimpeachable provenance, and reach full accountability of scientific evidence produced via computation?
Shekhar Borkar (Qualcomm Inc.) will speak on the future of computing in the so-called post-Moore's law era. While speculations about the end of Moore's law have created some level of fear in the community, this ending may not be coming as soon as we think. This talk will revisit the historic predictions of the end, and discuss promising opportunities and innovations that may further Moore's law and continue to deliver unprecedented performance for years to come.
Dalia A. Conde (University of Southern Denmark) will offer a presentation on fighting the extinction crisis with data. With biodiversity loss identified by the World Economic Forum as one of humanity's greatest challenges, computational methods are urgently needed to secure a healthier planet. We must design and implement effective species conservation strategies, which rely on vast and disparate volumes of data, from genetics and habitat to legislation and human interaction. This talk will introduce the Species Knowledge Index initiative, which aims to map, quantify, analyze, and disseminate open information on animal species to policy makers and conservationists around the globe.
Tom Conte (Georgia Tech) will examine HPC after Moore's law. Whether Moore's law has ended, is about to end, or will never end, the slowing of the semiconductor innovation curve has left the industry looking for alternatives. Different approaches, beyond quantum or neuromorphic computing, may disrupt current algorithms and software development. This talk will preview the road ahead, and suggest some exciting new technologies on the horizon.
Marissa Giustina (Google LLC) will share the challenges and recent discoveries in the development of Google's quantum computer, from both the hardware and quantum-information perspectives. This prototype hardware holds promise as a platform for tackling problems that have been impossible to address with existing HPC systems. The talk will include recent technological developments, as well as some perspective for the future of quantum computing.
Patrick Heimbach (The University of Texas at Austin) will discuss the need for advanced computing to help solve the global ocean state estimation problem. Because of the challenge of observing the full-depth global ocean circulation in its spatial detail, numerical simulations play an essential role in quantifying patterns of climate variability and change. New methods that are being developed at the interface of predictive data science remain underutilized in ocean climate modeling. These methods face considerable practical hurdles in the context of HPC, but will be indispensable for advancing simulation-based contributions to real-world problems.
Simon Knowles (Graphcore) will discuss the reinvention of accelerated computing for artificial intelligence. As HPC changes in response to the needs of the growing user community, AI can harness enormous quantities of processing power even as we move towards power-limited computing. To balance these needs, the intelligence processing unit (IPU) architecture is able to capture learning processes and offer massive heterogeneous parallelism. This ground-up reinvention of accelerated computing will show considerable results for real applications.
Ronald P. Luijten (Data Motion Architecture and Consulting GmbH) will offer a presentation on the data-centric architecture of a weather and climate accelerator. Using a co-design approach, a non-von Neumann accelerator targeting weather and climate simulations was developed in tandem with the application code to optimize memory bandwidth. This also led to the filing of a patent for a novel CGRA (Coarse-Grained Reconfigurable Array) layout that reflects grid points in the physical world. The talk will include benchmarks achieved in the project, and a discussion of next steps.
Catherine (Katie) Schuman (Oak Ridge National Laboratory) will introduce us to the future of AI and HPC, in the form of neuromorphic computing and neural accelerators. These two new types of computing technologies offer significant advantages over traditional approaches, including considerably increased energy efficiency and accelerated neural network-style computing. This talk will illustrate the fundamental computing concepts involved in these new hardware developments, and highlight some initial performance results.
Compton Tucker (NASA Goddard Space Flight Center) will speak on satellite tree enumeration outside of forests at the fifty-centimeter scale. Non-forest trees, which grow isolated outside of forests and are not well documented, nevertheless play a crucial role in biodiversity, carbon storage, food resources, and shelter for humans and animals. This talk will detail the use of HPC and machine learning to enumerate isolated trees globally, to identify localized areas of degradation, and to quantify the role of isolated trees in the global carbon cycle.
Cliff Young (Google LLC) will entertain the question of whether we can build a virtuous cycle between machine learning and HPC. While machine learning draws on many HPC components, the two areas are diverging in precision and programming models. However, it may be possible to construct a positive feedback loop between them. The Tensor Processing Unit (TPU) could provide opportunities to unite these fields to solve common problems through parallelization, mixed precision, and new algorithms.
Source: Melyssa Fratkin, SC20 Communications Chair
A new claimant for "most powerful quantum computer" – Axios
Posted: October 3, 2020 at 5:59 am
The startup IonQ today announced what it's calling "the world's most powerful quantum computer."
Why it matters: Quantum is the next frontier in computing, theoretically capable of solving problems beyond the ability of classical computers. IonQ's next-generation computer looks set to push the boundaries of quantum, but it will still take years before the technology becomes truly reliable.
How it works: IonQ reports its new quantum computer system has 32 "perfect" qubits (the basic unit of information in a quantum computer), which the company says give it an expected quantum volume of more than 4,000,000.
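The quantum volume figure follows from how the metric is commonly defined: QV = 2^n, where n is the largest number of qubits on which square (depth-n) random circuits run reliably. A quick sketch (the function name is ours, not IonQ's) shows why near-perfect qubits push the number into the millions:

```python
# Quantum volume as commonly defined: QV = 2^n, where n is the largest
# number of qubits over which square random circuits execute reliably.
def quantum_volume(effective_qubits: int) -> int:
    return 2 ** effective_qubits

# IonQ's claimed "expected quantum volume of more than 4,000,000" implies
# roughly 22 effectively usable qubits, since 2**22 = 4,194,304.
assert quantum_volume(21) < 4_000_000 < quantum_volume(22)
```

For comparison, Honeywell's reported quantum volume of 128 (mentioned below) corresponds to 2^7, i.e. seven effectively usable qubits.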
Background: IonQ was co-founded by Chris Monroe, a University of Maryland professor and major figure in the development of quantum computers. In the mid-1990s, he began working on entangling atoms to make more precise atomic clocks, the most accurate timekeeping devices known.
The catch: IonQ hasn't yet released detailed specifications of its new system, and its research needs to be verified.
Context: IonQ's announcement comes in the same week that its competitor Honeywell, which also uses a version of trapped ions, reported achieving a quantum volume of 128, and the Canadian startup D-Wave announced a 5,000-qubit system, built yet another way, that would be available for customers, including via the cloud.
Be smart: Comparing different kinds of quantum computing systems is difficult because they function in fundamentally different ways.
ESA's Φ-Week: Digital Twin Earth, Quantum Computing and AI Take Center Stage – SciTechDaily
Posted: at 5:59 am
Digital Twin Earth will help visualize, monitor, and forecast natural and human activity on the planet. The model will be able to monitor the health of the planet, perform simulations of Earth's interconnected system with human behavior, and support the field of sustainable development, therefore reinforcing Europe's efforts for a better environment in order to respond to the urgent challenges and targets addressed by the Green Deal. Credit: ESA
ESA's 2020 Φ-week event kicked off this morning with a series of stimulating speeches on Digital Twin Earth, updates on Φ-sat-1, which was successfully launched into orbit earlier this month, and an exciting new initiative involving quantum computing.
The third edition of the Φ-week event, which is entirely virtual, focuses on how Earth observation can contribute to the concept of Digital Twin Earth: a dynamic, digital replica of our planet which accurately mimics Earth's behavior. Constantly fed with Earth observation data, combined with in situ measurements and artificial intelligence, the Digital Twin Earth provides an accurate representation of the past, present, and future changes of our world.
Today's session opened with inspiring statements from ESA's Director General, Jan Wörner; ESA's Director of Earth Observation Programmes, Josef Aschbacher; ECMWF's Director General, Florence Rabier; the European Commission's Deputy Director General for Defence Industry and Space, Pierre Delsaux; as well as the Director General of DG CONNECT at the European Commission, Roberto Viola.
Φ-week 2020 opened on 28 September with inspiring statements from ESA's Director General, Jan Wörner (left), and ESA's Director of Earth Observation Programmes, Josef Aschbacher. Credit: ESA
Pierre Delsaux commented, "As our EU Commission President repeated recently during her State of the Union speech, it's clear we need to address climate change. The Copernicus program offers us some of the best instruments, satellites, to give us a complete picture of our planet's health. But space is not only a monitoring tool, it is also about applied solutions for our economy to make it more green and more digital."
Roberto Viola said, "Φ-week is the week for disruptive technology and it is communities like this that our European programmes were designed to support."
Florence Rabier added, "Machine learning and artificial intelligence could improve the realism and efficiency of the Digital Twin Earth, especially for extreme weather events and numerical forecast models."
Jan Wörner concluded, "Φ-week is the perfect example of the New Space approach, focusing on disruptive innovation, artificial intelligence, agility and flexibility."
During the week, experts will come together to discuss the role of artificial intelligence for the Digital Twin Earth concept, its practical implementation, the infrastructure requirements needed to build the Digital Twin Earth, and present ideas on how industries and the science community can contribute.
Cloud mask from Φ-sat-1. Credit: Cosine remote sensing B.V.
Earlier this month, on 3 September, the first artificial intelligence (AI) technology carried onboard a European Earth observation mission, Φ-sat-1, was launched from Europe's spaceport in French Guiana. An enhancement of the Federated Satellite Systems mission (FSSCat), the pioneering artificial intelligence technology is a first experiment in improving the efficiency of sending vast quantities of data back to Earth.
Today, ESA, along with cosine remote sensing, is happy to reveal the first-ever hardware-accelerated AI inference on Earth observation images performed on an in-orbit satellite, carried out by a deep convolutional neural network developed by the University of Pisa.
Φ-sat-1 has successfully enabled the pre-filtering of Earth observation data so that only the relevant parts of the image with usable information are downlinked to the ground, thereby improving bandwidth utilization and significantly reducing aggregated downlink costs.
Initial data downlinked from the satellite has shown that the AI-powered automatic cloud detection algorithm has correctly sorted hyperspectral Earth observation imagery from the satellite's sensor into cloudy and non-cloudy pixels.
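The onboard filtering idea can be sketched in a few lines. The real Φ-sat-1 system uses a deep convolutional neural network on dedicated hardware, not a brightness threshold; the stand-in below (function names and the 0.6 cutoff are our own illustrative choices) only shows the downlink-saving logic of dropping cloudy pixels before transmission:

```python
# Illustrative stand-in for on-board cloud filtering. Pixels whose mean
# reflectance across spectral bands exceeds a cutoff are flagged as
# cloudy and excluded from the downlink.
def cloud_mask(pixels, threshold=0.6):
    """pixels: list of per-pixel band reflectances in [0, 1]."""
    return [sum(bands) / len(bands) > threshold for bands in pixels]

def downlink(pixels, threshold=0.6):
    mask = cloud_mask(pixels, threshold)
    return [p for p, cloudy in zip(pixels, mask) if not cloudy]

scene = [[0.9, 0.8, 0.85],   # bright pixel: likely cloud, dropped
         [0.1, 0.2, 0.15]]   # dark pixel: likely surface, kept
assert downlink(scene) == [[0.1, 0.2, 0.15]]
```

The bandwidth saving is simply the fraction of pixels masked out; in a hyperspectral scene that is largely cloud-covered, that fraction can be substantial.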
Lake Tharthar, Iraq. Credit: Cosine remote sensing B.V.
Massimiliano Pastena, Φ-sat-1 Technical Officer at ESA, commented, "We have just entered the history of space."
Today's successful application of the Ubotica Artificial Intelligence technology, which is powered by the Intel Movidius Myriad 2 Vision Processing Unit, has demonstrated real on-board data processing autonomy.
Aubrey Dunne, Co-Founder and Vice President of Engineering at Ubotica Technologies, said, "We are very excited to be a key part of what is, to our knowledge, the first-ever demonstration of AI applied to Earth Observation data on a flying satellite. This is a watershed moment both for onboard processing of satellite data, and for the future of AI inference in orbital applications."
FSSCat, the overall 2017 Copernicus Masters winner, was proposed by Spain's Universitat Politècnica de Catalunya and developed by a consortium of European companies and institutes, including Tyvak International.
In his opening speech this morning, Josef Aschbacher also made a special announcement regarding an exciting new ESA initiative, the AI-enhanced Quantum Initiative for Earth Observation (QC4EO), in collaboration with the European Organization for Nuclear Research (CERN).
Quantum computing has the potential to improve performance, decrease computational costs and solve previously intractable problems in Earth observation by exploiting quantum phenomena such as superposition, entanglement, and tunneling.
The initiative involves creating a quantum capability that will be able to solve demanding Earth observation problems by using artificial intelligence, in support of programmes such as Digital Twin Earth and Copernicus. The initiative will be developed at the Φ-lab, an ESA laboratory at ESA's centre for Earth observation in Italy, which embraces transformational innovation in Earth observation.
ESA and CERN enjoy a long-standing collaboration, centered on technological matters and fundamental physics. This collaboration will be extended to link to the CERN Quantum Technology Initiative, which was announced in June 2020 by the CERN Director General, Fabiola Gianotti.
Through this partnership, ESA and CERN will create new synergies, building on their common experience in big data, data mining and pattern recognition.
Giuseppe Borghi, Head of the Φ-lab, said, "Quantum computing together with AI are perhaps the most promising breakthroughs to come along in computer technology. In the coming years, we will see more Earth or space science disciplines employing current or future quantum computing techniques to solve geoscience problems."
Josef Aschbacher added, "ESA will exploit the broad range of specialized expertise available at ESA, and we will place ourselves in a unique position and take a leading role in the development of quantum technologies in the Earth observation domain."
Alberto Di Meglio, Coordinator of the CERN Quantum Technology Initiative, said, "Quantum technologies are a rapidly growing field of research and their applications have the potential to revolutionize the way we do science. Preparing for that paradigm change, by building knowledge and tools, is essential. This new collaboration on quantum technologies bears great promise."
Schrödinger's Web offers a sneak peek at the quantum internet – Science News
Posted: at 5:59 am
Schrödinger's Web. Jonathan P. Dowling. CRC Press, $40.95.
When news broke last year that Google's quantum computer Sycamore had performed a calculation faster than the fastest supercomputers could (SN: 12/16/19), it was the first time many people had ever heard of a quantum computer.
Quantum computers, which harness the strange probabilities of quantum mechanics, may prove revolutionary. They have the potential to achieve an exponential speedup over their classical counterparts, at least when it comes to solving some problems. But for now, these computers are still in their infancy, useful for only a few applications, just as the first digital computers were in the 1940s. So isn't a book about the communications network that will link quantum computers (the quantum internet) more than a little ahead of itself?
Surprisingly, no. As theoretical physicist Jonathan Dowling makes clear in Schrödinger's Web, early versions of the quantum internet are here already (for example, quantum communication has been taking place between Beijing and Shanghai via fiber-optic cables since 2016) and more are coming fast. So now is the perfect time to read up.
Dowling, who helped found the U.S. government's quantum computing program in the 1990s, is the perfect guide. Armed with a seemingly endless supply of outrageous anecdotes, memorable analogies, puns and quips, he makes the thorny theoretical details of the quantum internet both entertaining and accessible.
Readers wanting to dive right in to details of the quantum internet will have to be patient. "Photons are the particles that will power the quantum internet, so we had better be sure we know what the heck they are," Dowling writes. Accordingly, the first third of the book is a historical overview of light, from Newton's 17th century idea of light as corpuscles to experiments probing the quantum reality of photons, or particles of light, in the late 20th century. There are some small historical inaccuracies (the section on the Danish physicist Hans Christian Ørsted repeats an apocryphal tale about his serendipitous discovery of the link between electricity and magnetism) and the footnotes rely too much on Wikipedia. But Dowling accomplishes what he sets out to do: help readers develop an understanding of the quantum nature of light.
Like Dowling's 2013 book on quantum computers, Schrödinger's Killer App, Schrödinger's Web hammers home the nonintuitive truths at the heart of quantum mechanics. For example, key to the quantum internet is entanglement, that "spooky action at a distance" in which particles are linked across time and space, and measuring the properties of one particle instantly reveals the other's properties. Two photons, for instance, can be entangled so they always have the opposite polarization, or angle of oscillation.
In the future, a user in New York could entangle two photons and then send one along a fiber-optic cable to San Francisco, where it would be received by a quantum computer. Because these photons are entangled, measuring the New York photon's polarization would instantly reveal the San Francisco photon's polarization. This strange reality of entanglement is what the quantum internet exploits for neat features, such as unhackable security; any eavesdropper would mess up the delicate entanglement and be revealed. While his previous book contains more detailed explanations of quantum mechanics, Dowling still finds amusing new analogies, such as Fuzz Lightyear, a canine that runs along a superposition, or quantum combination, of two paths into neighbors' yards. Fuzz helps explain physicist John Wheeler's delayed-choice experiment, which illustrates the uncertainty, unreality and nonlocality of the quantum world. Fuzz's path is random, the dog doesn't exist on one path until we measure him, and measuring one path seems to instantly affect which yard Fuzz enters, even if he's light-years away.
The complexities of the quantum web are saved for last, and even with Dowling's help, the details are not for the faint of heart. Readers will learn how to prepare Bell tests to check that a system of particles is entangled (SN: 8/28/15), navigate bureaucracy in the Department of Defense and send unhackable quantum communications with the dryly named BB84 and E91 protocols. Dowling also goes over some recent milestones in the development of a quantum internet, such as the 2017 quantum-secured videocall between scientists in China and Austria via satellite (SN: 9/29/17).
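The flavor of BB84 can be conveyed in a few lines of code. In the protocol, Alice sends random bits encoded in randomly chosen polarization bases, Bob measures in his own random bases, and the two keep only the positions where their bases happened to match. This is an idealized sketch (no channel noise, no eavesdropper, and the function name is ours):

```python
# Minimal, idealized sketch of BB84 key sifting: positions where Alice's
# and Bob's randomly chosen bases agree contribute a shared key bit;
# all other positions are discarded.
import random

def bb84_sift(n, seed=0):
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.choice("+x") for _ in range(n)]  # '+' rectilinear, 'x' diagonal
    bob_bases   = [rng.choice("+x") for _ in range(n)]
    # When the bases match, Bob's measurement recovers Alice's bit exactly;
    # when they differ, his result is random, so the position is dropped.
    return [bit for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)
            if ab == bb]

key = bb84_sift(16)
assert all(bit in (0, 1) for bit in key)  # roughly half the positions survive
```

An eavesdropper measuring in her own random bases would disturb about a quarter of the sifted bits, which Alice and Bob can detect by publicly comparing a sample of the key — the security check the review alludes to.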
Just like the classical internet, "we really won't figure out what the quantum internet is useful for until it is up and running," Dowling writes, "so people can start playing around with it." Some of his prognostications seem improbable. Will people really have quantum computers on their phones and exchange entangled photons across the quantum internet?
Dowling died unexpectedly in June at age 65, before he could see this future come to fruition. Once when I interviewed him, he invoked Arthur C. Clarke's first law to justify why he thought another esteemed scientist was wrong. "The first law is that if a distinguished, elderly scientist tells you something is possible, he's very likely right," he said. "If he tells you something is impossible, he's very likely wrong."
Dowling died too soon to be considered elderly, but he was distinguished, and Schrödinger's Web lays out a powerful case for the possibility of a quantum internet.
Buy Schrödinger's Web from Amazon.com. Science News is a participant in the Amazon Services LLC Associates Program. Please see our FAQ for more details.
Global QC Market Projected to Grow to More Than $800 million by 2024 – HPCwire
Posted: at 5:59 am
The Quantum Economic Development Consortium (QED-C) and Hyperion Research are projecting that the global quantum computing (QC) market, worth an estimated $320 million in 2020, will grow at an anticipated 27% CAGR between 2020 and 2024, reaching approximately $830 million by 2024.
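The forecast arithmetic checks out: compounding $320 million at 27% per year over the four years from 2020 to 2024 lands very close to the projected figure.

```python
# Compound-growth check of the QED-C/Hyperion forecast:
# start value * (1 + CAGR) ** years.
def project(start, cagr, years):
    return start * (1 + cagr) ** years

projected = project(320, 0.27, 4)  # $ millions, 2020 -> 2024
assert 820 < projected < 840       # 320 * 1.27**4 is roughly 832
```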
This estimate is based on surveys of 135 US-based quantum computing researchers, developers and suppliers across the academic, commercial and government sectors. Supplemental data and insights came from a companion effort that surveyed 115 current and potential quantum computing users in North America, Europe and the Asia/Pacific region on their expectations, schedules and budgets for the use of quantum computing in their existing and planned computational workloads.
(Keeping track of the various quantum computing organizations is becoming a challenge in itself. The Quantum Economic Development Consortium (QED-C) is a consortium of stakeholders that aims to enable and grow the U.S. quantum industry. QED-C was established with support from the National Institute of Standards and Technology (NIST) as part of the Federal strategy for advancing quantum information science and as called for by the National Quantum Initiative Act enacted in 2018.)
Additional results from the study:
"Based on our study and related forecast, there is a growing, vibrant, and diverse US-based QC research, development, and commercial ecosystem that shows the promise of maturing into a viable, if not profitable and self-sustaining, industry. That said, it is too early to start picking winners and losers from either a technology or commercial perspective," said Bob Sorensen, quantum analyst for Hyperion Research.
"A key driver for commercial success could be the ability of any vendor to ease the requirements needed to integrate QC technology into a larger HPC and enterprise IT user base while still supporting advanced QC-related research for a more targeted, albeit smaller, class of end-user scientists and engineers. This sector is not for the faint of heart, but this forecast gives some sense of what is at stake here, at least for the next few years," noted Sorensen.
Source: QED-C
QED-C commissioned and collaborated with Hyperion Research to develop this market forecast to help inform decision making for QC technology developers and suppliers, national-level QC-related policy makers, potential QC users in both the advanced computing and enterprise IT marketplace, investors, and commercial QC funding organizations. This is a baseline estimate, and Hyperion Research and QED-C are looking to provide periodic updates of their QC market forecast as events, information, or decision-making requirements dictate. Contact: Celia Merzbacher, QED-C Deputy Director, [emailprotected]
Berkeley Lab Technologies Honored With 7 R&D 100 Awards – Lawrence Berkeley National Laboratory
Posted: at 5:59 am
Innovative technologies from Lawrence Berkeley National Laboratory (Berkeley Lab) to achieve higher energy efficiency in buildings, make lithium batteries safer and higher performing, and secure quantum communications were some of the inventions honored with R&D 100 Awards by R&D World magazine.
For more than 50 years, the annual R&D 100 Awards have recognized the 100 technologies of the past year deemed most innovative and disruptive by an independent panel of judges. The full list of winners, announced by parent company WTWH Media LLC, is available at the R&D World website.
Berkeley Lab's award-winning technologies are described below.
A Tool to Accelerate Electrochemical and Solid-State Innovation
(from left) Adam Weber, Nemanja Danilovic, Douglas Kushner, and John Petrovick (Credit: Berkeley Lab)
Berkeley Lab scientists invented a microelectrode cell to analyze and test electrochemical systems with solid electrolytes. Thanks to significant cost and performance advantages, this tool can accelerate development of critical applications such as energy storage and conversion (fuel cells, batteries, electrolyzers), carbon capture, desalination, and industrial decarbonization.
Solid electrolytes have been displacing liquid electrolytes as the focus of electrochemical innovation because of their performance, safety, and cost advantages. However, the lack of effective methods and equipment for studying solid electrolytes has hindered advancement of the technologies that employ them. This microelectrode cell meets the testing needs, and is already being used by Berkeley Lab scientists.
The development team includes Berkeley Lab researchers Adam Weber, Nemanja Danilovic, Douglas Kushner, and John Petrovick.
Matter-Wave Modulating Secure Quantum Communicator (MMQ-Com)
Information transmitted by MMQ-Com is impervious to security breaches. (Credit: Alexander Stibor/Berkeley Lab)
Quantum communication, cybersecurity, and quantum computing are growing global markets. But the safety of our data is in peril given the rise of quantum computers that can decode classical encryption schemes.
The Matter-Wave Modulating Secure Quantum Communicator (MMQ-Com) technology is a fundamentally new kind of secure quantum information transmitter. It transmits messages by modulating electron matter-waves without changing the pathways of the electrons. This secure communication method is inherently impervious to any interception attempt.
A novel quantum key distribution scheme also ensures that the signal is protected from spying by other quantum devices.
The development team includes Alexander Stibor of Berkeley Lab's Molecular Foundry, along with Robin Röpke and Nicole Kerker of the University of Tübingen in Germany.
Solid Lithium Battery Using Hard and Soft Solid Electrolytes
(from left) Marca Doeff, Guoying Chen, and Eongyu Yi (Credit: Berkeley Lab)
The lithium battery market is expected to grow from more than $37 billion in 2019 to more than $94 billion by 2025. However, the liquid electrolytes used in most commercial lithium-ion batteries are flammable and limit the ability to achieve higher energy densities. Safety issues continue to plague the electronics markets, as often-reported lithium battery fires and explosions result in casualties and financial losses.
In Berkeley Lab's solid lithium battery, the organic electrolytic solution is replaced by two solid electrolytes, one soft and one hard, and lithium metal is used in place of the graphite anode. In addition to eliminating battery fires, incorporation of a lithium metal anode with a capacity 10 times higher than graphite (the conventional anode material in lithium-ion batteries) provides much higher energy densities.
The technology was developed by Berkeley Lab scientists Marca Doeff, Guoying Chen, and Eongyu Yi, along with collaborators at Montana State University.
Porous Graphitic Frameworks for Sustainable High-Performance Li-Ion Batteries
High-resolution transmission electron microscopy images of the Berkeley Lab PGF cathode reveal (at left) a highly ordered honeycomb structure within the 2D plane, and (at right) layered columnar arrays stacked perpendicular to the 2D plane. (Credit: Yi Liu/Berkeley Lab)
The Porous Graphitic Frameworks (PGF) technology is a lithium-ion battery cathode that could outperform today's cathodes in sustainability and performance.
In contrast to commercial cathodes, organic PGFs pose fewer risks to the environment because they are metal-free and composed of earth-abundant, lightweight organic elements such as carbon, hydrogen, and nitrogen. The PGF production process is also more energy-efficient and eco-friendly than other cathode technologies because they are prepared in water at mild temperatures, rather than in toxic solvents at high temperatures.
PGF cathodes also display stable charge-discharge cycles with ultrahigh capacity and record-high energy density, both of which are much higher than those of all commercial inorganic cathodes and known organic cathodes.
The development team includes Yi Liu and Xinie Li of Berkeley Lab's Molecular Foundry, as well as Hongxia Wang and Hao Chen of Stanford University.
Building Efficiency Targeting Tool for Energy Retrofits (BETTER)
The buildings sector is the largest source of primary energy consumption (40%) and ranks second after the industrial sector as a global source of direct and indirect carbon dioxide emissions from fuel combustion. According to the World Economic Forum, nearly one-half of all energy consumed by buildings could be avoided with new energy-efficient systems and equipment.
(from left) Carolyn Szum (Lead Researcher), Han Li, Chao Ding, Nan Zhou, Xu Liu (Credit: Berkeley Lab)
The Building Efficiency Targeting Tool for Energy Retrofits (BETTER) allows municipalities, building and portfolio owners and managers, and energy service providers to quickly and easily identify the most effective cost-saving and energy-efficiency measures in their buildings. With an open-source, data-driven analytical engine, BETTER uses readily available building and monthly energy data to quantify energy, cost, and greenhouse gas reduction potential, and to recommend efficiency interventions at the building and portfolio levels to capture that potential.
It is estimated that BETTER will help reduce about 165.8 megatons of carbon dioxide equivalent (MtCO2e) globally by 2030. This is equivalent to the CO2 sequestered by growing 2.7 billion tree seedlings for 10 years.
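The seedling equivalence can be sanity-checked against the EPA's widely used greenhouse-gas equivalency factor of roughly 0.060 metric tons of CO2 sequestered per tree seedling grown for 10 years (that factor, not stated in the article, is our assumption here):

```python
# Sanity check on the tree-seedling equivalence using the EPA's
# equivalency factor (assumed): ~0.060 metric tons CO2 sequestered
# per seedling grown for 10 years.
seedlings = 2.7e9
t_co2_per_seedling = 0.060                        # metric tons over 10 years
total_mt = seedlings * t_co2_per_seedling / 1e6   # convert tons to megatons
assert 150 < total_mt < 175                       # consistent with ~165.8 MtCO2e
```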
The development team includes Berkeley Lab scientists Nan Zhou, Carolyn Szum, Han Li, Chao Ding, Xu Liu, and William Huang, along with collaborators from Johnson Controls and ICF.
AmanziATS: Modeling Environmental Systems Across Scales
Simulated surface and subsurface water from Amanzi-ATS hydrological modeling of the Copper Creek sub-catchment in the East River, Colorado watershed. (Credit: Zexuan Xu/Berkeley Lab, David Moulton/Los Alamos National Laboratory)
Scientists use computer simulations to predict the impact of wildfires on water quality, or to monitor cleanup at nuclear waste remediation sites by portraying fluid flow across Earth compartments. The Amanzi-Advanced Terrestrial Simulator (ATS) enables them to replicate or couple multiple complex and integrated physical processes controlling these flowpaths, making it possible to capture the essential physics of the problem at hand.
"Specific problems require taking an individual approach to simulations," said Sergi Molins, principal investigator at Berkeley Lab, which contributed expertise in geochemical modeling to the software's development. "Physical processes controlling how mountainous watersheds respond to disturbances such as climate- and land-use change, extreme weather, and wildfire are far different than the physical processes at play when an unexpected storm suddenly impacts groundwater contaminant levels in and around a nuclear remediation site. Amanzi-ATS allows scientists to make sense of these interactions in each individual scenario."
The code is open source and capable of being run on systems ranging from a laptop to a supercomputer. Led by Los Alamos National Laboratory, Amanzi-ATS is jointly developed with researchers from Oak Ridge National Laboratory, Pacific Northwest National Laboratory, and Berkeley Lab, including Sergi Molins, Marcus Day, Carl Steefel, and Zexuan Xu.
Institute for the Design of Advanced Energy Systems (IDAES)
The U.S. Department of Energy's (DOE's) Institute for the Design of Advanced Energy Systems (IDAES) project develops next-generation computational tools for process systems engineering (PSE) of advanced energy systems, enabling their rapid design and optimization.
IDAES Project Team (Credit: Berkeley Lab)
By providing rigorous modeling capabilities, the IDAES Modeling & Optimization Platform helps energy and process companies, technology developers, academic researchers, and DOE to design, develop, scale-up, and analyze new and potential PSE technologies and processes to accelerate advances and apply them to address the nations energy needs. The IDAES platform is also a key component in the National Alliance for Water Innovation, a $100 million, five-year DOE innovation hub led by Berkeley Lab, which will examine the critical technical barriers and research needed to radically lower the cost and energy of desalination.
Led by the National Energy Technology Laboratory, IDAES is a collaboration with Sandia National Laboratories, Berkeley Lab, West Virginia University, Carnegie Mellon University, and the University of Notre Dame. The development team at Berkeley Lab includes Deb Agarwal, Oluwamayowa (Mayo) Amusat, Keith Beattie, Ludovico Bianchi, Josh Boverhof, Hamdy Elgammal, Dan Gunter, Julianne Mueller, Jangho Park, Makayla Shepherd, Karen Whitenack, and Perren Yang.
# # #
Founded in 1931 on the belief that the biggest scientific challenges are best addressed by teams, Lawrence Berkeley National Laboratory and its scientists have been recognized with 13 Nobel Prizes. Today, Berkeley Lab researchers develop sustainable energy and environmental solutions, create useful new materials, advance the frontiers of computing, and probe the mysteries of life, matter, and the universe. Scientists from around the world rely on the Lab's facilities for their own discovery science. Berkeley Lab is a multiprogram national laboratory, managed by the University of California for the U.S. Department of Energy's Office of Science.
DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.
oneAPI Academic Center of Excellence Established at the Heidelberg University Computing Center (URZ) – HPCwire
Posted: at 5:59 am
Sept. 29, 2020 A oneAPI Academic Center of Excellence (CoE) is now established at the Heidelberg University Computing Center (URZ). The new CoE will conduct research supporting the oneAPI industry initiative to create a uniform, open programming model for heterogeneous computer architectures.
A common language for heterogeneous computing
URZ will focus its research and programming efforts on a fundamental high-performance computing (HPC) challenge: modern computers utilize different types of hardware for different calculations. Accelerators, including graphics processing units (GPUs) and field programmable gate arrays (FPGAs), are used in combination with general compute processors (CPUs). Using different types of hardware makes computers very powerful and provides versatility for a wide range of situations and workloads. However, hardware heterogeneity complicates software development for these computers, especially when specialized components from a variety of vendors are used in tandem.
One major reason for this complication is that many accelerated compute architectures require their own programming models. Therefore, software developers need to learn and use a different and sometimes proprietary language for each processing unit in a heterogeneous system, which increases complexity and limits flexibility.
oneAPI's cross-architecture language Data Parallel C++ (DPC++), based on the Khronos Group's SYCL standard for heterogeneous programming in C++, overcomes these challenges with its single, unified open development model for performant and productive heterogeneous programming and cross-vendor support.
Developing for Heterogeneous Systems: advancing features and capabilities, maximizing interoperability
URZ's work as a oneAPI CoE will add advanced DPC++ capabilities into hipSYCL, which supports systems based on AMD GPUs, NVIDIA GPUs, and CPUs. New DPC++ extensions are part of the SYCL 2020 provisional specification, which brings features such as unified shared memory to hipSYCL and the platforms it supports, furthering the promise of oneAPI application support across system architectures and vendors.
URZ HPC technical specialist Aksel Alpay, who created hipSYCL, leads its ongoing development. "The whole project is quite ambitious," says Alpay, venturing a look into the future. "hipSYCL is an academic research project as well as a development project, where the final product will be used in production operations. It is incredibly exciting to bring DPC++ and SYCL 2020 capabilities to additional architectures, such as AMD GPUs."
To expedite the research, URZ researchers and developers will have access to an international network of experts at Intel and numerous academic and government institutions, a great advantage in advancing hipSYCL's capabilities and furthering the goal of the oneAPI initiative. "For a scientific computing center to have access to this level of expertise and work together on an open standard with partners from around the globe is a wonderful prospect," states Heidelberg University's CIO and URZ director Prof. Dr. Vincent Heuveline, a major proponent of the CoE. In addition to being the university's main liaison for the center, he will serve as its scientific advisor.
"One of our strategic goals is to make a measurable contribution to the transfer of new technologies from research to industrial application, and of course to continuously expand our expertise and research efforts in the field of supercomputing. The oneAPI CoE will allow us to do both," explains Heuveline.
"oneAPI is a true cross-industry initiative that seeks to simplify development of diverse workloads by streamlining code re-use across a variety of architectures through an open and collaborative approach. URZ's research helps to deliver on the cross-vendor promise of oneAPI by expanding advanced DPC++ application support to other architectures," says Dr. Jeff McVeigh, Intel vice president of Datacenter XPU Products and Solutions.
About oneAPI
oneAPI is an industry initiative to create a single, unified, cross-architecture programming model for CPUs + accelerator architectures. Based on industry standards and its open development approach, the initiative will help streamline software development for high performance computers, increase performance, and provide specifications for efficient and diverse architecture programming.
Source: Heidelberg University
Baidu offers quantum computing from the cloud – VentureBeat
Posted: September 26, 2020 at 9:52 am
Following its developer conference last week, Baidu today detailed Quantum Leaf, a new cloud quantum computing platform designed for programming, simulating, and executing quantum workloads. It's aimed at providing a programming environment for quantum-infrastructure-as-a-service setups, Baidu says, and it complements the Paddle Quantum development toolkit the company released earlier this year.
Experts believe that quantum computing, which at a high level entails the use of quantum-mechanical phenomena like superposition and entanglement to perform computation, could one day accelerate AI workloads. Moreover, AI continues to play a role in cutting-edge quantum computing research.
Baidu says a key component of Quantum Leaf is QCompute, a Python-based open source development kit with a hybrid programming language and a high-performance simulator. Users can leverage prebuilt objects and modules in the quantum programming environment, passing parameters to build and execute quantum circuits on the local simulator or on cloud simulators and hardware. Essentially, QCompute provides services for creating and analyzing circuits and calling the backend.
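The core abstraction behind such kits (a circuit of gates applied to a complex state vector, then measured) can be sketched with a toy pure-Python simulator. None of the names below come from Baidu's API; this is only an illustration of what a statevector backend does.

```python
import math

# Toy statevector simulator (illustrative only, NOT Baidu's QCompute API).
# A circuit is just a sequence of gates applied to a complex amplitude vector.

def apply_single(state, gate, target, n_qubits):
    """Apply a 2x2 gate matrix to `target` qubit of an n-qubit state vector."""
    new = [0j] * len(state)
    for i, amp in enumerate(state):
        bit = (i >> target) & 1            # current value of the target bit
        base = i & ~(1 << target)          # index with target bit cleared
        new[base] += gate[0][bit] * amp                    # output bit -> 0
        new[base | (1 << target)] += gate[1][bit] * amp    # output bit -> 1
    return new

def apply_cnot(state, control, target):
    """Swap amplitudes of basis states whose control bit is set."""
    new = list(state)
    for i in range(len(state)):
        if (i >> control) & 1:
            j = i ^ (1 << target)
            if i < j:                      # handle each amplitude pair once
                new[i], new[j] = state[j], state[i]
    return new

H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

# Build a Bell state on 2 qubits: H on qubit 0, then CNOT(0 -> 1).
state = [1 + 0j, 0j, 0j, 0j]               # |00>
state = apply_single(state, H, 0, 2)
state = apply_cnot(state, 0, 1)
probs = [abs(a) ** 2 for a in state]
print([round(p, 3) for p in probs])        # [0.5, 0.0, 0.0, 0.5]
```

Real SDKs wrap this same pattern behind gate objects and backend calls, swapping the local loop for a cloud simulator or hardware.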
Quantum Leaf dovetails with Quanlse, which Baidu also detailed today. The company describes Quanlse as a cloud-based quantum pulse computing service that bridges the gap between software and hardware by providing a service to design and implement pulse sequences as part of quantum tasks. (Pulse sequences are a means of reducing quantum error, which results from decoherence and other quantum noise.) Quanlse works with both superconducting circuits and nuclear magnetic resonance platforms and will extend to new form factors in the future, Baidu says.
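As a hedged illustration of why pulse-level design matters (a sketch, not Quanlse's API): a single-qubit rotation angle is set by the integrated amplitude of the drive envelope, and smooth shapes such as Gaussians are commonly used because their gentle edges excite fewer unwanted transitions. The `gaussian_pulse` helper below is hypothetical.

```python
import math

# Sketch of pulse shaping: sample a Gaussian envelope and normalize its
# area to the desired rotation angle (area = pi gives a "pi pulse" that
# flips the qubit). Illustrative only, not Quanlse's interface.

def gaussian_pulse(duration, steps, angle):
    """Return envelope samples whose integral equals `angle`."""
    sigma = duration / 6.0                              # pulse width choice
    ts = [duration * (i + 0.5) / steps for i in range(steps)]
    raw = [math.exp(-((t - duration / 2) ** 2) / (2 * sigma ** 2)) for t in ts]
    dt = duration / steps
    scale = angle / (sum(raw) * dt)                     # normalize the area
    return [scale * r for r in raw]

samples = gaussian_pulse(duration=20.0, steps=100, angle=math.pi)
area = sum(samples) * (20.0 / 100)
print(round(area, 6))  # integrated area equals pi, up to rounding
```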
The unveiling of Quantum Leaf and Quanlse follows the release of Amazon Braket and Google's TensorFlow Quantum, a machine learning framework that can construct quantum data sets, prototype hybrid quantum and classical machine learning models, support quantum circuit simulators, and train discriminative and generative quantum models. Facebook's PyTorch relies on PennyLane, Xanadu's multi-contributor, third-party library for quantum machine learning, automatic differentiation, and optimization of hybrid quantum-classical computations. And Microsoft offers several kits and libraries for quantum machine learning applications.
IBM Partners With HBCUs to Diversify Quantum Computing Workforce – Diverse: Issues in Higher Education
Posted: at 9:52 am
September 21, 2020
In partnership with historically Black colleges and universities (HBCUs), IBM recently launched a quantum computing research initiative to raise awareness of the field and diversify the workforce.
The IBM-HBCU Quantum Center, a multi-year investment, will fund undergraduate and graduate research, provide access to IBM quantum computers through the Cloud and offer student support.
Quantum computing is considered a fairly young field and quantum computers were not readily available in research labs until 2016. IBM was the first company to put a quantum computer on the Cloud, which allows it to be accessible from anywhere, according to Dr. Abraham Asfaw, global lead of Quantum Education and Open Science at IBM Quantum.
"What that implies is that now anyone around the world can participate," he said. "This is why we have this broad education effort, to really try and make quantum computing open and accessible to everyone. The scale of the industry is very small, but we are stepping in the right direction in terms of trying to get more people into the field."
The 13 HBCUs that will be part of the initiative include Albany State University, Clark Atlanta University, Coppin State University, Hampton University, Howard University, Morehouse College, Morgan State University, North Carolina Agricultural and Technical State University, Southern University, Texas Southern University, University of the Virgin Islands, Virginia Union University and Xavier University of Louisiana.
Each of the schools was chosen based on how much the school focused on science, technology, engineering and mathematics (STEM).
"It's very important at this point to be building community and to be educating everyone so that we have opportunities in the quantum computing field for everyone," said Asfaw. "While at the same time, we are bringing in diverse perspectives to see where quantum computing applications could emerge."
Dr. Abraham Asfaw
The center encourages individuals from all STEM disciplines to pursue quantum computing. According to Asfaw, the field of quantum computing is considered highly interdisciplinary.
"Teaching quantum computing, at any place, requires bringing together several departments," he said. "So putting together a quantum curriculum is an exercise in making sure your students are trained in STEM all the way from the beginning to the end, with different pieces from the different sciences instead of just one department altogether."
Diversifying the quantum computing workforce can also be looked at in two ways. One is getting different groups of people into the field and the other is bringing different perspectives into the field from the direction of the other sciences that could benefit from quantum computing, according to Asfaw.
"We are in this discovery phase now, so really having help from all fields is a really powerful thing," he added.
IBM also plans to donate $100 million to provide more HBCUs with resources and technology as part of the Skills Academy Academic Initiative in Global University Programs. This includes providing HBCUs with university guest lectures, curriculum content, digital badges, software and faculty training by the end of 2020, according to IBM.
"Our entire quantum education effort is centered around making all of our resources open and accessible to everyone," said Asfaw. "[Our investment] is really an attempt to integrate HBCUs, which also are places of origin for so many successful scientists today, to give them opportunities to join the quantum computing revolution."
According to IBM, the skills academy is a comprehensive, integrated program designed to create a foundation of diverse, high-demand skill sets that directly correlate to what students will need in the workplace.
The academy will address topics such as artificial intelligence, cybersecurity, blockchain, design thinking and quantum computing.
Those HBCUs involved in the academy include Clark Atlanta University, Fayetteville State University, Grambling State University, Hampton University, Howard University, Johnson C. Smith University, Norfolk State University, North Carolina A&T State University, North Carolina Central University, Southern University System, Stillman College, Virginia State and West Virginia State University.
"While we are teaching quantum computing, while we are building quantum computing at universities, while we are training developers to take on quantum computers, it is important at this point to be as inclusive and accessible as possible," said Asfaw. "That really allows the field to progress."
This summer, IBM also hosted the 2020 Qiskit Global Summer School, which was designed for people to further explore the quantum computing field. The program involved three hours of lectures as well as hands-on learning opportunities. Many HBCU students were part of the program.
"This shows you that's one piece of the bigger picture of trying to get the whole world involved in quantum education," said Asfaw. "That's the first place where HBCUs were involved, and we hope to continue to build on even more initiatives going forward."
Sarah Wood can be reached at swood@diverseeducation.com.
IBM, Alphabet and well-funded startups in the race for quantum supremacy – IT Brief Australia
Posted: at 9:52 am
GlobalData, the worldwide data analytics firm, has released new research suggesting that many companies are joining the race for quantum supremacy, that is, the race to be the first to make significant headway with quantum computing.
Quantum computers are a step closer to reality and could solve certain real-life problems that are beyond the capability of conventional computers, the analysts state.
However, the biggest challenge is that these machines must be able to manipulate several dozen quantum bits, or qubits, to achieve impressive computational performance.
As a result, a handful of companies have joined the race to increase the power of qubits and claim quantum supremacy, says GlobalData.
An analysis of GlobalData's Disruptor Intelligence Center reveals various companies in the race to monetise quantum computing as an everyday tool for business.
IBM's latest quantum computer, accessible via the cloud, boasts a 65-qubit Hummingbird chip. It is an advanced version of System Q, its first commercial quantum computer, launched in 2019 with 20 qubits. IBM plans to launch a 1,000-qubit system by the end of 2023.
Alphabet has built a 54-qubit processor, Sycamore, and demonstrated quantum supremacy by performing a random-number-generation task in 200 seconds, a task it claims would take the most advanced supercomputer 10,000 years to finish.
The company also unveiled its newest 72-qubit quantum computer Bristlecone.
Alibaba's cloud service subsidiary Aliyun and the Chinese Academy of Sciences jointly launched an 11-qubit quantum computing service, which is available to the public on its quantum computing cloud platform.
Alibaba is the second enterprise to offer the service to the public, after IBM.
However, its not only the tech giants that are noteworthy. GlobalData finds that well-funded startups have also targeted the quantum computing space to develop hardware, algorithms and security applications.
Some of them are Rigetti, Xanadu, 1Qbit, IonQ, ISARA, Q-CTRL and QxBranch.
Amazon, unlike the tech companies competing to launch quantum computers, is making quantum products of other companies available to users via Braket.
It currently supports quantum computing services from D-Wave, IonQ and Rigetti.
GlobalData principal disruptive tech analyst Kiran Raj says, "Qubits can allow the creation of algorithms that complete a task with reduced computational complexity, which cannot be achieved with traditional bits.
"Given such advantages, quantum computers can solve some of the intractable problems in cybersecurity, drug research, financial modelling, traffic optimisation and batteries, to name a few."
Raj says, "Albeit a far cry from large-scale mainstream use, quantum computers are gearing up to be a transformative reality. They are highly expensive to build, and it is hard to maintain the delicate state of superposition and entanglement of qubits.
"Despite such challenges, quantum computers will continue to progress into the future where companies may rent them to solve everyday problems the way they currently rent cloud services.
"It may not come as a surprise that quantum computing one day replaces artificial intelligence as the mainstream technology to help industries tackle problems they never would have attempted to solve before."
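A concrete instance of the reduced complexity Raj describes is Grover's search: finding one marked item among N takes roughly (pi/4)*sqrt(N) quantum oracle queries, versus about N/2 checks expected classically. The toy simulation below tracks the amplitude bookkeeping classically (a sketch for intuition, not production quantum code).

```python
import math

# Grover's algorithm, simulated classically: start in a uniform
# superposition, then repeatedly (1) flip the sign of the marked item's
# amplitude (oracle) and (2) invert all amplitudes about their mean
# (diffusion). Each round amplifies the marked item's probability.

def grover_success_prob(n_items, marked, iterations):
    amps = [1 / math.sqrt(n_items)] * n_items
    for _ in range(iterations):
        amps[marked] = -amps[marked]          # oracle query
        mean = sum(amps) / n_items            # diffusion step
        amps = [2 * mean - a for a in amps]
    return amps[marked] ** 2

N = 1024
k = int(round(math.pi / 4 * math.sqrt(N)))    # ~25 queries vs ~512 classical
p = grover_success_prob(N, 0, k)
print(k, round(p, 4))                          # success probability near 1
```

After about sqrt(N) queries the marked item is found with near certainty, which is the quadratic speedup behind the "reduced computational complexity" claim.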