
Amit Shah Promotes Organic Farming, Sets Target to Boost Organic Food Exports – Goodreturns

Posted: March 17, 2024 at 2:33 am


Business

-Vasant Shah

New Delhi, March 13: Cooperation Minister Amit Shah highlighted the significance of promoting organic farming to preserve soil health and announced the government's ambitious goal of increasing organic food exports tenfold to Rs 70,000 crore in the coming years. Shah made these remarks while inaugurating the new office building of three national-level multi-state cooperative societies at the World Trade Centre in Nauroji Nagar.

The three societies inaugurated by Shah include Bhartiya Beej Sahakari Samiti Ltd (BBSSL), National Cooperative Organics Ltd (NCOL), and National Cooperative Export Ltd (NCEL). These societies aim to address gaps in organic products, seed conservation and enhancement, and exports, contributing to the resolution of various challenges in Indian agriculture and boosting farmers' income through increased exports of farm products, including organic foods.

Shah expressed concern over India's relatively small share of the global agricultural produce market, which stands at USD 45 billion out of a total market worth USD 2,155 billion. He emphasized the government's target of reaching USD 115 billion by 2030. Recognizing the deterioration of soil health due to chemical fertilizer use, Shah stressed the importance of encouraging farmers to adopt organic farming practices.
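For context, those figures put India's current share at roughly two percent of the global market; a quick check of the arithmetic, measured against the market's current size:

\[
\frac{45}{2155} \approx 2.1\%, \qquad \frac{115}{2155} \approx 5.3\%
\]

So even the 2030 target of USD 115 billion would amount to only about five percent of the market at today's size.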

To support organic farming, Shah announced the establishment of a laboratory in every district over the next five years to certify organic farms and products. He highlighted NCOL's role in promoting organic farming and introducing organic products under the Bharat brand. Shah expressed confidence that Bharat Organics will capture over 50% of the domestic organic market by 2030.

Shah acknowledged that India's organic exports currently stand at Rs 7,000 crore, compared to the global organic food market worth Rs 10 lakh crore. He set an ambitious target of increasing organic exports to Rs 70,000 crore, emphasizing the potential for growth in this sector. NCOL will play a crucial role in the entire chain of organic products, including collection, certification, testing, standardization, procurement, storage, processing, branding, labeling, packaging, and export. It will also serve as a guide for many cooperatives.

For BBSSL, the seed society, Shah set a target of Rs 10,000 crore turnover in the next five years. He also announced a target of increasing the turnover of NCEL, the export cooperative society, to Rs 1 lakh crore annually over the next five years. These cooperative societies aim to uplift the lives of individuals associated with agriculture and related activities, aligning with Prime Minister Narendra Modi's vision of "Sahakar se Samriddhi" (Prosperity through Cooperation).

The three cooperative societies were established with the approval of the government and registered under the Multi-State Cooperative Societies Act, 2002. Cooperative societies at various levels, from district to state to national, can become members if they are interested in the activities specified for each society.

NCEL was formed to promote exports from the cooperative sector, with member promoters including Gujarat Cooperative Milk Marketing Federation Ltd (GCMMF), Indian Farmers Fertilizer Cooperative Ltd (IFFCO), Krishak Bharati Cooperative Ltd (KRIBHCO), National Agricultural Cooperative Marketing Federation (NAFED), and National Cooperative Development Corporation (NCDC). NCEL will undertake direct export of goods and services of cooperatives and related entities.

NCOL was established to harness the potential of organic products and create a healthy agriculture ecosystem. It serves as an umbrella organization for the aggregation, procurement, certification, testing, branding, and marketing of organic products in the cooperative sector. NCOL is promoted by NAFED, National Dairy Development Board (NDDB), NCDC, GCMMF, and National Cooperative Consumers' Federation (NCCF). NCOL will support the increased production of organic products and facilitate the marketing of authentic and certified organic products by cooperatives and related entities.

BBSSL focuses on advanced and traditional seed research and production, handling their processing and marketing through the cooperative sector. Promoted by IFFCO, KRIBHCO, NAFED, NDDB, and NCDC, BBSSL aims to enhance the production of quality seeds in India, reducing reliance on imported seeds. The society's efforts will contribute to increased agricultural production and improved income for seed-producing farmers.

The three societies will work together to uplift the lives of individuals involved in agriculture and related activities. They will procure agricultural produce and seeds from farmers via Primary Agricultural Credit Societies (PACS), strengthening PACS and ensuring that farmers receive maximum value for their produce. The societies will operate with the objective of ensuring that profits on the net surplus go directly to farmers' accounts, minimizing leakages in the process.


The rest is here:

Amit Shah Promotes Organic Farming, Sets Target to Boost Organic Food Exports - Goodreturns

Written by admin |

March 17th, 2024 at 2:33 am

Posted in Organic Food

Tagged with

Unlocking Nature’s Treasures: 3D Digital Library Revolutionizes Museum Collections – ScienceBlog.com

Posted: March 9, 2024 at 2:41 am


The completion of the openVertebrate (oVert) project has ushered in a new era of scientific discovery and accessibility for natural history museums. This five-year collaborative effort among 18 institutions has created 3D reconstructions of over 13,000 vertebrate specimens, making them freely available online.

"When people first collected these specimens, they had no idea what the future would hold for them," said Edward Stanley, co-principal investigator of the oVert project and associate scientist at the Florida Museum of Natural History.

The oVert project represents a significant shift from the traditional model, where museum collections were largely inaccessible to the public and required researchers to physically travel or request specimen loans. "Now we have scientists, teachers, students and artists around the world using these data remotely," explained David Blackburn, lead principal investigator and curator of herpetology at the Florida Museum.

Through CT scanning, the project has captured representative species across the vertebrate tree of life, including over half the genera of all amphibians, reptiles, fishes, and mammals. These detailed 3D models offer an unprecedented view of internal structures that were previously only observable through destructive dissection.

"Museums are constantly engaged in a balancing act," Blackburn said. "You want to protect specimens, but you also want to have people use them. oVert is a way of reducing the wear and tear on samples while also increasing access, and it's the next logical step in the mission of museum collections."

The impact of oVert has already been profound, with researchers using the data to uncover remarkable insights into the natural world, such as the discovery of bony plates in the tails of spiny mice and the revelation that the massive dinosaur Spinosaurus was likely a land-dweller rather than an aquatic predator.

Beyond scientific research, the 3D models have been utilized by artists, educators, and in virtual reality experiences, making natural history more accessible and engaging than ever before.

As the project moves forward, the challenge lies in developing advanced tools and techniques to fully leverage this unprecedented wealth of data, pushing the boundaries of what is possible in fields ranging from machine learning to supercomputing.

#DigitalMuseum #3DModelingTechnology #NaturalHistoryCollections #OpenScience

The material in this press release comes from the originating research organization. Content may be edited for style and length.

Read more:
Unlocking Nature's Treasures: 3D Digital Library Revolutionizes Museum Collections - ScienceBlog.com

Written by admin |

March 9th, 2024 at 2:41 am

Posted in Online Library

Tagged with

Shimla MC to establish digital libraries across town – The Tribune India

Posted: at 2:41 am


Tribune News Service

Shimla, March 7

In an initiative to serve students as well as the residents of the state's capital, Shimla Municipal Corporation (SMC) is set to establish digital libraries across the town.

For this, the corporation has directed all the councillors to find a suitable place in their respective wards. The councillors have also been directed to ensure that the sites chosen for the digital libraries are easily accessible to students.

The proposal to establish digital libraries was included in the Budget for the financial year 2024-25, announced by SMC Mayor Surinder Chauhan on February 15. According to the Budget, the corporation will establish them as per the availability of land.

Chauhan said the corporation plans to establish a digital library in each ward and has sought proposals from the councillors. Work on the libraries will start soon after the proposals are received, he said.

"Students will benefit greatly from the availability of these digital libraries," he added.

He said several new buildings will be constructed to house the libraries, for which the corporation will seek funds from the state as well as the Union Government. Chauhan said the funding proposals will be sent to the government after assessing the total expenditure to be incurred in establishing the libraries.

About The Author

The Tribune News Service brings you the latest news, analysis and insights from the region, India and around the world. Follow the Tribune News Service for a wide-ranging coverage of events as they unfold, with perspective and clarity.

#Shimla

Originally posted here:
Shimla MC to establish digital libraries across town - The Tribune India

Written by admin |

March 9th, 2024 at 2:41 am

Posted in Online Library

Tagged with

Chevron invests in quantum computing development for oil and gas market – WorldOil

Posted: at 2:40 am


(WO) Oxford Quantum Circuits (OQC) announced that Chevron Technology Ventures, part of Chevron Corporation, has joined its $100m Series B funding round.

Quantum computing in the oil and gas market is expected to grow at a CAGR of 37.9%, owing to the increasing demand for efficient optimization and simulation across the sector. Chevron's investment marks a significant move by a supermajor into the rapidly evolving field of quantum computing.
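For readers unfamiliar with the metric, CAGR (compound annual growth rate) is the constant yearly rate that carries a value from its starting size to its ending size over n years; at 37.9%, a market roughly quintuples in five years:

\[
\text{CAGR} = \left(\frac{V_{\text{end}}}{V_{\text{start}}}\right)^{1/n} - 1,
\qquad (1 + 0.379)^{5} \approx 4.99
\]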

"OQC's development of the quantum computer has the potential to change the information processing landscape by merging the bounds of engineering and physics," said Jim Gable, Vice President, Innovation and President of Technology Ventures at Chevron. "This is the latest investment from our Core Energy Fund, which focuses on high-tech, high-growth startups and breakthrough technologies that could improve Chevron's core oil and gas business performance as well as create new opportunities for growth."

A quantum future for oil and gas. OQC's technology provides several potential groundbreaking opportunities for the oil and gas sector, including the development and optimization of catalysts and the efficiency of transportation and distribution networks. Quantum computing is anticipated to accelerate the oil and gas industry's discovery and development of new materials, through the simulation of complex molecules, in support of lower-carbon products.

To realize this future, the oil and gas industry requires secure, accessible and powerful quantum computing that is integrated with existing high-performance computing. Prior to the launch of OQC Toshiko, quantum computers were only available in labs, making secure access for companies and integration with existing high-performance computing the largest barriers to wider business adoption of this groundbreaking technology.

Commenting on the news, Ilana Wisby, Chief Executive Officer at OQC, said, "Chevron's investment marks a significant milestone in harnessing quantum computing for the energy sector. We're excited to drive innovation and efficiency in exploration and renewables and pioneer enterprise-ready quantum in the energy sector."

Read the original post:

Chevron invests in quantum computing development for oil and gas market - WorldOil

Written by admin |

March 9th, 2024 at 2:40 am

Posted in Quantum Computing

Tagged with

Why the QPU Is the Next GPU – Built In

Posted: at 2:40 am


The computational demands of various sectors, such as drug discovery, materials science, and AI, are skyrocketing. Graphics processing units (GPUs) have been at the forefront of this journey, serving as the backbone for tasks demanding high parallel processing capabilities. Their integration into data centers has marked a significant advancement in computational technology.

As we push the boundaries of what's computationally possible, however, the limitations of GPUs become apparent, especially when facing problems that classical computing struggles to solve efficiently. Enter the quantum processing unit (QPU), a technology that promises not just to complement but potentially transcend the capabilities of GPUs, heralding a new era in computational science.

A quantum processing unit, or QPU, uses qubits and quantum circuit model architecture to solve problems that are too computationally intensive for classical computing. Its potential is analogous to the transformational impact the GPU had on computing in the 2000s.

More From Yuval Boger: What Role Will Open-Source Development Play in Quantum Computing?

The binary system is at the core of classical computing, with bits that exist in one of two states: zero or one. Through logic gates within the von Neumann architecture (an architecture that includes a CPU, memory, I/O, and data bus), this binary processing has propelled technological progress for decades. GPUs, enhancing this system, offer parallel processing by managing thousands of threads simultaneously, significantly outpacing traditional CPUs for specific tasks.

Despite their prowess, GPUs are still bound by the linear progression of classical algorithms and the binary limitation of bits, making some complex problems inefficient and energy-intensive to solve. A key reason for this linear progression limitation is that a classical algorithm can only process one possible solution at a time.
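To make that limitation concrete, here is a minimal sketch (an illustration of the general point, not code from the article) of a classical exhaustive search; the loop inspects exactly one candidate solution per iteration, so the worst case touches all 2^n candidates:

```python
from itertools import product

def brute_force_sat(clauses, n_vars):
    """Classically search for a satisfying assignment.

    Each clause is a list of signed 1-based variable indices,
    e.g. [1, -2] means (x1 OR NOT x2). One candidate assignment
    is examined per iteration -- the linear progression described
    above -- so the worst case is 2**n_vars iterations.
    """
    for bits in product([False, True], repeat=n_vars):
        if all(any(bits[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses):
            return bits  # first satisfying assignment found
    return None

# (x1 OR x2) AND (NOT x1 OR x2) is satisfied by x2 = True
print(brute_force_sat([[1, 2], [-1, 2]], n_vars=2))
```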

The integration of GPUs into data centers began in the late 1990s and early 2000s, initially focused on graphics rendering. NVIDIA's GeForce 256, released in 1999 and billed as the world's first GPU, marked a significant shift towards GPUs as programmable units rather than merely graphics accelerators. Their general-purpose computing potential was realized in the mid-2000s with NVIDIA's introduction of CUDA in 2006, enabling GPUs to handle computational tasks beyond graphics, such as simulations and financial modeling.

The democratization of GPU computing spurred its adoption for scientific computing and AI, particularly benefiting from GPUs' parallel processing capabilities. This led to wider use in research and high-performance computing, driving significant advancements in GPU architecture.

By the early 2010s, the demand for big data processing and AI applications accelerated GPU adoption in cloud services. This period also saw the rise of specialized AI data centers optimized for GPU clusters, enhancing the training of complex neural networks.

The 2020s have seen continued growth in GPU demand, driven by deep learning applications in natural language processing, computer vision, and speech recognition. Modern deep learning frameworks and the introduction of specialized AI accelerators, such as Google's TPU and NVIDIA's Tensor Core GPUs, underscore the critical role of GPUs in AI development and the evolving landscape of computational hardware in data centers.

Despite these developments, GPUs did not displace traditional CPUs. Rather, they ran side by side. We saw the rise of heterogeneous computing: the increasingly popular integration of GPUs with CPUs and other specialized hardware within a single system. This allows different processors to handle tasks best suited to their strengths, leading to improved overall efficiency and performance.

Quantum computing introduces a transformative approach to computing with the concept of qubits. Unlike classical bits, qubits can exist in a state of superposition, embodying both zero and one simultaneously. This characteristic, along with quantum entanglement, enables quantum computers to process information on a scale that classical machines can't match. Quantum gates manipulate these qubits, facilitating parallel processing across exponentially larger data sets.
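In standard notation (textbook quantum mechanics, not specific to any vendor), a single qubit is a normalized superposition of the two basis states, and a register of n qubits is described by 2^n complex amplitudes, which is where the exponential scaling comes from:

\[
\lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
\qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1,
\qquad \lvert \Psi \rangle = \sum_{x \in \{0,1\}^{n}} c_{x} \lvert x \rangle
\]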

Quantum gates are the fundamental building blocks of quantum circuits, analogous to logic gates in classical computing, but designed for operations on qubits instead of classical bits. Quantum gates manipulate the state of qubits according to the principles of quantum mechanics, enabling the execution of quantum algorithms. Some quantum gates operate only on a single qubit, whereas others operate on two or more qubits. Multi-qubit gates are critical to exploiting the entanglement and superposition properties of quantum computing.
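As a standard worked example, applying the single-qubit Hadamard gate and then the two-qubit CNOT gate to two qubits initialized to zero produces an entangled Bell state:

\[
H = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix},
\qquad
\mathrm{CNOT}\,(H \otimes I)\,\lvert 00 \rangle
= \frac{\lvert 00 \rangle + \lvert 11 \rangle}{\sqrt{2}}
\]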

The quantum computing field is grappling with challenges like qubit stability and effective quantum error correction, however, which are crucial for achieving scalable quantum computing. Qubits are inherently fragile and can be affected by a variety of environmental conditions. Therefore, maintaining a stable qubit state is challenging, and researchers still must develop special techniques to detect and correct unwanted changes in the qubit state.

QPU technology is poised to revolutionize areas where classical computing reaches its limits. In drug discovery, for instance, QPUs could simulate molecular interactions at scales never before possible, expediting the creation of new therapeutics. Materials science could benefit from the design of novel materials with tailored properties. In finance, QPUs could enhance complex model optimizations and risk analysis. In AI, they could lead to algorithms that learn more efficiently from less data. QPUs can thus tackle problems that CPUs and GPUs cannot, opening new frontiers of discovery and innovation.

Although GPUs have revolutionized data center operations, they also bring formidable challenges. The voracious GPU appetite for power generates significant heat, which demands sophisticated and often expensive cooling systems to maintain optimal performance levels. This not only increases the operational costs but also raises environmental concerns due to the high energy consumption required for both running the units and cooling them.

In addition to these physical constraints, the technological landscape in which GPUs operate is rapidly evolving. The constant need for updates and upgrades to accommodate new software demands and improve processing capabilities presents substantial logistical and financial hurdles. This strains resources and complicates long-term planning for data center infrastructure.

QPUs promise to address many of these challenges. QPUs perform computations in ways fundamentally different from classical systems. Specifically, the intrinsic ability of qubits to exist in multiple states simultaneously allows QPUs to tackle complex problems more effectively, reducing the need for constant hardware upgrades. This promises not only a leap in computational power but also a move towards more sustainable and cost-effective computing solutions, directly addressing the critical limitations faced by GPUs in today's data centers.

The journey toward QPU adoption in computational infrastructures is laden with hurdles, though. Achieving stable, large-scale quantum systems and ensuring reliable computations through quantum error correction are paramount challenges. Some types of quantum computers require special cooling and environmental conditions that are uncommon in data centers and thus require adaptation.

Additionally, the quantum software development field is in its infancy, necessitating the creation of new programming tools and languages. To make use of the quantum properties of QPUs, just translating classical algorithms is insufficient. Instead, we will need to invent new types of algorithms. Just like GPUs allow us to leverage parallel processing, QPUs allow us to execute code differently. Despite these obstacles, ongoing research and development are gradually paving the way for QPUs to play a central role in future computational tasks.

Today, QPU integration into broader computational infrastructures and their practical application in industry and research is still in the nascent stages. The development and commercial availability of quantum computers is growing, with several companies and research institutions demonstrating quantum advantage and offering cloud-based quantum computing services.

How close are QPUs to taking a prime position next to GPUs? In other words, if we were to compare the development of QPUs with the historical development of GPUs, what year would we be in now?

Drawing a parallel with the GPU timeline, the current stage of QPU integration closely mirrors the GPU landscape in the mid-2000s, when GPUs became general-purpose computing machines that were adopted for niche applications.

Given these considerations, the current stage of QPU integration might be analogous to the GPU industry around 2006-2007. That was a time of pivotal change, where the foundational technologies and programming models that would enable widespread adoption were just being established. For QPUs, the development of quantum algorithms, error correction techniques, and qubit coherence are akin to the early challenges faced by GPUs in transitioning to general-purpose computing.

More on Quantum Computing: Are You Prepared for the Quantum Revolution?

In summary, although GPUs continue to play a critical role in advancing computational capacities, the integration of QPUs into data centers holds the promise of overcoming the operational and environmental challenges posed by current technologies. With their potential for lower power consumption, reduced heat output, and diminished need for frequent upgrades, QPUs represent a hopeful horizon in the quest for more efficient, sustainable, and powerful computing solutions. QPUs won't replace GPUs, just like GPUs did not eliminate classical CPUs. Instead, the data center of the future will include all three computing methods.

Original post:

Why the QPU Is the Next GPU - Built In

Written by admin |

March 9th, 2024 at 2:40 am

Posted in Quantum Computing

Tagged with

What is quantum computing good for? XPRIZE and Google offer cash for answers – Network World

Posted: at 2:40 am


The sponsors of a new $5 million prize want to boost the quantum computing industry by encouraging developers to write new algorithms to help the emerging technology solve real-world problems.

The new Quantum for Real-World Impact contest, from the XPRIZE Foundation, aims to speed the development of quantum computing algorithms focused on sustainability, health, and other societal issues. The three-year contest, sponsored by Google Quantum AI and the Geneva Science and Diplomacy Anticipator Foundation, wants to unleash the potential of quantum computing, according to the contest site.

"Currently, quantum computers are not sufficiently advanced enough to solve real-world societal problems that classical computers cannot," the contest site says. However, as the technology advances, relatively few companies and university researchers are focused on translating quantum algorithms into real-world application scenarios and assessing their feasibility to address global challenges once sufficiently powerful hardware is available.

The new contest is crucial for the advancement of quantum computing, said Rebecca Krauthamer, co-founder and chief product officer at QuSecure, a vendor of quantum-resilient cybersecurity tools.

"XPRIZE has a powerful history of pushing forward advancements in cutting-edge technology in spaceflight, conservation, advanced medicine, and more," she said. "The contest signifies we're in a truly exciting time for quantum computing."

Quantum computing hardware development still has a significant road ahead, she added, but much of the innovation from the technology will come from new algorithms and the application of quantum computers to real-world problems.

"The contest provides the recognition of the great potential of quantum computing for both commercial and societal gain," she added.

Contestants can write new algorithms to solve new problems using quantum computing, they can show how existing algorithms can be used to solve previously unknown applications of quantum computing, or they can show ways to reduce the computing resources needed for a quantum computer to work on already established algorithms or applications.

Examples of possible contest entries include:

The contest is a good starting point for quantum computing in business models, said Jim Ingraham, vice president of strategic research, EPB of Chattanooga, a power and telecommunications company that launched a quantum-powered network in late 2022. Commercialization is the next essential step for bringing quantum technologies out of the lab and into the real world, he said.

The EPB Quantum Network was another step forward, he added. "The network provides access to the necessary proving ground for quantum technologists to show investment worthiness and commercial viability," he said. "This is a necessary step to help companies, government agencies and researchers accelerate the development of their technologies."

The contest may assist companies that haven't found a way to profit from quantum computing innovation, added Lawrence Gasman, founder and president of Inside Quantum Technology, a quantum research firm.

"It may bring in firms that could otherwise not survive," he said. "This implies that the use of money is carefully vetted and only goes to firms that can make money in the short-to-medium term."

"While quantum computing is not yet mainstream, that day is coming," said QuSecure's Krauthamer.

"When you see a news headline stating that quantum computers have been used to solve a problem that you recognize (something like enhancing battery technology, or optimizing financial portfolios, or improving greenhouse emissions), that's when you'll know that quantum computing has gone mainstream," she said. "We will begin seeing these headlines more in the next couple of years."

View original post here:

What is quantum computing good for? XPRIZE and Google offer cash for answers - Network World

Written by admin |

March 9th, 2024 at 2:40 am

Posted in Quantum Computing

Tagged with

3 Quantum Computing Stocks to Buy for Real-World Breakthrough – InvestorPlace

Posted: at 2:40 am


The quantum computing industry is experiencing significant growth, with advancements in both hardware and software making it a key consideration for organizations looking to invest in cutting-edge technology. To this end, we look at some of the top quantum computing stocks to buy as businesses utilize this next-gen technology across various industries.

Major tech players are increasingly interested in making significant investments in quantum computing to keep pace with rapid technological advancement and to meet customers' current demands for innovative computational solutions.

Drawing on data from the quantum market and insights from industry thought leaders gathered in the fourth quarter of 2023, the recent State of Quantum 2024 report noted the transition from theoretical exploration to practical application, highlighted by the emergence of full-stack quantum computer deliveries in national labs and quantum centers.

In 2022, venture investments in quantum technology soared to over $2 billion amid strong investor confidence in this burgeoning field. However, by 2023, these investments saw a sharp 50% drop, sparking debates about a potential quantum winter.

Industry experts argue the decline reflects broader venture capital trends and not a loss of faith in the quantum sector's prospects. Government funding has increasingly filled the gap private investors left, mitigating concerns over the investment slowdown.

The bottom line is the quantum industry is still advancing, albeit at a moderate pace. This emphasizes the need for realistic expectations and a sustained commitment to research and development. Despite the recent dip in investment, the sectors insiders remain cautiously optimistic about its future. This suggests the industry is far from stagnating.

Let's take a closer look at leading quantum computing stocks to buy.

Intel (NASDAQ:INTC), the semiconductor giant, is actively pursuing a turnaround strategy to regain its leadership in the technology industry. The plan involves a significant restructuring of its operations, investment in advanced chip manufacturing technologies and a renewed focus on innovation.

Among other things, Intel is pushing hard to develop its quantum computing products. The chipmaker introduced Tunnel Falls, a quantum computing chip leveraging the company's cutting-edge manufacturing techniques.

The company has collaborated with various government and academic research entities to facilitate the testing of Tunnel Falls. According to Intel, the new chip has a 95% yield rate across the wafer, along with voltage uniformity.

Quantum computing isn't the core focus of Intel's strategy to reclaim its semiconductor industry leadership. However, the initiative represents a potential growth area. Success in quantum computing research could position Intel as a key player in this innovative technology domain in the future. This could make Intel one of the top quantum computing stocks to buy.

Like Intel, Alphabet (NASDAQ:GOOGL, NASDAQ:GOOG) is making significant strides in quantum computing through its subsidiary, Quantum AI. Focusing on developing quantum processors and algorithms, Google's parent company aims to harness quantum technology for breakthroughs in computing power.

Alphabet recently exceeded Q4 earnings expectations with a net income of $20.69 billion and a 13% revenue increase to $86.3 billion. Its advertising revenue of $65.52 billion slightly missed analyst projections.

While fighting Microsoft (NASDAQ:MSFT) on the AI front, Google has also ventured into the quantum computing realm with its proprietary quantum computing chip, Sycamore. In a strategic move, Google spun off its quantum computing software division into a standalone startup, SandboxAQ, in March 2022.

Its dominant position in search drives Google's foray into quantum computing. It aims to develop more efficient, faster and intelligent solutions. The company plays a crucial role in managing vast volumes of digital information. It can gain immensely by enabling various organizations to harness the transformative power of quantum computing and AI.

FormFactor (NASDAQ:FORM), a leading provider in the semiconductor industry, specializes in the design, development and manufacture of advanced wafer probe cards. These probe cards are essential for the electrical testing of semiconductor wafers before cutting them into individual chips.

FormFactor is strategically positioned within the quantum computing ecosystem through its semiconductor test and measurement solutions expertise. The company provides advanced systems essential for developing and testing quantum computing chips. These systems are designed to operate at extremely low temperatures, a fundamental requirement for quantum computing experiments where qubits must be maintained in a coherent state.

Its flagship products include precision engineering solutions like the Advanced Matrix series for high-density applications and the TouchMatrix series for touchscreen panels. FormFactors products enable semiconductor manufacturers to perform reliable and accurate testing at various stages of the production process. This ensures the functionality and quality of the final semiconductor products.

Last month, FormFactor reported a modest top-line year-over-year increase of 1.3%, reaching $168.2 million. Looking ahead, expectations for the first quarter are aligned with the recent quarterly performance, with projected revenue of around $165 million.

On the date of publication, Shane Neagle did not hold (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Shane Neagle is fascinated by the ways in which technology is poised to disrupt investing. He specializes in fundamental analysis and growth investing.

More here:

3 Quantum Computing Stocks to Buy for Real-World Breakthrough - InvestorPlace

Written by admin |

March 9th, 2024 at 2:40 am

Posted in Quantum Computing

Tagged with

Longer coherence: How the quantum computing industry is maturing – DatacenterDynamics

Posted: at 2:40 am


Quantum computing theory dates back to the 1980s, but it's really only in the last five to ten years that we've seen it advance to the point where it could realistically become a commercial enterprise.

Most quantum computing companies have been academic-led science ventures; companies founded by PhDs leading teams of PhDs. But, as the industry matures and companies look towards a future of manufacturing and operating quantum computers at a production-scale, the employee demographics are changing.

While R&D will always play a core part of every technology company, making quantum computers viable out in the real world means these startups are thinking about how to build, maintain, and operate SLA-bound systems in production environments.

This new phase in the industry requires companies to change mindset, technology, and staff.


At quantum computing firm Atom Computing, around 40 of the company's 70 employees have PhDs, many joining straight out of academia. This kind of academic-heavy employee demographic is commonplace across the quantum industry.

"I'd venture that over half of our company doesn't have experience working at a company previously," says Rob Hays, CEO of Atom. "So there's an interesting bridge between the academic culture versus the Silicon Valley tech startup; those are two different worlds and trying to bridge people from one world to the other is challenging. And it's something you have to focus and work on openly and actively."

Maturing from small startups into large companies with demanding customers and shareholders is a well-trodden path for hundreds of technology companies in Silicon Valley and across the world.

And quantum computing companies are getting there: the likes of IonQ, Rigetti, and D-Wave are already listed on the Nasdaq and New York Stock Exchange, although the latter two companies have at various times had to deal with the prospect of being de-listed due to low stock prices.

Most of the quantum companies DCD spoke to for this piece are undergoing a transition from pure R&D mode to a more operational and engineering phase.

"When I first joined four years ago, the company was entirely PhDs," says Peter Chapman, IonQ CEO. "We're now in the middle of a cultural change from an academic organization and moving to an engineering organization. We've stopped hiring PhDs; most of the people we're hiring nowadays are software, mechanical, and hardware engineers. And the next phase is to a customer-focused product company."

Chapman points to the hirings of the likes of Pat Tan and Dean Kassmann (previously at Amazon's hardware-focused Lab126 and rocket firm Blue Origin, respectively) as evidence of the company moving to a more product- and engineering-focused workforce.

2023 also saw Chris Monroe, IonQ co-founder and chief scientist, leave the company to return to academia at North Carolina's Duke University.

During the earnings call announcing Monroe's departure, Chapman said: "Chris would be the first one to tell you that the physics behind what IonQ is doing is now solved. It's [now] largely an engineering problem."

Atom's Hays notes a lot of the engineering work that the company is doing to get ready for cloud services and applications is software-based, meaning the company is looking for software engineers.

"We are mostly looking for people that have worked at cloud service providers or large software companies and have an interest in either learning or already some foundational knowledge of the underlying physics and science," he says. "But we're kind of fortunate that those people self-select and find us. We have a pretty high number of software engineers who have physics undergrads and an extreme interest in quantum mechanics, even though by trade and experience they're software engineers."

On-premise quantum computers are currently rarities largely reserved for national computing labs and academic institutions. Most quantum processing unit (QPU) providers offer access to their systems via their own web portals and through public cloud providers.

But today's systems are rarely expected (or contracted) to run with the five-9s resiliency and redundancy we might expect from tried and tested silicon hardware.
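For reference, "five-9s" is 99.999% availability, which works out to only a few minutes of permissible downtime per year:

\[
(1 - 0.99999) \times 365.25 \times 24 \times 60 \approx 5.3 \ \text{minutes per year}
\]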

"Right now, quantum systems are more like supercomputers and they're managed with a queue; they're probably not online 24 hours, users enter jobs into a queue and get answers back as the queue executes," says Atom's Hays.

"We are approaching how we get closer to 24/7 and how we build in redundancy and failover so that if one system has come offline for maintenance, there's another one available at all times. How do we build a system architecturally and engineering-wise, where we can do hot swaps or upgrades or changes with as little downtime as possible?"

Other providers are going through similar teething phases of how to make their systems (which are currently sensitive, temperamental, and complicated) enterprise-ready for the data centers of the world.

"I already have a firm SLA with the cloud guys around the amount of time that we do jobs on a daily basis, and the timeframes to be able to do that," says Chapman. "We are moving that SLA to 24/7 and being able to do that without having an operator present. It's not perfect, but it's getting better. In three or four years from now, you'll only need an on-call when a component dies."

Rigetti CTO David Rivas says his company is also working towards higher uptimes.

"The systems themselves are becoming more and more lights out every quarter," he says, "as we outfit them for that kind of remote operation and ensure that the production facilities can be outfitted for that kind of operation."


Manufacturing and repair of these systems are also maturing beyond the first PhD-built generations of quantum computers. These will never be mass-produced, but the industry needs to move away from one-off artisanal machines to a more production line-like approach.

"A lot of the hardware does get built with the assistance of electronics engineers, mechanical engineers," says Atom's Hays, "but much is still built by experimental physicists."

IonQ's Chapman adds: "In our first-generation systems, you needed a physicist with a screwdriver to tune the machine to be able to run your application. But every generation of hardware puts more under software control."

"Everywhere a screwdriver could be turned, there's now a stepper motor under software control, and the operating system is now doing the tuning."

Simon Phillips, CTO of the UK's Oxford Quantum Circuits, says OQC is focused on how it hires staff and works with partners to roll out QPUs into colocation data centers.

"And the first part of that starts with: if we put 10 QPUs in 10 locations around the world, how do we do that without having an army of 100 quantum engineers on each installation?"

That starts with having a separate deployment team and a site reliability engineering team that can then run the SLA on that machine.

He adds: "Not all problems are quantum problems. It can't just be quantum engineers; it's not scalable if it's the same people doing everything."

"It's about training and understanding where the first and second lines of support sit, having a cascading system, and utilizing any smart hands so we can train people who already exist in data centers."


While the quantum startups are undergoing their own maturing process, their suppliers are also being forced to learn about the needs of commercial operators and what it means to deploy in a production data center.

For years, the supply chain (including for the dilution refrigerators that keep many quantum computers supercooled) has dealt with largely self-reliant academic customers in lab spaces.

Richard Moulds, general manager of Amazon Braket at AWS, told DCD the dilution refrigerator market is a cottage industry with few suppliers.

One of the main fridge suppliers is Oxford Instruments, an Oxford University spin-out from the late 1950s that released the first commercial dilution unit back in 1966. The other large incumbent, Bluefors, was spun out of what is now the Low Temperature Laboratory at Aalto University in Finland 15 years ago.

Prior to the quantum computing rush, the biggest change in recent years was the introduction of pulse tube technology. Instead of a cryostat inserted into a bath of liquid helium-4, quantum computers could now use a closed-loop system (aka a dry fridge/cryostat).

This meant the systems could become smaller, more efficient, more software-controlled, and more user-friendly.

"With the wet dilution fridge (or wet cryostat), you need two-floor rooms for ceiling height. You need technicians to top up helium and run liquefiers, you need to buy helium to keep topping up," says Harriet van der Vliet, product segment manager, quantum technologies, Oxford Instruments.

"It was quite a manual process and it would take maybe a week just to pre-cool, and that would not even be getting to base temperature."

For years, the fridges were the preserve of academics doing materials science; they were more likely to win a Nobel prize than be part of a computing contract.

"Historically, it's been a lab product. Our customers were ultra-low temperature (ULT) experts; if anything went wrong, they would fix it themselves," says van der Vliet. "Now our customers have moved from being simply academics to being commercial players who need user-friendly systems that are push-button."

While the company declined to break out numbers, Oxford said it has seen a noticeable change in the customer demographic towards commercial quantum computing customers in recent years, but also a change in buying trends. QPU companies are more likely to buy multiple fridges at once, rather than a single unit every few years for an academic research lab.

"The commercial part is growing for sure," adds David Gunnarsson, CTO at Bluefors. The company has expanded factory capacity to almost double production capabilities to meet growing demand.

"There have been more and more attempts to create revenue on quantum computing technology. They are buying our systems to actually deploy or have an application that they think they can create money from. We welcome discussion with data centers so they can understand our technology from the cryogenics perspective."

And while the industry is working towards minimizing form factors as much as possible, for the foreseeable future the industry has settled on essentially brute force supercooling with bigger fridges. Both companies have released new dilution fridges designed for quantum computers.

Smaller fridges (and lower qubit-count systems) may be able to fit into racks, but most larger qubit-count supercooled systems require a much larger footprint than traditional racks. Bluefors' largest Kide system can cool around 1,000 qubits: the system is just under three meters in height and 2.5 meters in diameter, and the floor beneath it needs to be able to take about 7,000 kilograms of weight.
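For a rough sense of the facility requirement (assuming, purely for illustration, that the weight spreads evenly over the fridge's circular footprint):

\[
\frac{7000\ \text{kg}}{\pi \times (1.25\ \text{m})^{2}} \approx 1430\ \text{kg/m}^{2}
\]

That is a figure worth comparing against a data hall's rated floor loading when planning a deployment.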

"It has changed the way we do our product," says Gunnarsson. "They were lab tools before; uptime wasn't discussed much before. Now we are making a lot of changes to our product line to ensure that you can be more certain about what the uptime of your system will be."

Part of the uptime challenge suppliers face around fridges (an area where Gunnarsson notes there is still something of a mismatch) is the warm-up/cool-down cycle of the machines.

While previously the wet bath systems could take a week to get to the required temperatures, the new dry systems might only take a day or two each way. That is important, because cooling down and warming up cycles are effectively downtime, a dirty word when talking about service availability.

"The speed with which you can get to temperature is almost as important as the size of the chip that you can actually chill," says AWS' Moulds. "Today, if you want to change the device's physical silicon, you have got to warm this device up and then chill it back down again; that's a four-day cycle. That's a problem; it means machines are offline for a long time for relatively minor changes."

While this might not be an issue for in-operation machines (Rigetti CTO Rivas says its machines can be in service for months at a time, while Oxford Instruments says an OQC system was in operation non-stop for more than a year), the long warm-up/cool-down cycle is a barrier to rapid testing.

"From a production perspective, the systems remain cold for a relatively long time," says Rivas. "But we're constantly running chips through test systems as we innovate and grow capacity, and 48 hours to cool a chip down is a long time in an overall development cycle."

Oxford Instruments and Bluefors might be the incumbents, but there are a growing number of new players entering the fridge space, some specifically focusing on quantum computing.

"The market has grown for dilution fridges, so there are lots more startups in the space as well making different cooling systems," says van der Vliet. "There are many more players, but the market is growing."

"I think it's really healthy that there's loads of players in the field, particularly new players who are doing things a little bit differently to how we've always done it."

The incumbents are well-placed to continue their lead in the market, but QPU operators are hopeful that competition will result in better products.

"There will be genuine intellectual property that will emerge in this area and you'll definitely start to see custom designs and proprietary systems that can maintain temperature in the face of increasing power."

Atom's Hays notes that, for laser-based quantum systems, the lasers themselves are probably the largest constraint in the supply chain. Like the dilution fridges, these are still largely scientific technologies made by a handful of suppliers.

"We need relatively high-powered lasers that need to be very quiet and very precise," he says. "Ours are off the shelf, but they're semi-custom and the manufacturer builds to order. That means that there's long lead times; in some cases up to a year."

He adds that many of the photonic integrated circuits are still relatively small (the size of nickels and dimes) but hopes they can shrink down to semiconductor size in future to help reduce the footprint.

For now, the quantum industry is still enjoying what might be the autumn of its happy-go-lucky academic days. The next phase may well lead to quantum supremacy and a new phase in high-performance computing, but it will likely lead to a less open industry.

"I think it's nice that the industry is still sort of in that mode," says AWS' Moulds. "The industry is still taking a relatively open approach to the development. We're not yet in the mode of everybody working in their secret bunkers, building secret machines. But history shows that once there's a clear opportunity, there's a risk of the shutters coming down, and it becoming a more cut-throat industry."

"In the end, that's good for customers; it drives down costs and drives up reliability and performance. But it might feel a little bit brutal for some of the academics that are in the industry now."

Read more:

Longer coherence: How the quantum computing industry is maturing - DatacenterDynamics

Written by admin |

March 9th, 2024 at 2:40 am

Posted in Quantum Computing

Tagged with

Quantum Attack Protection Added to HP Business PCs – SecurityWeek

Posted: at 2:40 am


HP announced on Thursday that several of its business PCs now benefit from protection against quantum computer attacks thanks to a new security chip.

The tech giant said the 5th generation of its Endpoint Security Controller (ESC) chip, which is built into some of its computers, can protect the integrity of a device's firmware using quantum-resistant cryptography.

According to HP, the 5th generation ESC is currently available in ZBook Firefly, Power and Studio workstations; EliteBook 1000 series, 800 series and some 600 series notebooks; and some 400 series ProBook notebooks.

"By embedding protection against quantum computer hacks at the chip level, HP is today setting a new standard in hardware and firmware security with our 5th generation ESC chip," HP said. "By isolating the chip from the processor and OS, the ESC provides a hardware platform that reduces the risk of data breaches and improves productivity by preventing downtime."

[ Read: Cyber Insights 2024: Quantum and the Cryptopocalypse ]

While practical quantum computer attacks may still be at least a decade away, major tech companies have already started taking steps to ensure that the cryptography used in their products will be able to provide protection against quantum attacks when that day comes.

Apple, for instance, recently announced adding post-quantum encryption to iMessage to protect communications against quantum computing attacks.

Governments have also started taking steps to tackle the theoretical threats posed by quantum computing before they become a reality.

HP urges businesses to immediately start planning for the future and begin migrating their fleets. The company recommends identifying the highest priority use cases, finding out what technology providers are planning regarding quantum protections, and creating a plan to ensure protection is rolled out in the required timeframe.

Related: AI Helps Crack NIST-Recommended Post-Quantum Encryption Algorithm

Related: In Other News: WEFs Unsurprising Cybersecurity Findings, KyberSlash Cryptography Flaw

Read the original here:

Quantum Attack Protection Added to HP Business PCs - SecurityWeek

Written by admin |

March 9th, 2024 at 2:40 am

Posted in Quantum Computing

Tagged with

Quantum Computing Takes a Giant Leap With Light-Based Processors – SciTechDaily

Posted: at 2:40 am


Researchers have developed a groundbreaking light-based processor that enhances the efficiency and scalability of quantum computing and communication. By minimizing light losses, the processor promises significant advancements in secure data transmission and sensing applications. Credit: SciTechDaily.com

A team of scientists has created a reprogrammable light-based quantum processor, reducing light losses and enabling advancements in quantum computing and secure communications.

Scientists have created a reprogrammable light-based processor, a world-first, that they say could usher in a new era of quantum computing and communication.

Technologies in these emerging fields that operate at the atomic level are already realizing big benefits for drug discovery and other small-scale applications.

In the future, large-scale quantum computers promise to be able to solve complex problems that would be impossible for today's computers.

Lead researcher Professor Alberto Peruzzo from RMIT University in Australia said the team's processor (a photonics device, which uses light particles to carry information) could help enable successful quantum computations by minimizing light losses.

"Our design makes the photonic quantum computer more efficient in terms of light losses, which is critical for being able to keep the computation going," said Peruzzo, who heads the ARC Centre of Excellence for Quantum Computation and Communication Technology (CQC2T) node at RMIT.

"If you lose light, you have to restart the computation."

Other potential advances included improved data transmission capabilities for unhackable communications systems and enhanced sensing applications in environmental monitoring and healthcare, Peruzzo said.

The teams reprogrammable light-based processor. Credit: Will Wright, RMIT University

The team reprogrammed a photonics processor in a range of experiments, achieving a performance equivalent to 2,500 devices, by applying varying voltages. Their results and analysis are published in Nature Communications.

"This innovation could lead to a more compact and scalable platform for quantum photonic processors," Peruzzo said.

Yang Yang, lead author and RMIT PhD scholar, said the device was fully controllable, enabled fast reprogramming with reduced power consumption, and replaced the need for making many tailored devices.

"We experimentally demonstrated different physical dynamics on a single device," he said.

"It's like having a switch to control how particles behave, which is useful for both understanding the quantum world and creating new quantum technologies."

Professor Mirko Lobino from the University of Trento in Italy made the innovative photonic device, using a crystal called lithium niobate, and Professor Yogesh Joglekar from Indiana University-Purdue University Indianapolis in the United States brought his expertise in condensed matter physics.

Lithium niobate has unique optical and electro-optic properties, making it ideal for various applications in optics and photonics.

"My group was involved in the fabrication of the device, which was particularly challenging because we had to miniaturize a large number of electrodes on top of the waveguides to achieve this level of reconfigurability," Lobino said.

"Programmable photonic processors offer a new route to explore a range of phenomena in these devices that will potentially unlock incredible advancements in technology and science," Joglekar said.

Meanwhile, Peruzzo's team has also developed a world-first hybrid system that combines machine learning with modeling to program photonic processors and help control the quantum devices.

Peruzzo said the control of a quantum computer was crucial to ensure the accuracy and efficiency of data processing.

"One of the biggest challenges to the device's output accuracy is noise, which describes the interference in the quantum environment that impacts how qubits perform," he said.

Qubits are the basic units of quantum computing.

"There are a whole range of industries that are developing full-scale quantum computing, but they are still fighting against the errors and inefficiencies caused by noise," Peruzzo said.

Attempts to control qubits typically relied on assumptions about what noise was and what caused it, Peruzzo said.

"Rather than make assumptions, we developed a protocol that uses machine learning to study the noise while also using modelling to predict what the system does in response to the noise," he said.
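As a loose illustration of that hybrid "graybox" idea (a toy sketch with assumed names and a made-up device response, not the team's published method), a known physics model supplies the baseline prediction and a learned correction absorbs what the model misses:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def physics_model(v):
    # Whitebox part: an idealized, assumed response to a control voltage v.
    return np.sin(v)

# Synthetic "measurements": the device deviates from the ideal model
# (a slow drift plus random noise stand in for unmodeled effects).
v = np.linspace(0.0, 2.0 * np.pi, 200)
measured = np.sin(v) + 0.1 * v + 0.01 * rng.standard_normal(v.size)

# Blackbox part: learn the residual the physics model misses.
coeffs = np.polyfit(v, measured - physics_model(v), deg=3)

def graybox_predict(v_new):
    # Prediction = physics model + learned correction.
    return physics_model(v_new) + np.polyval(coeffs, v_new)

# The combined model tracks the measurements far better than physics alone.
print(np.max(np.abs(graybox_predict(v) - measured)))
```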

With the use of the quantum photonic processors, Peruzzo said this hybrid method could help quantum computers perform more precisely and efficiently, impacting how we control quantum devices in the future.

"We believe our new hybrid method has the potential to become the mainstream control approach in quantum computing," Peruzzo said.

Lead author Dr. Akram Youssry, from RMIT, said the results of the newly developed approach showed significant improvement over the traditional methods of modelling and control, and could be applied to other quantum devices beyond photonic processors.

"The method helped us uncover and understand aspects of our devices that are beyond the known physical models of this technology," he said.

"This will help us design even better devices in the future."

This work is published in npj Quantum Information.

Peruzzo said startup companies in quantum computing could be created around his team's photonic device design and quantum control method, which they would continue to study in terms of applications and their full potential.

"Quantum photonics is one of the most promising quantum industries, because the photonics industry and manufacturing infrastructure are very well established," he said.

Quantum machine-learning algorithms have potential advantages over other methods in certain tasks, especially when dealing with large datasets.

"Imagine a world where computers work millions of times faster than they do today, where we can send information securely without any fear of it being intercepted, and where we can solve problems in seconds that would currently take years."

"This isn't just fantasy; it's the potential future powered by quantum technologies, and research like ours is paving the way."

References:

Programmable high-dimensional Hamiltonian in a photonic waveguide array by Yang Yang, Robert J. Chapman, Ben Haylock, Francesco Lenzini, Yogesh N. Joglekar, Mirko Lobino and Alberto Peruzzo, 2 January 2024, Nature Communications. DOI: 10.1038/s41467-023-44185-z

Experimental graybox quantum system identification and control by Akram Youssry, Yang Yang, Robert J. Chapman, Ben Haylock, Francesco Lenzini, Mirko Lobino and Alberto Peruzzo, 13 January 2024, npj Quantum Information. DOI: 10.1038/s41534-023-00795-5

Go here to read the rest:

Quantum Computing Takes a Giant Leap With Light-Based Processors - SciTechDaily

Written by admin |

March 9th, 2024 at 2:40 am

Posted in Quantum Computing

Tagged with

