Quantum Winter, a Reality or Just a Myth…?

This blog examines a question that has become a focal point in recent conversations around quantum computing and quantum-inspired solutions: by revisiting the AI winters, can we tell whether quantum is heading down the same path? Let us first understand why the AI winters started, what caused them, and how they ended.

The AI Saga

AI’s journey is far longer than most of us imagine. In 1763, Thomas Bayes’ work on event probability (published posthumously) laid a framework for reasoning under uncertainty that machines would later build on. In 1843, Ada Lovelace, working with Charles Babbage on his Analytical Engine, published what is widely regarded as the first computer algorithm, forming part of the foundation of AI and automata theory. Progress then slowed, but nearly a century later, the World Wars pushed scientists to create smarter machines that could take the place of humans. In 1936, mathematician Alan Turing published the paper “On Computable Numbers”, and during the war he went on to design, with Gordon Welchman, the Bombe machine that helped break the Enigma cipher. Scientists during this period also tried to model artificial neurons through which machines could think and reason: in 1943, Warren S. McCulloch and Walter Pitts’ paper “A Logical Calculus of the Ideas Immanent in Nervous Activity” proposed artificial neurons that could imitate the neurons of the brain. In 1955, “Logic Theorist”, widely considered the first artificial intelligence program, was created by Allen Newell and Herbert A. Simon. In 1956, the term ‘AI’ was coined at the Dartmouth conference, and in 1959, ‘machine learning’ was coined by Arthur Samuel for programs that could learn from historical data. Despite the huge momentum behind it, AI could not live up to the high expectations, as the algorithms of the day were inefficient and produced modest results. Between 1964 and 1966, Joseph Weizenbaum at MIT developed ELIZA, an early natural-language-processing program, and in 1972, Harold Cohen began writing AARON, a program through which a computer could autonomously create pictures, like an artist painting from imagination. Across the world in Japan, the AI-powered humanoid robot WABOT-1 was completed.
Then followed a period in which all this progress failed to produce fruitful results on real problems. Investment in AI research dried up, and no major company extended a helping hand. There were minor discoveries and improvements in computer science, but nothing major happened in AI, so progress stalled: the first AI winter. In 1980, work began on WABOT-2, a robot that could converse with humans, read a musical score and play music. A ray of hope shimmered on the horizon, and interest gradually returned in the 1980s, even though there is no specific date on which the first AI winter ended. In 1986, Mercedes-Benz demonstrated a driverless van, a huge milestone, but due to factors such as the technology readiness level of autonomous vehicles and the limitations of image-processing hardware at the time, the technology could not deliver results. Subsequently, another AI winter approached, as no major inventions or breakthroughs followed. The regression models and neural networks of the day were crude and could not solve complex problems. Eventually, the rise of the backpropagation algorithm revolutionised the field: underperforming models and neural networks suddenly started giving better results, leading many large corporations and startups to adopt the technology and marking the end of the second AI winter.
In 1995, Richard Wallace developed A.L.I.C.E., a chatbot that could converse in natural language. In 1997, world chess champion Garry Kasparov was defeated by IBM’s Deep Blue. 2002 brought the AI-powered autonomous vacuum-cleaning robot Roomba. In 2008, the Google app began to recognise spoken words, and by 2009 Google had initiated its self-driving car project. In 2014, Tesla introduced its Autopilot system, and the same year Amazon’s Alexa was born.

Quantum Computing

Before quantum computers, in the mid-to-late ’80s and early ’90s, most computer engineers and scientists believed that binary/classical computing was all there was. Everyone was sure that classical computers (CCs) should be able to compute everything that is computable in the universe, from basic arithmetic to rocket science, and Moore’s law, popularised around then, predicted steady advances in their computational power. CCs cannot necessarily do everything efficiently, though. Say you wanted to understand something like the origin of the universe, or study the behaviour of atoms and molecules. When you enter the realm of atomic and subatomic particles that form the basic building blocks of everything, even classical computers begin to falter, because this realm is ruled by quantum mechanics and its phenomena: entanglement, superposition, coherence, and so on. Classically calculating these entangled states becomes a nightmare of exponentially increasing complexity. A quantum computer deals with the particles under study by superposing and entangling its quantum bits (qubits), which lets it process extraordinary amounts of information. Describing the state of n entangled qubits classically requires tracking 2^n values: 1 qubit corresponds to 2, 2 qubits to 4, 3 qubits to 8, and so it grows exponentially. For instance, just 40 entangled qubits model a problem state that would take roughly 1 trillion classical values to represent. If quantum computers are so powerful, why is there a debate about a quantum winter? It might be due to the facts below:

  • The current generation quantum computers are noisy and inaccurate.
  • The current generation’s noisy quantum computers get outperformed by supercomputers, simulators, and optimizers.
  • Error correction in quantum computing is a big challenge.
  • The scalability of quantum computers is limited by current technology. QPU-powered laptops are still decades away.
  • At the very least, generating a qubit and holding it in a stable state is a big challenge.
  • On top of the above challenge, quantum computers need to maintain the state of entanglement, coherence, and other necessary properties of qubits for calculations which proves to be a Herculean task.
  • Some quantum computers need a cryogenic apparatus to maintain their temperature just above absolute zero. With the current scenario of power outages and energy crises, this would not be very feasible.
  • Even if it were feasible, the most challenging hurdle is that we cannot checkpoint a computation: we cannot store the system state mid-calculation, switch off, move away to grab a cup of coffee, then come back and resume. Because of the no-cloning theorem and decoherence, we would have to start all over again. Additionally, these calculations can take hours or days to complete.
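The exponential growth described earlier can be sketched in a few lines of Python. This is a minimal illustration of the 2^n scaling only, not a statement about any particular quantum hardware:

```python
# Minimal sketch: classical cost of describing an n-qubit state.
# An n-qubit register is described by 2**n complex amplitudes,
# so the classical description doubles with every qubit added.

def amplitudes(n_qubits: int) -> int:
    """Number of complex amplitudes in an n-qubit state vector."""
    return 2 ** n_qubits

for n in (1, 2, 3, 10, 40):
    print(f"{n:>2} qubits -> {amplitudes(n):,} amplitudes")
# 40 qubits already correspond to about 1.1 trillion amplitudes,
# in line with the ~1 trillion figure mentioned above.
```

This doubling with every added qubit is exactly why simulating even modestly sized quantum systems overwhelms classical machines.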

Several other reasons are under debate. Some feel these are the barriers that must be overcome before quantum computers can be commercialised, and that until then we should drop the hype around quantum and focus on real-world problems. The counter-argument is that if we do not allow widespread awareness of quantum technology, or any technology, there will be no progress. Sharing knowledge and continuous efforts at advancement can empower minds across the globe to collaborate and elevate the technology landscape to its fullest potential. If we say no to every technology below a TRL (Technology Readiness Level) threshold, progress will stall not only in that field but for science and humanity as a whole.


The field of quantum computing has had a lot of breakthroughs in the last 2-3 years. Even though the field is more than a decade old, quantum computers have not yet made the massive impact everyone expected. Many companies, including Google, Microsoft, IBM, and an Indian company called “Quanfluence”, are trying to build better quantum computers.

  • Google claims its quantum computer solved an ‘infeasible’ problem in 200 seconds that would take a regular computer 10,000 years to solve.
  • The Global Quantum Technology Market Report 2020 projects that this market will reach $21.6 billion by 2025.
  • Global investment in quantum technology has already crossed $30 billion.

The current generation of quantum computers is just the first generation; imagine what they will be able to do in a few years. In short, quantum is not some technology of the future: it is available today and is already being used by large corporations to solve highly complex, previously unsolvable problems. Since it is difficult to cover all the points in this blog, a detailed PoV on this topic has been published separately. Many real-world quantum solutions developed by the Infosys Quantum Center of Excellence are available with us, and a few other solutions are offered with the help of our ecosystem of partners, quantum cloud service providers, and others.

Major Challenges

Some of the major challenges for companies across the globe are optimization, new project strategy evaluation, reducing project investment, forecasting/simulating all scenarios, molecular simulation, security & cryptography, risk mitigation, integration of smart solutions like AI/ML, etc.


I believe that “the rate of discovery is directly proportional to the size of the object under study”, and you can quote me on that if someone has not already said it. For that reason alone, slow progress in quantum need not be called a ‘winter’. Quantum computing, while massively powerful, still has a lot of wrinkles to iron out. The term ‘quantum winter’ is borrowed from the AI winter, and the field is indeed facing some criticism and doubt; hence some say that a quantum winter is coming. No one has experienced a quantum winter yet, which is why everyone is speculating about it. A ‘winter’ in any field generally refers to a period with no interest in that space, no investment, no major breakthroughs, no major players adopting the technology, and no universities or schools teaching it. For the AI winters, accelerating factors may have included the absence of social media, fast communication, the internet, cloud computing, and so on, but is the same true for quantum? Are the scales balanced enough for us to compare quantum with the AI winter or the nuclear winter and conclusively say a quantum winter is coming? Since comparable data was not available for AI in its day, an apples-to-apples comparison is not possible. For a second, let us imagine: if a now-proven technology such as AI suffered two winters, what would be the case for quantum, which is not even proven yet? Are the recent advances in the field enough to forestall a quantum winter? Can the current drawbacks of quantum computers truly be overcome? Major challenges like scalability and quality are still hurdles. Can the quantum field provide enough proof that it will not suffer a winter phase, and are there enough testimonials to conclude that this space has enough breakthroughs, investment, resources, and education/study materials for a sustained future?


For more details, you can reach out to us at quantumcomputing@infosys.com

Co-Contributors:  Dr. Vijayaraghavan Varadharajan & Aseem Rajvanshi

Author Details

Adarsh V Katagihallimath

Adarsh enjoys his role as Senior Associate Consultant with Infosys Centre for Emerging Technologies. He is part of the Quantum Centre of Excellence and works on tackling classically unsolvable problems with Quantum computing, Quantum inspired solutions and Artificial Intelligence.
