Rebecca Krauthamer

The Next Ten Years in Quantum Computing and Data Science

In 1981, Richard Feynman proposed a theoretical device called a “quantum computer” that would leverage the laws of quantum physics to achieve computational speed-ups over classical methods. Forty years later, that theoretical device is a reality. Researchers have developed quantum algorithms and built the small-scale quantum computers that run them. A path to scalability has also been demonstrated, meaning that over the coming decade we are likely to see powerful quantum computers emerge that can carry out specific types of calculations 100,000 to 1,000,000 times faster than even today’s supercomputers.

As Moore’s law reaches its limit, quantum computing is poised to overcome the processing-power limitations of classical computing hardware, both current and future. Quantum computing promises to revolutionize how and what we compute, and will demand an equally radical new way of thinking about data. In this talk, I will demystify the current state of quantum computing, discuss its implications for data science and machine learning, and offer industry insight into the use cases and pilot projects companies are actively pursuing.