Hurwitz & Associates - Insight is Action

The Bozman Blog

Jean is a senior industry analyst focusing her research on server technology, storage technology, database software and the emerging market for Software Defined Infrastructure (SDI).

Posted by Jean S. Bozman in Cloud Computing

IBM Quantum Computing Jumps to Commercial Use Via Cloud

IBM’s quantum computing technology, developed over decades, is ready for commercialization. It is a fundamentally different approach to computing from the one used in today’s systems – and, as such, represents a watershed in computing history.

Quantum computing is a new approach to the way computing itself is done, allowing scientists and researchers to model the complexities inherent in natural phenomena and financial markets. It differs from Big Data analytics, which finds patterns in vast amounts of existing data; instead, it will generate new types of data characterizing phenomena that could not be quantified before.

What began deep in the IBM research labs in New York and Zurich is now ready to provide computing services, via the IBM Bluemix cloud.

On March 6, 2017, IBM announced its initiative to build commercially available quantum computing systems.

  • The IBM Q quantum systems and services will be delivered via the IBM Cloud platform. The core computing will be done on “qubits” (quantum bits), the basic units of quantum computation. Qubits can be orchestrated to work together; up to now, five qubits have been available to early users, and more will become available in 2017.
  • IBM is releasing a new API (application programming interface) for IBM quantum computing, which will allow developers to program in widely used languages, such as Python. The resulting code will be able to access the quantum computing resources, housed in the data center of IBM’s Yorktown Heights, N.Y., research laboratory.
  • IBM is also releasing an upgraded simulator that can model circuits with up to 20 qubits. Later this year, IBM plans to release a full software development kit (SDK) on the IBM Quantum Experience that will allow programmers and users to build simple quantum applications (a brief Python sketch of this programming model follows this list).
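
IBM did not publish sample code with the announcement, but a few lines of Python give a feel for the programming model. The sketch below is a minimal, illustrative example written against the open-source Qiskit library and its local simulator as stand-ins for the cloud-hosted hardware; the package names and calls are assumptions about a present-day toolchain, not the specific API IBM announced.

# Minimal sketch: a one-qubit "coin flip" program, assuming the open-source
# qiskit and qiskit-aer packages are installed (pip install qiskit qiskit-aer).
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

qc = QuantumCircuit(1)   # a circuit with a single qubit
qc.h(0)                  # a Hadamard gate puts the qubit into superposition
qc.measure_all()         # measurement collapses it to a classical 0 or 1

sim = AerSimulator()     # local simulator standing in for cloud-hosted hardware
counts = sim.run(qc, shots=1024).result().get_counts()
print(counts)            # roughly half '0' and half '1', e.g. {'0': 518, '1': 506}

Run over many “shots,” the same tiny program yields a distribution of outcomes rather than a single deterministic answer – which is exactly the probabilistic flavor of computing described in the sections below.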

 

Quantum Computing in Brief

Quantum computing is designed to generate data based on the physics of uncertainty. Many natural phenomena, such as the structure of molecules and medicines, can be better understood by analyzing thousands, or even millions, of possible outcomes.

But the sheer scale of the work extends beyond the reach of the classical computing used in today’s data centers and by Cloud Service Providers (CSPs).
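
A rough way to see why: describing the joint state of n qubits on a classical machine means tracking 2^n complex amplitudes, a number that doubles with every qubit added. The short Python sketch below is purely illustrative (not from IBM) and simply tallies that growth.

# Illustration only: the number of classical values needed to describe the
# joint state of n qubits doubles with every qubit added.
for n in (5, 20, 50):
    print(f"{n} qubits -> {2 ** n:,} amplitudes to track simultaneously")

At 20 qubits – the limit of the upgraded simulator in IBM’s announcement – that is about a million amplitudes, comfortably within reach of an ordinary server; by 50 qubits it is more than a quadrillion, which is where classical simulation runs out of road and real quantum hardware becomes interesting.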

Unlike IBM Watson, which focuses on Big Data and analytics, quantum technology seeks to bring insights based on what is “not” known, rather than finding patterns in known data. Examples include learning more about chemical bonding and molecules, creating new cryptography algorithms, and advancing machine learning. This is done through an approach called “entanglement,” which explores and orchestrates large numbers of potential outcomes – and by moving the resulting data over new types of high-speed communications links.
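
To make entanglement a bit more concrete, here is a minimal sketch of the textbook two-qubit “Bell state,” again written against the open-source Qiskit library as a stand-in for IBM’s own tooling (an assumption, not part of the announcement). Once the two qubits are entangled, measuring them always yields matching results, even though neither qubit has a definite value beforehand.

# Minimal sketch of two-qubit entanglement (a Bell state), assuming the
# open-source qiskit and qiskit-aer packages; not IBM's 2017 API itself.
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

qc = QuantumCircuit(2)
qc.h(0)          # put the first qubit into superposition
qc.cx(0, 1)      # a CNOT gate entangles it with the second qubit
qc.measure_all()

counts = AerSimulator().run(qc, shots=1024).result().get_counts()
print(counts)    # only '00' and '11' appear: the two qubits are perfectly correlated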

 

How It Works

Based on a technology that requires super-cooling to less than one kelvin (just above absolute zero), IBM’s quantum computing marries five key elements: a new type of processor built with superconducting circuits on silicon; on-chip nanotechnology; deep-cooling containers that house the computer; programming with microwave pulses; and delivery via the cloud to end users.

The reason for the super-cooling is that quantum computing works with “quantum states” that are ever-changing in the superconducting materials – making it impossible to pinpoint a given state as a definite computer “1” or “0.” By exploiting the extremely small gaps that electrical pulses cross on the super-cooled chip, quantum computing finds the probability associated with each of the possible “states” of the data – even though, at any one moment, the actual states of that data are in constant flux.
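
A small illustration of that probability-centric view, again using the open-source Qiskit library (an assumption about tooling, not part of IBM’s hardware description): rather than reading out a single 1 or 0, the program below inspects the likelihood the quantum state assigns to each classical outcome.

# Minimal sketch: the probabilities a quantum state assigns to each classical
# outcome before any measurement is made (exact, simulated statevector).
import math
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(1)
qc.ry(math.pi / 3, 0)     # rotate the qubit partway between 0 and 1

state = Statevector.from_instruction(qc)
print(state.probabilities_dict())   # approximately {'0': 0.75, '1': 0.25}

The state itself is neither 0 nor 1; what the mathematics pins down is the likelihood of each answer if the qubit were measured.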

One quick example: it is impossible to find the exact position of every electron inside a specific molecule, which prevents scientists from enumerating all the possible combinations of chemical bonds inside that molecule. This matters in medicine and pharmacology, where the quantum approach could extend the molecule-folding analysis that is widely used today in biotechnology. As a result, new medical treatments may emerge, new approaches to drug development may be created, and many more scenarios for research and exploration may open up as access to quantum computing capabilities widens.

 

A Bit of History

Only the core concept of quantum computing existed in 1981, when Nobel laureate Richard Feynman, the famed physicist, spoke at the Physics of Computation Conference, hosted by M.I.T. and IBM. (Feynman is best known for his physics work, and for demonstrating the design fault in the Space Shuttle’s O-rings that caused the 1986 Challenger explosion.) During his 1981 presentation, Feynman challenged computer scientists to develop computers based on quantum physics.

By the late 1980s, this led to the creation of Josephson-junction computers, which worked in prototype super-cooled enclosures but proved impractical for the enterprise data centers of that era. Some had considered deploying them in deep space – but even space proved to be “not cold enough” to achieve the quantum computing effects. Progress in quantum research nevertheless continued through the 1990s and early 2000s, on programming code for quantum computers, on deep cooling within physical data-center containers, and on scaling up the quantum analysis.

Key developments in computer science itself have paved the way for the IBM quantum computing initiative. Stepping stones along the way included developing and working with qubits, getting them to work together through entanglement to compare computing states, and improving the code that lets quantum computers interface with classical von Neumann computers based on 1s and 0s.

In quantum computing, tiny superconducting Josephson junctions (electrical gaps), operating in extremely low-temperature containers, allow the machine to explore multiple possible outcomes in fields such as chemistry, astronomy and finance.

The advent of the cloud allows quantum computing to take place in special environments – super-cold, isolated, physical containers – while supporting access by remote users worldwide. This model for accessibility changed the calculus for bringing quantum to the marketplace.

Now, the IBM Quantum Experience, as IBM is calling it, is more than an experiment or a prototype. It is discoverable as a new resource on the IBM Cloud, and it is already being used by a select group of commercial customers, academic users and IBM researchers.

 

Why Should Customers Care?

Certain classes of customers are likely to move into quantum computing analysis early on. Areas of interest include new ways to model financial market data; discovery of new medicines and materials; optimization of supply chains and logistics; improved cloud security; and advances in machine learning.

This next stage on the path to quantum computing will see collaborative projects involving programmers and developers, university researchers and computer scientists. IBM intends to build out an ecosystem around quantum computing. A number of researchers, including those at M.I.T., the University of Waterloo in Ontario, Canada, and the European Physical Society in Zurich, Switzerland, are already working with IBM on quantum computing. So far, 40,000 users have run more than 275,000 experiments on the IBM quantum computing resource, and IBM expects to expand the program throughout 2017.

In addition, there is an IBM Research Frontiers Institute, a consortium that looks at the business impact of new computing technologies. High-profile commercial companies that are founding members of the institute include Canon, Hitachi Metals, Honda, and Samsung – and IBM is asking other organizations to join as members.

 

Quantum Computing’s Future

The time is right for quantum computing – a new way to explore endless permutations of data about possible outcomes. It requires a different kind of technology – not the 1s and 0s of classical computing – and it has taken decades to mature to the point where it is both accessible and programmable. These are still “early days” for quantum computing, but IBM’s moves to commercialize the technology are positive ones, now involving a wider group of partners in a new and evolving ecosystem.

 

 

 

Jean S. Bozman is a seasoned industry analyst with more than 20 years of experience as a consultant and analyst focused on the worldwide IT markets for databases, servers, storage and software.
