

If noisy intermediate-scale quantum (NISQ) era quantum computers are to perform useful tasks, they will need to employ powerful error mitigation techniques. Quasiprobability methods can permit perfect error compensation at the cost of additional circuit executions, provided that the nature of the error model is fully understood and sufficiently local both spatially and temporally. Unfortunately, these conditions are challenging to satisfy. Here we present a method by which the proper compensation strategy can instead be learned ab initio. Our training process uses multiple variants of the primary circuit in which all non-Clifford gates are substituted with gates that are efficient to simulate classically. The process yields a configuration that is near optimal versus the noise in the real system with its non-Clifford gate set. Having presented a range of learning strategies, we demonstrate the power of the technique both with real quantum hardware (IBM devices) and with exactly emulated imperfect quantum computers. The systems suffer a range of noise severities and types, including spatially and temporally correlated variants. In all cases the protocol successfully adapts to the noise and mitigates it to a high degree. (A standard formulation of the quasiprobability method, and one way to phrase the learning step, are sketched in the notes at the end of this section.)

Quantum computers are "noisy": errors sneak in whenever more than a few qubits perform a calculation of any significant complexity. For near-term devices, we can try to mitigate errors, minimizing their impact so that the output is still useful. The most powerful error mitigation processes require the user to know the exact nature of the noise the computer suffers from, and that knowledge is hard to get; the task of learning all about the noise in the system can itself become an insurmountable problem. When machine learning is used to, for example, recognize handwritten postal codes, it does so without ever having to explicitly describe the countless variations of human handwriting. In the present paper we find that we can likewise automatically learn the key features of quantum noise, provided we have good training examples. We take the circuit we would like to run on the quantum device and create many training variants with the special property that they can be evaluated on a conventional computer, so that we know what the "right" output is, and we prove that the lessons learned on the simplified circuits will carry over to the real one. To check, we use high-performance conventional computers to "pretend" to be quantum computers with various types of noise, and then successfully apply our learning technique. Our approach can provide a new pathway toward the milestone of "quantum advantage," the moment when quantum machines will do something really useful.

Quantum computing algorithms that solve certain mathematical problems more efficiently than classical computing can assist with optimization and cryptography. Microsoft is taking a more challenging, but ultimately more promising, approach to scaled quantum computing with topological qubits, which are theorized to be inherently more stable than qubits produced with existing methods without sacrificing size or speed.

I am trying to run a simple "Hello World" project in Q#, following this tutorial. I have completed the following steps (step numbers are from the linked tutorial):

Step 1: Installed the latest version of Visual Studio 2019 (16.11.4 Preview 1.0).
Step 2: Installed the Microsoft Quantum Development Kit.
Step 3: Created a Q# project using the Q# Application Template.
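As a sanity check for Step 3, a minimal Q# entry point of the kind the Q# Application Template generates looks like the following (the namespace and operation names here are illustrative, not necessarily what the template emits):

```qsharp
namespace HelloQuantum {

    open Microsoft.Quantum.Intrinsic;

    // Prints a classical message; no qubits are allocated yet.
    @EntryPoint()
    operation SayHello() : Unit {
        Message("Hello quantum world!");
    }
}
```

Running the project (Ctrl+F5 in Visual Studio, or `dotnet run` from the project folder) should print the message on the default simulator.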
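A note on the quasiprobability method mentioned earlier (this is the standard formulation of the technique, not anything specific to this paper): the idea is to write the inverse of the noise channel $\mathcal{E}$ as a quasiprobability combination of operations $\mathcal{B}_i$ that the hardware can implement,

$$ \mathcal{E}^{-1} = \sum_i q_i\,\mathcal{B}_i, \qquad \gamma = \sum_i |q_i| \ge 1. $$

Each run applies a randomly chosen $\mathcal{B}_i$ with probability $|q_i|/\gamma$ and weights the measured outcome by $\gamma\,\operatorname{sgn}(q_i)$. The resulting estimator is unbiased, but its variance is amplified by roughly a factor of $\gamma^2$; that is the "cost of additional circuit executions," and writing down the $q_i$ at all requires detailed knowledge of the error model, which is exactly what the learning approach aims to supply.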
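As for the learning step, one plausible way to phrase the training (a sketch under assumptions; the paper's actual objective may differ) is to choose the compensation coefficients $q$ so that the mitigated noisy expectation values match the classically computed ideal values across the Clifford training variants $C_1, \dots, C_T$:

$$ \hat{q} = \operatorname*{arg\,min}_{q} \sum_{t=1}^{T} \left( E^{\mathrm{noisy}}_{q}(C_t) - E^{\mathrm{ideal}}(C_t) \right)^2, $$

where $E^{\mathrm{noisy}}_{q}(C_t)$ is the mitigated estimate obtained from the device and $E^{\mathrm{ideal}}(C_t)$ is the reference value, computable efficiently because the variants contain only classically simulable gates. The learned $\hat{q}$ is then carried over to the primary circuit with its non-Clifford gates restored.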
