The D-Wave machine stirred up quite a bit of controversy in the community when it was first announced. The machine essentially attempts to solve an NP-complete optimization problem (MAX-2-SAT) by encoding it in the ground state of a Hamiltonian, and tries to reach that ground state by moving adiabatically to it from the ground state of an efficiently coolable Hamiltonian.
In general, the adiabatic algorithm is not known to find ground states efficiently: the proximity of low-lying levels to the ground state means the transition between Hamiltonians has to be performed slowly, and the speed at which this can occur is governed by the gap between the ground state and the lowest excited levels. It is widely believed within the community, though not proved, that no quantum algorithm can efficiently solve NP-complete problems.
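To make the role of the gap concrete, here is a small sketch (my own toy example, not D-Wave's actual Hamiltonians) that interpolates between a transverse-field "driver" Hamiltonian and a diagonal "problem" Hamiltonian for two qubits, and tracks the gap along the path:

```python
import numpy as np

# Toy 2-qubit example: interpolate from a driver Hamiltonian H0 whose
# ground state is easy to prepare, to a diagonal problem Hamiltonian HP
# whose ground state encodes the answer.
X = np.array([[0.0, 1.0], [1.0, 0.0]])
I = np.eye(2)

H0 = -(np.kron(X, I) + np.kron(I, X))   # ground state: uniform superposition
HP = np.diag([3.0, 1.0, 2.0, 0.0])      # ground state: |11>, energy 0 (made-up costs)

def spectral_gap(s):
    """Gap between the two lowest levels of H(s) = (1-s)*H0 + s*HP."""
    evals = np.linalg.eigvalsh((1 - s) * H0 + s * HP)
    return evals[1] - evals[0]

# The minimum gap along the path controls how slowly we must interpolate:
# the adiabatic run time scales inversely with (a power of) this minimum.
gaps = [spectral_gap(s) for s in np.linspace(0, 1, 101)]
print(min(gaps))
```

The interesting quantity is `min(gaps)`: for hard problem instances it can shrink exponentially with system size, which is exactly why the adiabatic algorithm is not known to be efficient.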
In general, the ground state of a Hamiltonian can encode a wider variety of problems than NP (known as QMA-complete problems), and so the decision to focus on NP optimization problems has led to restrictions which prevent the device from being used for general-purpose quantum computing (even if noise were not an issue). Thus you can't run Shor's algorithm on the device. Further, you *can* factor any number that you could fit on a 128-qubit device by classical means: the general number field sieve puts 128-bit numbers within reach of modern personal computers.
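As a small illustration of how easily classical methods handle modest semiprimes (Pollard's rho here, far simpler than the general number field sieve mentioned above; the primes are just example values):

```python
import math
import random

def pollard_rho(n):
    """Pollard's rho: find a nontrivial factor of a composite n.
    Runs in roughly O(p**0.5) steps, p the smallest prime factor."""
    if n % 2 == 0:
        return 2
    while True:
        x = y = random.randrange(2, n)
        c = random.randrange(1, n)
        d = 1
        while d == 1:
            x = (x * x + c) % n          # tortoise: one step
            y = (y * y + c) % n          # hare: two steps
            y = (y * y + c) % n
            d = math.gcd(abs(x - y), n)
        if d != n:                        # on failure, retry with a new c
            return d

# A ~62-bit semiprime built from two known 31-bit primes:
p, q = 2147483647, 2147483629
n = p * q
f = pollard_rho(n)
print(f, n // f)
```

This runs in well under a second on a laptop; scaling up to true 128-bit inputs is where sieve-based methods like the GNFS take over, but the point stands that such sizes are classically tractable.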
Noise is a real issue with D-Wave's device, and although they have published a number of technical papers playing down the issue and trying to demonstrate quantum effects, the coherence times for the individual qubits are much shorter than the time scale of the algorithm. Thus the common view within the community seems to be that this is basically an expensive special-purpose classical computer.
There is an interesting subtlety as regards noise: if you add noise to the adiabatic algorithm, it degrades gracefully into one of the best classical algorithms for the same problem. Thus you can obtain the same result either way, and the only difference is in the asymptotics for large systems (which are obviously not observable). So even if such a device produces a valid answer for every problem you throw at it, that is not enough information to determine whether you are actually performing quantum computation.
Let me add that the adiabatic model can encode universal quantum computation; however, the limitations of D-Wave's implementation mean that this specific machine cannot.
I'm no expert on quantum computing, but I had the same question, and found the following to help a little:
http://tph.tuwien.ac.at/~oemer/doc/quprog/node8.html#SECTION00323300000000000000
The gist of it is: first you 'cool' your qubit register to the zero state, and then apply a unitary transformation with the result $U|0\rangle = |S\rangle$, where $|S\rangle$ is your initial string.
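One way to see that such a $U$ always exists (this is my own sketch, not the construction from the linked page): any unitary whose first column is the target state maps $|0\rangle$ to it, and QR decomposition gives you one.

```python
import numpy as np

# Build a unitary U whose first column is the desired state |S>,
# so that U|0...0> = |S>. Example: |S> = |101> on 3 qubits.
n = 3
target = np.zeros(2**n)
target[0b101] = 1.0

# QR on a random full-rank matrix whose first column is `target`
# yields an orthonormal Q with first column +/- target.
M = np.random.default_rng(0).normal(size=(2**n, 2**n))
M[:, 0] = target
U, _ = np.linalg.qr(M)
U = U * np.sign(np.dot(U[:, 0], target))  # fix the overall sign

zero = np.zeros(2**n)
zero[0] = 1.0                             # the |000> state
print(np.allclose(U @ zero, target))      # U|000> equals |101>
```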
Now, as far as I understand, $|S\rangle$ is really a recording of the probabilities of all possible bit strings, so I would assume that your initial $|S\rangle$ value would be something with:
A high probability for your desired, classical string: $$0.9 \cdot |0101010111\rangle$$
and a very low probability for all other possibilities: $$\frac{0.1}{2^{10}-1} \cdot |\text{others}\rangle$$
I really don't know if this is a valid assumption to make, though, so please, tell me if I'm wrong.
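One small correction that might help here (a sketch of the standard convention, not something from the linked page): the coefficients in a ket are *amplitudes*, and the squared magnitudes of the amplitudes are the probabilities, which must sum to 1. So a 10-qubit state that yields `0101010111` with probability 0.9 carries amplitude $\sqrt{0.9}$ on that basis state:

```python
import numpy as np

# Spread probability 0.1 evenly over the 2**10 - 1 "other" strings,
# and put probability 0.9 on the desired string 0101010111.
n = 10
state = np.full(2**n, np.sqrt(0.1 / (2**n - 1)))
state[0b0101010111] = np.sqrt(0.9)

probs = np.abs(state) ** 2        # Born rule: probability = |amplitude|^2
print(probs.sum())                 # 1.0, as normalization requires
print(probs[0b0101010111])         # 0.9
```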
Best Answer
No, it is not true. A quantum computer can represent the same $2^n$ states that a classical computer can. The difference is that the quantum computer stores a linear superposition of those states, whereas the classical computer can only store one of those states at a time. What you refer to as '0 & 1' qubits are actually linear superpositions of the two basis states 0 and 1.
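A minimal sketch of this distinction (my own illustration): a classical $n$-bit register holds one of $2^n$ strings, while an $n$-qubit state is described by $2^n$ complex amplitudes, one per string.

```python
import numpy as np

# State vector for n qubits: one complex amplitude per basis string.
n = 3
psi = np.zeros(2**n, dtype=complex)

# An equal superposition of |000> and |111> -- neither string alone,
# but a linear combination of both.
psi[0b000] = 1 / np.sqrt(2)
psi[0b111] = 1 / np.sqrt(2)

# Likewise, a single qubit "between 0 and 1" is a superposition
# a|0> + b|1> with |a|**2 + |b|**2 = 1, not a third classical value.
print(np.abs(psi) ** 2)   # probability 0.5 each for 000 and 111
```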