Wednesday, June 27, 2018

Quantum computing

Quantum Computing Expert Explains One Concept in 5 Levels of Difficulty. I still don't get it, but I don't get how regular computers work either. Bits?
Comments
It was a pretty good explanation. I did get the impression that they conflated quantum computing concepts with standard or classical computing concepts, perhaps to make it easier to understand, or even because they don't see them as separate concepts. What they are really doing is building a hybrid computer with quantum operations and classical digital operations. I don't know whether that is because they haven't advanced enough to build a purely quantum model, or whether a purely quantum model is not possible to build.
I once worked on a computer that used compressed air instead of electronics. The reason was that it would give off no radiation signature, since it was not electronic, and it was immune to both radio jamming and EMP interference. It was a concept model and I doubt it was ever put into production.

Bits are simply the smallest common denominator, or perhaps more accurately the smallest number/value, i.e. a 1 or a 0, typically represented by a tiny pulse of electricity, a magnetic condition, or even in some cases a sound or lack of sound. It is as basic as you can get: it is either off or on, or you could say it is there or it is not there. What makes the difference internally in a computer between data (represented with bits) and process (also represented in bits) is timing. The data sitting there waiting to be accessed is simply a very long string of bits, but when used in a process, such as actual computing, it is a series of precisely timed operations which result in new/different data. The more complex and useful aspect of the computer is the process, and the speed at which it can be done OR the multiplicity with which it can be done (parallel processing) determines the power of the computer. We have kinda peaked with speed, and the more parallel circuits you introduce, the more complex the hardware becomes and the more critical the timing becomes. So we are up against limits, which is why they are exploring quantum computing. My gut tells me that the barriers or hurdles they must overcome are going to be too great to make it practical. That does not mean it won't work or that they won't build them, but rather that it will never live up to expectations such that it is a whole "quantum" leap above our current architecture.

Bits aren't very weird. Even before digital computers became so economical and ubiquitous, many modern people were already familiar with things that systematically break down arbitrarily complex information into yes and no, like the game of twenty questions, or similar dichotomies in the operation of an abacus or in telegraphic Morse code. Then with some engineering cleverness, so that the value of one bit automatically triggers some kind of change in connected bits, and some systematic thought, it's not hard --- easily accessible to a bright high schooler --- to get to things like a circuit for automatically adding two numbers together (see the sketch below), and not that hard --- accessible to at least a fair number of high schoolers --- to implement a computer that's (a finite but huge version of) Turing's (elegantly but impractically infinite) abstraction of a universal computing machine, and after that it can all be software.
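For concreteness, here's a minimal Python sketch of that "circuit for adding two numbers" idea: an adder wired up from nothing but simple logic gates acting on 0/1 bits. The gate and function names are illustrative only, not a description of any real hardware.

```python
def xor(a, b):   # 1 when exactly one input is 1
    return a ^ b

def and_(a, b):  # 1 when both inputs are 1
    return a & b

def or_(a, b):   # 1 when at least one input is 1
    return a | b

def full_adder(a, b, carry_in):
    """Add two bits plus a carry bit; return (sum_bit, carry_out)."""
    partial = xor(a, b)
    sum_bit = xor(partial, carry_in)
    carry_out = or_(and_(a, b), and_(partial, carry_in))
    return sum_bit, carry_out

def ripple_add(x_bits, y_bits):
    """Add two equal-length bit lists, least-significant bit first."""
    result, carry = [], 0
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    result.append(carry)
    return result

# 6 + 3 = 9: 6 is [0, 1, 1] and 3 is [1, 1, 0], least-significant bit first.
print(ripple_add([0, 1, 1], [1, 1, 0]))  # [1, 0, 0, 1], i.e. 9
```

Real hardware builds the same logic out of transistors and clocks the timing, but the structure is exactly this: bits, a handful of gate types, repeated.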
Qubits are weirder than that, with correlation issues that have no analogues in day-to-day life, so instead of the first step up the abstraction ladder being something accessible like an abacus toy you can hang in a crib to interest an infant, even the first step is a doozy. And then, for good measure, because life isn't fair, the later steps tend to be harder too.

I get that classical computers deal with information expressed as combinations of simple yes-no decisions: 0 or 1, for instance. Older analog devices tended to represent information on gradual scales with basically infinite variations between 0 and 1. Is the idea that quantum computers can do so, too? For instance, there's at least 0, 1, and [in-between]. Similarly, I gather our brain's nerve impulses are not just go or no-go between each two nodes but also "how strong." And then entanglement: does that substitute for the connecting "wire" between the signal and the receiver?
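On the "in-between" question: a qubit is not an analog dial between 0 and 1. One common way to put it, sketched below in Python with illustrative names rather than any particular quantum library, is that a qubit's state is a pair of complex amplitudes, and the "in-between" lives in those amplitudes; a measurement still always returns a definite 0 or 1, with probabilities given by the squared magnitudes.

```python
import numpy as np

rng = np.random.default_rng(0)

# A single qubit state: two complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1.
state = np.array([1.0, 1.0j]) / np.sqrt(2)  # equal superposition, with a relative phase

p0 = abs(state[0]) ** 2  # probability of reading 0
p1 = abs(state[1]) ** 2  # probability of reading 1
print(p0, p1)            # 0.5 0.5

# Ten simulated measurements: each yields a definite 0 or 1, never an
# in-between value; the continuum is in the amplitudes, not the readout.
print(rng.choice([0, 1], size=10, p=[p0, p1]))
```

So the analog-device analogy only goes partway: the state space is continuous, but what you can ever read out is still discrete.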
A modest problem can require more information to be represented than exists in the universe.
Probably some physical law therefore prevents a quantum computer from working as they imagine. It turns up as difficulties in engineering it.
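That "more information than exists in the universe" line checks out as back-of-the-envelope arithmetic; here is one version of it (the atom count is the commonly cited rough estimate, and the calculation is my own illustration rather than anything from the thread). Writing down the full state of n qubits classically takes 2^n complex amplitudes, and 2^n overtakes the ~10^80 atoms in the observable universe at quite modest n:

```python
import math

ATOMS_IN_UNIVERSE = 10 ** 80  # commonly cited rough estimate

# A full classical description of n qubits needs 2**n complex amplitudes.
# Find the smallest n where that count exceeds the atom count:
n = math.ceil(80 * math.log2(10))  # smallest integer n with 2**n > 10**80
print(n)                           # 266
print(2 ** n > ATOMS_IN_UNIVERSE)  # True
```

So a register of a few hundred qubits already describes a state no conceivable classical memory could store explicitly; whether that can be engineered into a working machine is the open question.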
"substitute for the connecting 'wire' between the signal and the receiver?"

Entanglement tends to defy simple useful explanation, AFAIK. It is generally not much of a substitute for a wire between the signal and the receiver, though. In fact, one peculiar thing about it is that it resists being exploited to transmit information. It's a pervasive correlation with a number of peculiar regularities to it. One such regularity is that it seems to be impossible to take advantage of the correlation to transmit information faster than the speed of light. That might not sound so peculiar, but it is, because the correlation, viewed from any "there must be deterministic behavior behind the scenes" intuition, would require the crew of backstage deterministic elves running the show to be signalling to each other faster than the speed of light in order to manage the performance that we see. If you like, you can look up "Einstein Podolsky Rosen" for the stock famous example of a setup that makes this particular peculiarity central to the outcome.

So much for what it's not; what is it? Um. Entanglement is more nearly the kind of in-phase vs. out-of-phase issue that we associate with mundane waves. It can show up in behavior which is closely analogous to the interference patterns we sometimes see in light waves in special situations, like laser speckle in the light from a laser pointer or bar code scanner, or like the shifting color patterns in reflections from a compact disc. But entanglement occurs in arbitrarily many dimensions, rather than the single dimension (usually time, or a single spatial axis) that shows up in the kind of mundane waves that are easily accessible to play with. And if you are very very clever, like Peter Shor (see https://en.wikipedia.org/wiki/Shor%27s_algorithm), you can figure out a devious way to line everything up so that --- in principle, though not currently in practice --- certain useful information could be coaxed out of the high-dimensional interference behavior. It's all wrapped up in several layers of other math and classical computing, but the quantum part seems to be fundamentally a trick with quantum interference, and not AFAICS much like a trick with signals sneaking around without wires.

That said, there are some communications tricks using quantum entanglement that are somewhat reminiscent of what you said entanglement might be like --- in particular, communications setups that resist eavesdropping. Those don't feel exactly like the backstage deterministic elves sneakily conspiring using wires that we can't possibly detect and using messages on those supersecret wires to determine the outcomes we see, but they do feel more like that than they feel like interference patterns in thousands of dimensions.
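To make that "correlation that resists carrying a signal" point concrete, here is a toy numpy simulation (my own illustration, not from the comment above) of repeatedly measuring the standard entangled Bell state (|00> + |11>)/sqrt(2). Each side, viewed alone, looks like a fair coin flip, which is why no message can ride on it, yet the two sides always agree:

```python
import numpy as np

rng = np.random.default_rng(1)

# Bell state (|00> + |11>) / sqrt(2), amplitudes over the basis 00, 01, 10, 11.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
probs = np.abs(bell) ** 2  # [0.5, 0, 0, 0.5]

# Measure both qubits ten times.
for outcome in rng.choice(4, size=10, p=probs):
    a, b = outcome >> 1, outcome & 1  # first and second qubit readings
    print(a, b)  # each column alone looks like coin flips; the pair always matches
```

The real peculiarity (the Einstein-Podolsky-Rosen point) only shows up when the two sides measure along different axes, which a four-line toy can't capture, but the "random individually, correlated jointly" flavor is already visible here.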
I call BS on Quantum computing. Those scientists can get funding for years on end without ever producing a merchantable product.

Conventional physics research had no merchantable product, except that understanding the atom meant we could understand chemistry, which is kind of important. We had done a lot with chemistry before that, but it had been a lot of trial and error. Physics made chemistry somewhat predictable.
And then one day a guy was crossing a street and figured out that those newfangled neutrons had different speeds, and some moved slower, and that meant they could be captured by certain atoms. That would make a barely unstable atom really, really unstable, and that atom would split, and that caused two neutrons to go flying off to find other atoms. And if you could work this speed thing just right, these capture/split/emit-2 events would happen, then twice as many would happen, then four times as many, and so on --- many, many times in a second. And that meant a really, really big boom. So some research has no practical applications, until suddenly it has some devastatingly practical ones.