• TimeSquirrel@kbin.melroy.org · 11 days ago

    Do you have any idea the amount of error correction needed to get a regular desktop computer to do its thing? There’s error correction between the peripheral bus and the CPU, inside your RAM if you have ECC, between the USB host controller and your printer, between your network card and your network switch/router, and so on. It’s amazing that something as complex as a modern PC, using such fast signalling, can function at all. At the frequencies used to move data around the system, the copper traces behave more like radio-frequency waveguides than wires. They are just “suggestions” for the signals to follow. So there’s a ton of crosstalk, bleed-over, and external interference that must be taken into account.

    Basically, if you want to send high-speed signals more than a couple of centimeters and have them arrive in a way that makes sense to the receiving entity, you’re going to need error correction. Having “error correction” doesn’t mean something is bad. We use it all the time: CRCs, checksums, and parity bits detect errors, and codes with more redundancy (Hamming, Reed–Solomon) can go further and correct them. A toy example follows below.
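
    For a toy illustration (my own sketch in Python, not how any particular bus or protocol actually implements it), here’s a single even-parity bit detecting a one-bit error:

    ```python
    def add_even_parity(bits):
        """Append a parity bit so the total count of 1s is even."""
        return bits + [sum(bits) % 2]

    def check_even_parity(bits):
        """Return True if the received word still has even parity."""
        return sum(bits) % 2 == 0

    word = add_even_parity([1, 0, 1, 1])  # -> [1, 0, 1, 1, 1]
    assert check_even_parity(word)        # arrived intact

    word[2] ^= 1                          # one bit flipped in transit
    assert not check_even_parity(word)    # single-bit error detected
    ```

    Parity alone can only detect an odd number of flipped bits; actually correcting them takes more redundancy, which is exactly what the fancier codes buy you.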

    • over_clox@lemmy.world · 10 days ago

      I’m well aware. I’m also aware that the various levels of error correction in a typical computer manage to preserve data integrity for years or even decades.

      Google bragging about an hour, regardless of it being a different type of computer, just sounds pathetic, especially given all the money being invested in the technology.

      • TheRealKuni@lemmy.world · 10 days ago (edited)

        Traditional bits only have to be 0 or 1, not a coherent superposition.

        Managing to maintain a stable qubit for a meaningful amount of time is an important step. The final output from quantum computation is likely going to end up being traditional bits, stored traditionally, but superpositions allow qubits to be much more powerful during computation.

        Being able to maintain a cached superposition seems like it would be an important step. (There’s a rough sketch below of what a superposition is, and how measuring it collapses it back into a traditional bit.)

        (Note: I am not even a quantum computer novice.)
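
        As a toy sketch (my own, in Python, reduced to a single qubit and ignoring decoherence and error correction entirely), a qubit’s state is a pair of complex amplitudes, and measurement collapses it to an ordinary bit:

        ```python
        import random

        # One qubit: |psi> = a|0> + b|1>, with |a|^2 + |b|^2 = 1.
        # Equal superposition: 50/50 odds on measurement.
        a = complex(2 ** -0.5)
        b = complex(2 ** -0.5)

        def measure(a, b):
            """Collapse the superposition into a classical 0 or 1."""
            p0 = abs(a) ** 2              # probability of reading 0
            return 0 if random.random() < p0 else 1

        # Until measured, the qubit carries both amplitudes at once,
        # which is what quantum algorithms exploit during computation.
        samples = [measure(a, b) for _ in range(1000)]
        print(sum(samples) / len(samples))  # ~0.5 for this state
        ```

        The hard part Google is bragging about is keeping those amplitudes coherent long enough to compute with them, which this sketch hand-waves away completely.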