Writing a 100-word email using ChatGPT (GPT-4, latest model) consumes one 500 ml bottle of water. It uses 140 Wh of energy, enough for 7 full charges of an iPhone Pro Max.
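The iPhone comparison in the headline can be sanity-checked with quick arithmetic. The ~17 Wh battery capacity below is my assumption (a round figure for recent Pro Max models; actual capacity varies, and charging losses are ignored):

```python
# Rough cross-check of the headline's "7 full charges" claim.
# PHONE_BATTERY_WH is an assumed round number, not an official spec.

EMAIL_ENERGY_WH = 140     # energy figure quoted in the headline
PHONE_BATTERY_WH = 17     # assumed iPhone Pro Max battery capacity

charges = EMAIL_ENERGY_WH / PHONE_BATTERY_WH
print(f"{charges:.1f} full charges")   # prints "8.2 full charges"
```

That lands in the same ballpark as the quoted 7 charges, so the two headline numbers are at least mutually consistent.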

  • Naz@sh.itjust.works · 27 days ago

    Datacenter LLM tranches are 7–8 H100s per user at full load, which is around 4 kW.

    Multiply that by generation time and you get your energy used. Say it takes 62 seconds to write an essay (a highly conservative figure).

    That’s 68.8 Wh, so you’re right.

    Source: I’m an AI enthusiast
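The arithmetic in the comment above can be sketched as a back-of-envelope check. The 500 W per-GPU figure is my assumption chosen to match the stated ~4 kW total (H100 board power can reach 700 W, so this is on the low end):

```python
# Energy = power * time. Assumes 8 H100 GPUs at ~500 W board power
# each (~4 kW total, per the comment) generating for 62 seconds.

GPU_COUNT = 8
BOARD_POWER_W = 500           # assumed per-GPU draw; H100 TDP is up to 700 W
GENERATION_TIME_S = 62

power_w = GPU_COUNT * BOARD_POWER_W        # 4000 W total
energy_j = power_w * GENERATION_TIME_S     # joules = watts * seconds
energy_wh = energy_j / 3600                # 1 Wh = 3600 J

print(f"{energy_wh:.1f} Wh")               # prints "68.9 Wh"
```

This reproduces the ~69 Wh figure in the comment, within rounding of the quoted 68.8 Wh.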

    • bandwidthcrisis@lemmy.world · 28 days ago

      Well that’s of the same order of magnitude as the quoted figure. I was suggesting that it sounded vastly larger than it should be.

      • Naz@sh.itjust.works · 28 days ago

        They’re probably factoring in cooling costs and a bunch of other overhead, I dunno

      • Naz@sh.itjust.works · 27 days ago

        Nope. Just GPU board power draw. 60 seconds is also pretty long with how fast these enterprise cards are but I’m assuming they’re using a giant 450B or 1270B model.

    • oyo@lemm.ee · 27 days ago

      kW is a unit of instantaneous power; "kW/s" makes no sense. Note how multiplying a kW/s figure by seconds would cancel time out and give you back power, not energy. You got there in the end, though.