• stabby_cicada@slrpnk.net · 4 months ago

    So I did a little math.

    This site says a single ChatGPT query consumes 0.00396 kWh.

    Assume an average LED light bulb draws 10 watts, or 0.01 kWh per hour. So if I did the math right, no guarantees there, a single ChatGPT query (0.00396 / 0.01 ≈ 0.4 hours) is roughly equivalent to leaving a light bulb on for about 24 minutes.

    So if the average light bulb in your house is on a little more than 3 hours a day, making 10 ChatGPT queries per day (about 4 bulb-hours) is roughly the equivalent of adding one more light bulb to your house.

    Which is definitely not nothing. But isn’t the end of the world either.
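
    A quick sanity check of that arithmetic, a minimal sketch assuming the 0.00396 kWh-per-query figure above and a 10 W bulb:

    ```python
    # Back-of-envelope check: ChatGPT queries vs. an LED bulb
    KWH_PER_QUERY = 0.00396   # per-query figure cited above (kWh)
    BULB_KW = 0.010           # 10 W LED bulb expressed in kW

    # One query, expressed as minutes of bulb time
    minutes_per_query = KWH_PER_QUERY / BULB_KW * 60
    print(f"1 query        ≈ {minutes_per_query:.0f} min of bulb time")      # ~24 min

    # Ten queries per day, expressed as bulb-hours per day
    hours_per_10_queries = 10 * KWH_PER_QUERY / BULB_KW
    print(f"10 queries/day ≈ {hours_per_10_queries:.1f} bulb-hours per day")  # ~4 h
    ```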

    • AliasAKA@lemmy.world · 4 months ago

      There’s also the energy required to train the model. Inference is usually more efficient (sometimes not, but almost always significantly more so), because there’s no error back-propagation or other training-specific computation.

      Current models probably take on the order of 1,000 MWh of energy to train (GPT-3 took 284 MWh by OpenAI’s calculation). That’s not including the web scraping, data cleaning, and other associated costs (such as cooling the server farms, which is non-trivial).

      A coal plant burns roughly 364 to 500 kg of coal to generate 1 MWh. So for GPT-3 you’d be looking at a minimum of about 103,000 kg of coal (~228 thousand pounds, or about 114 US tons) just to train it, before anyone has even used it and ignoring the other associated energy costs. For comparison, a typical home may use 6 MWh per year, so just training GPT-3 could have powered about 47 homes for an entire year.
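
      A rough sketch of that coal math, assuming the 284 MWh training figure and the low-end 364 kg of coal per MWh:

      ```python
      # Rough coal-equivalent math for training GPT-3 (figures from above)
      TRAINING_MWH = 284        # cited estimate for GPT-3 training energy (MWh)
      COAL_KG_PER_MWH = 364     # low-end coal burned per MWh generated
      HOME_MWH_PER_YEAR = 6     # typical annual household consumption (MWh)

      coal_kg = TRAINING_MWH * COAL_KG_PER_MWH
      coal_lbs = coal_kg * 2.20462
      home_years = TRAINING_MWH / HOME_MWH_PER_YEAR

      print(f"Coal burned: {coal_kg:,.0f} kg (~{coal_lbs:,.0f} lb, ~{coal_lbs/2000:.0f} US tons)")
      print(f"Home-years:  {home_years:.0f}")   # ~47 homes powered for a year
      ```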

      Edit: also, it’s not nearly as bad as crypto mining. And as another commenter says, it’s largely moot if we have clean sources of energy to fill the need and a grid that can handle it. Unfortunately, we have neither right now.

  • mindbleach@sh.itjust.works · 4 months ago

    ‘How dare technology keep doing stuff?’ is a deeply weird criticism.

    This isn’t like crypto bullshit, where finance-bro jackasses did databases in the least efficient possible way. We’re pushing the boundaries of results-driven artificial intelligence, modeled on how biological brains work. Is it miraculous? Not exactly. But it’s answering a lot of questions that were exciting forty-odd years ago and suddenly exploded into relevance due to parallel computing… intended for video games.

    Bemoaning the last year-ish of outright witchcraft, based on the up-front costs of training models that will run on a phone, is a perspective that seems more performative than plausible.

    • stabby_cicada@slrpnk.net · 4 months ago

      I deeply dislike the line of argument that goes “we shouldn’t bother reducing our personal energy consumption because 100 corporations produce 70% of greenhouse gases” or similar arguments. Of course we should. Because it’s the right thing to do.

      But it’s also true: those 100 corporations and their ilk absolutely promote a false narrative that personal responsibility is the solution to climate change, in order to prevent climate regulation that might harm their bottom line.

      And frankly, I think that’s what’s going on here with the panic over AI power consumption: corporate lobbyists and PR creating yet another distraction to slow the course of climate regulation, and guilting ordinary people for doing ordinary things in the process.

  • Immersive_Matthew@sh.itjust.works · 4 months ago

    The issue is how the electricity is generated, not that it is needed in the first place. It’s such a great distraction from the real issue that it has got to be big oil spinning the story this way.

    Let’s all hate on AI and crypto because they are ruining the entire environment, and if we just stopped them, all would be fine with the planet again /s.