• xthexder
    ·
    5 months ago

    I’m not sure that’s even a valid comparison? I’d love to know where you got that data point.

    LLMs generate one token at a time until they decide to output an end-of-text token, so the amount of power used varies massively depending on the prompt and the length of the response.
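    To illustrate the point about variable cost: here's a toy sketch of an autoregressive decoding loop (the `next_token` function is a hypothetical stand-in for a full model forward pass). Each iteration is one forward pass, and the loop only stops when the model emits an end-of-text token, so the total compute isn't known in advance.

```python
EOS = "<|endoftext|>"

def next_token(context):
    # Stand-in for a model forward pass; a real LLM samples the next
    # token from the logits it computes over its whole vocabulary.
    return EOS if len(context) >= 5 else f"tok{len(context)}"

def generate(prompt_tokens):
    tokens = list(prompt_tokens)
    forward_passes = 0
    while True:
        tok = next_token(tokens)   # one full forward pass per token
        forward_passes += 1
        if tok == EOS:
            break
        tokens.append(tok)
    return tokens, forward_passes

out, passes = generate(["hello"])
# `passes` depends entirely on how long the model keeps generating,
# which is why per-query energy use varies so much between prompts.
```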

    Search queries, on the other hand, return nearly instantaneously, and search engines can cache huge amounts of data between requests, unlike LLMs, which have to run the full model for every request individually.
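    A rough sketch of why caching matters so much here, using Python's `lru_cache` as a stand-in for a search engine's query cache (the `search` function itself is a hypothetical placeholder): a repeated query is a cheap cache hit rather than a fresh computation, which is exactly what an LLM can't do for free-form prompts.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def search(query):
    # Stand-in for an index lookup; real engines serve results from
    # precomputed inverted indexes and shared caches, so repeated
    # queries cost almost nothing compared to the first one.
    return f"results for {query}"

search("weather today")   # miss: does the "expensive" work once
search("weather today")   # hit: served straight from the cache
print(search.cache_info())
```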

    I’d estimate responding to a typical ChatGPT query uses at least 100x the power of a single Google search, based on my experience with databases and with running LLMs at home.