• brsrklf@jlai.lu
    5 hours ago

    Only yesterday, I searched for a very simple figure: the number of public service agents in a specific administrative region. This is, obviously, public information. There is a government site where you can get it. However, I didn’t know the exact site, so I searched for it on Google.

    Of course, the AI summary shows up first and gives me a confident answer, accurately mirroring my exact request. However, the number seems way too low to me, so I go check the first actual search result, the aforementioned official site. Google’s shitty assistant took a sentence about a subgroup of agents and presented it as the total. The real number was clearly given earlier on the page, and was about 4 times that.

    This is just a tidbit of information any human with the source would have identified in a second. How the hell are we supposed to trust AI for complex stuff after that?

    • Patch@feddit.uk
      3 minutes ago

      The AI models can be hilariously bad even on their own terms.

      Yesterday I asked Gemini for a population figure (because I was too lazy to look it up myself). First I asked it:

      what is the combined population of place 1, place 2 and place 3

      It answered:

      Unfortunately, providing an absolutely precise, up-to-the-minute combined population figure is difficult. Population numbers are constantly changing.

      To find these figures, I would recommend checking the Office for National Statistics (ONS) website. They provide detailed population statistics for local authorities in England.

      On a whim, I asked it again as:

      combined population of place 1, place 2 and place 3

      And then it gave me the answer sweet as a nut.

      Apparently I was being too polite with it?