• 12 minutes

incompetent man babies. can’t even own up to their fuck ups like a real man.

  • So it’s not called Microsoft 365 Copilot anymore then? Since that is fully focused on business and not on entertainment. Make up your fking mind, Microsoft

  • 4 hours

    Ah, the famous “Fox News” defense of claiming you’re an entertainment medium but you should totally trust it.

  • 3 hours

    Ah yes, Copilot in the app everybody thinks of for entertainment… Notepad.

  • Oh lord, I might get fired for doing this, but I smell a company-wide email about Copilot Monday morning.

  • 3 hours

    That’s complete and utter bullshit. Either stand behind your product or don’t ship it universally. Pick a lane. Either it’s worth using or it isn’t, so which is it?

    This wishy-washy bullshit paints a picture of an embarrassingly inept organization. Is that really what you’re trying to project, Microsoft?

  • Then how come my company just rolled this shit out for work? Allstate just walked us through how to use Copilot to type our emails, and to use it for note-taking.

    • 4 hours

      Because in the end, you, the person forced to use various AI chatbots/agents/models/whatever, will be held responsible when one of the 3000 decisions they made you take, with no way to check everything, turns out to cause the slightest problem. When that happens, YOU were supposed to know that NOTHING the AI tells/says/does can be assumed correct, so it’s your responsibility if something goes wrong.

      • Already told them I refuse to let AI write my emails or take my notes. Fuck that noise.