ZDNet columnist Chris Matyszczyk flags Microsoft's latest "poetic use of words" when describing the ChatGPT-based functionalities it's bundling into applications like Word and Excel. It's there to help steer you to your destination. It's there to free you to focus on steering your life. And it's there to help you land on the perfect version of you, the one that does more in order to, I don't know, be more.

There's one difference, though, between Microsoft's Copilot and, say, an American Airlines co-pilot. Hark the words of Microsoft VP of Modern Work and Business Applications Jared Spataro: "Sometimes, Copilot will get it right. Other times, it will be usefully wrong, giving you an idea that's not perfect, but still gives you a head start."

I wonder how long it took for someone to land on the concept of "usefully wrong." You wouldn't want, say, the steering wheel on your car to be usefully wrong. Any more than you'd want your electrician to be usefully wrong. Somehow, though, one is supposed to cheer that a piece of AI (hurriedly) slipped into one's most basic business tools can be utterly mistaken....

Of course, all these companies — Microsoft, too — claim they're being responsible in the way they create their new offerings. Wait, didn't Microsoft just lay off its entire AI ethics and society team?

Read more of this story at Slashdot.
Microsoft Brags Its ChatGPT Integration Will Be Right or 'Usefully Wrong'