Microsoft’s Copilot Terms of Use Say the AI Chatbot Is for Entertainment Purposes Only

news.saerio.com

April 6, 2026


Microsoft has been aggressively pushing Copilot, its in-house artificial intelligence (AI) technology, as a powerful productivity tool for enterprises. The Redmond-based tech giant recently released an automation tool dubbed Copilot Cowork, a wellbeing product called Copilot Health, and new native large language models (LLMs). Despite the push, however, the AI chatbot’s terms of use describe it as a tool for entertainment purposes only, sparking a debate online. Netizens have raised concerns that the language in the terms suggests the tech giant is unwilling to take accountability for its AI’s actions.

Microsoft Calls Copilot a Party Trick

Last week, several social media users unearthed the terms of use for Microsoft’s Copilot and began sharing them online because of their confusing language. The tech giant updated the chatbot’s terms in October 2025, making several changes to the older language. Most of the document is a legal disclaimer outlining how the tool should and should not be used.

However, one part of the document states, “Copilot is for entertainment purposes only. It can make mistakes, and it may not work as intended. Don’t rely on Copilot for important advice. Use Copilot at your own risk.” The language is concerning because the company markets the tool for serious productivity tasks while its own terms state that it is not reliable.

Another part of the terms of use states that Copilot’s responses can infringe copyrights, trademarks, or privacy rights, and that Microsoft will not be held responsible if the user publishes or shares these responses publicly or with another individual. This, too, conflicts with Microsoft’s vision of users generating reports and documents with the AI chatbot.

Reacting to this, Reddit user u/iliveonramen said, “It’s not a good sign when a company won’t stand behind the accuracy of their product. If Microsoft doesn’t trust Copilot, why should I?” Another responded, “If it is for entertainment purposes only, why the hell is my company forcing it on all their workers?”

Meanwhile, a Microsoft spokesperson told PCMag, “The ‘entertainment purposes’ phrasing is legacy language from when Copilot originally launched as a search companion service in Bing. As the product has evolved, that language is no longer reflective of how Copilot is used today and will be altered with our next update.”
