Microsoft has updated the artificial intelligence (AI) clause in its terms of service, and now prohibits anyone from reverse engineering or harvesting data from its AI software to train or improve other models.
With the rise of generative AI, companies face growing scrutiny, as people want to know what businesses and advertisers are doing with the information users provide.
Microsoft addresses this issue in a new clause titled "AI Services" in its terms of service. The clause does not apply to enterprise users.
The new policies take effect on September 30 and cover everything from reverse engineering to data extraction. Microsoft states that no one may reverse-engineer its AI services to discover the underlying components of its models, algorithms, and systems -- for example, by trying to determine and remove the weights of models.
Microsoft also limits the use of web scraping, web harvesting, or web data extraction methods to extract data from the AI services.
The terms also state that users may not use the AI services -- or data from the AI services -- to create, train, or improve any other AI service.
As part of providing the AI services, users give Microsoft permission to process and store their inputs and outputs so the company can monitor the content, with the goal of preventing abusive or harmful use of the service.
Microsoft also states that users are solely responsible for responding to any third-party claims regarding their use of the AI services in compliance with applicable laws, such as copyright infringement or other claims relating to content output while using the AI services.
A Microsoft spokesperson declined to comment on how long the company plans to store user inputs in its software.
"We regularly update our terms of service to better reflect our products and services," the Microsoft spokesperson told The Register, which initially brought light to the change. "Our most recent update to the Microsoft Services Agreement includes the addition of language to reflect artificial intelligence in our services and its appropriate use by customers," the representative told us in a statement."
Microsoft does not use data from Bing Chat Enterprise to train its AI models.
The Register notes that "policies are a little murkier for its Microsoft 365 Copilot," which, though it does not appear to use customer data or prompts for training, does store information.
An article on Microsoft's website titled "Data, Privacy, and Security for Microsoft 365 Copilot" explains how Copilot can generate responses anchored in the customer's business content, such as user documents, emails, calendar, chats, meetings, contacts, and other business data.
The article explains that Copilot combines this content with the user's working context, such as the meeting a user is in now, the email exchanges the user has had on a topic, or the chat conversations the user had last week.