Is this the correct use of "opt-in?"
To me, having things "opt-in" means they're off and you can turn them on if you want.
If it's "opt-out" it's automatically on, and you can turn it off.
Likewise, I think the title says literally the opposite of what is actually happening.
I think they mean 'Enabled by default'
Thus opt-out would be the correct term.
You are correct. The headline author likely meant "opted in by default" or "enabled by default."
> Microsoft's Connected Experiences feature automatically gathers data from Word and Excel files to train the company's AI models. This feature is turned on by default, meaning user-generated content is included in AI training unless manually deactivated.
Not to say that Microsoft products respect privacy, but I don't see evidence that user Word/Excel files are being used for training.
The linked services agreement has had the same language (copy/transmit/etc. "to the extent necessary to provide the services") since at least 2015[0], and "connected experiences" seems to group a wide range of integrations; some like dictation/translation probably utilise ML, but that does not mean training on user content.
[0]: https://web.archive.org/web/20150608000921/https://www.micro...
Agreed. This was raised within our corp the other week, and we read through the privacy and security documentation as it relates to Connected Experiences. Microsoft has outlined specifically what Connected Experiences covers.[1] [2]

You could argue that predictive text is a product of machine learning, but there is no clause allowing for training any generalized large language models on this data. The confusion may have arisen if they read an article about Copilot. If the user had a Microsoft 365 Copilot license, then the data would be used as grounding for their personal interactions with Copilot, but still not used to train any foundational LLMs. However, even this data is still managed in compliance with Microsoft's data security and privacy agreements.
[1] https://learn.microsoft.com/en-us/microsoft-365-apps/privacy...
[2] https://learn.microsoft.com/en-us/microsoft-365-apps/privacy...
To play devil's advocate, I don't see any evidence they're NOT training on user content either. Compared to how explicitly they indicate they're not using user content for targeted advertising, this seems like a huge oversight. Given how carefully they've put together these documents, I'm doubtful it was an oversight.
"In the M365 apps, we do not use customer data to train LLMs. This setting only enables features requiring internet access like co-authoring a document." @Microsoft365 https://twitter.com/Microsoft365/status/1861160874993463648
It’s absurd that Microsoft 365 uses Twitter as its official support announcement platform.
Official announcements about the outage the last couple of days were posted there.
Twitter is a hostile platform that requires an account to view. Why does MS365 continue to use it?
This seems like a security shit show.
Can we disable it by group policy across entire domains?
Surely no business would ever allow Microsoft to 'reformat, display, and distribute' confidential company documents?
Or am I missing something.
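Yes — Microsoft documents per-category policy settings for connected experiences, and they resolve to per-user registry values, so they can be pushed domain-wide via Group Policy or any endpoint-management tool. A minimal sketch of what writing them directly might look like in Python; the key path and value names here are taken from Microsoft's public privacy-controls documentation, so treat them as assumptions and verify them against your Office version before deploying anything:

    import winreg

    # Per-user policy hive Microsoft documents for Office privacy controls
    # (Office 2016+ / Microsoft 365 Apps use the "16.0" version key).
    KEY_PATH = r"Software\Policies\Microsoft\office\16.0\common\privacy"

    # Documented policy values; for each, 1 = enabled, 2 = disabled.
    SETTINGS = {
        "usercontentdisabled": 2,                 # experiences that analyze your content
        "downloadcontentdisabled": 2,             # experiences that download online content
        "controllerconnectedservicesenabled": 2,  # optional connected experiences
    }

    # This writes to HKCU, so it must run in each user's context (e.g. a
    # logon script); Group Policy preferences can push the same values.
    with winreg.CreateKeyEx(winreg.HKEY_CURRENT_USER, KEY_PATH,
                            0, winreg.KEY_SET_VALUE) as key:
        for name, value in SETTINGS.items():
            winreg.SetValueEx(key, name, 0, winreg.REG_DWORD, value)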
Well, if there's some sort of cloud feature allowing you to share documents you write with others, it would make sense you would have to allow Microsoft to "reformat, display, and distribute" for the purpose of providing you that service.
However, the terms of service says "To the extent necessary to provide the Services to you and others, [...] and to improve Microsoft products and services". So they're saying they can use your content not just to provide you service, but to provide other people service and to improve all Microsoft products.
> it would make sense you would have to allow Microsoft to "reformat, display, and distribute" for the purpose of providing you that service.
That would be me sharing a specific document with a specific person. If their terms specified that they would only "reformat, display, and distribute" to people we personally give permission to, then that would be fine, but they don't.
The word 'necessary' can do a lot of heavy lifting.
A word processor stealing the user's IP by default should carry massive fines in the EU. This is pure deception. 20% of annual revenue should be appropriate.
Hopefully full pretax revenue for Microsoft and all their subsidiaries.
Title as of the time of this comment:
> Microsoft Word and Excel AI data scraping slyly switched to enabled by default — the opt-out toggle is not that easy to find
—
As a tech person, keeping up with disabling and avoiding all this is becoming exhausting. I can’t imagine any regular non-tech person having any chance at avoiding it.
Is it time to just give up? At what point do you have to accept that the tsunami is here and there’s nothing you can do about it?
Worse than exhausting, this is clearly a pattern of abuse done by purposeful intent.
Security fatigue is a well-known thing in IT. Configuration fatigue, where options you deliberately disabled malevolently switch themselves back on, is just as bad, resulting in vexatious experiences.
This is the problem when antitrust is not enforced and regulation has killed all other smaller market participants. It creates dynamics (abuses) that cause societal upheaval, which inevitably leads to violence.
It's really stupid, but the people making these decisions are evil people. Every reasonable person knows that actions have consequences.
>At what point do you have to accept that the tsunami is here and there’s nothing you can do about it?
Around the late 2000s, but maybe it was earlier. The best time to buy MSFT stock is always right now.
The solution isn't to give up or attempt to avoid it - it's to make this sort of thing illegal.
Yes, exactly. There's no reason for the burden to be on every single user of every product to disable this crap. The law should require companies to behave more ethically with real consequences if they do not.
I just checked and this is turned off in my installation, but I'm not sure whether that's because I'm EU-based or because my org has disabled it.
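If you want to check programmatically which applies, here's a quick sketch (Python on Windows, with the same caveat that the value name comes from Microsoft's privacy-controls docs rather than anything verified here). If no policy value exists, the in-app toggle or the product default is what governs:

    import winreg

    KEY_PATH = r"Software\Policies\Microsoft\office\16.0\common\privacy"

    try:
        with winreg.OpenKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
            # "usercontentdisabled" covers connected experiences that
            # analyze your content; 2 = disabled, 1 = enabled.
            value, _ = winreg.QueryValueEx(key, "usercontentdisabled")
            print("analyze-your-content experiences:",
                  "disabled by policy" if value == 2 else "enabled by policy")
    except FileNotFoundError:
        print("No admin policy set; the in-app setting applies.")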
> Microsoft says Word and Excel AI data scraping was not switched to enabled by default (Updated)
This seems to be a misunderstanding.
>To do so, users must actively opt out by finding and disabling the feature in settings
Odd. So, let's say I wrote an article and it is copyrighted and on some newspaper web page. If I understand this correctly, in theory, I need to find everyone who uses this version of Word and tell them to disable this feature?
If so, looks to me the lawyers are going to have a great time with this and will clog the courts for centuries.
Microsoft = Spyware
Most tech these days seems to fall into that classification.
There are not too many pieces of technology these days that intentionally avoid collecting your data to sell to another company.
The linked "Services Agreement" doesn't appear to be specific to this "Connected Experiences" thing, but is rather the basic agreement required to use any MS software. Correct me if I'm wrong here, but opting out of this won't restrict MS from having a license to all Your Content?
> "To the extent necessary to provide the Services to you and others, to protect you and the Services, and to improve Microsoft products and services, you grant to Microsoft a worldwide and royalty-free intellectual property license to use Your Content, for example, to make copies of, retain, transmit, reformat, display, and distribute via communication tools Your Content on the Services," the clause reads.
Well, this does make sense in the context of Office 365, OneDrive and the Office web apps in general. (Still dodgy regarding the "worldwide" part but there's no way around that because people can and do expect to access their stuff even while on vacation)
Silently enabling the training of remote AI however? That's not covered under any reasonable interpretation of the above legalese.
IANAL, but I think the "to improve Microsoft products and services" bit does mean that they do legally get to train their AI (which is a Microsoft service) on your data. Still a bastard move though.
>… intellectual property license to use Your Content
Seems clear to me: use it any way Microsoft wants. The "for example" list is neither exhaustive nor limiting.
IANAL again, but I don't think they get to do literally anything with your data. The phrase used is "to the extent necessary". For instance, I don't think they could scrape their user data for trade secrets and then sell those to the highest bidder.
Who defines “necessary?” Use of Your Content is Necessary to support Microsoft’s business activities, including, but not limited to, training their AI.
There are other laws protecting things like trade secrets and corporate privacy, so it would indeed be foolish for Microsoft to attempt gathering and selling trade secrets. But the wording gives them carte blanche to do anything not already illegal, including using your Most Awesome Word Template in Word’s collection of templates that they distribute to everyone.
Why not? Isn’t that the essential ethos Microsoft was founded on?
Because they boxed themselves in with legalese. Companies would definitely switch off Microsoft services if at all possible if the company's lawyers thought their trade secrets were getting sold off. So I think the "as necessary" framing does probably prevent them from doing some things.
As I laid out in my other comment, I think training AI in particular is covered under the "improving Microsoft products or services" bit of legalese. I do wonder how companies' lawyers will respond to this, though. They probably thought of that phrase as just allowing Microsoft employees access to documents to see how Word or other pieces of software were being used, or to fix crashes, etc.
I thought it was founded on Bill Gates’s mommy having strong connections to IBM that allowed little Bill to keep the rights to the source code they paid him to write. And the privileged position of having access to a computer at his school when 99.9% of the population did not.
"The funds from the bidder will be invested in to products in order to make a better user experience" /s
Reminds me of “this call will be used for training and quality purposes.”
Our servers are already on Debian; only the client PCs are left.
This would certainly be the cause of lots of GDPR violations, considering the kinds of information processed in Word and Excel. I know our condo's owners association keeps contact information of their members in Excel sheets, that's considered PII. It can also contain sensitive information like who is behind on their monthly contributions and by how much.
That's just the first thing I thought of. There must be tons of companies and organisations processing sensitive data in Word and Excel. What about doctor's offices and insurance companies handling medical information? What about banks, financial advisors, lawyers, etc.
This is why they chose to put AI into word processors and Excel: so they can take the PII in a derived but reconstructable form (weights).
Does this circumvent Azure Information Protection policies as well? Would be fucking hilarious if it did.