Windows 10 user activity logs are sent to Microsoft despite users opting out

The privacy settings for Windows 10 may not be working as promised.

According to MSPoweruser, a few Reddit users have discovered that Windows 10 continues to send user activity logs to Microsoft even after users choose the option to opt out of that data collection.

Windows 10 users can access their activity log settings by going to Settings and selecting Privacy. From there, within the Activity History menu, users should be able to prevent their activity history from being sent to Microsoft either by unchecking "Let Windows sync my activities from this PC to the cloud" or by selecting the "Clear activity history" option.
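For readers who want to check whether an administrator policy is forcing activity-history behavior regardless of those Settings toggles, here is a minimal Python sketch. It assumes the documented PublishUserActivities and UploadUserActivities Group Policy registry values under HKLM\SOFTWARE\Policies\Microsoft\Windows\System; these values are absent unless a policy has been configured, so treat this as illustrative rather than a definitive view of what Windows sends.

```python
# Minimal sketch: inspect the machine-wide activity-history policy values.
# Assumes the documented Group Policy registry location; the key and values
# are absent unless an administrator policy has been set.
import winreg

POLICY_PATH = r"SOFTWARE\Policies\Microsoft\Windows\System"

def read_activity_policy(value_name):
    """Return the DWORD policy value, or None if the policy is not set."""
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, POLICY_PATH) as key:
            value, _ = winreg.QueryValueEx(key, value_name)
            return value
    except FileNotFoundError:
        return None

for name in ("PublishUserActivities", "UploadUserActivities"):
    state = read_activity_policy(name)
    print(f"{name}: {'not configured' if state is None else state}")
```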

But as some Windows 10 users have realized, toggling these options does not guarantee that the data won't be shared: the same data can still be accessed by logging into your Microsoft account and viewing your Privacy dashboard.

Additionally, as BetaNews explains, the data isn't really treated as purely "activity history" data. As far as Windows 10 is concerned, it is diagnostic data, regardless of whether it includes a user's activity history. That means users who want control over how much, or what kind of, data is released to Microsoft will have to adjust a different set of settings.

Those settings live in the Privacy menu within Settings, under "Diagnostics and feedback." Within this menu, Windows 10 users can choose the Basic option, which only releases diagnostic data related to a PC's settings, capabilities, and performance. If Basic is not selected, the PC appears to default to the Full option, which is the one that includes a user's activity history.
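As a quick way to see which diagnostic level is in force on a given machine, here is a small Python sketch. It assumes the documented AllowTelemetry policy value under HKLM\SOFTWARE\Policies\Microsoft\Windows\DataCollection, where 0 = Security, 1 = Basic, 2 = Enhanced, and 3 = Full; if no policy is set, the choice made in the Settings app (often Full by default) applies.

```python
# Minimal sketch: read the diagnostic-data level behind "Diagnostics and
# feedback." Assumes the documented AllowTelemetry policy value; an absent
# value means no policy is set and the Settings-app choice applies.
import winreg

LEVELS = {0: "Security", 1: "Basic", 2: "Enhanced", 3: "Full"}

def diagnostic_data_level():
    path = r"SOFTWARE\Policies\Microsoft\Windows\DataCollection"
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path) as key:
            level, _ = winreg.QueryValueEx(key, "AllowTelemetry")
            return LEVELS.get(level, f"unknown ({level})")
    except FileNotFoundError:
        return "not configured (Settings-app choice applies)"

print("Diagnostic data level:", diagnostic_data_level())
```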

It's problematic that Microsoft's privacy settings and dashboard aren't as clear as they need to be about how user data is managed and released; many users would never discover the Basic diagnostic data option just by browsing the Windows 10 Privacy menu.

That clarity is important in light of recent General Data Protection Regulation (GDPR) guidelines, which require companies like Microsoft to clearly inform their users about how they collect and manage user data, and to make sure users are given the option to not have their data collected in the first place.

Update: Microsoft reached out to us on Thursday to provide an official statement:

“Microsoft is committed to customer privacy, being transparent about the data we collect and use for your benefit, and we give you controls to manage your data. In this case, the same term “Activity History” is used in both Windows 10 and the Microsoft Privacy Dashboard. Windows 10 Activity History data is only a subset of the data displayed in the Microsoft Privacy Dashboard. We are working to address this naming issue in a future update.”
