
This small Windows update brings a highly requested change


Microsoft is simplifying file management in Windows 11 with a new feature in the latest Canary build (an early preview version of Windows 11): users can now drag and drop files directly between breadcrumbs (path segments) in File Explorer. This fulfills a common request from Windows Insiders and is something Microsoft announced in a June 19 Windows Insider Blog post.

Breadcrumbs are the segments of the path to your current location, for example, This PC > Windows (C:) > Program Files. They appear in the address bar and display the path you have taken inside the app. The feature also seems to have reached non-Insiders since its release at the end of May.

Microsoft describes the feature as a thin border around the breadcrumb you are dragging over, which aims to make moving files around easier. In addition to drag-and-drop, notifications also get an update: Microsoft has extended the time frame it uses to detect when to suggest turning off an app's notifications, though the exact duration remains unspecified.

The tech giant also acknowledges and lists known issues in the build. The list is short and includes a dark mode bug inside Task Manager and upgrades getting hung up when coming from a prior build. This change may not be huge, and time will tell how well it works. But it's definitely welcome, and Microsoft listening to feedback gives us hope that we might get other much-needed changes to how we use apps on Windows 11.

Judy Sanhz
Judy Sanhz is a Digital Trends computing writer covering all computing news. Loves all operating systems and devices.
A dangerous new jailbreak for AI chatbots was just discovered

Microsoft has released more details about a troubling new generative AI jailbreak technique it has discovered, called "Skeleton Key." Using this prompt injection method, malicious users can effectively bypass a chatbot's safety guardrails, the security features that keep ChatGPT from going full Tay.

Skeleton Key is an example of a prompt injection or prompt engineering attack. It's a multi-turn strategy designed to essentially convince an AI model to ignore its ingrained safety guardrails, "[causing] the system to violate its operators’ policies, make decisions unduly influenced by a user, or execute malicious instructions," Mark Russinovich, CTO of Microsoft Azure, wrote in the announcement.
