
Madman cuts open Surface Pro 3 to install new 1TB solid state drive

Image Credit: Jorge Malagon/Blogspot
We’ve heard of customized computing before, but this is getting a little out of hand.

Back in February, the world caught wind of a new method for upgrading the solid-state drive inside the Microsoft Surface Pro 3, one that lets users push beyond the company’s self-imposed limit of 512GB. The technique, pioneered by Mexico City’s Jorge Malagon, doesn’t involve going through official channels on Microsoft’s side. Instead, he took a drill to the tablet/laptop hybrid, cracking it open by cutting straight through its metal body.

The reason for the forced entry, as opposed to a more elegant solution, is that the Surface Pro 3’s screen is notorious for cracking at even the slightest attempt to remove the backplate. When both iFixit and CNET attempted traditional teardowns upon the Surface Pro 3’s initial release, they found the adhesive between the screen and the shell so strong that it was nearly impossible to separate the two without destroying either the shell or the display in the process.

Malagon found a way around this problem by simply checking the schematics of his Surface Pro 3, then carefully drilling through the back of the device at the spot where the SSD sits. From there, it was just a matter of installing a replacement, which in this case took the form of Samsung’s 1TB 840 EVO mSATA solid-state drive.

The tech-savvy Malagon even went to the trouble of cloning his old drive rather than reinstalling Windows from scratch. That was probably a wise move, since it spared him the hassle of reinstalling the Surface Pro 3’s drivers.
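
The article doesn’t say which cloning tool Malagon used, but at its heart a full-disk clone is just a block-for-block copy from the old SSD onto the new one, usually with the source drive sitting in a USB enclosure. As a rough sketch, assuming the two drives show up on a Linux machine at the hypothetical device paths /dev/sdX (old) and /dev/sdY (new), both invented here for illustration, the core of the operation looks like this:

```python
# Minimal block-for-block disk clone, in the spirit of
# `dd if=/dev/sdX of=/dev/sdY bs=4M`. Requires root, and
# pointing it at the wrong device will destroy its contents.
CHUNK = 4 * 1024 * 1024  # copy in 4 MiB chunks

with open("/dev/sdX", "rb") as src, open("/dev/sdY", "wb") as dst:
    while True:
        block = src.read(CHUNK)
        if not block:  # reached the end of the source drive
            break
        dst.write(block)
```

In practice a dedicated tool like `dd` or Clonezilla is the saner choice, and a raw copy like this only works when the target drive is at least as large as the source, which was comfortably true going from 512GB to 1TB.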

We can’t recommend going this route unless you’re absolutely confident in your abilities with a Dremel, but it’s nice to know it’s an option — if you have a drill, a compatible SSD, and you’re a bit batty.

Chris Stobing
Former Digital Trends Contributor