
ASRock enables Skylake overclocking, sending budget chips into the stratosphere

Overclocking has always promised extra performance in exchange for risk rather than dollars, but that hasn't held true for most of the latest generation of Skylake CPUs. Now, though, it's no longer only those who paid for the privilege of an unlocked chip who can overclock this generation of processors: ASRock has released an official BIOS update for its boards that makes overclocking all Skylake chips possible.

The feature is known as SkyOC and is now available to anyone with an ASRock Z170 motherboard. It debuted a few days ago, when TechSpot reported on its ability to unlock overclocking.

In case nobody believed it was doable, ASRock's official SkyOC page has screengrabs of the results it has achieved with the BIOS update. There's a Core i5-6400 that was taken from 2.7GHz at a bus speed of 100.9MHz all the way up to 4.3GHz at a bus speed just shy of 160MHz. That's huge.
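For a rough sense of what's going on under the hood, those numbers line up with simple base clock (BCLK) scaling: the chip's locked multiplier stays put while the bus speed rises. The sketch below is illustrative only; it assumes a 27x multiplier for the i5-6400 (inferred from its 2.7GHz stock clock) and a 159.5MHz bus standing in for "just shy of 160MHz," neither of which comes from ASRock's screenshots.

```python
# Rough sketch of the arithmetic behind BCLK overclocking, assuming SkyOC
# raises the base clock while the CPU's locked multiplier stays fixed.
# The 27x multiplier and 159.5MHz bus are assumptions for illustration.

def core_clock(bclk_mhz: float, multiplier: int) -> float:
    """Return the effective core clock in GHz for a given bus speed."""
    return bclk_mhz * multiplier / 1000

stock = core_clock(100.9, 27)        # ~2.72 GHz, the stock configuration
overclocked = core_clock(159.5, 27)  # ~4.31 GHz at a bus just shy of 160MHz

print(f"Stock:       {stock:.2f} GHz")
print(f"Overclocked: {overclocked:.2f} GHz "
      f"({(overclocked / stock - 1) * 100:.0f}% faster)")
```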

Perhaps more impressive, it also managed a near 20 percent boost in clock speed for an i3-6100 and a more than 36 percent boost in performance for a Pentium G4400. That's a sudden and dramatic speed boost for a couple of budget-friendly chips.

And that's what's going to be the most interesting aspect of this new BIOS update: will it change which Skylake CPUs people are buying?

Overclocking has always been dependent on the chip you buy and the luck of the draw. Historically, buying lower-end CPUs and overclocking them was a quick way to save some money, since they could often catch up with the top end. Today, chips are sold with overclocking as a feature, so that's much less common.

But with this new update, we're back to the Wild West, where if you have the know-how and get yourself a lucky chip, you may well find yourself outperforming those who spent a lot more money on their hardware, with your only additional cost being an upgraded cooler.

Jon Martindale