Colorize your next PC build with NZXT's HUE+ LED lighting

NZXT HUE+
In the world of PC gaming battlestations, it's go big or go home, and one way system builders go big is by lighting up their gaming computers like Christmas trees. But that takes work. Case and cooling manufacturer NZXT is ready with an all-in-one answer to bland lighting: the HUE+, an LED management and enhancement system that adds a good deal of functionality to tired, old builds.

In the past, custom lighting meant wiring LED strips to the motherboard and soldering connections, and all that work typically yielded a single band of blue light behind the case window. The HUE+ instead offers two LED strip channels with support for up to eight strips in total. It mounts easily into a 2.5-inch drive slot and connects via internal USB.

The lighting is managed through NZXT's CAM software suite, a comprehensive application that keeps everything in order. It boasts a wide array of customizable effects, too, and that doesn't just mean a color-changing wave or a blinking pattern.

The HUE+ can be configured to reflect a number of diagnostic details and runtime stats from your system, expressing them through the LED colors and patterns. No more opening GPU-Z to tell whether your graphics card is overheating — your system will simply glow red and yellow like a fire to let you know it's time to tone down the settings or take a break. It can even listen to the system audio output and pulse along with the rhythm.

The HUE+ includes four LED strips that are attachable magnetically or with included 3M tape strips. At just $59.99 for the whole setup, it’s cheaper than a lot of aesthetics-enhancing peripherals on the market, and the fact that it’s easy to connect and control is a plus.

Brad Bourque
Former Digital Trends Contributor
Brad Bourque is a native Portlander, devout nerd, and craft beer enthusiast. He studied creative writing at Willamette…