
Razer adds RGB backlighting to DeathStalker, Orbweaver boards

RGB backlighting has become the technology du jour for high-end keyboards and other peripherals, so it's no wonder that Razer is continuing to expand its Chroma scheme to more of its products. Joining the firm's BlackWidow keyboard, Naga mouse, and Firefly mouse mat, the DeathStalker will also be getting the versatile backlighting system, as will the Orbweaver gamepad.

This announcement comes from the company's booth at Gamescom, where it has been showing off the colorful new versions of its popular keyboards. Since the DeathStalker uses chiclet keys rather than the mechanical switches found in the BlackWidow, the implementation is a little different.

Traditionally, Razer's mechanical switches have used off-center LEDs, which can result in uneven lighting across the keys — especially those with dual functions. That shouldn't be the case with the DeathStalker Chroma edition.

Related: Taste the rainbow at E3 with Razer’s new Mamba mouse

The new DeathStalker with RGB lighting is available now for $100, which makes it $30 more expensive than the standard DeathStalker, but far cheaper than the "Ultimate" version.

Other announced Chroma products include a new version of the Orbweaver gamepad. As Tom's points out, it comes with 30 programmable buttons, 20 of them standard keys alongside other control inputs. It features mechanical switches, much like the BlackWidow, and is designed to offer a colorful alternative to the Tartarus, which has fewer keys and uses a membrane interface.

The Chroma Orbweaver isn't set for release for another month or so, and when it does debut it will carry a price tag of $130. While that matches the original Orbweaver's list price at some retailers, the original can be found for $100 at others, so the new version is slightly more expensive owing to its Chroma features.

Does the addition of the fancy lighting make you more likely to buy one of these peripherals?

Jon Martindale
Jon Martindale is the Evergreen Coordinator for Computing, overseeing a team of writers addressing all the latest how to…