
Mad Catz’s R.A.T. 1 lets you make and use your own customized components

Mad Catz has taken peripheral customization to a whole new level with its R.A.T. line. Sure, some other manufacturers might let you tweak the LED lighting, add some extra weight, or even switch out the thumb grip, but the R.A.T. Pro X introduced earlier this year went way beyond that. The new R.A.T. 1 isn’t quite so transformer-like, but its product shot still looks like an exploded diagram.

Split into three distinct sections, the R.A.T. 1 lets you take apart the palm rest, chassis and sensor module, with the idea being that they can be interchanged with alternatives. Unfortunately, there aren’t any alternatives you can buy right now, but it seems likely Mad Catz will make some available in the future.

What you can do when you buy it, though, is make your own. In an unprecedented move for a manufacturer of almost anything, Mad Catz has released the 3D CAD modeling files for the palm rest, letting you 3D print your own custom designs if you have the means to do so, or pay a commercial outlet to make one for you.

Although somewhat of a roundabout process, this is a level of customization offered by few. The only mouse we can think of that exceeds it is the Roccat Nyth.

You can remove the core sensor module from the palm rest, as well, which lets you use it as a travel mouse if needed. It won’t have quite the same features as the full-sized R.A.T. 1, but if space is at a premium, this mouse can adapt.

Beyond its swappable modules, the new Mad Catz mouse packs a high-quality, 3,500 DPI Pixart PMW 3320 optical sensor under the hood for precision gaming. It also features a grilled palm rest to keep perspiration to a minimum, a glow-in-the-dark frame and scroll wheel (or a white-and-red alternative), and a cable protector to make sure the mouse cord doesn’t shear off somewhere down the road.

Although not quite available yet, you can pre-order the Mad Catz R.A.T. 1 now for $30.

Jon Martindale