
Lian Li is bringing its mini-ITX PC-Q04 chassis to the U.S.

Lian Li has announced that it’s bringing its fanless mini-ITX case, the PC-Q04, to the shores of America at the very affordable price of just $60. The small form-factor chassis is designed to be whisper-quiet while still providing enough space for a decently powerful system; just don’t expect to fit a big GPU in there.

Like every Lian Li chassis, the PC-Q04 is made purely of aluminium, so it is incredibly light at just 1.33kg. As you might expect from a mini-ITX case, it’s quite small too, measuring in at just 194 x 294 x 210mm.

Of course, the trade-off with a chassis this size is that you aren’t going to fit much of anything inside it. Graphics cards are limited to just 190mm in length, and the PSU can be no longer than 160mm. While you might try to make up for that with a more powerful CPU, bear in mind that any CPU cooler is limited to just 70mm of clearance.


There’s a little more room for storage, with space for a pair of 2.5-inch drives, or a single 2.5-inch drive alongside a 3.5-inch drive. However, if you can make do with just one storage drive, you can use that additional space for an extra fan instead.

Cooling relies mostly on natural airflow created by convection, along with the push/pull from the fans on the CPU cooler, graphics card, and PSU (as per Tom's Hardware). You could potentially mount fans over the case’s many exterior grilles, but that would spoil some of the chassis’ smoother lines.

No intake fans also means no dust filters, so bear in mind that this is a case that will likely get quite dusty. There’s no noise dampening either, so make sure you opt for quiet coolers if you go for this one. There’s not a lot of point in having a compact, hidden-away system if its fans are screaming away the whole time.

However, it is very cheap, so that’s a big box ticked.

Jon Martindale