
Harman Kardon GLA-55 2.0-Powered Speaker System


We’ve seen speakers clad in Rosewood, African Blackwood and piano black rubbed out with the painstaking French polish technique before, but none, we think, quite replicates the look of Harman Kardon’s new GLA-55 2.0 loudspeakers. With slender, jewel-like bodies cut from real glass – not acrylic – they look almost otherworldly in design.

Besides creating a stunningly different aesthetic, the material chosen for the speaker cabinets, which Harman says is the same used in bulletproof glass, reduces resonance thanks to its stiffness, improving bass performance and cutting down on distortion.

Harman Kardon GLA-55
Image Courtesy of Harman Kardon

The company has also used its top-end drivers in the unit: CMMD Lite Tweeters that produce frequencies up to 20kHz, and Atlas AL Woofers, which supposedly mimic the bass created by ordinary 18-inch woofers in a package one-sixth the size. Thanks to its unique horn-like shape, a “slipstream port” below also eliminates the distortion sometimes heard as a whacking noise. Together, both speakers pump out 110 watts from an integrated amp, meaning they can be connected directly to an MP3 player, computer, or other audio source without any other components.

Close Up Binding Posts
Images Courtesy of Harman Kardon

Harman Kardon’s GLA-55 2.0 Powered Loudspeakers retail for a cool $1,000 at Amazon.com, the exclusive U.S. distributor. While we’ll wait until we hear a pair before we weigh Harman’s claim that it’s “arguably the best 2.0 speaker system ever engineered,” we’ll admit right now that $1,000 buys one nice piece of desk candy.

Nick Mokey