Company places ad for ‘nude web coders’ on Craigslist

Are you a female who likes to work in an office environment, but prefers to do so in the buff? Do you live in the U.K.? If yes, read on. A Buckinghamshire Web company named Nude House has put out a Craigslist ad for nude Web coders to work in a completely nude office environment, reports The Guardian.

The Nude House Website says it is looking for “boys and girls” in coding, sales, and marketing positions.

“This is the ultimate experience for nudist self expression while working and earning income – and the offices will have showers and other facilities to make the entire experience very pleasurable,” says the home page. “The offices will be well heated to make sure all the staff are happy and content at their work. We will provide other facilities demanded by the staff themselves.”

Oddly, though the company is called Nude House and has publicly placed ads searching for “naturists” (nudists who prefer being outdoors), its site says that there is “no need for customers to be aware that the staff work in the nude.”

Nude House is trying to sell imaging software that lets you create product or hyperlink boxes on top of pictures. The product must be selling if the company is hiring, but the site transports you back to a time when most sites were made with Yahoo's GeoCities. Nude House seems to take a naturist approach to its office equipment as well: it's been a decade since white CRT monitors and computers like the ones pictured on the site were sold, and I doubt that Nokia is running Windows Phone 7.
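
For the curious, the "hyperlink boxes on top of pictures" pitch is essentially a clickable overlay on an image. Here is a minimal sketch of the idea, assuming a plain browser DOM overlay; the function name and parameters are hypothetical illustrations, not Nude House's actual software:

```typescript
// Sketch: place a clickable "hyperlink box" over a picture by absolutely
// positioning an anchor inside the image's wrapper element.
function addLinkBox(
  container: HTMLElement, // wrapper element that holds the <img>
  href: string, // destination of the hyperlink box
  box: { left: number; top: number; width: number; height: number } // px, relative to the image
): HTMLAnchorElement {
  const link = document.createElement("a");
  link.href = href;
  link.style.position = "absolute";
  link.style.left = `${box.left}px`;
  link.style.top = `${box.top}px`;
  link.style.width = `${box.width}px`;
  link.style.height = `${box.height}px`;
  link.style.display = "block";
  // The wrapper must be a positioning context for the overlay to line up.
  container.style.position = "relative";
  container.appendChild(link);
  return link;
}

// Hypothetical usage: make a 120x80 px region of the picture link out.
// addLinkBox(wrapper, "https://example.com/product", { left: 40, top: 60, width: 120, height: 80 });
```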

Jeffrey Van Camp
Former Digital Trends Contributor