
ChatGPT allows this nightmarish AI typewriter to talk to you

A classic Brother AX-325 typewriter straight out of the ’90s has gained ChatGPT powers and is all set to have conversations with you, literally on paper.

Designer-engineer Arvind Sanjeev shared a full process thread on how the typewriter went from idea to the charmingly elegant machine it is now. Sanjeev reverse-engineered a Brother AX-325, modded it with some AI smarts, gave it a new paint job, and called it Ghostwriter. Now it can converse, on paper, with anyone typing on its keyboard.

Ghostwriter complete exploded view.

According to his Twitter thread, Sanjeev dismantled the typewriter mainly so he could feed the keyboard signals through an Arduino acting as a driver, which in turn sent them to a Raspberry Pi running OpenAI’s GPT-3 Python API.

GPT-3 is a powerful tool that works like a chatbot, responding to input (in this case, the typewriter’s keyboard) and composing text that reads much like a human’s. Sanjeev also showered the Brother with a little TLC, giving the typewriter a new bounce in its mechanical step.
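For a rough sense of how such a pipeline could be wired up, here is a minimal Python sketch of the Raspberry Pi side: it reads a typed line from the Arduino over USB serial, asks GPT-3 for a reply, and sends the text back for the typewriter to print. The serial port name, model choice, and pre-1.0 openai library calls are illustrative assumptions, not details confirmed from Sanjeev’s build.

import os
import serial   # pyserial
import openai   # openai < 1.0, as used for GPT-3 around 2022

openai.api_key = os.environ["OPENAI_API_KEY"]

# The Arduino keyboard driver typically shows up as a USB serial device;
# the port name is an assumption and may differ on your Pi.
arduino = serial.Serial("/dev/ttyACM0", 9600, timeout=1)

def ask_gpt3(prompt, temperature=0.7, max_tokens=120):
    # Call the GPT-3 completions endpoint (pre-1.0 openai library syntax).
    response = openai.Completion.create(
        model="text-davinci-003",
        prompt=prompt,
        temperature=temperature,
        max_tokens=max_tokens,
    )
    return response.choices[0].text.strip()

while True:
    typed = arduino.readline().decode("utf-8", errors="ignore").strip()
    if not typed:
        continue  # nothing new from the keyboard yet
    reply = ask_gpt3(typed)
    # The Arduino would turn these characters back into key presses on the
    # typewriter's keyboard matrix so the reply gets printed on paper.
    arduino.write((reply + "\n").encode("utf-8"))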

As promised, here is the full process thread for Ghostwriter – the #AI typewriter. A journey from idea to realization:

The idea: With the exponential growth and emergence of a prolific number of AI products we see every day, I wanted to create a mindful intervention that (1/13) pic.twitter.com/MCOeAcM26q

— Arvind Sanjeev (@ArvindSanjeev) December 14, 2022

Sanjeev also felt he needed a way to control the temperature (creativity) and length of the responses from GPT-3, so he installed two knobs above the keyboard. The knobs flank another small addition: a small OLED screen that shows feedback on their current settings.
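As a hedged sketch of how those knob readings could feed into the API call, building on the Python snippet above: assume the Arduino forwards two 10-bit analog values, which then get mapped onto GPT-3’s temperature and max_tokens parameters. The value ranges and function names are assumptions for illustration.

def knobs_to_params(temp_knob, length_knob):
    # Hypothetical mapping of 10-bit knob readings (0-1023) forwarded by
    # the Arduino onto the two GPT-3 request parameters.
    temperature = round(temp_knob / 1023, 2)          # 0.0 (predictable) to 1.0 (creative)
    max_tokens = 16 + int(length_knob / 1023 * 240)   # roughly 16 to 256 tokens
    # The same values could be echoed to the small OLED as feedback.
    return temperature, max_tokens

# Example: both knobs at their midpoint.
temperature, max_tokens = knobs_to_params(512, 512)
reply = ask_gpt3(typed, temperature=temperature, max_tokens=max_tokens)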

Ghostwriter conversation on paper.

Once the technical bits were done, Sanjeev set out to turn the Brother’s office gray into something more inviting and modern. He was aiming for something warm and playful that anyone could approach without digital distractions, where they could enjoy the “calm, meditative interface of a vintage typewriter.” We think he succeeded.
