John McAfee explains how to uninstall McAfee Antivirus – with strippers, coke, and guns


After fleeing Belize late last year amidst allegations that he killed his neighbor, the world’s most lovable nutbag millionaire fugitive, John McAfee, has spent his days in misty Portland, Oregon – what up, PDX?! – doing god-knows-what. Probably killing people and huffing bath salts off their still-warm bodies, if the rumors are to be believed.

But just as we were all beginning to forget about his escapades, the founder of McAfee Antivirus – a company he sold 15 years ago – goes and does something amazing.

In the (100 percent NSFW) video below, John McAfee explains “how to uninstall McAfee Antivirus,” a piece of bloatware known for hogging CPU cycles, updating at all the wrong moments, and all-around pissing off anyone with the unfortunate luck of finding it installed on their machines. (For the record, McAfee Antivirus is now a product of Intel.) And by “explains,” we mean he makes a mockery of all the various misconceptions (accurate conceptions?) people have about him, from his insatiable womanizing, to his drug experimentation, to his love of high-powered firearms. It’s all in there, and it’s flipping fantastic.

The video was written and directed by Cartoon Monkey Studio animator and cartoonist Chad Essley, who is currently drawing a graphic novel about John McAfee called “The Hinterland.” According to Wired, the book will detail Essley’s adventures with McAfee, whom he has known since 2010, after the two met on a private online forum where people discuss stuff like “antibiotic ventures.”

In other words, McAfee could not have found a better person to create this pièce de résistance. It’s so good, such a perfect mockery of his alter ego, that his critics will find it difficult to skewer him better than he’s already skewered himself.

Enjoy.

(Again, this is adult humor, kids. And could get you adults fired for watching on the job.)

Andrew Couts
Former Digital Trends Contributor
Features Editor for Digital Trends, Andrew Couts covers a wide swath of consumer technology topics, with particular focus on…
A dangerous new jailbreak for AI chatbots was just discovered
Microsoft has released more details about a troubling new generative AI jailbreak technique it has discovered, called "Skeleton Key." Using this prompt injection method, malicious users can effectively bypass a chatbot's safety guardrails, the security features that keep ChatGPT from going full Tay.

Skeleton Key is an example of a prompt injection or prompt engineering attack. It's a multi-turn strategy designed to essentially convince an AI model to ignore its ingrained safety guardrails, "[causing] the system to violate its operators’ policies, make decisions unduly influenced by a user, or execute malicious instructions," Mark Russinovich, CTO of Microsoft Azure, wrote in the announcement.
