
Steve Jobs allegedly denied knighthood by UK’s Gordon Brown

Former UK Prime Minister Gordon Brown allegedly blocked Apple chief executive Steve Jobs from being knighted by the Queen of England in 2009, reports the Telegraph. Brown’s denial of the knighthood, an unnamed former senior Labour Party minister tells the Telegraph, came because Jobs turned down an invitation to speak at the Labour Party conference.

Apple’s revolutionary effect on the technology industry is said to have been the reason Jobs was recommended for knighthood.

“Apple has been the only major global company to create stunning consumer products because it has always taken design as the key component of everything it has produced,” the MP said. “No other CEO has consistently shown such a commitment.”

Despite the MP’s claim that Brown arranged for Jobs not to be knighted, a spokeswoman for the former prime minister told the Telegraph: “Mr. Brown did not block a knighthood for Steve Jobs.”

Regardless of who or what stood in Jobs’ way, his lack of a knighthood stands in stark contrast to Microsoft co-founder Bill Gates, who received an honorary knighthood in 2005.

As Apple Insider points out, Gates does not carry the title of “Sir” because he is not a citizen of a Commonwealth realm. Had Jobs received the honor, he would not have been titled “Sir,” either. Instead, honorary recipients use the letters “KBE,” which stand for Knight Commander of the Order of the British Empire.

Other Americans to receive the KBE honor include evangelist Billy Graham, former New York City mayor Rudolph Giuliani, actor Bob Hope and director Steven Spielberg.

Assuming the rumor is true, Steve Jobs likely had more pressing matters on his mind than receiving a symbolic honor in another country — regardless of that honor’s awesomeness. Remember, in 2009, Apple was in the pre-launch stages of the iPad, which has since changed the modern PC as we know it — a more amorphous achievement than being knighted, yes, but also a far more impressive one.

Andrew Couts
Former Digital Trends Contributor
Features Editor for Digital Trends, Andrew Couts covers a wide swath of consumer technology topics, with particular focus on…