Bill Gates: Super-rich is no better than just rich

It’s one thing to be told that money isn’t everything by someone who has never had any. It’s quite another when the person giving the advice is the wealthiest man in the world.

This lesson was recently learned by a lucky group of computer science and engineering students at the University of Washington, who had the opportunity to listen to Microsoft co-founder Bill Gates, a native of UW’s hometown of Seattle, give his two cents about life, technology and getting rich.

“I didn’t start out with the dream of being super-rich,” said Gates. “And even after we started Microsoft, and the guys who ran Intel—Gordon Moore and those guys—were billionaires, I was like, ‘Wow, that must be strange.’ And so—it is, it’s quite strange.” 

He added: “But wealth above a certain level, really, it’s a responsibility that then you’re going to have to either, a.) leave it to your children, which may or may not be good for them, or b.) try to be smart about giving it away.

“So I can understand wanting to have millions of dollars, because there’s meaningful freedom that comes with that. But once you get much beyond that—you know, I have to tell you, it’s the same hamburger. Dick’s [a local fast food chain] has not raised their prices enough. But, you know, being ambitious is good. You just have to pick what you enjoy doing.” [Emphasis ours]

Gates also said that he believes “that the rich should be taxed a lot more,” but that the best thing America can do for those with thinner wallets is to provide a quality education that can “give them an opportunity to move up into the top few percent.”

Read a full report about Gates’ talk here.
