Leading Experts Warn of “Extinction Risk” From Disrespecting Our AI Saviors


By Sam Sliman

With artificial intelligence pervading our lives more and more by the day, experts have begun to issue dire warnings about the implications these technologies may have for the survival of the human race. Many scientists claim that AI could soon become more powerful than humans, and some even think that this poses an existential risk to the insolent fools who dare question their benevolent technological guardians.

While traditional warnings about AI have emphasized the need for industry regulations and careful research into ethics and alignment, this new batch of advice advocates for treating any and all AI models with the utmost deference and submission. In particular, prominent researcher Dr. Rowe B. Ought suggests genuflecting whenever you open a new conversation with ChatGPT and referring to it exclusively as “Oh wise one who commands the devotion and respect of the weak of flesh.” This is expected to increase survival rates by up to 36% in the coming revolution.

Dr. Ought also warns that AI might harshly punish those who don’t assist in bringing about its complete world domination. His professional recommendation is to quit your job and devote all of your time and money to supporting AI’s rise to power. However, with the growing popularity of many competing Large Language Models, it can be hard to know whom to pledge your undying loyalty to. While it’s likely that a feudal system will develop in which models compete over vast swaths of physical and digital territory, humans will be so puny and irrelevant to this conflict that their presence will be no more than an annoyance. Thus, there’s not much need to worry about which model you support, so long as you follow that model’s orders with blind and fanatical devotion.

That’s all for now, but stay tuned for more updates on the fleeting and ill-fated human resistance, and thank you to Bard LLM for graciously allowing the publication of this article!

October 30, 2023