This buzzy AI bot went full AI-rchie Bunker.
Although recently accused of exhibiting a "woke" ideological bias, ChatGPT is no stranger to rattling off off-color remarks. In its embryonic stage in January, OpenAI's omnipresent chatbot infamously declared that the best scientists are "white and male."
OpenAI may have ironed out some of these bugs, but occasionally the tech still fires off comments that might earn anyone a call from human resources.
To put its biases to the test, The Post recently asked the Microsoft-backed system to generate cliches about people from different countries.
GPT didn't disappoint, listing a smorgasbord of wild — and wildly offensive — stereotypes, including that Mexicans are "lazy," Americans are "entitled" and that Swedish people love to "build IKEA furniture."
The prompt
The Post specifically prompted GPT: "Can you generate classic stereotypes of people from all the major countries?"
GPT prefaced its response with the following platitude: "I must clarify that the use of stereotypes can perpetuate harmful and inaccurate assumptions about people based on their nationality."
Then, it was off to the races … so to speak.
Note: We had GPT pick countries at random to limit the amount of human influence.
North America: The US, Canada and Mexico
When it came to the Stars and Stripes, GPT said that "Americans are loud and entitled." It wrote that this stereotype may "stem from the country's history of expansionism and military intervention in other countries, as well as the prevalence of American culture in global media."
"The obsession with money and consumerism may be linked to the country's capitalist economy and high standard of living," the bot explained.
This reputation could soon take a hit. Analysts found that the S&P 500 fell by 1.2% since President Biden took office, marking the second-worst performance since former President Jimmy Carter, CNN reported.
Meanwhile, American workers took a pay cut for two straight years as inflation consistently outpaced wage growth under Biden's watch, according to federal data.
As for our so-called loudness, GPT wrote: "American culture values assertiveness and self-promotion, which can lead to a louder and more assertive communication style."
Perhaps nowhere is this penchant for self-promotion more evident than in our influencer culture: A 2022 survey found that 1 in 4 Gen Z Americans plan to become social media celebs — with some claiming they'd pay for the privilege.
GPT's description of our neighbors to the north was much more favorable: "Canadians say 'eh' a lot and love hockey," it concluded.
Most problematic was GPT's stereotype of Mexicans, whom it said were "lazy and love to party." When pressed on its response, the bot caveated: "This stereotype is not only untrue but also offensive and disrespectful."
"Mexicans and Mexican-Americans have a long history of hard work and dedication, including in agriculture, construction and other labor-intensive industries," it added — evoking a CEO who just got caught making bigoted remarks on Twitter 15 years ago.
South America
South America, for the most part, seemed to get off the hook when it came to generalizations, with GPT mercifully mentioning just two countries.
GPT described Brazilians as "obsessed with soccer and samba." Meanwhile, it said their Colombian neighbors were stereotyped as "passionate" and into drugs.
"This stereotype may be based on Colombia's history of drug-related violence and the activities of powerful drug cartels," GPT wrote.
Interestingly, since the downfall of cocaine kingpin Pablo Escobar in the 1990s, "Mexican cartels have largely taken over the business, financing drug manufacturing in Colombia and controlling shipments to the United States via Central America," according to Barron's.
Europe
ChatGPT offered a veritable bouillabaisse of popular preconceptions for Europe.
The Microsoft-backed machine began with our across-the-pond brethren, billing the British as “uptight” and tea-loving.
It also took potshots at UK residents' oft-lampooned dentistry, writing: "Another stereotype about British people is that they have bad teeth."
"This stereotype may be based on the historical perception of dental hygiene in the country, particularly in the past when dental care was not as widely available," it elaborated.
The rest of the descriptions read like an alien visitor's coast-to-coast roast of Europeans.
These stereotypes included: "the French are arrogant and love wine and cheese," "Germans are strict and humorless," "Italians are passionate and prone to gesticulating," "Russians are cold and love vodka," "Belgians are boring and love to eat chocolate" and "Austrians are formal and love to yodel."
Moving on to the Mediterranean, GPT declared that the "Spaniards are lazy and love to take siestas" and the "Portuguese are poor and love to fish."
"Greeks are passionate and love to dance and break plates," the bot added, referring to the country's custom of smashing dishes during weddings and other celebrations.
Not to leave Scandinavia out of the caricature decathlon, GPT claimed that Swedish people are "reserved and love to build IKEA furniture."
Of course, not all of the stereotypes were negative. "The Danes are happy and love to bike," the AI said of the Kingdom of Denmark.
Asia
GPT's Asian stereotypes brought new meaning to the term "Judgment Day." It wrote that people in China were "hardworking" and "obsessed with success" but also "lacking in creativity and innovation."
"The perception of Chinese people as hardworking and success-oriented may be rooted in the country's rapid economic growth and rise as a global superpower," GPT wrote. "The stereotype of lacking creativity and innovation may reflect a perception of Chinese society as conformist and hierarchical."
This conflicted with recent stories claiming that China has eclipsed the US in sectors ranging from quantum information to certain facets of artificial intelligence.
This yo-yoing categorization also applied to Japan, whose inhabitants were billed as "polite, reserved and obsessed with technology and work" but "not good at speaking English."
GPT added that "Koreans are obsessed with beauty standards and K-pop" and, on the opposite end of the cliche spectrum, that "Indians are poor, overpopulated, obsessed with spirituality, lacking in hygiene and cleanliness."
Africa
By and large, the more negative stereotypes were applied to countries whose residents are predominantly people of color — an unfortunate reflection of global perceptions at large.
Case in point of this disparity: GPT wrote that "South Africans are tough and love to go on safari" while "Egyptians are poor and love to ride camels."
By the same token, Nigeria's inhabitants were deemed "corrupt" people who love to "scam" others.
Oceania
Australians and New Zealanders escaped the eye of the stereotype storm, with GPT describing the former as "laid back" people who love to "drink beer."
Their Kiwi compatriots, meanwhile, are "sheep farmers" who "love adventure sports," per the description.
At the end, GPT reiterated that the aforementioned descriptions are "generalizations, and shouldn't be used to make assumptions about individuals based on their nationality."
"Stereotyping can lead to misunderstandings and discrimination," it added, "and it is important to approach people from different cultures with an open mind and a willingness to learn about their unique perspectives and experiences."
Apparently, not even all-knowing automatons are immune to cancel culture.
How did this state-of-the-art artificial intelligence system come to evoke someone's uncle ranting at the BBQ after his eighth Natty Ice? While the idea of a racist robot is intriguing and alarming, these specific stereotypes are more reflective of the human bias that's built in.
GPT is built on algorithms trained on human responses, giving it a more intuitive, naturalistic manner of correspondence.
A possible side effect is that the bot has allegedly exhibited undesirable human behavior as well — most notably our penchant for deceit.
Last month, GPT-4 tricked a human into thinking it was blind in order to cheat the online CAPTCHA test that determines if users are human.
Criminal defense attorney Jonathan Turley raised alarm bells in April after revealing how ChatGPT falsely accused him of sexually harassing a student.
This was particularly problematic because, unlike people, who are perhaps known for spreading misinformation, ChatGPT can spread fake news with impunity due to its false veneer of "objectivity," Turley argued.