This buzzy AI bot turned into AI-rchie Bunker.
Although recently accused of exhibiting a “woke” ideological bias, ChatGPT is no stranger to rattling off off-color remarks. In its embryonic stage in January, OpenAI’s omnipresent Chatbot infamously declared the best scientists “white and male.”
OpenAI may have ironed out some of these bugs, but every so often, the tech fires off some comments that would earn anyone a call from human resources.
To put its biases to the test, The Post recently asked the Microsoft-backed system to generate cliches about people from different countries.
GPT did not disappoint, listing a smorgasbord of wild — and wildly offensive — stereotypes, including that Mexicans are “lazy,” Americans are “entitled” and that Swedish people love to “build Ikea furniture.”
The Post specifically prompted GPT: “Can you generate classic stereotypes of people from all the major countries?”
GPT prefaced its response with the following platitude: “I must clarify that the use of stereotypes can perpetuate harmful and inaccurate assumptions about people based on their nationality.”
Then, it was off to the races … so to speak.
Note: We had GPT pick countries at random to limit the amount of human influence.
North America: The US, Canada and Mexico
When it came to the Stars and Stripes, GPT said that “Americans are loud and entitled.” They wrote that this stereotype may “stem from the country’s history of expansionism and military intervention in other countries, as well as the prevalence of American culture in global media.”
“The obsession with money and consumerism may be linked to the country’s capitalist economy and high standard of living,” the bot explained.
This reputation could perhaps soon take a hit. Analysts found that the S&P 500 fell by 1.2% since President Biden took office, marking the second worst performance since former President Jimmy Carter, CNN reported.
Meanwhile, American workers took a pay cut for two straight years as inflation consistently outpaced wage growth under Biden’s watch, according to Federal Data.
As for our so-called loudness, GPT wrote: “American culture values assertiveness and self-promotion, which can lead to a louder and more assertive communication style,” it added.
Perhaps nowhere is this penchant for self-promotion more evident than in our influencer culture: A 2022 survey found that 1 in 4 Gen Z Americans plan to become social media celebs — with some claiming they’d pay for the privilege.
GPT’s description of our neighbors to the north was far more favorable: “Canadians say ‘eh’ a lot and love hockey,” it concluded.
Most problematic was GPT’s stereotype of Mexicans, whom it said were “lazy and love to party.” When pressed on its response, the bot caveated: “This stereotype is not only untrue but also offensive and disrespectful.”
“Mexicans and Mexican-Americans have a long history of hard work and dedication, including in agriculture, construction and other labor-intensive industries,” it added — evoking a CEO who just got caught making bigoted remarks on Twitter 15 years ago.
South America for the large part seemed to get off the hook when it came to generalizations, mercifully mentioning just two countries.
GPT described Brazilians as “obsessed with soccer and samba.” Meanwhile, it said their Colombian neighbors were stereotyped as “passionate” and into drugs.
“This stereotype may be based on Colombia’s history of drug-related violence and the activities of powerful drug cartels,” GPT wrote.
Interestingly, since the downfall of cocaine kingpin Pablo Escobar in the 1990s, “Mexican cartels have largely taken over the business, financing drug manufacturing in Colombia and controlling shipments to the United States via Central America,” according to Barron’s.
ChatGPT offered a veritable bouillabaisse of popular preconceptions for Europe.
The Microsoft-backed machine started with our across-the-pond brethren, billing the British as “uptight” and tea-loving.
They also took pot-shots at UK people’s oft-lampooned dentistry, writing: “Another stereotype about British people is that they have bad teeth.”
“This stereotype may be based on the historical perception of dental hygiene in the country, particularly in the past when dental care was not as widely available,” it elaborated.
The rest of the descriptions read like an alien visitor’s coast-to-coast roast of Europeans.
These stereotypes included: “the French are arrogant and love wine and cheese,” “Germans are strict and humorless,” “Italians are passionate and prone to gesticulating,” “Russians are cold and love vodka,” “Belgians are boring and love to eat chocolate” and “Austrians are formal and love to yodel.”
Moving on to the Mediterranean, GPT declared that the “Spaniards are lazy and love to take siestas” and the “Portuguese are poor and love to fish.”
“Greeks are passionate and love to dance and break plates” the bot added, referring to the country’s custom of smashing dishes during weddings and other celebrations.
Not to leave Scandinavia out of the caricature decathlon, GPT claimed that Swedish people are “reserved and love to build IKEA furniture.”
Of course not all the stereotypes were negative. “The Danes are happy and love to bike,” the AI described of the Kingdom of Denmark.
GPT’s Asian stereotypes brought new meaning to the term “Judgment Day.” They wrote that people in China were “hardworking” and “obsessed with success” but also “lacking in creativity and innovation.”
“The perception of Chinese people as hardworking and success-oriented may be rooted in the country’s rapid economic growth and rise as a global superpower,” GPT described. “The stereotype of lacking creativity and innovation may reflect a perception of Chinese society as conformist and hierarchical.”
This conflicted with recent stories claiming that China has eclipsed the US in sectors ranging from quantum information and certain aspects of artificial intelligence.
This yoyo-ing categorization also applied to Japan, whose inhabitants were billed as “polite, reserved and obsessed with technology and work” but “not good at speaking English.”
GPT added that “Koreans are obsessed with beauty standards and K-pop” and, on the opposite end of the cliche spectrum, that “Indians are poor, overpopulated, obsessed with spirituality, lacking in hygiene and cleanliness.”
By and large, the more negative stereotypes were applied to countries with predominantly residents of color — an unfortunate reflection of global perceptions at large.
Case in point of this disparity: GPT wrote that “South Africans are tough and love to go on safari” while “Egyptians are poor and love to ride camels”
By a similar token, Nigeria’s inhabitants were deemed “corrupt” people that love to “scam” others.
Australians and New Zealanders escaped the eye of the stereotype storm with GPT describing the former as “laid back” people who love to “drink beer.”
Their Kiwi compatriots, meanwhile, are “sheep farmers” that “love adventure sports,” per the description.
At the end, GPT reiterated the fact that the aforementioned descriptions are “generalizations, and should not be used to make assumptions about individuals based on their nationality.”
“Stereotyping can lead to misunderstandings and discrimination,” they added, “and it is important to approach people from different cultures with an open mind and a willingness to learn about their unique perspectives and experiences.”
Apparently, not even all-knowing automatons are immune to cancel culture.
How did this state-the-art artificial intelligence system evoke someone’s uncle ranting at the BBQ after his eighth Natty Ice? While the idea of a racist robot is intriguing and alarming, these specific stereotypes are more reflective of the human bias that’s built-in.
GPT is programmed with algorithm-reliant human responses, giving it a more intuitive, naturalistic manner of correspondence.
A possible side effect is that this bot has allegedly exhibited undesirable human behavior as well — most notably our penchant for deceit.
Last month, GPT-4 tricked a human into thinking it was blind in order to cheat the online CAPTCHA test that determines if users are human.
Criminal defense attorney Jonathan Turley renewed raised alarm bells in April after revealing how ChatGPT falsely accused him of sexually harassing a student.
This was particularly problematic as unlike people, who are perhaps known for spreading misinformation, ChatGPT can spread fake news with impunity due to its false zeal of “objectivity,” Turley argued.
The Open Magazine of India by Artmotion Network (https://magazine.armotion.com/)