Garbage Crisis

Garbage in, garbage out is a well-worn phrase that originated in the early days of computing, when first-adopter users had to write their own file paths and batch commands. But unlike old sayings that come and go, GIGO, as it's also known, is even more relevant today with the advent of artificial intelligence (AI).

Some law enforcement agencies have come under intense criticism because their facial recognition programs produced results skewed along racial lines. Why? AI reacts and responds according to the information that's entered into it. If that information is weighted more heavily toward one group of people, those are the kinds of results it will deliver.
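
To put the point in concrete terms, here is a minimal sketch in Python using synthetic, made-up data (not drawn from any of the systems discussed here) of how a model trained on inputs dominated by one group ends up performing noticeably worse on the group it has seen less of:

```python
# A minimal sketch (synthetic data, scikit-learn) of "garbage in, garbage out":
# when one group dominates the training data, the model's decision rule is fit
# mostly to that group and performs worse on the other.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical training set: 900 examples from group A, only 100 from group B,
# and the "correct" decision boundary differs between the two groups.
X_a = rng.normal(loc=0.0, scale=1.0, size=(900, 1))
y_a = (X_a[:, 0] > 0.0).astype(int)   # group A: positive above 0
X_b = rng.normal(loc=2.0, scale=1.0, size=(100, 1))
y_b = (X_b[:, 0] > 2.0).astype(int)   # group B: positive above 2

X = np.vstack([X_a, X_b])
y = np.concatenate([y_a, y_b])

model = LogisticRegression().fit(X, y)

# Scoring each group separately exposes the skew: accuracy on the
# under-represented group B lags well behind group A.
print("Group A accuracy:", round(model.score(X_a, y_a), 3))
print("Group B accuracy:", round(model.score(X_b, y_b), 3))
```

The model isn't "biased" by intent; it simply reflects the mix of data it was given, which is exactly the GIGO problem.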

Apple was one of the latest companies to get caught up in this, following criticism that its algorithms discriminated against women in favor of men. What made it more embarrassing was the pre-launch publicity in which Apple announced that its new credit card was "created by Apple, not a bank."

Bloomberg reported that the New York Department of Financial Services (NY DFS) is investigating Goldman Sachs, Apple's credit card partner, to determine whether the card indeed discriminates against women. NY DFS was reacting to a tweet by David Heinemeier Hansson, who said his wife was approved for a significantly lower credit limit despite their equal income.

When contacted by Quartz, a global news organization, Goldman Sachs admitted that it was possible for two people in the same family to receive separate credit decisions but denied that those decisions were based on gender. Hansson, a tech entrepreneur who hails from Denmark and has a green card, said his wife is American and has a higher credit score than he does.

Even Apple co-founder Steve Wozniak weighed in, telling AdAge that he, too, has an Apple credit card and can borrow ten times more than his wife, even though they share the same financial accounts.

Looking Ahead

Scrutiny of AI discrimination will only intensify as more companies adopt the technology. NY DFS also recently announced that it has opened an investigation into UnitedHealth Group, the largest U.S. health insurer by revenue.

The investigation came in the wake of a study recently published in Science, which reported that an Optum algorithm United sells to hospitals and insurers discriminated in favor of white patients over Black patients. In covering the story, the Wall Street Journal cited NY DFS' statement: "This compounds the already unacceptable racial biases that black patients experience, and reliance on such algorithms appears to effectively codify racial discrimination as health providers' and insurers' policy."

Rx

AI can be a powerful tool for personalizing marketing and segmenting customers by their interests. However, organizations moving in that direction must set clear parameters and expectations before implementing their AI programs so they don't end up in a crisis communications dilemma instead.

Marketing and IT teams need to huddle in advance to ensure that the data going into these systems is valid and credible, and that it accurately reflects the audience(s) the company is targeting; a simple check along these lines is sketched below. Bad results are tough enough to deal with on their own, but far worse when they trigger a wide-scale controversy that can take months to recover from.
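
One way such a huddle can be made concrete is a pre-launch representativeness check. The sketch below, in plain Python, compares the makeup of the data being fed into an AI program against the audience the campaign is meant to reach; the function name, segments, and threshold are illustrative assumptions, not taken from any specific tool.

```python
# Hypothetical pre-launch check: compare the makeup of the data feeding an AI
# program against the intended audience mix, and flag any segment that is
# badly over- or under-represented.
from collections import Counter

def representation_report(records, segment_key, target_share, tolerance=0.10):
    """Flag segments whose share of the data drifts from the intended audience mix."""
    counts = Counter(record[segment_key] for record in records)
    total = sum(counts.values())
    report = {}
    for segment, expected in target_share.items():
        actual = counts.get(segment, 0) / total if total else 0.0
        report[segment] = {
            "expected_share": expected,
            "actual_share": round(actual, 3),
            "flag": abs(actual - expected) > tolerance,
        }
    return report

# Illustrative usage with made-up customer records and a 50/50 target audience.
records = [{"segment": "women"}] * 300 + [{"segment": "men"}] * 700
target = {"women": 0.5, "men": 0.5}
for segment, row in representation_report(records, "segment", target).items():
    print(segment, row)
```

Any flagged segment is a cue to go back and fix the data before launch, rather than after a backlash.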
