By Ken Sweet

Can you trust Erica, Sandi or Amy to increasingly control parts of your financial life without giving you inaccurate information or sending money to the wrong place?

That's what the Consumer Financial Protection Bureau is asking in a report released Tuesday, in which the bureau lays out a number of concerns about banks' growing use of chatbots to handle routine customer service requests.

The CFPB worries that banks or loan-servicing companies may cut back on human customer service employees and push an increasing number of routine tasks to AI. Further, the regulator says poorly designed chatbots could run afoul of federal laws that govern how debts are collected or how personal information is being used.

“If firms poorly deploy these services, there’s a lot of risk for widespread customer harm,” said Rohit Chopra, director of the CFPB, in an interview with The Associated Press.

For several years, banks have been handling more and more customer service requests with chatbots, often with female-sounding names like Sandi at Santander, Amy at HSBC or Eno at Capital One.

Bank of America runs the biggest and most successful financial chatbot under the Erica brand. Erica — the last five letters in America — now handles hundreds of millions of inquiries a year from BofA's customers.

Initially, bank chatbots were used for basic inquiries, like changing a customer's address or phone number, or telling a customer where the nearest branch is or what the routing number on their account might be. But as banks have invested millions in these services, chatbots have grown far more sophisticated, able to understand full sentences and even help a customer move money around or pay a bill.

The CFPB estimates that roughly four out of every 10 Americans interacted with a bank chatbot last year, a figure it expects will grow.

Banks are getting ready to roll out even more advanced AI-powered services. JPMorgan Chase is reportedly developing plans to use ChatGPT and artificial intelligence to help customers pick appropriate investments. Bank of America bankers can use Erica to build customer profiles and potentially recommend products to those customers.

While these services may save customers minutes or hours waiting on hold to reach a human to ask a routine question, the agency has concerns about whether these chatbots can handle the nuances and complexities of consumer protection laws without giving customers inaccurate information.

“There are federal laws that give customers rights when it comes to debts, or disputing a transaction, and if customer service agents are being substituted for chatbots, that can pose problems if that software gives them inaccurate information,” Chopra said.

The bureau also found that older customers or customers whose primary language is not English may end up in a customer service “loop” and unable to reach a human agent.

“Time and again, we found complaints of customers getting stuck in circular arguments with a robot or getting wrong information,” he said.

Chopra, both in his current role as director of the CFPB and previously as a commissioner at the Federal Trade Commission, has repeatedly raised concerns that banks and companies may violate consumer laws if they poorly implement algorithms or artificial intelligence software. There have been numerous reports of chatbots from Microsoft, Google and other companies that have used biased language when prompted.
