How AI is poised to take a bigger role in payments
The financial services industry has been using artificial intelligence for decades in trading, and as the technology gets smarter, it's being tested more often in payments as well.
“There’s an increasing trend around using AI around financial payments,” said Sumeet Vermani, a global marketing leader who has worked for fintech providers such as Red Box Recorders, a company that records communications in the financial services industry for compliance purposes.
Financial institutions and payment providers are looking for ways to utilize AI in a number of areas, including customer interaction and fraud detection.
The most popular application of AI in financial services — and perhaps the most limited — is the chatbot, a program that converses with customers through text or speech. In financial services, chatbots are usually used to make the first interaction with a customer, answering questions or directing customers to an area of the website. For more complex interactions, the bot hands the conversation over to a human representative.
“There are a lot of chatbots in the industry right now dealing with the easy tasks, reducing complexity by dealing with multiple people at once,” Vermani said. Chatbots are about “having a digital one-stop shop for customer communication and interaction. Customers are increasingly wanting a single point in which to communicate with their payments providers, whether it’s to transact in a messenger app, learn more about a product or lodge a complaint.”
And in the future, chatbots could be even better at up-selling customers than humans, able to parse large amounts of data to determine when a specific product is needed.
But Raghu Rajah, founder and CEO of CrossCues, a startup building a customer engagement platform for banks using AI, said chatbots are the least valuable use case AI has in financial services.
“The perception is that somehow [chatbots] will replace full service banking, which is a bit naive,” Rajah said. “We’ve reached a point in life where social banking only happens when digital banking fails or breaks, so chatbots aren’t really going to help that.”
Another application of AI that’s receiving a lot of attention, though it’s not as far along as most people think, is marketing.
Several financial services providers have been playing around with using AI to make predictions about customers' behavior, which can then be used to push certain products or services that will help consumers manage their personal finances.
“We’re starting to see a lot of AI applications around segmenting people, tracking things like credit card spend,” Rajah said.
This is exactly what CrossCues does. Equipped with data, the company’s neural network (a computer system modeled on the human brain) builds a psychographic profile of a customer, advising the financial institution about what a specific customer or demographic likes and dislikes.
It’s similar to what Netflix does in using a viewer’s watch history and ratings, as well as the histories of other viewers with similar tastes, to suggest other films, Rajah said.
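The recommendation approach Rajah is describing can be sketched in a few lines. This is a minimal illustration, not CrossCues’ actual system: it assumes customers are represented as rating vectors over a small, hypothetical product catalog, finds the most similar peer by cosine similarity, and suggests products that peer rated highly.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two rating vectors (0 = unrated)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def recommend(target, others, catalog):
    """Suggest items the most similar peer rated highly but target hasn't tried."""
    best = max(others, key=lambda u: cosine_similarity(target, u))
    return [item for item, mine, theirs in zip(catalog, target, best)
            if mine == 0 and theirs >= 4]

# Hypothetical ratings over four products.
catalog = ["savings", "credit card", "mortgage", "travel insurance"]
me = [5, 4, 0, 0]
peers = [[5, 5, 0, 4], [0, 1, 5, 0]]
print(recommend(me, peers, catalog))  # → ['travel insurance']
```

Real systems use far richer signals than explicit ratings, but the core idea is the same: people who behave like you are a good guide to what you might want next.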
But this use case has also run into hurdles.
First, AI systems need tons of data to make reliable predictions. And that data needs to be transferred frequently, meaning these systems will need to operate in the cloud, which raises concerns around data security.
Second, many industry experts are concerned about the possibility of discriminatory practices that could come from using AI systems. For instance, Facebook’s advertising mechanism allowed advertisers to exclude certain groups based on race, gender and other factors that are prohibited when advertising employment or housing; Facebook has since updated its system to prevent those advertisers from using those traits to target their advertising.
The Consumer Financial Protection Bureau (CFPB) has been quite active in following the AI industry as it moves into financial services marketing. And because of that, said Rajah, banks are extremely gun-shy about utilizing AI for these purposes.
But financial services providers have not been hesitant to adopt AI for fraud detection and mitigation, and have in fact been using AI for this purpose for several decades.
“Traditionally financial services providers had lists of rules that they applied to payments systems to stop fraud: 'If this happens and that happens then it’s probably fraud,'” said Rajah. For instance, if a U.S.-based consumer was making purchases locally one morning and then started purchasing products in the Philippines in the afternoon, the activity might get flagged.
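The traditional rule list Rajah describes can be sketched as a simple check. This is an illustrative toy, assuming a hypothetical distance table and a single “impossible travel” rule: flag two card-present purchases if covering the distance between them would require moving faster than a commercial jet.

```python
from datetime import datetime

# Hypothetical great-circle distances in km (illustrative values only).
DISTANCES_KM = {("New York", "Manila"): 13_700, ("New York", "Boston"): 300}

def distance(a, b):
    return DISTANCES_KM.get((a, b)) or DISTANCES_KM.get((b, a)) or 0

def impossible_travel(tx1, tx2, max_speed_kmh=900.0):
    """Rule: flag two purchases if the implied travel speed between their
    locations exceeds roughly the cruising speed of a commercial jet."""
    (time1, city1), (time2, city2) = sorted([tx1, tx2])
    hours = (time2 - time1).total_seconds() / 3600
    if hours == 0:
        return city1 != city2
    return distance(city1, city2) / hours > max_speed_kmh

morning = (datetime(2024, 5, 1, 9, 0), "New York")
afternoon = (datetime(2024, 5, 1, 15, 0), "Manila")
print(impossible_travel(morning, afternoon))  # 13,700 km in 6 h → True
```

Production rule engines chain hundreds of such checks, but each one is this kind of hand-written if-then logic, which is exactly the rigidity that learning-based systems aim to replace.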
Such patterns are suspicious, but are not always indicative of fraud.
"Now financial services providers are employing learning algorithms to understand the spend patterns of individual people,” he said.
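A minimal version of learning an individual’s spend pattern is an anomaly test against that customer’s own history. The sketch below is an assumption about the general approach, not any provider’s actual model: it uses a z-score to flag a charge that sits far outside the customer’s usual range.

```python
import statistics

def is_anomalous(history, amount, threshold=3.0):
    """Flag a charge that deviates from this customer's own spending
    history by more than `threshold` standard deviations (z-score test)."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        return amount != mean
    return abs(amount - mean) / stdev > threshold

weekly_grocery_spend = [82, 75, 90, 88, 79, 85, 81, 77]
print(is_anomalous(weekly_grocery_spend, 84))   # within the pattern → False
print(is_anomalous(weekly_grocery_spend, 950))  # far outside it → True
```

The point of the per-customer baseline is that a $950 charge is routine for one cardholder and a red flag for another; a single global rule can’t capture that distinction.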
AI is also used in other fraud-prevention systems. U.K.-based Atom Bank, Monzo and Barclays are using AI-driven voice recognition to activate certain services, instead of alphanumeric passwords.
“From a customer experience perspective, this becomes a much smoother, simpler approach,” Vermani said.
And some machine learning AI systems for voice recognition go beyond that.
“With machine learning AI, banks or payment companies are able to listen to my voice and the machine picks up a specific tempo or tonality and determines how I prefer to receive information,” said Vermani. “Someone that likes facts and bullet-pointed lists will receive information differently than someone that needs a more verbose, eloquent approach.”
But there are worries here as well.
“As we roll out this technology, we have to be aware of the elements that aren’t as rosy and nice,” Vermani said. “We should be wary of machines that learn because they’ll probably be able to outlearn the initial learning they’ve been given.”
While security elements need to be factored in from the beginning, Vermani remains positive about the future of AI in payments and broader financial services.
As does CrossCues’ Rajah.
“There is a lot more work to be done before [financial services providers] are truly able to leverage AI in meaningful ways,” Rajah said. “But the application’s possibilities are pretty high, especially if regulators are reasonable about it.”