The swell of innovation in artificial intelligence may wash over smaller banks and fintechs, unless regulators and the industry work together to create a competitive environment.
That was one of the takeaways at DC Fintech Week, which took place in Washington, D.C., and wrapped up on Wednesday. Several panels covered the regulatory challenges and opportunities that come with AI, including interviews with Federal Reserve Vice Chair for Supervision Michael Barr, Acting Comptroller of the Currency Michael Hsu and New York State Department of Financial Services Superintendent Adrienne Harris.
“I liked that the regulators are thinking hard about AI technologies and trying to understand how they work,” said Poorani Jeyasekar, a director at Klaros Group who attended the conference.
Several panelists expressed concern about the challenges facing startups and smaller firms that are dabbling in emerging technologies such as AI, including budget constraints and hiring.
Elizabeth Kelly, who helped start the financial planning and investment management firm United Income and sell it to Capital One Financial before joining the White House as special assistant to the president for economic policy, pointed out that she was once part of a cash-strapped fintech herself.
“It’s incredibly expensive to develop large language models that require access to data, computing power, and semiconductors — all of which are limited and costly resources,” she said on a panel about generative versus legacy AI.
As a startup founder, “you don’t have that many options and it can be constraining to your business model, which is why we need a robust and competitive ecosystem,” she added.
The White House recently released an executive order that calls on government agencies to regulate AI more closely, with a focus on explainability, intellectual property and avoiding bias.
Developers of lending and other banking tools powered by artificial intelligence say their firms are well positioned to weather regulatory changes, but remain mindful of how government agencies might respond.
“The current asymmetry in the AI industry stems from entrenched disparities in critical areas such as data access, specialized expertise, and computational resources,” said Claudio Calvino, senior managing director in the forensic and litigation consulting practice at FTI Consulting. He says generative AI has sharpened the focus on these imbalances because of its increasing adoption and media buzz.
Tim Massad, a research fellow at the Kennedy School of Government at Harvard University and former chairman of the U.S. Commodity Futures Trading Commission, echoed Kelly’s points on a panel about explainability.
“You’ll have firms using models when they don’t fully understand them,” he said. “Technology will give an advantage to incumbents and large organizations with lots of data.” Firms with limited budgets will also struggle to attract staff.
Moreover, when regulators try to keep up with the pace of these technologies, “that can easily become a Whac-A-Mole strategy,” said Massad. “In addition to having clear access to the model and the [supervised] firm’s internal audits and testing records, they have to think about other ways to leverage resources.”
One way could be requiring regulated entities to show they had received certain third-party certifications or testing, or that their providers met certain standards.
“Maybe we need self-regulatory organizations to help us in this area,” said Massad. “They have been a large part of the securities and derivatives framework; maybe that’s needed here.”
Jeyasekar was struck by Harris’s comment on another panel about the balance regulators must strike: staying close enough to these technologies to set up guardrails without crushing innovation in its early days.
Harris’s office has regulated virtual currencies since 2015 and updated its regulations as recently as April, announcing higher standards for capitalization, cybersecurity protection and anti-money-laundering protocols.
“Too often we think you can protect consumers and markets or you can build a robust marketplace,” said Harris. “I’ve never felt those things were mutually exclusive.”
“Compared to crypto, AI has more fans and interest on the regulators’ side,” said Jeyasekar.
Other panelists detailed their use cases for AI.
“We think a lot about the opportunity to take unstructured data and structure it using AI,” said Steve Holden, the senior vice president of single-family analytics and modeling at Fannie Mae, on the panel with Kelly.
As one example, in 2021 Fannie Mae automated a mortgage underwriting process that extracts rental payment data from applicants’ bank statements and incorporates it into its evaluation. He said the agency has since approved more than 5,000 borrowers using that information.
Ardent Venture Partners, which invests in business-to-business fintech, recently built a model it calls Ardent GPT. The tool, which uses generative AI and large language models, helps winnow the large number of founders and companies the firm studies into more manageable lists based on the criteria Ardent is looking for.
In its bid to experiment with the same technology it is regulating, the New York State Department of Financial Services is preparing a chatbot for its own website, where visitors can ask questions and receive answers in plain language rather than digging through pages of regulatory guidance.
Harris had two pieces of advice for regulated entities experimenting with emerging technologies.
One is, “Don’t ever surprise your regulator,” she said. “If I read about it before I hear about it from you, we are starting in a bad place.”
The other is for so-called disruptors to check themselves for “disdain” about the industry they’re disrupting, an attitude that Harris feels has been changing as the fintech industry matures.
“Don’t be afraid to incorporate the deep expert, the stodgy lawyer, the former regulator now gone to the private sector, into your thinking,” she said.