How is generative AI democratising the availability of banking services?

Contributed

This content is contributed or sourced from third parties but has been subject to Finextra editorial review.

Generative AI’s potential is vast, and there are opportunities for this technology to improve customer experience, enhance decision-making, increase colleague productivity and streamline processes. However, there is a need for oversight and ethical considerations – as there is with all types of AI – such as addressing inherent bias in data and ensuring fairness and explainability.

Finextra spoke to Microsoft’s banking industry advisor Abhi Sharma, EMEA financial services industry advocate Marcus Martinez, and go-to-market lead of Microsoft Cloud for financial services Daniel Campbell about the importance of responsible and ethical AI use in financial services.

They explained that, in the first instance, banks should use generative AI across fact-based information (e.g., product documents, policies and procedures) and, during initial stages of adoption, output from the models should be reviewed by colleagues. Banks have extensive amounts of data, and the rate of data generation and capture continues to increase. While generative AI can enable banks to easily interact with this data, a key challenge noted across the industry is understanding and mapping data quality, security and privacy.

Once the data estate is in order and data quality and security are ensured, generative AI can help financial institutions elevate their customer service and employee experience, and democratise the availability of banking services.

Generative AI offers what traditional AI technology couldn’t

One of the key benefits of generative AI, and one of the reasons why institutions are exploring it, is that Gen AI models don’t need to be trained like traditional AI models. When training a traditional AI model, Sharma highlights, “you need to obtain a data set, you need to label that data set, and then you need to train machine learning models. In comparison, generative AI is a lot more plug-and-play. There are obviously some nuances that you have to work around, but the ability to start testing model output quicker is one of the main reasons financial institutions are exploring this technology.”
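To make that contrast concrete, here is a minimal sketch, assuming scikit-learn for the traditional route and an OpenAI-compatible chat endpoint for the generative one; the model name, example queries and labels are purely illustrative and not drawn from the interviewees' work.

```python
# Traditional AI: collect a labelled data set, then train a task-specific model.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

labelled_queries = ["I lost my debit card", "What is my current mortgage rate?"]
labels = ["card_services", "mortgages"]  # every example needs a human-provided label

classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
classifier.fit(labelled_queries, labels)  # a training run is required before any output

# Generative AI: a pre-trained model can be prompted directly, with no labelling step.
from openai import OpenAI  # assumes the openai package and an API key are configured

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user",
               "content": "Which banking team should handle: 'I lost my debit card'?"}],
)
print(response.choices[0].message.content)
```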

Martinez adds: “What makes generative AI so special is that it’s a type of AI that doesn’t require a lot of additional training to deliver results and can deal with very unstructured data including audio, images and videos. So considering contracts, call centre conversations, or any video content as an example, it can use unstructured data and make sense of it. That is really powerful from a productivity perspective.”

Compared to traditional AI models, generative AI is conversational, which means that the barrier to entry for end users is a lot lower and it has the potential to offer productivity gains and enhanced decision-making abilities across the whole organisational structure. Whether it’s coding or customer service — generative AI can drive quicker and more sophisticated insights and expand the scope of people who can access this information to make complex business decisions.

Generative AI: Use cases in banking

So how exactly does generative AI democratise the availability of banking services? Sharma, Martinez, and Campbell highlight three example use cases that banks are prioritising.

One of the biggest impacts generative AI has on financial institutions, in the short term, is the bottom line and effectiveness of their customer service department. Pre-AI, customer service agents would often need to reference multiple knowledge bases, product information documents and specific processes to ensure compliance. With generative AI, the agent can interact with a range of these documents to quickly resolve customer queries and navigate calls.
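In practice, letting an agent (or an assistant sitting alongside them) interact with a range of documents usually means retrieval: the most relevant policy or product passages are found first and then passed to a generative model as context. The sketch below shows only the retrieval step, using simple TF-IDF similarity; the document names and contents are invented, and a production system would typically use embeddings and add the generation step on top.

```python
# Minimal retrieval sketch: given a customer query, surface the most relevant
# internal documents (product sheets, policies) to ground the agent's answer.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = {
    "card_replacement_policy": "Lost or stolen cards are blocked immediately and replaced within 5 working days.",
    "overdraft_product_sheet": "Arranged overdrafts up to 2,000 are available subject to eligibility checks.",
    "complaints_procedure": "Complaints are acknowledged within 3 days and resolved within 8 weeks.",
}

vectoriser = TfidfVectorizer()
doc_matrix = vectoriser.fit_transform(documents.values())

def retrieve(query: str, top_k: int = 2):
    """Return the top_k document names most similar to the query."""
    query_vec = vectoriser.transform([query])
    scores = cosine_similarity(query_vec, doc_matrix)[0]
    ranked = sorted(zip(documents.keys(), scores), key=lambda x: x[1], reverse=True)
    return ranked[:top_k]

print(retrieve("My card was stolen, what happens now?"))
```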

Martinez explains: “While the starting point is interacting with documents, due to the reasoning power of generative AI, the future state can be a lot more exciting. For example, you can have a conversation without a pre-defined structure. You can say: ‘Should I start my mortgage process today?’ or ‘how much should I save for my retirement?’, which are very open and complex questions that traditional AI would struggle to handle. This is not a mainstream use case yet, but that's the direction of travel: easily interacting with the bank or institution using natural language, making complex products and journeys more accessible to customers.”

Illustrating this direction of travel: after launching its AI chatbot in January, Klarna recently announced that it does the equivalent work of 700 full-time employees and has led to a 25% decrease in repeat inquiries, as well as bringing resolution times down to under two minutes (compared with 11 minutes previously).

Following up on Martinez’ point, Campbell adds that generative AI models are able to spot trends and generate insights from call centre conversations much quicker than human agents could. “The list of use cases goes on and on,” he adds. “Summarising conversations and identifying next best actions are big value drivers within the customer service space. Moving away from customer service, generative AI has already unlocked significant value for banks by improving the speed of software development, reducing repetitive tasks and analysing code for adherence to coding style guidelines, ensuring consistency and readability.”
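A common pattern for the summarisation and next-best-action use case Campbell describes is to constrain the model's output to a short summary plus an action chosen from an approved list, with a colleague reviewing the result during early adoption. The sketch below only builds such a prompt; the transcript and action names are hypothetical.

```python
# Sketch of the prompt pattern for call summarisation and next-best-action
# suggestions. The transcript and allowed actions are invented; in practice
# the prompt would be sent to whichever chat model the bank has approved,
# and the output reviewed by a colleague during early adoption.
ALLOWED_ACTIONS = ["send_replacement_card", "schedule_callback", "escalate_to_fraud_team"]

def build_review_prompt(transcript: str) -> str:
    return (
        "You are assisting a bank call-centre agent.\n"
        "Summarise the call in three bullet points, then recommend one next best "
        f"action chosen only from this list: {ALLOWED_ACTIONS}.\n"
        "If none apply, answer 'no_action'.\n\n"
        f"Transcript:\n{transcript}"
    )

example_transcript = "Customer: My card was declined abroad... Agent: I can see a block was applied..."
print(build_review_prompt(example_transcript))
```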

As a final use case, Sharma highlights the possibilities of financial upskilling and delivering good customer outcomes as per the FCA’s Consumer Duty standards. The regulator states that banks need to act in good faith, avoid foreseeable harm, and enable and support their customers to achieve their financial objectives. Generative AI will be able to support banks in these efforts.

“What banks are trying to do is unlock a level of understanding by offering a conversational interface where you chat to bank policies, procedures or key investor documents,” he explains. “Financial upskilling is really important for the next generation of consumers. When it comes to financial products, consumers have more choice than ever before. In addition, with the rise of embedded banking, the way consumers engage with these products is also changing. As such, providing engaging and natural language based upskilling initiatives is imperative to help consumers plan for their future and deal with financial matters across different life stages.”

Addressing inherent bias: How banks can responsibly roll out generative AI

The crux of any AI is the data it is built on. Generative AI models are pre-trained and then adapted to a bank’s needs, and to give reliable output, the model needs to be trained on data that reflects the broader society the bank seeks to serve. Yet the challenge lies in inherent bias.

Martinez explains: “You need to bring synthetic data to train and fine tune the model so it becomes balanced. For example, if someone is applying for a card or mortgage and doesn't fit the overall profile of the bank’s typical customer, it’s not a problem because the model is trained using additional synthetic data which should balance the model and deliver relevant and unbiased outcomes.

“I think that’s where the rubber hits the road: if we think banks have a data problem today, that will be magnified massively because they will need large volumes of synthetic data to periodically retrain their generative AI models. Synthetic data is something that's not mainstream in the Gen AI conversation today, but it's quickly becoming something very, very important to make large and small language models deliver fair results.”

Banks have been using advanced analytics and machine learning for a long time, so the question of responsibility is not a new issue. Yet recent developments, with the launch of ChatGPT and the events at OpenAI, have led to a new focus on the topic of inherent bias and responsible AI.

Campbell echoes Martinez’ previous point and explains why synthetic data can be crucial for banks to help mitigate bias and risk in their models. “Part of the reason that you have bias in models is because you have incomplete data about customers. And one potential use case of AI is being able to fill out data sets using synthetic data, filling the gaps in your data set so you get better results.”
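As an illustration of filling gaps with synthetic records, the sketch below rebalances a skewed tabular data set using SMOTE, a classical oversampling technique from the imbalanced-learn library. This is only a stand-in for the idea: the interviewees are pointing at generative models producing richer synthetic data, and the features and figures here are invented.

```python
# Minimal sketch of rebalancing a skewed applicant data set with synthetic
# samples, using SMOTE as one classical illustration of the general idea.
import numpy as np
from imblearn.over_sampling import SMOTE

rng = np.random.default_rng(0)
# 200 applicants matching the bank's "typical" profile, 10 who do not (age, income).
X_majority = rng.normal(loc=[35, 45_000], scale=[8, 9_000], size=(200, 2))
X_minority = rng.normal(loc=[22, 18_000], scale=[3, 4_000], size=(10, 2))
X = np.vstack([X_majority, X_minority])
y = np.array([0] * 200 + [1] * 10)  # 1 = under-represented applicant profile

# Generate synthetic minority records until both classes are the same size.
X_balanced, y_balanced = SMOTE(random_state=0).fit_resample(X, y)
print(np.bincount(y), "->", np.bincount(y_balanced))  # [200 10] -> [200 200]
```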

Another crucial element of the responsible rollout of generative AI is explainability, along with the ongoing effort to ensure the model remains fair after it has been deployed. Martinez explains that banks can benefit from taking a similar approach to generative AI as they do to risk management. “It’s like having a very sophisticated Formula 1 car: you have to do the pitstop, you have to check the pressure on the tyres and you have to regulate the engine. It will be a continuous effort.”

Sharma concludes: “Bias is something that does not just exist within AI, it also exists with humans. As we, and the societies we live in, are getting less biased, we're getting better at identifying it and being able to reflect this improved understanding when we develop and deploy AI. Can I be certain that bias will be removed for good anytime soon? Probably not. But will we have a better understanding of it? Yes.”

How generative AI can help serve vulnerable customers

Democratising banking services is not limited to addressing inherent bias; it is also crucial to resolve issues around providing financial services to vulnerable customers. Yet generative AI will not be able to address this issue if banks are not able to identify vulnerable customers in the first place.

There are certain types of vulnerabilities that are very hard to spot, and neither humans nor AI will be able to identify them accurately in the first instance. However, there are aspects where generative AI will make a difference in improving access to financial services for vulnerable customers.

One of those aspects is financial literacy. Campbell highlights: “If you think about the underbanked and underserved population, there is the promise of generative AI to offer services to those customers that they might not have normally had access to.”

While more financially fluent customers can more easily access most banking services, he explains, generative AI offers a way to get financial literacy resources to those customers who would ordinarily struggle to access them.

“There are four key drivers of vulnerability with multiple indicators within each category,” Sharma adds. “There can be instances where certain customers with a lower financial capability (e.g., due to numeracy or language skills) are likely to be more comfortable with asking questions in natural language to Gen AI enabled bots or even robo-advisors. Customers are likely to have a more open conversation knowing that a bot will not judge their questions.”

And while we’re not there yet, Martinez highlights that generative AI has the potential to identify vulnerabilities where humans could not. “As humans, we have an inherent limitation in our brains on how many variables we can hold at any given time. That doesn't really apply to generative AI.

“A customer recently told me that they sometimes struggle to identify vulnerable customers because it’s not evident, for example with mental health issues. A human might wonder how they could know this, but it could be possible if you have a very large dataset with different signals that tell you: This person might be facing some kind of personal challenge. I think the technology will allow us to be more precise and to be more proactive in identifying these situations.”
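Martinez’ point about humans holding only a few variables at once can be made concrete with a toy example: combining many weak signals into a single flag that prompts a colleague to check in. Everything below (signal names, weights, threshold) is hypothetical, and any real deployment would need consent, governance and fairness review; in practice a model would learn such patterns from data rather than rely on hand-set weights.

```python
# Toy sketch: combine many weak, hypothetical signals into a single flag for a
# trained colleague to follow up on. Nothing here is a real scoring scheme.
SIGNAL_WEIGHTS = {
    "missed_payments_last_90d": 0.30,
    "sudden_drop_in_income": 0.25,
    "late_night_support_contacts": 0.15,
    "repeated_password_resets": 0.10,
    "opted_into_gambling_block": 0.20,
}
REVIEW_THRESHOLD = 0.5

def needs_human_review(signals: dict[str, bool]) -> bool:
    """Return True if the weighted signals suggest a colleague should check in."""
    score = sum(weight for name, weight in SIGNAL_WEIGHTS.items() if signals.get(name, False))
    return score >= REVIEW_THRESHOLD

print(needs_human_review({"missed_payments_last_90d": True, "sudden_drop_in_income": True}))  # True
```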

Democratising financial services through data and technology

Generative AI has the potential to improve customer experience, increase the availability of banking services to vulnerable customers, enhance decision-making and streamline processes. Yet these possibilities are dependent on the underlying data available to the AI.

Financial institutions already have a lot of data available to them, yet data silos continue to stand in the way of making effective use of it. The more effort an organisation puts into its data estate to make it ready for this next AI wave, the higher the value that can be unlocked.

In a complex industry like financial services, the introduction of generative AI is a significant moment that, when deployed responsibly, will help support the democratisation of financial services.
