In A Decade of Digital Transformation in 12 Months, 46 C-suite executives spoke with PYMNTS for its Q2 eBook on what the world will look like as recovery rolls on and the next iteration of normal rolls out. In this excerpt, David Excell, founder of Featurespace, discusses how the pandemic changed the rules of our daily lives, and how adapting to the digital transformation is key to business survival.
One of the most fundamental impacts of the global crisis was that, within a matter of days, the rules we used to manage our daily lives changed. Many of us started working from home, schools went virtual and our options to socialize were dramatically reduced. During this seismic shift, most of the infrastructure we rely on stood up to the test: the internet and our payment platforms remained available. As one example of the shift, at Featurespace we saw an 81 percent increase in card-not-present transactions on April 1, 2020, compared to a month earlier.
Fighting financial crime during the pandemic brought its own challenges. Operational teams had to adapt and work from home, processes had to be updated, and as governments introduced different schemes to support those most affected, we saw large outflows of public funds. Fraudsters also took advantage of the pandemic, with increases in check fraud, scams and first-party fraud.
Behind the scenes, one element of keeping our payment infrastructure secure was the set of fraud strategies that accept or decline transactions based on predicted risk. But these strategies were founded on historical behavior that no longer matched the behaviors observed during the pandemic. This became the moment of truth for adaptive machine learning models: How quickly could they adapt to ensure that genuine customers weren't blocked when they most needed their funds, especially when legitimate purchases may have been unusual for that customer, such as a first online grocery order?
From the models Featurespace had in operation, we saw that our underlying behavioral profiles began adapting to the new behaviors within 48 hours of lockdowns being implemented around the globe (it was this observation that inspired our case study with TSYS). This enabled fraud detection rates to remain stable throughout the pandemic, ensuring that consumers could still use their banking products, that fraudsters couldn’t exploit a new weakness and that fraud operations teams weren’t inundated as they adjusted to working from home.
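To make the idea of an adaptive behavioral profile concrete, here is a minimal sketch of one common technique: an exponentially weighted running mean and variance of a customer's spend, where each new transaction both receives a risk score and then shifts the profile toward the new behavior. This is purely illustrative; the class, parameter values and scoring formula are assumptions for the example, not Featurespace's actual model.

```python
# Illustrative adaptive behavioral profile (hypothetical, not a vendor model):
# an exponentially weighted mean/variance of transaction amounts. Unusual
# spend scores high at first, but repeated new behavior pulls the profile
# toward it, so genuine "new normal" purchases stop looking risky.

class AdaptiveProfile:
    def __init__(self, alpha=0.15):
        self.alpha = alpha      # higher alpha -> faster adaptation
        self.mean = None        # running estimate of typical amount
        self.var = 1.0          # running estimate of spread

    def risk_score(self, amount):
        """Standardized distance of this transaction from learned behavior."""
        if self.mean is None:
            return 0.0          # no history yet: neutral score
        return abs(amount - self.mean) / (self.var ** 0.5 + 1e-9)

    def update(self, amount):
        """Fold the observed transaction into the profile (EWMA update)."""
        if self.mean is None:
            self.mean = amount
            return
        delta = amount - self.mean
        self.mean += self.alpha * delta
        self.var = (1 - self.alpha) * (self.var + self.alpha * delta * delta)

profile = AdaptiveProfile()
for amt in [20, 25, 22, 18]:    # pre-lockdown card-present spending pattern
    profile.update(amt)

first_online_grocery = 140      # unusual but genuine lockdown purchase
score_before = profile.risk_score(first_online_grocery)
profile.update(first_online_grocery)
for amt in [130, 150, 145]:     # the new behavior repeats over a few days
    profile.update(amt)
score_after = profile.risk_score(140)
# score_after is far lower than score_before: the profile has adapted,
# so the customer is no longer blocked for their now-routine purchases.
```

The design point mirrors the pandemic experience described above: a model anchored to static historical rules would keep declining the new behavior, while a profile that updates with each transaction converges on the new pattern within a handful of observations.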
Over the last 12 months, we've also seen the rise of movements for greater racial and gender equity and fairness. Ensuring access for everyone, including the underbanked, is an ongoing challenge as we continue to digitize the payments ecosystem. And as we continue down the path of digitization, and inherently automation, we must ensure that the logic behind those automated decisions is unbiased. This is particularly acute for machine learning systems.
Machine learning systems identify patterns in the data provided to them – and unfortunately, when applied naively, many datasets include biases from historical decisions. When machine learning is introduced into the payments ecosystem, model fairness must be included as a key requirement, and the data used to test and validate must be carefully selected to ensure that past biases are not transferred into our “next-gen” digital futures, where only the adaptive will survive.
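One simple way to make model fairness a testable requirement, as described above, is to measure whether a model's decline rate differs materially across customer groups, sometimes called a demographic parity check. The sketch below is a hypothetical illustration; the group labels, data and acceptance threshold are invented for the example and are not a specific vendor's methodology.

```python
# Illustrative fairness check (hypothetical): compare a candidate model's
# decline rates across groups on held-out validation data. A large gap is
# one signal that historical bias has leaked into the training data.

def decline_rate(decisions):
    """Fraction of transactions the model declined (True = declined)."""
    return sum(decisions) / len(decisions)

def parity_gap(decisions_by_group):
    """Max difference in decline rates across groups; 0.0 means parity."""
    rates = [decline_rate(d) for d in decisions_by_group.values()]
    return max(rates) - min(rates)

# Toy validation data: per-group decline decisions from a candidate model.
decisions = {
    "group_a": [True, False, False, False, False, False, False, False],
    "group_b": [True, True, True, False, False, False, False, False],
}

gap = parity_gap(decisions)   # 0.375 - 0.125 = 0.25
THRESHOLD = 0.1               # hypothetical acceptance criterion
if gap > THRESHOLD:
    print(f"parity gap {gap:.2f} exceeds {THRESHOLD}: review training data")
```

Demographic parity is only one of several fairness definitions; the broader point is that whichever metric is chosen, it belongs alongside detection rate as a gating requirement before a model is deployed.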