Jim's Journal

Data Exchange to Shared Sessions – Where Treasury Management (and Digital Banking) Wants to Be Now

Problem/Solution
Banks exchange data with their commercial customers to manage their finances and execute payment transactions. The goal: realtime connections between companies and their financial institutions, so that the information exchange is realtime, integrated, robust, and efficient. And dare I say, it just happens: embedded banking.

I had the benefit of coming into online commercial banking relatively early in the market. I blogged about that a while back in my post Why and How we went from SaaS to Cloud, Services to APIs – It’s just Business! That was more about the architecture of the systems. This post ties back to direct connections between companies and their financial institutions (FIs). One of the key innovations was data exchange – stated simply, FIs sharing data about customer accounts so that one of them could aggregate that data and deliver it, eliminating the need to go to each one (some of which didn’t have treasury management offerings – called cash management, back then). This was 1987 – let that sink in. And this wasn’t some unique and leading-edge thing – well, it was for most of the market – for commercial banking and cash management professionals it was a given. Pretty large service bureaus grew out of this opportunity, some of which you know today, including ADP (yes, the payroll company). Initially we had direct connections via telco networks, even PCs with modems, and eventually the internet. But we stuck with batch and never considered leveraging other industry innovations in realtime data connections – message queuing and eventually data streaming – that we were already deploying between bank systems to get at the data in the first place. Why then did it take 30 years to finally change so much that we are actually moving to a new approach?

My hypothesis is that we are so dependent on our eyes that we forget our mind’s eye – yes, I just went philosophical, but I believe it is true. Every system I’ve sold since then has shifted from “prompt/response” to visual representation in browsers and on mobile phones. There was a progression from fast, immediate data to asynchronous data exchange from a server to a client; from a PC running DOS dialing up via modem, to a desktop running Windows/Mac, to a browser running HTML applications. But we stuck with batch feeds and focused on formatting for the screen. An old friend who built the second major treasury management system I sold called HTML “prompt/response with a pretty front end.” The key issue is that we continued to exchange data in legacy formats based on 20-year-old technology: BAI-formatted files that had to be downloaded, translated, uploaded, and processed. Sure, we experimented with XML and some ISO formats, but in the end BAI still lives, and I know digital banking product managers who are still coding to get to it. All the innovation was in the visualization of data, not the use of it. When I did demos, customers just wanted to see more reports, graphics, forms, and fancy screens. Trying to explain data streaming or shared sessions is damn hard.
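To make the legacy concrete, here is a minimal sketch of what consuming one of those BAI balance files looks like. It assumes a heavily simplified BAI2 file – real files add 88 continuation records, funds-type fields, and control totals in the 49/98/99 trailers that this ignores, and the sample data is invented for illustration.

```python
# Minimal sketch of parsing a (simplified) BAI2 balance file.
# Record code 03 opens an account section; 16 is a transaction detail.
def parse_bai2(text):
    """Return {account_number: [(type_code, amount), ...]}."""
    accounts = {}
    current = None
    for line in text.strip().splitlines():
        fields = line.rstrip("/").split(",")
        code = fields[0]
        if code == "03":                    # account identifier record
            current = fields[1]
            accounts[current] = []
        elif code == "16" and current:      # transaction detail record
            type_code, amount = fields[1], int(fields[2])
            accounts[current].append((type_code, amount))
    return accounts

sample = """\
01,122099999,123456789,150623,0200,1,65,,2/
02,031001234,122099999,1,150622,2359,USD,2/
03,0975312468,USD,010,500000,,,/
16,165,1500000,,,,ACH CREDIT/
16,451,100000,,,,WIRE DEBIT/
49,1600000,4/
98,1600000,1,6/
99,1600000,1,8/
"""

print(parse_bai2(sample))
# {'0975312468': [('165', 1500000), ('451', 100000)]}
```

Every consumer of the file re-implements some version of this, which is exactly the translate-upload-process cycle the post is complaining about.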

It is much more appealing to show a customer a graph than to try to explain this:

// Execute query in KSQL to filter the stream
// (assumes a customers table keyed by customerid to join against)
ksql> CREATE STREAM orders (ORDERID string, ORDERTIME bigint…) WITH (kafka_topic='orders', value_format='JSON');
ksql> CREATE STREAM platinum_emails AS SELECT * FROM orders o JOIN customers c ON o.customerid = c.customerid WHERE c.client_level = 'PLATINUM' AND o.state = 'CONFIRMED';

Source: Building a Microservices Ecosystem with Kafka Streams and KSQL

Innovation in data exchange began to take hold on the consumer banking side. Most of us didn’t start using our computers to manage our finances until Intuit delivered Quicken – and then started using its batch data exchange format, OFX, developing the standard we all rely on even today. Smart software companies did begin to connect corporate treasury management systems to banks with some level of realtime connection, and we got creative with automated batch feeds and scheduled delivery/dial-in. And account aggregation became an industry – moving beyond data exchange in treasury to consolidators screen-scraping consumer account data, then connecting directly to digital banking platforms to get the data. It all remains asynchronous, though. The industry is on the brink of change now, and there are some really innovative companies starting to take this on.
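Even that OFX data exchange stays file-shaped. As a sketch: OFX 2.x responses are XML (1.x was SGML-flavored and needs a tag-closing pass first), so with a trimmed-down statement response – a real one also carries signon and status blocks – the standard library can pull transactions out. The sample response here is invented for illustration.

```python
# Sketch: extracting transactions from a trimmed OFX 2.x (XML)
# bank statement response using only the standard library.
import xml.etree.ElementTree as ET

OFX_RESPONSE = """<OFX>
  <BANKMSGSRSV1><STMTTRNRS><STMTRS>
    <CURDEF>USD</CURDEF>
    <BANKTRANLIST>
      <STMTTRN>
        <TRNTYPE>CREDIT</TRNTYPE>
        <DTPOSTED>20230615</DTPOSTED>
        <TRNAMT>15000.00</TRNAMT>
        <NAME>ACH VENDOR PAYMENT</NAME>
      </STMTTRN>
      <STMTTRN>
        <TRNTYPE>DEBIT</TRNTYPE>
        <DTPOSTED>20230616</DTPOSTED>
        <TRNAMT>-1000.00</TRNAMT>
        <NAME>WIRE OUT</NAME>
      </STMTTRN>
    </BANKTRANLIST>
  </STMTRS></STMTTRNRS></BANKMSGSRSV1>
</OFX>"""

root = ET.fromstring(OFX_RESPONSE)
txns = [
    (t.findtext("TRNTYPE"), float(t.findtext("TRNAMT")), t.findtext("NAME"))
    for t in root.iter("STMTTRN")
]
print(txns)
# [('CREDIT', 15000.0, 'ACH VENDOR PAYMENT'), ('DEBIT', -1000.0, 'WIRE OUT')]
```

The point is that this is still request, download, parse – asynchronous pull, not a live stream.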

Aggregators of aggregators are now developing tools to make those connections between banks more efficient. New technology to stream data is finally coming to the digital banking space, so that we can not only connect consumer and business finance tools to banks directly, but also simplify those connections. I’m waiting for today’s developers to start creating shared sessions between them. I’ve talked about the shift to microservices-based cloud technology and the value of shared sessions between microservices-based applications. Data streaming between companies makes the “machine to machine” metaphor even more powerful. It is incrementally more powerful and will happen. Shared sessions would be game-changing. I started to consider this when I got back to blogging a few years ago, in my post Microservices and other shifts in techspeak.

API access is great, but when these systems become shared application sets that allow financial activities to become one process between companies and financial institutions – that will be true embedded banking, and I think it is the way of the future. Today you can build applications in the cloud, on a true microservices-based architecture, that create shared sessions between applications, people, and even devices. When we get there in digital banking – connecting customers and their FIs – we will have true embedded banking. It will accelerate BaaS and move beyond APIs, data streaming, and our obsession with visualizing solutions, to banking that just happens. Elegantly and without effort.
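The shared-session idea can be sketched as a toy: the company’s treasury app and the bank’s payment engine both subscribe to one event stream and react to each other’s events in a single continuous process, instead of handing off batch files. In production the shared log would be Kafka topics or similar; here a plain in-process event bus (entirely hypothetical names) stands in for it.

```python
# Toy illustration of a "shared session" between a company and its bank:
# both sides subscribe to one ordered event log and react to each
# other's events, so a payment is one process, not a file hand-off.
from collections import defaultdict

class EventBus:
    def __init__(self):
        self.handlers = defaultdict(list)
        self.log = []                        # shared, ordered event log

    def subscribe(self, event_type, handler):
        self.handlers[event_type].append(handler)

    def publish(self, event_type, payload):
        self.log.append((event_type, payload))
        for handler in self.handlers[event_type]:
            handler(payload)

bus = EventBus()

# Bank side: settle payment requests as they arrive.
def bank_payment_engine(payment):
    bus.publish("payment.settled", {**payment, "status": "SETTLED"})

# Company side: post settlements to the ledger without polling
# or downloading a batch file.
ledger = []
def treasury_app(settlement):
    ledger.append(settlement)

bus.subscribe("payment.requested", bank_payment_engine)
bus.subscribe("payment.settled", treasury_app)

# One payment flows end to end in one shared process.
bus.publish("payment.requested", {"id": "PMT-1", "amount": 250000})
print(ledger)
# [{'id': 'PMT-1', 'amount': 250000, 'status': 'SETTLED'}]
```

Replace the in-process bus with a durable stream shared across company boundaries and you have the machine-to-machine, banking-that-just-happens picture the post is describing.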
