BIG Industries helps retail banks modernise in a customer-friendly way


Use case: PSD2 Event Processor with Apache Kafka and Flink

BIG Industries recently helped a retail bank with a new feature to make its mobile banking application more user-friendly, in line with Payment Services Directive 2 (PSD2). The updated application gives customers more security for their mobile banking, allowing them to set their own payment thresholds for their account transactions.

This new feature is available for all card and payment types and methods, including credit and debit card payments, online and direct transactions, envelope payments and more. Customers can also receive push notifications on their mobile app if they wish. These new features give customers more control over all payments made from their accounts. Srivatsan Sadagopan, Big Data Engineer at BIG Industries, explains what was unique about this project’s implementation and results.

Reporting unwanted activities

Srivatsan Sadagopan: "The bank’s customers previously had to sign up for a separate service to detect fraudulent activities and receive notifications about them. Now, customers can set their own personal spending limits, and all payment destinations are checked automatically, which is a huge leap forward in terms of security and an upgrade that customers will benefit from straight away."

"Another big advantage is the day-to-day convenience of not needing to double-check your monthly payments, such as the direct debit for your mortgage; push notifications let customers know the transaction has been processed."

Challenging project

Srivatsan talks about the challenges of this project: "Banks are acutely aware of the importance of customer friendliness – and that they need modern technology to achieve it. All banks are now rapidly jumping on board with this, so deadlines were tight and we had to collaborate in different teams with different skillsets. But we all worked really well together, which was essential for our productivity. Security restrictions within a bank are of course also extremely strict, so we adapted our working methods to remain compliant throughout the project."

Data sources

The bank’s data hub – a data lake where streaming data from the bank’s various departments is stored – serves as the basis for the application. Srivatsan: "The data comes from payment card transactions, cash deposits and withdrawals, online and mobile transactions, different currencies and failed transactions. It even includes the mobile app users’ personal settings, so there are lots of different data streams that need accurate filtering. The scope of this project alone is one reason why I found it such an interesting challenge."
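
The exact topology stays with the bank, but as a rough, hypothetical illustration of reading several of these streams at once (the topic names, broker address and record format below are all assumptions), a single Flink KafkaSource can subscribe to multiple transaction topics and drop records a given processing step does not need before the event-processing logic runs:

import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class TransactionIngestSketch {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // One source, several topics: card payments, cash deposits/withdrawals,
        // online and mobile transactions (topic names are illustrative).
        KafkaSource<String> transactions = KafkaSource.<String>builder()
                .setBootstrapServers("localhost:9092")          // assumption: local broker
                .setTopics("card-payments", "cash-movements",
                           "online-transfers", "mobile-payments")
                .setGroupId("psd2-ingest")
                .setStartingOffsets(OffsetsInitializer.latest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        env.fromSource(transactions, WatermarkStrategy.noWatermarks(), "transactions")
                // Example of early filtering; which records are kept depends on the
                // downstream processing step.
                .filter(record -> !record.contains("FAILED"))
                .print();

        env.execute("Transaction ingest (sketch)");
    }
}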

"We also had to take secondary data sources – such as account and contract details, and data collected about the various services that customers buy from the bank – into account."

Event Processor

The PSD2 Event Processor we developed determines whether to send a notification to the customer’s mobile app when a payment event exceeds the customer’s individually set threshold – an almost instantaneous process. The PSD2 event processor is right at the heart of the solution because the main goal is to notify customers within three seconds of a transaction. Srivatsan: "A separate team of programmers worked on each step of the event processor’s development."

"All the data arrives on an Apache Kafka cluster in different topics. We then use an application based on Apache Flink to process these data streams, and determine which app we need to send the notification to for each payment event. For example, if an enterprise customer has a retail account, the alert needs sending to both apps. The application then applies personal preferences, such as that pre-set spending limit, before sending a REST call to a notification engine that pushes the alert on to the user. All these events happen almost instantaneously."

"Banks always have legacy IT systems that are no longer relevant for the latest applications, but they are gradually moving away from them in their efforts to upgrade their processes, so innovative projects like this are built completely from scratch, independently from the old IT infrastructure to ensure it has no impact on the power of the new application."

Apache Flink

Srivatsan: "I fully immersed myself in Apache Flink during the project. Learning it on the job helped me to put it in context, and it was a very valuable experience. Apache Flink was the best choice for this application because it can handle a constant data flow almost instantly. It’s the technology we needed to be able to send immediate notifications to customers."

"Apache Flink is powerful and fast enough to combine and process all types of data straight away. The different data streams originate from several of the bank’s departments, which makes testing complex. It’s a pipeline of data that we have to fully process, so any change in the code needs to be tested end-to-end. This was a huge task for us in which we focused on optimizing the testing techniques."

Testing

Srivatsan explains: "A real-time stream processing application like this, which is intended for the bank’s end customer, really does need to be thoroughly tested. We tested with different types of end customer in mind, because a private individual will use the app differently from a retail company or larger organization. We also ensured frequent coordination between the various teams for the testing," he continues, "and selected a few people from each technical team for regular syncing meetings. The management team also had regular meetings, so we kept a good overview at all levels. The application is currently in the beta phase, so we’re excited about the outcome and very happy with the timing."
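
The team’s own test setup isn’t described in detail, but as one hedged example of how such threshold logic can be exercised per customer type without standing up Kafka, Flink’s test harness utilities (from the flink-streaming-java test artifacts, with JUnit and AssertJ assumed as test dependencies) can drive the hypothetical ThresholdAlert function from the earlier sketch directly:

import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.streaming.util.KeyedTwoInputStreamOperatorTestHarness;
import org.apache.flink.streaming.util.ProcessFunctionTestHarnesses;
import org.junit.jupiter.api.Test;

import static org.assertj.core.api.Assertions.assertThat;

class ThresholdAlertTest {

    @Test
    void paymentAboveLimitTriggersAlert() throws Exception {
        // Wrap the function in a keyed two-input test harness; both inputs are keyed by customer id.
        KeyedTwoInputStreamOperatorTestHarness<String, String, String, String> harness =
                ProcessFunctionTestHarnesses.forKeyedCoProcessFunction(
                        new ThresholdAlertJob.ThresholdAlert(),
                        payment -> payment.split(";")[0],
                        preference -> preference.split(";")[0],
                        Types.STRING);

        // A customer sets a 100.00 limit, then makes two payments.
        harness.processElement2("cust-1;100.00", 1L);
        harness.processElement1("cust-1;50.00", 2L);   // below limit: no alert expected
        harness.processElement1("cust-1;250.00", 3L);  // above limit: alert expected

        assertThat(harness.extractOutputValues()).hasSize(1);
    }
}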
