Knight Capital, an American global financial services firm, lost $440 million thanks to a glitch in its trading software. In financial services, the true cost of a software bug can be gargantuan since you’re dealing directly with money. In 2017 alone, software failures affected 3.6 billion people and caused $1.7 trillion in financial losses.
A typical finance application encompasses a complex integration of multiple systems. For example, a Loan processing System or a Credit Card Origination System will be integrated with multiple other systems like Client Management System, Documents Management System, eKYC, Risk, Pricing and so on. QA plays a major role in ensuring that all the systems work together in tandem to achieve the intended business functionality.
The QA team typically dons the hat of end users while testing applications, be it automation or manual testing. But the sensitivity of financial applications calls for a QA team that can mimic hackers, with the zeal to break and tamper with the system, in order to surface and mitigate vulnerabilities.
At Ideas2IT, we have extensive experience in testing finance applications and delivering large-scale projects. Leveraging that experience, we will cover some points on how QA teams can think like hackers and build a strong defensive moat.
Handling Sensitive Data
Finance systems like Loan or Credit Card Origination Systems deal with personally identifiable information (PII) and other data that needs to be handled sensitively. The QA team should test:
- Encryption: Verify that client data is saved in the database in an encrypted format, especially sensitive data like SSNs.
- End-to-end encryption: Verify that no unencrypted PII is sent over email or any other communication mechanism.
- Data exposure: Check for URLs that expose sensitive data and are therefore vulnerable to attack. For example, when the user navigates to pages like change password, change phone number, or profile update, the data should not appear explicitly in the URL where it is visible to the end user. Also, any API call should return only the data the caller actually needs; no extra information should be returned from the API.
Though the second and third scenarios look self-evident and indisputable, many QA teams actually end up missing these use cases.
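The encryption and data-exposure checks above can be partially automated. Below is a minimal, illustrative sketch: the SSN pattern check, the allowed-field whitelist, and the sample payloads are all assumptions for demonstration, not any specific product's schema.

```python
import re

# Assumed API contract: the only fields this caller actually needs
ALLOWED_FIELDS = {"application_id", "status", "credit_limit"}

def looks_like_plaintext_ssn(value: str) -> bool:
    """A value stored encrypted should never match the raw SSN pattern."""
    return bool(re.fullmatch(r"\d{3}-?\d{2}-?\d{4}", value))

def assert_no_extra_fields(api_response: dict) -> None:
    """Fail if the API returns anything beyond what the caller needs."""
    extra = set(api_response) - ALLOWED_FIELDS
    assert not extra, f"API leaks unexpected fields: {extra}"

# Example checks against fake stored/returned data
stored_ssn = "gAAAAABkciphertext"                # value read back from the DB
assert not looks_like_plaintext_ssn(stored_ssn)  # encrypted at rest

response = {"application_id": "A-1001", "status": "APPROVED", "credit_limit": 5000}
assert_no_extra_fields(response)                 # no over-exposure in the response
```

Checks like these can run against every API response in the automation suite, so a developer accidentally widening a response payload is caught immediately.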
Testing System Integrations
As mentioned earlier, finance applications are integrations of many systems, and QA has to ensure that all the integration scenarios are covered. A few scenarios:
- Edge cases: Test how the system responds when one or more of the integrated systems is down or does not respond when called. For example, the service that checks the approved credit limit for a user will depend on external services like Pricing. If Pricing is down, the origination system should handle it gracefully. One example of handling the situation elegantly is the origination system recording the request in a queue, retrying the action a stipulated number of times, and routing the request for manual intervention if all attempts fail. Thorough testing of such scenarios is key to preventing unanticipated errors.
- Logging the integrations: Make sure the system logs every integration call and its result, either in the DB or in log files. Otherwise, with so many external and internal touchpoints, it will be difficult to trace the root cause of what happened and where it happened.
- Data persistence: Even if one of the integrating systems fails, check whether the data is properly saved in the DB or propagated to subsequent systems, to ensure that system issues are handled gracefully.
- Data Integrity: Make sure that at any point in the process workflow, there are no inconsistencies in the data. In other words, there must be a proper rollback mechanism throughout the flow.
- Time-zone consistency: Systems are expected to work the same way irrespective of the time zones – CST, EST or PST. The test team should ensure the application behavior is the same across different time zones. Let us consider a rare situation to explain this. Applicants will submit documents as identity/residence/income proof. Acceptance of a document depends on the ‘expiry date’ of the document and the current date. And the current date depends on the system time zone! Think of the rare situation where the applicant is in the PST zone and the bank agent is in EST. Such scenarios also need to be covered by the testing teams.
- Batch processing: The team should test batch processing jobs that handle bulk data updates. For example, a job could identify all the applicants who have not logged into the system for a particular period of time and change their status to Expired or Inactive. Test Scenarios should ensure that such jobs update all the integrated systems in sync.
- Handling race conditions: Include scenarios that check how the system handles race conditions. Normally, a loan application agent picks an application from a queue system to verify it. In rare situations, more than one agent will request an application from the queue at the same moment, and both will get the same application. The system should handle these race conditions gracefully and notify the agents with meaningful messages. Though these situations are rare, the test team needs to cover them to ensure the system is foolproof.
- Handling network latency: Integrations between systems could be slow due to system or network sluggishness. For end-user-facing finance applications, the system should be very responsive. Hence, the test team should cover performance and responsiveness aspects in the test scenarios. If the system takes time to respond, users should be shown a progress bar or informed through some other meaningful mechanism.
- Monitoring systems: Applications might have monitoring/control mechanisms to check the availability of each of the sub-systems. For example, every half hour a job might run to check whether all the sub-systems are up; if any of them is down, it sends notifications to the appropriate people. These monitoring systems should also be thoroughly tested by the testing team.
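The retry-then-escalate pattern described in the edge-case scenario can be sketched as follows. This is a simplified illustration: the Pricing service, the retry count, and the manual-intervention queue are assumed stand-ins for whatever the real integration uses.

```python
from collections import deque

MAX_RETRIES = 3  # stipulated number of attempts (assumed value)

def call_pricing(request: str, pricing_up: bool) -> dict:
    """Stand-in for the external Pricing service call."""
    if not pricing_up:
        raise ConnectionError("Pricing service unreachable")
    return {"request": request, "limit": 5000}

def process_with_retry(request: str, pricing_up: bool, manual_queue: deque):
    """Retry a stipulated number of times, then park the request for manual review."""
    for _ in range(MAX_RETRIES):
        try:
            return call_pricing(request, pricing_up)
        except ConnectionError:
            continue  # a real system would back off before retrying
    manual_queue.append(request)  # graceful degradation, not a crash
    return None

manual = deque()
assert process_with_retry("app-1", pricing_up=True, manual_queue=manual) is not None
assert process_with_retry("app-2", pricing_up=False, manual_queue=manual) is None
assert list(manual) == ["app-2"]  # the failed request is escalated, not lost
```

A test suite would exercise both branches by toggling the downstream service (or a mock of it) up and down, verifying that requests are never silently dropped.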
Compliance Testing
Finance domains have multiple regulations in place, and strict adherence to them is mandatory. Application development and testing will usually cover SOX and other compliances. Here are a few compliance areas that testing teams often miss:
- Do Not Call Registry: Bank agents will have call mechanisms to get additional information from a lead or applicant. They might also call to cross-sell or upsell products. In certain geographies like the US, agents must be careful NOT to call people whose status is set as DNC (Do Not Call). Hence, the test team should have enough negative test cases to ensure that clients with DNC status are never called by agents. There are some intricacies at play here:
  - DNC may be set for a mobile phone but not the landline
  - DNC may be set by one client for their mobile while another family member who shares the same phone has not set DNC, and so on
Testing all these negative scenarios is crucial: irked customers can sue the bank. Bank of America once agreed to pay ~$32 million to settle charges that it made harassing debt collection calls to customers’ cell phones.
- Multi-factor authentication:
  - Verify all multi-factor authentication channels – emails, text messages to the phone, or the call option (for landlines)
  - Verify that MFA is triggered whenever there is a change in contact information such as email, phone, or device
  - Verify that after multiple attempts with an invalid OTP, the user is blocked for a certain time
  - Verify the maximum number of OTP requests allowed in 24 hours (e.g., 10, 20, or 30) and the validity period of the OTP sent (e.g., 2 or 5 minutes)
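The OTP rules above (expiry window, lockout after repeated failures) are straightforward to express as a small state machine that automation can assert against. This sketch uses assumed limits (3 failures, 120-second validity); a real product's values would come from configuration.

```python
MAX_INVALID_ATTEMPTS = 3      # block after this many wrong OTPs (assumed)
OTP_VALIDITY_SECONDS = 120    # e.g. a 2-minute validity window (assumed)

class OtpVerifier:
    """Illustrative OTP state machine: expiry + lockout after repeated failures."""

    def __init__(self, otp: str, issued_at: float):
        self.otp = otp
        self.issued_at = issued_at
        self.invalid_attempts = 0
        self.blocked = False

    def verify(self, candidate: str, now: float) -> bool:
        if self.blocked:
            return False  # blocked users are rejected even with the right OTP
        if now - self.issued_at > OTP_VALIDITY_SECONDS:
            return False  # an expired OTP must never be accepted
        if candidate != self.otp:
            self.invalid_attempts += 1
            if self.invalid_attempts >= MAX_INVALID_ATTEMPTS:
                self.blocked = True  # lock out for a configured period
            return False
        return True

v = OtpVerifier("482913", issued_at=0.0)
assert not v.verify("000000", now=10)   # wrong OTP, attempt 1
assert not v.verify("111111", now=20)   # attempt 2
assert not v.verify("222222", now=30)   # attempt 3 triggers the block
assert v.blocked
assert not v.verify("482913", now=40)   # even the correct OTP is now rejected
```

Negative tests like the last assertion are exactly the "think like a hacker" cases: verify the lockout cannot be bypassed by finally guessing right.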
Document Handling
The e-KYC step of any finance application is elaborate and requires extensive document handling. Applicants or agents will upload documents as identity, residence, or income proof. The testing team should cover the following scenarios:
- Format support: Test with all the expected file types – jpg, png, pdf, etc.
- File size support: Systems might have a limit on the maximum size of documents, e.g., 10 MB or 20 MB. The test team should upload documents at the maximum size as well as bigger ones and verify the system behavior.
- Browser compatibility: Verify the ‘Upload Document’ functionality in the latest versions of the major browsers (Chrome, Firefox, IE, and Safari).
- Responsiveness: Verify that the document upload functionality works as expected on mobile devices.
- Document rendering: Verify that uploaded documents render properly in integrations with other systems.
- Malware detection and alerts: Test whether a malware check is enabled for uploaded documents, and verify that notifications are sent to the relevant set of people (agents, managers) when malware is identified in the system.
- Validating expiry: Verify the Expiry date of the proof document is validated by the application
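The format and size checks above lend themselves to simple boundary-value tests. This sketch assumes jpg/png/pdf and a 10 MB cap purely for illustration; the real limits belong in the product's configuration.

```python
ALLOWED_EXTENSIONS = {".jpg", ".png", ".pdf"}   # assumed supported formats
MAX_SIZE_BYTES = 10 * 1024 * 1024               # assumed 10 MB cap

def validate_upload(filename: str, size_bytes: int) -> list:
    """Return a list of validation errors; an empty list means the upload is accepted."""
    errors = []
    ext = "." + filename.rsplit(".", 1)[-1].lower() if "." in filename else ""
    if ext not in ALLOWED_EXTENSIONS:
        errors.append(f"unsupported file type: {ext or '(none)'}")
    if size_bytes > MAX_SIZE_BYTES:
        errors.append("file exceeds maximum size")
    return errors

# Boundary cases the test team should always include
assert validate_upload("passport.pdf", MAX_SIZE_BYTES) == []      # exactly at the limit
assert validate_upload("passport.pdf", MAX_SIZE_BYTES + 1) != []  # one byte over
assert validate_upload("proof.exe", 100) != []                    # rejected file type
assert validate_upload("noextension", 100) != []                  # missing extension
```

Testing exactly at, just under, and just over the limit is the classic boundary-value pattern; off-by-one errors in size checks are a common source of production defects.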
Multi-Language Support
Most finance applications support multiple languages to provide a personalized experience to their clients. Testing business functionalities for all the supported languages is key to ensuring a seamless user experience. Here are some of our experiences:
- Localization: When the user changes the language option (say from Spanish to English), all the label names and content need to change to the selected language. This change should be uniform across screens, devices, and browsers. Coverage for this testing should be 100%, as it directly impacts end-user comfort. Our suggestion is to maintain a common sheet that maps language conversions (for example, English to Spanish) and use the same sheet to create automation suites.
- Know your client landscape: B2C finance applications need to serve multiple devices and versions. For such requirements, the test team should gather the client’s alignment on the list of browser versions, device types, and OSes to include in testing. Keep in mind that NOT all banks will go for the same device types and versions. For example, one of the fintechs we worked with had clients who were not very tech-savvy and were expected to have older phone models.
- Cross-browser testing: Testing the functionality across those multiple browsers, devices, and environments to ensure a seamless experience to users is important. But it is a cumbersome activity and hence leveraging a CBT(Cross Browser Testing) tool for the same will help.
- Test on the actual device: Ensure that special characters in other languages like Spanish are rendered properly on the mobile device. For example, the encoding of special characters affects how a character is displayed on a mobile device. This cannot be caught if testing is done on simulators; hence testing the screens/UI on the actual mobile device is essential.
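The "common sheet" suggestion above can drive the localization checks directly: load the mapping, render a screen, and diff the two. The labels and screen payload below are illustrative stand-ins for the real sheet and UI.

```python
# One row per label in the shared language-mapping sheet (illustrative content)
TRANSLATIONS = {
    "submit": {"en": "Submit", "es": "Enviar"},
    "cancel": {"en": "Cancel", "es": "Cancelar"},
}

def missing_translations(rendered_labels: dict, lang: str) -> list:
    """Compare what the screen rendered against the mapping sheet; return mismatched keys."""
    return [key for key, value in rendered_labels.items()
            if TRANSLATIONS.get(key, {}).get(lang) != value]

# After switching the UI to Spanish, every label must match the sheet.
# Here 'cancel' was accidentally left in English, so the check flags it.
spanish_screen = {"submit": "Enviar", "cancel": "Cancel"}
assert missing_translations(spanish_screen, "es") == ["cancel"]
```

Because the same sheet feeds both the translators and the automation suite, a label that drifts from the agreed translation fails the build rather than reaching users.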
Test Automation
With most application development going the Agile and CI/CD way, test automation is key to completing testing within the sprint/release timeframe. Here are a few of our experiences:
- Importance of dynamic test data: Test data generation is one of the most important parts of the test automation process of a fin-tech application. A small and static set of test data will not be enough to test the application completely. The dataset generated should be large and dynamic. The test team should come up with test data generators to handle this.
- Test data vs real data: The test data should be as close as possible to real data. In most cases this helps check how the system responds to actual data sets and makes it much easier to cover edge cases. Ex: the end user can have more than two spaces in their name, a Spanish name can have accented characters, and the address field can have special characters.
- Automation of complete test suite: Since all applications are built and deployed automatically using a CI/CD pipeline, it is not enough to automate only the regression test cases. It is also important to automate the other functional test cases.
- Common service data provider: As an application has many microservices, each service has dependencies on others. Automating a test case involving any one of the microservices might require data from another service. Having a common service dedicated to providing test data helps handle such situations; ensure this common service can be accessed only with proper authorization.
- No hardcoding: The test team should ensure that sensitive information like database credentials and authorization tokens is not hardcoded in the test scripts themselves.
- Automate the impossible (or the nearly impossible): There may be a few cases that are practically difficult or nearly impossible to automate. For instance, testing whether a user is blocked for 24 hours after failed MFA attempts: a suite cannot practically wait 24 hours while running. A good practice is to see whether automation can still be made possible. We achieved it by making the block time configurable, so that in test environments the block can be a practical few minutes instead of 24 hours.
- Automate to the point: In each case, make sure you start the automation only from where it is required. Your script should not have to care about the preconditions; those should be generated in advance by the test data generator. For example, if you are checking whether the end user receives a bounce email for an application bounced by the agent, your script should not start from submitting a new application. The test data generator should provide an application that has already been submitted for verification.
- Automate with the real flow: Make sure your automation runs through the same flow as it would in real life. Ex: to validate the pricing engine touchpoint and functionality, you cannot call the pricing API directly and assert that it passed. In the real world, pricing is called only via the origination system, so you should move an application through that stage and then check that pricing is called successfully.
- Handling MFA/OTP: Make sure you have a test phone number to receive the OTP/verification code. Handling MFA verification in automation scripts is always a challenge; a utility such as Google OAuth or a similar mechanism can help.
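The dynamic test data generator mentioned above can be very small and still useful. This sketch seeds the generator so every failing case is reproducible, and its sample pools deliberately include the edge cases called out earlier (accented names, doubled spaces, special characters in addresses). All field names and values are illustrative assumptions.

```python
import random

# Pools intentionally containing the tricky cases from real data
FIRST_NAMES = ["José", "María", "John  Paul", "Ann"]        # accents, double spaces
ADDRESSES = ["12/3 O'Brien St.", "Calle Peñón #4", "5th Ave"]  # special characters

def generate_applicant(seed: int) -> dict:
    """Deterministic per seed, so a failing test case can be reproduced exactly."""
    rng = random.Random(seed)
    return {
        "name": rng.choice(FIRST_NAMES),
        "address": rng.choice(ADDRESSES),
        "ssn_last4": f"{rng.randint(0, 9999):04d}",  # zero-padded to 4 digits
    }

batch = [generate_applicant(i) for i in range(100)]
assert len(batch) == 100
assert generate_applicant(7) == generate_applicant(7)          # reproducible
assert all(len(a["ssn_last4"]) == 4 for a in batch)            # structurally valid
```

Logging the seed alongside each automated test run means a failure found with generated data can be replayed exactly, which is what makes a large dynamic dataset practical to debug.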
Mobile Testing
Here are a few points to keep in mind while testing finance mobile services:
- Devices: Make sure to gather the list of mobile devices and browsers on which the application needs to be tested
- Device procurement/simulators: Based on the list of devices, either procure the devices or ensure that a cloud platform like Sauce Labs or CBT is available for testing.
- Network span: Verify that the application is accessible across a range of network conditions (2G/3G/Wi-Fi).
- Offline data synchronization: In the case of an offline-capable mobile application, ensure all data created in offline sessions is synced with the server without any loss or duplication when the device switches from offline to online.
- Responsiveness: Ensure all screens are responsive on mobile devices without any deviation in the UI.
- Location: Test the application with location access turned both ON and OFF. This is a mandatory test case if the application uses maps (e.g., payment locations).
- User preferences: Ensure SMS notifications like payment reminders, AutoPay alerts, and payment acknowledgments are triggered as per the preferences set by the user.
- Session Timeouts: Verify system behavior when the session times out when the application is idle for some time. System integrity and graceful exit are to be ensured.
- Concurrent access: Verify that the system handles the rare situation where different users (applicants, agents) from different locations access the same application at the same time.
- Test everything: While we do performance testing for the key screens/steps, we might miss testing the performance of screens like the single sign-on screen. These can also lead to production issues and hence need to be covered in testing.
- Cross-platform sync: If the application can be accessed via multiple platforms(E.g Android, iOS, Web application), a test should be performed to ensure the updated data is synced to all platforms whenever data gets updated.
- Different types of users: Test cases should be designed targeting all the different user types (e.g., current loan user, charged-off user, GCP user, delinquent user), and each user type's behavior should be validated.
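The offline-sync requirement above ("no loss, no duplication") usually hinges on one design choice: records created offline carry a client-generated id, and the server upserts by that id, so replaying the same batch after a flaky reconnection is harmless. This is a hypothetical sketch of that idea, not any specific sync protocol.

```python
def sync_to_server(server_store: dict, offline_records: list) -> None:
    """Idempotent upsert keyed by client-generated id: replaying a batch
    after a dropped connection must not create duplicates."""
    for record in offline_records:
        server_store[record["client_id"]] = record

# Records created while the device was offline (illustrative fields)
offline = [
    {"client_id": "c-1", "payment": 120.0},
    {"client_id": "c-2", "payment": 75.5},
]

server = {}
sync_to_server(server, offline)
sync_to_server(server, offline)  # device flaps offline/online and replays the batch

assert len(server) == 2                                    # no duplication
assert {r["client_id"] for r in offline} <= set(server)    # no loss
```

The corresponding test scenario is to interrupt the sync mid-batch and reconnect: the double-sync assertion above is exactly the invariant that should survive that interruption.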