Testing is a necessary component of delivering a high-quality application. Often this step is overlooked, underappreciated, or worse, rushed to meet a deadline. The best solution is to integrate testing throughout the development process.
The way to approach testing a Business Intelligence (BI) system is to get the business to take ownership and buy in early and often. The business users should write test cases and be responsible for executing them from a business perspective; this also trains them on the content in the system. The technical people should be ready to assist with query development or anything else needed to get the testing completed.
Some validation should be part of the design of the Extract, Transform, and Load (ETL) process itself. Part of this is to make sure that, mechanically, things happened as they should, and that there are appropriate logs when they don't. In addition, the ETL developers should perform some kind of unit/function testing prior to moving to a TEST environment, along with a code review or peer review. Depending on the complexity of the ETL process, one generally doesn't test each component of the process due to the details involved, but focuses more on the net result of the test (i.e., all rows were inserted with no errors and all columns contain values; what happened in between is less important to test because the load was successful).
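As a minimal sketch of the "net result" checks described above, the following (using an in-memory SQLite table and made-up table and column names for illustration) verifies that the row count matches the source extract and that required columns contain no NULLs:

```python
import sqlite3

def validate_load(conn, table, expected_rows, required_cols):
    """Net-result checks after a load: row count matches the source
    extract, and no required column contains NULLs."""
    cur = conn.cursor()
    errors = []
    actual = cur.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    if actual != expected_rows:
        errors.append(f"row count {actual} != expected {expected_rows}")
    for col in required_cols:
        nulls = cur.execute(
            f"SELECT COUNT(*) FROM {table} WHERE {col} IS NULL"
        ).fetchone()[0]
        if nulls:
            errors.append(f"{col} has {nulls} NULL value(s)")
    return errors

# Demo with a hypothetical sales table; one row is missing an amount
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (order_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [(1, 10.0), (2, 20.5), (3, None)])
print(validate_load(conn, "sales", 3, ["order_id", "amount"]))
# → ['amount has 1 NULL value(s)']
```

An empty result means the load passed mechanically; anything returned should be written to the ETL log.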
In addition, the technician should take the next step of developing quality controls that verify the final table structures contain what was expected. For example, create a report from the Operational Data Store (ODS) area that groups and sums some business keys with some key metrics, compares them to the results from the new implementation area, and highlights only the variances. This should be sent to the data governance team every morning. As long as it is clean, the BI team can be confident that, mechanically, things are working well.
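A simple sketch of such a variance report, assuming the grouped sums from each system have already been pulled into key-to-total mappings (the region keys and amounts here are invented for illustration):

```python
def variance_report(ods_totals, dw_totals, tolerance=0.01):
    """Compare summed metrics by business key; return only the keys
    where the two systems disagree beyond the tolerance."""
    keys = set(ods_totals) | set(dw_totals)
    variances = {}
    for key in sorted(keys):
        ods = ods_totals.get(key, 0.0)
        dw = dw_totals.get(key, 0.0)
        if abs(ods - dw) > tolerance:
            variances[key] = (ods, dw, dw - ods)  # (ODS, warehouse, diff)
    return variances

# Hypothetical daily totals by region from each system
ods = {"EAST": 1250.00, "WEST": 980.50, "NORTH": 430.25}
dw = {"EAST": 1250.00, "WEST": 975.50, "NORTH": 430.25}
print(variance_report(ods, dw))
# → {'WEST': (980.5, 975.5, -5.0)}
```

Emailing only the variances (rather than the full comparison) keeps the morning report actionable for the data governance team.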
Depending on some of the business rules implemented, one may need reports that highlight “Unknown” values and other conditions that need to be dealt with by the business. Some of these scenarios should become test cases. The business should use the ad-hoc environment to write reports and queries that test the results. Ultimately, these reports should be reviewed by the data stewards as part of the data governance initiative.
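As a small illustration of flagging “Unknown” values, the sketch below (row layout and column names are assumptions, not a prescribed schema) scans fact rows for dimension lookups that fell back to “Unknown”:

```python
def unknown_value_report(rows, dimension_cols):
    """Flag rows where a dimension lookup fell back to 'Unknown',
    so the business can correct the source data or the mapping rule."""
    flagged = []
    for row in rows:
        bad = [col for col in dimension_cols if row.get(col) == "Unknown"]
        if bad:
            flagged.append((row["order_id"], bad))
    return flagged

# Hypothetical fact rows after dimension lookups
rows = [
    {"order_id": 1, "region": "EAST", "product": "Widget"},
    {"order_id": 2, "region": "Unknown", "product": "Widget"},
]
print(unknown_value_report(rows, ["region", "product"]))
# → [(2, ['region'])]
```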
For the most thorough results, as well as the highest-quality BI environment, everywhere a business rule is implemented there should be a test case that verifies the rule was implemented correctly. Depending on the volume and complexity, one may need to prioritize them and tackle the most important ones first.
For the documentation, it can be as simple as keeping a spreadsheet with the following items:
• Test Case #
• Test Case Description
• Tester
• Date Tested
• Expected Outcome
• Actual Outcome
• Pass/Fail
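The spreadsheet above can be produced however the team prefers; as one sketch, a CSV with those exact columns can be generated like this (the sample case, tester, and date are invented for illustration):

```python
import csv
import io

# The columns from the test-tracking spreadsheet
FIELDS = ["Test Case #", "Test Case Description", "Tester",
          "Date Tested", "Expected Outcome", "Actual Outcome", "Pass/Fail"]

# A hypothetical executed test case
cases = [
    {"Test Case #": 1,
     "Test Case Description": "Region totals match ODS",
     "Tester": "J. Smith",
     "Date Tested": "2024-01-15",
     "Expected Outcome": "No variances",
     "Actual Outcome": "No variances",
     "Pass/Fail": "Pass"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(cases)
print(buf.getvalue())
```

Writing to a file instead of `io.StringIO` yields a CSV that opens directly in Excel, which keeps the log accessible to the business testers.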
It is critical, as originally stated, to get the business users involved in testing the deliverables. There have been cases where the business thought the source they were comparing balances against was correct, but were eventually convinced the BI application was correct and they had a broken business process instead. This is difficult because it must be handled case by case, but it usually becomes the biggest challenge and hurdle to overcome in order to be perceived as successful. Ultimately, the business must provide you with the information to know whether “the values put into the Data Warehouse or BI dashboard are correct.” You are completely dependent on the business rules they gave you being correct (a lot of times they aren’t in version 1), and the risk is even greater if there is no data governance process in place.
A word of caution: if you don’t get business buy-in on testing, they will certainly blame you when things aren’t correct in production (especially for things that were overlooked in testing). It is wise to have a step where the business signs off on testing, confirming they are comfortable with what is moving into production; this is very helpful when issues arise. Because they were involved in the process and it was not mostly IT doing the testing, finger-pointing is kept to a minimum. In addition, shared success and teamwork are fostered, bridging the gap between business users and Information Technology (IT) groups that sadly exists in many organizations.