The client used MS Excel to generate a monthly budget-and-forecast model along with a set of regional performance reports and key performance metrics. The large file was emailed to 20 field offices across the US, sometimes several times per month, and was used to perform line-item general ledger variance analysis and cost-center reporting. After a time-consuming manual reconciliation, managers received a final copy of the model, which they used to make key business decisions. Delivery delays and errors caused major pain points throughout the organization, and the file's size often clogged recipients' email inboxes.
The business stakeholders wanted a complete QlikView application that would streamline the entire process, provided all reports remained standardized with a static appearance regardless of selections and filtering.
Initially, a proof of concept was built to demonstrate the look and feel, KPI placement, and conditional functionality, and to surface any potential data issues. Once approved, a fully functioning application was designed and deployed on the QlikView platform. The data structure was challenging because the general ledger account structure was not consistent, which caused report lines to change whenever a filter selection was made. A combination of SQL data mining and QlikView scripting was used to apply consistency across the data, standardizing the accounts and solving the issue. The completed application eliminated numerous manual processes and redundancies and greatly sped up report delivery.
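The standardization step described above was done with SQL and QlikView scripting; as a minimal illustration of the underlying idea, the sketch below maps inconsistent general ledger account labels to one canonical chart of accounts so that report lines stop shifting. All account codes and names here are hypothetical.

```python
# Hypothetical mapping from inconsistent source GL labels to canonical codes,
# discovered by profiling the source data.
ACCOUNT_MAP = {
    "6010-TRAVEL": "6010",
    "6010-TRV": "6010",
    "Travel Expense": "6010",
    "7020-SUPPLIES": "7020",
    "Office Supplies": "7020",
}

def standardize(rows):
    """Replace raw GL account labels with canonical codes;
    unmapped accounts are flagged for review instead of dropped."""
    clean, review = [], []
    for row in rows:
        code = ACCOUNT_MAP.get(row["account"])
        if code is None:
            review.append(row)
        else:
            clean.append({**row, "account": code})
    return clean, review

clean, review = standardize([
    {"account": "6010-TRV", "amount": 120.0},
    {"account": "Travel Expense", "amount": 75.5},
    {"account": "Misc", "amount": 10.0},
])
```

Flagging unmapped accounts, rather than discarding them, keeps the mapping table honest as new source accounts appear.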
The client had a large call center operation, making both inbound and outbound customer contacts. The outbound calls were of particular importance because the call results affected employees' pay structure. The process involved multiple CSV files that were updated, saved, and copied many times per day by call center representatives, team supervisors, and managers. Given the number of individuals handling these files, there were numerous issues and pain points in the process, and data quality was a significant problem.
Business stakeholders asked for a QlikView solution that would make the entire process faster and more efficient, but they had no concept in mind. In addition, the solution had to address three key areas: the repeated handling of the CSV files; the communication of data among managers, supervisors, and phone representatives; and the consolidation, analysis, and reporting of the data by management.
Experience with business process improvement guided the solution design. The solution required a small data repository that would allow simultaneous data collection and storage and would also serve as the data source for the QlikView application.
MS Access was used as the data acquisition, consolidation, and delivery engine and as the data warehouse because of its ease of use and flexibility. This was accomplished with multiple tables, queries, and data display and input forms. The UI enforced several data validation and quality rules, along with a level of security that ensured each supervisor and phone rep had access only to their assigned records. Completing this phase of the project was a major accomplishment, and several processes around handling and updating the CSV files were eliminated.
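The record-level security described above lived in Access forms and queries; the sketch below shows the same filtering rule in miniature. User names, roles, and field names are hypothetical.

```python
# Hypothetical call-result records, each assigned to one phone rep.
RECORDS = [
    {"id": 1, "assigned_to": "rep_a", "result": "sale"},
    {"id": 2, "assigned_to": "rep_b", "result": "callback"},
    {"id": 3, "assigned_to": "rep_a", "result": "no answer"},
]

def visible_records(user, role, records=RECORDS):
    """Managers see all records; supervisors and phone reps see only
    the rows assigned to them, so no one can edit another's results."""
    if role == "manager":
        return list(records)
    return [r for r in records if r["assigned_to"] == user]
```

Filtering at the query layer, before any form renders, is what prevented the multiple-handling and data-quality issues of the old CSV workflow.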
The next phase of the solution focused on reporting and analysis. A QlikView application was designed to accommodate multiple audiences, each entering through its own gateway from a central landing page. The dashboard contained numerous show/hide and layered object visualizations that provided deep and broad-based data discovery.
The company didn’t have a standard reporting platform. Various departments used variations of MS Office products to produce reports, and there were many instances of redundancy. The information flow was not timely, no history was saved, and, more importantly, true insight into the business was never obtained.
There were various third-party data sets used throughout the business, including structured and unstructured SQL databases and manually updated MS Excel files. In some cases, the data had quality issues and lacked adequate associations. The business stakeholders wanted a data consolidation and reporting system to use across multiple departments, but they were hesitant to commission a large-scale data warehouse project.
A good understanding of department workflows and deliverables was obtained through conversations with relevant parties, as well as by reverse-engineering processes where that was most effective. Qlik’s ETL engine was used to consolidate and stage the different data inputs into a base set of tables (QVDs) that would ultimately serve as the source for all reporting needs. A second data layer was created to transform and relate the data inputs and to provide an automated validation checkpoint.
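The two-layer staging approach above was built with Qlik's load script and QVD files; as a rough illustration of the pattern, the sketch below stages raw sources into a base layer, then has a transform layer relate them and run a validation checkpoint. Table names, fields, and the validation rule are hypothetical.

```python
def build_base_layer(sources):
    """Layer 1: stage each source as-is, keyed by source name
    (analogous to writing one QVD per input)."""
    return {name: list(rows) for name, rows in sources.items()}

def build_transform_layer(base):
    """Layer 2: relate sales rows to the region lookup, and run a
    validation checkpoint that surfaces unmatched keys instead of
    silently dropping them."""
    regions = {r["region_id"]: r["region"] for r in base["regions"]}
    facts, orphans = [], []
    for row in base["sales"]:
        region = regions.get(row["region_id"])
        (facts if region else orphans).append({**row, "region": region})
    return facts, orphans

base = build_base_layer({
    "regions": [{"region_id": "NE", "region": "Northeast"}],
    "sales": [
        {"region_id": "NE", "amount": 100.0},
        {"region_id": "XX", "amount": 50.0},
    ],
})
facts, orphans = build_transform_layer(base)
```

Keeping the orphan rows visible is the "automated validation checkpoint": a nonempty orphan list signals a data-quality or association problem in the third-party inputs before it reaches the dashboard.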
The QlikView dashboard comprised several layers of data visualizations and KPIs rolled up to the senior executive level. Because the dashboard served multiple audiences across the enterprise, each view was designed to provide visualizations and information relevant to that role. An out-of-scope special request was received: could a user change the components of a metric calculation on the fly by changing the calculation methodology? This was implemented by giving end users an additional drop-down menu to select the components for the numerator and denominator.
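In QlikView this drop-down-driven metric was wired into the chart expression; the sketch below shows the same idea in plain Python, with the user's selections choosing which components feed the numerator and denominator of a ratio. The component names and values are hypothetical.

```python
# Hypothetical metric components the user can pick from the drop-downs.
COMPONENTS = {
    "gross_sales": 500_000.0,
    "returns": 20_000.0,
    "cogs": 300_000.0,
    "operating_expense": 120_000.0,
}

def metric(numerator_keys, denominator_keys, components=COMPONENTS):
    """Compute a ratio from user-selected numerator and denominator
    components, guarding against an empty or zero denominator."""
    num = sum(components[k] for k in numerator_keys)
    den = sum(components[k] for k in denominator_keys)
    return num / den if den else float("nan")
```

For example, selecting `cogs` over `gross_sales` yields a cost ratio, while swapping in `operating_expense` redefines the metric without any change to the dashboard itself, which is what made the on-the-fly request feasible.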