
Automated Reporting Tool to Extract API Analytics Data
Raghavendra Rao Baru, Technical Architect | September 10, 2020

Co-author: Priyabrata Rath

Introduction:

Importance of Analytics Data in Business Applications

In today’s world of integrated business technologies, preserving the analytics data related to business transactions is of utmost importance. The primary objective of preserving such data is to verify transaction details such as transaction IDs, timestamps, IP addresses, and customer names against daily, weekly, monthly, or yearly transaction trends. Analytics data is crucial for generating insights that help businesses decide which of their areas and services to scale up or down, and even plan any changes needed to existing business processes to improve their operations.

To do this, businesses often implement dedicated analytics technologies/tools that can record, preserve, and provide business transaction details for reference. For instance, tools such as Splunk are widely used across many projects worldwide. Such dedicated third-party reporting tools are generally managed by professionals experienced in working with them, who fetch default or customized analytics data, graphs, and reports for the business applications.

However, depending on the scope of the business, some organizations may prefer not to use such dedicated analytics technologies and may instead look for reporting tools available within the platforms on which their applications run. For example, the API integration platform MuleSoft Anypoint Platform (an iPaaS) provides an API Analytics Events feature that can produce ad-hoc analytics reports for its applications (APIs) out of the box (with certain limitations, of course!). Organizations that want to pull analytics reports for their business applications can therefore leverage these built-in reporting features without needing to set up any dedicated external analytics reporting tools.

Organizations can leverage the MuleSoft Anypoint Platform to pull analytics reports for business applications

Because such internal data analytics features involve little or no extra cost, using them can lead to significant cost savings.

Demerits of Using Internal Analytics Reporting Tools

Like every coin, internal analytics reporting tools have a flip side too. One of the prime drawbacks is the need to fetch the analytics data manually, typically by running queries against services exposed by the platform that records and stores the business transaction details.

Since most such reporting features lack any automatic report-creation mechanism, the concerned operators/IT engineers end up downloading the analytics reports manually, either on ad-hoc requests or on a recurring schedule.

This raises concerns about the efficiency and productivity of the people involved in such repetitive manual tasks.

So, the idea is to automate manual reporting tasks wherever possible, partially or completely removing the need for manual actions on repetitive tasks and thereby saving considerable time. Automated analytics reports free people up to focus on other tasks of strategic importance.

Real Client Deliverable:

Our client, the Reserve Bank of Australia, uses custom MuleSoft applications (developed and maintained by HCL) to carry out a significant portion of its daily business with its customer banks. As the applications run entirely on the Anypoint CloudHub platform, the bank decided to use the API Analytics reporting feature provided out of the box by the MuleSoft Anypoint Platform to consolidate daily transaction details such as client name, client IP address, transaction ID, transaction timestamp, policy violated, status code, and other request-related information. This analytics information is crucial to the bank's data team for decision making and strategy.

Since the bank's transaction volume stays within roughly 20,000 transactions every 10 minutes, it was comfortable using the Anypoint Platform's built-in API Analytics feature without any concern about losing transactional records. However, this task required manual effort from the HCL team, and the bank wanted daily reporting.

Business Problem:

  • Downloading and preparing the API Analytics reports for the multiple APIs from the production environment is a manual, daily process.
  • The daily reports are consolidated manually by sending multiple queries to the Anypoint Platform's API analytics reporting feature.
  • The MuleSoft Analytics API accepts a maximum of 10 requests per minute and returns a maximum of 20,000 records per request for any time interval ('startDate' and 'endDate' parameters) in the query. Hence, a 24-hour duration has to be split into several smaller intervals, and the results for these intervals have to be consolidated to produce the Analytics report for the full 24-hour period (see the sketch after this list).
  • This manual task takes around 2-3 hours every day.
  • This repetitive task consumes valuable time of the project's support engineers.
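To make the splitting requirement concrete, here is a minimal Java sketch (an illustration, not the production code) that divides the preceding day into fixed-size query windows. The 15-minute interval is just an example; pacing the resulting requests to at most 10 per minute is left to the caller.

    import java.time.Duration;
    import java.time.LocalDate;
    import java.time.LocalDateTime;
    import java.util.ArrayList;
    import java.util.List;

    // Splits the preceding day into fixed-size [start, end) windows so that each
    // Analytics query stays under the 20,000-records-per-request cap.
    public final class IntervalSplitter {

        public static List<LocalDateTime[]> splitPreviousDay(Duration interval) {
            LocalDateTime start = LocalDate.now().minusDays(1).atStartOfDay();
            LocalDateTime dayEnd = start.plusDays(1);
            List<LocalDateTime[]> windows = new ArrayList<>();
            while (start.isBefore(dayEnd)) {
                LocalDateTime end = start.plus(interval);
                if (end.isAfter(dayEnd)) {
                    end = dayEnd; // clamp the final window to midnight
                }
                windows.add(new LocalDateTime[] { start, end });
                start = end;
            }
            return windows;
        }

        public static void main(String[] args) {
            // A 15-minute interval yields 96 windows for a 24-hour day.
            for (LocalDateTime[] w : splitPreviousDay(Duration.ofMinutes(15))) {
                System.out.println(w[0] + " -> " + w[1]);
            }
        }
    }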

Solution:

The automation has been built successfully in Java, with the following key features:

  • The Java program performs concurrent, asynchronous processing on a reusable thread pool, using the ExecutorService interface from the java.util.concurrent package in Java 8 (a simplified sketch follows this list).
  • The program generates only one CSV file per application per day, containing the business analytics data of the preceding day.
  • A scheduler on a local server triggers the program daily at 12:00 AM; it queries the Analytics Events API, fetches the preceding day's transaction details, and writes them to CSV files locally.
  • The program writes its own execution-level log messages (INFO, WARN, ERROR) both to the runtime console and to a local log4j log file.
  • After each successful execution, the program emails the daily analytics CSV reports to the concerned team members; in case of any errors during execution, it emails appropriate error notifications to the concerned groups instead.
  • Depending on the requirement, the program can be configured for different query time intervals (1 minute, 15 minutes, 30 minutes, or 1 hour). The task frequency adjusts to the chosen interval, with the threads managed from the reusable thread pool.
  • Moreover, the program is designed to retrieve API Analytics data for the different APIs from any runtime environment while staying within the limitations of MuleSoft's Analytics Events API.
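As a simplified illustration of the concurrency pattern (building on the IntervalSplitter sketch above, and again not the production code), the following Java sketch fans the per-interval queries out over a fixed, reusable thread pool and consolidates the rows for a single daily CSV. fetchRecords is a hypothetical stub; the real implementation would call the Analytics Events API with the startDate/endDate parameters, handle authentication, and pace submissions to stay within 10 requests per minute.

    import java.time.Duration;
    import java.time.LocalDateTime;
    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    public final class AnalyticsReportJob {

        // Reusable fixed pool; its size caps how many fetches run concurrently.
        private static final ExecutorService POOL = Executors.newFixedThreadPool(10);

        // Hypothetical stub for one Analytics query over [start, end).
        static List<String> fetchRecords(LocalDateTime start, LocalDateTime end) {
            return Collections.singletonList(start + "," + end + ",<csv-row-placeholder>");
        }

        public static void main(String[] args) throws Exception {
            // Submit one task per window of the preceding day.
            List<Future<List<String>>> futures = new ArrayList<>();
            for (LocalDateTime[] w : IntervalSplitter.splitPreviousDay(Duration.ofMinutes(15))) {
                futures.add(POOL.submit(() -> fetchRecords(w[0], w[1])));
            }

            // Consolidate the per-interval results into one day's worth of rows.
            List<String> csvRows = new ArrayList<>();
            for (Future<List<String>> f : futures) {
                csvRows.addAll(f.get()); // blocks until that interval's fetch completes
            }

            // The real program would now write csvRows to the daily CSV file
            // and email the report to the concerned team members.
            System.out.println("Collected " + csvRows.size() + " rows");
            POOL.shutdown();
        }
    }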

Benefits:

  • Because the automation application is built in Java rather than MuleSoft, it avoids consuming the additional cloud vCores and infrastructure on the client's Anypoint Platform that a MuleSoft-based application would have required.
  • Since the application runs on on-premises local servers, the CSV files are easy to extract locally; on CloudHub, by contrast, output files generally need to be sent to an external file location before they can be retrieved.
  • It saves about two hours daily, and around three hours on Mondays: a saving of roughly 11 hours per working week.
  • The program is dynamic and scalable; its execution frequency can be chosen based on the volume of analytics data.
  • The program can be reused for other MuleSoft applications' analytics reports as well.
  • Removing the manual process has increased the productivity of the people in the project.
  • Technically, the program is easy to manage and modify, as it is built in Java using concurrent multithreading. And because the threads are reusable, it makes efficient use of the memory of the system it runs on.
  • Additionally, one of the recipients' Outlook email accounts has been configured to upload the analytics reports to the appropriate file storage location(s).

How and what did we achieve?

In our case, the process of automating the analytics reporting is based on an agile framework, but the implementation approach leans more toward DevOps practice. Below are some of its DevOps characteristics:

  • The automated process is developed, tested, and managed by a single team (in this case, the HCL team), which is also responsible for sending the analytics reports to the client. Almost every part of the activity is thus handled by one development team rather than by multiple functional teams, which is a fundamental aspect of the DevOps concept and has been well addressed here.
  • The activity focuses on speed and on automating the analytics reporting.
  • The application is designed to be reconfigurable to the customer's analytics reporting needs, making it flexible and reusable with minimal effort.
  • DevOps also focuses on cost reduction. As already mentioned, the analytics API comes out of the box with the Anypoint Platform the client already uses, so there was no extra cost involved in obtaining this analytics data for their day-to-day business insights.
  • Pure agile tends to stop once the software is developed, tested, and delivered. In contrast, DevOps includes operations, which run continually; therefore, software development and monitoring are also continuous.

Conclusion:

This article showcases a real business scenario in which a custom automation application was built and maintained to automate analytics data reporting for the client's business transactions, addressing some key industry-specific factors and thereby adding value to the business.