Cloud & Engineering

Export CloudHub Logs To An External Logging System

Posted by Rashmi Choudhary on 30 June 2016

tech, mule

Overview

It is a common use case to export application logs out of CloudHub to a target system, for example Loggly or Splunk Enterprise, because these products have rich capabilities for presenting operational information to different audiences.

This article presents some of the available options for exporting application logs from CloudHub to a target system, the pros and cons of each option, and finally the chosen approach.

At a high level, the logs can either be pushed from CloudHub to the external system or pulled from CloudHub at regular intervals. The push approach uses a custom log4j file configured in the CloudHub environment, which directs the logs to the external target system on a specified port. This is the common approach, but I wanted to attempt the pull mechanism instead. This article covers the pull mechanism utilising the MuleSoft CloudHub REST API.

An application deployed in CloudHub

As per the CloudHub reference, a new optional user interface and expanded storage capability have been introduced for logging via the new Enhanced Log Management setting. Although this feature is optional, it is enabled by default when a new application is deployed in CloudHub. The decision to keep this feature enabled or not makes a big difference in terms of which logs become available to export and whether they are suitable for the purpose. Changing this option at a later stage results in the loss of logging data, hence it is recommended to understand and analyse this option as early as possible.

Log Export Strategy Option 1

When enhanced log management is enabled

Below is the screenshot of the application where this feature is enabled.

Enhanced Log Management Setting

Application Deployed with Enhanced Logging Enabled

As this feature was enabled while deploying the application, all the logs appear under the Deployment section, as displayed in the figure below.

Application Logs

Application Logs with Enhanced Logging Enabled

In this case, the following two CloudHub API calls will be used to export the logs out of CloudHub.

  • Firstly, get all application deployment data and then use the deploymentId of the deployment whose instance “status” is “STARTED”.
https://anypoint.mulesoft.com/cloudhub/api/v2/applications/{domain}/deployments

Sample response for this call would look something like below.

{
  "data": [
    {
      "deploymentId": "56aee310e4b03f8acb120bef",
      "createTime": "2016-02-01T04:46:08.420Z",
      "startTime": "2016-02-01T04:46:11.007Z",
      "endTime": "2016-02-01T04:51:08.051Z",
      "instances": [
        {
          "instanceId": "56aee310e4b03f8acb970bef-0",
          "publicIPAddress": "52.11.11.111",
          "status": "STARTED",
          "region": "ap-southeast-2"
        }
      ]
    }
  ],
  "total": 1
}
  • Secondly, use the same deploymentId to call the second API which will then export all application logs.
https://anypoint.mulesoft.com/cloudhub/api/v2/applications/{domain}/deployments/{deploymentId}/logs

Sample response for this call would look something like below.

{
  "data": [
    {
      "loggerName": "my.api.request",
      "threadName": "api-httpListenerConfig.worker.03",
      "timestamp": 1455859411243,
      "message": "esb.my.api.request.request esbTransactionID=b1039897-17f8-49e7-854b-c52d2734de75 txnState=start",
      "priority": "INFO",
      "instanceId": "56c385bae4b023af26e25866-0"
    }
  ],
  "total": 1
}

As shown above, this second CloudHub API call returns all the application logs, which can then be sent to a target system. The logs are returned in JSON format with pre-defined attributes.
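For illustration, below is a minimal Python sketch that chains these two calls (this is not the Mule implementation described later in this article). It assumes the requests library, HTTP basic authentication with Anypoint Platform credentials and a placeholder application domain "my-app"; adjust these for your environment and authentication mechanism.

import requests

BASE = "https://anypoint.mulesoft.com/cloudhub/api/v2/applications"
AUTH = ("anypoint-username", "anypoint-password")  # hypothetical credentials
DOMAIN = "my-app"                                  # hypothetical application domain

# Call 1: list deployments and pick the deploymentId whose instance status is STARTED.
deployments = requests.get(f"{BASE}/{DOMAIN}/deployments", auth=AUTH).json()
deployment_id = next(
    d["deploymentId"]
    for d in deployments["data"]
    if any(i["status"] == "STARTED" for i in d["instances"])
)

# Call 2: export all application logs for that deployment (returned as JSON records).
logs = requests.get(f"{BASE}/{DOMAIN}/deployments/{deployment_id}/logs", auth=AUTH).json()
for entry in logs["data"]:
    print(entry["timestamp"], entry["priority"], entry["message"])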

PROS

When Enhanced Log Management is enabled, logs can be exported for an application at any time using the above two API calls.

CONS

In this approach, in the absence of query parameters like startTime or endTime, all the logs are returned every time the call is made, thereby creating duplicate logs in the target system, unless some extra logic is added to track the last exported timestamp. Hence, this approach is good for a one-off export of the logs and is not suitable as a regular pull mechanism.

Log Export Strategy Option 2

When enhanced log management is disabled

Below is the screenshot of the application where this feature is disabled.

Deployed Application Setting

Application Deployed with Enhanced Logging Disabled

When this feature is disabled for an application, all the logs appear in the default interface, as shown in the figure below.

Application Logs

Application Logs with Enhanced Logging Disabled

In this case, the following CloudHub API endpoint can be used to export the logs. As this endpoint supports query parameters such as the maximum number of log entries to export (limit), startDate and endDate, it was a more feasible option for me.

https://anypoint.mulesoft.com/cloudhub/api/applications/{domain}/logs?limit=1000&startDate=2016-02-21T04:00:53.832Z&endDate=2016-02-21T04:35:26.551Z

Sample response for this call would look something like below.

{
  "data": [
    {
      "sequenceNumber": 1456057470999,
      "timestamp": 1456057748869,
      "serverId": "i8cf4e052",
      "message": "\n**\n* Application \"rctest\" shut down normally on: 2/21/16 12:29  *\n* PM *\n* Up for: 0 days, 0 hours, 4 mins, 22.144 sec *\n***",
      "priority": "INFO"
    }
  ],
  "total": 1
}
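Below is a minimal Python sketch of the same call with the query parameters applied (same assumptions as the earlier sketch: the requests library, basic authentication and a placeholder domain).

import requests

AUTH = ("anypoint-username", "anypoint-password")  # hypothetical credentials
DOMAIN = "my-app"                                  # hypothetical application domain
URL = f"https://anypoint.mulesoft.com/cloudhub/api/applications/{DOMAIN}/logs"

# limit, startDate and endDate narrow the export to a specific window of log entries.
params = {
    "limit": 1000,
    "startDate": "2016-02-21T04:00:53.832Z",
    "endDate": "2016-02-21T04:35:26.551Z",
}
logs = requests.get(URL, auth=AUTH, params=params).json()
for entry in logs["data"]:
    print(entry["timestamp"], entry["priority"], entry["message"])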

PROS

Query parameters like startDate, endDate and limit are supported with this call which provides more control over the log export process.

CONS

I had to add some logic to poll the CloudHub API at regular intervals and store the endDate/timestamp at which the logs were last exported. This last endDate or timestamp could be persisted anywhere (such as an AWS S3 bucket, file system or database); however, I chose to use ObjectStore to store this information.

Decision - Option 2

Option 2 provided me more flexibility and control over the log export process. I created another MuleSoft project as a ‘Log Export Service’ and deployed it in CloudHub. This project polls the specified applications at regular intervals to export the logs and then calls the target system’s API (in my case the target system was Logsene) to send the logs to it. I chose to create another MuleSoft project to export the logs; however, this logic can be implemented in any other programming language and outside of the MuleSoft runtime environment as required. I also created a simple admin page to add/delete/update the application, startDate and endDate timestamps in the ObjectStore so that anyone can monitor these parameters via a user interface.
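Below is a simplified Python sketch of that polling loop. It is only an outline of the approach, not the Mule project itself: the checkpoint file stands in for ObjectStore, the target system URL is a hypothetical placeholder rather than the Logsene API, and the five-minute interval is arbitrary.

import pathlib
import time
from datetime import datetime, timezone

import requests

AUTH = ("anypoint-username", "anypoint-password")   # hypothetical credentials
DOMAIN = "my-app"                                   # hypothetical application domain
LOGS_URL = f"https://anypoint.mulesoft.com/cloudhub/api/applications/{DOMAIN}/logs"
TARGET_URL = "https://logs.example.com/ingest"      # hypothetical target system endpoint
CHECKPOINT = pathlib.Path("last_end_date.txt")      # stands in for ObjectStore

def now_iso():
    # Current UTC time in the ISO-8601 millisecond format used by the API.
    return datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.%f")[:-3] + "Z"

while True:
    start_date = CHECKPOINT.read_text().strip() if CHECKPOINT.exists() else now_iso()
    end_date = now_iso()

    # Export only the logs written since the last poll.
    params = {"limit": 1000, "startDate": start_date, "endDate": end_date}
    entries = requests.get(LOGS_URL, auth=AUTH, params=params).json().get("data", [])

    if entries:
        # Forward the exported entries to the target system's API.
        requests.post(TARGET_URL, json=entries)

    # Persist the endDate so the next poll continues from where this one stopped.
    CHECKPOINT.write_text(end_date)
    time.sleep(300)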


