Log Analytics

The Kinvey Log Analytics feature provides a way for single-tenant users to stream backend logs to external systems. The Kinvey backend logs consist of events such as incoming requests, responses, and authentication information, as well as logs from other services such as Flex, Business Logic, and RapidData.

Requirements

  • A dedicated Kinvey instance. Only owners of dedicated instances can set up Log Analytics.
  • An instance-level role of Admin. Only instance administrators can create and configure Log Analytics services.
  • An external system to consume the streamed logs. Log Analytics currently supports Elasticsearch, Splunk, AWS S3, Google Cloud Storage, and Azure Blob Storage, as described in the sections below.

Creating a Log Analytics Service

The steps below explain how to open the Log Analytics service creation form. The sections that follow describe the information needed to configure the service.

  1. Navigate to your Kinvey Console instance and log in.
  2. Open the Instance Settings view by clicking on the respective icon in the top right corner.
  3. Select the Log Analytics tab from the left-side menu.
  4. Click on Add a Log Analytics service and select the type of service you want to create based on your external system.

Add Log Analytics Service

The Log Analytics service creation form has several fields that are common between the different types of external systems:

| Field | Description | Required |
| --- | --- | --- |
| Name | The name of the Log Analytics service as it will appear in the Instance Settings view | Yes |
| Description | Any additional information or comments about the Log Analytics service | No |
| Test Mode | Specifies whether the service should produce a test event once every second | - |

The Test Mode provides an easy way to verify if the configuration of the external system works as expected. This mode can be disabled when you initially create the service, but it is recommended to keep it enabled until you verify that the integration is successful.

Each type of external system has additional service-specific fields, described in the sections below.

After you configure and save the service, you are redirected to its Status Page. The status page shows the type of service, whether it operates in Test Mode, and a list of any recent warnings or errors logged by the service. When you have verified that events are received in the external system, you can disable Test Mode.

The Recent Events section will be empty if there are no warnings or errors associated with the service.

Configuring service for Elasticsearch

| Field | Description | Required |
| --- | --- | --- |
| Hosts | Specifies one or more Elasticsearch hosts (domain name and port) without the protocol. For example, customhost.com:9200,customhost2.com:9200,customhost2.com:9300/elastic_search | Yes |
| Index Name | The Elasticsearch index in which the events are stored, for example kinvey-%Y.%m.%d. It is recommended to include a date pattern in the index name so that events are partitioned, making searches over a specific date range faster and deletion of old data easier. The default value partitions the data daily | Yes |
| User | Specifies the username for HTTP Basic authentication | No |
| Password | Specifies the password for HTTP Basic authentication | No |
| Certificate Authority | The CA certificate (PEM format) used for verifying a specific Elasticsearch host certificate | No |
| Client Certificate | The client certificate (PEM format) used when your Elasticsearch cluster wants to verify client connections | No |
| Client Key | The private key (PEM format) for the client | No |
| Client Key Password | Specifies the password for the client's private key, if any | No |
| SSL Verify | Specifies whether to verify the Elasticsearch host's SSL certificate. This setting is enabled by default, and we do not recommend disabling it in production | - |
| Protocol | The protocol to be used when calling the Hosts. HTTPS is selected by default. We do not recommend using HTTP in production | - |

The Index Name field supports the following format parameters:

  • %Y - year with century (2015, 1995)
  • %m - month of the year, zero-padded (01..12)
  • %W - week of the year (00..53)
  • %d - day of the month, zero-padded (01..31)
  • %H - hour of the day, zero-padded (00..23)

All passwords, secrets and private keys are securely encrypted and cannot be viewed or extracted once set. When editing an existing service configuration, the user can only overwrite the existing value, but not read it.
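The index name format parameters match standard strftime directives, so you can preview how a dated index name expands before saving the configuration. A minimal sketch (not Kinvey's own code; the patterns and timestamp are illustrative):

```python
from datetime import datetime, timezone

def expand_index_name(pattern: str, when: datetime) -> str:
    # The format parameters above (%Y, %m, %W, %d, %H) correspond to
    # strftime directives, so strftime previews the index an event
    # with this timestamp would land in.
    return when.strftime(pattern)

ts = datetime(2015, 3, 7, 14, 30, tzinfo=timezone.utc)
print(expand_index_name("kinvey-%Y.%m.%d", ts))  # kinvey-2015.03.07 (daily partitioning)
print(expand_index_name("kinvey-%Y.%W", ts))     # weekly partitioning
```

A daily pattern such as kinvey-%Y.%m.%d lets you delete old data by dropping whole indices instead of filtering documents by timestamp.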

Configuring service for Splunk

The Splunk integration utilizes the HTTP Event Collector (HEC) approach for managing the data. For more information about HEC, refer to the respective article in the Splunk documentation.

| Field | Description | Required |
| --- | --- | --- |
| Host | The hostname or IP address of the HTTP Event Collector (HEC) endpoint or the HEC load balancer, without the protocol | Yes |
| Token | Specifies the authorization token for the HTTP Event Collector (HEC) API | Yes |
| Port | The port number for the HTTP Event Collector (HEC) | No |
| Certificate Authority | The CA certificate (PEM format) used for verifying a specific Splunk host certificate | No |
| Client Certificate | The client certificate (PEM format) used when your Splunk host wants to verify client connections | No |
| Client Key | The private key (PEM format) for the client | No |
| Insecure SSL | Specifies whether to skip verification of the Splunk host's SSL certificate. This setting is disabled by default, and we do not recommend enabling it in production | - |
| Protocol | The protocol to be used when calling the Host. HTTPS is selected by default. We do not recommend using HTTP in production | - |

All passwords, secrets and private keys are securely encrypted and cannot be viewed or extracted once set. When editing an existing service configuration, the user can only overwrite the existing value, but not read it.
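Before pointing the service at your deployment, you can confirm the HEC endpoint and token work by sending a test event yourself. A sketch of building such a request (the host and token are placeholders; 8088 is Splunk's default HEC port):

```python
import json

def hec_test_event(host: str, token: str, port: int = 8088,
                   protocol: str = "https"):
    # Builds the URL, headers, and body for a Splunk HEC test event.
    # HEC expects "Authorization: Splunk <token>" and a JSON payload
    # with an "event" field.
    url = f"{protocol}://{host}:{port}/services/collector/event"
    headers = {"Authorization": f"Splunk {token}",
               "Content-Type": "application/json"}
    body = json.dumps({"event": {"message": "connectivity check"}})
    return url, headers, body

url, headers, body = hec_test_event("splunk.example.com", "YOUR-HEC-TOKEN")
# Send with any HTTP client, e.g. requests.post(url, headers=headers, data=body)
```

If the token and network path are correct, Splunk answers with a JSON body containing a success code, and the event becomes searchable in the index the token is bound to.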

Configuring service for AWS S3

| Field | Description | Required |
| --- | --- | --- |
| AWS S3 Bucket | The name of the Amazon S3 (Amazon Simple Storage Service) bucket to write logs to | Yes |
| AWS Access Key ID | Specifies the AWS access key ID | Yes |
| AWS Secret Key | Specifies the AWS secret key | Yes |
| AWS S3 Region | The name of the Amazon S3 region | No |
| Path Prefix | Specifies the path where to store the files in the bucket | No |
| Store as | Specifies the format of the files containing the events. The default is gzip-compressed, which saves storage space and network traffic | - |

All passwords, secrets and private keys are securely encrypted and cannot be viewed or extracted once set. When editing an existing service configuration, the user can only overwrite the existing value, but not read it.
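Because the default Store as format is gzip, downloaded log objects need to be decompressed before processing. A sketch of reading such a file, assuming each line holds one JSON event (newline-delimited JSON is typical for streamed logs, but verify the layout against your own files):

```python
import gzip
import io
import json

# Stand-in for an object downloaded from the S3 bucket; in practice you
# would read the bytes of the gzip file fetched from your Path Prefix.
sample = gzip.compress(
    b'{"type":"request","path":"/health"}\n'
    b'{"type":"response","status":200}\n'
)

# Decompress and parse one JSON event per line.
with gzip.open(io.BytesIO(sample), "rt") as f:
    events = [json.loads(line) for line in f]

print(events[1]["status"])  # 200
```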

Configuring service for Google Cloud Storage

| Field | Description | Required |
| --- | --- | --- |
| GCS Bucket | The name of the Google Cloud Storage bucket to write logs to | Yes |
| Google Cloud Credentials File | Contains the credentials used to access the GCS bucket. It is advisable to grant only the minimum required permissions for this bucket, which are storage.buckets.get and storage.objects.create | Yes |
| Path Prefix | Specifies the path where to store the files in the bucket | No |
| Store as | Specifies the format of the files containing the events. The default is gzip-compressed, which saves storage space and network traffic | - |

All passwords, secrets and private keys are securely encrypted and cannot be viewed or extracted once set. When editing an existing service configuration, the user can only overwrite the existing value, but not read it.

Configuring service for Azure Blob Storage

| Field | Description | Required |
| --- | --- | --- |
| Azure Storage Account Name | The name of the Storage account in the Azure Management portal | Yes |
| Azure Storage Access Key | The access key for the Storage account. Can be retrieved from the Azure Management portal | Yes |
| Azure Storage Container Name | Specifies the name of the Azure container that will store the logs. A new one is created if it does not exist | Yes |
| Path Prefix | Specifies the path in the Azure container where the log files will be stored | No |
| Azure Cloud | Specifies which Azure cloud will be used to store the logs. The default value is Azure Public Cloud. The other three options are Azure German Cloud, Azure China Cloud, and Azure US Government Cloud | Yes |

All passwords, secrets and private keys are securely encrypted and cannot be viewed or extracted once set. When editing an existing service configuration, the user can only overwrite the existing value, but not read it.
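The Azure Cloud choice determines which storage endpoint the logs are written to. A sketch mapping the four options to their standard blob endpoint suffixes (an illustration for verifying which endpoint your logs should appear under; confirm the suffix for your environment):

```python
# Standard blob endpoint suffixes for each Azure Cloud option above.
BLOB_SUFFIX = {
    "Azure Public Cloud": "core.windows.net",
    "Azure German Cloud": "core.cloudapi.de",
    "Azure China Cloud": "core.chinacloudapi.cn",
    "Azure US Government Cloud": "core.usgovcloudapi.net",
}

def blob_endpoint(account_name: str, cloud: str = "Azure Public Cloud") -> str:
    # Blob service endpoint for a storage account in the given cloud.
    return f"https://{account_name}.blob.{BLOB_SUFFIX[cloud]}"

print(blob_endpoint("mylogs"))  # https://mylogs.blob.core.windows.net
```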

Troubleshooting

Each Log Analytics service has a status page showing the current state of the service and the mode in which it operates. The Recent Events section on the status page lists any connection or configuration issues associated with the service, providing a way to identify and debug problems.
