Learn more about using our two versions of the Export API.

Getting Started

LogDNA provides two endpoints for exporting your log lines. Read the sections below to learn about each, and determine which is appropriate for your use case.

Make sure that you generate a service key before attempting to export logs. Follow the steps here to create one if needed. This service key is associated with your organization and will be used for authentication purposes.

Authentication

HTTP Basic authentication is used for both versions of the Export API. To follow the expected scheme, a service key should be passed in as the username and the password should be left empty (cURL examples are provided in the reference guides).
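
For illustration, here is a minimal Node.js sketch of how the Authorization header is constructed under this scheme. Most HTTP clients, including the got library used in the pagination example later on this page, build this header for you from username/password options:

'use strict'

// HTTP Basic authentication: the service key is the username and the
// password is empty, so the header encodes "<service-key>:" in base64
const serviceKey = '<service-key>' // replace with your actual service key
const authHeader = 'Basic ' + Buffer.from(`${serviceKey}:`).toString('base64')
console.log(authHeader) // send this as the Authorization header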


Export API v1

The Export API v1 is used to export log lines in JSONL format. Depending on your plan, the maximum number of log lines returned is limited to 10,000 or 20,000. There are two distinct functionalities:

  1. Streaming desired log lines directly from the response as raw text data
  2. Specifying an email address to receive a link that can be used to download the desired log lines

Refer to the /v1/export guide for a detailed specification of the endpoint.

📘

Note

When an email address is specified, the desired log lines will not be streamed to the response body. The response will instead send a notification indicating that the request was successful and that the results will be sent to the email address provided in the request. This entails archiving and uploading the desired logs to a storage provider. The email includes a URI to download the logs. Be aware that access to this URI will expire after 24-48 hours.
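
As a rough sketch, a streaming request to /v1/export might look like the Node.js snippet below. It assumes from and to parameters analogous to those in the v2 example later on this page, and the email parameter name shown in the comment is likewise an assumption -- confirm both against the /v1/export guide:

'use strict'

const got = require('got')

async function exportV1() {
  // stream the matching log lines back as raw JSONL text
  const body = await got('https://api.logdna.com/v1/export', {
    username: '<service-key>' // service key as the Basic auth username
  , searchParams: {
      from: Date.now() - (1000 * 60 * 60) // past hour (epoch-ms, mirroring the v2 example)
    , to: Date.now()
    // , email: 'user@example.com' // assumed parameter name for the email-based flow
    }
  }).text()

  // each non-empty line of the response body is one JSON-encoded log line
  for (const line of body.split('\n').filter(Boolean)) {
    console.log(JSON.parse(line))
  }
}

exportV1().catch((err) => console.error(err))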


Export API v2

The Export API v2 provides an enhancement over v1, allowing you to export any number of log lines using pagination, bypassing the plan-specific limit imposed in v1. By making successive requests with the pagination_id parameter, you can retrieve logs in batches of JSON until all matching log lines have been returned.

Each Export API v2 request is limited to 10,000 lines.

Review the following sections before referring to the /v2/export guide.

Pagination Functionality

Pagination is only necessary when the desired set of log lines is larger than 10,000, since this is the maximum number of results that can be returned in a given page. If an export does not exceed this size, the behavior is similar to v1 -- all of the logs will be returned in a single "page" and no further requests are needed. When an export does exceed this size, refer to the steps outlined below.

This is the expected workflow to export a set (cardinality > 10,000) of log lines that requires pagination (the script in the Pagination Example section below implements this loop):

  1. Send an initial request with valid query parameters to retrieve logs. Pass a null value for the pagination_id in the first request.
  2. The JSON response contains 2 fields: lines and pagination_id. The former is an array containing the corresponding batch (the first batch in this case) of logs and the latter is a token used to retrieve the next page of results.
  3. Send a subsequent request using the same initial query parameters and also passing in the token received in the response to the pagination_id parameter. This will retrieve the next page of results.
  4. Repeat this request-response cycle, updating the pagination_id parameter with the token returned in every new response, until the response eventually returns a null value for the token. This indicates that the corresponding batch of logs is the last and that you have reached the final page.

🚧

Important Considerations

  • In subsequent requests to paginate through results, make sure that none of the query parameters or values are modified from the initial request (excluding pagination_id)

  • The tokens used in the pagination_id parameter are tied to a specific export and page, so they can be used repeatedly in requests to return deterministic results

  • "0" can be passed in to the from and to parameters which will be resolved to timestamps for a plan's retention boundary and the current time respectively. This behavior is independent so the value can be passed interchangeably to the parameters -- resulting in 4 possible scenarios:

    • [from: <user-specified timestamp>, to: <user-specified timestamp>]
    • [from: 0 (retention boundary), to: <user-specified timestamp>]
    • [from: <user-specified timestamp>, to: 0 (current time)]
    • [from: 0 (retention boundary), to: 0 (current time)]
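
For example, the second scenario above could be requested with a snippet like this (a minimal sketch; only from is set to 0):

'use strict'

const got = require('got')

got('https://api.logdna.com/v2/export', {
  username: '<service-key>' // service key as the Basic auth username
, searchParams: {
    from: 0          // resolved server-side to the plan's retention boundary
  , to: Date.now()   // user-specified timestamp (the current time here)
  }
}).json()
  .then((resp) => console.log(resp.lines.length, 'lines in the first page'))
  .catch((err) => console.error(err))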

Pagination Example

A templated example script in JavaScript (Node.js), using the got request library from npm, is provided below:

'use strict'

const got = require('got')

async function exportRequest() {
  // insert your service key below; it should be passed as the username for the HTTP Basic authentication scheme
  const serviceKey = '<service-key>'

  // replace the parameters here with valid values based on the API reference
  // the current values specify a timeframe that will export all matching logs in the past 24 hours
  const requiredParams = {
    from: Date.now() - (1000 * 60 * 60 * 24)
  , to: Date.now()
  }
  
  // replace the (optional) parameters here with valid values based on the API reference
  const optionalParams = {
    size: undefined
  , hosts: undefined
  , apps: undefined
  , levels: undefined
  , tags: undefined
  , query: undefined
  , prefer: undefined
  }

  const queryParams = {
    ...requiredParams
  , ...optionalParams
  }

  let pagination_id
  let lines
  do {
    const resp = await got('https://api.logdna.com/v2/export', {
      username: serviceKey
    , searchParams: {
        ...queryParams
      , pagination_id
      }
    }).json()

    pagination_id = resp.pagination_id
    lines = resp.lines
    console.log(lines) // the lines field in the returned JSON will be an array containing the exported logs (from a specific batch if pagination is necessary)
    // ... process log lines as needed
  } while (pagination_id)
}

exportRequest()
  .then(() => { /* handle function resolution */ })
  .catch((err) => { console.error(err) }) // handle any caught errors
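
Note that got v12 and later ship as ESM-only, so the require call above assumes got v11; with a newer version of the library, use an import statement instead.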