Overview

Amazon S3 (Simple Storage Service) is a highly scalable, durable, and secure cloud storage service offered by Amazon Web Services (AWS). It is commonly used for various data storage purposes, including storing server logs.

Using S3 to store server logs provides a reliable, scalable, and secure solution for retaining valuable log data. The centralized storage also simplifies log management, enhances data analysis capabilities, and supports compliance and auditing requirements. AWS’s extensive global infrastructure ensures that your log data is available and accessible from anywhere with low latency, making S3 a popular choice for log storage in the cloud.

Follow the steps below to configure log shipping to Amazon S3.

Step 1 - Credential Procurement

Authentication credentials are required to ship logs to S3.

Follow these steps to obtain the credentials, store them as an AWS Secret, and configure external logging.

  1. Refer to these instructions to create a new AWS IAM user in your AWS account.

    • Select Programmatic access when creating the user and take note of the Access Key and Secret Key.
  2. At a minimum, this user must have an associated policy that allows the s3:PutObject action.

Sample AWS Policy (substitute S3_BUCKET_NAME):

JSON
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::S3_BUCKET_NAME/*"
    }
  ]
}
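If you prefer to script this setup, the user, inline policy, and access key can also be created with the AWS CLI. This is only a sketch: it assumes the policy above has been saved locally as policy.json and uses a hypothetical user name, cpln-log-shipper; adjust both for your environment.

Shell
# Create the IAM user that will ship logs to S3
aws iam create-user --user-name cpln-log-shipper

# Attach the sample policy above as an inline policy (saved locally as policy.json)
aws iam put-user-policy --user-name cpln-log-shipper --policy-name cpln-s3-put-object --policy-document file://policy.json

# Generate the Access Key and Secret Key for the user
aws iam create-access-key --user-name cpln-log-shipper

Next, store the Access Key and Secret Key as an AWS secret in Control Plane: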
  1. From the Control Plane Console UI, click Secrets from the left menu.
  2. Click the New button.
  3. Enter a Name for the secret, and select AWS from the Secret Type list.
  4. Enter the Access Key and Secret Key and click Save.
  5. This secret can now be used when configuring logging using the UI Console or CLI.

Step 2 - Configure External Logging

External logging can be configured by using either the UI Console or the CLI.

Enable Logging using the UI Console

  1. From the Control Plane Console UI, click on Org in the left menu.
  2. Click External Logs in the middle context menu.
  3. Select S3 and fill out the required fields.
  4. Select the AWS secret to authenticate to S3. Refer to the credential procurement section to obtain and configure the necessary credentials.
  5. Click Save.
  6. After the configuration is complete, log entries will be available in S3 within a few minutes.

The prefix is the folder to which the logs will be written.

The folder structure will follow the format:

PREFIX/ORG_NAME/YEAR/MONTH/DAY/HOUR/MINUTE/LOG_FILE.jsonl
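To confirm that log files are being delivered under the expected prefix, you can list the bucket, for example with the AWS CLI. This is a sketch only; it assumes the AWS CLI is configured with credentials that can read the bucket (the PutObject-only policy above does not grant list or read access).

Shell
# List delivered log files (substitute S3_BUCKET_NAME and PREFIX)
aws s3 ls s3://S3_BUCKET_NAME/PREFIX/ --recursive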

Each .jsonl file will contain roughly 1-3k single-line log entries in JSON format.

Each entry will contain the following keys:

  • time
  • log
  • location
  • version
  • provider
  • container
  • replica
  • workload
  • gvc
  • org
  • stream
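Below is a sketch of a single .jsonl entry using the keys listed above. All values are illustrative placeholders, not output captured from a real bucket.

JSON
{
  "time": "2024-01-15T10:42:07.123Z",
  "log": "GET /healthz 200",
  "location": "aws-us-east-1",
  "version": "1",
  "provider": "aws",
  "container": "app",
  "replica": "my-workload-0",
  "workload": "my-workload",
  "gvc": "my-gvc",
  "org": "my-org",
  "stream": "stdout"
}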

Enable Logging using the CLI

The external logging configuration can be created or updated using the CLI's cpln org patch ORG_NAME -f FILE.yaml command.

Below is an example Org manifest (in YAML). Edit the manifest, save it as a file, and pass the file to the command above.

Refer to the credential procurement section to obtain and configure the necessary credentials.

  • Substitute: ORG_NAME, S3_BUCKET_NAME, AWS_SECRET, and AWS_REGION.
YAML
kind: org
name: ORG_NAME
spec:
  logging:
    s3:
      bucket: S3_BUCKET_NAME
      credentials: //secret/AWS_SECRET
      prefix: /
      region: AWS_REGION
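
For reference, here is the same manifest with hypothetical example values filled in (an org named my-org, a bucket named my-server-logs in us-east-1, and an AWS secret named my-aws-secret):

YAML
kind: org
name: my-org
spec:
  logging:
    s3:
      bucket: my-server-logs
      credentials: //secret/my-aws-secret
      prefix: /
      region: us-east-1

Save the manifest as, for example, logging.yaml and apply it with cpln org patch my-org -f logging.yaml.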