AWS Lambda: Function to stream logs via SQS
Summary
As part of a logging and monitoring strategy, it is often important to emit data from AWS services to another service, system, or aggregation layer. Amazon Simple Queue Service (SQS) is a good fit for communicating between such microservices and software components in near real-time and at any volume.
The aim of this article is to provide a snippet that streams CloudWatch logs via SQS to an external service (such as Logstash or a SIEM tool) using a Lambda function.
Pre-Reqs
- Permission for the Lambda function's execution role to send messages to the target SQS queue, and for the consuming service to pull data from that queue
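As a sketch of the first pre-req, an IAM policy attached to the Lambda execution role could look like the following; the queue ARN is a hypothetical placeholder and should match your own queue:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "sqs:SendMessage",
      "Resource": "arn:aws:sqs:eu-west-1:123456789012:my-app-queue.fifo"
    }
  ]
}
```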
Steps
- Ensure the Lambda function can reach the SQS queue (network path and IAM permissions)
- Below is a snippet of code that pushes the log data in near real-time
import gzip
import json
import base64
import uuid
import boto3

def lambda_handler(event, context):
    sqs = boto3.client('sqs')
    account = boto3.client('sts').get_caller_identity()['Account']
    queue_url = "https://sqs.eu-west-1.amazonaws.com/12345567928/my-app-{}.fifo".format(account)

    # CloudWatch Logs delivers the payload base64-encoded and gzip-compressed
    cw_data = event['awslogs']['data']
    compressed_payload = base64.b64decode(cw_data)
    uncompressed_payload = gzip.decompress(compressed_payload)
    payload = json.loads(uncompressed_payload)

    log_events = payload['logEvents']
    log_group = payload['logGroup']
    log_stream = payload['logStream']

    for log_event in log_events:
        # Build a fresh message per log event; do not reuse the handler's
        # 'event' argument, which the loop would otherwise shadow
        message = {
            'AccountID': account,
            'LogGroup': log_group,
            'LogStream': log_stream,
            'Log': log_event,
        }
        sqs.send_message(
            QueueUrl=queue_url,
            MessageGroupId="my_logging",
            # FIFO queues require a unique deduplication ID per message; a
            # UUID is safer than a timestamp, which can repeat within a batch
            MessageDeduplicationId=str(uuid.uuid4()),
            MessageBody=json.dumps(message),
        )
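To see the decode path in isolation, the snippet below builds a hypothetical sample payload in the shape CloudWatch Logs delivers to Lambda (gzip-compressed, then base64-encoded) and runs the same decode steps as the handler; the log group and message values are made up for illustration:

```python
import base64
import gzip
import json

# Hypothetical sample in the shape CloudWatch Logs hands to Lambda
sample = {
    "logGroup": "/aws/lambda/my-app",
    "logStream": "2024/01/01/[$LATEST]abc123",
    "logEvents": [
        {"id": "1", "timestamp": 1700000000000, "message": "hello world"},
    ],
}

# CloudWatch gzip-compresses and base64-encodes the payload before delivery
encoded = base64.b64encode(gzip.compress(json.dumps(sample).encode("utf-8")))
event = {"awslogs": {"data": encoded}}

# The same decode steps used inside lambda_handler
compressed_payload = base64.b64decode(event["awslogs"]["data"])
payload = json.loads(gzip.decompress(compressed_payload))

print(payload["logGroup"])                  # /aws/lambda/my-app
print(payload["logEvents"][0]["message"])   # hello world
```

Running this round trip locally is a quick way to validate the parsing logic before wiring the function to a real log group.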
- Now pull the data using Logstash or a similar service; an example Logstash pipeline is shown below
input {
  sqs {
    queue => "MYQUEUENAME-SQS"
    access_key_id => "ABCDEFGHIJK"
    secret_access_key => "WW1123ABCDEFGHIJK"
    region => "us-west-1"
    proxy_uri => "https://10.20.30.40:1234"
    id_field => "sqs_message_id"
    sent_timestamp_field => "sqs_sent_timestamp"
    add_field => { "[my][queue]" => "my-app-queue" }
  }
}
filter {
}
output {
  elasticsearch {
    hosts => "my_elastic_hostname"
    data_stream => "true"
  }
}
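On the consumer side, each SQS message body is the JSON document produced by the Lambda snippet. As a sketch (with hypothetical values), parsing one back into its fields looks like this:

```python
import json

# Hypothetical MessageBody as produced by the Lambda function above
body = json.dumps({
    "AccountID": "123456789012",
    "LogGroup": "/aws/lambda/my-app",
    "LogStream": "2024/01/01/[$LATEST]abc123",
    "Log": {"id": "1", "timestamp": 1700000000000, "message": "hello world"},
})

# A consumer (Logstash json filter, a SIEM parser, or custom code)
# recovers the account, log group, stream, and the original log event
record = json.loads(body)
print(record["LogGroup"])           # /aws/lambda/my-app
print(record["Log"]["message"])     # hello world
```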
Please provide your feedback.