

Showing posts with the label siem

AWS Lambda: Function to stream logs via SQS

Summary

As part of a logging and monitoring strategy, it is important to emit data from AWS services to another service, system, or aggregation layer. AWS SQS (Amazon Simple Queue Service) is a great tool for communicating between such micro-services and software components in real time, at any volume. The aim of this article is a snippet that streams logs via SQS to an external service (such as Logstash or SIEM tools) using a Lambda function.

Pre-Reqs

- Permission to pull data from the specific SQS queue

Steps

Ensure the Lambda function can reach the SQS queue. Below is a snippet of code to push the data in real time:

```python
import gzip
import json
import base64
import boto3
import time

def lambda_handler(event, context):
    sqs = boto3.client('sqs')
    account = boto3.client('sts').get_caller_identity()['Account']
    queue_url = "{}.fifo".format(account)
    cw_da
```
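The excerpt above cuts off mid-snippet, so here is a minimal, hedged sketch of the same idea: a Lambda handler that decodes a CloudWatch Logs subscription event (base64 + gzip) and forwards each log message to an account-named FIFO queue. The queue-name pattern, the `MessageGroupId`, and the `decode_cw_event` helper are assumptions for illustration, not the original post's full code:

```python
import gzip
import json
import base64

def decode_cw_event(event):
    """Decode a CloudWatch Logs subscription event into a list of log messages."""
    payload = base64.b64decode(event["awslogs"]["data"])
    data = json.loads(gzip.decompress(payload))
    return [e["message"] for e in data.get("logEvents", [])]

def lambda_handler(event, context):
    # boto3 is imported here so decode_cw_event stays testable without AWS access
    import boto3
    sqs = boto3.client("sqs")
    account = boto3.client("sts").get_caller_identity()["Account"]
    # Assumed convention from the excerpt: queue is named "<account-id>.fifo"
    queue_url = sqs.get_queue_url(QueueName="{}.fifo".format(account))["QueueUrl"]
    messages = decode_cw_event(event)
    for message in messages:
        sqs.send_message(
            QueueUrl=queue_url,
            MessageBody=message,
            MessageGroupId="cloudwatch-logs",  # FIFO queues require a group id
        )
    return {"forwarded": len(messages)}
```

A downstream consumer (Logstash with the SQS input plugin, or a SIEM collector) then polls the queue and ingests the messages.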

Elastic Beats on pfSense : Installation and configuration

Summary

Though in many cases syslog is preferred for transporting pfSense logs to an external system, Elastic Beats provides quite a niche way to send the logs while modelling the data along the way. This makes the data ready to send directly to Elasticsearch and to get ready-made outcomes such as SIEM and performance dashboards.

Pre-reqs

- A build server (preferably Ubuntu or Fedora) with internet connectivity
- Shell access to the pfSense server
- Basic knowledge of the Elastic Stack (filebeat.yml configuration etc.)
- Connectivity allowed from the pfSense machine to your Elastic Stack receiver

Setup Summary

1. Connectivity tests
2. Install dependencies on the build server (vagrant, virtualbox, gmake, go etc.)
3. Download the Elastic Beats source
4. Build the Elastic Beats package for FreeBSD
5. Copy the binary packages to the pfSense server
6. Configure Beats to send to the destination
7. Configure Elasticsearch to view the data

Installation Steps

Connectivity tests: Log on to the pfSense server via Shell
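Once the FreeBSD-built Filebeat binary is on the pfSense box, the "configure Beats to send to destination" step amounts to a small filebeat.yml. The log path and output host below are placeholders for illustration, not values from the original post:

```yaml
# Minimal illustrative filebeat.yml for pfSense (hypothetical host and paths)
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /var/log/filter.log   # assumed pfSense firewall log location

output.elasticsearch:
  hosts: ["elastic.example.local:9200"]   # replace with your Elastic Stack receiver
```

Pointing the output at Logstash instead (`output.logstash`) is a common alternative when the events need further parsing before indexing.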