sudo apt install python3-pip
pip install redis -t .
(the -t . flag vendors the redis package into the current directory, so it gets zipped alongside index.py later)
In index.py:
import boto3
import redis
import datetime

# Redis configuration
redis_host = 'cluster-session.***.euc1.cache.amazonaws.com'
redis_port = 6379
redis_password = ''

# S3 configuration
s3_bucket_name = '*****-redis-sessions-debug'

def handler(event, context):
    # Connect to Redis
    client = redis.StrictRedis(
        host=redis_host,
        port=redis_port,
        password=redis_password,
        decode_responses=True
    )

    # Get all keys from Redis (KEYS is O(N) and blocks the server;
    # prefer SCAN on large or busy clusters)
    keys = client.keys('*')

    # Find the biggest key by value size
    biggest_key = ''
    biggest_size = 0
    for key in keys:
        # STRLEN only works on string keys; skip other types
        if client.type(key) != 'string':
            continue
        size = client.strlen(key)
        if size > biggest_size:
            biggest_key = key
            biggest_size = size

    if not biggest_key:
        client.close()
        return 'No string keys found'

    # Get the value of the biggest key
    value = client.get(biggest_key)

    # Generate a dynamic filename based on the current date and time
    current_time = datetime.datetime.now().strftime('%Y-%m-%d_%H-%M')
    filename = f'{current_time}.txt'

    # Upload the value as a file to S3
    s3 = boto3.resource('s3')
    s3_object = s3.Object(s3_bucket_name, filename)
    s3_object.put(Body=value)

    # Disconnect from Redis
    client.close()
    return 'Success'
The above code runs when a CloudWatch alarm fires (in our case, Redis cluster memory utilization above a certain threshold). The alarm triggers the Lambda, which connects to the Redis cluster, lists all keys, and finds the one with the biggest value. It then copies that value into a text file named with the current date and time, and uploads it to an S3 bucket for further dev review.
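The alarm-to-Lambda wiring itself is not shown above. One common pattern is to have the CloudWatch alarm publish to an SNS topic and subscribe the Lambda to that topic. Below is a minimal boto3 sketch of that wiring, assuming the function is already deployed; the function ARN, topic name, cluster id, and 80% threshold are placeholder assumptions, and DatabaseMemoryUsagePercentage is the ElastiCache Redis memory utilization metric:

import boto3

sns = boto3.client('sns')
cloudwatch = boto3.client('cloudwatch')
lambda_client = boto3.client('lambda')

# Placeholder ARN of the already-deployed debug function
function_arn = 'arn:aws:lambda:eu-central-1:123456789012:function:redis-debug'

# SNS topic that the alarm will publish to
topic_arn = sns.create_topic(Name='redis-memory-alarm')['TopicArn']

# Allow SNS to invoke the function, then subscribe it to the topic
lambda_client.add_permission(
    FunctionName=function_arn,
    StatementId='sns-invoke',
    Action='lambda:InvokeFunction',
    Principal='sns.amazonaws.com',
    SourceArn=topic_arn,
)
sns.subscribe(TopicArn=topic_arn, Protocol='lambda', Endpoint=function_arn)

# Alarm when cluster memory utilization averages above 80% for one minute
cloudwatch.put_metric_alarm(
    AlarmName='redis-memory-above-threshold',
    Namespace='AWS/ElastiCache',
    MetricName='DatabaseMemoryUsagePercentage',
    Dimensions=[{'Name': 'CacheClusterId', 'Value': 'cluster-session'}],
    Statistic='Average',
    Period=60,
    EvaluationPeriods=1,
    Threshold=80.0,
    ComparisonOperator='GreaterThanThreshold',
    AlarmActions=[topic_arn],
)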
zip -r redis-debug.zip .
Upload the zip file to Lambda.
Runtime: Python 3.10 on arm64
Allowed RAM: 2056 MB
Max execution time: 1 minute or more, depending on how busy your Redis cluster is
Remember to run the Lambda in a VPC that can reach the Redis cluster, and give it the appropriate permissions to write to the required S3 bucket; a deployment sketch covering these settings follows below.
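For completeness, here is a boto3 sketch of the deployment with the settings above. The function name, role name/ARN, and subnet/security-group ids are placeholder assumptions, and the role is assumed to already carry the basic Lambda execution and VPC access policies:

import json
import boto3

lambda_client = boto3.client('lambda')
iam = boto3.client('iam')

# Placeholder role assumed to exist with Lambda execution + VPC access attached
role_arn = 'arn:aws:iam::123456789012:role/redis-debug-role'

with open('redis-debug.zip', 'rb') as f:
    lambda_client.create_function(
        FunctionName='redis-debug',
        Runtime='python3.10',
        Architectures=['arm64'],
        Handler='index.handler',
        Role=role_arn,
        Code={'ZipFile': f.read()},
        MemorySize=2056,
        Timeout=60,
        VpcConfig={
            'SubnetIds': ['subnet-xxxxxxxx'],      # subnets that can reach Redis
            'SecurityGroupIds': ['sg-xxxxxxxx'],   # must allow outbound on 6379
        },
    )

# Inline policy granting write access to the debug bucket
iam.put_role_policy(
    RoleName='redis-debug-role',
    PolicyName='write-redis-sessions-debug',
    PolicyDocument=json.dumps({
        'Version': '2012-10-17',
        'Statement': [{
            'Effect': 'Allow',
            'Action': 's3:PutObject',
            'Resource': 'arn:aws:s3:::*****-redis-sessions-debug/*',
        }],
    }),
)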
