Database Management
Redis
Subjective
Oct 05, 2025
How do you use Redis as a message queue system?
Detailed Explanation
Redis can implement various message queue patterns:
• **Simple Queue with Lists:**

```python
import redis
import json
import time

r = redis.Redis(decode_responses=True)

# Producer
def enqueue_job(queue_name, job_data):
    job = {
        'id': str(int(time.time() * 1000)),
        'data': job_data,
        'created_at': time.time()
    }
    r.lpush(queue_name, json.dumps(job))
    return job['id']

# Consumer
def process_jobs(queue_name):
    while True:
        # Blocking pop (waits up to 5 seconds for a job)
        result = r.brpop(queue_name, timeout=5)
        if result:
            queue, job_json = result
            job = json.loads(job_json)
            try:
                # Process the job
                print(f"Processing job {job['id']}: {job['data']}")
                time.sleep(2)  # Simulate work
                print(f"Job {job['id']} completed")
            except Exception as e:
                print(f"Job {job['id']} failed: {e}")
```
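One weakness of a plain BRPOP consumer is that a job popped just before a crash is lost. The "reliable queue" pattern fixes this by atomically moving the job into a per-worker processing list (`RPOPLPUSH`, or `BLMOVE` in redis-py 4+) and deleting it only after successful processing. The sketch below is illustrative: `MiniRedis` is a tiny in-memory stand-in for the three list commands involved, so the hand-off flow can be demonstrated without a running server; with a real client you would call the same methods on `r`.

```python
import json

class MiniRedis:
    """In-memory stand-in for the three list commands used by the pattern."""
    def __init__(self):
        self.lists = {}

    def lpush(self, key, value):
        self.lists.setdefault(key, []).insert(0, value)

    def rpoplpush(self, src, dst):
        # Models RPOPLPUSH: atomically pop from the tail of src
        # and push onto the head of dst.
        items = self.lists.get(src)
        if not items:
            return None
        value = items.pop()
        self.lists.setdefault(dst, []).insert(0, value)
        return value

    def lrem(self, key, count, value):
        self.lists.get(key, []).remove(value)

def process_reliably(client, queue, processing):
    # Move the job to the processing list instead of popping it outright.
    job_json = client.rpoplpush(queue, processing)
    if job_json is None:
        return None
    job = json.loads(job_json)
    # ... do the work here. If the worker crashes at this point, the job
    # still sits in `processing` and a recovery sweep can re-queue it.
    client.lrem(processing, 1, job_json)  # acknowledge: remove after success
    return job
```

The acknowledgment step (`lrem` after the work finishes) is what makes the queue reliable: a job is never in zero lists at any point.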
• **Priority Queue with Sorted Sets:**

```python
def enqueue_priority_job(queue_name, job_data, priority=0):
    job_id = str(int(time.time() * 1000000))
    job = {'id': job_id, 'data': job_data}
    # Higher score = higher priority
    r.zadd(queue_name, {json.dumps(job): priority})
    return job_id

def dequeue_priority_job(queue_name):
    # Pop the highest-priority job (highest score)
    result = r.zpopmax(queue_name)
    if result:
        job_json, priority = result[0]
        return json.loads(job_json)
    return None
```
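One caveat with plain priority scores: sorted-set members with equal scores are ordered lexicographically by their serialized payload, so equal-priority jobs do not dequeue FIFO. A common workaround is a composite score whose integer part is the priority and whose fractional part encodes an enqueue sequence number. The helper below is an illustrative sketch (the names `make_score` and `SEQ_SCALE` are not Redis API); the score it returns would be passed to `zadd` in place of the raw priority.

```python
SEQ_SCALE = 1e12  # large enough that the fraction never reorders priorities

def make_score(priority, seq):
    # Higher priority wins. Among equal priorities, a lower sequence
    # number (earlier enqueue) yields a slightly larger score, so
    # ZPOPMAX returns the oldest job first.
    return priority - seq / SEQ_SCALE
```

The sequence number itself can come from `INCR` on a counter key, which is atomic across producers.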
• **Delayed Jobs with Sorted Sets:**

```python
def enqueue_delayed_job(queue_name, job_data, delay_seconds):
    execute_at = time.time() + delay_seconds
    job = {
        'id': str(int(time.time() * 1000000)),
        'data': job_data
    }
    # Score is the timestamp at which the job becomes runnable
    r.zadd(f"{queue_name}:delayed", {json.dumps(job): execute_at})

def process_delayed_jobs(queue_name):
    now = time.time()
    # Get jobs whose execution time has arrived
    jobs = r.zrangebyscore(
        f"{queue_name}:delayed",
        0, now,
        withscores=True
    )
    for job_json, score in jobs:
        # Move to the main queue. ZREM returns how many members it
        # removed, so only the worker that actually claimed the job
        # enqueues it when several movers run concurrently.
        if r.zrem(f"{queue_name}:delayed", job_json):
            r.lpush(queue_name, job_json)
```
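Guarding the hand-off on ZREM's return value matters whenever more than one mover polls the delayed set: ZREM reports how many members it removed, so exactly one worker "wins" each job. That claim semantic can be simulated with plain Python data, as in this illustrative sketch (the set and list below stand in for the sorted set and main queue):

```python
# In-memory stand-in: `delayed` plays the sorted set, `claim` plays ZREM.
delayed = {'{"id": "1"}', '{"id": "2"}'}

def claim(member):
    # Models r.zrem(key, member): returns 1 if this caller removed the
    # member, 0 if another caller already did.
    if member in delayed:
        delayed.discard(member)
        return 1
    return 0

def move_if_claimed(member, main_queue):
    if claim(member):  # only the claimer enqueues the job
        main_queue.append(member)
        return True
    return False
```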
**Best Practices:**
• Use reliable queues for critical jobs
• Implement retry logic with exponential backoff
• Monitor queue lengths and processing times
• Use dead letter queues for failed jobs
• Consider using Redis Streams for complex scenarios
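The retry and dead-letter bullets can be sketched as two small helpers. The names and thresholds here are illustrative choices, not part of any Redis API:

```python
def backoff_delay(attempt, base=1.0, cap=60.0):
    # Exponential backoff: 1s, 2s, 4s, 8s, ... capped at `cap`.
    # `attempt` counts from 0. In production, add random jitter so
    # retries from many workers don't synchronize.
    return min(cap, base * (2 ** attempt))

def route_failed_job(attempt, max_attempts=5):
    # After max_attempts failures, route the job to a dead letter
    # queue for inspection instead of retrying forever.
    return 'retry' if attempt < max_attempts else 'dead_letter'
```

A worker would sleep for `backoff_delay(attempt)` before re-enqueueing, and `LPUSH` the job onto a separate dead-letter list once `route_failed_job` says to stop.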