Running Redis rq worker on Docker
10:40 04 Nov 2020

I am trying to make a queue of tasks using redis rq. I was trying to follow a tutorial but I am using docker. Below is my code-

app.py

from flask import Flask, request
import redis
from rq import Queue

import time
app = Flask(__name__)

r = redis.Redis()
q = Queue(connection=r)

def background_task(n):

    """ Function that returns len(n) and simulates a delay """

    delay = 2

    print("Task running")
    print(f"Simulating a {delay} second delay")

    time.sleep(delay)

    print(len(n))
    print("Task complete")

    return len(n)

def index():

    if request.args.get("n"):

        job = q.enqueue(background_task, request.args.get("n"))

        return f"Task ({job.id}) added to queue at {job.enqueued_at}"

    return "No value for count provided"


if __name__ == "__main__":
    app.run()

Docker compose file-

version: "3.8"
services:
  web:
    build: .
    ports:
      - "5000:5000"
    volumes:
      - .:/code
    environment:
      FLASK_ENV: development
     
      
  redis:
    image: "redis:alpine"

Dockerfile

FROM python:3.7-alpine
WORKDIR /code
ENV FLASK_APP=app.py
ENV FLASK_RUN_HOST=0.0.0.0
RUN apk add --no-cache gcc musl-dev linux-headers
COPY requirements.txt requirements.txt
RUN pip install -r requirements.txt
EXPOSE 5000
COPY . .
CMD ["flask", "run"]

Whenever I run '''docker-compose up --build''' and open http://localhost:5000/ I get Url not found Where am I going wrong? How is one supposed to use rq worker command in docker containers.

python docker flask redis docker-compose