- Set everything up normally and get things working in testing.
- Have the Dockerfile end with something like `CMD ["./server.sh"]`
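For context, a minimal Dockerfile sketch. The base image, installing Redis inside the image, and the file names are all my assumptions, not something the setup prescribes:

```dockerfile
# Assumed base image; swap in whatever you're actually using
FROM python:3.11-slim

# server.sh starts redis-server, so it has to exist inside the image
RUN apt-get update && apt-get install -y --no-install-recommends redis-server \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .
RUN chmod +x server.sh

CMD ["./server.sh"]
```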
- In `server.sh`, have something like:

```sh
#!/bin/bash
# Start Redis in the background and remember its PID
redis-server &
P1=$!
# Start the Celery worker in the background
celery -A app.celery worker --loglevel=info &
P2=$!
# Run gunicorn in the background too, so $! actually captures its PID
gunicorn --bind :5000 --workers 4 --threads 8 --timeout 0 app:app &
P3=$!
# Keep the container alive until the processes exit
wait $P1 $P2 $P3
```
- And run that locally... Does it work? Cool.
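For the local run, something like this (the image tag is just a placeholder I made up):

```sh
# Build and run the container, exposing gunicorn's port
docker build -t flask-bg .
docker run --rm -p 5000:5000 flask-bg
# Then hit http://localhost:5000 and kick off a task
```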
- When you deploy to Cloud Run, your tasks might fuck up if they're long-lasting or make external API calls. In my case, I'm calling a Google API that takes a long time, and before I made one specific change I got SSL errors galore.
- Make sure you switch the "CPU allocation and pricing" option to "CPU is always allocated"
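If you're deploying from the CLI instead of clicking through the console, the same setting is (as far as I know) the `--no-cpu-throttling` flag; the service name and region here are placeholders:

```sh
# CPU always allocated = no CPU throttling between requests
gcloud run deploy my-service \
  --source . \
  --region us-central1 \
  --no-cpu-throttling
```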
- Should work
Yes, I know I should use a VM or sidecars or something. I don't care right now. I just need something that works. Remember, tech debt can be a privilege.
Hey, Luke! Thanks. I ended up doing an old-school VM. More or less the same thing (later I added django-rq to avoid database locks in SQLite).
But "tech debt can be a privilege" hasn't been forgotten to this day. :D