Logging

There is a thousand-year-old Chinese quote that says:

What's the biggest red flag at a startup? It has no logging mechanism.

- A frustrated client-handling startup engineer
 

I can't emphasize enough how many times logging has saved me. I used to work at a chatbot company. The rule of thumb there was that end-users are always to be considered careless and our application should be able to handle their actions. You will explicitly tell them to upload a jpg/png image file and they will upload anything that is not a jpg/png. In such situations, our clients would tell us that end-users were complaining they were unable to send a file. We needed to investigate such cases and come up with an RCA (Root Cause Analysis). The very first thing we would do is look up the Celery logs.

I really can't stress enough the importance of logs in a company of any scale. They can be used for analytics, RCAs, security audits, error analysis, transaction recovery, and much more. In the industry, we mostly either use rotating file logging or stream the logs to an external service. For now, we will simply log the events into a file.
Logging comes with a lot of benefits, some of which are:

  • We can store the logs in a file. Even better, we can store multiple files, e.g. celery.log, celery.log.1, ..., celery.log.10. The higher the number, the older the logs. So, if a client asks for an RCA of a problem that occurred two days ago, we can inspect the older logs and provide the root cause.
  • We can configure the logging level. So even if we have thousands of info, warning, and error logs, when we only want to see the errors, we can get just the error logs.
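The rotation scheme described in the first bullet (celery.log, celery.log.1, ..., celery.log.10) is what Python's standard library provides through logging.handlers.RotatingFileHandler. Here is a minimal standalone sketch; the file name, maxBytes, and backupCount values are illustrative and not part of our app:

```python
import logging
from logging.handlers import RotatingFileHandler

# Illustrative sketch: a rotating handler keeps the active file plus
# numbered backups (celery.log.1 is newer than celery.log.2, and so on).
logger = logging.getLogger("rotation_demo")
logger.setLevel(logging.INFO)

handler = RotatingFileHandler(
    "celery.log",    # active log file
    maxBytes=200,    # tiny limit so rollover triggers quickly in this demo
    backupCount=10,  # keep celery.log.1 ... celery.log.10 at most
)
handler.setFormatter(logging.Formatter("%(name)s - %(levelname)s - %(message)s"))
logger.addHandler(handler)

for i in range(50):
    logger.info("event number %d", i)  # enough records to force rollovers
```

With a realistic maxBytes (say, a few megabytes), the handler silently rolls files over as the app runs, so old logs stay available for RCAs without any single file growing unbounded.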

In this tutorial, we are going to configure a very simple logging system that logs into a file. This will be especially beneficial for tracking and analyzing our background tasks.

πŸ“ ./
β”œβ”€πŸ“„ config.py
β”œβ”€πŸ“„ main.py
β”œβ”€πŸ“„ .env
β”œβ”€πŸ“„ .gitignore
β”œβ”€πŸ“„ requirements.txt
β””β”€πŸ“ env/

We have already seen .env and config.py in the previous post. Let's focus on the main part, i.e. the main.py file. I would suggest you write it out yourself instead of copying it; that way, you won't need to read the breakdown below, and you should be able to understand everything.
 

import os
import time
import logging
from fastapi import FastAPI
from celery import Celery

from config import settings


if not os.path.exists('logs'):
    os.makedirs('logs')

logging.basicConfig(filename='logs/app.log', level=logging.INFO, format='%(name)s - %(levelname)s - %(message)s')
logger = logging.getLogger(__name__)

app = FastAPI()

celery = Celery(
    __name__,
    broker=settings.CELERY_BROKER_URL,
    backend=settings.CELERY_RESULT_BACKEND
)


@celery.task
def send_notification(device_token: str):
    logger.info("starting background task")
    time.sleep(10)  # simulates slow network call to firebase/sns
    try:
        a = 11 / 0  # critical part that may fail; its analysis is important
        logger.info(f"notification sent {device_token}")
    except Exception as e:
        logger.error(f"exception while division {e}")


@app.get("/push/{device_token}")
async def notify(device_token: str):
    logger.info("sending notification in background")
    send_notification.delay(device_token)
    return {"message": "Notification sent"}


Let us break down the lines:

  • We do the necessary imports.
  • We create a logs directory if it does not exist. This is the directory where we are going to dump our log files.
  • We set up a basic logging configuration: we specify the file to log into, the minimum level at or above which records are captured, and the format in which our logs should appear.
  • We instantiate FastAPI and Celery.
  • @celery.task
    def send_notification(device_token: str):
    Once we decorate a function with @celery.task or @shared_task, Celery can identify it and register it as a background task.
  • Finally, we define a route and hook our background task to it. This way, our background task executes and logs when a request arrives at the specified path.
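To see the level threshold from the configuration step in action, here is a small hypothetical sketch (the logger name level_demo and the in-memory stream are made up for illustration): with the level set to WARNING, info records are dropped while error records get through, using the same format string as main.py.

```python
import io
import logging

# Illustrative sketch: capture log output in a string buffer so the
# effect of the level threshold is easy to inspect.
stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(logging.Formatter("%(name)s - %(levelname)s - %(message)s"))

logger = logging.getLogger("level_demo")
logger.addHandler(handler)
logger.setLevel(logging.WARNING)  # anything below WARNING is discarded

logger.info("sending notification in background")  # filtered out
logger.error("exception while division")           # kept

print(stream.getvalue(), end="")
# level_demo - ERROR - exception while division
```

This is why raising the level in basicConfig is enough to cut through thousands of info records when you only care about errors.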

All done now. In one terminal, start the uvicorn server with uvicorn main:app --reload, and in another terminal start a Celery worker with celery -A main.celery worker --loglevel=info. Use the docs or Postman to make a request to this endpoint. Hopefully, our logs should be there in logs/app.log. Let's visit this file and wait for disappointment!

#logs/app.log
main - INFO - sending notification in background

All you should see is this. Even if we wait for 10 seconds, the Celery logs won't appear here🤦. However, in the Celery worker terminal, you should see the missing logs in an unformatted form. What went wrong and how can we fix it? Answers in the next episode!!

FastAPITutorial

Bridge the gap between tutorial hell and industry. We want to bring in the culture of Clean Code and Test-Driven Development.

We know we might make it hard for you, but it is definitely worth the effort.


Β© Copyright 2022-23 Team FastAPITutorial