Rashid

Implement real-time updates with Django REST Framework | Building Cryptocurrency API

This post is cross-published with OnePublish.

What's up DEVs?

It is time to create another cool project using Django and REST Framework.

In this post, we are going to build a real-time REST API.

YouTube channel with video tutorials: Reverse Python YouTube

Before we go on, please take a look at REVERSE PYTHON. You can find more articles like this with UI/UX design, and if you liked it, please share it on social media or with your friends.

Currently I am interested in cryptocurrency, so I decided to create a Cryptocurrency API to use in React. We need to crawl and update data continuously while avoiding long request timeouts.

Installation and Configuration

Let's start by creating a new project named cryptocurrencytracking, and inside it create an app named trackingAPI:

django-admin startproject cryptocurrencytracking
cd cryptocurrencytracking
django-admin startapp trackingAPI

and install REST Framework:

pip install djangorestframework

Once the installation is completed, open your settings.py and update INSTALLED_APPS.

INSTALLED_APPS = [
    ...
    'rest_framework',
    'trackingAPI',
]

As I stated before, we need to handle long-running requests. Celery is the best choice for doing background task processing in the Python/Django ecosystem. It has a simple and clear API, and it integrates beautifully with Django. So, we are using Celery to handle the time-consuming tasks by passing them to a queue to be executed in the background, always keeping the server ready to respond to new requests.

To install Celery, run the following command:

pip install celery

Celery requires a solution to send and receive messages; usually this comes in the form of a separate service called a message broker. We will configure Celery to use the RabbitMQ messaging system, as it provides robust, stable performance and interacts well with Celery.

We can install RabbitMQ from Ubuntu's repositories with the following command:

sudo apt-get install rabbitmq-server

Then enable and start the RabbitMQ service:

sudo systemctl enable rabbitmq-server
sudo systemctl start rabbitmq-server

If you are on macOS, see: Install RabbitMQ on Mac

Once the installation is completed, add the CELERY_BROKER_URL configuration to the end of your settings.py file:

CELERY_BROKER_URL = 'amqp://localhost'

Then, create celery.py inside your project directory (next to settings.py).

celery.py

import os

from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'cryptocurrencytracking.settings')

app = Celery('cryptocurrencytracking')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

We are setting the default Django settings module for the 'celery' program and loading task modules from all registered Django app configs.

Now, inside your project's __init__.py, import the Celery app:

from .celery import app as celery_app

__all__ = ['celery_app']

This will make sure our Celery app is loaded every time Django starts.
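If you want a quick sanity check that the app is wired up (an optional step, not part of the original tutorial), you can inspect it from the Django shell:

# Optional: run inside `python manage.py shell`
from cryptocurrencytracking import celery_app

print(celery_app.main)             # expected: cryptocurrencytracking
print(celery_app.conf.broker_url)  # expected: amqp://localhost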

Creating model

In your models.py:

from django.db import models


class Cryptocurrency(models.Model):
    cryptocurrency = models.CharField(max_length=100)
    price = models.CharField(max_length=100)
    market_cap = models.CharField(max_length=100)
    change = models.CharField(max_length=100)

    def __str__(self):
        return self.cryptocurrency

We are going to crawl a website named Coinranking, and if you visit the site you can see these fields there.


Crawling Cryptocurrency Data

We will use BeautifulSoup to crawl the cryptocurrency values from the given URL.

Beautiful Soup is a Python library for pulling data out of HTML and XML files. It works with your favorite parser to provide idiomatic ways of navigating, searching, and modifying the parse tree. It commonly saves programmers hours or days of work. Run the following command in your terminal to install Beautiful Soup:

pip install beautifulsoup4
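Before wiring it into a Celery task, here is a quick standalone illustration of the find() / find_all() calls we will rely on below (the HTML fragment is made up purely for demonstration):

from bs4 import BeautifulSoup

# A made-up HTML fragment mimicking the structure we will parse later
html = """
<tbody class="table__body">
  <tr class="table__row">
    <span class="profile__name">Bitcoin</span>
    <div class="valuta">$9,000.00</div>
    <div class="valuta">$160 billion</div>
  </tr>
</tbody>
"""

bs = BeautifulSoup(html, 'html.parser')
row = bs.find('tbody', class_="table__body").find('tr', class_="table__row")

name = row.find('span', class_="profile__name").get_text().strip()
values = row.find_all('div', class_="valuta")
price = values[0].get_text().strip()
market_cap = values[1].get_text().strip()

print(name, price, market_cap)  # Bitcoin $9,000.00 $160 billion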

Now, create a new file named tasks.py inside our trackingAPI app.

# tasks.py
from time import sleep

from celery import shared_task
from bs4 import BeautifulSoup
from urllib.request import urlopen, Request

from .models import Cryptocurrency


@shared_task
def crawl_currency():
    # do some heavy stuff
    print('Crawling data and creating objects in database ..')
    req = Request('https://coinranking.com', headers={'User-Agent': 'Mozilla/5.0'})
    html = urlopen(req).read()
    bs = BeautifulSoup(html, 'html.parser')

    # Find first 5 table rows
    rows = bs.find('tbody', class_="table__body").find_all('tr', class_="table__row")[0:5]
    for row in rows:
        cryptocurrency = row.find('span', class_="profile__name").get_text().strip().replace('\n', '')
        values = row.find_all('div', class_="valuta")
        price = values[0].get_text().strip().replace('\n', '')
        market_cap = values[1].get_text().strip().replace('\n', '')
        change = row.find('div', class_="change").find('span').get_text().strip().replace('\n', '')

        print({'cryptocurrency': cryptocurrency, 'price': price, 'market_cap': market_cap, 'change': change})

        # Create object in database from crawled data
        Cryptocurrency.objects.create(
            cryptocurrency=cryptocurrency,
            price=price,
            market_cap=market_cap,
            change=change
        )

        # Sleep 3 seconds to avoid any errors
        sleep(3)

@shared_task will create an independent instance of the task for each app, making the task reusable. This makes the @shared_task decorator useful for libraries and reusable apps, since they will not have access to the app of the user.
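For reference, here is how such a task can be invoked once the worker is running (a usage sketch; in this tutorial the tasks are triggered from tasks.py itself):

from trackingAPI.tasks import crawl_currency

# Run the task in the current process (blocking)
crawl_currency()

# Or send it to the Celery worker through the broker (non-blocking)
crawl_currency.delay()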

As you can see, we are crawling our data, cleaning it of unwanted characters, and then creating new objects in the database.

Once the data is crawled, we need to continuously update these objects.

# tasks.py
@shared_task
def update_currency():
    print('Updating data ..')
    req = Request('https://coinranking.com', headers={'User-Agent': 'Mozilla/5.0'})
    html = urlopen(req).read()
    bs = BeautifulSoup(html, 'html.parser')

    rows = bs.find('tbody', class_="table__body").find_all('tr', class_="table__row")[0:5]
    for row in rows:
        cryptocurrency = row.find('span', class_="profile__name").get_text().strip().replace('\n', '')
        values = row.find_all('div', class_="valuta")
        price = values[0].get_text().strip().replace('\n', '')
        market_cap = values[1].get_text().strip().replace('\n', '')
        change = row.find('div', class_="change").find('span').get_text().strip().replace('\n', '')

        print({'cryptocurrency': cryptocurrency, 'price': price, 'market_cap': market_cap, 'change': change})

        data = {'cryptocurrency': cryptocurrency, 'price': price, 'market_cap': market_cap, 'change': change}
        Cryptocurrency.objects.filter(cryptocurrency=cryptocurrency).update(**data)

        sleep(3)


# Run this function if database is empty
if not Cryptocurrency.objects.all():
    crawl_currency()

while True:
    sleep(15)
    update_currency()

As you can see, we are re-crawling the data every 15 seconds and updating our objects.

If you want to see the result, start Celery in your terminal:

celery -A cryptocurrencytracking worker -l info

and then check your admin to see the created objects.
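Note that the model only shows up in the admin after it has been registered; the tutorial does not show this file, but a minimal trackingAPI/admin.py would be:

# admin.py
from django.contrib import admin

from .models import Cryptocurrency

admin.site.register(Cryptocurrency)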

Building API

Alright! Now our objects are updating, and we need to create the API using REST Framework.

Now, create serializers.py inside our app.

Serializers allow complex data such as querysets and model instances to be converted to native Python datatypes that can then be easily rendered into JSON, XML or other content types. Serializers also provide deserialization, allowing parsed data to be converted back into complex types, after first validating the incoming data.

# serializers.py
from rest_framework import serializers

from .models import Cryptocurrency


class CryptocurrencySerializer(serializers.ModelSerializer):
    class Meta:
        model = Cryptocurrency
        fields = ['cryptocurrency', 'price', 'market_cap', 'change']

The ModelSerializer class provides a shortcut that lets you automatically create a Serializer class with fields that correspond to the Model fields.
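To get a feel for what the serializer produces, you can try it in the Django shell (the actual values depend on what has been crawled into your database):

# Run inside `python manage.py shell`
from trackingAPI.models import Cryptocurrency
from trackingAPI.serializers import CryptocurrencySerializer

obj = Cryptocurrency.objects.first()
serializer = CryptocurrencySerializer(obj)
print(serializer.data)
# e.g. {'cryptocurrency': 'Bitcoin', 'price': '$9,000.00', 'market_cap': '...', 'change': '...'}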

For more information, take a look at the documentation.

The next step is building the API views, so open views.py:

# views.py
from rest_framework import generics

from .models import Cryptocurrency
from .serializers import CryptocurrencySerializer


class ListCryptocurrencyView(generics.ListAPIView):
    """
    Provides a get method handler.
    """
    queryset = Cryptocurrency.objects.all()
    serializer_class = CryptocurrencySerializer

and finally configure urls.py

# urls.py
from django.contrib import admin
from django.urls import path

from trackingAPI.views import ListCryptocurrencyView

urlpatterns = [
    path('admin/', admin.site.urls),
    path('', ListCryptocurrencyView.as_view()),
]

When you run the server and Celery (in separate terminals), you will see the following result:

[Screenshot of the resulting API response]

Try refreshing the page every 15 seconds or every minute and you will notice that the values are changing.
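You can also hit the endpoint from a small script instead of the browser; assuming the development server is running on the default localhost:8000, something like this should work:

import json
from urllib.request import urlopen

# Ask DRF explicitly for JSON instead of the browsable API
with urlopen('http://localhost:8000/?format=json') as response:
    coins = json.loads(response.read())

for coin in coins:
    print(coin['cryptocurrency'], coin['price'], coin['change'])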

You can clone or download this project from my GitHub:

thepylot / Cryptocurrency-REST-API-Django

API built with Django REST Framework

Getting Started

This tutorial works on Python 3+ and Django 2+.

Install dependencies:

python3 -m pip install -r requirements.txt

Run the following commands:

python3 manage.py makemigrations trackingAPI
python3 manage.py migrate
python3 manage.py runserver

and start Celery worker:

celery -A cryptocurrencytracking worker -l info 

Mission Accomplished!

I hope you learned something from this tutorial. Make sure you are following me on social media, and also check out REVERSE PYTHON.

Top comments (11)

ChristianGeng

Lovely, thanks for sharing this!
When trying to get it to work, I have a potentially dumb error:
I am getting the Rest Entry Point at localhost:8000, however the list of data stays empty.
Same with the default database db.sqlite3 where no data are arriving either (the table is created after doing the migrations, but the select * returns nothing).
However, the celery task works in that it prints the data updates to the console (starting with
Updating data... and the new bitcoin etc. rates).
Now I wonder: am I making a kind of typical, very dumb mistake?

Rashid

Did you migrate your project? Try to delete all migrations and migrate your project right after you create the models. You can check my GitHub; I put the commands there. Let me know if it worked for you :)

ChristianGeng

Thank you very much for the fast response! I have now rebuilt everything (also a fresh install of a venv with the requirements), but I am still getting the same problem:

find . -path "*/migrations/*.py" -not -name "__init__.py" -delete
find . -path "*/migrations/*.pyc" -delete
rm -v db.sqlite3
python3 manage.py makemigrations trackingAPI
python3 manage.py migrate
python3 manage.py runserver
celery -A cryptocurrencytracking worker -l info

I also tried a completely fresh checkout with a git clone, but again the same issue (the database gets created, but no data ends up there. The REST view is visible on localhost:8000, but no data arrives; only the celery worker prints the data as intended).

Rashid

Hmm, I will check it again in a few hours. Can you please update your tasks.py file:

# tasks.py
from time import sleep

from celery import shared_task
from bs4 import BeautifulSoup
from urllib.request import urlopen, Request

from .models import Cryptocurrency


@shared_task
def crawl_currency():
    # do some heavy stuff
    print('Crawling data and creating objects in database ..')
    req = Request('https://coinranking.com', headers={'User-Agent': 'Mozilla/5.0'})
    html = urlopen(req).read()
    bs = BeautifulSoup(html, 'html.parser')

    # Find first 5 table rows
    rows = bs.find('tbody', class_="table__body").find_all('tr', class_="table__row")[0:5]
    for row in rows:
        cryptocurrency = row.find('span', class_="profile__name").get_text().strip().replace('\n', '')
        values = row.find_all('div', class_="valuta")
        price = values[0].get_text().strip().replace('\n', '')
        market_cap = values[1].get_text().strip().replace('\n', '')
        change = row.find('div', class_="change").find('span').get_text().strip().replace('\n', '')

        print({'cryptocurrency': cryptocurrency, 'price': price, 'market_cap': market_cap, 'change': change})

        # Create object in database from crawled data
        Cryptocurrency.objects.create(
            cryptocurrency=cryptocurrency,
            price=price,
            market_cap=market_cap,
            change=change
        )

        # Sleep 3 seconds to avoid any errors
        sleep(3)


crawl_currency()

and then check your admin to see whether the objects are created.

Make sure you are running celery and django at the same time.

ChristianGeng

Thanks a lot, that was it!

not Cryptocurrency.objects 

is always false, so crawl_currency() was never called.
I have set a celery breakpoint there for the first time, then called crawl_currency() at the breakpoint by hand, and now it works like a charm!

Thanks so much!

P.S.:
Would it make sense to test for an empty queryset instead? Sth. like this:

not Cryptocurrency.objects.all() 
Rashid

Great! Ah, I see. Yeah I will fix that line. Thank you for your attention👍

Marcelo C. Lopes

Hello guys, when this code snippet is active, RabbitMQ is idle:

if not Cryptocurrency.objects:
    crawl_currency()

while True:
    sleep(15)
    update_currency()

I didn't understand this logic, could someone help me?

Hamza

Thank you sir for such an amazing article. Now I am working on a similar project, but instead of refreshing I want my API to automatically stream data whenever new data comes in.

Smyja

Must the data be stored? Can't you just display it without storing it? Kindly explain.

arshiamhm

Can you clarify why the data has to be stored? Serializers seem enough.

brahim boughanm

Hi, can you add async services like Channels to see this data updated in the front-end?