Build a Crypto Portfolio Tracker with Django and the CoinGecko API
The CoinGecko API is free, requires no API key for basic use, and covers over 13,000 coins. This guide wires it into a Django application with Celery Beat refreshing prices every five minutes, Django Channels pushing live updates over WebSocket, a price alert system that fires when a target is crossed, and a DRF API your frontend can query. All on the free tier.
1. What We're Building
A multi-user portfolio tracker where each user can create named portfolios, add coin holdings with a quantity and average buy price, and see their total value updated in real time. The system has four moving parts:
- CoinGecko client — a thin wrapper around the free /simple/price endpoint that fetches current USD prices, market cap, 24-hour volume, and 24-hour change percentage for a list of coin IDs.
- Celery Beat task — runs every five minutes, fetches prices for all distinct coins held across all portfolios, and stores a PriceSnapshot row per coin.
- Price alert system — each holding can have alerts set at a target price in either direction. After every price refresh, a second Celery task checks which alerts have been crossed and fires a notification.
- Django Channels WebSocket — after each price refresh, the Celery task pushes updated prices to a channel group; connected browser clients receive them and update displayed values without polling.
CoinGecko API
↓ (every 5 min via Celery Beat)
refresh_portfolio_prices task
→ bulk_create PriceSnapshot rows
→ channel_layer.group_send("portfolio_prices", prices)
→ check_price_alerts.delay()
↓
PortfolioConsumer (WebSocket)
→ browser receives price_update event
→ updates DOM without page reload
2. Project Setup and Dependencies
pip install django djangorestframework celery redis channels \
    channels-redis requests django-celery-beat
On Channels 4.x, also install an ASGI server to serve the app — for example pip install daphne, and add "daphne" at the top of INSTALLED_APPS if you want runserver to handle WebSockets during development.
Start Redis with Docker if you don't have it running already:
docker run -d --name redis -p 6379:6379 redis:7-alpine
Add the apps and configure Channels and Celery in settings.py:
# settings.py
INSTALLED_APPS = [
# ...
"rest_framework",
"channels",
"django_celery_beat",
"portfolio",
]
ASGI_APPLICATION = "myproject.asgi.application"
CHANNEL_LAYERS = {
"default": {
"BACKEND": "channels_redis.core.RedisChannelLayer",
"CONFIG": {"hosts": [("127.0.0.1", 6379)]},
}
}
CELERY_BROKER_URL = "redis://127.0.0.1:6379/0"
CELERY_RESULT_BACKEND = "redis://127.0.0.1:6379/0"
CELERY_BEAT_SCHEDULER = "django_celery_beat.schedulers:DatabaseScheduler"
# Optional: CoinGecko API key for higher rate limits
COINGECKO_API_KEY = os.environ.get("COINGECKO_API_KEY")  # add `import os` at the top of settings.py
Set up the ASGI application to route HTTP and WebSocket connections:
# myproject/asgi.py
import os
from django.core.asgi import get_asgi_application
from channels.routing import ProtocolTypeRouter, URLRouter
from channels.auth import AuthMiddlewareStack
import portfolio.routing
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")
application = ProtocolTypeRouter({
"http": get_asgi_application(),
"websocket": AuthMiddlewareStack(
URLRouter(portfolio.routing.websocket_urlpatterns)
),
})
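The celery -A myproject worker and beat commands used later assume a Celery app module, which the snippets above don't show. A minimal myproject/celery.py following the conventional layout from the Celery documentation (adjust "myproject" to your actual project name):

```python
# myproject/celery.py
import os

from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")

app = Celery("myproject")
# Read all CELERY_*-prefixed settings from Django's settings.py
app.config_from_object("django.conf:settings", namespace="CELERY")
# Discover tasks.py modules in every installed app, including portfolio/tasks.py
app.autodiscover_tasks()
```

Also import the app in myproject/__init__.py (from .celery import app as celery_app) so tasks register when Django starts.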
3. The Portfolio Models
Four models cover the full domain: who owns what, what the prices are, and what alerts are watching for price targets.
# portfolio/models.py
import uuid
from django.db import models
from django.conf import settings
from django.utils import timezone
class Portfolio(models.Model):
user = models.ForeignKey(
settings.AUTH_USER_MODEL, on_delete=models.CASCADE, related_name="portfolios"
)
name = models.CharField(max_length=100)
created_at = models.DateTimeField(auto_now_add=True)
class Meta:
ordering = ["name"]
def __str__(self):
return f"{self.name} ({self.user})"
@property
def total_value_usd(self):
return sum(h.current_value_usd or 0 for h in self.holdings.all())
@property
def total_cost_usd(self):
return sum(
float(h.quantity) * float(h.avg_buy_price)
for h in self.holdings.all()
if h.avg_buy_price
)
class Holding(models.Model):
portfolio = models.ForeignKey(Portfolio, on_delete=models.CASCADE, related_name="holdings")
coin_id = models.CharField(max_length=50) # CoinGecko ID: "bitcoin", "ethereum"
coin_symbol = models.CharField(max_length=20) # "BTC", "ETH"
coin_name = models.CharField(max_length=100) # "Bitcoin", "Ethereum"
quantity = models.DecimalField(max_digits=30, decimal_places=18)
avg_buy_price = models.DecimalField(max_digits=20, decimal_places=8, null=True, blank=True)
added_at = models.DateTimeField(auto_now_add=True)
class Meta:
unique_together = ("portfolio", "coin_id")
ordering = ["coin_name"]
def __str__(self):
return f"{self.coin_symbol} in {self.portfolio.name}"
@property
def current_price_usd(self):
snap = (
PriceSnapshot.objects.filter(coin_id=self.coin_id)
.order_by("-recorded_at")
.first()
)
return snap.price_usd if snap else None
@property
def current_value_usd(self):
price = self.current_price_usd
return float(self.quantity) * float(price) if price else None
@property
def pnl_usd(self):
if not self.avg_buy_price or not self.current_price_usd:
return None
return float(self.quantity) * (float(self.current_price_usd) - float(self.avg_buy_price))
class PriceSnapshot(models.Model):
coin_id = models.CharField(max_length=50, db_index=True)
price_usd = models.DecimalField(max_digits=20, decimal_places=8)
market_cap_usd = models.BigIntegerField(null=True)
volume_24h_usd = models.BigIntegerField(null=True)
price_change_24h_pct = models.FloatField(null=True)
recorded_at = models.DateTimeField(auto_now_add=True)
class Meta:
indexes = [models.Index(fields=["coin_id", "-recorded_at"])]
def __str__(self):
return f"{self.coin_id} @ ${self.price_usd} on {self.recorded_at:%Y-%m-%d %H:%M}"
class PriceAlert(models.Model):
ABOVE = "above"
BELOW = "below"
DIRECTION_CHOICES = [(ABOVE, "Above"), (BELOW, "Below")]
holding = models.ForeignKey(Holding, on_delete=models.CASCADE, related_name="alerts")
direction = models.CharField(max_length=10, choices=DIRECTION_CHOICES)
target_price = models.DecimalField(max_digits=20, decimal_places=8)
is_active = models.BooleanField(default=True, db_index=True)
triggered_at = models.DateTimeField(null=True, blank=True)
created_at = models.DateTimeField(auto_now_add=True)
def __str__(self):
return f"{self.holding.coin_symbol} {self.direction} ${self.target_price}"
Run migrations:
python manage.py makemigrations portfolio
python manage.py migrate
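The generous DecimalField precision above is deliberate: quantity allows 18 decimal places because many chains (Ethereum in particular) subdivide tokens to 18 decimals, and binary floats cannot represent such values exactly. A quick illustration of why the models store Decimal and only fall back to float for display math:

```python
from decimal import Decimal

# Binary floats accumulate representation error...
print(0.1 + 0.2)                 # 0.30000000000000004
print(0.1 + 0.2 == 0.3)          # False

# ...while Decimal arithmetic on string inputs stays exact, which matters
# for quantities like 0.000000000000000001 (one wei of ETH)
print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))   # True
print(Decimal("0.000000000000000001") * 3)                 # 3E-18
```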
4. CoinGecko API Client
CoinGecko's free /simple/price endpoint returns prices, market cap, 24-hour
volume, and 24-hour change for up to 250 coin IDs per request, with no API key required.
The paid tiers raise the rate limits and add finer-grained data; supplying a key is a small
change to the request headers — though note that the Pro plan also serves from a different
base URL (pro-api.coingecko.com), so check the docs for your plan.
# portfolio/coingecko.py
import requests
from django.conf import settings
_BASE_URL = "https://api.coingecko.com/api/v3"
_BATCH_SIZE = 250 # free tier limit per request
def _headers() -> dict:
    # Paid (Pro) keys go in the x-cg-pro-api-key header; free demo keys use
    # x-cg-demo-api-key instead — adjust to match your plan
    key = getattr(settings, "COINGECKO_API_KEY", None)
    return {"x-cg-pro-api-key": key} if key else {}
def fetch_prices(coin_ids: list[str]) -> dict[str, dict]:
"""
Returns {coin_id: {usd, usd_market_cap, usd_24h_vol, usd_24h_change}}
for all requested coin IDs. Batches automatically at 250 per request.
"""
results = {}
for i in range(0, len(coin_ids), _BATCH_SIZE):
batch = coin_ids[i : i + _BATCH_SIZE]
resp = requests.get(
f"{_BASE_URL}/simple/price",
params={
"ids": ",".join(batch),
"vs_currencies": "usd",
"include_market_cap": "true",
"include_24hr_vol": "true",
"include_24hr_change": "true",
},
headers=_headers(),
timeout=15,
)
resp.raise_for_status()
results.update(resp.json())
return results
def search_coins(query: str) -> list[dict]:
"""Search CoinGecko for a coin by name or symbol. Returns id, name, symbol."""
resp = requests.get(
f"{_BASE_URL}/search",
params={"query": query},
headers=_headers(),
timeout=10,
)
resp.raise_for_status()
coins = resp.json().get("coins", [])
return [{"id": c["id"], "name": c["name"], "symbol": c["symbol"]} for c in coins[:10]]
The search_coins helper is useful for an autocomplete field when users add a
holding — they can type "ethereum" and get back {"id": "ethereum", "symbol": "ETH",
"name": "Ethereum"}, which is what you store in the Holding model.
5. Celery Beat: Scheduled Price Refresh
The refresh task fetches prices for every distinct coin held across all user portfolios, stores a snapshot row for each, then pushes the new prices to all connected WebSocket clients via the channel layer.
# portfolio/tasks.py
import logging
from asgiref.sync import async_to_sync
from celery import shared_task
from channels.layers import get_channel_layer
from django.utils import timezone
from .coingecko import fetch_prices
from .models import Holding, PriceAlert, PriceSnapshot
logger = logging.getLogger(__name__)
@shared_task(bind=True, max_retries=3, default_retry_delay=60)
def refresh_portfolio_prices(self):
coin_ids = list(Holding.objects.values_list("coin_id", flat=True).distinct())
if not coin_ids:
return {"coins": 0}
try:
data = fetch_prices(coin_ids)
except Exception as exc:
logger.warning("CoinGecko fetch failed: %s", exc)
raise self.retry(exc=exc)
snapshots = [
PriceSnapshot(
coin_id=coin_id,
price_usd=prices["usd"],
market_cap_usd=prices.get("usd_market_cap"),
volume_24h_usd=prices.get("usd_24h_vol"),
price_change_24h_pct=prices.get("usd_24h_change"),
)
for coin_id, prices in data.items()
]
PriceSnapshot.objects.bulk_create(snapshots)
# Push live prices to all connected dashboard clients
channel_layer = get_channel_layer()
async_to_sync(channel_layer.group_send)(
"portfolio_prices",
{
"type": "price.update",
"prices": {coin_id: str(prices["usd"]) for coin_id, prices in data.items()},
"changes": {
coin_id: prices.get("usd_24h_change")
for coin_id, prices in data.items()
},
},
)
check_price_alerts.delay()
return {"coins": len(snapshots)}
Register the periodic task via CELERY_BEAT_SCHEDULE in settings. The
django-celery-beat database scheduler lets you also manage schedules through
Django admin at runtime, which is useful for tuning the refresh interval without
deploying.
# settings.py
from celery.schedules import crontab
CELERY_BEAT_SCHEDULE = {
"refresh-portfolio-prices": {
"task": "portfolio.tasks.refresh_portfolio_prices",
"schedule": crontab(minute="*/5"), # every 5 minutes
},
}
Start the Beat scheduler alongside your Celery worker:
# terminal 1 — worker
celery -A myproject worker -l info -Q default
# terminal 2 — beat scheduler
celery -A myproject beat -l info --scheduler django_celery_beat.schedulers:DatabaseScheduler
6. Price Alert System
After each price refresh, a second task checks every active PriceAlert against
the freshest snapshot. When the target is crossed, the alert is deactivated and a
notification task is dispatched.
# portfolio/tasks.py (continued)
@shared_task
def check_price_alerts():
active_alerts = (
PriceAlert.objects.filter(is_active=True)
.select_related("holding")
)
triggered_ids = []
for alert in active_alerts:
snap = (
PriceSnapshot.objects.filter(coin_id=alert.holding.coin_id)
.order_by("-recorded_at")
.first()
)
if not snap:
continue
crossed = (
alert.direction == PriceAlert.ABOVE and snap.price_usd >= alert.target_price
) or (
alert.direction == PriceAlert.BELOW and snap.price_usd <= alert.target_price
)
if crossed:
triggered_ids.append(alert.pk)
if triggered_ids:
PriceAlert.objects.filter(pk__in=triggered_ids).update(
is_active=False,
triggered_at=timezone.now(),
)
for alert_id in triggered_ids:
send_alert_notification.delay(alert_id)
return {"triggered": len(triggered_ids)}
@shared_task
def send_alert_notification(alert_id: int):
try:
alert = PriceAlert.objects.select_related(
"holding__portfolio__user"
).get(pk=alert_id)
except PriceAlert.DoesNotExist:
return
user = alert.holding.portfolio.user
coin = alert.holding.coin_name
direction = alert.direction
target = alert.target_price
# Replace with your preferred channel: email, Telegram, Slack, push notification
logger.info(
"ALERT triggered for user=%s: %s went %s $%s",
user.email, coin, direction, target,
)
# e.g. send_mail(subject, body, from_email, [user.email])
Deactivating all triggered alerts in one bulk update() before dispatching individual
notification tasks narrows the window for double-triggering if the worker picks up the check
task twice. The is_active=False update happens in a single query per batch — notifications
fire separately and can retry without re-deactivating the alert.
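The crossing check inside check_price_alerts is easy to unit-test if you pull it out as a pure function. A sketch — alert_crossed is a name introduced here for illustration, not part of the code above:

```python
from decimal import Decimal

def alert_crossed(direction: str, target_price: Decimal, current_price: Decimal) -> bool:
    """Mirror of the predicate in check_price_alerts: 'above' fires at or
    over the target, 'below' fires at or under it (both inclusive)."""
    if direction == "above":
        return current_price >= target_price
    if direction == "below":
        return current_price <= target_price
    raise ValueError(f"unknown direction: {direction}")

print(alert_crossed("above", Decimal("70000"), Decimal("70000")))  # True (>= is inclusive)
print(alert_crossed("below", Decimal("60000"), Decimal("60500")))  # False
```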
7. DRF API Endpoints
The REST API covers portfolio CRUD, holding management, coin search, and a read-only summary endpoint that returns total value and P&L for each portfolio.
# portfolio/serializers.py
from rest_framework import serializers
from .models import Holding, Portfolio, PriceAlert, PriceSnapshot
class HoldingSerializer(serializers.ModelSerializer):
current_price_usd = serializers.ReadOnlyField()
current_value_usd = serializers.ReadOnlyField()
pnl_usd = serializers.ReadOnlyField()
class Meta:
model = Holding
fields = [
"id", "coin_id", "coin_symbol", "coin_name",
"quantity", "avg_buy_price",
"current_price_usd", "current_value_usd", "pnl_usd",
"added_at",
]
read_only_fields = ["id", "added_at"]
class PortfolioSerializer(serializers.ModelSerializer):
holdings = HoldingSerializer(many=True, read_only=True)
total_value_usd = serializers.ReadOnlyField()
total_cost_usd = serializers.ReadOnlyField()
class Meta:
model = Portfolio
fields = ["id", "name", "total_value_usd", "total_cost_usd", "holdings", "created_at"]
read_only_fields = ["id", "created_at"]
class PriceAlertSerializer(serializers.ModelSerializer):
class Meta:
model = PriceAlert
fields = ["id", "holding", "direction", "target_price", "is_active", "triggered_at"]
read_only_fields = ["id", "is_active", "triggered_at"]
class CoinSearchSerializer(serializers.Serializer):
id = serializers.CharField()
name = serializers.CharField()
symbol = serializers.CharField()
# portfolio/views.py
from rest_framework import viewsets, status
from rest_framework.decorators import action
from rest_framework.permissions import IsAuthenticated
from rest_framework.response import Response
from rest_framework.views import APIView
from .coingecko import search_coins
from .models import Holding, Portfolio, PriceAlert
from .serializers import (
CoinSearchSerializer, HoldingSerializer,
PortfolioSerializer, PriceAlertSerializer,
)
class PortfolioViewSet(viewsets.ModelViewSet):
serializer_class = PortfolioSerializer
permission_classes = [IsAuthenticated]
def get_queryset(self):
return (
Portfolio.objects.filter(user=self.request.user)
.prefetch_related("holdings")
)
def perform_create(self, serializer):
serializer.save(user=self.request.user)
class HoldingViewSet(viewsets.ModelViewSet):
serializer_class = HoldingSerializer
permission_classes = [IsAuthenticated]
def get_queryset(self):
return Holding.objects.filter(portfolio__user=self.request.user)
def perform_create(self, serializer):
portfolio_id = self.request.data.get("portfolio")
portfolio = Portfolio.objects.get(pk=portfolio_id, user=self.request.user)
serializer.save(portfolio=portfolio)
class PriceAlertViewSet(viewsets.ModelViewSet):
serializer_class = PriceAlertSerializer
permission_classes = [IsAuthenticated]
def get_queryset(self):
return PriceAlert.objects.filter(holding__portfolio__user=self.request.user)
class CoinSearchView(APIView):
permission_classes = [IsAuthenticated]
def get(self, request):
query = request.query_params.get("q", "").strip()
if len(query) < 2:
return Response({"error": "Query must be at least 2 characters."}, status=400)
results = search_coins(query)
serializer = CoinSearchSerializer(results, many=True)
return Response(serializer.data)
# portfolio/urls.py
from django.urls import path, include
from rest_framework.routers import DefaultRouter
from .views import CoinSearchView, HoldingViewSet, PortfolioViewSet, PriceAlertViewSet
router = DefaultRouter()
router.register("portfolios", PortfolioViewSet, basename="portfolio")
router.register("holdings", HoldingViewSet, basename="holding")
router.register("alerts", PriceAlertViewSet, basename="alert")
urlpatterns = [
path("api/", include(router.urls)),
path("api/coins/search/", CoinSearchView.as_view(), name="coin-search"),
]
The router generates the standard REST endpoints automatically:
GET /api/portfolios/ # list user's portfolios with holdings + values
POST /api/portfolios/ # create portfolio
GET /api/portfolios/{id}/ # retrieve single portfolio
PUT /api/portfolios/{id}/ # update portfolio name
DELETE /api/portfolios/{id}/ # delete portfolio
GET /api/holdings/ # list all holdings across all portfolios
POST /api/holdings/ # add a holding
PATCH /api/holdings/{id}/ # update quantity / avg buy price
DELETE /api/holdings/{id}/ # remove holding
GET /api/alerts/ # list active and past alerts
POST /api/alerts/ # create alert
DELETE /api/alerts/{id}/ # cancel alert
GET /api/coins/search/?q=eth # search CoinGecko for coin metadata
8. Real-Time Updates with Django Channels
The PortfolioConsumer accepts WebSocket connections from authenticated users.
It joins a shared portfolio_prices group, so every connected client receives
the same price broadcast when Celery fires after a refresh.
# portfolio/consumers.py
import json
from channels.generic.websocket import AsyncWebsocketConsumer
class PortfolioConsumer(AsyncWebsocketConsumer):
GROUP = "portfolio_prices"
async def connect(self):
if self.scope["user"].is_anonymous:
await self.close()
return
await self.channel_layer.group_add(self.GROUP, self.channel_name)
await self.accept()
async def disconnect(self, close_code):
await self.channel_layer.group_discard(self.GROUP, self.channel_name)
# Receive message from channel group (sent by Celery task)
async def price_update(self, event):
await self.send(text_data=json.dumps({
"type": "price_update",
"prices": event["prices"],
"changes": event.get("changes", {}),
}))
# portfolio/routing.py
from django.urls import re_path
from .consumers import PortfolioConsumer
websocket_urlpatterns = [
re_path(r"^ws/portfolio/$", PortfolioConsumer.as_asgi()),
]
The message sent with group_send uses the dotted type price.update, which
Channels maps to the price_update method on the consumer (dots become
underscores). The async_to_sync wrapper in the Celery task bridges the
synchronous Celery worker into the async channel layer.
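The dot-to-underscore dispatch rule is easy to demonstrate outside Channels. This toy dispatcher only illustrates the convention — it is not Channels internals:

```python
class ToyConsumer:
    # Handler name derived from the message type "price.update"
    def price_update(self, event):
        return f"got {len(event['prices'])} prices"

def dispatch(consumer, event):
    # Channels resolves handlers the same way: replace "." with "_"
    # in the message "type" and call that method on the consumer
    handler = getattr(consumer, event["type"].replace(".", "_"))
    return handler(event)

event = {"type": "price.update", "prices": {"bitcoin": "64250.0"}}
print(dispatch(ToyConsumer(), event))  # got 1 prices
```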
9. Frontend WebSocket Dashboard
The dashboard renders a holdings table where each row carries the coin ID and quantity as
data-* attributes. When a WebSocket message arrives, the handler walks the rows
and updates price, value, and 24-hour change in place — no page reload, no polling.
<!-- portfolio_dashboard.html (simplified) -->
<table id="holdings-table">
<thead>
<tr>
<th>Coin</th>
<th>Quantity</th>
<th>Price (USD)</th>
<th>Value (USD)</th>
<th>24h Change</th>
</tr>
</thead>
<tbody>
{% for holding in portfolio.holdings.all %}
<tr data-coin-id="{{ holding.coin_id }}"
data-quantity="{{ holding.quantity }}">
<td>
<strong>{{ holding.coin_symbol }}</strong>
<span class="coin-name">{{ holding.coin_name }}</span>
</td>
<td>{{ holding.quantity }}</td>
<td class="current-price">
{% if holding.current_price_usd %}
${{ holding.current_price_usd|floatformat:2 }}
{% else %}—{% endif %}
</td>
<td class="holding-value">
{% if holding.current_value_usd %}
${{ holding.current_value_usd|floatformat:2 }}
{% else %}—{% endif %}
</td>
<td class="price-change">—</td>
</tr>
{% endfor %}
</tbody>
</table>
<p id="ws-status" class="ws-status">Connecting…</p>
// portfolio_dashboard.js
(function () {
var wsUrl = (location.protocol === "https:" ? "wss" : "ws") + "://" + location.host + "/ws/portfolio/";
var ws;
var reconnectDelay = 2000;
function fmt(n, decimals) {
return Number(n).toLocaleString("en-US", {
minimumFractionDigits: decimals,
maximumFractionDigits: decimals,
});
}
function updatePrices(prices, changes) {
document.querySelectorAll("#holdings-table tbody tr[data-coin-id]").forEach(function (row) {
var coinId = row.dataset.coinId;
var qty = parseFloat(row.dataset.quantity);
if (prices[coinId] !== undefined) {
var price = parseFloat(prices[coinId]);
var value = qty * price;
row.querySelector(".current-price").textContent = "$" + fmt(price, 2);
row.querySelector(".holding-value").textContent = "$" + fmt(value, 2);
if (changes && changes[coinId] != null) {
var pct = parseFloat(changes[coinId]);
var changeEl = row.querySelector(".price-change");
changeEl.textContent = (pct >= 0 ? "+" : "") + fmt(pct, 2) + "%";
changeEl.className = "price-change " + (pct >= 0 ? "positive" : "negative");
}
}
});
// Update total portfolio value
var total = 0;
document.querySelectorAll("#holdings-table tbody tr[data-coin-id]").forEach(function (row) {
var valText = row.querySelector(".holding-value").textContent.replace(/[$,]/g, "");
var val = parseFloat(valText);
if (!isNaN(val)) total += val;
});
var totalEl = document.getElementById("total-value");
if (totalEl) totalEl.textContent = "$" + fmt(total, 2);
}
function connect() {
ws = new WebSocket(wsUrl);
ws.onopen = function () {
document.getElementById("ws-status").textContent = "Live";
document.getElementById("ws-status").className = "ws-status connected";
reconnectDelay = 2000;
};
ws.onmessage = function (event) {
var data = JSON.parse(event.data);
if (data.type === "price_update") {
updatePrices(data.prices, data.changes);
}
};
ws.onclose = function () {
document.getElementById("ws-status").textContent = "Reconnecting…";
document.getElementById("ws-status").className = "ws-status disconnected";
setTimeout(connect, reconnectDelay);
reconnectDelay = Math.min(reconnectDelay * 2, 30000); // cap at 30s
};
ws.onerror = function () {
ws.close();
};
}
connect();
})();
The reconnect logic uses exponential backoff capped at 30 seconds. On reconnect, the server will push the next scheduled price update naturally — there is no need to request the current state manually because the table is server-rendered with the latest snapshot values on page load.
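The doubling-with-cap schedule the JavaScript uses can be written down explicitly; a sketch in Python for clarity:

```python
def backoff_delays(initial_ms: int = 2000, cap_ms: int = 30000, attempts: int = 8):
    """Delays (ms) the dashboard's reconnect loop waits between attempts:
    double each time, never exceeding the cap."""
    delay = initial_ms
    out = []
    for _ in range(attempts):
        out.append(delay)
        delay = min(delay * 2, cap_ms)
    return out

print(backoff_delays())
# [2000, 4000, 8000, 16000, 30000, 30000, 30000, 30000]
```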
10. Production Checklist
Rate limiting and CoinGecko quota
The free CoinGecko API allows roughly 10–30 calls per minute, depending on plan and current
load. With a 5-minute Celery Beat interval this is well within limits even with multiple
portfolios — the task batches all distinct coin IDs into as few API calls as possible. If your
user base grows, upgrade to a paid plan, set COINGECKO_API_KEY in environment
variables, and double-check which header and base URL your plan expects.
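The arithmetic behind that claim: with batching at 250 IDs per request, each refresh cycle makes ceil(distinct_coins / 250) calls, and there are 12 cycles per hour. A quick check:

```python
from math import ceil

def calls_per_hour(distinct_coins: int, batch_size: int = 250, interval_min: int = 5) -> int:
    # One batched request per 250 coin IDs, repeated every interval
    calls_per_cycle = ceil(distinct_coins / batch_size)
    return calls_per_cycle * (60 // interval_min)

print(calls_per_hour(100))   # 12  -> one API call per cycle
print(calls_per_hour(1000))  # 48  -> four calls per cycle, still far under quota
```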
Pruning PriceSnapshot rows
At one snapshot per coin every 5 minutes, each coin generates 288 rows per day — about 8,640 rows per coin per month, so a user base holding 100 distinct coins accumulates roughly 864,000 rows per month. Add a nightly Celery task to delete snapshots older than 90 days:
from datetime import timedelta  # timezone is already imported at the top of tasks.py

@shared_task
def prune_old_snapshots():
    cutoff = timezone.now() - timedelta(days=90)
    deleted, _ = PriceSnapshot.objects.filter(recorded_at__lt=cutoff).delete()
    return {"deleted": deleted}
For larger deployments, migrate PriceSnapshot to TimescaleDB and use its
built-in retention policies. The model requires no code changes — TimescaleDB is a
PostgreSQL extension that handles partitioning transparently.
Celery task retry on CoinGecko 429
CoinGecko returns HTTP 429 when the free rate limit is exceeded. The task already retries
with max_retries=3, default_retry_delay=60. Add a specific check for 429 to
use a longer backoff:
from requests.exceptions import HTTPError

# in refresh_portfolio_prices, catch HTTPError before the generic handler:
    try:
        data = fetch_prices(coin_ids)
    except HTTPError as exc:
        status = exc.response.status_code if exc.response is not None else None
        retry_after = 120 if status == 429 else 60
        raise self.retry(exc=exc, countdown=retry_after)
Historical price charts
Add a /api/holdings/{id}/history/ endpoint that queries
PriceSnapshot with a time range filter and returns OHLCV-style aggregates.
Pair it with Chart.js on the frontend for a sparkline per holding. CoinGecko also has a
free /coins/{id}/market_chart endpoint for up to 365 days of historical data
if you want to populate snapshots retroactively on first load.
WebSocket authentication in production
Channels' AuthMiddlewareStack reads the Django session cookie, which works for
browser clients. For mobile or SPA clients using JWT, replace it with a custom middleware
that validates a token passed as a query parameter on the WebSocket URL (e.g.
wss://example.com/ws/portfolio/?token=…). Never pass a token in the path —
it ends up in server logs.
nginx WebSocket proxying
# nginx.conf — WebSocket upgrade headers required
location /ws/ {
proxy_pass http://127.0.0.1:8000;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection "upgrade";
proxy_set_header Host $host;
proxy_read_timeout 3600;
}
Set proxy_read_timeout to at least 3600 seconds. Without it, nginx will close
idle WebSocket connections after 60 seconds, triggering your reconnect loop constantly.
Separate Celery queues
Run price refreshes and alert checks on a dedicated queue to prevent them being starved behind long-running tasks:
CELERY_TASK_ROUTES = {
"portfolio.tasks.refresh_portfolio_prices": {"queue": "prices"},
"portfolio.tasks.check_price_alerts": {"queue": "prices"},
"portfolio.tasks.send_alert_notification": {"queue": "default"},
}
# start price worker
celery -A myproject worker -l info -Q prices -c 2
Wrapping Up
The free CoinGecko API is generous enough to run a multi-user portfolio tracker on a single
Celery Beat schedule without ever hitting a rate limit under normal load. The key design
decisions are batching all coin IDs into one API call per refresh cycle, storing a
PriceSnapshot row rather than updating holdings directly (so you get a natural
price history), and pushing live updates through the channel layer rather than making the
browser poll an HTTP endpoint.
The alert system deliberately uses a batch update() before dispatching
notification tasks — this makes alert deactivation atomic and prevents a flapping price from
sending duplicate notifications if the task retries. The same pattern applies if you later
add SMS or push notifications: deactivate first, notify after.
From here the natural extensions are price history charts, portfolio performance over time (daily NAV snapshots), and a transaction log so users can track buys and sells rather than just a static average cost basis.