Sync model fields using Postgres triggers

2022-03-22


The goal:

Sync some model fields between two different models.

Example scenario:

from django.db import models


class PlayerData(models.Model):
    goals = models.IntegerField()
    assists = models.IntegerField()
    tackles = models.IntegerField()
    # 10 more fields with player data


class PlayerMostRelevantData(models.Model):
    # OneToOneField creates a unique "player_data_id" column, which the
    # ON CONFLICT clause of the trigger below relies on
    player_data = models.OneToOneField(PlayerData, on_delete=models.CASCADE)
    goals = models.IntegerField()
    assists = models.IntegerField()

We want to keep the data of PlayerData and PlayerMostRelevantData in sync on every create, update and deletion.

Possible solutions:

Of course we could use Django signals for this task, but a more reliable solution is to do the data synchronisation at the database level using Postgres triggers, since triggers also fire for bulk operations, raw SQL and writes from other applications.
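
For reference, a signals-based version could look roughly like this. It is only a sketch (the module path players.models is an assumption), and it covers ORM-level writes only:

from django.db.models.signals import post_save
from django.dispatch import receiver

from players.models import PlayerData, PlayerMostRelevantData


@receiver(post_save, sender=PlayerData)
def sync_most_relevant_data(sender, instance, **kwargs):
    # Mirror the relevant fields on every ORM-level create or update.
    # Deletions are already handled by on_delete=models.CASCADE.
    PlayerMostRelevantData.objects.update_or_create(
        player_data=instance,
        defaults={"goals": instance.goals, "assists": instance.assists},
    )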

The following SQL snippet illustrates what we want to achieve. If you run the statements you will see that inserting a new row into the table "original" creates a row with the same content in the table "original_copy", and updating the row updates the copy as well (an upsert).

-- set up a source table and a copy that mirrors it
drop table if exists original;
create table original(id int, name text);
drop table if exists original_copy;
create table original_copy(original_id int PRIMARY KEY, name text);


-- upsert the changed row into original_copy on every insert or update
create or replace function trigger_on_example()
returns trigger language plpgsql as $$
begin
    insert into original_copy (original_id, name)
    values (new.id, new.name)
    ON CONFLICT (original_id)
    DO
        UPDATE SET name = new.name;
    return new;
end
$$;

create trigger trigger_on_example
before insert or update on original
for each row execute procedure trigger_on_example();

insert into original (id, name)
values (1, 'John');

select *
from original;

select *
from original_copy;

The solution:

For our example with PlayerData and PlayerMostRelevantData, we can write a Django migration like the one below that creates the Postgres triggers for us in the database.


from django.db import connection, migrations


def create_triggers(apps, schema_editor):
    with connection.cursor() as cursor:
        # player data create or update trigger: upsert the relevant fields
        # into players_playermostrelevantdata
        cursor.execute(
            """
            create or replace function trigger_on_player_data_create_or_update()
            returns trigger language plpgsql as $$
            begin
               insert into players_playermostrelevantdata (player_data_id, goals, assists)
               values (new.id, new.goals, new.assists)
               ON CONFLICT (player_data_id)
               DO
                   UPDATE SET goals = new.goals, assists = new.assists;
               return new;
            end
            $$;
            create trigger trigger_on_player_data_create_or_update
            before insert or update on players_playerdata
            for each row execute procedure trigger_on_player_data_create_or_update();
            """
        )

        # player data delete trigger: remove the copy row before the original
        # row is deleted (returning old lets the deletion proceed; returning
        # NULL from a BEFORE trigger would cancel it)
        cursor.execute(
            """
            create or replace function trigger_on_player_data_delete()
            returns trigger language plpgsql as $$
            begin
               delete from players_playermostrelevantdata where player_data_id = old.id;
               return old;
            end
            $$;
            create trigger trigger_on_player_data_delete
            before delete on players_playerdata
            for each row execute procedure trigger_on_player_data_delete();
            """
        )


class Migration(migrations.Migration):

    dependencies = [
        ("players", "0001_initial"),
    ]

    operations = [migrations.RunPython(create_triggers, reverse_code=migrations.RunPython.noop)]
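
Once the migration has run, a quick check in the Django shell should confirm that the triggers keep both tables in sync. This assumes the app is called players (as in the migration above) and that PlayerData has only the three fields shown earlier:

from players.models import PlayerData, PlayerMostRelevantData

player = PlayerData.objects.create(goals=3, assists=5, tackles=7)

# the insert trigger upserted the copy row
copy = PlayerMostRelevantData.objects.get(player_data=player)
assert (copy.goals, copy.assists) == (3, 5)

# queryset.update() bypasses Django signals, but the trigger still fires
PlayerData.objects.filter(id=player.id).update(goals=4)
copy.refresh_from_db()
assert copy.goals == 4

# deleting the original removes the copy via the delete trigger
player_id = player.id
player.delete()
assert not PlayerMostRelevantData.objects.filter(player_data_id=player_id).exists()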


Sources and inspiration:

  • https://www.postgresqltutorial.com/postgresql-upsert/
  • https://stackoverflow.com/a/40901037
  • https://objectpartners.com/2019/01/24/creating-raw-sql-migrations-in-django/

Basic analytics 2.0

2021-12-20

I didn't want to use Google Analytics or similar tools to track the traffic of my side projects because: a) I don't need that much detail about my views, and b) I don't want a cookie consent dialog asking for permission to forward user-related data to a 3rd-party service.

Because of that, I use a self-made and self-hosted service for tracking basic web traffic analytics: basic_analytics.

It is made with Django, Bootstrap and Chart.js and provides an API endpoint to which you can send page view data as payload.

With that data it can generate different charts that show:

  • Monthly average page views
  • Page views by URL
  • Browser, country, device and operating system distribution of the views

All of this data can be filtered to show only real user views or to also include robot views.

Update February 2022

In order to send data to Basic Analytics from the backend, I created a Django app called django_basic_analytics_tracker. It provides a mixin that you can use in the views that you want to track. The mixin overrides the dispatch method, takes some request data and sends it to the Basic Analytics service in the POST payload, as sketched below.
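
A rough sketch of what such a mixin can look like. The class name, endpoint URL and payload fields below are illustrative assumptions, not the actual API of django_basic_analytics_tracker:

import requests


class AnalyticsTrackerMixin:
    """Send basic request data to the analytics service on every request."""

    # hypothetical endpoint of the self-hosted basic_analytics instance
    analytics_endpoint = "https://analytics.example.com/api/page-views/"

    def dispatch(self, request, *args, **kwargs):
        payload = {
            "url": request.path,
            "user_agent": request.META.get("HTTP_USER_AGENT", ""),
            "language": request.META.get("HTTP_ACCEPT_LANGUAGE", ""),
        }
        try:
            requests.post(self.analytics_endpoint, json=payload, timeout=2)
        except requests.RequestException:
            # tracking failures must never break the actual view
            pass
        return super().dispatch(request, *args, **kwargs)

The mixin then goes first in the MRO of any view you want to track, e.g. class HomeView(AnalyticsTrackerMixin, TemplateView).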

My initial macOS setup - tools and resources list

2021-10-07

This is a collection of commands and resources that were useful to me during the initial setup of my MacBook.

  • Install Xcode Command Line Tools
xcode-select --install
  • Generate an SSH key and add it to the ssh-agent
ssh-keygen -t ed25519 -C "your_email@example.com"
eval "$(ssh-agent -s)"
ssh-add ~/.ssh/id_ed25519
  • Homebrew https://brew.sh/
  • Oh my zsh https://ohmyz.sh/
  • nvm https://github.com/nvm-sh/nvm#git-install
  • Install Docker Desktop https://docs.docker.com/desktop/mac/install/
  • Rectangle: Move and resize windows in macOS using keyboard shortcuts or snap areas https://rectangleapp.com/
  • Add some shell aliases. I can't work without these:
alias g='git'
alias gc='git checkout'
alias gco='git commit'
alias gs='git status'
alias gp='git pull'

alias dc='docker-compose'
alias up='docker-compose up'
alias down='docker-compose down'