Django Multiple Schemas

Here you'll find an honest project that shows how to use schemas with Django. It has a script that creates the whole scenario the project needs in PostgreSQL, and it even has tests to guarantee everything is created as expected. Check more details below!

Running the project and seeing multiple schemas in action

Execute docker-compose up -d remote-interpreter jafar-app iago-app jasmine-app. After the services are running, you can check the database through its exposed port (use a client like DataGrip for that). If you'd like to access the remote-interpreter service's administration panel, open http://0.0.0.0:8000/ and use admin as both the username and the password.

All the services can be accessed through their admin interfaces. See docker-compose.yaml to figure out which port each one uses.

Why this project?

If you are in a scenario where many applications use the same database server, it's advisable to create one single database, let's say db_production, and then separate each application into its own schema (think of it as a folder inside the database). This is quite important because a CDC (Change Data Capture) solution like Amazon DMS consumes an entire session (know more about it here) per database, and sessions are an expensive resource.
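To make the idea concrete, here is a minimal sketch of how a Django application could be pointed at its own schema by setting the PostgreSQL search_path on the connection. The schema, database, and credential names below are illustrative assumptions, not necessarily the ones this project uses (check the project's settings and initialize-database.sh for the real values):

# settings.py (sketch): each application gets a connection whose search_path
# resolves names against its own schema first, falling back to public.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "db_production",  # single shared database
        "USER": "jafar_role",  # illustrative role name
        "PASSWORD": "change-me",  # illustrative password
        "HOST": "db",
        "PORT": "5432",
        # "jafar" is an assumed schema name for this app
        "OPTIONS": {"options": "-c search_path=jafar,public"},
    }
}

With such an arrangement, each app's ORM queries land in its own schema while all apps still share one db_production database, which is what keeps the CDC session count at one.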

Let's say you have three apps and each one has its own database. If your company needs data from these three databases, three sessions will be consumed. If you use schemas instead, only one session is needed. To illustrate:

An image which shows all the database's objects

Some basic details

You can see how the whole scenario is created by consulting the initialize-database.sh file.

I created a test script that guarantees the initialization script executed as expected. Check test_initialize_database.py to know more. Great place of reference here.
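Conceptually, such a test connects to the database and asserts that the expected objects exist. Here is a minimal sketch of the idea (not the project's actual test; the host, port, password, and schema names are assumptions, so compare them with docker-compose.yaml and initialize-database.sh):

# Test sketch: assert that the initialization script created the expected schemas.
# Assumes psycopg2 is installed and the db service from docker-compose is up.
import psycopg2


def test_expected_schemas_were_created():
    connection = psycopg2.connect(
        host="localhost",  # assumption: see docker-compose.yaml for the exposed host/port
        port=5432,
        dbname="django_multiple_schemas_dev",
        user="boss_role",
        password="change-me",  # assumption: check docker-compose.yaml for the real value
    )
    try:
        with connection.cursor() as cursor:
            cursor.execute("SELECT schema_name FROM information_schema.schemata")
            created_schemas = {row[0] for row in cursor.fetchall()}
        # "jafar", "iago" and "jasmine" are assumed schema names, mirroring the service names.
        assert {"jafar", "iago", "jasmine"}.issubset(created_schemas)
    finally:
        connection.close()

The real check lives in test_initialize_database.py; the sketch above only captures its general shape.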

The result of the initialization script is like the following (it may be outdated):

An image which shows all the database's objects

Development

Updating pipenv dependencies

If you update the Pipfile, you can issue the following command to refresh your lock file:

docker-compose run remote-interpreter pipenv update

Playing with PostgreSQL

First execute the following:

docker-compose up -d db

When it's up, enter the container with the following command:

docker exec -it django-multiple-schemas_db_1 bash

You can do this as well:

docker-compose exec db bash

We're accessing it through bash, but you can also run psql directly (for instance, docker-compose exec db psql -U boss_role).

Then execute the command psql -U boss_role (check that the user matches what is in docker-compose.yaml) to issue SQL commands directly against the database.

Listing all the schemas

Sample output of the command SELECT schema_name FROM information_schema.schemata;:

    schema_name     
--------------------
 pg_toast
 pg_catalog
 public
 information_schema
(4 rows)

Listing all databases

Sample output of the command SELECT datname FROM pg_database WHERE datistemplate = false;:

           datname           
-----------------------------
 postgres
 django_multiple_schemas_dev
(2 rows)