Both SQLAlchemy and Alembic are great tools, but they have a relatively steep learning curve. Furthermore, getting started requires a number of repetitive steps (e.g. the naming-conventions boilerplate, updating `env.py`), which makes experimenting a bit tiresome.
What I mainly wanted was a throwaway project/db which I could use to learn the ins and outs of models and migrations without having to bother with the complexity of a real-world code-base.
Tip: Pull Requests welcome!
- You must first define some env variables. For convenience you can just source `envrc` (a sketch of what it typically contains follows this list). You can edit the file if you want but, unless there is a port or variable-name conflict, the defaults should be fine:
source envrc
- Start the PostgreSQL database:
docker-compose up -d
- Try connecting to the database:
pgcli -U "${DBUSER}" -h "${DBHOST}" -p "${DBPORT}" "${DBNAME}"
# or
pgcli "${DB_CONNECTION_STRING}"
If you defined `$PASS` in `envrc`, you might need to call `pgcli` with `-W` so that it prompts you for a password.
- If you can connect, you are ready to start using SQLAlchemy and Alembic!
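For reference, `envrc` is nothing more than a handful of `export` statements. The real file in the repo is the source of truth; the following is only a sketch with made-up values, using the variable names that appear in the commands above:

```sh
# Hypothetical contents of envrc -- check the actual file in the repo for the real values.
export DBUSER="postgres"      # database user
export PASS="secret"          # optional; if set, call pgcli with -W to be prompted for it
export DBHOST="localhost"     # host the container is reachable on
export DBPORT="5432"          # default PostgreSQL port
export DBNAME="playground"    # name of the throwaway database

# Convenience connection string (password deliberately not embedded, hence the -W note above):
export DB_CONNECTION_STRING="postgresql://${DBUSER}@${DBHOST}:${DBPORT}/${DBNAME}"
```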
Whatever you like?!?!
In this repo we have already taken care of the first steps of the Alembic tutorial. This means that we have already run:
alembic init migrations
- We have already created an SQLAlchemy `Base`,
- configured `migrations/env.py` to use our application's `Base`, and
- defined a sample model at `pg/models/definitions.py` (a sketch of what this might look like follows this list).
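To give you an idea of what that looks like, here is a rough sketch of the kind of code you would find in `pg/models/definitions.py`. The `Person` class and its columns are inferred from the migration message and the queries further down, so treat the actual file as the source of truth; the naming-convention boilerplate follows the pattern recommended by the SQLAlchemy/Alembic docs, and `migrations/env.py` is typically configured by importing this `Base` and setting `target_metadata = Base.metadata`:

```python
# Sketch of pg/models/definitions.py -- the repo's actual file is authoritative.
from sqlalchemy import Column, Integer, MetaData, Text
from sqlalchemy.orm import declarative_base

# The "naming conventions boilerplate": give constraints and indexes predictable
# names so that autogenerated migrations can reference and drop them later.
NAMING_CONVENTION = {
    "ix": "ix_%(column_0_label)s",
    "uq": "uq_%(table_name)s_%(column_0_name)s",
    "ck": "ck_%(table_name)s_%(constraint_name)s",
    "fk": "fk_%(table_name)s_%(column_0_name)s_%(referred_table_name)s",
    "pk": "pk_%(table_name)s",
}

Base = declarative_base(metadata=MetaData(naming_convention=NAMING_CONVENTION))


class Person(Base):
    """Hypothetical sample model backing the 'person' table used below."""

    __tablename__ = "person"

    id = Column(Integer, primary_key=True)
    name = Column(Text, nullable=False)
```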
So, if you want to jump straight into the fun, you can start, for example, by generating your first migration with:
alembic revision --autogenerate -m "Add person table."
You can then inspect the migration code (i.e. Python) with:
cat migrations/versions/*
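An autogenerated revision is plain Python built on Alembic's `op` API. Assuming the `Person` model sketched above, the file would look roughly like this (the revision identifiers are generated, so yours will differ):

```python
"""Add person table."""
import sqlalchemy as sa
from alembic import op

# Revision identifiers, used by Alembic (values below are placeholders).
revision = "0123abcd4567"
down_revision = None
branch_labels = None
depends_on = None


def upgrade():
    op.create_table(
        "person",
        sa.Column("id", sa.Integer(), nullable=False),
        sa.Column("name", sa.Text(), nullable=False),
        sa.PrimaryKeyConstraint("id", name=op.f("pk_person")),
    )


def downgrade():
    op.drop_table("person")
```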
You can preview the resultant SQL code with:
alembic upgrade --sql +1
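The `--sql` flag runs Alembic in "offline mode": instead of executing the migration it prints the DDL. For the hypothetical `Person` model above, the interesting part of the output would be along these lines (the real output also includes the statements that maintain Alembic's `alembic_version` bookkeeping table):

```sql
-- Approximate DDL for the hypothetical person table; your output will vary.
CREATE TABLE person (
    id SERIAL NOT NULL,
    name TEXT NOT NULL,
    CONSTRAINT pk_person PRIMARY KEY (id)
);
```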
Finally, you can apply the migration with:
alembic upgrade +1
If you now connect with `pgcli`, you can check that the new table exists:
pgcli -U $DBUSER -h $DBHOST -p $DBPORT $DBNAME
# and then on the pgcli console:
\dt
# and again on the pgcli console:
SELECT * FROM person;
As always, before committing the migration, make sure that the downgrade step works as intended:
alembic downgrade -1
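After the downgrade you can confirm where the database ended up and then re-apply the migration:

```sh
alembic current       # should show no revision once the only migration has been downgraded
alembic upgrade +1    # re-apply the migration once you are happy with both directions
```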
Whatever you like!
I mainly use it to test various things, e.g. the SQL code that gets generated from model definitions.
Running the following command will get rid of the database instance and its data:
docker-compose down -v
If you also want to remove the existing migrations, you can do it with:
rm -rf migrations/versions/*.py