In this development blog post for my online single/co-op 4X space game I will write about testing with pytest against a database in Python.
My game has to track the progress of things: research progress, build queues, resource amounts, and so on. I use a table that I call progress
for that, at least for the purpose of building the Test Game while I figure out how I will do things in the Real Game.
This table is updated by a periodic task running in Celery, and as I wrote the task's function I realized I was going to need some automated tests. It would be painful to code such a task, run a worker locally, monitor the database for changes, fix a bug, and do it all over again, and so on. Not to mention the pains of refactoring, which will surely happen as I realize what my data structures should really look like.
So I had a few funny rounds of wrestling with pytest and a database session fixture, and I eventually came up with something like the following (from my current code):
# NOTE: I am not very experienced with SQL/ORM/RDBMS so I may be doing
# stupid things. Please comment below to let me know if I can do something
# better or if I made you pull your hair out in disgust
# TODO: should probably be function-scoped, with create_all/drop_all in a separate fixture
@fixture(scope="session")
def db_session():
    Base.metadata.create_all(engine)
    session = Session()
    yield session
    session.close()  # close the session before dropping the schema
    Base.metadata.drop_all(engine)
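That TODO is easy enough to sketch: keep create_all/drop_all in a session-scoped fixture and hand each test a fresh, function-scoped session. Here is a minimal self-contained version of that idea; the in-memory SQLite engine and the toy Progress model are stand-ins for the real engine and models, not my actual code:

```python
from pytest import fixture
from sqlalchemy import create_engine, Column, Integer
from sqlalchemy.orm import declarative_base, sessionmaker

# Stand-ins: in the real project, engine/Base/Session come from the app.
engine = create_engine("sqlite://")
Base = declarative_base()
Session = sessionmaker(bind=engine)

class Progress(Base):  # toy stand-in for the real model
    __tablename__ = "progress"
    id = Column(Integer, primary_key=True)

@fixture(scope="session")
def tables():
    # schema created once for the whole test run, dropped at the end
    Base.metadata.create_all(engine)
    yield
    Base.metadata.drop_all(engine)

@fixture  # function scope: each test gets its own session
def db_session(tables):
    session = Session()
    yield session
    session.rollback()  # discard anything the test left uncommitted
    session.close()
```

For now, though, the session-scoped fixture above is what the tests below actually use.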
And an example for a test that uses this:
# NOTE: I am not very experienced with SQL/ORM/RDBMS so I may be doing
# stupid things. Please comment below to let me know if I can do something
# better or if I made you pull your hair out in disgust
# TODO: move somewhere more common
@fixture
def game(db_session):
    game = GameMeta(name="testgame", seed=0)
    db_session.add(game)
    db_session.commit()
    yield game
    db_session.delete(game)
    db_session.commit()
def test_advance_all_games_progress(db_session, game):
    progress1 = Progress(progress=0.0, game=game)
    progress2 = Progress(progress=0.0, game=game)
    db_session.add(progress1)
    db_session.add(progress2)
    db_session.commit()

    advance_all_games_progress()

    stmt = select(Progress)
    result = db_session.execute(stmt)
    for progress in result.scalars():
        assert progress.progress > 0.0

    db_session.delete(progress1)
    db_session.delete(progress2)
    db_session.commit()
def test_progress_increment(db_session, game):
    progress = Progress(
        progress=1.0, game=game, increment_min=10.0, increment_max=100.0
    )
    db_session.add(progress)
    db_session.commit()

    advance_all_games_progress()

    progress = db_session.get(Progress, progress.id)
    assert progress.progress >= 10.0

    db_session.delete(progress)
    db_session.commit()
def test_progress_cap(db_session, game):
    progress = Progress(
        progress=1.0, cap=2.0, game=game, increment_min=10.0, increment_max=100.0
    )
    db_session.add(progress)
    db_session.commit()

    advance_all_games_progress()

    progress = db_session.get(Progress, progress.id)
    assert progress.progress > 1.0
    assert progress.progress <= 2.0

    db_session.delete(progress)
    db_session.commit()
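The behavior these three tests pin down can be captured in a small pure function: each Progress row grows by a random step between increment_min and increment_max, clamped to cap if one is set. This is my reconstruction of the shape of the logic with a hypothetical advance helper, not the actual task code:

```python
import random

def advance(progress, increment_min, increment_max, cap=None):
    """Advance one progress value by a random step, respecting an optional cap."""
    # hypothetical helper: name and signature are illustrative only
    new = progress + random.uniform(increment_min, increment_max)
    if cap is not None:
        new = min(new, cap)  # never overshoot the cap
    return new
```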
It seems that SQLAlchemy sessions support some sort of session-inside-a-session, or nested transactions (Session.begin_nested(), backed by SAVEPOINTs), that I should look into. The function advance_all_games_progress
opens a session of its own internally, and it is called from tests that use the db_session
fixture.
I initially tried doing every operation (creating mock rows and objects, then destroying them, for example) with a db_session of its own. It was a disaster. Once you create an ORM-mapped object in a session it is tied to it, and once the session ends you are left with something that can't resolve its lazy-loaded relationships and the like. There's a method called expunge
that you can call on such objects to "untie" them from a session, but then I wasn't sure whether I should re-tie them to a new session for later use (imagine the game
fixture). I made all kinds of random programming guesses and took some shots in the dark while Googling, but with a single db_session fixture and a lot of db_session.commit
calls I got things to work, and I don't think I'm doing anything too strange. Please correct me if I'm totally wrong, thanks :)
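For the record, the "untie and re-tie" dance I was groping toward looks roughly like this. It's a self-contained toy sketch, not my real code: the in-memory SQLite engine and the GameMeta model here are stand-ins, and expire_on_commit=False is what keeps the detached object's attributes readable after commit:

```python
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.orm import declarative_base, sessionmaker

engine = create_engine("sqlite://")
Base = declarative_base()
# expire_on_commit=False so attributes stay loaded after commit + expunge
Session = sessionmaker(bind=engine, expire_on_commit=False)

class GameMeta(Base):  # toy stand-in for the real model
    __tablename__ = "game_meta"
    id = Column(Integer, primary_key=True)
    name = Column(String)

Base.metadata.create_all(engine)

s1 = Session()
game = GameMeta(name="testgame")
s1.add(game)
s1.commit()
s1.expunge(game)  # "untie": game is now detached from s1
s1.close()        # plain attributes still readable; lazy loads would fail

s2 = Session()
game = s2.merge(game)  # "re-tie": merge attaches a copy to the new session
```

With merge, the second session loads the row by primary key and copies the detached state onto it, so the returned object is fully usable again.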
Now there's the problem of actually having a test database to test against. Using Docker, I started a local Postgres server and created a test database for game testing. It looks like this (from my README):
docker run -d --name postgres -e POSTGRES_PASSWORD=password -p5432:5432 postgres:alpine
docker run --rm -it --network host postgres:alpine psql postgres://postgres:password@localhost:5432/
# or if you have the postgres-client package installed:
# psql postgres://postgres:password@localhost:5432
CREATE DATABASE gametest;
I mentioned a README. Even though the project is closed-source (for now?), I find it important to include instructions for my future self on how to do things with my projects.
My .env.test file has this entry:
DATABASE_URL=postgres://postgres:password@localhost:5432/gametest
And somewhere in my code, which is FastAPI-based and thus uses Starlette:
# config.py
TESTING = environ.get("TESTING")
config = Config(".env" if not TESTING else ".env.test")
...
DATABASE_URL = config("DATABASE_URL", default=None)
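Downstream, DATABASE_URL presumably ends up in create_engine. One gotcha worth noting for anyone copying this setup: SQLAlchemy 1.4+ dropped support for the postgres:// scheme (the one Heroku-style URLs and my .env.test use), so a small normalization step may be needed. A sketch, where make_engine is a hypothetical helper and not my actual wiring:

```python
from sqlalchemy import create_engine

def normalize_db_url(url):
    # SQLAlchemy 1.4+ only accepts "postgresql://", not "postgres://"
    if url.startswith("postgres://"):
        url = "postgresql://" + url[len("postgres://"):]
    return url

def make_engine(url):
    # hypothetical helper: normalize the scheme, then build the engine
    return create_engine(normalize_db_url(url))
```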
Lastly, my conftest.py (for pytest) ensures the TESTING env var is set (pytest imports conftest.py before the test modules, so this runs before config.py reads the variable):
from starlette.config import environ
environ["TESTING"] = "True"
Now that I think about it, it's a bit strange that code which is not supposed to run in the Heroku web worker uses Starlette configs. That is, my "workers", which are supposed to run tasks for the game, reference Starlette-based configuration. Oh well. Things work, and for now my focus is really on actually making a game, so I have to overlook these kinds of things, which might not be issues at all.