
Running a Local Database

If you want to verify that your scraped data imports correctly, or work on openstates.org or API v3, you'll need a local database.

This can be a bit cumbersome, since running Postgres locally varies a lot from platform to platform, and you'll also need to populate the database.

If you're comfortable with Postgres, most of these steps can easily be adapted to use your own Postgres instance, but for the remainder of this guide we'll be using a Dockerized Postgres image.
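If you do adapt these steps to your own instance, the main thing to get right is the DATABASE_URL environment variable. A minimal sketch, assuming Postgres with PostGIS on localhost:5432 (the host, port, and database location here are placeholder assumptions; the dockerized setup below uses its own values):

```shell
# Point the tooling at your own Postgres instead of the dockerized one.
# Host, port, and credentials are placeholders -- adjust to your setup.
# The postgis:// scheme matches what the initialization script uses.
export DATABASE_URL="postgis://openstates:openstates@localhost:5432/openstatesorg"

# Sanity-check the scheme before running anything against it:
echo "${DATABASE_URL%%:*}"   # prints "postgis"
```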


Be sure you've already installed Docker and docker-compose, as noted in Installing Prerequisites.

You'll need openstates-scrapers checked out, even if you aren't working on scrapers. This repository has the docker-compose.yml config and initialization scripts for the database.

If you want to initialize the database for openstates.org work, you'll need that project checked out as well.

Initialize Database For Scraping

  1. From within the openstates-scrapers directory, run:


If you've already done this before, running scripts/ again will wipe your database and start from scratch!

openstates-scrapers/$ ./scripts/
+ docker-compose down
Removing scrapers_db_1 ... done
Removing network openstates-network
+ docker volume rm openstates-postgres
+ docker-compose up -d db
Creating network "openstates-network" with the default driver
Creating volume "openstates-postgres" with default driver
Creating scrapers_db_1 ... done
+ sleep 3
+ DATABASE_URL=postgis://openstates:openstates@db/openstatesorg
+ docker-compose run --rm --entrypoint 'poetry run os-initdb' scrape
Creating scrapers_scrape_run ... done
Operations to perform:
  Apply all migrations: contenttypes, data
Running migrations:
  Applying contenttypes.0001_initial... OK
  Applying contenttypes.0002_remove_content_type_name... OK
  Applying data.0001_initial... OK
  Applying data.0002_auto_20200422_0028... OK
  Applying data.0003_auto_20200422_0031... OK


loading WY
loading DC
loading PR
loading US

This will populate your database with the tables needed for scraping, as well as some basic static data such as the jurisdiction metadata. If all you want to do is run scrapers and import data into a database for inspection, you're good to go!
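For instance, a single scrape-and-import run might look like the following (the jurisdiction and scraper names are illustrative assumptions, not something this guide configures; see the openstates-scrapers README for the exact invocation):

```shell
# Hypothetical example: scrape one jurisdiction's bills and import them
# into the local database. "nc" and "bills" are illustrative arguments.
docker-compose run --rm scrape nc bills
```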

Initialize Database for openstates.org


You must run the openstates-scrapers initialization as shown above first!

From within that project's directory, run:

docker/$ ./docker/
+ docker-compose run --rm -e PYTHONPATH=docker/ --entrypoint 'poetry run ./ migrate' django
Creating openstatesorg_django_run ... done
Operations to perform:
  Apply all migrations: account, admin, auth, bulk, bundles, contenttypes, dashboards, data, people_admin, profiles, sessions, sites, socialaccount
Running migrations:
  Applying auth.0001_initial... OK


Then load some test data:

docker-compose run --rm -e PYTHONPATH=docker/ --entrypoint 'poetry run ./ shell -c "import testdata"' django

This creates the Django-specific tables and also creates a local API key, testkey, which can be used for local development.
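The testkey key is passed the same way as a production key, via the X-API-KEY header. A sketch of a local request (the localhost:8000 host and the /jurisdictions path are assumptions about how you serve the site locally, not something this guide sets up):

```shell
# Query a locally running API v3 with the development key.
# Host, port, and endpoint path are assumptions -- adjust as needed.
curl -s -H "X-API-KEY: testkey" "http://localhost:8000/jurisdictions"
```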

Working with the Local Database

The database will persist to disk, so for the most part, once you've run these steps you're good to go.

Starting the Database

You'll need to make sure the database is running whenever you're working on scrapers or on openstates.org locally.

You can do that by running docker-compose up -d db from the openstates-scrapers directory.

openstates-scrapers$ docker-compose up -d db
Starting scrapers_db_1 ... done

If it is already running, the output will look like:

openstates-scrapers$ docker-compose up -d db
scrapers_db_1 is up-to-date
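Once it's up, you can also inspect the data directly by opening a psql shell inside the running container. A sketch assuming the service name (db) and the credentials shown in the initialization output above:

```shell
# Open an interactive psql session against the dockerized database.
# Service name and credentials match the DATABASE_URL used earlier.
docker-compose exec db psql -U openstates openstatesorg
```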

Stopping the Database

openstates-scrapers$ docker-compose stop db
Stopping scrapers_db_1 ... done

Resetting the Database

You can always run scripts/ to reset your database. This is useful if you have some bad data, or whenever you'd like a fresh start:

openstates-scrapers/$ ./scripts/