The data used in the eGo^N project, along with the code for importing, generating and processing it.
- Free software: GNU Affero General Public License v3 or later (AGPLv3+)
In addition to the installation of Python packages, some non-Python packages are required too. Right now these are:
Docker: Docker is used to provide a PostgreSQL database (in the default case).
Docker provides extensive installation instructions. Best you consult their docs and choose the appropriate install method for your OS.
Docker is not required if you use a local PostgreSQL installation.
The psql executable. On Ubuntu, this is provided by the postgresql-client-common package.
Header files for the libpq5 PostgreSQL library. These are necessary to build the psycopg2 package from source and are provided by the libpq-dev package on Ubuntu.
osm2pgsql. On recent Ubuntu versions you can install it via
sudo apt install osm2pgsql
.
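Assuming a Debian-based system, the non-Python prerequisites listed above can be installed in one step; the package names are taken directly from the list:

```shell
# Install the non-Python prerequisites on Ubuntu/Debian.
# postgresql-client-common provides psql, libpq-dev provides the headers
# needed to build psycopg2, and osm2pgsql is installed directly.
sudo apt install postgresql-client-common libpq-dev osm2pgsql
```

On other distributions the package names will differ; check your package manager's search facility for the PostgreSQL client, the libpq development headers, and osm2pgsql.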
Since no release is available on PyPI and installations are probably used for development, cloning it via
git clone git@github.com:openego/eGon-data.git
and installing it in editable mode via
pip install -e eGon-data/
are recommended.
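Putting the two commands together, a typical development install looks like this (SSH access to GitHub is assumed; substitute the HTTPS clone URL if you haven't set up SSH keys):

```shell
# Clone the repository and install it in editable mode, so that local
# changes to the source take effect without reinstalling.
git clone git@github.com:openego/eGon-data.git
pip install -e eGon-data/
```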
In order to keep the package installation isolated, we recommend installing the package in a dedicated virtual environment. There are both an external tool and a builtin module which help in doing so. We also highly recommend spending the time to set up virtualenvwrapper to manage your virtual environments if you start having to keep multiple ones around.
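For example, using the builtin venv module (the directory name .venv is just a convention):

```shell
# Create a dedicated virtual environment with the builtin venv module
# and activate it; subsequent pip installs stay isolated inside it.
python3 -m venv .venv
source .venv/bin/activate
```

Any packages you now install with pip, including the editable install of eGon-data, land inside .venv rather than in your system Python.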
If you run into any problems during the installation of egon.data, try looking into the list of known installation problems we have collected. Maybe we already know of your problem and also of a solution to it.
The :py:mod:`egon.data` package installs a command line application called egon-data with which you can control the workflow, so once the installation is successful, you can explore the command line interface starting with egon-data --help.
The most useful subcommand is probably egon-data serve. After running this command, you can open your browser and point it to localhost:8080, after which you will see the web interface of Apache Airflow with which you can control the eGo^N data processing pipeline.
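Putting it together, a first session with the CLI might look like this (egon-data must already be installed as described above):

```shell
# Inspect the available subcommands, then start the Airflow web UI.
egon-data --help
egon-data serve
# Now point your browser at http://localhost:8080 to control the pipeline.
```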
If running egon-data results in an error, we have also collected a list of known runtime errors, which you can consult in search of a solution.
Warning
A complete run of the workflow might require a lot of computing power and can't be run on a laptop. Use the :ref:`test mode <Test mode>` for experimenting.
The workflow can be tested on a smaller subset of data, using the federal state of Bremen as an example.
Warning
Right now, only OSM data for Bremen gets imported. This is hard-wired in egon.data/data_sets.yml.
You can find more in depth documentation at https://eGon-data.readthedocs.io.