
Seizure Detection pipeline

This project uses machine learning algorithms to detect seizures from ECG data in an MLOps environment. Here is an overview of the automated ML pipeline:

[Figure: Automated pipeline]

This pipeline runs inside a dockerised environment, represented below:

[Figure: Architecture]

Prerequisites

Dependencies

The pipeline requires these packages to run:

  • Python >= 3.7
  • pandas == 1.1.5
  • numpy == 1.19.5
  • pyEDFlib == 0.1.22
  • click == 8.0.1
  • py-ecg-detectors == 1.0.2
  • wfdb == 3.4.0
  • biosppy == 0.7.3
  • hrv-analysis == 1.0.4
  • ecg-qc == 1.0b5
  • great-expectations == 0.13.25
  • airflow-provider-great-expectations == 0.0.7
  • psycopg2-binary == 2.8.6

You can install them in a virtual environment on your machine with:

    $ pip install -r requirements.txt
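For a sense of how some of these packages fit together, here is a minimal, self-contained sketch (not taken from the repository) that reads an ECG channel from an EDF file with pyEDFlib, detects R-peaks with py-ecg-detectors and computes HRV features with hrv-analysis. The file path and channel index are illustrative assumptions.

    import pyedflib
    from ecgdetectors import Detectors
    from hrvanalysis import get_time_domain_features

    EDF_PATH = "data/example_record.edf"  # hypothetical input file

    # Read the first signal of the EDF file (assumed here to be the ECG channel)
    edf = pyedflib.EdfReader(EDF_PATH)
    ecg_signal = edf.readSignal(0)
    sampling_rate = edf.getSampleFrequency(0)

    # Detect R-peaks with the Pan-Tompkins detector from py-ecg-detectors
    detectors = Detectors(sampling_rate)
    r_peaks = detectors.pan_tompkins_detector(ecg_signal)

    # Convert R-peak sample indices into RR intervals in milliseconds
    rr_intervals = [
        (r_peaks[i] - r_peaks[i - 1]) / sampling_rate * 1000
        for i in range(1, len(r_peaks))
    ]

    # Compute time-domain HRV features (mean RR, SDNN, RMSSD, ...)
    print(get_time_domain_features(rr_intervals))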

Environment

You need Docker and docker-compose installed on your machine to set up the environment.

Getting started

Setting up the environment and launching docker-compose

Using a symbolic link is the most convenient way to import data stored in another path. In that case, first create a symbolic link in the data folder:

    $ ln -s -r PATH_TO_DATA_FOLDER data/

Then update the env.sh file, setting its last line to the name of the symbolic link folder:

    export SYMLINK_FOLDER='SYMBOLIC_NAME_FOLDER_NAME'

You can now run these commands:

    $ source setup_env.sh
    $ docker-compose build
    $ docker-compose up airflow-init
    $ docker-compose up -d

Warning: the table below lists the default ports used by the different services. If one of them is already in use on your machine, change the value of the corresponding environment variable in the env.sh file before running the commands above.

Service                          Default port
---------------------------------------------
PostgreSQL                       5432
InfluxDB                         8086
Airflow                          8080
Grafana                          3000
MLflow                           5000
Great Expectations (via NGINX)   8082
Flower                           5555
Redis                            6379
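If you want to check quickly which of these ports are already taken before starting the stack, a small convenience snippet like the one below can help. It is not part of the pipeline itself, just an illustrative helper.

    import socket

    # Default service ports from the table above
    DEFAULT_PORTS = {
        "PostgreSQL": 5432,
        "InfluxDB": 8086,
        "Airflow": 8080,
        "Grafana": 3000,
        "MLflow": 5000,
        "Great Expectations (via NGINX)": 8082,
        "Flower": 5555,
        "Redis": 6379,
    }

    for service, port in DEFAULT_PORTS.items():
        # connect_ex returns 0 when something is already listening on the port
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            in_use = sock.connect_ex(("127.0.0.1", port)) == 0
        print(f"{service:<32} {port:<6} {'in use' if in_use else 'free'}")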

Before running Airflow, you must fetch data with:

    $ make fetch_data

UI

Once the services are up, you can interact with their UIs (see the default ports listed above).

When credentials are required, the username and password are admin.
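Besides the web UIs, the dockerised PostgreSQL service can also be queried directly from Python with psycopg2 (one of the dependencies listed above). The database name and credentials below are assumptions; check docker-compose.yml and env.sh for the values actually used by the stack.

    import psycopg2

    # Connection parameters are assumptions: the port comes from the table above,
    # the database name and credentials should be checked in docker-compose.yml / env.sh.
    conn = psycopg2.connect(
        host="localhost",
        port=5432,
        dbname="postgres",
        user="admin",
        password="admin",
    )
    with conn, conn.cursor() as cur:
        cur.execute("SELECT version();")
        print(cur.fetchone())
    conn.close()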

Executing scripts separately

First, export the Python path so that the scripts can be found:

    $ export PYTHONPATH=$(pwd)

You can now execute each Python script separately by running:

    $ python3 <path-to-Python-script> [OPTIONS]

The available options are shown by running a script with the --help option.
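The scripts' options presumably follow the usual click pattern (click is listed in the dependencies above). The snippet below is a purely illustrative toy example, not one of the repository's scripts; the real script names and options are listed by --help.

    import click

    @click.command()
    @click.option("--edf-file", required=True, help="Path to the input EDF file.")
    @click.option("--output-folder", default="output", help="Where to write the results.")
    def run(edf_file, output_folder):
        """Toy command illustrating the OPTIONS pattern used by the pipeline scripts."""
        click.echo(f"Processing {edf_file} -> {output_folder}")

    if __name__ == "__main__":
        run()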

Shutting down the environment

You can stop all services by running:

    $ docker-compose down 

If you add the -v option, all services' persistent data will be erased.

License

This project is licensed under the GNU General Public License.
