
Release 1 (#66)
Co-authored-by: Xander Bil <[email protected]>
Co-authored-by: driesaster <[email protected]>
Co-authored-by: Xander Bil <[email protected]>
Co-authored-by: miboelae <[email protected]>
Co-authored-by: Michaël Boelaert <[email protected]>
Co-authored-by: Marieke <[email protected]>
7 people authored Mar 14, 2024
1 parent 2c4d4bd commit d14f67d
Showing 96 changed files with 3,216 additions and 321 deletions.
60 changes: 59 additions & 1 deletion .github/workflows/lint.yml
@@ -1,4 +1,4 @@
name: Lint
name: Code Style

on:
push:
@@ -43,4 +43,62 @@ jobs:
      - name: Run Backend linter
        run: autopep8 -rd --exit-code backend

  pytest:
    name: Unit Testing
    runs-on: self-hosted
    env:
      DATABASE_URI: "${{ secrets.POSTGRES_CONNECTION }}"
      FRONTEND_URL: "https://localhost:8080"
      CAS_SERVER_URL: "https://login.ugent.be"
      SECRET_KEY: "test"
      ALGORITHM: "HS256"
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python 3.12
        uses: actions/setup-python@v5
        with:
          python-version: 3.12
      - name: Install dependencies
        working-directory: backend
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt
      - name: Initialize Database
        working-directory: backend
        run: |
          alembic upgrade head
      - name: Test with pytest
        working-directory: backend
        run: |
          pip install pytest pytest-cov pytest-html pytest-sugar pytest-json-report
          py.test -v --cov --html=../reports/pytest/report.html
      - name: Archive pytest coverage results
        uses: actions/upload-artifact@v1
        with:
          name: pytest-coverage-report
          path: reports/pytest/
      - name: clean up alembic
        working-directory: backend
        if: always()
        run: alembic downgrade base
  pyright:
    name: Pyright
    runs-on: self-hosted
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python 3.12
        uses: actions/setup-python@v5
        with:
          python-version: 3.12
      - name: Install dependencies
        working-directory: backend
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt
          pip install pyright
      - name: run pyright
        working-directory: backend
        run: |
          pyright
2 changes: 2 additions & 0 deletions .gitignore
@@ -0,0 +1,2 @@
.coverage
**/reports/
1 change: 1 addition & 0 deletions backend/.flake8
@@ -2,3 +2,4 @@
extend-ignore = E203
exclude = .git,__pycache__,venv
indent-size = 4
max-line-length = 88
5 changes: 5 additions & 0 deletions backend/.gitignore
@@ -1,2 +1,7 @@
__pycache__
*venv*

*.db
config.yml
.env
.coverage
53 changes: 41 additions & 12 deletions backend/README.md
@@ -1,23 +1,26 @@
## Run the backend api
# Backend API

### Setup
## Running the API

#### In this directory, execute the following command to create a python environment:
### Setup

```sh
# Create a python virtual environment
python -m venv venv
```

#### Activate the environment:

```sh
# Activate the environment
source venv/bin/activate
# Install dependencies
pip install -r requirements.txt
```

#### Install the dependencies:
#### Create a `.env` file with the following content

```sh
pip install -r requirements.txt
```yml
FRONTEND_URL="https://localhost:8080"
CAS_SERVER_URL="https://login.ugent.be"
DATABASE_URI="database connection string: postgresql://..., see discord..."
SECRET_KEY="<secret key to sign JWT tokens>" # e.g. generate with `openssl rand -hex 32`
ALGORITHM="HS256" # algorithm used to sign JWT tokens
```

### Usage
@@ -34,10 +37,36 @@ source venv/bin/activate
./run.sh
```

It will start a local development server on port `8000`
This will start a local development server on port `5173`

## The API

## Login

Authentication is handled via CAS. The client can ask where to find the CAS
server with the `/api/authority` endpoint. A ticket can then be obtained via
`<authority>?service=<redirect_url>`; after authentication, the CAS server
redirects to `<redirect_url>?ticket=<ticket>`. Once the client is authenticated,
further authorization happens with [JWT](https://jwt.io/). To obtain this token,
make a `POST` request to `/api/token/` with the CAS ticket `<ticket>` and the
`<redirect_url>`; the redirect URL is needed to verify the ticket. If the ticket
is valid, a web token is returned. To authorize subsequent requests, include the
token in the `Authorization` header.
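
For illustration only, here is a minimal client-side sketch of this flow. The
backend host, the request/response shapes, the `access_token` field name, the
`Bearer` scheme, and the `/api/me` endpoint are assumptions, not the actual API
contract:

```python
import requests

API = "http://localhost:8000"            # assumed backend host
REDIRECT_URL = "https://localhost:8080"  # the service URL passed to CAS

# 1. Ask the backend where the CAS server lives.
authority = requests.get(f"{API}/api/authority").json()

# 2. The user's browser visits f"{authority}?service={REDIRECT_URL}"; after
#    logging in, CAS redirects back to f"{REDIRECT_URL}?ticket=<ticket>".
ticket = "<ticket obtained from the redirect>"

# 3. Exchange the CAS ticket (plus the redirect URL, used to verify it) for a JWT.
resp = requests.post(f"{API}/api/token/",
                     json={"ticket": ticket, "redirect_url": REDIRECT_URL})
token = resp.json()["access_token"]  # assumed field name

# 4. Authorize later requests by sending the token in the Authorization header
#    (the exact scheme, e.g. "Bearer", is an assumption).
requests.get(f"{API}/api/me",  # hypothetical endpoint
             headers={"Authorization": f"Bearer {token}"})
```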

## Developing

#### To format the Python code in place to conform to PEP 8 style:

```sh
autopep8 -ri .
```

## Testing

You can add tests by creating `test_*` files with `test_*` functions under the `tests` directory, as sketched below.
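
A minimal sketch of such a test (the file name and the checked behaviour are
made up purely for illustration):

```python
# tests/test_example.py — hypothetical example; pytest collects every test_*
# function defined in a test_* file under the tests directory.
def test_sorting_example():
    assert sorted([3, 1, 2]) == [1, 2, 3]
```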

### Run the tests (in the virtual environment):

```sh
pytest -v
```
113 changes: 113 additions & 0 deletions backend/alembic.ini
@@ -0,0 +1,113 @@
# A generic, single database configuration.

[alembic]
# path to migration scripts
script_location = alembic

# template used to generate migration file names; The default value is %%(rev)s_%%(slug)s
# Uncomment the line below if you want the files to be prepended with date and time
# see https://alembic.sqlalchemy.org/en/latest/tutorial.html#editing-the-ini-file
# for all available tokens
# file_template = %%(year)d_%%(month).2d_%%(day).2d_%%(hour).2d%%(minute).2d-%%(rev)s_%%(slug)s

# sys.path path, will be prepended to sys.path if present.
# defaults to the current working directory.
prepend_sys_path = .

# timezone to use when rendering the date within the migration file
# as well as the filename.
# If specified, requires the python>=3.9 or backports.zoneinfo library.
# Any required deps can be installed by adding `alembic[tz]` to the pip requirements
# string value is passed to ZoneInfo()
# leave blank for localtime
# timezone =

# max length of characters to apply to the
# "slug" field
# truncate_slug_length = 40

# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false

# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false

# version location specification; This defaults
# to alembic/versions. When using multiple version
# directories, initial revisions must be specified with --version-path.
# The path separator used here should be the separator specified by "version_path_separator" below.
# version_locations = %(here)s/bar:%(here)s/bat:alembic/versions

# version path separator; As mentioned above, this is the character used to split
# version_locations. The default within new alembic.ini files is "os", which uses os.pathsep.
# If this key is omitted entirely, it falls back to the legacy behavior of splitting on spaces and/or commas.
# Valid values for version_path_separator are:
#
# version_path_separator = :
# version_path_separator = ;
# version_path_separator = space
version_path_separator = os # Use os.pathsep. Default configuration used for new projects.

# set to 'true' to search source files recursively
# in each "version_locations" directory
# new in Alembic version 1.10
# recursive_version_locations = false

# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8

[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples

# format using "black" - use the console_scripts runner, against the "black" entrypoint
# hooks = black
# black.type = console_scripts
# black.entrypoint = black
# black.options = -l 79 REVISION_SCRIPT_FILENAME

# lint with attempts to fix using "ruff" - use the exec runner, execute a binary
# hooks = ruff
# ruff.type = exec
# ruff.executable = %(here)s/.venv/bin/ruff
# ruff.options = --fix REVISION_SCRIPT_FILENAME

# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARN
handlers = console
qualname =

[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
1 change: 1 addition & 0 deletions backend/alembic/README
@@ -0,0 +1 @@
Generic single-database configuration.
102 changes: 102 additions & 0 deletions backend/alembic/env.py
@@ -0,0 +1,102 @@
from src.user.models import Base as UserBase
from src.subject.models import Base as SubjectBase
from src.project.models import Base as ProjectBase
from src.group.models import Base as GroupBase
from src.submission.models import Base as SubmissionBase
import os
import sys
from logging.config import fileConfig

from sqlalchemy import engine_from_config
from sqlalchemy import pool
from sqlalchemy import MetaData

from alembic import context
from src import config as c


# Calculate the path based on the location of the env.py file
d = os.path.dirname
parent_dir = d(d(os.path.abspath(__file__)))
sys.path.append(parent_dir)

# Import the Base from each of your model submodules

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Interpret the config file for Python logging.
# This line sets up loggers basically.
if config.config_file_name is not None:
    fileConfig(config.config_file_name)

config.set_main_option('sqlalchemy.url', c.CONFIG.database_uri)

# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
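# Merge the metadata of every model module into one MetaData object so that
# Alembic's autogenerate sees all tables. Note: MetaData._add_table() is a
# private (underscore-prefixed) SQLAlchemy API, so it may change between releases.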
combined_metadata = MetaData()
for base in [ProjectBase, SubjectBase, UserBase, GroupBase, SubmissionBase]:
    for table in base.metadata.tables.values():
        combined_metadata._add_table(table.name, table.schema, table)

target_metadata = combined_metadata

# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.


def run_migrations_offline() -> None:
    """Run migrations in 'offline' mode.
    This configures the context with just a URL
    and not an Engine, though an Engine is acceptable
    here as well. By skipping the Engine creation
    we don't even need a DBAPI to be available.
    Calls to context.execute() here emit the given string to the
    script output.
    """
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url,
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
    )

    with context.begin_transaction():
        context.run_migrations()


def run_migrations_online() -> None:
    """Run migrations in 'online' mode.
    In this scenario we need to create an Engine
    and associate a connection with the context.
    """
    connectable = engine_from_config(
        config.get_section(config.config_ini_section, {}),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )

    with connectable.connect() as connection:
        context.configure(
            connection=connection, target_metadata=target_metadata
        )

        with context.begin_transaction():
            context.run_migrations()


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()