In this folder we provide example scripts for running sets of simulations on an HPC cluster or locally. These instructions and the framework for argument parsing are directly adapted from the smart-comp-sci repository.
```
usage: main.py [-h] [--submit-tscc] {convert-notebooks,mechanotransduction-preprocess,mechanotransduction,mechanotransduction-nuc-only}

  convert-notebooks     Convert notebooks to python files
  mechanotransduction-preprocess
                        Preprocess mesh for mechanotransduction example
  mechanotransduction   Run mechanotransduction example with cell on nanopillars
  mechanotransduction-nuc-only
                        Run mechanotransduction example with cell on nanopillars,
                        only considering YAP/TAZ transport in and out of nucleus
```
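This help text can be regenerated at any time from the command line:

```bash
# Print the top-level help, including the list of available subcommands
python3 main.py -h
```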
Each script has many arguments associated with testing conditions such as nuclear indentation, N-WASP reaction rate, etc. The full list, with all default values defined, can be found in `mech_parser_args.py` within the `model-files` folder.
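To see which flags a particular subcommand accepts and their defaults without opening `mech_parser_args.py`, the per-subcommand help should also work; this assumes the subparsers expose the standard argparse `-h`/`--help` option:

```bash
# List all flags and default values for the mechanotransduction subcommand
python3 main.py mechanotransduction -h
```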
There are currently 3 ways to execute the scripts. Examples of such calls can be found in the assorted bash scripts in this folder.
- Running the script without any additional flags, e.g. `python3 main.py mechanotransduction [args]`, executes it directly as a normal Python script. Here `[args]` is a series of arguments giving the specifications for a given simulation; for instance, to specify a nuclear indentation of 2.8, `--nuc-compression 2.8` would be appended.
- You can submit a job to an HPC cluster by adjusting the SLURM script in `runner.py` and passing the `--submit-tscc` (or another custom) flag, e.g. `python3 main.py --submit-tscc mechanotransduction [args]`. Rather than running the script directly, this will generate a SLURM job script (see `runner.py`) for submission to an HPC cluster; a combined example of these two modes is sketched after this list.
- You can navigate to the example folders and run the notebooks directly using `jupyter`.
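Putting the first two modes together, a local run and the corresponding cluster submission of the same simulation might look like the following (only `--nuc-compression` and `--submit-tscc` are taken from the text above; any other flags would come from `mech_parser_args.py`):

```bash
# Run a single simulation locally with a nuclear indentation of 2.8
python3 main.py mechanotransduction --nuc-compression 2.8

# Generate a SLURM job script via runner.py and submit the same simulation to the cluster
python3 main.py --submit-tscc mechanotransduction --nuc-compression 2.8
```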
All the code in this repository depends on `smart`, which in turn depends on the development version of legacy FEniCS. While `smart` is a pure Python package and can easily be installed with pip (i.e. `python3 -m pip install fenics-smart`), the development version of FEniCS can be tricky to install, so we recommend using Docker to run the code locally, or Singularity to run on a cluster. Alternatively, you can set up an environment for running on HPC clusters using Spack, as described in the smart-comp-sci repository.
We provide a pre-built Docker image containing both the development version of FEniCS and `smart`, which you can pull using
```bash
docker pull ghcr.io/rangamanilabucsd/smart:2.2.3
```
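On clusters where Docker is not available, the same published image can typically be converted with Singularity (or Apptainer); the output filename below is just a placeholder:

```bash
# Build a Singularity image from the published Docker image
singularity pull smart_2.2.3.sif docker://ghcr.io/rangamanilabucsd/smart:2.2.3
```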
If you prefer to run the code in Jupyter notebooks, we also provide a Docker image for that:

```bash
docker pull ghcr.io/rangamanilabucsd/smart-lab:2.2.3
```
You can read more about how to initialize a container and run the code in the smart documentation.
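As a rough sketch (the mount point, working directory, and port below are assumptions; the smart documentation gives the recommended invocation), a container could be started with this repository mounted as follows:

```bash
# Start an interactive container with the current directory mounted
docker run -ti -v "$(pwd)":/home/shared -w /home/shared ghcr.io/rangamanilabucsd/smart:2.2.3

# For the notebook image, also publish the port used by Jupyter
docker run -ti -p 8888:8888 -v "$(pwd)":/home/shared -w /home/shared ghcr.io/rangamanilabucsd/smart-lab:2.2.3
```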
All the scripts are available as Jupyter notebooks. If you want to run the examples using the `main.py` script in this folder, you need to convert the notebooks to Python files first.
In order to run the scripts on the cluster, we first need to convert the notebooks to Python files. To do this we use `jupytext`, which is part of the requirements. To convert all the notebooks into Python files, run

```bash
python3 main.py convert-notebooks
```

inside this folder (called `scripts`).
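Since this step relies on the `jupytext` command-line tool, a single notebook can also be converted by hand if needed (the notebook path below is a placeholder):

```bash
# Convert one notebook to a .py script
jupytext --to py path/to/notebook.ipynb
```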