
Build Charm4Py with UCX and SlurmPMI on SDSC Expanse #213

Draft · wants to merge 1 commit into main
Conversation

@ZwFink ZwFink commented Aug 12, 2021

Experimental setup (SDSC Expanse)

  1. Install Miniconda, install packages
    wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh
    chmod +x ./Miniconda3-latest-Linux-x86_64.sh 
    ./Miniconda3-latest-Linux-x86_64.sh 
    conda create --name charm4py
    conda activate charm4py
    conda install --yes numpy scipy pandas greenlet cython
  1. Install UCX
    git clone https://github.com/openucx/ucx.git
    cd ucx/
    # Version 1.10.1
    git checkout 6a5856ef4f72c8139951e7ed1d0a4cb75a2e82ec
    mkdir -p ~/.local
    module load gcc/10.2.0
    # This was required because UCX wasn't finding the NUMA libraries.
    # This command should work directly on Expanse (from https://github.com/openucx/ucx/issues/4774#issuecomment-586646345)
    export NUMA_HOME=/cm/shared/apps/spack/cpu/opt/spack/linux-centos8-zen/gcc-8.3.1/numactl-2.0.12-uvjxkgifpcwra25lv6tzxa5gof5ayfkq
    CFLAGS="-I$NUMA_HOME/include"
    LDFLAGS="-L$NUMA_HOME/lib -Wl,-rpath,$NUMA_HOME/lib"
    export CFLAGS LDFLAGS
    ./autogen.sh
    mkdir build
    cd build
    ../contrib/configure-release --prefix=$HOME/.local/ucx
    make -j && make install
  1. Install Charm++
    cd ~
    mkdir charms
    cd charms
    git clone https://github.com/UIUC-PPL/charm.git
    cd charm
    git checkout charm4py_expanse
    # The paths containing the UCX libraries, Slurm, and PMI should be on your LD_LIBRARY_PATH.
    # You will need to add the UCX libraries yourself; on Expanse, Slurm and PMI should
    # already be there.
    ./build charm4py ucx-linux-x86_64 slurmpmi --with-production --basedir=$HOME/.local/ucx -j --force
  1. Install Charm4Py
    cd ~/charms
    git clone https://github.com/UIUC-PPL/charm4py.git
    cd charm4py
    git checkout expanse
    ln -s $HOME/charms charm_src
    python3 -m pip install --user -e .
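
The LD_LIBRARY_PATH note in the Charm++ step can be handled with a bashrc fragment like the following (the UCX path is assumed from the `--prefix` used above):

```shell
# ~/.bashrc fragment: put the UCX libraries on the dynamic linker path.
# On Expanse, the Slurm and PMI library paths should already be present.
export LD_LIBRARY_PATH=$HOME/.local/ucx/lib:$LD_LIBRARY_PATH
```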

@ZwFink ZwFink commented Aug 16, 2021

OpenMPI/mpi4py

OpenMPI

git clone https://github.com/open-mpi/ompi.git
cd ompi/
git checkout a8dd8708d8b6d1346328d7f4612d63b307c25653
git submodule update --init --recursive
./autogen.pl
mkdir build; cd build
../configure --prefix=$HOME/.local/ompi --enable-mpirun-prefix-by-default --with-ucx=$HOME/.local/ucx --without-lsf --without-psm --without-libfabric --without-verbs --without-psm2 --without-alps --without-sge --with-slurm --without-tm --without-loadleveler --disable-debug --disable-memchecker --disable-oshmem --disable-java --disable-mpi-java --disable-man-pages --with-pmi=/cm/shared/apps/slurm/current/
make -j && make install
# These lines should also go into your bashrc
export OPAL_PREFIX=$HOME/.local/ompi/
export OPAL_LIBDIR=$OPAL_PREFIX/lib
export PATH=$OPAL_PREFIX/bin:$PATH
# Example interactive allocation for testing (allocation ID as used by the author)
srun --nodes=1 -A TG-ASC050039N --ntasks-per-node=10 --time=15:00 --partition=compute --exclusive --pty bash -i

mpi4py

Diff applied to mpi.cfg

diff --git a/mpi.cfg b/mpi.cfg
index a704440..542db7e 100644
--- a/mpi.cfg
+++ b/mpi.cfg
@@ -53,7 +53,7 @@ mpicxx               = %(mpi_dir)s/bin/mpicxx
# Open MPI example
# ----------------
[openmpi]
-mpi_dir              = /home/devel/mpi/openmpi/5.0.0
+mpi_dir              = /home/zanef2/.local/ompi
mpicc                = %(mpi_dir)s/bin/mpicc
mpicxx               = %(mpi_dir)s/bin/mpicxx
#include_dirs         = %(mpi_dir)s/include
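
The `%(mpi_dir)s` entries in `mpi.cfg` use Python configparser-style basic interpolation, so the single edited `mpi_dir` value propagates to `mpicc` and `mpicxx`. A small self-contained sketch of how those values resolve (paths copied from the diff above):

```python
import configparser

# mpi.cfg uses configparser "basic interpolation": %(mpi_dir)s expands
# to the mpi_dir value defined in the same section.
cfg = configparser.ConfigParser()
cfg.read_string("""
[openmpi]
mpi_dir = /home/zanef2/.local/ompi
mpicc   = %(mpi_dir)s/bin/mpicc
mpicxx  = %(mpi_dir)s/bin/mpicxx
""")
print(cfg["openmpi"]["mpicc"])   # -> /home/zanef2/.local/ompi/bin/mpicc
```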

Installation

git clone https://github.com/mpi4py/mpi4py.git
cd mpi4py/
python3 -m pip install --user --install-option="--mpi=openmpi" .
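
A quick way to check the result is a minimal mpi4py hello-world (a sketch; assumes the install above succeeded and that it is launched inside a Slurm allocation, e.g. via the `srun` example earlier):

```python
# hello_mpi.py -- minimal mpi4py smoke test.
# Launch with e.g.: srun -n 2 python3 hello_mpi.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
print(f"rank {comm.Get_rank()} of {comm.Get_size()} on {MPI.Get_processor_name()}")
```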
