Pynx070 v021 (#25)
* Switching to pint based mapping code from pynxtools-em

* Fixed the old jupyter notebooks

---------

Co-authored-by: mkuehbach <[email protected]>
mkuehbach and atomprobe-tc authored Sep 12, 2024
1 parent f086312 commit 2755697
Showing 24 changed files with 1,160 additions and 968 deletions.
11 changes: 4 additions & 7 deletions .github/workflows/publish.yml
@@ -7,9 +7,9 @@
# documentation.
name: Upload Python Package

on: workflow_dispatch
# release:
# types: [published]
on:
release:
types: [published]
env:
UV_SYSTEM_PYTHON: true

@@ -39,7 +39,4 @@ jobs:
run: python -m build
- name: Publish package distributions to PyPI
uses: pypa/gh-action-pypi-publish@release/v1
# 27b31702a0e7fc50959f5ad993c78deac1bdfc29
# with:
# user: __token__
# password: ${{ secrets.PYPI_API_TOKEN }}

2 changes: 1 addition & 1 deletion .vscode/launch.json
@@ -10,7 +10,7 @@
"request": "launch",
"cwd": "${workspaceFolder}",
"program": "../.py3.12.4/bin/dataconverter",
"args": [//"convert",
"args": [//workflow files str.str, hits.hits, root.root should be created via touch str.str ...
"tests/data/eln/eln_data.yaml",
"tests/data/eln/apm.oasis.specific.yaml",
"tests/data/apt/Si.apt",
3 changes: 2 additions & 1 deletion dev-requirements.txt
@@ -295,8 +295,9 @@ numexpr==2.10.1
# via
# blosc2
# tables
numpy==2.1.1
numpy==1.26.4
# via
# pynxtools-apm (pyproject.toml)
# ase
# blosc2
# contourpy
2 changes: 1 addition & 1 deletion docs/tutorial/nexusio.md
@@ -1,3 +1,3 @@
# How to use a NeXus/HDF5 file

[The Jupyter notebook is available here](https://github.com/FAIRmat-NFDI/pynxtools-apm/blob/main/examples/HowToUseNXapmNeXusTutorial.ipynb)
[The Jupyter notebook is available here](https://github.com/FAIRmat-NFDI/pynxtools-apm/blob/main/examples/HowToUseTutorial.ipynb)
2 changes: 1 addition & 1 deletion docs/tutorial/standalone.md
@@ -15,7 +15,7 @@ You will have a basic understanding how to use pynxtools-apm for converting your

## Steps

[The Jupyter notebook is available here](https://github.com/FAIRmat-NFDI/pynxtools-apm/blob/main/examples/HowToUseTutorial.ipynb)
[The Jupyter notebook is available here](https://github.com/FAIRmat-NFDI/pynxtools-apm/blob/main/examples/HowToCreateTutorial.ipynb)

**Congrats! You now have a FAIR NeXus file!**

276 changes: 276 additions & 0 deletions examples/HowToCreateTutorial.ipynb
@@ -0,0 +1,276 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# How to convert atom probe (meta)data to NeXus/HDF5"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The aim of this tutorial is to guide users how to create a NeXus/HDF5 file to parse and normalize pieces of information<br>\n",
"from typical file formats of the atom probe community into a common form. The tool assures that this NeXus file matches<br>\n",
"to the NXapm application definition. Such documented conceptually, the file can be used for sharing atom probe research<br>\n",
"with others (colleagues, project partners, the public), for uploading a summary of the (meta)data to public repositories<br>\n",
"and thus avoiding additional work that is typically with having to write documentation of metadata in such repositories<br>\n",
"or a research data management systems like NOMAD Oasis.<br>\n",
"\n",
"The benefit of the data normalization that pynxtools-apm performs is that all pieces of information are represents in the<br>\n",
"same conceptual way with the benefit that most of the so far required format conversions when interfacing with software<br>\n",
"from the technology partners or scientific community are no longer necessary.<br>"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": []
},
"source": [
"### **Step 1:** Check that packages are installed and working in your local Python environment."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Check the result of the query below specifically that `jupyterlab_h5web` and `pynxtools` are installed in your environment.<br>\n",
"Note that next to the name pynxtools you should see the directory in which it is installed. Otherwise, make sure that you follow<br>\n",
"the instructions in the `README` files: \n",
"- How to set up a development environment as in the main README \n",
"- Lauch the jupyter lab from this environement as in the README of folder `examples`"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"! pip list | grep \"h5py\\|nexus\\|jupyter\\|jupyterlab_h5web\\|pynxtools\\|pynxtools-apm\""
]
},
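{
"cell_type": "markdown",
"metadata": {},
"source": [
"Optionally, the versions can also be checked from Python itself. The cell below is a minimal sketch, assuming the packages<br>\n",
"were installed with pip into the active environment; it only reports versions and does not modify anything."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Minimal sketch: report installed versions of the packages used in this tutorial.\n",
"# Assumes they were installed with pip into the active Python environment.\n",
"from importlib.metadata import PackageNotFoundError, version\n",
"\n",
"for pkg in ('pynxtools', 'pynxtools-apm', 'jupyterlab-h5web', 'h5py'):\n",
"    try:\n",
"        print(f'{pkg}: {version(pkg)}')\n",
"    except PackageNotFoundError:\n",
"        print(f'{pkg}: not installed, please revisit the README instructions')"
]
},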
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Set the pynxtools directory and start H5Web for interactive exploring of HDF5 files."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"import zipfile as zp\n",
"from jupyterlab_h5web import H5Web\n",
"print(f\"Current working directory: {os.getcwd()}\")\n",
"print(f\"So-called base, home, or root directory of the pynxtools: {os.getcwd().replace('/examples/apm', '')}\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### **Step 2:** Use your own data or download an example"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": []
},
"source": [
"Example data can be found on Zenodo https://www.zenodo.org/record/7986279."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"! curl --output usa_denton_smith_apav_si.zip https://zenodo.org/records/7986279/files/usa_denton_smith_apav_si.zip?download=1"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"zp.ZipFile(\"usa_denton_smith_apav_si.zip\").extractall(path=\"\", members=None, pwd=None)"
]
},
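{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick check, the cell below is a minimal sketch that lists the extracted example files. It assumes the archive was<br>\n",
"unpacked into the current working directory and only looks for the file extensions used later in this tutorial."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Minimal sketch: list the files extracted from the example archive.\n",
"# Assumes the archive was unpacked into the current working directory.\n",
"for file_name in sorted(os.listdir(os.getcwd())):\n",
"    if file_name.lower().endswith(('.apt', '.epos', '.pos', '.rrng', '.rng', '.yaml')):\n",
"        print(file_name)"
]
},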
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<div class=\"alert alert-block alert-danger\">\n",
"Please note that the metadata inside the provided apm.oasis.specific.yaml and eln_data_apm.yaml files<br>\n",
"contain exemplar values. These do not necessarily reflect the conditions when the raw data of example<br>\n",
"above-mentioned were collected by the scientists. Instead, these file are meant to be edited by you,<br>\n",
"either and preferably programmatically e.g. using output from an electronic lab notebook or manually.</div>"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This example shows the types of files from which the parser collects and normalizes pieces of information:<br>\n",
"* **eln_data_apm.yaml** metadata collected with an electronic lab notebook (ELN) such as a NOMAD Oasis custom schema<br>\n",
"* **apm.oasis.specific.yaml** frequently used metadata that are often the same for many datasets to avoid having to<br>\n",
" type it every time in ELN templates. This file can be considered a configuration file whereby e.g. coordinate system<br>\n",
" conventions can be injected or details about the atom probe instrument communicated if that is part of frequently used<br>\n",
" lab equipment. The benefit of such an approach is that eventual all relevant metadata to an instrument can be read from<br>\n",
" this configuration file via guiding the user e.g. through the ELN with an option to select the instrument.<br>\n",
"* **reconstructed ion positions** in community, technology partner format with<br>\n",
" the ion positions and mass-to-charge state ratio values for the tomographic reconstruction.<br>\n",
"* **ranging definitions** in community / technology partner formatting with<br>\n",
" the definitions how mass-to-charge-state-ratio values map on ion species.<br>\n",
"\n",
"The tool supports the most commonly used information exchange formats of the atom probe community.<br>\n",
"Consult the reference part of the documentation to get a detailed view on how specific formats are supported.<br>"
]
},
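{
"cell_type": "markdown",
"metadata": {},
"source": [
"The cell below is a minimal sketch for previewing the two YAML input files. It assumes that PyYAML is installed and that<br>\n",
"the files are named as in Step 3 (`eln_data.yaml` and `apm.oasis.specific.yaml`) and located in the current working directory."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Minimal sketch: preview the top-level structure of the two YAML input files.\n",
"# Assumes PyYAML is installed and the file names match those used in Step 3.\n",
"import yaml\n",
"\n",
"for yaml_file in ('eln_data.yaml', 'apm.oasis.specific.yaml'):\n",
"    with open(yaml_file) as stream:\n",
"        content = yaml.safe_load(stream)\n",
"    if isinstance(content, dict):\n",
"        print(f'{yaml_file} top-level keys: {sorted(content.keys())}')\n",
"    else:\n",
"        print(f'{yaml_file} parsed as {type(content).__name__}')"
]
},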
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<div class=\"alert alert-block alert-info\">\n",
"Please note that the proprietary file formats RRAW, STR, ROOT, RHIT, and HITS from AMETEK/Cameca are currently not processable<br>\n",
"with pynxtools-apm although we have investigated the situation and were able confirm that a substantial number of metadata have been<br>\n",
"documented by Cameca and are technically extractable and interpretable using Python. This would enable automated mapping and<br>normalizing of these metadata into NeXus via simpler than the current route where an additional ELN or supplementary file like yaml has to be<br>\n",
"used for and eventually users have to enter the same information more than once. AMETEK/Cameca is currently working on<br>\n",
"the implementation of features in AP Suite to make some of these metadata available through the open-source APT file format<br>\n",
"when this is available we will work on an update of pynxtools-apm to support this functionality.</div>"
]
},
{
"cell_type": "markdown",
"metadata": {
"tags": []
},
"source": [
"### **Step 3:** Run the parser"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"eln_data_file_name = [\"eln_data.yaml\"]\n",
"deployment_specific = [\"apm.oasis.specific.yaml\"]\n",
"input_recon_file_name = [\"Si.apt\",\n",
" \"Si.epos\",\n",
" \"Si.pos\"]\n",
"input_range_file_name = [\"Si.RRNG\",\n",
" \"Si.RNG\",\n",
" \"Si.RNG\"]\n",
"output_file_name = [\"apm.case1.nxs\",\n",
" \"apm.case2.nxs\",\n",
" \"apm.case3.nxs\"]\n",
"for case_id in range(0, 3):\n",
" ELN = eln_data_file_name[0]\n",
" CFG = deployment_specific[0]\n",
" RECON = input_recon_file_name[case_id]\n",
" RANGE = input_range_file_name[case_id]\n",
" OUTPUT = output_file_name[case_id]\n",
"\n",
" ! dataconverter $ELN $CFG $RECON $RANGE --reader apm --nxdl NXapm --output $OUTPUT"
]
},
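{
"cell_type": "markdown",
"metadata": {},
"source": [
"Optionally, the cell below is a minimal sketch that confirms the expected NeXus output files were written,<br>\n",
"reusing the `output_file_name` list from the cell above."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Minimal sketch: confirm that each NeXus output file exists and report its size.\n",
"for file_name in output_file_name:\n",
"    if os.path.isfile(file_name):\n",
"        print(f'{file_name}: {os.path.getsize(file_name) / 1024**2:.1f} MiB')\n",
"    else:\n",
"        print(f'{file_name} was not created, inspect the dataconverter output above')"
]
},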
{
"cell_type": "markdown",
"metadata": {
"tags": []
},
"source": [
"### **Step 4:** Inspect the NeXus/HDF5 file using H5Web."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"H5Web(OUTPUT)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The NeXus file an also be viewed with H5Web by opening it via the file explorer panel to the left side of this Jupyter lab window."
]
},
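{
"cell_type": "markdown",
"metadata": {},
"source": [
"Beyond H5Web, the file can also be inspected programmatically. The cell below is a minimal sketch, assuming `h5py` is<br>\n",
"installed and that `OUTPUT` still points at the last converted file; it walks the HDF5 hierarchy and prints the first few object paths."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Minimal sketch: walk the HDF5/NeXus hierarchy with h5py and print a few object paths.\n",
"# Assumes h5py is installed and OUTPUT still points at the last converted file.\n",
"import h5py\n",
"\n",
"with h5py.File(OUTPUT, 'r') as h5r:\n",
"    paths = []\n",
"    h5r.visit(paths.append)\n",
"    for path in paths[0:20]:\n",
"        print(path)"
]
},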
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Conclusions:\n",
"***"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This tutorial showed how you can call the pynxtools-apm via a jupyter notebook.<br>\n",
"This opens many possibilities like processing the results further with Python such as through e.g.<br>\n",
"<a href=\"https://conda.io/projects/conda/en/latest/user-guide/install/index.html\">conda</a> on your local computer, <a href=\"https://docs.python.org/3/tutorial/venv.html\">a virtual environment</a>, to interface with AMETEK/Cameca\\'s AP Suite<br>\n",
"<a href=\"https://github.com/CamecaAPT/cameca-customanalysis-interface/wiki\">extension interface</a> to do processing of the data with scientific software from the atom probe<br>\n",
"such as <a href=\"https://github.com/FAIRmat-NFDI/AreaB-software-tools\">open-source tools</a> (paraprobe-toolbox and others</a>) or IVAS / AP Suite.<br>"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Contact person for pynxtools-apm and related examples in FAIRmat:\n",
"Dr.-Ing. Markus Kühbach, 2024/09/12<br>\n",
"\n",
"### Funding\n",
"<a href=\"https://www.fairmat-nfdi.eu/fairmat\">FAIRmat</a> is a consortium on research data management which is part of the German NFDI.<br>\n",
"The project is funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) – project 460197019."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.4"
}
},
"nbformat": 4,
"nbformat_minor": 4
}
