This repository contains code and scripts for the bachelor thesis "Detect manipulated face Images using deep learning tools". Most of the code covers training and testing of the different models in the thesis; the rest consists of various utility scripts for preprocessing and modifying data.
.
├── classifiers
├── dataset_utils
├── MobaiSVM
├── morph_utils
└── multiple_feature_shapes
This repository has five main directories: classifiers, dataset_utils, MobaiSVM, morph_utils and multiple_feature_shapes. The classifiers directory contains the training and testing code for the classifiers modified and used in the thesis.
The dataset_utils folder contains utilities and scripts used for preparing datasets, including renaming scripts and dataset validation scripts.
The MobaiSVM directory contains the original SVM received from Mobai AS; it is the base for the modified classifiers in the classifiers directory.
The morph_utils directory contains various scripts used to make the morphing algorithms run with the desired format.
The multiple_feature_shapes directory contains the code used to determine the best feature shape.
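As an illustration of the idea only (not the thesis code itself), such an experiment could average-pool each 2-D feature map down to several candidate shapes and compare classifier accuracy per shape. The function names and the pooling scheme below are assumptions:

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def pool_to_shape(feature_map, shape):
    # Average-pool a 2-D feature map down to (rows, cols).
    rows = np.array_split(feature_map, shape[0], axis=0)
    return np.array([[block.mean() for block in np.array_split(r, shape[1], axis=1)]
                     for r in rows])

def evaluate_shapes(feature_maps, labels, shapes):
    # feature_maps: (n_samples, H, W); returns mean 5-fold CV accuracy per shape.
    scores = {}
    for shape in shapes:
        X = np.array([pool_to_shape(f, shape).ravel() for f in feature_maps])
        scores[shape] = cross_val_score(SVC(), X, labels, cv=5).mean()
    return scores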
Training:
Step 1:
Run create_dataset.py in mobai_dataset_utils:
Example:
python create_dataset.py \
--original datasets/unibo/ \
--new datasets/mordiff/ \
--output datasets/mordiff_runnable/
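After the script finishes, a quick sanity check of the output layout can catch missing folders early. This snippet is a hypothetical helper, not part of the repository; the path matches the example above:

from pathlib import Path

root = Path("datasets/mordiff_runnable")
for sub in sorted(p for p in root.rglob("*") if p.is_dir()):
    n_files = sum(1 for f in sub.iterdir() if f.is_file())
    print(f"{sub.relative_to(root)}: {n_files} files")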
Step 2:
Run the SVM with the dataset created in the previous step.
Usage:
python svm_training_pipeline.py --bonaFideFeatures BONAFIDEFEATURES \
--morphedAttackFeatures MORPHEDATTACKFEATURES \
--modelOutput MODELOUTPUT
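The actual behavior is defined by svm_training_pipeline.py in this repository. Purely as a sketch of what a pipeline with these flags typically does, assuming a hypothetical format of one flat .npy feature vector per image:

import argparse, glob, os
import joblib
import numpy as np
from sklearn.svm import SVC

def load_features(folder):
    # Hypothetical format: one flat .npy feature vector per image.
    return np.array([np.load(f) for f in sorted(glob.glob(os.path.join(folder, "*.npy")))])

parser = argparse.ArgumentParser()
parser.add_argument("--bonaFideFeatures", required=True)
parser.add_argument("--morphedAttackFeatures", required=True)
parser.add_argument("--modelOutput", required=True)
args = parser.parse_args()

bona = load_features(args.bonaFideFeatures)        # label 0: bona fide
morph = load_features(args.morphedAttackFeatures)  # label 1: morphed attack
X = np.vstack([bona, morph])
y = np.concatenate([np.zeros(len(bona)), np.ones(len(morph))])

clf = SVC(probability=True).fit(X, y)
os.makedirs(args.modelOutput, exist_ok=True)
joblib.dump(clf, os.path.join(args.modelOutput, "svm_model.joblib"))  # assumed file name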
If assertion errors occur, try removing all the .DS_Store files.
You can do this by moving to the desired folder and running:
trash **/.DS_Store
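If the trash utility is not installed, a standard find command removes the files as well:
find . -name ".DS_Store" -delete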
Testing:
Step 1:
Copy the model folder first, since the results in it will be overwritten.
Step 2:
Example usage:
python3 svm_testing_pipeline.py --bonaFideFeatures unibo/Feature_Bonafide \
--morphedAttackFeatures unibo/Feature_Morphed \
--modelOutput mordiff_runnable/model
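As a rough sketch of what the testing step computes, reusing the hypothetical .npy feature format and model file name from the training sketch above (the repository's own svm_testing_pipeline.py is authoritative):

import glob, os
import joblib
import numpy as np

def load_features(folder):
    return np.array([np.load(f) for f in sorted(glob.glob(os.path.join(folder, "*.npy")))])

model = joblib.load("mordiff_runnable/model/svm_model.joblib")  # assumed file name
bona_scores = model.predict_proba(load_features("unibo/Feature_Bonafide"))[:, 1]
morph_scores = model.predict_proba(load_features("unibo/Feature_Morphed"))[:, 1]
print("mean bona fide score:", bona_scores.mean())  # expected to be low
print("mean morph score:", morph_scores.mean())     # expected to be high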
Merging and combining datasets:
Step 1:
If a dataset does not have any probe images in it, run create_dataset.py from mobai_dataset_utils with the unibo dataset as the original.
Example:
python create_dataset.py \
--original datasets/unibo/ \
--new datasets/mordiff/ \
--output datasets/mordiff_runnable/
Do this for each dataset that is going to be merged or combined.
Step 2:
To merge the datasets while keeping the sizes proportional, run merge_dataset.py in mobai_dataset_utils.
Example usage:
python merge_dataset.py --datasets UNIBO MORDIFF MIPGAN --output MERGED
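merge_dataset.py defines the actual merging behavior. One plausible reading of proportional merging, sketched here only as an assumption, is that every source dataset contributes an equal number of files, capped by the smallest one:

import random, shutil
from pathlib import Path

def merge(dataset_dirs, output_dir):
    # Hypothetical sketch; merge_dataset.py is the authoritative implementation.
    out = Path(output_dir)
    out.mkdir(parents=True, exist_ok=True)
    per_source = min(sum(1 for f in Path(d).rglob("*") if f.is_file()) for d in dataset_dirs)
    for d in dataset_dirs:
        files = [f for f in Path(d).rglob("*") if f.is_file()]
        for f in random.sample(files, per_source):
            shutil.copy(f, out / f"{Path(d).name}_{f.name}")

merge(["UNIBO", "MORDIFF", "MIPGAN"], "MERGED")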
To combine the datasets in full, run combine_dataset.py in mobai_dataset_utils.
Example usage:
python combine_dataset.py --datasets UNIBO MORDIFF MIPGAN --output COMBINED
The repository also has utilities and code for running, among other things, the MORDIFF morphing algorithm.
The GitHub repository for MORDIFF: mordiff
MORDIFF paper: mordiff_paper
MORDIFF SYN-MAD benchmark: syn-mad
The repository also has utilities and code for running, among other things, the MIPGAN morphing algorithm.
The GitHub repository for MIPGAN: mipgan