DeepLabV3-Plus-MobileNet-Quantized: Quantized Deep Convolutional Neural Network model for semantic segmentation
DeepLabV3 Quantized is designed for semantic segmentation at multiple scales, trained on various datasets. It uses MobileNet as a backbone.
This is based on the implementation of DeepLabV3-Plus-MobileNet-Quantized found here. This repository contains scripts for optimized on-device export suitable to run on Qualcomm® devices. More details on model performance across various devices can be found here.
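The model's weights and activations are quantized for efficient on-device inference. As a rough illustration of what quantization means here (a generic affine int8 scheme, not necessarily the exact scheme used by this checkpoint), floats are mapped to 8-bit integers via a scale and zero-point:

```python
import numpy as np

def quantize(x, scale, zero_point):
    # Affine quantization: real value ≈ scale * (q - zero_point)
    q = np.round(x / scale) + zero_point
    return np.clip(q, -128, 127).astype(np.int8)

def dequantize(q, scale, zero_point):
    # Map int8 values back to approximate real values.
    return scale * (q.astype(np.float32) - zero_point)

weights = np.array([-0.51, 0.0, 0.27, 1.02], dtype=np.float32)
scale, zero_point = 0.01, 0
q = quantize(weights, scale, zero_point)
recovered = dequantize(q, scale, zero_point)
# recovered matches weights to within one quantization step (scale)
```

Storing int8 instead of float32 cuts model size roughly 4x and enables faster integer arithmetic on mobile hardware, at the cost of a small, bounded rounding error.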
Sign up for early access to run these models on a hosted Qualcomm® device.
Once installed, run the following simple CLI demo:
python -m qai_hub_models.models.deeplabv3_plus_mobilenet_quantized.demo
More details on the CLI tool can be found with the --help option. See demo.py for sample usage of the model, including pre/post-processing. Please refer to our general instructions on using models for more usage instructions.
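The demo handles pre/post-processing for you. Conceptually, the post-processing step of a DeepLabV3-style network reduces per-class logits to a per-pixel label map via an argmax over the class dimension; the sketch below is a generic illustration, not this repository's exact code:

```python
import numpy as np

def logits_to_labels(logits):
    """Collapse (num_classes, H, W) logits into an (H, W) label map."""
    return np.argmax(logits, axis=0)

# Toy network output: 3 classes over a 2x2 image.
logits = np.zeros((3, 2, 2), dtype=np.float32)
logits[1, 0, 0] = 5.0   # class 1 has the highest score at pixel (0, 0)
logits[2, 1, 1] = 3.0   # class 2 has the highest score at pixel (1, 1)

labels = logits_to_labels(logits)
# labels[0, 0] == 1, labels[1, 1] == 2, remaining pixels default to class 0
```

In practice the label map is then resized back to the input resolution and colorized for display, which is what the demo's post-processing does on top of this step.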
This repository contains export scripts that produce a model optimized for on-device deployment. This can be run as follows:
python -m qai_hub_models.models.deeplabv3_plus_mobilenet_quantized.export
Additional options are documented with the --help option. Note that the above script requires access to Qualcomm® AI Hub; see the deployment instructions for Qualcomm® AI Hub.
- The license for the original implementation of DeepLabV3-Plus-MobileNet-Quantized can be found here.
- The license for the compiled assets for on-device deployment can be found here.
- Join our AI Hub Slack community to collaborate, post questions and learn more about on-device AI.
- For questions or feedback, please reach out to us.