This is a repository for the fifth and final project in the AWS Cloud DevOps Engineer Udacity Nanodegree (Capstone Project). The project deploys a Deep Learning API Microservice that classifies input images according to the ImageNet dataset classes.
The objectives of the project are as follows:
- Build a web application that serves a pre-trained PyTorch MobileNet-V2 ImageNet classifier model via a Python Flask application running in a Waitress production WSGI server (a minimal sketch of such a service follows this list).
- Set up a Docker image that contains the web application and its required dependencies; the application can also load the model onto an NVIDIA CUDA GPU, if one is available, for faster inference.
- Deploy the Docker image on a Kubernetes cluster using the AWS Elastic Kubernetes Service (EKS) with the `eksctl` command line toolkit while leveraging a Jenkins CI/CD pipeline.
- The Kubernetes cluster contains two pods, each running one container of the Docker image, deployed on AWS EC2 `t3.large` instance nodes with the following Cluster Autoscaler setting: minimum one node, desired two nodes, maximum two nodes.
- The Kubernetes cluster is updated via a Rolling Update strategy.
- Post-Submission: Changed the Kubernetes cluster configuration to deploy eight pods, each running one container of the Docker image, on AWS EC2 `g4dn.xlarge` instance nodes. These nodes provide an NVIDIA Tesla T4 GPU for accelerated machine learning and deep learning inference workloads, and the same Cluster Autoscaler setting as before is kept so that the containers utilize the available GPU resources.
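
Below is a minimal sketch of how a Flask application served by Waitress could look for this kind of service. It is not the repository's actual `server.py`: it loads the torchvision copy of the pre-trained MobileNet-V2 weights instead of the bundled `model` directory, and the endpoint path, port, and form field name (`/predict`, `8000`, `image`) are assumptions.

```python
# Sketch of a Flask + Waitress ImageNet classification service (hypothetical names).
import io

import torch
from flask import Flask, jsonify, request
from PIL import Image
from torchvision import transforms
from torchvision.models import MobileNet_V2_Weights, mobilenet_v2
from waitress import serve

app = Flask(__name__)

# Load the model onto an NVIDIA CUDA GPU if one is available, otherwise the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
weights = MobileNet_V2_Weights.IMAGENET1K_V1
model = mobilenet_v2(weights=weights).to(device).eval()

# Standard ImageNet preprocessing for MobileNet-V2.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])


@app.route("/predict", methods=["POST"])  # endpoint path is an assumption
def predict():
    # Read the uploaded image from the (assumed) "image" form field.
    image = Image.open(io.BytesIO(request.files["image"].read())).convert("RGB")
    batch = preprocess(image).unsqueeze(0).to(device)
    with torch.no_grad():
        probabilities = torch.nn.functional.softmax(model(batch)[0], dim=0)
    top_prob, top_idx = probabilities.max(dim=0)
    return jsonify({
        "class": weights.meta["categories"][top_idx.item()],
        "probability": round(top_prob.item(), 4),
    })


if __name__ == "__main__":
    # Waitress as the production WSGI server instead of Flask's development server.
    serve(app, host="0.0.0.0", port=8000)
```

The `torch.cuda.is_available()` check is what would let the same container fall back to the CPU on `t3.large` nodes and pick up the Tesla T4 GPU on the `g4dn.xlarge` nodes.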
Note: Links to useful resources I found while doing the project are in the `useful_resources.txt` file in `project_instructions`.
- server.py: The web application server (an example request against it is sketched after this list).
- model: Pre-trained ImageNet MobileNet-V2 PyTorch model.
- templates: Contains the required HTML file.
- requirements.txt: Required Python packages to run the application.
- hadolint: Hadolint v1.18.0 Linux-x86_64 executable.
- Jenkinsfile: Defines the Jenkins CI/CD stage pipeline to run on a Jenkins server.
- Dockerfile: Defines the Docker image.
- docker_build.sh: Builds the Docker image.
- docker_upload.sh: Tags and uploads the Docker image to DockerHub.
- docker_run_cpu.sh: Runs the application locally in a Docker container using the CPU.
- docker_run_gpu.sh: Runs the application locally in a Docker container using an available NVIDIA CUDA GPU.
- kubernetes_run_local.sh: Runs the Docker image on a local Kubernetes cluster (e.g., Minikube).
- kubernetes_deployment.yml: Defines the Kubernetes deployment configuration on the AWS EKS Cluster.
- eksctl_create_aws_kubernetes_cluster.yml: Creates a Managed AWS EKS Cluster (implicitly using AWS CloudFormation).
- eksctl_apply_deployment_aws_kubernetes_cluster.yml: Applies the kubernetes_deployment.yml Kubernetes deployment configuration to the created EKS Cluster.
- eksctl_delete_aws_kubernetes_cluster.yml: Deletes the EKS Cluster and all its resources.
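
For reference, a hypothetical request against the running service, whether started locally via docker_run_cpu.sh / docker_run_gpu.sh or exposed by the EKS deployment; the URL, port, endpoint path, and form field name mirror the assumptions in the server sketch above and are not taken from the repository.

```python
# Hypothetical client call; URL, port, path, and field name are assumptions.
import requests

with open("example.jpg", "rb") as f:
    response = requests.post(
        "http://localhost:8000/predict",  # or the EKS service's load balancer address
        files={"image": f},
        timeout=30,
    )
print(response.json())  # expected shape: {"class": ..., "probability": ...}
```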