Pubsub2Inbox

Pubsub2Inbox is a generic tool that handles input from Pub/Sub messages and turns them into emails, webhooks or GCS objects. It is based on an extensible framework consisting of input and output processors. Input processors can enrich the incoming messages with details (for example, fetching the budget from the Cloud Billing Budgets API), and multiple output processors can be chained together.

Pubsub2Inbox is written in Python 3 and can be deployed as a Cloud Function easily. To guard credentials and other sensitive information, the tool can fetch its YAML configuration from Google Cloud Secret Manager.

The tool also supports templating of emails, messages and other parameters through Jinja2 templating.

Please note: you cannot connect to SMTP port 25 from GCP. Use the alternative ports 465 or 587, or connect to your own mail servers via a Serverless VPC Access connector.
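For reference, sending on submission port 587 with STARTTLS can be sketched with Python's standard smtplib (the host and credentials below are placeholders, and this is a generic illustration, not Pubsub2Inbox's own mail code):

```python
import smtplib
from email.message import EmailMessage

def send_via_starttls(host: str, user: str, password: str, msg: EmailMessage) -> None:
    """Send a message on submission port 587 with STARTTLS.

    GCP blocks outbound port 25, so use 465 (implicit TLS) or
    587 (STARTTLS) instead.
    """
    with smtplib.SMTP(host, 587) as smtp:
        smtp.starttls()
        smtp.login(user, password)
        smtp.send_message(msg)
```

For port 465, `smtplib.SMTP_SSL` would be used instead, since the TLS session is established before the SMTP conversation starts.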

Out of the box

Out of the box, you'll have the following functionality:

Input processors

Available input processors are:

Please note that the input processors have some IAM requirements to be able to pull information from GCP:

  • Resend mechanism (see below)
    • Storage Object Admin (roles/storage.objectAdmin)
  • Signed URL generation (see filters/strings.py:generate_signed_url)
    • Storage Admin on the bucket (roles/storage.admin)
  • Budgets: budget.py
    • Billing Account Viewer (roles/billing.viewer) to retrieve budget details.
    • Browser (roles/browser) to fetch project details.
  • Security Command Center: scc.py
    • Browser (roles/browser) to fetch project details.
  • BigQuery: bigquery.py
    • BigQuery Job User (roles/bigquery.jobUser) and BigQuery Data Viewer (roles/bigquery.dataViewer) to read data.
  • Recommendations: recommendations.py
    • Browser (roles/browser) to fetch project details.
    • Compute Viewer (roles/compute.viewer)
    • Compute Recommender Viewer (roles/recommender.computeViewer), Firewall Recommender Viewer (roles/recommender.firewallViewer), IAM Recommender Viewer (roles/recommender.iamViewer), Product Suggestion Recommender Viewer (roles/recommender.productSuggestionViewer), Billing Account Usage Commitment Recommender Viewer (roles/recommender.billingAccountCudViewer) and/or Project Usage Commitment Recommender Viewer (roles/recommender.projectCudViewer). For billing-account-level recommendations, also add Billing Account Viewer (roles/billing.viewer) and Billing Account Usage Commitment Recommender Viewer (roles/recommender.billingAccountCudViewer) on the billing account itself.
    • Groups: Groups Reader permission in admin.google.com for the service account.

Output processors

Available output processors are:

  • mail.py: sends HTML and/or text emails via SMTP gateways or SendGrid.
  • gcs.py: creates objects in GCS from any input.
  • webhook.py: sends arbitrary HTTP requests, optionally with an added OAuth2 bearer token from GCP.
  • gcscopy.py: copies files between buckets.
  • logger.py: logs the message in Cloud Logging.
  • pubsub.py: sends one or more Pub/Sub messages.

Please note that the output processors have some IAM requirements to be able to pull information from GCP:

Configuring Pubsub2Inbox

Pubsub2Inbox is configured through a YAML file (for examples, see the examples/ directory). Input processors are configured under the processors key and outputs under the outputs key.
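As a rough sketch, a configuration might look like the following (illustrative only; the exact keys and processor options are documented in the examples/ directory, and the Jinja2 template variable shown is hypothetical):

```yaml
# Illustrative sketch -- see the examples/ directory for real configurations.
processors:
  - budget            # input processor (budget.py): enriches budget notifications
outputs:
  - type: mail        # output processor (mail.py): sends the rendered message
    subject: "Budget alert: {{ budget }}"
```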

The features of each processor are explained in the corresponding examples.

Retry and resend mechanism

Pubsub2Inbox has two mechanisms to prevent excessive retries and the resending of messages.

The retry mechanism acknowledges and discards any messages that are older than a configured period (retryPeriod in configuration, default 2 days).

The resend mechanism prevents recurring notifications from being sent. It relies on a Cloud Storage bucket, where it stores zero-length files named by hashing the resendKey (if it is omitted, all template parameters are used). The resend period is configurable through resendPeriod. To prevent the resend bucket from accumulating an unlimited number of files, set an Object Lifecycle Management policy on the bucket.
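The two checks above can be sketched as follows (illustrative; the hash function and the exact comparison logic shown here are assumptions, not necessarily what Pubsub2Inbox uses internally):

```python
import hashlib
from datetime import datetime, timedelta, timezone

RETRY_PERIOD = timedelta(days=2)  # retryPeriod default described above

def should_discard(published_at: datetime, now: datetime) -> bool:
    # Retry mechanism: acknowledge and drop messages older than retryPeriod.
    return now - published_at > RETRY_PERIOD

def resend_marker_name(resend_key: str) -> str:
    # Resend mechanism: the zero-length marker object in the resend bucket
    # is named by hashing the resend key (hash choice here is illustrative).
    return hashlib.sha256(resend_key.encode("utf-8")).hexdigest()

def should_resend(marker_updated_at: datetime, resend_period: timedelta,
                  now: datetime) -> bool:
    # Only send again once the existing marker is older than resendPeriod.
    return now - marker_updated_at > resend_period
```

In the real tool the marker objects live in a GCS bucket, so the "marker age" check corresponds to looking at the object's update timestamp.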

Deploying as Cloud Function

Deploying manually

First, we have the configuration in config.yaml, and we're going to store it as a Secret Manager secret for the function.

Let's define some variables first:

export PROJECT_ID=your-project # Project ID where function will be deployed
export REGION=europe-west1 # Where to deploy the functions
export SECRET_ID=pubsub2inbox # Secret Manager secret name
export SA_NAME=pubsub2inbox # Service account name
export SECRET_URL="projects/$PROJECT_ID/secrets/$SECRET_ID/versions/latest"
export FUNCTION_NAME="pubsub2inbox"
export PUBSUB_TOPIC="billing-alerts" # projects/$PROJECT_ID/topics/billing-alerts

Then we'll create the secrets in Secret Manager:

gcloud secrets create $SECRET_ID \
    --replication-policy="automatic" \
    --project $PROJECT_ID

gcloud secrets versions add $SECRET_ID \
    --data-file=config.yaml \
    --project $PROJECT_ID

We will also create a service account for the Cloud Function:

gcloud iam service-accounts create $SA_NAME \
    --project $PROJECT_ID

gcloud secrets add-iam-policy-binding $SECRET_ID \
    --member "serviceAccount:$SA_NAME@$PROJECT_ID.iam.gserviceaccount.com" \
    --role "roles/secretmanager.secretAccessor" \
    --project $PROJECT_ID

gcloud iam service-accounts add-iam-policy-binding $SA_NAME@$PROJECT_ID.iam.gserviceaccount.com \
    --member "serviceAccount:$SA_NAME@$PROJECT_ID.iam.gserviceaccount.com" \
    --role "roles/iam.serviceAccountTokenCreator" \
    --project $PROJECT_ID

Now we can deploy the Cloud Function:

gcloud functions deploy $FUNCTION_NAME \
    --entry-point process_pubsub \
    --runtime python38 \
    --trigger-topic $PUBSUB_TOPIC \
    --service-account "$SA_NAME@$PROJECT_ID.iam.gserviceaccount.com" \
    --set-env-vars "CONFIG=$SECRET_URL" \
    --region $REGION \
    --project $PROJECT_ID
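For reference, Pub/Sub-triggered background Cloud Functions receive the message payload base64-encoded under event['data']. A stripped-down sketch of such an entry point (the real process_pubsub in this repository does much more, e.g. fetching its configuration from Secret Manager via the CONFIG environment variable):

```python
import base64
import json

def process_pubsub_sketch(event: dict, context=None):
    # Pub/Sub delivers the payload base64-encoded in event["data"];
    # decode it, then parse the JSON body of the notification.
    payload = base64.b64decode(event["data"]).decode("utf-8")
    return json.loads(payload)
```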

If you change the configuration, you can update it by adding a new secret version:

gcloud secrets versions add $SECRET_ID \
    --data-file=config.yaml \
    --project $PROJECT_ID

Deploying via Terraform

Sample Terraform scripts are provided in main.tf, variables.tf and outputs.tf.

Running tests

Run the command:

python3 -m unittest discover