Monday, November 3, 2025

Managing Secrets in Cloud Composer Workflows

 ✅ Why Secrets Management Matters in Cloud Composer


Secrets include sensitive information like:


API keys


Database credentials


OAuth tokens


Cloud service account keys


Why you can’t store them in plaintext:


Composer DAGs are stored in a Cloud Storage bucket and are often version-controlled; anyone with read access to the bucket or repository could see embedded secrets.


Hardcoding secrets violates security best practices and may lead to accidental leaks.


🔑 Best Practices for Handling Secrets in Composer


Never hardcode secrets in DAG files


Avoid directly writing keys, passwords, or tokens in Python code.


Use a dedicated secret manager


Google Cloud Secret Manager is the recommended service (cloud.google.com/secret-manager).


It encrypts secrets at rest, manages access via IAM, and allows versioning.


Access secrets dynamically at runtime


Fetch secrets in your DAG tasks using either operators, hooks, or Python code.


Use Airflow connections for credentials


Airflow “Connections” can store secrets for databases, APIs, or cloud providers.


Sensitive fields, such as the password and the “extra” JSON, are encrypted at rest by Airflow using its Fernet key.


Set proper IAM permissions


Composer’s environment service account should have the Secret Manager Secret Accessor role (roles/secretmanager.secretAccessor) on the secrets it needs.


Principle of least privilege: only grant access to the secrets needed for that workflow.


🛠 Methods to Manage Secrets in Cloud Composer

1. Using Google Cloud Secret Manager


Step 1: Create a secret


gcloud secrets create my-db-password --replication-policy="automatic"
gcloud secrets versions add my-db-password --data-file="password.txt"



Step 2: Grant Composer access


gcloud secrets add-iam-policy-binding my-db-password \
    --member="serviceAccount:YOUR_COMPOSER_SA" \
    --role="roles/secretmanager.secretAccessor"



Step 3: Access secret in a DAG


from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from google.cloud import secretmanager


def get_secret(secret_id: str) -> str:
    """Fetch the latest version of a secret from Secret Manager."""
    client = secretmanager.SecretManagerServiceClient()
    name = f"projects/YOUR_PROJECT_ID/secrets/{secret_id}/versions/latest"
    response = client.access_secret_version(name=name)
    return response.payload.data.decode("UTF-8")


def task_using_secret():
    db_password = get_secret("my-db-password")
    # Use db_password to open the real DB connection here.
    # Never print or log the value itself.


with DAG(
    dag_id="secret_example",
    start_date=datetime(2025, 1, 1),
    schedule_interval=None,
) as dag:
    t1 = PythonOperator(
        task_id="use_secret",
        python_callable=task_using_secret,
    )


2. Using Airflow Connections


Open the Airflow UI from your Composer environment, then go to Admin → Connections


Add a connection with the appropriate type (MySQL, Postgres, HTTP, etc.)


Store the sensitive info in the password field or extra JSON


In a DAG:


from airflow.hooks.base import BaseHook

conn = BaseHook.get_connection("my_connection_id")
db_password = conn.password  # decrypted by Airflow; use it, never print or log it
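
Often you do not need to read the password at all: provider hooks resolve the connection for you, so the credential never appears in DAG code. A minimal sketch, assuming the Postgres provider (apache-airflow-providers-postgres) is installed and a Postgres connection with the hypothetical ID my_postgres_conn exists; the table name is a placeholder:

from airflow.providers.postgres.hooks.postgres import PostgresHook


def count_orders():
    # The hook pulls host, login, and password from the Airflow connection,
    # so no credential is written in the DAG file or task code.
    hook = PostgresHook(postgres_conn_id="my_postgres_conn")  # hypothetical connection ID
    rows = hook.get_records("SELECT COUNT(*) FROM orders")  # "orders" is a placeholder table
    print(rows)  # safe to log: query results, not the secret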


3. Environment Variables (Use cautiously)


You can pass secrets via environment variables in Composer tasks.


Better: combine Cloud Secret Manager with environment-variable injection, for example via secret overrides on Cloud Run jobs or secret-based environment variables with KubernetesPodOperator (see the sketch below).


Avoid committing env variables with secrets to code repos.
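
A minimal sketch of the KubernetesPodOperator variant, assuming a Kubernetes Secret named db-credentials already exists in the cluster (for example, synced there from Secret Manager). The secret name, key, namespace, and image are hypothetical, and import paths vary slightly across cncf.kubernetes provider versions:

from airflow.providers.cncf.kubernetes.operators.pod import KubernetesPodOperator
from airflow.providers.cncf.kubernetes.secret import Secret

# Expose the Kubernetes Secret's "password" key as DB_PASSWORD inside the pod;
# the value never appears in the DAG file or in rendered task parameters.
db_password_env = Secret(
    deploy_type="env",
    deploy_target="DB_PASSWORD",
    secret="db-credentials",  # hypothetical Kubernetes Secret name
    key="password",           # key inside that Secret
)

run_job = KubernetesPodOperator(
    task_id="run_with_secret_env",
    name="run-with-secret-env",
    namespace="default",  # placeholder namespace
    image="us-docker.pkg.dev/YOUR_PROJECT_ID/my-repo/my-job:latest",  # hypothetical image
    secrets=[db_password_env],
)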


4. Parameterizing Tasks Safely


If tasks need dynamic secrets (e.g., per date or user), fetch them at runtime from Secret Manager or Connections.


Combine Jinja templating in Airflow with runtime secret fetching; note that Jinja expressions are rendered only inside templated operator fields, not in arbitrary Python assignments (see the sketch below):


db_password = "{{ var.json.db_password }}"


Use Airflow Variables (plain or JSON-valued) cautiously: Airflow encrypts their values in its metadata database, but they can still surface in the UI and in rendered templates, so keep truly sensitive values in Secret Manager.
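
A minimal sketch of such a templated field, assuming a plain-string Airflow Variable named db_password exists (hypothetical; use var.json, as above, for JSON-valued Variables). In Airflow 2.1+, Variables whose names contain words like "password" are masked in logs and rendered templates:

from airflow.operators.bash import BashOperator

# "env" is a templated field, so the Jinja expression is rendered at task runtime.
export_task = BashOperator(
    task_id="export_with_password",
    bash_command="./export_data.sh",  # hypothetical script that reads $DB_PASSWORD
    env={"DB_PASSWORD": "{{ var.value.db_password }}"},
)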


⚠️ Key Security Tips


Audit Access: Regularly check who has access to your secrets.


Versioning & Rotation: Secret Manager allows versions; rotate credentials periodically.


Encryption: Secret Manager encrypts secrets at rest by default; for control over the encryption keys, consider CMEK (Customer-Managed Encryption Keys).


Logging: Avoid printing secrets to logs; if you must log around them, print a placeholder such as "***" rather than the actual value (see the sketch after this list).


Principle of Least Privilege: Only grant Composer tasks access to the secrets they need.
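
Airflow 2.1+ also masks values it knows about: connection passwords and sensitive-named Variables are replaced with *** in task logs automatically, and values you fetch yourself (for example from Secret Manager) can be registered for masking. A minimal sketch, assuming Airflow 2.1+ and reusing the get_secret helper from the Secret Manager example above:

from airflow.utils.log.secrets_masker import mask_secret


def task_using_secret():
    db_password = get_secret("my-db-password")  # helper defined in the Secret Manager example
    mask_secret(db_password)  # any later accidental log of this value is shown as ***
    # ... use db_password to open the DB connection ...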


📌 Summary


Secrets in Cloud Composer should never be hardcoded.


Use Google Cloud Secret Manager for storing credentials securely.


Use Airflow Connections for service credentials.


Always follow least privilege, encryption, and auditing best practices.


Fetch secrets dynamically at runtime to ensure security.
