Monitoring File Access Logs with Cloud Logging and Cloud Storage

 ๐Ÿ” Monitoring File Access Logs with Cloud Logging and Cloud Storage

Overview

Google Cloud provides tools to help monitor and audit file access activity, especially when using Cloud Storage. By enabling Cloud Audit Logs and Cloud Logging, you can track who accessed which files, when, and how.


✅ Prerequisites

A Google Cloud project


Cloud Storage bucket(s)


Appropriate IAM roles (e.g., Logs Viewer, Storage Admin, and Private Logs Viewer, which is required to read Data Access logs)


🔧 Step-by-Step Setup

1. Enable Cloud Audit Logs

Google Cloud automatically generates Audit Logs for Cloud Storage. There are two main types:


Admin Activity logs (always enabled; they cannot be disabled): Record actions that modify configuration or metadata, such as changing a bucket's IAM policy.


Data Access logs (must be explicitly enabled): Record object read and write operations, such as downloads and uploads.


To enable Data Access logs:


Go to the IAM & Admin > Audit Logs page in the Cloud Console.


Select your project.


Locate Google Cloud Storage and enable the Data Read and/or Data Write log types. (On the same page you can exempt specific principals, such as users, groups, or service accounts, from logging.)


Save changes. (If you prefer to script this step, see the sketch below.)
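
The same setting lives in the project's IAM policy as an auditConfigs entry, which the Resource Manager API can update. Here is a minimal Python sketch, assuming google-api-python-client is installed and Application Default Credentials are configured; the project ID is a placeholder, and a production script should merge with any existing storage.googleapis.com entry rather than appending a duplicate:

# Minimal sketch: enable Data Read/Write audit logs for Cloud Storage.
# Assumes google-api-python-client is installed and Application Default
# Credentials are configured; "your-project-id" is a placeholder.
from googleapiclient import discovery

PROJECT_ID = "your-project-id"

crm = discovery.build("cloudresourcemanager", "v1")

# Fetch the current IAM policy, which carries the auditConfigs section.
policy = crm.projects().getIamPolicy(resource=PROJECT_ID, body={}).execute()

# Append an audit config for Cloud Storage. A robust script would merge
# this with any existing storage.googleapis.com entry instead.
policy.setdefault("auditConfigs", []).append({
    "service": "storage.googleapis.com",
    "auditLogConfigs": [{"logType": "DATA_READ"}, {"logType": "DATA_WRITE"}],
})

# auditConfigs is only written back when named in the update mask.
crm.projects().setIamPolicy(
    resource=PROJECT_ID,
    body={"policy": policy, "updateMask": "auditConfigs,bindings,etag"},
).execute()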


2. Access Logs in Cloud Logging

After enabling logs:


Navigate to Logging > Logs Explorer in the Google Cloud Console.


Use filters to find relevant logs.


Example query:



resource.type="gcs_bucket"
logName="projects/[PROJECT_ID]/logs/cloudaudit.googleapis.com%2Fdata_access"
protoPayload.resourceName="projects/_/buckets/[BUCKET_NAME]/objects/[OBJECT_NAME]"

This filter matches access to a specific object in a bucket; replace the bracketed placeholders with your own project ID, bucket, and object names.


🔎 Interpreting Logs

Important fields in a log entry (a short script that extracts them follows this list):


timestamp: When the access occurred


protoPayload.authenticationInfo.principalEmail: Who accessed the file


protoPayload.methodName: The type of access (e.g., storage.objects.get, storage.objects.create)


resource.labels.bucket_name: Which bucket was accessed


protoPayload.resourceName: The specific file, given as a full resource path (Cloud Storage log entries carry no object_name resource label, so the object appears here)
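
To pull these fields out in bulk rather than reading them one at a time in the console, the same filter can be run through the google-cloud-logging client library. A minimal sketch, assuming that library is installed; the project ID is a placeholder, and the payload is treated as a dict, which is how the library returns audit entries fetched over the JSON API:

# Minimal sketch: list Data Access entries and print the key fields.
# Assumes google-cloud-logging is installed; names are placeholders.
from google.cloud import logging

client = logging.Client(project="your-project-id")

FILTER = (
    'resource.type="gcs_bucket" AND '
    'logName="projects/your-project-id/logs/'
    'cloudaudit.googleapis.com%2Fdata_access"'
)

for entry in client.list_entries(filter_=FILTER, max_results=20):
    payload = entry.payload  # the protoPayload of the audit entry, as a dict
    print(
        entry.timestamp,
        payload.get("authenticationInfo", {}).get("principalEmail"),
        payload.get("methodName"),
        payload.get("resourceName"),
    )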


📤 Exporting Logs (Optional)

You can export logs to:


BigQuery: For advanced querying


Cloud Storage: For archival


Pub/Sub: For real-time alerts and integrations


Steps:


Go to Logging > Log Router


Click Create Sink


Define filters (e.g., all Cloud Storage access logs)


Choose a destination (BigQuery, Cloud Storage, or Pub/Sub), then grant the sink's writer identity access to that destination (a scripted version is sketched below)
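
The sink can also be created with the google-cloud-logging client library. A minimal sketch, assuming the library is installed and a BigQuery dataset named storage_audit already exists; all names are placeholders:

# Minimal sketch: route Cloud Storage Data Access logs to BigQuery.
# Assumes google-cloud-logging is installed and the storage_audit
# dataset already exists; names below are placeholders.
from google.cloud import logging

client = logging.Client(project="your-project-id")

sink = client.sink(
    "gcs-data-access-sink",
    filter_='resource.type="gcs_bucket" AND '
            'logName:"cloudaudit.googleapis.com%2Fdata_access"',
    destination=(
        "bigquery.googleapis.com/projects/your-project-id/"
        "datasets/storage_audit"
    ),
)

if not sink.exists():
    sink.create()
    # The sink writes as a dedicated service account; remember to grant
    # that writer identity access to the destination dataset.
    print("Created sink; writer identity:", sink.writer_identity)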


🔔 Optional: Set Up Alerts

Go to Logging > Logs-based Metrics


Create a metric for events of interest (e.g., all storage.objects.get)


Use Cloud Monitoring > Alerting to create an alert policy based on the metric (the metric step is sketched after this list)
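
The metric step can be scripted the same way. A minimal sketch with google-cloud-logging; the metric name and project ID are placeholders:

# Minimal sketch: a counter metric for Cloud Storage object downloads.
# Assumes google-cloud-logging is installed; names are placeholders.
from google.cloud import logging

client = logging.Client(project="your-project-id")

metric = client.metric(
    "gcs-object-downloads",
    filter_='resource.type="gcs_bucket" AND '
            'protoPayload.methodName="storage.objects.get"',
    description="Counts Cloud Storage object download requests",
)

if not metric.exists():
    metric.create()
# An alert policy can then be attached to this metric under
# Cloud Monitoring > Alerting.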


📘 Example Use Case

Monitor all download activity from a public-facing bucket:


Enable Data Access logs


Use a Logs Explorer filter for storage.objects.get


Export logs to BigQuery for regular audits (a sample audit query follows this list)


Set up alerts for high-volume downloads or unexpected user access
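
Once exported logs are landing in BigQuery, a periodic query can flag heavy downloaders. A minimal sketch, assuming google-cloud-bigquery is installed and the sink from the previous section writes into a storage_audit dataset; the table name follows Cloud Logging's usual cloudaudit_googleapis_com_data_access naming for audit log exports, but verify it against your own dataset:

# Minimal sketch: count downloads per principal over the last day.
# Dataset and table names are assumptions based on the sink above.
from google.cloud import bigquery

client = bigquery.Client(project="your-project-id")

QUERY = """
SELECT
  protopayload_auditlog.authenticationInfo.principalEmail AS principal,
  COUNT(*) AS downloads
FROM `your-project-id.storage_audit.cloudaudit_googleapis_com_data_access`
WHERE protopayload_auditlog.methodName = 'storage.objects.get'
  AND timestamp > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
GROUP BY principal
ORDER BY downloads DESC
"""

for row in client.query(QUERY).result():
    print(row.principal, row.downloads)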


🛡️ Best Practices

Regularly review IAM policies and remove access that is no longer needed


Restrict who can read audit logs, and make sure Data Access logging is enabled for sensitive buckets


Use log retention policies to control cost
