Building a Secure File Upload API Using Cloud Storage and Cloud Functions
Building a secure file upload API using cloud storage (like Google Cloud Storage or AWS S3) and cloud functions (like Google Cloud Functions or AWS Lambda) involves several key components:
🔒 Security Goals
Validate file types and sizes.
Authenticate and authorize users.
Prevent direct access to cloud storage.
Use signed URLs or proxy uploads.
Sanitize file names (see the sketch below).
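On the last point, stripping path components and whitelisting characters goes a long way. A small Python sketch (the sanitize_filename helper is our own name, not a library function):
python
import os
import re

def sanitize_filename(name):
    # Drop any directory components, then replace everything outside a
    # conservative character whitelist.
    name = os.path.basename(name)
    return re.sub(r'[^A-Za-z0-9._-]', '_', name)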
✅ Recommended Architecture Overview
Client (e.g., web or mobile app)
    |
    | ---> Upload request
    |
[Authenticated API Gateway / HTTPS Endpoint]
    |
    | ---> Validates request
    | ---> Generates signed URL
    |
[Cloud Function]
    |
    | ---> Returns signed URL or proxies the upload
    |
[Cloud Storage Bucket (Private)]
📦 Step-by-Step Implementation (GCP Example)
1. Set up a secure Cloud Storage bucket
Create a private bucket.
Disable public access.
Enable uniform bucket-level access.
bash
# Create a private bucket
gsutil mb -p [PROJECT_ID] -l [LOCATION] gs://[BUCKET_NAME]

# Block all public access
gsutil pap set enforced gs://[BUCKET_NAME]

# Enable uniform bucket-level access
gsutil uniformbucketlevelaccess set on gs://[BUCKET_NAME]
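For the signed-URL flow to succeed, the identity that signs also needs write access to the bucket. Assuming the function runs as a dedicated service account (the email below is a placeholder), access can be granted with:
bash
gsutil iam ch \
  serviceAccount:uploader@[PROJECT_ID].iam.gserviceaccount.com:roles/storage.objectCreator \
  gs://[BUCKET_NAME]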
2. Write a Cloud Function to Generate Signed URLs
python
# Python (Google Cloud Function - HTTP Trigger)
import os
from datetime import timedelta

from flask import jsonify
from google.cloud import storage

ALLOWED_EXTENSIONS = ('.png', '.jpg', '.jpeg', '.pdf')


def generate_upload_url(request):
    request_json = request.get_json(silent=True) or {}
    filename = request_json.get('filename')
    content_type = request_json.get('contentType')

    # Basic validation: both fields are required
    if not filename or not content_type:
        return jsonify({'error': 'Missing required fields'}), 400

    # Sanitize the file name: strip any path components
    filename = os.path.basename(filename)

    # File extension validation (example whitelist)
    if not filename.lower().endswith(ALLOWED_EXTENSIONS):
        return jsonify({'error': 'Invalid file type'}), 400

    bucket_name = 'your-private-bucket'
    expiration = timedelta(minutes=15)  # 15 minutes

    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(filename)

    url = blob.generate_signed_url(
        version="v4",
        expiration=expiration,
        method="PUT",
        content_type=content_type,
    )

    return jsonify({
        'url': url,
        'method': 'PUT',
        'expiresIn': int(expiration.total_seconds()),
    })
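Note: generate_signed_url must be able to sign bytes. Running locally with a service account key file, the snippet above works as-is, but on Cloud Functions the default credentials typically carry no private key, so signing can fail. A common workaround is to sign through the IAM signBlob API by passing the runtime service account's email and an access token; a sketch, assuming the google-auth library and that the runtime service account has the Service Account Token Creator role on itself:
python
import google.auth
from google.auth.transport import requests as auth_requests

# Refresh the default credentials to resolve the service account email
# and obtain an access token for IAM-based signing.
credentials, _ = google.auth.default()
credentials.refresh(auth_requests.Request())

url = blob.generate_signed_url(
    version="v4",
    expiration=expiration,
    method="PUT",
    content_type=content_type,
    # Sign via the IAM signBlob API instead of a local private key.
    service_account_email=credentials.service_account_email,
    access_token=credentials.token,
)
As a bonus, v4 signed URLs can also pin an X-Goog-Content-Length-Range header at signing time, which makes Cloud Storage itself reject uploads outside the declared size range.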
3. Deploy the Cloud Function
bash
gcloud functions deploy generate_upload_url \
  --runtime python310 \
  --trigger-http \
  --allow-unauthenticated  # for quick testing only; require auth in production
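With the function deployed, the endpoint can be smoke-tested from the command line; the URL below is a placeholder for the one gcloud prints after deployment:
bash
curl -X POST https://REGION-PROJECT_ID.cloudfunctions.net/generate_upload_url \
  -H "Content-Type: application/json" \
  -d '{"filename": "photo.jpg", "contentType": "image/jpeg"}'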
4. Client Upload Flow
javascript
// Example using fetch in JavaScript
async function uploadFile(file) {
  // 1. Ask the API for a signed upload URL
  const res = await fetch('https://<cloud-function-url>', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ filename: file.name, contentType: file.type })
  });
  if (!res.ok) {
    throw new Error('Could not get a signed URL');
  }
  const { url, method } = await res.json();

  // 2. Upload the file directly to Cloud Storage
  const uploadRes = await fetch(url, {
    method,
    headers: { 'Content-Type': file.type },
    body: file
  });

  if (uploadRes.ok) {
    console.log('Upload successful!');
  } else {
    console.error('Upload failed.');
  }
}
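To wire this into a page, a hypothetical file input with id file-input could trigger the upload on selection:
javascript
// Hypothetical wiring: upload whenever the user picks a file
document.querySelector('#file-input')
  .addEventListener('change', (e) => uploadFile(e.target.files[0]));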
🔐 Additional Security Measures
Authentication: Use Firebase Auth, OAuth, or IAM-based service auth.
Virus scanning: Trigger another cloud function after upload to scan the file (e.g., with ClamAV on Cloud Functions).
File size limits: Enforce client-side and server-side limits.
Storage lifecycle rules: Auto-delete old files (see the example below).
Audit logs: Enable GCP/AWS audit logs to track uploads and access.
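For the lifecycle item, the JSON below is a minimal example (the 30-day window is an assumption to adjust per use case), applied with gsutil:
bash
# lifecycle.json: delete objects 30 days after creation
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {"action": {"type": "Delete"}, "condition": {"age": 30}}
  ]
}
EOF

gsutil lifecycle set lifecycle.json gs://[BUCKET_NAME]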
🔁 Optional: Proxy File Uploads via Cloud Function
Instead of signed URLs, clients can upload files directly to a Cloud Function (e.g., as multipart/form-data), which then writes them to storage. This is slower and less scalable, but useful when you need full control over every request; a sketch follows.
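A minimal proxy sketch (proxy_upload is our own name, not a library API), assuming the client POSTs multipart/form-data with a file field; note that HTTP-triggered functions cap request body size, so large files still belong on signed URLs:
python
import os

from google.cloud import storage


def proxy_upload(request):
    # Expect a multipart/form-data POST with a "file" field.
    uploaded = request.files.get('file')
    if uploaded is None:
        return ('Missing file', 400)

    # Sanitize the name, then stream the bytes into the private bucket.
    safe_name = os.path.basename(uploaded.filename)
    bucket = storage.Client().bucket('your-private-bucket')
    blob = bucket.blob(safe_name)
    blob.upload_from_file(uploaded.stream, content_type=uploaded.content_type)

    return ('Upload complete', 200)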