How to Authenticate and Access Google Cloud Storage Buckets via Service Account Using gcloud CLI

When working with Google Cloud Storage (GCS) from command-line interfaces, many developers encounter authentication hurdles when using service accounts. The standard approach of using gcloud auth activate-service-account sometimes fails with cryptic OpenSSL errors like PEM_read_bio: no start line, leaving developers scratching their heads.

Google Cloud offers two primary ways to configure service account access:

# Modern method (sometimes problematic)
gcloud auth activate-service-account SERVICE_ACCOUNT_EMAIL --key-file=KEY_FILE.json

# Legacy method (often more reliable)
gsutil config -e -o ~/.boto

The legacy gsutil config approach tends to be more reliable for service account authentication because it writes the key file path directly into a Boto configuration file (~/.boto) that gsutil reads, rather than going through gcloud's shared credential store.
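
Once the legacy route has run, the generated ~/.boto file links gsutil directly to the key. Here's a minimal sketch of checking that linkage; it builds a sample config in a temp file for illustration (in practice, inspect your real ~/.boto), and assumes the `gs_service_key_file` option name that current gsutil versions write for JSON keys:

```shell
# Sketch: confirm a Boto-style config references a service-account key.
# A sample file is fabricated here; inspect your real ~/.boto instead.
BOTO_FILE=$(mktemp)
cat > "$BOTO_FILE" <<'EOF'
[Credentials]
gs_service_key_file = /path/to/service-account-key.json
EOF

# gsutil reads the key path from the [Credentials] section.
KEY_PATH=$(sed -n 's/^gs_service_key_file *= *//p' "$BOTO_FILE")
echo "configured key: $KEY_PATH"
rm -f "$BOTO_FILE"
```

If the `gs_service_key_file` line is missing or points at a moved file, gsutil silently falls back to other credential sources, which is a common cause of confusing permission errors.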

Here's the complete workflow that actually works:

# First, clean up any existing configuration (careful: this removes ALL
# local gcloud state, including credentials for other accounts)
rm -rf ~/.config/gcloud

# Then authenticate using the service account
gcloud auth activate-service-account SERVICE_ACCOUNT@PROJECT.iam.gserviceaccount.com \
    --key-file=/path/to/service-account-key.json

# Alternatively, use the legacy method
gsutil config -e -o ~/.boto

Your IAM setup should include these minimum permissions:

# For the service account
roles/storage.objectAdmin  # For read/write operations
roles/storage.legacyBucketOwner  # If using legacy ACLs

# Bucket-level permissions via:
gsutil iam ch serviceAccount:SERVICE_ACCOUNT@PROJECT.iam.gserviceaccount.com:roles/storage.admin gs://BUCKET_NAME

If you still encounter problems:

# Enable debug logging
gsutil -D cp file.txt gs://your-bucket/

# Verify active account
gcloud auth list

# Check service account permissions
gcloud projects get-iam-policy PROJECT_ID \
    --flatten="bindings[].members" \
    --format='table(bindings.role)' \
    --filter="bindings.members:SERVICE_ACCOUNT@PROJECT.iam.gserviceaccount.com"

Here's a robust bash script for automated uploads:

#!/bin/bash

SERVICE_ACCOUNT="your-service-account@project.iam.gserviceaccount.com"
KEY_FILE="/path/to/keyfile.json"
BUCKET="gs://your-bucket"

# Authenticate
if ! gcloud auth activate-service-account "$SERVICE_ACCOUNT" --key-file="$KEY_FILE"; then
    echo "Failed to authenticate service account"
    exit 1
fi

# Verify access
if ! gsutil ls "$BUCKET" > /dev/null 2>&1; then
    echo "Cannot access bucket. Check permissions."
    exit 1
fi

# Upload files
for file in /path/to/files/*; do
    gsutil cp "$file" "$BUCKET/" || echo "Failed to upload $file"
done

The error message [('PEM routines', 'PEM_read_bio', 'no start line')] typically indicates an authentication certificate parsing issue. While the deprecated gsutil config -e method works, here's the modern approach using gcloud auth:

# Authenticate the service account
gcloud auth activate-service-account SERVICE_ACCOUNT_EMAIL \
    --key-file=/path/to/key.json \
    --project=PROJECT_ID

# Verify active credentials
gcloud auth list

# Set default project (optional but recommended)
gcloud config set project PROJECT_ID

Modern gcloud versions (v300+) use a different credential storage mechanism. To ensure gsutil picks up the credentials:

# Generate or update .boto configuration
gsutil config -e -f

# Alternative method: explicitly specify credentials
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/key.json"

Before attempting transfers, verify your service account has:

  1. Storage Object Creator role at minimum
  2. Proper bucket-level IAM permissions

Test with:

gsutil iam get gs://mybucket
gsutil ls gs://mybucket

Putting it all together, here's an end-to-end test script:

#!/bin/bash

# Set environment variables
export PROJECT_ID="your-project"
export BUCKET_NAME="mybucket"
export SERVICE_ACCOUNT="service-account@project.iam.gserviceaccount.com"
export KEY_PATH="/path/to/key.json"

# Authenticate and configure
gcloud auth activate-service-account "$SERVICE_ACCOUNT" \
    --key-file="$KEY_PATH" \
    --project="$PROJECT_ID"

# Verify access
gsutil ls gs://$BUCKET_NAME

# Upload test file
echo "test content" > testfile.txt
gsutil cp testfile.txt gs://$BUCKET_NAME/testfile.txt

# Clean up
gsutil rm gs://$BUCKET_NAME/testfile.txt
rm testfile.txt

If issues persist:

  • Run gsutil -D cp ... for debug output
  • Check ~/.config/gcloud/credentials.db for corruption
  • Verify the key file hasn't been modified (check BEGIN/END PRIVATE KEY markers)
  • Test with curl -H "Authorization: Bearer $(gcloud auth print-access-token)" https://storage.googleapis.com/storage/v1/b/mybucket
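
The PEM-marker check from the last bullet can be scripted. A minimal sketch follows; it fabricates a demo key file so it can run standalone, and in practice you would point KEY_FILE at your real downloaded key instead:

```shell
# Sketch: detect the malformed-key condition behind "PEM_read_bio: no start line".
# A demo key file is fabricated here; point KEY_FILE at your real key instead.
KEY_FILE=$(mktemp)
printf '%s\n' \
  '{"type": "service_account",' \
  ' "client_email": "demo@project.iam.gserviceaccount.com",' \
  ' "private_key": "-----BEGIN PRIVATE KEY-----\nMIIE...\n-----END PRIVATE KEY-----\n"}' \
  > "$KEY_FILE"

# A key file that triggers the OpenSSL error is usually missing these markers,
# typically because the file was truncated or re-encoded during transfer.
if grep -q "BEGIN PRIVATE KEY" "$KEY_FILE"; then
    KEY_STATUS="ok"
    echo "key file contains PEM markers"
else
    KEY_STATUS="bad"
    echo "key file is missing PEM markers -- re-download it" >&2
fi
rm -f "$KEY_FILE"
```

If the markers are absent, re-download the key from the Cloud Console rather than trying to repair the file by hand.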