    Documentation › Integrations · Feb 9, 2025

    BigQuery Service Account Setup

    Step-by-step guide to creating and configuring a Google Cloud service account for secure BigQuery access with Cogny, including IAM roles and best practices.

    Overview

    Cogny connects to your BigQuery data warehouse using a Google Cloud service account with read-only access. This guide walks through creating and configuring a service account with the minimum required permissions.

    Time to complete: 10-15 minutes

    Prerequisites:

    • Google Cloud Project with BigQuery enabled
    • Project Owner or Project IAM Admin role
    • GA4 BigQuery export configured (for GA4 analysis)

    Why Service Accounts?

    Service accounts provide secure, auditable access to BigQuery data:

    • Read-only access: Cogny never modifies your data
    • Fine-grained permissions: Grant only necessary access
    • Audit trail: All queries logged in Cloud Logging
    • Key rotation: Easily rotate credentials if needed
    • No user credentials: No personal OAuth tokens required

    Step 1: Create Service Account

    Via Google Cloud Console

    • Navigate to IAM & Admin > Service Accounts
    • Click Create Service Account
    • Fill in the details:

    - Name: cogny-analytics (or your preferred name)
    - ID: cogny-analytics@your-project.iam.gserviceaccount.com
    - Description: Service account for Cogny growth analytics platform

    • Click Create and Continue

    Via gcloud CLI

    # Set your project ID
    export PROJECT_ID="your-project-id"

    # Create service account
    gcloud iam service-accounts create cogny-analytics \
    --project=$PROJECT_ID \
    --description="Service account for Cogny growth analytics platform" \
    --display-name="Cogny Analytics"

    # Verify creation
    gcloud iam service-accounts list --project=$PROJECT_ID

    Step 2: Grant IAM Roles

    Required Roles

    Grant these roles for read-only BigQuery access:

    Role                             Purpose
    roles/bigquery.dataViewer        Read table data
    roles/bigquery.jobUser           Execute queries
    roles/bigquery.metadataViewer    View table schemas

    Via Google Cloud Console

    • Navigate to IAM & Admin > IAM
    • Click Grant Access
    • Enter the service account email as the principal
    • Assign roles:

    - BigQuery Data Viewer
    - BigQuery Job User
    - BigQuery Metadata Viewer

    • Click Save

    Via gcloud CLI

    # Set service account email
    export SA_EMAIL="cogny-analytics@${PROJECT_ID}.iam.gserviceaccount.com"

    # Grant BigQuery roles
    gcloud projects add-iam-policy-binding $PROJECT_ID \
    --member="serviceAccount:${SA_EMAIL}" \
    --role="roles/bigquery.dataViewer"

    gcloud projects add-iam-policy-binding $PROJECT_ID \
    --member="serviceAccount:${SA_EMAIL}" \
    --role="roles/bigquery.jobUser"

    gcloud projects add-iam-policy-binding $PROJECT_ID \
    --member="serviceAccount:${SA_EMAIL}" \
    --role="roles/bigquery.metadataViewer"

    # Verify role bindings
    gcloud projects get-iam-policy $PROJECT_ID \
    --flatten="bindings[].members" \
    --filter="bindings.members:serviceAccount:${SA_EMAIL}"
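    The verification output can also be checked programmatically. A minimal sketch, assuming JSON output from `gcloud projects get-iam-policy --format=json` (the sample policy and the `missing_roles` helper below are illustrative):

```python
# Sketch: verify the service account holds all required BigQuery roles,
# given the JSON output of `gcloud projects get-iam-policy --format=json`.
# The sample policy below is illustrative, not real project data.
import json

REQUIRED_ROLES = {
    "roles/bigquery.dataViewer",
    "roles/bigquery.jobUser",
    "roles/bigquery.metadataViewer",
}

def missing_roles(policy: dict, sa_email: str) -> set:
    """Return the required roles the service account does not yet hold."""
    member = f"serviceAccount:{sa_email}"
    granted = {
        b["role"]
        for b in policy.get("bindings", [])
        if member in b.get("members", [])
    }
    return REQUIRED_ROLES - granted

policy = json.loads("""{
  "bindings": [
    {"role": "roles/bigquery.dataViewer",
     "members": ["serviceAccount:cogny-analytics@your-project.iam.gserviceaccount.com"]},
    {"role": "roles/bigquery.jobUser",
     "members": ["serviceAccount:cogny-analytics@your-project.iam.gserviceaccount.com"]}
  ]
}""")

# A non-empty result means roles still need to be granted.
print(missing_roles(policy, "cogny-analytics@your-project.iam.gserviceaccount.com"))
```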

    Step 3: Create and Download Key

    Via Google Cloud Console

    • Navigate to your service account
    • Go to Keys tab
    • Click Add Key > Create new key
    • Select JSON format
    • Click Create
    • Save the JSON file securely (it contains credentials)

    Via gcloud CLI

    # Create and download key
    gcloud iam service-accounts keys create cogny-credentials.json \
    --iam-account=${SA_EMAIL} \
    --project=$PROJECT_ID

    # Verify key creation
    ls -lh cogny-credentials.json

    The JSON file contains:

    {
    "type": "service_account",
    "project_id": "your-project-id",
    "private_key_id": "abc123...",
    "private_key": "-----BEGIN PRIVATE KEY-----\n...",
    "client_email": "cogny-analytics@your-project.iam.gserviceaccount.com",
    "client_id": "123456789",
    "auth_uri": "https://accounts.google.com/o/oauth2/auth",
    "token_uri": "https://oauth2.googleapis.com/token",
    "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
    "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/..."
    }
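    Before uploading the key to Cogny, it can help to sanity-check the file. A minimal sketch using only the field names shown above (the `validate_key` helper is illustrative):

```python
# Sketch: sanity-check a downloaded service account key file before use.
# Field names match the JSON structure shown above.
import json

REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email", "token_uri"}

def validate_key(raw: str) -> list:
    """Return a list of problems found in the key JSON (empty = looks valid)."""
    key = json.loads(raw)
    problems = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - key.keys())]
    if key.get("type") != "service_account":
        problems.append("type must be 'service_account'")
    return problems

# Deliberately incomplete sample to show the error reporting
sample = json.dumps({
    "type": "service_account",
    "project_id": "your-project-id",
    "client_email": "cogny-analytics@your-project.iam.gserviceaccount.com",
})
print(validate_key(sample))
```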

    Step 4: Configure Dataset Permissions (Optional)

    For additional security, restrict access to specific datasets:

    Via Google Cloud Console

    • Navigate to BigQuery
    • Select your dataset (e.g., analytics_123456789)
    • Click Sharing > Permissions
    • Click Add Principal
    • Enter service account email
    • Select roles:

    - BigQuery Data Viewer
    - BigQuery Metadata Viewer

    • Click Save

    Via bq CLI

    Note: bq add-iam-policy-binding applies to tables and views only. For datasets, export the access policy, edit it, and apply the update:

    # Set dataset ID
    export DATASET_ID="analytics_123456789"

    # Export the dataset's current access policy
    bq show --format=prettyjson ${PROJECT_ID}:${DATASET_ID} > dataset.json

    # Edit dataset.json and add an entry to the "access" array:
    #   {"role": "READER", "userByEmail": "cogny-analytics@your-project.iam.gserviceaccount.com"}

    # Apply the updated access policy
    bq update --source dataset.json ${PROJECT_ID}:${DATASET_ID}

    # Verify dataset permissions
    bq show --format=prettyjson ${PROJECT_ID}:${DATASET_ID}
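    Dataset access can also be managed by exporting the dataset's access policy with `bq show`, adding an entry, and applying it with `bq update`. A minimal Python sketch of that edit, assuming the standard `access` entry shape (the `grant_reader` helper is illustrative):

```python
# Sketch: insert a READER entry for the service account into a dataset's
# "access" list if it is not already present (idempotent).
import json

def grant_reader(dataset: dict, sa_email: str) -> dict:
    """Add a READER access entry for sa_email, skipping duplicates."""
    entry = {"role": "READER", "userByEmail": sa_email}
    access = dataset.setdefault("access", [])
    if entry not in access:
        access.append(entry)
    return dataset

# Illustrative dataset metadata, as exported by `bq show --format=prettyjson`
dataset = {"access": [{"role": "OWNER", "userByEmail": "admin@example.com"}]}
grant_reader(dataset, "cogny-analytics@your-project.iam.gserviceaccount.com")
print(json.dumps(dataset, indent=2))
```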

    Step 5: Connect to Cogny

    Via Cogny Dashboard

    • Log in to app.cogny.com
    • Navigate to Settings > Data Warehouses
    • Click Connect BigQuery
    • Enter connection details:

    - Project ID: Your GCP project ID
    - Dataset ID: Your GA4 dataset (e.g., analytics_123456789)
    - Service Account Key: Upload the JSON file

    • Click Test Connection
    • If successful, click Save

    Via API

    curl -X POST https://api.cogny.com/v1/warehouses \
    -H "Authorization: Bearer sk_live_abc123xyz789" \
    -H "Content-Type: application/json" \
    -d '{
    "type": "bigquery",
    "name": "Production Analytics",
    "config": {
    "project_id": "your-project-id",
    "dataset_id": "analytics_123456789",
    "credentials": {
    "type": "service_account",
    "project_id": "your-project-id",
    "private_key_id": "abc123...",
    "private_key": "-----BEGIN PRIVATE KEY-----\n...",
    "client_email": "cogny-analytics@your-project.iam.gserviceaccount.com"
    }
    }
    }'
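    The same payload can be assembled in Python before sending it with any HTTP client. The endpoint and field names mirror the curl example above; `build_warehouse_payload` is an illustrative helper:

```python
# Sketch: build the Cogny warehouse-creation payload from a key file's
# contents. Field names mirror the curl example in this guide.
import json

def build_warehouse_payload(key_json: str, dataset_id: str, name: str) -> dict:
    """Assemble the request body from the service account key JSON."""
    key = json.loads(key_json)
    return {
        "type": "bigquery",
        "name": name,
        "config": {
            "project_id": key["project_id"],
            "dataset_id": dataset_id,
            "credentials": {
                "type": key["type"],
                "project_id": key["project_id"],
                "private_key_id": key["private_key_id"],
                "private_key": key["private_key"],
                "client_email": key["client_email"],
            },
        },
    }

# Illustrative key contents (placeholders, not real credentials)
key_json = json.dumps({
    "type": "service_account",
    "project_id": "your-project-id",
    "private_key_id": "abc123",
    "private_key": "-----BEGIN PRIVATE KEY-----\n...",
    "client_email": "cogny-analytics@your-project.iam.gserviceaccount.com",
})
payload = build_warehouse_payload(key_json, "analytics_123456789", "Production Analytics")
print(payload["config"]["project_id"])
```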

    Connection Test

    Cogny runs these validation checks:

    • Authentication: Verify service account credentials
    • Project Access: Confirm project exists and is accessible
    • Dataset Access: Verify dataset read permissions
    • Query Execution: Run sample query to test permissions

    -- Test query executed by Cogny
    SELECT
    COUNT(*) as row_count,
    COUNT(DISTINCT user_pseudo_id) as unique_users
    FROM `your-project.analytics_123456789.events_*`
    WHERE _TABLE_SUFFIX = FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))
    LIMIT 1
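    The `_TABLE_SUFFIX` filter above targets yesterday's daily GA4 export table. The same suffix can be computed client-side (the `yesterday_suffix` helper is illustrative):

```python
# Sketch: compute yesterday's GA4 export table suffix, mirroring
# FORMAT_DATE('%Y%m%d', <yesterday>) in the SQL above.
from datetime import date, timedelta

def yesterday_suffix(today: date) -> str:
    """Return the events_* table suffix for the day before `today`."""
    return (today - timedelta(days=1)).strftime("%Y%m%d")

print(yesterday_suffix(date(2025, 2, 9)))  # → 20250208
```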

    Security Best Practices

    1. Use Least Privilege

    Grant only necessary permissions:

    # ✅ Good: Project-level read-only access
    gcloud projects add-iam-policy-binding $PROJECT_ID \
    --member="serviceAccount:${SA_EMAIL}" \
    --role="roles/bigquery.dataViewer"

    # ❌ Bad: Project editor access
    gcloud projects add-iam-policy-binding $PROJECT_ID \
    --member="serviceAccount:${SA_EMAIL}" \
    --role="roles/editor"

    2. Rotate Keys Regularly

    Rotate service account keys every 90 days:

    # Create new key
    gcloud iam service-accounts keys create cogny-credentials-new.json \
    --iam-account=${SA_EMAIL}

    # Update Cogny with new credentials
    # Then delete old key

    # List keys
    gcloud iam service-accounts keys list \
    --iam-account=${SA_EMAIL}

    # Delete old key
    gcloud iam service-accounts keys delete KEY_ID \
    --iam-account=${SA_EMAIL}
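    To find keys due for rotation, the JSON output of `gcloud iam service-accounts keys list --format=json` can be filtered by age. A sketch with illustrative sample data (the `stale_keys` helper is not part of any CLI):

```python
# Sketch: flag keys older than 90 days, given `keys list --format=json`
# output. The sample key list below is illustrative.
import json
from datetime import datetime, timedelta, timezone

def stale_keys(keys: list, now: datetime, max_age_days: int = 90) -> list:
    """Return names of keys whose validAfterTime is older than max_age_days."""
    cutoff = now - timedelta(days=max_age_days)
    stale = []
    for k in keys:
        created = datetime.fromisoformat(k["validAfterTime"].replace("Z", "+00:00"))
        if created < cutoff:
            stale.append(k["name"])
    return stale

keys = json.loads("""[
  {"name": "keys/old", "validAfterTime": "2024-10-01T00:00:00Z"},
  {"name": "keys/new", "validAfterTime": "2025-01-20T00:00:00Z"}
]""")
print(stale_keys(keys, datetime(2025, 2, 9, tzinfo=timezone.utc)))
```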

    3. Monitor Service Account Activity

    Enable Cloud Audit Logs:

    # View recent BigQuery queries from service account
    gcloud logging read "protoPayload.authenticationInfo.principalEmail=${SA_EMAIL}" \
    --limit 50 \
    --format json

    4. Store Credentials Securely

    Never commit credentials to version control:

    # Add to .gitignore
    echo "cogny-credentials.json" >> .gitignore
    echo "*-credentials.json" >> .gitignore

    # Use environment variables or secret managers
    export COGNY_CREDENTIALS_PATH="/secure/path/cogny-credentials.json"
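    Application code can then resolve the key path from the environment instead of hardcoding it. A small sketch; `COGNY_CREDENTIALS_PATH` matches the variable exported above, and the loader function is illustrative:

```python
# Sketch: resolve the credentials file path from the environment so the
# key location is never hardcoded or committed.
import os

def credentials_path(env: dict = None) -> str:
    """Return the configured key path, or raise with a clear message."""
    env = env if env is not None else os.environ
    path = env.get("COGNY_CREDENTIALS_PATH")
    if not path:
        raise RuntimeError("COGNY_CREDENTIALS_PATH is not set")
    return path

print(credentials_path({"COGNY_CREDENTIALS_PATH": "/secure/path/cogny-credentials.json"}))
```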

    5. Restrict Network Access (Optional)

    For enterprise deployments, use VPC Service Controls:

    # Create access policy (requires an organization)
    gcloud access-context-manager policies create \
    --organization=ORGANIZATION_ID \
    --title="Cogny BigQuery Access"

    # Create service perimeter
    gcloud access-context-manager perimeters create cogny-perimeter \
    --title="Cogny Perimeter" \
    --policy=POLICY_ID \
    --resources=projects/PROJECT_NUMBER \
    --restricted-services=bigquery.googleapis.com

    Troubleshooting

    Error: "Permission Denied"

    Cause: Service account lacks required roles

    Solution:

    # Verify current roles
    gcloud projects get-iam-policy $PROJECT_ID \
    --flatten="bindings[].members" \
    --filter="bindings.members:serviceAccount:${SA_EMAIL}"

    # Re-grant required roles
    gcloud projects add-iam-policy-binding $PROJECT_ID \
    --member="serviceAccount:${SA_EMAIL}" \
    --role="roles/bigquery.dataViewer"

    Error: "Dataset Not Found"

    Cause: Service account cannot access dataset

    Solution:

    # Check dataset permissions
    bq show --format=prettyjson ${PROJECT_ID}:${DATASET_ID}

    # Grant dataset access: export the access policy, add a READER entry
    # for the service account to the "access" array, then apply it
    bq show --format=prettyjson ${PROJECT_ID}:${DATASET_ID} > dataset.json
    # (edit dataset.json, then:)
    bq update --source dataset.json ${PROJECT_ID}:${DATASET_ID}

    Error: "Invalid Credentials"

    Cause: Service account key is invalid or expired

    Solution:

    # Create new key
    gcloud iam service-accounts keys create cogny-credentials-new.json \
    --iam-account=${SA_EMAIL}

    # Update credentials in Cogny

    Error: "Query Execution Failed"

    Cause: Service account lacks bigquery.jobUser role

    Solution:

    # Grant job execution permission
    gcloud projects add-iam-policy-binding $PROJECT_ID \
    --member="serviceAccount:${SA_EMAIL}" \
    --role="roles/bigquery.jobUser"

    Advanced Configuration

    Multi-Project Access

    Grant access to multiple projects:

    # Grant access to additional projects
    for project in project-1 project-2 project-3; do
    gcloud projects add-iam-policy-binding $project \
    --member="serviceAccount:${SA_EMAIL}" \
    --role="roles/bigquery.dataViewer"

    gcloud projects add-iam-policy-binding $project \
    --member="serviceAccount:${SA_EMAIL}" \
    --role="roles/bigquery.jobUser"
    done
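    The loop above performs one binding per project-role pair. A quick sketch for enumerating the planned grants before running gcloud (the helper name is illustrative):

```python
# Sketch: enumerate the (project, role) grants the shell loop above
# performs, useful for reviewing planned bindings in advance.
def planned_bindings(projects: list, roles: list) -> list:
    """Return every project-role pair the loop would bind."""
    return [(p, r) for p in projects for r in roles]

grants = planned_bindings(
    ["project-1", "project-2", "project-3"],
    ["roles/bigquery.dataViewer", "roles/bigquery.jobUser"],
)
print(len(grants))  # → 6
```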

    Cross-Project Queries

    Enable cross-project queries:

    # Grant data viewer on source project
    gcloud projects add-iam-policy-binding source-project \
    --member="serviceAccount:${SA_EMAIL}" \
    --role="roles/bigquery.dataViewer"

    # Grant job user on destination project
    gcloud projects add-iam-policy-binding destination-project \
    --member="serviceAccount:${SA_EMAIL}" \
    --role="roles/bigquery.jobUser"

    Table-Level Access

    Restrict to specific tables:

    # Grant access to a specific table (bindings apply to one table at a
    # time; wildcard tables like events_* require dataset-level access)
    bq add-iam-policy-binding \
    --member="serviceAccount:${SA_EMAIL}" \
    --role="roles/bigquery.dataViewer" \
    ${PROJECT_ID}:${DATASET_ID}.events_20250209

    Cost Optimization

    Query Cost Monitoring

    Track query costs for the service account:

    # View recent query costs for the service account
    # (adjust the region qualifier to match your dataset location)
    bq query --use_legacy_sql=false \
    "SELECT job_id, creation_time, total_bytes_billed
    FROM \`region-us\`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
    WHERE user_email = '${SA_EMAIL}'
    ORDER BY creation_time DESC
    LIMIT 50"

    Set Query Cost Limits

    Configure BigQuery quota:

    # Cap daily query usage (the QueryUsagePerDay quota is measured in MiB;
    # e.g. 10485760 MiB ≈ 10 TiB)
    gcloud alpha services quota update \
    --service=bigquery.googleapis.com \
    --consumer=projects/$PROJECT_ID \
    --metric=bigquery.googleapis.com/quota/query/usage \
    --unit=1/d/{project} \
    --value=10485760

    Enable Query Cache

    Cogny automatically uses BigQuery query cache to reduce costs:

    -- Cached queries don't incur processing costs
    SELECT *
    FROM `project.dataset.table`
    WHERE DATE(TIMESTAMP_MICROS(event_timestamp)) = '2025-02-09'
    -- Subsequent identical queries use cache

    Terraform Configuration

    Automate service account creation with Terraform:

    # Service account
    resource "google_service_account" "cogny" {
    account_id = "cogny-analytics"
    display_name = "Cogny Analytics"
    description = "Service account for Cogny growth analytics platform"
    project = var.project_id
    }

    # IAM roles
    resource "google_project_iam_member" "cogny_data_viewer" {
    project = var.project_id
    role = "roles/bigquery.dataViewer"
    member = "serviceAccount:${google_service_account.cogny.email}"
    }

    resource "google_project_iam_member" "cogny_job_user" {
    project = var.project_id
    role = "roles/bigquery.jobUser"
    member = "serviceAccount:${google_service_account.cogny.email}"
    }

    resource "google_project_iam_member" "cogny_metadata_viewer" {
    project = var.project_id
    role = "roles/bigquery.metadataViewer"
    member = "serviceAccount:${google_service_account.cogny.email}"
    }

    # Service account key
    resource "google_service_account_key" "cogny" {
    service_account_id = google_service_account.cogny.name
    }

    # Output credentials (store securely)
    output "service_account_key" {
    value = base64decode(google_service_account_key.cogny.private_key)
    sensitive = true
    }
