Setting Up Google Container Registry (GCR) and Artifact Registry APIs and Permissions

In this guide, we’ll walk through enabling the Google Container Registry and Artifact Registry APIs and granting a service account the permissions it needs to push Docker images to both registries. We’ll provide examples using the Google Cloud CLI, Terraform, Pulumi, and the Google Cloud Console.

Step 1: Enable Google Container Registry and Artifact Registry APIs

Using Google Cloud Console

  1. Navigate to the Google Cloud Console.
  2. Select your project from the dropdown menu.
  3. Go to “Navigation Menu” > “APIs & Services” > “Library”.
  4. Search for “Google Container Registry API”.
  5. Click on “Enable”.
  6. Repeat steps 3-5 for “Artifact Registry API”.

Using Google Cloud CLI

gcloud services enable containerregistry.googleapis.com
gcloud services enable artifactregistry.googleapis.com
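
These commands assume the active gcloud project is already set (for example with gcloud config set project your-project-id). To confirm both APIs ended up enabled, you can filter the list of enabled services:

# List enabled services and keep only the two registry APIs
gcloud services list --enabled | grep -E "containerregistry|artifactregistry"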

Using Terraform

provider "google" {
  credentials = file("path/to/credentials.json")
  project     = "your-project-id"
  region      = "your-region"
}

resource "google_project_service" "container_registry" {
  project = "your-project-id"
  service = "containerregistry.googleapis.com"

  disable_on_destroy = false
}

resource "google_project_service" "artifact_registry" {
  project = "your-project-id"
  service = "artifactregistry.googleapis.com"

  disable_on_destroy = false
}
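
After saving the configuration, applying it follows the standard Terraform workflow (assuming the credentials file above has permission to enable services):

terraform init    # download the google provider
terraform plan    # review the two services that will be enabled
terraform apply   # enable the APIs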

Using Pulumi

import * as pulumi from "@pulumi/pulumi";
import * as gcp from "@pulumi/gcp";

const projectName = "your-project-id";

// Enable the Google Container Registry API
const containerRegistry = new gcp.projects.Service("container-registry-api", {
    project: projectName,
    service: "containerregistry.googleapis.com",
    // Keep the API enabled even if this resource is removed, matching the Terraform example
    disableOnDestroy: false,
});

// Enable the Artifact Registry API
const artifactRegistry = new gcp.projects.Service("artifact-registry-api", {
    project: projectName,
    service: "artifactregistry.googleapis.com",
    disableOnDestroy: false,
});

export const containerRegistryServiceName = containerRegistry.service;
export const artifactRegistryServiceName = artifactRegistry.service;
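
With a Pulumi stack already configured for this project, previewing and applying the changes follows the usual workflow:

pulumi preview   # show which services would be enabled
pulumi up        # enable both APIs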

Step 2: Grant Permissions to the Service Account

Using Google Cloud Console

  1. Navigate to the Google Cloud Console.
  2. Select your project from the dropdown menu.
  3. Go to “Navigation Menu” > “IAM & Admin” > “IAM”.
  4. Click on “Add”.
  5. In the “New members” field, enter the service account email (for example, your-service-account@your-project-id.iam.gserviceaccount.com).
  6. Select “Storage” > “Storage Admin” from the “Role” dropdown.
  7. Click on “Save”.

Using Google Cloud CLI

PROJECT_ID="your-project-id"
SERVICE_ACCOUNT_EMAIL="your-service-account@your-project-id.iam.gserviceaccount.com"

gcloud projects add-iam-policy-binding $PROJECT_ID \
  --member="serviceAccount:$SERVICE_ACCOUNT_EMAIL" \
  --role="roles/storage.admin"

Using Terraform

resource "google_project_iam_member" "registry_writer" {
  project = "your-project-id"
  role    = "roles/storage.admin"
  member  = "serviceAccount:your-service-account@your-project-id.iam.gserviceaccount.com"
}
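
The same caveat applies here: roles/storage.admin alone does not grant pushes to Artifact Registry. If you need that as well, a second google_project_iam_member resource along these lines should cover it:

resource "google_project_iam_member" "artifact_registry_writer" {
  project = "your-project-id"
  role    = "roles/artifactregistry.writer"
  member  = "serviceAccount:your-service-account@your-project-id.iam.gserviceaccount.com"
}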

Using Pulumi

import * as pulumi from "@pulumi/pulumi";
import * as gcp from "@pulumi/gcp";

const projectName = "your-project-id";
const serviceAccountEmail = "your-service-account@your-project-id.iam.gserviceaccount.com";

// Grant the Storage Admin role so the service account can push images to Container Registry (which stores layers in Cloud Storage)
const storageAdminRoleBinding = new gcp.projects.IAMBinding("storage-admin", {
    project: projectName,
    role: "roles/storage.admin",
    members: [
        `serviceAccount:${serviceAccountEmail}`,
    ],
});

export const storageAdminMembers = storageAdminRoleBinding.members;
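
Note that gcp.projects.IAMBinding is authoritative for the role it manages: it replaces any other members currently bound to roles/storage.admin in the project. A non-authoritative alternative using gcp.projects.IAMMember, shown here granting the Artifact Registry Writer role needed to push to Artifact Registry, could look like this:

// Additive grant: adds this member to roles/artifactregistry.writer without
// touching other members already bound to the role
const artifactRegistryWriter = new gcp.projects.IAMMember("artifact-registry-writer", {
    project: projectName,
    role: "roles/artifactregistry.writer",
    member: `serviceAccount:${serviceAccountEmail}`,
});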

Feel free to replace placeholders such as your-project-id, your-region, your-service-account@your-project-id.iam.gserviceaccount.com, and path/to/credentials.json with your actual values. These examples should give you the configuration needed for a GitHub Actions pipeline to build and push Docker images to both Google Container Registry and Artifact Registry.
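
One detail the examples above do not cover is Docker authentication itself. Assuming your pipeline authenticates gcloud as the service account (for example with a downloaded JSON key), the standard gcloud credential helper setup looks roughly like this; the Artifact Registry hostname depends on the region of your repository:

# Authenticate gcloud as the service account (key-file auth shown as an example)
gcloud auth activate-service-account --key-file=path/to/credentials.json

# Register gcloud as a Docker credential helper for both registries
gcloud auth configure-docker gcr.io
gcloud auth configure-docker your-region-docker.pkg.dev

With the APIs enabled, the roles granted, and Docker configured, the pipeline should be able to push to both registries.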