Cloud Endpoints

Cloud Endpoints enables you to deploy and manage your APIs on GCP, and it enables you to implement an API gateway.

Implement an API Gateway

An API gateway creates a layer of abstraction and insulates the clients from the partitioning of an application into microservices.

The API for your app can run on backends such as App Engine, GKE or Compute Engine.

An API gateway can also act as a facade for legacy applications that cannot be refactored and moved to the cloud. Each consumer can then invoke these modern APIs instead of invoking outdated APIs.

Using the Apigee platform

Deploying and managing an API on your own can be difficult.

There are a few issues to consider when deploying and managing APIs on your own:

  • What language and format will you use to describe the interface?
  • How will you authenticate services and users?
  • How will you scale your API to meet demand?
  • Does your infrastructure log details and provide metrics?

Cloud Endpoints provides the infrastructure support that you need to deploy and manage robust, secure, and scalable APIs.

Cloud Endpoints supports REST APIs and gRPC APIs

JSON/HTTP 1.1 REST APIs are popular and easy to use. To enable Cloud Endpoints for REST APIs, create the API configuration in a YAML file based on the OpenAPI specification.
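As a minimal sketch, such a YAML config might look like the following (the API title, path, and host value here are hypothetical; the host follows the service-name format described later in this section):

```yaml
# Hypothetical minimal OpenAPI 2.0 config for a Cloud Endpoints REST API.
swagger: "2.0"
info:
  title: Echo API          # assumed example API name
  version: "1.0.0"
host: echo-api.endpoints.my-project.cloud.goog   # assumed service name
schemes:
  - https
paths:
  /echo:
    post:
      operationId: echo    # Endpoints requires an operationId per method
      responses:
        "200":
          description: Echoed message
```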

gRPC is a newer, faster technology. You can generate client libraries for various programming languages. Your app can then make type-safe remote server calls as if they were local calls.
To enable Cloud Endpoints for gRPC APIs, create your service definition by using protocol buffers, and then create a service configuration by using the gRPC API specification.
Cloud Endpoints supports transcoding of HTTP/JSON calls into gRPC, so your clients can access your gRPC API while using plain HTTP/JSON calls.
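A gRPC service definition in protocol buffers, including an HTTP transcoding annotation, might look like this sketch (the service and message names are hypothetical):

```proto
syntax = "proto3";

import "google/api/annotations.proto";  // HTTP transcoding annotations

// Hypothetical echo service.
service EchoService {
  rpc Echo(EchoRequest) returns (EchoResponse) {
    // Lets plain HTTP/JSON clients call this method as POST /v1/echo.
    option (google.api.http) = {
      post: "/v1/echo"
      body: "*"
    };
  }
}

message EchoRequest {
  string message = 1;
}

message EchoResponse {
  string message = 1;
}
```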

Develop and deploy your API configuration and API backend

  1. Develop API backend
  2. Develop API configuration
  3. Deploy API configuration
  4. Deploy API backend

After you develop your REST API backend, create an API config file that describes your Cloud Endpoints API. Deploy the API config by using the gcloud service-management deploy command, which returns the service config ID and the service name.
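The deploy step might look like the following sketch (the config file name is hypothetical; newer gcloud releases expose the same operation as gcloud endpoints services deploy):

```shell
# Deploy the API config; on success the command prints the
# service name and the service config ID.
gcloud service-management deploy openapi.yaml
```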

Use the OpenAPI specification as the Interface Definition Language

Create an API config file that describes your Cloud Endpoints API. The API config file is a YAML file that describes the API by using the OpenAPI specification. The OpenAPI specification and its extensions enable you to describe the surface of the API and its security definitions. You can specify security definitions for user authentication and for service authentication.
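For example, a user-authentication security definition for Firebase in the OpenAPI config might look like this sketch (the project name and audience are hypothetical; x-google-issuer, x-google-jwks_uri, and x-google-audiences are Cloud Endpoints extensions):

```yaml
securityDefinitions:
  firebase:
    type: oauth2
    flow: implicit
    authorizationUrl: ""
    # Cloud Endpoints extensions describing the expected JWT issuer.
    x-google-issuer: "https://securetoken.google.com/my-project"   # assumed project
    x-google-jwks_uri: "https://www.googleapis.com/service_accounts/v1/metadata/x509/securetoken@system.gserviceaccount.com"
    x-google-audiences: "my-project"                               # assumed audience
security:
  - firebase: []
```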
Various tools are available to help you create and manage your Open API specifications.

The Service Management API, Service Control API, and Extensible Service Proxy form the core of Cloud Endpoints

When you deploy the API configuration, it's registered with the Service Management API and shared with the Extensible Service Proxy. Service Management uses the host value in your deployment config file to create a new Cloud Endpoints service with a name in the format [CUSTOM-API-NAME].endpoints.[PROJECT-ID].
This API service name is created if it doesn't already exist, and then Cloud Endpoints configures the service according to your OpenAPI config file.

Cloud Endpoints uses DNS-compatible names to uniquely identify services. Because projects in GCP are guaranteed to have a globally unique name, you can use your project name to create a unique API service name such as [CUSTOM-API-NAME].endpoints.[PROJECT-ID].

You can also map your own DNS name to your API.

The Extensible Service Proxy is an NGINX-based proxy that runs in front of the API backend and injects Cloud Endpoints functionality such as authentication, monitoring, and logging. At runtime, Cloud Endpoints can receive calls from any source. The calls are load balanced and routed to the Extensible Service Proxy, which works with the Service Control API to check each request against the API config and verify that the request can be passed through to the backend. If the request authenticates successfully, the Extensible Service Proxy passes it on to the backend API.

The Service Control API logs information about incoming requests. These logs and metrics can be viewed in the Cloud Console.

Enable user authentication and service-to-service authentication

With the Cloud Endpoints API, you can authenticate users who attempt to invoke your frontend API. Cloud Endpoints supports user authentication with Firebase, Auth0, Google Authentication, and other custom authentication methods.

After the user signs in, the authentication provider sends a signed JSON Web Token (JWT) to Cloud Endpoints. Cloud Endpoints checks that the JWT is signed by the provider that you specified in your API config.

For service-to-service authentication, create a service account and use a Google ID token to sign the request.

You can restrict API access

To restrict access to your Cloud Endpoints API, you can assign Cloud Identity and Access Management (Cloud IAM) roles to specific users. You can grant access to a specific API or to the entire Cloud project.
To give users the ability to enable your service in their own Cloud project and invoke its API, assign them the Service Consumer role. This is the most common use case. You can assign the Service Controller, Viewer, Editor, or Owner roles to give users greater permissions to view and manage service configs and projects.
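Granting the Service Consumer role can be sketched as follows (the member and service names are hypothetical):

```shell
# Hypothetical grant of the Service Consumer role on an Endpoints service.
gcloud endpoints services add-iam-policy-binding echo-api.endpoints.my-project.cloud.goog \
  --member="user:consumer@example.com" \
  --role="roles/servicemanagement.serviceConsumer"
```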

You can deploy multiple versions of your API in production

You might make changes to your API backend. These changes are usually backwards compatible. In such cases, it's good practice to increment the version attribute in the Cloud Endpoints API config and then redeploy the API config and the API backend.

If you make changes that aren't backwards compatible and will break functionality for consumers of your API, deploy two versions of your Cloud Endpoints API by creating a separate API config for each version. The gcloud service-management deploy command returns a different service config ID for each version. Update the backend config of each version with the corresponding service config ID. You can also delete versions of your API when they're no longer needed. Make sure to announce the deprecation and phase-out plans to the consumers of the API well in advance.
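For a breaking change, each version gets its own API config. The distinguishing fields of a v2 config might look like this sketch (the title, version, and path-based versioning scheme are assumptions):

```yaml
# openapi-v2.yaml – separate config deployed for the breaking v2 API.
info:
  title: Echo API
  version: "2.0.0"   # bumped major version for the breaking change
basePath: /v2        # assumed path-based versioning scheme
```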

You can deploy APIs in multiple environments

When developing and deploying APIs, you will typically have separate environments: one for development, one for staging limited releases such as private alphas, and one for production. You can create a separate project for each environment and deploy a Cloud Endpoints API and backend with separate endpoints for your consumers.

For example, you might run prod and alpha environments, each in its own project with its own endpoint.

Error metrics

In the Cloud Endpoints dashboard, you can see metrics for 4xx and 5xx request errors and for latency.