Direct Library Usage
We recommend that you use the google-cloud-logging library by integrating it with the Python standard library logging module; however, you can also use the library to interact with the Google Cloud Logging API directly.
In addition to writing logs, you can use the library to manage logs, sinks, metrics, and other resources.
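For comparison, here is a minimal sketch of the recommended standard-library integration mentioned above; it assumes the default handler chosen by client.setup_logging() is appropriate for your environment:

import logging

import google.cloud.logging

client = google.cloud.logging.Client()
# attach a Cloud Logging handler to the Python root logger
client.setup_logging()

# standard library logging calls are now also sent to Cloud Logging
logging.warning("written via the standard library")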
Setup
Create a Client
You must set up a Client to use the library:
import google.cloud.logging

# if project not given, it will be inferred from the environment
client = google.cloud.logging.Client(project="my-project")
To use HTTP, disable gRPC when you set up the Client:
http_client = google.cloud.logging.Client(_use_grpc=False)
Create a Logger
Loggers read, write, and delete logs from Google Cloud.
You use your Client to create a Logger.
client = google.cloud.logging.Client(project="my-project")

# logger will bind to logName "projects/my-project/logs/log_id"
logger = client.logger(name="log_id")
To add custom labels, pass them when you initialize a Logger; the labels are then added to each LogEntry written by that Logger:
custom_labels = {"my-key": "my-value"}
label_logger = client.logger(log_id, labels=custom_labels)
By default, the library adds a Monitored Resource field associated with the environment the code is run on. For example, code run on App Engine will have a gae_app resource, while code run locally will have a global resource field.
To manually set the resource field, do so when you initialize the Logger:
from google.cloud.logging_v2.resource import Resource

resource = Resource(type="global", labels={})
global_logger = client.logger(log_id, resource=resource)
Write Log Entries
You write logs by using Logger.log:
logger.log("A simple entry") # API call
You can add LogEntry fields by passing them as keyword arguments:
logger.log(
    "an entry with fields set",
    severity="ERROR",
    insert_id="0123",
    labels={"my-label": "my-value"},
)  # API call
Logger.log chooses the appropriate LogEntry type based on the input type. To specify the type explicitly, use one of the following Logger methods (see the example after this list):

Logger.log_text creates a TextEntry
Logger.log_struct creates a StructEntry
Logger.log_proto creates a ProtobufEntry
Logger.log_empty creates an empty LogEntry
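For example, a minimal sketch using the typed methods (the payload values here are illustrative):

logger.log_text("hello world")  # creates a TextEntry; API call
logger.log_struct({"event": "signup", "count": 3})  # creates a StructEntry; API call
logger.log_empty()  # creates an empty LogEntry; API call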
Batch Write Logs
By default, each log write takes place in an individual network request, which may be inefficient at scale.
With the Batch class, logs are batched together and sent out only when batch.commit is called.
batch = logger.batch()
batch.log("first log")
batch.log("second log")
batch.commit()
To simplify things, you can also use Batch as a context manager:
with logger.batch() as batch:
    batch.log("first log")
    # do work
    batch.log("last log")
In the previous example, the logs are automatically committed when the code exits the “with” block.
Retrieve Log Entries
You retrieve log entries for the default project using list_entries() on a Client or Logger object:
for entry in client.list_entries():  # API call(s)
    do_something_with(entry)
Entries returned by Client.list_entries() or Logger.list_entries() are instances of one of the entry classes described above (TextEntry, StructEntry, or ProtobufEntry).
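For instance, a short sketch that branches on the entry type; it assumes the entry classes can be imported from google.cloud.logging_v2.entries:

from google.cloud.logging_v2.entries import StructEntry, TextEntry

for entry in client.list_entries():  # API call(s)
    if isinstance(entry, TextEntry):
        print(entry.payload)  # payload is a plain string
    elif isinstance(entry, StructEntry):
        print(entry.payload)  # payload is a dict-like object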
You can filter the entries you retrieve by using the Advanced Logs Filters syntax. For example, to fetch filtered entries for the default project:
filter_str = "logName:log_name AND textPayload:simple"
for entry in client.list_entries(filter_=filter_str):  # API call(s)
    do_something_with(entry)
To sort entries in descending timestamp order:
from google.cloud.logging import DESCENDING

for entry in client.list_entries(order_by=DESCENDING):  # API call(s)
    do_something_with(entry)
To retrieve entries for a single logger, sorting in descending timestamp order:
from google.cloud.logging import DESCENDING

for entry in logger.list_entries(order_by=DESCENDING):  # API call(s)
    do_something_with(entry)
For example, to retrieve all GKE Admin Activity audit logs from the past 24 hours:
import google.cloud.logging
from datetime import datetime, timedelta, timezone
import os

# pull your project id from an environment variable
project_id = os.environ["GOOGLE_CLOUD_PROJECT"]

# construct a date object representing yesterday
yesterday = datetime.now(timezone.utc) - timedelta(days=1)

# Cloud Logging expects a timestamp in RFC3339 UTC "Zulu" format
# https://cloud.google.com/logging/docs/reference/v2/rest/v2/LogEntry
time_format = "%Y-%m-%dT%H:%M:%S.%f%z"

# build a filter that returns GKE Admin Activity audit logs from
# the past 24 hours
# https://cloud.google.com/kubernetes-engine/docs/how-to/audit-logging
filter_str = (
    f'logName="projects/{project_id}/logs/cloudaudit.googleapis.com%2Factivity"'
    f' AND resource.type="k8s_cluster"'
    f' AND timestamp>="{yesterday.strftime(time_format)}"'
)

# query and print all matching logs
client = google.cloud.logging.Client()
for entry in client.list_entries(filter_=filter_str):
    print(entry)
Delete Log Entries
To delete all logs associated with a logger, use the following call:
logger.delete() # API call
Manage Log Metrics
Logs-based metrics are counters of entries which match a given filter. They can be used within Cloud Monitoring to create charts and alerts.
To list all logs-based metrics for a project:
for metric in client.list_metrics():  # API call(s)
    do_something_with(metric)
To create a logs-based metric:
metric = client.metric(metric_name, filter_=filter, description=description)
assert not metric.exists()  # API call
metric.create()  # API call
assert metric.exists()  # API call
To refresh local information about a logs-based metric:
existing_metric = client.metric(metric_name)
existing_metric.reload()  # API call
To update a logs-based metric:
existing_metric.filter_ = updated_filter
existing_metric.description = updated_description
existing_metric.update()  # API call
To delete a logs-based metric:
metric.delete()
Log Sinks
Sinks allow exporting of log entries which match a given filter to Cloud Storage buckets, BigQuery datasets, or Cloud Pub/Sub topics.
Cloud Storage Sink
Ensure that the storage bucket you want to export logs to has cloud-logs@google.com as an owner. See Setting permissions for Cloud Storage.

To grant ownership of the bucket to cloud-logs@google.com:
bucket.acl.reload()  # API call
logs_group = bucket.acl.group("cloud-logs@google.com")
logs_group.grant_owner()
bucket.acl.add_entity(logs_group)
bucket.acl.save()  # API call
To create a Cloud Storage sink:
destination = "storage.googleapis.com/%s" % (bucket.name,) sink = client.sink(sink_name, filter_=filter, destination=destination) assert not sink.exists() # API call sink.create() # API call assert sink.exists() # API call
BigQuery Sink
To export logs to BigQuery, you must log into the Cloud Console and add cloud-logs@google.com to a dataset.
See: Setting permissions for BigQuery
from google.cloud.bigquery.dataset import AccessEntry

entry_list = dataset.access_entries
entry_list.append(AccessEntry("WRITER", "groupByEmail", "cloud-logs@google.com"))
dataset.access_entries = entry_list

client.update_dataset(dataset, ["access_entries"])  # API call
To create a BigQuery sink:
destination = "bigquery.googleapis.com%s" % (dataset.path,) sink = client.sink(sink_name, filter_=filter_str, destination=destination) assert not sink.exists() # API call sink.create() # API call assert sink.exists() # API call
Pub/Sub Sink
To export logs to Cloud Pub/Sub, you must log into the Cloud Console and add cloud-logs@google.com to a topic.
See: Setting permissions for Pub/Sub
topic_path = client.topic_path(project_id, topic_id)
topic = client.create_topic(request={"name": topic_path})

policy = client.get_iam_policy(request={"resource": topic_path})  # API call
policy.bindings.add(role="roles/owner", members=["group:cloud-logs@google.com"])
client.set_iam_policy(
    request={"resource": topic_path, "policy": policy}
)  # API call
To create a Cloud Pub/Sub sink:
destination = "pubsub.googleapis.com/%s" % (topic.name,) sink = client.sink(sink_name, filter_=filter_str, destination=destination) assert not sink.exists() # API call sink.create() # API call assert sink.exists() # API call
Manage Sinks
To list all sinks for a project:
for sink in client.list_sinks():  # API call(s)
    do_something_with(sink)
To refresh local information about a sink:
existing_sink = client.sink(sink_name)
existing_sink.reload()
To update a sink:
existing_sink.filter_ = updated_filter
existing_sink.update()
To delete a sink:
sink.delete()