@erleene
Last active August 2, 2018 13:52
bigquery for logging and alerting
BigQuery lets organizations capture and analyze data in real time using its streaming ingestion capability, so insights are always current. It is free for up to 1 TB of data analyzed each month and 10 GB of data stored. BigQuery gives you a full view of your data by seamlessly querying data stored in its managed columnar storage.
------
1. export log entries
Create an export of the log we need from the Stackdriver Logging log viewer.
Logs are exported by creating a sink, which pairs a log filter with an export destination (a BigQuery dataset or a Cloud Storage bucket).
A copy of each matching log entry is written to the destination. You must have permission to create a sink.
When you create a sink, Stackdriver Logging creates a new service account for it, called a unique writer identity.
Your export destination must permit this service account to write log entries.
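The sink described above can be sketched in code. This is a minimal sketch of the destination URI formats a sink expects; the project, dataset, bucket, sink name, and filter below are hypothetical examples:

```python
# Sketch of the destination formats a Stackdriver Logging sink expects.
# Project/dataset/bucket/sink names here are hypothetical examples.

def bigquery_destination(project: str, dataset: str) -> str:
    """Destination URI for a sink that exports to a BigQuery dataset."""
    return f"bigquery.googleapis.com/projects/{project}/datasets/{dataset}"

def gcs_destination(bucket: str) -> str:
    """Destination URI for a sink that exports to a Cloud Storage bucket."""
    return f"storage.googleapis.com/{bucket}"

# A sink pairs a filter with a destination; e.g. only ERROR-and-above entries:
sink_config = {
    "name": "error-logs-sink",          # hypothetical sink name
    "filter": "severity>=ERROR",
    "destination": bigquery_destination("my-project", "logs_dataset"),
}
```

In practice the sink is created from the log viewer UI (or the Logging API) using values like these.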
2. big query
you should see a new sink destination (a dataset) in BigQuery.
https://sites.google.com/site/scriptsexamples/learn-by-example/export-logs-to-bigquery/set-up-exports
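Once the export is flowing, entries land in date-sharded tables named after the log. A small sketch of building a query over one day's exported entries; the dataset and table names are hypothetical:

```python
# Sketch: exported log entries land in date-sharded tables named after
# the log, e.g. syslog_20180802. Dataset/table names here are hypothetical.

def daily_log_query(dataset: str, log_table: str, date: str) -> str:
    """Build a query over one day's exported log entries."""
    return (
        f"SELECT timestamp, severity, textPayload "
        f"FROM `{dataset}.{log_table}_{date}` "
        f"WHERE severity = 'ERROR' "
        f"ORDER BY timestamp DESC"
    )

print(daily_log_query("logs_dataset", "syslog", "20180802"))
```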
3. cloud storage
you should see a new sink destination (a bucket) in Cloud Storage.
4. alerting
an alert is set up on the GCS bucket, triggered whenever a new object is written to the bucket.
for the time being, we want the alert sent to Slack via a webhook, using a Cloud Function.
When a new item is added to the bucket, the Cloud Function is triggered and sends a Slack notification.
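The Cloud Function can be sketched as a background function triggered on object finalize in the bucket. This is a minimal sketch; the webhook URL and message text are hypothetical, and the webhook URL would normally come from configuration rather than being hard-coded:

```python
import json
from urllib import request

# Hypothetical Slack incoming-webhook URL; supply via config in practice.
SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"

def build_message(event: dict) -> dict:
    """Turn a GCS object event (with 'bucket' and 'name' keys)
    into a Slack webhook payload."""
    return {"text": f"New log export: gs://{event['bucket']}/{event['name']}"}

def notify_slack(event, context):
    """Background Cloud Function entry point, triggered when an
    object is finalized in the bucket."""
    payload = json.dumps(build_message(event)).encode("utf-8")
    req = request.Request(
        SLACK_WEBHOOK_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    request.urlopen(req)  # POST the notification to Slack
```

Deploying this with a GCS trigger on the export bucket means every new log object produces a Slack message.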