An example of configuring a PubSub BigQuery Subscription with Pulumi
BigQuery Subscription
It’s hard to view the content of the messages that were published to a topic, because the
application has already processed and acknowledged them before you can do anything. Usually, you
have to create another test subscription for the messages to be replicated to, and then pull
messages from that test subscription. However, the Google PubSub UI doesn’t provide any way to
pull a specific message by ID. The GCloud Console UI is frustrating in itself: slow to load, and
you often have to pull several times to find the messages you need.
Google offers the BigQuery Subscription as a solution to that issue; it also provides long-term
storage for your messages, so you can troubleshoot and run complex queries later. In this post,
I’m going to show a sample BigQuery Subscription workflow with Pulumi.
First, you need to create a BigQuery Dataset and a BigQuery Table following the schema defined
here. You can do this manually in the UI or via Pulumi.
BigQuery Dataset
import * as gcp from "@pulumi/gcp";

const pubsubDatasetId = `pubsub`;
export const pubsubDataset = new gcp.bigquery.Dataset(
  `my-dataset`,
  { datasetId: pubsubDatasetId }
);
BigQuery Table (a bit messy, since the schema has to be defined as a JSON string)
export const messageTable = new gcp.bigquery.Table(
  `my-table`,
  {
    datasetId: pubsubDatasetId,
    tableId: `message-values`,
    // set to true so other people can't accidentally delete the table
    deletionProtection: true,
    schema: `
      [
        {
          "name": "data",
          "type": "STRING",
          "mode": "NULLABLE",
          "description": "The message body"
        },
        {
          "name": "subscription_name",
          "type": "STRING",
          "mode": "NULLABLE",
          "description": ""
        },
        {
          "name": "message_id",
          "type": "STRING",
          "mode": "NULLABLE",
          "description": ""
        },
        {
          "name": "publish_time",
          "type": "TIMESTAMP",
          "mode": "NULLABLE",
          "description": ""
        },
        {
          "name": "attributes",
          "type": "STRING",
          "mode": "NULLABLE",
          "description": "Message attributes as JSON string"
        }
      ]
    `,
  },
  {
    dependsOn: [pubsubDataset],
  }
);
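Finally, create the subscription itself and point it at the table. The snippet below is a minimal
sketch: the topic and resource names (my-topic, my-bigquery-subscription) are placeholders of my
own, and it builds on the messageTable defined above. Because the table schema includes the
metadata columns (subscription_name, message_id, publish_time, attributes), writeMetadata has to
be enabled so PubSub actually populates them.
PubSub Topic and BigQuery Subscription
import * as pulumi from "@pulumi/pulumi";

// a placeholder topic; use your existing topic instead
export const topic = new gcp.pubsub.Topic(`my-topic`);

export const bigquerySubscription = new gcp.pubsub.Subscription(
  `my-bigquery-subscription`,
  {
    topic: topic.name,
    bigqueryConfig: {
      // the table reference has the form "{project}.{dataset}.{table}"
      table: pulumi.interpolate`${messageTable.project}.${messageTable.datasetId}.${messageTable.tableId}`,
      // populate the subscription_name, message_id, publish_time and
      // attributes columns defined in the table schema above
      writeMetadata: true,
    },
  }
);
One gotcha: the PubSub service agent (service-<project-number>@gcp-sa-pubsub.iam.gserviceaccount.com)
needs write access to the table, for example via the BigQuery Data Editor role, or the subscription
won’t be able to deliver messages. Once messages start flowing in, you can look up any specific
message by filtering on the message_id column with a simple query in the BigQuery console.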