
Destination Azure Blob Storage: Support writing timestamps #6063

Closed
@ghost

Description

Tell us about the problem you're trying to solve

I'd like to be able to use Azure Blob Storage (or S3 / GCS) as a durable data lake while also facilitating quick loads into a DW such as Snowflake or BigQuery.

Describe the solution you’d like

The option to append the current timestamp (_airbyte_emitted_at) to the resulting filename in Cloud Storage. This would allow incremental reads to create individual files that can be loaded, queried, and managed efficiently.
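
For illustration, here is a minimal Python sketch of the kind of naming scheme this would enable. The function name and timestamp format are hypothetical, not the connector's current behavior:

```python
from datetime import datetime, timezone

def blob_name_with_timestamp(stream_name: str, emitted_at_ms: int, extension: str = "csv") -> str:
    """Build a blob name that embeds the batch's _airbyte_emitted_at timestamp.

    Illustrative only: the real destination connector would decide the exact
    pattern; emitted_at_ms is assumed to be a millisecond epoch value.
    """
    ts = datetime.fromtimestamp(emitted_at_ms / 1000, tz=timezone.utc)
    # One file per incremental sync batch, e.g. "users_20211015T123045Z.csv"
    return f"{stream_name}_{ts.strftime('%Y%m%dT%H%M%SZ')}.{extension}"

print(blob_name_with_timestamp("users", 1634301045000))  # users_20211015T123045Z.csv
```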

Describe the alternative you’ve considered or used

An alternative would be to manage a larger workflow outside of Airbyte that loads the file, copies it to a durable location, and then removes the original.

Another alternative would be to enhance DW destinations that stage data in Cloud Storage by allowing the user to retain the staged files instead of removing them automatically (see the sketch below). I could see value in both enhancements.
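
As a rough sketch of that second idea, assuming a hypothetical `purge_staging_data` toggle on a warehouse destination (not an existing spec field), the post-copy cleanup would simply become conditional:

```python
from azure.storage.blob import ContainerClient

def finalize_staged_blob(container: ContainerClient, blob_name: str, purge_staging_data: bool) -> None:
    """After the warehouse COPY completes, optionally remove the staged blob.

    Hypothetical sketch: when purge_staging_data is False, the staged file is
    kept and doubles as a durable data-lake copy of the synced records.
    """
    if purge_staging_data:
        container.delete_blob(blob_name)
```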

Additional context

Similar to #4610

Are you willing to submit a PR?

Perhaps. :)
