These tips reflect what I see in the field while working with and coaching teams. Keep in mind that most of this applies to my main area of work: Snowflake and adopting DataOps-style agile patterns.
Blogs
Auto ingestion into Snowflake, an alternative to using Snowpipe: External Tables with Snowflake Streams
Snowflake’s auto ingestion service Snowpipe has been around since Dec 2018 and, in my experience, has proven to be an excellent method for automatically ingesting data landing in cloud storage into Snowflake.
Snowpipe with auto ingestion requires additional cloud services (event notifications such as Amazon SQS/SNS or Azure Event Grid) over and above the simple cloud storage containers your data lands in.
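To make the alternative concrete, here is a minimal sketch of the external-table-plus-stream pattern the post describes. The stage URL, object names, target table, and connection parameters are hypothetical placeholders, not taken from the post; the SQL follows standard Snowflake syntax for external tables and insert-only streams.

```python
# Sketch: external table + stream as a Snowpipe alternative.
# All object names and connection parameters are hypothetical placeholders.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="LOAD_WH",
    database="RAW",
    schema="LANDING",
)

statements = [
    # External stage over the cloud storage container the files land in.
    """CREATE STAGE IF NOT EXISTS landing_stage
         URL = 's3://my-bucket/landing/'
         FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)""",
    # External table: metadata over the staged files; no data copied yet.
    """CREATE EXTERNAL TABLE IF NOT EXISTS ext_landing
         WITH LOCATION = @landing_stage
         FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
         AUTO_REFRESH = FALSE""",
    # Insert-only stream tracks files newly registered on the external table.
    """CREATE STREAM IF NOT EXISTS landing_stream
         ON EXTERNAL TABLE ext_landing INSERT_ONLY = TRUE""",
    # Refresh registers newly arrived files; the stream picks up the delta.
    "ALTER EXTERNAL TABLE ext_landing REFRESH",
    # Consume the stream into a real table (assumed to exist with matching
    # columns); the successful DML advances the stream offset.
    """INSERT INTO target_table
         SELECT value, metadata$filename FROM landing_stream""",
]

with conn.cursor() as cur:
    for stmt in statements:
        cur.execute(stmt)
conn.close()
```

The appeal of this pattern is that it needs no event-notification plumbing: a scheduled task runs the REFRESH and the stream hands you only the new files.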
It's been a couple of months since I sat the above exam, and I've had a few people asking me about it, so I thought I'd share my thoughts on it in a blog.
There are many tools for migrating data from your server to Azure Blob Storage, but they may not work the way you want. Here is a way to minimise your third-party costs and process your data effectively. In this post, we upload CSV files from a folder recursively, i.e. including files inside sub-folders.
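As a rough illustration of that approach (the connection string, container name, and local folder below are placeholders, not taken from the post), a recursive upload with the azure-storage-blob Python SDK can look like this:

```python
# Sketch: recursively upload CSV files (including those in sub-folders)
# to Azure Blob Storage. Connection string, container, and folder are
# hypothetical placeholders.
import os
from pathlib import Path

from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"]
)
container = service.get_container_client("landing")

source_dir = Path("./data")
for csv_path in source_dir.rglob("*.csv"):
    # Preserve the sub-folder structure in the blob name.
    blob_name = csv_path.relative_to(source_dir).as_posix()
    with open(csv_path, "rb") as data:
        container.upload_blob(name=blob_name, data=data, overwrite=True)
    print(f"uploaded {blob_name}")
```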
Snowpipe enables loading data from files as soon as they’re available. This means you can load data from files in micro-batches, making it available to users within minutes, rather than manually executing COPY statements on a schedule to load larger batches.
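For reference, the core of a Snowpipe setup is a pipe wrapping a COPY statement; once cloud storage notifications are wired to it, new files load automatically in micro-batches. A minimal sketch, with illustrative names and assuming an open `conn` as in the earlier example plus an existing stage and target table:

```python
# Sketch: define an auto-ingest Snowpipe over an existing stage.
# Names are illustrative placeholders, not taken from the post.
create_pipe = """
CREATE PIPE IF NOT EXISTS csv_pipe
  AUTO_INGEST = TRUE
  AS
  COPY INTO target_table
  FROM @landing_stage
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
"""
with conn.cursor() as cur:
    cur.execute(create_pipe)
    # SHOW PIPES returns the notification channel (e.g. an SQS ARN on AWS)
    # that the cloud storage event notifications must be pointed at.
    cur.execute("SHOW PIPES LIKE 'csv_pipe'")
    for row in cur.fetchall():
        print(row)
```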
Screw the Business Key!…Say What Now?
The business key… well, what is the business key? In the Data Vault methodology it is simply a unique key used to identify a record, either to be inserted into the Data Vault (hub) or to be checked against an existing record.
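As a hedged illustration of how that check is often done in practice (the normalisation rules and the choice of MD5 are common Data Vault conventions assumed here, not details from the post), the business key is typically standardised and hashed into the hub's key:

```python
# Sketch: derive a Data Vault hub hash key from a business key.
# Trim + uppercase normalisation and MD5 are assumed conventions.
import hashlib

def hub_hash_key(*business_key_parts: str) -> str:
    """Normalise the business key part(s) and hash them into a hub key."""
    normalised = "||".join(p.strip().upper() for p in business_key_parts)
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()

# The resulting key is inserted into the hub only if it is not already there.
print(hub_hash_key("  cust-0042 "))
```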
Data Vault: is it a replacement for traditional data warehouses and star schemas? This is a common question that I am often asked, and one that also gets asked by those interested in or learning about Data Vault.