Fix error when setting a large number of properties (#312)
* Fix error when setting a large number of properties
Bugfix
Fixes #269.
This change greatly reduces the likelihood of an error when specifying a large number of `property_ids` for `ga4.combine_property_data()`.
* Fixed the bug demonstrated below
* Changed the macro to copy tables separately for each `property_id`
dbt_project.yml
```yml
vars:
  ga4:
    source_project: source-project-id
    property_ids: [
      000000001
      , 000000002
      , ...
      , 000000040
    ]
    start_date: 20210101
    static_incremental_days: 3
    combined_dataset: combined_dataset_name
```
```shell
$ dbt run -s base_ga4__events --full-refresh
06:51:19 Running with dbt=1.5.0
06:52:05 Found 999 models, 999 tests, 999 snapshots, 999 analyses, 999 macros, 999 operations, 999 seed files, 999 sources, 999 exposures, 999 metrics, 999 groups
06:52:06
06:52:14 Concurrency: 4 threads (target='dev')
06:52:14
06:52:14 1 of 1 START sql view model dataset_name.base_ga4__events ......... [RUN]
06:56:17 BigQuery adapter: https://console.cloud.google.com/bigquery?project=project-id&j=bq:asia-northeast1:????????-????-????-????-????????????&page=queryresults
06:56:17 1 of 1 ERROR creating sql view model dataset_name.base_ga4__events [ERROR in 243.80s]
06:56:18
06:56:18 Finished running 1 view model in 0 hours 4 minutes and 11.62 seconds (251.62s).
06:56:22
06:56:22 Completed with 1 error and 0 warnings:
06:56:22
06:56:23 Database Error in model base_ga4__events (models/staging/base/base_ga4__events.sql)
06:56:23 The query is too large. The maximum standard SQL query length is 1024.00K characters, including comments and white space characters.
06:56:23
06:56:23 Done. PASS=0 WARN=0 ERROR=1 SKIP=0 TOTAL=1
```
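The arithmetic behind this failure: the old implementation concatenated one clone statement per table shard per property into a single script. A rough back-of-the-envelope check (shard count and statement length are illustrative assumptions, not measured values):
```sql
-- ~40 properties x ~1,180 daily shards (20210101 through 20240324)
-- x ~250 characters per clone statement (illustrative figures)
select 40 * 1180 * 250 as approx_script_chars;  -- ~11,800,000, far above the 1,024,000-character cap
```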
With this pull request merged, the same command completes successfully:
```shell
$ dbt run -s base_ga4__events --full-refresh
HH:mm:ss Running with dbt=1.5.0
HH:mm:ss Found 999 models, 999 tests, 999 snapshots, 999 analyses, 999 macros, 999 operations, 999 seed files, 999 sources, 999 exposures, 999 metrics, 999 groups
HH:mm:ss
HH:mm:ss Concurrency: 4 threads (target='dev')
HH:mm:ss
HH:mm:ss 1 of 1 START sql incremental model dataset_name.base_ga4__events ... [RUN]
HH:mm:ss Cloned from `source-project-id.analytics_000000001.events_*[20210101-20240324]` to `project-id.combined_dataset_name.events_YYYYMMDD000000001`.
HH:mm:ss Cloned from `source-project-id.analytics_000000002.events_*[20210101-20240324]` to `project-id.combined_dataset_name.events_YYYYMMDD000000002`.
....
HH:mm:ss Cloned from `source-project-id.analytics_000000040.events_*[20210101-20240324]` to `project-id.combined_dataset_name.events_YYYYMMDD000000040`.
HH:mm:ss 1 of 1 OK created sql incremental model dataset_name.base_ga4__events [CREATE TABLE (? rows, ? processed) in ?]
HH:mm:ss
HH:mm:ss Finished running 1 incremental model in ? (?).
HH:mm:ss
HH:mm:ss Completed successfully
HH:mm:ss
HH:mm:ss Done. PASS=1 WARN=0 ERROR=0 SKIP=0 TOTAL=1
```
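The fix builds the clone script per property and submits each one as its own query job, rather than concatenating every property's statements into a single query. A minimal sketch of the pattern (the macro name and the `shard_suffixes` argument are assumed for illustration, not the package's actual signature):
```sql
{# Sketch only: one clone script per property, each submitted separately. #}
{% macro clone_one_property(property_id, shard_suffixes) %}
    {%- set clone_script -%}
        {%- for relation_suffix in shard_suffixes -%}
            create or replace table `{{ target.project }}.{{ var('combined_dataset') }}.events_{{ relation_suffix }}{{ property_id }}`
            clone `{{ var('source_project') }}.analytics_{{ property_id }}.events_{{ relation_suffix }}`;
        {%- endfor -%}
    {%- endset -%}
    {# script length now scales with a single property's shard count,
       keeping each query far below the 1,024K-character limit #}
    {% do run_query(clone_script) %}
{% endmacro %}
```
Each `run_query` call is also a separate BigQuery job, which is what addresses the timeout issue described in the next section.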
---
Fixed timeout in clone operations
The timeout described below should now rarely occur, because clone operations are issued separately for each `property_id`.
* Removed https://github.com/Velir/dbt-ga4/blame/6.0.1/README.md#L323-L332 from README.md
* Made the workaround described there unnecessary:
> Jobs that run a large number of clone operations are prone to timing out. As a result, it is recommended that you increase the query timeout if you need to backfill or full-refresh the table, when first setting up or when the base model gets modified. Otherwise, it is best to prevent the base model from rebuilding on full refreshes unless needed to minimize timeouts.
>
> ```
> models:
> ga4:
> staging:
> base:
> base_ga4__events:
> +full_refresh: false
> ```
* Limited the changes to the `combine_property_data` implementation to the minimum necessary
* Removed `latest_shard_to_retrieve`
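For context, the "query timeout" that the removed README passage told users to increase is, in a typical setup, the dbt-bigquery connection timeout in profiles.yml. A sketch of the old workaround that this change makes unnecessary (profile name and values are illustrative):
```yml
# profiles.yml -- former workaround only; not needed after this PR
default:
  target: dev
  outputs:
    dev:
      type: bigquery
      method: oauth
      project: project-id
      dataset: dataset_name
      threads: 4
      job_execution_timeout_seconds: 1800  # raised from the 300-second default for backfills
```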
README.md (diff):

````diff
@@ -320,16 +320,6 @@ vars:
 
 With these variables set, the `combine_property_data` macro will run as a pre-hook to `base_ga4_events` and clone shards to the target dataset. The number of days' worth of data to clone during incremental runs will be based on the `static_incremental_days` variable.
 
-Jobs that run a large number of clone operations are prone to timing out. As a result, it is recommended that you increase the query timeout if you need to backfill or full-refresh the table, when first setting up or when the base model gets modified. Otherwise, it is best to prevent the base model from rebuilding on full refreshes unless needed to minimize timeouts.
-
-```
-models:
-  ga4:
-    staging:
-      base:
-        base_ga4__events:
-          +full_refresh: false
-```
 
 # dbt Style Guide
 
 This package attempts to adhere to the Brooklyn Data style guide found [here](https://github.com/brooklyn-data/co/blob/main/sql_style_guide.md). This work is in-progress.
````
`combine_property_data` macro (diff excerpt):

```diff
         {%- if relation_suffix|int >= earliest_shard_to_retrieve|int -%}
-            CREATE OR REPLACE TABLE `{{target.project}}.{{var('combined_dataset')}}.events_intraday_{{relation_suffix}}{{property_id}}` CLONE `{{var('source_project')}}.analytics_{{property_id}}.events_intraday_{{relation_suffix}}`;
+            create or replace table `{{target.project}}.{{var('combined_dataset')}}.events_intraday_{{relation_suffix}}{{property_id}}` clone `{{var('source_project')}}.analytics_{{property_id}}.events_intraday_{{relation_suffix}}`;
         {%- endif -%}
     {% endfor %}
+
     {# Copy daily tables and drop old intraday table #}
```

and, later in the same macro:

```diff
         {%- if relation_suffix|int >= earliest_shard_to_retrieve|int -%}
-            CREATE OR REPLACE TABLE `{{target.project}}.{{var('combined_dataset')}}.events_{{relation_suffix}}{{property_id}}` CLONE `{{var('source_project')}}.analytics_{{property_id}}.events_{{relation_suffix}}`;
-            DROP TABLE IF EXISTS `{{target.project}}.{{var('combined_dataset')}}.events_intraday_{{relation_suffix}}{{property_id}}`;
+            create or replace table `{{target.project}}.{{var('combined_dataset')}}.events_{{relation_suffix}}{{property_id}}` clone `{{var('source_project')}}.analytics_{{property_id}}.events_{{relation_suffix}}`;
+            drop table if exists `{{target.project}}.{{var('combined_dataset')}}.events_intraday_{{relation_suffix}}{{property_id}}`;
         {%- endif -%}
     {% endfor %}
+{%- endset -%}
+
+{% do run_query(combine_specified_property_data_query) %}
```
0 commit comments