
BigQuery Connector Ignores location Option When Submitting Query Jobs — Always Defaults to US #1348

Open
@sbaldassin

Description

Hi team,

I'm using the Spark BigQuery Connector to query datasets that reside in the EU region, and I’ve encountered a consistent issue.
The connector always executes query jobs in the US region, even when using .option("location", "EU"), resulting in errors like:

Not found: Dataset project:dataset was not found in location US
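For context, a read along these lines is how I hit it (the project, dataset, and table names are placeholders for any EU-hosted data; `viewsEnabled` and `materializationDataset` are only set because, as far as I understand, the connector needs them for query reads):

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class EuQueryRepro {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
            .appName("bq-eu-location-repro")
            .getOrCreate();

        // my_project / my_eu_dataset / my_table are placeholders for an EU-hosted dataset.
        Dataset<Row> df = spark.read()
            .format("bigquery")
            .option("viewsEnabled", "true")
            .option("materializationDataset", "my_eu_dataset")
            .option("location", "EU")   // this is the option that gets ignored
            .option("query", "SELECT * FROM `my_project.my_eu_dataset.my_table` LIMIT 10")
            .load();

        df.show();
    }
}
```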

After investigating the codebase and tracing the execution path, here’s what I found. The connector ultimately submits query jobs using:

bigQuery.create(JobInfo.of(queryJobConfig))

This creates a JobInfo with a JobConfiguration but without a JobId. In the BigQuery API, the job location is part of the jobReference, not of the job configuration, so unless a JobId with .setLocation(...) is attached to the JobInfo, BigQuery defaults the job to the US region.
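Something along these lines would address it, I think: attach an explicit JobId that carries the location. This is only a sketch of the idea, outside the connector; the "SELECT 1" query, the generated job id, and the hard-coded "EU" stand in for whatever the connector would read from its options.

```java
import java.util.UUID;

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.JobId;
import com.google.cloud.bigquery.JobInfo;
import com.google.cloud.bigquery.QueryJobConfiguration;

public class LocationAwareJobSubmission {
    public static void main(String[] args) {
        BigQuery bigQuery = BigQueryOptions.getDefaultInstance().getService();

        // Placeholder query; the connector would build its own QueryJobConfiguration.
        QueryJobConfiguration queryJobConfig =
            QueryJobConfiguration.newBuilder("SELECT 1").build();

        // Put the location into the job reference, where the API expects it.
        // "EU" stands in for the value of the connector's "location" option.
        JobId jobId = JobId.newBuilder()
            .setJob(UUID.randomUUID().toString())
            .setLocation("EU")
            .build();

        // JobInfo.of(JobId, JobConfiguration) carries the location in jobReference,
        // so the query runs in the requested region instead of defaulting to US.
        bigQuery.create(JobInfo.of(jobId, queryJobConfig));
    }
}
```

Setting the location only when the option is actually provided would keep the current default behavior for everyone else.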

Would you please take a look at this one?
