# fix: properly detect bigquery catalog #629
Co-authored-by: Thomas Chow <[email protected]>
Actionable comments posted: 0
🧹 Nitpick comments (1)
cloud_gcp/src/main/scala/ai/chronon/integrations/cloud_gcp/GcpFormatProvider.scala (1)
Lines 28-48: Consider a fallback. If the table isn't found, throwing an exception may be too strict; consider returning `None` for a graceful fallback.
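As a rough illustration of that graceful fallback (a sketch only; `lookupTable` is a hypothetical stand-in, not the real BigQuery client call):

```scala
import scala.util.Try

// Hypothetical stand-in for a table lookup that throws when the table is missing.
def lookupTable(name: String): String =
  if (name == "data.prices") "BIGQUERY"
  else throw new NoSuchElementException(s"Table not found: $name")

// Graceful variant: fold the failure into None instead of propagating,
// so callers can fall back to another format provider.
def readFormatOpt(name: String): Option[String] =
  Try(lookupTable(name)).toOption
```

With this shape, a missing table yields `None` and the caller decides whether that is an error, rather than the provider deciding unilaterally.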
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro (Legacy)
📒 Files selected for processing (1)
- cloud_gcp/src/main/scala/ai/chronon/integrations/cloud_gcp/GcpFormatProvider.scala (2 hunks)
🧰 Additional context used
🧠 Learnings (1)
cloud_gcp/src/main/scala/ai/chronon/integrations/cloud_gcp/GcpFormatProvider.scala (2)
Learnt from: tchow-zlai
PR: zipline-ai/chronon#263
File: cloud_gcp/src/main/scala/ai/chronon/integrations/cloud_gcp/BigQueryFormat.scala:29-60
Timestamp: 2025-04-09T21:40:05.504Z
Learning: In BigQuery integration, table existence check is performed outside the BigQueryFormat.createTable method, at a higher level in TableUtils.createTable.
Learnt from: tchow-zlai
PR: zipline-ai/chronon#263
File: cloud_gcp/src/main/scala/ai/chronon/integrations/cloud_gcp/BigQueryFormat.scala:56-57
Timestamp: 2025-04-09T21:40:05.505Z
Learning: For BigQuery table creation operations in BigQueryFormat.scala, allow exceptions to propagate directly without wrapping them in try-catch blocks, as the original BigQuery exceptions provide sufficient context.
🧬 Code Graph Analysis (1)
cloud_gcp/src/main/scala/ai/chronon/integrations/cloud_gcp/GcpFormatProvider.scala (2)
cloud_gcp/src/main/scala/ai/chronon/integrations/cloud_gcp/SparkBQUtils.scala (2)
- SparkBQUtils (6-17)
- toTableId (8-15)

spark/src/main/scala/ai/chronon/spark/format/DefaultFormatProvider.scala (1)
- readFormat (15-23)
⏰ Context from checks skipped due to timeout of 90000ms (4)
- GitHub Check: scala_compile_fmt_fix
- GitHub Check: non_spark_tests
- GitHub Check: non_spark_tests
- GitHub Check: enforce_triggered_workflows
🔇 Additional comments (3)
cloud_gcp/src/main/scala/ai/chronon/integrations/cloud_gcp/GcpFormatProvider.scala (3)
Lines 5-8: Approved imports. No issues detected.

Lines 50-60: Clear parsing. Good fallback to currentCatalog when no explicit catalog is given; brief tests covering edge cases are recommended.

Lines 62-69: Check unknown catalog. `catalogManager.catalog(catalog)` might fail if the catalog is unregistered. Confirm usage or handle it gracefully.
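Graceful handling of the unregistered-catalog case might look like this sketch (assumed names; `registeredCatalogs` stands in for Spark's `catalogManager`, which throws on unknown catalog names):

```scala
import scala.util.Try

// Stand-in for catalogManager: throws on unregistered catalog names,
// mirroring how CatalogManager.catalog behaves for unknown plugins.
val registeredCatalogs = Map("spark_catalog" -> "default", "bigquery" -> "bq-connector")

def catalogOrThrow(name: String): String =
  registeredCatalogs.getOrElse(
    name,
    throw new NoSuchElementException(s"Catalog '$name' is not registered"))

// Wrap the lookup so an unregistered catalog surfaces as None instead of an exception.
def resolveCatalog(name: String): Option[String] =
  Try(catalogOrThrow(name)).toOption
```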
Just had one question.
```scala
if (isBigQueryCatalog(parsedCatalog)) {
  logger.info(s"Detected BigQuery catalog: $parsedCatalog")
  Try {
    val btTableIdentifier = SparkBQUtils.toTableId(tableName)(sparkSession)
```
Remind me again: why do we need to convert tableName to a big table identifier?
That's just what the bigQueryClient accepts - see the line below.
TBH this isn't entirely correct just yet. We actually need to find the project_id associated with the catalog and replace that in the tableName before we pass it off to the BQ client, but we'll do that in a follow-up.
Alright, I've adjusted this to make it work with BigQuery.
ptal @david-zlai
Actionable comments posted: 1
🧹 Nitpick comments (1)
cloud_gcp/src/main/scala/ai/chronon/integrations/cloud_gcp/SparkBQUtils.scala (1)
Lines 18-22: New method to convert a table name to an Identifier. Clean implementation that parses and restructures the table name components. Consider adding a comment explaining the reversal logic for clarity:

```scala
def toIdentifier(tableName: String)(implicit spark: SparkSession): Identifier = {
  // Parse, then reverse so the table name is at the head and the namespace follows
  val parseIdentifier = spark.sessionState.sqlParser.parseMultipartIdentifier(tableName).reverse
  Identifier.of(parseIdentifier.tail.reverse.toArray, parseIdentifier.head)
}
```
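The reversal logic can be seen on a plain Seq, without Spark's parser (a sketch under the assumption that `parseMultipartIdentifier("a.b.c")` yields `Seq("a", "b", "c")`):

```scala
// "catalog.db.table" parses to Seq("catalog", "db", "table"). Reversing puts the
// table name at the head; the tail, reversed back, is the namespace.
def splitIdentifier(parts: Seq[String]): (Seq[String], String) = {
  val reversed = parts.reverse
  (reversed.tail.reverse, reversed.head)
}
```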
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro (Legacy)
📒 Files selected for processing (6)
- cloud_gcp/src/main/scala/ai/chronon/integrations/cloud_gcp/DelegatingBigQueryMetastoreCatalog.scala (6 hunks)
- cloud_gcp/src/main/scala/ai/chronon/integrations/cloud_gcp/GcpFormatProvider.scala (2 hunks)
- cloud_gcp/src/main/scala/ai/chronon/integrations/cloud_gcp/SparkBQUtils.scala (2 hunks)
- cloud_gcp/src/test/scala/ai/chronon/integrations/cloud_gcp/BigQueryCatalogTest.scala (3 hunks)
- spark/src/main/scala/ai/chronon/spark/format/DefaultFormatProvider.scala (1 hunks)
- spark/src/test/scala/ai/chronon/spark/test/TableUtilsTest.scala (3 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (3)

cloud_gcp/src/main/scala/ai/chronon/integrations/cloud_gcp/SparkBQUtils.scala (1)
- spark/src/main/scala/ai/chronon/spark/TableUtils.scala (1)
  - sql (297-325)

cloud_gcp/src/test/scala/ai/chronon/integrations/cloud_gcp/BigQueryCatalogTest.scala (5)
- spark/src/main/scala/ai/chronon/spark/format/Iceberg.scala (1)
  - Iceberg (8-56)
- spark/src/main/scala/ai/chronon/spark/format/FormatProvider.scala (2)
  - FormatProvider (21-48)
  - from (23-48)
- spark/src/main/scala/ai/chronon/spark/format/DefaultFormatProvider.scala (1)
  - readFormat (15-23)
- cloud_gcp/src/main/scala/ai/chronon/integrations/cloud_gcp/BigQueryExternal.scala (1)
  - BigQueryExternal (12-101)
- cloud_gcp/src/main/scala/ai/chronon/integrations/cloud_gcp/BigQueryNative.scala (1)
  - BigQueryNative (10-91)

cloud_gcp/src/main/scala/ai/chronon/integrations/cloud_gcp/GcpFormatProvider.scala (5)
- spark/src/main/scala/ai/chronon/spark/format/DefaultFormatProvider.scala (1)
  - getCatalog (25-35)
- cloud_gcp/src/main/scala/ai/chronon/integrations/cloud_gcp/SparkBQUtils.scala (2)
  - SparkBQUtils (7-24)
  - toIdentifier (18-22)
- cloud_gcp/src/main/scala/ai/chronon/integrations/cloud_gcp/DelegatingBigQueryMetastoreCatalog.scala (1)
  - loadTable (119-177)
- cloud_gcp/src/main/scala/ai/chronon/integrations/cloud_gcp/BigQueryNative.scala (1)
  - BigQueryNative (10-91)
- cloud_gcp/src/main/scala/ai/chronon/integrations/cloud_gcp/BigQueryExternal.scala (1)
  - BigQueryExternal (12-101)
⏰ Context from checks skipped due to timeout of 90000ms (3)
- GitHub Check: non_spark_tests
- GitHub Check: non_spark_tests
- GitHub Check: scala_compile_fmt_fix
🔇 Additional comments (15)
cloud_gcp/src/main/scala/ai/chronon/integrations/cloud_gcp/SparkBQUtils.scala (1)
- Line 5: Added import for Identifier. Needed for the new `toIdentifier` method.

spark/src/main/scala/ai/chronon/spark/format/DefaultFormatProvider.scala (1)
- Lines 25-35: Added method to extract the catalog from a table name. Concise implementation with proper pattern matching for different identifier formats. Good fallback to the current catalog when not explicitly specified.

spark/src/test/scala/ai/chronon/spark/test/TableUtilsTest.scala (2)
- Lines 28-29: Added necessary imports for the new test. These imports are required for the new catalog detection test.
- Lines 646-656: Comprehensive test for catalog detection. Thorough test coverage of various table name formats and edge cases.

cloud_gcp/src/test/scala/ai/chronon/integrations/cloud_gcp/BigQueryCatalogTest.scala (3)
- Lines 13-14: Added import for the Iceberg format. Required for the new test comparing format detection results.
- Lines 45-53: Uncommented Iceberg catalog configuration. Enables necessary settings for Iceberg catalog integration tests.
- Lines 118-134: Added integration test for format detection. Covers key table formats (BigQuery external, native, Iceberg) and non-existent tables. Appropriate to mark as `ignore` since it requires actual BigQuery resources.

cloud_gcp/src/main/scala/ai/chronon/integrations/cloud_gcp/GcpFormatProvider.scala (2)
- Lines 22-44: Improved catalog detection logic. The refactored implementation properly identifies BigQuery catalog types and returns the appropriate format based on table properties.
- Lines 4-7: Appropriate import additions. Added necessary imports for the new functionality.

cloud_gcp/src/main/scala/ai/chronon/integrations/cloud_gcp/DelegatingBigQueryMetastoreCatalog.scala (6)
- Line 93: Added catalogProps variable. Stores catalog properties for later use.
- Lines 120-127: Improved table identifier handling. Properly removes the catalog segment from the namespace and supports Iceberg tables.
- Lines 130-131: Better project ID retrieval. Uses catalogProps for consistent project ID retrieval.
- Lines 157-159: Explicit PARQUET provider type. Sets the provider type for format detection.
- Lines 169-170: Explicit BIGQUERY provider type. Sets the provider type for format detection.
- Line 204: Store catalog properties. Populates catalogProps for use in the loadTable method.
```scala
private[cloud_gcp] def getFormat(table: Table): Format = {
  table.getDefinition.asInstanceOf[TableDefinition] match {
    case _: ExternalTableDefinition =>
      val parsedCatalog = getCatalog(tableName)
```
💡 Verification agent
🧩 Analysis chain
Missing getCatalog implementation.
Method is called but appears to be missing implementation.
🏁 Script executed:

```shell
#!/bin/bash
# Check for getCatalog method implementation
grep -r "def getCatalog" --include="*.scala" .
```

Length of output: 171
Implement or inherit getCatalog: the call to `getCatalog` in GcpFormatProvider.scala (line 23) doesn't resolve locally. Although a similar method exists in DefaultFormatProvider.scala, it isn't automatically available here. Please either inherit from a common base that provides the implementation or add a GCP-specific `getCatalog` method.
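A minimal sketch of what such a catalog-extraction helper might look like, assuming the multipart-identifier behavior described in the review (names and shapes here are illustrative, not the actual implementation):

```scala
// A 3-or-more-part identifier (catalog.db.table) carries an explicit catalog at its
// head; shorter identifiers (db.table, table) fall back to the session's current catalog.
def getCatalog(parts: Seq[String], currentCatalog: String): String =
  parts match {
    case catalog +: _ +: _ +: _ => catalog
    case _                      => currentCatalog
  }
```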
766a1bd to b7b44d4 (Compare)
Actionable comments posted: 1
🔭 Outside diff range comments (1)
cloud_gcp/src/test/scala/ai/chronon/integrations/cloud_gcp/GcpFormatProviderTest.scala (1)
Lines 19-21: 🛠️ Refactor suggestion: update the mock setup for the new method signature. The test setup creates a mock Table, but readFormat only needs a tableName string, making the mock setup unnecessary. Consider either removing the unused mock setup or enhancing the test to verify format detection properly. Also applies to line 36.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro (Legacy)
📒 Files selected for processing (2)
- cloud_gcp/src/test/scala/ai/chronon/integrations/cloud_gcp/GcpFormatProviderTest.scala (1 hunks)
- spark/src/test/scala/ai/chronon/spark/test/TableUtilsTest.scala (3 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (2)

cloud_gcp/src/test/scala/ai/chronon/integrations/cloud_gcp/GcpFormatProviderTest.scala (2)
- cloud_gcp/src/main/scala/ai/chronon/integrations/cloud_gcp/GcpFormatProvider.scala (1)
  - readFormat (22-45)
- spark/src/main/scala/ai/chronon/spark/format/DefaultFormatProvider.scala (1)
  - readFormat (15-23)

spark/src/test/scala/ai/chronon/spark/test/TableUtilsTest.scala (2)
- spark/src/main/scala/ai/chronon/spark/format/FormatProvider.scala (1)
  - FormatProvider (21-48)
- spark/src/main/scala/ai/chronon/spark/format/DefaultFormatProvider.scala (2)
  - DefaultFormatProvider (10-63)
  - getCatalog (25-35)
⏰ Context from checks skipped due to timeout of 90000ms (18)
- GitHub Check: streaming_tests
- GitHub Check: spark_tests
- GitHub Check: groupby_tests
- GitHub Check: fetcher_tests
- GitHub Check: analyzer_tests
- GitHub Check: join_tests
- GitHub Check: batch_tests
- GitHub Check: streaming_tests
- GitHub Check: join_tests
- GitHub Check: groupby_tests
- GitHub Check: analyzer_tests
- GitHub Check: fetcher_tests
- GitHub Check: batch_tests
- GitHub Check: non_spark_tests
- GitHub Check: spark_tests
- GitHub Check: scala_compile_fmt_fix
- GitHub Check: non_spark_tests
- GitHub Check: enforce_triggered_workflows
🔇 Additional comments (2)
spark/src/test/scala/ai/chronon/spark/test/TableUtilsTest.scala (2)
- Lines 28-30: Added necessary imports. Imports for FormatProvider and ParseException support the new test case.
- Lines 646-656: Comprehensive test coverage for catalog detection. The test thoroughly verifies catalog extraction logic across various table name formats, including quoted identifiers and default catalog fallbacks. Good job covering edge cases like empty strings.
```diff
@@ -33,6 +33,6 @@ class GcpFormatProviderTest extends AnyFlatSpec with MockitoSugar {
     .build())
   when(mockTable.getTableId).thenReturn(TableId.of("project", "dataset", "table"))

-    val gcsFormat = gcpFormatProvider.getFormat(mockTable)
+    val gcsFormat = gcpFormatProvider.readFormat(tableName)
```
Method changed but test still ignored. The test was updated to use `readFormat` instead of `getFormat` but remains ignored and lacks assertions.
```diff
 val gcsFormat = gcpFormatProvider.readFormat(tableName)
+assert(gcsFormat.isDefined, "Format should be detected")
```

📝 Committable suggestion

‼️ IMPORTANT: Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

```scala
val gcsFormat = gcpFormatProvider.readFormat(tableName)
assert(gcsFormat.isDefined, "Format should be detected")
```
b5abf56 to a423261 (Compare)
## Summary

- Now that we have configurable catalogs, we should rely on that to determine whether we're interacting with BigQuery.

## Checklist

- [ ] Added Unit Tests
- [ ] Covered by existing CI
- [ ] Integration tested
- [ ] Documentation update

## Summary by CodeRabbit

- **New Features**
  - Enhanced cloud integration logic to accurately determine data table catalogs, ensuring more reliable data format retrieval.
  - Improved error handling, providing clearer feedback when encountering issues with data format detection.
  - Introduced a new method for converting table names into identifiers, enhancing catalog detection functionality.
  - Added a method for retrieving the catalog name based on table names, improving the overall functionality.
- **Bug Fixes**
  - Corrected a typo in the error message for the `NoSuchTableException`.
- **Tests**
  - Added a test case to verify the functionality of catalog detection in the `GcpFormatProvider` class, increasing test coverage.
  - Introduced a new test case for validating catalog detection based on various input strings in the `TableUtilsTest` class.
  - Enhanced test coverage for format detection scenarios in the `BigQueryCatalogTest` class.

Co-authored-by: Thomas Chow <[email protected]>