refactor: split fetcher logic into multiple files #425
Merged

Commits (7):
959f1c5  refactor: split fetcher logic into multiple files  (nikhil-zlai)
db03b87  capitalization fix  (nikhil-zlai)
8c8c5a8  float up gb fetcher methods  (nikhil-zlai)
f97321d  2.13 fixes  (nikhil-zlai)
672d1ee  undo 2.13 on sync  (nikhil-zlai)
ac5686e  test fix  (nikhil-zlai)
9cfa908  test fixes  (nikhil-zlai)
New file: SerdeUtils.scala (+16 lines)

@@ -0,0 +1,16 @@
package ai.chronon.api

import ai.chronon.api.thrift.protocol.{TBinaryProtocol, TCompactProtocol}
import ai.chronon.api.thrift.{TDeserializer, TSerializer}

object SerdeUtils {
  @transient
  lazy val compactSerializer: ThreadLocal[TSerializer] = new ThreadLocal[TSerializer] {
    override def initialValue(): TSerializer = new TSerializer(new TCompactProtocol.Factory())
  }

  @transient
  lazy val compactDeserializer: ThreadLocal[TDeserializer] = new ThreadLocal[TDeserializer] {
    override def initialValue(): TDeserializer = new TDeserializer(new TCompactProtocol.Factory())
  }
}
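The serializers are wrapped in ThreadLocal because Thrift's TSerializer and TDeserializer are stateful and not safe to share across threads; each thread lazily gets its own instance. A minimal usage sketch, not part of the PR: the roundTrip helper is hypothetical, and it assumes the shaded TBase type lives alongside TSerializer under ai.chronon.api.thrift.

import ai.chronon.api.SerdeUtils
import ai.chronon.api.thrift.TBase

object SerdeUtilsExample {
  // Hypothetical helper: serialize a Thrift struct with the per-thread compact serializer,
  // then rehydrate the bytes into a freshly constructed instance of the same type.
  def roundTrip[T <: TBase[_, _]](value: T, empty: T): T = {
    val bytes: Array[Byte] = SerdeUtils.compactSerializer.get().serialize(value)
    SerdeUtils.compactDeserializer.get().deserialize(empty, bytes) // mutates `empty` in place
    empty
  }
}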
Modified: FlinkJob.scala
@@ -19,9 +19,9 @@ import ai.chronon.flink.window.KeySelectorBuilder
 import ai.chronon.online.Api
 import ai.chronon.online.FlagStoreConstants
 import ai.chronon.online.GroupByServingInfoParsed
-import ai.chronon.online.MetadataStore
 import ai.chronon.online.SparkConversions
 import ai.chronon.online.TopicInfo
+import ai.chronon.online.fetcher.{FetchContext, MetadataStore}
 import org.apache.flink.api.common.eventtime.SerializableTimestampAssigner
 import org.apache.flink.api.common.eventtime.WatermarkStrategy
 import org.apache.flink.configuration.CheckpointingOptions
@@ -243,7 +243,7 @@ object FlinkJob {
   // we set an explicit max parallelism to ensure if we do make parallelism setting updates, there's still room
   // to restore the job from prior state. Number chosen does have perf ramifications if too high (can impact rocksdb perf)
   // so we've chosen one that should allow us to scale to jobs in the 10K-50K events / s range.
-  val MaxParallelism = 1260 // highly composite number
+  val MaxParallelism: Int = 1260 // highly composite number

   // We choose to checkpoint frequently to ensure the incremental checkpoints are small in size
   // as well as ensuring the catch-up backlog is fairly small in case of failures
@@ -254,11 +254,11 @@
   val CheckpointTimeout: FiniteDuration = 5.minutes

   // We use incremental checkpoints and we cap how many we keep around
-  val MaxRetainedCheckpoints = 10
+  val MaxRetainedCheckpoints: Int = 10

   // how many consecutive checkpoint failures can we tolerate - default is 0, we choose a more lenient value
   // to allow us a few tries before we give up
-  val TolerableCheckpointFailures = 5
+  val TolerableCheckpointFailures: Int = 5

   // Keep windows open for a bit longer before closing to ensure we don't lose data due to late arrivals (needed in case of
   // tiling implementation)
@@ -306,7 +306,7 @@ object FlinkJob {
     val kafkaBootstrap = jobArgs.kafkaBootstrap.toOption

     val api = buildApi(onlineClassName, props)
-    val metadataStore = new MetadataStore(api.genKvStore, MetadataDataset, timeoutMillis = 10000)
+    val metadataStore = new MetadataStore(FetchContext(api.genKvStore, MetadataDataset))

     val flinkJob =
       if (useMockedSource) {

Review comment on this change: "10k is the default - there was a compiler warning"
Deleted: online/src/main/scala/ai/chronon/online/CompatParColls.scala (32 deletions)
New file: online/src/main/scala/ai/chronon/online/fetcher/FetchContext.scala (+36 lines)

@@ -0,0 +1,36 @@
package ai.chronon.online.fetcher
import ai.chronon.api.Constants.MetadataDataset
import ai.chronon.api.ScalaJavaConversions.JMapOps
import ai.chronon.online.{FlagStore, FlagStoreConstants, FlexibleExecutionContext, KVStore}

import scala.concurrent.ExecutionContext

case class FetchContext(kvStore: KVStore,
                        metadataDataset: String = MetadataDataset,
                        timeoutMillis: Long = 10000,
                        debug: Boolean = false,
                        flagStore: FlagStore = null,
                        disableErrorThrows: Boolean = false,
                        executionContextOverride: ExecutionContext = null) {

  def isTilingEnabled: Boolean = {
    Option(flagStore)
      .map(_.isSet(FlagStoreConstants.TILING_ENABLED, Map.empty[String, String].toJava))
      .exists(_.asInstanceOf[Boolean])
  }

  def isCachingEnabled(groupByName: String): Boolean = {
    Option(flagStore)
      .exists(_.isSet("enable_fetcher_batch_ir_cache", Map("group_by_streaming_dataset" -> groupByName).toJava))
  }

  def shouldStreamingDecodeThrow(groupByName: String): Boolean = {
    Option(flagStore)
      .exists(
        _.isSet("disable_streaming_decoding_error_throws", Map("group_by_streaming_dataset" -> groupByName).toJava))
  }

  def getOrCreateExecutionContext: ExecutionContext = {
    Option(executionContextOverride).getOrElse(FlexibleExecutionContext.buildExecutionContext)
  }
}
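FetchContext bundles the KV store, metadata dataset, timeout, and flag-store lookups that previously had to be passed to MetadataStore piecemeal. A small usage sketch mirroring the FlinkJob change above; `api` stands in for an already-built ai.chronon.online.Api instance and the group-by name is hypothetical.

import ai.chronon.api.Constants.MetadataDataset
import ai.chronon.online.fetcher.{FetchContext, MetadataStore}

// `api` is assumed to be an ai.chronon.online.Api constructed elsewhere (e.g. via buildApi).
val fetchContext = FetchContext(api.genKvStore, MetadataDataset) // timeoutMillis defaults to 10000
val metadataStore = new MetadataStore(fetchContext)

// Flag-store backed switches default to false when no FlagStore is supplied.
val tilingEnabled = fetchContext.isTilingEnabled
val cachingEnabled = fetchContext.isCachingEnabled("my_group_by") // hypothetical group-by name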
Review comment: "no bugs here - just needed thread local"