Hi there, I recently encountered a rather nasty error that has had some costly effects in terms of lost data. Essentially, our team discovered a gradual data drift in our downstream data science / product dashboards, caused by lost messages sent to Mixpanel.
After a bit of investigation, I discovered that there is a maximum message size limit for JSON serialization. One of the challenges for us was that we didn't realize the gem caught this error and piped it out to the logs. Our log pipelines are quite busy, so it's hard to catch minor errors like this.
I wonder if it might make sense to simply raise an error saying "Limit of message reached" when you try to track an event with a message size > 32KB, instead of logging and failing silently.
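In the meantime, a caller-side guard along these lines would at least surface the failure in application code. This is a minimal sketch, not something from the gem: it assumes the ~32KB limit described above (the local MAX_MESSAGE_BYTES constant and the track_or_fail helper are illustrative names), and it only approximates the gem's check, since the gem also serializes envelope fields (timestamp, context, etc.) on top of the raw payload:

```ruby
require 'segment/analytics'
require 'json'

# Hypothetical local mirror of the gem's ~32KB limit described above;
# the gem's own constant lives in defaults.rb (referenced below).
MAX_MESSAGE_BYTES = 32_768

def track_or_fail(analytics, payload)
  # Serialize up front so an oversized event raises in application code
  # rather than disappearing into a busy log pipeline. Approximate only:
  # the gem adds envelope fields, so the final message is somewhat larger.
  size = payload.to_json.bytesize
  if size > MAX_MESSAGE_BYTES
    raise "Limit of message reached (#{size} bytes > #{MAX_MESSAGE_BYTES})"
  end

  analytics.track(payload)
end

analytics = Segment::Analytics.new(write_key: 'YOUR_WRITE_KEY')
track_or_fail(analytics,
              user_id: 'user-123',
              event: 'Report Generated',
              properties: { row_count: 10_000 })
```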
Relevant code:
analytics-ruby/lib/segment/analytics/defaults.rb, line 22 (commit c440393)
analytics-ruby/lib/segment/analytics/message_batch.rb, lines 24 to 31 (commit c440393)
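For anyone reviewing, the cited lines implement roughly the following pattern. This is a paraphrase rather than a verbatim copy of the gem's code, and the MessageTooBigError name in the comment is hypothetical, just to show the shape of the proposed change:

```ruby
# Paraphrase of the batching logic cited above: the batch serializes the
# message, checks its size against the limit from defaults.rb, and silently
# drops anything oversized, leaving only a log line behind.
def <<(message)
  message_json_size = message.to_json.bytesize
  if message_json_size > Defaults::Message::MAX_BYTES
    # Current behavior: drop the event and record it in the logs only.
    logger.error('a message exceeded the maximum allowed size')
    # Proposed behavior: surface the failure to the caller instead, e.g.
    #   raise MessageTooBigError, 'Limit of message reached'
  else
    @messages << message
  end
end
```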