Remaining validations #332
Conversation
Codecov Report
```
@@           Coverage Diff           @@
##             main     #332   +/-  ##
==========================================
+ Coverage   38.93%   39.75%   +0.82%
==========================================
  Files          54       54
  Lines        3601     3655      +54
==========================================
+ Hits         1402     1453      +51
- Misses       2199     2202       +3
==========================================
```
Force-pushed from c4cf7f4 to 033b7d5
Crap. Just realized …
@bryannaegele filtermap is from R16
```erlang
end;
is_allowed(_, _) ->
    false.
```
```erlang
%% already in the map and not a binary so update
```
Why add anything that isn't a binary?
The API will have already filtered out everything in the validation step before getting to the SDK, so it should only need to possibly truncate a binary value if it exceeds the length limit.
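A minimal sketch of that truncation step (the function name is illustrative, not the SDK's actual API; `binary:part/3` truncates on bytes, so a character-aware variant would use `string:slice/3` instead):

```erlang
%% Illustrative sketch: only binaries over the limit need any work at
%% this point, since the API has already filtered out invalid values.
maybe_truncate(Value, Limit) when is_binary(Value), byte_size(Value) > Limit ->
    binary:part(Value, 0, Limit);
maybe_truncate(Value, _Limit) ->
    Value.
```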
Ah, gotcha.
One thing I forgot to do, and I don't think I see here, is also limiting the length of a list (array). Think you can sneak that in here? If not, I'll open a separate PR after this gets merged.
Yeah, I thought there was something for that in the spec but I couldn't find it. Is it a separate value from the binary limit?
Na, same limit. I'll go look for it in the spec; I'm pretty sure it's there somewhere.
Oh, well first, I guess, each string in an array has to be truncated:

> if it is an array of strings, then apply the above rule to each of the values separately,
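That per-element rule could be sketched like this (hypothetical names, byte-based truncation for simplicity; non-binary elements pass through untouched):

```erlang
%% Apply the string length limit to each element of a list-valued
%% attribute; anything that isn't a binary is left as-is.
truncate_strings(Values, Limit) ->
    lists:map(
      fun(V) when is_binary(V), byte_size(V) > Limit ->
              binary:part(V, 0, Limit);
         (V) ->
              V
      end,
      Values).
```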
Yeah, but are we going to enforce that? I thought we decided not to validate or mess with list contents due to the perf hit. I guess since it's a resource-protection issue, maybe we have to do it anyhow. At that point, I may as well go ahead and add the array homogeneity checks, too.
Up to you. If it's a real drag on performance we could add an opt-out option later, I suppose. I'd be a little surprised if folks had many lists or that they'd be lengthy enough to be truly impactful.
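A homogeneity check along those lines might look like the following sketch (illustrative names, not the PR's code): the whole list must hold a single primitive type, so check the tail against the type of the head.

```erlang
%% A list-valued attribute must hold elements of a single primitive
%% type; an empty list is trivially homogeneous.
is_homogeneous([]) ->
    true;
is_homogeneous([H | T]) ->
    Type = type_of(H),
    lists:all(fun(E) -> type_of(E) =:= Type end, T).

%% is_boolean/1 must be checked before any atom fallback, since
%% booleans are atoms in Erlang.
type_of(V) when is_binary(V)  -> binary;
type_of(V) when is_boolean(V) -> boolean;
type_of(V) when is_integer(V) -> integer;
type_of(V) when is_float(V)   -> float;
type_of(_)                    -> other.
```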
Iterating a list of binaries and checking their lengths is probably the worst possible perf iteration we could do, so it makes sense to just add everything if we're already making that possible.
It is in the spec as a requirement if we support limits, so yeah, we should have it.
We can document that users who want to avoid these iterations can set the length limit to infinity. We could check homogeneity only when the length limit isn't infinity. That's possibly a bit odd behavior to some, but simple to document. And the issue of wanting to drop before the exporter (where a value will be dropped if it isn't homogeneous) only comes up if someone adds so many attributes that they hit the limit and then wants later attributes to still be included, since those should have been dropped... so I don't think it's a big deal.
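The infinity opt-out described above could be sketched as a separate function clause, so no list traversal happens at all when the limit is unbounded (names are illustrative):

```erlang
%% With the limit set to the atom `infinity`, pass list values through
%% without iterating; otherwise truncate each binary element.
process_list(Values, infinity) ->
    Values;
process_list(Values, Limit) when is_integer(Limit) ->
    [truncate(V, Limit) || V <- Values].

truncate(V, Limit) when is_binary(V), byte_size(V) > Limit ->
    binary:part(V, 0, Limit);
truncate(V, _Limit) ->
    V.
```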
Yeah, was referring to the …

One questionable thing I did was returning …
Force-pushed from f03f878 to 5f22cdf
Oh, missed the part about raising. Na, it should never raise. It should create a zero span where all IDs are 0.
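Per the spec, a span context with all-zero IDs is invalid, so returning one is the non-raising fallback. A hedged sketch using a plain map (the SDK's actual representation is a record, so these names are illustrative):

```erlang
%% A "zero" span context: trace and span IDs of 0 mark it invalid,
%% rather than raising on bad input.
zero_span_ctx() ->
    #{trace_id => 0, span_id => 0, trace_flags => 0}.

%% Valid only when both IDs are non-zero.
is_valid_span_ctx(#{trace_id := TraceId, span_id := SpanId}) ->
    TraceId =/= 0 andalso SpanId =/= 0.
```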
This adds validations and limits attribute values to those in the spec rather than what the protos are capable of exporting. I've also added validation to span start opts which were previously not being validated in any way.
There's a second commit here that removes validations and tracking attribute validation drops at the SDK level. Looking through the spec, this section indicates the dropped attribute count is tracking attributes dropped due to count limits. This makes sense to me since validation and allowed types are an API-level concern which aren't trackable, whereas system limits are an SDK-level concern which is trackable.
We can try to get some verification on that one.
Closes #323