Remaining validations #332


Merged: 10 commits, Dec 26, 2021

Conversation

bryannaegele (Contributor)

This adds validations and limits attribute values to those in the spec, rather than what the protos are capable of exporting. I've also added validation to span start opts, which previously were not being validated in any way.

There's a second commit here that removes validation and attribute-validation drop tracking at the SDK level. Looking through the spec, this section indicates that the dropped attribute count tracks attributes dropped due to count limits. That makes sense to me: validation and allowed types are an API-level concern, which isn't trackable, whereas system limits are an SDK-level concern, which is.

We can try to get some verification on that one.
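For illustration, here is a minimal sketch of the kind of API-level value check the spec implies (module and function names are hypothetical, not the actual PR code): strings, booleans, integers, floats, and arrays of those primitives are allowed, and everything else is rejected.

```erlang
%% attr_check.erl -- hypothetical sketch, not the code in this PR.
-module(attr_check).
-export([is_allowed_value/1]).

%% The primitive value types the spec permits. Atoms stand in for
%% strings alongside binaries, as is common in Erlang APIs.
is_primitive(V) ->
    is_binary(V) orelse is_atom(V) orelse
    is_integer(V) orelse is_float(V).

%% A list (array) may contain only allowed primitives; anything else
%% (maps, tuples, nested lists, pids, ...) is rejected.
is_allowed_value(L) when is_list(L) ->
    lists:all(fun is_primitive/1, L);
is_allowed_value(V) ->
    is_primitive(V).
```

Note that a charlist such as "abc" is a list of integers in Erlang, so a sketch like this would accept it as an integer array; the real implementation has to decide how to treat that case.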

Closes #323

codecov bot commented Dec 25, 2021

Codecov Report

Merging #332 (afda6b6) into main (b8185bd) will increase coverage by 0.82%.
The diff coverage is 79.22%.

@@            Coverage Diff             @@
##             main     #332      +/-   ##
==========================================
+ Coverage   38.93%   39.75%   +0.82%     
==========================================
  Files          54       54              
  Lines        3601     3655      +54     
==========================================
+ Hits         1402     1453      +51     
- Misses       2199     2202       +3     
Flag Coverage Δ
api 67.28% <76.47%> (+1.16%) ⬆️
elixir 16.44% <30.88%> (+1.34%) ⬆️
erlang 39.79% <77.92%> (+0.83%) ⬆️
exporter 19.98% <ø> (+0.34%) ⬆️
sdk 77.44% <100.00%> (+0.02%) ⬆️

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
apps/opentelemetry_api/lib/open_telemetry.ex 21.42% <ø> (ø)
apps/opentelemetry_api/lib/open_telemetry/span.ex 26.31% <ø> (ø)
...pps/opentelemetry_api/lib/open_telemetry/tracer.ex 37.50% <ø> (ø)
apps/opentelemetry_api/src/otel_tracer_noop.erl 81.81% <0.00%> (-8.19%) ⬇️
apps/opentelemetry_api/src/otel_tracer.erl 63.15% <50.00%> (-18.67%) ⬇️
apps/opentelemetry_api/src/otel_span.erl 76.31% <80.85%> (+6.61%) ⬆️
apps/opentelemetry/src/otel_attributes.erl 90.00% <100.00%> (+0.52%) ⬆️
apps/opentelemetry_api/src/opentelemetry.erl 80.68% <100.00%> (+3.67%) ⬆️
...er/src/opentelemetry_exporter_trace_service_pb.erl 15.94% <0.00%> (+0.28%) ⬆️
... and 1 more

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

bryannaegele (Contributor Author)

Crap. Just realized filtermap is an OTP 24 introduction :/

tsloughter (Member)

@bryannaegele filtermap is from R16

end;
is_allowed(_, _) ->
false.
%% already in the map and not a binary so update
Member:

Why add anything that isn't a binary?

Contributor Author:

The API will have already filtered everything out in the validation step before it gets to the SDK, so the SDK should only need to truncate a binary value that exceeds the length limit.
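That idea can be sketched as follows (helper name hypothetical, not the PR code): only oversized binaries are touched, and every other already-validated value passes through unchanged.

```erlang
%% Hypothetical sketch: at the SDK level only binaries can still need
%% work, so truncate any that exceed the configured length limit and
%% pass every other (already-validated) value through untouched.
maybe_truncate(Value, Limit) when is_binary(Value),
                                  is_integer(Limit),
                                  byte_size(Value) > Limit ->
    binary:part(Value, 0, Limit);
maybe_truncate(Value, _Limit) ->
    Value.
```

One caveat of byte-based truncation: cutting at an arbitrary byte offset can split a multi-byte UTF-8 codepoint, so an implementation may want to trim to a codepoint boundary instead.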

Member:

Ah, gotcha.

One thing I forgot to do, and I don't think I see here, is also limiting the length of a list (array). Think you can sneak that in here? If not, I'll open a separate PR after this gets merged.

Contributor Author:

Yeah, I thought there was something for that in the spec but I couldn't find it. Is it a separate value from the binary limit?

Member:

Nah, same limit. I'll go look for it in the spec; I'm pretty sure it's there somewhere.

Member:

Oh, well the first thing, I guess, is that each string in an array has to be truncated:

> if it is an array of strings, then apply the above rule to each of the values separately,
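Sketched in Erlang (helper names hypothetical), the quoted rule is just the scalar truncation mapped over the list:

```erlang
%% Hypothetical sketch of the quoted spec rule: apply the string
%% length limit to each element of an array of strings.
truncate_strings(Values, Limit) when is_list(Values) ->
    [truncate(V, Limit) || V <- Values].

%% Scalar case: trim oversized binaries, leave everything else alone.
truncate(V, Limit) when is_binary(V), byte_size(V) > Limit ->
    binary:part(V, 0, Limit);
truncate(V, _Limit) ->
    V.
```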

Contributor Author:

Yeah, but are we going to enforce that? I thought we decided not to validate or mess with list contents due to the perf hit. I guess since it's a resource-protection issue, maybe we have to do it anyhow. At that point, I may as well go ahead and add the array homogeneity checks, too.

Up to you. If it's a real drag on performance we could add an opt-out option later, I suppose. I'd be a little surprised if folks had many lists, or that they'd be lengthy enough to be truly impactful.

Contributor Author:

Iterating a list of binaries and checking their lengths is probably the worst perf iteration we could do, so it makes sense to just add everything if we're already making that possible.

Member:

It is in the spec as a requirement if we support limits, so yeah, we should have it.

We can document to users that if they want to avoid these iterations they can set the length limit to infinity. We could check homogeneity only when the length limit isn't infinity. Possibly a bit odd behavior to some, but simple to document. And the issue of wanting to drop before the exporter (where a list will be dropped if it isn't homogeneous) only matters if someone adds so many attributes that they hit the count limit and then wants later attributes to still be included, since those should have been dropped... so I don't think it's a big deal.
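The opt-out described above could look roughly like this, a sketch under the assumption that the limit is either the atom infinity or an integer (all names hypothetical):

```erlang
%% Hypothetical sketch: skip per-element iteration entirely when the
%% length limit is infinity; otherwise check homogeneity and truncate.
process_list(Values, infinity) ->
    Values;
process_list(Values, Limit) ->
    case homogeneous(Values) of
        true  -> [truncate(V, Limit) || V <- Values];
        false -> drop
    end.

%% All elements must share the primitive type of the first element.
homogeneous([H | T]) ->
    Check = type_check(H),
    lists:all(Check, T);
homogeneous([]) ->
    true.

type_check(V) when is_binary(V)  -> fun is_binary/1;
type_check(V) when is_boolean(V) -> fun is_boolean/1;
type_check(V) when is_integer(V) -> fun is_integer/1;
type_check(V) when is_float(V)   -> fun is_float/1;
type_check(_)                    -> fun(_) -> false end.

truncate(V, Limit) when is_binary(V), byte_size(V) > Limit ->
    binary:part(V, 0, Limit);
truncate(V, _Limit) ->
    V.
```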

bryannaegele (Contributor Author)

> @bryannaegele filtermap is from R16

Yeah, I was referring to maps:filtermap, which is more recent. I didn't need to be using it anyhow since there was no transform.

One questionable thing I did was returning undefined on invalid names. I didn't want to raise, and the only other alternative would be some placeholder. But in a with_span that undefined would get passed to end_span, which presently doesn't handle it; it always expects a span ctx. We can start handling undefined in end_span or raise on the name. Thus far we've generally stayed quiet on errors, but maybe this calls for it.

tsloughter (Member)

Oh, missed the part about raising. Nah, it should never raise. It should create a zero span where all ids are 0.

tsloughter (Member)

And is_valid and is_recording are false:

-define(NOOP_SPAN_CTX, #span_ctx{trace_id=0,
                                 span_id=0,
                                 trace_flags=0,
                                 tracestate=[],
                                 is_valid=false,
                                 is_recording=false,
                                 span_sdk=undefined}).

tsloughter merged commit 6585266 into open-telemetry:main on Dec 26, 2021
Development

Successfully merging this pull request may close these issues.

Limit attribute value types
2 participants