The current design of the packages favors ergonomics, and it works well in that regard. However, it is not the most performant. When a user passes a slice of attributes, it is wrapped with
[`metric.WithAttributes`](cf787f3e3a/metric/instrument.go (L372-L376)):
```go
func WithAttributes(attributes ...attribute.KeyValue) MeasurementOption {
cp := make([]attribute.KeyValue, len(attributes))
copy(cp, attributes)
return attrOpt{set: attribute.NewSet(cp...)}
}
```
This copies the passed attributes and then hands that copy to
`attribute.NewSet`, which adds its own allocation.
If a user is performance conscious and already has an `attribute.Set`,
allowing them to pass it directly reduces the allocations required for
the measurement call.
## Alternatives Considered
### Why not allow users to just use the `Inst()` method?
With the current design a user can still just call:
```go
counter.Inst().Add(ctx, val, metric.WithAttributeSet(set))
```
However, the option slice (in this case `[]metric.AddOption`) is
allocated on every call, meaning a user would also need to recreate the
pooling that our semconv generated packages already handle.
Providing the `*Set` methods is convenient, but it also helps
application performance, as one larger pool puts less strain on the GC
than many smaller ones.
### Why not just call `WithAttributeSet` in the existing methods?
Instead of appending a `WithAttributes` option internally in all the
measurement methods, we could instead create a `Set` and pass it with
`WithAttributeSet`. This would avoid the additional slice copy that
`WithAttributes` performs before calling `attribute.NewSet`.
However, the copy that `WithAttributes` makes is important: calling
`attribute.NewSet` sorts the provided slice in place. Bypassing the
copy would mean every attribute slice a user passes is modified when
used. That is likely to be unexpected behavior, and it is the reason we
copy the slice in `WithAttributes` in the first place.
### Why not just copy user attributes to an amortized slice pool to create a new `attribute.Set`?
The additional `[]attribute.KeyValue` that user attributes are copied
into to prevent modification can be amortized with a pool. For example,
#7249. With such a pool, there shouldn't be any performance difference
between calling a `*Set` method introduced here and the non-`Set`
method.
This is true, and motivates this type of change. However, providing
these methods also gives the user further performance gains when a set
is reused across measurements. For example:
```go
set := attribute.NewSet(attrs...)
counter1.AddSet(ctx, 1, set)
counter2.AddSet(ctx, 2, set) // No additional set is created here.
```
Without these methods:
```go
counter1.Add(ctx, 1, attrs...) // A set is created for attrs
counter2.Add(ctx, 2, attrs...) // Another set is created for attrs
```
Each `attribute.Set` created requires an allocation (the internal `any`
field must be heap allocated), so without these methods a user still
cannot write the most performant code.
The attribute pool approach has merit, and should be pursued. However,
it does not invalidate the value of these changes.