[cmd/otelinmemexporter] Add histogram support #229

Open
wants to merge 25 commits into main

Conversation

@ericywl (Contributor) commented Feb 19, 2025

πŸ§‘β€πŸ’» What is being changed

  • Add histogram support to otelinmemexporter. Aggregations for histograms include percentile and sum (a minimal sketch of one way to compute these follows this list).
  • Refactor / rename some existing functions, fields, etc.
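For context, here is a minimal sketch of how a percentile can be estimated from an explicit-bounds histogram data point: locate the bucket containing the target rank, then interpolate linearly between that bucket's bounds. A sum aggregation can simply read the data point's sum field. The function and variable names below are illustrative only; this is not the code added by this PR.

package main

import "fmt"

// histogramPercentile estimates the p-th quantile (0 < p <= 1) of an
// explicit-bounds histogram by locating the bucket that contains the target
// rank and linearly interpolating inside that bucket. bucketCounts has
// len(bounds)+1 entries; the last (overflow) bucket has no finite upper
// bound, so its lower bound is returned as an approximation.
func histogramPercentile(p float64, bucketCounts []uint64, bounds []float64) float64 {
	var total uint64
	for _, c := range bucketCounts {
		total += c
	}
	if total == 0 {
		return 0
	}
	rank := p * float64(total)

	cumulative := 0.0
	for i, c := range bucketCounts {
		next := cumulative + float64(c)
		if c > 0 && rank <= next {
			if i == 0 {
				// First bucket is (-Inf, bounds[0]]; return its upper bound.
				return bounds[0]
			}
			if i >= len(bounds) {
				// Overflow bucket (bounds[len-1], +Inf); return its lower bound.
				return bounds[len(bounds)-1]
			}
			lower, upper := bounds[i-1], bounds[i]
			// Linear interpolation within bucket i.
			return lower + (rank-cumulative)/float64(c)*(upper-lower)
		}
		cumulative = next
	}
	return bounds[len(bounds)-1]
}

func main() {
	// Bucket counts and explicit bounds taken from the first example data
	// point in the validation section below (count = 43).
	counts := []uint64{0, 27, 5, 11, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0}
	bounds := []float64{0, 5, 10, 25, 50, 75, 100, 250, 500, 750, 1000, 2500, 5000, 7500, 10000}
	fmt.Println(histogramPercentile(0.95, counts, bounds)) // ≈ 22.07
}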

✅ How to validate the change

  1. Configure ocb-config.yaml to use a local path for otelinmemexporter and rebuild the collector. For example:
exporters:
  - gomod: go.opentelemetry.io/collector/exporter/otlpexporter v0.120.0
  - gomod: github.com/elastic/apm-perf/cmd/otelinmemexporter v0.0.0-20241111024659-ec5113ead30e
    path: /Users/ericyap/Repos/apm/apm-perf/cmd/otelinmemexporter

processors:
  - gomod: go.opentelemetry.io/collector/processor/batchprocessor v0.120.0
  - gomod: github.com/open-telemetry/opentelemetry-collector-contrib/processor/cumulativetodeltaprocessor v0.120.0

receivers:
  - gomod: go.opentelemetry.io/collector/receiver/otlpreceiver v0.120.0
  2. Configure an aggregation for a histogram metric.

  3. Run the benchmark (with some changes to otelinmemexporter to log / print out the histogram aggregations). Example output (a consistency check on these values follows this list):

{"latency":{"":{"resourceMetrics":[{"resource":{},"scopeMetrics":[{"scope":{},"metrics":[{"histogram":{"dataPoints":[{"attributes":[{"key":"partition","value":{"stringValue":"0"}},{"key":"project_id","value":{"stringValue":"d8b2b3711fa948518f497173e850fe19"}}],"startTimeUnixNano":"1739948581207620451","timeUnixNano":"1739948591207399095","count":"43","sum":265.11085513400394,"bucketCounts":["0","27","5","11","0","0","0","0","0","0","0","0","0","0","0","0"],"explicitBounds":[0,5,10,25,50,75,100,250,500,750,1000,2500,5000,7500,10000]}]}}]}]}]}}}
map[string]float64{"":22.068181818181817}

{"latency":{"":{"resourceMetrics":[{"resource":{},"scopeMetrics":[{"scope":{},"metrics":[{"histogram":{"dataPoints":[{"attributes":[{"key":"partition","value":{"stringValue":"0"}},{"key":"project_id","value":{"stringValue":"d8b2b3711fa948518f497173e850fe19"}}],"startTimeUnixNano":"1739948581207620451","timeUnixNano":"1739948601207784807","count":"93","sum":438.06849422600135,"bucketCounts":["0","77","5","11","0","0","0","0","0","0","0","0","0","0","0","0"],"explicitBounds":[0,5,10,25,50,75,100,250,500,750,1000,2500,5000,7500,10000]}]}}]}]}]}}}
map[string]float64{"":18.659090909090903}

{"latency":{"":{"resourceMetrics":[{"resource":{},"scopeMetrics":[{"scope":{},"metrics":[{"histogram":{"dataPoints":[{"attributes":[{"key":"partition","value":{"stringValue":"0"}},{"key":"project_id","value":{"stringValue":"d8b2b3711fa948518f497173e850fe19"}}],"startTimeUnixNano":"1739948581207620451","timeUnixNano":"1739948611207250531","count":"137","sum":632.7414063080014,"bucketCounts":["0","119","7","11","0","0","0","0","0","0","0","0","0","0","0","0"],"explicitBounds":[0,5,10,25,50,75,100,250,500,750,1000,2500,5000,7500,10000]}]}}]}]}]}}}
map[string]float64{"":15.659090909090901}
  4. The aggregation should also appear in the apmbench output. For example:
BenchmarkAgentAll-4	     618	 828864153 ns/op	         4.750 latency ...
BenchmarkAgentAll-4	     427	 746870351 ns/op	     	 4.765 latency ...
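As a quick consistency check on the example output above (assuming the configured aggregation is a 95th percentile): for the first data point, count = 43 and the cumulative bucket counts reach 32 at bound 10 and 43 at bound 25, so linear interpolation gives 10 + ((0.95 * 43 - 32) / 11) * (25 - 10) ≈ 22.068, which matches the first map[string]float64 value printed above; the second and third data points work out the same way to ≈ 18.659 and ≈ 15.659.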

@ericywl (Contributor, Author) commented Feb 19, 2025

Will merge main into this branch after #230 is merged.

@ericywl ericywl marked this pull request as ready for review February 19, 2025 07:19
@ericywl ericywl requested a review from a team as a code owner February 19, 2025 07:19
@1pkg (Member) previously approved these changes Mar 8, 2025

@1pkg left a comment:

The code looks good and very clean; I left just a few nits.
Since it's a larger change and it's easy to miss something, I would recommend getting at least a couple of approvals before merging.
