
[exporterqueue] Default Batcher that reads from the queue, batches and exports #11507

Closed

Conversation

sfc-gh-sili
Contributor

@sfc-gh-sili sfc-gh-sili commented Oct 22, 2024

Description

This PR implements a new component that pulls from `queue_sender`. This component will replace the `consumers` in `queue_sender`.

The idea is that instead of allocating a group of reading goroutines and blocking them until the corresponding batch gets flushed, a goroutine now reads an item, spawns a new goroutine to continue reading, and then flushes on the goroutine it read on.

Design doc:
https://docs.google.com/document/d/1y5jt7bQ6HWt04MntF8CjUwMBBeNiJs2gV4uUZfJjAsE/edit?usp=sharing
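The pull model described above can be sketched roughly as follows (hypothetical types and names for illustration only, not the actual collector API): each reader goroutine dequeues one item, immediately spawns a replacement reader, and then performs the export itself, so no goroutine sits blocked waiting for a flush.

```go
package main

import (
	"fmt"
	"sync"
	"sync/atomic"
)

// drainQueue feeds items through a channel-backed queue and returns the
// number of items exported by the chain of reader goroutines.
func drainQueue(items []string) int {
	q := make(chan string, len(items))
	for _, it := range items {
		q <- it
	}
	close(q)

	var exported atomic.Int64
	var wg sync.WaitGroup

	var startReader func()
	startReader = func() {
		wg.Add(1)
		go func() {
			defer wg.Done()
			item, ok := <-q // read one item from the queue
			if !ok {
				return // queue closed: do not spawn a replacement
			}
			startReader()    // hand off reading to a fresh goroutine
			_ = item         // the "flush" happens on this goroutine
			exported.Add(1)
		}()
	}

	startReader()
	wg.Wait()
	return int(exported.Load())
}

func main() {
	fmt.Println("exported", drainQueue([]string{"a", "b", "c"}), "items")
	// → exported 3 items
}
```

At any moment at most one goroutine is blocked on the channel receive, while exports of earlier items can proceed concurrently on their own goroutines.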

Link to tracking issue

#8122
#10368

Testing

Documentation

@sfc-gh-sili sfc-gh-sili changed the title Batcher [exporterqueue] Queue Batcher that reads from the queue, batches and exports Oct 22, 2024
@sfc-gh-sili sfc-gh-sili marked this pull request as ready for review October 22, 2024 08:18
@sfc-gh-sili sfc-gh-sili requested a review from a team as a code owner October 22, 2024 08:18

codecov bot commented Oct 22, 2024

Codecov Report

Attention: Patch coverage is 81.46718% with 48 lines in your changes missing coverage. Please review.

Project coverage is 91.27%. Comparing base (2efeae4) to head (8d9f8a3).
Report is 42 commits behind head on main.

Files with missing lines Patch % Lines
exporter/internal/queue/batcher.go 81.52% 27 Missing and 7 partials ⚠️
exporter/internal/queue/fake_request.go 81.33% 11 Missing and 3 partials ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main   #11507      +/-   ##
==========================================
- Coverage   91.41%   91.27%   -0.14%     
==========================================
  Files         433      435       +2     
  Lines       23657    23918     +261     
==========================================
+ Hits        21625    21831     +206     
- Misses       1658     1701      +43     
- Partials      374      386      +12     


@sfc-gh-sili sfc-gh-sili marked this pull request as draft October 23, 2024 22:33
dmitryax pushed a commit that referenced this pull request Oct 26, 2024

#### Description

This PR is a bare minimum implementation of a component called queue batcher. On completion, this component will replace `consumers` in `queue_sender`, thus moving queue-batch to a pull model instead of a push model.

Limitations of the current code:
* It implements only the case where batching is disabled, which means no merging or splitting of requests and no timeout flushing.
* It does not enforce an upper bound on concurrency.

All of these code paths currently panic; they will be replaced with actual implementations in coming PRs. This PR is split from
#11507 for easier review.

Design doc:

https://docs.google.com/document/d/1y5jt7bQ6HWt04MntF8CjUwMBBeNiJs2gV4uUZfJjAsE/edit?usp=sharing


#### Link to tracking issue

#8122
#10368
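The "marked as panic" placeholders mentioned above might look roughly like this (hypothetical config fields and constructor for illustration; the real code lives in `exporter/internal/queue/batcher.go`): the unimplemented paths fail loudly rather than silently misbehaving.

```go
package main

import "fmt"

// batcherConfig mirrors the two unimplemented paths described above:
// batching itself, and an upper bound on flush concurrency.
type batcherConfig struct {
	batchingEnabled bool // merge/split of requests + timeout flushing
	maxConcurrency  int  // 0 means unbounded
}

// newBatcher returns a description of the batcher it would build,
// panicking on the code paths that are not implemented yet.
func newBatcher(cfg batcherConfig) string {
	if cfg.batchingEnabled {
		panic("batching not implemented yet")
	}
	if cfg.maxConcurrency > 0 {
		panic("concurrency limit not implemented yet")
	}
	return "disabled-batching batcher with unbounded concurrency"
}

func main() {
	fmt.Println(newBatcher(batcherConfig{}))
	// → disabled-batching batcher with unbounded concurrency
}
```

Panicking on unreachable-for-now configurations keeps the initial PR small while making it impossible to ship a build that quietly drops the batching behavior.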
@sfc-gh-sili sfc-gh-sili changed the title [exporterqueue] Queue Batcher that reads from the queue, batches and exports [exporterqueue] Default Batcher that reads from the queue, batches and exports Oct 29, 2024
@sfc-gh-sili sfc-gh-sili marked this pull request as ready for review October 29, 2024 21:04
@sfc-gh-sili sfc-gh-sili marked this pull request as draft October 29, 2024 21:05
djaglowski pushed a commit to djaglowski/opentelemetry-collector that referenced this pull request Nov 21, 2024