Conversation

@Abatom Abatom commented Oct 24, 2024

  1. Add metrics for request queue time, forward time, and execute time, exposed through the /metrics API.
  2. Remove the restriction that collect_model_forward_time and collect_model_execute_time require the --otlp-traces-endpoint flag, so the metrics can also collect model_forward_time and model_execute_time.
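As a rough illustration of what these /metrics histograms record, here is a toy Prometheus-style histogram in plain Python. This is not vLLM's actual implementation (which builds on the prometheus_client library); the class name, metric name, and bucket bounds below are made up for the sketch.

```python
# Toy sketch of a Prometheus-style latency histogram (NOT vLLM's code).
# The metric name and bucket bounds are illustrative only.
import bisect

class LatencyHistogram:
    """Cumulative le-buckets plus a running sum and count."""

    def __init__(self, name: str, buckets: list[float]) -> None:
        self.name = name
        self.buckets = sorted(buckets)
        self.counts = [0] * (len(self.buckets) + 1)  # last slot is +Inf
        self.sum = 0.0
        self.total = 0

    def observe(self, seconds: float) -> None:
        # bisect_left finds the first bucket whose bound is >= the value,
        # matching Prometheus's le (less-than-or-equal) semantics.
        idx = bisect.bisect_left(self.buckets, seconds)
        self.counts[idx] += 1
        self.sum += seconds
        self.total += 1

    def expose(self) -> str:
        """Render in the Prometheus text format served at /metrics."""
        lines, cumulative = [], 0
        for bound, count in zip(self.buckets + [float("inf")], self.counts):
            cumulative += count
            le = "+Inf" if bound == float("inf") else repr(bound)
            lines.append(f'{self.name}_bucket{{le="{le}"}} {cumulative}')
        lines.append(f"{self.name}_sum {self.sum}")
        lines.append(f"{self.name}_count {self.total}")
        return "\n".join(lines)

# Queue time is first_scheduled_time - arrival_time for each request.
queue_time = LatencyHistogram("vllm_request_queue_time_seconds",
                              [0.1, 1.0, 5.0])
queue_time.observe(0.42)
```

Forward time and execute time would get their own histograms with the same shape, each observed once per request.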
@github-actions

👋 Hi! Thank you for contributing to the vLLM project.
Just a reminder: PRs do not trigger a full CI run by default. Instead, only fastcheck CI runs, covering a small and essential subset of CI tests to quickly catch errors. You can run other CI tests on top of those by going to your fastcheck build on the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run CI, PR reviewers can do one of these:

  • Add ready label to the PR
  • Enable auto-merge.

🚀

@Abatom Abatom changed the title [Misc]Add histogram metrics for both model_execute_time and time_in_queue. [Misc]Add histogram metrics for both latency for requests executing and waiting in queue Oct 24, 2024
@Abatom Abatom changed the title [Misc]Add histogram metrics for both latency for requests executing and waiting in queue [Misc] Add latency histogram metrics for request queuing and execution time Oct 24, 2024

Abatom commented Oct 25, 2024

@WoosukKwon, Could you help me review the code?

@Abatom Abatom changed the title [Misc] Add latency histogram metrics for request queuing and execution time [Misc] Add metrics for request queue time, forward time, and execute time Oct 25, 2024

Abatom commented Oct 25, 2024

@njhill, could you help me review the code? Thanks!


Abatom commented Oct 25, 2024

@youkaichao, could you help me review the code? Thanks!


mgoin commented Oct 25, 2024

@Abatom Abatom requested a review from simon-mo October 26, 2024 04:01
@simon-mo simon-mo added the ready ONLY add when PR is ready to merge/full CI is needed label Oct 28, 2024
@Abatom Abatom requested a review from simon-mo October 29, 2024 17:05
@simon-mo simon-mo merged commit 74fc2d7 into vllm-project:main Oct 29, 2024
68 of 71 checks passed
rasmith pushed a commit to rasmith/vllm that referenced this pull request Oct 30, 2024
…time (vllm-project#9659) Signed-off-by: Randall Smith <Randall.Smith@amd.com>
JC1DA pushed a commit to JC1DA/vllm that referenced this pull request Nov 11, 2024
sumitd2 pushed a commit to sumitd2/vllm that referenced this pull request Nov 14, 2024
…time (vllm-project#9659) Signed-off-by: Sumit Dubey <sumit.dubey2@ibm.com>
sleepwalker2017 pushed a commit to sleepwalker2017/vllm that referenced this pull request Dec 13, 2024
markmc added a commit to markmc/vllm that referenced this pull request Mar 3, 2025
vllm:time_in_queue_requests appears to be an exact duplicate of vllm:request_queue_time_seconds. Both record `first_scheduled_time - arrival_time`:

```python
if seq_group.is_finished():
    time_queue_requests.append(
        seq_group.metrics.first_scheduled_time -
        seq_group.metrics.arrival_time)
```

```python
def maybe_set_first_scheduled_time(self, time: float) -> None:
    if self.metrics.first_scheduled_time is None:
        self.metrics.first_scheduled_time = time
        self.metrics.time_in_queue = time - self.metrics.arrival_time
```

vllm:time_in_queue_requests was added by vllm-project#9659 and vllm:request_queue_time_seconds was later added by vllm-project#4464; however, neither existed when the other PR was first created. The latter seems like the right one to keep since it is implemented in V1, used in the Grafana dashboard, and has test coverage.

Signed-off-by: Mark McLoughlin <markmc@redhat.com>
markmc added a commit to markmc/vllm that referenced this pull request Mar 3, 2025
Metrics originally added by vllm-project#9659. These seem to be of questionable value relative to the existing prefill, decode, and inference time metrics. Since they would be challenging to implement in V1, and they do not conform to the standard of using seconds as units, let's deprecate them.

Signed-off-by: Mark McLoughlin <markmc@redhat.com>
LeiWang1999 pushed a commit to LeiWang1999/vllm-bitblas that referenced this pull request Mar 26, 2025
…time (vllm-project#9659) Signed-off-by: LeiWang1999 <leiwang1999@outlook.com>
@Abatom Abatom deleted the metrics branch June 25, 2025 02:23
