
Implement Pagination with Limit for getProgressDocument #2319

Open
6 of 10 tasks
AnujChhikara opened this issue Dec 27, 2024 · 0 comments · May be fixed by #2328 or #2325

Comments

@AnujChhikara
Contributor

Issue Description

The getProgressDocument function retrieves all matching records in a single request, which can cause performance issues when handling large datasets. This behavior needs to be improved by implementing pagination and a configurable limit parameter.

Expected Behavior

  • The API should allow clients to request data in manageable chunks, with a default limit of 100 records per request.
  • Clients should be able to specify page and limit parameters to fetch subsequent records.
  • The response should include metadata (count, page, limit) to facilitate pagination on the client side.
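The expected response shape could be sketched roughly as follows. This is a minimal illustration, not the project's actual code; the paginate helper and field names are assumptions.

```typescript
// Hypothetical pagination helper; names (paginate, PaginatedResponse) are
// illustrative and not taken from the actual codebase.
interface PaginatedResponse<T> {
  data: T[];   // the requested page of records
  count: number; // total number of matching records
  page: number;  // current page (1-based)
  limit: number; // records per page
}

const DEFAULT_LIMIT = 100; // default of 100 records per request, per the issue

function paginate<T>(
  records: T[],
  page = 1,
  limit = DEFAULT_LIMIT
): PaginatedResponse<T> {
  // Compute the slice boundaries for the requested page.
  const start = (page - 1) * limit;
  return {
    data: records.slice(start, start + limit),
    count: records.length,
    page,
    limit,
  };
}
```

For example, with 250 matching records, page 2 at the default limit would return records 101–200 along with count, page, and limit so the client can tell whether more pages remain.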

Current Behavior

  • All matching records are retrieved and sent in a single response.
  • This approach can lead to excessive memory usage and slow performance, especially when dealing with large datasets.
  • The API currently lacks support for pagination and limits.
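To avoid the memory cost described above, the limit would ideally be applied in the datastore query itself rather than after all records are loaded into memory. As a hedged illustration only (the table and column names are hypothetical, and the project's actual storage layer may differ), a SQL-style query builder could look like:

```typescript
// Hypothetical query builder: pushes pagination into the database instead of
// slicing an in-memory array. Table and column names are illustrative only.
const DEFAULT_LIMIT = 100;

function buildProgressQuery(
  page = 1,
  limit = DEFAULT_LIMIT
): { sql: string; params: number[] } {
  // Number of records to skip before the requested page.
  const offset = (page - 1) * limit;
  return {
    sql: "SELECT * FROM progresses ORDER BY created_at DESC LIMIT ? OFFSET ?",
    params: [limit, offset],
  };
}
```

With this approach, page 2 at the default limit binds parameters [100, 100], so the database returns only the second hundred rows instead of all 3,000+.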

Screenshots

[screenshot: API response returning all 3,000+ progress records]

Reproducibility

  • This issue is reproducible
  • This issue is not reproducible

Steps to Reproduce

  • Hit the API endpoint.
  • Observe that the response returns every progress document in a single payload (3,000+ records).

Severity/Priority

  • Critical
  • High
  • Medium
  • Low

Additional Information

Checklist

  • I have read and followed the project's code of conduct.
  • I have searched for similar issues before creating this one.
  • I have provided all the necessary information to understand and reproduce the issue.
  • I am willing to contribute to the resolution of this issue.