feat(presence): Add support for signal batching #23075
base: main
Conversation
Code Coverage Summary
↓ packages.framework.presence.src:
Line Coverage Change: 0.43% Branch Coverage Change: -3.29%
| Metric Name | Baseline coverage | PR coverage | Coverage Diff |
| --- | --- | --- | --- |
| Branch Coverage | 85.18% | 81.89% | ↓ -3.29% |
| Line Coverage | 69.69% | 70.12% | ↑ 0.43% |
Baseline commit: 136a519
Baseline build: 307305
Happy Coding!!
Code coverage comparison check failed!!
More Details: Readme
- [ ] Skip This Check!!
What to do if the code coverage check fails:

- Ideally, add more tests to increase the code coverage for the package(s) whose code coverage regressed.
- If a regression is causing the build to fail and is due to removal of tests, removal of code with lots of tests, or another valid reason, there is a checkbox further up in this comment that determines whether the code coverage check should fail the build. You can check the box and trigger the build again. The test coverage analysis will still run, but it will not fail the build if a regression is detected.
- Unchecking the checkbox and triggering another build goes back to failing the build if a test-coverage regression is detected.
- You can check which lines are or are not covered by your tests with these steps:
  - Go to the PR ADO build.
  - Click on the link to see its published artifacts. You will see an artifact named `codeCoverageAnalysis`, which you can expand to reach a particular source file's coverage HTML showing which lines are covered or not covered by your tests.
  - You can also run different kinds of tests locally with the `:coverage` test commands to find out the coverage.
⯅ @fluid-example/bundle-size-tests: +245 Bytes
Baseline commit: 3b51758
test(client-presence): rework Notifications tests

- move workspace create to common setup
- add more test cases
- rework emit test without spies
- remove console logging
- remove compile coverage as part of regular tests
```typescript
[
	"Pres:DatastoreUpdate",
	{
		"sendTimestamp": 1030,
		"avgLatency": 10,
		"data": {
			"system:presence": {
				"clientToSessionId": {
					"client2": { "rev": 0, "timestamp": 1000, "value": "sessionId-2" },
				},
			},
			"s:name:testStateWorkspace": {
				"data": {
					"sessionId-2": { "rev": 2, "timestamp": 1030, "value": { "num": 65 } },
				},
			},
		},
	},
],
```
If we drop the two processSignal calls, then I think this is the only signal expected once batching is implemented. The sendTimestamp should be 1070 (60 ms past when the first local update was set at 1010).
The test will also need a clock advance after the last local set. I think you can clock.tick() a large number (e.g., 1000) and the timer would still fire at the right time.
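A minimal sketch of the clock-advance pattern being suggested here, assuming the tests run under mocha with sinon fake timers; the bare setTimeout stands in for the batching timer, and all timestamps are illustrative:

```typescript
import { strict as assert } from "node:assert";
import * as sinon from "sinon";

describe("signal batching timing (sketch)", () => {
	let clock: sinon.SinonFakeTimers;

	beforeEach(() => {
		clock = sinon.useFakeTimers();
	});

	afterEach(() => {
		clock.restore();
	});

	it("fires the batch timer at its deadline even after a large tick", () => {
		let firedAt = -1;
		// Stand-in for the batching timer: last local set at t=1010 with a
		// 60 ms allowable latency gives a deadline of 1070.
		clock.tick(1010);
		setTimeout(() => {
			firedAt = Date.now();
		}, 60);
		// Advancing far past the deadline is safe: fake timers run each
		// timer at its scheduled time, so the callback sees t=1070, not 2010.
		clock.tick(1000);
		assert.equal(firedAt, 1070);
	});
});
```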
This is an in-progress change to add signal batching to presence.
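As a rough, hypothetical illustration of the batching idea (the names below, such as SignalBatcher and enqueue, are invented for this sketch and are not the PR's actual implementation): local updates are queued with a deadline derived from their allowableUpdateLatency, later updates to the same key coalesce into the pending batch, and a single signal is sent when the earliest deadline fires.

```typescript
// Hypothetical sketch of deadline-based signal batching; illustrative only.
interface QueuedUpdate {
	key: string;
	value: unknown;
	deadline: number; // enqueue time + allowableUpdateLatency
}

class SignalBatcher {
	private readonly queue = new Map<string, QueuedUpdate>();
	private timer: ReturnType<typeof setTimeout> | undefined;
	private earliestDeadline = Number.POSITIVE_INFINITY;

	public constructor(private readonly send: (updates: QueuedUpdate[]) => void) {}

	public enqueue(key: string, value: unknown, allowableUpdateLatency: number): void {
		const deadline = Date.now() + allowableUpdateLatency;
		// A later update to the same key replaces the earlier one in the batch.
		this.queue.set(key, { key, value, deadline });
		// A shorter deadline pulls the flush earlier; a longer one never delays it.
		if (deadline < this.earliestDeadline) {
			this.earliestDeadline = deadline;
			if (this.timer !== undefined) {
				clearTimeout(this.timer);
			}
			this.timer = setTimeout(() => this.flush(), allowableUpdateLatency);
		}
	}

	private flush(): void {
		const updates = [...this.queue.values()];
		this.queue.clear();
		this.timer = undefined;
		this.earliestDeadline = Number.POSITIVE_INFINITY;
		this.send(updates);
	}
}
```

Under this sketch the batched signal goes out at the earliest queued deadline, which matches the review comment above expecting a sendTimestamp of 1070.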
Status
refactor(presence): Convert localUpdate to accept an options object
The localUpdate function now takes an options object instead of just forceBroadcast. There is also a new optional allowableUpdateLatency value, which will be used in the batching/throttling logic.
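A hedged sketch of what the refactored signature might look like; the option names come from this description, while the defaults, the stand-in functions, and the body are assumptions, not the PR's actual code:

```typescript
// Hypothetical shape of the refactored options object.
interface LocalUpdateOptions {
	/** When true, send immediately, bypassing batching. */
	forceBroadcast?: boolean;
	/** Maximum time in milliseconds this update may be delayed for batching. */
	allowableUpdateLatency?: number;
}

declare function sendSignalNow(update: unknown): void; // stand-in
declare function queueForBatch(update: unknown, latencyMs: number): void; // stand-in

// Before: localUpdate(update, /* forceBroadcast */ true)
// After:  localUpdate(update, { forceBroadcast: true })
function localUpdate(update: unknown, options: LocalUpdateOptions = {}): void {
	const { forceBroadcast = false, allowableUpdateLatency } = options;
	if (forceBroadcast || allowableUpdateLatency === undefined) {
		sendSignalNow(update);
	} else {
		queueForBatch(update, allowableUpdateLatency);
	}
}
```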
test cases

There are three test cases currently:
The test cases themselves are now more compact because of the snapshots. I added inline comments with the time and expected deadline for each operation.
Each test has two signals at the beginning for the join and the initial send of the workspaces, so the real signals for each test start with signal 3.
pending test cases
These test cases haven't been written yet: