assertion failed: earliest > now #1490

Closed
KershawChang opened this issue Nov 7, 2023 · 4 comments · Fixed by #1491

Comments

@KershawChang
Collaborator

I hit this assertion when trying to land this patch.
According to the log below, this seems to be caused by the idle/keepalive timer: we called process_output twice, but the instant returned by the idle/keepalive timer did not change between the two calls.
Note that the delay returned from the first process_output was more than 1 ms, so Necko waited and didn't call process_output again immediately. I think this means the patch itself is not the culprit, but I'm not entirely sure.

[task 2023-11-07T17:01:59.631Z] 17:01:59     INFO - GECKO(2667) | [Parent 2667: Socket Thread]: I/neqo_transport::* [neqo_transport::connection] [Client b14e84fff18672cc] Get callback delay Instant { tv_sec: 4466, tv_nsec: 62214290 }
[task 2023-11-07T17:01:59.632Z] 17:01:59     INFO - GECKO(2667) | [Parent 2667: Socket Thread]: I/neqo_transport::* [neqo_transport::connection] [Client b14e84fff18672cc] Idle/keepalive timer Instant { tv_sec: 4467, tv_nsec: 707236065 }
[task 2023-11-07T17:01:59.632Z] 17:01:59     INFO - GECKO(2667) | [Parent 2667: Socket Thread]: I/neqo_transport::* [neqo_transport::connection] [Client b14e84fff18672cc] Loss recovery timer Instant { tv_sec: 4478, tv_nsec: 984261778 }
[task 2023-11-07T17:02:01.277Z] 17:02:01     INFO - GECKO(2667) | [Parent 2667: Socket Thread]: I/neqo_transport::* [neqo_transport::connection] [Client b14e84fff18672cc] Get callback delay Instant { tv_sec: 4467, tv_nsec: 707664199 }
[task 2023-11-07T17:02:01.278Z] 17:02:01     INFO - GECKO(2667) | [Parent 2667: Socket Thread]: I/neqo_transport::* [neqo_transport::connection] [Client b14e84fff18672cc] Idle/keepalive timer Instant { tv_sec: 4467, tv_nsec: 707236065 }
[task 2023-11-07T17:02:01.280Z] 17:02:01     INFO - GECKO(2667) | [Parent 2667: Socket Thread]: I/neqo_transport::* [neqo_transport::connection] [Client b14e84fff18672cc] Loss recovery timer Instant { tv_sec: 4478, tv_nsec: 984261778 }
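
For illustration only, here is a minimal, self-contained Rust sketch (not the neqo code; the `callback_delay` helper and the timer values are made up) of why `assert!(earliest > now)` can fire when the idle/keepalive timer instant does not advance between two process_output calls while wall-clock time does:

```rust
use std::time::{Duration, Instant};

// The callback deadline is the minimum of several timers. If the
// idle/keepalive timer instant does not move between two calls while time
// advances, the second call can see `earliest <= now` and the assertion fires.
fn callback_delay(timers: &[Instant], now: Instant) -> Duration {
    let earliest = *timers.iter().min().expect("at least one timer");
    // Mirrors the assertion that panics in neqo_transport::connection.
    assert!(earliest > now, "assertion failed: earliest > now");
    earliest - now
}

fn main() {
    let start = Instant::now();
    let idle_keepalive = start + Duration::from_millis(500);
    let loss_recovery = start + Duration::from_secs(11);

    // First call: fine, the earliest timer is still in the future.
    let _ = callback_delay(&[idle_keepalive, loss_recovery], start);

    // Second call, nominally ~1.6 s later, with the idle timer unchanged:
    // the idle timer is now in the past, so this panics.
    let later = start + Duration::from_millis(1600);
    let _ = callback_delay(&[idle_keepalive, loss_recovery], later);
}
```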
@KershawChang
Collaborator Author

Oops, I just noticed that I forgot to address the review comment about using max(data.last_output_time, Instant::now()) for process_input. I'll add that and test again to see whether it fixes this.
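
Roughly, that idea looks like the following hedged sketch (not the actual Gecko code; `last_output_time` is a hypothetical value standing in for the timestamp recorded at the previous process_output call): never hand process_input a time that is earlier than the time already used for process_output.

```rust
use std::cmp::max;
use std::time::Instant;

// Clamp the timestamp passed to process_input so it never goes backwards
// relative to the timestamp used for the last process_output call.
fn input_timestamp(last_output_time: Instant) -> Instant {
    max(last_output_time, Instant::now())
}

fn main() {
    let last_output_time = Instant::now();
    let now = input_timestamp(last_output_time);
    // The timestamp is monotone with respect to the last output time.
    assert!(now >= last_output_time);
}
```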

@KershawChang
Collaborator Author

I've accidentally found a way to reproduce this reliably. I'll submit a test to demonstrate how this happened.

@larseggert linked a pull request on Feb 1, 2024 that will close this issue
@larseggert
Collaborator

I have another reproducer: in #1875, the idle_timeout_crazy_rtt test currently fails with this every time.

@larseggert
Collaborator

The issue in idle_timeout_crazy_rtt in #1875 seems to be that one of the timer deadlines (per the log below, the loss recovery/PTO timer) is the same as now:

0s 26ms TRACE [Server a437d00156e81143] Get callback delay Instant { tv_sec: 5599618, tv_nsec: 275458791 }
0s 26ms TRACE ack_time for ap = None
0s 26ms TRACE ack_time for hs = None
0s 26ms TRACE ack_time for in = None
0s 26ms TRACE IdleTimeout::expiry@Instant { tv_sec: 5599618, tv_nsec: 275458791 } pto=325ms, ka=false => Instant { tv_sec: 610399616, tv_nsec: 775458791 }
0s 26ms TRACE [Server a437d00156e81143] Idle/keepalive timer Instant { tv_sec: 610399616, tv_nsec: 775458791 }
0s 26ms TRACE [LossRecovery] next_timeout loss=None pto=Some(Instant { tv_sec: 5599618, tv_nsec: 275458791 })
0s 26ms TRACE [Server a437d00156e81143] Loss recovery timer Instant { tv_sec: 5599618, tv_nsec: 275458791 }
thread 'idle_timeout_crazy_rtt' panicked at neqo-transport/src/connection/mod.rs:1067:9:
assertion failed: earliest > now
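
A hedged sketch of the failure mode, and one possible way to tolerate it (not necessarily the fix taken in the linked PR): when a timer deadline is exactly equal to now, `earliest > now` cannot hold, so returning a zero delay instead of asserting would simply make the caller invoke process_output again right away.

```rust
use std::time::{Duration, Instant};

// Hypothetical helper: report how long to wait until the earliest deadline.
fn callback_delay(earliest: Instant, now: Instant) -> Duration {
    // saturating_duration_since returns Duration::ZERO when earliest <= now,
    // instead of panicking like the current assertion.
    earliest.saturating_duration_since(now)
}

fn main() {
    let now = Instant::now();
    // A deadline equal to `now` yields a zero delay rather than a panic.
    assert_eq!(callback_delay(now, now), Duration::ZERO);
    assert_eq!(
        callback_delay(now + Duration::from_millis(5), now),
        Duration::from_millis(5)
    );
}
```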
