
Bug: Port collision when running multiple models #636

Open
heaversm opened this issue Nov 22, 2024 · 0 comments
Contact Details

[email protected]

What happened?

llamafile-port-issue.mp4

Run one model in server mode on the default port (8080), then start a second model on the same port. Both report that they are serving, even though only one can actually own the port.

Expected behavior: llamafile should detect that the port is already in use and start the new server on a different port (e.g. 8081).
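For illustration, the fallback the report asks for amounts to probing ports until a bind succeeds. This is a minimal standalone sketch of that idea, not llamafile's actual server code; the function name and defaults are hypothetical:

```python
import socket

def find_open_port(start: int = 8080, max_tries: int = 20) -> int:
    """Return the first port at or above `start` that accepts a bind.

    Hypothetical sketch of the requested fallback behavior; llamafile's
    real implementation (if added) would live in its C/C++ server code.
    """
    for port in range(start, start + max_tries):
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            try:
                s.bind(("127.0.0.1", port))
            except OSError:
                continue  # port busy, e.g. another model already bound here
            return port  # bind succeeded; the socket is released on exit
    raise RuntimeError(f"no free port in {start}-{start + max_tries - 1}")
```

A second server instance would call this before binding, so it lands on 8081 (or the next free port) instead of silently colliding with the first instance on 8080. Note there is an inherent race between probing and the real bind, so a robust fix would bind directly and retry on `EADDRINUSE` rather than probe-then-bind.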

Version

v0.8.16

What operating system are you seeing the problem on?

Mac

Relevant log output

No response
