candle-vllm

Efficient platform for inference and serving of local LLMs, including an OpenAI-compatible API server.
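Because the server exposes an OpenAI-compatible API, any OpenAI-style client can talk to it by pointing at the local base URL. As a minimal sketch, the request body below follows the standard chat-completions format; the port, path, and model name are placeholders, not taken from candle-vllm's documentation:

```python
import json

# Hypothetical local endpoint; the actual host/port depends on how the
# server is launched.
BASE_URL = "http://localhost:2000/v1/chat/completions"

# An OpenAI-style chat-completions payload. Any HTTP client (or the
# official OpenAI SDK with a custom base URL) could POST this body.
payload = {
    "model": "llama-7b",  # placeholder model identifier
    "messages": [
        {"role": "user", "content": "Hello!"},
    ],
    "temperature": 0.7,
}

# Serialize to JSON, as it would be sent in the request body.
body = json.dumps(payload)
print(body)
```

In practice the same payload works against any server implementing the OpenAI chat-completions schema, which is the point of exposing a compatible API.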

Statistics

  • Stargazers: 574
  • Forks: 66
  • Watchers: 574
  • Open Issues: 37
  • Contributors: 8

Details

  • License: MIT License
  • Created: Oct 29, 2023
  • Last Commit: Jan 15, 2026

Quality Indicators

  • README
  • License
  • CI/CD
  • Documentation