
candle-vllm

Efficient platform for inference and serving of local LLMs, including an OpenAI-compatible API server.
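Because the server exposes an OpenAI-compatible API, any standard chat-completion request body should work against it. A minimal sketch of such a request follows; the base URL, port, and model name are assumptions for illustration, not values taken from this page:

```python
import json

# Hypothetical local endpoint; candle-vllm's actual host/port depend
# on how the server is launched (assumption, not from this page).
BASE_URL = "http://localhost:2000/v1/chat/completions"

def build_chat_request(model: str, user_message: str, max_tokens: int = 128) -> dict:
    """Build the JSON body expected by an OpenAI-compatible
    /v1/chat/completions endpoint."""
    return {
        "model": model,  # model name is whatever the server was started with
        "messages": [{"role": "user", "content": user_message}],
        "max_tokens": max_tokens,
    }

body = build_chat_request("llama-7b", "Hello!")
print(json.dumps(body, indent=2))
```

The same body can be sent with any HTTP client (or the official `openai` Python package pointed at the local base URL), since the request and response shapes follow the OpenAI chat-completions format.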

Statistics

  • Stargazers: 562
  • Forks: 64
  • Watchers: 562
  • Open Issues: 37

Details

  • License: MIT License
  • Created: Oct 29, 2023
  • Last Commit: 2 days ago

Quality Indicators

  • README
  • License
  • CI/CD
  • Documentation