LLMCore

Qt/C++ library for integrating LLM providers into Qt applications. Provides a unified streaming API across multiple backends with built-in tool calling support.

Note: The library does not cover the full API surface of each provider. Missing features will be added over time.

Supported Providers

Provider                    Client class
--------------------------  ---------------------
Anthropic Claude            ClaudeClient
OpenAI (Chat Completions)   OpenAIClient
OpenAI (Responses API)      OpenAIResponsesClient
Ollama                      OllamaClient
Google AI (Gemini)          GoogleAIClient
llama.cpp                   LlamaCppClient

Requirements

  • C++20
  • Qt 6.5+
  • CMake 3.21+
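Given the CMake requirement above, one way to pull the library into a Qt project is CMake's FetchContent. This is a minimal sketch, not taken from the project's own documentation: the exported target name `LLMCore` and the `main` branch tag are assumptions, so check the repository's CMakeLists.txt before relying on them.

```cmake
# Minimal FetchContent sketch (assumed target name; verify against the repo).
include(FetchContent)

FetchContent_Declare(
  llmcore
  GIT_REPOSITORY https://github.com/Palm1r/llmcore.git
  GIT_TAG        main  # assumption: pin a released tag in real use
)
FetchContent_MakeAvailable(llmcore)

# "LLMCore" is a guess at the exported target; adjust if the project
# exports a different name.
target_link_libraries(my_app PRIVATE LLMCore)
```

With this in place, the library's headers and link flags propagate to `my_app` through the usual CMake target mechanism.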

Documentation

Support

  • Report Issues: open an issue on GitHub
  • Contribute: pull requests with bug fixes or new features are welcome
  • Spread the Word: star the repository and share with fellow developers
  • Financial Support:
    • Bitcoin (BTC): bc1qndq7f0mpnlya48vk7kugvyqj5w89xrg4wzg68t
    • Ethereum (ETH): 0xA5e8c37c94b24e25F9f1f292a01AF55F03099D8D
    • Litecoin (LTC): ltc1qlrxnk30s2pcjchzx4qrxvdjt5gzuervy5mv0vy
    • USDT (TRC20): THdZrE7d6epW6ry98GA3MLXRjha1DjKtUx

License

MIT — see LICENSE.
