.NET AI integration

Today’s AI landscape moves fast and providers differ widely, so vendor lock-in can become expensive. You need a clean, testable way to add AI without tying your architecture to one SDK.

The solution is a model-agnostic abstraction layer.

NuGet packages to use (some are still in preview, so you’ll need to include prerelease versions to see them):

  • Microsoft.Extensions.AI - This package provides the IChatClient interface, an abstraction over multiple LLM providers, from ChatGPT to Ollama (a wiring sketch follows this list).
  • Microsoft.Extensions.AI.OpenAI
  • OllamaSharp (previously Microsoft.Extensions.AI.Ollama)
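A minimal sketch of how these packages fit together: the OpenAI SDK client is wrapped as an IChatClient, and the rest of the code only sees the abstraction. The model id, the environment variable, and the AsIChatClient()/GetResponseAsync() names are assumptions to check against the installed package versions, since these names have shifted between preview releases.

```csharp
// Sketch: wire the OpenAI SDK to the IChatClient abstraction and send a prompt.
// Assumptions: an OPENAI_API_KEY environment variable, the "gpt-4o-mini" model id,
// and adapter/extension names from a recent Microsoft.Extensions.AI.OpenAI release.
using Microsoft.Extensions.AI;
using OpenAI;

var apiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY")
             ?? throw new InvalidOperationException("OPENAI_API_KEY is not set.");

// Microsoft.Extensions.AI.OpenAI adapts the OpenAI SDK chat client to IChatClient.
IChatClient client = new OpenAIClient(apiKey)
    .GetChatClient("gpt-4o-mini")
    .AsIChatClient();

// The rest of the application depends only on IChatClient, not on a provider SDK.
var response = await client.GetResponseAsync("In one sentence, what does IChatClient abstract?");
Console.WriteLine(response.Text);
```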

You’ll need to go to https://platform.openai.com/ to set up a project, configure billing, and get an OpenAI API key.

This repository is a test implementation that connects to OpenAI’s ChatGPT and can send prompts.
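Because the calling code depends only on IChatClient, pointing the same prompt-sending logic at a local model is essentially a construction change. A sketch assuming a local Ollama instance on the default port with a pulled "llama3" model; the OllamaApiClient constructor overloads may differ slightly between OllamaSharp versions.

```csharp
// Provider-swap sketch: OllamaSharp's OllamaApiClient implements IChatClient directly,
// so only the construction line changes. Assumes Ollama is running locally on the
// default port and the "llama3" model has been pulled.
using Microsoft.Extensions.AI;
using OllamaSharp;

IChatClient client = new OllamaApiClient(new Uri("http://localhost:11434"), "llama3");

var response = await client.GetResponseAsync("Explain vendor lock-in in one sentence.");
Console.WriteLine(response.Text);
```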

Best Practices

  • Keep inputs short and specific
  • Validate outputs with a regex or JSON schema; reject or re-ask when invalid (sketched below)
  • Log prompts, token counts, latency, and provider responses
  • Control costs: cache results, batch requests, and prefer smaller models by default
  • Don’t commit or send secrets or personal information
  • Failover: implement timeouts, retries, and fallback models (sketched below)
  • LLMs are stateless; maintaining and reconstructing conversational context (chat history or memory abstractions) is the developer’s responsibility (sketched below)
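A sketch of the validate-then-re-ask pattern from the list above: request JSON, try to parse it, and retry once before giving up. The prompt suffix and the single retry are illustrative choices, and the GetResponseAsync/Text names assume a recent Microsoft.Extensions.AI release.

```csharp
// Output-validation sketch: ask for JSON, parse it, and re-ask once when invalid.
using System.Text.Json;
using Microsoft.Extensions.AI;

static async Task<JsonDocument> AskForJsonAsync(IChatClient client, string prompt)
{
    for (var attempt = 0; attempt < 2; attempt++)
    {
        var response = await client.GetResponseAsync(
            prompt + "\nRespond with valid JSON only, no prose.");
        try
        {
            // JsonDocument.Parse throws JsonException on malformed output.
            return JsonDocument.Parse(response.Text);
        }
        catch (JsonException)
        {
            // Invalid output: loop once more with the same instruction, then fail.
        }
    }
    throw new InvalidOperationException("The model did not return valid JSON.");
}
```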
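The failover item can be as simple as a per-call timeout, one retry against the primary client, and a fallback client (for example, a smaller or local model). Plain C# with no resilience library assumed; the 30-second timeout and the exception filter are illustrative.

```csharp
// Failover sketch: timeout per call, one retry on the primary client, then a fallback.
using System.Net.Http;
using Microsoft.Extensions.AI;

static async Task<string> AskWithFailoverAsync(
    IChatClient primary, IChatClient fallback, string prompt)
{
    // Try the primary twice (one retry), then the fallback once.
    foreach (var client in new[] { primary, primary, fallback })
    {
        try
        {
            using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(30));
            var response = await client.GetResponseAsync(prompt, cancellationToken: cts.Token);
            return response.Text;
        }
        catch (Exception ex) when (ex is OperationCanceledException or HttpRequestException)
        {
            // Log the failure here and move on to the next attempt.
        }
    }
    throw new InvalidOperationException("All chat clients failed.");
}
```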
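Because each call is independent, the caller carries the conversation. A sketch assuming the ChatMessage/ChatRole types from Microsoft.Extensions.AI and a GetResponseAsync overload that accepts a message list.

```csharp
// Statelessness sketch: keep the history yourself and resend it on every turn.
using Microsoft.Extensions.AI;

var history = new List<ChatMessage>
{
    new(ChatRole.System, "You are a concise assistant.")
};

async Task<string> AskAsync(IChatClient client, string userInput)
{
    history.Add(new ChatMessage(ChatRole.User, userInput));

    // The full history goes out with every request; the model itself remembers nothing.
    var response = await client.GetResponseAsync(history);

    // Append the assistant turn so the next call sees the whole conversation.
    history.Add(new ChatMessage(ChatRole.Assistant, response.Text));
    return response.Text;
}
```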

Security

  • Prompt injection: beware of malicious prompts crafted to subvert model guardrails, exfiltrate data, or trigger unintended actions
  • Data leakage: LLMs may leak private or internal data via crafted prompts
  • Training data poisoning: malicious actors may inject tainted data into training or fine-tuning sets
  • DoS and rate limiting: protect against overuse and abuse

Reference(s)

https://roxeem.com/2025/09/04/the-practical-net-guide-to-ai-llm-introduction/
https://roxeem.com/2025/09/08/how-to-correctly-build-ai-features-in-dotnet/