# LogicGrid

Multi-agent AI orchestration for .NET.

Local LLMs first. Provider-agnostic. Zero vendor lock-in. Build agents with `Agent<T>`, compose them through admins, and observe everything by default.
```csharp
using LogicGrid.Core.Agents;
using LogicGrid.Core.Llm;

var llm = LlmClientBase.Ollama("llama3.2");

IAgent agent = new Agent<string>(
    name: "Helper",
    description: "Answers questions concisely.",
    systemPrompt: "Answer in one short sentence.",
    llm: llm);

Console.WriteLine(
    await agent.RunAsync("Capital of France?", new AgentContext()));
```
## Why LogicGrid
### Works with any LLM

Ollama, Claude, OpenAI, Gemini, vLLM, TEI, LM Studio, DeepSeek, Groq: the same code runs against any provider. Swap models in one line.
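For instance, moving from a local Ollama model to a hosted provider would be a one-line change. A minimal sketch, assuming hosted-provider factory methods exist alongside `Ollama` (only `LlmClientBase.Ollama` appears in the quick-start above; the other factory names are assumptions, not confirmed API):

```csharp
// Local model via Ollama (shown in the quick-start above).
var llm = LlmClientBase.Ollama("llama3.2");

// Hypothetical one-line swaps for hosted providers.
// Exact factory names and signatures are assumptions.
// var llm = LlmClientBase.OpenAi("gpt-4o-mini", apiKey);
// var llm = LlmClientBase.Claude("claude-3-5-haiku", apiKey);

// The agent code itself is unchanged regardless of provider.
IAgent agent = new Agent<string>(
    name: "Helper",
    description: "Answers questions concisely.",
    systemPrompt: "Answer in one short sentence.",
    llm: llm);
```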
### See everything

Structured events on every run. `.WithLogging().WithTracing(out var trace)` gives you token counts, costs, retries, and tool calls, with no boilerplate and no black boxes.
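A sketch of how that could be wired up, assuming the fluent calls return the agent so they chain (only `.WithLogging()` and `.WithTracing(out var trace)` are named above; the shape of the captured trace object is an assumption):

```csharp
// Attach logging and tracing before running. The out variable
// captures a trace of the run for later inspection.
IAgent agent = new Agent<string>(
        name: "Helper",
        description: "Answers questions concisely.",
        systemPrompt: "Answer in one short sentence.",
        llm: llm)
    .WithLogging()
    .WithTracing(out var trace);

await agent.RunAsync("Capital of France?", new AgentContext());

// Hypothetical: reading the captured trace afterwards.
// Property names here are illustrative, not confirmed API.
// Console.WriteLine($"tokens={trace.TotalTokens} cost={trace.Cost}");
```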
### Pick the pattern

Sequential, group chat, parallel, map-reduce, reflexion, graph. Each is a single class. Compose them, swap them, or customize your own.
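As a sketch, composing two agents into a sequential pipeline might look like this. The `SequentialAdmin` class name and its constructor are assumptions for illustration; the text above names only the pattern, not the class:

```csharp
// Two specialist agents, built exactly like the quick-start example.
IAgent researcher = new Agent<string>(
    name: "Researcher",
    description: "Gathers facts.",
    systemPrompt: "List the key facts for the user's question.",
    llm: llm);

IAgent writer = new Agent<string>(
    name: "Writer",
    description: "Writes the final answer.",
    systemPrompt: "Turn the facts into one clear paragraph.",
    llm: llm);

// Hypothetical: a sequential pattern as a single composable class,
// runnable through the same IAgent interface as any other agent.
IAgent pipeline = new SequentialAdmin(researcher, writer);

Console.WriteLine(
    await pipeline.RunAsync("Why is the sky blue?", new AgentContext()));
```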
### Built for security

Run entirely on local LLMs, so no data ever leaves your network. API keys stay in your process and are never logged. No telemetry. The framework is yours to audit.
## Real code, real output

Every example below is runnable. Output is verbatim from the corresponding sample.
```csharp
using LogicGrid.Core.Agents;
using LogicGrid.Core.Llm;

var llm = LlmClientBase.Ollama("llama3.2");

IAgent agent = new Agent<string>(
    name: "Helper",
    description: "A helpful assistant.",
    systemPrompt: "Answer the user concisely.",
    llm: llm);

var ctx = new AgentContext();
Console.WriteLine(await agent.RunAsync("Say hello in three languages.", ctx));
```
```text
Hello (English) — Bonjour (French) — Hola (Spanish)
```