Chapter 15: Creating an OpenAI-Compatible Plugin Specification
Theoretical Foundations
The theoretical foundation of creating OpenAI-compatible plugins in ASP.NET Core rests on the concept of semantic interoperability. In previous chapters, we focused on building isolated, high-performance AI endpoints. We treated them as monolithic services—efficient, but opaque to external orchestrators. The shift in this chapter is architectural: we are no longer just serving a model; we are exposing a set of capabilities to an external intelligence (the LLM) that must understand how to use them.
To understand this, we must look at the plugin not as a single API endpoint, but as a self-describing ecosystem of metadata and execution logic.
The Manifest: Identity and Discovery
At the heart of any plugin lies the AI Plugin Manifest (.well-known/ai-plugin.json). This is the "business card" of your API. When an LLM or a client like ChatGPT attempts to interact with your service, it first performs a discovery step to understand what you offer.
Think of this like a restaurant menu. A customer (the LLM) enters a restaurant (your server). They don't know what the kitchen (your business logic) can produce. They ask for the menu (the manifest). The menu lists the dishes (functions) available, their descriptions (prompts), and the ingredients required (parameters).
In the context of ASP.NET Core, this manifest is not a static file. It is a dynamic representation of your application's capabilities. It must be served from a predictable location (usually /.well-known/ai-plugin.json) to allow the orchestrator to find it without prior knowledge.
The manifest contains critical metadata:
- Schema Version: Defines the structure of the manifest itself.
- Name and Description: The LLM uses the description to determine when to call this plugin. If the user asks for "a joke," and your plugin description says "generates humorous anecdotes," the LLM selects your plugin.
- Auth: Defines the security model. This is where we bridge the gap between the stateless nature of HTTP and the stateful requirements of security.
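A minimal manifest carrying these fields might look like the following sketch (values are illustrative; the code example later in this chapter serves an equivalent document dynamically):

```json
{
  "schema_version": "v1",
  "name_for_model": "weather",
  "name_for_human": "Weather Plugin",
  "description_for_model": "Plugin for retrieving current weather data. Use this when users ask about the weather.",
  "description_for_human": "Get the current weather for a city.",
  "auth": { "type": "none" },
  "api": { "type": "openapi", "url": "/swagger/v1/swagger.json" }
}
```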
The OpenAPI Specification: The Contract of Conversation
While the manifest tells the LLM what the plugin does, the OpenAPI specification (Swagger) tells it how to do it. This is the most critical theoretical component for function calling.
In traditional web development, OpenAPI is primarily for human developers to generate client code. In AI development, OpenAPI is for the LLM to generate function arguments.
We must view the OpenAPI schema as a type-safe grammar for the LLM. When an LLM plans to execute a function, it does not guess the parameters; it infers them based on the schema definitions provided.
Consider the analogy of a universal remote control. A standard remote has buttons labeled with fixed icons (Power, Volume). An AI plugin is like a universal remote that downloads the manual for a specific TV (the OpenAPI schema). It reads that the TV requires a "Channel" parameter which must be an integer between 1 and 99. When the user says "Change the channel to 5," the remote knows exactly how to format that command because it read the schema.
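The downloaded "manual" in this analogy is an OpenAPI Parameter Object. A fragment describing the channel parameter would look roughly like this (field names follow the OpenAPI 3.0 specification; the TV endpoint itself is hypothetical):

```json
{
  "name": "channel",
  "in": "query",
  "required": true,
  "description": "The channel to switch to.",
  "schema": { "type": "integer", "minimum": 1, "maximum": 99 }
}
```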
In ASP.NET Core, we leverage the Microsoft.OpenApi libraries to generate this schema dynamically. We rely on XML comments and data annotations ([Description], [Range], [Required]) to enrich this schema. The richer the metadata, the higher the accuracy of the LLM's function calling.
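As a small illustration of why these annotations matter, the same attribute types that Swashbuckle surfaces into the schema can be exercised directly: a value can satisfy the declared type yet fail the semantic constraint (the specific ranges and property names below are illustrative):

```csharp
using System;
using System.ComponentModel.DataAnnotations;

// The attributes you place on a request model, e.g.:
//   [Required] public string Location { get; set; }
//   [Range(1, 14)] public int Days { get; set; }
// are both emitted into the OpenAPI schema and enforced at runtime.
var range = new RangeAttribute(1, 14);
var required = new RequiredAttribute();

Console.WriteLine(range.IsValid(5));     // True
Console.WriteLine(range.IsValid(500));   // False: a valid integer, but semantically out of range
Console.WriteLine(required.IsValid("")); // False: empty strings fail [Required] by default
```

This is the safety net discussed later in this chapter: a hallucinated argument that fits the type still gets rejected before it reaches business logic.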
The Security Layer: The Bouncer
Security in AI plugins is distinct from traditional web security. We are not just protecting against unauthorized access; we are protecting against prompt injection and impersonation.
The theoretical model here is the Bouncer at a VIP club.
- The User Identity: The end-user interacting with the LLM.
- The Plugin Identity: The plugin itself.
- The Delegation: The LLM acts as an intermediary, carrying the user's request to the plugin.
In the OpenAI specification, security is typically handled via Bearer tokens. However, in a custom ASP.NET Core implementation, we often need a more granular approach. We need to validate that the request is coming from a trusted orchestrator and, crucially, that the request carries the context of the legitimate user.
This is where Middleware and Filters become the theoretical gatekeepers. We intercept the HTTP request before it reaches the controller logic. We inspect the Authorization header. We validate the token. But we also need to ensure that the token hasn't been tampered with.
A critical concept here is Contextual Authorization. Unlike a standard API where a token grants access to data, an AI plugin token grants access to compute. We must ensure that the plugin cannot be used to perform actions on behalf of a user who did not authorize it. This is often achieved by chaining the identity of the plugin caller (the LLM service) with the identity of the end-user (passed in the payload or a custom header).
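A minimal sketch of such a gatekeeper follows. The service token value and the X-User-Context header are illustrative stand-ins: a production system would validate signed JWTs for both the orchestrator identity and the delegated user identity.

```csharp
using System;

// Hypothetical gatekeeper logic, extracted as a plain function so it can be
// exercised without an HTTP pipeline.
bool IsAuthorized(string? authorizationHeader, string? userContextHeader)
{
    // 1. Plugin-caller identity: is this a trusted orchestrator?
    if (authorizationHeader != "Bearer expected-service-token")
        return false;

    // 2. Delegated user identity: did a legitimate end-user authorize the call?
    return !string.IsNullOrWhiteSpace(userContextHeader);
}

Console.WriteLine(IsAuthorized("Bearer expected-service-token", "user-42")); // True
Console.WriteLine(IsAuthorized("Bearer expected-service-token", null));      // False
Console.WriteLine(IsAuthorized("Bearer wrong-token", "user-42"));            // False

// In Program.cs this check would sit in middleware, before any endpoint runs:
// app.Use(async (context, next) =>
// {
//     var auth = context.Request.Headers.Authorization.ToString();
//     var user = context.Request.Headers["X-User-Context"].ToString();
//     if (!IsAuthorized(auth, user)) { context.Response.StatusCode = 401; return; }
//     await next();
// });
```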
The Controller: The Executor
Finally, we have the Controller. In our previous chapters, controllers were endpoints returning JSON. Here, the controller is an executor of intent.
The theoretical shift is from Request-Response to Intent-Execution. When the LLM calls our plugin, it has already performed the "Reasoning" phase. It has decided that our function is the correct tool. It has generated the arguments based on our OpenAPI schema.
Our Controller's role is to validate these arguments (again, using ASP.NET Core's built-in model binding and validation attributes) and execute the business logic. The response must be formatted in a way the LLM can digest—usually a structured JSON object that maps back to the expected output schema.
Architectural Flow Visualization
The flow of metadata and execution requests through the ASP.NET Core pipeline proceeds in three stages: the orchestrator discovers the manifest, fetches the OpenAPI schema the manifest references, and finally invokes the described endpoints with generated arguments.
Theoretical Challenges
In building this system, we must anticipate several theoretical challenges:
- The Hallucination Parameter: What if the LLM generates a parameter that fits the schema but is semantically nonsensical (e.g., "Temperature: 5000°C" when the valid range is 0-100)? The ASP.NET Core validation attributes ([Range]) act as the final safety net, rejecting the request before it hits business logic.
- State Management: HTTP is stateless, but conversations are stateful. If a plugin requires a multi-step interaction (e.g., "Select item -> Checkout"), we must rely on external state stores or session IDs passed by the LLM, as the LLM itself does not maintain a connection to our API between calls.
- Latency: LLMs have timeout limits. If our plugin takes 10 seconds to respond, the LLM might time out. We must design our controllers to respond quickly, offloading heavy processing to background services where necessary, though this complicates the immediate-response requirement of function calling.
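One way to keep the response path fast is to acknowledge the request immediately and hand heavy work to a background consumer. A minimal sketch using System.Threading.Channels (in a real ASP.NET Core app the consumer would live in a BackgroundService; the job names here are illustrative):

```csharp
using System;
using System.Threading.Channels;
using System.Threading.Tasks;

// Hypothetical job queue: the endpoint enqueues and returns immediately;
// a background consumer does the slow work.
var jobs = Channel.CreateUnbounded<string>();

var consumer = Task.Run(async () =>
{
    await foreach (var job in jobs.Reader.ReadAllAsync())
    {
        await Task.Delay(50); // stand-in for slow processing
        Console.WriteLine($"processed: {job}");
    }
});

// The "endpoint" path: acknowledge well inside any LLM timeout.
await jobs.Writer.WriteAsync("generate-weekly-report");
Console.WriteLine("accepted");

jobs.Writer.Complete();
await consumer;
```

The trade-off named above still applies: the LLM receives only the acknowledgement, not the final result, so this pattern suits fire-and-forget actions rather than queries.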
By mastering these theoretical components—the Manifest as the identity, the OpenAPI schema as the grammar, and the Controller as the executor—we transform a simple ASP.NET Core API into a participant in the AI ecosystem.
Basic Code Example
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.OpenApi;
using Microsoft.OpenApi.Extensions;
using Microsoft.OpenApi.Models;
using Swashbuckle.AspNetCore.Swagger; // for ISwaggerProvider
using System.Collections.Generic;
using System.Text.Json.Serialization;
// This example demonstrates a minimal OpenAI-compatible plugin implementation.
// Scenario: A "Weather Service" plugin that LLMs (like ChatGPT) can call to get the current weather.
// It exposes two key endpoints:
// 1. /.well-known/ai-plugin.json: The manifest file describing the plugin.
// 2. /weather/current: The actual API endpoint for fetching weather data.
var builder = WebApplication.CreateBuilder(args);
// 1. Dependency Injection Setup
// We register the Swagger generator to dynamically generate the OpenAPI schema required by the plugin spec.
builder.Services.AddEndpointsApiExplorer();
builder.Services.AddSwaggerGen(c =>
{
c.SwaggerDoc("v1", new OpenApiInfo
{
Title = "Weather Plugin",
Version = "v1",
Description = "A simple plugin to get current weather conditions."
});
});
var app = builder.Build();
// 2. The Plugin Manifest Endpoint
// OpenAI plugins require a manifest file at the well-known path: /.well-known/ai-plugin.json
// This JSON file tells the LLM how to interact with your API.
app.MapGet("/.well-known/ai-plugin.json", () =>
{
// Define the manifest object structure
var manifest = new
{
schema_version = "v1",
name_for_model = "weather",
name_for_human = "Weather Plugin",
description_for_model = "Plugin for retrieving current weather data for a specific location. Use this when users ask about the weather.",
description_for_human = "Get the current weather for a city.",
auth = new
{
type = "none" // No authentication for this simple example
},
api = new
{
type = "openapi",
// Point to the OpenAPI schema endpoint we will define next
url = "/swagger/v1/swagger.json",
is_user_authenticated = false
},
// URL for the logo (can be a local file or external)
logo_url = "/logo.png",
contact_email = "support@example.com",
legal_info_url = "https://example.com/legal"
};
return Results.Json(manifest);
});
// 3. The OpenAPI Schema Endpoint
// The manifest points to a Swagger/OpenAPI JSON file.
// We utilize the built-in Swagger generator to provide this.
// This schema defines the parameters and return types for the LLM.
app.MapGet("/swagger/v1/swagger.json", (HttpContext httpContext, ISwaggerProvider swaggerProvider) =>
{
    // Generate the OpenAPI document
    var swaggerDoc = swaggerProvider.GetSwagger("v1");
    // Ensure the server URL is correct (handling reverse proxies/localhost)
    swaggerDoc.Servers = new List<OpenApiServer>
    {
        new OpenApiServer { Url = $"{httpContext.Request.Scheme}://{httpContext.Request.Host}" }
    };
    // OpenApiDocument requires its own serializer; running it through
    // System.Text.Json would not produce a spec-compliant document.
    var json = swaggerDoc.SerializeAsJson(OpenApiSpecVersion.OpenApi3_0);
    return Results.Content(json, "application/json");
});
// 4. The Actual API Endpoint (The Tool)
// This is the function the LLM will actually call.
// We use minimal API syntax for conciseness.
app.MapGet("/weather/current", (string location) =>
{
// Simulate a database lookup or external API call
// In a real app, you would inject a service here.
var weatherData = new WeatherResponse
{
Location = location,
Temperature = 22.5,
Unit = "Celsius",
Condition = "Sunny",
Humidity = 45
};
return Results.Ok(weatherData);
})
.WithName("GetWeather") // Important: Name the endpoint for Swagger reference
.WithOpenApi(operation =>
{
    // Enrich the generated schema; richer metadata improves the accuracy
    // of the LLM's argument generation.
    operation.Summary = "Gets the current weather conditions for a location.";
    operation.Parameters[0].Description = "The city to look up, e.g. 'London'.";
    return operation;
}); // Adds OpenAPI metadata so the LLM can discover this endpoint
// 5. Swagger UI Setup (Optional but recommended for testing)
// This allows humans to test the plugin easily in the browser.
app.UseSwaggerUI(c =>
{
c.SwaggerEndpoint("/swagger/v1/swagger.json", "Weather Plugin V1");
c.RoutePrefix = "swagger"; // Access via /swagger
});
app.Run();
// DTO for the response
public class WeatherResponse
{
[JsonPropertyName("location")]
public string Location { get; set; } = string.Empty;
[JsonPropertyName("temperature")]
public double Temperature { get; set; }
[JsonPropertyName("unit")]
public string Unit { get; set; } = string.Empty;
[JsonPropertyName("condition")]
public string Condition { get; set; } = string.Empty;
[JsonPropertyName("humidity")]
public int Humidity { get; set; }
}
Detailed Explanation
This code creates a fully functional OpenAI-compatible plugin using ASP.NET Core Minimal APIs. Below is a step-by-step breakdown of how it works and why each part is necessary.
1. The Manifest Endpoint (/.well-known/ai-plugin.json)
The OpenAI specification requires a "manifest" file discoverable at a standard location. This file acts as the plugin's ID card.
- schema_version: Specifies the version of the plugin schema.
- name_for_model: The identifier the LLM uses internally. It should be lowercase and descriptive (e.g., "weather").
- name_for_human: The friendly name displayed in the ChatGPT UI.
- description_for_model: Crucial. This text guides the LLM on when to use this tool. It acts like a system prompt for function selection.
- auth: Defines authentication. We set type: "none" for this example, but production plugins typically use type: "service_http" with a Bearer token.
- api: Defines the interface type. We point url to our Swagger JSON endpoint. This tells the LLM to fetch the API documentation to understand how to construct requests.
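For comparison, a production auth block using service-level HTTP authentication would look roughly like this fragment (the verification token is a placeholder issued during plugin registration):

```json
"auth": {
  "type": "service_http",
  "authorization_type": "bearer",
  "verification_tokens": {
    "openai": "<verification-token>"
  }
}
```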
2. The OpenAPI (Swagger) Schema Endpoint
While the manifest describes the plugin generally, the OpenAPI schema describes the specific API mechanics.
- Dynamic Generation: Instead of hardcoding a static JSON file, we use ISwaggerProvider to generate the schema dynamically. This ensures that if you update your C# endpoint signatures, the plugin documentation updates automatically.
- WithOpenApi(): In Minimal APIs, calling .WithOpenApi() on a route definition ensures that the endpoint is included in the generated Swagger document. Without this, the LLM wouldn't know the endpoint exists or what parameters it accepts.
3. The API Logic (/weather/current)
This is the standard ASP.NET Core endpoint.
- Parameter Binding: The LLM sends the location as a query parameter (e.g., ?location=London). ASP.NET Core binds this automatically based on the function signature.
- Return Type: The endpoint returns a strongly typed object (WeatherResponse). The JSON serialization is handled automatically, matching the property names defined in the Swagger schema.
4. Execution Flow
- Discovery: The LLM (or ChatGPT) requests /.well-known/ai-plugin.json.
- Schema Fetch: The LLM parses the manifest, sees the url pointing to /swagger/v1/swagger.json, and downloads the OpenAPI spec.
- Function Selection: When a user asks "What is the weather in Paris?", the LLM analyzes the description_for_model and the schema. It decides this is the correct tool.
- Invocation: The LLM constructs an HTTP GET request to /weather/current?location=Paris.
- Response: Your API returns the JSON data, which the LLM interprets and presents to the user in natural language.
Common Pitfalls
- Missing WithOpenApi() Call: In ASP.NET Core Minimal APIs, simply defining a route does not automatically add it to the Swagger/OpenAPI document. If you omit .WithOpenApi(), the LLM will fetch the schema but won't see the endpoint definition, causing function calling to fail.
- Incorrect url in Manifest: The api.url property in ai-plugin.json must be an absolute URL or a path relative to the domain. If the LLM cannot fetch this JSON, the plugin will not install. Ensure your Swagger middleware is correctly configured to serve the JSON at the specified path.
- Authentication Mismatch: If you set auth.type = "none" in the manifest but your API endpoint requires an [Authorize] attribute, the LLM's requests will fail with 401 Unauthorized. Always ensure the authentication configuration in the manifest matches the actual API security requirements.
- Schema Naming Conventions: OpenAPI schemas are case-sensitive. If your C# property is Location (PascalCase) but your JSON serializer settings or Swagger configuration output location (camelCase), ensure consistency. The example uses [JsonPropertyName] attributes to explicitly control the JSON output, which is best practice for LLM interoperability.
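The naming behavior is easy to verify in isolation. By default System.Text.Json emits C# property names unchanged; a naming policy (or a per-property [JsonPropertyName], which always wins over the policy) controls the wire name:

```csharp
using System;
using System.Text.Json;

// Default: the PascalCase C# property name passes through to the JSON.
Console.WriteLine(JsonSerializer.Serialize(new { Temperature = 22.5 }));
// {"Temperature":22.5}

// A camelCase policy makes the output match a camelCase OpenAPI schema.
var camel = new JsonSerializerOptions { PropertyNamingPolicy = JsonNamingPolicy.CamelCase };
Console.WriteLine(JsonSerializer.Serialize(new { Temperature = 22.5 }, camel));
// {"temperature":22.5}
```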
Code License: All code examples are released under the MIT License. Github repo.
Content Copyright: Copyright © 2026 Edgar Milvus. All rights reserved.