Typed Contracts
You added a field to a proto file and three hand-maintained type
copies are already stale. The MCP tool schema still describes
yesterday's request shape, the service base class expects
arguments that no longer exist, and the client sends the old format.
Four libraries -- @forwardimpact/libcodegen,
@forwardimpact/libtype,
@forwardimpact/librpc, and
@forwardimpact/libmcp -- eliminate that drift. Define
the contract once in proto, run fit-codegen, and the
generated artifacts keep every layer consistent: JavaScript types,
typed clients, service base classes, gRPC definitions, and MCP tool
schemas.
This guide walks through the full pipeline, from writing a proto definition to calling the generated service from both gRPC clients and MCP-connected agents.
Prerequisites
- Node.js 18+
- Proto files co-located in proto/ directories (either in your project root or inside installed @forwardimpact/* packages)
- Install the codegen CLI: npm install @forwardimpact/libcodegen
The runtime libraries install as dependencies of the packages that
consume them. @forwardimpact/libtype ships the
generated types. @forwardimpact/librpc ships the
server/client framework. @forwardimpact/libmcp bridges
gRPC methods to MCP tools.
How the pipeline works
The codegen pipeline reads proto files and produces five categories of output, each gated by a flag:
proto/*.proto
|
v
fit-codegen --all
|
+---> generated/types/types.js (--type) JavaScript protobuf types
+---> generated/types/metadata.js (--metadata) Field metadata for MCP schemas
+---> generated/services/*/service.js (--service) Service base classes
+---> generated/services/*/client.js (--client) Typed gRPC clients
+---> generated/definitions/ (--definition) gRPC service definitions
Every generated file carries a
@generated by fit-codegen header. Do not edit them --
changes are overwritten on the next run.
Step 1: Write a proto definition
Define your service contract in a .proto file inside
your project's proto/ directory. The codegen
discovers all .proto files from installed
@forwardimpact/* packages and your project root
automatically.
// proto/inventory.proto
syntax = "proto3";
package inventory;
import "common.proto";
import "tool.proto";
service Inventory {
rpc ListItems(ListItemsRequest) returns (tool.ToolCallResult);
rpc GetItem(GetItemRequest) returns (tool.ToolCallResult);
}
message ListItemsRequest {
// Category to filter by
optional string category = 1;
// Maximum number of results
optional int32 limit = 2;
}
message GetItemRequest {
// Unique item identifier
string item_id = 1;
}
The common.proto and tool.proto imports resolve from the
proto/ directories of the installed
@forwardimpact/guide package and your project root. The
codegen uses every discovered proto/ directory as an
include path, so cross-file imports work without extra configuration.
Step 2: Run code generation
Generate all artifacts with a single command:
npx fit-codegen --all
Expected output:
Generated 14 files in ./generated/
types/ Protocol Buffer types
proto/ Proto source files
services/ Service bases and clients
definitions/ Service definitions
Code generation complete (types, services, clients, definitions, metadata).
You can also generate specific artifact categories:
npx fit-codegen --type # JavaScript types only
npx fit-codegen --service # Service base classes only
npx fit-codegen --client # Typed clients only
npx fit-codegen --definition # gRPC definitions only
npx fit-codegen --metadata # Field metadata for MCP only
Combine flags to generate a subset:
npx fit-codegen --type --client generates types and
clients without rebuilding service bases or definitions.
Step 3: Use generated types
@forwardimpact/libtype re-exports everything from
generated/types/types.js. Types are organized by proto
package namespace:
import { inventory, common, tool } from "@forwardimpact/libtype";
// Construct a typed request
const req = inventory.ListItemsRequest.fromObject({
category: "electronics",
limit: 10,
});
// Verify before sending
const error = inventory.ListItemsRequest.verify(req);
if (error) throw new Error(error);
// Convert to plain object for serialization
const plain = inventory.ListItemsRequest.toObject(req);
console.log(plain);
// { category: "electronics", limit: 10 }
Each generated type provides three static methods:
| Method | Purpose |
|---|---|
| fromObject | Create a typed instance from a plain object |
| toObject | Convert a typed instance to a plain object |
| verify | Validate a plain object against the proto schema |
Types with a resource.Identifier field (like
common.Message and tool.ToolFunction) also
receive a withIdentifier prototype method that
auto-populates id.type, id.name, and
id.tokens from the instance content.
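The population logic lives in the generated code, but a minimal sketch of what a withIdentifier-style prototype patch might do can make the idea concrete. Everything below (the name fallback, the tokenization rule) is an assumption for illustration, not the library's actual behavior:

```javascript
// Hypothetical sketch: fill an id field from instance content.
// The real withIdentifier is generated; the type/name/tokens
// derivation shown here is illustrative only.
function withIdentifier(instance, typeName) {
  const name = instance.name ?? instance.content ?? "";
  instance.id = {
    type: typeName, // e.g. "common.Message"
    name, // a human-readable handle
    // assumed tokenization: lowercase, whitespace-split
    tokens: name.toLowerCase().split(/\s+/).filter(Boolean),
  };
  return instance;
}

const msg = withIdentifier({ name: "Sensor Reading" }, "common.Message");
console.log(msg.id.tokens); // [ 'sensor', 'reading' ]
```

The point is only that the method derives all three id subfields from content already on the instance, so callers never hand-assemble identifiers.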
Step 4: Implement the service
The codegen produces a base class in
generated/services/inventory/service.js with stub
methods for each RPC. Extend it with your business logic:
import { services } from "@forwardimpact/librpc";
// The generated base class
const { InventoryBase } = services;
class InventoryService extends InventoryBase {
#items;
constructor(config, items) {
super(config);
this.#items = items;
}
async ListItems(req) {
let results = this.#items;
if (req.category) {
results = results.filter((item) => item.category === req.category);
}
if (req.limit) {
results = results.slice(0, req.limit);
}
return { content: JSON.stringify(results) };
}
async GetItem(req) {
const item = this.#items.find((i) => i.id === req.itemId);
if (!item) return { content: "Not found" };
return { content: JSON.stringify(item) };
}
}
The base class validates and converts incoming requests using
fromObject and verify before your method
receives them. Each method receives a typed request instance and
returns a response matching the proto's return type.
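Conceptually, the base class wraps every handler in a verify-then-convert step. A simplified sketch of that flow, with a hand-written MockType standing in for a generated protobufjs type (all names here are assumptions, not the generated code):

```javascript
// Illustrative sketch of the validate-and-convert step a generated
// base class performs before invoking your override.
const MockType = {
  // Returns null when valid, or an error string -- mirroring protobufjs verify()
  verify(obj) {
    if (obj.limit !== undefined && typeof obj.limit !== "number") {
      return "limit: number expected";
    }
    return null;
  },
  // Normalizes a plain object into a "typed" instance with defaults applied
  fromObject(obj) {
    return { category: obj.category ?? "", limit: obj.limit ?? 0 };
  },
};

async function dispatch(handler, requestType, rawRequest) {
  const error = requestType.verify(rawRequest);
  if (error) throw new Error(`Invalid request: ${error}`);
  // The handler only ever sees a validated, typed instance
  return handler(requestType.fromObject(rawRequest));
}

async function demo() {
  const res = await dispatch(
    async (req) => ({ content: `limit=${req.limit}` }),
    MockType,
    { limit: 5 },
  );
  console.log(res.content); // limit=5
}
demo();
```

This is why your overrides can assume well-formed input: malformed requests are rejected before your code runs.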
Step 5: Start the gRPC server
Wrap the service implementation in the librpc
Server and start it:
import { Server } from "@forwardimpact/librpc";
const config = { name: "inventory", host: "0.0.0.0", port: 50060 };
const items = [
{ id: "a1", category: "electronics", name: "Sensor" },
{ id: "b2", category: "mechanical", name: "Bearing" },
];
const service = new InventoryService(config, items);
const server = new Server(service, config);
await server.start();
// Server listening on 0.0.0.0:50060
The Server class adds authentication (HMAC via
SERVICE_SECRET environment variable), keepalive
configuration, graceful shutdown on
SIGINT/SIGTERM, and health checks at the
standard gRPC health endpoint.
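The HMAC scheme itself is internal to librpc, but the underlying pattern can be sketched with node:crypto. The payload format and verification shape below are assumptions for illustration; only the use of a shared SERVICE_SECRET comes from the text above:

```javascript
import { createHmac, timingSafeEqual } from "node:crypto";

// Illustrative only: sign a payload with the shared SERVICE_SECRET and
// verify it on the receiving side. librpc's actual metadata layout and
// payload choice are internal; this shows the general HMAC pattern.
const SECRET = process.env.SERVICE_SECRET ?? "dev-secret";

function sign(payload) {
  return createHmac("sha256", SECRET).update(payload).digest("hex");
}

function verifySignature(payload, signature) {
  const expected = Buffer.from(sign(payload), "hex");
  const actual = Buffer.from(signature, "hex");
  // timingSafeEqual avoids leaking information through comparison timing
  return expected.length === actual.length && timingSafeEqual(expected, actual);
}

const sig = sign("inventory.ListItems");
console.log(verifySignature("inventory.ListItems", sig)); // true
```

Both sides derive the signature from the same secret, so no credential ever crosses the wire in the clear.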
Step 6: Call the service from a typed client
The codegen produces a typed client in
generated/services/inventory/client.js. Use it directly
or through the createClient convenience factory:
import { createClient } from "@forwardimpact/librpc";
import { inventory } from "@forwardimpact/libtype";
const client = await createClient("inventory");
const req = inventory.ListItemsRequest.fromObject({
category: "electronics",
});
const result = await client.ListItems(req);
console.log(result.content);
// [{"id":"a1","category":"electronics","name":"Sensor"}]
The generated client validates that the request is an instance of
the expected type, converts it with toObject for the
wire format, and converts the response back using
fromObject. Retries are built in (10 attempts, 1-second
delay by default).
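The retry behavior amounts to a bounded attempt loop. A sketch of the pattern, using the defaults quoted above (the loop itself is an illustration; the real implementation is internal to the generated client):

```javascript
// Illustrative retry loop: up to `attempts` tries with a fixed delay,
// mirroring the documented defaults (10 attempts, 1-second delay).
async function withRetry(fn, { attempts = 10, delayMs = 1000 } = {}) {
  let lastError;
  for (let i = 1; i <= attempts; i++) {
    try {
      return await fn(i);
    } catch (err) {
      lastError = err;
      if (i < attempts) await new Promise((r) => setTimeout(r, delayMs));
    }
  }
  throw lastError; // all attempts exhausted
}

async function demo() {
  let calls = 0;
  // Simulated flaky call that succeeds on the third attempt
  const value = await withRetry(async () => {
    calls += 1;
    if (calls < 3) throw new Error("transient");
    return "ok";
  }, { attempts: 10, delayMs: 0 });
  console.log(value, calls); // ok 3
}
demo();
```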
For streaming RPCs, use callStream instead of the typed
method:
const stream = client.callStream("WatchItems", req);
stream.on("data", (chunk) => console.log(chunk));
stream.on("end", () => console.log("Stream ended"));
Step 7: Expose the service as MCP tools
@forwardimpact/libmcp reads the codegen metadata and
the tool configuration from config/config.json to
register gRPC methods as MCP tools. No glue code, no hand-written
schemas.
Add the tool entries to your config/config.json:
{
"service": {
"mcp": {
"tools": {
"ListItems": {
"method": "inventory.Inventory.ListItems",
"description": "List inventory items, optionally filtered by category."
},
"GetItem": {
"method": "inventory.Inventory.GetItem",
"description": "Look up a single inventory item by its identifier."
}
}
}
}
}
The method value follows the pattern
{package}.{Service}.{Method}, matching the proto
definition exactly. libmcp uses the generated metadata
to build a Zod schema from the request type's fields -- proto
field types map to Zod validators (string to
z.string(), int32 to
z.number(), bool to
z.boolean()), repeated fields accept both single values
and arrays, and system fields like filter and
llm_token are excluded automatically.
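The metadata-to-schema step can be sketched with plain predicate validators. libmcp actually emits Zod validators, and the metadata shape below is an assumption; the sketch only shows the three rules named above (type mapping, repeated-field coercion, system-field exclusion):

```javascript
// Illustrative mapping from proto field metadata to validators.
// libmcp builds Zod schemas; dependency-free predicates are used
// here to keep the sketch self-contained.
const SYSTEM_FIELDS = new Set(["filter", "llm_token"]);

const validators = {
  string: (v) => typeof v === "string",
  int32: (v) => Number.isInteger(v),
  bool: (v) => typeof v === "boolean",
};

function buildSchema(fields) {
  const schema = {};
  for (const { name, type, repeated } of fields) {
    if (SYSTEM_FIELDS.has(name)) continue; // system fields are excluded
    const base = validators[type] ?? (() => true);
    schema[name] = repeated
      ? (v) => (Array.isArray(v) ? v : [v]).every(base) // single value or array
      : base;
  }
  return schema;
}

const schema = buildSchema([
  { name: "category", type: "string", repeated: false },
  { name: "limit", type: "int32", repeated: false },
  { name: "tags", type: "string", repeated: true },
  { name: "llm_token", type: "string", repeated: false },
]);
// schema exposes category, limit, and tags; llm_token is dropped
```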
In the MCP service, registration is a single call:
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { registerToolsFromConfig } from "@forwardimpact/libmcp";
const server = new McpServer({ name: "my-service", version: "0.1.0" });
registerToolsFromConfig(server, config, {
inventory: inventoryClient,
});
When an agent calls the ListItems tool,
libmcp normalizes the parameters against the field
metadata, constructs a typed request via fromObject,
calls the gRPC method through the client, and returns the result as
MCP content.
What each library owns
| Library | Responsibility | Key export |
|---|---|---|
| libcodegen | Reads proto files, runs protobufjs-cli, renders templates | npx fit-codegen |
| libtype | Re-exports generated types and metadata | { common, graph, ... } |
| librpc | gRPC server/client framework with auth, retry, tracing | Server, Client, createClient |
| libmcp | Config-driven gRPC-to-MCP tool registration | registerToolsFromConfig |
The libraries are layered: libcodegen produces files
that libtype and librpc consume.
libmcp depends on both libtype (for
metadata and type classes) and librpc (for the gRPC
clients it wraps).
When proto definitions change
After editing a .proto file, re-run the codegen:
npx fit-codegen --all
Every downstream artifact updates: the types in
libtype reflect the new fields, the service base class
expects the new request shape, the client sends the new format, and
the MCP tool schema describes the new parameters. No manual
synchronization required.
If you add a new RPC method to an existing service, the generated
base class gains a new stub that throws
"not implemented" until you override it.
Existing methods are unaffected.
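The shape of such a stub is just a throwing placeholder, along these lines (illustrative; ArchiveItem is a hypothetical method name and the generated text may differ):

```javascript
// Illustrative shape of a generated stub for a newly added RPC.
// ArchiveItem is a hypothetical example method, not a real one.
class InventoryBase {
  async ArchiveItem(req) {
    throw new Error("ArchiveItem not implemented");
  }
}
```

Overriding the method in your subclass replaces the stub, exactly as in Step 4.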
If you add a new service proto, the codegen discovers it
automatically and generates a complete set of service base, client,
and definition files in a new subdirectory under
generated/services/.
Tips
- Proto comments become MCP tool parameter descriptions. Add a comment above each field in your .proto file and it flows through the metadata into the Zod schema's .describe() call. Agents see these descriptions when they discover your tools.
- fromObject vs new: Always use fromObject to construct typed instances. It applies prototype patches (like withIdentifier) that the raw constructor does not.
- Incremental generation saves time. When iterating on a single service, npx fit-codegen --client regenerates only clients instead of the full suite.
- The codegen is installation-specific. Each project runs its own fit-codegen because it may define custom proto files. Generated code is never bundled in published npm packages.
Next steps
This guide covers the end-to-end pipeline from proto to typed runtime. For the bounded tasks within this workflow:
- Expose a gRPC Method as an MCP Tool -- add a new MCP tool entry for an existing gRPC service method.
- Ship a Service Endpoint -- add a new RPC method to an existing service and regenerate.