Thanks!
Context / what IngeIA is: IngeIA is our web-based platform for industrial project delivery and operations (EPC/EPCM + O&M). It unifies a 3D digital twin (BIM/IFC → 3D Tiles) with a 4D weekly schedule model (WBS-based Planned/Real/Forecast weeks) and field evidence from reality capture (drone photos, 360 imagery, videos + SRT, orthomosaics, point clouds, textured meshes). The goal is to turn “3D visualization” into an operational system in which every spatial element (area polygon or BIM asset) is directly linked to work packages, progress status, and verifiable evidence, with time-dynamic overlays (often via CZML) and auditable interactions.
To clarify: we have not used MCP in our experiments yet. We implemented assistant-driven scene actions using LLM tool calling (function calling) with a small orchestrator (we tested both a minimal in-house router and LangChain-style agents). The tools drove a CesiumJS viewer plus our own services for:
- WBS / weekly time-phased progress (Planned Week / Real Week / Forecast Week)
- Industrial BIM/IFC-derived 3D Tiles
- Areas as polygons (workfaces/areas)
- Evidence (drone photos, 360, videos + SRT, orthos, point clouds, textured meshes)
- Issues/QA observations linked to assets/areas/WBS
- Cesium ion asset operations (upload/version/metadata)
We’re evaluating MCP now because it looks like the right abstraction for reusable “community servers” (viewer control, CZML authoring, spatial analytics, ion ops).
Below are IngeIA-like command examples and the tool schemas we used (simplified but representative).
1) Natural language → tool calls (IngeIA-style)
Example A — “Week view” in a workface with evidence overlay
User: “Go to Area A3, set Week 2026-W14, show only planned vs real coloring, and load last week’s drone evidence.”
[
{
"tool": "viewer.flyTo",
"args": {
"target": { "type": "area", "id": "area_A3" },
"heightMeters": 220,
"headingDeg": 15,
"pitchDeg": -35,
"durationSec": 1.4
}
},
{
"tool": "timeline.setWeek",
"args": { "weekLabel": "2026-W14" }
},
{
"tool": "wbs.getWeekState",
"args": {
"projectId": "proj_demo",
"weekLabel": "2026-W14",
"scope": { "areaId": "area_A3" }
}
},
{
"tool": "czml.generateProgressOverlay",
"args": {
"weekLabel": "2026-W14",
"mode": "planned_vs_real",
"scope": { "areaId": "area_A3" },
"inputsRef": "wbs.getWeekState#last"
}
},
{
"tool": "viewer.loadCzml",
"args": { "czmlRef": "czml.generateProgressOverlay#last", "layerId": "overlay_progress" }
},
{
"tool": "evidence.query",
"args": {
"projectId": "proj_demo",
"time": { "weekLabel": "2026-W13" },
"scope": { "areaId": "area_A3" },
"types": ["photo", "photo360", "video", "ortho", "pointcloud", "mesh"],
"limit": 200
}
},
{
"tool": "viewer.addEvidencePins",
"args": {
"evidenceRef": "evidence.query#last",
"groupBy": "type",
"pinStyle": "compact",
"onClick": { "open": "evidenceDrawer" }
}
}
]
What mattered for us:
- The assistant doesn’t “invent” progress; it calls wbs.getWeekState.
- The viewer overlay is CZML time-dynamic, so it’s replayable and auditable.
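The `tool.name#last` / `tool.name#field` chaining used in these sequences can be sketched as follows. The resolver and the toy registry are illustrative, not our production router:

```python
def resolve_refs(args, results):
    """Replace 'tool.name#last' (whole output) or 'tool.name#field'
    (one field of the output) string values with prior tool results."""
    if isinstance(args, dict):
        return {k: resolve_refs(v, results) for k, v in args.items()}
    if isinstance(args, list):
        return [resolve_refs(v, results) for v in args]
    if isinstance(args, str) and "#" in args:
        tool, _, field = args.partition("#")
        if tool in results:
            out = results[tool]
            return out if field == "last" else out.get(field, args)
    return args

def run_sequence(calls, registry):
    """Execute tool calls in order, resolving references as we go."""
    results = {}
    for call in calls:
        fn = registry[call["tool"]]
        results[call["tool"]] = fn(resolve_refs(call["args"], results))
    return results
```

The same resolver covers both the `wbs.getWeekState#last` style in Example A and the `bim.getSelection#ids` field references in Example B.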
Example B — Click an asset (IFC) → “show everything related”
User: “Highlight this asset, show its WBS tasks for the current week, and list the latest evidence nearby.”
[
{
"tool": "bim.getSelection",
"args": { "source": "viewer" }
},
{
"tool": "wbs.findByBimIds",
"args": {
"projectId": "proj_demo",
"bimIds": "bim.getSelection#ids",
"weekLabel": "timeline.currentWeek"
}
},
{
"tool": "viewer.highlightBimObjects",
"args": { "bimIds": "bim.getSelection#ids", "style": "outlineGlow" }
},
{
"tool": "evidence.query",
"args": {
"projectId": "proj_demo",
"scope": { "bimIds": "bim.getSelection#ids" },
"spatial": { "type": "near", "radiusMeters": 35 },
"time": { "range": "last_14_days" },
"types": ["photo", "photo360", "video"],
"limit": 80
}
},
{
"tool": "ui.openPanel",
"args": {
"panel": "AssetContext",
"data": {
"asset": "bim.getSelection#meta",
"tasks": "wbs.findByBimIds#last",
"evidence": "evidence.query#last"
}
}
}
]
This pattern (selection → tasks → evidence) is core for “3D that is actually operational”.
Example C — “Explain delay” with hard constraints + provenance
User: “Why is Area A3 delayed this week?”
[
{
"tool": "wbs.getWeekKpis",
"args": { "projectId": "proj_demo", "weekLabel": "2026-W14", "scope": { "areaId": "area_A3" } }
},
{
"tool": "wbs.getBlockingReasons",
"args": { "projectId": "proj_demo", "weekLabel": "2026-W14", "areaId": "area_A3" }
},
{
"tool": "evidence.healthCheck",
"args": { "projectId": "proj_demo", "weekLabel": "2026-W14", "scope": { "areaId": "area_A3" } }
}
]
The LLM only summarizes these deterministic outputs and links back to a filtered view (“show me the tasks in A3, Week 14”).
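To keep that summary grounded, the three tool outputs can be reduced to a deterministic payload before the LLM ever sees them. All field names here (planned_pct, impact_weeks, gaps) are illustrative assumptions, not our real schema:

```python
def delay_summary(kpis, blockers, evidence_health):
    """Reduce deterministic tool outputs to the only facts the LLM may cite.
    Field names are illustrative placeholders."""
    return {
        "delta_pct": round(kpis["planned_pct"] - kpis["real_pct"], 1),
        "top_blockers": sorted(blockers, key=lambda b: -b["impact_weeks"])[:3],
        "evidence_gaps": evidence_health.get("gaps", []),
        # link-back target for the "show me the tasks in A3, Week 14" follow-up
        "link_back": {"view": "wbs_filtered",
                      "areaId": kpis["areaId"],
                      "weekLabel": kpis["weekLabel"]},
    }
```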
2) Tool schemas (adapted to our real primitives)
Viewer control (CesiumJS)
{
"name": "viewer.flyTo",
"description": "Move camera to an Area polygon, BIM asset, or coordinate with explicit height/orientation.",
"inputSchema": {
"type": "object",
"required": ["target", "heightMeters"],
"properties": {
"target": {
"type": "object",
"required": ["type"],
"properties": {
"type": { "type": "string", "enum": ["area", "asset", "coordinate"] },
"id": { "type": "string" },
"cartographicDeg": {
"type": "object",
"properties": {
"lon": { "type": "number" },
"lat": { "type": "number" },
"height": { "type": "number" }
}
}
}
},
"heightMeters": { "type": "number" },
"headingDeg": { "type": "number", "default": 0 },
"pitchDeg": { "type": "number", "default": -35 },
"rollDeg": { "type": "number", "default": 0 },
"durationSec": { "type": "number", "default": 1.2 }
}
}
}
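One way to honor the default values declared in this schema is to fill them in before dispatch. This is a deliberately minimal hand-rolled sketch, not our production validator (a real JSON Schema validator would also check types and enums):

```python
def apply_defaults(schema, args):
    """Fill top-level schema defaults and enforce required fields."""
    props = schema["inputSchema"]["properties"]
    out = dict(args)
    for name, prop in props.items():
        if name not in out and "default" in prop:
            out[name] = prop["default"]
    missing = [r for r in schema["inputSchema"].get("required", []) if r not in out]
    if missing:
        raise ValueError(f"{schema['name']}: missing required {missing}")
    return out
```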
Timeline / week semantics (4D)
{
"name": "timeline.setWeek",
"description": "Set current 4D time to a week label used by the schedule (time-phased weekly model).",
"inputSchema": {
"type": "object",
"required": ["weekLabel"],
"properties": {
"weekLabel": { "type": "string", "description": "e.g., 2026-W14" }
}
}
}
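Assuming weekLabel follows ISO 8601 week numbering (which "2026-W14" suggests), mapping a label to its calendar interval is straightforward with the standard library:

```python
import datetime
import re

def week_to_interval(week_label: str):
    """Map an ISO week label like '2026-W14' to its Monday..Sunday dates."""
    m = re.fullmatch(r"(\d{4})-W(\d{2})", week_label)
    if not m:
        raise ValueError(f"bad week label: {week_label}")
    year, week = int(m.group(1)), int(m.group(2))
    start = datetime.date.fromisocalendar(year, week, 1)  # Monday of that ISO week
    return start, start + datetime.timedelta(days=6)      # through Sunday
```

(`date.fromisocalendar` needs Python 3.8+; it also raises on out-of-range weeks, e.g. W54.)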
WBS / time-phased progress
{
"name": "wbs.getWeekState",
"description": "Return tasks/work packages in scope with Planned/Real/Forecast weeks and statuses for the requested week.",
"inputSchema": {
"type": "object",
"required": ["projectId", "weekLabel"],
"properties": {
"projectId": { "type": "string" },
"weekLabel": { "type": "string" },
"scope": {
"type": "object",
"properties": {
"areaId": { "type": "string" },
"discipline": { "type": "string" },
"wbsPrefix": { "type": "string" },
"bimIds": { "type": "array", "items": { "type": "string" } }
}
}
}
}
}
CZML authoring (time-dynamic overlays)
{
"name": "czml.generateProgressOverlay",
"description": "Generate CZML overlays to visualize progress for a given week (planned vs real vs forecast; status heatmaps).",
"inputSchema": {
"type": "object",
"required": ["weekLabel", "mode", "scope"],
"properties": {
"weekLabel": { "type": "string" },
"mode": {
"type": "string",
"enum": ["planned_vs_real", "forecast", "status", "delta_heatmap"]
},
"scope": {
"type": "object",
"properties": {
"areaId": { "type": "string" },
"discipline": { "type": "string" },
"wbsPrefix": { "type": "string" }
}
},
"inputsRef": { "type": "string", "description": "Reference to deterministic WBS output (for provenance)." }
}
}
}
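A czml.generateProgressOverlay implementation boils down to emitting a document packet plus one availability-scoped polygon packet per area. This sketch uses an assumed input shape and a simplified two-color rule just to show the shape of the output:

```python
def progress_overlay_czml(week_start_iso, week_end_iso, areas):
    """areas: [{'id', 'planned_pct', 'real_pct', 'positions': [lon, lat, h, ...]}]
    Input shape and the red/green rule are simplifications for illustration."""
    interval = f"{week_start_iso}/{week_end_iso}"
    doc = [{"id": "document", "version": "1.0",
            "clock": {"interval": interval, "currentTime": week_start_iso}}]
    for a in areas:
        behind = a["real_pct"] < a["planned_pct"]
        rgba = [200, 40, 40, 160] if behind else [40, 160, 60, 160]  # red / green
        doc.append({
            "id": f"overlay_{a['id']}",
            "availability": interval,  # entity only exists during this week
            "polygon": {
                "positions": {"cartographicDegrees": a["positions"]},
                "material": {"solidColor": {"color": {"rgba": rgba}}},
            },
        })
    return doc
```

Because each packet carries an availability interval, scrubbing the Cesium timeline replays the overlay week by week, which is what makes it auditable.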
Evidence query (drone reality capture)
{
"name": "evidence.query",
"description": "Query evidence objects (photos, 360, videos+SRT, orthos, point clouds, meshes) by time + spatial scope.",
"inputSchema": {
"type": "object",
"required": ["projectId"],
"properties": {
"projectId": { "type": "string" },
"types": {
"type": "array",
"items": { "type": "string", "enum": ["photo", "photo360", "video", "ortho", "pointcloud", "mesh"] }
},
"time": {
"type": "object",
"properties": {
"weekLabel": { "type": "string" },
"range": { "type": "string", "enum": ["last_7_days", "last_14_days", "custom"] },
"startIso": { "type": "string" },
"endIso": { "type": "string" }
}
},
"scope": {
"type": "object",
"properties": {
"areaId": { "type": "string" },
"bimIds": { "type": "array", "items": { "type": "string" } }
}
},
"spatial": {
"type": "object",
"properties": {
"type": { "type": "string", "enum": ["near", "withinPolygon"] },
"radiusMeters": { "type": "number" }
}
},
"limit": { "type": "integer", "default": 200 }
}
}
}
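The spatial "near" mode above is a great-circle radius test; at radii of tens of meters a standard haversine filter is plenty. The evidence coordinate field names here are assumptions:

```python
import math

def haversine_m(lon1, lat1, lon2, lat2):
    """Great-circle distance in meters between two lon/lat points (degrees)."""
    R = 6371000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def near(evidence, center, radius_m):
    """Keep evidence items within radius_m of center ({'lon','lat'})."""
    return [e for e in evidence
            if haversine_m(e["lon"], e["lat"], center["lon"], center["lat"]) <= radius_m]
```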
Cesium ion operations (high-level)
{
"name": "ion.assets.publishVersion",
"description": "Publish a new version of an ion asset (orthos/tiles/point clouds) with metadata and provenance.",
"inputSchema": {
"type": "object",
"required": ["assetId", "versionTag"],
"properties": {
"assetId": { "type": "string" },
"versionTag": { "type": "string" },
"metadata": { "type": "object" },
"provenance": {
"type": "object",
"properties": {
"sourceDatasetId": { "type": "string" },
"processingPipeline": { "type": "string" }
}
}
}
}
}
3) What we’d love MCP to standardize (based on this experience)
- Deterministic tools + provenance: WBS outputs, CRS/units, dataset versions, and “replayable” sequences.
- Confirm-before-apply patterns for expensive actions (loading huge assets, generating large CZML, mass styling).
- Standard viewer control capabilities (camera/timeline/layers/query/snapshot/state export).
- Shared CZML authoring server with best-practice templates for time-phased industrial workflows.
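For the confirm-before-apply item, the gate we have in mind is tiny; a hypothetical wrapper (estimate_cost, ask_user, and dispatch are all injected, illustrative callables):

```python
def confirm_gate(call, estimate_cost, ask_user, dispatch):
    """Run a tool call only after explicit user confirmation when the
    estimated cost is high. All callable names are hypothetical."""
    cost = estimate_cost(call["tool"], call["args"])
    if cost["expensive"] and not ask_user(f"{call['tool']}: {cost['reason']}. Proceed?"):
        return {"status": "cancelled", "tool": call["tool"]}
    return {"status": "applied", "result": dispatch(call["tool"], call["args"])}
```

Standardizing something like this at the protocol level (rather than per server) is exactly what we would hope MCP can offer.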
If it’s useful, we can also share how we represent areas (polygons), week labeling, and the “link-back” UX pattern (3D overlay click → filtered WBS view → evidence drawer) in a way that is reusable for the community.