How to deliver an effective ASPEN demonstration
The ASPEN Command & Control UI is a professional-grade operational interface — a single-page web application built on React 19, TypeScript, Mapbox GL, and deck.gl. It was designed from the ground up to look, feel, and function like the kind of interface a Navy operator would use in a real mission environment. It is not a debug tool or a research dashboard bolted on top of a simulation. It is an operationally realistic front end that demonstrates what full-stack autonomous undersea operations would look like.
The interface runs in any modern browser and connects to the ASPEN simulation backend via a REST API and WebSocket stream. During the demo it runs against a high-fidelity mock backend that returns realistic data, correct data types, and real-time telemetry updates — indistinguishable from a live connection.
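A minimal sketch of how a client might type and validate frames from the telemetry stream. The message fields and shape here are illustrative assumptions, not the actual ASPEN wire format:

```typescript
// Assumed telemetry frame shape -- field names are hypothetical.
interface TelemetryMessage {
  vehicleId: string;
  timestamp: number; // sim time, seconds
  position: { lat: number; lon: number; depthM: number };
  headingDeg: number;
  batteryPct: number;
}

// Parse one WebSocket frame into a typed message, rejecting malformed payloads
// so a dropped or corrupted frame never reaches the UI state.
function parseTelemetry(raw: string): TelemetryMessage | null {
  try {
    const msg = JSON.parse(raw);
    if (
      typeof msg?.vehicleId !== "string" ||
      typeof msg?.batteryPct !== "number" ||
      typeof msg?.position?.lat !== "number"
    ) {
      return null;
    }
    return msg as TelemetryMessage;
  } catch {
    return null;
  }
}
```

In a real client this would sit in the `onmessage` handler of the WebSocket connection; the mock backend only needs to emit JSON matching the same shape for the demo to be indistinguishable from a live link.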
Simulation engines are hard for non-engineers to evaluate. Isaac Sim is powerful, but a raw 3D viewport with vehicle meshes and telemetry logs does not communicate operational value to a commander, program manager, or decision-maker. The C2 UI translates the simulation into the language of operations.
When an audience sees a mission being planned on a real nautical chart, watches multiple vehicles execute autonomously, sees a comms degradation alert fire in real time, and then reviews the deviation analysis — they understand what ASPEN enables. The C2 UI is the argument that this platform is operationally relevant, not just technically interesting.
Mission authoring. Place waypoints on the chart, assign vehicles, define operational area and geofence, set per-waypoint behaviors (transit, survey, loiter, hover, surface), configure abort conditions and recovery point. Validates against vehicle capabilities and depth constraints before execution.
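The pre-execution check described above can be sketched as a pure validation function. The interfaces and threshold logic here are illustrative assumptions about how such a check might work, not ASPEN's actual planner code:

```typescript
type Behavior = "transit" | "survey" | "loiter" | "hover" | "surface";

// Hypothetical planner types -- names and fields are assumptions.
interface Waypoint {
  lat: number;
  lon: number;
  depthM: number;
  behavior: Behavior;
}

interface Vehicle {
  id: string;
  maxDepthM: number;        // rated depth
  behaviors: Behavior[];    // supported per-waypoint behaviors
}

// Validate every waypoint against the assigned vehicle's capabilities and
// depth rating; return human-readable errors for the authoring UI to surface.
function validateMission(waypoints: Waypoint[], vehicle: Vehicle): string[] {
  const errors: string[] = [];
  waypoints.forEach((wp, i) => {
    if (wp.depthM > vehicle.maxDepthM) {
      errors.push(
        `waypoint ${i}: depth ${wp.depthM} m exceeds ${vehicle.id} rating (${vehicle.maxDepthM} m)`
      );
    }
    if (!vehicle.behaviors.includes(wp.behavior)) {
      errors.push(`waypoint ${i}: ${vehicle.id} does not support '${wp.behavior}'`);
    }
  });
  return errors;
}
```

Running validation on every edit, rather than only at launch, is what lets the authoring UI block an infeasible mission before execution.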
Mission rehearsal. Plays back a physics-generated telemetry series with adjustable speed (1x–16x). Environmental parameters (current strength, sea state, thermocline depth, water temperature, visibility) are tunable in real time — change conditions and regenerate the run to see mission impact. Event log shows waypoint completions, alerts, and autonomy decisions as they occur in sim time.
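Scrubbing through a pre-generated telemetry series at 1x–16x requires resampling between recorded points. A minimal sketch, assuming a sorted series and linear interpolation (the actual rehearsal player's scheme is not specified in the source):

```typescript
// One recorded telemetry sample at sim time t (seconds). Illustrative shape.
interface Sample {
  t: number;
  lat: number;
  lon: number;
}

// Return the track position at playback sim time `t`, linearly interpolating
// between the two bracketing samples and clamping at the series ends.
function sampleAt(series: Sample[], t: number): Sample {
  if (t <= series[0].t) return series[0];
  const last = series[series.length - 1];
  if (t >= last.t) return last;
  const i = series.findIndex((s) => s.t > t); // first sample past t
  const a = series[i - 1];
  const b = series[i];
  const f = (t - a.t) / (b.t - a.t);
  return { t, lat: a.lat + f * (b.lat - a.lat), lon: a.lon + f * (b.lon - a.lon) };
}
```

At 16x playback, sim time simply advances sixteen seconds per wall-clock second and `sampleAt` is queried each animation frame, so the vehicle icons stay smooth regardless of the recorded sample rate.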
Live operations. Vehicles update position, heading, and sensor status at 1-second intervals via WebSocket. Battery depletion, comms degradation, and autonomy state changes appear in real time. Fleet health panel gives an at-a-glance status for all assigned vehicles. Alert queue surfaces critical conditions with severity classification.
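The severity classification feeding the alert queue might look like the sketch below. The thresholds are invented for illustration; the source does not specify ASPEN's actual cutoffs:

```typescript
type Severity = "info" | "warning" | "critical";

// Hypothetical battery-level thresholds for the alert queue.
function classifyBattery(pct: number): Severity {
  if (pct < 15) return "critical"; // abort-consideration territory
  if (pct < 30) return "warning";  // flag in fleet health panel
  return "info";
}
```

Evaluating rules like this on every 1-second telemetry tick is what lets battery depletion and comms degradation surface in the alert queue in real time rather than on a polling delay.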
Post-mission assessment. Overlays planned track against actual track and marks deviations with autonomy-generated explanations (why the vehicle deviated, which behavior triggered it, and with what confidence). Coverage heatmap shows sensor footprint versus tasked area. Metrics panel: coverage percentage, distance traveled, deviations logged, estimated mission success probability.
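Deviation marking can be sketched as a distance test between the two tracks. This is a simplified illustration assuming positions projected to meters in a local tangent plane and a nearest-point comparison; the real analysis pairs each deviation with an autonomy explanation, which is out of scope here:

```typescript
// Position in meters in a local tangent plane (assumed projection).
interface Pt {
  x: number;
  y: number;
}

// Return the indices of actual-track points that lie farther than `thresholdM`
// meters from the nearest planned-track point -- candidate deviation markers.
function deviations(planned: Pt[], actual: Pt[], thresholdM: number): number[] {
  const dist = (a: Pt, b: Pt) => Math.hypot(a.x - b.x, a.y - b.y);
  return actual
    .map((p, i) => ({ i, d: Math.min(...planned.map((q) => dist(p, q))) }))
    .filter(({ d }) => d > thresholdM)
    .map(({ i }) => i);
}
```

A production version would compare against the interpolated planned path segment rather than discrete waypoints, but the nearest-point form is enough to convey the idea during a walkthrough.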
Troubleshooting. If the simulation backend fails to start, check nvidia-smi output and verify the CUDA version matches what Isaac Sim expects; if the driver is missing or outdated, run sudo apt install nvidia-driver-560 and reboot. If vehicle or environment assets are missing, run git lfs pull in the assets/ directory. If the chart fails to render, the Mapbox token in the .env file may be invalid; regenerate it from the Mapbox dashboard.