The ASPEN code itself — vehicle physics, ocean environments, sensor simulation, mission guidance, and the Command & Control UI. This is the layer MITRE built and transferred. It uses everything below.
NVIDIA's robotics simulation application. Provides the physics engine, sensor simulation framework, rendering, and Python API that ASPEN builds on. Think of it as the "game engine" — but for scientific-grade robotics simulation instead of games.
The underlying platform Isaac Sim runs inside. Omniverse provides the 3D scene format (USD), real-time collaboration, visual programming (OmniGraph), and the material/rendering pipeline. It is NVIDIA's answer to the problem of building interconnected 3D simulation worlds.
Everything runs on NVIDIA GPUs. Physics, rendering, volumetric ocean queries, AI training — all accelerated by CUDA and specialized hardware (RTX ray tracing cores, Tensor cores for AI). This is why an NVIDIA GPU is a hard requirement.
By building on NVIDIA's platform, ASPEN inherits millions of dollars of engineering in physics accuracy, GPU acceleration, and rendering. ASPEN developers focus only on what makes their problem unique — underwater vehicle dynamics and ocean environments — while NVIDIA handles the heavy infrastructure.
Web-based mission planning and monitoring. React + TypeScript + deck.gl for GPU-accelerated map visualization. Four modes: Plan, Simulate, Monitor, Analyze. Dark maritime theme with MIL-STD-2525D symbology.
The master conductor. Initializes Isaac Sim, loads the ocean and seafloor, places vehicles, runs the physics loop step by step, manages sensors, computes acoustic propagation, and streams results. Includes reinforcement learning scaffolding for AI training.
Imports raw ocean data from NOAA, Navy forecasts, and GEBCO. Converts coordinates, voxelizes to GPU-friendly formats, computes wind noise.
Full 6-DOF physics for 4 vehicles. 11 sensor types. PID autopilots. Waypoint guidance. ROS 2 interfaces for real hardware.
Ready-to-use 3D vehicle models (USD), pre-processed bathymetry and ocean data for 5 real-world locations, atmospheric data, and ocean surface rendering assets.
The file format for 3D scenes, created by Pixar and adopted by NVIDIA. Every vehicle, ocean surface, and seafloor mesh in ASPEN is a USD file. USD lets complex scenes be assembled from many files at runtime.
NVIDIA's technology for storing 3D volumetric data efficiently on the GPU. The ocean is a massive volume — temperature, salinity, and currents at every point. fVDB uses a sparse tree structure that only stores data where it exists.
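As an illustration of why sparse storage pays off, here is a minimal, CPU-only sketch of a sparse voxel grid. The class and its methods are invented for this example and are not the fVDB API; fVDB's real structure is a GPU-resident VDB tree.

```python
# Conceptual sketch of sparse voxel storage (illustration only: this
# dict-based class is NOT the fVDB API, just the underlying idea).
class SparseOceanGrid:
    def __init__(self, voxel_size_m):
        self.voxel_size = voxel_size_m
        self.voxels = {}  # (i, j, k) -> field dict; empty water stores nothing

    def _index(self, x, y, z):
        s = self.voxel_size
        return (int(x // s), int(y // s), int(z // s))

    def set(self, x, y, z, **fields):
        self.voxels[self._index(x, y, z)] = fields

    def query(self, x, y, z, field, default=None):
        return self.voxels.get(self._index(x, y, z), {}).get(field, default)

grid = SparseOceanGrid(voxel_size_m=10.0)
grid.set(120.0, 45.0, -30.0, temperature_c=14.2, salinity_psu=33.8)
temp = grid.query(125.0, 47.0, -25.0, "temperature_c")  # same 10 m voxel
```

Only occupied voxels consume memory, which is what makes ocean-scale volumes fit on a GPU.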
NVIDIA's GPU-accelerated physics engine. In ASPEN, PhysX handles rigid body dynamics — how vehicles respond to forces, collisions with the seafloor, buoyancy effects, and hydrodynamic drag. GPU physics lets many vehicles run simultaneously.
Six Degrees of Freedom — vehicles translate along and rotate about all three axes. ASPEN models the full hydrodynamic force balance: thrust, drag, added mass, Coriolis effects, and buoyancy. These equations make simulated vehicles behave like their real counterparts.
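To make the force balance concrete, here is a hedged sketch of just one axis (surge, the forward direction), with invented coefficients rather than real REMUS parameters:

```python
# One-axis slice (surge) of the 6-DOF force balance: thrust against
# quadratic drag, with added mass. Coefficients are illustrative only.
MASS = 30.0        # kg, dry mass (roughly REMUS 100 class)
ADDED_MASS = 3.0   # kg, hydrodynamic added mass in surge
DRAG_COEFF = 4.0   # kg/m, quadratic drag coefficient

def step_surge(velocity, thrust, dt):
    drag = DRAG_COEFF * velocity * abs(velocity)
    accel = (thrust - drag) / (MASS + ADDED_MASS)
    return velocity + accel * dt

# Constant 20 N thrust drives the vehicle toward terminal velocity,
# where thrust balances drag: v_t = sqrt(thrust / DRAG_COEFF).
v = 0.0
for _ in range(2000):
    v = step_surge(v, thrust=20.0, dt=0.05)
```

The full model does this simultaneously for all six axes, with coupling terms between them.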
A ray-tracing engine for modeling how sound travels through water. Sound is the primary sensing modality underwater. Bellhop traces acoustic rays accounting for how temperature and salinity bend sound, how the seafloor absorbs it, and how signals fade.
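The core of that bending is Snell's law. A minimal sketch, assuming a toy linear sound-speed profile (this is not Bellhop itself, just the refraction step it applies along each ray):

```python
import math

# Minimal Snell's-law illustration of how a sound-speed gradient bends a
# ray. The linear profile below is made up for the example.
def sound_speed(depth_m):
    return 1500.0 + 0.017 * depth_m  # speed grows with depth in this toy profile

def refract(theta_from_horizontal, depth_a, depth_b):
    """Snell's invariant: cos(theta) / c is constant along the ray."""
    c_a, c_b = sound_speed(depth_a), sound_speed(depth_b)
    cos_b = math.cos(theta_from_horizontal) * c_b / c_a
    if cos_b >= 1.0:
        return None  # ray turns back toward slower water before reaching depth_b
    return math.acos(cos_b)

# A ray launched 10 degrees below horizontal at 100 m flattens as it
# descends into faster water; a shallower 2-degree ray turns entirely.
theta_100 = math.radians(10.0)
theta_500 = refract(theta_100, 100.0, 500.0)
```

Rays turning back toward slower water is exactly how sound channels form.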
Open-source robotics middleware. ASPEN's ROS 2 bridges let simulated vehicles communicate using the same protocols as real hardware. Algorithms tested in simulation can deploy directly to a physical BlueROV2.
| Vehicle | Type | Description | Physics | Key Use |
|---|---|---|---|---|
| REMUS 100 | AUV | Torpedo-shaped autonomous vehicle (~30 kg). Standard Navy survey platform. | Full 6-DOF + GPU batched | Mine countermeasures, survey |
| REMUS 600 | AUV | Larger REMUS variant. Greater depth rating and endurance. | Full 6-DOF | Deep water ISR |
| IVER 3 | AUV | Long-endurance AUV with modular payload bay. | Full 6-DOF | Research, monitoring |
| BlueROV2 Heavy | ROV | 8-thruster ROV by Blue Robotics. Open-source. 300m depth. | Full 6-DOF + ROS 2 | Inspection, sim-to-real |
| Slocum G3 | Glider | Buoyancy-driven glider. Months-long endurance. No propeller. | 3D model only | Persistent monitoring |
Each vehicle can be equipped with any combination of these sensor models.
- Position, velocity, heading with configurable noise and drift
- Depth from hydrostatic pressure with Gaussian noise model
- Seafloor imaging with echo simulation and shadow detection
- Current profiling and bottom-depth measurement
- Forward-looking range-based collision detection
- Visual sensor with Isaac Sim rendering pipeline
- GPU-batched distance sensing via sparse voxel ray casting
- Bellhop ray tracing for transmission loss and sound channels
- Real-time temperature, salinity, current at vehicle position
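As one example of a sensor model from the list above, a hydrostatic depth sensor can be sketched as a pressure-to-depth conversion plus additive Gaussian noise. The noise figure here is illustrative, not ASPEN's calibrated value.

```python
import random

# Sketch of a hydrostatic depth sensor: depth from pressure with additive
# Gaussian noise (sigma is illustrative).
RHO = 1025.0      # seawater density, kg/m^3
G = 9.80665       # gravity, m/s^2
P_ATM = 101325.0  # surface pressure, Pa

def true_pressure(depth_m):
    return P_ATM + RHO * G * depth_m

def measured_depth(pressure_pa, noise_sigma_m=0.05, rng=random):
    depth = (pressure_pa - P_ATM) / (RHO * G)
    return depth + rng.gauss(0.0, noise_sigma_m)

rng = random.Random(42)
readings = [measured_depth(true_pressure(50.0), rng=rng) for _ in range(1000)]
mean_depth = sum(readings) / len(readings)  # ~50 m, jittered by sensor noise
```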
Bathymetry from NOAA/GEBCO/USGS surveys. Ocean conditions from Navy models (NCOM, HYCOM, ROMS) — temperature, salinity, current fields. Atmospheric data from WRF weather models. Formats: NetCDF, GeoTIFF, GRIB, OpenDAP, Zarr.
Convert lat/lon/depth to local Cartesian coordinates (meters). Subset to the area of interest. Regrid irregular ocean model output to uniform voxels. Compute derived quantities like wind noise spectra. Handle time-varying data.
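The coordinate conversion step can be sketched with an equirectangular approximation, which is adequate over a few kilometres; ASPEN's actual pipeline may use a proper map projection.

```python
import math

# Equirectangular approximation: lat/lon to local metres around a
# reference origin. Good enough for areas a few km across.
EARTH_RADIUS_M = 6_371_000.0

def latlon_to_local(lat, lon, origin_lat, origin_lon):
    lat0 = math.radians(origin_lat)
    east = math.radians(lon - origin_lon) * EARTH_RADIUS_M * math.cos(lat0)
    north = math.radians(lat - origin_lat) * EARTH_RADIUS_M
    return east, north

# One degree of latitude is ~111 km everywhere; a degree of longitude
# shrinks with cos(latitude), so east < north for equal angular offsets.
east, north = latlon_to_local(36.62, -121.90, 36.61, -121.91)
```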
Convert processed data into NVIDIA's sparse voxel format for microsecond GPU queries. "What is the temperature here?" can be answered instantly for any point in the water column. Output: .nvdb files.
Generate seafloor mesh with collision physics and sediment materials. Load ocean volume as queryable fVDB grid. Place vehicle 3D models. Set up camera, lighting, sky, and ocean surface. Assembled as a USD scene in Omniverse.
Main loop: advance physics, update vehicle positions from hydrodynamic forces, query ocean at each position, feed sensors to guidance, repeat. Telemetry streams to the C2 UI or logs for post-mission analysis.
Five real-world environments are included and ready to simulate.
Primary test environment. Multiple resolutions from 0.5m to 3m. Ocean currents with tidal and wind-driven components. 8 bathymetry variants.
Fjord-like environment with steep sidewalls. 10m and 90m variants. Ideal for testing navigation in constrained waterways. Used in primary SDK examples.
Strong tidal currents in an urban waterway. Tests vehicle performance in high-current environments with complex flow patterns.
Multiple resolutions (3m, 10m, 90m). Tropical conditions with different temperature and salinity profiles. Deep to shallow transitions.
Isaac Sim starts (headless or with a visible 3D window). The USD stage is created — an empty 3D world. The physics engine and timeline are initialized.
Bathymetry mesh generated from pre-processed data with collision physics and sediment materials. Ocean volume loaded as fVDB grid. Optionally add surface rendering and sky assets.
USD vehicle models placed at starting positions. Each gets a physics model (6-DOF), sensor suite, and guidance system from YAML config. GPU-batched physics manager handles all vehicles in parallel.
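A hypothetical YAML fragment of the kind described, with invented field names (this is not ASPEN's actual schema):

```yaml
# Illustrative vehicle config -- field names are made up for this example.
vehicle:
  model: remus_100
  start_position: {x: 0.0, y: 0.0, depth: 5.0}
  sensors:
    - type: ins
      drift_rate: 0.001
    - type: pressure_depth
      noise_sigma_m: 0.05
  guidance:
    max_depth_m: 100.0
    max_speed_mps: 2.3
```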
Waypoints set programmatically, from config, or through the C2 UI. Guidance computes heading/depth commands. Constraints applied (max depth, speed, battery).
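The heading/depth computation can be sketched as follows; the function and its constraint value are illustrative, not ASPEN's guidance code.

```python
import math

# Sketch of waypoint guidance: desired heading and depth from the
# vehicle's position and the next waypoint, with a depth constraint.
MAX_DEPTH_M = 100.0  # illustrative constraint

def guidance_command(pos, waypoint):
    """pos/waypoint: (east_m, north_m, depth_m) -> (heading_deg, depth_m)."""
    de = waypoint[0] - pos[0]
    dn = waypoint[1] - pos[1]
    heading = math.degrees(math.atan2(de, dn)) % 360.0  # 0 deg = north, clockwise
    depth = min(waypoint[2], MAX_DEPTH_M)               # apply depth constraint
    return heading, depth

# A waypoint to the northeast yields a 45-degree heading command.
heading, depth = guidance_command((0.0, 0.0, 5.0), (100.0, 100.0, 20.0))
```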
This loop repeats every time step (typically 1/60th of a second):
a. Query ocean conditions at each vehicle's position (temperature, salinity, current velocity)
b. Guidance computes steering commands (pitch, yaw) based on position vs. next waypoint
c. Physics model calculates forces — thrust, drag, buoyancy, ocean currents
d. Integrate equations of motion for new position, velocity, orientation
e. Update sensors — sonar, INS, pressure read new state with noise
f. Update 3D scene — vehicle models move to new positions
g. Optionally run acoustic propagation for sound coverage
h. Log telemetry and stream to C2 UI
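The steps above can be condensed into a self-contained, one-dimensional toy loop. Every component here is a stand-in for the real PhysX, fVDB, and Bellhop machinery, so treat this as the shape of the loop rather than its implementation.

```python
import math

# Toy per-tick loop (steps a-h above) for one vehicle in one dimension.
DT = 1.0 / 60.0
DRAG = 4.0   # kg/m, illustrative
MASS = 33.0  # kg, including added mass

def ocean_current(x):                      # a. ocean query (toy current field)
    return 0.1 * math.sin(x / 50.0)

def guidance(x, waypoint_x):               # b. steering: full thrust toward goal
    return 20.0 if waypoint_x > x else -20.0

def tick(x, v, waypoint_x):
    current = ocean_current(x)
    thrust = guidance(x, waypoint_x)
    rel_v = v - current                    # c. drag acts on speed through water
    drag = DRAG * rel_v * abs(rel_v)
    accel = (thrust - drag) / MASS
    v += accel * DT                        # d. integrate equations of motion
    x += v * DT
    noisy_x = x                            # e. sensors would add noise here
    return x, v, noisy_x                   # f-g. scene/acoustics omitted in toy

x, v = 0.0, 0.0
telemetry = []                             # h. telemetry log
for _ in range(6000):                      # 100 s of simulated time
    x, v, noisy_x = tick(x, v, waypoint_x=150.0)
    telemetry.append(noisy_x)
```

Run to completion, the vehicle settles into a small bang-bang oscillation around the 150 m waypoint.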
Review vehicle tracks, sensor coverage, acoustic propagation, energy consumption, and mission success metrics. Compare planned vs. actual paths. Identify where currents pushed vehicles off course.
On a single RTX GPU, ASPEN simulates ~10–20 vehicles in real time. On the NPS DGX GB300 (72 Blackwell Ultra GPUs, 20+ TB GPU memory, 1+ exaFLOP), the same simulation scales to hundreds of vehicles or thousands of Monte Carlo variants in parallel — fleet-scale research and AI training not possible on a single machine.
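A toy illustration of the Monte Carlo idea, run serially with invented parameters (on the DGX this batching would be GPU-parallel across full ASPEN simulations, not a loop like this):

```python
import random
import statistics

# Monte Carlo mission variants: the same 1-D transit repeated with
# randomly perturbed along-track currents. All numbers are illustrative.
def transit_time(current_mps, rng, distance=1000.0, speed=2.0):
    t, x = 0.0, 0.0
    while x < distance:
        x += speed + current_mps + rng.gauss(0.0, 0.05)  # 1 s step
        t += 1.0
    return t

rng = random.Random(7)
times = [transit_time(rng.uniform(-0.3, 0.3), rng) for _ in range(500)]
spread = statistics.pstdev(times)  # mission-duration spread across variants
```

The spread across variants is the kind of statistic fleet-scale runs exist to estimate.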