The Challenge
NASA's Artemis III mission requires selecting a landing site near the lunar south pole before the crew arrives. The problem: real-time control from Earth is impractical because of communication delay; orbital imagery (roughly 0.5 m/pixel) is too coarse to resolve surface-level hazards such as boulders, small craters, and unstable regolith; and manual analysis by specialists takes months and still lacks ground-truth verification before astronauts land.
The solution had to be fully autonomous — a rover that could navigate, characterize terrain, score potential habitat sites, and stream data back to Earth without waiting for command confirmations on every action.
Autonomous Rover System
Eight ROS2 nodes form the cognitive and operational architecture. Each node has a single responsibility and communicates via the ROS2 pub/sub system — loosely coupled, independently testable, and resilient to individual node failures.
1. Navigation Controller
Path planning with lunar terrain optimizations — handles low-gravity dynamics, regolith slip, and slope constraints. Implements A* with terrain cost maps.
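A minimal sketch of grid-based A* with a terrain cost map, in the spirit of the node described above. The cost values, grid representation, and function names here are illustrative assumptions, not the project's actual implementation; the real planner would derive per-cell costs from slope and regolith-slip estimates.

```python
import heapq
import itertools

def astar(cost_map, start, goal):
    """A* over a 2D grid. cost_map[r][c] is an extra traversal cost for
    that cell (e.g. derived from slope and slip estimates), or None for
    impassable cells. Returns a list of (row, col) waypoints, or None."""
    rows, cols = len(cost_map), len(cost_map[0])
    tie = itertools.count()  # tiebreaker so the heap never compares nodes

    def h(a, b):  # Manhattan distance: admissible on a 4-connected grid
        return abs(a[0] - b[0]) + abs(a[1] - b[1])

    frontier = [(h(start, goal), next(tie), 0.0, start, None)]
    came_from, best_g = {}, {start: 0.0}
    while frontier:
        _, _, g, cur, parent = heapq.heappop(frontier)
        if cur in came_from:      # already expanded via a cheaper route
            continue
        came_from[cur] = parent
        if cur == goal:           # walk parent links back to the start
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and cost_map[nr][nc] is not None:
                ng = g + 1.0 + cost_map[nr][nc]  # unit step + terrain penalty
                if ng < best_g.get(nxt, float("inf")):
                    best_g[nxt] = ng
                    heapq.heappush(frontier, (ng + h(nxt, goal), next(tie), ng, nxt, cur))
    return None  # goal unreachable
```

Raising a cell's cost steers paths around it without forbidding it outright, which is the point of a cost map over a plain occupancy grid.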
2. Sensor Fusion
Integrates LiDAR, stereo cameras, and IMU via SLAM (Simultaneous Localization and Mapping). Builds a real-time 3D terrain model as the rover moves.
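A full SLAM pipeline is beyond a short example, but the core fusion idea, blending a fast-but-drifting sensor with a noisy-but-absolute one, can be sketched as a complementary filter. This is an illustration of the principle, not the project's actual fusion code.

```python
def complementary_filter(pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """Blend gyro integration (fast, drifts over time) with an
    accelerometer tilt estimate (noisy, but absolute). alpha near 1
    trusts the gyro on short timescales; the remainder lets the
    accelerometer pull drift back toward the true angle."""
    return alpha * (pitch + gyro_rate * dt) + (1 - alpha) * accel_pitch

# Example: the estimate has drifted to 5 degrees while the rover sits
# still (gyro rate 0, accelerometer reads 0 pitch). Repeated updates
# decay the drift toward the accelerometer's reading.
pitch = 5.0
for _ in range(100):
    pitch = complementary_filter(pitch, gyro_rate=0.0, accel_pitch=0.0, dt=0.01)
```

The same blend-fast-with-absolute pattern is what visual odometry provides as the IMU backup mentioned under the stereo cameras below.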
3. Habitat Monitor
Tracks environmental parameters — temperature, radiation levels, dust density. Maintains alert thresholds and flags conditions outside mission parameters.
4. Maintenance Patrol
Automated inspection routines for rover systems. Battery management, motor health monitoring, sensor calibration checks on schedule.
5. AI Decision Engine
Cognitive layer. Synthesizes data from all other nodes to make high-level mission decisions — when to investigate a candidate site, when to retreat from hazards.
6. Habitat Site Analyzer
Scores candidate sites on 4 axes: safety, buildability, resource availability, and expandability. Produces ranked site recommendations for Mission Control.
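The ranking can be sketched as a weighted composite over the four axes. The weights below are hypothetical; the writeup does not state how the Site Analyzer actually weights or combines the axes.

```python
# Hypothetical weights -- the actual Site Analyzer weighting is not
# documented in this writeup.
WEIGHTS = {"safety": 0.4, "buildability": 0.25,
           "resources": 0.2, "expandability": 0.15}

def composite_score(site):
    """site maps each axis name to a 0-1 score; returns the weighted sum."""
    return sum(WEIGHTS[axis] * site[axis] for axis in WEIGHTS)

def rank_sites(sites):
    """sites: dict of site name -> axis scores. Returns names, best first."""
    return sorted(sites, key=lambda name: composite_score(sites[name]), reverse=True)
```

A weighted sum keeps the ranking explainable to Mission Control: each site's composite decomposes into per-axis contributions.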
7. WebSocket Bridge
Streams live telemetry, terrain data, and site scores from the ROS2 network to the browser dashboard. Sub-second latency for mission-critical monitoring.
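A sketch of the frame encoding such a bridge might use: each ROS2 message is wrapped in a JSON envelope the browser can route by topic. The `{topic, stamp, data}` schema is an assumption for illustration, not the project's actual wire format.

```python
import json
import time

def telemetry_frame(topic, payload):
    """Wrap one ROS2 message payload in a JSON frame for the dashboard.
    The topic/stamp/data envelope is a hypothetical schema, not the
    bridge's documented format."""
    return json.dumps({"topic": topic, "stamp": time.time(), "data": payload})
```

On the server side, a loop (for example built on a `websockets`-style async server) would push frames like these to connected dashboard clients at the 1 Hz rate described under Live Telemetry below.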
8. Web Dashboard
React frontend providing Mission Control with live telemetry, interactive terrain maps, site ranking tables, and mission timeline visualization.
Landing Site Scoring
Multi-Sensor Array
LiDAR
Primary terrain mapping sensor. Generates point clouds for obstacle detection and slope calculation. Critical for SLAM localization in a GPS-free environment.
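Slope calculation from a point-cloud patch can be sketched as a least-squares plane fit: fit z = ax + by + c to the points, then take the plane's angle from horizontal. This is a minimal illustration, not the project's actual terrain-mapping code.

```python
import math
import numpy as np

def slope_deg(points):
    """Fit a plane z = a*x + b*y + c to an (N, 3) point-cloud patch by
    least squares and return its slope from horizontal, in degrees."""
    pts = np.asarray(points, dtype=float)
    # Design matrix [x, y, 1] for the linear model z = a*x + b*y + c.
    A = np.c_[pts[:, 0], pts[:, 1], np.ones(len(pts))]
    (a, b, _), *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    # The gradient magnitude sqrt(a^2 + b^2) is the tangent of the slope.
    return math.degrees(math.atan(math.hypot(a, b)))
```

Running this over a sliding window of the LiDAR cloud yields the per-cell slope values a terrain cost map needs.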
Stereo Cameras
Depth perception for close-range hazard detection. Visual odometry as IMU backup. Surface texture analysis for regolith stability assessment.
IMU
Inertial Measurement Unit tracks orientation and acceleration. Essential for maintaining stability on uneven terrain and detecting tip-over risk.
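Tip-over detection from the IMU can be sketched as a tilt check: at rest the accelerometer measures the gravity vector, so the angle between that vector and the body's z-axis is the rover's tilt. The 25-degree threshold below is an illustrative assumption, not a published rover limit.

```python
import math

def tilt_deg(ax, ay, az):
    """Tilt from vertical given accelerometer readings in any consistent
    units; at rest the accelerometer reads the gravity vector."""
    return math.degrees(math.acos(az / math.hypot(ax, ay, az)))

def tip_over_risk(ax, ay, az, limit_deg=25.0):
    """Flag when tilt exceeds a threshold. The default limit is an
    assumed value for illustration, not a mission spec."""
    return tilt_deg(ax, ay, az) > limit_deg
```

In practice this static check would be combined with gyro rates so a fast roll is flagged before the static tilt limit is reached.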
Temperature Sensors
Monitors lunar surface temperature extremes (-173°C to 127°C). Feeds habitat monitoring node — critical for identifying thermally stable zones.
Spectrometer Simulation
Simulated spectral analysis for water-ice detection. Permanently shadowed regions at the lunar south pole are the primary targets for resource scoring.
React Mission Dashboard
Live Telemetry
Real-time rover state — battery, motor health, sensor status, current coordinates. WebSocket stream with 1Hz update rate for critical metrics.
Terrain Map Generation
Live-rendered 3D terrain map updating as SLAM processes new LiDAR data. Interactive — Mission Control can click candidate zones for site analysis.
Site Ranking
Scored site list updates in real-time. Sortable by any of the 4 scoring axes. Visual overlay on terrain map shows site positions and composite scores.
Mission Timeline
Planned vs. actual mission progress. Tracks exploration coverage, sites evaluated, and estimated remaining battery life against mission objectives.
NASA Pitch
LunaBot reached the Regional Finalist stage of the NASA Space Apps Challenge. The pitch covered the full system: architecture, simulation results, scoring methodology, and the path from prototype to flight hardware.
The key differentiator in the evaluation wasn't the simulation — it was the architecture. Eight ROS2 nodes, clean separation of concerns, and a mission dashboard that made the autonomous decision-making legible to non-technical judges. Judges can't evaluate code. They evaluate whether you understand the problem deeply enough to explain the solution clearly.
Technical Decisions
ROS2 over ROS1
Real-time capabilities, a DDS-based security layer, multi-robot support, and active maintenance. ROS1 is end-of-life, which makes ROS2 the right choice for a system targeting flight hardware.
Webots over Gazebo
Better lunar environment simulation out of the box — regolith physics, vacuum conditions, solar illumination angles. Lower setup overhead for a hackathon timeline.
NVIDIA Jetson as target
The compute-to-power ratio fits solar-powered lunar operations. Jetson AGX Orin delivers up to 275 TOPS at 60 W, a ratio few embedded platforms match for this use case.
Claude Code's Role
ROS2 Architecture
Designing the 8-node system, message type definitions, topic/service naming conventions, and launch file structure. ROS2 has a steep learning curve — Claude Code compressed weeks of documentation into working patterns.
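One concrete piece of that convention work is topic naming. The layout below is a hypothetical example of the kind of `/<robot>/<node>/<message>` scheme such a system might standardize on; the writeup does not list the project's actual topic names.

```python
import re

# Hypothetical topic layout -- illustrative only, not the project's
# actual topic names.
TOPICS = {
    "terrain": "/lunabot/sensor_fusion/terrain_map",
    "sites": "/lunabot/site_analyzer/ranked_sites",
    "health": "/lunabot/maintenance/battery_state",
}

# Assumed convention: three lower_snake_case segments, robot/node/message.
TOPIC_RE = re.compile(r"^/[a-z][a-z0-9_]*(/[a-z][a-z0-9_]*){2}$")

def valid_topic(name):
    """Check a topic name against the assumed naming convention."""
    return bool(TOPIC_RE.match(name))
```

Enforcing a convention like this in CI keeps an eight-node pub/sub graph navigable as it grows.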
Simulation Setup
Webots world configuration for lunar terrain, gravity, and lighting. Physics parameters for regolith simulation. ROS2-Webots bridge configuration for sensor data routing.
Dashboard Development
React dashboard architecture, WebSocket integration with the ROS2 bridge node, terrain visualization components, and real-time state management for mission-critical data.
Key Lesson
Hackathon quality and production quality aren't mutually exclusive. The right abstractions (ROS2, Webots, React) combined with Claude Code for boilerplate meant the focus could stay on the novel parts — the scoring algorithm, the sensor fusion logic, the decision engine. You win hackathons by solving hard problems clearly, not by building more features.