# Interactive Wizard Testing Guide
## Overview
This document describes how to test the new interactive `mecha10 init` wizard.
## Implementation Summary
The interactive wizard provides a comprehensive developer experience when initializing new robot projects. Instead of requiring command-line flags, users can now run `mecha10 init` and be guided through the complete configuration process, including hardware selection, behaviors, AI capabilities, and dashboard configuration.
### Phase 1 (Completed)
- Basic project name and robot type selection
- Simulation generation toggle
- Configuration summary
### Phase 2 (Completed - Current)
- Hardware component multi-select with smart defaults
- Behavior selection (navigate, follow, deliver, etc.)
- AI/ML capabilities (LLM control, object detection, speech recognition, etc.)
- Dashboard and control mode configuration
- Comprehensive configuration summary with confirmation
- Smart defaults based on robot type
## Files Modified
1. **Cargo.toml**
- Added `dialoguer = "0.11"` dependency for interactive prompts
2. **src/main.rs**
   - Modified the `Commands::Init` enum variant to make `name` and `template` optional
- Added `cmd_init_interactive()` function with interactive prompts
- Updated command dispatcher to call interactive mode when parameters are missing
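The dispatch rule above can be sketched as follows. This is an illustrative model only (function and variable names are hypothetical, not the actual source): the wizard runs whenever either optional argument is missing.

```rust
// Hypothetical model of the dispatcher: both arguments present means the
// traditional non-interactive path; anything missing triggers the wizard.
fn choose_mode(name: Option<&str>, template: Option<&str>) -> &'static str {
    match (name, template) {
        (Some(_), Some(_)) => "non-interactive",
        _ => "interactive", // any missing parameter triggers the wizard
    }
}

fn main() {
    // Scenario 4: both provided -> existing behavior preserved.
    assert_eq!(choose_mode(Some("my-robot"), Some("rover")), "non-interactive");
    // Scenarios 1-3: one or both missing -> interactive prompts.
    assert_eq!(choose_mode(None, None), "interactive");
    assert_eq!(choose_mode(Some("my-robot"), None), "interactive");
    println!("dispatch checks passed");
}
```

This matches the testing scenarios below: only the fully specified invocation (`name` plus `--template`) skips the prompts.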
## Testing Scenarios
### 1. Full Interactive Mode (Phase 2)
```bash
mecha10 init
```
**Expected behavior:**
1. Prompts for project name
2. Shows menu to select robot type (rover/humanoid/arm/basic)
3. **NEW:** Hardware component multi-select with smart defaults
- Rover gets: camera, lidar, imu, motor_controller pre-selected
- Humanoid gets: camera, imu, motor_controller, force_torque pre-selected
- Arm gets: camera, imu, motor_controller, force_torque pre-selected
- Basic gets: motor_controller only pre-selected
4. **NEW:** Behavior selection multi-select
- Rover defaults: navigate, autonomous_docking
- Humanoid defaults: follow_person, pick_and_place
- Arm defaults: pick_and_place
5. **NEW:** AI/ML capabilities multi-select (optional)
- LLM control, object detection, speech recognition, etc.
- No defaults (all optional)
6. **NEW:** Dashboard configuration
- Enable web dashboard? (default: Yes)
- Enable LLM chat interface? (only if LLM control selected)
- Control modes: manual, autonomous, semi-autonomous
7. Asks whether to generate simulation environments
8. **NEW:** Shows comprehensive configuration summary with all selections
9. **NEW:** Confirmation prompt before proceeding
10. Creates project with selected options
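The smart defaults in step 3 can be modeled as a lookup from robot type to pre-selected components. This is a sketch with hypothetical names, not the actual implementation; the real wizard presumably converts the result into the per-item `bool` defaults slice that dialoguer's `MultiSelect` accepts.

```rust
// The ten hardware options offered by the wizard, in menu order.
const HARDWARE: &[&str] = &[
    "camera", "lidar", "imu", "gps", "depth_camera",
    "motor_controller", "ultrasonic", "force_torque", "microphone", "speaker",
];

// Hypothetical helper mirroring the smart defaults listed above.
fn default_hardware(robot_type: &str) -> Vec<&'static str> {
    match robot_type {
        "rover" => vec!["camera", "lidar", "imu", "motor_controller"],
        "humanoid" | "arm" => vec!["camera", "imu", "motor_controller", "force_torque"],
        _ => vec!["motor_controller"], // "basic" and unknown types
    }
}

// One bool per HARDWARE entry: true = pre-selected in the multi-select.
fn preselect_flags(robot_type: &str) -> Vec<bool> {
    let picked = default_hardware(robot_type);
    HARDWARE.iter().map(|h| picked.contains(h)).collect()
}

fn main() {
    assert!(preselect_flags("rover")[1]); // lidar pre-selected for rover
    assert_eq!(preselect_flags("basic").iter().filter(|&&b| b).count(), 1);
    println!("smart-default checks passed");
}
```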
### 2. Partial Interactive Mode (Name Only)
```bash
mecha10 init my-robot
```
**Expected behavior:**
- Skips name prompt (uses "my-robot")
- Shows robot type selection menu
- Asks about simulation generation
- Creates project
### 3. Partial Interactive Mode (Template Only)
```bash
mecha10 init --template rover
```
**Expected behavior:**
- Prompts for project name
- Skips robot type selection (uses "rover")
- Asks about simulation generation
- Creates project
### 4. Non-Interactive Mode (Backward Compatibility)
```bash
mecha10 init my-robot --template rover
```
**Expected behavior:**
- No interactive prompts
- Uses provided parameters
- Asks about simulation generation
- Creates project (existing behavior preserved)
### 5. Skip Simulation Flag
```bash
mecha10 init --skip-sim
```
**Expected behavior:**
- Prompts for name and robot type
- Skips simulation generation question
- Creates project without simulation
## Manual Testing Steps
### Prerequisites
- Rust toolchain installed
- In the CLI package directory
### Build the CLI
```bash
cd packages/cli
cargo build --release
```
### Test Interactive Wizard
```bash
# Test 1: Full interactive
./target/release/mecha10 init
# Expected prompts (Phase 2 flow):
# 1. "Project name:" (type: test-robot-1)
# 2. "What type of robot are you building?" (select: Rover)
# 3. Hardware, behavior, and AI/ML multi-selects (accept defaults)
# 4. Dashboard and control mode configuration (accept defaults)
# 5. "Generate simulation environments?" (select: Yes)
# 6. Configuration summary, then confirmation (select: Yes)
# Verify project created:
ls test-robot-1/
# Should show: drivers/, nodes/, config/, simulation/, etc.
```
### Test Partial Interactive
```bash
# Test 2: Name provided
./target/release/mecha10 init test-robot-2
# Expected prompts:
# 1. Robot type selection (select: Humanoid)
# 2. Simulation generation (select: Yes)
# Test 3: Template provided
./target/release/mecha10 init --template arm
# Expected prompts:
# 1. "Project name:" (type: test-robot-3)
# 2. Simulation generation (select: Yes)
```
### Test Backward Compatibility
```bash
# Test 4: Traditional usage
./target/release/mecha10 init test-robot-4 --template basic
# Expected: No interactive prompts
# Should work exactly as before
```
### Verify Project Structure
For each test, verify the created project has:
```
test-robot-X/
├── .mecha10/
├── clippy.toml
├── rustfmt.toml
├── .ls-lint.yml
├── config/
├── drivers/
├── logs/
├── mecha10.json
├── models/
├── nodes/
├── simulation/
└── tests/
```
### Verify mecha10.json
Check that `mecha10.json` contains the correct robot type:
```bash
# Inspect the generated config; the exact field name may differ.
cat test-robot-1/mecha10.json
```
## Expected User Experience
### Before (Command-line only)
```bash
$ mecha10 init
error: the following required arguments were not provided:
<NAME>
Usage: mecha10 init <NAME> --template <TEMPLATE>
```
### After Phase 2 (Interactive wizard with extended features)
```bash
$ mecha10 init
🤖 Mecha10 Project Initialization Wizard
? Project name: my-awesome-robot
? What type of robot are you building?
❯ Rover - 4-wheeled mobile robot with sensors
Humanoid - Bipedal walking robot
Robotic Arm - 6-DOF manipulator
Basic - Minimal starting template
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Hardware Components
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
? Select hardware components (space to select, enter to continue)
◉ Camera (vision/perception)
◉ LiDAR (navigation/mapping)
◉ IMU (orientation/balance)
◯ GPS (outdoor localization)
◯ Depth Camera (3D perception)
◉ Motor Controller (movement)
◯ Ultrasonic Sensors (proximity detection)
◯ Force/Torque Sensor (manipulation feedback)
◯ Microphone (audio input)
◯ Speaker (audio output)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Robot Behaviors
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
? Select robot behaviors (space to select, enter to continue)
◉ Navigate (autonomous path planning & obstacle avoidance)
◯ Follow Person (track and follow a human)
◯ Deliver (autonomous delivery tasks)
◯ Patrol (surveillance and monitoring routes)
◯ Pick & Place (object manipulation)
◯ Voice Control (speech command interface)
◯ Gesture Recognition (visual gesture commands)
◉ Autonomous Docking (return to charging station)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
AI & Machine Learning Capabilities
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
? Select AI/ML capabilities (optional, space to select, enter to continue)
◯ LLM Control (natural language command processing)
◯ Object Detection (vision-based object recognition)
◯ Speech Recognition (voice-to-text)
◯ Gesture Recognition ML (learned gesture patterns)
◯ Semantic Mapping (environment understanding)
◯ Person Detection & Tracking (human identification)
◯ Anomaly Detection (safety & monitoring)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Dashboard & Control Configuration
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
? Enable web dashboard interface? Yes
? Select control modes to enable
◉ Manual Teleoperation (keyboard/gamepad control)
◉ Autonomous Mode (behavior-based control)
◯ Semi-Autonomous (human-in-the-loop)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
RL Training & Simulation
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
? Generate Godot RL simulation environments? Yes
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
📋 Configuration Summary
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Project Configuration:
Name: my-awesome-robot
Robot Type: rover
Hardware Components (4):
- camera
- lidar
- imu
- motor_controller
Behaviors (2):
- navigate
- autonomous_docking
AI/ML Capabilities (0):
(none selected)
Dashboard & Control:
Web Dashboard: Yes
Control Modes: manual, autonomous
RL Training:
Generate Simulation: Yes
? Proceed with project initialization? Yes
Note: Extended configuration (hardware, behaviors, AI capabilities) will be stored in mecha10.json
Full template generation coming in next update!
🤖 Mecha10 Project Initialization
📋 Creating project: my-awesome-robot
Platform: rover
📁 Creating project structure...
✓ drivers/
✓ nodes/
...
```
## Phase 2 Completion Status
### Completed Features
- ✅ Hardware component multi-select with 10 component types
- ✅ Smart defaults based on robot type (rover, humanoid, arm, basic)
- ✅ Behavior selection with 8 behavior options
- ✅ AI/ML capabilities with 7 AI options
- ✅ Dashboard configuration prompts
- ✅ LLM chat interface toggle (conditional on LLM control selection)
- ✅ Control mode multi-select (manual, autonomous, semi-autonomous)
- ✅ Comprehensive configuration summary with counts
- ✅ Confirmation prompt before project creation
- ✅ Backward compatibility preserved (all existing flags still work)
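The conditional LLM chat toggle above can be expressed as a single guard: the chat-interface prompt is only shown when LLM control was picked in the AI/ML step. The identifier below is illustrative, not the actual source.

```rust
// Hypothetical guard for the conditional prompt: only offer the LLM chat
// interface when "llm_control" was selected among the AI/ML capabilities.
fn offer_llm_chat(selected_ai: &[&str]) -> bool {
    selected_ai.contains(&"llm_control")
}

fn main() {
    assert!(offer_llm_chat(&["llm_control", "object_detection"]));
    assert!(!offer_llm_chat(&["object_detection"]));
    println!("conditional-prompt check passed");
}
```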
### Known Limitations (Phase 2)
The current implementation collects user preferences but does not yet generate customized templates based on selections. Future phases will add:
- **Template Generation:** Generate driver/node templates based on hardware selections
- **Behavior Templates:** Generate behavior code templates for selected capabilities
- **AI Model Integration:** Configure and setup AI models (YOLO, LLM endpoints, etc.)
- **Dashboard Generation:** Generate custom dashboard with selected control modes
- **Deployment Architecture:** Add deployment target selection (on-robot, edge, cloud)
- **SaaS Integration:** Optional cloud connection for fleet management
### What Works Now
- All interactive prompts function correctly
- Smart defaults are applied based on robot type
- Configuration is collected and displayed in summary
- Projects are created with standard structure
- Backward compatibility maintained
### What Needs Implementation
- Storing extended configuration in mecha10.json
- Generating driver stubs for selected hardware
- Creating behavior node templates
- Setting up AI model configurations
- Generating custom dashboard based on selections
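One possible shape for the first item, persisting the extended configuration, is sketched below. Field names are illustrative and not the actual `mecha10.json` schema; a real implementation would likely use serde for (de)serialization rather than the hand-rolled fragment shown here.

```rust
// Hypothetical container for the wizard's extended selections.
struct ExtendedConfig {
    robot_type: String,
    hardware: Vec<String>,
    behaviors: Vec<String>,
    ai_capabilities: Vec<String>,
    web_dashboard: bool,
    control_modes: Vec<String>,
}

impl ExtendedConfig {
    // Hand-rolled JSON for the sketch only (no escaping, subset of fields).
    fn to_json_fragment(&self) -> String {
        let list = |v: &Vec<String>| {
            v.iter().map(|s| format!("\"{}\"", s)).collect::<Vec<_>>().join(",")
        };
        format!(
            "{{\"robot_type\":\"{}\",\"hardware\":[{}],\"behaviors\":[{}]}}",
            self.robot_type, list(&self.hardware), list(&self.behaviors)
        )
    }
}

fn main() {
    let cfg = ExtendedConfig {
        robot_type: "rover".into(),
        hardware: vec!["camera".into(), "lidar".into()],
        behaviors: vec!["navigate".into()],
        ai_capabilities: vec![],
        web_dashboard: true,
        control_modes: vec!["manual".into(), "autonomous".into()],
    };
    assert!(cfg.to_json_fragment().contains("\"robot_type\":\"rover\""));
    println!("config sketch ok");
}
```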
## Troubleshooting
### Build Errors
If you see compilation errors, ensure:
1. Rust version >= 1.70
2. All dependencies are up to date: `cargo update`
3. Clean build: `cargo clean && cargo build`
### Interactive Prompts Not Showing
If prompts don't appear:
1. Check terminal supports ANSI colors
2. Try without `ColorfulTheme`: modify code to use `SimpleTheme`
3. Check stdin is not redirected
### Backward Compatibility Issues
If existing scripts break:
1. Verify they provide both `name` and `--template`
2. Update scripts to use explicit parameters
3. The old usage `mecha10 init <name> --template <type>` still works
## Next Steps (Phase 3)
Future enhancements planned:
1. **Template Generation from Selections**
- Generate driver stubs for selected hardware components
- Create behavior node templates based on selections
- Configure AI model endpoints and configurations
- Generate custom dashboard with selected control modes
2. **Advanced Configuration**
- Deployment architecture selection (on-robot, edge, cloud)
- Cloud integration and fleet management
- Resource allocation and performance tuning
- Security and authentication setup
3. **Component Ecosystem Integration**
- Integrate with `mecha10 add` command for installing components
- Component registry and marketplace
- Dependency resolution and versioning
- Auto-wiring of components in node graph
4. **Interactive Development Experience**
- `mecha10 dev` interactive terminal (like `expo dev`)
- Hot-reload capabilities
- Live status monitoring
- Interactive training and deployment
## References
- [IDEAL_DEVELOPER_EXPERIENCE.md](../../IDEAL_DEVELOPER_EXPERIENCE.md) - Full vision
- [CLI README](./README.md) - Command documentation
- [dialoguer documentation](https://docs.rs/dialoguer/) - Interactive prompt library