# brainwires-hardware

Hardware I/O for the Brainwires Agent Framework.

Provides a unified hardware abstraction layer covering audio, GPIO, Bluetooth, network hardware, and home automation protocols — all behind opt-in feature flags so you only compile what you need.
## Modules

| Module | Feature | Description |
|---|---|---|
| `audio` | `audio` | Audio capture/playback, STT, TTS (16 cloud providers + local Whisper) |
| `audio/vad` | (always) / `vad` | Voice activity detection — `EnergyVad` (always) + `WebRtcVad` (`vad`) |
| `audio/wake_word` | `wake-word` | Wake word detection — `EnergyTriggerDetector` + optional ML backends |
| `audio/assistant` | `voice-assistant` | End-to-end voice assistant pipeline |
| `gpio` | `gpio` | GPIO pin management with safety allow-lists and PWM (Linux) |
| `bluetooth` | `bluetooth` | BLE advertisement scanning and adapter enumeration |
| `network` | `network` | NIC enumeration, IP config, ARP host discovery, port scanning |
| `camera` | `camera` | Webcam/camera frame capture (V4L2/AVFoundation/MSMF) |
| `usb` | `usb` | Raw USB device enumeration and transfers (no libusb) |
| `homeauto/zigbee` | `zigbee` | Zigbee 3.0 coordinator — EZSP (Silicon Labs) + ZNP (TI Z-Stack) backends |
| `homeauto/zwave` | `zwave` | Z-Wave Plus v2 Serial API — node incl/excl, command class send/recv |
| `homeauto/thread` | `thread` | OpenThread Border Router REST API client (Thread 1.3.0) |
| `homeauto/matter` | `matter` | Matter 1.3 controller + device server (mDNS, UDP, TLV, QR commissioning) |
## Getting started

```toml
[dependencies]
# Pick only what you need:
brainwires-hardware = { version = "0.8", features = ["audio"] }
brainwires-hardware = { version = "0.8", features = ["gpio"] }
brainwires-hardware = { version = "0.8", features = ["bluetooth"] }
brainwires-hardware = { version = "0.8", features = ["network"] }

# Or enable everything:
brainwires-hardware = { version = "0.8", features = ["full"] }
```
## Feature flags

| Feature | Description |
|---|---|
| `audio` | Hardware audio I/O via CPAL + 16 cloud STT/TTS providers |
| `flac` | FLAC encode/decode |
| `local-stt` | Local Whisper STT inference via whisper-rs (heavy dep, opt-in) |
| `vad` | WebRTC VAD algorithm (`EnergyVad` is always available with `audio`) |
| `wake-word` | Wake word detection — `EnergyTriggerDetector` (zero deps) |
| `wake-word-rustpotter` | `RustpotterDetector` — pure-Rust ML wake word (opt-in, see notes) |
| `wake-word-porcupine` | `PorcupineDetector` — Picovoice Porcupine (requires AccessKey + git dep) |
| `voice-assistant` | Full pipeline: capture → wake word → VAD → STT → handler → TTS |
| `gpio` | GPIO pin control via Linux character device API (gpio-cdev) |
| `bluetooth` | BLE scanning and adapter enumeration via btleplug |
| `network` | NIC enumeration, IP config, ARP discovery, port scanning |
| `camera` | Webcam/camera capture via nokhwa (V4L2/AVFoundation/MSMF) |
| `usb` | Raw USB device access and transfers via nusb (no libusb) |
| `zigbee` | Zigbee 3.0 via EZSP (Silicon Labs EFR32) or ZNP (TI Z-Stack 3.x) |
| `zwave` | Z-Wave Plus v2 (ZAPI2) over USB serial stick |
| `thread` | OpenThread Border Router (OTBR) REST API client |
| `matter` | Matter 1.3 controller + device server (pure-Rust stack, no rs-matter) |
| `matter-ble` | BLE commissioning window (btleplug peripheral, Linux/macOS) |
| `homeauto` | All four home automation protocols (zigbee + zwave + thread + matter) |
| `homeauto-full` | All home automation including BLE (homeauto + matter-ble) |
| `full` | All features (except `local-stt`, `wake-word-rustpotter`, `wake-word-porcupine`) |
## Audio

Supports hardware capture and playback via CPAL, plus cloud STT/TTS integrations:

- STT: OpenAI, Azure, Deepgram, ElevenLabs, Fish Audio
- TTS: OpenAI, Azure, Deepgram, ElevenLabs, Fish Audio, Google, Murf, Cartesia

```rust
// The import and type names below are illustrative reconstructions of a garbled
// snippet; check the crate docs for the exact provider type.
use brainwires_hardware::audio::OpenAiTts;

let tts = OpenAiTts::new(api_key);
let audio = tts.synthesize("Hello from brainwires").await?;
```
## GPIO (Linux)

Safe GPIO access with explicit allow-lists — no pin can be used unless it appears in the configured policy.

```rust
// GpioManager and the exact acquire() signature are illustrative reconstructions.
use brainwires_hardware::gpio::{GpioManager, GpioDirection};

let mut manager = GpioManager::from_config(&config)?;
let pin = manager.acquire(17, GpioDirection::Output)?; // must be on the allow-list
```
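The allow-list idea is simple enough to sketch in plain Rust. `PinPolicy` and its methods below are hypothetical, not the crate's API; the point is only that acquisition fails unless the pin number appears in the configured set:

```rust
use std::collections::HashSet;

/// Hypothetical allow-list policy: a pin may only be acquired
/// if it appears in the configured set.
struct PinPolicy {
    allowed: HashSet<u32>,
}

impl PinPolicy {
    fn new(pins: &[u32]) -> Self {
        Self { allowed: pins.iter().copied().collect() }
    }

    /// Returns Ok(pin) only for allow-listed pins; everything else is refused.
    fn check(&self, pin: u32) -> Result<u32, String> {
        if self.allowed.contains(&pin) {
            Ok(pin)
        } else {
            Err(format!("pin {pin} is not on the allow-list"))
        }
    }
}

fn main() {
    let policy = PinPolicy::new(&[17, 22, 27]);
    assert!(policy.check(17).is_ok());  // configured: allowed
    assert!(policy.check(4).is_err());  // not configured: refused
    println!("allow-list checks passed");
}
```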
## Bluetooth

Cross-platform BLE scanning using btleplug:

```rust
use brainwires_hardware::bluetooth;
use std::time::Duration;

// scan_ble's exact signature is an illustrative reconstruction.
let devices = bluetooth::scan_ble(Duration::from_secs(5)).await?;
for d in &devices {
    println!("{d:?}");
}
```
## Network

```rust
use brainwires_hardware::network;
use std::time::Duration;

// Function signatures below are illustrative reconstructions of a garbled snippet.

// List interfaces
for iface in network::list_interfaces()? {
    println!("{iface:?}");
}

// IP config with gateways
for cfg in network::get_ip_configs()? {
    println!("{cfg:?}");
}

// Port scan
let results = network::scan_common_ports("192.168.1.1", Duration::from_secs(2)).await?;

// ARP host discovery (requires CAP_NET_RAW)
let hosts = network::arp_scan("192.168.1.0/24").await?;
```
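For a sense of what ARP host discovery puts on the wire, here is a self-contained sketch that builds a raw Ethernet ARP request frame per the RFC 826 layout. The function name and addresses are illustrative, not the crate's API:

```rust
/// Build a broadcast ARP request frame, as an ARP scanner would
/// send once per probed address on the subnet.
fn arp_request(src_mac: [u8; 6], src_ip: [u8; 4], target_ip: [u8; 4]) -> Vec<u8> {
    let mut f = Vec::with_capacity(42);
    f.extend_from_slice(&[0xFF; 6]);    // Ethernet dst: broadcast
    f.extend_from_slice(&src_mac);      // Ethernet src
    f.extend_from_slice(&[0x08, 0x06]); // EtherType: ARP
    f.extend_from_slice(&[0x00, 0x01]); // HTYPE: Ethernet
    f.extend_from_slice(&[0x08, 0x00]); // PTYPE: IPv4
    f.push(6);                          // HLEN: MAC length
    f.push(4);                          // PLEN: IPv4 length
    f.extend_from_slice(&[0x00, 0x01]); // OPER: request
    f.extend_from_slice(&src_mac);      // sender hardware address
    f.extend_from_slice(&src_ip);       // sender protocol address
    f.extend_from_slice(&[0x00; 6]);    // target hardware address: unknown
    f.extend_from_slice(&target_ip);    // target protocol address
    f
}

fn main() {
    let frame = arp_request([0x02, 0, 0, 0, 0, 1], [192, 168, 1, 10], [192, 168, 1, 20]);
    assert_eq!(frame.len(), 42); // 14-byte Ethernet header + 28-byte ARP body
    println!("ARP frame: {frame:02X?}");
}
```

Hosts that own the target IP answer with an ARP reply, which is how the scanner builds its list of live hosts.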
## Voice Activity Detection

`EnergyVad` is always available (no extra feature needed beyond `audio`). `WebRtcVad` requires the `vad` feature.

```rust
use brainwires_hardware::audio::vad::EnergyVad; // module path is an illustrative reconstruction

let vad = EnergyVad::default(); // -40 dBFS threshold
if vad.is_speech(&frame) {
    // handle the speech frame
}
```
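The energy-based approach can be sketched in a few lines of plain Rust. This is a minimal stand-in for the idea behind `EnergyVad`, not its actual implementation: compute the frame's RMS level in dBFS and compare it to a threshold.

```rust
/// RMS level of an i16 PCM frame in dBFS (0 dBFS = full scale).
fn rms_dbfs(frame: &[i16]) -> f32 {
    let sum_sq: f64 = frame
        .iter()
        .map(|&s| (s as f64 / i16::MAX as f64).powi(2))
        .sum();
    let rms = (sum_sq / frame.len() as f64).sqrt();
    // Clamp to avoid log10(0) for perfectly silent frames.
    20.0 * rms.max(1e-10).log10() as f32
}

fn is_speech(frame: &[i16], threshold_dbfs: f32) -> bool {
    rms_dbfs(frame) > threshold_dbfs
}

fn main() {
    let silence = vec![0i16; 480]; // 30 ms @ 16 kHz
    let loud: Vec<i16> = (0..480)
        .map(|i| if i % 2 == 0 { 16000 } else { -16000 })
        .collect();
    assert!(!is_speech(&silence, -40.0)); // far below threshold
    assert!(is_speech(&loud, -40.0));     // ~-6 dBFS, well above threshold
}
```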
## Wake Word Detection

```rust
// The module path and constructor argument are illustrative reconstructions.
use brainwires_hardware::audio::wake_word::EnergyTriggerDetector;

let mut detector = EnergyTriggerDetector::new(Default::default());
// Feed 30 ms i16 frames from the mic:
if let Some(detection) = detector.process_frame(&frame) {
    println!("wake word detected: {detection:?}");
}
```
## Voice Assistant Pipeline

```rust
// Type names and builder arguments below are illustrative reconstructions;
// the handler type definition in the original snippet was lost and is elided.
use brainwires_hardware::audio::assistant::VoiceAssistant;
use brainwires_hardware::audio::Transcript;
use async_trait::async_trait;

// ... implement the assistant's handler trait for your agent here ...

// Build and run
let mut assistant = VoiceAssistant::builder()
    .with_playback(playback)
    .with_tts(tts)
    .with_wake_word(detector)
    .build()?;
assistant.run().await?;
```
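The pipeline stages named in the feature table (capture → wake word → VAD → STT → handler → TTS) can be sketched with plain closures as stand-ins. The crate's real traits are async and richer; this is only the control flow:

```rust
/// Hypothetical pipeline: each stage is a closure stand-in for the real trait.
struct Pipeline<W, V, S, H, T> {
    wake: W,
    vad: V,
    stt: S,
    handler: H,
    tts: T,
}

impl<W, V, S, H, T> Pipeline<W, V, S, H, T>
where
    W: Fn(&[i16]) -> bool,   // wake word fired?
    V: Fn(&[i16]) -> bool,   // speech present?
    S: Fn(&[i16]) -> String, // transcribe audio
    H: Fn(&str) -> String,   // produce a reply
    T: Fn(&str) -> Vec<i16>, // synthesize reply audio
{
    fn process(&self, frame: &[i16]) -> Option<Vec<i16>> {
        if !(self.wake)(frame) || !(self.vad)(frame) {
            return None; // stay idle until woken and hearing speech
        }
        let text = (self.stt)(frame);
        let reply = (self.handler)(&text);
        Some((self.tts)(&reply))
    }
}

fn main() {
    let p = Pipeline {
        wake: |_: &[i16]| true,
        vad: |f: &[i16]| f.iter().any(|&s| s != 0),
        stt: |_: &[i16]| "hello".to_string(),
        handler: |t: &str| format!("you said {t}"),
        tts: |t: &str| vec![0i16; t.len()],
    };
    assert!(p.process(&[0, 0, 0]).is_none());   // silence: no reply
    assert!(p.process(&[100, -100]).is_some()); // speech: synthesized reply
}
```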
## Home Automation

All four protocols are behind the `homeauto` feature (or enable each individually).

```toml
brainwires-hardware = { version = "0.8", features = ["homeauto"] }
```
### Zigbee (`zigbee`)

Two serial backends sharing the `ZigbeeCoordinator` trait:

```rust
// EzspCoordinator and the method arguments are illustrative reconstructions.
use brainwires_hardware::homeauto::zigbee::{EzspCoordinator, ZigbeeCoordinator};

// Silicon Labs EFR32 stick (EZSP v8 over ASH)
let coord = EzspCoordinator::open("/dev/ttyUSB0").await?;
coord.start().await?;
coord.permit_join(60).await?;
for dev in coord.devices().await? {
    println!("{dev:?}");
}
```
### Z-Wave (`zwave`)

```rust
// ZwaveController and the send_cc payload are illustrative reconstructions.
use brainwires_hardware::homeauto::zwave::{ZwaveController, CommandClass};

let ctrl = ZwaveController::open("/dev/ttyACM0").await?;
ctrl.start().await?;
for node in ctrl.nodes().await? {
    println!("{node:?}");
}

// Toggle a binary switch on node 3
ctrl.send_cc(3, CommandClass::SwitchBinary, &[0xFF]).await?;
```
### Thread (`thread`)

```rust
use brainwires_hardware::homeauto::thread::ThreadBorderRouter;

// The OTBR URL and method names are illustrative reconstructions.
let otbr = ThreadBorderRouter::new("http://otbr.local:8081").await?;
let info = otbr.node_info().await?;
println!("{info:?}");
for neighbor in otbr.neighbors().await? {
    println!("{neighbor:?}");
}
```
### Matter (`matter`)

Controller — commission and control a device:

```rust
// MatterController and its arguments are illustrative reconstructions.
use brainwires_hardware::homeauto::matter::MatterController;
use std::path::Path;

let ctrl = MatterController::new(Path::new("fabric.json")).await?;
let device = ctrl.commission_qr("MT:...").await?; // QR payload from the device
ctrl.on_off(&device, true).await?;
```
Device server — expose an agent as a Matter device:

```rust
// MatterDeviceServer and the builder values are illustrative reconstructions
// (0xFFF1/0x8000 and passcode 20202021 are Matter's standard test values).
use brainwires_hardware::homeauto::matter::MatterDeviceServer;

let config = MatterDeviceServer::builder()
    .device_name("Brainwires Light")
    .vendor_id(0xFFF1)
    .product_id(0x8000)
    .discriminator(3840)
    .passcode(20202021)
    .build()?;
let server = MatterDeviceServer::new(config).await?;
server.set_on_off_handler(|on| println!("light: {on}"));
println!("{}", server.qr_code());
server.start().await?; // blocks; scan QR code with your Matter controller
```
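The QR commissioning payload the server prints can be sketched end to end. The bit layout and base-38 grouping below follow the Matter 1.3 spec as best understood and are illustrative only, not the crate's implementation:

```rust
// Matter's base-38 alphabet (digits, uppercase letters, '-', '.').
const BASE38: &[u8] = b"0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ-.";

/// Pack `bits` bits of `value` into `buf`, LSB-first, advancing `offset`.
fn pack_bits(buf: &mut [u8], offset: &mut usize, value: u64, bits: usize) {
    for i in 0..bits {
        if (value >> i) & 1 == 1 {
            buf[(*offset + i) / 8] |= 1 << ((*offset + i) % 8);
        }
    }
    *offset += bits;
}

/// Sketch of the 88-bit onboarding payload + base-38 encoding ("MT:" prefix).
fn qr_payload(vid: u16, pid: u16, discriminator: u16, passcode: u32) -> String {
    let mut bytes = [0u8; 11]; // 88 bits total
    let mut off = 0;
    pack_bits(&mut bytes, &mut off, 0, 3);                     // payload version
    pack_bits(&mut bytes, &mut off, vid as u64, 16);           // vendor ID
    pack_bits(&mut bytes, &mut off, pid as u64, 16);           // product ID
    pack_bits(&mut bytes, &mut off, 0, 2);                     // flow: standard
    pack_bits(&mut bytes, &mut off, 1 << 2, 8);                // discovery: on-network
    pack_bits(&mut bytes, &mut off, discriminator as u64, 12); // discriminator
    pack_bits(&mut bytes, &mut off, passcode as u64, 27);      // setup passcode
    pack_bits(&mut bytes, &mut off, 0, 4);                     // padding

    let mut out = String::from("MT:");
    for chunk in bytes.chunks(3) {
        let mut v: u32 = 0;
        for (i, b) in chunk.iter().enumerate() {
            v |= (*b as u32) << (8 * i); // little-endian 24-bit group
        }
        let chars = match chunk.len() { 3 => 5, 2 => 4, _ => 2 };
        for _ in 0..chars {
            out.push(BASE38[(v % 38) as usize] as char);
            v /= 38;
        }
    }
    out
}

fn main() {
    // Standard Matter test VID/PID and passcode.
    let qr = qr_payload(0xFFF1, 0x8000, 3840, 20202021);
    assert!(qr.starts_with("MT:"));
    println!("{qr}");
}
```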
## Matter Stack — What's Implemented

The `matter` feature ships a complete Matter 1.3 protocol stack written in pure Rust (no rs-matter or embassy-time dependency):
| Layer | Status | Notes |
|---|---|---|
| SPAKE2+ (RFC 9383) | Complete | RustCrypto p256, PBKDF2-HMAC-SHA256, cA/cB confirmation |
| PASE commissioning | Complete | Full PBKDFParam/Pake1/2/3 handshake, session key derivation |
| CASE operational | Complete | SIGMA Sigma1/2/3, P-256 ECDH, AES-CCM-128, NOC verification |
| Matter TLV certs | Complete | NOC/ICAC/RCAC encode/decode, P-256 ECDSA-SHA256 |
| Fabric management | Complete | Root CA generation, NOC issuance, JSON persistence |
| Message transport | Complete | Matter §4.4 header, MRP retry/backoff, AES-CCM-128 UDP |
| Interaction Model | Complete | Read/Write/Invoke/Subscribe, wildcard paths, TLV codec |
| Commissioning clusters | Complete | BasicInformation, GeneralCommissioning, OperationalCredentials, NetworkCommissioning |
| mDNS advertisement | Complete | _matterc._udp commissionable + _matter._tcp operational |
| BLE commissioning | Complete (matter-ble) | BTP handshake, segmentation/reassembly, btleplug peripheral |
| CASE session resumption | Not yet implemented | Sigma2Resume path |
| Multi-fabric | Not yet implemented | Single fabric per controller instance |
| BLE on Windows | Not implemented | btleplug WinRT BLE requires additional work |
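The MRP retry/backoff row above can be illustrated with a tiny backoff calculator. The 1.6 growth factor is Matter's published `MRP_BACKOFF_BASE`; the spec's jitter and margin terms are omitted here, so this is a simplification, not the stack's exact schedule:

```rust
/// Interval before the nth retransmission: base interval scaled by 1.6^n.
fn mrp_backoff_ms(base_interval_ms: u64, retransmission: u32) -> u64 {
    let mut interval = base_interval_ms as f64;
    for _ in 0..retransmission {
        interval *= 1.6; // MRP_BACKOFF_BASE per the Matter spec
    }
    interval.ceil() as u64
}

fn main() {
    let base = 300; // e.g. a peer's active retransmission interval, in ms
    let schedule: Vec<u64> = (0..5).map(|n| mrp_backoff_ms(base, n)).collect();
    // Each retry waits longer than the last, so an unreachable peer
    // costs progressively less bandwidth.
    assert!(schedule.windows(2).all(|w| w[0] < w[1]));
    println!("{schedule:?}");
}
```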
Run the ready-made examples to get started:

```sh
# Expose this machine as a Matter on/off light — scan the printed QR code
# Commission a real Matter device and toggle it
```
## Migration from brainwires-audio

```toml
# Before
brainwires-audio = "0.8"

# After
brainwires-hardware = { version = "0.8", features = ["audio"] }
```

All public types and traits are re-exported from the crate root — existing code using `brainwires_audio::*` can switch to `brainwires_hardware::*` with no further changes.
## Examples

```sh
# Wake word demo (prints detections from mic)
# Full voice assistant demo (requires OPENAI_API_KEY)
# Standalone voice assistant binary
# Home automation examples (requires physical hardware for full operation)
```