Crate kimi_fann_core


§Kimi-FANN Core: Neural Inference Engine

A real neural network inference engine built on ruv-FANN for micro-expert processing. This crate provides WebAssembly-compatible neural inference for the Kimi-K2 micro-expert architecture, and integrates with the Synaptic Market for distributed compute.
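To make the micro-expert idea concrete, here is a minimal, self-contained sketch of the flow: a router picks a domain expert for an input, and that expert produces the output. All names here (`Domain`, `MicroExpert`, `Router`, `process`, `route`) are simplified stand-ins, not the crate's actual API, which is listed below.

```rust
// Simplified stand-ins for the micro-expert flow; the real MicroExpert /
// ExpertRouter types in this crate have richer, neural-backed APIs.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum Domain {
    Coding,
    Language,
}

// A tiny "expert": tags its output with its domain and a token count.
struct MicroExpert {
    domain: Domain,
}

impl MicroExpert {
    fn process(&self, input: &str) -> String {
        format!("{:?}:{}", self.domain, input.split_whitespace().count())
    }
}

// A toy router: a keyword heuristic picks which expert handles the request.
struct Router {
    experts: Vec<MicroExpert>,
}

impl Router {
    fn route(&self, input: &str) -> &MicroExpert {
        let want = if input.contains("fn") || input.contains("code") {
            Domain::Coding
        } else {
            Domain::Language
        };
        self.experts
            .iter()
            .find(|e| e.domain == want)
            .unwrap_or(&self.experts[0])
    }
}
```

The real `ExpertRouter` additionally supports consensus across experts and market-aware distribution, which this sketch omits.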

Modules§

enhanced_router
Enhanced routing system with market integration
optimized_features
Optimized Feature Extraction for Neural Inference

Structs§

ExpertConfig
Configuration for creating a micro-expert
ExpertRouter
Expert router with intelligent request distribution and consensus
KimiRuntime
Main runtime for Kimi-FANN with neural processing
MicroExpert
A micro-expert neural network with real AI processing
MockNeuralNetwork
Mock neural network for demonstration and optimization testing
NetworkStats
Network statistics for distributed processing
NeuralConfig
Neural network configuration for micro-experts
NeuralWeights
Neural network weights and biases
ProcessingConfig
Processing configuration with neural parameters
TokenEmbedding
Token embedding for neural processing
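The `NeuralWeights` and `NeuralConfig` entries suggest a conventional feed-forward parameterization. As a rough illustration (field names are assumptions; actual inference in this crate is delegated to ruv-FANN), a layer-size list determines the parameter count of a fully connected network:

```rust
// Sketch of a NeuralConfig-like topology description. For consecutive
// layer sizes (n_in, n_out), a dense layer holds n_in * n_out weights
// plus n_out biases.
#[derive(Debug, Clone)]
struct NeuralConfig {
    layers: Vec<usize>, // e.g. [input, hidden..., output]
}

impl NeuralConfig {
    // Total trainable parameters of the fully connected network.
    fn param_count(&self) -> usize {
        self.layers
            .windows(2)
            .map(|w| w[0] * w[1] + w[1])
            .sum()
    }
}
```

For example, a `[4, 8, 2]` topology yields `4*8 + 8 = 40` parameters in the first layer and `8*2 + 2 = 18` in the second, 58 in total.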

Enums§

ExpertDomain
Expert domain enumeration with neural specialization
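One plausible role for such a domain enum is classifying free-text requests before dispatch. The variants and keyword heuristic below are guesses for illustration; the crate's actual `ExpertDomain` variants and its neural routing are not shown on this page.

```rust
// Hypothetical domain classification by keyword; the real crate routes
// with neural specialization rather than string matching.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum ExpertDomain {
    Coding,
    Mathematics,
    Language,
}

fn classify(input: &str) -> ExpertDomain {
    let lower = input.to_lowercase();
    if ["fn ", "struct", "compile"].iter().any(|k| lower.contains(k)) {
        ExpertDomain::Coding
    } else if ["integral", "equation", "solve"].iter().any(|k| lower.contains(k)) {
        ExpertDomain::Mathematics
    } else {
        ExpertDomain::Language
    }
}
```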

Constants§

VERSION
Version of the Kimi-FANN Core library

Functions§

init
Initialize the WASM module with neural network setup
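A WASM-module `init` function is commonly wired up with wasm-bindgen's start hook. The shape below is an assumption about how such an entry point is typically structured (including the `wasm_bindgen` dependency and the `init_inner` helper), not the crate's verified source:

```rust
// Portable setup routine; in the real crate this would build the
// per-domain ruv-FANN networks.
fn init_inner() -> Result<(), String> {
    // Placeholder for neural network setup.
    Ok(())
}

// The wasm-bindgen entry point compiles only for wasm32 targets.
#[cfg(target_arch = "wasm32")]
mod wasm_entry {
    use wasm_bindgen::prelude::*;

    // Runs automatically when the WASM module is instantiated.
    #[wasm_bindgen(start)]
    pub fn init() {
        super::init_inner().expect("neural setup failed");
    }
}
```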