# 📹🎙️🦀 Rust Client SDK for LiveKit
Use this SDK to add real-time video, audio and data features to your Rust app. By connecting to a self- or cloud-hosted LiveKit server, you can quickly build applications like interactive live streaming or video calls with just a few lines of code.
## Features
- Receiving tracks
- Publishing tracks
- Data channels
- Simulcast
- SVC codecs (AV1/VP9)
- Adaptive Streaming
- Dynacast
- Hardware video enc/dec
  - VideoToolbox for macOS/iOS
- Supported platforms
  - Windows
  - macOS
  - Linux
  - iOS
  - Android
## Crates
- livekit-api: Server APIs and auth token generation
- livekit: LiveKit real-time SDK
- livekit-ffi: Internal crate, used to generate bindings for other languages
When adding the SDK as a dependency to your project, make sure to add the necessary rustflags to your Cargo config; otherwise, linking may fail. Also, please refer to the list of supported platform toolkits.
## Getting started
Currently, Tokio is required to use this SDK; however, we plan to make the async executor runtime-agnostic.
### Using Server API

#### Generating an access token
```rust
use livekit_api::access_token;
use std::env;
```
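A minimal, self-contained sketch of generating a token, assuming the `AccessToken` builder from `livekit-api`; the identity, name, and room values below are illustrative:

```rust
use livekit_api::access_token;
use std::env;

fn create_token() -> Result<String, access_token::AccessTokenError> {
    // API credentials are read from the environment
    let api_key = env::var("LIVEKIT_API_KEY").expect("LIVEKIT_API_KEY is not set");
    let api_secret = env::var("LIVEKIT_API_SECRET").expect("LIVEKIT_API_SECRET is not set");

    // Grant join access to a single room; identity, name, and room are illustrative
    access_token::AccessToken::with_api_key(&api_key, &api_secret)
        .with_identity("rust-bot")
        .with_name("Rust Bot")
        .with_grants(access_token::VideoGrants {
            room_join: true,
            room: "my-room".to_string(),
            ..Default::default()
        })
        .to_jwt()
}
```

The resulting JWT can then be handed to a client that wants to join the room.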
#### Creating a room with RoomService API
```rust
use livekit_api::services::room::{CreateRoomOptions, RoomClient};
```
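A sketch of creating a room through the `RoomClient`, assuming the client picks up `LIVEKIT_API_KEY`/`LIVEKIT_API_SECRET` from the environment; the server URL and room name are illustrative:

```rust
use livekit_api::services::room::{CreateRoomOptions, RoomClient};

#[tokio::main]
async fn main() {
    // Point the client at your LiveKit server (URL is illustrative)
    let room_service = RoomClient::new("http://localhost:7880").unwrap();

    let room = room_service
        .create_room("my_room", CreateRoomOptions::default())
        .await
        .unwrap();

    println!("Created room: {:?}", room);
}
```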
### Using Real-time SDK
Connect to a Room and listen for events:
```rust
use livekit::prelude::*;
```
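A sketch of connecting and draining room events, assuming a `Room::connect(url, token, RoomOptions)` signature (newer SDK versions take `RoomOptions`; older ones omit it); reading the URL and token from the environment is illustrative:

```rust
use livekit::prelude::*;
use std::env;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Server URL and access token are taken from the environment for illustration
    let url = env::var("LIVEKIT_URL").expect("LIVEKIT_URL is not set");
    let token = env::var("LIVEKIT_TOKEN").expect("LIVEKIT_TOKEN is not set");

    let (room, mut room_events) = Room::connect(&url, &token, RoomOptions::default()).await?;
    println!("Connected to room: {}", room.name());

    while let Some(event) = room_events.recv().await {
        match event {
            RoomEvent::TrackSubscribed { .. } => {
                // A remote track was subscribed; see the next section for consuming its frames
            }
            _ => {}
        }
    }

    Ok(())
}
```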
#### Receive video frames of a subscribed track
```rust
// ...
use futures::StreamExt; // this trait is required for iterating on audio & video frames
use livekit::prelude::*;

match event {
    // ...
}
```
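Below is a sketch of consuming frames from a subscribed video track; the `handle_event` helper is hypothetical, and the exact import path for `NativeVideoStream` may differ between SDK versions:

```rust
use futures::StreamExt; // this trait is required for iterating on audio & video frames
use livekit::prelude::*;
// Path may vary by SDK version; NativeVideoStream lives in the re-exported WebRTC module
use livekit::webrtc::video_stream::native::NativeVideoStream;

// Hypothetical helper, meant to be called from the event loop of the previous example
// (it must run inside a Tokio runtime because it spawns a task).
fn handle_event(event: RoomEvent) {
    match event {
        RoomEvent::TrackSubscribed { track, .. } => {
            if let RemoteTrack::Video(video_track) = track {
                // Wrap the underlying RTC track in a stream of decoded video frames
                let mut video_stream = NativeVideoStream::new(video_track.rtc_track());
                tokio::spawn(async move {
                    // Receive the video frames in a separate task
                    while let Some(_frame) = video_stream.next().await {
                        println!("received a video frame");
                    }
                });
            }
        }
        _ => {}
    }
}
```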
## Examples

- basic_room: simple example connecting to a room.
- wgpu_room: complete example app with video rendering using wgpu and egui.
- mobile: mobile app targeting iOS and Android.
- play_from_disk: publish audio from a wav file.
- save_to_disk: save received audio to a wav file.
## Motivation and Design Goals
LiveKit aims to provide an open source, end-to-end WebRTC stack that works everywhere. We have two goals in mind with this SDK:
1. Build a standalone, cross-platform LiveKit client SDK for Rustaceans.
2. Build a common core for other platform-specific SDKs (e.g. Unity, Unreal, iOS, Android).
Regarding (2), we've already developed a number of client SDKs for several platforms and encountered a few challenges in the process:
- There's a significant amount of business/control logic in our signaling protocol and WebRTC. Currently, this logic needs to be implemented in every new platform we support.
- Interactions with media devices and encoding/decoding are specific to each platform and framework.
- For multi-platform frameworks (e.g. Unity, Flutter, React Native), the aforementioned tasks proved to be extremely painful.
Thus, we posited that a Rust SDK, something we wanted to build anyway, encapsulating all our business logic and platform-specific APIs into a clean set of abstractions, could also serve as the foundation for our other SDKs!
We'll first use it as a basis for our Unity SDK (under development), but over time, it will power our other SDKs, as well.