Struct openvr::System

pub struct System(_);

Methods

impl System

pub fn recommended_render_target_size(&self) -> (u32, u32)

Provides the game with the minimum size that it should use for its offscreen render target to minimize pixel stretching. This size is matched with the projection matrix and distortion function and will change from display to display depending on resolution, distortion, and field of view.

pub fn projection_matrix(
    &self,
    eye: Eye,
    near_z: f32,
    far_z: f32
) -> [[f32; 4]; 4]

Returns the projection matrix to use for the specified eye.

Clip plane distances are in meters.
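The returned matrix is combined with the model and view transforms in the usual way. A minimal sketch of a row-major 4x4 multiply for that purpose (`mul44` is an illustrative helper, not part of this crate; most applications would use their math library instead):

```rust
// Multiply two row-major 4x4 matrices (a * b). OpenVR returns the projection
// matrix as [[f32; 4]; 4]; verify the majority convention against your math
// library before mixing the two.
fn mul44(a: [[f32; 4]; 4], b: [[f32; 4]; 4]) -> [[f32; 4]; 4] {
    let mut out = [[0.0f32; 4]; 4];
    for i in 0..4 {
        for j in 0..4 {
            for k in 0..4 {
                out[i][j] += a[i][k] * b[k][j];
            }
        }
    }
    out
}
```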

pub fn projection_raw(&self, eye: Eye) -> RawProjection

Returns the raw projection values to use for the specified eye. Most games should use projection_matrix instead of this method, but sometimes a game needs to do something fancy with its projection and can use these values to compute its own matrix.

pub fn eye_to_head_transform(&self, eye: Eye) -> [[f32; 4]; 3]

Returns the transform between the view space and eye space. Eye space is the per-eye flavor of view space that provides stereo disparity. Instead of Model * View * Projection, the chain becomes Model * View * Eye * Projection. Normally View and Eye will be multiplied together and treated as View in your application.
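The transform is returned as a row-major 3x4 affine matrix, so composing it with 4x4 matrices first requires promoting it to homogeneous form. A sketch (the helper name is illustrative, not part of this crate):

```rust
// Promote a row-major 3x4 affine transform ([[f32; 4]; 3]) to a homogeneous
// 4x4 matrix by appending the row [0, 0, 0, 1], so it can be multiplied into
// the Model * View * Eye * Projection chain.
fn mat34_to_mat44(m: [[f32; 4]; 3]) -> [[f32; 4]; 4] {
    [m[0], m[1], m[2], [0.0, 0.0, 0.0, 1.0]]
}
```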

pub fn time_since_last_vsync(&self) -> Option<(f32, u64)>

Returns the number of elapsed seconds since the last recorded vsync event and the global number of frames that have been rendered. Timing information comes from a vsync timer event if one is available, or from the application-reported time otherwise. If no vsync times are available, the function returns None.

pub fn device_to_absolute_tracking_pose(
    &self,
    origin: TrackingUniverseOrigin,
    predicted_seconds_to_photons_from_now: f32
) -> TrackedDevicePoses

Calculates updated poses for all devices.

Each pose is the tracker's prediction of where the device will be at the specified number of seconds into the future. Pass 0 to get the state at the instant the method is called. Most of the time the application should calculate the time until the photons will be emitted from the display and pass that time into the method.

This is roughly analogous to the inverse of the view matrix in most applications, though many games will need to do some additional rotation or translation on top of the rotation and translation provided by the head pose.

Seated experiences should call this method with TrackingUniverseSeated and receive poses relative to the seated zero pose. Standing experiences should call this method with TrackingUniverseStanding and receive poses relative to the chaperone soft bounds. TrackingUniverseRawAndUncalibrated should probably not be used unless the application is the chaperone calibration tool itself, but will provide poses relative to the hardware-specific coordinate system in the driver.
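The prediction time is commonly derived from display timing: the time remaining until the next vsync plus the display's vsync-to-photon latency. A sketch of that arithmetic, assuming `frame_duration` is 1 / display frequency and `vsync_to_photons` comes from the Prop_SecondsFromVsyncToPhotons property (the function and parameter names here are illustrative):

```rust
// Estimate predicted_seconds_to_photons_from_now as: time left in the current
// frame (frame_duration - seconds_since_vsync) plus the panel's
// vsync-to-photon latency.
fn predicted_photon_time(
    frame_duration: f32,      // 1.0 / display frequency, in seconds
    seconds_since_vsync: f32, // first element of time_since_last_vsync()
    vsync_to_photons: f32,    // e.g. Prop_SecondsFromVsyncToPhotons
) -> f32 {
    frame_duration - seconds_since_vsync + vsync_to_photons
}
```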

pub fn tracked_device_class(
    &self,
    index: TrackedDeviceIndex
) -> TrackedDeviceClass

pub fn is_tracked_device_connected(&self, index: TrackedDeviceIndex) -> bool

pub fn poll_next_event_with_pose(
    &self,
    origin: TrackingUniverseOrigin
) -> Option<(EventInfo, TrackedDevicePose)>

pub fn compute_distortion(
    &self,
    eye: Eye,
    u: f32,
    v: f32
) -> Option<DistortionCoordinates>

Computes the distortion caused by the optics. Returns the distortion coordinates for a single sample, for use in building a distortion map. Input UVs are in a single eye's viewport, and output UVs are for the source render target in the distortion shader.
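A distortion map is typically built by sampling this method once per vertex of a regular UV grid over the eye's viewport. A sketch that just generates the input sample points; the caller would pair each with the DistortionCoordinates returned for it (the helper and grid resolution are illustrative):

```rust
// Generate the (u, v) input coordinates for an n x n sample grid spanning
// [0, 1] x [0, 1], row by row. Each point would then be passed to
// compute_distortion for the eye being mapped.
fn grid_uvs(n: usize) -> Vec<(f32, f32)> {
    let step = 1.0 / (n - 1) as f32;
    (0..n)
        .flat_map(|y| (0..n).map(move |x| (x as f32 * step, y as f32 * step)))
        .collect()
}
```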

pub fn tracked_device_index_for_controller_role(
    &self,
    role: TrackedControllerRole
) -> Option<TrackedDeviceIndex>

Returns the device index associated with a specific role, for example the left hand or the right hand.

pub fn get_controller_role_for_tracked_device_index(
    &self,
    i: TrackedDeviceIndex
) -> Option<TrackedControllerRole>

Returns the controller role (e.g. left or right hand) associated with a device index.

pub fn vulkan_output_device(
    &self,
    instance: *mut VkInstance_T
) -> Option<*mut VkPhysicalDevice_T>

pub fn bool_tracked_device_property(
    &self,
    device: TrackedDeviceIndex,
    property: TrackedDeviceProperty
) -> Result<bool, TrackedPropertyError>

pub fn float_tracked_device_property(
    &self,
    device: TrackedDeviceIndex,
    property: TrackedDeviceProperty
) -> Result<f32, TrackedPropertyError>

pub fn int32_tracked_device_property(
    &self,
    device: TrackedDeviceIndex,
    property: TrackedDeviceProperty
) -> Result<i32, TrackedPropertyError>

pub fn uint64_tracked_device_property(
    &self,
    device: TrackedDeviceIndex,
    property: TrackedDeviceProperty
) -> Result<u64, TrackedPropertyError>

pub fn matrix34_tracked_device_property(
    &self,
    device: TrackedDeviceIndex,
    property: TrackedDeviceProperty
) -> Result<[[f32; 4]; 3], TrackedPropertyError>

pub fn string_tracked_device_property(
    &self,
    device: TrackedDeviceIndex,
    property: TrackedDeviceProperty
) -> Result<CString, TrackedPropertyError>

pub fn hidden_area_mesh(
    &self,
    eye: Eye,
    ty: HiddenAreaMeshType
) -> Option<HiddenAreaMesh>

Returns the hidden area mesh for the current HMD.

The pixels covered by this mesh will never be visible to the user after lens distortion is applied, based on their visibility through the panels. If this HMD does not have a hidden area mesh, None is returned. This mesh is meant to be rendered into the stencil buffer (or into the depth buffer at the near plane) before rendering each eye's view. This improves performance by letting the GPU early-reject pixels the user will never see before running the pixel shader.

NOTE: Render this mesh with backface culling disabled since the winding order of the vertices can be different per-HMD or per-eye.

Passing HiddenAreaMeshType::Inverse will produce the visible area mesh that is commonly used in place of full-screen quads. The visible area mesh covers all of the pixels the hidden area mesh does not cover.

pub fn controller_state(
    &self,
    device: TrackedDeviceIndex
) -> Option<ControllerState>

Looks up the current input state of a controller.

Returns None if the device is not a controller, or if the user is currently in the system menu.

Needed to render controller components (e.g. the trigger) accurately with respect to user input when using the render_models API.
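In OpenVR, pressed buttons are reported in the controller state as a u64 bitmask where bit i corresponds to button id i. A sketch of testing that mask, assuming the returned state exposes it as a plain u64 (`button_pressed` is a hypothetical helper, not part of this crate):

```rust
// Test whether the button with the given id is set in a pressed-button
// bitmask, mirroring OpenVR's ButtonMaskFromId(id) == 1 << id convention.
fn button_pressed(button_mask: u64, button_id: u32) -> bool {
    button_mask & (1u64 << button_id) != 0
}
```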

pub fn controller_state_with_pose(
    &self,
    origin: TrackingUniverseOrigin,
    device: TrackedDeviceIndex
) -> Option<(ControllerState, TrackedDevicePose)>

Like controller_state, but also returns the controller's pose in the given tracking universe.

pub fn trigger_haptic_pulse(
    &self,
    device: TrackedDeviceIndex,
    axis: u32,
    microseconds: u16
)

Trigger a single haptic pulse on a controller.

After this call the application may not trigger another haptic pulse on this controller and axis combination for 5ms.

Vive controller haptics respond to axis 0. OpenVR seems to reject durations longer than 3999us.
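Given that apparent limit, an application might clamp durations before triggering a pulse. A minimal sketch (`clamp_pulse` is a hypothetical helper; the 3999 µs ceiling is observed behavior, not a documented constant):

```rust
// Clamp a haptic pulse duration to the longest value OpenVR has been
// observed to accept (3999 microseconds) before passing it to
// trigger_haptic_pulse.
fn clamp_pulse(microseconds: u16) -> u16 {
    microseconds.min(3999)
}
```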

pub fn acknowledge_quit_exiting(&self)

Call this to acknowledge to the system that Event::Quit has been received and that the process is exiting.

This extends the timeout until the process is killed.

pub fn acknowledge_quit_user_prompt(&self)

Call this to tell the system that the user is being prompted to save data.

This halts the timeout and dismisses the dashboard (if it was up). Applications should be sure to actually prompt the user to save and then exit afterward, otherwise the user will be left in a confusing state.

pub fn reset_seated_zero_pose(&self)

Sets the zero pose for the seated tracker coordinate system to the current position and yaw of the HMD.

After reset_seated_zero_pose all device_to_absolute_tracking_pose calls that pass TrackingUniverseOrigin::Seated as the origin will be relative to this new zero pose. The new zero coordinate system will not change the fact that the Y axis is up in the real world, so the next pose returned from device_to_absolute_tracking_pose after a call to reset_seated_zero_pose may not be exactly an identity matrix.

NOTE: This function overrides the user's previously saved seated zero pose and should only be called as the result of a user action. Users are also able to set their seated zero pose via the OpenVR Dashboard.

Auto Trait Implementations

impl Send for System

impl Unpin for System

impl Sync for System

impl UnwindSafe for System

impl RefUnwindSafe for System

Blanket Implementations

impl<T> From<T> for T

impl<T, U> Into<U> for T where
    U: From<T>, 

impl<T, U> TryFrom<U> for T where
    U: Into<T>, 

type Error = Infallible

The type returned in the event of a conversion error.

impl<T, U> TryInto<U> for T where
    U: TryFrom<T>, 

type Error = <U as TryFrom<T>>::Error

The type returned in the event of a conversion error.

impl<T> BorrowMut<T> for T where
    T: ?Sized

impl<T> Borrow<T> for T where
    T: ?Sized

impl<T> Any for T where
    T: 'static + ?Sized