//! General graphics-related utility functions
//!
//! # Coordinates
//!
//! The `gl_Position` output of the last GLSL vertex processing stage (vertex
//! shader, tessellation shader, or geometry shader) is in *clip space*, that
//! is, 4D homogeneous (projective) coordinates (x, y, z, w). During vertex
//! post-processing, primitives are clipped against the view volume and clip
//! space coordinates are transformed to normalized device coordinates (NDCs)
//! by the perspective divide, yielding x, y, and z coordinates in the range
//! [-1.0, 1.0]. Finally, the viewport transform takes NDCs and outputs
//! *screen space* (or *window space*) coordinates, which are 2D device
//! coordinates plus a 1D depth coordinate, on which rasterization is
//! performed to produce fragments.
//!
//! **2D**
//!
//! 2D x,y coordinates normalized to [-1.0, 1.0] can be passed through to
//! vertex post-processing unmodified. This is the case with shader pipelines
//! that specify their inputs are in "`ClipSpace`"; the pass-through vertex
//! shader will add a z component of 0.0 and a w component of 1.0.
//!
//! (Note: as of v0.6.0 none of the default draw2d resources use clip space
//! rendering pipelines)
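A minimal sketch of what the pass-through described above amounts to, done on the CPU side for illustration (in a real pipeline the vertex shader performs this promotion; the function name here is hypothetical):

```rust
// Promote a normalized 2D vertex to 4D homogeneous clip coordinates by
// appending z = 0.0 and w = 1.0, as the pass-through vertex shader does.
fn clip_space_passthrough(xy: [f32; 2]) -> [f32; 4] {
    [xy[0], xy[1], 0.0, 1.0]
}

fn main() {
    // w = 1.0 makes the subsequent perspective divide a no-op.
    assert_eq!(clip_space_passthrough([0.5, -0.25]), [0.5, -0.25, 0.0, 1.0]);
}
```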
//!
//! Even though the ultimate output of vertex post-processing is 2D screen space
//! (device) coordinates, in order to render screen-space input vertices, they
//! must be transformed to clip space by vertex processing. Two utility
//! functions are provided to convert between screen space and normalized
//! coordinates: `graphics::screen_2d_to_ndc_2d` and
//! `graphics::ndc_2d_to_screen_2d`. Both require screen dimensions as input.
use math_utils as math;
use crate::camera3d;
/// Convenience method that calls `math::orthographic_rh_no` on the given
/// `FrustumPlanes` struct.
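The standard right-handed, [-1, 1]-clip-range ("RH_NO", glOrtho-equivalent) orthographic projection that `math::orthographic_rh_no` presumably implements can be sketched self-containedly; the `FrustumPlanes` field names and the row-major layout here are assumptions, not the crate's actual types:

```rust
// Assumed frustum description: the six clipping planes of the view volume.
struct FrustumPlanes {
    left: f32, right: f32, bottom: f32, top: f32, near: f32, far: f32,
}

/// Row-major 4x4 orthographic projection matrix (glOrtho-equivalent).
fn orthographic_rh_no(p: &FrustumPlanes) -> [[f32; 4]; 4] {
    let (l, r, b, t, n, f) = (p.left, p.right, p.bottom, p.top, p.near, p.far);
    [
        [2.0 / (r - l), 0.0, 0.0, -(r + l) / (r - l)],
        [0.0, 2.0 / (t - b), 0.0, -(t + b) / (t - b)],
        [0.0, 0.0, -2.0 / (f - n), -(f + n) / (f - n)],
        [0.0, 0.0, 0.0, 1.0],
    ]
}

/// Apply the matrix to a 3D point (w = 1.0 implied; w row omitted since an
/// orthographic projection leaves w at 1.0).
fn transform(m: &[[f32; 4]; 4], v: [f32; 3]) -> [f32; 3] {
    let h = [v[0], v[1], v[2], 1.0];
    let mut out = [0.0f32; 3];
    for i in 0..3 {
        out[i] = (0..4).map(|j| m[i][j] * h[j]).sum::<f32>();
    }
    out
}

fn main() {
    // A 640x480 view volume with z in [-1, 1]: its center maps to the NDC
    // origin and its far upper-right corner to (1, 1, 1).
    let planes = FrustumPlanes {
        left: 0.0, right: 640.0, bottom: 0.0, top: 480.0, near: -1.0, far: 1.0,
    };
    let m = orthographic_rh_no(&planes);
    assert_eq!(transform(&m, [320.0, 240.0, 0.0]), [0.0, 0.0, 0.0]);
    assert_eq!(transform(&m, [640.0, 480.0, -1.0]), [1.0, 1.0, 1.0]);
}
```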
/// Convenience method calling `Matrix4::perspective_rh_no` to construct a
/// right-handed perspective matrix with -1 to 1 clip planes, equivalent to the
/// `gluPerspective` function.
///
/// This will transform points in right-handed camera (view, eye) space
/// (negative Z axis into the scene, positive Y axis 'up') into left-handed 4D
/// homogeneous clip space where the Z axis is reversed, with positive Z into
/// the screen, while the X and Y orientations remain unchanged.
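The handedness flip described above can be verified with a self-contained sketch of the gluPerspective-style matrix (the crate's actual `Matrix4::perspective_rh_no` signature and types are assumptions here):

```rust
/// Row-major gluPerspective-equivalent matrix: right-handed eye space in,
/// [-1, 1] clip range out.
fn perspective_rh_no(fovy_rad: f32, aspect: f32, near: f32, far: f32) -> [[f32; 4]; 4] {
    let f = 1.0 / (fovy_rad / 2.0).tan();
    [
        [f / aspect, 0.0, 0.0, 0.0],
        [0.0, f, 0.0, 0.0],
        [0.0, 0.0, (far + near) / (near - far), 2.0 * far * near / (near - far)],
        [0.0, 0.0, -1.0, 0.0],
    ]
}

/// Project an eye-space point and perform the perspective divide.
fn project(m: &[[f32; 4]; 4], v: [f32; 3]) -> [f32; 3] {
    let h = [v[0], v[1], v[2], 1.0];
    let clip: Vec<f32> = (0..4)
        .map(|i| (0..4).map(|j| m[i][j] * h[j]).sum::<f32>())
        .collect();
    [clip[0] / clip[3], clip[1] / clip[3], clip[2] / clip[3]]
}

fn main() {
    let m = perspective_rh_no(std::f32::consts::FRAC_PI_2, 4.0 / 3.0, 0.1, 100.0);
    // The Z axis reversal: a point on the near plane (z = -near in eye
    // space) lands at z = -1.0 in NDC, a point on the far plane at +1.0.
    let near_pt = project(&m, [0.0, 0.0, -0.1]);
    let far_pt = project(&m, [0.0, 0.0, -100.0]);
    assert!((near_pt[2] + 1.0).abs() < 1e-5);
    assert!((far_pt[2] - 1.0).abs() < 1e-4);
}
```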
/// Convert screen coordinate to OpenGL NDC based on a given screen resolution.
///
/// # Examples
///
/// ```
/// # extern crate gl_utils;
/// # extern crate math_utils as math;
/// # fn main () {
/// # use gl_utils::graphics::screen_2d_to_ndc_2d;
/// assert_eq!(
/// screen_2d_to_ndc_2d ([640, 480].into(), [320.0, 240.0].into()),
/// [0.0, 0.0].into()
/// );
/// math::approx::assert_relative_eq!(
/// screen_2d_to_ndc_2d ([640, 480].into(), [480.0, 264.0].into()),
/// [0.5, 0.1].into()
/// );
/// assert_eq!(
/// screen_2d_to_ndc_2d ([640, 480].into(), [0.0, 0.0].into()),
/// [-1.0, -1.0].into()
/// );
/// assert_eq!(
/// screen_2d_to_ndc_2d ([640, 480].into(), [0.0, 480.0].into()),
/// [-1.0, 1.0].into()
/// );
/// assert_eq!(
/// screen_2d_to_ndc_2d ([640, 480].into(), [640.0, 0.0].into()),
/// [1.0, -1.0].into()
/// );
/// assert_eq!(
/// screen_2d_to_ndc_2d ([640, 480].into(), [640.0, 480.0].into()),
/// [1.0, 1.0].into()
/// );
/// # }
/// ```
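The function body is missing from this excerpt; the mapping the doctest above pins down can be sketched free-standing (plain arrays stand in for the crate's point types, and the integer screen-dimension type is an assumption):

```rust
// NDC = 2 * screen / dims - 1, applied per axis.
fn screen_2d_to_ndc_2d(dims: [u32; 2], screen: [f32; 2]) -> [f32; 2] {
    [
        2.0 * screen[0] / dims[0] as f32 - 1.0,
        2.0 * screen[1] / dims[1] as f32 - 1.0,
    ]
}

fn main() {
    // The screen center maps to the NDC origin; the screen corners map to
    // the corners of the [-1, 1] square.
    assert_eq!(screen_2d_to_ndc_2d([640, 480], [320.0, 240.0]), [0.0, 0.0]);
    assert_eq!(screen_2d_to_ndc_2d([640, 480], [0.0, 0.0]), [-1.0, -1.0]);
    assert_eq!(screen_2d_to_ndc_2d([640, 480], [640.0, 480.0]), [1.0, 1.0]);
}
```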
/// Maps OpenGL NDC coordinates to screen coordinates based on a given screen
/// resolution.
///
/// # Examples
///
/// ```
/// # extern crate gl_utils;
/// # extern crate math_utils as math;
/// # fn main () {
/// # use gl_utils::graphics::ndc_2d_to_screen_2d;
/// assert_eq!(
/// ndc_2d_to_screen_2d ([640, 480].into(), [0.0, 0.0].into()),
/// [320.0, 240.0].into()
/// );
/// assert_eq!(
/// ndc_2d_to_screen_2d ([640, 480].into(), [0.5, 0.1].into()),
/// [480.0, 264.0].into()
/// );
/// assert_eq!(
/// ndc_2d_to_screen_2d ([640, 480].into(), [-1.0, -1.0].into()),
/// [0.0, 0.0].into()
/// );
/// assert_eq!(
/// ndc_2d_to_screen_2d ([640, 480].into(), [-1.0, 1.0].into()),
/// [0.0, 480.0].into()
/// );
/// assert_eq!(
/// ndc_2d_to_screen_2d ([640, 480].into(), [1.0, -1.0].into()),
/// [640.0, 0.0].into()
/// );
/// assert_eq!(
/// ndc_2d_to_screen_2d ([640, 480].into(), [1.0, 1.0].into()),
/// [640.0, 480.0].into()
/// );
/// # }
/// ```
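As with its inverse, the body is missing from this excerpt; a free-standing sketch of the mapping the doctest specifies (plain arrays in place of the crate's point types; the dimension type is an assumption):

```rust
// screen = (NDC + 1) / 2 * dims, applied per axis.
fn ndc_2d_to_screen_2d(dims: [u32; 2], ndc: [f32; 2]) -> [f32; 2] {
    [
        (ndc[0] + 1.0) * 0.5 * dims[0] as f32,
        (ndc[1] + 1.0) * 0.5 * dims[1] as f32,
    ]
}

fn main() {
    // Inverse of screen_2d_to_ndc_2d: the NDC origin maps back to the
    // screen center, the NDC corners to the screen corners.
    assert_eq!(ndc_2d_to_screen_2d([640, 480], [0.0, 0.0]), [320.0, 240.0]);
    assert_eq!(ndc_2d_to_screen_2d([640, 480], [-1.0, -1.0]), [0.0, 0.0]);
    assert_eq!(ndc_2d_to_screen_2d([640, 480], [1.0, 1.0]), [640.0, 480.0]);
}
```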