Tensor Operations - Mathematical and Structural Operations
This module re-exports all tensor operations for convenient access. Operations are organized into submodules by category.
§Categories
- Arithmetic: +, -, *, /, power
- Comparison: ==, <, >, <=, >=
- Reduction: sum, mean, max, min
- Matrix: matmul, transpose, inverse
- Activation: relu, sigmoid, tanh, softmax
@version 0.1.0
@author AutomataNexus Development Team
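The categories above correspond to standard elementwise and reduction math. Below is a minimal, illustrative sketch on plain `Vec<f32>`; it is not this crate's API (the crate's functions operate on tensors and take dimension arguments where relevant), it only shows what each category computes.

```rust
// Illustrative only: what the operation categories compute, on plain Vec<f32>.
// The crate's functions operate on tensors, not Vec<f32>.
fn main() {
    let a = vec![1.0_f32, 2.0, 3.0];
    let b = vec![3.0_f32, 2.0, 1.0];

    // Arithmetic: elementwise +, -, *, /, power.
    let sum: Vec<f32> = a.iter().zip(&b).map(|(x, y)| x + y).collect();

    // Comparison: elementwise ==, <, >, yielding a boolean mask.
    let gt_mask: Vec<bool> = a.iter().zip(&b).map(|(x, y)| x > y).collect();

    // Reduction: collapse elements, e.g. the mean over the whole vector.
    let mean: f32 = a.iter().sum::<f32>() / a.len() as f32;

    println!("{:?} {:?} {}", sum, gt_mask, mean); // [4.0, 4.0, 4.0] [false, false, true] 2
}
```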
Functions§
- clamp
- Clamps all elements to the range [min, max].
- clamp_max
- Clamps all elements to be at most max.
- clamp_min
- Clamps all elements to be at least min.
- elu
- Applies ELU (Exponential Linear Unit) activation.
- eq
- Element-wise equality comparison.
- gelu
- Applies GELU (Gaussian Error Linear Unit) activation.
- gt
- Element-wise greater-than comparison.
- leaky_relu
- Applies Leaky ReLU activation.
- log_softmax
- Applies log-softmax along the specified dimension.
- lt
- Element-wise less-than comparison.
- silu
- Applies SiLU (Sigmoid Linear Unit) / Swish activation.
- softmax
- Applies softmax along the specified dimension (see the sketch after this list).
- where_cond
- Selects elements from x or y based on condition.
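For reference, the formulas behind the listed functions can be sketched on scalars and 1-D slices. This is an illustrative sketch only: the helper signatures below are assumptions made for the example and are not this crate's API, which operates on tensors and takes a dimension argument for softmax and log_softmax.

```rust
// Illustrative scalar/slice definitions of the listed operations.
// Not this crate's signatures; parameter names are assumptions for the sketch.

/// clamp: restrict x to the range [min, max].
fn clamp(x: f32, min: f32, max: f32) -> f32 {
    x.max(min).min(max)
}

/// where_cond: pick x where the condition holds, otherwise y.
fn where_cond(cond: bool, x: f32, y: f32) -> f32 {
    if cond { x } else { y }
}

fn sigmoid(x: f32) -> f32 {
    1.0 / (1.0 + (-x).exp())
}

/// ELU: x for x > 0, alpha * (exp(x) - 1) otherwise.
fn elu(x: f32, alpha: f32) -> f32 {
    if x > 0.0 { x } else { alpha * (x.exp() - 1.0) }
}

/// Leaky ReLU: x for x > 0, slope * x otherwise (slope is often 0.01).
fn leaky_relu(x: f32, slope: f32) -> f32 {
    if x > 0.0 { x } else { slope * x }
}

/// SiLU / Swish: x * sigmoid(x).
fn silu(x: f32) -> f32 {
    x * sigmoid(x)
}

/// GELU, tanh approximation: 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3))).
fn gelu(x: f32) -> f32 {
    let c = (2.0_f32 / std::f32::consts::PI).sqrt();
    0.5 * x * (1.0 + (c * (x + 0.044715 * x.powi(3))).tanh())
}

/// Softmax over a 1-D slice: exp(x_i) / sum_j exp(x_j).
fn softmax(xs: &[f32]) -> Vec<f32> {
    let max = xs.iter().copied().fold(f32::NEG_INFINITY, f32::max);
    let exps: Vec<f32> = xs.iter().map(|&x| (x - max).exp()).collect();
    let total: f32 = exps.iter().sum();
    exps.iter().map(|&e| e / total).collect()
}

/// Log-softmax: x_i - max - ln(sum_j exp(x_j - max)), computed directly.
fn log_softmax(xs: &[f32]) -> Vec<f32> {
    let max = xs.iter().copied().fold(f32::NEG_INFINITY, f32::max);
    let log_sum: f32 = xs.iter().map(|&x| (x - max).exp()).sum::<f32>().ln();
    xs.iter().map(|&x| x - max - log_sum).collect()
}

fn main() {
    println!("{:?}", softmax(&[1.0, 2.0, 3.0]));      // ≈ [0.090, 0.245, 0.665]
    println!("{}", clamp(2.5, 0.0, 1.0));             // 1.0
    println!("{}", where_cond(2.5 > 1.0, 1.0, -1.0)); // 1.0
}
```

The max-subtraction in softmax and log_softmax is the standard numerical-stability trick and does not change the result; the gelu sketch uses the common tanh approximation rather than the exact erf form.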