Python bindings for ToRSh via PyO3
Re-exports

- pub use tensor::PyTensor;
- pub use crate::pandas_support::DataAnalysisResult;
- pub use crate::pandas_support::PandasSupport;
- pub use crate::pandas_support::TorshDataFrame;
- pub use crate::pandas_support::TorshSeries;
- pub use crate::scipy_integration::LinalgResult;
- pub use crate::scipy_integration::OptimizationResult;
- pub use crate::scipy_integration::SciPyIntegration;
- pub use crate::scipy_integration::SignalResult;
Modules
- tensor
- Python tensor wrapper module
Structs
- PyAdam
- Adam optimizer
- PyDataLoader
- Python wrapper for ToRSh DataLoader
- PyDataLoaderBuilder
- Builder for configuring a DataLoader with advanced options
- PyLinear
- Linear (fully connected) layer
- PyModule
- Base class for neural network modules
- PyOptimizer
- Base optimizer class
- PyRandomDataLoader
- Python wrapper for random DataLoader
- PySGD
- SGD optimizer
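PyLinear, PySGD, and PyModule follow the usual layer/optimizer pattern. As a rough illustration of the semantics they wrap, here is a plain-NumPy sketch of a linear layer's forward pass and one SGD step on an MSE loss; shapes and names are illustrative assumptions, not ToRSh's actual API:

```python
import numpy as np

rng = np.random.default_rng(0)

# A "Linear" layer: y = x @ W.T + b
W = rng.standard_normal((3, 4)) * 0.1    # (out_features, in_features)
b = np.zeros(3)
x = rng.standard_normal((2, 4))          # batch of 2 inputs

y = x @ W.T + b                          # forward pass: shape (2, 3)

# One SGD step on a mean-squared-error loss against a target
target = np.zeros((2, 3))
loss_before = ((y - target) ** 2).mean()

grad_y = 2 * (y - target) / y.size       # dL/dy for MSE
grad_W = grad_y.T @ x                    # dL/dW
grad_b = grad_y.sum(axis=0)              # dL/db

lr = 0.1                                 # learning rate
W -= lr * grad_W                         # SGD update (the step PySGD applies)
b -= lr * grad_b

loss_after = (((x @ W.T + b) - target) ** 2).mean()
```

In the bindings, the same flow would go through PyLinear's forward, PyModule's parameter bookkeeping, and PySGD's step; PyAdam replaces the plain gradient step with moment-based updates.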
Functions
- arange
- Create tensor with values in a range
- binary_cross_entropy
- Binary cross-entropy loss
- cat
- Concatenate tensors along an existing dimension
- cross_entropy
- Cross-entropy loss
- cuda_device_count
- Get the number of CUDA devices
- cuda_is_available
- Check if CUDA is available
- eye
- Create identity matrix
- from_numpy
- Create tensor from NumPy array
- full
- Create tensor filled with a scalar value
- gelu
- GELU activation function (Gaussian Error Linear Unit)
- linspace
- Create tensor with linearly spaced values
- log_softmax
- Log softmax function
- manual_seed
- Set manual seed for reproducibility
- mse_loss
- Mean squared error loss
- ones
- Create tensor of ones
- rand
- Create tensor with random uniform distribution
- randn
- Create tensor with random normal distribution
- relu
- ReLU activation function
- sigmoid
- Sigmoid activation function
- softmax
- Softmax function
- stack
- Stack tensors along a new dimension
- tanh
- Tanh activation function
- tensor
- Create tensor from data
- to_numpy
- Convert tensor to NumPy array
- zeros
- Create tensor of zeros
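Several of the activation and loss functions above have simple closed forms. A plain-NumPy sketch of what softmax, log_softmax, and cross_entropy compute (illustrative semantics only; the actual bindings operate on ToRSh tensors):

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)   # shift by max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def log_softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    return z - np.log(np.exp(z).sum(axis=-1, keepdims=True))

def cross_entropy(logits, target):
    # Negative log-likelihood of the target class, averaged over the batch
    return -log_softmax(logits)[np.arange(len(target)), target].mean()

logits = np.array([[2.0, 0.5, 0.1],
                   [0.2, 3.0, 0.4]])
target = np.array([0, 1])

probs = softmax(logits)                      # each row sums to 1
loss = cross_entropy(logits, target)
```

Note that cross_entropy is conventionally computed from raw logits via log_softmax rather than by taking log(softmax(...)) in two steps, which avoids overflow for large logit values.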