pub struct Expr {
pub result: f64,
pub is_learnable: bool,
pub name: Option<String>,
/* private fields */
}
Expression representing a node in a calculation graph.
This struct represents a node in a calculation graph. It can be a leaf node, a unary operation or a binary operation.
A leaf node holds a value, which is the one that is used in the calculation.
A unary expression is the result of applying a unary operation to another expression. For example, the result of applying the tanh operation to a leaf node.
A binary expression is the result of applying a binary operation to two other expressions. For example, the result of adding two leaf nodes.
Fields

result: f64
The numeric result of the expression, obtained by applying the operation to the operands.

is_learnable: bool
Whether the expression is learnable. Only learnable Expr nodes have their values updated during backpropagation (learning).

name: Option<String>
The name of the expression, used to identify it in the calculation graph.
Implementations

impl Expr
pub fn new_leaf(value: f64) -> Expr
Creates a new leaf expression with the given value.
Example:
use alpha_micrograd_rust::value::Expr;
let expr = Expr::new_leaf(1.0);

pub fn new_leaf_with_name(value: f64, name: &str) -> Expr
Creates a new leaf expression with the given value and name.
Example:
use alpha_micrograd_rust::value::Expr;
let expr = Expr::new_leaf_with_name(1.0, "x");
assert_eq!(expr.name, Some("x".to_string()));

pub fn tanh(self) -> Expr
Applies the hyperbolic tangent function to the expression and returns it as a new expression.
Example:
use alpha_micrograd_rust::value::Expr;
let expr = Expr::new_leaf(1.0);
let expr2 = expr.tanh();
assert_eq!(expr2.result, 0.7615941559557649);

pub fn relu(self) -> Expr
Applies the rectified linear unit function to the expression and returns it as a new expression.
Example:
use alpha_micrograd_rust::value::Expr;
let expr = Expr::new_leaf(-1.0);
let expr2 = expr.relu();
assert_eq!(expr2.result, 0.0);

pub fn exp(self) -> Expr
Applies the exponential function (e^x) to the expression and returns it as a new expression.
Example:
use alpha_micrograd_rust::value::Expr;
let expr = Expr::new_leaf(1.0);
let expr2 = expr.exp();
assert_eq!(expr2.result, 2.718281828459045);

pub fn pow(self, exponent: Expr) -> Expr
Raises the expression to the power of the given exponent (expression) and returns it as a new expression.
Example:
use alpha_micrograd_rust::value::Expr;
let expr = Expr::new_leaf(2.0);
let exponent = Expr::new_leaf(3.0);
let result = expr.pow(exponent);
assert_eq!(result.result, 8.0);

pub fn log(self) -> Expr
Applies the natural logarithm function to the expression and returns it as a new expression.
Example:
use alpha_micrograd_rust::value::Expr;
let expr = Expr::new_leaf(2.0);
let expr2 = expr.log();
assert_eq!(expr2.result, 0.6931471805599453);

pub fn neg(self) -> Expr
Negates the expression and returns it as a new expression.
Example:
use alpha_micrograd_rust::value::Expr;
let expr = Expr::new_leaf(1.0);
let expr2 = expr.neg();
assert_eq!(expr2.result, -1.0);

pub fn recalculate(&mut self)
Recalculates the value of the expression recursively, using the current values of the operands.
Typically used after a call to Expr::learn, once the gradients have been calculated and
the internal values of the expression tree have been updated.
Example:
use alpha_micrograd_rust::value::Expr;
let expr = Expr::new_leaf(1.0);
let mut expr2 = expr.tanh();
expr2.learn(1e-09);
expr2.recalculate();
assert_eq!(expr2.result, 0.7615941557793864);

You can also vary the values of the operands and recalculate the expression:
use alpha_micrograd_rust::value::Expr;
let expr = Expr::new_leaf_with_name(1.0, "x");
let mut expr2 = expr.tanh();
let mut original = expr2.find_mut("x").expect("Could not find x");
original.result = 2.0;
expr2.recalculate();
assert_eq!(expr2.result, 0.9640275800758169);

pub fn learn(&mut self, learning_rate: f64)
Applies backpropagation to the expression, updating the gradients and the values of the expression tree.
The gradients are adjusted based on the gradient of the last expression in the calculation graph.
Example:
use alpha_micrograd_rust::value::Expr;
let expr = Expr::new_leaf(1.0);
let mut expr2 = expr.tanh();
expr2.learn(1e-09);

After adjusting the gradients, the method updates the values of the individual expression tree nodes to minimize the loss function.
In order to get a new calculation of the expression tree, you’ll need to call
Expr::recalculate after calling Expr::learn.
pub fn find(&self, name: &str) -> Option<&Expr>
Finds a node in the expression tree by its name.
This method will search the expression tree for a node with the given name. If the node is not found, it will return None.
Example:
use alpha_micrograd_rust::value::Expr;
let expr = Expr::new_leaf_with_name(1.0, "x");
let expr2 = expr.tanh();
let original = expr2.find("x");
assert_eq!(original.expect("Could not find x").result, 1.0);

pub fn find_mut(&mut self, name: &str) -> Option<&mut Expr>
Finds a node in the expression tree by its name and returns a mutable reference to it.
This method will search the expression tree for a node with the given name. If the node is not found, it will return None.
Example:
use alpha_micrograd_rust::value::Expr;
let expr = Expr::new_leaf_with_name(1.0, "x");
let mut expr2 = expr.tanh();
let mut original = expr2.find_mut("x").expect("Could not find x");
original.result = 2.0;
expr2.recalculate();
assert_eq!(expr2.result, 0.9640275800758169);

pub fn parameter_count(&self, learnable_only: bool) -> usize
Returns the count of nodes (parameters) in the expression tree.
This method returns the total number of nodes in the tree, including the root node. If learnable_only is true, only learnable nodes are counted.
Example:
use alpha_micrograd_rust::value::Expr;
let expr = Expr::new_leaf(1.0);
let expr2 = expr.tanh();
assert_eq!(expr2.parameter_count(false), 2);
assert_eq!(expr2.parameter_count(true), 1);

Trait Implementations

impl Add<Expr> for f64
impl Add<f64> for Expr
impl Add for Expr
This implementation allows the addition of two Expr objects.
Example:
use alpha_micrograd_rust::value::Expr;
let expr = Expr::new_leaf(1.0);
let expr2 = Expr::new_leaf(2.0);
let result = expr + expr2;
assert_eq!(result.result, 3.0);

impl Div<f64> for Expr
impl Div for Expr
This implementation allows the division of two Expr objects.
Example:
use alpha_micrograd_rust::value::Expr;
let expr = Expr::new_leaf(1.0);
let expr2 = Expr::new_leaf(2.0);
let result = expr / expr2;
assert_eq!(result.result, 0.5);

impl Mul<Expr> for f64

impl Mul<f64> for Expr
impl Mul for Expr
This implementation allows the multiplication of two Expr objects.
Example:
use alpha_micrograd_rust::value::Expr;
let expr = Expr::new_leaf(1.0);
let expr2 = Expr::new_leaf(2.0);
let result = expr * expr2;
assert_eq!(result.result, 2.0);

impl Sub<Expr> for f64

impl Sub<f64> for Expr
impl Sub for Expr
This implementation allows the subtraction of two Expr objects.
Example:
use alpha_micrograd_rust::value::Expr;
let expr = Expr::new_leaf(1.0);
let expr2 = Expr::new_leaf(2.0);
let result = expr - expr2;
assert_eq!(result.result, -1.0);

impl Sum for Expr
Note that this implementation generates temporary Expr objects,
which may not be the most efficient way to sum a collection of Expr objects.
However, it is provided as a convenience for users who want to call sum
on an Iterator<Expr>.
Example:
use alpha_micrograd_rust::value::Expr;
let expr = Expr::new_leaf(1.0);
let expr2 = Expr::new_leaf(2.0);
let expr3 = Expr::new_leaf(3.0);
let sum = vec![expr, expr2, expr3].into_iter().sum::<Expr>();
assert_eq!(sum.result, 6.0);