Function finiteelement::solve_fes
pub fn solve_fes<'a, F: Float + 'a, B: FiniteElement<F>>(
    system: &[B],
    positions: &mut [Point<F>],
    nb_iter: usize,
    epsilon_stop: F,
    gradient_switch: F,
    nb_gradient_steps: usize,
    snapshot_steps: usize
) -> Vec<Vec<Point<F>>>
Solve a system of finite elements.
Arguments:
- `system`: a slice of finite elements.
- `positions`: the initial vector of positions of the points of the system; it is updated at each step of the optimization.
- `nb_iter`: maximum number of optimization steps (see below).
- `epsilon_stop`: threshold at which the optimization is considered finished (see below).
- `gradient_switch`: threshold for switching to gradient descent (see below).
- `nb_gradient_steps`: number of gradient descent steps before switching back to Newton's method (see below).
- `snapshot_steps`: number of optimization steps between each snapshot (see below).
Optimization steps are performed until one of the following happens:
- `nb_iter` steps have been performed.
- No point in the system is subject to an acceleration of norm greater than `epsilon_stop`.
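The second stopping criterion can be sketched as follows; `converged` and the 2D acceleration representation are illustrative stand-ins, not the crate's actual internals:

```rust
// Sketch of the stopping test: the optimization is considered finished
// once no point has an acceleration of norm greater than epsilon_stop.
fn converged(accelerations: &[[f64; 2]], epsilon_stop: f64) -> bool {
    accelerations
        .iter()
        .all(|a| (a[0] * a[0] + a[1] * a[1]).sqrt() <= epsilon_stop)
}

fn main() {
    let acc = [[0.001, 0.0], [0.0, 0.002]];
    println!("{}", converged(&acc, 0.01));   // all norms below threshold
    println!("{}", converged(&acc, 0.0001)); // threshold too tight
}
```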
The method used here is a hybrid between Newton's method and gradient descent. In the first iterations, Newton's method is used. More precisely, each step performs the following operation:

for i in 0..positions.len() { positions[i] += delta[i] }

where `delta` is a solution to the equation `J * delta = F`, where `F` and `J` are respectively the acceleration vector and its Jacobian.
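A one-dimensional sketch of this Newton phase, where the linear solve `J * delta = F` reduces to a scalar division: the toy "acceleration" `F(x) = x*x - 2.0` and the root-finding sign convention used below are illustrative choices, and the crate's sign conventions for `F`, `J` and `delta` may differ.

```rust
// Scalar sketch of the Newton phase: compute delta from F and its
// derivative J, then update x, until |F| <= epsilon_stop.
fn newton_1d(mut x: f64, nb_iter: usize, epsilon_stop: f64) -> f64 {
    for _ in 0..nb_iter {
        let f = x * x - 2.0; // toy acceleration F(x), root at sqrt(2)
        if f.abs() <= epsilon_stop {
            break;
        }
        let j = 2.0 * x;    // Jacobian dF/dx (a scalar in 1D)
        let delta = -f / j; // standard Newton step for F(x) = 0
        x += delta;         // positions[i] += delta[i]
    }
    x
}

fn main() {
    let root = newton_1d(1.0, 20, 1e-12);
    println!("{root}"); // converges toward sqrt(2)
}
```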
If during one step of Newton's method the norm of `delta` is smaller than `gradient_switch`, `nb_gradient_steps` steps of gradient descent are performed. Each gradient step performs the following operation:

for i in 0..positions.len() { positions[i] -= rate * force[i] }

where `rate` is a parameter updated using the RMSProp heuristic. Once these gradient descent steps have been done, the following steps are Newton's method steps again.
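One gradient step with an RMSProp-style rate can be sketched as below, using the standard textbook form of RMSProp (a running mean of squared forces scales a base learning rate). The documentation only states that `rate` follows the RMSProp heuristic, so `base_rate`, `decay` and `eps` here are illustrative choices, not the crate's actual values.

```rust
// Sketch of one RMSProp-scaled gradient descent step.
fn rmsprop_step(
    positions: &mut [f64],
    forces: &[f64],
    cache: &mut [f64], // running mean of squared forces, one per coordinate
    base_rate: f64,
    decay: f64,
    eps: f64,
) {
    for i in 0..positions.len() {
        cache[i] = decay * cache[i] + (1.0 - decay) * forces[i] * forces[i];
        let rate = base_rate / (cache[i].sqrt() + eps);
        positions[i] -= rate * forces[i]; // same shape as the step above
    }
}

fn main() {
    let mut pos = [1.0, -2.0];
    let mut cache = [0.0, 0.0];
    rmsprop_step(&mut pos, &[0.5, -0.5], &mut cache, 0.01, 0.9, 1e-8);
    println!("{pos:?}"); // each coordinate moves against its force
}
```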
Every `snapshot_steps` steps, the current value of `positions` is pushed into a `Vec` that is returned by this function. If `snapshot_steps` is set to 0, snapshots are never taken and an empty vector is returned.
A value of about 5 is recommended for `nb_gradient_steps`.