# compute 0.2.2

A crate for statistical computing.
## Existing features

- regression methods
  - polynomial
  - GLMs: logistic, (quasi-)Poisson, Gamma, exponential
- optimization methods
  - numerical differentiation, partial derivatives, automatic differentiation (currently via the autodiff crate)
  - optimizers
    - Adam, Levenberg-Marquardt, SGD with (Nesterov) momentum
- numerical integration of functions
  - trapezoid, Romberg, 5-point Gauss-Legendre quadrature (trapezoid rule sketched below)
- basic statistical distributions
  - continuous
    - (Multivariate) Normal, Beta, Gamma, Chi Squared, Student's T, Uniform, Exponential, Pareto
  - discrete
    - Bernoulli, Binomial, Poisson, Discrete Uniform
  - sampling, PDFs/PMFs
  - analytic means and variances
- mathematical and statistical functions
  - gamma, digamma, beta
  - logistic, logit, (general) Box-Cox transform, softmax (logistic and softmax sketched below)
  - binomial coefficients
- statistical methods
  - (sample) covariance, mean, variance, min, max
- time series models
  - autoregressive models
  - related functions
    - autocorrelation, autocovariance, differencing (see the sketch after this list)
- validation methods
  - resampling
    - bootstrap, jackknife (bootstrap sketched below)
- linear algebra: both BLAS/LAPACK-backed and pure-Rust implementations
  - vector and matrix structs
    - overloaded arithmetic operations for combinations of {matrix, vector, scalar} with automatic broadcasting à la NumPy
  - general utilities
    - dot product, (blocked) matrix multiplication, matrix inversion, Toeplitz matrix, Vandermonde matrix, (infinity) norm, linear solve, transpose, design matrix
    - vector-vector, scalar-vector, vector-scalar operations with loop unrolling
  - decompositions and solvers
    - LU, Cholesky
- signal processing
  - convolutions
  - filters
    - Savitzky-Golay (LOESS)
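
Below are a few free-standing Rust sketches of techniques named in the list above. They are illustrative only and do not use or document this crate's API; all function names and signatures are assumptions made for the examples.

A minimal sketch of the trapezoid rule from the numerical integration item:

```rust
/// Approximate the integral of `f` over [a, b] with `n` trapezoids.
/// Hypothetical helper for illustration; not this crate's API.
fn trapezoid<F: Fn(f64) -> f64>(f: F, a: f64, b: f64, n: usize) -> f64 {
    let h = (b - a) / n as f64;
    let mut sum = 0.5 * (f(a) + f(b));
    for i in 1..n {
        sum += f(a + i as f64 * h);
    }
    sum * h
}

fn main() {
    // ∫ x² dx over [0, 1] is exactly 1/3
    let approx = trapezoid(|x| x * x, 0.0, 1.0, 1_000);
    println!("{approx:.6}"); // ≈ 0.333333
}
```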
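
A sketch of the logistic and softmax transforms from the mathematical functions item, using the usual max-subtraction trick for numerical stability (names and signatures are illustrative, not the crate's):

```rust
/// Logistic (sigmoid) function.
fn logistic(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

/// Numerically stable softmax: subtract the maximum before exponentiating.
fn softmax(x: &[f64]) -> Vec<f64> {
    let max = x.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    let exps: Vec<f64> = x.iter().map(|&v| (v - max).exp()).collect();
    let sum: f64 = exps.iter().sum();
    exps.iter().map(|&e| e / sum).collect()
}

fn main() {
    println!("{}", logistic(0.0));               // 0.5
    println!("{:?}", softmax(&[1.0, 2.0, 3.0])); // components sum to 1
}
```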
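
A sketch of sample autocovariance and autocorrelation from the time series item (again a free-standing illustration, not the crate's API):

```rust
/// Sample autocovariance at lag `k`, normalised by `n` as is conventional for ACF estimates.
fn autocovariance(x: &[f64], k: usize) -> f64 {
    let n = x.len();
    let mean = x.iter().sum::<f64>() / n as f64;
    (0..n - k).map(|t| (x[t] - mean) * (x[t + k] - mean)).sum::<f64>() / n as f64
}

/// Sample autocorrelation at lag `k`: autocovariance scaled by the lag-0 value.
fn autocorrelation(x: &[f64], k: usize) -> f64 {
    autocovariance(x, k) / autocovariance(x, 0)
}

fn main() {
    let x = [1.0, 2.0, 3.0, 2.0, 1.0, 2.0, 3.0, 2.0];
    println!("acf(1) = {:.3}", autocorrelation(&x, 1));
}
```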
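
A sketch of the bootstrap from the resampling item, estimating the standard error of the sample mean; it assumes the external `rand` crate (0.8-style API) purely for drawing resample indices:

```rust
use rand::Rng; // external `rand` crate (0.8-style API), used only for index sampling

/// Bootstrap estimate of the standard error of the sample mean.
/// Hypothetical helper for illustration; not this crate's API.
fn bootstrap_se_of_mean(data: &[f64], n_resamples: usize) -> f64 {
    let mut rng = rand::thread_rng();
    let n = data.len();
    // mean of each resample drawn with replacement
    let means: Vec<f64> = (0..n_resamples)
        .map(|_| (0..n).map(|_| data[rng.gen_range(0..n)]).sum::<f64>() / n as f64)
        .collect();
    let grand_mean = means.iter().sum::<f64>() / n_resamples as f64;
    let var = means
        .iter()
        .map(|m| (m - grand_mean).powi(2))
        .sum::<f64>()
        / (n_resamples - 1) as f64;
    var.sqrt()
}

fn main() {
    let data = [2.1, 3.4, 1.9, 5.0, 4.2, 3.3];
    println!("bootstrap SE ≈ {:.3}", bootstrap_se_of_mean(&data, 1_000));
}
```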

## Planned features

- distributions: CDFs, fitting to data
- more time series models (SARIMA, exponential smoothing models, trend decomposition)
- non-linear optimizers (BFGS)
- ODE integrators (leapfrog, RK4)
- clustering algorithms (k-means/EM, DBSCAN)
- more regression models (mixed models, GP, penalized models, splines)
- prediction trees (CART, random forests, gradient boosted trees)
- order statistics (quantiles)
- statistical tests (t-test, ANOVA, Kolmogorov-Smirnov, Anderson-Darling)
- data preprocessing (outlier detection, standardization, dimensionality reduction (PCA))
- more linear algebra decompositions (QR, SVD)
- samplers (possibly rejection sampling, RWM, HMC, NUTS, and (dynamic) nested sampling)