# program-induction

A library for program induction and learning representations.

Implements Bayesian program learning and genetic programming. See the docs for more information.
## Installation
Install rust and ensure you're up to date (`rustup update`).
In a new or existing project, add the following to your `Cargo.toml`:
```toml
[dependencies]
programinduction = "0.6"
# many examples also depend on polytype for the tp! and ptp! macros:
polytype = "4.2"
```
The documentation requires a custom HTML header to include KaTeX for math
support. This isn't supported by `cargo doc`, so the documentation must be
built by passing the header to `rustdoc` directly (e.g. via `cargo rustdoc`).
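A sketch of the invocation, assuming the KaTeX header file is named `rustdoc-include-katex-header.html` (check the repository for the actual filename):

```shell
cargo rustdoc -- --html-in-header rustdoc-include-katex-header.html
```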
## Usage
Specify a probabilistic context-free grammar (PCFG; see `pcfg::Grammar`) and
induce a sentence that matches an example:
```rust
extern crate polytype;
extern crate programinduction;

use programinduction::pcfg::Grammar;
use programinduction::{ECParams, EC};
```
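The body of this example was lost in extraction. Below is a hedged reconstruction based on the crate's documented `pcfg` and `EC` APIs; the exact names (`task_by_evaluation`, the `ECParams` fields, `best_solution`) are assumptions that should be verified against the 0.6 documentation:

```rust
#[macro_use]
extern crate polytype;
extern crate programinduction;

use programinduction::pcfg::{task_by_evaluation, Grammar, Rule};
use programinduction::{ECParams, EC};

// Evaluate a primitive given its already-evaluated arguments.
fn evaluate(name: &str, inps: &[i32]) -> Result<i32, ()> {
    match name {
        "0" => Ok(0),
        "1" => Ok(1),
        "plus" => Ok(inps[0] + inps[1]),
        _ => unreachable!(),
    }
}

fn main() {
    // A PCFG over simple arithmetic expressions.
    let g = Grammar::new(
        tp!(EXPR),
        vec![
            Rule::new("0", tp!(EXPR), 1.0),
            Rule::new("1", tp!(EXPR), 1.0),
            Rule::new("plus", tp!(@arrow[tp!(EXPR), tp!(EXPR), tp!(EXPR)]), 1.0),
        ],
    );
    // Task: induce a sentence that evaluates to 4.
    let task = task_by_evaluation(&evaluate, &4, tp!(EXPR));

    let ec_params = ECParams {
        frontier_limit: 1,
        search_limit_timeout: None,
        search_limit_description_length: Some(8.0),
    };
    let frontiers = g.explore(&ec_params, &[task]);
    let solution = &frontiers[0].best_solution().unwrap().0;
    println!("{}", g.display(solution));
}
```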
The Exploration-Compression (EC) algorithm iteratively learns a better
representation by finding common structure in induced programs. We can run
the EC algorithm with a polymorphically-typed lambda calculus representation,
`lambda::Language`, in a Boolean circuit domain:
```rust
extern crate polytype;
extern crate programinduction;

use programinduction::domains::circuits;
use programinduction::{lambda, ECParams, EC};
```
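The body of this example was likewise stripped. Here is a sketch of an EC loop in the circuits domain, assuming the `circuits::dsl`/`circuits::make_tasks` helpers and `lambda::CompressionParams` exist as in the crate's documentation (verify names and signatures against the 0.6 docs):

```rust
extern crate programinduction;

use std::time::Duration;

use programinduction::domains::circuits;
use programinduction::{lambda, ECParams, EC};

fn main() {
    // Initial DSL (primitive gates) and a batch of circuit tasks.
    let mut dsl = circuits::dsl();
    let tasks = circuits::make_tasks(250);

    let ec_params = ECParams {
        frontier_limit: 100,
        search_limit_timeout: Some(Duration::new(5, 0)),
        search_limit_description_length: None,
    };
    let params = lambda::CompressionParams::default();

    // Each EC iteration explores programs for the tasks, then compresses
    // common structure into new DSL primitives.
    for _ in 0..5 {
        let (new_dsl, _frontiers) = dsl.ec(&ec_params, &params, &tasks);
        dsl = new_dsl;
    }
}
```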
You may have noted the above use of `domains::circuits`. Some domains are
already implemented for you. Currently, this only consists of circuits and
strings.
## TODO

(you could be the one who does one of these!)
- First-class function evaluation within Rust (and remove lisp interpreters).
- Add task generation function in `domains::strings`.
- Fallible evaluation (e.g. see how `domains::strings` handles `slice`).
- Lazy evaluation.
- `impl GP for pcfg::Grammar` is not yet complete.
- Consolidate lazy/non-lazy evaluation (for ergonomics).
- Permit non-`&'static str`-named `Type`/`TypeSchema`.
- Ability to include recursive primitives in `lambda` representation.
- Faster lambda calculus evaluation (less cloning; bubble up whether beta reduction happened rather than ultimate equality comparison).
- PCFG compression is currently only estimating parameters, not actually learning pieces of programs. An adaptor grammar approach seems like a good direction to go, perhaps minus the Bayesian non-parametrics.
- Add more learning traits (like `EC` or `GP`).
- Add more representations.
- Add more domains.