Releases: pyro-ppl/pyro
0.2.1
- `@poutine.broadcast` is a new effect handler that allows sample site shapes to be automatically broadcast based on their enclosing `iarange`s. This makes it very easy to experiment with different models by moving sample sites in and out of `iarange`s without any manual `.expand()` changes. See the tensor shapes tutorial for details.
- `pyro.optim.PyroLRScheduler` makes it easy to use PyTorch learning rate schedulers in Pyro.
- `pyro.contrib.autoguide` now supports custom name prefixes and has more thorough error messages for name collisions. This makes it easier to combine multiple autoguide strategies.
- `pyro.ops.newton.newton_step_2d` is a fast differentiable optimizer for batched 2-dimensional loss functions that are themselves twice differentiable.
- `pyro.contrib.gp.kernels.Coregionalize` and `pyro.contrib.autoguide.AutoLowRankMultivariateNormal` both model multivariate data with low-rank-plus-diagonal covariance.
- `TorchDistribution.expand()` is more flexible and more PyTorch-idiomatic than the older `TorchDistribution.expand_by()`.
- Miscellaneous bugfixes
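The `expand()` / `expand_by()` distinction can be seen with plain `torch.distributions` (a sketch using PyTorch directly; `expand_by` is Pyro-only): `expand()` takes the full target batch shape, while `expand_by()` prepends new dimensions to the existing one.

```python
import torch
from torch.distributions import Bernoulli

# expand() takes the *full* target batch shape, PyTorch-style.
d = Bernoulli(torch.tensor(0.5))
print(d.batch_shape)      # torch.Size([])
d10 = d.expand([10])
print(d10.batch_shape)    # torch.Size([10])

# Pyro's older expand_by(shape) instead *prepends* shape to the
# existing batch shape; d10.expand_by([3]) is equivalent to:
d2 = d10.expand([3, 10])
print(d2.batch_shape)     # torch.Size([3, 10])
```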
0.2.0
Support for PyTorch 0.4
Pyro 0.2 supports PyTorch 0.4. See the PyTorch release notes for comprehensive changes. The most important change is that `Variable` and `Tensor` have been merged, so you can now simplify
- pyro.param("my_param", Variable(torch.ones(1), requires_grad=True))
+ pyro.param("my_param", torch.ones(1))
PyTorch distributions
PyTorch's torch.distributions library is now Pyro’s main source for distribution implementations. The Pyro team helped create this library by collaborating with Adam Paszke, Alican Bozkurt, Vishwak Srinivasan, Rachit Singh, Brooks Paige, Jan-Willem Van De Meent, and many other contributors and reviewers. See the Pyro wrapper docs for wrapped PyTorch distributions and the Pyro distribution docs for Pyro-specific distributions.
Constrained parameters
Parameters can now be constrained easily using notation like
from torch.distributions import constraints
pyro.param("sigma", torch.ones(10), constraint=constraints.positive)
See the torch.distributions.constraints library and all of our Pyro tutorials for example usage.
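Constraint objects can also be used standalone; for instance, each provides a `check()` method that tests values elementwise (a small sketch with plain `torch.distributions.constraints`):

```python
import torch
from torch.distributions import constraints

# Each constraint object can test values elementwise via .check().
values = torch.tensor([1.0, -2.0, 0.5])
ok = constraints.positive.check(values)
print([bool(b) for b in ok])  # [True, False, True]
```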
Arbitrary tensor shapes
Arbitrary tensor shapes and batching are now supported in Pyro. This includes support for nested batching via `iarange` and support for batched multivariate distributions. The `iarange` context and `irange` generator are now much more flexible and can be combined freely. With power comes complexity, so check out our tensor shapes tutorial (hint: you'll need to use `.expand_by()` and `.independent()`).
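Pyro's `.independent()` parallels `torch.distributions.Independent`, which reinterprets trailing batch dimensions as event dimensions; a sketch of the batch-shape/event-shape distinction using plain PyTorch:

```python
import torch
from torch.distributions import Independent, Normal

# batch_shape (3, 4): twelve independent scalar Normals.
d = Normal(torch.zeros(3, 4), torch.ones(3, 4))
print(d.batch_shape, d.event_shape)    # torch.Size([3, 4]) torch.Size([])

# Reinterpret the rightmost batch dim as an event dim: a batch of
# three 4-dimensional distributions (Pyro spells this .independent(1)).
d2 = Independent(d, 1)
print(d2.batch_shape, d2.event_shape)  # torch.Size([3]) torch.Size([4])
print(d2.log_prob(torch.zeros(3, 4)).shape)  # torch.Size([3])
```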
Parallel enumeration
Discrete enumeration can now be parallelized. This makes it especially easy and cheap to enumerate out discrete latent variables. Check out the Gaussian Mixture Model tutorial for example usage. To use parallel enumeration, you'll need to first configure sites, then use the `TraceEnum_ELBO` loss:
def model(...):
    ...

@config_enumerate(default="parallel")  # configures sites
def guide(...):
    with pyro.iarange("foo", 10):
        x = pyro.sample("x", dist.Bernoulli(0.5).expand_by([10]))
        ...

svi = SVI(model, guide, Adam({}),
          loss=TraceEnum_ELBO(max_iarange_nesting=1))  # specify loss
svi.step()
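Under the hood, enumeration replaces a Monte Carlo sample of a discrete site with an exact weighted sum over its support. The identity being exploited (a sketch of the idea, not Pyro's `TraceEnum_ELBO` internals):

```python
import torch
from torch.distributions import Bernoulli

# E_q[f(x)] computed exactly by enumerating x over its support.
q = Bernoulli(torch.tensor(0.3))
support = torch.tensor([0.0, 1.0])   # values a Bernoulli can take
weights = q.log_prob(support).exp()  # q(x) = [0.7, 0.3]
f = lambda x: (x - 2.0) ** 2         # arbitrary integrand
exact = (weights * f(support)).sum()
print(round(exact.item(), 2))        # 0.7 * 4.0 + 0.3 * 1.0 = 3.1
```

Because the sum is exact, the gradient estimate for the enumerated site has zero variance, which is why enumeration is cheap and stable for small discrete supports.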
Markov chain Monte Carlo via HMC and NUTS
This release adds experimental support for gradient-based Markov chain Monte Carlo inference via Hamiltonian Monte Carlo (`pyro.infer.HMC`) and the No-U-Turn Sampler (`pyro.infer.NUTS`). See the docs and example for details.
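HMC proposes new states by simulating Hamiltonian dynamics with the leapfrog integrator, which nearly conserves the Hamiltonian and therefore yields high acceptance rates. A minimal pure-Python sketch of one leapfrog trajectory (an illustration of the algorithm, not Pyro's implementation):

```python
def leapfrog(q, p, grad_U, step, n_steps):
    """Integrate Hamiltonian dynamics: position q, momentum p,
    potential gradient grad_U, for n_steps of size step."""
    p = p - 0.5 * step * grad_U(q)      # half step for momentum
    for _ in range(n_steps - 1):
        q = q + step * p                # full step for position
        p = p - step * grad_U(q)        # full step for momentum
    q = q + step * p                    # final position step
    p = p - 0.5 * step * grad_U(q)      # final half momentum step
    return q, p

# Standard normal target: U(q) = q^2 / 2, so grad_U(q) = q.
q, p = 1.0, 0.5
H0 = q ** 2 / 2 + p ** 2 / 2
q2, p2 = leapfrog(q, p, lambda x: x, step=0.1, n_steps=50)
H1 = q2 ** 2 / 2 + p2 ** 2 / 2
print(abs(H1 - H0) < 1e-2)  # True: leapfrog approximately conserves energy
```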
Gaussian Processes
A new Gaussian Process module pyro.contrib.gp provides a framework for learning with Gaussian Processes. To get started, take a look at our Gaussian Process Tutorial. Thanks to Du Phan for this extensive contribution!
Automatic guide generation
Guides can now be created automatically with the pyro.contrib.autoguide library. These work only for models with simple structure (no `irange` or `iarange`), and are easy to use:
from pyro.contrib.autoguide import AutoDiagonalNormal

def model(...):
    ...

guide = AutoDiagonalNormal(model)
svi = SVI(model, guide, ...)
Validation
Model validation is now available via three toggles:
pyro.enable_validation()                # Turns on all validation.
pyro.infer.enable_validation()          # Turns on validation for inference.
pyro.distributions.enable_validation()  # Turns on validation for PyTorch distributions.
These can also be used temporarily as context managers:
# Run with validation in the first step.
with pyro.validation_enabled(True):
    svi.step()

# Avoid validation on subsequent steps (may miss NaN errors).
with pyro.validation_enabled(False):
    for i in range(1000):
        svi.step()
Rejection sampling variational inference (RSVI)
We've added support for vectorized rejection sampling in a new `Rejector` distribution. See the docs or the `RejectionStandardGamma` class for example usage.
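Pyro's `Rejector` vectorizes this scheme over tensors; the underlying rejection-sampling idea, in a minimal scalar pure-Python sketch (the helper name `rejection_sample` is illustrative, not Pyro API):

```python
import random

def rejection_sample(n, proposal, accept_prob):
    """Draw n samples by proposing from `proposal` and accepting
    each draw x with probability accept_prob(x)."""
    samples = []
    while len(samples) < n:
        x = proposal()
        if random.random() < accept_prob(x):
            samples.append(x)
    return samples

random.seed(0)
# Target: triangular density p(x) = 2x on [0, 1]; proposal: Uniform(0, 1).
# Envelope constant M = 2, so accept with probability p(x) / (M * q(x)) = x.
draws = rejection_sample(10000, random.random, lambda x: x)
print(sum(draws) / len(draws))  # close to the target mean E[x] = 2/3
```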
0.1.2
0.1.1
Initial public release
0.1.0
Bump to version 0.1.0 (#481)