- examples now run on CPU by default
- added brainstorm.tools.shuffle_data and brainstorm.tools.split to help with data preparation (see the data-preparation sketch after this list)
- SigmoidCE and SquaredDifference layers now output a loss for each dimension instead of summing over features
- SquaredDifference layer no longer scales by one half (see the loss sketch after this list)
- Added a SquaredLoss layer that computes half the squared difference and has an interface that is compatible with the SigmoidCE and SoftmaxCE layers
- Output probabilities renamed to predictions in SigmoidCE and SoftmaxCE layers
- added a use_conv option to brainstorm.tools.create_net_from_spec
- added a criterion option to the brainstorm.hooks.EarlyStopper hook
- added a brainstorm.tools.get_network_info function that returns information about the network as a string
- added a brainstorm.tools.extract function that applies a network to some data and saves a set of requested buffers
- the brainstorm.layers.mask layer now supports masking individual features
- added a brainstorm.hooks.StopAfterThresholdReached hook
- EarlyStopper now works for any timescale and interval
- Recurrent, Lstm, Clockwork, and ClockworkLstm layers now accept inputs of arbitrary shape by implicitly flattening them (see the flattening sketch after this list).
- several fixes to make building the docs easier
- some performance improvements of NumpyHandler operations binarize_t and index_m_by_v
- sped up tests
- several improvements to installation scripts
- fixed the sqrt operation for PyCudaHandler; this should fix problems with BatchNormalization on GPU
- fixed a bug for task_type='regression' in brainstorm.tools.get_in_out_layers and brainstorm.tools.create_net_from_spec
- removed defunct name argument from input layer
- fixed a crash when applying brainstorm.hooks.SaveBestNetwork to the rolling_training loss
- various minor fixes to the brainstorm.hooks.BokehVisualizer
- fixed a problem with the sum_t operation in brainstorm.handlers.PyCudaHandler
- fixed a blocksize problem in convolutional and pooling operations in brainstorm.handlers.PyCudaHandler
- First release on PyPI.
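
The data-preparation sketch below illustrates the kind of work that brainstorm.tools.shuffle_data and brainstorm.tools.split are meant to help with. It is written in plain NumPy as a minimal illustration and does not reproduce the actual signatures of those helpers; the (time, batch, features) layout and the 80/20 split are assumptions, so consult the brainstorm documentation for the real arguments.

```python
import numpy as np

# Toy time-major data: 10 time steps, 100 sequences, 4 input features,
# 1 target feature. (The (time, batch, features) layout is assumed here.)
inputs = np.random.randn(10, 100, 4)
targets = np.random.randn(10, 100, 1)

# Shuffle along the batch (sequence) axis with a fixed seed,
# roughly the job of a shuffle_data helper.
rng = np.random.RandomState(42)
order = rng.permutation(inputs.shape[1])
inputs, targets = inputs[:, order], targets[:, order]

# Split off the last 20% of sequences for validation,
# roughly the job of a split helper.
cut = int(0.8 * inputs.shape[1])
train_inputs, valid_inputs = inputs[:, :cut], inputs[:, cut:]
train_targets, valid_targets = targets[:, :cut], targets[:, cut:]
```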
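The loss sketch makes the change in loss shapes concrete. It is a small NumPy illustration, not brainstorm code: a per-dimension loss versus a loss summed over features, and the one-half scaling that now lives in the SquaredLoss layer rather than in SquaredDifference.

```python
import numpy as np

outputs = np.array([0.2, 0.9, 0.4, 0.7])
targets = np.array([0.0, 1.0, 0.0, 1.0])

# SquaredDifference now yields one loss value per dimension
# and no longer scales by one half:
squared_difference = (outputs - targets) ** 2    # shape (4,)

# The new SquaredLoss layer computes half the squared difference:
squared_loss = 0.5 * (outputs - targets) ** 2    # shape (4,)

# The previous behaviour summed over features instead of keeping
# one value per dimension (illustrative, with the one-half scaling):
old_scalar_loss = np.sum(0.5 * (outputs - targets) ** 2)
```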
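The flattening sketch shows what the implicit flattening in the recurrent layers amounts to: collapsing all trailing feature dimensions into a single one. This is a rough NumPy sketch, assuming the usual time-major (time, batch, features...) data layout.

```python
import numpy as np

# Image-shaped features fed to a recurrent layer:
# 10 time steps, 32 sequences, 8x8 feature maps with 3 channels.
x = np.random.randn(10, 32, 8, 8, 3)

# Inputs of arbitrary shape are accepted by implicitly flattening
# everything past the time and batch axes:
t, b = x.shape[:2]
x_flat = x.reshape(t, b, -1)   # shape (10, 32, 192)
```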