Sum Operations¶
LibSPN-Keras offers a convenient way to control the backward pass for sum operations used in the SPNs you build.
Internally, LibSPN-Keras defines several sum operations with different backward passes, for gradient as well as
EM learning. All of these operations inherit from SumOpBase.
NOTE
By default, LibSPN-Keras uses SumOpGradBackprop.
Getting And Setting a Sum Op¶
These methods allow for setting and getting the current default SumOpBase. By setting a default, all sum layers
(DenseSum, Conv2DSum, Local2DSum and RootSum) will use that sum op, unless you explicitly provide a SumOpBase
instance to any of those classes when initializing them.
libspn_keras.set_default_sum_op(op)¶

Set the default sum op to conveniently use it throughout an SPN architecture.

- Parameters
  op (SumOpBase) – Implementation of the sum op with corresponding backward pass definitions.
- Return type
  None
libspn_keras.get_default_sum_op()¶

Obtain the default sum op.

- Return type
  SumOpBase
- Returns
  The default sum op.
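The getter/setter pair above follows a simple module-level registry pattern. As a minimal sketch of that mechanism (toy placeholder classes, not the library's actual implementation):

```python
# Toy sketch of the module-level default registry behind
# set_default_sum_op / get_default_sum_op (placeholder classes,
# not the library's actual code).

class SumOpBase:
    """Base class for sum-op backward-pass implementations."""

class SumOpGradBackprop(SumOpBase):
    """Plain-gradient backprop; the documented default."""

class SumOpHardEMBackprop(SumOpBase):
    """Hard-EM backprop."""

# Module-level default, as the NOTE above describes.
_default_sum_op = SumOpGradBackprop()

def set_default_sum_op(op):
    """Set the default sum op; sum layers pick it up unless overridden."""
    global _default_sum_op
    _default_sum_op = op

def get_default_sum_op():
    """Obtain the current default sum op."""
    return _default_sum_op

# Switch the whole architecture to hard EM backprop:
set_default_sum_op(SumOpHardEMBackprop())
assert isinstance(get_default_sum_op(), SumOpHardEMBackprop)
```

In the real library you would call `libspn_keras.set_default_sum_op(...)` once, before constructing your sum layers.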
Sum Operation With Gradients In Backward Pass¶
class libspn_keras.SumOpGradBackprop(logspace_accumulators=None)¶

Sum op primitive with gradient in backpropagation when computed through TensorFlow’s autograd engine.

Internally, weighted sums are computed with default gradients for all ops being used.

- Parameters
  logspace_accumulators (Optional[bool]) – If provided, overrides the default log-space choice. For a
  SumOpGradBackprop the default is True.
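The log-space choice controlled by `logspace_accumulators` can be illustrated with a small NumPy sketch (toy code using the standard logsumexp trick, not the library's implementation): keeping weights as log-space accumulators lets the weighted sum of child probabilities be computed stably without leaving log space.

```python
import numpy as np

# Toy sketch: a weighted sum of child probabilities computed entirely in
# log space, which is what log-space accumulators make numerically stable.

def weighted_logsum(child_logprobs, log_weights):
    # log( sum_i w_i * p_i ) via the logsumexp trick
    x = child_logprobs + log_weights
    m = x.max()
    return m + np.log(np.exp(x - m).sum())

child_logprobs = np.log(np.array([0.2, 0.5, 0.3]))
log_weights = np.log(np.array([0.1, 0.6, 0.3]))

out = weighted_logsum(child_logprobs, log_weights)
# matches the linear-space computation log(0.2*0.1 + 0.5*0.6 + 0.3*0.3)
assert np.isclose(out, np.log(0.2 * 0.1 + 0.5 * 0.6 + 0.3 * 0.3))
```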
Sum Operations With EM Signals In Backward Pass¶
class libspn_keras.SumOpEMBackprop¶

Sum op primitive with EM signals in backpropagation.

These are dense EM signals, as opposed to the other EM-based instances of SumOpBase.
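As a rough illustration of what "dense" means here (a sketch under the standard EM-for-mixtures interpretation, not the library's code): every child receives a backward signal proportional to its responsibility, i.e. its weighted share of the sum, rather than a one-hot selection as in the hard EM variants below.

```python
import numpy as np

# Toy sketch: dense EM signals as responsibilities of *all* children
# (a softmax over weighted child log probabilities), in contrast to the
# one-hot signals of the hard EM ops.

child_logprobs = np.array([-1.0, -0.5, -2.0])
log_weights = np.log(np.array([0.7, 0.2, 0.1]))

x = child_logprobs + log_weights
responsibilities = np.exp(x - x.max())
responsibilities /= responsibilities.sum()

assert np.isclose(responsibilities.sum(), 1.0)
assert np.all(responsibilities > 0)  # every child gets a nonzero signal
```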
class libspn_keras.SumOpHardEMBackprop(sample_prob=None)¶

Sum op with hard EM signals in backpropagation when computed through TensorFlow’s autograd engine.

- Parameters
  sample_prob (Union[float, Tensor, None]) – Sampling probability in the range [0, 1]. Sampling logits are
  taken from the normalized log probability of the children of each sum.
class libspn_keras.SumOpUnweightedHardEMBackprop(sample_prob=None)¶

Sum op with hard EM signals in backpropagation when computed through TensorFlow’s autograd engine.

Instead of using the weighted sum inputs to select the maximum child, it relies on the unweighted child inputs, which has the advantage of alleviating a self-amplifying chain of hard EM signals in deep SPNs.

- Parameters
  sample_prob (Union[float, Tensor, None]) – Sampling probability in the range [0, 1]. Sampling logits are
  taken from the normalized log probability of the children of each sum.
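The difference between the two hard EM variants can be sketched as follows (toy NumPy code, not the library's implementation): SumOpHardEMBackprop selects the maximum *weighted* child, while SumOpUnweightedHardEMBackprop drops the weights when selecting, so a child with a large learned weight cannot keep reinforcing itself.

```python
import numpy as np

# Toy sketch: hard EM routes the backward signal to a single child.
# The weighted variant picks argmax over (log child prob + log weight);
# the unweighted variant picks argmax over the raw child log probs.

child_logprobs = np.array([-1.0, -0.5, -2.0])
log_weights = np.log(np.array([0.7, 0.2, 0.1]))

weighted_winner = np.argmax(child_logprobs + log_weights)
unweighted_winner = np.argmax(child_logprobs)

# Here the heavily weighted child 0 wins the weighted selection, while
# the unweighted selection picks the intrinsically most likely child 1.
assert weighted_winner == 0 and unweighted_winner == 1

# The hard EM signal is a one-hot vector over the children:
signal = np.eye(3)[unweighted_winner]
assert signal.tolist() == [0.0, 1.0, 0.0]
```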
Sum Operations With Sample Signals In Backward Pass¶
class libspn_keras.SumOpSampleBackprop¶

Sum op with sample signals in backpropagation when computed through TensorFlow’s autograd engine.

- Parameters
  sample_prob – Sampling probability in the range [0, 1]. Sampling logits are taken from the normalized log probability of the children of each sum.
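Where the docstring says the sampling logits come from the normalized log probability of the children, the selection step can be sketched like this (toy code, not the library's implementation):

```python
import numpy as np

# Toy sketch: sample a child index of a sum node from the categorical
# distribution given by its normalized child log probabilities.

rng = np.random.default_rng(0)
child_logprobs = np.array([-1.0, -0.5, -2.0])

# Normalize in a numerically stable way to get a categorical distribution.
probs = np.exp(child_logprobs - child_logprobs.max())
probs /= probs.sum()

sampled_child = rng.choice(len(probs), p=probs)
assert 0 <= sampled_child < len(probs)
```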