hxtorch.spiking.modules.batch_dropout.BatchDropout
class hxtorch.spiking.modules.batch_dropout.BatchDropout(size: int, dropout: float, experiment: Experiment, func: Union[Callable, torch.autograd.Function] = <function batch_dropout>)

Bases: hxtorch.spiking.modules.hx_module.HXFunctionalModule
Batch dropout layer.

Caveat: In-place operations on TensorHandles are not supported. This module must be placed after a neuron layer, i.e. a Neuron module.
__init__(size: int, dropout: float, experiment: Experiment, func: Union[Callable, torch.autograd.Function] = <function batch_dropout>) → None

Initialize a BatchDropout layer. This layer disables spiking neurons in the preceding spiking Neuron layer with probability dropout. Note that size has to equal the size of the corresponding spiking layer. The spike mask is maintained for the whole batch.
- Parameters
    size – Size of the population this dropout layer is applied to.
    dropout – Probability that a neuron in the preceding layer gets disabled during training.
    experiment – Experiment to append the layer to.
    func – Callable function implementing the module's forward functionality, or a torch.autograd.Function implementing the module's forward and backward operation. Defaults to batch_dropout.
    execution_instance – Execution instance to place the module on.
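The following is a minimal usage sketch of how a BatchDropout layer is typically wired after a Neuron layer of the same size. Constructor details of Experiment and Neuron beyond what this page documents (e.g. the mock flag and the Neuron arguments) are assumptions; consult the hxtorch.spiking documentation.

    # Hedged sketch: only the BatchDropout signature is taken from this page.
    from hxtorch.spiking import Experiment
    from hxtorch.spiking.modules import Neuron, BatchDropout

    experiment = Experiment(mock=True)  # software-only mode, assumed flag

    size = 128
    neuron = Neuron(size, experiment=experiment)   # preceding spiking layer
    dropout = BatchDropout(size, 0.2, experiment)  # same size as the Neuron layer

    # During model definition, the dropout module is applied to the
    # NeuronHandle produced by the neuron layer (not in-place):
    # masked_handle = dropout(neuron_handle)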
Methods

__init__(size, dropout, experiment[, func])
    Initialize BatchDropout layer.

extra_repr()
    Add additional information.

set_mask()
    Creates a new random dropout mask, applied to the spiking neurons in the previous module.

Attributes

mask
    Getter for spike mask.
extra_repr() → str

Add additional information.
property mask

Getter for spike mask.

- Returns
    Returns the current spike mask.
output_type

alias of hxtorch.spiking.handle.NeuronHandle
set_mask() → None

Creates a new random dropout mask, applied to the spiking neurons in the previous module. If the module is in evaluation mode (module.eval()), dropout is disabled.

- Returns
    Returns a random boolean spike mask of size self.size.
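The mask semantics can be illustrated in plain PyTorch. This is a hedged sketch, not hxtorch's actual implementation: it draws one Bernoulli keep-mask per call and reuses it across the whole batch, matching the behavior described above; make_batch_mask is a hypothetical helper.

    import torch

    def make_batch_mask(size: int, dropout: float, training: bool) -> torch.Tensor:
        """Illustrative only: draw one boolean keep-mask of shape (size,).

        Each neuron is kept with probability 1 - dropout; in eval mode
        all neurons are kept, i.e. dropout is disabled.
        """
        if not training:
            return torch.ones(size, dtype=torch.bool)
        return torch.bernoulli(torch.full((size,), 1.0 - dropout)).bool()

    # The same mask is broadcast over batch and time dimensions,
    # e.g. spikes of shape (batch, time, size):
    mask = make_batch_mask(128, 0.2, training=True)
    spikes = torch.randint(0, 2, (4, 50, 128), dtype=torch.bool)
    dropped = spikes & mask  # one mask shared across the whole batch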