hxtorch.spiking.modules.HXModuleWrapper

class hxtorch.spiking.modules.HXModuleWrapper(experiment: Experiment, **modules: List[HXModule])

Bases: hxtorch.spiking.modules.hx_module.HXFunctionalModule

Class to wrap HXModules

__init__(experiment: Experiment, **modules: List[HXModule]) → None

A module which wraps a number of HXModules, given in modules, for which a single PyTorch-differentiable member function forward_func is defined. For instance, this allows wrapping a Synapse and a Neuron to describe recurrence.

Parameters
  • experiment – The experiment to register this wrapper in.

  • modules – A list of modules to be represented by this wrapper.
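The construction pattern can be illustrated with a framework-independent sketch. Note this is a minimal stand-in, not the actual hxtorch implementation: the Experiment, module, and wrapper classes below are hypothetical placeholders that only mimic the registration and keyword-ordering behavior described above.

```python
# Stand-in sketch of the HXModuleWrapper construction pattern.
# All class names here are illustrative, not the real hxtorch API.

class ExperimentStandIn:
    """Stand-in experiment that wrappers register themselves with."""
    def __init__(self):
        self.wrappers = []

class HXModuleStandIn:
    """Stand-in for an HXModule."""
    def __init__(self, name):
        self.name = name

class WrapperStandIn:
    """Wraps several modules under one wrapper; the keyword order of
    `modules` defines the handle order later expected by forward_func."""
    def __init__(self, experiment, **modules):
        self.modules = modules            # kwargs preserve insertion order
        experiment.wrappers.append(self)  # register in the experiment

exp = ExperimentStandIn()
syn = HXModuleStandIn("synapse")
nrn = HXModuleStandIn("neuron")
# e.g. wrap a synapse and a neuron to later describe recurrence
wrapper = WrapperStandIn(exp, synapse=syn, neuron=nrn)
print(list(wrapper.modules))  # ['synapse', 'neuron']
```

The keyword names double as stable identifiers: the order in which modules are passed fixes the order of input and output handles for the wrapper's forward_func.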

Methods

__init__(experiment, **modules)

A module which wraps a number of HXModules, given in modules, for which a single PyTorch-differentiable member function forward_func is defined.

contains(modules)

Checks whether a list of modules modules is registered in the wrapper.

exec_forward(input, output, hw_map)

Execute the forward function of the wrapper. This method assigns each output handle in output its corresponding PyTorch tensor and adds the wrapper's forward_func to the PyTorch graph.

extra_repr()

Add additional information to the module's string representation.

forward()

Forward method registering the layer operation in the given experiment.

forward_func(input[, hw_data])

This function describes the unified functionality of all modules assigned to this wrapper. As for HXModules, this needs to be a PyTorch-differentiable function defined by PyTorch operations. The input and output of this member function are wrapped by (tuples of) Handles. The signature of this function is expected as:
  • Input: All input handles required for each module in modules, as positional arguments in the order given by modules.
  • Output: A tuple of handles, each corresponding to the output of one module in modules. The order is given by modules.
  • Additionally, hardware data can be accessed via a hw_data keyword argument, to which the hardware data is supplied as a tuple holding the hardware data for each module.

update(**modules)

Update the modules and the function in the wrapper.

Attributes

contains(modules: Union[hxtorch.spiking.modules.hx_module.HXModule, List[hxtorch.spiking.modules.hx_module.HXModule]]) → bool

Checks whether a list of modules modules is registered in the wrapper.

Parameters
  • modules – The modules for which to check if they are registered.

Returns
  Returns a bool indicating whether modules are a subset of the wrapper's modules.
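The subset semantics can be sketched in plain Python. This is a hypothetical stand-in for the check, not the hxtorch implementation; it only illustrates that a single module or a list of modules is accepted and that all given modules must be registered.

```python
# Sketch of contains(): True only if every given module is registered.
# Pure-Python stand-in, illustrative of the documented subset semantics.

def contains(registered, modules):
    """Return True if all of `modules` are among `registered`.

    `modules` may be a single module or a list of modules, mirroring
    the Union[HXModule, List[HXModule]] signature above.
    """
    if not isinstance(modules, (list, tuple)):
        modules = [modules]
    return all(m in registered for m in modules)

registered = ["synapse", "neuron"]
print(contains(registered, "neuron"))               # True
print(contains(registered, ["synapse", "neuron"]))  # True
print(contains(registered, ["synapse", "other"]))   # False
```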

exec_forward(input: Tuple[hxtorch.spiking.handle.TensorHandle], output: Tuple[hxtorch.spiking.handle.TensorHandle], hw_map: Dict[_pygrenade_vx_network.PopulationOnNetwork, Tuple[torch.Tensor]]) → None

Execute the forward function of the wrapper. This method assigns each output handle in output its corresponding PyTorch tensor and adds the wrapper's forward_func to the PyTorch graph.

Parameters
  • input – A tuple of the input handles where each handle corresponds to a certain module. The order is defined by modules. Note, a module can have multiple input handles.

  • output – A tuple of output handles, each corresponding to one module. The order is defined by modules.

  • hw_map – The hardware data map.

extra_repr() → str

Add additional information to the module's string representation.

forward()

Forward method registering the layer operation in the given experiment.

forward_func(input: hxtorch.spiking.handle.TensorHandle, hw_data: Optional[Tuple[torch.Tensor]] = None) → hxtorch.spiking.handle.TensorHandle

This function describes the unified functionality of all modules assigned to this wrapper. As for HXModules, this needs to be a PyTorch-differentiable function defined by PyTorch operations. The input and output of this member function are wrapped by (tuples of) Handles. The signature of this function is expected as:

  • Input: All input handles required for each module in modules, as positional arguments in the order given by modules.

  • Output: A tuple of handles, each corresponding to the output of one module in modules. The order is given by modules.

  • Additionally, hardware data can be accessed via a hw_data keyword argument, to which the hardware data is supplied as a tuple holding the hardware data for each module.
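The expected signature can be sketched as follows. The Handle class is a stand-in for hxtorch's TensorHandle, the two-module (synapse/neuron) layout and the pass-through body are purely illustrative assumptions, and in a real forward_func the body would consist of differentiable PyTorch operations:

```python
# Sketch of a forward_func for a wrapper holding two modules.
# Handle, the module layout, and the body are illustrative stand-ins.
from typing import Optional, Tuple

class Handle:
    """Stand-in for hxtorch.spiking.handle.TensorHandle."""
    def __init__(self, data=None):
        self.data = data

def forward_func(synapse_input: Handle,
                 neuron_input: Handle,
                 hw_data: Optional[Tuple] = None) -> Tuple[Handle, Handle]:
    # One positional input handle per wrapped module, in module order.
    if hw_data is not None:
        synapse_hw, neuron_hw = hw_data  # one entry per wrapped module
    # ...differentiable operations would go here; we just pass data through.
    synapse_out = Handle(synapse_input.data)
    neuron_out = Handle(neuron_input.data)
    # One output handle per wrapped module, again in module order.
    return synapse_out, neuron_out

outs = forward_func(Handle(1.0), Handle(2.0))
print(outs[0].data, outs[1].data)  # 1.0 2.0
```

The key contract is positional: input handles arrive, and output handles are returned, in exactly the order in which the modules were passed to the wrapper.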

training: bool

update(**modules: Dict[HXModule])

Update the modules and the function in the wrapper.

Parameters
  • modules – The new modules to assign to the wrapper.