hxtorch.spiking.utils.from_nir_data.EventData
class hxtorch.spiking.utils.from_nir_data.EventData(idx: numpy.ndarray, time: numpy.ndarray, n_neurons: int, t_max: float)

Bases: object

Event-based data represented as a list of event indices and their corresponding timestamps. Each event is discrete and carries no magnitude; it is defined solely by its occurrence at a certain time.
- idx : np.ndarray[int], shape (n_samples, n_events)
  Event indices. If there is no event, the index is -1.
- time : np.ndarray[float], shape (n_samples, n_events)
  Event times. If there is no event, the time is np.inf.
- n_neurons : int
  Total number of neurons in the layer.
- t_max : float
  Maximum time of the recording.
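Example (a minimal sketch; assumes hxtorch is importable, and the array values are purely illustrative)::

    import numpy as np
    from hxtorch.spiking.utils.from_nir_data import EventData

    # Two samples with up to three events each; missing events are padded
    # with index -1 and time np.inf, as described above.
    idx = np.array([[0, 2, -1],
                    [1, 1, 2]])
    time = np.array([[0.1, 0.5, np.inf],
                     [0.2, 0.4, 0.9]])

    events = EventData(idx=idx, time=time, n_neurons=3, t_max=1.0)

    # Read-only properties documented under "Attributes".
    print(events.n_samples)
    print(events.shape)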
__init__(idx: numpy.ndarray, time: numpy.ndarray, n_neurons: int, t_max: float) → None

  Initialize self. See help(type(self)) for accurate signature.
Methods
__init__(idx, time, n_neurons, t_max)
  Initialize self.
to_time_gridded(dt)
  Convert the event-based data to a time-gridded representation.
Attributes
idx: numpy.ndarray

n_neurons: int

property n_samples

property shape

t_max: float

time: numpy.ndarray
to_time_gridded(dt: float) → nir.data_ir.graph.TimeGriddedData

  Arguments

  - dt : float
    Time step size.
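Example (a sketch with illustrative values; the chosen step size dt=0.1 is arbitrary)::

    import numpy as np
    from hxtorch.spiking.utils.from_nir_data import EventData

    events = EventData(
        idx=np.array([[0, 2, -1]]),           # -1 marks a missing event
        time=np.array([[0.1, 0.5, np.inf]]),  # np.inf marks a missing event
        n_neurons=3,
        t_max=1.0)

    # Resample the events onto a regular grid with step size dt;
    # the return value is a nir.data_ir.graph.TimeGriddedData object.
    gridded = events.to_time_gridded(dt=0.1)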