# Fundamentals of Sinabs

## Sinabs: PyTorch Extension

Sinabs is designed to work alongside PyTorch by providing additional functionality to support spiking neural network dynamics. In practice, Sinabs comprises a set of activation layers that emulate spiking neuronal dynamics. Because of the temporal nature of SNNs, several layers in Sinabs are stateful, as the sketch below illustrates.
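For illustration, here is a minimal sketch of what statefulness means in practice. It assumes `sinabs.layers.IAF` and its `reset_states()` method, as provided in recent sinabs releases; check the API against your installed version.

```python
import torch
import sinabs.layers as sl

# Integrate-and-fire activation; keeps its membrane potential between calls.
iaf = sl.IAF()

x = torch.rand(1, 10, 16)  # (batch, time, features) input raster

out1 = iaf(x)  # membrane potential now holds residual charge
out2 = iaf(x)  # same input, but carried-over state: output can differ

iaf.reset_states()  # clear the membrane potential before an unrelated sequence
out3 = iaf(x)
```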
Sinabs layers augment the activation functions in PyTorch, and are designed to work alongside the various connection layers in PyTorch such as AvgPool2d, Linear and Conv2d layers, as sketched below.
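As a sketch of how spiking activations slot between standard connection layers, the following assumes the `IAFSqueeze` variant from recent sinabs releases, which expects the time axis folded into the batch axis so that `Conv2d`, `AvgPool2d` and `Linear` see ordinary 4D/2D tensors; treat the exact class name and signature as assumptions to verify.

```python
import torch
import torch.nn as nn
import sinabs.layers as sl

batch, time = 2, 20

# Spiking activations interleaved with standard PyTorch connection layers.
snn = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1, bias=False),
    sl.IAFSqueeze(batch_size=batch),
    nn.AvgPool2d(2),
    nn.Conv2d(8, 16, kernel_size=3, padding=1, bias=False),
    sl.IAFSqueeze(batch_size=batch),
    nn.AvgPool2d(2),
    nn.Flatten(),
    nn.Linear(16 * 7 * 7, 10, bias=False),
    sl.IAFSqueeze(batch_size=batch),
)

x = torch.rand(batch * time, 1, 28, 28)  # time folded into the batch dimension
out = snn(x)                             # -> (batch * time, 10) spike raster
```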
## Data convention: float tensors

Although Sinabs deals with spiking data or events, for considerations of simulation speed and learning efficiency, all the layers are designed to work with spike rasters or tensors. Sinabs respects PyTorch's data convention wherever appropriate. Because Sinabs deals with temporal data most often, there is an additional time axis expected in the input data. In addition, for compatibility with other layers in PyTorch, the input data needs to be a float tensor (even if the original data happens to be binary), as in the sketch below.
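A short sketch of preparing input in this convention. The (batch, time, channels, height, width) layout shown here is an assumption to check against your sinabs version; the float requirement holds regardless.

```python
import torch

batch, time, channels, height, width = 4, 100, 2, 34, 34

# Binary events rasterised into a dense tensor; the extra time axis is placed
# after the batch axis here (an assumed layout -- verify for your setup).
events = torch.randint(0, 2, (batch, time, channels, height, width))

x = events.float()  # layers expect float input even for binary spike data
print(x.shape, x.dtype)  # torch.Size([4, 100, 2, 34, 34]) torch.float32
```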