nn.Module
cofhe.nn.Module
The base class for all neural network modules in CoFHE, providing methods for setting up layers and applying forward passes. It operates similarly to PyTorch's nn.Module (see the subclassing sketch after this list).

cofhe.nn.Sequential
A container module that applies a sequence of layers or modules to the input tensor, similar to PyTorch's Sequential. It allows simple, linear neural networks to be built by stacking layers (a container sketch follows this list).

cofhe.nn.ModuleList
A list-like container that holds a list of modules and applies them sequentially to the input tensor.

cofhe.nn.ModuleDict
A dictionary-like container that holds a collection of modules, allowing them to be applied based on their keys.

cofhe.nn.Parameter
Represents a learnable parameter in a module. It holds a tensor that will be optimized during training.

cofhe.nn.LayerNorm
A layer normalization module that normalizes across the last dimension. Similar to PyTorch's nn.LayerNorm, it normalizes input data and has learnable parameters.

cofhe.nn.BatchNorm1d
A batch normalization layer for 1D input, similar to PyTorch's nn.BatchNorm1d. It normalizes the input across batches to stabilize training.

cofhe.nn.Transformer
A transformer module that implements the transformer architecture, supporting both encoder and decoder layers. It accepts parameters such as model dimension, number of heads, and number of layers.

cofhe.nn.TransformerEncoder
Implements the encoder part of the transformer model. It supports configurable parameters such as the number of layers and attention heads (an encoder sketch follows this list).

cofhe.nn.TransformerDecoder
Implements the decoder part of the transformer model.

cofhe.nn.TransformerEncoderLayer
Implements a single transformer encoder layer, consisting of multi-head self-attention and a feed-forward network.

cofhe.nn.TransformerDecoderLayer
Implements a single transformer decoder layer, consisting of multi-head self-attention and cross-attention layers, along with feed-forward layers.

cofhe.nn.Linear
A fully connected layer (linear transformation), similar to PyTorch's nn.Linear.

cofhe.nn.Conv2d
A 2D convolutional layer, similar to PyTorch's nn.Conv2d. It accepts parameters such as input channels, output channels, kernel size, stride, and padding.

cofhe.nn.Dropout
A dropout layer that randomly zeroes some elements of the input tensor during training, helping with regularization.

cofhe.nn.Embedding
An embedding layer that maps discrete indices (e.g., words) to continuous vectors. Similar to PyTorch's nn.Embedding.

cofhe.nn.CosineSimilarity
A layer that computes the cosine similarity between two input tensors, typically used for tasks such as measuring similarity between vectors.

cofhe.nn.PairwiseDistance
A layer that computes the pairwise distance between two input tensors, useful in tasks like metric learning.

cofhe.nn.RNN
A recurrent neural network layer. Similar to PyTorch's nn.RNN, it supports parameters such as input size, hidden size, and number of layers.

cofhe.nn.MultiheadAttention
A multi-head attention layer, a key component of transformer models. It can attend to different parts of the input sequence in parallel.
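Because cofhe.nn.Module is described as operating like PyTorch's nn.Module, subclassing should follow the same pattern. The sketch below is a minimal illustration under that assumption; the Linear(in_features, out_features) and Dropout(p=...) constructor signatures are borrowed from PyTorch and are not confirmed CoFHE API.

```python
import cofhe.nn as nn

# Minimal sketch of subclassing cofhe.nn.Module, assuming PyTorch-style
# semantics. Constructor signatures are assumptions based on the stated
# PyTorch parity, not confirmed CoFHE API.
class TinyRegressor(nn.Module):
    def __init__(self, in_features: int, hidden: int, out_features: int):
        super().__init__()
        self.fc1 = nn.Linear(in_features, hidden)
        self.drop = nn.Dropout(p=0.1)  # regularization during training
        self.fc2 = nn.Linear(hidden, out_features)

    def forward(self, x):
        # Forward pass: two linear maps with dropout in between.
        x = self.fc1(x)
        x = self.drop(x)
        return self.fc2(x)
```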
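The container modules compose the same way as their PyTorch counterparts. A sketch, again assuming PyTorch-compatible constructors:

```python
import cofhe.nn as nn

# Sequential applies its children to the input in order:
stack = nn.Sequential(
    nn.Linear(64, 32),
    nn.Dropout(p=0.1),
    nn.Linear(32, 8),
)

# ModuleList holds a list of modules and, per the description above,
# applies them sequentially to the input tensor:
blocks = nn.ModuleList([nn.Linear(16, 16) for _ in range(3)])

# ModuleDict keys modules by name so they can be selected at runtime:
heads = nn.ModuleDict({
    "classify": nn.Linear(32, 10),
    "embed": nn.Linear(32, 64),
})
```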
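The transformer family also tracks PyTorch's layout. The parameter names below (d_model, nhead, num_layers) are PyTorch's and are assumed, not confirmed, to carry over to CoFHE; treat this as a hypothetical construction rather than documented usage.

```python
import cofhe.nn as nn

# Hypothetical encoder construction; d_model, nhead, and num_layers are
# PyTorch parameter names, assumed to match CoFHE's signatures.
layer = nn.TransformerEncoderLayer(d_model=128, nhead=4)
encoder = nn.TransformerEncoder(layer, num_layers=2)
```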