nn.Module

cofhe.nn.Module
The base class for all neural network modules in CoFHE, providing methods for setting up layers and applying forward passes. It operates similarly to PyTorch's nn.Module.
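
A minimal sketch of defining a custom module, assuming cofhe.nn.Module follows PyTorch's subclassing pattern (__init__ registers sub-layers, forward defines the computation); the class and sizes here are illustrative, not part of CoFHE's API:

```python
import cofhe.nn as nn

class TinyClassifier(nn.Module):
    """Hypothetical two-layer classifier built on cofhe.nn.Module."""

    def __init__(self, in_features, hidden, num_classes):
        super().__init__()
        # Sub-layers assigned as attributes are registered with the module.
        self.fc1 = nn.Linear(in_features, hidden)
        self.fc2 = nn.Linear(hidden, num_classes)

    def forward(self, x):
        # x: an input tensor created via CoFHE's tensor API
        return self.fc2(self.fc1(x))
```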

cofhe.nn.Sequential
A container module that applies a sequence of layers or modules to the input tensor, similar to PyTorch's Sequential. It allows for building simple, linear neural networks by stacking layers.
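
The same kind of model could be written as a stack; a sketch assuming a PyTorch-like constructor, with illustrative layer sizes:

```python
import cofhe.nn as nn

# Layers are applied in order: Linear(128 -> 64), then Linear(64 -> 10).
model = nn.Sequential(
    nn.Linear(128, 64),
    nn.Linear(64, 10),
)

def predict(x):
    # x: an input tensor created via CoFHE's tensor API
    return model(x)
```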

cofhe.nn.ModuleList
A list-like container that holds a list of modules and applies them sequentially to the input tensor.

cofhe.nn.ModuleDict
A dictionary-like container that holds a collection of modules, allowing them to be applied based on their keys.
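
A sketch of both containers, assuming they support PyTorch-like iteration and key lookup; the block count, sizes, and task keys are illustrative:

```python
import cofhe.nn as nn

class MultiHeadModel(nn.Module):
    def __init__(self):
        super().__init__()
        # ModuleList: an ordered collection of layers, applied in sequence.
        self.blocks = nn.ModuleList([nn.Linear(64, 64) for _ in range(3)])
        # ModuleDict: layers selected by key, e.g. one output head per task.
        self.heads = nn.ModuleDict({
            "regression": nn.Linear(64, 1),
            "classification": nn.Linear(64, 10),
        })

    def forward(self, x, task="classification"):
        for block in self.blocks:
            x = block(x)
        return self.heads[task](x)
```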

cofhe.nn.Parameter
Represents a learnable parameter in a module. It holds a tensor that will be optimized during training.
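
A sketch of a module with one learnable parameter, assuming cofhe.nn.Parameter wraps a tensor the way torch.nn.Parameter does; init_value stands in for a tensor built with CoFHE's tensor API:

```python
import cofhe.nn as nn

class Scale(nn.Module):
    """Hypothetical module with a single learnable scaling parameter."""

    def __init__(self, init_value):
        super().__init__()
        # Wrapping a tensor in Parameter marks it for optimization.
        # init_value: a tensor created via CoFHE's tensor API.
        self.weight = nn.Parameter(init_value)

    def forward(self, x):
        return x * self.weight
```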

cofhe.nn.LayerNorm
A layer normalization module that normalizes across the last dimension. Similar to PyTorch's nn.LayerNorm, it normalizes input data and has learnable parameters.

cofhe.nn.BatchNorm1d
A batch normalization layer for 1D input, similar to PyTorch's nn.BatchNorm1d. It normalizes the input across batches to stabilize training.
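
Both normalization layers would presumably be constructed from a feature count, as in PyTorch; the shapes below are illustrative:

```python
import cofhe.nn as nn

# Normalizes over the last dimension of the input (here, 128 features),
# with learnable scale and shift parameters.
layer_norm = nn.LayerNorm(128)

# Normalizes each of 128 features across the batch dimension.
batch_norm = nn.BatchNorm1d(128)

def normalize(x):
    # x: a (batch, 128) tensor created via CoFHE's tensor API
    return layer_norm(x)
```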

cofhe.nn.Transformer
A transformer module that implements the transformer architecture, supporting both encoder and decoder layers. It accepts parameters such as model dimension, number of heads, and the number of layers.
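
A construction sketch assuming keyword names matching PyTorch's nn.Transformer (d_model, nhead, num_encoder_layers, num_decoder_layers); CoFHE's actual signature may differ:

```python
import cofhe.nn as nn

# Keyword names assumed to match PyTorch's nn.Transformer.
model = nn.Transformer(
    d_model=512,
    nhead=8,
    num_encoder_layers=6,
    num_decoder_layers=6,
)

def run(src, tgt):
    # src, tgt: (sequence, batch, d_model) tensors from CoFHE's tensor API
    return model(src, tgt)
```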

cofhe.nn.TransformerEncoder
A module that implements the encoder part of the transformer model. It supports configurable parameters such as the number of layers and attention heads.

cofhe.nn.TransformerDecoder
A module that implements the decoder part of the transformer model.

cofhe.nn.TransformerEncoderLayer
Implements a single transformer encoder layer, which consists of multi-head self-attention and a feed-forward network.

cofhe.nn.TransformerDecoderLayer
Implements a single transformer decoder layer, consisting of multi-head self-attention and cross-attention layers, along with feed-forward layers.
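
The layer/stack split above suggests the PyTorch pattern of cloning a single layer specification into a full stack; a sketch under that assumption:

```python
import cofhe.nn as nn

# One encoder layer: multi-head self-attention plus a feed-forward network.
encoder_layer = nn.TransformerEncoderLayer(d_model=512, nhead=8)

# Stack six copies of the layer into a full encoder.
encoder = nn.TransformerEncoder(encoder_layer, num_layers=6)

def encode(src):
    # src: a (sequence, batch, d_model) tensor from CoFHE's tensor API
    return encoder(src)
```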

cofhe.nn.Linear
A fully connected layer (linear transformation), similar to PyTorch's nn.Linear.
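
Presumably constructed from input and output feature counts, as in PyTorch; the sizes are illustrative:

```python
import cofhe.nn as nn

# Maps 128-dimensional inputs to 10-dimensional outputs: y = x @ W.T + b
fc = nn.Linear(128, 10)
```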

cofhe.nn.Conv2d
A 2D convolutional layer, similar to PyTorch's nn.Conv2d. It accepts parameters such as input channels, output channels, kernel size, stride, and padding.
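
A sketch assuming keyword names matching PyTorch's nn.Conv2d; the channel counts and kernel size are illustrative:

```python
import cofhe.nn as nn

# 3 input channels -> 16 output channels, 3x3 kernel, stride 1, padding 1.
# Keyword names assumed to match PyTorch's nn.Conv2d.
conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3,
                 stride=1, padding=1)

def extract_features(images):
    # images: a (batch, 3, height, width) tensor from CoFHE's tensor API
    return conv(images)
```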

cofhe.nn.Dropout
A dropout layer that randomly zeroes some elements of the input tensor during training, helping with regularization.

cofhe.nn.Embedding
An embedding layer that maps discrete indices (e.g., words) to continuous vectors, similar to PyTorch's nn.Embedding.
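
A sketch combining this layer with the dropout layer above, assuming PyTorch-like constructors; the vocabulary size, embedding width, and dropout rate are illustrative:

```python
import cofhe.nn as nn

embed = nn.Embedding(10_000, 256)  # vocabulary of 10,000 tokens -> 256-d vectors
drop = nn.Dropout(0.1)             # zero 10% of elements during training

def embed_tokens(token_ids):
    # token_ids: an integer index tensor created via CoFHE's tensor API
    return drop(embed(token_ids))
```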

cofhe.nn.CosineSimilarity
A layer that computes the cosine similarity between two input tensors, typically used for measuring similarity between vectors.

cofhe.nn.PairwiseDistance
A layer that computes the pairwise distance between two input tensors, useful in tasks like metric learning.
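
A sketch computing both measures on the same pair of inputs, assuming PyTorch-like defaults (cosine over the feature dimension, Euclidean pairwise distance):

```python
import cofhe.nn as nn

cos = nn.CosineSimilarity()   # angle-based similarity in [-1, 1]
dist = nn.PairwiseDistance()  # Euclidean distance between paired rows

def compare(a, b):
    # a, b: (batch, features) tensors created via CoFHE's tensor API
    return cos(a, b), dist(a, b)
```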

cofhe.nn.RNN
A recurrent neural network layer. Similar to PyTorch's nn.RNN, it supports parameters such as input size, hidden size, and number of layers.
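
A sketch assuming keyword names matching PyTorch's nn.RNN; the sizes are illustrative:

```python
import cofhe.nn as nn

# Keyword names assumed to match PyTorch's nn.RNN.
rnn = nn.RNN(input_size=64, hidden_size=128, num_layers=2)

def run_sequence(seq):
    # seq: a (sequence, batch, input_size) tensor from CoFHE's tensor API
    return rnn(seq)
```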

cofhe.nn.MultiheadAttention
A multi-head attention layer, which is a key component of transformer models. It can attend to different parts of the input sequence in parallel.
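
A self-attention sketch assuming the PyTorch-like signature in which query, key, and value are passed separately; embed_dim and num_heads are assumed keyword names:

```python
import cofhe.nn as nn

# Keyword names assumed to match PyTorch's nn.MultiheadAttention.
attn = nn.MultiheadAttention(embed_dim=512, num_heads=8)

def self_attend(x):
    # x: a (sequence, batch, embed_dim) tensor from CoFHE's tensor API;
    # query, key, and value are all x for self-attention.
    return attn(x, x, x)
```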