nn.Functional
cofhe.nn.functional.relu(input)
Applies the ReLU activation function to the input tensor, replacing negative values with zero.
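For reference, the standard elementwise ReLU this description refers to can be sketched in plain NumPy. This is a plaintext illustration of the operation's semantics only, not the library's encrypted implementation.

import numpy as np

def relu_reference(x: np.ndarray) -> np.ndarray:
    # Elementwise max(0, x): negative entries become zero.
    return np.maximum(x, 0.0)

# relu_reference(np.array([-1.5, 0.0, 2.0])) -> array([0., 0., 2.])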
cofhe.nn.functional.softmax(input)
Applies the softmax activation function across the input tensor, typically used for classification tasks.
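As a plaintext reference for the operation named here, softmax maps a vector of scores to a probability distribution. A numerically stable NumPy sketch (illustrative only):

import numpy as np

def softmax_reference(x: np.ndarray, axis: int = -1) -> np.ndarray:
    # Subtract the max for numerical stability, then normalize the exponentials.
    shifted = x - np.max(x, axis=axis, keepdims=True)
    exps = np.exp(shifted)
    return exps / np.sum(exps, axis=axis, keepdims=True)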
cofhe.nn.functional.log_softmax(input)
Applies the log softmax activation function to the input tensor. Useful when working with log probabilities.
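Plaintext reference sketch (illustrative): log softmax is the logarithm of softmax, conventionally computed as x minus log-sum-exp for numerical stability.

import numpy as np

def log_softmax_reference(x: np.ndarray, axis: int = -1) -> np.ndarray:
    # log(softmax(x)) computed as x - logsumexp(x) for numerical stability.
    shifted = x - np.max(x, axis=axis, keepdims=True)
    return shifted - np.log(np.sum(np.exp(shifted), axis=axis, keepdims=True))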
cofhe.nn.functional.leaky_relu(input, negative_slope)
Applies the Leaky ReLU activation function, where negative values are scaled by a negative_slope factor.
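A plaintext NumPy sketch of the standard Leaky ReLU definition the description refers to (the default slope value below is an arbitrary illustration, not a documented default):

import numpy as np

def leaky_relu_reference(x: np.ndarray, negative_slope: float = 0.01) -> np.ndarray:
    # Positive entries pass through unchanged; negative entries are scaled by negative_slope.
    return np.where(x >= 0, x, negative_slope * x)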
cofhe.nn.functional.tanh(input)
Applies the hyperbolic tangent (Tanh) activation function to the input tensor.
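Plaintext reference (illustrative): tanh squashes each value into the range (-1, 1).

import numpy as np

def tanh_reference(x: np.ndarray) -> np.ndarray:
    # Elementwise hyperbolic tangent, output in (-1, 1).
    return np.tanh(x)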
cofhe.nn.functional.sigmoid(input)
Applies the sigmoid activation function to the input tensor, mapping the values to the range [0, 1].
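Plaintext reference for the sigmoid described above (illustrative sketch): sigmoid(x) = 1 / (1 + exp(-x)).

import numpy as np

def sigmoid_reference(x: np.ndarray) -> np.ndarray:
    # Elementwise logistic function: 1 / (1 + exp(-x)).
    return 1.0 / (1.0 + np.exp(-x))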
cofhe.nn.functional.gelu(input)
Applies the Gaussian Error Linear Unit (GELU) activation function, a smooth version of ReLU, widely used in transformer models.
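For reference, the exact GELU is x * Phi(x), with Phi the standard normal CDF; the widely used tanh approximation is sketched below in plain NumPy. Which variant cofhe evaluates is not stated in this section.

import numpy as np

def gelu_reference(x: np.ndarray) -> np.ndarray:
    # Tanh approximation of GELU: 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3))).
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x ** 3)))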
cofhe.nn.functional.mse_loss(input, target)
Computes the Mean Squared Error (MSE) loss between the input tensor and the target tensor.
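Plaintext reference for mean squared error (illustrative; the reduction behavior, e.g. mean vs. sum, is not specified in this section, so a mean reduction is assumed):

import numpy as np

def mse_loss_reference(input: np.ndarray, target: np.ndarray) -> float:
    # Mean of squared elementwise differences (mean reduction assumed).
    return float(np.mean((input - target) ** 2))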
cofhe.nn.functional.cross_entropy(input, target)
Computes the Cross-Entropy loss between the input tensor (logits) and the target tensor (class labels).
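Plaintext reference for cross-entropy from logits and integer class labels (illustrative sketch; it combines log softmax with the negative log-likelihood of the correct class, with a mean reduction assumed):

import numpy as np

def cross_entropy_reference(logits: np.ndarray, target: np.ndarray) -> float:
    # logits: (N, C) raw scores; target: (N,) integer class indices.
    shifted = logits - np.max(logits, axis=1, keepdims=True)
    log_probs = shifted - np.log(np.sum(np.exp(shifted), axis=1, keepdims=True))
    # Negative log-probability of the correct class, averaged over the batch.
    return float(-np.mean(log_probs[np.arange(len(target)), target]))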