Diwa
Lightweight implementation of Artificial Neural Network for resource-constrained environments
Defines activation functions for use in the Diwa neural network.
Classes
- class DiwaActivationFunc
  Class containing static methods for common activation functions.

Macros
- #define DIWA_ACTFUNC_LOWER_BOUND -30.0f
- #define DIWA_ACTFUNC_UPPER_BOUND 30.0f

Typedefs
- typedef double (*diwa_activation)(double)
  Typedef for activation function pointer.
This header file defines the DiwaActivationFunc class, which provides a set of common activation functions used in neural networks. Activation functions play a crucial role in determining the output of a neuron based on its input. They introduce non-linearity to the network, allowing it to learn complex patterns and relationships in the data.
The DiwaActivationFunc class contains static methods for popular activation functions, including the sigmoid and Gaussian functions. These regulate the output values of neurons, transforming inputs into a range suited to the task at hand, such as classification or regression.
#define DIWA_ACTFUNC_LOWER_BOUND -30.0f
Lower bound for input values to prevent overflow.
#define DIWA_ACTFUNC_UPPER_BOUND 30.0f
Upper bound for input values to prevent overflow.
typedef double (*diwa_activation)(double)
Typedef for activation function pointer.
This typedef defines the signature for activation functions used in the Diwa neural network. Activation functions take a single input value and return a transformed output value.