The perceptron is the simplest form of a neural network, consisting of a single neuron. It was invented by Frank Rosenblatt in 1958 and laid the foundation for modern neural networks.
y = f(w₁x₁ + w₂x₂ + ... + wₙxₙ + b)
Where:
• y = Output (prediction/result)
• f = Activation function (in the classic perceptron, a unit step function)
• w₁, w₂, ..., wₙ = Weights (learnable parameters)
• x₁, x₂, ..., xₙ = Input features
• n = Number of input features
• b = Bias term (offset value)
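The formula above can be sketched as a forward pass in Python. The input values, weights, and bias below are made-up illustrative numbers, and a unit step function is assumed as the activation:

```python
def step(z):
    # Unit step activation: 1 if z >= 0, else 0
    return 1 if z >= 0 else 0

def perceptron_output(x, w, b):
    # Weighted sum of inputs plus bias, passed through the activation:
    # y = f(w1*x1 + w2*x2 + ... + wn*xn + b)
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return step(z)

x = [1.0, 0.5]    # input features (assumed values)
w = [0.4, -0.6]   # weights (assumed values)
b = 0.1           # bias (assumed value)

y = perceptron_output(x, w, b)  # z = 0.4 - 0.3 + 0.1 = 0.2, so y = 1
```

Because the step function only checks the sign of the weighted sum, the output is a binary classification: the weights and bias define a decision boundary in the input space.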
Components:
- Inputs (x): Features or attributes fed into the neuron
- Weights (w): Parameters that determine the importance of each input
- Bias (b): Offset value that helps the model fit the data better
- Activation Function (f): Maps the weighted sum to the output; the classic perceptron uses a unit step function
- Output (y): Final prediction or classification
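Since the weights and bias are learnable, they can be fitted with Rosenblatt's perceptron learning rule: after each prediction, the weights are nudged in proportion to the error. The sketch below trains on the logical AND function; the learning rate and epoch count are assumptions chosen for illustration:

```python
def step(z):
    # Unit step activation: 1 if z >= 0, else 0
    return 1 if z >= 0 else 0

def train_perceptron(data, n_features, lr=0.1, epochs=20):
    # Perceptron learning rule: w <- w + lr * (target - prediction) * x
    # (lr and epochs are illustrative values, not canonical ones)
    w = [0.0] * n_features
    b = 0.0
    for _ in range(epochs):
        for x, target in data:
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            error = target - step(z)
            w = [wi + lr * error * xi for wi, xi in zip(w, x)]
            b += lr * error
    return w, b

# Logical AND: output is 1 only when both inputs are 1
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train_perceptron(data, n_features=2)
preds = [step(sum(wi * xi for wi, xi in zip(w, x)) + b) for x, _ in data]
# preds now matches the AND targets [0, 0, 0, 1]
```

AND is linearly separable, so the rule converges; a single perceptron cannot learn non-separable functions such as XOR, which is what motivates multi-layer networks.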