Abstract
Artificial neural networks are computational models inspired by the brain, enabling them to capture complex nonlinear relationships between a response variable and its predictors. The simplest networks contain no hidden layers and are therefore equivalent to linear regression models. Figure 1 illustrates a neural network representation of a linear regression model with four predictors. The coefficients attached to these predictors are called "weights," and forecasts are produced as a linear combination of the inputs. In the neural network framework, the weights are selected by a "learning algorithm" that minimizes a "cost function," such as the mean squared error (MSE). However, for this simple case, linear regression remains a more efficient approach.
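As a minimal sketch of this equivalence, the Python snippet below fits the same no-hidden-layer model in two ways: by gradient descent on the MSE cost function (the "learning algorithm" view) and by ordinary least squares (the linear regression view). The synthetic data, learning rate, and variable names are illustrative assumptions and are not taken from the bulletin.

```python
import numpy as np

# Illustrative sketch: a neural network with no hidden layer is just a
# linear model y_hat = X @ w + b, whose weights can be fitted by
# minimising the MSE. The data below are synthetic.
rng = np.random.default_rng(0)
n, p = 200, 4                        # 200 observations, four predictors
X = rng.normal(size=(n, p))
true_w, true_b = np.array([1.5, -2.0, 0.5, 3.0]), 0.7
y = X @ true_w + true_b + rng.normal(scale=0.1, size=n)

# "Learning algorithm": plain gradient descent on the MSE cost function.
w, b, lr = np.zeros(p), 0.0, 0.05
for _ in range(2000):
    resid = X @ w + b - y            # prediction errors
    w -= lr * (2 / n) * X.T @ resid  # gradient of MSE w.r.t. the weights
    b -= lr * (2 / n) * resid.sum()  # gradient of MSE w.r.t. the intercept

# Ordinary least squares gives essentially the same answer directly,
# which is why linear regression is the more efficient choice here.
ols, *_ = np.linalg.lstsq(np.column_stack([X, np.ones(n)]), y, rcond=None)
print("gradient descent:", np.round(w, 3), round(b, 3))
print("least squares:   ", np.round(ols, 3))
```

Both estimates should agree to several decimal places, since they minimise the same MSE cost function for the same linear model.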
Metadata
| Field | Value |
|---|---|
| Item Type | Monograph (Bulletin) |
| Creators | Yaccob, Nurul Aityqah; Zulkifle, Farizuwana Akma |
| Subjects | L Education > L Education (General); Q Science > Q Science (General); Q Science > QA Mathematics |
| Divisions | Universiti Teknologi MARA, Negeri Sembilan |
| Journal or Publication Title | What's What PSPM |
| ISSN | 2756-7729 |
| Keywords | Artificial neural networks, computational, mean squared error, MSE |
| Date | 2025 |
| URI | https://ir.uitm.edu.my/id/eprint/113822 |