Abstract
The Steepest Descent (SD) method, also known as gradient descent, remains a widely used optimization technique across many domains. Over time it has undergone notable refinements that address several limitations of its earlier versions and enhance its overall effectiveness. Nevertheless, the classical SD method still faces challenges, particularly in handling large-scale unconstrained problems. The method tends to progress slowly and to zigzag toward the solution because it struggles with functions that have sharp curvature or narrow valleys, often requiring many steps to reach the optimal point; it also takes very small steps in some regions and large steps in others, which reduces its efficiency, and its performance degrades noticeably on large-scale problems. To address this, the study "Comparative Analysis of Line Search Methods in SD Algorithm for Unconstrained Optimization Problems" proposes a modified methodology built on a detailed comparative investigation of several line search strategies within the SD algorithm. The line search algorithms considered include the golden section search, quadratic interpolation, and the Armijo rule, among others, and the aim is to assess their individual and collective effectiveness in optimizing unconstrained problems. Through comprehensive numerical experiments implemented in MATLAB as the analysis tool, the study seeks not only to clarify the performance characteristics of each line search method but also to help practitioners and researchers choose the most suitable modified SD method for the optimization problem at hand. In summary, this study goes beyond mere modification to provide a thorough comparative analysis, offering insights into the interaction of line search techniques within the SD algorithm for unconstrained optimization problems. The results of this study indicate that the FMRI algorithm with an exact line search attains the fastest convergence rate, achieving a high level of accuracy in fewer iterations than the other algorithms and the inexact line searches.
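To make the behaviour described above concrete, the following is a minimal, illustrative sketch (in Python rather than the study's MATLAB code) of steepest descent combined with two of the line search rules mentioned: an inexact Armijo backtracking rule and an exact-style golden section search. The Rosenbrock test function, the starting point, the parameters, and the tolerances are assumptions chosen purely for demonstration; they do not reproduce the study's experiments or its FMRI result.

```python
# Illustrative sketch only: steepest descent on the Rosenbrock function,
# whose narrow curved valley triggers the zigzag behaviour described above,
# comparing an inexact (Armijo) and an exact-style (golden section) line search.
import numpy as np

def rosenbrock(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def rosenbrock_grad(x):
    return np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])

def armijo_step(f, x, g, d, alpha=1.0, rho=0.5, c=1e-4):
    """Inexact line search: shrink alpha until the Armijo sufficient-decrease condition holds."""
    while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
        alpha *= rho
    return alpha

def golden_section_step(f, x, g, d, a=0.0, b=1.0, tol=1e-6):
    """Exact-style line search: minimise phi(alpha) = f(x + alpha*d) on [a, b]."""
    invphi = (np.sqrt(5) - 1) / 2           # 1 / golden ratio, about 0.618
    c1, c2 = b - invphi * (b - a), a + invphi * (b - a)
    while b - a > tol:
        if f(x + c1 * d) < f(x + c2 * d):   # minimiser lies in [a, c2]
            b, c2 = c2, c1
            c1 = b - invphi * (b - a)
        else:                               # minimiser lies in [c1, b]
            a, c1 = c1, c2
            c2 = a + invphi * (b - a)
    return (a + b) / 2

def steepest_descent(f, grad, x0, line_search, tol=1e-6, max_iter=50_000):
    """Classical SD: move along the negative gradient with a step chosen by line_search."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:         # stop when the gradient is (nearly) zero
            return x, k
        d = -g                              # steepest descent direction
        x = x + line_search(f, x, g, d) * d
    return x, max_iter

if __name__ == "__main__":
    x0 = [-1.2, 1.0]
    for name, ls in [("Armijo (inexact)", armijo_step),
                     ("golden section (exact)", golden_section_step)]:
        x_star, iters = steepest_descent(rosenbrock, rosenbrock_grad, x0, ls)
        print(f"{name:24s}: x* ~ {x_star}, iterations = {iters}")
```

Comparing the printed iteration counts for the two rules mirrors, on a small scale, the kind of comparison the study performs across its full set of line search methods.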
Metadata
| Item Type: | Student Project |
|---|---|
| Creators: | Shukeri, Ahmad Zikri; Megat Sulzamzamendi, Puteri Qurratu Ain; Ibrahim, Suhaida (Email / ID Num.: UNSPECIFIED) |
| Subjects: | L Education > LB Theory and practice of education > Higher Education > Dissertations, Academic. Preparation of theses |
| Divisions: | Universiti Teknologi MARA, Negeri Sembilan > Seremban Campus |
| Programme: | Bachelor of Science (Hons.) Management Mathematics |
| Keywords: | Steepest Descent, SD, Algorithm, MATLAB |
| Date: | 2024 |
| URI: | https://ir.uitm.edu.my/id/eprint/95174 |
Download: 95174.pdf (323kB)