Abstract
The conjugate gradient method is commonly used to solve large-scale unconstrained optimization problems because it does not require the storage of matrices. Specifically, this project investigates three-term conjugate gradient methods. An inexact line search, namely the strong Wolfe line search, together with a modified parameter, was used in this project. The methods considered are those of Liu (2018), Norddin et al. (2018), and a modified three-term Hestenes-Stiefel method (2007).
These methods were tested on several optimization test functions: the Extended Rosenbrock, Himmelblau, Beale, and White & Holst functions. The results are analysed based on the number of iterations and CPU time. The expected outcome of this research is to identify the best method under the strong Wolfe line search for solving large-scale unconstrained optimization problems.
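To illustrate the kind of algorithm the abstract refers to, below is a minimal sketch of a generic three-term Hestenes-Stiefel-type conjugate gradient iteration with a strong Wolfe line search, run on the Extended Rosenbrock test function. This is not the thesis's exact algorithm: the direction formula, the `three_term_cg` and `ext_rosenbrock` helpers, and all tolerances are illustrative assumptions; only the strong Wolfe line search (here via `scipy.optimize.line_search`) and the test function are named in the abstract.

```python
# Sketch only: a generic three-term Hestenes-Stiefel-type CG method with a
# strong Wolfe line search, not the specific modified methods of the thesis.
import numpy as np
from scipy.optimize import line_search


def ext_rosenbrock(x):
    # Extended Rosenbrock: sum over consecutive pairs (x[2j], x[2j+1]).
    return np.sum(100.0 * (x[1::2] - x[::2] ** 2) ** 2 + (1.0 - x[::2]) ** 2)


def ext_rosenbrock_grad(x):
    g = np.zeros_like(x)
    xe, xo = x[::2], x[1::2]
    g[::2] = -400.0 * xe * (xo - xe ** 2) - 2.0 * (1.0 - xe)
    g[1::2] = 200.0 * (xo - xe ** 2)
    return g


def three_term_cg(f, grad, x0, tol=1e-6, max_iter=10_000):
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for k in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Strong Wolfe line search along the current direction d.
        alpha, *_ = line_search(f, grad, x, d, gfk=g)
        if alpha is None:
            # Line search failed: restart with steepest descent.
            d = -g
            alpha, *_ = line_search(f, grad, x, d, gfk=g)
            if alpha is None:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d @ y
        if abs(denom) > 1e-12:
            beta = (g_new @ y) / denom    # Hestenes-Stiefel parameter
            theta = (g_new @ d) / denom   # coefficient of the third term
            d = -g_new + beta * d - theta * y
        else:
            d = -g_new                    # fallback: steepest descent restart
        x, g = x_new, g_new
    return x, k


if __name__ == "__main__":
    x0 = np.full(1000, -1.2)
    x0[1::2] = 1.0  # standard starting point (-1.2, 1, -1.2, 1, ...)
    x_star, iters = three_term_cg(ext_rosenbrock, ext_rosenbrock_grad, x0)
    print(f"iterations: {iters}, f(x*) = {ext_rosenbrock(x_star):.3e}")
```

In a comparison like the one described above, each candidate method would be run on every test function and dimension, and the number of iterations and CPU time recorded for each run.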
Metadata
| Field | Value |
|---|---|
| Item Type | Thesis (Degree) |
| Creators | Abdullah, Nor Amila Sofiya (2016289532); Mohd Jalil, Nurul Nadia (2016284404) |
| Contributors | Thesis advisor: Norddin, Nur Idalisa |
| Subjects | Q Science > QA Mathematics > Equations; Q Science > QA Mathematics > Mathematical statistics. Probabilities; Q Science > QA Mathematics > Analysis; Q Science > QA Mathematics > Instruments and machines > Electronic Computers. Computer Science > Algorithms |
| Divisions | Universiti Teknologi MARA, Terengganu > Kuala Terengganu Campus > Faculty of Computer and Mathematical Sciences |
| Programme | Bachelor of Science (Hons) Computational Mathematics |
| Keywords | Conjugate Gradient Method; Storage Of Matrices; Three-Term Conjugate Gradient Methods; Rosenbrock; Himmelblau Function; Beale And White & Holst Function |
| Date | July 2019 |
| URI | https://ir.uitm.edu.my/id/eprint/41324 |