Archive and Documentation Center
Digital Archive

Example-dependent cost-sensitive gradient boosting machines for credit scoring


dc.contributor Graduate Program in Computational Science and Engineering.
dc.contributor.advisor Baydoğan, Mustafa Gökçe.
dc.contributor.author Kurtuluş, İlker.
dc.date.accessioned 2023-10-15T06:54:28Z
dc.date.available 2023-10-15T06:54:28Z
dc.date.issued 2022
dc.identifier.other CSE 2022 K87
dc.identifier.uri http://digitalarchive.boun.edu.tr/handle/123456789/19710
dc.description.abstract Although most machine learning algorithms minimize cost-insensitive losses, many real-world applications require cost-sensitive approaches in which misclassification costs differ among classes. In addition to class-dependent misclassification costs, the examples in a data set may have non-identical costs, a setting known as example-dependent cost-sensitive learning. In credit scoring, for instance, mistakenly rejecting a good borrower and approving a bad client in financial distress incur different costs. Moreover, because applicants are granted varying credit amounts, credit scoring is example-dependent: falsely approving a 100M$ loan and a 1M$ loan produces unequal costs. To address this problem, this thesis proposes an example-dependent cost-sensitive loss function. With the introduced loss function, cost sensitivity is handled during the learning process, which is achieved by replacing the traditional loss function of Gradient Boosting Machines with the proposed one, yielding Example-Dependent Cost-Sensitive Gradient Boosting Machines. The proposed algorithm is tested on two real-world data sets that include credit amounts and on synthetically generated data sets. It is compared with cost-insensitive learners, previously proposed example-dependent cost-sensitive classifiers that handle cost sensitivity during learning, a post-processing method called Thresholding, and a pre-processing method, Oversampling, both of which make cost-insensitive classifiers cost-sensitive. Results show that our method outperforms these four methods in terms of financial savings.
dc.publisher Thesis (M.S.) - Bogazici University. Institute for Graduate Studies in Science and Engineering, 2022.
dc.subject.lcsh Credit scoring systems.
dc.subject.lcsh Machine learning -- Mathematical models.
dc.title Example-dependent cost-sensitive gradient boosting machines for credit scoring
dc.format.pages xi, 46 leaves
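The abstract describes making Gradient Boosting Machines example-dependent cost-sensitive by replacing the standard training loss with one that weighs each example by its own misclassification costs. The thesis's exact loss function is not reproduced in this record, so the following is only a minimal sketch under assumptions: it uses the expected-cost formulation common in example-dependent cost-sensitive credit scoring, a hypothetical per-example cost matrix derived from a loan_amount variable (the names make_cost_objective, C_FP, C_FN, C_TP, C_TN and all cost values are illustrative, not taken from the thesis), and XGBoost's custom-objective interface as a stand-in gradient boosting implementation.

import numpy as np
import xgboost as xgb

# Toy data for illustration; the thesis uses real credit data sets with loan amounts.
rng = np.random.default_rng(0)
n = 1000
X = rng.normal(size=(n, 5))
y = (X[:, 0] + 0.5 * rng.normal(size=n) > 0).astype(float)  # 1 = bad client (default)
loan_amount = rng.uniform(1_000, 100_000, size=n)

# Hypothetical per-example cost matrix (values are assumptions, not from the thesis):
C_FP = 0.05 * loan_amount   # rejecting a good applicant: forgone interest
C_FN = 0.75 * loan_amount   # approving a bad applicant: loss given default
C_TP = np.zeros(n)
C_TN = np.zeros(n)

def make_cost_objective(y, C_FP, C_FN, C_TP, C_TN):
    """Expected-cost objective; per example the cost is linear in p: L_i = a_i * p_i + b_i."""
    a = y * (C_TP - C_FN) + (1.0 - y) * (C_FP - C_TN)

    def objective(raw_margin, dtrain):
        p = 1.0 / (1.0 + np.exp(-raw_margin))   # sigmoid of the boosting score
        grad = a * p * (1.0 - p)                # dL_i / dz_i
        # The exact second derivative a*p*(1-p)*(1-2p) can be negative, which breaks
        # the Newton-style tree update; use a positive surrogate Hessian instead.
        hess = np.abs(a) * p * (1.0 - p)
        return grad, hess

    return objective

dtrain = xgb.DMatrix(X, label=y)
booster = xgb.train(
    {"max_depth": 3, "eta": 0.1},
    dtrain,
    num_boost_round=200,
    obj=make_cost_objective(y, C_FP, C_FN, C_TP, C_TN),
)
p_hat = 1.0 / (1.0 + np.exp(-booster.predict(dtrain, output_margin=True)))

Because the expected cost is linear in the predicted probability, its curvature with respect to the boosting score changes sign; the sketch therefore substitutes a positive surrogate Hessian so the boosting updates stay stable. This is a design choice of the illustration, not necessarily of the proposed method.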

