Implemented KNN, SVM (with RBF kernel), GBT, and XGBoost, explaining their mathematical foundations and practical applications through numerical examples and performance comparisons.


reza-chehreghani/AI-Assignment-2-Machine-Learning-Algorithms


README for AI Assignment 2

Course: Artificial Intelligence, University of Tehran
Assignment Due Date: May 23, 2024

Overview

This repository contains the work completed for Assignment 2 of the Artificial Intelligence course. The assignment focuses on understanding and implementing fundamental machine learning algorithms, illustrating them with numerical examples, and explaining their practical applications.

Completed Tasks

1. K-Nearest Neighbors (KNN) Algorithm

  • Explained the mathematical basis of KNN, including the computation of Euclidean distances and how predictions are formed from the nearest neighbors (simple average and weighted average).
  • Provided a clear numerical example demonstrating the prediction process for both one and three nearest neighbors.
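The actual implementation and numerical example live in the notebook; as a minimal standalone sketch of the prediction rule described above (the toy training points, targets, and query below are hypothetical):

```python
import math

def knn_predict(train, query, k, weighted=False):
    """Predict a value for `query` from its k nearest training points.

    train: list of (feature_vector, target) pairs
    weighted: if True, average neighbors by inverse distance
    """
    # Euclidean distance from the query to every training point
    dists = sorted((math.dist(x, query), y) for x, y in train)
    neighbors = dists[:k]
    if not weighted:
        # simple average of the k nearest targets
        return sum(y for _, y in neighbors) / k
    # inverse-distance weighted average (small epsilon guards against d = 0)
    weights = [1.0 / (d + 1e-9) for d, _ in neighbors]
    return sum(w * y for w, (_, y) in zip(weights, neighbors)) / sum(weights)

# Hypothetical 1-D toy data with target = 2 * x
train = [((1.0,), 2.0), ((2.0,), 4.0), ((3.0,), 6.0), ((10.0,), 20.0)]
print(knn_predict(train, (2.5,), k=1))                  # -> 4.0 (single nearest neighbor)
print(knn_predict(train, (2.5,), k=3))                  # -> 4.0 (simple average of 3 nearest)
print(knn_predict(train, (2.5,), k=3, weighted=True))   # closest points dominate
```

With k = 3, the simple average treats all three neighbors equally, while the weighted variant lets the two closest points pull the prediction toward their targets.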

2. Support Vector Machines (SVM) Algorithm

  • Discussed the concept of margins, decision boundaries, and the role of support vectors in SVMs.
  • Explained the soft margin classification approach and the use of kernels for handling nonlinear datasets.
  • Provided a numerical example of SVM using the Radial Basis Function (RBF) kernel, demonstrating how the algorithm works in practice.
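As a hedged illustration of the RBF decision function f(x) = Σ_i α_i y_i K(x_i, x) + b discussed above (the support vectors and dual coefficients below are made up for illustration, not taken from the notebook's example):

```python
import math

def rbf_kernel(x, z, gamma=1.0):
    # K(x, z) = exp(-gamma * ||x - z||^2)
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, z)))

def svm_decision(support_vectors, query, gamma=1.0, b=0.0):
    """f(x) = sum_i alpha_i * y_i * K(x_i, x) + b; the sign gives the class."""
    score = b + sum(
        alpha * y * rbf_kernel(sv, query, gamma)
        for sv, y, alpha in support_vectors
    )
    return 1 if score >= 0 else -1

# Hypothetical support vectors: (point, label y, dual coefficient alpha)
svs = [((0.0, 0.0), -1, 1.0), ((2.0, 2.0), 1, 1.0)]
print(svm_decision(svs, (0.2, 0.1)))   # -> -1 (near the negative support vector)
print(svm_decision(svs, (1.9, 2.1)))   # ->  1 (near the positive support vector)
```

The RBF kernel decays with squared distance, so each support vector mostly influences queries in its own neighborhood, which is what lets the model carve out nonlinear decision boundaries.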

3. Gradient Boosted Trees (GBT)

  • Explained how Gradient Boosted Trees work as a sequential ensemble learning method built from decision trees.
  • Demonstrated the process of correcting residual errors iteratively to build a strong learner.
  • Illustrated the concept with a numerical example.
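The residual-correction loop described above can be sketched as follows (depth-1 "stumps" stand in for full decision trees; the data, learning rate, and tree count are illustrative, not the notebook's example):

```python
def fit_stump(xs, residuals):
    """Fit a depth-1 regression tree: pick the split minimizing squared error."""
    best = None
    for split in xs:
        left = [r for x, r in zip(xs, residuals) if x <= split]
        right = [r for x, r in zip(xs, residuals) if x > split]
        if not left or not right:
            continue  # a split must leave both sides non-empty
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = sum((r - lm) ** 2 for r in left) + sum((r - rm) ** 2 for r in right)
        if best is None or err < best[0]:
            best = (err, split, lm, rm)
    _, split, lm, rm = best
    return lambda x: lm if x <= split else rm

def gbt_fit(xs, ys, n_trees=20, lr=0.3):
    """Start from the mean, then repeatedly fit stumps to the residuals."""
    base = sum(ys) / len(ys)
    trees, pred = [], [base] * len(ys)
    for _ in range(n_trees):
        residuals = [y - p for y, p in zip(ys, pred)]   # errors still to correct
        stump = fit_stump(xs, residuals)
        trees.append(stump)
        pred = [p + lr * stump(x) for p, x in zip(pred, xs)]
    return lambda x: base + lr * sum(t(x) for t in trees)

# Hypothetical 1-D step-function data the ensemble should recover
xs = [1.0, 2.0, 3.0, 4.0]
ys = [10.0, 10.0, 20.0, 20.0]
model = gbt_fit(xs, ys)
print(model(1.5), model(3.5))   # close to 10 and 20 after 20 rounds
```

Each round shrinks the remaining residual by a constant factor here, which is the "weak learners combining into a strong learner" behavior the assignment describes.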

4. Extreme Gradient Boosting (XGBoost)

  • Covered the principles of XGBoost, including its enhancements over traditional gradient boosting, such as:
    • Parallel tree-building
    • Histogram-based algorithms for feature binning
    • Regularization to reduce overfitting
  • Highlighted XGBoost’s handling of sparse data, weighted quantile sketching, and out-of-core computing.
  • Presented a numerical example to illustrate how XGBoost works.
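Two of XGBoost's core formulas, the regularized optimal leaf weight w* = -G / (H + lambda) and the split gain, can be sketched as below (squared-error loss is assumed, so g_i = prediction - y_i and h_i = 1; the numbers are illustrative, not the notebook's example):

```python
def leaf_weight(gradients, hessians, lam=1.0):
    """Regularized optimal leaf weight: w* = -G / (H + lambda)."""
    return -sum(gradients) / (sum(hessians) + lam)

def split_gain(g_left, h_left, g_right, h_right, lam=1.0, gamma=0.0):
    """Gain = 1/2 [GL^2/(HL+lam) + GR^2/(HR+lam) - (GL+GR)^2/(HL+HR+lam)] - gamma."""
    def score(G, H):
        return G * G / (H + lam)
    GL, HL = sum(g_left), sum(h_left)
    GR, HR = sum(g_right), sum(h_right)
    return 0.5 * (score(GL, HL) + score(GR, HR) - score(GL + GR, HL + HR)) - gamma

# Squared-error loss: g_i = prediction - y_i, h_i = 1.
# Targets [10, 10, 20, 20] with a constant prediction of 15 (their mean):
g = [5.0, 5.0, -5.0, -5.0]
h = [1.0, 1.0, 1.0, 1.0]
print(split_gain(g[:2], h[:2], g[2:], h[2:]))   # positive: the split is worth taking
print(leaf_weight(g[:2], h[:2]))                # negative: pulls 15 back toward 10
```

The lambda term shrinks leaf weights and the gamma term penalizes extra leaves, which is how the regularization mentioned above reduces overfitting relative to plain gradient boosting.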

Additional Notes

  • All code and numerical examples are provided in the notebook.
  • The notebook includes detailed explanations of the code for KNN, SVM, GBT, and XGBoost, adhering to the structure and format provided in the lectures.
  • Each algorithm's slides and illustrations have been integrated into the notebook for better clarity and understanding.
