Course: Artificial Intelligence, University of Tehran
Assignment Due Date: May 23, 2024
This repository contains the work completed for Assignment 2 of the Artificial Intelligence course. The assignment focuses on understanding and implementing fundamental machine learning algorithms, providing numerical examples, and explaining their practical application.
- Explained the mathematical basis of KNN, including the computation of Euclidean distances and the prediction rules based on the nearest neighbors (simple average and distance-weighted average).
- Provided a clear numerical example demonstrating the prediction process for both one and three nearest neighbors.
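The KNN prediction process described above can be sketched as follows. This is a minimal illustration with hypothetical 1-D data, not the notebook's actual example; `knn_predict` and the training arrays are names introduced here for illustration.

```python
import numpy as np

# Hypothetical 1-D training data (features and targets); not from the notebook.
X_train = np.array([[1.0], [2.0], [3.0], [5.0], [6.0]])
y_train = np.array([2.0, 3.0, 4.0, 8.0, 9.0])

def knn_predict(x_query, k, weighted=False):
    """Predict a target for x_query from its k nearest neighbors."""
    # Euclidean distance from the query to every training point.
    dists = np.linalg.norm(X_train - x_query, axis=1)
    idx = np.argsort(dists)[:k]          # indices of the k closest points
    if not weighted:
        return y_train[idx].mean()       # simple average of neighbor targets
    # Weighted average: closer neighbors count more (inverse-distance weights).
    weights = 1.0 / (dists[idx] + 1e-9)  # small epsilon avoids division by zero
    return np.average(y_train[idx], weights=weights)

print(knn_predict(np.array([2.4]), k=1))                  # → 3.0 (single nearest neighbor)
print(knn_predict(np.array([2.4]), k=3))                  # → 3.0 (simple average of 3 neighbors)
print(knn_predict(np.array([2.4]), k=3, weighted=True))   # ≈ 3.195 (weighted average)
```

Note the two prediction modes mirror the assignment's one-neighbor and three-neighbor cases: with `k=1` the prediction is just the nearest target, while weighting shifts the three-neighbor average toward the closest points.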
- Discussed the concept of margins, decision boundaries, and the role of support vectors in SVMs.
- Explained the soft margin classification approach and the use of kernels for handling nonlinear datasets.
- Provided a numerical example of SVM using the Radial Basis Function (RBF) kernel, demonstrating how the algorithm works in practice.
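A brief sketch of the RBF-kernel idea, assuming scikit-learn is available. The kernel value is computed by hand for one pair of points, and the classifier is then fit on a synthetic two-circles dataset (a stand-in, not the notebook's example) where a linear boundary fails.

```python
import numpy as np
from sklearn.datasets import make_circles
from sklearn.svm import SVC

# The RBF kernel scores similarity as exp(-gamma * ||x - z||^2):
# identical points score 1, distant points decay toward 0.
def rbf(x, z, gamma=1.0):
    return np.exp(-gamma * np.sum((x - z) ** 2))

print(rbf(np.array([1.0, 0.0]), np.array([0.0, 0.0])))  # e^-1 ≈ 0.3679

# Synthetic nonlinear dataset (concentric circles); no linear separator exists.
X, y = make_circles(n_samples=200, noise=0.1, factor=0.3, random_state=0)

clf = SVC(kernel="rbf", gamma="scale", C=1.0)  # soft margin controlled by C
clf.fit(X, y)

print("training accuracy:", clf.score(X, y))
print("support vectors per class:", clf.n_support_)
```

The support vectors reported by `n_support_` are exactly the training points that define the margin, tying back to the margin discussion above.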
- Explained how Gradient Boosted Trees work as a sequential ensemble method that combines decision trees.
- Demonstrated the process of correcting residual errors iteratively to build a strong learner.
- Illustrated the concept with a numerical example.
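The residual-correction loop above can be sketched in a few lines, assuming scikit-learn for the weak learners. The tiny dataset and the learning rate here are hypothetical choices for illustration, not the assignment's numbers.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Tiny hypothetical regression set (not the assignment's example).
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([1.2, 1.9, 3.1, 3.9, 5.2])

lr = 0.5                           # learning rate (shrinkage)
pred = np.full_like(y, y.mean())   # initial prediction: the target mean
trees = []

for _ in range(20):
    residuals = y - pred                       # errors of the current ensemble
    tree = DecisionTreeRegressor(max_depth=1)  # weak learner: a decision stump
    tree.fit(X, residuals)                     # each tree fits the residuals...
    pred += lr * tree.predict(X)               # ...and corrects part of the error
    trees.append(tree)

print("final MSE:", np.mean((y - pred) ** 2))
```

Each iteration fits a stump to what the ensemble still gets wrong, so the training error shrinks round by round; the sequence of weak stumps accumulates into a strong learner.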
- Covered the principles of XGBoost, including its enhancements over traditional gradient boosting, such as:
  - Parallel tree building
  - Histogram-based algorithms for feature binning
  - Regularization to reduce overfitting
- Highlighted XGBoost’s handling of sparse data, weighted quantile sketching, and out-of-core computing.
- Presented a numerical example illustrating how XGBoost works in practice.
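The regularization mentioned above enters XGBoost through its split-gain formula, which can be evaluated directly. This sketch uses hypothetical gradients for six examples under squared-error loss (where each hessian is 1); `split_gain` is a name introduced here, not an XGBoost API.

```python
import numpy as np

# XGBoost's regularized gain for a candidate split:
#   gain = 1/2 * [G_L^2/(H_L+lam) + G_R^2/(H_R+lam) - (G_L+G_R)^2/(H_L+H_R+lam)] - gamma
# lam (L2 penalty on leaf weights) and gamma (per-split penalty) are the
# regularization terms that distinguish XGBoost from plain gradient boosting.
def split_gain(g_left, h_left, g_right, h_right, lam=1.0, gamma=0.0):
    def score(g, h):
        return g ** 2 / (h + lam)
    return 0.5 * (score(g_left, h_left) + score(g_right, h_right)
                  - score(g_left + g_right, h_left + h_right)) - gamma

# Hypothetical per-example gradients/hessians for one tree node.
g = np.array([-2.0, -1.5, -1.0, 0.5, 1.0, 2.0])  # first-order gradients
h = np.ones_like(g)                               # hessians = 1 for squared error

# Candidate split after the third example (sorted by the feature).
gain = split_gain(g[:3].sum(), h[:3].sum(), g[3:].sum(), h[3:].sum(),
                  lam=1.0, gamma=0.5)
print("gain:", gain)  # positive → the split improves the regularized objective
```

A larger `gamma` raises the bar a split must clear, and a larger `lam` shrinks every leaf's contribution; both push the trees toward simpler structure, which is how XGBoost curbs overfitting.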
- All code and numerical examples are provided in the notebook.
- The notebook includes detailed explanations of the code for KNN, SVM, GBT, and XGBoost, adhering to the structure and format provided in the lectures.
- Each algorithm's slides and illustrations have been integrated into the notebook for better clarity and understanding.