
Learn to predict categorical outcomes using two powerful machine learning algorithms: Logistic Regression and Decision Trees. You'll understand how they work, when to use them, and how to evaluate their performance.
Predicting with Probability
After this session, you'll be able to explain how Logistic Regression predicts categories by estimating probabilities, not just drawing a line.
5 min
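A minimal sketch of the idea in this lesson, with invented numbers: the sigmoid function squashes a raw score into a probability between 0 and 1, and a 0.5 cutoff turns that probability into a class label. The weights `w`, `b` and the "hours studied" feature are made up for illustration, not taken from any real dataset.

```python
import math

def sigmoid(z):
    # Squash any real-valued score into the (0, 1) range
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical learned weights for a single feature (hours studied)
w, b = 1.2, -4.0

for hours in (1, 3, 5):
    p = sigmoid(w * hours + b)          # probability of the positive class
    label = "pass" if p >= 0.5 else "fail"
    print(f"hours={hours}: P(pass)={p:.2f} -> {label}")
```

Note that the model outputs a probability first; the hard "pass/fail" label only appears when we apply a threshold.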
Logistic Regression Deep Dive
You'll be able to explain how Logistic Regression separates data with a decision boundary and understand what its coefficients mean.
5 min
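The decision boundary described here can be sketched directly: with two features, it is the line where the model's score `w1*x1 + w2*x2 + b` equals zero, i.e. where the predicted probability is exactly 0.5. The coefficients below are invented for illustration; a positive coefficient means increasing that feature pushes predictions toward class 1.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients for a two-feature model
w1, w2, b = 0.8, -0.5, 0.1

def predict_proba(x1, x2):
    return sigmoid(w1 * x1 + w2 * x2 + b)

def boundary_x2(x1):
    # Solve w1*x1 + w2*x2 + b = 0 for x2: every point on this
    # line gets a predicted probability of 0.5.
    return -(w1 * x1 + b) / w2

x1 = 2.0
x2_on_boundary = boundary_x2(x1)
print(predict_proba(x1, x2_on_boundary))      # ~0.5, right on the line
# w1 > 0, so increasing x1 moves the point toward class 1:
print(predict_proba(x1 + 1, x2_on_boundary))  # > 0.5
```

This is why Logistic Regression is called a linear classifier: however the probabilities curve, the boundary between the two predicted classes is a straight line (or hyperplane).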
How Decision Trees Work
You'll be able to explain how a Decision Tree makes predictions by asking a series of simple questions, much like a flowchart.
5 min
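The flowchart analogy can be written out literally: a fitted tree is just a nest of yes/no questions. The feature names and thresholds below are invented for illustration, but the structure is exactly what a Decision Tree learns.

```python
# A hand-written "tree": each question narrows things down further,
# exactly like following a flowchart from the root to a leaf.
def predict_play_outside(temp_c, is_raining):
    if is_raining:        # first question, asked at the root
        return "stay in"
    if temp_c < 10:       # second question, on the dry branch
        return "stay in"
    return "go out"       # leaf: no more questions to ask

print(predict_play_outside(20, False))  # go out
print(predict_play_outside(5, False))   # stay in
print(predict_play_outside(25, True))   # stay in
```

A real tree learner chooses which questions to ask, and in what order, automatically from the training data; the prediction procedure is the same nested if/else walk.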
Evaluating Classifiers
You'll be able to interpret a Confusion Matrix and explain why metrics like Precision, Recall, and F1-score are crucial for evaluating classifiers, especially with imbalanced data.
5 min
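The metrics named here can all be read off a Confusion Matrix. A tiny invented imbalanced example shows why accuracy alone misleads: the toy predictions below score 80% accuracy, yet catch only half of the rare positive class.

```python
# Toy imbalanced data: 1 is the rare positive class (2 of 10 examples)
y_true = [0, 0, 0, 0, 0, 0, 0, 0, 1, 1]
y_pred = [0, 0, 0, 0, 0, 0, 0, 1, 1, 0]

# The four cells of the confusion matrix
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)

accuracy = (tp + tn) / len(y_true)
precision = tp / (tp + fp)   # of predicted positives, how many were right?
recall = tp / (tp + fn)      # of actual positives, how many did we find?
f1 = 2 * precision * recall / (precision + recall)

print(f"accuracy={accuracy:.2f} precision={precision:.2f} "
      f"recall={recall:.2f} f1={f1:.2f}")
```

Here accuracy is 0.80 while precision, recall, and F1 are all 0.50, which is exactly the gap these metrics exist to expose.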
Decision Tree Growth
You'll be able to explain how Decision Trees choose splits and how to prevent them from growing so complex that they memorize the training data.
5 min
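How a tree "chooses splits" can be sketched with one common impurity measure, Gini impurity: the tree picks the split whose children are, on weighted average, the purest. (Other criteria such as entropy exist; this is a minimal illustration with made-up labels, and limits like a maximum depth or a minimum leaf size are the usual guards against memorizing the data.)

```python
def gini(labels):
    # Gini impurity: chance of mislabeling a random sample if we
    # label it according to the class proportions in this node.
    n = len(labels)
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def split_quality(left, right):
    # Weighted impurity of the children; the learner picks the
    # candidate split that minimizes this value.
    n = len(left) + len(right)
    return (len(left) / n) * gini(left) + (len(right) / n) * gini(right)

# A clean split separates the classes perfectly: impurity drops to 0.
print(split_quality([0, 0, 0], [1, 1, 1]))  # 0.0
# A poor split leaves the children nearly as mixed as the parent.
print(split_quality([0, 1, 0], [1, 0, 1]))
```

Growing without limits, a tree can keep splitting until every leaf is pure, which is exactly the memorization this lesson warns about.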
Classifying Algorithms Compared
You'll be able to compare Logistic Regression and Decision Trees, and understand when to choose one over the other for different classification problems.
5 min
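One concrete way to see the trade-off this lesson compares: an XOR-style interaction, where the positive class is "exactly one feature is high". No single straight line (the shape of a Logistic Regression boundary) can separate these four points, but two nested tree questions can. The data and thresholds are contrived for illustration.

```python
# XOR-style data: class 1 when exactly one feature is "high".
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def linear_predict(x, w1, w2, b):
    # A hard linear decision: which side of the line are we on?
    return 1 if w1 * x[0] + w2 * x[1] + b >= 0 else 0

def tree_predict(x):
    # Two nested questions (axis-aligned splits) capture the interaction.
    if x[0] < 0.5:
        return 1 if x[1] >= 0.5 else 0
    return 0 if x[1] >= 0.5 else 1

# A coarse grid search over lines: none classifies all four points...
best_linear = max(
    sum(linear_predict(x, w1, w2, b) == y for x, y in data)
    for w1 in (-1, 1) for w2 in (-1, 1) for b in (-1.5, -0.5, 0.5, 1.5)
)
# ...but the two-question tree gets all four.
tree_acc = sum(tree_predict(x) == y for x, y in data)
print(best_linear, tree_acc)  # 3 4
```

The flip side, not shown here, is that the linear model's single smooth boundary often generalizes better, and its coefficients are easier to interpret, when the true relationship really is close to linear.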
Explain how Logistic Regression uses probability to classify data points.
Identify the strengths and limitations of Logistic Regression for classification tasks.
Describe the fundamental decision-making process of a Decision Tree.
Understand the concept of overfitting in Decision Trees and methods to mitigate it.
Differentiate between key classification evaluation metrics like accuracy, precision, recall, and F1-score.
Interpret a Confusion Matrix and its components.
Choose an appropriate classification algorithm (Logistic Regression vs. Decision Tree) for a given problem scenario.
Explain the importance of ROC AUC for evaluating classifier performance across different thresholds.
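For the last objective above, ROC AUC has a useful equivalent reading: it is the probability that a randomly chosen positive example receives a higher score than a randomly chosen negative one (ties counting half), which is why it summarizes performance across all thresholds at once. A minimal pairwise implementation, on invented scores:

```python
def roc_auc(y_true, scores):
    # AUC as a ranking probability: fraction of (positive, negative)
    # pairs where the positive example is scored higher (ties = 0.5).
    pos = [s for y, s in zip(y_true, scores) if y == 1]
    neg = [s for y, s in zip(y_true, scores) if y == 0]
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in pos for n in neg
    )
    return wins / (len(pos) * len(neg))

y_true = [0, 0, 1, 0, 1, 1]
scores = [0.1, 0.4, 0.35, 0.8, 0.65, 0.9]
print(roc_auc(y_true, scores))
```

Because AUC depends only on how the scores rank the examples, no threshold ever appears in the computation: a perfect ranker scores 1.0 and a random one about 0.5.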