
OPEN QUESTIONS CHAPTER 4


License: Unspecified / Included

Information

Category: nCreator TI-Nspire
Author: SPITZER2001
Type: Classeur 3.0.1
Page(s): 1
Size: 4.73 KB
Uploaded: 05/05/2025 - 22:44:48
Uploader: SPITZER2001 (Profile)
Downloads: 7
Visibility: Public archive
Shortlink: https://tipla.net/a4621080

Description

Nspire file generated on TI-Planet.org.

Compatible with OS 3.0 and later.

<<
What is the main limitation of using linear regression for binary classification problems?
Linear regression outputs continuous values that are not confined to the interval [0, 1], which makes it unsuitable for predicting probabilities. Moreover, it does not model the log-odds relationship that classification tasks require.

How does logistic regression address the limitation of linear regression in classification tasks?
Logistic regression uses the sigmoid function to constrain output values between 0 and 1, allowing them to be interpreted as probabilities. This makes it appropriate for binary classification.

Explain the mathematical form of the logistic (sigmoid) function. What is its range and interpretation?
The sigmoid function is defined as σ(z) = 1 / (1 + e^(-z)). Its output lies strictly between 0 and 1, making it interpretable as a probability.

What is the decision boundary in logistic regression and how is it determined?
The decision boundary is the set of input features for which the predicted probability equals 0.5. It is typically defined by the inputs where the linear combination z = θᵀx = 0.

What role does the log-odds (logit) transformation play in logistic regression?
Logistic regression models the log-odds of the probability as a linear function of the features: log(p / (1 - p)) = θᵀx. This ensures the model captures a linear relationship in the transformed (log-odds) space.

How do you interpret the coefficients of a logistic regression model?
Each coefficient represents the change in the log-odds of the outcome for a one-unit increase in the corresponding feature, holding all other features constant.

Describe the difference between odds, probability, and log-odds in the context of logistic regression.
Probability is a value between 0 and 1. Odds = p / (1 - p), representing the ratio of success to failure. Log-odds = log(p / (1 - p)), which logistic regression models as a linear function of the features.
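The sigmoid, decision boundary, and logit transformation described above can be sketched in plain Python (a minimal illustration added for clarity; the function names are ours, not from the original file):

```python
import math

def sigmoid(z):
    """Logistic function sigma(z) = 1 / (1 + e^(-z)); output lies in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def logit(p):
    """Inverse of the sigmoid: the log-odds log(p / (1 - p))."""
    return math.log(p / (1.0 - p))

# At z = 0 the predicted probability is exactly 0.5: the decision boundary.
boundary_prob = sigmoid(0.0)

# Round trip: applying logit to a sigmoid output recovers the original z,
# which is why logistic regression is linear in log-odds space.
recovered = logit(sigmoid(1.7))
```

The round trip illustrates the key identity: the model is nonlinear in probability space but exactly linear in log-odds space.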
Why is the Mean Squared Error (MSE) not suitable as a loss function for logistic regression? What is used instead?
When combined with the sigmoid, MSE yields a non-convex optimization problem, making training unstable. Binary cross-entropy (log-loss) is used instead, which is convex and better suited to classification tasks.

What is the binary cross-entropy (log-loss) function? Provide its formula and explain its purpose.
The log-loss function is L = -[y log(p) + (1 - y) log(1 - p)]. It penalizes incorrect predictions by measuring the divergence between predicted probabilities and actual labels.

What is the gradient descent update rule for logistic regression parameters?
θ := θ - η ∇L(θ), where ∇L(θ) is the gradient of the loss function (log-loss) and η is the learning rate. The gradient is computed with respect to each parameter θⱼ.

How is the sigmoid function involved in the gradient computation for logistic regression?
The sigmoid function transforms the linear output θᵀx into a probability, which is then used to compute the gradient of the loss with respect to the weights.

Why is feature scaling often important when applying logistic regression?
Feature scaling helps gradient descent converge faster and prevents features with larger numerical ranges from dominating the learning process.

What are some indicators that a logistic regression model may be underfitting or overfitting?
Underfitting: high training error and low accuracy on both training and test sets. Overfitting: high accuracy on training data but poor performance on unseen data.

What does regularization do in logistic regression and why is it necessary?
Regularization adds a penalty on large coefficient values to the loss function, reducing model complexity and preventing overfitting.

Compare L1 and L2 regularization in logistic regression. What are the main differences in outcome?
L1 (Lasso) adds a |θ| penalty, resulting in sparse models where some coefficients become exactly zero. L2 (Ridge) adds a θ² penalty, shrinking all coefficients smoothly but rarely zeroing them.

Explain how logistic regression can be extended to handle multi-class classification problems.
By using One-vs-Rest (OvR) or multinomial logistic regression. OvR trains one binary classifier per class, while multinomial logistic regression treats all classes jointly in a softmax-based framework.

What is the One-vs-Rest (OvR) strategy in multi-class logistic regression? How does it work?
OvR trains one logistic regression model for each class against all others. During inference, the class whose model outputs the highest probability is selected as the prediction.

Describe how multinomial logistic regression differs from binary logistic regression.
Instead of modeling one logit, it models a set of logits, one per class, combined through a softmax function, allowing simultaneous classification into multiple categories.

What metrics can be used to evaluate the performance of a logistic regression model on classification tasks?
Accuracy, precision, recall, F1-score, ROC-AUC, and log-loss are commonly used metrics.
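The log-loss and the gradient-descent update rule discussed above can be sketched in plain Python (a minimal illustration on made-up toy data; `gd_step` is a hypothetical helper name, not part of any library):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def log_loss(y, p, eps=1e-12):
    """Binary cross-entropy for a single example: -[y log p + (1-y) log(1-p)]."""
    p = min(max(p, eps), 1.0 - eps)           # clip to avoid log(0)
    return -(y * math.log(p) + (1 - y) * math.log(1.0 - p))

def gd_step(theta, X, y, lr=0.5):
    """One batch update: theta := theta - lr * gradient of the mean log-loss."""
    n = len(X)
    grad = [0.0] * len(theta)
    for xi, yi in zip(X, y):
        p = sigmoid(sum(t * x for t, x in zip(theta, xi)))
        for j, xj in enumerate(xi):
            grad[j] += (p - yi) * xj / n      # d(loss)/d(theta_j) = (p - y) * x_j
    return [t - lr * g for t, g in zip(theta, grad)]
```

For log-loss the gradient reduces to the simple form (p - y)·x per example, which is exactly what the inner loop accumulates; this is one reason log-loss is preferred over MSE here.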
[...]
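The softmax step behind multinomial logistic regression, mentioned above, can be sketched as follows (a minimal, self-contained illustration; the logit values are arbitrary examples):

```python
import math

def softmax(logits):
    """Turn a vector of per-class logits into probabilities summing to 1."""
    m = max(logits)                          # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Prediction: the class with the highest probability wins.
probs = softmax([2.0, 1.0, 0.1])
predicted_class = max(range(len(probs)), key=probs.__getitem__)
```

With two classes, softmax reduces to the sigmoid, which is why binary logistic regression is the special case of the multinomial model.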

>>
