1. Simple Linear Regression Model
(i) Review of the simple linear regression model: Y = β0 + β1X + ε, where ε is a continuous random variable with E(ε) = 0, V(ε) = σ2. Estimation of β0 and β1 by the method of least squares.
(ii) Properties of the estimators of β0 and β1.
(iii) Estimation of σ2.
(iv) Assumption of normality of ε. Tests of hypotheses about β1.
(v) Interval estimation in simple linear regression model.
(vi) Coefficient of determination.
(vii) Residual analysis: Standardized residuals, Studentized residuals, Residual plots.
(viii) Detection and treatment of outliers.
(ix) Interpretation of the four diagnostic plots produced by the lm command in R.
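As a companion to topics (i) and (vi) above, the least-squares estimates and the coefficient of determination can be computed by hand. The sketch below uses Python (the course itself uses R) and invented toy data; the function names are illustrative, not from any library.

```python
# Least-squares fit of Y = b0 + b1*X + e, standard library only.
# Data below are invented for illustration.

def fit_simple_ols(x, y):
    """Return (b0, b1): least-squares estimates of intercept and slope."""
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    sxx = sum((xi - xbar) ** 2 for xi in x)
    b1 = sxy / sxx          # slope: Sxy / Sxx
    b0 = ybar - b1 * xbar   # intercept: ybar - b1 * xbar
    return b0, b1

def r_squared(x, y, b0, b1):
    """Coefficient of determination R^2 = 1 - SSE/SST."""
    ybar = sum(y) / len(y)
    sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
    sst = sum((yi - ybar) ** 2 for yi in y)
    return 1 - sse / sst

x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 8.0, 9.9]   # roughly y = 2x
b0, b1 = fit_simple_ols(x, y)
r2 = r_squared(x, y, b0, b1)
```

For this toy data the slope comes out near 2 and R^2 is close to 1, since y is almost exactly linear in x.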
2. Multiple Linear Regression Model
(i) Review of the multiple linear regression model Y = β0 + β1X1 + ... + βpXp + ε, where ε is a continuous random variable with E(ε) = 0, V(ε) = σ2. Estimation of the regression parameters β0, β1, ..., βp by the method of least squares; obtaining the normal equations and their solutions.
(ii) Estimation of σ2.
(iii) Assumption of normality of ε. Tests of hypotheses about the regression parameters.
(iv) Interval estimation in the multiple linear regression model.
(v) Variable selection and model building.
(vi) Residual diagnostics and corrective measures such as transformation of response variable, weighted least squares method.
(vii) Polynomial regression models.
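Topic (i) above asks for the normal equations (X'X)b = X'y and their solution. A minimal stdlib-only Python sketch (the course uses R; data and function names here are invented) with two predictors:

```python
# Solving the normal equations (X'X)b = X'y for multiple regression.
# Toy data invented; no external libraries.

def transpose(m):
    return [list(col) for col in zip(*m)]

def matmul(a, b):
    return [[sum(u * v for u, v in zip(row, col)) for col in zip(*b)] for row in a]

def solve(a, rhs):
    """Gaussian elimination with partial pivoting; solves a x = rhs."""
    n = len(a)
    m = [row[:] + [rhs[i]] for i, row in enumerate(a)]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(m[i][k]))  # pivot row
        m[k], m[p] = m[p], m[k]
        for i in range(k + 1, n):
            f = m[i][k] / m[k][k]
            for j in range(k, n + 1):
                m[i][j] -= f * m[k][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):   # back-substitution
        x[i] = (m[i][n] - sum(m[i][j] * x[j] for j in range(i + 1, n))) / m[i][i]
    return x

# Design matrix with an intercept column; y built as 1 + 2*x1 + 3*x2 (no noise)
X = [[1, x1, x2] for x1, x2 in [(0, 1), (1, 0), (1, 1), (2, 1), (2, 3), (3, 2)]]
y = [1 + 2 * row[1] + 3 * row[2] for row in X]

Xt = transpose(X)
XtX = matmul(Xt, X)
Xty = [sum(Xt[i][k] * y[k] for k in range(len(y))) for i in range(len(Xt))]
beta = solve(XtX, Xty)   # estimates of (b0, b1, b2)
```

Because the toy response is exactly linear with no error term, the solution recovers the coefficients (1, 2, 3) that generated it.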
3. Logistic Regression Model
(i) Binary response variable, Logit transform, estimation of parameters, interpretation of parameters.
(ii) Tests of hypotheses about model parameters, model deviance, the likelihood ratio (LR) test.
(iii) AIC and BIC criteria for model selection.
(iv) Interpretation of the output produced by the glm command in R.
(v) Multiple logistic regression.
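The logit transform and maximum-likelihood estimation in topic (i) can be sketched in code. The example below fits logit(p) = b0 + b1*x by Newton-Raphson on the log-likelihood; it uses Python rather than R's glm, and the data and step count are invented for illustration.

```python
import math

def sigmoid(z):
    """Inverse of the logit transform: p = 1 / (1 + exp(-z))."""
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(x, y, steps=25):
    """Newton-Raphson MLE for the model logit(p) = b0 + b1*x."""
    b0, b1 = 0.0, 0.0
    for _ in range(steps):
        p = [sigmoid(b0 + b1 * xi) for xi in x]
        # Gradient (score) of the log-likelihood
        g0 = sum(yi - pi for yi, pi in zip(y, p))
        g1 = sum((yi - pi) * xi for xi, yi, pi in zip(x, y, p))
        # Observed information matrix (negative Hessian), 2x2
        w = [pi * (1 - pi) for pi in p]
        h00 = sum(w)
        h01 = sum(wi * xi for wi, xi in zip(w, x))
        h11 = sum(wi * xi * xi for wi, xi in zip(w, x))
        det = h00 * h11 - h01 * h01
        # Newton step: beta <- beta + information^{-1} * score
        b0 += (h11 * g0 - h01 * g1) / det
        b1 += (-h01 * g0 + h00 * g1) / det
    return b0, b1

x = [0, 1, 2, 3, 4, 5, 6, 7]
y = [0, 0, 0, 1, 0, 1, 1, 1]   # success more likely at larger x
b0, b1 = fit_logistic(x, y)
```

The fitted slope b1 is positive, so exp(b1) > 1: each unit increase in x multiplies the odds of success by exp(b1), which is the standard interpretation of a logistic regression coefficient.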