Our analysis of linear regression tends to focus on parameter estimates, test statistics, p-values, and confidence intervals. Rarely do we see a discussion of the sums of squares and the F statistic reported in the ANOVA table that sits above the coefficients and p-values.
And yet, they tell you a lot about your model and your data. Understanding the parts of that table and what they tell you is important for anyone running a regression or ANOVA model.
Can you explain these estimates? In this training, we discuss these often-overlooked calculations.
Specifically, we cover:
· The three sources of the sum of squares: model, residual, and total
· How the sums of squares are calculated (a short sketch follows this list)
· How the sums of squares change as we add predictors to the model
· How they affect the calculation of R-squared, standard errors, and confidence intervals
· The differences between Type I, Type II, and Type III sums of squares
· What the F-test is testing and how it compares to the t-test
· How to use the sums of squares to calculate effect size statistics
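To make the decomposition concrete, here is a minimal sketch (not taken from the webinar) that fits a simple linear regression on simulated data with plain NumPy and computes the model, residual, and total sums of squares, then R-squared and the overall F statistic from them. The data and variable names are purely illustrative.

```python
# Illustrative only: simulated data, one predictor, ordinary least squares.
import numpy as np

rng = np.random.default_rng(42)
n = 50
x = rng.normal(size=n)
y = 2.0 + 1.5 * x + rng.normal(scale=1.0, size=n)

# Fit y = b0 + b1*x by ordinary least squares.
X = np.column_stack([np.ones(n), x])
betas, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ betas

# The three sources of the sum of squares.
ss_total = np.sum((y - y.mean()) ** 2)      # total: y around its mean
ss_model = np.sum((y_hat - y.mean()) ** 2)  # model: fitted values around the mean
ss_resid = np.sum((y - y_hat) ** 2)         # residual: y around the fitted values

# The decomposition: SS_total = SS_model + SS_residual (up to rounding).
assert np.isclose(ss_total, ss_model + ss_resid)

# R-squared is the share of the total sum of squares explained by the model.
r_squared = ss_model / ss_total

# The overall F statistic is the ratio of the model and residual mean squares.
df_model = X.shape[1] - 1        # number of predictors (excluding the intercept)
df_resid = n - X.shape[1]
f_stat = (ss_model / df_model) / (ss_resid / df_resid)

print(f"SS model    = {ss_model:.2f}")
print(f"SS residual = {ss_resid:.2f}")
print(f"SS total    = {ss_total:.2f}")
print(f"R-squared   = {r_squared:.3f}")
print(f"F({df_model}, {df_resid}) = {f_stat:.2f}")
```

These are the same quantities that appear in the Model, Residual, and Total rows of the ANOVA table printed above the coefficient estimates in standard regression output.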
Upon completing the training, you will no longer fear being asked about these overlooked and incredibly useful estimates.
About the Instructor
Jeff Meyer is a statistical consultant and the Stata expert at The Analysis Factor. He teaches workshops and provides Stata examples for a number of our workshops, including Intro to Stata, Missing Data, and Repeated Measures.
Jeff has an MBA from the Thunderbird School of Global Management and an MPA with a focus on policy from NYU Wagner School of Public Service.
Just head over and sign up for Statistically Speaking. You'll get access to this training webinar, 130+ other stats trainings, and a pathway to work through the trainings you need, plus the expert guidance to build statistical skill through live Q&A sessions and an ask-a-mentor forum.