
Member Training: Hierarchical Regressions

April 1st, 2013

Hierarchical regression is a very common approach to model building that allows you to see the incremental contribution of sets of predictor variables to a model.

Popular for linear regression in many fields, the approach can be used in any type of regression model: logistic regression, linear mixed models, or even ANOVA.
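To make the idea concrete, here is a minimal sketch in Python using statsmodels, with simulated data and hypothetical variable names (not the webinar's materials): a predictor of interest is added to a logistic regression on top of a set of controls, and a likelihood-ratio test measures its incremental contribution.

```python
# A minimal sketch of hierarchical (sequential) logistic regression.
# All data and variable names here are simulated and hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(42)
n = 500
df = pd.DataFrame({
    "age": rng.normal(40, 10, n),
    "income": rng.normal(50, 15, n),
    "treatment": rng.integers(0, 2, n),
})
# Simulate a binary outcome that depends on all three predictors
logit_p = -2 + 0.02 * df["age"] + 0.01 * df["income"] + 0.8 * df["treatment"]
df["outcome"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Step 1: controls only
m1 = smf.logit("outcome ~ age + income", data=df).fit(disp=0)
# Step 2: add the predictor of interest
m2 = smf.logit("outcome ~ age + income + treatment", data=df).fit(disp=0)

# Likelihood-ratio test of the incremental contribution of the new set
lr_stat = 2 * (m2.llf - m1.llf)
p_value = stats.chi2.sf(lr_stat, df=m2.df_model - m1.df_model)
print(f"LR chi-square = {lr_stat:.2f}, p = {p_value:.4f}")
```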

In this webinar, we’ll go over the concepts and steps, and we’ll look at how it can be useful in different contexts.


Note: This training is an exclusive benefit to members of the Statistically Speaking Membership Program and part of the Stat’s Amore Trainings Series. Each Stat’s Amore Training is approximately 90 minutes long.
Not a Member? Join!

About the Instructor

Karen Grace-Martin helps statistics practitioners gain an intuitive understanding of how statistics is applied to real data in research studies.

She has guided and trained researchers through their statistical analysis for over 15 years as a statistical consultant at Cornell University and through The Analysis Factor. She has master’s degrees in both applied statistics and social psychology and is an expert in SPSS and SAS.

Not a Member Yet?
It’s never too early to set yourself up for successful analysis with support and training from expert statisticians.

Just head over and sign up for Statistically Speaking.

You'll get access to this training webinar, 130+ other stats trainings, and a pathway through the trainings you need, plus the expert guidance to build statistical skill with live Q&A sessions and an ask-a-mentor forum.


Confusing Statistical Term #4: Hierarchical Regression vs. Hierarchical Model

December 21st, 2009

This one is relatively simple. Very similar names for two totally different concepts.

Hierarchical Models (aka Hierarchical Linear Models or HLM) are a type of linear regression model in which the observations fall into hierarchical, or completely nested, levels.

Hierarchical Models are a type of Multilevel Model.

So what is a hierarchical data structure, which requires a hierarchical model?

The classic example is data from children nested within schools.  The dependent variable could be something like math scores, and the predictors a whole host of things measured about the child and the school.

Child-level predictors could be things like GPA, grade, and gender. School-level predictors could be things like total enrollment, private vs. public status, and mean SES.

Because multiple children are measured from the same school, their measurements are not independent.  Hierarchical modeling takes that into account.
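As a rough illustration, here is a minimal sketch of such a model in Python using statsmodels' MixedLM. The data file and variable names are hypothetical; a random intercept for school accounts for the non-independence of children measured within the same school.

```python
# A minimal sketch of a hierarchical (multilevel) model for the classic
# children-within-schools example. File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("school_data.csv")  # one row per child, with a school id

# Fixed effects include both child-level (gpa, gender) and school-level
# (enrollment, private) predictors; the random intercept for school
# captures the correlation among children from the same school.
model = smf.mixedlm(
    "math_score ~ gpa + gender + enrollment + private",
    data=df,
    groups=df["school_id"],
)
result = model.fit()
print(result.summary())
```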

Hierarchical regression, on the other hand, is a model-building technique that can be used with any type of regression model. It is the practice of building successive regression models, each adding more predictors.

For example, one common practice is to start by adding only demographic control variables to the model. In the next model, you add the predictors of interest, to see if they predict the DV above and beyond the effect of the controls.

You’re actually building separate but related models in each step. But SPSS has a nice function that will compare the models and test whether each successive model fits better than the previous one.
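The same comparison can be sketched in other software too. Here is a minimal, hypothetical sketch in Python with statsmodels (data file and variable names invented for illustration): fit the successive OLS models, then run an F-test on the nested pair, analogous to SPSS's R-squared change test.

```python
# A minimal sketch of hierarchical regression as nested OLS models.
# File and variable names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.read_csv("survey.csv")

# Step 1: demographic controls only
m1 = smf.ols("dv ~ age + gender + income", data=df).fit()
# Step 2: add the predictors of interest
m2 = smf.ols("dv ~ age + gender + income + attitude + motivation", data=df).fit()

# F-test comparing the nested models (the R-squared change test)
print(anova_lm(m1, m2))
print(f"R-squared change: {m2.rsquared - m1.rsquared:.3f}")
```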

So hierarchical regression is really just a series of regular old OLS regression models. Nothing fancy.

Confusing Statistical Terms #1: Independent Variable

Confusing Statistical Terms #2: Alpha and Beta

Confusing Statistical Terms #3: Levels