Transformations don’t always help, but when they do, they can improve your linear regression model in several ways simultaneously.
They can help you better meet the linear regression assumptions of normality and homoscedasticity (i.e., equal variances). They can also help you avoid some of the artifacts caused by boundary limits in your dependent variable — and sometimes even remove a difficult-to-interpret interaction.
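As a quick illustration, here is a minimal sketch (using simulated, right-skewed data and made-up variable names, not from the original post) of how a log transformation of the outcome can calm down heteroscedasticity:

```python
# Sketch: compare a model fit on raw y vs. log(y) for a multiplicative process.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.diagnostic import het_breuschpagan

rng = np.random.default_rng(42)
x = rng.uniform(1, 10, 200)
y = np.exp(0.3 * x + rng.normal(scale=0.4, size=x.size))  # multiplicative error -> fanning residuals

df = pd.DataFrame({"x": x, "y": y, "log_y": np.log(y)})

raw_fit = smf.ols("y ~ x", data=df).fit()
log_fit = smf.ols("log_y ~ x", data=df).fit()

# Breusch-Pagan test: a small p-value signals heteroscedasticity.
# Expect strong evidence for the raw model and much weaker evidence after the log transform.
print(het_breuschpagan(raw_fit.resid, raw_fit.model.exog))
print(het_breuschpagan(log_fit.resid, log_fit.model.exog))
```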
(more…)
Interpreting regression coefficients can be tricky, especially when the model has interactions or categorical predictors (or worse – both).
But there is a secret weapon that can help you make sense of your regression results: marginal means.
They’re not the same as descriptive stats. They aren’t usually included by default in our output. And they sometimes go by the name LS means, or least squares means.
And they’re your new best friend.
So what are these mysterious, helpful creatures?
What do they tell us, really? And how can we use them?
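To make the idea concrete before we dig in, here is a minimal sketch (simulated data and hypothetical variable names, not the post’s own example) of how marginal means from a model that adjusts for a covariate can differ from raw group means:

```python
# Sketch: raw group means vs. model-based marginal means when a covariate is unbalanced.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "group": rng.choice(["A", "B"], size=n, p=[0.3, 0.7]),
    "x": rng.normal(size=n),
})
df.loc[df.group == "B", "x"] += 1.0  # covariate is unbalanced across groups
df["y"] = (2.0 + 1.5 * df["x"]
           + np.where(df.group == "B", 0.5, 0.0)
           + rng.normal(scale=1.0, size=n))

model = smf.ols("y ~ group + x", data=df).fit()

# Raw (descriptive) group means -- confounded by the imbalance in x.
print(df.groupby("group")["y"].mean())

# Marginal means: average the model's predictions with every row assigned
# to each group in turn, so x is held at the same distribution for both.
for level in ["A", "B"]:
    print(level, model.predict(df.assign(group=level)).mean())
```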
(more…)
Have you ever experienced befuddlement when you dust off a data analysis that you ran six months ago? 
Ever gritted your teeth when your collaborator invalidates all your hard work by telling you that the data set you were working on had “a few minor changes”?
Or panicked when someone running a big meta-analysis asks you to share your data?
If any of these experiences rings true to you, then you need to adopt the philosophy of reproducible research.
(more…)
We often talk about nested factors in mixed models — students nested in classes, observations nested within subject.
But in all but the simplest designs, it’s not that straightforward. (more…)
Linear regression with a continuous predictor is set up to measure the constant relationship between that predictor and a continuous outcome.
This relationship is measured as the expected change in the outcome for each one-unit change in the predictor.
One big assumption in this kind of model, though, is that this rate of change is the same for every value of the predictor. That’s an assumption worth questioning, because it doesn’t hold for a lot of relationships.
Segmented regression allows you to generate different slopes and/or intercepts for different segments of values of the continuous predictor. This can provide you with a wealth of information that a non-segmented regression cannot.
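For a flavor of how this works, here is a minimal sketch (simulated data, with the breakpoint assumed known in advance) of adding a single “hinge” term so the slope is allowed to change at a chosen knot:

```python
# Sketch: segmented (piecewise) regression with one known breakpoint.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 10, 150))
knot = 5.0  # assumed, pre-specified breakpoint
y = 1.0 + 0.5 * x + 2.0 * np.maximum(x - knot, 0) + rng.normal(scale=0.5, size=x.size)

df = pd.DataFrame({"x": x, "y": y})
df["x_after_knot"] = np.maximum(df["x"] - knot, 0)  # 0 below the knot, (x - knot) above it

fit = smf.ols("y ~ x + x_after_knot", data=df).fit()
# Slope below the knot is the 'x' coefficient; above the knot it is 'x' + 'x_after_knot'.
print(fit.params)
```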
In this webinar, we will cover (more…)
One of the biggest challenges we face as data analysts is communicating statistical results to clients, advisors, and colleagues who don’t have a statistics background.
Unfortunately, the way that we learn statistics is not usually the best way to communicate our work to others, and many of us are left on our own to navigate what is arguably the most important part of our work.
In this webinar, we will cover how to: (more…)