Do You Understand Statistics?

A large majority of health and fitness claims substantiate their validity by citing (or often just anecdotally referencing) “studies”. A couple of questions: when you see “studies” referenced in an article, how often do you A. Click the link to see if it actually goes to a peer-reviewed study? B. Read at least the abstract of the study or studies? C. Examine the methods?

Secondly, how familiar are you with the terms “convenience sample,” “confidence interval,” “self-selection bias,” “blinding,” “p-value,” “correlation vs. causation,” “operational definition,” and “statistical power”? Not looking to call anyone out, just curious, as this is one of the main sources of information for strength, bodybuilding, and whatnot.
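For anyone rusty on a couple of those, here’s a toy illustration of a p-value and a 95% confidence interval. The “strength gain” numbers are completely invented, not from any real study:

```python
# Toy illustration only: fake "strength gain" numbers, not from any real study.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
supplement = rng.normal(loc=7.0, scale=4.0, size=20)  # gains (kg), supplement group
placebo = rng.normal(loc=5.0, scale=4.0, size=20)     # gains (kg), placebo group

# p-value: how surprising a difference this large would be if the true effect were zero
t_stat, p_value = stats.ttest_ind(supplement, placebo)

# 95% confidence interval for the supplement group's mean gain
ci_low, ci_high = stats.t.interval(
    0.95, df=len(supplement) - 1,
    loc=supplement.mean(), scale=stats.sem(supplement))

print(f"p-value: {p_value:.3f}")
print(f"95% CI for mean gain: ({ci_low:.2f}, {ci_high:.2f}) kg")
```

The point isn’t the particular numbers; it’s that the two quantities answer different questions: how surprising the data would be under “no effect,” versus a plausible range for the effect itself.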

[quote]Bronco_XIII wrote:
A large majority of health and fitness claims substantiate their validity by citing (or often just anecdotally referencing) “studies”. A couple of questions: when you see “studies” referenced in an article, how often do you A. Click the link to see if it actually goes to a peer-reviewed study? B. Read at least the abstract of the study or studies? C. Examine the methods? Secondly, how familiar are you with the terms “convenience sample,” “confidence interval,” “self-selection bias,” “blinding,” “p-value,” “correlation vs. causation,” “operational definition,” and “statistical power”? Not looking to call anyone out, just curious, as this is one of the main sources of information for strength, bodybuilding, and whatnot.[/quote]

Most people only read studies or articles if they feel the results will support their theories or ideas; otherwise they just pass over them until they find something that does.

I take with a grain of salt articles that say a study was done at the University of Kandahar or something, and that the exercises tested were lateral raises and the pullover machine.

[quote]Nards wrote:
I take with a grain of salt articles that say a study was done at the University of Kandahar or something, and that the exercises tested were lateral raises and the pullover machine.[/quote]

Don’t forget: with a sample of 12 retired nuns who smoke less than a pack a day and are self-rated as semi-sedentary.
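Joke aside, the sample size alone sinks a study like that. A quick power calculation (effect size and alpha here are my own assumptions, no real study behind them):

```python
# Sketch only: my own assumed effect size and alpha, not from any actual study.
from statsmodels.stats.power import TTestIndPower

power = TTestIndPower().power(effect_size=0.5, nobs1=12, alpha=0.05)
print(f"Power to detect a medium effect (d=0.5) with n=12 per group: {power:.2f}")
# Comes out around 0.2, i.e. roughly an 80% chance of missing a real
# medium-sized effect entirely.
```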

Do you understand life?

I have a bachelor’s degree in statistics lol. I’ve already forgotten more about statistics than most people will ever even learn. That being said, I never bother looking at the raw data or methods or whatever is available to back up some study related to strength training. I just assume it’s bullshit from the get go.

Relatively speaking there are very, very few studies on anything related to our subculture that have any validity to them lol. If it’s something legit, I figure I’ll hear about it in multiple studies from multiple places, at which point I may actually delve deeper (e.g. creatine).

I would also add multicollinearity to your list of statistics buzzwords. It’s an extremely important one that most people probably would not know to think about.
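For anyone who hasn’t run into it, here’s a quick made-up example (simulated data, not from any study) of what multicollinearity does to a regression:

```python
# Made-up data: x2 is nearly a copy of x1, so the individual coefficient
# estimates become meaningless even though the overall fit is fine.
import numpy as np

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)  # almost perfectly correlated with x1
y = 3 * x1 + rng.normal(size=n)           # only x1 actually matters

X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("coefficients (intercept, x1, x2):", np.round(beta, 2))
# Change the seed and the x1/x2 coefficients swing wildly (their sum stays
# near 3); the data simply can't tell the two predictors apart.
```

That’s why a paper reporting individual coefficients on a pile of correlated predictors (training volume, frequency, intensity…) deserves extra scrutiny.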

My problem in regard to studies is as follows: 1) Studies frequently contradict each other. One month, a study will come to one conclusion; the next month, another study comes out saying something completely opposite. 2) Studies are often redundant. Take, for example, a hypothetical study that says squatting produces massive gains. No shit; nobody needs a study to tell us what we already know. 3) Studies are often carried out by people with little to no real-world experience around weights. I could tell you more than some buck-thirty twigboy in a lab coat because I have years of experience.

I’ve learned a lot through trial and error, inquiry, and observation. I have real-world experience. I’m not saying I’m always gonna be right or that I speak the word of God; I’m just saying that I can offer my two cents from experience. What you do with that info is up to you. Question, argue, think. But for fuck’s sake, don’t just take the word of some lab monkey.

^^^OOH, MOAR!!

I’m familiar with everything he said, but never encountered that in my graduate program.

I think this is a worthy topic of discussion, mostly because very few people can critically evaluate research, which isn’t particularly difficult and should be stressed more in education. Maybe then we wouldn’t have every idiot and their brother thinking something has been proven when a correlation is found.

[quote]bjj and w8lift wrote:
My problem in regard to studies is as follows: 1) Studies frequently contradict each other. One month, a study will come to one conclusion; the next month, another study comes out saying something completely opposite. 2) Studies are often redundant. Take, for example, a hypothetical study that says squatting produces massive gains. No shit; nobody needs a study to tell us what we already know. 3) Studies are often carried out by people with little to no real-world experience around weights. I could tell you more than some buck-thirty twigboy in a lab coat because I have years of experience.

I’ve learned a lot through trial and error, inquiry, and observation. I have real-world experience. I’m not saying I’m always gonna be right or that I speak the word of God; I’m just saying that I can offer my two cents from experience. What you do with that info is up to you. Question, argue, think. But for fuck’s sake, don’t just take the word of some lab monkey.[/quote]
Your problem with studies indicates that you don’t understand the scientific method.

The biggest problem with the application of studies is that correlation is assumed to equal causation. This is particularly a problem in cross-sectional studies.

A study shows that A is correlated with B, and people want B. They start doing A. But in reality A is correlated with C, and C causes B. If you just read the study and start doing A, it won’t do jack, because you didn’t start doing C.

Now, this is really an inherent reality of the scientific method and not so much a flaw with how studies are done. However, people will read one study and then make conclusions without taking this into account and that is where the problems occur.
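To make that concrete, here’s a toy simulation (variables and effect sizes invented purely for illustration) where C drives both A and B; the naive regression finds a big “effect” of A that vanishes once C is accounted for:

```python
# Toy simulation (all numbers invented): C causes both A and B, so A and B
# correlate strongly even though doing A does nothing for B.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
C = rng.normal(size=n)                      # hidden common cause
A = C + rng.normal(scale=0.5, size=n)       # A merely tracks C
B = 2 * C + rng.normal(scale=0.5, size=n)   # B is caused by C, not by A

print("corr(A, B):", round(float(np.corrcoef(A, B)[0, 1]), 2))  # strong

# Naive regression of B on A finds a big "effect"...
print("B ~ A slope (naive):", round(float(np.polyfit(A, B, 1)[0]), 2))

# ...but adjust for C and A's coefficient collapses toward zero.
X = np.column_stack([np.ones(n), A, C])
beta, *_ = np.linalg.lstsq(X, B, rcond=None)
print("A coefficient once C is in the model:", round(float(beta[1]), 2))
```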

[quote]csulli wrote:
I have a bachelor’s degree in statistics lol. [/quote]

Awesome. I didn’t know that.

You know who used to be fantastic about looking up and interpreting the research for us? Anonym. When something would come up in the Nutrition Forum particularly, he was cool about that. In fact, I TFriended him so I could read his posts more easily, but he never responded to my friend request, so “Screw you, Anonym!” :slight_smile: I haven’t seen him around in a while. I have taken a few stats classes so I can get by, but I will get completely lost in the bio/medical/chem terminology when trying to read some of those studies.

I do love to teach Research Design in the Social Sciences to my little gifted eighth graders in the summers. They start out so trusting, but they are some jaded skeptics before I’m through. :slight_smile: They do get some appreciation for how difficult it is to do research with human subjects, and for how most of the numbers thrown around in ads or TV soundbites are just garbage. Junior high kids are the bomb. I just love ’em. We take them into an econ lab and let them “participate” in some research using dollar bills. So much fun.

^ One more thing. I like to tell them that I took a survey of all the women in my ballet class and found that 8 out of 10 of them prefer the color pink. So now I can assume that 8 out of 10 people prefer pink. … Yeah. They are pretty quick to point out the holes in that idea, but unfortunately that’s about the quality of a lot of the studies we hear cited on TV. Just teaching people about problems with sample selection and all the ways we can introduce bias would be a positive thing.
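That pink survey makes a nice simulation, too. Here’s a sketch (every percentage below is invented) of how a convenience sample can look precise and still be way off:

```python
# Sketch with invented percentages: a convenient subgroup can be wildly
# unrepresentative of the population it's supposed to stand in for.
import numpy as np

rng = np.random.default_rng(7)
population = rng.random(1_000_000) < 0.35  # say 35% of everyone prefers pink
ballet_class = rng.random(10) < 0.80       # but 80% of this tiny subgroup does

print("convenience-sample estimate:", ballet_class.mean())    # ~0.8
print("true population rate:", round(population.mean(), 3))   # ~0.35
# The arithmetic on the sample is fine; the problem is who got sampled.
```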

[quote]Silyak wrote:
The biggest problem with the application of studies is that correlation is assumed to equal causation. [/quote]

Are you telling me my anti-rhino-attack bracelet doesn’t work?!
Every lawyer in my building here in NYC is wearing one and there hasn’t been a single rhino attack since we started!

In my 13 months with a barbell, one thing is becoming clear to me. Once you get past the basic principles for success (eat, lift, recover), and especially once you get past the beginner phase of fast adaptation, lifting is voodoo. A dark art. Sorcery.

Building on what csulli said, no amount of statistical knowledge will demystify the countless cause-and-effect relationships at play in a tiny subset of people who advance past the (relatively) predictable beginner phase.

Compound this very small population with the timescales involved, years to decades, and you have a phenomenon that does not lend itself well to having, say, graduate students study in depth. Nor are results gained over the course of years and decades easy to reproduce. Have a new “theory” about the best way to total 2,000 as a natural powerlifter? Yeah… Start with an 18-year-old stick man and get back to me in 10 years with your test results.

So, when it comes to fitness and lifting in particular, I place a much higher value on anecdotes coming from the mouth of a big dumb lunk who has accomplished what I want to do than I would from the most heavily peer-reviewed study coming from weak geniuses who have not.

There is a lot of bullshit to wade through, and in the age of information we are perpetually knee-deep in it. Knowledge of statistics aside, I believe that having an intuitive feel for plucking good information out of the sea of bullshit is an increasingly valuable skill to have.

[quote]Bronco_XIII wrote:
A large majority of health and fitness claims substantiate their validity by citing (or often just anecdotally referencing) “studies”. A couple of questions: when you see “studies” referenced in an article, how often do you A. Click the link to see if it actually goes to a peer-reviewed study? B. Read at least the abstract of the study or studies? C. Examine the methods?

Secondly, how familiar are you with the terms “convenience sample,” “confidence interval,” “self-selection bias,” “blinding,” “p-value,” “correlation vs. causation,” “operational definition,” and “statistical power”? Not looking to call anyone out, just curious, as this is one of the main sources of information for strength, bodybuilding, and whatnot.[/quote]

In general, most of the time. Methods examination depends on whether I can get access to the full text of the article, but as to points A and B, almost all the time.

Sometimes, though, if it links to a news story (without study links) instead of the study itself, I don’t feel like fishing. I’ve got shit to do.

In general I’m pretty comfortable with standard statistical measures, coming from biochemistry. I’m not a statistician, however.

[quote]csulli wrote:
That being said, I never bother looking at the raw data or methods or whatever is available to back up some study related to strength training. I just assume it’s bullshit from the get go.
[/quote]

This too! Lol.

[quote]csulli wrote:
I have a bachelor’s degree in statistics lol. I’ve already forgotten more about statistics than most people will ever even learn. That being said, I never bother looking at the raw data or methods or whatever is available to back up some study related to strength training. I just assume it’s bullshit from the get go.

Relatively speaking there are very, very few studies on anything related to our subculture that have any validity to them lol. If it’s something legit, I figure I’ll hear about it in multiple studies from multiple places, at which point I may actually delve deeper (e.g. creatine).

I would also add multicollinearity to your list of statistics buzzwords. It’s an extremely important one that most people probably would not know to think about.[/quote]

I’ll one-up (or two-up) this with a Bachelor’s, Master’s, and PhD :stuck_out_tongue:

My personal dick-swinging aside, I’ve found your second paragraph to be true of more than just exercise-physiology studies. I’ve told several friends that a year in the working world of medical research has convinced me that a disturbingly large proportion of studies (basic science, epidemiology, and clinical trials) are analyzed anywhere from “slightly incorrectly” to “appallingly wrong,” and the peer-review process does not catch this unless a reviewer with decent statistical training happens to be picked.

Worse, a lot of times the primary data collection was done poorly, so even if the researcher is really sharp or has a good statistician, they might be analyzing data with lots of errors in it.

[quote]ActivitiesGuy wrote:
Worse, a lot of times the primary data collection was done poorly, so even if the researcher is really sharp or has a good statistician, they might be analyzing data with lots of errors in it.[/quote]

You mean minimum-wage or unpaid interns acting as primary data collectors might sometimes just turn in bullshit rather than actually collect real data?

[quote]jjackkrash wrote:

[quote]ActivitiesGuy wrote:
Worse, a lot of times the primary data collection was done poorly, so even if the researcher is really sharp or has a good statistician, they might be analyzing data with lots of errors in it.[/quote]

You mean minimum-wage or unpaid interns acting as primary data collectors might sometimes just turn in bullshit rather than actually collect real data?
[/quote]
lol

[quote]ActivitiesGuy wrote:
I’ve told several friends that a year in the working world of medical research has convinced me that a disturbingly large proportion of studies (basic science, epidemiology, and clinical trials) are analyzed anywhere from “slightly incorrectly” to “appallingly wrong,” and the peer-review process does not catch this unless a reviewer with decent statistical training happens to be picked.[/quote]

For the most part I work with things that are pretty discrete, where statistics doesn’t really come into play, and most published papers are pretty much the same; it either works and you have a theoretical proof of correctness… or it doesn’t. So my statistics is pretty weak.

I don’t know a whole lot about study design or the things that often get screwed up. What kinds of things do people typically get wrong?