Is anyone able to make a transcript? Or a summary would be amazing! #severeME
Some highlights from Brian Hughes' presentation:
"Every single thing I say about psychology can be said about the PACE trial and the way that this condition [ME] has been dealt with. And therefore I use is as sort of the climax of the whole book."
"..the claim was in 2011, that positive change had occurred as a result of CBT and exercise therapy, compared to standard medical care. And in 2013 it was even reported that 22% of patients in the trial who received CBT and or GET actually recovered from CFS."
"That by using this psychotherapy you are effectively reverse engineering the condition and fixing it".
"When this single study is treated as the final word on a topic then you are not dealing with good science per se because there is a big issue around replication. And science is a field of empirical study that relies on
replication."
"You just take a hundred studies you do them again and most of them do no result at all. So why does that happen?
One of the reasons that it happens, and it happens very much with the PACE trial, is what I call
Rampant methodological flexibility."
"..because there is no standard methodology in psychology research that means that it is very difficult to control what goes on in the research context.
And the PACE trial took advantage of rampant methodological flexibility in all sorts of ways."
"That flexibility is not good science it opens the door to confirmation bias, it opens the door to something that scholars call
Harking (hypothesising after the results are known)."
"
Moving the goal posts, we’ll hear about that a little bit later."
"
Fishing for findings: If you have a lot a data in a computer you can pull out a fraction of it and report that and make your study look very strong when in fact most of the data don’t show anything."
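As a toy illustration of that fishing problem, here is a minimal sketch (entirely made-up noise data, nothing to do with the PACE dataset): simulate a hundred outcome measures with no real effect, then report only the best-looking handful.

```python
import random

random.seed(1)

# 100 outcome measures that are pure noise: no real treatment effect,
# 30 participants per group, observed difference in group means recorded.
def fake_effect() -> float:
    treatment = [random.gauss(0, 1) for _ in range(30)]
    control = [random.gauss(0, 1) for _ in range(30)]
    return sum(treatment) / 30 - sum(control) / 30

effects = [fake_effect() for _ in range(100)]

# "Fishing": publish only the five largest differences and the study looks
# impressive, even though every single measure is noise.
print([round(e, 2) for e in sorted(effects, reverse=True)[:5]])

# The honest summary: across all 100 measures the average difference is ~0.
print(round(sum(effects) / len(effects), 2))
```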
"
Method blaming, which means that if your study finds something different to the other guys study you can say well its because they’re different methods, because no two studies are alike."
"So,
the PACE trial then;
What’s the basic problem with the PACE trial in scientific terms is that when you have this open-ended flexibility you end up with studies that are weak by design. Studies that rely on self-reported data require a thing called
blinding."
"So, when you have flexible non-standardised methods, you make the study design up yourself, you open the door to
unconscious biases by the researchers, perhaps
conscious biases in some cases"
"The PACE trial is full of problems. But I would simply say this, even if you knew nothing more about the PACE trial except that it is a non-blinded trial based on self-report you know enough to know that you cannot rely on that trial, that trial is not a good study"
"[The researchers] between collecting the data and publishing the paper they changed the criteria."
"..the protocol was published before data collection. So we all know that they
moved the goalposts."
"another problem here that we call the ‘
winners curse’.
Which is, when we do lots of studies or a study with lots of bits, the temptation is to look at the bits that worked, publish that and then quietly forget about the other bits."
"The boring findings, the non-findings they are in the researchers file drawer. We call that the
file-drawer problem."
"The PACE trial, the original study had three principal investigators. All of them have a working history of promoting CBT and cognitive non-biological theories in their field. Each of them have published books, prior to the PACE trial and they show their hand.
Their view is that CBT is the cure for lots of things, cure for, for example, chronic fatigue"
"So the risk here is that there is a bias, what we call a
therapeutic allegiance, in these people. That they were so wedded to their theories, that they pre-empted the data and interpreted all the data in a weird way to justify their prior assumptions."
"We are guilty of
confirmation bias all the time even when we have little grounds to draw conclusions.
And one of the problems here is, that we know from people who have looked at this in psychology research, if you have a strong expectancy about your research you are more likely to have the finding that you were looking for."
"....the PACE trial stopped being independently verified or independently replicated. All the studies all the papers emanating from the PACE trial dataset are by the same network of people. They are all connected."
"
Psychology has a measurement crisis"
"A regular study would triangulate. They would use the objective measure to allow or disallow the subjective measure, but that’s not what they do on PACE."
"I mentioned earlier that the researchers
moved the goal posts."
"So they had to defend themselves, and in the written report in the journal they said that the reason they did this is because they pitched too high to begin with. They were asking too much of patients. They were saying that if you had a score of 85, half the population wouldn’t have a score of 85.
It’s what they said, in writing.
And they literally point out, that threshold would mean that approximately half the general working age population would fall outside the normal range. So they said ‘we got it wrong we should never have said 85 so that’s why we’re reducing it to 60’."
"But they base this conclusion on prior data showing that the average score was 85. But it was the
mean average.
Now I don’t want to be patronising, but in school we learn the difference between the mean, the median and the mode. And on this scale, this fatigue scale, this general functioning scale, the mean is 85 but the median is close to 100. So it is simply inaccurate to say most people score either 95 or better on this scale.
It’s inaccurate to conclude that just because the average, the mean average is down at around the 85 point, that this means that half the population are above and half the population are below. "
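To make that mean-versus-median point concrete, here is a minimal sketch with purely illustrative numbers (not the real SF-36 population data): on a scale with a strong ceiling effect, the mean can sit around 85 while the median sits near the top, so far more than half of people score above the mean.

```python
from statistics import mean, median

# Hypothetical, illustrative scores only (not real SF-36 data):
# most people cluster at the top of the scale, a minority score much lower.
scores = [100] * 50 + [95] * 30 + [50] * 10 + [20] * 10

print(mean(scores))    # 85.5 -> the kind of "average" the researchers cited
print(median(scores))  # 97.5 -> the typical person scores near the ceiling

# Share of people scoring above the mean: 0.8, not 0.5, so
# "half the population falls below the mean" does not follow.
print(sum(s > mean(scores) for s in scores) / len(scores))
```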
"The PACE trial entry criterion was 65, so in order to be considered sick enough to take part you had to have a score of 65, but in order to be considered recovered in the published paper a score of 60 will do. Which means your score could go down and you would be considered to have improved."
"
sampling crisis"
"the PACE trial uses the Oxford criteria for determining whether or not people had ME or CFS."
"in the PACE trial 47% of the people wouldn’t meet the CCC for CFS.
So if the PACE trial was funded and conducted in Canada half the participants wouldn’t even be in the study because they wouldn’t be diagnosed with CFS."
"Finally then, it all culminates in a notion of
exaggeration. So even if, when you break it down, when you step back, there is an awful lot of information in the PACE trial, but when you step back and draw a picture this is what people come up with.
No group CBT, GET or control, no group stands out as having improved much better than all the others. So even if you just step away from all the noise, and all the debate and just look at the findings as they are, they are very, very modest.
And this is what I referred to as an
exaggeration crisis."
"Psychologists and clinical professionals, they want therapies to work.
And there is a big problem in therapy research, not just psychotherapy research, of over-optimistic interpretations of rather banal data."
"In a nutshell.
Psychology is full of potential, full of strengths, but the PACE trial and the ME controversies, CFS controversies, they put psychology in a negative light"
"..it’s a type of shame I feel when I hear my profession being talked about as a source of damage and a source of arrogance and a source of delusion. That it affects peoples lives."
"There is a lot of scepticism about bad science in psychology and a lot of concern that these types of cases get defended, sometimes at the highest level."
full transcript coming up
