Andy
Senior Member (Voting rights)
https://retractionwatch.com/2019/01...plication-in-the-literature-says-a-new-study/

How seriously are journals taking duplicated work that they publish? That was the question Mario Malički and colleagues set out to answer six years ago. And last month, they published their findings in Biochemia Medica.
The upshot? Journals have a lot of work to do.
Since we’re often asked why duplication is a problem, we’ll quote from the new paper: Duplication “can inflate an author’s or journal’s prestige, but wastes time and resources of readers, peer reviewers, and publishers. Duplication of data can also lead to biased estimates of efficacy or safety of treatments and products in meta-analyses of health interventions, as the same data which is calculated twice exaggerates the accuracy of the analysis, and leaves an impression that more patients were involved in testing a drug. Not referencing the origin or the overlap of the data, can therefore be considered akin to fabrication, as it implies the data or information is new, when in fact it is not.”
RW: What were your main findings?
MM: Our main finding is that duplicate publications are largely not addressed: in our study, only 54% (n=194 of 359) of duplicates were addressed by journals, and only 9% (n=33) were retracted, although all of them should have been retracted according to editorial standards (e.g., those of COPE).