When scientists have found something out, when can you actually rely on it? One answer: when peers have reviewed the study. Another: when it was published in a renowned journal. But sometimes even both together are not enough, as researchers have now shown, in the best and most elaborate way possible: they repeated the underlying experiments and checked whether the same results came out again.
The test concerned 21 social science studies from the journals Nature and Science. No reputation is higher. And of course, submitted work is examined by experts (peer review). Nevertheless, in almost 40 percent of cases the same result did not come out again; mostly, nothing came out at all.
"I would have expected a better result; after all, this was about Nature and Science," says John Ioannidis, a medical statistician at Stanford University. He is one of the most distinguished fighters against bad science and is usually not squeamish with his colleagues. The study check-up was initiated by the American social psychologist Brian Nosek, founder of the Center for Open Science. He says: "It could also be that in such magazines the work is less solid, because top journals prefer sexy results."
What kind of results were these, exactly? A few examples:
It all sounds sexy: the results are surprising, they seem to reveal unconscious influences on our behavior, and they are close to everyday life and to current issues, such as whether we rely too much on Google. Yet none of these results could be confirmed (in technical jargon: replicated). This does not necessarily mean they are false, sloppy, or faked. But it does mean you cannot rely on them.
Even when similar effects did occur in the repeated experiments, they were noticeably smaller than in the originals, on average only three-quarters of the size. If the non-replicable studies are included in the calculation, the average effect across all repetitions even shrinks to half. That is why research critic John Ioannidis says: "If you read an article about a social science experiment in Nature or Science, you have to cut the effect in half."
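The shrinkage arithmetic above can be sketched with a few lines of code. The effect sizes below are hypothetical, chosen only to illustrate how "three-quarters among successful replications" and "half across all studies" relate; they are not the study's actual data.

```python
# Illustrative sketch of effect-size shrinkage in replications.
# All numbers are hypothetical; the article reports only the averages.

# (original effect, replication effect, replicated?)
# Failed replications are recorded with an effect near zero.
studies = [
    (0.40, 0.30, True),
    (0.50, 0.38, True),
    (0.32, 0.24, True),
    (0.45, 0.02, False),   # did not replicate
    (0.38, 0.01, False),   # did not replicate
]

# Among successful replications, effects average about 3/4 of the original.
successes = [(o, r) for o, r, ok in studies if ok]
ratio_success = sum(r / o for o, r in successes) / len(successes)

# Counting the failed replications too pulls the average down toward 1/2.
ratio_all = sum(r / o for o, r, _ in studies) / len(studies)

print(f"average ratio among replicated studies: {ratio_success:.2f}")
print(f"average ratio across all studies:       {ratio_all:.2f}")
```

The point of the second average is that a failed replication is not neutral: including its near-zero effect drags the overall estimate down, which is what motivates Ioannidis's rule of thumb about cutting reported effects in half.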
In similar check-up projects, 64 percent of psychology studies and 39 percent of economics studies could not be confirmed. In pharmaceutical and cancer research, spot checks showed even higher failure rates. And even the supposedly hard sciences have problems, says Ioannidis: "There are areas in chemistry where it doesn't look much better." Brian Nosek is not surprised. "The incentives are the same in all disciplines: you have to publish a lot, in the most prestigious journals, and for that you need exciting results that can be told as a simple story." An old problem.
Publish Date: Thursday, 30 August 2018, 12:00