
Video: Is Most Published Research Wrong?

Chaosism
Posts: 2,669
8/11/2016 3:45:26 PM
YouTube: https://www.youtube.com...

I don't really have an opinion on this video, yet, but I'd like to hear if other people 'round here do. I found it interesting, nonetheless.

Related: APS Registered Replication Reports (http://www.psychologicalscience.org...)
Fkkize
Posts: 2,149
8/11/2016 9:48:47 PM
Social nihilism ho!
: At 7/2/2016 3:05:07 PM, Rational_Thinker9119 wrote:
:
: space contradicts logic
Mhykiel
Posts: 5,987
8/12/2016 2:58:47 AM
At 8/11/2016 3:45:26 PM, Chaosism wrote:

YouTube: https://www.youtube.com...

I don't really have an opinion on this video, yet, but I'd like to hear if other people 'round here do. I found it interesting, nonetheless.

Related: APS Registered Replication Reports (http://www.psychologicalscience.org...)

I read about this kind of thing months ago. Thanks for a well-made video I can share with friends.

It's something to keep in mind when someone only has the results of one study to rely on.

Everyone says science is reproducible, but if an experiment is never actually reproduced, its conclusions can still be questionable.

It seems that results only get repeated and examined when they go against the consensus or against normalcy bias.

Such as when an experiment concluded that neutrinos were traveling faster than light through rock. If that result had been in line with expected values, the faulty equipment would have gone undiscovered for much longer.
RainbowDash52
Posts: 294
8/12/2016 6:05:47 PM
At 8/11/2016 3:45:26 PM, Chaosism wrote:

YouTube: https://www.youtube.com...

I don't really have an opinion on this video, yet, but I'd like to hear if other people 'round here do. I found it interesting, nonetheless.

Related: APS Registered Replication Reports (http://www.psychologicalscience.org...)

I think this video does a very good job of explaining why mainstream science is unreliable.

One thing I would like to add is that this video demonstrates that science can't even reliably draw conclusions from relatively simple statistical experiments, so how can we expect it to draw more complex conclusions on topics like man-made global warming, the existence of dark matter, or even evolution from common descent?
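For context on what the video is actually demonstrating, here is a rough back-of-the-envelope version of the arithmetic it leans on, in Python. The numbers below are illustrative assumptions, not figures quoted from the video or measured for any particular field: with a 5% significance threshold, limited statistical power, and a low base rate of true hypotheses, a large share of "significant" findings will be false positives even when every individual test is run honestly.

# Back-of-the-envelope false-positive arithmetic (illustrative assumptions only,
# not figures taken from the video or from any particular field).
hypotheses = 1000      # hypotheses put to the test
base_rate  = 0.10      # fraction of them that are actually true
power      = 0.80      # chance a real effect is detected at p < 0.05
alpha      = 0.05      # chance a null effect crosses p < 0.05 anyway

true_positives  = hypotheses * base_rate * power           # 80
false_positives = hypotheses * (1 - base_rate) * alpha     # 45

share_false = false_positives / (true_positives + false_positives)
print(f"Share of 'significant' results that are false: {share_false:.0%}")   # about 36%

Publication bias and p-hacking push that share higher still, which is part of what the video is getting at.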
Diqiucun_Cunmin
Posts: 2,710
8/13/2016 4:20:40 AM
It's common knowledge that data snooping, specifically p-hacking, is a serious problem in the experimental sciences, but it's encouraging to see the changes that have taken place. A psychology journal has stopped accepting p-values in submissions, and in economics, there are plans to require researchers to disclose their experimental methods, hypotheses, etc. before the study starts. That should make any kind of data snooping (e.g. tinkering with the number of variables to find significant correlations) much harder.
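To make the data-snooping point concrete, here is a minimal sketch in Python. Everything in it is made up for illustration; it is not taken from the video or from any journal's policy. The idea: if you test enough unrelated variables against one outcome at the usual p < 0.05 threshold, a few will come out "significant" purely by chance.

# Illustrative p-hacking sketch: many unrelated variables, one outcome,
# all pure noise, yet some tests still cross p < 0.05 by chance.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_subjects, n_variables = 100, 50

outcome = rng.normal(size=n_subjects)                    # pure noise
variables = rng.normal(size=(n_variables, n_subjects))   # also pure noise

hits = []
for i, x in enumerate(variables):
    r, p = stats.pearsonr(x, outcome)                    # test each variable separately
    if p < 0.05:
        hits.append((i, round(r, 2), round(p, 3)))

# With 50 independent tests at alpha = 0.05, roughly 2-3 spurious "hits" are expected.
print(f"'Significant' correlations found by chance: {len(hits)}")

Pre-registering which variables will be tested, as the disclosure requirement above is meant to enforce, removes exactly this freedom to keep testing until something crosses the threshold.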
The thing is, I hate relativism. I hate relativism more than I hate everything else, excepting, maybe, fibreglass powerboats... What it overlooks, to put it briefly and crudely, is the fixed structure of human nature. - Jerry Fodor

Don't be a stat cynic:
http://www.debate.org...

Response to conservative views on deforestation:
http://www.debate.org...

Topics I'd like to debate (not debating ATM): http://tinyurl.com...
RuvDraba
Posts: 6,033
8/14/2016 2:36:52 AM
At 8/11/2016 3:45:26 PM, Chaosism wrote:
I don't really have an opinion on this video, yet, but I'd like to hear if other people 'round here do. I found it interesting, nonetheless.

I fully support the contentions in the video, Chaosism. This is not the first time scientific methodology has been challenged through diligent reflection and had to improve in consequence. For example, double-blind studies were only introduced in the early 20th century, and only came to prominence from the mid-20th century. So for almost three centuries, experiment design and execution weren't separated. Overall, scientific notions of objectivity have also become both more conservative and more stringent from the 19th century to the 20th, and that trend seems to be continuing.

Science is a sociological endeavour, and it's entirely appropriate to review sociological effects in exploration, validation and verification of ideas -- although the methods by which we review are also subject to validation and verification themselves.

It's important both for diligence and public confidence that the issues raised are dealt with promptly and effectively, and as the video comments, this is under way. Speaking personally, I'm optimistic that with a growing range of publication forums, publication of data and repeated experiments will become easier and more acceptable, and this will help address the problem -- as will a competition among leading journals to lift diligence.

But in the meantime, it's equally important not to overstate the long-term significance of these concerns, and while I don't think the video did that, I think some members have. For example, science concerns itself not only with prediction (covered by the video), but also identifying mechanisms by which effects are caused. So drawing a statistically significant p-correlation accepted by peers is not the end of the story. If a correlation matters enough to inform public decision-making, then there will be demand to find candidate mechanisms creating the correlation. Over time (though not immediately), examining candidate mechanisms demands reproduced results produced in different ways, under greater diligence.

An example of this is in the notorious case of Pons and Fleischmann's Cold Fusion announcement in 1989: a claim that they had managed to perform what appeared to be a nuclear fusion reaction at room temperature -- which, if so, was a result of staggering commercial and scientific value.

The announcement was so significant, scientists all around the world jumped to repeat the experiment in varied ways, and it wasn't long before failed repetitions were being reported. The report was declared in error within the same year, and this illustrates a point that the video didn't pick up: the value of the result can and should inform the level of diligence, scrutiny and diversity used to validate and verify the methods.

Whenever systematic error is found, a diligent, accountable discipline must improve its methods, and the results most vulnerable to such errors are the most recent results hardest to reproduce (like pentaquarks). However, I don't believe it's cause for widespread alarmism about long-term scientific results, because there are tacit social effects working for greater diligence, and not just against it.

I hope that may be useful.
Chaosism
Posts: 2,669
8/14/2016 4:14:07 AM
At 8/12/2016 11:05:11 PM, keithprosser wrote:
Video: Is Most Published Research Wrong?
Are most youtube videos right?

C'mon - they wouldn't let them on the internet if they were wrong. Duh! :P
willbedone
Posts: 127
8/14/2016 8:24:51 AM
At 8/11/2016 3:45:26 PM, Chaosism wrote:

YouTube: https://www.youtube.com...

I don't really have an opinion on this video, yet, but I'd like to hear if other people 'round here do. I found it interesting, nonetheless.

Related: APS Registered Replication Reports (http://www.psychologicalscience.org...)

All scientific research is filtered through the subjective senses of the researcher (the scientist) and of those who read the published papers. Readers can only believe what they read unless, of course, they can reproduce the exact same research done by the scientist, who probably spent most of his or her life experimenting with those same subjective senses. For a reader to prove the researcher right or wrong, they would have to carry out the exact same experimentation and research as the author and arrive at the exact same answers, which is almost impossible to accomplish.
Chaosism
Posts: 2,669
8/15/2016 1:37:30 PM
At 8/14/2016 2:36:52 AM, RuvDraba wrote:
At 8/11/2016 3:45:26 PM, Chaosism wrote:
I don't really have an opinion on this video, yet, but I'd like to hear if other people 'round here do. I found it interesting, nonetheless.

<snipped>

I hope that may be useful.

It is. Thank you for your input, Ruv. :)
Axonly
Posts: 1,802
8/22/2016 6:02:40 AM
At 8/11/2016 3:45:26 PM, Chaosism wrote:

YouTube: https://www.youtube.com...

I don't really have an opinion on this video, yet, but I'd like to hear if other people 'round here do. I found it interesting, nonetheless.

Related: APS Registered Replication Reports (http://www.psychologicalscience.org...)

It's quite unfortunate that there isn't much funding for reproducing experiments.
Meh!