Posted by: Unknown, Friday, June 26, 2015

In the midst of a debate over scientific misconduct, one of the world's leading scientific journals on Thursday posted the most comprehensive guidelines to date for the publication of studies in basic science, calling for the adoption of clearly defined rules on the sharing of data and methods.

The guidelines, published in Science, come weeks after the journal retracted a study of the effect of political canvassing on voters' perceptions of same-sex marriage, by Michael LaCour of the University of California, Los Angeles, and Donald Green of Columbia, because of concerns over Mr. LaCour's data. That study was the latest in a series of highly public retractions in recent years, in fields as diverse as social psychology, anesthesiology and stem cell research, and involving many different journals.

Dr. Marcia McNutt, the editor in chief of Science and an author of the guidelines along with more than 30 other scientists, said in an interview that the timing of the paper had nothing to do with any particular case. She and others had been working on the guidelines since early 2014, well before the retracted study appeared in December. "This was a bullet train that already left the station," she said.

She said that the new guidelines, even if fully implemented and enforced, would probably not have exposed Mr. LaCour, who is suspected of fabricating data. But they would at least make it easier for researchers to attempt a study with the aim of confirming the results of the original work. Problems emerged with Mr. LaCour's data when two University of California, Berkeley, students tried to mount a similar study.

The world of scientific publication includes more than 10,000 journals in scores of specialties, some of which already have rules governing transparency in reporting study results. But the new guidelines, called TOP, for Transparency and Openness Promotion, represent the first attempt to lay out a system that can be applied by journals across diverse fields.

"Right now, virtually the only standards journals have are where to set the margins, where to put the figures, copy-editing stuff," said Brian Nosek, a professor of psychology at the University of Virginia and the lead author of the new paper. "But journals now understand that they have a strong role not only in the publication of science, but in determining what is said and how it's said." Dr. Nosek is the executive director of the Center for Open Science, a nonprofit that promotes data sharing and was centrally involved in creating the guidelines.

More than 100 journals and 31 scientific organizations are signatories to the new guidelines, including the Association for Psychological Science and the American Geophysical Union. The American Psychological Association, whose chief publications officer was one of the authors, has not signed on. A spokeswoman for the group said Thursday that the officer was out of the country, and she gave no reason for the association's decision not to sign.
Outside experts said that the new rules were a good first step, but nothing more. "Look, any steps in this direction that even recognize this problem are good ones," said Dr. Ivan Oransky, an editor of the blog Retraction Watch and editorial director of MedPage Today. "But the proof will be in the pudding, in whether journals actually hold scientists' feet to the fire."

The guidelines include eight categories of disclosure, each with three levels of ascending stringency. For example, under the category "data transparency," Level 1 has the journal require that articles state whether data is available and, if so, where. Level 2 requires that the data be posted to a trusted databank. Level 3 requires not only that the data be posted, but also that the analysis be redone by an independent group before publication.

The "data" in question varies depending on the field and the methods. So-called raw data from social science studies (survey answers, for instance, stripped of any personal information) are easily cached and understood. Not so raw readouts from genetic analysis or magnetic resonance imaging recordings, which take up enormous digital capacity. That is one reason the guidelines also include a category called "analytic methods transparency." At Level 1, scientists are called on to declare whether the code they used to analyze all those bytes of raw data is available and, if so, where; Level 3 would require that the code be posted to a databank and the reported analysis reproduced before publication.

The guidelines also call for, among other things, "preregistration" of studies: that is, an outline of the study's methods, design and hypotheses is to be posted before the work is carried out. This kind of requirement makes little sense in some fields, such as the geosciences, in which investigators may rush to study the effects of an event like Hurricane Sandy. But it should serve as a check against the so-called file-drawer problem that has plagued the social sciences and other fields, in which authors report only versions of a study that produce strong results, not ones with weak or null findings, Dr. Nosek said. Preregistration is already the law for most clinical drug trials, and it is already done by many social scientists.

Even the journals that have endorsed the new guidelines, including Science, have not decided how to implement them, or at what level. "We are still evaluating them to see what makes sense for us," Dr. McNutt said. But she added that the requirements for transparency of data, methods and materials were the most important. The guidelines were designed with flexibility in mind, allowing journals to choose which categories are most relevant for their field, and which levels increase transparency without becoming too burdensome for journal editors and authors.

