P. Allison Minugh, Ph.D. and Renee Saris-Baglama, Ph.D.
Data handling typically gets short shrift in scholarly publications. While researchers painstakingly report study methods and measurement tools, data handling is largely invisible. Some argue this practice needs to change to ensure research credibility (e.g., Simmons, Nelson, & Simonsohn, 2011), but publishers do not necessarily want to give up page space for these details (Leahey, 2008).
Yet published studies are often difficult, if not impossible, to replicate; retractions are on the rise; and there remains some resistance to data sharing.
The inability to replicate (http://www.omsj.org/corruption/scientists-elusive-goal-reproducing-study-results) is often attributed to a lack of detailed information about study procedures or data handling, constrained by what journals allow or require authors to report. Retractions, meanwhile, are becoming more common (Fang, Steen, & Casadevall, 2012). They can result from a simple data handling mistake, such as forgetting to apply a sample weight or reporting a finding based on a miscoded value, or they can be attributed to fraud and falsified data (http://retractionwatch.wordpress.com/).
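To make those two mistakes concrete, here is a minimal sketch with entirely hypothetical survey data (the variable names, values, and the -9 missing-value code are our own illustration, not drawn from any cited study):

```python
def weighted_mean(values, weights):
    """Mean of values, each weighted by its sampling weight."""
    total = sum(v * w for v, w in zip(values, weights))
    return total / sum(weights)

# Hypothetical survey responses and their sampling weights.
responses = [1, 0, 1, 1, 0]
weights = [2.0, 1.0, 0.5, 0.5, 3.0]

# Forgetting to apply the sample weight changes the estimate:
unweighted = sum(responses) / len(responses)   # 0.60
weighted = weighted_mean(responses, weights)   # 3.0 / 7.0, about 0.43

# A second common slip: a missing-value code treated as real data.
ages = [34, 29, -9, 41]                        # -9 means "missing", not an age
naive_mean = sum(ages) / len(ages)             # 23.75 -- wrong
clean = [a for a in ages if a != -9]
clean_mean = sum(clean) / len(clean)           # about 34.7
```

Neither error raises an exception or looks suspicious in output, which is exactly why such mistakes can survive all the way to publication unless data handling steps are documented and checked.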
Despite these problems, Leahey (2008) found that funding agencies, institutional review boards, and publishers reported that data editing oversight fell outside of “the stages and domains of current gatekeeping activity.” All three gatekeepers agreed it is incumbent on the researcher to ensure data are handled properly and ethically. However, some journals are taking proactive steps to improve study transparency and reproducibility (http://www.nature.com/ni/journal/v14/n5/full/ni.2603.html).
Clearly, these issues should be addressed. But how? What do you think?
Contact us at email@example.com to learn how our technical experts can help you manage your data with integrity.
Fang, F. C., Steen, R. G., & Casadevall, A. (2012). Misconduct accounts for the majority of retracted scientific publications. Proceedings of the National Academy of Sciences of the United States of America, 109(42), 17028-17033.
Leahey, E. (2008). Overseeing research practice: The case of data editing. Science, Technology, & Human Values, 33(5), 605-630.
Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2011). False-positive psychology: Undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychological Science, 22(11), 1359-1366.