Are There Questionable Research Practices in Management Research?

The publishing industry can be competitive. But how far will a potential researcher go to achieve success? Farther than you would think, according to a forthcoming Journal of Management article entitled “The Chrysalis Effect: How Ugly Initial Results Metamorphosize into Beautiful Articles.”

The abstract:

The issue of a published literature not representative of the population of research is most often discussed in terms of entire studies being suppressed. However, alternative sources of publication bias are questionable research practices (QRPs) that entail post hoc alterations of hypotheses to support data or post hoc alterations of data to support hypotheses. Using general strain theory as an explanatory framework, we outline the means, motives, and opportunities for researchers to better their chances of publication independent of rigor and relevance. We then assess the frequency of QRPs in management research by tracking differences between dissertations and their resulting journal publications. Our primary finding is that from dissertation to journal article, the ratio of supported to unsupported hypotheses more than doubled (0.82 to 1.00 versus 1.94 to 1.00). The rise in predictive accuracy resulted from the dropping of statistically nonsignificant hypotheses, the addition of statistically significant hypotheses, the reversing of predicted direction of hypotheses, and alterations to data. We conclude with recommendations to help mitigate the problem of an unrepresentative literature that we label the “Chrysalis Effect.”
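The headline ratio shift is easy to verify with simple arithmetic. The sketch below uses hypothetical raw counts (the article reports only the ratios, not these counts) to show why moving from 0.82:1 to 1.94:1 means the support ratio more than doubled:

```python
# Hypothetical hypothesis counts chosen to reproduce the reported ratios;
# the article itself reports ratios (0.82:1 and 1.94:1), not raw counts.
def support_ratio(supported, unsupported):
    """Ratio of supported to unsupported hypotheses."""
    return supported / unsupported

dissertation = support_ratio(82, 100)   # 0.82 to 1.00
journal      = support_ratio(194, 100)  # 1.94 to 1.00

print(round(journal / dissertation, 2))  # -> 2.37, i.e. more than doubled
```

The comparison holds regardless of the underlying counts, since only the two ratios enter the calculation.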

Lead author Ernest Hugh O’Boyle, Jr. and Journal of Management Associate Editor Fred Oswald discuss the article’s findings and their implications for the management literature in a new podcast. You can listen to the interview for free by clicking here.

Read “The Chrysalis Effect: How Ugly Initial Results Metamorphosize into Beautiful Articles” for free from Journal of Management by clicking here. Want to know about all the latest from Journal of Management? Click here to sign up for e-alerts!

Has Organizational Stigma Research Become Strictly PG?

When out to dinner with one’s mother-in-law, it’s common knowledge that there are simply topics that should not be broached. But has this fear of impropriety seeped into academic research on organizational stigma as well? Bryant Ashley Hudson and Gerardo A. Okhuysen discuss this idea and its possible consequences in their paper, “Taboo Topics: Structural Barriers to the Study of Organizational Stigma” from the Journal of Management Inquiry.

In this article we make two arguments. The first is that there is great scholarly value in examining topics that management researchers may find distasteful or undesirable; topics that involve organizational stigma. Organizational stigma involves the discrediting of organizational participants, organizational activities, and organizations themselves (Sutton & Callahan, 1987). And the study of organizational stigma often involves the examination of distasteful—and occasionally objectionable, despicable, and disgusting—activities, work, and organizations. We argue that in spite of its potentially repellent nature, organizational stigma is worth discussing as it exposes areas of social life that remain otherwise hidden. However, the nature of stigmatized topics also makes them taboo, and our experience as researchers suggests that our field erects structural barriers that discourage their examination. Our second argument, then, is that these taboos and structural barriers that inhibit the study of these topics are detrimental to knowledge creation and accumulation and deserve to be breached.

Exploring Mixed Methods in Organizational Sciences

Organizational Research Methods invites papers for a Feature Topic on Mixed Methods in the Organizational Sciences. Guest editors for this feature topic are Jose F. Molina-Azorin, Donald Bergh, Kevin Corley, and David Ketchen.

Topics include, but are not limited to:

– Philosophy of science issues related to mixed methods research

– Explanations of how mixed methods can help to carry out context-specific research

– Evaluations of how mixed methods can enhance organizational research by carrying out multilevel studies and bridging macro and micro inquiry

– Guidance on how mixed methods can simultaneously examine outcomes and process issues

– Analysis of the implications and opportunities of mixed methods to bridge the science-practice gap, emphasizing the relevance of mixed methods studies to practice

– Development and validation of new measures using a mixed methods approach

– Quality issues in mixed methods in organizational sciences

Organizational Research Methods welcomes empirical, conceptual, methodological, and literature review papers. The deadline to submit a 5- to 7-page proposal is June 30, 2014. All those interested are encouraged to read more by clicking here.

Want to get the latest news from Organizational Research Methods? Click here to sign up for e-alerts today!

In Strategic Management Studies, ‘It’s the Measurement, Stupid!’

Editor’s note: We are pleased to welcome Dan R. Dalton and Herman Aguinis, both of Indiana University, whose article “Measurement Malaise in Strategic Management Studies: The Case of Corporate Governance Research” was published in the January 2013 issue of Organizational Research Methods.

Our article describes a surprising commonality in empirical results relying on agency theory in general and in the particular domain of corporate governance research: Mean estimates of relationships between measures of boards of directors’ independence, CEO duality, equity holdings, and corporate financial performance are disappointingly low. Typically, measures of board independence explain only about 1% of variance in relevant outcomes. We use meta-analytic evidence to document that this disappointing result is highly consistent across very diverse bodies of research. Our article was motivated by our desire to understand the reason(s) for such small effect sizes, because this knowledge would allow us to design and execute better studies in the future, which we hope will lead to an improved understanding (i.e., larger effect sizes) of critical outcomes such as firm performance.

Our conclusion can be summarized in a simple phrase: “It’s the measurement, stupid!” Threats to construct validity, such as less-than-ideal operationalization of constructs and the confounding of constructs and levels of constructs, seem to be the culprits, at least in part, for why observed effect sizes are so small. How did we arrive at this conclusion? We implemented a five-step protocol through which we (1) established the base rate for the phenomenon in question, (2) evaluated the extent to which the dependent variables are germane, (3) evaluated the extent to which the independent variables are germane, (4) determined whether explanatory power is improved as a consequence of improved measurement, and (5) concluded whether previously established estimates should be revised. By implementing this five-step protocol, we were able to improve variance explained in outcome variables many times over—in some cases, a tenfold increase.
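The “1% of variance” and “tenfold increase” figures can be connected through the standard relationship between a correlation and variance explained (r²). The sketch below is illustrative only; the specific r values are hypothetical, chosen to match the magnitudes discussed, and this is not the authors’ code:

```python
# Illustrative only: variance explained is the squared correlation (r^2).
# The r values below are hypothetical, picked to match the magnitudes
# discussed in the article (about 1% of variance; a tenfold increase).
def variance_explained(r):
    """Proportion of outcome variance explained by a correlation r."""
    return r ** 2

r_typical = 0.10  # hypothetical correlation for board independence measures
print(round(variance_explained(r_typical), 2))  # -> 0.01, about 1% of variance

# A tenfold increase in variance explained (1% -> 10%) corresponds to a
# correlation of roughly 0.32 -- not a tenfold increase in r itself.
r_improved = (10 * variance_explained(r_typical)) ** 0.5
print(round(r_improved, 2))  # -> 0.32
```

This is why apparently modest gains in correlation terms can translate into the kind of many-times-over improvement in variance explained the authors report.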

So, what are the main take-aways? First, remember the motto “It’s the measurement, stupid!” In most empirical research in macro studies, small effect sizes can be expected even before data are collected due to construct validity problems. Second, implementing our proposed protocol in other research domains is likely to lead to more accurate, and larger, effect estimates of relationships among constructs.

Click here to read “Measurement Malaise in Strategic Management Studies: The Case of Corporate Governance Research” in Organizational Research Methods.


Dan R. Dalton is the founding director of the Institute for Corporate Governance, Dean Emeritus, and the Harold A. Poling Chair of Strategic Management in the Kelley School of Business, Indiana University. A Fellow of the Academy of Management, his research focuses on corporate governance and research methods and analysis. He is the recipient of the 2011 Academy of Management Research Methods Division Distinguished Career Award and a former editor-in-chief of the Journal of Management.

Herman Aguinis is the Dean’s Research Professor, a professor of organizational behavior and human resources, and the founding director of the Institute for Global Organizational Effectiveness in the Kelley School of Business, Indiana University. His research interests span several human resource management, organizational behavior, and research methods and analysis topics. He has published five books and more than 100 articles in refereed journals. He is the recipient of the 2012 Academy of Management Research Methods Division Distinguished Career Award and a former editor-in-chief of Organizational Research Methods.

Organizational Research Methods 2011 Best Paper

Management INK would like to congratulate the winners of the 2011 Best Paper Award for Organizational Research Methods.

Keith Leavitt, United States Military Academy, Terence R. Mitchell, University of Washington, and Jeff Peterson, Utah Valley University, published “Theory Pruning: Strategies to Reduce Our Dense Theoretical Landscape” in the October 2010 issue of ORM.

The abstract:

The current article presents a systematic approach to theory pruning (defined here as hypothesis specification and study design intended to bound and reduce theory). First, we argue that research that limits theory is underrepresented in the organizational sciences, erring overwhelmingly on the side of confirmatory null hypothesis testing. Second, we propose criteria for determining comparability, deciding when it is appropriate to test theories or parts of theories against one another. Third, we suggest hypotheses or questions for testing competing theories. Finally, we revisit the spirit of “strong inference.” We present reductionist strategies appropriate for the organizational sciences, which extend beyond traditional approaches of “critical” comparisons between whole theories. We conclude with a discussion of strong inference in organizational science and how theory pruning can help in that pursuit.


Meta-analysis Reviews: Debunking Myths and Urban Legends

Herman Aguinis, Charles A. Pierce, Frank A. Bosco, Dan R. Dalton, and Catherine M. Dalton offer best-practice recommendations regarding how to conduct meta-analytic reviews. Click here to read the article in Organizational Research Methods OnlineFirst.
