ASQ Improving Evidence Presentation: Resources and Tools

[The following post is re-blogged from the ASQ Blog. Click here to view the original article.]

This post collects resources and tools to improve evidence presentation, echoing one of ASQ’s most important initiatives. The ASQ blog will keep updating the post, and we welcome crowdsourcing and sharing of ideas from management scholars all around the world. Get in touch with us to share your thoughts!

What is the ASQ Improving Evidence Presentation initiative? (#ASQEvidencePresentation)

As you have seen in the “From the Editor” that Administrative Science Quarterly (ASQ) published in June 2017, the editors of ASQ strongly encourage authors to show the data in their manuscripts, using graphical approaches to give an indication of the most important features of the data and their theoretical explanation before estimating models. Preferably this should be done as early as the introduction, in order to spur the reader’s interest and give an indication of why the paper is valuable. Such use of graphical methods is rare in organizational theory and management research more generally, so we will gradually introduce methods of graphical analysis that researchers can use.

Graphical methods for showing the data are integrated into Stata, the most common software used by management researchers, and the Stata commands offer a good blend of simplicity and flexibility. Nevertheless, they require some training, especially because statistical training in many schools is model-focused and highly variable in how well graphical methods are taught. Newer analytical tools like R and Python are also on the rise and increasingly used to visualize data, which requires even more training and knowledge sharing. Here we collaborate with the ASQ blog to host a resource center where tools and techniques to improve evidence presentation are crowdsourced and curated. The resource center will be updated regularly, and editors of ASQ will contribute examples with data and do-files to demonstrate evidence presentation. We hope that the resource center will help improve the way you present evidence in your research.

— Henrich Greve, Editor of Administrative Science Quarterly

ASQ paper development workshop materials

2017 ASQ paper development workshop, AOM in Atlanta

Materials on the importance of evidence presentation

An Economist’s Guide to Visualizing Data, written by Jonathan A. Schwabish (2014) in Journal of Economic Perspectives, 28(1): 209–234

Editor’s examples

Why indie books sell well, an evidence presentation example based on Greve & Song (2017), “Amazon Warrior: How a Platform Can Restructure Industry Power and Ecology.” Data and do-file are available.

Tools using Stata

  1. Introduction to some important methods, including scatterplots, line plots, bar graphs, box plots, and kernel (full distribution) plots.
  2. Example of more advanced programming, which is needed because Stata does not (yet) have a simple way of showing a grouped bar graph with error bars, an important graph for taking a first look at group differences.
  3. Introduction to spmap, an add-on procedure for producing mapped data displays. Displaying statistics on a map can be very helpful for any kind of research involving spatial relations. Before this add-on, such mapping required switching to different software and exporting data, which is both time consuming and a potential source of errors.
  4. Introduction to the coefplot function, a graphical display of coefficient magnitudes. This is a very informative way of giving a comparative view of a full regression model, or parts of it, in a compact graph. This routine has a very flexible and intriguing set of plots, as displayed at this link.
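The grouped bar graph with error bars in item 2 boils down to computing, for each group, a bar height (the mean) and an error-bar half-width (roughly 1.96 standard errors, an approximate 95% confidence interval). A minimal Python sketch of that preparatory computation, with made-up scores — the group names and numbers are purely illustrative, and in Stata the plotting step would follow:

```python
from statistics import mean, stdev
from math import sqrt

# Hypothetical outcome scores for two groups (made-up data).
groups = {
    "Treatment": [4.1, 5.0, 4.6, 5.4, 4.9],
    "Control":   [3.2, 3.9, 3.5, 4.4, 3.0],
}

# For each group: bar height = mean; error-bar half-width =
# 1.96 * standard error (an approximate 95% confidence interval).
summary = {}
for name, scores in groups.items():
    m = mean(scores)
    se = stdev(scores) / sqrt(len(scores))
    summary[name] = (round(m, 2), round(1.96 * se, 2))

print(summary)
```

The bars would then be drawn at the means, with whiskers extending one half-width above and below; non-overlapping intervals give a quick first impression of a group difference.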

Wide Research, Narrow Effects: Why Interdisciplinary Research – and Innovation – is Hard

[This blog post was originally featured on Organizational Musings, written by Administrative Science Quarterly’s Editor, Henrich R. Greve. Click here to view the original post.]

Interdisciplinary research is seen as very valuable for society and the economy. Some of that could be hype, but there are good examples of what it can do. You have probably noticed that oil is no longer 100 dollars per barrel, and the US is no longer a big importer. This is a result of fracking, itself a result of interdisciplinary research. And if you don’t like fracking, a good alternative is photovoltaic energy, which comes from the sun, and from interdisciplinary research.

So some interdisciplinary research has been good for society. Is it also good for the scientists who are supposed to do it? The answer to this question is very interesting, and is reported in an article in Administrative Science Quarterly by Erin Leahey, Christine Beckman, and Taryn Stanko. The start is easy to explain: interdisciplinary research is less productive, but it gets more attention. The answer got more complicated, and more interesting, when they started looking at why that happened.

The first step was to look at whether interdisciplinary research is more difficult to do, or whether it is harder for it to gain acceptance and get published. The answer is clear: it is not harder to gain acceptance, but it is harder to do, especially early on. The second step was to look at why this research got more attention. Here many factors played a role, but one stood out to me: what increases especially much is the variation in how much attention interdisciplinary research gets, and that helps explain the increased average. So interdisciplinary research is related to fracking in one more way – few reap the rewards from it.

This paper doesn’t really result in career advice for scientists, because everyone will be interested in different kinds of research and have different ideas on how much risk to take on. But it has important insights on how innovations are made. Building on closely related ideas is much easier to do, so no wonder much of what scientists – and companies – do is incremental. And this is true even though we often tell stories of the great successes of interdisciplinary research and integrative innovations, while forgetting all those who tried and didn’t succeed. Whether that means we cross-fertilize knowledge too little, too much, or just enough is hard to tell.

You can read the article, “Prominent but Less Productive: The Impact of Interdisciplinarity on Scientists’ Research,” from Administrative Science Quarterly free for the next two weeks by clicking here. Want to stay up to date on all of the latest research from Administrative Science Quarterly? Click here to sign up for e-alerts!

The Chrysalis Effect: Publication Bias in Management Research

How well do published management articles represent the broader body of management research? To say that questionable research practices impact only a few articles ignores the broader, systemic issue affecting management research. According to authors Ernest Hugh O’Boyle Jr., George Christopher Banks, and Erik Gonzalez-Mulé, the high pressure on academics to publish leads many to engage in questionable research practices, thereby leading the resulting published articles to be biased and unrepresentative. In their article, “The Chrysalis Effect: How Ugly Initial Results Metamorphosize Into Beautiful Articles,” published in Journal of Management, O’Boyle, Banks, and Gonzalez-Mulé delve into the issue of questionable research practices. The abstract for the paper:

The issue of a published literature not representative of the population of research is most often discussed in terms of entire studies being suppressed. However, alternative sources of publication bias are questionable research practices (QRPs) that entail post hoc alterations of hypotheses to support data or post hoc alterations of data to support hypotheses. Using general strain theory as an explanatory framework, we outline the means, motives, and opportunities for researchers to better their chances of publication independent of rigor and relevance. We then assess the frequency of QRPs in management research by tracking differences between dissertations and their resulting journal publications. Our primary finding is that from dissertation to journal article, the ratio of supported to unsupported hypotheses more than doubled (0.82 to 1.00 versus 1.94 to 1.00). The rise in predictive accuracy resulted from the dropping of statistically nonsignificant hypotheses, the addition of statistically significant hypotheses, the reversing of predicted direction of hypotheses, and alterations to data. We conclude with recommendations to help mitigate the problem of an unrepresentative literature that we label the “Chrysalis Effect.”

You can read “The Chrysalis Effect: How Ugly Initial Results Metamorphosize Into Beautiful Articles” from Journal of Management free for the next two weeks by clicking here.

Want to stay current on all of the latest research published by Journal of Management? Click here to sign up for e-alerts! You can also follow the journal on Twitter – read through the latest tweets from Journal of Management by clicking here!

*Library image attributed to Apple Vershoor (CC)


Relative and Absolute Change in Discontinuous Growth Models

[We’re pleased to welcome Paul Bliese of the University of South Carolina. Paul recently published an article in Organizational Research Methods entitled “Understanding Relative and Absolute Change in Discontinuous Growth Models: Coding Alternatives and Implications for Hypothesis Testing” with co-author Jonas W.B. Lang.]

Jonas and I became interested in the topic because we kept encountering “transition events” that could lead to discontinuous change and wondered how to statistically model the events. For instance, a combat deployment represents a potential transition event in the career of a soldier. Likewise, unexpectedly changing a complex task on a participant in a lab represents a transition event that could be frustrating and impede performance. As a final example, letting sleep-deprived participants get a full night’s sleep is a positive transition event that should improve cognitive performance (but may not do so equally for all participants). In all these examples, some pattern of responses is interrupted by the transition event; however, where the models are really useful is in trying to understand the patterns of change after the transition event, because individuals rarely react in the same way.

When Jonas and I got into writing the manuscript we were really surprised by how some minor coding changes surrounding TIME could produce parameter estimates that had quite different meanings. In fact, I realized that if I had figured out all the details that went into the submission years ago, I probably would have specified and tested hypotheses differently in my own publications where I used the approach. My hope is that other researchers will use the manuscript as a resource to study other transition events and that the examples will help provide better specificity to the types of hypotheses researchers can propose.

The abstract for the paper:

Organizational researchers routinely have access to repeated measures from numerous time periods punctuated by one or more discontinuities. Discontinuities may be planned, such as when a researcher introduces an unexpected change in the context of a skill acquisition task. Alternatively, discontinuities may be unplanned, such as when a natural disaster or economic event occurs during an ongoing data collection. In this article, we build off the basic discontinuous growth model and illustrate how alternative specifications of time-related variables allow one to examine relative versus absolute change in transition and post-transition slopes. Our examples focus on interpreting time-varying covariates in a variety of situations (multiple discontinuities, linear and quadratic models, and models where discontinuities occur at different times). We show that the ability to test relative and absolute differences provides a high degree of precision in terms of specifying and testing hypotheses.
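To make the coding issue concrete, here is a small Python sketch of the time-related variables in a basic discontinuous growth model with one transition. The variable names, the seven-wave design, and the specific codings below are illustrative assumptions, not taken verbatim from the article; the point they illustrate is that a small recoding of the time trend changes whether the post-transition slope term is estimated relative to the pre-transition slope or in absolute terms:

```python
# Seven measurement occasions (0..6) with a transition event before wave 4.
waves = list(range(7))
event = 4  # first post-transition wave

# TRANS: 0 before the transition, 1 after -> immediate (step) change.
TRANS = [0 if t < event else 1 for t in waves]

# POST: occasions elapsed since the transition -> slope term after the event.
POST = [0 if t < event else t - (event - 1) for t in waves]

# Coding A ("relative"): TIME keeps counting through the transition, so the
# POST coefficient is the post-transition slope RELATIVE to the pre slope.
TIME_relative = waves[:]

# Coding B ("absolute"): TIME stops increasing at the transition, so the
# POST coefficient is the absolute post-transition slope.
TIME_absolute = [min(t, event - 1) for t in waves]

print(TRANS)          # [0, 0, 0, 0, 1, 1, 1]
print(POST)           # [0, 0, 0, 0, 1, 2, 3]
print(TIME_relative)  # [0, 1, 2, 3, 4, 5, 6]
print(TIME_absolute)  # [0, 1, 2, 3, 3, 3, 3]
```

The same mixed model fit with coding A or coding B reproduces the same predicted values, but the hypotheses the coefficients test differ, which is why the coding choice should match the hypothesis being proposed.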

You can read “Understanding Relative and Absolute Change in Discontinuous Growth Models: Coding Alternatives and Implications for Hypothesis Testing” from Organizational Research Methods free for the next two weeks by clicking here. Want to stay current on all the latest research from Organizational Research Methods? Click here to sign up for e-alerts!


Extreme-Team Research: An Approach to Overcoming Research Obstacles

Researching the performance and management of extreme teams, which work in unconventional environments on high-risk tasks, presents a number of unique challenges to researchers, including limitations on data collection and sample sizes. In a new article published in Journal of Management entitled “An Approach for Conducting Actionable Research with Extreme Teams,” authors Suzanne T. Bell, David M. Fisher, Shanique G. Brown, and Kristin E. Mann set out to develop a research approach that addresses the unique challenges of extreme-team research and allows extreme-team research to be applied broadly across more traditional teams. The abstract for the paper:

Extreme teams complete their tasks in unconventional performance environments and have serious consequences associated with failure. Examples include disaster relief teams, special operations teams, and astronaut crews. The unconventional performance environments within which these teams operate require researchers to carefully consider the context during the research process. These environments may also create formidable challenges to the research process, including constraining data collection and sample sizes. Given the serious consequences associated with failure, however, the challenges must be navigated so that the management of extreme teams can be evidence based. We present an approach for conducting actionable research on extreme teams. Our approach is an extension of mixed-methods research that is particularly well suited for emphasizing context. The approach guides researchers on how to integrate the local context into the research process, which allows for actionable recommendations. At the same time, our approach applies an intentionally broad framework for organizing context, which can serve as a mechanism through which the results of research on extreme teams can be meaningfully accumulated and integrated across teams. Finally, our approach and description of steps address the unique challenges common in extreme-team research. While developed with extreme teams in mind, we view our general approach as applicable to more traditional teams when the features of the context that impinge on team functioning are not adequately represented by typical descriptions of context in the literature and the goal is actionable research for the teams in question.

You can read “An Approach for Conducting Actionable Research with Extreme Teams” from Journal of Management free for the next two weeks by clicking here. Want to keep current on all of the latest research from Journal of Management? Click here to sign up for e-alerts!

*Shuttle launch image attributed to The U.S. Army (CC)

Moderation and Mediation in Strategic Management Research

[We’re pleased to welcome Herman Aguinis of the George Washington University School of Business. Herman recently published an article in Organizational Research Methods entitled “Improving Our Understanding of Moderation and Mediation in Strategic Management Research” with co-authors Jeffrey R. Edwards and Kyle J. Bradley.]

Organizational strategy and structure are important variables in understanding firm outcomes, but does the strength of those relationships depend on contingency factors such as the uncertainty of the environment or the products and services offered by the firm? Questions such as this one require that we statistically examine the possible presence of contingency or moderating effects. Is the effect of the competitive environment on firm performance transmitted by firm strategy, such that the environment influences strategic choices that in turn affect performance? Questions such as this one require that we statistically examine the possible presence of an intervening or mediating effect.


For decades, questions and hypotheses that involve moderation and mediation have been central to strategic management research. Moreover, accurate answers to these questions have important implications for practice because knowledge about moderating effects (i.e., conditions under which an effect occurs) and mediating effects (i.e., reasons why an antecedent is related to an outcome) leads to more accurate decisions and allocation of resources that will enhance firm outcomes.
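The core idea of a moderating effect — that the "effect of x" is conditional on the moderator — can be shown with a noise-free toy function. All coefficients here are made up for illustration; the point is that in a model with a product term, the first-order coefficient of x is its effect only when the moderator equals zero:

```python
# Toy outcome with a product (interaction) term: y = b0 + b1*x + b2*z + b3*x*z
# (coefficient values are invented for illustration).
def y(x, z):
    return 1.0 + 2.0 * x + 3.0 * z + 0.5 * x * z

# With a product term, the slope of y on x is b1 + b3*z, i.e., it depends
# on the moderator z. Check by differencing (exact, since y is linear in x).
def slope_of_x(z):
    return y(2.0, z) - y(1.0, z)

print(slope_of_x(0.0))  # 2.0 -> b1 is the effect of x only at z = 0
print(slope_of_x(4.0))  # 4.0 -> b1 + b3*4 = 2.0 + 0.5*4
```

This is why, as the authors note, interpreting first-order coefficients without regard to the product term can mislead: the "main effect" of x is really a conditional effect at moderator value zero.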

Our article published in Organizational Research Methods (ORM) titled “Improving Our Understanding of Moderation and Mediation in Strategic Management Research” reviews common impediments to the accurate and valid assessment of moderating and mediating effects. We reviewed articles published in Strategic Management Journal and Organization Science over the past 10 years and discovered the unfortunate and pervasive presence of these problems, which lead researchers to misleading conclusions about moderating and mediating effects. Our review of the 205 articles that assessed moderation revealed seven key problems. Overall, published articles demonstrated an average of 2.57 of the seven problems we identified, with only one article avoiding the problems entirely. In similar fashion, our review of the 62 articles that addressed mediation revealed six key problems and, on average, the articles exhibited 3.52 of the problems each, with none of the published articles being problem-free.

We believe that our ORM article describing these problems and their solutions will be useful for strategy researchers interested in examining moderation and mediation. Implementing these solutions will help improve the appropriateness and accuracy of tests of moderation and mediation. Our recommendations can be implemented by researchers and also used as guidelines for editors and reviewers who evaluate manuscripts reporting tests of moderation and mediation.  Our article also provides references to key methodological sources on moderation and mediation that readers can pursue for further details about the issues we discuss.

We look forward to the reactions that our article will generate and sincerely hope that it will serve as a catalyst to improve the assessment of moderation and mediation, which in turn will lead to more accurate results that will benefit theory and practice.

The abstract for the article:

We clarify differences among moderation, partial mediation, and full mediation and identify methodological problems related to moderation and mediation from a review of articles in Strategic Management Journal and Organization Science published from 2005 to 2014. Regarding moderation, we discuss measurement error, range restriction, and unequal sample sizes across moderator-based subgroups; insufficient statistical power; the artificial categorization of continuous variables; assumed negative consequences of correlations between product terms and their components (i.e., multicollinearity); and interpretation of first-order effects based on models excluding product terms. Regarding mediation, we discuss problems with the causal-steps procedure, inferences about mediation based on cross-sectional designs, whether a relation between the antecedent and the outcome is necessary for testing mediation, the routine inclusion of a direct path from the antecedent to the outcome, and consequences of measurement error. We also explain how integrating moderation and mediation can lead to important and useful insights for strategic management theory and practice. Finally, we offer specific and actionable recommendations for improving the appropriateness and accuracy of tests of moderation and mediation in strategic management research. Our recommendations can also be used as a checklist for editors and reviewers who evaluate manuscripts reporting tests of moderation and mediation.
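As a toy sketch of the mediation logic the abstract refers to (path coefficients invented for illustration; a real test would add statistical inference, such as confidence intervals for the indirect effect): with path a from antecedent to mediator, path b from mediator to outcome, and direct path c', the indirect effect is a*b and the total effect is c' + a*b.

```python
# Hypothetical path coefficients (made up for illustration).
a = 0.5         # antecedent -> mediator
b = 0.5         # mediator -> outcome, controlling for the antecedent
c_prime = 0.25  # direct path from antecedent to outcome

def mediator(x):
    return a * x

def outcome(x):
    return c_prime * x + b * mediator(x)

indirect = a * b                     # effect transmitted through the mediator
total = outcome(1.0) - outcome(0.0)  # total effect = c_prime + a*b

print(indirect)  # 0.25
print(total)     # 0.5
```

Partial mediation corresponds to a nonzero c' alongside a nonzero a*b; full mediation would set c' to zero so the antecedent affects the outcome only through the mediator.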

You can read “Improving Our Understanding of Moderation and Mediation in Strategic Management Research” from Organizational Research Methods free for the next two weeks by clicking here. Want to know about the latest research from Organizational Research Methods? Click here to sign up for e-alerts!

*Image attributed to Anssi Koskinen (CC)

Herman Aguinis is the Avram Tucker Distinguished Scholar and Professor of Management at the George Washington University School of Business. His research interests span several human resource management, organizational behavior, and research methods and analysis topics. He has published five books and more than 130 articles in refereed journals. He is a fellow of the Academy of Management, past editor of Organizational Research Methods, and received the Academy of Management Research Methods Division Distinguished Career Award.

Jeffrey R. Edwards is the Belk Distinguished Professor of Organizational Behavior at the Kenan-Flagler Business School, University of North Carolina. He is past editor of Organizational Behavior and Human Decision Processes, past chair of the Research Methods Division of the Academy of Management, and a fellow of the Academy of Management, the American Psychological Association, and the Society for Industrial and Organizational Psychology. His methodological research addresses difference scores, polynomial regression, moderation and mediation, structural equation modeling, construct validity, and the development and evaluation of theory.

Kyle J. Bradley is a doctoral candidate in organizational behavior and human resource management in the Kelley School of Business, Indiana University. His scholarly interests include research methods, performance management, star performers, and the work-life interface. His work has appeared in Industrial and Organizational Psychology: Perspectives on Science and Practice, Organizational Research Methods, and Organizational Dynamics and has been presented at the meetings of the Academy of Management and Society for Industrial and Organizational Psychology.

Defining and Exploring Boundary Conditions

While it is easy to agree that boundary conditions are an important part of theory development and research, it is not as easy for researchers to agree on what boundary conditions are and how they should be approached. With the recent Organizational Research Methods article entitled “Boundary Conditions: What They Are, How to Explore Them, Why We Need Them, and When to Consider Them,” authors Christian Busse, Andrew Kach, and Stephan Wagner set out not only to better define boundary conditions, but also to understand how exploring boundary conditions can lead to improved research and further theory development.

The abstract for the paper:

Boundary conditions (BC) have long been discussed as an important element in theory development, referring to the “who, where, when” aspects of a theory. However, it still remains somewhat vague as to what exactly BC are, how they can or even should be explored, and why their understanding matters. This research tackles these important questions by means of an in-depth theoretical-methodological analysis. The study contributes fourfold to organizational research methods: First, it develops a more accurate and explicit conceptualization of BC. Second, it widens the understanding of how BC can be explored by suggesting and juxtaposing new tools and approaches. It also illustrates BC-exploring processes, drawing on two empirical case examples. Third, it analyzes the reasons for exploring BC, concluding that BC exploration fosters theory development, strengthens research validity, and mitigates the research-practice gap. Fourth, it synthesizes the analyses into 12 tentative suggestions for how scholars should subsequently approach the issues surrounding BC. The authors hope that the study contributes to consensus shifting with respect to BC and draws more attention to BC.

You can read “Boundary Conditions: What They Are, How to Explore Them, Why We Need Them, and When to Consider Them” from Organizational Research Methods free for the next two weeks by clicking here. Want to know all about the latest research from Organizational Research Methods? Click here to sign up for e-alerts!