Wide Research, Narrow Effects: Why Interdisciplinary Research – and Innovation – is Hard

[This blog post was originally featured on Organizational Musings, written by Administrative Science Quarterly's Editor, Henrich R. Greve. Click here to view the original post.]

Interdisciplinary research is widely seen as valuable for society and the economy. Some of that may be hype, but there are good examples of what it can do. You have probably noticed that oil is no longer 100 dollars per barrel, and the US is no longer a big importer. This is a result of fracking, which itself grew out of interdisciplinary research. And if you don't like fracking, a good alternative is photovoltaic energy, which comes from the sun – and from interdisciplinary research.

So some interdisciplinary research has been good for society. Is it also good for the scientists who are supposed to do it? The answer to this question is very interesting, and is reported in an article in Administrative Science Quarterly by Erin Leahey, Christine Beckman, and Taryn Stanko. The start is easy to explain: interdisciplinary research is less productive, but it gets more attention. The answer got more complicated, and more interesting, when they started looking at why that happened.

The first step was to examine whether interdisciplinary research is more difficult to do, or whether it is harder for it to gain acceptance and get published. The answer is clear: it is not harder to gain acceptance, but it is harder to do, especially early on. The second step was to look at why this research gets more attention. Here many factors played a role, but one stood out to me: what increases most is the variation in how much attention interdisciplinary research gets, and that variation helps explain the higher average. So interdisciplinary research is related to fracking in one more way – few reap the rewards from it.

This paper doesn't really offer career advice for scientists, because everyone will be interested in different kinds of research and have different ideas about how much risk to take on. But it holds important insights into how innovations are made. Building on closely related ideas is much easier to do, so no wonder much of what scientists – and companies – do is incremental. And this is true even though we often tell stories of the great successes of interdisciplinary research and integrative innovations, while forgetting all those who tried and didn't succeed. Whether that means we cross-fertilize knowledge too little, too much, or just enough is hard to tell.

You can read the article, "Prominent but Less Productive: The Impact of Interdisciplinarity on Scientists' Research," from Administrative Science Quarterly free for the next two weeks by clicking here. Want to stay up to date on all of the latest research from Administrative Science Quarterly? Click here to sign up for e-alerts!

The Chrysalis Effect: Publication Bias in Management Research

How well do published management articles represent the broader body of management research? To say that questionable research practices affect only a few articles ignores a broader, systemic issue in management research. According to authors Ernest Hugh O'Boyle Jr., George Christopher Banks, and Erik Gonzalez-Mulé, the high pressure on academics to publish leads many to engage in questionable research practices, leaving the resulting published articles biased and unrepresentative. In their article, "The Chrysalis Effect: How Ugly Initial Results Metamorphosize Into Beautiful Articles," published in Journal of Management, O'Boyle, Banks, and Gonzalez-Mulé delve into the issue of questionable research practices. The abstract for the paper:

The issue of a published literature not representative of the population of research is most often discussed in terms of entire studies being suppressed. However, alternative sources of publication bias are questionable research practices (QRPs) that entail post hoc alterations of hypotheses to support data or post hoc alterations of data to support hypotheses. Using general strain theory as an explanatory framework, we outline the means, motives, and opportunities for researchers to better their chances of publication independent of rigor and relevance. We then assess the frequency of QRPs in management research by tracking differences between dissertations and their resulting journal publications. Our primary finding is that from dissertation to journal article, the ratio of supported to unsupported hypotheses more than doubled (0.82 to 1.00 versus 1.94 to 1.00). The rise in predictive accuracy resulted from the dropping of statistically nonsignificant hypotheses, the addition of statistically significant hypotheses, the reversing of predicted direction of hypotheses, and alterations to data. We conclude with recommendations to help mitigate the problem of an unrepresentative literature that we label the “Chrysalis Effect.”

You can read “The Chrysalis Effect: How Ugly Initial Results Metamorphosize Into Beautiful Articles” from Journal of Management free for the next two weeks by clicking here.

Want to stay current on all of the latest research published by Journal of Management? Click here to sign up for e-alerts! You can also follow the journal on Twitter – read through the latest tweets from Journal of Management by clicking here!

*Library image attributed to Apple Vershoor (CC)

 

How are Research Methods Taught?

[This blog post was originally posted on the SAGE Connection – Insight blog. To read the original blog post and find more content from SAGE Connection – Insight, click here.]

How can librarians better support faculty who teach research methods? What materials do students look for in their libraries? Sharlene Hesse-Biber, a celebrated research methods author and faculty member at Boston College, sought out the answers to these questions by working closely with her library to teach research methods to students. In this clip, Hesse-Biber shares insight on the type of research instruction that students receive in the classroom and where library research resources and support can fit into that process. Watch as she addresses several big questions surrounding research methods instruction such as:

  • How can faculty and librarians provide students with good exemplars of great research?
  • How can faculty and librarians support students conducting their first research project?
  • What do students learn at various levels of their academic careers?

For more information on research methods check out MethodSpace, home of the research methods community.

via How are Research Methods Taught? — SAGE Connection – Insight

The Role of Collaboration in Tourism Research

[We're pleased to welcome Gang Li of Deakin University. Gang recently published an article in Journal of Hospitality & Tourism Research entitled "Temporal Analysis of Tourism Research Collaboration Network" with co-authors Wei Fan and Rob Law, both of Hong Kong Polytechnic University.]

Network analysis is an effective tool for studying relationships among individuals, including relationships among researchers. We set out to investigate how the importance of individual researchers in tourism research collaboration networks changes over time, which may yield a better understanding of collaboration and help promote the progress of research.


We propose to evaluate the importance of researchers by considering both their productivity and their contribution to the connectivity of collaboration networks. In network analysis, centrality measures reflect the importance of nodes in a network, and degree and betweenness are two of the most commonly used measures in previous studies. This study found that betweenness centrality is better than degree centrality at reflecting changes in the importance of researchers.
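To make the contrast between degree and betweenness concrete, here is a small illustrative sketch in pure Python (our own toy example, not code or data from the article). It builds an invented co-authorship graph of two tightly knit clusters joined by a single "broker" researcher and computes both measures by brute force. The broker has fewer co-authors (lower degree) than the cluster hubs, yet the highest betweenness, because every shortest path between the clusters runs through it.

```python
from collections import deque
from itertools import combinations

def bfs_counts(adj, s):
    """BFS from s: geodesic distances and shortest-path counts."""
    dist, sigma = {s: 0}, {s: 1}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        for w in adj[u]:
            if w not in dist:           # first visit: one level deeper
                dist[w] = dist[u] + 1
                sigma[w] = 0
                queue.append(w)
            if dist[w] == dist[u] + 1:  # w extends a shortest path via u
                sigma[w] += sigma[u]
    return dist, sigma

def betweenness(adj):
    """For each node v, sum over pairs s < t of the fraction of
    shortest s-t paths that pass through v."""
    info = {s: bfs_counts(adj, s) for s in adj}
    bc = {v: 0.0 for v in adj}
    for s, t in combinations(adj, 2):
        d_s, sig_s = info[s]
        d_t, sig_t = info[t]
        if t not in d_s:                # s and t are disconnected
            continue
        for v in adj:
            if v in (s, t):
                continue
            # v lies on a shortest s-t path iff the distances add up
            if v in d_s and v in d_t and d_s[v] + d_t[v] == d_s[t]:
                bc[v] += sig_s[v] * sig_t[v] / sig_s[t]
    return bc

# Two research clusters {0,1,2} and {4,5,6}, bridged by researcher 3.
edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (4, 5), (4, 6), (5, 6)]
adj = {v: set() for v in range(7)}
for a, b in edges:
    adj[a].add(b)
    adj[b].add(a)

degree = {v: len(adj[v]) for v in adj}
bc = betweenness(adj)
# Broker 3 has fewer co-authors (degree 2) than hub 2 (degree 3),
# yet the highest betweenness: all 9 cross-cluster pairs route
# through it.
print(degree[3], degree[2])  # 2 3
print(bc[3], bc[2])          # 9.0 8.0
```

Degree only counts a researcher's direct co-authors, while betweenness credits a researcher for bridging otherwise separate groups, which is why it tracks changes in brokerage positions that degree misses.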

The method proposed in this study can provide information about the evolution of the collaboration network and the changes in each researcher's position. With further research on topic analysis of published articles, the proposed method may help to explore trends in tourism and hospitality research. Moreover, this work provides an alternative way to use centrality measures in network analysis.

The abstract for the paper:

Network analysis is an effective tool for the study of collaboration relationships among researchers. Collaboration networks constructed from previous studies, and their changes over time have been studied. However, the impact of individual researchers in collaboration networks has not been investigated systematically. We introduce a new method of measuring the contribution of researchers to the connectivity of collaboration networks and evaluate the importance of researchers by considering both contribution and productivity. Betweenness centrality is found to be better than degree centrality in terms of reflecting the changes of importance of researchers. Accordingly, a method is further proposed to identify key researchers at certain periods. The performance of the identified researchers demonstrates the effectiveness of the proposed method.

You can read “Temporal Analysis of Tourism Research Collaboration Network” from Journal of Hospitality & Tourism Research free for the next two weeks by clicking here. Want to know all about the latest research from Journal of Hospitality & Tourism Research? Click here to sign up for e-alerts!

*Image attributed to US Embassy (CC)

Top Five Articles from Organizational Research Methods

Summer is just around the corner, bringing with it longer days and warmer weather. To celebrate the season, we present a list of the most-read articles from Organizational Research Methods to add to your summer reading list.

“Seeking Qualitative Rigor in Inductive Research: Notes on the Gioia Methodology” by Dennis A. Gioia, Kevin G. Corley, and Aimee Hamilton (January 2013)

For all its richness and potential for discovery, qualitative research has been critiqued as too often lacking in scholarly rigor. The authors summarize a systematic approach to new concept development and grounded theory articulation that is designed to bring “qualitative rigor” to the conduct and presentation of inductive research.


“Validation of a New General Self-Efficacy Scale” by Gilad Chen, Stanley M. Gully, and Dov Eden (January 2001)

Researchers have suggested that general self-efficacy (GSE) can substantially contribute to organizational theory, research, and practice. Unfortunately, the limited construct validity work conducted on commonly used GSE measures has highlighted such potential problems as low content validity and multidimensionality. The authors developed a new GSE (NGSE) scale and compared its psychometric properties and validity to that of the Sherer et al. General Self-Efficacy Scale (SGSE). Studies in two countries found that the NGSE scale has higher construct validity than the SGSE scale. Although shorter than the SGSE scale, the NGSE scale demonstrated high reliability, predicted specific self-efficacy (SSE) for a variety of tasks in various contexts, and moderated the influence of previous performance on subsequent SSE formation. Implications, limitations, and directions for future organizational research are discussed.

“Common Beliefs and Reality About PLS: Comments on Rönkkö and Evermann (2013)” by Jörg Henseler, Theo K. Dijkstra, Marko Sarstedt, Christian M. Ringle, Adamantios Diamantopoulos, Detmar W. Straub, David J. Ketchen Jr., Joseph F. Hair, G. Tomas M. Hult, and Roger J. Calantone (April 2014)

This article addresses Rönkkö and Evermann’s criticisms of the partial least squares (PLS) approach to structural equation modeling. We contend that the alleged shortcomings of PLS are not due to problems with the technique, but instead to three problems with Rönkkö and Evermann’s study: (a) the adherence to the common factor model, (b) a very limited simulation design, and (c) overstretched generalizations of their findings. Whereas Rönkkö and Evermann claim to be dispelling myths about PLS, they have in reality created new myths that we, in turn, debunk. By examining their claims, our article contributes to reestablishing a constructive discussion of the PLS method and its properties. We show that PLS does offer advantages for exploratory research and that it is a viable estimator for composite factor models. This can pose an interesting alternative if the common factor model does not hold. Therefore, we can conclude that PLS should continue to be used as an important statistical tool for management and organizational research, as well as other social science disciplines.

“Using Generalized Estimating Equations for Longitudinal Data Analysis” by Gary A. Ballinger (April 2004)

The generalized estimating equation (GEE) approach of Zeger and Liang facilitates analysis of data collected in longitudinal, nested, or repeated measures designs. GEEs use the generalized linear model to estimate more efficient and unbiased regression parameters relative to ordinary least squares regression in part because they permit specification of a working correlation matrix that accounts for the form of within-subject correlation of responses on dependent variables of many different distributions, including normal, binomial, and Poisson. The author briefly explains the theory behind GEEs and their beneficial statistical properties and limitations and compares GEEs to suboptimal approaches for analyzing longitudinal data through use of two examples. The first demonstration applies GEEs to the analysis of data from a longitudinal lab study with a counted response variable; the second demonstration applies GEEs to analysis of data with a normally distributed response variable from subjects nested within branch offices of an organization.

“Answers to 20 Questions About Interrater Reliability and Interrater Agreement” by James M. LeBreton and Jenell L. Senter (October 2008)

The use of interrater reliability (IRR) and interrater agreement (IRA) indices has increased dramatically during the past 20 years. This popularity is, at least in part, because of the increased role of multilevel modeling techniques (e.g., hierarchical linear modeling and multilevel structural equation modeling) in organizational research. IRR and IRA indices are often used to justify aggregating lower-level data used in composition models. The purpose of the current article is to expose researchers to the various issues surrounding the use of IRR and IRA indices often used in conjunction with multilevel models. To achieve this goal, the authors adopt a question-and-answer format and provide a tutorial in the appendices illustrating how these indices may be computed using the SPSS software.
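As a rough illustration of the kind of index the abstract describes (a sketch of the classic single-item r_wg of James, Demaree, and Wolf, not the article's own SPSS tutorial), the agreement statistic compares the observed variance of judges' ratings to the variance expected under a uniform "no agreement" null distribution, (A² − 1)/12 for a scale with A response options. The use of the sample variance here is one common convention; the ratings are invented.

```python
from statistics import variance

def rwg(ratings, options):
    """Single-item interrater agreement r_wg (James, Demaree & Wolf):
    1 minus the ratio of the observed variance of the judges' ratings
    to the variance of a uniform null over `options` discrete
    response categories, (A**2 - 1) / 12."""
    null_var = (options ** 2 - 1) / 12
    obs_var = variance(ratings)  # sample variance across judges
    return 1 - obs_var / null_var

# Five judges rate one target on a 1-to-5 scale; near-perfect agreement.
print(rwg([4, 4, 5, 4, 4], 5))  # 0.9
print(rwg([3, 3, 3, 3, 3], 5))  # identical ratings -> 1.0
```

When judges disagree more than the uniform null would predict, the index goes negative; in practice such values are conventionally truncated to 0 before being used to justify aggregation.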

All of the above articles from Organizational Research Methods will be free to access for the next two weeks. Want to know all about the latest research from Organizational Research Methods? Click here to sign up for e-alerts!

*Reading image attributed to Herry Lawford (CC)

Discovering Surprising Connections: The Key to Finding Content?

Where do researchers go first to find new scholarly materials? Do researchers rely on recommendations from peers and faculty members to help the research process? A new two-part white paper from SAGE Publishing explains how serendipitous discovery during research can propel researchers in the right direction. With this in mind, SAGE developed a new discovery tool, SAGE Recommends, which helps users uncover relevant research materials by drawing connections between content.

Check out an infographic for the white paper below, or read the original white papers, “Expecting the Unexpected: Serendipity, Discovery, and the Scholarly Research Process” and “The Story of SAGE Recommends”.

[Infographic, parts 1–4]

Read the original white papers “Expecting the Unexpected: Serendipity, Discovery, and the Scholarly Research Process” and “The Story of SAGE Recommends”, or find the original press release here. Interested in reading more SAGE white papers? Find a full archive here.

Seeking Serendipitous Scholarly Discoveries: SAGE Recommends

Research is a fickle process – at times, carefully planned searches and methodical approaches yield a bounty of relevant information; at other times, it seems there is no information to be found. When research plateaus, a serendipitous discovery is often the best way to revive it. But how exactly can serendipity be applied to research when it is inherently coincidental? A new two-part white paper from SAGE Publishing discusses the part serendipity plays in academic research, and how to encourage more coincidental discoveries.

In the first paper, "Expecting the Unexpected: Serendipity, Discovery, and the Scholarly Research Process," written by Alan Maloney and Lettie Y. Conrad, findings from a survey of 239 students and faculty suggest that researchers prefer to stumble upon interesting, relevant content rather than have materials recommended by peers or by popularity. Notably, 78% of undergraduates and 91% of faculty are inclined to click on recommendations during their online research, particularly when the recommendations are directly relevant to their research topic.

[Image: Lemony Snicket quote]

In the second paper, “The Story of SAGE Recommends,” Alan Maloney describes how the research on serendipitous academic research led to the development of SAGE Recommends, a new discovery tool launched in December 2015. SAGE Recommends is designed to explain connections between content and subtly recommend relevant research materials to users. Alan Maloney explained:

SAGE Recommends is the first output of SAGE’s efforts over the last couple of years to develop better content intelligence, and to properly map and understand the disciplines in which we publish. This paper sets out how we have used this new knowledge and area of technical competence to make scholarly and educational materials more discoverable, to encourage new directions in research, and to delight our users.

The findings of this study will be discussed in a free webinar, which will take place on Tuesday, February 16th at 11 AM EST. The discussion will be moderated by InfoDOCKET’s Gary Price. To register, click here.

To read the first paper, "Expecting the Unexpected: Serendipity, Discovery, and the Scholarly Research Process," click here. To read the second paper, "The Story of SAGE Recommends," click here.