Marcia Simmering on the Detection of Common Method Variance

[We’re pleased to welcome Marcia Simmering of Louisiana Tech University. Dr. Simmering recently published an article in Organizational Research Methods with Christie M. Fuller, Hettie A. Richardson, Yasemin Ocal, and Guclu M. Atinc entitled “Marker Variable Choice, Reporting, and Interpretation in the Detection of Common Method Variance: A Review and Demonstration.”]

  • What inspired you to be interested in this topic?

After the publication of my earlier piece on common method variance (Richardson, Simmering, & Sturman, 2009 in ORM), where we found that marker variables could be potentially useful in detecting method variance, I kept getting questions from other researchers about what marker variables they should use in their own studies. I didn’t always have an answer, because the appropriateness of a marker variable depends on the study variables. So, I worked with a team of co-authors from different business disciplines on the current paper to find good marker variables in a variety of studies. As we all read articles using marker variables, we found so much variation in how they were used, and we learned that many had not been chosen or implemented properly. So, my coauthors and I decided to give an overview of how these techniques have been used (and misused). We took it a step further and tried to find out what these marker variables are really measuring and whether they’re measuring something different from presumed causes of common method variance (CMV), like social desirability and affectivity.

  • Were there findings that were surprising to you?

Yes! I would say that most of what we found in both studies surprised us. In Study 1 (the review of marker variable use), I didn’t expect so many authors to choose marker variables that really couldn’t properly capture CMV. And, I was surprised at how little journal space was given to tests of CMV. In Study 2, we didn’t know what we would find about what marker variables might detect in comparison to presumed causes of CMV, but we were still surprised to find that one added measure (either marker or presumed cause) is likely not enough to reasonably detect CMV and that multiple marker and CMV-cause variables in one study give much more information.

  • How do you see this study influencing future research and/or practice?

We hope that other researchers can find this article helpful in choosing appropriate marker variables and analyzing them in a way that can reasonably detect CMV. This is easier said than done, because a good marker variable typically must be chosen before data collection, and perhaps this article can influence more authors to plan for that. But, we hope, too, that reviewers gain some knowledge about how these techniques can be used to detect CMV. And, our ultimate goal is that this work can get us a little bit closer to understanding the large, complex, and still ambiguous phenomenon of CMV in social science research.

You can read “Marker Variable Choice, Reporting, and Interpretation in the Detection of Common Method Variance: A Review and Demonstration” from Organizational Research Methods for free for the next two weeks by clicking here. Want to know about all the latest research like this from Organizational Research Methods? Click here to sign up for e-alerts!


Marcia J. Simmering is the Francis R. Mangham Endowed Professor of Management and Assistant Dean of Undergraduate Programs in the College of Business at Louisiana Tech University. Her current research focuses on the methods topics of common method variance and control variables. Additionally, she has published research on feedback, compensation, and training.

Christie M. Fuller is the Thomas O’Kelly-Mitchener Associate Professor of Computer Information Systems at Louisiana Tech University. Her research on deception and decision support systems has been published in Decision Support Systems, Expert Systems with Applications, and IEEE Transactions on Professional Communication, along with other journals and conference proceedings.

Hettie A. Richardson is an associate professor and Chair of the Department of Management, Entrepreneurship, and Leadership in the Neeley School of Business at Texas Christian University. Her methodological research interests focus on common method variance and other measurement-related issues. She also studies employee involvement, empowerment, and strategic human resource management.

Yasemin Ocal is an assistant professor of Marketing at Texas A&M University-Commerce. Her research focuses on response rates and response bias in marketing research and has appeared in journals such as the Journal of Leadership and Organizational Studies and at numerous international conferences, including a session on survey response rate issues that she organized at the World Marketing Congress of the Academy of Marketing Science.

Guclu M. Atinc is an assistant professor of Management at Texas A&M University-Commerce. His current research addresses board composition, top management teams, and ownership structures of young entrepreneurial firms, as well as research methods. Dr. Atinc’s research has appeared in journals such as Organizational Research Methods, Journal of Managerial Issues, and Journal of Leadership and Organizational Studies.

Truth or Urban Legend?

“Method Variance in Organizational Research: Truth or Urban Legend?” currently appears as one of the most frequently cited articles in Organizational Research Methods, based on citations to HighWire-hosted online articles. The article was published in the April 2006 issue of ORM and written by Paul E. Spector of the University of South Florida. Professor Spector kindly provided some additional background on the popular article.

I was asked how I came to write “Method Variance in Organizational Research: Truth or Urban Legend?”, which appeared in Organizational Research Methods in 2006. After a session on the journal review process at the Southern Management Association conference in 2003, Bob Vandenberg (University of Georgia) suggested to me, Larry Williams (Purdue University), and a few others that we should organize a session on methodological urban legends for the Academy of Management (AOM) conference. Bob’s idea was that we would talk about widely accepted methodological practices that have a questionable basis. Because I had been interested (with my USF colleague Mike Brannick) in the misunderstanding of common method variance, it was the first topic I thought to contribute. Bob organized the session for the 2004 AOM conference, which was given to a standing-room-only crowd of several hundred. Given the strong interest in the session, we decided to publish the papers as a feature topic in an upcoming issue of Organizational Research Methods. Three of us expanded our conference papers into articles that were published in 2006. In addition to my paper, there were papers by Chuck Lance, Marcus Butts, and Lawrence Michels (University of Georgia) on cutoff criteria, and by Larry James, Stan Mulaik (Georgia Institute of Technology), and Jeanne Brett (Northwestern University) on tests for mediation. All three of these papers have been widely cited.

My interest in common method variance and methodological urban legends has continued since the publication of this Organizational Research Methods paper. In 2010, Mike Brannick and I were editors for an Organizational Research Methods feature topic on common method variance. One of the papers was based on a panel discussion Mike and I participated in at the Society for Industrial and Organizational Psychology (SIOP) conference in 2009. Around the same time, Bob Vandenberg, who was the Organizational Research Methods editor-in-chief at the time, put out a call for another feature topic on methodological urban legends to parallel a methodological urban legends session at SIOP in 2010. Mike Brannick and I contributed a paper on the misuse of statistical control variables, which will be published in Organizational Research Methods in 2011 along with several other methodological urban legend papers.

Editor’s Note: You can read that article and the others that Professor Spector mentions in the April 2011 issue of Organizational Research Methods.