Measuring Individual Behavior Workshop

Schedule and location

Tuesday, May 28 and Wednesday, May 29, 2019

University of Jyväskylä, Agora Building (Mattilanniemi 2, 40100 Jyväskylä), room Alfa.


Registration is open until May 17.


Professor Merrill Warkentin, James J. Rouse Endowed Professor of Information Systems, Mississippi State University, USA


Professor Tuure Tuunanen, University of Jyväskylä, Finland


“What We Measure: Metrics for Investigating Individual Behavior in Business Research”

(with special focus on NeuroIS / Neuro-Physiological Methods)


What are we measuring for our dependent variable (and for the other constructs in our variance models), and what should we be measuring?

We’ll start with a look back at years of research publications in the top journals in Information Systems, focusing especially on individual-level (micro-level, as opposed to group-level, organizational-level (macro), industry-level, or society-level) quantitative empirical research.  Much like the literature in Management (employees), Marketing (consumer behavior), Behavioral Economics, and other fields, the IS field has seen many cross-sectional surveys that use self-reported measures of behavioral intention (BI) as the dependent variable (DV), a practice that is rightfully coming under increased scrutiny.

We will discuss the problems with using intention and other measures, and we’ll explore the trend toward seeking more valid measures of individual activities, such as actual behavior measures and many others.  We will identify creative and innovative approaches in the IS literature that are also found in consumer behavior, human resources, and other areas of business research.  This will include a discussion of neuro-physiological data collection methods and what they promise for improving our research metrics.

I will provide examples from my focal research area – information security and privacy behaviors by individual computer users.  I will also present several specific examples of newer, more valid metrics from my own recent publications and others.

Preparation and pre-course Assignment 

1. Read the primary readings in the reading list.  Skim the secondary readings, focusing on the Methods section.

2. Select one article from the secondary list and one additional article from your own research interest to read carefully, then assess their measurement methods.  For each article, prepare a one-page bulleted list of the strengths and weaknesses of that paper’s research method, especially the method for measuring individual behavior.  Submit these two assessments by email before May 21, suitable for distribution to workshop participants (with your name and the paper’s citation).

3. Select one additional article from your primary research focal area (from a top journal in that discipline; it can be your own manuscript, published or not) which uses post-positivist quantitative empirical research methods, and assess the measurement of the key constructs, especially the dependent variable (DV).  Be prepared to discuss these measurement issues (strengths and weaknesses) during the workshop.  Submit a one-page assessment by email before May 21, suitable for distribution to workshop participants (with your name and the paper’s citation).

4. Bring one or two of your own current (or recent) projects to discuss with others, with the goal of considering alternative measures (with a different research design) or developing a second study to complement the current study, introducing a design that offers strength(s) where your current project may have weaknesses.

Detailed Program


Day 1 – May 28, 2019

0800-0920: Introduction to the Speaker and the Workshop
    Introductions of participants and their research domains

0920-0930: Break

0930-1100: Overview of the Philosophy of Science and the Empirical Method
    Levels of analysis; McGrath and the research method circumplex
    Discussion of trade-offs between alternative data collection methods
    Key research sub-domains and their primary measurement methods

1100-1200: Assessment of Key Measurements for Individual-Level Phenomena
    What we measure in IS (overview of papers from the top six AIS journals)

1200-1300: Lunch

1300-1345: Discussion of measurement paradigms in Organizational Behavior, Consumer Behavior, Behavioral Economics, and other scientific disciplines

1345-1520: Participant topics: presentations and discussion
    Top published works plus personal projects

1520-1530: Break

1530-1700: Introduction to Neuro-Physiological Data Collection


Day 2 – May 29, 2019

0800-0920: NeuroIS and other innovative data collection in IS research
    BYU lab, Temple lab, MSU lab, University of Applied Sciences Upper Austria
    What is each data collection method good at measuring?  What can’t it measure?
    How can neuro methods be incorporated into your research study?
    Methodological pluralism

0920-0940: Break

0940-1100: Rigorous research design and innovative methods for validation
    Common method bias/variance (CMV): a priori methods and solutions
    Manipulation checks, attention checks, instructional manipulation checks
    Realism checks (when using scenarios)
    Social desirability bias and various methods to overcome it (including the “coin flip” approach)
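The “coin flip” approach is the classic randomized response design: each respondent privately flips a coin and answers the sensitive question truthfully only on heads, otherwise gives a forced “yes,” so no individual answer is incriminating, yet the true prevalence can be recovered in aggregate.  A minimal sketch of that estimator (the function name and the fair-coin parameters are illustrative assumptions, not part of the workshop materials):

```python
def estimate_prevalence(yes_count, n, p_truth=0.5, p_forced_yes=0.5):
    """Back out the true rate of a sensitive behavior from forced-response data.

    With probability p_truth the respondent answers truthfully; otherwise they
    answer "yes" with probability p_forced_yes regardless of the truth, so
    P(yes) = p_truth * pi + (1 - p_truth) * p_forced_yes.  Solve for pi.
    """
    p_obs = yes_count / n
    pi = (p_obs - (1 - p_truth) * p_forced_yes) / p_truth
    return min(max(pi, 0.0), 1.0)  # clamp sampling noise into [0, 1]

# 400 respondents, 180 "yes" answers under a fair-coin design:
# P(yes) = 0.5*pi + 0.25, so pi = 2 * (0.45 - 0.25) = 0.40
print(round(estimate_prevalence(180, 400), 4))  # -> 0.4
```

The price of the privacy protection is higher sampling variance, which is exactly the kind of design trade-off (in the McGrath sense) the workshop examines.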

1100-1200: The Factorial Survey Method (FSM) for behavioral research, especially for measuring deviant behaviors and behaviors subject to social desirability bias
    How to establish the independent variables in FSM research
    Presentation and discussion of the workshop leader’s projects

1200-1300: Lunch

1300-1430: Scrubbing the data: marker variables (CMV), outlier analysis, etc.
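As a hedged illustration of the outlier-analysis step, one simple screen flags responses whose standardized score exceeds a cutoff (the |z| > 3 rule below is a common convention, not a prescription from the workshop; robust, median-based screens are often preferable for heavily skewed data):

```python
import statistics

def flag_outliers(values, z_cutoff=3.0):
    """Return the indices of values whose |z-score| exceeds z_cutoff."""
    mean = statistics.fmean(values)
    sd = statistics.stdev(values)  # sample standard deviation
    if sd == 0:
        return []
    return [i for i, v in enumerate(values) if abs((v - mean) / sd) > z_cutoff]

# Fourteen plausible Likert-style responses and one suspicious entry:
scores = [4, 5, 3, 4, 5, 4, 3, 5, 4, 4, 5, 3, 4, 4, 50]
print(flag_outliers(scores))  # -> [14]
```

Flagged cases then warrant inspection (data-entry error, careless respondent, or a legitimate extreme) rather than automatic deletion.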

1430-1520: Applications in participants’ domains: discussions and presentations

1520-1530: Break

1530-1700: Final discussions, student presentations, wrap-up, Q&A


Post-course Assignment

In a short essay, summarize how you will apply the principles learned in this course to your own work.  Identify specifically how you will increase the rigor of your measurement methods and how you will seek to validate the data you have captured.

In your essay, be sure to apply the material you learned about trade-offs between various research designs (based on McGrath, etc.), and make sure you identify both the strengths and weaknesses of your research design ideas.  Include significant excerpts of your research instrument (survey) and your research protocol.

Pre-course required reading

Primary readings

McGrath, J. E. (1995). Methodology matters: Doing research in the behavioral and social sciences. In Readings in Human–Computer Interaction (pp. 152-169). Morgan Kaufmann.

MacKenzie, S. B., & Podsakoff, P. M. (2012). Common method bias in marketing: Causes, mechanisms, and procedural remedies. Journal of Retailing, 88(4), 542-555.

Oppenheimer, D. M., Meyvis, T., & Davidenko, N. (2009). Instructional manipulation checks: Detecting satisficing to increase statistical power. Journal of Experimental Social Psychology, 45(4), 867-872.

Taylor, B. J. (2005). Factorial surveys: Using vignettes to study professional judgement. British Journal of Social Work, 36(7), 1187-1207.

Riedl, R., Davis, F. D., & Hevner, A. R. (2014). Towards a NeuroIS research methodology: Intensifying the discussion on methods, tools, and measurement. Journal of the Association for Information Systems, 15(10), I.

Dimoka, A., Davis, F. D., Gupta, A., Pavlou, P. A., Banker, R. D., Dennis, A. R., ... & Kenning, P. H. (2012). On the use of neurophysiological tools in IS research: Developing a research agenda for NeuroIS. MIS Quarterly, 679-702.

Venkatesh, V., Brown, S. A., & Bala, H. (2013). Bridging the qualitative-quantitative divide: Guidelines for conducting mixed methods research in information systems. MIS Quarterly, 21-54.

Wall, J. D., & Warkentin, M. (2019). Perceived argument quality's effect on threat and coping appraisals in fear appeals: An experiment and exploration of realism check heuristics. Information & Management, forthcoming (published online 21 March 2019).

Siponen, M., & Vance, A. (2014). Guidelines for improving the contextual relevance of field surveys: The case of information security policy violations. European Journal of Information Systems, 23(3), 289-305.

Warkentin, Merrill, Detmar Straub, and Kalana Malimage. “Measuring Secure Behavior: A Research Commentary,” Proceedings of the 7th Annual Symposium on Information Assurance, Albany, New York, June 5-6, 2012, pp. 1-8 (the full current working version of this conference paper will be made available).

Orazi, Davide Christian, Merrill Warkentin, and Allen C. Johnston.  “Guidelines for designing fear appeals and conducting IS security research,” Communications of the AIS, 2019, forthcoming.

Secondary readings

Vedadi, Ali and Merrill Warkentin. “Can secure behaviors be contagious? A two-stage investigation of the influence of herd behavior on security decisions,” Journal of the Association for Information Systems, 2019, conditionally accepted.

Ormond, Dustin, Merrill Warkentin, and Robert E. Crossler.  “Understanding Information Security Policy Compliance through an Affective Lens,” Journal of the Association for Information Systems, 2019, forthcoming.

Johnston, Allen C., Merrill Warkentin, Alan R. Dennis, and Mikko Siponen.  “Speak their language: Designing effective messages to improve employees’ information security decision making,” Decision Sciences, 2019, forthcoming (published online 23 July 2018).

Barlow, Jordan B., Merrill Warkentin, Dustin Ormond, and Alan R. Dennis. “Don’t even think about it! The effects of anti-neutralization, informational, and normative communication on information security compliance,” Journal of the Association for Information Systems, Volume 19, Issue 8, August, 2018, pp. 689-715.

Willison, Robert, Merrill Warkentin, and Allen C. Johnston.  “Examining Employee Computer Abuse Intentions: Insights from Justice, Deterrence, and Neutralization Perspectives,” Information Systems Journal, Vol 28, No 2, March 2018, pp. 266-293.

Warkentin, Merrill, Allen C. Johnston, Jordan Shropshire, and William D. Barnett. “Continuance of Protective Security Behavior: A Longitudinal Study,” Decision Support Systems, Volume 92, December 2016, pp. 25-35.

Ormond, Dustin; Merrill Warkentin; Allen C. Johnston, and Samuel C. Thompson. “Perceived deception: Evaluating source credibility and self-efficacy,” Journal of Information Privacy & Security, Vol. 12, Issue 4, 2016, pp. 197-217.

Warkentin, Merrill, Eric A. Walden, Allen C. Johnston, and Detmar W. Straub. “Neural Correlates of Protection Motivation for Secure IT Behaviors: An fMRI Examination,” Journal of the Association for Information Systems, Vol. 17, Issue 3, March 2016, pp. 194-215.

Johnston, Allen C., Merrill Warkentin, and Mikko Siponen.  “An Enhanced Fear Appeal Framework: Leveraging Threats to the Human Asset through Sanctioning Rhetoric,” MIS Quarterly, Vol. 39, No. 1, 2015, pp. 113-134.

Ormond, Dustin and Merrill Warkentin.  “Is This a Joke? The Impact of Message Manipulations on Risk Perceptions,” Journal of Computer Information Systems, Volume 55, Number 2, Winter 2015, pp. 9-19.

Trinkle, Brad S., Robert E. Crossler, and Merrill Warkentin. “I’m game, are you? Reducing Real-World Security Threats by Managing Employee Activity in Virtual Environments,” Journal of Information Systems, Vol. 28, No. 2, 2014, pp. 307-327.

A URL with all readings will be provided to registered attendees/students.  Additional readings will also be posted.

Published Information Security Research Articles that use Scenario-Analysis

Publications using the Factorial Survey Method (FSM) with scenarios:

Barlow, J.B., M. Warkentin, D. Ormond, & A.R. Dennis (2018). Don’t even think about it! The effects of anti-neutralization, informational, and normative communication on information security compliance, Journal of the Association for Information Systems 19(8), 689-715.

Barlow, J.B., M. Warkentin, D. Ormond, & A.R. Dennis. (2013). Don't make excuses! Discouraging neutralization to reduce IT policy violation, Computers & Security 39, 145-159.

Johnston, A.C., M. Warkentin, M. McBride, & L.D. Carter. (2016). Dispositional and Situational Factors: Influences on IS Security Policy Violations. European Journal of Information Systems 25(3), 231-251.

Lee, J. Jr., M. Warkentin, R.E. Crossler, & R.F. Otondo (2017). Implications of monitoring mechanisms on bring your own device adoption, Journal of Computer Information Systems 57(4), 309-318.

Menard, P., M. Warkentin, & P.B. Lowry (2018). The impact of collectivism and psychological ownership on protection motivation: A cross-cultural examination, Computers & Security 75(June), 147-166.

Trinkle, B.S., R.E. Crossler, & M. Warkentin. (2014). I’m game, are you? Reducing real-world security threats by managing employee activity in virtual environments, Journal of Information Systems 28(2), 307-327.

Vance, A., Lowry, P. B., & Eggett, D. (2013). Using accountability to reduce access policy violations in information systems. Journal of Management Information Systems 29(4), 263-290.

Vance, A., Lowry, P., & Eggett, D. (2015). Increasing accountability through the user interface design artifacts: A new approach to addressing the problem of access-policy violations. MIS Quarterly 39(2), 345–366.

Willison, R., M. Warkentin, & A.C. Johnston (2018). Examining employee computer abuse intentions: Insights from justice, deterrence, and neutralization perspectives, Information Systems Journal 28(2), 266-293.

Publications that use scenarios, but not the FSM:

D'Arcy, J., Hovav, A., & Galletta, D. (2009). User awareness of security countermeasures and its impact on information systems misuse: A deterrence approach. Information Systems Research, 20(1), 79-98.

Khansa, L., Kuem, J., Siponen, M., & Kim, S. S. (2017). To cyberloaf or not to cyberloaf: The impact of the announcement of formal organizational controls. Journal of Management Information Systems, 34(1), 141-176.

Lowry, P. B., Moody, G. D., Galletta, D. F., & Vance, A. (2013). The drivers in the use of online whistle-blowing reporting systems. Journal of Management Information Systems, 30(1), 153-190.

Wall, J. D., & Warkentin, M. (2019). Perceived argument quality's effect on threat and coping appraisals in fear appeals: An experiment and exploration of realism check heuristics. Information & Management, forthcoming (published online 21 March 2019).

Credit points

Doctoral students participating in the seminar can obtain 2 credit points. This requires participation on both days and completion of the assignments.

Registration fee

This seminar is free of charge for member organizations' staff and their PhD students. For others, the participation fee is 400 €. The participation fee includes access to the event and the event materials. Lunch and dinner are not included.