
LTEC Learning Design & Technology

Appendix 2: Design-Based Research (DBR) Dissertations

The core of the DBR dissertation is the research, development, and testing of a product or process that solves an educational problem. Design-based research protocols require intensive, long-term collaboration between researchers and practitioners. Design-based research integrates the development of solutions to practical problems in learning environments with the identification of reusable design principles. Barab and Squire (2004) defined design-based research as “a series of approaches, with the intent of producing new theories, artifacts, and practices that account for and potentially impact learning and teaching in naturalistic settings” (p. 2). They described it as a methodology that requires: addressing complex problems in real contexts in collaboration with practitioners; integrating known and hypothetical design principles with technological affordances to render plausible solutions to these complex problems; and conducting rigorous and reflective inquiry to test and refine innovative learning environments as well as to define new design principles. A research proposal for a doctoral study using a design-based approach must therefore include a practitioner-oriented focus as well as degrees of collaboration that are not necessarily required for more traditional predictive research designs.

A design-based research proposal consists of the following:

Phase 1: Problem Identification (Chapter 1)

For design-based research in education, the identification and exploration of a significant educational problem is a crucial first step. It is this problem that creates a purpose for the research, and it is the creation and evaluation of a potential solution to this problem that will form the focus of the entire study. Many research students begin by thinking of a solution—such as a technology-based intervention, an educational game, or a technology tool—before they consider the educational problem it could solve. Problems then arise when the solution is revealed to be a project of interest or ‘pet’ project, rather than a genuine attempt to solve an educational problem. The statement of the problem in design-based research should identify an issue or an opportunity, explore its history or background, and provide a convincing and persuasive argument that this problem is significant and worth researching. This includes articulating both the practical and scientific relevance of the study. In line with the exploratory nature of design research, driving questions should therefore be open in nature. The assumptions that direct DBR are derived from the definition of the research problem in close collaboration with practitioners.

Literature Review (Chapter 2)

The research focus is fine-tuned through the literature review, which serves to (a) flesh out what is already known about the problem and (b) guide the development of potential solutions. The inquiry that forms the basis of DBR helps the researcher understand the underpinning processes and variables and how they affect learning and learning outcomes. A well-described theoretical framework provides a sound basis for the proposed solution, because theory can inform practical design guidelines. Even though they are largely based on the literature, it is unlikely that draft design principles will be complete at the time the proposal is presented. At the very least, the process of deriving them should be described and examples given.

Description of proposed intervention (Chapter 3)

The proposed solution to the educational problem is developed from consideration of the relevant literature, from consultation and collaboration with researchers and practitioners, and as an instantiation of the principles derived from these sources. It is important to describe in the prospectus or proposal how the intervention will be conceptualized and developed.

Methodology (Chapter 4)

The methodology describes the iterative cycles of testing and refinement of solutions in practice. Both qualitative and quantitative methods may be used in DBR. A research proposal should include details of the methodology for implementing and evaluating the proposed solution, as this largely constitutes the data collection and analysis stages of the study. The proposal should also acknowledge the likelihood, and in some cases the desirability, of significant modifications being required in the data collection and analysis phases of the ongoing study.

Implementation of intervention (First iteration – Alpha stage – prototyping)

The iterative nature of design-based research means that a single implementation is rarely sufficient to gather enough evidence about the success of the intervention and its effect on the problem situation. A typical design-based research study has two or more cycles: after the first implementation and evaluation, changes are made to the learning environment to further improve its ability to address the problem. In DBR the context of the inquiry must be seen as a means to an end rather than an end in itself; the intention is to use the setting to gain an understanding that will have meaning beyond the immediate setting.

Participants

In a research proposal, the description of participants and the method of their selection provide important information for reviewers about the potential for bias in the proposed study. Participants are usually individuals who reflect the characteristics of or are influenced by the issues being considered by the investigation. 

Data collection and analysis

Data collection in design-based research can involve qualitative and/or quantitative data, and data may be collected in cycles lasting several weeks, semesters, or even years. The types of data collected are likely to vary across phases. For example, data contributing to contextual understanding are more likely to be emphasized in earlier stages of the study, whereas data on prototype characteristics or user reactions are more likely to be collected later on. This section of the proposal describes (a) data sources, which may vary by time, location, and participants; (b) data collection methods, including varying formats (interviews, observations, etc.); and (c) the analysis approaches to be used with each data type.

Implementation of intervention—second and third iterations (Beta stage testing iteration and Gamma stage final assessment) 

Although the nature of the second and subsequent iterations of the intervention cannot be specified in advance, because they depend on the findings of the first iteration, the proposal should still describe the process to be undertaken. The cyclic nature of the data collection and analysis cannot be described in great detail in the proposal, but the process of data collection, analysis, further refinement of the learning environment, implementation, and renewed data collection (and so on) should be explained as a method in the proposal.

After the Proposal

Once the proposal is approved and IRB approval is obtained, the student begins collecting data for the Alpha stage, which is then analyzed. Based on those findings, the intervention is adapted and then tested again in the Beta stage. After data collection and analysis in the Beta stage, the intervention is again modified and a final test (the Gamma stage) with actual users is conducted.

Findings (Chapter 5) 

In this chapter the researcher presents the findings of each iteration as well as describes the modifications made to the intervention based on the findings of each stage. The specific organization of the findings in Chapter 5 should be decided in consultation with the dissertation chair.

Conclusion (Chapter 6) 

Reflection to produce “design principles” and enhance solution implementation

Design principles

The knowledge claim of design-based research, and one that sets it apart from other research approaches, takes the form of design principles. Design principles contain substantive and procedural knowledge with comprehensive and accurate portrayal of the procedures, results and context, such that readers may determine which insights may be relevant to their own specific settings. In the traditional sense, generalization of design-based research findings is rather limited; instead, use of design principles calls for a form of analytical generalization.

Practical outputs: Designed artifact(s)

In design-based research, the product of design is viewed as a major output. Design artifacts in this field may range from software packages to professional development programs, and many more.

Societal outputs: Professional development of participants

The collaboration that is so integral to the process of defining and accomplishing a design-based research project has the additional benefit of enhancing the professional development of all involved.

Theoretical/Conceptual Framework and Literature Connections

The final chapter should also address how the findings from the research study compare and contrast with expectations based on the literature and the theoretical or conceptual framework used by the researcher.

The figure below shows a typical DBR process.

[Figure: typical DBR process]

Other Content

As in the traditional and three-paper dissertation options, the dissertation will include appropriate Appendices and a full Reference list.

Design-based research requires frequent and prolonged periods of fieldwork, offset by periods of review, reflection, and re-design. These intervals should be clearly accounted for in any timeline accompanying the research proposal. A major strength of design research lies in its adaptability: the commitment to adjusting a study’s course based on the findings of each iteration.

Committee Review

LTEC has adopted the following criteria, beyond those applied to a traditional dissertation, as the basis for evaluating a DBR dissertation:

  • Appraise the intellectual merit of the research and the product/process proposed;
  • Review the contribution to new or existing design principles;
  • Assess the quality and appropriateness of the practical solutions proposed for real-world educational problems.

In general, the dissertation chair and committee are responsible for determining the appropriateness and quality of the proposed DBR research.

The format of the final dissertation must meet the style guidelines established by the UHM Graduate Division for theses and dissertations. All LTEC dissertations use APA style guidelines.


An Introduction to Design-Based Research with an Example From Statistics Education


Arthur Bakker & Dolly van Eerde

Part of the book series: Advances in Mathematics Education (AME)


This chapter arose from the need to introduce researchers, including Master and PhD students, to design-based research (DBR). In Sect. 16.1 we address key features of DBR and differences from other research approaches. We also describe the meaning of validity and reliability in DBR and discuss how they can be improved. Section 16.2 illustrates DBR with an example from statistics education.



References

Akkerman, S. F., Admiraal, W., Brekelmans, M., & Oost, H. (2008). Auditing quality of research in social sciences. Quality & Quantity, 42, 257–274.

Anderson, T., & Shattuck, J. (2012). Design-based research: A decade of progress in education research? Educational Researcher, 41, 16–25.

Artigue, M. (1988). Ingénierie didactique [Didactical engineering]. In M. Artigue, G. Brousseau, J. Brun, Y. Chevallard, F. Conne, & G. Vergnaud (Eds.), Didactique des mathématiques [Didactics of mathematics]. Paris: Delachaux et Niestlé.

Bakkenes, I., Vermunt, J. D., & Wubbels, T. (2010). Teachers learning in the context of educational innovation: Learning activities and learning outcomes of experienced teachers. Learning and Instruction, 20(6), 533–548.

Bakker, A. (2004a). Design research in statistics education: On symbolizing and computer tools. Utrecht: CD-Bèta Press.

Bakker, A. (2004b). Reasoning about shape as a pattern in variability. Statistics Education Research Journal, 3(2), 64–83. Online: http://www.stat.auckland.ac.nz/~iase/serj/SERJ3(2)_Bakker.pdf

Bakker, A. (2007). Diagrammatic reasoning and hypostatic abstraction in statistics education. Semiotica, 164, 9–29.

Bakker, A., & Derry, J. (2011). Lessons from inferentialism for statistics education. Mathematical Thinking and Learning, 13, 5–26.

Bakker, A., & Gravemeijer, K. P. E. (2004). Learning to reason about distribution. In D. Ben-Zvi & J. Garfield (Eds.), The challenge of developing statistical literacy, reasoning, and thinking (pp. 147–168). Dordrecht: Kluwer.

Bakker, A., & Gravemeijer, K. P. E. (2006). An historical phenomenology of mean and median. Educational Studies in Mathematics, 62(2), 149–168.

Bakker, A., & Hoffmann, M. (2005). Diagrammatic reasoning as the basis for developing concepts: A semiotic analysis of students’ learning about statistical distribution. Educational Studies in Mathematics, 60, 333–358.

Ben-Zvi, D., Aridor, K., Makar, K., & Bakker, A. (2012). Students’ emergent articulations of uncertainty while making informal statistical inferences. ZDM The International Journal on Mathematics Education, 44, 913–925.

Biehler, R., Ben-Zvi, D., Bakker, A., & Makar, K. (2013). Technology for enhancing statistical reasoning at the school level. In M. A. Clements, A. J. Bishop, C. Keitel, J. Kilpatrick, & F. K. S. Leung (Eds.), Third international handbook of mathematics education (pp. 643–689). New York: Springer. doi:10.1007/978-1-4614-4684-2_21

Brown, A. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. Journal of the Learning Sciences, 2, 141–178.

Cicchetti, D. V. (1976). Assessing inter-rater reliability for rating scales: Resolving some basic issues. British Journal of Psychiatry, 129, 452–456.

Cobb, P., & Whitenack, J. W. (1996). A method for conducting longitudinal analyses of classroom videorecordings and transcripts. Educational Studies in Mathematics, 30(3), 213–228.

Cobb, P., Yackel, E., & Wood, T. (1992). A constructivist alternative to the representational view of mind in mathematics education. Journal for Research in Mathematics Education, 23(1), 2–33.

Cobb, P., Gravemeijer, K. P. E., Bowers, J., & McClain, K. (1997). Statistical Minitools. Designed for Vanderbilt University, TN, USA. Programmed and revised (2001) at the Freudenthal Institute, Utrecht University, the Netherlands.

Cobb, P., Confrey, J., diSessa, A., Lehrer, R., & Schauble, L. (2003a). Design experiments in educational research. Educational Researcher, 32(1), 9–13.

Cobb, P., McClain, K., & Gravemeijer, K. P. E. (2003b). Learning about statistical covariation. Cognition and Instruction, 21, 1–78.

Collins, A. (1992). Toward a design science of education. In E. Scanlon & T. O'Shea (Eds.), New directions in educational technology (pp. 15–22). New York: Springer.

Cook, T. (2002). Randomized experiments in education: A critical examination of the reasons the educational evaluation community has offered for not doing them. Educational Evaluation and Policy Analysis, 24(3), 175–199.

Creswell, J. W. (2005). Educational research: Planning, conducting, and evaluating quantitative and qualitative research (2nd ed.). Upper Saddle River: Pearson Education.

Creswell, J. W. (2007). Qualitative inquiry and research design: Choosing among five traditions (2nd ed.). Thousand Oaks: Sage.

De Jong, R., & Wijers, M. (1993). Ontwikkelingsonderzoek: Theorie en praktijk [Developmental research: Theory and practice]. Utrecht: NVORWO.

Denscombe, M. (2007). The good research guide (3rd ed.). Maidenhead: Open University Press.

Dierdorp, A., Bakker, A., Eijkelhof, H. M. C., & Van Maanen, J. A. (2011). Authentic practices as contexts for learning to draw inferences beyond correlated data. Mathematical Thinking and Learning, 13, 132–151.

diSessa, A. A., & Cobb, P. (2004). Ontological innovation and the role of theory in design experiments. Journal of the Learning Sciences, 13(1), 77–103.

Drijvers, P. H. M. (2003). Learning algebra in a computer algebra environment: Design research on the understanding of the concept of parameter. Utrecht: CD-Bèta Press.

Edelson, D. C. (2002). Design research: What we learn when we engage in design. Journal of the Learning Sciences, 11, 105–121.

Educational Psychologist. (2004). Special issue on design-based research methods for studying learning in context, 39(4).

Educational Researcher. (2003). Special issue on the design-based research collective, 32(1–2).

Engeström, Y. (2011). From design experiments to formative interventions. Theory and Psychology, 21(5), 598–628.

Fosnot, C. T., & Dolk, M. (2001). Young mathematicians at work: Constructing number sense, addition, and subtraction. Portsmouth: Heinemann.

Freudenthal, H. (1978). Weeding and sowing: Preface to a science of mathematical education. Dordrecht: Reidel.

Freudenthal, H. (1988). Ontwikkelingsonderzoek [Developmental research]. In K. Gravemeijer & K. Koster (Eds.), Onderzoek, ontwikkeling en ontwikkelingsonderzoek [Research, development and developmental research]. Utrecht: OW&OC, Universiteit Utrecht.

Freudenthal, H. (1991). Revisiting mathematics education: China lectures. Dordrecht: Kluwer.

Frick, R. W. (1998). Interpreting statistical testing: Process and propensity, not population and random sampling. Behavior Research Methods, Instruments, & Computers, 30(3), 527–535.

Friel, S. N., Curcio, F. R., & Bright, G. W. (2001). Making sense of graphs: Critical factors influencing comprehension and instructional implications. Journal for Research in Mathematics Education, 32(2), 124–158.

Geertz, C. (1973). Thick description: Toward an interpretive theory of culture. In C. Geertz (Ed.), The interpretation of cultures: Selected essays (pp. 3–30). New York: Basic Books.

Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for qualitative research. Chicago: Aldine.

Goffree, F. (1979). Leren onderwijzen met Wiskobas. Onderwijsontwikkelingsonderzoek ‘Wiskunde en Didaktiek’ op de pedagogische akademie [Learning to teach Wiskobas: Educational development research]. Rijksuniversiteit Utrecht, the Netherlands.

Gravemeijer, K. P. E. (1994). Educational development and developmental research in mathematics education. Journal for Research in Mathematics Education, 25(5), 443–471.

Gravemeijer, K. P. E., & Cobb, P. (2006). Design research from a learning design perspective. In J. van den Akker, K. P. E. Gravemeijer, S. McKenney, & N. Nieveen (Eds.), Educational design research (pp. 17–51). London: Routledge.

Gravemeijer, K. P. E., & Koster, K. (Eds.). (1988). Onderzoek, ontwikkeling en ontwikkelingsonderzoek [Research, development, and developmental research]. Utrecht: OW&OC.

Guba, E. G. (1981). Criteria for assessing the trustworthiness of naturalistic inquiries. Educational Communication and Technology Journal, 29(2), 75–91.

Hoffmann, M. H. G. (2002). Peirce’s “diagrammatic reasoning” as a solution of the learning paradox. In G. Debrock (Ed.), Process pragmatism: Essays on a quiet philosophical revolution (pp. 147–174). Amsterdam: Rodopi Press.

Hoyles, C., Noss, R., Kent, P., & Bakker, A. (2010). Improving mathematics at work: The need for techno-mathematical literacies. Abingdon: Routledge.

Journal of the Learning Sciences. (2004). Special issue on design-based research, 13(1), guest-edited by S. Barab and K. Squire.

Kanselaar, G. (1993). Ontwikkelingsonderzoek bezien vanuit de rol van de advocaat van de duivel [Design research: Taking the position of the devil’s advocate]. In R. de Jong & M. Wijers (Eds.), Ontwikkelingsonderzoek, theorie en praktijk [Developmental research: Theory and practice]. Utrecht: NVORWO.

Konold, C., & Higgins, T. L. (2003). Reasoning about data. In J. Kilpatrick, W. G. Martin, & D. Schifter (Eds.), A research companion to principles and standards for school mathematics (pp. 193–215). Reston: National Council of Teachers of Mathematics.

Lehrer, R., & Schauble, L. (2001). Accounting for contingency in design experiments. Paper presented at the annual meeting of the American Educational Research Association, Seattle.

Lewin, K. (1951). Problems of research in social psychology. In D. Cartwright (Ed.), Field theory in social science: Selected theoretical papers. New York: Harper & Row.

Lijnse, P. L. (1995). “Developmental research” as a way to an empirically based “didactical structure” of science. Science Education, 79(2), 189–199.

Lijnse, P. L., & Klaassen, K. (2004). Didactical structures as an outcome of research on teaching-learning sequences? International Journal of Science Education, 26(5), 537–554.

Maso, I., & Smaling, A. (1998). Kwalitatief onderzoek: Praktijk en theorie [Qualitative research: Practice and theory]. Amsterdam: Boom.

Mathematical Thinking and Learning. (2004). Special issue on learning trajectories in mathematics education, 6(2), guest-edited by D. H. Clements and J. Sarama.

Maxwell, J. A. (2004). Causal explanation, qualitative research and scientific inquiry in education. Educational Researcher, 33(2), 3–11.

McClain, K., & Cobb, P. (2001). Supporting students’ ability to reason about data. Educational Studies in Mathematics, 45, 103–129.

McKenney, S., & Reeves, T. (2012). Conducting educational design research. London: Routledge.

Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: A sourcebook of new methods. Beverly Hills: Sage.

Nathan, M. J., & Kim, S. (2009). Regulation of teacher elicitations in the mathematics classroom. Cognition and Instruction, 27(2), 91–120.

Olson, D. R. (2004). The triumph of hope over experience in the search for “what works”: A response to Slavin. Educational Researcher, 33(1), 24–26.

Oost, H., & Markenhof, A. (2010). Een onderzoek voorbereiden [Preparing research]. Amersfoort: Thieme Meulenhoff.

Opie, C. (2004). Doing educational research. London: Sage.

Paas, F. (2005). Design experiments: Neither a design nor an experiment. In C. P. Constantinou, D. Demetriou, A. Evagorou, M. Evagorou, A. Kofteros, M. Michael, C. Nicolaou, D. Papademetriou, & N. Papadouris (Eds.), Integrating multiple perspectives on effective learning environments: Proceedings of the 11th biennial meeting of the European Association for Research on Learning and Instruction (pp. 901–902). Nicosia: University of Cyprus.

Peirce, C. S. (1976). The new elements of mathematics (C. Eisele, Ed.). The Hague: Mouton.

Peirce, C. S. (CP). Collected papers of Charles Sanders Peirce, 1931–1958 (C. Hartshorne & P. Weiss, Eds.). Cambridge, MA: Harvard University Press.

Plomp, T. (2007). Educational design research: An introduction. In T. Plomp & N. Nieveen (Eds.), An introduction to educational design research (pp. 9–35). Enschede: SLO.

Plomp, T., & Nieveen, N. (Eds.). (2007). An introduction to educational design research. Enschede: SLO.

Romberg, T. A. (1973). Development research: Overview of how development-based research works in practice. Madison: Wisconsin Research and Development Center for Cognitive Learning, University of Wisconsin–Madison.

Sandoval, W. A., & Bell, P. (2004). Design-based research methods for studying learning in context: Introduction. Educational Psychologist, 39(4), 199–201.

Sfard, A., & Linchevski, L. (1992). The gains and the pitfalls of reification: The case of algebra. Educational Studies in Mathematics, 26(2–3), 191–228.

Simon, M. (1995). Reconstructing mathematics pedagogy from a constructivist perspective. Journal for Research in Mathematics Education, 26(2), 114–145.

Slavin, R. E. (2002). Evidence-based educational policies: Transforming educational practice and research. Educational Researcher, 31, 15–21.

Smit, J., & Van Eerde, H. A. A. (2011). A teacher’s learning process in dual design research: Learning to scaffold language in a multilingual mathematics classroom. ZDM The International Journal on Mathematics Education, 43(6–7), 889–900.

Smit, J., Van Eerde, H. A. A., & Bakker, A. (2013). A conceptualisation of whole-class scaffolding. British Educational Research Journal, 39(5), 817–834.

Steffe, L. P., & Thompson, P. W. (2000). Teaching experiment methodology: Underlying principles and essential elements. In R. Lesh & A. E. Kelly (Eds.), Research design in mathematics and science education (pp. 267–307). Hillsdale: Erlbaum.

Strauss, A., & Corbin, J. (1998). Basics of qualitative research: Techniques and procedures for developing grounded theory (2nd ed.). London: Sage.

Tall, D., Thomas, M., Davis, G., Gray, E., & Simpson, A. (2000). What is the object of the encapsulation of a process? Journal of Mathematical Behavior, 18, 223–241.

Treffers, A. (1987). Three dimensions: A model of goal and theory description in mathematics instruction. The Wiskobas project. Dordrecht: Kluwer.

Van den Akker, J. (1999). Principles and methods of development research. In J. van den Akker, R. M. Branch, K. Gustafson, N. Nieveen, & T. Plomp (Eds.), Design approaches and tools in education and training (pp. 1–14). Boston: Kluwer.

Van den Akker, J., Gravemeijer, K., McKenney, S., & Nieveen, N. (Eds.). (2006). Educational design research. London: Routledge.

Van den Heuvel-Panhuizen, M. (1996). Assessment and realistic mathematics education. Utrecht: CD-Bèta Press.

Van Nes, F., & Doorman, L. M. (2010). The interaction between multimedia data analysis and theory development in design research. Mathematics Education Research Journal, 22(1), 6–30.

Wittmann, E. C. (1992). Didaktik der Mathematik als Ingenieurwissenschaft [Didactics of mathematics as an engineering science]. Zentralblatt für Didaktik der Mathematik, 3, 119–121.

Yin, R. K. (2009). Case study research: Design and methods. Thousand Oaks: Sage.


Acknowledgments

The research was funded by the Netherlands Organization for Scientific Research under grant number 575-36-003B. The writing of this chapter was made possible with a grant from the Educational and Learning Sciences Utrecht awarded to Arthur Bakker. Section 2.6 is based on Bakker (2004b). We thank our Master students in our Research Methodology courses for their feedback on earlier versions of this manuscript. Angelika Bikner-Ahsbahs’s and reviewers’ careful reading has also helped us tremendously. We also acknowledge PhD students Adri Dierdorp, Al Jupri, and Victor Antwi, and our colleague Frans van Galen for their helpful comments, and Nathalie Kuijpers and Norma Presmeg for correcting this manuscript.

Author information

Authors and Affiliations

Freudenthal Institute for Science and Mathematics Education, Utrecht University, Princetonplein 5, 3584 CC, Utrecht, The Netherlands

Arthur Bakker & Dolly van Eerde


Corresponding author

Correspondence to Arthur Bakker.

Editor information

Editors and Affiliations

Faculty 3 of Mathematics and Computer Science, University of Bremen, Bremen, Germany

Angelika Bikner-Ahsbahs

Christine Knipping

Mathematics Department, Illinois State University, Normal, Illinois, USA

Norma Presmeg

Appendix: Structure of a DBR Project with Illustrations

In line with Oost and Markenhof (2010), we formulate the following general criteria for any research project:

  • The research should be anchored in the literature.
  • The research aim should be relevant, both in theoretical and practical terms.
  • The formulation of aim and questions should be precise, i.e., using concepts and definitions in the correct way.
  • The method used should be functional in answering the research question(s).
  • The overall structure of the research project should be consistent, i.e., title, aim, theory, question, method, and results should form a coherent chain of reasoning.

In this appendix we present a structure of general points of attention during DBR and specifications for our statistics education example, including references to relevant sections in the chapter. In the structure below, these criteria appear in parentheses. This structure could function as the blueprint of a book or article on a DBR project.

 

General points are paired below with examples from the statistics education study; section numbers refer to the chapter.

Introduction:
1. Choose a topic. Example: statistics education at the middle school level.
2. Identify common problems. Example: statistics as a set of unrelated concepts and techniques.
3. Identify knowledge gap and relevance. Example: how middle school students can be supported to develop a concept of distribution and related statistical concepts.
4. Choose mathematical learning goals. Example: understanding of distribution (2.1).

The literature review forms the basis for formulating the research aim (the research has to be anchored and relevant).

Research aim: It has to be clear whether an aim is descriptive, explanatory, evaluative, advisory, etc. (1.2.2). Example: contribute to an empirically and theoretically grounded instruction theory for statistics education at the middle school level (an advisory aim) (2.1).

The research aim has to be narrowed down to a research question, and possibly subquestions, with the help of different theories.

Literature review (theoretical background):
  • Orienting frameworks. Example: semiotics (2.3).
  • Frameworks for action. Example: theories on learning with computer tools.
  • Domain-specific learning theories (1.2.8). Example: Realistic Mathematics Education (2.4).

With the help of theoretical constructs the research question(s) can be formulated (the formulation has to be precise).

Research question: Zoom in on what knowledge is required to achieve the research aim. Example: How can students with little statistical background develop a notion of distribution?

It should be underpinned why this research question requires DBR (the method should be functional).

Research approach: The lack of the type of learning aimed for is a common reason to carry out DBR: it has to be enacted so that it can be studied. Example: Dutch statistics education was atomistic; textbooks addressed mean, median, mode, and different graphical representations one by one, and software was hardly used. Hence the type of learning aimed for had to be enacted.

Using a research method involves several research instruments and techniques.

Research instruments and techniques: A research instrument that connects different theories and concrete experiences in the form of testable hypotheses. Example: a series of hypothetical learning trajectories (HLTs).
1. Identify students’ prior knowledge. Example: prior interviews and pretest.
2. Professional development of the teacher. Example: preparatory meetings with the teacher.
3. Interview schemes and planning. Example: mini-interviews, observation scheme.
4. Intermediate feedback and reflection with the teacher. Example: debrief sessions with the teacher.
5. Determine learning yield (1.4.2). Example: posttest.

Design: Design guidelines. Example: guided reinvention; historical and didactical phenomenology (2.4).

Data analysis: Hypotheses have to be tested by comparison of hypothetical and observed learning; additional analyses may be necessary (1.4.3). Example: comparison of hypothetical and observed learning; constant comparative method of generating conjectures and testing them on the remaining data sources (2.6).

Results: Insights into patterns in learning and means of supporting such learning. Example: a series of HLTs as progressive diagrammatic reasoning about growing samples (2.6).

Discussion: Theoretical and practical yield. Examples: a concrete example of an historical and didactical phenomenology in statistics education; an application of semiotics in an educational domain; insights into computer use in the mathematics classroom; a series of learning activities; improved computer tools.

The aim, theory, question, method, and results should be aligned (the research has to be consistent).


Copyright information

© 2015 Springer Science+Business Media Dordrecht

About this chapter

Bakker, A., van Eerde, D. (2015). An Introduction to Design-Based Research with an Example From Statistics Education. In: Bikner-Ahsbahs, A., Knipping, C., Presmeg, N. (eds) Approaches to Qualitative Research in Mathematics Education. Advances in Mathematics Education. Springer, Dordrecht. https://doi.org/10.1007/978-94-017-9181-6_16

DOI: https://doi.org/10.1007/978-94-017-9181-6_16

Published: 05 September 2014

Publisher: Springer, Dordrecht

Print ISBN: 978-94-017-9180-9

Online ISBN: 978-94-017-9181-6



Case Study Design


In the field of qualitative research there are a number of research designs (also referred to as “traditions” or “genres”), including case study, phenomenology, narrative inquiry, action research, ethnography, and grounded theory, as well as a number of critical genres including feminist theory, indigenous research, critical race theory, and cultural studies. The choice of research design is directly tied to and must be aligned with your research problem and purpose. As Bloomberg and Volpe (2019) explain:

Choice of research design is directly tied to research problem and purpose. As the researcher, you actively create the link among problem, purpose, and design through a process of reflecting on problem and purpose, focusing on researchable questions, and considering how to best address these questions. Thinking along these lines affords a research study methodological congruence (p. 38).

Case study is an in-depth exploration from multiple perspectives of a bounded social phenomenon, be this a social system such as a program, event, institution, organization, or community (Stake, 1995, 2005; Yin, 2018). Case study is employed across disciplines, including education, health care, social work, sociology, and organizational studies. The purpose is to generate understanding and deep insights to inform professional practice, policy development, and community or social action (Bloomberg, 2018).

Yin (2018) and Stake (1995, 2005), two of the key proponents of case study methodology, use different terms to describe case studies. Yin categorizes case studies as exploratory or descriptive. The former is used to explore those situations in which the intervention being evaluated has no clear single set of outcomes. The latter is used to describe an intervention or phenomenon and the real-life context in which it occurred. Stake identifies case studies as intrinsic or instrumental, and he proposes that a primary distinction in designing case studies is between single and multiple (or collective) case study designs. A single case study may be an instrumental case study (research focuses on an issue or concern in one bounded case) or an intrinsic case study (the focus is on the case itself because the case presents a unique situation). A longitudinal case study design is chosen when the researcher seeks to examine the same single case at two or more different points in time or to capture trends over time. A multiple case study design is used when a researcher seeks to determine the prevalence or frequency of a particular phenomenon. This approach is useful when cases are used for purposes of a cross-case analysis in order to compare, contrast, and synthesize perspectives regarding the same issue. The focus is on the analysis of diverse cases to determine how these confirm the findings within or between cases, or call the findings into question.

Case study affords significant interaction with research participants, providing an in-depth picture of the phenomenon (Bloomberg & Volpe, 2019). Research is extensive, drawing on multiple methods of data collection, and involves multiple data sources. Triangulation is critical in attempting to obtain an in-depth understanding of the phenomenon under study and adds rigor, breadth, and depth to the study and provides corroborative evidence of the data obtained. Analysis of data can be holistic or embedded—that is, dealing with the whole or parts of the case (Yin, 2018). With multiple cases the typical analytic strategy is to provide detailed description of themes within each case (within-case analysis), followed by thematic analysis across cases (cross-case analysis), providing insights regarding how individual cases are comparable along important dimensions. Research culminates in the production of a detailed description of a setting and its participants, accompanied by an analysis of the data for themes or patterns (Stake, 1995, 2005; Yin, 2018). In addition to thick, rich description, the researcher’s interpretations, conclusions, and recommendations contribute to the reader’s overall understanding of the case study.

Analysis of findings should show that the researcher has attended to all the data, should address the most significant aspects of the case, and should demonstrate familiarity with the prevailing thinking and discourse about the topic. The goal of case study design (as with all qualitative designs) is not generalizability but rather transferability—that is, how (if at all) and in what ways understanding and knowledge can be applied in similar contexts and settings. The qualitative researcher attempts to address the issue of transferability by way of thick, rich description that will provide the basis for a case or cases to have relevance and potential application across a broader context.

Using Case Study Design in the Applied Doctoral Experience (ADE)

Qualitative research methods ask the questions of “what” and “how” a phenomenon is understood in a real-life context (Bloomberg & Volpe, 2019). In the education field, qualitative research methods uncover educational experiences and practices because qualitative research allows the researcher to reveal new knowledge and understanding. Moreover, qualitative descriptive case studies describe, analyze and interpret events that explain the reasoning behind specific phenomena (Bloomberg, 2018). As such, case study design can be the foundation for a rigorous study within the Applied Doctoral Experience (ADE).

Case study design is an appropriate research design to consider when conceptualizing and conducting a dissertation research study that is based on an applied problem of practice with inherent real-life educational implications. Case study researchers study current, real-life cases that are in progress so that they can gather accurate information that is current. This fits well with the ADE program, as students are typically exploring a problem of practice. Because of the flexibility of the methods used, a descriptive design provides the researcher with the opportunity to choose data collection methods that are best suited to a practice-based research purpose, and can include individual interviews, focus groups, observation, surveys, and critical incident questionnaires. Methods are triangulated to contribute to the study’s trustworthiness. In selecting the set of data collection methods, it is important that the researcher carefully consider the alignment between research questions and the type of data that is needed to address these. Each data source is one piece of the “puzzle” that contributes to the researcher’s holistic understanding of a phenomenon. The various strands of data are woven together holistically to promote a deeper understanding of the case and its application to an educationally-based problem of practice.

Applicability of Case Study Design to Applied Problem of Practice

Research studies within the Applied Doctoral Experience (ADE) will be practical in nature and focus on problems and issues that inform educational practice. Many of the types of studies that fall within the ADE framework are exploratory and align with case study design. Case study design fits very well with applied problems related to educational practice, as the following set of examples illustrates:

Elementary Bilingual Education Teachers’ Self-Efficacy in Teaching English Language Learners: A Qualitative Case Study

The problem to be addressed in the proposed study is that some elementary bilingual education teachers’ beliefs about their lack of preparedness to teach the English language may negatively impact the language proficiency skills of Hispanic ELLs (Ernst-Slavit & Wenger, 2016; Fuchs et al., 2018; Hoque, 2016). The purpose of the proposed qualitative descriptive case study was to explore the perspectives and experiences of elementary bilingual education teachers regarding their perceived lack of preparedness to teach the English language and how this may impact the language proficiency of Hispanic ELLs.

Exploring Minority Teachers’ Experiences Pertaining to Their Value in Education: A Single Case Study of Teachers in New York City

The problem is that minority K-12 teachers are underrepresented in the United States, with research indicating that schools populated mainly by Black students are staffed mostly by white teachers, who may be unprepared to deal with biases and stereotypes that are ingrained in schools (Egalite, Kisida, & Winters, 2015; Milligan & Howley, 2015). The purpose of this qualitative exploratory single case study was to develop a clearer understanding of minority teachers’ experiences concerning the under-representation of minority K-12 teachers in urban school districts in the United States.

Exploring the Impact of an Urban Teacher Residency Program on Teachers’ Cultural Intelligence: A Qualitative Case Study

The problem to be addressed by this case study is that teacher candidates often report being unprepared and ill-equipped to effectively educate culturally diverse students (Skepple, 2015; Beutel, 2018). The purpose of this study was to explore and gain an in-depth understanding of the perceived impact of an urban teacher residency program in urban Iowa on teachers’ cultural competence using the cultural intelligence (CQ) framework (Earley & Ang, 2003).

Qualitative Case Study that Explores Self-Efficacy and Mentorship on Women in Academic Administrative Leadership Roles

The problem was that female school-level administrators might be less likely to experience mentorship, thereby potentially decreasing their self-efficacy (Bing & Smith, 2019; Brown, 2020; Grant, 2021). The purpose of this case study was to determine to what extent female school-level administrators in the United States who had a mentor have a sense of self-efficacy and to examine the relationship between mentorship and self-efficacy.

Suburban Teacher and Administrator Perceptions of Culturally Responsive Teaching to Promote Connectedness in Students of Color: A Qualitative Case Study

The problem to be addressed in this study is the racial discrimination experienced by students of color in suburban schools and the resulting negative school experience (Jara & Bloomsbury, 2020; Jones, 2019; Kohli et al., 2017; Wandix-White, 2020). The purpose of this case study is to explore how culturally responsive practices can counteract systemic racism and discrimination in suburban schools thereby meeting the needs of students of color by creating positive learning experiences. 

As you can see, all of these studies were well suited to qualitative case study design. In each of these studies, the applied research problem and research purpose were clearly grounded in educational practice as well as directly aligned with qualitative case study methodology. In the Applied Doctoral Experience (ADE), you will be focused on addressing or resolving an educationally relevant research problem of practice. As such, your case study, with clear boundaries, will be one that centers on a real-life authentic problem in your field of practice that you believe is in need of resolution or improvement, and that the outcome thereof will be educationally valuable.

Case Study Design References

Bloomberg, L. D. (2018). Case study method. In B. B. Frey (Ed.), The SAGE encyclopedia of educational research, measurement, and evaluation (pp. 237–239). SAGE. https://go.openathens.net/redirector/nu.edu?url=https%3A%2F%2Fmethods.sagepub.com%2FReference%2Fthe-sage-encyclopedia-of-educational-research-measurement-and-evaluation%2Fi4294.xml

Bloomberg, L. D., & Volpe, M. (2019). Completing your qualitative dissertation: A road map from beginning to end (4th ed.). SAGE.

Stake, R. E. (1995). The art of case study research. SAGE.

Stake, R. E. (2005). Qualitative case studies. In N. K. Denzin and Y. S. Lincoln (Eds.), The SAGE handbook of qualitative research (3rd ed., pp. 443–466). SAGE.

Yin, R. (2018). Case study research and applications: Designs and methods. SAGE.


Design-based research and doctoral students: Guidelines for preparing a dissertation proposal

J. Herrington, S. McKenney, T. C. Reeves, & R. Oliver

2007, Proceedings of EdMedia

At first glance, design-based research may appear to be such a long-term and intensive approach to educational inquiry that doctoral students, most of whom expect to complete their Ph.D. degree in 4-5 years, should not attempt to adopt this approach for their doctoral dissertations. In this paper, we argue that design-based research is feasible for doctoral students, and that candidates should be encouraged to engage in it. More specifically, we describe the components of a dissertation proposal or prospectus that utilizes design-…

Related Papers

Design-based research in doctoral studies: Adding a new dimension to doctoral research (Seyum Getenet, International Journal of Doctoral Studies)

Aim/Purpose: We show a new dimension to the process of using a design-based research approach in doctoral dissertations. Background: Design-based research is a long-term and concentrated approach to educational inquiry. It is often recommended that doctoral students should not attempt to adopt this approach for their doctoral dissertations. In this paper, we document two doctoral dissertations that used a design-based research approach in two different contexts. Methodology: The study draws on a qualitative analysis of the methodological approaches of two doctoral dissertations through the lenses of Herrington, McKenney, Reeves and Oliver’s principles of the design-based research approach. Contribution: The findings of this study add a new dimension to using a design-based research approach in doctoral dissertations in shorter-term and less intensive contexts. Findings: The results of this study indicate that design-based research is not only an effective methodological approach in doctoral …


Educational Researcher

Christopher Hoadley

Jennifer K. Shah

The science-to-service problem continues to taunt the field of education (Fixsen, Blasé, Naoom, & Wallace, 2009). As an academic discipline, the field requires knowledge generation that adds to or deepens theoretical understandings. As a profession, knowledge generation that solves local problems and supports continuous improvement is necessary. Using design-based research (DBR) provides a means of serving theoretical and practical needs in education, addressing the complexity of education by informing immediate practice while simultaneously contributing to theoretical understandings in the field of education. Using Stokes’ (1997) model of scientific research and knowledge generation, we situate DBR within Pasteur’s quadrant, describe how to increase its use, and recommend a new means for dissemination. The debate over the theory-to-practice divide continues in many fields, including education. Coburn and Stein (2010) suggested that the path from theoretical knowledge to classroom application is neither linear nor direct. Fixsen et al. (2009) stated that research results are not utilized enough to impact communities. In essence, the current process of scholars disseminating information and practitioners applying information results in a perpetual science-to-service problem. Given the complexity that lies at the heart of teaching and learning (Cochran-Smith, 2003), it is no longer reasonable to rely on the passive dissemination of knowledge. Rather, researchers and practitioners must collaborate to create and disseminate knowledge that will address the issues facing education today. This will require not only generating knowledge to expand understandings of the theoretical foundations that comprise the science of the field, but also generating knowledge from practice that elucidates the act of applying the science of education. The purpose of this article is to discuss the role that design-based research (DBR) can play in addressing the complexity of education, by informing immediate practice while simultaneously continuing to develop theoretical understandings in the field of education. In this article the authors: 1) describe Stokes’ (1997) quadrant model of scientific research and elaborate on DBR’s placement in Pasteur’s quadrant based on its dual purpose of theoretical knowledge generation and practical knowledge generation; 2) describe the foundational elements of DBR; 3) discuss ideas of how to increase DBR use in education research; and 4) suggest an approach to disseminating DBR research that more accurately represents the methodological practice of this type of research. Overcoming the Theory to Practice Divide: Dwelling in Pasteur’s Quadrant. The process of knowledge dissemination suggests that knowledge generated to broaden the theoretical understandings in a discipline will directly lead to the application of that knowledge …

Educational Psychologist

Philip Bell

The Design Journal

Violeta Clemente & Katja Tschimmel

Within the fruitful discussion about what design research should mean and achieve and the implication for doctoral education, this paper aims to explore the topic regarding the boundaries between project design research and academic design research. There is also a strong movement within the academic milieu in the realm of design, namely within international conferences and research meetings, to discuss methodologies and processes as a paramount contribution to defining scientific research in design. PhD design research in Portuguese universities started slowly in the late 1990s, but is increasingly establishing itself as a worthy degree. This text focuses on an original study depicting the state of the art of the methodological approaches applied in doctoral design research in Portugal. It proposes a Design Research Classification Model and a Design Research Canvas that can be applied to other systematic reviews of design research as a means of synthesising the past to outline the future. It is also a major objective of this work to contribute to a clarification of a methodological framework, which relates practice-based research to academic research.

Fatina Saikaly

Different philosophical assumptions, aims, structures, contents, and processes underlie the Ph.D. programmes in design offered in different socio-cultural contexts. In order to gain a deep understanding of the actual state of doctoral research in design, a comparative study of ten selected programmes was developed, covering universities from North America, Europe, Asia, and Australia. Each case study was divided into three parts: the first examined the doctoral programme itself; the second examined selected Ph.D. theses; and the third was an interview with one of the programme advisors. In this paper I present the results of the comparative study. The points discussed are: (i) the different philosophical assumptions that guide doctoral programmes in design; (ii) how these different philosophical assumptions led to different learning experiences and contributed to the development of different skills and competencies; and (iii) the different positions and roles of the design project in different research settings. I conclude the paper by introducing two points that deserve further exploration. The first relates to practice-centred Ph.D. programmes in design; the second concerns the reasons why the role of the design project in doctoral research is evolving and gaining more status.

Michael J Feuer

The authors argue that design studies, like all scientific work, must comport with guiding scientific principles and provide adequate warrants for their knowledge claims. The issue is whether their knowledge claims can be warranted. By their very nature, design studies are ...


University of Twente Research Information

Design-based research and doctoral students: Guidelines for preparing a dissertation proposal

  • Faculty of Behavioural, Management and Social Sciences
  • Educational Science

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

Original language: English
Title of host publication: Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunications 2007
Editors: C. Montgomerie, J. Seale
Pages: 4089-4097
Number of pages: 10
Publication status: Published - 25 Jun 2007
Conference: World Conference on Educational Multimedia, Hypermedia & Telecommunications 2007 (ED-MEDIA 2007), Vancouver, Canada
Duration: 25 Jun 2007 - 27 Jun 2007
Period: 25/06/07 - 27/06/07
Identifier: METIS-243753

Access to Document

  • Design-based research and doctoral students (Final published version, 505 KB)
  • http://ro.ecu.edu.au/ecuworks/1612/


Authors: Jan Herrington, Susan McKenney, Thomas C. Reeves, Ron Oliver

Abstract

At first glance, design-based research may appear to be such a long-term and intensive approach to educational inquiry that doctoral students, most of whom expect to complete their Ph.D. degree in 4-5 years, should not attempt to adopt this approach for their doctoral dissertations. In this paper, we argue that design-based research is feasible for doctoral students, and that candidates should be encouraged to engage in it. More specifically, we describe the components of a dissertation proposal or prospectus that utilizes design-based research methods in the context of educational technology research.

Keywords: IR-93893, METIS-243753


Research Design | Step-by-Step Guide with Examples

Published on 5 May 2022 by Shona McCombes. Revised on 20 March 2023.

A research design is a strategy for answering your research question using empirical data. Creating a research design means making decisions about:

  • Your overall aims and approach
  • The type of research design you’ll use
  • Your sampling methods or criteria for selecting subjects
  • Your data collection methods
  • The procedures you’ll follow to collect data
  • Your data analysis methods

A well-planned research design helps ensure that your methods match your research aims and that you use the right kind of analysis for your data.

Table of contents

  • Introduction
  • Step 1: Consider your aims and approach
  • Step 2: Choose a type of research design
  • Step 3: Identify your population and sampling method
  • Step 4: Choose your data collection methods
  • Step 5: Plan your data collection procedures
  • Step 6: Decide on your data analysis strategies
  • Frequently asked questions

Before you can start designing your research, you should already have a clear idea of the research question you want to investigate.

There are many different ways you could go about answering this question. Your research design choices should be driven by your aims and priorities – start by thinking carefully about what you want to achieve.

The first choice you need to make is whether you’ll take a qualitative or quantitative approach.

  • Qualitative approach: Qualitative research designs tend to be more flexible and inductive, allowing you to adjust your approach based on what you find throughout the research process.
  • Quantitative approach: Quantitative research designs tend to be more fixed and deductive, with variables and hypotheses clearly defined in advance of data collection.

It’s also possible to use a mixed methods design that integrates aspects of both approaches. By combining qualitative and quantitative insights, you can gain a more complete picture of the problem you’re studying and strengthen the credibility of your conclusions.

Practical and ethical considerations when designing research

As well as scientific considerations, you need to think practically when designing your research. If your research involves people or animals, you also need to consider research ethics.

  • How much time do you have to collect data and write up the research?
  • Will you be able to gain access to the data you need (e.g., by travelling to a specific location or contacting specific people)?
  • Do you have the necessary research skills (e.g., statistical analysis or interview techniques)?
  • Will you need ethical approval?

At each stage of the research design process, make sure that your choices are practically feasible.


Within both qualitative and quantitative approaches, there are several types of research design to choose from. Each type provides a framework for the overall shape of your research.

Types of quantitative research designs

Quantitative designs can be split into four main types. Experimental and quasi-experimental designs allow you to test cause-and-effect relationships, while descriptive and correlational designs allow you to measure variables and describe relationships between them.


With descriptive and correlational designs, you can get a clear picture of characteristics, trends, and relationships as they exist in the real world. However, you can’t draw conclusions about cause and effect (because correlation doesn’t imply causation).

Experiments are the strongest way to test cause-and-effect relationships without the risk of other variables influencing the results. However, their controlled conditions may not always reflect how things work in the real world. They’re often also more difficult and expensive to implement.

Types of qualitative research designs

Qualitative designs are less strictly defined. This approach is about gaining a rich, detailed understanding of a specific context or phenomenon, and you can often be more creative and flexible in designing your research.

Two common types of qualitative design are grounded theory and phenomenology. They often have similar approaches in terms of data collection, but focus on different aspects when analysing the data.

Your research design should clearly define who or what your research will focus on, and how you’ll go about choosing your participants or subjects.

In research, a population is the entire group that you want to draw conclusions about, while a sample is the smaller group of individuals you’ll actually collect data from.

Defining the population

A population can be made up of anything you want to study – plants, animals, organisations, texts, countries, etc. In the social sciences, it most often refers to a group of people.

For example, will you focus on people from a specific demographic, region, or background? Are you interested in people with a certain job or medical condition, or users of a particular product?

The more precisely you define your population, the easier it will be to gather a representative sample.

Sampling methods

Even with a narrowly defined population, it’s rarely possible to collect data from every individual. Instead, you’ll collect data from a sample.

To select a sample, there are two main approaches: probability sampling and non-probability sampling. The sampling method you use affects how confidently you can generalise your results to the population as a whole.

Probability sampling is the most statistically valid option, but it’s often difficult to achieve unless you’re dealing with a very small and accessible population.

For practical reasons, many studies use non-probability sampling, but it’s important to be aware of the limitations and carefully consider potential biases. You should always make an effort to gather a sample that’s as representative as possible of the population.
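To make the contrast concrete, here is a minimal Python sketch using an invented roster of 500 students (the roster, sample sizes, and seed are illustrative, not from this guide):

```python
import random

# Hypothetical sampling frame: 500 student IDs (illustrative only).
population = [f"student_{i:03d}" for i in range(500)]

# Probability sampling: a simple random sample of 50. Every student
# has a known, equal chance of selection, which supports generalisation.
random.seed(42)  # fixed seed so the example is reproducible
probability_sample = random.sample(population, k=50)

# Non-probability (convenience) sampling: the first 50 on the list,
# e.g., whoever happened to respond first. Selection chances are
# unknown, so the sample may be systematically biased.
convenience_sample = population[:50]

print(probability_sample[:5])
print(convenience_sample[:5])
```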

Case selection in qualitative research

In some types of qualitative designs, sampling may not be relevant.

For example, in an ethnography or a case study, your aim is to deeply understand a specific context, not to generalise to a population. Instead of sampling, you may simply aim to collect as much data as possible about the context you are studying.

In these types of design, you still have to carefully consider your choice of case or community. You should have a clear rationale for why this particular case is suitable for answering your research question.

For example, you might choose a case study that reveals an unusual or neglected aspect of your research problem, or you might choose several very similar or very different cases in order to compare them.

Data collection methods are ways of directly measuring variables and gathering information. They allow you to gain first-hand knowledge and original insights into your research problem.

You can choose just one data collection method, or use several methods in the same study.

Survey methods

Surveys allow you to collect data about opinions, behaviours, experiences, and characteristics by asking people directly. There are two main survey methods to choose from: questionnaires and interviews.

Observation methods

Observations allow you to collect data unobtrusively, observing characteristics, behaviours, or social interactions without relying on self-reporting.

Observations may be conducted in real time, taking notes as you observe, or you might make audiovisual recordings for later analysis. They can be qualitative or quantitative.

Other methods of data collection

There are many other ways you might collect data depending on your field and topic.

  • Media & communication: Collecting a sample of texts (e.g., speeches, articles, or social media posts) for data on cultural norms and narratives
  • Psychology: Using technologies like neuroimaging, eye-tracking, or computer-based tasks to collect data on things like attention, emotional response, or reaction time
  • Education: Using tests or assignments to collect data on knowledge and skills
  • Physical sciences: Using scientific instruments to collect data on things like weight, blood pressure, or chemical composition

If you’re not sure which methods will work best for your research design, try reading some papers in your field to see what data collection methods they used.

Secondary data

If you don’t have the time or resources to collect data from the population you’re interested in, you can also choose to use secondary data that other researchers already collected – for example, datasets from government surveys or previous studies on your topic.

With this raw data, you can do your own analysis to answer new research questions that weren’t addressed by the original study.

Using secondary data can expand the scope of your research, as you may be able to access much larger and more varied samples than you could collect yourself.

However, it also means you don’t have any control over which variables to measure or how to measure them, so the conclusions you can draw may be limited.

As well as deciding on your methods, you need to plan exactly how you’ll use these methods to collect data that’s consistent, accurate, and unbiased.

Planning systematic procedures is especially important in quantitative research, where you need to precisely define your variables and ensure your measurements are reliable and valid.

Operationalisation

Some variables, like height or age, are easily measured. But often you’ll be dealing with more abstract concepts, like satisfaction, anxiety, or competence. Operationalisation means turning these fuzzy ideas into measurable indicators.

If you’re using observations, which events or actions will you count?

If you’re using surveys, which questions will you ask and what range of responses will be offered?

You may also choose to use or adapt existing materials designed to measure the concept you’re interested in – for example, questionnaires or inventories whose reliability and validity have already been established.
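As a toy illustration of operationalisation, the sketch below scores a fuzzy concept as the mean of a few Likert-scale items; the concept, item names, and scoring rule are all invented for the example:

```python
# Hypothetical operationalisation of "social anxiety": the mean of
# three 1-5 Likert items. Concept, items, and rule are illustrative.
responses = {
    "respondent_1": {"avoids_crowds": 4, "fears_speaking": 5, "symptoms": 3},
    "respondent_2": {"avoids_crowds": 2, "fears_speaking": 1, "symptoms": 2},
}

def anxiety_score(items):
    """Operational definition: mean of the item scores."""
    return sum(items.values()) / len(items)

for rid, items in responses.items():
    print(rid, round(anxiety_score(items), 2))
```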

Reliability and validity

Reliability means your results can be consistently reproduced, while validity means that you’re actually measuring the concept you’re interested in.

For valid and reliable results, your measurement materials should be thoroughly researched and carefully designed. Plan your procedures to make sure you carry out the same steps in the same way for each participant.

If you’re developing a new questionnaire or other instrument to measure a specific concept, running a pilot study allows you to check its validity and reliability in advance.
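One common internal-consistency check you could run on pilot data is Cronbach’s alpha; this guide doesn’t prescribe a particular statistic, so treat the following as a sketch with made-up pilot responses:

```python
import numpy as np

def cronbach_alpha(scores):
    """Internal-consistency reliability: rows = respondents, cols = items."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical pilot data: 6 respondents answering 4 Likert items.
pilot = [
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 5, 4, 5],
    [3, 3, 3, 4],
    [1, 2, 2, 1],
    [4, 4, 5, 4],
]
print(round(cronbach_alpha(pilot), 2))  # values near/above 0.8 suggest good consistency
```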

Sampling procedures

As well as choosing an appropriate sampling method, you need a concrete plan for how you’ll actually contact and recruit your selected sample.

That means making decisions about things like:

  • How many participants do you need for an adequate sample size?
  • What inclusion and exclusion criteria will you use to identify eligible participants?
  • How will you contact your sample – by mail, online, by phone, or in person?

If you’re using a probability sampling method, it’s important that everyone who is randomly selected actually participates in the study. How will you ensure a high response rate?

If you’re using a non-probability method, how will you avoid bias and ensure a representative sample?

Data management

It’s also important to create a data management plan for organising and storing your data.

Will you need to transcribe interviews or perform data entry for observations? You should anonymise and safeguard any sensitive data, and make sure it’s backed up regularly.

Keeping your data well organised will save time when it comes to analysing them. It can also help other researchers validate and add to your findings.

On their own, raw data can’t answer your research question. The last step of designing your research is planning how you’ll analyse the data.

Quantitative data analysis

In quantitative research, you’ll most likely use some form of statistical analysis. With statistics, you can summarise your sample data, make estimates, and test hypotheses.

Using descriptive statistics, you can summarise your sample data in terms of:

  • The distribution of the data (e.g., the frequency of each score on a test)
  • The central tendency of the data (e.g., the mean to describe the average score)
  • The variability of the data (e.g., the standard deviation to describe how spread out the scores are)

The specific calculations you can do depend on the level of measurement of your variables.
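For instance, using only Python’s standard library on made-up test scores, the three summaries might be computed like this:

```python
import statistics
from collections import Counter

# Hypothetical test scores for a sample of 12 students.
scores = [7, 8, 8, 9, 6, 10, 8, 7, 9, 8, 5, 9]

print(Counter(scores))            # distribution: frequency of each score
print(statistics.mean(scores))    # central tendency: the mean score
print(statistics.stdev(scores))   # variability: sample standard deviation
```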

Using inferential statistics, you can:

  • Make estimates about the population based on your sample data.
  • Test hypotheses about a relationship between variables.

Regression and correlation tests look for associations between two or more variables, while comparison tests (such as t tests and ANOVAs) look for differences in the outcomes of different groups.

Your choice of statistical test depends on various aspects of your research design, including the types of variables you’re dealing with and the distribution of your data.
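As an illustration with invented data (SciPy is just one of several libraries you could use), a comparison test and an association test might look like:

```python
from scipy import stats

# Hypothetical exam scores under two teaching conditions.
group_a = [72, 85, 78, 90, 66, 81, 77, 88]
group_b = [68, 74, 70, 79, 63, 72, 75, 71]

# Comparison test: independent-samples t test for a difference in means.
t_stat, p_value = stats.ttest_ind(group_a, group_b)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# Association test: Pearson correlation between study hours and scores.
hours = [5, 9, 7, 11, 3, 8, 6, 10]
r, p = stats.pearsonr(hours, group_a)
print(f"r = {r:.2f}, p = {p:.3f}")
```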

Qualitative data analysis

In qualitative research, your data will usually be very dense with information and ideas. Instead of summing it up in numbers, you’ll need to comb through the data in detail, interpret its meanings, identify patterns, and extract the parts that are most relevant to your research question.

Two of the most common approaches to doing this are thematic analysis and discourse analysis.

There are many other ways of analysing qualitative data depending on the aims of your research. To get a sense of potential approaches, try reading some qualitative research papers in your field.
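Thematic analysis itself is interpretive, but some of the bookkeeping can be scripted. The sketch below tags invented interview excerpts against a hypothetical keyword codebook; it is a deliberately naive first pass, not a substitute for human coding:

```python
# Hypothetical codebook mapping themes to indicator keywords.
codebook = {
    "belonging": ["included", "welcome", "community"],
    "workload": ["time", "deadline", "overwhelmed"],
}

# Invented interview excerpts.
excerpts = [
    "I felt welcome in the study group, like a real community.",
    "The deadlines left me overwhelmed, with no time to reflect.",
]

for excerpt in excerpts:
    text = excerpt.lower()
    # A theme applies if any of its keywords appears in the excerpt.
    themes = [t for t, kws in codebook.items() if any(k in text for k in kws)]
    print(themes, "->", excerpt)
```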

A sample is a subset of individuals from a larger population. Sampling means selecting the group that you will actually collect data from in your research.

For example, if you are researching the opinions of students in your university, you could survey a sample of 100 students.

Statistical sampling allows you to test a hypothesis about the characteristics of a population. There are various sampling methods you can use to ensure that your sample is representative of the population as a whole.

Operationalisation means turning abstract conceptual ideas into measurable observations.

For example, the concept of social anxiety isn’t directly observable, but it can be operationally defined in terms of self-rating scores, behavioural avoidance of crowded places, or physical anxiety symptoms in social situations.

Before collecting data, it’s important to consider how you will operationalise the variables that you want to measure.

The research methods you use depend on the type of data you need to answer your research question.

  • If you want to measure something or test a hypothesis, use quantitative methods. If you want to explore ideas, thoughts, and meanings, use qualitative methods.
  • If you want to analyse a large amount of readily available data, use secondary data. If you want data specific to your purposes with control over how they are generated, collect primary data.
  • If you want to establish cause-and-effect relationships between variables, use experimental methods. If you want to understand the characteristics of a research subject, use descriptive methods.

Source: McCombes, S. (2023, March 20). Research Design | Step-by-Step Guide with Examples. Scribbr. Retrieved 26 August 2024, from https://www.scribbr.co.uk/research-methods/research-design/


Design-Based Research: A Methodology to Extend and Enrich Biology Education Research

  • Emily E. Scott
  • Mary Pat Wenderoth
  • Jennifer H. Doherty

*Address correspondence to: Emily E. Scott ([email protected]).

Department of Biology, University of Washington, Seattle, WA 98195


Recent calls in biology education research (BER) have recommended that researchers leverage learning theories and methodologies from other disciplines to investigate the mechanisms by which students develop sophisticated ideas. We suggest design-based research from the learning sciences is a compelling methodology for achieving this aim. Design-based research investigates the “learning ecologies” that move student thinking toward mastery. These “learning ecologies” are grounded in theories of learning, produce measurable changes in student learning, generate design principles that guide the development of instructional tools, and are enacted using extended, iterative teaching experiments. In this essay, we introduce readers to the key elements of design-based research, using our own research into student learning in undergraduate physiology as an example of design-based research in BER. Then, we discuss how design-based research can extend work already done in BER and foster interdisciplinary collaborations among cognitive and learning scientists, biology education researchers, and instructors. We also explore some of the challenges associated with this methodological approach.

INTRODUCTION

There have been recent calls for biology education researchers to look toward other fields of educational inquiry for theories and methodologies to advance, and expand, our understanding of what helps students learn to think like biologists ( Coley and Tanner, 2012 ; Dolan, 2015 ; Peffer and Renken, 2016 ; Lo et al. , 2019 ). These calls include the recommendations that biology education researchers ground their work in learning theories from the cognitive and learning sciences ( Coley and Tanner, 2012 ) and begin investigating the underlying mechanisms by which students develop sophisticated biology ideas ( Dolan, 2015 ; Lo et al. , 2019 ). Design-based research from the learning sciences is one methodology that seeks to do both by using theories of learning to investigate how “learning ecologies”—that is, complex systems of interactions among instructors, students, and environmental components—support the process of student learning ( Brown, 1992 ; Cobb et al. , 2003 ; Collins et al. , 2004 ; Peffer and Renken, 2016 ).

The purpose of this essay is twofold. First, we want to introduce readers to the key elements of design-based research, using our research into student learning in undergraduate physiology as an example of design-based research in biology education research (BER). Second, we will discuss how design-based research can extend work already done in BER and explore some of the challenges of its implementation. For a more in-depth review of design-based research, we direct readers to the following references: Brown (1992) , Barab and Squire (2004) , and Collins et al. (2004) , as well as commentaries by Anderson and Shattuck (2012) and McKenney and Reeves (2013) .

WHAT IS DESIGN-BASED RESEARCH?

Design-based research is a methodological approach that aligns with research methods from the fields of engineering or applied physics, where products are designed for specific purposes ( Brown, 1992 ; Joseph, 2004 ; Middleton et al. , 2008 ; Kelly, 2014 ). Consequently, investigators using design-based research approach educational inquiry much as an engineer develops a new product: First, the researchers identify a problem that needs to be addressed (e.g., a particular learning challenge that students face). Next, they design a potential “solution” to the problem in the form of instructional tools (e.g., reasoning strategies, worksheets; e.g., Reiser et al. , 2001 ) that theory and previous research suggest will address the problem. Then, the researchers test the instructional tools in a real-world setting (i.e., the classroom) to see if the tools positively impact student learning. As testing proceeds, researchers evaluate the instructional tools with emerging evidence of their effectiveness (or lack thereof) and progressively revise the tools— in real time —as necessary ( Collins et al. , 2004 ). Finally, the researchers reflect on the outcomes of the experiment, identifying the features of the instructional tools that were successful at addressing the initial learning problem, revising those aspects that were not helpful to learning, and determining how the research informed the theory underlying the experiment. This leads to another research cycle of designing, testing, evaluating, and reflecting to refine the instructional tools in support of student learning. We have characterized this iterative process in Figure 1 after Sandoval (2014) . Though we have portrayed four discrete phases to design-based research, there is often overlap of the phases as the research progresses (e.g., testing and evaluating can occur simultaneously).

FIGURE 1. The four phases of design-based research experienced in an iterative cycle (A). We also highlight the main features of each phase of our design-based research project investigating students’ use of flux in physiology (B).

Design-based research has no specific requirements for the form that instructional tools must take or the manner in which the tools are evaluated ( Bell, 2004 ; Anderson and Shattuck, 2012 ). Instead, design-based research has what Sandoval (2014) calls “epistemic commitments” that inform the major goals of a design-based research project as well as how it is implemented. These epistemic commitments are: 1) Design-based research should be grounded in theories of learning (e.g., constructivism, knowledge-in-pieces, conceptual change) that both inform the design of the instructional tools and are improved upon by the research ( Cobb et al. , 2003 ; Barab and Squire, 2004 ). This makes design-based research more than a method for testing whether or not an instructional tool works; it also investigates why the design worked and how it can be generalized to other learning environments ( Cobb et al. , 2003 ). 2) Design-based research should aim to produce measurable changes in student learning in classrooms around a particular learning problem ( Anderson and Shattuck, 2012 ; McKenney and Reeves, 2013 ). This requirement ensures that theoretical research into student learning is directly applicable, and impactful, to students and instructors in classroom settings ( Hoadley, 2004 ). 3) Design-based research should generate design principles that guide the development and implementation of future instructional tools ( Edelson, 2002 ). This commitment makes the research findings broadly applicable for use in a variety of classroom environments. 4) Design-based research should be enacted using extended, iterative teaching experiments in classrooms. By observing student learning over an extended period of time (e.g., throughout an entire term or across terms), researchers are more likely to observe the full effects of how the instructional tools impact student learning compared with short-term experiments ( Brown, 1992 ; Barab and Squire, 2004 ; Sandoval and Bell, 2004 ).

HOW IS DESIGN-BASED RESEARCH DIFFERENT FROM AN EXPERIMENTAL APPROACH?

Many BER studies employ experimental approaches that align with traditional scientific methods of experimentation, such as using treatment versus control groups, randomly assigning treatments to different groups, replicating interventions across multiple spatial or temporal periods, and using statistical methods to guide the kinds of inferences that arise from an experiment. While design-based research can similarly employ these strategies for educational inquiry, there are also some notable differences in its approach to experimentation ( Collins et al. , 2004 ; Hoadley, 2004 ). In this section, we contrast the differences between design-based research and what we call “experimental approaches,” although both paradigms represent a form of experimentation.

The first difference between an experimental approach and design-based research regards the role participants play in the experiment. In an experimental approach, the researcher is responsible for making all the decisions about how the experiment will be implemented and analyzed, while the instructor facilitates the experimental treatments. In design-based research, both researchers and instructors are engaged in all stages of the research from conception to reflection ( Collins et al. , 2004 ). In BER, a third condition frequently arises wherein the researcher is also the instructor. In this case, if the research questions being investigated produce generalizable results that have the potential to impact teaching broadly, then this is consistent with a design-based research approach ( Cobb et al. , 2003 ). However, when the research questions are self-reflective about how a researcher/instructor can improve his or her own classroom practices, this aligns more closely with “action research,” which is another methodology used in education research (see Stringer, 2013 ).

A second difference between experimental research and design-based research is the form that hypotheses take and the manner in which they are investigated ( Collins et al. , 2004 ; Sandoval, 2014 ). In experimental approaches, researchers develop a hypothesis about how a specific instructional intervention will impact student learning. The intervention is then tested in the classroom(s) while controlling for other variables that are not part of the study in order to isolate the effects of the intervention. Sometimes, researchers designate a “control” situation that serves as a comparison group that does not experience the intervention. For example, Jackson et al. (2018) were interested in comparing peer- and self-grading of weekly practice exams to see if they were equally effective forms of deliberate practice for students in a large-enrollment class. To test this, the authors (including authors of this essay J.H.D., M.P.W.) designed an experiment in which lab sections of students in a large lecture course were randomly assigned to either a peer-grading or self-grading treatment so they could isolate the effects of each intervention. In design-based research, a hypothesis is conceptualized as the “design solution” rather than a specific intervention; that is, design-based researchers hypothesize that the designed instructional tools, when implemented in the classroom, will create a learning ecology that improves student learning around the identified learning problem ( Edelson, 2002 ; Bell, 2004 ). For example, Zagallo et al. (2016) developed a laboratory curriculum (i.e., the hypothesized “design solution”) for molecular and cellular biology majors to address the learning problem that students often struggle to connect scientific models and empirical data. This curriculum entailed: focusing instruction around a set of target biological models; developing small-group activities in which students interacted with the models by analyzing data from scientific papers; using formative assessment tools for student feedback; and providing students with a set of learning objectives they could use as study tools. They tested their curriculum in a novel, large-enrollment course of upper-division students over several years, making iterative changes to the curriculum as the study progressed.

By framing the research approach as an iterative endeavor of progressive refinement rather than a test of a particular intervention when all other variables are controlled, design-based researchers recognize that: 1) classrooms, and classroom experiences, are unique at any given time, making it difficult to truly “control” the environment in which an intervention occurs or establish a “control group” that differs only in the features of an intervention; and 2) many aspects of a classroom experience may influence the effectiveness of an intervention, often in unanticipated ways, which should be included in the research team’s analysis of an intervention’s success. Consequently, the research team is less concerned with controlling the research conditions—as in an experimental approach—and instead focuses on characterizing the learning environment ( Barab and Squire, 2004 ). This involves collecting data from multiple sources as the research progresses, including how the instructional tools were implemented, aspects of the implementation process that failed to go as planned, and how the instructional tools or implementation process was modified. These characterizations can provide important insights into what specific features of the instructional tools, or the learning environment, were most impactful to learning ( DBR Collective, 2003 ).

A third difference between experimental approaches and design-based research is when the instructional interventions can be modified. In experimental research, the intervention is fixed throughout the experimental period, with any revisions occurring only after the experiment has concluded. This is critical for ensuring that the results of the study provide evidence of the efficacy of a specific intervention. By contrast, design-based research takes a more flexible approach that allows instructional tools to be modified in situ as they are being implemented ( Hoadley, 2004 ; Barab, 2014 ). This flexibility allows the research team to modify instructional tools or strategies that prove inadequate for collecting the evidence necessary to evaluate the underlying theory and ensures a tight connection between interventions and a specific learning problem ( Collins et al. , 2004 ; Hoadley, 2004 ).

Finally, and importantly, experimental approaches and design-based research differ in the kinds of conclusions they draw from their data. Experimental research can “identify that something meaningful happened; but [it is] not able to articulate what about the intervention caused that story to unfold” ( Barab, 2014 , p. 162). In other words, experimental methods are robust for identifying where differences in learning occur, such as between groups of students experiencing peer- or self-grading of practice exams ( Jackson et al. , 2018 ) or receiving different curricula (e.g., Chi et al. , 2012 ). However, these methods are not able to characterize the underlying learning process or mechanism involved in the different learning outcomes. By contrast, design-based research has the potential to uncover mechanisms of learning, because it investigates how the nature of student thinking changes as students experience instructional interventions ( Shavelson et al. , 2003 ; Barab, 2014 ). According to Sandoval (2014) , “Design research, as a means of uncovering causal processes, is oriented not to finding effects but to finding functions , to understanding how desired (and undesired) effects arise through interactions in a designed environment” (p. 30). In Zagallo et al. (2016) , the authors found that their curriculum supported students’ data-interpretation skills, because it stimulated students’ spontaneous use of argumentation during which group members coconstructed evidence-based claims from the data provided. Students also worked collaboratively to decode figures and identify data patterns. These strategies were identified from the researchers’ qualitative data analysis of in-class recordings of small-group discussions, which allowed them to observe what students were doing to support their learning. Because design-based research is focused on characterizing how learning occurs in classrooms, it can begin to answer the kinds of mechanistic questions others have identified as central to advancing BER ( National Research Council [NRC], 2012 ; Dolan, 2015 ; Lo et al. , 2019 ).

DESIGN-BASED RESEARCH IN ACTION: AN EXAMPLE FROM UNDERGRADUATE PHYSIOLOGY

To illustrate how design-based research could be employed in BER, we draw on our own research that investigates how students learn physiology. We will characterize one iteration of our design-based research cycle ( Figure 1 ), emphasizing how our project uses Sandoval’s four epistemic commitments (i.e., theory driven, practically applied, generating design principles, implemented in an iterative manner) to guide our implementation.

Identifying the Learning Problem

Understanding physiological phenomena is challenging for students, given the wide variety of contexts (e.g., cardiovascular, neuromuscular, respiratory; animal vs. plant) and scales involved (e.g., using molecular-level interactions to explain organism functioning; Wang, 2004 ; Michael, 2007 ; Badenhorst et al. , 2016 ). To address these learning challenges, Modell (2000) identified seven “general models” that undergird most physiology phenomena (i.e., control systems, conservation of mass, mass and heat flow, elastic properties of tissues, transport across membranes, cell-to-cell communication, molecular interactions). Instructors can use these models as a “conceptual framework” to help students build intellectual coherence across phenomena and develop a deeper understanding of physiology ( Modell, 2000 ; Michael et al. , 2009 ). This approach aligns with theoretical work in the learning sciences that indicates that providing students with conceptual frameworks improves their ability to integrate and retrieve knowledge ( National Academies of Sciences, Engineering, and Medicine, 2018 ).

Before the start of our design-based project, we had been using Modell’s (2000) general models to guide our instruction. In this essay, we will focus on how we used the general models of mass and heat flow and transport across membranes in our instruction. These two models together describe how materials flow down gradients (e.g., pressure gradients, electrochemical gradients) against sources of resistance (e.g., tube diameter, channel frequency). We call this flux reasoning. We emphasized the fundamental nature and broad utility of flux reasoning in lecture and lab and frequently highlighted when it could be applied to explain a phenomenon. We also developed a conceptual scaffold (the Flux Reasoning Tool) that students could use to reason about physiological processes involving flux.

Although these instructional approaches had improved students’ understanding of flux phenomena, we found that students often demonstrated little commitment to using flux broadly across physiological contexts. Instead, they considered flux to be just another fact to memorize and applied it to narrow circumstances (e.g., they would use flux to reason about ions flowing across membranes—the context where flux was first introduced—but not the bulk flow of blood in a vessel). Students also struggled to integrate the various components of flux (e.g., balancing chemical and electrical gradients, accounting for variable resistance). We saw these issues reflected in students’ lower-than-hoped-for exam scores on the cumulative final of the course. From these experiences, and from conversations with other physiology instructors, we identified a learning problem to address through design-based research: How do students learn to use flux reasoning to explain material flows in multiple physiology contexts?

The process of identifying a learning problem usually emerges from a researcher’s own experiences (in or outside a classroom) or from previous research that has been described in the literature ( Cobb et al. , 2003 ). To remain true to Sandoval’s first epistemic commitment, a learning problem must advance a theory of learning ( Edelson, 2002 ; McKenney and Reeves, 2013 ). In our work, we investigated how conceptual frameworks based on fundamental scientific concepts (i.e., Modell’s general models) could help students reason productively about physiology phenomena (National Academies of Sciences, Engineering, and Medicine, 2018; Modell, 2000 ). Our specific theoretical question was: Can we characterize how students’ conceptual frameworks around flux change as they work toward robust ideas? Sandoval’s second epistemic commitment stated that a learning problem must aim to improve student learning outcomes. The practical significance of our learning problem was: Does using the concept of flux as a foundational idea for instructional tools increase students’ learning of physiological phenomena?

We investigated our learning problem in an introductory biology course at a large R1 institution. The introductory course is the third in a biology sequence that focuses on plant and animal physiology. The course typically serves between 250 and 600 students in their sophomore or junior years each term. Classes have the following average demographics: 68% male, 21% from lower-income situations, 12% from an underrepresented minority, and 26% first-generation college students.

Design-Based Research Cycle 1, Phase 1: Designing Instructional Tools

The first phase of design-based research involves developing instructional tools that address both the theoretical and practical concerns of the learning problem ( Edelson, 2002 ; Wang and Hannafin, 2005 ). These instructional tools can take many forms, such as specific instructional strategies, classroom worksheets and practices, or technological software, as long as they embody the underlying learning theory being investigated. They must also produce classroom experiences or materials that can be evaluated to determine whether learning outcomes were met ( Sandoval, 2014 ). Indeed, this alignment between theory, the nature of the instructional tools, and the ways students are assessed is central to ensuring rigorous design-based research ( Hoadley, 2004 ; Sandoval, 2014 ). Taken together, the instructional tools instantiate a hypothesized learning environment that will advance both the theoretical and practical questions driving the research ( Barab, 2014 ).

In our work, the theoretical claim that instruction based on fundamental scientific concepts would support students’ flux reasoning was embodied in our instructional approach by being the central focus of all instructional materials, which included: a revised version of the Flux Reasoning Tool ( Figure 2 ); case study–based units in lecture that explicitly emphasized flux phenomena in real-world contexts ( Windschitl et al. , 2012 ; Scott et al. , 2018 ; Figure 3 ); classroom activities in which students practiced using flux to address physiological scenarios; links to online videos describing key flux-related concepts; constructed-response assessment items that cued students to use flux reasoning in their thinking; and pretest/posttest formative assessment questions that tracked student learning ( Figure 4 ).

FIGURE 2. The Flux Reasoning Tool given to students at the beginning of the quarter.

FIGURE 3. An example flux case study that is presented to students at the beginning of the neurophysiology unit. Throughout the unit, students learn how ion flows into and out of cells, as mediated by chemical and electrical gradients and various ion/molecular channels, sends signals throughout the body. They use this information to better understand why Jaime experiences persistent neuropathy. Images from: uz.wikipedia.org/wiki/Fayl:Blausen_0822_SpinalCord.png and commons.wikimedia.org/wiki/File:Figure_38_01_07.jpg.

FIGURE 4. An example flux assessment question about ion flows given in a pre-unit/post-unit formative assessment in the neurophysiology unit.

Phase 2: Testing the Instructional Tools

In the second phase of design-based research, the instructional tools are tested by implementing them in classrooms. During this phase, the instructional tools are placed “in harm’s way … in order to expose the details of the process to scrutiny” ( Cobb et al. , 2003 , p. 10). In this way, researchers and instructors test how the tools perform in real-world settings, which may differ considerably from the design team’s initial expectations ( Hoadley, 2004 ). During this phase, if necessary, the design team may make adjustments to the tools as they are being used to account for these unanticipated conditions ( Collins et al. , 2004 ).

We implemented the instructional tools during the Autumn and Spring quarters of the 2016–2017 academic year. Students were taught to use the Flux Reasoning Tool at the beginning of the term in the context of the first case study unit focused on neurophysiology. Each physiology unit throughout the term was associated with a new concept-based case study (usually about flux) that framed the context of the teaching. Embedded within the daily lectures were classroom activities in which students could practice using flux. Students were also assigned readings from the textbook and videos related to flux to watch during each unit. Throughout the term, students took five exams that each contained some flux questions as well as some pre- and post-unit formative assessment questions. During Winter quarter, we conducted clinical interviews with students who would take our course in the Spring term (i.e., “pre” data) as well as students who had just completed our course in Autumn (i.e., “post” data).

Phase 3: Evaluating the Instructional Tools

The third phase of a design-based research cycle involves evaluating the effectiveness of instructional tools using evidence of student learning ( Barab and Squire, 2004 ; Anderson and Shattuck, 2012 ). This can be done using products produced by students (e.g., homework, lab reports), attitudinal gains measured with surveys, participation rates in activities, interview testimonials, classroom discourse practices, and formative assessment or exam data (e.g., Reiser et al. , 2001 ; Cobb et al. , 2003 ; Barab and Squire, 2004 ; Mohan et al. , 2009 ). Regardless of the source, evidence must be in a form that supports a systematic analysis that could be scrutinized by other researchers ( Cobb et al. , 2003 ; Barab, 2014 ). Also, because design-based research often involves multiple data streams, researchers may need to use both quantitative and qualitative analytical methods to produce a rich picture of how the instructional tools affected student learning ( Collins et al. , 2004 ; Anderson and Shattuck, 2012 ).

In our work, we used the quality of students’ written responses on exams and formative assessment questions to determine whether students improved their understanding of physiological phenomena involving flux. For each assessment question, we analyzed a subset of students’ pretest answers to identify overarching patterns in students’ reasoning about flux, characterized these overarching patterns, then ordered the patterns into different levels of sophistication. These became our scoring rubrics, which identified five different levels of student reasoning about flux. We used the rubrics to code the remainder of students’ responses, with a code designating the level of student reasoning associated with a particular reasoning pattern. We used this ordinal rubric format because it would later inform our theoretical understanding of how students build flux conceptual frameworks (see phase 4). This also allowed us to both characterize the ideas students held about flux phenomena and identify the frequency distribution of those ideas in a class.

By analyzing changes in the frequency distributions of students’ ideas across the rubric levels at different time points in the term (e.g., pre-unit vs. post-unit), we could track both the number of students who gained more sophisticated ideas about flux as the term progressed and the quality of those ideas. If the frequency of students reasoning at higher levels increased from pre-unit to post-unit assessments, we could conclude that our instructional tools as a whole were supporting students’ development of sophisticated flux ideas. For example, on one neuromuscular ion flux assessment question in the Spring of 2017, we found that relatively more students were reasoning at the highest levels of our rubric (i.e., levels 4 and 5) on the post-unit test compared with the pre-unit test. This meant that more students were beginning to integrate sophisticated ideas about flux (i.e., they were balancing concentration and electrical gradients) in their reasoning about ion movement.
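The authors do not publish analysis code in this essay; as a rough sketch of this kind of pre/post comparison, one could tally rubric codes as follows (the level codes below are invented for illustration):

```python
from collections import Counter

# Invented rubric codes (levels 1-5), one per student response.
pre_unit = [1, 2, 2, 3, 1, 2, 3, 2, 1, 3, 2, 4]
post_unit = [3, 4, 2, 5, 4, 3, 4, 5, 3, 4, 2, 5]

def level_distribution(codes):
    """Proportion of responses coded at each rubric level (1-5)."""
    counts = Counter(codes)
    return {level: counts.get(level, 0) / len(codes) for level in range(1, 6)}

print("pre: ", level_distribution(pre_unit))
print("post:", level_distribution(post_unit))
# A rightward shift (more level-4/5 responses post-unit) is the kind of
# evidence the authors describe for improved flux reasoning.
```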

To help validate this finding, we drew on three additional data streams: 1) from in-class group recordings of students working with flux items, we noted that students increasingly incorporated ideas about gradients and resistance when constructing their explanations as the term progressed; 2) from plant assessment items in the latter part of the term, we began to see students using flux ideas unprompted; and 3) from interviews, we observed that students who had already taken the course used flux ideas in their reasoning.

Through these analyses, we also noticed an interesting pattern in the pre-unit test data for Spring 2017 when compared with the frequency distribution of students’ responses with a previous term (Autumn 2016). In Spring 2017, 42% of students reasoned at level 4 or 5 on the pre-unit test, indicating these students already had sophisticated ideas about ion flux before they took the pre-unit assessment. This was surprising, considering only 2% of students reasoned at these levels for this item on the Autumn 2016 pre-unit test.

Phase 4: Reflecting on the Instructional Tools and Their Implementation

The final phase of a design-based research cycle involves a retrospective analysis that addresses the epistemic commitments of this methodology: How was the theory underpinning the research advanced by the research endeavor (theoretical outcome)? Did the instructional tools support student learning about the learning problem (practical outcome)? What were the critical features of the design solution that supported student learning (design principles)? ( Cobb et al. , 2003 ; Barab and Squire, 2004 ).

Theoretical Outcome (Epistemic Commitment 1).

Reflecting on how a design-based research experiment advances theory is critical to our understanding of how students learn in educational settings ( Barab and Squire, 2004 ; Mohan et al. , 2009 ). In our work, we aimed to characterize how students’ conceptual frameworks around flux change as they work toward robust ideas. To do this, we drew on learning progression research as our theoretical framing ( NRC, 2007 ; Corcoran et al. , 2009 ; Duschl et al. , 2011 ; Scott et al. , 2019 ). Learning progression frameworks describe empirically derived patterns in student thinking that are ordered into levels representing cognitive shifts in the ways students conceive a topic as they work toward mastery ( Gunckel et al. , 2012 ). We used our ion flux scoring rubrics to create a preliminary five-level learning progression framework ( Table 1 ). The framework describes how students’ ideas about flux often start with teleological-driven accounts at the lowest level (i.e., level 1), shift to focusing on driving forces (e.g., concentration gradients, electrical gradients) in the middle levels, and arrive at complex ideas that integrate multiple interacting forces at the higher levels. We further validated these reasoning patterns with our student interviews. However, our flux conceptual framework was largely based on student responses to our ion flux assessment items. Therefore, to further validate our learning progression framework, we needed a greater diversity of flux assessment items that investigated student thinking more broadly (i.e., about bulk flow, water movement) across physiological systems.

The preliminary flux learning progression framework characterizing the patterns of reasoning students may exhibit as they work toward mastery of flux reasoning. The student exemplars are from the ion flux formative assessment question presented in Figure 4. The “/” divides a student’s answers to the first and second parts of the question. Level 5 represents the most sophisticated ideas about flux phenomena.

  • Level 5 (Principle-based reasoning with full consideration of interacting components): “Change the membrane potential to −100 mV / The in the cell will put for the positively charged potassium than the .”
  • Level 4 (Emergent principle-based reasoning using individual components): “Decrease the more positive / the concentration gradient and electrical gradient control the motion of charged particles.”
  • Level 3 (Students use fragments of the principle to reason): “Change concentration of outside K / If the , more K will rush into the cell.”
  • Level 2 (Students provide storytelling explanations that are nonmechanistic): “Close voltage-gated potassium channels / When the are closed then we will move back toward meaning that K+ ions will move into the cell causing the mV to go from −90 mV (K+ electrical potential) to −70 mV (RMP).”
  • Level 1 (Students provide nonmechanistic, e.g., teleological, explanations): “Transport proteins / to cross membrane because it wouldn’t do it readily since it’s charged.”

Practical Outcome (Epistemic Commitment 2).

In design-based research, learning theories must “do real work” by improving student learning in real-world settings ( DBR Collective, 2003 ). Therefore, design-based researchers must reflect on whether or not the data they collected show evidence that the instructional tools improved student learning ( Cobb et al. , 2003 ; Sharma and McShane, 2008 ). We determined whether our flux-based instructional approach aided student learning by analyzing the kinds of answers students provided to our assessment questions. Specifically, we considered students who reasoned at level 4 or above as demonstrating productive flux reasoning. Because almost half of students were reasoning at level 4 or 5 on the post-unit assessment after experiencing the instructional tools in the neurophysiology unit (in Spring 2017), we concluded that our tools supported student learning in physiology. Additionally, we noticed that students used language in their explanations that directly tied to the Flux Reasoning Tool ( Figure 2 ), which instructed them to use arrows to indicate the magnitude and direction of gradient-driving forces. For example, in a posttest response to our ion flux item ( Figure 4 ), one student wrote:

Ion movement is a function of concentration and electrical gradients. Which arrow is stronger determines the movement of K+. We can make the electrical arrow bigger and pointing in by making the membrane potential more negative than Ek [i.e., potassium’s equilibrium potential]. We can make the concentration arrow bigger and pointing in by making a very strong concentration gradient pointing in.

Given that almost half of students reasoned at level 4 or above, and that students used language from the Flux Reasoning Tool, we concluded that using fundamental concepts was a productive instructional approach for improving student learning in physiology and that our instructional tools aided student learning. However, some students in the 2016–2017 academic year continued to apply flux ideas more narrowly than intended (i.e., for ion and simple diffusion cases, but not water flux or bulk flow). This suggested that students had developed nascent flux conceptual frameworks after experiencing the instructional tools but could use more support to realize the broad applicability of this principle. Also, although our cross-sectional interview approach demonstrated how students’ ideas, overall, could change after experiencing the instructional tools, it did not provide information about how a student developed flux reasoning.
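As an illustration of the kind of pre/post comparison described above, the following Python sketch computes the share of students coded at a "productive" level (4 or above). This is not the authors' analysis pipeline; the threshold default and both score lists are invented for illustration.

```python
def productive_share(scores, threshold=4):
    """Fraction of rubric-coded scores at or above the given level."""
    return sum(score >= threshold for score in scores) / len(scores)

pre_unit = [1, 2, 2, 3, 1, 2, 3, 2]    # invented pre-unit level codes
post_unit = [3, 4, 5, 4, 2, 5, 4, 3]   # invented post-unit level codes
print(f"pre: {productive_share(pre_unit):.0%}  post: {productive_share(post_unit):.0%}")
```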

Reflecting on practical outcomes also means interpreting any learning gains in the context of the learning ecology. This reflection allowed us to identify whether particular aspects of the instructional tools supported learning better than others (DBR Collective, 2003). Indeed, this was critical for understanding why 42% of students scored at level 3 or above on the pre-unit ion assessment in Spring 2017, while only 2% of students scored at level 3 or above in Autumn 2016. When we reviewed notes on the Spring 2017 implementation scheme, we saw that the pretest was due at the end of the first day of class, after students had been exposed to ion flux ideas in class and in a reading/video assignment about ion flow, which may be one reason for the students’ high performance on the pretest. Consequently, we could not tell whether students’ initial high performance was due to their learning from the activities in the first day of class or to other factors we did not measure. It also indicated that we needed to close pretests before the first day of class to get a more accurate measure of students’ incoming ideas and of the effectiveness of the instructional tools employed at the beginning of the unit.

Design Principles (Epistemic Commitment 3).

Although design-based research is enacted in local contexts (i.e., a particular classroom), its purpose is to inform learning ecologies that have broad applications to improve learning and teaching (Edelson, 2002; Cobb et al., 2003). Therefore, design-based research should produce design principles that describe characteristics of learning environments that researchers and instructors can use to develop instructional tools specific to their local contexts (e.g., Edelson, 2002; Subramaniam et al., 2015). Consequently, the design principles must balance specificity with adaptability so they can be used broadly to inform instruction (Collins et al., 2004; Barab, 2014).

From our first cycle of design-based research, we developed the following design principles:

1) Key scientific concepts should provide an overarching framework for course organization. This way, the individual components that make up a course, like instructional units, activities, practice problems, and assessments, all reinforce the centrality of the key concept.

2) Instructional tools should explicitly articulate the principle of interest, with specific guidance on how that principle is applied in context. This stresses the applied nature of the principle and that it is more than a fact to be memorized.

3) Instructional tools need to show specific instances of how the principle is applied in multiple contexts, to combat students’ narrow application of the principle to a limited number of contexts.

Design-Based Research Cycle 2, Phase 1: Redesign and Refine the Experiment

The last “epistemic commitment” Sandoval (2014) articulated was that design-based research be an iterative process with an eye toward continually refining the instructional tools, based on evidence of student learning, to produce more robust learning environments. By viewing educational inquiry as formative research, design-based researchers recognize the difficulty of accounting, a priori, for all variables that could impact student learning or the implementation of the instructional tools (Collins et al., 2004). Robust instructional designs are the products of trial and error, which are strengthened by a systematic analysis of how they perform in real-world settings.

To continue advancing our work investigating student thinking with the principle of flux, we began a second cycle of design-based research that continued to address the learning problem of helping students reason with fundamental scientific concepts. In this cycle, we largely focused on broadening the number of physiological systems that had accompanying formative assessment questions (i.e., beyond ion flux), collecting student reasoning from a more diverse population of students (e.g., upper-division, allied health, community college), and refining and validating the flux learning progression with both written and interview data collected from individual students through time. We developed a suite of constructed-response flux assessment questions that spanned neuromuscular, cardiovascular, respiratory, renal, and plant physiological contexts and asked students about several kinds of flux: ion movement, diffusion, water movement, and bulk flow (29 total questions; available at beyondmultiplechoice.org). This would provide us with rich qualitative data that we could use to refine the learning progression. We decided to administer written assessments and conduct interviews in a pretest/posttest manner at the beginning and end of each unit, both to increase our data about student reasoning and to provide students with additional practice using flux reasoning across contexts.

From this second round of designing instructional tools (i.e., broader range of assessment items), testing them in the classroom (i.e., administering the assessment items to diverse student populations), evaluating the tools (i.e., developing learning progression–aligned rubrics across phenomena from student data, tracking changes in the frequency distribution of students across levels through time), and reflecting on the tools’ success, we would develop a more thorough and robust characterization of how students use flux across systems that could better inform our creation of new instructional tools to support student learning.

HOW CAN DESIGN-BASED RESEARCH EXTEND AND ENRICH BER?

While design-based research has primarily been used in educational inquiry at the K–12 level (see Reiser et al., 2001; Mohan et al., 2009; Jin and Anderson, 2012), other science disciplines at undergraduate institutions have begun to employ this methodology to create robust instructional approaches (e.g., Szteinberg et al., 2014, in chemistry; Hake, 2007, and Sharma and McShane, 2008, in physics; Kelly, 2014, in engineering). Our own work, as well as that by Zagallo et al. (2016), provides two examples of how design-based research can be implemented in BER. Below, we articulate some of the ways incorporating design-based research into BER could extend and enrich this field of educational inquiry.

Design-Based Research Connects Theory with Practice

One critique of BER is that it does not draw heavily enough on learning theories from other disciplines, like cognitive psychology or the learning sciences, to inform its research (Coley and Tanner, 2012; Dolan, 2015; Peffer and Renken, 2016; Davidesco and Milne, 2019). For example, there has been considerable work in BER developing concept inventories as formative assessment tools that identify concepts students often struggle to learn (e.g., Marbach-Ad et al., 2009; McFarland et al., 2017; Summers et al., 2018). However, much of this work is detached from a theoretical understanding of why students hold misconceptions in the first place, what the nature of their thinking is, and the learning mechanisms that would move students to a more productive understanding of domain ideas (Alonzo, 2011). Using design-based research to understand the basis of students’ misconceptions would ground these practical learning problems in a theoretical understanding of the nature of student thinking (e.g., see Coley and Tanner, 2012, 2015; Gouvea and Simon, 2018) and of the kinds of instructional tools that would best support the learning process.

Design-Based Research Fosters Collaborations across Disciplines

Recently, there have been multiple calls across science, technology, engineering, and mathematics education fields to increase collaborations between BER and other disciplines so as to increase the robustness of science education research at the collegiate level (Coley and Tanner, 2012; NRC, 2012; Talanquer, 2014; Dolan, 2015; Peffer and Renken, 2016; Mestre et al., 2018; Davidesco and Milne, 2019). Engaging in design-based research provides both a mechanism and a motivation for fostering interdisciplinary collaborations, as it requires the design team to have theoretical knowledge of how students learn, domain knowledge of practical learning problems, and instructional knowledge for how to implement instructional tools in the classroom (Edelson, 2002; Hoadley, 2004; Wang and Hannafin, 2005; Anderson and Shattuck, 2012). For example, in our current work, our research team consists of two discipline-based education learning scientists from an R1 institution, two physiology education researchers/instructors (one from an R1 institution, the other from a community college), several physiology disciplinary experts/instructors, and a K–12 science education expert.

Design-based research collaborations have several distinct benefits for BER. First, learning or cognitive scientists could provide theoretical and methodological expertise that may be unfamiliar to biology education researchers with traditional science backgrounds (Lo et al., 2019). This would both improve the rigor of the research project and provide biology education researchers with the opportunity to explore ideas and methods from other disciplines. Second, collaborations between researchers and instructors could help increase the implementation of evidence-based teaching practices by instructors/faculty who are not education researchers and would benefit from support while shifting their instructional approaches (Eddy et al., 2015). This may be especially true for community college and primarily undergraduate institution faculty, who often do not have access to the same kinds of resources as researchers and instructors at research-intensive institutions (Schinske et al., 2017). Third, making instructors an integral part of a design-based research project ensures they are well versed in the theory and learning objectives underlying the instructional tools they are implementing in the classroom. This can improve the fidelity of implementation of the instructional tools, because the instructors understand the tools’ theoretical and practical purposes; low fidelity of implementation has been cited as one reason for the mixed results on the impact of active learning across biology classes (Andrews et al., 2011; Borrego et al., 2013; Lee et al., 2018; Offerdahl et al., 2018). It also gives instructors agency to make informed adjustments to the instructional tools during implementation that improve their practical applications while remaining true to the goals of the research (Hoadley, 2004).

Design-Based Research Invites Using Mixed Methods to Analyze Data

The diverse nature of the data that are often collected in design-based research can require both qualitative and quantitative methodologies to produce a rich picture of how the instructional tools and their implementation influenced student learning (Anderson and Shattuck, 2012). Using mixed methods may be less familiar to biology education researchers who were primarily trained in quantitative methods as biologists (Lo et al., 2019). However, according to Warfa (2016, p. 2), “Integration of research findings from quantitative and qualitative inquiries in the same study or across studies maximizes the affordances of each approach and can provide better understanding of biology teaching and learning than either approach alone.” Although the number of BER studies using mixed methods has increased over the past decade (Lo et al., 2019), engaging in design-based research could further this trend through its collaborative nature of bringing social scientists together with biology education researchers to share research methodologies from different fields. By leveraging qualitative and quantitative methods, design-based researchers unpack “mechanism and process” by characterizing the nature of student thinking rather than “simply reporting that differences did or did not occur” (Barab, 2014, p. 158), which is important for continuing to advance our understanding of student learning in BER (Dolan, 2015; Lo et al., 2019).

CHALLENGES TO IMPLEMENTING DESIGN-BASED RESEARCH IN BER

As with any methodological approach, there can be challenges to implementing design-based research. Here, we highlight three that may be relevant to BER.

Collaborations Can Be Difficult to Maintain

While collaborations between researchers and instructors offer many affordances (as discussed earlier), the reality of connecting researchers across departments and institutions can be challenging. For example, Peffer and Renken (2016) noted that different traditions of scholarship can present barriers to collaboration when there is not mutual respect for the methods and ideas that are part and parcel of each discipline. Additionally, Schinske et al. (2017) identified several constraints community college faculty face in engaging in BER, such as limited time or support (e.g., infrastructural, administrative, and peer support), which could also impact their ability to form the kinds of collaborations inherent in design-based research. Moreover, the iterative nature of design-based research requires these collaborations to persist for an extended period of time. Attending to these challenges is an important part of forming the design team and identifying the different roles researchers and instructors will play in the research.

Design-Based Research Experiments Are Resource Intensive

The focus of design-based research on studying learning ecologies to uncover mechanisms of learning requires that researchers collect multiple data streams through time, which often necessitates significant temporal and financial resources (Collins et al., 2004; O’Donnell, 2004). Consequently, researchers must weigh practical as well as methodological considerations when formulating their experimental design. For example, investigating learning mechanisms requires that researchers collect data at a frequency that will capture changes in student thinking (Siegler, 2006). However, researchers may be constrained in the number of data-collection events they can anticipate, depending on:

  • the instructor’s ability to facilitate in-class collection events or solicit student participation in extracurricular activities (e.g., interviews);
  • the cost of technological devices to record student conversations;
  • the time and logistical considerations needed to schedule and conduct student interviews;
  • the financial resources available to compensate student participants; and
  • the financial and temporal costs associated with analyzing large amounts of data.

Identifying learning mechanisms also requires in-depth analyses of qualitative data as students experience various instructional tools (e.g., microgenetic methods; Flynn et al., 2006; Siegler, 2006). The high intensity of these in-depth analyses often limits the number of students who can be evaluated in this way, which must be balanced with the kinds of generalizations researchers wish to make about the effectiveness of the instructional tools (O’Donnell, 2004). Because of the large variety of data streams that could be collected in a design-based research experiment—and the resources required to collect and analyze them—it is critical that the research team identify a priori how specific data streams, and the methods of their analysis, will provide the evidence necessary to address the theoretical and practical objectives of the research (see the following section on experimental rigor; Sandoval, 2014). These are critical management decisions because of the need for a transparent, systematic analysis of the data that others can scrutinize to evaluate the validity of the claims being made (Cobb et al., 2003).

Concerns with Experimental Rigor

The nature of design-based research, with its use of narrative to characterize, rather than control, experimental environments, has drawn concerns about the rigor of this methodological approach. Some have challenged its ability to produce evidence-based warrants to support its claims of learning that can be replicated and critiqued by others (Shavelson et al., 2003; Hoadley, 2004). This is a valid concern that design-based researchers, and indeed all education researchers, must address to ensure their research meets established standards for education research (NRC, 2002).

One way design-based researchers address this concern is by “specifying theoretically salient features of a learning environment design and mapping out how they are predicted to work together to produce desired outcomes” (Sandoval, 2014, p. 19). Through this process, researchers explicitly show, before they begin the work, how their theory of learning is embodied in the instructional tools to be tested, the specific data the tools will produce for analysis, and what outcomes will be taken as evidence of success. Moreover, by allowing instructional tools to be modified during the testing phase as needed, design-based researchers acknowledge that it is impossible to anticipate all aspects of the classroom environment that might impact the implementation of instructional tools, “as dozens (if not millions) of factors interact to produce the measurable outcomes related to learning” (Hoadley, 2004, p. 204; DBR Collective, 2003). Consequently, modifying instructional tools midstream to account for these unanticipated factors can ensure they retain their methodological alignment with the underlying theory and predicted learning outcomes, so that inferences drawn from the design experiment accurately reflect what was being tested (Edelson, 2002; Hoadley, 2004). Indeed, Barab (2014) states, “the messiness of real-world practice must be recognized, understood, and integrated as part of the theoretical claims if the claims are to have real-world explanatory value” (p. 153).

CONCLUSIONS

Design-based research can extend and enrich BER by:

  • providing a methodology that integrates theories of learning with practical experiences in classrooms;

  • using a range of analytical approaches that allow researchers to uncover the underlying mechanisms of student thinking and learning;

  • fostering interdisciplinary collaborations among researchers and instructors; and

  • characterizing learning ecologies that account for the complexity involved in student learning.

By employing this methodology from the learning sciences, biology education researchers can enrich our current understanding of what is required to help biology students achieve their personal and professional aims during their college experience. It can also stimulate new ideas for biology education that can be discussed and debated in our research community as we continue to explore and refine how best to serve the students who pass through our classroom doors.

1 “Epistemic commitment” is defined as engaging in certain practices that generate knowledge in an agreed-upon way.

ACKNOWLEDGMENTS

We thank the UW Biology Education Research Group (BERG) for feedback on drafts of this essay, as well as Dr. L. Jescovich for last-minute analyses. This work was supported by a National Science Foundation award (NSF DUE 1661263/1660643). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the NSF. All procedures were conducted in accordance with approval from the Institutional Review Board at the University of Washington (52146) and the New England Independent Review Board (120160152).

  • Alonzo, A. C. (2011). Learning progressions that support formative assessment practices. Measurement, 9(2/3), 124–129.
  • Anderson, T., & Shattuck, J. (2012). Design-based research: A decade of progress in education research? Educational Researcher, 41(1), 16–25.
  • Andrews, T. M., Leonard, M. J., Colgrove, C. A., & Kalinowski, S. T. (2011). Active learning not associated with student learning in a random sample of college biology courses. CBE—Life Sciences Education, 10(4), 394–405.
  • Badenhorst, E., Hartman, N., & Mamede, S. (2016). How biomedical misconceptions may arise and affect medical students’ learning: A review of theoretical perspectives and empirical evidence. Health Professions Education, 2(1), 10–17.
  • Barab, S. (2014). Design-based research: A methodological toolkit for engineering change. In The Cambridge handbook of the learning sciences (2nd ed., pp. 151–170). Cambridge University Press. https://doi.org/10.1017/CBO9781139519526.011
  • Barab, S., & Squire, K. (2004). Design-based research: Putting a stake in the ground. Journal of the Learning Sciences, 13(1), 1–14.
  • Bell, P. (2004). On the theoretical breadth of design-based research in education. Educational Psychologist, 39(4), 243–253.
  • Borrego, M., Cutler, S., Prince, M., Henderson, C., & Froyd, J. E. (2013). Fidelity of implementation of research-based instructional strategies (RBIS) in engineering science courses. Journal of Engineering Education, 102(3), 394–425.
  • Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. Journal of the Learning Sciences, 2(2), 141–178.
  • Chi, M. T. H., Roscoe, R. D., Slotta, J. D., Roy, M., & Chase, C. C. (2012). Misconceived causal explanations for emergent processes. Cognitive Science, 36(1), 1–61.
  • Cobb, P., Confrey, J., diSessa, A., Lehrer, R., & Schauble, L. (2003). Design experiments in educational research. Educational Researcher, 32(1), 9–13.
  • Coley, J. D., & Tanner, K. D. (2012). Common origins of diverse misconceptions: Cognitive principles and the development of biology thinking. CBE—Life Sciences Education, 11(3), 209–215.
  • Coley, J. D., & Tanner, K. (2015). Relations between intuitive biological thinking and biological misconceptions in biology majors and nonmajors. CBE—Life Sciences Education, 14(1). https://doi.org/10.1187/cbe.14-06-0094
  • Collins, A., Joseph, D., & Bielaczyc, K. (2004). Design research: Theoretical and methodological issues. Journal of the Learning Sciences, 13(1), 15–42.
  • Corcoran, T., Mosher, F. A., & Rogat, A. D. (2009). Learning progressions in science: An evidence-based approach to reform (CPRE Research Report No. RR-63). Philadelphia, PA: Consortium for Policy Research in Education.
  • Davidesco, I., & Milne, C. (2019). Implementing cognitive science and discipline-based education research in the undergraduate science classroom. CBE—Life Sciences Education, 18(3), es4.
  • Design-Based Research Collective. (2003). Design-based research: An emerging paradigm for educational inquiry. Educational Researcher, 32(1), 5–8.
  • Dolan, E. L. (2015). Biology education research 2.0. CBE—Life Sciences Education, 14(4), ed1.
  • Duschl, R., Maeng, S., & Sezen, A. (2011). Learning progressions and teaching sequences: A review and analysis. Studies in Science Education, 47(2), 123–182.
  • Eddy, S. L., Converse, M., & Wenderoth, M. P. (2015). PORTAAL: A classroom observation tool assessing evidence-based teaching practices for active learning in large science, technology, engineering, and mathematics classes. CBE—Life Sciences Education, 14(2), ar23.
  • Edelson, D. C. (2002). Design research: What we learn when we engage in design. Journal of the Learning Sciences, 11(1), 105–121.
  • Flynn, E., Pine, K., & Lewis, C. (2006). The microgenetic method—Time for change? The Psychologist, 19(3), 152–155.
  • Gouvea, J. S., & Simon, M. R. (2018). Challenging cognitive construals: A dynamic alternative to stable misconceptions. CBE—Life Sciences Education, 17(2), ar34.
  • Gunckel, K. L., Mohan, L., Covitt, B. A., & Anderson, C. W. (2012). Addressing challenges in developing learning progressions for environmental science literacy. In Alonzo, A. C., & Gotwals, A. W. (Eds.), Learning progressions in science: Current challenges and future directions (pp. 39–75). Rotterdam: SensePublishers. https://doi.org/10.1007/978-94-6091-824-7_4
  • Hake, R. R. (2007). Design-based research in physics education research: A review. In Kelly, A. E., Lesh, R. A., & Baek, J. Y. (Eds.), Handbook of design research methods in mathematics, science, and technology education (p. 24). New York: Routledge.
  • Hoadley, C. M. (2004). Methodological alignment in design-based research. Educational Psychologist, 39(4), 203–212.
  • Jackson, M., Tran, A., Wenderoth, M. P., & Doherty, J. H. (2018). Peer vs. self-grading of practice exams: Which is better? CBE—Life Sciences Education, 17(3), es44. https://doi.org/10.1187/cbe.18-04-0052
  • Jin, H., & Anderson, C. W. (2012). A learning progression for energy in socio-ecological systems. Journal of Research in Science Teaching, 49(9), 1149–1180.
  • Joseph, D. (2004). The practice of design-based research: Uncovering the interplay between design, research, and the real-world context. Educational Psychologist, 39(4), 235–242.
  • Kelly, A. E. (2014). Design-based research in engineering education. In Cambridge handbook of engineering education research (pp. 497–518). New York, NY: Cambridge University Press. https://doi.org/10.1017/CBO9781139013451.032
  • Lee, C. J., Toven-Lindsey, B., Shapiro, C., Soh, M., Mazrouee, S., Levis-Fitzgerald, M., & Sanders, E. R. (2018). Error-discovery learning boosts student engagement and performance, while reducing student attrition in a bioinformatics course. CBE—Life Sciences Education, 17(3), ar40. https://doi.org/10.1187/cbe.17-04-0061
  • Lo, S. M., Gardner, G. E., Reid, J., Napoleon-Fanis, V., Carroll, P., Smith, E., & Sato, B. K. (2019). Prevailing questions and methodologies in biology education research: A longitudinal analysis of research in CBE—Life Sciences Education and at the Society for the Advancement of Biology Education Research. CBE—Life Sciences Education, 18(1), ar9.
  • Marbach-Ad, G., Briken, V., El-Sayed, N. M., Frauwirth, K., Fredericksen, B., Hutcheson, S., … Smith, A. C. (2009). Assessing student understanding of host pathogen interactions using a concept inventory. Journal of Microbiology & Biology Education, 10(1), 43–50.
  • McFarland, J. L., Price, R. M., Wenderoth, M. P., Martinková, P., Cliff, W., Michael, J., … Wright, A. (2017). Development and validation of the homeostasis concept inventory. CBE—Life Sciences Education, 16(2), ar35.
  • McKenney, S., & Reeves, T. C. (2013). Systematic review of design-based research progress: Is a little knowledge a dangerous thing? Educational Researcher, 42(2), 97–100.
  • Mestre, J. P., Cheville, A., & Herman, G. L. (2018). Promoting DBER-cognitive psychology collaborations in STEM education. Journal of Engineering Education, 107(1), 5–10.
  • Michael, J. A. (2007). What makes physiology hard for students to learn? Results of a faculty survey. Advances in Physiology Education, 31(1), 34–40.
  • Michael, J. A., Modell, H., McFarland, J., & Cliff, W. (2009). The “core principles” of physiology: What should students understand? Advances in Physiology Education, 33(1), 10–16.
  • Middleton, J., Gorard, S., Taylor, C., & Bannan-Ritland, B. (2008). The “compleat” design experiment: From soup to nuts. In Kelly, A. E., Lesh, R. A., & Baek, J. Y. (Eds.), Handbook of design research methods in education: Innovations in science, technology, engineering, and mathematics learning and teaching (pp. 21–46). New York, NY: Routledge.
  • Modell, H. I. (2000). How to help students understand physiology? Emphasize general models. Advances in Physiology Education, 23(1), S101–S107.
  • Mohan, L., Chen, J., & Anderson, C. W. (2009). Developing a multi-year learning progression for carbon cycling in socio-ecological systems. Journal of Research in Science Teaching, 46(6), 675–698.
  • National Academies of Sciences, Engineering, and Medicine. (2018). How people learn II: Learners, contexts, and cultures. Washington, DC: National Academies Press. https://doi.org/10.17226/24783
  • National Research Council (NRC). (2002). Scientific research in education. Washington, DC: National Academies Press. https://doi.org/10.17226/10236
  • NRC. (2007). Taking science to school: Learning and teaching science in grades K–8. Washington, DC: National Academies Press. https://doi.org/10.17226/11625
  • NRC. (2012). Discipline-based education research: Understanding and improving learning in undergraduate science and engineering. Washington, DC: National Academies Press. https://doi.org/10.17226/13362
  • NRC. (2018). How people learn II: Learners, contexts, and cultures. Washington, DC: National Academies Press. https://doi.org/10.17226/24783
  • O’Donnell, A. M. (2004). A commentary on design research. Educational Psychologist, 39(4), 255–260.
  • Offerdahl, E. G., McConnell, M., & Boyer, J. (2018). Can I have your recipe? Using a fidelity of implementation (FOI) framework to identify the key ingredients of formative assessment for learning. CBE—Life Sciences Education, 17(4), es16.
  • Peffer, M., & Renken, M. (2016). Practical strategies for collaboration across discipline-based education research and the learning sciences. CBE—Life Sciences Education, 15(4), es11.
  • Reiser, B. J., Smith, B. K., Tabak, I., Steinmuller, F., Sandoval, W. A., & Leone, A. J. (2001). BGuILE: Strategic and conceptual scaffolds for scientific inquiry in biology classrooms. In Carver, S. M., & Klahr, D. (Eds.), Cognition and instruction: Twenty-five years of progress (pp. 263–305). Mahwah, NJ: Lawrence Erlbaum Associates.
  • Sandoval, W. (2014). Conjecture mapping: An approach to systematic educational design research. Journal of the Learning Sciences, 23(1), 18–36.
  • Sandoval, W. A., & Bell, P. (2004). Design-based research methods for studying learning in context: Introduction. Educational Psychologist, 39(4), 199–201.
  • Schinske, J. N., Balke, V. L., Bangera, M. G., Bonney, K. M., Brownell, S. E., Carter, R. S., … Corwin, L. A. (2017). Broadening participation in biology education research: Engaging community college students and faculty. CBE—Life Sciences Education, 16(2), mr1.
  • Scott, E., Anderson, C. W., Mashood, K. K., Matz, R. L., Underwood, S. M., & Sawtelle, V. (2018). Developing an analytical framework to characterize student reasoning about complex processes. CBE—Life Sciences Education, 17(3), ar49. https://doi.org/10.1187/cbe.17-10-0225
  • Scott, E., Wenderoth, M. P., & Doherty, J. H. (2019). Learning progressions: An empirically grounded, learner-centered framework to guide biology instruction. CBE—Life Sciences Education, 18(4), es5. https://doi.org/10.1187/cbe.19-03-0059
  • Sharma, M. D., & McShane, K. (2008). A methodological framework for understanding and describing discipline-based scholarship of teaching in higher education through design-based research. Higher Education Research & Development, 27(3), 257–270.
  • Shavelson, R. J., Phillips, D. C., Towne, L., & Feuer, M. J. (2003). On the science of education design studies. Educational Researcher, 32(1), 25–28.
  • Siegler, R. S. (2006). Microgenetic analyses of learning. In Damon, W., & Lerner, R. M. (Eds.), Handbook of child psychology (pp. 464–510). Hoboken, NJ: John Wiley & Sons. https://doi.org/10.1002/9780470147658.chpsy0211
  • Stringer, E. T. (2013). Action research. Thousand Oaks, CA: Sage Publications.
  • Subramaniam, M., Jean, B. S., Taylor, N. G., Kodama, C., Follman, R., & Casciotti, D. (2015). Bit by bit: Using design-based research to improve the health literacy of adolescents. JMIR Research Protocols, 4(2), e62.
  • Summers, M. M., Couch, B. A., Knight, J. K., Brownell, S. E., Crowe, A. J., Semsar, K., … Batzli, J. (2018). EcoEvo-MAPS: An ecology and evolution assessment for introductory through advanced undergraduates. CBE—Life Sciences Education, 17(2), ar18.
  • Szteinberg, G., Balicki, S., Banks, G., Clinchot, M., Cullipher, S., Huie, R., … Sevian, H. (2014). Collaborative professional development in chemistry education research: Bridging the gap between research and practice. Journal of Chemical Education, 91(9), 1401–1408.
  • Talanquer, V. (2014). DBER and STEM education reform: Are we up to the challenge? Journal of Research in Science Teaching, 51(6), 809–819.
  • Wang, F., & Hannafin, M. J. (2005). Design-based research and technology-enhanced learning environments. Educational Technology Research and Development, 53(4), 5–23.
  • Wang, J.-R. (2004). Development and validation of a two-tier instrument to examine understanding of internal transport in plants and the human circulatory system. International Journal of Science and Mathematics Education, 2(2), 131–157.
  • Warfa, A.-R. M. (2016). Mixed-methods design in biology education research: Approach and uses. CBE—Life Sciences Education, 15(4), rm5.
  • Windschitl, M., Thompson, J., Braaten, M., & Stroupe, D. (2012). Proposing a core set of instructional practices and tools for teachers of science. Science Education, 96(5), 878–903.
  • Zagallo, P., Meddleton, S., & Bolger, M. S. (2016). Teaching real data interpretation with models (TRIM): Analysis of student dialogue in a large-enrollment cell and developmental biology course. CBE—Life Sciences Education, 15(2), ar17.


Submitted: 18 November 2019; Revised: 3 March 2020; Accepted: 25 March 2020

© 2020 E. E. Scott et al. CBE—Life Sciences Education © 2020 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).


What Is a Research Design | Types, Guide & Examples

Published on June 7, 2021 by Shona McCombes. Revised on November 20, 2023 by Pritha Bhandari.

A research design is a strategy for answering your research question using empirical data. Creating a research design means making decisions about:

  • Your overall research objectives and approach
  • Whether you’ll rely on primary research or secondary research
  • Your sampling methods or criteria for selecting subjects
  • Your data collection methods
  • The procedures you’ll follow to collect data
  • Your data analysis methods

A well-planned research design helps ensure that your methods match your research objectives and that you use the right kind of analysis for your data.

Table of contents

  • Step 1: Consider your aims and approach
  • Step 2: Choose a type of research design
  • Step 3: Identify your population and sampling method
  • Step 4: Choose your data collection methods
  • Step 5: Plan your data collection procedures
  • Step 6: Decide on your data analysis strategies
  • Other interesting articles
  • Frequently asked questions about research design

Before you can start designing your research, you should already have a clear idea of the research question you want to investigate.

There are many different ways you could go about answering this question. Your research design choices should be driven by your aims and priorities—start by thinking carefully about what you want to achieve.

The first choice you need to make is whether you’ll take a qualitative or quantitative approach.

  • Qualitative approach: focuses on words and meanings; used to explore concepts, thoughts, and experiences in depth.
  • Quantitative approach: focuses on numbers; used to measure variables, describe frequencies, averages, and correlations, and test hypotheses about relationships between variables.

Qualitative research designs tend to be more flexible and inductive, allowing you to adjust your approach based on what you find throughout the research process.

Quantitative research designs tend to be more fixed and deductive, with variables and hypotheses clearly defined in advance of data collection.

It’s also possible to use a mixed-methods design that integrates aspects of both approaches. By combining qualitative and quantitative insights, you can gain a more complete picture of the problem you’re studying and strengthen the credibility of your conclusions.

Practical and ethical considerations when designing research

As well as scientific considerations, you need to think practically when designing your research. If your research involves people or animals, you also need to consider research ethics.

  • How much time do you have to collect data and write up the research?
  • Will you be able to gain access to the data you need (e.g., by traveling to a specific location or contacting specific people)?
  • Do you have the necessary research skills (e.g., statistical analysis or interview techniques)?
  • Will you need ethical approval?

At each stage of the research design process, make sure that your choices are practically feasible.


Within both qualitative and quantitative approaches, there are several types of research design to choose from. Each type provides a framework for the overall shape of your research.

Types of quantitative research designs

Quantitative designs can be split into four main types.

  • Experimental and quasi-experimental designs allow you to test cause-and-effect relationships.
  • Descriptive and correlational designs allow you to measure variables and describe relationships between them.
  • Experimental: tests cause-and-effect relationships by manipulating an independent variable and measuring its effect on a dependent variable, with participants randomly assigned to conditions.
  • Quasi-experimental: also tests cause-and-effect relationships, but without true random assignment (e.g., using pre-existing groups).
  • Correlational: measures two or more variables as they naturally occur and describes the relationships between them.
  • Descriptive: describes the characteristics, frequencies, and trends of a population or phenomenon.

With descriptive and correlational designs, you can get a clear picture of characteristics, trends, and relationships as they exist in the real world. However, you can’t draw conclusions about cause and effect (because correlation doesn’t imply causation).

Experiments are the strongest way to test cause-and-effect relationships without the risk of other variables influencing the results. However, their controlled conditions may not always reflect how things work in the real world. They’re often also more difficult and expensive to implement.

Types of qualitative research designs

Qualitative designs are less strictly defined. This approach is about gaining a rich, detailed understanding of a specific context or phenomenon, and you can often be more creative and flexible in designing your research.

The table below shows some common types of qualitative design. They often have similar approaches in terms of data collection, but focus on different aspects when analyzing the data.

  • Grounded theory: develops a new theory inductively, through iterative rounds of data collection and analysis.
  • Phenomenology: investigates a phenomenon by describing and interpreting participants’ lived experiences of it.

Your research design should clearly define who or what your research will focus on, and how you’ll go about choosing your participants or subjects.

In research, a population is the entire group that you want to draw conclusions about, while a sample is the smaller group of individuals you’ll actually collect data from.

Defining the population

A population can be made up of anything you want to study—plants, animals, organizations, texts, countries, etc. In the social sciences, it most often refers to a group of people.

For example, will you focus on people from a specific demographic, region or background? Are you interested in people with a certain job or medical condition, or users of a particular product?

The more precisely you define your population, the easier it will be to gather a representative sample.

Sampling methods

Even with a narrowly defined population, it’s rarely possible to collect data from every individual. Instead, you’ll collect data from a sample.

To select a sample, there are two main approaches: probability sampling and non-probability sampling. The sampling method you use affects how confidently you can generalize your results to the population as a whole.

  • Probability sampling: every member of the population has a known, nonzero chance of being selected, via random selection.
  • Non-probability sampling: individuals are selected on non-random criteria, such as convenience or voluntary self-selection.

Probability sampling is the most statistically valid option, but it’s often difficult to achieve unless you’re dealing with a very small and accessible population.

For practical reasons, many studies use non-probability sampling, but it’s important to be aware of the limitations and carefully consider potential biases. You should always make an effort to gather a sample that’s as representative as possible of the population.
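A minimal Python sketch of the distinction, assuming a hypothetical roster of 500 students: random selection gives every student an equal, known chance of inclusion, while taking the first 50 volunteers is convenience (non-probability) sampling.

```python
import random

population = [f"student_{i}" for i in range(1, 501)]  # hypothetical roster

# Probability sampling: random selection gives every student an equal,
# known chance of inclusion, so results generalize statistically.
random.seed(42)  # for a reproducible example
probability_sample = random.sample(population, k=50)

# Non-probability (convenience) sampling: e.g., the first 50 volunteers.
# Easier to collect, but prone to selection bias.
convenience_sample = population[:50]
```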

Case selection in qualitative research

In some types of qualitative designs, sampling may not be relevant.

For example, in an ethnography or a case study, your aim is to deeply understand a specific context, not to generalize to a population. Instead of sampling, you may simply aim to collect as much data as possible about the context you are studying.

In these types of design, you still have to carefully consider your choice of case or community. You should have a clear rationale for why this particular case is suitable for answering your research question.

For example, you might choose a case study that reveals an unusual or neglected aspect of your research problem, or you might choose several very similar or very different cases in order to compare them.

Data collection methods are ways of directly measuring variables and gathering information. They allow you to gain first-hand knowledge and original insights into your research problem.

You can choose just one data collection method, or use several methods in the same study.

Survey methods

Surveys allow you to collect data about opinions, behaviors, experiences, and characteristics by asking people directly. There are two main survey methods to choose from: questionnaires and interviews.

  • Questionnaires: respondents answer a written list of closed or open questions themselves, on paper or online.
  • Interviews: a researcher asks questions orally and records participants’ responses, one-on-one or in groups.

Observation methods

Observational studies allow you to collect data unobtrusively, observing characteristics, behaviors or social interactions without relying on self-reporting.

Observations may be conducted in real time, taking notes as you observe, or you might make audiovisual recordings for later analysis. They can be qualitative or quantitative.


Other methods of data collection

There are many other ways you might collect data depending on your field and topic.

  • Media & communication: collecting a sample of texts (e.g., speeches, articles, or social media posts) for data on cultural norms and narratives
  • Psychology: using technologies like neuroimaging, eye-tracking, or computer-based tasks to collect data on attention, emotional response, or reaction time
  • Education: using tests or assignments to collect data on knowledge and skills
  • Physical sciences: using scientific instruments to collect data on weight, blood pressure, or chemical composition

If you’re not sure which methods will work best for your research design, try reading some papers in your field to see what kinds of data collection methods they used.

Secondary data

If you don’t have the time or resources to collect data from the population you’re interested in, you can also choose to use secondary data that other researchers already collected—for example, datasets from government surveys or previous studies on your topic.

With this raw data, you can do your own analysis to answer new research questions that weren’t addressed by the original study.

Using secondary data can expand the scope of your research, as you may be able to access much larger and more varied samples than you could collect yourself.

However, it also means you don’t have any control over which variables to measure or how to measure them, so the conclusions you can draw may be limited.

As well as deciding on your methods, you need to plan exactly how you’ll use these methods to collect data that’s consistent, accurate, and unbiased.

Planning systematic procedures is especially important in quantitative research, where you need to precisely define your variables and ensure your measurements are high in reliability and validity.

Operationalization

Some variables, like height or age, are easily measured. But often you’ll be dealing with more abstract concepts, like satisfaction, anxiety, or competence. Operationalization means turning these fuzzy ideas into measurable indicators.

If you’re using observations, which events or actions will you count?

If you’re using surveys, which questions will you ask and what range of responses will be offered?

You may also choose to use or adapt existing materials designed to measure the concept you’re interested in—for example, questionnaires or inventories whose reliability and validity have already been established.
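As a sketch of operationalization, the following Python example turns a fuzzy concept ("test anxiety") into a measurable indicator by averaging five Likert items. The item names and scoring rule are invented for illustration, not taken from an established instrument.

```python
# Invented items and scoring rule, for illustration only.
ANXIETY_ITEMS = ["worry_before_tests", "mind_goes_blank", "physical_tension",
                 "avoids_studying", "fears_failure"]

def anxiety_score(responses):
    """Operationalize 'test anxiety' as the mean of five Likert items
    (1 = strongly disagree ... 5 = strongly agree)."""
    return sum(responses[item] for item in ANXIETY_ITEMS) / len(ANXIETY_ITEMS)

participant = {"worry_before_tests": 4, "mind_goes_blank": 3,
               "physical_tension": 5, "avoids_studying": 2, "fears_failure": 4}
print(anxiety_score(participant))  # 3.6 on a 1-5 scale
```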

Reliability and validity

Reliability means your results can be consistently reproduced, while validity means that you’re actually measuring the concept you’re interested in.


For valid and reliable results, your measurement materials should be thoroughly researched and carefully designed. Plan your procedures to make sure you carry out the same steps in the same way for each participant.

If you’re developing a new questionnaire or other instrument to measure a specific concept, running a pilot study allows you to check its validity and reliability in advance.
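One common internal-consistency check you might run on pilot questionnaire data is Cronbach's alpha: alpha = k/(k − 1) × (1 − sum of item variances / variance of total scores), where k is the number of items. A minimal NumPy sketch, using made-up pilot responses:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (participants x items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                               # number of items
    item_variances = scores.var(axis=0, ddof=1)       # variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)

pilot = [[4, 5, 4], [2, 3, 2], [5, 5, 4], [3, 3, 3], [1, 2, 2]]  # invented data
print(round(cronbach_alpha(pilot), 2))
```

Values near or above 0.7 are conventionally read as acceptable internal consistency, though the appropriate threshold depends on the stakes of the measurement.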

Sampling procedures

As well as choosing an appropriate sampling method, you need a concrete plan for how you’ll actually contact and recruit your selected sample.

That means making decisions about things like:

  • How many participants do you need for an adequate sample size?
  • What inclusion and exclusion criteria will you use to identify eligible participants?
  • How will you contact your sample—by mail, online, by phone, or in person?

If you’re using a probability sampling method, it’s important that everyone who is randomly selected actually participates in the study. How will you ensure a high response rate?

If you’re using a non-probability method, how will you avoid research bias and ensure a representative sample?
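For the sample-size question, researchers often run an a priori power analysis. A sketch using the statsmodels library (assuming it is installed), asking how many participants per group are needed to detect a medium effect in a two-group comparison; the effect size, alpha, and power values are conventional defaults, not requirements:

```python
# Assumes the statsmodels package is available.
from statsmodels.stats.power import TTestIndPower

# Participants per group needed to detect a medium effect (Cohen's d = 0.5)
# with 80% power at alpha = .05 in a two-sided, two-group comparison.
n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05,
                                          power=0.80, alternative="two-sided")
print(round(n_per_group))  # roughly 64 per group
```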

Data management

It’s also important to create a data management plan for organizing and storing your data.

Will you need to transcribe interviews or perform data entry for observations? You should anonymize and safeguard any sensitive data, and make sure it’s backed up regularly.

Keeping your data well organized will save time when it comes to analyzing it. It can also help other researchers validate and add to your findings (high replicability).
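As one small data-management example, direct identifiers can be replaced with stable pseudonyms before analysis. A minimal Python sketch, with an invented identifier and salt:

```python
import hashlib

def pseudonymize(identifier, salt):
    """Replace a direct identifier with a stable pseudonym; store the salt
    separately from the data so the mapping cannot be trivially reversed."""
    return hashlib.sha256((salt + identifier).encode()).hexdigest()[:12]

# Invented identifier and salt, for illustration only.
print(pseudonymize("jane.doe@university.edu", salt="keep-this-secret"))
```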

On its own, raw data can’t answer your research question. The last step of designing your research is planning how you’ll analyze the data.

Quantitative data analysis

In quantitative research, you’ll most likely use some form of statistical analysis. With statistics, you can summarize your sample data, make estimates, and test hypotheses.

Using descriptive statistics, you can summarize your sample data in terms of:

  • The distribution of the data (e.g., the frequency of each score on a test)
  • The central tendency of the data (e.g., the mean to describe the average score)
  • The variability of the data (e.g., the standard deviation to describe how spread out the scores are)

The specific calculations you can do depend on the level of measurement of your variables.
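A short NumPy sketch of these three descriptive summaries, using invented test scores:

```python
import numpy as np

scores = np.array([55, 61, 64, 70, 72, 75, 75, 78, 82, 90])  # invented scores

print("mean:", scores.mean())                              # central tendency
print("median:", np.median(scores))                        # central tendency (robust)
print("standard deviation:", scores.std(ddof=1))           # variability
print("counts per bin:", np.histogram(scores, bins=4)[0])  # distribution
```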

Using inferential statistics, you can:

  • Make estimates about the population based on your sample data.
  • Test hypotheses about a relationship between variables.

Regression and correlation tests look for associations between two or more variables, while comparison tests (such as t tests and ANOVAs) look for differences in the outcomes of different groups.

Your choice of statistical test depends on various aspects of your research design, including the types of variables you’re dealing with and the distribution of your data.
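A sketch of both test families using SciPy, with invented exam scores for two instructional conditions and invented study hours; the specific tests shown assume roughly normal, interval-level data:

```python
from scipy import stats

# Invented data: exam scores under two teaching conditions, plus study hours
# reported by the first group.
group_a = [72, 75, 78, 80, 82, 85, 88]
group_b = [65, 68, 70, 74, 75, 77, 79]
hours_a = [2, 3, 4, 5, 6, 7, 8]

t_stat, p_diff = stats.ttest_ind(group_a, group_b)  # comparison test
r, p_assoc = stats.pearsonr(hours_a, group_a)       # association test
print(f"t = {t_stat:.2f} (p = {p_diff:.3f}); r = {r:.2f} (p = {p_assoc:.3f})")
```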

Qualitative data analysis

In qualitative research, your data will usually be very dense with information and ideas. Instead of summing it up in numbers, you’ll need to comb through the data in detail, interpret its meanings, identify patterns, and extract the parts that are most relevant to your research question.

Two of the most common approaches to doing this are thematic analysis and discourse analysis.

  • Thematic analysis: codes the data and identifies recurring themes, then interprets what those themes mean.
  • Discourse analysis: examines how language is used in the data to create meaning in its social context.

There are many other ways of analyzing qualitative data depending on the aims of your research. To get a sense of potential approaches, try reading some qualitative research papers in your field.
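Qualitative analysis itself is interpretive, but simple tooling can support it. For instance, once excerpts have been hand-coded, a few lines of Python can tally theme frequencies; the participants and theme codes below are invented examples:

```python
from collections import Counter

# Invented participants and theme codes, applied by hand during coding.
coded_excerpts = [
    ("P01", ["time pressure", "peer support"]),
    ("P02", ["time pressure"]),
    ("P03", ["peer support", "belonging"]),
    ("P04", ["time pressure", "belonging"]),
]

theme_counts = Counter(theme for _, themes in coded_excerpts for theme in themes)
print(theme_counts.most_common())  # most frequent themes first
```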

If you want to know more about the research process, methodology, research bias, or statistics, make sure to check out some of our other articles with explanations and examples.

Methodology

  • Simple random sampling
  • Stratified sampling
  • Cluster sampling
  • Likert scales
  • Reproducibility

Statistics

  • Null hypothesis
  • Statistical power
  • Probability distribution
  • Effect size
  • Poisson distribution

Research bias

  • Optimism bias
  • Cognitive bias
  • Implicit bias
  • Hawthorne effect
  • Anchoring bias
  • Explicit bias

A research design is a strategy for answering your research question. It defines your overall approach and determines how you will collect and analyze data.

A well-planned research design helps ensure that your methods match your research aims, that you collect high-quality data from credible sources, and that you use the right kind of analysis to answer your questions. This allows you to draw valid, trustworthy conclusions.

Quantitative research designs can be divided into two main categories:

  • Correlational and descriptive designs are used to investigate characteristics, averages, trends, and associations between variables.
  • Experimental and quasi-experimental designs are used to test causal relationships.

Qualitative research designs tend to be more flexible. Common types of qualitative design include case study, ethnography, and grounded theory designs.

The priorities of a research design can vary depending on the field, but you usually have to specify:

  • Your research questions and/or hypotheses
  • Your overall approach (e.g., qualitative or quantitative)
  • The type of design you’re using (e.g., a survey, experiment, or case study)
  • Your data collection methods (e.g., questionnaires, observations)
  • Your data collection procedures (e.g., operationalization, timing, and data management)
  • Your data analysis methods (e.g., statistical tests or thematic analysis)

A sample is a subset of individuals from a larger population. Sampling means selecting the group that you will actually collect data from in your research. For example, if you are researching the opinions of students in your university, you could survey a sample of 100 students.

In statistics, sampling allows you to test a hypothesis about the characteristics of a population.
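For instance, a simple random sample of 100 students could be drawn like this (a sketch assuming you already have a complete sampling frame; the student IDs are invented):

    import random

    population = [f"student_{i:04d}" for i in range(1, 8001)]  # 8,000 enrolled students

    random.seed(42)   # record the seed so the draw is reproducible
    sample = random.sample(population, k=100)
    print(sample[:5])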

Operationalization means turning abstract conceptual ideas into measurable observations.

For example, the concept of social anxiety isn’t directly observable, but it can be operationally defined in terms of self-rating scores, behavioral avoidance of crowded places, or physical anxiety symptoms in social situations.

Before collecting data, it’s important to consider how you will operationalize the variables that you want to measure.
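As a toy version of the social anxiety example above (the items and scoring rule are assumptions for illustration), you might operationalize the concept as the mean of three 1–5 self-rating items:

    def social_anxiety_score(item_ratings):
        """Composite score: mean of 1-5 self-rating items (higher = more anxious)."""
        assert all(1 <= r <= 5 for r in item_ratings), "ratings must be 1-5"
        return sum(item_ratings) / len(item_ratings)

    print(social_anxiety_score([4, 3, 5]))   # one participant -> 4.0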

A research project is an academic, scientific, or professional undertaking to answer a research question. Research projects can take many forms, such as qualitative or quantitative, descriptive, longitudinal, experimental, or correlational. What kind of research approach you choose will depend on your topic.

Cite this Scribbr article


McCombes, S. (2023, November 20). What Is a Research Design | Types, Guide & Examples. Scribbr. Retrieved August 28, 2024, from https://www.scribbr.com/methodology/research-design/



Dissertation Structure & Layout 101: How to structure your dissertation, thesis or research project.

By: Derek Jansen (MBA) Reviewed By: David Phair (PhD) | July 2019

So, you’ve got a decent understanding of what a dissertation is, you’ve chosen your topic and hopefully you’ve received approval for your research proposal. Awesome! Now it’s time to start the actual dissertation or thesis writing journey.

To craft a high-quality document, the very first thing you need to understand is dissertation structure. In this post, we’ll walk you through the generic dissertation structure and layout, step by step. We’ll start with the big picture, and then zoom into each chapter to briefly discuss the core contents. If you’re just starting out on your research journey, you should start with this post, which covers the big-picture process of how to write a dissertation or thesis.

Dissertation structure and layout - the basics

The Caveat

In this post, we’ll be discussing a traditional dissertation/thesis structure and layout, which is generally used for social science research across universities, whether in the US, UK, Europe or Australia. However, some universities may have small variations on this structure (extra chapters, merged chapters, slightly different ordering, etc).

So, always check with your university if they have a prescribed structure or layout that they expect you to work with. If not, it’s safe to assume the structure we’ll discuss here is suitable. And even if they do have a prescribed structure, you’ll still get value from this post as we’ll explain the core contents of each section.  

Overview: Structuring a dissertation or thesis

  • Acknowledgements page
  • Abstract (or executive summary)
  • Table of contents , list of figures and tables
  • Chapter 1: Introduction
  • Chapter 2: Literature review
  • Chapter 3: Methodology
  • Chapter 4: Results
  • Chapter 5: Discussion
  • Chapter 6: Conclusion
  • Reference list

As I mentioned, some universities will have slight variations on this structure. For example, they might want an additional “personal reflection chapter”, or they might prefer the results and discussion chapters to be merged into one. Regardless, the overarching flow will always be the same, as this flow reflects the research process, which we discussed here – i.e.:

  • The introduction chapter presents the core research question and aims.
  • The literature review chapter assesses what the current research says about this question.
  • The methodology, results and discussion chapters go about undertaking new research about this question.
  • The conclusion chapter (attempts to) answer the core research question.

In other words, the dissertation structure and layout reflect the research process of asking a well-defined question(s), investigating, and then answering the question – see below.

A dissertation's structure reflects the research process

To restate that – the structure and layout of a dissertation reflect the flow of the overall research process . This is essential to understand, as each chapter will make a lot more sense if you “get” this concept. If you’re not familiar with the research process, read this post before going further.

Right. Now that we’ve covered the big picture, let’s dive a little deeper into the details of each section and chapter. Oh and by the way, you can also grab our free dissertation/thesis template here to help speed things up.

The title page of your dissertation is the very first impression the marker will get of your work, so it pays to invest some time thinking about your title. But what makes for a good title? A strong title needs to be 3 things:

  • Succinct (not overly lengthy or verbose)
  • Specific (not vague or ambiguous)
  • Representative of the research you’re undertaking (clearly linked to your research questions)

Typically, a good title includes mention of the following:

  • The broader area of the research (i.e. the overarching topic)
  • The specific focus of your research (i.e. your specific context)
  • Indication of research design (e.g. quantitative, qualitative, or mixed methods).

For example:

A quantitative investigation [research design] into the antecedents of organisational trust [broader area] in the UK retail forex trading market [specific context/area of focus].

Again, some universities may have specific requirements regarding the format and structure of the title, so it’s worth double-checking expectations with your institution (if there’s no mention in the brief or study material).


Acknowledgements

This page provides you with an opportunity to say thank you to those who helped you along your research journey. Generally, it’s optional (and won’t count towards your marks), but it is academic best practice to include this.

So, who do you say thanks to? Well, there are no prescribed requirements, but it’s common to mention the following people:

  • Your dissertation supervisor or committee.
  • Any professors, lecturers or academics that helped you understand the topic or methodologies.
  • Any tutors, mentors or advisors.
  • Your family and friends, especially your spouse (for adult learners studying part-time).

There’s no need for lengthy rambling. Just state who you’re thankful to and for what (e.g. thank you to my supervisor, John Doe, for his endless patience and attentiveness) – be sincere. In terms of length, you should keep this to a page or less.

Abstract or executive summary

The dissertation abstract (or executive summary for some degrees) serves to provide the first-time reader (and marker or moderator) with a big-picture view of your research project. It should give them an understanding of the key insights and findings from the research, without them needing to read the rest of the report – in other words, it should be able to stand alone.

For it to stand alone, your abstract should cover the following key points (at a minimum):

  • Your research questions and aims – what key question(s) did your research aim to answer?
  • Your methodology – how did you go about investigating the topic and finding answers to your research question(s)?
  • Your findings – following your own research, what did you discover?
  • Your conclusions – based on your findings, what conclusions did you draw? What answers did you find to your research question(s)?

So, in much the same way the dissertation structure mimics the research process, your abstract or executive summary should reflect the research process, from the initial stage of asking the original question to the final stage of answering that question.

In practical terms, it’s a good idea to write this section up last, once all your core chapters are complete. Otherwise, you’ll end up writing and rewriting this section multiple times (just wasting time). For a step-by-step guide on how to write a strong executive summary, check out this post.


Table of contents

This section is straightforward. You’ll typically present your table of contents (TOC) first, followed by the two lists – figures and tables. I recommend that you use Microsoft Word’s automatic table of contents generator to generate your TOC. If you’re not familiar with this functionality, the video below explains it simply:

If you find that your table of contents is overly lengthy, consider removing one level of depth. Oftentimes, this can be done without detracting from the usefulness of the TOC.

Right, now that the “admin” sections are out of the way, it’s time to move on to your core chapters. These chapters are the heart of your dissertation and are where you’ll earn the marks. The first chapter is the introduction chapter – as you would expect, this is the time to introduce your research…

It’s important to understand that even though you’ve provided an overview of your research in your abstract, your introduction needs to be written as if the reader has not read that (remember, the abstract is essentially a standalone document). So, your introduction chapter needs to start from the very beginning, and should address the following questions:

  • What will you be investigating (in plain-language, big picture-level)?
  • Why is that worth investigating? How is it important to academia or business? How is it sufficiently original?
  • What are your research aims and research question(s)? Note that the research questions can sometimes be presented at the end of the literature review (next chapter).
  • What is the scope of your study? In other words, what will and won’t you cover?
  • How will you approach your research? In other words, what methodology will you adopt?
  • How will you structure your dissertation? What are the core chapters and what will you do in each of them?

These are just the bare basic requirements for your intro chapter. Some universities will want additional bells and whistles in the intro chapter, so be sure to carefully read your brief or consult your research supervisor.

If done right, your introduction chapter will set a clear direction for the rest of your dissertation. Specifically, it will make it clear to the reader (and marker) exactly what you’ll be investigating, why that’s important, and how you’ll be going about the investigation. Conversely, if your introduction chapter leaves a first-time reader wondering what exactly you’ll be researching, you’ve still got some work to do.

Now that you’ve set a clear direction with your introduction chapter, the next step is the literature review. In this section, you will analyse the existing research (typically academic journal articles and high-quality industry publications), with a view to understanding the following questions:

  • What does the literature currently say about the topic you’re investigating?
  • Is the literature lacking or well established? Is it divided or in disagreement?
  • How does your research fit into the bigger picture?
  • How does your research contribute something original?
  • How does the methodology of previous studies help you develop your own?

Depending on the nature of your study, you may also present a conceptual framework towards the end of your literature review, which you will then test in your actual research.

Again, some universities will want you to focus on some of these areas more than others, some will have additional or fewer requirements, and so on. Therefore, as always, it’s important to review your brief and/or discuss with your supervisor, so that you know exactly what’s expected of your literature review chapter.


Now that you’ve investigated the current state of knowledge in your literature review chapter and are familiar with the existing key theories, models and frameworks, it’s time to design your own research. Enter the methodology chapter – the most “science-ey” of the chapters…

In this chapter, you need to address two critical questions:

  • Exactly HOW will you carry out your research (i.e. what is your intended research design)?
  • Exactly WHY have you chosen to do things this way (i.e. how do you justify your design)?

Remember, the dissertation part of your degree is first and foremost about developing and demonstrating research skills. Therefore, the markers want to see that you know which methods to use, can clearly articulate why you’ve chosen them, and know how to deploy them effectively.

Importantly, this chapter requires detail – don’t hold back on the specifics. State exactly what you’ll be doing, with who, when, for how long, etc. Moreover, for every design choice you make, make sure you justify it.

In practice, you will likely end up coming back to this chapter once you’ve undertaken all your data collection and analysis, and revise it based on changes you made during the analysis phase. This is perfectly fine. It’s natural for you to add an additional analysis technique, scrap an old one, etc., based on where your data lead you. Of course, I’m talking about small changes here – not a fundamental switch from qualitative to quantitative, which will likely send your supervisor into a spin!

You’ve now collected your data and undertaken your analysis, whether qualitative, quantitative or mixed methods. In this chapter, you’ll present the raw results of your analysis. For example, in the case of a quant study, you’ll present the demographic data, descriptive statistics, inferential statistics, etc.

Typically, Chapter 4 is simply a presentation and description of the data, not a discussion of the meaning of the data. In other words, it’s descriptive, rather than analytical – the meaning is discussed in Chapter 5. However, some universities will want you to combine chapters 4 and 5, so that you both present and interpret the meaning of the data at the same time. Check with your institution what their preference is.

Now that you’ve presented the data analysis results, it’s time to interpret and analyse them. In other words, it’s time to discuss what they mean, especially in relation to your research question(s).

What you discuss here will depend largely on your chosen methodology. For example, if you’ve gone the quantitative route, you might discuss the relationships between variables. If you’ve gone the qualitative route, you might discuss key themes and the meanings thereof. It all depends on what your research design choices were.

Most importantly, you need to discuss your results in relation to your research questions and aims, as well as the existing literature. What do the results tell you about your research questions? Are they aligned with the existing research, or at odds with it? If they’re at odds, why might this be? Dig deep into your findings and explain what the findings suggest, in plain English.

The final chapter – you’ve made it! Now that you’ve discussed your interpretation of the results, it’s time to bring it back to the beginning with the conclusion chapter. In other words, it’s time to (attempt to) answer your original research questions (from way back in chapter 1). Clearly state what your conclusions are in terms of your research questions. This might feel a bit repetitive, as you would have touched on this in the previous chapter, but it’s important to bring the discussion full circle and explicitly state your answer(s) to the research question(s).


Next, you’ll typically discuss the implications of your findings. In other words, you’ve answered your research questions – but what does this mean for the real world (or even for academia)? What should now be done differently, given the new insight you’ve generated?

Lastly, you should discuss the limitations of your research, as well as what this means for future research in the area. No study is perfect, especially not at Masters level. Discuss the shortcomings of your research. Perhaps your methodology was limited, perhaps your sample size was small or not representative, and so on. Don’t be afraid to critique your work – the markers want to see that you can identify the limitations of your work. This is a strength, not a weakness. Be brutal!

This marks the end of your core chapters – woohoo! From here on out, it’s pretty smooth sailing.

The reference list is straightforward. It should contain a list of all resources cited in your dissertation, in the required format, e.g. APA, Harvard, etc.

It’s essential that you use reference management software for your dissertation. Do NOT try to handle your referencing manually – it’s far too error-prone. On a reference list spanning multiple pages, you’re going to make mistakes. To this end, I suggest considering either Mendeley or Zotero. Both are free and provide a very straightforward interface to ensure that your referencing is 100% on point. I’ve included a simple how-to video for the Mendeley software (my personal favourite) below:

Some universities may ask you to include a bibliography, as opposed to a reference list. These two things are not the same . A bibliography is similar to a reference list, except that it also includes resources which informed your thinking but were not directly cited in your dissertation. So, double-check your brief and make sure you use the right one.

The very last piece of the puzzle is the appendix or set of appendices. This is where you’ll include any supporting data and evidence. Importantly, supporting is the keyword here.

Your appendices should provide additional “nice to know”, depth-adding information, which is not critical to the core analysis. Appendices should not be used as a way to cut down word count (see this post which covers how to reduce word count ). In other words, don’t place content that is critical to the core analysis here, just to save word count. You will not earn marks on any content in the appendices, so don’t try to play the system!

Time to recap…

And there you have it – the traditional dissertation structure and layout, from A-Z. To recap, the core structure for a dissertation or thesis is (typically) as follows:

  • Acknowledgements page
  • Abstract (or executive summary)
  • Table of contents, list of figures and tables
  • Chapter 1: Introduction
  • Chapter 2: Literature review
  • Chapter 3: Methodology
  • Chapter 4: Results
  • Chapter 5: Discussion
  • Chapter 6: Conclusion
  • Reference list
  • Appendices

Most importantly, the core chapters should reflect the research process (asking, investigating and answering your research question). Moreover, the research question(s) should form the golden thread throughout your dissertation structure. Everything should revolve around the research questions, and as you’ve seen, they should form both the start point (i.e. introduction chapter) and the endpoint (i.e. conclusion chapter).

I hope this post has provided you with clarity about the traditional dissertation/thesis structure and layout. If you have any questions or comments, please leave a comment below, or feel free to get in touch with us. Also, be sure to check out the rest of the  Grad Coach Blog .
