August 17 – 26, 2023

Daily Schedule

9:00 am PT: Workshops Begin
10:30 am PT: Break
12:00 pm PT: Lunch Break
2:30 pm PT: Break
4:00 pm PT: Workshops Conclude

All workshops will be conducted online. Please note that all times listed are in the Pacific time zone. Each workshop lasts one full day, from 9:00 am to 4:00 pm.

2023 Workshops

Thursday, August 17

Friday, August 18

Monday, August 21

Tuesday, August 22

Wednesday, August 23

Thursday, August 24

Friday, August 25

Saturday, August 26

Workshop Descriptions

Thursday, August 17

Foundations & Frontiers of Evaluation & Applied Research Methods
Stewart I. Donaldson
and Christina A. Christie


This workshop will provide participants with an overview of the core concepts and contemporary issues in evaluation and applied research methods. Key topics will include the various uses, purposes, and benefits of conducting evaluations and applied research, basics of validity and design sensitivity, evaluation theory, theories of change, and the strengths and weaknesses of a variety of common and new applied research methods. In addition, participants will be introduced to the latest version of the Evaluation Theory Tree, and a range of popular evaluation approaches including the transdisciplinary approach, theory-driven evaluation science, experimental and quasi-experimental evaluations, culturally responsive and equity-focused evaluations, appreciative and strengths-focused evaluations, empowerment evaluation, inclusive evaluation, utilization-focused evaluation, and developmental evaluation. This workshop is intended to provide participants with a solid introduction, overview, or refresher on the latest developments in evaluation and applied research practice, and to prepare participants for intermediate and advanced level workshops in the series.

Recommended readings:

Donaldson, S. I. (2022). Introduction to theory-driven program evaluation: Culturally responsive and strengths-focused applications. New York, NY: Routledge.

Alkin, M. C., & Christie, C. A. (Eds.) (2023). Evaluation roots: Theory influencing practice. New York, NY: Guilford.

Questions regarding this workshop may be addressed to stewart.donaldson@cgu.edu or tina.christie@ucla.edu.

Expanding Pathways to Leadership
Michelle Bligh


There is no question that leadership profoundly affects our lives through our roles as researchers and evaluators. Organizational and programmatic successes and failures are often attributed to leadership. However, leadership is more than just a collection of tools and tips, or even skills and competencies; the essence of leadership is grounded in values, philosophies, and beliefs. In addition, pathways to leadership are complicated by the various challenges and opportunities rooted in gender, race, ethnicity, age, class, citizenship, ability, and experience.

Through the metaphor of the labyrinth, we will explore the following questions: What is effective leadership, and how can we encourage more researchers and evaluators to identify as leaders and proactive followers? How can we develop more inclusive leadership programs that allow diverse leaders to rise to the new challenges and demands of a global world? We will examine what successful 21st-century leadership looks like, drawing on theories of philosophy and ethics, charismatic and transformational leadership, and followership. Using research, cases, and exercises, we will examine constructs critical to practicing leadership, including empowerment, authenticity, accountability, courage, influence, and humility.

Questions regarding this workshop may be addressed to michelle.bligh@cgu.edu.

Friday, August 18

Qualitative Methods
Kendall Cotton Bronk

This workshop is designed to introduce you to qualitative research methods. The session will focus on how qualitative research can be effectively utilized in applied research and evaluation contexts. We’ll talk about how to devise qualitative research questions, how to select purposive samples, and what kinds of data to collect for qualitative investigations. We’ll also discuss a few approaches to analyzing qualitative data, we’ll explore strategies for enhancing the validity of qualitative studies, and we’ll discuss the types of claims qualitative researchers can make based on their methods. Finally, we’ll dedicate time in the afternoon to addressing specific issues class participants are having with qualitative work they’re currently doing or plan to do.

Questions regarding this workshop may be addressed to kendall.bronk@cgu.edu.

Note: Please be advised that this workshop will NOT be recorded.

Longitudinal Methods: Building and Maintaining Evaluation Participant Commitment Across Time
Anna Woodcock

Many psychological and behavioral processes unfold over time, necessitating longitudinal research designs. Spanning a week, month, year, or decades, longitudinal research poses a host of methodological challenges, foremost of which is participant attrition. This workshop introduces the Tailored Panel Management (TPM) approach to explore how psychological research informs recruitment and retention strategies in longitudinal studies. Using examples and case studies from more than a decade of research, we will focus on practices regarding compensation, communication, consistency, and credibility that promote sustained commitment to longitudinal research participation. This workshop is intended to provide an in-depth understanding of the TPM approach that participants can apply to future longitudinal research projects.

Recommended reading:

Estrada, M., Woodcock, A., & Schultz, P. W. (2014). Tailored panel management: A theory-based approach to building and maintaining participant commitment to a longitudinal study. Evaluation Review, 38, 3-28. doi: 10.1177/0193841X14524956

Questions regarding this workshop may be addressed to anna.woodcock@cgu.edu.

Monday, August 21

Designing & Evaluating Positive Psychology Interventions (PPIs) & Programs
Stewart I. Donaldson


Positive psychology research and practice have blossomed over the past two decades. This workshop is designed to provide participants with an in-depth understanding of how to practically design evidence-based positive psychology interventions and programs. We will begin by reviewing the accumulated scientific evidence on the efficacy and effectiveness of positive psychology interventions and programs, drawn from the Centers for Disease Control and Prevention and from more than two decades of the scientific study of positive psychology theories and interventions. This research will be placed in the context of a relatively new framework, PERMA+4, which summarizes the building blocks of well-being and positive functioning typically targeted by interventions and programs. These evidence-based building blocks include physical health, positive mindset, creating positive environments, economic security, positive emotions, engagement, relationships, meaning and purpose, and achievement. You will learn how to design interventions and programs relevant to contemporary topics and societal issues through mini-lectures, small group discussions, and by designing an innovative positive psychology intervention or program based on your personal and/or career interests.

Recommended reading:

Donaldson, S. I., Gaffaney, J., & Cabrera, V. (2023). The science and practice of positive psychology: From a bold vision to PERMA+4. In H. S. Friedman & C. Markey (Eds.), The Encyclopedia of Mental Health (3rd ed.). Cambridge, MA: Academic Press.

Cabrera, V. & Donaldson, S. I. (2023). PERMA to PERMA+4 building blocks of well-being: A systematic review of the empirical literature. The Journal of Positive Psychology.

Donaldson, S. I., Cabrera, V., & Gaffaney, J. (2021). Following the science to generate well-being: Using the highest quality experimental evidence to design interventions. Frontiers in Psychology. doi: 10.3389/fpsyg.2021.739352. (Special Issue on Positive Psychological Interventions Beyond WEIRD Contexts: How, When, and Why They Work)

Questions regarding this workshop may be addressed to stewart.donaldson@cgu.edu.

Quasi-Experimental Methods
William D. Crano


Quasi-experimental (QE) methods are often an afterthought for researchers trained largely or exclusively on experimental methods. This can limit their research, as many important issues do not translate well into the experimental space. QE methods open the field to a host of questions that simply cannot be forced into experimental designs. One supposed weakness of QE research is its defining feature: such designs never involve random assignment, and so, the argument goes, cannot support causal inferences. The first part of the statement is true; the second may not be. In this workshop, we will discuss research designs using non-random participant samples that produce results as causally instructive as any true experiment could, and that could not have been accomplished in experimental settings. In other instances, the QE design will not produce bullet-proof evidence of causation, but the results will provide strong insights into probable causes that invite the more precise (if less generalizable) lens of the experiment.

In our exploration of QE designs and research, we will discuss the basics of all good science, which involve the quality (reliability and validity) of measures; a comparison of “true” vs. quasi-experimental designs; important threats to internal validity in QE designs (and how to minimize them); and the regression artifact, a special problem in all non-experimental research. We will then move to what I call “primitive” QE designs: case-control (or ex post facto) designs, slightly less primitive designs, matching and propensity analysis, interrupted time-series analysis, and regression-discontinuity analysis. Throughout, we will consider interesting and difficult problems to help expand your views regarding the utility of QE methods in your own research lives.
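
As a small illustration of one technique on this list, the sketch below runs a propensity-score match on simulated data. The variables, coefficients, and effect size are all hypothetical (not workshop materials); scikit-learn's logistic regression stands in for whatever model estimates the scores.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical observational data: one covariate drives both treatment
# uptake and the outcome, so a naive group comparison is confounded.
n = 1000
x = rng.normal(size=n)
treated = (rng.random(n) < 1 / (1 + np.exp(-x))).astype(int)
outcome = 2.0 * treated + 1.5 * x + rng.normal(size=n)  # true effect = 2.0

# Step 1: estimate propensity scores P(treated | x).
X = x.reshape(-1, 1)
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Step 2: match each treated unit to the control unit with the
# nearest propensity score (1-nearest-neighbor, with replacement).
t_idx = np.where(treated == 1)[0]
c_idx = np.where(treated == 0)[0]
matches = c_idx[np.argmin(np.abs(ps[t_idx][:, None] - ps[c_idx][None, :]), axis=1)]

# Step 3: the treatment-effect estimate is the mean outcome gap within
# matched pairs; compare it to the confounded naive group difference.
naive = outcome[treated == 1].mean() - outcome[treated == 0].mean()
effect = np.mean(outcome[t_idx] - outcome[matches])
print(f"naive difference: {naive:.2f}, matched estimate: {effect:.2f}")
```

Because treated units tend to have higher x, the naive difference overstates the effect; matching on the estimated propensity score pulls the estimate back toward the true value.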

Questions regarding this workshop may be addressed to william.crano@cgu.edu.

Tuesday, August 22

Survey Research Methods
Jason T. Siegel


Dr. Jason Siegel has given his survey writing workshop to large organizations such as NBC TV/Universal and the American Evaluation Association, as well as smaller ones such as Gutcheckit.com and the Energy Center of Wisconsin. He has served as a survey consultant for organizations such as the Bill and Melinda Gates Foundation and the U.S. Department of Labor. His workshop is now going virtual.

Developing surveys is like taking wedding pictures: everyone thinks it is easy, but it takes an experienced professional to do it right. Creating an effective survey requires a complete understanding of the impact that item wording, question order, and survey design can have on a research effort. Only with adequate training can a good survey be distinguished from a bad one.

Dr. Siegel’s one-day workshop will cover all the basics of writing and ordering survey items. It will teach you how to create surveys in a way that minimizes error and maximizes the validity of your data. Additionally, the workshop will discuss why it is important to increase respondents’ motivation to provide accurate answers, and how easy it is to accidentally demotivate your respondents. Once the workshop is over, you will never look at a survey the same way again.

Questions regarding this workshop may be addressed to jason.siegel@cgu.edu.

Introduction to Data Analysis for Applied Researchers and Evaluators
Wesley Schultz


In this course, we will introduce and review basic data analysis tools and concepts commonly used in applied research and evaluation. The focus will be on the fundamental concepts needed to guide decisions about appropriate data analyses, interpretations, and presentations. The goal of the course is to help participants avoid errors and improve their skills as data analysts, communicators of statistical findings, and consumers of data analyses.

Topics include data screening and cleaning, selecting appropriate methods for analysis, detecting statistical pitfalls and dealing with them, avoiding silly statistical mistakes, interpreting statistical output, and presenting findings to lay and professional audiences. Examples will include applications of basic distributions and statistical tests (e.g., z, t, chi-square, correlation, regression), and emerging topics like AI and big data analysis.
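
As a taste of the basic statistical tests listed above, the following sketch runs a two-sample t-test, a chi-square test of independence, and a correlation with SciPy. The data are simulated and the group labels, sites, and numbers are invented for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Two-sample t-test: compare outcome scores for two program groups.
group_a = rng.normal(loc=75, scale=10, size=50)
group_b = rng.normal(loc=70, scale=10, size=50)
t_stat, t_p = stats.ttest_ind(group_a, group_b)

# Chi-square test of independence: completion status by site.
#                completed  dropped
table = np.array([[45, 15],        # site 1
                  [30, 30]])       # site 2
chi2, chi_p, dof, expected = stats.chi2_contingency(table)

# Pearson correlation: attendance vs. final score.
attendance = rng.integers(1, 20, size=50)
score = attendance * 2 + rng.normal(scale=5, size=50)
r, r_p = stats.pearsonr(attendance, score)

print(f"t = {t_stat:.2f} (p = {t_p:.3f}), "
      f"chi2 = {chi2:.2f} (p = {chi_p:.3f}), r = {r:.2f}")
```

Each call returns both the test statistic and its p-value, which is the pairing you need when interpreting output and presenting findings.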

Questions regarding this workshop may be addressed to wesley.schultz@cgu.edu.

Wednesday, August 23

Culturally Responsive, Equity-Focused Evaluation and Applied Research 
Wanda Casillas

Culturally Responsive Evaluation (CRE) is an approach to evaluation that centers culture and context in all steps of an evaluation process. Centering culture in evaluation and applied research has the potential not only to increase the utility and meaningfulness of evaluation findings for stakeholders and the communities served, but also to increase the accuracy and validity of those findings.

However, critics of CRE raise concerns regarding how to conduct culturally responsive, equitable evaluation and applied research projects. While few would dispute that culture is an important factor to attend to in our work, many find the lack of prescription for how to carry out equitable work in research daunting. In this course, we will first position the argument that responsive, equitable evaluation and applied research methods are rigorous and valid. Then we will focus on learning methodological tactics that can be added to a researcher’s toolkit. The class will follow a lecture format with intermittent group work and discussion focused on vignettes of culturally situated applied research projects.

Questions regarding this workshop may be addressed to wanda.casillas@cgu.edu.

Bibliometric Methods
Becky Reichard


One rite of passage for an emerging scholar, including Ph.D. students, is to write a review paper on a given topic or field. This task is daunting, and for good reason. With more than 90 million entries in the Web of Science, living in the information age is a gift, but the sheer amount of information we must sift through can be overwhelming. During this process, you may find yourself asking questions such as: What are the most impactful papers in this field? Which authors are people listening to the most? Since I’m relatively new to this field, how can I be sure that I have not missed any essential work? Despite facing such questions, emerging scholars must still find a way to orient themselves and quickly become knowledgeable in their chosen field of study. Thankfully, due to advances in technology developed to help us handle big data, we now have new, easy-to-use, and robust analytical tools for this task. That’s where this workshop comes in. Bibliometric methods provide an efficient way to cut through the clutter and objectively identify the key papers and authors in a particular field. In essence, bibliometric methods provide a clear, systematic structure for reviewing the published literature on any topic.

Not only can this help you quickly get up to speed on your research topic, but it can also result in a publishable review paper! Using simple, free software and the Web of Science, bibliometric methods allow us to (1) examine citation indices to identify the essential documents in a particular field and (2) generate a visual network map of how individual publications cluster and relate to one another. The purpose of this workshop is to introduce you to the various bibliometric methods (e.g., historiography, co-citation analysis, and bibliographic coupling). We will review the logic and objectives of bibliometric methods, demonstrate how to use the free software, and help you understand how to interpret the network maps. With bibliometric methods at your disposal, you will be well positioned to understand the ongoing narrative within a field of study and craft an innovative, data-driven review paper.
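
To make the idea of co-citation analysis concrete, here is a minimal sketch that counts co-citations from a handful of hypothetical reference lists. The paper names are invented; a real analysis would draw reference lists from the Web of Science and feed the counts into network-mapping software.

```python
from itertools import combinations
from collections import Counter

# Hypothetical citing papers mapped to the references they cite.
# Two references are "co-cited" when they appear together in the same
# reference list; high counts suggest the field sees them as related.
reference_lists = {
    "paper_A": ["Smith 2001", "Jones 2005", "Lee 2010"],
    "paper_B": ["Smith 2001", "Jones 2005"],
    "paper_C": ["Smith 2001", "Lee 2010", "Kim 2015"],
    "paper_D": ["Jones 2005", "Kim 2015"],
}

co_citations = Counter()
for refs in reference_lists.values():
    for pair in combinations(sorted(refs), 2):  # sorted -> canonical pair
        co_citations[pair] += 1

# The most frequently co-cited pairs anchor clusters in the network map.
for pair, count in co_citations.most_common(3):
    print(pair, count)
```

The resulting pair counts are exactly the edge weights of a co-citation network, which is what the visual maps in the workshop render.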

This workshop is for anyone interested in upgrading their ability to understand a particular field of research. It is especially recommended for Ph.D. students, for whom bibliometric methods offer a valuable tool for completing the review paper portfolio requirement.

Questions regarding this workshop may be addressed to becky.reichard@cgu.edu.

Thursday, August 24

Approaches and Methods for Promoting DEI in the Workplace
Jennifer Feitosa

Every year the Society for Industrial and Organizational Psychology surveys its members to highlight the top ten workplace trends. Not surprisingly, diversity, equity, and inclusion (DEI) has consistently ranked among the top three. With that in mind, we will address this exciting and evolving topic from a critical perspective. We will work together to leverage our resources (diverse identities and perspectives) to appreciate the complexity of DEI issues and how they apply to evaluation, research, and practice.

This workshop will start by providing an overview of conceptualizations and operationalizations of DEI in organizations. In addition, we will draw from multiple organizational and DEI frameworks to understand the theory, concepts, and methods of DEI measurement. In the second part of the workshop, we will dive into the practice and evaluation of DEI with applied scenarios and activities. Lastly, participants will discuss the limitations and challenges of measuring DEI along with opportunities to integrate DEI in our evaluation practices moving forward.

Questions regarding this workshop may be addressed to jfeitosa@cmc.edu.

Note: Please be advised that this workshop will NOT be recorded.

Tech Tools and Evaluation: From Artificial Intelligence to Graphic Design and Avatar-based Meetings to Online Surveys, Spreadsheets, and Dashboards
David Fetterman


Evaluation requires powerful tools to address complex, social, political, and economic issues in the world. Technology has always had a significant role in the pursuit of systematic inquiry and assessment in the field. This course focuses on a few of the most powerful and promising tech tools in the evaluation toolkit, ranging from online surveys, spreadsheets, and dashboards to artificial intelligence and avatar-based meeting rooms. Real-world case examples will be used to demonstrate the utility of these tech tools in practice, as applied in the United States and internationally. The course will be interactive and require hands-on activities, including applying many of these new tools throughout the day, presenting applications, and providing and receiving critiques. The course is designed to enhance your technological sophistication as applied to evaluation.

Questions regarding this workshop may be addressed to david.fetterman@cgu.edu.

Friday, August 25

Introduction to Using ChatGPT for Education, Research, & Evaluation
Robert Klitgaard


AI tools are transforming research, writing, and teaching. You’ll have a chance to dive in, with your interests at the forefront. Readings in advance will help you practice your prompts. Then, working together, we’ll see how AI tools can be your research assistants, creative colleagues, data helpers, and writing coaches (not to mention your illustrators). For example, you’ll see how AI can help you summarize technical articles, present alternative perspectives and hypotheses, design a syllabus and outline a class, create a Harvard Business School-style teaching case, edit your writing (even emulating different styles you choose), and design a PowerPoint presentation. Not least, throughout we’ll explore the limitations and misuses of AI tools.

Questions regarding this workshop may be addressed to robert.klitgaard@cgu.edu.

Social Network Analysis
Wallace Chipidza


This workshop explores social network analysis (SNA), a suite of methods for describing, visualizing, and modeling networks of relationships. The workshop has both theoretical and hands-on components. We first present an overview of key concepts in network science and their potential applications. We then explore network data collection, preparation, and visualization. We proceed to investigate networks with respect to (i) centrality, i.e., asking who occupies positions of prominence within a given network, and (ii) community detection, i.e., identifying densely connected regions of the network. The workshop emphasizes applications of SNA to social science and evaluation.
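
As a preview of the two investigations named above, the following sketch computes betweenness centrality and detects communities on a small, invented collaboration network, using the NetworkX library as one common SNA toolkit (the names and ties are hypothetical).

```python
import networkx as nx
from networkx.algorithms import community

# Hypothetical collaboration network among seven evaluators: one
# tight group (Ana, Ben, Cara, plus Gus on the edge), one tight group
# (Dev, Eli, Fay), and a single bridging tie Cara-Dev.
edges = [("Gus", "Ana"), ("Ana", "Ben"), ("Ana", "Cara"), ("Ben", "Cara"),
         ("Cara", "Dev"),
         ("Dev", "Eli"), ("Dev", "Fay"), ("Eli", "Fay")]
G = nx.Graph(edges)

# (i) Centrality: who occupies a prominent, bridging position?
betweenness = nx.betweenness_centrality(G)
most_central = max(betweenness, key=betweenness.get)
print("most central:", most_central)

# (ii) Community detection: densely connected regions of the network.
communities = community.greedy_modularity_communities(G)
print("communities:", [sorted(c) for c in communities])
```

Betweenness centrality counts how often a node sits on shortest paths between others, so the broker between the two groups scores highest, while modularity-based community detection recovers the two dense groups themselves.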

Questions regarding this workshop may be addressed to wallace.chipidza@cgu.edu.

Saturday, August 26

Measuring Flourishing: Using the Latest Measurement Tools to Assess Well-Being
Saida Heshmati


This immersive workshop offers participants a unique opportunity to delve into cutting-edge methods and measurement tools designed to assess and understand the concepts of flourishing and well-being. We will explore the most recent advancements in the field, gaining practical techniques and methodologies for accurately capturing and evaluating well-being. Through an interactive session and hands-on exercises, participants will acquire valuable insights into the latest assessment tools and learn how to effectively apply them in research, interventions, and real-life contexts. This workshop promises to expand participants’ knowledge and skillset in well-being assessment.

Questions regarding this workshop may be addressed to saida.heshmati@cgu.edu.

An Introduction to Structural Equation Modeling
Bin Xie


Structural equation modeling (SEM) is a general framework for empirically testing research hypotheses about relationships among observed (or measured) variables and unobserved (or latent) variables. It comprises a broad array of statistical methods and techniques frequently used in public health and behavioral science research. This workshop will provide an introduction to SEM concepts, principles, applications, and interpretations. Example applications, demonstrated with the lavaan package in R, include path analysis, confirmatory factor analysis, and latent variable modeling.
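
The workshop's demonstrations use the lavaan package in R; purely as an illustration of the underlying idea, the sketch below fits the simplest kind of SEM, a path (mediation) model among observed variables, by ordinary least squares on simulated data (all coefficient values are hypothetical).

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulate a simple path (mediation) model:
#   x -> m (path a), m -> y (path b), x -> y (direct path c').
n = 2000
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)             # a  = 0.5
y = 0.4 * m + 0.3 * x + rng.normal(size=n)   # b  = 0.4, c' = 0.3

def ols(target, predictors):
    """Least-squares slopes (intercept included, then dropped)."""
    design = np.column_stack([np.ones(len(target)), *predictors])
    return np.linalg.lstsq(design, target, rcond=None)[0][1:]

(a,) = ols(m, [x])           # first structural equation:  m ~ x
b, c_prime = ols(y, [m, x])  # second structural equation: y ~ m + x

# The indirect effect of x on y through m is the product a * b.
print(f"a = {a:.2f}, b = {b:.2f}, c' = {c_prime:.2f}, indirect = {a * b:.2f}")
```

With latent variables added, the same logic generalizes to the measurement and structural equations that SEM software estimates simultaneously.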

Questions regarding this workshop may be addressed to bin.xie@cgu.edu.

Contact Us

Claremont Evaluation Center

Claremont Graduate University
175 E. 12th Street
Claremont, CA 91711
matthew.higgins@cgu.edu