August 17 – 27, 2021

Daily Schedule

9:00 am: Workshops Begin
10:30 am – 10:45 am: Break
12:00 pm – 1:00 pm: Lunch Break
2:30 pm – 2:45 pm: Break
4:00 pm: Workshops Conclude

All workshops will be conducted online. Please note that all times listed are in the Pacific time zone. Each workshop lasts one full day, from 9:00 am to 4:00 pm. To register in advance for online workshop attendance, visit the registration page.

2021 Workshops

Tuesday, August 17 – Foundations of Evaluation & Applied Research Methods; Tech Tools in a (post) COVID Context: Adapting Our Teaching, Research, and Evaluation to a Digital World

Wednesday, August 18 – Qualitative Methods; Longitudinal Research Methods: Building and Maintaining Participant Commitment

Thursday, August 19 – Applications of Correlation and Multiple Regression: Mediation, Moderation, and More; Social Network Analysis

Friday, August 20 – Fair Leadership: The What, Why, and How of Overcoming Perceptions of Injustice; Survey Research Methods

Monday, August 23 – Introduction to Theory-Driven Program Evaluation: Culturally Responsive and Strengths-focused Applications

Tuesday, August 24 – Culturally Responsive Evaluation; Bibliometric Methods

Wednesday, August 25 – Positive Organizational Psychology Interventions: Design and Evaluation

Thursday, August 26 – Approaches and Methods for Promoting DEI in the Workplace

Friday, August 27 – Quasi-Experimental Methods; Consulting

Workshop Descriptions

Tuesday, August 17

Foundations of Evaluation & Applied Research Methods
Stewart I. Donaldson
and Christina A. Christie

This workshop will provide participants with an overview of the core concepts in evaluation and applied research methods. Key topics will include the various uses, purposes, and benefits of conducting evaluations and applied research, basics of validity and design sensitivity, evaluation theory, theories of change, and the strengths and weaknesses of a variety of common applied research methods. In addition, participants will be introduced to a range of popular evaluation approaches including the transdisciplinary approach, theory-driven evaluation science, realist evaluation, experimental and quasi-experimental evaluations, culturally responsive evaluation, appreciative and strengths-focused evaluation, empowerment evaluation, inclusive evaluation, utilization-focused evaluation, and developmental evaluation. This workshop is intended to provide participants with a solid introduction, overview, or refresher on the latest developments in evaluation and applied research, and to prepare participants for intermediate and advanced level workshops in the series.

Questions regarding this workshop may be addressed to stewart.donaldson@cgu.edu or tina.christie@ucla.edu.

Tech Tools in a (post) COVID Context: Adapting Our Teaching, Research, and Evaluation to a Digital World
David M. Fetterman

We have entered a new age in which teaching and learning, research, and evaluation are conducted almost exclusively online. New tools are needed to work remotely. They include Zoom and Skype (basic digital staples), Google Sheets and Docs (indispensable for online digital collaboration), Google Forms (online survey software), word clouds (qualitative data analysis tools), blogs (posting about our projects and programs, instead of ourselves), social media (LinkedIn, Facebook, and Twitter), Canva (transformative publishing software), and VirBella (an avatar-based meeting platform). In addition, we must embrace a new way of using these tools to build trusting relationships, maintain people’s interest and attention, and be effective. This will be an interactive, hands-on workshop in which participants are introduced to tech tools, asked to find them on the web, use them, and return for a critique. The workshop highlights the importance of these new digital tools and how they can be used to minimize Zoom fatigue and greatly enhance the quality and effectiveness of our work.

Questions regarding this workshop may be addressed to fettermanassociates@gmail.com.

Wednesday, August 18

Qualitative Methods
Kendall Cotton Bronk

This workshop is designed to introduce you to qualitative research methods. The session will focus on how qualitative research can be effectively utilized in applied research and evaluation contexts. We’ll talk about how to devise qualitative research questions, how to select purposive samples, and what kinds of data to collect for qualitative investigations. We’ll also discuss a few approaches to analyzing qualitative findings, we’ll explore strategies for enhancing the validity of qualitative studies, and we’ll discuss the types of claims qualitative researchers can make based on their methods. Finally, we’ll dedicate time in the afternoon to addressing specific issues class participants are having with qualitative work they’re currently doing or plan to do.

Questions regarding this workshop may be addressed to kendall.bronk@cgu.edu.

Longitudinal Research Methods: Building and Maintaining Participant Commitment 
Anna Woodcock

Many psychological and behavioral processes unfold over time, necessitating longitudinal research designs. Whether it spans a week, a month, a year, or decades, longitudinal research poses a host of methodological challenges, foremost among which is participant attrition. This workshop introduces the Tailored Panel Management (TPM) approach to explore how psychological research informs recruitment and retention strategies in longitudinal studies. Using examples and case studies from more than a decade of research, we will focus on practices regarding compensation, communication, consistency, and credibility that promote sustained commitment to longitudinal research participation. This workshop is intended to provide an in-depth understanding of the TPM approach that participants can apply to future longitudinal research projects.

Recommended reading:

Estrada, M., Woodcock, A., & Schultz, P. W. (2014). Tailored panel management: A theory-based approach to building and maintaining participant commitment to a longitudinal study. Evaluation Review, 38, 3–28. doi: 10.1177/0193841X14524956

Questions regarding this workshop may be addressed to anna.woodcock@cgu.edu.

Thursday, August 19

Applications of Correlation and Multiple Regression: Mediation, Moderation, and More
Dale E. Berger

Multiple regression is a powerful and flexible tool that has wide applications in evaluation and applied research. Regression analyses are used to describe relationships, test theories, make predictions with data from experimental or observational studies, and model complex relationships. In this workshop we’ll explore preparing data for analysis, selecting models that are appropriate to your data and research questions, running analyses including mediation and moderation, interpreting results, and presenting findings to a nontechnical audience. The presenter will demonstrate applications from start to finish with SPSS and Excel. In recognition of the fact that it is difficult to remember everything in a presentation, participants will be given detailed handouts with explanations and examples that can be used later to guide similar applications.
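The workshop demonstrations use SPSS and Excel; purely as a supplementary sketch (not part of the workshop materials), the core idea behind testing moderation can be illustrated in a few lines of Python. The simulated data and variable names below are hypothetical:

```python
import numpy as np

# Simulated data for a hypothetical moderation analysis: the effect of a
# predictor x on an outcome y depends on the level of a moderator z.
rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)                           # predictor
z = rng.normal(size=n)                           # moderator
y = 2.0 * x + 1.5 * x * z + rng.normal(size=n)   # outcome with an x*z interaction

# Moderation is tested by adding the product term x*z to the regression model.
X = np.column_stack([np.ones(n), x, z, x * z])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# coef holds the intercept and the slopes for x, z, and x*z; a reliably
# nonzero x*z slope indicates that z moderates the x-y relationship.
print(coef)
```

In a real analysis the interaction slope would be accompanied by a standard error and significance test, which dedicated statistics packages report directly.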

Questions regarding this workshop may be addressed to dale.berger@cgu.edu.

Social Network Analysis 
Saida Heshmati & Wallace Chipidza

This workshop explores social network analysis (SNA), a suite of methods for describing, visualizing, and modeling networks of relationships. The workshop has both theoretical and hands-on components. We first present an overview of key concepts in network science and their potential applications. We then explore network data collection, preparation, and visualization. We proceed to investigate networks with respect to (i) centrality, i.e., asking who occupies positions of prominence within a given network, and (ii) community detection, i.e., identifying densely connected regions of the network. The workshop emphasizes applications of SNA to social science and evaluation.
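As a minimal illustration of those two questions (not drawn from the workshop materials), the widely used networkx Python library can compute both on a classic benchmark network:

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Zachary's karate club: a classic 34-member friendship network
G = nx.karate_club_graph()

# (i) Centrality: who occupies a position of prominence?
centrality = nx.degree_centrality(G)
most_central = max(centrality, key=centrality.get)
print("most central member:", most_central)

# (ii) Community detection: which regions of the network are densely connected?
communities = greedy_modularity_communities(G)
print("number of communities:", len(communities))
```

Degree centrality is only one of several centrality measures (betweenness, closeness, and eigenvector centrality answer related but distinct questions about prominence).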

Questions regarding this workshop may be addressed to saida.heshmati@cgu.edu or wallace.chipidza@cgu.edu.

Friday, August 20

Fair Leadership: The What, Why, and How of Overcoming Perceptions of Injustice
Stephen Gilliland

With more divisions in our society than ever before, justice has moved to the center of attention for many organizations and leaders. Why is this the case, and why is fairness so important to most people? We will examine how perceptions of injustice and fairness are formed and some of the challenges in managing these perceptions. Fairness is a multidimensional construct, and you will see how, by managing its dimensions, you can overcome sources of perceived injustice.

Questions regarding this workshop may be addressed to Stephen.Gilliland@cgu.edu.

Survey Research Methods
Jason T. Siegel

Dr. Jason Siegel has given his survey writing workshop to large organizations such as NBC TV/Universal and the American Evaluation Association, as well as smaller ones such as Gutcheckit.com and the Energy Center of Wisconsin. He has served as a survey consultant for organizations such as the Bill and Melinda Gates Foundation and the U.S. Department of Labor. His workshop is now going virtual.

Developing surveys is like taking wedding pictures: everyone thinks it is easy, but it takes an experienced professional to do it right. Creating an effective survey requires a complete understanding of the impact that item wording, question order, and survey design can have on a research effort. Only through adequate training can a good survey be distinguished from a bad one.

Dr. Siegel’s one-day workshop will cover all the basics of writing and ordering survey items. This workshop will teach you how to create surveys in a way that minimizes error and maximizes the validity of your data. Additionally, the workshop will discuss why it is important to increase respondents’ motivation to provide accurate answers and how easy it is to accidentally demotivate your respondents. Once the workshop is over, you will never look at a survey the same way again.

Questions regarding this workshop may be addressed to jason.siegel@cgu.edu.

Monday, August 23

Introduction to Theory-Driven Program Evaluation: Culturally Responsive and Strengths-focused Applications 
Stewart I. Donaldson

This online workshop is designed to provide participants with an in-depth understanding of how to design, monitor, and evaluate interventions and programs. The process of engaging diverse stakeholders to design programs, developing logic models and theories of change, and designing tailored, ethically defensible, and culturally responsive program designs and evaluations will be covered.  Through mini-lectures, small group discussions, exercises, and culturally responsive and strengths-focused cases, you will have the experience of designing an action model, change model, and proposing a developmental, formative, and summative evaluation plan for an intervention or program of your choice.

Recommended reading:

  • Donaldson, S. I. (2021). Introduction to Theory-Driven Program Evaluation: Culturally Responsive and Strengths-Focused Applications.  New York, NY: Routledge.

Questions regarding this workshop may be addressed to stewart.donaldson@cgu.edu.

Tuesday, August 24

Culturally Responsive Evaluation
Wanda Casillas 

Culturally Responsive Evaluation (CRE) is often described as a way of thinking, a stance taken, or an emerging approach to evaluation that centers culture and context in all steps of the evaluation process. However it is characterized, it provides a timely and necessary foundation for critically informing best practices in evaluation design, implementation, and utilization. Taking the broad view that “culture” describes a shared set of values, principles, practices, behaviors, and tools among groups of individuals, we’ll examine the ways in which stakeholders and evaluators, programs, and evaluations are culturally and contextually situated, and explore what this assertion means for evaluation practice. We’ll confront complex topics such as the nature of intersecting cultural spheres and their effects on evaluation components, and the capacity and role of evaluations to address these. In this course, we’ll engage with a core set of CRE principles that can be married with existing toolkits, such as a systems evaluation protocol, to create robust, thoughtful, and valid evaluation designs that optimize effectiveness for evaluands and the utilization of evaluation findings. We’ll simulate a step-wise culturally responsive evaluation design using a case study and other interactive exercises to inform personal and professional practices and support group learning.

Questions regarding this workshop may be addressed to wandadcasillas@gmail.com.

Bibliometric Methods
Becky Reichard

One rite of passage for an emerging scholar, including Ph.D. students, is to write a review paper on a given topic or field. This task is daunting, and for good reason. With over 90 million entries in the Web of Science, living in the information age is a gift, but the sheer amount of information we must sift through can be incredibly overwhelming. During this process, you may find yourself asking questions such as: What are the most impactful papers in this field? Which authors are people listening to the most? Since I’m relatively new to this field, how can I be sure that I have not missed any essential work? Despite facing such questions, emerging scholars must still find a way to orient themselves and quickly become knowledgeable in their chosen field of study. Thankfully, due to advances in technology developed to help us handle big data, we now have a new, easy-to-use, and robust analytical tool to help with this task. That’s where this workshop comes in. Bibliometric methods provide an efficient way to cut through the clutter and objectively identify the key papers and authors in a particular field. In essence, bibliometric methods provide a clear, systematic structure for reviewing published literature on any topic.

Not only can this help you quickly get up to speed on your research topic, but it can also result in a publishable review paper! Using simple, free software and the Web of Science, bibliometric methods allow us to (1) examine citation indices to identify the essential documents in a particular field and (2) generate a visual network map of how individual publications cluster and relate to one another. The purpose of this workshop is to provide you with an introduction to the various bibliometric methods (e.g., historiography, co-citation analysis, and bibliographic coupling). We will review the logic and objectives of bibliometric methods, demonstrate how to use the free software, and help you understand how to interpret the network maps. With the tool of bibliometric methods at your disposal, you will be well positioned to understand the ongoing narrative within a field of study and craft an innovative, data-driven review paper.
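As a toy illustration of the coupling idea (not part of the workshop materials; the paper and reference IDs below are invented placeholders), two papers can be linked simply by counting the references they share:

```python
from itertools import combinations

# Toy citation records: each paper maps to the set of references it cites.
citations = {
    "paper_A": {"ref_1", "ref_2", "ref_3"},
    "paper_B": {"ref_2", "ref_3", "ref_4"},
    "paper_C": {"ref_5"},
}

# Coupling strength between two papers = number of references they share;
# papers that cite much of the same literature likely address related topics.
coupling = {
    (p, q): len(citations[p] & citations[q])
    for p, q in combinations(sorted(citations), 2)
}
print(coupling)  # paper_A and paper_B share two references
```

Dedicated bibliometric software applies this same pairwise logic at the scale of thousands of records to produce the network maps discussed above.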

This workshop is for anyone interested in upgrading their ability to understand a particular field of research. It is highly recommended for Ph.D. students, who will learn a valuable tool to assist with their review paper portfolio requirement.

Questions regarding this workshop may be addressed to becky.reichard@cgu.edu.

Wednesday, August 25

Positive Organizational Psychology Interventions: Design and Evaluation 
Stewart Donaldson

Since its formal introduction at the American Psychological Association Convention in 1998, the science and evidence-based practice of positive psychology has blossomed, giving birth to a vibrant community of scholars and practitioners interested in understanding and improving well-being and positive functioning.

This online workshop is designed to help participants think critically about theories, research, and applications of positive psychology to career development and the workplace.  Through mini-lectures, small group discussions, exercises, and cases, you will learn about the foundations of positive psychology; foundations of positive organizational psychology; building blocks of well-being and positive functioning; positive psychological capital (PsyCap), worker, team, and organizational strengths; job and career crafting; work engagement and flow; high quality work connections and relationships; positivity resonance; positive leadership; executive and leadership coaching; positive organizational development and appreciative inquiry; and diversity, equity, and inclusion (DEI) topics in positive organizational psychology.  Special emphasis will be placed on how to design and evaluate positive organizational psychology interventions in diverse and dynamic organizational contexts.

Recommended reading:

  • Donaldson, S. I. & Chen, C. (2021). Positive organizational psychology interventions: Design & Evaluation.  Hoboken, NJ: Wiley-Blackwell.

Questions regarding this workshop may be addressed to stewart.donaldson@cgu.edu.

Thursday, August 26

Approaches and Methods for Promoting DEI in the Workplace
Gloria González-Morales and Jennifer Feitosa

Every year the Society for Industrial and Organizational Psychology surveys its members to highlight the top ten workplace trends. Not surprisingly, diversity, equity, and inclusion (DEI) has consistently ranked among the top three workplace trends. With that in mind, we will address this exciting and evolving topic from a critical perspective. We will work together to leverage our resources (diverse identities and perspectives) to appreciate the complexity of DEI issues and how they apply to evaluation, research, and practice.

This workshop will start by providing an overview of conceptualizations and operationalizations of DEI in organizations. In addition, we will draw from multiple organizational and DEI frameworks to understand the theory, concepts, and methods of DEI measurement. In the second part of the workshop, we will dive into the practice and evaluation of DEI with applied scenarios and activities. Lastly, participants will discuss the limitations and challenges of measuring DEI along with opportunities to integrate DEI in our evaluation practices moving forward.

Questions regarding this workshop may be addressed to gloria.gonzalez@cgu.edu or jfeitosa@cmc.edu.

Friday, August 27

Quasi-Experimental Methods
William D. Crano

Quasi-experimental (QE) methods are often an afterthought for researchers trained more or less exclusively in experimental methods. This can prove a limiting factor in their research, as many important questions do not transition well into the experimental space. QE methods open the field to a host of issues that simply cannot be forced into experimental designs. One supposed weakness of QE research is its defining feature: such designs never involve random assignment, and so cannot support causal inferences. This absolute may (or may not) be true. In this workshop, we will discuss QE research designs using non-random participant samples that produce results as causally instructive as any pure experiment could, and that could not have been obtained in experimental settings. In other instances, a QE design will not produce bullet-proof evidence of causation but will provide strong insights into probable causes, inviting the more precise (if less generalizable) lens of the experiment.

In our exploration of QE designs and research, we will discuss the basics of all good science, which involve the quality (reliability and validity) of measures; a comparison of “true” vs. quasi-experimental designs; important threats to internal validity in QE designs (and how to minimize them); and the regression artifact, a special problem in QE research. We will then move to what I call “primitive” QE designs: case-control (or ex post facto) designs, slightly less primitive designs, matching and propensity analysis, interrupted time series analyses, and regression-discontinuity analysis. Throughout, we will consider interesting and difficult problems to help expand your views regarding the utility of QE methods in your own research.


Consulting
Cindi Gilliland

This workshop focuses on the theories behind, and the processes of, management consulting for organizations. Students will gain an understanding of the major types of consulting, including process vs. expert and internal vs. external in multisector environments, and will learn best practices in using evidence-based, client-centered, and positively based tools and techniques to identify organizational development opportunities with high buy-in and positive impact. We will discuss and practice project scoping, contracting, assessing needs, and developing, delivering, and evaluating successful process consulting projects, with a focus on developing the ability to empathize, diagnose, and manage positive, inclusive, and successful team and client relationships. Skills gained will include multilevel needs assessment, primary and secondary data analysis, persuasive oral and written communication, self- and team-management, project management, design and systems thinking, data visualization, and best practices in change management.

Questions regarding this workshop may be addressed to Cindi.Gilliland@cgu.edu.

Contact Us

Claremont Evaluation Center

Claremont Graduate University
175 E. 12th Street
Claremont, CA 91711
jessica.renger@cgu.edu