August 14 – 22, 2020

Daily Schedule

9:00 am: Workshops Begin
10:30 am – 10:45 am: Break
12:00 pm – 1:00 pm: Lunch Break
2:30 pm – 2:45 pm: Break
4:00 pm: Workshops Conclude

All workshops will be conducted online. Please note that all times listed are in the Pacific time zone. Each workshop lasts one full day, from 9:00 am to 4:00 pm. To register in advance for online workshop attendance, visit the registration page.

2020 Workshops

Friday, August 14

  • Foundations of Evaluation & Applied Research Methods
  • Empowerment Evaluation

Saturday, August 15

  • Qualitative Methods
  • Bibliometric Methods

Sunday, August 16

  • Applications of Correlation and Multiple Regression: Mediation, Moderation, and More
  • Systems Thinking, Complexity and Evaluation

Monday, August 17

  • Survey Research Methods
  • Principles-Focused Developmental Evaluation

Tuesday, August 18

  • Designing & Evaluating Health & Well-Being Interventions in a Global Pandemic
  • Quasi-Experimental Design

Wednesday, August 19

  • Ethics
  • Culturally Responsive Evaluation

Thursday, August 20

  • Consulting
  • Diary Methods for Organizational Intervention Research and Practice

Friday, August 21

  • Applications of Positive Psychological Science in the New Era of Physical and Social Distancing
  • Expanding Pathways to Leadership

Saturday, August 22

  • Improving Performance Monitoring for Social Betterment
  • Data Visualization

Workshop Descriptions

Friday, August 14


Foundations of Evaluation & Applied Research Methods
Stewart I. Donaldson and Christina A. Christie


This workshop will provide participants with an overview of the core concepts in evaluation and applied research methods. Key topics will include the various uses, purposes, and benefits of conducting evaluations and applied research, basics of validity and design sensitivity, evaluation theory, theories of change, strengths and weaknesses of a variety of common applied research methods, and the basics of program, policy, and personnel evaluation. In addition, participants will be introduced to a range of popular evaluation approaches including the transdisciplinary approach, program theory-driven evaluation science, experimental and quasi-experimental evaluations, empowerment evaluation, inclusive evaluation, utilization-focused evaluation, developmental evaluation, and realist evaluation. This workshop is intended to provide participants with a solid introduction, overview, or refresher on the latest developments in evaluation and applied research, and to prepare participants for intermediate and advanced level workshops in the series.

Questions regarding this workshop may be addressed to stewart.donaldson@cgu.edu or tina.christie@ucla.edu.

Empowerment Evaluation
David M. Fetterman

Empowerment evaluation is a stakeholder involvement approach to evaluation. It is aimed at learning and improvement. It helps people learn how to help themselves and become more self-determined, by learning how to monitor and evaluate their own programs and initiatives. Key concepts include: a critical friend (evaluator helping to guide their evaluation), cycles of reflection and action, and a community of learners. Principles guiding empowerment evaluation range from improvement to capacity building and accountability. The basic steps of empowerment evaluation include: 1) establishing a mission or unifying purpose; 2) taking stock – a baseline self-assessment designed to help measure growth and improvement; and 3) planning for the future – establishing goals and strategies to achieve objectives (based on their self-assessment), as well as credible evidence to monitor change. An evaluation dashboard is used to compare actual performance with quarterly milestones and annual goals. The role of the evaluator is that of a coach or facilitator in an empowerment evaluation, since the group is in charge of the evaluation itself. The workshop will highlight how empowerment evaluation produces measurable outcomes with case examples ranging from high tech companies such as Google and Hewlett-Packard to work in rural Arkansas and squatter settlements in South Africa. The workshop will introduce participants to the theory, concepts, principles, and steps of empowerment evaluation as well as the technological tools to facilitate the approach.

Questions regarding this workshop may be addressed to fettermanassociates@gmail.com.

Saturday, August 15


Qualitative Methods
Kendall Cotton Bronk

This workshop is designed to introduce you to qualitative research methods. The session will focus on how qualitative research can be effectively utilized in applied research and evaluation contexts. We’ll talk about how to devise qualitative research questions, how to select purposive samples, and what kinds of data to collect for qualitative investigations. We’ll also discuss a few approaches to analyzing qualitative findings, we’ll explore strategies for enhancing the validity of qualitative studies, and we’ll discuss the types of claims qualitative researchers can make based on their methods. Finally, we’ll dedicate time in the afternoon to addressing specific issues class participants are having with qualitative work they’re currently doing or plan to do.

Questions regarding this workshop may be addressed to kendall.bronk@cgu.edu.


Bibliometric Methods
Becky Reichard


One rite of passage for emerging scholars, including Ph.D. students, is to write a review paper on a given topic or field. This task is daunting, and for good reason. With more than 90 million entries in the Web of Science, living in the information age is a gift, but the sheer amount of information we must sift through can be overwhelming. During this process, you may find yourself asking questions such as: What are the most impactful papers in this field? Which authors are people listening to the most? Since I'm relatively new to this field, how can I be sure that I have not missed any essential work? Despite facing such questions, emerging scholars must still find a way to orient themselves and quickly become knowledgeable in their chosen field of study. Thankfully, due to advances in technology developed to help us handle big data, we now have a new, easy-to-use, and robust analytical tool for this task. That's where this workshop comes in. Bibliometric methods provide an efficient way to cut through the clutter and objectively identify the key papers and authors in a particular field. In essence, bibliometric methods provide a clear, systematic structure for reviewing the published literature on any topic.

Not only can this help you get up to speed quickly on your research topic, but it can also result in a publishable review paper! Using simple, free software and the Web of Science, bibliometric methods allow us to (1) examine citation indices to identify the essential documents in a particular field and (2) generate a visual network map of how individual publications cluster and relate to one another. The purpose of this workshop is to provide you with an introduction to the various bibliometric methods (e.g., historiography, co-citation analysis, and bibliographic coupling). We will review the logic and objectives of bibliometric methods, demonstrate how to use the free software, and help you understand how to interpret the network maps. With bibliometric methods at your disposal, you will be well positioned to understand the ongoing narrative within a field of study and craft an innovative, data-driven review paper.
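For a concrete feel for what these analyses involve, here is a minimal sketch of co-citation counting in Python. The paper IDs and reference lists are made up for illustration; the workshop itself works from the Web of Science and free bibliometric software rather than hand-rolled code.

```python
# Minimal co-citation sketch: two works are "co-cited" whenever they appear
# together in the same reference list. Hypothetical data for illustration only.
from collections import Counter
from itertools import combinations

# Each citing paper maps to the set of works it references (made-up IDs).
references = {
    "paper_A": {"W1", "W2", "W3"},
    "paper_B": {"W2", "W3", "W4"},
    "paper_C": {"W1", "W3"},
}

cocitation = Counter()
for cited in references.values():
    for pair in combinations(sorted(cited), 2):
        cocitation[pair] += 1

# Frequently co-cited pairs tend to cluster together in a network map.
for (w1, w2), count in cocitation.most_common():
    print(f"{w1} -- {w2}: co-cited {count} time(s)")
```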

This workshop is for anyone interested in upgrading their ability to understand a particular field of research. It is especially recommended for Ph.D. students, who will gain a valuable tool for completing their review paper portfolio requirement.

Questions regarding this workshop may be addressed to becky.reichard@cgu.edu.


Sunday, August 16

Applications of Correlation and Multiple Regression: Mediation, Moderation, and More
Dale E. Berger


Multiple regression is a powerful and flexible tool that has wide applications in evaluation and applied research. Regression analyses are used to describe relationships, test theories, make predictions with data from experimental or observational studies, and model complex relationships. In this workshop, we'll explore preparing data for analysis, selecting models that are appropriate to your data and research questions, running analyses including mediation and moderation, interpreting results, and presenting findings to a nontechnical audience. The presenter will demonstrate applications from start to finish with SPSS and Excel. Because it is difficult to remember everything in a presentation, participants will be given detailed handouts with explanations and examples that can be used later to guide similar applications.
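As a rough preview of one analysis covered in the workshop, the sketch below estimates a simple mediation model (X -> M -> Y) in Python on simulated data. The variable names, effect sizes, and use of Python are illustrative assumptions; the workshop itself demonstrates these applications with SPSS and Excel.

```python
# Simple mediation sketch: total, direct, and indirect (a*b) effects.
# Simulated data with hypothetical variable names; illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)             # mediator partly driven by x
y = 0.4 * m + 0.2 * x + rng.normal(size=n)   # outcome driven by m and x
df = pd.DataFrame({"x": x, "m": m, "y": y})

total = smf.ols("y ~ x", df).fit()       # path c: total effect of x on y
a_path = smf.ols("m ~ x", df).fit()      # path a: x -> m
b_path = smf.ols("y ~ x + m", df).fit()  # paths b and c': m -> y, controlling for x

indirect = a_path.params["x"] * b_path.params["m"]
print(f"total effect (c):   {total.params['x']:.3f}")
print(f"direct effect (c'): {b_path.params['x']:.3f}")
print(f"indirect (a*b):     {indirect:.3f}")
```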

Questions regarding this workshop may be addressed to dale.berger@cgu.edu.


Systems Thinking, Complexity and Evaluation

Michael Quinn Patton


Historically, evaluation has focused on the effectiveness of projects and programs. Increasingly, however, evaluators are being called on to evaluate systems change. This requires systems thinking, an understanding of complexity concepts, and an appreciation of the implications of both for evaluation. This course will cover the four major evaluation systems thinking principles and six major complexity concepts as they apply to evaluation. Criteria for evaluating systems transformations will be presented, explained, and applied. The course will include special attention to the relevance and implications of systems thinking and complexity understandings in times of uncertainty and turbulence, as in the global pandemic and accelerating climate emergency.

Questions regarding this workshop may be addressed to mqpatton@prodigy.net.

Monday, August 17



Survey Research Methods
Jason T. Siegel


The focus of this hands-on workshop is to teach attendees how to create reliable and valid surveys for use in applied research. A bad survey is very easy to create. Creating an effective survey requires a complete understanding of the impact that item wording, question ordering, and survey design can have on a research effort. Only with adequate training can a good survey be distinguished from a bad one. The daylong workshop will focus specifically on these three aspects of survey creation. The day will begin with a discussion of Dillman's (2007) principles of question writing. After a brief lecture, attendees will be asked to use their newly gained knowledge to critique the item writing of selected national surveys. Next, attendees will work in groups to create survey items of their own. Using Sudman, Bradburn, and Schwarz's (1996) cognitive approach, attendees will then be informed of the various ways question order can bias results. As practice, attendees will work in groups to critique the item ordering of selected national surveys. Next, attendees will propose an ordering scheme for the questions created during the previous exercise. Lastly, drawing on several sources, the keys to optimal survey design will be presented. As practice, the design of national surveys will be critiqued. Attendees will then work with the survey items created and ordered in class and propose a survey design.

Questions regarding this workshop may be addressed to jason.siegel@cgu.edu.


Principles-Focused Developmental Evaluation
Michael Quinn Patton


Developmental evaluation (DE) guides innovative initiatives in complex dynamic environments. Principles-focused evaluation (P-FE) is one special application of DE focused on evaluating adherence to effectiveness principles for achieving results and guiding adaptive action. The essentials of Principles-Focused Developmental Evaluation will be presented, examined, and applied. Participants will learn to use the GUIDE framework, an acronym specifying the criteria for high-quality principles: (G) guidance for action, (U) utility, (I) inspiration, (D) developmental adaptation, and (E) evaluable. The course will include special attention to the relevance and implications of principles-focused developmental evaluation in times of uncertainty and turbulence, as in the global pandemic and accelerating climate emergency. The course will cover the niche and nature of developmental evaluation (DE) and principles-focused evaluation (P-FE), and how they interconnect; five (5) purposes and applications of developmental evaluation at local and international levels; the particular challenges, strengths, and weaknesses of principles-focused developmental evaluation; and the essential principles and practices for designing and conducting principles-focused developmental evaluations.

Questions regarding this workshop may be addressed to mqpatton@prodigy.net.

Tuesday, August 18



Designing & Evaluating Health & Well-Being Interventions in a Global Pandemic
Stewart I. Donaldson


This online workshop is designed to provide participants with an in-depth understanding of how to practically design, monitor, and evaluate health and well-being interventions and programs in times of physical and social distancing. Participants will follow the process of engaging diverse stakeholders to develop logic models and theories of change, and of designing comprehensive, tailored, ethically defensible, and culturally responsive theory-driven evaluations to enhance health, well-being, and optimal human functioning. Through mini-lectures, small group discussions, exercises, and cases, you will have the experience of designing an action model and a change model, and of proposing a developmental, formative, and summative evaluation plan for an intervention or program of your choice.
Recommended readings:

  • Donaldson, S. I. (2007). Program theory-driven evaluation science: Strategies and applications.  New York, NY: Psychology Press.
  • Donaldson, S. I., Christie, C. A., & Mark, M. (2015).  Credible and actionable evidence: The foundation of rigorous and influential evaluations.  Newbury Park, CA: Sage.
  • Donaldson, S. I., Csikszentmihalyi, M., & Nakamura, J. (2020). Positive psychological science: Improving everyday life, well-being, work, education, and societies across the globe.  New York, NY: Routledge.

Questions regarding this workshop may be addressed to stewart.donaldson@cgu.edu.


Quasi-Experimental Design
William D. Crano


Conducting, interpreting, and evaluating research are important aspects of the evaluator's job description. To that end, many good educational programs provide opportunities for training and experience in conducting and evaluating true experiments (or randomized controlled trials—RCTs—as they sometimes are called). In applied contexts, the opportunity to conduct RCTs often is limited, despite the strong demands on the researcher/evaluator to render "causal" explanations of results, as they lead to more precise understanding and control of outcomes. In these restricted contexts, which are considerably more common than those supporting RCTs, quasi-experimental designs may prove useful. Though they usually do not support causal explanations (with some noteworthy exceptions), they sometimes provide evidence that helps reduce the range of plausible alternative explanations of results and thus can prove to be of real value. This workshop is designed to impart an understanding of quasi-experimental designs. After some introductory foundational discussion focused on "true" experiments, we will consider quasi-experimental designs that may be useful across a range of settings that do not readily lend themselves to experimentation. These designs will include time series and interrupted time series methods, nonrandomized designs with and without control groups, case control (or ex post facto) designs, regression-discontinuity analysis, and other esoterica. Participants are encouraged to bring to the workshop design issues they are facing in real-world contexts.
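To make one of these designs concrete, here is a minimal segmented-regression sketch of an interrupted time series in Python. The data are simulated and the variable names are hypothetical; it illustrates the basic idea rather than any specific workshop material.

```python
# Interrupted time series via segmented regression on simulated monthly data.
# "post" captures the immediate level change at the intervention;
# "t_since" captures the change in slope after the intervention.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
months = np.arange(36)
post = (months >= 18).astype(int)              # 1 once the intervention starts
t_since = np.where(post == 1, months - 18, 0)  # months elapsed since intervention
y = 10 + 0.2 * months + 3 * post + 0.5 * t_since + rng.normal(0, 1, 36)

df = pd.DataFrame({"y": y, "t": months, "post": post, "t_since": t_since})
model = smf.ols("y ~ t + post + t_since", df).fit()
print(model.params)  # level change (post) and slope change (t_since) estimates
```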

Questions regarding this workshop may be addressed to william.crano@cgu.edu.


Wednesday, August 19


Ethics
Stephen Gilliland


When working with clients on consulting and applied research projects, many ethical challenges can arise. In this workshop, we will use case examples to examine ethical challenges and dilemmas, as well as the blind spots we can develop as applied researchers and evaluation experts. We will explore climates and conditions that tend to exacerbate ethical situations and will develop strategies for successfully addressing and/or avoiding ethical challenges. Stephen has taught ethics to MA, MBA, PhD, and executive audiences and will draw on ethical challenges he has experienced in his 30 years of consulting.

Questions regarding this workshop may be addressed to Stephen.Gilliland@cgu.edu.

Culturally Responsive Evaluation
Wanda Casillas 

Culturally Responsive Evaluation is often described as a way of thinking, a stance taken, or an emerging approach to evaluation that centers culture and context in all steps of an evaluation process. Regardless of how it is described, it provides a timely and necessary foundation for critically informing best practices in evaluation design, implementation, and utilization. Taking the broad approach that "culture" describes a shared set of values, principles, practices, behaviors, and tools among groups of individuals, we'll examine the ways in which stakeholders and evaluators, programs, and evaluations are culturally and contextually situated, and explore what this assertion means for evaluation practice. We'll confront complex topics such as the nature of intersecting cultural spheres on evaluation components and the capacity and role of evaluations to address these. In this course, we'll engage with a core set of CRE principles that can be married with existing toolkits, such as a systems evaluation protocol, in order to create robust, thoughtful, and valid evaluation designs that optimize effectiveness for evaluands and utilization of evaluation findings. We'll simulate a step-wise culturally responsive evaluation design using a case study and other interactive exercises to inform personal and professional practices and support group learning.

Questions regarding this workshop may be addressed to wandadcasillas@gmail.com.

Thursday, August 20


Consulting
Cindi Gilliland


This workshop focuses on learning the theories behind, and processes of, management consulting for organizations. Students will gain an understanding of the major types of consulting, including process vs. expert and internal vs. external in multisector environments, and will learn best practices in using evidence-based, client-centered, and positively based tools and techniques to identify organizational development opportunities with high buy-in and positive impact. We will discuss and practice project scoping, contracting, assessing needs, and developing, delivering, and evaluating successful process consulting projects, with a focus on developing the ability to empathize, diagnose, and manage positive, inclusive, and successful team and client relationships. Skills gained will include multilevel needs assessment, primary and secondary data analysis, persuasive oral and written communication, self- and team-management, project management, design and systems thinking, data visualization, and best practices in change management.

Questions regarding this workshop may be addressed to Cindi.Gilliland@cgu.edu.


Diary Methods for Organizational Intervention Research and Practice
Gloria González-Morales and Deirdre O’Shea

Intervention research and practice in organizations require specific methodologies that allow us not only to evaluate the effectiveness of programs, but also to understand the mechanisms and processes through which an intervention works and transfers to everyday operations. Diary methods can be defined as methodological designs that involve data collection on a regular basis (e.g., daily, weekly, monthly). Cross-sectional or traditional longitudinal studies can only capture a static picture or snapshot of the psychological and organizational variables being measured at specific points in time. Diary methods help us to take 'motion pictures' of the intervention implementation, mechanisms, and outcomes.

The main learning objectives of this workshop are to:

  1. Understand the need and usefulness of diary methods for intervention research and practice. We will discuss how mediating mechanisms and processes can be assessed, and how other practical issues such as dosage, participant adherence or participant experience and practice can be explored with diary designs.
  2. Reflect on how to design an intervention diary design based on clear research/practice questions.
    We will highlight the importance of establishing clear research/practice questions to guide the design in terms of what, how, and when to measure. We will consider advantages, challenges, and pitfalls related to diary intervention designs. We will discuss the use of control groups, placebos, random assignment, and randomized controlled trials. The statistical, analytical, and technological tools needed for these designs will be described (a brief illustrative sketch follows this list).
  3. Apply diary methods to participants’ own research/practice projects.
    In groups, participants will discuss and work on their own designs for their research and/or practice projects.
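As a concrete illustration of the kind of statistical tooling referred to above, the sketch below fits a random-intercept model to simulated daily diary data in Python. The variable names and data are hypothetical and are not taken from the workshop materials.

```python
# Random-intercept model for daily diary data: one row per participant per day.
# Simulated data with hypothetical variable names; illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_people, n_days = 40, 10
pid = np.repeat(np.arange(n_people), n_days)
day = np.tile(np.arange(n_days), n_people)
person_effect = rng.normal(0, 1, n_people)[pid]   # stable between-person differences
wellbeing = 5 + 0.1 * day + person_effect + rng.normal(0, 0.5, n_people * n_days)

df = pd.DataFrame({"participant": pid, "day": day, "wellbeing": wellbeing})

# Random intercept per participant accounts for repeated measures across days.
model = smf.mixedlm("wellbeing ~ day", data=df, groups=df["participant"]).fit()
print(model.summary())
```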

Questions regarding this workshop may be addressed to Gloria.Gonzalez@cgu.edu or deirdre.oshea@ul.ie.

Friday, August 21

Applications of Positive Psychological Science in the New Era of Physical and Social Distancing
Stewart Donaldson and Saeideh (Saida) Heshmati


Since its formal introduction at the American Psychological Association Convention in 1998, the positive psychology movement has blossomed, giving birth to a vibrant community of scholars and practitioners interested in understanding and improving various aspects of individual, social, organizational, community, and societal well-being.

In this full-day online workshop, you will be provided with an overview of successful applications of positive psychological science, with a specific focus on evidence-based interventions most likely to be effective during the global pandemic. Through mini-lectures, small group discussions, exercises, and cases, you will learn about the latest research findings, scientific methods, and research-driven applications of positive psychological science, and their potential for addressing inequities caused by the COVID-19 crisis. This emerging body of scientific knowledge can help you learn how to enhance your own well-being and the well-being of your loved ones, and how to improve the lives of underserved and disadvantaged populations that often experience a range of social injustices preventing them from flourishing.
Recommended reading:

  • Donaldson, S. I., Csikszentmihalyi, M., & Nakamura, J. (2020). Positive psychological science: Improving everyday life, well-being, work, education, and societies across the globe.  New York, NY: Routledge.

Questions regarding this workshop may be addressed to Stewart.Donaldson@cgu.edu or Saida.Heshmati@cgu.edu.

Expanding Pathways to Leadership
Michelle Bligh

There is no question that leadership profoundly affects our lives through our roles as researchers and evaluators. Organizational and programmatic successes and failures are often attributed to leadership. However, leadership is more than just a collection of tools and tips, or even skills and competencies; the essence of leadership is grounded in values, philosophies, and beliefs. In addition, pathways to leadership are complicated by the various challenges and opportunities rooted in gender, race, ethnicity, age, class, citizenship, ability, and experience.

Through the metaphor of the labyrinth, we will explore the following questions: What is effective leadership, and how can we encourage more researchers and evaluators to identify as leaders and proactive followers? How can we develop more inclusive leadership programs that allow diverse leaders to rise to the new challenges and demands of a global world? We will examine what successful 21st-century leadership looks like, drawing on theories of philosophy and ethics, charismatic and transformational leadership, and followership. Using research, cases, and exercises, we will examine constructs critical to practicing leadership, including empowerment, authenticity, accountability, courage, influence, and humility.

Questions regarding this workshop may be addressed to Michelle.Bligh@cgu.edu.

Saturday, August 22

Improving Performance Monitoring for Social Betterment
Leslie Fierro

Frequently in the field of evaluation, individuals refer to “M&E”—shorthand for “monitoring and evaluation.” The field of evaluation in academic and professional development contexts often focuses on the “E” part of this equation. However, in addition to performing evaluation, many programs that aim to improve well-being and strive for social betterment are required to report program performance metrics to funders and/or see value and utility in gathering data for their own learning purposes that can be obtained quickly, analyzed easily, and summarized at a glance. This is the realm of performance monitoring (also known as performance measurement). In this workshop, we will briefly cover the history of performance monitoring (compared to evaluation) and discuss some of the debates (and empirical literature) regarding the strengths and limitations of this technique. The bulk of the workshop will focus on considerations and techniques for designing good performance measures and increasing the reliability and validity of data collection and reporting strategies. Finally, we will compare and contrast performance monitoring with evaluation and consider where important synergies exist between these two performance improvement approaches.

Questions regarding this workshop may be addressed to leslie.fierro@cgu.edu.


Data Visualization
Tarek Azzam


The careful planning of visual tools will be the focus of this workshop. Part of our responsibility as professionals is to turn information into knowledge. Data complexity can often obscure main findings or hinder a true understanding of program impact. So how do we make information more accessible to different audiences? Often this is done by visually displaying data and information, but this approach, if not done carefully, can also lead to confusion. We will explore the underlying principles behind effective information displays, principles that can be applied in almost any area of evaluation, and draw on the work of Edward Tufte and Stephen Few to illustrate the breadth and depth of their applications. In addition to providing tips to improve most data displays, we will examine the core factors that make them effective. We will discuss the use of common graphical tools and delve deeper into other graphical displays that allow the user to visually interact with the data. Topics covered include: interactive visual displays, GIS, qualitative and quantitative data display, and crowdsourcing visualizations. This workshop is designed as an introduction to these topics, and no prior training is required.
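As a small illustration of the data-ink ideas discussed above, the sketch below builds a pared-down bar chart with matplotlib in Python. The site names and completion rates are invented for the example; it is not workshop material.

```python
# A "less ink" bar chart: remove non-data spines and label values directly.
# Hypothetical program outcome data for illustration only.
import matplotlib.pyplot as plt

sites = ["Site A", "Site B", "Site C", "Site D"]
completion = [0.62, 0.71, 0.55, 0.80]  # share of participants completing the program

fig, ax = plt.subplots(figsize=(5, 3))
ax.barh(sites, completion, color="#4C72B0")
ax.set_xlim(0, 1)
ax.set_xlabel("Program completion rate")

for spine in ("top", "right"):  # drop chart borders that carry no data
    ax.spines[spine].set_visible(False)

for i, value in enumerate(completion):  # direct labels instead of dense gridlines
    ax.annotate(f"{value:.0%}", (value, i), xytext=(4, 0),
                textcoords="offset points", va="center")

plt.tight_layout()
plt.show()
```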

Questions regarding this workshop may be addressed to tarek.azzam@cgu.edu.

Contact Us

Claremont Evaluation Center

Claremont Graduate University
175 E. 12th Street
Claremont, CA 91711
jessica.renger@cgu.edu