Health Street Smarts Fairs Created by University Students to Promote Critical Thinking on Campus, Part 2

In Part 1, I discussed: (1) my concerns about health fairs; (2) my vision for the Health Street Smarts Fair (HSSF) assignment; (3) some of my favorite undergraduate-student-curated HSSF exhibits; and (4) prospects for skeptic-activists to adapt the HSSF model to create street smarts fairs to promote skeptical inquiry in community or school settings. Self-proclaimed skeptic junkie Susan Gerbic, the Wikipediatrician, notified me that she has encouraged local skeptics groups to set up street smarts exhibits like the ones described in Part 1, and that one group has decided to plan an exhibit at a fair.

This follow-up column addresses issues that may help educators and skeptic-activists interested in adapting the HSSF model. It offers: (1) the detailed instructions I gave students to guide them in completing the assignment; (2) the grading rubric I developed for the assignment; (3) my comments on the instructions and the rubric; (4) my assessment of the strengths and weaknesses of the assignment in eliciting meaningful academic achievement; and (5) ethical considerations in running exhibits designed to expose visitors’ vulnerability to deception.

Here is my most recent version of the instructions for the HSSF assignment:

Each student will be assigned to a small group of students. Each small group will plan an exhibit for an on-campus health fair to be conducted on a date to be announced. The health fair will probably take place outside the University Student Union. Tables and chairs (perhaps under a tent setup) will be provided. Additional equipment may (or may not) be provided as needed, upon request. Each group will provide supplies needed for their exhibit, but no student should spend or be encouraged to spend more than $5.00 for the design of an exhibit.

Each exhibit should be designed to inspire critical thinking about a popular type of product, service, or practice claimed to influence health. Critical thinking has been defined in numerous ways.

Each exhibit should be engaging and have interactive features. Interactive exhibits may include prizes for visitors who participate and successfully complete tasks set up for them. Exhibits should not involve simply distributing products or information. Each exhibit must have, at minimum: (1) prior approval by the instructor based on: (a) a stated learner-centered learning objective composed by the group that involves some aspect of critical thinking; (b) a sound rationale including any reference citations needed to establish the learning objective as worthwhile for exhibit visitors to achieve; and (c) a sound plan to run the exhibit to achieve the learning objective and to make sure that visitors are properly debriefed after participating, especially if exhibits involve any kind of deception; (2) a plan for evaluating the exhibit in terms of process and outcome; (3) a follow-up discussion by the group in the designated discussion forum of Canvas [the online course platform] about the process evaluation and outcome evaluation findings.

Learning objectives must call for higher-order cognitive learning as indicated in Bloom’s taxonomy of educational objectives. See the revised taxonomy with links to resources. See a nice list of action verbs to choose from when writing learning objectives. Good objectives will include action verbs listed at the level of application, analysis, synthesis, or evaluation in the taxonomy.

Process evaluation should focus on the way the exhibit was planned and implemented—in other words, quality control. The key to process evaluation is posing appropriate process evaluation questions and finding ways to answer them. Outcome evaluation should focus on how well the learning objective was achieved. Evaluation is the most challenging and neglected component of educational initiatives.

All communications for the project must take place during class meeting times or in online discussion areas for each assigned group. Students should never be pressured by group members to meet at a particular time outside of class.

I provided this description of how students would be graded for the HSSF assignment in the course syllabus:

There will be both a grade for each individual group member’s efforts (up to 4 points) and a grade for each group (up to 24 points). Grades for contributions to the assignment by each individual student will be determined in part by what your instructor observes in class and in online discussions. Your instructor will also consider information received in response to a request in Canvas for information from each student about the contributions each of their group members made to completing the assignment. Grades will be based on a scale from 0 to 4 for each of seven components (six group-grade components and one component for each individual student’s efforts).

The grading scale for each component uses a target analogy. A grade of 4 points is considered bullseye performance, 3 points means close to the center of the target, 2 means on the target, 1 means near the target, and 0 means way off target.

Here is a description of each possible point award for each component:

Component 7: Your Individual Contribution to the HSSF Assignment

Comments on the Instructions and Rubric

I emphasized that exhibits should be designed to promote critical thinking and not just share information. It would not have been appropriate to limit my undergraduate public health students to creating exhibits only on paranormal or pseudoscience topics. Most of my students have been much more interested in other health topics. Thus, I was careful to present definitions of critical thinking that are applicable to careful examination of a wide range of health issues but also consistent with the description of skepticism provided at SkepticalInquirer.org.

Last year in Skeptical Inquirer, Guy P. Harrison emphasized the importance of teaching critical thinking every day in schools. He provided a concise description of critical thinking that would have fit well in the instructions for the HSSF assignment: “Critical thinking is the means of figuring out if something makes sense and is likely to be true or not.”

I didn’t present my students with any definitions of critical thinking that fail to suggest the importance of distinguishing fact from fiction. An example of such a definition is: “The act or habit of carefully exploring the thinking process to clarify our understanding and make more intelligent decisions.” I think that’s a good definition of metacognition, but not of critical thinking. I found it in the glossary of a critical thinking textbook I was required to use years ago while teaching one of many sections of a required Introduction to Higher Education course for freshman students. Like many other books I’ve collected on critical thinking, that textbook makes no mention of skepticism, pseudoscience, or Occam’s razor.

I think many critical thinking promoters at universities would see these components of the assignment as requiring critical thinking: formulating an appropriate learning objective for an exhibit, providing a sound rationale for the objective, designing the exhibit, planning how to curate it, planning how to evaluate the exhibit, and drawing conclusions from analyzing the collected exhibit evaluation data.

Most groups had difficulty formulating an appropriate learning objective and providing a rationale for it. I spent many hours helping groups with these tasks. I typically needed to give groups considerable feedback on their drafted rationales and revisions. Without a sound objective and rationale, groups would not have been able to proceed to design their exhibits. Eventually, I approved the learning objective and rationale of all groups given the assignment, and I graded most groups 4 out of 4 points for both the learning objective and rationale components. I entered those grades before exhibit day.

I also graded most groups 4 out of 4 points for running their exhibits. Throughout the process of designing exhibits, I advised students not to worry about how good their exhibit would be. I emphasized that I was more interested in students learning how to improve their creations from their exhibit curation experience. I wanted exhibit curation to be fun, so I assured students I would not penalize their groups for any exhibit mishaps.

I think that approach worked out well. Most students took pride in designing their exhibits and tried hard to welcome and enlighten visitors. Just about all exhibits were attractive even though they were designed on small budgets.

I advised my students: “If you build it, they will come.” It turned out that I was right.

I explained to students that running an exhibit well requires teamwork. I guided each group to plan different roles for each group member. I suggested various roles such as main exhibit host, process evaluator, outcome evaluator, debriefer, and visitor recruiter (to encourage people walking nearby to participate). Some complex exhibits, such as the water tasting exhibit I described in Part 1, attracted many visitors and required as many as four group members focused just on operating the exhibit smoothly. In most groups, students worked together harmoniously. For a few groups, I had to put on my mediator hat.

The process (implementation) evaluation component is where most groups lost the most points. Many groups neglected to plan careful observations of what went well and what didn’t. Groups also often lost points on the outcome evaluation component. I think a common reason for these point losses was that some groups struggled so much with formulating their learning objective and rationale that they had to rush to get their exhibits ready and were left with inadequate time to develop evaluation plans.

Groups with weak process or outcome evaluation tended to lose points for the follow-up discussion component of the assignment. You can’t analyze a data set that hasn’t been collected and then have a substantive discussion about it.

I think it’s important to have an individual contribution component to grades on group projects. Unproductive group members unfairly benefit from the group grades earned by the most productive group members. This situation is an example of what is known as the free rider problem.

I made the individual contribution component of the HSSF assignment worth one-sixth as much as the combined group-grade components. That may seem too small to compensate for unearned group points collected by free riders, but I think it worked well. Most students in the five sections that received the HSSF assignment over the years earned 3 or 4 points for their individual contributions to the assignment. Very few students earned 2 points or less; those few tended to also underperform on other assignments or exams.
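To make the weighting concrete, here is a minimal sketch of the point arithmetic under this scheme, in Python. The function name, example scores, and component labels in the comments are my illustrative assumptions; the scheme itself (six group components plus one individual component, each scored 0 to 4) is as described in the syllabus excerpt above.

```python
# Sketch of the HSSF grading arithmetic described above.
# Six group-grade components plus one individual component, each scored 0-4.

GROUP_COMPONENTS = 6      # e.g., objective, rationale, exhibit, process eval, outcome eval, discussion
POINTS_PER_COMPONENT = 4  # 4 = bullseye, 3 = close to center, 2 = on target, 1 = near, 0 = way off

def total_grade(group_scores, individual_score):
    """Combine six group component scores (0-4 each) with one individual score (0-4)."""
    assert len(group_scores) == GROUP_COMPONENTS
    assert all(0 <= s <= POINTS_PER_COMPONENT for s in group_scores)
    assert 0 <= individual_score <= POINTS_PER_COMPONENT
    return sum(group_scores) + individual_score

# Hypothetical example: strong objective/rationale/exhibit work, weaker evaluation work.
group = [4, 4, 4, 2, 3, 3]             # up to 24 group points
individual = 4                         # up to 4 individual points
print(total_grade(group, individual))  # 24 out of a possible 28

# The individual component is worth one-sixth as much as the combined
# group components: 4 / 24, or about 0.17 of the group maximum.
print(4 / (GROUP_COMPONENTS * POINTS_PER_COMPONENT))
```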

Strengths and Weaknesses of the Assignment

Because I didn’t conduct a systematic evaluation of the instructional value of the HSSF assignment, my comments below about the strengths and weaknesses should be viewed as subjective impressions. My generalizations should be taken with grains of salt.

I think the main strengths of the assignment were: (1) students frequently expressed a sense of accomplishment in the follow-up discussions; (2) the exhibits were generally well received by visitors; (3) my students were engaged in learning about critical thinking, instructional planning, and instructional evaluation; and (4) my students liked that they were doing something real.

When I presented and explained the assignment to my classes, most students were unenthusiastic about it. However, when it was over, they often said it had greatly exceeded their initial expectations.

For outcome evaluation, representatives of each group briefly interviewed visitors to their exhibit. In general, the interviews revealed that the exhibits got visitors thinking, made a positive impression, and were instructive. It’s possible, though, that visitors were to some extent telling exhibit representatives what they wanted to hear. The problem of demand characteristics in interviews can’t be ignored.

Many of my students had previous instruction in program planning and evaluation. However, they often had difficulty applying what they had previously learned to exhibit planning and evaluation. After several weeks of work on the assignment, most students demonstrated significant learning about planning and implementing a small-scale health education initiative in a real-world setting.

Students enjoyed setting up displays for their exhibits. They tended to be creative even though they were not graded on their creativity. I avoided grading on creativity because I thought doing so would have backfired and hindered it.

I think the main weaknesses of the assignment were: (1) the assignment required many hours of class time; (2) the challenges of completing each component often frustrated students; (3) guiding groups to work together effectively was sometimes a challenge; and (4) facilitating the assignment posed significant logistical challenges for the instructor.

It takes time to schedule a health fair, arrange for table setups, arrange for the fair to be promoted on campus, arrange for gifts for exhibitors to offer to visitors, store exhibit posterboards designed by students for them to conveniently pick up right before scheduled exhibit time, etc. I don’t think of the assignment as an efficient way to promote critical thinking, but I think it’s a meaningful way to do so.

It’s important for exhibitors to be conscientious in reducing the risk of harm while promoting the benefits of critical thinking. I don’t think any of our exhibits posed a significant risk of harm to visitors, but I don’t think the risk is zero, especially for exhibits involving deception. I believe that deceptions in exhibits can be justified when they are geared to helping visitors become more wary of and less vulnerable to deception. Good debriefings are essential.

Most visitors to my students’ exhibits involving deception followed by debriefing, such as the palm reading exhibit and the “power band” exhibit (both described in Part 1), seemed to be amused when the deception was revealed to them. Nevertheless, when deceptions and debriefings take place in public, I think there can be a small risk that visitors will find their exhibit encounter humiliating. Some visitors might be rattled by exhibits that challenge their strongly held beliefs. Thus, it’s important for exhibitors to be prepared to be warm, empathetic, and supportive to all visitors, especially as they leave the exhibit.

I believe most of my students learned about the importance of critical thinking through the HSSF assignment. However, the value of the HSSF assignment in promoting critical thinking of the student exhibitors should not be accepted based on my perceptions. I don’t have data to show you to back up my belief. A systematic evaluation of the HSSF assignment is needed.

Evaluation studies using a pretest/posttest design with a comparison group would be appropriate. An evaluator not involved in delivering coursework would be ideal. Relevant data could be collected from students enrolled in at least one section of a course that includes the HSSF assignment and then compared to data collected from students enrolled in at least one section of the same course that does not include the HSSF assignment. I’m not sure what would be appropriate critical thinking outcome measures for such a study.
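As an illustration only, here is a minimal sketch in Python of how gain scores from such a pretest/posttest comparison-group design might be compared, assuming some critical thinking instrument yields a numeric score per student. All names and numbers below are made up; a real study would need a validated measure and an adequate sample.

```python
# Sketch of analyzing a pretest/posttest design with a comparison group.
# Assumes each student has a numeric critical thinking score before and
# after the course; all data below are fabricated for illustration only.
from scipy import stats

# Hypothetical scores for a section with the HSSF assignment...
hssf_pre  = [52, 61, 58, 49, 63, 55, 60, 57]
hssf_post = [60, 66, 65, 55, 70, 59, 68, 61]

# ...and a section of the same course without it.
comp_pre  = [54, 59, 57, 50, 62, 56, 61, 58]
comp_post = [56, 62, 58, 52, 64, 57, 63, 60]

# Gain score per student: posttest minus pretest.
hssf_gains = [post - pre for pre, post in zip(hssf_pre, hssf_post)]
comp_gains = [post - pre for pre, post in zip(comp_pre, comp_post)]

# Independent-samples t-test on the gain scores of the two sections.
t, p = stats.ttest_ind(hssf_gains, comp_gains)
print(f"mean gain (HSSF): {sum(hssf_gains) / len(hssf_gains):.1f}")
print(f"mean gain (comparison): {sum(comp_gains) / len(comp_gains):.1f}")
print(f"t = {t:.2f}, p = {p:.3f}")
```

Because students typically choose their own course sections, such a comparison would suggest, rather than establish, an effect of the assignment.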

Institutional review board approval would be needed before such a study could be conducted. Student participation in such a study would need to be voluntary; participants would be free to withdraw from the study at any time.

A comparison group design might not be feasible in many settings. Nevertheless, it would still be of interest if measurable improvements in critical thinking could be demonstrated from before the assignment is given to afterward. A less rigorous study can still provide useful insights. As epidemiologist Michael Gregg said: “We are always dealing with dirty data. The trick is to do it with a clean mind.”

Some instructors may wish to adapt the assignment for their students based on descriptions I’ve provided. I would not be surprised if they could improve the assignment.

I hope the descriptions I’ve provided will also be of use to skeptic-activists interested in developing exhibits for school or community settings. While they won’t need to give themselves grades, I encourage them to conduct process and outcome evaluations that would guide them on how to improve upon initial exhibit designs.

Since March 2020, I’ve shifted my teaching setting from physical classrooms on campus to 100 percent online teaching from home. While it’s no longer feasible for me to give students the HSSF assignment, I came up with a series of critical thinking assignments for my online Consumer Health course that I’ve found to be easier to facilitate than the HSSF assignment. Perhaps I’ll discuss them in a future column.

William M. London is a professor of public health at Cal State LA and the editor of the free weekly email newsletter Consumer Health Digest published by the Center for Inquiry’s Quackwatch.
