Data Activities for Student Feedback

  • Established Disposition Surveys

    Description

    • Beliefs about and dispositions towards disciplines
    • Understandings of what it means to learn and know content in those disciplines
    • Student self-efficacy in a given discipline or in general

    Questions Answered

    • How students feel/what they believe about the discipline.
      • People are either born with the ability to write novels or not.
      • Remembering dates and names is one of the most important parts of studying history.
    • How students see themselves in relation to the discipline (ability to do the work, motivation to learn, curiosity about a topic, etc.).
      • I can figure out most math problems.
      • Psychology will help me understand others.

    Why Important

    • Students with positive dispositions toward a subject and/or a strong sense of their ability to do well in it persist at higher rates, which can lead to improved performance.

    Evidence/Records You Can Collect

    • Scores on surveys.
    • Scores disaggregated by student characteristics such as gender, major, race, etc.
    • Aggregate scores for groups of related questions, such as “beliefs about own ability” or “beliefs about how people view me” (see the sketch after this list).
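    If survey responses are exported to a spreadsheet, the disaggregated and aggregated views above can be produced with a few lines of scripting. The sketch below uses Python/pandas and assumes a hypothetical export, responses.csv, with one row per student, numeric Likert items, and demographic columns; all file and column names are illustrative, not part of any particular survey instrument.

```python
import pandas as pd

# Hypothetical export: one row per student, Likert items scored 1-5,
# plus demographic columns. File and column names are illustrative.
df = pd.read_csv("responses.csv")

# Aggregate related items (e.g., "beliefs about own ability") into one subscale score.
ability_items = ["q01_ability", "q04_ability", "q07_ability"]
df["ability_score"] = df[ability_items].mean(axis=1)

# Overall average, then the same score disaggregated by a student characteristic.
print(df["ability_score"].mean())
print(df.groupby("major")["ability_score"].agg(["mean", "count"]))
```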

    Data Sharing

    • Synopsis of students’ survey results with implications for student learning.
    • Take the survey along with your students and reflect on your results (e.g., describe how your beliefs might affect your instructional approach). (This would then serve as a SELF data source as well as a STUDENT data source.)
    • Summary and analysis*

    Resources

    Examples

  • Faculty-Made Questionnaires

    Description

    • Allow instructors to collect data specific to their class
    • Can be targeted and as short as a few questions
    • Can be given at any point in the semester
    • Can be used multiple times in a semester to monitor changes

    Questions Answered

    • How engaged/interested are my students during class, such as:
      • Comfort level asking questions in class
      • Interest in/curiosity about a topic
      • Sense of belonging
    • Questions about students' lives and study habits:
      • Reason for pursuing degree/taking course
      • Number of hours working per week
      • Time set aside for coursework per week

    Why Important

    • Students who feel a sense of identity and/or belonging in a class are more likely to engage with the instructor and other students and are less likely to drop out. Knowing students’ expectations about time commitment and/or what learning looks like in a class can help faculty address those expectations.

    Evidence/Records You Can Collect

    • Average scores for individual questions.
    • Average aggregate scores for group(s) of related questions.
    • Collections of free-response answers organized or coded by theme.

    Data Sharing

    • Outline results that you will use to guide changes in your instruction.
    • Interpretation of your ratings, reflecting on your strengths as well as items you feel you can improve.
    • Graphical representation of increases in and/or types of positive responses.
    • Summary and analysis*

    Resources

    Examples

    • An instructor teaching Introductory Biology is curious why students are not participating in class discussions or asking questions.
      • To probe this issue, she develops a short, 2-item survey on Qualtrics
      • The survey consists of one closed-ended item that lists various reasons why she thinks students might not be participating and another item that lets students enter their reason(s) in their own words
      • Based on the responses, she adjusts her instructional design.
      • Several weeks later, she gives the survey again to see whether responses have changed.
  • Student Focus Groups

    Description

    • Convene a small group of students (ideally 5-8)
    • Ask questions specific to instructional design and/or assignment efficacy
    • Ask open questions about class experience

    Questions Answered

    • How do students feel about the progress they have made in the course?
    • What do students like best about the way the course is designed?
    • How would students change the course design and/or assignments to improve their work?
    • What feedback do students have on aspects of the course that are not included in SPOTs?

    Why Important

    • Focus groups that include questions that are directly relevant to a course can yield course and discipline-specific insights that go beyond the scope of standardized evaluation questionnaires (Fife, 2007).

    Evidence/Records You Can Collect

    • Notes/quotes from the focus group session(s) with names and identifying features deleted.
    • Analysis and/or report from the person guiding the focus groups.
    • Video of the focus group (if permission has been granted by all of the participants).
    • Transcript analysis for themes.

    Data Sharing

    • Excerpts from the group conversation and the facilitator’s notes/report.
    • Written synopsis of themes and/or areas for improvement along with plans to adjust instructional design.
    • Summary and analysis*

    Resources

    Examples

    • An example from the University of Illinois at Urbana-Champaign of the type of data a focus group might yield
  • Mid-Semester Feedback (MSF)

    Description

    • Can be administered by CAT or a peer
    • Questions generally focus on what is working and what is not.
    • Can include questions about content, such as “How would you rate your understanding of…?”
    • Feedback from students is anonymous but guided to elicit useful information.
    • The person conducting the MSF can synthesize the results for the faculty member. (This would then serve as a PEER data source as well.)

    Questions Answered

    • How do students perceive various aspects of the course and instruction, such as:
      • Comfort with the pace or workload
      • Utility of course resources (e.g., syllabi, Canvas module)
      • Their progress in mastering course content
      • Clarity when introducing new concepts
      • Relating new material to existing knowledge and real-world issues

    Why Important

    • Instructors can use mid-semester feedback to make meaningful adjustments to course content and/or instructional design that same semester. This can also increase students’ sense of belonging to a learning community.

    Evidence/Records You Can Collect

    • Written synopsis of themes and/or areas for change.
    • Brief descriptions of student conceptions of key curricular ideas.
    • Synopsis of student perceptions of what is working to help them learn and what is not.

    Data Sharing

    • Outline findings of student difficulties and explain what changes were/will be made and why they will facilitate understanding.
    • Discuss student suggestions for improving the class and address whether and how they will be implemented.
    • Discuss suggestions that will not be adopted, along with a plan for improving student understanding of the rationale behind your instructional practices or design.
    • Summary and analysis*

    Resources

    Examples

  • Qualitative SPOTs Questions

    Description

    • Student comments
    • Let students know what useful feedback looks like (to get more and better responses)
    • Can be used to identify effective practices in the class.
    • Note: Sharing qualitative SPOTs responses with a peer may help faculty look at the overall picture – rather than focusing on the more extreme comments. (This would then serve as a PEER data source as well.)

    Questions Answered

    • How do students feel about the class in their own words?
    • Are there common elements, within a class or across classes, that show up in student comments?

    Why Important

    • Comments can show whether students are able to follow the flow of the class, whether they find quizzes/drafts helpful for larger assignments, and whether they can keep up with class notes.

    Evidence/Records You Can Collect

    • Themes that surface through coding of student responses. (Although coding is a research tool, it can be applied less formally to give faculty useful information.)
    • Excerpts of comments along the same theme (even across sections/courses).
    • Counts of responses that fall into categories that you find valuable. For example, bin comments on homework as “too much”, “just right”, “too hard”, “essential to passing the class”, etc. (see the sketch after this list).
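    One informal way to produce such counts is to tag each comment with a category while reading, then tally the tags. The sketch below assumes the comments have already been hand-coded into (comment, category) pairs; the comments and categories shown are illustrative only.

```python
from collections import Counter

# Hand-coded (comment, category) pairs; all entries are illustrative.
coded_comments = [
    ("Weekly problem sets took six or more hours", "too much"),
    ("Homework matched what was on the exams", "just right"),
    ("Could not have passed without the homework", "essential to passing the class"),
    ("Homework lined up well with the lectures", "just right"),
]

# Tally how many comments fall into each category.
counts = Counter(category for _, category in coded_comments)
for category, n in counts.most_common():
    print(f"{category}: {n}")
```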

    Data Sharing

    • Share excerpts or themes along with plans to expand on aspects of your teaching that are effective, and ideas to explore areas that can be improved.
    • Explore connections to student responses to the Likert-scale questions.
    • Summary and analysis*

    Resources

    Examples

  • Quantitative SPOTs Questions

    Description

    • Increase response rates by telling students about changes you have made based on student input.
    • Take the SPOTs yourself (prior to seeing your students’ responses). Reconcile your responses with those of your students. (This would then serve as a SELF data source as well.)

    Questions Answered

    • Did changes to the instructional design of the class work?
    • How does the new curriculum (course materials, textbook, etc.) affect student perceptions of the class?
    • Are there discrepancies between students’ perceptions of the class and my own?

    Why Important

    • Knowing how students perceive their class experience can guide faculty to strengths they want to emphasize.

    Evidence/Records You Can Collect

    • SPOTs results by individual question or groups of connected questions.
    • Changes in response rates.
    • Changes in responses to a particular question, or averages for a category of questions.

    Data Sharing

    • Annotated SPOTs results with thoughts about the information they provide. Instead of defending against critical ratings, try to explain why students may have responded that way.
    • Share some of the responses (or excerpts) on the item(s) that you focused on, as well as changes that you have made or plan to make.
    • Summary and analysis*

    Resources

    Examples

  • Self-Regulated Learning (SRL) Scales

    Description

    • SRL is an individual’s influence, orientation, and control over his/her own learning behaviors.
    • It has been correlated with academic success.
    • Helping students refine their SRL skills can lead to considerable achievement gains (Hudesman et al., 2013).

    Questions Answered

    • Are students confident in the knowledge that they gained in this class?
    • Do my students know when and/or how to acquire knowledge that they need to be successful in this class?

    Why Important

    • Identifying discrepancies between students’ perceptions of their learning and their actual learning, as measured by exams, papers, etc., can help instructors incorporate learning skills or clarify expectations.

    Evidence/Records You Can Collect

    • Results of the self-regulation survey/scale.
    • Patterns in the results or especially surprising outcomes.

    Data Sharing

    • Synthesis of the results.
    • Salient changes from one administration to the next.
    • Changes you plan to make in light of the findings to enhance students’ self-regulation in future terms.
    • Summary and analysis*

    Resources

    Examples

  • End-of-semester or Capstone Assignments

    Description

    • Culminating assessments such as comprehensive exams, final papers, research projects, and performances
    • In order to make decisions based on culminating assessments, faculty can confirm the efficacy of the assignment design by reconciling results of the assessment with student focus groups and/or peer feedback. (This would then serve as a PEER or another STUDENT data source as well.)

    Questions Answered

    • Have my students met the learning outcomes that I have determined are most important for this course?
    • Do the instructional design and/or course materials do what I want them to?
    • How has an instructional/curricular change impacted student success in the course?

    Why Important

    • Students’ performance on assessments can be used as an evidence-based guide for modification to instructional/materials design, clarification of expectations, or even scope and sequencing of course content for subsequent courses.

    Evidence/Records You Can Collect

    • Distribution of scores on a specific question or part of a rubric along with notes on student work.
    • Class average by semester on question(s) measuring a course learning goal (see the sketch after this list).
    • Annotated samples of exemplary student work.
    • Rubrics used to assess student work.
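    If rubric scores are kept in a spreadsheet, the by-semester averages and score distributions above take only a few lines to tabulate. The sketch below assumes a hypothetical file rubric_scores.csv with one row per student per semester and a column holding the score on the rubric criterion tied to the learning goal; the file, column, and semester names are illustrative.

```python
import pandas as pd

# Hypothetical export: one row per student per semester, with the score on the
# rubric criterion that measures the course learning goal. Names are illustrative.
scores = pd.read_csv("rubric_scores.csv")  # columns: semester, student, criterion_score

# Class average on the criterion, by semester.
print(scores.groupby("semester")["criterion_score"].mean())

# Distribution of scores for a single semester.
fall = scores[scores["semester"] == "2024FA"]
print(fall["criterion_score"].value_counts().sort_index())
```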

    Data Sharing

    • Summary of student attainment/progress toward the identified key learning goal(s).
    • Graphical representation of student scores for a given skill/topic over several assessments.
    • Excerpts from student work and associated rubric.
    • Summary and analysis*

    Resources

    Examples

    • Final Exam
    • Capstone Paper
    • End-of-semester performance
    • Project
    • Presentation
  • Pre- and Post-Test Assessments

    Description

    • At start and end of semester for major course learning outcomes.
    • Before and after a module to check student progress and/or identify content that needs more work.
    • Ask pre- and post-questions about how students think about gaining knowledge in the discipline, e.g., “What does it mean to think like a _______?”
    • Use concept inventories to measure whether students have an operational understanding of foundational concepts. These are standardized, so you can compare student groups; STEM disciplines are furthest along in developing widely used concept inventories.

    Questions Answered

    • What are my students' learning gains throughout the semester?
    • Have my students become better learners in the discipline?
    • How do my students’ learning gains compare to other student populations?

    Why Important

    • Students' performance on pre-tests can be used to gauge their incoming level of knowledge or competency and to plan instruction. Students’ learning gains can also be used to gauge how instruction impacts students with different starting points.

    Evidence/Records You Can Collect

    • Distribution and averages of class pre- and post-test scores.
    • Gains from pre- to post-assessment (see the sketch after this list).
    • Scores isolated for questions intended to measure important learning goals or concepts.
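    One commonly reported summary for pre/post comparisons is the normalized gain, (post - pre) / (max - pre), i.e., the fraction of the possible improvement that students actually achieved; whether it suits your instrument is a judgment call, so treat the sketch below, with its made-up scores, as an illustration rather than a required analysis.

```python
import pandas as pd

# Hypothetical matched pre/post scores (percent correct) for four students.
df = pd.DataFrame({
    "student": ["A", "B", "C", "D"],
    "pre":  [40, 55, 30, 70],
    "post": [70, 80, 55, 85],
})

df["raw_gain"] = df["post"] - df["pre"]
# Normalized gain: share of the available headroom (100 - pre) that was gained.
df["norm_gain"] = (df["post"] - df["pre"]) / (100 - df["pre"])

print(df[["raw_gain", "norm_gain"]].mean())
```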

    Data Sharing

    • Comparison of gains by semester after adjustments to course design and/or instructional methods.
    • Present results at a workshop or conference.
    • Summarize effectiveness of course design and instruction using student learning gains.
    • Summary and analysis*

    Resources

    Examples

  • Classroom Assessment Techniques (CATs)

    Description

    • Brief, graded or ungraded, in-class or online activities that give faculty and students real-time feedback on the teaching-learning process.
    • Anonymous CATs keep the focus on learning the content.

    Questions Answered

    • Are my students understanding the material being presented during class?
    • Is this new instructional practice or activity effective?
    • Did my students understand a particularly nuanced or commonly misunderstood concept?

    Why Important

    • Formative assessments like CATs serve as a check-in to help faculty and students find out whether, and how much, progress is being made toward learning goals.

    Evidence/Records You Can Collect

    • Screenshots of results from iClicker questions.
    • Notes on what the CAT results tell you about your students’ learning, and what changes you decided to make, if any.

    Data Sharing

    • Synthesis of the results and your response.
    • Summary and analysis*

    Resources

    Examples