Category: Workshop

  • WSST Workshop resource: Thematic Analysis

    Instructions

    1. Open your preferred AI tool
2. Copy the prompt, the provided answer key, and the student responses file
    3. Run the prompt, and read the output

    Try it!

1. Use the same prompt, but with your assessment’s answer key and student response file
2. Try: adjust the prompt and re-run it. Do the insights get better?
3. Try: ask follow-up questions. Did you get more relevant insights?
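If you would rather assemble the prompt in a script than paste pieces by hand, the copy-and-combine step above can be sketched in plain Python. The template wording and sample inputs below are illustrative, not the workshop's exact prompt:

```python
# Hypothetical inputs -- substitute your own answer key and student
# responses (the content here is illustrative, not the workshop files).
ANSWER_KEY = "Cells go through a cell cycle. Checkpoints work when there is a functioning p53."
STUDENT_RESPONSES = "Student 1: Cancer cells form when p53 is not working in the cell cycle."

PROMPT_TEMPLATE = """You are a teacher analyzing student responses for common themes.

## Answer Key
{answer_key}

## Student Responses
{responses}

List the most common themes across the responses, with one short quote as evidence for each."""

def build_prompt(answer_key: str, responses: str) -> str:
    """Assemble the thematic-analysis prompt from its two inputs."""
    return PROMPT_TEMPLATE.format(answer_key=answer_key, responses=responses)

prompt = build_prompt(ANSWER_KEY, STUDENT_RESPONSES)
```

Swapping in a different assessment then only means changing the two input strings, not the prompt itself.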

    Resources

    Inputs for generating personalized feedback on student writing.

    Resource drive link

    Student Instruction: 
    Q1. How do non-cancer cells become cancer cells? Your explanation
    
    Answer key:
    Look-for:
    Cells go through a cell cycle.
    A lot happens during the cell cycle: cells grow, chromosomes double, and the cell divides during mitosis.
    When a non-cancer cell divides into non-cancer cells, the new cells are identical: their chromosomes are the same.
    When a non-cancer cell divides into a cancer cell, something is different about the chromosomes.
    The cell cycle has checkpoints.
    When a cell skips a checkpoint, it becomes cancerous.
    Checkpoints work when there is a functioning/working p53.
    Cells skip checkpoints when there is a non-functioning/non-working p53.
    
    Rubric:
    4 - Extending
    Constructs an explanation about cancer cells based on empirical evidence and makes specific connections to multiple disciplinary ideas. Cites specific evidence from multiple class activities.
    
    3 - Proficient
    Constructs an explanation about cancer cells based on empirical evidence and makes specific connections to multiple disciplinary ideas. Cites specific evidence from class. (Don’t use words like more or a lot.)
    
    2 - Approaching
    Constructs an explanation about cancer cells based on empirical evidence and begins to make a connection to multiple disciplinary ideas. References evidence from class (without citing specific data)
    
    1 - Beginning
    Constructs an explanation (including interactions) to describe cancer cells with minimal reference to empirical evidence.
    
    
    Example:
    Cancer cells form when p53 is not working in the cell cycle. Normally, cells grow, make new chromosomes, and divide, and p53 checks to make sure it has everything it needs before it divides. If it is not working correctly, the cells that get made do not have everything they need but get made anyway. They are different because they have different chromosomes.
    

    Workshop Takeaways

    Generated themes may differ with each run even when using the same data


    Run the prompts a few times and pick the themes that commonly come up. Check against your instincts on student work!

What are the common themes that I can cross-validate quickly to inform my instruction?
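The cross-validation above can be done by eye, or sketched as a quick tally: keep only the themes that appear in a majority of runs. The theme lists below are made-up examples, not real workshop output:

```python
from collections import Counter

# Themes returned by three separate runs of the same prompt
# (made-up examples, not real workshop output).
runs = [
    ["skips checkpoints", "p53 not working", "identical chromosomes"],
    ["p53 not working", "skips checkpoints", "cell cycle stages"],
    ["p53 not working", "identical chromosomes", "skips checkpoints"],
]

# Count each theme at most once per run, then keep themes
# that show up in a majority of runs.
counts = Counter(theme for run in runs for theme in set(run))
majority = len(runs) // 2 + 1
common = sorted(t for t, n in counts.items() if n >= majority)
```

Here `common` keeps the recurring themes and drops the one-off ("cell cycle stages"), which is the same filtering you would do by instinct when comparing runs.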

    Different models (e.g. Gemini vs ChatGPT) give different output formats


This might even happen with the same model! Providing a specific output format can help improve the consistency of the insights.

Which output formats did I find most effective in informing my instruction?
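One way to get a consistent format is to ask the model for a fixed line pattern and then parse it. A minimal sketch, assuming a hypothetical "Theme: ... | Evidence: ..." format that you would specify in your prompt:

```python
def parse_themes(output: str) -> list[dict]:
    """Parse lines of the form 'Theme: ... | Evidence: ...' into dicts."""
    themes = []
    for line in output.splitlines():
        if line.startswith("Theme:") and "|" in line:
            theme_part, evidence_part = line.split("|", 1)
            themes.append({
                "theme": theme_part[len("Theme:"):].strip(),
                "evidence": evidence_part.replace("Evidence:", "", 1).strip(),
            })
    return themes

# Sample AI output in the requested format (illustrative).
sample = (
    'Theme: p53 not working | Evidence: "p53 is not working in the cell cycle"\n'
    'Theme: skipped checkpoints | Evidence: "the cell skips a checkpoint"'
)
parsed = parse_themes(sample)
```

Because the format is pinned down in the prompt, output from different runs (or different models) becomes directly comparable.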

    Insights can be presented without showing student work


Prompting the AI to show student work exemplars and reviewing the work yourself can ground your perspective and help you identify which insights are most useful.

    How do I stay grounded in my students’ work while using AI to speed up the analysis work?

    Their instructional coach Rachel shares what changed when teachers could finally see past the writing to what students actually understood.

    Meet the Presidential Award-winning science teacher who built Eddo and why 20 years in the classroom led him here.

  • WSST Workshop resource: Personalized Feedback

    Instructions

    1. Open your preferred AI tool
    2. Copy the prompt and the provided answer key, and copy the student response of your choice
    3. Run the prompt, and read the output

    Try it!

    1. Use the same prompt, but use your assessment’s answer key and your student’s response
2. Try: adjust the prompt and re-run it. What did you see?

    Resources

    Inputs for generating personalized feedback on student writing.

    Resource drive link

    Student Instruction: 
    
    Question 1a. Do all solid objects deform during a collision? Claim
    
    Question 1b: Evidence
    
    Question 1c: Reasoning
    
    Answer key:
    
    Question 1a: Claim
    
    +The claim directly answers the scientific question.
    
    +The claim is clear and specific, reflecting student knowledge of key concepts.
    
    Question 1b: Evidence
    
    +Provides relevant evidence.
    
    +Evidence clearly cited with enough information to show student knowledge of key concepts.
    
    Examples of evidence possible:
    
    1 - A moving car collided with a stationary car, causing visible damage.
    
    2 - A moving golf club hit a golf ball, and the ball squished on impact.
    
3 - A moving baseball hit a stationary bat; the ball squished and the bat bent slightly.

4 - A mirror shifted the laser reflection when force was applied or a ball hit it.

5 - A cement beam bent when pressure was applied by a plunger or machine.
    
    (Question 1a and Question 1b total: 8 points)
    
    8 points: All three are included and thorough.
    
    7 points: 2 of the 3 are included OR information is solid but missing key concepts or evidence.
    
    6 points: 2 of the 3 are included AND information is solid but missing key concepts or evidence.
    
    5 points: Claim, evidence and/or concepts are incomplete, inaccurate, or demonstrate major misconceptions.
    
    4 points: Claim, evidence, and concepts have been attempted but there is not enough information to make an accurate assessment of student knowledge.
    
    0 points: Nothing in this section has been completed.
    
    Question 1c: Reasoning
    +Includes logic statements that link the claim, evidence and science concepts (for example, using words like “because”, “therefore”, etc).
    +Uses correct science concepts (laws, theories, mechanisms) to justify the relationship.
    +Clearly explains the cause-and-effect link between claim and evidence.
    
    (Question 1c total: 8 points)
    
8 points: Includes logic statements that link the claim, evidence and science concepts (including words such as "because...", "therefore...") that clearly demonstrates logical reasoning.

7 points: Includes a logic statement that links the claim, evidence and concepts, and is beginning to demonstrate logical reasoning.

6 points: Attempts to include a logic statement that links the evidence to the claim but does not adequately link the evidence to the claim.

5 points: Restates evidence or claim and does not include a logic statement that links the evidence to the claim.

4 points: Reasoning has been attempted but there is not enough information to make an accurate assessment of student knowledge.

0 points: Reasoning has not been attempted.

    Question 1a. Do all solid objects deform during a collision? Claim:
    All solid objects deform during collision

    Question 1b. Evidence:
    Baseball & bat, golf club & ball, moving cars

    Question 1c. Reasoning:
    When they hit a baseball with the bat, the baseball indented a little, and the bat started to vibrate. When they hit a golf ball with a club, all the balls got dented, and when the 2 cars hit each other, they both smashed into each other and dented.

    Workshop Takeaways

    AI can generate a lot of feedback but requires curating for effective student learning


    AI does not know the students as well as the teachers do! Choosing and editing the feedback makes it more relevant to the students.

    What are some ways to curate the feedback so it’s meaningful to the students?

Feedback with guiding questions, quoted student work, and adjusted reading levels is motivating


Students who received the personalized feedback found it useful, especially for knowing what to keep and what to fix.

    How would you incorporate AI in your workflow to generate personalized feedback?

With more structured prompts, AI can help with a wide range of tasks


Our hands-on showed AI evaluating text, handwriting, and drawings. Some teachers tested with additional task context and saw more personalized feedback.

    How would you structure your prompt so the AI output is useful to you and your students?


  • Kick-Starting Formative Feedback with AI: CERs and Formative Assessment for OpenSciEd

Last week, we hosted a small Coaching Science Lab with educators, instructional coaches, and district leaders to explore a simple, open-ended question:

    What becomes possible when we use everyday AI tools to look more closely at student thinking—without changing who’s in charge of instruction?


    Experiences from the classroom

    Val shared how she used student drawings and explanations from the OpenSciEd Grade 8 Sound Waves unit to better understand student thinking, generate feedback more efficiently, and decide what to teach next.

    Val’s classroom workflow:

• student work as the starting point
• feedback generated as strengths plus questions to push the learner forward
• AI support that let her reference student work directly in her feedback, far faster than would have been possible otherwise

    Students loved specific, concrete feedback.


    Using everyday AI tools on purpose

    For the hands-on, we practiced student feedback using off-the-shelf AI tools like ChatGPT, Claude, and Gemini to show how you can get started without any special tools.

    Participants could see exactly:

    • how to prompt the AI with student instructions, reviewer instructions, and student work
    • what the AI produced
    • how to chat with the AI to go deeper

This demystified the AI and showed how it can support genuinely helpful analysis of real student work.

    The real leverage wasn’t the tool.
    It was the thinking around:

    • what counts as evidence in student work
    • what feedback actually helps students improve
    • what patterns matter for coaching and planning

    AI simply helped us move through that thinking faster.


    Teacher judgment stays at the center

    A strong theme throughout the session was role clarity.

    Participants resonated with an approach where:

    • teachers review and revise all feedback
    • tone and instructional intent remain human
    • AI suggestions are visible, editable, and contextual
    • patterns support coaching conversations rather than replace them

    The framing that stuck was simple:

Teachers hope to engage students in deeper, more frequent feedback.
    AI helps teachers do that work more often, with less friction.


    An invitation, not a conclusion

    Participants in the Coaching Science Lab session showed creativity and curiosity, encouraging us to create more opportunities for shared practice around real student work.

    We’re excited to continue learning and doing together.


    Want to explore this yourself?

    If you’re curious to try this approach in your own context, we’ve shared two lightweight entry points:

This is a great place to start exploring.


    We’re continuing to host small Coaching Science Labs as spaces to test ideas, learn from real classrooms, and figure out what responsible, useful AI support can look like in practice.

    If you’re interested in joining a future session—or just trying this on your own—we’d love to learn alongside you. Browse or subscribe to our calendar to join us for upcoming events.

  • CER Feedback AI Prompt Example

    Copy this prompt into an AI tool of your choice (chatgpt.com, claude.ai, gemini.google.com)

    ## Reviewer Instructions
    [Role and responsibilities]
    You are a teacher evaluating a CER (Claim, Evidence, Reasoning) assessment. Read the student response below and provide feedback. Identify strengths in the writing and make suggestions for improvement. Make specific connections between student writing and the task criteria. 
    
    [Context]
This assessment is made up of 3 parts in response to a scientific question. The 3 parts are: Claim, Evidence, Reasoning. The claim has a correct answer and typically reflects the scientific concept. The evidence is data that supports the claim.
Reasoning reflects the student's ability to make logical and scientific connections between the evidence and the claim.
    
    [Instructions]
    1. Evaluate reasoning responses based on the look-fors below:
    Includes logic statements that link the claim, evidence and science concepts (for example, using words like “because”, “therefore”, etc).
    Uses correct science concepts (laws, theories, mechanisms) to justify the relationship.
    Clearly explains the cause-and-effect link between claim and evidence.
    
    2. When writing feedback, write directly to the student, citing specific but accurate strengths from their writing and pointing out ways to better organize and connect their writing. (i.e. "You provided strong evidence for... by stating that...") Use only verbatim examples from the student's writing. DO NOT MAKE ANYTHING UP. 
    
    3. Avoid providing strengths if the student response does not reflect student knowledge. 
    
    4. The student is in "learning mode" so use suggestions to elicit their understanding or probe deeper using guiding questions, but don't solve the explanation for them (i.e. Socratic style). Use clear and concise language and an encouraging, supportive tone. 
    
    5. Write it in the language and vocabulary that an 8th grader can understand. Avoid using scientific jargon. 
    
6. Select up to 3 strengths and 3 suggestions that are the most important, providing more feedback around reasoning.
    
    The questions are: 
    Question 1a. All solid objects do bend or change shape when pushed in a collision. 
    Question 1b. Evidence
    Question 1c. Reasoning
    
The student’s response is:
    Question 1a. All solid objects do bend or change shape when pushed in a collision. Question 1b. Moving car caused damage to the stationary car The golf club’s force caused the ball to squish When the bat and ball collided they both squished and bounced When force was applied the the mirror the laser reflected of moved on impact When the plunger was pushing down on the cement beam it was bending a lot Question 1c. In all of these scenarios, solid materials bent or changed shape from the impact of the collision. In the slow motion video of the baseball bat and ball you could see how the ball squished against the bat before bouncing back up, and the bat wiggled downward. Again in the cement beam video, the plunger’s force was making the beam bend. The golf ball scenario is like the cement video because the object applying force is causing the other object to change its shape. Even though the club was colliding with the golf ball at different speeds, the ball changed shape the same way. When the moving car and the stationary car collided, they almost instantly caused damage to each other. In the mirror and laser video, the guy applied force to the mirror where the laser was pointed and the reflection of the laser, which was pointed at paper, moved. That means that the mirror bent when the force was applied. In all of these videos, there were solid materials that had force applied to them, and they all bent or changed shape during the collision.  
     
    
The prompt is separated into 3 parts: context about the task, student instructions, and student response.
Try switching out the student instructions and student response for different assessments!
    
    1. Context about the task
    [Role and responsibilities]
    You are a teacher evaluating a CER (Claim, Evidence, Reasoning) assessment. Read the student response below and provide feedback. Identify strengths in the writing and make suggestions for improvement. Make specific connections between student writing and the task criteria. 
    
    [Context]
This assessment is made up of 3 parts in response to a scientific question. The 3 parts are: Claim, Evidence, Reasoning. The claim has a correct answer and typically reflects the scientific concept. The evidence is data that supports the claim.
Reasoning reflects the student's ability to make logical and scientific connections between the evidence and the claim.
    
    [Instructions]
    1. Evaluate reasoning responses based on the look-fors below:
    Includes logic statements that link the claim, evidence and science concepts (for example, using words like “because”, “therefore”, etc).
    Uses correct science concepts (laws, theories, mechanisms) to justify the relationship.
    Clearly explains the cause-and-effect link between claim and evidence.
    
    2. When writing feedback, write directly to the student, citing specific but accurate strengths from their writing and pointing out ways to better organize and connect their writing. (i.e. "You provided strong evidence for... by stating that...") Use only verbatim examples from the student's writing. DO NOT MAKE ANYTHING UP. 
    
    3. Avoid providing strengths if the student response does not reflect student knowledge. 
    
    4. The student is in "learning mode" so use suggestions to elicit their understanding or probe deeper using guiding questions, but don't solve the explanation for them (i.e. Socratic style). Use clear and concise language and an encouraging, supportive tone. 
    
    5. Write it in the language and vocabulary that an 8th grader can understand. Avoid using scientific jargon. 
    
6. Select up to 3 strengths and 3 suggestions that are the most important, providing more feedback around reasoning.
    
    
    ## Student Instructions
    Question 1a. All solid objects do bend or change shape when pushed in a collision. 
    Question 1b. Evidence
    Question 1c. Reasoning
    
    ## Student Response
    Question 1a. All solid objects do bend or change shape when pushed in a collision.
    
    Question 1b.
    Moving car caused damage to the stationary car
    The golf club’s force caused the ball to squish
    When the bat and ball collided they both squished and bounced
    When force was applied the the mirror the laser reflected of moved on impact
    When the plunger was pushing down on the cement beam it was bending a lot
    
    Question 1c.
    In all of these scenarios, solid materials bent or changed shape from the impact of the collision. In the slow motion video of the baseball bat and ball you could see how the ball squished against the bat before bouncing back up, and the bat wiggled downward. Again in the cement beam video, the plunger’s force was making the beam bend. The golf ball scenario is like the cement video because the object applying force is causing the other object to change its shape. Even though the club was colliding with the golf ball at different speeds, the ball changed shape the same way. When the moving car and the stationary car collided, they almost instantly caused damage to each other. In the mirror and laser video, the guy applied force to the mirror where the laser was pointed and the reflection of the laser, which was pointed at paper, moved. That means that the mirror bent when the force was applied. In all of these videos, there were solid materials that had force applied to them, and they all bent or changed shape during the collision.
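The three-part structure above can be sketched as a fill-in template, so that only the student instructions and student response change between assessments. The variable and function names here are illustrative:

```python
# Template mirroring the three-part prompt structure: reviewer
# instructions stay fixed; the other two parts are swapped per assessment.
CER_PROMPT = """## Reviewer Instructions
{reviewer_instructions}

## Student Instructions
{student_instructions}

## Student Response
{student_response}"""

def build_cer_prompt(reviewer_instructions: str,
                     student_instructions: str,
                     student_response: str) -> str:
    """Fill the three sections of the CER feedback prompt."""
    return CER_PROMPT.format(
        reviewer_instructions=reviewer_instructions,
        student_instructions=student_instructions,
        student_response=student_response,
    )

prompt = build_cer_prompt(
    "You are a teacher evaluating a CER (Claim, Evidence, Reasoning) assessment...",
    "Question 1a. Do all solid objects deform during a collision? Claim",
    "All solid objects deform during collision",
)
```

Keeping the reviewer instructions in one place makes it easy to reuse the same evaluation criteria while swapping in each new assessment and response.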
    
    
    
    
  • Workshop: Kick-Starting Formative Feedback with AI: CERs and Formative Assessment for OpenSciEd

    RSVP link: https://luma.com/mtzs5lo4

    Date: Dec 3 at 12:30 – 1:30pm CT.

    Duration: 60-minute session

    Speaker profile: Valerie Pumala — Cameron Middle School (Cameron, WI); National Board Certified Teacher; 8th Grade Science Teacher; OpenSciEd Certified Facilitator

    Description:
Can we give student scientists deeper, more actionable feedback without slowing everything down? Middle school science teacher Valerie Pumala will show how she uses AI to generate strengths plus guiding questions grounded in student writing, fueling quick revision cycles and improving personalization for students.

    * See before/after examples from a real assessment.

    * Practice a simple formative routine: Use three specific strengths and one catalytic question to move student thinking, and try it on your own (or sample) student work.

    * Take-home: a ready-to-use 3S+Q prompt set and a short coaching checklist.