Rubric Best Practices, Examples, and Templates
A rubric is a scoring tool that identifies the different criteria relevant to an assignment, assessment, or learning outcome and states the possible levels of achievement in a specific, clear, and objective way. Use rubrics to assess project-based student work including essays, group projects, creative endeavors, and oral presentations.
Rubrics can help instructors communicate expectations to students and assess student work fairly, consistently and efficiently. Rubrics can provide students with informative feedback on their strengths and weaknesses so that they can reflect on their performance and work on areas that need improvement.
How to Get Started
Best practices and Moodle how-to guides:
- Workshop Recording (Fall 2022)
- Workshop Registration
Step 1: Analyze the assignment
The first step in the rubric creation process is to analyze the assignment or assessment for which you are creating a rubric. To do this, consider the following questions:
- What is the purpose of the assignment and your feedback? What do you want students to demonstrate through the completion of this assignment (i.e. what are the learning objectives measured by it)? Is it a summative assessment, or will students use the feedback to create an improved product?
- Does the assignment break down into smaller tasks? Are these tasks equally important, or are some more important than others?
- What would an “excellent” assignment look like? An “acceptable” assignment? One that still needs major work?
- How detailed do you want the feedback you give students to be? Do you want/need to give them a grade?
Step 2: Decide what kind of rubric you will use
Types of rubrics: holistic, analytic/descriptive, single-point
Holistic Rubric. A holistic rubric considers all the criteria (such as clarity, organization, mechanics, etc.) together in a single evaluation. With a holistic rubric, the rater or grader assigns a single score based on an overall judgment of the student’s work, using descriptions of each performance level to assign the score.
Advantages of holistic rubrics:
- Can place an emphasis on what learners can demonstrate rather than what they cannot
- Save grader time by minimizing the number of evaluations to be made for each student
- Can be used consistently across raters, provided they have all been trained
Disadvantages of holistic rubrics:
- Provide less specific feedback than analytic/descriptive rubrics
- Can be difficult to choose a score when a student’s work is at varying levels across the criteria
- Any weighting of criteria cannot be indicated in the rubric
Analytic/Descriptive Rubric. An analytic or descriptive rubric often takes the form of a table with the criteria listed in the left column and with levels of performance listed across the top row. Each cell contains a description of what the specified criterion looks like at a given level of performance. Each of the criteria is scored individually.
Advantages of analytic rubrics:
- Provide detailed feedback on areas of strength or weakness
- Each criterion can be weighted to reflect its relative importance
Disadvantages of analytic rubrics:
- More time-consuming to create and use than a holistic rubric
- May not be used consistently across raters unless the cells are well defined
- May result in giving less personalized feedback
Single-Point Rubric. A single-point rubric breaks down the components of an assignment into different criteria, but instead of describing multiple levels of performance, only the “proficient” level is described. Feedback space is provided for instructors to give individualized comments to help students improve and/or show where they excelled beyond the proficiency descriptors.
Advantages of single-point rubrics:
- Easier to create than an analytic/descriptive rubric
- Perhaps more likely that students will read the descriptors
- Areas of concern and excellence are open-ended
- May remove a focus on the grade/points
- May increase student creativity in project-based assignments
Disadvantage of single-point rubrics: Requires more work for instructors writing feedback
Step 3 (Optional): Look for templates and examples.
You might Google “rubric for persuasive essay at the college level” and see if there are any publicly available examples to start from. Ask your colleagues if they have used a rubric for a similar assignment. Some examples are also available at the end of this article. These rubrics can be a great starting point, but work through steps 4, 5, and 6 below to ensure that the rubric matches your assignment description, learning objectives, and expectations.
Step 4: Define the assignment criteria
Make a list of the knowledge and skills you are measuring with the assignment/assessment. Refer to your stated learning objectives, the assignment instructions, past examples of student work, etc. for help.
Helpful strategies for defining grading criteria:
- Collaborate with co-instructors, teaching assistants, and other colleagues
- Brainstorm and discuss with students
- Test each criterion: Can it be observed and measured? Is it important and essential? Is it distinct from the other criteria? Is it phrased in precise, unambiguous language?
- Revise the criteria as needed
- Consider whether some criteria are more important than others, and how you will weight them
Step 5: Design the rating scale
Most rating scales include between 3 and 5 levels. Consider the following questions when designing your rating scale:
- Given what students are able to demonstrate in this assignment/assessment, what are the possible levels of achievement?
- How many levels would you like to include? (More levels mean more detailed descriptions.)
- Will you use numbers and/or descriptive labels for each level of performance? (for example 5, 4, 3, 2, 1 and/or Exceeds expectations, Accomplished, Proficient, Developing, Beginning, etc.)
- Don’t use too many columns, and recognize that some criteria can have more columns than others. The rubric needs to be comprehensible and organized. Pick the right number of columns so that the criteria flow logically and naturally across levels.
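For instructors who tally rubric scores in a spreadsheet or script, the scale and weighting decisions above amount to a small calculation. The sketch below is a hypothetical illustration (the criteria names, weights, and four-level scale are invented for the example and are not tied to any LMS):

```python
# Hypothetical analytic rubric: each criterion carries a weight (summing to 1.0)
# and a score on a 4-level scale (4 = exceeds expectations ... 1 = beginning).
rubric = {
    "clarity":      {"weight": 0.40, "score": 3},
    "organization": {"weight": 0.35, "score": 4},
    "mechanics":    {"weight": 0.25, "score": 2},
}

def weighted_score(rubric, max_level=4):
    """Return the weighted rubric total as a percentage of the top level."""
    total = sum(c["weight"] * c["score"] for c in rubric.values())
    return round(100 * total / max_level, 1)

print(weighted_score(rubric))  # 0.40*3 + 0.35*4 + 0.25*2 = 3.1 out of 4, i.e. 77.5
```

Changing a criterion’s weight here is the numerical counterpart of deciding, in step 4, that some criteria matter more than others.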
Step 6: Write descriptions for each level of the rating scale
Artificial intelligence tools like ChatGPT can be useful for creating a rubric. You will want to engineer the prompt that you provide to the AI assistant to ensure you get what you want. For example, you might include the assignment description, the criteria you feel are important, and the number of levels of performance you want in your prompt. Use the results as a starting point, and adjust the descriptions as needed.
Building a rubric from scratch
For a single-point rubric, describe what would be considered “proficient,” i.e., B-level work, and provide that description. You might also include suggestions for students outside of the actual rubric about how they might surpass proficient-level work.
For analytic and holistic rubrics, create statements of expected performance at each level of the rubric.
- Consider what descriptor is appropriate for each criterion, e.g., presence vs. absence, complete vs. incomplete, many vs. none, major vs. minor, consistent vs. inconsistent, always vs. never. If an indicator is described in one level, it will need to be described in each level.
- You might start with the top/exemplary level. What does it look like when a student has achieved excellence for each/every criterion? Then, look at the “bottom” level. What does it look like when a student has not achieved the learning goals in any way? Then, complete the in-between levels.
- For an analytic rubric, do this for each criterion of the rubric so that every cell in the table is filled. These descriptions help students understand your expectations and their performance relative to those expectations.
Well-written descriptions:
- Describe observable and measurable behavior
- Use parallel language across the scale
- Indicate the degree to which the standards are met
Step 7: Create your rubric
Create your rubric in a table or spreadsheet in Word, Google Docs, Sheets, etc., and then transfer it by typing it into Moodle. You can also use online tools to create the rubric, but you will still have to type the criteria, indicators, levels, etc., into Moodle. Rubric creators: Rubistar, iRubric
Step 8: Pilot-test your rubric
Prior to implementing your rubric on a live course, obtain feedback from:
- Teaching assistants
Try out your new rubric on a sample of student work. After you pilot-test your rubric, analyze the results to consider its effectiveness and revise accordingly.
Tips for effective rubrics
- Limit the rubric to a single page for reading and grading ease
- Use parallel language. Use similar language and syntax/wording from column to column. Make sure that the rubric can be easily read from left to right or vice versa.
- Use student-friendly language. Make sure the language is learning-level appropriate. If you use academic language or concepts, you will need to teach those concepts.
- Share and discuss the rubric with your students. Students should understand that the rubric is there to help them learn, reflect, and self-assess. If students use a rubric, they will understand the expectations and their relevance to learning.
- Consider scalability and reusability of rubrics. Create rubric templates that you can alter as needed for multiple assignments.
- Maximize the descriptiveness of your language. Avoid words like “good” and “excellent.” For example, instead of saying, “uses excellent sources,” you might describe what makes a resource excellent so that students will know. You might also consider reducing the reliance on quantity, such as a number of allowable misspelled words. Focus instead, for example, on how distracting any spelling errors are.
Example of an analytic rubric for a final paper
Example of a holistic rubric for a final paper
Example of a single-point rubric
More examples:
- Single Point Rubric Template (variation)
- Analytic Rubric Template (make a copy to edit)
- A Rubric for Rubrics
- Bank of Online Discussion Rubrics in different formats
- Mathematical Presentations Descriptive Rubric
- Math Proof Assessment Rubric
- Kansas State Sample Rubrics
- Design Single Point Rubric
Technology Tools: Rubrics in Moodle
- Moodle Docs: Rubrics
- Moodle Docs: Grading Guide (use for single-point rubrics)
Tools with rubrics (other than Moodle)
- Google Assignments
- Turnitin Assignments: Rubric or Grading Form
Other resources
- DePaul University (n.d.). Rubrics.
- Gonzalez, J. (2014). Know your terms: Holistic, analytic, and single-point rubrics. Cult of Pedagogy.
- Goodrich, H. (1996). Understanding rubrics. Teaching for Authentic Student Performance, 54(4), 14-17.
- Miller, A. (2012). Tame the beast: Tips for designing and using rubrics.
- Ragupathi, K., & Lee, A. (2020). Beyond fairness and consistency in grading: The role of rubrics in higher education. In C. Sanger & N. Gleason (Eds.), Diversity and Inclusion in Global Higher Education. Palgrave Macmillan, Singapore.
Rubrics for Oral Presentations
Introduction
Many instructors require students to give oral presentations, which they evaluate and count in students’ grades. It is important that instructors clarify their goals for these presentations as well as the student learning objectives to which they are related. Embedding the assignment in course goals and learning objectives allows instructors to be clear with students about their expectations and to develop a rubric for evaluating the presentations.
A rubric is a scoring guide that articulates and assesses specific components and expectations for an assignment. Rubrics identify the various criteria relevant to an assignment and then explicitly state the possible levels of achievement along a continuum, so that an effective rubric accurately reflects the expectations of an assignment. Using a rubric to evaluate student performance has advantages for both instructors and students.
Rubrics can be either analytic or holistic. An analytic rubric comprises a set of specific criteria, with each one evaluated separately and receiving a separate score. The template resembles a grid with the criteria listed in the left column and levels of performance listed across the top row, using numbers and/or descriptors. The cells within the center of the rubric contain descriptions of what expected performance looks like for each level of performance.
A holistic rubric consists of a set of descriptors that generate a single, global score for the entire work. The single score is based on raters’ overall perception of the quality of the performance. Often, sentence- or paragraph-length descriptions of different levels of competencies are provided.
When applied to an oral presentation, rubrics should reflect the elements of the presentation that will be evaluated as well as their relative importance. Thus, the instructor must decide whether to include dimensions relevant to both form and content and, if so, which ones. The instructor must also decide how to weight each of the dimensions: are they all equally important, or are some more important than others? Finally, if the presentation represents a group project, the instructor must decide how to balance grading individual and group contributions.
Creating Rubrics
The steps for creating an analytic rubric include the following:
1. Clarify the purpose of the assignment. What learning objectives are associated with the assignment?
2. Look for existing rubrics that can be adopted or adapted for the specific assignment
3. Define the criteria to be evaluated
4. Choose the rating scale to measure levels of performance
5. Write descriptions for each criterion for each performance level of the rating scale
6. Test and revise the rubric
Examples of criteria that have been included in rubrics for evaluating oral presentations include:
- Knowledge of content
- Organization of content
- Presentation of ideas
- Research/sources
- Visual aids/handouts
- Language clarity
- Grammatical correctness
- Time management
- Volume of speech
- Rate/pacing of speech
- Mannerisms/gestures
- Eye contact/audience engagement
Examples of scales/ratings that have been used to rate student performance include:
- Strong, Satisfactory, Weak
- Beginning, Intermediate, High
- Exemplary, Competent, Developing
- Excellent, Competent, Needs Work
- Exceeds Standard, Meets Standard, Approaching Standard, Below Standard
- Exemplary, Proficient, Developing, Novice
- Excellent, Good, Marginal, Unacceptable
- Advanced, Intermediate High, Intermediate, Developing
- Exceptional, Above Average, Sufficient, Minimal, Poor
- Master, Distinguished, Proficient, Intermediate, Novice
- Excellent, Good, Satisfactory, Poor, Unacceptable
- Always, Often, Sometimes, Rarely, Never
- Exemplary, Accomplished, Acceptable, Minimally Acceptable, Emerging, Unacceptable
Grading and Performance Rubrics Carnegie Mellon University Eberly Center for Teaching Excellence & Educational Innovation
Creating and Using Rubrics Carnegie Mellon University Eberly Center for Teaching Excellence & Educational Innovation
Using Rubrics Cornell University Center for Teaching Innovation
Rubrics DePaul University Teaching Commons
Building a Rubric University of Texas/Austin Faculty Innovation Center
Building a Rubric Columbia University Center for Teaching and Learning
Rubric Development University of West Florida Center for University Teaching, Learning, and Assessment
Creating and Using Rubrics Yale University Poorvu Center for Teaching and Learning
Designing Grading Rubrics Brown University Sheridan Center for Teaching and Learning
Examples of Oral Presentation Rubrics
Oral Presentation Rubric Pomona College Teaching and Learning Center
Oral Presentation Evaluation Rubric University of Michigan
Oral Presentation Rubric Roanoke College
Oral Presentation: Scoring Guide Fresno State University Office of Institutional Effectiveness
Presentation Skills Rubric State University of New York/New Paltz School of Business
Oral Presentation Rubric Oregon State University Center for Teaching and Learning
Oral Presentation Rubric Purdue University College of Science
Group Class Presentation Sample Rubric Pepperdine University Graziadio Business School
Center for Excellence in Teaching
Group presentation rubric
This is a grading rubric an instructor uses to assess students’ work on this type of assignment. It is a sample rubric that needs to be edited to reflect the specifics of a particular assignment. Students can self-assess using the rubric as a checklist before submitting their assignment.
Oral Presentation Rubric
Select the box which most describes student performance. Alternatively, you can "split the indicators" by using the boxes before each indicator to evaluate each item individually.
In Case You Missed It: Crafting Rubrics with Canvas & ChatGPT to Streamline Grading
On March 13, Alex Ambrose, program director of assessment and analytics at Notre Dame Learning’s Kaneb Center for Teaching Excellence, and Kevin Abbott, an academic tech specialist on the Office of Information Technology’s Teaching & Learning Technologies team, led a virtual workshop titled “Crafting Rubrics with Canvas & ChatGPT to Streamline Grading.”
Participants in the workshop engaged in a comprehensive exploration of rubric creation and technology integration focused on the following topics:
1. Understanding Rubric Types
Attendees explored two main types of rubrics, analytic and holistic, gaining insights into their application and benefits. The session highlighted how choosing the right type can enhance assessment clarity and provide detailed feedback, tailoring the approach to specific assignment needs.
2. Drafting Rubrics with ChatGPT Assistance
Participants were guided through drafting their own rubrics, with a special emphasis on leveraging ChatGPT. They focused on identifying criteria, setting performance levels, and articulating descriptors for mastery. This hands-on activity allowed educators to create tailored rubrics for their courses, harnessing ChatGPT’s capabilities for nuanced and effective rubric development.
3. Canvas Integration Techniques
The workshop also covered integrating these rubrics into the Canvas Learning Management System. Educators learned how to use Canvas Rubrics and Speedgrader tools to improve grading efficiency and provide meaningful feedback, streamlining the assessment process.
The session provided a platform for educators to share ideas and engage in discussions about the transformative impact of rubrics in education. If you missed this opportunity, keep an eye out for future workshops and resources to elevate your teaching and assessment methods.
- Presentation Google Slide Deck (NetID login required)
- Presentation Recording (NetID login required, 1 hr., 3 min.)
- See the Launch Post on the “Teaching in the Age of AI” series for links to register for future events.
- Check out our resource on Teaching in the Age of AI .
Unlocking Academic Excellence: Using Generative AI to Create Custom Rubrics
by Bethany Harris · May 1, 2024
Rubrics are more than an evaluation tool; they help set student expectations, increase grading consistency, and promote student independence (Andrade & Du, 2005; Chen et al., 2013; Christie et al., 2015; Timmerman et al., 2011; Jonsson, 2014; Panadero & Romero, 2014; Menéndez-Varela & Gregori-Giralt, 2018). Well-designed rubrics allow instructors to provide targeted and more objective feedback while also minimizing grading time (Campbell, 2006; Powell, 2002; Reitmeier et al., 2004). While the benefits of rubrics are clear, their creation can often be time-consuming at the front end of assignments. The solution? Use generative AI to create custom rubrics for your courses.
A well-designed rubric outlines clear performance expectations and provides students with targeted feedback. It comprises three key elements: evaluation criteria, a scoring scale, and descriptions of quality for each criterion. It is the third element that makes rubric design so challenging. Criteria identify which features of the task will be assessed and the scoring scale rates performance quality; but it is the descriptors that help students accurately assess their own performance and strategize to improve accordingly.
As an instructor, you can streamline your rubric creation process by combining this information with a generative AI tool such as Microsoft Copilot or ChatGPT. To start, we must design an AI prompt outlining our needs. This prompt should include the assignment or task; the course objectives; the scoring scale; the desired criteria; and instructions for descriptors. Consider the example below, a problem designed to assess students’ understanding of Newton’s Laws of Motion:
The values of masses m1 and m2 are 2kg and 3kg, respectively, in the system shown in the attached image. The friction coefficient between the inclined plane and mass m1 is 0.5. If the system is released, find the values of acceleration and tension in the string. (sin37 = 0.6, cos37 = 0.8, g = 10m/s2)
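For reference, and assuming the standard setup for this kind of problem (m2 hangs vertically from the string over a pulley while m1 is dragged up the 37° incline), the arithmetic works out as follows. Knowing the expected answer is useful when checking the descriptors the AI generates for the "Solutions for equations" criterion later on:

```latex
% Newton's second law for each mass, taking m2 descending as positive:
\begin{align*}
  m_2 g - T &= m_2 a
    &\Longrightarrow\quad 30 - T &= 3a \\
  T - m_1 g \sin 37^\circ - \mu\, m_1 g \cos 37^\circ &= m_1 a
    &\Longrightarrow\quad T - 12 - 8 &= 2a
\end{align*}
% Adding the two equations: 10 = 5a, so a = 2 m/s^2 and T = 24 N.
```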
Prompt Engineering:
To create an effective prompt, we first need to tell the AI platform what we want it to do. In this case, we want it to design a rubric. We can say:
Create a well-crafted and clear rubric for students in the form of a table using student-friendly language.
Next, we need to include the assignment description by simply copying and pasting the instructions. For tasks that include an image, like our physics example above, have the image available as a separate file to upload into the generative AI platform. If the generative AI platform cannot read or interpret pictures or images, then write a detailed description of the image. At the time this article was published, Copilot was able to interpret images while the free version of ChatGPT (3.5) was not. We can say:
The rubric is for the following student task description: The values of masses m1 and m2 are 2kg and 3kg, respectively, in the system shown in the attached image. The friction coefficient between the inclined plane and mass m1 is 0.5. If the system is released, find the values of acceleration and tension in the string. (sin37 = 0.6, cos37 = 0.8, g = 10m/s2)
The language and terminology used in rubrics should align with course objectives, which means we should also include the course learning objectives in our prompt. For our physics example, we can say:
The rubric should be aligned with the following course learning objectives: Upon successfully completing this course, you will have come to understand the basic principles governing the motion of objects, learned to think more critically/scientifically, and developed the skills needed to attack difficult problems. These are all skills that will serve you strongly in your future courses and careers, even if you never again consider a block sliding down an incline.
Next, we need to tell the AI platform what type of rubric we would like to create. This includes the three essential parts of a rubric. For our physics example, we can say:
The rubric should contain three parts: Scoring and Scale, Criteria, and Descriptors. Use the following scoring scale for the rubric: Exemplary (4 points) Proficient (3 points) Basic (2 points) Beginning (1 point) Include the following criteria for each element of the scoring scale I just mentioned above: Axes Drawing free-body diagrams Representation of forces Type and direction of motion Solutions for equations Units
Next, we need to provide a clear description of the type of descriptors we need for each criterion. This is often the most difficult and time-intensive part of rubric creation, but AI can quickly do this task in student-friendly language. Continuing with our example, we can say:
For each of the criteria and each scoring scale, generate a descriptor that focuses on describing the quality of the work rather than simply the quantity. Emphasize what constitutes exemplary, proficient, basic, and beginning performance in terms of meeting the objectives of the task, rather than just the quantity of work produced. For example, descriptors should highlight the depth of understanding, clarity of communication, accuracy of information, relevance to the topic, adherence to conventions, and effectiveness of practical implications, among other qualitative aspects.
Finally, we need to tell the AI platform what rubric form we would like. The most common form is a table. We can say:
Generate the rubric in the form of a table. The first row heading for the table should include the scoring scale and points. The first column on the left of the table should display the criteria. The descriptors for each component and score should be listed under the correct scoring scale and points column and criteria row. Make the descriptors in the table as specific to the objectives as possible.
When we put all of this together into one prompt, we generated the following rubric.
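The assembly step itself is mechanical enough to script. Below is a minimal sketch in Python that concatenates the pieces described above into one prompt string; the variable names are invented for illustration, and the "..." placeholders stand for the full text quoted earlier. The resulting string would be pasted into (or sent via API to) whichever generative AI platform you use:

```python
# Assemble the rubric-generation prompt from the pieces discussed above.
task_instruction = (
    "Create a well-crafted and clear rubric for students in the form of a "
    "table using student-friendly language."
)
assignment = "The rubric is for the following student task description: ..."
objectives = "The rubric should be aligned with the following course learning objectives: ..."
structure = (
    "The rubric should contain three parts: Scoring and Scale, Criteria, and "
    "Descriptors. Use the following scoring scale: Exemplary (4 points), "
    "Proficient (3 points), Basic (2 points), Beginning (1 point)."
)
descriptor_rules = (
    "For each criterion and each scoring level, generate a descriptor that "
    "focuses on the quality of the work rather than the quantity."
)
output_format = "Generate the rubric in the form of a table."

# Blank lines between pieces keep the prompt readable for both you and the model.
prompt = "\n\n".join(
    [task_instruction, assignment, objectives, structure, descriptor_rules, output_format]
)
print(prompt)
```

Keeping the pieces as separate variables makes it easy to reuse the same formula for the next assignment: swap out `assignment` and `objectives`, leave the rest untouched.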
From here, you can adjust the rubric as needed yourself or adjust your prompt. Instead of spending your time creating a rubric for each assignment, you can use this formula to have AI do the work for you.
This blog post is adapted from CTL faculty Amanda Nolen’s “AI-Powered Rubrics” talk at the 2024 Georgia Tech Symposium for Lifetime Learning. View her presentation slides, examples, and prompt scripts that can be adapted for your own assignments/courses. To learn more about rubrics and assessment criteria, visit CTL’s online resource on the topic.
References:
Andrade, H., & Du, Y. (2005). Knowing what counts and thinking about quality: Students report on how they use rubrics. Practical Assessment, Research and Evaluation, 10(4), 57-59.
Chen, H. J., She, J. L., Chou, C. C., Tsai, Y. M., & Chiu, M. H. (2013). Development and application of a scoring rubric for evaluating students’ experimental skills in organic chemistry: An instructional guide for teaching assistants. Journal of Chemical Education, 90(10), 1296-1302.
Christie, M., Grainger, P. R., Dahlgren, R., Call, K., Heck, D., & Simon, S. E. (2015). Improving the quality of assessment grading tools in master of education courses: A comparative case study in the scholarship of teaching and learning. Journal of the Scholarship of Teaching and Learning, 15(5), 22-35.
Howell, R. J. (2014). Grading rubrics: Hoopla or help? Innovations in Education and Teaching International, 51(4), 400-410.
Jonsson, A. (2014). Rubrics as a way of providing transparency in assessment. Assessment & Evaluation in Higher Education, 39(7), 840-852.
Menéndez-Varela, J. L., & Gregori-Giralt, E. (2018). Rubrics for developing students’ professional judgement: A study of sustainable assessment in arts education. Studies in Educational Evaluation, 58, 70-79.
Panadero, E., & Romero, M. (2014). To rubric or not to rubric? The effects of self-assessment on self-regulation, performance and self-efficacy. Assessment in Education: Principles, Policy & Practice, 21(2), 133-148.
Powell, T. A. (2002). Improving assessment and evaluation methods in film and television production courses. Capella University.
Reitmeier, C. A., Svendsen, L. K., & Vrchota, D. A. (2004). Improving oral communication skills of students in food science courses. Journal of Food Science Education, 3(2), 15-20.
Timmerman, B. E. C., Strickland, D. C., Johnson, R. L., & Payne, J. R. (2011). Development of a ‘universal’ rubric for assessing undergraduates’ scientific reasoning skills using scientific writing. Assessment & Evaluation in Higher Education, 36(5), 509-547.
University of Dayton Students Light up Fourth Graders’ Interest in Engineering
By Sarina Tacović
University of Dayton engineering students worked on a project for Stingley Elementary School to reduce cafeteria noise, offering hands-on learning and sparking excitement about college and engineering among fourth graders.
Senior UD students Adam Dulay, electrical and computer engineering; Alison Hardie, electrical engineering; and William Maurer, mechanical engineering, created a battery-operated traffic light signal to place on cafeteria tables. The lights change from green to red depending on the detected volume, which allows the elementary students to regulate noise levels.
“This project was a fun way to give back and inspire the next generation,” Dulay said. “All of us were these kids a few years ago, and I’m sure I didn’t help with the noise problem in our cafeteria growing up. I had no idea what I wanted to do at that age, but if I was introduced to engineering then, maybe I would have been doing it a lot earlier.”
The UD students met with the 77 fourth graders every few months to provide updates on the project status. They also taught the Stingley students about the concepts used in the project, including circuits and lasers.
The fourth graders’ curiosity was palpable when they asked questions:
- “Is this the same laser that would cut diamonds?”
- “Will it alert the teachers if it stays too loud for too long?”
- “Can the microphone pick up just the length of the table, or will it pick up other tables, too?”
Stingley fourth-grade teacher Craig Chabut selected a smaller group of his students, because of their keen interest in the project, for additional learning opportunities with the UD students. At one of those meetings, the fourth-graders built a circuit.
“As a teacher, I do my best to bring the real world into my classroom. This is not only a peek at the real world through the lens of an engineer, but also ties elementary and college together for our students,” Chabut said. “This experience has been fantastic for all of us, including teachers, and I have been blown away by the efforts of the UD students. From their willingness to work with our changes and opinions throughout, to the small group work they did with students, to their presentations, it has exceeded any expectations we have had.”
During one of their update presentations, the UD students explained circuits.
The fourth graders formed a circuit, too, watching the baton light up when the circuit closed.
Reed Magazine
Reedies Bring Home Physics Presentation Awards
Students presented their research at the American Physical Society meeting last month in Minneapolis.
Three students, Valerie Wu ’24, Natalie Rogers ’24, and Rubayat Jalal ’26, attended the American Physical Society Meeting last month in Minneapolis, bringing home two undergraduate presentation awards. The five-day conference provided a space for over 10,000 physicists, from undergrads to PhDs, to come together and discuss their research.
Fourth-year physics major Valerie Wu presented her summer research on non-invertible symmetries—operations that are useful in understanding fundamental forces in physics, like electromagnetism and gravity. This was her third year attending the conference and her first time presenting, though the step up wasn’t too intimidating. “It felt less like a lecture and more like a conversation,” said Valerie. Reed Professor Noah Charles assisted in her research and strongly encouraged her to present her work; she also collaborated with Professor Ben Heidenreich at the University of Massachusetts Amherst.
The meeting was a cross-disciplinary symposium that brought together many different areas of physics. While Valerie’s work was more mathematical and theory-based, Rubayat Jalal, a second-year physics major, presented experimental research from her ongoing laboratory work. Rubayat has been working in Reed Professor Jennifer Heath’s lab for over a year, studying the electrical properties of an organic polymer—a substance built from long chains of repeating molecular units—that was prepared by collaborators at the University of Portland. Their goal is to use the material to convert heat energy into electrical energy.
The conference was a great opportunity to network and create connections with professionals in the field, and this was not lost on the students. Heath expressed how impressive they were, commenting that “they were polished and professional, and did a wonderful job representing Reed.” This is an example of Reed’s commitment to fostering undergraduate collaboration, endeavors, and accomplishments.
May 14th AI at UB Forum offers generative AI recommendations for UB
Published May 1, 2024
The Task Force on Generative AI in Teaching and Learning at UB will present its recommendations for the use of generative AI at UB on May 14th in Davis 101. Following the report, an expert panel will discuss AI for the good of society.
Date and location
- Tuesday, May 14, 2:45 p.m. – 4:30 p.m., Davis 101
This forum is a follow-up to the April 30th AI at UB Forum , which featured presentations by the task force committees and AI seed grant winners.
UB has been a leader in AI research since the 1990s and now has over 200 faculty working on AI projects. Here’s what you can expect at the May 14th AI at UB Forum.
- 2:45 p.m. – 3:45 p.m. – Members of the Task Force on Generative AI in Teaching and Learning at UB will present a readout of major AI recommendations for the university.
- 3:45 p.m. – 4:30 p.m. – A panel discussion with university experts on AI for the good of society.
Will the presentation be available online?
If you can’t make it to the event in person, the May 14th AI at UB Forum will be livestreamed and recorded on Panopto for online viewing.
Missed April 30th's forum?
The forum held on Tuesday, April 30th is also available to watch online.
Celebration of Learning 2024
Schedule overview
The annual Celebration of Learning showcases work by students, faculty and staff at Augustana.
This year's celebration on May 8, 2024, includes poster presentations, interactive sessions, panel presentations and special events.
Here is a schedule of posters and oral presentations by area of study or program.
Poster presentations, session I
9-10 a.m., Gerber Center, Gävle rooms. Poster session I descriptions:
- Asian Studies
- Computer Science
- Engineering
- German Studies
- Public Health
- Swenson Center
Poster presentations, session II
1-2:15 p.m., Gerber Center, Gävle rooms. Poster session II descriptions:
- Biochemistry
- Environmental Studies
- Environmental Studies, Geology
- Environmental Studies, UMC
- Neuroscience
Oral presentations, session I
10-11:15 a.m. Session I descriptions:
- Hanson 102: Special event, "A People of Vision"
- Olin Auditorium: Featured presentation, "Tales from the HEAR-T lab"
- Old Main 117: Communication Studies; Women, Gender, and Sexuality Studies
- Hanson Annex 127: Physics, Mathematics
- Old Main 28: Texas Medical Center
- Wallenberg Hall, Denkmann and Thomas Tredway Library: Art History, Graphic Design, Studio Art (Senior Art Show)
- Old Main 132: Scandinavian Studies, Kinesiology, Philosophy
- Larson Hall, Bergendoff: Music
Oral presentations, session II
11:30 a.m.-12:45 p.m. Session II descriptions:
- Wallenberg Hall, Denkmann: Special event, "Lifeboat Challenge"
- Olin Auditorium: Featured presentation, "Empowering Communities: Students unite with Rock Island to develop a plan to replace lead drinking water pipes equitably"
- Old Main 117: Honors Capstone (Public Health, Kinesiology, English/Creative Writing/Psychology, Philosophy)
- Hanson Annex 127: Political Science
- Hanson 102: French
- Old Main 132: Sociology and Anthropology
- You Belong Here lounge, Gerber Center: First Year Honors Program
Oral presentations, session III
1-2:15 p.m. Session III descriptions:
- Hanson Annex 127: Art History, History
- Old Main, third floor: Communication Studies
Oral presentations, session IV
2:30-3:45 p.m. Session IV descriptions:
- Wallenberg Hall, Denkmann: Featured presentation, "Kitchen Metamorphosis"
- Old Main 117: Holden Village J-Term program
- Hanson 102: Native American Studies and Community Outreach
- Old Main 28: Japan J-Term program
- Old Main 132: Women, Gender, and Sexuality Studies
- Black Box, Brunner Theatre Center: Creative Writing
Special events
- 9-11 a.m., Carlsson Evald Hall: "Neurdfest," presented by Genevieve Berryman, Katey Clark, Breonna Culver, Bitanya Darge, Katie Frese, Emily Kastanes, Anna Killilea, Megan Markiewicz, Monica Perez, Zion Thomas, Sierra White, Dr. Shara Stough, Dr. Ian Harrington and Dr. Rupa Gordon
- 10-11:15 a.m., Hanson 102: "A People of Vision," presented by Dr. Sandra Boham, President, Salish Kootenai College, Pablo, Mont.
- 11 a.m.-3 p.m., Viking Plaza (rain location: Brew by the Slough): "Keeping Cool with Campus Kitchen"
- 11:30 a.m.-12:45 p.m., Wallenberg Hall, Denkmann: "Lifeboat Challenge," presented by ALIVE, Laurel Williams and Michelle Crouch
- 2-4:30 p.m., Old Main first floor, rotunda area: "Reproductive Resources and Rights in the QC"
Osher Lifelong Learning Institute at Ringling College announces summer semester programs
Registration is open for the 2024 summer semester at the Osher Lifelong Learning Institute at Ringling College, which offers noncredit educational opportunities for adults to pursue new interests, expand intellectual horizons, and enrich their lives.
The semester, which runs May 13-July 2, features more than 40 single-session courses, tours, presentations, movies, and hands-on programs covering a variety of topics, including arts and entertainment, history, music appreciation, health, literature, philosophy, religion, and science.
Semester highlights include Opera Houses: Musical Landscapes of Power; What Wall Street Doesn’t Want You to Know; Why Are You Left-Handed or Right-Handed?; Creating the Florida Landscape You Love; The Early Days of Comic Books (1939-1948); and a tour of St. Petersburg’s Imagine Museum and Duncan McClellan Gallery.
Classes are offered at the Sarasota Art Museum of Ringling College of Art and Design, 1001 S. Tamiami Trail, Sarasota.
Highlights of the summer semester include:
Marietta Museum of Art & Whimsy: During this guided tour with founder Marietta Lee, hear firsthand why she created a museum dedicated to the creative human spirit that raises the importance of whimsical art. This unique museum showcases a collection of light-hearted paintings, sculptures, stained glass, and artworks in a range of mediums.
Highwaymen at City Hall: Collector Roger Lightle leads a guided tour of the Florida Highwaymen exhibition at Sarasota City Hall. The Florida Highwaymen emerged in the 1950s in the agricultural communities of Fort Pierce and Gifford. They were prolific painters who sold their artwork from the trunks of their cars during the post-World War II boom because they were unable to exhibit through traditional means due to racial barriers.
Opera Houses: Musical Landscapes of Power: Explore the designs of opera houses throughout the world. Milan’s Teatro alla Scala is perhaps the most famous opera house in the world; the Sydney Opera House is one of the most distinctive buildings anywhere, considered a masterpiece of 20th-century architecture; and the Metropolitan Opera House is the largest, with an approximately 3,800-seat capacity. The evolution of architectural styles, from Beaux Arts to more recent innovative designs, will be presented, including locations in China, Dubai, and Azerbaijan.
What’s it Worth? Bring one item that you have been curious about and Andrew Ford, a seasoned antique and fine art acquisitions expert, will assess its worth. With decades of knowledge and experience, Ford can accurately evaluate the value of your treasure, whether it’s a sterling silver necklace from your great aunt, or a glass sculpture.
What Wall Street Doesn’t Want You to Know: Find out why “Black Swans” and “Gray Rhinos” happen, how retail investors are the last to know, and how Wall Street preys on the two primary investing emotions: fear and greed. The Day Hagan team will provide practical tips and best practices for selecting investments, constructing portfolios, and managing risk.
Why Are You Left-Handed or Right-Handed? Most people (85% to 90%) are right-handed and have been since the Stone Age. This session will focus on the genetic, neurological, and cultural factors that influence our preference for using one hand over the other, and what can be inferred from them. It will also examine when handedness emerges in children and age-related changes in hand dominance.
To register or for more information, visit OLLIatRinglingCollege.org or call 941-309-5111.
Submitted by Kelly Fores
Oral Presentation Grading Rubric. Name: _____ Overall Score: /40. Nonverbal Skills are rated 4 – Exceptional, 3 – Admirable, 2 – Acceptable, or 1 – Poor. Eye Contact: Holds attention of entire audience with the use of direct eye contact, seldom looking at notes or slides. Consistent use of direct eye ...
How to use this rubric:
- Self-assessment: Record yourself presenting your talk using your computer's pre-downloaded recording software or by using the coach in Microsoft PowerPoint. Then review your recording, fill in the rubric, and use it to self-assess your work.
- Feedback from Colleagues: Present your talk to a mentor or peer(s).
Oral Presentation Rubric, College of Science, Purdue University. Criteria ... A. Content: Topic lacks relevance or focus; presentation contains multiple fact errors. Topic would benefit from more focus; presentation contains some fact errors or omissions. Topic is adequately focused and relevant; major facts are accurate and generally complete.
Example 1: Oral Exam. This rubric describes a set of components and standards for assessing performance on an oral exam in an upper-division history course at CMU. Example 2: Oral Communication. Example 3: Group Presentations. This rubric describes a set of components and standards for assessing group presentations in a history course at CMU.
Oral Presentation: Scoring Guide.
- 4 points – Clear organization, reinforced by media. Stays focused throughout.
- 3 points – Mostly organized, but loses focus once or twice.
- 2 points – Somewhat organized, but loses focus 3 or more times.
- 1 point – No clear organization to the presentation.
- 3 points – Incorporates several course concepts ...
Oral Presentation Evaluation Rubric, Formal Setting (University of Michigan). PRESENTER: Non-verbal skills (Poise), scored 5 4 3 2 1. Comfort: Relaxed, easy presentation with minimal hesitation; generally comfortable appearance, occasional ...
Oral presentations are expected to provide an appropriate level of analysis, discussion and evaluation as required by the assignment. Oral presentations are expected to be well-organized in overall structure, beginning with a clear statement of the problem and ending with a clear conclusion. The presentation is well-structured; its organization ...
Oral Presentation Rubric. Criteria levels: Unsuccessful, Somewhat Successful, Mostly Successful, Successful. Claim: There is no claim, or the claim is so confusingly worded that the audience cannot discern it. Claim is present/implied but too late or in a confusing manner, and/or there are significant mismatches between claim and argument/evidence.
Example 1: Discussion Class This rubric assesses the quality of student contributions to class discussions. This is appropriate for an undergraduate-level course (Carnegie Mellon). Example 2: Advanced Seminar This rubric is designed for assessing discussion performance in an advanced undergraduate or graduate seminar.
Content, structure, and language of presentation geared to intended audience. Problematic: Presentation is missing some content required by audience; some language used inappropriately (e.g., unfamiliar jargon, too much jargon). Presentation is missing a substantial portion of content required by audience; uses some inappropriate or ineffective ...
Examples of criteria that have been included in rubrics for evaluating oral presentations include:
- Knowledge of content
- Organization of content
- Presentation of ideas
- Research/sources
- Visual aids/handouts
- Language clarity
- Grammatical correctness
- Time management
confident speech. No distracting habits. A "knock-your-socks-off" kind of presentation style. 2 Presenter talks too fast or too slow, talks to the screen, or does not talk loud enough. Says "uhm" or "like" or "Ya know". A bit chatty for professional piece, presenter tends to
Criteria/Components: 3 – Sophisticated, 2 – Competent, 1 – Not Yet Competent. Organization: Presentation is clear, logical, and organized; listener can follow line of reasoning. Presentation is generally clear and well organized; a few minor points may be confusing. Organization is haphazard; listener can follow presentation only with effort.
Group presentation rubric. This is a grading rubric an instructor uses to assess students' work on this type of assignment. It is a sample rubric that needs to be edited to reflect the specifics of a particular assignment. Students can self-assess using the rubric as a checklist before submitting their assignment.
The presentation style is interactive and encouraging of discussion throughout. The presenter engages with questions from the audience. Presentation style (10%) is scored 0-4%, 6%, 8%, or 10%: effective use of verbal and non-verbal communication (e.g., voice, volume, inflection, eye contact).
Score. Language Use and Delivery: The student communicates ideas effectively. Effectively uses eye contact. Speaks clearly, effectively, and confidently using suitable volume and pace. Fully engages the audience. Dresses appropriately. Selects rich and varied words for context and uses correct grammar. Maintains eye contact.
Oral Presentation Rubric: 4—Excellent, 3—Good, 2—Fair, 1—Needs Improvement. Delivery:
- Holds attention of entire audience with the use of direct eye contact, seldom looking at notes
- Speaks with fluctuation in volume and inflection to maintain audience interest and emphasize key points
- Consistent use of direct eye contact with ...
Scoring Rubric for Group Presentations. Columns: Competence, Weighting (/100), Criteria (graded A, A-, B+, B and below), Comments. Introduction (weighting: 10): Clearly defined background and relevance of policy issue; states objective precisely. Defined background and general relevance of policy issue; stated objectives. General description of ...
Assessment Rubric for Presentations. The presentation has a concise and clearly stated focus that is relevant to the audience. The presentation is well-structured with a clear storyline. Ideas are arranged logically; they strongly support the presentation focus. Sections are well- connected with smooth transition.
The session provided a platform for educators to share ideas and engage in discussions about the transformative impact of rubrics in education. If you missed this opportunity, keep an eye out for future workshops and resources to elevate your teaching and assessment methods. Resources: Presentation Google Slide Deck (NetID login required)
View her presentation slides, examples, and prompt scripts that can be adapted for your own assignments/courses. To learn more about rubrics and assessment criteria, visit CTL's online resource on the topic. References: Andrade, H., & Du, Y. (2005). Knowing what counts and thinking about quality: students report on how they use rubrics.
Grading Rubric. Levels of achievement: Sophisticated (3 points each), Competent (2 points each), Not Yet Competent (1 point each). Research Quality (e.g., use of varied sources, evaluated and validated sources, accurate information): Information is accurate; resources are legitimate; resources are varied when appropriate. Information is ...
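Several of the rubrics above assign points per performance level and, in some cases, a weight per criterion. As a minimal sketch of how an analytic rubric total is computed (the criterion names, scores, and weights below are invented for illustration, not taken from any rubric above):

```python
# Sketch of totaling an analytic rubric: each criterion receives a level
# score (e.g., 3 = Sophisticated, 2 = Competent, 1 = Not Yet Competent)
# and an optional per-criterion weight. All names/values are illustrative.

def rubric_total(scores, weights=None):
    """Sum level scores, applying per-criterion weights where given."""
    weights = weights or {}
    return sum(score * weights.get(criterion, 1)
               for criterion, score in scores.items())

scores = {"organization": 3, "content": 2, "delivery": 3, "visual_aids": 2}

print(rubric_total(scores))                  # unweighted total: 10
print(rubric_total(scores, {"content": 2}))  # content double-weighted: 12
```

A holistic rubric, by contrast, would skip this summation entirely and assign a single overall score.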