It would be helpful to have an auto-scoring function for rubrics.

Currently, we have faculty score imported data fields such as GPA or Level of Need when conducting reviews of applicants. Doing so means students with higher GPAs or higher Levels of Need receive higher review scores and are prioritized for awarding. With large applicant pools, having reviewers manually score quantitative items becomes tedious and time-consuming. It would be great if there were a feature that automatically assigned scores for certain items based on parameters set by the administrator (e.g., GPA of 4.0 = 10 points, 3.9 = 9 points, 3.8 = 8 points, etc.). This would save time for our reviewers and still allow us to assign different weights to unique elements of a rubric. Ultimately, we want administrators to be able to rely on the reviewer score without having to look at or filter against other information such as GPA.
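The requested behavior could be sketched as a simple band lookup: the administrator defines value-to-points thresholds, and the system picks the highest band the applicant's value meets. This is a minimal illustration of the idea only; the function name, the band format, and the GPA table below are assumptions, not the product's actual configuration model.

```python
def auto_score(value, bands):
    """Return the points for the highest band whose threshold the value meets.

    bands: list of (threshold, points) pairs defined by the administrator.
    A value below every threshold scores 0.
    """
    for threshold, points in sorted(bands, reverse=True):
        if value >= threshold:
            return points
    return 0

# Hypothetical admin-configured bands, following the example in the request:
# GPA 4.0 = 10 points, 3.9 = 9 points, 3.8 = 8 points, ...
gpa_bands = [(4.0, 10), (3.9, 9), (3.8, 8), (3.7, 7), (3.6, 6), (3.5, 5)]

print(auto_score(4.0, gpa_bands))   # 10
print(auto_score(3.85, gpa_bands))  # 8 (meets the 3.8 threshold, not 3.9)
```

A rubric-level weight could then be applied by multiplying the returned points, which would preserve the ability to weight unique rubric elements differently.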

  • Joel Spiess
  • Jan 23 2023
  • Reviewed: Voting Open
  • Client Name (shard name): uwm
  • User: Reviewer
  • Functional Unit: Reviews
  • Employee Name: Joel Spiess
  • Oliver Mamangun commented
    December 12, 2023 01:51

    Seems to be tied to this enhancement request: https://bbam.ideas.aha.io/ideas/SMM-I-1000


    Can the votes be merged? How can we promote enhancements, and what are the criteria for an enhancement to be considered?

  • Lindley Jones commented
    March 28, 2023 16:50

    We have had committees request this very thing. There's no need for them to score when we have the information on hand. An update like this would be very helpful.