Hackathon Judging Criteria
Create fair, transparent judging that everyone trusts. The right criteria turn subjective opinions into objective evaluation that drives better projects.
Why Judging Criteria Matter
Nothing kills hackathon energy faster than teams feeling like judging was unfair or arbitrary. When participants don't understand how they'll be evaluated, they optimize for the wrong things - or worse, they don't try as hard because "it's all subjective anyway."
Without Clear Criteria
- ✗ Teams focus on flashy demos over substance
- ✗ Judges struggle to compare different types of projects
- ✗ Results feel arbitrary and demotivating
- ✗ Winners can't explain why they won
With Clear Criteria
- Teams build with clear goals in mind
- Judges evaluate consistently across projects
- Results feel fair and well-reasoned
- Everyone learns what makes a strong project
The Recommended Framework
Based on what works across hundreds of hackathons, this 4-category framework strikes the right balance between comprehensive evaluation and simplicity. It works for corporate, university, and community hackathons, and everything in between.
Innovation & Creativity
How novel and creative is the solution?
What to Look For:
- Unique approach to the problem
- Creative use of technology or resources
- Fresh perspective on existing challenges
- Originality of the core idea
Technical Implementation
How well is the solution built?
What to Look For:
- Code quality and architecture
- Technical difficulty and complexity
- Completeness of the implementation
- Demonstration of technical skill
Business Value & Impact
How valuable is this solution?
What to Look For:
- Clear problem being solved
- Potential real-world impact
- Market viability or internal value
- Scalability of the solution
Presentation & Communication
How well is the idea communicated?
What to Look For:
- Clarity of the pitch and demo
- Quality of supporting materials
- Ability to answer questions
- Overall professionalism
Adjust the Weights
These four categories are a starting point, not a rule. Adjust the relative weights to match your event's goals; the example rubric later in this guide weights Innovation at 30% and Presentation at 20%.
Voting & Scoring Methods
The voting method you choose affects judging speed, fairness, and how differentiated your results are. Here are the most effective approaches.
1-10 Scale Scoring
Recommended for most hackathons
Judges score each project 1-10 across your criteria categories. Good balance of speed and differentiation. Industry standard for most hackathons.
Pro: Fast to score, enough granularity, familiar to judges. Con: Judges may cluster around 7-8 for most projects.
Top 3 Voting
Simple & democratic
Each judge (or participant) picks their top 3 favorite projects. Most votes wins. Simple, fast, and eliminates scoring complexity.
Pro: Fast, no scoring required, very clear. Con: Less nuanced, may miss solid middle projects.
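Tallying Top 3 ballots is a straightforward count. A minimal sketch in Python, where the ballots and project names are hypothetical:

```python
from collections import Counter

# Hypothetical ballots: each judge lists their top 3 projects, in any order.
ballots = [
    ["Atlas", "Beacon", "Cider"],
    ["Beacon", "Atlas", "Delta"],
    ["Atlas", "Delta", "Cider"],
]

# Every appearance in a top-3 list counts as one vote.
votes = Counter(name for ballot in ballots for name in ballot)
ranking = votes.most_common()  # [(project, vote_count), ...] best first
```

Ties are common with this method, so decide in advance how to break them (e.g., a quick judge discussion on the tied projects).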
Ranked Choice Voting
More nuanced rankings
Judges rank their top 5-7 projects in order of preference. More detailed than Top 3, less complex than scoring every category.
Pro: Forces differentiation, captures relative preferences. Con: More complex to calculate than Top 3.
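One common way to calculate a winner from ranked ballots is a Borda-style count. This sketch uses hypothetical project names and ballots; other tallying rules (e.g., instant-runoff) are equally valid choices:

```python
# Hypothetical ballots: each judge ranks projects, best first.
ballots = [
    ["Atlas", "Beacon", "Cider", "Delta"],
    ["Beacon", "Atlas", "Delta", "Cider"],
    ["Atlas", "Cider", "Beacon", "Delta"],
]

# Borda count: first place on an n-item ballot earns n points,
# second place n - 1, and so on down to 1.
scores: dict[str, int] = {}
for ballot in ballots:
    n = len(ballot)
    for position, project in enumerate(ballot):
        scores[project] = scores.get(project, 0) + (n - position)

# Highest total points wins.
ranking = sorted(scores, key=scores.get, reverse=True)
```

Because every ranked position contributes points, a project that is consistently second can beat one with a few first-place votes and otherwise low ranks.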
Category Winners
Celebrate different strengths
Award winners in multiple categories instead of (or in addition to) overall winners. Celebrates diverse excellence and allows more teams to win.
Pro: Multiple winners, celebrates different types of excellence. Con: Requires more careful judging in each area.
Thumbs Up/Down + Feedback
Quality threshold voting
Judges cast a binary vote: Would we ship this? Yes or no. Each vote requires written feedback explaining the decision. This focuses judging on a quality bar rather than a ranking.
Pro: Emphasizes quality over ranking, valuable feedback. Con: May create many ties at the top.
Pairwise Comparison
Head-to-head matchups
Judges compare projects two at a time: "Which is better, A or B?" Repeat for multiple pairs. The algorithm determines overall rankings from head-to-head results.
Pro: Easier decisions (A vs B), statistically robust, reduces bias. Con: Requires many comparisons for accuracy.
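The simplest ranking algorithm for pairwise results is to sort projects by win rate. A sketch with hypothetical matchup data, assuming each comparison records a clear winner (more robust alternatives include Elo or Bradley-Terry models):

```python
from collections import defaultdict

# Hypothetical head-to-head results: (winner, loser) for each matchup.
matchups = [
    ("Atlas", "Beacon"), ("Atlas", "Cider"), ("Beacon", "Cider"),
    ("Cider", "Atlas"), ("Atlas", "Delta"), ("Beacon", "Delta"),
]

wins = defaultdict(int)
games = defaultdict(int)
for winner, loser in matchups:
    wins[winner] += 1
    games[winner] += 1
    games[loser] += 1

# Rank by fraction of matchups won; more comparisons per project
# make this ordering more reliable.
ranking = sorted(games, key=lambda p: wins[p] / games[p], reverse=True)
```

In practice, make sure every project appears in roughly the same number of matchups, or the win rates won't be comparable.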
Publishing Your Judging Criteria
When and how you publish your criteria is just as important as what the criteria are. Here's the right approach.
Publish Before Registration Opens
Teams should know how they'll be judged before they sign up. Include criteria in your announcement email and registration page.
Make It Easy to Find
Put criteria on your hackathon homepage, in the welcome email, and reference it during kickoff. Don't hide it in a PDF that no one will read.
Explain the "Why" Behind Each Category
Don't just list categories. Explain why Innovation matters (drives breakthrough thinking) and why Presentation matters (great ideas need great communication).
Never Change Criteria Mid-Event
Changing judging criteria after teams start building breaks trust completely. If you must adjust, make it an additive bonus category, not a change to core criteria.
Example Judging Rubric
Here's a complete example rubric using the 1-10 scale. Copy and adapt this for your hackathon.
Project Name: _______________
Judge Name: _______________ | Round: ___
Innovation & Creativity (30%)
How novel and creative is the solution?
Score: ___ / 10

Technical Implementation (25%)
How well is the solution built?
Score: ___ / 10

Business Value & Impact (25%)
How valuable is this solution?
Score: ___ / 10

Presentation & Communication (20%)
How well is the idea communicated?
Score: ___ / 10
Overall Comments & Feedback:
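Combining a judge's category scores into a final score is a weighted sum using the rubric's percentages. A minimal sketch, where the per-category scores are hypothetical:

```python
# Category weights from the example rubric above.
weights = {
    "Innovation & Creativity": 0.30,
    "Technical Implementation": 0.25,
    "Business Value & Impact": 0.25,
    "Presentation & Communication": 0.20,
}

# Hypothetical 1-10 scores from one judge for one project.
scores = {
    "Innovation & Creativity": 8,
    "Technical Implementation": 7,
    "Business Value & Impact": 9,
    "Presentation & Communication": 6,
}

# Weighted total, still on a 1-10 scale because the weights sum to 1.
total = sum(weights[c] * scores[c] for c in weights)
print(round(total, 2))  # 7.6
```

To get a project's final result, average these weighted totals across all judges who scored it.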
Share Example Scores with Judges