
WEB • 2025

Common Assessments

Scope

Ideation

0 to 1 design

Visual design

Role

Research
Prototyping
Product Design

Timeline

2 months

Platform

Web

Overview

In the U.S., teachers and schools evaluate students regularly — often on a weekly basis. These assessments can range from measuring mastery of a single concept to gauging understanding across an entire subject. Among this spectrum, common assessments play a particularly critical role.

Common assessments not only influence day-to-day teaching practices but also provide a clear picture of student learning outcomes. For school and district administrators — the key decision makers — these assessments are essential for comparing performance across classrooms and schools, and for ensuring that learning goals are being met consistently.

Enable admins to track learning outcomes

Provide school and district leaders with clear, actionable insights into student performance, ensuring they can measure progress and drive instructional improvements.

Create a new sales motion for Wayground

Build a solution that not only addressed educator needs but also positioned Wayground to expand into the formative assessment market at the district level, unlocking new revenue opportunities.


Getting the context

We began by speaking directly with teachers and school leaders to understand their challenges with existing assessment tools and reporting. These conversations gave us clarity on the pain points that mattered most in day-to-day use.

In parallel, we conducted competitive research to identify the core features that any credible solution would need, while also spotting opportunities to differentiate through usability and simplicity.

To accelerate the process, we leveraged AI tools for rapid prototyping and testing. This allowed us to validate ideas quickly with early users, refine the experience based on their feedback, and secure buy-in from stakeholders before moving into development.

What was not working?

Fragmented Tool Ecosystem

Teachers and admins juggle multiple tools for different assessment needs, which creates confusion, extra work, and inconsistent experiences across schools.

Heavy Administrative Burden

Administrators spend hours organising and coordinating assessments, from scheduling to distributing compliant materials, leaving them with less time to focus on supporting teachers and students.

Limited Data Insights

Educators receive raw scores but not actionable insights, making it difficult to understand learning gaps or decide how to adjust instruction effectively.

No Secure Testing Mode

Without a reliable secure mode, teachers worry about test integrity and fairness, which undermines confidence in the results and increases oversight efforts.

Technical and Infrastructure Gaps

Poor internet and limited device availability mean students can’t always participate in assessments smoothly, creating unequal access and added classroom stress.

Journeys to enable

We defined the core journeys we wanted to design for the first version before schools opened in August. This would let teachers and leaders play their roles fully while also evaluating and testing our offering.

How are we solving this?

We approached the problem by addressing the fundamental needs that would deliver a great user experience.

Easy-to-read reports

Clear charts, filters, and navigation provide teachers and admins with data and AI-powered insights.

Leveraging our ecosystem

Access to millions of ready-made activities saves teachers time and enables them to evaluate and teach effectively.

Secure test experience

Secure mode keeps students focused during tests while also preventing cheating.

Test creation & delivery

An easy way for teachers to create, assign, and distribute assessments using Wayground's tools.

Experimenting with AI

We explored AI tools across different stages of the project — from conducting market research to uncover pain points, to refining PRDs, mapping user journeys, and even gathering early design feedback.

The biggest breakthrough came with AI-powered prototyping. These tools allowed us to quickly bring ideas to life, test them with nearly a dozen teachers, and continuously refine based on real feedback. This drastically accelerated our iteration speed, helping us validate concepts faster and with more confidence.

The AI workflow stack

Prototype on Lovable

Things we wanted to validate

Report Scanning

Explored how teachers and administrators scan reports to identify priority data points. This helped validate which elements should be surfaced first in the IA to support quick comprehension and action.

Decision-Making Behaviours

Studied how district leaders use reports to detect anomalies, confirm assumptions about performance, and identify areas of concern. This informed how the dashboard could support both high-level scanning and deeper investigation.

Real-World Actions

Investigated how admins translate report findings into real-world interventions, such as collaborating with school leaders or engaging specific teachers. This validated the need for flexible views, deeper drill-down capabilities, and report exports.

Visualisation

Tested different chart types and visualisation methods to understand which formats enable faster pattern recognition and comprehension across diverse data sets. These insights guided us toward designing more context-aware visualisations.

Emerging Use Cases

Collected niche scenarios from users to uncover opportunities for scalable features that could address broader needs in future iterations of the product.

Reports

We wanted the reports to not only convey insights clearly, but also empower school and district leaders to draw the right inferences, and make decisions that drive better learning outcomes.

Traditional assessment platforms often presented reports in ways that were dull, overwhelming, and difficult to interpret. Many required the support of a data analyst, which made them inaccessible to most educators and administrators.

We set out to design reports that were simple yet powerful:

  • At a glance, leaders can understand how schools and districts are performing overall.

  • With a click, they can deep-dive into specific data points and investigate anomalies at a granular level.

To push this further, we’re also exploring AI-driven insights — helping leaders surface key trends and inferences instantly, without needing to sift through multiple charts or dashboards. The goal is to make data not just available, but truly usable in driving better learning outcomes.

What do admins care about?

These assessments are a window into how schools and districts are performing.

Targeted Intervention

Admins need to spot where students are struggling. Reports should highlight gaps at the class or district level so support can be directed quickly.

Student Performance

They want a clear view of how students are progressing over time—what’s working, what isn’t, and whether standards are being met.

Equity in Education

Admins also look for fairness. Reports should show how different student demographics are performing so no one is left behind.

Report layout

We had to pick a layout flexible enough to support deep dives into the data without overwhelming anyone reading the report.

The types of data displayed

Average Accuracy

Helps to understand aggregate performance of students and the district


Accuracy Range

Helps to understand the distribution of students in various bands

Heat Map

Allows a leader to deep dive into granular performance across standards and questions

AI Insights

Helps to surface deeper insights that are not immediately inferred from the graphs


Assessment Hosting

Redesigning the hosting flow to support a secure, classroom-first assessment experience.

Our existing game-based experiences were built for engagement and fun, but they weren’t designed for high-stakes assessments. Loopholes in the flow could be exploited, compromising test integrity and security.

To address this, we reworked the hosting experience from the ground up:

  • Refined the class selection flow so teachers can easily assign assessments to the right groups.

  • Improved student access controls, ensuring only the intended participants can join.

  • Enhanced live monitoring tools so teachers can track progress and intervene when needed.

These updates minimized opportunities for misuse while meeting teachers’ requirements for stricter proctoring and a seamless in-class experience.


Secure Assessment Mode

Designing a secure, student-friendly testing environment to protect the integrity of common assessments.

Our research revealed that a secure testing mode was a critical requirement for schools and districts, as it helps maintain the integrity and fairness of assessments. To enable this, we partnered with LockDown Browser and integrated it into Wayground.

The challenge was to make the experience simple enough for students — especially younger ones — to launch and use without confusion. We designed a streamlined entry flow that minimizes setup friction while ensuring assessments run in a secure environment.

On the operational side, our customer success teams work closely with school IT departments to manage the setup and ensure smooth rollouts, balancing both product experience and real-world implementation needs.


Final Words

For this first version, we were able to address the core challenges around assessments and deliver a solution that teachers and administrators could trust. Along the way, we learned the importance of balancing usability with rigor: tools must be simple enough for everyday classroom use while meeting the high standards districts expect.

As we move into the next phase, our focus will be on deepening the ecosystem and expanding the impact of our reporting tools.

Holistic Information Architecture

To fully integrate assessments into Wayground, we plan to design a robust IA that connects seamlessly with other features. This will help create a more cohesive, end-to-end ecosystem for educators.

Analytics for Multi-Test Assessments

We also aim to expand reporting to handle multi-test analysis, allowing leaders to compare results across assessments and identify long-term trends with greater accuracy.

$350K

in sales pipeline secured through the pilot and product demos

500+

assessments hosted by schools in our pilot program

©2025 Himanshu Sharma

Designed in Figma.

Developed in Framer.

Fuelled by coffee. 2025.