How can Alberta Education shift from a paper-based assessment to a more informative, digitally-based assessment?
Client: Alberta Education, Government of Alberta
My role: User Experience Designer
Duration: April 2014 – January 2016
The Student Learning Assessment is a new digitally-based provincial assessment tool that provides a beginning-of-year formative assessment to help teachers identify the strengths and areas for growth of each individual student.
This is a pilot project starting with grade 3 students (SLA3), aimed at replacing the year-end paper-based assessments known as the Provincial Achievement Tests. After the grade 3 pilot, this assessment model is scheduled to be rolled out to grade 6 and 9 students.
Team / Role
After a proof of concept had been approved, I was brought onto the project as a User Experience Designer. My role was to take the existing proof of concept and develop it into a student- and teacher-focused application.
The main team on this project comprised 5 developers, 1 QA tester, a project manager, and a product owner. We worked in an agile development methodology with weekly sprints, which required me to deliver new design concepts each week based on the current stories.
The core design process revolved around weekly developer sprints. Weekly design meetings were held to demo new design features and to discuss upcoming priorities. Based on these priorities I was responsible for delivering design concepts through an Axure prototype.
In addition to the Axure prototype, I developed a style guide built in HTML and CSS for the developers to reference in conjunction with the prototype.
In addition to the weekly sprints, I conducted contextual interviews and usability tests with teachers and internal stakeholders. Our team also developed a set of interaction guidelines for grade 3 students (aged 7-8) based on usability testing in the field.
Getting access to users was the biggest constraint on this project. Because this was a pilot project with the government, there was heightened concern for public opinion and perception. This limited our ability to go out into the field to test concepts with teachers and students.
Since the assessment was only run once a year, any features that didn't make it into the application would have to wait until the following year. During the administration, we were unable to observe the assessment in real time due to political constraints. This was a huge barrier to creating a truly teacher/student-focused application and made it difficult to define user needs.
Two major gaps in this project were a lack of analytics on usage data and the inability to meet with end users as often as we would have liked. Without direct testing with users, it was difficult to feel confident in some of the design decisions needed to move the project forward.
If I could approach this project again, I would put more focus on designing the application from a systems perspective rather than looking at individual features. I would also create better design documentation with greater detail for the developers. Documenting the decision-making process would have been beneficial as well; sometimes the reasoning and rationale behind why something was done a certain way would get lost.