Harbor: Teaching the Way Students Learn
My Role: UX Lead (in-house) | Length: 3 Years
Project Status: Launched
Type: Learning Management System (B2B SaaS)
My Superpower: UX Writing
It’s 2024 and AI is huge. Why? We all want software that thinks the way we do.
Shouldn’t teaching work the same way? Supporting educators is our best hope for a bright future.
As part of Big Ideas Learning, I led a small team of three designers over three years to design one of the first learning platforms driven by real-time student progress data and fully customizable curricula.
Problem space
How might we help teachers teach the way students learn and do so on a per-student basis?
Our Solution
We gained an intimate view of the current museum experience and learned that it fell short: minimal information, little entertainment, and lackluster tours. Research told us the solution was “magic window” mobile AR. We named it Tellyscope.
Design Goals
To make every museum a multi-sensory experience.
To educate through context.
To foster curiosity and wonder.
To individualize the museum experience.
We made:
The Tellyscope brand. Television (entertainment) + Telescope (exploration) = Tellyscope
User-tailored fact callouts
Animal and environmental sound triggers
Synchronized audio and text guides
Environment animations
“Skeleton to skin” rendering of lifelike animals
Colorblind-friendly, iOS HIG visual design
UI and Features
Research
We started by putting users at the core of our design.
We learned their demographics and how each of them experienced museums, including their best and worst visits.
Then we learned from those experiences.
We interviewed 5 museum-goers, ranging from casual tourists to enthusiasts. They told us what mattered to them during a visit and what made a museum great.
Their insights became our design
They told us that they valued an individual, immersive experience that connected them to the pieces and engaged them intellectually. However, they found natural history museums stale, boring, or aimed at children.
Tour experiences were a key theme: both audio and human guides were often clumsy, a hassle to use, and incomplete in the information they provided.
Design
1: Low Detail
We brainstormed how we’d address each feature.
As an Agile UX team, we collaborated and built on each other’s ideas so that we committed to the best solution.
We picked the best idea for each feature, revised it, and made it part of our MVP.
Narration Text Design
Initial Off-Screen Callout Design
Initial UI Design
Revised UI Design
2: Medium Detail
We tested our prototype with users.
Users showed us that:
They wouldn’t want to hold their phone continuously, making audio narration a key feature.
The prominent size and color of the “AR Active” indicator made it look like a button.
Numbered fact callouts suggested that users should tap them in order.
Mid-Detail Usability Results
3: High Detail MVP
High Detail Usability Results
We made changes and tested our final MVP with more users.
What changed:
Rewrote FAQ copy to reflect how users verbalized tech issues.
Added intuitive zoom.
Removed numbers from fact callouts.
Added adjustable knowledge-level settings for children, adults, and enthusiasts.
Takeaways
Text labels are more accessible than icons. Users would rather be “insulted” than frustrated.
Natural mappings work. Given two ways to adjust knowledge depth, every user intuitively swiped the dial.
What one user loves, another finds distracting. Keep it simple.
UX means authentic inclusion. Authentic inclusion means going beyond convention and never assuming. All users enjoyed the audio narration and text we synced for accessibility.