Short Stories: Research and Design Methods at edX
A collection of quantitative and qualitative research I supported or ran while working at edX, along with the various design methods I employed to generate artifacts like sketches, wireframes, hi-fi visual designs, interactions, workflows, and clickable prototypes.
Note: Each team at edX works differently, so process stories are individualized to time constraints, available resources, and specific team goals.
Company: edX
Dates: October 2017 – September 2019
How do learners on a degree pathway approach a course differently than those who are learning for immediate application of skills?
During a company hackathon, my team and I had some down time from standard projects and company initiatives. We used this time to do a basic review of the learning experience. We were just about to release the first full Master's degree on the platform, so this usability review focused on the experience of a credit-earning student starting a new course.

Participant Selection
We needed candidates who were:
- unbiased towards older features we intended to explore
- currently (or very recently) in a degree program
Our new interns were solid, easy-to-access candidates. Their recent start at the company meant they had little bias toward the older features we intended to explore. The interns had been in undergraduate or graduate programs the previous semester, so we were able to see how current students study online. We also gained valuable internal exposure by introducing these new employees to UX work, which they excitedly shared with their respective teams and discussed over lunch.
Testing Goals
- How do credit-earning learners experience a course differently than those learning ad-hoc for immediate application at work?
- Do they have enough information and support to confidently start a high-stakes high-cost course?
We gathered and coded observations using a rainbow spreadsheet for collaborative analysis. Small insights from this no-cost research provided a couple of quick wins and cleared several new paths for further exploration.
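A rainbow spreadsheet boils down to a grid of observations versus participants, where a marked cell means that participant exhibited the behavior. A minimal sketch of that tallying step, using made-up participant IDs and observations (not the actual study data):

```python
from collections import defaultdict

# Hypothetical coded session notes; each tuple is (participant_id, observation).
# The data below is illustrative only, not real findings.
session_notes = [
    ("P1", "missed the course start date"),
    ("P2", "missed the course start date"),
    ("P2", "looked for a syllabus before starting"),
    ("P3", "missed the course start date"),
    ("P3", "looked for a syllabus before starting"),
]

# Build the rainbow grid: observation -> set of participants who showed it.
grid = defaultdict(set)
for participant, observation in session_notes:
    grid[observation].add(participant)

# Rank observations by how many participants exhibited them, so the
# most common usability issues surface first.
for observation, participants in sorted(grid.items(), key=lambda kv: -len(kv[1])):
    print(f"{len(participants)}/3  {observation}  ({', '.join(sorted(participants))})")
```

In practice the grid lives in a shared spreadsheet with one color per researcher, which is what makes the coding collaborative; the logic above is just the counting behind it.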
Working with new employees
- My UX colleagues and I had a chance to get to know new edX employees and evangelize the work UX does.
- We left time to discuss practical ways in which our teams could support each other.
How can we update our brand to embody trustworthiness & credibility, while still focusing on content?
As part of the Course Discovery project, we were able to develop an updated visual style for the site. We reviewed the visual design for general impressions and brand definition.
Competitive Analysis
In order to develop a much-needed update to our product's look and feel, my UI colleague and I researched trends and styles in the education space, particularly among ed-tech companies.
Style Tiles
Using this information, we each developed several concepts for an updated look and feel. We reviewed the various concepts with the team and compared against brand descriptors agreed upon earlier in the month: Trustworthiness, credibility, and approachability.

Desirability Tests
Using these style tiles, we built out the same page in each style to run through a desirability test based on Microsoft's Desirability Toolkit. Participants viewed one rebranded page and described what they had just viewed by choosing from a list of descriptors.
Using these tests, we were able to tone down elements that were less useful for the brand image and amplify pieces that were helpful.
I highlighted quick facts about the site and our users—participants found these reassuring and credible. We simplified general information about the subject—people read it less frequently because it was lengthier and more technical, creating a less approachable vibe.
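Analyzing a desirability test like this mostly means tallying which descriptors participants chose and comparing them against the target brand attributes. A small sketch with invented selections (the word list and data here are illustrative, not the study's actual results):

```python
from collections import Counter

# Hypothetical descriptor selections, one list per participant, drawn
# from a fixed word list in the style of the Desirability Toolkit.
selections = [
    ["trustworthy", "credible", "busy"],
    ["credible", "approachable"],
    ["trustworthy", "credible", "dated"],
]

counts = Counter(word for picks in selections for word in picks)

# Compare against the brand descriptors agreed upon earlier:
# trustworthiness, credibility, and approachability.
targets = {"trustworthy", "credible", "approachable"}
for word, n in counts.most_common():
    marker = "on-brand" if word in targets else "off-brand"
    print(f"{word:12s} {n}  ({marker})")
```

Frequent off-brand words (like "busy" or "dated" above) point to elements worth toning down; frequent on-brand words point to elements worth amplifying.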
How can we reduce the time and resources spent processing small, simple enterprise sales?
The good news—edX's developing enterprise offering was growing and gaining a foothold with both large and small organizations.
The bad news—the sales and fulfillment process was tedious, manual, and required hours of time from the Enterprise Customer Success and Sales teams.
Contextual Inquiry to inform a Journey Map
I sat down with one of our Customer Success representatives to observe how the sales and fulfillment process worked.
I roughly applied object-oriented design principles to develop a concise journey map for the lengthy process.
Parties: Customer Success team, the client, and the Enterprise Sales team. (Supporting roles from Legal and Engineering.)
Documents: Instead of focusing on the current state, we singled out the bare minimum client information necessary to make a sale and important status alerts to move the order forward.
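Treating the order itself as an object with a small set of statuses is one rough way to capture the "documents and status alerts" framing above. A minimal sketch, where the field names and state list are my own illustrative assumptions rather than edX's actual process:

```python
from dataclasses import dataclass, field

@dataclass
class Order:
    """A bare-minimum enterprise order: just the client info needed to
    make a sale, plus a status that moves the order forward."""
    client: str
    seats: int
    status: str = "quote_requested"
    notes: list = field(default_factory=list)

    # Hypothetical status alerts, in the order they move a sale forward.
    TRANSITIONS = {
        "quote_requested": "contract_sent",
        "contract_sent": "contract_signed",
        "contract_signed": "codes_issued",
        "codes_issued": "fulfilled",
    }

    def advance(self) -> str:
        """Move the order to its next status."""
        self.status = self.TRANSITIONS[self.status]
        return self.status

order = Order(client="Acme Co.", seats=25)
order.advance()
```

Mapping the journey this way made it easy to see which status transitions were manual hand-offs between parties and therefore candidates for automation.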
Opportunity Analysis
In a collaborative session including sales, customer success, engineering, and product, we reviewed the journey map for:
- Opportunities to encourage upgrades to increase sale size;
- Manual tasks that could be automated to decrease resources; and
- Repetitive steps that could be consolidated to decrease resources, decrease support tickets, and decrease turnaround time for fulfillment.
Design Recommendation
I used the results from this collaborative session to develop a low-fi design that would collect all order information in one step to reduce emails. I then worked with product and engineering to outline additional opportunities for automation as well as a very high-level workflow for an ideal fulfillment process.

Are learners likely to pursue a program if informed about it within a course they are already taking?
Part I: Audience Validation
Our data analytics team member already had a backlog of analysis from previous tests, and she had to split her time between our working group and projects for other teams, so we kept this first test deliberately lightweight.
How We Tested
I designed a banner ad that highlighted other courses in the course series. Users could click a button to "save for later." Functionally, it did nothing besides click tracking.
What we learned: a significant portion of learners were interested in additional courses in the larger program.
Part II: A/B test
Quant analysis: Using data we discovered (the portion of learners who expressed interest in the previous test) and existing product data (the conversion rate for users who enroll in a program and then purchase it), we estimated X learners would enroll in the program, and Y learners would convert to paying customers.
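The estimate above is a simple funnel multiplication. A back-of-the-envelope sketch with placeholder numbers (the real interest and conversion rates came from the earlier test and existing product data; X and Y are intentionally not reproduced here):

```python
# Illustrative funnel inputs; none of these figures are edX's actual data.
active_learners = 10_000       # learners who would see the banner
interest_rate = 0.08           # share who clicked "save for later" in Part I
enroll_rate = 0.50             # assumed share of interested learners who enroll
paid_conversion_rate = 0.12    # historical program-enrollee purchase rate

# Expected program enrollments ("X") and paying customers ("Y").
expected_enrollments = active_learners * interest_rate * enroll_rate
expected_purchases = expected_enrollments * paid_conversion_rate

print(f"Expected program enrollments: {expected_enrollments:.0f}")
print(f"Expected paying customers:    {expected_purchases:.0f}")
```

Even a rough estimate like this gives the A/B test a concrete target to compare against, which is what made the later go/no-go revenue decision possible.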
How We Tested
Version A – a more developed banner ad that directed learners to more information about the full program and the option to enroll in all
Version B – no banner
Full program enrollments increased, which increased overall course enrollments several-fold (a metric closely followed by executives and board members). Conversion for these new enrollments loosely matched conversion rates for other users who enrolled in a full program.
Technically, the design was successful in that it increased revenue. However, the team decided not to pursue the concept further because the increase did not meet our revenue requirements: concepts needed to clear a higher revenue bar to justify shipping additional components in the experience.
A note about UserTesting
With unlimited access to UserTesting, our small team was able to do research with less time spent recruiting, collecting consent forms, and handling the other tasks involved in in-person research. While remote unmoderated testing is not a 1:1 replacement, it does provide opportunities to run a pilot, quickly improve a prompt, add additional screening qualifications, run simultaneous tests, or employ the RITE method (rapid iterative testing and evaluation) to quickly develop a design concept.
(This is not a paid promotion; I just really love the efficiency and approachability of research and validation with UserTesting.)
For information about my work experience and skills: