Background and Goals
Our goal was to investigate and evaluate the Dropbox webapp to identify user experience issues and provide recommendations for improvement. Our objectives were as follows:
- Test conceptual/mental models of Dropbox cloud storage and sharing
- Identify pain points and usability issues
- Evaluate efficiency, learnability, and intuitiveness
- Compare the performance of experienced and novice users
- Provide recommendations for future improvements
Process
A multi-step process was used to build an understanding of Dropbox users and their pain points, which informed our insights and suggestions for improvement.

An expert evaluation was conducted to understand users' tasks and goals and to gain initial insights into problem areas of Dropbox. Two test scripts were designed that combined functionality important to novice and experienced users. These scripts combined a variety of tasks, knowledge probes, and interview questions to assess participants' experience with Dropbox.

User testing was conducted via Zoom with five participants (three novice and two expert users) following three sections of the scripts: an introduction and pre-task interview; five interaction-heavy tasks highlighting various Dropbox features, plus 'free play' with the product; and a post-task interview.

Afterward, an analysis of the study results for both novice and experienced users guided our insights and recommendations for improvement.
Procedure
During each session, a moderator led participants through the test scripts. Two data loggers recorded observations of participants' responses and behaviors, while a confederate responded dynamically to tasks to simulate a collaborative environment.

Measures of this study included:
- Performance measures: success rate, selection accuracy, number of errors, and number of mouse clicks/taps
- Behavioral measures: facial expressions and body language, seeking help, and verbal responses and comments while performing tasks
- Subjective measures: open-ended question on self-performance, intuitiveness probes, satisfaction scale, and ease of use scale
Insights and Recommendations
Based on the findings and analysis of our usability tests, major areas of concern were identified. These issues were categorized under the usability principles of discoverability, conceptual model, negative affordances, value proposition, information hierarchy, feedback, and complexity. Recommendations for improvement were provided alongside the discussion of each issue.

My Learnings
What I learned from this project:
- The importance of triangulation: "filling in the circle" to build a holistic understanding of user needs. This included diversifying task types and using performance, behavioral, and subjective measures to collect both qualitative and quantitative data, so we could provide actionable insights rather than a data report.
- How to create strong study scripts for each user type, using tasks, goals, and knowledge and perception probes to gain a comprehensive understanding of their experiences and pain points
- Pilot studies are vital to ensuring sessions run smoothly and as expected