We conducted comprehensive user testing before revising our design, creating distinct tasks for each of our three target stakeholder groups.
First, we developed quantitative tests, administering System Usability Scale (SUS) surveys before and after each task to capture user feedback.
Because the previous stage included more donors than vendors and requesters, we also conducted qualitative user testing sessions, mainly with vendors and requesters (with assistance), to learn which components of our prototype we could improve to enhance their experience.
Quantitative Usability Tests
I conducted 7 of the 27 usability tests that produced our quantitative findings. Most users completed each task within 10 minutes, though users who required assistance had difficulty finishing the tasks.
During the design iteration stage, we ran a statistical analysis of the results to test for significance. Our goals are to improve navigation and simplify interactions on the main page; the full report includes the detailed statistical analysis.
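The before/after SUS surveys mentioned above are scored with the standard formula: each of the ten items is rated 1-5, odd items contribute (rating − 1), even items contribute (5 − rating), and the sum is scaled by 2.5 to a 0-100 score. A minimal sketch of that scoring (the `sus_score` helper is illustrative, not part of our codebase):

```python
def sus_score(ratings):
    """Score one SUS response sheet: ten items rated 1-5, returns 0-100."""
    if len(ratings) != 10 or not all(1 <= r <= 5 for r in ratings):
        raise ValueError("expected ten ratings on a 1-5 scale")
    # Odd-numbered items (index 0, 2, ...) contribute (rating - 1);
    # even-numbered items (index 1, 3, ...) contribute (5 - rating).
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(ratings))
    return total * 2.5

# Example: a fairly positive response sheet.
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 2]))  # → 82.5
```

Scores above roughly 68 are conventionally read as above-average usability, which is why we report per-task averages rather than raw item ratings.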
Qualitative Usability Tests
To gather richer feedback from stakeholders, particularly requesters in need of assistance and campaign vendors/managers, I conducted 3 of the 7 qualitative usability tests. These tests consisted primarily of in-person and remote interviews with think-aloud sessions.
After all user testing sessions were complete, we summarized our findings and evaluated the current design against the Nielsen Norman Group's 10 Usability Heuristics.
The first set of tasks involved creating an account and entering the information required for account verification. Participants suggested additional functionality, flagged unintuitive interaction methods, and commented on the stakeholder-selection step.
The second set of tasks focused on creating new requests, publishing new campaigns, and managing campaigns. Requesters and vendors shared valuable feedback from their experience with similar platforms and suggested ways to make different parts more intuitive.
We summarized our user testing feedback in business metrics and UX metrics.
Among the business metrics, click-through rates for relevant sample campaigns increased significantly. Compared with the previous platform, the redesigned interface helps each stakeholder group reach their target pages far more efficiently.
The UX metrics were mostly positive, with some areas for improvement. Overall, users spent less time on each task, and their satisfaction rates increased significantly. The campaign page still has room to improve, such as offering an at-a-glance version alongside a more detailed view, depending on a user's interest in a specific donation project.
↑ 72% Satisfaction
when a donation campaign was completed. Additionally, 78% of participants reported increased confidence in understanding and managing the entire donation process.
↓ 65% time
searching for target charity campaigns from the main page. Additionally, over 70% of participants reported spending less time completing a donation compared with previously complicated donation processes.
“You all have done an exceptional job, team! 😊 This service is absolutely amazing, and I'm genuinely thrilled to dive into the next steps.” - Shiva Thirumazhusai, CTO of TISTA
“Projects like this show how technology can positively impact society by addressing challenges and improving well-being. It demonstrates the potential of technology in shaping a better future.” - Sharlane Cleare, Course Coordinator, Cornell University