swiping bias
UCRE (User-Centered Research and Evaluation) is a class offered to students in the Master of Human-Computer Interaction program at Carnegie Mellon University. The class focuses on teaching students user-centered research and evaluation methods.
GOAL: Provide a recommendation for incentivizing users to audit algorithmic bias in their feeds.
AUGUST 2021 - DECEMBER 2021
My Role: user research, storytelling, affinity mapping, synthesis, ideation, product design
Tools: Miro, Figma, GSuite
Team: 4 Researchers — Sreya Cherukuri, Mandy Lin, Krystal Zeng, Cameron Wu
Deliverable: Final Pitch & Recommendation
executive summary
Have you ever seen content on your feed you didn’t like, but scrolled past it because you didn’t know what to do? With technological innovation and AI development accelerating every day, our team’s goal is to make users active participants in creating more equitable and fair technologies.
SOLUTION: Our team believes that with changes to information architecture and visibility in an app such as TikTok, we can directly incentivize users to become drivers in identifying algorithmic bias in the applications we use every day, making the digital world a safer space for everyone.
problem
What are ways in which we can incentivize everyday users to uncover harmful algorithmic biases in AI systems and assist AI/ML teams in addressing these issues?
research methods
Our team used a range of generative research methods to draw insights from our interviewees.
Each method surfaced new assumptions and new perspectives on the problem at hand. We then redefined our problem space and narrowed the scope of our research project.
reframing our research question
how might we improve the current TikTok reporting system to create more incentive for users to report?
insights
Through our research, we uncovered the following insights while evaluating and validating our ideas. Building on prior findings, we focused on the distinction between the existing “Not Interested” and “Interested” features.
01 Users prefer low-friction interactions when browsing TikTok, which matches their entertainment-driven use of the app.
02 Users generally go through multiple steps or touchpoints before finding the reporting option; the effort required further reduces their incentive to report.
03 Users rarely feel that “Reporting” is the appropriate action for the situation at hand, because “report” connotes very severe situations.
04 Users are largely unaware of the built-in features for mitigating bias.
05 When users do recognize the built-in features, they want to better customize the auditing process by specifying their reasons.
refining solutions
Through our storyboarding and speed-dating process, we focused on three recurring insights that users identified as most important to them when evaluating potential solutions.
solution
As researchers, we proposed a reimagined TikTok reporting interface that incorporates the elements our research showed users would find valuable. To begin, the “Report” and “Not Interested” features are made accessible from the surface-level TikTok interface, since the distinction between them is extremely important to users.
For “Not Interested”, users reacted positively to a “swipe left, swipe right” approach for indicating whether they were interested in a given post. Upon swiping either way, users are prompted for more information about why the post did or did not fit their interests.
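To make the proposed swipe interaction concrete, the sketch below shows one way the flow could work. It is a minimal, illustrative sketch only, not TikTok’s actual implementation; all names in it (handleSwipe, FeedbackEvent, the sample reason options) are hypothetical.

```typescript
// Hypothetical sketch of the swipe-to-rate flow: a horizontal swipe
// marks a post as "Interested" (right) or "Not Interested" (left),
// then the user is prompted for an optional reason.

type SwipeDirection = "left" | "right";

interface FeedbackEvent {
  postId: string;
  interested: boolean;
  reason?: string; // optional, user-specified reason
}

// Sample reason options the prompt could offer; illustrative only.
const REASONS = [
  "Not relevant to me",
  "Offensive or biased",
  "Seen too often",
  "Other",
];

function handleSwipe(
  postId: string,
  direction: SwipeDirection,
  pickReason: (options: string[]) => string | undefined
): FeedbackEvent {
  // Swiping right signals interest; swiping left signals disinterest.
  const interested = direction === "right";
  // Either way, prompt for a reason so the feedback is specific
  // enough to help surface patterns of algorithmic bias.
  const reason = pickReason(REASONS);
  return { postId, interested, reason };
}

// Example: a left swipe where the user selects a reason.
const event = handleSwipe("post-123", "left", () => "Offensive or biased");
console.log(event);
// { postId: "post-123", interested: false, reason: "Offensive or biased" }
```

Pairing a lightweight swipe with an optional reason keeps browsing friction low (insight 01) while still producing the specific signals that AI/ML teams need to act on (insight 05).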
As for the “Reporting” function, it would be added to the main TikTok screen alongside the “Share” function. It was previously buried among the options presented under “Share”; by making it immediately visible and accessible and eliminating the intermediate steps, we hope to see an increase in the use of “Report.”