
Berkeley IT Department

As UC Berkeley launched a campus-wide AI initiative, one key challenge emerged: how to ensure the tools would be trusted and adopted by staff. I led a foundational research effort to uncover their needs, pain points, and motivations. These insights informed a strategic report and actionable design principles to guide the long-term, human-centered integration of AI across campus operations.

Role

UX & Research Intern

Timeline

Jun 2025 - Aug 2025

Team

1 UX & Research Intern

1 Project Manager

1 User Experience Designer

My Role

As lead UX researcher, I oversaw the full research lifecycle, from discovery to concept. I conducted 21+ interviews and an admin-wide survey (800+ responses), led thematic synthesis through affinity mapping, and translated insights into personas, strategy, and mid-fidelity wireframes.

Impact

The final user research report was presented to the CTO, helping shape the AI adoption roadmap and secure stakeholder buy-in.

The Challenge

A Tool Too Complex for Its Users

To kick off this effort, Berkeley piloted River, a custom AI assistant designed to streamline administrative tasks. Modeled after similar initiatives at peer institutions, River aimed to boost staff efficiency and reduce manual work.

However, fewer than 10% of the intended users actually adopted the tool. This revealed a deeper problem: efficiency alone isn’t enough. For AI to succeed in this context, it must be trusted, understood, and integrated seamlessly into existing workflows.


Why User Research Was Essential

Why Now: Aligning Stakeholder Needs Through Research


User Research

The Berkeley AI Hub (Sponsor)

As the primary sponsor funding the project, the AI Hub's main concern was return on investment (ROI). They needed to justify the budget for River AI and required quantitative outcomes to prove its value and necessity for the university.

The AI/Automation Team (Builders)

The AI/automation team was focused on implementing agentic workflows: AI-driven tools meant to automate repetitive tasks and enhance productivity. A key goal behind this initiative was to see whether staff could realistically use these workflows to streamline their existing processes.

The Berkeley IT Department (User Advocates)

Berkeley IT champions a human-centered approach to AI. Their priority was to ensure that River AI was not just technically proficient, but also usable, trustworthy, and valuable for staff. They wanted to demonstrate that user research was essential to AI integration from the start.

Affinity Mapping

Uncovering Core Pain Points and Opportunities

User Research

After 21+ interviews and 800+ survey responses, I had hundreds of data points, along with notes on gaps in campus IT support and tool effectiveness. To synthesize them, I used affinity mapping, a key step that turned scattered quotes into three core, evidence-based pain points that shaped the project.

Thematic Insight #1

A Need for Better Guidance on Effective Prompting

While staff were eager to use the tool, many struggled to prompt it effectively. Questions like “What phrasing works best?”, “What are the tool’s limitations?”, and “How are others using it successfully?” were common. This wasn’t a fringe issue: 68% of survey respondents said they needed clearer guidance and examples to improve their results.

Thematic Insight #2

A Need for Intuitive Workflow Automation

Another key insight from user interviews was that staff wanted to offload repetitive tasks to AI but were blocked by the tool’s complex, technical automation features. Because the interface was designed for technical users, it prevented the 52% of staff who wanted automation from using it effectively.

Thematic Insight #3

Addressing Data and Privacy Barriers in AI Adoption

Users were uncertain about how their data was handled due to unclear communication around privacy and security. This lack of transparency led staff to avoid using River AI for sensitive student or administrative tasks. Addressing this requires visible safeguards and clearer, more transparent communication to rebuild trust.

User Personas

Beyond the Data: The 3 Personas That Define AI Use

Our research revealed a clear insight: 'staff' isn't a single user type. Attitudes toward AI varied widely, from skepticism and fear to curiosity and excitement. To design for this diverse audience, we translated our interview and survey data into three personas grounded in real findings.

Competitive Analysis

Benchmarking & Best Practices

Market Research

To ground our recommendations in best practices and the broader landscape, we conducted secondary research into AI adoption at peer institutions, identifying successful strategies and common pitfalls.

Concept Testing

Exploring Future Designs

Concept testing was a key step for each major component of the redesign. For the sake of brevity in this case study, the following serves as a representative example of that process.

Conclusion

Next Steps

User Research

Validate with Usability Testing

Next, we’ll test high-fidelity prototypes with participants representing our three personas. We'll assess whether the designs truly address pain points around trust, clarity, and ease of use.

Phase the Implementation Roadmap

Post-validation, we’ll partner with PMs and devs to build a roadmap, starting with an MVP focused on trust. First-phase priorities include transparent data handling and privacy tools.

Expand Research & Track Impact

We’ll explore how teams like Finance or Student Affairs can apply AI in their workflows and re-run the campus survey 6–12 months post-launch to measure shifts in sentiment, trust, and adoption.

Love, Priya ©2025
