
Berkeley IT Department
The Registrar’s Office experiences a high volume of repetitive, routine inquiries, which consume roughly 60% of its administrative support capacity. This volume reduces responsiveness to time-sensitive cases, stretching response times to as long as two weeks. To address this, I collaborated on the design of an AI search experience that lets students find answers to common questions directly within the student dashboard, reducing the burden of routine support requests on staff and freeing them to focus on more complex or urgent cases.
Role
Product Design Intern
Timeline
Jun 2025 - Aug 2025
Team
2 Product Design Interns
1 Project Manager
2 Engineers
My Role
Researched and designed a generative AI search feature in the student dashboard to deflect routine inquiries and reduce administrative workload.
Impact
In usability testing with seven students, the AI search feature reduced the average time to find answers by 40%, suggesting it could meaningfully reduce routine support requests.
An Overview
A Case of Administrative Burnout
UC Berkeley’s student dashboard contains critical academic and administrative information, but unclear navigation and a weak information hierarchy make that information difficult to discover. As a result, students frequently contact the Registrar by email or phone for routine questions, creating a support bottleneck that delays responses to time-sensitive issues such as financial aid holds and enrollment problems.
Information in the student dashboard is buried in long, hard-to-navigate lists spread across five pages, leaving much of it effectively undiscoverable
User Research and Synthesis
To better understand how students discover and access critical information, we conducted interviews with 14 UC Berkeley students, examining how navigation and information hierarchy affect their ability to answer routine academic and administrative questions independently.
Students Are Unaware of Existing Information in the Student Dashboard
Students are often unaware that the student dashboard contains answers to common questions, such as policies and deadlines. As a result, they contact the Registrar for support even when the information already exists in the dashboard.
Mismatch Between Student Language and Dashboard Terminology
Students search using everyday language, while the dashboard uses official campus terminology. The mismatch causes their searches to fail, and they default to contacting the Registrar for help.
High Volumes of Routine Inquiries Delay Time-Sensitive Support
Registrar staff reported that approximately 60% of incoming requests concern routine questions such as deadlines or links. This volume reduces visibility and response speed for more time-sensitive cases.
Meet Eva
Eva represents a common Berkeley student archetype: busy, capable, and unfamiliar with institutional jargon. Through her needs, we observed that students and the university speak two different languages: students ask questions in plain English (“I’m sick, what do I do?”), while the university organizes information by official codes (“Medical Withdrawal Policy”), making the AI’s most important role not link retrieval but real-time translation of student intent into the correct campus answers.
Competitive Analysis
I audited 'answer engines' like Perplexity and enterprise search tools like Glean and identified that trust breaks down when AI operates as a black box, highlighting the need for verifiable, citation-backed answers that clearly show students like Eva where information comes from.
Key Opportunity Area: Transform CalCentral into a trusted AI discovery experience that delivers direct answers while using fuzzy matching and related-topic suggestions to reduce reliance on registrar support.
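Fuzzy matching of this kind can be prototyped with nothing beyond the standard library. The sketch below is a minimal illustration, assuming a hypothetical glossary of official terms (not Berkeley’s actual taxonomy), using Python’s `difflib` to map a misspelled everyday query to its official counterpart:

```python
import difflib

# Hypothetical official dashboard terminology -- illustrative only,
# not the real CalCentral taxonomy.
GLOSSARY = [
    "Medical Withdrawal Policy",
    "Enrollment Verification",
    "Emergency Loan Application",
    "Degree Audit Report",
]

def suggest_terms(query: str, cutoff: float = 0.6) -> list[str]:
    """Return official terms that loosely match a student's query,
    best match first, ignoring case."""
    lowered = {term.lower(): term for term in GLOSSARY}
    matches = difflib.get_close_matches(
        query.lower(), list(lowered), n=3, cutoff=cutoff
    )
    return [lowered[m] for m in matches]

# A typo still resolves to the official term.
print(suggest_terms("medical withdrawl"))  # -> ['Medical Withdrawal Policy']
```

A production version would layer synonym dictionaries or embeddings on top, since pure string similarity cannot bridge a gap like “I’m sick, what do I do?” to “Medical Withdrawal Policy.”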
Ideation
Guided by our user research and competitive insights, we explored ways to translate student questions into verified answers, prioritizing clarity, trust, and transparency over opaque, black-box AI behavior.
Iterative Prototyping
We iterated on search and results states to refine breadcrumb metadata and in-context linking, ensuring students could clearly trace answers back to their source and navigate seamlessly within the dashboard.
Keyword-Based Search
“Popular Right Now” smartly surfaces timely student needs

“Search for Keywords…” signals a basic database search, not an AI Search

"Popular Right Now" poses an institutional risk if trending queries surface inappropriate or misleading content
Guided Discovery
The “What can we help you find?” section makes the interface feel more assistive
The lightbulb icons signal smart, suggested queries, helping users understand how to engage with the AI.

The blue styling of the smart suggestions implies external linking, which can mislead users about the interaction
Natural Language Search
The placeholder “Ask a question…” encourages users to move beyond keyword-only search
Concrete example queries (e.g., “How do I add a delegate?”) teach users the system’s capabilities without requiring extensive exploration on their own
Search Dead-End
The message is direct about system failure, avoiding misleading or incorrect AI-suggested results

The lack of suggestions, however, creates a dead-end experience, forcing students to guess the correct keyword

Poor visual hierarchy makes the error message easy to miss
Suggestion-Based Recovery
“Did you mean” suggestions prevent dead ends by giving users an immediate path forward
Clear iconography and spacing signal a state change

Requiring an extra click to view likely-intended results adds unnecessary friction
Smart Correction
Automatically shows results for the corrected term while clearly labeling the change
“Related Searches” and “Recent Searches” add helpful academic context
Clear source labels help students quickly understand what information they’re seeing
Filter-First Layout
Follows a traditional departmental structure that feels familiar to users of the old CalCentral

The vertical density is overwhelming, requiring the student to scroll through a wall of text to find an answer
Organization by Categories
Clearly links search terms to results, giving students a predictable navigation path
Breadcrumbs show source metadata, helping students understand where information is permanently located.

Uniform blue links make functional tools like calculators look identical to static pages, reducing scannability
Contextual Task Surface
Surfaces actions like "Apply for an Emergency Loan," letting students jump straight to the relevant task
A two-column grid maximizes screen space without feeling cluttered
The combination of location-based icons (e.g., the Home icon for "My Dashboard") and highlighting allows for instant verification of what the result is and where it is located
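The breadcrumb and source-label pattern above can be modeled as a small result schema. The sketch below is a hypothetical data shape (field names, the example breadcrumb path, and the "tool" vs. "page" split are assumptions, not the production model), distinguishing interactive tools from static pages so they can be rendered differently:

```python
from dataclasses import dataclass

@dataclass
class SearchResult:
    title: str
    breadcrumb: list[str]  # path to where the item permanently lives
    kind: str              # "tool" (interactive) or "page" (static)

    def label(self) -> str:
        """Render the source trail shown under each result."""
        return f"[{self.kind}] {self.title} · {' › '.join(self.breadcrumb)}"

# Hypothetical result for the emergency-loan task
loan = SearchResult(
    "Apply for an Emergency Loan",
    ["My Dashboard", "My Finances"],
    "tool",
)
print(loan.label())
# -> [tool] Apply for an Emergency Loan · My Dashboard › My Finances
```

Carrying the `kind` field through to the UI is what lets a calculator or application form get a distinct icon from a static policy page, addressing the scannability issue noted above.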
Final Prototypes
Direct Intent Retrieval
This prototype surfaces tasks such as emergency loan applications when a student’s query matches institutional data, providing direct access to relevant actions or information to streamline task completion.
Intelligent Search Correction
In this scenario, the design aligns everyday student questions with official university policies. The system prevents dead-ends by automatically finding the relevant result, ensuring the student never feels lost due to a naming mismatch.
Reflections
Working Around Technical Constraints & Limitations
Throughout the project, I worked closely with two engineers to examine the dashboard’s data architecture and integration limits, identifying where institutional technology restricts real-time querying, data verification, and UI flexibility. These constraints informed design decisions around feature scope and possible user interactions.
Trust is Important in AI for Institutional Contexts
User research taught me that students strongly prefer transparency and clear citations; without them, even a technically accurate system can feel like a “black box,” reducing adoption and confidence. Working within Berkeley’s systems taught me how to design AI that is both functional and trustworthy, balancing technical limits against the need for transparent, verifiable answers.