Scholarly, an AI Assistant for Instructors
Led the end-to-end design of a research prototype, conducting user research on AI's impact on teaching
OVERVIEW
Scholarly is an AI chatbot designed to support instructors, especially those new to AI, in their pedagogy. I conducted user research with instructors and pedagogy experts and led a research study where participants used our tool. Through a blind comparison of two versions of the tool, participants' insights directly informed future design changes and surfaced important findings about how instructors interact with AI.
ROLE
UI/UX Designer
TIMELINE
Mar 2025 - Aug 2025
TEAM
3 Researchers 3 Developers 1 Pedagogy Expert
TOOLS/SKILLS
User Research, Participatory Design Sessions, User Interviews, UI/UX Design, Figma
A Quick Glance at the Results
Final Impact
PROBLEM STATEMENT
How might we design an AI tool that not only supports instructors' needs but also helps us understand how instructors with varying AI attitudes and levels of teaching experience adopt AI tools?
Instructors lack on-demand, 24/7 support for lesson planning, student feedback, and classroom challenges. Many educators have low AI literacy, which makes it harder to adopt, trust, and meaningfully integrate AI into their pedagogy. Differing levels of teaching experience also shape how instructors expect to interact with AI, yet most tools don't account for these differences.
Goal
Lead UI/UX design that resonates with diverse instructor needs
User Research on Two User Profiles
The team identified two key user profiles: AI Novices, who are open but inexperienced, and AI Conservatives, who are skeptical and hesitant. To inform Scholarly's design, the team conducted 10 user interviews in which participants walked through storyboards and gave feedback. Users valued transparency through reference links and stressed the need for privacy controls, including data deletion. They also wanted the tool to support brainstorming with contextual knowledge while staying alert to misinformation. These insights directly shaped how we refined the tool.
Integrating Competitive Analysis into Wireframes
I analyzed competitor layouts and identified reusable components like the sidebar and chat box to create a familiar, low-friction experience. Initial wireframes for the chat and dashboard emphasized simplicity for novice users. Through multiple iterations, we refined navigation, resource surfacing, and overall user support.
Initial Design System & Final Visuals
I developed a modular design system to support future scalability, including accessible type styles, components, and iconography. This ensured consistency across screens, streamlined collaboration with developers, and created a foundation for future feature expansion.
With the design system in place, I moved on to creating the sign-up flow, which put the system into practice. This not only helped me visualize how components worked together but also validated consistency, accessibility, and scalability in a real user flow.
Onboarding
I explored multiple onboarding flows under key constraints. The onboarding data needed to provide context for tailored chatbot suggestions, but parsing free-form responses risked errors from typos, grammar, or insufficient input. Early iterations asked instructors to describe their disciplines in a free-form text field, but we shifted to three essential inputs (years of experience, classes taught, and AI attitude), which aligned with our pedagogy and AI stakeholders. The "classes taught" field remains a flexible short-answer entry, letting users add as many classes as needed to guide the chatbot effectively.
Chat Interface
To support users with low AI literacy, I added coach marks that reduced friction and aligned with the stakeholders' vision. A plus icon in the chat box lets instructors tag courses, helping them organize conversations. This feature also met the dev team's needs by ensuring data stayed correctly linked to each class.
Dashboard
One key feature of the tool is referencing pedagogical resources within chats. I first explored a modal design, but it covered the chat and disrupted flow. I shifted to a dynamic side window, allowing users to view resources and chat history simultaneously. This window also supports generated materials like lesson plans, reducing friction by keeping everything in one place.
I designed the dashboard to surface relevant resources from chat history and provide cards for managing class and demographic data. To address privacy concerns, users can remove anything the AI learns from conversations. The dashboard also links to the respective institution's Learning Center, extending support beyond the tool with in-person resources.
User Testing
As part of the team's research study, I led 8 usability tests with instructors who varied in teaching experience, AI attitude, and demographics. The study was a blind comparison of two versions of our tool. The Socratic version uses shorter, more frequent conversation turns, often posing questions meant to encourage user reflection. The Narrative version gives significantly longer responses and makes use of examples and stories.
User Insights
Overall, users were satisfied with the tool and its user experience. Many liked the familiar design, validating my competitive analysis; others appreciated the ability to tag courses and save resources. Through these sessions, the team found that instructors with more years of experience preferred the Narrative version, as they were looking for concrete examples they could implement in class. Instructors with fewer years of experience preferred the Socratic version, as it provided more direction.
TAKEAWAYS
REFLECTION