
Introduction

At the beginning of 2024, I led a research initiative that shaped our team’s planning and priorities for the year. The goal was to deeply understand user needs, align our work with team objectives, and identify the most valuable opportunities for improvement.

It’s important to highlight that this was a collaborative effort: everything we accomplished, we accomplished as a team. As the lead, I was actively involved in decision-making and in defining the research methodologies. Together, we gathered insights from usability tests, 25 user interviews, surveys, and product analytics, and centralized all findings in one accessible location.

We also facilitated several workshops and created detailed user personas based on real data. These personas helped us uncover key pain points, user needs, and behavioral patterns, grounding our strategy in evidence rather than assumptions.

As a result, we were able to:
  • Prioritize tasks using impact vs. effort assessments
  • Focus on initiatives that would deliver the most value to users and the business
  • Identify 10 high-priority features and complete or actively progress 8 of them by the end of the year
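An impact-vs.-effort assessment like the one above can be reduced to a simple scoring pass. The sketch below is a hypothetical illustration only; the feature names and scores are invented, not from the actual research.

```python
# Hypothetical impact vs. effort prioritization sketch.
# Each feature gets an impact and an effort score (1-5, higher = more);
# features are ranked by impact relative to effort.

features = [
    {"name": "AI support chat", "impact": 5, "effort": 4},
    {"name": "FAQ repository", "impact": 4, "effort": 2},
    {"name": "Docs cleanup", "impact": 3, "effort": 3},
]

def prioritize(items):
    """Sort features by impact/effort ratio, highest value first."""
    return sorted(items, key=lambda f: f["impact"] / f["effort"], reverse=True)

for f in prioritize(features):
    print(f["name"], round(f["impact"] / f["effort"], 2))
```

In practice the team weighed these dimensions in workshops rather than with a script, but the ranking logic is the same: high-impact, low-effort work rises to the top.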

User Value Scale of features


Uncovering a Real Problem: Support

During internal workshops with DDS team members and stakeholders, support emerged as the most-voted topic and a primary pain point for the team. It also appeared in the User Value Scale of features research mentioned previously.

Key issues included:
  • Lack of code examples
  • Unclear support user journeys
  • Repeated questions in Teams
  • High volume of direct messages to support
  • No dedicated tool to manage support efficiently

These insights were consistent with earlier research, where users expressed frustration with long response times, difficulty finding answers, and confusion around where to ask questions. Much of the information already existed—often buried in Teams threads—but wasn't easy to find or access.


Support Workshop with the team


Revisiting Past Research

To deepen our understanding, we reviewed:
  • User Research Desk findings: showed users struggled with understanding the support process, ticket classification, and status visibility.
  • Feedback system analysis (2024): categorized requests from Teams, the forum, individual chats, and feedback-form tickets. Most requests were component-related and could have been resolved with existing answers.
  • User interviews: revealed confusion due to lack of documentation and unclear entry points for support. Support was scattered across five main channels.
  • Competitive analysis: suggested that instead of building a static FAQ, we should improve site content and navigation to make answers easier to find.

Mapping the Support Process

To visualize and understand the internal experience, we facilitated a Mapping Support Process Workshop with team members directly involved in DDS support. The goal was to understand how the support process currently works, visually map it in Miro, and gather feedback on what was working, what wasn’t, and what could be improved.

Findings:
  • Strengths:
    Microsoft Teams fostered empathy and collaboration
  • Pain points:
    unclear user requests, repeated questions, outdated documentation, lack of tracking
  • Requirements:
    better data collection, user education, clearer documentation
  • Ideas:
    AI suggestions, improved FAQs, forums, scripts, feedback form redesign

These sessions validated the need for a more scalable, structured support solution.



Final Support Process

Ideation and Concept Development

Next, we entered the ideation phase, co-creating possible solutions with support team members. We used the Crazy 8s method to quickly sketch ideas. After analyzing and prioritizing them, three main directions emerged:

  1. Improve website documentation
  2. Build a better FAQ repository
  3. Launch an AI-powered support chat (within Dell's privacy constraints)

Given limited resources, we adopted an MVP mindset. The AI assistant became our focus, as it could address repeated questions from Teams and improve user self-service.


Ideation Workshop

Building the AI Support Chat MVP

We collaborated closely with developers and Dell’s AI team, who were also building a chat assistant using the Dell Design System. To ensure consistency, we aligned our design patterns with theirs, strengthening future collaboration opportunities.

Due to privacy rules, the chat could only be accessed by internal users. Rather than investing in a complex login system, we opted for a full-page experience. Two MVP versions (v1 and v2) were created to show progress, test ideas, and demonstrate value early.

Curating the Knowledge Base

While developers configured the AI, we gathered content by analyzing months of Teams conversations. We identified frequently asked questions and, using the dev-provided format, created structured, clear answers for the assistant.

This made the AI immediately useful, answering real, repeated questions users often searched for. It also helped reduce the burden on our support team.
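The source doesn't specify the dev-provided format, but curated knowledge bases of this kind are often kept as structured question/answer records. The sketch below is a hypothetical illustration of such entries, paired with a naive keyword lookup; the entry contents and field names are assumptions, not the actual format the team used.

```python
# Hypothetical knowledge-base entry format and a naive keyword lookup,
# illustrating how curated Q&A pairs can power an assistant's answers.

knowledge_base = [
    {
        "question": "How do I install the button component?",
        "keywords": ["install", "button"],
        "answer": "See the component installation guide on the docs site.",
    },
    {
        "question": "Where do I report a bug?",
        "keywords": ["report", "bug"],
        "answer": "Use the feedback form; include the component name and version.",
    },
]

def lookup(message):
    """Return the answer whose keywords best overlap the user's message."""
    words = set(w.strip("?.!,") for w in message.lower().split())
    best = max(knowledge_base, key=lambda e: len(words & set(e["keywords"])))
    return best["answer"] if words & set(best["keywords"]) else None
```

A production assistant would use retrieval over embeddings rather than keyword matching, but the curation step is the same: each repeated Teams question becomes one clear, structured answer.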


Usability Testing and Iteration

Once the MVP was ready, we tested it with 5 designers and 5 developers using tailored personas. The feedback was clear: the tool was promising, but it needed more content.

In response, we:
  • Enhanced the database with existing website documentation
  • Refined AI answers for clarity and usefulness

The AI chat is now capable of handling straightforward support questions—especially those commonly repeated in Teams.


Usability tests

Looking Ahead

Our long-term goal is to:
  • Improve and centralize documentation
  • Launch a more robust FAQ section
  • Transition users from Teams to the AI chat and feedback form for support

This will streamline the experience for users and reduce manual effort for the support team.


Fast Fix: Microsoft Teams Copilot

We also implemented Microsoft Teams Copilot to reduce back-and-forth. Support team members often had to ask for basic context (e.g., role, product, team), so we used Copilot to extract this from user messages automatically. It saved time, improved efficiency, and made early case handling smoother.


Conclusion

This project highlights how design, research, and collaboration can drive meaningful change, even with limited resources. By focusing on real user needs, involving the right stakeholders, and delivering iteratively, we introduced a scalable support solution aligned with Dell’s broader design ecosystem.

Our AI support assistant is only the beginning. With continuous iteration, stronger documentation, and smarter tools, we're laying the groundwork for a more efficient, user-friendly support experience across the Dell Design System.