Pavanipriya Sajja
Challenge 3: Making UX Work Understandable to Engineers

I want to explain this challenge through a real experience I had while working on improving the developer experience for KServe.

Because this is where I truly understood:

👉 UX work doesn’t fail because of bad research.
👉 It fails when developers don’t understand it fast enough to care.



The Situation:

We set out to improve usability for developers using KServe, specifically focusing on how they deploy ML models, manage inference services, and work within Kubernetes-based workflows. As a UX team, we took a structured approach: we conducted a usability study, designed task-based scenarios, and collected feedback from real engineers to anchor our decisions in actual developer needs. We even received approval from maintainers, and everything seemed solid in theory. But when it came time to present this work to developers… we struggled.


What Actually Happened:

In one of our community interactions, we presented a set of usability tasks, our research approach, and what we wanted developers to do. From a UX perspective, everything felt clear and well-structured, but from a developer’s perspective, it wasn’t landing the same way. The reaction was subtle but telling: people didn’t ask many questions, feedback was minimal, and engagement was low. At first, it was easy to assume, “maybe people are just busy.” But looking deeper, the real issue became clear: we hadn’t explained the work in a way that aligned with how developers think.


Where We Went Wrong:


1. We presented too much information at once

We included multiple tasks, long end-to-end flows, and heavy contextual detail. From our side, it felt like organized research with a clear structure, but from the developer’s perspective, it came across very differently. It felt like: “This will take time. I’ll come back later.” And in real developer workflows, “later” often turns into never, which meant the key message and intent were easily lost in the volume.

2. We Used UX Framing Instead of Developer Framing:

We used UX framing instead of developer framing when explaining the work. We described it in terms of “usability study,” “user tasks,” and “feedback collection.” From our perspective, this language was standard and clear, but developers interpreted it very differently. They were instead thinking: “What exactly do you want me to test?” “How long will this take?” and “What part of KServe is actually broken?” The gap wasn’t in the research itself, but in the fact that we didn’t clearly answer the practical, implementation-focused questions developers needed upfront.

3. We Didn’t Anchor to Real Workflows:

We were talking about tasks in general, without clearly anchoring them in real developer actions. Instead of connecting them directly to things like deploying a model, defining Kubernetes manifests, or troubleshooting inference latency and failures, the examples stayed too abstract. As a result, the link to their day-to-day workflow was weak and not immediately obvious, making it harder for developers to see how the work applied to what they actually do.
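For context, “deploying a model” in KServe concretely means writing and applying an InferenceService manifest. The sketch below uses the well-known scikit-learn example shape; the service name and storageUri are illustrative, not from our study. Anchoring a usability task to an artifact like this makes it immediately recognizable to engineers:

```yaml
# A minimal KServe InferenceService: what "deploy a model" looks like in practice.
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: sklearn-iris            # illustrative service name
spec:
  predictor:
    model:
      modelFormat:
        name: sklearn           # the model server is picked from the model format
      # illustrative storage location for the trained model artifact
      storageUri: gs://kfserving-examples/models/sklearn/1.0/model
```

A usability task framed as “apply this manifest and tell us where you get stuck” lands very differently from “participate in a usability study.”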


The Turning Point:


After seeing low engagement, we changed our approach completely. Instead of trying to “present UX properly,” we asked:

👉 “How can we make this feel like a small, real developer task?”


What We Changed (And Why It Worked):

1. One Task at a Time:

We shifted to a one-task-at-a-time approach instead of sharing everything at once. Rather than overwhelming engineers with a full set of activities, we started sending one small, focused task per engineer. For example: try a specific setup step, tell us where you get stuck, and share what feels confusing during the process.

This approach worked because it reduced time commitment, made the request feel actionable, and fit naturally into their existing workflow. Instead of feeling like a large study or formal exercise, the task now felt simple and approachable like: “I can quickly try this.”

2. Short, Direct Communication (Often via DM)

We shifted to short, direct communication (often via DM) instead of relying only on meetings. Rather than scheduling long discussions, we reached out to engineers individually, sent simple, step-by-step instructions, and asked focused, specific questions. This approach changed the response rate significantly because it better matched how developers prefer to work. Developers tend to favor asynchronous communication, clear and explicit asks, and minimal overhead, so reducing friction made it much easier for them to respond and engage.


3. We Stopped Explaining “UX” — and Started Showing Friction

We stopped explaining “UX” in abstract terms and started showing real friction instead. Instead of saying, “We are conducting a usability study,” we shifted to something more direct and anchored in actual behavior: “We noticed developers get stuck during this step, can you try it and confirm?”

That small shift made a big difference because it made the problem real, made it relevant to their experience, and made it easier to respond without extra context or effort.

4. Strong Documentation Built Trust:

Once we started getting feedback, the next challenge emerged: “How did you reach these conclusions?” To address this, we improved how we documented our findings, focusing on clearly capturing what we observed, the patterns across users, the common failure points, and how the insights were derived. We made the reasoning explicit by showing a clear flow: Raw feedback → Patterns → Insight → Recommendation. This step became especially important because developers don’t just accept results at face value—they want to trace the logic and understand how each conclusion was formed. Present this documentation alongside the results so that anyone can see exactly how each conclusion was reached.

5. Repeating in Community Meetings:

Another important learning came from repeating in community meetings. In environments like KServe, not everyone attends every meeting, and the context resets frequently, which means people often miss earlier explanations. To address this, we re-presented the work multiple times, explained it again across different sessions, and shared updates incrementally over time.

At first, this approach felt repetitive, but it turned out to be essential. In reality, it significantly improved awareness, increased participation, and built stronger trust within the community.


A Simple Question That Changed My Entire Feedback Strategy:

One moment really changed how I think about presenting UX work. I was in a developer team meeting where I presented a persona, and at the end, I simply asked: “Can I get your feedback?” The response I received was unexpected: “What feedback do you want from us?”
That question caught me off guard. From my perspective, I had explained the work, walked through the persona, and assumed it was obvious what kind of input I needed. But from their perspective, it wasn’t clear at all; they needed more specific context and direction on what to focus on.

I was presenting in the same way I typically do in UX-focused meetings, assuming the format would translate. But developers don’t think in the same way, and they don’t automatically understand what a persona is, why it matters, or what kind of feedback is expected from them.
So when I asked for “feedback,” it was too vague and open-ended. And that’s exactly why the response came back as: “What exactly do you want from us?”

After that experience, I changed how I structured my presentation approach. Instead of jumping straight into the persona, I began explaining it step by step, starting with the fundamentals. I now start with the basics (very important)—clearly outlining what a persona is, why we are creating it, and how it will be used in the context of the work. Then I present the process of creating the persona from the research data, and finally I explain the actual persona that was built from that raw data.

For example, I created a persona of a hybrid engineer who deploys clusters across multi-cloud, on-prem, and edge environments.

When presenting this persona in a meeting, you can ask for feedback like this:

“If you are an engineer deploying clusters across cloud, on-prem, and edge environments, does this persona reflect your problems, goals, workflows, and tools?”

You can also ask:

“Does this persona represent your day-to-day tasks as a hybrid engineer, or is something missing?”

“Do these goals and pain points resonate with your experience?”

“Is there anything that doesn’t match your workflow or feels completely different?”

Based on their feedback, you can refine the persona by adding missing tasks, adjusting goals, or removing elements that don’t align with real-world workflows.

What Changed After This: Once I started asking specific, targeted questions, things began to shift. Developers responded more frequently, feedback became more detailed, and conversations turned more meaningful and insightful.

Instead of silence, I started hearing responses like: “This part is correct, but we also struggle with…”, “We don’t use this tool—we use something else,” and “This step is missing in real workflows.”

Key insight: The problem wasn’t that developers didn’t want to give feedback—it was that I didn’t clearly ask for the right kind of feedback.

When you explain the context, show your process, and ask specific, targeted questions, developers know exactly how to respond. And that’s when UX feedback actually starts working.


What Changed After This:

Once we adapted to developer behavior, the quality of interaction changed noticeably. Feedback became more specific, engineers engaged more actively, and conversations became deeper and more meaningful. Instead of silence or minimal responses, we started hearing things like: “Yeah, this step is confusing because…”, “I had to check another doc for this”, and “This part could be simplified.”

That’s when it became clear: 👉 the UX work itself didn’t change — the way we communicated it did.


The Real Insight:


This experience completely changed how I think about presenting UX in DevEx. It’s not about having perfect slides, using heavy UX terminology, or relying on structured frameworks. Instead, it’s about something much more practical and aligned with the audience.

👉 It’s about making your work feel like part of a developer’s workflow

Conclusion:

If developers don’t understand your UX work, it’s not because they don’t care—it’s usually because it doesn’t fit their mental model, doesn’t align with their workflow, or feels too heavy and abstract to act on.

But when you break things into small tasks, show real friction points, speak in terms of their actual work, and provide traceable logic behind decisions, something shifts.

UX stops being seen as “design work”… and starts being understood as engineering value.
