I've got to disagree with one point here. More so than talking to your users, watch your users. We began doing UX testing sessions about a year ago, and the insight we've gained has been invaluable. That being said, what users say while they're doing a UX session rarely correlates with what they do.
Watching users can be better than talking.
Is it practical? How do you do that and when?
Think about the cases where we need this most, and at different stages of a project.
Nonetheless, the point was to involve users in the process to help you make better decisions, whatever medium you choose to collect that info.
Doing both is probably best. You need to talk to users to know what they want and what problem you're solving. But to learn how users use their computers and how they currently manage the problems your software is supposed to solve, it's better to watch them. If you asked me how I interact with my computer every day, I'm not sure I could describe it well enough to be usable.
I remember a good example of this. One of our operators was reporting problems with one of our kiosk apps: a lot of app crashes and corrupted local SQLite database files. It was obvious "something" was damaging files on the filesystem. We dismissed the usual suspects (virus, bad HDD...), but it didn't help. We even rewrote how we handle corrupted db files — didn't work. In the end the tech guys went there and observed how the local operator operated the machine. It wasn't a surprise when they reported the operator was shutting down the machine by unplugging the power cable from the wall. The operators never mentioned that, because they took it for granted and figured it was trivial and not important.
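For what it's worth, SQLite can be hardened against exactly this kind of abrupt power loss. A minimal sketch of what "handling corrupted db files" might look like, assuming a Python app using the stdlib `sqlite3` module (the function name `open_robust` is made up for illustration — the original post doesn't say how their app was written):

```python
import sqlite3

def open_robust(path):
    """Open a SQLite database hardened against abrupt power loss."""
    conn = sqlite3.connect(path)
    # WAL journaling tolerates crashes and power cuts better than the
    # default rollback journal.
    conn.execute("PRAGMA journal_mode=WAL")
    # FULL fsyncs on every commit, trading write speed for durability.
    conn.execute("PRAGMA synchronous=FULL")
    # Detect pre-existing corruption before the app starts using the file.
    status = conn.execute("PRAGMA quick_check").fetchone()[0]
    if status != "ok":
        conn.close()
        raise RuntimeError(f"database {path} failed integrity check: {status}")
    return conn
```

Of course, none of this would have revealed the root cause here — only watching the operator did — but it can keep a yanked power cable from taking the whole database with it.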
Disagree was a bit too strong of wording; rather, if you can watch your users, you should absolutely take the chance to because it's even better than talking to them.
We build enterprise software, so it's easier for us to go about this.
When we first started doing UX sessions, we created a few scripts that go through basic functionality, but soon found those to be too far-reaching and time-consuming.
Now, whenever we are creating a new feature or testing a new layout, we invite several customers — particularly any who requested said feature or complained about the layout — to test drive it on an internal server, sometimes with a copy of their database if they have specific use-cases, during a screensharing session. Generally, we'll pick 3-5 customers and do a 1-hour session with each. Our development team and the sales team watch the sessions and take notes, while our project manager guides the client through a script. We take all the feedback, triage it, implement any changes, and then invite them back to test drive again before rolling it out.
We're still learning ourselves, but found the advice in this blog to be really beneficial to the way we conduct our sessions.