You can only automate front-end testing so much. This looks like it would save a lot of time, but there's still some level of "does it look right". You still need human judgment in the loop.
Mmhm, you do - you still need human eyes to check if buttons are in the right place, especially in Safari... We try to automate not all of it, but as much as we can, so things are a little saner. If a machine can run regression tests massively in parallel, that means we can clear our pre-flight checklist in minutes instead of hours.
DOM snapshot testing can answer the "does it look the same" question efficiently, so that you know where to look to see if it now looks better or worse.
Makes total sense, yeah. I suspect there are some things the human eye does that can be automated too, at least partially.
E.g. "is this thing I just clicked on actually visible" seems like a thing one could check automatically.
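That check is mechanical enough to sketch. In a browser you'd feed it the results of `element.getBoundingClientRect()` and `getComputedStyle()`; here the element shape is an assumed plain object standing in for those, so the heuristic runs outside a browser. It's deliberately incomplete - it ignores z-index stacking, clipping ancestors, and transforms, which real tools handle.

```javascript
// Heuristic "is this element actually visible" check, of the kind a test
// could run right after clicking something. `el.rect` mimics
// getBoundingClientRect(); `el.style` mimics a few getComputedStyle() fields.
function isActuallyVisible(el, viewport) {
  const { rect, style } = el;
  // Hidden via CSS?
  if (style.display === "none" || style.visibility === "hidden") return false;
  if (parseFloat(style.opacity) === 0) return false;
  // Collapsed to zero size?
  if (rect.width === 0 || rect.height === 0) return false;
  // At least partially inside the viewport?
  return (
    rect.left < viewport.width && rect.right > 0 &&
    rect.top < viewport.height && rect.bottom > 0
  );
}
```

Even this crude version would catch the common regressions - an element pushed off-screen by a layout change, or hidden by a stray `display: none` - without a human having to look.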