Ingo Steinke

Beyond Googling the Error Message

A common meme among the mock O'RLY book covers:

"Googling the Error Message"

Screenshot: Google Image results for Googling the Error Message

Note on "googling": I am well aware that there are many Google alternatives, and I chose Ecosia as my default search engine for multiple reasons, including privacy and ecology. But I found that Google often provides better matches for tricky programming issues.

Google: better, but not good enough

Stylelint hangs, stylelint gets stuck, and I seem to be the only one affected, apart from one single issue where stylelint --fix got stuck on React.js inline styles, which does not match my own specific setup anyway. I started to wonder if I had lost my talent for pasting the right thing into Google's search box. Or maybe people don't report errors anymore?

How can there be fewer than three pages of search results for any query in 2023?

Image search results totally unrelated to my query

All I want is something that works. To be honest, I'd prefer an elegant, robust, and maintainable solution, so perfection might be one of my problems.

This post tries to sum up various takeaways from days, or possibly years, of struggling with error messages instead of proceeding with my work in a more productive and satisfying manner.

Note that my takeaways are not guaranteed to work for you as well!

Question, rephrase, isolate, and experiment!

Don't get stuck trying to make one specific solution strategy work! If you don't find any helpful results on Google or Ecosia, try rephrasing your questions, question your assumptions, and try to isolate the relevant aspects in a simplified scenario like a CodePen.

Narrow down the problem precisely

When you call an ambulance, you have to answer some very precise questions: who are you, what happened, and where exactly? Otherwise they might send a fire engine to entrance A when you need an ambulance at entrance B.

Following this principle, we need to narrow down our problem to specific circumstances that we can mention precisely in a bug issue or search query and avoid broad or ambiguous search terms. Otherwise our results will always look like this:

Screenshot: irrelevant search results for broad term query

My search query for 'custom post type CPT media library empty "repair"' yielded few relevant results and soon switched to seemingly random stuff like a Nature journal article about non-viral precision T cell receptor replacement.

Learn to express your problem in different words

Try variations and don't insist on unnecessary constraints!

Just because the problem only occurs on my specific Ubuntu version and not on my coworker's MacBook does not imply that this detail matters.

Maybe it's just a Safari vs. Chrome thing, but again, this will only become clear once we actually narrow down the problem and ask ourselves, again and again:

  • under which circumstances can I reproduce the problem?
  • under which circumstances can't I?

In the Ubuntu vs. Mac example, comparing Safari and Chrome on the same MacBook might have eliminated a lot of irrelevant assumptions quickly.

Anticipate further inquiries

Like calling an ambulance, we can prepare ourselves by anticipating what we'll probably be asked. If I state my OS and browser version in a bug ticket (as often suggested by template fields for filing a new issue), I might already think about what I would ask a customer if I read an error report like that. One of my questions would be: "Does it only occur in this specific browser? Have you tried what happens in Firefox?"

A small success story about a small CSS misconception

In one of my recent posts, I presented a small problem where I managed to save myself after getting stuck trying to apply max-width to a transformed and scaled pseudo-element. It turned out that using a proper DOM child element instead eliminated my problem.

Lost? Try searching for a pattern!

This isn't the first time that I have failed to find solutions on Google. Bing is no better, by the way. And this has kept happening since long before assistant systems like chatGPT became the new go-to resource. And don't ask me if I asked on StackOverflow: how could I come up with a minimal reproducible example of something that only fails on my current machine, often without even an error message?

Reproducible examples vs. hidden assumptions

But trying to reproduce the error in another setup can make us aware of our environment and possible hidden assumptions. Are we sure we built, committed, and deployed? Are we even looking at the correct server? Sometimes I hit reload several times before finding out that I must have followed a link to the production server without realizing that I'm no longer testing my development environment. This should have been obvious from the URL, but I must have stopped paying attention to that important detail.

Avoid debugging irrelevant warnings!

Every so often, the actual error is somewhere else. It might be a missing semicolon, any kind of typo, or a mismatched variable or file name. But while I keep getting spammed with irrelevant warnings and information (lines are too long, some attribute is not allowed in some HTML tag, "i++" should never again be used in JavaScript, etc.), the tools miss the crucial part.

Sometimes it is obvious that an error message does not point to the actual root cause, for example when it states there is a missing closing bracket at the end of a file: Uncaught SyntaxError: Unexpected end of input (at scripts.js:12345:1).
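To illustrate (with a made-up file and function name): the actual mistake is a brace missing near the top of the script, but the parser can only give up at the very end of the file.

// hypothetical scripts.js: the if block below is never closed properly
function initNavigation() {
  if (document.querySelector('.nav')) {
    console.log('nav found');
  // a closing brace is missing here, so the brace on the next line
  // closes the if block instead of the function body
}
// the function body is still open, so the browser reports
// "Uncaught SyntaxError: Unexpected end of input" at the last line of the file
console.log('this line looks fine, but the real mistake is far above');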

Don't rely on linting and code inspection!

False positive warnings might disrupt our workflow, but false negatives (undetected mistakes) are more dangerous. Have a look at this example:

Screenshot: false positive as described above

Both stylelint and PhpStorm's built-in code inspection complain about the wrong definition, namely the one that is actually correct.

Wrong, defunct, but formally correct:

font-family: "var(--wp--preset--font-family--source-sans)";

The false positive example below is correct, provided that we actually define that custom property somewhere.

font-family: var(--wp--preset--font-family--playfair-display);

How could static code analysis know that --wp--preset--font-family--playfair-display will be defined in CSS at run-time when it's only defined in a JSON dataset like below? (That's a WordPress theme.json by the way.)

"typography": {
    "fontFamilies": [
        {
            "fontFace": [
                {
                    "fontFamily": "Playfair Display",

The same configuration file quotes custom CSS properties as string values:

"h1": {
    "typography": {
        "fontFamily": "var(--wp--preset--font-family--source-sans)"

So I must have copy-pasted this value into my custom CSS file.
Too bad that it's syntactically correct to write:

font-family: "var(--wp--preset--font-family--source-sans)";

Even worse, there seems to be no stylelint rule to warn about that yet, at least not in the default recommended configuration.
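Until such a rule exists in the recommended defaults, a custom plugin could catch this specific mistake. Here is a minimal sketch, assuming we only want to flag property values that consist of nothing but a quoted var() expression (the plugin name and message are made up):

// hypothetical stylelint plugin: custom/no-quoted-var
const stylelint = require('stylelint');

const ruleName = 'custom/no-quoted-var';
const messages = stylelint.utils.ruleMessages(ruleName, {
  rejected: (value) => `Unexpected quoted var() expression ${value}`,
});

module.exports = stylelint.createPlugin(ruleName, () => {
  return (root, result) => {
    root.walkDecls((decl) => {
      // flag values like: font-family: "var(--my-custom-property)";
      if (/^["']\s*var\(--[^)]+\)\s*["']$/.test(decl.value.trim())) {
        stylelint.utils.report({
          ruleName,
          result,
          node: decl,
          message: messages.rejected(decl.value),
        });
      }
    });
  };
});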

Machine learning, AI, and Algorithms

Machine learning and so-called artificial intelligence haven't helped me much so far. I have tried to use chatGPT in different situations, including the WordPress issue below, where both chatGPT and the classic Google search engine became helpful only after I already knew the solution and therefore knew how to ask the right question. AI can reproduce typical coding challenge answers and commonplace boilerplate code of the kind that can be found everywhere else, including StackOverflow, MDN, W3Schools, and countless pages copying the same content, desperate to earn some money with page ads.

OpenAI: helpful, irrelevant, or dangerously confabulating?

When I asked chatGPT about various tricky WordPress problems, it came up with reasonable statements that were both true and helpful in general, but either unrelated to the actual problem in question or a list of possible causes, all of which I had already checked.

Other people have found chatbots quite helpful, so try them and decide for yourself, but don't take anything for granted. Artificial "intelligence" is not intelligent; it's just a very elaborate guess, trained on popular posts and solutions. Interpolating these sources often produces made-up, plausible but incorrect hallucinations (confabulations) and thus wrong advice!

Find out what's missing!

What about a git blame to inspect the latest changes since the last known working state? But what if we do not even know when it last worked as expected? And what if the error is caused by a (maybe accidentally) deleted line? Deleted lines are not shown by git blame, and reconstructing them can be harder than expected, especially when merge commits are involved.

We can also have a look at what's not present but should be, like colored syntax highlighting or an implicit parameter annotation, like this conspicuously inconspicuous gray loation property that should have been a location:

Screenshot: the misspelled loation property shown in plain gray instead of being highlighted
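Here is a hypothetical JavaScript sketch of the same kind of silent mistake (function and option names are invented for illustration): the misspelled key raises no error at all, the intended value just never arrives.

// hypothetical example: openPopup expects an option named "location"
function openPopup({ location = 'center' } = {}) {
  console.log('opening popup at', location);
}

openPopup({ loation: 'top-right' }); // typo: no error, no warning, logs "opening popup at center"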

Another variation is even worse: when the unintended spelling or syntax is formally correct, for example:

  • if (a = 1) is not a comparison, but an assignment in most languages;
  • myFunction is a reference, but myFunction() executes immediately in JavaScript;
  • a misplaced brace / bracket or a missing semicolon can change the control flow;
  • auto-closing behavior of HTML parsing: <p class="outer"><p class="inner"> is equivalent to <p class="outer"></p><p class="inner"> because paragraph elements must not be nested. Quoting MDN on tag omission:

The start tag is required. The end tag may be omitted if the <p> element is immediately followed by an <address>, <article>, <aside>, <blockquote>, <div>, <dl>, <fieldset>, <footer>, <form>, <h1>, <h2>, <h3>, <h4>, <h5>, <h6>, <header>, <hr>, <menu>, <nav>, <ol>, <pre>, <section>, <table>, <ul> or another <p> element, or if there is no more content in the parent element and the parent element is not an <a> element.

Source: https://developer.mozilla.org/en-US/docs/Web/HTML/Element/p

A lot of those kinds of errors cause warnings in a good linter configuration, but some mistakes can't be detected by algorithms because they don't follow a typical anti-pattern.
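As a small JavaScript sketch of the first two list items (variable and function names are invented for illustration), a rule like ESLint's no-cond-assign catches the accidental assignment, while the immediate function call may pass without any warning:

let a = 0;
if (a = 1) {                    // assignment, not comparison: always truthy (flagged by no-cond-assign)
  console.log('this runs every time');
}

function myFunction() {
  console.log('myFunction was called');
}
setTimeout(myFunction, 1000);   // passes the function reference: runs after one second
setTimeout(myFunction(), 1000); // calls it immediately and passes its return value (undefined)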

Solution Strategies

I usually try to run all available checks and tests to rule out any problems, even if they seem to be unrelated. I try to vary configurations. I try to find out more details by checking whether there is a --verbose option or a log file.

I google the error message if there is one. I try different variations of my queries and read promising sources, which usually helps me narrow down the possible root causes. I try to describe the problem in more detail, like I would have to when asking a coworker, opening a GitHub issue, or asking a question on StackOverflow. Then I do one of those things and get no helpful answer (coworker), no answer at all (GitHub), or my question gets downvoted and deleted (StackOverflow).

Beyond googling ("missing ... must include"?!)

It is hard to google for code anyway, but sometimes it feels as if we are trying to query some secret that must not be told: the search engines refuse to process our query, insist on ignoring certain parts of it, or stop ordering results by relevance and put the result missing most of my query terms in the number one top position.

Screenshot: a top search result missing most of my query terms

Trying long-tail variations, after ignoring the warning that "there are not many great results" for our queries ...

Screenshots: more long-tail query variations without great results

... we will finally hit the wall and meet the secret animated cartoon character driving me crazy with its passive-aggressive "this is fine" attitude. Well, at least it drives me crazy. Thanks to Bing, Bard, and chatGPT, many people just copy and paste the answer to their magic prompt and voilà, they've done two weeks of work in two minutes. At least that's what some developers claim to achieve on social media. I already mentioned this "idle fisher" character in my rant post "I enjoy life-long learning, but..." about all the things that I could do without.

I have mentioned AI before. Sometimes developers can profit a lot from getting extensive code snippets, either from the popular chatGPT or from a virtual coding assistant like GitHub Copilot or Tabnine, although it can waste a lot of time and concentration when the recommendations aren't helpful at all.

So let's ask a chatbot instead of querying a search engine. Maybe that's the first issue holding me back, as I usually don't type or speak natural language questions but rather type or paste something quite technical into a prompt. Instead of an obscure technical error detail, I can try to form a natural language question like "how to write php code that finds the static front page in the current language in a wordpress theme localized by polylang?"

Let's try this same question in Google, Bing, and chatGPT...

Screenshot: search engine and chatbot results for the natural language question

... and they all give me helpful results ... now.

Why? Because I managed to ask the right question in the right way, which wasn't that hard in hindsight, after I had already solved the problem.

Learn to ask the right questions the right way!

Can we learn to ask better questions? Well, StackOverflow has become infamous for its quest for good questions. It has already helped me a lot to try to write a StackOverflow question and anticipate the follow-up inquiries and reasons for downvotes, without ever finishing and publishing my draft.

That's much like the stuffed teddy bear that junior developers had to talk to, explaining their issue to the inanimate figure before being allowed to bother an actual human developer.

While it may still be hard to word the right question, let alone find an appropriate answer, we might take a shortcut, eliminate some false assumptions, and do some basic checks that we might have missed while focusing too much on the details of what we thought was the problem.

Beyond rants and downvotes

As you might know already, I sometimes use blogging to convert the negative energy caused by a frustrating search for elegant best practices that never existed into something constructive, and to provide a solution that can be found using the previously unsuccessful search query.


Unlike on StackOverflow, where I could, at least in theory, ask and answer my own question, on DEV I am allowed to be verbose and admit my negative emotions, starting with a naive question or a frustrated rant and concluding with a pragmatic solution that doesn't have to stand up to pseudo-scientific criticism.

"This question is closed. It is not currently accepting answers." might fit the logic of a strictly moderated Wiki website, but it still feels like the exact opposite of usability and inclusive UX writing to me.

Let's find a pragmatic solution then!

Pragmatic solutions

As you can see on GitHub, StackOverflow, and various forums like WordPress.org, not all of my questions go unanswered or get deleted, though. If there is an answer, I try to verify it as soon as possible and to follow up with further details, a thank you, or an upvote.

Sometimes I write a blog post about my experience, to help me review what happened and to share it with others who might have the same problem. Often, one of those "others" will be myself, some time later, when the same problem comes up again after I have forgotten the solution. So there will be at least one helpful search result on Google next time.

My 3 stages of bugfixing

Picture me: on the phone with my customer, frowning in a screen-sharing Zoom meeting, and giving a thumbs up when I finally found the solution!

Photo collage: Ingo Steinke, creative web developer, on the phone, frowning, and giving thumbs up

So what's the Pattern?

Another strategy is questioning my assumptions and my favorite solution. If things don't work, get too complicated, or nobody else seems to do it like this, I might be wrong, or at least there might be a better (easier, better supported, less error-prone) approach.

I might not be aware of doing something in an unusual way, but, back to square one, I could check my configurations and my recent commit history. Maybe there is something suspicious that I introduced, or maybe there has been an update to a tool or a framework that has introduced an incompatibility.

When I fail to come up with a solution, I try to do something else: take a break or switch tasks. Often there is more than one (sub-)task to work on.

Conclusion

When coding and debugging, always keep an open mind for a different view, question your strategies, and verify your assumptions.

Google screenshot: Looks like you have reached the end.

Top comments (3)

Ingo Steinke • Edited

Adding another undetected antipattern that made me waste debugging time:
domElement.classList.remove('.hidden');

Whether or not there is a "hidden" class name in our project, it should be obvious that class names can't, or at least shouldn't, start with a leading dot. We add the dot in CSS selectors and in query selectors used in JavaScript, but it's not part of the class name itself.

const domElement = wrapperElement.querySelector('.content .hidden'); // correct
if (domElement) {
  domElement.classList.remove('.hidden'); // always wrong!
  domElement.classList.remove('hidden'); // at least formally correct
}

Stylelint does not check JS files, while ESLint seems to ignore CSS syntax inside JavaScript strings (maybe another reason against CSS-in-JS). If someone is bored or looking for an open-source side project, why not write an ESLint plugin to check for obvious CSS and query selector antipatterns?
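A minimal sketch of what such a custom ESLint rule could look like, assuming we only want to flag string arguments of classList methods that start with a dot (the rule name and message are made up):

// hypothetical ESLint rule: no-leading-dot-in-classlist
module.exports = {
  meta: {
    type: 'problem',
    docs: { description: 'disallow leading dots in classList arguments' },
  },
  create(context) {
    return {
      CallExpression(node) {
        const callee = node.callee;
        // match domElement.classList.add/remove/toggle/contains(...)
        const isClassListCall =
          callee.type === 'MemberExpression' &&
          callee.object.type === 'MemberExpression' &&
          callee.object.property.name === 'classList' &&
          ['add', 'remove', 'toggle', 'contains'].includes(callee.property.name);
        if (!isClassListCall) return;
        for (const arg of node.arguments) {
          if (arg.type === 'Literal' && typeof arg.value === 'string' && arg.value.startsWith('.')) {
            context.report({ node: arg, message: 'Class names must not start with a dot.' });
          }
        }
      },
    };
  },
};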

Randell Brian Knight

Thanks for the in-depth article about searching for error fixes. I discovered the hard way that I needed to be careful how I word my query and sometimes needed to try multiple times with different words. Even then, it can be frustrating. I liked your advice on using codepen. That is a great way to demonstrate the actions for others.

Pluglet

Many thanks for the excellent exposition on coding 'pain reduction' strategies :).