This post was originally published on my blog, smartpuffin.com.
I sometimes have this feeling: The Hunch™. It's when I sense there is something wrong with the project, with the requirement, or with the feature, even when everyone else is sure all is well.
When I sense this, I feel like a dog who picked up the scent. I follow the scent by asking questions, until I get my answer.
Here's how it happens.
Dataset Extension
Our Dev team asked the Data team to collect some new data to extend our existing dataset. We had 1000 items, and we wanted 5000 more. We told them how we wanted the work prioritised, we agreed on the timeline, and we left reassured.
The first deliverable was ready. They sent us a huge shared spreadsheet with a thousand rows. We were happy: it meant we would soon have twice as much data as before!
I wrote a script to upload all this new data to our database. I found some small problems in the data, such as columns mixed up, or string values in numerical columns, and the Data team fixed them quickly. Nothing serious.
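The kind of pre-upload check involved can be sketched like this. It is a toy example, not my actual script, and the column names are invented:

```python
# Hypothetical pre-upload check: flag string values in numeric columns
# before anything reaches the database. Column names are invented here.
def find_bad_rows(rows, numeric_columns):
    """Return (row_index, column, value) for every non-numeric value
    found in a column that should be numeric."""
    problems = []
    for i, row in enumerate(rows):
        for col in numeric_columns:
            value = row.get(col, "")
            try:
                float(value)
            except (TypeError, ValueError):
                problems.append((i, col, value))
    return problems

rows = [
    {"name": "Lake Baikal", "area_km2": "31722"},
    {"name": "Lake Ladoga", "area_km2": "n/a"},  # bad: string in a numeric column
]
print(find_bad_rows(rows, ["area_km2"]))  # → [(1, 'area_km2', 'n/a')]
```

Catching problems like these is mechanical; the report goes back to the Data team before anything is uploaded.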
All was ready. I was (literally!) hovering my finger over the Enter key; one press would upload the data to the live servers.
But I hesitated. I had The Hunch™. And so I didn't press enter.
I asked the Data team the very last (or so I thought) question:
"Do I understand it right: all of this is brand-new data? When we upload it, we'll have 2 thousand items, correct?"
No, they told me. They had decided to execute a clean-up project along with collecting new data for us. 600 rows in this spreadsheet are to be updated. Only 400 are to be created.
It took me a minute to gather my thoughts. The clean-up had not been mentioned to us a single time before. How happy I was that I asked! Had I uploaded the data, we would have gotten 600 duplicate items.
How, I asked, do I tell from the spreadsheet which items are new and which are old? Surely you have an ID column for the updates?
No, they told me. There was no such column, and no other way to tell. They simply hadn't realised we would need to know exactly which items had to be updated - until I asked.
Long story short, after much discussion, they added the ID column for us, marked the updated items, and we uploaded all the data successfully.
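With the ID column in place, the upload becomes a straightforward upsert: update rows that carry an existing ID, create the rest. A minimal sketch, with an in-memory dict standing in for the real database:

```python
def upsert(database, rows):
    """Update rows that carry an existing ID; create the rest.
    `database` maps id -> item dict; rows without an 'id' are new."""
    created, updated = 0, 0
    next_id = max(database, default=0) + 1
    for row in rows:
        item_id = row.get("id")
        if item_id in database:
            database[item_id].update(row)
            updated += 1
        else:
            database[next_id] = {**row, "id": next_id}
            next_id += 1
            created += 1
    return created, updated

db = {1: {"id": 1, "name": "old item"}}
rows = [{"id": 1, "name": "cleaned-up item"}, {"name": "brand new item"}]
print(upsert(db, rows))  # → (1, 1): one item updated, one created
```

Without the ID column, every one of those 600 updates would have become a duplicate instead.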
But had I not had The Hunch™, had I not asked the question, we would have uploaded the bad data, and we would have learnt about it much, much later.
Square Miles Conversion
I saw a feature on our website, where some geo-math was involved. I immediately had The Hunch™: there must be a bug.
It turned out to be correct: there was a funny bug that had been live on our website for 10 years without anyone noticing.
New API idea
Our team was rebuilding a product. The original product had been built some 15 years ago and had evolved chaotically.
My colleagues were in love with a particular idea. The new API was supposed to distinguish between 3 use cases that had been treated as one in the old product.
I was very new to the team, and I wasn't completely sold on the idea. I felt like I didn't understand it enough. I thought it might be too complex for the end users. I estimated that a lot of work was needed to support the split, including categorizing the existing data and supporting backwards compatibility. In short, my first days in the new team were full of The Hunch™.
But I decided that my colleagues, who knew the domain better than me, must have thought about all this. I assumed they had estimated the amount of work and agreed to it before I joined them.
They assured me that yes, some work was needed to categorise the existing data into these 3 use cases, but it was rather straightforward. A person from my team would do it, with the help of 2 more data specialists. The results would be ready very soon. And meanwhile, why not go ahead with the implementation?
I voiced my concerns, but started to implement the new API.
A week after, we met to see how the data split was going.
All 3 data specialists who were categorizing the data disagreed on every single data entry.
I asked them some questions, and it became clear that they understood the use cases differently. Moreover, all my own team members - including myself - understood it differently as well.
Seeing what was happening, my team members agreed to postpone the feature until there was more clarity on the use cases.
Had I not seen that the use cases were not thought through enough, we would have offered a half-baked product to our users, and we wouldn't have been able to explain to them why the use cases were split.
Collaboration between two teams
I supervised implementing a cross-team task.
The problem was that many places used different APIs to fetch the same data. The APIs worked slightly differently and returned different results. The users saw different values and complained.
The task was to clean it all up. The purpose of the task was to:
1) use one single API in all places, and thereby
2) unify what the users see.
Developer1 implemented the new API calls in three places. For some reason, in two of them they overrode the results and displayed something different based on some condition.
So we still had the same inconsistency as before. This solution sort of complied with purpose 1), but completely defeated 2).
Developer2 replaced the API call in the one place I had pointed out as an example. It turned out the results were overridden further down in the code, based on yet another condition.
Moreover, there was one more place with the old API in Developer2's area of responsibility. They didn't notice it and didn't make any changes there.
Again, this solution half-complied with 1), but was out of line with 2).
I found out about all this only when I replaced the API call in yet another place. While pushing my code, I felt The Hunch™. Because the task was so complex and distributed between several people, I decided to test all the changes at once.
What I saw was that all of them were different from each other.
After all these efforts, customers were still confused by different data on different pages. The developers' time had been spent twice over. And again, it would have been noticed much later, had I not decided to double-check.
When I failed
A long time ago, we were implementing a new GIS system. Our old system stored coordinates in the UTM projection, which meant you could work with only a relatively small piece of the map at a time.
In the new system, we wanted to have the map of the whole world.
We made a decision quickly: let's store coordinates in latitude and longitude, and always work with them like that. When we want to display them on the screen, we will project them as needed.
Our next decision was how to split the map into tiles to display them on the screen. Here, too, we decided to use latitude+longitude. It seemed only logical.
I spent almost a month implementing tiles on latitude+longitude coordinates. Doing math on a sphere is hard, and I made a lot of mistakes. It was taking forever; I just couldn't seem to get it right.
Seeing me struggle, a more senior developer proposed to scrap all that and switch to tiles in flat projected coordinates, using the projection our customers used the most.
I agreed reluctantly, since I still thought having "real" spherical coordinates was "right". I deleted the complex code and wrote the new version in just 3 days. It has been working perfectly ever since.
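The difference is easy to see in code. Once coordinates are projected to a flat plane, the tile index is plain integer division. The sketch below uses Web Mercator purely as an illustration (not necessarily the projection we actually chose) and the common 256-pixel tile convention:

```python
import math

TILE_SIZE = 256  # pixels per tile, a common convention

def latlon_to_pixels(lat_deg, lon_deg, zoom):
    """Project WGS84 lat/lon to flat Web Mercator pixel coordinates."""
    scale = TILE_SIZE * (2 ** zoom)
    x = (lon_deg + 180.0) / 360.0 * scale
    lat_rad = math.radians(lat_deg)
    y = (1.0 - math.log(math.tan(lat_rad) + 1.0 / math.cos(lat_rad)) / math.pi) / 2.0 * scale
    return x, y

def tile_for(lat_deg, lon_deg, zoom):
    """Once coordinates are flat, picking a tile is integer division."""
    x, y = latlon_to_pixels(lat_deg, lon_deg, zoom)
    return int(x // TILE_SIZE), int(y // TILE_SIZE)

print(tile_for(0.0, 0.0, 1))  # → (1, 1): the equator/prime-meridian tile at zoom 1
```

All the spherical trickiness is confined to one projection function; everything downstream of it is simple flat arithmetic, which is exactly what made the rewrite take days instead of weeks.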
I didn't have The Hunch™ back then - and, as you can see, the team lost a month because I hadn't thought through the consequences of the decision: how complex the math would be to implement.
How to do it yourself
I see many people not thinking through all aspects of their tasks and decisions - technical, product, business, and data-collection ones alike.
A task, even a small and simple one, often involves much more than it seems. For developers, I wrote an article that helps to think about corner cases in advance, when coding. Similar advice applies to all other sorts of tasks.
Try to think about the problem from all angles. What other people does it involve? Does it involve changing existing stuff, be it code, processes, or data? Does it align smoothly with everything else? Is there a logical flaw?
It is like testing. You list positive cases, negative cases, and corner cases; you also test other classes your change could touch; you ask your users for help; etc.
It will almost definitely be cheaper and faster to spend some time investigating first, than to rush into implementation and fail later.
Knowing what is actually involved will help you with prioritization. If the task is large, you might decide to do a smaller one instead.
Sometimes, after looking at all aspects, you might make a decision to do the task with the flaw. You might decide to not care about a certain problem or make a trade-off. Still, you will make an informed decision, and not simply go with the flow.
Top comments (7)
Heh... Reading these scenarios, all I can think is, "this is the time for the Stupid Question™!"
In truth, the Stupid Question generally isn't a stupid question. It's the "something smells funny, I'm going to keep asking questions - frequently the same question restated multiple ways - until either the smell goes away or others start to notice the smell, too." Which is to say, keep seeking answers until the question has been clearly and consistently satisfied, or the barrage of questions reveals that, wait, maybe this plan wasn't so well thought through.
Restatement and summarization is a powerful tool in ensuring any action-plan or project is actually well thought out and contingencies accounted for.
Thanks Elena! I like your "ask questions" approach, keep trusting your intuition!
Trying to do so is already a good start! It is like writing unit-tests for a logical problem. You list positive cases, negative cases, you also test other classes this change could touch, etc.
With time, one gets better at this, and starts to "anticipate" problems even before thinking consciously.
I keep asking what is important and why is it important. It's very high level but it's fairly objective and client driven at times (the why we're doing x project) and it keeps me out of the weeds on details sometimes.
It's hard to find the balance, and I think it's a big challenge for what we do.
Intuition is questioned. When it is I back up with data when I can. Sometimes it's really strong and I don't know why. I feel it guides me to look in places others haven't looked.
I've been trying to find ways to show it to others, and so far some collection tools for theming and analysis have been a thing.
I feel the curiosity and discovery is strong which fuels intuition... or is it the other way around? Maybe they influence each other.
I do data collection or "feeling it out" early when folks want to jump to the tech... so I feel you when you talk about learning first. People want action now or yesterday.
How do you make that work? I think you may have thoughts on this that are maybe different than mine.
How do we show our intuition in a bigger plan? How do we also use it well?
Thanks for your reply, Kat. It seems like you have The Hunch too. That's nice! :)
I'm not sure I have it figured out 100% :). I still struggle sometimes with how to make other people believe me, or how to teach them to think before doing. By sharing this post, I wanted to help people look at their tasks more closely, and to show what happens when they don't.
I don't think saying "I sense..." or "my intuition tells me..." helps. It sounds like magic, and can be easily rejected with "well, my intuition tells me something different". What helps, I think, is explaining that this is based on your (years of) experience, and that in similar situations you usually see this or that set of problems. E.g.: "when making a large code change, people usually forget about backwards compatibility, so we should think about it", or something like that.
Further, just like you said, I think that supporting your claims with data helps. It is hard to argue with data.
I also find that it helps to lay it out for people methodically - basically, in a list with bullet points. E.g.: "we still have to figure out a), b), c); and in c) I usually meet a lot of problems that may delay us a lot; so why don't we think about c) in advance".
Do you have other tips or ideas? I'd love to hear them!
P.S. What do you call "the hunch"? :))
Developer Investigator, that's what most experienced devs are.
It seems you had the trait much earlier. Nice article, Elena!