I'm bookmarking this to read later (so many good comments!) but I'll chime in with a QA perspective:
If you're working at a place with a formal QA step, test your implementation, not your requirements.
I've noticed in my devs' specs they'll have tests for things like "this has been called once", "all parts of this if can be hit", yada yada yada, and then there will be things like "it returns all this info", "the info is sorted properly", "the info is in the right format", etc.
Then if you look at my tests, they're "it returns all this info", "the info is sorted properly", "the info is in the right format", etc... things a user would see and that are in the story's acceptance criteria for the feature. Where I am, QA automation (end-to-end with a hint of integration testing) is a part of the formal definition of done, so a feature isn't considered done until both of us have written the same thing just at two different levels.
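To make the distinction concrete, here's a minimal sketch of what the two test levels might look like. The `get_report` function and its data are hypothetical, purely for illustration; the point is that the checks below assert on what a user would see (the acceptance criteria), not on how the code got there (call counts, branch coverage):

```python
# Hypothetical report function, stand-in for a real feature under test.
def get_report(records):
    """Return records sorted by name, each formatted as a dict."""
    return [{"name": name, "score": score} for name, score in sorted(records)]

records = [("carol", 9), ("alice", 7), ("bob", 5)]
report = get_report(records)

# "it returns all this info"
assert len(report) == 3

# "the info is sorted properly"
assert [r["name"] for r in report] == ["alice", "bob", "carol"]

# "the info is in the right format"
assert all(set(r) == {"name", "score"} for r in report)

print("behavior checks passed")
```

An implementation-coupled test, by contrast, might mock a sort helper and assert it "has been called once"; that test breaks the moment someone refactors, even if the user-visible behavior is identical. The behavior-level asserts above survive refactors and double as executable acceptance criteria.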