
## re: Writing Good Unit Tests: A Step By Step Tutorial

If you're doing Test Driven Development I'd expect to also see tests for short and long distances. That would demonstrate that your first test (the short one) was for a distance you measured with a ruler, then the second was added when you ran your code for two points a long distance apart, got the wrong result, and switched to an algorithm that knows we're not living on a plane.

I'm not sure I understand the argument about the plane. The distance calculated as if the coordinates are on a plane will never match the distance on the globe.

However, I do agree there should be tests for short and long distances! I mention them in the "Corner cases" section:

- How about max distance on the planet?
- How about a really small distance?
- If both points are absolutely the same, are we getting 0 m distance?
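Those corner cases could be sketched as unit tests. A minimal sketch in Python, assuming a hypothetical haversine-based `distance_m(lat1, lon1, lat2, lon2)` that returns metres (the function name, signature, and implementation here are my own, not the article's):

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres (haversine formula, spherical Earth)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    a = (math.sin(math.radians(lat2 - lat1) / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(math.radians(lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def test_identical_points_give_zero():
    assert distance_m(51.5, -0.1, 51.5, -0.1) == 0.0

def test_really_small_distance():
    # One hundred-thousandth of a degree of latitude is roughly 1.1 m
    assert 1.0 < distance_m(0.0, 0.0, 0.00001, 0.0) < 1.3

def test_max_distance_on_the_planet():
    # Antipodal points: half the Earth's circumference, about 20,015 km
    d = distance_m(0.0, 0.0, 0.0, 180.0)
    assert abs(d - math.pi * EARTH_RADIUS_M) < 1.0

test_identical_points_give_zero()
test_really_small_distance()
test_max_distance_on_the_planet()
```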

The deviation from a plane is about 20 cm per km, or one part in 5000. So provided that you're measuring between two points less than 5 m apart (a few millionths of a degree apart) the distances do match, to within 1 mm.

Incidentally, people may find this link interesting: ianvisits.co.uk/blog/2018/03/06/ho...

No, this is not the case. Here's a simple proof.

Say, we have two points on the equator.

Point 1: latitude 0, longitude 0.

Point 2: latitude 0, longitude 1.

Distance between them, measured as if they were flat coordinates: 1. (Units: degrees.)

Distance between them, measured as if they're on the planet Earth: 111.19 km.

111 (km) != 1 (degree).
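The arithmetic above can be checked directly. A quick sketch (using a mean Earth radius of 6371 km, so the figure is approximate):

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius

# Point 1: (lat 0, lon 0); Point 2: (lat 0, lon 1)
# "Flat" distance, straight from the coordinate values:
flat = math.hypot(0 - 0, 1 - 0)
print(flat)  # 1.0 -- but the unit is degrees, not a length

# On the sphere: along the equator, one degree of arc is R * (pi / 180)
on_earth = EARTH_RADIUS_KM * math.radians(1)
print(round(on_earth, 2))  # 111.19 (km)
```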

Moreover, measuring distance in degrees doesn't make sense, because one degree of longitude is not equal to one degree of latitude; and one degree of longitude means a different number of kilometres depending on the latitude. The closer to the pole you go, the fewer kilometres "fit" in 1 degree of longitude. (Note to self: I need to write an article about that and add pictures. It would be easier to explain with pictures.)

I suspect that you might be talking about specific projections, such as UTM, where, for a certain subset of coordinates, distances measured using the Pythagorean theorem will roughly match the real on-the-planet-surface distance.

But if you're using such a projection, your points' coordinates are also specified in that projection, not in latitudes and longitudes.

To compare the distances, you'd have to first convert your points' latitudes and longitudes to the projection's coordinates, and only then apply the Pythagorean formula. That's possible, but it complicates the tests too much.
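That convert-then-Pythagoras workflow can be sketched with a simple local equirectangular projection standing in for UTM (the projection choice and function names here are my own): project both points onto a plane tangent near them, apply the Pythagorean theorem, and compare against a great-circle reference. For two points a few metres apart the flat answer agrees to far less than a millimetre, which matches the "within 1 mm" claim above.

```python
import math

R = 6_371_000.0  # mean Earth radius, metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Reference distance: great circle on a sphere, in metres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    a = (math.sin((p2 - p1) / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(math.radians(lon2 - lon1) / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def projected_m(lat1, lon1, lat2, lon2):
    """Convert to a local flat (equirectangular) projection, then Pythagoras."""
    mid = math.radians((lat1 + lat2) / 2)              # reference latitude
    x = math.radians(lon2 - lon1) * math.cos(mid) * R  # east-west offset, metres
    y = math.radians(lat2 - lat1) * R                  # north-south offset, metres
    return math.hypot(x, y)

# Two points roughly 4.4 m apart: the projected (Pythagorean) distance
# agrees with the great-circle one to well under a millimetre.
short_true = haversine_m(50.0, 14.0, 50.00003, 14.00004)
short_flat = projected_m(50.0, 14.0, 50.00003, 14.00004)
assert abs(short_true - short_flat) < 0.001  # under 1 mm
```

Note the extra machinery needed even for this toy projection; a real UTM conversion would need a proper library, which is the complication the paragraph above is pointing at.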