
Discussion on: Type System FAQ

Jason C. McDonald • Edited

Really good insight! This will definitely help a lot of people understand the debates around type systems.

From a computer point of view, there are no types, there are only bit strings...Types exist for people.

My only pedantic addition is that types aren't merely a high-level abstraction. Of course, once again, this is one of those "it's complicated" issues you and I tend to get into interesting back-and-forths about, so I'll try to distill this as far as possible:

Getting this out of the way...objects, classes, and enumerations (etc.), along with their consequential "types," are only abstractions. They really don't "exist" at the machine level, only in the higher-level language.

But so-called "atomic types" are usually a slightly different scenario. These typically include integers, booleans, and floating-point numbers (either single or double precision). There's a mix of abstraction and underlying logic with these. Atomic types are uniquely delineated, structured, and addressed in memory.

Which types actually constitute an atomic type, and which are just glorified versions of other types, is itself fuzzy. For example, a C char is actually a form of int. A boolean may also be an integer...or it may not. It really depends on the implementation of the language. Once again, this is where it gets "wibbly-wobbly-timey-wimey."
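
To make the char example concrete, here's a quick C sketch of my own (not from the original article; it assumes a typical hosted C implementation): in C, a character constant like 'A' already has type int, and arithmetic on a char promotes it to int.

```c
#include <stdio.h>

int main(void) {
    /* In C, a character constant such as 'A' has type int, not char,
       and char itself is just a small integer type. */
    printf("sizeof('A') = %zu, sizeof(char) = %zu, sizeof(int) = %zu\n",
           sizeof('A'), sizeof(char), sizeof(int));

    char c = 'A';
    /* Arithmetic on a char silently promotes it to int. */
    printf("'A' + 1 = %d, which as a character is '%c'\n", c + 1, c + 1);
    return 0;
}
```

On a typical 64-bit platform that prints sizeof('A') = 4, matching sizeof(int), while sizeof(char) is 1.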

(P.S. Strings aren't ever atomic; they're arrays.)

You are still absolutely correct that all data is just bitstrings. It's possible to interpret the same 32-bit chunk of memory as a single-precision floating point number, or as a 32-bit integer, or as a Unicode character. The trouble is, the data will be effectively "garbage" for one of those interpretations.
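
Here's a minimal sketch of that in C (my own illustration, assuming the common case where float is a 32-bit IEEE-754 single): the same 32-bit pattern is a perfectly good float and a perfectly good integer, but nonsense as a Unicode code point.

```c
#include <stdio.h>
#include <stdint.h>
#include <string.h>
#include <inttypes.h>

int main(void) {
    uint32_t bits = 0x41200000u;   /* one arbitrary 32-bit pattern */

    /* Interpretation 1: IEEE-754 single-precision float (10.0). */
    float as_float;
    memcpy(&as_float, &bits, sizeof as_float);

    /* Interpretation 2: 32-bit two's-complement integer (1092616192). */
    int32_t as_int;
    memcpy(&as_int, &bits, sizeof as_int);

    /* Interpretation 3: a Unicode code point. 0x41200000 is far beyond
       U+10FFFF, so this reading is effectively "garbage". */
    printf("float: %f, int: %" PRId32 ", code point: U+%" PRIX32 " (invalid)\n",
           as_float, as_int, bits);
    return 0;
}
```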

For that reason, at the assembly level, type still exists in one sense — it defines how that bitstring is interpreted. We lack most (or all) of the syntactic sugar that makes delineating type so convenient, but the core behavior is still needed.

Type has to be a concept at the machine code level precisely because type doesn't exist as far as memory itself is concerned. There's no label sitting on the bitstring declaring "I'm an integer!" or "I'm a floating-point number!" There's not even typically a sentinel (marker) to declare where one "variable" stops, and another starts.

Knowing how much binary data to read, and how to interpret it, is the responsibility of the code. In fact, a massive percentage of bugs originate from this one responsibility getting botched. The (machine) code's knowledge of how much data to read, and how that data is to be interpreted, is the logic that constitutes "type". It's not pretty or simple this close to the silicon — imagine trying to store all your data in the bit-array data structure your language provides — so higher-level languages abstract that logic out into various typing systems.
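
As a sketch of what that responsibility looks like (again my own C illustration, not anything from the article; the exact values printed depend on endianness): the memory below is just eight bytes, and it's entirely up to the reading code whether that's two 32-bit values or one 64-bit value.

```c
#include <stdio.h>
#include <stdint.h>
#include <string.h>
#include <inttypes.h>

int main(void) {
    /* Eight raw bytes. Nothing in memory says where one "variable"
       ends and the next begins, or what the bytes mean. */
    unsigned char mem[8] = {0x01, 0x00, 0x00, 0x00, 0x02, 0x00, 0x00, 0x00};

    /* The code can decide to read two 4-byte integers... */
    uint32_t a, b;
    memcpy(&a, mem, sizeof a);
    memcpy(&b, mem + 4, sizeof b);
    printf("as two 32-bit ints: %" PRIu32 ", %" PRIu32 "\n", a, b);

    /* ...or one 8-byte integer. The width and the interpretation live
       in the reading code, not in the memory itself. */
    uint64_t whole;
    memcpy(&whole, mem, sizeof whole);
    printf("as one 64-bit int: %" PRIu64 "\n", whole);
    return 0;
}
```

On a little-endian machine that prints 1 and 2, then 8589934593; read the same bytes with the "wrong" width or byte order and you get exactly the kind of garbage values that so many of those low-level bugs boil down to.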

All that to say, types (conceptually) don't exist from a memory perspective, but they must exist from a code perspective. Type systems, on the other hand, are abstractions that exist for people.

stereobooster

The trouble is, the data will be effectively "garbage" for one of those interpretations.

But who will understand that this is garbage? Does the machine care? Do transistors or NAND gates care? It's the human who gives the interpretation to these things.

It's possible to interpret the same 32-bit chunk of memory as a single-precision floating point number, or as a 32-bit integer, or as a Unicode character.

Who decides which interpretation to give it? The programmer who wrote the program, in which they put the instruction: "go to memory X, take N digits and treat them as a number..."

This is a tricky philosophical question; if you like this kind of thing, I recommend reading the Real Patterns essay.

Jason C. McDonald • Edited

Does machine care? Do transistors or NAND gates care?

Literally true of anything, though. The computer doesn't care about any of this, because it's a non-being. It's just as "happy" being unpowered altogether. Literally anything a computer does only ultimately matters to the human who gives interpretations to the things.

And, like I said, as far as memory is concerned, type isn't a thing.

It is a tricky philosophical question, I agree. Again, my point is, the concept of "type" does exist at the machine code level, just in a painfully non-abstracted fashion.

Although, I think we actually agree on that.

stereobooster

My opinion: it exists in the same way a glider exists in the Game of Life. The structure itself exists on the board, but the meaning exists in the head of the observer. From the board's (Game of Life) point of view: yet another configuration. From the human point of view: oh look, it has interesting properties.

Jason C. McDonald • Edited

Yeah, I think that's fair. I just don't want anyone to get the idea that "type" (conceptually) isn't an important concept in machine code, which would be fundamentally misleading.

BTW, although I have limited experience writing assembly, I checked all of this with one of my co-workers, who programmed in binary and machine code (with punch cards) as his full-time job for many years.

stereobooster • Edited

Fair point. I'm not sure whether we should call it Types or Encoding 🤔.

Non-scientific chart: [image]

Jason C. McDonald

Heh, excellent chart. "Machine types" is a bit closer to what I'm talking about, I think, although encoding certainly enters into it some.