But the thing is, we don't describe computer languages with terms like "verb" or "object" (at least not in the grammatical sense of the word), and most of the terms we use to describe them are borrowed from mathematics (function, argument, etc.).
Of course, the grammar is limited by practical concerns. These languages include only what is needed at the syntactic and grammatical level to express the subset of semantics required by the problem domain for which the language is designed.
I don't know all the computer languages in the list in detail, but are you sure there is no programming paradigm with such grammatical features?
In practical terms, what we do is read the code and understand its meaning. That act of comprehension is a kind of matching at the semantic level.
If you read for example:
order.add(item)
you have subject (order), predicate (add) and object (item).
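Read as a sentence, a call like that maps onto grammatical roles directly. A minimal sketch, assuming a hypothetical Order class (nothing here comes from a real library):

```python
# Hypothetical Order class, just to make the grammatical reading concrete:
#   order.add(item)  ->  subject (order), predicate (add), object (item)
class Order:
    def __init__(self):
        self.items = []

    def add(self, item):
        # the method name is the verb; the argument is its object
        self.items.append(item)

order = Order()      # "order" is the subject
item = "coffee"      # "item" is the object
order.add(item)      # reads as the sentence "order, add item"
```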
... artificial languages are influenced by natural languages, but I have yet to find an influence of linguistic research on artificial language "design", which seems to me to be the question the OP asked.
Linguists gave you a generic way to design any language by means of its grammar, and to automatically generate compilers and parsers from it. Isn't that an influence?
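The idea can be sketched in a few lines: write the grammar down in BNF notation, then follow it rule by rule with a recursive-descent parser. The toy grammar below is invented for illustration:

```python
# Toy grammar for additive expressions, in BNF style:
#
#   expr ::= term (("+" | "-") term)*
#   term ::= NUMBER
#
# A hand-written recursive-descent parser mirrors the grammar rule by rule.
import re

def tokenize(src):
    # split the input into numbers and +/- operators
    return re.findall(r"\d+|[+-]", src)

def parse_term(tokens):
    # term ::= NUMBER
    return int(tokens.pop(0))

def parse_expr(tokens):
    # expr ::= term (("+" | "-") term)*
    value = parse_term(tokens)
    while tokens and tokens[0] in "+-":
        op = tokens.pop(0)
        rhs = parse_term(tokens)
        value = value + rhs if op == "+" else value - rhs
    return value

print(parse_expr(tokenize("2+3-1")))  # 4
```

Tools like yacc or ANTLR automate exactly this step: they take the grammar as input and emit the parser for you.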
Also, you will find a lot more at the level of semantics. Remember that linguistics covers semantics as well.
Over the past decade there has been a lot of work in the field of Semantic Computing; check out that work too.
...for humans, we need to learn what conveys meaning, which is limitless yet covers only a small part of what is grammatically correct.
Then you'll like the book on knowledge representation. When you see the semantics behind each word, you can better choose the right word and avoid ambiguity.
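A toy illustration of the idea: knowledge can be represented as subject–predicate–object triples (the same shape RDF uses), which makes a word's multiple senses, and hence its ambiguity, explicit. All the facts and vocabulary below are invented for the example:

```python
# Toy knowledge base of (subject, predicate, object) triples.
# Every fact here is illustrative, not drawn from a real ontology.
facts = {
    ("bank", "sense", "financial_institution"),
    ("bank", "sense", "river_edge"),
    ("deposit", "relates_to", "financial_institution"),
}

def senses(word):
    # collect every recorded sense of a word
    return {o for (s, p, o) in facts if s == word and p == "sense"}

# "bank" has two recorded senses, so using it alone is ambiguous
print(sorted(senses("bank")))
```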
It's pronounced Diane. I do data architecture, operations, and backend development. In my spare time I maintain Massive.js, a data mapper for Node.js and PostgreSQL.
Mathematics and computer science having little to do with linguistics seems to be more often the position of people with an interest in the former two than a generally shared perspective. It's true that programming languages are formally mathematical, but they're still languages with grammars and parsing and even room for implication and ambiguity and authorial expression now and then; and math or no, learners often find framing, for example, object orientation in terms of nouns and verbs more intuitive than staring at proofs.
Manufactured languages, software architectures, and specialized interfaces are much more limited in scope than natural languages, texts, and grammars. But there are still rules they conform to and tendencies they exhibit, so it's rather a waste of time to ignore the vocabulary and theoretical toolkit already developed for us by linguists and semioticians. And of course, treating natural languages themselves in mathematical terms is a topic of no small interest on either side of what dividing line exists.
You might also find a lot of useful material in General Semantics and the Institute of General Semantics.