In short: for productivity.
If you want to know more, you have to go through all of it.
Python is not the one and only solution for Artificial Intelligence.
While I essentially agree with your position, the main point is something of a false dichotomy.
Under C++ or C you have the machine languages they compile down to as well; by that argument, why isn't everyone still programming in Assembly?
The answer is, as you all but stated, ease of access. Assembly is not at all easy to write, or to think in at a high cognitive level.
Thus came languages to make that easier.
Then came more, and more, and more, each abstracting away more of the banalities of the previous rung on the ladder.
Python's main success is threefold:
1. Ease of access. This allows for quick-to-market, quick-to-prototype, instant exploration from idea to code to output. It is extremely powerful to think of something, write a few lines, and begin to see what it looks like.
2. It has very deep wells of power. While many in the traditional-language world may have considered, or still consider, Python "just a scripting" language, that is far from the truth. It is an extremely deep and full-bodied language: you can go from simple scripts running over a small data set in a few minutes to enterprise-level software applications. It spans technologies, use cases, and industries with ease, and has extreme reach into numerous technological arenas.
3. It is tightly coupled with C and C++ (strictly speaking, this is true of CPython, the default implementation most people know, which is only one of many). This allows interop with tons of existing and new libraries, to glue things together or to push performance-critical code down into lower-level code where needed, letting Python straddle both fast-to-market and high-performance operation (see the sketch below).
This is extremely difficult for almost any other language to pull off the way Python has. Its success in, and near-synonymous association with, data science comes precisely because it is an everyday tool that is easy to start with and powerful enough to carry you through enterprise-level work.
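To make point 3 concrete, here is a minimal sketch of the C interop using only the standard-library ctypes module, assuming a Unix-like system where the C math library can be located. Real projects more often reach for C extension modules, Cython, or pybind11, but the glue idea is the same.

```python
import ctypes
import ctypes.util

# Locate and load the C math library (libm); find_library may return
# None on platforms without a separately locatable libm (e.g. Windows).
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Declare the C signature: double cos(double)
libm.cos.restype = ctypes.c_double
libm.cos.argtypes = [ctypes.c_double]

print(libm.cos(0.0))  # 1.0, computed by compiled C code called from Python
```

The same idea, via C extension modules rather than ctypes, is what lets libraries like NumPy expose a Python-level API over compiled kernels.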
Great analysis. I think Python has also helped machine learning rise in popularity these last few years: first it's too academic, then it jumps to low-level software, and then it opens up to the public when it lands in glue languages like this. Nice post!
Personally I'm not a huge fan of Python; Java, C++, and C# feel more native to me. But I can't help but admit that Python has helped AI, ML, and DS grow more than anything else.
Indentation. I hate the "indentation defines scope" paradigm of Python! I'd rather have the IDE beautify/indent the code than have indentation enforce scoping.
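A tiny illustration of what that paradigm means in practice: the last line below is outside the loop purely because of its indentation, with nothing else marking where the scope ends.

```python
total = 0
for n in range(3):
    total += n          # indented: inside the loop body
    print("step:", n)   # still inside the loop

print("total:", total)  # dedented: runs once, after the loop
```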
Agree 😂😂
Curly braces are better than shitty indentation.
Just as a brainstorm, I wonder about a translation-layer plugin for editors (with error reporting mapped back) that would indent the code guided by curly braces.
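A toy sketch of that brainstorm (all names invented; real editor integration and mapping errors back are omitted): translate a hypothetical brace-delimited dialect into indentation-scoped Python.

```python
def braces_to_indent(src: str, tab: str = "    ") -> str:
    """Convert a brace-delimited pseudo-Python dialect to indented Python."""
    out, depth = [], 0
    for raw in src.splitlines():
        line = raw.strip()
        if line == "}":
            depth -= 1          # closing brace: leave the current scope
        elif line.endswith("{"):
            out.append(tab * depth + line[:-1].rstrip() + ":")
            depth += 1          # opening brace: enter a new scope
        elif line:
            out.append(tab * depth + line)
    return "\n".join(out)

demo = """
def greet(name) {
    if name {
        print("hello", name)
    }
}
greet("world")
"""
print(braces_to_indent(demo))   # prints valid, indentation-scoped Python
```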
I would say Python is enough for exploring ideas and for staging, and most of the time it will do just fine in production. But many teams end up porting Python data-science code to C/C++, or even rewriting it from scratch, when Python hits its performance limits.
Productivity is not an issue for expert C/C++ developers. The challenge is finding those expert C/C++ developers.
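As a sketch of the usual first escalation step when Python hits those limits, before anyone rewrites in C/C++: push the hot loop into NumPy, whose kernels are already compiled C (assumes numpy is installed; timings are machine-dependent).

```python
import time
import numpy as np

data = list(range(1_000_000))

t0 = time.perf_counter()
total_py = sum(x * x for x in data)        # hot loop in pure Python
t1 = time.perf_counter()

arr = np.asarray(data, dtype=np.int64)
t2 = time.perf_counter()
total_np = int((arr * arr).sum())          # same arithmetic in compiled C
t3 = time.perf_counter()

assert total_py == total_np
print(f"pure Python: {t1 - t0:.4f}s  NumPy: {t3 - t2:.4f}s")
```

Only when even that (or Cython/pybind11) isn't enough does a full C/C++ port tend to pay off.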
I don't argue with your point that Python has fewer pitfalls and is more productive for higher-level programming, and that C++ is more performant, but it is erroneous to imply there is any "garbage" memory to be collected in C++; there is no garbage collector. This provides the precise control over memory usage (and performance) that makes C++ the systems language of choice, despite its complexity. I am interested in Rust though. Rust provides compiled performance with automatic but synchronous memory deallocation, for more deterministic performance characteristics and memory profiles when compared to asynchronous GC approaches like those applied in Java, .NET, and GoLang.
« synchronous memory deallocation for more deterministic performance characteristics and memory profiles when compared to asynchronous GC approaches like those applied in Java, .NET, and GoLang. »
It’s not that obvious.
A GC can be more performant because it's not up to the dev to decide when deallocation should happen.
It's the GC's role to decide whether the system has better tasks to do, and to deallocate memory at the right moment by doing it for all unreferenced data in one go.
Of course, in theory a single program can be just as efficient on its own, but in the context of other programs running at the same time as yours, it's far less obvious. You'll most likely slow the machine by running memory deallocation while there is still plenty of memory left, etc.
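Worth noting that CPython, the language this thread is about, actually mixes both models being debated here: reference counting frees most objects synchronously (deterministically, RAII-style), while a separate collector reclaims reference cycles asynchronously. A minimal sketch, assuming CPython semantics:

```python
import gc

class Noisy:
    def __del__(self):
        print("freed")

x = Noisy()
del x            # refcount hits zero: freed immediately and deterministically

a, b = Noisy(), Noisy()
a.partner, b.partner = b, a   # a reference cycle: refcounts never hit zero
del a, b                      # nothing is freed yet...
gc.collect()                  # ...until the cycle collector runs: two "freed"
```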
I'm not criticising C++; it's one of the best programming languages ever made. But I'm not talking about systems programming or OS/kernel work. Systems programming requires efficiency more than productivity. AI, ML, and DS are completely different: if you tackle them with a VM language like Python or JS, that's fine. But when it comes to systems programming, VM languages are a horrible choice. Despite all the complexities and drawbacks, C/C++ are by far the best choice for systems programming.
Just mentioning that there are GC libraries available for C/C++. You are not forced to use them.
A clarifying question: where does Julia fit in this landscape?
There are several AI projects that are C/C++ based and don't necessarily require Python to use them: lmemsm.dreamwidth.org/16168.html
Let me know your opinion on this topic.