"Scripting language" has always struck me as a weasel word. It lacks a formal definition, but is applied liberally to any language which a developer deems "not a real programming language" (whilst turning up their nose in bourgeois disgust.)
If they mean "interpreted language," that's something specific, but it would also mean dropping Java.
And...legacy? Since when is Python "legacy"? It's one of the most popular application development languages today, fer cryin' out loud. (On that note, Ruby and Perl aren't even remotely "legacy" either.)
I have a theory what's going on, though...I think Apple is just trying to "encourage" (force) developers to use their in-house languages, and build more Apple-only software. It's the whole walled-garden game again.
But then, I started boycotting Apple products four years ago, when they started explicitly blocking Linux from being able to communicate with iDevices. (And yes, tech support confirmed this.) Dirty pool is just their game.
Apple lovers: come to Linux! There's freedom on this side of the garden hedge. ;)
Agreed. It really irks me when people use the term "scripting language". Even if the term is valid in the context, they're actually using it to describe a language as inferior because it's not compiled. Many of the "engineers" at my work still don't consider me their equal because I program in JavaScript and not C#, which contains the almighty compiler and static type system 😂
Have had many hats on in my life: Developer, Team Lead, Scrum Master, Architect and Product Owner. Now back to developer \o/ Interested in product discovery, quality assurance and language design.
Rest assured that at least I and many others that have >15 years of experience in the industry don't value people's skills based on the language they use.
Nothing wrong with the term scripting language. It's a language which is compiled on the fly when executed. Scripting language have a purpose. Not just he domain specific languages like R, but also the more script generic languages like Perl and Python.
This is just more of Apple tightening their grip and narrowing their eco system. Apple lost their ways years ago.
I never said a scripting language is an interpreted language. The latter has a definition. The former actually doesn't.
Also, Java is indeed interpreted. It is compiled to bytecode (not machine language), which is executed by the JVM. This is why the end-user has to have the Java runtime installed on their machine.
It is technically possible (albeit uncommon) to compile Java down to machine code, but arguably, the same is true of Python.
Yeah, I as about to say the same thing. Also In theory it’s possible to compile any Interpreted language to Machine code. the different between an ”interpreted” language and a compiled language is that an interpreted language is compiled to Machine code just before execution, example being JavaScript Just in time compiler. Were as a compiled language is compiled to Machine code ahead of time.
If I wanted to I could make a A head of time compiler for JavaScript/ node.js.
Java is not interpreted. The fact that it is compiled at all is the end of that story... But like, good talk? The jvm virtualizes a cpu. Sooooooo basically what you're saying is that if you run a c program in a virtual machine. C becomes interpreted.
Edit sorry a c Linux program on like, Windows
Edit 2 (sorry I'm drunk) in the event you're unaware. There exist[ed] processors that natively ran Java. So, what I'm talking about with c, isn't really unreasonable.
Edit 3 for funsies is assembly now interpreted in the event your writing it on a modern computer with the 8086 emulator?
By you logic Typescript must be a compiled language due to the fact you have to ‘compile’ it to JS, despite JS being an interpreted language itself. However we still consider TS to be an interpreted language.
For a language or application to be considered ‘fully’ compiled we say it must be fully compiled and shipped As Machine readable code or Machine code. This it not True for Java, which uses a intermediate language know as byte code. The Java runtime environment takes that byte code and produces Machine instructions at runtime. For the most part Java can not run on ‘Bare metal‘ systems such as embedded systems without a Java runtime environment.
Java is not a compiled language as further computation has to occur at runtime for the program’s instructions to be acted upon. Yes C can be an interpreted language with a C interpreter.
for the most part is irrelevant.
Java optimized processors exist. The jvm virtualizes said processors. That is the code that is being compiled down to.
Which is why I brought up 8086, according to your logic. Every language is interpreted.
Just a reminder that java and JVM were both made be years before Jazelle or anything like it came about.
This is why most people take Java to be interpreted, and it was designed to be interpreted.
Doesn't matter. Like even a little. There exists a hardware spec.
Anyway, here bud, user perspective.
Interpreted: code executes from human readable project.
Compiled: user compiles project, then runs output file.
From a user perspective it's that extra step that matters.
FYI "most people" is quite a stretch. I'll grant you that it's a debated topic. But the actual issue of this topic and why we're discussing it. Is that "interpreted" is actually not well defined, as people seem to think.
Some people consider vm languages like Java and C# interpreted because they typically run in vms, but there's really no reason they can't run on baremetal; we in fact proved Java can.
For me, this makes the most clear definition of the two, the user perspective I posited before. Because I can create an environment that all languages are interpreted based on the vm/emulator situation. You could certainly compile anything down if you try hard enough, as you stated before.
So from a user perspective. How do your execute your code. Does your human readable code get compiled down to lower level code, and then you get to run it? It's compiled.
Do you send your human readable code directly to an interpreter? Interpreted.
I find your tone off-putting. There's no reason to be abrasive.
To the point people are making, they are correct. The fact an application is distributed in a way that's human readable is a minor distinction. That means more to the humans than the computer. They are all loading an application which does not contain native machine code and jitting it by the time it is executing. Node and Python are no more interpreted than Java is.
The major distinction is whether a language runtime needs to be preinstalled or distributed with an application. In that regard, c/c++/golang are in one category, and python/node/java/c# are in another
Well again. There exists a hardware spec for Java, you can run it natively with the correct hardware. So that goes back to my point of emulators. Just because you are virtualizing that hardware in most cases, does not mean you don't need to compile it down for that hardware.
In that regard c and Java are effectively the same. You just happen to have the hardware to run what gcc will shit out.
I believe you misinterpreted my human readable distinction. I was trying to explain what you pass through for the actual execution.
With traditionally accepted interpreted languages, you don't preform any extra steps to execute your code. Just throw it through the interpreter.
With non interpreted, you have to compile it down before the code will execute. That's how Java works. I also accept that some people claim Java to be interpreted, but in that regard you have a very gray area on what "interpreted" actually means.
Java, as a language, is not interpreted. Byte code is not the written language.
Typescript is mixed because Javascript is valid and not compiled in Typescript. It is more analogous to the C preprocessor, which is referred to as a macro language.
Scripting languages are not well defined, I utilize D as my scripting language, but it is fully compiled to machine code. Then you through in JIT and things get more confusing.
To better understand, it is best to look at the term for the time it was emerging. You had C and Bash, Lisp and Fortran. Languages like perl and php follow closer to the style for bash, these languages start execution at the file entry and don't define a special entry (main).
As for inferiority of scripting over a real language, we need to look at the level of understanding necessary to use the language.
Bash required writing your shell commands to a file then calling bash on it. Similarly languages like visual basic would add container iteration. C required learning pointers and memory layout. While scripts could easily build the description of a task, but would be limited in performance. Today machines are resource abundant and optimization techniques are identified.
C++ was long considered a compiled language, but it wasn't until Walter that the first compiler to build machine code instead of C existed.
On the other hand, the drivers of Linux are open source, and the quality of the trackpad is mostly determined by the software. So you're free to make it as good as you wish.
Hardwire-wise, the trackpad on puri.sm/products/librem-15/#specs seems to be to be as good
My razer's pad feels good to me (note I use a 2014 MacBookPro 60% of the time, 2018 MacBook Air 35% of the time, and the Razer 5%). But the trackpad software (or windows in general?) has issues. I feel like sliding my finger across the pad to move the mouse I end up selecting things I don't want to. I suppose that could be bad hardware but I think it's more likely bad software
If you read carefully, "legacy" refers to the software using those languages, not the languages itself.
Unfortunately the lack of lots of professional softwares it's a massive drawback for Linux, and no, the many alternatives are simply not a viable solution for the majority of professionals out there, having tried it myself many times.
What I was referring to is that a considerable amount of non-legacy software still being updated, distributed, and used relies on these languages, but their statement is rather flippantly dismissive of all that.
Unfortunately the lack of lots of professional softwares it's a massive drawback for Linux, and no, the many alternatives are simply not a viable solution for the majority of professionals out there, having tried it myself many times.
Hm. I've known quite a lot of professionals, and many are very happy with Linux software. I'd go as far as to say "a majority" I know do find that Linux is a viable solution. The past five years especially has seen major expansions of the software available on this platform, either directly or via compatibility layers.
I graduated in 1990 in Electrical Engineering and since then I have been in university, doing research in the field of DSP. To me programming is more a tool than a job.
About the definition of "scripting language" (that I do not consider a pejorative term). Missing a formal and precise definition, I personally would call a language "scripting" if
it is interpreted, so you do not have an explicit compilation phase (maybe an internal one)
it is not "native interactive," that is its main way of using does not involve an interactive shell.
With this definition Ruby and Python are scripting language, while matlab/octave is not (they are used via a shell), neither Ada nor C (they are compiled). Note that Ruby has its "shell" irb that allows you an interaction with the language, but irb is used (at least, I use it...) just for "fast and dirty" tests, while the main way of use it is to write programs with it.
The definition above seems at odd with "shell scripting" since a shell is interactive by its own nature. Indeed, I personally do not consider bash (or csh, zsh, ...) a language, but an interpreter. The main way of interacting is via command line, but you can put the commands in a file too.
As said above, "scripting" is not (in my feeling) a pejorative term. Every language is a tool, and different problems require different tools. I personally use Ruby for simple programs (I noticed that maintaining gets difficult when the program is too complex), especially if they involve parsing text files (very easy with Ruby). For something more complex or long-lived I go with Ada that IMHO is more suited for writing complex software.
Well, see, you're still only describing interpreted languages, really, with some additional arbitrary criteria. You're welcome to categorize how you like, but it still lacks a real definition, and the meaning varies from one person to the next. Therefore, in terms of policy setting, the term is altogether useless.
Also, I might be missing something you're saying, but Python is very much used via a shell, although you can also execute Python programs directly.
For something more complex or long-lived I go with Ada that IMHO is more suited for writing complex software.
There is a lot of large, complex software for which Python is well-suited. I, like thousands of developers, have created full-blown applications in the language, that are just as maintainable as, say, something written in C++.
Naturally, though, you can (and should) use what you work best in. Just remember that your experience is that: your experience. Plenty of developers are just as successful at making large, complex (and maintainable) software in so-called "scripting languages."
Every language is a tool, and different problems require different tools.
I agree with this much, at least.
I just see Apple's dismissing an entire set of languages, which have been used to deploy production software that people use today, as crummy garden-walling.
Dude lol. They're not calling python/ruby/whatever legacy. It's the versions of those languages they ship with that are legacy, and quite frankly shouldn't be used anyway. Sure they could update them, but obviously they're lazy, and anyways we have very easy means to get most recent versions of these now days
A lot of software exists that relies on these languages that is not legacy, but their phrasing dismisses that reality altogether. That's what I'm referring to.
Sure they could update them, but obviously they're lazy...
Which is part of my point. Apple is breaking the entire ecosystem instead of updating like literally every other UNIX-like system.
...and anyways we have very easy means to get most recent versions of these now days.
Define "easy". In my experience, it has never been obvious how to install anything of the sort on Linux. I've always had to use Homebrew to install Python, and OOPS, wasn't that written in Python? Hmm.
If someone isn't technically inclined, and many users aren't, would they really want to take the time to follow some convoluted process to install something they don't understand? It isn't even (IME) as simple to install as on Windows, where you only need to run a three-click installer.
The non technically inclined will certainly not be effected by this. You'd have to be technically inclined to have a need for python to begin with
Ya as a developer it's going to make my life slightly harder. There will be a bootstrapping problem. Apple is abandoning developers, I'm not arguing against that.
I was just being purposely pedantic on your statement about them calling the languages themselves legacy
You'd have to be technically inclined to have a need for python to begin with
Entirely untrue. Python is used to create quite a lot of end-user software, which now either must be packaged specifically for Apple, or will require Python to be manually installed on their machine. So, this will absolutely affect non-technically inclined people.
This is where you and I will not come to an agreement. I'm a firm believer that we should already be bundling python with our apps. No worries about compatibility, and easier for the end user. (IMO)
If all the ui people got together and stopped the "fightin" and built one decent window manager with a decent ui with similar ux to macos... A huge number of people would switch.... Linux desktop needs to be easier. Example.. Compare time machine to what? Rsync? Or what? Exacly... Or what.
Apple has a lot of nice shit.. And you don't have to search high and dry to get something done. I like my nice desktop..
The reason we have multiple window managers and multiple UIs is because everyone has different desires and preferences. Experimentation and free play of ideas is how we improve things. Projects tend to actually share ideas, but we don't force everyone to do things ONE WAY.
Linux is the LEGO-bricks of the computer world. You can build what you want, or select any of a dozen pre-built sets. If you don't like one, there are options to tweak it or change it.
Regarding backups, personally I've had good experiences with Duplicity, but there are several options depending on what you need. Also, Linux Mint has a system snapshots system that is on par with anything Windows or Apple has ever offered.
But, if you don't want to like Linux, you don't have to like Linux. Stick with Apple if you like it. Just understand that your problem is rooted in your perspective, not the realities of the OS. Your view isn't shared by thousands of happy Linux users. You're entitled to your opinion, just don't conflate it with objective fact.
It was handquotes fightin. Without the g. Meaning not really fighting but tension exists. And sure free reign and not forcing people to do thing a certain way fine. I agree with that. But Linux will never have a general consumer desktop that is popular because of this. I didn't say throw out the system. I said the system needs to adapt to general consumer needs of they want to blow out Apple. I think you missed the point my dude.
For the record I've been using Linux since Slackware 3. Not once did I ever say that I don't like Linux and you need to stick with Apple.
Linux will never have a general consumer desktop that is popular because of this.
My computer repair clients, all of them non-technical, would disagree.
I suspect the only reason it doesn't have more market share is the fact that you can't walk into most stores and buy a computer with Linux already on it. The majority of general consumer users aren't going to install any operating system themselves. They'll only use the default.
OS popularity is a function of OEM installs, not OS merit.
I think you missed the point my dude.
I just hear a lot of empty griping about "Linux is broken because it isn't always default-identical to this thing over here that I already like." That's what your post sounded like.
Meaning not really fighting but tension exists.
Once again, not really. We live and let live. I hate KDE myself as a user, but I appreciate the fact it meets the needs of people who like it! That sort of attitude is common. We all have our preferences, which is why the existence of multiple DEs and WMs is so awesome. I seldom encounter anyone in the Linux world who feels their DE is unilaterally superior to all others.
I used to have a MacBook that I loved (unibody was my last experience) and I was happy with it. I got some Bootcamp up and running and some VMs to learn Linux. After that I went to Lenovo, but I got a Yoga and didn't think about how it would be unfriendly with a Linux install compared to other ideapads and thinkpads. VMs still work though. If I had gone for the yoga thinkpad I would be running Linux natively for sure.
"Scripting language" has always struck me as a weasel word. It lacks a formal definition, but is applied liberally to any language which a developer deems "not a real programming language" (whilst turning up their nose in bourgeois disgust.)
If they mean "interpreted language," that's something specific, but it would also mean dropping Java.
And...legacy? Since when is Python "legacy"? It's one of the most popular application development languages today, fer cryin' out loud. (On that note, Ruby and Perl aren't even remotely "legacy" either.)
I have a theory what's going on, though...I think Apple is just trying to "encourage" (force) developers to use their in-house languages, and build more Apple-only software. It's the whole walled-garden game again.
But then, I started boycotting Apple products four years ago, when they started explicitly blocking Linux from being able to communicate with iDevices. (And yes, tech support confirmed this.) Dirty pool is just their game.
Apple lovers: come to Linux! There's freedom on this side of the garden hedge. ;)
I agree with this sentiment. I really need to get on the Linux train.
Both Ubuntu and MATE can be customized to look like macOS, just FYI. ;-)
elementaryOS is worth a try too!
Does elementaryOS run smoothly on 4GB RAM? Last I tried was version 0.2.2, with so many glitches. How much RAM do you have?
Yes, elementaryOS (0.4 Loki) ran smooth on my 4GB machine.
Agreed. It really irks me when people use the term "scripting language". Even if the term is valid in the context, they're actually using it to describe a language as inferior because it's not compiled. Many of the "engineers" at my work still don't consider me their equal because I program in JavaScript and not C#, which contains the almighty compiler and static type system 😂
Wow, that's really sad to hear.
Rest assured that at least I, and many others who have >15 years of experience in the industry, don't value people's skills based on the language they use.
Nothing wrong with the term scripting language. It's a language which is compiled on the fly when executed. Scripting languages have a purpose. Not just the domain-specific languages like R, but also the more generic scripting languages like Perl and Python.
This is just more of Apple tightening their grip and narrowing their ecosystem. Apple lost their way years ago.
"Scripting language" does mean interpreted. Also, Java is not an interpreted language. You compile it.
I never said a scripting language is an interpreted language. The latter has a definition. The former actually doesn't.
Also, Java is indeed interpreted. It is compiled to bytecode (not machine language), which is executed by the JVM. This is why the end-user has to have the Java runtime installed on their machine.
It is technically possible (albeit uncommon) to compile Java down to machine code, but arguably, the same is true of Python.
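(A minimal sketch of that last point, using only CPython's standard dis module: Python source is likewise compiled to VM bytecode before anything executes.)

```python
# Minimal sketch: CPython, like Java, compiles source to bytecode that a
# virtual machine then executes. The stdlib `dis` module shows that bytecode.
import dis

def greet(name):
    return "Hello, " + name

dis.dis(greet)  # prints VM instructions such as LOAD_FAST and RETURN_VALUE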
Yeah, I was about to say the same thing. Also, in theory it's possible to compile any interpreted language to machine code. The difference between an "interpreted" language and a compiled language is that an interpreted language is compiled to machine code just before execution, an example being JavaScript's just-in-time compiler, whereas a compiled language is compiled to machine code ahead of time.
If I wanted to, I could make an ahead-of-time compiler for JavaScript/Node.js.
I said it was. I was answering your question.
Java is not interpreted. The fact that it is compiled at all is the end of that story... But like, good talk? The JVM virtualizes a CPU. Sooooooo basically what you're saying is that if you run a C program in a virtual machine, C becomes interpreted.
Edit: sorry, a C Linux program on, like, Windows.
Edit 2: (sorry, I'm drunk) in the event you're unaware, there exist[ed] processors that natively ran Java. So what I'm talking about with C isn't really unreasonable.
Edit 3, for funsies: is assembly now interpreted in the event you're writing it on a modern computer with an 8086 emulator?
By your logic, TypeScript must be a compiled language due to the fact you have to 'compile' it to JS, despite JS being an interpreted language itself. However, we still consider TS to be an interpreted language.
For a language or application to be considered 'fully' compiled, we say it must be fully compiled and shipped as machine-readable code, i.e. machine code. This is not true for Java, which uses an intermediate language known as bytecode. The Java runtime environment takes that bytecode and produces machine instructions at runtime. For the most part, Java cannot run on 'bare metal' systems such as embedded systems without a Java runtime environment.
Java is not a compiled language, as further computation has to occur at runtime for the program's instructions to be acted upon. Yes, C can be an interpreted language with a C interpreter.
There exists hardware that can run it natively. Try again.
"For the most part" is irrelevant.
Java-optimized processors exist. The JVM virtualizes said processors. That is what the code is being compiled down to.
Which is why I brought up the 8086: according to your logic, every language is interpreted.
Yes, in theory every language can be interpreted.
Just a reminder that Java and the JVM were both made years before Jazelle or anything like it came about.
This is why most people take Java to be interpreted, and it was designed to be interpreted.
Doesn't matter. Like even a little. There exists a hardware spec.
Anyway, here bud, user perspective.
Interpreted: code executes from human readable project.
Compiled: user compiles project, then runs output file.
From a user perspective it's that extra step that matters.
FYI "most people" is quite a stretch. I'll grant you that it's a debated topic. But the actual issue of this topic and why we're discussing it. Is that "interpreted" is actually not well defined, as people seem to think.
Some people consider vm languages like Java and C# interpreted because they typically run in vms, but there's really no reason they can't run on baremetal; we in fact proved Java can.
For me, this makes the most clear definition of the two, the user perspective I posited before. Because I can create an environment that all languages are interpreted based on the vm/emulator situation. You could certainly compile anything down if you try hard enough, as you stated before.
So from a user perspective. How do your execute your code. Does your human readable code get compiled down to lower level code, and then you get to run it? It's compiled.
Do you send your human readable code directly to an interpreter? Interpreted.
Basically, a two-step versus a one-step process.
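(For what it's worth, CPython's own built-ins make that normally hidden step visible; a minimal sketch using only compile() and exec():)

```python
# The "extra step" made explicit: an interpreter runs the compile step for
# you at execution time, instead of you running it ahead of time.
source = "print(6 * 7)"

code_obj = compile(source, "<demo>", "exec")  # step 1: source -> bytecode
exec(code_obj)                                # step 2: run it -> prints 42
```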
I find your tone off-putting. There's no reason to be abrasive.
To the point people are making, they are correct. The fact an application is distributed in a way that's human-readable is a minor distinction. That means more to the humans than the computer. They are all loading an application which does not contain native machine code and JIT-compiling it by the time it is executing. Node and Python are no more interpreted than Java is.
The major distinction is whether a language runtime needs to be preinstalled or distributed with an application. In that regard, C/C++/Go are in one category, and Python/Node/Java/C# are in another.
Well, again: there exists a hardware spec for Java; you can run it natively with the correct hardware. So that goes back to my point about emulators. Just because you are virtualizing that hardware in most cases does not mean you don't need to compile it down for that hardware.
In that regard, C and Java are effectively the same. You just happen to have the hardware to run what gcc will shit out.
I believe you misinterpreted my human-readable distinction. I was trying to explain what you pass through for the actual execution.
With traditionally accepted interpreted languages, you don't perform any extra steps to execute your code. Just throw it through the interpreter.
With non-interpreted, you have to compile it down before the code will execute. That's how Java works. I also accept that some people claim Java to be interpreted, but in that regard you have a very gray area on what "interpreted" actually means.
Java, as a language, is not interpreted. Bytecode is not the written language.
TypeScript is mixed, because plain JavaScript is valid TypeScript and passes through uncompiled. It is more analogous to the C preprocessor, which is referred to as a macro language.
Scripting languages are not well defined. I utilize D as my scripting language, but it is fully compiled to machine code. Then you throw in JIT and things get more confusing.
To better understand, it is best to look at the term in the era it was emerging. You had C and Bash, Lisp and Fortran. Languages like Perl and PHP follow closer to the style of Bash: these languages start execution at the top of the file and don't define a special entry point (main), as the sketch below shows.
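(A minimal sketch of that entry-point difference, with Python standing in for the script style; nothing here beyond the language itself:)

```python
# Script-style entry: execution begins at the top of the file when you run
# `python3 script.py`; there is no language-mandated main().
print("runs immediately")

# The nearest equivalent to C's main() is only a convention, not a rule:
def main():
    print("explicit entry point, by convention")

if __name__ == "__main__":
    main()
```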
As for the supposed inferiority of scripting languages compared to "real" languages, we need to look at the level of understanding necessary to use the language.
Bash required writing your shell commands to a file and then calling bash on it. Similarly, languages like Visual Basic would add things like container iteration. C required learning pointers and memory layout. Scripts could easily build up the description of a task, but would be limited in performance. Today, machines are resource-abundant and optimization techniques are well established.
C++ was long considered a compiled language, but it wasn't until Walter Bright's compiler that a C++ compiler existed which built machine code directly instead of emitting C.
The day another vendor provides a trackpad as good as the one Apple MacBooks provide is the day I switch to Linux.
The preinstalled libraries and binaries are a tad annoying and are usually superseded by whatever Homebrew installs anyway.
On the other hand, the drivers of Linux are open source, and the quality of the trackpad is mostly determined by the software. So you're free to make it as good as you wish.
Hardware-wise, the trackpad on puri.sm/products/librem-15/#specs seems to me to be as good.
My Razer's pad feels good to me (note I use a 2014 MacBook Pro 60% of the time, a 2018 MacBook Air 35% of the time, and the Razer 5%). But the trackpad software (or Windows in general?) has issues. I feel like when sliding my finger across the pad to move the mouse, I end up selecting things I don't want to. I suppose that could be bad hardware, but I think it's more likely bad software.
Glad I made the jump after there were no Nvidia drivers for the latest OS.
If you read carefully, "legacy" refers to the software using those languages, not the languages themselves.
Unfortunately, the lack of lots of professional software is a massive drawback for Linux, and no, the many alternatives are simply not a viable solution for the majority of professionals out there; I've tried it myself many times.
What I was referring to is that a considerable amount of non-legacy software still being updated, distributed, and used relies on these languages, but their statement is rather flippantly dismissive of all that.
Hm. I've known quite a lot of professionals, and many are very happy with Linux software. I'd go as far as to say a majority of those I know find Linux a viable solution. The past five years especially have seen major expansions of the software available on this platform, either directly or via compatibility layers.
But as for you, you know your own requirements.
Add me as another former Mac developer who has made the switch to Linux.
About the definition of "scripting language" (a term that I do not consider pejorative): lacking a formal and precise definition, I personally would call a language "scripting" if
- it is interpreted, so you do not have an explicit compilation phase (maybe an internal one), and
- it is not "native interactive," that is, its main way of use does not involve an interactive shell.
With this definition, Ruby and Python are scripting languages, while Matlab/Octave is not (they are used via a shell), nor are Ada and C (they are compiled). Note that Ruby has its "shell" irb that allows you to interact with the language, but irb is used (at least, I use it...) just for quick-and-dirty tests, while the main way of using Ruby is to write programs with it.
The definition above seems at odds with "shell scripting," since a shell is interactive by its own nature. Indeed, I personally do not consider bash (or csh, zsh, ...) a language, but an interpreter. The main way of interacting is via the command line, but you can put the commands in a file too.
As said above, "scripting" is not (in my feeling) a pejorative term. Every language is a tool, and different problems require different tools. I personally use Ruby for simple programs (I noticed that maintaining gets difficult when the program is too complex), especially if they involve parsing text files (very easy with Ruby). For something more complex or long-lived I go with Ada that IMHO is more suited for writing complex software.
Well, see, you're still only describing interpreted languages, really, with some additional arbitrary criteria. You're welcome to categorize how you like, but it still lacks a real definition, and the meaning varies from one person to the next. Therefore, in terms of policy setting, the term is altogether useless.
Also, I might be missing something you're saying, but Python is very much used via a shell, although you can also execute Python programs directly.
For something more complex or long-lived I go with Ada, which IMHO is more suited for writing complex software.
There is a lot of large, complex software for which Python is well-suited. I, like thousands of developers, have created full-blown applications in the language that are just as maintainable as, say, something written in C++.
Naturally, though, you can (and should) use what you work best in. Just remember that your experience is that: your experience. Plenty of developers are just as successful at making large, complex (and maintainable) software in so-called "scripting languages."
Every language is a tool, and different problems require different tools.
I agree with this much, at least.
I just see Apple's dismissing an entire set of languages, which have been used to deploy production software that people use today, as crummy garden-walling.
Python 2 (which comes preinstalled on OSX) is indeed a legacy version, EOL is set to next Jan.
But they didn't say "Python 2". They said "Python," which includes the non-legacy Python 3.
The smarter move on Apple's part would have been to switch the default to python3.
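(A quick, hedged way to check which interpreter a bare python actually resolves to; the version string shown is just an example and varies by system:)

```python
# Run with whatever `python` the OS puts first on PATH to see whether it is
# the legacy 2.x build or a modern 3.x one.
import sys

print(sys.version)               # e.g. "2.7.x ..." on a stock macOS of that era
print(sys.version_info >= (3,))  # False under a legacy preinstalled Python 2
```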
Dude lol. They're not calling Python/Ruby/whatever legacy. It's the versions of those languages they ship that are legacy, and quite frankly shouldn't be used anyway. Sure they could update them, but obviously they're lazy, and anyway we have very easy means to get the most recent versions of these nowadays.
A lot of software exists that relies on these languages that is not legacy, but their phrasing dismisses that reality altogether. That's what I'm referring to.
Sure they could update them, but obviously they're lazy...
Which is part of my point. Apple is breaking the entire ecosystem instead of updating like literally every other UNIX-like system.
Define "easy". In my experience, it has never been obvious how to install anything of the sort on Linux. I've always had to use Homebrew to install Python, and OOPS, wasn't that written in Python? Hmm.
If someone isn't technically inclined, and many users aren't, would they really want to take the time to follow some convoluted process to install something they don't understand? It isn't even (IME) as simple to install as on Windows, where you only need to run a three-click installer.
The non-technically inclined will certainly not be affected by this. You'd have to be technically inclined to have a need for Python to begin with.
Yeah, as a developer it's going to make my life slightly harder. There will be a bootstrapping problem. Apple is abandoning developers; I'm not arguing against that.
I was just being purposely pedantic about your statement that they called the languages themselves legacy.
You'd have to be technically inclined to have a need for Python to begin with
Entirely untrue. Python is used to create quite a lot of end-user software, which now either must be packaged specifically for Apple, or will require Python to be manually installed on the user's machine. So, this will absolutely affect non-technically inclined people.
This is where you and I will not come to an agreement. I'm a firm believer that we should already be bundling Python with our apps. No worries about compatibility, and easier for the end user. (IMO)
Big examples of this are EVE Online and Sublime Text.
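(One minimal sketch of that idea, using only the Python 3 standard library's zipapp module; the myapp/ directory here is hypothetical, and unlike a full bundler such as PyInstaller or py2app this still assumes an interpreter on the target machine:)

```python
# Package a directory containing __main__.py into one executable .pyz file.
# Full bundlers (PyInstaller, py2app) go further and embed the interpreter.
import zipapp

zipapp.create_archive(
    "myapp/",                            # hypothetical source dir with __main__.py
    target="myapp.pyz",                  # single-file archive to ship
    interpreter="/usr/bin/env python3",  # shebang so it can run directly
)
```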
If all the UI people got together, stopped the "fightin'", and built one decent window manager with a decent UI and a similar UX to macOS... a huge number of people would switch. The Linux desktop needs to be easier. Example: compare Time Machine to what? Rsync? Or what? Exactly... or what.
Apple has a lot of nice shit... and you don't have to search high and low to get something done. I like my nice desktop.
UI people are fighting? First I've heard of it.
The reason we have multiple window managers and multiple UIs is because everyone has different desires and preferences. Experimentation and free play of ideas is how we improve things. Projects tend to actually share ideas, but we don't force everyone to do things ONE WAY.
Linux is the LEGO-bricks of the computer world. You can build what you want, or select any of a dozen pre-built sets. If you don't like one, there are options to tweak it or change it.
Regarding backups, personally I've had good experiences with Duplicity, but there are several options depending on what you need. Also, Linux Mint has a system snapshot tool that is on par with anything Windows or Apple has ever offered.
But, if you don't want to like Linux, you don't have to like Linux. Stick with Apple if you like it. Just understand that your problem is rooted in your perspective, not the realities of the OS. Your view isn't shared by thousands of happy Linux users. You're entitled to your opinion, just don't conflate it with objective fact.
It was "fightin'" in air quotes, without the g, meaning not really fighting, but tension exists. And sure, free rein and not forcing people to do things a certain way, fine. I agree with that. But Linux will never have a popular general-consumer desktop because of this. I didn't say throw out the system. I said the system needs to adapt to general consumer needs if they want to blow out Apple. I think you missed the point, my dude.
For the record, I've been using Linux since Slackware 3. Not once did I say that I don't like Linux and need to stick with Apple.
But Linux will never have a popular general-consumer desktop because of this.
My computer repair clients, all of them non-technical, would disagree.
I suspect the only reason it doesn't have more market share is the fact that you can't walk into most stores and buy a computer with Linux already on it. The majority of general consumer users aren't going to install any operating system themselves. They'll only use the default.
OS popularity is a function of OEM installs, not OS merit.
I think you missed the point, my dude.
I just hear a lot of empty griping about "Linux is broken because it isn't always default-identical to this thing over here that I already like." That's what your post sounded like.
Meaning not really fighting, but tension exists.
Once again, not really. We live and let live. I hate KDE myself as a user, but I appreciate the fact that it meets the needs of people who like it! That sort of attitude is common. We all have our preferences, which is why the existence of multiple DEs and WMs is so awesome. I seldom encounter anyone in the Linux world who feels their DE is unilaterally superior to all others.
ChromeOS. If there's one Linux desktop with any popularity at all with the general consumer, it's ChromeOS.
I used to have a MacBook that I loved (the unibody was my last experience) and I was happy with it. I got Boot Camp up and running, and some VMs to learn Linux. After that I went to Lenovo, but I got a Yoga and didn't think about how unfriendly it would be to a Linux install compared to other IdeaPads and ThinkPads. VMs still work, though. If I had gone for the Yoga ThinkPad, I would be running Linux natively for sure.
Would never pay 2x for 1x worth of hardware. Terrible elitist strategy, I always held a grudge against them. Never had anything Apple.
The rationale I heard was that no one uses the built-in versions of these runtimes anyway, opting instead to download their own version.