Tim BL always said that he regretted making URLs complicated (scheme, domain, and path) rather than fully hierarchical, and that if he got a do-over, instead of https://dev.to/ben/post we would have https/to/dev/ben/post. Just thought I'd throw that in as an interesting fact :)
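For context, here's how today's non-hierarchical URL splits into differently parsed parts, using the standard URL API (the hierarchical form at the end is hypothetical; no parser exists for it):

```javascript
// Today's URLs mix three different naming systems: a scheme, a
// DNS hostname (parsed right-to-left), and a path (left-to-right).
const url = new URL("https://dev.to/ben/post");
console.log(url.protocol); // "https:"
console.log(url.hostname); // "dev.to"
console.log(url.pathname); // "/ben/post"

// In a fully hierarchical scheme, everything would be one path
// (note the domain labels flip: dev.to becomes to/dev):
const hierarchical = ["https", "to", "dev", "ben", "post"];
console.log("/" + hierarchical.join("/")); // "/https/to/dev/ben/post"
```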
Browser = Window to Internet
AOL is not the Internet
Something.anything-else would just collapse (people, products, links everywhere), so .com was it.
The numbers: speakable 3/4-letter .coms are very rare. Companies/brands/people/things are not.
Type-in traffic (autocompletion, for example, usually goes to .com).
Age/authenticity/credibility/income-generating, appreciating assets.
Even in 2000, people were saying "it's so hard to find a good domain".
It's better now.
I have .coms & people 'respect/trust' it more (I don't really care, but it's prized). Also, native ccTLDs per country (.fr, .de, etc.).
google.co.ck nsfw (that tag is not really gonna work in this instance, but hey, it's a legit domain).
Internet today? Big set of gates with a {hired} famous person smiling & asking for your credit card + two forms of identity, a signing of a TOS before you can enter & get your @internet handle. 9-6 operating hours. WWW would be a 'room' for select credentialed professionals. Read only.
It would not be open. DRM everywhere, including the pages and the code, not just media. Maybe it'd be faster and less burdened with backwards compatibility, but with even more security problems. It would be run by corporations and maybe governments. Kind of a dark picture I have in mind, based on current trends.
shudders
But yeah, totally, it'd be anything but open. I recently read this article that mentions that the internet is becoming a Dark Forest: onezero.medium.com/the-dark-forest...
I find it amazing to be able to be part of this, but sad nonetheless.
I didn't know about this theory. But I know what he's talking about. I also prefer to hang out in private groups and channels. It feels safe, cozy, friendly and you can be open, bold, honest, goofy, can cross some lines you'd never cross in public. The "internet" in this case is just a protocol or transport, not the World Wide Web as it was meant to be.
The funny thing is that something similar was actually proposed before the internet. A sort of micropayment-driven digital distributed super-library. Might not have turned out too bad.
I think you're spot on though. IT/software is too popular right now, making it a target for greed rather than just personal interest.
Gotta say - almost every one of these responses is looking at the web from a purely front end perspective.
The web as a platform - REST - is a vital part of the internet. If you're proposing to change the web to fit only with your vision of the front end, recognise that you'll be losing a lot of its power and expressiveness for the backend, which was a part of what made it so successful in the first place.
Dear JavaScript developer: the web isn't just about you.
I think it would be built from the ground up to be an application platform. We're getting to that place with WebGL, PWAs, and Web Assembly, but if we were to start over those features would be built first. JavaScript would get rid of "the bad parts" and take a role as a true scripting language instead of an application language.
I think it would be built from the ground up to be an application platform.
Oh dear no I hope not. One of the joys of the web is its transparency and discoverability. HTML is easy to write and read - and inspect on a website you're browsing. HTTP 1.1 is as well - you can easily learn how the web works by reading network requests.
The web as application platform, delivered as opaque blobs of WASM, is no longer the open web. If it had been written like that it would have failed.
In fact, it did fail. It was called Java applets and Flash.
Have you looked at any JavaScript application bundle? Everyone minifies, uglifies, and bundles for performance, and the result is unreadable. Without the source maps and dependency list, an inspection of modern JavaScript tells you nothing. Client-heavy applications are some of the most popular websites today, and some of the most used developer applications are Electron apps. You can argue that this is "Not The Right Way To Make Software" (tm), but the fact is the web as application platform has been very successful. So I stand by my original post. Knowing what the web becomes, I think it would be prudent to design it for efficiency as an application platform first, and a document sharing tool second.
Have you looked at any JavaScript application bundle? Everyone minifies, uglifies, and bundles for performance, and the result is unreadable. Without the source maps and dependency list, an inspection of modern JavaScript tells you nothing.
I feel you're making my argument for me here.
Client heavy applications are some of the most popular web sites today, and some of the most used developer applications are Electron apps.
How on earth is that an argument for the web to be a JavaScript application platform? An Electron app doesn't even need to be connected to the Internet, it doesn't need to be on the web at all. You could send me a copy on a ~floppy disk~ USB drive. If anything it's reinforced the idea that JavaScript 'web' apps have very little to do with the web at all.
You can argue that this is "Not The Right Way To Make Software" (tm)
I can and I do. But I haven't trademarked it. Yet.
but the fact is the web as application platform has been very successful.
The web has been very successful. The web as application platform has been very successful. JavaScript applications running in the browser, delivered over the web, with the web browser being used as a host-slash-interpreter-slash-GUI-library is... well, I'll just say it's ridiculous if you're just trying to make a recipe website, but pretty cool if you're building Google Docs.
Most people are not building Google Docs.
Knowing what the web becomes, I think it would be prudent to design it for efficiency as an application platform first, and a document sharing tool second.
The web is not just websites. It's a series of protocols, standards and conventions that have been used as an application platform for decades now. Not applications in the sense that you're using it - JavaScript in the browser - but distributed systems communicating over HTTP using principles such as REST. The reason the web has been successful is that it covers this use case as well as 'document sharing' (you make it sound like Dropbox).
As I've said in another comment above: the web is not just about the front end.
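As a concrete aside on the minification point raised in this exchange: bundling preserves behavior while destroying readability. A toy before/after, hand-minified here just for illustration (not the output of any real minifier):

```javascript
// Readable source: easy to inspect and learn from via "View Source".
function totalPrice(items, taxRate) {
  const subtotal = items.reduce((sum, item) => sum + item.price, 0);
  return subtotal * (1 + taxRate);
}

// The same logic after typical minification: semantically identical,
// but inspecting it tells you almost nothing about intent.
const t = (i, r) => i.reduce((s, x) => s + x.price, 0) * (1 + r);

console.log(totalPrice([{ price: 10 }, { price: 20 }], 0.5)); // 45
console.log(t([{ price: 10 }, { price: 20 }], 0.5)); // 45
```

Both sides of the debate can point at this: behavior survives (the platform works), transparency does not.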
I agree, I think WASM would be the "base language of the web", and JS would just happen to be one of the compatible languages, but on an equal playing field with Rust, Go, Kotlin, and all the other languages that compile to WASM. It would be more like desktop programming where all languages are equally valid, rather than having the one language of JS dominate the entire landscape.
It'd probably be worse, since I assume it'd be some mega-corporate committee that sets about the task of recreating it.
Look at HTML, CSS, and JS. All of these have had numerous opportunities to add clearer constructs and make things easier. Yet at every iteration they also add a bunch of nonsense and fail to address critical errors. This effect would be magnified if the same people were to recreate the entire web.
I actually don't see the problem with just deprecating old standards. Our world is changing so fast that any active service/content provider needs to make changes frequently anyway. If we wish to access old content, then we can use special legacy browsers. It's not like a Blu-Ray player can read VHS tapes, so why should a browser be able to browse content from 20 years ago?
Evolutionary design is by far the best approach, but we do need to drop compatibility with old crap. It'll also help a lot for learning, by thinning out the garbage.
But no, redesigning would be a failure. I'm pretty much against up-front designs, since they tend to fail. Everything must be evolutionary to work.
I think you mean iterative design. AFAIK 'evolutionary' would involve having a population of solutions that are selected based on fitness, then mutated and recombined to create a new generation of solutions. Rinse & repeat. You might argue that is how human populations/companies work, but it seems a bit of a stretch.
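The 'evolutionary' loop described here is easy to sketch as a toy genetic search (the fitness function and all the numbers below are made up purely for illustration):

```javascript
// Toy evolutionary search: evolve a number toward a target value.
// Population -> select by fitness -> recombine/mutate -> repeat.
function evolve(target, generations = 200, popSize = 20) {
  let population = Array.from({ length: popSize }, () => Math.random() * 100);
  const fitness = (x) => -Math.abs(x - target); // higher is better

  for (let g = 0; g < generations; g++) {
    // Selection: keep the fitter half (elitism preserves the best so far).
    population.sort((a, b) => fitness(b) - fitness(a));
    const parents = population.slice(0, popSize / 2);
    // Recombination + mutation: average two parents, then nudge randomly.
    const children = parents.map((p, i) => {
      const mate = parents[(i + 1) % parents.length];
      return (p + mate) / 2 + (Math.random() - 0.5);
    });
    population = parents.concat(children);
  }
  population.sort((a, b) => fitness(b) - fitness(a));
  return population[0];
}

// With high probability the best individual lands very close to 42.
console.log(Math.abs(evolve(42) - 42) < 1);
```

Which is indeed a different process from the iterative "ship, observe, adjust" loop the parent comment is actually advocating.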
I know this relates to the network side rather than the WWW components, but it's all connected in the end: make all IP addresses IPv6, so IPv4 had never existed (I'd tweak how IPv6 is implemented, but that's beside the point).
I think it would have encryption, identity management and payments built in from the ground up. Looking at what is emerging with Web 3.0 in the blockchain field (which I predict will be the web in 5-10 years time), these are the kinds of things people are creating. Maybe the WWW is being recreated today, though I doubt it will emerge without legacy dependencies...
If I were to make it from the ground up, there would be no such thing as a browser. The whole OS is the browser. You can "add it to your home screen" or search for it in the "appstore", kinda like a PWA. But one thing that will stay is different search engines to index the data and display relevant info, similar to the Android Slices API. Since there is no browser, there will be no single language that "sites" have to be made in. So you can use whatever the hell you want, and all those languages and runtimes will be updated at the OS level and will have a "manifest.json" to check for compatibility.
It's hard to imagine. So many of the conventions we use now are because of the early implementations. Maybe if we were more advanced, more centralization could have happened. Luckily it all fell apart and we have a more decentralized web.
We would start building dependencies for the next generations.
It only reached this point because we had 20+ years to fool around with it. It will continue to grow, change, and get better (or worse). There is nothing we can do about it.
If the WWW were rebuilt from scratch today, it wouldn't be very different from what we've been moving toward for the last 10 years: a more and more centralized network owned by big companies controlling everything.
I hope it would be more secure by default and everything would just be HTTPS.
But really, take what everyone wants it to be, then add 10 years, and we're asking this question again. There will always be legacy dependencies to support. It's just the nature of our industry and how it evolves. I know of technologies today that were new around 10 years ago and are now supporting legacy dependencies.
One thing that sticks out for me is that the web was very much originally designed as a way to present static content. CSS was built around the idea that you had some content and you could provide various styles to alter the presentation of that content. While this works quite well for documents, I don't think it works as well for applications. This has been addressed to some extent with more modern layout mechanisms like flex and grid. Still, it seems kind of clunky and confusing. I have a feeling we might have a more elegant layout system if it didn't have to integrate reasonably well with existing CSS.
Another thing that comes to mind is security. Security for web applications, with CORS and the constant threat of lurking XSS and CSRF vulnerabilities, is just kind of a nightmare. I would really like for there to be a brand-new, really straightforward standard that made it easy to build security into web applications. Security is inherently a difficult area, but I can't help thinking that there's got to be a better way than what we currently have.
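To make the XSS point concrete: today, safety depends on every app remembering defensive steps like the hand-rolled escaper below (illustrative only; real applications should rely on a vetted templating library that escapes automatically):

```javascript
// Minimal HTML escaper: the kind of defensive step today's web makes
// every application remember on its own. Forgetting it anywhere that
// user input meets HTML is an XSS hole -- which is the point about
// security being bolted on rather than built in.
function escapeHtml(untrusted) {
  return untrusted.replace(/[&<>"']/g, (ch) => ({
    "&": "&amp;",
    "<": "&lt;",
    ">": "&gt;",
    '"': "&quot;",
    "'": "&#39;",
  }[ch]));
}

const comment = '<script>steal(document.cookie)</script>';
console.log(escapeHtml(comment));
// "&lt;script&gt;steal(document.cookie)&lt;/script&gt;"
```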
I can't see how it would work. Everyone would charge for everything that goes over their part of the network. HIPAA, GDPR, and related laws would probably scare away investors.
The cultural nature of domains as we know them today makes this such an odd thing to think about. ".com" is such a thing. It was so hard to get..
I don't like https/to/dev/ben/post because it removes the at-a-glance organization of why URLs are structured that way. However, that said, I would reverse domain names and prefer https:/to.dev/ben/post. The parts of a URI are immediately identifiable from just looking at them.
That is interesting. I prefer the way we have it, mostly.
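The reversed-domain idea above mirrors Java's package naming convention (com.example.app); a one-liner makes it concrete:

```javascript
// Sketch of the reversed-hostname idea from the comment above:
// dev.to -> to.dev, www.example.co.uk -> uk.co.example.www
// This would make URLs read broad-to-narrow, left to right.
function reverseHostname(hostname) {
  return hostname.split(".").reverse().join(".");
}

console.log(reverseHostname("dev.to")); // "to.dev"
console.log(reverseHostname("www.example.co.uk")); // "uk.co.example.www"
```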
HTML
Basically no HTML. XML-like structures are just too verbose. No more legacy event handling and no more distinction between attributes and properties.
JavaScript
No more global objects for APIs. No more document, or fetch, or Date, or Promise. We'd have to explicitly import from modules, like we'll have to do to use the brand new KV Storage.
CSS
A solid way to encapsulate styles and expose parts of components to theming (see Shadow DOM, and ::part and ::theme). No more legacy WTFs (like using vertical-align for two completely different things).
<input> would be split up into <textbox>, <checkbox>, etc., just as <textarea> and <button> became individual elements. Additionally, the custom elements API might be a lot more fleshed out, so that the native elements could be defined as Custom Elements.
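The "no globals, explicit imports" idea can be sketched with a toy module registry standing in for hypothetical built-in modules (the names below are made up; only KV Storage's std:kv-storage specifier was actually proposed in this shape):

```javascript
// Toy sketch of "no ambient globals": every web API must be imported.
// We fake the built-in module registry with a plain object here.
const stdlib = {
  "std:datetime": { now: () => Date.now() }, // ironically built on the global
  "std:kv-storage": { storage: new Map() },
};

// Stand-in for `import { now } from "std:datetime"`:
function importFrom(specifier) {
  if (!(specifier in stdlib)) {
    throw new Error(`Unknown built-in module: ${specifier}`);
  }
  return stdlib[specifier];
}

const { now } = importFrom("std:datetime");
console.log(typeof now()); // "number"

const { storage } = importFrom("std:kv-storage");
storage.set("greeting", "hello");
console.log(storage.get("greeting")); // "hello"
```

The design win is that unused APIs simply never enter scope, and typos fail loudly at import time instead of silently hitting a global.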
Amen.
It would have all the new tech from today, but then that would be "legacy" in a decade.
Markdown-based :-)
No multiple JavaScript engines, only different approaches to the user experience around using a web browser. A single entity, helped by the GAFAM, improving the engine: no more compatibility issues, no split workforce, features focused on security and developer experience.
Ah man... If only...
Nothing would be in the global scope in JavaScript. All of the 'std lib' web APIs would use import.
We need to clarify 'dependencies' in this context. Are we talking technical, or commercial? Are we assuming the American military still invents the Internet, and scoping this question specifically to the web (HTTP, invented by Tim BL)?
No inline JS/CSS.
And likely no "View Source" option.
No "View Source" would be very sad.
JavaScript would be a much better language, and CSS might be Turing complete.
A federated Google Wave would be my wish.
We should draw inspiration from Google Wave for future DEV directions.
IPFS ipfs.io/
The two most important pieces that I can think of are: 1) ES6 everywhere, and 2) dependable stacks with scalable applications.
Serverless?
Two-way hyperlinks.
Client-side Ruby instead of JavaScript?
By the time the web was built, it would have fallen behind already.