Tim BL always said that he regretted making URLs complicated (scheme, domains, and path) and not fully hierarchical. That if he were to get a do-over, instead of
`https://dev.to/ben/post`
we would have
`https/to/dev/ben/post`
Just thought I'd throw that in as an interesting fact :)

The cultural nature of domains as we know them today makes this such an odd thing to think about. "`.com`" is such a thing. It was so hard to get...
Browser = window to the Internet
AOL is not the Internet
Something.anything-else would just collapse (ppl, products, links everywhere), so .com was it.
The numbers: a speakable 3/4-letter .com is v rare. Companies/brands/ppl/things are not.
Type-in traffic (autocompletion, for example, usually assumes .com).
Age/authenticity/credibility/income-generating appreciating assets.
Even in 2000, ppl were saying "it's so hard to find a good domain".
It's better now.
I have .coms & people 'respect/trust' them more (I don't really care, but it's prized). Also, native per country (.fr, .de etc).
google.co.ck nsfw (that tag is not really gonna work in this instance, but hey, it's a legit domain).
Internet today? Big set of gates with a {hired} famous person smiling & asking for your credit card + two forms of identity, a signing of a TOS before you can enter & get your @internet handle. 9-6 operating hours. WWW would be a 'room' for select credentialed professionals. Read only.
I don't like
`https/to/dev/ben/post`
because it removes the at-a-glance organization of why URLs are structured that way. However, that said, I would reverse domain names and prefer
`https://to.dev/ben/post`
The parts of a URI are immediately identifiable from just looking at them.
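For illustration (my own toy sketch, not anything TimBL actually specified), here's how a conventional URL could be mechanically rewritten into that fully hierarchical form, with the scheme and reversed domain labels becoming leading path segments:

```javascript
// Toy sketch: turn "https://dev.to/ben/post" into the fully
// hierarchical "https/to/dev/ben/post" form discussed above.
// The function name is my own; this is just an illustration.
function hierarchicalForm(url) {
  const u = new URL(url);
  const labels = u.hostname.split(".").reverse(); // "dev.to" -> ["to", "dev"]
  return [u.protocol.replace(":", ""), ...labels].join("/") + u.pathname;
}

console.log(hierarchicalForm("https://dev.to/ben/post")); // -> https/to/dev/ben/post
```

The same transform applied to the reversed-domain variant above would give `https/dev/to/ben/post`, since reversing the labels of `to.dev` yields `dev/to`.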
That is interesting. I prefer the way we have it, mostly.
It would not be open. DRM everywhere, including the pages and the code, not just media. Maybe it'd be faster and less burdened with backwards compatibility, but with even more security problems. It would be run by corporations and maybe governments. Kind of a dark picture I have in mind, based on the current trends.
shudders
But yeah, totally, it'd be anything but open. I recently read this article that mentions that the internet is becoming a Dark Forest: onezero.medium.com/the-dark-forest...
I find it amazing to be able to be part of this, but sad nonetheless.
I didn't know about this theory. But I know what he's talking about. I also prefer to hang out in private groups and channels. It feels safe, cozy, friendly and you can be open, bold, honest, goofy, can cross some lines you'd never cross in public. The "internet" in this case is just a protocol or transport, not the World Wide Web as it was meant to be.
The funny thing is that something similar was actually proposed before the internet. A sort of micropayment-driven digital distributed super-library. Might not have turned out too bad.
I think you're spot on though. IT/software is too popular right now, making it a target for greed rather than just personal interest.
Gotta say - almost every one of these responses is looking at the web from a purely front end perspective.
The web as a platform - REST - is a vital part of the internet. If you're proposing to change the web to fit only with your vision of the front end, recognise that you'll be losing a lot of its power and expressiveness for the backend, which was a part of what made it so successful in the first place.
Dear JavaScript developer: the web isn't just about you 🔥🔥🔥
I think it would be built from the ground up to be an application platform. We're getting to that place with WebGL, PWAs, and Web Assembly, but if we were to start over those features would be built first. JavaScript would get rid of "the bad parts" and take a role as a true scripting language instead of an application language.
Oh dear no I hope not. One of the joys of the web is its transparency and discoverability. HTML is easy to write and read - and inspect on a website you're browsing. HTTP 1.1 is as well - you can easily learn how the web works by reading network requests.
The web as application platform, delivered as opaque blobs of WASM, is no longer the open web. If it had been written like that it would have failed.
In fact, it did fail. It was called Java applets and Flash.
Have you looked at any JavaScript application bundle? Everyone minifies, uglifies, and bundles for performance, and the result is unreadable. Without the source maps and dependency list, an inspection of modern JavaScript tells you nothing. Client-heavy applications are some of the most popular websites today, and some of the most used developer applications are Electron apps. You can argue that this is "Not The Right Way To Make Software" (tm), but the fact is the web as application platform has been very successful. So I stand by my original post. Knowing what the web becomes, I think it would be prudent to design it for efficiency as an application platform first, and a document sharing tool second.
I feel you're making my argument for me here.
How on earth is that an argument for the web to be a JavaScript application platform? An Electron app doesn't even need to be connected to the Internet, it doesn't need to be on the web at all. You could send me a copy on a ~floppy disk~ USB drive. If anything it's reinforced the idea that JavaScript 'web' apps have very little to do with the web at all.
I can and I do. But I haven't trademarked it. Yet.
The web has been very successful. The web as application platform has been very successful. JavaScript applications running in the browser, delivered over the web, with the web browser being used as a host-slash-interpreter-slash-GUI-library is... well, I'll just say it's ridiculous if you're just trying to make a recipe website, but pretty cool if you're building Google Docs.
Most people are not building Google Docs.
The web is not just websites. It's a series of protocols, standards and conventions that have been used as an application platform for decades now. Not applications in the sense that you're using it - JavaScript in the browser - but distributed systems communicating over HTTP using principles such as REST. The reason the web has been successful is that it covers this use case as well as 'document sharing' (you make it sound like Dropbox).
As I've said in another comment above: the web is not just about the front end.
I agree, I think WASM would be the "base language of the web", and JS would just happen to be one of the compatible languages, but on an equal playing field with Rust, Go, Kotlin, and all the other languages that compile to WASM. It would be more like desktop programming where all languages are equally valid, rather than having the one language of JS dominate the entire landscape.
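As a concrete (if minimal) illustration of that equal playing field: browsers and Node can already load a WebAssembly module regardless of which source language produced it. The bytes below are a tiny module exporting a single `add` function, hand-encoded by me for this sketch rather than taken from any real toolchain's output:

```javascript
// Minimal WebAssembly module exporting add(a, b) = a + b,
// hand-encoded so the sketch is self-contained.
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // "\0asm" magic, version 1
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type section: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                               // function section: func 0 uses type 0
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export section: "add" -> func 0
  0x0a, 0x09, 0x01, 0x07, 0x00,                         // code section: one body, no locals
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,                   // local.get 0; local.get 1; i32.add; end
]);

const instance = new WebAssembly.Instance(new WebAssembly.Module(bytes));
console.log(instance.exports.add(2, 3)); // -> 5
```

Whether those bytes came from Rust, Go, Kotlin, or AssemblyScript is invisible to the host, which is exactly the point being made above.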
HTML
Basically no HTML. XML-like structures are just too verbose. No more legacy event handling and no more distinction between attributes and properties.

JavaScript
No more global objects for APIs. No more `document`, or `fetch`, or `Date`, or `Promise`. We'd have to explicitly import from modules, like we'll have to do to use the brand-new KV Storage.

CSS
A solid way to encapsulate styles and expose parts of components to theming (see Shadow DOM, and `::part` and `::theme`). No more legacy WTFs (like using `vertical-align` for two completely different things). `<input>` would be split up into `<textbox>`, `<checkbox>`, etc., just as `<textarea>` and `<button>` became individual elements.

Additionally, the custom elements API might be a lot more fleshed out, so that the native elements could be defined as Custom Elements.
Amen.
It'd probably be worse, since I assume it'd be some mega-corporate committee that sets about the task of recreating it.
Look at HTML, CSS, and JS. All of these have had numerous opportunities to add clearer constructs and make things easier. Yet at every iteration they also add a bunch of nonsense and fail to address critical errors. This effect would be magnified if the same people were to recreate the entire web.
I actually don't see the problem with just deprecating old standards. Our world is changing so fast that any active service/content provider needs to make changes frequently anyway. If we wish to access old content, then we can use special legacy browsers. It's not like a Blu-ray player can read VHS tapes, so why should a browser be able to browse content from 20 years ago?
Evolutionary design is by far the best approach, but we do need to drop compatibility with old crap. It'll also help a lot for learning, by thinning out the garbage.
But no, redesigning would be a failure. I'm pretty much against up-front designs, since they tend to fail. Everything must be evolutionary to work.
I think you mean iterative design. AFAIK 'evolutionary' would involve having a population of solutions, that are selected based on fitness, mutated and combinated to create a new generation of solutions. Rinse & repeat. You might argue that is how human populations/companies work but it seems a bit of a stretch.
It would have all the new tech from today, but then that would be "legacy" in a decade.
I know this is related to the network side, not the WWW components, but it is everything in the end: make all IP addresses IPv6, so IPv4 had never existed (I'd tweak how IPv6 is implemented, but that is beside the point).

Markdown-based :-)