The Million.js Manifesto

Aiden Bai
student @ camas high school tinkerin' away on the web
・3 min read

Past

Back when JavaScript first appeared, direct DOM manipulation was used to add interactivity to static websites. Eventually, the JavaScript scene moved on to libraries such as MooTools and jQuery, which wrapped common actions in simple, reusable functions. However, one pressing issue remained.

Even though web development became easier, scaling large web applications that required extensive interactivity was difficult, as operations were generally imperative and hard to organize. This led to the rise of several frameworks and libraries, notably React. React aimed to enable declarative development and componentization so that user interfaces could be composed scalably. To achieve this, the team behind React used a Virtual DOM architecture, in which computation and logic are deferred to a diffing algorithm that determines what changed. This means the developer only has to describe how the markup should look.

React helped to pioneer extensive usage of JavaScript to render web applications, leading to the rise of the Virtual DOM architecture in other libraries and usage of new technologies like JSX.

Present

Nowadays, the prevalence of web applications is undeniable. JavaScript and React are everywhere, and the Virtual DOM architecture is used in many of the popular libraries today (React, Vue, Preact, Elm, etc.). "Pure JavaScript rendering" has subsided as bundlers such as Webpack have become standard, opening the door to preprocessors for cross-browser compatibility (e.g. Babel), prerendering, and JAMstack technologies.

JavaScript frameworks and libraries have entire ecosystems around them, with tools like Next.js supercharging how web development with React is done. Library developers are slowly realizing the importance of the performance that the previous generation sacrificed for the sake of developer experience. This is seen in the trend toward static analysis with Svelte, which leverages the now-prevalent usage of bundlers in most web applications.

Performance is slowly becoming a priority, but technologies that depend on the Virtual DOM like React are fundamentally outdated, even with incremental improvements with lazy loading and ISR in Next.js.

Future

It is easy to recite, reinvent, and monkey-patch what already exists. With React and the Virtual DOM being such backbone technologies in the JavaScript library space, it is paramount that bleeding-edge libraries like Next.js leverage their position in the ecosystem to pioneer compiler-based optimizations. While the most straightforward solution is to ditch everything and use the browser as just a compile target for a series of imperative operations, the Virtual DOM arguably brings many advantages. It lets us write declarative interfaces without templating, meaning conditionals are limited only by JavaScript itself, among many other benefits.

The future of the Virtual DOM is not its destruction; rather, it is constraining usage of the Virtual DOM to where it is powerful while leveraging static analysis and compilation (this is what Million.js does).
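As a toy sketch of that idea (plain objects stand in for DOM elements, and the flag name is hypothetical, not Million.js's actual API), a compiler could attach static hints to virtual nodes so the patcher skips work it can prove unnecessary:

```javascript
// Toy virtual nodes: a compiler that sees <h1>{count}</h1> can mark the
// node TEXT_ONLY, so the patcher replaces the text directly instead of
// diffing a child tree. Flag names and node shapes here are hypothetical.
const TEXT_ONLY = 1;

function patch(el, oldVNode, newVNode) {
  if (newVNode.flag === TEXT_ONLY) {
    // Fast path: skip child diffing entirely.
    if (oldVNode.text !== newVNode.text) el.textContent = newVNode.text;
    return el;
  }
  // Slow path: a full diff would go here for dynamic shapes.
  throw new Error("full diff not implemented in this sketch");
}

// A minimal stand-in for a DOM element so the sketch runs anywhere.
const el = { textContent: "0" };
patch(el, { flag: TEXT_ONLY, text: "0" }, { flag: TEXT_ONLY, text: "1" });
console.log(el.textContent); // "1"
```

The point of the sketch: the diffing cost is paid at compile time (deciding the flag), not at runtime.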

→ Interested in how Million is different?

I recently had the opportunity to chat with @rauchg about Million.js as well as some of my thoughts on the current state of the Virtual DOM. He explained to me that optimization of the Virtual DOM is only one facet of the next phase of web frameworks, which also includes improving the development-mode experience and aggressive tree shaking to reduce bundle size. I highly recommend checking out his Twitter and blog, as he has a lot of valuable insight into the industry and how impact can and needs to be made.

Discussion (20)

Mike Talbot • Edited

I'm interested in understanding where you see the VDOM as a distinct entity existing in the world where we can compile a program that understands implicitly which attributes to change on which DOM elements.

My naive thoughts here were that a VDOM allowed you to "do what you like 'coz it's complex" and then work out what the implications of that are for the much slower real DOM, hence only updating the elements and attributes that must be changed. My understanding of compiler technology was that it would generate code that performs this minimal update by statically analysing what can happen in a code path and only including code for that. So, from this rather uneducated standpoint, it would appear that a properly functioning compiler would mean that a VDOM became unnecessary while retaining the declarative patterns we now often use. I can see that static analysis may not always be possible, I guess...
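That distinction can be sketched as follows (a toy model; plain objects stand in for DOM elements, and neither function is any library's real output):

```javascript
// VDOM approach: diff two prop objects at runtime to find what changed.
function diffProps(oldProps, newProps) {
  const changes = {};
  for (const key of Object.keys(newProps)) {
    if (oldProps[key] !== newProps[key]) changes[key] = newProps[key];
  }
  return changes;
}

// Compiled approach: static analysis proved only `name` can change,
// so the emitted code touches exactly one node with no diffing at all.
function compiledUpdate(el, name) {
  el.textContent = name;
}

// Plain objects stand in for DOM elements so this runs anywhere.
const el = { textContent: "Alice" };
console.log(diffProps({ name: "Alice" }, { name: "Bob" })); // { name: 'Bob' }
compiledUpdate(el, "Bob");
console.log(el.textContent); // "Bob"
```

Both end up applying the same minimal change; the difference is whether the "what changed" question is answered at runtime or at compile time.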

Alex Kit

This is actually what we are doing at maskjs. The flow is: Template ➝ AST ➝ DOM. When the builder creates DOM nodes, it notes all observable expressions and the render context, and when a new value later arrives, it changes only the affected part of the DOM.

h1 {
    '~[observe user.name]'
}

And when the name changes, e.g. user.name = "Foo", only the textContent of the h1 element is changed. Such an approach is indeed much faster. But if you have a huge template, a UserPage for example, and want to apply a new user model, which leads to dozens of such in-DOM changes, it can hurt performance. The same is true for a VDOM. For this scenario the container DOM element should be detached from the live DOM, modified, and then reattached. But in most cases it can be even faster to simply drop the old container element and re-render the template.

Aiden Bai Author

Really cool MVC project; it seems undervalued and needs more attention haha. I think you have a misunderstanding of how the vdom works: the vdom diffs changes, then applies pinpoint changes to the DOM. Whether it's faster to diff or to drop and re-render the template, I'm not sure, since all the jsperf links on your project are broken.

Aiden Bai Author

This is a great point. For libraries like Svelte that use their own syntax + compiler, compiling down to near imperative operations just makes sense. The issue with that is that you have to build patterns around that sort of idea - like building a system like React with static analysis would be a pain in the butt.

In terms of whether the far future will compile down to imperative operations: probably. It might also be coupled with scheduling, WASM, workers, lazy loading, etc. to maximize performance. But whether that is a realistic possibility now or in the near future for libraries that depend on Virtual DOM-y patterns is difficult to say, in my opinion.

The Virtual DOM, in terms of architecture, is more of an "ideology" in my opinion. Ideologies vary in effectiveness, and it's important to bring pragmatism to make it a "practical ideology."

Mike Talbot

Thanks for the clarification, I agree with your sentiment. A lot of the recent work in React has been around the scheduling of work, optimizing when something should be processed and make its updates to the VDOM (React Fiber, etc.). This, I guess, is the new "reactivity" of React. Is this beyond the remit of Million?

In React's case, a component's props changing initiates a reactive update scheduled at some point in the future, all the way down to the DOM element itself.

I know a bit about Fiber and this kind of thing, as it inspired my js-coroutines library, but I'm not sure if this concept is outside the remit you are looking to support and innovate in?

Aiden Bai Author

I've done a bit of scheduling implementation on Million here, but I haven't implemented any task prioritization techniques yet, and I do plan to. I'm still trying to figure out how the compiler can play a role in this.

I just checked out your article on DEV about js-coroutines, and it's really cool! I'm unsure whether Million needs a hard dependency on js-coroutines, just because they aren't quite the same, though they're in the same ballpark. I'd be happy to try to adopt some of its elements, or have you take a look to improve Million with your expertise.

Mike Talbot

Yeah you don't need js-coroutines itself, just some of the techniques. I'd be interested in thinking through it with you and seeing if I can help. js-coroutines is a load of helper functions for common operations wrapped around a very simple core.

// Runs a generator-based coroutine in requestIdleCallback slices,
// yielding control back to the browser when the frame budget runs out.
export async function run(
    coroutine,
    loopWhileMsRemains = 1,
    timeout = 16 * 10
) {
    const options = {timeout}
    let terminated = false
    let resolver = null
    const result = new Promise(function (resolve, reject) {
        resolver = resolve
        const iterator = coroutine()
        window.requestIdleCallback(step, options)

        function step(api) {
            if (terminated) {
                iterator.return()
                return
            }
            const minTime = Math.max(0.5, loopWhileMsRemains)
            try {
                do {
                    const {value, done} = iterator.next()
                    if (done) {
                        resolve(value)
                        return
                    }
                    // Yielding `true` forces a break to the next idle slice
                    if (value === true) {
                        break
                    }
                } while (api.timeRemaining() > minTime)
            } catch (e) {
                reject(e)
                return
            }

            window.requestIdleCallback(step, options)
        }
    })
    result.terminate = function (value) {
        terminated = true
        if (resolver) {
            // `resolver` holds the Promise's resolve function itself
            resolver(value)
        }
    }
    return result
}

The idea of deciding on the importance of a particular update and adding it to queues that either get immediate processing (user input) or run as more casual updates in priority order on idle seems like a plan.
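That plan can be sketched in a few lines (hypothetical, not Million's or React's actual scheduler; a real implementation would drain the idle queue inside requestIdleCallback rather than synchronously):

```javascript
// Sketch of priority-based update scheduling: urgent work (user input)
// runs immediately, while low-priority work waits in a queue that is
// drained during idle time.
const queues = { idle: [] };

function schedule(task, priority) {
  if (priority === "immediate") {
    task(); // user input: run right away
  } else {
    queues.idle.push(task); // casual updates: defer
  }
}

// In a browser this would run inside requestIdleCallback, checking
// timeRemaining() between tasks; here we drain synchronously so the
// sketch works anywhere.
function flushIdle() {
  while (queues.idle.length > 0) queues.idle.shift()();
}

const log = [];
schedule(() => log.push("input"), "immediate");
schedule(() => log.push("render"), "idle");
schedule(() => log.push("analytics"), "idle");
flushIdle();
console.log(log); // [ 'input', 'render', 'analytics' ]
```

The immediate task jumps the queue, while deferred tasks keep their FIFO order within the idle tier.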

Mike Talbot • Edited

BTW I did read your other article, but I didn't get a concrete feeling for the API constraints you refer to. Do we not have a constrained API in the DOM?

Aiden Bai Author

Thanks for reading my other articles! "Constraints" probably isn't the best word for the job, but I feel like it's the only one that fits right now. When you develop for the purpose of compiling down to imperative operations, your API pattern will bend toward that. It's difficult to see a well-optimized React API pattern compiling down to imperative operations.

Mike Talbot

Yes I see your point.

Samuel FAURE

Jesus, you post about your million.js library every other day on dev.to. At what point does this stop being informative and become plain commercial spam?

That must be the 8th post I've seen you write about your library in a short time. Please stop.

Aiden Bai Author • Edited

Is it wrong to post about your own project, as long as it provides value? My situation isn't unique - many library developers write series of articles about their own work to share it with others.

If you don't think an article provides value to you, unfollow the author and ignore their posts. That's how you get an author to stop posting about a topic.

Samuel FAURE

Your articles repeat the same points over and over again, and they all converge on the fact that your library is amazing and great and isn't the virtual DOM the best thing ever. Geez, we get it already.

I wish more people would write good articles to help others out of altruism, instead of rehashing the same post twelve times for free advertisement and selfish interests.

This is exactly the problem with Dev.to, a pure tragedy of the commons. Everyone here wants to promote their own work while providing minimum value. It's a sad state of affairs.

All the people using this website as a free billboard are greatly diminishing its quality, which in the long run will only destroy it completely.

But at least you got some stars on your github project.

Aiden Bai Author

I think you are generalizing my articles into a straw man and dismissing them as overused or repetitive. Personally, as a high school student, I'm really passionate about the virtual DOM space, and dev.to is my way of pressure testing my own opinions alongside my library. Others like me don't have easy access to this space, and this is one of the ways to break into it, learn, and try to contribute and provide value to others.

I believe I read your article on bullshit articles and lack of quality (e.g. 10 THINGS YOU DIDN'T KNOW ABOUT this), and I do agree to an extent. But do I care? No, I move on and look for higher-quality articles, and the same goes for any other forum. Additionally, these articles genuinely provide value to some beginners, and ignoring that is just having an elitist attitude.

Personally, I write articles not out of altruism, but out of mutualistic gain: the reader obtains some value, and I obtain some value. I applaud your efforts in advocating for better quality, but at the end of the day, quality is subjective to the end reader, and trashing on students trying to provide value in this space is cringe at best.

Aiden Bai Author • Edited

Anyway, this post was made in bad faith, and I ask you to cease discussion here. Feel free to discuss with people who are interested in your opinion outside of this article.

Alex Lohr

The primary reason for abstractions over DOM methods like jQuery and MooTools was the inconsistency of browser behavior, especially during the browser wars. IE, Netscape, Mozilla, Opera, Safari: all of them had their quirks and issues. Later, the target of these toolkits shifted to a better developer experience.

Then Google released Chrome with V8, delivering hitherto unknown performance to the web, which led to a paradigm shift toward web apps. Not websites simulating actual apps, but application logic in the front end. With this shift, a lot of patterns previously reserved for the back end moved to the front, bringing the first attempts at MVC patterns in the browser.

These patterns were refined and adapted until we reached the current situation, where concepts like the virtual DOM, data binding, state management, components, JSX and so on can be used freely, either as part of a framework or as a mix and match of specialized tools.

Million.js is such a specialized tool for a virtual DOM. There are others, like inferno and frameworks that forgo the virtual DOM approach completely.

Aiden Bai Author

100% agree with your first couple points. I believe you have a fundamental misunderstanding of Inferno; they're currently one of the fastest Virtual DOM frameworks, and they don't forgo the Virtual DOM at all 😂.

You probably mean libraries like Solid or Svelte

Alex Lohr

No, you misunderstood me: other vDOM libraries like inferno and those that forgo the vDOM like Solid or Svelte. Though I concede it was easy to overlook this detail.

Aiden Bai Author

I see now, thanks for the clarification

Will

as someone who has no connections to aiden or million this project is the best project to ever exist