Originally published at jsdevjournal.com

19 Common JavaScript and Node.js Mistakes That Can Slow Down Your Code

Learn about 19 common JavaScript and Node.js mistakes that can slow down performance, like excessive DOM manipulation, poorly scoped variables, blocking the event loop, and more.

Speed. Performance. Responsiveness. These ideals reign supreme in the world of web development, especially JavaScript and Node.js. A slow or janky website is the hallmark of an amateur, while a slick, optimized experience delights users and sets professionals apart.

But the path to truly high-performance web apps is littered with pitfalls. Mistakes abound that can drag down your JavaScript without you even realizing it: tiny oversights that bloat your code and surreptitiously sap speed away bit by bit.

You vigilantly minify your code and leverage caching…yet your site still feels oddly sluggish at times. The UI stutters during scroll or when buttons are clicked. Pages take eons to load.

What’s going on?

Turns out, there are many common ways we inadvertently slow down our JavaScript. Anti-patterns that, over time, can hobble site performance.

The good news? These mistakes can be avoided.

Today we’re spotlighting the top 19 performance pitfalls that can secretly slow down JavaScript and Node.js apps. We’ll explore what causes each one through illustrative examples and share actionable solutions to optimize your code.

Identifying and eliminating these hazards is key to crafting buttery smooth web experiences that delight users. So let’s dive in!

1. Improper Variable Declarations and Scope

When first learning JavaScript, it’s tempting to declare all variables globally. However, this causes problems down the road. Let’s look at an example:

    // globals.js
    var color = 'blue';
    function printColor() {
      console.log(color); 
    }
    printColor(); // Prints 'blue'

This works fine, but imagine if we loaded another script:

    // script2.js
    var color = 'red';
    printColor(); // Prints 'red'!

Because color is global, script2.js overwrote it! To fix this, declare variables locally inside functions whenever possible:

    function printColor() {
      var color = 'blue'; // local variable

      console.log(color);
    }
    printColor(); // Prints 'blue'

Now changes in other scripts won’t affect printColor.

Declaring variables in global scope when unnecessary is an anti-pattern. Try to limit globals to configuration constants. For other variables, declare them locally in the smallest possible scope.
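
Modern let and const declarations are also block-scoped, which makes it easier to keep variables in the smallest possible scope. A minimal sketch (the constant and prefix names are just illustrative):

    const APP_THEME = 'dark'; // a configuration constant is a reasonable global

    function printColor() {
      let color = 'blue'; // scoped to this function only

      if (APP_THEME === 'dark') {
        const prefix = '[dark]'; // scoped to this if-block only
        console.log(prefix, color);
      }
    }

    printColor(); // Prints '[dark] blue'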

2. Inefficient DOM Manipulation

When updating DOM elements, batch changes instead of manipulating one node at a time. Consider this example:


    const ul = document.getElementById('list');
    for (let i = 0; i < 10; i++) {
      const li = document.createElement('li');
      li.textContent = i;

      ul.appendChild(li);
    }

This appends list items one-by-one. Better to build a string first then set .innerHTML:

    const ul = document.getElementById('list');
    let html = '';
    for (let i = 0; i < 10; i++) {
      html += `<li>${i}</li>`; 
    }
    ul.innerHTML = html;

Building a string minimizes reflows. We update the DOM once instead of 10 times.

For multiple updates, build up changes then apply at end. Or better yet, use DocumentFragment to batch appends.
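
For instance, a sketch of the same loop using a DocumentFragment:

    const ul = document.getElementById('list');
    const fragment = document.createDocumentFragment();

    for (let i = 0; i < 10; i++) {
      const li = document.createElement('li');
      li.textContent = i;
      fragment.appendChild(li); // nodes are built off-DOM, no reflow yet
    }

    ul.appendChild(fragment); // a single DOM insertion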

3. Excessive DOM Manipulation

Frequent DOM updates can crush performance. Consider a chat app that inserts messages into the page.

Bad:

    // New message received
    const msg = `<div>${messageText}</div>`;
    chatLog.insertAdjacentHTML('beforeend', msg);

This naively inserts on each message. Better to throttle updates:

Good:

    let pendingHTML = '';
    let flushScheduled = false;
    const throttleTime = 100; // ms

    // New message received
    pendingHTML += `<div>${messageText}</div>`;

    // Throttle DOM updates: schedule at most one flush per interval
    if (!flushScheduled) {
      flushScheduled = true;
      setTimeout(() => {
        chatLog.insertAdjacentHTML('beforeend', pendingHTML);
        pendingHTML = '';
        flushScheduled = false;
      }, throttleTime);
    }

Now we update at most every 100ms, keeping DOM operations low.

For highly dynamic UIs, consider virtual DOM libraries like React. These minimize DOM manipulation using a virtual representation.

4. Lack of Event Delegation

Attaching event listeners to many elements creates needless overhead. Consider a table with delete buttons on each row:

Bad:

    const rows = document.querySelectorAll('table tr');
    rows.forEach(row => {
      const deleteBtn = row.querySelector('.delete');  
      deleteBtn.addEventListener('click', handleDelete);
    });

This adds a listener to each delete button. Better to use event delegation:

Good:

    const table = document.querySelector('table');
    table.addEventListener('click', e => {
      if (e.target.classList.contains('delete')) {
        handleDelete(e);
      }
    });

Now there’s a single listener on the <table>. Less memory overhead.

Event delegation utilizes event bubbling. One listener can handle events from multiple descendants. Use delegation whenever applicable.

5. Inefficient String Concatenation

When concatenating strings in a loop, performance suffers. Consider this code:

    let html = '';
    for (let i = 0; i < 10; i++) {
      html += '<div>' + i + '</div>';
    }

Creating new strings requires memory allocation. Better to use an array:

    const parts = [];
    for (let i = 0; i < 10; i++) {
      parts.push('<div>', i, '</div>');
    }
    const html = parts.join('');

Building an array minimizes intermediate strings. .join() concatenates once at the end.

For multiple string additions, use array joining. Also consider template literals for embedded values.
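
For instance, mapping values into template literals and joining once at the end (the item list is illustrative):

    const items = ['apple', 'banana', 'cherry'];

    // One template literal per item, a single concatenation at the end
    const html = items.map(item => `<div>${item}</div>`).join('');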

6. Unoptimized Loops

Loops often cause performance problems in JavaScript. A common mistake is repeatedly accessing array length:

Bad:

    const items = [/*...*/];
    for (let i = 0; i < items.length; i++) {
      // ...
    }

Redundantly checking .length inhibits optimizations. Better:

Good:

    const items = [/*...*/];  
    const len = items.length;
    for (let i = 0; i < len; i++) {
      // ...
    }

Caching length improves speed. Other optimizations include hoisting invariants out of loops, simplifying termination conditions, and avoiding expensive operations inside iterations.
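
For example, a sketch of hoisting an invariant out of a loop (getTaxRate, region, prices, and totals are hypothetical names):

    // Bad: recomputes the same value on every iteration
    for (let i = 0; i < len; i++) {
      totals[i] = prices[i] * (1 + getTaxRate(region));
    }

    // Good: hoist the invariant out of the loop
    const taxRate = getTaxRate(region);
    for (let i = 0; i < len; i++) {
      totals[i] = prices[i] * (1 + taxRate);
    }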

7. Unnecessary Synchronous Operations

JavaScript’s async capabilities are a key advantage. But beware of blocking I/O! For example:

Bad:

    const data = fs.readFileSync('file.json'); // blocks!

This stalls execution while reading from disk. Instead use callbacks or promises:

Good:

    fs.readFile('file.json', (err, data) => {
      // ...
    });

Now the event loop continues while the file is read. For complex flows, async/await simplifies asynchronous logic. Avoid synchronous operations to prevent blocking.
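
For example, the same read using async/await with fs.promises (the loadConfig wrapper is just illustrative):

    const fs = require('fs');

    async function loadConfig() {
      try {
        const data = await fs.promises.readFile('file.json', 'utf8');
        return JSON.parse(data);
      } catch (err) {
        console.error('Failed to read file.json:', err);
        throw err;
      }
    }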

8. Blocking the Event Loop

JavaScript uses a single-threaded event loop. Blocking it stalls execution. Some common blockers:

  • Heavy computational tasks

  • Synchronous I/O

  • Unoptimized algorithms

For example:

    function countPrimes(max) {
      // Unoptimized loop
      for (let i = 0; i <= max; i++) {
        // ...check if prime...
      }
    }
    countPrimes(1000000); // Long running!

This executes synchronously, blocking other events. To avoid:

  • Defer unnecessary work

  • Batch data processing

  • Use Worker threads

  • Look for optimization opportunities

Keep the event loop running smoothly. Profile periodically to catch blocking code.
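
As a sketch, here’s one way to push the prime counting onto a worker thread so the event loop stays free (Node.js worker_threads; the single-file layout is just for illustration):

    // count-primes.js
    const { Worker, isMainThread, parentPort, workerData } = require('worker_threads');

    function countPrimes(max) {
      let count = 0;
      for (let n = 2; n <= max; n++) {
        let isPrime = true;
        for (let d = 2; d * d <= n; d++) {
          if (n % d === 0) { isPrime = false; break; }
        }
        if (isPrime) count++;
      }
      return count;
    }

    if (isMainThread) {
      // Main thread: spawn the worker and stay responsive while it computes
      const worker = new Worker(__filename, { workerData: 1000000 });
      worker.on('message', count => console.log(`Primes found: ${count}`));
      worker.on('error', err => console.error(err));
    } else {
      // Worker thread: do the heavy computation and report the result back
      parentPort.postMessage(countPrimes(workerData));
    }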

9. Inefficient Error Handling

It’s vital to handle errors properly in JavaScript. But beware performance pitfalls!

Bad:

    try {
      // ...
    } catch (err) {
      console.error(err); // just logging
    }

This catches errors but takes no corrective action. Errors that are merely logged and otherwise ignored can leave the app in a broken state, leading to memory leaks or data corruption.

Better:

    try {
      // ...
    } catch (err) {
      console.error(err);

      // Emit error event 
      emitError(err); 

      // Nullify variables
      obj = null;

      // Inform user
      showErrorNotice();
    }

Logging isn’t enough! Clean up artifacts, notify users, and consider recovery options. Use tools like Sentry to monitor errors in production. Handle all errors explicitly.

10. Memory Leaks

Memory leaks happen when memory is allocated but never released. Over time, leaks accumulate and degrade performance.

Common sources in JavaScript include:

  • Uncleaned up event listeners

  • Outdated references to deleted DOM nodes

  • Cached data that’s no longer needed

  • Accumulating state in closures

For example:

    function processData() {
      const data = [];
    // Use closure to accumulate data
      return function() {
        data.push(getData()); 
      }
    }
    const processor = processData();
    // Long running...keeps holding reference to growing data array!

The array keeps getting larger but is never cleared. To fix:

  • Use weak references

  • Clean up event listeners

  • Delete no-longer-needed references

  • Limit closured state size
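
For instance, a rough sketch of the last two fixes applied to the example above (the maxEntries cap and clear helper are illustrative):

    function processData(maxEntries = 1000) {
      let data = [];

      function process() {
        data.push(getData());
        // Cap the closured state so it can't grow without bound
        if (data.length > maxEntries) {
          data = data.slice(-maxEntries);
        }
      }

      // Let callers release the memory when they're done
      process.clear = () => { data = []; };
      return process;
    }

    const processor = processData();
    // ...later, once the data is no longer needed:
    processor.clear();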

Monitor memory usage and watch for growing trends. Proactively eliminate leaks before they pile up.

11. Overuse of Dependencies

While npm offers endless options, resist the urge to over-import! Each dependency increases bundle size and attack surface.

Bad:

    import _ from 'lodash';
    import moment from 'moment'; 
    import validator from 'validator';
    // etc...

This imports entire libraries for minor utilities. Better to cherry-pick helpers as needed:

Good:

    import cloneDeep from 'lodash/cloneDeep';
    import { format } from 'date-fns';
    import { isEmail } from 'validator';

Only import what you need. Review dependencies regularly to prune unused ones. Keep bundles lean and minimize dependencies.

12. Inadequate Caching

Caching allows skipping expensive computations by reusing prior results. But it’s often overlooked.

Bad:

    function generateReport() {
      // Perform expensive processing
      // to generate report data... 
    }

    generateReport(); // Computes
    generateReport(); // Computes again!

Since inputs haven’t changed, the report could be cached:

Good:

    let cachedReport;

    function generateReport() {
      if (cachedReport) {
        return cachedReport;
      }
      cachedReport = // expensive processing...
      return cachedReport; 
    }

Now repeated calls are fast. Other caching strategies:

  • Memory caches like Redis

  • HTTP caching headers

  • LocalStorage for client caching

  • CDNs for asset caching

Identify cache opportunities — they often provide big speedups!
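
As a client-side sketch, a tiny localStorage cache with a time-to-live (the key handling and TTL are illustrative):

    const TTL_MS = 5 * 60 * 1000; // keep entries for 5 minutes

    function getCached(key, compute) {
      const raw = localStorage.getItem(key);
      if (raw) {
        const { value, expires } = JSON.parse(raw);
        if (Date.now() < expires) {
          return value; // fresh cache hit
        }
      }
      const value = compute(); // cache miss: recompute and store
      localStorage.setItem(key, JSON.stringify({ value, expires: Date.now() + TTL_MS }));
      return value;
    }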

13. Unoptimized Database Queries

When interfacing with databases, inefficient queries can bog down performance. Some issues to avoid:

Bad:

    // No indexing
    db.find({name: 'John', age: 35}); 

    // Unnecessary fields
    db.find({first: 'John', last:'Doe', email:'john@doe.com'}, {first: 1, last: 1});
    // Too many separate queries
    for (let id of ids) {
      const user = db.find({id});
    }

This fails to utilize indexes, retrieves unused fields, and executes excessive queries.

Good:

    // Use index on 'name' 
    db.find({name: 'John'}).hint({name: 1});

    // Only get 'email' field
    db.find({first: 'John'}, {email: 1}); 
    // Get users in one query
    const users = db.find({
      id: {$in: ids} 
    });

Analyze explain plans. Create indexes strategically. Avoid multiple piecemeal queries. Optimize datastore interactions.
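
For example, with a MongoDB-style driver you might create the index up front and verify the query plan (collection and field names are illustrative):

    // Create an index so name lookups don't scan the whole collection
    await db.collection('users').createIndex({ name: 1 });

    // Inspect the query plan to confirm the index is actually used
    const plan = await db.collection('users')
      .find({ name: 'John' })
      .explain('executionStats');

    console.log(plan.executionStats.totalDocsExamined);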

14. Improper Error Handling in Promises

Promises simplify asynchronous code. But unhandled rejections are silent failures!

Bad:

    function getUser() {
      return fetch('/user')
        .then(r => r.json()); 
    }

    getUser();

If fetch rejects, the error goes unnoticed. Instead:

Good:

    function getUser() {
      return fetch('/user')
        .then(r => r.json())
        .catch(err => console.error(err));
    }

    getUser();

Chaining .catch() handles errors properly. Other tips:

  • Avoid promise nesting hell

  • Handle rejections at top level

  • Configure unhandled rejection tracking

Don’t ignore promise errors!
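
In Node.js, for example, you can surface any rejection that slips through at the process level (browsers expose a similar unhandledrejection event):

    // Log (or report) any promise rejection that nothing handled
    process.on('unhandledRejection', (reason) => {
      console.error('Unhandled rejection:', reason);
      // optionally forward to your error-tracking service here
    });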

15. Synchronous Network Operations

Network requests should be asynchronous. But sometimes synchronous variants sneak in (the getSync call below stands in for any blocking request, such as synchronous XMLHttpRequest in the browser):

Bad:

    const data = http.getSync('http://example.com/data'); // blocks!

This stalls the event loop during the request. Instead use callbacks:

Good:

    http.get('http://example.com/data', res => {
      // ...
    });

Or promises:

    fetch('http://example.com/data')
      .then(res => res.json())
      .then(data => {
        // ...
      });

Async network requests allow other processing while waiting for responses. Avoid synchronous network calls.

16. Inefficient File I/O Operations

Reading/writing files synchronously blocks. For example:

Bad:

    const contents = fs.readFileSync('file.txt'); // blocks!

This stalls execution during disk I/O. Instead:

Good:

    fs.readFile('file.txt', (err, contents) => {
      // ...
    });
    // or promises
    fs.promises.readFile('file.txt')
       .then(contents => {
         // ...  
       });

This allows the event loop to continue during the file read.

For multiple files, use streams:

    function processFiles(files) {
      for (let file of files) {
        fs.createReadStream(file)
          .pipe(/*...*/);
      }
    }

Avoid synchronous file operations. Use callbacks, promises, and streams.

17. Ignoring Performance Profiling and Optimization

It’s easy to overlook performance until there are obvious issues. But optimization should be ongoing! Measure first with profiling tools:

  • Browser devtools timeline

  • Node.js profiler

  • Third-party profilers

This reveals optimization opportunities even if performance seems fine:

    // profile.js
    function processOrders(orders) {
      orders.forEach(o => {
        // ...
      });
    }
    processOrders(allOrders);

The profiler shows processOrders taking 200ms. We investigate and find:

  • Unoptimized loop

  • Expensive inner operation

  • Unnecessary work

We optimize iteratively. The final version takes 5ms!

Profiling guides optimization. Establish performance budgets and fail if exceeded. Measure often, optimize judiciously.
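
Even without a full profiler, a quick timing check can flag hotspots and enforce a budget (the 50ms budget is illustrative):

    const { performance } = require('perf_hooks');

    const start = performance.now();
    processOrders(allOrders);
    const elapsed = performance.now() - start;

    console.log(`processOrders took ${elapsed.toFixed(1)}ms`);
    if (elapsed > 50) {
      console.warn('processOrders exceeded its 50ms performance budget');
    }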

18. Not Utilizing Caching Mechanisms

Caching improves speed by avoiding duplicate work. But it’s often forgotten.

Bad:

    // Compute expensive report
    function generateReport() {
      // ...heavy processing...
    }
    generateReport(); // Computes
    generateReport(); // Computes again!

The same inputs always produce the same output. We should cache:

Good:

    // Cache report contents
    const cache = {};

    function generateReport() {
      if (cache.report) {
        return cache.report;
      }
      const report = // ...compute...
      cache.report = report;
      return report;
    }

Now repeated calls are fast. Other caching strategies:

  • Memory caches like Redis

  • HTTP caching headers

  • LocalStorage for client caching

  • CDNs for asset caching

Identify cache opportunities — they often give big wins!

19. Unnecessary Code Duplication

Duplicated code harms maintainability and optimizability. Consider:

    function userStats(user) {
      const name = user.name;
      const email = user.email;

      // ...logic...
    }
    function orderStats(order) {
      const name = order.customerName;
      const email = order.customerEmail;
      // ...logic... 
    }

The field extraction is duplicated. We refactor it into a shared helper:

    function getCustomerInfo(data) {
      return {
        name: data.name ?? data.customerName,
        email: data.email ?? data.customerEmail
      };
    }
    function userStats(user) {
      const { name, email } = getCustomerInfo(user);

      // ...logic...
    }
    function orderStats(order) {
      const { name, email } = getCustomerInfo(order);
      // ...logic...
    }

Now it’s just defined once. Other fixes:

  • Extract utility functions

  • Build helper classes

  • Leverage modules for reusability

Deduplicate whenever you can. It improves code health and optimizability.

Conclusion

Optimizing JavaScript application performance is an iterative process. By learning efficient practices and being diligent about profiling, you can achieve dramatic speed improvements.

Key areas to focus on include minimizing DOM changes, leveraging asynchronous techniques, eliminating blocking operations, reducing dependencies, utilizing caching, and removing unneeded duplication.

With attention and experience, you can discover bottlenecks and home in on optimizations for your specific workload. The result is faster, leaner, and more responsive web apps your users will love.

So be relentless about optimizations — employ these tips and see your JavaScript fly!

If you like my article, please follow me on JSDevJournal
