I was scrolling through a developer subreddit the other day and stumbled upon a goldmine: a thread about the most underrated JavaScript features. It wasn't about the flashy new frameworks, but the quiet, built-in tools that do the heavy lifting without any fanfare.
These are the features that make you look at your old code (or that giant utils.js file) and think, "I could have just used that?"
So I've compiled the seven best ones. These aren't just clever tricks; they're battle-tested solutions that can genuinely make your code cleaner, faster, and easier to read. But before we dive in, a quick word on setup. To use some of these modern goodies, especially in a Node.js environment, you need to be on a recent version. Juggling different Node versions for different projects can be a real pain. If you've ever found yourself fighting with nvm or brew, you know what I mean. A tool like ServBay can be a lifesaver here, letting you install and switch between Node.js versions with a single click. But more on that later.
Let's get to the code.
Set: The Ultimate Tool for Unique Items and Fast Lookups
When you hear "remove duplicates from an array," your brain might jump to filter + indexOf. It works, but it's the scenic route—slow and inefficient, with a time complexity of O(n²).
Set is the express train. It’s a data structure that, by its very nature, only holds unique values. And checking if an item exists is lightning-fast (O(1)).
Example: De-duping a list of user emails
// A list of emails from a messy database import
const emails = ['dev@example.com', 'ceo@example.com', 'dev@example.com', 'support@example.com'];
// One line to get the unique emails
const uniqueEmails = [...new Set(emails)];
console.log(uniqueEmails);
// -> ['dev@example.com', 'ceo@example.com', 'support@example.com']
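Set also shines at the "fast lookups" half of its job. If you need to check membership repeatedly, say against a blocklist, build the Set once and call has() instead of Array.includes(). A minimal sketch (the blockedEmails list is made up for illustration):
// Build the blocklist once
const blockedEmails = new Set(['spam@example.com', 'bot@example.com']);
// Each has() check is O(1), instead of O(n) with Array.includes()
const allowedEmails = uniqueEmails.filter(email => !blockedEmails.has(email));
console.log(allowedEmails);
// -> ['dev@example.com', 'ceo@example.com', 'support@example.com']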
Object.entries() & Object.fromEntries(): Your New Best Friends for Object Manipulation
Remember the old for...in loop? You had to check hasOwnProperty every time to avoid iterating over prototype properties. It felt like walking on eggshells.
This pair of methods makes working with objects as easy as working with arrays. Object.entries() turns an object into [[key, value], [key, value]], and Object.fromEntries() does the exact reverse. It's perfect for filtering or mapping objects.
Example: Filtering out null or undefined values from an API response
// A product object from an API, with some missing data
const product = {
id: 123,
name: 'Super Cool Gadget',
price: 99.99,
description: 'The best gadget ever.',
inStock: null, // We want to remove this
rating: undefined // And this
};
// 1. Convert to an array, 2. Filter, 3. Convert back to an object
const cleanProduct = Object.fromEntries(
Object.entries(product).filter(([key, value]) => value != null)
);
console.log(cleanProduct);
// -> { id: 123, name: 'Super Cool Gadget', price: 99.99, description: 'The best gadget ever.' }
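The same entries-then-fromEntries pattern covers mapping, too. Here's a quick sketch that applies a 10% discount to a hypothetical price list:
// A hypothetical price list, keyed by product id
const prices = { gadget: 100, widget: 250, gizmo: 80 };
// 1. Convert to entries, 2. Map each value, 3. Convert back to an object
const discounted = Object.fromEntries(
  Object.entries(prices).map(([key, value]) => [key, value * 0.9])
);
console.log(discounted);
// -> { gadget: 90, widget: 225, gizmo: 72 }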
?? and ??=: The Smarter Default Operators
For years, we used the || (OR) operator for default values. The problem is that || treats any "falsy" value (0, '', false) as something to be replaced. This can lead to bugs.
The nullish coalescing operator (??) is much more precise. It only triggers for null or undefined.
Example: Handling user input where 0 is a valid value
const userInputQuantity = 0;
// The old, buggy way: 0 is falsy, so it defaults to 1
const quantity_buggy = userInputQuantity || 1;
// The new, correct way: 0 is not nullish, so it's kept
const quantity_correct = userInputQuantity ?? 1;
console.log(`Buggy: ${quantity_buggy}`); // -> Buggy: 1
console.log(`Correct: ${quantity_correct}`); // -> Correct: 0
The logical nullish assignment operator (??=) is even slicker. It only assigns a value if the variable is currently null or undefined.
Example: Setting default configuration options
// User-provided config might be missing some values
const userConfig = {
theme: 'dark'
};
// Apply defaults without overwriting existing settings
userConfig.theme ??= 'light'; // Does nothing, 'dark' is already set
userConfig.timeout ??= 5000; // timeout is nullish, so it gets set to 5000
console.log(userConfig);
// -> { theme: 'dark', timeout: 5000 }
Intersection Observer: Performant Scrolling Without the Jank
The old way to implement lazy loading or infinite scroll was to listen to the scroll event and call getBoundingClientRect() on every watched element, on every scroll. This is a performance nightmare because it forces the browser to constantly recalculate layout, leading to a janky, stuttering experience.
Intersection Observer is the modern, non-blocking way to do it. You tell it to watch an element, and it asynchronously notifies you when that element enters or leaves the viewport.
Example: Simple image lazy-loading
<!-- The real image URL is in data-src -->
<img class="lazy-image" src="placeholder-spinner.gif" data-src="real-image.jpg">
const lazyImages = document.querySelectorAll('.lazy-image');
const observer = new IntersectionObserver((entries, observer) => {
entries.forEach(entry => {
// If the image is in the viewport
if (entry.isIntersecting) {
const img = entry.target;
img.src = img.dataset.src; // Swap placeholder with real image
img.classList.remove('lazy-image');
observer.unobserve(img); // Stop watching this image
}
});
});
lazyImages.forEach(img => observer.observe(img));
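The same API handles infinite scroll: watch a sentinel element at the bottom of your list and load more when it approaches the viewport. The rootMargin option even lets you start loading early. A rough sketch, where #scroll-sentinel and loadMoreItems() are hypothetical:
const sentinel = document.querySelector('#scroll-sentinel');
const scrollObserver = new IntersectionObserver((entries) => {
  if (entries[0].isIntersecting) {
    loadMoreItems(); // Hypothetical: append the next page of results
  }
}, {
  rootMargin: '200px' // Fire 200px before the sentinel actually becomes visible
});
scrollObserver.observe(sentinel);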
Promise.allSettled(): When You Need Every Promise to Finish, Win or Lose
Promise.all is great, but it's an all-or-nothing deal. If one promise in the batch rejects, the whole thing fails, and you lose the results from the successful ones.
Promise.allSettled() is more resilient. It waits for all promises to complete, regardless of whether they succeed or fail. It then gives you a neat report of each outcome.
Example: Fetching data from multiple APIs where some might fail
const apiEndpoints = [
fetch('https://api.github.com/users/github'), // This will work
fetch('https://api.example.com/non-existent'), // This will fail
fetch('https://api.github.com/users/vercel') // This will work
];
Promise.allSettled(apiEndpoints)
.then(results => {
results.forEach((result, index) => {
if (result.status === 'fulfilled') {
console.log(`Endpoint ${index} succeeded with status:`, result.value.status);
} else {
console.error(`Endpoint ${index} failed with reason:`, result.reason.message);
}
});
});
// -> Endpoint 0 succeeded with status: 200
// -> Endpoint 1 failed with reason: Failed to fetch
// -> Endpoint 2 succeeded with status: 200
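From there it's easy to split the report into usable results and errors; a small sketch reusing the apiEndpoints array from above:
Promise.allSettled(apiEndpoints).then(results => {
  // Keep the successful responses, collect the failures for logging
  const succeeded = results.filter(r => r.status === 'fulfilled').map(r => r.value);
  const failed = results.filter(r => r.status === 'rejected').map(r => r.reason);
  console.log(`${succeeded.length} succeeded, ${failed.length} failed`);
});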
URL and URLSearchParams: Stop Wrestling with URL Strings
Parsing query parameters from a URL with regex is a rite of passage for many developers, but it's a painful one. The built-in URL and URLSearchParams objects make this trivial.
Example: Reading and manipulating URL query parameters
const urlString = 'https://mysite.com/products?category=electronics&page=2';
const url = new URL(urlString);
// Read params
const category = url.searchParams.get('category'); // "electronics"
console.log(`Category is: ${category}`);
// Add a new param
url.searchParams.append('sort', 'price_desc');
// Modify an existing param
url.searchParams.set('page', '3');
// The new URL string
console.log(url.href);
// -> https://mysite.com/products?category=electronics&page=3&sort=price_desc
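URLSearchParams also works on its own, which is handy for building a query string from a plain object (the filters object here is just for illustration):
// Build a query string from a plain object
const filters = { category: 'electronics', minPrice: '50', sort: 'rating' };
const params = new URLSearchParams(filters);
console.log(params.toString());
// -> category=electronics&minPrice=50&sort=rating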
Top-Level await: Say Goodbye to async IIFEs
In the past, if you wanted to use await at the top level of an ES module (e.g., to fetch configuration before the rest of the module runs), you had to wrap it in an async Immediately Invoked Function Expression (async () => { ... })();. It was clumsy.
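For comparison, the old workaround looked roughly like this:
// The pre-top-level-await pattern: wrap everything in an async IIFE
(async () => {
  const response = await fetch('/api/app-settings');
  const config = await response.json();
  console.log(config);
})();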
Top-level await lets you use await directly in your modules. Any other module that imports it will simply wait for the promise to resolve before executing.
Example: Asynchronously loading a configuration file
// in config.js
const response = await fetch('/api/app-settings');
export const AppConfig = await response.json();
// in main.js
import { AppConfig } from './config.js';
// This code only runs after the config has been fetched and parsed.
console.log(`API URL is: ${AppConfig.apiUrl}`);
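Top-level await also plays nicely with try/catch, so a failed request can fall back to defaults instead of breaking every module that imports the config. A minimal sketch of that variant, with a made-up defaults object:
// in config.js (variant with a fallback)
let settings = { apiUrl: '/api/v1' }; // Hypothetical defaults
try {
  const response = await fetch('/api/app-settings');
  settings = await response.json();
} catch (err) {
  console.warn('Falling back to default settings:', err.message);
}
export const AppConfig = settings;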
A Quick Detour: Managing Node.js Versions for Modern Features
This last one, top-level await, is a perfect example of a feature that depends on your environment. It's supported in modern browsers and in Node.js v14.8+.
This is where developers often hit a wall. You're excited to use a modern feature, but your client's project is stuck on Node.js 12. Your new side project needs Node.js 20. Soon, you're in version management hell, wrestling with nvm, n, or your system's package manager. It's a distraction from what you actually want to do: write code.
This is precisely the kind of headache a tool like ServBay is designed to eliminate. It's a local development environment that treats Node.js (along with PHP, MariaDB, etc.) as a first-class citizen.
- One-Click Installation: You can install multiple versions of Node.js (e.g., 14, 16, 18, 20) with a simple click. No more cryptic command-line errors.
- Effortless Switching: Need to switch a project from Node 16 to Node 20? It's literally a dropdown menu.
- Co-existence: Different projects can run on different Node.js versions simultaneously without conflict.
It lets you focus on using these awesome JS features without wasting an hour debugging your local setup.
Wrapping Up
JavaScript has come a long way. Many of the problems we once relied on heavy libraries like Lodash or Moment.js to solve can now be handled by the language itself.
As older browsers fade into irrelevance, these native APIs are becoming the standard way of doing things. So next time you reach for an npm install, take a second to ask: "Can JavaScript do this for me already?" The answer might surprise you.

