Modern JavaScript Features You Should Use

JavaScript has come a long way since the days of var and function declarations. Modern syntax not only makes code shorter, but also safer and easier to reason about. In this article we’ll walk through the most useful features that have become standard in ES6 and beyond, and see how they can clean up real‑world codebases. Grab a coffee, open your editor, and let’s modernize your JavaScript together.

Variable Declarations: let, const, and Block Scope

One of the first upgrades you’ll notice is the replacement of var with let and const. let provides block‑level scoping, preventing accidental leaks into the outer scope. const goes a step further by guaranteeing that the binding itself never changes, which is perfect for constants and imported modules.
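
Here’s a minimal sketch of that scoping difference in isolation (the variable names are purely illustrative):

{
    var leaked = 'I escape the block';
    let contained = 'I stay inside';
}

console.log(leaked);       // 'I escape the block'
// console.log(contained); // ReferenceError: contained is not defined

const limit = 10;
// limit = 20;             // TypeError: Assignment to constant variable.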

Consider a simple loop that collects even numbers. Using var can lead to subtle bugs when you later reference the loop variable inside a callback.

const numbers = [1, 2, 3, 4, 5, 6];
const evens = [];

for (let i = 0; i < numbers.length; i++) {
    if (numbers[i] % 2 === 0) {
        evens.push(numbers[i]);
    }
}
console.log(evens); // [2, 4, 6]

Notice how i is declared with let. If we had used var, any asynchronous operation inside the loop would have captured the final value of i, leading to unexpected results.
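
A minimal sketch of that pitfall, using setTimeout as the asynchronous operation (the delays and values are illustrative):

for (var j = 0; j < 3; j++) {
    setTimeout(() => console.log('var:', j), 0); // logs 3, 3, 3
}

for (let k = 0; k < 3; k++) {
    setTimeout(() => console.log('let:', k), 0); // logs 0, 1, 2
}

With var there is a single j shared by every callback, while let creates a fresh binding for each iteration.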

Pro tip: Treat const as the default. Switch to let only when you know the variable will be reassigned.

Arrow Functions and Implicit Returns

Arrow functions give you a concise syntax for writing function expressions, and they lexically bind this. This eliminates the classic “var that = this” workaround. Arrow functions also support implicit returns when the body is a single expression, which can dramatically shrink utility code.

Let’s refactor a typical array transformation using an arrow function.

const users = [
    { name: 'Alice', age: 28 },
    { name: 'Bob', age: 34 },
    { name: 'Carol', age: 22 }
];

// Traditional function
const namesTraditional = users.map(function(user) {
    return user.name;
});

// Arrow function with implicit return
const namesArrow = users.map(user => user.name);

console.log(namesArrow); // ['Alice', 'Bob', 'Carol']

The arrow version removes the boilerplate function keyword and the explicit return, making the intent crystal clear.

Preserving this in Callbacks

When you need to access the surrounding object inside a callback, arrow functions shine. Imagine a simple timer class that logs elapsed seconds.

class Timer {
    constructor() {
        this.seconds = 0;
    }

    start() {
        setInterval(() => {
            this.seconds++;
            console.log(`Elapsed: ${this.seconds}s`);
        }, 1000);
    }
}

const t = new Timer();
t.start(); // Logs every second without losing the context of `this`

Because the arrow function inherits this from the enclosing Timer instance, the counter updates correctly.

Template Literals and Tagged Templates

Template literals, delimited by backticks (`), let you embed expressions directly into strings. They also support multi‑line strings without the need for concatenation or escape characters.

const user = { name: 'Dana', role: 'admin' };
const greeting = `
    Hello, ${user.name}!
    You have ${user.role === 'admin' ? 'full' : 'limited'} access.
`;
console.log(greeting.trim());

Beyond interpolation, tagged templates let you preprocess template strings. A common use case is sanitizing HTML to prevent XSS attacks.

function safeHTML(strings, ...values) {
    const escaped = values.map(v => String(v)
        .replace(/&/g, '&amp;')
        .replace(/</g, '&lt;')
        .replace(/>/g, '&gt;')
    );
    return strings.reduce((result, str, i) => result + str + (escaped[i] || ''), '');
}

const userInput = '<script>alert("hacked")</script>';
const output = safeHTML`User comment: ${userInput}`;
console.log(output); // User comment: &lt;script&gt;alert("hacked")&lt;/script&gt;

Pro tip: Use tagged templates for any situation where you need to transform or validate interpolated data before rendering.

Destructuring, Spread, and Rest

Destructuring lets you pull values out of arrays or objects in a single, readable line. Combined with the spread (...) and rest operators, you can clone, merge, and split data structures without mutating the original.
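
Arrays destructure just as easily, and the rest pattern collects whatever is left over; here’s a quick sketch with arbitrary values:

const [first, second, ...others] = [10, 20, 30, 40];
console.log(first, second, others); // 10 20 [30, 40]

const { name: userName, ...details } = { name: 'Eve', age: 31, active: true };
console.log(userName, details); // 'Eve' { age: 31, active: true }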

Here’s a practical example: extracting configuration options while providing defaults.

function initChart({ type = 'line', width = 800, height = 600, ...options }) {
    console.log(`Creating a ${type} chart (${width}x${height})`);
    console.log('Additional options:', options);
}

const userConfig = {
    type: 'bar',
    height: 400,
    color: 'steelblue',
    legend: true
};

initChart(userConfig);
// Output:
// Creating a bar chart (800x400)
// Additional options: { color: 'steelblue', legend: true }

The ...options rest pattern captures any extra properties, allowing the function to stay flexible as new options are added.

Cloning and Merging Objects

Instead of using Object.assign, the spread operator offers a more declarative syntax for shallow copies and merges.

const defaults = { timeout: 3000, cache: true };
const overrides = { cache: false, retries: 2 };

const settings = { ...defaults, ...overrides };
console.log(settings); // { timeout: 3000, cache: false, retries: 2 }

Because later spreads overwrite earlier ones, you can layer configurations from most generic to most specific.

Default Parameters and Rest Parameters

Function signatures can now include default values, reducing the need for manual if (arg === undefined) checks. Rest parameters (...args) collect an indefinite number of arguments into an array, making variadic functions straightforward.
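
To illustrate the first point, the default moves from a manual check into the signature itself (greet is just a hypothetical helper):

// Old style: guard against a missing argument by hand
function greetOld(name) {
    if (name === undefined) name = 'stranger';
    return `Hello, ${name}!`;
}

// Modern style: the default lives in the signature
function greet(name = 'stranger') {
    return `Hello, ${name}!`;
}

console.log(greet());      // Hello, stranger!
console.log(greet('Ada')); // Hello, Ada!

Rest parameters handle the opposite situation, gathering any number of arguments into an array, as the sum helper below shows.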

function sum(...numbers) {
    return numbers.reduce((total, n) => total + n, 0);
}

console.log(sum(1, 2, 3, 4)); // 10

Combine defaults with rest to create flexible APIs, such as a logger that prefixes messages with a configurable tag.

function logger(tag = 'APP', ...messages) {
    console.log(`[${tag}]`, ...messages);
}

logger('SERVER', 'Started on port', 8080);
// Output: [SERVER] Started on port 8080
logger('DEBUG', 'User data:', { id: 42 });
// Output: [DEBUG] User data: { id: 42 }

Promises, Async/Await, and Error Handling

Promises turned asynchronous code into a chainable, composable pattern. With async/await, that pattern becomes synchronous‑looking, drastically improving readability.

Let’s rewrite a typical fetch workflow using async/await.

async function fetchUser(id) {
    try {
        const response = await fetch(`https://api.example.com/users/${id}`);
        if (!response.ok) throw new Error('Network response was not ok');
        const data = await response.json();
        return data;
    } catch (err) {
        console.error('Failed to fetch user:', err);
        return null;
    }
}

// Usage
(async () => {
    const user = await fetchUser(7);
    if (user) console.log('User name:', user.name);
})();

The try/catch block captures both network errors and JSON parsing failures, keeping error handling in one place.

Pro tip: Always check response.ok before calling .json(). A non‑2xx status still resolves the promise, so you must handle it manually.

Parallel Async Operations

When multiple independent async calls are needed, Promise.all runs them concurrently, saving time.

async function loadDashboard() {
    const [stats, recent, alerts] = await Promise.all([
        fetch('/api/stats').then(r => r.json()),
        fetch('/api/recent').then(r => r.json()),
        fetch('/api/alerts').then(r => r.json())
    ]);

    console.log('Stats:', stats);
    console.log('Recent activity:', recent);
    console.log('Alerts:', alerts);
}

Because the three fetches are independent, they execute in parallel, reducing total latency.

Modules: import / export

ES modules let you split code into reusable, self‑contained files. The export keyword marks public members, while import pulls them into other files. In browsers that support native modules, this replaces the old CommonJS require pattern.

// utils.js
export function capitalize(str) {
    return str.charAt(0).toUpperCase() + str.slice(1);
}
export const PI = Math.PI;

// app.js
import { capitalize, PI } from './utils.js';

console.log(capitalize('module')); // Module
console.log('π ≈', PI.toFixed(2));

Modules are statically analyzable, enabling tree‑shaking tools like Webpack or Rollup to drop unused exports automatically.

Dynamic Imports for Code Splitting

When you only need a module under certain conditions, import() returns a promise that resolves to the module. This is perfect for lazy‑loading heavy libraries.

document.getElementById('chartBtn').addEventListener('click', async () => {
    const { renderChart } = await import('./chart.js');
    renderChart();
});

The chart library is fetched only after the user clicks the button, keeping the initial bundle lightweight.

Optional Chaining and Nullish Coalescing

Deep property access used to require repetitive checks. Optional chaining (?.) short‑circuits when a reference is null or undefined. Nullish coalescing (??) provides a default only when the left‑hand side is truly nullish, unlike || which treats 0 and '' as falsy.

const response = {
    user: {
        profile: {
            avatar: null
        }
    }
};

const avatarUrl = response.user?.profile?.avatar ?? 'default.png';
console.log(avatarUrl); // 'default.png'

This pattern is especially handy when handling API responses where some fields may be missing.

Pro tip: Combine optional chaining with nullish coalescing to provide safe fallbacks without overriding legitimate falsy values like 0 or false.
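
Here’s a small sketch of that difference, using a hypothetical retryCount setting where 0 is a legitimate value:

const config = { retryCount: 0 };

const withOr = config.retryCount || 3;       // 3 (|| treats 0 as "missing")
const withNullish = config.retryCount ?? 3;  // 0 (?? only falls back on null/undefined)

console.log(withOr, withNullish); // 3 0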

Classes, Private Fields, and Static Methods

ES6 introduced class syntax, which is syntactic sugar over prototype inheritance but reads like traditional OOP languages. Private fields (#field) enforce encapsulation, while static methods belong to the class itself rather than instances.

class BankAccount {
    #balance = 0; // private field

    constructor(owner) {
        this.owner = owner;
    }

    deposit(amount) {
        if (amount > 0) this.#balance += amount;
        return this;
    }

    withdraw(amount) {
        if (amount <= this.#balance) this.#balance -= amount;
        return this;
    }

    getBalance() {
        return this.#balance;
    }

    static transfer(from, to, amount) {
        from.withdraw(amount);
        to.deposit(amount);
    }
}

const alice = new BankAccount('Alice');
const bob = new BankAccount('Bob');

alice.deposit(100);
BankAccount.transfer(alice, bob, 50);
console.log(alice.getBalance()); // 50
console.log(bob.getBalance());   // 50

Attempting to read #balance from outside the class throws a syntax error, guaranteeing that internal state cannot be tampered with accidentally.
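
For instance, uncommenting the line below fails at parse time rather than at runtime (the exact error message varies by engine):

// console.log(alice.#balance);
// SyntaxError: Private field '#balance' must be declared in an enclosing class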

Mixins and Composition

Instead of deep inheritance hierarchies, modern JavaScript favors composition. Mixins can be applied to classes to share reusable behavior.

const TimestampMixin = Base => class extends Base {
    constructor(...args) {
        super(...args);
        this.createdAt = new Date();
    }

    age() {
        return Date.now() - this.createdAt.getTime();
    }
};

class Message {}
class TimedMessage extends TimestampMixin(Message) {}

const msg = new TimedMessage();
setTimeout(() => console.log('Age ms:', msg.age()), 1500);

The mixin adds timestamp functionality without polluting the original Message class, keeping concerns separated.

Iterators, Generators, and Async Iteration

Iterators define a protocol for sequential access, while generators simplify iterator creation with the function* syntax and yield keyword. Async generators extend this concept to asynchronous streams, perfect for processing large data sets or websockets.

// Synchronous generator
function* range(start, end) {
    for (let i = start; i < end; i++) {
        yield i;
    }
}

for (const n of range(1, 5)) {
    console.log(n); // 1 2 3 4
}

// Async generator reading a paginated API
async function* fetchPages(url) {
    let next = url;
    while (next) {
        const res = await fetch(next);
        const { data, nextPage } = await res.json();
        yield* data; // emit each item
        next = nextPage;
    }
}

// Consuming the async generator
(async () => {
    for await (const item of fetchPages('/api/items?page=1')) {
        console.log('Item:', item);
    }
})();

Generators turn complex loops into declarative pipelines, and async iteration lets you handle streams with a simple for await…of construct.

Real‑World Refactor: From Callback Hell to Clean Async/Await

Suppose you have a legacy function that loads user data, then their posts, then comments, each nested inside callbacks. The code quickly becomes unreadable.

// Callback hell (old style)
function loadUserData(userId, cb) {
    getUser(userId, function(user) {
        getPosts(user.id, function(posts) {
            getComments(posts, function(comments) {
                cb({ user, posts, comments });
            });
        });
    });
}

Refactor with promises and async/await to flatten the flow and centralize error handling.

// Modern async/await version
async function loadUserData(userId) {
    try {
        const user = await getUser(userId);
        const posts = await getPosts(user.id);
        const comments = await getComments(posts);
        return { user, posts, comments };
    } catch (err) {
        console.error('Failed to load user data:', err);
        return null;
    }
}

The flattened version reads top to bottom, and a single try/catch handles a failure from any of the three calls.