Chunking Arrays: Handy Utilities Every JavaScript Dev Should Know

From data pagination to batching API calls, splitting arrays into chunks is one of those timeless problems every developer solves again and again. Here’s how to do it cleanly.


Introduction

You’ve got a huge array, and you don’t want to process it all at once. Maybe you’re:

  • Paginating results for a UI.
  • Sending bulk records to an API in batches.
  • Rendering grid layouts.
  • Running jobs where you can’t overload memory or network.

This is where chunking comes in — splitting arrays into smaller, evenly sized pieces.

It sounds simple, but there are lots of ways to get it wrong: off-by-one errors, forgetting the leftover items, or writing clunky loops that don’t scale.

Let’s go beyond quick hacks and build handy utilities you’ll reuse in every project.


1. The Classic Loop Approach

function chunkArray(arr, size) {
  const chunks = [];
  for (let i = 0; i < arr.length; i += size) {
    chunks.push(arr.slice(i, i + size));
  }
  return chunks;
}

console.log(chunkArray([1,2,3,4,5,6,7], 3));
// [[1,2,3], [4,5,6], [7]]

Pros: Fast, clear, handles leftovers.
⚠️ Cons: Slightly verbose if used inline.


2. Using reduce

function chunkWithReduce(arr, size) {
  return arr.reduce((acc, _, i) => {
    if (i % size === 0) acc.push(arr.slice(i, i + size));
    return acc;
  }, []);
}

console.log(chunkWithReduce([1,2,3,4,5,6,7], 3));
// [[1,2,3], [4,5,6], [7]]

Pros: Functional style, one-liner in pipelines.
⚠️ Cons: Less obvious to beginners than loops.
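
For example, it slots neatly into a map/reduce pipeline. A quick sketch, assuming a hypothetical orders array where each item has an amount field:

const orders = [
  { id: 1, amount: 20 },
  { id: 2, amount: 35 },
  { id: 3, amount: 10 },
  { id: 4, amount: 50 },
  { id: 5, amount: 5 }
];

// Sum each batch of 2 orders: 20+35, 10+50, then the leftover 5
const batchTotals = chunkWithReduce(orders, 2)
  .map(batch => batch.reduce((sum, o) => sum + o.amount, 0));

console.log(batchTotals); // [55, 60, 5]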


3. Leveraging Generators

Generators produce chunks lazily: each chunk is created only when you iterate to it, so you never hold the full list of chunks in memory.

function* chunkGenerator(arr, size) {
  for (let i = 0; i < arr.length; i += size) {
    yield arr.slice(i, i + size);
  }
}

for (const chunk of chunkGenerator([1,2,3,4,5,6,7], 3)) {
  console.log(chunk);
}
// [1,2,3]
// [4,5,6]
// [7]

👉 Why use it? Perfect for streams, large data sets, or when you don’t need all chunks at once.
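
For instance, because chunks are only built on demand, you can bail out early and the remaining chunks are never created. A small sketch:

const hugeArray = Array.from({ length: 100_000 }, (_, i) => i);

for (const chunk of chunkGenerator(hugeArray, 1000)) {
  console.log(chunk[0]); // first item of each chunk
  if (chunk[0] >= 5000) break; // stop early: later chunks are never built
}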


4. Real-World Use Cases

A. Paginating Data for UI

const data = Array.from({ length: 20 }, (_, i) => i + 1);
const pages = chunkArray(data, 5);

console.log(pages[0]); // [1,2,3,4,5]
// Show page 1, then page 2...
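
If you only ever need one page at a time, you don't have to build all the pages up front. A tiny sketch (getPage is a made-up helper name):

function getPage(data, pageIndex, pageSize) {
  const start = pageIndex * pageSize;
  return data.slice(start, start + pageSize);
}

console.log(getPage(data, 1, 5)); // [6,7,8,9,10] — page 2 (zero-indexed)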

B. Batch API Requests

async function batchFetch(ids) {
  const batches = chunkArray(ids, 50);
  const results = [];
  for (const batch of batches) {
    const res = await fetch("/api/data", {
      method: "POST",
      body: JSON.stringify(batch)
    }).then(r => r.json());
    results.push(...res);
  }
  return results;
}

👉 Prevents hitting API request limits or payload size errors.
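
If the endpoint also enforces a rate limit, a small pause between batches helps. A sketch, assuming the same hypothetical /api/data endpoint and an arbitrary 200 ms delay:

const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function batchFetchThrottled(ids, batchSize = 50, delayMs = 200) {
  const results = [];
  for (const batch of chunkArray(ids, batchSize)) {
    const res = await fetch("/api/data", {
      method: "POST",
      body: JSON.stringify(batch)
    }).then(r => r.json());
    results.push(...res);
    await sleep(delayMs); // breathe before the next batch
  }
  return results;
}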

C. Grid Layouts in React

function Grid({ items }) {
  const rows = chunkArray(items, 3);
  return (
    <div className="grid">
      {rows.map((row, i) => (
        <div className="row" key={i}>
          {row.map(item => <div className="cell" key={item}>{item}</div>)}
        </div>
      ))}
    </div>
  );
}

👉 Handy for responsive layouts or when rendering items in equal rows.


5. Lodash & Alternatives

Lodash has a built-in _.chunk:

import _ from "lodash";

_.chunk([1,2,3,4,5], 2);
// [[1,2], [3,4], [5]]

👉 If you already use Lodash, use it — well-tested and readable.
👉 If not, avoid pulling in the whole library just for chunking.
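
If bundle size is the worry, Lodash also ships per-method modules, so you can import only the chunker (the standalone lodash.chunk package works the same way):

import chunk from "lodash/chunk";

chunk([1,2,3,4,5], 2);
// [[1,2], [3,4], [5]]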


6. Edge Cases & Gotchas

  • Size ≤ 0: Should throw or return [].
  • Empty array: Always return [].
  • Array smaller than chunk size: Return [arr].
  • Uneven division: Last chunk may be shorter — and that’s expected.
chunkArray([1,2,3], 5); // [[1,2,3]]
chunkArray([], 3); // []
chunkArray([1,2,3], 0); // ❌ Danger: infinite loop
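
If you'd rather fail soft than throw, a tiny guard keeps the zero-size case from ever looping. A sketch (safeChunk is a made-up name; it wraps the chunkArray from section 1):

function safeChunk(arr, size) {
  if (!Array.isArray(arr) || size <= 0) return [];
  return chunkArray(arr, size);
}

console.log(safeChunk([1,2,3], 0)); // [] instead of an infinite loop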

7. TypeScript-Friendly Version

function chunkArray<T>(arr: T[], size: number): T[][] {
  if (size <= 0) throw new Error("Size must be > 0");
  const chunks: T[][] = [];
  for (let i = 0; i < arr.length; i += size) {
    chunks.push(arr.slice(i, i + size));
  }
  return chunks;
}

👉 Strongly typed, reusable in any project.
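
For example, with a hypothetical User type the generic parameter is inferred for you, and every chunk stays typed:

interface User { id: number; name: string; }

const users: User[] = [
  { id: 1, name: "Ada" },
  { id: 2, name: "Linus" },
  { id: 3, name: "Grace" }
];

const userBatches: User[][] = chunkArray(users, 2);
// [[Ada, Linus], [Grace]] — each batch is a User[]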


8. Performance Considerations

  • Loop vs. Reduce: Performance is nearly identical for small/medium arrays.
  • Huge arrays (100k+): Prefer the classic for loop for speed and memory clarity.
  • Lazy consumption: Use generators or async generators if processing streams or very large lists.

Example async generator for server-side pagination:

async function* asyncChunk(fetchPage, size) {
  let page = 0;
  while (true) {
    const items = await fetchPage(page++, size);
    if (!items.length) break;
    yield items;
  }
}
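
One way to consume it, assuming a hypothetical paginated /api/items endpoint (the for await loop must run inside an async function or a module with top-level await):

async function fetchPage(page, size) {
  const res = await fetch(`/api/items?page=${page}&size=${size}`);
  return res.json(); // assumed to resolve to an array of items
}

for await (const items of asyncChunk(fetchPage, 100)) {
  console.log(`Got ${items.length} items`);
}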

9. Handy Utilities (Copy-Paste)

// Core reusable
export function chunkArray(arr, size) {
  if (size <= 0) throw new Error("Size must be > 0");
  const chunks = [];
  for (let i = 0; i < arr.length; i += size) {
    chunks.push(arr.slice(i, i + size));
  }
  return chunks;
}

// Reduce-based
export const chunkWithReduce = (arr, size) =>
  arr.reduce((acc, _, i) => {
    if (i % size === 0) acc.push(arr.slice(i, i + size));
    return acc;
  }, []);

// Generator
export function* chunkGenerator(arr, size) {
  for (let i = 0; i < arr.length; i += size) {
    yield arr.slice(i, i + size);
  }
}

Conclusion

Chunking arrays is one of those small but foundational skills in JavaScript. Whether you’re paginating, batching API requests, or building layouts, knowing the right utility saves time and prevents off-by-one bugs.

  • For everyday use: Classic loop or reduce-based chunker.
  • For streams or huge data: Generators/async generators.
  • For quick one-liners: Lodash’s _.chunk.

👉 Pro Tip: Wrap your favorite version into a tiny utility file (or even publish as a gist/NPM package). It’ll pay for itself the next time you’re staring at a 10,000-item array.
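
Something as small as this works (the file path and bigList are placeholders, not a prescription):

// utils/chunk.js re-exports your favorite version from section 9
import { chunkArray } from "./utils/chunk.js";

const pages = chunkArray(bigList, 25);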


Call to Action

What’s your go-to way of chunking arrays?
Drop your favorite snippet (or edge-case fix) in the comments, and share this post with a teammate who keeps reinventing the chunker every sprint.
