Two common patterns in JavaScript to deduplicate arrays — and when to use each.

Introduction
Every developer eventually faces this:
const arr = [1, 2, 2, 3, 4, 4, 5];
👉 How do you get:
[1, 2, 3, 4, 5]
There are dozens of tricks floating around — Set, filter, reduce, Lodash’s uniq. But the two most common in vanilla JavaScript are:
- Using Set
- Using Array.prototype.filter
They both work, but they aren’t interchangeable. This guide will break down both approaches with clear code, pros/cons, performance notes, and real-world cases.
1) Using Set (The Modern Default)
Set is a built-in collection type introduced in ES6. It stores only unique values.
Example:
const arr = [1, 2, 2, 3, 4, 4, 5];
const unique = [...new Set(arr)];
- Convert the array → Set (removes duplicates).
- Spread it back into a new array.
👉 This is usually the fastest and simplest way.
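If you'd rather not use spread syntax, Array.from gives the same result, and the whole thing wraps neatly into a helper (the name dedupe below is just for illustration):
// Same result without spread syntax:
const uniqueViaFrom = Array.from(new Set(arr)); // [1, 2, 3, 4, 5]
// As a tiny reusable helper (illustrative name):
const dedupe = list => [...new Set(list)];
console.log(dedupe([1, 2, 2, 3])); // [1, 2, 3]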
2) Using filter (The Classic Way)
Before ES6, devs used filter with indexOf or findIndex.
Example:
const arr = [1, 2, 2, 3, 4, 4, 5];
const unique = arr.filter((item, index) => arr.indexOf(item) === index);
console.log(unique); // [1, 2, 3, 4, 5]
How it works (traced step by step below):
- For each element, check if its first index matches the current index.
- If yes → it’s the first occurrence → keep it.
- If no → it’s a duplicate → skip it.
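Here is the same example with the first few checks written out as comments:
const arr = [1, 2, 2, 3, 4, 4, 5];
const unique = arr.filter((item, index) => {
  // indexOf returns the position of the FIRST occurrence of item:
  // 1 at index 0 → indexOf(1) is 0 → 0 === 0 → keep
  // 2 at index 1 → indexOf(2) is 1 → 1 === 1 → keep
  // 2 at index 2 → indexOf(2) is 1 → 1 !== 2 → skip (duplicate)
  return arr.indexOf(item) === index;
});
console.log(unique); // [1, 2, 3, 4, 5]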
3) Comparing Sets vs. Filter
Syntax Simplicity
- Set: One-liner, very clean.
- Filter: Slightly longer and less readable.
Performance
For large arrays:
- Set: Generally faster — O(n) with hashing.
- Filter: Slower; indexOf is itself O(n), so the worst case is O(n²).
👉 With 100k+ items, Set will usually win big.
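You can get a feel for the gap with a quick console.time sketch (exact numbers vary by engine, data, and machine, so treat this as a rough check rather than a benchmark):
// Large array with many duplicates.
const big = Array.from({ length: 100000 }, () => Math.floor(Math.random() * 1000));

console.time("Set");
const viaSet = [...new Set(big)];
console.timeEnd("Set");

console.time("filter + indexOf");
const viaFilter = big.filter((x, i) => big.indexOf(x) === i);
console.timeEnd("filter + indexOf");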
Data Types
- Set checks uniqueness with SameValueZero (essentially ===, except NaN is treated as equal to NaN).
- Filter can be customized:
// Deduplicate case-insensitively
const arr = ["a", "A", "b", "B"];
const unique = arr.filter(
  (item, index, self) => self.findIndex(x => x.toLowerCase() === item.toLowerCase()) === index
);
console.log(unique); // ["a", "b"]
👉 Use filter if you need custom rules for equality.
Order Preservation
Both approaches preserve the first occurrence order.
const arr = [3, 2, 2, 1];
console.log([...new Set(arr)]);
// [3, 2, 1]
console.log(arr.filter((x, i) => arr.indexOf(x) === i));
// [3, 2, 1]
Handling Objects
const arr = [{ id: 1 }, { id: 1 }, { id: 2 }];
console.log([...new Set(arr)]);
// [{ id: 1 }, { id: 1 }, { id: 2 }] ❌ (different references)
👉 Set sees objects as unique unless they are the same reference.
For objects, you often need filter with custom comparison:
const uniqueById = arr.filter(
  (obj, index, self) => index === self.findIndex(o => o.id === obj.id)
);
console.log(uniqueById); // [{ id: 1 }, { id: 2 }]
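Note that findIndex inside filter has the same O(n²) worst case as indexOf. If the object arrays get large, a common variant combines both tools from this guide: filter for the iteration, a Set to remember which ids have already been kept (a sketch, assuming id is the field that defines uniqueness):
const seen = new Set();
const uniqueByIdFast = arr.filter(obj => {
  if (seen.has(obj.id)) return false; // an object with this id was already kept
  seen.add(obj.id);
  return true;                        // first occurrence of this id → keep it
});
console.log(uniqueByIdFast); // [{ id: 1 }, { id: 2 }]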
4) Real-World Use Cases
a) Deduplicating IDs
const ids = [1, 2, 3, 2, 4, 1];
const uniqueIds = [...new Set(ids)];
b) Deduplicating Search Results
const results = [
  { id: 1, name: "Alice" },
  { id: 1, name: "Alice" },
  { id: 2, name: "Bob" }
];
const uniqueResults = results.filter(
  (item, idx, arr) => idx === arr.findIndex(r => r.id === item.id)
);
c) Deduplicating Case-Insensitive User Input
const tags = ["JS", "js", "JavaScript", "JS"];
const uniqueTags = tags.filter(
  (t, i, arr) => arr.findIndex(x => x.toLowerCase() === t.toLowerCase()) === i
);
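All three cases follow the same shape, so it is common to pull the pattern into a small generic helper. A sketch (uniqueBy is a made-up name, not a built-in; it takes a function that derives the key used for equality):
function uniqueBy(list, keyFn) {
  return list.filter(
    (item, index, self) => self.findIndex(x => keyFn(x) === keyFn(item)) === index
  );
}

uniqueBy(ids, id => id);              // [1, 2, 3, 4]
uniqueBy(results, r => r.id);         // [{ id: 1, name: "Alice" }, { id: 2, name: "Bob" }]
uniqueBy(tags, t => t.toLowerCase()); // ["JS", "JavaScript"]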
5) Quick Reference Table

| Aspect | Set | filter |
| --- | --- | --- |
| Syntax | One-liner, very clean | Slightly longer |
| Performance on large arrays | O(n) with hashing | O(n²) worst case (indexOf / findIndex) |
| Custom equality rules | No | Yes, via the callback |
| Arrays of objects | Unique by reference only | Yes, with a custom comparison |
| First-occurrence order | Preserved | Preserved |

Conclusion
- Use Set when:
  - You're deduplicating primitive values.
  - You want the fastest, cleanest solution.
  - You're working with large arrays.
- Use filter when:
  - You need custom equality rules (case-insensitive, compare object keys).
  - You're deduplicating arrays of objects.
  - Readability of explicit logic matters.
Pro tip: In modern codebases, start with Set. Reach for filter only when you need extra control.
Call to Action
Which one do you reach for first — Set or filter?
💬 Share your favorite deduplication trick in the comments.
🔖 Bookmark this as your cheat sheet for array uniqueness.
👩‍💻 Share with a teammate who still writes for loops to remove duplicates.

