JavaScript Performance: Reduce function
This short article is the result of talking with my junior and mid-level teammates about using the reduce function better. I want to describe the problem we want to solve and methods for measuring the performance of the solution.
Problem:
Some reduce loops are inefficient. Scripts that need to parse a lot of data can cause freezes. How can we make them perform better?
Problem’s background
As programmers (it doesn’t matter whether you work on the frontend or the backend) we need to keep memory usage in mind. Each time we assign something to a variable, we create a memory address — either a reference to an object that already exists, or a pointer to a newly allocated object. So if we keep putting new objects into memory over time, memory fills up, and when memory is overloaded the processor can run slower.
The suggested solution is to remember, while writing each function, how we allocate memory.
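The difference between reusing a reference and allocating a new object can be sketched in a few lines (variable names here are just for illustration):

```javascript
const user = { id: 1 };

// Assigning an existing object copies only the reference:
// no new memory is allocated.
const sameUser = user;
console.log(sameUser === user); // true: both point to one object

// Spreading allocates a brand-new object every time.
const copiedUser = { ...user };
console.log(copiedUser === user); // false: a fresh allocation
```

This is exactly the trade-off that shows up inside a reduce callback: each extra allocation is work the engine has to do on every iteration.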
Solutions: Save reference of accumulator value
Typical construction of reduce function in JS:
arr.reduce(callback(accumulator, currentValue[, index[, array]])[, initialValue])
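As a quick refresher, the accumulator carries the result between iterations, and initialValue seeds it on the first call:

```javascript
// Sum an array: acc starts at 0 (the initialValue)
// and receives each element in turn.
const sum = [1, 2, 3, 4].reduce((acc, curr) => acc + curr, 0);
console.log(sum); // 10
```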
So let’s compare two variants:
1. one that looks pretty but has bad performance
arr.reduce((acc, curr) => ({ ...acc, [curr.id]: curr }), {});
2. one that works fast but looks “non-fancy”
arr.reduce((acc, curr) => {
  acc[curr.id] = curr;
  return acc;
}, {});
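Before measuring, it's worth confirming the two variants really produce the same result. A minimal check, with a made-up input array of objects that have an `id` field:

```javascript
const arr = [{ id: 'a', n: 1 }, { id: 'b', n: 2 }];

// Variant 1: spreads the accumulator into a new object every iteration.
const spread = arr.reduce((acc, curr) => ({ ...acc, [curr.id]: curr }), {});

// Variant 2: mutates the same accumulator object in place.
const mutated = arr.reduce((acc, curr) => {
  acc[curr.id] = curr;
  return acc;
}, {});

// Both build the same id-keyed lookup object.
console.log(JSON.stringify(spread) === JSON.stringify(mutated)); // true
```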
At first glance they do the same thing, and the first one looks better to me. But let’s check how these functions perform. I have already prepared a repo with performance tests. The test suites use Benchmark.js, which lets me write simple benchmarks, and faker to mock the data.
Experiment:
const Benchmark = require('benchmark');
const faker = require('faker');

const ROUNDS = 100;

console.log(`Generating ${ROUNDS} sets of data...`);
const fixture = new Array(ROUNDS).fill(null).map(() => faker.helpers.userCard());

const suite = new Benchmark.Suite();

console.log(`Measurement...`);

suite
  .add('Reduce - left references', () => {
    fixture.reduce((acc, curr) => {
      const { city } = curr.address;
      return {
        ...acc,
        [city]: (acc[city] || 0) + 1
      };
    }, {});
  })
  .add('Reduce - save reference', () => {
    fixture.reduce((acc, curr) => {
      const { city } = curr.address;
      acc[city] = (acc[city] || 0) + 1;
      return acc;
    }, {});
  })
  // add listeners
  .on('cycle', function(event) {
    console.log(String(event.target));
  })
  .on('complete', function() {
    console.log(`Fastest is "${this.filter('fastest').map('name')}"`);
  })
  .run();
Results:
Generating 100 sets of data...
Measurement...
Reduce - left references x 4,357 ops/sec ±0.95% (91 runs sampled)
Reduce - save reference x 132,828 ops/sec ±1.10% (94 runs sampled)
Fastest is "Reduce - save reference"
As you can see, if we save the reference, the processor is able to do far more operations per second — it’s OVER 30 TIMES FASTER! Want to try it yourself? Clone my repo and run npm run test:reduce
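The gap has a simple explanation: spreading the accumulator copies every key accumulated so far on every iteration, so the total work grows roughly as n·(n+1)/2 (quadratic), while in-place assignment does one assignment per item (linear). A small sketch with a hypothetical counting helper makes this visible:

```javascript
// Hypothetical helper: count how many property copies the
// spread pattern performs for n items.
function countSpreadCopies(n) {
  let copies = 0;
  const arr = Array.from({ length: n }, (_, i) => ({ id: i }));
  arr.reduce((acc, curr) => {
    // The spread copies every existing key, plus adds one new key.
    copies += Object.keys(acc).length + 1;
    return { ...acc, [curr.id]: curr };
  }, {});
  return copies;
}

console.log(countSpreadCopies(100)); // 5050 copies for only 100 items
```

The in-place variant would perform just 100 assignments for the same input, which is why the benchmark gap widens as the data set grows.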