In JavaScript there are three different types of functions: regular functions, arrow functions using the => syntax, and generator functions using the function* form.
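For reference, the three forms look roughly like this (the names are only illustrative):
// Regular function
function add(a, b) {
  return a + b;
}
// Arrow function
const multiply = (a, b) => a * b;
// Generator function
function* oneTwoThree() {
  yield 1;
  yield 2;
  yield 3;
}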
Generator functions allow you to create iterable objects that can be consumed one item at a time, for example with for..of loops or Array.from. Their main characteristic is that execution pauses when a yield expression is reached, at which point the state of the function is preserved for the next execution. Knowing this, we can abuse them a little and use them as a coroutine. Generators and coroutines are not the same thing, and you can read about the differences elsewhere.
You use generator functions like this:
function* fibonacciGenerator() {
  let i = 0;
  let j = 0;
  while (true) {
    const next = i + j || 1;
    yield next;
    i = j;
    j = next;
  }
}
const fibGen = fibonacciGenerator();
console.log(fibGen.next().value); // 1
The first thing you might find weird about this is that the body of the function is an infinite loop. Normally that would be a problem, but it is not so for generator functions.
The generator function does not execute right away; nothing happens until you call the .next() method. This call returns an object with two fields: value and done.
Calling .next() executes the body of the function until it finds a yield <value> expression, at which point it pauses and returns the object mentioned earlier, with value set to whatever was yielded.
Since the function stops executing on a yield instruction, the done field is set to false. done is set to true when the function reaches a return statement or the end of the function body.
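To make this concrete, here is a minimal sketch with a small finite generator (numbers is just an illustrative name):
function* numbers() {
  yield 1;
  yield 2;
  return 3;
}
const gen = numbers();
console.log(gen.next()); // { value: 1, done: false }
console.log(gen.next()); // { value: 2, done: false }
console.log(gen.next()); // { value: 3, done: true } (hit the return)
console.log(gen.next()); // { value: undefined, done: true }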
With this knowledge we can deduce two things:
- The infinite loop will not run forever, because there is a yield instruction inside it.
- You still need to be careful not to pass the iterator to a for..of loop or to Array.from directly, because in that case the program will hang (one safe way to consume it is sketched below).
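One safe way to consume an infinite generator is a small helper that stops after a fixed number of items; the take function below is a hypothetical sketch, not a built-in:
function* take(iterable, count) {
  let taken = 0;
  for (const item of iterable) {
    if (taken++ >= count) {
      return; // stop pulling from the (possibly infinite) generator
    }
    yield item;
  }
}
// Safe to pass to Array.from, because take() finishes after 8 items.
console.log(Array.from(take(fibonacciGenerator(), 8)));
// [1, 1, 2, 3, 5, 8, 13, 21]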
You can use generator functions to make many things easier, like implementing an RNG or merging two sorted arrays.
RNG generator function
function* rng() {
  // xorshift-style pseudo-random number generator
  let x = 123456789;
  let y = 362436069;
  let z = 521288629;
  let w = 88675123;
  let t = 0;
  const max = 4294967295; // 2^32 - 1
  while (true) {
    t = (x ^ (x << 11)) >>> 0;
    x = y;
    y = z;
    z = w;
    w = (w ^ (w >>> 19) ^ (t ^ (t >>> 8))) >>> 0;
    yield w / max; // normalize to the range [0, 1]
  }
}
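A quick usage sketch; because the seeds are hard-coded the sequence is deterministic, but every .next() call produces a new value between 0 and 1:
const random = rng();
console.log(random.next().value); // first pseudo-random number in [0, 1]
console.log(random.next().value); // next pseudo-random number in [0, 1]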
Merging two sorted arrays
function* merge(x, y) {
  // Split each array into its head and the rest.
  const [a, ...aa] = x || [];
  const [b, ...bb] = y || [];
  if (a <= b || (x.length && !y.length)) {
    // a is the smaller head (or y is exhausted): emit it and recurse.
    yield a;
    yield* merge(aa, y);
  } else if (b < a || y.length) {
    // b is the smaller head (or x is exhausted): emit it and recurse.
    yield b;
    yield* merge(x, bb);
  }
}
const arr1 = [1, 3, 5, 7];
const arr2 = [2, 4, 6];
console.log(Array.from(merge(arr1, arr2))); // [1, 2, 3, 4, 5, 6, 7]
Previously I mentioned abusing generator functions, so here comes the interesting part...
Not blocking the browser
Let's jump right in: using generator functions and promises, we can write a function that executes long-running tasks in a cooperative way. In this case, I am writing a function that maps the elements of an array asynchronously, freeing the main execution thread every N items.
Behold the code!
function mapAsync(array, fn, workSize = 10) {
  return new Promise((resolve, reject) => {
    const resultArray = [];
    let currentSize = 0;
    async function* genFn() {
      try {
        for (const item of array) {
          resultArray.push(await fn(item));
          if (++currentSize >= workSize) {
            // This condition ensures that the loop runs until
            // the number of processed elements reaches workSize.
            // When that happens, it yields and resets the
            // currentSize counter.
            yield;
            currentSize = 0;
          }
        }
        resolve(resultArray);
      } catch (error) {
        reject(error);
      }
    }
    const gen = genFn();
    const id = setInterval(() => {
      // This interval serves two purposes:
      // 1. Schedule executions using the event loop,
      //    allowing other events to be processed.
      // 2. Drive the generator function until it is done.
      // Since genFn is an async generator, gen.next() returns
      // a promise of the { value, done } result object.
      gen.next().then(({ done }) => {
        if (done) {
          clearInterval(id);
        }
      });
    }, 0);
  });
}
The main points to look at in the function are already commented, but overall, it uses yield to pause the execution of the for..of loop and setInterval to resume it, repeating until the whole array has been processed.
If your concern is performance, keep a big workSize value, but be aware that doing so will impact the overall responsiveness of the application.
Here is an example of how to use it:
(async () => {
  let counter = 0;
  const hbId = setInterval(
    () => console.log('heartbeat', ++counter), 0);
  const squares = await mapAsync(new Array(100),
    (_) => {
      let x = 100000000;
      let y = 0;
      while (x--) {
        y += x;
      }
      return y;
    }, 1);
  console.log({ squares });
  clearInterval(hbId);
})();
The callback here is just doing busy work to push as much load as possible onto the main thread, but thanks to the mapAsync function, other parts of the webpage are not blocked from executing. You can play around with workSize and see how it affects the running time.
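For example, here is a minimal sketch that compares a few workSize values with console.time; the exact timings depend on your machine and on the work done per item:
(async () => {
  for (const workSize of [1, 10, 50]) {
    console.time(`workSize=${workSize}`);
    // Map 1000 numbers to their squares with different chunk sizes.
    await mapAsync([...Array(1000).keys()], (n) => n * n, workSize);
    console.timeEnd(`workSize=${workSize}`);
  }
})();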
One final remark
Nothing explained here is new, but it is a pretty seldom-used feature of JavaScript that works both in browsers and in Node.js.