Fixing Jenkins/Kubernetes parallelization issues in Jest React tests using fetch mocks and React Testing Library

Alexander Luria
3 min read · Apr 20, 2021
Testing can be tricky when you combine multiple disciplines.

TLDR: When working with a multi-worker instance of a test runner, chain events!

mockApiFetch
.mockResolvedValueOnce({ ok: true, body: jobPendingBody })
.mockResolvedValueOnce({ ok: true, body: jobRunningBody })
.mockResolvedValueOnce({ ok: true, body: jobRunningBody })
.mockResolvedValueOnce({ ok: true, body: jobRunningBody })
.mockResolvedValueOnce({ ok: true, body: jobRunningBody })
.mockResolvedValueOnce({ ok: true, body: jobSuccessBody });

PROBLEM: I started developing in a new DevOps environment. This one uses Jenkins, Helm, Kubernetes, and gitswarm to spin up test execution instances on each code delivery.

I developed a simple component using React and TypeScript (yes, I know… roll your eyes and click your tongues). All in all, a decent piece of code that monitors REST API task progress by scheduling a poll interval and invoking the browser's `fetch` to call the server.
All the tests were passing in my local environment, but when delivered to the DevOps pipeline, they failed consistently on unfulfilled fetch calls.
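For context, here is a minimal sketch of such a polling component. The real `TaskPoller` and its options aren't shown in this post, so the shape below (`fetchJob`, `pollRateMs`, the callback names) is a hypothetical reconstruction, not the actual code:

```typescript
// Hypothetical sketch of a task poller: repeatedly calls a fetch-like
// function until the job reports success or failure.
type JobBody = { status: "pending" | "running" | "success" | "failed" };
type FetchFn = () => Promise<{ ok: boolean; body: JobBody }>;

interface TaskPollerOptions {
  fetchJob: FetchFn;            // in the real code this would wrap fetch()
  pollRateMs: number;
  onUpdate: (body: JobBody) => void;
  onComplete: (body: JobBody) => void;
  onFailure: (body: JobBody) => void;
}

function TaskPoller(opts: TaskPollerOptions): void {
  const poll = async () => {
    const res = await opts.fetchJob();
    if (!res.ok || res.body.status === "failed") {
      opts.onFailure(res.body);
    } else if (res.body.status === "success") {
      opts.onComplete(res.body);
    } else {
      opts.onUpdate(res.body);           // still pending/running
      setTimeout(poll, opts.pollRateMs); // schedule the next poll
    }
  };
  void poll();
}
```

Because every iteration awaits an async `fetchJob` and reschedules itself with a timer, a test has to line up one mocked response per poll tick, which is exactly where things went wrong below.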

It was driving me crazy. I tried all kinds of tricks, to no avail. I even tried to debug the tests by delivering small changes to the pipeline, but nothing seemed to work.

SOLUTION: I found that I can mock the whole module with `jest.mock('yourlibrary')`, then import the library as usual and use that import to configure the mock calls.

The basic code looked roughly like this:

import { ApiFetch } from 'data-access-api';
jest.mock('data-access-api');

beforeEach(() => {
  jest.useFakeTimers();
  jest.clearAllMocks();
});

afterAll(() => {
  jest.clearAllTimers();
});

test('Task fetches, calls onUpdate with each fetch result, and ends on successful job', async () => {
  const mockApiFetch = ApiFetch as jest.Mock;

  // First attempt: separate (unchained) mock setup calls.
  mockApiFetch.mockResolvedValueOnce({ ok: true, body: jobPendingBody });
  mockApiFetch.mockResolvedValueOnce({ ok: true, body: jobRunningBody });
  mockApiFetch.mockResolvedValueOnce({ ok: true, body: jobRunningBody });
  mockApiFetch.mockResolvedValueOnce({ ok: true, body: jobRunningBody });
  mockApiFetch.mockResolvedValueOnce({ ok: true, body: jobRunningBody });
  mockApiFetch.mockResolvedValueOnce({ ok: true, body: jobSuccessBody });

  const onComplete = jest.fn();
  const onUpdate = jest.fn();
  const onFailure = jest.fn();
  TaskPoller({
    taskCreated: taskCreatedMock,
    onComplete,
    onFailure,
    onUpdate,
  });

  await waitFor(() => {
    expect(onComplete).toHaveBeenCalledTimes(1);
  });

  expect(onComplete).toHaveBeenLastCalledWith(jobSuccessBody);
  expect(onUpdate).toHaveBeenCalledTimes(5);
  expect(onUpdate).toHaveBeenLastCalledWith(jobRunningBody);
  expect(onFailure).not.toHaveBeenCalled();
  expect(mockApiFetch).toHaveBeenCalledTimes(6);

  // Advance past one more poll interval: no further fetches after completion.
  jest.advanceTimersByTime(DEFAULT_POLL_RATE_MS);
  expect(mockApiFetch).toHaveBeenCalledTimes(6);
});

The thing is, though this improved the pass rate in the DevOps pipeline, some tests still failed even though "it worked on my computer."

The next step came painfully, after several days of trying every trick in the book: chain the mock setups, so that the mocked fetch responses are guaranteed to fire in order. Since the pipeline runs on multiple workers, the tests were colliding on mocks, sometimes over-counting mock calls and sometimes under-counting them. It was a clear synchronization issue, and chaining the calls solved it for good.

mockApiFetch
.mockResolvedValueOnce({ ok: true, body: jobPendingBody })
.mockResolvedValueOnce({ ok: true, body: jobRunningBody })
.mockResolvedValueOnce({ ok: true, body: jobRunningBody })
.mockResolvedValueOnce({ ok: true, body: jobRunningBody })
.mockResolvedValueOnce({ ok: true, body: jobRunningBody })
.mockResolvedValueOnce({ ok: true, body: jobSuccessBody });
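Why does chaining like this work at all? Each `.mockResolvedValueOnce(...)` pushes one value onto a first-in-first-out queue that the mock consumes per call, so the responses come back in exactly the declared order. Conceptually (a simplified, hypothetical model of that behavior, not Jest's actual implementation):

```typescript
// Simplified model of mockResolvedValueOnce: a FIFO queue of one-shot
// values, consumed in declaration order, with a default fallback.
function makeMock<T>(defaultValue: T) {
  const onceQueue: T[] = [];
  const api = {
    // Chainable, like Jest's API: each call enqueues exactly one value.
    mockResolvedValueOnce(value: T) {
      onceQueue.push(value);
      return api;
    },
    // The mocked call: drain the queue first, then fall back.
    fn: async (): Promise<T> =>
      onceQueue.length > 0 ? onceQueue.shift()! : defaultValue,
  };
  return api;
}
```

One consequence worth noting: once the queue is exhausted, Jest falls back to the default resolved value (or `undefined` if none was set), so a poller that fires one time too many gets a silently wrong response. That is why the test above pins the exact call count.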

Happy testing, dear friends. I hope I saved you several years of your life trying to solve this issue!

Alexander Luria

Veteran Front end developer. Obsessing over ROI development. Addicted to online FPS.