HTML to Image in Node.js — Without Puppeteer

Puppeteer is the obvious choice when you first need to convert HTML to an image in Node.js. It is well-documented, flexible, and produces accurate output. Then you try to deploy it. The binary is 150–300 MB depending on the platform. It fails silently on Vercel. It crashes on underpowered Lambda functions. Cold starts add seconds to the first request. And every few months a new version of Chrome breaks something in your launch flags.

There is a simpler path: offload the render to an API and use Node's built-in fetch to get the image back. No binary dependencies, no process management, no platform restrictions.

Why Puppeteer causes deployment pain

Puppeteer bundles a full Chromium binary. On Linux (the platform most cloud functions use) that binary is around 300 MB after extraction. This creates three compounding problems:

- Package size: serverless platforms cap deployment size (AWS Lambda allows 250 MB unzipped), and Chromium alone can blow past the limit.
- System libraries: Chromium needs shared libraries for fonts, graphics, and TLS that minimal container images and managed runtimes do not ship.
- Cold starts: extracting and launching the browser adds seconds of latency to the first request after a scale-up.

puppeteer-core with a system-installed Chrome sidesteps the binary size issue, but you still need to manage the Chrome installation in your deployment environment — which is impractical for most managed hosting platforms.
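If you do take the puppeteer-core route, resolving the executable path explicitly at least makes the hidden dependency visible. A minimal sketch, assuming the path arrives via an environment variable (the `CHROME_PATH` name and the helper itself are illustrative, not part of any library; `PUPPETEER_EXECUTABLE_PATH` is a convention Puppeteer itself recognizes):

```typescript
// Resolve a system Chrome/Chromium binary path from the environment.
export function resolveChromePath(env: Record<string, string | undefined>): string {
  const path = env.CHROME_PATH ?? env.PUPPETEER_EXECUTABLE_PATH;
  if (!path) {
    throw new Error('Set CHROME_PATH to a system Chrome/Chromium binary');
  }
  return path;
}

// Usage with puppeteer-core (Chrome must already exist at that path):
// import puppeteer from 'puppeteer-core';
// const browser = await puppeteer.launch({ executablePath: resolveChromePath(process.env) });
```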

The alternative: a render API

An HTML-to-image API runs Chromium in a managed pool on infrastructure built for it. Your Node.js code makes an HTTP request and receives binary image data. From your application's perspective, it is no different from calling any other API.

Factor                     Puppeteer                   Render API
Deployment size            +150–300 MB                 0 MB added
Works on Vercel / Lambda   Rarely                      Always
Cold start                 1–3 s                       None
System library deps        Many                        None
Maintenance                Breaks on Chrome updates    API versioned
Full HTML/CSS support      Yes                         Yes

Migrating existing Puppeteer code

If you already have Puppeteer code, the migration is mechanical. Here is a typical screenshot function:

ts — before (Puppeteer)
import puppeteer from 'puppeteer';

export async function htmlToImage(html: string, width = 1200, height = 630) {
  const browser = await puppeteer.launch({ args: ['--no-sandbox'] });
  try {
    const page = await browser.newPage();
    await page.setViewport({ width, height });
    await page.setContent(html, { waitUntil: 'networkidle0' });
    return await page.screenshot({ type: 'png' });
  } finally {
    // close the browser even if rendering throws, so Chromium processes never leak
    await browser.close();
  }
}

And the same function using the RenderPix API:

ts — after (fetch)
export async function htmlToImage(html: string, width = 1200, height = 630) {
  const res = await fetch('https://renderpix.dev/v1/render', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-Api-Key': process.env.RENDERPIX_API_KEY!,
    },
    body: JSON.stringify({ html, width, height, format: 'png' }),
  });

  if (!res.ok) {
    throw new Error(`Render failed: ${res.status} ${await res.text()}`);
  }

  return Buffer.from(await res.arrayBuffer());
}

The function signature is identical. The return type is the same — a Buffer of PNG bytes. Any calling code that writes the buffer to disk, streams it to S3, or sends it in an HTTP response continues to work without changes.

The RENDERPIX_API_KEY environment variable is the only new dependency. Add it to your .env.local locally and to your hosting platform's environment variable settings for production.
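A missing key only surfaces as a failed render at request time, so it can help to fail fast at startup instead. A small sketch (the helper name is illustrative):

```typescript
// Read the API key once at startup and fail loudly if it is absent,
// rather than sending unauthenticated requests at render time.
export function requireApiKey(env: Record<string, string | undefined> = process.env): string {
  const key = env.RENDERPIX_API_KEY;
  if (!key) {
    throw new Error('RENDERPIX_API_KEY is not set; add it to .env.local or your host settings');
  }
  return key;
}
```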

Writing to disk, S3, or HTTP response

The buffer returned by htmlToImage works anywhere a Buffer does. Common destinations:

js — usage examples
import { writeFile } from 'node:fs/promises';
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';

const s3 = new S3Client({});
const buf = await htmlToImage(myHtml, 1200, 630);

// Write to disk
await writeFile('output.png', buf);

// Upload to S3 (`id` comes from your surrounding code)
await s3.send(new PutObjectCommand({
  Bucket: 'my-bucket',
  Key: `images/${id}.png`,
  Body: buf,
  ContentType: 'image/png',
}));

// Stream in an Express response
res.set('Content-Type', 'image/png');
res.send(buf);

// Next.js Route Handler
return new Response(buf, {
  headers: { 'Content-Type': 'image/png' },
});

Using TypeScript

If you are on TypeScript, a thin typed wrapper makes the interface explicit and easy to test:

ts — render.ts
interface RenderOptions {
  html: string;
  width?: number;
  height?: number;
  format?: 'png' | 'jpeg' | 'webp';
  quality?: number;
  deviceScaleFactor?: number;
}

export async function render(opts: RenderOptions): Promise<Buffer> {
  const { html, width = 1200, height = 630, format = 'png', ...rest } = opts;

  const res = await fetch('https://renderpix.dev/v1/render', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-Api-Key': process.env.RENDERPIX_API_KEY!,
    },
    body: JSON.stringify({ html, width, height, format, ...rest }),
  });

  if (!res.ok) {
    throw new Error(`RenderPix error ${res.status}: ${await res.text()}`);
  }

  return Buffer.from(await res.arrayBuffer());
}
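If you serve formats other than PNG, the Content-Type in the earlier examples needs to match. A small mapping sketch (the helper name is an assumption, not part of any API):

```typescript
// Map a render format to its MIME type for HTTP responses and S3 uploads.
export function contentTypeFor(format: 'png' | 'jpeg' | 'webp'): string {
  const types = { png: 'image/png', jpeg: 'image/jpeg', webp: 'image/webp' } as const;
  return types[format];
}
```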

Import render wherever you previously imported Puppeteer. Remove puppeteer from package.json. Run npm install and watch your node_modules shrink dramatically.
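One pattern worth adding while you are at it: network calls can fail transiently in ways a local browser never did. A retry sketch (the attempt count and backoff are assumptions, not documented RenderPix behavior):

```typescript
// Retry an async operation with simple linear backoff.
export async function withRetry<T>(fn: () => Promise<T>, attempts = 3): Promise<T> {
  let lastError: unknown;
  for (let attempt = 1; attempt <= attempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt < attempts) {
        // wait a bit longer after each failure before trying again
        await new Promise((resolve) => setTimeout(resolve, 250 * attempt));
      }
    }
  }
  throw lastError;
}

// const buf = await withRetry(() => render({ html: myHtml }));
```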

Also see: screenshot API for developers.

Drop Puppeteer. Keep the output quality.

Sign up for a free account and replace your first Puppeteer function in under 10 minutes.
