
Start with TypeScript on Bun

Scrape a single page at a provided URL with Axios and extract data from the page's HTML with Cheerio.

Language: typescript

Tools: bun, cheerio, axios

Use cases: Starter, Web scraping

src/main.ts

// Axios - Promise based HTTP client for the browser and node.js (Read more at https://axios-http.com/docs/intro).
import axios from 'axios';
// Cheerio - The fast, flexible & elegant library for parsing and manipulating HTML and XML (Read more at https://cheerio.js.org/).
import * as cheerio from 'cheerio';
// Apify SDK - toolkit for building Apify Actors (Read more at https://docs.apify.com/sdk/js/).
import { Actor } from 'apify';

// This is an ESM project, and as such, it requires you to specify extensions in your relative imports.
// Read more about this here: https://nodejs.org/docs/latest-v18.x/api/esm.html#mandatory-file-extensions
// Note that we need to use `.js` even when inside TS files.
// import { router } from './routes.js';

// The init() call configures the Actor for its environment. It's recommended to start every Actor with an init().
await Actor.init();

interface Input {
    url: string;
}
// Structure of input is defined in input_schema.json
const input = await Actor.getInput<Input>();
if (!input) throw new Error("Input is missing!");
const { url } = input;

// Fetch the HTML content of the page.
const response = await axios.get(url);

// Parse the downloaded HTML with Cheerio to enable data extraction.
const $ = cheerio.load(response.data);

// Extract all headings from the page (tag name and text).
const headings: { level: string, text: string }[] = [];
$("h1, h2, h3, h4, h5, h6").each((_i, element) => {
    const headingObject = {
        level: $(element).prop("tagName")!.toLowerCase(),
        text: $(element).text(),
    };
    console.log("Extracted heading", headingObject);
    headings.push(headingObject);
});

// Save headings to Dataset - a table-like storage.
await Actor.pushData(headings);

// Gracefully exit the Actor process. It's recommended to quit all Actors with an exit().
await Actor.exit();

Scrape single-page in Bun template

A template for scraping data from a single web page in TypeScript, running on Bun. The URL of the web page is passed in via the input, which is defined by the input schema. The template uses the Axios client to fetch the HTML of the page and the Cheerio library to parse the data from it. The data are then stored in a dataset where you can easily access them.

The scraped data in this template are page headings, but you can easily edit the code to scrape whatever you want from the page.
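For example, to collect all links instead of headings, you could replace the heading-extraction block in src/main.ts with something like the sketch below (the 'a[href]' selector and the links shape are illustrative choices, not part of the template):

// Illustrative sketch: extract every link (href and text) instead of headings.
// Assumes `$` is the Cheerio instance already created in src/main.ts.
const links: { href: string, text: string }[] = [];
$('a[href]').each((_i, element) => {
    links.push({
        href: $(element).attr('href')!,
        text: $(element).text().trim(),
    });
});

// Save the links to the Dataset instead of the headings.
await Actor.pushData(links);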

Included features

  • Apify SDK - a toolkit for building Actors
  • Input schema - define and easily validate a schema for your Actor's input
  • Dataset - store structured data where each object stored has the same attributes (see the sketch after this list)
  • Axios client - promise-based HTTP Client for Node.js and the browser
  • Cheerio - library for parsing and manipulating HTML and XML
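
To illustrate the Dataset feature listed above, here is a minimal sketch of how you might read the stored items back from the default dataset using the SDK's Actor.openDataset() and dataset.getData(); treat it as a starting point rather than part of the template:

// A minimal sketch: read the stored items back from the default dataset.
// Meant to run inside an Actor, between Actor.init() and Actor.exit().
import { Actor } from 'apify';

await Actor.init();

const dataset = await Actor.openDataset();
const { items } = await dataset.getData();
for (const item of items) {
    console.log('Stored item:', item);
}

await Actor.exit();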

How it works

  1. Actor.getInput() gets the input where the page URL is defined

  2. axios.get(url) fetches the page (see the error-handling sketch after these steps)

  3. cheerio.load(response.data) loads the page data and enables parsing the headings

  4. This parses the headings from the page; here you can edit the code to parse whatever else you need from the page:

    $("h1, h2, h3, h4, h5, h6").each((_i, element) => {...});
  5. Actor.pushData(headings) stores the headings in the dataset
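
The steps above assume the page loads successfully. As a hedged sketch, you might guard the fetch in step 2 with basic error handling, for example using axios.isAxiosError() and the SDK's Actor.fail(), which marks the run as failed (the 30-second timeout is an arbitrary illustrative value):

// A sketch of guarding the fetch (step 2) with basic error handling.
// Assumes the same imports, Actor.init(), and `url` as in src/main.ts.
let html: string;
try {
    const response = await axios.get(url, { timeout: 30_000 });
    html = response.data;
} catch (error) {
    // Mark the run as failed with a readable message; Actor.fail() also exits the process.
    const reason = axios.isAxiosError(error) ? error.message : String(error);
    await Actor.fail(`Failed to fetch ${url}: ${reason}`);
    throw error; // Unreachable in practice; keeps TypeScript's definite-assignment analysis happy.
}

// Continue with step 3 as before.
const $ = cheerio.load(html);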


Already have a solution in mind?

Sign up for a free Apify account and deploy your code to the platform in just a few minutes! If you want a head start without coding it yourself, browse our Store of existing solutions.