Catch Surf® USA Scraper

Pricing: Pay per usage

Scrape Catch Surf® USA and extract product data from catchsurf.com. The Catch Surf® USA API lets you crawl product information and pricing. The saved data can be downloaded as HTML, JSON, CSV, Excel, or XML.
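As a rough illustration of the CSV export, the sketch below flattens downloaded JSON dataset items into CSV rows. The item shape (`title`, `price`, `url`) and the values are hypothetical examples, not the actor's documented output schema.

```javascript
// Hypothetical dataset items, as they might look after a JSON download.
const items = [
  { title: 'Example Board A', price: 280, url: 'http://www.catchsurf.com' },
  { title: 'Example Board B', price: 200, url: 'http://www.catchsurf.com' },
];

// Quote every field and escape embedded double quotes, CSV-style.
const header = Object.keys(items[0]);
const escape = (value) => `"${String(value).replace(/"/g, '""')}"`;

const csv = [
  header.join(','),
  ...items.map((item) => header.map((key) => escape(item[key])).join(',')),
].join('\n');

console.log(csv); // first line: title,price,url
```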

Rating: 0.0 (0)

Developer: Mark Carter

Maintained by Community

Actor stats

- Bookmarked: 1
- Total users: 2
- Monthly active users: 1
- Last modified: 3 years ago

main.js

import Apify from 'apify';

Apify.main(async () => {
    const input = await Apify.getInput();

    await Apify.metamorph('pocesar/shopify-scraper', {
        ...input,
        startUrls: [{
            url: 'http://www.catchsurf.com',
        }],
    });
});
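The metamorph call hands execution over to the generic pocesar/shopify-scraper actor, forwarding the user's input but pinning the start URL to catchsurf.com. A minimal sketch of how that payload is assembled (the `userInput` values are hypothetical):

```javascript
// Hypothetical user input, possibly including its own startUrls.
const userInput = {
  maxRequestsPerCrawl: 100,
  startUrls: [{ url: 'http://example.com' }],
};

// Spread the user's input first, then pin startUrls to catchsurf.com,
// overriding any startUrls the user supplied.
const metamorphInput = {
  ...userInput,
  startUrls: [{ url: 'http://www.catchsurf.com' }],
};

console.log(metamorphInput.startUrls[0].url); // → 'http://www.catchsurf.com'
console.log(metamorphInput.maxRequestsPerCrawl); // → 100
```

Because the spread comes first, only properties listed after it (here, startUrls) win over the user's values; everything else passes through unchanged.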

package.json

{
    "name": "catch-surf-usa-scraper",
    "version": "0.0.1",
    "type": "module",
    "dependencies": {
        "apify": "^2.3.2"
    },
    "scripts": {
        "start": "node main.js"
    }
}

Dockerfile

# First, specify the base Docker image. You can read more about
# the available images at https://sdk.apify.com/docs/guides/docker-images
# You can also use any other image from Docker Hub.
FROM apify/actor-node:16
# Second, copy just package.json and package-lock.json since those are the only
# files that affect "npm install" in the next step, to speed up the build.
COPY package*.json ./
RUN npm --quiet set progress=false \
&& npm install --only=prod --no-optional \
&& echo "Installed NPM packages:" \
&& (npm list --only=prod --no-optional --all || true) \
&& echo "Node.js version:" \
&& node --version \
&& echo "NPM version:" \
&& npm --version
COPY . ./
ENV APIFY_DISABLE_OUTDATED_WARNING=1
ENV npm_config_loglevel=silent
# Finally, run the start script defined in package.json so the
# container actually executes the actor.
CMD npm start --silent