Browser as a Service (BaaS)
Declarative browser automation. Send JSON actions — navigate, click, type, scrape, screenshot — get clean structured results. No code required. Powered by Playwright.
Pricing: from $25.00 / 1,000 screenshots or page elements
Developer: Aaron
Actor stats: 1 bookmarked · 2 total users · 1 monthly active user
Last modified: 2 days ago
🌐 Browser as a Service (BaaS)
Automate any website with simple JSON. No code required.
Tell it what page to visit and what to do — click buttons, fill forms, grab text, take screenshots. It runs a real browser in the cloud and gives you back clean data.
⚡ How it works (3 steps)
1️⃣ Get your Apify API token
Go to apify.com → create a free account → Settings > Integrations → copy your API token.
2️⃣ Tell it what to do
Send a JSON message with two things:
- url — the page you want to visit
- actions — a list of steps to perform (in order)
3️⃣ Get your data
Results come back as clean JSON — text, data, screenshot links, whatever you asked for. Each scraped item shows up as a row in the output table.
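The two-field input shape from step 2 can be sketched in a few lines of Python — the URL, selector, and name below are placeholders, not a real task:

```python
import json

# Minimal actor input: a page URL plus an ordered list of actions.
# "h1" and "heading" are illustrative; swap in your own selector and name.
payload = {
    "url": "https://example.com",
    "actions": [
        {"type": "navigate"},                                    # open the page
        {"type": "scrape", "selector": "h1", "name": "heading"}, # grab the <h1> text
    ],
}

body = json.dumps(payload)  # this JSON string is what you POST to the actor
print(body)
```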
🤖 Use with AI — Just Copy & Paste
Don't know how to code? No problem. Just paste one of these prompts into your favorite AI and tell it what you want. It does the rest.
💬 Prompt for Claude / Claude Code
Copy this whole block, paste it into Claude, and replace the two things in [brackets]:
```
I want to use the Apify actor "zenacquire/browser-as-a-service" to automate a browser task.

My Apify API token: [PASTE YOUR TOKEN HERE]

How the actor works:
- API: POST https://api.apify.com/v2/acts/zenacquire~browser-as-a-service/runs?token=YOUR_TOKEN
- Send JSON with "url" (the page) and "actions" (ordered list of browser steps)
- Actions available:
  → navigate — open a page
  → click — click a button or link (needs "selector")
  → type — fill in a text field (needs "selector" and "value")
  → scrape — grab text from the page (needs "selector" and "name")
  → screenshot — take a picture of the page ("fullPage": true for whole page)
  → wait — pause until something loads (needs "selector" or "ms")
  → select — pick from a dropdown (needs "selector" and "value")
  → hover — mouse over an element (needs "selector")
  → scroll — scroll the page (needs "y" for pixels or "selector" to scroll to)
  → evaluate — run custom JavaScript on the page (needs "expression")

Each scraped item comes back as { url, action, value } in the dataset.
Screenshots are saved in the key-value store.

Get results: GET https://api.apify.com/v2/actor-runs/{RUN_ID}/dataset/items?token=YOUR_TOKEN
Get screenshots: GET https://api.apify.com/v2/actor-runs/{RUN_ID}/key-value-store/records/{NAME}?token=YOUR_TOKEN

Here's what I want to do: [DESCRIBE YOUR TASK IN PLAIN ENGLISH]
```
Example things you can say:
- "Scrape all product names and prices from this Amazon search page: https://amazon.com/s?k=wireless+earbuds"
- "Go to Zillow, search for homes in Austin TX, and grab the first page of listings"
- "Take a screenshot of my website https://mysite.com"
- "Fill out the contact form on https://example.com/contact with my info"
💬 Prompt for ChatGPT
Same idea — copy, paste, fill in the blanks:
```
Help me use the Apify actor "zenacquire/browser-as-a-service" to automate a browser task.

My Apify token: [PASTE YOUR TOKEN HERE]

API: POST https://api.apify.com/v2/acts/zenacquire~browser-as-a-service/runs?token=YOUR_TOKEN
Content-Type: application/json
Body: { "url": "...", "actions": [...] }

Actions:
→ navigate (open page)
→ click (selector)
→ type (selector + value)
→ scrape (selector + name) — extracts text
→ screenshot (name, fullPage)
→ wait (selector or ms)
→ select (selector + value)
→ hover (selector)
→ scroll (selector or y pixels)
→ evaluate (expression — runs JavaScript)

Results: GET https://api.apify.com/v2/actor-runs/{RUN_ID}/dataset/items?token=YOUR_TOKEN
Each result = { url, action, value }

What I need: [DESCRIBE YOUR TASK IN PLAIN ENGLISH]
```
📋 Real-World Examples
🛒 Grab an Amazon product title and price
```json
{
  "url": "https://www.amazon.com/dp/B0DXXXXXXXXX",
  "actions": [
    { "type": "navigate" },
    { "type": "scrape", "selector": "#productTitle", "name": "title" },
    { "type": "scrape", "selector": ".a-price .a-offscreen", "name": "price" },
    { "type": "screenshot", "name": "product-page", "fullPage": true }
  ]
}
```
Output:
| URL | Action | Value |
|---|---|---|
| amazon.com/dp/... | title | Sony WH-1000XM5 Wireless Headphones |
| amazon.com/dp/... | price | $278.00 |
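Since every scraped item arrives as a `{ url, action, value }` row, a one-line dict comprehension folds a run's rows into a single flat record. The rows below mirror the example output table; real values will differ:

```python
# Fold {url, action, value} dataset rows into one flat record.
# Sample rows copied from the example output table above.
rows = [
    {"url": "https://www.amazon.com/dp/B0DXXXXXXXXX", "action": "title",
     "value": "Sony WH-1000XM5 Wireless Headphones"},
    {"url": "https://www.amazon.com/dp/B0DXXXXXXXXX", "action": "price",
     "value": "$278.00"},
]

record = {row["action"]: row["value"] for row in rows}
print(record)  # {'title': 'Sony WH-1000XM5 Wireless Headphones', 'price': '$278.00'}
```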
📰 Scrape Hacker News headlines
```json
{
  "url": "https://news.ycombinator.com",
  "actions": [
    { "type": "navigate" },
    { "type": "scrape", "selector": ".titleline > a", "name": "headlines" }
  ]
}
```
🔍 Search Google and grab results
```json
{
  "url": "https://www.google.com",
  "actions": [
    { "type": "navigate" },
    { "type": "type", "selector": "textarea[name=q]", "value": "best headphones 2026" },
    { "type": "click", "selector": "input[name=btnK]" },
    { "type": "wait", "selector": "#search" },
    { "type": "scrape", "selector": "h3", "name": "results" },
    { "type": "screenshot", "name": "search-results", "fullPage": true }
  ]
}
```
📸 Screenshot any website
```json
{
  "url": "https://example.com",
  "actions": [
    { "type": "navigate" },
    { "type": "screenshot", "name": "full-page", "fullPage": true }
  ]
}
```
🧪 Quick Test (cURL)
Copy-paste into your terminal to test it right now:
```shell
curl -X POST "https://api.apify.com/v2/acts/zenacquire~browser-as-a-service/runs?token=YOUR_APIFY_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "url": "https://news.ycombinator.com",
    "actions": [
      { "type": "navigate" },
      { "type": "scrape", "selector": ".titleline > a", "name": "headlines" },
      { "type": "screenshot", "name": "hn", "fullPage": true }
    ]
  }'
```
🐍 Python
```python
import requests, time

TOKEN = "YOUR_APIFY_TOKEN"
BASE = "https://api.apify.com/v2"

# Start the run
run = requests.post(
    f"{BASE}/acts/zenacquire~browser-as-a-service/runs?token={TOKEN}",
    json={
        "url": "https://example.com",
        "actions": [
            {"type": "navigate"},
            {"type": "scrape", "selector": "h1", "name": "heading"},
            {"type": "screenshot", "name": "page", "fullPage": True},
        ],
    },
).json()
run_id = run["data"]["id"]
print(f"🚀 Run started: {run_id}")

# Wait for it to finish
while True:
    status = requests.get(f"{BASE}/actor-runs/{run_id}?token={TOKEN}").json()
    state = status["data"]["status"]
    if state in ("SUCCEEDED", "FAILED", "ABORTED", "TIMED-OUT"):
        break
    time.sleep(2)

# Get results
items = requests.get(f"{BASE}/actor-runs/{run_id}/dataset/items?token={TOKEN}").json()
for item in items:
    print(f"  {item['action']}: {item['value']}")
```
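Screenshots land in the run's key-value store rather than the dataset. Assuming the record URL pattern shown in the prompt above, a small helper builds the download link (the `requests` usage in the comment is illustrative):

```python
API_BASE = "https://api.apify.com/v2"

def record_url(run_id: str, name: str, token: str) -> str:
    """Build the key-value-store URL for a named record (e.g. a screenshot)."""
    return f"{API_BASE}/actor-runs/{run_id}/key-value-store/records/{name}?token={token}"

# Hypothetical usage — "page" matches the screenshot name in the Python example:
#   png = requests.get(record_url(run_id, "page", TOKEN)).content
#   open("page.png", "wb").write(png)
```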
📦 JavaScript / Node.js
```javascript
const TOKEN = "YOUR_APIFY_TOKEN";
const BASE = "https://api.apify.com/v2";

const run = await fetch(
  `${BASE}/acts/zenacquire~browser-as-a-service/runs?token=${TOKEN}`,
  {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      url: "https://example.com",
      actions: [
        { type: "navigate" },
        { type: "scrape", selector: "h1", name: "heading" },
        { type: "screenshot", name: "page", fullPage: true },
      ],
    }),
  }
).then((r) => r.json());

const runId = run.data.id;
console.log(`🚀 Run started: ${runId}`);

// Wait for it to finish
let state;
do {
  await new Promise((r) => setTimeout(r, 2000));
  const status = await fetch(`${BASE}/actor-runs/${runId}?token=${TOKEN}`).then((r) => r.json());
  state = status.data.status;
} while (!["SUCCEEDED", "FAILED", "ABORTED", "TIMED-OUT"].includes(state));

// Get results
const items = await fetch(`${BASE}/actor-runs/${runId}/dataset/items?token=${TOKEN}`).then((r) => r.json());
items.forEach((item) => console.log(`  ${item.action}: ${item.value}`));
```
🎯 All Actions
| What you want to do | Action | What to include |
|---|---|---|
| 🌐 Open a page | navigate | url (optional — defaults to input url) |
| 👆 Click something | click | selector |
| ⌨️ Type into a field | type | selector + value |
| 📋 Grab text from page | scrape | selector + name |
| 📸 Take a screenshot | screenshot | name + fullPage (true/false) |
| ⏳ Wait for something | wait | selector or ms (milliseconds) |
| 📑 Pick from dropdown | select | selector + value |
| 🖱️ Hover over element | hover | selector |
| 📜 Scroll the page | scroll | selector or y (pixels) |
| 💻 Run JavaScript | evaluate | expression + name |
💡 What's a selector? It's how you point to something on a page. In Chrome: right-click any element → Inspect → right-click the highlighted code → Copy → Copy selector. Paste that into the `selector` field.
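The `evaluate` row in the table takes arbitrary JavaScript via `expression`. A sketch of such an action, built as a Python dict — the link-counting expression is purely illustrative:

```python
# An "evaluate" action runs JavaScript in the page and stores the result
# under "name". Counting links is just an example expression.
action = {
    "type": "evaluate",
    "expression": "document.querySelectorAll('a').length",
    "name": "link_count",
}

print(action["expression"])
```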
⚙️ Options
| Setting | Default | What it does |
|---|---|---|
| browserType | "chromium" | Browser engine: chromium, firefox, or webkit |
| timeoutSecs | 30 | Max seconds to wait per action |
| viewport | {"width": 1280, "height": 720} | Browser window size |
| proxyConfiguration | none | Use Apify proxy to avoid blocks |
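These settings sit alongside `url` and `actions` in the same input object. A sketch with the table's defaults written out explicitly (the action list is a placeholder):

```python
# Actor input combining actions with the optional settings above,
# defaults spelled out explicitly for clarity.
input_with_options = {
    "url": "https://example.com",
    "actions": [
        {"type": "navigate"},
        {"type": "screenshot", "name": "page", "fullPage": True},
    ],
    "browserType": "chromium",                    # or "firefox" / "webkit"
    "timeoutSecs": 30,                            # per-action timeout
    "viewport": {"width": 1280, "height": 720},   # browser window size
}

print(sorted(input_with_options))
```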
💰 Pricing
| Event | Cost |
|---|---|
| Per scraped result | $7.00 / 1,000 |
| Per screenshot | $25.00 / 1,000 |
| Actor start | $0.00005 |
| Platform usage | Variable (cheaper on higher Apify plans) |
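A back-of-envelope estimate from the per-event prices above (platform usage is billed separately and not included here):

```python
# Event prices from the table above, converted to dollars per unit.
PER_RESULT = 7.00 / 1000   # $ per scraped result
PER_SHOT = 25.00 / 1000    # $ per screenshot
START = 0.00005            # $ per actor start

def run_cost(results: int, screenshots: int) -> float:
    """Event cost for one run; excludes variable platform usage."""
    return START + results * PER_RESULT + screenshots * PER_SHOT

# e.g. 500 scraped results + 20 screenshots in a single run
print(f"${run_cost(500, 20):.2f}")  # → $4.00
```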