Uneed Scraper
Pricing: $10.00 / 1,000 results
Developer: Maxime
Get the latest Uneed ladder into your pipeline
Uneed Scraper reads the live daily ladder, keeps paid spots out by default, captures maker social links, and can pull website emails in the same run.

Fastest way to try it
Start with `maxNbItemsToScrape: 10`, keep `Include promoted listings?` off, let the actor fetch the latest available ladder, and inspect the first dataset rows in the Output tab.
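As a sketch, the suggested first-run input above looks like this as a JSON payload (field names match the input example later in this README):

```python
import json

# Suggested first-run input: small cap, organic-only, emails on.
run_input = {
    "maxNbItemsToScrape": 10,                # small cap to bound the first run
    "shouldIncludePromotedListings": False,  # keep paid spots out (the default)
    "shouldScrapeEmails": True,              # keep website email enrichment on
}

# Serialize to the JSON body you would paste into the Input tab or send via API.
payload = json.dumps(run_input, indent=2)
print(payload)
```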
Why people use it
- Manual Uneed collection means checking the daily ladder, separating promoted tools, opening each tool page, and copying maker links by hand; this actor does all of that in one run.
- Keep paid entries out by default, and include them only when you explicitly want them.
- Pull maker social links and optional website emails into the same record.
- Keep a direct source link for every tool you export.
- Turn the latest ladder into a repeatable daily feed for research, GTM monitoring, and enrichment workflows.
How to use in 3 simple steps
- Open the Input tab and use `Product limit` to leave the full ladder on, or start with a small cap such as `10`.
- Leave `Include promoted listings?` off if you want organic-only output, and switch it on only when you intentionally want paid tools included.
- Keep `Scrape emails?` on when you want website contact paths too, then use the Output tab or API to send the dataset into your own workflow.
Inputs, defaults, and behavior
- The actor reads the live Uneed daily ladder API, not a brittle homepage snapshot.
- If the current UTC day is empty or temporarily unavailable, it automatically tries the previous UTC day so you still get the latest published ladder.
- Leave `maxNbItemsToScrape` empty or set it to `0` to process the full ladder.
- `Include promoted listings?` defaults to off. When organic-only output is requested, the run fails if promotion signals cannot be verified.
- `shouldScrapeEmails` defaults to `true`.
- The actor opens each tool page to extract maker contact links and can crawl the tool website for emails in the same run.
- Each record includes a stable `listingUrl` in the form `https://www.uneed.best/tool/{slug}`.
- If one tool or its website crawl fails, the actor logs that failure and continues with the rest of the ladder.
- The actor supports Apify proxy configuration, including direct mode when you explicitly pass `null`.
- `description` is always `null` for Uneed because this source does not expose a separate long description field in the emitted contract.
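The UTC-day fallback and the stable `listingUrl` format described above can be sketched in Python; `fetch_ladder` here is a hypothetical stand-in for the real ladder API call, not part of the actor's public interface:

```python
from datetime import date, timedelta
from typing import Callable, Optional

def latest_ladder(fetch_ladder: Callable[[date], Optional[list]],
                  today: date) -> tuple[date, list]:
    """Try today's UTC ladder first; fall back to the previous UTC day."""
    for day in (today, today - timedelta(days=1)):
        ladder = fetch_ladder(day)
        if ladder:  # a non-empty list means a published ladder
            return day, ladder
    raise RuntimeError("no ladder published for today or yesterday (UTC)")

def listing_url(slug: str) -> str:
    """Stable per-tool source link in the documented format."""
    return f"https://www.uneed.best/tool/{slug}"
```

For example, if today's ladder is still empty, `latest_ladder` returns yesterday's date alongside yesterday's rows.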
Input example
```json
{
  "maxNbItemsToScrape": 10,
  "shouldIncludePromotedListings": false,
  "shouldScrapeEmails": true
}
```
Input screenshot

What data can Uneed Scraper extract?
See the full Output tab for the complete schema.
Each row includes the tool website URL, the Uneed listing URL, and the contact or email paths you need for follow-up.
| Field | What you get | Why it matters |
|---|---|---|
| `url` | The tool's own website URL | Feed it into enrichment, scoring, or dedupe jobs |
| `listingUrl` | The Uneed tool page URL | Keep a direct source link for review |
| `tagline` | The live ladder description string | Understand the tool without reopening the page |
| `contacts` | Maker names plus external social links | Research founders or authors from the same row |
| `emails` | Website emails when enrichment is on | Get contact paths in the same pass |
| `isPromotedListing` | Whether Uneed marked the tool as promoted | Keep paid tools separate from organic ones |
Output example
```json
[
  {
    "url": "https://acme.dev/",
    "listingUrl": "https://www.uneed.best/tool/acme-dev",
    "name": "Acme Dev",
    "tagline": "API observability for small teams",
    "description": null,
    "isPromotedListing": false,
    "contacts": [
      {
        "name": "Jane Doe",
        "links": [
          "https://x.com/janedoe",
          "https://www.linkedin.com/in/jane-doe"
        ]
      }
    ],
    "emails": ["hello@acme.dev"]
  }
]
```
Output screenshot

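Given rows shaped like the output example above, a downstream consumer might split organic from promoted tools and collect unique emails. This is an illustrative sketch, not part of the actor itself:

```python
def split_rows(rows):
    """Split dataset rows into (organic, promoted) using isPromotedListing."""
    organic = [r for r in rows if not r.get("isPromotedListing")]
    promoted = [r for r in rows if r.get("isPromotedListing")]
    return organic, promoted

def unique_emails(rows):
    """Collect unique website emails across rows, preserving first-seen order."""
    seen = dict.fromkeys(e for r in rows for e in (r.get("emails") or []))
    return list(seen)
```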
How much does Uneed scraping cost?
This actor uses price-per-result billing, so the main cost driver is how many Uneed rows it emits. The easiest way to bound a first run is to start with `maxNbItemsToScrape: 10`, which is about $0.10 at the repo-configured rate of $0.01 per result; 100 tool rows is about $1.00. The live Pricing tab is the source of truth for the exact current rate.
| Billed item | When it triggers | Repo-configured price |
|---|---|---|
| Tool result | When one Uneed tool row is pushed to the dataset | $0.01 |
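The cost arithmetic above is just the per-result rate times the row count; a small estimator, assuming the repo-configured $10.00 per 1,000 results (check the live Pricing tab before relying on it):

```python
PRICE_PER_RESULT = 10.00 / 1000  # $10.00 per 1,000 results -> $0.01 per row

def estimated_cost(num_results: int) -> float:
    """Rough run cost in USD at the repo-configured rate."""
    return num_results * PRICE_PER_RESULT
```

At this rate, `estimated_cost(10)` is about $0.10 and `estimated_cost(100)` about $1.00, matching the figures above.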
Why run Uneed Scraper on Apify?
- Run it from the Console or call it from the Apify API when Uneed is one source inside a larger pipeline.
- Schedule a daily job so the ladder lands in your dataset without manual checking.
- Keep run logs and datasets together when you need to inspect a failed tool or compare days.
- Use Apify proxy settings when you want managed networking instead of wiring that layer yourself.
FAQ
Does it scrape today's ladder or the latest available ladder?
It tries the current UTC day first. If that ladder is empty or temporarily unavailable, it falls back to the previous UTC day.
Can I include promoted listings?
Yes. `Include promoted listings?` defaults to off, and the run will fail rather than quietly mixing sponsored tools into an organic-only output when promotion signals cannot be verified.
Why is description always null?
The emitted Uneed contract uses the ladder description as tagline. There is no separate long-description field in this actor's output.
What happens if one tool fails during processing?
That tool is logged as a failure, skipped, and the run continues with the remaining ladder items.
What should I try if the ladder looks incomplete or the run fails?
First remember that the actor tries the current UTC day and can fall back to the previous UTC day when needed. Then verify `maxNbItemsToScrape` and whether `Include promoted listings?` is on, re-run with a smaller limit such as `10`, and check the run log before opening an issue.
Where do I report a missing field, broken tool page, or Uneed source change?
Open the Issues page with the tool URL or ladder date you tested, the input you used, and the output you expected. I use that queue for fixes and feature requests.
Explore the rest of the collection
- Product Hunt Scraper - daily Product Hunt leaderboard scraping with cache and live-crawl options, maker links, and optional email enrichment
- TinySeed Scraper - TinySeed portfolio scraping with company descriptions and optional website emails
- Tiny Startups Scraper - Tiny Startups homepage scraping with promoted-card filtering and email enrichment
- Website Emails Scraper - shallow-crawl any list of URLs and emit one row per unique email found
Missing a feature or data?
Create a ticket and I'll add it within 24h.