GoFundMe Campaign Scraper
Scrape GoFundMe campaign pages to extract fundraising data, donor counts, campaign stories, organizer information, and recent donations. Monitor crowdfunding trends, track campaign performance, or build datasets of fundraising campaigns at scale.
What does GoFundMe Campaign Scraper do?
This actor takes a list of GoFundMe campaign URLs and extracts detailed information from each campaign page. It uses a headless browser with stealth capabilities to reliably access campaign data, including amounts raised, funding goals, donor counts, the full campaign story, and the most recent donations.
Use this scraper to:
- Track fundraising progress across multiple campaigns over time
- Analyze crowdfunding trends by extracting campaign data at scale
- Monitor competitor campaigns for benchmarking and research
- Build datasets of GoFundMe campaigns for academic or market research
- Aggregate donation data to understand giving patterns and donor behavior
Input
| Field | Type | Description | Default |
|---|---|---|---|
| urls | Array of strings | Required. List of GoFundMe campaign URLs to scrape. Each URL should point to an individual campaign page (e.g., https://www.gofundme.com/f/campaign-name). | [] |
| maxResults | Integer | Maximum number of campaign pages to scrape in a single run. Controls cost and execution time. | 100 |
| useResidentialProxy | Boolean | Enable residential proxies for improved success rates against anti-bot protection. Recommended for production use. | true |
Example Input
{"urls": ["https://www.gofundme.com/f/example-campaign-1","https://www.gofundme.com/f/example-campaign-2"],"maxResults": 50,"useResidentialProxy": true}
Output
Each scraped campaign produces a JSON object with the following fields:
| Field | Type | Description |
|---|---|---|
| url | String | The GoFundMe campaign URL that was scraped |
| title | String | Campaign title or headline |
| organizer | String | Name of the campaign organizer |
| raisedAmount | Number | Total amount raised so far in USD |
| fundingGoal | Number | Campaign funding goal in USD |
| donorCount | Integer | Total number of donors or supporters |
| category | String | Campaign category (e.g., Medical, Emergency, Education) |
| story | String | Campaign story text (truncated to 3,000 characters) |
| createdDate | String | Date the campaign was created |
| recentDonationsCount | Integer | Number of recent donations extracted (up to 10) |
| recentDonations | Array | List of recent donations, each with name, amount, and time |
| error | String | Error code if scraping failed (e.g., 403, 429, REQUEST_FAILED) |
| errorMessage | String | Human-readable error description |
| scrapedAt | String | ISO 8601 timestamp of when the data was scraped |
Example Output
{"url": "https://www.gofundme.com/f/example-campaign","title": "Help John with Medical Expenses","organizer": "Jane Smith","raisedAmount": 15420,"fundingGoal": 25000,"donorCount": 312,"category": "Medical","story": "John was recently diagnosed with...","createdDate": "2025-11-15","recentDonationsCount": 10,"recentDonations": [{"name": "Sarah K.","amount": "$50","time": "2 hours ago"},{"name": "Anonymous","amount": "$100","time": "5 hours ago"}],"scrapedAt": "2026-02-11T12:00:00.000Z"}
Proxy Configuration
This scraper supports both residential and datacenter proxies:
- Residential proxies (default): Higher success rates against GoFundMe's anti-bot systems. Recommended for production scraping.
- Datacenter proxies: Lower cost but may experience higher block rates. Suitable for small-scale testing.
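For example, switching to datacenter proxies for a small test run only requires flipping the useResidentialProxy flag in the input. A sketch of such a test input, using the same fields as the Example Input above:

```python
# Small-scale test run over datacenter proxies (cheaper, but more likely to be blocked).
test_input = {
    "urls": ["https://www.gofundme.com/f/example-campaign-1"],
    "maxResults": 10,
    "useResidentialProxy": False,
}
```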
Limitations
- Requires valid GoFundMe campaign URLs as input (does not discover or search for campaigns)
- Campaign story text is truncated to 3,000 characters to manage dataset size
- Recent donations are limited to the 10 most recent visible on the page
- Some campaigns may have restricted access or may have been removed
- Rate limiting may occur with large numbers of requests in a short period
Tips for Best Results
- Use residential proxies for the most reliable scraping results
- Start with a small batch of URLs to verify output before scaling up
- Monitor error counts in the status messages to detect blocking patterns
- Space out large runs to avoid triggering rate limits
- Check the error field in results to identify campaigns that need re-scraping (a re-run sketch follows this list)
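As a sketch of that last tip, assuming the same Apify setup as the run example above, the snippet below collects the URLs of failed records from a previous run's dataset and feeds them into a retry run. The API token, dataset ID, and actor ID are placeholders.

```python
from apify_client import ApifyClient

client = ApifyClient("YOUR_APIFY_TOKEN")  # placeholder token

# Dataset ID from a previous run of the scraper (placeholder).
dataset_id = "YOUR_DATASET_ID"

# Collect URLs whose records carry an error code so they can be retried.
failed_urls = [
    item["url"]
    for item in client.dataset(dataset_id).iterate_items()
    if item.get("error")
]

if failed_urls:
    # Placeholder actor ID -- replace with the actor you are actually running.
    client.actor("YOUR_USERNAME/gofundme-campaign-scraper").call(
        run_input={
            "urls": failed_urls,
            "maxResults": len(failed_urls),
            "useResidentialProxy": True,
        }
    )
```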