Cheapest Reddit Scraper 1$/1000 results

Status: Under maintenance
Pricing: $1.00 / 1,000 results
Rating: 5.0 (2 ratings)
Developer: Camp8 fr0 (Maintained by Community)

Reddit Scraper - Extract Posts, Comments & User Data from Reddit

A powerful Apify Actor for scraping Reddit data. It extracts posts, comments, user profiles, and subreddit content in structured JSON format, making it well suited to social media analysis, research, and data collection.

Features

  • ✅ Scrape posts from any subreddit
  • ✅ Scrape posts from any user profile
  • ✅ Scrape comments from any user
  • ✅ Scrape comments from any subreddit
  • ✅ Configurable limits and options
  • ✅ Clean structured JSON output
  • ✅ No authentication required

What Can You Scrape?

1. Subreddit Posts

Fetches posts from a specific subreddit.

Parameters:

  • subreddit (required) - Subreddit name (e.g., "python")
  • limit - Number of posts to fetch (default: 25)
  • includeFacets - Include facets in results (default: true)
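
Putting these parameters together with the required scrapeType (values other than the subreddit are the documented defaults), a complete input for this mode looks like:

{
  "scrapeType": "subreddit_posts",
  "subreddit": "python",
  "limit": 25,
  "includeFacets": true
}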

2. User Posts

Fetches posts from a specific user's profile.

Parameters:

  • user (required) - Reddit username (e.g., "slaeryx")
  • limit - Number of posts to fetch (default: 25)
  • showMore - Show more details (default: true)
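
A minimal input for this mode, using the parameters above with their defaults:

{
  "scrapeType": "user_posts",
  "user": "slaeryx",
  "limit": 25,
  "showMore": true
}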

3. User Comments

Fetches comments from a specific user.

Parameters:

  • user (required) - Reddit username
  • limit - Number of comments to fetch (default: 25)
  • showMore - Show more details (default: true)
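
The equivalent input for this mode:

{
  "scrapeType": "user_comments",
  "user": "slaeryx",
  "limit": 25,
  "showMore": true
}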

4. Subreddit Comments

Fetches comments from a specific subreddit.

Parameters:

  • subreddit (required) - Subreddit name
  • limit - Number of comments to fetch (default: 25)
  • showMore - Show more details (default: true)
  • includeMedia - Include media in results (default: true)
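
A full input for this mode, with the documented defaults spelled out:

{
  "scrapeType": "subreddit_comments",
  "subreddit": "python",
  "limit": 25,
  "showMore": true,
  "includeMedia": true
}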

Input Configuration

{
  "scrapeType": "subreddit_posts",
  "subreddit": "python",
  "limit": 50,
  "includeFacets": true,
  "includeMedia": true,
  "showMore": true
}

Required Fields

  • scrapeType - Choose from: subreddit_posts, user_posts, user_comments, subreddit_comments

Conditional Required Fields

  • subreddit - Required for subreddit_posts and subreddit_comments
  • user - Required for user_posts and user_comments
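
If you prefer to start runs programmatically rather than from the Apify Console, here is a minimal sketch using the official apify-client Python package. The API token and actor ID are placeholders; substitute the values shown for this actor in your Apify Console.

from apify_client import ApifyClient

# Placeholders: use your own Apify API token and this actor's ID from the Apify Console.
client = ApifyClient("<YOUR_APIFY_TOKEN>")

run_input = {
    "scrapeType": "subreddit_posts",
    "subreddit": "python",
    "limit": 50,
    "includeFacets": True,
    "includeMedia": True,
    "showMore": True,
}

# Start the actor and wait for the run to finish.
run = client.actor("<ACTOR_ID>").call(run_input=run_input)

# Read the scraped items from the run's default dataset.
for item in client.dataset(run["defaultDatasetId"]).iterate_items():
    print(item.get("title") or item.get("body"))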

Output

The actor returns clean, structured data directly from Reddit. Each item contains the full Reddit data object with all available fields.

Example Output (Subreddit Post)

{
  "title": "What's everyone working on this week?",
  "author": "AutoModerator",
  "subreddit": "Python",
  "score": 42,
  "num_comments": 15,
  "url": "https://www.reddit.com/r/Python/comments/...",
  "selftext": "Share your projects...",
  "created_utc": 1768089632,
  "id": "1q9jxqs",
  "permalink": "/r/Python/comments/1q9jxqs/...",
  "is_self": true
}

Example Output (User Comment)

{
  "body": "This is a great project!",
  "author": "slaeryx",
  "subreddit": "Python",
  "score": 15,
  "created_utc": 1768100056,
  "link_title": "Show off your project",
  "parent_id": "t3_1q9niqo",
  "id": "nywkrgs",
  "permalink": "/r/Python/comments/..."
}
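
As a small post-processing sketch, the fields shown above can be used to rank scraped posts by score once the results have been downloaded. The filename and the assumption that the dataset was exported as a JSON array are illustrative, not part of the actor itself.

import json

# Load items exported from the actor's dataset (e.g. via the Console's JSON export).
# "dataset_items.json" is an assumed filename for this example.
with open("dataset_items.json", encoding="utf-8") as f:
    items = json.load(f)

# Posts carry a "title" field; comments carry "body" instead (see the examples above).
posts = [item for item in items if "title" in item]
posts.sort(key=lambda p: p.get("score", 0), reverse=True)

for post in posts[:10]:
    print(f'{post.get("score", 0):>6}  {post["title"]}')
    print(f'        https://www.reddit.com{post.get("permalink", "")}')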

Usage Examples

Example 1: Scrape Python Subreddit Posts

{
  "scrapeType": "subreddit_posts",
  "subreddit": "python",
  "limit": 25
}

Example 2: Scrape User's Posts

{
  "scrapeType": "user_posts",
  "user": "slaeryx",
  "limit": 50
}

Example 3: Scrape User's Comments

{
  "scrapeType": "user_comments",
  "user": "slaeryx",
  "limit": 25
}

Example 4: Scrape Subreddit Comments

{
  "scrapeType": "subreddit_comments",
  "subreddit": "python",
  "limit": 50
}

Support

For issues or questions:

  • Review the actor logs for detailed error messages
  • Check your input configuration
  • Ensure subreddit/user names are correct

Use Cases

  • Social media monitoring and analysis
  • Market research and sentiment analysis
  • Content aggregation
  • Research and data collection
  • Community insights
  • Trend analysis