Reddit Posts & Comments Scraper
Pricing: Pay per usage
Scrape Reddit posts, comments, and subreddit data. Extract titles, scores, authors, comment threads, and media. Works with any subreddit or search query.
Developer: oscar lira
Reddit Scraper
Scrape posts and comments from any subreddit or Reddit search. Extracts 29 fields per post and 10 fields per comment, including nested replies, image galleries, flairs, and subreddit stats. No API keys or login needed.
What data does it extract?
Post fields (29)
| Field | Description |
|---|---|
| id | Reddit post ID (e.g. t3_1s6e3dp) |
| subreddit | Subreddit name (e.g. technology) |
| title | Post title |
| author | Reddit username |
| score | Net upvotes |
| upvoteRatio | Upvote ratio (e.g. 0.95 = 95%) |
| numComments | Comment count |
| url | Reddit permalink |
| selftext | Post body for text posts (up to 5,000 chars) |
| thumbnail | Thumbnail preview URL |
| imageUrls | All image URLs from galleries and image posts |
| media | Video URL + duration, or image URL |
| created | Post creation time (ISO 8601) |
| edited | Last edit timestamp, or false |
| isVideo | Video post flag |
| isSelf | Text post (true) vs link post (false) |
| isGallery | Multi-image gallery post |
| domain | Source domain (e.g. youtube.com, self.technology) |
| linkUrl | External URL for link posts |
| flair | Post flair (e.g. Discussion, News, Privacy) |
| awards | Total Reddit awards |
| isNSFW | NSFW flag |
| isSpoiler | Spoiler flag |
| isPinned | Pinned/stickied by mods |
| numCrossposts | Times crossposted to other subreddits |
| subredditSubscribers | Subreddit subscriber count |
| postType | Classification: text, link, video, image, gallery |
| scrapedAt | Scraping timestamp (ISO 8601) |
| comments | Array of comments (when enabled) |
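The boolean flags above determine postType. The actor computes this server-side and the exact precedence is not documented, so the sketch below shows one plausible ordering (gallery before video before image) and is not the actor's actual implementation:

```python
def classify_post_type(post: dict) -> str:
    """Derive a postType string from the flag fields in the table above.

    Illustrative only: the precedence (gallery > video > text > image > link)
    is an assumption, not the actor's documented logic.
    """
    if post.get("isGallery"):
        return "gallery"
    if post.get("isVideo"):
        return "video"
    if post.get("isSelf"):
        return "text"
    # A link post whose URL points at an image file reads as an image post.
    link = (post.get("linkUrl") or "").lower()
    if link.endswith((".jpg", ".jpeg", ".png", ".gif", ".webp")):
        return "image"
    return "link"
```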
Comment fields (10)
| Field | Description |
|---|---|
| id | Comment ID |
| author | Commenter username |
| body | Comment text (up to 2,000 chars) |
| score | Net upvotes |
| created | Timestamp (ISO 8601) |
| depth | Nesting level (0 = top-level, 1 = reply, 2 = reply-to-reply) |
| isSubmitter | Whether the commenter is the post author |
| parentId | Parent comment or post ID |
| controversiality | Whether the comment is controversial (0 or 1) |
| replies | Number of direct replies |
Use cases
- Market research — track discussions about your product or competitors
- Sentiment analysis — collect posts + comments and feed them into NLP pipelines
- Lead generation — find people asking for solutions your product solves
- Content discovery — find trending topics across communities
- AI training data — gather real conversations for LLM fine-tuning or RAG
How to use
Scrape hot posts from multiple subreddits
```json
{
  "subreddits": ["technology", "programming", "webdev"],
  "maxPosts": 50,
  "sort": "hot"
}
```
maxPosts applies per subreddit — this returns up to 150 posts total.
Search across all of Reddit
```json
{
  "searchQuery": "best CRM for small business",
  "maxPosts": 100,
  "sort": "top",
  "timeFilter": "month"
}
```
Get posts with nested comments
```json
{
  "subreddits": ["AskReddit"],
  "maxPosts": 25,
  "sort": "top",
  "timeFilter": "week",
  "includeComments": true,
  "maxCommentsPerPost": 20
}
```
Input parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| subreddits | array | [] | Subreddit names without r/ prefix |
| searchQuery | string | — | Search Reddit for this term |
| maxPosts | integer | 50 | Max posts per subreddit (1-500) |
| sort | string | hot | Sort: hot, new, top, rising |
| timeFilter | string | day | Time range for top: hour, day, week, month, year, all |
| includeComments | boolean | false | Fetch comments for each post |
| maxCommentsPerPost | integer | 10 | Comments per post (1-100), includes nested replies up to 3 levels |
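A quick way to catch a bad input before launching a run is to check it against the ranges in the table above. The helper below is illustrative (validate_input is not part of the actor); the field names, defaults, and limits come straight from the table:

```python
def validate_input(run_input: dict) -> dict:
    """Apply the documented defaults and range-check an input object."""
    out = {
        "subreddits": run_input.get("subreddits", []),
        "searchQuery": run_input.get("searchQuery"),
        "maxPosts": run_input.get("maxPosts", 50),
        "sort": run_input.get("sort", "hot"),
        "timeFilter": run_input.get("timeFilter", "day"),
        "includeComments": run_input.get("includeComments", False),
        "maxCommentsPerPost": run_input.get("maxCommentsPerPost", 10),
    }
    if not out["subreddits"] and not out["searchQuery"]:
        raise ValueError("Provide subreddits or a searchQuery")
    if not 1 <= out["maxPosts"] <= 500:
        raise ValueError("maxPosts must be 1-500")
    if out["sort"] not in {"hot", "new", "top", "rising"}:
        raise ValueError("sort must be hot, new, top, or rising")
    if out["timeFilter"] not in {"hour", "day", "week", "month", "year", "all"}:
        raise ValueError("invalid timeFilter")
    if not 1 <= out["maxCommentsPerPost"] <= 100:
        raise ValueError("maxCommentsPerPost must be 1-100")
    return out
```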
Output example
```json
{
  "id": "t3_1s6gkmj",
  "subreddit": "pics",
  "title": "About 100,000 attended the No Kings protest in St. Paul, Minnesota",
  "author": "katotooo",
  "score": 24915,
  "upvoteRatio": 0.97,
  "numComments": 235,
  "url": "https://www.reddit.com/r/pics/comments/1s6gkmj/about_100000_attended_the_no_kings_protest/",
  "selftext": null,
  "thumbnail": "https://preview.redd.it/oy94fh2hnvrg1.jpg?width=140&height=93",
  "imageUrls": [
    "https://preview.redd.it/oy94fh2hnvrg1.jpg?width=3024&format=pjpg",
    "https://preview.redd.it/9pmbe5aqnvrg1.jpg?width=4032&format=pjpg"
  ],
  "media": null,
  "created": "2026-03-29T00:21:20.000Z",
  "edited": false,
  "isVideo": false,
  "isSelf": false,
  "isGallery": true,
  "domain": "old.reddit.com",
  "linkUrl": "https://www.reddit.com/gallery/1s6gkmj",
  "flair": "Politics",
  "awards": 0,
  "isNSFW": false,
  "isSpoiler": false,
  "isPinned": false,
  "numCrossposts": 2,
  "subredditSubscribers": 33336092,
  "postType": "gallery",
  "scrapedAt": "2026-03-29T08:05:31.904Z",
  "comments": [
    {
      "id": "od1rtqc",
      "author": "YJSubs",
      "body": "When I see Bernie, I thought, did you just identify him as bald eagle?",
      "score": 1,
      "created": "2026-03-29T00:21:22.000Z",
      "depth": 0,
      "isSubmitter": false,
      "parentId": "t3_1s6gkmj",
      "controversiality": 0,
      "replies": 1
    },
    {
      "id": "od1s2fp",
      "author": "rclonecopymove",
      "body": "Same, he's bald but not very aquiline.",
      "score": 1,
      "created": "2026-03-29T00:22:45.000Z",
      "depth": 1,
      "isSubmitter": false,
      "parentId": "t1_od1rtqc",
      "controversiality": 0,
      "replies": 0
    }
  ]
}
```
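A common way to consume items shaped like the output above is to flatten a few post fields to CSV. A minimal sketch using only the Python standard library; the chosen columns are an arbitrary subset of the output fields, and posts_to_csv is an illustrative helper, not part of the actor:

```python
import csv
import io


def posts_to_csv(posts: list[dict],
                 fields=("id", "subreddit", "title", "score", "numComments", "created")) -> str:
    """Write selected fields of each post object as CSV, ignoring the rest."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(fields), extrasaction="ignore")
    writer.writeheader()
    for post in posts:
        writer.writerow({k: post.get(k) for k in fields})
    return buf.getvalue()
```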
Performance & cost
- ~$0.016 per 1,000 posts (without comments)
- ~$0.05 per 1,000 posts with 10 comments each
- 3 subreddits × 50 posts in ~40 seconds
- Session persistence reuses working proxies across subreddits
- 12 automatic retries with proxy rotation on Reddit blocks
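The per-1,000-post rates above can serve as a rough budget check. A sketch under the assumption that cost scales linearly with post count; actual billing varies with retries and proxy usage, so treat the result as an order-of-magnitude guide:

```python
def estimate_cost(num_posts: int, include_comments: bool = False) -> float:
    """Approximate run cost in USD from the documented rates:
    ~$0.016 per 1,000 posts without comments,
    ~$0.05 per 1,000 posts with 10 comments each."""
    rate_per_1000 = 0.05 if include_comments else 0.016
    return num_posts / 1000 * rate_per_1000
```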
FAQ
Does this need a Reddit API key? No. It uses Reddit's public JSON endpoints with a stealth browser to avoid blocks.
Why do some requests get 403 errors? Reddit blocks datacenter IPs. The scraper rotates proxy IPs and retries automatically. A few 403s in the log are normal; it finds a working IP within 3-12 attempts.
What does maxPosts mean? It's per subreddit, not global. 3 subreddits × 50 maxPosts = up to 150 posts.
How deep do comments go?
Up to 3 levels: top-level comments (depth 0), replies (depth 1), and replies-to-replies (depth 2). Each comment has a parentId so you can rebuild the thread.
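Because every comment carries a parentId (and, as in the output example, comment ids are returned without the t1_ prefix while parentIds include it), the flat comments array can be rebuilt into a nested tree. A sketch; build_thread is an illustrative helper, not part of the actor's output:

```python
def build_thread(post_id: str, comments: list[dict]) -> list[dict]:
    """Rebuild the nested thread from a flat comment list via parentId.

    Top-level comments point at the post's t3_ ID; replies point at their
    parent comment's t1_ ID. Each node gains a "children" list.
    """
    nodes = {c["id"]: {**c, "children": []} for c in comments}
    roots = []
    for c in comments:
        parent = c["parentId"]
        if parent == post_id:
            roots.append(nodes[c["id"]])
        else:
            # Strip the t1_ prefix to look the parent up by bare comment ID.
            parent_node = nodes.get(parent.removeprefix("t1_"))
            if parent_node is not None:
                parent_node["children"].append(nodes[c["id"]])
            else:
                # Parent not fetched (e.g. beyond maxCommentsPerPost): keep as root.
                roots.append(nodes[c["id"]])
    return roots
```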