Reddit Profile Crawler Pro

Scrape Reddit user profiles with split karma (post/comment/awarder/awardee), account age, admin/employee/moderator badges, trophies, moderated subreddits, and recent comments. Per-post: awards, gilded count, upvote ratio, media, crosspost parent. Filter by score, age, NSFW. No login.

Pricing

from $1.00 / 1,000 results

Rating

5.0

(12)

Developer

Crawler Bros

Maintained by Community

Actor stats

  • Bookmarked: 13
  • Total users: 2
  • Monthly active users: 1
  • Last modified: 4 days ago

Pro-tier version of the Reddit profile crawler. Same browser-automation core, with these additions:

  • 🆔 Accepts full Reddit URLs (https://www.reddit.com/user/spez) in addition to plain usernames or u/spez prefixed forms.
  • 🏆 Trophy case extraction (toggle).
  • 🛡️ Moderated subreddits list (toggle).
  • 💬 Recent comments scraping (toggle): visits the user's /comments page in addition to /submitted.
  • 📊 Split karma fields: post_karma, comment_karma, awarder_karma, awardee_karma, total_karma, plus a verified flag.
  • 🎁 Per-post enrichment: awards[], total_awards_received, gilded_count, upvote_ratio, is_video, media_metadata{type,thumbnail_*,domain}, crosspost_parent.
  • 🎯 Client-side filters: minPostScore, maxPostAgeDays, excludeNsfwPosts.
  • ✅ Sentinel error records, so scheduled daily tests never see an empty dataset.
  • 🧪 190+ pytest cases covering every helper, parser, and filter combination.

No authentication, no API keys, no proxy required.
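
All three input forms collapse to the same username. The actor's own helper isn't published; this is a minimal sketch of the normalization, with an illustrative function name and regexes:

```python
import re

def normalize_username(raw: str) -> str:
    """Reduce any accepted input form to a bare username."""
    raw = raw.strip()
    # Full profile URL: https://www.reddit.com/user/spez (or reddit.com/u/spez)
    match = re.search(r"reddit\.com/(?:user|u)/([^/?#]+)", raw)
    if match:
        return match.group(1)
    # Prefixed form: u/spez or /u/spez
    match = re.match(r"/?u/([^/?#]+)$", raw)
    if match:
        return match.group(1)
    return raw  # already a plain username
```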

Input

| Field | Type | Default | Description |
| --- | --- | --- | --- |
| usernames | array of strings (required) | ["spez"] | Reddit usernames. Accepts plain (spez), u/spez, or full URLs (https://www.reddit.com/user/spez). |
| maxPosts | integer | 100 (1–1000) | Max posts per user. |
| section | enum | submitted | submitted / overview / gilded. |
| sort | enum | new | hot / new / top / controversial. |
| includeComments | boolean | false | Also scrape the user's /comments page. Adds recent_comments[]. |
| includeTrophies | boolean | false | Pull trophy case. Adds trophies[]. |
| includeModeratedSubreddits | boolean | false | Pull moderated subs. Adds moderated_subreddits[]. |
| minPostScore | integer (optional) | – | Drop posts below this score. |
| maxPostAgeDays | integer (optional) | – | Drop posts older than N days. |
| excludeNsfwPosts | boolean | false | Drop NSFW posts. |
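
The three filters are applied client-side after scraping. A rough sketch of that pass, assuming the post fields shown in the output schema (the actor's real code may differ):

```python
from datetime import datetime, timedelta, timezone

def apply_filters(posts, min_post_score=None, max_post_age_days=None,
                  exclude_nsfw_posts=False, now=None):
    """Drop posts per the three client-side filters; return the kept posts."""
    now = now or datetime.now(timezone.utc)
    kept = []
    for post in posts:
        # minPostScore: drop posts below the score threshold
        if min_post_score is not None and post["score"] < min_post_score:
            continue
        # maxPostAgeDays: drop posts older than N days
        if max_post_age_days is not None:
            created = datetime.fromisoformat(post["created_at"])
            if now - created > timedelta(days=max_post_age_days):
                continue
        # excludeNsfwPosts: drop NSFW posts
        if exclude_nsfw_posts and post.get("is_nsfw"):
            continue
        kept.append(post)
    return kept
```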

Example input

{
  "usernames": ["spez", "u/example", "https://www.reddit.com/user/another"],
  "maxPosts": 50,
  "section": "submitted",
  "sort": "new",
  "includeTrophies": true,
  "includeModeratedSubreddits": true,
  "minPostScore": 10,
  "excludeNsfwPosts": true
}

Output

One record per user. Empty fields are omitted (no nulls).

{
  "username": "spez",
  "post_karma": 100000,
  "comment_karma": 200000,
  "awarder_karma": 50,
  "awardee_karma": 150,
  "total_karma": 350000,
  "verified": true,
  "account_created": "2005-06-06T00:00:00+00:00",
  "posts": [
    {
      "post_id": "abc123",
      "title": "...",
      "subreddit": "r/announcements",
      "score": 5000,
      "num_comments": 240,
      "url": "https://www.reddit.com/r/...",
      "old_reddit_url": "https://old.reddit.com/r/...",
      "content": "...",
      "thumbnail_image": "https://...",
      "created_at": "2024-06-15T12:00:00+00:00",
      "is_stickied": false,
      "is_nsfw": false,
      "is_video": false,
      "awards": [{"name": "Gold", "count": 1}],
      "total_awards_received": 1,
      "gilded_count": 1,
      "upvote_ratio": 0.95,
      "media_metadata": {"type": "image", "thumbnail_url": "https://..."}
    }
  ],
  "post_count": 50,
  "trophies": [{"name": "10-Year Club"}],
  "trophy_count": 1,
  "moderated_subreddits": ["r/announcements"],
  "moderated_subreddit_count": 1,
  "recent_comments": [
    {"comment_id": "c1", "body": "...", "score": 25, "subreddit": "r/python", "permalink": "https://...", "created_at": "..."}
  ],
  "recent_comment_count": 1,
  "scraped_at": "2024-12-16T14:23:11+00:00"
}
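
Because each record is plain JSON, downstream analysis is a short dictionary walk. For example, a hypothetical helper that ranks a user's most frequent subreddits from posts[]:

```python
from collections import Counter

def top_subreddits(record, n=3):
    """Return the n most frequent subreddits in a user's scraped posts."""
    counts = Counter(post["subreddit"] for post in record.get("posts", []))
    return counts.most_common(n)
```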

Output fields

  • username – the Reddit username.
  • post_karma / comment_karma / awarder_karma / awardee_karma / total_karma – split karma values.
  • verified – Reddit's verified-user flag (when present).
  • account_created – ISO-8601 cake-day timestamp.
  • posts[] – array of post records (see post schema below), filtered by minPostScore, maxPostAgeDays, and excludeNsfwPosts.
  • post_count – count of posts emitted (after filtering).
  • filtered_post_count – only present when filters dropped one or more posts (diagnostic).
  • trophies[] / trophy_count – only when includeTrophies: true.
  • moderated_subreddits[] / moderated_subreddit_count – only when includeModeratedSubreddits: true.
  • recent_comments[] / recent_comment_count – only when includeComments: true.
  • scraped_at – ISO-8601 UTC timestamp.

Post schema (per row in posts[])

post_id, post_name, title, subreddit, score, num_comments, url, old_reddit_url, external_url, content, thumbnail_image, link_flair, created_utc, created_at, is_stickied, is_nsfw, is_promoted, is_spoiler, is_video, awards[], total_awards_received, gilded_count, upvote_ratio, media_metadata, crosspost_parent.

Use cases

  • User-activity analysis – pull posting cadence, top subreddits, and karma split.
  • Account-age verification – account_created + verified + total_karma build a quick "is this a bot?" signal.
  • Influence mapping – sort users by total_awards_received across their post history.
  • Moderator audit – pull every subreddit a user moderates with one input.
  • Content datasets for ML – bulk-fetch comments and posts with structured metadata.
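
The account-age-verification signal above can be sketched as a small heuristic on the three output fields. The thresholds and function name here are illustrative assumptions, not part of the actor:

```python
from datetime import datetime, timezone

def bot_signal(record, min_age_days=30, min_karma=100):
    """Return a list of red flags derived from a scraped profile record."""
    created = datetime.fromisoformat(record["account_created"])
    age_days = (datetime.now(timezone.utc) - created).days
    flags = []
    if age_days < min_age_days:
        flags.append("young_account")          # brand-new account
    if record.get("total_karma", 0) < min_karma:
        flags.append("low_karma")              # little site history
    if not record.get("verified", False):
        flags.append("unverified")             # no verified flag present
    return flags
```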

FAQ

Does it need cookies, login, or proxy? No. The actor connects directly using a Chrome User-Agent and the public old.reddit.com endpoint.

Why use old.reddit.com? Stable HTML structure that works across years; the new Reddit UI is React-rendered and hostile to scraping.

What happens if a user doesn't exist? The actor emits a sentinel record {type: "reddit_profile_crawler_pro_error", reason: "user_not_found", username: "..."} and continues with the rest of the input list.
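
Downstream code can separate these sentinel records from real profiles by checking the type field shown above (the helper name is hypothetical):

```python
def split_results(items):
    """Split dataset items into (profile records, sentinel error records)."""
    profiles, errors = [], []
    for item in items:
        if item.get("type") == "reddit_profile_crawler_pro_error":
            errors.append(item)
        else:
            profiles.append(item)
    return profiles, errors
```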

What if all posts get filtered out? The user record still emits with post_count: 0 and filtered_post_count: N so you can tell the user existed but the filter eliminated their posts.

How big can the result get? With includeComments + includeTrophies + includeModeratedSubreddits all on and maxPosts: 1000, expect roughly 5–10 KB per user. Runtime and output size scale linearly with the number of users and maxPosts.