# Reddit Subreddit Scraper (`logiover/reddit-subreddit-scraper`) Actor

Scrape posts from any subreddit - title, author, score, comments, flair, text and timestamps. Run it on a schedule for social listening, brand monitoring, lead generation or market research.

- **URL**: https://apify.com/logiover/reddit-subreddit-scraper.md
- **Developed by:** [Logiover](https://apify.com/logiover) (community)
- **Categories:** Social media, Marketing
- **Stats:** 11 total users, 2 monthly users, 100.0% runs succeeded
- **User rating**: No ratings yet

## Pricing

from $3.50 / 1,000 results

This Actor is paid per event. You are not charged for Apify platform usage; instead, you pay a fixed price for specific events.
Because this Actor supports Apify Store discounts, the higher your subscription plan, the lower the price.

Learn more: https://docs.apify.com/platform/actors/running/actors-in-store#pay-per-event
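At the listed rate, estimating a run's cost is simple arithmetic. The helper below is an illustrative sketch (not an official calculator) and ignores any Apify Store subscription discount:

```python
# Rough cost estimate at the listed $3.50 per 1,000 results
# (before any Apify Store subscription discount is applied).
def estimated_cost_usd(num_results: int, price_per_1000: float = 3.50) -> float:
    return round(num_results / 1000 * price_per_1000, 4)

print(estimated_cost_usd(200))  # 0.7
```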

## What's an Apify Actor?

Actors are software tools running on the Apify platform, built for all kinds of web data extraction and automation use cases.
In Batch mode, an Actor accepts a well-defined JSON input, performs an action which can take anything from a few seconds to a few hours,
and optionally produces a well-defined JSON output, datasets with results, or files in key-value store.
In Standby mode, an Actor provides a web server which can be used as a website, API, or an MCP server.
Note that "Actor" is always written with a capital "A".

## How to integrate an Actor?

This section shows how to integrate Actors into your own project, adapted to your stack, in a way that is safe, well-documented, and production-ready.
The recommended approaches are as follows.

In JavaScript/TypeScript projects, use official [JavaScript/TypeScript client](https://docs.apify.com/api/client/js.md):

```bash
npm install apify-client
```

In Python projects, use official [Python client library](https://docs.apify.com/api/client/python.md):

```bash
pip install apify-client
```

In shell scripts, use [Apify CLI](https://docs.apify.com/cli/docs.md):

```bash
# macOS / Linux
curl -fsSL https://apify.com/install-cli.sh | bash
# Windows (PowerShell)
irm https://apify.com/install-cli.ps1 | iex
```

In AI frameworks, you might use the [Apify MCP server](https://docs.apify.com/platform/integrations/mcp.md).

If your project is in a different language, use the [REST API](https://docs.apify.com/api/v2.md).

For usage examples, see the [API](#api) section below.

For more details, see Apify documentation as [Markdown index](https://docs.apify.com/llms.txt) and [Markdown full-text](https://docs.apify.com/llms-full.txt).


# README

## 👽 Reddit Subreddit Scraper — Scrape Reddit Posts, Scores & Comments

Scrape **posts from any subreddit on [Reddit](https://www.reddit.com)** — title, author, score, comment count, flair and full self text — and export them to JSON, CSV or Excel. This **Reddit scraper** comes with built-in residential proxy support so it works reliably against Reddit's datacenter-IP blocks, with **no login and no Reddit API key required**.

Reddit never stops, so this Actor is built for **scheduled, recurring use** — keep an always-fresh feed from the subreddits you care about for social listening, brand monitoring and market research.

### ✨ What this Actor does / Key features

- 🌐 Scrape **any subreddit** — pass one or many subreddit names in a single run, with or without the `r/` prefix
- 🛡️ **Reliable by default** — uses Apify residential proxies to avoid Reddit's datacenter-IP blocks
- 📊 Rich data per post — post ID, subreddit, title, author, URL, permalink, self text, score, upvote ratio, comment count, flair, video flag, NSFW flag and timestamps
- 🔀 **Sort options** — `new`, `hot`, `top`, `rising`
- ⏱️ Time-window control for `top` sort — hour, day, week, month, year or all
- ♾️ Leave the per-subreddit limit empty to pull as many posts as available
- 🔁 Built for scheduling — run daily for a continuously fresh subreddit feed
- 📤 Structured output ready for JSON, CSV and Excel export

### 🔍 Input

| Field | Type | Description |
|-------|------|-------------|
| `subreddits` | array | Subreddit names to scrape (e.g. `startups`, `forhire`, `cryptocurrency`). With or without the `r/` prefix. |
| `sort` | string (select) | Post sort order: `new`, `hot`, `top`, `rising`. |
| `timeFilter` | string (select) | Time window applied to `top` sort: `hour`, `day`, `week`, `month`, `year`, `all`. |
| `maxPostsPerSub` | integer | Maximum posts to collect per subreddit. Leave empty / `0` for as many as available. |
| `proxyConfiguration` | object | Proxy settings. Reddit blocks datacenter IPs, so a residential proxy is used by default. |

### 🚀 Example input

```json
{
  "subreddits": ["startups", "Entrepreneur"],
  "sort": "new",
  "timeFilter": "day",
  "maxPostsPerSub": 200,
  "proxyConfiguration": {
    "useApifyProxy": true,
    "apifyProxyGroups": ["RESIDENTIAL"]
  }
}
````

### 📦 Output

Each post is saved as one structured record.

| Field | Description |
|-------|-------------|
| `postId` | Unique Reddit post ID |
| `subreddit` | Subreddit the post belongs to |
| `title` | Post title |
| `author` | Username of the post author |
| `url` | URL the post links to (external link or Reddit URL) |
| `permalink` | Permanent Reddit link to the post |
| `selftext` | Full body text of self/text posts |
| `score` | Post score (upvotes minus downvotes) |
| `upvoteRatio` | Ratio of upvotes to total votes |
| `numComments` | Number of comments on the post |
| `flair` | Post flair label |
| `isVideo` | Whether the post is a video (boolean) |
| `over18` | Whether the post is marked NSFW (boolean) |
| `createdAt` | When the post was created |
| `scrapedAt` | When the Actor scraped the post |

#### Sample record

```json
{
  "postId": "1abc234",
  "subreddit": "startups",
  "title": "How we got our first 100 customers",
  "author": "founder_jane",
  "url": "https://www.reddit.com/r/startups/comments/1abc234/...",
  "permalink": "/r/startups/comments/1abc234/...",
  "selftext": "We spent three months on cold outreach...",
  "score": 842,
  "upvoteRatio": 0.97,
  "numComments": 134,
  "flair": "Share Your Startup",
  "isVideo": false,
  "over18": false,
  "createdAt": "2026-05-13T18:22:00.000Z",
  "scrapedAt": "2026-05-14T08:00:00.000Z"
}
```
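As a sketch of downstream processing, assuming records shaped like the sample above, you might filter and rank posts after export. `top_discussions` and the threshold are illustrative, not part of the Actor:

```python
# Illustrative post-processing of scraped records (shaped like the sample
# above): keep high-scoring non-video posts and rank them by comment count.
def top_discussions(records, min_score=100):
    keep = [r for r in records if r["score"] >= min_score and not r["isVideo"]]
    return sorted(keep, key=lambda r: r["numComments"], reverse=True)

posts = [
    {"postId": "1", "score": 842, "numComments": 134, "isVideo": False},
    {"postId": "2", "score": 40,  "numComments": 300, "isVideo": False},
    {"postId": "3", "score": 500, "numComments": 12,  "isVideo": True},
]
print([p["postId"] for p in top_discussions(posts)])  # ['1']
```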

### 💡 Use cases

- **Brand & topic monitoring** — track mentions and discussions of your product or competitors across subreddits.
- **Market & audience research** — understand what target communities are talking about and which posts resonate.
- **Lead generation** — find people actively asking for solutions you provide (e.g. in `r/forhire` or niche subreddits).
- **Sentiment analysis & datasets** — build a structured Reddit corpus on a schedule for NLP and trend analysis.
- **Content & trend discovery** — surface the top and rising posts in any community for ideation.

### ❓ Frequently Asked Questions

**Do I need a Reddit account or API key?**
No. The Actor scrapes publicly visible subreddit posts — no login and no Reddit API key required.

**Is scraping Reddit legal?**
The Actor collects publicly available post data. You are responsible for using it in compliance with Reddit's terms of service, content policy and applicable laws.

**Why does it use a residential proxy?**
Reddit blocks datacenter IPs, so the Actor uses Apify residential proxies by default for reliable access. Leave the proxy configuration as-is unless you have a specific reason to change it.

**How many posts can I get?**
Set `maxPostsPerSub` to control volume per subreddit, or leave it empty to pull as many posts as Reddit makes available for the chosen sort.

**Can I scrape multiple subreddits at once?**
Yes. Pass several subreddit names in the `subreddits` array and the Actor scrapes them all in one run.

**What does the time filter do?**
`timeFilter` applies to the `top` sort, letting you pull the top posts from the last hour, day, week, month, year or all time.

**What output formats are supported?**
Results are stored in an Apify dataset and can be exported as JSON, CSV, Excel, XML or HTML, or pulled via the Apify API.
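For programmatic exports, the Apify API serves dataset items in a chosen format via a `format` query parameter on the dataset-items endpoint. The helper below is a sketch of building such an export URL (note that Excel is `xlsx` in the API); the dataset ID is a placeholder:

```python
# Build an export URL for an Apify dataset, per the Apify API's
# GET /v2/datasets/{datasetId}/items endpoint and its `format` parameter.
def dataset_export_url(dataset_id: str, fmt: str = "json") -> str:
    allowed = {"json", "csv", "xlsx", "xml", "html"}
    if fmt not in allowed:
        raise ValueError(f"unsupported format: {fmt}")
    return f"https://api.apify.com/v2/datasets/{dataset_id}/items?format={fmt}"

print(dataset_export_url("abc123", "csv"))
# https://api.apify.com/v2/datasets/abc123/items?format=csv
```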

### ⏰ Scheduling & integration

Schedule this Actor on Apify to run hourly or daily for a continuously fresh subreddit feed. Export results to JSON, CSV or Excel, call it via the Apify API, or connect it to Slack, Google Sheets and webhooks through Apify integrations for automated social-listening alerts.
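For alerting on a schedule, you typically want only the posts that are new since the last run. The dedup step of such a pipeline can be sketched in Python; `new_posts` and the local `seen_posts.json` state file are illustrative helpers, not part of the Actor:

```python
import json
from pathlib import Path

# Hypothetical dedup step for a scheduled pipeline: given the latest batch of
# scraped records, return only posts not seen in previous runs, tracking seen
# postIds in a local JSON state file.
def new_posts(records, state_file="seen_posts.json"):
    path = Path(state_file)
    seen = set(json.loads(path.read_text())) if path.exists() else set()
    fresh = [r for r in records if r["postId"] not in seen]
    seen.update(r["postId"] for r in fresh)
    path.write_text(json.dumps(sorted(seen)))
    return fresh
```

On the first run every post is "fresh"; on later runs only unseen `postId` values come through, which is what you would forward to Slack or a webhook.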

# Actor input Schema

## `subreddits` (type: `array`):

Subreddit names to scrape, e.g. 'startups', 'forhire', 'cryptocurrency'. With or without the 'r/' prefix.

## `sort` (type: `string`):

Post sort order.

## `timeFilter` (type: `string`):

Time window (applies to 'top' sort).

## `maxPostsPerSub` (type: `integer`):

Maximum posts to collect per subreddit.

## `proxyConfiguration` (type: `object`):

Reddit blocks datacenter IPs, so a residential proxy is used by default. Leave as-is unless you have a reason to change it.

## Actor input object example

```json
{
  "subreddits": [
    "startups"
  ],
  "sort": "new",
  "timeFilter": "day",
  "proxyConfiguration": {
    "useApifyProxy": true,
    "apifyProxyGroups": [
      "RESIDENTIAL"
    ]
  }
}
```

# Actor output Schema

## `subreddit` (type: `string`):

Subreddit the post belongs to.

## `title` (type: `string`):

Post title.

## `author` (type: `string`):

Username of the post author.

## `score` (type: `string`):

Post score (upvotes minus downvotes).

## `numComments` (type: `string`):

Number of comments on the post.

## `flair` (type: `string`):

Post flair label.

## `permalink` (type: `string`):

Permanent Reddit link to the post.

## `createdAt` (type: `string`):

When the post was created.

# API

You can run this Actor programmatically using our API. Below are code examples in JavaScript, Python, and CLI, as well as the OpenAPI specification and MCP server setup.

## JavaScript example

```javascript
import { ApifyClient } from 'apify-client';

// Initialize the ApifyClient with your Apify API token
// Replace the '<YOUR_API_TOKEN>' with your token
const client = new ApifyClient({
    token: '<YOUR_API_TOKEN>',
});

// Prepare Actor input
const input = {
    "subreddits": [
        "startups"
    ]
};

// Run the Actor and wait for it to finish
const run = await client.actor("logiover/reddit-subreddit-scraper").call(input);

// Fetch and print Actor results from the run's dataset (if any)
console.log('Results from dataset');
console.log(`💾 Check your data here: https://console.apify.com/storage/datasets/${run.defaultDatasetId}`);
const { items } = await client.dataset(run.defaultDatasetId).listItems();
items.forEach((item) => {
    console.dir(item);
});

// 📚 Want to learn more 📖? Go to → https://docs.apify.com/api/client/js/docs

```

## Python example

```python
from apify_client import ApifyClient

# Initialize the ApifyClient with your Apify API token
# Replace '<YOUR_API_TOKEN>' with your token.
client = ApifyClient("<YOUR_API_TOKEN>")

# Prepare the Actor input
run_input = { "subreddits": ["startups"] }

# Run the Actor and wait for it to finish
run = client.actor("logiover/reddit-subreddit-scraper").call(run_input=run_input)

# Fetch and print Actor results from the run's dataset (if there are any)
print("💾 Check your data here: https://console.apify.com/storage/datasets/" + run["defaultDatasetId"])
for item in client.dataset(run["defaultDatasetId"]).iterate_items():
    print(item)

# 📚 Want to learn more 📖? Go to → https://docs.apify.com/api/client/python/docs/quick-start

```

## CLI example

```bash
echo '{
  "subreddits": [
    "startups"
  ]
}' |
apify call logiover/reddit-subreddit-scraper --silent --output-dataset

```
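If you prefer the raw REST API over the clients above, the call matches the `run-sync-get-dataset-items` endpoint in the OpenAPI specification below. This standard-library sketch only constructs the request; `build_run_request` is an illustrative helper, and sending it requires a real API token in place of `<YOUR_API_TOKEN>`:

```python
import json
from urllib.request import Request

API_BASE = "https://api.apify.com/v2"

# Build (but do not send) a run-sync-get-dataset-items request for this Actor,
# matching the POST endpoint in the OpenAPI specification.
def build_run_request(token: str, run_input: dict) -> Request:
    url = (f"{API_BASE}/acts/logiover~reddit-subreddit-scraper"
           f"/run-sync-get-dataset-items?token={token}")
    return Request(
        url,
        data=json.dumps(run_input).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_run_request("<YOUR_API_TOKEN>", {"subreddits": ["startups"]})
print(req.full_url)
```

To actually run it, pass the request to `urllib.request.urlopen` and parse the JSON response body as the dataset items.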

## MCP server setup

```json
{
    "mcpServers": {
        "apify": {
            "command": "npx",
            "args": [
                "mcp-remote",
                "https://mcp.apify.com/?tools=logiover/reddit-subreddit-scraper",
                "--header",
                "Authorization: Bearer <YOUR_API_TOKEN>"
            ]
        }
    }
}

```

## OpenAPI specification

```json
{
    "openapi": "3.0.1",
    "info": {
        "title": "Reddit Subreddit Scraper",
        "description": "Scrape posts from any subreddit - title, author, score, comments, flair, text and timestamps. Run it on a schedule for social listening, brand monitoring, lead generation or market research.",
        "version": "1.0",
        "x-build-id": "DVEYvzeTrWklCgg0B"
    },
    "servers": [
        {
            "url": "https://api.apify.com/v2"
        }
    ],
    "paths": {
        "/acts/logiover~reddit-subreddit-scraper/run-sync-get-dataset-items": {
            "post": {
                "operationId": "run-sync-get-dataset-items-logiover-reddit-subreddit-scraper",
                "x-openai-isConsequential": false,
                "summary": "Executes an Actor, waits for its completion, and returns Actor's dataset items in response.",
                "tags": [
                    "Run Actor"
                ],
                "requestBody": {
                    "required": true,
                    "content": {
                        "application/json": {
                            "schema": {
                                "$ref": "#/components/schemas/inputSchema"
                            }
                        }
                    }
                },
                "parameters": [
                    {
                        "name": "token",
                        "in": "query",
                        "required": true,
                        "schema": {
                            "type": "string"
                        },
                        "description": "Enter your Apify token here"
                    }
                ],
                "responses": {
                    "200": {
                        "description": "OK"
                    }
                }
            }
        },
        "/acts/logiover~reddit-subreddit-scraper/runs": {
            "post": {
                "operationId": "runs-sync-logiover-reddit-subreddit-scraper",
                "x-openai-isConsequential": false,
                "summary": "Executes an Actor and returns information about the initiated run in response.",
                "tags": [
                    "Run Actor"
                ],
                "requestBody": {
                    "required": true,
                    "content": {
                        "application/json": {
                            "schema": {
                                "$ref": "#/components/schemas/inputSchema"
                            }
                        }
                    }
                },
                "parameters": [
                    {
                        "name": "token",
                        "in": "query",
                        "required": true,
                        "schema": {
                            "type": "string"
                        },
                        "description": "Enter your Apify token here"
                    }
                ],
                "responses": {
                    "200": {
                        "description": "OK",
                        "content": {
                            "application/json": {
                                "schema": {
                                    "$ref": "#/components/schemas/runsResponseSchema"
                                }
                            }
                        }
                    }
                }
            }
        },
        "/acts/logiover~reddit-subreddit-scraper/run-sync": {
            "post": {
                "operationId": "run-sync-logiover-reddit-subreddit-scraper",
                "x-openai-isConsequential": false,
                "summary": "Executes an Actor, waits for completion, and returns the OUTPUT from Key-value store in response.",
                "tags": [
                    "Run Actor"
                ],
                "requestBody": {
                    "required": true,
                    "content": {
                        "application/json": {
                            "schema": {
                                "$ref": "#/components/schemas/inputSchema"
                            }
                        }
                    }
                },
                "parameters": [
                    {
                        "name": "token",
                        "in": "query",
                        "required": true,
                        "schema": {
                            "type": "string"
                        },
                        "description": "Enter your Apify token here"
                    }
                ],
                "responses": {
                    "200": {
                        "description": "OK"
                    }
                }
            }
        }
    },
    "components": {
        "schemas": {
            "inputSchema": {
                "type": "object",
                "properties": {
                    "subreddits": {
                        "title": "Subreddits",
                        "type": "array",
                        "description": "Subreddit names to scrape, e.g. 'startups', 'forhire', 'cryptocurrency'. With or without the 'r/' prefix.",
                        "items": {
                            "type": "string"
                        }
                    },
                    "sort": {
                        "title": "Sort",
                        "enum": [
                            "new",
                            "hot",
                            "top",
                            "rising"
                        ],
                        "type": "string",
                        "description": "Post sort order.",
                        "default": "new"
                    },
                    "timeFilter": {
                        "title": "Time Filter",
                        "enum": [
                            "hour",
                            "day",
                            "week",
                            "month",
                            "year",
                            "all"
                        ],
                        "type": "string",
                        "description": "Time window (applies to 'top' sort).",
                        "default": "day"
                    },
                    "maxPostsPerSub": {
                        "title": "Max Posts per Subreddit",
                        "minimum": 0,
                        "type": "integer",
                        "description": "Maximum posts to collect per subreddit."
                    },
                    "proxyConfiguration": {
                        "title": "Proxy Configuration",
                        "type": "object",
                        "description": "Reddit blocks datacenter IPs, so a residential proxy is used by default. Leave as-is unless you have a reason to change it.",
                        "default": {
                            "useApifyProxy": true,
                            "apifyProxyGroups": [
                                "RESIDENTIAL"
                            ]
                        }
                    }
                }
            },
            "runsResponseSchema": {
                "type": "object",
                "properties": {
                    "data": {
                        "type": "object",
                        "properties": {
                            "id": {
                                "type": "string"
                            },
                            "actId": {
                                "type": "string"
                            },
                            "userId": {
                                "type": "string"
                            },
                            "startedAt": {
                                "type": "string",
                                "format": "date-time",
                                "example": "2025-01-08T00:00:00.000Z"
                            },
                            "finishedAt": {
                                "type": "string",
                                "format": "date-time",
                                "example": "2025-01-08T00:00:00.000Z"
                            },
                            "status": {
                                "type": "string",
                                "example": "READY"
                            },
                            "meta": {
                                "type": "object",
                                "properties": {
                                    "origin": {
                                        "type": "string",
                                        "example": "API"
                                    },
                                    "userAgent": {
                                        "type": "string"
                                    }
                                }
                            },
                            "stats": {
                                "type": "object",
                                "properties": {
                                    "inputBodyLen": {
                                        "type": "integer",
                                        "example": 2000
                                    },
                                    "rebootCount": {
                                        "type": "integer",
                                        "example": 0
                                    },
                                    "restartCount": {
                                        "type": "integer",
                                        "example": 0
                                    },
                                    "resurrectCount": {
                                        "type": "integer",
                                        "example": 0
                                    },
                                    "computeUnits": {
                                        "type": "integer",
                                        "example": 0
                                    }
                                }
                            },
                            "options": {
                                "type": "object",
                                "properties": {
                                    "build": {
                                        "type": "string",
                                        "example": "latest"
                                    },
                                    "timeoutSecs": {
                                        "type": "integer",
                                        "example": 300
                                    },
                                    "memoryMbytes": {
                                        "type": "integer",
                                        "example": 1024
                                    },
                                    "diskMbytes": {
                                        "type": "integer",
                                        "example": 2048
                                    }
                                }
                            },
                            "buildId": {
                                "type": "string"
                            },
                            "defaultKeyValueStoreId": {
                                "type": "string"
                            },
                            "defaultDatasetId": {
                                "type": "string"
                            },
                            "defaultRequestQueueId": {
                                "type": "string"
                            },
                            "buildNumber": {
                                "type": "string",
                                "example": "1.0.0"
                            },
                            "containerUrl": {
                                "type": "string"
                            },
                            "usage": {
                                "type": "object",
                                "properties": {
                                    "ACTOR_COMPUTE_UNITS": {
                                        "type": "integer",
                                        "example": 0
                                    },
                                    "DATASET_READS": {
                                        "type": "integer",
                                        "example": 0
                                    },
                                    "DATASET_WRITES": {
                                        "type": "integer",
                                        "example": 0
                                    },
                                    "KEY_VALUE_STORE_READS": {
                                        "type": "integer",
                                        "example": 0
                                    },
                                    "KEY_VALUE_STORE_WRITES": {
                                        "type": "integer",
                                        "example": 1
                                    },
                                    "KEY_VALUE_STORE_LISTS": {
                                        "type": "integer",
                                        "example": 0
                                    },
                                    "REQUEST_QUEUE_READS": {
                                        "type": "integer",
                                        "example": 0
                                    },
                                    "REQUEST_QUEUE_WRITES": {
                                        "type": "integer",
                                        "example": 0
                                    },
                                    "DATA_TRANSFER_INTERNAL_GBYTES": {
                                        "type": "integer",
                                        "example": 0
                                    },
                                    "DATA_TRANSFER_EXTERNAL_GBYTES": {
                                        "type": "integer",
                                        "example": 0
                                    },
                                    "PROXY_RESIDENTIAL_TRANSFER_GBYTES": {
                                        "type": "integer",
                                        "example": 0
                                    },
                                    "PROXY_SERPS": {
                                        "type": "integer",
                                        "example": 0
                                    }
                                }
                            },
                            "usageTotalUsd": {
                                "type": "number",
                                "example": 0.00005
                            },
                            "usageUsd": {
                                "type": "object",
                                "properties": {
                                    "ACTOR_COMPUTE_UNITS": {
                                        "type": "integer",
                                        "example": 0
                                    },
                                    "DATASET_READS": {
                                        "type": "integer",
                                        "example": 0
                                    },
                                    "DATASET_WRITES": {
                                        "type": "integer",
                                        "example": 0
                                    },
                                    "KEY_VALUE_STORE_READS": {
                                        "type": "integer",
                                        "example": 0
                                    },
                                    "KEY_VALUE_STORE_WRITES": {
                                        "type": "number",
                                        "example": 0.00005
                                    },
                                    "KEY_VALUE_STORE_LISTS": {
                                        "type": "integer",
                                        "example": 0
                                    },
                                    "REQUEST_QUEUE_READS": {
                                        "type": "integer",
                                        "example": 0
                                    },
                                    "REQUEST_QUEUE_WRITES": {
                                        "type": "integer",
                                        "example": 0
                                    },
                                    "DATA_TRANSFER_INTERNAL_GBYTES": {
                                        "type": "integer",
                                        "example": 0
                                    },
                                    "DATA_TRANSFER_EXTERNAL_GBYTES": {
                                        "type": "integer",
                                        "example": 0
                                    },
                                    "PROXY_RESIDENTIAL_TRANSFER_GBYTES": {
                                        "type": "integer",
                                        "example": 0
                                    },
                                    "PROXY_SERPS": {
                                        "type": "integer",
                                        "example": 0
                                    }
                                }
                            }
                        }
                    }
                }
            }
        }
    }
}
```
