Skool Post Comments Scraper


Developed by

Louis Deconinck

Maintained by Community

Pull every comment, reply, and user detail from any Skool post in minutes. Fast, reliable, and easy to use. Perfect for community managers, marketers, and researchers who want full visibility without wasting hours.

5.0 (3 reviews)

Pricing: Pay per event

Total users: 3 · Monthly users: 3 · Last modified: 2 days ago

🚀 Instantly Scrape Skool Post Comments. Fast, Easy, Complete

Unlock every comment and reply from any Skool post - along with full user profiles - in seconds. Whether you're building analytics, community insights, or automations, this scraper gets you the raw data you need.

🔥 Why Use This Scraper?

  • Simple Setup: Plug in the post URL and get scraping - you're live in minutes.
  • Fast as Hell: Scrape hundreds of comments in seconds.
  • Affordable at Scale: Designed to run cheap, even on big Skool communities.
  • Full Comment Threads: Get every comment + every reply (replies are free).
  • User Data Included: Profile info like name, bio, socials, and profile picture - scraped with each comment.
  • Zero Guesswork: Automatically handles Skool’s cursor-based pagination for large comment chains.
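The cursor handling mentioned in the last bullet follows the usual pattern: request a page, read the cursor token from the response, and repeat until the token is absent. A minimal sketch of that loop (the page shape here is illustrative, not Skool's actual API):

```python
def paginate(fetch_page):
    """Collect items across pages by following a cursor token until it runs out."""
    items, cursor = [], None
    while True:
        page = fetch_page(cursor)          # cursor=None requests the first page
        items.extend(page["items"])
        cursor = page.get("next_cursor")   # absent/None on the last page
        if not cursor:
            break
    return items

# Two fake pages stand in for the real endpoint:
pages = {None: {"items": [1, 2], "next_cursor": "abc"},
         "abc": {"items": [3]}}
print(paginate(lambda c: pages[c]))  # → [1, 2, 3]
```

The scraper runs this loop for you; the sketch only shows why long comment chains need no manual paging on your side.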

🎯 Perfect For

  • Community Managers: Analyze conversations, find top contributors, monitor discussions.
  • Marketers: Pull insights for content ideas, lead generation, or customer research.
  • Developers: Build Skool-powered apps, automations, or dashboards.
  • Growth Hackers: Discover hidden opportunities inside Skool groups.

📦 Output Data

Each result includes comprehensive metadata. Here's what you'll get:

  • post.id: Comment ID
  • post.metadata.content: Comment content
  • post.created_at: Comment creation date
  • post.user.id: User ID
  • post.user.name: User name
  • post.user.metadata.picture_profile: User profile picture
  • post.user.first_name: User first name
  • post.user.last_name: User last name
  • post.user.metadata.bio: User bio
  • post.user.metadata.last_offline: User last offline time
  • post.user.metadata.link_facebook: User Facebook link
  • post.user.metadata.link_instagram: User Instagram link
  • post.user.metadata.link_linkedin: User LinkedIn link
  • post.user.metadata.link_twitter: User Twitter link
  • post.user.metadata.link_website: User website link
  • post.user.metadata.link_youtube: User YouTube link
  • post.user.metadata.location: User location
  • children: Replies to the comment

✅ All neatly structured in JSON — ready for analysis, export, or automation.

Full example output:

{
	"post": {
		"id": "988164040ef44b9aba05e89a17a8f94f",
		"metadata": {
			"action": 0,
			"attachments_edit": 1745495768274600000,
			"content": "My thoughts: choosing between stored procedures and PySpark will depend a lot on your architectural requirement. If you use lakehouses you will be using a notebook \\(Pyspark or python based n data volume\\) to wirte your transformations, and if you want to automate sql scripts over a warehouse you will be needing stored procs.\n\nit might also depend on the skillset of the data team. So if you have your data people coming from a SQL background, you might end up using warehouse + SPs. Although we can do a lot with T-SQL/SparkSQL running in notebooks \\(i'm not sure of the limitations here \\)\n\nIn terms of cost, Pyspark should be the cheaper option \\(for large datasets\\). in terms of performance, i used to believe that notebooks are the fastest, but i just watched [this video](https://www.youtube.com/watch?v=G6t4d5FU0zI) where stored procedures were used to create fact and dimension tables and it ran about 2x faster than running the same process in a spark notebook.\n\nAs for being platform agnostic, i think your stored procedures and spark codes can be easily transferred to any of the other data platforms you mentioned.\n\nLet me know if this aligns with and answers some of your concerns.\n\nP.S. This is a topic I’ve been exploring a lot lately \\(whether it's possible to build solutions entirely on a Lakehouse or entirely on a Warehouse\\), so I’m looking forward to learning from this thread.",
			"content_edit": 1745495768274600000,
			"has_new_comments": 1,
			"last_comment": 1745544507061979000,
			"upvotes": 3,
			"video_links_data": "[]",
			"video_links_edit": 1745495768274600000
		},
		"created_at": "2025-04-24T10:46:28.273608Z",
		"updated_at": "2025-04-25T12:06:09.183188Z",
		"group_id": "f412933e1c184e088b4b132d5244b875",
		"user_id": "21e14917517d4e13833c90980b1b8d57",
		"post_type": "comment",
		"parent_id": "c5dc8b5d4a174c4db4930670d12ba505",
		"root_id": "c5dc8b5d4a174c4db4930670d12ba505",
		"user": {
			"id": "21e14917517d4e13833c90980b1b8d57",
			"name": "mubaraq-abdulmaleek-2161",
			"metadata": {
				"bio": "Freelance data analyst || DP600 Certified  || Entering Fabric zen 🧘🏽‍♂️:)",
				"last_offline": 1745608988404655400,
				"link_facebook": "",
				"link_instagram": "",
				"link_linkedin": "https://www.linkedin.com/in/mubaraq-abdulmaleek/",
				"link_twitter": "",
				"link_website": "",
				"link_youtube": "",
				"location": "",
				"myers_briggs": "",
				"picture_bubble": "https://assets.skool.com/f/21e14917517d4e13833c90980b1b8d57/0fa64679053a4dc78c4fcf914f2c012822524019e2e5440ba6921813f027b3c0-sm.jpg",
				"picture_profile": "https://assets.skool.com/f/21e14917517d4e13833c90980b1b8d57/0fa64679053a4dc78c4fcf914f2c012822524019e2e5440ba6921813f027b3c0-md.jpg",
				"sp_data": "{\"pts\":512,\"lv\":5,\"pcl\":155,\"pnl\":515,\"role\":4}"
			},
			"created_at": "2024-01-18T12:22:19.394356Z",
			"updated_at": "2025-04-25T19:23:08.404874Z",
			"email": "",
			"first_name": "Mubaraq",
			"last_name": "Abdulmaleek"
		}
	},
	"children": [
		{
			"post": {
				"id": "2bc3ad36924f46b58ec4cb4eec79dde6",
				"metadata": {
					"action": 0,
					"content": "[@Mubaraq Abdulmaleek](obj://user/21e14917517d4e13833c90980b1b8d57) , thanks for your inputs. I'm glad I joined this group. Yes, it does confirm some of my thoughts. It essentially falls to what is most comfortable to the data team that will be supporting the pipelines. I My next question would probably be if it is possible to have both a warehouse and a lakehouse in the same workspace? I can't seem to find any resource about this set-up or if it is even ideal. My Thought is that, in cases where we end up to be much comfortable with stored procedure then we can just create a warehouse in the same workspace."
				},
				"created_at": "2025-04-25T01:28:27.061979Z",
				"updated_at": "2025-04-25T01:28:27.061979Z",
				"group_id": "f412933e1c184e088b4b132d5244b875",
				"user_id": "a4d0fbd407454d7d8a6e1ae82f38416a",
				"post_type": "comment",
				"parent_id": "988164040ef44b9aba05e89a17a8f94f",
				"root_id": "c5dc8b5d4a174c4db4930670d12ba505",
				"user": {
					"id": "a4d0fbd407454d7d8a6e1ae82f38416a",
					"name": "dexter-wagang-5380",
					"metadata": {
						"bio": "Data Architect",
						"last_offline": 1745544992858476800,
						"picture_bubble": "https://assets.skool.com/f/a4d0fbd407454d7d8a6e1ae82f38416a/d9a5859bf4c94b7ca8f41e65b2c529a9d465c16ecd904c0db72c3c0c54fc672b-sm.jpg",
						"picture_profile": "https://assets.skool.com/f/a4d0fbd407454d7d8a6e1ae82f38416a/d9a5859bf4c94b7ca8f41e65b2c529a9d465c16ecd904c0db72c3c0c54fc672b-md.jpg",
						"sp_data": "{\"pts\":5,\"lv\":2,\"pcl\":5,\"pnl\":20,\"role\":4}"
					},
					"created_at": "2025-04-21T09:07:06.714302Z",
					"updated_at": "2025-04-26T19:28:26.327075Z",
					"email": "",
					"first_name": "Dexter",
					"last_name": "Wagang"
				}
			}
		}
	]
}
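Because replies nest under children (each child is itself a {"post": …} object that may carry its own children), a short recursive walk flattens a whole thread into rows. A sketch using only the fields shown above:

```python
def flatten(thread):
    """Yield (comment_id, author_name, content) for a comment and all its replies."""
    post = thread["post"]
    yield (post["id"], post["user"]["name"], post["metadata"]["content"])
    for child in thread.get("children", []):   # replies share the same shape
        yield from flatten(child)

# A trimmed-down thread in the output shape above:
thread = {
    "post": {"id": "c1", "user": {"name": "alice"}, "metadata": {"content": "hi"}},
    "children": [
        {"post": {"id": "c2", "user": {"name": "bob"}, "metadata": {"content": "yo"}}},
    ],
}
print(list(flatten(thread)))
# → [('c1', 'alice', 'hi'), ('c2', 'bob', 'yo')]
```

From here the rows drop straight into a CSV writer or a DataFrame for analysis.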

⌨️ Input Parameters

  • postUrl: Any valid Skool post URL.
  • cookies: Cookies to use for authentication to scrape private groups.

If you want to scrape comments from a post in a private group, you must provide authentication cookies from an account that is a member of that group. This is not needed for posts in public groups.

Here's how to obtain your auth cookie:

  1. Install the Copy Cookies browser extension
  2. Go to Skool and log in
  3. Click the extension icon to copy the cookies
  4. Paste them into the “cookies” field

Input example:

{
    "postUrl": "https://www.skool.com/microsoft-fabric/i-took-the-dp-700-today-and-i-passed"
}
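You can also start a run from code through Apify's REST API. This sketch uses the documented run-sync-get-dataset-items endpoint; "<ACTOR_ID>" and "<YOUR_API_TOKEN>" are placeholders you replace with the ID shown on this actor's store page and your own Apify API token:

```python
import requests

def build_input(post_url, cookies=None):
    """Assemble the run input; cookies are only needed for private groups."""
    run_input = {"postUrl": post_url}
    if cookies:
        run_input["cookies"] = cookies
    return run_input

def scrape_post(actor_id, token, post_url, cookies=None):
    """Run the actor synchronously and return its dataset items as a list."""
    resp = requests.post(
        f"https://api.apify.com/v2/acts/{actor_id}/run-sync-get-dataset-items",
        params={"token": token},
        json=build_input(post_url, cookies),
        timeout=600,
    )
    resp.raise_for_status()
    return resp.json()

# Usage (requires a real token and actor ID):
# items = scrape_post("<ACTOR_ID>", "<YOUR_API_TOKEN>",
#                     "https://www.skool.com/microsoft-fabric/i-took-the-dp-700-today-and-i-passed")
```

The official apify-client package wraps the same API if you prefer a higher-level interface.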

💵 Transparent Pricing

  • Per post / actor run: $0.001 (1,000 posts for $1)
  • Per comment: $0.001 (1,000 comments for $1)
  • Per comment reply: FREE

You are only charged for base level comments on posts, not replies to comments.

Example: suppose you scrape 2 posts, each with 100 comments. You pay 2 × $0.001 = $0.002 for the actor runs and 2 × 100 × $0.001 = $0.20 for the results. Total cost: $0.202 (about 20 cents for 200 comments).
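At the listed rates, the cost estimate reduces to a one-liner (replies cost nothing, so only top-level comments count):

```python
def run_cost(posts, comments_per_post, per_run=0.001, per_comment=0.001):
    """Estimated cost in dollars: one run charge per post plus one charge
    per top-level comment; replies are free."""
    return posts * per_run + posts * comments_per_post * per_comment

print(round(run_cost(2, 100), 3))  # 2 posts × 100 comments each → 0.202
```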

Pro Tip: Use the “Maximum cost per run” setting under "Run Options" to control your scraping budget.

Consider upgrading to a paid plan if you run out of free credits: https://apify.com/pricing.

🚀 How to Start

  1. Click "Try for free"
  2. Paste your Skool post URL
  3. Run the actor and download your data

That’s it. No rate-limiting. No complex setups. No missed comments.

🔗 Easy Integrations

  • Use with Make.com for automation workflows

👋 About the Developer

Louis Deconinck

I’m Louis Deconinck, a top 1% Apify developer, Oxford graduate, and builder of 60+ public actors used by hundreds of users every month.

With 10+ years of software experience and millions of results scraped across the toughest websites, I specialize in fast, stable, and scalable data extraction tools that just work.

  • 🏆 Winner of the Apify AI Agent Hackathon
  • 💬 300+ community contributions on Apify Discord
  • 🏦 Worked with major European banks on data infrastructure

Need help or want to collaborate? Message me here.

📣 Reviews from the Community

⭐⭐⭐⭐⭐ "I want to let you know that you're an incredibly talented developer and your ability to understand/distill problems, troubleshoot, and offer solutions is unparalleled." - Kamesh D.

⭐⭐⭐⭐⭐ "You have so much talent and a very strong mental model for tackling problems. I enjoyed working with you and your scrapers got me very productive and meaningful results." - Leo X.

Click "Try for free" to start!