Instagram Post Details Scraper (PPR)


Developed by Powerful Bachelor

Maintained by Community

📸 Extract detailed Instagram post data effortlessly! Get likes, comments, captions, hashtags & more. Perfect for 📊 analytics, 🎯 marketing research & 📈 performance tracking. Easy setup, powerful results! Supports multiple formats: JSON, CSV, Excel. Start scraping Instagram posts today! 🚀✨
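For orientation, here is a minimal sketch of how an Apify Store actor like this one is typically run from Python with the apify-client package. The actor ID, the "mediaUrl" input field name, and the example post URL are illustrative assumptions rather than the actor's documented schema; check the actor's input tab in the Apify Console for the real field names.

```python
from apify_client import ApifyClient

# Authenticate with your Apify API token.
client = ApifyClient("<YOUR_APIFY_TOKEN>")

# Assumed input shape for illustration only: take the real field names
# from the actor's input schema.
run_input = {
    "mediaUrl": ["https://www.instagram.com/p/EXAMPLE_POST_ID/"],
}

# Start the actor and wait for the run to finish.
# The actor ID below is a placeholder; copy the real one from the store page.
run = client.actor("powerful_bachelor/instagram-post-details-scraper").call(run_input=run_input)

# Iterate over the scraped post items (likes, comments, captions, hashtags, ...).
for item in client.dataset(run["defaultDatasetId"]).iterate_items():
    print(item)
```

The JSON, CSV, and Excel exports mentioned in the description can then be downloaded from the run's default dataset in the Apify Console or via the dataset API.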

Rating: 5.0 (1)

Pricing: $2.00 / 1,000 posts


Total users: 207

Monthly users: 34

Runs succeeded: >99%

Issues response: 1.1 days

Last modified: 14 days ago


I'd like to inquire about the scraper's media URL function.

Closed

seon056 opened this issue
a month ago

The scraper you provided is very useful.

Thank you so much for providing the scraper I really needed.

However, when I pass a Reels URL as the media URL parameter value, there is no output; the run just finishes in the succeeded state.

Could you let me know whether Reels can't be retrieved or simply aren't supported?

Another question: when I provide a media URL as a parameter value, is there a separate option I can set so that comments are not fetched?

As far as I know, this option isn't currently available, but fetching the comments sometimes makes a run very slow and causes it to run out of memory.

Could you please look into these two points?

powerful_bachelor

Hey, thank you for your valuable feedback! ⭐ Could you provide the run ID for the second part and explain it a little more? As for the first part, we will check and get back to you. Thanks!

powerful_bachelor

Hey,

Great news! We have now added support for Instagram Reels URLs to our scraper, so don't forget to try it out. Regarding comments: they come with the data by default and don't require extra calls, so they won't slow things down. The memory issues you're seeing are most likely caused by processing too many URLs at once, so try smaller batches of 500 or fewer. You can also start multiple runs at the same time to speed up your scraping.

Thanks
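To make the batching advice above concrete, here is a minimal sketch using the Apify Python client. The actor ID and the "mediaUrl" input field name are hypothetical placeholders; adjust both to the actor's real input schema before running.

```python
from apify_client import ApifyClient

client = ApifyClient("<YOUR_APIFY_TOKEN>")

# Placeholder actor ID and input field name - use the real ones from the store page.
ACTOR_ID = "powerful_bachelor/instagram-post-details-scraper"
BATCH_SIZE = 500  # keep each run at 500 URLs or fewer to avoid memory issues

# Placeholder URLs standing in for your full list of post/Reel URLs.
all_urls = [f"https://www.instagram.com/p/PLACEHOLDER_{n}/" for n in range(1200)]

# Split the list into batches and start one run per batch.
# .start() returns immediately, so the batches run in parallel.
run_ids = []
for i in range(0, len(all_urls), BATCH_SIZE):
    batch = all_urls[i:i + BATCH_SIZE]
    run = client.actor(ACTOR_ID).start(run_input={"mediaUrl": batch})
    run_ids.append(run["id"])

print(f"Started {len(run_ids)} parallel runs: {run_ids}")
```

Each run writes to its own default dataset, so the results can be collected per run once they finish.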


seon056

a month ago

Thank you so much for implementing my request! Thanks to you, my concerns have been resolved. Good luck with everything you do!

powerful_bachelor

Thank you so much! We're really glad to hear that it's working well for you. Wishing you continued success too!