Linkedin post scraper

curious_coder/linkedin-post-search-scraper

3 days trial then $30.00/month - No credit card required now

Scrape LinkedIn posts or updates from LinkedIn post search results. Supports advanced LinkedIn search filters. Extract posts from any LinkedIn member.


"Issue with LinkedIn Scraper: Handling Cookies and Authentication in Local VSCode Environment"

Closed

likedinscraper opened this issue
a month ago

I successfully ran the LinkedIn scraper in the Apify environment, and it returned the correct JSON data. However, I am having trouble replicating the same results in my local VSCode environment.

In the Apify environment, everything worked fine, but when I try to execute the script locally, I am unsure where to place my authentication token (or which token is required). Specifically, I’m unclear on how to pass the necessary cookies and authentication data into the request when running the scraper locally.

I’ve reviewed the documentation, but I wasn’t able to figure out how to handle the cookies and token integration in the local environment. Could you please provide guidance on what token to use, where it should be placed, and how to handle the authentication for LinkedIn scraping?

Here is my code:

    from apify_client import ApifyClient

    # Replace '<YOUR_API_TOKEN>' with your Apify API token
    client = ApifyClient("<YOUR_API_TOKEN>")

    # Prepare the input for the Actor, with your LinkedIn URLs
    run_input = {
        "urls": [
            "https://www.linkedin.com/in/ginoolivares/recent-activity/all/",
            # Add more URLs for posts or profiles as needed
        ],
        "minDelay": 2,
        "maxDelay": 5,
        "proxy": {
            "useApifyProxy": True,
            "apifyProxyCountry": "BR",  # Or adjust the country as needed
        },
        # Here I tried to pass the cookies exported from the extension
        "cookies": [
            {
                "name": "bcookie",
                "value": "<your_bc... [trimmed]


likedinscraper

a month ago

I have already used the "Copy Cookies" extension, and it works perfectly when I paste the exported cookies into the Apify front-end. However, when I try to run the scraper locally in VSCode, I am not sure which of the cookies should be used and how exactly they should be passed in the code.

Could someone help me with the correct structure for the code to run the scraper locally?

curious_coder

You can switch to the JSON tab on the Apify actor input page to get the correct payload for your API call.
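Building on the maintainer's suggestion, here is a minimal sketch of how that JSON-tab payload might be passed when running locally with the `apify-client` Python package. The token in question is the Apify API token (from the Apify Console), not a LinkedIn token; LinkedIn authentication travels in the exported cookies inside the run input, exactly as in the web UI. The cookie names (`li_at`, `JSESSIONID`) shown below are placeholders based on LinkedIn's usual session cookies, and the `"cookies"` field name mirrors the snippet above — copy the exact field names and the full cookie list from the actor's JSON input tab rather than trusting this sketch.

```python
# Minimal sketch: run the actor locally with apify-client, passing the
# payload copied from the actor's JSON input tab. Cookie names/values
# and the input field names here are placeholders/assumptions.

def build_run_input(cookies: list[dict]) -> dict:
    """Mirror the JSON-tab payload; 'cookies' holds the exported browser cookies."""
    return {
        "urls": ["https://www.linkedin.com/in/ginoolivares/recent-activity/all/"],
        "minDelay": 2,
        "maxDelay": 5,
        "proxy": {"useApifyProxy": True, "apifyProxyCountry": "BR"},
        # Field name as in the snippet above; confirm it on the JSON tab.
        "cookies": cookies,
    }


def fetch_posts(api_token: str, run_input: dict) -> list[dict]:
    # Import here so this file still loads where apify-client isn't installed.
    from apify_client import ApifyClient

    client = ApifyClient(api_token)  # Apify API token, not a LinkedIn token
    run = client.actor("curious_coder/linkedin-post-search-scraper").call(
        run_input=run_input
    )
    # The scraped posts land in the run's default dataset.
    return list(client.dataset(run["defaultDatasetId"]).iterate_items())


if __name__ == "__main__":
    # Placeholder cookies -- paste the values your browser extension exported.
    exported_cookies = [
        {"name": "li_at", "value": "<li_at value from the export>"},
        {"name": "JSESSIONID", "value": "<JSESSIONID value>"},
    ]
    items = fetch_posts("<YOUR_API_TOKEN>", build_run_input(exported_cookies))
    print(len(items))
```

The design point is simply that the same JSON object the Apify front-end builds for you is what `run_input` must contain locally; nothing extra is attached to the HTTP request itself, since the actor reads the cookies from its input.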

Developer
Maintained by Community

Actor Metrics

  • 479 monthly users

  • 79 stars

  • 93% runs succeeded

  • 4.5 days response time

  • Created in Jun 2023

  • Modified 24 days ago