LinkedIn Jobs & Company Scraper

1 day trial then $19.99/month - No credit card required now

fetchclub/linkedin-jobs-scraper

Actively Maintained - Cheap Rental & Run Cost - LinkedIn Jobs Scraper + Companies - extract job listings worldwide, gathering all essential details. Export results for analysis, connect via API, & integrate with other apps. Ideal for job seekers, recruiters & agencies. Unofficial LinkedIn API.

🔥 - LinkedIn Jobs Scraper Apify Actor

🕑 - Access the LinkedIn Jobs Scraper on Apify

Quickly pull fresh job listings from LinkedIn and capture company profiles all in one place to speed up your job hunt or streamline recruitment efforts. Use keywords and a full range of filters to narrow your search, and extract crucial details such as job titles, descriptions, company profiles, and recruiter contact information. With this scraper, you'll have everything you need in one tool, including company logos, profile links, and more, making it ideal for businesses and recruitment agencies. Export the data in formats ready for analysis, or connect seamlessly through the API or Python for deeper integration with other tools. This scraper is perfect for finding roles and the associated companies worldwide, covering all regions available on LinkedIn, and saving you countless hours of manual searching.

❓ - What is the LinkedIn Jobs Scraper Used For?

The LinkedIn Jobs Scraper simplifies downloading relevant job listings and associated company profiles from LinkedIn. Whether you're a job seeker or a recruitment agency, LinkedIn offers a wealth of opportunities across industries and regions. This scraper not only pulls job listings that match your search criteria but also gathers key details about the companies posting these jobs. With everything in one place—job descriptions, company profiles — you can save hours of manual searching and quickly access the most up-to-date roles and company details.

It’s like having your own personal LinkedIn API for both jobs and companies, all in a single, cost-effective tool.


Example job from LinkedIn that the Apify LinkedIn Jobs Scraper can collect

🏢 - Get Company Profiles Alongside Job Listings

One of the standout features of this LinkedIn Jobs Scraper is the ability to scrape not just job listings but also detailed company profiles, all within the same tool. Now you can pull essential job data alongside comprehensive company information—like company name, logo, profile URL, and more. This is perfect for recruitment agencies or businesses that need a full picture of the companies behind the roles they’re sourcing. Instead of paying for separate scrapers (which can cost $30 for jobs and $20 for company data), you get everything bundled together for just $19.99. It’s a smarter, more affordable solution, providing everything you need in one place for one low monthly cost.


🔨 - How to Use the LinkedIn Jobs Scraper

Using the LinkedIn Jobs Scraper is as straightforward as using the LinkedIn website.

Head over to https://www.linkedin.com/jobs/search using an incognito window (a logged-out session). Next, enter the job title or company you're looking for, along with the geographic location, and click search. You can further refine your results using the filters available in the dropdown on the LinkedIn website, including 'Company', 'Job Type', 'Experience Level', 'Location', 'Salary', and 'Remote' status. The more filters you apply, the more specific your search will be, which generally returns fewer results.
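If you prefer to build the search URL in code rather than copying it from the browser, the query-string parameters can be assembled with the standard library. This is a minimal sketch: the parameter names (keywords, location, f_WT, f_E) come from the sample input URL later in this README, and the meanings assigned to f_WT and f_E here are assumptions based on that sample, not documented LinkedIn API values.

```python
from urllib.parse import urlencode

BASE = "https://www.linkedin.com/jobs/search"

def build_search_url(keywords, location, remote=False, experience=None):
    # Parameter names are taken from the sample search URL in this README;
    # the f_WT/f_E meanings below are assumptions inferred from that sample.
    params = {"keywords": keywords, "location": location}
    if remote:
        params["f_WT"] = "2"        # workplace-type filter (2 appears to mean remote)
    if experience is not None:
        params["f_E"] = experience  # experience-level filter, e.g. "4"
    return f"{BASE}?{urlencode(params)}"

url = build_search_url("Data Engineer", "London Area, United Kingdom", remote=True)
```

Paste the resulting URL into the "Search URL" field exactly as you would a URL copied from the browser.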

Once you've set the search terms and filters you want to scrape results for, copy the entire URL. Head back to the LinkedIn Jobs Scraper on Apify, paste the LinkedIn URL into the "Search URL" field on the input tab, and click "Start" (or "Save & Start") to scrape the job results.

The actor will return the results from the LinkedIn website, which you can download in JSON, CSV, XML, Excel, HTML Table, RSS, or JSONL formats.
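If you choose the JSONL export, each line is one flat job record, so the file can be streamed with the Python standard library alone. A small sketch (the file path is whatever you downloaded from Apify):

```python
import json

def read_jsonl(path):
    # Stream the export line by line; each non-empty line is one job record.
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            if line.strip():
                yield json.loads(line)
```

The same records parse identically from the JSON export via `json.load` on the whole file.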

When applying filters and search terms in a logged-out LinkedIn session, LinkedIn may prompt you to log in to continue. If this happens, simply click the back button twice; you'll return to the page you were on and can continue refining your selection to get the LinkedIn jobs URL you're looking for.


🔢 - How Many Results Can the LinkedIn Scraper Collect?

You can scrape up to 1,000 job listings at a time with the LinkedIn scraper, which should be sufficient for most use cases. If you need to gather more than 1,000 roles, you can break the task into multiple scrapes by adjusting your search criteria using the available filters on the LinkedIn website, and creating multiple tasks on Apify with the different URLs in the Search URL input box. Additionally, you have the option to set a limit on the number of results scraped by using the 'Max Results' feature on the input page.
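The splitting approach above can be sketched in a few lines: vary one filter parameter across runs and generate one input per value. The f_E values here are hypothetical experience levels chosen for illustration; only the parameter name itself appears in the sample URL later in this README.

```python
from urllib.parse import urlencode

base_params = {"keywords": "Data Engineer", "location": "London Area, United Kingdom"}
# Hypothetical f_E (experience level) values, one scrape per value.
experience_levels = ["2", "3", "4"]

run_inputs = [
    {
        "search_url": "https://www.linkedin.com/jobs/search?"
        + urlencode({**base_params, "f_E": level}),
        "max_results": 1000,
    }
    for level in experience_levels
]
```

Each dictionary becomes the input of one Apify task, keeping every run under the 1,000-result ceiling.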

As a rough guide, the scraper takes approximately 3 minutes to return 50 results from LinkedIn.


💰 - What's the Cost of Using the LinkedIn Jobs Scraper?

We offer this LinkedIn Jobs Scraper at a highly competitive monthly rental of just $19.99, making it more affordable than many comparable actors on Apify. We've also optimized the scraper to minimize the cost of each run, with an average price of just $0.02 per 50 results, which is significantly cheaper than most other LinkedIn scrapers available. Over a year of use, you could save hundreds of dollars compared to other scrapers on the Apify platform, making this a cheap LinkedIn scraper vs. Octoparse and PhantomBuster.

This actor is optimized to work with both datacenter and residential proxies. Datacenter proxies result in significantly lower running costs than residential proxies, which is one of the key reasons this actor is much more cost-effective than many others on the Apify platform.
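A quick back-of-envelope estimate using the two averages quoted in this README ($0.02 and roughly 3 minutes per 50 results); these are averages, so real runs will vary with proxies and filters:

```python
def estimate_run(n_results, cost_per_50=0.02, minutes_per_50=3):
    # Scale the quoted per-50-result averages up to the requested size.
    batches = n_results / 50
    return round(batches * cost_per_50, 2), round(batches * minutes_per_50)

cost_usd, minutes = estimate_run(1000)  # a full 1,000-result scrape
```

For a maximum 1,000-result run this works out to about $0.40 and roughly an hour.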


💎 - What Data Can the LinkedIn Jobs Scraper Extract?

The LinkedIn Jobs Scraper can extract all the key information and metadata for each job that matches your filter criteria. Below is an example of the type of data collected, along with a brief explanation of what each field represents:

Field Name: Field Description
title: Specifies the job title, e.g., 'Software Engineer.'
location: Shows the geographical location of the job, e.g., 'London, England, United Kingdom'
company_name: The name of the company offering the position.
posted_datetime: The exact date the job was posted.
posted_time_ago: The relative time since the job was posted, e.g., '2 days ago.'
applicants: Shows the number of applicants for the role, if available.
base_salary: Displays the base salary offered for the role, if provided.
additional_compensation: Specifies any additional compensation, such as bonuses or benefits, offered alongside the base salary.
seniority_level: The seniority level required for the role, such as 'Entry Level,' 'Mid-Senior level' or 'Internship.'
employment_type: The type of employment, such as 'Full-time', 'Contract,' etc.
job_function: Specifies the functional area of the job, such as 'Information Technology,' 'Marketing,' or 'Engineering.'
industries: Describes the industries relevant to the role, such as 'Staffing and Recruiting,' 'IT Services and IT Consulting,' etc.
description_text: The job description in plain text format without HTML formatting.
description_html: The job description in HTML format.
job_url: The unique URL to the LinkedIn job posting.
apply_type: Describes the type of application process, such as 'Easy Apply' or through a third-party website.
apply_url: The direct URL to the job application page based on apply_type.
recruiter_name: The name of the recruiter responsible for the position.
recruiter_detail: The LinkedIn headline for the recruiter responsible for the role, if available.
recruiter_image: The profile image URL of the recruiter.
recruiter_profile: A link to the LinkedIn profile of the recruiter.
company_id: The LinkedIn unique identifier for the company posting the job.
company_profile: A link to the LinkedIn profile page of the company.
company_logo: The logo image URL for the company posting the role.
job_urn: The unique identifier (URN) for the job on LinkedIn.
company_website: The official website of the company.
company_address_type: The type of address associated with the company, e.g., 'PostalAddress.'
company_street: The street address of the company.
company_locality: The locality (city) of the company.
company_region: The region (state/province) of the company.
company_postal_code: The postal code of the company.
company_country: The country of the company.
company_employees_on_linkedin: The number of employees listed on LinkedIn for the company.
company_followers_on_linkedin: The number of followers the company has on LinkedIn.
company_cover_image: The URL of the company's cover image on LinkedIn.
company_slogan: The slogan or tagline of the company, if available.
company_twitter_description: A brief description of the company used for its LinkedIn or Twitter profile.
company_about_us: A detailed "About Us" section for the company, including its mission and values.
company_industry: The industry in which the company operates, e.g., 'Software Development.'
company_size: The size of the company based on employee count, e.g., '11-50 employees.'
company_headquarters: The location of the company's headquarters, e.g., 'Delray Beach, Florida.'
company_organization_type: The type of organization, e.g., 'Privately Held.'
company_founded: The year the company was founded.
company_specialties: The company's areas of expertise, e.g., 'Web Development, Mobile Development, React.'
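Because every field above is a flat key in each exported record, per-record processing is a plain dictionary lookup. A sketch using a hypothetical, abbreviated record (real records contain all the fields listed above):

```python
record = {  # hypothetical, abbreviated result for illustration
    "title": "Software Engineer",
    "company_name": "G2i Inc.",
    "seniority_level": "Mid-Senior level",
    "recruiter_name": None,
}

def summarize(job):
    # recruiter_* fields are null when no recruiter is shown on the posting.
    contact = job.get("recruiter_name") or "no recruiter listed"
    return f'{job["title"]} at {job["company_name"]} ({contact})'
```

Calling `summarize(record)` yields a one-line digest suitable for a spreadsheet or alert.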

🪜 - How to Use the LinkedIn Jobs Scraper on Apify

The LinkedIn Jobs Scraper is designed for simplicity. Follow these easy steps to start scraping and download relevant job data:

  1. Create a free Apify account.

  2. Open the LinkedIn Jobs Scraper (start your free trial).

  3. Copy the URL from LinkedIn for your desired search and paste it in the Search URL input box.

  4. Click "Start" and wait for the scraping process to finish.

  5. Select your preferred format to export the LinkedIn job data.

Repeat steps 3 to 5 as many times as needed to download all the jobs you're interested in from LinkedIn.
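Steps 3 to 5 can also be driven from code with the official Apify Python client (`pip install apify-client`). This is a minimal sketch, not a definitive integration: the input field names follow the sample input later in this README, and you supply your own Apify API token.

```python
run_input = {  # field names follow the sample input in this README
    "search_url": "https://www.linkedin.com/jobs/search?keywords=Data%20Engineer",
    "max_results": 1000,
    "include_company_details": True,
    "proxy_group": "DATACENTER",
}

def scrape_jobs(api_token, run_input):
    # Imported lazily so this sketch loads even without apify-client installed.
    from apify_client import ApifyClient
    client = ApifyClient(api_token)
    # Start the actor and wait for the run to finish.
    run = client.actor("fetchclub/linkedin-jobs-scraper").call(run_input=run_input)
    # Each dataset item is one flat job record in the format documented above.
    return list(client.dataset(run["defaultDatasetId"]).iterate_items())
```

Calling `scrape_jobs("<your-token>", run_input)` returns the scraped records as a list of dictionaries, ready for further processing.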


Example of Jobs Scraped Output from the LinkedIn Jobs Scraper

⬇️ - Example Input for the LinkedIn Jobs Scraper

Below is a sample JSON input for the LinkedIn scraper. This input is configured to return job listings for the term "Data Engineer" in London. Notice how you only need to copy the URL for your search term on LinkedIn and paste it into the "search_url" input box. You can easily specify filters and other search criteria using LinkedIn's native interface, making it simple to set up and run your search on Apify.

{
  "include_company_details": true,
  "max_results": 1000,
  "search_url": "https://www.linkedin.com/jobs/search?keywords=Data%20Engineer&location=London%20Area%2C%20United%20Kingdom&geoId=90009496&f_TPR=&f_WT=2&f_E=4&position=1&pageNum=0",
  "proxy_group": "DATACENTER"
}

⬆️ - Example Output for the LinkedIn Jobs Scraper

Below are two sample records in JSON format based on the input provided above. Each field is captured directly from the LinkedIn website. You have multiple download options, including JSON, CSV, XML, Excel, RSS, and JSONL. The flat structure of the JSON format ensures easy parsing and integration into your workflows.

[
  {
    "title": "Software Engineer for Training AI Data (Verilog)",
    "location": "London, England, United Kingdom",
    "company_name": "G2i Inc.",
    "posted_datetime": "2024-09-19 10:09:12",
    "posted_time_ago": "1 week ago",
    "applicants": "Be among the first 25 applicants",
    "base_salary": null,
    "additional_compensation": null,
    "seniority_level": "Mid-Senior level",
    "employment_type": "Full-time",
    "job_function": "Engineering and Information Technology",
    "industries": "Software Development",
    "description_text": "Accepted Locations\nWe accept applicants from the US, Canada, and most countries in LATAM and Europe. We are accepting candidates from some countries in Africa and Asia. For the complete list of accepted locations\n,\nclick here\n.\nThis work is 100% remote.\nLoom Video\nOur Founder/CEO, Gabe Greenberg, created a more in-depth Loom video that we highly recommend you watch! Check it out here: https://www.loom.com/share/5a8972c7fbbf46aaa3f389b2b6391c40\nOverview\nYou’ll join an expert annotation team to create training data for the world's most advanced AI models. No previous AI experience is necessary. You'll get your foot in the door with one of the most prominent players in the AI/LLM space today. We seek software engineers with 3+ years of experience to train large AI language models, helping cutting-edge generative AI models write better code. Projects typically include discrete, highly variable problems that involve engaging with these models as they learn to code. We currently have 200+ roles open!\nWhat Will I Be Doing?\nEvaluating the quality of AI-generated code, including human-readable summaries of your rationale\nSolve coding problems, writing functional and efficient code\nWriting robust test cases to confirm code works efficiently and effectively\nWe asked the technical project manager to go into even more detail and this is how he answered: It is solving coding challenges, creating instructions to help others, reviewing the code before it goes into the model, and there's a ton of variety in the projects. 
We have everything from \"Which piece of Verilog code is better?\" to \"Make a full mobile application using this chatbot, and improve the chatbot's responses afterward to make it faster.\"\nPay Rates\nCompensation rates vary based on location and experience.\nNote: The following rates are starting points and may be subject to change:\nTo view the complete list of locations and their values, click here.\nUS - 50/hr\nBrazil - 25.40 USD /hr\nEgypt - 14.60 USD /hr\nPhilippines - 24.20 USD /hr\nExpectations are 15+ hours per week; however, there is no upper limit. We have engineers working 20-40 hours per week and some working 40+ hours per week. You can work as much as you want to. You'll get paid weekly per hour of work done on the platform.\nContract Length\nLong term, there is no end date. They expect to have work for the next 2 years.\nYou can end the contract at any time. We hope you will commit to 12 months of work, but if you start and it's not a fit for you, we totally understand.\nFlexible Schedules\nDevelopers can set their own hours—ideal candidates will be interested in spending 40 hours a week. They will be with teams, so strong performers will adapt to the urgency of projects and stay engaged, but they will also be incredibly flexible on working hours.\nYou can take a 3-hour lunch, no problem. Instead of tracking your hours, you are paid according to time spent on the platform, calculated in the coding exercises.\nInterview Process\nApply using this Ashby form.\nIf you seem like a good fit, we'll send an async RLHF code review that will take 35 minutes and must be finished within 72 hours of us sending it.\nYou'll receive credentials to the RLHF platform. We'll then set up a group call to answer any further questions about onboarding with the company.\nYou'll perform a simulated production-level task (RLHF task) on the platform. This will be your final interview, which will ultimately determine your employment and leveling. 
Successful completion of this process provides you with an opportunity to work on projects as they become available.\nTech Stack Priorities\nThe current priority for this team is engineers with either a Data Science background who know Verilog well or software engineers who are well versed in Verilog.\nRequired Qualifications:\n3+ years of experience in a software engineering/software development role.\nComplete fluency in the English language.\nAbility to articulate complex scientific concepts clearly and engagingly.\nExcellent attention to detail and ability to maintain consistency in writing.\nSolid understanding of grammar, punctuation, and style guidelines.\nProficiency with Verilog\nNice To Haves:\nBachelor's or Master’s degree in Computer Science\nProficiency in working with one or more of the following: Java, JavaScript, TypeScript, C++. SQL, Swift, Ruby, Rust, Go, NET, Matlab, PHP, HTML, DART, R, Apex, and Shell, C, C#\nRecognized accomplishments or contributions to the coding community or in projects.\nProven analytical skills with an ability to approach problems creatively.\nAdept communication skills, especially when understanding and discussing project requirements.\nA commitment to continuous learning and staying updated with the latest coding advancements and best practices.\nEnthusiasm for teaching AI models and experience with technical writing!",
    "description_html": "<div class=\"show-more-less-html__markup show-more-less-html__markup--clamp-after-5 relative overflow-hidden\">\n<strong>Accepted Locations<br/><br/></strong>We accept applicants from the US, Canada, and most countries in LATAM and Europe. We are accepting candidates from some countries in Africa and Asia. For the complete list of accepted locations<strong>, </strong><strong>click here</strong><strong>.</strong> This work is 100% remote.<br/><br/><strong>Loom Video<br/><br/></strong>Our Founder/CEO, Gabe Greenberg, created a more in-depth Loom video that we highly recommend you watch! Check it out here: https://www.loom.com/share/5a8972c7fbbf46aaa3f389b2b6391c40<br/><br/><strong>Overview<br/><br/></strong>You’ll join an expert annotation team to create training data for the world's most advanced AI models. No previous AI experience is necessary. You'll get your foot in the door with one of the most prominent players in the AI/LLM space today. We seek software engineers with 3+ years of experience to train large AI language models, helping cutting-edge generative AI models write better code. Projects typically include discrete, highly variable problems that involve engaging with these models as they learn to code. We currently have 200+ roles open!<br/><br/><strong>What Will I Be Doing?<br/><br/></strong><ul><li>Evaluating the quality of AI-generated code, including human-readable summaries of your rationale</li><li>Solve coding problems, writing functional and efficient code</li><li>Writing robust test cases to confirm code works efficiently and effectively</li><li>We asked the technical project manager to go into even more detail and this is how he answered: It is solving coding challenges, creating instructions to help others, reviewing the code before it goes into the model, and there's a ton of variety in the projects. 
We have everything from \"Which piece of Verilog code is better?\" to \"Make a full mobile application using this chatbot, and improve the chatbot's responses afterward to make it faster.\"<br/><br/></li></ul><strong>Pay Rates<br/><br/></strong>Compensation rates vary based on location and experience.<br/><br/>Note: The following rates are starting points and may be subject to change:<br/><br/>To view the complete list of locations and their values, click here.<br/><br/><ul><li>US - 50/hr</li><li>Brazil - 25.40 USD /hr</li><li>Egypt - 14.60 USD /hr</li><li>Philippines - 24.20 USD /hr</li><li>Expectations are 15+ hours per week; however, there is no upper limit. We have engineers working 20-40 hours per week and some working 40+ hours per week. You can work as much as you want to. You'll get paid weekly per hour of work done on the platform. <br/><br/></li></ul><strong>Contract Length<br/><br/></strong><ul><li>Long term, there is no end date. They expect to have work for the next 2 years. </li><li>You can end the contract at any time. We hope you will commit to 12 months of work, but if you start and it's not a fit for you, we totally understand. <br/><br/></li></ul><strong>Flexible Schedules<br/><br/></strong><ul><li>Developers can set their own hours—ideal candidates will be interested in spending 40 hours a week. They will be with teams, so strong performers will adapt to the urgency of projects and stay engaged, but they will also be incredibly flexible on working hours. </li><li>You can take a 3-hour lunch, no problem. Instead of tracking your hours, you are paid according to time spent on the platform, calculated in the coding exercises. <br/><br/></li></ul><strong>Interview Process<br/><br/></strong><ul><li>Apply using this Ashby form. </li><li>If you seem like a good fit, we'll send an async RLHF code review that will take 35 minutes and must be finished within 72 hours of us sending it. </li><li>You'll receive credentials to the RLHF platform. 
We'll then set up a group call to answer any further questions about onboarding with the company. </li><li>You'll perform a simulated production-level task (RLHF task) on the platform. This will be your final interview, which will ultimately determine your employment and leveling. Successful completion of this process provides you with an opportunity to work on projects as they become available. <br/><br/></li></ul><strong>Tech Stack Priorities<br/><br/></strong><ul><li>The current priority for this team is engineers with either a Data Science background who know Verilog well or software engineers who are well versed in Verilog. <br/><br/></li></ul><strong>Required Qualifications:<br/><br/></strong><ul><li>3+ years of experience in a software engineering/software development role. </li><li>Complete fluency in the English language. </li><li>Ability to articulate complex scientific concepts clearly and engagingly. </li><li>Excellent attention to detail and ability to maintain consistency in writing. </li><li>Solid understanding of grammar, punctuation, and style guidelines. </li><li>Proficiency with Verilog<br/><br/></li></ul><strong>Nice To Haves:<br/><br/></strong><ul><li>Bachelor's or Master’s degree in Computer Science</li><li>Proficiency in working with one or more of the following: Java, JavaScript, TypeScript, C++. SQL, Swift, Ruby, Rust, Go, NET, Matlab, PHP, HTML, DART, R, Apex, and Shell, C, C#</li><li>Recognized accomplishments or contributions to the coding community or in projects. </li><li>Proven analytical skills with an ability to approach problems creatively. </li><li>Adept communication skills, especially when understanding and discussing project requirements. </li><li>A commitment to continuous learning and staying updated with the latest coding advancements and best practices. </li><li>Enthusiasm for teaching AI models and experience with technical writing!<br/><br/></li></ul>\n</div>",
    "job_url": "https://uk.linkedin.com/jobs/view/software-engineer-for-training-ai-data-verilog-at-g2i-inc-4029759064",
    "apply_type": "External Apply",
    "apply_url": "https://www.linkedin.com/jobs/view/externalApply/4029759064?url=https%3A%2F%2Fjobs%2Eashbyhq%2Ecom%2Fg2i%2F5ff24192-2ad6-49e2-b9b2-295d082af81b%2Fapplication%3Futm_source%3Dj9V6VK4o8K&urlHash=KKNS",
    "recruiter_name": null,
    "recruiter_detail": null,
    "recruiter_image": null,
    "recruiter_profile": null,
    "company_id": "7790167",
    "company_profile": "https://www.linkedin.com/company/g2i-inc",
    "company_logo": "https://media.licdn.com/dms/image/v2/D560BAQE1LAyn2pCGDw/company-logo_100_100/company-logo_100_100/0/1700638743046/g2i_inc__logo?e=2147483647&v=beta&t=S0XtxUfYx197eNosunyaBJqpmWrVCjNPb_TSNtlAiFE",
    "company_website": "https://g2i.co",
    "company_address_type": "PostalAddress",
    "company_street": "105 E Atlantic Ave",
    "company_locality": "Delray Beach",
    "company_region": "Florida",
    "company_postal_code": "33444",
    "company_country": "US",
    "company_employees_on_linkedin": 113,
    "company_followers_on_linkedin": 13456,
    "company_cover_image": "https://media.licdn.com/dms/image/v2/D563DAQGkiDRFwJh56w/image-scale_191_1128/image-scale_191_1128/0/1700638322201/g2i_inc__cover?e=2147483647&v=beta&t=S6MYlsR-P-HjpVA22MDKpS4eXe7SXX3papXwvsyI3nQ",
    "company_slogan": "G2i is a hiring community connecting remote developers with world-class engineering teams.",
    "company_twitter_description": "G2i Inc. | 13,456 followers on LinkedIn. G2i is a hiring community connecting remote developers with world-class engineering teams. | G2i is a hiring community connecting remote developers with world-class engineering teams. Our unique approach combines rigorous technical assessments with a solid commitment to developer health, ensuring companies get skilled developers who are supported, valued, and ready to execute from day one.\n\nOur transparent vetting process includes in-depth, performance-ranked developer profiles, recorded technical interviews, and soft-skills assessments.",
    "company_about_us": "G2i is a hiring community connecting remote developers with world-class engineering teams. Our unique approach combines rigorous technical assessments with a solid commitment to developer health, ensuring companies get skilled developers who are supported, valued, and ready to execute from day one.\n\nOur transparent vetting process includes in-depth, performance-ranked developer profiles, recorded technical interviews, and soft-skills assessments. Whether you're working on a short-term project or burning down a backlog, G2i connects you with a community of pre-vetted developers.\n\nPlanning to hire ten or more engineers? We create a Custom Talent Pipeline, allowing for specific customizations in sourcing, assessment criteria, technical interview questions, and integration with your existing HR systems and processes.\n\nG2i partners with clients who support the developer health mission—matching developers with environments that improve their health, support recovery from burnout, and enable professional growth through restful work. \n\nIs your team overworked or understaffed? Contact us today to learn how G2i can help you.\n\nMore information about our mission and commitment to developers and clients can be found at https://g2i.co or follow us on X @g2i_co",
    "company_industry": "Software Development",
    "company_size": "11-50 employees",
    "company_headquarters": "Delray Beach, Florida",
    "company_organization_type": "Privately Held",
    "company_founded": 2016,
    "company_specialties": "Web Development, Android/iOS Development, Node.js, Mobile Development, JavaScript, React, and React Native"
  },
  {
    "title": "Data Protection Automation Engineer - Remote",
    "location": "London, England, United Kingdom",
    "company_name": "Lorien",
    "posted_datetime": "2024-09-18 00:44:30",
    "posted_time_ago": "1 week ago",
    "applicants": "Be among the first 25 applicants",
    "base_salary": null,
    "additional_compensation": null,
    "seniority_level": "Mid-Senior level",
    "employment_type": "Full-time",
    "job_function": "Engineering and Information Technology",
    "industries": "Staffing and Recruiting",
    "description_text": "Brand new opportunity for a Data Protection Automation Engineer\nfor our client who are a multinational information technology services and consulting company.\nInside IR35\nLocation: Remote\nIn this role, you will:\n* Design, develop, test and support automation for the data protection services within the bank.* Design, develop, test and support reporting and monitoring capabilities for the automation.* Provide engineering support on the bank's data protection products and their standard usage.* Undertake product evaluation, new features, and versions.* Produce clear and comprehensive documentation for all automation produced.* Mentor less experienced members of staff\nTo be successful in this role, you should meet the following requirements:\n* 10+ years development experience with multiple languages, including Python, Bash* 5+ years' experience in configuration management and orchestration using Redhat Ansible* 5+ years' experience in configuration management and orchestration using Puppet.* 5+ years' experience in designing, developing, and testing Rest API services.* 5+ years' experience in Devops SCM tooling, GitHub* 5+ years' experience in Devops CI/CD tooling, Jenkins, CloudBees* 5+ years' experience in Devops collaboration tools, Jira, Confluence* Strong technical background in Veritas NetBackup and related infrastructure* Strong technical background in Commvault and related infrastructure\nCarbon60, Lorien & SRG - The Impellam Group STEM Portfolio are acting as an Employment Business in relation to this vacancy.",
    "description_html": "<div class=\"show-more-less-html__markup show-more-less-html__markup--clamp-after-5 relative overflow-hidden\">\n<strong>Brand new opportunity for a Data Protection Automation Engineer <strong> for our client who are a multinational information technology services and consulting company. <br/><br/></strong></strong><strong><strong>Inside IR35<br/><br/></strong></strong><strong>Location: Remote<br/><br/></strong><strong>In this role, you will:</strong>* Design, develop, test and support automation for the data protection services within the bank.* Design, develop, test and support reporting and monitoring capabilities for the automation.* Provide engineering support on the bank's data protection products and their standard usage.* Undertake product evaluation, new features, and versions.* Produce clear and comprehensive documentation for all automation produced.* Mentor less experienced members of staff<strong>To be successful in this role, you should meet the following requirements:</strong>* 10+ years development experience with multiple languages, including Python, Bash* 5+ years' experience in configuration management and orchestration using Redhat Ansible* 5+ years' experience in configuration management and orchestration using Puppet.* 5+ years' experience in designing, developing, and testing Rest API services.* 5+ years' experience in Devops SCM tooling, GitHub* 5+ years' experience in Devops CI/CD tooling, Jenkins, CloudBees* 5+ years' experience in Devops collaboration tools, Jira, Confluence* Strong technical background in Veritas NetBackup and related infrastructure* Strong technical background in Commvault and related infrastructure<br/><br/>Carbon60, Lorien &amp; SRG - The Impellam Group STEM Portfolio are acting as an Employment Business in relation to this vacancy.\n        </div>",
    "job_url": "https://uk.linkedin.com/jobs/view/data-protection-automation-engineer-remote-at-lorien-4026404944",
    "apply_type": "External Apply",
    "apply_url": "https://www.linkedin.com/jobs/view/externalApply/4026404944?url=https%3A%2F%2Florien%2Eeasyapply-ats%2Ecom%2Fuk%2Flinkedin%2F66e9c559dcf93c548bdbefe4%3Fsd%3D1&urlHash=Cqov",
    "recruiter_name": null,
    "recruiter_detail": null,
    "recruiter_image": null,
    "recruiter_profile": null,
    "company_id": "164766",
    "company_profile": "https://uk.linkedin.com/company/lorien",
    "company_logo": "https://media.licdn.com/dms/image/v2/D4E0BAQHV8Pay22YgLg/company-logo_100_100/company-logo_100_100/0/1719841563182/lorien_logo?e=2147483647&v=beta&t=JymhkG02fxjeW1WorAzPJbefepAC_EkqBXud32UQtYU",
    "company_website": "https://www.lorienglobal.com/?utm_source=LinkedIn&utm_medium=LinkedIn+Profile&utm_campaign=LinkedIn&utm_id=1",
    "company_address_type": "PostalAddress",
    "company_street": "18th & 19th Floors",
    "company_locality": "100 Bishopsgate",
    "company_region": "London",
    "company_postal_code": "EC2N 4AG",
    "company_country": "GB",
    "company_employees_on_linkedin": 664,
    "company_followers_on_linkedin": 448932,
    "company_cover_image": "https://media.licdn.com/dms/image/v2/C4D1BAQF0zgOrzQm26w/company-background_10000/company-background_10000/0/1605187007306/lorien_cover?e=2147483647&v=beta&t=SnohQpnC11sRaeMrFJswiTlsWnhEwExEZKLbB3IstaU",
    "company_slogan": "Transforming Careers, Businesses and Tech Strategies",
    "company_twitter_description": "Lorien | 448,932 followers on LinkedIn. Transforming Careers, Businesses and Tech Strategies | Lorien is a technology, transformation and telecom talent solutions specialist. We combine tech expertise with the scope, depth and breadth of large-scale talent solutions. \n\nDriven by both clients and candidates, covering Europe and the US, we work with all sectors, sizes and tech needs – from start-up to established, tech companies and those who use tech to propel them further - we work with them all.",
    "company_about_us": "Lorien is a technology, transformation and telecom talent solutions specialist. We combine tech expertise with the scope, depth and breadth of large-scale talent solutions. \n\nDriven by both clients and candidates, covering Europe and the US, we work with all sectors, sizes and tech needs – from start-up to established, tech companies and those who use tech to propel them further - we work with them all. \n\nOur unique position in the market means we have the insight to create tailored solutions – keeping our clients ahead of the curve and giving complete agility. From one-off placements to scalable enterprise solutions, executive search to next-generation tech skills, time-sensitive projects to ongoing digital journeys – we flex to fit the requirement.\n\nWe are the house of global technology recruitment. \n\nLorien is powered by Impellam, a connected group providing global workforce and specialist recruitment solutions. To learn more about Impellam Group, visit: www.impellam.com.",
85    "company_industry": "Staffing and Recruiting",
86    "company_size": "201-500 employees",
87    "company_headquarters": "100 Bishopsgate, London",
88    "company_organization_type": "Privately Held",
89    "company_founded": 1977,
90    "company_specialties": "Technology Recruitment Solutions, Recruitment Outsourcing, Managed Services, Statement of Work, RPO, Contingent Workforce Solutions, Project Recruitment, Executive Search, and MSP"
91  }
92]
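Once exported, records shaped like the sample above can be post-processed locally before loading them into a spreadsheet or CRM. A minimal sketch (the field names match the sample output; `summarise_job` is a hypothetical helper, not part of the actor):

```python
import json

def summarise_job(item: dict) -> dict:
    """Reduce a full scraper record to the fields most reports need."""
    return {
        "job_url": item.get("job_url"),
        "apply_url": item.get("apply_url"),
        "company": item.get("company_profile"),
        "industry": item.get("company_industry"),
        "company_size": item.get("company_size"),
    }

# Example with a trimmed record shaped like the dataset output above.
raw = json.dumps([{
    "job_url": "https://uk.linkedin.com/jobs/view/example-4026404944",
    "apply_url": "https://www.linkedin.com/jobs/view/externalApply/4026404944",
    "company_profile": "https://uk.linkedin.com/company/lorien",
    "company_industry": "Staffing and Recruiting",
    "company_size": "201-500 employees",
    "recruiter_name": None,
}])
jobs = [summarise_job(item) for item in json.loads(raw)]
print(jobs[0]["industry"])  # Staffing and Recruiting
```

The same pattern works on the JSON, CSV, or Excel exports, or on items fetched through the Apify API.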

💡 - Why Should I Switch to This LinkedIn Scraper?

If you're already using a LinkedIn scraper on Apify, here's why you should consider switching to this one. Our scraper is designed for maximum reliability, ensuring a high success rate with minimal discrepancies in the data collected. It's also cost-efficient to run, offering excellent value while covering all the details you need, as outlined in the example data provided. We offer responsive support, quickly addressing any issues or feature requests to help you stay productive. Plus, new features and improvements are regularly added based on user feedback, making this tool highly adaptable to your needs.


🏢 - LinkedIn Scraper for Recruitment Agencies

For recruitment agencies and staffing firms, this LinkedIn Jobs Scraper for Recruitment is an essential tool. It allows you to efficiently gather comprehensive job data, including job titles, company information, salary ranges, and recruiter contacts, making it easier to stay ahead of the competition. By automating the data collection process with this LinkedIn Jobs Scraper, you’ll save valuable time and ensure you’re always sourcing the best candidates for your clients before your competitors.

This LinkedIn Jobs Scraper is especially useful for analysing market trends over time, enabling recruitment agencies to spot in-demand skills and shifts in hiring patterns. Whether you’re tracking trends across industries or expanding your talent pool globally, the scraper helps you stay ahead of the curve. For agencies working with international clients, this tool allows you to collect job listings from around the world, making it perfect for scaling recruitment efforts. By keeping your candidate database aligned with current market demands, you can ensure you’re delivering the top talent your clients are searching for, all with the help of this LinkedIn Jobs Scraper for Recruitment.
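As a rough illustration of trend analysis, exported items can be scanned for skill mentions in their `description_html` field. This is a hedged sketch, not part of the actor: `count_skills` and the `SKILLS` list are hypothetical and should be adapted to your market.

```python
import re
from collections import Counter

# Hypothetical skill list; adjust for the roles you track.
SKILLS = ["Python", "Ansible", "Jenkins", "Puppet", "Kubernetes"]

def count_skills(items, skills=SKILLS):
    """Count how many job descriptions mention each skill (case-insensitive)."""
    counts = Counter()
    for item in items:
        text = item.get("description_html") or ""
        for skill in skills:
            if re.search(re.escape(skill), text, flags=re.IGNORECASE):
                counts[skill] += 1
    return counts

sample = [
    {"description_html": "10+ years with Python and Bash; Jenkins CI/CD"},
    {"description_html": "Redhat Ansible and Puppet orchestration"},
    {"description_html": "python scripting plus Ansible playbooks"},
]
print(count_skills(sample).most_common(3))
```

Running this across weekly exports gives a simple time series of in-demand skills per region or industry.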


👋 - Support is Available for the LinkedIn Jobs Scraper

Having used many LinkedIn scrapers ourselves, both on the Apify platform and elsewhere, we understand not only how essential the LinkedIn Jobs Scraper is to your workflow but also how critical it is to address any issues promptly when something doesn't look right. If you encounter any problems or have questions, you can quickly raise an issue and expect a fast response. We proactively test this actor to ensure stability, making it reliable and user-friendly. Rest assured, we value your investment and are committed to providing a service that meets your expectations.


📝 - Helpful Notes for the LinkedIn Scraper

When using this scraper, you can expect the number of results to match the LinkedIn website about 90% of the time for a given set of filter criteria. LinkedIn often includes duplicates or may present roles in ways that lead to slight discrepancies in the total count. If you notice a significant difference or believe some results are missing, please contact us and we will promptly investigate. While some factors in how LinkedIn displays roles are beyond our control, we are committed to ensuring the highest level of accuracy.
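Because LinkedIn itself can surface the same posting more than once, you may want to deduplicate exported items on `job_url` (a field present in the sample output). A hypothetical sketch:

```python
def dedupe_jobs(items):
    """Keep the first occurrence of each job_url, dropping repeats."""
    seen = set()
    unique = []
    for item in items:
        key = item.get("job_url")
        if key and key in seen:
            continue
        if key:
            seen.add(key)
        unique.append(item)
    return unique

sample = [
    {"job_url": "https://uk.linkedin.com/jobs/view/a-1"},
    {"job_url": "https://uk.linkedin.com/jobs/view/a-1"},
    {"job_url": "https://uk.linkedin.com/jobs/view/b-2"},
]
print(len(dedupe_jobs(sample)))  # 2
```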


Our web scrapers operate ethically and do not collect private user data such as personal email addresses, gender, or an individual’s location; they only gather information that users have explicitly shared publicly. Additionally, the scraper only uses a logged-out LinkedIn session, so you're accessing publicly available data. We believe that, when used for ethical purposes, our scrapers are a safe tool. However, please note that your results may still contain personal data. Such data is protected under the GDPR in the European Union and by similar regulations globally, so you should avoid scraping personal data unless you have a legitimate reason. If you're uncertain about the legality of your use case, we recommend consulting legal counsel. For more information, you can also read the blog post Apify wrote on the legality of web scraping.
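If you have no legitimate basis to retain personal data, one simple precaution is to strip the `recruiter_*` fields (visible in the sample output) before storing results. A hypothetical sketch, not a legal safeguard in itself:

```python
# Field-name prefixes treated as personal data in this sketch.
PERSONAL_PREFIXES = ("recruiter_",)

def redact(item: dict) -> dict:
    """Return a copy of a scraped record with recruiter_* fields removed."""
    return {k: v for k, v in item.items()
            if not k.startswith(PERSONAL_PREFIXES)}

record = {
    "job_url": "https://uk.linkedin.com/jobs/view/example",
    "recruiter_name": "Jane Doe",
    "recruiter_profile": "https://www.linkedin.com/in/example",
}
clean = redact(record)
print(sorted(clean))  # ['job_url']
```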

Developer
Maintained by Community

Actor Metrics

  • 41 monthly users

  • 7 stars

  • >99% runs succeeded

  • 6.2 days response time

  • Created in Sep 2024

  • Modified 23 days ago

Categories