Download names from Google search with Node.js
Use Google Search Results Scraper to get names from Google search with Node.js. Want to grab names from Google search? Google Search Results Scraper makes it quick and easy. Just tell it what to download and you’ll have your Google search names available offline, whenever you want them.
1. Get an Apify account
You can’t get data out of the platform unless you’re signed in to it. So to get started, create an Apify account. It only takes a minute and it's free of charge.
2. Initialize the API using your token
After you’ve registered, it’s time to add your secret authentication token. You can find your API token on the Integrations page in Apify Console.
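If you’d rather not hard-code the token, one common pattern (not required by the client) is to keep it in an environment variable and read it when constructing the client. A minimal sketch; the variable name APIFY_TOKEN is just an illustrative choice:

import { ApifyClient } from 'apify-client';

// Read the token from an environment variable instead of hard-coding it.
// Start the script with, e.g.: APIFY_TOKEN=<YOUR_API_TOKEN> node main.mjs
const client = new ApifyClient({
    token: process.env.APIFY_TOKEN,
});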
3. Define your input and copy it as JSON
To get data from Google search, you first need to run Google Search Results Scraper to extract it. So let’s define a simple input and transfer it to your code. You can copy your input as JSON from the Google Search Results Scraper’s Input tab in Console.
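For reference, the copied input is a plain JSON object. A minimal example using only the fields that appear in the sample code below (queries separated by newlines, 100 results per page, one page per query):

{
    "queries": "javascript\ntypescript\npython",
    "resultsPerPage": 100,
    "maxPagesPerQuery": 1
}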
4. Integrate Apify into your codebase
Finally, call the Google Search Results Scraper from your Node.js project, either through the Apify client library or the API endpoints. You’ll be able to export scraped Google search data in no time by running the sample code below ↓.
5. Monitor your Google Search Results Scraper runs
Head over to our dashboard and see how Google Search Results Scraper runs are executed in real time. Here you can also download the run logs and keep an eye on the API’s performance.
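If you prefer to monitor runs from code rather than the dashboard, the same client can do that. A small sketch, assuming apify-client's runs().list() and log().get() methods and a token in the APIFY_TOKEN environment variable:

import { ApifyClient } from 'apify-client';

const client = new ApifyClient({ token: process.env.APIFY_TOKEN });

// List the most recent runs of the Actor, newest first.
const { items: runs } = await client
    .actor('apify/google-search-scraper')
    .runs()
    .list({ desc: true, limit: 5 });
runs.forEach((run) => console.log(run.id, run.status, run.startedAt));

// Download the full log of the latest run as plain text.
const log = await client.log(runs[0].id).get();
console.log(log);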
Get your Node.js project up and running
Add-on to step 4: start your Node.js project by executing this code snippet in your go-to environment.
import { ApifyClient } from 'apify-client';

// Initialize the ApifyClient with your Apify API token
// Replace the '<YOUR_API_TOKEN>' with your token
const client = new ApifyClient({
    token: '<YOUR_API_TOKEN>',
});

// Prepare Actor input
const input = {
    "queries": `javascript
typescript
python`,
    "resultsPerPage": 100,
    "maxPagesPerQuery": 1
};

// Run the Actor and wait for it to finish
const run = await client.actor("apify/google-search-scraper").call(input);

// Fetch and print Actor results from the run's dataset (if any)
console.log('Results from dataset');
console.log(`💾 Check your data here: https://console.apify.com/storage/datasets/${run.defaultDatasetId}`);
const { items } = await client.dataset(run.defaultDatasetId).listItems();
items.forEach((item) => {
    console.dir(item);
});

// 📚 Want to learn more 📖? Go to → https://docs.apify.com/api/client/js/docs
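Step 4 also mentions calling the API endpoints directly instead of using the client. Here is a rough sketch with Node's built-in fetch against Apify's run-sync-get-dataset-items endpoint, which starts the Actor, waits for it to finish, and returns the dataset items in a single request. The token and input are placeholders, and synchronous runs are limited to shorter jobs, so the client approach above is the safer default for large queries:

// Call the Actor over plain HTTP; Actor IDs in the URL use '~' instead of '/'.
const response = await fetch(
    'https://api.apify.com/v2/acts/apify~google-search-scraper/run-sync-get-dataset-items?token=<YOUR_API_TOKEN>',
    {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
            queries: 'javascript\ntypescript\npython',
            resultsPerPage: 100,
            maxPagesPerQuery: 1,
        }),
    },
);

const items = await response.json();
console.log(items);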
Enjoy $5 of free platform usage every month to explore and kickstart your projects.
Get started on Apify instantly without the hassle of entering your credit card information.
Join our Discord community to ask questions, share ideas, and connect with developers.
Google Search Results Scraper
apify/google-search-scraper
Scrape Google Search Engine Results Pages (SERPs). Select the country or language and extract organic and paid results, AI overviews, ads, queries, People Also Ask, prices, reviews, like a Google SERP API. Export scraped data, run the scraper via API, schedule runs, or integrate with other tools.
Google search Parser google maps / my business
saswave/google-search-parser-google-maps-my-business
Extract company info from the right-side box of Google search results (website, phone number, reviews, social accounts, address, etc.). Allows you to automate enrichment using queries that combine company name, website domain, and street address (complete or partial). Check the readme for some examples.
Google Search Scraper
epctex/google-search-scraper
Boost SEO with our tool! Explore organic & paid results, People Also Ask, and Related Queries. Select countries, languages, and precise locations for tailored insights.
Ready to start downloading Google search names?
You just need a free Apify account
Dataset Image Downloader & Uploader
lukaskrivka/images-download-upload
Download image files from image URLs in your datasets and save them to a Zip file, Key-Value store, or directly to your AWS S3 bucket.
Youtube Video and MP3 Downloader
easyapi/youtube-video-and-mp3-downloader
Easily download high-quality videos and audio from YouTube. With batch downloading capabilities and high-speed scraping, you can enhance your media collection effortlessly! 📹
API / JSON scraper
pocesar/json-downloader
Scrape any API / JSON URLs directly to the dataset, and return them in CSV, XML, HTML, or Excel formats. Transform and filter the output. Enables you to follow pagination recursively from the payload without the need to visit the HTML page.
Apify’s wide range of tools use a technique called web scraping to extract public data from websites. These scrapers access the website the same way as you would with a browser, find the image, video, or text you want, and download it for you. They’re a fast and efficient way to get data at scale.
Web scraping is a handy method for collecting information from various websites. It's like having a digital assistant that visits web pages on your behalf, pulling out the details you need such as prices, descriptions, addresses, and contact information. But it's more than just text; this tool can also download images and videos, making it a comprehensive way to gather content from the online world. It takes care of all the complex, technical parts, so you don't have to.
Web scraping is a method where you choose websites to collect specific content, including text, images, and videos. You begin by identifying the web pages that host the visual media you're interested in. Next, you use a web scraping tool tailored to locate the parts of the page containing the images or videos you want to download. Once the tool is set up and run, it navigates to the chosen web pages, identifies the images and videos, and downloads them for you. It's a streamlined way to gather pictures and videos from online sources without having to manually download each item.
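As a concrete (and deliberately generic) illustration of that process in Node.js, here is a small sketch using the cheerio library to parse a page's HTML and collect its image URLs. The URL and the img selector are placeholders for illustration only, not part of any Apify Actor:

import * as cheerio from 'cheerio';

// Fetch a page, parse its HTML, and collect the image URLs it contains.
const response = await fetch('https://example.com');
const html = await response.text();

const $ = cheerio.load(html);
const imageUrls = $('img')
    .map((_, el) => $(el).attr('src'))
    .get()
    .filter(Boolean);

console.log(imageUrls);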
Yes, web scraping is legal for gathering public information from websites. But be careful with personal or confidential data, as well as intellectual property, because laws and regulations might protect them. It's good practice to check the website's rules or terms of service to know what's allowed. If you're not sure, getting legal advice can help ensure you're using web scraping correctly and within the law.
Actors are serverless cloud programs that run on the Apify platform and do computing jobs. They’re called Actors because, like human actors, they perform actions based on a script. They can perform anything from simple actions (such as filling out a web form or sending an email) to complex operations (such as crawling an entire website or removing duplicates from a large dataset). Actor runs can be as short or as long as necessary. They could last seconds, hours, or even run infinitely.