Easy Google Maps Scraper


Collect data from Google Maps. Just enter what you want to scrape, e.g. restaurants, and where you want to search, e.g. New York. Free and easy web scraping of map data.


By Compass · 1,185 users · 4,846 runs


{
  "actorSpecification": 1,
  "name": "Easy Google Maps",
  "title": "Easy Google Maps",
  "description": "Scrapes a list of places from Google Maps based on a search term and location.",
  "version": "0.0.1",
  "storages": {
    "dataset": {
      "actorSpecification": 1,
      "title": "Easy Google Maps",
      "description": "To see all scraped properties, export the whole dataset or select All fields instead of Overview",
      "views": {
        "overview": {
          "title": "Overview",
          "description": "",
          "transformation": {
            "fields": [
              "imageUrls",
              "title",
              "address",
              "website",
              "phone",
              "totalScore",
              "rank",
              "reviewsCount",
              "isAdvertisement",
              "categoryName",
              "searchString",
              "url"
            ]
          },
          "display": {
            "component": "table",
            "columns": [
              { "label": "", "format": "image", "field": "imageUrls[0]" },
              { "label": "Title", "format": "text", "field": "title" },
              { "label": "Address", "format": "text", "field": "address" },
              { "label": "Website", "format": "link", "field": "website" },
              { "label": "Phone", "format": "text", "field": "phone" },
              { "label": "Total score", "format": "number", "field": "totalScore" },
              { "label": "Rank", "format": "number", "field": "rank" },
              { "label": "Reviews count", "format": "number", "field": "reviewsCount" },
              { "label": "Ad?", "format": "boolean", "field": "isAdvertisement" },
              { "label": "Category name", "format": "text", "field": "categoryName" },
              { "label": "Search string", "format": "text", "field": "searchString" },
              { "label": "Google Maps URL", "format": "link", "field": "url" }
            ]
          }
        }
      }
    }
  }
}


root = true

[*]
indent_style = space
indent_size = 4
charset = utf-8
trim_trailing_whitespace = true
insert_final_newline = true
end_of_line = lf


{
    "extends": "@apify"
}


# This file tells Git which files shouldn't be added to source control



# First, specify the base Docker image. You can read more about
# the available images at
# You can also use any other image from Docker Hub.
FROM apify/actor-node-puppeteer-chrome:14

# Second, copy just package.json and package-lock.json since they should be
# the only files that affect "npm install" in the next step, to speed up the build
COPY package*.json ./

# Install NPM packages, skipping optional and development dependencies to
# keep the image small. Avoid logging too much, and print the dependency
# tree for debugging.
RUN npm --quiet set progress=false \
 && npm install --only=prod --no-optional \
 && echo "Installed NPM packages:" \
 && (npm list --only=prod --no-optional --all || true) \
 && echo "Node.js version:" \
 && node --version \
 && echo "NPM version:" \
 && npm --version

# Next, copy the remaining files and directories with the source code.
# Since we do this after NPM install, quick build will be really fast
# for most source file changes.
COPY . ./

# Optionally, specify how to launch the source code of your actor.
# By default, Apify's base Docker images define the CMD instruction
# that runs the Node.js source code using the command specified
# in the "scripts.start" section of the package.json file.
# In short, the instruction looks something like this:
# CMD npm start


{
    "title": "Easy Google Maps Scraper",
    "description": "Just enter what you want to search for, where, and how many results you want to get. Then hit the RUN button at the bottom of the page.",
    "type": "object",
    "schemaVersion": 1,
    "properties": {
        "search": {
            "title": "What?",
            "type": "string",
            "description": "What do you want to search for?",
            "editor": "textfield",
            "prefill": "restaurant"
        },
        "location": {
            "title": "Where?",
            "type": "string",
            "description": "Where do you want to search? Be more specific if you seem to get the wrong location.",
            "editor": "textfield",
            "prefill": "New York"
        },
        "maxCrawledPlacesPerSearch": {
            "title": "How many places?",
            "type": "integer",
            "description": "The maximum number of places you want to find. The scraper is limited by what Google Maps shows, i.e. anything between 1 and 400 places.",
            "prefill": 10,
            "minimum": 1,
            "maximum": 400
        }
    },
    "required": [
        "search",
        "location"
    ]
}
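
For reference, a complete input object matching this schema (built from the prefill values above) looks like:

```json
{
    "search": "restaurant",
    "location": "New York",
    "maxCrawledPlacesPerSearch": 10
}
```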

# Easy Google Maps Scraper
Have you ever wanted to scrape information from Google Maps, but it seemed too difficult? Try our free Easy Google Maps Scraper!

Just enter what you want to scrape, e.g. restaurants, where to search, e.g. New York, and how many you need (maybe you just want to find 10 results and not 400).

Then hit the RUN button at the bottom of the page.

We've even prefilled it to get data on restaurants in New York, so just click that RUN button and see how easy it is!

P.S. It might take about 10 minutes, so just grab a cup of coffee while you let our robots do the work 🦾☕

## Need a more advanced Google Maps scraper?
If you need more detailed data from Google Maps, try our [Google Maps Scraper]( It's a little more complicated to set up, but you can get almost any data from Google Maps.

## How can you use Google Maps data?
So what can you do with the Google Maps data you collect by web scraping? Here are just some ideas:
- Create a potential customer base.
- Search, monitor, and analyze your competitors.
- Find where you can buy a specific product and choose the best option out of the pool of results.
- Analyze geospatial data for scientific or engineering work.
- Find opportunities for expanding your business or organization and develop a working market strategy.

For more inspiration on how to use the extracted Google Maps data, check out our [industries pages]( See how web scraping results are already being used by companies of all sizes, including [e-commerce and retail](, [real estate](, and the [travel industry](

## How much will it cost me?
Apify provides you with $5 in free usage credits every month on the [Apify Free plan]( and you can get up to 2,000 results from this Easy Google Maps Scraper for that $5. So it's completely free for up to 2,000 results!

But if you need to get more data regularly, you should grab an Apify subscription. We recommend our [$49/month Personal plan]( - you can get up to 20,000 Google Maps results every month with the $49 in monthly usage credits included in that plan!
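
The arithmetic behind those numbers is straightforward; this quick sanity check is derived from the figures in the text above, not an official per-result rate:

```javascript
// Derived from the pricing text: 2,000 results for $5 of usage credits.
const costPerResult = 5 / 2000; // $0.0025 per place

// The $49 Personal plan's monthly credits then cover roughly:
const resultsOnPersonal = Math.round(49 / costPerResult);

console.log(costPerResult, resultsOnPersonal); // 0.0025 19600
```

The result rounds up to the "up to 20,000 results" figure quoted for the Personal plan.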

## Integrations and Google Maps Scraper
Last but not least, Google Maps Scraper can be connected with almost any cloud service or web app thanks to <a href="" target="_blank">integrations on the Apify platform</a>. You can integrate with Make, Zapier, Slack, Airbyte, GitHub, Google Sheets, Google Drive, <a href="" target="_blank">and more</a>. Or you can use <a href="" target="_blank">webhooks</a> to carry out an action whenever an event occurs, e.g. get a notification whenever Google Maps Scraper successfully finishes a run.

## Using Google Maps Scraper with the Apify API
The Apify API gives you programmatic access to the Apify platform. The API is organized around RESTful HTTP endpoints that enable you to manage, schedule, and run Apify actors. The API also lets you access any datasets, monitor actor performance, fetch results, create and update versions, and more.

To access the API using Node.js, use the apify-client NPM package. To access the API using Python, use the apify-client PyPI package.

Check out the <a href="" target="_blank">Apify API reference</a> docs for full details or click on the <a href="" target="_blank">API tab</a> for code examples.
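
As a minimal sketch, this is how a run of this actor could be started over the REST API. The actor ID path (`compass~easy-google-maps`) and the token placeholder are assumptions, since the link targets above were stripped from this page:

```javascript
// Build the request for Apify's "run actor" endpoint (POST /v2/acts/{actorId}/runs).
// The actor ID below is an assumption, not taken from this page.
const base = 'https://api.apify.com/v2';
const actorId = 'compass~easy-google-maps';

// The run input mirrors the actor's input schema.
const input = {
    search: 'restaurant',
    location: 'New York',
    maxCrawledPlacesPerSearch: 10,
};

// POSTing the input as a JSON body to this URL starts a run; the scraped
// places then land in the run's default dataset.
const runUrl = `${base}/acts/${actorId}/runs?token=<YOUR_API_TOKEN>`;
console.log(runUrl);
console.log(JSON.stringify(input));
```

In practice you would let the apify-client NPM or PyPI package build these requests for you instead of hand-rolling URLs.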


{
    "env": { "npm_config_loglevel": "silent" }
}


import Apify from 'apify';

Apify.main(async () => {
    const input = await Apify.getInput();
    // Combine the two text inputs into one query, e.g. "restaurant in New York"
    const searchStringsArray = [`${} in ${input.location}`];
    // Hand the run over to the full Google Maps Scraper with this input
    await Apify.metamorph('drobnikj/crawler-google-places', {
        searchStringsArray,
        maxCrawledPlacesPerSearch: input.maxCrawledPlacesPerSearch,
        proxyConfig: { useApifyProxy: true },
    });
});
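
The only real transformation the actor performs on its input is combining the two text fields into a single search query before metamorphing. As a standalone sketch (the function name here is illustrative, not from the actor's source):

```javascript
// Mirrors the template string used in main.js above.
function buildSearchString(search, location) {
    return `${search} in ${location}`;
}

console.log(buildSearchString('restaurant', 'New York')); // → "restaurant in New York"
```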


{
    "name": "easy-google-maps",
    "version": "0.0.1",
    "description": "An actor with easy inputs for Google places",
    "type": "module",
    "dependencies": {
        "apify": "^2.0.7"
    },
    "devDependencies": {
        "@apify/eslint-config": "^0.1.3",
        "eslint": "^7.0.0"
    },
    "scripts": {
        "start": "node main.js",
        "lint": "./node_modules/.bin/eslint ./src --ext .js,.jsx",
        "lint:fix": "./node_modules/.bin/eslint ./src --ext .js,.jsx --fix",
        "test": "echo \"Error: oops, the actor has no tests yet, sad!\" && exit 1"
    },
    "author": "Zuzka",
    "license": "ISC"
}