Restaurant Reviews Bundle

tri_angle/restaurant-reviews-bundle

Pay $3.00 for 1,000 Reviews
tri_angle developed a unified output schema containing metadata and review details. Our solution uses Google search queries with restaurant names and geo-coordinates to obtain the correct review URLs, reducing manual effort. It gathers reviews from Google Maps, DoorDash, Uber Eats, Yelp, Tripadvisor & Facebook.

Restaurant reviews scraper

Scrapes reviews for restaurants on:

  • Google Maps
  • Tripadvisor
  • Yelp
  • Facebook

How it works, in short

First, it scrapes places from Google Maps according to the Actor's input.

Google Maps is the source of truth: each review in the output refers to a place that was found on Google Maps, at minimum.

Then, the scraper also searches the other providers for the returned places and their reviews.
The procedure that scrapes reviews for one provider is called a pipeline, e.g., the Facebook pipeline.
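The flow above can be sketched as follows (a minimal illustration; the function names and stub pipeline are assumptions based on the output schema, not the Actor's real internals):

```python
# Minimal sketch of the per-provider pipeline flow (illustrative only;
# the real Actor's internals are not published).
def run_pipelines(google_maps_places, pipelines):
    """Run each provider pipeline over the places found on Google Maps."""
    reviews = []
    for place in google_maps_places:
        for provider, scrape_reviews in pipelines.items():
            for review in scrape_reviews(place):
                # Every review keeps a reference to its Google Maps place.
                review["googleMapsPlaceId"] = place["googleMapsPlaceId"]
                review["provider"] = provider
                reviews.append(review)
    return reviews

# Stub "yelp" pipeline standing in for a real scraper:
places = [{"googleMapsPlaceId": "abcd12345678", "placeName": "The Place"}]
pipelines = {"yelp": lambda place: [{"reviewText": "Great!", "reviewRating": 5}]}
result = run_pipelines(places, pipelines)
print(result[0]["provider"])  # yelp
```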

Facebook

Scraping reviews on Facebook involves a peculiar approach:

  • the Facebook pages for each place found on Google Maps are searched on Google Search;
  • the resulting URLs are scraped to get some information from each page, such as the place's address;
  • the addresses from Google Maps and the Facebook pages are geocoded, searching for their coordinates on Google Maps, and compared;
  • the matching pages are eventually scraped for reviews.
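The geocode-comparison step can be approximated with a distance check between the two sets of coordinates (a sketch only: the 200 m threshold and the function names are assumptions, not the Actor's actual matching logic):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def addresses_match(gmaps_coords, facebook_coords, max_distance_m=200):
    """Treat the two geocoded addresses as the same place if they are close."""
    return haversine_m(*gmaps_coords, *facebook_coords) <= max_distance_m

print(addresses_match((50.0755, 14.4378), (50.0756, 14.4380)))  # True
print(addresses_match((50.0755, 14.4378), (48.2082, 16.3738)))  # False
```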

Output

The information about each review in the output Dataset comes from the review itself or, when a provider does not supply that specific piece of information, from the initial Google Maps result.

Places are uniquely identified through the googleMapsPlaceId.
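Because the place ID together with the provider and review ID identifies a review, duplicates across runs can be dropped with a simple key check (`dedupe_reviews` is a hypothetical helper using the field names from the output sample, not part of the Actor):

```python
def dedupe_reviews(reviews):
    """Keep the first occurrence of each review, identified by its
    Google Maps place ID, provider, and review ID."""
    seen = set()
    unique = []
    for review in reviews:
        key = (review["googleMapsPlaceId"], review["provider"], review["reviewId"])
        if key not in seen:
            seen.add(key)
            unique.append(review)
    return unique

items = [
    {"googleMapsPlaceId": "abcd12345678", "provider": "yelp", "reviewId": "abcd9876"},
    {"googleMapsPlaceId": "abcd12345678", "provider": "yelp", "reviewId": "abcd9876"},
    {"googleMapsPlaceId": "abcd12345678", "provider": "facebook", "reviewId": "xyz1"},
]
print(len(dedupe_reviews(items)))  # 2
```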

Output sample

The extracted reviews are stored in a dataset, which you can find in the Output tab.

You can preview all the fields in the Storage and Output tabs and choose the format in which to export the review data you've extracted: JSON, CSV, Excel, or HTML table. Below is a sample dataset item in JSON:

{
    "googleMapsPlaceId": "abcd12345678",
    "placeName": "The Place",
    "placeAlternateNames": [
        "The Nice Place"
    ],
    "placeUrl": "https://www.theproviderwebsite.com/places/abcd1234",
    "placeAddress": "This Way, 13, That City",
    "provider": "the-provider",
    "reviewId": "abcd9876",
    "reviewUrl": "https://www.theproviderwebsite.com/reviews/abcd9876",
    "reviewTitle": "Great!",
    "reviewText": "Unbelievable!",
    "reviewDate": "2023-12-31T10:10:10.010Z",
    "reviewRating": 5,
    "authorName": "Jon Doe"
}
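A quick sanity check on exported JSON items can catch missing fields before further processing (the set of required fields below is an assumption drawn from the sample item, not a published schema):

```python
import json

# Fields assumed mandatory based on the sample item above; adjust as needed.
REQUIRED_FIELDS = {"googleMapsPlaceId", "placeName", "provider",
                   "reviewId", "reviewDate", "reviewRating"}

def missing_fields(item):
    """Return the required fields absent from a dataset item."""
    return REQUIRED_FIELDS - item.keys()

item = json.loads('{"googleMapsPlaceId": "abcd12345678", "placeName": "The Place", '
                  '"provider": "the-provider", "reviewId": "abcd9876", '
                  '"reviewDate": "2023-12-31T10:10:10.010Z", "reviewRating": 5}')
print(missing_fields(item))  # set()
```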

Debug

Some additional information is saved in the KeyValueStore, for instance:

  • the external Actors' run IDs;
  • the places scraped from each provider;
  • the addresses' geocodes.

Caveats

  • Dates are stored in ISO 8601 format: since some providers specify only the date of a review (not the time), some reviews may report a time of 00:00:00.
  • Scraping the Facebook reviews is a bit flaky. The main weak points are:
    • Searching for Facebook pages on Google Search: sometimes, the results for one query change across runs.
    • Geocoding the Facebook pages' addresses: they can be incorrect or poorly written, which may prevent a match from being found.
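Following the first caveat, a midnight timestamp can serve as a heuristic for date-only reviews (`has_exact_time` is a hypothetical helper, not part of the Actor):

```python
from datetime import datetime

def has_exact_time(review_date_iso):
    """Heuristic from the caveat above: a time of exactly 00:00:00
    usually means the provider supplied only a date."""
    dt = datetime.fromisoformat(review_date_iso.replace("Z", "+00:00"))
    return (dt.hour, dt.minute, dt.second, dt.microsecond) != (0, 0, 0, 0)

print(has_exact_time("2023-12-31T10:10:10.010Z"))  # True
print(has_exact_time("2023-12-31T00:00:00.000Z"))  # False
```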
Developer
Maintained by Apify
Actor metrics
  • 10 monthly users
  • 1 star
  • 91.5% runs succeeded
  • Created in Apr 2024
  • Modified 10 days ago