
Zillow Detail Scraper
Get details of Zillow properties from URLs. This Actor can be easily integrated with other Zillow Scrapers.
Rating: 4.3 (10)
Pricing: $3.00 / 1,000 results
Monthly users: 73
Runs succeeded: 99%
Response time: 5.6 days
Last modified: 2 months ago

For Sale by Owner Details
Open
Can you please explain in simple, non-technical terms how I can set up a saved task that uses the Zillow Search Scraper + Zillow Detail Scraper to extract owner information (name and phone number) for For Sale by Owner listings? I've read through other comments and tried to make it work, but it's just too confusing. Certain comments talk about unwinding objects etc., but is there not a better way? I just want to be able to set up the scraper so that the CSV export contains the property address, property city, property state, property zip code, owner name, owner phone number, asking price, and Zestimate. Can you tell me how this can be accomplished so that once I run the task I can export these data fields as a CSV? I'm downloading data every day, so I figure there must be a simpler way to get all the useful data columns without a bunch of extra steps each time I need to export data. Thanks in advance!

Hi, thanks for opening this issue!
Sounds like you've already figured out how to connect the two scrapers together? If not, simply set up a Zillow Search Scraper
-> maxcopell/zillow-detail-scraper
Actor-to-Actor integration. Both scrapers are ready to be integrated, and the URLs from the first one are automatically used by the second one :)

For the export of the data, CSV is a very limiting file format with many downsides. One of them is that it's flat and static rather than dynamic, so the nested data we scrape from Zillow doesn't map onto it cleanly. As you've mentioned already, you will have to unwind the nested objects or select the fields you need directly.
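To make the field selection a bit more concrete, here is a rough sketch of a CSV export call against the Apify dataset items endpoint using the format, fields, and flatten query parameters. The dataset ID, the token, and the exact output field names (address, price, zestimate) are placeholders and assumptions on my side, so please check one JSON item from your own run first:

```typescript
// Rough sketch: export only selected columns of a dataset as CSV.
// DATASET_ID and APIFY_TOKEN are placeholders; the field names below are
// assumptions about the output keys -- verify them against one JSON item.
const DATASET_ID = "<dataset id of your run>";
const APIFY_TOKEN = "<your API token>";

const params = new URLSearchParams({
  token: APIFY_TOKEN,
  format: "csv",
  // Keep only the columns you need in the CSV.
  fields: ["address", "price", "zestimate"].join(","),
  // Flatten the nested address object so its sub-fields (street, city,
  // state, zip code) become separate CSV columns instead of one JSON blob.
  flatten: "address",
});

const url = `https://api.apify.com/v2/datasets/${DATASET_ID}/items?${params}`;
const res = await fetch(url);
console.log(await res.text()); // CSV text, ready to save or paste into a sheet
```

If fields and flatten don't combine the way you expect, drop the fields parameter and simply hide the unused columns in your spreadsheet; the same query parameters should also work when appended to the dataset export URL you get from the Console.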
What exact fields are you trying to select? Maybe I can help you out with that :)
Thanks!
Kelsify
It sounds like CSV has some limitations so let’s take this in a different direction. I created a Google Sheets script that pulls the data from the last successful run of the saved task via the saved task API. As mentioned, I have a saved task that is the Zillow Search Scraper + Zillow Details Scraper. When I run the script, I thought the API would pull in the data from the Zillow Details Scraper as well since it is an integration within the saved task but it only returns the default Zillow Search Scraper details.
How do I make it so that when I use the endpoint it imports all of the details from the last run of the saved task (Zillow Search Scraper + Zillow Detail Scraper)? I got the API URL by clicking on the saved task, then clicking "API" in the upper right-hand corner, and I'm using the "Get last run dataset items" endpoint. The goal is to return ALL scraped details, including the property owner name and phone number.
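For reference, this is roughly what my script requests (simplified to a plain fetch here; the task ID and token are placeholders), and it only ever comes back with the Search Scraper columns:

```typescript
// Roughly what my Google Sheets script calls -- the "Get last run dataset items"
// endpoint copied from the saved task's API page. TASK_ID and APIFY_TOKEN are
// placeholders for my saved task ID and API token.
const TASK_ID = "<saved task id>";
const APIFY_TOKEN = "<my API token>";

const url =
  `https://api.apify.com/v2/actor-tasks/${TASK_ID}/runs/last/dataset/items` +
  `?token=${APIFY_TOKEN}&format=json&status=SUCCEEDED`;

const res = await fetch(url);
const items = await res.json();
// Only the Zillow Search Scraper fields show up -- no owner name or phone number.
console.log(items.length, Object.keys(items[0] ?? {}));
```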

Right, that might be a bit problematic. Honestly, I'm no expert in this subject (especially the Google Sheets part) and wouldn't dare to go too much in depth, but I can hopefully at least point you in the general direction :P Also, a spreadsheet format is not much better than good old CSV, but whatever, it should get the job done. Ideally, you would use a modern, dynamic format like JSON instead.
The problem with calling it from the Google Sheet is that only the data from the first run is saved to the task's dataset; the second Actor is invoked "separately" and writes to its own dataset. I would suggest using the Google Sheets Import & Export utility Actor to upload your data to the sheet and modify it there. You should be able to pair everything up into an Actor-to-Actor integration chain like so (a rough sketch of the final step follows below):
Zillow Search Scraper
-> maxcopell/zillow-detail-scraper
-> lukaskrivka/google-sheets
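In case a concrete example of that last step helps, here is a rough sketch of starting the Google Sheets Actor through the API with the Detail Scraper's dataset. I'm writing the input from memory, so treat the field names (mode, spreadsheetId, datasetId) as assumptions and double-check them against the Actor's input schema in the Console. With the Actor-to-Actor integration in place you shouldn't need to call this yourself; it just shows roughly what the integration does for you:

```typescript
// Rough sketch only: start the Google Sheets import Actor for one dataset.
// The input field names are assumptions -- verify them in the Actor's input
// schema before relying on this.
const APIFY_TOKEN = "<your API token>";

const input = {
  mode: "replace",                                      // assumed: overwrite the sheet on each run
  spreadsheetId: "<your Google spreadsheet id>",        // assumed field name
  datasetId: "<dataset id of the Detail Scraper run>",  // assumed field name
};

// Standard "Run Actor" endpoint; the Actor ID uses ~ instead of / in the URL.
const res = await fetch(
  `https://api.apify.com/v2/acts/lukaskrivka~google-sheets/runs?token=${APIFY_TOKEN}`,
  {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(input),
  }
);
const run = await res.json();
console.log(run.data.id); // run ID of the sheet import
```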
Let me know if this helps, thanks!
Pricing
Pricing model: Pay per result
This Actor is paid per result. You are not charged for Apify platform usage; you pay only a fixed price per 1,000 items in the Actor's output.
Price per 1,000 items: $3.00