
Rightmove Scraper
Scrape rightmove.co.uk to crawl millions of for-sale and rental property listings from the United Kingdom. Our real estate scraper also lets you monitor specific listings for new updates and new listings. You can provide multiple search result URLs to scrape or monitor.
Rating: 3.0 (1)
Pricing: Pay per event
Total users: 348
Monthly users: 66
Runs succeeded: >99%
Issues response: 2.2 hours
Last modified: 3 hours ago
Inconsistent Key-Value Store Writes for Delisting Tracker
Closed
I am using the enableDelistingTracker: true feature to identify properties that have been removed from Rightmove. My process involves scraping properties and then using a separate workflow to check the lastSeen timestamp in the Key-Value store (https://api.apify.com/v2/key-value-stores/[my_username]~rightmove-properties/records/163235030).
Expected Behavior: For every property successfully scraped by the actor, a corresponding key (the property ID) with a lastSeen timestamp is created or updated in the Key-Value store.
Actual Behavior: The Key-Value store is not being populated reliably. For many actor runs, entire batches of properties from specific postcodes are scraped successfully (the data appears in the dataset and is sent to my webhook), but their keys are never written to the KV store.
This causes my downstream "delisting check" workflow to fail with a 404 "Record was not found" error when it tries to look up the lastSeen value for a property that should exist in the store. The issue seems intermittent; sometimes a re-run of the same postcode will work, but often it does not.
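For reference, my delisting check boils down to something like the sketch below. The store address follows the URL above; the assumption that each record is JSON with a lastSeen field matches my setup, but the exact record shape is mine, not the actor's documented schema.

// Minimal sketch of the downstream delisting check; <username> is a placeholder.
const STORE = "https://api.apify.com/v2/key-value-stores/<username>~rightmove-properties/records";

async function getLastSeen(propertyId: string): Promise<string | null> {
  const res = await fetch(`${STORE}/${propertyId}`);
  if (res.status === 404) return null; // "Record was not found": the failure described above
  if (!res.ok) throw new Error(`Unexpected status ${res.status}`);
  const record = await res.json(); // assumes the record body is JSON with a lastSeen field
  return record.lastSeen;
}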
Steps to Reproduce:
- Enable enableDelistingTracker: true in the actor input.
- Run a scrape using a search URL like the one below.
- After the run completes, check the actor's default Key-Value store.
- Observe that many of the property IDs from the run's dataset are missing from the KV store (a quick way to check this is sketched after the example URL below).
Here is an example of a search URL that consistently produces this issue: https://www.rightmove.co.uk/property-for-sale/find.html?searchLocation=E18+1JW&useLocationIdentifier=true&locationIdentifier=POSTCODE%5E1474039&buy=For+sale&radius=0.0&_includeSSTC=on
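To make that last step concrete, here is a rough way to diff a run's dataset against the store's keys via the Apify REST API. The dataset field name id is an assumption about this actor's output schema, and key pagination is omitted for brevity (the keys endpoint returns at most 1000 keys per page).

// Hedged sketch: list property IDs present in the run's dataset but absent
// from the KV store. Assumes dataset items expose the Rightmove ID as `id`.
const BASE = "https://api.apify.com/v2";

async function findMissingKeys(datasetId: string, storeId: string): Promise<string[]> {
  const items: { id: number }[] = await (await fetch(`${BASE}/datasets/${datasetId}/items`)).json();
  const keys = await (await fetch(`${BASE}/key-value-stores/${storeId}/keys`)).json();
  const present = new Set<string>(keys.data.items.map((k: { key: string }) => k.key));
  return items.map((it) => String(it.id)).filter((id) => !present.has(id));
}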
Thank you for looking into this!

Hi,
First of all, I'm glad to hear you're using the delisting tracker! I developed that feature and documented it thoroughly, but very few subscribers use it or even know they can make use of it.
I tried to reproduce the problem, but my test ran fine. One possible gotcha I can see: whenever you use the delisting tracker, monitoring mode must also be enabled. If you fail to do so, nothing is written to the store. I have just published a new version that forces the scraper to write to the store whenever the delisting tracker is enabled, even if monitoring is disabled.
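In pseudocode terms, the behavior change in the new version is roughly the following; this is a sketch of the described fix, not the actor's actual source:

// Example run input, not a real run:
const input = { enableDelistingTracker: true, monitoringMode: false };
// Before the fix: tracker records were written only while monitoring was on.
// const shouldWriteTracker = input.enableDelistingTracker && input.monitoringMode;
// After the fix: enabling the delisting tracker alone is enough.
const shouldWriteTracker = input.enableDelistingTracker;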
Please let me know if the error is still reproducible.
Test steps carried out:
1 - Input with the following configuration:
{
  "addEmptyTrackerRecord": false,
  "deduplicateAtTaskLevel": true,
  "enableDelistingTracker": true,
  "fullPropertyDetails": true,
  "includeNearestSchools": false,
  "includePriceHistory": true,
  "listUrls": [
    {
      "url": "https://www.rightmove.co.uk/property-for-sale/find.html?searchLocation=AB16&useLocationIdentifier=true&locationIdentifier=OUTCODE%5E7&radius=0.0&_includeSSTC=on",
      "method": "GET"
    }
  ],
  "maxProperties": 50,
  "monitoringMode": true,
  "proxy": { "useApifyProxy": false }
}
2 - deduplicateAtTaskLevel has been turned on intentionally so that you get a new, clean key-value store for this particular run to verify against. It creates a new store with the same name as the task ID.
3 - With the above configuration, the dataset had 13 results and the store had 13 keys.
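If you want to repeat that check on your own runs, something like this should work, assuming the per-task store can be addressed as <username>~<storeName> just like the default store above:

// Hedged verification sketch for step 3: the dataset's item count should equal
// the number of keys in the run's key-value store. keys.data.count covers the
// first page only (up to 1000 keys), which is fine for small stores like this.
async function countsMatch(datasetId: string, storeAddr: string): Promise<boolean> {
  const BASE = "https://api.apify.com/v2";
  const ds = await (await fetch(`${BASE}/datasets/${datasetId}`)).json();
  const keys = await (await fetch(`${BASE}/key-value-stores/${storeAddr}/keys`)).json();
  return ds.data.itemCount === keys.data.count;
}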
Feel free to let me know if you run into any other issues. This is a public thread, so if you want to share a run URL for me to look into the data, DM me: dhrumil@techvasu.com
VDAC
Hi Dhrumil, thank you for looking into this and deploying a new version. For more context, I'm using an adaptable HTTP node in n8n that, for the first run of a postcode, turns full scraping on (as well as price history for sales listings); on any runs after that it turns off full scraping and turns on monitoring mode.
"{
"listUrls": [
{
"url": "{{ $json.search_url.trim() }}"
}
],
"fullScrape": true,
"monitoringMode": {{ $json.monitoringMode }},
"fullPropertyDetails" : true,
"includePriceHistory": {{ $json.includePriceHistory }},
"enableDelistingTracker" : true,
"addEmptyTrackerRecord" : false,
"deduplicateAtTaskLevel" : true,
"maxProperties" : 1000
}"
I'll wipe my data from the actor as well as the key-value store and run all the scraping again to see whether the same issue occurs; I'll keep you posted.