Propertylink Estatesgazette Scraper

dhrumil/propertylink-estatesgazette-scraper

3 days trial then $30.00/month - No credit card required now
Scrape propertylink.estatesgazette.com to crawl millions of sale and rental real estate properties from the United Kingdom. Our real estate scraper also lets you monitor specific listings for new additions. You can provide multiple search result listings to scrape or monitor.

This Propertylink Estatesgazette Properties Scraper enables you to scrape any sale or rental listing collection from propertylink.estatesgazette.com.

Simply take your listing URL from the browser and enter it into this actor. The actor will crawl through all pages of that listing and generate a dataset for you.

A listing URL is the URL you get after performing a search on the propertylink.estatesgazette.com site (see the listUrls entry in the input example below).

📈 Extract real estate market data listings from Propertylink

👀 This actor is not just a scraper: it also has monitoring capability. You can turn on monitoring mode, and it will return only properties newly added since your previous scrapes.

📩 This actor also helps you identify which properties are no longer listed. Please refer to Identifying delisted properties below.

⬇️ Download Propertylink real estate data in Excel, CSV, JSON, and other formats

📚 How do I start scraping with this scraper?

  1. Register for your free Apify account here
  2. You don't need to provide credit card details for the free account. Just click the "Get Started" button at the link above and complete the registration.
  3. The free account comes with reasonable credits to try out this actor, and the actor itself comes with a free 3-day trial with no commitment or upfront charge.
  4. Run this actor and verify the scraped data. Apify offers many integration possibilities: you can download the data or push it directly to any third-party platform.
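Beyond the web console, the actor can also be run programmatically. The sketch below uses the apify-client Python package (`pip install apify-client`); the actor slug comes from this page, and the payload field names come from the input example documented below. The helper and function names are illustrative, not part of the actor itself.

```python
def build_input(list_urls, monitoring=False):
    """Assemble an actor input payload; field names follow the input example below."""
    return {
        "listUrls": [{"url": u} for u in list_urls],
        "fullScrape": not monitoring,
        "monitoringMode": monitoring,
    }


def run_scraper(api_token, list_urls):
    # Imported lazily so build_input() works without the package installed.
    from apify_client import ApifyClient

    client = ApifyClient(api_token)
    run = client.actor("dhrumil/propertylink-estatesgazette-scraper").call(
        run_input=build_input(list_urls)
    )
    # Each dataset item is one scraped property (see the Output section).
    return list(client.dataset(run["defaultDatasetId"]).iterate_items())
```

Calling `run_scraper(token, ["https://propertylink.estatesgazette.com/commercial-property-for-rent/sw2"])` waits for the run to finish and returns the scraped properties as a list of dicts.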
📝 Scraped fields

  • Listing Title
  • Agent Company
  • Listing URL
  • Deposit
  • Agent Name
  • Agent Phone
  • Listing Type
  • Property Type
  • Latitude
  • Longitude
  • Text Description
  • Formatted HTML Description
  • Website
  • Images
  • Price
  • Size
  • Listing Date
  • Tenure

⬇️ Input

For the simple use case, you just need to provide the browser URL of a Propertylink search result page, and that's all. You can leave the other fields at their sensible defaults.

Input example

{
    "listUrls": [
        {
            "url": "https://propertylink.estatesgazette.com/commercial-property-for-rent/sw2"
        }
    ],
    "propertyUrls": [
        {
            "url": "https://propertylink.estatesgazette.com/property-details/6925105-office-space-to-let-on-lyham-road-brixton-london-sw2"
        }
    ],
    "fullScrape": true,
    "monitoringMode": false,
    "enableDelistingTracker": false,
    "addEmptyTrackerRecord": false
}

You can either provide listUrls to search properties from, or provide propertyUrls to crawl directly.

Understanding monitoring mode:

  • fullScrape : Turned on by default. When enabled, it forces the actor to scrape the complete listing across all pagination pages, regardless of whether monitoring is enabled.

  • monitoringMode : When turned on, the actor scrapes only property listings added since its previous runs. It's important to turn off fullScrape when using this mode; if fullScrape stays on, the complete listing is re-scraped every time.

  • enableDelistingTracker : When turned on, the actor tracks a date against each property in an Apify key-value store. This store can be queried later to find out which properties have been delisted.

  • addEmptyTrackerRecord : When turned on, the actor adds an empty record containing only the property id to the Apify dataset. In incremental mode, this helps you check whether a property is still listed, compared against your own database.
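Conceptually, monitoring mode behaves like a set difference between IDs seen in earlier runs and IDs found in the current crawl. A minimal sketch under that assumption (the function and variable names are illustrative, not the actor's internals):

```python
def new_properties(current_batch, previously_seen_ids):
    """Return only the properties not seen in earlier runs (monitoringMode behaviour)."""
    return [p for p in current_batch if p["id"] not in previously_seen_ids]


# Illustrative data: two IDs known from earlier runs, one new listing in the current crawl.
seen = {"6925105", "6927980"}
batch = [
    {"id": "6927980", "title": "123 Victoria Street, SW1E"},
    {"id": "7000001", "title": "New office space, SW2"},
]

fresh = new_properties(batch, seen)  # only the previously unseen listing survives
```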

⬆️ Output

The scraped data is stored in the dataset of each run. The data can be viewed or downloaded in many popular formats, such as JSON, CSV, Excel, XML, RSS, and HTML.

Output example

The result of scraping a single property looks like this:

{
  "title": "123 Victoria Street, SW1E - Fully Serviced Managed Office",
  "id": "6927980",
  "countryCode": "GB",
  "agentName": "Julie Kold",
  "agentPhone": "0808 304 8575",
  "agentCompany": "Office Freedom Central London",
  "agentCompanyLogo": "https://propertylink.estatesgazette.com/find-a-service/results?hq_id=6909943&show_properties=1",
  "coordinates": {
    "latitude": "51.4968039",
    "longitude": "-0.1384018"
  },
  "images": [
    "https://propertylinkassets.estatesgazette.com/images/20230731/1-115555880lenormous-1900x0.jpg",
    "https://propertylinkassets.estatesgazette.com/images/20230731/1-115555881lenormous-1900x0.jpg",
    "https://propertylinkassets.estatesgazette.com/images/20230731/1-115555882lenormous-1900x0.jpg",
    "https://propertylinkassets.estatesgazette.com/images/20230731/1-115555883lenormous-1900x0.jpg",
    "https://propertylinkassets.estatesgazette.com/images/20230731/1-115555884lenormous-1900x0.jpg",
    "https://propertylinkassets.estatesgazette.com/images/20230731/1-115555885lenormous-1900x0.jpg",
    "https://propertylinkassets.estatesgazette.com/images/20230731/1-115555886lenormous-1900x0.jpg",
    "https://propertylinkassets.estatesgazette.com/images/20230731/1-115555887lenormous-1900x0.jpg"
  ],
  "description": "Reference: 28501Fully inclusive serviced office. Facilities include 24 Hour AccessAccess to other nationwide centresBike StorageBreakout SpaceConciergeDDA Compliant SummaryBoasting a high-design, high-service atmosphere and a range of flexible office spaces, this business centre in Victoria is a wonderful choice for any growing business. A range of features and facilities are provided to tenants, including a professional concierge service, access to boardrooms and full time IT support.Office DescriptionAn excellent range of services and facilities make this business centre ideal for any company, with a concierge service and front desk reception team ensuring a smooth work day and great first impression, whilst offices are well equipped for business with the latest business technologies. Meeting and event space can be hired by the hour.Location DescriptionFive minutes' walk from Victoria station, this location is one of London's busiest transport hubs, making it easily accessible from all over the city and surrounding areas. Victoria Coach station even offers bus services all over the country. There are excellent local amenities, including theatres and top-end restaurants.Website for this property",
  "descriptionHtml": "<div class=\"mb-4\">Reference: 28501<br><br>Fully inclusive serviced office. <br><br>Facilities include <br>24 Hour Access<br>Access to other nationwide centres<br>Bike Storage<br>Breakout Space<br>Concierge<br>DDA Compliant <br><br><p><strong>Summary</strong></p><br>Boasting a high-design, high-service atmosphere and a range of flexible office spaces, this business centre in Victoria is a wonderful choice for any growing business. A range of features and facilities are provided to tenants, including a professional concierge service, access to boardrooms and full time IT support.<br><br><p><strong>Office Description</strong></p><br>An excellent range of services and facilities make this business centre ideal for any company, with a concierge service and front desk reception team ensuring a smooth work day and great first impression, whilst offices are well equipped for business with the latest business technologies. Meeting and event space can be hired by the hour.<br><br><p><strong>Location Description</strong></p><br>Five minutes' walk from Victoria station, this location is one of London's busiest transport hubs, making it easily accessible from all over the city and surrounding areas. Victoria Coach station even offers bus services all over the country. There are excellent local amenities, including theatres and top-end restaurants.</div><div class=\"mb-4\"><a target=\"_blank\" class=\"js-arxa-item-lead\" data-item-impact=\"link\" data-item-id=\"6927980\" href=\"https://officefreedom.com/uk/london/victoria/office-space/victoria-street-victoria-sw1e-ref-28501\">Website for this property</a></div>",
  "websiteOfProperty": "https://officefreedom.com/uk/london/victoria/office-space/victoria-street-victoria-sw1e-ref-28501",
  "tenure": "To Let",
  "url": "https://propertylink.estatesgazette.com/property-details/6927980-123-victoria-street-sw1e-fully-serviced-managed-office",
  "rentPrice": "£750.00   pcm (from) per desk - all inclusive",
  "sizeSqFeet": "100 - 14,000 Sq Ft",
  "displayAddress": "Victoria Street, Victoria, SW1E 6RA",
  "addedOn": "1st August 2023",
  "propertyType": "Serviced Office, Offices",
  "type": "rent"
}

❓Limitations

Propertylink returns at most 1,000 properties per listing/search result, so if a search has more than 1,000 results, break it down into listing URLs covering smaller areas. The good news is that even if multiple list URLs contain overlapping results, they are deduplicated within the same run's data.
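The within-run deduplication described above amounts to keeping the first occurrence of each property id across all list URLs. A sketch under that assumption (function and variable names are illustrative):

```python
def dedupe_by_id(items):
    """Keep the first occurrence of each property id across all list URLs."""
    seen = set()
    unique = []
    for item in items:
        if item["id"] not in seen:
            seen.add(item["id"])
            unique.append(item)
    return unique


# Two overlapping search results that both contain property 6925105.
combined = [
    {"id": "6925105"}, {"id": "6927980"},  # results from the first list URL
    {"id": "6925105"}, {"id": "7000001"},  # overlapping results from the second
]
deduped = dedupe_by_id(combined)  # the duplicate 6925105 is dropped
```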

🔎 Identifying delisted properties

This actor provides a monitoring mode configuration through which you get only incremental updates about newly added properties. If you also want to identify which properties have been delisted from the platform, you can use either of the following techniques with the help of this actor.

  1. Always run in full scrape mode : Run this actor always in full scrape mode and cross-check each incoming batch of data against your existing database. Any property that exists in your database but not in the newly scraped batch is no longer listed.

  2. Use the key-value store generated by the scraper : If you are monitoring a very large batch of data and don't want to scrape everything every time, this method involves a bit of technical work but achieves the goal effectively. Apify has a storage feature called the key-value store. When you run this scraper, it stores every single property, along with a timestamp, in the propertylink-properties store. Inside this store, the key is the property id itself and the value is a timestamp like this:

    { "lastSeen": "2023-11-02T05:59:25.763Z" }

    Whenever you run this scraper, it updates the timestamp against a particular id if it finds the property on the platform. For example, if there are two properties with ids prop1 and prop2 and we scrape them both on November 1, the key-value storage would look like this:

    prop1 -> { "lastSeen": "2023-11-01T05:59:25.763Z" }
    prop2 -> { "lastSeen": "2023-11-01T05:59:25.763Z" }

    Now if you run this scraper again on December 1 and prop1 is no longer on the platform but prop2 still is, the key-value storage changes like this:

    prop1 -> { "lastSeen": "2023-11-01T05:59:25.763Z" }
    prop2 -> { "lastSeen": "2023-12-01T05:59:25.763Z" }

    That means any property whose lastSeen is earlier than the latest batch you loaded is now delisted. You can iterate through the whole key-value store using the Apify key-value store API to identify these; please refer to the API documentation. Remember that the store created by this scraper is named propertylink-properties.

    Alternatively, you can iterate through the active properties in your existing database and use the same API to check each one's listing status.

    For this approach to work, it's important that you enable this feature via enableDelistingTracker (Enable Delisting tracker) input.
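The comparison described above can be sketched by treating the key-value store as a plain mapping of property id to its lastSeen record (the record layout matches the examples above; the function name is illustrative, and in practice you would read the records via the Apify key-value store API):

```python
def delisted_ids(tracker, latest_batch_time):
    """Return ids whose lastSeen predates the latest scrape batch."""
    return sorted(
        prop_id
        for prop_id, record in tracker.items()
        # ISO-8601 UTC timestamps compare correctly as strings.
        if record["lastSeen"] < latest_batch_time
    )


# The store state from the December 1 example above.
tracker = {
    "prop1": {"lastSeen": "2023-11-01T05:59:25.763Z"},
    "prop2": {"lastSeen": "2023-12-01T05:59:25.763Z"},
}
gone = delisted_ids(tracker, "2023-12-01T00:00:00.000Z")  # prop1 was not seen in the latest batch
```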

🙋‍♀️ For custom solutions

In case you need a custom solution, you can contact me at dhrumil@techvasu.com

Or learn more about me on GitHub: https://github.com/dhrumil4u360

Developer
Maintained by Community
Actor metrics
  • 1 monthly users
  • 96.3% runs succeeded
  • 0.0 days response time
  • Created in Oct 2023
  • Modified 4 months ago