InvestorLift Scraper · Export Deals & Wholesaler Data

Export properties, wholesalers, pricing, locations, and deal data from InvestorLift. Get a clean dataset ready for CRM, outreach, comps, or deal sourcing — in one run. Turn InvestorLift into a structured deal pipeline. One run, one dataset, thousands of deals.

Pricing: from $2.99 / 1,000 results

Rating: 5.0 (1)

Developer: Corentin Robert (Maintained by Community)

Actor stats

- Bookmarked: 2
- Total users: 14
- Monthly active users: 7
- Issues response: 1.8 days
- Last modified: 13 days ago


InvestorLift Marketplace Scraper

Turn InvestorLift — a US wholesale / off-market deal marketplace — into structured rows you can use: CRM, cold outreach, market maps, comps, or internal deal flow. No manual copy-paste from the marketplace UI.


Why use this Actor?

| You want to… | This helps you… |
| --- | --- |
| Fill a pipeline | Export hundreds or thousands of deals with location, price, and seller signals in one run. |
| Contact wholesalers | Optional enrichment: company, contact name, seller rating & review count, description, condition — so your team can personalize outreach. |
| See what already sold or went pending | Historical mode pulls sold and pending inventory for research and comp building. |
| Focus on fresh listings | Active mode + optional listed from / until dates to match your buy box timing. |
| Own a list of IDs | Specific mode: paste deal URLs or numeric IDs and enrich only those. |
| Survive long runs | Large historical crawls can resume after timeout or failure — already-saved deals are not duplicated. |

Technical note: Scraping uses HTTP + parsing (no headless browser farm), so runs stay predictable on Apify.


Modes

| Mode | Best for | Typical runtime |
| --- | --- | --- |
| Active | Current listings | Minutes |
| Historical | Sold + pending (bulk history) | Hours — raise run timeout if needed |
| Specific | Your own URL/ID list | Depends on list size |
| Active + date filter | “Listed in this window” only | Minutes |

Use Listed from / until (YYYY-MM-DD) only with Active — they narrow listings by when they appeared on the marketplace.
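The mode and date-filter choices above map directly onto input fields. Here is a minimal sketch that builds input payloads in Python, assuming the parameter names from the "API input (advanced)" section (scrapeMode, dateRangeStart, dateRangeEnd, dealIds):

```python
# Sketch: build Actor input payloads for each mode. Field names follow
# the "API input (advanced)" table; dates must be YYYY-MM-DD strings,
# which date.isoformat() produces.
from datetime import date

def active_input(listed_from=None, listed_until=None):
    """Active mode, optionally narrowed by a listing-date window."""
    payload = {"scrapeMode": "active_only"}
    if listed_from:
        payload["dateRangeStart"] = listed_from.isoformat()
    if listed_until:
        payload["dateRangeEnd"] = listed_until.isoformat()
    return payload

def specific_input(deal_ids):
    """Specific mode: enrich only your own URLs or numeric IDs."""
    return {"scrapeMode": "specific", "dealIds": list(deal_ids)}

print(active_input(date(2024, 1, 1), date(2024, 1, 31)))
# {'scrapeMode': 'active_only', 'dateRangeStart': '2024-01-01', 'dateRangeEnd': '2024-01-31'}
```

Remember that the date fields only apply in Active mode; Historical and Specific ignore them.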


Quick start

  1. Choose Mode (usually Active to start).
  2. For full seller + description + ratings (best for outreach), pass "enrichWithDetails": true in the API input or your INPUT.json. The Console hides this field; the schema default is false, so set it explicitly when you need enrichment.
  3. Start, then download the Dataset.

Output (what lands in your dataset)

| Use case | Fields (high level) |
| --- | --- |
| Routing & geography | city, state_code, county, zip, latitude, longitude |
| Deal economics | price, bedrooms, bathrooms, sq_footage, lot_size, ARV / margin fields when present |
| Seller & trust | wholesaler_company, wholesaler_name, account_title, wholesaler_rating, wholesaler_review_count, account_id |
| Timing & traction | days_on_il, page_views, published_at, expires_at |
| Narrative | description, condition, property_type, img_url, property_page_url |
| Lineage | source: available, historical, or bulk |
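Once you download the dataset (e.g. as JSON), these field names can drive a simple buy-box filter. A sketch with made-up sample rows, using only fields from the table above:

```python
# Sketch: filter exported rows by a simple buy box using dataset fields
# (price, state_code, wholesaler_rating). Sample rows are invented.
def buy_box(rows, max_price, states, min_rating=0):
    return [
        r for r in rows
        if r.get("price") is not None
        and r["price"] <= max_price
        and r.get("state_code") in states
        and (r.get("wholesaler_rating") or 0) >= min_rating
    ]

sample = [
    {"price": 145000, "state_code": "TX", "wholesaler_rating": 4.8},
    {"price": 410000, "state_code": "TX", "wholesaler_rating": 5.0},
    {"price": 99000, "state_code": "GA", "wholesaler_rating": None},
]
print(buy_box(sample, max_price=200000, states={"TX", "GA"}, min_rating=4.5))
# keeps only the 145k TX deal: the second is over budget, the third has no rating
```

Note the defensive `.get()` calls: rows from non-enriched runs may lack seller fields entirely.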

Troubleshooting

| Issue | What to do |
| --- | --- |
| Timeout on Historical | Resurrect the run or relaunch; raise Timeout in run options (e.g. 8 h). |
| 403 / blocked | Site may rate-limit or block datacenter IPs — retry later or reduce concurrency via detailConcurrency. |
| Thin or empty rows | For Active, check date format YYYY-MM-DD; retry if the API returned nothing transiently. |
| No description / wholesaler columns | Set "enrichWithDetails": true in input (see above). |

Resume / checkpoint

On failure, the Actor stores state so a Resurrect or rerun with the same input skips deals already present in the dataset (and associated ID tracking). For long Historical jobs: resurrect → optionally increase timeout → continue without duplicating rows.
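The dedupe-on-resume idea can be sketched as a seen-ID set that survives between runs. This is illustrative only — the Actor's real checkpoint format is internal:

```python
# Sketch of the resume idea: track pushed deal IDs in a persisted set so
# a resurrected run skips rows that already landed in the dataset.
# (Illustrative only; the Actor's actual checkpoint storage is internal.)
def push_new(deals, seen_ids, dataset):
    """Append only unseen deals; return how many were new."""
    new = 0
    for deal in deals:
        if deal["id"] in seen_ids:
            continue  # already saved in a previous run — skip, don't duplicate
        dataset.append(deal)
        seen_ids.add(deal["id"])
        new += 1
    return new

seen = {101}                      # ID 101 was saved before the failure
out = []
added = push_new([{"id": 101}, {"id": 102}], seen, out)
print(added, out)
# 1 [{'id': 102}]
```

The same pattern explains the excludeIds input below: seed the seen set yourself to skip deals you already have.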


API input (advanced)

All fields can be set via the Apify API. Console-visible fields are listed in the Actor input form; others are optional.

| Parameter | Notes |
| --- | --- |
| scrapeMode | active_only · historical · specific |
| dealIds | For specific — URLs or numeric IDs |
| dateRangeStart / dateRangeEnd | Active only — filter by listed date |
| enrichWithDetails | true recommended for outreach-quality rows; false for faster, lighter runs |
| maxItems | Optional cap on how many deals to push (API / hidden) |
| excludeIds | Skip known IDs (API) |
| detailConcurrency | Parallel detail fetches (API; default 64, max 128) |

Example:

```json
{
  "scrapeMode": "historical",
  "enrichWithDetails": true
}
```
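To run the Actor programmatically, one option is Apify's generic run-sync-get-dataset-items endpoint. The sketch below only *builds* the request (no network call); the actor ID and token are placeholders — use the values shown on this Actor's API tab:

```python
# Sketch: prepare (not send) a call to Apify's generic
# run-sync-get-dataset-items endpoint. "username~investorlift-scraper"
# and "MY_APIFY_TOKEN" are placeholders, not real identifiers.
import json
from urllib.parse import urlencode

def build_run_request(actor_id, token, payload):
    base = f"https://api.apify.com/v2/acts/{actor_id}/run-sync-get-dataset-items"
    query = urlencode({"token": token, "format": "json"})
    return f"{base}?{query}", json.dumps(payload)

url, body = build_run_request(
    "username~investorlift-scraper",  # placeholder actor ID
    "MY_APIFY_TOKEN",                 # placeholder API token
    {"scrapeMode": "historical", "enrichWithDetails": True},
)
print(url)
```

POST the JSON body to that URL and the response is the dataset items themselves, which suits short Active runs; long Historical jobs are better started asynchronously so they can be resurrected.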

Local development

```bash
$ npm install
```

Create storage/key_value_stores/default/INPUT.json:

```json
{
  "scrapeMode": "active_only",
  "enrichWithDetails": true
}
```

Then:

```bash
$ apify run
```

Unit tests (no network): npm test · Integration (live site): npm run test:integration