LoopNet + Crexi Scraper · Commercial Real Estate Listings · CoStar Alternative
🏢 The autonomous commercial real estate API that aggregates LoopNet and Crexi listings, normalizes cap rates, and deduplicates cross-platform. $5 per 1,000 results — roughly 1/27th of CoStar's $40,000/yr median at typical broker volumes (~$1,500/yr).
💰 Why CRE brokers switch to this
CoStar gates the U.S. commercial real estate listings market behind $14k–$40k/year per seat. Every existing scraper on Apify Store covers a single source (LoopNet OR Crexi). None normalize cap rates. None track days-on-market. None deduplicate cross-platform.
This actor does. And it runs fully autonomously, 24/7.
| Product | 💵 Annual cost | 🏢 Coverage / 📊 intelligence layer |
|---|---|---|
| This actor | ~$1,500/yr (typical Pro broker use) | LoopNet + Crexi · ✅ Cap rate, DOM, dedup |
| CoStar Suite entry | $14,000–$20,000/yr | ⚠️ Web UI only, no API |
| CoStar Suite median | $40,000/yr | ⚠️ Multi-seat |
| Reonomy / CompStak | $5,000–$18,000/yr | Off-market focus |
⚡ Quick output preview
```json
{
  "source": "loopnet",
  "source_listing_id": "29769721",
  "address": {
    "street": "2921 E 17th St",
    "city": "Austin",
    "state": "TX",
    "zip": "78702",
    "lat": 30.278541,
    "lng": -97.709889
  },
  "asset_class": "office",
  "sub_type": "Loft/Creative Space",
  "sqft": 7500,
  "asking_price_usd": 2500000,
  "cap_rate_listed": null,
  "cap_rate_normalized": 7.5,
  "cap_rate_estimated": true,
  "price_per_sqft": 333,
  "days_on_market": 32,
  "status": "active",
  "broker": {
    "name": "Isaac Gutierrez",
    "company": "ECR",
    "phone": null
  },
  "also_listed_on": [],
  "photo_urls": ["https://images1.loopnet.com/i2/.../900x675/image.jpg"],
  "description": "Creative office building in the East Austin submarket available for sale."
}
```
🎯 Quick start
🏙️ Search by city + state (most common)
```json
{
  "city": "Austin",
  "state": "TX",
  "sourcesEnabled": ["loopnet", "crexi"],
  "assetClasses": ["office", "retail"],
  "priceMin": 500000,
  "priceMax": 5000000,
  "maxResults": 200
}
```
🔄 Daily monitoring — only NEW listings
```json
{
  "city": "Dallas",
  "state": "TX",
  "monitoringMode": true,
  "maxResults": 500
}
```
Run on a daily schedule. The actor maintains an internal snapshot per query. Only listings not seen in previous runs are emitted.
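The snapshot-diff behavior described above can be sketched as follows. This is an illustrative Python sketch, not the actor's actual (TypeScript) implementation; the function name and the snapshot representation are assumptions.

```python
def diff_new_listings(current, snapshot_ids):
    """Return only listings not seen in previous runs, plus the updated snapshot.

    Hypothetical helper: `current` is this run's list of listing dicts;
    `snapshot_ids` is the set of (source, source_listing_id) pairs seen
    before. The actor's real persistence layer is not shown here.
    """
    new_items = [
        item for item in current
        if (item["source"], item["source_listing_id"]) not in snapshot_ids
    ]
    # Merge this run's IDs into the snapshot so the next run diffs against it
    updated = snapshot_ids | {(i["source"], i["source_listing_id"]) for i in current}
    return new_items, updated
```

On a daily schedule, only the `new_items` list would be pushed to the dataset.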
📊 Track CLOSED deals
```json
{
  "city": "Phoenix",
  "state": "AZ",
  "transactionTrackingMode": true
}
```
Emits ONLY listings that disappeared since the last run (likely sold/leased) or whose status flipped to sold / under_contract. Each item carries transaction_tracking.close_status, close_detected_at, previous_price.
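The close-detection rule above (disappeared, or status flipped) can be sketched like this. Again an illustrative Python sketch under assumed data shapes, not the actor's real code.

```python
def detect_closures(snapshot, current):
    """Flag listings that likely closed since the last run (sketch).

    `snapshot` maps dedup_key -> last-seen listing dict; `current` maps
    dedup_key -> this run's listing dict. A listing counts as closed when
    it disappeared, or its status flipped to sold / under_contract.
    """
    closed = []
    for key, prev in snapshot.items():
        cur = current.get(key)
        if cur is None:
            # Listing vanished from the source: likely sold or leased
            closed.append({**prev, "close_status": "removed"})
        elif cur["status"] in ("sold", "under_contract") and prev["status"] == "active":
            closed.append({**cur, "close_status": cur["status"],
                           "previous_price": prev.get("asking_price_usd")})
    return closed
```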
🔗 Paste specific listing URLs
{"startUrls": [{"url": "https://www.loopnet.com/Listing/12345678/example-property/"},{"url": "https://www.crexi.com/properties/87654321/example-listing"}]}
🏢 Who uses this
| Persona | Use case |
|---|---|
| 💼 CRE investment broker | Daily morning brief: new sub-$5M listings in target market |
| 🏛️ REIT acquisition team | Track multifamily + retail across 5 metros, normalize cap rates |
| 🔍 ETA / search fund | Identify owner-operated deals, dedupe LoopNet+Crexi to skip duplicate underwriting |
| 📈 CRE consulting firm | Build normalized comp datasets for client reports |
| 🏰 Family office | Filter for opportunity-zone office, weekly digest |
| 🏦 Lender / appraiser | Days-on-market by asset class for valuation models |
📊 The intelligence layer (the actual moat)
Cap-rate normalization
Listing data is inconsistent across platforms. Some listings declare a cap rate, others only NOI; auctions and unpriced listings have neither. The actor handles all four cases:
| Source data | Action | Output cap_rate_estimated |
|---|---|---|
| NOI + price both declared | Recompute cap = NOI/price × 100 | false ✅ |
| Only cap_rate declared | Pass-through, derive implied NOI | false ✅ |
| Neither declared, price known | Estimate with asset-class median (US Q1 2024-25) | true ⚠️ |
| No price + no cap_rate | — | null |
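The four-case decision table above can be expressed as a short function. This is a Python sketch for illustration (the actor itself is TypeScript); the median values come from the table below, and the return convention for the no-data case is an assumption.

```python
ASSET_CLASS_MEDIANS = {  # US asset-class medians (percent), per the table below
    "multifamily": 5.4, "industrial": 5.9, "mixed-use": 6.5,
    "retail": 6.8, "office": 7.5, "specialty": 7.0, "hotel": 8.5,
}

def normalize_cap_rate(noi, price, listed_cap, asset_class):
    """Return (cap_rate_normalized, cap_rate_estimated) per the 4 cases."""
    if noi is not None and price:
        return round(noi / price * 100, 2), False   # recompute from NOI/price
    if listed_cap is not None:
        return listed_cap, False                     # pass the declared value through
    if price:
        return ASSET_CLASS_MEDIANS.get(asset_class), True  # median fallback, flagged
    return None, False                               # no price, no cap rate: nothing to do
```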
Asset-class medians used:
| Class | Median | Class | Median |
|---|---|---|---|
| 🏠 Multifamily | 5.4% | 🛍️ Retail | 6.8% |
| 🏭 Industrial | 5.9% | 🏢 Office | 7.5% |
| 🏗️ Mixed-use | 6.5% | 🎯 Specialty | 7.0% |
| 🏨 Hotel | 8.5% | | |
🔄 Cross-platform deduplication
A property listed on both LoopNet AND Crexi would otherwise appear in BOTH datasets. The actor:
- Computes a stable `dedup_key = hash(normalized_street + city + state + sqft_bucket + asset_class)`
- Groups listings by key
- Picks the most complete record as primary (highest non-null field count)
- Marks `also_listed_on: ["crexi"]` on the LoopNet primary (or vice versa)
- Outputs one deduplicated record per property
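The steps above can be sketched in a few lines. Illustrative Python only: the hash truncation, the 500-sqft bucket width, and the field names are assumptions, not the actor's actual parameters.

```python
import hashlib
from collections import defaultdict

def dedup_key(listing):
    """Stable key from normalized street + city/state + sqft bucket + asset class."""
    street = (listing.get("street") or "").lower().strip()
    sqft_bucket = (listing.get("sqft") or 0) // 500  # assumed 500-sqft bucket width
    raw = "|".join([street, listing["city"].lower(), listing["state"],
                    str(sqft_bucket), listing["asset_class"]])
    return hashlib.sha1(raw.encode()).hexdigest()[:16]

def deduplicate(listings):
    """Group by key, keep the most complete record, note the other source(s)."""
    groups = defaultdict(list)
    for l in listings:
        groups[dedup_key(l)].append(l)
    out = []
    for group in groups.values():
        # Primary = record with the highest count of non-null fields
        primary = max(group, key=lambda l: sum(v is not None for v in l.values()))
        primary["also_listed_on"] = sorted(
            {l["source"] for l in group} - {primary["source"]})
        out.append(primary)
    return out
```

The sqft bucket absorbs small cross-platform discrepancies (e.g. 7,500 vs 7,600 SF) so the same building still collapses to one record.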
📌 No other Apify actor does this. It's the difference between raw scrape and intelligence.
💰 Pricing — fully transparent
Pay-per-event on Apify Store. New Apify users get $5 platform credit on signup — enough to test ~1,000 listings with this actor.
| Event | Price | When |
|---|---|---|
| 🚀 Actor Start | $0.05 / run | Once per scrape job |
| 📍 Result ⭐ | $0.005 / listing | Each unique listing returned (primary cost) |
| 📋 Listing detail enrichment | $0.003 / listing | Only when includeListingDetails: true (NOI, photos, year built) |
💵 Realistic cost estimates
| Volume | Without details | With details |
|---|---|---|
| 100 listings (test) | $0.55 | $0.85 |
| 1,000 listings (city scan) | $5.05 | $8.05 |
| 10,000 listings (multi-city) | $50.05 | $80.05 |
| 25,000 listings/mo (active broker) | ~$125/mo | ~$200/mo |
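The table's figures follow directly from the event prices above. A minimal cost calculator (illustrative Python; the function name is ours, not part of the actor):

```python
ACTOR_START = 0.05   # per run
RESULT = 0.005       # per listing returned
DETAIL = 0.003       # per listing, only with includeListingDetails: true

def run_cost(listings, with_details=False, runs=1):
    """Estimate the pay-per-event cost of a scrape job, in USD."""
    per_listing = RESULT + (DETAIL if with_details else 0)
    return round(runs * ACTOR_START + listings * per_listing, 2)
```

For example, 1,000 listings without details costs 0.05 + 1,000 × 0.005 = $5.05, matching the row above.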
📈 At 25K listings/mo the actor costs ~$1,500/yr vs CoStar's $40,000/yr median. ROI = 27×.
✅ Key features
- 🔄 Cross-platform aggregation — LoopNet + Crexi
- 📊 Cap-rate normalization with NOI fallback + asset-class medians
- 🔁 Cross-platform deduplication with stable `dedup_key`
- ⏱️ Days-on-market index auto-computed
- 🆕 Monitoring mode — only new listings since last run
- 🤝 Transaction tracking — only closed deals (sold / leased / under contract)
- 📦 Portfolio expansion — multi-property listings split into individual items
- 🎛️ 20+ filters — price, cap rate, sqft, asset class, sub-type, state, city
- 📁 Apify dataset views — Overview / Financial / Broker contacts pre-configured
- 🌍 Multi-locale parsing — handles English + French, ranges (1.5K-13K SF), K/M multipliers
- 🤖 Fully autonomous — runs on Apify residential proxy, no human intervention required
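The multi-locale parsing feature above (ranges, K/M multipliers) can be illustrated with a small sketch. This is our own simplified Python example, not the actor's parser, and it omits the French number formats the actor also handles.

```python
import re

MULTIPLIERS = {"k": 1_000, "m": 1_000_000}

def parse_sqft(text):
    """Parse strings like '7,500 SF' or '1.5K-13K SF' into a (min, max) sqft tuple."""
    nums = []
    for m in re.finditer(r"(\d+(?:[.,]\d+)?)\s*([kKmM])?", text):
        val = float(m.group(1).replace(",", ""))   # '7,500' -> 7500.0
        if m.group(2):
            val *= MULTIPLIERS[m.group(2).lower()]  # '1.5K' -> 1500.0
        nums.append(int(val))
    if not nums:
        return None
    # Single value becomes a degenerate range; a range keeps its endpoints
    return (nums[0], nums[-1]) if len(nums) > 1 else (nums[0], nums[0])
```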
❓ FAQ
Q: How is this different from existing LoopNet / Crexi scrapers on Apify?
A: Other scrapers cover a single source and return raw data. This covers BOTH, normalizes cap rates, deduplicates cross-platform, and computes days-on-market. The intelligence layer is the value.

Q: How fresh is the data?
A: As fresh as LoopNet/Crexi public listings — typically updated every 24–48h on the source side. With monitoring mode, you're notified within 1 day of a new listing.

Q: Can I get NOI and cap rate that aren't publicly listed?
A: When the source platform exposes them, yes. When not (~40% of listings), the actor estimates with asset-class medians, clearly flagged with cap_rate_estimated: true. Medians are refreshed quarterly.

Q: Are broker contact emails / phones included?
A: Phone and email when available on the source (Crexi exposes more contact info than LoopNet). Use ethically — direct outreach is governed by your local CRE broker license rules.

Q: Can I use this for off-market data?
A: No, on-market only. For off-market data, look at Reonomy or CompStak (different price tier).

Q: What about Ten-X auctions?
A: Out of scope — Ten-X auction listings are gated behind bidder registration. The actor flags listings transitioning to auction status via status: "under_contract" when the source exposes it.

Q: Can I run this on a schedule?
A: Yes — use the Apify scheduler. Recommended: daily at 6am ET with monitoringMode: true for new-listing alerts. Combine with Apify webhooks → Slack / Zapier / your CRM.

Q: How does dedup handle two listings with the same address but different asset classes?
A: The dedup key includes asset_class, so a "Mixed Use" record and a "Multifamily" record at the same address remain separate items.

Q: What about international listings?
A: USA + Canada (LoopNet covers both). International CRE markets are not currently supported.
📋 Output schema
Each dataset row follows this schema:
```ts
{
  source: "loopnet" | "crexi",
  source_listing_id: string,
  source_url: string,
  scraped_at: string,              // ISO8601
  address: {
    raw: string,
    street: string | null,
    city: string | null,
    state: string | null,          // 2-letter US/CA code
    zip: string | null,
    country: "US" | "CA",
    lat: number | null,
    lng: number | null
  },
  asset_class:
    | "office" | "retail" | "industrial" | "multifamily"
    | "land" | "hotel" | "mixed-use" | "specialty" | "unknown",
  sub_type: string | null,
  sqft: number | null,
  units: number | null,
  year_built: number | null,
  lot_size_sqft: number | null,
  asking_price_usd: number | null,
  noi_usd: number | null,
  cap_rate_listed: number | null,
  cap_rate_normalized: number | null,
  cap_rate_estimated: boolean,
  price_per_sqft: number | null,
  listed_at: string | null,
  days_on_market: number | null,
  status:
    | "active" | "under_contract" | "sold" | "leased"
    | "removed" | "off_market" | "unknown",
  broker: {
    name: string | null,
    company: string | null,
    phone: string | null,
    email: string | null,
    profile_url: string | null
  },
  dedup_key: string,               // stable hash for dedup
  also_listed_on: ("loopnet" | "crexi")[],
  photo_urls: string[],
  description: string | null,
  // Only when transactionTrackingMode=true
  transaction_tracking?: {
    close_status: "sold" | "leased" | "under_contract" | "removed" | "pending",
    close_detected_at: string,
    first_seen_at: string,
    last_seen_at: string,
    previous_price: number | null
  }
}
```
🛠️ Technical notes
- Stack: Apify SDK v3 + Crawlee v3.16, TypeScript ESM, Node 22
- Source extraction: `pds.loopnet.com` mobile API + `api.crexi.com` JSON
- Anti-blocking: Apify residential proxies, auto-rotated
- Performance: HTTP-only for normal usage, sub-10s runs for 50 listings
- Cap-rate medians: refreshed quarterly from Marcus & Millichap + CBRE Cap Rate Survey
- Address normalization: USPS-style street parsing + sqft buckets for dedup
📞 Contact
Issues, feature requests, custom feeds → open an issue on Apify Store or contact KazKN directly.
🏢 Built to help SMB CRE brokers stop overpaying for data they couldn't differentiate themselves with anyway.