Walmart Reviews Scraper — Product Reviews to CSV/JSON in 2 min
Pricing
Pay per usage
7 runs. Backed by 951-run Trustpilot flagship + 31-actor portfolio. Walmart product reviews → CSV/JSON. Bypasses 100-review UI cap. 17 fields: stars, text, author, date, helpful, images, seller. For BI + competitor monitoring + sentiment. spinov001@gmail.com · blog.spinov.online · t.me/scraping_ai
Developer: Alex · Last modified: 2 days ago
Walmart Reviews Scraper
Walmart product reviews → CSV / JSON / Excel in 2 minutes. Bypasses the 100-review default UI cap on walmart.com/reviews/product/<id>. 17 fields per review including image URLs and seller metadata.
For: BI dashboards, competitor monitoring, sentiment analysis, product-launch QA.
Output schema (17 fields)
| Field | Type | Notes |
|---|---|---|
| `productId` | string | Walmart numeric product ID (extracted from URL or accepted directly) |
| `page` | integer | Pagination page on which this review was found |
| `sourceUrl` | string | Exact URL fetched (page + sort included) |
| `reviewId` | string \| null | Walmart's stable review identifier |
| `stars` | integer | 1–5 |
| `title` | string \| null | Review headline |
| `body` | string \| null | Review body text |
| `author` | string \| null | `userNickname`: display name (Walmart obfuscates real names) |
| `date` | string \| null | Submission date (US-format M/D/YYYY, as Walmart returns it) |
| `verifiedPurchase` | boolean \| null | Derived from `userBadges`; null when Walmart omits the flag |
| `helpfulCount` | integer \| null | `positiveFeedback`: number of "Helpful" votes |
| `unhelpfulCount` | integer \| null | `negativeFeedback`: number of "Not Helpful" votes |
| `recommended` | boolean \| null | Reviewer recommends product (Walmart-prompted yes/no) |
| `imageUrls` | string[] | Photos attached to this review (empty array when none) |
| `fulfilledBy` | string \| null | E.g. "Walmart" / "Marketplace seller": fulfilment side of the purchase |
| `sellerName` | string \| null | Seller of record at time of purchase |
| `scrapedAt` | integer | Unix epoch seconds; when this row was collected |
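A single dataset row following this schema might look like the record below. All values are illustrative placeholders, not real review data:

```json
{
  "productId": "123456789",
  "page": 1,
  "sourceUrl": "https://www.walmart.com/reviews/product/123456789?page=1&sort=most-recent",
  "reviewId": "rvw-abc123",
  "stars": 4,
  "title": "Works as described",
  "body": "Arrived on time and matches the listing.",
  "author": "WalmartShopper",
  "date": "3/14/2026",
  "verifiedPurchase": true,
  "helpfulCount": 12,
  "unhelpfulCount": 1,
  "recommended": true,
  "imageUrls": [],
  "fulfilledBy": "Walmart",
  "sellerName": "Walmart.com",
  "scrapedAt": 1745000000
}
```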
Inputs
- `productInputs` (array, required): list of full Walmart product URLs OR bare numeric product IDs. Mixed input is OK. Up to 50 products per run.
- `maxReviewsPerProduct` (integer, default 200, max 5000): hard cap. If a product has fewer reviews than the cap, the scraper stops naturally; it does NOT pad.
- `sortBy` (enum, default `most-recent`): `most-recent` | `most-helpful` | `rating-high` | `rating-low`.
- `useProxy` (boolean, default `true`): strongly recommended ON.
- `proxyConfiguration` (object, default RESIDENTIAL): Apify proxy group selector.
- `maxConcurrency` (integer, default 2): parallel products. Reviews within a single product still paginate sequentially.
- `requestDelayMs` (integer, default 1500): delay in milliseconds between sequential page fetches within a product.
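A minimal input object using these parameters might look like this (the product URL and IDs below are placeholders):

```json
{
  "productInputs": [
    "https://www.walmart.com/ip/123456789",
    "987654321"
  ],
  "maxReviewsPerProduct": 500,
  "sortBy": "most-helpful",
  "useProxy": true,
  "maxConcurrency": 2,
  "requestDelayMs": 1500
}
```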
Honest limitations
These are real, deliberate trade-offs, not bugs:
- Walmart aggressively rate-limits direct datacenter IPs. `useProxy=true` (RESIDENTIAL) is the default for a reason; datacenter or no-proxy runs will see 403/429 within a few hundred requests.
- Reviews-per-page is fixed at 10 by Walmart. A 200-review product = 20 sequential page fetches = ~30 seconds at the default 1500 ms delay.
- Cap behavior. If a product has 47 reviews and `maxReviewsPerProduct=200`, you get 47: no padding, no synthetic rows.
- `verifiedPurchase` is best-effort. Walmart sometimes omits the flag on older reviews. Missing → `null`.
- No auth, no seller-dashboard reviews. Public review list only: what you'd see at `walmart.com/reviews/product/<id>` without logging in.
- 404 products are skipped. Deactivated, regional, or ID-typoed entries log a warning, and the run continues with the next product.
- HTTP 403/429 aborts the current product (not the whole run) and continues. Retry the failed IDs in a separate run; repeated 4xx in the same session usually means proxy IP burnout.
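Because 403/429 failures abort only the affected product, a retry run just needs the failed IDs fed back in as `productInputs`. A minimal sketch of that step is below; the per-product `results` shape here is hypothetical, so adapt the status field names to whatever your run log or dataset actually records:

```python
# Sketch: collect product IDs that failed with 403/429 in a finished run
# and build the input object for a follow-up retry run.

def build_retry_input(results, base_input):
    """results: list of dicts like {"productId": "...", "status": "ok" | "http_403" | "http_429"}.
    Returns a copy of base_input with productInputs replaced by the failed IDs."""
    failed = [r["productId"] for r in results if r["status"] in ("http_403", "http_429")]
    retry_input = dict(base_input)
    retry_input["productInputs"] = failed
    return retry_input

# Hypothetical per-product outcomes from a previous run:
results = [
    {"productId": "111", "status": "ok"},
    {"productId": "222", "status": "http_429"},
    {"productId": "333", "status": "http_403"},
]
retry = build_retry_input(results, {"maxReviewsPerProduct": 200, "useProxy": True})
print(retry["productInputs"])  # ['222', '333']
```

Running the retries as a fresh run also gives the proxy pool a chance to rotate off burned IPs.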
How it works
The scraper fetches walmart.com/reviews/product/<id>?page=<n>&sort=<sort> and parses the embedded __NEXT_DATA__ JSON blob — not the rendered DOM. This is more stable across UI redesigns: as long as the Next.js page state is shipped with the HTML, the parser keeps working.
Canonical extraction path (verified 2026-04 against walmart.com): props.pageProps.initialData.data.reviews.customerReviews. If a future Walmart redesign moves the array, the parser falls back to a tree-walk that pattern-matches review records by their unique key set (reviewId + reviewText + rating) — so the scraper degrades gracefully rather than returning empty.
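The fallback idea described above can be sketched in a few lines of Python. This is not the actor's actual source; it illustrates the technique: walk the parsed `__NEXT_DATA__` tree depth-first and collect any dict carrying the review key set (`reviewId` + `reviewText` + `rating`), wherever a redesign may have moved it:

```python
import json

REQUIRED_KEYS = {"reviewId", "reviewText", "rating"}  # key set that identifies a review record

def find_review_records(node, found=None):
    """Depth-first walk of a parsed __NEXT_DATA__ tree; collects dicts that
    pattern-match review records by key set, independent of their path."""
    if found is None:
        found = []
    if isinstance(node, dict):
        if REQUIRED_KEYS <= node.keys():
            found.append(node)
        else:
            for value in node.values():
                find_review_records(value, found)
    elif isinstance(node, list):
        for item in node:
            find_review_records(item, found)
    return found

# Toy page state standing in for a real __NEXT_DATA__ blob:
blob = json.loads("""
{"props": {"pageProps": {"initialData": {"data": {"reviews": {"customerReviews": [
  {"reviewId": "r1", "reviewText": "Great", "rating": 5},
  {"reviewId": "r2", "reviewText": "Meh", "rating": 3}
]}}}}}}
""")
records = find_review_records(blob)
print([r["reviewId"] for r in records])  # ['r1', 'r2']
```

Because the walk only fires when the canonical path misses, the common case stays a single dictionary lookup.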
Cost framing (Apify pricing)
- Per product, full pagination to ~100 reviews = ~10 page fetches.
- Walmart pages are HTML-heavy (~150–250 KB each), so those ~10 fetches mean roughly 1.5–2.5 MB of proxy transfer per 100 reviews. The embedded JSON the parser actually needs is only a few KB of that, but the full page must still be downloaded.
- Default RESIDENTIAL proxy traffic is therefore the dominant cost driver; budget accordingly.
Use cases
- Competitor monitoring: track competitor product reviews daily, alert on rating drops or new negative themes.
- Product-launch QA: scrape your own product's reviews post-launch, build a sentiment dashboard.
- Market research: collect reviews across a category for thematic analysis (LDA, embeddings, GPT clustering).
- BI integrations: drop the dataset into BigQuery / Snowflake / DuckDB for ad-hoc analysis.
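As a taste of the BI workflow, here is a minimal stand-in for the "drop the dataset into a warehouse" step. The same idea applies to BigQuery, Snowflake, or DuckDB; stdlib `sqlite3` is used here only so the sketch runs anywhere, and the rows mirror just a few fields of the 17-field schema:

```python
import sqlite3

# Toy rows: (productId, stars, verifiedPurchase as 0/1). Illustrative values only.
rows = [
    ("111", 5, 1), ("111", 4, 1), ("111", 2, 0),
    ("222", 3, 1), ("222", 1, 1),
]

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE reviews (productId TEXT, stars INTEGER, verified INTEGER)")
con.executemany("INSERT INTO reviews VALUES (?, ?, ?)", rows)

# Ad-hoc query: per-product average rating and review count,
# the kind of aggregate a competitor-monitoring dashboard tracks.
for product_id, avg_stars, n in con.execute(
    "SELECT productId, ROUND(AVG(stars), 2), COUNT(*) "
    "FROM reviews GROUP BY productId ORDER BY productId"
):
    print(product_id, avg_stars, n)
```

Swapping the toy rows for the exported CSV/JSON dataset and the engine for your warehouse of choice is the whole integration.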
Related actors in this portfolio
| Tool | Adds |
|---|---|
| Trustpilot Reviews | Cross-platform review parity (Trustpilot vs Walmart) |
| Reddit Discussion | Off-platform sentiment (where buyers complain about your product) |
| Google News | News coverage signals correlated with rating spikes |
Need a custom data pipeline?
I build custom scrapers, ETL pipelines, and data-feed integrations. Pilot scope examples: a tailored parser for a specific Walmart category, a daily cron with Slack alerts, or a multi-marketplace aggregator.
📧 Email: spinov001@gmail.com 🌐 Portfolio: blog.spinov.online · apify.com/knotless_cadence 💬 Tips & tutorials: t.me/scraping_ai
Disclosure: I maintain Apify actors related to this topic; links above point to my Apify Store profile (commercial). I am not affiliated with Walmart Inc.