Amazon Scraper

Pricing

$10.00 / 1,000 results

Developed by Junglee
Maintained by Apify

An unofficial API that gets you product data from Amazon. Scrapes and downloads product information, including reviews, prices, descriptions, and ASINs, without using the official Amazon API.

Rating: 4.4 (14 reviews)
Pricing: $10.00 / 1,000 results
Total users: 6.5K
Monthly users: 527
Runs succeeded: 99%
Issues response: 14 hours
Last modified: 2 days ago


Column seller/url missing

Closed

complimentary_dunlin opened this issue 25 days ago

I need to fix run 0VRUgJAAHa7voolvx: it is missing the seller/url column, which I need and which was present in previous test runs.

ruocco-l replied

Hello and thank you for opening this issue.

Unfortunately, this is a limitation on our side in the exporting of the dataset. If you go to your run and select All fields, you will see that the seller object and all its information are there, in both Table and JSON format (see the screenshots I attached).

The problem is not that the scraper didn't collect the data; rather, the platform has limits on exporting very large datasets, something we are already working on.

This problem, however, does not affect JSON exporting, which is why we always suggest using that and then manipulating the dataset with your own tools once you retrieve it. If you want, you can also do anything from simple to complex manipulations right here on the platform, using the Merge, Dedup & Transform Datasets Actor. There you can specify a function in the Post dedup transform function field that will manipulate the dataset for you, creating a new one. For example, this function

async (items, { Apify }) => {
  return items.map((item) => {
    return {
      asin: item.asin,
      price: item.price,
      seller: item.seller
    };
  });
}

will get you a new dataset, ready to download, with just the asin, price, and seller information. You can experiment and decide which properties are the most important for you.
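If you prefer to work locally, the same field selection can be sketched as a small Node.js script. The `pickFields` helper and the sample items below are made up for illustration (real dataset items have many more fields); you would run the helper over the items you retrieved as JSON.

```javascript
// Sketch: replicate the transform function locally on retrieved JSON items.
// pickFields and sampleItems are illustrative names, not part of the Actor.
const pickFields = (items) =>
  items.map((item) => ({
    asin: item.asin,
    price: item.price,
    seller: item.seller,
  }));

// Made-up sample standing in for items downloaded from your run's dataset.
const sampleItems = [
  {
    asin: "B000000001",
    price: 19.99,
    seller: { name: "ExampleSeller", url: "https://example.com/seller" },
    title: "Some product title",
    description: "A long description we don't need here",
  },
];

console.log(JSON.stringify(pickFields(sampleItems), null, 2));
```

The output keeps only the three selected properties, dropping everything else, which is exactly what the Post dedup transform function above does on the platform.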

Hope this helps, and sorry for the inconvenience. As I said, our team knows about the problem, but it may take a while to get it properly fixed. In the meantime I'll leave this issue open and ping you when progress is made.

Feel free to ask more questions if you want to!