Binance NFT Collection Scraper

gabrielm/binance-nft-collection-scraper

🌐 Unlock the world of the Binance NFT marketplace with our all-in-one scraper 💹 Extract rankings, explore individual collections, and access asset metadata easily. Get key metrics in your preferred cryptocurrency and fine-tune your results with extensive filtering, sorting, and exporting options!

The Binance NFT scraper



Hello! It's Indy. I'm in charge of introducing you to this actor, but if you're here, I bet you're already fluent in crypto-asset jargon, so we can skip that part. 🤭

This actor scrapes NFT collections and assets from Binance, the largest cryptocurrency exchange in the world.

Founded in 2017, Binance has a near-perfect score (as of September 2023) according to CoinMarketCap for trading volume, liquidity, web traffic, and confidence.

During the 'boom' of NFTs in 2021, Binance launched its own marketplace with several features for both creators and owners of digital assets. So here we are. 🙂 Welcome to the Binance NFT scraper!



Our goal


As with most actors, we try to provide a service that is either not available on the 'target' site or takes too long to do in bulk by hand.

Notably, the Binance API for NFTs is very limited, with just a few endpoints available and several useful details missing from the responses.

Additionally, using the web interface to search for collections, or within a collection with certain parameters, is already painfully slow, even before considering the human factor.

This scraper lets you do all that anonymously, without quota limits or the constant risk of getting banned.

We achieve that by working one layer below the usual browser automation, through their 'non-public' API, and this has pros and cons.

The pros: it's lightning fast and more detailed than what you see as a web user. The cons: it's undocumented, but that's exactly why we decided to develop this actor. So don't worry!



Roadmap


If you're in a hurry, follow this quick guide:

✅ In 0️⃣ Basic options, decide which currency you want your amounts converted to, and select it.

✅ For scraping the rankings or trending list and useful data about each collection inside, select 2️⃣ Collection rankings as the scraping mode.

  • In 2️⃣ Collection rankings, pick the 'Ranking type' and optionally enable 'Include properties' or 'Include additional details' for extra data.
  • Depending on the 'Ranking type' you chose, customize your results (period covered or sorting) in 2️⃣.1️⃣ All collections or 2️⃣.2️⃣ Trending collections.

✅ For scraping useful data about the NFTs in a given collection, select 3️⃣ Collection assets as the scraping mode.

  • In 3️⃣ Collection assets, enter the desired 'Collection id' or just the URL of that collection, and optionally customize your results (filters, sorting, etc.) with the rest of the options in this section.

▶ Hit Start and wait a few minutes.

Select the right view for browsing the Output. For rankings or the trending list, be sure to pick 1️⃣&2️⃣ Collections - Compact/Detailed. For NFTs, pick 3️⃣ Assets - Compact/Detailed.
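If you prefer to run the actor programmatically instead of from the Apify Console, a minimal sketch with the official Apify Python client could look like the one below. The input keys ('currency', 'scrapingMode', 'rankingType', 'maxRows') are hypothetical placeholders, so check the actor's input schema for the real field names.

from apify_client import ApifyClient

client = ApifyClient("<YOUR_APIFY_TOKEN>")

# Hypothetical input: these field names are placeholders, not the actor's real schema.
run_input = {
    "currency": "USD",
    "scrapingMode": "Collection rankings",
    "rankingType": "All collections",
    "maxRows": 50,
}

run = client.actor("gabrielm/binance-nft-collection-scraper").call(run_input=run_input)

# Browse the scraped collections from the run's default dataset.
for item in client.dataset(run["defaultDatasetId"]).iterate_items():
    print(item["collectionName"], item["floorPrice"])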



Basic options


There are 6 currencies the amounts in the results can be shown in:

  • USD [default]
  • Network's coin
  • BNB (the Binance Smart Chain's coin)
  • ETH (Ethereum network)
  • MATIC (Polygon network)
  • BTC (Bitcoin network)

Since the listed collections belong to diverse networks, a currency conversion may be required. We take the conversion rates directly from Binance, but as expected in the crypto world, the amounts can vary after a few minutes.

In case you don't want any conversion applied, pick the Network's coin option. It will show each result in its associated currency.

This actor encapsulates three working modes:

  • Specific collection(s). Scrapes all the details for a user-supplied list of collections.
  • Collection rankings. Same as the previous option, but for the website's all/top and trending lists. [default]
  • Collection assets. Grabs all the assets (items, NFTs, etc.) for a user-supplied collection.


Specific collection(s)


Period selects which volume, volume rate, floor price, and floor price rate values are scraped:

  • Last hour
  • Last 6 hours
  • Last 24 hours [default]
  • Last 7 days
  • Last 30 days
  • Lifetime

⚠️ Sometimes, the specified collections are not included in the selected period rankings, resulting in missing volumes and prices, and there's no known workaround for that.

Volume rate and floor price rate are shown as '1H/6H/24H/7D/30D change' on the website.

Include properties will add the available attributes (or traits) for every scraped collection.

Be aware that this option can add a lot of detail to your results. Some collections don't have any properties at all, but some can have A TON of them.

Enter a list of collection ids and the actor will scrape the details of each one. [required]

Each entry can be any of the following:

  • A basic id, a series of digits such as 694369170047053825
  • A full collection URL, as copied from the website or browser, such as https://www.binance.com/en/nft/collection/degods-694369170047053825?collections=694369170047053825&isBack=1&order=amount_sort%401
  • A simplified collection URL, such as https://www.binance.com/en/nft/collection/694369170047053825

The actor will 'understand' each type, as long as they correspond to Binance collection ids or collection URLs, and will ignore the invalid ones.

The system trims empty spaces and won't allow duplicated ids, and the actor can also detect (and remove) different URLs pointing to the same collection.
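For reference, this is a rough sketch of the kind of id extraction and deduplication described above (illustrative only, not the actor's actual code):

import re

def extract_collection_id(entry: str) -> str | None:
    """Accept a bare id, a full collection URL, or a simplified URL,
    and return the numeric collection id (or None if invalid)."""
    entry = entry.strip()
    if entry.isdigit():
        return entry
    # Match the digits at the end of the collection path, e.g. '.../degods-694369170047053825?...'
    match = re.search(r"binance\.com/\w+/nft/collection/(?:[\w-]*?-)?(\d+)", entry)
    return match.group(1) if match else None

entries = [
    "694369170047053825",
    "https://www.binance.com/en/nft/collection/degods-694369170047053825?collections=694369170047053825&isBack=1",
    "https://www.binance.com/en/nft/collection/694369170047053825",
]
ids = {cid for e in entries if (cid := extract_collection_id(e))}
print(ids)  # {'694369170047053825'}: three entries, one collection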



Collection rankings


Maximum rows sets the length of the list of scraped collections. There is a limit of 100 for the Trending list, imposed by the website.

Include properties, exactly as with Specific collection(s), will add the traits for every scraped collection.

Include additional details, when enabled, extends the results with this information:

  1. The collection's 'contract address' in the respective blockchain.
  2. The collection's 'royalty fee' rate.
  3. The volume and volume rate for the last 24 hours.
  4. The floor price and floor price rate for the last 24 hours.
  5. The lifetime volume ('total volume' in the website).

These details require a separate request per collection. Take that into account when scraping many rows.
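As a rough example: a ranking of 200 rows with this option enabled translates to about 200 extra requests (one per collection) on top of the ranking pages themselves.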

All the rates are expressed in percentages.


It's worth mentioning that the options described here (Network, Maximum rows, Properties, and Additional details) apply to the next two sections, 2️⃣.1️⃣ and 2️⃣.2️⃣, corresponding to each ranking type.

There are two available ranking types:

  • All collections. A customized list of 'top' collections according to user-defined rules. [default]
  • Trending collections. A list of collections that the website considers to be 'hot' right now.

The details (i.e., columns) collected by these ranking types are exactly the same. The difference lies in how the results (i.e., rows) are selected.

Network restricts the collections to the ones belonging to any of these:

  • All networks. I.e., no restriction. [default]
  • Binance Smart Chain. I.e., Binance's own network.
  • Ethereum
  • Polygon
  • Bitcoin

In September 2023, Binance decided to part ways with the Polygon blockchain as a business move we don't fully understand. Because of that, Polygon's collections are not shown in the rankings anymore. They (and their NFTs) can still be accessed directly and appear in the results, but are planned to be delisted by the end of 2023. 😞



All collections




These options are exclusive to the scraping mode 'Collection rankings' and the ranking type 'All collections'.

Sort metric is pretty much self-explanatory when you see the available options:

  • Total volume [default]
  • Total volume rate %
  • Floor price
  • Floor price rate %

Sort direction acts as a modifier for the previous option:

  • Ascending
  • Descending [default]

Both sort metric and direction may seem useless considering that once a dataset is exported, you can sort it by whatever criteria you wish.

However, notice that we are querying the Binance rankings, so the results vary a lot depending on the sort parameters.

E.g., the top ten collections by volume aren't going to be the same as the top ten by price, etc.

Period works exactly as in the 'Specific collection(s)' mode:

  • Last hour
  • Last 6 hours
  • Last 24 hours [default]
  • Last 7 days
  • Last 30 days
  • Lifetime



Trending collections


Period works as expected, but since the 'trending' ranking is limited by the website, it has fewer options:

  • Last hour
  • Last 6 hours
  • Last 24 hours [default]
  • Last 7 days


Collection assets


The last scraping mode is the most customizable, given that we are dealing with individual NFTs here.

Enter a collection id (or URL) and the actor will scrape all the NFTs belonging to that collection.

You can leave it blank, which will grab the NFTs of ANY collection, meaning THOUSANDS of potential results. So we trust you know what you are doing. 😉

The filters are mostly self-explanatory, so let's review only the particular details that may not be obvious at first look.

Not selecting anything in each filter group is the same as completely ignoring that filter, and that makes the available results wider.

⚠️ Don't trust the 'Assets status' filter for anything other than 'Buy now'. This Binance feature is problematic by design: selecting individual statuses works but does not guarantee exact filtering of the results, and there's nothing we can do about it because their API behaves that way.

Rarity only applies to 'Mystery boxes' and the NFTs coming from them, 'Box-exclusive'. This is a Binance-only feature, not to be confused with 'Open rarity'.

As mentioned before, Binance does not work with the Polygon network anymore, but you can filter by that and still obtain results (which is likely to continue happening until the end of 2023).

Sort metric differs slightly from what the website provides for sorting the NFTs.

But combining it with Sort direction will give us these useful and well-known criteria:

  • Date of listing (along with 'Descending' gets the 'Recently listed' assets)
  • Date of sale end (along with 'Ascending' gets the assets 'Ending soon')
  • Date of creation (along with 'Descending' gets the 'Recently created' assets)
  • Price [default]
  • Favorites (along with 'Descending' gets the 'Most favorited' assets)
  • Score (along with 'Descending' gets the 'Most popular' assets)
  • Open rarity (along with 'Ascending' gets the rarest assets)

Additionally, you can play with the options and get other not-that-useful results. For example:

  • Date of listing (along with 'Ascending' would get the earliest listed assets)
  • Favorites (along with 'Ascending' would get the least favorited assets)
  • Open rarity (along with 'Descending' would get the most common assets)

If a date is missing, it will be treated as the oldest date when sorting in ascending order, which causes rows with missing dates to be moved to the top.
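For instance, reproducing that ordering on exported data might look like the snippet below (a sketch only; 'assets' is assumed to be the list of exported rows):

# Ascending sort by listing date, treating missing dates as the oldest value.
# The ISO 8601 strings sort correctly as plain strings, so an empty string sorts first.
assets_sorted = sorted(assets, key=lambda a: a.get("dateList") or "")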

The favorites 'count' can be negative, possibly due to dislikes or other reasons. Don't be surprised if you come across this.

Maximum rows sets the length of the list of scraped assets. There is no limit for this value, and collections can include any number of assets from only a few tens to tens of thousands.

We are confident that you will keep this in mind to prevent the actor from consuming unnecessary resources.

Include additional details, when enabled, extends the results with this information:

  1. The collection's 'contract address' in the respective blockchain.
  2. Details about the creator of the NFT.
  3. Details about the owner of the NFT.
  4. How much the asset was purchased for (by the owner).
  5. The properties/attributes/traits of the NFT.
  6. Bids information (lowest bid, highest bid and total bids)
  7. Offers information (lowest offer, highest offer and total offers)

Values 1-5, the bids, and the offers each require a separate request. So be careful when scraping many assets, as this can take considerable time.
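As a rough, hypothetical estimate: 500 assets with this option enabled means around 500 × 3 = 1,500 extra requests on top of the listing pages, so size 'Maximum rows' accordingly.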



Advanced options


Most of the time, you won't need to make any changes in this section unless you wish to customize the actor's behavior for specific situations.

Maximum number of threads is the number of simultaneous scraping requests this actor can run at a given time.

Request timeout is the amount of time given to each individual request to complete. After this interval, incomplete requests are abandoned.

Request retries is the number of times a failed request is retried before giving up and acknowledging the failure.

Request cooldown is the amount of time to wait before retrying each failed request.
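To visualize how these four options interact, here is a minimal, illustrative sketch (not the actor's actual implementation; the 'requests' library is used purely as an example):

import time
import requests

def fetch(url, timeout=30, retries=3, cooldown=5):
    """Illustrative retry loop; the actor runs several of these concurrently
    (up to 'Maximum number of threads')."""
    for attempt in range(retries + 1):
        try:
            # 'Request timeout': give up on this attempt after `timeout` seconds
            return requests.get(url, timeout=timeout)
        except requests.RequestException:
            if attempt == retries:   # 'Request retries' exhausted
                raise                # acknowledge the failure
            time.sleep(cooldown)     # 'Request cooldown' before trying again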



This actor can run without proxies, but, like any scraper, proxies can be used to simulate 'fair use' by multiple users, ensuring anonymity and reducing the likelihood of triggering certain flags.


People used to think that proxies could slow down requests and degrade their success rate. However, any decent proxy list, including the ones Apify provides, is more than sufficient.


Binance is known for its hostility towards automation and issues IP bans, sometimes lasting for days. So you have been warned. Use proxies. 😋

Having said that, these are the available proxy options:

  • Datacenter proxies: These are the fastest and most affordable ones provided by Apify. Combine them with 'Choose proxies automatically' to let the system decide what's best for you. Additionally, you are free to select a single country to simulate the origin of the actor's requests, but there's usually no need to bother with that.

  • Residential proxies: The most reliable ones, with the best chances of successful requests, at the expense of a higher cost. Country selection is also possible here.

  • Special proxies: In a nutshell: don't use these. This is where you find SERP proxies, which are not suitable for this actor.

  • Own proxies: If you bought proxies from a third party, this is the place to configure them. Enter one or more URLs with the format http://username:password@my-proxy.example.com. A sketch of how a proxy configuration typically fits into the input is shown right after this list.
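The snippet below follows Apify's usual proxy-input convention; the 'proxyConfiguration' field name is an assumption here, not something confirmed by this actor's schema:

# Hypothetical proxy section of the actor's input.
run_input = {
    # ...other options...
    "proxyConfiguration": {
        "useApifyProxy": True,
        "apifyProxyGroups": ["RESIDENTIAL"],  # or omit to choose proxies automatically
        # "proxyUrls": ["http://username:password@my-proxy.example.com"],  # own proxies instead
    },
}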

This final group of options really requires a deeper explanation, with examples, etc., to be fully understood, and we'll do that next. But the main idea is that you don't need to worry about it. 😉


Dataset optimization

This option, when enabled, reduces the horizontal size of the exported data by unifying the columns belonging to the same top-level group. Only some columns like properties or fees may need this.

Let's see an unoptimized collection's properties export. Notice the amount of wasted space for a collection with only 2 properties, 'Relic' and 'Role':

| properties/Relic/0/total | properties/Relic/0/value | properties/Role/0/total | properties/Role/0/value | properties/Role/1/total | properties/Role/1/value | properties/Role/2/total | properties/Role/2/value |
|---|---|---|---|---|---|---|---|
| 84 | KodaPendant | 15793 | Hunter | 12239 | Farmer | 7158 | Enchanter |

Now look how simple it becomes when optimized.

properties
Relic: KodaPendant (84)
Role: Hunter (15793), Farmer (12239), Enchanter (7158)

For programmatic post-processing this may not be ideal, but from a human point of view, it's neat and concise.


Another sample: an unoptimized asset's platform fees export. One single asset/row, to keep it simple:

| platformFees/0/productSource | platformFees/0/rate | platformFees/0/vipRate | platformFees/1/productSource | platformFees/1/rate | platformFees/1/vipRate |
|---|---|---|---|---|---|
| Binance | 1 | | OpenSea | 2.5 | |

See the beauty of the optimization for the previous export:

platformFees
Binance: 1%
OpenSea: 2.5%
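For reference, here's a rough sketch of the kind of transformation this option applies to the properties group (illustrative only, not the actor's actual code):

def optimize_properties(properties: dict) -> str:
    """Collapse {'Role': [{'value': 'Hunter', 'total': 15793}, ...]}
    into lines like 'Role: Hunter (15793), ...', one per top-level group."""
    lines = []
    for name, values in properties.items():
        entries = ", ".join(f"{v['value']} ({v['total']})" for v in values)
        lines.append(f"{name}: {entries}")
    return "\n".join(lines)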

Dataset cleanse

Enabling this gets rid of collections/assets which don't include enough information to be considered valuable.

For example, a 'null' Network or Currency means that all the amounts (volumes, prices, etc.) in that row are either missing as well or invalid.

A collection with no assets (i.e., total items being zero) only wastes space in the results.

An asset with no defined marketplace and/or trade type has probably been de-listed or is otherwise inactive.

There may be other conditions worth discarding results for, but after a thorough analysis, we concluded that the ones described here are the most prominent.

Notice that the omitted results are still shown in the actor's log.
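To make the rules above concrete, here's a minimal sketch of such a cleanse predicate (field names taken from the output samples further below; illustrative only, not the actor's actual code):

def is_valuable(row: dict) -> bool:
    """Illustrative version of the cleanse rules described above."""
    if row.get("network") is None or row.get("currency") is None:
        return False   # amounts would be missing or invalid
    if row.get("totalItems") == 0:
        return False   # a collection with no assets only wastes space
    if row.get("productSource") is None or row.get("tradeType") is None:
        return False   # probably a de-listed or inactive asset
    return True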


Show warnings

Reporting an issue is much more useful when you attach the information shown in the actor's log after activating the warnings.

Apart from a bigger log, this won't hurt anybody. Most of the time, you get messages like this:

!! request failed: [Binance 000003 - Too many requests. Please try again later.] url=... (3 retries remaining) !!

Remember when we talked about hostility? Well, here it is in action. That's just a notification; insisting with the same IP could get it blocked for who knows how long, which is why proxies are recommended.

Other common warnings are about missing information. For example:

!! [getNFTAPICollectionProperties()] Collection properties request failed: Response data is empty - collection '...' !!

This just means that we found a collection with no properties/attributes/traits. 👨🏽‍🔬 A completely normal phenomenon.

Failures related to different issues are very rare.


Allow incompletion

This one's directly related to the previous option, and works as follows:

If a request fails and we are allowing incomplete information, the data that the request was attempting to populate remains empty without causing any interruption; otherwise, the entire process fails too.

Having the warnings enabled lets you know what happened in these cases.



Output


Before explaining how the datasets will look, it's important to know two essential aspects about the data gathered from Binance.

  1. There can be empty fields, so you will often find 'null' values, meaning that the content is either not available or not applicable.

    For some of those, the reason is easy to figure out. For example, an asset with no offers (i.e., 'totalOffers' is zero) is certain to have the offer values ('lowestOffer' and 'highestOffer') set to 'null'.

    On the other hand, an asset with a 'score' (the index of popularity) set to 'null' makes little sense. It probably means that they did not bother to calculate it, though arguably it should be set to '0.0'.

  2. We need to accept a small amount of imprecision, both in terms of time and money, imposed by the website itself. The information is not generated in real time, and that is not our objective either. With that in mind:

    The volumes, prices, and other currency amounts, along with dependent values (e.g., rates), aren't perfectly up to date, at least not to the second.

    Rankings can also take some minutes to be updated. That's the reason the values in these lists sometimes differ from what you get when browsing individual collections.

    The currency conversion table is loaded at the beginning of the actor's run, and the same exchange ratios are used throughout the process. Constantly updating this table would waste processing time and bandwidth.

Moving forward, remember that there are 3 scraping modes for this actor. 'Specific collection(s)' and 'Collection rankings' refer to (guess what) collections and hence produce results using the same layout. Let's see a 1-row sample:

[
  {
    "collectionId": "696169778363047937",
    "collectionName": "Otherside Vessels",
    "collectionUrl": "https://www.binance.com/en/nft/collection/696169778363047937",
    "collectionCover": "https://public.nftstatic.com/static/nft/res/nft-extdata-loader/S3/1680744730441_4lg38ehc0z91k4qmw66teiz4xj5f9ozz.jpg",
    "contractAddress": "0x5b1085136a811e55b2bb2ca1ea456ba82126a376",
    "currency": "BNB",
    "floorPrice": "1.40347061545580749653",
    "floorPrice24h": "1.40517704766311892642",
    "floorPriceRate": 21.69,
    "floorPriceRate24h": 1.13,
    "network": "ETH",
    "properties": {
      "Relic": [
        {
          "value": "KodaPendant",
          "total": 84
        }
      ],
      "Role": [
        {
          "value": "Hunter",
          "total": 15793
        },
        {
          "value": "Farmer",
          "total": 12239
        },
        {
          "value": "Enchanter",
          "total": 7158
        }
      ]
    },
    "royaltyFeeRate": 0,
    "totalItems": 43012,
    "totalListed": 117,
    "totalOwners": 7555,
    "verified": true,
    "volume": "8491.66201712170291531698",
    "volume24h": "577.21772373900971772328",
    "volumeLifetime": "68727.09585330865340120315",
    "volumeRate": 37.72,
    "volumeRate24h": -67.98
  }
]

Now, let me explain a few things about the information collected above...

  • You may see that this collection runs on the Ethereum network, but the actor was instructed to convert the currency amounts (volume, volume24h, volumeLifetime, floorPrice and floorPrice24h) to the one Binance uses. So, network is 'ETH' and currency is 'BNB'.

  • Therefore, the contractAddress belongs to the Ethereum blockchain, and you can verify it with any Ethereum block explorer.

  • As you have probably noticed, the amounts are represented as strings. If you wonder why, remember that in the crypto world every decimal matters, and precision loss can lead to huge differences in currency conversions (see the sketch after this list).

  • The amounts volume and floorPrice refer to the period selected by the user. For this example, it was 'Last 7 days' (run in September 2023, if you ask).

  • The amounts volume24h and floorPrice24h are obviously for the 'Last 24 hours'. And yes, since this same period can be selected from the list of options, doing so results in volume and floorPrice holding the same values as volume24h and floorPrice24h, respectively.

  • As pointed out previously in this document, rates (volumeRate, volumeRate24h, floorPriceRate and floorPriceRate24h) are expressed in percentages. Consistently, volumeRate and floorPriceRate will match the 24-hour rates when 'Last 24 hours' is the selected period.

  • This collection has NFTs with two possible attributes: 'Relic' and 'Role'. 'Relic' can only be 'KodaPendant' and there are 84 items with this value. 'Role' can be either 'Hunter', 'Farmer', or 'Enchanter', and there are 15793, 12239, and 7158 items with each value, respectively.
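Since those string amounts are meant to be parsed without losing precision, a hypothetical post-processing step could use Python's Decimal instead of float:

from decimal import Decimal

# Hypothetical post-processing of one exported row: float would round these
# 20-decimal values, while Decimal stores the strings exactly.
row = {"volume": "8491.66201712170291531698", "floorPrice": "1.40347061545580749653"}
volume = Decimal(row["volume"])
floor_price = Decimal(row["floorPrice"])
print(volume / floor_price)  # a high-precision 'volume expressed in floor prices' ratio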


The results of the 'Collection assets' scraping mode use this other layout. Check this 1-row sample:

[
  {
    "assetId": "76869756",
    "assetName": "Doodle #4609",
    "assetType": "NFT",
    "assetUrl": "https://www.binance.com/en/nft/item/76869756",
    "assetCover": "https://public.nftstatic.com/static/nft/res/nft-extdata-loader/S3/1681528324012_zfqis72luh65t8rieol3rtjxqk8kil27.jpg",
    "assetMedia": "https://public.nftstatic.com/static/nft/res/nft-extdata-loader/S3/1681528324012_zfqis72luh65t8rieol3rtjxqk8kil27.jpg",
    "collectionId": "533647159874920449",
    "collectionName": "Doodles",
    "collectionUrl": "https://www.binance.com/en/nft/collection/533647159874920449",
    "contractAddress": "0x8a90cab2b38dba80c64b7734e58ee1db38b8992e",
    "category": "Collectibles",
    "creator": "Anonymous-User-cfcd2",
    "creatorUrl": "https://www.binance.com/en/nft/profile/a928af121852b2ed44c5ae45be80c589",
    "currency": "USDT",
    "dateCreate": "2023-09-04T13:51:43.000Z",
    "dateEnd": "2023-10-04T13:51:37.000Z",
    "dateList": "2023-09-04T13:51:44.000Z",
    "dateStart": "2023-09-04T13:51:37.000Z",
    "highestBid": null,
    "highestOffer": "10.446",
    "lowestBid": null,
    "lowestOffer": "2.222",
    "mediaType": "IMAGE",
    "network": "ETH",
    "openRarity": 9861,
    "openRarityCount": 10000,
    "owner": "LidaLL",
    "ownerUrl": "https://www.binance.com/en/nft/profile/2c04a955afc7d44d9dc983760a9061d2",
    "platformFees": [
      {
        "productSource": "Binance",
        "rate": 1,
        "vipRate": null
      }
    ],
    "price": "2433.144",
    "priceBuyout": "0",
    "priceLastSold": "1.1",
    "productId": "197592158",
    "productSource": "Binance",
    "productStatus": 1,
    "properties": [
      {
        "rate": 3.4779867605337813,
        "total": 331,
        "type": "body",
        "value": "combo 1 puffer"
      },
      {
        "rate": 7.796574550803825,
        "total": 742,
        "type": "background",
        "value": "grey"
      },
      {
        "rate": 7.292213932962068,
        "total": 694,
        "type": "face",
        "value": "sunglasses"
      },
      {
        "rate": 10.003152253861511,
        "total": 952,
        "type": "head",
        "value": "med"
      },
      {
        "rate": 3.1522538615109807,
        "total": 300,
        "type": "hair",
        "value": "green mullet"
      }
    ],
    "rarity": null,
    "royaltyFees": [
      {
        "productSource": "Binance",
        "rate": 0.5,
        "vipRate": null
      }
    ],
    "score": null,
    "tokenId": "4609",
    "totalBids": 0,
    "totalFavorites": 1,
    "totalOffers": 4,
    "tradeType": "Buy now",
    "verified": true
  }
]

Notable elements of these newly scraped results are:

  • assetType is usually 'NFT' but it can have other values such as 'Mystery Box'.

  • assetMedia holds the URL of the raw NFT content. It is usually the same as assetCover when mediaType is 'IMAGE', but if it's 'VIDEO', the URL will point to an actual MP4 file (or another supported format). These URLs are permalinks, so you can download the files anytime.

  • Note that the dates (dateCreate, dateEnd, dateList, and dateStart) adhere to the ISO 8601 format, in UTC, to eliminate ambiguity and facilitate automated post-processing.

  • Since totalBids is zero, both highestBid and lowestBid are 'null'.

  • There are 4 offers for this asset, as you can see in the value of totalOffers, but only the highestOffer and lowestOffer are scraped.

  • openRarity and openRarityCount refer to the OpenRarity standard for NFT rarity rankings, which Binance now supports. The lower the quotient of those values, the rarer the NFT (the one in the example is pretty common: 9861/10000). But note that adoption of this feature is slow and not every collection includes it, so this pair usually appears as 'null'.

  • The arrays platformFees and royaltyFees are the fees you have to pay when trading the asset on the website. Thus, productSource will be 'Binance', but other marketplaces such as 'OpenSea' can be included for third-party listed NFTs. Remember that rate, just like the other rates, is a percentage. I haven't found a non-empty vipRate so far; it's there because the API includes it. That's all.

  • You can find the 'Buy Now' price in price and the price the owner paid in priceLastSold. If there's a running auction for the asset, price holds the 'Place a Bid' price instead, and priceBuyout contains the immediate-win price.

  • The sample asset has values set for 5 traits: 'body' is 'combo 1 puffer', 'background' is 'grey', 'face' is 'sunglasses', 'head' is 'med', and 'hair' is 'green mullet'. The percentage of assets (in this collection) having 'combo 1 puffer' as 'body' is approximately 3.5% (331 assets); for 'grey' as 'background' it is approximately 7.8% (742 assets), etc.

  • rarity, as stated in the 'Collection assets' options, only applies for 'Mystery boxes' and 'Box-exclusive' NFTs. The possible values are 'Unopened', 'Normal', 'Rare', 'Super Rare', 'Super Super Rare', and of course 'null', meaning 'not applicable'.

  • score, the previously discussed 'index of popularity' (when available), is a decimal value between 0 and 1. The highest values, closer to 1, correspond to the most popular assets.

  • tokenId, along with contractAddress, helps locate the NFT in its original marketplace and can be used to auto-generate external links, as sketched below.
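Here is a hypothetical example of such auto-generated links; the URL patterns are common explorer/marketplace conventions, not something this actor outputs:

# Hypothetical link builders based on contractAddress and tokenId.
def etherscan_nft_url(contract_address: str, token_id: str) -> str:
    return f"https://etherscan.io/nft/{contract_address}/{token_id}"

def opensea_asset_url(contract_address: str, token_id: str) -> str:
    return f"https://opensea.io/assets/ethereum/{contract_address}/{token_id}"

print(opensea_asset_url("0x8a90cab2b38dba80c64b7734e58ee1db38b8992e", "4609"))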



Output views


There are four custom views available for browsing and exporting your datasets (two for collections and two for assets), plus the generic 'All fields' view.

The way Apify works, there is no way to enable or disable views according to the current scraping mode. Hence, it's possible to browse a dataset with an incompatible view by mistake. Watch out for that. 😋

Please observe that 1️⃣, 2️⃣, and 3️⃣ identify the scraping mode. Remember that the scraping modes 'Specific collection(s)' and 'Collection rankings' share the same results layout.

  • 1️⃣&2️⃣ Collections - Compact: this layout includes Cover, Id, Name, Network, Verified status, Currency, Volume, Volume rate, Volume 24h, Volume rate 24h, Floor price, Floor price rate, Floor price 24h, Floor price rate 24h, Total owners and Total items.

  • 1️⃣&2️⃣ Collections - Detailed: same as 'Collections - Compact', but includes the collection URL, Contract address, Volume lifetime, Royalty fee rate, Total listed items and Properties.

  • 3️⃣ Assets - Compact: this includes Cover, Id, Name, Type, Collection id, Collection name, Network, Verified status, Source marketplace, Currency, Price, Creator, Price last sold, Owner, Total bids, Total Offers, Total favorites, Score, Rarity and Open rarity

  • 3️⃣ Assets - Detailed: same as 'Assets - Compact', but includes the asset Media type, URL, collection URL, Contract address, Category, Trade type, Token id, Product id, Creator URL, Owner URL, Royalty fees, Platform fees, Highest bid, Lowest bid, Highest offer, Lowest offer, Open rarity count, Create/List/Start/End dates and Properties

  • All fields: this one includes everything. Even some fields not shown in the custom layouts, at the expense of being a little messy.





That wraps up this document. We hope you have found this guide helpful, but given the number of sections covered, maybe we went too fast on something or even forgot a thing or two.

Feel free to contact us if you have any questions or need a custom development. 🙂

Happy scraping!
