USGS Earthquake Data
Pricing: Pay per usage
USGS Earthquake Data scrapes earthquake data and outputs structured results. It runs on the Apify platform with scheduling, API access, and proxy support.
Developer: Donny Nguyen
USGS Earthquake Data Scraper
What does this actor do?
USGS Earthquake Data Scraper is an Apify actor that fetches real-time worldwide earthquake data from the USGS, with filters for magnitude, location, and time range. It runs on the Apify platform and delivers structured data in JSON, CSV, or Excel formats that you can easily integrate into your workflows. For each earthquake found, the actor extracts key data fields including magnitude, place, time, depth, and more. All results are stored in an Apify dataset that you can download or access via the Apify API.
Why use this actor?
Manually collecting this data would be extremely time-consuming and error-prone. USGS Earthquake Data Scraper automates the entire process, saving you hours of manual work. This actor is ideal for data analysts, researchers, marketers, and developers who need reliable, structured data. You can schedule regular runs to keep your data fresh, integrate results directly into spreadsheets or databases, and scale your data collection without any coding required. The actor handles pagination, rate limiting, and data normalization automatically.
How does it work?
This actor connects to the USGS earthquake API to fetch structured data directly, rather than scraping web pages. It processes the responses, normalizes the data into a consistent format, and stores everything in an Apify dataset. The API-based approach is fast and reliable, with built-in error handling and retry logic.
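As a rough sketch of the API-based approach, the actor's inputs can be mapped onto a query against the public USGS FDSN event service. The endpoint and its parameter names (`format`, `minmagnitude`, `limit`, `starttime`, `orderby`) are the real USGS API; the mapping function itself is illustrative, not the actor's actual source code:

```python
from urllib.parse import urlencode
from datetime import datetime, timedelta, timezone

# Public USGS FDSN event-query endpoint
USGS_ENDPOINT = "https://earthquake.usgs.gov/fdsnws/event/1/query"

def build_query_url(min_magnitude="4.0", max_results=50, days_back=7):
    """Map the actor's input parameters onto a USGS FDSN query URL (illustrative)."""
    start = datetime.now(timezone.utc) - timedelta(days=days_back)
    params = {
        "format": "geojson",          # structured GeoJSON response
        "minmagnitude": min_magnitude,  # maps from minMagnitude
        "limit": max_results,           # maps from maxResults
        "starttime": start.strftime("%Y-%m-%dT%H:%M:%S"),  # maps from daysBack
        "orderby": "time",              # newest events first
    }
    return USGS_ENDPOINT + "?" + urlencode(params)
```

A single GET request to the resulting URL returns all matching events, which is why no browser automation or HTML parsing is needed.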
Input parameters
| Parameter | Type | Description | Default |
|---|---|---|---|
| minMagnitude | string | Minimum earthquake magnitude (e.g., 2.5) | "4.0" |
| maxResults | integer | Maximum earthquakes to return | 50 |
| daysBack | integer | How many days back to search | 7 |
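For example, a run input that fetches up to 100 earthquakes of magnitude 2.5 or greater from the last 3 days would look like this (values chosen for illustration):

```json
{
  "minMagnitude": "2.5",
  "maxResults": 100,
  "daysBack": 3
}
```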
Output fields
Each item in the output dataset contains the following fields:
| Field | Description | Format |
|---|---|---|
| magnitude | Earthquake magnitude | text |
| place | Human-readable description of the event location | text |
| time | Event time | text |
| depth | Depth of the event in kilometers | text |
| latitude | Latitude in decimal degrees | text |
| longitude | Longitude in decimal degrees | text |
| tsunami | Flag indicating possible tsunami hazard | text |
| alert | PAGER alert level (green, yellow, orange, or red) | text |
| status | Review status of the event (automatic or reviewed) | text |
| url | Link to the event page on the USGS website | link |
Example output:
{
  "magnitude": "Sample Magnitude",
  "place": "Sample Place",
  "time": "Sample Time",
  "depth": "Sample Depth",
  "latitude": "Sample Latitude",
  "longitude": "Sample Longitude"
}
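To illustrate the normalization step, here is a sketch of how one USGS GeoJSON feature could be flattened into the output fields listed above. The GeoJSON layout (`properties`, `geometry.coordinates` as [longitude, latitude, depth], and millisecond epoch times) is how USGS actually structures its responses; the helper function itself is an assumption, not the actor's source code:

```python
from datetime import datetime, timezone

def normalize_feature(feature):
    """Flatten one USGS GeoJSON feature into the actor's output fields (illustrative)."""
    props = feature["properties"]
    # USGS orders coordinates as [longitude, latitude, depth_km]
    lon, lat, depth = feature["geometry"]["coordinates"]
    return {
        "magnitude": props.get("mag"),
        "place": props.get("place"),
        # USGS reports event time in milliseconds since the Unix epoch
        "time": datetime.fromtimestamp(props["time"] / 1000, tz=timezone.utc).isoformat(),
        "depth": depth,
        "latitude": lat,
        "longitude": lon,
        "tsunami": props.get("tsunami"),
        "alert": props.get("alert"),
        "status": props.get("status"),
        "url": props.get("url"),
    }
```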
Cost and performance
This actor runs with a default memory allocation of 512 MB. As an API-based actor, it is very efficient and typically costs around $0.05-0.15 in Apify platform credits per 1,000 results. A typical run completes in under 1 minute. You can reduce costs by limiting the number of results with the maxResults parameter and by scheduling runs during off-peak hours.
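As a back-of-the-envelope check on the figures above, a midpoint of $0.10 per 1,000 results (an assumption within the stated $0.05-0.15 range) gives:

```python
def estimate_cost(num_results, cost_per_1000=0.10):
    """Rough credit estimate; $0.10/1k is an assumed midpoint of the stated range."""
    return num_results / 1000 * cost_per_1000
```

So a run capped at `maxResults: 500` would cost roughly $0.05 in platform credits at the midpoint rate.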
Related Actors
Check out these related actors on Apify:
- Academic Citation Graph Builder
- Accessibility Auditor
- Affiliate Program Scraper
- Alibaba Supplier Scraper
- Alternativeto Scraper
Tips and best practices
- Start with a small number of results to test your configuration before scaling up.
- Use the Apify scheduling feature to automate regular data collection runs.
- Export results in the format that best fits your workflow: JSON for APIs, CSV for spreadsheets, or Excel for reports.
- Connect this actor with other actors on the Apify platform for more comprehensive data pipelines.
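To sketch the CSV export path from the tips above, dataset items shaped like the output table can be serialized with Python's standard `csv` module. This mirrors what the platform's CSV export produces; the helper is illustrative:

```python
import csv
import io

def items_to_csv(items):
    """Serialize a list of dataset items (dicts with identical keys) to CSV text."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(items[0].keys()))
    writer.writeheader()
    writer.writerows(items)
    return buf.getvalue()
```

The resulting text can be opened directly in a spreadsheet or written to a `.csv` file.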