USGS Earthquake Data Scraper

Pricing: from $4.50 / 1,000 results
Rating: 0.0 (0 reviews)
Developer: ParseForge (Maintained by Community)
Actor stats: 0 bookmarked · 2 total users · 1 monthly active user · last modified a day ago

🌍 USGS Earthquake Data Scraper

πŸš€ Extract earthquake data from the USGS seismic network. Filter by date range, magnitude, location coordinates, and alert level. Get magnitude, depth, coordinates, felt reports, tsunami warnings, and significance scores for 150K+ events per year.

πŸ•’ Last updated: 2026-04-23

Seismic data powers critical decisions in construction, insurance, journalism, and scientific research. The USGS collects earthquake data from stations worldwide, but downloading and structuring it manually is time-consuming.

The USGS Earthquake Data Scraper automates this process, pulling structured earthquake records with flexible filtering by date, magnitude, geographic bounds, and alert level. Whether you are a seismologist studying patterns, a journalist reporting on natural disasters, or a developer building alert apps, this tool delivers clean data in seconds. Pure API, no proxy needed.

Target Audience: Seismologists, journalists, engineers, app developers, insurance analysts
Primary Use Cases: Seismic research, disaster reporting, risk assessment, alert systems

πŸ“‹ What Does It Do?

This tool collects earthquake event data from the USGS seismic network, returning structured records with comprehensive details. It delivers:

  • 🌍 Global coverage - earthquakes from every seismic station worldwide
  • πŸ“Š 30 fields per event - magnitude, depth, coordinates, felt reports, significance
  • πŸ“… Date filtering - search by date range (defaults to last 30 days)
  • πŸ” Magnitude filter - min/max magnitude (e.g., only M5.0+ events)
  • πŸ—ΊοΈ Geographic bounds - filter by latitude/longitude bounding box
  • 🚨 Alert levels - filter by green, yellow, orange, red PAGER alert status
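Under the hood, filters like these map directly onto the public USGS FDSN event web service. As a rough illustration of how the same filters compose into a single query (this is a sketch of the underlying public API, not the Actor's internal code):

```python
from urllib.parse import urlencode

# Public USGS FDSN event endpoint (earthquake.usgs.gov/fdsnws/event/1/)
USGS_API = "https://earthquake.usgs.gov/fdsnws/event/1/query"

def build_query(start, end, min_mag=None, bbox=None, alert=None):
    """Compose a GeoJSON query URL. bbox = (min_lat, max_lat, min_lon, max_lon)."""
    params = {"format": "geojson", "starttime": start, "endtime": end}
    if min_mag is not None:
        params["minmagnitude"] = min_mag
    if bbox is not None:
        (params["minlatitude"], params["maxlatitude"],
         params["minlongitude"], params["maxlongitude"]) = bbox
    if alert is not None:
        params["alertlevel"] = alert  # green / yellow / orange / red
    return f"{USGS_API}?{urlencode(params)}"

# M5.0+ events in Q1 2026, limited to the California bounding box:
print(build_query("2026-01-01", "2026-04-01", min_mag=5,
                  bbox=(32.5, 42.0, -124.4, -114.1)))
```

The Actor wraps this kind of query for you, adding pagination, field flattening, and dataset storage on top.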

🎬 How to Use the USGS Earthquake Scraper - Full Demo

🚧 Demo video coming soon. Follow the step-by-step instructions below to get started in under 2 minutes.


βš™οΈ Input

Configure your earthquake search with date, magnitude, and location filters.

| Field | Type | Description |
| --- | --- | --- |
| Max Items | Integer | Free users: limited to 10 records; paid users: up to 1,000,000 |
| Start Date | String | Start date (YYYY-MM-DD). Defaults to 30 days ago |
| End Date | String | End date (YYYY-MM-DD). Defaults to today |
| Minimum Magnitude | Number | Min magnitude (e.g. 4.0 for significant, 6.0 for major) |
| Maximum Magnitude | Number | Max magnitude filter |
| Alert Level | Select | PAGER alert: Green, Yellow, Orange, or Red |
| Min/Max Latitude | Number | Geographic bounding box (south/north) |
| Min/Max Longitude | Number | Geographic bounding box (west/east) |

Example 1 - Major earthquakes in Q1 2026:

{
  "startDate": "2026-01-01",
  "endDate": "2026-04-01",
  "minMagnitude": 5,
  "maxItems": 100
}

Example 2 - California earthquakes:

{
  "minLatitude": 32.5,
  "maxLatitude": 42.0,
  "minLongitude": -124.4,
  "maxLongitude": -114.1,
  "maxItems": 500
}

⚠️ Free users are limited to 10 records per run. Sign up for a paid plan to unlock up to 1,000,000 records.
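You can also start a run programmatically instead of through the console. A sketch using only the Python standard library against Apify's generic run-synchronously endpoint (the actor ID below is a guess based on the developer and actor names; copy the real one from this page):

```python
import json
import urllib.request

# Assumption: actor ID follows the usual "developer~actor-name" pattern.
# Replace both placeholders with values from your Apify console.
ACTOR_ID = "parseforge~usgs-earthquake-data-scraper"
APIFY_TOKEN = "YOUR_APIFY_TOKEN"

run_input = {
    "startDate": "2026-01-01",
    "endDate": "2026-04-01",
    "minMagnitude": 5,
    "maxItems": 100,
}

# run-sync-get-dataset-items starts the run and returns the dataset items
# in one request once the run finishes.
url = (
    f"https://api.apify.com/v2/acts/{ACTOR_ID}/run-sync-get-dataset-items"
    f"?token={APIFY_TOKEN}"
)
request = urllib.request.Request(
    url,
    data=json.dumps(run_input).encode(),
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(request) would execute the run; omitted here so the
# sketch stays side-effect free.
```

The same call works from any language that can issue an HTTP POST; the Apify SDKs for Node.js and Python wrap it with retries and typed helpers.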


πŸ“Š Output

🧾 Output Schema

| Field | Type | Description |
| --- | --- | --- |
| eventId | string | USGS event identifier |
| magnitude | number | Earthquake magnitude |
| magnitudeType | string | Magnitude scale used |
| place | string | Location description |
| time | string | Event timestamp (ISO 8601) |
| longitude | number | Epicenter longitude |
| latitude | number | Epicenter latitude |
| depth | number | Depth in kilometers |
| tsunami | integer | Tsunami warning flag (0/1) |
| alert | string | PAGER alert level |
| significance | integer | Significance score |
| felt | integer | Number of felt reports |
| url | string | USGS event page URL |
| scrapedAt | string | Timestamp of data collection |

πŸ“¦ Sample Output

Sample 1 - Major earthquake:

{
  "eventId": "us7000sbbr",
  "magnitude": 6.2,
  "magnitudeType": "mww",
  "place": "128 km SSW of Taron, Papua New Guinea",
  "time": "2026-03-15T08:23:45.123Z",
  "longitude": 152.8234,
  "latitude": -5.4321,
  "depth": 35.2,
  "tsunami": 0,
  "alert": "green",
  "significance": 592,
  "felt": 24,
  "url": "https://earthquake.usgs.gov/earthquakes/eventpage/us7000sbbr",
  "scrapedAt": "2026-04-09T00:00:00.000Z"
}

Sample 2 - Moderate earthquake:

{
  "eventId": "us7000sabc",
  "magnitude": 4.8,
  "magnitudeType": "mb",
  "place": "45 km NE of Ridgecrest, California",
  "time": "2026-03-10T14:15:30.000Z",
  "longitude": -117.4521,
  "latitude": 35.8912,
  "depth": 8.5,
  "tsunami": 0,
  "alert": "green",
  "significance": 354,
  "felt": 1256,
  "url": "https://earthquake.usgs.gov/earthquakes/eventpage/us7000sabc",
  "scrapedAt": "2026-04-09T00:00:00.000Z"
}

Sample 3 - Deep earthquake:

{
  "eventId": "us7000sxyz",
  "magnitude": 5.5,
  "magnitudeType": "mww",
  "place": "Fiji region",
  "time": "2026-02-28T03:45:12.000Z",
  "longitude": 179.1234,
  "latitude": -17.5678,
  "depth": 520.3,
  "tsunami": 0,
  "alert": null,
  "significance": 465,
  "felt": 0,
  "url": "https://earthquake.usgs.gov/earthquakes/eventpage/us7000sxyz",
  "scrapedAt": "2026-04-09T00:00:00.000Z"
}
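Flat records like the samples above correspond closely to USGS's raw GeoJSON feature layout (magnitude, place, and alert live under `properties`; longitude, latitude, and depth form the `geometry` coordinates). A sketch of that mapping, assuming the standard USGS GeoJSON shape rather than the Actor's actual code:

```python
from datetime import datetime, timezone

def flatten(feature):
    """Map one USGS GeoJSON feature to a flat record like the samples above."""
    props = feature["properties"]
    lon, lat, depth = feature["geometry"]["coordinates"]
    return {
        "eventId": feature["id"],
        "magnitude": props["mag"],
        "magnitudeType": props["magType"],
        "place": props["place"],
        # USGS reports event time as milliseconds since the Unix epoch
        "time": datetime.fromtimestamp(props["time"] / 1000, tz=timezone.utc)
                .isoformat(timespec="milliseconds").replace("+00:00", "Z"),
        "longitude": lon,
        "latitude": lat,
        "depth": depth,
        "tsunami": props["tsunami"],
        "alert": props["alert"],
        "significance": props["sig"],
        "felt": props["felt"],
        "url": props["url"],
    }

# Abbreviated feature built from Sample 1 above
sample = {
    "id": "us7000sbbr",
    "properties": {
        "mag": 6.2, "magType": "mww",
        "place": "128 km SSW of Taron, Papua New Guinea",
        "time": 1773563025123, "tsunami": 0, "alert": "green",
        "sig": 592, "felt": 24,
        "url": "https://earthquake.usgs.gov/earthquakes/eventpage/us7000sbbr",
    },
    "geometry": {"coordinates": [152.8234, -5.4321, 35.2]},
}
print(flatten(sample)["time"])  # 2026-03-15T08:23:45.123Z
```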

✨ Why Choose the USGS Earthquake Data Scraper?

| Advantage | Details |
| --- | --- |
| 🌍 Global coverage | Every seismic station worldwide |
| πŸ“Š 30 fields per event | Magnitude, depth, coordinates, felt reports, alerts |
| πŸ“… Flexible date filtering | Any date range from decades of history |
| πŸ—ΊοΈ Geographic bounding box | Focus on any region of the world |
| 🚨 Alert level filtering | Target only significant or severe events |
| ⚑ Pure API, no proxy | Fast, reliable, no additional costs |
| ⏰ Scheduled monitoring | Set up hourly alerts for seismic activity |

πŸ“ˆ How Does It Compare?

| Feature | Our Tool | Manual USGS Site |
| --- | --- | --- |
| Batch collection | Up to 1M events | Page-by-page |
| Date + magnitude filtering | Combined | Separate queries |
| Geographic bounding box | Built-in | Interactive map only |
| Structured output | JSON, CSV, Excel | Web tables |
| Automated scheduling | Hourly/daily | Not possible |
| Alert level filtering | Yes | Limited |

πŸš€ How to Use

  1. Sign Up - Create a free account with $5 credit (takes 2 minutes)
  2. Find the Tool - Search for "USGS Earthquake Data Scraper" in the Apify Store
  3. Set Input - Configure date range, magnitude filter, and geographic bounds
  4. Run It - Click "Start" and get structured earthquake data in seconds
  5. Download Data - Export results as CSV, Excel, or JSON from the Dataset tab

πŸ’Ό Business Use Cases

Seismologists and Researchers:

  • Build earthquake catalogs for research and analysis
  • Study seismic patterns by region and depth

Journalists:

  • Get structured data on recent seismic events for reporting
  • Create data visualizations of earthquake activity

Engineers and Construction:

  • Assess seismic risk for construction projects by location
  • Research historical earthquake activity for building codes

Insurance and Risk:

  • Analyze earthquake frequency and intensity by region
  • Feed seismic data into risk models
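As an example of the risk-analysis workflow, a few lines of standard-library Python can turn exported records into per-region frequency counts (field names follow the output schema above; the region heuristic is an illustrative assumption, since USGS `place` strings usually end with the region name):

```python
from collections import Counter

# A few records shaped like the Actor's output
records = [
    {"place": "45 km NE of Ridgecrest, California", "magnitude": 4.8},
    {"place": "Fiji region", "magnitude": 5.5},
    {"place": "102 km SW of Ridgecrest, California", "magnitude": 3.9},
]

def region(place):
    """Crude region key: take the text after the last comma, if any."""
    return place.split(", ")[-1] if ", " in place else place

# Count M4.0+ events per region
counts = Counter(region(r["place"]) for r in records if r["magnitude"] >= 4.0)
print(counts)  # Counter({'California': 1, 'Fiji region': 1})
```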





🌟 Beyond business use cases

Data like this powers more than commercial workflows. The same structured records support research, education, civic projects, and personal initiatives.

πŸŽ“ Research and academia

  • Empirical datasets for papers, thesis work, and coursework
  • Longitudinal studies tracking changes across snapshots
  • Reproducible research with cited, versioned data pulls
  • Classroom exercises on data analysis and ethical scraping

🎨 Personal and creative

  • Side projects, portfolio demos, and indie app launches
  • Data visualizations, dashboards, and infographics
  • Content research for bloggers, YouTubers, and podcasters
  • Hobbyist collections and personal trackers

🀝 Non-profit and civic

  • Transparency reporting and accountability projects
  • Advocacy campaigns backed by public-interest data
  • Community-run databases for local issues
  • Investigative journalism on public records

πŸ§ͺ Experimentation

  • Prototype AI and machine-learning pipelines with real data
  • Validate product-market hypotheses before engineering spend
  • Train small domain-specific models on niche corpora
  • Test dashboard concepts with live input


❓ Frequently Asked Questions

πŸ’³ Do I need a paid Apify plan to run this actor?

No. You can start right now on the free Apify plan, which includes $5 in free monthly credit. That is enough to run this actor several times and explore the output before committing to anything. Paid plans unlock higher limits, more concurrent runs, and larger datasets. Create a free Apify account here to get started.

🚨 What happens if my run fails or returns no results?

Failed runs are not charged. If the source API changes, requests get rate-limited, or a specific input matches nothing, re-run the actor or open our contact form and we will investigate. You can also check the run log in the Apify console to see why the run stopped.

πŸ“ How many items can I scrape per run?

Free users are limited to 10 items per run so you can preview the output and confirm the actor works for your use case. Paid users can raise maxItems up to 1,000,000 per run. Upgrade here if you need full scale.

πŸ•’ How fresh is the data?

Every run fetches live data at the moment of execution. There is no cache or delay: the records you get reflect what the source returned at that moment. Schedule the actor to maintain a rolling snapshot of the data you need.

πŸ§‘β€πŸ’» Can I call this actor from my own code?

Yes. Apify exposes every actor as a REST endpoint and ships first-class SDKs for Node.js and Python. You can start a run, read the dataset, and handle webhooks from your own app in a few lines. All you need is your Apify API token.

πŸ“€ How do I export the data?

Every Apify dataset can be downloaded in one click from the console as CSV, JSON, JSONL, Excel, HTML, XML, or RSS. You can also pull results programmatically via the Apify API or stream them into BigQuery, S3, and other destinations through built-in integrations.
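Programmatic export boils down to a single HTTP GET against the dataset items endpoint. A sketch of building that URL (the dataset ID and token are placeholders; you get the real dataset ID from a finished run):

```python
from urllib.parse import urlencode

def dataset_export_url(dataset_id, token, fmt="csv"):
    """Build the Apify dataset download URL for a given export format."""
    params = urlencode({"format": fmt, "token": token})
    return f"https://api.apify.com/v2/datasets/{dataset_id}/items?{params}"

# "abc123" is a placeholder dataset ID
print(dataset_export_url("abc123", "YOUR_APIFY_TOKEN", fmt="json"))
```

Fetching that URL with any HTTP client (curl, urllib, requests) downloads the dataset in the requested format.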

πŸ“… Can I schedule the actor to run automatically?

Yes. Use the Apify scheduler to run the actor on any cadence, from hourly to monthly. Results are saved to your dataset and can be delivered to webhooks, email, Slack, cloud storage, or automation tools such as Zapier and Make.


πŸ”Œ Automating Your Earthquake Data Collection

Schedule hourly runs for real-time monitoring or daily runs for research. Use the Apify scheduler and push data to Slack alerts, Google Sheets, or your custom monitoring dashboard.

πŸ”Œ Integrate USGS Earthquake Scraper with Any App

Connect your earthquake data to thousands of apps using these integrations:

  • Make - Automate seismic monitoring workflows
  • Zapier - Get alerts on significant earthquakes
  • Slack - Post earthquake alerts to your channel
  • Google Drive - Export data to spreadsheets
  • Airbyte - Sync to your data warehouse
  • Webhooks - Trigger actions when runs complete
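To illustrate the webhook path: when a run finishes, Apify POSTs a JSON payload that includes the run's default dataset ID, which downstream services can use to fetch the fresh earthquake records. A minimal handler sketch (the payload below is abbreviated; consult Apify's webhook documentation for the full schema):

```python
import json

def handle_webhook(body: bytes):
    """Pull the dataset ID out of a successful-run webhook payload."""
    payload = json.loads(body)
    if payload.get("eventType") != "ACTOR.RUN.SUCCEEDED":
        return None  # ignore failed or aborted runs
    return payload["resource"]["defaultDatasetId"]

# Abbreviated example payload, shaped like Apify's run-event webhooks
body = json.dumps({
    "eventType": "ACTOR.RUN.SUCCEEDED",
    "resource": {"defaultDatasetId": "abc123"},
}).encode()
print(handle_webhook(body))  # abc123
```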



Looking for more data collection tools? Check out these related actors:

| Actor | Description | Link |
| --- | --- | --- |
| GSA eLibrary Scraper | Government contract data | Link |
| FINRA BrokerCheck Scraper | Broker registration data | Link |
| FAA Aircraft Registry Scraper | Aircraft registration records | Link |
| Pitchbook Funds Scraper | Private fund profiles | Link |
| GreatSchools Scraper | School ratings and district data | Link |

Pro Tip: πŸ’‘ Browse the full ParseForge catalog to find more data tools.


πŸ†˜ Need Help?


⚠️ Disclaimer: This Actor is an independent tool and is not affiliated with, endorsed by, or connected to the United States Geological Survey (USGS). It accesses only publicly available data through official public APIs.