# 💾 MySQL Database Extractor

📅 Last updated: 2026-05-05
Extract entire MySQL databases into Apify storage without writing a single line of code. Connect securely with your credentials, select tables to extract, and download results as JSON, CSV, or Excel. Perfect for database backups, data migration, consolidation, and compliance audits.
The MySQL Database Extractor connects to any MySQL database and exports all table data with up to 1,000,000 rows per table, SSL encryption, table filtering, and automatic dataset creation.
## ✨ What Does It Do?
- 📋 Table Names - Identify which tables are being extracted and monitor database structure
- 🔢 Row Counts - See how many records exist in each table to plan extraction scope
- 🔗 Dataset URLs - Access direct links to each table's extracted data for easy sharing
- 📐 Column Schemas - Retrieve field names and data types to understand your database
- 🕒 Extraction Timestamp - Track when data was pulled for audit trails and compliance
- 🗂️ Organized Datasets - Each table is automatically saved to its own dataset for easy management
## 🔧 Input
- Connection String - paste a full MySQL URL (optional, or use individual host/port/database fields)
- Host - your MySQL server hostname or IP address where the database lives
- Port - MySQL port number, typically 3306 (verify with your database administrator)
- Database - name of the specific database you want to extract from
- Username - MySQL user account with read access to the database
- Password - password for the MySQL user account
- Use SSL - enable encrypted connections (required for AWS RDS, cloud databases, and secure networks)
- Tables to Extract - specify exact table names to extract (leave blank to extract all tables)
- Tables to Exclude - list tables to skip, like logs or temporary data
- Max Rows Per Table - limit extraction to N rows per table to control data volume
Example input:

```json
{
  "host": "your-database.example.com",
  "port": 3306,
  "database": "myapp_db",
  "username": "appuser",
  "password": "securePassword123",
  "ssl": true,
  "tables": ["users", "products"],
  "maxRowsPerTable": 10000
}
```
## 📤 Output
Each extraction produces structured data for every table in your database. Download as JSON, CSV, Excel, or other formats.
Each table's summary record includes:

- 📋 Table Name
- 🔢 Row Count
- 📅 Extraction Date
- 🔗 Dataset URL
- 🗂️ Dataset Name
- 📐 Column Schemas - field names, data types, and column positions
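A single summary record might look like the following sketch. The field names here are purely illustrative, not the actor's exact output schema:

```json
{
  "tableName": "users",
  "rowCount": 15234,
  "extractedAt": "2026-05-05T12:00:00Z",
  "datasetName": "mysql-extract-users",
  "datasetUrl": "https://api.apify.com/v2/datasets/EXAMPLE_DATASET_ID/items",
  "columns": [
    { "name": "id", "type": "int", "position": 1 },
    { "name": "email", "type": "varchar(255)", "position": 2 }
  ]
}
```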
## 🏆 Why Choose the MySQL Database Extractor?
| Feature | Our Actor | Similar Tools |
|---|---|---|
| Connect with standard credentials only | ✔️ | ❌ |
| Extract all tables automatically with one click | ✔️ | Partial |
| Exclude sensitive tables from extraction | ✔️ | ❌ |
| Set row limits per table to control volume | ✔️ | Partial |
| Each table saved to a separate dataset automatically | ✔️ | ❌ |
| Column schemas included for data mapping | ✔️ | Partial |
| Supports 1,000,000 rows per table | ✔️ | ❌ |
| No coding required - full configuration via UI | ✔️ | ❌ |
| Batch processing for efficient large-table extraction | ✔️ | ❌ |
| Secure SSL and credential handling | ✔️ | ✔️ |
## 🚀 How to Use
No technical skills required. Follow these simple steps:

1. Sign Up: Create a free Apify account with $5 in credit
2. Find the Tool: Search for "MySQL Database Extractor" in the Apify Store and configure your database connection details
3. Run It: Click Start and watch your data extraction progress

That's it. No coding, no setup, no complicated configuration. You can then export your data in CSV, Excel, or JSON format.
## 🎯 Business Use Cases
- 📊 Data Analyst - Extract customer and transaction tables weekly to build dashboards in Excel for real-time business insights
- 💾 DevOps Engineer - Back up production databases to Apify datasets before deployments, then restore if issues occur during the release
- 🔒 Compliance Officer - Audit sensitive user data quarterly by extracting and archiving customer records to meet GDPR and SOX requirements
## 🌍 Beyond business use cases
Data like this powers more than commercial workflows. The same structured records support research, education, civic projects, and personal initiatives.
## 🤖 Ask an AI assistant about this scraper
Open a ready-to-send prompt about this ParseForge actor in the AI of your choice:
- 💬 ChatGPT
- 🧠 Claude
- 🔍 Perplexity
- 🌐 Copilot
## ❓ Frequently Asked Questions
### 💳 Do I need a paid Apify plan to run this actor?

No. You can start right now on the free Apify plan, which includes $5 in free monthly credit. That is enough to run this actor several times and explore the output before committing to anything. Paid plans unlock higher limits, more concurrent runs, and larger datasets. Create a free Apify account to get started.
### 🚨 What happens if my run fails or returns no results?

Failed runs are not charged. If your credentials are rejected, the connection times out, or a table filter matches nothing, re-run the actor or open our contact form and we will investigate. You can also check the run log in the Apify Console to see why the run stopped.
### 🔢 How many rows can I extract per run?

Free users are limited to 10 rows per run so you can preview the output and confirm the actor works for your use case. Paid users can raise maxRowsPerTable up to 1,000,000 rows per table. Upgrade your plan if you need full scale.
### 🕒 How fresh is the data?

Every run queries your database live at the moment of execution. There is no cache or delay: the records you get reflect what your database returned at that moment. Schedule the actor to maintain a rolling snapshot of the data you need.
### 🧑‍💻 Can I call this actor from my own code?
Yes. Apify exposes every actor as a REST endpoint and ships first-class SDKs for Node.js and Python. You can start a run, read the dataset, and handle webhooks from your own app in a few lines. All you need is your Apify API token.
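For example, here is a minimal Python sketch using the official apify-client package. The actor ID and input values are illustrative; copy the real ID from the actor's API tab:

```python
from apify_client import ApifyClient

# Authenticate with your Apify API token.
client = ApifyClient("YOUR_APIFY_TOKEN")

# Start the actor and wait for the run to finish.
# The actor ID below is an assumption; use the one shown on the actor's API tab.
run = client.actor("parseforge/mysql-database-extractor").call(
    run_input={
        "host": "your-database.example.com",
        "port": 3306,
        "database": "myapp_db",
        "username": "appuser",
        "password": "securePassword123",
        "ssl": True,
        "maxRowsPerTable": 10000,
    }
)

# Iterate over the summary records stored in the run's default dataset.
for item in client.dataset(run["defaultDatasetId"]).iterate_items():
    print(item)
```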
### 📤 How do I export the data?
Every Apify dataset can be downloaded in one click from the console as CSV, JSON, JSONL, Excel, HTML, XML, or RSS. You can also pull results programmatically via the Apify API or stream them into BigQuery, S3, and other destinations through built-in integrations.
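As an illustration, the Python sketch below pulls a finished dataset as CSV through the public Apify API; the dataset ID and output file name are placeholders:

```python
import requests

DATASET_ID = "YOUR_DATASET_ID"  # copy this from the run's Storage tab
API_TOKEN = "YOUR_APIFY_TOKEN"

# The dataset items endpoint accepts a format parameter (json, csv, xlsx, ...).
url = f"https://api.apify.com/v2/datasets/{DATASET_ID}/items"
response = requests.get(url, params={"format": "csv", "token": API_TOKEN})
response.raise_for_status()

with open("extracted_table.csv", "wb") as f:
    f.write(response.content)
```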
### 📅 Can I schedule the actor to run automatically?
Yes. Use the Apify scheduler to run the actor on any cadence, from hourly to monthly. Results are saved to your dataset and can be delivered to webhooks, email, Slack, cloud storage, or automation tools such as Zapier and Make.
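Schedules can also be created programmatically. The Python sketch below is a sketch only; the payload field names follow the public Apify schedules API as we understand it, so verify them against the official docs before relying on this:

```python
import requests

API_TOKEN = "YOUR_APIFY_TOKEN"

# Payload shape is our reading of the Apify schedules API; double-check the docs.
payload = {
    "name": "nightly-mysql-extract",
    "cronExpression": "0 2 * * *",  # every day at 02:00
    "isEnabled": True,
    "actions": [
        {
            "type": "RUN_ACTOR",
            # Assumed identifier; use the actor ID shown on the actor's API tab.
            "actorId": "parseforge~mysql-database-extractor",
        }
    ],
}

response = requests.post(
    "https://api.apify.com/v2/schedules",
    params={"token": API_TOKEN},
    json=payload,
)
response.raise_for_status()
print(response.json()["data"]["id"])
```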
## 💡 More ParseForge Actors
- Crunchbase Scraper - Extract company and investor data from Crunchbase
- Alibaba.com Rental Scraper - Collect rental listings and rates
- PropertyShark Commercial Property Transactions Scraper - Monitor commercial real estate deals
- WebMD Healthcare Provider Directory Scraper - Extract healthcare provider information
- Korea's Open Government Data Portal (data.go.kr) - Access public datasets from Korea
Browse our complete collection of data extraction tools for more.
## 🚀 Ready to Start?
Create a free account with $5 credit and extract your first 100 results for free. No coding, no setup.
## 🆘 Need Help?
- Check the FAQ section above for common questions
- Visit the Apify support page for documentation and tutorials
- Contact us via our Tally contact form to request a new scraper, propose a custom project, or report an issue
## ⚠️ Disclaimer
This Actor is an independent tool provided as-is. Users are responsible for complying with applicable laws and terms of service when processing data. All trademarks mentioned are the property of their respective owners.


