MongoDB Extractor
Developer: ParseForge · Pricing: pay per event

Extract a full MongoDB database into Apify storage using either a connection string or individual connection parameters. Supports collection filtering and flexible output formats, providing automated export of all documents and metadata for seamless integration with other workflows.

MongoDB Database Extractor
Connect to any MongoDB instance and extract entire collections into Apify datasets. No coding or technical setup required. Extract one collection or your entire database, apply include/exclude filters to skip unwanted data, and download results in CSV, Excel, or JSON format. Paid plans support up to 1,000,000 documents per run.
What Does It Do?
- Database Connection - Connect securely using a connection string, individual authentication parameters, or MongoDB Atlas credentials
- Collection Selection - Extract specific collections by name or exclude unwanted ones with simple list filtering
- Document Limits - Set a maximum number of documents per collection to control extraction scope and costs
- Multi-Dataset Export - Each collection gets its own dataset for organized, downloadable results
- Automatic Schema Detection - Displays field names and types for every collection in your overview report
- Batch Processing - Handles large collections efficiently by streaming documents in 100-document batches
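The batch streaming described above can be sketched as a small, driver-agnostic generator. This illustrates the technique only, not the actor's actual internals; any iterable cursor of documents works:

```python
from itertools import islice


def batched(cursor, size=100):
    """Yield documents from any iterable cursor in fixed-size batches.

    Sketch of the 100-document batch streaming described above; the
    real actor's implementation may differ.
    """
    it = iter(cursor)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk
```

With a MongoDB driver such as pymongo, you would pass a live cursor (e.g. `db["users"].find()`) as `cursor`; batching this way keeps memory use flat regardless of collection size.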
Input
- Connection String - Paste your MongoDB connection URL (mongodb:// or mongodb+srv://) to skip individual parameters
- Host - MongoDB server hostname or IP address where your database is running
- Port - TCP port number for MongoDB (default: 27017)
- Database - Name of the database to connect to (required)
- Username - User account for authentication (leave blank if no auth is enabled)
- Password - Password for the account
- Authentication Database - Database containing your user account (usually 'admin')
- Use SSL - Enable for secure connections to cloud databases like MongoDB Atlas
- Collections to Extract - Leave blank to extract all collections, or list specific ones to limit the scope
- Collections to Exclude - Specify collections to skip (useful for logs or temporary data)
- Max Documents Per Collection - Set a limit on the number of documents extracted from each collection (e.g., 100 for quick tests)
Example input:
```json
{
  "connectionString": "mongodb+srv://user:password@cluster.mongodb.net/mydb?retryWrites=true&w=majority",
  "database": "mydb",
  "collections": ["users", "orders"],
  "maxDocumentsPerCollection": 1000
}
```
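If you supply the individual parameters instead, they map onto a standard MongoDB URI. A rough sketch of that mapping follows; the actor assembles this for you, so the function name and defaults here are purely illustrative:

```python
from urllib.parse import quote_plus


def build_connection_string(host, database, port=27017,
                            username=None, password=None,
                            auth_database="admin", use_ssl=False):
    """Assemble a mongodb:// URI from the individual input fields above.

    Illustrative sketch only: the actor accepts these fields directly,
    so you normally never build the URI yourself.
    """
    auth = ""
    options = []
    if username:
        # Credentials must be percent-encoded per the MongoDB URI spec
        auth = f"{quote_plus(username)}:{quote_plus(password or '')}@"
        options.append(f"authSource={auth_database}")
    if use_ssl:
        options.append("tls=true")
    query = "?" + "&".join(options) if options else ""
    return f"mongodb://{auth}{host}:{port}/{database}{query}"
```

For example, host `localhost` with database `mydb` and no auth yields `mongodb://localhost:27017/mydb`.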
Output
Each extraction produces summary records with up to 7 data fields, including:
- Collection Name
- Database Name
- Extraction Date
- Document Count
- Schema Fields
- Dataset Name
Download results as JSON, CSV, or Excel.
Each collection's data is also saved to its own dataset within Apify, making it easy to access and download individual collection exports.
Why Choose the MongoDB Database Extractor?
| Feature | Our Actor | Similar Tools |
|---|---|---|
| Connect using connection strings | ✔️ | ❌ |
| MongoDB Atlas support (mongodb+srv://) | ✔️ | ❌ |
| Collection-level filtering (include/exclude) | ✔️ | Partial |
| Document limit per collection | ✔️ | ❌ |
| Automatic schema detection | ✔️ | ❌ |
| Multi-dataset export (one per collection) | ✔️ | ❌ |
| Batch streaming for large collections | ✔️ | ❌ |
| SSL/secure connections | ✔️ | ✔️ |
| Free tier with 100 documents | ✔️ | ❌ |
| Paid tier up to 1,000,000 documents | ✔️ | ❌ |
| No-authentication option | ✔️ | ❌ |
| Per-event charging model | ✔️ | ❌ |
How to Use
No technical skills required. Follow these simple steps:
- Sign Up: Create a free account with $5 credit
- Find the Tool: Search for "MongoDB Database Extractor" in the Apify Store and configure your input
- Run It: Click "Start" and watch your results appear
That's it. No coding, no setup, no complicated configuration. Now you can export your data in CSV, Excel, or JSON format.
Business Use Cases
- Data Analyst - Extract customer data monthly to analyze trends and segment users for business intelligence reporting
- Database Administrator - Back up collection data to external storage for disaster recovery and audit compliance
- Integration Developer - Export MongoDB collections on demand to feed data pipelines and ETL workflows
Beyond Business Use Cases
Data like this powers more than commercial workflows. The same structured records support research, education, civic projects, and personal initiatives.
Frequently Asked Questions
Do I need a paid Apify plan to run this actor?
No. You can start right now on the free Apify plan, which includes $5 in free monthly credit. That is enough to run this actor several times and explore the output before committing to anything. Paid plans unlock higher limits, more concurrent runs, and larger datasets. Create a free Apify account here to get started.
What happens if my run fails or returns no results?
Failed runs are not charged. If your database is unreachable, your credentials are wrong, or a collection filter matches nothing, re-run the actor or open our contact form and we will investigate. You can also check the run log in the Apify console to see why the run stopped.
How many documents can I extract per run?
Free users are limited to 100 documents per run so you can preview the output and confirm the actor works for your use case. Paid users can raise maxDocumentsPerCollection up to 1,000,000 per run. Upgrade here if you need full scale.
How fresh is the data?
Every run queries your database live at the moment of execution. There is no cache or delay: the records you get reflect exactly what your database contained at that moment. Schedule the actor to maintain a rolling snapshot of the data you need.
Can I call this actor from my own code?
Yes. Apify exposes every actor as a REST endpoint and ships first-class SDKs for Node.js and Python. You can start a run, read the dataset, and handle webhooks from your own app in a few lines. All you need is your Apify API token.
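As a minimal sketch using only the Python standard library against Apify's REST API. The actor ID "parseforge~mongodb-extractor" is an assumption; check the actor's API tab in the Apify console for the exact ID:

```python
import json
from urllib import request

API_BASE = "https://api.apify.com/v2"


def run_endpoint(actor_id):
    """URL that starts a run of the given actor (Apify API v2)."""
    return f"{API_BASE}/acts/{actor_id}/runs"


def start_run(token, actor_id, run_input):
    """Start an actor run over REST and return the run object.

    actor_id looks like "username~actor-name"; requires network access
    and a valid Apify API token.
    """
    req = request.Request(
        f"{run_endpoint(actor_id)}?token={token}",
        data=json.dumps(run_input).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return json.load(resp)["data"]
```

The official `apify-client` packages for Node.js and Python wrap these endpoints with retries and pagination, so prefer them in production code.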
How do I export the data?
Every Apify dataset can be downloaded in one click from the console as CSV, JSON, JSONL, Excel, HTML, XML, or RSS. You can also pull results programmatically via the Apify API or stream them into BigQuery, S3, and other destinations through built-in integrations.
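The per-format download URLs follow a single pattern in Apify's API, so you can also construct them directly. A small sketch (the dataset ID comes from the run's `defaultDatasetId` field):

```python
API_BASE = "https://api.apify.com/v2"


def dataset_export_url(dataset_id, fmt="csv"):
    """Download URL for a dataset in a given format (Apify API v2).

    Supported format values include json, jsonl, csv, xlsx, html,
    xml, and rss.
    """
    return f"{API_BASE}/datasets/{dataset_id}/items?format={fmt}"
```

Fetching that URL with any HTTP client returns the dataset items in the requested format.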
Can I schedule the actor to run automatically?
Yes. Use the Apify scheduler to run the actor on any cadence, from hourly to monthly. Results are saved to your dataset and can be delivered to webhooks, email, Slack, cloud storage, or automation tools such as Zapier and Make.
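Apify schedules are defined with standard cron expressions. A few common cadences as a reference (illustrative examples, not specific to this actor):

```text
0 * * * *    # every hour, on the hour
0 9 * * 1    # every Monday at 09:00
0 0 1 * *    # first day of each month at midnight
```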
More ParseForge Actors
- MySQL Database Extractor - Extract data from MySQL databases
- Mood Fabrics Product Scraper - Collect fabric inventory and product data
- Justia Case Law Scraper - Extract legal case information
- LiveAuctioneers Scraper - Collect auction data and bids
- PropertyShark Commercial Property Transactions Scraper - Extract commercial real estate data
Browse our complete collection of data extraction tools for more.
Ready to Start?
Create a free account with $5 credit and collect your first 100 results for free. No coding, no setup.
Need Help?
- Check the FAQ section above for common questions
- Visit the Apify support page for documentation and tutorials
- Contact us via our Tally contact form to request a new scraper, propose a custom project, or report an issue
Disclaimer
This Actor is an independent tool provided as-is. Users are responsible for complying with applicable laws and terms of service when processing data. All trademarks mentioned are the property of their respective owners.


