Public Tender Scraper Germany
Public Tender Scraper Germany is an Apify Actor designed to collect public tender information from the German government procurement portal "e-Vergabe".
Developer: Stephanie Hohenberg
Public Tender Scraper
Public Tender Scraper is an Apify Actor designed to collect public tender information from multiple government procurement portals. Public tenders ensure transparency and fair competition, counteracting favoritism and corruption, but they are not advertised in one central place: each government provides its own platform, if any, on which organizations publish tenders and businesses and suppliers apply and bid.
Supported platforms:
| Country | National procurement portal |
|---|---|
| 🇩🇪 Germany | e-Vergabe |
| 🇪🇺 tba | Link |
Public Tender Scraper can help businesses ...
- to automate steps in their procurement process
- to find suitable tenders faster and more efficiently across various platforms
- to go international and win contracts abroad
... by providing the following functionalities:
🔎 Unified tender search across multiple national platforms
🏗️ Automation-ready data for integration into procurement workflows
💬 Automatic translation of queries and results to/from user's language
💾 Flexible output formats: JSON and CSV
🧩 Simple API integration for seamless embedding in systems
How to integrate?
- Generate the API keys
- Apify API Key
- Groq API Key (optional, to enable translation feature)
- Export env variables
export APIFY_API_KEY=apify_api_...
export GROQ_API_KEY=gsk_...
- Invoke the Actor via curl
curl -X POST "https://api.apify.com/v2/acts/stephaniehhnbrg~public-tender-scraper-germany/runs?token=$APIFY_API_KEY" \
  -d '{"keyword": "Ultraschall", "maxResults": "10", "groqApiKey": "'"$GROQ_API_KEY"'"}' \
  -H 'Content-Type: application/json'
- Retrieve the RUN-ID from the JSON response (data > id)
- Check the status of the run (data > status)
curl "https://api.apify.com/v2/acts/stephaniehhnbrg~public-tender-scraper-germany/runs/<RUN-ID>?token=$APIFY_API_KEY"
- Retrieve the DATASET-ID from the JSON response (data > defaultDatasetId)
- Fetch the dataset items once the run status is SUCCEEDED.
curl "https://api.apify.com/v2/datasets/<DATASET-ID>/items?view=overview"
Alternatively, open the Apify Console link from the status response (data > consoleUrl)
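The response fields referenced above (data > id, data > status, data > defaultDatasetId) can be pulled out on the command line. A minimal sketch, using python3 for JSON parsing and a hard-coded sample response in place of a real API reply (the field values here are made up for illustration):

```shell
# Sample run-response JSON; a real one comes from the curl call above.
RESPONSE='{"data":{"id":"RUN123","status":"SUCCEEDED","defaultDatasetId":"DS456"}}'

# Helper: read a dotted key path out of the JSON document on stdin.
json_get() {
  python3 -c 'import sys, json
obj = json.load(sys.stdin)
for key in sys.argv[1].split("."):
    obj = obj[key]
print(obj)' "$1"
}

RUN_ID=$(echo "$RESPONSE" | json_get data.id)
STATUS=$(echo "$RESPONSE" | json_get data.status)
DATASET_ID=$(echo "$RESPONSE" | json_get data.defaultDatasetId)

echo "run=$RUN_ID status=$STATUS dataset=$DATASET_ID"
```

In a real integration you would repeat the status request until STATUS is SUCCEEDED before fetching the dataset items.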
Dev Notes
Folder Structure
The project was initialized using the Apify template ts-crawlee-playwright-chrome, which provides a standard structure:
.actor/               # Actor metadata and I/O schemas
src/                  # Source code
storage/
├── datasets/         # Actor outputs (JSON + CSV)
├── key_value_stores/ # Input variables and run statistics
└── request_queues/   # Crawling data
Run Actor locally
- Install Apify - Guide
- Configure input parameters (keyword, maxResults, groqApiKey) by editing ./storage/key_value_stores/default/INPUT.json
- Run the actor locally
apify run
- Review the output
- as JSON objects: ./storage/datasets/default
- as CSV file: ./storage/datasets/result.csv
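For reference, a minimal INPUT.json using the three parameters above; the keyword and maxResults values mirror the curl example, and groqApiKey is a placeholder that can be omitted to run without translation:

```json
{
  "keyword": "Ultraschall",
  "maxResults": "10",
  "groqApiKey": "gsk_..."
}
```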
Publish Actor
Run the following commands:
apify login
apify push
Then open the Apify Console and publish the Actor via the UI.