RFP Response Drafter
Pricing
from $0.01 / 1,000 results
Under maintenance

Draft RFP responses from RFP text + your capabilities: outline, compliance checklist, win themes, and draft sections.
Optional URL to pull RFP text from (HTML pages only). PDFs are not parsed in this version.
Paste the RFP text here. This is the best input for high-quality drafts.
Company name to use in the response.
Short description of what you do, who you serve, and what makes you different.
Optional examples of similar projects. Use anonymized descriptions if needed.
Constraints to respect (e.g., timeline, budget, technology, staffing).
Writing style for the draft.
How much content to generate.
If draftMode includes sections, these section names are drafted. Leave empty for reasonable defaults.
If disabled, the Actor returns a report without drafting.
Which API format to use: 'ollama' (native /api/chat) or 'openai' (OpenAI-compatible /v1/chat/completions).
Base URL for the LLM API. For local Ollama: http://localhost:11434. For Ollama Cloud, set the Cloud base URL here or via LLM_BASE_URL env var.
API key for the LLM provider. Prefer setting LLM_API_KEY (or OLLAMA_API_KEY) as an Actor secret instead of passing it in input.
Model used for drafting. Manage this via LLM_MODEL/OLLAMA_MODEL env vars if you prefer.
Optional full URL override for OpenAI-compatible chat completions (e.g., https://.../v1/chat/completions). Takes precedence over llmBaseUrl when llmApiStyle='openai'.
Max characters of RFP text sent to the model (to control cost).
Maximum output tokens requested.
Enable if monetized with Apify pay-per-event. Disable for local development.
Event to charge per generated draft.
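Putting the fields above together, a run input might look roughly like the sketch below. The field names here are assumptions inferred from the descriptions (only `draftMode`, `llmApiStyle`, and `llmBaseUrl` appear verbatim above); check the Actor's input schema for the exact names before use.

```python
# Hypothetical run input for the RFP Response Drafter Actor.
# Field names other than draftMode/llmApiStyle/llmBaseUrl are guesses
# based on the field descriptions, not the Actor's published schema.
run_input = {
    "rfpText": "Section 1: Scope of Work ...",   # pasted RFP text (best input)
    "companyName": "Acme Consulting",
    "capabilities": "We build data pipelines for mid-size retailers.",
    "pastPerformance": ["Migrated a retailer's ETL stack to the cloud."],
    "constraints": ["12-week timeline", "Python-based stack"],
    "writingStyle": "formal",
    "draftMode": "full",                          # how much content to generate
    "sectionNames": [],                           # empty = reasonable defaults
    "llmApiStyle": "ollama",                      # or "openai"
    "llmBaseUrl": "http://localhost:11434",       # prefer LLM_BASE_URL env var in production
    "maxRfpChars": 20000,                         # cap RFP text sent to the model
    "maxOutputTokens": 2000,
}

# With the official apify-client package (not imported here), a call
# would look roughly like:
#   ApifyClient(token).actor("<username>/<actor-name>").call(run_input=run_input)
```

As the API-key field suggests, pass credentials via `LLM_API_KEY` (or `OLLAMA_API_KEY`) as an Actor secret rather than embedding them in this input object.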