YAML Validator & Converter
Pricing: Pay per event
Validate YAML, JSON, and TOML syntax. Convert between formats. Detect errors with exact line numbers. Bulk-process documents or URLs. Zero proxy, 95%+ margin.
Developer: Stas Persiianenko
Validate YAML syntax, detect errors with line numbers, and convert between YAML, JSON, and TOML formats — in bulk, via API, or as part of any automation pipeline.
What does it do?
YAML Validator & Converter is a pure-computation Apify actor that processes YAML, JSON, and TOML documents without touching any websites or requiring proxies. You paste documents directly or provide URLs to raw config files, and the actor returns:
- ✅ Validation status — is the document syntactically valid?
- 🔢 Error line number — exactly where is the syntax error?
- 📄 Error message — human-readable description of the problem
- 🔄 Converted output — the document reformatted as YAML, JSON, or TOML
It processes documents in bulk, making it ideal for CI/CD pipelines, config file audits, and data transformation workflows.
Who is it for?
🧑💻 DevOps engineers and SREs
Validate Kubernetes manifests, Helm charts, Docker Compose files, and Ansible playbooks before deploying. Catch syntax errors early rather than discovering them at runtime.
🏗️ Platform engineers
Automate config file validation across hundreds of microservices. Use the API to gate deployments on YAML validity checks.
🔧 Backend developers
Convert configuration formats when migrating between tools — from JSON to YAML for Kubernetes, from TOML to JSON for API consumption, or from YAML to TOML for Rust/Go config files.
📊 Data engineers
Clean and normalize configuration data from multiple sources into a consistent format before loading it into databases or data lakes.
🤖 Automation builders
Use with Apify integrations (Zapier, Make, n8n) to validate config files uploaded by users or generated by other automation steps before processing them downstream.
Why use YAML Validator & Converter?
- No setup required — runs instantly, no browser, no proxy, no scraping
- Bulk processing — validate dozens of documents in a single run
- Precise error reporting — exact line numbers, not just "invalid YAML"
- Multi-format support — YAML, JSON, and TOML all in one actor
- Bidirectional conversion — any format to any other format
- URL fetching — point at raw GitHub URLs, CDN-hosted configs, or any public endpoint
- API-first — programmatic access via REST API, Node.js SDK, Python SDK
What data does it extract?
Each processed document produces one row in the dataset:
| Field | Type | Description |
|---|---|---|
| index | number | 1-based position in the input list |
| source | text | inline-N for pasted docs, the URL for fetched docs |
| inputFormat | text | Detected or specified format: yaml, json, or toml |
| isValid | boolean | true if the document is syntactically valid |
| errorMessage | text | Human-readable error description (null if valid) |
| errorLine | number | Line number where the error occurred (null if valid) |
| outputFormat | text | Target format for conversion (null if validate-only) |
| convertedContent | text | Converted document string (null if validate-only or invalid) |
| originalContent | text | Original input text (only if includeOriginal is enabled) |
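The dataset rows are plain objects keyed by the fields above, so post-processing them downstream is straightforward. A minimal Python sketch (the summarize helper is hypothetical, not part of the actor; field names come from the table):

```python
def summarize(rows):
    """Split actor dataset rows into valid sources and an error report.

    rows: list of row dicts shaped like the table above. This is a
    hypothetical post-processing helper, not part of the actor itself.
    """
    valid = [r["source"] for r in rows if r["isValid"]]
    errors = {
        r["source"]: f"line {r['errorLine']}: {r['errorMessage']}"
        for r in rows if not r["isValid"]
    }
    return valid, errors

# Rows shaped like the actor's output:
rows = [
    {"source": "inline-1", "isValid": True, "errorLine": None, "errorMessage": None},
    {"source": "inline-2", "isValid": False, "errorLine": 4,
     "errorMessage": "unexpected end of the stream within a flow collection (4:1)"},
]
valid, errors = summarize(rows)
print(valid)   # ['inline-1']
print(errors)  # {'inline-2': 'line 4: unexpected end of the stream within a flow collection (4:1)'}
```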
How much does it cost to validate YAML and convert formats?
This actor uses pay-per-event (PPE) pricing — you only pay for what you process, with volume discounts for higher subscription tiers:
| Event | Free | Bronze | Silver | Gold | Platinum | Diamond |
|---|---|---|---|---|---|---|
| 🚀 Actor start (per run) | $0.005 | $0.00475 | $0.00425 | $0.00375 | $0.003 | $0.0025 |
| 📄 Document processed | $0.001 | $0.0009 | $0.0008 | $0.00065 | $0.0005 | $0.0004 |
Example costs (Free tier):
- Validate 10 YAML files: $0.005 + 10 × $0.001 = $0.015
- Validate 100 config files in one run: $0.005 + 100 × $0.001 = $0.105
- Convert 50 JSON files to YAML: $0.005 + 50 × $0.001 = $0.055
Free plan: Apify's free tier includes $5/month of platform credits — enough to process ~4,900 documents per month at no cost.
This actor uses zero proxy (pure computation only), so there are no proxy bandwidth costs.
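The run cost is simply the start fee plus a per-document fee, so it is easy to estimate up front. A small hypothetical helper (defaults are the Free-tier rates from the table above):

```python
def run_cost(doc_count, start_fee=0.005, per_doc=0.001):
    """Estimated cost in USD of a single run; defaults are Free-tier rates."""
    return round(start_fee + doc_count * per_doc, 6)

print(run_cost(10))    # 0.015  -> validate 10 files
print(run_cost(100))   # 0.105  -> validate 100 files
print(run_cost(50))    # 0.055  -> convert 50 files
print(run_cost(100, start_fee=0.003, per_doc=0.0005))  # 0.053 -> Platinum tier
```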
How to use YAML Validator & Converter
Step 1 — Open the actor
Go to https://apify.com/automation-lab/yaml-validator and click Try for free.
Step 2 — Choose your input method
Option A: Paste documents directly. Add your YAML, JSON, or TOML documents in the Documents field. Each entry in the list is one complete document.
Option B: Provide URLs. Add raw file URLs in the URLs to Fetch field. For example:
- GitHub raw: https://raw.githubusercontent.com/owner/repo/main/config.yaml
- Any public HTTP endpoint returning a config file
Step 3 — Select operation
- Validate only — checks syntax and reports errors with line numbers
- Convert — transforms documents to the target format
Step 4 — Configure format options
- Input Format: auto (recommended) detects YAML, JSON, and TOML automatically
- Output Format: choose yaml, json, or toml (convert operation only)
Step 5 — Run and export
Click Start and wait a few seconds. Download results as JSON, CSV, or Excel from the Dataset tab.
Input parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| documents | string[] | — | Raw document texts to validate or convert |
| urls | string[] | — | URLs to fetch and process as documents |
| operation | string | validate | validate or convert |
| inputFormat | string | auto | auto, yaml, json, or toml |
| outputFormat | string | json | yaml, json, or toml (for convert only) |
| jsonIndent | integer | 2 | Spaces for JSON indentation (0 = minified) |
| stopOnError | boolean | false | Stop processing after the first invalid document |
| includeOriginal | boolean | false | Include original document text in output |
Output example
For a validation run on two documents (one valid YAML, one invalid):
[{"index": 1,"source": "inline-1","inputFormat": "yaml","isValid": true,"errorMessage": null,"errorLine": null,"outputFormat": null,"convertedContent": null},{"index": 2,"source": "inline-2","inputFormat": "yaml","isValid": false,"errorMessage": "unexpected end of the stream within a flow collection (4:1)","errorLine": 4,"outputFormat": null,"convertedContent": null}]
For a conversion run (YAML to JSON):
[{"index": 1,"source": "inline-1","inputFormat": "yaml","isValid": true,"errorMessage": null,"errorLine": null,"outputFormat": "json","convertedContent": "{\n \"name\": \"John Doe\",\n \"age\": 30,\n \"roles\": [\"admin\", \"editor\"]\n}"}]
Tips for best results
💡 Use auto format detection — the actor correctly identifies YAML, JSON, and TOML from content patterns. Only specify a format if you know documents might be ambiguous.
💡 Validate before converting — if you're converting many documents, run a validate pass first. Invalid documents produce no converted output.
💡 Use stopOnError: true in CI/CD — gates your pipeline on the first error, making failures fast and obvious.
💡 TOML output requires a top-level object — arrays and scalar values cannot be serialized to TOML (TOML spec). If your document is an array, convert to JSON or YAML instead.
💡 URL fetching supports any public HTTP endpoint — GitHub raw URLs, S3 presigned URLs, and CDN-hosted files all work. The actor follows redirects and handles Content-Encoding: gzip automatically.
💡 Batch many documents in one run — the start fee is $0.005 regardless of document count. Processing 100 documents in one run is far cheaper than 100 separate runs.
Integrations
🔗 CI/CD pipeline gate (GitHub Actions)
Validate all YAML configs before deployment:
```yaml
- name: Validate configs
  uses: apify/run-actor@v0
  with:
    actor: automation-lab/yaml-validator
    input: |
      {
        "urls": ["${{ env.CONFIG_URL }}"],
        "operation": "validate",
        "stopOnError": true
      }
```
🔗 Make (Integromat) workflow
Trigger on file upload → validate YAML → branch on isValid → send Slack alert on error.
🔗 Zapier automation
Validate config text from a form submission (Typeform/Jotform) → convert to JSON → push to Airtable.
🔗 n8n workflow
Fetch config files from GitHub via HTTP Request → batch-validate with YAML Validator → filter invalid rows → create GitHub Issues for each error.
🔗 Apify integration chain
Combine with other automation-lab actors:
- JSON CSV Converter — convert JSON output to CSV for spreadsheet analysis
- Fake Test Data Generator — generate test data, validate the schema in YAML format
API usage
Node.js
```javascript
import { ApifyClient } from 'apify-client';

const client = new ApifyClient({ token: 'YOUR_API_TOKEN' });

const run = await client.actor('automation-lab/yaml-validator').call({
  documents: [
    'name: John\nage: 30\nroles:\n - admin',
    '{ "name": "Jane", "age": 25 }'
  ],
  operation: 'convert',
  outputFormat: 'json'
});

const { items } = await client.dataset(run.defaultDatasetId).listItems();
console.log(items);
```
Python
```python
from apify_client import ApifyClient

client = ApifyClient(token="YOUR_API_TOKEN")

run = client.actor("automation-lab/yaml-validator").call(run_input={
    "documents": [
        "name: John\nage: 30\nroles:\n - admin",
        '{ "name": "Jane", "age": 25 }'
    ],
    "operation": "validate"
})

items = client.dataset(run["defaultDatasetId"]).list_items().items
for item in items:
    print(f"{item['source']}: {'valid' if item['isValid'] else item['errorMessage']}")
```
cURL
```bash
curl -X POST \
  "https://api.apify.com/v2/acts/automation-lab~yaml-validator/runs?token=YOUR_API_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"documents": ["name: John\nage: 30"], "operation": "convert", "outputFormat": "json"}'
```
Use with Claude (MCP)
Connect this actor as an MCP tool to validate and convert YAML/JSON/TOML directly in Claude conversations.
Claude Code / Cursor / VS Code
```bash
claude mcp add --transport http apify "https://mcp.apify.com?tools=automation-lab/yaml-validator"
```
Claude Desktop
Add to your claude_desktop_config.json:
{"mcpServers": {"apify": {"type": "http","url": "https://mcp.apify.com?tools=automation-lab/yaml-validator","headers": {"Authorization": "Bearer YOUR_APIFY_TOKEN"}}}}
Example prompts:
- "Validate this Kubernetes manifest for syntax errors: [paste YAML]"
- "Convert this JSON config to YAML format: [paste JSON]"
- "Check if my docker-compose.yaml is valid and report any errors"
- "I have 5 config files at these URLs — validate them all and tell me which ones have errors"
Is it legal to use?
Yes. This actor performs local data transformation only — it does not scrape any websites, does not access any third-party services, and processes only the data you explicitly provide. There are no legal concerns around terms of service or robots.txt because no crawling takes place.
FAQ
What YAML features are supported?
The actor uses js-yaml v4, which implements the YAML 1.2 specification (plus common YAML 1.1 types), including anchors (&), aliases (*), block scalars, and all standard data types.
Does it support multi-document YAML (with --- separators)?
Each string in the documents list is treated as a single document, so a multi-document YAML stream (multiple documents separated by --- in one string) is not validated as separate documents. Split the stream into separate list items for multi-document validation.
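If you do have a stream, one way to split it client-side before building the documents input is sketched below (illustrative only: it assumes --- separators on their own line and ignores edge cases such as --- inside block scalars):

```python
import re

def split_yaml_stream(stream: str) -> list[str]:
    """Naively split a multi-document YAML stream on standalone '---' lines."""
    parts = re.split(r"(?m)^---\s*$", stream)
    return [p.strip() for p in parts if p.strip()]

stream = "name: app-a\nport: 80\n---\nname: app-b\nport: 81\n"
docs = split_yaml_stream(stream)
print(docs)  # ['name: app-a\nport: 80', 'name: app-b\nport: 81']
```

Each element of the returned list can then be passed as its own entry in the documents parameter.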
Why is my TOML conversion failing?
TOML requires the top-level value to be a non-array object (key-value map). If your YAML or JSON document has an array at the top level (e.g., [1, 2, 3]), it cannot be converted to TOML. Convert to JSON or YAML instead.
My URL fetch is failing — what could be wrong?
- Ensure the URL returns raw text (not an HTML page wrapping the file)
- For GitHub, use raw.githubusercontent.com URLs, not github.com browse URLs
- Private URLs are not supported; the URL must be publicly accessible
- Check that the URL returns a 200 OK status code
Can I validate private/internal config files?
Yes — paste the document content directly in the documents field. Nothing leaves your actor run except the validation results saved to the Apify dataset.
The actor detected my file as YAML but it's TOML — what do I do?
Set inputFormat: "toml" explicitly. Auto-detection uses heuristics (TOML uses = assignments, YAML uses :) and can occasionally misclassify edge cases.
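To see why heuristics can misfire, here is a toy detector in Python (illustrative only, not the actor's actual logic):

```python
import json

def guess_format(text: str) -> str:
    """Toy format detector: JSON if it parses, else TOML if the first
    line looks like `key = value`, else YAML. Illustrative only."""
    stripped = text.strip()
    try:
        json.loads(stripped)
        return "json"
    except ValueError:
        pass
    first = stripped.splitlines()[0] if stripped else ""
    # TOML favours `key = value`; YAML favours `key: value`.
    if "=" in first and ":" not in first.split("=")[0]:
        return "toml"
    return "yaml"

print(guess_format('{"a": 1}'))   # json
print(guess_format('a = 1'))      # toml
print(guess_format('a: 1'))       # yaml
print(guess_format('"a:b" = 1'))  # yaml (misclassified: this is valid TOML)
```

The last case shows the kind of edge case where setting inputFormat explicitly is the safer choice.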
Related actors
- Fake Test Data Generator — generate bulk test data in JSON, CSV, or NDJSON
- JSON CSV Converter — convert between JSON and CSV formats
- cURL to Code Converter — convert cURL commands to code in multiple languages