TOML Validator & Converter

Validate TOML files and convert between TOML, JSON, and YAML. Paste documents or fetch from URLs. Bulk processing with exact error line numbers.

Developer: Stas Persiianenko (Maintained by Community)
Pricing: Pay per event

Validate TOML syntax, detect errors with exact line numbers, and convert between TOML, JSON, and YAML formats — in bulk, via API, or as part of any automation pipeline.


What does it do?

TOML Validator & Converter is a pure-computation Apify actor that processes TOML, JSON, and YAML documents without touching any websites or requiring proxies. You paste documents directly or provide URLs to raw config files, and the actor returns:

  • Validation status — is the document syntactically valid TOML?
  • 🔢 Error line number — exactly where is the syntax error?
  • 📄 Error message — human-readable description of the problem
  • 🔄 Converted output — the document reformatted as TOML, JSON, or YAML

It processes documents in bulk (up to hundreds in one run), making it ideal for CI/CD pipelines, config file audits, and data transformation workflows.


Who is it for?

🦀 Rust developers

Validate and lint Cargo.toml files programmatically. Catch syntax errors in dependency manifests before cargo build fails. Convert Cargo configuration to JSON for automated tooling.

🏗️ DevOps and platform engineers

Validate configuration files (Prometheus, Vector, InfluxDB, Gitea, Hugo, uv, pyproject.toml) before deploying. Automate config file validation across hundreds of repositories or microservices.

🔧 Python / Node.js developers

Convert between pyproject.toml, package.json, and YAML-based configurations when migrating between ecosystems. Validate pyproject.toml files as part of publishing pipelines.

📊 Data engineers

Normalize configuration data from multiple sources (TOML, JSON, YAML) into a consistent format before loading into databases or data pipelines.

🤖 Automation builders

Use with Apify integrations (Zapier, Make, n8n) to validate config files generated by other automation steps before processing them downstream. Gate CI/CD workflows on TOML validity.


Why use TOML Validator & Converter?

  • No setup required — runs instantly, no browser, no proxy, no scraping
  • Precise error reporting — exact line numbers, not just "invalid TOML"
  • Multi-format support — TOML, JSON, and YAML all in one actor
  • Bidirectional conversion — any format to any other format (TOML→JSON, JSON→TOML, YAML→TOML, etc.)
  • Bulk processing — validate dozens of documents in a single run at low cost
  • URL fetching — point at raw GitHub URLs, CDN-hosted configs, or any public endpoint
  • API-first — programmatic access via REST API, Node.js SDK, Python SDK
  • Zero proxy costs — pure computation, no bandwidth charges

What data does it extract?

Each processed document produces one row in the dataset:

| Field | Type | Description |
|---|---|---|
| index | number | 1-based position in the input list |
| source | text | `inline-N` for pasted docs, the URL for fetched docs |
| inputFormat | text | Detected or specified format: toml, json, or yaml |
| isValid | boolean | true if the document is syntactically valid |
| errorMessage | text | Human-readable error description (null if valid) |
| errorLine | number | Line number where the error occurred (null if valid) |
| outputFormat | text | Target format for conversion (null if validate-only) |
| convertedContent | text | Converted document string (null if validate-only or invalid) |
| originalContent | text | Original input text (only if includeOriginal is enabled) |

How much does it cost to validate TOML and convert formats?

This actor uses pay-per-event (PPE) pricing — you only pay for what you process, with volume discounts for higher subscription tiers:

| Event | Free | Bronze | Silver | Gold | Platinum | Diamond |
|---|---|---|---|---|---|---|
| 🚀 Actor start (per run) | $0.005 | $0.00475 | $0.00425 | $0.00375 | $0.003 | $0.0025 |
| 📄 Document processed | $0.001 | $0.0009 | $0.0008 | $0.00065 | $0.0005 | $0.0004 |

Example costs (Free tier):

  • Validate 1 Cargo.toml: $0.005 + 1 × $0.001 = $0.006
  • Validate 10 config files: $0.005 + 10 × $0.001 = $0.015
  • Convert 50 TOML files to JSON: $0.005 + 50 × $0.001 = $0.055
  • Validate 100 pyproject.toml files: $0.005 + 100 × $0.001 = $0.105

Free plan: Apify's free tier includes $5/month of platform credits — enough to process ~4,900 documents per month at no cost.

This actor uses zero proxy (pure computation only), so there are no proxy bandwidth costs.


How to use TOML Validator & Converter

Step 1 — Open the actor

Go to https://apify.com/automation-lab/toml-validator and click Try for free.

Step 2 — Choose your input method

Option A: Paste documents directly. Add your TOML, JSON, or YAML documents in the Documents field. Each entry in the list is one complete document.

Option B: Provide URLs. Add raw file URLs in the URLs to Fetch field. For example:

  • GitHub raw: https://raw.githubusercontent.com/owner/repo/main/Cargo.toml
  • pyproject.toml: https://raw.githubusercontent.com/owner/repo/main/pyproject.toml
  • Any public HTTP endpoint returning a config file

Step 3 — Select operation

  • Validate only — checks syntax and reports errors with line numbers
  • Convert — transforms documents to the target format (TOML, JSON, or YAML)

Step 4 — Configure format options

  • Input Format: auto (recommended) detects TOML, JSON, and YAML automatically
  • Output Format: choose toml, json, or yaml (only for convert operation)

Step 5 — Run and export

Click Start and wait a few seconds. Download results as JSON, CSV, or Excel from the Dataset tab.


Input parameters

| Parameter | Type | Default | Description |
|---|---|---|---|
| documents | string[] | — | Raw document texts to validate or convert |
| urls | string[] | — | URLs to fetch and process as documents |
| operation | string | validate | validate or convert |
| inputFormat | string | auto | auto, toml, json, or yaml |
| outputFormat | string | json | toml, json, or yaml (for convert only) |
| jsonIndent | integer | 2 | Spaces for JSON indentation (0 = minified) |
| stopOnError | boolean | false | Stop processing after the first invalid document |
| includeOriginal | boolean | false | Include original document text in output |
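As a sketch, a complete run input combining these parameters might look like the following (the URL is a placeholder):

```json
{
  "documents": [
    "[package]\nname = \"my-app\"\nversion = \"1.0.0\"\n"
  ],
  "urls": [
    "https://raw.githubusercontent.com/owner/repo/main/Cargo.toml"
  ],
  "operation": "convert",
  "inputFormat": "auto",
  "outputFormat": "json",
  "jsonIndent": 2,
  "stopOnError": false,
  "includeOriginal": false
}
```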

Output example

For a validation run on two documents (one valid Cargo.toml, one invalid):

```json
[
  {
    "index": 1,
    "source": "inline-1",
    "inputFormat": "toml",
    "isValid": true,
    "errorMessage": null,
    "errorLine": null,
    "outputFormat": null,
    "convertedContent": null
  },
  {
    "index": 2,
    "source": "inline-2",
    "inputFormat": "toml",
    "isValid": false,
    "errorMessage": "Unexpected character, expecting string, number, datetime, boolean, inline array or inline table at row 2, col 8, pos 18",
    "errorLine": 2,
    "outputFormat": null,
    "convertedContent": null
  }
]
```

For a conversion run (TOML to JSON):

```json
[
  {
    "index": 1,
    "source": "inline-1",
    "inputFormat": "toml",
    "isValid": true,
    "errorMessage": null,
    "errorLine": null,
    "outputFormat": "json",
    "convertedContent": "{\n \"package\": {\n \"name\": \"my-app\",\n \"version\": \"1.0.0\"\n },\n \"dependencies\": {\n \"serde\": \"1.0\"\n }\n}"
  }
]
```

Tips for best results

💡 Use auto format detection — the actor correctly identifies TOML, JSON, and YAML from content patterns. Only specify a format if your documents are ambiguous.

💡 Validate before converting — if you're converting many documents, run a validate pass first. Invalid documents produce no converted output.

💡 Use stopOnError: true in CI/CD — gates your pipeline on the first error, making failures fast and obvious.

💡 TOML output requires a top-level object — arrays and scalar values cannot be serialized to TOML (TOML spec). If your document is an array at the top level, convert to JSON or YAML instead.

💡 URL fetching supports any public HTTP endpoint — GitHub raw URLs, S3 presigned URLs, and CDN-hosted files all work. The actor follows redirects and handles standard HTTP responses.

💡 Batch many documents in one run — the start fee is $0.005 regardless of document count. Processing 100 documents in one run is far cheaper than 100 separate runs.

💡 TOML integers use underscore separators — when converting JSON numbers like 8080 to TOML, the output may use 8_080 (valid TOML notation for readability). This is standard TOML behavior.


Integrations

🔗 CI/CD pipeline gate (GitHub Actions)

Validate all TOML configs before deployment:

```yaml
- name: Validate TOML configs
  run: |
    curl -s -X POST \
      "https://api.apify.com/v2/acts/automation-lab~toml-validator/run-sync-get-dataset-items?token=${{ secrets.APIFY_TOKEN }}" \
      -H "Content-Type: application/json" \
      -d '{
        "urls": ["${{ env.CARGO_TOML_URL }}"],
        "operation": "validate",
        "stopOnError": true
      }' | python3 -c "
    import sys, json
    items = json.load(sys.stdin)
    if any(not item['isValid'] for item in items):
        print('TOML validation failed!')
        sys.exit(1)
    print('All TOML files valid.')
    "
```

🔗 Make (Integromat) workflow

Trigger on file upload → validate TOML → branch on isValid → send Slack alert on invalid config.

🔗 Zapier automation

Validate TOML text from a form submission → convert to JSON → push to Airtable or Google Sheets.

🔗 n8n workflow

Fetch config files from GitHub via HTTP Request → batch-validate with TOML Validator → filter invalid rows → create GitHub Issues for each error.

🔗 Apify integration chain

Combine with other automation-lab actors in multi-step Apify workflows.


API usage

Node.js

```javascript
import { ApifyClient } from 'apify-client';

const client = new ApifyClient({ token: 'YOUR_API_TOKEN' });

const run = await client.actor('automation-lab/toml-validator').call({
    documents: [
        '[package]\nname = "my-app"\nversion = "1.0.0"\n\n[dependencies]\nserde = "1.0"\n',
        '{ "name": "Jane", "age": 25 }'
    ],
    operation: 'convert',
    outputFormat: 'toml'
});

const { items } = await client.dataset(run.defaultDatasetId).listItems();
console.log(items);
```

Python

```python
from apify_client import ApifyClient

client = ApifyClient(token="YOUR_API_TOKEN")

run = client.actor("automation-lab/toml-validator").call(run_input={
    "urls": [
        "https://raw.githubusercontent.com/owner/repo/main/Cargo.toml",
        "https://raw.githubusercontent.com/owner/repo/main/pyproject.toml"
    ],
    "operation": "validate"
})

items = client.dataset(run["defaultDatasetId"]).list_items().items
for item in items:
    status = "valid" if item["isValid"] else f"INVALID (line {item['errorLine']}): {item['errorMessage']}"
    print(f"{item['source']}: {status}")
```

cURL

```bash
curl -X POST \
  "https://api.apify.com/v2/acts/automation-lab~toml-validator/runs?token=YOUR_API_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "documents": ["[package]\nname = \"my-app\"\nversion = \"1.0.0\"\n"],
    "operation": "convert",
    "outputFormat": "json"
  }'
```

Use with Claude (MCP)

Connect this actor as an MCP tool to validate and convert TOML/JSON/YAML directly in Claude conversations.

Claude Code / Cursor / VS Code

```bash
claude mcp add --transport http apify "https://mcp.apify.com?tools=automation-lab/toml-validator"
```

Claude Desktop

Add to your claude_desktop_config.json:

```json
{
  "mcpServers": {
    "apify": {
      "type": "http",
      "url": "https://mcp.apify.com?tools=automation-lab/toml-validator",
      "headers": {
        "Authorization": "Bearer YOUR_APIFY_TOKEN"
      }
    }
  }
}
```

Example prompts:

  • "Validate this Cargo.toml for syntax errors: [paste TOML]"
  • "Convert this JSON config to TOML format: [paste JSON]"
  • "Check if my pyproject.toml is valid and report any errors"
  • "I have 5 TOML files at these GitHub URLs — validate them all and tell me which ones have errors"
  • "Convert this YAML config to TOML so I can use it in my Rust project"

Is it legal to use?

Yes. This actor performs local data transformation only — it does not scrape any websites, does not access any third-party services, and processes only the data you explicitly provide. There are no legal concerns around terms of service or robots.txt because no crawling takes place.


FAQ

What TOML version is supported?

The actor uses @iarna/toml v2, which supports TOML v0.5 specification including inline tables, arrays of tables ([[array]]), multi-line strings, datetime values, and all standard data types.

Does it support TOML arrays of tables ([[section]])?

Yes. Arrays of tables (like multiple [[worker]] or [[dependency]] entries) are fully supported and correctly parsed into JavaScript arrays of objects.

Why does my JSON number like 8080 become 8_080 in TOML output?

TOML supports underscore separators in integer literals for readability (e.g., 8_080 or 1_000_000). This is valid TOML notation and will parse correctly in any TOML parser. If you need the number without underscores, you can post-process the output.
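If you do need plain digits, a small post-processing pass can strip the separators. This sketch is not part of the actor; it is a crude regex that removes underscores between digits, so it would also touch digit-underscore-digit runs inside quoted strings — use it only when that pattern cannot appear in your string values:

```python
import re

def strip_toml_underscores(toml_text: str) -> str:
    # Remove underscore digit separators: 8_080 -> 8080, 1_000_000 -> 1000000.
    # Only matches an underscore with digits on both sides, so identifiers
    # like my_app are left alone.
    return re.sub(r"(?<=\d)_(?=\d)", "", toml_text)

print(strip_toml_underscores("port = 8_080\nmax = 1_000_000\n"))
# port = 8080
# max = 1000000
```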

My TOML conversion is failing with "top-level value must be an object" — why?

TOML requires the root of the document to be a key-value map (object). If your JSON or YAML document has an array at the top level (e.g., [1, 2, 3] or - item: value), it cannot be serialized to TOML. Convert to JSON or YAML instead.

The actor detected my TOML file as YAML — what do I do?

Set inputFormat: "toml" explicitly. Auto-detection uses content heuristics (TOML uses = assignments, YAML uses : mappings) and can occasionally misclassify edge cases, especially for minimal or unusual files.
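To see why such heuristics can misfire, here is a rough detector in the same spirit. This is illustrative only, not the actor's actual detection code; minimal documents with little punctuation are exactly where any heuristic like this becomes ambiguous:

```python
import json

def guess_format(text: str) -> str:
    """Rough content heuristic for TOML vs JSON vs YAML (illustrative only)."""
    stripped = text.strip()
    # JSON documents start with an object or array and must parse as JSON.
    if stripped.startswith(("{", "[")):
        try:
            json.loads(stripped)
            return "json"
        except ValueError:
            pass
    for line in stripped.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        if line.startswith("["):
            return "toml"  # TOML table header like [package]
        if "=" in line.split("#")[0]:
            return "toml"  # TOML uses key = value assignments
        if ":" in line:
            return "yaml"  # YAML uses key: value mappings
    return "yaml"

print(guess_format('key = "value"'))  # toml
print(guess_format("key: value"))     # yaml
```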

My URL fetch is failing — what could be wrong?

  • Ensure the URL returns raw text (not an HTML page wrapping the file)
  • For GitHub, use raw.githubusercontent.com URLs, not the github.com browse URLs
  • Private URLs are not supported — the URL must be publicly accessible
  • Check that the URL returns a 200 OK status code

Can I validate private config files without exposing them?

Yes — paste the document content directly in the documents field. The content is processed within your actor run and only the validation results are saved to the Apify dataset. Nothing is sent to external services.

Can I validate TOML and JSON in the same run?

Yes. Set inputFormat: "auto" (the default) and mix TOML, JSON, and YAML documents freely. The actor detects each document's format independently.
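For instance, a single run input might mix all three formats (a sketch; the inline documents are illustrative):

```json
{
  "documents": [
    "[package]\nname = \"my-app\"\n",
    "{ \"name\": \"my-app\" }",
    "name: my-app\n"
  ],
  "operation": "validate",
  "inputFormat": "auto"
}
```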