Backlink Checker

Backlink Checker is a focused SEO data Actor built to deliver clean, reliable backlink intelligence for any domain or URL list. It accepts a simple input structure with only two fields: domainsToCheck and mode. For every target submitted, it returns exactly one output item.

Pricing: $10.00 / 1,000 results
Developer: Salman Bareesh
Backlink Checker provides backlink reporting in a stable, one-record-per-target output format.
Documentation Revision
- Revision ID: 2026-02-25-r2
- Scope: public contract clarification, integration depth, operational checklists, and governance guidance
- Purpose: ensure the README in Apify Console reflects the latest production-facing definition
Product Summary
This Actor is built for teams that want a clean dataset shape for recurring backlink review.
Each requested target generates exactly one dataset item.
That single item includes:
- target-level backlink summary fields
- full backlink details stored in backlinks_json as native JSON
This structure is designed for:
- reporting
- data pipelines
- spreadsheet workflows
- dashboard ingestion
- historical run comparison
Who This Is For
This Actor is intended for:
- SEO teams tracking backlink profile changes
- growth teams validating referral footprint
- analytics teams building backlink trend dashboards
- operations teams standardizing recurring backlink checks
- engineering teams integrating backlink snapshots into internal pipelines
What You Can Expect Per Run
For every run:
- each requested target is processed at most once
- each processed target produces exactly one dataset item
- full backlink detail remains attached to that target in nested JSON
- summary metrics are preserved for quick filtering and ranking
- timestamps are included for historical replay and comparison
Public Contract
This section defines the public data contract you can rely on.
Input Contract
Only two input fields are accepted:
- domainsToCheck (required)
- mode (optional)
domainsToCheck:
- type: array of strings
- accepted values: domains and full URLs
- duplicates are ignored during processing
mode:
- type: string
- allowed values: root_domain, domain, url
- default: root_domain
Input examples are validated against the schema defined in .actor/input_schema.json.
Input Quality Rules
The input normalization behavior is deterministic:
- surrounding whitespace is trimmed
- repeated targets are removed
- empty lines and empty list entries are ignored
- both plain domain values and full URLs are accepted
- mode fallback applies only when mode is omitted, not when invalid
If mode is invalid, the run fails early with a validation error.
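The rules above can be sketched as a small function. This is an illustrative sketch only, not the Actor's actual source code; normalize_targets is a hypothetical helper name.

```python
def normalize_targets(domains, mode=None):
    """Mirror the documented normalization: trim, dedupe, drop empties."""
    allowed_modes = {"root_domain", "domain", "url"}
    if mode is None:
        mode = "root_domain"  # fallback applies only when mode is omitted
    elif mode not in allowed_modes:
        raise ValueError(f"invalid mode: {mode!r}")  # fail early on invalid mode

    seen, targets = set(), []
    for raw in domains:
        target = raw.strip()      # surrounding whitespace is trimmed
        if not target:            # empty entries are ignored
            continue
        if target in seen:        # repeated targets are removed
            continue
        seen.add(target)
        targets.append(target)
    return targets, mode
```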
Output Contract
One dataset item is produced per requested target.
Top-level fields:
- checked_target (string)
- mode (string)
- report_domain (string)
- total_external_backlinks (string)
- total_referring_domains (string)
- dofollow_backlinks_percent (string)
- referring_ips (string)
- backlink_count (number)
- backlinks_json (array of objects)
- timestamp (ISO-8601 string)
backlinks_json item fields:
- source_url (string)
- source_display_url (string)
- target_url (string)
- anchor_text (string)
- link_type (string)
- score (string)
Output Guarantees
The following output guarantees apply:
- exactly one item per processed target
- stable top-level field naming
- nested backlink objects with stable shape
- explicit backlink_count for quick numeric filtering
- parse-ready JSON array in backlinks_json
This allows both:
- high-level table analysis with summary metrics
- deep analysis with per-backlink attributes from backlinks_json
Input Examples
Example 1: Domain List
{
  "domainsToCheck": ["example.com", "another-example.org"],
  "mode": "root_domain"
}
Example 2: URL List
{
  "domainsToCheck": ["https://example.com/pricing", "https://example.com/blog/post-1"],
  "mode": "url"
}
Example 3: Mixed Input
{
  "domainsToCheck": ["example.com", "https://www.example.org"]
}
Output Example
{
  "checked_target": "example.com",
  "mode": "root_domain",
  "report_domain": "example.com",
  "total_external_backlinks": "123",
  "total_referring_domains": "45",
  "dofollow_backlinks_percent": "67%",
  "referring_ips": "40",
  "backlink_count": 2,
  "backlinks_json": [
    {
      "source_url": "https://site-a.com/page-1",
      "source_display_url": "site-a.com/page-1",
      "target_url": "https://example.com/",
      "anchor_text": "Example",
      "link_type": "dofollow",
      "score": "18"
    },
    {
      "source_url": "https://site-b.com/post-8",
      "source_display_url": "site-b.com/post-8",
      "target_url": "https://example.com/",
      "anchor_text": "Example brand",
      "link_type": "nofollow",
      "score": "12"
    }
  ],
  "timestamp": "2026-02-25T00:00:00+00:00"
}
Output Mapping Tips
If you map this output into downstream systems:
- treat checked_target as the source key for the requested input
- treat report_domain as the normalized report-side domain label
- use mode to separate logic by target granularity
- use timestamp for sequencing and snapshots
- use backlink_count for threshold alerts
- expand backlinks_json only for detailed backlink-level analytics
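The tips above can be condensed into a small mapping function. map_item_to_row and the alert_threshold value are illustrative assumptions, not part of the Actor; rename to fit your own pipeline.

```python
def map_item_to_row(item, alert_threshold=10):
    """Map one dataset item to a downstream row using the documented fields."""
    return {
        "source_key": item["checked_target"],    # key for the requested input
        "report_label": item["report_domain"],   # normalized report-side label
        "granularity": item["mode"],             # separate logic by granularity
        "snapshot_at": item["timestamp"],        # sequencing and snapshots
        "backlink_count": item["backlink_count"],
        "low_backlinks": item["backlink_count"] < alert_threshold,  # threshold alert
    }
```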
Dataset Shape Guidance
Why One Record Per Target
One record per target gives you:
- easier joins with external domain inventories
- consistent table width
- simpler incremental loads
- easier run-over-run comparisons
Recommended Primary Keys
For analytics and warehousing, a practical key can be:
(checked_target, mode, timestamp)
Null and Empty Expectations
- backlinks_json can be empty ([])
- backlink_count can be 0
- summary fields can be empty strings when upstream report data is not present
Recommended Data Model (Warehouse)
A practical model in a warehouse environment:
- Landing table: store the full Actor row as-is, including backlinks_json.
- Curated target table: keep one row per target snapshot with metric fields and counts.
- Expanded backlink table: flatten backlinks_json into one row per backlink object for deep querying.
This gives fast aggregate queries plus full drill-down.
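Building the expanded backlink table amounts to one flattening pass over the dataset items. A minimal sketch, assuming items shaped like the output example (expand_backlinks is a hypothetical helper):

```python
def expand_backlinks(items):
    """Flatten backlinks_json into one row per backlink object."""
    rows = []
    for item in items:
        for link in item.get("backlinks_json", []):
            rows.append({
                # carry the target-level key onto every backlink row
                "checked_target": item["checked_target"],
                "mode": item["mode"],
                "timestamp": item["timestamp"],
                # per-backlink attributes from the nested object
                "source_url": link["source_url"],
                "target_url": link["target_url"],
                "anchor_text": link["anchor_text"],
                "link_type": link["link_type"],
                "score": link["score"],
            })
    return rows
```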
API Consumption Notes
Dataset Items Endpoint
Use the default dataset endpoint from Actor run links.
Recommended query options:
- view=overview
- clean=1 (if needed in your own API calls)
- format=json
JSON Handling
backlinks_json is already a parsed JSON array in dataset responses.
No string parsing is needed.
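Because the array arrives already parsed, downstream code can index into it directly. A tiny sketch over a sample item shaped like the output example:

```python
# Sample dataset item shaped like the documented output example
item = {
    "checked_target": "example.com",
    "backlink_count": 1,
    "backlinks_json": [
        {"source_url": "https://site-a.com/page-1", "link_type": "dofollow"}
    ],
}

# backlinks_json is already a list of dicts -- no json.loads() step
assert isinstance(item["backlinks_json"], list)
dofollow = [b for b in item["backlinks_json"] if b["link_type"] == "dofollow"]
```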
Suggested API Pagination Strategy
For larger historical datasets:
- request pages in deterministic order
- persist last processed offset or timestamp in your integration
- run idempotent upserts keyed by (checked_target, mode, timestamp)
- avoid destructive overwrite patterns
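An idempotent upsert on the composite key can be sketched as follows. Here `store` stands in for your target table (a plain dict for illustration); upsert_items is a hypothetical helper, not an Actor API.

```python
def upsert_items(store, items):
    """Idempotent upsert keyed by (checked_target, mode, timestamp)."""
    for item in items:
        key = (item["checked_target"], item["mode"], item["timestamp"])
        store[key] = item  # insert or replace -- replays never duplicate rows
    return store
```

Because the key is stable across replays, re-fetching a page after a partial failure is safe: the same items land on the same keys.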
Local Development
Prerequisites
- Python 3.11+
- Apify CLI
- valid dependency installation from requirements.txt
- credentials file present locally
Install Dependencies
$ pip install -r requirements.txt
Local Run
$ apify run
Optional Run
$ python -m src
Operational Readiness Checklist
Before each production run:
- verify target list freshness
- verify selected mode alignment with objective
- verify credential rotation policy compliance
- verify downstream parser version compatibility
- verify dashboard or ETL jobs expect current schema
After each production run:
- validate output item count versus requested target count
- inspect backlink_count outliers
- monitor empty result frequency
- archive run metadata for traceability
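The first post-run check (output item count versus requested target count) can be automated with a small reconciliation helper. find_missing_targets is a hypothetical name; duplicates and empties in the request are collapsed first, since the Actor processes each target at most once.

```python
def find_missing_targets(requested_targets, dataset_items):
    """Return requested targets that produced no dataset item."""
    expected = {t.strip() for t in requested_targets if t.strip()}
    produced = {item["checked_target"] for item in dataset_items}
    return sorted(expected - produced)  # empty list means counts reconcile
```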
Deployment
Push to Apify:
$ apify push
Data Governance
This section outlines intended data handling behavior.
- Actor output includes reporting data related to requested targets only
- credential values are not emitted into dataset output
- output schema is designed for deterministic programmatic use
Compliance-Oriented Usage Notes
- store only data required for your business use case
- apply your organization’s retention policy to raw run outputs
- restrict run-level access based on least privilege
- review third-party policy obligations before high-volume usage
Security Notes
- keep local credential files private
- do not commit real credentials to version control
- rotate account passwords and tokens according to your internal policy
- avoid sharing run links publicly when access control is required
Secrets Management Recommendation
For secure deployment pipelines:
- inject credentials through secured environment variables
- keep secret values out of source files
- keep secret values out of CI logs
- rotate secrets on a defined schedule
- validate secret presence during pre-run checks
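The last point, validating secret presence during pre-run checks, might look like this. require_secrets and any variable names passed to it are placeholders; substitute whatever your deployment pipeline actually injects.

```python
import os

def require_secrets(names):
    """Pre-run check: fail fast when a required env var is absent or empty."""
    missing = [name for name in names if not os.environ.get(name)]
    if missing:
        # failing before the run keeps secret handling out of run logs
        raise RuntimeError("missing required secrets: " + ", ".join(missing))
```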
Integration Patterns
Spreadsheet Flow
- map one row to one checked target
- keep backlinks_json as a JSON-type column when possible
- expand nested rows only when deeper analysis is required
Data Warehouse Flow
- load top-level fields as fact columns
- store backlinks_json as a nested or semi-structured column
- create derived tables from nested backlink objects when needed
Internal Service Flow
- trigger Actor runs from scheduler or workflow system
- ingest dataset items through API
- apply your own downstream validation and alerting rules
Versioning Expectations
Public contract stability priorities:
- keep the domainsToCheck and mode input contract stable
- keep one-record-per-target output behavior stable
- keep backlinks_json as native JSON array data
- additive fields may be introduced in future versions
Backward Compatibility Guidance
When extending this Actor:
- do not repurpose existing field names
- do not change backlinks_json from a JSON array to a string
- do not break one-item-per-target behavior
- add new fields as optional additions
- preserve existing view field order where possible
Quality Checklist
Before running in production:
- confirm input list formatting
- confirm mode selection
- confirm credential availability
- confirm downstream parser accepts nested JSON arrays
Troubleshooting
Input Validation Error
Common reasons:
- domainsToCheck missing
- domainsToCheck empty
- unsupported mode value
Empty Output
Common reasons:
- no valid targets in input
- upstream report currently has no backlink entries
- temporary account access issue
Unexpected Field Parsing
Check that your downstream system treats:
- backlink_count as a number
- backlinks_json as an array
- timestamp as a datetime
Output Count Mismatch
If output items are fewer than requested targets:
- check whether input list contained duplicates
- check for early validation failures on malformed entries
- review error logs for targets whose retries were exhausted
Nested Data Not Displaying in Downstream Tool
If your BI tool shows [object Object] for backlinks_json:
- configure the column as JSON/array type if supported
- flatten array in ETL before visualization
- render with custom JSON formatter in dashboard layer
FAQ
Does each target produce multiple dataset rows?
No. One dataset row per target.
Is backlink detail preserved?
Yes. Full detail is stored in backlinks_json.
Do I need to parse JSON from string?
No. backlinks_json is JSON array data.
Is output deterministic in shape?
Yes. Field names are stable and schema-based.
Can I compare run history easily?
Yes. Use checked_target, mode, and timestamp.
Can I use only summary metrics and ignore nested backlinks?
Yes. You can rely solely on top-level summary fields and backlink_count.
Can I perform per-backlink analytics later?
Yes. Expand backlinks_json into a dedicated table at any time.
Minimal Contract Snapshot
Input:
{
  "domainsToCheck": ["example.com"],
  "mode": "root_domain"
}
Output:
{
  "checked_target": "example.com",
  "mode": "root_domain",
  "backlink_count": 0,
  "backlinks_json": [],
  "timestamp": "2026-02-25T00:00:00+00:00"
}
Final Notes
This Actor is intentionally focused on:
- simple input
- stable output shape
- one record per target
- native JSON backlink detail
- reliable integration readiness
If you consume this Actor as a long-term data source, prefer schema-aware parsers and track revision IDs in your pipeline metadata.