Content Calendar Actor
Ready-to-publish “This Day in History” content by date, location, and topic. Extracts ranked historical events from Wikidata & Wikipedia and generates captions, hashtags, and SEO headlines for marketing, media, and education. No scraping. API-ready.
What does Content Calendar Actor do?
Content Calendar Actor is an Apify Actor that extracts historical events for "This day in history" from Wikidata and Wikipedia. You choose a date, optional locations (e.g. Berlin, Iraq) and topics (politics, culture, science, sports, economy), and the Actor returns ranked events plus ready-to-publish content: social captions, hashtags, and SEO headlines. Input is a simple form (date, locations, topics, language, max events); no code required. This Actor is a Content Calendar & Events Intelligence API — it does not scrape websites or collect trivia; it uses public Wikidata and Wikipedia APIs to build event-based content for marketing, media, and education.
Why use Content Calendar Actor?
- Content marketing — Fill your editorial calendar with "on this day" posts, newsletters, and social content. Get factual captions and hashtags without writing from scratch.
- Media and research — Use canonical event data (Wikidata QIDs, dates, locations) for fact-checking, timelines, and broadcast segments.
- Education — Build curricula or timelines from structured historical events with source links to Wikidata and Wikipedia.
Platform advantages: Run the Actor on a schedule (e.g. daily at 6:00 UTC), call it via the Apify API from your app, send results to a webhook or Telegram, and download datasets in JSON, CSV, or Excel from the Apify Console. You get monitoring, retries, and integrations without building infrastructure yourself.
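If you prefer code over the Console, a minimal sketch using the Apify Python client looks like this. The Actor ID shown is a placeholder, and the input field names mirror the Input table further below, so check the Actor's actual schema before relying on them:

```python
# Minimal sketch: run the Actor via the Apify Python client and read its dataset.
# "username/content-calendar-actor" is a placeholder Actor ID; the input field
# names mirror the Input table in this README and may differ from the real schema.
from apify_client import ApifyClient

client = ApifyClient("YOUR_APIFY_TOKEN")

run = client.actor("username/content-calendar-actor").call(
    run_input={
        "date": "today",
        "locations": ["Berlin"],
        "topics": ["politics", "culture"],
        "language": "en",
        "maxEvents": 10,
    }
)

# Iterate over the events stored in the run's default dataset.
for event in client.dataset(run["defaultDatasetId"]).iterate_items():
    print(event["date"], event["title"], event["seoHeadline"])
```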
What data can Content Calendar Actor extract?
Each run returns a dataset of historical events. Main data points per event:
| Data | Description |
|---|---|
| Event title & type | Human-readable label and event type from Wikidata |
| Date | ISO date (YYYY-MM-DD) when the event occurred |
| Location | Place name and Wikidata QID; optional coordinates |
| Place hierarchy | Normalized city, region, country for captions and hashtags |
| Wikipedia summary | Short, human-readable description (when a Wikipedia article exists) |
| Importance score | Ranking score (sitelinks, languages, anniversary milestones) |
| Social caption | Factual caption for social posts |
| Caption with hook | Alternative "This day in history" style line |
| Hashtags | Topic- and geography-based suggestions (e.g. #OnThisDay, #Berlin) |
| SEO headline | Ready-to-use headline for articles or titles |
| Source URLs | Links to the Wikidata item and Wikipedia article |
How to use Content Calendar Actor to get historical events
- Open the Actor on Apify (Store or your Console) and go to the Input tab.
- Set Date to `today` or a specific date in `YYYY-MM-DD` format.
- (Optional) Add Locations (e.g. `Berlin`, `Iraq`) to filter events by geography. Leave empty for global events.
- (Optional) Add Topics (e.g. `politics`, `culture`, `science`) to filter by type.
- Choose Language for labels and Wikipedia summaries (e.g. English, German, Arabic).
- Set Max events (1–100). Start with 10 or 20 for quick runs.
- Click Start and wait for the run to finish.
- Open the Output tab to view or download the dataset (JSON, CSV, or Markdown depending on Output format).
For a daily content calendar, use Scheduler: create a schedule (e.g. cron `0 6 * * *` for 6:00 UTC), set the input to `date: "today"`, and optionally enable Send Telegram or Send webhook to receive a digest automatically.
How much does it cost to run Content Calendar Actor?
Runs are billed on Apify’s consumption model (Compute Units and platform usage). The Actor uses only public Wikidata and Wikipedia APIs — no proxies or paid data sources. Cost scales with max events and how often you run (e.g. once per day vs. many runs per day). On the free plan you can run the Actor regularly with moderate maxEvents; larger plans support more runs and higher limits. Check the Pricing section on the Actor page and the Apify pricing page for current rates.
Input
Content Calendar Actor has the following input options. Click the Input tab on the Actor page for the full schema and tooltips.
| Option | Description |
|---|---|
| Date | today or YYYY-MM-DD (e.g. 2025-02-19) |
| Locations | List of place names (city, region, country). Empty = global. |
| Topics | List of: culture, politics, science, sports, economy. Empty = all topics. |
| Language | Language for labels and Wikipedia (e.g. en, de, ar). |
| Max events | Number of events to return (1–100). |
| Output format | json, csv, or markdown. |
| Send webhook / Webhook URL | POST a summary to your URL when the run finishes. |
| Send Telegram / Telegram bot token / Chat ID | Send a short "This day in history" digest to Telegram. |
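For orientation, a run input for a filtered daily digest might look like the JSON below. The exact field names, especially for the webhook and Telegram options, are assumptions based on the table above; the Input tab shows the authoritative schema.

```json
{
  "date": "today",
  "locations": ["Berlin"],
  "topics": ["politics", "culture"],
  "language": "en",
  "maxEvents": 10,
  "outputFormat": "json",
  "sendWebhook": true,
  "webhookUrl": "https://example.com/on-this-day"
}
```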
Output
You can download the dataset produced by Content Calendar Actor in JSON, CSV, or Markdown (depending on Output format). From the run’s Output tab you can also export as Excel or use the Apify API to fetch results programmatically.
Example output (one event in JSON):
{"qid": "Q12345","title": "Fall of the Berlin Wall","eventType": "event","date": "1989-11-09","location": "Berlin","locationQid": "Q64","description": "Fall of the Berlin Wall in 1989","wikipediaSummary": "The Fall of the Berlin Wall was a pivotal event...","locationNormalized": { "city": "Berlin", "region": "Berlin", "country": "Germany" },"sitelinksCount": 120,"importanceScore": 42.5,"caption": "On November 9, 1989, Fall of the Berlin Wall took place in Berlin. The Fall of the Berlin Wall was a pivotal event...","captionHook": "This day in history: Fall of the Berlin Wall — The Fall of the Berlin Wall was a pivotal event...","hashtags": ["OnThisDay", "History", "Politics", "Berlin", "Germany"],"seoHeadline": "Fall of the Berlin Wall: What Happened on November 9, 1989 in Berlin","wikidataUrl": "https://www.wikidata.org/wiki/Q12345","wikipediaUrl": "https://en.wikipedia.org/wiki/Fall_of_the_Berlin_Wall"}
Data sources and responsible use
The Actor uses only public APIs:
- Wikidata for structured event data (dates, locations, types).
- Wikipedia REST API for short summaries when an article exists.
No personal or private data is collected. All content is derived from Wikidata and Wikipedia under their respective licenses and terms of use. Use the results in line with those terms and with respect to attribution. If you reuse Wikipedia text, follow Wikipedia’s reuse guidelines.
Tips and advanced options
- Limit cost — Use a lower Max events (e.g. 10) for testing; increase for production.
- Daily digest — Use Scheduler with `date: "today"` and Send Telegram or Send webhook to get a daily "This day in history" feed.
- Multiple languages — Change Language and re-run to get labels and Wikipedia summaries in another language (e.g. `de`, `ar`).
- Narrow results — Combine Locations and Topics (e.g. `["Berlin"]` + `["politics", "culture"]`) for focused content.
- Few or no events with locations? — The location filter keeps only events that occurred in or within the resolved places (e.g. Berlin or a venue in Berlin). Many dates have few such events. For more results, use one location, try a larger area (e.g. a country), or leave Locations empty for global events.
- “No events found” after a timeout — The Actor uses Wikidata’s public SPARQL endpoint, which can sometimes return 504 (upstream timeout). The Actor retries once automatically. If you still get no events, run again later or try without location/topic filters.
- Placeholders for sparse data — When a Wikidata item has no label or Wikipedia article, the Actor still returns the event: the title may be `Event (Q123)`, the location may be `Location (Q64)` or derived from the place hierarchy, and the description falls back to the first sentence of the summary or “Event on <date>”. Use the `qid` and `wikidataUrl` to look up or improve the data on Wikidata.
Integrations
Run Content Calendar Actor from your automation platform and pipe events into newsletters, social posts, or other tools.
n8n
Use the Apify node in n8n to run the Actor and get the dataset in one step.
- Install the Apify node — In n8n: Settings → Community nodes → Install `@apify/n8n-nodes-apify`, or search for Apify in the node panel (n8n Cloud).
- Connect — Add an Apify node and authenticate with your Apify API token.
- Run the Actor — Choose Run an Actor and Get Dataset, select Content Calendar & Events Intelligence (or paste the Actor ID), and set the input (e.g. `date: "today"`, `locations: ["Berlin"]`, `topics: ["culture"]`, `maxEvents: 10`).
- Use the data — The node outputs dataset items. Map `title`, `caption`, `hashtags`, `seoHeadline`, and `wikidataUrl` into the next steps (e.g. Google Sheets, Slack, or a social scheduler).
Details: Apify + n8n integration.
Make (Integromat)
Use the Apify app in Make to run the Actor and fetch results.
- Add Apify — In your scenario, add a module and search for Apify. Connect with OAuth or your Apify API token.
- Run the Actor — Use Run an Actor with the Content Calendar Actor ID and input (date, locations, topics, maxEvents, etc.). For short runs, the synchronous action returns when the run finishes (respect Make’s timeout). For longer runs, use Watch Actor Runs (trigger) and then Get Dataset Items.
- Get Dataset Items — Add a Get Dataset Items module and use the run’s default dataset ID from the previous step. Map the items to Google Sheets, Airtable, Slack, email, or another app.
Details: Apify + Make integration and Apify on Make.
OpenClaw
Use the Apify API (REST, CLI, or MCP) from OpenClaw or any automation that can call HTTP APIs.
- Trigger a run — `POST https://api.apify.com/v2/acts/YOUR_ACTOR_ID/runs` with a JSON body for the input (e.g. `{"date": "today", "maxEvents": 20}`). Use your Apify token in the `Authorization` header.
- Wait and fetch — Poll the run status, or use Send webhook in the Actor input so the run POSTs a summary to a URL when it finishes.
- Get dataset — When the run has finished, `GET https://api.apify.com/v2/datasets/DATASET_ID/items` to retrieve the events. Use them in OpenClaw flows (e.g. email digests, calendar entries) or pass the dataset URL to another service.
For CLI-based flows: `apify run content-calendar-actor --date today --maxEvents 10`, then fetch the default dataset. See OpenClaw on Apify and the Apify API.
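The same flow in plain HTTP, as a sketch using Python's requests library (YOUR_ACTOR_ID and the token are placeholders; the polling interval is arbitrary):

```python
# Sketch of the REST flow above: trigger a run, poll until it finishes, fetch the items.
# YOUR_ACTOR_ID and YOUR_APIFY_TOKEN are placeholders.
import time
import requests

API = "https://api.apify.com/v2"
HEADERS = {"Authorization": "Bearer YOUR_APIFY_TOKEN"}

# 1. Trigger a run with the desired input.
run = requests.post(
    f"{API}/acts/YOUR_ACTOR_ID/runs",
    headers=HEADERS,
    json={"date": "today", "maxEvents": 20},
).json()["data"]

# 2. Poll the run status until it reaches a terminal state.
while run["status"] in ("READY", "RUNNING"):
    time.sleep(5)
    run = requests.get(f"{API}/actor-runs/{run['id']}", headers=HEADERS).json()["data"]

# 3. Retrieve the events from the run's default dataset.
events = requests.get(
    f"{API}/datasets/{run['defaultDatasetId']}/items",
    headers=HEADERS,
    params={"format": "json"},
).json()
print(len(events), "events")
```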
References & further reading
- Apify Wikipedia API — Apify’s MediaWiki Scraper for richer Wikipedia/Fandom data (sections, categories, revision info). This Actor uses Wikipedia’s official REST API for short summaries; the Apify API is an option if you need full-page or batch extraction elsewhere.
- wikibase-cli — Command-line interface to read and edit Wikibase (including Wikidata), built on wikibase-sdk. Useful for scripting and understanding programmatic Wikidata access; this Actor uses direct `wbgetentities` and SPARQL calls.
- Wikipedia page ID → Wikidata QID — To resolve a Wikipedia page ID to a Wikidata item, a reliable approach is the two-step method: use the Wikipedia API (`action=query&prop=pageprops&pageids=…`) to get `wikibase_item`, then query Wikidata by that QID. Using `SERVICE wikibase:mwapi` in SPARQL can work but is prone to timeouts; see this Stack Overflow discussion. A short sketch of this lookup follows this list.
FAQ and support
Can I use this as an API?
Yes. Call the Actor via the Apify API (REST or client libraries). The run’s default dataset contains the events; you can fetch it by dataset ID.
Why are some events missing a Wikipedia summary?
Summaries are only added when the event has a Wikipedia article in the chosen language. Otherwise `wikipediaSummary` is empty; captions still use the title, date, and location.
How are events ranked?
Events are scored by sitelinks count, number of languages, and anniversary milestones (10, 25, 50, 100 years). They are sorted by this importance score and limited by Max events.
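The exact weighting is internal to the Actor; as a rough illustration of the idea only (the weights below are invented):

```python
# Rough illustration of the ranking idea only; the real weights are internal to the
# Actor and the numbers below are invented.
def importance_score(sitelinks: int, languages: int, years_since_event: int) -> float:
    score = 0.3 * sitelinks + 0.2 * languages
    if years_since_event in (10, 25, 50, 100):  # anniversary milestone bonus
        score += 10.0
    return score
```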
Where can I report issues or ask for help?
Use the Issues tab on the Actor page to report bugs or request features. We’re happy to hear how you use the Actor and how we can improve it.
Why is my webhook URL rejected?
For security (SSRF prevention), webhook URLs must be valid http or https and cannot target localhost or private IP ranges (e.g. 127.0.0.1, 10.x, 192.168.x). Use a public HTTPS endpoint.
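The kind of check involved looks roughly like the sketch below; it is illustrative only, not the Actor's actual validation code:

```python
# Illustrative sketch of SSRF-style webhook URL validation; not the Actor's actual code.
import ipaddress
import socket
from urllib.parse import urlparse

def is_allowed_webhook(url: str) -> bool:
    parsed = urlparse(url)
    if parsed.scheme not in ("http", "https") or not parsed.hostname:
        return False
    try:
        addr = ipaddress.ip_address(socket.gethostbyname(parsed.hostname))
    except (socket.gaierror, ValueError):
        return False
    # Reject loopback, private, and link-local ranges such as 127.0.0.1, 10.x, 192.168.x.
    return not (addr.is_loopback or addr.is_private or addr.is_link_local)
```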
Why do I get only one or zero events when I set locations?
Events are kept only if their place is one of the resolved locations or inside one of them (e.g. a building in Berlin). On many dates there are few such events. See Tips above: try fewer locations, a larger area, or no locations for global results.
Limitations
Location and topic filters rely on Wikidata search; ambiguous names may resolve to the first match. For critical use, verify important QIDs. The Actor does not use an LLM; all text comes from Wikidata/Wikipedia and deterministic templates, so output is factual and reproducible.