n8n NewsAPI node
The n8n NewsAPI node extracts articles from news sites, including headlines, content, and metadata. Connect it to your n8n workflows for media monitoring or content aggregation.
What data you can get with n8n NewsAPI node
Extract article headlines, full text, authors, publication dates, and source information, along with metadata and images. Use it for news monitoring or content curation. Results are exported as JSON.
Output
```json
{
  "url": "https://apnews.com/article/nvidia-gtc-jensen-huang-ai-457e9260aa2a34c1bbcc07c98b7a0555",
  "site": "apnews.com",
  "tags": ["Business"],
  "image": "https://dims.apnews.com/dims4/default/1110a1f/2147483647/strip/true/crop/4659x2621+0+243/resize/1440x810!/quality/90/?url=https%3A%2F%2Fassets.apnews.com%2F1c%2Fa7%2Fbb9db252004b299235ec619feb7b%2F227eb1f572f14664b6ea05d276e07359",
  "label": "apnews.com.article",
  "query": "Nvidia",
  "title": "Nvidia CEO Jensen Huang unveils new Rubin AI chips at GTC 2025",
  "author": "SARAH PARVINI",
  "content": "Nvidia founder Jensen Huang kicked off the company’s artificial intelligence developer conference on Tuesday by telling a crowd of thousands that AI is going through “an inflection point.”\n\nAt GTC 2025 — dubbed the “Super Bowl of AI” — Huang focused his keynote on the company’s advancements in AI and his predictions for how the industry will move over the next few years. Demand for GPUs from the top four cloud service providers is surging, he said, adding that... (truncated)",
  "updated": "2025-03-18T18:35:20",
  "published": "2025-03-18T18:35:20",
  "description": "Nvidia founder Jensen Huang kicked off the company’s artificial intelligence developer conference, on Tuesday by telling a crowd of thousands that AI is going through “an inflection point.”"
}
```
How to set up n8n NewsAPI node with Apify
Configure the Actor with news site URLs or topic searches. It extracts articles with full content, bypassing paywalls where possible. Set date ranges and article limits as needed.
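For illustration, an input configuration might look like the sketch below. Only `query` appears in the sample output above; the other field names (`sites`, `maxItems`, `dateFrom`) are placeholder assumptions, so check the Actor's input tab in Apify Console for the actual schema:

```json
{
  "query": "Nvidia",
  "sites": ["apnews.com", "reuters.com"],
  "maxItems": 50,
  "dateFrom": "2025-03-01"
}
```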
1. Sign up for an Apify account
Creating an account is quick and free. No credit card required. Your account gives you access to more than 20,000 scrapers and APIs.
2. Get your Apify API token
Go to Settings in Apify Console and navigate to the API & Integrations tab. There, create a new token and save it for later.
3. Test-run NewsAPI node
Open the NewsAPI node Actor in Apify Console and configure your input parameters. Click Start to run the Actor and preview the data structure you'll receive in your n8n workflow.
4. Integrate NewsAPI node via n8n
Add the Apify node to your n8n workflow. Select Run Actor as the operation, choose your Actor, and pass your input configuration as JSON. Enable Wait for finish to retrieve results directly in subsequent nodes.
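Once the run finishes, each dataset item becomes an n8n item, so downstream nodes can reference its fields with standard n8n expressions. The field names here are taken from the sample output above:

```
{{ $json.title }}        // article headline
{{ $json.published }}    // publication timestamp
{{ $json.content }}      // full article text
```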
Never get blocked
Every plan (free included) comes with Apify Proxy, which is great for avoiding blocking and giving you access to geo-specific content.
Customers love us
We truly care about the satisfaction of our users, which is why we're one of the best-rated data extraction platforms on both G2 and Capterra.
Monitor your runs
With our latest monitoring features, you always have immediate access to valuable insights on the status of your web scraping tasks.
How do I run the NewsAPI node from n8n?
Add an HTTP Request node to your n8n workflow and point it to the Apify API. Use your API token for authentication and specify the NewsAPI node Actor ID you want to run. The Actor executes and returns data directly to your workflow. You can also use n8n's dedicated Apify node if available in your version.
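A minimal sketch of the request an HTTP Request node would make, using Apify's `run-sync-get-dataset-items` endpoint (which starts a run and returns its dataset items when it finishes). The Actor ID and input fields are placeholders, not the Actor's documented schema:

```typescript
const APIFY_BASE = "https://api.apify.com/v2";

// Build the run-sync endpoint URL for a given Actor.
// Actor IDs take the form "username~actor-name".
function buildRunUrl(actorId: string, token: string): string {
  return `${APIFY_BASE}/acts/${encodeURIComponent(actorId)}` +
    `/run-sync-get-dataset-items?token=${encodeURIComponent(token)}`;
}

// Start the Actor with the given input and return its dataset items.
async function runActor(actorId: string, token: string, input: object) {
  const res = await fetch(buildRunUrl(actorId, token), {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(input),
  });
  if (!res.ok) throw new Error(`Apify API returned ${res.status}`);
  return res.json(); // array of article objects like the sample output above
}

// Hypothetical usage; "username~newsapi-node" stands in for the real Actor ID:
// runActor("username~newsapi-node", process.env.APIFY_TOKEN!, { query: "Nvidia" });
```

In n8n itself, the equivalent is setting the HTTP Request node's method to POST, its URL to the built endpoint, and its JSON body to the Actor input.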
Can I try it for free?
Yes. Apify offers a free tier with prepaid platform usage. This is enough to test Actors with your n8n workflows and run small-scale extractions. No credit card required to start.
Do I need coding skills to use this?
No. You can configure Apify Actors through their web interface and connect them to n8n using the HTTP Request node, with no coding required. For advanced use cases, you can customize Actor inputs or use the Apify SDK with JavaScript or Python.
Why use an Apify Actor instead of building my own scraper?
Building and maintaining scrapers takes significant time. Websites change their structure, add bot detection, and block requests. Apify Actors handle all of this automatically: proxy rotation, anti-bot bypassing, error handling, and data parsing. You get reliable data without the maintenance burden.