Practical Exercise: Wikipedia Research Agent in n8n

Let us create a Wikipedia research agent in n8n. In this lesson, we will build a workflow that:

  1. Takes a research topic as input
  2. Fetches relevant information from Wikipedia
  3. Uses an LLM to summarize the information
  4. Returns a concise, well-structured summary

Step-by-Step Implementation

Step 1: HTTP Request to Wikipedia API

Nodes needed:
- Webhook node (to trigger workflow)
- HTTP Request node

Configuration:
- Method: GET
- URL: https://en.wikipedia.org/api/rest_v1/page/summary/{topic}
- Replace {topic} with dynamic data from the Webhook node, URL-encoding the value so spaces and special characters are safe
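The URL construction can be sketched in plain JavaScript, the same language an n8n Code node runs. The helper name `buildWikiUrl` and the sample topic are illustrative, not part of n8n itself:

```javascript
// Hypothetical helper: build the Wikipedia REST API summary URL
// from a topic string supplied by the Webhook node.
function buildWikiUrl(topic) {
  // URL-encode the topic so spaces and special characters are safe in the path
  return `https://en.wikipedia.org/api/rest_v1/page/summary/${encodeURIComponent(topic.trim())}`;
}

console.log(buildWikiUrl("Alan Turing"));
// → https://en.wikipedia.org/api/rest_v1/page/summary/Alan%20Turing
```

In the HTTP Request node itself, the same idea is expressed with an n8n expression in the URL field instead of a function call.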

Step 2: Data Processing

Nodes needed:
- Set node (to extract relevant fields)

Data to extract:
- Extract: title, extract (summary), pageid
- Handle errors (the API responds with HTTP 404 when no page matches the topic)
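The extraction step can be sketched as a small function, assuming the HTTP Request node's parsed JSON body is passed in as `response` (the field names `title`, `extract`, and `pageid` come from the REST API; the function name and the error shape are illustrative):

```javascript
// Sketch: keep only the fields the later nodes need.
// A missing `extract` field is treated as an error response.
function extractFields(response) {
  if (!response || !response.extract) {
    return { error: "Page not found" };
  }
  return {
    title: response.title,
    summary: response.extract,
    pageId: response.pageid,
  };
}
```

In a real workflow this logic maps onto a Set node plus an If/Error branch; the function form just makes the input and output shapes explicit.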

Step 3: AI Processing with Free LLM

Nodes needed:
- AI Agent or Chat Model node (using Google Gemini)

Prompt engineering:
"Summarize the following Wikipedia content in 3 bullet points:
Title: {title}
Content: {extract}

Focus on key facts and importance."
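Filling the {title} and {extract} placeholders is plain string templating. A sketch, assuming the two values come from the previous Set node (the function name is illustrative):

```javascript
// Sketch: substitute the extracted fields into the prompt template.
function buildPrompt(title, summary) {
  return [
    "Summarize the following Wikipedia content in 3 bullet points:",
    `Title: ${title}`,
    `Content: ${summary}`,
    "",
    "Focus on key facts and importance.",
  ].join("\n");
}
```

In n8n you would normally write this directly in the AI Agent node's prompt field using expressions rather than a Code node.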

Step 4: Output Formatting

Nodes needed:
- Code node or HTML node

Create clean output:
- Format as markdown
- Include source link
- Add timestamps
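A Code-node formatter covering all three points could look like the sketch below; `title` and `bullets` (the LLM output) are assumed to come from earlier nodes, and the function name is illustrative:

```javascript
// Sketch: assemble the final markdown output with a source link and timestamp.
function formatResult(title, bullets) {
  // Wikipedia article URLs use underscores instead of spaces
  const sourceUrl = `https://en.wikipedia.org/wiki/${encodeURIComponent(title.replace(/ /g, "_"))}`;
  return [
    `## ${title}`,
    "",
    bullets,
    "",
    `Source: ${sourceUrl}`,
    `Generated: ${new Date().toISOString()}`,
  ].join("\n");
}
```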

Complete Workflow Structure

Start (Webhook/Manual Trigger)
↓
HTTP Request → Wikipedia API
↓
Error Handler (If page not found)
↓
Set Node (Extract title & content)
↓
AI Agent Node (Gemini - Free tier)
↓
Format Output
↓
Return Result (Webhook Response/Email/Slack)
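The whole chain can be traced offline in plain JavaScript, with the HTTP request and the Gemini call replaced by stubbed values so only the data flow between nodes is shown (all sample data below is illustrative):

```javascript
// Stub for the HTTP Request node's output (Wikipedia REST API shape)
const apiResponse = {
  title: "Alan Turing",
  extract: "Alan Turing was a British mathematician and computer scientist.",
  pageid: 736,
};

// "Set node": keep only the fields we need, or flag the error branch
const fields = apiResponse.extract
  ? { title: apiResponse.title, summary: apiResponse.extract }
  : { error: "Page not found" };

// Prompt fed to the AI Agent node
const prompt = `Summarize the following Wikipedia content in 3 bullet points:\nTitle: ${fields.title}\nContent: ${fields.summary}\n\nFocus on key facts and importance.`;

// Stub for the Gemini response
const llmOutput = "- Pioneer of theoretical computer science\n- Broke the Enigma cipher\n- Proposed the Turing test";

// "Format Output": final markdown returned by the workflow
const result = `## ${fields.title}\n\n${llmOutput}\n\nSource: https://en.wikipedia.org/wiki/Alan_Turing`;
console.log(result);
```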

If you liked the tutorial, spread the word and share the link and our website, Studyopedia, with others.


Studyopedia Editorial Staff
contact@studyopedia.com

We work to create programming tutorials for all.
