# 🚀 n8n Mastery Guide: From Basics to Best Practices

## Table of Contents
- Understanding n8n Data Flow
- Workflow Management & Subflows
- Error Handling & Resilience
- Thinking in Workflows
- Version Control Strategies
## 1. Understanding n8n Data Flow

📚 Official Docs

### Core Concepts

n8n processes data in "items". Each item is an object with a `json` property:
```js
// Single item
{
  json: {
    name: "John",
    email: "john@example.com"
  }
}

// Multiple items (array)
[
  { json: { name: "John", email: "john@example.com" } },
  { json: { name: "Jane", email: "jane@example.com" } }
]
```
### Key Rules

- **Node Input**: Each node receives an array of items
- **Node Output**: Each node returns an array of items
- **Accessing Data**: Use `$json` to access the current item's data
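Outside n8n, that input/output contract can be sketched in plain JavaScript with a hand-built items array (the `upperCaseNames` function and the sample data are illustrative, not part of n8n itself):

```js
// A node's contract, sketched outside n8n: an array of items in,
// an array of items out.
const input = [
  { json: { name: "John", email: "john@example.com" } },
  { json: { name: "Jane", email: "jane@example.com" } },
];

// A "node" is then just a function from items to items
const upperCaseNames = (items) =>
  items.map((item) => ({
    json: { ...item.json, name: item.json.name.toUpperCase() },
  }));

const output = upperCaseNames(input);
console.log(output[0].json.name); // JOHN
```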
### Practical Examples

#### Example 1: Simple Data Flow

```
[Manual Trigger] → [Code Node] → [HTTP Request]
```
**Code Node:**

```js
// Access the previous node's data
const userName = $json.name;
const userEmail = $json.email;

// Return data for the next node
return [
  {
    json: {
      fullName: userName,
      contact: userEmail,
      timestamp: new Date().toISOString(),
    },
  },
];
```
#### Example 2: Processing Multiple Items

```js
// When the previous node produced multiple items
const allItems = $input.all(); // Get all items as an array

// Process each item
const processed = allItems.map((item) => ({
  json: {
    originalData: item.json,
    processed: true,
    processedAt: Date.now(),
  },
}));

return processed;
```
#### Example 3: Aggregating Data

```js
// Combine multiple items into one
const allItems = $input.all();

return [
  {
    json: {
      items: allItems.map((item) => item.json),
      count: allItems.length,
      summary: "Combined " + allItems.length + " items",
    },
  },
];
```
### 🎯 Quick Reference

| Expression | Purpose | Example |
|---|---|---|
| `$json` | Current item data | `$json.name` |
| `$input.first()` | First item | `$input.first().json` |
| `$input.last()` | Last item | `$input.last().json` |
| `$input.all()` | All items | `$input.all()` |
| `$node["NodeName"]` | Specific node data | `$node["HTTP Request"].json` |
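To see what the `$input` helpers return, they can be mimicked over a plain items array. This mock is purely illustrative of the semantics, not n8n's real implementation:

```js
// Illustrative mock of n8n's $input helpers over a plain items array
const items = [
  { json: { name: "John" } },
  { json: { name: "Jane" } },
  { json: { name: "Jim" } },
];

const $input = {
  all: () => items,                    // every item
  first: () => items[0],               // first item
  last: () => items[items.length - 1], // last item
};

console.log($input.first().json.name); // John
console.log($input.last().json.name);  // Jim
console.log($input.all().length);      // 3
```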
## 2. Workflow Management & Subflows

📚 Official Docs

### Pattern 1: Simple Subflow Call

**Main Workflow:**

```
[Trigger] → [Process Data] → [Execute Workflow: "Email Sender"] → [Log Result]
```

**Subworkflow ("Email Sender"):**

```
[When called by another workflow] → [Format Email] → [Send Email]
```
**Configuration (Execute Workflow node):**

- Source: Database
- Workflow: Select "Email Sender"
- Data automatically passes to the subworkflow

**In the subworkflow, access parent data:**

```js
// Get data from the parent workflow
const parentData = $input.first().json;

// Access specific fields
const articles = parentData.articles;
const metadata = parentData.metadata;
```
### Pattern 2: Subflow with Return Value

**Main Workflow:**

```js
// Before calling the subworkflow
return [
  {
    json: {
      operation: "processData",
      data: myData,
      config: { timeout: 5000 },
    },
  },
];
```

The Execute Workflow node returns the subworkflow's output.

**After Execute Workflow:**

```js
// Access the subworkflow result
const result = $json.result;
const status = $json.status;
console.log("Subworkflow returned:", result);
```
### Pattern 3: Conditional Subflow Execution

```
[Process] → [IF Node] → True: [Execute Workflow: "Success Handler"]
                      → False: [Execute Workflow: "Error Handler"]
```

**IF Node Condition:**

```js
{{ $json.status === "success" }}
```
### Pattern 4: Multiple Subflows in Sequence

```
[Data] → [Execute: "Validate"] → [Execute: "Process"] → [Execute: "Notify"]
```

Each subworkflow receives the output of the previous one.
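That chaining can be sketched as plain functions, where each "subworkflow" takes the previous one's items. The function names and sample data here are illustrative only:

```js
// Sequential subflows sketched as functions over items
const validate = (items) => items.filter((item) => Boolean(item.json.email));
const process = (items) =>
  items.map((item) => ({ json: { ...item.json, processed: true } }));
const notify = (items) => items; // side effects (email, Slack, ...) would go here

const input = [
  { json: { email: "a@example.com" } },
  { json: {} }, // dropped by validate
];

// Each step receives the previous step's output,
// like chained Execute Workflow nodes
const result = [validate, process, notify].reduce(
  (items, step) => step(items),
  input
);
console.log(result.length); // 1
```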
### Pattern 5: Parallel Subflow Execution

```
                  ┌→ [Execute: "Process A"]
[Data] → [Split] ─┼→ [Execute: "Process B"] → [Merge] → [Combine Results]
                  └→ [Execute: "Process C"]
```

Use the Split In Batches or Item Lists node.
### 🎯 Best Practices

✅ **DO:**

- Name subworkflows clearly: "Email - Send Digest", "Data - Validate Input"
- Keep subworkflows focused (single responsibility)
- Use descriptive node names
- Add notes to complex subworkflows

❌ **DON'T:**

- Create circular dependencies (A calls B, B calls A)
- Pass huge datasets (>1MB) between workflows
- Nest subworkflows more than 3 levels deep
## 3. Error Handling & Resilience

📚 Official Docs

### Strategy 1: Try-Catch Pattern with IF Node

```
[HTTP Request] → [IF: Check for Errors] → Success: [Continue]
                                        → Error: [Handle Error] → [Notify]
```

**IF Node Condition:**

```js
{{ $json.error === undefined && $json.statusCode === 200 }}
```
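The same check can be written as a plain predicate and exercised outside n8n; the test items below are made-up samples:

```js
// The IF condition as a plain predicate, testable outside n8n
const isSuccess = (item) =>
  item.json.error === undefined && item.json.statusCode === 200;

console.log(isSuccess({ json: { statusCode: 200 } }));                // true
console.log(isSuccess({ json: { statusCode: 500 } }));                // false
console.log(isSuccess({ json: { statusCode: 200, error: "boom" } })); // false
```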
### Strategy 2: Built-in Error Handling

On any node, click the gear icon → Settings:

- Continue On Fail: ✅ (keeps the workflow running)
- Retry On Fail: ✅
- Max Tries: 3
- Wait Between Tries: 1000ms (exponential backoff)

**Example: Resilient HTTP Request**

```
Settings:
- Continue On Fail: Yes
- Retry: 3 times
- Wait: 1000ms, 2000ms, 4000ms
```
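The doubling-wait schedule above is exponential backoff. If you ever need it by hand in a Code node, a minimal sketch looks like this (the `withRetry` helper and `flaky` call are illustrative, not an n8n API):

```js
// Sketch of retry with exponential backoff.
// Delays double each attempt: 1000ms, 2000ms, 4000ms by default.
async function withRetry(fn, maxTries = 3, baseDelayMs = 1000) {
  for (let attempt = 0; attempt < maxTries; attempt++) {
    try {
      return await fn();
    } catch (error) {
      if (attempt === maxTries - 1) throw error; // out of tries
      const delayMs = baseDelayMs * 2 ** attempt;
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}

// Usage: a flaky call that fails twice, then succeeds
let calls = 0;
const flaky = async () => {
  calls++;
  if (calls < 3) throw new Error("transient failure");
  return "ok";
};

withRetry(flaky, 3, 10).then((result) => console.log(result, calls)); // ok 3
```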
### Strategy 3: Error Workflow (Global Handler)

Create a dedicated "Error Handler" workflow:

```
[Error Trigger] → [Parse Error] → [Log to Database] → [Send Alert]
```

**Error Trigger Node:**

```js
// Access error details
const errorWorkflow = $json.workflow.name;
const errorNode = $json.node.name;
const errorMessage = $json.error.message;
const errorTime = $json.execution.startedAt;

return [
  {
    json: {
      workflow: errorWorkflow,
      node: errorNode,
      error: errorMessage,
      timestamp: errorTime,
      executionId: $json.execution.id,
    },
  },
];
```

**Set as Global Error Workflow:**

- Go to Settings → Workflows
- Set "Error Workflow" to your error handler
### Strategy 4: Defensive Coding

```js
// Always validate input
function processData(data) {
  // Validation
  if (!data || typeof data !== "object") {
    console.log("❌ Invalid data received");
    return { error: "Invalid input", success: false };
  }

  // Null checks
  const items = data.items || [];
  const config = data.config || {};

  // Try-catch for risky operations
  try {
    const result = riskyOperation(items);
    return { result, success: true };
  } catch (error) {
    console.log("❌ Error:", error.message);
    return { error: error.message, success: false };
  }
}

// Use it
const result = processData($json);
if (!result.success) {
  // Handle the error path
  return [{ json: { status: "error", message: result.error } }];
}

// Continue with the success path
return [{ json: { status: "success", data: result.result } }];
```
### Strategy 5: Circuit Breaker Pattern

```js
// Track failures across executions. A plain `let failureCount = 0` would
// reset on every run, so persist the counter in workflow static data.
const staticData = $getWorkflowStaticData("global");
staticData.failureCount = staticData.failureCount || 0;
const MAX_FAILURES = 5;

try {
  const result = await externalService.call();
  staticData.failureCount = 0; // Reset on success
  return result;
} catch (error) {
  staticData.failureCount++;
  if (staticData.failureCount >= MAX_FAILURES) {
    console.log("🔴 Circuit breaker triggered - stopping workflow");
    throw new Error("Circuit breaker open: too many failures");
  }
  console.log(`⚠️ Failure ${staticData.failureCount}/${MAX_FAILURES}`);
  // Continue or retry
}
```
### 🎯 Resilience Checklist

- Enable "Continue On Fail" on external API calls
- Add retry logic with exponential backoff
- Validate all input data before processing
- Use try-catch blocks around risky operations
- Set up a global error workflow for monitoring
- Add timeout limits on long-running operations
- Log errors with context (node name, input data)
- Send alerts for critical failures
- Test failure scenarios regularly
## 4. Thinking in Workflows

📚 Official Docs

### The Workflow Mindset

Think in stages, not steps:

❌ **Wrong thinking:** "I need to make an API call, then parse the response, then…"

✅ **Right thinking:** "I need to: 1) Get Data → 2) Transform → 3) Act → 4) Report"
### Pattern: Start Small, Grow Iteratively

**Iteration 1: Basic Flow (Hardcoded)**

```
[Manual Trigger] → [HTTP Request: Fixed URL] → [Show Result]
```

**Iteration 2: Add Parameters**

```
[Manual Trigger] → [Set Variables] → [HTTP Request: Use Variables] → [Show Result]
```

**Iteration 3: Add Processing**

```
[Trigger] → [Set Vars] → [HTTP Request] → [Filter Data] → [Transform] → [Show]
```

**Iteration 4: Add Error Handling**

```
[Trigger] → [Set] → [HTTP + Retry] → [IF: Success?] → Yes: [Process]
                                                    → No: [Log Error]
```

**Iteration 5: Make Reusable**

```
[Trigger] → [Validate Input] → [Execute Workflow: "API Handler"] → [Format Output]
```
### Decomposition Strategy

**Example: "Send Weekly Security Digest"**

🔴 **Bad approach:** one massive workflow with 30 nodes

🟢 **Good approach:** decompose into logical units

```
Main Workflow: "Security Digest Orchestrator"
├── Subworkflow: "RSS - Fetch Articles"
├── Subworkflow: "AI - Generate Summary"
├── Subworkflow: "Email - Send Digest"
└── Subworkflow: "Slack - Post Notification"
```

Each subworkflow is:

- Testable independently
- Reusable in other workflows
- Maintainable (easier to update)
- Understandable (single purpose)
### Design Patterns

**Pattern 1: Pipeline (Sequential Processing)**

```
[Input] → [Validate] → [Transform] → [Enrich] → [Output]
```

Use when: data flows linearly through transformations.

**Pattern 2: Branch (Conditional Logic)**

```
[Input] → [Decision] → Path A: [Process A] → [Merge]
                     → Path B: [Process B] → [Merge]
```

Use when: different handling is needed based on conditions.

**Pattern 3: Map-Reduce (Batch Processing)**

```
[Input] → [Split Items] → [Process Each] → [Aggregate] → [Output]
```

Use when: processing lists of items.
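The map-reduce shape translates directly to array operations over items. This sketch uses made-up numeric data to show the map (process each) and reduce (aggregate) halves:

```js
// Map-reduce over items: process each, then aggregate into one item
const items = [1, 2, 3, 4].map((n) => ({ json: { value: n } }));

// Map: process each item independently
const processed = items.map((item) => ({
  json: { value: item.json.value * 2 },
}));

// Reduce: aggregate back into a single summary item
const summary = {
  json: {
    count: processed.length,
    total: processed.reduce((sum, item) => sum + item.json.value, 0),
  },
};
console.log(summary.json.total); // 20
```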
**Pattern 4: Event-Driven (Trigger-Response)**

```
[Webhook Trigger] → [Parse Event] → [Route by Type] → [Multiple Handlers]
```

Use when: reacting to external events.

**Pattern 5: Scheduled Task (Time-Based)**

```
[Cron Trigger] → [Fetch Data] → [Process] → [Report]
```

Use when: running regular automated tasks.
### Workflow Organization Framework

Use this naming convention:

```
[Category] - [Purpose] - [Version]

Examples:
- Data - Fetch RSS Articles - v2
- Email - Send Digest - v1
- Integration - Slack Notify - v3
- Utility - Validate JSON - v1
```

**Folder Structure (in your mind/docs):**

```
Workflows/
├── Core/ (main business workflows)
│   ├── Security Digest Orchestrator
│   └── Weekly Report Generator
├── Integrations/ (external services)
│   ├── RSS - Fetch Articles
│   ├── Slack - Post Message
│   └── Email - Send via Gmail
├── Utilities/ (reusable helpers)
│   ├── JSON - Validate
│   ├── Data - Clean HTML
│   └── Text - Truncate
└── Error Handlers/
    └── Global Error Handler
```
### 🎯 Development Workflow

1. **Plan (5 min):**
   - Write down inputs and outputs
   - Identify major stages
   - Sketch on paper
2. **Prototype (15 min):**
   - Build the happy path only
   - Use hardcoded test data
   - Get something working
3. **Test (10 min):**
   - Run with real data
   - Check outputs
   - Note failures
4. **Refine (20 min):**
   - Add error handling
   - Parameterize hardcoded values
   - Improve logging
5. **Extract (15 min):**
   - Identify reusable parts
   - Create subworkflows
   - Clean up the main flow
6. **Document (5 min):**
   - Add sticky notes
   - Name nodes clearly
   - Update the workflow description
## 5. Version Control Strategies

📚 Official Docs

### Strategy 1: Manual Export (Simplest)

**Workflow:**

1. Click "⋮" (three dots) on the workflow
2. Select "Download"
3. Save as `workflow-name-v1.json`
**Folder Structure:**

```
n8n-workflows/
├── production/
│   ├── security-digest-v2.json
│   └── email-sender-v1.json
├── staging/
│   └── security-digest-v3-beta.json
└── archive/
    ├── security-digest-v1.json
    └── email-sender-v0.json
```
**Naming Convention:**

```
[workflow-name]-v[major].[minor].json

Examples:
- security-digest-v1.0.json
- security-digest-v1.1.json (minor update)
- security-digest-v2.0.json (major rewrite)
```
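A convention like this is easy to enforce or parse in tooling. A small sketch (the `parseWorkflowFilename` helper is hypothetical, for scripts of your own):

```js
// Sketch: parse the [workflow-name]-v[major].[minor].json convention
function parseWorkflowFilename(filename) {
  const match = filename.match(/^(.+)-v(\d+)\.(\d+)\.json$/);
  if (!match) return null; // does not follow the convention
  return { name: match[1], major: Number(match[2]), minor: Number(match[3]) };
}

console.log(parseWorkflowFilename("security-digest-v1.1.json"));
// → { name: "security-digest", major: 1, minor: 1 }
```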
### Strategy 2: Git-Based Version Control (Recommended)

**Setup:**

```bash
# Create repository
mkdir n8n-workflows
cd n8n-workflows
git init

# Create structure
mkdir -p workflows/{production,staging,development}
mkdir -p subworkflows
mkdir -p docs

# Add .gitignore
cat > .gitignore << 'EOF'
# Sensitive data
*-credentials.json
.env

# Temporary files
*.tmp
*.bak
EOF

# Initial commit
git add .
git commit -m "Initial n8n workflows repository"
```
**Workflow Process:**

```bash
# 1. Start a new feature
git checkout -b feature/add-slack-integration

# 2. Export the workflow from n8n
# Go to n8n → Download workflow as JSON

# 3. Save to git
cp ~/Downloads/slack-integration.json workflows/development/
git add workflows/development/slack-integration.json
git commit -m "Add: Slack integration workflow

- Sends digest to #security channel
- Includes article count and top stories
- Handles rate limiting"

# 4. Test in development

# 5. Promote to staging
git checkout staging
git merge feature/add-slack-integration
cp workflows/development/slack-integration.json workflows/staging/
git commit -m "Promote: Slack integration to staging"

# 6. Test in staging

# 7. Deploy to production
git checkout main
git merge staging
cp workflows/staging/slack-integration.json workflows/production/
git tag -a v1.2.0 -m "Release: Add Slack integration"
git commit -m "Release: v1.2.0 - Slack integration"
git push origin main --tags
```
### Strategy 3: Automated Backup Script

Create `backup-workflows.sh`:

```bash
#!/bin/bash

# Configuration
N8N_URL="http://localhost:5678"
N8N_API_KEY="your-api-key"
BACKUP_DIR="./backups/$(date +%Y-%m-%d)"

# Create backup directory
mkdir -p "$BACKUP_DIR"

# Export all workflows
echo "📦 Backing up n8n workflows..."

# Get list of workflows (requires the n8n API)
curl -X GET "$N8N_URL/api/v1/workflows" \
  -H "X-N8N-API-KEY: $N8N_API_KEY" \
  | jq -r '.data[] | .id + "," + .name' \
  | while IFS=, read -r id name; do
    # Download workflow
    echo "  └─ Exporting: $name (ID: $id)"
    curl -X GET "$N8N_URL/api/v1/workflows/$id" \
      -H "X-N8N-API-KEY: $N8N_API_KEY" \
      > "$BACKUP_DIR/${name}-${id}.json"
done

# Commit to git
cd "$BACKUP_DIR/.."
git add .
git commit -m "Auto-backup: $(date +%Y-%m-%d)"

echo "✅ Backup complete: $BACKUP_DIR"
```
**Add to crontab:**

```bash
# Daily backup at 2 AM
0 2 * * * /path/to/backup-workflows.sh
```
### Strategy 4: Changelog and Documentation

Create `CHANGELOG.md`:

```markdown
# Changelog

All notable changes to n8n workflows will be documented in this file.

## [2.0.0] - 2025-10-09

### Added
- Advanced filtering with AI-based scoring
- Support for multiple RSS categories
- Automatic retry on failed API calls

### Changed
- Improved email template with better formatting
- Reduced API costs by 40% with smarter filtering

### Fixed
- Bug where empty content caused crashes
- Category mapping not working for new feeds

### Removed
- Deprecated Fever API authentication (replaced with Basic Auth)

## [1.1.0] - 2025-10-01

### Added
- Email digest subworkflow
- Slack notification integration

### Fixed
- Timezone issues in scheduling

## [1.0.0] - 2025-09-25

### Added
- Initial release
- RSS feed fetching
- Basic email sending
```
Create a `README.md` for each workflow:

````markdown
# Security Digest Workflow

## Overview

Fetches security articles from RSS feeds, filters by relevance, and sends a weekly email digest.

## Version

v2.0.0 (2025-10-09)

## Dependencies

- Subworkflow: "Email - Send Digest v1"
- Subworkflow: "RSS - Fetch Articles v2"
- External: CommaFeed RSS reader
- External: Gmail API

## Configuration

### Environment Variables

- `COMMAFEED_URL`: https://commafeed.lab.aminrj.com
- `COMMAFEED_USER`: amine
- `COMMAFEED_PASSWORD`: ***

### Schedule

- Runs: Every Monday at 9:00 AM
- Timezone: UTC

## Data Flow

1. Fetch categories and feeds from CommaFeed
2. Fetch articles (last 3 days)
3. Filter by relevance score (>3)
4. Select the top 12 articles across categories
5. Generate the email digest
6. Send via Gmail

## Testing

1. Pin test data from the "Fetch Articles" node
2. Test "Advanced Filter" with pinned data
3. Verify the output has 10-15 articles
4. Check email formatting in preview

## Rollback

If issues occur, revert to v1.1.0:

```bash
git checkout tags/v1.1.0
# Import workflow-v1.1.0.json to n8n
```

## Maintenance

- Update feed URLs quarterly
- Review filter thresholds monthly
- Archive old versions after 6 months
````
### Strategy 5: Release Management

**Use semantic versioning:**

- **Major (1.0.0 → 2.0.0)**: Breaking changes, workflow restructure
- **Minor (1.0.0 → 1.1.0)**: New features, no breaking changes
- **Patch (1.0.0 → 1.0.1)**: Bug fixes, minor tweaks

**Release Checklist:**

```markdown
## Pre-Release Checklist

- [ ] All nodes have clear names
- [ ] Error handling on all API calls
- [ ] Sensitive data in environment variables (not hardcoded)
- [ ] Tested with production data
- [ ] Subworkflows are stable versions
- [ ] Documentation updated
- [ ] Changelog entry added
- [ ] Tagged in git
- [ ] Backup of previous version saved

## Post-Release Checklist

- [ ] Monitor the first execution
- [ ] Check logs for errors
- [ ] Verify outputs match expectations
- [ ] Update team documentation
- [ ] Schedule a review in 1 week
```
### 🎯 Complete Git Workflow Example

```bash
# Initial setup
git init n8n-workflows
cd n8n-workflows

# Create structure
mkdir -p {workflows,subworkflows,docs,scripts}

# Track changes
git add .
git commit -m "Initial structure"

# Feature development
git checkout -b feature/improved-filtering
# ... make changes in n8n ...

# Export workflow
git add workflows/security-digest-v2.json
git commit -m "Improve: Article filtering algorithm

- Add AI-based relevance scoring
- Filter out duplicate articles
- Prioritize recent content"

# Tag release
git tag -a v2.0.0 -m "Release: Improved filtering"
git push origin v2.0.0

# Create changelog
echo "## [2.0.0] - $(date +%Y-%m-%d)" >> CHANGELOG.md
echo "### Improved" >> CHANGELOG.md
echo "- Article filtering with AI scoring" >> CHANGELOG.md
git add CHANGELOG.md
git commit -m "Update changelog for v2.0.0"
```
## 📅 Learning Path: 30-Day Plan

**Week 1: Foundations**

- Day 1-2: Data flow and expressions
- Day 3-4: Build 3 simple workflows
- Day 5-7: Error handling practice

**Week 2: Intermediate**

- Day 8-10: Create your first subworkflow
- Day 11-12: Refactor an existing workflow
- Day 13-14: Set up a git repository

**Week 3: Advanced**

- Day 15-17: Build a complex multi-subworkflow system
- Day 18-19: Implement comprehensive error handling
- Day 20-21: Performance optimization

**Week 4: Best Practices**

- Day 22-24: Documentation and versioning
- Day 25-26: Monitoring and alerting
- Day 27-28: Code review with another workflow
- Day 29-30: Build something from scratch applying all concepts
## 📚 Additional Resources

**Official Documentation**

**Video Tutorials**

- n8n YouTube Channel
- Search "n8n tutorials" for community content

**Templates**

- n8n Workflow Templates
- Study popular workflows for patterns

**Community**
## 🚀 Quick Start Checklist

**Today, right now:**

- Export your current workflow as JSON
- Create a git repository
- Commit your workflow with a meaningful message
- Add a README.md describing the workflow
- Identify one subworkflow you can extract
- Add error handling to one critical node
- Test a failure scenario

**This week:**

- Create your first subworkflow
- Set up automated backups
- Write documentation for main workflows
- Implement retry logic on API calls

**This month:**

- Refactor one complex workflow into modular parts
- Build a global error handler
- Create a workflow style guide for your team
- Implement monitoring and alerting

You're now equipped to build professional, maintainable n8n workflows! Start small, iterate often, and always version your work. 🚀

Created: 2025-10-09 21:24 [[today-2025-10-09]]