Automatically detect Google Search Console traffic drops, analyze affected pages with AI-powered SEO insights, and receive detailed Slack reports with keyword recommendations weekly.
This agent works seamlessly with these platforms to deliver powerful automation.
Automate web browsers with AI-powered interactions. Navigate pages, fill forms,...
Monitor and optimize your website's search performance with Google Search Conso...
Leverage OpenAI's powerful language models to generate text, answer questions, a...
Send messages, manage channels, and automate workflows in Slack workspaces
This powerful SEO monitoring workflow automatically detects traffic drops on your website and provides AI-powered analysis to help you understand and fix the issues. Every week, it compares your Google Search Console data across two time periods, identifies pages that have lost traffic, analyzes the keywords and content for those pages, and delivers actionable insights directly to your Slack channel.
Key benefits: Save 5-10 hours per week on manual SEO monitoring, catch traffic drops before they become major problems, and receive expert-level analysis without hiring an SEO consultant. The workflow processes multiple pages automatically, examining keyword performance, content structure, and providing specific recommendations for improvement.
Perfect for: Content marketers tracking organic performance, SEO specialists managing multiple sites, digital agencies monitoring client websites, and business owners who want to stay on top of their search visibility without constant manual checking.
This workflow is ideal for:
You don't need to be an SEO expert to use this workflow—the AI analyst provides clear, actionable recommendations that anyone can understand and implement.
Why it's needed: This integration provides the core data about your website's search performance, including impressions, clicks, and keyword rankings. The workflow uses it to compare time periods and identify traffic drops.
Setup steps:
Important: Ensure your Google account has at least "Full" permission level in Google Search Console for the property you want to monitor. Owner or Full User roles work perfectly.
Why it's needed: This integration scrapes the actual content from your web pages, allowing the AI to analyze the title tags, headings, and overall content structure to provide specific recommendations.
Setup steps:
Note: The browser automation service can access publicly available pages. If you need to scrape pages behind authentication, you'll need to configure additional headers in the node settings.
Why it's needed: Powers the AI SEO analyst that reviews your page data and provides intelligent, context-aware recommendations for improving rankings and recovering lost traffic.
Setup steps:
Cost consideration: This workflow uses the gpt-4o-mini model, which is highly cost-effective. Expect approximately $0.01-0.05 per page analyzed, depending on content length.
Why it's needed: Delivers your SEO analysis reports directly to your team's Slack workspace, ensuring immediate visibility and enabling quick action on traffic drops.
Setup steps:
Pro tip: Create a dedicated channel like #seo-alerts or #traffic-monitoring to keep these reports organized and easily searchable.
Configure when the workflow runs:
- Frequency: weekly
- Time: 09:00 to receive reports at the start of your workday

Enter your website details:
- siteUrl: your full site URL (for example, https://www.yourwebsite.com). Include https:// and use the exact URL format registered in Google Search Console.

Configure the comparison timeframes:
- Site URL: {{4158.siteUrl}}
- Current period start: -14 days (last two weeks)
- Current period end: -7 days (through last week)
- Previous period start: -28 days (four weeks ago)
- Previous period end: -21 days (three weeks ago)
- Dimension: page (to analyze individual URLs)

This configuration compares last week's performance to the week before, helping you catch recent drops.
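Downstream steps reference fields like clicks_change and impressions_current, so the comparison node presumably merges the two periods into per-page deltas. A minimal sketch of that logic, assuming rows shaped like { page, clicks, impressions } (your GSC node's actual output schema may differ):

```javascript
// Hypothetical sketch: merge GSC rows from the current and previous periods
// into per-page deltas matching the field names used by the filter step.
function comparePeriods(currentRows, previousRows) {
  // Index the previous period by page for O(1) lookup.
  const prevByPage = new Map(previousRows.map(r => [r.page, r]));
  return currentRows.map(r => {
    const prev = prevByPage.get(r.page) || { clicks: 0, impressions: 0 };
    return {
      page: r.page,
      clicks_current: r.clicks,
      impressions_current: r.impressions,
      clicks_change: r.clicks - prev.clicks,
      impressions_change: r.impressions - prev.impressions,
    };
  });
}

// Example: a page that fell from 40 to 25 clicks week over week.
const drops = comparePeriods(
  [{ page: '/pricing', clicks: 25, impressions: 900 }],
  [{ page: '/pricing', clicks: 40, impressions: 1100 }]
);
console.log(drops[0].clicks_change); // -15
```

A negative clicks_change here is exactly what the filter step's clicks_change < -10 condition catches.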
Set criteria for what constitutes a significant drop:
- Condition: clicks_change < -10
- Optional: add impressions_current > 50 to focus on pages with meaningful traffic

Configure the loop to process each affected page:
- Items: {{4160.filtered_pages}}
- Maximum iterations: 10 (prevents excessive API usage)

Add a delay between iterations to prevent rate limiting:
- Delay: 1000 milliseconds

This function node extracts the current page URL:
return {
  currentPage: input.item.page,
  clicksChange: input.item.clicks_change,
  impressionsChange: input.item.impression_change === undefined ? input.item.impressions_change : input.item.impressions_change
};
Retrieve keyword data for the affected page:
- Site URL: {{4158.siteUrl}}
- Page filter: {{4163.currentPage}}
- Start date: -14 days
- End date: -7 days
- Dimension: query
- Row limit: 20 (top 20 keywords)

Transform keyword data into readable format:
const keywords = input.keywords.map(k =>
  `${k.query} (${k.clicks} clicks, pos: ${k.position.toFixed(1)})`
).join('\n');
return { formattedKeywords: keywords };
Extract the actual page content:
- URL: {{4163.currentPage}}
- Wait for selector: body (ensures page loads)
- Timeout: 10000 milliseconds

Parse the HTML to extract key elements:
// Grab the <title> text; [^>]* tolerates attributes on the tag and
// [\s\S]*? handles titles that span multiple lines.
const titleMatch = input.html.match(/<title[^>]*>([\s\S]*?)<\/title>/i);
// Collect every <h2> block, then strip any nested markup from each heading.
const h2Matches = input.html.match(/<h2[^>]*>([\s\S]*?)<\/h2>/gi);
return {
  title: titleMatch ? titleMatch[1].trim() : 'No title found',
  h2Tags: h2Matches
    ? h2Matches.map(h => h.replace(/<[^>]*>/g, '').trim()).join(', ')
    : 'No H2 tags found'
};
Configure the AI analysis:
- Model: gpt-4o-mini
- Temperature: 0.7 (balanced creativity and consistency)
- Max tokens: 1000

Prompt:
Page: {{4163.currentPage}}
Traffic Change: {{4163.clicksChange}} clicks
Top Keywords: {{4165.formattedKeywords}}
Title: {{4167.title}}
H2 Headings: {{4167.h2Tags}}
Provide 3-5 specific recommendations to improve this page's rankings.
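The prompt above is assembled from the outputs of earlier nodes. Expressed as a plain function (a sketch; the field names mirror the earlier function nodes, not a required API), it might look like:

```javascript
// Hypothetical helper: build the analysis prompt from earlier node outputs.
function buildPrompt(d) {
  return [
    `Page: ${d.currentPage}`,
    `Traffic Change: ${d.clicksChange} clicks`,
    `Top Keywords: ${d.formattedKeywords}`,
    `Title: ${d.title}`,
    `H2 Headings: ${d.h2Tags}`,
    "Provide 3-5 specific recommendations to improve this page's rankings.",
  ].join('\n');
}

const prompt = buildPrompt({
  currentPage: '/pricing',
  clicksChange: -15,
  formattedKeywords: 'pricing plans (12 clicks, pos: 8.3)',
  title: 'Pricing | Example',
  h2Tags: 'Plans, FAQ',
});
console.log(prompt.split('\n').length); // 6
```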
Structure the report for Slack:
return {
message: `🚨 *SEO Traffic Drop Alert*\n\n*Page:* ${input.currentPage}\n*Traffic Change:* ${input.clicksChange} clicks (${input.impressionsChange} impressions)\n\n*AI Analysis:*\n${input.aiRecommendations}\n\n*Top Keywords:*\n${input.formattedKeywords}`
};
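If your Slack node accepts Block Kit payloads rather than a single mrkdwn string, the same report can be expressed as blocks, which render more predictably for long reports. A sketch (the blocks structure follows Slack's standard Block Kit format; check your node's input schema before relying on it):

```javascript
// Hypothetical alternative: a Slack Block Kit payload for the same report.
function buildBlocks(input) {
  return {
    blocks: [
      { type: 'header', text: { type: 'plain_text', text: 'SEO Traffic Drop Alert' } },
      {
        type: 'section',
        text: {
          type: 'mrkdwn',
          text: `*Page:* ${input.currentPage}\n*Traffic Change:* ${input.clicksChange} clicks`,
        },
      },
      {
        type: 'section',
        text: { type: 'mrkdwn', text: `*AI Analysis:*\n${input.aiRecommendations}` },
      },
    ],
  };
}

const payload = buildBlocks({
  currentPage: '/pricing',
  clicksChange: -15,
  aiRecommendations: 'Refresh the title tag to match top queries.',
});
console.log(payload.blocks.length); // 3
```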
Send the formatted report:
- Channel: #seo-alerts (or your preferred channel)
- Message: {{4169.message}}
- Bot name: SEO Monitor (optional)
- Icon: :chart_with_downwards_trend: (optional)

Test the workflow by checking the output after each stage:
- After GSC: Compare Periods
- After Filter: Traffic Drops (temporarily lower the threshold, for example -5 clicks, for testing)
- After Loop Execution
- After AI Analysis
- Final Slack Message
A successful test should produce:
Cause: Either no pages lost traffic, or your filter threshold is too strict.
Solution: Temporarily loosen the filter to clicks_change < 0 to see all declining pages.

Cause: The page may require authentication, have blocking measures, or load slowly.
Solution: Increase the scraping timeout to 15000 milliseconds.

Cause: Too many requests in a short time period.
Solution: Increase the delay between iterations to 2000 milliseconds and lower the loop limit to 5 pages.

Cause: Channel permissions or integration issues.
Solution: Confirm the integration has access to the channel, or reference it by channel ID (for example, C1234567890).

Cause: Google Search Console has a 3-day data delay.
Solution: Shift your date ranges back, for example -10 days instead of -7 days.
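Because of that delay, it can help to compute the reporting window programmatically rather than hard-coding offsets. A small sketch, assuming a 3-day buffer and a 7-day span (both parameters are illustrative):

```javascript
// Compute a GSC-safe date window: end the range a few days before today
// to account for the data delay, then span back from there.
function gscWindow(today = new Date(), delayDays = 3, spanDays = 7) {
  const end = new Date(today);
  end.setDate(end.getDate() - delayDays);
  const start = new Date(end);
  start.setDate(start.getDate() - spanDays);
  const fmt = d => d.toISOString().slice(0, 10); // YYYY-MM-DD
  return { startDate: fmt(start), endDate: fmt(end) };
}

const w = gscWindow(new Date('2024-06-15T12:00:00Z'));
console.log(w); // → startDate '2024-06-05', endDate '2024-06-12'
```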
Refine Your Filters: Combine conditions to target meaningful drops, for example clicks_change < -10 AND impressions_current > 100.
Enhance AI Analysis:
Expand Reporting:
Scale Your Monitoring:
Prioritize High-Impact Pages: Add a scoring system in the filter to focus on pages with the highest potential impact:
const impactScore = Math.abs(input.clicks_change) * input.impressions_current;
return impactScore > 1000;
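Applied across all filtered pages, the same scoring idea can also rank the drops so the loop tackles the most damaging pages first. A sketch, reusing the field names from the filter step:

```javascript
// Rank pages by impact: large click losses on high-impression pages first.
function rankByImpact(pages) {
  return [...pages]
    .map(p => ({ ...p, impactScore: Math.abs(p.clicks_change) * p.impressions_current }))
    .filter(p => p.impactScore > 1000) // same threshold as the filter above
    .sort((a, b) => b.impactScore - a.impactScore);
}

const ranked = rankByImpact([
  { page: '/blog/a', clicks_change: -2, impressions_current: 100 },   // score 200, dropped
  { page: '/pricing', clicks_change: -15, impressions_current: 900 }, // score 13500
  { page: '/docs', clicks_change: -5, impressions_current: 400 },     // score 2000
]);
console.log(ranked.map(p => p.page)); // → ['/pricing', '/docs']
```

Feeding the loop this sorted list means that even with a low iteration cap, the highest-impact pages are always analyzed.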
Integrate with Your CMS: Connect the workflow to your content management system to automatically create tasks or tickets for content updates based on AI recommendations.
A/B Test Recommendations: Track which AI recommendations you implement and measure their impact in subsequent reports to continuously improve the analysis quality.
Create a Dashboard: Export the data to a visualization tool like Google Data Studio or Tableau to see traffic drop trends over time and identify patterns.
This workflow transforms SEO monitoring from a time-consuming manual task into an automated, intelligent system that helps you maintain and improve your search visibility. Start with the basic configuration, then gradually add customizations as you become comfortable with the workflow!