Global AI Network

SEO Traffic Drop Auditor AI Agent with GSC

Automatically detect Google Search Console traffic drops, analyze affected pages with AI-powered SEO insights, and receive detailed Slack reports with keyword recommendations weekly.

183+
Total Deployments
20 min
Setup Time
v1.0
Version


Required Integrations

This agent works seamlessly with these platforms to deliver powerful automation.

Browser Automation

Automate web browsers with AI-powered interactions. Navigate pages, fill forms,...

Google Search Console

Monitor and optimize your website's search performance with Google Search Conso...

OpenAI

Leverage OpenAI's powerful language models to generate text, answer questions, a...

Slack

Send messages, manage channels, and automate workflows in Slack workspaces

Step-by-Step Setup Tutorial


What This Agent Does

This powerful SEO monitoring workflow automatically detects traffic drops on your website and provides AI-powered analysis to help you understand and fix the issues. Every week, it compares your Google Search Console data across two time periods, identifies pages that have lost traffic, analyzes the keywords and content for those pages, and delivers actionable insights directly to your Slack channel.

Key benefits: Save 5-10 hours per week on manual SEO monitoring, catch traffic drops before they become major problems, and receive expert-level analysis without hiring an SEO consultant. The workflow processes multiple pages automatically, examines keyword performance and content structure, and provides specific recommendations for improvement.

Perfect for: Content marketers tracking organic performance, SEO specialists managing multiple sites, digital agencies monitoring client websites, and business owners who want to stay on top of their search visibility without constant manual checking.

Who Is It For

This workflow is ideal for:

  • SEO professionals who need to monitor multiple pages and want automated alerts when traffic drops occur
  • Content managers responsible for maintaining organic search performance across large websites
  • Digital marketing agencies managing SEO for multiple clients and needing scalable monitoring solutions
  • Website owners and entrepreneurs who want professional-grade SEO monitoring without the complexity
  • Marketing teams that need to quickly identify and respond to search ranking changes

You don't need to be an SEO expert to use this workflow—the AI analyst provides clear, actionable recommendations that anyone can understand and implement.

Required Integrations

Google Search Console

Why it's needed: This integration provides the core data about your website's search performance, including impressions, clicks, and keyword rankings. The workflow uses it to compare time periods and identify traffic drops.

Setup steps:

  1. Navigate to the Integrations section in TaskAGI
  2. Click Add Integration and select Google Search Console
  3. Click Connect to Google and sign in with the Google account that has access to your Search Console property
  4. Grant TaskAGI permission to read your Search Console data
  5. Select the website property you want to monitor from the dropdown list
  6. Click Save Integration

Important: Ensure your Google account has at least "Full" permission level in Google Search Console for the property you want to monitor. Owner or Full User roles work perfectly.

Browser Automation

Why it's needed: This integration scrapes the actual content from your web pages, allowing the AI to analyze the title tags, headings, and overall content structure to provide specific recommendations.

Setup steps:

  1. Go to Integrations in TaskAGI
  2. Select Browser Automation
  3. Click Enable Integration
  4. No additional credentials are required—this integration uses TaskAGI's built-in browser automation service
  5. Verify the integration shows as Active

Note: The browser automation service can access publicly available pages. If you need to scrape pages behind authentication, you'll need to configure additional headers in the node settings.

OpenAI

Why it's needed: Powers the AI SEO analyst that reviews your page data and provides intelligent, context-aware recommendations for improving rankings and recovering lost traffic.

Setup steps:

  1. Visit platform.openai.com and create an account or sign in
  2. Navigate to API Keys in your OpenAI dashboard
  3. Click Create new secret key and give it a descriptive name like "TaskAGI SEO Monitor"
  4. Copy the API key immediately (you won't be able to see it again)
  5. In TaskAGI, go to Integrations and select OpenAI
  6. Paste your API key into the API Key field
  7. Click Test Connection to verify it works
  8. Click Save Integration

Cost consideration: This workflow uses the gpt-4o-mini model, which is highly cost-effective. Expect approximately $0.01-0.05 per page analyzed, depending on content length.

Slack

Why it's needed: Delivers your SEO analysis reports directly to your team's Slack workspace, ensuring immediate visibility and enabling quick action on traffic drops.

Setup steps:

  1. In TaskAGI, navigate to Integrations and select Slack
  2. Click Connect to Slack
  3. Select the Slack workspace where you want to receive notifications
  4. Choose which channels TaskAGI can access (select at least one channel for SEO reports)
  5. Click Allow to grant permissions
  6. Back in TaskAGI, verify the integration shows as Connected
  7. Note the channel ID or name where you want reports sent

Pro tip: Create a dedicated channel like #seo-alerts or #traffic-monitoring to keep these reports organized and easily searchable.

Configuration Steps

1. Weekly Schedule Trigger

Configure when the workflow runs:

  • Set Interval to weekly
  • Choose Day of Week: Monday works well for reviewing weekend data
  • Set Time: 09:00 to receive reports at the start of your workday
  • Timezone: Select your local timezone

2. Config: Site URL

Enter your website details:

  • Variable Name: siteUrl
  • Value: Your full domain (e.g., https://www.yourwebsite.com)
  • Ensure you include https:// and use the exact URL format registered in Google Search Console

3. GSC: Compare Periods

Configure the comparison timeframes:

  • Site URL: Reference the config node: {{4158.siteUrl}}
  • Current Period Start: -14 days (last two weeks)
  • Current Period End: -7 days (through last week)
  • Previous Period Start: -28 days (four weeks ago)
  • Previous Period End: -21 days (three weeks ago)
  • Dimension: page (to analyze individual URLs)

This configuration compares last week's performance to the week before, helping you catch recent drops.
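At run time, each relative offset resolves to a concrete date. A minimal sketch of that resolution in JavaScript (the `offsetToDate` helper name is hypothetical, not part of TaskAGI):

```javascript
// Resolve a relative day offset (e.g. -14) to a YYYY-MM-DD string,
// as a GSC date-range field might be computed at run time.
function offsetToDate(daysAgo, from = new Date()) {
  const d = new Date(from);
  d.setUTCDate(d.getUTCDate() + daysAgo); // daysAgo is negative for the past
  return d.toISOString().slice(0, 10);
}

// Current period: last full week; previous period: the week before that.
const ranges = {
  currentStart: offsetToDate(-14),
  currentEnd: offsetToDate(-7),
  previousStart: offsetToDate(-28),
  previousEnd: offsetToDate(-21),
};
```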

4. Filter: Traffic Drops

Set criteria for what constitutes a significant drop:

  • Filter Condition: clicks_change < -10
  • This identifies pages that lost more than 10 clicks between periods
  • Optional: Add additional filter impressions_current > 50 to focus on pages with meaningful traffic
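The two conditions above combine into a simple predicate. A sketch with sample data (field names follow the filter condition; the exact input shape is an assumption):

```javascript
// Keep only pages that lost more than 10 clicks and still receive
// meaningful impressions in the current period.
const pages = [
  { page: '/pricing', clicks_change: -25, impressions_current: 300 },
  { page: '/blog/old-post', clicks_change: -12, impressions_current: 20 },
  { page: '/home', clicks_change: 4, impressions_current: 900 },
];

const dropped = pages.filter(
  (p) => p.clicks_change < -10 && p.impressions_current > 50
);
// dropped contains only "/pricing"
```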

5. Loop: Dropped Pages

Configure the loop to process each affected page:

  • Input Array: {{4160.filtered_pages}}
  • Max Iterations: 10 (prevents excessive API usage)
  • Continue on Error: Enable this to ensure one failed page doesn't stop the entire workflow

6. Wait 1 Second

Prevents rate limiting:

  • Duration: 1000 milliseconds
  • This ensures respectful API usage across all services
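In a custom function node, the same pacing could be sketched as follows (the built-in wait node already handles this; helper names are illustrative):

```javascript
// Minimal sleep helper: resolve after the given number of milliseconds.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Process items sequentially with a pause between iterations
// to stay within API rate limits.
async function processWithDelay(items, handler, delayMs = 1000) {
  for (const item of items) {
    await handler(item);
    await sleep(delayMs);
  }
}
```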

7. Prepare Loop Data

This function node extracts the current page URL:

// Pull the current loop item's page URL and traffic deltas into
// named fields for the downstream nodes.
return {
  currentPage: input.item.page,
  clicksChange: input.item.clicks_change,
  impressionsChange: input.item.impressions_change
};

8. GSC: Get Keywords

Retrieve keyword data for the affected page:

  • Site URL: {{4158.siteUrl}}
  • Page URL: {{4163.currentPage}}
  • Start Date: -14 days
  • End Date: -7 days
  • Dimension: query
  • Row Limit: 20 (top 20 keywords)

9. Format Keywords

Transform keyword data into readable format:

// Render each keyword as "query (clicks, avg position)" on its own line.
const keywords = input.keywords.map(k => 
  `${k.query} (${k.clicks} clicks, pos: ${k.position.toFixed(1)})`
).join('\n');

return { formattedKeywords: keywords };

10. Scrape Page Content

Extract the actual page content:

  • URL: {{4163.currentPage}}
  • Wait for Selector: body (ensures page loads)
  • Timeout: 10000 milliseconds

11. Extract Title & H2

Parse the HTML to extract key elements:

// Best-effort regex extraction: capture the <title> text and every <h2>
// heading, then strip any inner tags from the headings.
const titleMatch = input.html.match(/<title>(.*?)<\/title>/i);
const h2Matches = input.html.match(/<h2[^>]*>(.*?)<\/h2>/gi);

return {
  title: titleMatch ? titleMatch[1] : 'No title found',
  h2Tags: h2Matches ? h2Matches.map(h => h.replace(/<[^>]*>/g, '')).join(', ') : 'No H2 tags found'
};
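The extraction above can be exercised against sample HTML. A standalone sketch mirroring the node's regex logic:

```javascript
// Standalone version of the node's extraction, for quick testing.
function extractTitleAndH2(html) {
  const titleMatch = html.match(/<title>(.*?)<\/title>/i);
  const h2Matches = html.match(/<h2[^>]*>(.*?)<\/h2>/gi);
  return {
    title: titleMatch ? titleMatch[1] : 'No title found',
    h2Tags: h2Matches
      ? h2Matches.map((h) => h.replace(/<[^>]*>/g, '')).join(', ')
      : 'No H2 tags found',
  };
}

const sample = `<html><head><title>SEO Guide</title></head>
<body><h2>Keyword Research</h2><h2 class="sub">On-Page Basics</h2></body></html>`;

const parsed = extractTitleAndH2(sample);
// parsed.title === 'SEO Guide'
// parsed.h2Tags === 'Keyword Research, On-Page Basics'
```

Note that regex parsing is best-effort; heavily scripted or malformed pages may need a proper HTML parser.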

12. AI SEO Analyst

Configure the AI analysis:

  • Model: gpt-4o-mini
  • Temperature: 0.7 (balanced creativity and consistency)
  • Max Tokens: 1000
  • System Prompt: "You are an expert Web Editor skilled in SEO analysis. Analyze the following page data and provide specific, actionable recommendations to recover lost traffic."
  • User Prompt:
Page: {{4163.currentPage}}
Traffic Change: {{4163.clicksChange}} clicks
Top Keywords: {{4165.formattedKeywords}}
Title: {{4167.title}}
H2 Headings: {{4167.h2Tags}}

Provide 3-5 specific recommendations to improve this page's rankings.
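Outside TaskAGI, this step maps onto an OpenAI Chat Completions request. A minimal sketch of the payload, with the template values hard-coded as examples:

```javascript
// Chat Completions payload corresponding to the AI SEO Analyst node.
// In the workflow, the user-message values come from earlier nodes.
const payload = {
  model: 'gpt-4o-mini',
  temperature: 0.7,
  max_tokens: 1000,
  messages: [
    {
      role: 'system',
      content:
        'You are an expert Web Editor skilled in SEO analysis. Analyze the following page data and provide specific, actionable recommendations to recover lost traffic.',
    },
    {
      role: 'user',
      content: [
        'Page: https://www.yourwebsite.com/pricing', // example values
        'Traffic Change: -25 clicks',
        'Top Keywords: pricing plans (12 clicks, pos: 8.4)',
        'Title: Pricing | Example',
        'H2 Headings: Plans, FAQ',
        '',
        "Provide 3-5 specific recommendations to improve this page's rankings.",
      ].join('\n'),
    },
  ],
};
// POST this to https://api.openai.com/v1/chat/completions
// with an Authorization: Bearer <API key> header.
```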

13. Format Slack Message

Structure the report for Slack:

// Assemble the Slack-formatted (mrkdwn) alert from the loop data,
// AI recommendations, and keyword summary.
return {
  message: `🚨 *SEO Traffic Drop Alert*\n\n*Page:* ${input.currentPage}\n*Traffic Change:* ${input.clicksChange} clicks (${input.impressionsChange} impressions)\n\n*AI Analysis:*\n${input.aiRecommendations}\n\n*Top Keywords:*\n${input.formattedKeywords}`
};

14. Notify Slack

Send the formatted report:

  • Channel: #seo-alerts (or your preferred channel)
  • Message: {{4169.message}}
  • Bot Name: SEO Monitor (optional)
  • Icon: :chart_with_downwards_trend: (optional)
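For reference, these settings correspond to a Slack `chat.postMessage` payload along these lines (a sketch; the placeholder text stands in for the formatted report from step 13):

```javascript
// chat.postMessage payload matching the node configuration.
const slackPayload = {
  channel: '#seo-alerts',          // or a channel ID like C1234567890
  text: '(formatted report from step 13)',
  username: 'SEO Monitor',         // optional bot display name
  icon_emoji: ':chart_with_downwards_trend:', // optional icon
};
// Send with an Authorization: Bearer <bot token> header to
// https://slack.com/api/chat.postMessage
```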

Testing Your Agent

Initial Test Run

  1. Manual Trigger: Click the Run Test button in the workflow editor
  2. Monitor Progress: Watch each node execute in real-time
  3. Check Data Flow: Verify that data passes correctly between nodes

Verification Checklist

After GSC: Compare Periods:

  • Confirm you see page data with clicks and impressions
  • Verify the date ranges are correct
  • Check that you have at least a few pages with data

After Filter: Traffic Drops:

  • Ensure at least one page meets your filter criteria
  • If no pages appear, temporarily lower the threshold (e.g., -5 clicks) for testing

After Loop Execution:

  • Verify each page is processed individually
  • Check that keyword data is retrieved successfully
  • Confirm page content is scraped properly

After AI Analysis:

  • Review the AI recommendations for relevance and specificity
  • Ensure the analysis references the actual page data
  • Verify recommendations are actionable

Final Slack Message:

  • Check that the message appears in your designated channel
  • Verify formatting is clean and readable
  • Confirm all data points are present and accurate

Expected Results

A successful test should produce:

  • A Slack message for each page that experienced a traffic drop
  • Specific keyword data showing which terms lost rankings
  • AI-generated recommendations tailored to each page's content
  • Complete execution in 2-5 minutes (depending on number of pages)

Troubleshooting

"No pages found" in Filter Node

Cause: Either no pages lost traffic, or your filter threshold is too strict.

Solution:

  • Temporarily change filter to clicks_change < 0 to see all declining pages
  • Verify your GSC comparison periods have sufficient data
  • Check that your site URL exactly matches your Search Console property

"Failed to scrape page content"

Cause: The page may require authentication, have blocking measures, or load slowly.

Solution:

  • Verify the URL is publicly accessible in an incognito browser
  • Increase the timeout to 15000 milliseconds
  • Check if the site has bot protection (Cloudflare, etc.)
  • For authenticated pages, add custom headers in the Browser Automation node

"OpenAI API Error: Rate Limit"

Cause: Too many requests in a short time period.

Solution:

  • Increase the wait time between loop iterations to 2000 milliseconds
  • Reduce the max iterations in the loop to 5 pages
  • Check your OpenAI account usage limits and upgrade if needed

"Slack message not received"

Cause: Channel permissions or integration issues.

Solution:

  • Verify the Slack integration is still connected (check Integration status)
  • Ensure the bot has permission to post in your selected channel
  • Try using a channel ID instead of channel name (format: C1234567890)
  • Re-authorize the Slack integration if needed

"Invalid date range in GSC node"

Cause: Google Search Console has a 3-day data delay.

Solution:

  • Adjust your date ranges to end at least 3 days ago
  • Change current period end to -10 days instead of -7 days
  • Ensure start dates are always before end dates
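The bullets above amount to a quick sanity check on the offsets, accounting for GSC's roughly 3-day data delay (the helper name is hypothetical):

```javascript
// Validate a pair of relative day offsets for a GSC query: the range
// must end at least 3 days in the past and start before it ends.
function isValidGscRange(startOffset, endOffset) {
  return endOffset <= -3 && startOffset < endOffset;
}

// -14 to -7 is safe; -7 to -1 ends inside the data-delay window.
```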

Next Steps

After Successful Setup

  1. Monitor the first few reports to ensure the AI recommendations are relevant and actionable
  2. Adjust the filter threshold based on what constitutes a meaningful drop for your site
  3. Create a response process for your team to act on the recommendations
  4. Document improvements to track which recommendations actually recover traffic

Optimization Suggestions

Refine Your Filters:

  • Add multiple filter conditions: clicks_change < -10 AND impressions_current > 100
  • Focus on high-value pages by filtering for specific URL patterns
  • Create separate workflows for different site sections
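Combining conditions with a URL pattern can be sketched as a single predicate (the `/blog/` pattern and thresholds are examples):

```javascript
// Keep only significant drops within one site section.
const isHighValueDrop = (p) =>
  p.clicks_change < -10 &&
  p.impressions_current > 100 &&
  /\/blog\//.test(p.page);

const watched = [
  { page: 'https://example.com/blog/guide', clicks_change: -15, impressions_current: 400 },
  { page: 'https://example.com/about', clicks_change: -30, impressions_current: 500 },
].filter(isHighValueDrop);
// watched contains only the /blog/ page
```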

Enhance AI Analysis:

  • Customize the AI prompt to focus on your specific industry or content type
  • Add competitor analysis by including competitor URLs in the prompt
  • Request specific recommendation formats (e.g., "Provide recommendations as a checklist")

Expand Reporting:

  • Add email notifications for critical drops using an email integration
  • Create a Google Sheet log of all traffic drops for trend analysis
  • Include screenshots of the pages in your Slack messages

Scale Your Monitoring:

  • Duplicate the workflow for multiple websites
  • Create daily and monthly versions for different monitoring frequencies
  • Add alerts for traffic increases to identify winning content patterns

Advanced Usage Tips

Prioritize High-Impact Pages: Add a scoring system in the filter to focus on pages with the highest potential impact:

// Weight the size of the drop by current impressions so large,
// still-visible pages surface first; only pages above the cutoff pass.
const impactScore = Math.abs(input.clicks_change) * input.impressions_current;
return impactScore > 1000;

Integrate with Your CMS: Connect the workflow to your content management system to automatically create tasks or tickets for content updates based on AI recommendations.

A/B Test Recommendations: Track which AI recommendations you implement and measure their impact in subsequent reports to continuously improve the analysis quality.

Create a Dashboard: Export the data to a visualization tool like Google Data Studio or Tableau to see traffic drop trends over time and identify patterns.

This workflow transforms SEO monitoring from a time-consuming manual task into an automated, intelligent system that helps you maintain and improve your search visibility. Start with the basic configuration, then gradually add customizations as you become comfortable with the workflow!