Global AI Network
Agent Template v1.0

Reddit Post Summarizer with Supabase

Automatically curate and summarize Reddit posts daily, filter by relevance with AI, and save insights to Supabase for seamless content discovery and analysis.

Deployments: 42+
Setup Time: 15 minutes
Pricing: Free

Need custom configuration?

Our solution engineers can help you adapt this agent to your specific infrastructure and requirements.

Enterprise Grade · Best Practices · Production Optimized

INTEGRATED_MODULES

Google Gemini
Reddit
Supabase
Step-by-Step Setup Tutorial

mission-briefing.md

What This Agent Does

This intelligent automation workflow transforms your Reddit saved posts into a curated database of valuable insights. The agent automatically runs on a daily schedule, retrieves your saved Reddit posts, intelligently filters them for relevance, summarizes the most valuable content, and stores everything in Supabase for easy reference and analysis.

Key benefits include:

  • Automated curation: Save hours manually organizing Reddit content by letting AI handle the heavy lifting
  • Smart filtering: Only relevant, high-quality posts make it to your database, eliminating noise
  • Instant summaries: Get concise, actionable summaries of lengthy discussions without reading every comment
  • Persistent storage: Build a searchable knowledge base of valuable Reddit discussions organized by topic
  • Hands-free operation: Set it and forget it—the workflow runs automatically every day

This workflow is perfect for researchers, content creators, product managers, and knowledge workers who want to systematically capture and organize valuable discussions from Reddit without manual effort.


Who Is It For

This agent is ideal for professionals and enthusiasts who:

  • Regularly save Reddit posts but struggle to organize them
  • Want to build a personal knowledge base from community discussions
  • Need to monitor specific topics or communities for insights
  • Prefer automated solutions over manual bookmarking systems
  • Use Supabase as their data storage solution
  • Want AI-powered content summarization and filtering

Required Integrations

Supabase

Why it's needed: Supabase serves as your persistent data storage layer, allowing you to save summarized posts and create a searchable database of curated Reddit content. It's a PostgreSQL-based backend that makes organizing and retrieving your data simple and scalable.

Setup steps:

  1. Create a Supabase account at supabase.com if you don't already have one
  2. Create a new project in your Supabase dashboard
  3. Create a table named reddit_posts with the following columns:
    • id (UUID, primary key, auto-generated)
    • post_id (text, unique)
    • title (text)
    • url (text)
    • summary (text)
    • relevance_score (text)
    • created_at (timestamp, auto-set to now)
  4. Obtain your credentials:
    • Navigate to Settings → API in your Supabase dashboard
    • Copy your Project URL (looks like https://xxxxx.supabase.co)
    • Copy your anon public key (starts with eyJhbGc...)
  5. Configure in TaskAGI:
    • In the workflow, locate the Supabase nodes
    • Enter your Project URL in the connection settings
    • Enter your anon public key for authentication
    • Verify the table name matches reddit_posts
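If you prefer to create the table in one step, the Supabase SQL editor accepts plain PostgreSQL DDL. The following is a sketch of the equivalent statement, with column types assumed from the list above (verify `gen_random_uuid()` is available in your project; it is enabled by default on recent Supabase instances):

```sql
-- Matches the reddit_posts schema described in step 3
create table reddit_posts (
  id uuid primary key default gen_random_uuid(),
  post_id text unique,
  title text,
  url text,
  summary text,
  relevance_score text,
  created_at timestamptz default now()
);
```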

Reddit

Why it's needed: Reddit is your data source. This integration retrieves your saved posts and their associated comments, providing the raw content that the workflow will analyze, filter, and summarize.

Setup steps:

  1. Create a Reddit application:
    • Go to reddit.com/prefs/apps
    • Click "Create an app" or "Create another app"
    • Fill in the app name (e.g., TaskAGI-Automation)
    • Select "script" as the application type
    • Accept the terms and click "Create app"
  2. Obtain your credentials:
    • Your Client ID appears under the app name
    • Click "show" to reveal your Client Secret
    • Note your Reddit username and password
  3. Configure in TaskAGI:
    • In the Reddit integration settings, enter:
      • Client ID: Your app's client ID
      • Client Secret: Your app's secret key
      • Username: Your Reddit account username
      • Password: Your Reddit account password
    • Test the connection to ensure authentication works
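Behind the scenes, a "script"-type app authenticates with Reddit's OAuth2 password grant. If you want to sanity-check your credentials outside TaskAGI, the helper below shows the shape of that token request. `buildRedditTokenRequest` is an illustrative function, not part of TaskAGI; note that Reddit requires a descriptive User-Agent header on all API calls.

```javascript
// Build (but do not send) the OAuth2 token request for a Reddit "script" app.
// Pass the resulting url/headers/body to fetch() to obtain an access token.
function buildRedditTokenRequest(clientId, clientSecret, username, password) {
  const basic = Buffer.from(`${clientId}:${clientSecret}`).toString("base64");
  return {
    url: "https://www.reddit.com/api/v1/access_token",
    method: "POST",
    headers: {
      Authorization: `Basic ${basic}`,
      "Content-Type": "application/x-www-form-urlencoded",
      "User-Agent": "TaskAGI-Automation/1.0", // Reddit rejects generic user agents
    },
    body: new URLSearchParams({ grant_type: "password", username, password }).toString(),
  };
}

const req = buildRedditTokenRequest("myClientId", "mySecret", "alice", "hunter2");
```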

Google Gemini

Why it's needed: Google Gemini powers the AI intelligence in this workflow. It evaluates post relevance and generates concise summaries of lengthy Reddit discussions, saving you reading time while preserving key insights.

Setup steps:

  1. Access Google AI Studio:
    • Visit aistudio.google.com and sign in with your Google account
  2. Create an API key:
    • Click "Get API Key" in the left sidebar
    • Select "Create API key in new project"
    • Copy your API key (keep this secure!)
  3. Enable the Gemini API:
    • Go to console.cloud.google.com
    • Select your project
    • Navigate to APIs & Services → Library
    • Search for "Generative Language API"
    • Click Enable
  4. Configure in TaskAGI:
    • In the Google Gemini integration settings, paste your API key
    • The workflow uses the gemini-2.0-flash model (already configured)
    • Verify the model selection in both AI nodes (relevance check and summarization)
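For reference, both AI nodes ultimately hit the Generative Language API's `generateContent` endpoint with a small JSON payload. The helper below is an illustrative sketch (not TaskAGI internals) of how such a request is assembled; you can use it with `fetch()` to verify your API key works before wiring up the workflow.

```javascript
// Assemble a generateContent request for the Generative Language API.
// buildGeminiRequest is a hypothetical helper for testing your key.
function buildGeminiRequest(model, apiKey, promptText) {
  return {
    url: `https://generativelanguage.googleapis.com/v1beta/models/${model}:generateContent?key=${apiKey}`,
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      contents: [{ parts: [{ text: promptText }] }],
    }),
  };
}

const req = buildGeminiRequest("gemini-2.0-flash", "YOUR_API_KEY", "Say hello");
```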

Configuration Steps

Node 1: Overview (Documentation)

This is a note node for your reference. No configuration needed—it documents the workflow's purpose.

Node 2: Daily Schedule (Trigger)

Configuration:

  • Set Interval: 1 day
  • Set Time: Choose your preferred time (e.g., 09:00 AM)
  • This ensures the workflow runs automatically every 24 hours

Why it matters: Consistent daily execution keeps your database fresh without overwhelming the Reddit API.

Node 3: Get Existing Posts from Supabase

Configuration:

  • Table: reddit_posts
  • Query: Leave empty to fetch all records
  • Purpose: Retrieves previously saved posts to avoid duplicates

Expected output: Array of existing post objects with their IDs

Node 4: Extract Existing IDs

Configuration:

  • Input: Connect to Node 3's output
  • Function: Maps the results to extract only the post_id field
  • Code example:
    // Return just the IDs of already-saved posts for duplicate checking
    return data.map(post => post.post_id);
    

Expected output: Array of post IDs like ["abc123", "def456"]

Node 5: Get Saved Posts from Reddit

Configuration:

  • Limit: Set to 25 (retrieves your 25 most recent saved posts)
  • Sort: new (newest first)
  • Purpose: Fetches your current saved posts from Reddit

Expected output: Array of post objects with titles, URLs, and metadata

Node 6: Filter Posts

Configuration:

  • Input: Connect Node 5 output and Node 4 output
  • Function: Removes posts already in your database
  • Code example:
    // Build a fast lookup of IDs already stored in Supabase
    const existingIds = new Set(extractedIds);
    // Keep only posts that have not been saved yet
    return redditPosts.filter(post => !existingIds.has(post.id));
    

Expected output: Only new, unsaved posts
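Taken together, Nodes 3 through 6 implement a simple set difference: everything Reddit returns minus everything Supabase already holds. A standalone sketch with sample data (variable names are illustrative, not TaskAGI's actual node bindings):

```javascript
// Node 3 output, simplified: rows already stored in Supabase
const storedRows = [{ post_id: "abc123" }, { post_id: "def456" }];

// Node 4: extract just the IDs
const existingIds = storedRows.map((row) => row.post_id);

// Node 5 output, simplified: latest saved posts from Reddit
const redditPosts = [
  { id: "abc123", title: "Already saved" },
  { id: "xyz789", title: "Brand new post" },
];

// Node 6: keep only posts not yet in the database
const idSet = new Set(existingIds);
const newPosts = redditPosts.filter((post) => !idSet.has(post.id));
// newPosts now contains only the xyz789 post
```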

Node 7: Loop Over Posts

Configuration:

  • Array input: Connect to Node 6 (filtered posts)
  • Loop variable: currentPost
  • Purpose: Processes each new post individually

Data flow: Each iteration passes one post to the next nodes

Node 8: Check Post Relevance

Configuration:

  • Model: gemini-2.0-flash
  • Prompt:
    Does this Reddit post discuss something valuable or insightful? 
    Post title: {currentPost.title}
    Post content: {currentPost.selftext}
    
    Respond with only "YES" or "NO".
    
  • Purpose: AI determines if the post is worth summarizing

Expected output: "YES" or "NO"
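In practice, models sometimes answer with extra punctuation or casing ("Yes.", " no"). A small normalizer makes the comparison in Node 9 robust; `normalizeRelevance` is an illustrative helper you could drop into a function node, not a built-in:

```javascript
// Reduce any model reply to a strict "YES"/"NO", defaulting to "NO"
// so that ambiguous answers are skipped rather than summarized.
function normalizeRelevance(reply) {
  const cleaned = String(reply).trim().toUpperCase();
  return cleaned.startsWith("YES") ? "YES" : "NO";
}
```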

Node 9: Is Post Relevant?

Configuration:

  • Condition: Check Post Relevance output === "YES"
  • True path: Continues to summarization
  • False path: Skips to next loop iteration
  • Purpose: Only processes relevant posts

Node 10: Get Post with Comments

Configuration:

  • Post ID: currentPost.id
  • Include comments: true
  • Limit comments: 10 (top comments)
  • Purpose: Retrieves full discussion context

Expected output: Post object with comment thread

Node 11: Format Content for Summary

Configuration:

  • Input: Connect Node 10 output
  • Function: Combines post title, content, and top comments
  • Code example:
    // Combine the post and its top comments into one prompt-ready string;
    // link posts may have no selftext, so fall back to an empty string
    return `Title: ${post.title}\n\nContent: ${post.selftext || ''}\n\nTop Comments: ${post.comments.map(c => c.body).join('\n')}`;
    

Expected output: Formatted text string ready for summarization

Node 12: Summarize Post

Configuration:

  • Model: gemini-2.0-flash
  • Prompt:
    Summarize the following Reddit post using its content and comments. 
    Focus on key insights and actionable takeaways. Keep it under 200 words.
    
    {formattedContent}
    
  • Purpose: Generates concise summary of the discussion

Expected output: Summary text (200 words max)

Node 13: Parse Summary

Configuration:

  • Input: Connect Node 12 output
  • Parse type: text
  • Purpose: Ensures clean text output

Node 14: Prepare Data for Supabase

Configuration:

  • Input: Connect previous nodes
  • Function: Structures data for database insertion
  • Code example:
    // Shape one row to match the reddit_posts table schema
    return {
      post_id: currentPost.id,      // unique Reddit post ID (deduplication key)
      title: currentPost.title,
      url: currentPost.url,
      summary: parsedSummary,       // output of the Parse Summary node
      relevance_score: "high"       // every post reaching this node passed the relevance check
    };
    

Expected output: Object matching your Supabase table schema
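Before handing the object to the insert node, it can be worth checking that every column the table expects is actually populated, since a missing field will either fail the insert or leave a null row. A hedged sketch (`validateRow` is illustrative, not a TaskAGI built-in):

```javascript
// Verify a row carries every non-generated column of reddit_posts
// (id and created_at are filled in by the database itself).
function validateRow(row) {
  const required = ["post_id", "title", "url", "summary", "relevance_score"];
  const missing = required.filter((key) => row[key] == null || row[key] === "");
  return { ok: missing.length === 0, missing };
}

const check = validateRow({
  post_id: "abc123",
  title: "Example post",
  url: "https://reddit.com/r/example",
  summary: "A short summary.",
  relevance_score: "high",
});
```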

Node 15: Save to Supabase

Configuration:

  • Table: reddit_posts
  • Action: Insert
  • Data: Connect to Node 14 output
  • Purpose: Stores the summarized post in your database

Node 16: Rate Limit Pause

Configuration:

  • Duration: 2 seconds
  • Purpose: Prevents hitting Reddit API rate limits
  • Placement: After each post is saved
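Conceptually, the pause node is just an awaited delay between loop iterations. A minimal sketch of the pattern (`sleep` and `processWithPause` are illustrative helpers; 2000 ms matches the node's configured duration):

```javascript
// Resolve after the given number of milliseconds
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Process posts sequentially, pausing between each to stay
// under Reddit's API rate limits.
async function processWithPause(posts, handlePost, pauseMs = 2000) {
  const results = [];
  for (const post of posts) {
    results.push(await handlePost(post));
    await sleep(pauseMs);
  }
  return results;
}
```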

Testing Your Agent

Step 1: Initial Test Run

  1. Open your workflow in TaskAGI
  2. Click "Test" or "Run Now" button
  3. Monitor the execution in real-time

Step 2: Verify Each Stage

Check Node 3 (Supabase retrieval):

  • Confirm it returns your existing posts (or empty array if first run)
  • Look for correct table structure

Check Node 5 (Reddit retrieval):

  • Verify it fetches your saved posts
  • Confirm post objects contain title, selftext, and url

Check Node 6 (Filtering):

  • Ensure new posts are identified correctly
  • Count should be less than or equal to Node 5 count

Check Node 8 (Relevance check):

  • Verify AI returns "YES" or "NO"
  • Review a few decisions to ensure they make sense

Check Node 12 (Summarization):

  • Read generated summaries for quality and accuracy
  • Confirm summaries are under 200 words

Check Node 15 (Supabase save):

  • Log into Supabase dashboard
  • Navigate to your reddit_posts table
  • Verify new rows appear with complete data

Step 3: Success Indicators

  • Workflow completes without errors
  • New posts appear in the Supabase table
  • Summaries are concise and accurate
  • No duplicate posts are saved
  • Execution time is under 5 minutes

Step 4: Monitor Ongoing Execution

  • Check your Supabase dashboard weekly to review saved posts
  • Adjust the relevance prompt if you're getting too many false positives/negatives
  • Monitor API usage to ensure you stay within free tier limits

Congratulations! Your Reddit curation agent is now live and building your knowledge base automatically.

Similar Solutions

Related Agents

Explore these powerful automation agents that complement your workflow.

Reddit Marketing AI Agent

Use this AI agent to keep track of industries, keywords, brands and competitors on Reddit and automatically engage with...

Reddit Comments Scraper AI Agent

Extract comment data from Reddit post URLs including nested replies and discussion threads. Simple form interface lets y...

Reddit Posts by Keyword Scraper AI Agent

Discover Reddit posts using keyword search with advanced filtering options. Simple form interface lets you search across...