
Build your own Reddit content curator in 30 minutes with Make.com⚡ [Step-by-step guide]

Nanobits Product Spotlight

EDITOR’S NOTE

Dear Nanobiters,

Like most of you, I love Reddit! Whether researching trends, analyzing public opinions, or gathering data for content creation, it’s always the go-to platform because of its valuable insights.

But more often than not, the best discussions and insights from your favorite subreddits slip right past you.

You open Reddit, scroll through a few posts, and before you know it, you've spent an hour diving down random rabbit holes—yet somehow missed the most important updates from the communities you actually care about.

I've been there too many times to count!

Whether it's r/cryptocurrency's latest market analysis, r/india's electrifying discussions, or even r/dadjokes' weekly gems (don't judge!), keeping track of what truly matters across multiple subreddits can feel like trying to drink from a firehose.

Sure, you could manually check each subreddit daily, or rely on Reddit's "Top Posts" algorithm.

But who has the time? And let's be honest—Reddit's own sorting system isn't always the best at surfacing the content you personally find valuable.

That's why I built a custom Reddit scraper using Make.com and ChatGPT.

It's like having a personal Reddit curator who knows exactly what you're interested in and delivers the most relevant content straight to your inbox.

And the best part is that you don't need any coding experience to build this. In this newsletter, I'll show you how to create your own Reddit intelligence system in just a few simple steps.

Ready to save hours of Reddit scrolling time? Let's build this together.

ARE YOU NEW TO MAKE.COM?

It’s a super easy and intuitive tool to learn.

Here’s a beginner-friendly guide/primer I wrote a few weeks ago. It will help you get started with Make.com.

AI agent that summarizes top posts from your favorite subreddits

This simple automation will summarize the most popular posts from your favorite subreddits and email you the summary. No more endless scrolling on Reddit’s feed to find what matters to you.

So, let’s get started.

Click on “Scenarios” in the left-hand sidebar.

Click “Create a new scenario” on the top right corner of the page.

This now drops us into the scenario editor or designer, where we can build our automation.

Module 1: HTTP Module for an API request

This first module is a bit tricky, so I want you to pay close attention to this.

But before that…

What's an API and Why Do We Need It?

Think of an API as a waiter at a restaurant. You (Make.com) tell the waiter (API) what you want, and they fetch it from the kitchen (Reddit's servers). The Reddit API lets our automation gather posts, comments, and other data from any subreddit.

Without the Reddit API, our automation would be like trying to order food by walking straight into the restaurant's kitchen - messy and inefficient.

The API gives us structured data we can work with. Instead of trying to extract information from Reddit's website directly, we get clean, organized data that's perfect for our automation.

In Step 1, we'll set up an HTTP request to Reddit's API, which will fetch exactly what we need—no manual searching is required.
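To make the waiter analogy concrete, here is a rough Python sketch of the kind of request the HTTP module sends. The endpoint host, path, and parameter names below are assumptions for illustration only; copy the exact values from your RapidAPI playground page.

```python
# Sketch of the API request the HTTP module makes.
# NOTE: the host, path, and parameter names are hypothetical --
# use the real ones shown on your RapidAPI playground page.

def build_reddit_request(api_key: str, subreddit: str, time_frame: str = "week"):
    """Assemble the URL, headers, and query parameters for a
    RapidAPI-style Reddit scraper call."""
    url = "https://reddit-scraper.p.rapidapi.com/top"  # hypothetical endpoint
    headers = {
        "x-rapidapi-key": api_key,  # the header name used in Make's connection setup
    }
    params = {
        "subreddit": subreddit,  # e.g. "india"
        "time": time_frame,      # "week" or "month" picks the top-posts window
    }
    return url, headers, params

# To actually fetch the data you would then do something like:
# import requests
# url, headers, params = build_reddit_request("YOUR_KEY", "india")
# posts = requests.get(url, headers=headers, params=params).json()
```

The HTTP module fills in the same three pieces for you: the URL, the API-key header, and the query string.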

In the center of the screen, you'll see this large plus icon. We will start by adding an HTTP module.

After adding this module, you have to add your credentials in the first box.

Here’s how:

  1. Go to rapidapi.com and sign up. They give a few credits in their free tier to experiment.

  2. At the top, there is a search bar. Type “Reddit Scraper” in it, which will take you to this page.

  3. Then click on “Go to Playground.” You will see a page like this:

     

  4. I have highlighted the information you need handy to add your credentials.

  5. Choose any name for the connection, enter your key from RapidAPI, and set the API key parameter name to x-rapidapi-key.

  6. Don’t forget to enable the API by testing the endpoint on the RapidAPI page.

  7. Now, follow the steps as shown in the video below:

A few things to note:

  • The URL is on the RapidAPI page in the playground. I have underlined it in red in the image above.

  • You may choose any subreddit. I chose India because I wanted to know what was discussed in this subreddit last week.

    You may also choose subreddits like Product Marketing or B2B SaaS Marketing to catch up on the top discussions you missed, or to gather content ideas for your next blog, article, or social media post.

  • By changing the time variable in the API module, you can view a subreddit's top posts for the week or the month.

Module 2: Create a Google Document

This document will capture the top posts the API module will fetch in the first step.

A few things to note:

  • If you haven’t used Make or any Google modules before, you must sign in to your Google account to set up a connection.

  • The “now” timestamp in the document name records when the newsletter was created.

  • Make requires you to put some content in to create a Google Doc. Since we want a blank document, I have added one hyphen so it’s not empty.

Iterator:

It is a module that splits an array into multiple bundles, with each item in the array becoming a separate bundle. This allows you to process each item in the collection individually.

In this flow, the iterator takes the array of posts returned by the Reddit API module and splits it so each post can be processed individually before the results move on to ChatGPT.
Sometimes, the iterator’s values may not be accessible to the map function in the subsequent module. In such cases, set the value manually, for instance as {{2.data.data.posts}}.
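If you think in code, the Iterator is just a loop over an array. Here is a minimal Python sketch of the behavior; the sample posts and field names are made up for illustration:

```python
def iterate(bundle_array):
    """Emit each item of the array as its own bundle,
    which is what Make's Iterator does."""
    for item in bundle_array:
        yield item

# Hypothetical posts, standing in for the Reddit API module's output.
posts = [
    {"title": "Post A", "num_comments": 120},
    {"title": "Post B", "num_comments": 45},
]

bundles = list(iterate(posts))  # one bundle per post
```

Downstream modules then run once per bundle, i.e., once per post.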

Module 3: Append a paragraph into the Google Document

The Reddit posts fetched by the first module will be added to the document created in the second step.

Array Aggregator:

We learned all about array aggregators in the last newsletter.

Without the iterator and aggregator, you might get 20 separate emails [the API module can fetch 20 or more top posts for the time frame you chose] instead of one email with the key pointers from the subreddit's posts.
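Conceptually, the Array Aggregator is the inverse of the Iterator: it rolls the per-post bundles back up into a single bundle so the rest of the flow runs once. A rough sketch, with invented field names:

```python
def aggregate(bundles, key):
    """Collect one field from every bundle into a single array,
    the way Make's Array Aggregator rolls bundles back up."""
    return [bundle[key] for bundle in bundles]

# Hypothetical per-post bundles coming out of the iterator.
bundles = [
    {"title": "Post A", "num_comments": 120},
    {"title": "Post B", "num_comments": 45},
]

titles = aggregate(bundles, "title")  # one array -> one email, not twenty
```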

Module 4: Download the Google Document file

This module will download the Google Doc file, which contains all the top posts from your favorite subreddit, in PDF or MS Word format.

In the last newsletter, we learned how to link your Google modules to Make.com. Here’s a recap:

To connect restricted Google services, like Gmail and Google Drive, to Make, you must create a project on the Google Cloud Platform and a custom OAuth client.

You can follow this guide with additional required steps to connect.

Module 5: Upload the file to ChatGPT

To connect Make and ChatGPT, you need an OpenAI API key, which you can find here.

Sleep Module/Delay:

This module waits for the file from the previous step to finish uploading before further processing starts.

Module 6: Process the file on ChatGPT to get the desired output

Before proceeding with this module, let’s create a new assistant on ChatGPT.

I intend to analyze the top-performing and most discussed posts on my favorite subreddit and generate ideas for my next social media posts.

This is the prompt that I used:

As an expert content writer for social media platforms and newsletters, your task is to analyze the data from the attached file containing the top posts on Reddit this week. Use the titles, posts, and number of comments to identify popular and engaging topics. Focus on posts with higher comment counts, as these indicate strong audience interaction.

Based on your analysis, brainstorm several content ideas for this week's social media posts and newsletters. Ensure that the content is relevant, timely, and likely to resonate with your target audience. Consider incorporating trending topics, leveraging user-generated content, or creating interactive posts to increase engagement.

Be creative in your approach and think about how you can leverage the data from the top Reddit posts to create compelling and shareable content. Your goal is to drive engagement, spark conversations, and ultimately enhance the reach and effectiveness of your social media and newsletter campaigns.
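The assistant performs this analysis itself, but the core signal the prompt asks it to use, ranking posts by comment count, is simple enough to sketch. The sample data here is invented:

```python
def most_discussed(posts, n=3):
    """Return the n posts with the highest comment counts --
    the 'strong audience interaction' signal from the prompt."""
    return sorted(posts, key=lambda p: p["num_comments"], reverse=True)[:n]

# Hypothetical posts from the weekly fetch.
posts = [
    {"title": "Market analysis", "num_comments": 310},
    {"title": "Weekly thread", "num_comments": 80},
    {"title": "Breaking news", "num_comments": 540},
]

top = most_discussed(posts, n=2)
```

If the assistant's summaries feel off, making this ranking explicit in the prompt (e.g., "only consider the 5 most-commented posts") usually tightens the output.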

Now, let’s look at how to set up the module:

Module 7: Collate the topics and publish the results to your inbox

Now that we have collated the ideas/topics, we will bunch them together and send them to the inbox. Instead of emailing, you may choose to put them in a Google Doc or Sheet or send them to a Slack channel.

Now, let’s examine our progress so far.

Here’s the result of one iteration of the workflow.

The workflow is now ready!

The next step is to schedule it.

I don't want to come here and manually run this every weekend.

To turn scheduling on, press this toggle in the bottom left-hand corner. Then, you can configure the schedule to run on certain days of the week or month.

Remember that the more often it runs, the more operations your scenario will consume. So, think of a reasonable interval at which to run this.
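As a rough back-of-the-envelope check, here is one way to estimate consumption. The module counts and post volume below are assumptions, and Make's exact operation accounting for iterators and aggregators varies, so treat this as an order-of-magnitude estimate against your plan's monthly allowance:

```python
# Hypothetical numbers -- count the modules in your own scenario.
modules_before_iterator = 2   # HTTP request, create document
per_post_modules = 1          # "append a paragraph" runs once per post
modules_after_aggregator = 5  # download, upload, sleep, ChatGPT, email
posts_per_run = 20            # how many top posts the API returns
runs_per_month = 4            # a weekly schedule

ops_per_run = (modules_before_iterator
               + per_post_modules * posts_per_run
               + modules_after_aggregator)
ops_per_month = ops_per_run * runs_per_month  # 27 * 4 = 108
```

A weekly schedule stays cheap; switching to daily runs multiplies the total by seven, which is worth knowing before you toggle it on.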

And just like that, we've saved some precious time by automating repetitive tasks.

End Note

Staying on top of Reddit's best content doesn't mean spending hours scrolling. You can create multiple scrapers for different subreddits and customize each to match your interests.

With this automation running in the background, you'll never miss valuable discussions or trending topics again. To stay informed, you only need to scan your weekly email.

The initial setup takes just 30 minutes, but think about the hours you'll save monthly by not manually searching through Reddit.

I'd love to hear how you use this Reddit scraper. Maybe you're tracking industry trends, gathering content ideas, or just staying updated on your favorite communities. Reply to this email to share your experience.

Are you having trouble setting up your scraper? Do you need help fine-tuning the automation? Contact me. I'm here to help you get the most out of this workflow.

Until next week, happy automating!

Also, remember to check your automation settings occasionally. You might want to adjust the period (weekly/monthly) or tweak the prompt to better match what you want in the summaries.

Share the love ❤️ Tell your friends!

If you liked our newsletter, share this link with your friends and request them to subscribe too.

Check out our website to get the latest updates on AI.
