Designing an AI Job Search Agent with n8n

30 Oct: This post is shifting to ‘why and how I designed/built’ the n8n job search agent; see How to use n8n to search job boards for how to install it.

I built an AI agent that automatically finds relevant job openings, evaluates them against my criteria, and emails me daily updates. The agent detailed below has replaced my morning job board searches.

Over the past couple of months, the AI job search agent has processed 40+ new postings each week and flagged 2-3 genuinely relevant opportunities. After trying several approaches this summer, I've found that n8n's visual workflow builder strikes the perfect balance of abstraction and control for this automation.

Why n8n for AI Agents

n8n’s flowgramming approach lets you see the entire agent workflow at a glance while maintaining granular control over each step. Unlike code-heavy frameworks, you can iterate quickly on logic and easily debug issues when they arise. And unlike no-code frameworks (e.g., Zapier), you get greater control over resource usage and greater predictability for how the agent operates.

5 Steps To Find Relevant Job Postings

The agent workflow has five functional steps to it:

  1. Search the web for recent job openings using a keyword string I created
  2. Iterate through the found job openings and skip ones we’ve seen before
  3. Ask Claude to analyze the role and evaluate if it is what I’m looking for
  4. Save the discovered job openings, marking them as relevant or not
  5. Email me an update of what was processed

Below is a screenshot of the current workflow, calling out where these steps occur.

Screenshot of n8n workflows that make up the AI job search agent

The rest of this blog post examines each step and explains the implementation.

What You’ll Need

  • You’ll need an n8n account (obviously). You can use either the cloud service or self-host.
    I started with the cloud service to build the initial workflow, then moved it to a Docker container running on my local machine.
  • To search the web, a Brave Search API account and a Google Search API account. Both free tiers provide sufficient quota for daily job searches.
  • You can use Airtable, PostgreSQL, or whatever else you prefer to store your data. My solution has used both Airtable and Postgres, which are detailed below.
  • Gmail account to send daily summary emails
  • An Anthropic API key to power the job evaluation. Note that you need an API account to call your AI service, which is separate from a Claude Pro subscription.

Configured Data Storage

To store the data, I set up my data store with the following fields:

  • Company (single line text) – Company name
  • Position (single line text) – Job title
  • Date Identified (date) – Date that the role was first discovered
  • Date Applied (date) – Date applied for role (if applicable)
  • Decision Date (date) – Date that this opportunity closed – positive or negative (if applicable)
  • Status (single select) – For the purposes of this example, ‘Found by Agent’, ‘Ignore’, ‘Ignore – Duplicate’, ‘Applied’, and ‘Position Closed’
  • Job Description Link (single line text) – URL
  • Interview Rounds – Links to records in the Interviews data table
  • Notes (single line text) – Claude records its reasoning here
  • Job Description (long text) – Full job description
  • Source (single select) – Where this job was found – for this example, ‘AI Agent’, ‘LinkedIn’
  • Green flags (single line text) – String array of green flags that the AI spotted from the job description
  • Red flags (single line text) – String array of red flags that the AI spotted from the job description
  • Normalized URL (single line text) – URL that has been cleaned up to remove URL variables and language-specific URL features, helping reduce duplicate job entries.
  • Platform (single line text) – Which job board the job listing came from

I use a number of additional fields in the Airtable database to track and drive my application/interview process, but the above are required for this workflow setup.
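As a concrete illustration, the Normalized URL field can be produced with a small function in an n8n Code node. This is a sketch under assumptions: the post doesn't show the exact normalization rules, so stripping query strings, fragments, and a trailing locale segment here is illustrative.

```javascript
// Illustrative URL normalizer for deduplicating job links.
// The exact rules are assumptions: drop query string and fragment
// (tracking params like ?utm_source=...), remove a trailing locale
// segment such as /en or /en-us, and lowercase the result.
function normalizeJobUrl(rawUrl) {
  const url = new URL(rawUrl);
  url.search = '';
  url.hash = '';
  // Remove a trailing locale path segment like /en or /en-US, if present
  const path = url.pathname.replace(/\/(?:[a-z]{2})(?:-[a-z]{2})?\/?$/i, '');
  return (url.origin + path).replace(/\/$/, '').toLowerCase();
}
```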

With the dependencies set up, let’s start creating the agent.

Step 1 – Search the Job Boards for Jobs

Using Brave Search

The agent uses Brave Search’s operators to find targeted results across multiple job boards. Here’s the search syntax:

site:ashbyhq.com AND (intitle:"developer marketing" OR intitle:"developer relations" OR intitle:"product marketing") AND (intitle:"director" OR intitle:"vp" OR intitle:"head") AND inpage:"developer" NOT intitle:"Product Management"

Brave Search key operators:

  • site: – Limits results to the URL of a specific job board
  • intitle: – Targets keywords within the job title
  • inpage: – Requires specific terms in the description
  • NOT – Used to exclude unwanted roles
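To show how these operators compose, here is a small helper that rebuilds the search string above from keyword arrays. This is a sketch, not the post's implementation (the author manages the string by hand in a Set node), and the function name is mine.

```javascript
// Build a Brave-style search query from keyword lists.
// Reconstructs the same shape as the query shown above.
function buildJobQuery(site, roleTitles, seniorityTitles, excludeTitle) {
  // Join a list of terms into an OR group for one operator
  const or = (field, terms) =>
    '(' + terms.map(t => `${field}:"${t}"`).join(' OR ') + ')';
  return [
    `site:${site}`,
    or('intitle', roleTitles),
    or('intitle', seniorityTitles),
    'inpage:"developer"',
  ].join(' AND ') + ` NOT intitle:"${excludeTitle}"`;
}

const query = buildJobQuery(
  'ashbyhq.com',
  ['developer marketing', 'developer relations', 'product marketing'],
  ['director', 'vp', 'head'],
  'Product Management'
);
```

Keeping the keyword lists in arrays makes it easy to run the same query shape against each job board domain.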

I use two workflows to search Brave:

  • The first workflow (1) sets the search string (above), (2) runs separate searches for each target job board by calling the second workflow, and then uses n8n’s ‘Split Out’ activity to convert API results into individual job entries that can be processed. A brief pause between searches prevents API rate limiting.
  • The second workflow makes the Brave Search API call – it (3a) initializes a loop that (4) retrieves a page of search results, then (3b) concatenates the pages together and determines whether it needs to run again.

Workflow model for calling Brave Search API.

The end result is a collection of search results that are merged back together using the n8n Merge activity to bring all of the individual search results together into one collection that can be passed back to the primary workflow.
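The paging-and-concatenation logic of the second workflow can be sketched as plain JavaScript. Here `fetchPage` stands in for the Brave Search API call so the loop control is visible; the names and the 5-page cap are assumptions, not the post's exact settings.

```javascript
// Sketch of the second workflow's loop: fetch pages of search results
// until a short page signals the end, then concatenate them.
// `fetchPage(offset)` is injected so the paging logic is testable
// without the live API; pageSize 20 matches Brave's per-request limit.
async function collectAllResults(fetchPage, pageSize = 20, maxPages = 5) {
  const all = [];
  for (let offset = 0; offset < maxPages; offset++) {
    const page = await fetchPage(offset); // one API call per page (step 4)
    all.push(...page);                    // concatenate pages (step 3b)
    if (page.length < pageSize) break;    // short page => no more results
  }
  return all;
}
```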

Using Google Search

I use the same approach with Google Search, which I cover in a separate blog post.

Step 2 – Filter Out Duplicates

With search results merged into a single collection, we want to filter out the URLs that we have previously seen. This saves us a lot of processing and LLM tokens. The below section describes how to do this with PostgreSQL, but I have also done this using Airtable.

Workflow nodes for filtering out duplicates - retrieving prior results and using the n8n Combine node to filter out results

When I query the data store, I use the cached credentials and retrieve all rows, but only the Job Description Link and Normalized URL columns to keep the payload small. When configuring this, I found it very helpful to have the ‘Always Output Data’ option enabled – otherwise the workflow may stop if no results are returned.

To use the data, I use an n8n Combine node to match the search results against the previously processed URLs and set the Output Type to Keep Non-Matches – this is the magic that filters out all of the previously processed URLs.

And I actually do this filtering-out step twice – once against the Job Description Link and once against the Normalized URL.
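In plain JavaScript, the double filter behaves roughly like this sketch. Field names such as `job_description_link` and `normalizedUrl` are illustrative, not the exact column names.

```javascript
// What the Combine node's "Keep Non-Matches" output does, expressed
// as plain JavaScript: keep only search results whose raw OR
// normalized URL has not been stored before.
function keepNewPostings(searchResults, knownRows) {
  const seen = new Set();
  for (const row of knownRows) {
    if (row.job_description_link) seen.add(row.job_description_link);
    if (row.normalized_url) seen.add(row.normalized_url);
  }
  return searchResults.filter(
    r => !seen.has(r.url) && !seen.has(r.normalizedUrl)
  );
}
```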

In my first implementation of this, I had the agent loop through each job posting, query Airtable using the job URL, and evaluate whether it already existed in the database. HOWEVER, I found this ate up a LOT of Airtable API calls and caused my agent to get blocked – leading to this solution. For additional details, check out my blog post on the topic.

Step 3 – Claude Evaluation

Now that we have a new job entry, let’s use an Anthropic AI node to evaluate whether it is a role that matches what I’m looking for. This is where Claude analyzes each new job posting using a prompt that includes:

  • Your specific role requirements and preferences
  • The job title and URL from search results
  • Explicit instructions to return only valid JSON (no markdown formatting)
  • Output examples showing the expected structure
n8n properties for AI Messaging node to find prior applications - parameters tab

Within this activity, you’ll want to be as descriptive as possible about what you are looking for and how the AI should evaluate the job description. Treat the prompt as a starting point to be evaluated and tweaked over time – I have continued to revise mine weekly based on how it processes job postings, and I expect to keep doing so.

A few key elements to keep in mind:

  1. You must pass the job title and the URL from the search results into the prompt – I got mine from the ‘Loop Over Posts’ activity
  2. A critical element for me was to give it output examples and explicitly tell it to return only valid JSON without any markdown formatting.
  3. Consider increasing the maximum token limit to handle longer job descriptions
screenshot of the full AI prompt

After Claude has analyzed the posting, you should add a ‘Set’ activity in JSON mode that cleans up the response format before proceeding.

n8n properties for Set node that cleaned up the JSON
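The cleanup that Set node performs can be expressed as a small function: strip any markdown code fences Claude may still add despite the instructions, then parse the JSON. A sketch, not the post's exact expression.

```javascript
// Tolerate a Claude reply that wraps its JSON in markdown fences
// (```json ... ```) despite instructions, then parse it.
function parseClaudeJson(text) {
  const cleaned = text
    .replace(/^\s*```(?:json)?\s*/i, '') // strip opening fence if present
    .replace(/\s*```\s*$/, '')           // strip closing fence if present
    .trim();
  return JSON.parse(cleaned);
}
```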

Step 4 – Save the Results

With the job opening evaluated, we need to post the entry up into Airtable for human evaluation using an Airtable activity with these mapped fields:

  • Company: {{ $json.company }}
  • Position: {{ $('Loop Over Posts').item.json.title }}
  • Date Identified: {{ $now.format('MM/dd/yyyy') }}
  • Status: {{ $if($json.relevant, "Found by Agent", "Ignore" ) }}
  • Notes: {{ $json.reasoning }}
  • Job Description Link: {{ $('Loop Over Posts').item.json.profile.url }}
  • Job Description Text: {{ $json['job description'] }}
  • Green flags: {{ $json.green_flags.join("; ") }}
  • Red flags: {{ $json.red_flags.join("; ") }}
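As one way to picture the mapping, here is the same record built as a plain object from Claude's parsed output (`claude`) and the original search item. The `toLocaleDateString` call stands in for n8n's `$now.format('MM/dd/yyyy')`, and input field names like `green_flags` assume the prompt's output schema.

```javascript
// The Airtable field mapping above, as a plain function over Claude's
// parsed JSON and the item from the 'Loop Over Posts' activity.
function toAirtableRecord(claude, searchItem) {
  return {
    Company: claude.company,
    Position: searchItem.title,
    'Date Identified': new Date().toLocaleDateString('en-US'), // ~ MM/dd/yyyy
    Status: claude.relevant ? 'Found by Agent' : 'Ignore',
    Notes: claude.reasoning,
    'Job Description Link': searchItem.profile.url,
    'Job Description': claude['job description'],
    'Green flags': claude.green_flags.join('; '),
    'Red flags': claude.red_flags.join('; '),
  };
}
```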

Step 5 – Daily Email Summary

Lastly, the workflow sends me a summary email using the Gmail node to let me know what it has processed: job counts, token usage, and a list of newly discovered relevant roles. This has been extremely useful for letting me know what has been found by the agent.

screenshots of node configuration, relevant expression boxes, and a sample email sent by the node.
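A minimal sketch of assembling such a summary body; the field names and counters here are assumptions, since the post doesn't show the node's exact expressions.

```javascript
// Assemble a plain-text summary body for the Gmail node from the
// run's processed items: counts, token usage, and relevant roles.
function buildSummaryEmail(processed, tokensUsed) {
  const relevant = processed.filter(j => j.relevant);
  const lines = [
    `Jobs processed: ${processed.length}`,
    `Relevant roles found: ${relevant.length}`,
    `Claude tokens used: ${tokensUsed}`,
    '',
    ...relevant.map(j => `- ${j.position} at ${j.company}: ${j.url}`),
  ];
  return lines.join('\n');
}
```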

I love how this email captures a summary of all of the work. Producing it took some effort to reverse-engineer/hack my way into workflow-level variables, which I don’t think exist in the free tier of n8n. The process of creating these ‘variables’ helped me learn a lot about the way n8n works, but that is the topic for a future blog post.

And while I like this email, I still prefer the summary emails sent by my Zapier agent, which does similar work each morning. I’ll share more on the Zapier AI agent in another blog post.

Results & Performance

So far, the agent picks up about 60-70 postings from the search results, with roughly 10 jobs being new each week and 1-2 of those being genuine opportunities. It filters out 80%+ of irrelevant results and saves me roughly 3 hours of manual searching and evaluation per week.

False positives are rare when the AI prompt includes specific examples of roles to avoid. The main challenge for me is refining the job board search syntax and tweaking my AI prompt to provide better assessment.

The biggest thing holding me back is that I can’t specify result freshness in the search queries. For Brave, each Search API call is limited to 20 results, and the calls typically return mostly the same 10-12 results at the top. If I could restrict the search to job postings from the last week, I predict the workflow would find more new postings and wouldn’t need to reprocess the same ones.
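For what it's worth, Brave's Web Search API does document a `freshness` query parameter (e.g. `pw` for the past week) that a direct HTTP Request node could pass. This sketch builds such a request; verify parameter support against Brave's current API docs before relying on it.

```javascript
// Build a direct Brave Web Search API request (e.g. for an n8n HTTP
// Request node) with the documented `freshness` parameter.
// 'pw' = past week; the endpoint and header are from Brave's API docs.
function braveSearchRequest(query, apiKey) {
  const params = new URLSearchParams({
    q: query,
    count: '20',       // Brave's per-request maximum
    freshness: 'pw',   // restrict results to the past week
  });
  return {
    url: `https://api.search.brave.com/res/v1/web/search?${params}`,
    headers: { Accept: 'application/json', 'X-Subscription-Token': apiKey },
  };
}
```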

Next Steps

I love how this workflow performs, but I’m not done yet!

Below is my current backlog of workflow enhancements:

  • Brave Search API enhancement – I REALLY want to search only fresh job results. I am debating whether to (a) move to an activity that calls the URL/API directly, or (b) update the n8n Brave Search activity to add the freshness flag.
  • Querying my top-10 company list – I would like to enhance the workflow to query targeted company Career pages, not just job board web searches.
  • Give more agency to the AI – In my Zapier agent, the AI has given itself permission to dig around the internet if the URL isn’t found – querying the company’s job board and finding it. This one doesn’t, but I would love to figure out how to get it to do more on its own.
  • Multi-criteria scoring instead of binary relevant/irrelevant classification – I’m in the process of moving to a multi-shot prompt to get more useful notes. Eventually, I’d like to make this question itself and flag potential false positives and false negatives.
  • Automated career coaching resume updates for the most obvious fits – I have a rather extensive Career Coach ‘agent’ that helps me tailor my resume to specific jobs. I’m debating whether to incorporate that analysis into this workflow, moving it out of my Claude Pro account. This would likely be creating an AI Agent in n8n and calling into it.

I hope that this post helps you get use out of n8n, and that it helps other job seekers out there on the market today. If you have other suggestions for improvements or see where I could be using n8n better, drop me a note.
