{
  "name": "Dynamic MCP server selection with OpenAI GPT-4.1 and contextual AI reranker",
  "nodes": [
    {
      "id": "e01ddd1f-7261-4bfb-a1b1-7a6f58bfdb3c",
      "name": "OpenAI Chat Model",
      "type": "@n8n/n8n-nodes-langchain.lmChatOpenAi",
      "position": [
        912,
        368
      ]
    },
    {
      "id": "47e86618-9c89-4b70-b4ce-a30bd89eebde",
      "name": "If",
      "type": "n8n-nodes-base.if",
      "position": [
        1184,
        144
      ]
    },
    {
      "id": "30458c75-70ef-4b18-91b3-6ad36db1370e",
      "name": "Merge",
      "type": "n8n-nodes-base.merge",
      "position": [
        1632,
        64
      ]
    },
    {
      "id": "6c32510a-9b0a-40b4-90f3-3fae29359cee",
      "name": "Merge1",
      "type": "n8n-nodes-base.merge",
      "position": [
        2304,
        64
      ]
    },
    {
      "id": "5ee74efd-afb7-4cbc-940a-92cda69aeef2",
      "name": "Sticky Note",
      "type": "n8n-nodes-base.stickyNote",
      "position": [
        64,
        -112
      ],
      "parameters": {
        "width": 480,
        "height": 1152,
        "content": "# Dynamic MCP Selection\n## PROBLEM\nThousands of MCP Servers exist and many are updated daily, making server selection difficult for LLMs.\n- Current approaches require manually downloading and configuring each server."
      }
    },
    {
      "id": "14d4db9f-0191-4cdd-a05e-0e1438a312db",
      "name": "Sticky Note1",
      "type": "n8n-nodes-base.stickyNote",
      "position": [
        592,
        -112
      ],
      "parameters": {
        "width": 704,
        "height": 608,
        "content": "## 1. Determine whether MCP servers are needed\nBased on the user's request, the LLM determines whether an MCP server is needed, provides a reason, and, if needed, provides reranking instruction text that will be passed to the Contextual AI Reranker."
      }
    },
    {
      "id": "01a24336-eca1-48bf-bafb-f7c3ea1bdcf5",
      "name": "Sticky Note2",
      "type": "n8n-nodes-base.stickyNote",
      "position": [
        1360,
        -112
      ],
      "parameters": {
        "width": 640,
        "height": 400,
        "content": "## 2. Fetch MCP Server list and format them\nWe fetch 5,000 MCP servers from the PulseMCP directory and parse them into documents to pass to the Contextual AI Reranker."
      }
    },
    {
      "id": "5c96dbde-fa89-4256-8850-95c5bfe21314",
      "name": "Sticky Note3",
      "type": "n8n-nodes-base.stickyNote",
      "position": [
        2064,
        -112
      ],
      "parameters": {
        "width": 816,
        "height": 400,
        "content": "## 3. Rerank the servers and display top five results\nWe use Contextual AI's reranker to re-rank the servers and identify the top 5 based on the user query and the reranker instruction, which is generated by the LLM in step 1."
      }
    },
    {
      "id": "2716d897-3417-45af-9820-3b6e68378419",
      "name": "User-Query",
      "type": "@n8n/n8n-nodes-langchain.chatTrigger",
      "position": [
        608,
        144
      ]
    },
    {
      "id": "8e59faac-6e36-43ec-9ef3-4169ab498065",
      "name": "LLM Agent for Decision-Making",
      "type": "@n8n/n8n-nodes-langchain.agent",
      "position": [
        832,
        144
      ]
    },
    {
      "id": "3afade9b-c6ba-45d3-8011-1aa02bcf5ab3",
      "name": "PulseMCP Fetch MCP Servers",
      "type": "n8n-nodes-base.httpRequest",
      "position": [
        1408,
        144
      ]
    },
    {
      "id": "dfb80dc9-dff8-4869-b160-f2e6b7e35742",
      "name": "Final Response1",
      "type": "@n8n/n8n-nodes-langchain.chat",
      "position": [
        1408,
        336
      ]
    },
    {
      "id": "6b2fa6bb-79d3-4fc6-b0d6-7a04fe126662",
      "name": "Parse MCP Server list into documents w metadata",
      "type": "n8n-nodes-base.code",
      "position": [
        1856,
        64
      ]
    },
    {
      "id": "f6249154-124d-4e53-82c1-d18c078a0a12",
      "name": "Format the top 5 results",
      "type": "n8n-nodes-base.code",
      "position": [
        2528,
        64
      ]
    },
    {
      "id": "8053044b-27fa-45b6-bd15-42d327dc7452",
      "name": "Final Response2",
      "type": "@n8n/n8n-nodes-langchain.chat",
      "position": [
        2752,
        64
      ]
    },
    {
      "id": "830a8333-3aa2-482c-b8a0-c05a978f2d81",
      "name": "Rerank documents",
      "type": "n8n-nodes-contextualai.contextualAi",
      "position": [
        2112,
        144
      ]
    }
  ],
  "connections": {
    "If": {
      "main": [
        [
          {
            "node": "PulseMCP Fetch MCP Servers",
            "type": "main",
            "index": 0
          },
          {
            "node": "Merge",
            "type": "main",
            "index": 1
          }
        ],
        [
          {
            "node": "Final Response1",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Merge": {
      "main": [
        [
          {
            "node": "Parse MCP Server list into documents w metadata",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Merge1": {
      "main": [
        [
          {
            "node": "Format the top 5 results",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "User-Query": {
      "main": [
        [
          {
            "node": "LLM Agent for Decision-Making",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Rerank documents": {
      "main": [
        [
          {
            "node": "Merge1",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "OpenAI Chat Model": {
      "ai_languageModel": [
        [
          {
            "node": "LLM Agent for Decision-Making",
            "type": "ai_languageModel",
            "index": 0
          }
        ]
      ]
    },
    "Format the top 5 results": {
      "main": [
        [
          {
            "node": "Final Response2",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "PulseMCP Fetch MCP Servers": {
      "main": [
        [
          {
            "node": "Merge",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "LLM Agent for Decision-Making": {
      "main": [
        [
          {
            "node": "If",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Parse MCP Server list into documents w metadata": {
      "main": [
        [
          {
            "node": "Merge1",
            "type": "main",
            "index": 1
          },
          {
            "node": "Rerank documents",
            "type": "main",
            "index": 0
          }
        ]
      ]
    }
  }
}