Empowering Developers with the Finch MCP Server

May 1, 2025

Finch’s MCP Server lets developers connect LLMs to the HR and payroll data they access through Finch, powering new use cases, efficiency gains, and better user experiences.

Model Context Protocol, or MCP, is the latest lightning rod in the AI conversation, and today we’re excited to announce the Finch MCP Server. This will allow developers to connect the employer data provided by the Finch API with large language models (LLMs).

MCP is an emerging standard designed to bridge the gap between LLMs and data sources. It’s so new that it remains to be seen exactly when or how developers will ultimately choose to use the technology. The way we see it, revolutionizing employment is about much more than standardized data access — it’s also about empowering application developers to do more with that data. The Finch MCP Server is another tool in the developer’s arsenal that has the potential to power new use cases, unlock efficiency gains, and provide a best-in-class user experience.

What is Model Context Protocol?

Model Context Protocol (MCP) is an open standard protocol that was open-sourced by Anthropic. It defines a structured way for AI agents to interact with data and tools, so that LLMs can read data from and write back to connected applications.

Put simply, MCP acts as a bridge that lets an LLM access the permissioned data that Finch provides from employers’ HR and payroll systems. 

How does Model Context Protocol work?

At its core, MCP is made up of several key components:

  • MCP hosts: These are programs, usually LLMs or AI agents, that need to access data from an external source. 
  • MCP clients: These are the protocol clients that maintain 1:1 connections with MCP servers.
  • MCP servers: These are programs that expose specific capabilities through the standardized protocol. Finch’s MCP Server acts as the bridge that allows an LLM to read from and write to the HR and payroll systems that are connected via our unified API.
  • Local data sources: These are files and databases on the same machine as the MCP server that it can access directly.
  • Remote services: These are external systems available over the internet, typically through APIs, that the MCP server connects to. In this case, the Finch API is the remote service through which an agent can perform actions like fetching employee contribution data or modifying payroll deductions.
A visual diagram showing the data flow from left to right: “Your LLM,” followed by icons labeled “MCP Client,” “Finch MCP Server” (highlighted in blue), and “Finch API.” The diagram illustrates how an LLM connects to HR and payroll data through the Finch MCP Server.
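To make the flow in the diagram concrete, here is a toy, in-process sketch of the two JSON-RPC methods at the heart of the protocol: `tools/list`, which the host uses to discover a server’s capabilities, and `tools/call`, which invokes one of them. The tool name and response payload below are hypothetical illustrations, not Finch’s actual MCP interface.

```python
import json

# Hypothetical tool catalog an MCP server might advertise.
TOOLS = [
    {
        "name": "get_pay_statements",
        "description": "Fetch pay statements for a list of payment IDs.",
        "inputSchema": {
            "type": "object",
            "properties": {
                "payment_ids": {"type": "array", "items": {"type": "string"}}
            },
            "required": ["payment_ids"],
        },
    }
]

def handle(request: dict) -> dict:
    """Dispatch a JSON-RPC 2.0 request the way an MCP server would."""
    if request["method"] == "tools/list":
        result = {"tools": TOOLS}
    elif request["method"] == "tools/call":
        args = request["params"]["arguments"]
        # A real server would call the remote API here; we return a stub.
        result = {"content": [{"type": "text",
                               "text": json.dumps({"fetched": args["payment_ids"]})}]}
    else:
        return {"jsonrpc": "2.0", "id": request["id"],
                "error": {"code": -32601, "message": "Method not found"}}
    return {"jsonrpc": "2.0", "id": request["id"], "result": result}

# The host first discovers the available tools, then calls one:
listing = handle({"jsonrpc": "2.0", "id": 1, "method": "tools/list"})
call = handle({"jsonrpc": "2.0", "id": 2, "method": "tools/call",
               "params": {"name": "get_pay_statements",
                          "arguments": {"payment_ids": ["pay_123"]}}})
```

The key design point is that the host never hardcodes the server’s capabilities: everything the LLM needs in order to choose and invoke a tool arrives at runtime through `tools/list`.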

MCP is quickly gaining traction because LLMs are only as useful as the information they have access to. MCP provides a standardized way to connect models to real-time, structured data sources. That means developers and businesses can now go beyond chat-based experiments and begin operationalizing AI for real use cases — especially in complex, compliance-heavy industries like employment tech.

How is MCP different from an API?

At a glance, both APIs and MCP are about connecting systems, but they serve different functions.

An API is a standardized way for software systems to request and exchange data. Finch’s unified API, for example, allows platforms to pull structured data from hundreds of payroll and HR systems through a single integration.

MCP, on the other hand, acts as an interface layer for LLM-based applications. It allows AI agents to understand what tools are available, how to use them, and how to interpret the data they receive. MCP builds on the power of APIs by making them usable by autonomous agents — like AI copilots — without needing to manually provide the latest data as a file or write custom logic.
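The “interface layer” idea can be sketched in a few lines: instead of hand-writing request logic, the developer publishes a machine-readable tool description and the agent fills in the arguments itself. The tool name and schema fields below are illustrative assumptions, not Finch’s published specification.

```python
# A machine-readable tool description the LLM can discover and reason about.
tool = {
    "name": "retrieve_many_hris_pay_statements",
    "description": "Fetch pay statements for one or more payments.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "payment_ids": {
                "type": "array",
                "items": {"type": "string"},
                "description": "Payment IDs to fetch statements for.",
            }
        },
        "required": ["payment_ids"],
    },
}

def validate_call(tool: dict, arguments: dict) -> list:
    """Return any required fields missing from the agent's arguments."""
    required = tool["inputSchema"].get("required", [])
    return [field for field in required if field not in arguments]

missing = validate_call(tool, {})                          # agent omitted the IDs
ok = validate_call(tool, {"payment_ids": ["pay_123"]})     # well-formed call
```

Because the description and schema travel with the tool, an agent can discover what’s available and how to call it correctly, with no custom glue code per integration.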

Model Context Protocol use cases

When paired with Finch’s unified employment data API, MCP unlocks a new category of intelligent, automated workflows in HR, payroll, and benefits administration. This can include:

  • Surfacing insights from payroll data on demand — no spreadsheet wrangling required
  • Enabling employer-facing or employee-facing chatbots that can perform real-time data lookups
  • Simplifying internal operations for benefits administrators, brokers, and payroll providers
  • Powering AI copilots inside existing HR software platforms
  • Performing ad-hoc audits on historical payroll and employment data

From compliance automation to proactive financial planning, the possibilities are enormous — and we’re just scratching the surface.

A simulated AI interaction where a user requests a table of each employee’s year-to-date benefit contributions. The table displays five employee names alongside columns for employee deductions and employer contributions. An LLM icon appears in the corner, running a query labeled “retrieve_many_hris_pay_statements from finch_api.”

Examples of using Model Context Protocol and Finch

There are endless possibilities when it comes to connecting AI agents to Finch via MCP — many of which we haven’t even thought of yet. Here are a few compelling examples that are already possible today:

Generate reports in seconds

With Finch’s read capabilities, AI agents can do more than retrieve data — they can synthesize it instantly. Imagine a retirement TPA using an AI agent to generate a detailed census report in seconds. The agent could be instructed to prepare a report for a specific company with each employee’s name, date of birth, hire and termination dates, and YTD earnings and benefits contributions. Using MCP, the agent can fetch the necessary data and return a complete, structured report. The same process can be repeated for any number of companies, enabling users to create these detailed census reports quickly, and at scale.
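A minimal sketch of the census-report step is a join of per-employee profile data with YTD pay totals into report rows. The field names below are simplifying assumptions about the shape of the data an agent would pull through Finch, not the API’s exact response schema.

```python
# Hypothetical, simplified data an agent might have fetched.
directory = [
    {"id": "emp_1", "name": "Ada Lovelace", "dob": "1815-12-10",
     "hire_date": "2021-03-01", "termination_date": None},
]
ytd_totals = {"emp_1": {"gross_pay": 84000.00, "benefit_contributions": 5200.00}}

def build_census_report(directory, ytd_totals):
    """Join profile records with YTD totals into census-report rows."""
    rows = []
    for person in directory:
        totals = ytd_totals.get(person["id"], {})
        rows.append({
            "name": person["name"],
            "dob": person["dob"],
            "hire_date": person["hire_date"],
            "termination_date": person["termination_date"],
            "ytd_gross_pay": totals.get("gross_pay", 0.0),
            "ytd_benefit_contributions": totals.get("benefit_contributions", 0.0),
        })
    return rows

report = build_census_report(directory, ytd_totals)
```

The value of the agent here is not the join itself, which is routine, but that it can repeat this fetch-and-assemble loop across any number of companies on request.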

Support AI-assisted search

Suppose a payroll administrator wants to review year-to-date contributions for each employee across different benefit types. With Finch connected via MCP, an AI agent could respond to a query like: 

“Show me a table with each employee’s YTD benefits contributions, including employer and employee shares, broken out by benefit type.”

The agent would fetch the relevant data using Finch’s API, then return a formatted table, streamlining a task that would otherwise require manual data aggregation.
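The aggregation behind that query amounts to rolling up employee and employer contributions per employee per benefit type from individual pay-statement line items. The record shape below is a simplifying assumption for illustration.

```python
from collections import defaultdict

# Hypothetical pay-statement line items an agent might have fetched.
line_items = [
    {"employee_id": "emp_1", "benefit_type": "401k", "employee": 500.0, "employer": 250.0},
    {"employee_id": "emp_1", "benefit_type": "401k", "employee": 500.0, "employer": 250.0},
    {"employee_id": "emp_1", "benefit_type": "medical", "employee": 120.0, "employer": 380.0},
]

def ytd_by_benefit(line_items):
    """Sum employee and employer shares per (employee, benefit type)."""
    totals = defaultdict(lambda: {"employee": 0.0, "employer": 0.0})
    for item in line_items:
        key = (item["employee_id"], item["benefit_type"])
        totals[key]["employee"] += item["employee"]
        totals[key]["employer"] += item["employer"]
    return dict(totals)

table = ytd_by_benefit(line_items)
```

The LLM’s contribution is translating the natural-language request into the right data fetches and grouping, then rendering the result as a readable table.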

Customer-facing chatbots

B2B platforms can use MCP to create intelligent chatbots for their own customers. For example, an employee rewards and recognition platform could enable client HR managers to ask: 

“Which employees in our New York office were recognized the most in the last 6 months?”

The chatbot, powered by an LLM connected through MCP to Finch, could retrieve and summarize the relevant engagement data, making the platform more interactive, valuable, and sticky.

Benefits of using Model Context Protocol

By introducing MCP support via our MCP Server, Finch is empowering developers to build the next generation of AI-powered tools. Some of the core benefits include:

  • Standardized protocol — Because MCP is a standard, LLMs can learn one consistent way to interact with many types of systems, letting developers introduce AI workflows quickly.
  • Simplified build — Instead of manually coding logic for every action, developers can define tools once and let AI agents handle requests dynamically, reducing development time and increasing scalability.
  • Natural language processing — LLMs are adept at interpreting natural language and restructuring it as necessary. By equipping LLMs with the required tools, users can instruct the LLM to write data into external systems in a prescribed way, even while using conversational terms.
  • Efficiency gains — MCP-powered agents can automate repetitive workflows, perform bulk actions, and surface insights faster than traditional interfaces, boosting productivity for teams across the board.
  • Data freshness and strengthened compliance — By allowing an LLM to access data directly from Finch’s API, users can be confident the LLM is always presenting the most up-to-date information in a single, unified format, reducing the risk of stale data influencing AI outputs.

How does MCP impact data security?

Because Finch handles highly sensitive data, we employ several security protocols within our API to protect sensitive information:

  • Employer data accessed through Finch is always encrypted in transit and at rest. Finch is SOC 2 and CCPA compliant.
  • Finch will never access data without the explicit consent of and authorization from the employer.
  • Employers must also give their consent and authorization before their data is shared with third-party applications, such as benefit administrators, fintech apps, HR tech platforms, and so on.

For Finch customers, MCP represents another way to interact with Finch’s API by using LLMs on your own data. Whether or not you choose to use Finch’s MCP Server is completely at your discretion, and organizations that do not allow working with open or externally hosted LLMs should not adopt this feature. If you’re not sure of your organization’s stance on using LLMs, be sure to speak with your Security team before adopting MCP.

We recommend that developers who do decide to adopt MCP exercise caution when deciding which LLMs to share data with, including careful review of the LLM’s usage policies.

Ready to start building with the Finch MCP Server?

The Finch MCP Server makes it easier than ever to connect large language models to real-time employment data. Finch customers can get started with our MCP Server today by visiting our Developer Docs.

Not a Finch customer yet? Book a call with our Sales team to learn what Finch can do for your business.

97% of HR professionals say it’s important for your app to integrate with their employment systems

Learn more in our State of Employment Technology report ->


