In the world of artificial intelligence and data analytics, we are constantly looking for ways to make AI tools more effective. One of the biggest challenges is connecting AI systems to existing data sources and tools. The Model Context Protocol (MCP) offers an elegant solution for this. In this blog post, we take you through our experience implementing MCP at Blenddata and show how this protocol is revolutionising the way AI can work with data.
The Model Context Protocol (MCP) is a standardised way to give Large Language Models (LLMs) direct access to tools, APIs, databases, documentation and other data sources. Instead of having to copy and paste information, the AI can directly use these tools and sources to get context and perform tasks. This opens up a world of new possibilities.
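To make that tangible: an MCP server is just a small program that exposes tools over the protocol. Below is a minimal sketch using the official Python SDK (the `mcp` package with its FastMCP helper); the server name, the `lookup_customer` tool and its dummy return value are invented purely for illustration.

```python
# Minimal sketch of an MCP server, assuming the official Python SDK ("mcp" package).
# The tool name and its dummy logic are illustrative, not a real Blenddata tool.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("internal-docs")

@mcp.tool()
def lookup_customer(customer_id: str) -> str:
    """Return basic information about a customer (dummy implementation)."""
    # In a real server this would query an internal database or API.
    return f"Customer {customer_id}: example record"

if __name__ == "__main__":
    # FastMCP communicates over stdio by default, which is what most MCP clients expect.
    mcp.run()
```

Any MCP-capable client (Claude Desktop, Cursor, and so on) can then list and call this tool on behalf of the LLM, without any copy-pasting in between.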
A common problem at companies is that valuable information and tools are available, but not accessible to AI tools. Employees have to look up information manually and copy and paste it to give the AI the input it needs. This takes time and hampers productivity. MCP solves this by connecting your LLM directly to your existing tooling and data sources.
MCP is important for three core reasons:
Concrete examples:
At Blenddata, we saw this as an opportunity to make our workflows more accessible and improve the development experience by integrating MCP into our daily processes.
During our research into letting LLMs work with our data, MCP emerged as the best fit. We wanted to explore how an LLM could work with the data from our pipelines, and MCP offered exactly what we were looking for: a standardised way to connect AI tools to data sources.
Our implementation in 5 steps:
Step 1: Setting up data context with dbt MCP
Step 2: Database access via PostgreSQL MCP (a configuration sketch for these first two steps follows the step list)
Step 3: Selecting test data
Step 4: Configure AI workflow
Step 5: Model comparison and client development
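The full setup is beyond the scope of this post, but conceptually steps 1 and 2 come down to registering the dbt and PostgreSQL MCP servers in your client's MCP config (mcp.json). The sketch below shows roughly what that can look like; the package names (dbt Labs' dbt-mcp via uvx, the reference @modelcontextprotocol/server-postgres via npx), the DBT_PROJECT_DIR variable and the connection string are assumptions based on those projects' documentation, so copy the exact values from their READMEs.

```json
{
  "mcpServers": {
    "dbt": {
      "command": "uvx",
      "args": ["dbt-mcp"],
      "env": {
        "DBT_PROJECT_DIR": "/path/to/your/dbt/project"
      }
    },
    "postgres": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres",
        "postgresql://readonly_user:password@localhost:5432/analytics"
      ]
    }
  }
}
```

With something like this in place, the LLM can request dbt model documentation and run read-only queries against the database without any manual copy-pasting.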
Our test results show that MCP can be extremely useful in development scenarios, but additional steps are needed for dashboard end products.
Our 4 key insights:
✅ Works well:
⚠️ Challenges:
🔒 Security guidelines:
Want to get started with MCP yourself? We’ll show you how to build your own MCP setup in just a few steps. We use Notion as an example, but you can apply the same process to other MCP servers.
Step 1: Download Cursor IDE Download Cursor IDE from cursor.com/downloads. This is the MCP client we will use.
Step 2: Choose an MCP Server For this tutorial, we will use the Notion MCP server from makenotion/notion-mcp-server. This is a good choice because Notion is accessible to most people.
Step 3: Set up Docker Desktop Make sure Docker Desktop is up to date. We are going to use the Docker MCP Toolkit (beta) to simplify installation. You can also do this manually via the MCP config file (mcp.json) that most clients have.
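For reference, manually registering the Notion MCP server in such an mcp.json looks roughly like the sketch below. We are assuming the npx package @notionhq/notion-mcp-server and the OPENAPI_MCP_HEADERS environment variable as described in the makenotion/notion-mcp-server README; check that README for the current variable names and token format.

```json
{
  "mcpServers": {
    "notion": {
      "command": "npx",
      "args": ["-y", "@notionhq/notion-mcp-server"],
      "env": {
        "OPENAPI_MCP_HEADERS": "{\"Authorization\": \"Bearer ntn_your_token_here\", \"Notion-Version\": \"2022-06-28\"}"
      }
    }
  }
}
```

In this tutorial we take the Docker route instead, which handles this configuration for you.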
Step 4: Add the Notion MCP server In Docker Desktop, go to the MCP toolkit and add the Notion MCP server. This process is visual and user-friendly.
Step 5: Configure the Notion MCP server You will need a Notion API token. Go to notion.so/profile/integrations to create it. Follow the instructions on the Notion MCP GitHub page for the exact steps.
Step 6: Connect Cursor to the MCP toolkit In the MCP toolkit, go to the Clients tab and click “Connect” next to Cursor. This adds the Docker MCP toolkit to Cursor’s mcp.json.
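After connecting, Cursor’s mcp.json should contain a single gateway entry for the toolkit, roughly like the sketch below. The exact key name and arguments are written by Docker and may differ per toolkit version; this is only an indication of what to look for. All MCP servers you enable in the toolkit are routed to Cursor through this one entry.

```json
{
  "mcpServers": {
    "MCP_DOCKER": {
      "command": "docker",
      "args": ["mcp", "gateway", "run"]
    }
  }
}
```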
Step 7: Restart Cursor (optional) Restart Cursor to make sure everything is loaded correctly.
Step 8: Check the connection Go to File > Preferences > Cursor Settings > Tools & Integrations to see if the MCP server is connected correctly.
Step 9: Test it out! Start a new chat (Ctrl + I) and try it out. You can now ask questions about your Notion data and the AI has direct access to your Notion workspace.
Sample prompts to try:
Our recommendations for beginners
Tip: Using MCP servers to give your AI assistant context is ideal for helping new team members quickly understand complex projects. For example: “I’m going to work on project X, help me understand the data models.”
Blenddata is a data engineering consultancy that specialises in building reliable data pipelines and modern data architectures. We help companies make their data accessible and usable.
With our experience in MCP, we can help your organisation start using AI tools. Whether it’s dbt documentation, databases or other sources, we can help you harness the power of AI in your day-to-day work.
Or if you are a developer curious about the possibilities, contact us!