Show HN: Metorial (YC F25) – Vercel for MCP

github.com

57 points by tobihrbr a day ago

Hey HN! We're Wen and Tobias, and we're building Metorial (https://metorial.com), an integration platform that connects AI agents to external tools and data using MCP.

The Problem: While MCP works great locally (e.g., Cursor or Claude Desktop), server-side deployments are painful. Running MCP servers means managing Docker configs, per-user OAuth flows, scaling concurrent sessions, and building observability from scratch. This infrastructure work turns simple integrations into weeks of setup.

Metorial handles all of this automatically. We maintain an open catalog of ~600 MCP servers (GitHub, Slack, Google Drive, Salesforce, databases, etc.) that you can deploy in three clicks. You can also bring your own MCP server or fork existing ones.

For OAuth, just provide your client ID and secret and we handle the entire flow, including token refresh. Each user then gets an isolated MCP server instance configured with their own OAuth credentials automatically.

What makes us different is that our serverless runtime hibernates idle MCP servers and resumes them with sub-second cold starts while preserving the state and connection. Our custom MCP engine is capable of managing thousands of concurrent connections, giving you a scalable service with per-user isolation. Other alternatives either run shared servers (security issues) or provision separate VMs per user (expensive and slow to scale).

Our Python and TypeScript SDKs let you connect LLMs to MCP tools in a single function call, abstracting away the protocol complexity. But if you want to dig deep, you can just use standard MCP and our REST API (https://metorial.com/api) to connect to our platform.
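
To make the standard-MCP route concrete, here is roughly what connecting an agent to a hosted MCP server looks like with the official MCP TypeScript SDK. The server URL and auth header below are placeholders, not real endpoints:

  // Sketch: talk to a hosted MCP server over Streamable HTTP using the
  // official MCP TypeScript SDK. The URL and Authorization header are placeholders.
  import { Client } from "@modelcontextprotocol/sdk/client/index.js";
  import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

  const transport = new StreamableHTTPClientTransport(
    new URL("https://example-mcp-host.invalid/servers/github/mcp"), // placeholder
    { requestInit: { headers: { Authorization: `Bearer ${process.env.MCP_API_KEY}` } } }
  );

  const client = new Client({ name: "my-agent", version: "1.0.0" });
  await client.connect(transport);

  // Discover the server's tools, then call one with tool-specific arguments.
  const { tools } = await client.listTools();
  const result = await client.callTool({ name: tools[0].name, arguments: {} });
  console.log(result);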

You can self-host (https://github.com/metorial/metorial) or use the managed version at https://metorial.com.

So far, we see enterprise teams using Metorial as a central integration hub for tools like Salesforce, while startups use it to cut weeks of infra work when building AI agents with integrations.

Demo video: https://www.youtube.com/watch?v=07StSRNmJZ8

Our Repos: Metorial: https://github.com/metorial/metorial, MCP Containers: https://github.com/metorial/mcp-containers

SDKs: Node/TypeScript: https://github.com/metorial/metorial-node, Python: https://github.com/metorial/metorial-python

We'd love to hear feedback, especially if you've dealt with deploying MCP at scale!

cgijoe a day ago

Oh my lord, your timing is perfect. I need this so badly right now. Congrats on the launch, and wow, thank you for making your MCP containers available separately!

  • tobihrbr a day ago

    Haha, good thing we launched today. Thank you so much for the encouraging words!

solumos 21 hours ago

The distinction between "Vercel for MCP [integrations]" and "Vercel for MCP [servers]" is meaningful — maybe "Zapier for MCP" is a more appropriate "X for Y"?

Congrats on the launch!

  • tobihrbr 21 hours ago

    That's a really interesting point. We've actually been discussing this quite a bit. We felt like putting the emphasis on the "dev tool" aspect (like Vercel) makes more sense, but the way you put it, we might want to reconsider that. Thanks for your interest!

rancar2 20 hours ago

I like the license (FSL) chosen for the project, but it may need some explaining for others. Can you comment on the decision to select the Functional Source License (Version 1.1, ALv2 Future License), and the Metorial team's intent with it, including any restrictions on potential commercial use of the platform (i.e., free-to-paid without notice)?

For those who aren't aware of what FSL (https://fsl.software/) is: "The Functional Source License (FSL) is a Fair Source license that converts to Apache 2.0 or MIT after two years. It is designed for SaaS companies that value both user freedom and developer sustainability. FSL provides everything a developer needs to use and learn from your software without harmful free-riding."

  • tobihrbr 20 hours ago

    Thanks for pointing that out. Ultimately, we wanted to strike a balance between being fair and open to the community, welcoming contributions, and ensuring that people can self-host without having to worry about licensing issues, while also ensuring that Metorial, as a company, can exist and work on OSS sustainably. This isn't easy and I don't think there's a right answer. To us, FSL strikes a pretty good balance: it allows the community to use and participate while ensuring that Metorial makes sense as a business as well.

fsto 21 hours ago

We’ve just begun implementing Composio. Would love to reconsider if you can help clarify the main differences. From my perspective, it looks like you have more robustness features for me as a developer and you’re fully open source (not just the client), whereas Composio has more integrations. But would love your input to clarify. Congrats on the launch!

  • wenyers 16 hours ago

    Wen here, the co-founder. I actually took a couple of hours today to give you a comprehensive answer.

    1. As you said, Composio doesn’t allow self-hosting and the source code isn’t available. We want to follow PostHog’s playbook by letting devs run everything on their own infrastructure and open-sourcing all our MCP containers.

    2. A huge benefit of this approach is that we can let you fork any MCP server through our dashboard, so you can manage it yourself and make any adjustments you might need. We’ve repeatedly heard from our enterprise customers how important this is.

    3. I do believe that we offer more robustness features, like environment provisioning, deployment versioning, server pooling, in-depth logs of server startup, as well as a complete trace of the entire MCP session.

    4. On the integrations side, Composio does indeed have more integrations right now, but we already have around 600 MCP servers (all with multiple tools, of course), many of which we modify every day to make them better. Since we support open source contributions, the catalog also grows with the community. (Quick note: you can have private servers scoped to your org.)

    5. I tried to benchmark our architecture vs. Composio’s in terms of speed. As we mentioned in the post above, one thing we spent a lot of time on was optimizing how fast we can do serverless with MCP servers. However, since Composio has neither source available nor any technical documentation on how they handle their servers, I couldn’t actually find any information on their architecture. One thing they enforce by default is a meta-tool layer with tools like composio_search_tools and composio_execute_tool. Assuming that this is a long-living process, I still found that our implementation returned a list_tools response quicker (including the cold start time). If you factor in the time it takes for them to find the right tools, their response took close to double the time. While we might explore a similar meta-tool layer as an optional MCP server in the future, we do seem to, on average, have a better architecture in terms of speed, though the benchmarking was not entirely rigorous. (I am also unable to say how they handle multiple users connecting to one MCP server with different OAuth configs because they don’t share that information.) I plan on making a more rigorous comparison in a blog post soon, also comparing to hosting on Vercel, Cloudflare, etc.
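
    If anyone wants to reproduce a rough version of this comparison, a minimal cold-start probe with the official MCP TypeScript SDK looks something like the sketch below (illustrative only, not the exact harness behind the numbers above):

      // Rough latency probe: time a cold connect plus a list_tools round trip
      // against a remote MCP server. Sketch only.
      import { Client } from "@modelcontextprotocol/sdk/client/index.js";
      import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

      async function timeListTools(url: string): Promise<number> {
        const start = performance.now();
        const client = new Client({ name: "latency-probe", version: "1.0.0" });
        await client.connect(new StreamableHTTPClientTransport(new URL(url)));
        await client.listTools();
        await client.close();
        return performance.now() - start;
      }

      // Usage: run the same probe against each provider's server URL.
      // console.log(await timeListTools("https://example-provider.invalid/mcp"));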

    Let me know if you have any follow up questions.

    If you want to talk more, please feel free to DM me on LinkedIn (https://www.linkedin.com/in/karim-rahme/) or X (https://x.com/wen_rahme).

    • fsto 10 hours ago

      Wow, thanks a LOT for that comprehensive answer! Very helpful!

      Two questions I didn’t manage to find answers to:

      * Do you have, or plan to add, support for webhooks? The scenario for us is that we’d ideally have one platform for setting up customer integrations, which we’d make requests to and await requests from.

      * When you host, do you expose the access and refresh tokens for the connected integrations? The use cases for us are:

        * If we wanna build a feature / make a request that seems out of scope for Metorial.

        * If we wanna migrate away from Metorial, we don’t want to force our customers to have to reconnect.

      * I love that we can bring our own OAuth apps, which would be the default for us. But to try an integration out, or for (from our perspective) low-prio integrations we’d still like to offer - do you offer your own OAuth apps that we can piggyback on? Just to save the customer the effort of having to set up an OAuth app for each service. I know it comes with lock-in, but it’s worth it in some cases for us.

      You’ve made me very excited to try you out, so I’ll implement support for both Composio and Metorial.

      Thanks again for taking the time and efforts to answer so thoroughly!

      Sent a connection request to you on LinkedIn.

TOMDM 13 hours ago

I think a nod to how auth is handled would be well worth adding to the README.

Knowing it can integrate with APIs is great, but knowing how a consumer of MCP interacts with auth, and how you do so with downstream APIs, would be very welcome.

  • tobihrbr 13 hours ago

    That's a great idea. We'll add that ASAP. More broadly, we're working on documentation that explains Metorial and its sub-components in more detail.

    Just for context, it's as simple as 1) creating an OAuth Session (https://metorial.com/api/oauth-session), which includes a URL that you 2) pass on to your users to authenticate at, and you're done.
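
    Here's a rough sketch of that flow in TypeScript. The endpoint path, request fields, and response shape are placeholders to illustrate the idea rather than our exact API, so please check the docs linked above:

      // Illustrative two-step flow: create an OAuth session, then send the
      // user to the returned URL. Endpoint path and fields are placeholders.
      async function createOAuthSessionUrl(apiKey: string, deploymentId: string): Promise<string> {
        const res = await fetch("https://api.example-host.invalid/oauth-sessions", {
          method: "POST",
          headers: {
            Authorization: `Bearer ${apiKey}`,
            "Content-Type": "application/json",
          },
          body: JSON.stringify({ serverDeploymentId: deploymentId }), // hypothetical field
        });
        if (!res.ok) throw new Error(`OAuth session creation failed: ${res.status}`);

        const session = await res.json();
        // Hand session.url to the end user; once they authorize, their isolated
        // MCP server instance is configured with their OAuth credentials.
        return session.url as string;
      }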

ushakov a day ago

congrats on the launch!

why do I need a specialized platform to deploy MCP instead of just hosting on existing PaaS (Vercel, Railway, Render)?

also if you're not using VMs, how do you isolate per-user servers?

  • tobihrbr a day ago

    Great questions!

    If you want to run your own remote servers (for your product/company), Railway or Render work great (Vercel is a bit more difficult since Lambdas are very expensive if you run them over long periods of time). Metorial targets developers who build their own AI agents and want to connect them to integrations. Plainly, we do a lot more than running MCP servers: we give you monitoring and observability, handle consumer-facing OAuth, and give you super nice SDKs to integrate MCP servers with your agent.

    Regarding the second question, Metorial has three execution modes depending on what the server supports: 1) Docker - the most basic one, which any MCP server should support. We did some heavy optimizations to get those to start as fast as possible, and our hibernation system supports stopping and resuming them while restoring the state. 2) Remote MCP - we connect to remote MCP servers for you, while still giving you the same features and ease of integration you get with any Metorial server (I could go into more detail on how our remote servers are better than standard ones). 3) Servers on our own Lambda-based runtime - while not every MCP server supports this execution mode, it's what really sets us apart. The Lambdas only run for short intervals, while the connection is managed by our gateway. We already have about 100 Lambda-based servers and are working on getting more onto that execution model.

    There's a lot about our platform that I haven't included here, like our stateful MCP proxy, our security model, our scalable SOA, and how we transform OAuth into a single REST API call for our users.

    Let me know if you have any additional questions, always happy to talk about MCP and software architecture.

    • ushakov 19 hours ago

      thanks for explaining, especially the runtimes part!

      i am currently running Docker MCP Containers + MCP Gateway mixed with Remote MCPs in microVMs (aka. Sandboxes).

      seems to be the most portable setup, so you don't have to worry about dealing with different exec tools like uvx, poetry, bun, npx and the whole stdio/streamable http conversion.

      lambdas sound interesting, esp. if you have figured out a way to make stateful work stateless, but it comes with the downside that you have to maintain all the integrations yourself + the environment itself might have compatibility issues. i've seen someone also using cloudflare dynamic workers for a similar use-case (disco.dev), but they're maintaining all the integrations by hand (or with Claude Code rather). a more extreme version of this would be writing a custom integration specific to the user by following a very strict prompt.

      anyways, i'll look into Metorial as i'm curious about how the portable runtimes work.

      i am also maintaining a list of MCP gateways, just added you there as well: https://github.com/e2b-dev/awesome-mcp-gateways

      thanks for building this, looking forward to checking it out!

      • tobihrbr 19 hours ago

        Thanks for sharing and adding us to your list. The point about the lambdas is fair, though we do support other execution modes to combat this. Please let me know if you have any feedback or encounter hiccups :)

langitbiru a day ago

I wrote a book about MCP: https://leanpub.com/practical-mcp

I'm considering adding more chapters to the book: security, easy deployment, etc. So, I may look into your solution. I believe there are other players also, like Klavis AI, FastMCP and some MCP startups that I cannot remember.

Congratz!

  • electric_muse 13 hours ago

    I was wondering when books were going to start popping up. Looking forward to reading.

    Have you written about MCP gateways for helping companies route all MCP traffic through one plane for observability, security, and compliance? Happy to chat through that. I just recorded an end to end demo of what we are working on: https://vimeo.com/1127330739/ee1fe5245b

    • langitbiru 7 hours ago

      Not yet. I'm still considering what content to add to the existing book. I'll likely (90% probability) include security and easy deployment. But since you said that, I'll consider the observability and compliance topics.

  • Eldodi 7 hours ago

    Alpic.ai also provides a really sweet solution for simple MCP hosting, with MCP analytics and DCR delegation for OAuth.

  • tobihrbr a day ago

    Thanks so much! I'll definitely check out your book. Always happy to talk MCP :)

samgutentag a day ago

mitochondria is the powerhouse of the cell