Awesome. Thanks, Elena. Yeah, so as Elena said, right now I'm leading DX and AI at Supabase, so I'm super stoked to chat more about the MCP launch that Copple mentioned this morning. And also, because I like to torture myself, I might attempt a live demo at the end of this; someone convinced me right before the presentation to do it. So we'll see if we have time for that.
All right. So, back in April this year, we launched the initial version of our Supabase MCP server. This allowed you to connect your favorite AI agent, like Cursor, or maybe Windsurf at the time, with Supabase. It meant you could continue to use your favorite AI IDE while also giving it a full view into your database.
Practically speaking, you could do things like list tables, but maybe more usefully, you could use it to help you create new tables in your database, or modify columns, all while following best practices like storing these changes in database migrations.
To be honest, though, I think for me the real value of MCP in general, and Supabase MCP specifically, is that everything can now be in one place. As a developer, you no longer have to context switch, jumping between your IDE and the Supabase dashboard to get what you need and back. It's all centralized, which is awesome.
But there were some core features missing initially. You all said, "What the heck? We're missing edge functions." And of course, if you're building an app and the autogenerated built-in REST APIs don't do what you need, you're probably going to lean on edge functions to build the custom APIs your app needs. So we said, what the heck, you're right, and we added the ability for the LLM to create edge functions through MCP as well.
And this actually worked really well for a lot of people. As of the past couple of months, we're seeing over 20,000 monthly active users on our MCP server, so clearly this is the daily driver for a lot of folks, which is super awesome to see.
Now, of course, no good presentation gets through without a couple of challenges, right? So we'll start with challenge number one with our MCP server, and it's basically this command right here. Assuming some of you have actually tried using the MCP server, you're probably familiar with it. This command runs the server locally on your machine using Node.js, via npx. And it turns out this command is pretty inconsistent across different environments. First of all, you need to be running a specific Node.js version; that's what npx is managing under the hood, so having that set up properly is kind of a given. But at the same time, it turns out this runs differently across different operating systems. If you're a Windows user and you've tried to run a lot of these commands, you've probably hit walls: Windows goes about it a different way than macOS, which goes about it differently than Linux. It's just a big pain point in general, so much so that the number one issue on our MCP repo is literally this, to this day. There have just been tons and tons of reports of people having issues with this command. We went down a huge rabbit hole trying to get it to work in as many environments as we could, but it's hard to cover everything.
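For reference, the stdio-era setup amounted to an `mcp.json` entry along these lines (a sketch; the exact package name and flags here are assumptions, and the real setup also required passing a personal access token, which is challenge number two below):

```json
{
  "mcpServers": {
    "supabase": {
      "command": "npx",
      "args": ["-y", "@supabase/mcp-server-supabase@latest"]
    }
  }
}
```

Every client that supports stdio servers had to spawn this process itself, which is exactly where the Node.js version and operating-system differences bite.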
That's challenge number one. Challenge number two is personal access tokens, or authentication in general. How do you authenticate this MCP server, or your MCP client, with your Supabase account? To make it work in this context, you'd have to literally go to your Supabase dashboard, go into your settings, manually generate a personal access token, come back, and pop it into this CLI command argument. And on top of all that, you have to remember to never commit it to Git, because if you do, you're probably going to leak your credentials. Sadly, there were some cases of this happening, so security was not as good as it could be. And last but not least, you might have heard of this web app called ChatGPT. They actually added support for MCP as of last month, which is super exciting; it's available in what they call developer mode. But of course, this only works if you have a remote MCP URL. You can't pop in an npx command like this; that's not going to fly. So for all these reasons, we're super stoked, as of today, to release our remote MCP server.
Thank you. Yeah. And the best part is, it's as simple as using this URL right here. This is personally what I love about it compared to everything we just went through: you can copy and paste this into your client of choice, Cursor, Claude Code, whatever you prefer, and that's pretty much all you need to get started right off the bat. This also means that, for the first time, you can chat with your Supabase project directly from ChatGPT, and you can do this today. In this example, we're chatting with a table in my database, giving me exact insight into my Supabase project.
The other thing we added to the remote MCP server is what the spec calls MCP auth, which, if you've gone down this rabbit hole, is basically OAuth 2 under the hood with some extensions on top. To be honest, getting all the OAuth pieces working was more than half the work of getting remote MCP working. What does this really mean for you? It means no more personal access tokens, no more copying and pasting tokens, no more risk of committing them to source control. This demo uses Cursor: you just pop in the URL, hit the connect button, it pops you over to your browser, you grant access because you're already logged into your Supabase account, and then it pops back. The typical flow we're all used to. We've actually had our eyes on this flow since pretty much the very beginning, but of course it only works with remote MCP servers; it does not work with the local, standard IO approach.
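Client-side, the whole remote configuration collapses to a single URL entry (a minimal sketch; the exact URL is the one shown on screen, and the one below is an assumption):

```json
{
  "mcpServers": {
    "supabase": {
      "url": "https://mcp.supabase.com/mcp"
    }
  }
}
```

Note that no token appears anywhere in this file: on first connect, the client runs the OAuth flow in your browser instead.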
And this is the one I'm most excited about of them all: the ability to run MCP completely locally, on your local Supabase stack. This command here, supabase start, is nothing new. If you're familiar with it, it runs a full local development stack on your computer; under the hood it uses Docker to take all the Supabase services and run them in containers on your machine. This is our standard local-first approach. Up until now, there was really no good way to connect MCP with that, unless you just wanted to use something like the Postgres MCP server and make a direct Postgres connection. So, also as of today, you can run supabase start with the latest version of our CLI and you'll now get a new MCP URL at the bottom, which you can connect, like all the others, to Cursor, Claude Code, or whatever your preferred client is. And, I don't know why I was surprised by this, it's really fast. Of course, what's faster than local? But it shocked me just how nice it was to work with.
So, as a summary: if you're connecting to a hosted version of Supabase, you would use this URL, and if you're connecting to a local version, use this URL.
Okay. So that brings us to today. I'd also love to go through some other features we've added to the MCP server itself, in terms of tools and ways you can use it, that I hope you'll find useful. Feature number one is what we call feature groups: the ability to choose which tools you actually expose to your clients. This is useful for two reasons, or at least two reasons that I can think of. Reason number one: a lot of clients, like Cursor, will limit the number of tools you can have or MCP servers you can connect to. For Cursor at least that was a previous limitation, but in general, LLMs only have so much context when you're providing tools to them. By default, our MCP server exposes over 25 tools, so if, say, Cursor's limit were 40, you'd already be eating up a large share of the available tool budget. If you don't actually care about all those tools, and only want access to your database and maybe some debugging tools, you can scope that down and have more control.
The other reason this is useful is security. Say you never want your LLM to access edge functions on your project, but you do want to give it database access, or maybe vice versa: don't touch the database, but help me create these edge functions. You can come in here, choose just the edge functions group, and expose only those tools to the LLM. The ones you don't choose won't even be shown at all; the LLM will never have access.
Actually, what you're looking at right here is what we're calling our URL builder. It lives right in the dashboard: go to the Connect button at the top, then the MCP tab, and you can use it to set toggles like read-only mode and feature groups. It will build you a custom URL that you can copy and paste into your client. It also lets you scope access directly to a single project, so you can be confident the MCP server is only interacting with that project and not multiple projects.
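The builder's output is just the base URL with query parameters encoding your choices, something along these lines (the parameter names and values here are illustrative assumptions; use whatever the dashboard generates rather than hand-writing it):

```
https://mcp.supabase.com/mcp?project_ref=abcd1234&read_only=true&features=database,debugging
```

Because the scoping lives in the URL itself, two different clients can point at the same project with different tool sets just by using different URLs.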
Okay. The next thing we added is doc search. If anyone here has worked with LLMs, and I assume everybody has, you know that one of the biggest challenges is the knowledge cutoff. One thing that actually surprised me in the early days, back in 2023, was how good the LLMs already were with Supabase; they had pretty good knowledge even back then, though they weren't amazing. Fast forward to today, and I'd say a lot of the flagship LLMs are really great; they understand a lot of Supabase. But the problem is, they're never going to be fully up to date. There's always that knowledge cutoff, so they won't have the most up-to-date information. Say there's something we launched literally today and you want to start working with it immediately: your LLM isn't going to have knowledge of it and won't know how to work with it. To solve this, we created a tool called search docs. Under the hood, we built what we call our content API, a GraphQL-based API over our documentation, and the tool lets the LLM search the docs for current, always-up-to-date information. Under the hood, we're also using a hybrid search approach, so if you've gone down the rabbit hole of embeddings and keyword search, we're using that same technology from our docs for this tool.
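As a rough sketch, a search against the content API looks like an ordinary GraphQL query (the field names below are assumptions based on the description above; check the published schema for the real shape):

```graphql
query {
  searchDocs(query: "remote MCP server") {
    nodes {
      title
      href
      content
    }
  }
}
```

Because it's just GraphQL over the live docs, the results the LLM sees are as current as the documentation itself, with no retraining involved.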
Okay. Next is what we call advisors. The challenge here is: how do you know the LLM is actually following best practices? Especially with MCP, you can connect any LLM, from the best models out there to local models on your machine that maybe aren't so good. So how do you know they're actually producing high-quality SQL? The way to ground that is through what we call advisors. Advisors are actually something that already exists in Supabase; you can think of them as lints on top of your SQL, looking for best practices. Right now we have two types: performance advisors and security advisors. They'll actually scan your database and look for things like: are you creating RLS policies? Are you following best practices on your database? In this example, you can see "why is my query slow?" asked of ChatGPT. And instead of ChatGPT giving me a generic "your database might be slow because of this, this, and this," it's literally looking directly at my database and saying: you've got this to-dos table, you're missing an index, and here's what we can do to fix that.
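The fix an advisor-guided session lands on is just an ordinary migration; for the missing-index example above, it would be roughly this (the table and column names here are illustrative):

```sql
-- Speed up the slow query by indexing the column it filters on
create index todos_completed_idx on public.todos (completed);
```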
Okay. Finally, we added some initial support for storage. We have storage buckets at Supabase, and this gives the LLM the ability to check which storage buckets exist and change some configuration. It's basic to start, but we're looking to add more features, like insight into the actual files within your buckets and analyzing their details. The cool thing about the storage tools is that they were actually a community contribution. So what I'd say there is: if you ever feel like a feature is missing from the MCP server and think it would be really great to have, by all means, please create a pull request. We work in public, we work with the community, and we're all working on the same codebase, so please submit those.
Okay. And then what's next for the future? There are three things I'm excited about. Number one is fine-grained permissions. Earlier I was talking about how we have MCP auth, but you can see in this screenshot that you basically have to grant all or nothing: write access to database, edge functions, storage, all in one binary decision. Where we want to go with this is for each of these to be a toggle, so that you can choose exactly which ones you want to expose at that point in time. We already have feature groups that give you this capability, but this narrows it down to which features are available at the access-token level, the most secure way possible to do that.
Okay. Next is local-first. Like I said, I'm super excited to support local MCP with the local stack, but I think there's a lot more we can do around this. We basically want to double down on it. There's lots of room for improvement here, maybe some integration with the CLI; we just want to make that local dev experience first class with MCP.
Okay, this last one. I know I keep saying I'm excited about all these features. But you might have asked: what about you? What if you want to build your own MCP server on top of your existing apps and projects within Supabase? I'm super stoked to share that we have on the roadmap, and are currently working on, the ability for you to create an MCP server for your own apps on top of Supabase. We're basically taking the playbook and the lessons learned from building the MCP server we just presented today, and applying them to give you all the tools you need to build your own MCP, including MCP auth with all the OAuth 2 protocol you need for your own project, and remote MCP support as well.
All right, so do we have time for a demo? A two-minute demo? Yes? Okay, here we go. So what I have here is LM Studio. Anybody used LM Studio before? Maybe. So there's Ollama, there's LM Studio; what this does is let you run models completely locally on your machine. The reason I wanted to demo this today is that, now that we have local MCP support, I thought it'd be super cool to do something fully offline, the kind of thing you could do on an airplane. A lot of airplanes have internet now, so maybe that example doesn't quite hold, but you get the idea. The first thing I'll do: I've actually already set up the MCP server, but I'll show you how it was set up. If I go into the details, it's just an mcp.json file. I'm sure we're all familiar with this JSON file; pretty much every client follows the same format. And unlike the old standard IO command, all it is now is a simple URL that you pass in here. I've already spun up a local Supabase stack using the start command, it's using the latest stable build, and I popped that URL in here. If I come back over here, you can see that it has now connected, and you can see all the tools that are listed here.
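The mcp.json entry shown in the demo amounts to something like this (the port and path are assumptions; use the exact URL that supabase start prints):

```json
{
  "mcpServers": {
    "supabase-local": {
      "url": "http://127.0.0.1:54321/mcp"
    }
  }
}
```

Since the URL points at your own machine, everything after this, tool calls included, stays fully offline.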
So, first I can say something like... oh, actually, let me load the model first; that might be important. I have gpt-oss, the 20-billion-parameter version, and I'm going to load that one up right now. That was very quick, so it looks like we're good to go. I've already added the MCP server as context here. I'm just going to say: what tables do I have available?
We had a failed tool call, but then it figured it out; that's very common with MCP. Okay, so it's going to first list the tables to see what's available. And it's saying, "Looks like you don't have any tables." That's what we expected. So: please help me create a to-dos table.
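A sketch of the kind of migration a model typically produces for that prompt (the column names match what the demo inspects later; actual output will vary by model):

```sql
create table public.todos (
  id bigint generated always as identity primary key,
  title text not null,
  completed boolean not null default false,
  created_at timestamptz not null default now()
);
```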
So this is a hello-world type of app, to-dos; a classic thing to build. Now, I'm going to say this right now from a security standpoint: always read the SQL it generates before you run it. I know a lot of you like to do the YOLO modes; we do not recommend that. So please read what it's doing. Okay, we're creating a table, public.todos; looks pretty good. Pretty impressive from a local model. So, let's go ahead and proceed. By the way, tool-calling capabilities in local models are notoriously bad, so the fact that this is working with tool calls is, in my opinion, amazing. So, there's a new table. I can say: okay, great, can you confirm by listing the tables? Let's just confirm that.
Awesome. It found that RLS is not enabled. We could fix that, but instead let's just quickly check what it has. Okay, so we have an id column, title, completed, and created_at. Let's add a category column, so we can track which category these to-dos belong to. It's going to apply a new migration. I didn't read it; I just blindly clicked it. Okay, it's a simple command, so we're good there. And then finally, we can say: can you show me which migrations I've run? It's going to call list migrations, and we have our two migrations: one that created the to-dos table and one that added the category column. So this happened completely offline, using a fully local Supabase.
Greg, the head of Developer Experience at Supabase, announces the launch of the Supabase remote MCP server. The new remote MCP server eliminates complex setup commands and manual token management, making it easier to connect AI coding assistants like Cursor, Claude Code, and ChatGPT directly to your Supabase projects. New features include granular feature groups, documentation search, SQL performance advisors, and the ability to run MCP completely offline with local models. Watch a live demo showcasing fully offline database development using LM Studio with a local AI model.

Learn more about the remote MCP server: https://supabase.com/blog/remote-mcp-server

CHAPTERS:
00:00 Intro
00:32 Original MCP server and its issues
04:49 Announcing the Remote MCP Server
06:32 Using the MCP server locally
07:47 New features added to the MCP server
12:51 Future of MCP server
14:44 Fully offline demo with local models

Videos to watch next:
▶ https://www.youtube.com/watch?v=1SMldLoOhbg
▶ https://www.youtube.com/watch?v=PxlZssDNrws
▶ https://youtu.be/lICh9H5skVk

Learn more about Supabase:
Website: https://supabase.com/
Get started: https://app.supabase.com/
Docs: https://supabase.com/docs

Subscribe for more tutorials and feature updates from Supabase: https://www.youtube.com/channel/UCNTVzV1InxHV-YR0fSajqPQ?sub_confirmation=1

Connect with Us:
Github: https://www.github.com/supabase
Discord: https://discord.supabase.com/
Twitter: https://www.twitter.com/supabase/
Instagram (follow for memes): https://www.instagram.com/supabasecom/

ABOUT SUPABASE:
Supabase is the open source Firebase alternative. Supabase provides a full Postgres database for every project with pgvector, backups, realtime, and more. Add and manage email and password, passwordless, OAuth, and mobile logins to your project through a suite of identity providers and APIs. Build in a weekend, scale to millions.

#Supabase #AppDevelopment #DeveloperTools