You can turn Claude Code into a scraping
powerhouse. All you have to do is just
give it access to one MCP server and you
can do things like extracting entire
websites, creating screenshots of that
website, taking all the branding, the
fonts, the colors, the look and feel,
and even pulling structured data for
research. All just using natural
language. If you don't believe me, let
me show you. Now, there are tons of
methods and MCP servers and tools you
can use for scraping. I just happen to
use FireCrawl, and you'll see it in a bunch of my past videos. And the
reason for that is all you have to do is
give it a simple input which is
typically a URL or a bulk set of URLs
and you can choose what kind of data you
want back. So if I go to my agency
website promptadvisor.com
and paste it in here,
we can just click on start scraping and
you'll see on the next screen you have
tons of options you can focus on. Yes,
you can just scrape the website as is,
but this isn't new. But you can also click on things like branding, for example, then click on start
scraping, and then while this goes
through the URL, it'll come back not
just with the scrape of the website, but
also all the information about the
fonts, the colors, all the look and feel
elements that make my Prompt Advisor website distinctly mine. So just
as a small example, you could see this
is the branding. This is my logo, my
assets, my landing page, all the primary
colors, fonts, and typography. And this
is just one small sliver of what you can
do. Now, my goal is to bring this power
into Claude Code. So you can just say
something like: go and search the following websites. Go and grab the
branding from this one. Go and grab
screenshots from the bottom and the top
of the page of XYZ website. So if you go
to the docs and you go to the MCP server
section on the right-hand side, you can see right here there are different sections for running it in Claude Code as well as n8n and other platforms. So if I click on Claude Code right here, all you have to do is copy and paste this command as is, and you're going to want to swap this
with your API key. You can get this API
key by just signing in. They have a
pretty generous free tier of 500
requests. So, if you go into the
dashboard, you'll see on the right-hand side right here: this is your API key.
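For reference, the command you copy and paste is generally something like this; check the docs for the current version, and swap fc-YOUR-KEY for your actual key:

    claude mcp add firecrawl -e FIRECRAWL_API_KEY=fc-YOUR-KEY -- npx -y firecrawl-mcp

That registers an MCP server called firecrawl that Claude Code starts via npx.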
I'm going to just take this, attach it to that command, and send it over in Claude Code. So, all you have to do is enter that command as is with your API key, and within seconds it should say done. One thing to note: it won't show up or actually function within this same chat or session; you're going to have to spin up a brand-new session. You'll see right here it says FireCrawl is there, so it exists, but its status is failed. But if I open a brand-new session and run /mcp to load it up, now we can see that it's functional. So, just to show you that it does work, I'll hit Shift+Tab to go into YOLO mode, and
I'll say something like can you tell me
all the tools we have access to with the
FireCrawl MCP? Maybe give me some use
cases as to what I can use each tool
for.
So we'll send this over, and we'll get back something like this: we have FireCrawl scrape, which is good for scraping content from different URLs and even parsing PDF documents; FireCrawl map; FireCrawl search, which is what we just used a few minutes ago; then the crawl function, which can go over multiple portions and subpages of the same website; crawl status; and extract, which uses LLM capabilities to pull content from a particular page, so you can search for different components or elements of that page. And this is the quick guide. Now,
in order to eventually have a useful CLAUDE.md file, we'll first turn this understanding into an asset in the codebase, since right now the codebase is empty and a CLAUDE.md wouldn't really do anything for us. So, I'll say, okay,
great. Can you put a full understanding
of all the different functionalities of
this MCP server into a file that we call
firecrawl mcp server guide domarkdown.
So, this will be sent over, and we can double-check from there exactly how to move forward. And now that the guide is set
up, we can click on this right here. You
can see it's pretty comprehensive. And
going through it, there's a table of contents, and it breaks down every single
parameter that Claude should be aware
of. And once I type /init, it will initialize a brand-new CLAUDE.md file, which won't be as detailed, but it will have the TL;DR cheat sheet of exactly how to use this resource.
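To give you a sense of it, the kind of TL;DR entry that ends up in CLAUDE.md looks something like this (a hand-written illustration, not the literal /init output):

    ## FireCrawl MCP
    - Full parameter reference: firecrawl-mcp-server-guide.md
    - scrape = one page, crawl = a whole site and its subpages, map = just list the URLs
    - extract = structured data from a page via a prompt and a JSON schema
    - search = web search; full-page screenshots come from scrape's screenshot format

A few lines like that are enough for Claude to know the guide file exists and when to reach for each tool. Now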
that we have this set up, I'll go
through a mental model of how to use
this API just as a quick cheat sheet so
that we can open a brand new session and
experiment with just using natural language, like I promised, to do things
like take screenshots, extract content,
and even more importantly extract the
look and feel and design of a particular
website. So, while there's some sub
functionality that we can use, it's good
to use this as your core cheat sheet.
So, you can scrape, screenshot, extract,
crawl, search, and map. In terms of what
to scrape, you could do a client brand
audit where you can scrape an entire
website and see whether the copywriting is up to snuff when it comes to the brand
strategy and the tone of voice. If
you're a designer or a web developer,
then the screenshot functionality might
be handy to create portfolio documents
where it screenshots your entire web
page as one continuous document. You can
use the extract functionality to go on
any website that has tons of listings.
That could be e-commerce, it could be a
job platform, it could be Indeed, what
have you. And then crawling as well as
searching can both be used for research
and competitor analysis. And map is
really important to look at the site map
of an entire website. So if there are
multiple components and many subdomains,
it's a really good way to be able to
look at the entire topography of a
particular domain.
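And to make the extract use case a little more concrete, an extract call is roughly this shape under the hood: you hand it one or more URLs, a prompt, and a JSON schema for the structured data you want back. The field names below are just illustrative, and the exact parameters may differ slightly by version:

    {
      "urls": ["https://example.com/jobs"],
      "prompt": "Pull every job listing on this page",
      "schema": {
        "type": "object",
        "properties": {
          "listings": {
            "type": "array",
            "items": {
              "type": "object",
              "properties": {
                "title": { "type": "string" },
                "company": { "type": "string" },
                "location": { "type": "string" }
              }
            }
          }
        }
      }
    }

In practice you don't write this yourself; you describe the fields in plain English and Claude fills in the schema for you. So, theory aside,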
let's actually put this to the test. So
all I'm going to do is turn on YOLO mode by hitting Shift+Tab a couple of times, and then
we'll take the URL of my agency website
and we'll just send a natural language
command. So I'll say the following. Use
the FireCrawl MCP server to do the
following on this particular website.
And I'll just paste this in along with the URL, then go back. So, I
want to be able to extract all the
branding of this website as well as take
screenshots of the entire website from
top to bottom so I can see it in one
unified image. And then I also want to
scrape and understand what this company
does. And I want to possibly crawl and
see what other web pages there are as
well. Maybe screenshot those and put all of these together neatly in one folder. So, we can send this over, paste it in, and
we'll see what it comes back with. So
you can see it's invoking the MCP
multiple times in this case to scrape.
You can even take a little peek. So this is the website. These are
the parameters. It's using the
functionality type screenshot. And then
if we close this up, you can see all the
links from the map. So this is the map
function. It's mapping out the services,
the about us, everything involved.
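If you expand one of those tool calls, the parameters Claude is passing to the scrape tool are roughly this shape (the exact field names may differ a bit depending on your FireCrawl version):

    {
      "url": "https://promptadvisor.com",
      "formats": ["markdown", "screenshot@fullPage"],
      "onlyMainContent": false
    }

The screenshot format is what eventually gives you that single top-to-bottom image of a page.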
Pretty straightforward. And then you can
see it's already setting up this folder
on the left-hand side. And while this
is running, it's important to note that
you can use this on the free tier, the
hobby, or the starter tier, and the only differences are the number of requests that
you can send, as well as the number of
concurrent requests, which in plain English means how many different calls you can make to the crawl API at the same time. The one good thing about the MCP server is that if it can't run more than the limit, say it's five on the hobby plan, it will queue and wait until it's ready to go, then resend the request.
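If "concurrent requests" sounds fuzzy, here is a tiny sketch of the idea. This is not the MCP server's actual code, just an illustration of a concurrency limit of five where extra calls wait in line instead of failing:

    import asyncio

    LIMIT = 5                          # e.g. the hobby-plan concurrency limit
    slots = asyncio.Semaphore(LIMIT)   # only 5 "calls" may run at once

    async def fake_scrape(url: str) -> str:
        async with slots:              # calls beyond the limit queue here
            await asyncio.sleep(1)     # pretend this is the network request
            return f"scraped {url}"

    async def main() -> None:
        urls = [f"https://example.com/page-{i}" for i in range(12)]
        results = await asyncio.gather(*(fake_scrape(u) for u in urls))
        print(len(results), "pages scraped")  # all 12 finish, just in waves of 5

    asyncio.run(main())

That queue-and-wait behavior is what keeps a bigger crawl from erroring out on the lower tiers. Back to the run: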
now it's halfway and you can see right
here on the to-do list, we're still
creating the branding guide document. If
you take a little peek at where we're
at, it captures the color palette of my
website, my typography, all the fonts,
the spacing, the layout, the logo,
everything you could need. And how could
you use this? Let's say you build RFPs (requests for proposals), tenders, or
anything that you build at scale and you
want to tailor it to either your
company, the company you work for, or a
company you're reaching out to. These
kinds of MCP server tools are amazing
because you can literally apply them to
any documents at scale. And if you've
watched my prior video on using Claude Code to create document files, you can
now marry both worlds where you can take
the file itself and then change the
branding and the look and feel using
something like this FireCrawl MCP. And
by the way, because I'm such a super
user of this platform, they gave me a
special link in the description below that, if you use it, will give you way more credits than the standard free
tier. So now that it's finished, it's
the moment of truth. So we have the
branding guide, which we already saw. Then there's the content: this is all the company overview of what my agency does. This is mwah. It goes through pretty much every
part of my website. And if we close
that, the part that's probably most interesting to you is the screenshots.
So I can go on something like the
homepage. And you can see right here, it
takes a beautiful end to end top to
bottom picture of my website as well as
the other components of my website. And
this is just running one singular
command. What would happen if you had
multiple agents with different tasks?
One crawling, one scraping and doing
research, one taking screenshots, and
based off of that screenshot, basically
executing a task or doing analysis based
on whatever input is received. I'm not a
big fan of using many MCP servers in general because they basically bloat your
context window, but this is one of the
few MCPs that I use on a recurring
basis. And that's pretty much it. So, I
just wanted to walk you through how to
use this MCP server and all the
associated tools. And you can include
this in whatever systems you want. For
example, I use this in my own AI
strategy system that I put together for
my exclusive community. And it's been so
helpful that people are actually closing
deals. They're doing audits and
strategies just using things like FireCrawl combined with some other things to
be able to look at a website, tailor the
branding, and apply it in a variety of
scenarios. So, if you found this
helpful, then please let me know down in
the comments below. It helps the video, helps the channel, and I'll see you in the next one.
🔥 Get 1000 FREE FireCrawl credits with code: MARK0K1000R-960AOU
Join the Early AI-dopters Community: https://www.skool.com/earlyaidopters/about

Description
Turn Claude Code into a web scraping powerhouse with just ONE MCP server. In this video, I show you how to add FireCrawl to Claude Code so you can scrape any website, extract branding (colors, fonts, typography), take full-page screenshots, and pull structured data - all using natural language commands. No coding required. Just tell Claude Code what you want, and it does the rest.

---

⏱️ TIMESTAMPS
0:00 - Intro: Turn Claude Code into a scraping powerhouse
0:43 - FireCrawl playground demo
1:30 - Extracting website branding (colors, fonts, logos)
1:53 - Setting up FireCrawl MCP in Claude Code
2:34 - Verifying the MCP installation
2:54 - Exploring FireCrawl tools (scrape, map, search, crawl, extract)
3:36 - Creating a guide file for Claude
4:24 - Mental model: When to use each tool
5:42 - Practical demo: Natural language scraping
6:35 - Watching it run (live)
7:36 - Reviewing the branding guide output
8:46 - Viewing the screenshots
9:25 - Wrap up & use cases

---

📌 WHAT YOU'LL LEARN
• How to install FireCrawl MCP in Claude Code (one command)
• Scrape entire websites using natural language
• Extract branding: colors, fonts, typography, logos
• Take full-page screenshots (top to bottom in one image)
• Map out website structure and subpages
• Pull structured data for research
• Use cases: client audits, portfolios, RFPs, competitor analysis

---

🔗 RESOURCES
FireCrawl: https://firecrawl.dev
FireCrawl Docs: https://docs.firecrawl.dev
Claude Code: https://claude.ai/code

---

#claudecode #firecrawl #webscraping #aiautomation #mcp #claudeai #anthropic #aitools #nocode #scraping #webdevelopment #automation #artificialintelligence #techyoutube #aitutorial