In our last video, we implemented a
basic tool calling agent and streamed
the results live into our React
application. Now, in most cases, a tool call completes in a reasonable amount of time: often you fetch something from an API, read or write to the file system, or integrate with another service. But what if your tool call actually runs for a long time? For example, you trigger a sub-agent that does a whole lot of work for you before it responds. A really responsive UI should give the user constant updates so they know something is happening in the background, ideally giving them some sense of when the operation is about to finish. This can be managed via custom stream events. Let's dive in to see how we can implement and add custom events to our agent and how we can render them in our front end. Let's check it out. Now
in our sandbox example, we are working with a data analysis agent that helps us parse through multiple files, analyze their content, and surface some trends. In a normal application, you might show the human message followed by some sort of loading indicator that signals something is happening in the background, and once all the tool calls have completed, you show the final assistant message with the result of the analysis.
Now, as you can see here, it may take
some time until all the files have been
processed by the LLM and a trend has
been generated. We can now make the app more responsive by rendering immediate tool events right in the front end while the tool is executing. So let
me go into the code and reenable
some sections and rerun the example
again. We will now see that as soon as
the tool is being executed, we are
streaming live updates from the tool
call into the UI which will make the
overall application feel much more responsive. Let's look at the code, starting with the agent.
The agent is fairly simple. We again
define a model. We have a tool to
analyze the data and we have a tool to
process files. The implementation of
these tools is not important. What's
important is how we send updates to the
UI and we do this with the config.writer
function. The config.writer function is part of the tool runtime and is provided in the second argument of your tool function; it gives you a way to send arbitrary data blobs to the UI. You can see here that
we're iterating through different stages
and for every stage we send a progress
update that satisfies a certain
interface and then we just have an
arbitrary wait time of 500 milliseconds
until we go through the next step and
finally send a final status report.
Now, two things are important here. First, we give each of our custom events a type that will later help us identify that event in our UI and render specific cards for these events in the front end. Second, we send along the tool call ID, which lets us attach each progress event to the specific data analysis tool call it belongs to.
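To make the pattern concrete, here is a minimal, framework-free sketch of a long-running tool body that reports each stage through a writer callback. In LangChain, the writer comes from the tool's runtime config (config.writer); here it is injected directly so the sketch runs standalone, and the event shape (type, toolCallId, stage, progress) and the function name analyzeData are illustrative assumptions, not a fixed LangChain interface.

```typescript
// Hypothetical shape of the custom events the tool emits.
// Field names are illustrative, not an official LangChain type.
interface ProgressEvent {
  type: "progress";
  toolCallId: string;
  stage: string;
  progress: number; // fraction of stages completed, 0..1
}

type Writer = (event: ProgressEvent) => void;

// Sketch of a long-running tool body that reports every stage.
// In the real agent, `writer` would be config.writer from the tool runtime.
async function analyzeData(
  toolCallId: string,
  writer: Writer
): Promise<string> {
  const stages = ["loading files", "parsing content", "detecting trends"];
  for (let i = 0; i < stages.length; i++) {
    writer({
      type: "progress",
      toolCallId,
      stage: stages[i],
      progress: (i + 1) / stages.length,
    });
    // Simulated work; the real tool would do I/O or call an LLM here.
    await new Promise((resolve) => setTimeout(resolve, 50));
  }
  return "analysis complete";
}
```

Because every event carries the toolCallId, the front end can later route each update to the card of the tool call that produced it.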
Now, in our front end, we get access to these custom events through the onCustomEvent handler, which is part of the useStream hook. Again, we access the custom streaming agent defined in our agents and register the onCustomEvent handler, where we essentially just collect the data and put it into dedicated maps that we then access when rendering our component.
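The collection step can be sketched without any React plumbing. This is an assumed reduction of what the onCustomEvent handler does: one inner map per event type, keyed by tool call ID, so the renderer can look up the latest update for any given tool call. The field names (type, toolCallId) mirror the events the tool sends but are assumptions for this sketch.

```typescript
// Assumed shape of an incoming custom event after parsing.
interface ToolEvent {
  type: string;
  toolCallId: string;
  [key: string]: unknown;
}

// Outer key: event type ("progress", "status", ...).
// Inner key: tool call ID, so each tool call keeps its latest update.
type EventMaps = Map<string, Map<string, ToolEvent>>;

function collectEvent(maps: EventMaps, event: ToolEvent): void {
  let byToolCall = maps.get(event.type);
  if (!byToolCall) {
    byToolCall = new Map();
    maps.set(event.type, byToolCall);
  }
  // A later event for the same tool call overwrites the earlier one,
  // so the UI always renders the most recent state.
  byToolCall.set(event.toolCallId, event);
}
```

In the React app the maps would live in component state, but the overwrite-by-ID logic is the same.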
Now, when we receive the data, it is typed as an unknown object. We implement helper functions to properly type these objects so we can put them into the right map. Each helper validates that the data blob is an object, that it is not null, and that the type we sent along matches. Whenever such a function returns true, we can tell TypeScript to treat the data object as the dedicated interface: isProgressData labels the object as a progress data object, and the other guards work the same way.
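A guard like isProgressData can be sketched as a user-defined type guard. The payload fields beyond type are illustrative assumptions; the mechanism is the part that matters: the `data is ProgressData` return type tells TypeScript to narrow the unknown blob once the runtime checks pass.

```typescript
// Hypothetical progress payload; field names beyond `type` are illustrative.
interface ProgressData {
  type: "progress";
  toolCallId: string;
  stage: string;
}

// User-defined type guard: verifies the blob is a non-null object whose
// `type` field is "progress". When it returns true, TypeScript narrows
// `data` to ProgressData for the caller.
function isProgressData(data: unknown): data is ProgressData {
  return (
    typeof data === "object" &&
    data !== null &&
    (data as { type?: unknown }).type === "progress"
  );
}
```

Guards for the other event types follow the same pattern with a different type literal.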
Now, when we render the component, we iterate through all the messages. Whenever we come across an AI message that contains tool calls, we map over those tool calls and connect each one to a component that renders the custom events for that tool call. At the end, we render the message bubble followed by the custom tool update cards. The result: we see an assistant message with a loading indicator, and as soon as a tool call is executed, we can render live updates in the UI while the tool works through different types of data. We can even show updates for multiple tool calls at the same time.
Now custom events are a great way to
render immediate feedback in your
application when a tool call may take a
second or two longer than desired. Check
out the example below to see the whole
application, how we deployed it with the LangGraph dev server, and how we identify these custom events and render them in
the front end. You can also see
everything in our front end docs where
we document how you can register your
custom event handler as well as detect
these custom events in your UI. Thank you for watching, and see you in the next one.

---

Build agent UIs that feel instant: stream **custom events** from LangChain tool calls (progress, status, file operations) straight into React as they happen. In this video we walk through a demo using useStream + onCustomEvent, and show how to correlate updates to a specific tool call so your UI updates in place while tools run.

**What you'll learn**
- How to emit custom streaming events from tools via config.writer
- How to receive them in React with useStream(..., onCustomEvent)
- How to render progress + status cards tied to a tool call ID
- A simple pattern for "streaming UX" instead of "spinner UX"

🧑💻 Example: https://github.com/langchain-ai/langgraphjs/blob/main/examples/ui-react/src/examples/custom-streaming
📚 Docs: https://docs.langchain.com/oss/javascript/langchain/streaming/frontend