Um, in the bottom right corner of your page, you should have a Go On Stage button. So, if you click that —
>> I've done it.
>> Nice. And then can you test sharing your screen?
>> Yeah, I... where was the sh— oh, share screen. Yes.
>> Yep.
>> You sure?
>> Okay. I'll give you a five-minute warning before the session ends. I'll click Start Session, we'll get a 30-second countdown, and then I'll introduce you and hand over, and I'll give you the five-minute warning before the session ends as well.
>> Thank you. Um... I need to share my other — my Neo4j Browser as well. How can — oh, I should go to that and change what I'm sharing. Can I?
>> Uh, yes. Yeah, I'm going to start the session now, and it will give us a countdown timer. Okay.
Hello all, welcome to the session. This session is Building Contextual Knowledge Graphs to Adapt, Evaluate, and Leverage LLMs, with Samira Kurani. Over to you, Samira.
>> Hi, nice to be here. Thank you so much for this opportunity, and thank you to everyone attending this session. I would really like to talk about what I have done during the last year, and also thanks to Neo4j for helping build my career. I'm here for the second time; I learned a lot from Neo4j, and I got good support for my startup.
I think my screen is shared, so let's start. Oh no... okay.
I would like to talk about what exactly the app does. Say you're on a trip, walking down the street, and suddenly you want to do something nice: the moon is shining, you'd like to watch how beautiful it is, find an event, an adventure, a conversation — find the perfect night for a little adventure. The app helps you: if one of you would like to go to a wine-tasting event and another wants to go moon-watching, it has an idea and gives you the best offer based on your trip.
Actually, trips were always my thing, and I always had this problem: if I go on a trip and I'm a wine person, I can't easily find the best wine event — to be on a trip as if I'm a local. So I wanted this recommender system. It's easy to think, "Oh, that's okay, an LLM can do that for us." But no — an LLM has never done that for us, because an LLM never understands justification. It's just statistics over words; it never tries to bring the truth and the facts to the user, even in the latest LLM versions. Last night I asked — I want to go to a restaurant for Christmas; I'm living in Galway — "Can you offer me a Christmas event at a vegetarian restaurant?" I didn't get an answer, just: "I'm sorry, I cannot find an event. That restaurant is close to the Cliffs of Moher and you can go there." One of them was Claude, the other ChatGPT-5.
I didn't get the answer because the LLM doesn't know the facts. So I decided, instead of putting the context inside the LLM and the prompt, to put the context inside the graph. What does that mean? Instead of just playing with locations, events, and tours populated and reviewed by people, I give more context — weather, events, concerts. If I'm going to Ireland tomorrow morning, which concert can I attend? — answered in one question, not a list of data that I'd have to dig through. So I decided to change the way the graph is usually constructed. Instead of just entities and semantic relationships, I add more attributes, which adapt dynamically and actually give me a reasoning layer when my LLM tries to answer the user. I also added some demographics. How did I bring in demographics? From Instagram: viral posts — who posts them, who talks about which type of trip, which kind of adventure is perfect for me as, for example, a woman, or a second-language English speaker — how can I enjoy my trip?
But what happened at that stage? Query understanding got very easy. And the graphs — what do I mean about the graphs? I wanted my LLM to generate the Cypher. When I gave more context, the graph worked better for me, because the LLM didn't need a fixed Cypher query; it could understand, after this pass, what the priority is. If it's Galway, and I'm a woman, and I will arrive at 10 p.m., then don't suggest walking a long distance — this hop of reasoning is based on the graph, which knows and reasons before answering the question. And also grounded responses: I had essentially zero hallucination. The graph works without hallucinating, without adding anything extra to the answer for the user.
How does that work, exactly? I wanted to answer: where can I stay near the Eiffel Tower for Christmas that is within walking distance of Hotel Louisa and has a holiday event? This is the question everyone going on a trip asks sometimes. How does my graph answer it? Find the node with the hotel name and a walking-distance edge — I can easily calculate that — and a has-event relationship, because I get data from event sources. Walking distance is defined as less than 2 kilometers; I defined that, and it could probably be tuned. The output looks like: Hotel A and Hotel C are within 2 kilometers of Hotel Louisa and have a Christmas event. That is the part a general LLM misses at the moment.
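The walking-distance check described above can be sketched outside the graph with the haversine formula (in Neo4j itself, `point.distance` would typically do this). The hotel names and coordinates below are invented for illustration; only the 2 km threshold comes from the talk.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/long points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical hotels near the Eiffel Tower (approx. 48.8584 N, 2.2945 E).
hotels = {
    "Hotel Louisa": (48.8580, 2.2950),
    "Hotel A": (48.8660, 2.3050),   # roughly 1.2 km away
    "Hotel C": (48.8500, 2.2800),   # roughly 1.4 km away
    "Hotel B": (48.8900, 2.3500),   # roughly 5 km away -- too far
}

base = hotels["Hotel Louisa"]
WALKING_KM = 2.0  # the talk defines "walking distance" as under 2 km

walkable = [
    name for name, (lat, lon) in hotels.items()
    if name != "Hotel Louisa"
    and haversine_km(base[0], base[1], lat, lon) < WALKING_KM
]
print(walkable)
```

Filtering on the has-event relationship would then narrow `walkable` down to hotels with a Christmas event, as in the talk's example output.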
So how do I create this contextual knowledge graph? There is a lot of data, and every day more data is generated, again and again. For example, the number of tourist venues in Galway does not change in a year — but every day, with LLMs and with influencers, a lot of data is generated about one place. For that reason, I key each place on its latitude/longitude. That was one of the things Neo4j really helped with, because it has great functions for handling lat/long. For my dataset, I get data from Yelp, TripAdvisor, Eventbrite, OpenStreetMap, a weather API, and GeoNames, and construct my graph. After constructing the graph, I provide a RESTful API to the LLM — LLMs support plug-ins, and I use a Semantic Kernel plugin — to connect to the graph, bring back the answer, and give it to the user. The conversation can be ongoing, and every answer can keep improving.
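The flow just described — the LLM calling a graph-backed endpoint and getting structured facts back — might look like the following sketch. The function name, the in-memory stand-in "graph", and the event data are all assumptions for illustration; the speaker's real setup uses a Semantic Kernel plugin calling a RESTful API over Neo4j.

```python
# Hypothetical stand-in for the graph-backed REST endpoint: a tiny in-memory
# "graph" answers the same shape of question a Neo4j query would.
GRAPH = {
    ("Galway", "concert"): ["Trad session at a pub", "Christmas choir concert"],
    ("Galway", "wine tasting"): ["Winter wine evening"],
}

def find_events(city: str, event_type: str) -> dict:
    """Tool the LLM can call: returns grounded facts, or an explicit miss.

    Returning structured data (not free prose) is what lets the LLM answer
    without inventing events -- the 'zero hallucination' property in the talk.
    """
    events = GRAPH.get((city, event_type), [])
    return {"city": city, "event_type": event_type,
            "events": events, "found": bool(events)}

result = find_events("Galway", "concert")
print(result)
```

The key design choice is that a miss is reported explicitly (`"found": False`) instead of leaving the model free to fabricate an answer.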
But the most important part of this project was constructing the graph based on location, temporal, behavioral, and situational data. Why focus on situational? For example, Galway in Ireland: the weather there in winter and autumn is awful. But there is a Galway in the United States where the weather is not as rainy as in Ireland. So that was another challenge I had to deal with, beyond lat/long: in conversation, I should understand that when people say they are going to Galway in Ireland, they are talking about the Galway in Ireland, not the Galway in the United States. And there are many more examples like that. Without context, without having the facts in the knowledge graph, it was impossible to answer such questions correctly — zero hallucination on the first question, first answer.
We construct the graph like this: we have data sources, we add the data to the database, and we let the LLM retrieve the answer from it. The architecture is definitely agentic: we have many agents, with a planner deciding when to connect to the knowledge graph. For the evaluation phase, I rely on fact-checking, getting APIs from different sources. That was another beauty of Neo4j — I will show that in Neo4j you can connect to an API, get data from the API, and generate nodes and relationships. It was really helpful: I call the weather API, construct the relationship each time, and after 24 hours that relationship is changed and deleted. That was the beauty of Neo4j graphs.
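The 24-hour weather relationships amount to edges stamped with a creation time and purged on a schedule. This Python sketch mimics the idea (in Neo4j it would typically be a timestamp property plus a periodic delete, e.g. via an APOC job); the place names, weather values, and timestamps are invented:

```python
from datetime import datetime, timedelta

TTL = timedelta(hours=24)  # weather edges live for one day, as in the talk

# Each edge: (place, relation, value, created_at)
edges = [
    ("Galway", "HAS_WEATHER", "rain", datetime(2025, 11, 1, 9, 0)),
    ("Dublin", "HAS_WEATHER", "cloudy", datetime(2025, 11, 2, 8, 0)),
]

def purge_expired(edges, now):
    """Keep only relationships younger than the TTL -- the counterpart of
    deleting stale HAS_WEATHER relationships after each API refresh."""
    return [e for e in edges if now - e[3] < TTL]

now = datetime(2025, 11, 2, 10, 0)
fresh = purge_expired(edges, now)
print(fresh)
```

Each API refresh then re-creates the current relationship, so the graph only ever holds weather facts less than a day old.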
So let's talk about the schema I designed. This is a very small part of my graph, because it is scoped to just Ireland, or even less. We have accommodation, hospitals, nightlife, even post offices, country, city, weather, and restaurants. We create these datasets based on the address — even the URL, lat/long, and status — and construct different types of relationships between them. Some of this data is static: place names can change, but lat/long is really hard to change. So my graph doesn't need updates about the places themselves — hotels and hospitals, for example; there's close to zero chance they change in a year. The constant change comes from events and seasonal data — concerts, the weather. That is the part of my graph that should change. I create many different semantic relationships, most of which come from APIs.
So what did we do after that? We improved the prompt to use subgraphs for answering the question: if it's about Ireland, you only need to search the subgraph with the Ireland label, and that label helps bring back the answer for the user. The other results were zero hallucination, and the dynamic adaptation I talked about, because the API data changes every day. But we still need evaluation. For example: is a given café in Galway hosting a Christmas event within 2 km of the hotel? You find the entity matches, build the constraints — in this case event and season — check the distance, and verify the output. The output should give you the most direct answer based on what you need, not on what I happen to have.
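The three evaluation steps just listed — entity match, constraint check, distance verification — could be chained like this. The café, hotel, coordinates, and event list are made up for the sketch:

```python
import math

def km(lat1, lon1, lat2, lon2):
    """Rough equirectangular distance in km; fine at city scale."""
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return math.hypot(x, y) * 6371.0

# Hypothetical facts pulled back from the graph for one question.
cafe = {"name": "Season Tea", "lat": 53.2720, "lon": -9.0530,
        "events": ["Christmas market evening"]}
hotel = {"name": "Hotel X", "lat": 53.2745, "lon": -9.0490}

def evaluate(cafe, hotel, season="Christmas", max_km=2.0):
    """1) entity match (inputs are resolved nodes), 2) constraint check
    (event matching the season), 3) distance verification."""
    has_event = any(season.lower() in e.lower() for e in cafe["events"])
    dist = km(cafe["lat"], cafe["lon"], hotel["lat"], hotel["lon"])
    return {"has_event": has_event, "distance_km": round(dist, 2),
            "verified": has_event and dist <= max_km}

report = evaluate(cafe, hotel)
print(report)
```

A question only passes evaluation when all three checks agree, which is what makes the final answer fact-checked rather than generated.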
Neo4j was perfect for my case. Why? Because its schema is flexible, with pattern-based querying and seamless integration. Neo4j has a connection to LangChain; unfortunately I didn't use LangChain — I used Semantic Kernel — but it was still very helpful for my use case. Still, the Cypher query is a bottleneck — always a bottleneck when you have a big graph. You need to understand where to start, and in a graph for even a small country like Ireland, with 45 million nodes and edges, it is really hard to start. Indexing was helpful, and APOC, with its batch processing, was great — I wrote something on my Medium about how indexing solved that problem. And data-quality filtering: we try to keep high-quality data and stop generating duplicates. For example, our data is keyed on lat/long, and lat/long was enough for me to understand how my graph is working; any extra data is just added as a property on the same lat/long node. For scaling, I used cloud integration, parallel processing, and vector indexes. Another problem we had: when a user asks a question, we need to understand the intent, and based on that intent go find the path and bring back the answer. Using LLM reasoning to understand the user's intent wasn't the best approach, so I went for storing user intents as vectors.
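Storing known intents as vectors and matching by similarity — rather than asking the LLM to reason out the intent each time — might look like this minimal sketch. The tiny hand-made "embeddings" stand in for real ones from an embedding model held in a vector index (such as Neo4j's):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hand-made stand-in vectors; in practice these come from an embedding model.
INTENTS = {
    "find_event":  [0.9, 0.1, 0.0],
    "find_hotel":  [0.1, 0.9, 0.0],
    "get_weather": [0.0, 0.1, 0.9],
}

def match_intent(query_vec):
    """Return the stored intent whose vector is closest to the query."""
    return max(INTENTS, key=lambda name: cosine(query_vec, INTENTS[name]))

print(match_intent([0.8, 0.2, 0.1]))
```

Because the set of stored intents can grow without retraining anything, this sidesteps the dozen-intent ceiling the speaker mentions for pure LLM reasoning.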
but what I learned is flexibility is
everything. Database design is not graph
design and graph enrichment is the
bridge data silos. We had a lots of data
in different source which talking about
same things but they never been together
and never been uh in some place to give
whatever user needs in in actually in
travel industry or especially for
inexperienced traveler and subgraph is
is the key and real time context uh
required hybrid infrastructure for me
and My takeaway was context is the
missing bridge graph design. I said that
and uh if that's okay I can show my
graph now. It's oh
let me share my screen
now.
So, as I said — this is my graph. Every hour I get an update from the weather API (I will remove my API key), and every time you run that code, the weather relationship is created. And it's not just that: everything in this graph is related to everything else based on different attributes — being in the same city, the same category, even the same weather, the same season, and even having an event. And that's it — I think I have finished everything.
Okay. Yes, please — if there are any questions, I would be happy to answer.
>> Yes, I have a question here: to check coherence, did you take vector embeddings between the names, which might be heterogeneous?
>> No — I had a word vector based on names and lat/long. It took me two or three months to build the graph; then we improved it, and now we are ready to launch. Actually, we are in the phase of getting funding.
>> Yeah. Yes. Actually, when you have a question, you write the question — "what can I do now?" — and you give us access to your lat/long, and based on who you are and what we know about you, we offer you the events you can attend. Thank you.
>> And yes — no, no, we have the graph learning over time. For example, if we get data from Galway in 2023 and 2024, then 2025 can be learned based on what happened. We actually emphasize graph learning more than LLM learning.
>> Yes, because reasoning is important. You answer the question based on the intent of the user. If I just rely on LLM reasoning, I can rely on 12 or 13 intents. But when I have a vector database, I have a whole bank of intents that I know how to answer — actually, I can answer everything from my knowledge graph, because I do not rely on the LLM's reasoning.
>> So, if there are no more questions...
Okay. Sure. Thanks. Thank you so much, everyone, for having me and attending my presentation. It was really nice to have an audience. Thank you.
Register for the full experience! NODES 2025: https://neo4j.com/nodes-2025/?utm_source=video&utm_medium=stream

Welcome to NODES 2025, the biggest online conference dedicated to graph-powered apps, knowledge graphs, and AI. Join thousands of developers, data scientists, architects, and data pros as they learn the best techniques for graph-powered applications, AI agents, and more.

Building Contextual Knowledge Graphs to Adapt, Evaluate, and Leverage LLMs

In this session, I'll share how I built contextual knowledge graphs by combining diverse datasets like GeoNames, OpenStreetMap, Yelp, and Wikidata. These graphs were designed to enrich LLMs with real-world context, enabling more relevant and grounded outputs. I'll walk you through the process of modeling user behavior and spatial semantics, and how I used Neo4j to prototype, query, and scale these graphs. Along the way, I'll highlight lessons learned in schema design, Cypher optimization, and integrating graph-based context into LLM workflows. You'll learn practical techniques to adapt LLMs using graph-driven context, evaluate their outputs with graph-based signals, and leverage Neo4j to bridge structured knowledge and generative AI. Whether you're building AI applications or exploring graph-powered personalization, this session will provide actionable insights to enhance your projects.