diff --git a/modules/genai-ecosystem/nav.adoc b/modules/genai-ecosystem/nav.adoc
index facdbf4..5711e5b 100644
--- a/modules/genai-ecosystem/nav.adoc
+++ b/modules/genai-ecosystem/nav.adoc
@@ -5,8 +5,8 @@
 **** xref:google-cloud-demo.adoc[Google Cloud Vertex AI]
 *** Example Projects
 **** xref:llm-graph-builder.adoc[LLM Graph Builder]
-***** xref:llm-graph-builder-deployment.adoc[Deployment]
 ***** xref:llm-graph-builder-features.adoc[Features]
+***** xref:llm-graph-builder-deployment.adoc[Deployment]
 **** xref:rag-demo.adoc[GraphRAG Demo]
 **** xref:neoconverse.adoc[NeoConverse]
 **** xref:genai-stack.adoc[GenAI Stack]
diff --git a/modules/genai-ecosystem/pages/llm-graph-builder.adoc b/modules/genai-ecosystem/pages/llm-graph-builder.adoc
index e1079be..36ea53a 100644
--- a/modules/genai-ecosystem/pages/llm-graph-builder.adoc
+++ b/modules/genai-ecosystem/pages/llm-graph-builder.adoc
@@ -31,6 +31,8 @@ Afterwards you can use different RAG approaches (GraphRAG, Vector, Text2Cypher)
 The front-end is a React Application and the back-end a Python FastAPI application running on Google Cloud Run, but you can deploy it locally using docker compose.
 It uses the https://python.langchain.com/docs/use_cases/graph/constructing[llm-graph-transformer module^] that Neo4j contributed to LangChain and other langchain integrations (e.g. for GraphRAG search).
 
+All features are documented in detail here: xref::llm-graph-builder-features.adoc[]
+
 Here is a quick demo:
 
 ++++
@@ -93,6 +95,8 @@ You can also run it locally, by cloning the https://github.com/neo4j-labs/llm-gr
 It is using Docker for packaging front-end and back-end, and you can run `docker-compose up` to start the whole application.
 
+Local deployment and configuration details are available in xref::llm-graph-builder-deployment.adoc[].
+
 == Videos & Tutorials