diff --git a/modules/genai-ecosystem/pages/google-cloud-demo.adoc b/modules/genai-ecosystem/pages/google-cloud-demo.adoc
index 3413197..7f14be9 100644
--- a/modules/genai-ecosystem/pages/google-cloud-demo.adoc
+++ b/modules/genai-ecosystem/pages/google-cloud-demo.adoc
@@ -9,38 +9,6 @@ include::_graphacademy_llm.adoc[]
 :page-product: google-cloud-demo
 //:imagesdir: https://dev.assets.neo4j.com/wp-content/uploads/2024/
 
-== Deploying GenAI Applications and APIs with Vertex AI Reasoning Engine
-
-GenAI developers, familiar with orchestration tools and architectures often face challenges when deploying their work to production.
-Google's https://cloud.google.com/vertex-ai/generative-ai/docs/reasoning-engine/overview[Vertex AI Reasoning Engine Runtime^](preview) now offers an easy way to deploy, scale, and monitor GenAI applications and APIs without in-depth knowledge of containers or cloud configurations.
-
-Compatible with various orchestration frameworks, including xref:langchain.adoc[LangChain], this solution allows developers to use the https://cloud.google.com/vertex-ai/docs/python-sdk/use-vertex-ai-python-sdk[Vertex AI Python SDK^] for setup, testing, deployment.
-
-It works like this:
-
-- Create a Python class for your GenAI app class to store environment info and static data in the constructor.
-- Initialize your orchestration framework in a `set_up` method at startup.
-- Process user queries with a `query` method, returning text responses.
-- Deploy your GenAI API with `llm_extension.ReasoningEngine.create`, including class instance and requirements.
-
-Our new integrations with Google Cloud, combined with our extensive LangChain integrations, allow you to seamlessly incorporate Neo4j knowledge graphs into your GenAI stack.
-You can use LangChain and other orchestration tools to deploy RAG architectures, like GraphRAG, with Reasoning Engine Runtime.
-
-You can see below an example of GraphRAG on a Company News Knowledge Graph using LangChain, Neo4j and Gemini.
-
-image::https://dev.assets.neo4j.com/wp-content/uploads/2024/reasoning-engine-graphrag.png[]
-
-== Knowledge Graph Generation with Gemini Pro
-
-The xref:llm-graph-builder.adoc[LLM Graph Builder] that extracts entities from unstructured text (PDFs, YouTube Transcripts, Wikipedia) can be configured to use VertexAI both as embedding model and Gemnini as LLM for the extraction.
-PDFs can be also be loaded from Google Cloud buckets.
-
-It uses the underlying llm-graph-transformer library that we contributed to LangChain.
-
-// TODO image
-
-// The Demo is available https://llm-graph-builder-gemini.neo4jlabs.com[online with with Google Gemini on Vertex AI^].
-
 == SEC Filings GenAI Labs
 This example consists of two sample applications that show how to use Neo4j with the generative AI capabilities in Google Cloud Vertex AI.
 We explore how to leverage Google generative AI to build and consume a knowledge graph in Neo4j.
@@ -68,3 +36,14 @@ The Demo is available on GitHub: https://github.com/neo4j-partners/neo4j-generat
 
 ++++
 ++++
+
+== Knowledge Graph Generation with Gemini Pro
+
+The xref:llm-graph-builder.adoc[LLM Graph Builder], which extracts entities from unstructured text (PDFs, YouTube transcripts, Wikipedia), can be configured to use Vertex AI both as the embedding model and Gemini as the LLM for the extraction.
+PDFs can also be loaded from Google Cloud buckets.
+
+It uses the underlying llm-graph-transformer library that we contributed to LangChain.
+
+// TODO image
+
+// The Demo is available https://llm-graph-builder-gemini.neo4jlabs.com[online with Google Gemini on Vertex AI^].
diff --git a/modules/genai-ecosystem/pages/google-cloud-genai-integrations.adoc b/modules/genai-ecosystem/pages/google-cloud-genai-integrations.adoc
new file mode 100644
index 0000000..6886959
--- /dev/null
+++ b/modules/genai-ecosystem/pages/google-cloud-genai-integrations.adoc
@@ -0,0 +1,40 @@
+= Google Cloud GenAI Integrations
+include::_graphacademy_llm.adoc[]
+:slug: google-cloud-genai-integrations
+:author: Ben Lackey, Michael Hunger
+:category: genai-ecosystem
+:tags: rag, demo, retrieval augmented generation, chatbot, google, vertexai, gemini, langchain, reasoning-engine
+:neo4j-versions: 5.x
+:page-pagination:
+:page-product: google-cloud-demo
+//:imagesdir: https://dev.assets.neo4j.com/wp-content/uploads/2024/
+
+== Function Calling with Gemini
+
+// TODO
+
+== Querying Neo4j via Vertex AI Extensions
+
+// TODO
+
+== Deploying GenAI Applications and APIs with Vertex AI Reasoning Engine
+
+GenAI developers familiar with orchestration tools and architectures often face challenges when deploying their work to production.
+Google's https://cloud.google.com/vertex-ai/generative-ai/docs/reasoning-engine/overview[Vertex AI Reasoning Engine Runtime^] (preview) now offers an easy way to deploy, scale, and monitor GenAI applications and APIs without in-depth knowledge of containers or cloud configurations.
+
+Compatible with various orchestration frameworks, including xref:langchain.adoc[LangChain], this solution allows developers to use the https://cloud.google.com/vertex-ai/docs/python-sdk/use-vertex-ai-python-sdk[Vertex AI Python SDK^] for setup, testing, and deployment.
+
+It works like this:
+
+- Create a Python class for your GenAI app that stores environment info and static data in the constructor.
+- Initialize your orchestration framework in a `set_up` method at startup.
+- Process user queries with a `query` method, returning text responses.
+- Deploy your GenAI API with `llm_extension.ReasoningEngine.create`, including the class instance and requirements.
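The four steps above can be sketched as a minimal Python class. This is an illustrative sketch, not the deployed demo: the class name, project values, and the echo-style placeholder chain are invented for the example, and in a real application `set_up` would initialize an orchestration framework (e.g. a LangChain chain against Neo4j) instead. The commented-out deployment call assumes the preview Vertex AI Python SDK.

```python
# Sketch of a Reasoning Engine app class (illustrative names and values).
class CompanyNewsApp:
    def __init__(self, project: str, location: str):
        # Step 1: store environment info and static data in the constructor.
        # Keep it lightweight so the instance can be serialized for deployment.
        self.project = project
        self.location = location
        self.chain = None

    def set_up(self) -> None:
        # Step 2: called once at startup; initialize your orchestration
        # framework here. A real app would build e.g. a LangChain chain
        # backed by Neo4j; this placeholder just echoes the question.
        self.chain = lambda question: f"[{self.project}] answer to: {question}"

    def query(self, question: str) -> str:
        # Step 3: process a user query and return a text response.
        if self.chain is None:
            self.set_up()
        return self.chain(question)


if __name__ == "__main__":
    app = CompanyNewsApp(project="my-gcp-project", location="us-central1")
    app.set_up()
    print(app.query("Which companies were acquired last year?"))

    # Step 4 (deployment sketch, assumes the preview Vertex AI Python SDK
    # and a configured GCP project; not runnable as-is):
    # from vertexai.preview import reasoning_engines
    # remote_app = reasoning_engines.ReasoningEngine.create(
    #     CompanyNewsApp(project="my-gcp-project", location="us-central1"),
    #     requirements=["google-cloud-aiplatform", "langchain", "neo4j"],
    # )
```

The key design point is that the constructor stays cheap and picklable, while heavyweight clients and chains are created in `set_up` on the runtime side.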
+
+Our new integrations with Google Cloud, combined with our extensive LangChain integrations, allow you to seamlessly incorporate Neo4j knowledge graphs into your GenAI stack.
+You can use LangChain and other orchestration tools to deploy RAG architectures, like GraphRAG, with the Reasoning Engine Runtime.
+
+Below you can see an example of GraphRAG on a Company News Knowledge Graph using LangChain, Neo4j, and Gemini.
+
+image::https://dev.assets.neo4j.com/wp-content/uploads/2024/reasoning-engine-graphrag.png[]
+
diff --git a/modules/genai-ecosystem/pages/llm-graph-builder.adoc b/modules/genai-ecosystem/pages/llm-graph-builder.adoc
index 0ad4b4a..bbab567 100644
--- a/modules/genai-ecosystem/pages/llm-graph-builder.adoc
+++ b/modules/genai-ecosystem/pages/llm-graph-builder.adoc
@@ -34,6 +34,17 @@ It uses the https://python.langchain.com/docs/use_cases/graph/constructing[llm-g
 
 image::llm-graph-builder-viz.png[width=600, align=center]
 
+[NOTE]
+====
+*Creating a new Neo4j Aura instance*
+
+* Log in or create an account at https://console.neo4j.io
+* Under Instances, create a new AuraDB Free database
+* Download the credentials file when prompted
+* Wait until the instance is running
+* Drop the credentials file on the connect dialog
+====
+
 == How it works
 
 1. Uploaded Sources are stored as `Document` nodes in the graph