We are developing an experimental, open-source suite of APIs that leverages Neural Networks and Generative AI to automate the entire property development process, from generating project proposals to project handover and sales.
Driven by Large Language Models, the platform provides a Generative AI agent that facilitates multi-party communication and distributes relevant information to architects, contractors, and investors at the right time, ensuring effective coordination.
This is an experimental API aiming to autonomously increase the availability of affordable housing within the framework set by the Builder's Remedy legislation in US markets.
In the context of our platform, the Generative AI agent generates project proposals based on patterns in successful proposals, while a General AI agent (if one existed) could potentially handle every aspect of the platform, from identifying opportunities to managing projects.
For now, we "glue" these processes together with APIs to automate them.
🔵
🔹 The new GAI context: Large Language Models are a phenomenal technology that will reshape the property development industry, from the investment decision stage down to daily construction site operations.
🔹 The property development industry is marked by inherent challenges, such as friction between parties due to uneven information distribution, safety concerns, issues related to quality, and low productivity.
We mitigate these challenges by connecting the multiple stakeholders involved in property development processes through streamlined information sharing.
🔹 We provide a comprehensive toolkit that glues together processes and legacy technologies with Generative AI.
🔵
The goal of the builder's remedy projects platform is to increase the availability of affordable housing and ensure that municipalities are meeting their obligations to provide housing for all income levels. These projects can be complex and often involve navigating zoning laws, community opposition, and financial considerations.
BRIP is an advanced suite of APIs designed to automate the discovery and management of builder's remedy projects in the property development sector. We cater to a niche segment of the market by automating the creation and submission of proposals for these projects, and subsequently presenting these approved projects as viable investment opportunities. Our goal is to address the unique needs of this market segment, making the process of investing in affordable housing more accessible and efficient.
🔵
🔹 Builder's Remedy Opportunity Identification: identifies municipalities that are not meeting their regional affordable-housing requirements, highlighting potential opportunities for builder's remedy projects.
🔹 Proposal Generation API: automatically identifies plots of land and generates project proposals (see the usage sketch after this list).
🔹 Proposal Submission API: automatically submits generated proposals to the appropriate municipal authorities for review.
🔹 Project Recommendation and Information Retrieval System: once a proposal is approved, this module presents the project as an investment opportunity to a network of investors, architects, and contractors, providing the necessary detailed information.
🔹 Project Management API: This module will provide tools for tracking the progress of funded projects and reporting results to all parties involved through a User Interface.
🔹 Admin-User Interface (UI): The platform will have a user-friendly interface that allows Admins to easily browse plots for investment opportunities, view detailed project information, and track the progress of their investments.
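To make the module list above concrete, here is a minimal sketch of how a client might walk the BRIP workflow end to end. The base URL, endpoint paths, and field names are illustrative placeholders, not the published API surface.

```python
# Minimal sketch of a hypothetical BRIP client walking the workflow.
# All endpoint paths and field names below are illustrative placeholders.
import requests

BASE_URL = "https://api.example-brip.dev"  # placeholder host

# 1. Find municipalities falling short of their affordable-housing targets.
opportunities = requests.get(f"{BASE_URL}/opportunities", params={"state": "CA"}).json()

# 2. Generate a proposal for the first candidate plot.
proposal = requests.post(
    f"{BASE_URL}/proposals/generate",
    json={"opportunity_id": opportunities[0]["id"]},
).json()

# 3. Submit the generated proposal to the relevant municipal authority.
submission = requests.post(f"{BASE_URL}/proposals/{proposal['id']}/submit").json()

# 4. Once approved, track the project as an investment opportunity.
status = requests.get(f"{BASE_URL}/projects/{submission['project_id']}/status").json()
print(status)
```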
🔵
OpenAI's ChatGPT Retrieval Plugin offers a standardized API specification, enabling document storage systems and document retrieval services to interface not only with ChatGPT, but also with any Large Language Model (LLM) toolkit that utilizes a retrieval service.
WE-AI leverages this by providing a range of integrations with the ChatGPT Retrieval Plugin API, thereby creating an interconnected ecosystem of native AI services.
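As a minimal sketch, this is roughly what a retrieval call against a running Retrieval Plugin instance looks like; the host and bearer token are placeholders, and the request body follows the plugin's /query schema (a list of queries, each with an optional filter and top_k) as we understand it.

```python
# Minimal sketch of querying a ChatGPT Retrieval Plugin deployment.
# The host and bearer token are placeholders.
import requests

PLUGIN_URL = "http://localhost:8000"           # placeholder deployment
HEADERS = {"Authorization": "Bearer <token>"}  # placeholder bearer token

response = requests.post(
    f"{PLUGIN_URL}/query",
    headers=HEADERS,
    json={"queries": [{"query": "builder's remedy filing deadlines", "top_k": 3}]},
)

# Each query returns a ranked list of matching document chunks.
for result in response.json()["results"]:
    for chunk in result["results"]:
        print(chunk["score"], chunk["text"][:80])
```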
Key features of WE-AI include:
- The development of versatile APIs that serve as 'software glue', bridging the gap between legacy systems and AI.
- The utilization of open-source code, capitalizing on a vast pool of resources.
- A technology-agnostic design that allows for the integration of various models and technologies, ensuring adaptability to industry advancements.
🔵
Integrate Your Data Sources with Our AI Engine: Unlock the potential of untouched data reservoirs. Transform every piece of information into a valuable data source. Accumulate, order, and systematize data to construct an all-encompassing knowledge base.
Harness the power of our AI framework to establish a unified source of information.
Image: our ChatGPT plugin retrieving data from the deployed vector database (http://3.8.31.17:8000/docs#/default/get_nearest_neighbors).
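A minimal sketch of a nearest-neighbour lookup against a deployment like the one above: the embedding is produced with OpenAI's public /v1/embeddings endpoint, while the route path and body of the get_nearest_neighbors request are assumptions made for illustration, since the exact schema is only documented on that deployment's /docs page.

```python
# Minimal sketch: embed a query, then ask the vector store for its nearest neighbours.
import os
import requests

# 1. Embed the query text with OpenAI's public embeddings endpoint.
embedding = requests.post(
    "https://api.openai.com/v1/embeddings",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    json={"model": "text-embedding-ada-002",
          "input": "zoning exemptions for builder's remedy"},
).json()["data"][0]["embedding"]

# 2. Query the deployed vector database.
#    Route path and request body are assumptions based on the /docs page above.
neighbours = requests.post(
    "http://3.8.31.17:8000/get_nearest_neighbors",
    json={"embedding": embedding, "n_results": 5},
).json()
print(neighbours)
```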
Get services you cannot find anywhere else:
AI-Powered Collaboration Platform: Enhances stakeholder communication, reducing friction and improving project transparency.
Data Aggregation and Visualization: Captures, aggregates, and presents project data comprehensively.
Intelligent Information Distribution System: Distributes relevant information to stakeholders at the right time.
AI-Driven Safety Monitoring and Alert System: Monitors site safety continuously, providing real-time alerts.
Predictive Productivity Analytics: Analyzes historical data to identify and eliminate productivity bottlenecks.
The transition to deploying fine-tuned, domain-specific, privately owned AI models tailored for your organization is important for several reasons (a minimal self-hosting sketch follows this list):
Customization: These models are specifically designed to address your organization's unique needs, challenges, and goals. They can therefore provide solutions that are much more relevant and effective compared to generic, one-size-fits-all AI models.
Data Security: Privately-owned AI models allow you to maintain complete control over your data, which is essential for complying with data privacy regulations and protecting sensitive information.
Scalability: As your organization grows, so too can your AI models. They can be designed to scale with your needs, ensuring you always have the right level of AI power at your disposal.
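As an illustration of what running a privately hosted, domain-specific model can look like in practice, here is a minimal sketch using the open-source Hugging Face Transformers library; the checkpoint name and prompt are placeholders for whichever fine-tuned model an organization owns, and nothing in it leaves the local machine.

```python
# Minimal sketch of serving a privately hosted, fine-tuned model in-house.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "your-org/property-dev-llm"  # placeholder fine-tuned checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)

prompt = "Summarise the zoning constraints for the attached parcel report:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```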
🔵
WE-AI Framework Design
- We build versatile APIs that act as software glue, enabling the creation of varied services.
- We harness open-source code, tapping into an extensive pool of resources.
- Our adaptive framework utilizes a technology-agnostic design, capable of integrating various models and technologies, evolving in sync with industry advancements.
```
+-----------------------------------+
|              OpenAI               |
|    Open Source Language Models    |
|     we-ai Private Data Models     |
+-----------------------------------+
                 |
                 v
      +-----------------------+
      |   we-ai LLM Toolkit   |
      +-----------------------+
                 |
                 v
      +-----------------------+
      |     MODELS ROUTER     |
      +-----------------------+
                 |
                 v
      +-----------------------+
      | we-ai API middleware  |
      +-----------------------+
                 |
                 v
+------------------+  +--------------+  +--------------+
|  Document Store  |  | Data Getters |  | Data Loaders |
+------------------+  +--------------+  +--------------+
                 |
                 v
+-------------------------------+  +----------------+
| Meta Index - Question Engines |  | Services API's |
+-------------------------------+  +----------------+
```
In this diagram:
- AI language technologies: we focus primarily on Large Language Models, showcasing the various options accessible via the we-ai API middleware. Our offering extends to enterprise solutions that include privately operated open-source models, giving businesses a high degree of customization and control over their AI language technologies and private data.
- The "LLM Toolkit" represents any language model toolkit that interacts with the we-ai API middleware.
- The we-ai API middleware acts as a middleman, enabling interaction between the "LLM Toolkit" and the various data sources (i.e., "Document Store", "Data Getters", and "Data Loaders").
- Loading data from we-ai into the API retrieval system: the we-ai API middleware defines an /upsert endpoint for users to load data. This offers a natural integration point with the Data Getters hub, which provides over 25 data loaders for various APIs and document formats, with more under development (see the sketch at the end of this section).
- The data sources ("Document Store", "Data Getters", and "Data Loaders") represent various document storage and retrieval systems that implement the we-ai API middleware, allowing them to exchange data with the model.
- The "Meta Index - Question Engines" sit beneath "Document Store", "Data Getters", and "Data Loaders"; they represent the vector indices built over the data stored in their respective document stores.
This high-level architecture allows any LLM to access data from various sources seamlessly, provided these sources implement the API. The we-ai API essentially standardizes how these systems interact, making it easier for developers to integrate and manipulate a wide range of data.
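As a sketch of the /upsert integration point described above, this is roughly how a data loader might push documents into the middleware. The loader function and host are placeholders; the request body follows the ChatGPT Retrieval Plugin's /upsert schema (a list of documents with text and optional metadata) that the middleware mirrors.

```python
# Minimal sketch of feeding documents from a loader into the /upsert endpoint.
import requests

MIDDLEWARE_URL = "http://localhost:8000"       # placeholder we-ai middleware host
HEADERS = {"Authorization": "Bearer <token>"}  # placeholder bearer token

def load_planning_documents(folder: str) -> list[dict]:
    """Placeholder for a Data Getters loader that extracts text from source files."""
    # In practice this would delegate to one of the hub's 25+ loaders.
    return [{"text": "Parcel 42 zoning summary...", "metadata": {"source": folder}}]

documents = load_planning_documents("./municipal-filings")
response = requests.post(
    f"{MIDDLEWARE_URL}/upsert",
    headers=HEADERS,
    json={"documents": documents},
)
print(response.json())  # expected: the ids assigned to the stored documents
```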