diff --git a/examples/avro-bean/README.md b/examples/avro-bean/README.md
new file mode 100644
index 0000000000..364404b164
--- /dev/null
+++ b/examples/avro-bean/README.md
@@ -0,0 +1,57 @@
+# avro-bean
+
+This is an Apicurio Registry example. For more information about Apicurio Registry see https://www.apicur.io/registry/
+
+## Instructions
+
+
+This example demonstrates how to use the Apicurio Registry in a very simple publish/subscribe
+scenario with Avro as the serialization type. The following aspects are demonstrated:
+
+
+- Configuring a Kafka Serializer for use with Apicurio Registry
+- Configuring a Kafka Deserializer for use with Apicurio Registry
+- Auto-register the Avro schema in the registry (registered by the producer)
+- Data sent as a custom java bean (GreetingBean)
+
+
+Pre-requisites:
+
+
+- Kafka must be running on localhost:9092
+- Apicurio Registry must be running on localhost:8080
+
+
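+As a quick orientation, here is a minimal producer configuration sketch for this scenario. It is
+not copied from the example source; the serializer class and the `apicurio.registry.*` property
+names follow the Apicurio Registry 2.x serdes and should be checked against the version you use:
+
+```java
+import java.util.Properties;
+
+import org.apache.kafka.clients.producer.KafkaProducer;
+import org.apache.kafka.clients.producer.Producer;
+import org.apache.kafka.clients.producer.ProducerConfig;
+import org.apache.kafka.common.serialization.StringSerializer;
+
+public class ProducerSetupSketch {
+
+    public static Producer<String, Object> createProducer() {
+        Properties props = new Properties();
+        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
+        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
+        // Serialize values with the Apicurio Registry Avro serializer.
+        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
+                "io.apicurio.registry.serde.avro.AvroKafkaSerializer");
+        // Point the serdes layer at the registry REST API (v2).
+        props.put("apicurio.registry.url", "http://localhost:8080/apis/registry/v2");
+        // Auto-register the schema on first send, as this example does.
+        props.put("apicurio.registry.auto-register", "true");
+        return new KafkaProducer<>(props);
+    }
+}
+```
+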
+@author eric.wittmann@gmail.com
+@author carles.arnal@redhat.com
+
diff --git a/examples/avro-maven-with-references-auto/README.md b/examples/avro-maven-with-references-auto/README.md
new file mode 100644
index 0000000000..fc81cb1b8a
--- /dev/null
+++ b/examples/avro-maven-with-references-auto/README.md
@@ -0,0 +1,6 @@
+# avro-maven-with-references-auto
+
+This is an Apicurio Registry example. For more information about Apicurio Registry see https://www.apicur.io/registry/
+
+## Instructions
+
diff --git a/examples/avro-maven-with-references/README.md b/examples/avro-maven-with-references/README.md
new file mode 100644
index 0000000000..8e6509150e
--- /dev/null
+++ b/examples/avro-maven-with-references/README.md
@@ -0,0 +1,6 @@
+# avro-maven-with-references
+
+This is an Apicurio Registry example. For more information about Apicurio Registry see https://www.apicur.io/registry/
+
+## Instructions
+
diff --git a/examples/camel-quarkus-kafka/README.md b/examples/camel-quarkus-kafka/README.md
index dcbe0f1844..734b9da0bb 100644
--- a/examples/camel-quarkus-kafka/README.md
+++ b/examples/camel-quarkus-kafka/README.md
@@ -1,26 +1,6 @@
-# Camel Quarkus Kafka Example involving the Service Registry Managed Service
+# camel-quarkus-kafka
 
-1. Create Kafka Managed Service instance on cloud.redhat.com
+This is an Apicurio Registry example. For more information about Apicurio Registry see https://www.apicur.io/registry/
 
-2. Create associated Service Account, save client Id and Client Secret
+## Instructions
 
-3. Create Service Registry Managed instance on cloud.redhat.com
-
-4. Populate correctly the producer application.properties file with the missing parameters
-
-5. Populate correctly the consumer application.properties file with the missing parameters
-
-6. From the Service Registry Managed Instance UI load the user.avsc as schema named 'test-value' with no group
-
-7. From the producer folder run
-
- mvn clean compile package
- java -jar target/quarkus-app/quarkus-run.jar
-
-8. From the consumer folder run
-
- mvn clean compile package
- java -Dquarkus.http.port=8081 -jar target/quarkus-app/quarkus-run.jar
-
-Notes:
-- The class User has been generated starting from the avsc user schema, through the avro tools
diff --git a/examples/cloudevents/README.md b/examples/cloudevents/README.md
index dde4cbb42b..c1e6ca7e92 100644
--- a/examples/cloudevents/README.md
+++ b/examples/cloudevents/README.md
@@ -1,122 +1,6 @@
-# Apicurio Registry CloudEvents example
+# cloudevents
 
-This is an example application that implements a REST API that consumes and produces CloudEvents.
+This is an Apicurio Registry example. For more information about Apicurio Registry see https://www.apicur.io/registry/
 
-This example application is implemented thanks to an experimental library implemented within the Apicurio Registry project. This library is used to validate incoming and outgoing CloudEvents messages in a REST API.
-The validation is performed using json schemas that are previously stored in Apicurio Registry.
+## Instructions
 
-The idea behind this library is to provide a tool for serialization and deserialization of CloudEvents that uses Apicurio Registry to store the schemas used for serialization, deserialization or validation of CloudEvents data. This library is built on top of the CloudEvents Java SDK, that among other things provides the CloudEvent Java type.
-
-For this PoC we only focused on **CloudEvents and http**. Meaning that, at least for now, this library primarily allows to use CloudEvents with REST services and REST clients. Also, this library only provides support for json schemas, support for other formats such as avro, protobuf,... could be easily implemented.
-
-### Apicurio Registry CloudEvents Serde and Kafka
-
-We decided to not focus on implementing a set of Kafka Serde classes that work with CloudEvents.
-
-We are open to discussion if you consider it could be interesting to be able to do serdes plus validation of CloudEvents using Apicurio Registry **but** using another protocol or transport such as Kafka, AMQP, MQTT,... Feel free to create an issue or to reach out to the Apicurio team.
-
-After implementing the serdes library for REST services we considered implementing the equivalent for Kafka but we dismissed this effort, you can find below some of our reasons.
-
-Our current Serdes classes could be easily improved to make them more compatible with CloudEvents based use cases, and this approach would be preferrable rather than implementing a new set of Kafka Serdes classes for CloudEvents.
-
-The [KafkaConsumer API](https://kafka.apache.org/26/javadoc/index.html?org/apache/kafka/clients/consumer/KafkaConsumer.html) bounds the consumer to one class that will be the type of the message value that it will receive after deserialization, this means that all messages that a KafkaConsumer will receive are going to have the same structure.
-```
-KafkaConsumer consumer;
-```
-However for CloudEvents use cases we don't see any value in having Kafka Serde that works with a generic type, such as the java class for CloudEvents would be.
-```
-KafkaConsumer> consumer;
-```
-If required, this approach could be easy to achieve with some improvements to our existing Serdes classes.
-
-## Apicurio Registry CloudEvents Serde library
-
-The Apicurio Registry CloudEvents library consists of two maven modules:
-- `apicurio-registry-utils-cloud-events-serde`, provides the serialization and deserialization API, with data validation included. This component calls Apicurio Registry to fetch the required schemas to perform the serialization/deserialization/validation.
-- `apicurio-registry-utils-cloud-events-provider`, this contains a jaxrs provider implemented on top of CloudEvents [java sdk for restful services](https://github.com/cloudevents/sdk-java/tree/master/http/restful-ws). This provider allows to implement REST APIs that consume and produce CloudEvents, like the CloudEvents sdk does, but validating the CloudEvents data and ensuring the data adheres to it's respective schema stored in Apicurio Registry.
-
-This library is experimental and has not been released nor is available in the main branch of the Apicurio Registry project,
-so if you are interested you can find the source code [here](https://github.com/Apicurio/apicurio-registry/tree/cloud-events/utils/cloud-events).
-Also, to test the code (and to run this demo) you have to build it from source.
-
-```
-git clone -b cloud-events https://github.com/Apicurio/apicurio-registry.git
-
-cd apicurio-registry
-
-mvn install -am -pl 'utils/cloud-events/cloud-events-provider' -Dmaven.javadoc.skip=true
-
-```
-
-## Running the demo
-
-After installing in your local maven repo the `apicurio-registry-utils-cloud-events-provider` library you have to build this app.
-```
-mvn package
-```
-
-Once that's done you can start Apicurio Registry, I suggest doing it with a container
-```
-docker run -p 8080:8080 docker.io/apicurio/apicurio-registry-mem:1.3.2.Final
-```
-
-Then create the artifacts in the registry that are used by the CloudEvents serde to validate the data that the REST API will receive
-```
-curl --data "@new-order-schema.json" -X POST -i -H "X-Registry-ArtifactId: io.apicurio.examples.new-order" http://localhost:8080/api/artifacts
-```
-```
-curl --data "@processed-order-schema.json" -X POST -i -H "X-Registry-ArtifactId: io.apicurio.examples.processed-order" http://localhost:8080/api/artifacts
-```
-
-Finally it's time to run this demo app.
-```
-java -jar target/cloudevents-example-*-runner.jar
-```
-
-## Test the app
-
-To test the app we are going to make a few http requests sending CloudEvents to the API.
-
-Previously we created the artifact `io.apicurio.examples.new-order` in the registry with it's json schema.
-
-With this request we are going to send a CloudEvent of type `new-order` and dataschema `/apicurio/io.apicurio.examples.new-order/1` to the path `/orders`. The serdes layer will read that dataschema and fetch the json schema from
-Apicurio Registry in order to validate the json data adheres to the schema.
-The server responds with another CloudEvent of type `io.apicurio.examples.processed-order` that has also been validated against it's stored schema in Apicurio Registry.
-```
-$ curl -X POST -i -H "Content-Type: application/json" -H "ce-dataschema:/apicurio/io.apicurio.examples.new-order/1" -H "ce-type:new-order" -H "ce-source:test" -H "ce-id:aaaaa" -H "ce-specversion:1.0" --data '{"itemId":"abcde","quantity":5}' http://localhost:8082/order
-HTTP/1.1 200 OK
-transfer-encoding: chunked
-ce-source: apicurio-registry-example-api
-ce-specversion: 1.0
-ce-type: io.apicurio.examples.processed-order
-ce-id: 005762b9-9bea-4f6e-bf78-5ac8f7c99429
-ce-dataschema: apicurio-global-id-2
-Content-Type: application/json
-
-{"orderId":"c763f2b4-2356-4124-a690-b205f9baf338","itemId":"abcde","quantity":5,"processingTimestamp":"2021-01-20T16:26:40.128Z","processedBy":"orders-service","error":null,"approved":true}
-```
-
-This next curl command sends a request to another endpoint in this application. The important part of this is the implementation. This `purchase` endpoint shows the usage of the CloudEvents serde library in REST clients, allowing for producers of events to validate the CloudEvents they produce.
-
-```
-$ curl -i http://localhost:8082/purchase/abc/5
-HTTP/1.1 200 OK
-ce-source: apicurio-registry-example-api
-transfer-encoding: chunked
-ce-specversion: 1.0
-ce-type: io.apicurio.examples.processed-order
-ce-id: f1eabd84-ad78-4beb-9c6c-04f843abf669
-Content-Length: 187
-ce-dataschema: apicurio-global-id-2
-Content-Type: application/json
-
-{"orderId":"29606862-e74c-47b4-95d0-b59289ea023c","itemId":"abc","quantity":5,"processingTimestamp":"2021-01-20T16:32:06.198Z","processedBy":"orders-service","error":null,"approved":true}
-
-```
-
-This command shows an example of what happens when you try to send a CloudEvent using a non-existent schema.
-```
-$ curl -X POST -i -H "Content-Type: application/json" -H "ce-type:io.apicurio.examples.test" -H "ce-source:test" -H "ce-id:aaaaa" -H "ce-specversion:1.0" --data '{"itemId":"abcde","quantity":5}' http://localhost:8082/order
-HTTP/1.1 404 Not Found
-Content-Length: 0
-```
diff --git a/examples/confluent-serdes/README.md b/examples/confluent-serdes/README.md
new file mode 100644
index 0000000000..b2c1da794c
--- /dev/null
+++ b/examples/confluent-serdes/README.md
@@ -0,0 +1,64 @@
+# confluent-serdes
+
+This is an Apicurio Registry example. For more information about Apicurio Registry see https://www.apicur.io/registry/
+
+## Instructions
+
+
+This example demonstrates how to use the Apicurio Registry in a very simple publish/subscribe
+scenario where applications use a mix of Confluent and Apicurio Registry serdes classes. This
+example uses the Confluent serializer for the producer and the Apicurio Registry deserializer
+class for the consumer.
+
+
+- Configuring a Confluent Kafka Serializer for use with Apicurio Registry
+- Configuring a Kafka Deserializer for use with Apicurio Registry
+- Auto-register the Avro schema in the registry (registered by the producer)
+- Data sent as a simple GenericRecord, no java beans needed
+
+
+Pre-requisites:
+
+
+- Kafka must be running on localhost:9092
+- Apicurio Registry must be running on localhost:8080
+
+
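+The sketch below shows the two halves side by side. The serdes class names are the usual
+Confluent and Apicurio Registry 2.x ones, but the ccompat API path and the id-handler property
+are assumptions that should be verified against the versions you run:
+
+```java
+import java.util.Properties;
+
+import org.apache.kafka.clients.consumer.ConsumerConfig;
+import org.apache.kafka.clients.producer.ProducerConfig;
+
+public class MixedSerdesConfigSketch {
+
+    public static Properties producerConfig() {
+        Properties props = new Properties();
+        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
+        // Confluent serializer, pointed at Apicurio's Confluent-compatible API.
+        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
+                "io.confluent.kafka.serializers.KafkaAvroSerializer");
+        props.put("schema.registry.url", "http://localhost:8080/apis/ccompat/v6");
+        return props;
+    }
+
+    public static Properties consumerConfig() {
+        Properties props = new Properties();
+        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
+        // Apicurio deserializer reading the Confluent wire format
+        // (magic byte plus a 4-byte schema id in the payload).
+        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
+                "io.apicurio.registry.serde.avro.AvroKafkaDeserializer");
+        props.put("apicurio.registry.url", "http://localhost:8080/apis/registry/v2");
+        props.put("apicurio.registry.enable-confluent-id-handler", "true");
+        return props;
+    }
+}
+```
+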
+@author eric.wittmann@gmail.com
+
diff --git a/examples/custom-resolver/README.md b/examples/custom-resolver/README.md
new file mode 100644
index 0000000000..b4b6e8f44c
--- /dev/null
+++ b/examples/custom-resolver/README.md
@@ -0,0 +1,50 @@
+# custom-resolver
+
+This is an Apicurio Registry example. For more information about Apicurio Registry see https://www.apicur.io/registry/
+
+## Instructions
+
+
+This example demonstrates how to use the Apicurio Registry in a very simple publish/subscribe
+scenario with Avro as the serialization type. The following aspects are demonstrated:
+
+
+- Configuring a Kafka Serializer for use with Apicurio Registry
+- Configuring a Kafka Deserializer for use with Apicurio Registry
+- Register the Avro schema in the registry using a custom Global Id Strategy
+- Data sent as a simple GenericRecord, no java beans needed
+
+
+Pre-requisites:
+
+
+- Kafka must be running on localhost:9092 or the value must be changed accordingly.
+- Apicurio Registry must be running on localhost:8080 or the value must be changed accordingly.
+
+
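+Wiring the custom resolver in is a single configuration entry on the serializer. In this sketch
+the property name follows the 2.x SerdeConfig conventions and the resolver class name is
+illustrative only; treat both as assumptions to check against the example source:
+
+```java
+import java.util.Properties;
+
+import org.apache.kafka.clients.producer.ProducerConfig;
+
+public class CustomResolverConfigSketch {
+
+    public static Properties producerConfig() {
+        Properties props = new Properties();
+        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
+                "io.apicurio.registry.serde.avro.AvroKafkaSerializer");
+        props.put("apicurio.registry.url", "http://localhost:8080/apis/registry/v2");
+        // Swap the default schema-resolution logic for a custom class (illustrative name).
+        props.put("apicurio.registry.schema-resolver", "com.example.CustomSchemaResolver");
+        return props;
+    }
+}
+```
+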
+@author eric.wittmann@gmail.com
+@author carles.arnal@redhat.com
+
diff --git a/examples/custom-strategy/README.md b/examples/custom-strategy/README.md
new file mode 100644
index 0000000000..88bdbc992e
--- /dev/null
+++ b/examples/custom-strategy/README.md
@@ -0,0 +1,52 @@
+# custom-strategy
+
+This is an Apicurio Registry example. For more information about Apicurio Registry see https://www.apicur.io/registry/
+
+## Instructions
+
+
+This example demonstrates how to use the Apicurio Registry in a very simple publish/subscribe
+scenario with Avro as the serialization type. The following aspects are demonstrated:
+
+
+- Configuring a Kafka Serializer for use with Apicurio Registry
+- Configuring a Kafka Deserializer for use with Apicurio Registry
+- Register the Avro schema in the registry using a custom Global Id Strategy
+- Data sent as a simple GenericRecord, no java beans needed
+
+
+Pre-requisites:
+
+
+- Kafka must be running on localhost:9092 or the value must be changed accordingly.
+- Apicurio Registry must be running on localhost:8080 or the value must be changed accordingly.
+
+
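+The strategy is also plugged in through serializer configuration. The property name below follows
+the 2.x SerdeConfig conventions and the strategy class name is illustrative, so treat both as
+assumptions to verify against the example source:
+
+```java
+import java.util.Properties;
+
+import org.apache.kafka.clients.producer.ProducerConfig;
+
+public class CustomStrategyConfigSketch {
+
+    public static Properties producerConfig() {
+        Properties props = new Properties();
+        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
+                "io.apicurio.registry.serde.avro.AvroKafkaSerializer");
+        props.put("apicurio.registry.url", "http://localhost:8080/apis/registry/v2");
+        // Replace the default TopicIdStrategy (topic name + "-key"/"-value")
+        // with a custom way of mapping records to registry artifacts.
+        props.put("apicurio.registry.artifact-resolver-strategy",
+                "com.example.CustomGlobalIdStrategy");
+        return props;
+    }
+}
+```
+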
+@author eric.wittmann@gmail.com
+@author carles.arnal@redhat.com
+
diff --git a/examples/debezium-openshift/README.md b/examples/debezium-openshift/README.md
index 983cadd83a..7253aca25f 100644
--- a/examples/debezium-openshift/README.md
+++ b/examples/debezium-openshift/README.md
@@ -1,159 +1,6 @@
-# Debezium and Apicurio Registry on OpenShift
+# debezium-openshift
 
-This example contains a simple application that uses Debezium with Apicurio Registry, deployed on OpenShift.
+This is an Apicurio Registry example. For more information about Apicurio Registry see https://www.apicur.io/registry/
 
-## Prerequisites
+## Instructions
 
-1. Prepare or provision an OpenShift cluster.
-
-2. Install the following operators:
-
-- AMQ Streams (tested on `2.5.0-0` / Kafka `3.4`)
-- Red Hat Integration - Service Registry Operator (tested on `2.2.2`)
-
-3. Configure `oc`:
-
-```shell
-oc login #...
-export NAMESPACE="example"
-oc new-project $NAMESPACE
-```
-
-4. Prepare an image repository for example app images, and configure:
-
-```shell
-export APP_IMAGE_GROUP="quay.io/myorg"
-```
-
-which will result in `quay.io/myorg/apicurio-registry-examples-debezium-openshift:latest` image name.
-
-5. Prepare an image repository for customized Kafka Connect images, and configure:
-
-```shell
-export KAFKA_CONNECT_IMAGE="$APP_IMAGE_GROUP/kafka-connect-example:latest"
-```
-
-which will result in `quay.io/myorg/kafka-connect-example:latest` image name.
-
-6. Create a pull secret for the customized Kafka Connect image repository. This example command creates it from
-your local docker config file:
-
-```shell
-oc create secret generic example-components-pull-secret \
- --from-file=.dockerconfigjson=$HOME/.docker/config.json \
- --type=kubernetes.io/dockerconfigjson
-```
-
-## Deploy example components: MySQL, Kafka, and Debezium Kafka connector
-
-Review the *example-components.yaml* template, then apply it:
-
-```shell
-oc process -f example-components.yaml \
- -p NAMESPACE=$NAMESPACE \
- -p KAFKA_CONNECT_IMAGE=$KAFKA_CONNECT_IMAGE \
- | oc apply -f -
-```
-
-Wait for all components to deploy (some pods may be failing for a short time).
-
-After some time, you should be able to see the topics created by Debezium, for example:
-
-```shell
-oc get --no-headers -o custom-columns=":metadata.name" kafkatopic
-```
-
-```
-connect-cluster-configs
-connect-cluster-offsets
-connect-cluster-status
-consumer-offsets---84e7a678d08f4bd226872e5cdd4eb527fadc1c6a
-example
-example.inventory.addresses
-example.inventory.customers
-example.inventory.orders
-example.inventory.products
-example.inventory.products-on-hand---406eef91b4bed15190ce4cbe31cee9b5db4c0133
-kafkasql-journal
-schema-changes.inventory
-strimzi-store-topic---effb8e3e057afce1ecf67c3f5d8e4e3ff177fc55
-strimzi-topic-operator-kstreams-topic-store-changelog---b75e702040b99be8a9263134de3507fc0cc4017b
-```
-
-Apicurio Registry should contain the AVRO schemas registered by Debezium. Get and configure Apicurio Registry URL by
-running `oc route`:
-
-```shell
-export REGISTRY_URL="http://example-components-registry.example.router-default.apps.mycluster.com"
-```
-
-Then, you can list the schemas using the following example command:
-
-```shell
-curl -s "$REGISTRY_URL/apis/registry/v2/search/artifacts?limit=50&order=asc&orderby=name" \
- | jq -r ".artifacts[] | .id" \
- | sort
-```
-
-```
-event.block
-example.inventory.addresses-key
-example.inventory.addresses-value
-example.inventory.addresses.Value
-example.inventory.customers-key
-example.inventory.customers-value
-example.inventory.customers.Value
-example.inventory.orders-key
-example.inventory.orders-value
-example.inventory.orders.Value
-example.inventory.products-key
-example.inventory.products_on_hand-key
-example.inventory.products_on_hand-value
-example.inventory.products_on_hand.Value
-example.inventory.products-value
-example.inventory.products.Value
-example-key
-example-value
-io.debezium.connector.mysql.Source
-io.debezium.connector.schema.Change
-io.debezium.connector.schema.Column
-io.debezium.connector.schema.Table
-```
-
-From the Apicurio Registry URL, we can extract the `INGRESS_ROUTER_CANONICAL_HOSTNAME` variable that will be used later:
-
-```shell
-export INGRESS_ROUTER_CANONICAL_HOSTNAME="router-default.apps.mycluster.com"
-```
-
-## Build the example application
-
-```shell
-mvn clean install \
- -Dregistry.url="$REGISTRY_URL" \
- -Dquarkus.container-image.build=true \
- -Dquarkus.container-image.group=$APP_IMAGE_GROUP \
- -Dquarkus.container-image.tag=latest
-```
-
-Push the application image:
-
-```shell
-docker push $APP_IMAGE_GROUP/apicurio-registry-examples-debezium-openshift:latest
-```
-
-Apply the application template:
-
-```shell
-oc process -f example-app.yaml \
- -p NAMESPACE=$NAMESPACE \
- -p APP_IMAGE_GROUP=$APP_IMAGE_GROUP \
- -p INGRESS_ROUTER_CANONICAL_HOSTNAME=$INGRESS_ROUTER_CANONICAL_HOSTNAME \
- | oc apply -f -
-```
-
-## Run the example:
-
-```shell
-curl -v -X POST -d 'start' http://example-app.$NAMESPACE.$INGRESS_ROUTER_CANONICAL_HOSTNAME/api/command
-```
diff --git a/examples/event-driven-architecture/README.md b/examples/event-driven-architecture/README.md
index d693fd2c38..90a14e64c9 100644
--- a/examples/event-driven-architecture/README.md
+++ b/examples/event-driven-architecture/README.md
@@ -1,100 +1,6 @@
-# Kafka, ksqldb, Kafka-ui, apicurio-registry and Debezium together
+# event-driven-architecture
 
-This tutorial demonstrates how to use [Debezium](https://debezium.io/) to monitor the PostgreSQL database used by Apicurio Registry. As the
-data in the database changes, by adding e.g. new schemas, you will see the resulting event streams.
+This is an Apicurio Registry example. For more information about Apicurio Registry see https://www.apicur.io/registry/
 
-## Avro serialization
+## Instructions
 
-The [Apicurio Registry](https://github.com/Apicurio/apicurio-registry) open-source project provides several
-components that work with Avro:
-
-- An Avro converter that you can specify in Debezium connector configurations. This converter maps Kafka
- Connect schemas to Avro schemas. The converter then uses the Avro schemas to serialize the record keys and
- values into Avro’s compact binary form.
-
-- An API and schema registry that tracks:
-
- - Avro schemas that are used in Kafka topics
- - Where the Avro converter sends the generated Avro schemas
-
-### Prerequisites
-
-- Docker and is installed and running.
-
- This tutorial uses Docker and the Linux container images to run the required services. You should use the
- latest version of Docker. For more information, see
- the [Docker Engine installation documentation](https://docs.docker.com/engine/installation/).
-
-## Starting the services
-
-1. Clone this repository:
-
- ```bash
- git clone https://github.com/Apicurio/apicurio-registry-examples.git
- ```
-
-1. Change to the following directory:
-
- ```bash
- cd event-driven-architecture
- ```
-
-1. Start the environment
-
- ```bash
- docker-compose up -d
- ```
-
-The last command will start the following components:
-
-- Single node Zookeeper and Kafka cluster
-- Single node Kafka Connect cluster
-- Apicurio service registry
-- PostgreSQL (ready for CDC)
-- KsqlDb instance
-- Kafka UI
-
-## Apicurio converters
-
-Configuring Avro at the Debezium Connector involves specifying the converter and schema registry as a part of
-the connectors configuration. The connector configuration file configures the connector but explicitly sets
-the (de-)serializers for the connector to use Avro and specifies the location of the Apicurio registry.
-
-> The container image used in this environment includes all the required libraries to access the connectors and converters.
-
-The following are the lines required to set the **key** and **value** converters and their respective registry
-configuration:
-
-```json
-{
- "value.converter.apicurio.registry.url": "http://schema-registry:8080/apis/registry/v2",
- "key.converter.apicurio.registry.url": "http://schema-registry:8080/apis/registry/v2",
- "value.converter": "io.apicurio.registry.utils.converter.AvroConverter",
- "key.converter.apicurio.registry.auto-register": "true",
- "key.converter": "io.apicurio.registry.utils.converter.AvroConverter",
- "value.converter.apicurio.registry.as-confluent": "true",
- "value.converter.apicurio.registry.use-id": "contentId"
-}
-```
-
-> The compatibility mode allows you to use other providers tooling to deserialize and reuse the schemas in the Apicurio service registry.
-
-### Create the connector
-
-Let's create the Debezium connector to start capturing the changes of the database.
-
-1. Create the connector using the REST API. You can execute this step either by using the curl command below
- or by creating the connector from the Kafka UI.
-
- ```bash
- curl -X POST http://localhost:8083/connectors -H 'content-type:application/json' -d @studio-connector.json
- ```
-
-### Check the data
-
-The previous step created and started the connector. Now, all the data inserted in the Apicurio Registry database will be captured by Debezium
-and sent as events into Kafka.
-
-## Summary
-
-By using this example you can test how to start a full even driven architecture, but it's up to you how to use the produced events in e.g. ksqldb to create streams/tables etc.
diff --git a/examples/jsonschema-validation/README.md b/examples/jsonschema-validation/README.md
new file mode 100644
index 0000000000..3560e52d71
--- /dev/null
+++ b/examples/jsonschema-validation/README.md
@@ -0,0 +1,58 @@
+# jsonschema-validation
+
+This is an Apicurio Registry example. For more information about Apicurio Registry see https://www.apicur.io/registry/
+
+## Instructions
+
+
+This example demonstrates how to use the Apicurio Registry Schema Validation library for JSON and JSON Schema.
+
+The following aspects are demonstrated:
+
+
+- Register the JSON Schema in the registry
+- Configuring a JsonValidator that will use Apicurio Registry to fetch and cache the schema to use for validation
+- Successfully validate Java objects using static configuration to always use the same schema for validation
+- Successfully validate Java objects using dynamic configuration to dynamically choose the schema to use for validation
+
+
+Pre-requisites:
+
+
+- Apicurio Registry must be running on localhost:8080
+
+
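+The same flow can be reproduced with plain HTTP plus any JSON Schema validator. The sketch
+below fetches the schema over the registry's v2 REST API and validates with the everit-org
+json-schema library instead of the Apicurio validation library this example actually uses;
+the group and artifact ids are placeholders:
+
+```java
+import java.net.URI;
+import java.net.http.HttpClient;
+import java.net.http.HttpRequest;
+import java.net.http.HttpResponse;
+
+import org.everit.json.schema.Schema;
+import org.everit.json.schema.SchemaLoader;
+import org.json.JSONObject;
+
+public class ValidationFlowSketch {
+
+    public static void main(String[] args) throws Exception {
+        // Fetch the latest schema content for the artifact from the registry.
+        HttpRequest req = HttpRequest.newBuilder(URI.create(
+                "http://localhost:8080/apis/registry/v2/groups/default/artifacts/my-json-schema"))
+                .GET().build();
+        String schemaJson = HttpClient.newHttpClient()
+                .send(req, HttpResponse.BodyHandlers.ofString()).body();
+
+        // Validate a candidate message; an invalid one throws ValidationException.
+        Schema schema = SchemaLoader.load(new JSONObject(schemaJson));
+        schema.validate(new JSONObject("{\"message\":\"hello\",\"time\":12345}"));
+    }
+}
+```
+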
+@author eric.wittmann@gmail.com
+
diff --git a/examples/make-readmes.py b/examples/make-readmes.py
new file mode 100644
index 0000000000..41001ff255
--- /dev/null
+++ b/examples/make-readmes.py
@@ -0,0 +1,74 @@
+import os
+import re
+
+# Function to find all Java files in a directory and its subdirectories
+def find_java_files(directory):
+ java_files = []
+ for root, _, files in os.walk(directory):
+ for file in files:
+ if file.endswith(".java"):
+ java_files.append(os.path.join(root, file))
+ return java_files
+
+# Function to extract specific comment blocks from a Java file
+def extract_comments(file_path):
+ comments = []
+    with open(file_path, 'r', encoding='utf-8') as file:
+ content = file.read()
+ comment_blocks = re.findall(r'/\*[\s\S]*?\*/', content)
+ for block in comment_blocks:
+ if "This example" in block:
+ comments.append(block)
+ return comments
+
+# Function to convert a list of comments to markdown format
+def comments_to_markdown(comments):
+ markdown = ""
+ for comment in comments:
+ # Clean up comment block, remove * from each line, and convert to markdown
+ clean_comment = comment.replace('/*', '').replace('*/', '').strip()
+ clean_comment_lines = [line.lstrip(' *') for line in clean_comment.split('\n')]
+ clean_comment = "\n".join(clean_comment_lines)
+ markdown += f"{clean_comment}\n\n"
+ return markdown
+
+# Function to create a README.md file from a template and extracted comments
+def create_readme(directory, markdown_content):
+ # Use the name of the directory as the H1 header
+ directory_name = os.path.basename(directory.rstrip('/'))
+ readme_template = f"# {directory_name}\n\nThis is an Apicurio Registry example. For more information about Apicurio Registry see https://www.apicur.io/registry/\n\n## Instructions\n\n"
+ readme_content = readme_template + markdown_content
+
+ # Write the README.md file
+ readme_path = os.path.join(directory, "README.md")
+    with open(readme_path, 'w', encoding='utf-8') as readme_file:
+ readme_file.write(readme_content)
+
+# Main function to process each immediate subdirectory
+def main():
+ # Get the base directory (where the script is run)
+ base_directory = os.getcwd()
+
+ # List all immediate subdirectories
+ subdirectories = [os.path.join(base_directory, d) for d in os.listdir(base_directory) if os.path.isdir(os.path.join(base_directory, d))]
+
+ # Process each subdirectory
+ for subdirectory in subdirectories:
+ # Find all Java files
+ java_files = find_java_files(subdirectory)
+
+ # Extract comments from each Java file
+ all_comments = []
+ for java_file in java_files:
+ comments = extract_comments(java_file)
+ all_comments.extend(comments)
+
+ # Convert comments to markdown
+ markdown_content = comments_to_markdown(all_comments)
+
+ # Create the README.md file
+ create_readme(subdirectory, markdown_content)
+
+# Entry point for the script
+if __name__ == "__main__":
+ main()
diff --git a/examples/mix-avro/README.md b/examples/mix-avro/README.md
new file mode 100644
index 0000000000..522514a753
--- /dev/null
+++ b/examples/mix-avro/README.md
@@ -0,0 +1,56 @@
+# mix-avro
+
+This is an Apicurio Registry example. For more information about Apicurio Registry see https://www.apicur.io/registry/
+
+## Instructions
+
+
+This example application showcases a scenario where Apache Avro messages are published to the same
+Kafka topic using different Avro schemas. It uses the Apicurio Registry serdes classes to serialize
+and deserialize those messages with their respective schemas, even though they arrive on the same Kafka topic.
+The following aspects are demonstrated:
+
+
+- Configuring a Kafka Serializer for use with Apicurio Registry
+- Configuring a Kafka Deserializer for use with Apicurio Registry
+- Auto-register the Avro schema in the registry (registered by the producer)
+- Data sent as a simple GenericRecord, no java beans needed
+- Producing and consuming Avro messages using different schemas mapped to different Apicurio Registry Artifacts
+
+
+Pre-requisites:
+
+
+- Kafka must be running on localhost:9092
+- Apicurio Registry must be running on localhost:8080
+
+
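+The key configuration detail is switching the artifact-resolution strategy from the default
+topic-based one to a record-based one, so each schema maps to its own artifact. A sketch
+(property and class names follow the 2.x serdes and are worth double-checking):
+
+```java
+import java.util.Properties;
+
+import org.apache.kafka.clients.producer.ProducerConfig;
+
+public class MixAvroConfigSketch {
+
+    public static Properties producerConfig() {
+        Properties props = new Properties();
+        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
+                "io.apicurio.registry.serde.avro.AvroKafkaSerializer");
+        props.put("apicurio.registry.url", "http://localhost:8080/apis/registry/v2");
+        props.put("apicurio.registry.auto-register", "true");
+        // Resolve the artifact from the Avro record's full name instead of the
+        // topic name, so several schemas can coexist on one topic.
+        props.put("apicurio.registry.artifact-resolver-strategy",
+                "io.apicurio.registry.serde.avro.strategy.RecordIdStrategy");
+        return props;
+    }
+}
+```
+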
+@author Fabian Martinez
+@author Carles Arnal
+
diff --git a/examples/protobuf-bean/README.md b/examples/protobuf-bean/README.md
new file mode 100644
index 0000000000..0c18695349
--- /dev/null
+++ b/examples/protobuf-bean/README.md
@@ -0,0 +1,26 @@
+# protobuf-bean
+
+This is an Apicurio Registry example. For more information about Apicurio Registry see https://www.apicur.io/registry/
+
+## Instructions
+
+
+This example demonstrates how to use the Apicurio Registry in a very simple publish/subscribe
+scenario with Protobuf as the serialization type. The following aspects are demonstrated:
+
+
+- Configuring a Kafka Serializer for use with Apicurio Registry
+- Configuring a Kafka Deserializer for use with Apicurio Registry
+- Auto-register the Protobuf schema in the registry (registered by the producer)
+- Data sent as a custom java bean and received as the same java bean
+
+
+Pre-requisites:
+
+
+- Kafka must be running on localhost:9092
+- Apicurio Registry must be running on localhost:8080
+
+
+@author eric.wittmann@gmail.com
+
diff --git a/examples/protobuf-find-latest/README.md b/examples/protobuf-find-latest/README.md
new file mode 100644
index 0000000000..453ab76301
--- /dev/null
+++ b/examples/protobuf-find-latest/README.md
@@ -0,0 +1,58 @@
+# protobuf-find-latest
+
+This is an Apicurio Registry example. For more information about Apicurio Registry see https://www.apicur.io/registry/
+
+## Instructions
+
+
+This example demonstrates how to use the Apicurio Registry in a very simple publish/subscribe
+scenario with Protobuf as the serialization type. The following aspects are demonstrated:
+
+
+- Configuring a Kafka Serializer for use with Apicurio Registry
+- Configuring a Kafka Deserializer for use with Apicurio Registry
+- Manually registering the Protobuf schema in the registry (using the RegistryClient before running the producer/consumer); this is equivalent to using the Maven plugin or a custom CI/CD process
+- Data sent as a custom java bean and received as the same java bean
+
+
+Pre-requisites:
+
+
+- Kafka must be running on localhost:9092
+- Apicurio Registry must be running on localhost:8080
+
+
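+For reference, pre-registering a schema can also be done with a plain HTTP call to the v2 API
+(the example itself uses the Java RegistryClient; the artifact id, file path, and headers here
+are illustrative):
+
+```java
+import java.net.URI;
+import java.net.http.HttpClient;
+import java.net.http.HttpRequest;
+import java.net.http.HttpResponse;
+import java.nio.file.Files;
+import java.nio.file.Path;
+
+public class RegisterSchemaSketch {
+
+    public static void main(String[] args) throws Exception {
+        String proto = Files.readString(Path.of("src/main/resources/person.proto"));
+        // POST the .proto content; X-Registry-ArtifactId names the artifact.
+        HttpRequest req = HttpRequest.newBuilder(URI.create(
+                "http://localhost:8080/apis/registry/v2/groups/default/artifacts"))
+                .header("Content-Type", "application/x-protobuf")
+                .header("X-Registry-ArtifactId", "person")
+                .header("X-Registry-ArtifactType", "PROTOBUF")
+                .POST(HttpRequest.BodyPublishers.ofString(proto))
+                .build();
+        HttpResponse<String> res = HttpClient.newHttpClient()
+                .send(req, HttpResponse.BodyHandlers.ofString());
+        System.out.println(res.statusCode() + ": " + res.body());
+    }
+}
+```
+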
+@author eric.wittmann@gmail.com
+@author carles.arnal@redhat.com
+
diff --git a/examples/protobuf-validation/README.md b/examples/protobuf-validation/README.md
new file mode 100644
index 0000000000..a8b92bed70
--- /dev/null
+++ b/examples/protobuf-validation/README.md
@@ -0,0 +1,26 @@
+# protobuf-validation
+
+This is an Apicurio Registry example. For more information about Apicurio Registry see https://www.apicur.io/registry/
+
+## Instructions
+
+
+This example demonstrates how to use the Apicurio Registry Schema Validation library for Protobuf.
+
+The following aspects are demonstrated:
+
+
+- Register the Protobuf Schema in the registry
+- Configuring a Protobuf validator that will use Apicurio Registry to fetch and cache the schema to use for validation
+- Successfully validate Java objects using static configuration to always use the same schema for validation
+- Successfully validate Java objects using dynamic configuration to dynamically choose the schema to use for validation
+
+
+Pre-requisites:
+
+
+- Apicurio Registry must be running on localhost:8080
+
+
+@author carnalca@redhat.com
+
diff --git a/examples/quarkus-auth/README.md b/examples/quarkus-auth/README.md
index 6a8659d1ea..55f9f94c30 100644
--- a/examples/quarkus-auth/README.md
+++ b/examples/quarkus-auth/README.md
@@ -1,13 +1,6 @@
-# Commands
+# quarkus-auth
 
-`mvn generate-sources -Pavro`
+This is an Apicurio Registry example. For more information about Apicurio Registry see https://www.apicur.io/registry/
 
-`mvn package`
+## Instructions
 
-Set the envorinment variables for the registry url, auth details, kafka boostrap servers ...
-
-`java -jar target/quarkus-app/quarkus-run.jar`
-
-Or
-
-`mvn quarkus:dev`
\ No newline at end of file
diff --git a/examples/rest-client-downstream/README.md b/examples/rest-client-downstream/README.md
index 7676a9a82f..6353c8dc63 100644
--- a/examples/rest-client-downstream/README.md
+++ b/examples/rest-client-downstream/README.md
@@ -1,11 +1,6 @@
-# Apicurio Rest Client example application using your RHOSR instance.
+# rest-client-downstream
 
-1. Create RHOSR Managed Service instance on cloud.redhat.com and store your instance api url.
+This is an Apicurio Registry example. For more information about Apicurio Registry see https://www.apicur.io/registry/
 
-2. Create associated Service Account, save client Id and Client Secret.
+## Instructions
 
-3. Ensure your service account has at least, manager permissions on your RHOSR instance.
-
-4. Set the environment variables AUTH_CLIENT_ID, AUTH_CLIENT_SECRET, AUTH_TOKEN_URL and REGISTRY_URL.
-
-5. Execute the java main SimpleRegistryDemo on this module, it will create, get and delete a schema in your instance, proving the functioning of the service.
diff --git a/examples/rest-client/README.md b/examples/rest-client/README.md
new file mode 100644
index 0000000000..13f2ca0f97
--- /dev/null
+++ b/examples/rest-client/README.md
@@ -0,0 +1,6 @@
+# rest-client
+
+This is an Apicurio Registry example. For more information about Apicurio Registry see https://www.apicur.io/registry/
+
+## Instructions
+
diff --git a/examples/serdes-with-references/README.md b/examples/serdes-with-references/README.md
new file mode 100644
index 0000000000..7f7f910e6a
--- /dev/null
+++ b/examples/serdes-with-references/README.md
@@ -0,0 +1,6 @@
+# serdes-with-references
+
+This is an Apicurio Registry example. For more information about Apicurio Registry see https://www.apicur.io/registry/
+
+## Instructions
+
diff --git a/examples/simple-avro-downstream/README.md b/examples/simple-avro-downstream/README.md
index 32e4254c98..1707761a83 100644
--- a/examples/simple-avro-downstream/README.md
+++ b/examples/simple-avro-downstream/README.md
@@ -1,17 +1,26 @@
-# Apicurio Rest Client example application using your RHOSR instance.
+# simple-avro-downstream
 
-1. Create RHOSR Managed Service instance on cloud.redhat.com and store your instance api url.
+This is an Apicurio Registry example. For more information about Apicurio Registry see https://www.apicur.io/registry/
 
-2. Create associated Service Account, save client Id and Client Secret.
+## Instructions
 
-3. Ensure your service account has at least, manager permissions on your RHOSR instance.
 
-4. Create or use an existing instance of Openshift Streams for Apache Kafka. Get the bootstraps servers for that instance.
+This example demonstrates how to use the Apicurio Registry in a very simple publish/subscribe
+scenario with Avro as the serialization type. The following aspects are demonstrated:
 
-5. Create a topic with the name SimpleAvroExample on that Openshift Streams for Apache Kafka instance.
+
+- Configuring a Kafka Serializer for use with Apicurio Registry
+- Configuring a Kafka Deserializer for use with Apicurio Registry
+- Auto-register the Avro schema in the registry (registered by the producer)
+- Data sent as a simple GenericRecord, no java beans needed
+
+
+Pre-requisites:
 
-6. Ensure that the previously created service account has permissions on that Kafka instance topic for producing and consuming from that topic.
+
+- Kafka must be running on localhost:9092
+- Apicurio Registry must be running on localhost:8080
+
 
-7. Set the environment variables SERVERS, AUTH_CLIENT_ID, AUTH_CLIENT_SECRET, AUTH_TOKEN_URL and REGISTRY_URL.
+@author eric.wittmann@gmail.com
 
-8. Execute the java main SimpleAvroExample on this module, it will produce and consume 5 messages, creating and enforcing a schema during the way, proving the functioning of the service with a realistic application.
diff --git a/examples/simple-avro-maven/README.md b/examples/simple-avro-maven/README.md
new file mode 100644
index 0000000000..1346a66829
--- /dev/null
+++ b/examples/simple-avro-maven/README.md
@@ -0,0 +1,31 @@
+# simple-avro-maven
+
+This is an Apicurio Registry example. For more information about Apicurio Registry see https://www.apicur.io/registry/
+
+## Instructions
+
+
+This example demonstrates how to use the Apicurio Registry in a very simple publish/subscribe
+scenario with Avro as the serialization type and the Schema pre-registered via a Maven plugin.
+The following aspects are demonstrated:
+
+
+- Configuring a Kafka Serializer for use with Apicurio Registry
+- Configuring a Kafka Deserializer for use with Apicurio Registry
+- Pre-register the Avro schema in the registry via the Maven plugin
+- Data sent as a simple GenericRecord, no java beans needed
+
+
+Pre-requisites:
+
+
+- Kafka must be running on localhost:9092
+- Apicurio Registry must be running on localhost:8080
+- Schema is registered by executing "mvn io.apicurio:apicurio-registry-maven-plugin:register@register-artifact"
+
+
+Note that this application will fail if the above maven command is not run first, since
+the schema will not be present in the registry.
+
+@author eric.wittmann@gmail.com
+
diff --git a/examples/simple-avro/README.md b/examples/simple-avro/README.md
new file mode 100644
index 0000000000..1b193aed8d
--- /dev/null
+++ b/examples/simple-avro/README.md
@@ -0,0 +1,56 @@
+# simple-avro
+
+This is an Apicurio Registry example. For more information about Apicurio Registry see https://www.apicur.io/registry/
+
+## Instructions
+
+
+This example demonstrates how to use the Apicurio Registry in a very simple publish/subscribe
+scenario with Avro as the serialization type. The following aspects are demonstrated:
+
+
+- Configuring a Kafka Serializer for use with Apicurio Registry
+- Configuring a Kafka Deserializer for use with Apicurio Registry
+- Auto-register the Avro schema in the registry (registered by the producer)
+- Data sent as a simple GenericRecord, no java beans needed
+
+
+Pre-requisites:
+
+
+- Kafka must be running on localhost:9092
+- Apicurio Registry must be running on localhost:8080
+
+
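+To complement the description above, here is a minimal consumer-side sketch. It is not copied
+from the example source, and the topic and group ids are placeholders:
+
+```java
+import java.util.Collections;
+import java.util.Properties;
+
+import org.apache.avro.generic.GenericRecord;
+import org.apache.kafka.clients.consumer.ConsumerConfig;
+import org.apache.kafka.clients.consumer.KafkaConsumer;
+import org.apache.kafka.common.serialization.StringDeserializer;
+
+public class ConsumerSetupSketch {
+
+    public static KafkaConsumer<String, GenericRecord> createConsumer() {
+        Properties props = new Properties();
+        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
+        props.put(ConsumerConfig.GROUP_ID_CONFIG, "simple-avro-example");
+        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
+        // Deserialize values by fetching the writer's schema from the registry.
+        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
+                "io.apicurio.registry.serde.avro.AvroKafkaDeserializer");
+        props.put("apicurio.registry.url", "http://localhost:8080/apis/registry/v2");
+        KafkaConsumer<String, GenericRecord> consumer = new KafkaConsumer<>(props);
+        consumer.subscribe(Collections.singletonList("SimpleAvroTopic"));
+        return consumer;
+    }
+}
+```
+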
+@author eric.wittmann@gmail.com
+
diff --git a/examples/simple-json/README.md b/examples/simple-json/README.md
new file mode 100644
index 0000000000..c02e8f9224
--- /dev/null
+++ b/examples/simple-json/README.md
@@ -0,0 +1,59 @@
+# simple-json
+
+This is an Apicurio Registry example. For more information about Apicurio Registry see https://www.apicur.io/registry/
+
+## Instructions
+
+
+This example demonstrates how to use the Apicurio Registry in a very simple publish/subscribe
+scenario with JSON as the serialization type (and JSON Schema for validation). Because JSON
+Schema is only used for validation (not actual serialization), it can be enabled and disabled
+without affecting the functionality of the serializers and deserializers. However, if
+validation is disabled, invalid data can be produced and consumed without being detected.
+
+The following aspects are demonstrated:
+
+
+- Register the JSON Schema in the registry
+- Configuring a Kafka Serializer for use with Apicurio Registry
+- Configuring a Kafka Deserializer for use with Apicurio Registry
+- Data sent as a MessageBean
+
+
+Pre-requisites:
+
+
+- Kafka must be running on localhost:9092
+- Apicurio Registry must be running on localhost:8080
+
+
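+A minimal serializer configuration sketch for this scenario (not from the example source; the
+exact SerdeConfig property that toggles validation varies by serdes version, so it is only
+referenced in a comment here):
+
+```java
+import java.util.Properties;
+
+import org.apache.kafka.clients.producer.ProducerConfig;
+import org.apache.kafka.common.serialization.StringSerializer;
+
+public class JsonSerializerConfigSketch {
+
+    public static Properties producerConfig() {
+        Properties props = new Properties();
+        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
+        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
+        // JSON Schema serializer: serialization stays plain JSON; the schema is
+        // used only to validate each outgoing message.
+        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
+                "io.apicurio.registry.serde.jsonschema.JsonSchemaKafkaSerializer");
+        props.put("apicurio.registry.url", "http://localhost:8080/apis/registry/v2");
+        // Validation is switched on/off via a SerdeConfig option; see the
+        // SerdeConfig constants in the serdes version you use.
+        return props;
+    }
+}
+```
+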
+@author eric.wittmann@gmail.com
+
diff --git a/examples/simple-protobuf/README.md b/examples/simple-protobuf/README.md
new file mode 100644
index 0000000000..f364e0eedb
--- /dev/null
+++ b/examples/simple-protobuf/README.md
@@ -0,0 +1,58 @@
+# simple-protobuf
+
+This is an Apicurio Registry example. For more information about Apicurio Registry see https://www.apicur.io/registry/
+
+## Instructions
+
+
+This example demonstrates how to use the Apicurio Registry in a very simple publish/subscribe
+scenario with Protobuf as the serialization type. The following aspects are demonstrated:
+
+
+- Configuring a Kafka Serializer for use with Apicurio Registry
+- Configuring a Kafka Deserializer for use with Apicurio Registry
+- Auto-register the Protobuf schema in the registry (registered by the producer)
+- Data sent as a custom java bean and received as a generic DynamicMessage
+
+
+Pre-requisites:
+
+
+- Kafka must be running on localhost:9092
+- Apicurio Registry must be running on localhost:8080
+
+
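+On the consumer side, the deserializer hands back a generic com.google.protobuf.DynamicMessage,
+so no generated bean class is needed. A minimal sketch (not from the example source; group and
+topic names are placeholders):
+
+```java
+import java.util.Collections;
+import java.util.Properties;
+
+import com.google.protobuf.DynamicMessage;
+import org.apache.kafka.clients.consumer.ConsumerConfig;
+import org.apache.kafka.clients.consumer.KafkaConsumer;
+import org.apache.kafka.common.serialization.StringDeserializer;
+
+public class ProtobufConsumerSketch {
+
+    public static KafkaConsumer<String, DynamicMessage> createConsumer() {
+        Properties props = new Properties();
+        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
+        props.put(ConsumerConfig.GROUP_ID_CONFIG, "simple-protobuf-example");
+        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
+        // Fetches the Protobuf schema from the registry and decodes records
+        // into DynamicMessage instances.
+        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
+                "io.apicurio.registry.serde.protobuf.ProtobufKafkaDeserializer");
+        props.put("apicurio.registry.url", "http://localhost:8080/apis/registry/v2");
+        KafkaConsumer<String, DynamicMessage> consumer = new KafkaConsumer<>(props);
+        consumer.subscribe(Collections.singletonList("SimpleProtobufTopic"));
+        return consumer;
+    }
+}
+```
+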
+@author eric.wittmann@gmail.com
+
diff --git a/examples/simple-validation/README.md b/examples/simple-validation/README.md
new file mode 100644
index 0000000000..bd185ca632
--- /dev/null
+++ b/examples/simple-validation/README.md
@@ -0,0 +1,24 @@
+# simple-validation
+
+This is an Apicurio Registry example. For more information about Apicurio Registry see https://www.apicur.io/registry/
+
+## Instructions
+
+
+This example demonstrates how to integrate with Apicurio Registry when performing client-side validation of
+JSON messages. It imagines a generic scenario where JSON messages are sent/published to a custom
+messaging system for later consumption. It assumes that the JSON Schema used for validation has already been
+registered. The following aspects are demonstrated:
+
+- Fetch the JSON Schema from the registry
+- Generate and Validate JSON messages
+- Send validated messages to a messaging system
+
+Pre-requisites:
+
+- Apicurio Registry must be running on localhost:8080
+- JSON schema must be registered at coordinates default/SimpleValidationExample
+
+
+@author eric.wittmann@gmail.com
+
diff --git a/examples/tools/README.md b/examples/tools/README.md
new file mode 100644
index 0000000000..d32895bc56
--- /dev/null
+++ b/examples/tools/README.md
@@ -0,0 +1,6 @@
+# tools
+
+This is an Apicurio Registry example. For more information about Apicurio Registry see https://www.apicur.io/registry/
+
+## Instructions
+