# Supporting Hubs Default Avatar Animations
When creating a new robot body for use in Hubs, you will need to add a few additional components to the glTF file that makes up the avatar so that the new body plays the head and eye animations Hubs uses to indicate that a user is speaking.
To do this, first convert your avatar's .glb file into a .glTF file. If you are using the .blend files provided in this repository, you can instead export a .glTF directly from Blender. Open the .glTF file in a text editor and add the following components at the specified places, taking care to keep the JSON valid, with correct comma placement.
In the first scene in the `scenes` array:

```json
"extensions": {
  "MOZ_hubs_components": {
    "loop-animation": {
      "clip": "idle_eyes"
    }
  }
}
```
In the head node in `nodes`:

```json
"extensions": {
  "MOZ_hubs_components": {
    "scale-audio-feedback": ""
  }
},
```
In the `AvatarRoot` node:

```json
"extensionsUsed": [
  "MOZ_hubs_components"
]
```
You must then use a .glTF to .glb converter to turn the edited file back into the .glb format. When you upload your avatar .glb to Hubs, it should now include the eye and head movement animations.
Notes:
- If the JSON is not properly formatted, the file will fail to convert back into a .glb
- Blender's current .glTF importer modifies the skeleton of the file. If you use Blender for the conversion from .glTF to .glb, you may run into alignment issues with your avatar in Hubs
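Since malformed JSON is the most common cause of conversion failures, it can help to validate the file after hand-editing and before converting. A minimal check (the filename is whatever your edited glTF is called):

```python
import json

def check_gltf_json(path):
    """Return True if the file parses as JSON; report the error location otherwise."""
    try:
        with open(path) as f:
            json.load(f)
        return True
    except json.JSONDecodeError as e:
        print(f"Invalid JSON at line {e.lineno}, column {e.colno}: {e.msg}")
        return False
```

The line and column reported by `json.JSONDecodeError` usually point straight at a missing or extra comma from the hand edits.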