Development
Developers:
- Petre Tudor
- Timo Aho
Map hand gestures to other types of input
- Mouse control
- Keyboard control
- Custom control (invoke scripts)
- A GUI for defining gestures
A basic problem is that Leap Motion hand gestures provide no haptic feedback, so we need to provide feedback in other ways:
- Audio effects (mouse click, mouse down, button press etc)
- Visual feedback -> Some kind of status window? -> "Ghost hands" via webgl magic.. possibly?
- popup menus
Examples of use:
- Basic desktop usage: mouse movement, some limited keyboard support (arrow keys, page up/down), close window, switch windows, switch desktops
- Media usage: playback controls etc.
- Propellerhead use cases:
  -> Home automation integration
  -> Maybe a game control
Technologies:
- Git & GitHub
- Coffeescript
- Node.js
- Node libraries:
  -> robot.js
  -> something for desktop notifications (like Growl, but Growl does not support timers on notifications; see the sketch below)
  -> nw.js?
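For the desktop-notification item, a minimal sketch assuming the node-notifier package as one candidate (an assumption; it is not named in the list above):

```coffee
notifier = require 'node-notifier'

# Show a short-lived status popup, e.g. as non-haptic feedback for a
# recognized gesture or a synthesized mouse click.
notifier.notify
  title: 'leapgim'
  message: 'Right hand: mouse click'
```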
This is my idea of the basic architecture. The Frame Parser parses a Leap Motion frame into our own model, which contains information about hand positions so that it can be used in our gesture configuration.
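A rough skeleton of that loop, assuming the leapjs package; the real Frame Parser would build the model described below instead of just logging:

```coffee
Leap = require 'leapjs'

# Skeleton of the pipeline: every Leap Motion frame would be parsed into the
# leapgim frame model and then matched against the configured gestures.
Leap.loop (frame) ->
  for hand in frame.hands
    console.log "#{hand.type} hand at", hand.palmPosition
```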
The basic implementation is to query the screen resolution and use it to map the x and y attributes of the palm position onto a point on the screen. In addition, several complementary techniques should be used to make mouse movement feasible.
Sensitivity is implemented as a multiplier to the palm coordinates. A value of 1 maps mouse movement to the entire field of vision of the Leap Motion sensor. A value less than 1 makes mouse movement more accurate, but limits the mouse movement area to a portion of the screen. A value greater than 1 makes the mouse pointer reach the screen border before the palm reaches the edge of the sensor's read area.
A notable benefit of using a sensitivity value greater than one is that hand confidence levels drop near the edges of the sensor's field of view. Forcing the user to keep their hand near the middle of the device's FOV greatly increases accuracy near the screen borders.
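A minimal sketch of that mapping, assuming the robot.js listed above (published on npm as robotjs) and palm coordinates already normalized to the ranges used in the frame model below (x in [-1..1], y in [0..1]); the 1.5 default sensitivity is an arbitrary choice:

```coffee
robot = require 'robotjs'

clamp = (value, lo, hi) -> Math.min hi, Math.max(lo, value)

# Map a normalized palm position (x in [-1..1], y in [0..1]) to a screen
# coordinate.  A sensitivity above 1 reaches the screen border before the
# palm reaches the edge of the sensor's read area.
palmToScreen = (palmX, palmY, sensitivity = 1.5) ->
  { width, height } = robot.getScreenSize()
  nx = clamp palmX * sensitivity, -1, 1               # scale around the centre
  ny = clamp 0.5 + (palmY - 0.5) * sensitivity, 0, 1
  point =
    x: Math.round((nx + 1) / 2 * (width - 1))
    y: Math.round((1 - ny) * (height - 1))            # screen y grows downward
  point

moveTo = (palmX, palmY) ->
  { x, y } = palmToScreen palmX, palmY
  robot.moveMouse x, y
```

moveTo would be called once per frame with the normalized palm coordinates produced by the frame parser.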
The leapgim frame model contains human-readable information about tracked hands.
Legend:
{x,y,z} - a 3-element object representing a point in 3d space. The sensor device is used as the point of origin, and the possible values are [0..1] for y and [-1..1] for x and z. Palm position is mainly used for mouse control.
{direction} - ["up", "down", "left", "right", "forward", "backward"]
{extendedFingers} - { indexFinger, middleFinger, ringFinger, pinky, thumb }
{pinchingFinger} - [ "indexFinger", "middleFinger", "ringFinger", "pinky" ]
Hand model spec:
- handType: [ "left", "right" ]
- extendedFingers: {extendedFingers}
- pinchingFinger: {pinchingFinger}
- position: {x,y,z}
- palmDirection: {direction}
- palmNormal: {direction}
- grabStrength: [0..1]
- pinchStrength: [0..1]
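A sketch of a parser building that hand model from a leapjs hand object. The dominant-axis direction reduction and the 250 mm / 500 mm normalization extents are assumptions rather than Leap API values, and pinch-finger detection is left as a stub:

```coffee
FINGER_NAMES = ['thumb', 'indexFinger', 'middleFinger', 'ringFinger', 'pinky']

clamp = (value, lo, hi) -> Math.min hi, Math.max(lo, value)

# Reduce a Leap direction vector [x, y, z] to its dominant axis, expressed as
# one of the human-readable direction strings listed above.
toDirection = ([x, y, z]) ->
  [ax, ay, az] = [Math.abs(x), Math.abs(y), Math.abs(z)]
  if ax >= ay and ax >= az
    if x > 0 then 'right' else 'left'
  else if ay >= az
    if y > 0 then 'up' else 'down'
  else
    if z > 0 then 'backward' else 'forward'   # Leap z grows toward the user

# Rough normalization of palm position (millimetres in leapjs) into the
# model's ranges; the 250 mm / 500 mm extents are assumed, not API values.
normalizePosition = ([x, y, z]) ->
  x: clamp x / 250, -1, 1
  y: clamp y / 500, 0, 1
  z: clamp z / 250, -1, 1

# Build the leapgim hand model from a leapjs hand object.
parseHand = (hand) ->
  extended = {}
  extended[FINGER_NAMES[finger.type]] = finger.extended for finger in hand.fingers
  model =
    handType: hand.type                  # 'left' or 'right'
    extendedFingers: extended
    pinchingFinger: null                 # TODO: derive from thumb/fingertip distances
    position: normalizePosition(hand.palmPosition)
    palmDirection: toDirection(hand.direction)
    palmNormal: toDirection(hand.palmNormal)
    grabStrength: hand.grabStrength
    pinchStrength: hand.pinchStrength
  model
```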
The legacy version of leapgim held only one model, but we could consider keeping the previous frame saved as well. This would allow us to configure gestures such as "hand movement changed from left to right since the previous moment".
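A minimal sketch of that idea, assuming a frame model that holds a hands array of hand models as above; the movement threshold is arbitrary:

```coffee
previousModel = null

# Compare each hand against its counterpart in the previous frame, e.g. to
# tell whether it moved left or right since the previous moment.
checkTransitions = (model) ->
  if previousModel?
    for hand in model.hands
      prev = (h for h in previousModel.hands when h.handType is hand.handType)[0]
      continue unless prev?
      dx = hand.position.x - prev.position.x
      if Math.abs(dx) > 0.05               # arbitrary movement threshold
        direction = if dx > 0 then 'right' else 'left'
        console.log "#{hand.handType} hand moved #{direction}"
  previousModel = model
```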
This draft does not include gestures. Information about native Leap gestures should be passed in a separate array.
This draft does not include timed gestures. We need to include timers at some point.
The gesture model contained pretty much the same fields as the frame model, with some notable differences:
- For numeric values a {min, max} map object was used to provide acceptable parameters for each gesture.
- Timer attributes for gestures that were to be held for a certain time period before triggering an action.
- Basically any value could be omitted.
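A sketch of how such a matcher might look, assuming {min, max} range objects for numeric fields and treating omitted fields as wildcards (timers are left out):

```coffee
# Does a numeric value fall inside a {min, max} range?  Missing bounds are open.
inRange = (value, range) ->
  (not range.min? or value >= range.min) and (not range.max? or value <= range.max)

# Match one hand model against a gesture definition.  Fields omitted from the
# gesture act as wildcards; numeric fields are given as {min, max} ranges.
matchesGesture = (hand, gesture) ->
  for key, expected of gesture
    actual = hand[key]
    if expected? and typeof expected is 'object' and (expected.min? or expected.max?)
      return false unless inRange(actual, expected)
    else if expected?
      return false unless actual is expected
  true

# Hypothetical example: a closed right-hand fist.
fist =
  handType: 'right'
  grabStrength: { min: 0.9, max: 1 }
```

A timer attribute could then wrap this check so that an action fires only after matchesGesture has stayed true for the configured period.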
Main Repo: https://github.com/Zeukkari/leapgim
Track Main Repo from local fork: git remote add --track master leapgim git@github.com:Zeukkari/leapgim.git
Pull changes from repo: git fetch leapgim