ManipNet, Gestures, MIDI De-Quantization

Discussion in 'Software' started by Stranger Here Myself, Sep 20, 2021.

  1. Stranger Here Myself

    Stranger Here Myself Newbie

    Joined:
    May 6, 2021
    Messages:
    4
    Likes Received:
    0
    Something from SIGGRAPH caught my eye:
    https://github.com/cghezhang/ManipNet

    The gist of ManipNet is that it synthesizes detailed finger gestures of the hands from sparse wrist and object trajectories, conditioned on the shape of the object those hands interact with.
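    To make the interface concrete, here is a rough sketch of the kind of mapping I mean (PyTorch; all the dimensions and names are invented by me, not taken from the paper):

    ```python
    # Hypothetical sketch of a ManipNet-style mapping: sparse wrist/object
    # trajectories plus object shape features in, dense finger poses out.
    import torch
    import torch.nn as nn

    class ManipNetSketch(nn.Module):
        def __init__(self, traj_dim=12, shape_dim=64, joint_dim=2 * 15 * 3):
            super().__init__()
            # traj_dim:  wrist + object positions/rotations per frame (sparse input)
            # shape_dim: features describing the object surface near the hands
            # joint_dim: per-frame rotations for the finger joints of both hands
            self.net = nn.Sequential(
                nn.Linear(traj_dim + shape_dim, 512),
                nn.ELU(),
                nn.Linear(512, 512),
                nn.ELU(),
                nn.Linear(512, joint_dim),
            )

        def forward(self, traj, shape_feat):
            # Concatenate per-frame trajectory and shape features, predict poses.
            return self.net(torch.cat([traj, shape_feat], dim=-1))
    ```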

    This led me to think that the MIDI of a solo instrument might suffice as the input for a specifically trained ManipNet-style network, which might then be able to output the gestures of the musician's hands.
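    On the MIDI side, something like this could turn a solo performance into a fixed-rate feature sequence for such a net (this uses the real `mido` library for parsing; the 60 fps piano-roll framing is just my assumption):

    ```python
    # Sketch: convert a solo-instrument MIDI file into a fixed-rate onset
    # feature sequence that a ManipNet-style network could consume.
    import mido
    import numpy as np

    def midi_to_frames(path, fps=60):
        events = []        # (absolute_time_seconds, pitch, velocity)
        t = 0.0
        for msg in mido.MidiFile(path):
            t += msg.time  # iterating a MidiFile yields delta times in seconds
            if msg.type == 'note_on' and msg.velocity > 0:
                events.append((t, msg.note, msg.velocity))
        n_frames = int(t * fps) + 1
        frames = np.zeros((n_frames, 128), dtype=np.float32)
        for when, pitch, vel in events:
            frames[int(when * fps), pitch] = vel / 127.0
        return frames  # shape (n_frames, 128): one onset-strength row per frame
    ```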

    Then, if we know how hand gestures correspond to the stresses of a performance, we could de-quantize the original MIDI based on that (or output CCs).
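    Taking the "output CCs" branch, the last step might look roughly like this (assuming we already have a per-note stress value in [0, 1] recovered from the predicted gestures; the stress-to-velocity and stress-to-CC11 mappings are placeholders I made up, and it assumes a single-track file):

    ```python
    # Sketch: write stress back into the MIDI as expression CCs and scaled
    # velocities. Only the `mido` calls are real API; the mappings are invented.
    import mido

    def apply_stress(in_path, out_path, stresses):
        mid = mido.MidiFile(in_path)
        out_track = mido.MidiTrack()
        note_idx = 0
        for msg in mid.tracks[0]:
            if msg.type == 'note_on' and msg.velocity > 0:
                s = stresses[note_idx % len(stresses)]
                note_idx += 1
                # Emit a CC11 (expression) event at the note's position...
                out_track.append(mido.Message(
                    'control_change', channel=msg.channel, control=11,
                    value=int(40 + 87 * s), time=msg.time))
                # ...then the note itself, velocity scaled by stress, delta 0
                # so its timing is unchanged.
                new_vel = max(1, min(127, int(msg.velocity * (0.6 + 0.8 * s))))
                out_track.append(msg.copy(velocity=new_vel, time=0))
            else:
                out_track.append(msg)
        out = mido.MidiFile(ticks_per_beat=mid.ticks_per_beat)
        out.tracks.append(out_track)
        out.save(out_path)
    ```

    Actual de-quantization of the timing would work the same way, by nudging the delta times instead of (or as well as) the velocities.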

    Does this sound feasible? I'm an ML hobbyist and would like to know what you think of the idea.
     
    Last edited: Sep 21, 2021
  2. Xupito

    Xupito Audiosexual

    Joined:
    Jan 21, 2012
    Messages:
    7,292
    Likes Received:
    4,028
    Location:
    Europe
    There has been research and experimentation for quite some time on interfacing with just the body for music, VJing, ... games. Lately using ML/AI.
    AFAIK there still isn't anything that powerful in practice, but I'm not fully up to date.
     