Hi Dick, thanks for commenting. Yes, I agree — this is a hack.
In my own work, I’m much more likely to be turning a single cursor ON/OFF from an algorithm running inside a Pd patch or some other process, rather than via physical gestures. Users of widely varying levels of sophistication visit this forum, and I was hoping to inform the user who does not yet realize that one has that level of control over IanniX (I believe all the example patches show IanniX only controlling something downstream, rather than vice versa).
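For anyone who hasn’t tried driving IanniX from the outside, here’s a minimal sketch of what that looks like in plain Python: build a raw OSC packet by hand and fire it over UDP at the port IanniX is listening on. The address pattern, argument, and port below are placeholders for illustration only; check the IanniX OSC reference for the actual command set and your score’s configured input port.

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC 1.0 spec."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address: str, *args: int) -> bytes:
    """Build a minimal OSC message with int32 arguments only."""
    packet = osc_pad(address.encode("ascii"))
    packet += osc_pad(("," + "i" * len(args)).encode("ascii"))
    for a in args:
        packet += struct.pack(">i", a)  # int32, big-endian
    return packet

# Hypothetical address and port -- consult the IanniX documentation
# for the real commands it accepts.
msg = osc_message("/cursor/1/active", 0)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(msg, ("127.0.0.1", 1234))
```

The point is only that any process capable of sending a UDP datagram can take the controlling role, no MIDI anywhere in the path.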
In the video, I used a MIDI controller in order to have something visual to relate to on-screen changes in some of IanniX’s objects (and to show that IanniX and *something else*, regardless of what that something else is, can both exert control over some other process). I’m hoping the viewer can also imagine another controller: maybe a smartphone, a Lemur, or a joystick with flight-sim pedals; or maybe just another program, such as a SuperCollider or Max patch. The MIDI is incidental.
MIDI is something I’ve been working to extricate from my life. I would ask: why deal with the limitations of MIDI when something like OSC exists? I can think of one answer: when the only physical gesture-capturing devices one has are MIDI-based. This, of course, applies to many of us. I view OSC as both an inter-process and an intra-process messaging mechanism. I can’t imagine using MIDI for such purposes (well, I don’t want to imagine it :–)
If one uses OSC-based expressions inside one’s algorithms, it becomes easier to break apart a CPU-heavy process and run parts of it on another computer across the network. This method of function parameter-passing (speaking in procedural-programming terms) can then also become the mechanism for eventual load distribution. Cool, I shout!
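To make the parameter-passing idea concrete, here’s a rough stdlib-only sketch: one thread plays the “remote” worker receiving a float parameter as an OSC-style message, while the main thread passes the argument over UDP. Moving the worker to another machine would then mean changing only the destination address. The `/filter/cutoff` address and the doubling “work” are invented for illustration.

```python
import socket
import struct
import threading

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary (OSC 1.0)."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def encode(address: str, value: float) -> bytes:
    """One OSC message carrying a single float32 argument."""
    return osc_pad(address.encode()) + osc_pad(b",f") + struct.pack(">f", value)

def decode(packet: bytes):
    """Decode a single-float message built by encode() above."""
    address = packet[: packet.index(b"\x00")].decode()
    (value,) = struct.unpack(">f", packet[-4:])
    return address, value

def worker(sock: socket.socket, results: list):
    """The 'remote' callee: receive the parameter, do the heavy lifting."""
    packet, _ = sock.recvfrom(1024)
    address, value = decode(packet)
    results.append((address, value * 2.0))  # stand-in for real DSP work

recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv.bind(("127.0.0.1", 0))          # any free port
recv.settimeout(5)
port = recv.getsockname()[1]
results: list = []
t = threading.Thread(target=worker, args=(recv, results))
t.start()

send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send.sendto(encode("/filter/cutoff", 440.0), ("127.0.0.1", port))
t.join()
print(results)  # [('/filter/cutoff', 880.0)]
```

Swap `127.0.0.1` for another host’s address and the same call becomes a distributed one, which is the whole appeal.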
Sorry for the long-winded reply that turned into a manifesto.
I really like your OSC demo video. One thing seemed a bit hacky to me, namely the use of Max to interpret MIDI and forward the control info to IanniX via OSC. IanniX can send MIDI info, and the documents hint that it can receive (and, presumably, respond) as well. But they don’t tell you HOW to do it.
Edit: I take it back: I recall the Max patch example *does* show how the downstream process can control the IanniX timeline rate!