Re: Track automation settings and ...
Yes, Ski, that's what I mean. Rounik's solution works very well for me because I can just set the transformers up as "automation splitters" and have the desired parameters manipulated whether I'm recording the automation, just 'playing' the parameters in real time, or both! I understood about the 'special fader data' Logic uses, but your first comment in this post got my attention because it offered another way to accomplish this. I should have clarified in my initial post that the iXY would be controlling fader data.
Your suggestion was not at all lost, though, Ski, because I'm sure the occasion will come up for me to employ those methods in the future. I can see you, for example, doing some of the tutorial work mentioned in this forum a few days ago and showing how we can get expressive control over the instruments you've mentioned (EXS24, Sculpture, ES-2) in order to create more lifelike simulations of real instruments. That would be the perfect opportunity to use external controllers to make those expressive changes by manipulating the on-board parameters of those software instruments. I think that's what you had in mind with your choice 'a' above, and I'd be very interested to see - and HEAR - how you'd go about adding realism this way.

Do you use x-y controllers of any kind for this? It seems to me that would give the player much more power and flexibility of expression, since two different expressive parameters can be controlled at once, and with just one finger, to boot! The iXY app lets you put as many as three separate x-y controls on the same virtual pad, so theoretically three fingers of one hand could manipulate those three x-y controls simultaneously and independently of each other. That means six separate expressive parameters could be controlled at once with one hand, while the other hand plays a line on the keyboard that could wind up having a lot of realism!
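Just to make the "two parameters per finger" idea concrete, here's a minimal sketch of how an x-y pad position could translate into a pair of MIDI CC messages, and how three pads would yield six CCs. The CC numbers and the normalized 0.0-1.0 pad coordinates are my own illustrative assumptions, not what iXY or Logic actually uses internally:

```python
# Sketch: one x-y pad position -> two 7-bit MIDI CC values.
# CC assignments below are hypothetical examples, not iXY's actual mapping.

def pad_to_ccs(x, y, cc_x, cc_y):
    """Convert normalized pad coordinates (0.0-1.0) into (cc_number, value) pairs."""
    clamp = lambda v: max(0.0, min(1.0, v))
    return [(cc_x, round(clamp(x) * 127)),
            (cc_y, round(clamp(y) * 127))]

# Three pads -> six CCs: e.g. cutoff/resonance, mod depth/vibrato rate, attack/release
PAD_CCS = [(74, 71), (1, 76), (73, 72)]  # hypothetical CC assignments per pad

def pads_to_ccs(positions):
    """positions: list of three (x, y) tuples, one per x-y control."""
    msgs = []
    for (x, y), (cc_x, cc_y) in zip(positions, PAD_CCS):
        msgs.extend(pad_to_ccs(x, y, cc_x, cc_y))
    return msgs
```

So one hand on the pad emits up to six independent controller streams, which the Environment (or a controller assignment) can then route to whatever instrument parameters you like.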
To address your choices 'b' and 'c': I have experienced the flaky behavior you mentioned with the Learn function, and my most recent attempt to use it, along with your comments and other people's, has confirmed it. This is why I tend to go to the Environment as a first and best course of action. I have come to really value the Environment and view it as a world of almost endless musical possibilities. What I've found with transformers, especially when using them in live MIDI applications, is that they provide reliable, rock-solid transformation of MIDI data into the desired result every time. So, yes, Ski, your answer is very much right on the money.
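For anyone following along who hasn't used the Environment much: a Transformer is essentially a condition (which events match) plus an operation (how matching events are rewritten), with everything else passing through untouched. A rough sketch of that idea, using a made-up event format rather than Logic's internal one:

```python
# Sketch of a Transformer-style object: condition selects events,
# operation rewrites them, non-matching events pass through unchanged.
# The dict-based event format here is illustrative, not Logic's.

def make_transformer(condition, operation):
    def transform(event):
        return operation(event) if condition(event) else event
    return transform

# Example: remap mod wheel (CC 1) to breath control (CC 2) at half depth
is_modwheel = lambda ev: ev["type"] == "cc" and ev["number"] == 1
to_breath = lambda ev: {**ev, "number": 2, "value": ev["value"] // 2}

cc_remap = make_transformer(is_modwheel, to_breath)
```

The reason this feels rock-solid live is that there's no learning or guessing involved: every event is tested against a fixed condition and either rewritten deterministically or left alone.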