Amid yesterday's slew of education-focused announcements, Apple quietly slipped out an update to the free GarageBand for iOS app. Among other new features, it now lets iPhone X owners use the TrueDepth camera to control effect and synth parameters with facial movements, completely hands-free.
It's the same technology that powers Animoji face tracking, and while it's a relatively niche development, it does hint at what could come next for live performance and for how we interact with iOS devices. If this tech eventually reaches a wider range of iPhones and the iPad, it will likely be more capable by then than it is now.
Here's how Apple describes the new feature:
* Requires iPhone X. GarageBand uses ARKit face tracking features to translate your facial expressions into instrument effect controls. Your face information is processed on device, and only music is captured during your performance.
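For the developer-curious, the ARKit face tracking Apple mentions exposes per-expression "blend shape" values that an app can map onto any parameter it likes. Below is a minimal, hypothetical sketch of that idea; the mapping, the `setFilterCutoff` hook, and the parameter ranges are our own assumptions, not Apple's implementation.

```swift
import ARKit

// Hypothetical sketch: routing ARKit face-tracking data to a synth
// parameter, in the spirit of GarageBand's new face control.
final class FaceEffectController: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires the TrueDepth camera (iPhone X at the time).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        // blendShapes values run 0.0–1.0; here, mouth opening drives a filter sweep.
        let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
        setFilterCutoff(200 + jawOpen * 8_000) // hypothetical audio-engine hook
    }

    func setFilterCutoff(_ hz: Float) {
        // Forward the value to your audio engine here.
    }
}
```

Note that, as Apple's release note says, all of this face data is processed on device; nothing in the ARKit pipeline requires sending it anywhere.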
What do you think of this? Are you excited to be able to "face the music"? Have you tried it out yet? Let us know!