Geert Bevin has an impressive CV. He is co-author of the MIDI Polyphonic Expression specification and was product and engineering lead on Moog Music's iOS synths, most notably Minimoog Model D, Model 15 and Animoog. He also developed and co-designed most of the firmware of Roger Linn's LinnStrument, as well as EigenD, the software for the Eigenharp instrument. We caught up with him to get his thoughts on the past, present and future of music technology.
AskAudio: Can you briefly describe what your current career focus is?
Geert Bevin: I'm passionate about finding how cutting-edge technology can make people more expressive and allow for a closer connection to one's creative humanity. Since music is so personal and emotional, while often also technical, it's a perfect medium to explore new creative avenues while pushing technical boundaries.
AA: How did you get into your career as an audio developer?
As an early teenager, I was attracted to both computer programming and music. I sang, played guitar and spent hours in book stores, reading up on programming languages. Not owning a computer myself, I wrote programs on paper by hand and imagined all the cool things that they would do. It felt futuristic and fantastic at the same time, the promise of creating whatever I wanted in a virtual world that seemed limitless.
After years of setting money aside, I got a Commodore 64 with a Citizen pen printer and spent my free time writing games and graphical drawing tools for my own use. Later, I got a Commodore Amiga and a Yamaha YS-100 synthesizer, and learned how to sequence my music using MIDI.
At school, I spent my breaks on their PCs, dove into the Turbo Pascal language and wrote my first commercial software program for a local squash club, helping them to more efficiently organize tournaments. I was 16 years old and had already negotiated prices, dealt with customer requirements, and handled support and maintenance for the product that I had created.
Over the following years, I used my Amiga to teach myself C++ and released a few shareware programs, one of which was a musical ear trainer, because I needed help with the guitar, arrangement and composition classes I was taking at jazz school.
This was the first time that I realized both of my passions could be combined into a single effort, and that I didn't have to choose between one or the other. It took almost 15 years before I succeeded in making this my career.
AA: What attracted you to open-source?
When I learned how to program, the only options available to me were books and self-study. I pretty much learned everything by diving in, making mistakes, trying again, over and over, until I figured out what worked best. Once I started writing code for a living, I realized how much tooling and infrastructure was needed underneath and alongside my work, and that, just as with anything man-made, third-party software is not perfect. It contains bugs, inefficiencies, missing features, outdated protocols, insufficient documentation, and so on. I was already using Linux distributions and Java as much as possible, and realized that whenever something didn't work well, I could look for the source code, fix it myself or work around it with detailed knowledge. I learned a lot this way and also felt completely in control of the solutions I was providing to customers.
Once I started releasing my own software as open-source, I saw how powerful that aspect was. You immediately create a community of interested and passionate people who work together to improve what you're all using, so you're naturally lifting each other up. Not only does it make the quality of the software dramatically better, it also prevents people from being locked in and allows for an unbounded creative influx of ideas and new directions.
In the physical world, you can pull apart most things when they're broken and see if you can fix them. Maybe you realize you can't, but it's likely you can find someone else who can. With closed-source software, this becomes impossible. Given how much is governed by software these days, and how quickly things change, I think it's important for our future that we adopt similar repairability and freedom for the virtual world. It's no surprise that most of the software that runs the internet is open-source; it would be a real issue if this weren't the case.
AA: Linux and Java development sounds so much like the enterprise world, where does that fit into your journey?
In my early twenties, I was fascinated by the rise of the internet and wanted to create interactive websites, but didn't have access to any servers to experiment on. I discovered Linux, the Apache web server, Perl and PHP, and figured out that I could run everything on my own machine during development. This sounds so obvious nowadays, but back then it was pretty much a game changer. I started making an income as a frontend and server-side developer while studying to become an architect, writing my own music and playing at clubs in the evenings.
There were very few people with the technical website skills that I had acquired, and I landed a job in a Belgian startup as a software engineer. They were doing work for the European Space Agency in Java and took me under their wing. I had the opportunity to learn everything about the Java language and platform in a very professional and mission-critical context.
I was attracted to the power of Java, but also baffled at how rigid, heavy and cumbersome everything felt. I decided that I wanted to tap into all of the platform's goodness, but to create a framework that allowed single developers and small teams to take a very lean and pragmatic approach. This web framework, RIFE, pioneered many novel ideas and got me elected as one of the first Sun Java Champions out of a pool of over 6 million Java developers worldwide. I was invited to talk at conferences all over the world, and started my own consultancy business, working for companies inside and outside of Belgium, some of them in the United States, like online retailers and companies in charge of the infrastructure that powers many millions of credit card transactions every day.
I learned how to adopt the reliability of enterprise-level architectures and processes, and how to apply those to creative, focused and pragmatic solutions that were delivered in a fraction of the time and with a very small team of skilled engineers. It has allowed me to stand toe-to-toe with big consultancy firms and to come out with better results: feature complete, delivered on time, within budget, and standing the test of time.
AA: At one point it seems you made a drastic change and started contributing to the Eigenharp instruments, can you tell us more about that?
I always continued playing music and tried to connect my guitar skills to software synthesizers. I experimented with guitar-to-MIDI converters, but never felt an intimate connection to the generated sound due to the technical limitations.
When Eigenlabs announced the Eigenharp instruments, I immediately bought the Pico, since it made so much sense to have a grid layout with three-dimensional keys that were sensitive enough to digitally translate all the nuances of your playing directly into sound. I quickly became a passionate Eigenharp player and advocate, bought an Eigenharp Alpha (their flagship), and started demonstrating the instruments. I collaborated closely with Eigenlabs' product and development team, and when they posted a job opening for a C++ developer, I jumped right on it.
I felt very fortunate that they hired me, since I was finally able to combine music and programming as my main occupation. I learned a lot from the code that was driving the Eigenharp; it was very different from anything else that I had seen before and it touched on all the technical aspects of software engineering in the music world. Eigenlabs shrank their development team and soon we were just three people working on the software. I was able to make meaningful contributions and also felt empowered to make product decisions that significantly improved the instruments. The Eigenharp allowed me to kickstart a new direction in my career and to move into the music industry.
AA: Your personal music product, Geco MIDI, used the Leap Motion controller. What was your reasoning behind using this?
The more I practiced the Eigenharp, the more I realized how important sensors were to achieve a level of expressiveness that felt natural and inspiring. When the Leap Motion was announced, it promised cheap access to touchless gestures, with sensing precise enough to cross the threshold of perception. Think of it as the reverse of HiDPI (Retina) displays, where the pixels are so small that they become imperceptible to the eye: if a sensor can detect small enough motions, fast enough, your senses will be unable to notice that it is digital.
I got one of the first Leap Motion controllers and signed up to their development program. I quickly wrote a prototype that translated three-dimensional hand and finger motions into sound. It then took weeks of real-world testing to figure out what actually felt musical, leaving enough room for exploration while determining which boundaries were necessary to allow anyone to develop their own gestural music language that felt intuitive.
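The core of that kind of gestural prototype is translating a continuous sensor coordinate into a bounded MIDI controller value without jitter. As a minimal sketch of the idea (this is not Geco MIDI's actual implementation; the millimetre range and smoothing constant are illustrative assumptions):

```python
def hand_to_cc(height_mm, lo_mm=100.0, hi_mm=400.0):
    """Map a hand height above the sensor (in mm) to a MIDI CC value 0-127,
    clamping to an assumed playable range of 100-400 mm."""
    norm = (height_mm - lo_mm) / (hi_mm - lo_mm)
    norm = max(0.0, min(1.0, norm))  # clamp to [0, 1]
    return round(norm * 127)

class Smoother:
    """One-pole low-pass filter to tame sensor jitter before sending CC
    messages; alpha trades responsiveness against smoothness."""
    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.value = None
    def step(self, x):
        if self.value is None:
            self.value = float(x)
        else:
            self.value += self.alpha * (x - self.value)
        return self.value
```

Defining explicit boundaries like `lo_mm` and `hi_mm` is exactly the kind of tuning the real-world testing was about: too narrow and there's no room for nuance, too wide and players can't reliably reach the extremes.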
Geco MIDI was released as soon as Leap Motion opened their app store, and it quickly became the de facto music software for their device. Within days, people were posting creative videos of how Geco MIDI musically inspired them, and some artists built a career on this new gestural universe that had opened up to them. Geco MIDI sold tens of thousands of copies and is still being used by many musicians, DJs, composers, arrangers and performers.
When I saw how much people loved Geco MIDI, I wondered if I could leverage my knowledge to create a product that made it possible to play video games by moving your hands around in the air, without touching a physical device. GameWave was born, and it also became the standard Leap Motion choice for computer control.
Then something profound happened, something I had not anticipated at all. The gestural boundaries that I had determined for Geco MIDI informed the boundaries of GameWave, and it turned out that many of the capabilities worked just as well with your feet as with your hands. People who had lost the use of their hands or arms were adopting GameWave to control their computers and even to play video games. It was the first time I was confronted with how my work could profoundly change people's lives for the better. I still get emotional when I think back to the first time I saw someone on YouTube using his feet to control his computer with GameWave.
This is not where GameWave's unexpected usage ended. A few months later, I was contacted by surgeons who had successfully used my product to navigate 3D CT scans in the operating room while performing surgery. They had been looking for solutions that would allow them to stay inside the sterile field while still having very fine-grained and detailed control. Using physical devices like a mouse and keyboard required them to leave the room, take off their gloves, navigate the imagery, scrub their arms again, put on new gloves and re-enter the operating room. They had to remember all the necessary information and also lost precious time. With GameWave allowing touchless navigation of the imagery, they were able to look up exactly what they needed during the surgical procedure. I considered creating a dedicated product and entering the medical industry, but GameWave seemed to be doing the job well enough already, and I was personally more interested in pursuing my journey in the music industry.
My exploration of gesture control and high-resolution sensor technology excited a lot of people, and it led me to give a TEDx talk about how technology was creating a new age for digital musicians. I was never more nervous, nor more prepared, for a talk than for this one.
AA: Roger Linn is an icon in the electronic music world, how did you start collaborating with him on LinnStrument and what is it like working with him?
I met Roger Linn when I was demonstrating the Eigenharp in 2011. We organized a small 'new musical instruments' tour in California, together with Haken Audio's Continuum and David Wessel's SLABS from CNMAT. We visited Stanford's CCRMA, Berkeley's CNMAT, Guitar Center San Francisco and the SF Music Tech Conference. Roger brought his first LinnStrument prototype and I was intrigued by his vision. We kept in touch for a few years and continued exchanging ideas about expressive controllers. After we did another talk together at MoogFest 2014, I offered to help Roger with the LinnStrument firmware. We quickly found that we worked very well together, and over the course of that year we collaborated closely to bring LinnStrument from a prototype to a fully fledged, production-ready instrument.
Roger has become a dear friend and we continue to work together to improve the LinnStrument. The firmware was released as open-source from the start, so we also get ideas and code contributions from users that can be folded back into the official releases.
Roger's experience in the music industry is legendary and I've learned so much from his take on musical instrument design. We discuss any new feature or change to LinnStrument and it has become a real partnership. I'm still humbled to have been given the opportunity to work on equal footing with an inventor that has been so influential to so many musicians worldwide.
AA: It seems that you were very involved in the MIDI MPE specification, how did that come to be and what impact does it have on the music industry?
Every expressive controller I used or worked on suffered from the same problem: it was very hard to figure out how to leverage its capabilities with existing synthesizers. The MIDI specification provided all the necessary elements, but there was no agreement on how to set this up for multi-dimensional polyphonic expression. Since there was no consensus, very few synthesizers supported these new forms of expression, and since the sound capabilities were limited or difficult to achieve, people were reluctant to adopt these new controllers. It was a genuine chicken-and-egg situation.
At NAMM 2015, ROLI organized a meetup to discuss what would be necessary to converge on a standardized MIDI specification for new expressive electronic controllers. I had been reaching out to many synthesizer makers with suggestions, and after this meetup I volunteered to write the first draft of the MPE specification so that we had a starting point for discussion. That led to the creation of a NAMM expert group, and we spent three years debating what the final specification would look like. As often happens, we veered off into directions that immensely complicated the protocol, but I was adamant about keeping it as simple as possible so that the leap to adoption would be minimal. Luckily everyone in the expert group was very open-minded and understanding, and we were able to reach a consensus and strip everything back down to the bare minimum, which, interestingly, was not that far from my initial draft.
In January 2018, the MPE specification was officially adopted, and the impact on the music industry has been very noticeable. There are now countless MPE-compatible synthesizers, and more are coming out all the time. New MPE controllers are being introduced at an increasing rate, and musicians seem to have embraced the promise of their new capabilities.
Finally, the chicken-and-egg situation of polyphonic multi-dimensional expression is resolved, and I'm excited to hear the new music and styles that artists will create.
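The central idea of MPE is that each sounding note is assigned its own MIDI "member" channel, so that pitch bend, channel pressure and CC74 (timbre) can be applied per note instead of to the whole channel. A minimal sketch of that channel-allocation concept (an illustration only, not a complete MPE implementation; a real one would also handle zone configuration and channel stealing when all members are busy):

```python
class MpeChannelAllocator:
    """Assigns each active note its own MPE member channel.

    Uses the lower-zone layout: channel 1 is the master channel,
    channels 2-16 are the member channels carrying individual notes.
    """
    def __init__(self, member_channels=range(2, 17)):
        self.free = list(member_channels)
        self.active = {}  # MIDI note number -> member channel

    def note_on(self, note):
        channel = self.free.pop(0)  # take the longest-free channel
        self.active[note] = channel
        return ("note_on", channel, note)

    def pitch_bend(self, note, bend):
        # The bend travels on the note's own channel, so it affects
        # only this note and leaves other sounding notes untouched.
        return ("pitch_bend", self.active[note], bend)

    def note_off(self, note):
        channel = self.active.pop(note)
        self.free.append(channel)  # recycle last, to let releases ring out
        return ("note_off", channel, note)
```

With two notes held, the first lands on channel 2 and the second on channel 3, so bending one finger on a controller bends only its own note, which is exactly what was impossible to agree on before the specification existed.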
AA: Moog Music has a strong reputation in the analog synthesizer world, this seems to be far away from software engineering and the digital world. Can you tell us about your role there and how you work with the rest of the team?
When I was exploring the expressive possibilities of the Eigenharp, I started using the Animoog iOS app. I was able to get very close to the sounds that I was looking for, but at the time Animoog was not yet able to fully leverage the capabilities of the Eigenharp. I reached out to Moog Music and offered to help enhance Animoog's performance. After meeting with them a few times at trade shows, I started working for them as a part-time consultant.
For years, Moog Music had been incorporating digital features into many of their products, and our collaboration came at the perfect time because they wanted to grow these software capabilities. Thanks to my history and experience, I was given more responsibilities and became Moog's iOS product manager and software engineering lead. I had never collaborated with so many passionate and creative people inside the same company before, and I learned a lot from Bob Moog's legacy. Over the course of the last few years, I was part of the teams that created the Model 15 iOS app, the Minimoog Model D iOS app and the Moog One synthesizer.
The attention to detail that goes into designing a Moog synthesizer is staggering and my experience in product design allowed me to also participate in their user experience and user interface design. I now more than ever realize how important it is to get the complete flow of interaction right, so that a product can truly become an extension of oneself and foster feedback loops that propel creativity.
At first I was mostly working remotely from Belgium and traveled to North Carolina a couple of times a year. As my involvement increased, it became clear that I needed to be present more often, so I'm now planning to fully move to the United States.
AA: What's next for you, do you have any exciting products you're working on that you can talk about?
With Roger Linn, I'm creating the spiritual successor to the famous LinnDrum, using much of the LinnStrument technology and the latest from the embedded electronics world. I keep working on new products at Moog and am excited about everything that is in store there.
I've started exploring the capabilities of the Apple Watch and have released MidiWrist, my first personal iOS app. It allows you to stay in your bubble while recording music or controlling effects and synths, without having to reach for a distant controller, mixing table, recording device or computer. Since it also supports Siri voice commands, you can even leave your hands on your instrument while interacting with your studio.
The Apple Watch offers a lot of exciting gestural capabilities and I'm investigating how I can leverage those for musical purposes. It's possible that some of Geco MIDI's features could become wirelessly available through MidiWrist.
AA: Thank you so much for sharing your journey so far in the world of music instrument and app development Geert!
This was a true pleasure, thanks so much for talking to me!