My iOS Guitar Journey

Another recurring theme here is that it has been my lifelong nature in all things audio to really, really drill down on exactly how the end user will be using these things. The development of REAPER was very much a response to my complaint that most of the DAW stuff we had seemed to be written by people who had never been within 20 miles of a recording studio and had ZERO idea of how this stuff was to be used in the real world, with a customer standing behind you, arms crossed and irate, just wanting to get what was in his head out through the speakers. From what seems to be partly iCulture and partly no end-use goal planning (or, charitably, a different end goal than many performing guitar players would have in mind), it will become more and more apparent that this whole setup isn’t what you would think it is… I’m trying to put this nicely.

I know there’s no real order to this post, but I’m just going to state it here for now. The goal?

A performing guitar player, on stage, in front of a crowd, playing with a band, with access to the utilities and tones needed to get through a set.

Some of the functions and flow needed for this are probably obvious: tuning and preset switching, for example. Some maybe not so obvious to a developer, or maybe just not what the developer was targeting: IMMEDIATE access to critical functions, and levels… levels, levels, levels, levels. Different types of outputs for different target devices like a guitar amp and a PA (much much much much more on this later!).
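Just to make that amp-vs-PA point concrete before I get to it properly later: the idea is one processed guitar signal, two different feeds. Here is a rough, purely hypothetical sketch (the names are mine, not from any actual app) of what I mean, where the feed going to a real guitar amp skips cabinet emulation and the feed going to front of house gets it.

```swift
// Hypothetical routing sketch: one processed guitar signal, two destinations.
// The amp feed skips cabinet emulation (the real cab colors the sound);
// the PA feed gets the IR-based cab sim. Names are illustrative only.

struct OutputRouting {
    var ampLevel: Float = 1.0   // feed to a real guitar amp / FX return, no cab sim
    var paLevel: Float = 1.0    // feed to front of house, cab sim applied
}

func renderBlock(input: [Float],
                 cabSim: (inout [Float]) -> Void,
                 routing: OutputRouting) -> (ampFeed: [Float], paFeed: [Float]) {
    // Amp feed: the modelled signal, level-scaled, no speaker emulation.
    let ampFeed = input.map { $0 * routing.ampLevel }

    // PA feed: same signal, but run through the cabinet IR before it leaves.
    var paFeed = input
    cabSim(&paFeed)
    for i in paFeed.indices { paFeed[i] *= routing.paLevel }

    return (ampFeed, paFeed)
}
```

One common trick with a plain stereo interface is to map those two feeds to the left and right output channels so the amp and the PA each get their own cable.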

OK, so with the pedalboard out of the way, let’s get to #3: the interface between the guitar and the iThing. Again, this seemed like a no-brainer. IK Multimedia has been doing this from the beginning, so just grab one of their iGuitar plug-into-ifier devices and go! The iRig 2 was ubiquitous, so I grabbed one and considered that problem solved… Some of you are already saying “big fat oops on you, man!” Yup. More on this later, but it’s safe to say I could at least move forward now and start thinking about the last two factors on the list: the monitoring amp and the software.

So here we go, #1: the amp I’d be playing this through. And here we get to a CRITICAL disconnect between the developers and the end users. I’m sure I’ll be ranting and raving about this issue later, and I’m getting to the point where maybe I should just enter the market myself, and maybe I shouldn’t be building this article as a “how to” on how I personally think one of these apps should be built, and and and… OK, I digress.

As mentioned before, I am in a ZERO-confidence area as far as getting an amp miked correctly onstage, but really, in some ways, aren’t we all? Love it or hate it, cabinet emulation, even in the form of impulse responses (oh man, yet another thing we will DEFINITELY be speaking more about later), has gotten so subjectively good that shouldn’t we be feeding that to a soundman of any skill level instead of having yet another live mike picking up noise and feeding back onstage? Sorry, Luddites, but I think the advantages outweigh the claimed tonal differences (which you guys consistently fail to hear in ABX tests anyway).
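For anyone who hasn’t dug into what “impulse response” actually means here: an IR cab sim is, at its core, just convolution of your guitar signal with a short recording of how a real cab and mic respond to an impulse. A minimal sketch of the math follows; a real-time app would use FFT-based (partitioned) convolution rather than this naive loop, and the function name is mine, not any app’s API.

```swift
// Minimal sketch of what an IR-based cab sim does: convolve the incoming
// guitar signal with a short impulse response captured from a real cab + mic.
// Direct-form convolution for clarity; far too slow for real-time use at
// typical IR lengths, where FFT/partitioned convolution is the norm.
func applyCabIR(signal: [Float], ir: [Float]) -> [Float] {
    var out = [Float](repeating: 0, count: signal.count + ir.count - 1)
    for (n, x) in signal.enumerated() {
        for (k, h) in ir.enumerated() {
            out[n + k] += x * h
        }
    }
    return out
}
```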
