One of the problems with touchscreens is that, even as they have become more sophisticated about tracking multiple fingers at once, they still generally don’t respond to pressure. To make touchscreens really useful for music, we need genuine pressure sensitivity.
For that reason, you may be intrigued to see this video of Zen Piano, a demo app for the iPhone and iPod touch. The idea: respond not only to the position of your finger taps, but also to how hard you’re tapping the phone. That promises “velocity-sensitive” tapping, which would make touchscreen interfaces more powerful.
Here’s the somewhat overheated description by GreatApps, who say their “patent-pending,” “cutting-edge” technology is the result of “having gone through the research and development phases.”
TapForce™ has been developed from the ground up to provide a completely intuitive way of interaction for users. It can detect more than a hundred different levels of force, and has an accuracy that has to be seen to be believed. And all this can now be done in software, no hardware modifications are necessary. Hundreds of millions of devices currently on the market can make use of the TapForce™ technology today.
A whole new range of games and apps has just been made possible.
Okay, so what is it doing, exactly?
Most likely, it’s simply reading data from the accelerometer. Hit the device harder, and the accelerometer will respond to more force. That’s actually a fairly clever combination of two sensors – it’s just not the sort of stuff you’d necessarily want to trademark or try to get patented, at least, not if you’re a normal person. (TapForce creators, feel free to explain to us that you’re doing something fancier and I’ll eat my words.)
In fact, part of the reason I suspect that’s how they’re doing this is I’ve been tipped off by a developer who’s already implemented just this. He even uses a piano-style keyboard to show it off.
Sadly, that developer is Memo, and his MSA Remote application was inexplicably blocked from the iTunes Store – I think because whoever would have understood the app was on a lunch break or something. See, previously:
But as it happens, this is something any mobile device with an accelerometer can do. I may try something like this on the Android app I’m developing. (No one can reject that, because Google allows any application package to be installed on the device should the user choose to do so. Perish the thought.) Accelerometer data alone is usually not very useful, but combined with touch, it could start to make more sense.
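To make the idea concrete, here’s a minimal sketch of how touch plus accelerometer might yield a velocity value. This is my own illustration, not TapForce’s method or Memo’s code: the function name, sample format, and thresholds are all made up. The guess is that you watch the accelerometer magnitude in a short window around the touch event and map how far it spikes above gravity to a MIDI-style velocity.

```python
import math

GRAVITY = 9.81  # m/s^2: magnitude of the accelerometer reading at rest

def tap_velocity(accel_samples, min_mag=0.5, max_mag=8.0):
    """Map the accelerometer spike around a touch event to a MIDI-style
    velocity (1-127). `accel_samples` is a list of (x, y, z) readings
    captured in a short window around the tap; the thresholds here are
    illustrative guesses, not values from any real app."""
    # Peak deviation from gravity = how hard the tap jolted the device.
    peak = max(abs(math.sqrt(x * x + y * y + z * z) - GRAVITY)
               for (x, y, z) in accel_samples)
    # Clamp, then scale linearly into the 1-127 velocity range.
    clamped = min(max(peak, min_mag), max_mag)
    return int(1 + (clamped - min_mag) / (max_mag - min_mag) * 126)

# A gentle tap barely disturbs the resting ~9.81 m/s^2 magnitude...
soft = tap_velocity([(0.1, 0.2, 9.9), (0.0, 0.1, 10.4)])
# ...while a hard tap produces a sharp spike.
hard = tap_velocity([(0.5, 1.0, 14.0), (1.2, 2.0, 17.0)])
```

The touch event supplies the pitch (which key you hit) and the timing; the accelerometer spike supplies the “how hard.” The main practical wrinkle is matching the right spike to the right tap when notes come quickly, which is presumably where the real engineering effort goes.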
It’s another reason to look forward to MSA Remote, and I do still think that the snafu with Apple will get cleared up at some point. (Unfortunately, what we had on CDM were a lot of rants – perhaps even justified rants – but not necessarily the best way to make the argument to Apple’s store.)