Is the real future of music interfacing directly with our brain?

Jon Appleton, the composer and electronic musician who was on the team behind the ground-breaking Synclavier, keeps talking about how he thinks a 'brain cap' will be the interface of the future. Put it on, and it knows what music you're thinking. He talked about this on a panel
at the Electronic Music Foundation in the fall. I sat down to lunch
with him a few weeks ago, talked about the Synclavier and the future,
and asked him again. Maybe he was joking before. Maybe I'd get a
different answer. Nope. Brain cap.

Brain cap it is. The folks in Computer Music Research at Plymouth University are working on the problem. (via interactive tech blog pasta and vinegar) They've got a brain cap for "innovative portable music devices" (what, you think this is bigger than the keytar?) and music therapy. Here's the tricky bit: what do you do once you have all those EEG signals? Ultimately, you have to create "responsive environments" and "imaginative music grammars".

That's right. Ultimately, you still have to write the music. Even if "writing it" means designing an algorithm, you can't escape aesthetics. The EEG is just an input, period. We're a long way from being able to pull music out of our brains.
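To make that point concrete, here's a minimal toy sketch of what "designing the grammar" can mean. This assumes nothing about the Plymouth group's actual system; every name and mapping below is invented for illustration. The point is that the step from EEG feature to note is a compositional decision, not something the signal hands you.

```python
import math

# Toy sketch -- NOT the Plymouth system. One channel of fake "EEG" samples,
# plus a hand-designed grammar mapping signal energy to a pentatonic scale.
SCALE = [60, 62, 64, 67, 69]  # MIDI notes: C, D, E, G, A

def band_energy(samples):
    """Mean squared amplitude of a window -- a stand-in for real EEG features."""
    return sum(s * s for s in samples) / len(samples)

def energy_to_note(energy, max_energy=1.0):
    """The aesthetic choice: more 'activity' -> higher scale degree.
    Nothing in the EEG dictates this mapping; a composer picked it."""
    level = min(energy / max_energy, 0.999)
    return SCALE[int(level * len(SCALE))]

# Fake EEG: a quiet window and a busy window (10 Hz sine, 256 samples).
quiet = [0.1 * math.sin(2 * math.pi * 10 * t / 256) for t in range(256)]
busy  = [0.9 * math.sin(2 * math.pi * 10 * t / 256) for t in range(256)]

print(energy_to_note(band_energy(quiet)))  # a low scale degree
print(energy_to_note(band_energy(busy)))   # a higher scale degree
```

Swap the scale, the feature, or the mapping and you get entirely different music from the same brain activity, which is exactly why the hard part is the "imaginative music grammar," not the cap.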