I realized, later of course, that I hadn’t even gone over the guitar technology and explained *how* I created it. I was so wrapped up in writing code for the conference that I completely spaced any discussion of it. Well, let’s fix that, shall we? Hopefully this will make it up to all the people who actually showed up at 9am Sunday morning for the show! Please forgive me for not going over the details right there!
A BIG HUGE THANK YOU TO LUKE HUBBARD AND JOACHIM BAUCH – I bugged and bugged and bugged them to help me make this thing work, and they came through for me like you wouldn’t believe. They’re both VERY nice people and extremely talented developers, and they’ve been core developers on the Red5 team since day 1 of its creation.
First off, I used a Roland guitar setup consisting of the GK-3 pickup and the GI-20 MIDI interface. If a Roland guy is reading this, I REALLY could use a sponsorship! This is turning into a real roadshow🙂
Next, I used the MIDI application that everyone now gets when they install the Red5 server. Just connect to “midiDemo” with a NetConnection object, and bingo, you get back a list of compatible MIDI in and out devices (separate lists).
[as]var nc:NetConnection = new NetConnection();
nc.connect("rtmp://localhost/midiDemo"); // point this at your Red5 server[/as]
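You can tell when the connection succeeds by listening for NetStatusEvent on the connection. Here’s a minimal sketch (the handler and helper names are my own, not from the actual demo):

[as]nc.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus);

private function onNetStatus(event:NetStatusEvent):void {
    // "NetConnection.Connect.Success" means we can start calling the server
    if (event.info.code == "NetConnection.Connect.Success") {
        requestMidiDeviceLists(); // hypothetical helper that fires the calls below
    }
}[/as]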
When the connection shows successful, you can then request the midi lists:
[as]nc.call("getMidiInDeviceNames", new Responder(updateMidiInDeviceList));
nc.call("getMidiOutDeviceNames", new Responder(updateMidiOutDeviceList));[/as]
[as]private function updateMidiInDeviceList(data:Array):void {
    deviceListIn = data;
}

private function updateMidiOutDeviceList(data:Array):void {
    deviceListOut = data;
}[/as]
Next, you connect to your devices by telling Red5 which two you’d like to use:
[as]public function setMidiDevice(midiDeviceIn:String, midiDeviceOut:String):void {
    nc.call("connectToMidi", null, midiDeviceIn, midiDeviceOut);
}[/as]
At this point, you’ll have two lists (one for MIDI in devices, the other for MIDI out devices on your system), and you’ll be able to receive MIDI events from Red5😉
Finally, you’ll receive incoming MIDI calls by creating a “midi” method, and you can send MIDI messages back to Red5 via the sendMidiShortMessage3() call:
[as]public function midi(time:Number, data:Array):void {
    dispatchEvent(new MidiDeviceEvent(MIDI_MESSAGE, data));
}

public function sendMidiMessage(note:Number, velocity:Number):void {
    // 144 = 0x90, a note-on message on channel 1
    nc.call("sendMidiShortMessage3", new Responder(handleMidiSendResponse), 144, note, velocity, 0);
}[/as]
This setup is EXTREMELY fast and works well. I was playing close to 64th notes on the guitar and Red5 just smiled mildly😉 AS3/Flex2 provide a MUCH better rendering experience than the AS2 drum demo I created.
Papervision3D provided the next valuable chunk to this project. The idea was to create a learning tool for chords, scales and the patterns people use in solos. The absolute mind blow of this thing is that when you rotate the guitar to the “guitarist’s” view (looking over the neck from behind the guitar), it’s about 1000x easier to follow along and learn.
I had to create a decent guitar model in 3D Studio Max that was as low on polys as possible; I needed to retain performance for the playback of notes on the fretboard. I was able to get a good-looking model at about 646 polys and keep the neck, body and head separate.
All I had to do was create groups in 3DS, name them appropriately, and save out in the Collada file format. In Flex2, I can simply call getChildByName(“neck”) to hide/show the various parts of the guitar. It’s almost like creating Flash movieclips, but I’m actually doing the work in 3DS!
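The hide/show part looks something like this — a sketch, assuming the loaded Collada model is a Papervision3D DisplayObject3D called “guitar” (the variable name is mine; “neck” matches the group name from 3DS):

[as]// grab the part by the group name assigned in 3DS Max
var neck:DisplayObject3D = guitar.getChildByName("neck");
neck.visible = false; // hide the neck
neck.visible = true;  // ...and show it again[/as]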
Ok, now we’re actually getting to the cool part: the skin. This is what makes it all happen! Here’s the actual texture I started with. I had to go outside and lay the guitar down on a white towel on a cloudy day – that helped in getting a great diffused shot of the guitar! HUGE thanks to Peter Kapelyan on the PV3D list for giving me a million tips on how to photograph stuff for use as textures! He’s a genius!! Too bad he uses Lightwave😉
After I successfully mapped this bitmap to the guitar in 3DS, I imported the image into Flash CS3 and created a movieclip where I could then build my “Note” and “Fret” movieclips and supporting classes to handle the incoming MIDI notifications. I imported the skin and resized the stage to the dimensions of the skin (important, since we’re going to use the Flash movieclip to skin the guitar, and it has to be the exact same size). Then I created a Fret movieclip and a Note movieclip (which goes inside Fret). Fret contains 6 copies of Note (make sense?). Then I just named them appropriately: fret_0 through fret_21, and inside of Fret, note_0 through note_5.
Now, this is where you glue it together. The big question in your mind is probably: how did I communicate with the SWF that’s loaded for use as a PV3D skin AND register events with it? Am I right? Is that your question?! Good question.
Thanks to Grant for coming along at the EXACT moment I was dealing with this (no, I’m not kidding; literally, I had JUST started work on it and 5 minutes later, Grant popped up with an IM). He’d just released FlashLib for this very issue. So, all I had to do was select “Create FlashLib” from the commands menu in CS3, and BAM, it created the SWF and a FlashLib.as class for accessing assets from the library. The Note class I created was completely intact and accessible from Flex2! All I had to do was call FlashLib.getInstance(linkageName:String):
[as]var texture:DisplayObject = FlashLib.getInstance("guitar");[/as]
DONE. Now, I could use that with my Collada object with Papervision3D for the skin, AND I had all the access to the fret board that I needed to marry the midi note values to the proper frets.
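The marriage of MIDI notes to frets is basically arithmetic: each string has an open-string MIDI note, and the fret is the difference. Here’s a sketch of the idea (the tuning array, range check and movieclip lookup are my own illustration, not the actual demo code):

[as]// open-string MIDI notes for standard tuning, low E (40) to high E (64)
private var openNotes:Array = [40, 45, 50, 55, 59, 64];

private function lightNote(note:Number):void {
    for (var s:int = 0; s < openNotes.length; s++) {
        var fret:int = note - openNotes[s];
        if (fret >= 0 && fret <= 21) {
            // fret_N / note_S match the movieclip names set up in Flash CS3
            var fretClip:MovieClip = skin["fret_" + fret];
            MovieClip(fretClip["note_" + s]).visible = true;
        }
    }
}[/as]

A real guitar-to-MIDI rig like the GI-20 can report each string on its own channel, which would pin the note to one string instead of lighting every possible position.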
I’ll be going over this and showing code at the classes (RMI) in June, so if you’re interested, you might want to book a seat now😉 I’ll also post a demo so that you can rotate the model around and see the interface.