QueTwo's Blog

thoughts on telecommunications, programming, education and technology

Monthly Archives: August 2011

Integrating the Microsoft Kinect into your Flex application

One of the topics I will be talking about during my MAX session this year is integrating game controllers into your Flex/AIR or ActionScript games to make them more naturally interactive.  There are already a lot of examples of using the Wii controller online, and with the next generation of video game consoles coming out soon, I thought it would be best to spend my time on the latest and greatest: the Microsoft Kinect.

For those who don’t know much about the Kinect, it is a pair of web cams: one RGB (like a normal web cam), and one that sees only infrared.  The combination of the two (along with a bunch of other electronics and software) gives the Kinect real depth perception.  This means that tracking people, hands and everything else becomes trivial.  The Microsoft Kinect was actually developed by PrimeSense, who originally made the device for PCs rather than game consoles.  This means the devices are actually compatible with PCs without relying on a whole stack of hacks to get them to work.

The biggest difficulty in getting the Kinect working is wading through all the pieces and parts that are needed.  Documentation is in very short supply (and usually found within text files on GitHub).  Compounding the problem, there are tens, if not hundreds, of open-source projects revolving around the Kinect, but unfortunately each depends on a different set of drivers, middleware and APIs, many of which are NOT compatible with each other.

I will state right here that I am not an expert in all of these, but I did follow one path that made things work (after following MANY that didn’t).  Matt LeGrand and I are planning a separate website and blog that will explore the other paths to help people make sense of what to choose and what not to choose.  My primary experience is getting the Kinect working with Windows 7, as I am well aware that my aging Mac laptop is not powerful enough to do much with the Kinect.  Yes, you need a modern computer in order to use the Kinect.

So, where do you start?  You obviously need the actual Microsoft Kinect controller.  There are two models on the market right now: one that is USB only, and one with a USB + power supply.  The USB-only version will ONLY work with the XBox, whereas the USB + power supply version works with PCs (the Kinect draws a few amps of power, which the XBox apparently can supply over USB).  The only difference between the two packages is the power brick and a USB power-injection module.  If you end up with the wrong one, you are supposed to be able to buy the power brick and injection module separately, but I have no idea where one would pick that up.  I got my kit from Best Buy in the refurbished video game aisle (it was about $100, after an instant coupon).  The brand-new ones were about $140.

Before you plug it in, you need to install the drivers, middleware and APIs.  There are three well-known sets of drivers and middleware, but the ones that worked for me are directly from PrimeSense.  The drivers and middleware are hosted on OpenNI’s website at http://www.openni.org/downloadfiles/opennimodules.  There are three downloads you need: the OpenNI module (OpenNI Binaries), the PrimeSense NITE middleware, and the PrimeSense drivers (OpenNI Compliant Hardware Binaries).  Download only the 32-bit versions, even if you are on a 64-bit Windows 7 install!!  I’ve found LOTS of issues with the 64-bit drivers that cause things to break.  Install the drivers, then the middleware, then the OpenNI APIs (in that order).

Finally, you will need the AS3 modules.  There is an open-source project known as AS3OpenNI that makes programming against the OpenNI APIs very simple.  Because AIR can’t directly talk to the APIs, you have to use the included C++ application that proxies the driver calls over a TCP/IP connection.  I’m sure this will become easier in future versions of AIR.  AS3OpenNI takes on all the hard work of processing the blob data that comes back from OpenNI and gives you either skeletal data (as a class), RGB, depth, or “multitouch blob” data.  With this data you can track multiple users and follow their hands, heads, necks, legs, etc.  I built all of my stuff on version 1.3.0, which is really stable.
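To give a feel for what that proxy setup looks like on the wire, here is a minimal sketch of an AIR client opening the TCP connection to the bridge app.  Treat the port number and the handler names as my own placeholders, not AS3OpenNI’s actual configuration — in practice AS3OpenNI’s own classes manage this connection for you:

```actionscript
import flash.net.Socket;
import flash.events.ProgressEvent;

// The C++ bridge application runs locally and streams OpenNI data over TCP.
// 127.0.0.1 is a given; the port here is a placeholder -- check the
// AS3OpenNI configuration for the real value.
var bridge:Socket = new Socket();
bridge.addEventListener(ProgressEvent.SOCKET_DATA, onBridgeData);
bridge.connect("127.0.0.1", 7001);

function onBridgeData(event:ProgressEvent):void
{
    // Raw skeleton/depth blobs arrive here; AS3OpenNI parses them into
    // typed events (like ONISkeletonEvent) so you don't have to.
}
```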

Take a look at the examples in the AS3OpenNI project — they are pretty descriptive, and once you get all the parts working together, they work really, really well :) 

So, what did I make using the Kinect?  One of the first games I threw together was a simple version of Space Invaders.  I have AS3OpenNI track the right hand to move the player left and right, and when the right hand moves over the center point of the neck, I fire a missile.  The following code is all that is really needed (minus the setup of the player, etc.):

protected function gotShipMove(event:ONISkeletonEvent):void
{
    // Grab the right hand's real-world x,y,z and map it into screen space.
    var rightHand3D:NiPoint3D = event.rightHand;
    var rightHand:NiPoint2D = NiPoint3DUtil.convertRealWorldToScreen(rightHand3D, this.stage.width, this.stage.height);

    shipIcon.x = rightHand.pointX;

    // canFire and fireInProgress are Boolean flags declared elsewhere in the class.
    if ((event.skeleton.rightHand.pointY > event.skeleton.neck.pointY) && canFire)
    {
        if (!fireInProgress)
        {
            fireInProgress = true;   // prevent 5,000 missiles from firing at once...
            var missile:MissileItem = new MissileItem(spaceInvadersCluster);
            missile.x = rightHand.pointX;
            missile.y = height - 64;
            missile.addEventListener("MissileFireComplete", missileFireComplete);
            missile.addEventListener("MissileFireHit", missileFireComplete);
            addElement(missile);
        }
    }
    else
    {
        fireInProgress = false;
    }
}
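The snippet above registers a missileFireComplete handler that isn’t shown.  Here is a minimal sketch of what such a handler might look like — the body is my assumption, not the original game code; removeElement() is simply the Flex container counterpart to the addElement() call above:

```actionscript
// Hypothetical cleanup handler: fired when a missile either finishes its
// flight or hits something (both events are wired to this one function).
protected function missileFireComplete(event:Event):void
{
    var missile:MissileItem = MissileItem(event.currentTarget);

    // Remove both listeners, since either event ends the missile's life.
    missile.removeEventListener("MissileFireComplete", missileFireComplete);
    missile.removeEventListener("MissileFireHit", missileFireComplete);

    // Take the missile off the display list (counterpart to addElement()).
    removeElement(missile);
}
```

Note that fireInProgress is deliberately not reset here; in the handler above it resets when the player lowers their hand, which is what stops a held-up hand from machine-gunning missiles.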

Watch my Twitter stream for more information about the new website Matt and I are dedicating to the Kinect and AS3 :)  I will also cover the game a lot more during MAX, so make sure to come!

The BikePOV. Adobe AIR + Arduino + Blinking lights on a bike

So, for the past month I have been working on a side project called the BikePOV.  If you have been reading my tweets, I’m sure you’ve picked up on my cursing at it, explaining it, and working on making it work.

This evening everything finally came together just the right way, and it actually works!

So, first let me explain what is going on.  I took an Arduino prototyping board and designed a circuit around it.  Essentially, I took 12 RGB (Red, Green, Blue) LEDs and soldered them onto a circuit board, then mounted the board in between the spokes of a bike wheel.  The theory is that when the wheel turns, I can control the LEDs and make them flash in a pattern that represents letters, patterns or images.  This is called a POV, or Persistence of Vision, display.
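To make the timing concrete, here is a back-of-the-envelope sketch (my own illustration, not the actual BikePOV code): the faster the wheel spins, the less time each “column” of the pattern can stay lit.

```actionscript
// Illustrative POV timing math, not the actual BikePOV firmware.
// One revolution at a given RPM takes (60000 / rpm) milliseconds;
// dividing that by the number of pattern columns gives each column's slot.
function columnIntervalMs(rpm:Number, columns:uint):Number
{
    var revolutionMs:Number = 60000 / rpm;  // duration of one wheel turn
    return revolutionMs / columns;          // time budget per pattern column
}
```

At a brisk 180 RPM with 360 columns per revolution, each column gets under a millisecond, which is why the LED timing on these things has to be so precise.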

This idea has been done before: there are pre-made kits that you can buy from a company called Adafruit.  A company called MonkeyLectric also sells a POV kit for about $60 (which is MUCH nicer than my setup, but it only has pre-made patterns).
