QueTwo's Blog

thoughts on telecommunications, programming, education and technology

Tag Archives: MAX

My Presentations at Adobe MAX 2011

This year I had the distinct honor of being asked to present at Adobe’s MAX conference.  The conference was an absolute blast.  From the Keynotes to ALL the other sessions I attended, everything went off without a hitch.

I gave two presentations this year — one for the 360|MAX Unconference and one for the Develop Track at MAX. 

Getting Data From Here To There (with Flex)

This session was not recorded, but it was fairly well attended.  You can see my slides here.   In this session I talked about the different communications methods available to Flex developers, and I started to lay out a basic matrix of when to use what type of communication method, and what the pros and cons were of each type.  Not all of my demos worked due to a broken J2EE server, but I think everybody got the idea.  I don’t have many downloads for that presentation, as most of the demos were specific to my server setups.
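
To give a flavor of the trade-offs I covered, here is a quick ActionScript sketch comparing two of the methods: plain HTTPService versus AMF remoting with RemoteObject.  The URL, destination name and server method below are placeholders for illustration, not my actual demo servers.

import mx.rpc.events.FaultEvent;
import mx.rpc.events.ResultEvent;
import mx.rpc.http.HTTPService;
import mx.rpc.remoting.RemoteObject;

// Plain HTTP/XML -- easiest to stand up, but heaviest on the wire.
private function loadViaHttp():void
{
    var svc:HTTPService = new HTTPService();
    svc.url = "http://example.com/products.xml";   // placeholder endpoint
    svc.resultFormat = "e4x";
    svc.addEventListener(ResultEvent.RESULT, onResult);
    svc.addEventListener(FaultEvent.FAULT, onFault);
    svc.send();
}

// AMF remoting -- binary and typed, but needs a remoting gateway
// (BlazeDS, LCDS, ColdFusion, etc.) with a configured destination.
private function loadViaAmf():void
{
    var ro:RemoteObject = new RemoteObject("productService");  // placeholder destination
    ro.endpoint = "http://example.com/messagebroker/amf";      // placeholder gateway URL
    ro.addEventListener(ResultEvent.RESULT, onResult);
    ro.addEventListener(FaultEvent.FAULT, onFault);
    ro.getAllProducts();                                        // placeholder server method
}

private function onResult(event:ResultEvent):void
{
    trace("got data: " + event.result);
}

private function onFault(event:FaultEvent):void
{
    trace("call failed: " + event.fault.faultString);
}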

Getting Physical With Flash (Hardware Hacking)

This session was a blast to present.  We had about 140 people in the room who seemed to be really into it.  I presented on integrating the Arduino hardware prototyping kit into Flash/Flex in addition to showing how to integrate the Microsoft Kinect into Flash/Flex. I came armed with about 6 electronics projects that I showed people to inspire them to create their own.

[Embedded video: MAX 2011 Preso]

You can download the PPT here.

You can find most of the downloads featured on my blog, but I will update this post and post the direct links to everything at a later date.

Thanks again to the entire MAX staff for making the show run so smoothly from both the speaker’s perspective and from the attendee’s perspective.  A+ work led to an A+ experience :)

Integrating the Microsoft Kinect into your Flex application

One of the topics I will be talking about during my MAX session this year is integrating game controllers into your Flex/AIR or ActionScript games in order to make more naturally interactive games.  There are a lot of examples of using the Wii controller online already — and with the next generation of video game consoles coming out soon, I thought it would be best to spend my time on the latest and greatest — the Microsoft Kinect.

For those who don’t know much about the Kinect, it is a pair of Web Cams — one RGB (like a normal Web Cam), and one that sees only infrared.  The combination of the two (along with a bunch of other electronics and software) allows the Kinect to have real depth perception.  This means that tracking people, hands and everything else becomes trivial.  The Microsoft Kinect was actually developed by PrimeSense, who originally made the device for PCs, rather than game consoles.  This means that the devices are actually compatible with PCs, rather than relying on a whole stack of hacks to get them to work.

The biggest difficulty in getting the Kinect working is wading through all the pieces and parts that are needed.  Documentation is in very short supply (and usually found within text files on GitHub).  Compounding the problem is that there are tens, if not hundreds, of open-source projects revolving around the Kinect — but unfortunately each depends on a different set of drivers, middleware and APIs, many of which are NOT compatible with each other.

I will state right here that I am not an expert in all of these, but I did follow one path that made things work (after following MANY that didn’t).  Matt LeGrand and I are planning a separate website and blog that will explore these other paths more to help people make sense of what to choose and what not to choose.  My primary experience is getting the Kinect working with Windows 7, as I am well aware that my aging Mac laptop is not powerful enough to do much with the Kinect.  Yes, you need a modern computer in order to use the Kinect.

So, where would you start?  You obviously need the actual Microsoft Kinect controller.  There are two models on the market right now — one that is USB only, and one with a USB + Power supply.  The USB-only version will ONLY work with the XBox, while the USB + Power Supply version works with PCs (the Kinect draws a few amps of power, which the XBox apparently can supply over USB).  The only difference between the two packages is the power brick and a USB power injection module.  If you end up with the wrong one, you are supposed to be able to buy the power brick and injection module separately, but I have no idea where one would pick that up.  I got my kit from Best Buy in the video game refurbished aisle (it was about $100, after an instant coupon).  The brand-new ones were about $140.

Before you plug it in, you need to install the drivers, middleware and APIs.  There are three well-known sets of drivers and middleware, but the ones I found worked for me are directly from PrimeSense.  The drivers and middleware are hosted on OpenNI’s website at http://www.openni.org/downloadfiles/opennimodules.  There are three downloads you need — the OpenNI Module (OpenNI Binaries), the PrimeSense NITE Middleware, and the PrimeSense Drivers (OpenNI Compliant Hardware Binaries).  Download only the 32-bit versions, even if you are on a 64-bit Windows 7 install!!  I’ve found LOTS of issues with the 64-bit drivers that cause things to break.  Install the drivers, then the middleware, then the OpenNI APIs (in that order).

Finally, you will need the AS3 modules.  There is an open-source project known as AS3OpenNI that makes programming against the OpenNI APIs very simple.  Because AIR can’t directly talk to the APIs, you have to use the included C++ application, which proxies the driver calls to a TCP/IP connection.  I’m sure this will become easier in future versions of AIR.   AS3OpenNI takes all the hard work of processing the blob data that comes back from OpenNI and gives you either skeletal data (as a class), RGB, Depth, or “multitouch blob” data.  With this data you can read back data from multiple users and track their hands, head, neck, legs, etc.  I built all of my stuff on version 1.3.0, which is really stable.
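
To make the proxy idea a little more concrete, here is a minimal sketch of an AIR app talking to a local TCP bridge.  The host, port and the idea of reading raw bytes are only for illustration; in practice AS3OpenNI’s own classes manage the socket for you and dispatch typed events such as ONISkeletonEvent, so you normally never touch the connection directly.

package
{
    import flash.display.Sprite;
    import flash.events.Event;
    import flash.events.ProgressEvent;
    import flash.net.Socket;

    public class OpenNIBridgeSketch extends Sprite
    {
        // Placeholder host/port -- the real values come from the
        // AS3OpenNI bridge application's configuration.
        private var socket:Socket = new Socket();

        public function OpenNIBridgeSketch()
        {
            socket.addEventListener(Event.CONNECT, onConnect);
            socket.addEventListener(ProgressEvent.SOCKET_DATA, onData);
            socket.connect("127.0.0.1", 20000);
        }

        private function onConnect(event:Event):void
        {
            trace("connected to the native bridge");
        }

        private function onData(event:ProgressEvent):void
        {
            // The bridge streams the skeleton/depth/RGB packets over this
            // socket; AS3OpenNI parses them and hands you classes instead.
            trace("received " + socket.bytesAvailable + " bytes of tracking data");
        }
    }
}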

Take a look at the examples in the AS3OpenNI project — they are pretty descriptive, and once you get all the parts working together, they work really, really well :) 

So, what did I make using the Kinect?  One of the first games I threw together was a simple version of Space Invaders.  I have AS3OpenNI track the right hand to move the player left and right, and when the right hand moves above the center point of the neck, I fire a missile.  The following code is all that is really needed (minus the setup of the player, etc.):

protected function gotShipMove(event:ONISkeletonEvent):void
{
    // Get the right hand's real-world x,y,z and map it to screen space.
    var rightHand3D:NiPoint3D = event.rightHand;
    var rightHand:NiPoint2D = NiPoint3DUtil.convertRealWorldToScreen(rightHand3D, this.stage.width, this.stage.height);

    // The ship simply follows the hand left and right.
    shipIcon.x = rightHand.pointX;

    // Raising the right hand above the neck fires a missile.
    if ((event.skeleton.rightHand.pointY > event.skeleton.neck.pointY) && canFire)
    {
        if (!fireInProgress)
        {
            fireInProgress = true;   // prevent 5,000 missiles from firing at once...
            var missile:MissileItem = new MissileItem(spaceInvadersCluster);
            missile.x = rightHand.pointX;
            missile.y = height - 64;
            missile.addEventListener("MissileFireComplete", missileFireComplete);
            missile.addEventListener("MissileFireHit", missileFireComplete);
            addElement(missile);
        }
    }
    else
    {
        // Hand dropped back below the neck -- allow the next shot.
        fireInProgress = false;
    }
}
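
The missileFireComplete handler registered above isn’t shown in the post.  A minimal sketch of what it might look like (reusing the names from the snippet; MissileItem is my own class and isn’t shown either) just removes the spent missile and clears its listeners:

import flash.events.Event;

private function missileFireComplete(event:Event):void
{
    var missile:MissileItem = event.target as MissileItem;
    missile.removeEventListener("MissileFireComplete", missileFireComplete);
    missile.removeEventListener("MissileFireHit", missileFireComplete);
    removeElement(missile);   // counterpart to the addElement() call above
}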

Watch my Twitter stream for more information about the new website Matt and I are putting together, dedicated to the Kinect and AS3 :)  I will also cover the game a lot more during MAX, so make sure to come!

The BikePOV. Adobe AIR + Arduino + Blinking lights on a bike

So, for the past month I have been working on a side project called the BikePOV.  If you have been reading my tweets, I’m sure you’ve picked up on my cursing at it, explaining it, and working on making it work.

This evening I finally got everything working just the right way — and it actually works!

So, first let me explain what is going on.  I took an Arduino prototyping board and designed a circuit around it.  Essentially I took 12 RGB (Red, Green, Blue) LEDs and soldered them onto a circuit board.  I then mounted the circuit board in between the spokes of a bike wheel.  The theory is that when the wheel turns, I can control the LEDs and make them flash in a pattern that represents letters, patterns or images.  This is called a POV, or Persistence of Vision.
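
On the AIR side, the main job is turning text into per-column on/off patterns for the 12 LEDs.  Here is a rough sketch of one way to do that (not the exact code from my build): render the word into a 12-pixel-tall bitmap and pack each column into a 12-bit mask.  Getting the masks down to the Arduino (over a serial proxy) is a separate step that isn’t shown here.

import flash.display.BitmapData;
import flash.text.TextField;
import flash.text.TextFormat;

// Render a message into a 12-pixel-tall bitmap and pack every column
// into a 12-bit mask -- one bit per LED on the wheel.
private function textToColumns(message:String):Vector.<uint>
{
    var tf:TextField = new TextField();
    tf.defaultTextFormat = new TextFormat("_sans", 12, 0x000000);
    tf.text = message;
    tf.width = tf.textWidth + 4;
    tf.height = 12;

    var bmp:BitmapData = new BitmapData(int(tf.width), 12, false, 0xFFFFFF);
    bmp.draw(tf);

    var columns:Vector.<uint> = new Vector.<uint>();
    for (var x:int = 0; x < bmp.width; x++)
    {
        var mask:uint = 0;
        for (var y:int = 0; y < 12; y++)
        {
            // A dark pixel means the LED for that row should be on.
            if ((bmp.getPixel(x, y) & 0xFF) < 0x80)
            {
                mask |= (1 << y);
            }
        }
        columns.push(mask);
    }
    return columns;
}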

This idea has been done before — there are pre-made kits that you can buy from a company called Adafruit.  A company called MonkeyLectric also sells a POV kit for about $60 (which is MUCH nicer than my setup, but they only have pre-done patterns).

I’m Speaking at Adobe MAX!

This year I was lucky to be selected as one of the speakers at Adobe MAX 2011!  I will have a session that will talk about integrating various hardware products with Adobe Flash, Flex and AIR.  Most of my talk will revolve around using the Microsoft Kinect and Arduino-based (and other AVR) projects as inputs and outputs from the Flash/Flex/AIR stack.

If you have been following me lately on Twitter, you will see me talking about some projects that I’ve been working on, including a Kinect version of Space Invaders, and a BikePOV.  Both of these projects will be shown during my talk (in addition to others!)  The Kinect is such a cool input device that I think it is hampered only by how hard it is for developers to work with (the situation with drivers, required libraries, dependencies and lack of documentation makes it REAL hard for non-developers to do anything with them).   The Arduino allows hobbyists to use their basic electronics skills to build very complex electronic gadgets and interact with them using a computer.  These are all things that required EE degrees when I was a kid, so it’s super cool to see that technology has progressed to the point where you can build this stuff quickly and easily.

Make sure to sign up for the session!  It is on Tuesday from 1 – 2pm!

Debugging on embedded systems with AIR (including Adobe TV)

One of the projects I was recently involved with had me set up an unusual environment — debugging an embedded device that ran Adobe AIR.  There are not a whole lot of specifics that I can talk about for that particular project, but one aspect of it I can talk about — debugging the AIR and Flex apps remotely.

Most developers are used to the environment where they have a single computer where they run their app, and if something breaks, they can launch the debugger on that computer and debug the app.  When you are working with embedded devices, mobile devices, or simply devices that don’t have Windows/MacOS running on them, debugging can be a pain.  Unfortunately, the documentation for AIR (and Flex) is pretty poor at telling you how to set this up — and the IDE fights you if you don’t know exactly what you are doing.

Adobe MAX write-up

I’ve just come back from this year’s Adobe MAX conference, and oh, boy was it a whirlwind!  I feel that Adobe outdid themselves this year and set the bar much higher.  I guess being in the same location more than one year in a row lets them concentrate on content rather than logistics.

The theme of the conference this year was different from previous years…  In years past it was about products coming out soon, product announcements or beating the drum of certain technologies.  This year there were virtually no product announcements, and no announcements of things coming soon.  It was all about what was out today, and how to use it and leverage it going forward.  I know this disappointed a lot of people in the audience, but with Adobe launching most of their products (ColdFusion, Flex, CS5, etc.) just a few months ago, there wasn’t much to talk about.

Adobe MAX Wrap-Up

Photo courtesy of Dee Sadler - http://www.flickr.com/photos/deesadler/

So, I’m back from LA, and the Adobe MAX 2009 conference.  Just like the MAX tagline of “Connect, Discover, Inspire,” I was truly able to accomplish all of those.  This year’s conference packed in a lot of announcements, and gave everybody a good idea of where Adobe is heading in the marketplace.  All of the keynotes and sessions were recorded, so make sure to check them out on Adobe TV!

So, lets first talk about some of the major announcements:

  • ColdFusion 9 was released.  This has been in the works for about a year and a half, and offers a bunch of new features.  Some of the new things that are most compelling include the ability to work directly with Microsoft Office documents, ORM, integration with Sharepoint, and certain features pre-packaged as a service. 
  • LiveCycle ES2 was released.  I’m sure this affects all of 20 people on earth, but this product is just plain awesome.  LiveCycle ES is a workflow management application (for those of you who only deal with consumer applications, think of the process that your paperwork has to go through when you hire somebody new.  You have multiple interviews, background checks, etc. that all belong in a workflow.  This allows you to manage that process, and make sure nothing is missed).  With it, a bunch of new Flex components have been released that allow you to integrate your applications with these workflows.  Yet another important part of this suite is the “LiveCycle Collaboration Suite,” formerly known as Cocomo.  This suite allows you to make your own interactive / collaboration services.
  • Mobile Devices.  So, there was lots of fanfare about Adobe’s push to make mobile devices 1st class citizens in the computing landscape.  21 of the top 22 device manufacturers have signed on with Adobe, including RIM, Symbian, Google, Microsoft, etc.  The only one that is missing is Apple, of course, but Adobe didn’t waste time shooting a warning shot over their bow.  Adobe announced that in CS5, they expect to be able to publish full-fledged iPhone/iPod Touch applications that can be published on the iTunes store.  This does not mean that the Flash Player will be available for the iPhone, but simply that you can publish applications that were created in Flash/Flex/Catalyst.

A few things that were not released, but were talked about:

  • Flash Builder 4 - This looks like it was delayed until Q1 of next year.  It’s a shame, because a lot of the Adobe tooling is based on it now (interesting thought), so many of those applications have to wait too.  This included some ES2 apps, etc.  Adobe did release Beta 2 to allow people to refresh their builds, and play with things a bit more.
  • ColdFusion Builder – This also looks like it was delayed until Q1 of next year.  It is a LOT closer than people have been anticipating, and, personally, I really like it.  They have really done a lot of research on the workflow model, and I think they will win over a lot of developers who have been using Allaire ColdFusion Studio, Dreamweaver and all the other products.
  • Codename Stratus – This project allows users to build truly P2P applications with the Flash Player or AIR.  It allows IP Multicast or some sort of “home” server to point copies of FP together and allow them to communicate without the use of a streaming server.  This saves bandwidth for the server, and makes the experience better if the users are geographically nearby.  The sheer thought of being able to use IP Multicast in FP is a huge win for me.  This will require FP 10.1.  (There is a rough sketch of the connection code just after this list.)
  • LiveCycle Data Services 3 – It is coming, and very soon.  This brings a whole slew of new features to the LCDS package that will make huge data applications faster and will allow data to flow better.  One of the coolest things about LCDS3 is the data modeler.  It brings the features of a UML designer, and allows you both to deploy databases via your model, and to build the skeletons of your applications via the model!  This, to me, is one of the coolest things I saw at the show.  How much was an LCDS server again?
  • Adobe Connect for Mobile – So, this one blew me away, but only a peep was said at the conference.  During the Day 1 keynote, they showed the iPhone, among other devices, using a mobile version of Adobe Connect to join meetings!  This, to me, is one of the features that has the potential to keep Adobe Connect ahead of all the other web conferencing suites out there.  They said that we can expect the iPhone, RIM, Android and Microsoft Connect clients to come out “soon”.
  • Flash Player 10.1 – Lots of neat stuff coming in this one.  FP 10.1 will be smaller, meaner, and mobile ready.  It will feature lots of stuff like the Stratus support, and hopefully will make my bed and pour me a beer.  Make sure to check out the online sessions on this one :)
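
For those curious what a Stratus connection actually looks like in ActionScript, here is a rough sketch of the publishing side (the developer key in the URL is a placeholder you get when you sign up for the beta, and all the error handling is left out).

import flash.events.NetStatusEvent;
import flash.net.NetConnection;
import flash.net.NetStream;

private var nc:NetConnection;
private var sendStream:NetStream;

// Placeholder rendezvous URL -- the real one includes your own developer key.
private static const STRATUS_URL:String = "rtmfp://stratus.adobe.com/YOUR-DEVELOPER-KEY";

private function connectToStratus():void
{
    nc = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus);
    nc.connect(STRATUS_URL);
}

private function onNetStatus(event:NetStatusEvent):void
{
    if (event.info.code == "NetConnection.Connect.Success")
    {
        // Publish a stream that other peers can play directly by connecting
        // to this client's nearID -- no media server sits in the middle.
        sendStream = new NetStream(nc, NetStream.DIRECT_CONNECTIONS);
        sendStream.publish("media");
        trace("share this peer ID with the other side: " + nc.nearID);
    }
}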

I also had the chance to catch up with a lot of the evangelists, and talk shop with a lot of the people I usually only communicate with online.  It was great to see everybody, oh, and yeah, I went to a lot of sessions and labs too.  I’m hoping to go through all my notes from the action-packed week of sessions and labs and get cracking on some new apps I have floating in my head (yes, I was inspired).
