QueTwo's Blog

thoughts on telecommunications, programming, education and technology

Tag Archives: Mobile

Simple Caching Techniques in Adobe AIR

One of the challenges with the recently released Pointillism mobile app was that users were expected to play the game while in remote areas.  Remote areas often mean that data service is limited or just plain not available at all, and that can wreak havoc for game participants waiting for data to load.  There are two schools of thought on how to approach this problem.

One is to pre-load all the content that the game would or could ever use.  This means that you either package all the data / images with your app, or you force the user to download it all when they launch the app.  The advantage of this method is that the user can be pretty much completely offline after that point and still get the entire experience of the game.  The disadvantage, of course, is that you front-load ALL of your content.  If the user is on EDGE (or worse!), they would be downloading a LOT more data than they may need, in addition to making your app take up more space on the device.

The other method is to set up some sort of caching strategy.  This requires the user to be online at least for the initial exploration of each section of your app, but after that, the data is stored on their device.  This can be problematic if they are offline, of course, but depending on the game, this may not be an issue.  In a cached mode, the app reads from disk and returns that data immediately WHILE making the call to the service to pull down the latest data.  To the end user, this is transparent.  Updating cached data is also routine, as all you have to do is invalidate the cache to pull down that bit of new data.

In Pointillism, we worry about two types of data — lists of data (Collections, Arrays, Vectors, etc.), and user-submitted images.  Our goal is to cache both.

Luckily, caching the images was super easy.  Dan Florio (PolyGeek) wrote a component known as ImageGate, which houses an Image component and a caching mechanism.  Using his component is as simple as substituting the <s:Image> in your MXML or ActionScript with his component, and boom — your images are cached as soon as they are viewed.  I did make a few tweaks to his component and posted it on my space over at Apache.  I substituted the Image component with a BitmapImage for speed, and added a small patch to cache the images in the proper location on iOS devices.
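To give you an idea of what the swap looks like (the namespace prefix, tag name, and binding below are my own placeholders; check Dan's component for the exact names), it really is a drop-in replacement:

<!-- before: the stock Spark image -->
<s:Image source="{point.photoURL}" width="200" height="200"/>

<!-- after: the caching component (hypothetical namespace/tag) -->
<polygeek:ImageGate source="{point.photoURL}" width="200" height="200"/>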

Caching lists of stuff was not much harder.  AIR has a built-in “write to disk” functionality known as SharedObjects.  SharedObjects started as an alternative to cookies in the browser, but within AIR they let us store variables for long-term storage.  In my case, I chose to store the data that came back from the server in a SharedObject every time we got a response.  This turned out to be a good strategy, as it allowed us to show old data immediately and update it with current data once it came in.  Our data didn't change /that/ often, so it might update at most every day or so.

One of our data managers' constructors looked like this:

// 'so' is a SharedObject instance variable (flash.net.SharedObject)
so = SharedObject.getLocal("org.pointi.cache");
if (so.data.pointsList == null)
{
    // first run -- seed the cache with an empty list and write it to disk
    so.data.pointsList = new Array();
    so.flush();
}

When we got our data back from our server, we did this:

// cache the freshest copy of the list for this hunt, keyed by hunt ID
so.data.pointsList[curHuntID] = event.result as ArrayCollection;
so.flush();

And finally, when we wanted to read back the data, this is all we had to do (pointsList is the variable that was sent to our calling components):

// kick off the remote call for fresh data...
ro.getPointList(huntID, userID);

// ...and in the meantime, hand back whatever we already have cached
if (so.data.pointsList[huntID] != null)
{
    pointsList = so.data.pointsList[huntID] as ArrayCollection;
}

Pretty simple, eh?  We did similar setups for all of our data lists, and also implemented some caching for outgoing data (like when the user successfully checked into a location), so we could keep the server in sync with the client.
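For the outgoing side, the idea is the same in reverse: queue the user's action in the SharedObject, then replay the queue to the server once you know you're connected.  Here is a minimal sketch of that pattern (the queue name, the curPointID variable, and the checkIn() remote method are hypothetical, not our exact production code):

// queue a check-in locally so it survives an offline stretch (hypothetical structure)
if (so.data.pendingCheckins == null)
{
    so.data.pendingCheckins = new Array();
}
so.data.pendingCheckins.push({huntID: curHuntID, pointID: curPointID, time: new Date()});
so.flush();

// later, once we're back online, replay the queue to the server
while (so.data.pendingCheckins.length > 0)
{
    var checkin:Object = so.data.pendingCheckins.shift();
    ro.checkIn(checkin.huntID, checkin.pointID, checkin.time); // hypothetical remote method
}
so.flush();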

Using AIR Native Extensions for Desktop and Mobile

During Wednesday's meeting of the Michigan ActionScript User Group, we covered what AIR Native Extensions are, where to find the best ones, and how to actually use them.  The session included demos from both desktop AIR and mobile AIR projects.

The two locations to find some of the more popular ANEs are:

Passing data back and forth to a View in a Mobile AIR application

On a recent AIR for Android app I was working on, I needed to send data back from the current view to my controller.  The app was really simple, but the ability to send data back to my data controller turned it from a 4-hour app into an all-day project.  It turns out that when you design your app around the ViewNavigatorApplication or the TabbedViewNavigatorApplication, they make it really hard to get a reference to the instantiated View created by the navigator.pushView() function.  In fact, since navigator.activeView types everything as a View rather than your custom class (and the view isn't created until the next frame, so you can't grab the reference right away anyway), you can't call any methods on that class, nor can you attach any custom event listeners.

This makes it very difficult if you want your controller (or in my case, the main application) to push data to, or pull data from, the View after it has been created.  I didn't want to re-create the functionality of the ViewNavigatorApplication by hand — they did lots of really nice stuff in there, but the inflexibility seemed kind of odd.

Anyway, I remembered that you can bubble events from the view and have the main application act as the event listener that catches them.  While this has the added advantage of not holding a direct reference to the View (which would prevent it from being garbage collected), it feels really dirty in my mind.  I guess I'll get over it.
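Here is a minimal sketch of that bubbling approach (the event name, payload, and handler are made up for illustration).  The View dispatches an event with bubbles set to true, and the main ViewNavigatorApplication listens for it at the top of the display list:

import flash.events.DataEvent;

// inside the View: bubbles = true lets the event travel up the display list to the application
dispatchEvent(new DataEvent("resultReady", true, false, "some payload"));

// inside the main ViewNavigatorApplication:
this.addEventListener("resultReady", onResultReady);

private function onResultReady(event:DataEvent):void
{
    // the event carries the data; we never keep a reference to the View itself
    trace("View sent back: " + event.data);
}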

Injecting data into the View from the controller is a bit trickier.  What I ended up doing was creating a new class that extends EventDispatcher, and adding getters and setters that dispatch their own events.  I then package an instance of that custom class in the data property and send it along in the pushView() call.  The View can then listen for events off that class, since the class dispatches one with every change.  Again, this doesn't hold any references to the View (which is what the SDK was trying to avoid), but I can still get my data there.
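A minimal sketch of that pattern, with hypothetical class and property names (this is the shape of the idea, not my exact code):

package
{
    import flash.events.Event;
    import flash.events.EventDispatcher;

    // A tiny model the controller hands to the View via navigator.pushView(SomeView, model).
    // The View listens for "scoreChanged" on it, so the controller never needs a View reference.
    public class HuntModel extends EventDispatcher
    {
        private var _score:int;

        public function get score():int
        {
            return _score;
        }

        public function set score(value:int):void
        {
            _score = value;
            dispatchEvent(new Event("scoreChanged")); // let any listening View update itself
        }
    }
}

The controller keeps its own reference to the model and simply sets model.score when new data arrives; the View pulls the model out of its data property (in the data setter or on viewActivate) and adds its listener there.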

Connecting your Flex application to BlazeDS or LiveCycle DS

If you have ever attended one of my presentations on BlazeDS, LiveCycle DS, or Flex/ColdFusion, you have heard me talk about how bad the data-connectivity wizards are in ALL of the IDEs available for the Flex SDK.  Even the new Flash Builder 4.5 doesn't address the white-hot mess they call the data connectivity wizards that they started including in 4.0 (Side note:  I love most of the features in Flash Builder — I use it every day, but some of the wizards they included to make life easier really don't).

Even including the services-config.xml document as a configuration option in your application will often lead to trouble in the long run.  This is set up for you when you tell Flash Builder that you want to use ColdFusion or Java as your server model.  When you do this, the compiler inspects the services-config.xml configuration document on your server and builds a table of the channels (endpoints) and destinations that are configured there.  You can then refer to the destination by name in your RemoteObject tags and not have to worry about how your client actually connects back to the server…

… until you need to move your application…

… or you need to connect your AIR application to your server…

… or you have a mobile or television application that needs resources on your server…

… or your server’s configuration changes.
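In all of those cases the compiled-in endpoint is suddenly wrong.  The usual escape hatch (and not necessarily the exact approach the rest of this post takes) is to skip the compile-time configuration entirely and define the channel at runtime, so the endpoint can come from a config file, a preference, or code.  A minimal sketch, with placeholder endpoint and destination names:

import mx.messaging.ChannelSet;
import mx.messaging.channels.AMFChannel;
import mx.rpc.remoting.RemoteObject;

// build the channel in code so the endpoint can change without recompiling
var channelSet:ChannelSet = new ChannelSet();
channelSet.addChannel(new AMFChannel("my-amf",
    "http://myserver.example.com/flex2gateway/")); // placeholder endpoint URL

var ro:RemoteObject = new RemoteObject("myDestination"); // placeholder destination
ro.channelSet = channelSet;
ro.getSomething(); // hypothetical remote method on that destination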

Read more of this post

H.264 Being removed from Google’s Chrome Browser

This afternoon Google announced that they were dropping support for the H.264 codec in future releases of their browser, Chrome.  If you want to read more about the announcement, check it out here.

Essentially their argument is that they want to support open source.  And there is no better way to support open source than to include support for the new codec that they just bought (VP8) and announced would be open source (WebM).  On paper, it sounds great and the world should cheer for them.  I personally would love to see ALL the browsers include support for WebM, so that at least there is some consistency across the landscape for a single codec.

However, the way Google is introducing WebM into their browser is by removing support for the H.264 codec.  For those of you who don't know what H.264 is, it is by far the most widely supported video codec today — being directly supported by DVD players, Blu-ray players, video game systems, and more importantly, enterprise video systems.

Over the last few years we have seen enterprise video make the shift from tape to MPEG-2, to MPEG-4 (H.264) video for recording, editing and storage.  The great thing about the industry was that as time passed, more and more devices were getting support for H.264, so there was no need to re-encode your video to support each device under the sun.  It was starting to become a world where we could publish our video in a single format and push it out to a huge audience.  Re-encoding video takes time, storage space, and typically reduces the quality of the end product.

What is even more unfortunate is that recently we have seen H.264 decoders get moved into hardware on many devices, allowing them to decode high-definition content and display it full-screen without hitting the processor.  This is starting to allow developers to focus on making more interactive and better-looking user interfaces without worrying about degrading the quality of the video.  A good example of this is the H.264 decoder that is built into the newer Samsung televisions — you can run HD content as long as you are running H.264; otherwise you are lucky to see a 640×480 video clip without taxing the processor.  One of the reasons why HD video on the web sucked so badly on Mac-based computers was that Apple didn't allow the Flash Player to access the H.264 hardware accelerator.  Newer releases of the Flash Player now support it, and HD video is smooth as silk on the web.

By moving to WebM and abandoning H.264, Google forces manufacturers to reassess their decision to support H.264 in hardware.  It will also force content producers to re-encode their video in yet another format (can you imagine YouTube re-encoding their content into WebM just to support Chrome users?), and will make the HTML5 video tag even harder to work with.  Content producers will need to produce their content in H.264 for Flash, many WebKit browsers, and IE9; WebM for Chrome; and Ogg Theora for Firefox.  The real tricky part right now is that there are virtually no commercial products to date that can encode in all three required formats.

So, if you are looking to support video on the web — what are you to do?  Right now the majority of users' browsers don't support HTML5 video — but they do support H.264 through the Flash Player.  iPad/iPhone/Android devices support H.264 directly, and embedded systems like the Samsung HDTVs, Sony PS3, etc. all support H.264-encoded video.  The only reason to encode in a different format today is to support Firefox and Chrome users who don't have the Flash Player installed (less than 1% of all users?).  I'm sure the recommendation will change over the years, but for right now, you still pretty much have a one-codec-meets-most-demands solution that you can count on.

Debugging on embedded systems with AIR (including Adobe TV)

One of the projects I was recently involved with had me set up an unusual environment — debugging an embedded device that ran Adobe AIR.  There are not a whole lot of specifics that I can talk about for that particular project, but one of its aspects I can talk about — debugging the AIR and Flex apps remotely.

Most developers are used to an environment where they have a single computer on which they run their app, and if something breaks, they can launch the debugger on that same computer and debug the app.  When you are working with embedded devices, mobile devices, or simply devices that don't have Windows/MacOS running on them, debugging can be a pain.  Unfortunately, the documentation for AIR (and Flex) is pretty poor at telling you how to set this up — and the IDE fights you if you don't know exactly what you are doing. Read more of this post

Creating Mobile Applications from a Photoshop Prototype

Thanks to Dr. Coursaris for this photo.

At the WUD Conference, East Lansing, MI

This past Thursday, I was given the opportunity to present at the World Usability Day Conference on a really cool, but obscure topic: creating mobile applications from Photoshop prototypes.  Essentially, my job was to show people a workflow that uses Adobe Device Central to create a Photoshop file, which is then turned into a working prototype using Adobe Catalyst, and then programmed using Adobe Flash Builder.

All in all, the conference was excellent, and I was honored to be on stage after notable presenters from the State of Michigan, Motorola, Nokia and, well, even the University of Michigan.  The focus of this year's conference was on usability and how it relates to mobile applications and devices, which was a perfect match for the presentation I was doing.

After my quick introduction to the subject, I demoed making a complete application from scratch and deployed it to a series of working phones.  I was able to accomplish this workflow in under a half hour, which was completely amazing for not only myself, but the audience too.  It is really cool to realize that the technologies that I’ve been using as betas for so long have actually matured to the point where I can use them to make real applications.

The session was recorded and hopefully will be posted online soon.  You can view my PowerPoint here (I did have to disable the live voting, but I kept the results for historical purposes), and download ALL the source code and asset files that we used during the presentation here.  Please keep in mind that the logos in the demo code are subject to the use standards found here.

Thanks to the WUD East Lansing team for inviting me!

Adobe MAX write-up

I’ve just come back from this year's Adobe MAX conference, and oh boy, was it a whirlwind!  I feel that Adobe outdid themselves this year and set the bar much higher.  I guess being in the same location more than one year in a row lets them concentrate on content rather than logistics.

The theme of the conference this year was different from previous years…  In years past it was about products coming out soon, product announcements, or beating the drum of certain technologies.  This year there were virtually no product announcements, and no announcements of things coming soon.  It was all about what is out today, and how to use it and leverage it going forward.  I know this disappointed a lot of people in the audience, but with Adobe launching most of their products (ColdFusion, Flex, CS5, etc.) just a few months ago, there wasn't much to talk about.  Read more of this post

Adobe MAX Wrap-Up

Photo courtesy of Dee Sadler - http://www.flickr.com/photos/deesadler/

So, I'm back from LA and the Adobe MAX 2009 conference.  Just like the MAX tagline of “Connect, Discover, Inspire,” I was truly able to accomplish all three.  This year's conference packed in a lot of announcements, and gave everybody a good idea of where Adobe is heading in the marketplace.  All of the keynotes and sessions were recorded, so make sure to check them out on Adobe TV!

So, let's first talk about some of the major announcements:

  • ColdFusion 9 was released.  This has been in the works for about a year and a half, and offers a bunch of new features.  Some of the most compelling new things include the ability to work directly with Microsoft Office documents, ORM, integration with SharePoint, and certain features pre-packaged as a service.
  • LiveCycle ES2 was released.  I'm sure this affects all of 20 people on earth, but this product is just plain awesome.  LiveCycle ES is a workflow management application (for those of you who only deal with consumer applications, think of the process that your paperwork has to go through when you hire somebody new.  You have multiple interviews, background checks, etc. that all belong in a workflow.  This allows you to manage that process and make sure nothing is missed).  With it, a bunch of new Flex components have been released that allow you to integrate your applications with these workflows.  Yet another important part of this suite is the “LiveCycle Collaboration Suite,” formerly known as Cocomo.  This suite allows you to build your own interactive / collaboration services.
  • Mobile Devices.  So, there was lots of fanfare about Adobe's push to make mobile devices 1st class citizens in the computing landscape.  21 of the top 22 device manufacturers have signed on with Adobe, including RIM, Symbian, Google, Microsoft, etc.  The only one that is missing is Apple, of course, but Adobe didn't waste time firing a warning shot across their bow.  Adobe announced that in CS5, they expect to be able to publish full-fledged iPhone/iPod Touch applications that can be published on the iTunes store.  This does not mean that the Flash Player will be available for the iPhone, but simply that you can publish applications that were created in Flash/Flex/Catalyst.

A few things that were not released, but were talked about:

  • Flash Builder 4 – This looks like it was delayed until Q1 of next year.  It's a shame, because a lot of the Adobe tooling is based on it now (interesting thought), so many of those applications have to wait too.  This includes some ES2 apps, etc.  Adobe did release Beta 2 to allow people to refresh their builds and play with things a bit more.
  • ColdFusion Builder – This also looks like it was delayed until Q1 of next year.  It is a LOT closer than people have been anticipating, and, personally, I really like it.  They have really done a lot of research on the workflow model, and I think they will win over a lot of developers who have been using Allaire ColdFusion Studio, Dreamweaver, and all the other products.
  • Codename Stratus – This project allows users to build truly P2P applications with the Flash Player or AIR.  It uses IP multicast or some sort of “home” server to point copies of the Flash Player at each other and allow them to communicate without the use of a server.  This saves bandwidth for the server, and makes the experience better if the users are geographically nearby.  The sheer thought of being able to use IP multicast in the Flash Player is a huge win for me.  This will require FP 10.1.
  • LiveCycle Data Services 3 – It is coming, and very soon.  This brings a whole slew of new features to the LCDS package that will make huge data applications faster and will allow data to flow better.  One of the coolest things about LCDS 3 is the data modeler.  It brings the features of a UML designer, and allows you to both deploy databases from your model and build the skeletons of your applications from the model!  This, to me, is one of the coolest things I saw at the show.  How much was an LCDS server again?
  • Adobe Connect for Mobile – So, this one blew me away, but only a peep was said about it at the conference.  During the Day 1 keynote, they showed the iPhone, among other devices, using a mobile version of Adobe Connect to join meetings!  This, to me, is one of the features that has the potential to keep Adobe Connect ahead of all the other web conferencing suites out there.  They said that we can expect the iPhone, RIM, Android and Microsoft Connect clients to come out “soon”.
  • Flash Player 10.1 – Lots of neat stuff coming in this one.  FP 10.1 will be smaller, meaner, and mobile ready.  It will feature lots of stuff like the Stratus support, and hopefully will make my bed and pour me a beer.  Make sure to check out the online sessions on this one :)

I also had the chance to catch up with a lot of the evangelists, and talk shop with a lot of the people I usually only communicate with online.  It was great to see everybody, oh, and yeah, I went to a lot of sessions and labs too.  I’m hoping to go through all my notes from the action-packed week from my labs and get cracking on some new apps I have floating in my head (yes, I was inspired).
