QueTwo's Blog

thoughts on telecommunications, programming, education and technology

Tag Archives: TV

H.264 Being removed from Google’s Chrome Browser

This afternoon Google announced that they are dropping support for the H.264 codec in future releases of their browser, Chrome. If you want to read more about the announcement, check it out here.

Essentially, their argument is that they want to support open source. And there is no better way to support open source than to include support for the new codec they just bought (VP8) and announced would be open source (WebM). On paper it sounds great, and the world should cheer for them. I personally support ALL the browsers adding support for WebM, so there is at least some consistency across the landscape with a single codec.

However, Google is introducing WebM into their browser by removing support for the H.264 codec. For those of you who don’t know what H.264 is, it is by far the most widely supported video codec today, being directly supported by DVD players, Blu-ray players, video game systems, and, more importantly, enterprise video systems.

Over the last few years we have seen enterprise video shift from tape to MPEG-2, and then to MPEG-4 (H.264), for recording, editing and storage. The great thing about the industry was that, as time passed, more and more devices gained support for H.264, so there was no need to re-encode your video to support every device under the sun. It was starting to become a world where we could publish our video in a single format and push it out to a huge audience. Re-encoding video takes time and storage space, and it typically reduces the quality of the end product.

What is even more unfortunate is that we have recently seen H.264 decoders move into hardware on many devices, allowing them to decode high-definition content and display it full-screen without hitting the processor. This lets developers focus on making more interactive, better-looking user interfaces without worrying about degrading the quality of the video. A good example is the H.264 decoder built into the newer Samsung televisions: you can run HD content as long as it is H.264; otherwise you are lucky to play a 640×480 clip without taxing the processor. One of the reasons HD video on the web was so bad on Mac-based computers was that Apple didn’t allow the Flash Player to access the H.264 hardware accelerator. Newer releases of the Flash Player now support it, and HD video on the web is smooth as silk.

By moving to WebM and abandoning H.264, Google forces manufacturers to reassess their decision to support H.264 in hardware. It will also force content producers to re-encode their video in yet another format (can you imagine YouTube re-encoding their content into WebM just to support Chrome users?), and it will make the HTML5 video tag even harder to work with. Content producers will need to produce their content in H.264 for Flash, many WebKit browsers, and IE9; in WebM for Chrome; and in Ogg Theora for Firefox. The really tricky part is that, to date, there are virtually no commercial products that can encode in all three required formats.

So, if you are looking to support video on the web, what are you to do? Right now the majority of users’ browsers don’t support HTML5 video, but they do support H.264 through the Flash Player. iPad/iPhone/Android devices support H.264 directly, and embedded systems like the Samsung HDTVs, Sony PS3, etc. all support H.264-encoded video. The only reason to encode in a different format today is to support Firefox and Chrome users who don’t have the Flash Player installed (less than 1% of all users?). I’m sure the recommendation will change over the years, but for right now you still pretty much have a one-codec-meets-most-demands solution that you can count on.
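To make that concrete, here is a rough sketch of what serving the same clip to all of these players tends to look like: the HTML5 video tag tries each codec in turn, with Flash as the fallback. The file names and the player SWF below are placeholders, not anything from a real project.

<!-- Rough sketch only: file names and player.swf are placeholders -->
<video width="640" height="360" controls>
  <!-- H.264 / MP4 for Safari, IE9, iOS/Android devices and most hardware players -->
  <source src="myvideo.mp4" type="video/mp4" />
  <!-- WebM (VP8) for Chrome going forward -->
  <source src="myvideo.webm" type="video/webm" />
  <!-- Ogg Theora for Firefox -->
  <source src="myvideo.ogv" type="video/ogg" />
  <!-- Flash fallback for browsers without HTML5 video, reusing the same H.264 file -->
  <object data="player.swf" type="application/x-shockwave-flash" width="640" height="360">
    <param name="flashvars" value="src=myvideo.mp4" />
  </object>
</video>

Three encodes of the same clip just to cover the HTML5 browsers is exactly the extra work (and quality loss) described above.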

Debugging on embedded systems with AIR (including Adobe TV)

One of the projects I was recently involved with had me set up an unusual environment: debugging an embedded device that ran Adobe AIR. There are not a whole lot of specifics I can talk about for that particular project, but one aspect I can talk about is debugging AIR and Flex apps remotely.

Most developers are used to an environment where they have a single computer on which they run their app, and if something breaks, they can launch the debugger on that computer and debug the app. When you are working with embedded devices, mobile devices, or simply devices that don’t run Windows/MacOS, debugging can be a pain. Unfortunately, the documentation for AIR (and Flex) is pretty poor at telling you how to set this up, and the IDE fights you if you don’t know exactly what you are doing. Read more of this post
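In the meantime, here is the rough shape of the workflow, sketched in outline only (the exact steps vary by device, runtime and IDE): compile a debug-enabled SWF, start the command-line debugger fdb on your workstation, and point the device’s debug runtime back at your workstation’s IP address.

# Rough outline only -- adapt paths, IPs and tooling to your device.
mxmlc -debug=true MyApp.mxml      # 1. compile a debug-enabled SWF
fdb                               # 2. start the command-line debugger on your workstation
(fdb) run                         #    fdb now waits for a player to connect (TCP port 7935)
# 3. deploy the debug build to the device, tell its debug runtime the workstation's
#    IP address (how you do that is device-specific), then launch the app;
#    fdb picks up the session and you can set breakpoints as usual.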

AIR for TV — The Virtual Keyboard

As I’ve been porting some of my AIR for TV applications over from Flex, one thing I quickly realized was that I was missing a way to get user input. While AIR exposes all the remote’s buttons as keyboard events, there is a real disconnect if you want to accept alpha-based input (most remotes simply have numbers and special keys like MENU or EXIT). Since many of the apps I’ve been working on were originally designed with the desktop or mobile in mind, completely redoing them just to avoid a keyboard seemed impractical.

I’ve been working the past few weekends on a virtual keyboard component that will allow me to accept alpha user input on a “Virtual” keyboard.  This keyboard is rendered on the screen, and allows the user to pick their keys via the directional buttons on their remote (or pointing device, if one is available).  I modeled the keyboard after the Microsoft On-Screen keyboard that is available for tablets.
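The real component is on the project page below; purely as an illustration of the core idea (and not the component’s actual API), the navigation boils down to treating the remote’s d-pad presses as ordinary KeyboardEvents, moving a highlight around a grid of keys, and dispatching the selected character as text input.

package
{
    import flash.display.Sprite;
    import flash.events.Event;
    import flash.events.KeyboardEvent;
    import flash.events.TextEvent;
    import flash.ui.Keyboard;

    // Minimal sketch of the idea only -- the real component lives on the project page.
    public class VirtualKeyboardSketch extends Sprite
    {
        private var keys:Array = [["A","B","C","D"], ["E","F","G","H"]]; // simplified key grid
        private var row:int = 0;
        private var col:int = 0;

        public function VirtualKeyboardSketch()
        {
            // Remote/d-pad buttons arrive as ordinary KeyboardEvents; listen on the
            // stage once this component has been added to it.
            addEventListener(Event.ADDED_TO_STAGE, function(e:Event):void {
                stage.addEventListener(KeyboardEvent.KEY_DOWN, onKeyDown);
            });
        }

        private function onKeyDown(event:KeyboardEvent):void
        {
            switch (event.keyCode)
            {
                case Keyboard.LEFT:  col = Math.max(0, col - 1);                    break;
                case Keyboard.RIGHT: col = Math.min(keys[row].length - 1, col + 1); break;
                case Keyboard.UP:    row = Math.max(0, row - 1);                    break;
                case Keyboard.DOWN:  row = Math.min(keys.length - 1, row + 1);      break;
                case Keyboard.ENTER:
                    // Hand the selected character to whatever is listening (e.g. a text field).
                    dispatchEvent(new TextEvent(TextEvent.TEXT_INPUT, true, false, keys[row][col]));
                    break;
            }
            // Redrawing the highlight over keys[row][col] is left out for brevity.
        }
    }
}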

This is one of the first components I’ve created that is being released publicly. I know from talking to Jeffry Houser that there are probably 5,000 more things I should be doing for a component that is intended for public use, but let’s just call this my first crack at releasing a reusable component.

Check out the project’s page for more information.  I have posted the source code, and the downloadable SWC there if you want to check it out.  If you would like to contribute, or help me make it better for everybody else, please let me know — I would love some feedback on it!

Creating your first Application for TV

One of the major announcements that came out of Adobe MAX was the availability of AIR 2.5, the first SDK to support televisions as an output. While most of you may be scratching your heads as to why this is a big deal, the few of you who have ever attempted to write an application for an STB (set-top box) or directly for a television know that they are one hard nut to crack.

Generally, up to now, if you needed to push an application to a television-connected device (including DVD players, Blu-ray players, STBs, or TVs themselves), you either had to learn the vendor’s proprietary language, go with the vendor’s interpretation of Java, or just pay them to make the app for you. Additionally, it has only been a very short while that the manufacturers have even given developers the ability to push apps to these devices (with Samsung really paving the way in the past few months).

In comes Adobe with their Open Screen Project, which lets everyday RIA developers simply create applications that can be deployed to these television-connected devices. The dream of write-once-publish-anywhere just got extended to another class of devices. Mind you, these will be high-end devices at first (for example, take a look at Samsung’s TV lineup and filter by Samsung Apps in the features section to get an idea of what you will be targeting). Read more of this post
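As a quick taste before the full walkthrough: most of the television-specific part lives in the AIR application descriptor. A minimal sketch (the id, names and version below are placeholders, not from a real project) might look roughly like this:

<?xml version="1.0" encoding="utf-8"?>
<!-- Rough sketch of an AIR 2.5 descriptor aimed at TVs; ids and names are placeholders -->
<application xmlns="http://ns.adobe.com/air/application/2.5">
    <id>com.example.MyTvApp</id>
    <filename>MyTvApp</filename>
    <versionNumber>1.0.0</versionNumber>
    <initialWindow>
        <content>MyTvApp.swf</content>
        <fullScreen>true</fullScreen>
        <visible>true</visible>
    </initialWindow>
    <!-- Limit the app to the television profiles -->
    <supportedProfiles>tv extendedTV</supportedProfiles>
</application>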
