Digital Video Concepts: Bitrates & Codecs
Like the 2D animation we learned about in the previous chapter, digital video extends digital imaging
into another dimension, time, by using what the digital video (and film) industry calls frames.
Video is thus also composed of an ordered sequence of frames that are displayed rapidly over time.
The difference from animation, at least in real-world usage inside of Android, is that digital video
usually has a fairly massive number of frames, commonly 30 for every second of video playback time.
The optimization concept with frames in digital video is very similar to the one regarding pixels
in an image (the resolution of the digital image), because every frame used multiplies the data
footprint. In digital video, not only does the frame's (image) resolution greatly impact the file size,
but so does the number of frames per second that the codec encodes. This is commonly referred to as
FPS, or the "frame rate." Standard industry video uses 30 frames per second, but you can use fewer
if you want!
Since digital video is made up of a collection of thousands of digital image frames, the frame rate,
expressed as frames per second (FPS), is also very important to our digital video data footprint
optimization work process. This is because lowering the number of frames per second that the codec
encodes lowers the total amount of data that is encoded, which in turn lowers the resulting file size.
In Chapter 8, we learned that if we multiply the number of pixels in an image by its number of color
channels, we get the raw data footprint for that image. With digital video, we now multiply that
number again by the number of frames per second at which our digital video is set to play back,
and again by the total number of seconds in the duration of the video "clip" that is being encoded
into a digital video asset file. You can see why having a video codec that can compress this raw
data footprint down is extremely important!
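To make that math concrete, here is a minimal sketch (not from the book) that estimates the raw,
uncompressed footprint of a short clip, assuming three 8-bit color channels (24-bit RGB); the
resolution, frame rate, and duration are example values only.

// A quick estimate of the raw (uncompressed) data footprint of a video clip.
// Assumes three 8-bit color channels (24-bit RGB); all values are examples.
public class RawVideoFootprint {
    public static void main(String[] args) {
        int width = 480;            // frame width in pixels
        int height = 800;           // frame height in pixels
        int colorChannels = 3;      // one byte per channel (24-bit color)
        int framesPerSecond = 30;   // frame rate the codec will encode
        int durationInSeconds = 10; // length of the video clip

        long bytesPerFrame = (long) width * height * colorChannels;
        long rawClipBytes = bytesPerFrame * framesPerSecond * durationInSeconds;

        System.out.println("Bytes per frame: " + bytesPerFrame);                     // 1,152,000
        System.out.println("Raw clip size in MB: " + rawClipBytes / (1024 * 1024));  // about 329
    }
}

Even a ten-second clip at a modest 480x800 resolution works out to roughly a third of a gigabyte
of raw data, which is exactly the problem a video codec exists to solve.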
Later in this chapter, you will be amazed at some of the digital video data compression ratios that
we will achieve using the MPEG-4 video file format, once we know exactly how to optimize the digital
video compression work process by using the correct bitrate, frame rate, and frame resolution for
our digital video content. We'll get into the concept of bitrates, as well as video optimization,
during the next few sections of this chapter. First, let's review the different digital video codecs
that the Android OS currently supports.
Digital Video in Android: MPEG4 H.264 and WebM
Android supports the same digital video formats that HTML5 supports. These include MPEG-4 H.264
(MPEG stands for Moving Picture Experts Group) as well as the VP8 codec, which Google obtained
through its acquisition of On2 Technologies and then released royalty-free as part of the open
source WebM format. This is optimal from a content production standpoint, as the video content
which a developer produces and optimizes can be used both in HTML5 environments, such as apps,
browsers, and devices, as well as in the Android OS.
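As a preview of how such an asset is consumed on the Android side, here is a minimal sketch,
assuming an MPEG-4 (or WebM) clip has been copied into the project's res/raw folder; the R.raw.intro
resource name and the Activity class name are illustrative only, not part of the book's project.

import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;
import android.widget.MediaController;
import android.widget.VideoView;

public class VideoPlaybackActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // Point a Uri at a raw resource bundled inside the APK.
        // R.raw.intro is a hypothetical resource name used for this sketch.
        Uri videoUri = Uri.parse("android.resource://" + getPackageName()
                + "/" + R.raw.intro);

        VideoView videoView = new VideoView(this);
        videoView.setVideoURI(videoUri);

        // Attach the standard transport controls (play, pause, seek bar).
        MediaController mediaController = new MediaController(this);
        mediaController.setAnchorView(videoView);
        videoView.setMediaController(mediaController);

        setContentView(videoView);
        videoView.start();
    }
}

The same compressed .mp4 or .webm file that plays in this VideoView could also be referenced from
an HTML5 video tag, which is the cross-platform benefit described above.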
This cross-platform support for these digital video formats thus affords content developers a
"produce once, deliver everywhere" production scenario! This reduces content development costs,
and thus increases revenues, as long as app developers take advantage of this economy of scale
in content development.