CHAPTER 10: Android Animation: Making Your UI Designs Move 355

Just as it does for static digital imagery, Android prefers the PNG format over GIF, WebP,
or JPEG when it comes to frame-based animation. This is due to PNG's lossless image quality
and reasonably good compression results, which will yield a higher-quality end-user
experience in the hands of a capable app developer.


It is also important to note that the Android OS does not currently support the Animated GIF format,
which is also known as the animGIF, or aGIF, animation file format. There are workarounds
being discussed online until support for this format is added to Android. Given that PNG8
gives better compression results, and that defining frames using XML and controlling them using
Java code gives us far more control over the end result we are trying to achieve, I'm going to
focus on currently supported Android frame animation solutions and methodologies, for now.
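As a concrete sketch of that XML-plus-Java approach, the frames can be declared in an AnimationDrawable XML resource and driven from Java. The resource and drawable names here (anim_frames, anim_frame_0, and so on) are hypothetical placeholders, not assets from this chapter's project:

```xml
<!-- res/drawable/anim_frames.xml: each <item> is one PNG frame, with its
     display duration in milliseconds; oneshot="false" loops the animation. -->
<animation-list xmlns:android="http://schemas.android.com/apk/res/android"
    android:oneshot="false" >
    <item android:drawable="@drawable/anim_frame_0" android:duration="64" />
    <item android:drawable="@drawable/anim_frame_1" android:duration="64" />
    <item android:drawable="@drawable/anim_frame_2" android:duration="64" />
</animation-list>
```

```java
// Wire the frame list to an ImageView and start it from Java.
ImageView animView = (ImageView) findViewById(R.id.animImageView);
animView.setBackgroundResource(R.drawable.anim_frames);
AnimationDrawable frameAnim = (AnimationDrawable) animView.getBackground();
frameAnim.start();
```

Note that AnimationDrawable.start() should not be called from Activity.onCreate(), because the drawable may not yet be attached to its window at that point.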


Android OS support for several mainstream digital image file formats gives us an impressive amount
of latitude for optimizing a frame animation's data footprint. Because of Android's support
for the PNG32 format, we will also be able to implement an image compositing work process in our
frame animation endeavors. You can do this by using the 8-bit alpha channel transparency capability
in the PNG32 format on a frame-by-frame basis.


Optimizing Frames: Color Depth and Frame Count


In frame animation, there are three primary ways to optimize your animation data in order to achieve
a smaller data footprint. You can reduce the resolution of each frame, you can reduce the color
depth used to define each frame, and you can reduce the number of frames utilized to create an
illusion of motion in the animation.
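A quick back-of-the-envelope calculation shows how strongly each of these three factors multiplies into the raw (uncompressed, in-memory) data footprint. The frame sizes and counts below are illustrative only, not taken from any particular project:

```java
public class FrameFootprint {

    // Raw (uncompressed) footprint of a frame animation, in bytes:
    // width x height (resolution) x bytes-per-pixel (color depth) x frame count.
    static long footprintBytes(int width, int height, int bytesPerPixel, int frames) {
        return (long) width * height * bytesPerPixel * frames;
    }

    public static void main(String[] args) {
        // 16 frames of 480x480 PNG32 (4 bytes per pixel: RGB plus alpha).
        long truecolor = footprintBytes(480, 480, 4, 16);

        // Same animation as PNG8 (1 byte per pixel, indexed color),
        // with the frame count cut in half.
        long optimized = footprintBytes(480, 480, 1, 8);

        System.out.println(truecolor);  // 14745600 bytes (about 14 MB)
        System.out.println(optimized);  // 1843200 bytes (about 1.8 MB)
    }
}
```

Halving the frame count and dropping from 32-bit to 8-bit color cuts the raw data by a factor of eight, before PNG compression is even applied; reducing resolution compounds this further, since it enters the product twice (width and height).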


Since we need to provide at least three different resolution density targets to support Android 4.4,
we will focus first on optimizing the color depth and the total number of frames as much as possible,
because you're essentially going to want to provide frame resolutions spanning from
at least 120 pixels for MDPI, through 240 pixels for HDPI, to a maximum of 480 pixels for XHDPI.
As you'll see during this chapter, you do have the option to provide fewer than five resolution density
targets (we will use three) and have Android scale the rest. The hope is that Android will scale your
assets down (termed downsampling) rather than scaling assets up (termed upsampling).


Since we have a choice between a lossless PNG32, which as you now know is a truecolor
PNG with a full 8-bit alpha channel, and an indexed-color PNG8, which is many times smaller
per frame, we will use the PNG8 format for our application's animated UI elements that do not
need to be composited (using a PNG32's 8-bit alpha channel) in this chapter. We can use the
ImageView plates to implement our compositing pipeline, using a background plate for static
graphics and a source (foreground) plate for the animation.
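A minimal sketch of that two-plate approach, assuming a hypothetical ImageView with the id compositeView and hypothetical drawable resource names:

```java
// One ImageView serves as two compositing "plates": the background plate
// holds a static PNG32 graphic, while the source (foreground) plate holds
// the frame animation, which Android composites over the background.
ImageView plate = (ImageView) findViewById(R.id.compositeView);
plate.setBackgroundResource(R.drawable.static_backplate);  // static PNG32 plate
plate.setImageResource(R.drawable.anim_frames);            // animated source plate
AnimationDrawable frameAnim = (AnimationDrawable) plate.getDrawable();
frameAnim.start();
```

Because the animation occupies the source plate here, it is retrieved with getDrawable() rather than getBackground(), and the per-frame PNG32 alpha channels determine where the static backplate shows through.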


It is interesting to note that there is an advanced way around this compositing challenge: use
a white or a black background color for a PNG8 animation that has no alpha channel, and then
composite it using the Android PorterDuff class. With certain PorterDuff blending modes, white
or black values can be made transparent using blending algorithms rather than alpha channels.
Alpha channels are more processor-efficient (although they carry a heavier data footprint,
especially when repeated across an animation's many frames) because they are "static" (that is,
pre-defined), whereas a "dynamic" algorithm run at compositing time consumes valuable CPU
processor cycles (system hardware resources). This PorterDuff image blending class is covered
in great detail in Pro Android Graphics (Apress, 2013), which I wrote prior to this title.
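As a sketch of how such a blend can be applied, the SCREEN mode (result = S + D − S×D) maps black source pixels to the destination value, so a black-background frame lets the backdrop show through wherever the frame is black; MULTIPLY does the analogous job for a white background. The Bitmap variable names below are hypothetical:

```java
// Composite a black-background, alpha-free animation frame over a backdrop
// using PorterDuff SCREEN blending: black source pixels (value 0) yield the
// backdrop unchanged, so black acts as "transparent" without an alpha channel.
Bitmap result = Bitmap.createBitmap(backdrop.getWidth(), backdrop.getHeight(),
        Bitmap.Config.ARGB_8888);
Canvas canvas = new Canvas(result);
canvas.drawBitmap(backdrop, 0, 0, null);                       // destination (D)
Paint blendPaint = new Paint();
blendPaint.setXfermode(new PorterDuffXfermode(PorterDuff.Mode.SCREEN));
canvas.drawBitmap(animFrame, 0, 0, blendPaint);                // source (S)
```

The trade-off described above applies here: this blend is recomputed for every frame at runtime, whereas an alpha channel would be resolved once per pixel from pre-defined data.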
