face in the database and those calculated from the unknown, the more likely it was that the unknown and known faces matched.
The computer code to calculate eigenfaces is relatively simple to implement in software such as Matlab, as shown in the following example, which uses the facial database “yalefaces”.
There is also a video explaining the technique of principal component analysis and eigenfaces titled “Lecture: PCA for Face Recognition” at siliconchip.com.au/link/aaoe


clear all;
close all;
load yalefaces
[h,w,n] = size(yalefaces);
d = h*w;
% vectorize images
x = reshape(yalefaces,[d n]);
x = double(x);
% subtract mean
mean_matrix = mean(x,2);
x = bsxfun(@minus, x, mean_matrix);
% calculate covariance
s = cov(x');
% obtain eigenvalues & eigenvectors
[V,D] = eig(s);
eigval = diag(D);
% sort eigenvalues in descending order
eigval = eigval(end:-1:1);
V = fliplr(V);
% show mean and 1st through 15th principal eigenvectors
figure, subplot(4,4,1)
imagesc(reshape(mean_matrix, [h,w]))
colormap gray
for i = 1:15
    subplot(4,4,i+1)
    imagesc(reshape(V(:,i),h,w))
end
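
The listing above only computes and displays the eigenfaces. As a rough sketch of the matching step described earlier, each known face and the unknown face can be projected onto the leading eigenfaces and the resulting weight vectors compared; the closest weights indicate the most likely match. The snippet below is only an illustration: it assumes the variables from the listing above and a hypothetical h-by-w test image called unknown_face.

% sketch only: match a hypothetical unknown_face against the database
% by comparing eigenface weights (variables from the listing above)
k = 15;                                % number of eigenfaces to use
W = V(:,1:k)' * x;                     % k-by-n weights of the known faces
u = double(reshape(unknown_face,[d 1])) - mean_matrix;
wu = V(:,1:k)' * u;                    % weights of the unknown face
% nearest neighbour: smallest Euclidean distance between weight vectors
dists = sqrt(sum(bsxfun(@minus, W, wu).^2, 1));
[best_dist, best_idx] = min(dists);
fprintf('Closest match: face %d (distance %.1f)\n', best_idx, best_dist);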


More advanced facial recognition


From 1993 to the early 2000s, the US Defense Advanced Research Projects Agency (DARPA) and the National Institute of Standards and Technology (NIST) developed a facial database called FERET that eventually consisted of 2413 24-bit colour images of 856 different people.
Its purpose was to establish a large database of images
that could be used for testing facial recognition systems.
Controversially, in 2002, the US Government used facial recognition technology at that year’s Super Bowl (American Football grand final).
Several petty criminals were detected but the test was seen as a failure, as the technology of that time did not work well in crowds.
This also led to concerns over the civil liberties implications of such technology.
Facebook started using facial recognition technology in
2010 to identify users who appeared in photos posted to
the site by other users. Google Photos and Apple Photos
have now deployed similar technology.
Facial recognition is now also used in airports and border crossings around the world, and by law enforcement agencies.

Steps for facial recognition
For software systems to recognise a face, five main steps
must occur. These are:


  1. Detection of a human face in a still or video image (which
    may have a cluttered background).

  2. Alignment and normalisation of the face to a standardised
    position with even illumination (see the sketch after this list).

  3. Representation of the normalised image with an appropriate
    mathematical pattern.

  4. Feature extraction to determine those characteristics that
    are unique to the face and at variance with an “average” face.

  5. Searching a database of known faces for a match using
    these characteristics or variances.
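
The following Matlab sketch illustrates steps 1 and 2 only; steps 3-5 correspond to what the eigenface listing above does. It assumes the Computer Vision Toolbox (for the Viola-Jones face detector) and the Image Processing Toolbox are installed; the file name and the 112 x 92 target size are arbitrary choices for illustration.

% sketch of steps 1 and 2: detect a face, then crop, resize and
% equalise it (toolboxes and file name are assumptions)
I = imread('photo.jpg');
detector = vision.CascadeObjectDetector();   % Viola-Jones face detector
bboxes = step(detector, I);                  % step 1: detect faces
if ~isempty(bboxes)
    face = imcrop(I, bboxes(1,:));           % take the first detected face
    if size(face,3) == 3
        face = rgb2gray(face);
    end
    face = imresize(face, [112 92]);         % step 2: normalise the size...
    face = histeq(face);                     % ...and even out the illumination
    imshow(face)
end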


Common problems in facial recognition are:



  1. A differing facial expression, pose or angle to that in
    the database.

  2. Differing or uneven illumination.

  3. Ageing of the subject or changes to hairstyle, hair colour etc.

  4. Low size or poor quality of the image.

  5. Additions or deletions of items such as facial hair,


Fig.8: an idealised 3D facial recognition model as seen
from various angles. With a 3D model, a face can be
recognised from many different angles, not just from
straight ahead or slightly off to one side.


Fig.9: Apple’s iPhone X uses its TrueDepth front-facing
3D camera to illuminate a face with a pattern of 30,000
infrared dots, which are then converted to a 3D facial
model. The system is highly accurate and in tests could
not be fooled by identical twins; it would only unlock the
phone for the twin who had been authorised.