Headline writer gets it backwards

Today I came across what might be a perfect example of a biometrics headline/article non sequitur.

The article linked below is pretty bullish on biometrics from both the convenience and security angles. In addition to favorable quotes from a user and an industry executive, an Electronic Privacy Information Center (EPIC) staff member is quoted speaking of biometric technologies in favorable terms. As long-time readers may recall, that hasn’t always been the case.

Given all that, it’s hard to comprehend why the headline writer went with:

New technology causes new privacy, security concerns (WJXT – Jacksonville, FL)

I’d argue that the headline writer has it backwards. On the article’s own terms a better headline would be “Privacy, security concerns fuel new technology,” but reading the article might have given me an unfair advantage.

Security or Privacy? Yes, please.

Security vs. privacy (Homeland Security Newswire)

Those who ask you to choose security or privacy, and those who vote on security or privacy, are making false choices. That’s like asking, “Air or water?” You need both to live.

Maslow placed safety (of which security is a subset) as second only to food, water, sex, and sleep. As humans we crave safety. As individuals and societies, before we answer the question “security or privacy,” we first have to ask “security from whom or what?” and “privacy from whom and for whom?”

Face recognition, marketing and privacy

It’s Your Face. Or Is It? (Press Release at Marketwire)

“From a marketer’s point of view it’s heaven. They can tailor ads, products, even prices based on your age, tax bracket, social media persona and purchasing habits. Marketers will pay handsomely for that information.” For example, NEC has developed a marketing service utilizing facial recognition technology. It estimates the age and sex of customers, along with the dates and number of times that customers go to each store. This information is then analyzed to help predict trends in customer behavior and shopping frequency.

“From a consumer’s point of view this could be a nightmare — the ultimate invasion of privacy.”

Johnson continues, “I’m not just a brand strategist. I’m also a consumer. And I’d like to speak with the voice of reason. New technology can offer enormous benefits. It also comes with enormous responsibility.” Johnson firmly believes we are collectively charged with that responsibility. We have to ensure this facial recognition technology does not become an all-out assault on our privacy. “Do we want our children to be added to these facial databases? Probably not. Do we ourselves want to be added without our knowledge or permission? Probably not.”

We tackled the very interesting topics of marketing and the privacy of faces in this post from 2011.

It’s also worth noting that there are two different ways facial recognition technology can be applied to marketing in the brick-and-mortar world. True face recognition, which matches a face to a unique individual in order to send a marketing message tailored to that one person, is still pretty hard. Inferring demographic traits with facial analysis technologies does not rely on a unique identification and may provide a bigger bang for the buck (ROI) than true facial recognition.
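The distinction can be sketched in code. In the toy example below, the embeddings, threshold, and attribute fields are all hypothetical stand-ins for what a real face engine would produce; only the contrast between the two operations is the point.

```python
import math

# Hypothetical 1:N identification: match a probe face embedding against an
# enrolled gallery, and answer only if the best score clears a threshold.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def identify(probe, gallery, threshold=0.8):
    """Return the enrolled ID best matching the probe embedding,
    or None when nothing clears the threshold (the hard problem)."""
    best_id, best_score = None, -1.0
    for person_id, embedding in gallery.items():
        score = cosine(probe, embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id if best_score >= threshold else None

def estimate_demographics(face_attributes):
    """Facial analysis: report coarse traits a detector might emit
    (age band, sex) without ever naming an individual."""
    return {"age_band": face_attributes.get("age_band", "unknown"),
            "sex": face_attributes.get("sex", "unknown")}
```

The first function only pays off when the gallery actually contains the shopper; the second yields a usable marketing signal for any face that passes the camera, which is why it can offer the bigger return.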

It’s all ID nowadays

If the one word for the ’60s was plastics and in the ’80s it was all ball bearings, the technology touchstone for the 2010s figures to be identity.

The “i” in the next iPhone will stand for “identity.” (Cult of Mac)

When people hear rumors and read about Apple’s patents for NFC, they think: “Oh, good, the iPhone will be a digital wallet.” When they hear rumors about fingerprint scanning and remember that Apple bought the leading maker of such scanners, they think: “Oh, good, the iPhone will be more secure.”

But nobody is thinking different about this combination. Everybody is thinking way too small. I believe Apple sees the NFC chip and fingerprint scanner as part of a Grand Strategy: To use the iPhone as the solution to the digital identity problem.

NFC plus biometric security plus bullet-proof encryption deployed at iPhone-scale adds up to the death of passwords, credit cards, security badges, identity theft and waiting in line.

Apple loves to solve huge, hitherto unsolved problems. And there is no problem bigger from a lost-opportunity perspective than digital identity.

The Boston Consulting Group estimates that the total value created through real digital identity is $1 trillion by 2020 in Europe alone.

Read the whole thing. Stripped of the Apple-worship, it’s an astute post.

The link inside the quote above is in the original, and the PDF it links to is well worth a look as well. From the executive summary…

Increasingly, we are living double lives. There is our physical, everyday existence – and there is our digital identity. Most of us are likely more familiar with that first life than with the second, but as the bits of data about us grow and combine in the digital world – data on who we are, our history, our interests – a surprisingly complete picture of us emerges. What might also be surprising for most consumers is just how accurate and traceable that picture is.

Views on digital identity tend to take one of two extremes: Let organisations do what they need to in order to realise the economic potential of “Big Data,” or create powerful safeguards to keep private information private. But digital identity can’t be cast in such black-and-white terms. While consumers voice concern about the use of their data, their behaviours – and their responses to a survey conducted specifically for this report – demonstrate that they are willing, even eager, to share information when they get an appropriate benefit in return. Indeed, as European Commissioner for Justice Viviane Reding remarked, “Personal data is in today’s world the currency of the digital market. And like any currency it has to be stable and it has to be trustworthy.” This is a crucial point. Consumers will “spend” their personal data when the deals – and the conditions – are right. The biggest challenge for all stakeholders is how to establish a trusted flow of this data.

A new type of ID is needed to bind our physical and online selves, payments and hardware. If the tech giants are going to finish off the post office and assume the role of credit card companies, they’re going to have to solve the ID problem. If they solve the ID problem, there’s really no telling how many other business models they can disrupt.

Survey: Banking customers willing to share more personal information for more personalized service

Internet of Things and Other Tech Consumers Want from Banks: Survey (American Banker)

A global survey Cisco released Monday offers clues to the types of technology consumers want to use to interact with their banks. One finding was that 69% of U.S. consumers would provide more private information in exchange for more personalized service, higher security against identity theft, and greater simplicity in managing their finances. These enhanced services could harness “the internet of things” in which everyday objects transmit information to a network.

More specifically, 83% of consumers said they would be willing to provide details about their financial habits and have their banks be more active advisors in exchange for greater protection from identity theft. “There’s an awareness that identity theft is a very ugly thing to have happen and that banks are naturally going to be targets,” says Al Slamecka, marketing manager, Financial Services, for Cisco. Many U.S. consumers (53%) would be willing to offer up biometric identification like a fingerprint in return for better protection against ID theft.

More interesting findings at the link.

The behavioral science of decisions affecting privacy

Profile: Alessandro Acquisti, behavioral economist at Carnegie Mellon University in Pittsburgh (New York Times)

Often, we turn over our data in exchange for a deal we can’t refuse.

Alessandro Acquisti, a behavioral economist at Carnegie Mellon University in Pittsburgh, studies how we make these choices. In a series of provocative experiments, he has shown that despite how much we say we value our privacy — and we do, again and again — we tend to act inconsistently.

Is your personal information worth more than the price of a cup of coffee? Yes and no. (IT World)

It’s easy to be apathetic about abstract terms like “privacy,” but much harder to be so casual if some stranger asks you to, say, share your kids’ schedule and the location of their schools. This is one reason why the terms we use matter so much when talking about user privacy, and why Orwellian definitions of words like tracking, anonymity, choice and freedom are an enormous red flag that should make all of us a little jumpy.

Please read both articles if you’re interested in privacy.

Keeping school lunch biometrics in perspective

Maryland: Bill from Carroll senator would ban collection of students’ biometric data (Baltimore Sun)

Earlier this school year, Carroll County Public Schools had biometric scanners in place in about 10 school cafeterias, where they were used to help expedite the process of paying for school meals. Officials said the scanners would be more efficient than processing cash transactions or using a PIN keypad system.

But officials fielded complaints from some parents who felt the scanners were an invasion of privacy.

If you think biometrics for school lunch payment are bad, you’re not going to like this:

Joy Pullmann: Data mining kids crosses line (Orange County Register)

The U.S. Department of Education is investigating how public schools can collect information on “non-cognitive” student attributes, after granting itself the power to share student data across agencies without parents’ knowledge.

The feds want to use schools to catalogue “attributes, dispositions, social skills, attitudes and intrapersonal resources – independent of intellectual ability,” according to a February DOE report, all under the guise of education.

Read the whole thing.

Like we’ve said before, “If schools are unable to keep data secure, biometric template information is the last thing that should concern parents.” “Secure” doesn’t really apply in the situation described above but the observation that schools already possess very detailed information about students stands.

For the curious: This is an actual biometric template created using one finger, an off-the-shelf fingerprint reader and its freely circulated software development kit (SDK). It consists of 800 hexadecimal characters.

2aba08229b3b2a44e72c8f14da168a560a3caf2257add068a7fc1636215bff53152546da3fc8071ea84433a42261f4ff7bc3b455199be8980eea2bb1e922f18aa309e050130d72ca124ecd6e9e86459e60858ff44f71d0c1c4e23b97a9a6554619543e8d347f79ea8fa70db87eaea7f37bf2cac4e697d5525479cc72fb653b5d32089e7b3cbcd01f8dba60eda95a50a31b2dc9ebaf0d5f602a64ff47f06cf97c1b2dc9ebaf0d5f602a64ff47f06cf97c1b2dc9ebaf0d5f602a64ff47f06cf97c1b2dc9ebaf0d5f602a64ff47f06cf97c1b2dc9ebaf0d5f602a64ff47f06cf97c1b2dc9ebaf0d5f602a64ff47f06cf97c1b2dc9ebaf0d5f602a64ff47f06cf97c1b2dc9ebaf0d5f602a64ff47f06cf97c1b2dc9ebaf0d5f602a64ff47f06cf97c1b2dc9ebaf0d5f602a64ff47f06cf97c1b2dc9ebaf0d5f602a64ff47f06cf97c1b2dc9ebaf0d5f602a64ff47f06cf97c1b2dc9ebaf0d5f602a64ff47f06cf97c1b2dc9ebaf0d5f602a64ff47f06cf97c1b2dc9ebaf0d5f602a64ff47f06cf97c1b2dc9ebaf0d5f602a64ff47f06cf97c

Something similar could be used instead of a PIN for lunch purchases in Maryland schools unless the state bans the technology.
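To underline how opaque such a template is, a minimal sketch, using only the first 64 of the 800 hexadecimal characters above as a stand-in for the full string:

```python
import hashlib

# Stand-in: the first 64 of the 800 hexadecimal characters published above.
template_hex = "2aba08229b3b2a44e72c8f14da168a560a3caf2257add068a7fc1636215bff53"

# A template decodes to raw bytes: proprietary minutiae data, not an image.
template = bytes.fromhex(template_hex)
print(len(template))  # two hex characters per byte, so all 800 would yield 400 bytes

# Opaque bytes can still be hashed or compared for exact equality, but they
# reveal nothing a schedule, grade book, or address record wouldn't dwarf.
digest = hashlib.sha256(template).hexdigest()
print(len(digest))
```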

Now which is more risky to student privacy, those 800 characters which I’ve freely put online and made public, or other types of records schools routinely and uncontroversially* keep?

*Ms. Pullmann seems to find the potential sharing of information without parental knowledge and the chipping away of existing privacy protections that prevented sharing of non-academic information (including biometric information) more problematic than the fact that schools know a lot of non-cognitive details about students.

On another note the mention of “a biometric wrap on kids’ wrists” caught my eye. Within the large and growing list of biometric modalities, I’ve never heard of wrist biometrics. I suspect that this is another example of confusion that arises when “biometrics” and “biostatistics” are needlessly lumped together, a subject we have covered in some detail.

Can respect for privacy be a competitive differentiator?

Though biometrics get quite a lot of attention from people interested in privacy, the real action is in the internet browser and online services. Just remember: if you are not paying for it, you’re not the customer; you’re the product being sold.

The Microsoft “Scroogled” ad campaign against Google is interesting because it indicates that the high-level marketing types at Microsoft believe the public is open to the message that some web services are taking too much information from users compared to the value the users receive in “free” services. Whether respect for privacy is a competitive differentiator among web services remains to be seen, but the fact that Microsoft has spent real time and money on the assumption that it is should not go unnoticed.

Google Privacy Chief Blasts Microsoft’s “Scroogled” Campaign at RSA Conference (CIO)

The bulk of the article linked above is devoted to privacy standards, privacy policy and corporate management. While that’s not nearly as eye-catching as a slugfest between Information Age titans, it is a much more substantial issue and one worthy of serious attention.

A survey of biometric modalities and their social impact

Biometrics Looks To Solve Identity Crisis (Electronic Design)

You see them in blockbuster movies and high-tech TV shows—biometric systems that rely on fingerprints, facial recognition, and other physical and behavioral data to provide identification. But these technologies have moved past the sci-fi genre, and even beyond the high-security arena. They’re hitting the mainstream now. In fact, you may even be using some of them already.

This is just the introductory paragraph. The whole article is worth reading.

Face rec philosophy

Face Recognition in Retail: Profit, Ethics and Privacy (Allevate)

Having previously written on the subject of the application of face recognition in airports as applied by law enforcement and border control, this article looks at the increasing exploitation of the technology for commercial advantage. As well as contrasting the different use-cases defined by commercial exploitation versus public safety applications, this article also touches upon the very different agendas of those using the technology and the privacy issues that arise.

Read the whole thing.

Thursday on Twitter: Biometric Chat on Biometrics & Privacy

UPDATE:
The transcript of the Biometric Chat on Privacy is here.

When:
January 10, 2013, 11:00 am EST, 8:00 am PST, 16:00 (BST), 17:00 (CEST), 23:00 (SGT), 0:00 (JST)

Where:
tweetchat.com/room/biometricchat (or Twitter hashtag #biometricchat)

What:
Tweet chat on biometrics and privacy with Shaun Dakin (@ShaunDakin, @PrivacyCamp), Data Privacy Advocate and Founder of #privchat

Topics:
What technologies have negative impacts on privacy, how the privacy industry works for change, privacy and biometrics, effectiveness of “privacy by design” and “privacy impact assessments,” biometrics as a “privacy protector,” and more.

More at the M2SYS blog.

Earlier topics have included privacy, mobile biometrics, workforce management, biometrics in the cloud, law enforcement, and modalities such as iris and voice.

I always enjoy these and judging by participation at the last one, they’re gaining some traction with ID professionals. Many thanks to John at M2SYS for putting these together.

‘Another Brick in the Wall’ was written in 1979

Washington Times Editorial: Securing America’s schools

Though the benefits of creating maximum-security schools are questionable, the negative impact on young minds is undeniable. Surveillance cameras would watch a child’s every move from kindergarten through high school. GPS devices would track them, and biometric scanners and identification cards would ensure compliance with all attendance regulations. This normalizes a police state. Instead of learning self-reliance, kids would grow up with a state-supplied — and illusory — security blanket.

Schools knowing where students are and whether or not they are attending class discourages self-reliance? Does using technology for the purpose change its nature?

Just remember ‘Another Brick in the Wall’ was released in 1979. High technology isn’t a necessary (or sufficient) condition for police state normalization.

On another note, and in the wake of recent events, a school system in Illinois is dusting off a previously shelved plan to use biometrics to restrict access to schools to those who have been vetted beforehand:

Dist. 201 plans to launch more safety measures (Morris Daily Herald – Illinois)

The district will also re-investigate biometric thumbprint scanning systems for the vestibule, a program they began looking at a year ago.

If the system were used, all parents/guardians would provide a digital thumbprint during school registration. Along with a photo ID, the fingerprint would be in the district’s computer system. Once inside the vestibule, the parent would scan their thumb and staff would pull up the person’s photo at the same time.
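As described, the computer does the lookup and a staff member does the verifying. A hypothetical sketch of that flow follows; the record fields, keys, and the choice to store a hash rather than a raw print are my assumptions, not the district's design:

```python
# Hypothetical vestibule check-in: the scan retrieves the enrolled record,
# and a human (not the computer) confirms the photo matches the visitor.
ENROLLED = {
    "thumb-hash-123": {"name": "A. Parent", "photo": "a_parent.jpg"},
}

def vestibule_lookup(thumb_hash):
    """Return the enrolled photo filename for staff to compare against
    the person standing in the vestibule, or None if not vetted."""
    record = ENROLLED.get(thumb_hash)
    if record is None:
        return None  # not pre-vetted; staff intervene before entry
    return record["photo"]
```

Note the design choice: the biometric narrows the search to one record, while the final access decision stays with a person looking at a photo.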

Biometrics experts on technology & privacy

Biometrics and Privacy: A Positive Match (accenture) …Views from leading biometrics specialists on how to reap the benefits offered by biometric solutions while preserving and enhancing the individual’s right to privacy.

The (5 min.) video is very well edited from interviews conducted among industry experts at the Biometrics 2012, London confab. It isn’t posted in an embeddable format or I’d have it for you here, but it’s very much worth clicking over to accenture to view it.

A three-page PDF transcript of the quotes, with attribution, is here.

The quotes are excellent individually. Collectively, they reflect that the people who work in the biometrics industry have devoted considerable thought to the ways biometric technologies can be used to improve people’s lives.

Irish privacy commissioner’s report

It’s mostly inspired by the Facebook photo tagging affair but it deals with privacy issues and biometrics in a holistic way.

Ireland: Preserving Privacy In The Age Of Biometrics (mondaq)

The Office of the Irish Data Protection Commissioner (‘ODPC’) recently published its audit report regarding Facebook. The audit was undertaken to determine whether Facebook had implemented recommendations stemming from the ODPC’s first audit in 2011. While the audit was largely positive in its findings, the photo tagging feature introduced by Facebook, ‘tag suggestion’, was deemed by the ODPC to be a step too far for compliance with European data protection rules. This tool used cutting-edge facial recognition technology to automatically suggest the matching of names and pictures, i.e. upon the Facebook user uploading a photo, ‘tag suggestion’ would prompt the names of the individuals appearing in such image.

Consent, contract and transparency are all discussed in some detail at the link and we’ve discussed those topics philosophically on this blog in the past. There is also an analysis of proportionality in the linked article. Proportionality is a concept seen a lot in discussions of privacy issues involving European government institutions. It’s not a big part of privacy discussions in the United States.

In Europe, governments seem to feel freer to proactively inject themselves into arrangements between private entities than do governments in the United States. The recent French decision re biometrics for time-and-attendance is a good example of the invocation of proportionality to regulate the behavior of private entities.

In the United States, negligence, liability and torts seem to fill some of the roles proportionality plays in Europe. Since the legal system in the United States generally holds that one cannot consent to another party’s negligence, negligent parties are exposed to civil suits in the event that a data breach harmful to individuals occurs.

In general, it seems that the European approach is more proactive and government driven while the approach in the United States is more reactive and driven by private interests.

France severely limits biometrics for time-and-attendance

No biometrics to control working hours (CNIL)

October 23, 2012

In recent years, workplace monitoring techniques have grown at an unprecedented rate, including through the use of biometric devices. The CNIL therefore sought the opinions of trade unions and employers, the General Directorate of Labour, and a number of professionals on the use of this technology. The issue of biometrics as a tool for managing and monitoring attendance has been analyzed under the Data Protection Act and in accordance with the Labour Code.

The Commission has always been vigilant about biometrics. Biometric data have the peculiarity of being unique and permanent, because they identify an individual by his or her physical, biological or behavioral characteristics (e.g. fingerprint, hand geometry). They are neither assigned by a third party nor chosen by the person. They are produced by the body itself, and permanently, thereby allowing the “tracing” of individuals and their identification.

The sensitive nature of these data explains why the Data Protection Act provides for specific oversight by the CNIL, based essentially on the proportionality of the device in relation to the objective sought, such as time management.

On 27 April 2006, the Commission adopted a single authorization (AU-007) for the implementation of biometric recognition based on hand geometry for the purposes of access control, time management and workplace catering.

Following more than a dozen hearings, a clear consensus emerged that the use of biometrics to monitor work schedules is disproportionate.

The Commission has therefore decided to amend AU-007 to remove the provision that allowed the use of hand geometry for time management. From now on, no single authorization permits monitoring employee schedules with a biometric device.

Transitional measures
Organizations that already use such a device to monitor schedules, and that submitted a commitment to comply before the publication of this new deliberation, may continue to use it for a period of five years. After that time, they must stop using the biometric feature, which will not necessarily require replacing the hardware: organizations can configure the system to disable the biometric function and use codes, cards and/or badges instead. The CNIL has individually informed the organizations that had previously submitted a commitment to comply with AU-007.

However, hand-geometry devices may still be used to control access to premises or to manage workplace catering. Such processing remains subject to a commitment to comply with AU-007.

Installing a biometric device for purposes other than those covered by AU-007 will require a request for specific authorization, which the Commission will consider on a case-by-case basis. [ed. Translation by Google; emphasis in original]

See also: No CNIL single authorization now allows monitoring employee schedules with a biometric hand-recognition device.

It seems that France has placed some limits on biometrics for time-and-attendance, preventing new adoption and requiring a five-year phase-out for those currently using the technology.

CNIL explicitly okays biometrics for physical access control.

No example of actual “tracing” or violation of privacy is mentioned in the statement.

It appears the CNIL has preserved in law a certain degree of inefficiency in the French labor market, inefficiency that biometric technology can help reduce. So far, this is the only case of its kind that I’m aware of.

Oh well, vive la différence.

h/t:
PogoWasRight.org
@M2SYS

Wonderful New World vs. Brave New World

Data Privacy Commissioners Discuss Ubiquitous Tracking (Forbes)

The big question for those gathered here is finding the right mix between government regulations, industry’s best practices and consumer education. In a speech at the conference, Microsoft general counsel and executive vice president Brad Smith agreed that some regulations are necessary to create a level playing field and a clear set of rules for big and small companies to follow while regulators like Portugal’s Clara Guerra acknowledged that big government can’t solve all the risks associated with big data. It’s a shared responsibility and it requires consumer awareness starting with privacy education programs aimed at children as well as adults.

Author Larry Magid strikes an important balance between the wonderful things made possible by technological innovation, the downsides of unaccountable misuse, and the need to help people stay aware of the implications of changing technology for their lives.

I hope his temperament is contagious.

Bikini Detection algorithm raises the stakes in social media

Add bikinis to the list of objects recognizable by computer algorithms.

IPhone app that finds racy Facebook photos raises privacy worries (Los Angeles Times)

“This is a very touchy subject, of course,” Barto said. “Anything that’s readily available on Facebook, that’s what we can search. Those privacy tools on Facebook should be used to control the content that you want to be private.”

The app works in a similar way to the facial-recognition technology found in video chat programs and Facebook’s tag prompts. But instead of identifying faces, Badabing identifies the shape of a bikini. That means in addition to beach photos, the app may return pictures of a T-shirt with the outline of a swimsuit.
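Shape detection of this kind can be sketched as a sliding-window template match. The toy version below works on binary images; real detectors use trained models rather than literal templates, so this only illustrates why a look-alike outline (the T-shirt) triggers the same response as the real thing:

```python
# Toy sliding-window template match: score how well a small binary shape
# template overlaps each position of a binary image. Any region with the
# right outline fires the detector, which is why a T-shirt printed with a
# swimsuit shape can trigger a bikini detector.
def match_score(image, template, top, left):
    th, tw = len(template), len(template[0])
    hits = sum(image[top + r][left + c] == template[r][c]
               for r in range(th) for c in range(tw))
    return hits / (th * tw)

def detect(image, template, threshold=0.9):
    """Return (row, col) positions where the template match clears the threshold."""
    th, tw = len(template), len(template[0])
    h, w = len(image), len(image[0])
    return [(r, c) for r in range(h - th + 1) for c in range(w - tw + 1)
            if match_score(image, template, r, c) >= threshold]
```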

Object recognition is really starting to take off.

Related: Biometrics, object recognition and search

EU Urges Google on Transparency

EU regulators say Google must revise its privacy policy (The Verge)

The EU is fine with Google’s unified privacy policy acting as a “general guideline” about its operations, but it wants the search giant to return to its old system, which provided specific privacy notices for each Google product. It says these product-specific privacy policies must include “simple and clear explanations” on when, why, and how location, credit card, unique device identifier (UDID), and telephony data are collected, along with information on how users can opt out. It asks that Google add a specific clause for biometric data where necessary, as there is currently no mention of facial recognition in its privacy policy.

Playing it down the middle

Biometric ID advance ignites debate over rights (Trib Live)

Long envisioned as an alternative to remembering scores of computer passwords or lugging around keys to cars, homes and businesses, technology that identifies people by their faces or other physical features finally is gaining traction, to the dismay of privacy advocates.

A balanced article on the tension between biometric technology and privacy.

UK Surveillance Commissioner Speaks

CCTV Technology has ‘Overtaken Ability to Regulate it’ (Wall Street Journal)

“A tiny camera in a dome with a 360-degree view can capture your face in the crowd, and there are now the algorithms that run in the background. I’ve seen the test reviews that show there’s a high success rate of picking out your face against a database of known faces.”

Research into automatic facial recognition being carried out by the Home Office has reached a 90 per cent success rate, he said, and it was “improving by the day”.
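A “success rate” for 1:N identification is usually a rank-1 hit rate: the fraction of probes whose true identity comes back at the top of the candidate list. A minimal sketch of that metric, under that assumption:

```python
def rank1_hit_rate(trials):
    """Compute the rank-1 identification rate.
    trials: iterable of (true_id, ranked_candidate_ids) pairs, where
    ranked_candidate_ids is the system's best-first candidate list."""
    trials = list(trials)
    hits = sum(1 for true_id, ranked in trials if ranked and ranked[0] == true_id)
    return hits / len(trials)
```

At a 90 per cent rank-1 rate, one probe in ten still comes back with the wrong name on top, which matters when the gallery is “a database of known faces” and the probe is a face in a crowd.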

The headline quote comes from this more detailed article from The Independent, and might best be taken as a warning rather than a statement of fact. After all, if meant literally, the statement belongs in a resignation letter.

Surveillance Commissioner Andrew Rennison:

“Let’s have a debate – if the public support it, then fine. If the public don’t support it, and we need to increase the regulation, then that’s what we need to do.”

Sounds like Transparency and Consent to me.