Bikini Detection algorithm raises the stakes in social media

Add bikinis to the list of objects recognizable by computer algorithms.

iPhone app that finds racy Facebook photos raises privacy worries (Los Angeles Times)

“This is a very touchy subject, of course,” Barto said. “Anything that’s readily available on Facebook, that’s what we can search. Those privacy tools on Facebook should be used to control the content that you want to be private.”

The app works in a similar way to the facial-recognition technology found in video chat programs and Facebook’s tag prompts. But instead of identifying faces, Badabing identifies the shape of a bikini. That means in addition to beach photos, the app may return pictures of a T-shirt with the outline of a swimsuit.
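For the curious, the core idea behind shape-based detection can be sketched in a few lines. This is a toy illustration only: Badabing's actual method isn't public, and real detectors use trained classifiers over image features rather than a single hand-made template. But the principle, scanning an image for regions that agree strongly with a target shape, looks roughly like this:

```python
# Toy sketch of shape-based detection via template matching.
# The binary "image" and triangular "template" are made up for
# illustration; a real detector works on pixel features and a
# trained model, not exact pattern agreement.

def match_score(patch, template):
    """Fraction of cells where patch and template agree."""
    agree = sum(p == t for row_p, row_t in zip(patch, template)
                for p, t in zip(row_p, row_t))
    total = len(template) * len(template[0])
    return agree / total

def find_shape(image, template, threshold=0.9):
    """Slide the template over the image; return (row, col) of strong matches."""
    th, tw = len(template), len(template[0])
    hits = []
    for r in range(len(image) - th + 1):
        for c in range(len(image[0]) - tw + 1):
            patch = [row[c:c + tw] for row in image[r:r + th]]
            if match_score(patch, template) >= threshold:
                hits.append((r, c))
    return hits

# A crude "shape" embedded in a larger binary image.
template = [
    [0, 1, 0],
    [1, 1, 1],
]
image = [
    [0, 0, 0, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
print(find_shape(image, template))  # → [(1, 1)]
```

This also shows why the app can be fooled by a T-shirt print: the detector only knows the shape, not what the shape is printed on.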

Object recognition is really starting to take off.

Related: Biometrics, object recognition and search

Benghazi: US uses face rec to confirm that suspect held in Tunisia was present at attack

US officials ID’d Libya attack suspect on surveillance video, sources say (FOX)

Ali Ani al Harzi and one other suspect were detained at an airport in Turkey in the days after the attack while travelling with false documents, and Harzi now has been identified as being present at the attack using the images obtained from the consulate compound video, Fox News’ sources say.

Harzi was transferred to Tunisian custody, but U.S. interrogators so far have not had access to him, much to the frustration of American authorities. Even so, U.S. intelligence agencies have confirmed through facial recognition technology that the Tunisian was present the night of the consulate attack.

More information on the NIST Biometric Conformance Test Software

Are your biometrics up to snuff? Free suite tests for compliance (GCN.com)

The BioCTS suite checks that the record of an iris image or other piece of biometric data being used has the correct data and in the order called for by the standard, so that it can be sent to and received correctly and filed accurately by any user, from the Homeland Security Department to state and local police departments. The conformance testing provides programmers, users and product purchasers with an increased level of confidence in product compliance and increases the probability of successful interoperability.

The tests do not ensure interoperability of different products, however; only that they adhere to common standards, Podio said. “Conformance increases the probability of interoperability, but cannot ensure it because of all the possible implementations that can be included” in a product. Each developer can implement different profiles from the standard, depending on how the product will be used.
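To make the idea concrete, here is a minimal sketch of what a conformance check does: verify that each field of a biometric record is present, correctly typed, and in the order the standard prescribes. This is not the actual BioCTS software, and the field names are illustrative, not the real layout of any biometric data-interchange standard.

```python
# Illustrative conformance check: required fields, in mandated order,
# with expected types. Field names here are made up for the sketch.

REQUIRED_FIELDS = [               # (name, expected type), in mandated order
    ("format_identifier", str),
    ("version", str),
    ("record_length", int),
    ("capture_device_id", int),
    ("image_data", bytes),
]

def check_conformance(record):
    """Return a list of violations; an empty list means conformant."""
    violations = []
    keys = list(record.keys())
    for i, (name, ftype) in enumerate(REQUIRED_FIELDS):
        if name not in record:
            violations.append(f"missing field: {name}")
        elif not isinstance(record[name], ftype):
            violations.append(f"wrong type for {name}")
        elif i < len(keys) and keys[i] != name:
            violations.append(f"field out of order: {name}")
    return violations

good = {"format_identifier": "IIR", "version": "020",
        "record_length": 4096, "capture_device_id": 7,
        "image_data": b"\x00" * 16}
print(check_conformance(good))    # → []

bad = dict(good)
del bad["image_data"]
print(check_conformance(bad))     # → ['missing field: image_data']
```

Note that, per Podio's point above, a record passing a check like this only raises the odds of interoperability; two conformant products can still implement different profiles of the same standard.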

More good analysis and links in the GCN article above.

Biometric system keeps excluded man from attending Boca Juniors-River Plate game

Argentina’s derby of derbies ends all-square (The Star – Malaysia) 

Meanwhile, security measures appeared to have worked efficiently after a renowned figure among Boca’s ‘barra brava’, or hooligan fringe, was picked up by a biometric identification system and refused entry to the venue.

Mauro Martin tried to get into the game but Interior Minister Florencio Randazzo said he had been caught in the net and was prevented from attending after his fingerprints were checked.

During the summer, Martin required hospital treatment for a gunshot wound suffered in a confrontation between rival Boca hardcore followers.

Here’s the scene yesterday at the ‘Bombonera’ in Buenos Aires. It’s obviously an incredible atmosphere.

Notice that the players seem to be deposited into the center of the field via a long protective tube.

Getting Banking Security Right in a Mobile World

Security as a Service (Michael Nuciforo at Finextra)

One of my pet hates with most mobile banking projects is how security is treated as an adjunct rather than a key scope item. Any product or marketing manager worth their salt knows the number one reason consumers don’t adopt mobile banking services is security concerns. The reason security is treated as a ‘black sheep’ is that it doesn’t deliver tangible customer satisfaction improvements. And even though customers expect it, they don’t often get excited about it. A change in mind-set is required. Security should be treated as a service. If you get it right, and promote it appropriately, it could be the key factor in your bank achieving above normal user adoption.

Good advice to banks follows.

India Round-up

Security equipment industry grew by 25% in last 3 yrs – Compared to 7% for the rest of the world. (moneycontrol.com)

India tries handing out cash to poor – Those waiting on the cash probably want somebody to try harder. (news24)

Jharkhand: Slow state to review ration card pact – When asked, the food and civil supplies minister admitted that at present no steps were being taken to introduce a biometric system in PDS supplies. (Yahoo)

PM gives Aadhaar awards in Rajasthan (Yahoo)

OPINION: An informed choice (on technology and economic growth) (Hindustan Times)

OPINION: Blundering on land & Aadhaar (Kashmir Times)

Biometric Chat on Iris Biometrics November 1

When: November 1, 2012 

11:00 am EDT, 8:00 am PDT, 16:00 BST, 17:00 CEST, 23:00 SGT, 0:00 JST 

Where: tweetchat.com/room/biometricchat (or Twitter hashtag #biometricchat)

What: Tweet chat on iris biometrics technology with Jeff Carter, Chief Strategy Officer of @EyeLockCorp

Topics: Differences between iris and retina biometric identification technologies, using iris recognition to identify the unconscious, public acceptance of iris biometrics compared to other biometric modalities, iris biometrics and mobile device user authentication, iris biometrics accuracy compared to other biometric modalities, and more!

More information at the M2SYS blog.

I always enjoy these. 

Tune in, dial up, surf over (or do whatever it is you do to navigate the interwebs) and join in the conversation.

Here’s some background on Jeff’s vision for iris biometrics.

UPDATE: A good time was had by all. In case you missed it and would like to see how it went, the Twitter Biometric Chat transcript on Iris biometrics is up at Storify.

Kenya Elections: Many reasons for worry

The whole sad saga dating to September of last year is here.

Today’s news doesn’t inspire optimism that Kenya can deploy a successful biometric voter registration system (which, without voter verification, is really only half of a biometric election system anyway) by March.

The Biometric Voter Registration Kits are late.

Anxiety is also increasing due to several other factors that are well covered in Fear grows over delays in voter registration at the Kenya Standard.

That fear has grown to the point where public officials are publicly beginning to wonder whether there isn’t some conspiracy afoot that aims to delay elections. Placed against the historical backdrop of Kenya’s electoral experience — only three presidents since 1964, and many hundreds killed following the last presidential elections in 2007 — it’s no wonder Kenyans are starting to worry.

Correctly deployed and well managed biometric voting technology can be extremely helpful in bringing rigor and transparency to electoral systems at a cost that less developed countries can afford. Through careful planning and wise investments in technology, countries can build an affordable and rigorous ID infrastructure that strengthens democracy, which in turn opens the door to other benefits.

The systems themselves are technically complex but there are plenty of organizations like SecurLinx that can supply the technical expertise to implement them. The technical complexities, however, make up only a fraction of the overall bureaucratic load of running a decent election.

The logistical and human resources challenges are far larger and more expensive to address than the technical challenges of biometric systems.

Like we always say… Biometrics & ID management: it’s about people.

UPDATE: 
Kenya: Justice Minister Eugene Says Treasury Was Poll ‘Saboteur’ (All Africa)

Yesterday Prime Minister Raila Odinga chaired a crisis meeting at his office to resolve the delay in the procurement of the Biometric Voter Registration kits with the IEBC top officials, Finance minister Njeru Githae, Justice minister Eugene Wamalwa, Lands minister James Orengo and Treasury PS Joseph Kinyua among others.

However President Kibaki missed the meeting for a second time. On Tuesday Wamalwa accused “some people” of attempting to sabotage preparations for the polls.

“There was an anxiety and finger pointing that had started creeping in. The culprits I had in mind were actually the Treasury,” said Wamalwa who had promised to name the saboteurs.

UID isn’t painless but neither is the status quo

India risks backlash hurrying through Aadhaar project

The pilot project in Beelaheri, a village of 2,000 people some 130 km (81 miles) southwest of Delhi, replaces kerosene subsidies with cash rebates and has been running since December. It has massively lowered demand for the subsidized fuel, which weighs on government finances.

But teething problems are immediately visible.

The headline’s a bit harsh but the piece is well worth reading in its entirety.

FTC Freestylin’ on Face Recognition

Federal Trade Commission Staff Report Recommends Best Practices for Companies That Use Facial Recognition Technologies


Mission of the Federal Trade Commission…
To prevent business practices that are anticompetitive or deceptive or unfair to consumers; to enhance informed consumer choice and public understanding of the competitive process; and to accomplish this without unduly burdening legitimate business activity.

In December of last year, the Federal Trade Commission (FTC) hosted a workshop – “Face Facts: A Forum on Facial Recognition Technology” to examine the use of facial recognition technology and related privacy and security concerns.

Monday, the FTC released two documents summing up the effort. The first is the Staff Report, a 21 page attempt to synthesize the views of the forum’s participants and FTC staff into an authoritative guide. The second is a dissent from the 4-1 vote in favor of releasing the staff report.

In my opinion, Best Practices for Common Uses of Facial Recognition Technologies falls a little short for a couple of reasons. First, of the staff report’s three cases, only one — the Facebook case — is actually a facial recognition application. Then in the other instances where the report deals with facial recognition proper, it does so in a wholly hypothetical way. This approach runs the risk of being seen by many as falling outside the ambit of the FTC’s mission.

I have selected passages from both documents mentioned above for examination because they lie at the heart of the whole exercise. They are a distillation of what the entire project was about and has concluded. The entire documents are available via links below for those who seek more information.

from the Staff report (pdf at FTC.gov)

To begin, staff recommends that companies using facial recognition technologies design their services with privacy in mind, that is, by implementing “privacy by design,” in a number of ways. First, companies should maintain reasonable data security protections for consumers’ images and the biometric information collected from those images to enable facial recognition (for example, unique measurements such as size of features or distance between the eyes or the ears). As the increasing public availability of identified images online has been a major factor in the increasing commercial viability of facial recognition technologies, companies that store such images should consider putting protections in place that would prevent unauthorized scraping which can lead to unintended secondary uses. Second, companies should establish and maintain appropriate retention and disposal practices for the consumer images and biometric data that they collect. For example, if a consumer creates an account on a website that allows her to virtually “try on” eyeglasses, uploads photos to that website, and then later deletes her account on the website, the photos are no longer necessary and should be discarded. Third, companies should consider the sensitivity of information when developing their facial recognition products and services. For instance, companies developing digital signs equipped with cameras using facial recognition technologies should consider carefully where to place such signs and avoid placing them in sensitive areas, such as bathrooms, locker rooms, health care facilities, or places where children congregate.

Staff also recommends several ways for companies using facial recognition technologies to provide consumers with simplified choices and increase the transparency of their practices. For example, companies using digital signs capable of demographic detection – which often look no different than digital signs that do not contain cameras – should provide clear notice to consumers that the technologies are in use, before consumers come into contact with the signs. Similarly, social networks using a facial recognition feature should provide users with a clear notice – outside of a privacy policy – about how the feature works, what data it collects, and how it will use the data. Social networks should also provide consumers with (1) an easy to find, meaningful choice not to have their biometric data collected and used for facial recognition; and (2) the ability to turn off the feature at any time and delete any biometric data previously collected from their tagged photos. Finally, there are at least two scenarios in which companies should obtain consumers’ affirmative express consent before collecting or using biometric data from facial images. First, they should obtain a consumer’s affirmative express consent before using a consumer’s image or any biometric data derived from that image in a materially different manner than they represented when they collected the data. Second, companies should not use facial recognition to identify anonymous images of a consumer to someone who could not otherwise identify him or her, without obtaining the consumer’s affirmative express consent. Consider the example of a mobile app that allows users to identify strangers in public places, such as on the street or in a bar. 
If such an app were to exist, a stranger could surreptitiously use the camera on his mobile phone to take a photo of an individual who is walking to work or meeting a friend for a drink and learn that individual’s identity – and possibly more information, such as her address – without the individual even being aware that her photo was taken. Given the significant privacy and safety risks that such an app would raise, only consumers who have affirmatively chosen to participate in such a system should be identified. The recommended best practices contained in this report are intended to provide guidance to commercial entities that are using or plan to use facial recognition technologies in their products and services. However, to the extent the recommended best practices go beyond existing legal requirements, they are not intended to serve as a template for law enforcement actions or regulations under laws currently enforced by the FTC. If companies consider the issues of privacy by design, meaningful choice, and transparency at this early stage, it will help ensure that this industry develops in a way that encourages companies to offer innovative new benefits to consumers and respect their privacy interests. [ed.: bold emphasis mine]

The first paragraph above is common sense. For example: “Companies should establish and maintain appropriate retention and disposal practices for the consumer images and biometric data that they collect.” Who could argue with that?

I believe many on all sides of the facial recognition issue will find the Face Facts forum findings disappointing and I think the second italicized paragraph above best encapsulates why. In it, the FTC staff report loses coherence.

Let’s examine it in detail.

1. The staff report doesn’t confine itself to facial recognition proper.

Staff also recommends several ways for companies using facial recognition technologies to provide consumers with simplified choices and increase the transparency of their practices. For example, companies using digital signs capable of demographic detection – which often look no different than digital signs that do not contain cameras – should provide clear notice to consumers that the technologies are in use, before consumers come into contact with the signs.

Demographic inference isn’t facial recognition, and nowhere does the FTC staff make a case that a computer guessing at gender, age or ethnicity has any privacy implications at all. And then, even if that case is made, the task of tying the activity back to the FTC’s mandate remains.

¿Qué?

The recommendation that someone “should provide clear notice to consumers that the technologies are in use, before consumers come into contact with the signs,” however reasonable it seems in theory, is odd in practice. The old microwave-and-pacemaker signs come to mind. But then where would an ad agency put those signs if they wanted to do advertising on, say, a city street? [Bonus: would it be appropriate to use language detection technology in those signs in order to display the warning message in a language the reader is judged more likely to understand?]

2. Next there’s a nameless “social network” — no points for guessing which [See: Consumer Reports: Facebook & Your Privacy and It’s not the tech, it’s the people: Senate Face Rec Hearings Edition] — that is hypothetically doing the exact same things a non-hypothetical social network actually did without much in the way of an FTC response.

Similarly, social networks using a facial recognition feature should provide users with a clear notice – outside of a privacy policy – about how the feature works, what data it collects, and how it will use the data. Social networks should also provide consumers with (1) an easy to find, meaningful choice not to have their biometric data collected and used for facial recognition; and (2) the ability to turn off the feature at any time and delete any biometric data previously collected from their tagged photos.

This is the closest the document ever gets to a concrete example of facial recognition technology even being in the neighborhood of an act the FTC exists to regulate and the staff of the FTC still doesn’t abandon the hypothetical for the real world.

3. Then there’s the warning that the FTC would take a dim view of two types of hypothetical facial recognition deployment each of which would require its own dedicated staff report in order to make a decent show of doing the topic justice.

Finally, there are at least two scenarios in which companies should obtain consumers’ affirmative express consent before collecting or using biometric data from facial images. First, they should obtain a consumer’s affirmative express consent before using a consumer’s image or any biometric data derived from that image in a materially different manner than they represented when they collected the data. 

This is far too general to be useful. The above would seem to preclude casinos from using facial databases of known or suspected cheaters, a proposition few would argue for.

Then there’s the question of what makes biometric data so special? Should the same standards apply to all personal data or just pictures of faces?

For the situation above to fall within the FTC’s mandate, a practice would have to be deemed “deceptive” or “unfair.” And if a practice is deceptive or unfair when a face is part of the data being shared, how does using the data in a substantially equal manner cease to be deceptive and unfair by omitting the face? The report is silent on these points.

Second, companies should not use facial recognition to identify anonymous images of a consumer to someone who could not otherwise identify him or her, without obtaining the consumer’s affirmative express consent. Consider the example of a mobile app that allows users to identify strangers in public places, such as on the street or in a bar. If such an app were to exist, a stranger could surreptitiously use the camera on his mobile phone to take a photo of an individual who is walking to work or meeting a friend for a drink and learn that individual’s identity – and possibly more information, such as her address – without the individual even being aware that her photo was taken. Given the significant privacy and safety risks that such an app would raise, only consumers who have affirmatively chosen to participate in such a system should be identified.

This hypothetical future app does exactly what anyone can, legally and today, pay a private detective to do. If the FTC isn’t taking action against PIs, it would be extremely helpful for the FTC to make clear to buyers and sellers of facial recognition technology the distinctions it sees between the two.

Then, towards the end of the excerpted text, perhaps sensing how far ahead of itself and of the FTC’s mission the staff has gotten, the report essentially says (in the bolded sentence), “Never mind. We aren’t formulating new policy here. We’re just freestylin’.”


However, to the extent the recommended best practices go beyond existing legal requirements, they are not intended to serve as a template for law enforcement actions or regulations under laws currently enforced by the FTC. If companies consider the issues of privacy by design, meaningful choice, and transparency at this early stage, it will help ensure that this industry develops in a way that encourages companies to offer innovative new benefits to consumers and respect their privacy interests. [ed.: bold emphasis mine]

With the possible exception of the “social network” example, pretty much everything in the document goes beyond existing legal requirements enforced by the FTC. So what’s going on here?

My hunch is that someone at the FTC became concerned over a “social network” terms of service issue and, rather than deal with it as a narrow terms of use issue — one seemingly right in the FTC’s wheelhouse under the “deceptive or unfair” part of its mission — decided instead that it was a technology issue and that it was both possible and desirable to address the far bigger issues of facial recognition technology, ID and society in a coherent way, forgetting that doing so requires a novel interpretation of the FTC’s mission. Once that decision was made, the best practices document, flawed though it is, was about the best that could be hoped for… which brings us to the dissent.

The decision to release the Face Facts staff report wasn’t unanimous. Commissioner Thomas Rosch thought releasing the report at all was a mistake. Several paragraphs of the dissent follow below.

The last paragraph quoted below is particularly convincing.

then the lone dissent… (pdf at FTC.gov)

The Staff Report on Facial Recognition Technology does not – at least to my satisfaction – provide a description of such “substantial injury.” Although the Commission’s Policy Statement on Unfairness states that “safety risks” may support a finding of unfairness,[3] there is nothing in the Staff Report that indicates that facial recognition technology is so advanced as to cause safety risks that amount to tangible injury. To the extent that Staff identifies misuses of facial recognition technology, the consumer protection “deception” prong of Section 5 – which embraces both misrepresentations and deceptive omissions – will be a more than adequate basis upon which to bring law enforcement actions.

Second, along similar lines, I disagree with the adoption of “best practices” on the ground that facial recognition may be misused. There is nothing to establish that this misconduct has occurred or even that it is likely to occur in the near future. It is at least premature for anyone, much less the Commission, to suggest to businesses that they should adopt as “best practices” safeguards that may be costly and inefficient against misconduct that may never occur.

Third, I disagree with the notion that companies should be required to “provide consumers with choices” whenever facial recognition is used and is “not consistent with the context of a transaction or a consumer’s relationship with a business.”[4] As I noted when the Commission used the same ill-defined language in its March 2012 Privacy Report, that would import an “opt-in” requirement in a broad swath of contexts.[5] In addition, as I have also pointed out before, it is difficult, if not impossible, to reliably determine “consumers’ expectations” in any particular circumstance.

In summary, I do not believe that such far-reaching conclusions and recommendations can be justified at this time. There is no support at all in the Staff Report for them, much less the kind of rigorous cost-benefit analysis that should be conducted before the Commission embraces such recommendations. Nor can they be justified on the ground that technological change will occur so rapidly with respect to facial recognition technology that the Commission cannot adequately keep up with it when, and if, a consumer’s data security is compromised or facial recognition technology is used to build a consumer profile. On the contrary, the Commission has shown that it can and will act promptly to protect consumers when that occurs.

To summarize, Rosch points out that the FTC staff report:

  • Exceeds the FTC’s regulatory mandate
  • Makes no allegation of consumer harm
  • Is so overly broad as to be unworkable
  • Provides no support for the conclusions it draws

The FTC would perhaps have been better served had more Commissioners taken Rosch to heart. As it happens, the FTC staff report overreaches, underdelivers, and deviates from the organization’s stated mission, and the results aren’t pretty.

NOTE: This post has been modified slightly from the original version to add clarity, by cleaning up grammar, spelling or typographical errors.

…and a couple from India

India tries handing out cash to help teeming poor (Asia One)

“On the basis of Aadhaar, we can ensure that the benefit of schemes reach genuine beneficiaries and that there is no mediator,” Prime Minister Manmohan Singh said last weekend.

India subsidises everything from fertilizer and food to kerosene so cutting waste is crucial to the government’s drive to rein in its budget deficit.

Aadhaar will now be used as identity proof, for bank KYC (TMCNet)

So, when a bank asks for your ID proof to open an account, all you will have to do is tap on a device that reads fingerprints and the information will be transmitted electronically in an encrypted form. The front desk will receive a message saying that the information has been received and has matched with the data available, explained an official.
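The flow the official describes can be sketched roughly as follows. Everything here is illustrative: the ID numbers and templates are made up, the toy XOR cipher stands in for the real PKI encryption, and the exact-match comparison stands in for real fuzzy fingerprint matching. The shape of the exchange is what matters: the bank never holds the biometric data, only a yes/no from the authority.

```python
# Simplified sketch of encrypted fingerprint KYC: capture a template,
# send it encrypted to the ID authority, get back only a match result.

import hashlib

def encrypt(data: bytes, key: bytes) -> bytes:
    """Toy stream cipher (XOR with a key-derived stream); illustration only."""
    stream = hashlib.sha256(key).digest()
    return bytes(b ^ stream[i % len(stream)] for i, b in enumerate(data))

decrypt = encrypt  # XOR with the same stream is its own inverse

class IDAuthority:
    def __init__(self, key: bytes):
        self.key = key
        self.enrolled = {}                      # id_number -> template

    def enroll(self, id_number: str, template: bytes):
        self.enrolled[id_number] = template

    def verify(self, id_number: str, ciphertext: bytes) -> str:
        template = decrypt(ciphertext, self.key)
        if self.enrolled.get(id_number) == template:
            return "match: identity confirmed"
        return "no match"

# Bank front desk: the customer taps the reader, and the template
# leaves the branch only in encrypted form.
authority = IDAuthority(key=b"shared-secret")
authority.enroll("1234-5678-9012", b"fingerprint-template")
msg = authority.verify("1234-5678-9012",
                       encrypt(b"fingerprint-template", b"shared-secret"))
print(msg)  # → match: identity confirmed
```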

First word about the FTC report

I’ll have much more to say on the topic, perhaps later today, but the first clutch of analysis of the FTC’s findings following the Face Facts workshop is starting to come out.

The best two examples I have seen so far are:
FTC Issues Privacy Guidelines for Facial Recognition Technology (eWeek)
FTC Issues Guidelines for Facial Recognition (Multichannel News)

Brian Prince at eWeek gets, I think, at two very important aspects of the FTC’s efforts: the degree to which Facebook is the elephant in the room; and the dissenting voice of Commissioner Thomas Rosch, who thought releasing the report at all was a mistake.

John Eggerton at Multichannel News gives a down-the-middle summary of each of the two points of view (pro and con). Then he really gives the dissent the attention it deserves. The quotes from Daniel Castro, senior analyst at the Information Technology & Innovation Foundation, that close the article are highly appropriate.