Archive for the ‘Biometric data’ Category

12 Oct 2010

Tracking recent research in biometrics

One research area that the OPC tracks is biometrics – using physical features and behaviours to automatically identify people. Although biometric technologies can be very useful for establishing identities, they can also raise important privacy concerns. Biometric technology is constantly changing, and the ability of systems to accurately recognize people keeps improving. OPC staff recently attended the International Conference on Biometrics: Theory, Applications and Systems (BTAS) in Washington, D.C., where they heard about the latest research results.

An area of particular interest to the OPC is private biometrics. This refers to methods that transform or protect the biometric information so that it is replaceable, useless when stolen, and not linkable across applications. One well-known method for accomplishing this is biometric encryption, where biometric information is combined with cryptographic keys, but other methods involve geometric and mathematical transformations of the biometric data.
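The key-binding idea behind biometric encryption can be sketched in a few lines of Python. The toy "fuzzy commitment" below uses a simple repetition code to absorb the natural variability of biometric samples; real systems use much stronger error-correcting codes (such as BCH) and bit strings derived from actual biometric features, and all names here are illustrative only.

```python
def encode_repetition(bits, n=3):
    # Repeat each key bit n times to form a simple error-correcting codeword.
    return [b for bit in bits for b in [bit] * n]

def decode_repetition(bits, n=3):
    # Majority vote within each group of n bits corrects isolated errors.
    return [1 if sum(bits[i:i + n]) > n // 2 else 0
            for i in range(0, len(bits), n)]

def enroll(biometric_bits, key_bits, n=3):
    # Store codeword XOR biometric: the helper data reveals neither
    # the key nor the biometric on its own.
    codeword = encode_repetition(key_bits, n)
    return [c ^ b for c, b in zip(codeword, biometric_bits)]

def authenticate(helper_data, fresh_biometric_bits, n=3):
    # A fresh sample close to the enrolled one yields a slightly noisy
    # codeword; decoding recovers the original key.
    noisy_codeword = [h ^ b for h, b in zip(helper_data, fresh_biometric_bits)]
    return decode_repetition(noisy_codeword, n)
```

Because the stored helper data is the XOR of a random codeword with the biometric, a compromised template can be replaced simply by re-enrolling with a fresh key – the replaceability property described above.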

A research group from France presented a paper describing a method for combining biometric information with cryptographic systems. They used a shuffling scheme to transform the biometric information, with a different shuffle for each user and application. They also developed a protocol to share crypto-biometric keys between clients and servers and a method to establish session keys. They found that the system was quite successful when applied to face recognition, although, because biometric samples are naturally variable, error-correcting codes had to be applied at authentication time. The conclusion was that information protection methods can be combined with key management protocols to build effective user verification with privacy protection.
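The user- and application-specific shuffle idea can be illustrated with a short Python sketch. This is not the French group's actual construction – their scheme operates on binary biometric codes with dedicated key management – just a minimal example of how a secret-seeded permutation makes templates replaceable and unlinkable across applications; all names are hypothetical.

```python
import hashlib
import random

def shuffle_template(features, user_key, app_id):
    # Derive a per-user, per-application seed; a different seed for each
    # application makes stored templates unlinkable across systems.
    seed = hashlib.sha256(f"{user_key}:{app_id}".encode()).hexdigest()
    order = list(range(len(features)))
    random.Random(seed).shuffle(order)
    # Store the shuffled template, never the raw feature vector.
    return [features[i] for i in order]
```

If the user's key is compromised, issuing a new key produces an entirely new template from the same biometric, which is what makes the scheme revocable.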

One of the limitations of some private biometric methods is that, at some point during processing, the original biometric information may have to be recovered in order to perform matching, compromising the privacy protections. An interesting topic at this conference was homomorphic encryption, a method that allows biometric matching to be done while the data remains encrypted. For example, if a is the set of fingerprint features (minutiae) offered at authentication time and b is the set of features for a fingerprint stored in a database, the difference between the two feature sets can be calculated on the encrypted sets, without ever revealing the original information. A group of researchers from Italy and France presented a paper that evaluated homomorphic encryption in a fingerprint recognition system. Their proof-of-concept system combined fingerprints and homomorphic encryption and produced a fair level of matching performance. Although more work is needed, homomorphic encryption methods are worthy of further consideration for protecting the privacy of biometric information.
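The researchers' actual system is considerably more involved, but the underlying principle – arithmetic on data that stays encrypted – can be shown with a toy additively homomorphic (Paillier-style) scheme. The primes below are far too small for real use, and the function names are ours; this is a sketch of the idea, not the paper's implementation.

```python
import math
import secrets

def keygen(p=293, q=433):
    # Toy primes for illustration only; real deployments use 2048-bit moduli.
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)  # valid because we fix the generator g = n + 1
    return (n,), (n, lam, mu)

def encrypt(pub, m):
    (n,) = pub
    n2 = n * n
    r = secrets.randbelow(n - 1) + 1
    while math.gcd(r, n) != 1:  # r must be invertible mod n
        r = secrets.randbelow(n - 1) + 1
    # With g = n + 1, g^m mod n^2 simplifies to 1 + m*n.
    return (1 + m * n) * pow(r, n, n2) % n2

def decrypt(priv, c):
    n, lam, mu = priv
    n2 = n * n
    return ((pow(c, lam, n2) - 1) // n) * mu % n

def add_encrypted(pub, c1, c2):
    # Multiplying ciphertexts adds the underlying plaintexts, so a server
    # can accumulate feature differences without ever decrypting them.
    (n,) = pub
    return c1 * c2 % (n * n)
```

A matching server holding only encrypted feature values can combine them ciphertext-by-ciphertext; only the holder of the private key ever sees a decrypted result.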

In addition to privacy protection research, the OPC is also tracking systems that can be very privacy invasive. At this conference there was a strong emphasis on face recognition for surveillance systems. For example, General Electric and Lockheed Martin demonstrated a face-at-a-distance system that showed impressive performance. The system uses both a wide-field-of-view camera and a pan-tilt-zoom camera to locate people in a scene, find a face, zoom in, and then perform recognition. It can detect faces at 25-50 meters from the cameras and perform successful face recognition at 15-20 meters. It can also track multiple people simultaneously and record 10 facial images per second. This performance, combined with the recognition accuracy that is now possible, means that covert face recognition at a distance is feasible on a large scale.

The privacy implications of biometric systems can vary a great deal depending on the biological characteristics that are used and the way the systems are designed. Biometrics can greatly improve identification services without unduly affecting privacy, or they can be privacy invasive. The OPC will continue to track research in this area and work with organizations to explore all the technical options.

26 Aug 2010

Think before you spit

The decision whether to undergo genetic testing is often highly personal and is usually prompted by a serious medical concern such as a family history of an inherited disease. Traditionally, such testing has been done in a medical setting by health care professionals, including genetic counsellors, who explain the science and ethics behind testing and help patients interpret the results.

Direct to consumer (DTC) genetic testing allows consumers, with as little effort as mailing a biological sample like saliva, to have their DNA analyzed by companies that promise to tell them if they are at risk for a particular disease.

Proponents suggest that DTC services increase access to genetic testing as well as confidentiality of the results, which can be kept out of an official health record. On the down side, the reliability and significance of the results may not match companies’ claims. Privacy-wise, giving extremely sensitive personal information, such as a DNA sample, to companies that to date are largely unregulated carries a myriad of risks. In a health care setting, confidentiality of personal information and security of samples are subject to strict controls. On the Internet, it’s another story. How do companies safeguard the sample and test results? Is the information disclosed to any third parties? Some companies make “de-identified” information available to third parties for research purposes – in which case, how reliable is the de-identification? And what happens to personal information if the company is sold or folds?

The debate about DTC genetic testing heated up recently as the U.S. Food and Drug Administration focussed its attention on the increasing availability of such services and whether they need to be regulated. In the U.K., the Human Genetics Commission just published voluntary guidelines for companies selling the tests, including guidance on data protection as well as consent, stating: “Informed consent can only be provided when a consumer has received sufficient relevant information about the genetic test to enable them to understand the risks, benefits, limitations and implications (including the implications for purchasing insurance) of the genetic tests.” In the interests of informed consent, the U.S. Federal Trade Commission advises consumers to check the privacy policies of online companies to see how they use personal information and whether they share it with marketers.

Here in Canada, in 2008, the Canadian Medical Association’s (CMA) General Council passed a resolution calling for the CMA to develop policy to advise on the development of a national system to oversee, organize and access genetic testing in Canada. In May 2010, the CMA proposed a national Regulatory Framework for Direct-to-Consumer Clinical Genetic Tests as an advocacy tool to highlight the issues these tests raise.

As well, in a recent study funded by the OPC, researchers at the University of Alberta’s Health Law Institute analysed the privacy policies of 32 DTC genetic testing companies against the fair information principles that underpin Canada’s private sector privacy law, PIPEDA. Of the 32 company websites studied, fewer than half had privacy policies that addressed how biological samples and genetic test results are handled.

The report concludes with a list of privacy-related questions that consumers should consider before buying genetic tests over the Internet. “Consumers who seek answers to the questions – through careful review of company privacy policies and direct contact with companies – will be able to make a more informed choice about sending their personal information and genetic samples to a company.”

Admittedly, satisfying your curiosity about what health challenges may await you is very tempting. However, consumers should be aware that they may be getting more – and less – than they bargained for.

4 Mar 2009

Trust me…it’s bleeding.

So says a new report from Dartmouth College telling us that in the US “data hemorrhages” are coming from all over the health sector including hospitals, physicians, laboratories, as well as outsourced service providers.

For example, the researchers found a 1,718-page document from a medical testing laboratory containing patient Social Security numbers, insurance information, and treatment codes for thousands of patients exposed on a P2P network, as well as two spreadsheet databases from a hospital system detailing highly sensitive personal information on over 20,000 patients, including codes revealing their diagnoses.

Among the many troubling issues raised in this report, what strikes us is that a source of the problem is not necessarily a scheming employee intent on medical identity fraud but rather inadvertent disclosures on internet-based file sharing networks.  Stories like these are just one more reason for patients to be worried about the privacy of their personal health information.  And with the new funds flowing in support of electronic health records development here in Canada, there needs to be some sober second thought on how the health care sector proceeds to maintain patient trust.

The Canadian Medical Association reported on this question at a health conference in January 2009. They said their public opinion survey results over the last ten years consistently show 11% of respondents holding back information from their physicians because of concerns about their privacy. The Alberta Medical Association expressed similar concerns in its comments in Committee on Bill 52 in that province: “If patients don’t believe we can protect their privacy and that we may be forced to share the information that they confide in us, they will stop telling us everything we need to know to make the right diagnosis and provide the right care.”

The rush toward electronic health records may well cause more people to feel concern and anxiety about the privacy of their health information, so it will be important to keep these views in mind over the coming years.

Research we co-funded through EKOS in 2007 found that 45% of respondents worried that their information could be accessed for malicious or mischievous reasons, 37% were worried that privacy and security procedures would not be followed by those with access to their records, and 55% wanted the ability to mask or hide sensitive information in their file from some users who would be authorized to have access to their health records.

We believe that a patient’s ability to exert some control over who gets to see this most sensitive, personal information seems crucial to preserving patient trust in the health care system.  The last thing we need is more patients withholding information from their health providers because they don’t trust their privacy will be protected and because they continue to hear about privacy breaches involving medical information.

What’s needed is respect for patient wishes, patient control of their personal health information, strong legislation to protect patient privacy as well as transparency and accountability to patients.  And it goes without saying that organizations need to protect against the privacy breaches, such as exposure on P2P networks, that undermine patient trust in the whole system.

(Thanks to SC Magazine for reporting on this research at Dartmouth College.)

29 Apr 2008

“Wacky” and proud of it!

Last week, Al Kamen of the Washington Post published an ironic article lightly criticizing Homeland Security Secretary Michael Chertoff for his statement that fingerprints aren’t personal information.

Any thoughts?

18 Apr 2008

Our Top Ten list of Privacy Act fixes

The Privacy Act, the federal privacy law requiring federal government bodies to respect individual privacy rights, hasn’t been substantially updated since 1982 – the same year the Commodore 64 was released and we stopped calling July 1 Dominion Day. What’s interesting about the fixes we propose is that they could be implemented immediately and relatively easily – and the benefit to Canadians would be a privacy law that is modern, responsive and efficient.

As readers of this blog will know, we are quite fond of the Top Ten list. So today, we present you with our list of the Top Ten fixes for the Privacy Act:

10. Parliament could create a legislative requirement for government departments to show the need for collecting personal information.

9. The role of the Federal Court could be broadened to review all grounds under the Privacy Act, not just denial of access.

8. Parliament could enshrine into law the obligation of Deputy Heads to carry out Privacy Impact Assessments prior to implementing new programs and policies.

7. The Act could be amended to provide the Privacy Commissioner with a clear public education mandate. PIPEDA contains such a mandate for private sector privacy matters. Why shouldn’t the Privacy Act contain one for public sector matters?

6. The Act could provide the Privacy Commissioner with greater flexibility to report publicly on the government’s privacy management practices. As it now stands, we are limited to reporting by way of annual and special reports only.

5. The Act could grant the Commissioner greater discretion at the front-end to refuse complaints or discontinue complaints if the investigation would serve no useful purpose or is not in the public interest. This would allow the OPC to focus our investigative resources on those privacy issues that are of broader systemic interest.

4. Parliament could amend the Act and align it with PIPEDA by eliminating the restriction that the Privacy Act applies to recorded information only. At the moment, personal information contained in DNA and other biological samples is not explicitly covered. (But fingerprints are, in case you thought otherwise.)

3. Parliamentarians could strengthen the annual reporting requirements of government departments and agencies under section 72 of the Act, by requiring these institutions to report to Parliament on a broader spectrum of privacy-related activities.

2. The Act could be amended to provide for regular five-year reviews of the legislation, as is the case with PIPEDA.

1. Finally, the Act currently does not impose a duty on Canadian government institutions to identify the precise use for which personal information is being disclosed abroad. An amendment to the Act could require the Canadian government to not only identify the precise use for the transfer of personal information to foreign states, but ensure that adequate measures are taken to maintain the confidentiality of shared information.

Read this for more information.

29 Feb 2008

Watching you watching TV

We know you didn’t watch the Oscars last weekend. Neither did we. And according to the latest figures from the Nielsen Company, neither did many viewers. Nielsen has been tracking the habits of TV viewers for decades now, and their research figures prominently in the business decisions made by television and advertising industry heads.

Nielsen is now looking to expand its influence, hoping to eavesdrop on other activities – like web surfing, cell phone usage, and purchasing habits. They concede it’s a tough sell though – while many Nielsen families view their influence as tastemakers as a “point of pride”, they bristle at the idea of having many of their day-to-day activities tracked.

Still, Nielsen plans to go ahead with a number of pilot projects aimed at providing their clients with ever-more detailed information about their customers – from how often people look at TV screens in malls and stores to how much perspiration people produce when watching TV at home.

12 Feb 2008

Nexus: Save time, but at what cost?

Last Saturday, the French-language Montreal newspaper La Presse published an article about the Nexus program. The article, written by Jean-Philippe Brunet of Ogilvy Renault, highlights the advantages of the program – in particular, its capacity to save travellers some time.


The program is an agreement between Canada and the United States to share voluntarily given personal information to produce an identity card that makes the process of crossing the border less of a hassle.

To participate, you simply fill out a form that asks for all of your addresses and your employment history for the last five years, pay a $50 administration fee, and provide copies of your passport, your driver’s licence (front and back) and your birth certificate. Once the form is filled out and signed, it is evaluated by both countries, which decide whether you make it to the next (heavy duty) step – an interview where you will be fingerprinted and have your iris scanned. Pass this test and you’ll receive your Nexus card, which will enable you to “go home earlier and spend time with your family or catch up on your sleep”.

The Issue

In Canada, your personal information is yours, and the government has to ask your permission to share that information with a third party. Not so in the U.S. In fact, the minute you sign that form, you are authorizing the U.S. government, under section 215 of the PATRIOT Act, to obtain any document or personal information on terrorism-related grounds without your consent or knowledge, and to share that information with whomever it chooses. (The Information and Privacy Commissioner for British Columbia has published a report on Privacy and the PATRIOT Act as well.)

It’s for you to decide: catch up on your sleep, or have peace of mind knowing your personal information is safe and not shared with anybody.

15 Jan 2008

Hands across the ocean

An article out of the UK this morning reports that the U.S. FBI is considering developing an international database, in collaboration with the U.K., Australia, New Zealand and Canada, that could make personal information – biometric data such as iris scans, palm prints and fingerprints – of the partner countries’ citizens instantly available to police forces in the other partner countries. The U.S.-led program, called “Server in the Sky”, would aid forces in tracking down major criminals and suspected terrorists.

The proposal to link databases is ambitious: each proposed partner country has different standards for the collection, storage and use of biometric information.

Governments already share information across borders, but under strict controls designed to protect the rights, including the right to privacy, of innocent individuals. While international participation in the Server in the Sky program looks to be in its very early days, it will be interesting to see who participates, and how. In terms of Canadian participation, our citizens rightfully expect that their personal information remains safeguarded and understandably, could be reluctant to see that information freely shared with two countries that were ranked near the bottom of Privacy International’s ratings of privacy protection around the world.