
IT Security

Technology and concept — privacy and authentication

There has been increasing concern over privacy within Internet businesses. Juliet Hoskins, Editor of EEMA Briefing, published by the European Electronic Messaging Association (EEMA), explores this issue and explains how it differs from anonymity.

Privacy and anonymity

The concept of privacy is at the heart of Internet business, particularly in the B2C domain, where new users, already uncertain about the security of their transactions and the possibility of Internet fraud, are now concerned that information they pass to third parties may be used for purposes other than those originally intended. However, there is a common misconception that privacy means the same as anonymity. It does not. It is true that under certain circumstances we may wish for anonymity for ourselves, but we almost always mistrust it in others. Stuart Baker of Steptoe & Johnson cites several examples: "A signed love letter is flattering; an anonymous love letter is creepy. Respectable newspapers rightfully refuse to publish unsigned letters to the editor. And none of us would want to walk in a city where all the pedestrians were masked — or drive in a city where none of the cars had license plates." The concept of privacy is rather to limit your personal information to those who need to see it, and to be able to trust that your information will not be passed to unscrupulous third parties. The debate therefore centres on whether to preserve anonymity, or to implement strong authentication so that the information trail can be properly controlled.

Privacy and authentication

The argument is similar to the debate over encryption. Do you try to limit the deployment of strong encryption in case it gets into the wrong hands, or do you regulate against the minority of people who would abuse it? Early in the debate, Baker thought that there was some good news about government policy: strong authentication was more important for secure networks than strong encryption — and there weren’t any governments trying to stop strong authentication. Nowadays he admits his opinions have changed: many governments appear to be taking steps to do just that. And attacks on authentication technologies are a worrying trend.

For example, smart cards are increasingly being marketed as authentication and sign-on devices. Although they have not been targeted directly, the attacks on other authentication technologies do not augur well. One example is Intel's Pentium III processor, which contains a processor serial number unique to that chip. The processor can be set to reveal its serial number in response to an inquiry from the network, a capability that would add real security to networked transactions. Similarly, Microsoft incorporated into its software a method of identifying each document with a globally unique identifier (GUID) that incorporates information about the machine on which the document was produced. The GUID makes it difficult to generate an anonymous document, at least as far as the network is concerned. This feature, too, has security value, and was used to track down the author of the Melissa virus.
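To make the idea of a machine-linked identifier concrete, here is a minimal Python sketch. It is not Intel's or Microsoft's actual implementation: the stamp_document helper is invented for illustration, and it simply uses the standard library's uuid1() identifiers, which embed the host's network-card (MAC) address and a timestamp. That gives the property described above: the identifier alone is enough to tie a document back to the machine that produced it.

  import uuid

  def stamp_document(path: str) -> uuid.UUID:
      """Attach a machine-linked identifier to a document (illustration only)."""
      guid = uuid.uuid1()  # built from a timestamp plus this machine's MAC address
      print(f"{path} stamped with GUID {guid}")
      print(f"MAC address recoverable from that GUID: {guid.node:012x}")
      return guid

  if __name__ == "__main__":
      stamp_document("report.doc")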

Privacy groups attacked the new technologies as soon as they were announced, and governments quickly joined the fray. In the US, the Federal Trade Commission and state attorneys general took a relatively restrained view, refraining from suing Intel and Microsoft. In Europe it was a different story: while Europe claims not to regulate technology, several government agencies conveyed a simple message: "disable those technologies or face unpleasant government sanctions."

Double talk

As Baker states: "The privacy community’s attack on these technologies offers a rich lode for lovers of irony. In the encryption debate, privacy advocates led the fight against the FBI’s effort to restrict encryption technology, claiming that technologists, not government, should determine what technologies are deployed, particularly if they provided valuable security for responsible users. The FBI was restricting technology simply because it might be misused by a minority. Surely it was better to regulate the misuse than to deny everyone better security."

When it came to authentication, though, these arguments were reversed. Privacy advocates admitted that the serial number and GUID might be useful in making network users more accountable and secure, but argued that they could be misused by unscrupulous Web merchants to track users through cyberspace. Was the answer to regulate unscrupulous Web merchants? It was not: privacy advocates would only be satisfied when these capabilities were removed entirely from the hardware and software. They wanted to prevent the deployment of authentication in order to preserve anonymity.

But if we do that, we end up trusting important information or transactions to networks full of anonymous, unaccountable users. Data is not stored on networks so that it can remain anonymous. When we talk about privacy, we mean that we want to share it with some people and not with others. For example, when our personal financial information is available on a network, we certainly want that information to be protected by strong authentication so that only we — and authorised officials — can access it. But if there’s no way to tell who’s using the network, who’s accessing our data, then we can’t tell whether or not our privacy expectations have been met. The campaign to preserve a kind of mandatory anonymity risks sacrificing many important forms of privacy.

Trust

Data protection is essential for electronic commerce: it is the measure of trust that makes users feel protected. And the problem of privacy is not merely national. It is increasingly apparent that international law has a long way to go to catch up with the pace of technological change; at the moment we are still trying to map paper-based legislation onto the Internet world. The protection of data and people's fundamental right to privacy is now a key debate, and one that crosses the Atlantic. The position in the US has been that the issue can be resolved with a self-regulatory approach; that is not the view of the European Union, where privacy and data protection are a matter of legislation.

The EU and the US

This divergence has caused severe problems for anyone wishing to transmit data from Europe to America. As Dr Andreas Mitrakas of GlobalSign NV explains: "The EU Directive 95/46 on the protection of individuals with regard to the processing of personal data and on the free movement of such data provides active protection to users and prohibits data transmission to countries that do not offer the same level of protection as the EU. Data collected in Europe cannot be transmitted to the US for further storage or processing. Talks under way to negotiate a solution have so far succeeded in reaching a settlement on the issues in question, but have stalled on the question of how the principles are to be enforced. In Europe, a nationally based Data Protection Authority ensures the enforcement of the principles of the directive at a national level."

He cites the recent case of a software company surreptitiously intercepting user data: "Allegedly, it was discovered that with the assistance of a packet sniffer, a popular plug-in application transmitted a unique identifying number — a GUID — along with the playlist of the currently inserted CD to the producer of a software program. It was feared that this information was later collected for marketing purposes without the users’ knowledge."
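The behaviour Mitrakas describes is easy to picture in code. The sketch below is purely illustrative: the endpoint (stats.example-vendor.com) and parameter names are invented rather than taken from any real product's protocol. The point is that the machine identifier and the playlist travel as an ordinary clear-text web request, which is exactly what a packet sniffer on the wire would reveal.

  from urllib.parse import urlencode

  def report_playback(guid: str, playlist: list) -> str:
      """Build the kind of 'phone home' request described above (hypothetical endpoint)."""
      query = urlencode({"guid": guid, "tracks": "|".join(playlist)})
      url = "http://stats.example-vendor.com/report?" + query
      # urllib.request.urlopen(url) would perform the actual transmission;
      # omitted here because the host is fictitious.
      return url

  if __name__ == "__main__":
      print(report_playback("0123-4567-89ab-machine-guid",
                            ["Track 01", "Track 02", "Track 03"]))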

Digital certificates

An issue of substance may also arise with regard to the personal data of applicants for digital certificates. To date, many of these services are offered by European as well as overseas providers, known as certification authorities (CAs), through the World Wide Web (WWW). According to the EU directive, personal data collected in Europe should also remain in Europe. This, however, may not necessarily be the case with providers of security services that make their products available through the WWW.

Eric Arnum, EEMA Briefing's US Editor, cites another case of personal data being passed to third parties: "A well-known advertising banner company briefly took its stock off the market earlier this year, after revealing that the Federal Trade Commission and two state governments were investigating the company's data collection practices. This company controlled the banner ads on some 11,000 Web sites worldwide. When it serves up a banner, it also hands out a 'cookie', a small piece of unique code added to a cookie.txt file, whose subsequent detection signals to the company that it has a return visitor. The problem is that the company could tell which of those 11,000 sites a Web browser had visited. People who don't periodically erase their cookie.txt file are providing a detailed series of data points on how often they visit all those sites. What a profile it must be!

"Where the advertising banner company crossed the line, however, was when it acquired a company to help it launch a personalisation service. The company would collect names, ages, incomes, education, home location, number and ages of children etc. All this information would be collected by any of the 1,500 clients of the advertising banner company who might ask their consumers to fill out surveys or sweepstake entries, or who might respond to an after-sales survey.

"What customers might not have known was that they would effectively be allowing this company to attach a name and an income to a cookie. They could tell which rich suburban men bought online porno, and which kids bought the latest ‘N Sync album, and where their mothers work. A pair of eyes could follow you around, look where you’ve been for the past few months, and know your habits."

The four principles

In the US, data protection is largely based on self-regulatory initiatives, which have nevertheless failed to enforce their own rules against companies that clearly violate their guidelines. In May, following the opening of hearings on Capitol Hill, the Federal Trade Commission said that industry self-regulation was not enough. In a survey of randomly selected Web sites, only 20 percent adhered to the four principles of notice, consent, access and security:

  • Notice: companies must provide clear and conspicuous notice about what information will be collected, how that information may be used and to whom they will sell or disclose the information.
  • Consent: consumers must 'opt in' before their personally identifiable information is collected, used or disclosed, and must be able to opt out when non-personally identifiable information is collected.
  • Access: upon request by a user, companies must provide reasonable access to personally identifiable data and an opportunity to correct it.
  • Security: companies must protect the security and confidentiality of information. Notice of breaches in security must also be provided.
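As an illustration only (the field names below are invented, and real compliance is a legal rather than a programming question), the four principles can be read as a simple checklist against a site's stated data-collection practices:

  from dataclasses import dataclass

  @dataclass
  class PrivacyPractices:
      notice_posted: bool               # notice: what is collected, how it is used, who gets it
      opt_in_for_pii: bool              # consent: personal data collected only after an opt-in
      access_and_correction: bool       # access: users can see and correct their data
      security_and_breach_notice: bool  # security: data protected, breaches disclosed

  def meets_four_principles(p: PrivacyPractices) -> bool:
      """True only if notice, consent, access and security are all satisfied."""
      return all([p.notice_posted, p.opt_in_for_pii,
                  p.access_and_correction, p.security_and_breach_notice])

  if __name__ == "__main__":
      site = PrivacyPractices(True, False, True, True)
      print("Meets all four principles:", meets_four_principles(site))  # False: no opt-in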

Despite the fact that the industry in the US has failed to regulate itself in line with the standards set by the FTC, there is still no clear resolution to the problem in sight.

EEMA — The European Forum for Electronic Business

EEMA is a non-profit organisation, formed in 1987. Its aim is to bring together and improve the communications between all participants in Europe who wish to trade and communicate electronically, and to address industry issues on behalf of its members through the relevant international and governmental administrations.

Through its various interest groups (Unified Messaging, E-Commerce, User Group, Knowledge Management), publications and conferences, EEMA has been instrumental in co-ordinating and moving European business towards electronic trading. In 1998, EEMA launched the European Certification Authority Forum (ECAF) whose main objective is to steer the European digital certification market to enable European businesses to compete effectively in the global online marketplace.

There are currently more than 250 European-level member companies in over 30 countries.

For information on EEMA visit
http://www.eema.org/

For further information, please contact:

Catherina Rolinson, EEMA
Tel: +44 1386 793028
E-mail: catherina.rolinson@eema.org
