A reason why communities in general and e-communities in particular are difficult to define is that they are socio-cultural phenomena that undergo continuous change. No community is static, and new communities may exhibit features not seen in earlier ones.
Although no e-community is static, it is nevertheless possible to identify some fundamental development patterns, patterns that are not only common to several types of e-community but are also a feature of technical developments within the Internet. These developments will be analysed in terms of the SCOT approach, since this is a suitable theoretical framework for identifying common socio-technical dynamics at this level of abstraction.
A limitation associated with SCOT is that it has traditionally been applied to single, well-defined technological artefacts (e.g. bicycles). The Internet, however, is not such an artefact. As an infrastructure technology it is a heterogeneous collection of artefacts that are interlinked in assorted social and technical ways. To get around this problem, the hierarchical model used in the earlier chapter will again be employed. This time it will not be used to highlight static features; rather, it will be used to expose some dynamic interrelations between the layers of the model. This exposition can in turn be used to tackle some apparent contradictions that arise when using SCOT.
Although the material (technical) capabilities of an artefact may remain basically constant over a long period of time, its social role may change drastically. The telephone is a straightforward example. Its capabilities have not changed very much since its invention (although its design has been modified and some new features have been added), but its social use has changed significantly. It was initially conceived and promoted as a medium for conveying price information to farmers (i.e. for business communication), but it was soon also used by people, particularly women, in rural areas as a tool for social communication (Williams 1997). Such a re-invention, or re-interpretation, of an artefact does not usually replace the original purpose, but comes in addition to it. As such it can be seen as a generalisation of the use of the technology, i.e. the appropriation of the technology by additional relevant social groups.
The re-interpretation of a technology for social purposes has been repeated with other technologies, notably the French Minitel system and the Internet. In these two cases re-interpretation involved not only social processes, but also technical change. For example, by "hijacking" and "hacking" (Lemos 1996: 39) a piece of software from another telematics project, an unplanned feature of Minitel was created, namely the messageries, which provided games, chatting and a bulletin board service. This 'small' technical development formed the basis of Minitel's new role as a decentralised two-way social communication medium, as opposed to the centralised, one-way information-providing medium that was the initial intention. In this case what enabled a social re-interpretation of Minitel was a technical re-interpretation of a piece of software, which made it 'portable' from one system to another. This piece of software can be regarded as a sub-artefact of the system. Such technical re-interpretation processes are also central aspects of the development of the Internet.
The rise of e-communities represents a social re-interpretation of the Internet, from being a 'communication device for professional information' to also being a 'communication device for social information'. The role of the Net has in fact been modified throughout its history, from being a 'military communications device capable of withstanding a nuclear attack', via an 'academic research and communication tool', to a 'generally available general-purpose communication device' (although the latter characteristic should be regarded as a tendency rather than a present status).
The notable thing is that these far-reaching changes of social role have occurred with only modest technical developments. The Internet Protocol (IP), for example, which is arguably the most central (sub-)artefact of the Internet, has remained unchanged since 1981. In the technical sense this is therefore a (very) black box. Socially, however, IP has changed from being a 'carrier of short text messages' to a 'carrier of text, programs, pictures, audio, video and electronic money'. This re-interpretation has manifested itself in new basic services like the World-Wide Web, e-mail extension mechanisms (e.g. MIME), IP telephony, and audio and video players for computers. This means that black-boxing in the technical sense and black-boxing in the social sense are different things. In fact, technical black-boxing may act as a catalyst for social re-interpretation, or in the words of Hanseth, Monteiro and Hatling (1996), "standardisation is a precondition for flexibility".
At this stage it should be noted that although IP has remained unchanged, there have been important infrastructural improvements, like higher-speed and higher-capacity cables, gateways, routers and so on. These have enabled the data carried according to the IP standard to be transmitted faster and in greater volume. Some 're-interpretations', like video transfer, would have been unthinkable without these improvements, but the basic phenomenon remains intact: a given infrastructure can be used or interpreted in various ways.
The split between social and technical black-boxing is predominantly a feature of general-purpose technology; it is not so pronounced with more special-purpose technologies, like bicycles, and hence this split has not been expressed in SCOT theory. Infrastructure technologies are designed precisely to exhibit a high level of generality, since they are meant to be widely deployed and are hence expensive to replace or improve. This is due both to the cost of the infrastructure itself and to the fact that it becomes embedded in a socio-technical network where dependency relations accumulate.
General-purpose technology can hence be said to inherently exhibit a high degree of interpretative flexibility, despite being technically black-boxed or closed. Social constructivists of the 'strong' variety would likely complain here that no reference should be made to any 'inherent' properties of technology (Brey 1997: Ch. 2), but I will maintain that the technical characteristics of a technology provide some fundamental bounds for interpretation, and as such ought to be included in a socio-technical analysis. Some technologies offer through their design a high degree of interpretative flexibility, whereas others offer a low degree.
The standardisation/flexibility dynamic of infrastructure technology has proven to be a prominent feature of the development of the Internet, both in the creation of new technical applications and in the formation of social aggregations such as e-communities. This will now be further explored, starting with the technical side.
Since a considerable number of new services and applications are developed by private enterprises and often require considerable investment, the anticipated market for these products must often be extensive. If the basic services or the networks on which these products rely exist in a great variety of formats, the market corresponding to each format is likely to be too small. The product must then be made in different versions, each adapted to the format of the underlying platform. This solution, however, is likely to be prohibitively expensive, both with respect to product development and to maintenance. This means that any lack of standardisation of the underlying platform (i.e. the infrastructure) entails more costly and risky business for application developers, which will impede application development. This is illustrated in fig. 5.1, where lack of standardisation in the basic services yields merely half-hearted application development.
Fig. 5.1: Standardisation phase 1.
Once the network layer has been standardised, however, several versions of the basic services emerge. These different versions will be subject to a selection process, based on a variety of socio-technical factors: technical characteristics, economic power, marketing capability, socio-economic ties to application developers, and political regulations are only some of these factors. With only a few 'surviving' platforms left, application developers have an easier job, and a number of competing and complementary applications will mushroom, as indicated in fig. 5.2.
Fig. 5.2: Standardisation phase 2.
The competing application variants will be subjected to a new round of selection, and the final result might be as shown in fig. 5.3. Note that only three applications have been drawn, i.e. the same number as in fig. 5.1. The difference is that each of the standardised applications will be used much more widely than the tentative applications that appeared initially.
Fig. 5.3: Standardisation phase 3.
This scenario of proliferation and selection - much in line with evolutionary economic theory, incidentally - thus propagates upwards in the hierarchy of technical artefacts. Standardisation on one level facilitates variation on the next level. A manifestation of this can be seen in the work of the IETF, the main Internet technical standardisation body. IETF standards - which are part of the Request For Comments (RFC) document series - go through three phases, "Proposed standard", "Draft standard" and "Internet standard", according to level of maturity (Bradner 1996: 11-13). Full Internet standards are relatively few in number (74), and they have on average existed for a comparatively long period of time. The intermediate Draft standards are slightly more numerous (92), whereas the least mature Proposed standards are significantly higher in number (577) and of lower mean age. Proposed standards often describe extension mechanisms to the more fundamental protocols that have reached full standard status.
A similar standardisation dynamic occurs in the development of e-communities, but the standardisation is not of the de jure, or formal, kind that applies to technical protocols. We are rather dealing with informal social norms that may or may not be explicitly expressed.
The first similarity is that when there are many different and incompatible means of communication (basic services), the number of people using each system is less likely to reach the 'critical mass' necessary for sustainable communities to develop. This applies both to technical means of communication (e.g. incompatible Internet chat systems) and to social means (e.g. common language or similarity of interest). This is equivalent to fig. 5.1, if the 'application' boxes are replaced by people, and the basic services are extended to include social elements.
When the basic services start to converge, more people will use fewer different systems, and a critical mass will be more easily reached. This will provide the foundation for a variety of social formations, some of which will develop into communities. This is the phase in which the structuration activities mentioned in chapter 3 are likely to set in, a process which can be regarded as a social standardisation activity, whose outcomes are social norms (e.g. netiquette) and symbolic lexica. Although the number of distinct communities has grown dramatically, the forms these communities assume converge towards de facto standards. For example, some more or less standard constituents of a Usenet-based e-community are (1) a list of frequently asked questions (FAQ), pointing out the existing boundaries and conventions of the community to newcomers, (2) a set of netiquette norms (partly expressed in the FAQ), and (3) a set of roles (community leaders and authorities, old-timers, newcomers, norm upholders, discussion mediators, and so on). Another example of trans-community standards is the rules of conduct imposed by the administrators of web sites offering e-community facilities. Although these rules do not arise from the community itself, but are externally pre-imposed, they are partly taken from the netiquette that spontaneously arose in early e-communities. They also partly derive from off-line legislation. On the Excite community web site, for instance, it is not permitted to transmit unsolicited advertising or bulk e-mail, or material that is unlawful, obscene, threatening, and so on.
The process of standardisation is in SCOT terminology called closure or black-boxing. Closure refers less to the fixing of a technical standard than to the fixing of the social role of a technical artefact, although these are highly intertwined processes. Through a phase of negotiation between the relevant social groups, which involves both social and technical issues, there will eventually take place a settling of the 'properties' of the artefact. An example of closure of a technical Internet artefact is the development of web browsers, of which there are today only two major survivors, Microsoft's Internet Explorer and Netscape's Navigator. The features of these products are strikingly similar, indicating that the issue of what a browser 'really' is, or should be able to do, is not far from being settled. Incidentally, this closure at the browser level has led to a prolific development of auxiliary programs that enhance the browsers' capabilities, so-called plug-ins, which further illustrates that standardisation on one level induces variation on a higher level.
Social aggregations like e-communities undergo similar standardisation processes, and although they are social rather than technical artefacts, they seem suited for analysis in terms of SCOT. As described above, e-communities show a tendency towards closure on two levels. The first applies to individual communities. Through the structuration activities each community experiences, the community participants come to a stage where they share a fundamental outlook regarding the community. Such a shared outlook is in many ways similar to the SCOT concept of a technological frame (introduced in chapter 1). On a second level, these structuration activities themselves become standardised, such that the same tendencies can be identified across a number of different e-communities. It then becomes possible to speak about generic e-community forms, like the Usenet example above. A further indication of the existence of generic forms is the emergence of off-the-shelf software packages for building one's own e-community, usually consisting of program modules for chatting, discussion forums and messaging. It might be noted that the makers of these packages seem to regard an e-community as a collection of software modules, rather than a group of interacting people.
Although individual e-communities and their generic forms develop towards standard forms, there is a third level of analysis where closure is not apparent. This level pertains to the analysis of e-communities, regarding their nature and social role. Discourse around the role and nature of e-communities has so far assumed a rather rhetorical and polarised form (e.g. 'virtual' vs. 'real'). A plausible reason for the lack of closure at this level is the low 'penetration' of e-communities in the wider society. The technical confinement of e-communities is accentuated by a separatist ideology, which means that these communities barely reach outside their own 'territory'. E-communities are therefore something of a fringe phenomenon in today's society, too esoteric and irrelevant to large sections of the general public. Compared to their low general societal relevance, e-communities seem to have received disproportionate academic attention. This is the kind of attentive over-reaction with which new media forms tend to be met. When the novelty fades the attention may also fade, unless e-communities become more relevant to their social environment, and more people get involved in defining their role. Until then the discourse is unlikely to close.