Abstracts
Abstract
The expanding field of network studies, which comprises histories, traditions and innovative research from myriad disciplines such as mathematics, the social sciences, linguistics, computer science, physics, biology, Internet and communication studies, may find meaningful dialogue with the field of translation studies. This introductory article seeks to present a multifaceted and multi-tiered historical trajectory of the term and concept “network”, reflecting on the impact it has already had on studies in the domain of the sociology of translation. Can a network-based vocabulary emerging from network theories and studies, including recent works on network society, offer translation studies new conceptual tools with which to think through and articulate translation phenomena? By the same token, how might translation studies, viewing interlingual transfer in terms of product, process, profession, industry, politics and strategy, contribute to the growing body of research on the transmission and exchange of thoughts, ideas, messages, information and values that characterize communication, the core of all translation activity? As connectivity and connectedness take on ever more important social organizing dimensions in a globalizing multilingual world, a translation-informed network approach as well as a network-informed translation theory approach may symbiotically help us better understand human and social practices.
Keywords:
- translation studies,
- network studies,
- social network analysis,
- communication,
- globalisation
Résumé
Les études de réseaux constituent un domaine en pleine expansion qui comprend des histoires, des traditions et des recherches novatrices provenant de disciplines très diverses telles que les mathématiques, les sciences sociales, la linguistique, les sciences de l’informatique, la physique, la biologie et les études en communication et Internet. Entre ce domaine de recherches et celui de la traductologie devraient pouvoir se développer des liens fertiles. Cet article d’introduction vise à retracer la trajectoire historique du terme puis du concept de « réseau », en mettant en relief l’influence qu’il a déjà en sociologie de la traduction. En quoi le vocabulaire émergeant des études et théories des réseaux, y compris les travaux les plus récents sur la « société de réseaux », pourrait-il offrir à la traductologie de nouveaux outils conceptuels pour comprendre et analyser les phénomènes de traduction? De la même façon, en quoi la traductologie, qui envisage les transferts linguistiques en termes de produit, de processus, de profession, d’industrie, de politiques et de stratégies, pourrait-elle contribuer aux études portant, plus généralement, sur la transmission et l’échange des idées, des modes de pensée, des messages et des valeurs qui caractérisent la communication, cœur de toute activité de traduction? À l’heure de la mondialisation, où l’inter-connectivité devient un élément de plus en plus important de l’organisation du social, une compréhension des réseaux intégrant les phénomènes de traduction et une approche théorique de la traduction inspirée des études de réseaux pourraient ensemble favoriser une meilleure compréhension des pratiques sociales.
Mots-clés:
- traductologie,
- études de réseaux,
- analyse structurale des réseaux,
- communication,
- mondialisation
Article body
Now that the world wide web exists, everybody believes they understand what a network is. While twenty years ago there was still some freshness in the term as a critical tool against notions as diverse as institution, society, nation-state and, more generally any flat surface, it has lost any cutting edge and is now a pet notion of all those who want to modernize modernization. ‘Down with rigid institutions’ they all say, ‘long live flexible network.’ What is the difference between the older and the new usage? At the time, the word network clearly meant a series of transformations – traductions – which could not be captured by any of the traditional terms of social theory. With the new popularization of the word network, it now means transport without deformation, an instantaneous, unmediated access to every piece of information.
Bruno Latour 1999: 15
1. Is it really the case?
If we pause for a moment to think of the basic scientific reason for theorizing any given practice, the assumption is that the contemplation of a certain practice or practices will simply help us to describe and explain, and furthermore predict, phenomena and features that are unique to that practice or those practices; in sum, that theory will serve to enlighten practice, which may in turn nurture a more profound disciplinary reflection on what we do and why. As a formal discipline, translation studies has benefited significantly from ideas, concepts, analytical perspectives and methodologies drawn from a wide range of other disciplines (and their internal trends) that help it better understand the precise object of its inquiry: translation. Most appreciably in the domain of literary studies, translation practice has informed the body of theorization work, extricating those elements that are uniquely characteristic of translation proper. While always and fundamentally an act of communication, hence innately linked and bound to other disciplinary discourses by default, translation occupies a space that is nevertheless distinct, one which is distinguished by the specificity of richly diverse languages and cultures continuously interacting with one another over time, to wit, the specificity of the human race organized along a wide and fascinating spectrum of societies. The plenitude of particularities found along this spectrum systematically challenges efforts to articulate any overarching theory of translation that is generally or universally applicable.
Although translators and interpreters have long been familiar, through linguistic and cultural expressions, with the not always unproblematic encounters between communities and societies, these encounters have historically largely transpired on terrain configured in the local: through the physical displacement of human beings and artefacts from one locale to another. Space/time constructs for translation theory and practice, including our relationships to the historical past, have primarily been experienced and expressed in terms of the physical, geographical perceptions shaped by our bodies occupying biological space and time. The boundaries of physical place have grounded organizations historically, configuring their very organizational definition and processes. The compression of space/time (David Harvey 1989), often cited recently as having tangibly transformed human perceptions of the relationships between space and time most notably from the late 19th c. industrial period to our contemporary “global” era, has sprung from technologies evolving and complexifying our globalizing economies and societal practices at a faster and faster pace. Scientific inquiry has turned its analytical eye to focus on them, including Internet space and time, and the term globalization has acquired a specific, historically-grounded contemporary connotation, where even virtual globalizing worlds are no longer just the stuff of science fiction. The space/time relationship has long been of interest to intellectual and creative pursuits within the humanities and sciences. This is no less true for translation studies, for it intimately configures and structures the transmission and exchange of thoughts, ideas, messages, information, values and visions, i.e. communication – which is at the core of all translation activity.
The means to interconnect, communicate and interact through space/time has repeatedly been conceptualized, visualized and designed – in many disciplines – through use of the term “network,” bespeaking the inherent notion of connectivity that lattices webs of relationships and organizational structuring phenomena in all their complexity. The polysemic, elastic nature of the word and associated concepts facilitates both theoretical and practical transversal deployment across many domains. As such, reading across the disciplines, the classification schemas proposed for network types in the literature vary widely, but generally include such categories as: biological; ecological; cognitive; neuronal; linguistic; cultural; social (human/non-human); organizational; economic; mathematical; physics; communication; transportation; information; engineering; technological; computer. Through all the categories and properties of these networks, connectivity threads its way as a common filament linking the most diverse contexts. Conceptualized simply as the act and quality of being connected, it can be conjugated along a sliding continuum of manifest degrees of connectedness and relationships through space and time. Concretely in the context of communication, connectivity can be articulated in terms of the simple connecting devices which make contact with one another in a (tele)communications system, or as the more sophisticated network:
[…] patterns of contact […] created by the flow of messages (i.e. data, information, knowledge, images, symbols, and any other symbolic forms that can move from one point in a network to another or can be cocreated by network members) among communicators throughout time and space.
Monge and Contractor 2003: 3
The continuity of global human history itself invites a certain connectivity approach, one that interpellates the changing dimensions of space/time relationships, and which takes into account the multi-layered histories of communication. An interesting paradigm of historical periodization worth mentioning for a history of human networks in its most global context has been articulated in terms of five successive “worldwide webs.”
Proposed by McNeill and McNeill (2003) (cited in van Dijk (2006: 22-23)), the first worldwide web would have emerged as a predominantly nomadic and loose web, remaining so until settlement occurred with the advent of agriculture, about 12,000 years ago. The spreading of ideas, cultural expressions, technologies (the use of fire, for example) and genes transpired through periodic, and later more sustained, contact among members of diverse communities. The second worldwide web would have emerged about 6,000 years ago, with the metropolitanization of local webs and the creation of the first civilizations (Mesopotamia, Egypt, the Indus, the Yellow River (China), Mexico and the Andes), connected by caravans of transport animals and by coastal sea and river ships. Of note is this first instance of sustained stranger-based connectivity. The third worldwide web emerged from the contacts between Eurasian and North African civilizations about 2,000 years ago, and expanded to include the empires of India, China, the Mediterranean (Greece and Rome), Mexico and the Andes. Transportation and communication means (including alphabetic writing) vastly improved, and human connectivity was signalled by both the import/export and defence/rejection of ideas and customs upon contact with the “other.” The fourth worldwide web, emerging about 1450, extended its sphere by bringing Eurasian and American civilizations in touch with one another, primarily through transatlantic navigation. Violent clashes among civilizations reverberated as the pursuit and exploitation of natural resources led to successive waves of colonization. Finally, the fifth worldwide web, emerging about 160 years ago, witnessed unprecedented urbanization and population growth, as well as increasingly global connectivity. Van Dijk elaborates on the McNeills’ fifth worldwide web even further, dividing it into two periods, the first configured by mass communication networks, and the second by network societies. This periodization, outlined in terms of global human worldwide webs, weaves a historical tapestry of interfacing, interacting networks of all types, underscores the weight of globalization, and sets an analytically-friendly backdrop for the histories of network terms, concepts and research. As van Dijk notes (2006: 23), these historical worldwide webs systematically reveal through time: 1) a continual presence of combined cooperation and competition; 2) a general flow in the direction of greater social cooperation (voluntary and compelled); 3) a tendency for human webs to grow; and 4) the increasing impact of human communication on the state of the planet. Profoundly implicated in the processes of transformation and in the changing relationships with space/time among connecting, communicating and globalizing societies worldwide are clearly the concepts and realities of networks and translation.
This volume aims to connect two vast areas of knowledge: translation and network studies. It presents ten papers that explore translation – as an interlingual process and profession, as an industry, as cultural exchange and even as political action or a research field – through a “network approach.” While all these papers deal with translation proper – in a literal sense rather than a metaphorical one – and generally share a methodological concern (and discomfort over conventional methodologies in Descriptive Translation Studies [DTS]), they are informed by different theoretical perspectives in which “networks” do not necessarily have the same referent, nor exactly the same semantic value. Network studies is a research field whose institutionalization is only quite recent (in the history of science); nevertheless, it encompasses a broad range of methods and applications which are themselves derived from and intertwine various disciplines, at times of long historical standing. As such, by way of introduction to this volume, we felt it useful to give a brief overview of what network studies is about. Where does it come from? How did it develop? What are its main achievements and conceptual tools? In so doing, we hope to situate each of the papers theoretically and methodologically, while suggesting further avenues of research at the crossroads of network and translation studies.
2. On the origins of network studies: mathematics
Although a case can certainly be made on many levels for the legitimacy of using the term network as a metaphor for discussing translation phenomena, does network studies potentially hold a repertory of useful vocabulary and concepts with which to actually rethink the field of translation studies? Has the term “network,” due to its very ubiquitousness, lost its critical, analytical edge, as Latour suggests? In search of a preliminary response to this question, we found it imperative to review some of the fundamental bases of network scholarship. Historically, networks have been a sustained subject of study since 1736, when mathematician Leonhard Euler attempted to scientifically answer the Königsberg bridge problem (“Does there exist any single path that crosses all seven bridges exactly once each?”) (Newman, Barabási, and Watts 2006: 1). They have largely been the bastion of an important part of mathematics and the sciences, upheld for the most part by graph theory (the mathematical theory of properties and applications of graphs), which uses mathematical language to describe network properties. Graph theory was first systematically treated by Dénes König in the 1930s (Eric W. Weisstein). While variations abound in the literature, a network can be defined, “[i]n its simplest form,” as “nothing more than a set of discrete elements (the vertices) and a set of connections (the edges) that link the elements, typically in a pairwise fashion.” (NBW 2006: 2) As clarified further by mathematical sociologist Robert A. Hanneman and Luis R. Izquierdo, “the formal abstraction called network in the social sciences is often named graph in graph theory, while the term “network” in graph theory is reserved for a specific type of graph.” (Izquierdo and Hanneman 2006: 4) Advocating a more precise definition so as to render it useful beyond pure mathematics, communication scholar Jan van Dijk (2006: 24) proposes the following: “A network can be defined as a collection of links between elements of a unit. The elements are called nodes. Units are often called systems. The smallest number of elements is three and the smallest number of links is two. A single link of two elements is called a relation(ship). Networks are a mode of organization of complex systems in nature and society.”
Graphs (mathematical figures consisting of points (vertices) and lines (edges) to represent relationships) in graph theory have consistently been used to depict the topological features of networks, by abstracting away problem-specific details in order to focus, in particular, on connectivity. A graph is considered to be connected if it is possible to get from one point to all other points in the graph, and disconnected if it is not possible to do so, implying the existence of subsets of points in the network (subgraphs) whose members are connected to one another (components) but which are not necessarily connected to other subsets (Monge and Contractor 2003: 43). Directionality is also important. The edges depicted in a graph may be directed (ordered pairs, connecting a source vertex to a target vertex) or undirected (unordered pairs, connecting the two vertices in both directions). Mathematically, the topological depictions of points (nodes/vertices) and lines (links/ties/edges) can focus on the manner in which elements are arranged into sets, according to certain properties, under certain conditions (combinatorial theory), or they can reflect a probability distribution function, which statistically describes all the possible value outcomes and probabilities that any random variable can acquire within a given space. The topological mapping out of elements and of interconnections (physical/real and logical/virtual) in networks has been the domain of network topology. Topology studies geometric figures and their properties under continuous transformations, independently of their shape, size or quantity. Graph theory and network topology, as a branch of topology in discrete mathematics, are capable of being both effectively abstracted and grounded in detail, and have thus been conceptually and practically applicable not only in mathematics, but likewise in science, engineering and computer science. Graphs and their topological properties, through the mathematical theorems and proofs that sustain them, provide us with the means by which we can conceptualize and demonstrate connectivity – most notably in the form of networks.
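For readers who prefer a concrete handle on this vocabulary, the short sketch below (in Python, using the open-source NetworkX library, assumed here purely for illustration and not among the works cited) builds van Dijk’s minimal network of three nodes and two links, then shows how connectedness, components and directed edges can be inspected programmatically. The node labels are arbitrary.

```python
# A minimal sketch of the graph vocabulary above: vertices (nodes),
# edges (links), connectedness, components and directionality.
import networkx as nx

# Van Dijk's smallest network: three elements (nodes) and two links.
G = nx.Graph()
G.add_edges_from([("A", "B"), ("B", "C")])
print(nx.is_connected(G))                 # True: every node can reach every other

# Adding a separate pair of nodes makes the graph disconnected:
# it now contains two components, {"A", "B", "C"} and {"D", "E"}.
G.add_edge("D", "E")
print(nx.is_connected(G))                 # False
print(list(nx.connected_components(G)))   # the two components as sets

# A directed graph records ordered pairs, from source vertex to target vertex.
D = nx.DiGraph()
D.add_edges_from([("A", "B"), ("B", "C")])
print(list(D.edges()))                    # [('A', 'B'), ('B', 'C')]
```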
Distinctly modern trends in network theories originated during the 1950s and 1960s. The principal point of departure, concentrated mainly in mathematics and in the (natural) sciences, was random graph theory, which examines the actual or hypothetical probabilities of network connection and the properties of vertices and edges obtained by random processes. It was introduced in 1951 by Anatol Rapoport and Ray Solomonoff, who explored three natural systems in which networks could appear: neural; social (epidemics); genetics (NBW 2006: 11), and formally initiated in 1959 by mathematician Paul Erdős and his application of the probabilistic method as a technique to “establish the existence of certain objects by selecting an object at random from a certain probability space and proving that the object has the desired properties with positive (usually overwhelming) probability.” (Babai 1998: 27) In the 1960s, Erdős and Alfréd Rényi effectively developed a statistical theory of combinatorial structures, demonstrated graph evolution and phase transition, and ascertained that many properties of random graphs emerge suddenly, not gradually (NBW 2006: 12; see also Béla Bollobás for combinatorics and graph theory). Other studies focused on classifying objects or data into subsets (clusters) sharing a common feature, a process known as clustering. Mathematician John Hammersley, along with Simon R. Broadbent (1957), elaborated on the connected cluster behaviour found in random graphs. The description of this behaviour became known as percolation theory, i.e. a stochastic, probabilistic model exhibiting phase transition which “deals with fluid flow (or any other similar process) in random media” (cited in Weisstein). It is a model useful for demonstrating emergence, expansion and spreading.
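The sudden, rather than gradual, emergence of properties that Erdős and Rényi demonstrated analytically can also be observed empirically with a few lines of code. The sketch below, again an illustrative assumption built on NetworkX rather than anything in the works cited, generates Erdős–Rényi random graphs G(n, p) for several edge probabilities p and tracks the size of the largest connected component, which jumps dramatically once p crosses roughly 1/n.

```python
# A hedged sketch of the Erdős–Rényi random-graph model G(n, p), showing the
# "phase transition" in which a giant connected component appears suddenly.
# The parameter values are illustrative assumptions.
import networkx as nx

n = 1000  # number of vertices; the critical edge probability is roughly 1/n = 0.001
for p in [0.0005, 0.001, 0.002, 0.005]:
    G = nx.gnp_random_graph(n, p, seed=42)
    giant = max(nx.connected_components(G), key=len)  # largest component
    print(f"p = {p:<6} largest component = {len(giant)} of {n} nodes")
```

Rerunning the sketch with different seeds shows the same qualitative jump each time, which is precisely the point of the probabilistic method: the property holds with overwhelming probability.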
Models are a fundamental component and tool of network theory and analysis. They have continually served mathematicians and scientists for describing systems and processes as representations, often in mathematical terms, of idealized or simplified concepts. Conceptual models, in turn, are habitually translated into drawings, blueprints, designs and even programming languages to execute small-scale representations for subsequent applications in areas such as engineering, medicine, physics, biology, ecology, education, cognitive science and computer science. The value of modeling resides not only in its capacity to depict and describe (allowing us to analyze our observations) but also to predict in the most accurate way possible. The intricate complexities of all natural, artificial and abstract systems, revealed and brought to light by the application of evolving technologies to all domains of scientific and intellectual inquiry and their concurrent trends towards hyper-specialization, draw upon a wide diversity of models. Interesting cases in point are models in the domains of (artificial) neural networks, fuzzy systems and fuzzy-neural networks, which have had a direct impact on translation memory, machine translation and voice recognition research, for example. Modeled on human neurobiological functions, computed neural networks are determined by connectivity and the weight values of individual input-output functions. They are trained to “learn” the complex relationships between sets of input and output variables. Based on algorithms such as backpropagation, they learn to recognize patterns, to predict outcomes and to perform tasks or functions. Models are essential for attempting to visualize and to predict behaviours of all types of complex systems, a burgeoning field of study that currently crosses many disciplines. Complexity theory, historically rooted in the more deterministic chaos theory, studies the simple behavioural patterns generated by complicated, dynamic sets of relations, and attempts to show how parts of a system give rise to the collective behaviour of the system while the system interacts with its environment. These dynamical complex systems are analytically pried open by models of phase transition and emergence phenomena, where the appearance of a new property may be deterministic in the system.
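As a deliberately toy-sized illustration of the claim that computed neural networks are “determined by connectivity and the weight values of individual input-output functions,” the sketch below trains a single logistic neuron by gradient descent to reproduce the logical OR relationship between two inputs and one output. It is an assumption-laden example (NumPy, invented data, an arbitrary learning rate), not a depiction of the translation-memory or machine-translation systems mentioned above; backpropagation generalizes this weight-adjustment step to multi-layer networks.

```python
# A toy model: one sigmoid neuron whose two connection weights and bias are
# adjusted by gradient descent until its outputs match the logical OR targets.
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # input patterns
y = np.array([0, 1, 1, 1], dtype=float)                      # target: logical OR

rng = np.random.default_rng(0)
w = rng.normal(size=2)   # connection weights
b = 0.0                  # bias
lr = 0.5                 # learning rate

for _ in range(2000):
    out = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid activation
    grad = out - y                             # gradient of cross-entropy loss
    w -= lr * (X.T @ grad) / len(y)            # adjust weights
    b -= lr * grad.mean()                      # adjust bias

print(np.round(1.0 / (1.0 + np.exp(-(X @ w + b))), 2))  # approximately [0, 1, 1, 1]
```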
Network theories and models have influenced to varying degrees the methodologies and conceptual frameworks of the applied sciences (business and organizational studies, education, engineering, law, medical and health sciences, military studies and social work), the humanities (cultural studies, languages and literatures, philosophy, architecture and the arts), and such disciplines as linguistics, history, and communication studies. Adopted by some researchers in the social and communication sciences in the mid-20th century, they seemed to furnish more quantitative methodologies and data-gathering techniques, as well as a more scientific and mathematical structural framework within which to articulate social and behavioural phenomena. Still largely indebted to graph theory–used as a means both to describe abstract concepts and to analyze empirical data–, these network theories complexified along the way by their extrapolation to studies of social structures and organizations. They increasingly stressed the relational ties of connectivity between elements in networks, whether they be fixed, ordered, chaotic, or transitional, thereby embracing not only static frameworks but eventually dynamic ones as well. Some approaches further extended their models to include unconscious structures. As the following section will show, the shift to more systematically apply scientific method (and tangentially, mathematical models) to the study of human societies and social relations allowed network theories the requisite spaces within which to germinate, and at times flourish, in key academic disciplines of the social sciences including anthropology, economics, geography, information science, political science, psychology, and sociology.
3. The development of network studies in the social sciences
Network studies in the social sciences were most notably deployed through some form of Social Network Analysis (SNA). They neither grew out of nor were driven by any kind of grand “network theory” but rather developed first from methodological concerns – the need to accurately account for collected data. Though metaphors of the web or network have always pervaded discourses in the social sciences, the origin of Social Network Analysis goes back to the 1930s and is usually attributed to social psychologist Jacob Moreno (1934). Initially trained in medicine and psychiatry, Moreno was interested in understanding how psychological well-being relates to a person’s interpersonal relations, and introduced the notion and method of “sociogram” as a way to map small-scale networks of affinities. Throughout the 1930s and 1940s, the method designed by Moreno was expanded and used to explore the internal structures of networks, notably in the workplace (see the Hawthorne studies) and within a small urban North American setting (see the ‘Yankee City’ study). The emphasis was on the analysis of the formation and functioning of sub-groups (therein arose the concept of clique) as well as their influences on people’s actions.
Innovative research by leading British anthropologists John Barnes, Max Gluckman, Elizabeth Bott and Clyde Mitchell, known for forming the so-called Manchester School of Social Anthropology, developed SNA substantially throughout the 1950s and 1960s. In a famous study of a Norwegian fishing village, Barnes (1954) first made the move from a metaphorical to a more literal use of the term “social network” to refer to relational patterns that were not based on institutionalized entities (such as the parish, factory, hamlet etc.) but on more informal affinities. The idea was taken up and extended by Bott (1957) in an analysis of the allocation of tasks in twenty London households. On both occasions, the use of the term network emerged from the researchers’ inability to effectively account for the data collected (in this case, the division of labour in an urban industrialized society) using conventional (institutional) categories. This rising interest in informal patterns of interactions and power relations (such as the division of labour) and internal conflict was part of a broader shift in the social sciences, where anthropologists started “to turn the telescope the other way round,” in the words of Mary Douglas (1995: 24), and to look critically at their own societies. They began conducting fieldwork in urban settings to highlight, among other things, the existence of solidarity networks within the workplace or among migrant communities, for example. Boissevain and Mitchell’s (1973) collection of essays nicely sums up the main contributions made during the period, as well as the theoretical debates underlying the emergence of the concept of network within the Manchester school. According to Boissevain, the development of network analysis at the time was…
an attempt to reintroduce the concept of man as an interacting social being capable of manipulating others as well as being manipulated by them. The network analogy indicates that people are dependent on others, not on an abstract society. […] Secondly, network analysis seeks to place again in the foreground of social analysis the notion of internal process and the inherent dynamics in relations between interdependent human beings. […] The basic postulate of the network approach is that people are viewed as interacting with others, some of whom in their turn interact with each other and yet others, and the whole network of relations so formed is in a state of flux.
Boissevain 1973: viii
As such, the notion of network here was clearly a way to depart from more deterministic approaches, such as those inspired by Marxism, where social structure equates with social class, i.e. where society is seen as a stratified entity imposing its order on each individual. Yet, delimiting this key concept in relation to others, such as the concept of institution, was not that easy and remained a continuous subject of debate (at times fraught with misunderstanding). Assuming that there was “no way of defining a network relevant for all purposes,” Mitchell (1973: 24) would distinguish between three kinds of content (i.e. networks): communication (information), exchange, and normative. While Boissevain regarded network/institution as two poles of a continuum, in Mitchell’s view the concepts referred instead to two different levels of abstraction, and should not be seen as forming a dichotomy. On this basis, Mitchell would distinguish between “partial networks” and the “total network” of society – thus introducing the idea that society in its entirety could potentially be accounted for in network terms. However, in Bott’s view, the concept of social network could be more useful in explaining behaviours in large-scale complex societies than in small-scale ones, in as much as “in the face of greater complexity we cannot work at very high levels of abstraction, starting from the content of social links” (Mitchell 1973: 34). At this point, Social Network Analysis would primarily concern itself with the study of partial networks rather than total networks. From this common ground emerged two approaches which, in his view, should be kept separate: one that looks at how networks structure an individual’s action; and one that analyzes how individuals can manipulate networks for their own purposes. The first approach would lead to interaction theory, the second to exchange and game theory (Banck 1973: 40-41). According to Scott (2000: 32), by equating SNA with the study of localized interpersonal groups, the Manchester School “largely failed to attract adherents from outside the area of community studies,” thus marginalizing itself.
Though the idea of mapping networks as a way to represent social relations was intuitively attractive, the legibility of the result declined as the number of dots and links increased. In other words, the method was really suitable to small-scale groups only. As a way to overcome this limit, Harrison White and his research group at Harvard University took advantage of the advances in information technology to apply sophisticated mathematical and computer devices to the systematic analysis of more complex networks: maps were replaced by matrices that lent themselves better to computer analysis. This gave an important boost to network studies. By the end of the seventies, the research field had become institutionalized, with its own journals, methodological handbooks and conferences, bringing together scholars not only from within sociology and anthropology, but also political science, history and social psychology. According to Wellman (1997), the research conducted under this heading could broadly be divided into two groups: the formalists (analyzing the form of networks rather than their content) and the structuralists (using network analysis to investigate more traditional research questions in sociology). Again, the difference between the two would lie in the level of abstraction: network analysis and modelization being an end in itself in the first case; and a method in the second one. Yet, in fact, the two could be regarded as complementary and mutually enriching, as data generated in each area could be of direct use to the other.
Drawing from mathematics and computer technology, formal SNA “progressed slowly, almost linearly, with the developments of sociometry (sociograms, sociomatrices), graph theory, dyads, triads, subgroups, and blockmodels, all of which served to enlighten substantive concerns such as reciprocity, structural balance, transitivity, clusterability, and structural equivalence.” (Monge and Contractor 2003: vii) These terms are among those which comprise a working conceptual vocabulary of metrics serving to explain and highlight various characteristics of networks and the impact of each on social interaction, or more exactly on the reproduction or change in social patterns and/or power relationships. Notions of reciprocity/asymmetry refer to directionality and differentiate between one-way and two-way relationships. Transitivity refers to a “closure” by which two distinct nodes sharing a relation with a third one will also tend to be directly connected (A → C, on the basis of A → B, B → C). Bridges refer to nodes that tend to link various transitive sub-networks. Contrary to transitive relations, a structural hole refers to the pattern by which different nodes are connected (either directly or indirectly) by one and only one node (for example, node A would have direct connections with B, C, and D, but none of these nodes would be connected to one another, except through node A). In network analysis, density is a measure of the network’s cohesion. A dense network can consist of various cliques, i.e. sub-networks with an independent capacity of action that will tend to produce fragmentation within the network. Also interesting – as they relate to key concepts in the sociology of translation – are the notions of centrality and periphery, which define the relative power of each component of a particular network. A central node – or sociometric star – is one that has many relationships and that is in a position of control with respect to other nodes. Conversely, a peripheral position is one that is loosely connected to a limited number of nodes, the extreme case being that of the isolate, i.e. a node that is not connected at all. The principle of structural equivalence was introduced by François Lorrain and Harrison C. White in 1971: two nodes/actors are structurally equivalent when they have the same relationship with other similar nodes, and role equivalent when they have the same relationship with different nodes (as in a homological relationship). All of these concepts, drawn from mathematics and algebra, can equally apply to the ethnographic study of small informal networks or computer-aided analyses of broader entities. In 1973, Mark Granovetter published a seminal paper which proposed conceiving of weak ties in terms of their hitherto underestimated strength. Attempting to bridge the gap between macro- and micro-level analyses in sociology, and drawing on a diffusion model, he turned attention to the relational aspects between the nature of ties and processes, such as social integration, flow of information or the diffusion of innovation. White is now considered to be one of the main founders of the “new economic sociology,” a research field in which the network appears as a key concept.
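To make these metrics less abstract, the following sketch computes a few of them on an invented seven-actor network (the names and ties are purely hypothetical, and NetworkX is again assumed only for convenience): a closed triad, a bridging actor, a structural hole around another actor, and an isolate.

```python
# A small toy network illustrating density, transitivity and centrality.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("Ann", "Bea"), ("Bea", "Carl"), ("Ann", "Carl"),  # a closed (transitive) triad
    ("Carl", "Dan"),                                   # Carl bridges two clusters
    ("Dan", "Eve"), ("Dan", "Fay"),                    # Dan spans a structural hole:
])                                                     # Eve and Fay meet only via Dan
G.add_node("Gus")                                      # an isolate, not connected at all

print(nx.density(G))            # cohesion: share of possible ties actually present
print(nx.transitivity(G))       # global tendency of open triads to "close"
print(nx.degree_centrality(G))  # Carl and Dan score highest: the "sociometric stars"
```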
So, while some scholars (such as those working within the Manchester School) used the concept as a way to depart from deterministic structuralism, others (such as White and later Granovetter) found in this same idea a way of accounting for socio-economic phenomena, breaking away from models based primarily on the market competition hypothesis.
While analyses of formalist orientation have allowed for the development of a vocabulary and concepts to understand the various forms networks can take, what Wellman calls “structuralist” network analyses would make use of this vocabulary to explore more traditional issues in sociology such as how network patterns may determine allocation of resources, access to information, social exclusion or integration, etc. Within this structuralist network tradition, two main foci emerged: one based on the analysis of whole networks (sociocentric studies) and the other approaching the network from within by taking individuals as the focal point (egocentric studies). As Wellman explains (1997), these two approaches each would have their own limits, such as exhaustivity in the first case (how to make sure the data collected allows us to draw a whole network) and representativity in the second. Most structuralist network analyses have tended to adopt a more egocentric approach. Throughout the seventies and eighties, network analysis expanded its domains of applications, most notably to information science, communication, economics and political science. For example, Derek de Solla Price (1965) would conduct his famous study on the network of citations between scientific papers, which later prompted research on power-law degree distributions and eventually scale-free networks (NBW 2006: 17-18). The newly developing, heterogeneous field of organization studies likewise benefited, first “in such things as quantitative research on institutional diffusion in populations of organizations, organizational demography, decision making, information processing, networks, learning, evolution and comparative structures,” and then “in such things as qualitative research on culture, gender, sense-making, social construction and power.” (James March 2007: 14) Cognitive psychologist Herbert Simon, and organizational scientist James March along with Richard Cyert, would seek to explain, from the perspective of economics and organizational behaviouralism, how decisions are made within networks of a firm. Meanwhile, the social sciences were gradually making their way towards the virtual world and the formation of social networks through the use of computer technology, exemplified in the work of Barry Wellman. Wasserman and Faust (1993), Scott (2000) and Carrington et al. (2004) give an overview of the range of concepts and techniques that are now part of SNA. Nowadays, though structural network analysis is obviously too diversified to be referred to as a “theory,” its institutionalization prevents us from regarding it merely as a set of techniques. The question remains an open one – some would argue for recognition of the “research paradigm” (Wellman 1997) while others would tend to favour the “toolbox” hypothesis (Chiesi 2001: 10501) – but it no longer seems to be the focus of debate. Indeed, around the late eighties, social scientists turned to other issues, some of which materialized in the emergence of a rather unorthodox form of network analysis: Actor-Network Theory, also known as the “sociology of translation.”
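The power-law degree distributions and scale-free networks that grew out of Price’s citation study can be glimpsed in a preferential-attachment sketch such as the one below (the Barabási–Albert generator in NetworkX, with arbitrary parameter values): a handful of heavily connected hubs emerge, while most nodes remain in the sparsely connected tail.

```python
# A hedged sketch of preferential attachment: well-connected nodes keep
# attracting new links, producing a few hubs and a long tail of minor nodes.
import networkx as nx
from collections import Counter

G = nx.barabasi_albert_graph(n=2000, m=2, seed=1)  # each new node attaches to 2 others
degrees = [d for _, d in G.degree()]

print("maximum degree:", max(degrees))              # a hub, far above the average
print("average degree:", sum(degrees) / len(degrees))
print("nodes of degree 2:", Counter(degrees)[2])    # the long tail of weakly connected nodes
```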
ANT developed mainly through the works of Michel Callon, Bruno Latour and John Law and was initially focused on analyzing the processes that underlie scientific and technological innovations: how do ideas transform into “hard” facts, facts which will then lead to technological innovations that become part of our society? The concept of translation in this framework was used in opposition to that of diffusion (see Granovetter, above). While the diffusion model assumed that facts are produced in response to a need, and spread in a given form into society, ANT claimed, to the contrary, that they get shaped and transformed by and through society. In other words, they are a product of social relations, rather than the opposite. Besides being ambitious in scope, ANT would depart from previous network studies in a number of ways. Most importantly, and from which all others would derive, was the nature of its post-structuralist epistemology, which shared some affinities with Foucault’s ideas (Latour 1997; Crawford 2005: 3) and with the order-out-of-chaos philosophy. Actor-Network Theory brings together two concepts that are usually regarded as oppositional: “actor,” implying the idea of agency, and “network,” the idea of structure. As seen above, the relation between the two, the difficulty of reconciling micro- and macro-analyses, was always of central concern to the social sciences, and a dilemma which ANT chose to bypass. Refusing to take a position in the structure/agent debate, or to defend any kind of “middle ground” position, ANT chose more radically to advocate the need of getting rid altogether of the dichotomies that pervade our ways of understanding society, to wit: the opposition between nature/culture, micro/macro, human/non-human, etc. Consequently, in ANT, there is no such thing as a “context” or pre-existing order, no network opposed to an institution, nor any individual opposed to a “corporate group,” just “irreducible, incommensurable, unconnected localities, which then, at a great price, sometimes end in provisionally commensurable connections” (Latour 1997). Likewise, to take up Michel Callon’s words, “neither the actor’s size nor its psychological make up nor the motivations behind its actions are predetermined” (Callon 1997:2). As a consequence, in ANT, networks are not fixed entities either. They “may have no compulsory paths, no strategically positioned nodes” (Latour 1997). Mapping networks, drawing typologies, and making predictions about the functions of any particular subject/object, are clearly not on ANT’s agenda. Rather, its aim is to create a vocabulary that will allow us to account for the various processes of translation and representation by which heterogeneous elements interact. At this point, the distinction between explanation and description tends to collapse, causing the theoretical position to translate into two methodological principles: in as much as actor-networks are not fixed, but only reveal themselves once activated, one has to “follow the actors” and account for their actions in a “symmetrical way.” The first principle means looking at the object from the viewpoint of those who produce it – following their steps and gathering the “inscriptions” that give shape to what they produce (i.e. socio-technical networks). Indeed, as ANT presumes that networks are omnipresent and dynamic, it is incompatible with any kind of sociocentric approach, requiring instead an egocentric attitude. 
The second principle, borrowed from David Bloor, involves not only treating winners and losers equally and on the same terms, but also treating nature and culture in similar terms (Latour and Woolgar 1996: 21-22).
As it gained popularity and adherents from various disciplines, ANT was subject to numerous attacks (Isambert 1985; Amsterdamska 1990; Collins & Yearley 1992; Bourdieu 1994; Gingras 1995) which crystallised most of the debates in social sciences in the nineties. This unorthodox framework was perceived as nihilistic in nature, fraught with hypertextualism, superficially radical but inherently conservative – perceived as a return to a mere form of realism – or even ‘reactionary,’ in that it was seen as betraying sociological objectives and epistemology in favour of engineering ones. Claiming that they had been misunderstood, ANT proponents continued to clarify their position, smoothing out some provocative statements that were part of the initial version of ANT – and responsible for much of the controversies surrounding it:
The intention was not to say that scallops have voting power and will exercise it, or that door closers are entitled to social benefits and burial rites, but that a common vocabulary and a common ontology should be created by crisscrossing the divide by borrowing terms from one end to depict the other.
Callon & Latour 1992: 359
A few years later, however, Latour would state that the main problem of ANT was its very vocabulary, at least in four ways: actor, network and theory, “not to forget the hyphen” (Latour 1999: 15). Despite this apparent step back, the founder of ANT did not really distance himself much from his initial agenda. His most recent publication, Reassembling the Social: An Introduction to Actor-Network-Theory (2005), is a reassertion of his earlier positions – the anti-essentialist epistemology, the necessity to redefine “the social” by scrutinizing the assemblages it is made of –, though in a (superficially?) more humble way. He presents ANT as a set of techniques and a method, rather than as a theory. While some believe that this “theory” is all but dead and buried, evidence would tend to suggest the opposite. Indeed, not only was ANT applied to various case studies beyond the sociology of science; it has also generated – and keeps on generating – innovative, rigorous and useful empirical research that may, or may not, allow it to fulfil its promise in the long run. More importantly, even while ANT (as a programme) remains and continues to be perceived as controversial, many of the ideas raised by Latour (such as the “agency” of objects) have resonated widely within the social sciences, and have certainly influenced a body of research not specifically labelled as a part of ANT, but which does share some of its post-structuralist assumptions. A French series aimed at presenting key contributions to ANT is currently being developed, with the first volume, focusing on the use of ethnography, launched in 2006 (Akrich et al. 2006).
What emerges with ANT is a shift in the semantic extension of “network.” Here the term no longer refers to the representation of localized interpersonal relations (as conceptualised by the Manchester School), nor to abstract models of complex social structures (as with the Harvard group), nor to communication channels (whether physical or virtual), but to any kind of relation that may connect two entities while transforming the very nature of what passes between them, by translating this content. Network is no longer something that can be mapped and explained. On the contrary, it becomes a given, a constitutive feature of society, a way of looking at the world. As such, ANT brings us close to the idea of “network society” later promoted and explored by others. Translation would become a core metaphor of ANT used in opposition to transmission, as a way to suggest that any movement implies transformation, and that networks get formed and transformed through negotiations, conflicts, controversies, etc. They are not stable and they should not be essentialized. But… “now everybody thinks they know what a network is,” Latour laments… This disparaging criticism could also be levelled at ANT for its main concept: translation. In everyday language – the register Latour takes up in this quotation – translation generally refers to interlinguistic transfer (rather than, say, chemical synthesis); translation scholars and historians who have extensively studied discourses on translation would probably tell Latour that the transfer has generally been perceived as closely synonymous with “seamless,” a simple quest for equivalence, something that can be done almost automatically. While the development of translation tools speeding up the process to almost simultaneity may tend to reinforce this idea, these tools present limits – as demonstrated by less than consistently ideal translation quality and communication flaws – and are a constant reminder of its illusory nature. At this stage, the objectives of a (Latourian) socio-technical network analyst and those of translation scholars could coalesce toward a similar agenda: i.e. breaking up the illusion.
4. On language and networks
Even while the most notable impact of network studies on translation studies thus far clearly came through contact with the social sciences, and as such, warranted a more comprehensive description, another domain, closely linked to translation studies, has also productively engaged research on networks. Linguistics, the formal scientific study of language as an academic discipline, emerged during the 19th century, and subsequently evolved into specialized sub-domains during the 20th century, building on an already long tradition of philosophical and scholarly inquiry into language and its specificity to human perception. “Networks” of linguistic relationships first took form in the work of linguist Ferdinand de Saussure, whose notions of the linguistic sign, and langue and parole, sparked structuralism, a major repositioning of approaches in intellectual thought throughout the social sciences and humanities during the second half of the 20th century. Fundamental offshoots of the dialogue between linguistics and other disciplines such as anthropology and sociology have represented important, if controversial, milestones in linguistic theories. These include theories of linguistic relativity as first proposed by linguist-anthropologists Edward Sapir and Benjamin Whorf (Sapir-Whorf hypothesis) – seeking to show that language and thought are linguistically and socially mediated through cultural differences –, and linguistic universality as proposed by linguist Noam Chomsky (universal and transformational grammars) – seeking to reveal innate grammatical rules and structures in all human languages. The literature on linguistics and its specialized sub-domains is extremely vast, and for the purpose of this introductory article, reference to it is necessarily limited. Of particular note, however, are the scholarly references that seek to succinctly synthesize key areas of network research with regard to linguistic investigation.
As stated and emphasized by Ricard V. Solé et al. (2007: 3) and Jinyun Ke (2007: 3), certain network structures in particular capture the attention of linguists, namely: language systems and their elements (at the levels of semantics, pragmatics, syntax, morphology, phonetics and phonology); language users (i.e. language communities and the social structures defined by their members); and the interaction between language, meaning and the world (in terms of semiotic landscapes and their dynamics). One fundamental concept informing linguistic research is the notion of “semantic network,” principally dating back to the work of Ross Quillian (1968) who conceived of meaning in language within a network framework, i.e. where words, meanings, or concepts are the nodes, and various semantic relationships are the links. Research in this domain has focused on linguistic relational structures (such as synonyms, antonyms, hyponyms, hypernyms, meronyms, etc.) and properties (in terms of categories, attributes and taxonomies, as manifest in phonological forms, grammatical categories, and meanings), based on a paradigm of mental lexicon as a network. In addition to semantic relations among lexicalized concepts, co-occurrence and syntactic networks have been examined (Ricard Solé et al.) in terms of directionality (directed, undirected graphs) and other network analysis components in order to determine generative potential and functional words in areas such as dependency grammar. Other innovative studies have focused on the different levels of representation (low; high) and meaning, in particular through models of cognition, and the use of polysemy arising from human metaphoric thinking and culminating in semantic extensions. George Lakoff (1987) would advance the idea of metaphors as mapping from one cognitive domain to another, where these domains are environmentally, bodily, and experientially configured. Finally, as observed by both Solé and Jinyun in relation to complex systems, co-occurrence, syntactic, and semantic networks have shown patterns of organization that resemble small-scale, small-world dynamics, with “scaling in their degree distribution, […] high clustering coefficients and short path lengths between any given pair of units […], which remarkably are the same universal features as found in most of the natural phenomena studied by complex network analysis so far.” (Solé et al. 2007: 4) As for large-scale lexical networks (semantically and syntactically relational), Jinyun notes that they, too, exhibit features shared with networks in other areas. Steyvers & Tenenbaum (2005), for example (cited in Jinyun 2007: 11), “introduce a growing network model [Barabási & Albert (1999)] which incorporates two factors in preferential attachment: new words (concepts) preferentially attach not only to existing highly connected words, but also to words with high utility in terms of word frequency […]. The higher the frequency of a word, the higher probability it will get connected.”
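A toy version of such a co-occurrence network can be built in a few lines; the three-sentence “corpus” below is invented purely for illustration (with NetworkX assumed once more), and the clustering and path-length statistics it prints only become meaningful small-world indicators on corpora of realistic size.

```python
# A toy co-occurrence network: words become nodes, and adjacency within a
# sentence becomes a link between them.
import networkx as nx

corpus = [
    "the translator reads the source text",
    "the translator writes the target text",
    "the reader reads the target text",
]

G = nx.Graph()
for sentence in corpus:
    words = sentence.split()
    G.add_edges_from(zip(words, words[1:]))    # link each word to its neighbour

print(G.number_of_nodes(), G.number_of_edges())
print(nx.average_clustering(G))                # clustering coefficient
if nx.is_connected(G):
    print(nx.average_shortest_path_length(G))  # short path lengths: a small-world hint
```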
Within the second group of networks of interest to linguists, i.e. language users in terms of social networks in linguistic communities, we find social networks analyzed from the perspectives of language change, contact, shift and maintenance. Sociolinguistics has furnished “pure” and applied linguistics with the means by which to observe, describe and likewise prescribe language use, as structured by and as a process of degrees of individual interaction with social entities, i.e. societies and communities. As in other disciplines, a network vocabulary has proved constructive and useful. For example, explanations of the dynamics of linguistic acquisition, innovation and language change would find inspiration in mathematical models of spreading/diffusion (see, e.g., pioneering work in this domain by Derek Bickerton, 1971, and William Labov, 1966). Insights into the relations between social network properties and individual and collective linguistic patterns of performance have emerged from considering local and long-distance connections, variation and regularity of language, as well as degrees of community integration, through social network analyses. Lesley Milroy (1980), for example, would conduct studies in Northern Ireland, among other places, to link social networks of relationships to linguistic vernacular norms and community identities, using a modified participant observation technique. Finally, the move from highly idealized models to more real-world network models is challenging. Corpus linguistics, benefiting from advanced computing technologies and the Internet, has sought to study language use as it is revealed in content extracted from voluminous corpora gathered as data. It has yielded promising research in lexicography, terminology, foreign language teaching and in translation studies. Evolving methods of statistical analysis not only assist in monitoring language use but also increasingly reveal the statistical properties of network growth, perhaps eventually even statistical universals in language networks. As in the social sciences, the move from small-scale to large-scale representation has drawn on new techniques and technologies. Indeed, it appears as though several models, rather than one, may be necessary in order to achieve any kind of reliable model of simulation, particularly when contemplating large-scale networks diachronically, and realistically, over time. Or perhaps, as Jinyun Ke (2007: 24, 25) suggests, what is needed is a new model that can build on small-world and scale-free networks (“the current two typical network models”) and furthermore take into account a “local-world view to grow the network.”
5. Networks in computer sciences
At roughly the same time that network theories were emerging and making their influence felt in the social sciences and other disciplines, they were likewise serving as a primary foundation in computer sciences, albeit in a different way. While the history of computing by machine goes back to ancient times (in the rudimentary form of the abacus), the science of computing only became a firmly established academic discipline in the 1960s. Computer science, whose origins are often credited to mathematician Alan Turing, currently covers the range from basic theories of computation to the most sophisticated and complex specializations of computer and information systems, human-computer interaction, natural language processing in computational linguistics, artificial intelligence, and communication networks. Communication between computer systems, known technically as computer networking, has its roots historically in the earlier models provided by telecommunications, a milestone in the history of human communication across the globe. As succinctly stated by Anton Huurdeman in The Worldwide History of Telecommunications (2003: 3-4): Telecommunication is the “technology that eliminates distance between continents, between countries, between persons”; in order for it to occur, “local, regional, national and international telecommunication networks are required.” Tracing the evolution of telecommunications from optical telegraphy – through telegraphy, telephony, radio, satellite, optical fiber – to multimedia by means of a “telecommunications tree” (2003: 7-10), Huurdeman designates the bases (roots) as science and industrialization. Without science, industrialization and technologies, and without the almost ubiquitous concrete physical infrastructures that materialized, the media and social networks that ensued and which currently constitute and configure our real and virtual worlds simply would not exist. Without networks, virtually all worldwide communication as we know it would come to a halt. For this sobering reality alone, it behooves us to understand the nature of the very tangible relations of human beings with technical networks as well as the relations which arise among human beings exclusively by virtue of the existence and mediation of networks.
So, what is a network, technically and technologically speaking? A sequence of entries in the Oxford English Dictionary provides us with a convenient starting point, for it mirrors the evolving technical history of network technologies through the series of its definitions. Characterized first as “any netlike or complex system or collection of interrelated things,” a network can connote “topographical features, lines of transportation, or telecommunication routes”; a “system of cables for the distribution of electricity”; “any system of interconnected electrical conductors or components”; a “broadcasting system consisting of a series of transmitters able to be linked together”; a “group of radio or television stations linked by such a system”; or finally a “system of interconnected computers, frequently the local area network or wide area network.” These basic definitions can be nuanced further for precision, in accordance with the distinct, yet on some levels shared, histories of telecommunication and modern data communication, of which networks are an integral, if not compulsory, part. As noted by van Dijk (2006: 46, 48), telecommunication can be “defined as a type of communication using technical media to exchange sound in the form of speech and text over (long) distances,” while data communication can be defined as a type of communication using technical media to exchange data and text in the form of computer language. In its quest for a globally-applicable definition useful for policy purposes, the International Telecommunication Union (SANCHO database) simply defines a network as “a set of nodes and links that provide connections between two or more defined points to facilitate telecommunication between them.” In sum, the telecommunication and data communication means that have defined the history of modern communication and human social relations and practices over the past century are grafted intimately onto the history of physical electric power grids and computer network infrastructures.
In the context of computers, a network – simply put – is generally considered to consist of at least three (sometimes two) computers connected together using a telecommunication system for the purpose of communicating data and sharing resources such as files, applications, printers, etc. Data may be of text (unformatted, formatted, hypertext), image (computer-generated, digitized), video (video clips, movies or films) or audio (speech, general audio) type. Data are communicated in the form of electrical signals: analog or digital. They flow, i.e. are transferred through applications, in one of five modes: simplex (in one direction only); half-duplex (in both directions, alternately); duplex (in both directions, simultaneously); broadcast (output by a single source device and received by all other devices connected to the network); and multicast (output by a single source device and received by only a specific subset of devices connected to the network). Channelled synchronously or asynchronously, data can be connected and transmitted from source to destination by communication techniques of circuit- and/or packet-switching (as well as the recently developed cell-switching), which manage the resources needed along the path. (see Fred Halsall 2005: 4-12) Circuit-switching is a feature of telecommunication while packet-switching is characteristic of data communication. The mapping of data flow, in terms of nodes and connections (the layout of wires and cables), determines network topologies. Nodes are the “processing locations,” typified by devices such as hubs, switches, and routers, or stopping and end-points such as gateways and hosts. Five basic network topologies, i.e. the physical or logical layouts of connected network devices – bus, ring, star, tree and mesh – undergird different types of area networks. The two main types, local area networks (LANs) and wide area networks (WANs), have yielded others, based on evolving technologies: metropolitan area network (MAN), wireless local area network (WLAN), campus area network (CAN), personal area network (PAN), home area network (HAN), etc. These area networks operate privately, publicly, or semi-publicly, and are local, national, transnational and international or global in scope. The two most basic types of network architectures are client/server (centralized) and peer-to-peer (decentralized and highly distributed), or increasingly, a hybrid of both. At the most basic structural level, network communications transpire externally and internally through the standard OSI (open systems interconnection) model of seven layers: application, presentation, session, transport, network, data link and physical, or the TCP/IP Internet reference model of four (sometimes five) layers: application (program/process), transport (host-to-host), network and link + physical layers. The first model, maintained by ISO, is an abstract model of how network protocols and devices should communicate and “interoperate.” Different networks linked to the network “fill in” the layers differently with content. For all but the layer of applications, however, network operators are increasingly supportive of open standards and protocols in order to ensure successful communication within and among all the layers. (van Dijk 2006: 50) Fixed, or physically wired/cabled, networks exist alongside the more recent wireless and cellular networks.
The latter depend on radio transmission to facilitate exchange of information and communication, relying on other kinds of specialized protocols, standards and access points to communicate between fixed and portable devices. Forms of wireless communication currently include mobile phone, short (text) message service (SMS), multimedia message service (MMS) and wireless fidelity (WiFi).
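By way of a minimal sketch – assuming the open-source Python library networkx and arbitrarily chosen sizes – the five basic topologies named above can be represented directly as graphs of nodes (devices) and edges (links):

```python
# Illustrative sketch: the five basic network topologies as graphs of
# nodes (devices) and edges (links), built with networkx. Sizes are arbitrary.
import networkx as nx

topologies = {
    "bus":  nx.path_graph(6),        # devices strung along a single backbone
    "ring": nx.cycle_graph(6),       # each device linked to its two neighbours
    "star": nx.star_graph(5),        # one central hub connected to every device
    "tree": nx.balanced_tree(2, 3),  # hierarchical branching from a root node
    "mesh": nx.complete_graph(6),    # every device linked to every other device
}

for name, graph in topologies.items():
    print(f"{name:>4}: {graph.number_of_nodes()} nodes, {graph.number_of_edges()} links")
```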
The largest network on the planet is the Internet, comprised of many thousands of physically interconnected computer-access networks able to communicate data based on shared standards and protocols (TCP/IP, the transmission control protocol and Internet protocol) by packet-switching through wires, cables and wireless connections. The World Wide Web (WWW), one service accessible through the Internet, was developed at CERN (Conseil européen pour la recherche nucléaire) by Tim Berners-Lee between 1989 and 1991. It is comprised of interconnected documents and resources linked by uniform resource locators (URLs) and hyperlinks, transmitted through connecting Web servers and client computers. The protocol defining the contents of a document (Web page), which in itself can include many types of objects (files), is the hypertext transfer protocol (HTTP). The content is displayed through the graphical user interface (GUI) of a Web browser, in accordance with the hypertext mark-up language (HTML) instruction tags embedded within the document. The Web also supports content created according to eXtensible markup language (XML) specifications, a special type of standard generalized markup language (SGML) which allows for customizable tags able to not only specify formatting but also, like metadata, define data types. XML tags are designed to enable data to be transmitted, validated and interpreted more smoothly during communication between diverse applications and systems. Other Internet services include email and attachments, file sharing through file transfer protocol (FTP), peer-to-peer (P2P) file sharing, streaming video, instant messaging (IM), voice over IP (VoIP), remote access, Internet commerce, etc. The Web has continued to evolve by virtue of the increasingly more sophisticated information, knowledge, communication and Web technologies nourishing it. Versions subsequent to the first WWW include the Semantic Web, “Web 2.0” and “Web 3.0,” with a more precise focus on user-generated content, and eventually on data as knowledge, correlated and queried in meaningful linkages rather than just the results of combinatorial searches or hyperlinked pieces of information published online as we know the Web today. Under the leadership of Berners-Lee and the World Wide Web Consortium (W3C), the next-generation Web architecture known as the Semantic Web is designed to deploy new open markup languages and technologies, including the resource description framework (RDF) and Web ontology language (OWL), with the goal of providing “a knowledge representation of linked data in order to allow machine processing on a global scale” and “the best framework for adding logic, inference, and rule systems to the Web.” Despite mathematical and artificial intelligence complications, Semantic Web “agents” (“pieces of software that work autonomously and proactively”) will try to “utilize metadata, ontologies, and logic to carry out its tasks.” (H. Peter Alesso and Craig F. Smith 2006: xvii-xviii, 14). Connectivity and communication through networks of computer systems linked together and to the Internet, likewise, are increasingly being rendered more complex by faster routers, higher transmission speeds, evolving Internet service models, and issues of security.
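Purely for illustration, and relying only on Python’s standard library, the following sketch reproduces the client side of the exchange described above: an HTTP request (application layer) carried over a TCP connection (transport layer) to a Web server, which returns headers followed by an HTML document; the host example.com is merely a placeholder.

```python
# Illustrative sketch of a Web client request: an HTTP GET sent over a TCP
# connection using Python's standard library. "example.com" is a placeholder.
import socket

HOST, PORT = "example.com", 80  # Web servers conventionally listen on TCP port 80

request = (
    "GET / HTTP/1.1\r\n"        # application layer: the hypertext transfer protocol
    f"Host: {HOST}\r\n"
    "Connection: close\r\n"
    "\r\n"
)

with socket.create_connection((HOST, PORT)) as sock:  # transport/network layers (TCP/IP)
    sock.sendall(request.encode("ascii"))
    response = b""
    while True:
        chunk = sock.recv(4096)  # read until the server closes the connection
        if not chunk:
            break
        response += chunk

# The response starts with HTTP headers; the HTML document that a browser
# would render according to its markup tags follows the blank line.
print(response.split(b"\r\n\r\n", 1)[0].decode("ascii", errors="replace"))
```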
“The explosion of blogs, vlogs, podding, streaming, and other forms of interactive, computer to computer communications sets up a new system of global, horizontal communication networks that, for the first time in history, allow people to communicate with each other without going through the channels set up by the institutions of society for socialized communication.” (Castells 2006: 13) In sum, physical connectivity and the structuring of networks are becoming dynamically ever-closer allies.
As recent research indicates, the physical roles of software, hardware, networking and communication transmission channels in directing the formation of organizational and social structures should not be underestimated. But how exactly are these roles and structures mediated? The growing academic field of Internet studies has only recently begun to systematically address the myriad issues arising from the interaction between the Internet (in particular the World Wide Web) and societal organizing and social practices. Unlike earlier innovative telecommunications models that had a striking impact on the social realm, the nature of the Web clearly has ushered in the need to reconceptualize human relations to technologies at all levels. Peeling away the layers of the network infrastructure with the goal of visualizing the separate, distinct structures and forms constituting it, in the words of media and communications scholars Robert Burnett and P. David Marshall (2003: 2), helps us to “understand the genuine complexity of this new medley of technologies and to anticipate how decisions and policies concerning its potential benefits and dangers could affect its future values to society.” This is no easy task, however, as information and communication technologies (ICTs) intertwine, and even merge, ever so tightly with multi-media forms. Tracing the historical linkages and interplay between cultural forms and technology, Andrew Murphie and John Potts (2003: 3-4) note that regular usage of the term “technology” (Greek tekhne, “art or craft,” and logos, “word,” “system,” “study”) emerged only in the second half of the 19th century, “along with other terms like the ‘Industrial Revolution,’ to describe the radical restructuring of Western societies as a result of industrial processes.” Over time, the term has acquired meanings that are both abstract (knowledge, processes, goals) and concrete (products, artefacts) in form, reflecting and shaped by values, ideologies, political, economic and ethical concerns. The exponential proliferation of meanings generated through amalgamating technologies and cultural forms and practices, contingent on myriad multimedia, within networked Internet space-time provokes far-reaching questions that now resonate fully with our contemporary human experience. The new terminology regularly coined with which to depict and represent the restructuring of human societies globally as a result of 20th century technological innovations reflects unprecedented speed and links to the economy, to wit, in little more than a decade: digital revolution; information revolution; digital economy; information economy; knowledge economy; service economy; global economy; Internet economy; virtual economy; information society; knowledge society; to name the most salient ones.
To the amassing strata of analyses and discussions on technologies and social forms – of which culture is a core component – clings the prickly question as to what degree technology actually functions as a structuring agent. While the premise of technology neutrality is no longer widely upheld, the extent to which its effects can be qualified as technological determinism or instrumentalism varies. Some scholars, such as Andrew Feenberg (1999: ix), call for greater logical coherence, arguing that the methodological dualism of technique and meaning that pits technical disciplines (“constituted around devices conceived as essentially functional and therefore as essentially oriented toward efficiency”) against humanistic disciplines (which examine meaning in relation to practice in human beings and societies), is not only of little use, but also – by reducing technology to mere functions and raw materials – does not adequately reflect the realities of our times. Situating technologies more profoundly in the social, cultural and political contexts in which they are received and embedded constitutes a growing body of critical literature in the domain of sociology of technology. The technology-society relationship, particularly in relation to the Web, reveals itself as diverse and as heterogeneous, informed and nuanced by the multifarious relations that comprise human life on all levels. Interrogating this relationship has opened up within academia new domains of critical inquiry, among which we find analyses of the relations between:
different institutional and social networks and the cultural commodities that Web space-time has engendered or facilitated;
global(izing) economies and commerce, in and by means of the Web (B2B; B2C);
regulating, standardizing, open and proprietary entities with regard to Internet territory, and relations to Internet governance;
conventional and Web-informed notions of information and knowledge, and subsequent relations to database structures;
traditional and Web network constructs of identity; and subsequent relations to digital/online identity management, “Identity 1.0” and “Identity 2.0”;
conventional law, legal traditions and Web law, and subsequent relations to intellectual property;
“culture” and cyberculture;
media conglomerations and the entertainment industry (including online games; massively multi-user online role-playing games, i.e. MMORPGs, and virtual worlds);
forms of political and social activism;
online and offline communities;
linguistics and discourse theories and Web-enabled means of communication;
globalization and global knowledge management;
globalizing and localizing processes and practices;
multilingualism;
human and machine translation, manifest in nearly every domain of social, cultural, economic and political practice, both on- and off-line.
Clearly, research into the global implications for societies shaped by technologies has burgeoned and deepened analytically from the initial studies on social aspects of the Internet in computer-mediated communication which first appeared in the mid- to late-1990s. Furthermore, if we consider that the principal agent for these technology-induced transformations in society is the Internet (and World Wide Web), in essence the network of networks, then there is reason to believe that studies on human interfacing with machine and technologies will need to devote serious methodological and analytical attention to the notion of network as paradigm, both as a concept and in practice.
6. Network society: metaphor? ideology? paradigm?
Network terms and topologies are serving as fundamental concepts and models used by scholars across many disciplines, attempting to express, simply put, a kind of contemporary worldwide readjusting of societal relations and practices in response to the quickly escalating and pervasive presence of technologies. As with all contemporary historical moments viewed under the microscopic lens of the present, the challenge lies in balancing description with sufficient critical analysis. One prominent instance for network studies is in the introduction of the term/concept “network society.” On the surface, the term seems to intuitively correspond to the formation of a new paradigm with which to conceptualize society in its global and local configurations. However, does it meet the test of a tool able to effectively analyze and critique society, social relations and practices? While network approaches and trends in mathematics, physics and computer sciences, for example, might seek to focus on structures to explain, elucidate and model, their network-concept counterparts in the social sciences must be assessed additionally in terms of their capacities to critique and, perhaps, to act, transform and improve. As such, they need to embrace difficult and complex issues of power, authority, politics, ideology, values, morality, ethics, justice, etc.
The now widespread term “network society,” in particular as formulated by sociologist Manuel Castells, attempts to position “network/s” contemporarily within a sociological context. While refraining from using much of the standard terminology of traditional network theories, Castells nonetheless has relied on mathematically, sociologically and technologically informed network concepts in his definitions of network and network society. His famous trilogy, published between 1996 and 1998, perspicaciously examines – through empirical research – a wide range of diverse networks in human societies. To what extent can or should the concept of network society be adopted? Aware of the potential limits of using such a flexible term, he continues to refine aspects of his definition in light of evolving technologies, mounting empirical evidence, and critical debate. As succinctly articulated in his 2004 essay, for example, network and network society are defined together as follows:
A network society is a society whose social structure is made of networks powered by microelectronics-based information and communication technologies. By social structure, I understand the organizational arrangements of humans in relations of production, consumption, reproduction, experience and power expressed in meaningful communication coded by culture. A network is a set of interconnected nodes. A node is the point where the curve intersects itself. A network has no center, just nodes. Nodes may be of varying relevance for the network. Nodes increase their importance for the network by absorbing more relevant information, and processing it more efficiently. The relative importance of a node does not stem from its specific features but from its ability to contribute to the network’s goals. However, all nodes of a network are necessary for the network’s performance. When nodes become redundant or useless, networks tend to reconfigure themselves, deleting some nodes, and adding new ones. Nodes only exist and function as components of networks. The network is the unit, not the node.
2004: 3
In 2006, Castells furthermore underscores the important role played by the pervading network logic itself as it becomes increasingly inscribed in network society structures and relations yet resonates with a basic organizational form already present in early human social worlds, i.e. networks:
The network society, in the simplest terms, is a social structure based on networks operated by information and communication technologies based in microelectronics and digital computer networks that generate, process, and distribute information on the basis of the knowledge accumulated in the nodes of the networks. A network is a formal structure (Monge and Contractor 2004). It is a system of interconnected nodes. Nodes are, formally speaking, the points where the curve intersects itself. Networks are open structures that evolve by adding or removing nodes according to the changing requirements of the programs that assign performance goals to the networks. Naturally, these programs are decided socially from outside the network. But once they are inscripted in the logic of the network, the network will follow efficiently these instructions, adding, deleting, and reconfigurating, until a new program replaces or modifies the codes that command its operational system.
2006: 7
The network society, as conceptualized by Castells, is complex and continually expanding and synthesizing in definition as new technologies embed themselves within social relations and practices (expanding the contours) and certain features amalgamate and consolidate. Felix Stalder (2006: 180) explicates Castells’ expression of the network society by reformulating the language somewhat, underscoring that a network’s nodes and connections inter-create themselves and inter-define one another without formalized authority, coordinating themselves on the basis of common or shared protocols, values, goals and projects. Flexibility, adaptability, and shared projects are not to be underestimated. The network society, emerging from the interactions of networks based on a technological paradigm of “electronic informational-communicationalism” (Castells 2004: 9), is characterized in particular by large-scale organizational flexibility, scalability, and coordination that continuously interdefine themselves, through a common frame of reference of shared protocols, values, goals and projects, for the network society’s survival. (Stalder 2006: 183)
Nonetheless, Castells is careful to ground the concept by reiterating that the network society arose as an “accidental coincidence in the 1970s of three independent processes: 1) the crisis and restructuring of industrialism and its two associated modes of production, capitalism and statism; 2) the freedom-oriented cultural social movements (i.e. oriented toward a transformation of the values of society) of the late 1960s and early 1970s; and 3) the revolution in information and communication technologies.” (Castells 2004: 15; 19) In other words, its emergence was neither a foregone conclusion to the introduction of computer, information and communication technologies in human societies (thus not a result of technological determinism) nor a logical phase within any notion of a grand narrative of human progress. The network society is grounded in historical antecedents, but is an entity of fortuitous (serendipitous?) birth whose growth reflects the manifestation of the continual dynamic interplay between human beings and their post-1980s technological environment. It is global, and arguably local as well. Its salient force is that even while much of the world’s population may not be aware of its concrete effects locally, what communities experience locally is in part the result of its concrete effects globally. It is the representation of the fundamental structural change that has permeated the institutions, forces and powers driving the essential dynamics of our human existence, individually and collectively. As such, the creation and continued presence of the network society is synchronized intimately with globalization and the technology-enabled circulation of goods, services, capital and people across borders, with all the contradictions and paradoxes this implies. For Castells (2006), the organizational and reorganizational capacities of network logic clearly yield project-inspired networks that depend on performance as one logic constituting connectivity.
The concept of “project” within networks is also fundamental for Boltanski and Chiapello in their presentation and development of the notion “projective city” (“cité par projets,” 1999 French/2005 English). “The project,” note Boltanski and Chiapello (2005: 104), becomes “the occasion and reason for the connection.”
It temporarily assembles a very disparate group of people, and presents itself as a highly activated section of network for a period of time that is relatively short, but allows for the construction of more enduring links that will be put on hold while remaining available. Projects make production and accumulation possible in a world which, were it to be purely connexionist, would simply contain flows, where nothing could be stabilized, accumulated or crystallized.
Boltanski and Chiapello 2005: 104-105
As such, the “projective city” becomes a mental reality that is shared and recognized by members of society, hence it justifies actions. It reflects and defines a mode of organization that increasingly defines society at large:
[…] [W]e have chosen to call the new apparatus of justification that seems to us to be being formed the “projective city.” It is in fact modelled on a term that frequently crops up in management literature: project organization. This refers to a firm whose structure comprises a multiplicity of projects associating a variety of people, some of whom participate in several projects. Since the very nature of this type of project is to have a beginning and an end, projects succeed and take over from one another, reconstructing work groups or teams in accordance with priorities or needs. By analogy, we shall refer to a social structure in project form or a general organization of society in project form.
Boltanski and Chiapello 2005: 105
As we have seen, from the emergence of graph theory to the surfacing of the very concept “network society,” network has become an increasingly polysemic term referring to different, and at times opposing, research paradigms (as with SNA vs. ANT). Even while, very broadly speaking, the notion appeared as a way to move away from social determinism and to favour explanations based on the relations between entities rather than their substance, diversity of interpretation and extensions of the term could also be felt in the early stages: some research highlighted the normative character of networks whereas other research stressed their dynamic character and creative potential. “Network” was used as a representation of small-scale relations, but also intuitively felt as potentially applicable to society at large. The term “network society” clearly bears witness to this. From the very beginning, network research has been characteristically ambiguous as to the degree to which networks constitute an authentic way of thinking about the world. As used in the social sciences, network conjures up its own complex relation to structuralism. So, while clearly not a recent invention, nowadays the term undoubtedly evokes a frame of mind that resonates acutely with computer, information, and communication technologies, in particular the Internet, materializing in an epithet by which the contemporary world likes to refer to itself. Yet, as Boltanski and Chiapello (1999: 218/2005) suggest, the current popularity of the metaphor results as much from its a priori suitability to new communication channels as from a celebration of the network society. The increasing popularity of this epithet obviously involves some dangers. On the one hand, it could eventually function as rhetorical discourse. On the other hand, it could cause us to lose sight of the distinctive features of the term, thereby drowning out its meaning and bringing it back to a purely metaphorical sense, where network becomes another synonym for evoking any kind of connection, transport or… translation. This could lead to a risk of fetishizing the term itself and of naturalizing the ideas that underlie it – which would be paradoxical and ironic given the usual post-structuralist assumptions underlying the “network philosophy.” Consequently, even the most active proponents of network theoretical paradigms now tend to be wary of the notion and its use. In his conclusion to The Network Society, a handbook aimed primarily at disseminating Castells’ ideas to a broader public, Darin Barney stresses the ideological dimension underlying the network philosophy:
The ‘Network Society’ is not just a descriptive name. It is also an elaborate discourse that, in purporting simply to describe the set of contemporary social dynamics, provides a script that sets out roles, norms, expectations and the terms of dialogue. Thinking through the model of the network – nodes, ties, flows – certainly helps us to understand a great deal about, for example, the restructuring of capitalist enterprise and work, the disaggregation of state sovereignty, the rise and operation of new social movements, and emerging practices of community and identity formation. But when an idea such as this is elevated from heuristic device to the status of an all-encompassing social and historical fact, its function shifts significantly. As an alleged fact, the Network Society becomes the standard for what is normal, desirable, and for what we can reasonably expect.
Barney 2004: 179-180
Finally, when networks become automatically and simply associated with technology, one runs the risk of falling into some kind of “techno-utopie cyberspatiale” (a cyberspatial techno-utopia), to use Musso’s words (2003: 230), where the dynamics of human interactions are thought to be not only regulated by but analogous to those of technological networks – just another illusion. “Network ideology” has its limits and dangers as well, as thoroughly explained by Musso (2003) and Boltanski and Chiapello (1999/2005). In Boltanski’s and Chiapello’s now famous work, the proliferation of network analyses across the disciplines since the seventies is associated with the rise of “the new spirit of capitalism” wherein social network analysts have adopted either a historicist attitude (assuming that network analyses are especially suitable for describing contemporary societies) or a naturalistic one (assuming that analyses aiming to describe historical changes take account solely of the structure of networks). In both cases, the authors note, the same mistake has been made, i.e. that of believing that “states of things and modes of description can be treated independently of the normative positions from which a value judgement on events can be made.” (Boltanski and Chiapello 2005: 150-151) These justified criticisms do not mean that the term “network” now ought to be banned, only that one must remain conscious of its performative dimension and not use it without positioning oneself with respect to the multitude of possible interpretations. As with the myriad coined terms that preceded the term network society, used to refer to and represent the degrees of fusion between ICTs, computers and humans, critical definitions and techniques will be instrumental in trying to discern whether or not, and to what extent, the terms network and network society are useful conceptual mechanisms through which to understand society, communication, and ultimately for us…translation.
7. Recent breakthroughs and directions in network studies
The recent upsurge in network studies research is exponential, and is linked, in large degree, to the massive increase in computing power and network infrastructures, to escalating consumer access, and to the voluminous rise in data facilitated by propagating computer and Internet technologies. The empirical results ensuing from systematic observations of behaviours on large networks such as the Internet at times confirm existing hypotheses on network activities and at other times furnish researchers with entirely unexpected directions to pursue. The end of the millennium proved to be a turning point for certain areas of network studies. In 1998, sociologist Duncan Watts and mathematician Steven Strogatz introduced the small-world model for social networks, and proposed a measure of clustering known as the clustering coefficient to determine whether or not a graph is a small-world network (NBW 2006: 229). Small-world networks typically have a high degree of clustering and a small average distance. In 1999, physicists Albert-László Barabási and Réka Albert introduced scale-free network models, and explored properties and models of power-law degree distribution (including the World Wide Web) in terms of their dynamic growth, preferential attachment and representation of real-world phenomena (NBW 2006: 335). Scale-free networks typically have a power-law degree distribution, and two essential elements: growth and preferential attachment. One example of documented scale-free social networks is the co-authorship network. As noted by Claire Christensen and Réka Albert (2006: 14), the scale-free network model would “lead to a frenzy of activity in developing evolving graph models that would not only capture the scale-free nature of real networks, but that would also come closer to predicting more realistic clustering coefficients and distances.” Also in 1999, Albert, Jeong and Barabási would test real-world network models against conventional random graph models through studies which designated Web pages as vertices and hyperlinks as edges, problematizing their dynamic aspect by attempting to quantify dynamic Web pages that materialize upon individual user requests. Still other studies sought to map out the graph structure of the World Wide Web through degree distribution frequency, to try to answer the question: What is the Internet really? What kind of network? Beyond the WWW, and in terms of the physical Internet, the Faloutsos brothers (1999) would explore three levels of networking – (1) computers as vertices and physical inter-computer connections as edges; (2) routers as vertices and inter-router cables as edges; and (3) groups of computers, or autonomous systems, as vertices and computers with one direct data connection to another as edges – to preliminarily conclude that the Internet is a scale-free network with a few hub nodes linked to many other nodes which are not exactly power-law in form (NBW 2006: 195).
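As a minimal sketch only – the parameter values are arbitrary and the open-source Python library networkx is assumed – the two model families described above can be generated and compared on the very properties that characterize them:

```python
# Illustrative sketch: the two "typical" network models discussed above,
# generated with networkx. Parameter values are arbitrary.
import networkx as nx

# Watts-Strogatz small-world model: a ring lattice with a fraction p of edges
# rewired at random, yielding high clustering and short average distances.
ws = nx.connected_watts_strogatz_graph(n=1000, k=10, p=0.1)

# Barabasi-Albert scale-free model: growth plus preferential attachment,
# yielding a power-law degree distribution dominated by a few hubs.
ba = nx.barabasi_albert_graph(n=1000, m=5)

for name, g in (("small-world", ws), ("scale-free", ba)):
    print(name,
          "| clustering:", round(nx.average_clustering(g), 3),
          "| average distance:", round(nx.average_shortest_path_length(g), 2),
          "| maximum degree:", max(d for _, d in g.degree()))
```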
Before the late 1990s, the cross-pollination that did occur among the specialized disciplines of mathematics and the social sciences yielded, and functioned by means of, a working vocabulary that was still beholden to conventional graph theories and social network analyses. These representational and analytical tools, succinctly presented in Hanneman and Izquierdo (2006: 3-29), include the following (a brief computational sketch follows the list):
defining networks as a set of actors (or agents) that have relationships (or links);
representing relations among actors as simplex (single type) or multiplex (more than one kind);
expressing relations (or ties) as directed (originating in a source actor and reaching a target actor) or undirected (co-occurrence, co-presence or bonded ties between pairs of actors);
weighting the strengths of ties as binary, signed, ordinal or numerically valued (representing varying qualities and degrees of tie presence or absence);
representing social network data by either drawing or using matrices;
determining the properties of the networks and actors, through: size (number of nodes or number of edges in the network); density (the number of ties in the network expressed as a proportion of the number of all possible ties); degree of actors (in terms of incoming [in-degree] or outgoing [out-degree] links, to measure roles and influence); distance (how far an actor is from another and how deeply embedded individuals are); walks; cycles; trails; paths; eccentricity of actors; diameter; reachability (if there exists a set of connections by which we can go from the source to the target actor regardless of how many others are in between); connectivity (property of a network, not of its individual actors, that extends the concept of adjacency); reciprocity; transitivity (displays of balance type); cliques; clustering coefficient (the ratio of existing links connecting the node’s neighbors to each other, to the maximum possible number of such links); centrality (the measure of a network node’s structural importance, based on degree, closeness, betweenness); and equivalence.
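Purely as an illustrative sketch – again assuming the open-source networkx library, and using Zachary’s karate club graph, a classic small social network bundled with that library – several of the properties listed above can be computed directly:

```python
# Illustrative sketch: conventional SNA measures computed with networkx on
# Zachary's karate club, a classic 34-actor social network shipped with the library.
import networkx as nx

g = nx.karate_club_graph()

print("size (actors, ties):", g.number_of_nodes(), g.number_of_edges())
print("density:", round(nx.density(g), 3))               # ties as a share of all possible ties
print("diameter:", nx.diameter(g))                        # longest of the shortest paths
print("average clustering coefficient:", round(nx.average_clustering(g), 3))

# Centrality: an actor's structural importance by degree, closeness and betweenness.
degree = nx.degree_centrality(g)
closeness = nx.closeness_centrality(g)
betweenness = nx.betweenness_centrality(g)
hub = max(degree, key=degree.get)
print("most central actor (by degree):", hub,
      "| closeness:", round(closeness[hub], 3),
      "| betweenness:", round(betweenness[hub], 3))
```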
After 1999, however, the Internet and World Wide Web proved to be an authentic and fertile testing ground for network theories. Given the sheer size of the Internet and its millions of potential nodes, researchers could no longer exclusively apply random graph models and a conventional network-property vocabulary to the programming of Internet topology generators (i.e. algorithms generating networks whose structure is expected to resemble the Internet’s real topology). In the interest of trying to uncover the features of real-world networks, and to generate more accurate models for designing simulations and more efficient protocols with which to capitalize on the Internet’s real properties, new network topologies began to emerge. They increasingly demonstrated that topological organization had a significant impact on the dynamical behaviour of complex systems (Christensen and Albert 2006: 16). Complex systems research, cross-disciplinary in nature and spanning the boundaries of biology, chemistry, physics, social sciences, economics, information sciences, technologies, computer and cognitive sciences, is likely to continue having a significant impact on the models and paradigms used to describe, explain and predict real-world phenomena. While traditional science has mandated that methodologies and explanations of scientific phenomena be as objective and value-free as possible, complex systems dynamics have revealed the presence of motivation and purpose, leading to the challenge of how to account for the values and ideologies inscribed therein. At the same time, the fine-tuning of the “image” of connectivity structures and patterns in these systems has revealed certain similarities in organizational principles and emergence across various research disciplines. As noted by Solé et al. (2006), universal patterns seem to be emerging in complex dynamical system structure and evolution in general, be they ecological webs, software maps, genomes, brain networks or Internet architectures. These patterns, characteristically both efficient and fragile, are revealed in the “statistical properties of networks” and the “general laws that all complex networks abide by, independently of the nature of the elements or their interactions.” (Solé et al. 2006: 3)
Two main features seem to be shared by most complex networks, both natural and artificial. The first is their small world structure. […] The second is less obvious, but not less important: these webs are extremely heterogeneous: most elements are connected to one or two other elements and only a handful of them have a very large number of links. These hubs are the key components of web complexity. They support high efficiency of network traversal but are for the same reason their Achilles heel. Their loss or failure has very negative consequences for system performance, sometimes even promoting a system’s collapse.
Solé et al. 2006: 3
Language itself is also considered to be a functioning complex dynamical system:
It exhibits highly intricate network structures at all levels (phonetics, lexical, syntactic, semantic) and this structure is to some extent shaped and reshaped by millions of language users over long periods of time, as they adapt and change them to their needs as part of ongoing local interactions. The cognitive structures needed to produce, parse, and interpret language take the form of highly complex cognitive networks as well, maintained by each individual and aligned and coordinated as a side effect of local interactions. These cognitive structures are embodied in brain networks which exhibit themselves non-trivial topological patterns. All these types of networks have their own constraints and interact with the others generating a dynamical process that leads language to be as we find it in nature.
Solé et al. 2006: 3
Finally, complex networks are not only the organizational result of top-down decisions in design processes, as for example in software architecture; they are also the resulting organizational dynamics that emerge in the interplay of community goals, such as the self-organization and hierarchy found in Open Source social networks (Valverde and Solé 2007: 1).
Clearly, there is a palpable need for shared research and project collaboration: perhaps a shared goal that can connect researchers within a network to unravel the intricacies of networks? The challenge lies first in understanding and “translating” working vocabularies that confront one another and are being re-shaped as the research conducted and questions asked within different disciplines come into contact with one another on similar issues. Isolated or non-connected clusters of network researchers focus on networks within the scope of their specific domains, such that social network theorists, network scientists, and network society scholars, for example, do not engage one another in more sustained dialogue, thus instigating a multitude of answers to the question of what it means to theorize networks. Furthermore, as maintained by researchers such as Monge and Contractor (2003: xii), many network research hypotheses have depended on single-theory approaches at a single level of analysis, while it is increasingly evident that a multi-theoretical and multilevel perspective is in fact necessary and more useful. Grounded in decades of research on communication and organizational networks, they propose, for example, an “agent-based modeling framework” from within a “complex adaptive systems perspective” that could contribute to enlightening scientific research on emergent system properties such as complexity, chaos, catastrophe, and co-evolutionary dynamics. Research and development of agent-based computer systems (including multi-agent systems) itself draws on a wide range of domains, most predominantly complex systems, logic, game theory and economics.
During a recent conference on new network theories held at the University of Amsterdam in June 2007, Noshir Contractor discussed his use of digital harvesting (text mining tools, Web crawling, and Web of Science citation data) to mine data and relational metadata in order to ascertain, much in the way of a “statistical MRI” (magnetic resonance imaging), the internal picture of structural signatures etched on networks. Based on the hypothesis that social network theories of all types (including self-interest; social and resource exchange; mutual interest and collective action; contagion; balance; homophily; proximity and co-evolution) are linked to specific structural signatures on the networks, Contractor argues that these structures can reveal which motivations create and sustain networks, and how certain motivations tend to be linked to specific network types. The digital structural imprints, which materialize within combinations of human and non-human interactions and perceptions, can only be captured through multi-dimensional, multi-theoretical perspectives that resist unification under any single theoretical approach. Other recent approaches stress the need to analyze not only traditional formal organizational structures, but also the considerable impact of emergent unplanned communication linkages spanning networks of relations across organizations. In fact, mobile communication technologies, and the emerging nomadic technology devices which depend on them, may eventually contribute significantly to unexpected organizational “structures” and communication linkages: those of social networks “materializing” in interconnected physical and virtual spaces connected solely by mobility. Perhaps, as Joseph Badaracco (1991, cited in Monge and Contractor 2003: 32-33) has advanced in his distinction between migratory knowledge and embedded knowledge, we will need to take movement itself more deeply into consideration during our analyses of networks. In sum: it would seem that the appearance of order out of randomness in nature and society, and the myriad types of heterogeneous relations among humans this implies, are multifaceted and dynamic, and cannot be straitjacketed into scientific approaches confined to static or top-down frameworks. Contemporary and upcoming computer, communication and Internet technologies seem to be capable of measuring these network dynamics to some degree, and this fact alone has jump-started a substantial body of current network research. Some, like researchers at the Institute of Network Cultures in Amsterdam, are already “rethinking” network theory in this light. Rapidly evolving computer, information, and communication technologies have equipped scientists, social scientists, and mathematicians alike with the means and infrastructures to pursue their studies of networks in spaces of intellectual inquiry that now converge inter-disciplinarily, most notably with a focus on the dynamic nature and complexity of networks as they emerge in real-time, real-world environments. Most network studies scholars would concur that the last few years have indeed witnessed a noticeable and decisive turn in this multi-inter-disciplinary field. Yet, translation studies does not yet seem to be visible on the playing field.
8. Connecting translation and network studies
Translation studies and network studies, both in theory and practice, seem to encounter certain similar challenges. At the fore we find the ever-present need to balance and reconcile the breadth and scope of basic general concepts in the field with the many diverse details and variables of local circumstances and case studies. Both domains, especially in light of current technologies, have needed to synchronize the general/global/universal with the local/particular, leading some to wonder if the abstraction of details to such a degree in a field’s quest for a theory can ultimately be useful as a theory in practice. This balancing act links to the need to succinctly yet adequately define the object of disciplinary inquiry: concretely, what is a “network”? what are the features of networks? what is “translation”? what are the features of translation? Does the rising importance and visibility of translation (human and machine) in network society, for example, allow us to consider networks and translation as more epistemological than metaphorical at this point? Attempting to answer these questions inevitably uncovers along the way fundamental links with other areas. As such, both translation and network studies have needed to import concepts and analytical tools from interfacing disciplines, all the while maintaining their own configurations as separate disciplines in their own right. At the same time, the nature of both translation studies and network studies has required coordinating seemingly contradictory impulses: a push in the direction of stability/fixedness and a pull in the direction of flexibility/flux. In concert with the trends that have informed other disciplines throughout the past century, and in response to their own inclinations, both translation studies and network studies have emerged as authentic sites of adaptation, adjusting to evolve in accordance with the variables, local and global, that arise.
Through this process of adaptation one may have the impression that the primary object of study has been lost sight of. In that respect, both translation studies and network studies face a similar challenge and peril, one of dealing with a key word that is becoming increasingly popular and which lends itself well to metaphorical usage. The danger, at this juncture, is of losing oneself within metaphorical language. This danger has been amply discussed with regard to network studies, but it also applies to translation studies. Our field gradually developed and gained its autonomy by distinguishing its specificity from the domains out of which it emerged – linguistics and literature –, while borrowing from an increasing number of neighbouring disciplines ranging from cultural theory to history. According to some (Singh 2004), this protracted borrowing could lead translation studies to eventually lose sight of its core objective, producing knowledge about translation rather than acknowledgement of translation. There is some truth to this observation. Yet, one could ask whether the dichotomy should not be minimized rather than emphasized. Although some research that comes under the label of DTS is only very indirectly concerned with the mechanics of interlingual transfer, translators do not work in a vacuum but, rather, in/for a society of which they are part, and translation researchers are also, generally speaking, translation educators and trainers. Understanding the process of translation, in the strictest sense, also implies, then, recording and analyzing the relationship between translators and their work environment, in as much as these relations do not fall “outside” the process but instead contribute to its very shaping. Investigating this relationship between the process and the context of interlingual transfer is what the ten papers of this volume strive to do. The range of issues addressed can briefly be summarized as follows:
Professional translators can now instantaneously carry out exchanges online while working, for example through Internet distribution lists and instant messaging. How do these new resources impact upon and modify translation problem-solving strategies? What role do they play in the process of interlinguistic transfer?
A growing number of translator humanitarian associations have recently developed worldwide. What are the overt and hidden agendas of these organizations? Are they all alike? And what do they say about the ethical dilemma of translation professionals?
As a service, translation requires trust. How is this trust “managed” in an economy based on and promoting out-sourcing, efficiency and flexibility? How does this mode of production undermine the translator’s socioeconomic status?
In the publishing industry, co-publication is now about to become the norm. How does this development change the dynamics of international literary exchanges, i.e. the way texts are selected for translation and the way they are translated/adapted for various markets at the same time?
Over the past ten years, the polysystemic model has been regularly criticized, and its limits – such as its excessive rigidity and determinism – are now clearly established. Translation scholars, even those who were the most actively involved in the development of this model, now recognize the need for more flexible and agent-based methodologies. Could SNA be of any help in that respect?
The growth of translation studies, in Canada at least, is closely related to that of university translator training programmes, whose development was triggered by an increasing demand in translation services (mostly from the government). Precisely how are these entities linked? What relationships do translation professionals have with translator trainers and translation researchers? What do they share? How, how much and for what purposes do they connect?
Like translation studies, terminology is a relatively new academic field. How did the main ideas and research paradigms now accepted as part and parcel of this field actually develop and spread? How can citation analysis help us to explore this question?
According to Antoine Berman, translation generally fails to render adequately the “underlying signifying web” of a literary piece; i.e. the network of key terms that repeatedly recur throughout the text, forming a “chain” that gives the text its own rhythm and meaning. Is this really the case? Would it be possible to approach a literary text in such a way as to unveil this network and recreate it?
Several remarks can be drawn from the above. First, we must acknowledge that, although our call for contributions was open and the number of submissions received significant, very few were centered on semantic/lexical networks or adopted a linguistics-based theoretical framework approach. There does exist a body of research in translation studies that makes use of semantic/lexical network analysis, but this approach is not well represented in this volume; the bias is not intentional. Secondly, beyond the diversity of the research topics, most of the questions posed are related to rather contemporary phenomena (triggered by a combination of technological, economic and political factors), which may be better accounted for through some kind of network approach. Thirdly (and partly as a consequence of the above), all these papers are marked by reflexivity, both on the methods of inquiry in translation studies, and in relation to its object: translation. Finally, while these papers certainly do not summarize all the research avenues that could be pursued at a crossroads between network and translation studies, they raise a number of innovative research questions and suggest interesting ways of exploring them.
Translators as social and political actors. The study of interpersonal networks of translators
Translation has traditionally been regarded as a solitary activity, and translators as invisible middlemen. These features are almost internalized, constituting a part of the translator’s habitus. How are these features reflected in an increasingly connected world, where interdependence becomes the norm? Although translators always have had the opportunity to meet, exchange and promote their profession or, more broadly, to enhance cultural exchange and understanding, the development of high-speed, multimedia communication channels has offered them new possibilities in that regard. The first two papers of this volume look at two different kinds of translator networks in the most literal sense: recently developed professional translator associations such as Internet distribution lists, and non-profit associations of translators with humanitarian objectives. The first kind is driven by strictly professional and internal needs while the second seems to extend professionalism into a more altruistic domain. Yet, analysis of these networks suggests that there might be some relationship between the two.
Freddie Plassard’s article examines Internet distribution lists as a way of sharing resources among translators on a “collaborate to compete” basis. Based on three months of data collection from le réseau franco-allemand, her article highlights the characteristics of this form of exchange. While information sharing in itself is nothing new, the volume and frequency of these exchanges have multiplied of late, rendered both possible and necessary by time/space compression. Distribution lists are becoming one more component of the translator’s set of tools, and as Plassard suggests, may even replace other more traditional (and individual) ones such as problem-solving strategies. This also implies that the cognitive process of translation acquires a social dimension, whereby translation problem-solving becomes a collective endeavour whose outcome depends as much on the translator’s individual knowledge as on his/her social skills. Conducted through email, these translator exchanges hinge on performance-based skills: the capability of connecting with others, the ability to raise interest and show empathy, or to argue, show humility, and convince. If Internet distribution lists become important tools for translators, they could also become valuable material for translation scholars, offering economical and realistic alternatives to Think Aloud Protocols and video recordings.
While Freddie Plassard looks at distribution lists, Yves Gambier questions the recent upsurge in volunteer and humanitarian networks of translators. If we go by the headlines and topics of recent conferences, there seems to be a renewed interest in the relationship between translation and “social action” or “political activism” among translation scholars. But what lies behind these terms? Based on a close analysis of ten translator humanitarian associations worldwide, Gambier provides a framework within which to search for answers. What are the goals of these networks? What do they promote? How do they work? Are they all alike? What do they tell us about the contradictions inherent to the profession? What role does translation proper play in these networks? As his data and analyses reveal, networks, far from being equal, form a sort of continuum in terms of ideologies and values, ranging from pure political commitment to commercial added value.
Together, these two papers could serve as preliminary steps towards working out a three-part typology of translator networks based on their main raisons d’être: collective sharing of resources, problems and solutions; participation in the process of multilingual or international project management; social activism. How are these networks shaped? What position may translators have in them? How does this position dictate their degree of intervention and visibility? Using a multiplex analysis, one could inquire how these networks relate to one another and complement one another and what these interactions reveal about the ethical concerns and motivations of professional translators.
Translation and the network economy
While the first two papers analyze professional/social networks formed by and for translators – for various purposes – the following two focus instead on the position of translation in wider production networks, hence moving from the notion of network as a small-scale group of individuals to network as a broader, defining feature of economic life. Production networks are not fixed or definite; they are constantly moving, growing and adjusting to changing needs. As such they are difficult to pin down. They can, however, be analyzed from within, by way of an ethnographic approach. The aim here is no longer to work out a typology of translator social networks, but to ask how this “network economy” – with its increased flows of exchange, relying on and favouring flexible modes of production such as outsourcing – informs the translation process and the profession. The first of these two papers concentrates on the translation industry (pragmatic translation), the second on international publishing (literary translation).
In their article based on a study of the Finnish translation industry, Kristiina Abdallah and Kaisa Koskinen show that translation models such as those of Holz-Mänttäri, Reiss and Vermeer are now being challenged by a new structure that takes the shape of a self-organizing, scale-free, real-world network. This kind of network, studied and formalized by Albert-László Barabási, suggests to the authors that his model and concepts may help us better understand the place and socioeconomic status of professional translators. As they remind us, trust is a key element of a network economy – “capital” that is “managed,” and that is all the more precious the rarer it becomes, as is the case in scale-free networks (based on hierarchical and asymmetrical relationships). Lack of trust may be the Achilles’ heel of scale-free networks, and possibly an indication of their limitations. Meanwhile, translators have had to cope with this mode of production which, as the authors argue, has so far tended to contribute to their socio-economic marginalization – a marginalization which the development of translator professional networks (such as those examined by Gambier and Plassard) may very well compensate for. Now, what of literary translation?
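For readers unfamiliar with the term, “scale-free” refers to networks whose connections are highly unevenly distributed: a few hubs attract most of the links. The short sketch below is offered purely as an illustration of that property, not as a reconstruction of Abdallah and Koskinen’s method; it uses the Python library networkx, and the parameters and the production-network reading of the nodes are hypothetical.

```python
# A minimal sketch of a scale-free network of the kind Barabási formalized,
# read here (hypothetically) as a translation-industry production network.
# Requires the networkx library.
import networkx as nx

# Preferential attachment: each new node (e.g., a newly subcontracted translator)
# links to m existing nodes, favouring those that are already well connected.
G = nx.barabasi_albert_graph(n=200, m=2, seed=42)

# A few hubs (e.g., large translation agencies) accumulate most connections,
# while the majority of nodes remain weakly connected.
degrees = sorted((d for _, d in G.degree()), reverse=True)
print("Top 5 degrees (hubs):", degrees[:5])
print("Median degree:", degrees[len(degrees) // 2])
```

Running such a sketch shows a handful of heavily connected nodes alongside a long tail of weakly connected ones – the asymmetry on which the authors’ argument about trust and marginalization turns.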
As in other industries, the acceleration of exchanges among publishers entails more competition and cooperation, of which the development of international co-publishing is one expression. In her paper based on a three-year ethnographic study of the practices of a few well-established Québec publishers (and their translators), Hélène Buzelin shows how co-publishing, traditionally regarded as specific to illustrated books or cost-intensive projects, now applies – in varying degrees – to all sectors of the industry. Original titles and their translations are often produced at the same time, contingent on international partnerships with companies abroad. Publishers buying translation rights for these titles have only limited control over how they will be translated or adapted for their market. As such, the success of an original title is no longer a prerequisite to its being translated. On the contrary, it is translation itself that conditions the success, or even the very existence, of the original title. This observation forces us to think differently about the cultural value and specificity of translations, and about the logic underlying the selection and production processes. If culturalist or institutional systemic models seem to lose their explanatory power here, economic determinism should also be avoided. Analysis of co-publishing projects requires an ethnographic approach that is incompatible with any one-dimensional explanatory model. As such, both papers do not so much promote the idea(l) of a network economy as suggest how ethnography could help us better understand its implications for professional translation.
Reintroducing the agent in the system. On the use of SNA in translation historiography
As Claisse (2006) reminds us, structural network analysis has often been applied in the sociology of literature, and still is. Traditionally, it has been used primarily to account for local relations – associations of writers or publishers, reading boards, literary prize juries, etc. In a broader context, the network approach seems most appropriate and applicable when the relative autonomy of a literary field/institution/polysystem is weak, threatened, or in the process of formation. This is what Tahir-Gürçağlar’s and Pym’s essays likewise suggest. But the originality of these papers, compared with other literary network studies, is to move us across national boundaries.
Contextualization is at once critical and still largely undefined in descriptive translation studies. Tahir-Gürçağlar sees network analysis, and more particularly the drawing of network maps, as a form of “scouting” that allows meaningful relationships to emerge rather than be dictated by a ready-made order imposed on them. Network analysis thus appears as a preliminary but essential phase of DTS that would unveil a fuller inventory of translation-related phenomena with which to supplement critical approaches based on models of social causation. This idea is illustrated by a case study: the network of translated popular literary works in Turkey during two different time periods, the 1960s and the 2000s. Altin Kitaplar, one of the major Turkish publishers of best-selling translations, serves as a focal point (gateway) through which to enter a broad network of publishers, translators, authors, editors, readers, and agents of government and literary institutions. Mapping this network reveals unexpected interrelations between the object (translated popular literature) and other areas of the publishing world.
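To give a concrete, if deliberately toy, sense of what such a gateway-based map involves, the sketch below uses the Python library networkx to extract an egocentric network around a single focal node. The node labels are invented placeholders, not data from Tahir-Gürçağlar’s study.

```python
# A minimal sketch of the egocentric "scouting" move described above:
# start from one focal node (a hypothetical publisher) and extract
# everything within a given radius.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("Publisher", "Translator A"), ("Publisher", "Translator B"),
    ("Publisher", "Author X"), ("Translator A", "Editor Y"),
    ("Author X", "Literary agent Z"), ("Editor Y", "Critic W"),
])

# Ego network of radius 2: the focal publisher, its direct contacts,
# and the contacts of those contacts.
ego = nx.ego_graph(G, "Publisher", radius=2)
print(sorted(ego.nodes()))
```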
While Tahir-Gürçağlar provides a nice example of egocentric network analysis, Anthony Pym adopts a sociocentric approach to explore how a network of small specialized literary periodicals spread the principles of Paris-based Aestheticism at the end of the 19th century. By tracing the links between the various “inscriptions” – articles, translations, authors, etc. – recorded in the main literary journals of the era, his paper shows that literary exchanges did not flow from any one country or another, but rather stemmed from a sub-network specific to a nascent intercultural space, at least until the 1890s. This sub-network helped disseminate knowledge and forge a sense of artistic belonging, revealing that network connections can make strategic use of intercultural space under certain circumstances. The principle of cooperation as an ideal is not always upheld, and can be undermined by numerous cultural strategies. As war loomed, most actors in this intercultural space withdrew behind national borders, breaking off their ties and letting the intercultural network unravel as they came to prioritize differences over cross-cultural understanding.
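One way to operationalize such a sociocentric reading – again a purely illustrative sketch rather than Pym’s actual procedure – is to treat journals and contributors as a two-mode network and project it onto a one-mode network of journals linked by shared contributors. The journal and contributor names below are placeholders.

```python
# A minimal sketch of a sociocentric network built from archival "inscriptions":
# journals and contributors form a two-mode (bipartite) network, projected onto
# a one-mode network of journals weighted by shared contributors.
import networkx as nx
from networkx.algorithms import bipartite

B = nx.Graph()
journals = ["Journal 1", "Journal 2", "Journal 3"]
B.add_nodes_from(journals, bipartite=0)
B.add_nodes_from(["Contributor a", "Contributor b", "Contributor c"], bipartite=1)
B.add_edges_from([
    ("Journal 1", "Contributor a"), ("Journal 1", "Contributor b"),
    ("Journal 2", "Contributor b"), ("Journal 2", "Contributor c"),
    ("Journal 3", "Contributor c"),
])

# Two journals are linked if at least one contributor published in both;
# the edge weight counts how many contributors they share.
projection = bipartite.weighted_projected_graph(B, journals)
print(list(projection.edges(data=True)))
```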
María Sierra Córdoba Serrano’s contribution bears many resemblances to Tahir-Gürçağlar’s and Pym’s in terms of concerns and objectives. Though she looks at cultural exchanges – between Québec and Spain – from a contemporary rather than historical perspective, her exploration starts from a sense of discomfort (shared by others) with the functionalist assumptions of the polysystemic model, specifically the idea that translation is carried out to fill gaps in a target culture. Favouring a qualitative approach based primarily on interviews, Córdoba Serrano sees the introduction of Québec literature in Spain as the result of a process of cooperation among various agents who, echoing Pym’s contribution, could be regarded as forming an intercultural sub-network. Like Pym’s, her approach is more sociocentric in nature. It aims more precisely at drawing a two-part typology (distinguishing official networks from associative ones), allowing her to highlight the relations between micro- and macro-phenomena, and the movements from one level to the other.
Together these papers engage the reader in a discussion of methodology in historiography. Though their discussions begin from different perspectives and draw on different sources of inspiration – Pym takes Milroy’s work in sociolinguistics as a starting point, Tahir-Gürçağlar and Córdoba Serrano, ANT – the authors come to rather similar conclusions about the potential role of network analysis in DTS: network study is not regarded as an end in itself – here the authors distance themselves from ANT – but rather as an inductive and incremental approach enabling us to study actors and interactions without making too many presuppositions about their nature. Together, these contributions not only discuss the strengths and limits of structural network analysis in translation historiography, they also illustrate the various ways it can be performed (from data collection to analysis) and represented (as visual or conceptual maps, or in narrative form).
The relation between translation scholars, their object and their ideas
While all of the above papers take translation/translators as their primary object, Julie McDonough, and María Rosa Castro-Prieto and María Dolores Olvera-Lobo, reflect on how translation scholars interrelate and organize themselves. Like María Sierra Córdoba Serrano, Julie McDonough seeks to draw a typology that can serve as a framework for analyzing interpersonal relations. Adopting an empirical approach (based on the analysis of forty associations), her paper works out four categories of translation networks: profession-, practice-, education- and research-oriented networks. Following SNA methods and concepts, McDonough analyzes the characteristics of each category in terms of subfocus, value, geographic location and membership requirements, computer mediation, relations and governance. She shows how these networks (and their flexibility) define translators, who in turn define their networks. The author then analyzes one of these networks, TranslatorsCafé, in greater depth. The contribution of this essay is twofold: first, it provides a general framework and empirical data that can pave the way for a more comprehensive investigation of the connections between various professional sub-networks (parts of a larger network of “language professionals”) and of the role these sub-networks (or associations) play in the relations between academics, practitioners and trainers. Second, it offers concepts and methodological guidelines for analyzing these networks.
One way to learn about the structure of an academic field and how its main ideas and research orientations have developed is to conduct Author Citation Analysis (ACA). María Rosa Castro-Prieto and María Dolores Olvera-Lobo apply this approach to draw a portrait of trends and developments in the field of terminology. Their case study of scholarly citations in terminology builds on Julie McDonough’s suggestions for a more concrete typological framework within which to analyze how translation scholars interrelate and organize themselves. Using the bibliometric method of ACA, supported by statistical and visualization techniques, to trace the conceptual links and associations between citing and cited authors in scholarly journals, they set out markers for disciplinary boundaries and trends as seen from the trail of citations. Citation analysis examines the documents referring to an item in terms of visibility, importance and impact on the sustained intellectual discourse and on the emergence of trends in the field. Of critical note is whether high quality is related to high visibility, and how an author’s “variable” image compares with his or her “constant” image. SNA is deployed through the “pathfinder network” to map the trajectories of an author’s intellectual thought and contribution and, on the basis of objective data, the development of the discipline of terminology. The resulting “snapshot” can thus chart the “state of the discipline.” Together, these two papers suggest how some form of network analysis could be performed as a way to reflect on the development and structure of translation studies.
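For readers curious about the mechanics, the sketch below shows, in Python with networkx, how a simple author co-citation network might be built from reference lists and then pruned to a backbone. The maximum spanning tree used for pruning is a deliberate simplification standing in for the pathfinder scaling the authors employ, and the tiny reference lists are invented for illustration (the names merely evoke well-known terminology scholars).

```python
# A minimal sketch of author co-citation analysis (ACA): build a weighted
# co-citation network from reference lists, then keep only its strongest
# backbone (here, a maximum spanning tree as a stand-in for pathfinder scaling).
import itertools
import networkx as nx

reference_lists = [
    ["Wüster", "Cabré", "Sager"],
    ["Cabré", "Sager", "Temmerman"],
    ["Wüster", "Cabré"],
]

G = nx.Graph()
for refs in reference_lists:
    # Every pair of authors cited together in one document is co-cited once.
    for a, b in itertools.combinations(sorted(set(refs)), 2):
        w = G.get_edge_data(a, b, default={"weight": 0})["weight"]
        G.add_edge(a, b, weight=w + 1)

# Prune weak links: the heaviest tree connecting all cited authors.
backbone = nx.maximum_spanning_tree(G, weight="weight")
for u, v, d in backbone.edges(data=True):
    print(u, "--", v, "co-cited", d["weight"], "time(s)")
```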
Back to linguistic transfer. Translating semantic networks in a literary work
The issue’s final contribution, by Laurence Jay-Rayon, brings us back to the “core” subject of translation studies: the text and the challenges arising from its interlinguistic transfer. Unlike the other essays, it is inspired by the concept of network as used in literary analysis, in particular by Antoine Berman and, more generally, within the Bakhtinian critical tradition. Translation, claims Berman (1999), generally fails to recreate the underlying semantic and poetic networks of a text. Laurence Jay-Rayon questions this assumption by analyzing metaphorical/semantic networks in the original and translated versions of Sardines, a novel by English-speaking Somali writer Nuruddin Farah, translated into French by Christian Suber. Her analysis maps the metaphors from one cognitive domain to another, tracing the links and nodes of the novel’s semantic networks as they open onto other literary, social, cultural and linguistic networks. The essay shows that metaphorical networks are an essential part of the literary work, one which can hardly be ignored by any translator seeking to do justice to the East African linguistic, poetic, social and cultural realities that inspired the novel. From this perspective, the author proposes a framework for analysing these semantic networks and suggests avenues for translating them.
Finally, all the essays in this issue find their respective voices in language emanating from the disciplinary discourses that have inspired them. In attempting to initiate a more profound dialogue between translation studies and network studies, we may discover the emergence of a certain complementarity of language, one that could eventually yield a language (or meta-language) of hybrid origin, capable of both simplifying and complexifying analytically, and able to articulate continuity and connectivity from the past through the present to the future in these quickly globalizing times. This language could serve both a historical purpose and a trans-disciplinary academic one. At their core, the network concepts underpinning discussions in and from biology, physics, chemistry, mathematics, linguistics, philosophy, business and computer science share some common ground, be it physical/real or logical/virtual. A network-translation dialogue could well yield a language across disciplines with which to speak of sameness, commonalities and differences in an increasingly inter- and multi-disciplinary world: a unique model of cooperation and competition in which the greater collaborative collective sum of separate parts could configure a new dynamics for the ever-escalating presence of human communication worldwide. Can theories on and of networks allow us to articulate translation phenomena through another vocabulary, or offer us new conceptual tools with which to probe and investigate these phenomena more deeply? Are network-based theories able to engage existing translation theories in any meaningful way? The very existence of this special issue indicates that we tend to believe so. We invite Meta readers to consider the data and to formulate their own opinions.
Appendices
Acknowledgements
The authors would like to thank Yves-Marie Abraham, Francisco F. Lasheras and Pavel Trofimovich for their readings of, and insightful comments on, specific portions of this text.
References
- Berman, A. (1999): “L’analytique de la traduction et la systématique de la déformation,” in A. Berman, La Traduction et la Lettre. Ou l’Auberge du lointain, Paris, Le Seuil, pp. 49-68.
- Burnett, R. and P. D. Marshall (2003): Web Theory. An Introduction, London and New York, Routledge.
- Bickerton, D. (1971): “Inherent variability and variable rules,” Foundations of Language, 7, 457-492.
- Bollobás, B. (1998): Modern Graph Theory, New York, Springer.
- Boissevain, J. and C. Mitchell (eds.) (1973): Network Analysis: Studies in Human Interaction, The Hague and Paris, Mouton.
- Boissevain, J. (1973): “Preface,” in Boissevain, J. and C. Mitchell (eds.), Network Analysis: Studies in Human Interaction, The Hague and Paris, Mouton, pp. vii-xiii.
- Boltanski L. and È. Chiapello (1999): Le nouvel esprit du capitalisme, Paris, Gallimard.
- Boltanski L. and È. Chiapello (2005): The New Spirit of Capitalism. Trans. G. Elliott, London, New York, Verso.
- Bott, E. (1957): Family and Social Network, London, Tavistock.
- Bourdieu, P. (2001): Science de la science et réflexivité, Paris, Éditions Raisons d’agir.
- Briggs, A. and P. Burke (2005): A Social History of the Media – From Gutenberg to the Internet, Cambridge and Malden, Polity Press.
- Broadbent, Simon R. and J. Hammersley (1957): “Percolation processes: I. Crystals and mazes,” Proc. Cambridge Philos. Soc., 53, pp. 629-641.
- Callon, M., J. Law and A. Rip (1986): Mapping the Dynamics of Science and Technology, Houndmills and London, Macmillan.
- Callon, M. (1997): “Actor-Network Theory – The Market Test,” Keynote speech presented at the Department of Sociology of Lancaster University, published by Lancaster University at http://www.comp.lancs.ac.uk/sociology/papers/callon-market-test.pdf.
- Callon, M. (2001): “Actor Network Theory,” in N. J. Smelser and P. B. Baltes (eds.), International Encyclopedia of the Social and Behavioral Sciences, volume 1, pp. 62-66.
- Carrington, P. J. and S. Wasserman (eds.) (2005): Models and Methods in Social Network Analysis, Cambridge, Cambridge University Press.
- Castells, M. (ed.) (2004): The Network Society – A Cross-cultural perspective, Cheltenham, Northampton, Edward Elgar Publishing Ltd.
- Castells, M. and G. Cardoso (eds.) (2006): The Network Society: From Knowledge to Policy, Washington DC, The Johns Hopkins Center for Transatlantic Relations.
- Chiesi, A. M. (2001): “Network Analysis,” in N. J. Smelser and P. B. Baltes (eds.), International Encyclopedia of the Social and Behavioral Sciences, volume 15, pp. 10501-10504.
- Christensen, C. and R. Albert (2007): “Using Graph Concepts to Understand the Organization of Complex Systems,” International Journal of Bifurcation & Chaos in Applied Sciences & Engineering, Special Issue “Complex Networks’ Structure and Dynamics,” 17, 7, pp. 2201-2214.
- Claisse, F. (2006): “De quelques avatars de la notion de réseau en sociologie,” in D. de Marneffe and B. Denis (eds.), Les réseaux littéraires, Bruxelles, Le Cri, pp. 21-43.
- Collins H. M. & S. Yearley (1992): “Journey into Space,” A. Pickering (ed.), Science as Practice and Culture, Chicago, University of Chicago Press, pp. 369-389.
- Collins H. M. & S. Yearley (1992) “Epistemological Chicken,” A. Pickering (ed.) Science as Practice and Culture, Chicago, University of Chicago Press, pp. 301-326.
- Contractor, N. (2007): “MTML meets Web 2.0: Theorizing social processes in multidimensional networks,” plenary session at New Network Theory, conference, University of Amsterdam, June 28-30, online video recording: http://www.networkcultures.org/networktheory/.
- Crawford, S. A. (2005): “Actor Network Theory,” in G. Ritzer (ed.), Encyclopedia of Social Theory, volume 1, pp. 1-3.
- Czarniawska, B. (2007): “Has Organization Theory a Tomorrow?,” Organization Studies, 28, 1, pp. 27-29.
- De Marneffe, D. and B. Denis (eds.) (2006): Les réseaux littéraires, Bruxelles, Le Cri.
- Douglas, M. (1995): “Forgotten knowledge,” Marilyn Strathern (ed.), Shifting Contexts, London & New York, Routledge, pp. 13-27.
- Feenberg, A. (1999): Questioning Technology, London and New York, Routledge.
- Ferrer i Cancho, R. (2005): “The structure of syntactic dependency networks: insights from recent advances in network theory,” Problems of Quantitative Linguistics, pp. 60-75.
- Flew, T. (2005): New Media – An Introduction. 2nd Edition, Oxford and New York, Oxford University Press.
- Freeman, L. C. (2004): The Development of Social Network Analysis. A Study in the Sociology of Science, Vancouver, Empirical Press.
- Gauntlett, D. and R. Horsley, eds. (2004): Web.Studies 2nd Edition, London, Arnold Publishers Ltd.
- Gingras, Y. (1995). “Un air de radicalisme,” Actes de la recherche en sciences sociales, 108, pp. 3-17.
- Granovetter, M. S. (1973): “The Strength of Weak Ties,” The American Journal of Sociology, 78-6, pp. 1360-1380.
- Halsall, F. (2005): Computer Networking and the Internet, 5th edition, Essex, Pearson Education Limited.
- Harvey, D. (1989): The Condition of Postmodernity: An Enquiry into the Origins of Cultural Change, Oxford and New York, Blackwell.
- Hassan, R. and R.E. Purser (2007): 24/7 Time and Temporality in the Network Society, Stanford, Stanford U. Press.
- Huurdeman, A. A. (2003): The Worldwide History of Telecommunications, Hoboken, John Wiley & Sons, Inc. (Wiley-Interscience).
- Isambert, F.-A. (1985): “Un ‘programme fort’ en sociologie de la science,” Revue française de Sociologie, pp. 485-508.
- Izquierdo, L. R. and R. A. Hanneman (2006): “Introduction to the Formal Analysis of Social Networks Using Mathematica,” http://library.wolfram.com/infocenter/MathSource/6638/, based on Robert A. Hanneman and M. Riddle (2005), Introduction to Social Network Methods, Riverside, U. of California Riverside, http://www.faculty.ucr.edu/~hanneman/nettext/ (digital text)
- Jinyun, K. (2007): “Complex networks and human language,” Cornell University Library, arXiv:cs/0701135v1 [cs.CL], http://arxiv.org/abs/cs.CL/0701135.
- Labov, W. (1966): The social stratification of English in New York City, Washington, DC: Center for Applied Linguistics.
- Lakoff, G. (1987): Women, Fire, and Dangerous Things: What Categories Reveal About the Mind, Chicago, U. of Chicago Press.
- Latour, B. and S. Woolgar (1988): La vie de laboratoire, translated from English by Michel Biezunski, Paris, La Découverte.
- Latour, B. (1997): “On actor-network theory, a few clarifications” on http://amsterdam.nettime.org/Lists-Archives/nettime-l-9801/msg00019.html, visited in September 2005.
- Latour, B. (1999): “On recalling ANT” J. Law and J. Hassard (eds.), Actor-Network Theory and After, Oxford, Blackwell.
- Latour, B. (2005): Reassembling the social. An introduction to Actor-Network-Theory, Oxford, Oxford University Press.
- Law J. and J. Hassard (1999): Actor Network Theory and after, Oxford and Malden, Blackwell Publishers.
- Lorrain, F. P., and H. C. White (1971): “Structural equivalence of individuals in social networks,” Journal of Mathematical Sociology, 1, pp. 49-80.
- March, J. G. (2007): “The Study of Organizations and Organizing Since 1945*,” Organization Studies, 28, 1, pp. 9-19.
- Marsden, P. V. (2000): “Social Networks,” in F. Borgatta and R. J. F. Montgomery (eds.), Encyclopedia of Sociology, 2nd edition, volume 4, Macmillan and Gale Group, pp. 2727-2735.
- McKinley, W. (2007): “The March of History: Juxtaposing Histories,” Organization Studies, 28, 1, pp. 31-36.
- McNeill, J. R. and W. H. McNeill (2003): The Human Web: A Bird’s-eye View of World History, New York, London: W.W. Norton.
- Milroy, L. (1980): Language and Social Networks, London, Baltimore, Basil Blackwell; University Park Press.
- Mitchell, C. (1973): “Networks, Norms and Institutions,” in Boissevain, J. and C. Mitchell (eds.), Network Analysis: Studies in Human Interaction, The Hague and Paris, Mouton, pp. 15-36.
- Mizruchi, M. S. (2005): “Network Theory,” in G. Ritzer (ed.), Encyclopedia of Social Theory, volume 2, pp. 534-540.
- Monge, P. R. and N. S. Contractor (2003): Theories of Communication Networks, Oxford, New York, Oxford U. Press.
- Moreno, J. (1934): Who Shall Survive?, Washington, Nervous and Mental Disease Publishing.
- Murphie, A. and J. Potts (2003): Culture & Technology, New York, Palgrave Macmillan.
- Musso, P. (2003): Critique des réseaux, Paris, PUF.
- Musso, P. (ed.) (2003): Réseaux et société, Paris, PUF.
- Newman, M., A.-L. Barabási, and D. J. Watts (2006): The Structure and Dynamics of Networks, Princeton and Oxford, Princeton University Press.
- Scott, J. (2000): Social Network Analysis: A Handbook, 2nd edition, London, Sage.
- Scott, J. (2004): “Social Networks,” in A. Kuper and J. Kuper (eds.), The Social Science Encyclopedia, volume 2, pp. 687-689.
- Singh, R. (2004): “Unsafe at Any Speed? Some Unfinished Reflections on the ‘Cultural Turn’ in Translation Studies,” in P. St-Pierre and P. C. Kar (eds.), Translation. Reflections, Refractions, Transformations, New Delhi, Pencraft International.
- Solé, R. V., B. Corominas Murtra, S. Valverde, and L. Steels (2007): “Language Networks: their structure, function and evolution,” submitted to Trends in Cognitive Sciences, http://complex.upf.es/index.php?page=4&subpage=2.
- Solla Price, D. de (1963): Little Science, Big Science, New York, Columbia University Press.
- St. Amant, K. (2007): Linguistic and Cultural Online Communication Issues in the Global Age, Hershey and London, Information Science Reference.
- Stalder, F. (2006): Manuel Castells: The Theory of the Network Society, Cambridge, Malden, Polity Press.
- Starbuck, W. H. (2007): “Living in Mythical Spaces,” Organization Studies, 28, 1, pp. 21-25.
- Valverde, S. and R. V. Solé (2007): “Self-organization and Hierarchy in Open Source Social Networks,” Physical Review E, 3267, http://complex.upf.es/index.php?page=4&subpage=3.
- Valverde, S., R. V. Solé, M. A. Bedau and N. H. Packard (2007): “Topology and Evolution of Technology Innovation Networks,” Physical Review E, accepted for publication, http://complex.upf.es/index.php?page=4&subpage=3 (deals with patent citation in terms of innovation and networks)
- Valverde, S. and R. V. Solé (2007): “Hierarchical small-worlds in software architecture,” Dynamics of Continuous Discrete and Impulsive Systems: Series B; Applications and Algorithms, accepted for publication, http://complex.upf.es/index.php?page=4&subpage=3 (study of graph attributes of software designs in order to find universal patterns of software organization)
- Van Dijk, J. (2006): The Network Society, 2nd edition, London, Thousand Oaks, New Delhi, Sage Publications Ltd.
- Weisstein, E. W. (2007): Wolfram Mathworld, online mathematics resource, http://mathworld.wolfram.com/
- Wellman, B. (1997): “Structural Analysis: From Method and Metaphor to Theory and Substance,” in B. Wellman and S. D. Berkowitz (eds.), Social Structures: A Network Approach, Toronto, Canadian Scholars’ Press, pp. 19-61.