
Weekly Summary: Networking, Notworking, and What to do Next?

Networks – The Science-Spanning Disciplines - Anna Nagurney

Dr. Anna Nagurney is a professor in the Department of Finance and Operations Management at the Isenberg School of Management at the University of Massachusetts Amherst. She is the Founding Director of the Virtual Center for Supernetworks. You can read more about her on her blog here.

In her presentation (from 2005), Nagurney enthusiastically discusses the pervasiveness of networks in people’s everyday lives and how essential they are to the functioning of societies and economies. She notes that networks are integral parts of business, social systems, science, technology, and education, providing their very infrastructure.

Background of Networks

Transportation networks are among the most essential forms of networks, and can also be among the most complex. Nagurney uses the transportation network throughout her presentation to help explain a number of different points. This network is so important because transportation not only facilitates face-to-face communication, but also provides access to other networks. Nagurney notes in her talk that there are three basic network components:

  • Nodes
    • Ex. Transportation intersections, homes, work places
  • Links or Arcs
    • Can be directed, bidirectional, or simply represent a connection without any direction
    • Ex. Roads, railroad tracks
  • Flows
    • Mean different things in different contexts and applications
    • Without flows (with just nodes and links), one is essentially describing a graph
    • Ex. Cars, trains
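The three components above can be captured in a tiny data structure. Here is a minimal sketch in Python (the place names and flow values are invented for illustration, not taken from the presentation):

```python
# Nodes: points in the network, e.g. homes, intersections, workplaces.
nodes = {"home", "intersection", "workplace"}

# Links: directed (origin, destination) pairs, e.g. one-way roads.
links = {("home", "intersection"), ("intersection", "workplace")}

# Flows: a quantity moving over each link, e.g. cars per hour.
# Without this mapping, nodes and links alone describe only a graph.
flows = {("home", "intersection"): 120, ("intersection", "workplace"): 120}

# Total flow leaving "home":
print(sum(f for (o, d), f in flows.items() if o == "home"))  # 120
```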

The Study of Networks

From a scientific-methodology standpoint, the beauty of studying networks, for her, lies in finding problems where one might think no network exists. Much like the plethora of unnoticed virtual interconnections taking place on the street every day that we discussed last week, Nagurney searches out these happenings and studies how they interact as a network. She explains that “the study of networks is not limited to only physical networks, but also to abstract networks in which nodes do not coincide to locations in space.” More specifically, the study of networks involves:

  • Formulating applications as mathematical models
  • Studying these models from a qualitative perspective
  • Creating algorithms to solve the resulting models

The study of networks has given rise to three classic problems:

  • The Shortest Path Problem
    • The search for the most efficient way to move flows from an origin to one or more destinations
    • Ex. Transportation; minimizing storage needed for books in a library
  • The Maximum Flow Problem
    • Figuring out the capacity of the network
    • Ex. Network reliability testing; Building evacuation
  • The Minimum Cost Flow Problem
    • The search for the flow pattern that minimizes total cost without exceeding any link’s capacity
    • Ex. Warehousing & distribution; biology; finance (asset-liability management)
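To make the first of these concrete, here is a minimal sketch of a shortest-path computation in Python (Dijkstra’s algorithm on a toy road network; the place names and travel times are invented for illustration, not drawn from the talk):

```python
import heapq

def shortest_path(graph, source, target):
    """Dijkstra's algorithm on a dict-of-dicts adjacency map.

    Returns the minimum total link cost from source to target.
    """
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == target:
            return d
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for neighbor, weight in graph[node].items():
            nd = d + weight
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                heapq.heappush(heap, (nd, neighbor))
    return float("inf")  # target unreachable

# A toy road network: travel times in minutes between intersections.
roads = {
    "home": {"main_st": 4, "elm_st": 2},
    "elm_st": {"main_st": 1, "office": 7},
    "main_st": {"office": 3},
    "office": {},
}
print(shortest_path(roads, "home", "office"))  # 6 (home -> elm_st -> main_st -> office)
```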

This scientific approach to studying networks seeks to determine patterns within networks, which can then aid in unifying a variety of applications.

Characteristics of Today’s Networks

In the past, congestion was not such a huge problem, but it is becoming more and more of one. This applies even to social networks, with Nagurney explaining that with “a push of a button, you can reach 10s of thousands of millions” of people.

The behavior of users is also an important characteristic to consider. Users, both individually and in groups, can behave in a variety of ways within a network. This can even lead to alternative behaviors and paradoxes, such as the Braess Paradox, in which adding a new link to a network can increase every user’s travel time when each user routes selfishly. The paradox highlights the cost to society of user optimization versus system optimization.
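The paradox can be checked with back-of-the-envelope arithmetic. Below is a sketch in Python of the textbook four-node example (the 4,000 drivers and the specific link-cost functions are the standard illustration of the paradox, not figures from Nagurney’s talk):

```python
# Classic Braess example: 4000 drivers travel from S to T.
# Link travel times in minutes (x = flow on the link):
#   S->A: x/100      A->T: 45
#   S->B: 45         B->T: x/100
drivers = 4000

# Before the shortcut: the two routes are symmetric, so in user
# equilibrium traffic splits evenly and each route costs the same.
flow_per_route = drivers / 2
cost_before = flow_per_route / 100 + 45  # 20 + 45 = 65 minutes

# After a zero-cost shortcut A->B is added, each driver's selfish best
# response is S->A->B->T (each variable link costs at most 40 < 45),
# so in equilibrium all 4000 drivers load both variable links fully.
cost_after = drivers / 100 + 0 + drivers / 100  # 40 + 0 + 40 = 80 minutes

print(cost_before, cost_after)  # 65.0 80.0
```

Adding a link made everyone worse off: user optimization yields 80 minutes per trip, while the system optimum (the pre-shortcut split) costs only 65.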

The Supernetwork

Nagurney postulates that it’s time for a new paradigm: the supernetwork. Supernetworks can be connected, multilevel, or even multi-criteria. It’s important to study not only individual decision-making but also “the effect of many competing, collaborating, cooperating.”

With these supernetworks come new tools to study them, including game theory and optimization theory. She also lists a few common applications of supernetworks, including knowledge networks, teleshopping decision-making, and electronic transactions.

Nagurney then explores how these supernetworks can integrate social networks by looking at types of relationships. The value and strength of the relationships that are fostered become the “flows” in social networks. She explains that establishing relationships incurs costs, but higher relationship levels bring a reduction in costs and risk and an increase in value. Users’ belief in social responsibility and the fact that social networks are dynamic and ever-changing are important factors to consider when studying these networks.

The Principle of Notworking - Geert Lovink

Dr. Geert Lovink is a Research Professor of Interactive Media at the Hogeschool van Amsterdam and an Associate Professor of New Media at the University of Amsterdam. His book The Principle of Notworking was published in 2005.

Throughout the first section (“Multitude, Network and Culture”) of Lovink’s book The Principle of Notworking, Lovink mainly quotes George Yudice, Antonio Negri, and Michael Hardt. (In 2003, Yudice wrote The Expediency of Culture: Uses of Culture in the Global Era, in which he theorizes about the changing role of culture in an increasingly globalized world. Negri and Hardt co-wrote Empire (2000) and Multitude: War and Democracy in the Age of Empire (2004). While Empire was about corporations and global institutions coming to the forefront, Multitude centered on the population of the ‘empire,’ explaining that this body is defined by its diversity.)

Lovink begins his book by explaining the importance of analyzing culture as a resource rather than a commodity, which he argues is especially important when discussing Internet culture. He believes that the commercial efforts of the dotcom models of the late 1990s were “wrong.” He argues that the “culturalization of the Internet is at hand” and, like Nagurney, seeks to present the importance of the user over the system.

Much as Nagurney stated in her presentation, Lovink also recognizes that an important aspect of Internet culture is that it is in “a permanent flux.” He explains that experts on the Internet are still having trouble comprehending this, calling it a “cultural turn.” He notes that those who have trouble seeing the Internet as something constantly changing still see it as a commodity and tend to hold theories of a “religious nature.”

In accordance with his belief in the importance of the user over the system, he argues that further research is required on the subject and does not consider Nagurney’s scientific approach adequate. With this, he thinks that new media needs a language of its own, one more inclusive of his idea of networks as “post-human.”

Lovink also explains the importance of having different communities come together (similar to a point Nagurney makes). He sees this happening with the outsourcing of IT, which allows for the chance of “cultural mingling.” But while networks have the opportunity to foster creativity, cooperation, and a sense of liberation, they can also be used for the purpose of control. He raises this through his discussion of ‘protocol’ theory and Gilles Deleuze’s idea of ‘the control society.’

Lovink describes what he believes defines today’s networks with the term “notworking.” It is the elements that go awry within the make-up of yesterday’s network that help shape the network of today. These examples of “notworking,” such as spam and viruses, stem from the “frustrated mind”: those “who breach the consensus culture” and are pushed to the outer boundaries of the network.

Review of The Exploit: A Theory of Networks (2 Reviews + 1 Response)

The Exploit: A Theory of Networks is a book co-written by Alexander Galloway and Eugene Thacker, which was published in 2007. It is a theoretical book about how networks operate, their political implications, and how flaws in the system can lead to positive change. Galloway is an associate professor in the Department of Culture and Communication at New York University. Eugene Thacker is an associate professor of new media in the School of Literature, Communication, and Culture at the Georgia Institute of Technology.

Review 1: Daniel Gilfillan

Daniel Gilfillan is Associate Professor of German Studies and Information Literacy, and Affiliate Faculty in Film and Media Studies and Jewish Studies at Arizona State University. Read more about him and his work on his Academic Portfolio site.

Gilfillan’s review of The Exploit mainly focuses on commending Galloway and Thacker for presenting a contemporary understanding of networks. Like Lovink, Gilfillan, Galloway, and Thacker recognize that networks are used for control purposes and consumerism (also referencing Deleuze and his “control societies” and “dataveillance” concepts).

What Gilfillan is mainly concerned with is the concept of pushing past this “system of control” by taking advantage of openings within it, which can lead to something new and progressive. Similar to Lovink’s point that what makes networking is the “notworking,” Gilfillan agrees with Galloway and Thacker that it is these “flaws” within networks that make progressive change possible. In relation to this, Gilfillan discusses Galloway and Thacker’s belief that there is a new balance between networks: an “alliance between ‘control’ and ‘emergence.’” But a new type of asymmetry must be found that takes advantage of inconsistencies within a network; Galloway and Thacker call this need both the “antiweb” and “an exceptional topology.”

While networks need hierarchical systems of control, it is also important to have aspects of a decentralized system of distribution. This allows for asymmetry, and hence flaws, within the system. Gilfillan notes that it is this that makes “counterprotocol practices” possible, and with them, advancement: “it will be sculpted into something better, something in closer agreement with the real wants and desires of its users” (from Galloway & Thacker).

He gives the following definitions as a guide to the exploitation of these flaws:

  • Vector: The exploit requires a medium where an action or motion can take place
  • Flaw: The exploit needs weaknesses within the network, enabling the exposure of the vector
  • Transgression: The exploit then creates a change within the ontology of the network, making the “failure” of the network an alteration in its topology

Review 2: Nathaniel Tkacz

Nathaniel Tkacz is a PhD candidate at the University of Melbourne, where he’s researching the “political dynamic of Open Projects (projects influenced by the principles and production models of Free and of Open Source Software, but translated into different domains).” Read more about him and his work on his research site.

While protocol was a minor detail in the overall message presented by Gilfillan, it is the main topic of discussion for Tkacz. He explains, “protocol is a set of rules or codes that enables, modulates, and governs a specific network and also a general logic of governance for all networks.” It is a form of control and a way of “directing flows of information,” which he equates to the Panopticon in Foucault’s disciplinary society.

But this protocol allows for the exploitation of the flaws within it; it becomes the “target of resistance.” Rather than changing existing technologies to promote transformation, “protological struggles” emerge that entail “discovering holes in existing technologies and projecting potential change through these holes.” These “holes” are called “exploits” by hackers.

From here, Tkacz goes on to explain a number of ‘limitations’ he feels the book has. Tkacz believes that the way the book was structured created some limitations in itself (the book was written as a ‘network,’ which Tkacz believed left things underdeveloped). Another problem Tkacz sees is that the book relies too heavily on the “old centralized/decentralized dichotomy,” rather than holding firm to one of its own main claims: networks can take numerous forms. A third issue is that he found the authors’ protocol/exploit argument less persuasive as it moved from the specific, more important details to the general points.

Author Response: Alexander R. Galloway and Eugene Thacker

The authors begin their response by noting that Gilfillan mentioned one of the key points of the book: “the uncannily anonymous, network tactics demonstrated by ‘pliant and vigorous nonhuman actors.’” They explain their interest in the view that networks are “something beyond the human altogether.” While networks might have once originated from human means, in their functioning as networks they have lost their most essential human qualities. Viruses on networks don’t thrive because the network is “down” and not working properly; rather, they excel precisely because the networks are working just as they should be. This is similar to Lovink’s point about networks being “post-human.”

Looking at both Gilfillan’s and Tkacz’s mention of Foucault and Deleuze being used in The Exploit, Galloway and Thacker clear up their reasoning behind using Foucault’s ideas. The two authors were not looking at Foucault’s work concerning discipline-surveillance; rather, they looked to build upon his work in biopolitics and security. Similarly, the authors note that the influential aspects of Deleuze did not just lie in his essay on “control societies.” Rather, it was in connecting that concept to his interest in the notions of immanence and univocity (the belief, expanded upon from Spinoza, that there are no numerically separate substances).

The authors ultimately ask: what should be done concerning these networks? “Should we as humans learn to be more like nonhumans?” They explain that there have been a number of responses to this question throughout philosophy, but three in particular that they deem important. The first is the “‘master of the universe’ attitude,” which says that exploits, such as viruses, must be eliminated. The opposite viewpoint is that of the agnostic: here, it is accepted that “the world is lost in the hands of technology, dry and lifeless after the passage into modernity.” The third is that within this “dry and lifeless” world lies something new and emergent at the core.

The authors leave us with the question, “Can there be an ontology of networks?” Must there always be an outside mediator to the network? Can a network topology express itself from within?