Paper on web services as a complex adaptive system




Emergence of web services as a complex adaptive system - not EAI but Google

The success of Linux, the Internet and the World Wide Web is based on the complex adaptive properties of the design aggregates that support these technology-based systems.  Each benefited from the consistent design rules and network effects of the prior systems supporting it. 

The initial strategy for leveraging Web Services, in contrast, integrates two separate systems: business and web information. 

The generally chaotic architectures of business systems blunt the catalytic effect of the modular characteristics of web technologies. 


The investment in Service Oriented Architectures (SOA) and Web Services based integration is being justified as a solution to the enterprise application integration problem and as the logical second phase of the World-Wide-Web.  This paper looks critically at the SOA architecture and at major industry strategies to migrate the Eco-net and infrastructure of enterprises to the World-Wide-Web. 

Applying theoretical models to analyse a series of global communication developments, this paper then highlights the underlying causes of success and failure of each. 

In conclusion, it proposes strategic approaches for successfully leveraging SOA. 


SOA based computing

Why deploy Web service infrastructure across your enterprise? 

The enterprise's goal is to integrate its value-add processes with efficient operations and a minimum of wastage.  It is argued that the Service-Oriented Architecture provides a powerful and flexible tool to enable this. 

At first glance, there is some sense to this.  The use of web-based technologies scales globally.  Documents typically act as the formal descriptions of contracts.   The power of computer programs and web documents, flexibly integrated, should enable better ways of doing business. 

Looking more deeply, however, it is not clear that the hard problems have really been addressed. 

Enterprises use value-added processes in order to generate profits and growth.  They must use resources to search for environmental opportunities to exploit, spot the real opportunities and then design processes that will create the added value at a profit. And all of this goes on in parallel. 

The inability to clearly identify the value of a potential opportunity, or the best strategies for executing on a selected opportunity, leads organizations to protect captured niches and to make dramatic changes only when their environment becomes hostile.  Typically, organizations divide up and distribute these value-creation and value-capture problems to different groups, attempting to gain from each group being able to act in parallel. 

Hence, the information processing infrastructure of an enterprise must be able to support adaptation to new environmental conditions as well as efficiently operationalize the processes that the company is using to generate growth and profits.  This must be done in support of a number of different business strategies. 

The problem of semantic mismatches

The problem of semantic mismatches evolves naturally from a distributed approach to organizational action.  Without clear foresight, parallel actions create incompatible processes and operations.  For example, one business unit may decide to limit R&D expenses to maintain profitability while another is expanding R&D to identify and capture new opportunities.  Their approach to investment in shared R&D support infrastructure is likely to be in conflict. 

Within the IT infrastructure, semantic mismatches are very troublesome.  Whereas humans evolved to cope with the strategic dilemmas described above, enterprise computer systems are typically constructed on the assumption that they can depend on a consistent, logical design. 

Given the successful evolution and inter-operation of networked systems, such as the Internet, there may be effective strategies to utilize within the enterprise. 

The success of the uncoordinated deployment of the World-Wide-Web demonstrated that some approaches could handle the problems of chaos and semantic mismatches.  Web-based technologies clearly had something.  Web browsers did not need to be upgraded in synchrony with the interfaces and features of specific web-based applications.  The client-deployment problem of client-server systems disappeared. 

Java could extend the browser with the powerful programmatic enhancements required by particular web applications.  Except that, as it turned out, security, performance and GUI version problems limited the technical opportunity to integrate. 

The concept of Web Services supports mappings between a programmatic model (procedures with interfaces) and a standardized web XML description (WSDL) that can be used to define a protocol of messages to any target web resource.  Messages are then transformed back into the programming model of the target.  There is also a trust model and state-sharing infrastructure. 

Any target that can obtain the mapping service can process the message and respond to the requester. 
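The mapping described above can be sketched concretely. The following is a minimal, illustrative Python model, not a real toolkit: a procedure call is serialized into a SOAP-style XML message of the kind a WSDL document would define, and the target maps the message back into its own programming model. The operation name, namespace and parameter are invented.

```python
# Illustrative only: the operation name, namespace and parameters are
# invented; a real service would take them from a WSDL document.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def call_to_message(operation, params, ns="urn:example:quotes"):
    """Map a procedure call (name + arguments) onto an XML message."""
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{ns}}}{operation}")
    for name, value in params.items():
        ET.SubElement(op, f"{{{ns}}}{name}").text = str(value)
    return ET.tostring(env, encoding="unicode")

def message_to_call(xml_text):
    """The target's side: recover the operation name and arguments."""
    envelope = ET.fromstring(xml_text)
    op = envelope.find(f"{{{SOAP_NS}}}Body")[0]
    name = op.tag.split("}")[-1]
    args = {child.tag.split("}")[-1]: child.text for child in op}
    return name, args

msg = call_to_message("GetQuote", {"symbol": "IBM"})
name, args = message_to_call(msg)   # ('GetQuote', {'symbol': 'IBM'})
```

The essential point is that neither side needs the other's programming model; each maps its own model to and from the shared message format.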

There are several issues that create complications, however. 
Huge value is locked up in the current enterprise infrastructure, but its limited ability to adapt makes it a brake on the evolution of businesses trying to leverage changes in their global environment.  Can the enterprise legacy environment be made adaptive?  Will everything be fine if we achieve this?  What would a replacement adaptive infrastructure look like, and how would it deploy? 

Infrastructure adaptation interfaces
[Figure: sadpws02.gif from rhljsp01.ppt]

Complex adaptive systems

There is a class of systems that is able to adapt effectively.  Complex adaptive systems (CAS - Holland) are aggregate entities with multiply connected nodes and flows between the nodes that share state.  They model the world they are a part of.  They respond effectively to many changes in their aggregate environment, they learn even though the information they obtain from the environment is chaotic, and the models of the world and strategies they develop contain epistatic attributes. 

Complex adaptive systems enable genetic operators to act on them.  Logical and physical tags allow partitioning of the flows within the network of nodes by the operators. 
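A minimal sketch of this structure, with node names and tags invented for illustration: nodes joined by flows, and tags used to partition the sub-network of flows that an operator would act on.

```python
# Nodes, tags and flow rates are invented for illustration.
flows = [
    # (source node, destination node, tag, flow rate)
    ("sensor",  "model",   "state",    5),
    ("model",   "planner", "strategy", 2),
    ("planner", "actor",   "strategy", 2),
    ("actor",   "sensor",  "state",    1),
]

def partition(flows, tag):
    """Select the sub-network of flows carrying a given tag -
    the portion of the aggregate a genetic operator would act on."""
    return [f for f in flows if f[2] == tag]

state_flows = partition(flows, "state")   # only the state-sharing flows
```

The tag does the work here: it lets an operator modify one partition of the aggregate without touching the rest.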

Businesses are aggregate structures with flows of people, information and ideas through the nodes.  When parts of businesses are similarly structured as a network supporting flows of information, ideas and people through the nodes, they can become adaptive too.  When businesses attempt to coordinate changes in their Eco-net and in themselves, they implicitly assume the aggregate has become a complex adaptive system. 

Modular design systems (i.e. those with the properties of high specifiability, measurability and predictability), in aiming to reflect the needs of the evolving environment and to distribute the conclusions to an Eco-net, have been shown to have the properties of a complex adaptive system.  Baldwin and Clark identified a set of genetic operators: splitting, substituting, augmenting, excluding, inverting and porting.  With these operators, highly scalable and adaptive design systems can be constructed. 
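As an illustration, a design can be modeled as a set of named modules, and the operators as functions that recombine it. The module names and version strings below are invented, and only four of the six operators are sketched.

```python
# Module names and version strings are invented for illustration.
def substitute(design, module, better):   # swap in a superior module
    return {**design, module: better}

def augment(design, module, version):     # add a module the design lacked
    return {**design, module: version}

def exclude(design, module):              # drop a module from the design
    return {k: v for k, v in design.items() if k != module}

def split(design, module, parts):         # break one module into several
    rest = exclude(design, module)
    return {**rest, **{p: design[module] for p in parts}}

design = {"kernel": "v1", "fs": "v1", "net": "v1"}
d = substitute(design, "net", "v2")    # a better networking module appears
d = augment(d, "gui", "v1")            # a capability the design lacked
d = split(d, "fs", ["vfs", "ext2"])    # one module becomes two
d = exclude(d, "gui")                  # the experiment fails its market test
# d is now {'kernel': 'v1', 'net': 'v2', 'vfs': 'v1', 'ext2': 'v1'}
```

Because each operator respects the module boundaries, experiments stay local: substituting or excluding one module never forces a redesign of the others.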

The Internet and UNIX

The deployment of the Internet and UNIX was initially supported by unusual financial offers, induced by the regulation of AT&T and government and academic investment in the Arpanet. 

The disruptive nature of this combination of products to OSI and the existing computing platforms was catalyzed by the adaptive approaches taken to the engineering of UNIX and the IETF protocol suite. 

Both were based on simple foundations and modular extensions. 

The World-Wide-Web

As the Internet and UNIX slowly grew to a sizable infrastructure, they acted as the foundation of a higher level aggregate - the World-Wide-Web.  A strongly modular system based on the orthogonality of the Web's specifications (identity, interaction and representation are defined in separate groups of specifications) and definition of interfaces as protocols was developed.  Once again, a pattern of simplicity, exclusion of unnecessary capabilities, and strong modularity ensured the web could be deployed without central coordination.  The asynchronous deployment of web pages and the network effects (the more connections the greater the value of each connection) of the Internet provided huge catalytic force. 

EDI: an old strategy suffering from interface problems - or is it?

The existence of a global network might be expected to enable Electronic Data Interchange (EDI) between companies that have Internet connections. 

EDI aims to flow data between companies so that they can integrate and coordinate processes.  The data is typically labeled and so can be operated on selectively by the receiving business.  The receiving company could provide interfaces to filter the EDI data and forward it to the right "nodes" within its organization. 
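A minimal sketch of such a filtering interface, with hypothetical labels and departments: labeled records are dispatched to registered internal handlers, and unrecognized labels fall through for manual handling.

```python
# The labels (PO, INV, ASN) and departments are hypothetical examples.
handlers = {}

def node(label):
    """Register an internal handler for one kind of labeled EDI data."""
    def register(fn):
        handlers[label] = fn
        return fn
    return register

@node("PO")        # purchase orders -> procurement
def purchase_order(record):
    return f"procurement accepted order {record['id']}"

@node("INV")       # invoices -> accounts payable
def invoice(record):
    return f"accounts payable queued invoice {record['id']}"

def dispatch(records):
    """Filter labeled records, forwarding each to the right node."""
    routed, unhandled = [], []
    for rec in records:
        fn = handlers.get(rec["label"])
        if fn:
            routed.append(fn(rec))
        else:
            unhandled.append(rec)      # no internal node owns this label
    return routed, unhandled

routed, unhandled = dispatch([
    {"label": "PO", "id": "1001"},
    {"label": "INV", "id": "88"},
    {"label": "ASN", "id": "7"},       # shipping notice: no registered node
])
```

The interface itself is trivial; as the next sections argue, the hard part is that the handlers must connect to business processes actually designed to participate in the larger process.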

However, in comparison with UNIX, the Internet and World-Wide-Web deployments, EDI has been very unsuccessful. 

Can Web access to WSDL definitions catalyze EDI success?  If the problem was lack of compatibility of the interfaces and name tags, or efficiency of this operation, it will certainly help. 

What went wrong with EDI?

But it is the next stage of the EDI process that is the problem.  The receiving business would need business processes designed to integrate into the larger process.  In an already-developed business the functions might exist, but there is little reason why they should have taken on the overhead of being modularized.  Even if they were modularized, the end-to-end process would likely require re-factoring to become efficient and effective.  

If the business opportunity provided by EDI justified wholesale process re-engineering then it might occur, even with the risks of impacting the operating processes.  Without a significant upside the re-engineering would not occur.  For example, the check-clearing banks in the UK initiated an EDI activity tasked with reducing clearing transaction times.  However, each of the participating banks actually gained from holding the money in clearing; re-engineering the processes would have hurt their current business results.  The EDI activity failed when the processes were not re-engineered within each bank.

Process improvements may generate value, but this depends on selecting a value-critical process.  The value of a process is highly situation-dependent, and the identification is made more difficult due to the ill-defined nature of process, and the poor measures available to assess the value-add.  It may be the sales process that is critical.  But is it the incentive scheme?  Is the channel strategically conflicted?  There may be no clear cut answer.  Alternative strategies may have to be played out in competition to demonstrate relative merit.  Some options will not be achievable within a company without compromising its current business strategies. 

Enterprises must respond to environmental and competitive change.  The best response will vary, and so the infrastructure must be able to provide support for a variety of business models, including some with directly conflicting demands.  That can place the IT organization in the unenviable position of highlighting factional differences.  These business difficulties should set the expectations for the early returns from deployment of Service Oriented Infrastructure. 

Accepting the conflicts of business process re-engineering, can Web Services architecture provide a catalyst for improved value generation?  An IT architecture that allowed different businesses to utilize alternative strategies, operations and processes, and then to re-factor them if they fail market tests or the environment changes, should offer the enterprise strategic flexibility. 

What would it take to make this infrastructure an effective part of an adaptive architecture?  To answer this question we will look at some successful adaptive systems. 

Root causes of Web success

The Web is a complex adaptive system, a persistent aggregate of the core IETF protocols.  The Internet's proliferation was catalyzed by its disruptive properties, but these would have been irrelevant if the protocols had not been modular, and adaptable when broadly deployed.

The Internet's modularity resulted from having to ensure robust operation of the network without the ability to coordinate or control deployment and operation.  Robustness required:

The World-Wide-Web leveraged the Internet's methodologies and solved a very specific computing problem.  The problem was from within the distributed data processing design area already modularized by the IETF.  The solution used a mechanism analogous to the one deployed already in the global DNS. 

DNS: a highly scalable complex adaptive system (CAS)

The Internet's Domain Name System (DNS) highlights many aspects of a CAS.  Its architecture: 

Certain well-connected agents become key cache points; name servers do not have to gain deep knowledge of addresses, instead maintaining information addressing the authoritative servers, which they attempt to share with requestors. 

The system is designed with recursive queries which encourage CAS nonlinearity. 

If a hole develops, say a company needs to provide administrative support for its sub-network, the current administration point can delegate a new zone of authority.  The network effects of the Internet, and the engineering policies that make resolvers easy to develop, ensure that computing utilities will contain compatible agents and will adapt to the new name server. 
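This delegation-and-caching behaviour can be sketched as a toy resolver. All zone data below is invented; the point is that referrals are followed from the root toward the authoritative server, answers are cached at well-connected points, and delegating a new zone is a purely local change at the parent.

```python
# All zone data here is invented for illustration.
ZONES = {
    ".":            {"com.": "ns.com."},                   # root delegates com.
    "com.":         {"example.com.": "ns.example.com."},   # com. delegates the zone
    "example.com.": {"www.example.com.": "192.0.2.10"},    # authoritative data
}

cache = {}

def resolve(name, zone="."):
    """Follow referrals from the root until an authoritative answer."""
    if name in cache:                        # a well-connected cache point
        return cache[name]
    for suffix, answer in ZONES[zone].items():
        if name == suffix:                   # authoritative: cache and answer
            cache[name] = answer
            return answer
        if name.endswith("." + suffix) and suffix in ZONES:
            return resolve(name, suffix)     # recurse into the delegated zone
    raise KeyError(name)

addr = resolve("www.example.com.")           # '192.0.2.10'

# Delegating a new zone of authority is a purely local change at the parent:
ZONES["example.com."]["corp.example.com."] = "ns.corp.example.com."
ZONES["corp.example.com."] = {"mail.corp.example.com.": "192.0.2.99"}
addr2 = resolve("mail.corp.example.com.")    # '192.0.2.99'
```

No agent outside the delegating zone had to be reconfigured for the new name server to be reachable, which is the adaptability property the text attributes to the DNS.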

DNS adaptation and evolution

The DNS has a well structured specification that is maintained and changed by participants of the IETF.  Its CAS flow structure and default hierarchy allow for autonomous extensions of the system, and when it is found that the core agent framework does not enable required performance, the framework can be altered using the "genetic operators" of the IETF process (splitting, substituting, augmenting, excluding, porting) and market feedback. 

The result is a highly adaptable and scalable aggregate. 

The architecture of a complex adaptive system

Adaptive system architecture
[Figure: rhlcaa01.gif from rhljsp01.ppt]

Complex adaptive systems can evolve because they possess certain common architectural features.  Each of our adaptive systems can be seen to conform to this architecture. 

Merging business architectures - the problem of hidden aspects of design

Each of my earlier scenarios turns out to be from the same communication-software design space.  Without this property, the hidden aspects of design can no longer be guaranteed to fit the modularity assumptions of the design rules for each sub-space. 

EDI's problem is that it attempts to integrate mission-critical IT infrastructure and stateful business processes, with typically chaotic architectures, from separate companies through architected interfaces.  In so doing it merges the hidden aspects, generating system-wide properties that are unpredictable.  In such an environment, neither predictability nor adaptability is likely to be achieved. 

Web services, when targeted at the same goal as EDI, suffer from the same weakness. 

Today's deployments of Enterprise Resource Planning systems have improved businesses' operational efficiency by removing current chaos from operations.  Corporate strategies have aligned with and championed the operational changes required.  Efficiency and predictability are obtained. 

However, as the scope of infrastructure change is broadened to increase the adaptability of the enterprise, these infrastructure systems will need dramatically different capabilities to model relations of a political and uncertain strategic nature since they don't:

The design aspects are likely to have to re-factor functionally disparate facets:
Any potential opportunity will require a typically lengthy restructuring, involving definition of a shared language, logic, metrics and agreement on a valid business Eco-system design. 

The co-dependencies of incumbent members of an Eco-net supporting the business limit the options the enterprise will be able to use to migrate to the new structure without effective modularization of many extra aspects:

The difficulty of predicting environmental changes and their consequences limits the applicability of long-term commitments and places a premium on adaptability. 

The power, knowledge and compatibility relationships between the environment and agents will affect the agents' adaptability.  Asset properties, communication capabilities and legal costs must all be expressed in the classes of agents that form.  Knowledge of the environment enables the formation of agents that can interface easily with the infrastructure. 

Opportunism within a complex relationship can be expressed in a variety of ways.  Participants are likely to maintain strategies for both cooperation and opportunism, and often the system is extended, at a cost, to provide legal encouragement for cooperation.  Nevertheless, opportunism, cheating and parasitism will likely have a lower cost than a legal structure that could totally constrain them.  Hence today's human agents have developed sensors and instincts for detecting the high-risk lies that impacted their ancestors' use of trust over evolutionary time. 

Some systems become increasingly regular over time.  As long as this holds, investment in transparency and end-to-end optimization yields improved benefits.  Transparency can be developed through end-to-end refactoring or through attribute-by-attribute standardization.  In the general case, however, systems will be evolving on the edge of chaos. 

It is a significant challenge to find and apply the right strategy for each shift in the environment. 

Strategic approaches to allocating resources to systems adding modularity

Disruptive strategies

Limited markets can be targeted with low-cost, small-scale business models when these are able to justify the unpredictable growth process.  Once the design rules are agreed upon and the modularity is shown to be effective, the market can form.  If it becomes large enough, through an incremental process enabled by the modularity, then the disruptive businesses will grow, in conditions that will destroy most incumbents from intersecting markets.  High-profit monopolists can theoretically respond effectively but, excepting these special cases, impacted businesses will struggle to justify any direct response except retreat. 

Indirect strategies 

For such businesses, the inability to justify direct participation in the competition can be countered with indirect strategies, exemplified by IBM's use of Web-services as glue to integrate specific enterprise systems as consulting projects.  Participation of IBM researchers, consultants and designers enables the capture of business and technology design issues from the projects.   These issues can be categorized and integrated into a language, logic and metrics to enable searching the environment for large opportunity spaces, and to identify technologies that match the design rules.  

By encouraging third parties to resource the researching of modular technical solutions and promote the legitimacy of web-services as an enterprise integration bus, IBM can utilize any demand induced in customers at low cost to itself relative to its competition. 

For current leading companies, the paradoxical investment in a potential disruption creates significant stresses.  If the current business can sustain the additional funds, it can adopt strategies that limit commitment and waste by recycling people and product modules while replacing its business infrastructure with modular components.  Still, the change is not without risk and difficulty.  A major network operator will find it hard to balance the need for open standards with its current position of power over its suppliers and partners.  Executives are well advised to limit their association with new technology until it has proved itself.  At the very least, it must provide support for making these new assets fungible, or the claims of adaptability are dubious. 

Emergence of web services as a complex adaptive system - not EAI but Google

For a pure-play software company providing the infrastructure supporting web services, the limited early success of EDI and EAI based on the new infrastructure is likely to induce questioning of the fitness of the strategy.  While limiting commitments to the pre-conditions is sound for this market cluster, such companies appear squeezed between high-touch strategies and open-source strategies.  Competitive strategies do exist for these players, and they are of the most indirect and resource-conserving kind.  However, most participants are using direct, costly strategies, such as funding development and promotion of EAI solutions, with limited chance of success. 

As current businesses migrate their applications to the new architectures, or new businesses deploy directly onto them, increased conformance to the web-services design rules will relax the adaptability constraint imposed by current applications.  If the new infrastructure proves to operate predictably, with acceptable performance, the modularity will enable adaptation.  But human action is typically key to the adaptability of evolving processes, and replacing ad-hoc knowledge networks and informal, limited-commitment business relationships with architected processes has typically failed to sustain the knowledge and the relationships.  Evolving effective strategies will require the development of multiple competing strategies, and integrating business and information infrastructure successfully will require mechanisms to compete and select among integrated strategies. 

As long as any architecture adopted has key enabling properties such as global scalability, and it is modular across the necessary aspects, and supported by a process that enables adaptation, new infrastructure-based catalysis will emerge over time.  Just don't bet that web-services will solve the modularity issues of today's business "architectures" overnight. 


Copyright ©2004 Robert Lingley