
The methods and tools of compositional business and software engineering are just emerging. The move beyond the object to the component paradigm has been underway for only a few years. Most methods associated with software engineering are top-down and decompositional, and they are still fully viable approaches. This long entrenched way of thinking will take time to be extended by mature compositional methods and tools. Fortunately, the key players in the software development industry are committed to the component model and early methods and tools have already appeared (CSC's Lynx method, Riverton Software's HOW, ICON Computing's Catalysis, Sterling Software's COOL). An essential corporate software strategy for e-Commerce is to adopt component-based architectures, development methods and tools, and evolve with them - today's methods and tools are quite powerful and will only get better.

Software architectures are generally tiered and divided into two broad categories: information architectures that deal with the business domain and technical architectures that deal with the computing infrastructure services needed by the application domain. While information architecture partitions the business logic across components and arranges the collaborations needed to meet functional requirements, technical architecture includes domain-independent design decisions such as middleware, database management, failover, event management and persistence. Technical architecture should be developed early as it provides the technical services upon which applications will be built and reinforces the separation of concerns so critical to rapid component-based development.

Issue 6: Server-side Component Models, Platforms and Frameworks

Software component architectures are service-based: end-user services, business process services and data services. Application components rely on distributed computing infrastructure services, freeing solution developers from the complexity and intricacies of the underlying technologies. Component builders are technologists who use component-based software engineering disciplines to produce components of the highest quality. Solution developers consume these prefabricated components during business process modeling and rapid application assembly.

Component architectures divide software into construction and consumption: once built in compliance with standards, newly constructed software components can register the services they provide, while other components can subscribe to and consume these services. Components do not act alone; they plug into component frameworks that connect participating components and mediate, regulate and enforce the rules of component interaction. Component frameworks are arranged in a tiered architecture. Figure 4 illustrates components and how they plug into an interoperability framework that in turn calls on the services and facilities of a distributed computing platform.
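As a rough illustration of this register-and-consume contract, a minimal mediating framework might look like the sketch below. All names here (ComponentFramework, Service, the "CreditCheck" service) are invented for illustration and come from no actual product.

```java
import java.util.HashMap;
import java.util.Map;

public class ComponentFramework {
    // A service is anything a component publishes for others to consume.
    public interface Service {
        String invoke(String request);
    }

    private final Map<String, Service> registry = new HashMap<>();

    // Construction side: a newly built component registers the services it provides.
    public void register(String serviceName, Service service) {
        registry.put(serviceName, service);
    }

    // Consumption side: other components subscribe to and consume those services.
    // The framework mediates the interaction; consumers never hold direct references.
    public String consume(String serviceName, String request) {
        Service s = registry.get(serviceName);
        if (s == null) {
            throw new IllegalStateException("No component provides: " + serviceName);
        }
        return s.invoke(request);
    }

    public static void main(String[] args) {
        ComponentFramework framework = new ComponentFramework();
        framework.register("CreditCheck", req -> "approved:" + req);
        System.out.println(framework.consume("CreditCheck", "order-42")); // prints "approved:order-42"
    }
}
```

The point of the sketch is the indirection: because every interaction passes through the framework, the framework can enforce the rules of component interaction in one place.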

Two leading component models are Microsoft's COM+ and Sun Microsystems' Enterprise JavaBeans (EJB). Both are server component models, as opposed to client-side components such as JavaBeans or ActiveX. The chief difference between the two is that the Microsoft model is restricted to Microsoft runtime systems and communication protocols. For example, although COM and DCOM have been ported to Solaris and other Unix platforms, COM containers are not available on these platforms. EJB, on the other hand, adheres to the write once, run anywhere (WORA) philosophy that made Java an overnight success for programming the Internet.

EJB and the CORBA middleware standard go hand-in-hand. EJB provides software portability (allowing EJB components to run on any operating system) and CORBA provides software interoperability with platform services written in languages other than Java, such as C++ and Smalltalk. Rather than reinvent the wheel, EJB's application programming interfaces (APIs) provide access to underlying enterprise-class, standards-based infrastructure services. For example, the Java Transaction API defines a distributed transaction management service based on CORBA's Object Transaction Service (OTS) standard. In turn, the Java IDL creates a remote interface to CORBA communication and includes a lightweight object request broker (ORB) that supports CORBA's Internet Inter-ORB Protocol (IIOP).
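As a conceptual sketch only, the transaction-demarcation contract the Java Transaction API exposes over OTS amounts to begin/commit/rollback. The plain-Java mock below merely imitates that contract so it can run standalone; real code would obtain a javax.transaction.UserTransaction from its container rather than use these invented classes.

```java
public class TxDemo {
    enum Status { NONE, ACTIVE, COMMITTED, ROLLED_BACK }

    // Invented stand-in that mirrors the shape of a JTA user transaction.
    static class UserTransactionSketch {
        private Status status = Status.NONE;

        public void begin() { status = Status.ACTIVE; }

        public void commit() {
            if (status != Status.ACTIVE) throw new IllegalStateException("no active transaction");
            status = Status.COMMITTED;
        }

        public void rollback() {
            if (status != Status.ACTIVE) throw new IllegalStateException("no active transaction");
            status = Status.ROLLED_BACK;
        }

        public Status status() { return status; }
    }

    public static void main(String[] args) {
        UserTransactionSketch tx = new UserTransactionSketch();
        tx.begin();       // demarcate the unit of work
        // ... update accounts, post ledger entries ...
        tx.commit();      // the OTS-backed service would make the work durable
        System.out.println(tx.status()); // prints "COMMITTED"
    }
}
```

What matters to the solution developer is that this same small contract works against any compliant transaction service underneath, which is exactly the portability-plus-interoperability bargain described above.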

Business and e-Commerce application components will not be delivered to corporations as a big pile of parts and pieces. Instead the components will be preassembled into industry-specific application frameworks, as illustrated by the Financial, Manufacturing, and e-Commerce components shown in the figure below.

A Business Component Software Architecture

The e-Commerce components provide essential inter-enterprise functionality for e-Commerce (e.g. negotiation, mediation, inter-enterprise user access management, inter-enterprise workflow management and event notification). These frameworks represent applications that are roughly 60 to 80 percent complete. The task of solution developers is to customize and extend the frameworks to incorporate the unique business rules and processes of a company. Thus, solution developers concentrate on the unique character and knowledge of the company that provides its competitive advantage. They are insulated from the technology plumbing.

IBM's SanFrancisco(tm) is a leading example of a component-based application framework that includes most of the code needed for developing server-side, mission-critical business applications. SanFrancisco's component architecture is layered and services-based, using an architecture similar to that portrayed in the figure above. It includes four layers. The Foundation layer utilizes the distributed object services of the EJB component platform that, in turn, are based on definitions of the Object Management Group (OMG) (i.e. object transaction services, collections, object communication and persistence). The Foundation layer provides the base classes and utilities needed to support the distributed, mission-critical applications for which SanFrancisco was designed. Built on the Foundation layer is the Common Business Object layer that provides cross-application business objects common to most commercial domains (e.g. Company, Address, Business Partner, Bank Accounts, Unit of Measure and Credit Check). The Core Business Process frameworks layer provides the prefabricated components for applications in a specific business domain (e.g. General Ledger, Accounts Receivable, Accounts Payable, Warehouse Management and Order Management). The final Application layer is built by solution developers who map business requirements to the core business processes, configure their properties to reflect unique business needs, and extend the prebuilt functionality to add specific requirements, the user interface, business rules, competitive differentiators and complementary application functionality.
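The layering can be sketched in plain Java. The classes below (Company, PreferredCompany, the credit-limit rule) are invented for illustration and are not SanFrancisco's actual classes: a Common Business Object supplies default behavior, and the Application layer extends it with company-specific rules.

```java
public class LayeringDemo {
    // Common Business Object layer: a cross-application object such as Company,
    // with a default policy supplied by the framework.
    static class Company {
        private final String name;
        Company(String name) { this.name = name; }
        String name() { return name; }
        double creditLimit() { return 10_000.0; }   // framework default
        String creditCheck(double amount) {
            return amount <= creditLimit() ? "approved" : "declined";
        }
    }

    // Application layer: solution developers extend the prebuilt functionality
    // with unique business rules (here, a higher limit for preferred partners).
    static class PreferredCompany extends Company {
        PreferredCompany(String name) { super(name); }
        @Override double creditLimit() { return 50_000.0; }
    }

    public static void main(String[] args) {
        Company c = new PreferredCompany("Acme");
        System.out.println(c.name() + ": " + c.creditCheck(25_000.0)); // prints "Acme: approved"
    }
}
```

The design point is that the business-process code calls creditCheck without knowing which layer supplied the policy, so prebuilt functionality and company-specific extensions coexist cleanly.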

SanFrancisco makes extensive use of design patterns derived from classical architecture. Design patterns are techniques used to solve recurring design problems. As we mentioned, Christopher Alexander defined the concept of patterns in architecture: "Each design pattern describes a problem which occurs over and over again in our environment, and then describes the core of the solution to that problem, in such a way that you can use this solution a million times over, without ever doing it the same way twice." In the worlds of objects and software components, design patterns form the cornerstone of software reuse and provide consistency throughout the SanFrancisco frameworks.

The breakthrough of server-side component architecture is the clear separation of business and technology concerns that simplifies the process of building enterprise-class e-Commerce applications. Solution developers can concentrate on business models and logic while the component framework manages the complex middleware services for the applications. For an architectural approach to software development to be practical, enterprise-class e-Commerce systems and their component frameworks must have certain characteristics. These non-functional characteristics include the requirements and constraints shown in Table 1.

loosely coupled
self-organizing
self-describing
event-driven
asynchronous
reliable
secure
survivable
intuitive
collaborative
embody business controls
auditable

Table 1: Requirements and Constraints

In a DARPA research report, Thompson, Linden and Filman list the architectural properties required of the non-functional, technology-level environment. As shown in Table 2, their list is dominated by property names suffixed with "ility," and thus the list is referred to as the ilities of architecture. "Software is designed primarily to meet functional specifications - that is, to achieve particular input and output behavior. However, a well-designed system achieves an appropriate balance of other [non-functional] properties: those that encompass its ability to exist in its environment and evolve as that environment changes. Following evolving practice, we call these ilities. Ilities are architectural properties that often can cut across different levels or components of a system's architecture to be properties of the whole. Unlike simple functionality, ilities can include tradeoffs - for example, enhanced interoperability may have been produced at the cost of lesser security. Although ilities are sometimes invisible properties of the system as a whole, they represent real (though perhaps unspoken or even unrealized) customer requirements. The enterprise architect or system architect is responsible for ensuring such ilities, and a most appropriate task for an architectural framework is to support them." In total, these ilities create the ability to evolve and meet new demands of the business.

interoperability     composability
scalability          evolvability
extensibility        tailorability
security             reliability
adaptability         survivability
affordability        maintainability
understandability    performance
quality of service   nomadicity

Table 2: Some "Ilities"

Third-wave companies embrace component-based architecture, and their ultimate strategy is to implement an architected approach to rapid application development. Components accelerate the entire application development life cycle and, when used directly in business process modeling, they eliminate the disconnect between business and technology. This approach is the essence of business and software agility. Although component-based development is an emerging approach, sufficient methods and tools are available to begin using it now, with even greater benefits as the approach grows to full maturity.

Issue 7: The XML Factor: Industry Vocabularies

Open markets require the interoperation of e-Commerce applications and consistent protocols and formats for information interchange. The complexity of building such virtual marketplaces mandates a common vocabulary based on standards if there is to be any hope of interoperability at the commerce level. The ultimate purpose of e-Commerce interoperability standards is to develop consistent business semantics that can be used by all participants - a common language of digital commerce. These semantics provide commonality to the names of and relationships between processes, workflows, and data across value and supply chains.

The World Wide Web Consortium (W3C) adopted a new standard for defining and naming data on a Web page in 1998. It is likely that the eXtensible Markup Language (XML) will revolutionize the Web because it allows structured data - with standard names and consistent semantics - to be moved around the Web in a simple, straightforward way, as easily as HTML does today. XML is a native Web approach that enables extensible data-exchange formats, and gives industries the flexibility to create their own data tags to develop a shared Internet file system.

XML is being touted as the biggest thing to hit the Internet since the introduction of Java. XML, however, is not a replacement for Java. In fact, Jon Bosak, the Sun engineer who chaired the W3C XML Coordination Group and is generally regarded as the father of XML, sees the two as complementary. In Bosak's words, "XML and Java technologies are the yin and yang of vendor-neutral programming. Put them together and you have a complete, platform-independent, Web-based computing environment." Java provides a way for software programs to be shuttled around the Internet; XML does the same thing for data, offering a universal data interchange format. With Java and XML working together (XMLbeans), the result is much like the science fiction films where all the information (data, process and control) needed to perform an activity moves seamlessly from computer to computer throughout cyberspace. Java turns the Internet into a single, ubiquitous computer; XML turns the Internet into a single, ubiquitous filing cabinet for data.

The contents of an XML document, however, need not be only data. In fact the term "smart data" may be more appropriate. Anne Thomas of the Patricia Seybold Group explains, "Combining Java and XML technologies produces portable 'smart' data. XML supplies a universally portable structured data format, and Java technology supplies universally portable code. Since code written in the Java programming language can be embedded into a document written in the XML language, we can create a data structure that includes its own data manipulation applications. It's a great combination." Thus XML can play a role at all three levels of process integration as illustrated in Figure 3: simple data hand-offs, process hand-offs and real-time interoperation. 

XML is a document-centered technology ideally suited for message passing between trading partners in an e-Commerce ecosystem. Document messaging is a way for e-Commerce applications to interoperate in a loosely coupled, request-for-service communication process. The document type definition (DTD) alone can identify a given document type in a business-to-business transaction. This is similar to the various document types defined for the EDI community. For example, an ANSI X12 EDI 850 is a Purchase Order transaction set. By sending such a document to an EDI-enabled system, the receiving organization knows what processing services to perform on the data. Such data hand-offs trigger business processes in the receiving organization based simply on knowing the document type contained in the message sent to it.
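A minimal sketch of this kind of document-type dispatch, using the JDK's standard DOM parser: the receiver inspects only the DOCTYPE name to decide which business process to trigger. The document type names and process labels here are invented, not actual X12 or EDI definitions.

```java
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.xml.sax.InputSource;

public class DoctypeDispatch {
    public static String dispatch(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new InputSource(new StringReader(xml)));
        // Route on the document type alone, as an EDI-enabled system routes an 850.
        String type = doc.getDoctype().getName();
        switch (type) {
            case "PurchaseOrder": return "start order-entry process";
            case "Invoice":       return "start accounts-payable process";
            default:              return "reject: unknown document type";
        }
    }

    public static void main(String[] args) throws Exception {
        String po = "<!DOCTYPE PurchaseOrder [<!ELEMENT PurchaseOrder ANY>]>"
                  + "<PurchaseOrder>widgets x 100</PurchaseOrder>";
        System.out.println(dispatch(po)); // prints "start order-entry process"
    }
}
```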

On the other hand, a process tag (e.g. <Process>) can be included in an XML document to indicate the business process or system process to be launched or invoked by the receiving organization as a result of the process hand-off. The Document Object Model (DOM) is a platform- and language-neutral interface that allows programs and scripts to dynamically access and update the content, structure and style of documents. DOM allows XML content (including all markup and any Document Type Definitions) to be accessed and manipulated as a collection of objects. The document can be further processed and the results of that processing can be incorporated back into the presented page. Java code thus embedded in an XML document enables real-time process integration. Regardless of the techniques used to add "behavior" to XML documents, the result is a document services architecture that allows trading partners to combine and request services needed to process business documents marked up in XML.
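A short sketch of a process hand-off using the standard DOM API: the receiver walks the document's object tree and reads the process to launch. The <Process> tag and the message structure are hypothetical, invented for this illustration.

```java
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.xml.sax.InputSource;

public class ProcessHandoff {
    public static String processName(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new InputSource(new StringReader(xml)));
        // DOM exposes the document as a tree of objects; here we read the
        // content of the first <Process> element to learn what to invoke.
        return doc.getElementsByTagName("Process").item(0).getTextContent();
    }

    public static void main(String[] args) throws Exception {
        String msg = "<Message><Process>CheckInventory</Process>"
                   + "<Item sku='123' qty='10'/></Message>";
        System.out.println(processName(msg)); // prints "CheckInventory"
    }
}
```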

XML also is being embraced by the Enterprise Application Integration (EAI) community, whose central goal is to achieve interoperability between legacy and Enterprise Resource Planning (ERP) systems. ERP integration has been a long-standing problem that has become a burning issue with the advent of e-Commerce. Using the same principles described above, ERP vendors (SAP, Baan, PeopleSoft and Oracle) and the non-profit Open Applications Group (OAG) are turning to XML to solve some of the chronic interoperability problems between and among ERP systems. Today, companies have a choice of digging in and dealing with proprietary application program interfaces (APIs) such as SAP's BAPI, or they can turn to third-party companies like CrossWorlds, Active Software or Visual Edge to provide adapters for higher-level interfaces to the leading ERP systems. The ERP vendors are developing means to take lower-level function calls to their systems and translate them into standardized XML documents. Thus, XML can be used to greatly simplify and lower the costs of legacy and ERP interoperation just as it can simplify and lower the cost of EDI.
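As a hedged illustration of such an adapter, the sketch below renders a lower-level function call as a standardized XML document that another system can consume. The element names are invented for illustration and are not OAG or vendor definitions.

```java
public class ErpAdapter {
    // A native ERP-style call such as createOrder(customer, sku, qty) is
    // translated into an XML document instead of a proprietary API invocation.
    public static String createOrderAsXml(String customer, String sku, int qty) {
        return "<AddOrder>"
             +   "<Customer>" + customer + "</Customer>"
             +   "<Item sku=\"" + sku + "\" qty=\"" + qty + "\"/>"
             + "</AddOrder>";
    }

    public static void main(String[] args) {
        System.out.println(createOrderAsXml("Acme", "X-100", 25));
    }
}
```

Because the document, not the API, becomes the integration contract, any receiver that understands the agreed vocabulary can process the order without linking against the sender's system.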

XML is, however, not a silver bullet, and in some ways it is little more than the reintroduction of the "unit record concept" that was introduced with the punched card in the 1950s, where chunks of data (fields) were tagged with names that gave us attribute/value pairs bound together as a standalone document (a record). After all, XML is simply text (ASCII) data, and it must have links to a powerful, underlying object infrastructure (based on a Web Object Model) to handle the adaptive business processes and workflows that e-Commerce requires. The truly difficult part of this process is gaining global agreement on the semantics - an effort that has eluded information systems designers since the introduction of centralized corporate databases in the 1960s. Ask any experienced data administrator and they will tell you that they cannot even get departments within the same company to agree on data names, much less their meaning. 

Rik Drummond, CommerceNet's XML/EDI project leader, put it this way in his research note, Is XML Dead in the Water?: "Just like the 1992 Clinton presidential race's 'It's the economy, dummy!' - in XML, 'It's the semantics, dummy!' Since XML is a metalanguage [a language used to define other languages], it is easy to define the structure of the document to describe how things relate to each other. It also allows one to establish data element names, that is, names such as <price>, <DiscountPrice> or <size>. The syntax is not the problem, because the XML parsers can read the syntax. The semantics are the issue. This is especially true with semantic interoperability. For example, how do we agree on what <size> means? Is it length, width, depth, or some combination of all three? Is it in feet, yards or meters? These are semantic issues."

Thus, XML is simply an enabler, not a guarantor, of the consistent business semantics required for e-Commerce. XML can be used within a single company with little debate over vocabulary and semantics. Two trading partners could do the same, although this process could require mapping the DTDs of each company if each had its own definitions. Add more trading partners and the mapping problem grows combinatorially: n partners, each with its own definitions, can require as many as n(n-1)/2 pairwise mappings. What is really needed for open e-Commerce are standard XML industry vocabularies for DTDs and schemas. Then companies that adhere to the standards can all talk to each other digitally in the same tongue, no mapping required.

Some industries will do better than others at establishing semantic consensus. For example, groups such as the XML/EDI Group and the Data Interchange Standards Association (DISA) may have early success because they are dealing with an established body of inter-enterprise business semantics. The task is one of mapping EDI document definitions to XML, and DISA, which maintains the ANSI ASC X12 EDI standards, has already undertaken XML initiatives.

A number of XML industry vocabularies are being developed by individual companies, vertical industry consortia and software industry groups. Some of these are listed here and a brief description of each appears in Appendix A:

Open Financial Exchange (OFX/OFE) for consumer finance

Information and Content Exchange (ICE) for content syndication and exchange

Open Trading Protocol (OTP)

Open Buying on the Internet (OBI) for MRO procurement

DISA XML/EDI

Open E-Book (OEB) for electronic book publishing

Financial Products Markup Language (FpML) for financial derivatives

FinXML(tm) for trading and risk management in capital markets

FIX/FIXML for securities transactions in the equities market

Signed Document Markup Language (SDML) for digital signatures

Electronic Commerce Modeling Language (ECML) for electronic wallets

XMLNews for the news industry

RosettaNet for the PC industry supply chain

cXML for e-Commerce transactions

XML Metadata Interchange (XMI) for interoperating UML and data warehousing models and artifacts

Channel Definition Format (CDF), a Microsoft push standard

Bank Internet Payment System (BIPS)

OpenMLS for real estate DTD design

Legal XML Working Group

Customer Support Consortium

Electronic Component Manufacturer Data Sheet Library Specification (ECMData)

XML-HR for recruiting and placement

Simple Workflow Access Protocol (SWAP)

SAE J2008, XML for the automotive industry

HL7 for healthcare

ACORD for insurance

The list of standards and industry vocabulary initiatives will surely grow, probably until it comes close to the number of global standards related to commerce in the physical world. The sheer number of new and evolving industry vocabularies causes obvious problems. What happens when connections are needed between them in a given e-Commerce ecosystem? What is needed is a common repository of XML schemas that can be shared by participants globally over the Net. The ideal goal would be to establish an XML portal for industry vocabularies. The portal should serve as a repository and a registry of vocabularies. Within the registries, XML tags must be managed using the Namespaces standard for XML so that names do not overlap and collide in documents containing multiple vocabularies.
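A small sketch of how the Namespaces mechanism keeps same-named tags from colliding: two vocabularies both define a "price" element, and the namespace URI, not the tag name, tells them apart. The URIs, prefixes and element names below are invented for illustration.

```java
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.xml.sax.InputSource;

public class NamespaceDemo {
    public static String priceFrom(String xml, String namespaceUri) throws Exception {
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        factory.setNamespaceAware(true);  // required for the parser to resolve prefixes
        Document doc = factory.newDocumentBuilder()
                .parse(new InputSource(new StringReader(xml)));
        // Look up "price" within one specific vocabulary; the other vocabulary's
        // "price" element is simply a different name in a different namespace.
        return doc.getElementsByTagNameNS(namespaceUri, "price")
                  .item(0).getTextContent();
    }

    public static void main(String[] args) throws Exception {
        String order = "<order xmlns:cat='urn:example:catalog'"
                     + "       xmlns:fin='urn:example:finance'>"
                     + "<cat:price>9.99</cat:price>"
                     + "<fin:price>USD</fin:price></order>";
        System.out.println(priceFrom(order, "urn:example:catalog")); // prints "9.99"
    }
}
```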

While registries must be organized into a meaningful taxonomy, such taxonomies must be more than a way to classify information. They must encode and represent knowledge to get at the meaning of the information. While XML data and Java processing bring the information needed for e-Commerce, ontologies bring knowledge to the e-Commerce table. Open markets require smart software acting on behalf of all participants.

Knowledge representation is a discipline within the artificial intelligence community and is captured in the form of ontologies. What is an ontology? In its general meaning, ontology is the branch of metaphysics that deals with what kinds of things exist in the universe - the classification and essence of things. Tom Gruber of Stanford University provides the short answer in the context of artificial intelligence: "An ontology is an explicit specification of a conceptualization." He explains, "A body of formally represented knowledge is based on conceptualization: the objects, concepts, and other entities that are assumed to exist in some area of interest and the relationships that hold among them." An ontology defines the basic terms and relations comprising the vocabulary of a topic area, as well as the rules for combining terms and relations to define extensions to the vocabulary. All domain models embody an ontology - albeit mostly implicitly. An artificial intelligence approach to defining business objects emphasizes the use of explicit ontologies as implementation-neutral representations of knowledge that can then be mechanically translated into different target modeling tools.
