
An approach to integrating protocol design disciplines

The above example attests to the need to tailor protocols to the environment they operate in, and is the strongest argument for a design methodology that integrates performance metrics with functional correctness. Separating the design of the protocol from the context in which it exists leads to performance penalties that are unacceptable for wireless, portable applications. The remainder of this dissertation explores the relationships between specification, verification, performance estimation, and implementation. The summary of this exploration is presented here as a guide for the reader.

The formal methods community has long advocated a methodology that begins with an abstract, formal description of the system functionality, which then serves as the basis for rigorous formal verification and architectural exploration.

Conceptually, this provides the designer with an implementation-independent way of evaluating a protocol or an algorithm. Further, this methodology proposes that the designer refine this abstract description by successively adding implementation details, proving at each step that the refinement is consistent with the original specification.

In practice, few if any formal methods are employed in the design community.

Informal text documents usually specify the system requirements, and the typical design flow starts with simulation models based on these informal descriptions.

Simulation is then used to drive the bulk of the algorithmic exploration, and the results of these simulations are used to elaborate and fine-tune the original design.

Typically, it is only after a prototype of the system has been built and checked via black-box conformance testing [Hol92] that the system is checked against the original specification.

Figure 1-2. Mixed Formal/Informal design flow for data link protocols

The premise of this thesis is that a mix of informal and formal specifications and models is needed in order to facilitate the design of robust protocols that have reasonable performance. However, it is essential to understand where each is most appropriate in the design flow, as well as the relationships between formal and informal models. With this in mind, we recommend the following methodology:

1) Develop a set of functional requirements that specify the services that a protocol is required to provide, along with performance considerations. For example, a data link protocol for mobile applications must support roaming, thus part of the functional requirement is “support for mobility”.

2) Develop an informal, coarse-grained architectural definition of the system that identifies a set of message passing entities (e.g., mobile devices and basestations), along with a (perhaps incomplete) set of message exchange sequences for each protocol function. Performance considerations, along with details about computation, data structures, etc., are omitted. The primary purpose of this phase is to focus on the exchange sequences that comprise the protocol, without regard to implementation, in typical scenarios. Message sequence charts (MSCs) are one semi-formal approach that provides a means to graphically depict the actors, the state of each actor as time progresses, and their possible interactions. In addition, this specification can be used later during verification to check the trace-equivalence of the implementation at the message passing and state transition level.
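To make the trace-checking idea concrete, the following sketch represents one MSC scenario as an ordered list of (sender, receiver, message) events and checks whether a trace recorded from an implementation is consistent with it. The entity and message names (`mobile`, `basestation`, `LINK_REQUEST`, etc.) are illustrative assumptions, not taken from any particular protocol, and the check is a simple projection-based subsequence test rather than full trace equivalence.

```python
# One MSC scenario for a simplified link-establishment handshake between
# a mobile device and a basestation. Each event is (sender, receiver, message).
msc_link_setup = [
    ("mobile", "basestation", "LINK_REQUEST"),
    ("basestation", "mobile", "LINK_ACCEPT"),
    ("mobile", "basestation", "LINK_CONFIRM"),
]

def conforms(trace, msc):
    """True if the MSC events occur in the trace, in order.

    Events not mentioned in the MSC (e.g., internal timer events) are
    ignored, so this is a projection onto the MSC's alphabet, not full
    trace equivalence.
    """
    it = iter(trace)
    # "event in it" advances the iterator, so this tests for a subsequence.
    return all(event in it for event in msc)

# A trace as it might be recorded from an implementation, including an
# internal event that the MSC does not mention.
observed = [
    ("mobile", "basestation", "LINK_REQUEST"),
    ("basestation", "basestation", "TIMER_SET"),  # internal, ignored
    ("basestation", "mobile", "LINK_ACCEPT"),
    ("mobile", "basestation", "LINK_CONFIRM"),
]
```

A trace that omits or reorders any of the three handshake messages would fail the check, which is the kind of discrepancy this phase of the methodology is meant to surface early.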

3) Develop a more detailed, formal state machine model of each actor in the system, omitting performance-tuning features of the protocol during the early stages. Though a variety of formal languages exist, the most widely accepted of these is SDL, which is a reasonable choice for modeling functionality that is likely to be implemented in software (e.g., the logical link and high-level MAC functionality). The strength of SDL at this level is that it allows the designer to focus on the state machines, the messages that are exchanged, and the structure of the system at the block-diagram level. The formal semantics of timers, channels, and message passing allows the designer to focus creative effort on the design of the protocol rather than on developing a simulation infrastructure and defining the state machine using the semantics of a simulator.
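The essence of an SDL process at this level of abstraction can be sketched as a state machine that consumes one signal at a time from a FIFO input queue, with a transition table selecting the next state and any output signals. The sketch below assumes this simplified view; the state and signal names are hypothetical, and real SDL semantics (save, enabling conditions, timers as first-class objects) are omitted.

```python
from collections import deque

# Transition table: (state, input signal) -> (next state, [output signals]).
# A toy basestation-side link FSM; names are illustrative only.
LINK_FSM = {
    ("IDLE",         "LINK_REQUEST"): ("WAIT_CONFIRM", ["LINK_ACCEPT"]),
    ("WAIT_CONFIRM", "LINK_CONFIRM"): ("CONNECTED",    []),
    ("WAIT_CONFIRM", "T_EXPIRED"):    ("IDLE",         []),  # timer recovery
    ("CONNECTED",    "LINK_RELEASE"): ("IDLE",         []),
}

class SdlProcess:
    """Minimal SDL-like process: one state, one FIFO input port."""

    def __init__(self, table, start):
        self.table, self.state = table, start
        self.queue = deque()   # SDL input port (FIFO of signals)
        self.outputs = []      # signals emitted toward peer processes

    def send(self, signal):
        self.queue.append(signal)

    def run(self):
        # Consume signals until the queue drains; a signal with no matching
        # transition is discarded, mirroring SDL's implicit-transition default.
        while self.queue:
            key = (self.state, self.queue.popleft())
            if key in self.table:
                self.state, out = self.table[key]
                self.outputs.extend(out)

basestation = SdlProcess(LINK_FSM, "IDLE")
for s in ("LINK_REQUEST", "LINK_CONFIRM"):
    basestation.send(s)
basestation.run()
```

After the handshake signals are consumed, the process sits in `CONNECTED` having emitted `LINK_ACCEPT`. The point of the table-driven form is the one SDL makes: the designer's effort goes into the transitions, not into simulation plumbing.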

4) At this point the design process branches into two largely independent tasks: formal verification and performance estimation. Logical link protocols are by nature distributed-state concurrent systems whose logical consistency the design process must ensure, and they are thus the primary target for formal verification in our context¹. This is because formal verification focuses on proving properties about the system given a set of possible events, without regard to the probability of any event. So, for example, one would like to prove that the logical link cannot deadlock under packet reordering or loss events. On the other hand, the media access control protocol consists of an algorithm that is designed to minimize the interference between users, and its evaluation must be done in terms of the probabilities of collision, loss, and corruption. Thus, performance estimation will largely focus on the media access algorithm.
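The deadlock property mentioned above can be illustrated with the kind of exhaustive reachability analysis a model checker performs. The sketch below is a toy stop-and-wait link over a lossy channel, not a real protocol: it enumerates every reachable global state and flags non-final states with no outgoing transition. Without a retransmission timer, losing the data message strands the sender, and the search finds the deadlock; with the timer, none exists.

```python
def successors(state, with_timeout):
    """All next global states of a toy stop-and-wait link over a lossy channel.

    A global state is (sender state, receiver state, channel contents),
    where the channel is a FIFO tuple of in-flight messages.
    """
    snd, rcv, chan = state
    nxt = set()
    if snd == "ready" and len(chan) < 2:        # cap channel: finite states
        nxt.add(("waiting", rcv, chan + ("DATA",)))
    if snd == "waiting" and chan and chan[0] == "ACK":
        nxt.add(("done", rcv, chan[1:]))        # ACK received, transfer done
    if snd == "waiting" and with_timeout:
        nxt.add(("ready", rcv, chan))           # retransmission timer fires
    if rcv == "idle" and chan and chan[0] == "DATA":
        nxt.add((snd, "idle", chan[1:] + ("ACK",)))  # receiver acks the data
    if chan:
        nxt.add((snd, rcv, chan[1:]))           # channel loses head message
    return nxt

def deadlocks(with_timeout):
    """Exhaustive reachability search; returns non-final stuck states."""
    init = ("ready", "idle", ())
    seen, stack, dead = {init}, [init], []
    while stack:
        s = stack.pop()
        succ = successors(s, with_timeout)
        if not succ and s[0] != "done":
            dead.append(s)                      # reachable state with no exit
        for t in succ - seen:
            seen.add(t)
            stack.append(t)
    return dead
```

Running `deadlocks(False)` reports the state where the sender is waiting on an empty channel, exactly the loss scenario a probability-blind analysis is meant to catch; `deadlocks(True)` reports none. A real verification effort would of course use a model checker rather than a hand-rolled search.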

5) Finally, the system is ready to be implemented as a mix of hardware and software. For hardware subsystems, it is not desirable to directly map an SDL process onto a hardware implementation: the result is semantically inconsistent with the abstractions that SDL enforces (detailed in Chapter 3 and Chapter 5). Chapter 5 presents a compositional refinement methodology whereby it is possible to informally relate a high-level, asynchronous FSM, message-passing view of the system to a detailed hardware implementation. In Chapter 6, we consider a path from a high-level language such as SDL to software, and present an operating system implementation that provides the infrastructural "glue" that is necessary to combine the hardware and software implementations.

¹ For wireless systems, logical link protocols include both link establishment protocols and link management protocols that support mobile users.