
Modeling computer and network systems: A new 'soft' technique for soft systems analysis methods

Stanislaw Paul Maj
Department of Computer Science
Edith Cowan University
    A detailed market analysis within WA clearly indicated that both students and employers perceive the standard computer technology curriculum as increasingly irrelevant. Work to date clearly indicates that this standard approach provides technical detail and complexity that is not only inappropriate but also irrelevant for introductory courses on computer and network technology. As part of an international study the same investigation was conducted with several European universities. The results to date parallel those obtained from the WA study. Modern 'soft' systems analysis methods (Checkland, SSADM, etc) consider the organisation as a whole, including any externalities. Such methods are extensively used; however, they provide little or no guidance on hardware issues. Accordingly, a new technique has been designed for modeling computer and network equipment. This new method provides a simple-to-use framework that allows technical detail to be introduced and controlled by a top-down abstraction that is meaningful, and therefore readily understandable, to students not only from computer science but also from other disciplines. The model has the desired characteristics of visibility, simplicity, consistency and flexibility - key characteristics of soft systems analysis methods. Work to date indicates that this new technique is not only technically valid but also supports increasing levels of technical complexity, and hence may be an appropriate basis for more advanced studies. Furthermore, the abstractions used in this model are independent of technical detail and can therefore accommodate rapid changes in technology. Preliminary investigations indicate that this method may be used as one of the standard tools in a soft systems method, thereby extending the scope of the analysis.


Introduction

Within Western Australia an exploratory market audit was conducted of a wide range of industrial and commercial companies. This was complemented by a further detailed analysis of the IT department of a state-wide rail company. From this survey a set of guidelines was developed for the type of skills expected of computer science graduates. Using the criteria developed, ten randomly selected final-year ECU computer science undergraduates were interviewed from a graduating population of approximately one hundred. According to Maj,
It was found that none of these students could perform first line maintenance on a Personal Computer (PC) to a professional standard with due regard to safety, both to themselves and the equipment. Neither could they install communication cards, cables and network operating system or manage a population of networked PCs to an acceptable commercial standard without further extensive training. It is noteworthy that none of the students interviewed had ever opened a PC. It is significant that all those interviewed for this study had successfully completed all the units on computer architecture and communication engineering. (Maj, Robbins et al. 1996)
Furthermore, interviews conducted with five ECU graduates employed in computer and network support clearly indicated that they were, to a large degree, self-taught in many of the skills they needed to perform their job. Preliminary investigations indicated a similar situation with computer science graduates from other universities within Western Australia. Other countries similarly have professional accreditation. In the United Kingdom (UK) the British Computer Society (BCS) accredits university courses and has an internationally recognised examination scheme in two parts, with Part II at the level of a UK honors degree in computing.

The initial ECU student questionnaire, first used in 1993, was also conducted in 1999 at two universities in the UK. A similar study is currently being undertaken in Sweden. The first university has well established degree programs and is fully BCS accredited. The second university recently redesigned its IT awards, some of which are now BCS accredited. The degree programs at the first university offer students the opportunity to examine a PC in the first year as part of a module in Computer Organisation; however, students never take a PC apart. Students are taught network modeling, design and management but they do not physically construct networks.

The results clearly demonstrate that students lacked knowledge about PC technology and the basic skills needed to operate on computer and network equipment in a commercial environment. This is despite the fact that most students thought such knowledge would be beneficial. The survey indicated that any practical knowledge students have of hardware is largely a result of experience outside the course. At the second university the results demonstrate that these students had a broad, hobbyist's understanding of the PC but no knowledge of health and safety law. Significantly, the students interviewed identified that their skills and knowledge of PCs and networks came from self-study or employment, not from courses at university. Again student responses indicated that such knowledge would be useful. The results to date indicate that both students and employers may perceive the standard computer technology curriculum as increasingly irrelevant. According to Nwana (1997), 'Perhaps most worrying of all is the persistent view that computer science graduates are not suitable for some employers, who appear to distrust computer qualifications'.

Computer and network technology curriculum

The problems associated with teaching computer technology are not new. Units in microcomputer systems are fundamentally important to both computer science and engineering students (Ewing 1993). These address issues that include: computer organisation, memory systems, assembly language, digital logic, interrupt handling, I/O and interfaces. Mainstream computer science education is well supported by journal articles on various aspects of re-programmable hardware for educational purposes (Gschwind 1994) and assembly language (Coey 1993). Simulation has proved to be a very useful tool (Magagnosc 1994, Searles 1993, Bergmann 1993). Reid (1992) used laboratory workstations to allow undergraduate students to 'build a complete, functioning computer - in simulation'.

Pilgrim (1993) took an alternative approach in which a very small computer was designed in class and breadboarded in the laboratory by students using small and medium scale TTL integrated circuits, thereby, according to Pilgrim, providing students with the 'knowledge and experience in the design, testing and integration of hardware and software for a small computer system'. According to Parker and Drexel (1996) simulation is a preferred approach in order to provide students with the 'big picture'. The difficulty of providing a suitable pedagogical framework is further illustrated by Coe and Williams, who address this problem by means of simulation. Barnett (1995) suggests that standard computer architecture is too complex for introductory courses and recommends a simplified computer for educational purposes. However, it is possible to consider PC and network technology from a different perspective. The PC is now a (relatively) low cost consumer item. This has been possible due to design and manufacturing changes that include: Assembly Level Manufacturing (ALM), Application Specific Integrated Circuits (ASICs) and Surface Mounted Technology (SMT). The result is PCs with a standard architecture and modular construction. However, computer technology education is traditionally based on digital techniques, small-scale integration ICs, Karnaugh maps, assembly language programming etc. A new conceptual model is needed that provides abstraction in order to control irrelevant technical detail.

The PC - a new model

Models are used as a means of communication and of controlling detail. For example, a transistor can be modeled by a simple diagram with parameters directly relevant to an engineer. The details of semiconductor theory are not relevant in this context, i.e. detail is encapsulated and hence controlled. Similarly, digital techniques such as sequential logic are higher-level modeling techniques that mask the details of individual transistors. Models should have the characteristics of visibility, simplicity, consistency and flexibility. Clements suggests that a PC may be considered as a loosely coupled Multiple Instruction, Multiple Data (MIMD) device. According to Clements:
Although most people do not regard it as a multiprocessor, any arrangement of a microprocessor and a floppy disc controller is really a loosely coupled MIMD. The floppy disc controller is really an autonomous processor with its own microprocessor, internal RAM and ROM.
and further that:
Because the FDC has all these resources on one chip and communicates with its host processor as if it were a simple I/O port, it is considered by many to be a simple I/O port. If it were not for the fact that the FDC is available as a single chip, engineers would be designing "true" multiprocessor systems to handle disc I/O. (Clements 1989)
A PC is a complex collection of heterogeneous devices interconnected by a range of bus structures. However, it can be modeled as a MIMD architecture of sub-units or nodes. Each node (microprocessor, hard disc drive etc) can be treated as a data source/sink capable of, to various degrees, data storage, processing and transmission. This simple model may provide the basis of a suitable conceptual map of computers. It is conceptually simple and controls detail by abstraction. PC performance was then considered in order to obtain metrics suitable for use with this model.
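To make the node abstraction concrete, the sketch below (in Python) represents each sub-unit purely by its capacity to store, process and transmit data. The class and attribute names, and the microprocessor figures, are illustrative assumptions of ours; the disc and video figures are the measurements reported later in this paper.

```python
from dataclasses import dataclass

@dataclass
class Node:
    """A PC sub-unit treated as a data source/sink.

    Each node is characterised only by what it can store, process
    and transmit; internal technical detail is encapsulated.
    """
    name: str
    storage_bytes: float = 0.0        # data the node can hold
    processing_bytes_s: float = 0.0   # data it can transform per second
    transfer_bytes_s: float = 0.0     # data it can move per second

# A PC modeled as a small set of nodes. The microprocessor figures are
# placeholders; the disc and video figures appear in the measurements below.
pc = [
    Node("microprocessor", processing_bytes_s=50e6, transfer_bytes_s=100e6),
    Node("hard disc drive", storage_bytes=2e9, transfer_bytes_s=1.48e6),
    Node("video adapter", transfer_bytes_s=18.6e6),
]
```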

PC Performance - Bandwidth nodes

Benchmark programs considered directly relevant to a typical single user, multi-tasking environment running a de facto standard suite of 32-bit applications include: AIM Suite III, SYSmark and Ziff-Davis PC Benchmark. Consumer magazines use benchmark suites to evaluate PCs and publish their results (APC, 1998). As a relative guide benchmarks are an aid to selection; however, all of these results must be interpreted and many questions still remain for users. We conclude that the plethora of benchmarks, though useful, does not provide the basis of a coherent conceptual model of a PC. Any measurement standard, to be of practical value to PC users, must be relevant to human dimensions or perceptions and use units based on the decimal scaling system.

To a first approximation the performance of PC nodes can be evaluated by bandwidth, with units in bytes/s. Though useful, this may not be the best initial, user-oriented unit: a typical user runs 32-bit Windows based applications and is therefore interacting, via a Graphical User Interface (GUI), with text and graphical images. For general acceptance a benchmark must be easy to understand; it should therefore be based on user perception of performance and use reasonably sized units. We suggest that a useful unit of measurement is the display of a single, full screen, full color image. For this paper we define a full screen image as 640x480 pixels with 4 bytes per pixel, which represents 1.17Mbytes of data. This appears to be the standard image for the new generation of video display adapters. The performance of a PC and its associated nodes can still be evaluated using the measurement of bandwidth, but with units of standard images/s or frames/s. This unit of measurement may be more meaningful to a typical user because it relates directly to their perception of performance. To a first approximation, smooth animation requires a minimum of 5 frames/s (5.85Mbytes/s). Obviously sub-multiples of this unit are possible, such as quarter screen images and a reduced color palette of 1 byte per pixel.
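A short worked example of the proposed unit follows. The frame dimensions and the 5 frames/s animation threshold are taken directly from the definitions above; the helper function name is our own.

```python
# Standard full-screen image as defined above: 640 x 480 pixels, 4 bytes/pixel.
FRAME_BYTES = 640 * 480 * 4   # 1,228,800 bytes, i.e. ~1.17 Mbytes

def frames_per_second(bandwidth_bytes_s: float) -> float:
    """Convert a node's bandwidth in bytes/s into standard frames/s."""
    return bandwidth_bytes_s / FRAME_BYTES

# Smooth animation needs about 5 frames/s, i.e. roughly 5.85 Mbytes/s.
min_animation_bandwidth = 5 * FRAME_BYTES   # 6,144,000 bytes/s
print(f"One frame: {FRAME_BYTES:,} bytes")
print(f"Animation threshold: {frames_per_second(min_animation_bandwidth)} frames/s")
```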

It was possible to experimentally measure the data transfer rate from a hard disc to electronic memory as 1.48Mbytes/s, which can be expressed as 1.21 frames/s. From a hard disc to a video adapter card the data rate was 1.37Mbytes/s, i.e. 1.1 frames/s. The data transfer rate for the video card was 18.6Mbytes/s, i.e. 15.1 frames/s. We therefore have a common unit of measurement, relevant to common human perception, with decimal based units, that can be applied to different nodes to identify performance bottlenecks. In this case the HDD is the limiting factor, unable to provide a bandwidth suitable for smooth motion in an animation sequence. The concept of using images to evaluate PC performance can be made directly relevant to users from different disciplines, in particular Multimedia. However, any units may be used, such as records, data fields etc.
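Continuing the sketch, folding these measurements into the common unit makes the bottleneck fall out of a simple comparison; the dictionary below merely restates the rates quoted above, rounded to one decimal place.

```python
FRAME_BYTES = 640 * 480 * 4   # standard full-screen image, as defined above

# Measured transfer rates from the text, in bytes/s.
measured = {
    "hard disc -> electronic memory": 1.48e6,
    "hard disc -> video adapter": 1.37e6,
    "video adapter": 18.6e6,
}

for path, bandwidth in measured.items():
    print(f"{path}: {bandwidth / FRAME_BYTES:.1f} frames/s")

# The slowest entry marks the performance bottleneck: both hard disc
# paths fall well short of the ~5 frames/s needed for smooth animation.
bottleneck = min(measured, key=measured.get)
print(f"Bottleneck: {bottleneck}")
```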

System development methods - an overview

A method is a collection of procedures, techniques, tools and documentation aids that provide guidance and assistance to system developers. A method consists of phases or stages that may themselves consist of sub-phases. A wide range of methods exists, including ad hoc (Jones 1990), waterfall (Royce 1970), participative (Mumford and Wier 1979), soft systems (Checkland 1981), prototyping (Naumann and Jenkins 1982), incremental (Gilb 1988), spiral (Boehm 1984), reuse (Matsumoto and Ohno 1989), formal (Andrews and Ince 1991), rapid application development (Martin 1991), object oriented (Coad and Yourdon 1991) and software capability (Humphrey 1990). Whatever its underlying theme, every information system method must provide techniques for modeling data, processes and system functions. Some systems development methods stress only the technical aspects. It can be argued that this may lead to a less than ideal solution, as these methods underestimate the importance and difficulties associated with the human element.

For the purpose of this investigation the Structured Systems Analysis and Design Method (SSADM) was used. SSADM is mandatory for UK central government software development projects. The method is sponsored by the Central Computer and Telecommunications Agency (CCTA) and the National Computing Centre (NCC), further ensuring its importance within the UK software industry. SSADM is a framework employing a wide range of techniques (Data Flow Diagrams, Entity Models, Entity Life Histories, Normalisation, Process Outlines and Physical Design Control). SSADM is divided into six stages (Analysis, Specification of Requirements, Selection of System Option, Logical Data Design, Logical Process Design and Physical Design). The Physical Design stage translates the logical data design into the database specification and the logical process designs into code specifications. Capacity planning is used to estimate the data storage requirements of the hard discs. It is possible to analyse the process specifications detailed in the analysis and calculate the processing load - typical units include Million Instructions Per Second (MIPS). However, no simple technique exists that will model the target hardware to accurately determine whether it will perform to an acceptable standard. A PC is a complex collection of heterogeneous devices interconnected by a range of bus structures. A typical uniprocessor computer consists of four major components: the microprocessor, memory (primary and secondary), peripherals and the bus structures. The heterogeneous nature of the sub-units of a PC is clearly illustrated by the range of measurement units used, from clock speeds in MHz to seek times in milliseconds. Benchmarks exist for the PC as a whole. However, as discussed above, the plethora of benchmarks, though useful, does not provide a coherent basis for modeling hardware.

Nodes as a systems analysis technique

Each node (microprocessor, hard disc drive etc) can be treated as a data source/sink capable of, to various degrees, data storage, processing and transmission. The PC itself, or indeed an entire network, may also be described as a node. Each node can thus be treated as a quantifiable data source/sink with an associated transfer characteristic in units appropriate to the user. This approach allows the performance of every node and data path to be assessed by a simple, common measurement. Using this technique it is possible to easily identify nodes that may limit the performance of equipment.
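Since a PC, or a whole network, may itself be treated as a node, the abstraction nests naturally and supports top-down decomposition. Extending the earlier sketch under our own naming assumptions, the transfer figures below reuse the measurements reported earlier:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    """A quantifiable data source/sink with a transfer characteristic.

    A node may contain sub-nodes, so a PC or an entire network can
    itself be modeled as a node (top-down decomposition).
    """
    name: str
    transfer_bytes_s: float = 0.0
    sub_nodes: List["Node"] = field(default_factory=list)

    def limiting_node(self) -> "Node":
        """Return the slowest node in this subtree, i.e. the bottleneck."""
        candidates = [n.limiting_node() for n in self.sub_nodes] or [self]
        return min(candidates, key=lambda n: n.transfer_bytes_s)

# A network decomposed top-down into PCs, each into sub-units.
pc = Node("PC 1", sub_nodes=[
    Node("hard disc drive", 1.48e6),
    Node("video adapter", 18.6e6),
])
network = Node("office network", sub_nodes=[pc, Node("file server", 10e6)])
print(network.limiting_node().name)   # -> hard disc drive
```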

Evaluation

As a result of the initial investigations at ECU a new curriculum was designed, implemented and fully evaluated (Maj, Fetherston et al. 1998). Unlike the standard computer technology curriculum, students are not taught digital techniques, assembly language programming etc; rather, the curriculum is based on a constructivist approach. This curriculum has always been oversubscribed, has a very low student attrition rate, and attracts students from other Faculties within ECU and from other universities in the state. An independent review of this unit found that: 80% would recommend this unit; 75% found the practical sessions useful; 70% found the unit relevant to their needs; and 55% think this should be a compulsory unit. When one new unit from this curriculum was first introduced, of an enrolment of 118 only 66 were computer science students; the others came from a wide range of disciplines (psychology, biological and chemical sciences etc).

An educational expert conducted a detailed analysis of student learning. Five students, chosen at random, were interviewed. Interviews were semi-structured, consisting of a number of closed and open ended questions, and respondents were encouraged to comment on any positive or negative aspect of the course and its effect on their learning. The results indicated that this curriculum:

is perceived as very valuable by students from different disciplines; supports learning in other units; increases students' understanding of computers and computing; generates a demand for further curriculum in this field (Maj, Fetherston et al. 1998)
This new curriculum, though perceived as valuable by students when first introduced, arguably lacked a coherent conceptual framework. Last year we used nodes as such a framework for the first time and evaluated the results. Using this conceptual framework the PC is considered as a series of nodes that can store, process and transfer data. Using the standard ECU course evaluation questionnaire the unit was highly rated by students. Student understanding was evaluated by means of two assignments in which they were required to obtain the technical specifications for a PC and construct a nodal model. One student wrote:
The lack of meaningful and comparable technical specifications makes the task of calculating the performance of a PC and it's individual components a difficult one. The computer industry appears to be a law unto oneself, with incomplete or non existent technical specifications. The use of standards, terms and abbreviations that are not comparable across different systems or manufacturers. This all leads to frustration and confusion from consumers and users of these computer systems and components
and further when given technical specifications for a PC:
Sounds very impressive, yet by undertaking the exercise of converting to the components common units the relative performance of the PC and it's individual components can be measured and conclusions drawn. You will finally be able to see exactly what you are purchasing, its strengths, weaknesses and overall value
Prior to the introduction of this model, student assignments consisted almost exclusively of a list of hardware details copied directly from technical literature with little or no critical analysis. With the nodal model, most students were able to predict the likely performance of a PC and identify nodes (devices) that would significantly handicap performance. The model is currently being evaluated in a commercial environment. The advantages of using this model are summarised in the conclusions below.

Conclusions

The proposed model is simple to use, self documenting and controls technical complexity. Furthermore, it supports top-down decomposition and can therefore be used to model a wide range of computer and network equipment. Using this technique, all such equipment can be evaluated using 'user oriented' units of performance. It is possible, using this technique, to extend the scope of standard methods such as SSADM to obtain a more complete, consistent and correct model of equipment and its anticipated performance. Further work is needed to estimate the effects of factors such as operating system overheads.

References

Andrews, D. and D. Ince (1991). Practical Formal Methods with VDM. New York, McGraw Hill.

APC (1998). Brand Name PCs. Australian Personal Computer, 3, 122.

Barnett III, B. L. (1995). A Visual Simulator for a Simple Machine and Assembly Language. ACM SIGCSE Bulletin, 27(1), 233-237.

Bergmann, S. D. (1993). Simulating and compiling a hypothetical microprogrammed architecture with projects for computer architecture and compiler design. ACM SIGCSE Bulletin, 25(2), 38-42.

Boehm, B. W. (1984). A software development environment for improving productivity. Computer, 17(6), 30-42.

Checkland, P. B. (1981). Systems Thinking, Systems Practice. Chichester, John Wiley.

Clements, A. (1989). Microprocessor Interfacing and the 68000: Peripherals and Systems. Essex, Anchor Press.

Coad, P. and E. Yourdon (1991). Object-oriented Analysis. Englewood Cliffs, NJ, Prentice-Hall.

Coey, W. A. (1993). An interactive tutorial system for MC68000 assembly language using hypercard. ACM SIGCSE Bulletin, 25(2), 19-23.

Ewing, D. J. (1993). Microcomputers systems 1: A computer science and engineering capstone course. ACM SIGCSE Bulletin, 25(1), 155-159.

Gilb, T. (1988). Principles of Software Engineering Management. Reading, MA, Addison-Wesley.

Gschwind, M. (1994). Reprogrammable hardware for educational purposes. ACM SIGCSE Bulletin, 26(1), 183-187.

Humphrey, W. S. (1990). Managing the Software Process. Reading, MA, Addison-Wesley.

Jones, G. W. (1990). Software Engineering. New York, Wiley.

Magagnosc, D. (1994). Simulation in computer organisations: a goals based study. ACM SIGCSE Bulletin, 26(1), 178-182.

Maj, S. P., T. Fetherston, et al. (1998). Computer & Network Infrastructure Design, Installation, Maintenance and Management - a proposed new competency based curriculum. Proceedings of the Third Australasian Conference on Computer Science Education. The University of Queensland, Brisbane, Australia.

Maj, S. P., G. Robbins, et al. (1996). Computer and Network Installation, Maintenance and Management - A Proposed New Curriculum for Undergraduates and Postgraduates. The Australian Computer Journal, 30(3), 111-119.

Martin, J. (1991). Rapid Application Development. New York, Macmillan.

Matsumoto, Y. E. and Y. E. Ohno (1989). Japanese Perspectives in Software Engineering Practice. Reading, MA, Addison-Wesley.

Mumford, E. and M. Wier (1979). Computer Systems in Work Design - the ETHICS Method. London, Associated Business Press.

Naumann, J. D. and A. M. Jenkins (1982). Prototyping: The new paradigm for systems development. MIS Quarterly.

Nwana, H. S. (1997). Is computer science education in crisis? ACM Computing Surveys, 29(4), 322-324.

Parker, B. C. and P. G. Drexel (1996). A system-based sequence of closed labs for computer systems organization. ACM SIGCSE Bulletin, 28(1), 53-57.

Pilgrim, R. A. (1993). Design and construction of the very simple computer (VSC): a laboratory project for undergraduate computer architecture courses. ACM SIGCSE Bulletin, 25(1), 151-154.

Reid, R. J. (1992). A laboratory for building computers. ACM SIGCSE Bulletin, 24(1), 192-196.

Royce, W. W. (1970). Managing the development of large software systems: Concepts and techniques. Proceedings of WESCON.

Searles, D. E. (1993). An Integrated Hardware Simulator. ACM SIGCSE Bulletin, 25(2), 24-28.

Please cite as: Maj, S. P. (2000). Modeling computer and network systems: A new 'soft' technique for soft systems analysis methods. In A. Herrmann and M.M. Kulski (Eds), Flexible Futures in Tertiary Teaching. Proceedings of the 9th Annual Teaching Learning Forum, 2-4 February 2000. Perth: Curtin University of Technology. http://lsn.curtin.edu.au/tlf/tlf2000/maj1.html

