Teaching and Learning Forum 2000
Modeling computer and network systems: A new 'soft' technique for soft systems analysis methods
Stanislaw Paul Maj
Department of Computer Science
Edith Cowan University
It was found that none of these students could perform first line maintenance on a Personal Computer (PC) to a professional standard with due regard to safety, both to themselves and the equipment. Neither could they install communication cards, cables and network operating systems or manage a population of networked PCs to an acceptable commercial standard without further extensive training. It is noteworthy that none of the students interviewed had ever opened a PC. It is significant that all those interviewed for this study had successfully completed all the units on computer architecture and communication engineering. (Maj, Robbins et al. 1996)

Furthermore, interviews conducted with five ECU graduates employed in computer and network support clearly indicated that they were, to a large degree, self taught in many of the skills they needed to perform their job. Preliminary investigations indicated a similar situation with computer science graduates from other universities within Western Australia. Other countries similarly have professional accreditation. In the United Kingdom (UK) the British Computer Society (BCS) accredits university courses and has an internationally recognised examination scheme in two parts, with Part II at the level of a UK honours degree in computing.
The initial ECU student questionnaire, first used in 1993, was also conducted in 1999 at two universities in the UK. A similar study is currently being undertaken in Sweden. The first university has well established degree programs and is fully BCS accredited. The second university recently redesigned their IT awards, some of which are now BCS accredited. The degree programs at the first university offer students the opportunity to examine a PC in the first year as part of a module in Computer Organisation. However they never take a PC apart. Students are taught network modeling, design and management but they do not physically construct networks.
The results clearly demonstrate that students lacked knowledge about PC technology and the basic skills needed to operate on computer and network equipment in a commercial environment. This is despite the fact that most students thought such knowledge would be beneficial. The survey indicated that any practical knowledge students have of hardware is largely a result of experience outside the course. At the second university the results demonstrate that these students had a broad, hobbyist's understanding of the PC but no knowledge of health and safety law. Significantly, the students interviewed identified that their skills and knowledge of PCs and networks came from self study or employment, not from courses at university. Again student responses indicated that such knowledge would be useful. The results to date indicate that both students and employers may perceive the standard computer technology curriculum as increasingly irrelevant. According to Nwana (1997), 'Perhaps most worrying of all is the persistent view that computer science graduates are not suitable for some employers, who appear to distrust computer qualifications'.
Pilgrim (1993) took an alternative approach in which a very small computer was designed in class and breadboarded in the laboratory by students using small and medium scale TTL integrated circuits, thereby, according to Pilgrim, providing students with the 'knowledge and experience in the design, testing and integration of hardware and software for a small computer system'. According to Parker and Drexel (1996) simulation is a preferred approach in order to provide students with the 'big picture'. The difficulty of providing a suitable pedagogical framework is further illustrated by Coe and Williams; Coe et al also address this problem by means of simulation. Barnett (1995) suggests that standard computer architecture is too complex for introductory courses and recommends a simplified computer for educational purposes. However, it is possible to consider the PC and network technology from a different perspective. The PC is now a (relatively) low cost consumer item. This has been possible due to design and manufacturing changes that include: Assembly Level Manufacturing (ALM), Application Specific Integrated Circuits (ASICs) and Surface Mounted Technology (SMT). The result is PCs with a standard architecture and modular construction. However, computer technology education is traditionally based on digital techniques, small scale integration ICs, Karnaugh maps, assembly language programming, etc. A new conceptual model is needed that provides abstraction in order to control irrelevant technical detail.
Although most people do not regard it as a multiprocessor, any arrangement of a microprocessor and a floppy disc controller is really a loosely coupled MIMD. The floppy disc controller is really an autonomous processor with its own microprocessor, internal RAM and ROM.

and further that:
Because the FDC has all these resources on one chip and communicates with its host processor as if it were a simple I/O port, it is considered by many to be a simple I/O port. If it were not for the fact that the FDC is available as a single chip, engineers would be designing "true" multiprocessor systems to handle disc I/O. (Clements 1989)

A PC is a complex collection of heterogeneous devices interconnected by a range of bus structures. However, it can be modeled as a MIMD architecture of sub-units or nodes. Each node (microprocessor, hard disc drive, etc) can be treated as a data source/sink capable, to various degrees, of data storage, processing and transmission. This simple model may provide the basis of a suitable conceptual map of computers. It is conceptually simple and controls detail by abstraction. PC performance was then considered in order to obtain metrics suitable for use with this model.
To a first approximation the performance of PC nodes can be evaluated by bandwidth, with units in Bytes/s. Though useful, this may not be the best initial, user oriented unit: a typical user runs 32-bit Windows based applications. The user is therefore interacting, via a Graphical User Interface (GUI), with text and graphical images. For general acceptance a benchmark must be easy to understand and should therefore be based on user perception of performance, and as such be simple and use reasonably sized units. We suggest that a useful unit of measurement is the display of a single, full screen, full color image. For this paper we define a full screen image as 640x480 pixels with 4 bytes per pixel, which represents 1.17Mbytes of data. This appears to be the standard image for the new generation of video display adapters. The performance of a PC and associated nodes can still be evaluated using the measurement of bandwidth but with the units of standard images/s or frames/s. This unit of measurement may be more meaningful to a typical user because it relates directly to their perception of performance. To a first approximation, smooth animation requires a minimum of 5 frames/s (5.85Mbytes/s). Obviously sub-multiples of this unit are possible, such as quarter screen images and a reduced color palette such as 1 byte per pixel.
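The arithmetic behind this unit can be sketched in a few lines of Python (a minimal illustration using only the figures given above; the variable names are ours):

    # One standard full-screen image: 640 x 480 pixels at 4 bytes per pixel
    frame_bytes = 640 * 480 * 4                   # 1,228,800 bytes, approx. 1.17 Mbytes

    # Minimum bandwidth for smooth animation at 5 frames/s
    min_animation_bytes_per_s = 5 * frame_bytes   # approx. 5.85 Mbytes/s

    print(frame_bytes, min_animation_bytes_per_s)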
It was possible to experimentally measure the data transfer rate from a hard disc to electronic memory as 1.48Mbytes/s, which can be expressed as 1.21 frames/s. From a hard disc to a video adapter card the data rate was 1.37Mbytes/s, i.e. 1.1 frames/s. The data transfer rate for the video card was 18.6Mbytes/s, i.e. 15.1 frames/s. We therefore have a common unit of measurement, relevant to common human perception, with decimal based units, that can be applied to different nodes and used to identify performance bottlenecks. In this case the hard disc drive (HDD) is the limiting factor, unable to provide a bandwidth suitable for smooth motion in an animation sequence. The concept of using images to evaluate PC performance can be made directly relevant to users from different disciplines, in particular Multimedia. However, other units may be used, such as records, data fields, etc.
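A minimal sketch of this conversion and bottleneck check, assuming the measured figures above are held in a simple table (the helper code is illustrative and not part of the original study):

    # Measured node bandwidths in Mbytes/s (figures from the experiment above)
    node_bandwidth_mb_s = {
        "hard disc -> electronic memory": 1.48,
        "hard disc -> video adapter": 1.37,
        "video adapter card": 18.6,
    }

    frame_mb = (640 * 480 * 4) / 1e6   # one standard 640x480x4 image in Mbytes

    # Express every node in the common unit of frames/s and find the bottleneck
    frames_per_s = {node: bw / frame_mb for node, bw in node_bandwidth_mb_s.items()}
    bottleneck = min(frames_per_s, key=frames_per_s.get)

    for node, fps in sorted(frames_per_s.items(), key=lambda item: item[1]):
        print(f"{node}: {fps:.1f} frames/s")
    print("Bottleneck node:", bottleneck)   # the hard disc drive limits animation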
For the purpose of this investigation the Structured Systems Analysis and Design Method (SSADM) was used. SSADM is mandatory for UK central government software development projects. The method is sponsored by the Central Computer and Telecommunications Agency (CCTA) and the National Computing Centre (NCC), further ensuring its importance within the software industry in the UK. SSADM is a framework employing a wide range of techniques (Data Flow Diagrams, Entity Models, Entity Life Histories, Normalisation, Process Outlines and Physical Design Control). It is divided into six stages (Analysis, Specification of Requirements, Selection of System Option, Logical Data Design, Logical Process Design and Physical Design). The Physical Design stage translates the logical data design into the database specification and the logical process designs into code specifications. Capacity planning is used to estimate the data storage requirements of the hard discs. It is possible to analyse the process specifications detailed in the analysis and calculate the processing load - typical units include Million Instructions Per Second (MIPS). However, no simple technique exists that will model the target hardware to accurately determine whether it will perform to an acceptable standard. A PC is a complex collection of heterogeneous devices interconnected by a range of bus structures. A typical uniprocessor computer consists of four major components: the microprocessor, memory (primary and secondary), peripherals and the bus structures. The heterogeneous nature of the sub-units of a PC is clearly illustrated by the range of measurement units used, from clock speeds in MHz to seek times in milliseconds. Benchmarks exist for the PC as a whole. However, as discussed above, the plethora of benchmarks, though useful, does not provide a coherent basis for modeling hardware.
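As a hedged sketch of how the nodal model might sit alongside SSADM physical design, a capacity-style check could compare a required throughput against each proposed node, all expressed in the common unit. Every figure below is an assumed example for illustration only, not data from this study:

    # Required throughput derived from the process/data specifications (assumed)
    required_frames_per_s = 5.0

    # Candidate hardware, each node expressed in frames/s (assumed values)
    proposed_nodes = {
        "hard disc drive": 1.2,
        "main memory": 40.0,
        "video adapter": 15.1,
    }

    # Flag any node that cannot sustain the required throughput
    shortfalls = {node: fps for node, fps in proposed_nodes.items()
                  if fps < required_frames_per_s}
    if shortfalls:
        for node, fps in shortfalls.items():
            print(f"{node} delivers {fps} frames/s; upgrade needed")
    else:
        print("Proposed configuration meets the requirement")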
An educational expert conducted a detailed analysis of student learning. Five students, chosen at random, were interviewed. Interviews were semi-structured, consisting of a number of closed and open ended questions, and respondents were encouraged to comment on any positive or negative aspect of the course and its effect on their learning. The results were that this curriculum:
is perceived as very valuable by students from different disciplines;
supports learning in other units;
increases students' understanding of computers and computing;
generates a demand for further curriculum in this field. (Maj, Fetherston et al. 1998)

This new curriculum, though perceived as valuable by students when first introduced, arguably lacked a coherent conceptual framework. Last year we used nodes as such a framework for the first time and evaluated the results. Using this conceptual framework the PC is considered as a series of nodes that can store, process and transfer data. Using the standard ECU course evaluation questionnaire the unit was highly rated by students. Student understanding was evaluated by means of two assignments in which they were required to obtain the technical specifications for a PC and construct a nodal model. One student wrote:
The lack of meaningful and comparable technical specifications makes the task of calculating the performance of a PC and it's individual components a difficult one. The computer industry appears to be a law unto oneself, with incomplete or non existent technical specifications. The use of standards, terms and abbreviations that are not comparable across different systems or manufacturers. This all leads to frustration and confusion from consumers and users of these computer systems and components

and further, when given technical specifications for a PC:
Sounds very impressive, yet by undertaking the exercise of converting to the components common units the relative performance of the PC and it's individual components can be measured and conclusions drawn. You will finally be able to see exactly what you are purchasing, its strengths, weaknesses and overall value

Prior to the introduction of this model, student assignments consisted almost exclusively of a list of hardware details copied directly from technical literature with little or no critical analysis. With the model, most students were able to predict the likely performance of a PC and identify nodes (devices) that would significantly handicap performance. The model is currently being evaluated in a commercial environment. Advantages to using this model include:
References

APC (1998). Brand Name PCs. Australian Personal Computer, 3, 122.
Barnett III, B. L. (1995). A Visual Simulator for a Simple Machine and Assembly Language. ACM SIGCSE Bulletin, 27(1), 233-237.
Bergmann, S. D. (1993). Simulating and Compiling a Hypothetical Microprogrammed Architecture with Projects for Computer Architecture and Compiler Design. ACM SIGCSE Bulletin, 25(2), 38-42.
Boehm, B. W. (1984). A software development environment for improving productivity. Computer, 17(6), 30-42.
Checkland, P. B. (1981). Systems Thinking, Systems Practice. Chichester, John Wiley.
Clements, A. (1989). Microprocessor Interfacing and the 68000: Peripherals and Systems. Essex, Anchor Press.
Coad, P. and E. Yourdon (1991). Object-oriented Analysis. Englewood Cliffs, NJ, Prentice-Hall.
Coey, W. A. (1993). An interactive tutorial system for MC68000 assembly language using hypercard. ACM SIGCSE Bulletin, 25(2), 19-23.
Ewing, D. J. (1993). Microcomputers systems 1: A computer science and engineering capstone course. ACM SIGCSE Bulletin, 25(1), 155-159.
Gilb, T. (1988). Principles of Software Engineering Management. Reading, MA, Addison-Wesley.
Gschwind, M. (1994). Reprogrammable hardware for educational purposes. ACM SIGCSE Bulletin, 26(1), 183-187.
Humphrey, W. S. (1990). Managing the Software Process. Reading, MA, Addison-Wesley.
Jones, G. W. (1990). Software Engineering. New York, Wiley.
Magagnosc, D. (1994). Simulation in computer organisations: a goals based study. ACM SIGCSE Bulletin, 26(1), 178-182.
Maj, S. P., T. Fetherston, et al. (1998). Computer & Network Infrastructure Design, Installation, Maintenance and Management - a proposed new competency based curriculum. Proceedings of the Third Australasian Conference on Computer Science Education. The University of Queensland, Brisbane, Australia.
Maj, S. P., G. Robbins, et al. (1996). Computer and Network Installation, Maintenance and Management - A Proposed New Curriculum for Undergraduates and Postgraduates. The Australian Computer Journal, 30(3), 111-119.
Martin, J. (1991). Rapid Application Development. New York, Macmillan.
Matsumoto, Y. and Y. Ohno (Eds) (1989). Japanese Perspectives in Software Engineering Practice. Reading, MA, Addison-Wesley.
Mumford, E. and M. Wier (1979). Computer Systems in Work Design - the ETHICS Method. London, Associated Business Press.
Naumann, J. D. and A. M. Jenkins (1982). Prototyping: The new paradigm for systems development. MIS Quarterly.
Nwana, H. S. (1997). Is Computer Science Education in Crisis? ACM Computing Surveys, 29(4), 322-324.
Parker, B. C. and P. G. Drexel (1996). A system-based sequence of closed labs for computer systems organization. ACM SIGCSE Bulletin, 28(1), 53-57.
Pilgrim, R. A. (1993). Design and construction of the very simple computer (VSC): a laboratory project for undergraduate computer architecture courses. ACM SIGCSE Bulletin, 25(1), 151-154.
Reid, R. J. (1992). A laboratory for building computers. ACM SIGCSE Bulletin, 24(1), 192-196.
Royce, W. W. (1970). Managing the development of large software systems: Concepts and techniques. WESCON.
Searles, D. E. (1993). An Integrated Hardware Simulator. ACM SIGCSE Bulletin, 25(2), 24-28.
Please cite as: Maj, S. P. (2000). Modeling computer and network systems: A new 'soft' technique for soft systems analysis methods. In A. Herrmann and M. M. Kulski (Eds), Flexible Futures in Tertiary Teaching. Proceedings of the 9th Annual Teaching Learning Forum, 2-4 February 2000. Perth: Curtin University of Technology. http://lsn.curtin.edu.au/tlf/tlf2000/maj1.html