Management of agricultural research: A training manual. Module 6: Management information systems, computers and network techniques (1997)
Session 1: Management information systems
Session guide: Management information systems
Reading note: Management information systems
Information and the MIS concept
Management and the MIS process
Organizational structure and MIS
Information requirements for MIS
Types of MIS
Process of MIS
Criteria for MIS
Strategies for determining MIS design
Session 2: MIS exercise
Session guide: Management information system exercise
Session 3: Computers as management tools
Session guide: Computers as management tools
Reading note: Computers as management tools
Overview of computer technology
Computer applications in agricultural research
A framework for de-centralized use of computers
Management of the computer services function
Acquisition of computer resources
Literature references for further reading
Session 4: Network techniques
Session guide: Network techniques
Reading note: Network techniques
A drainage experiment for salinity control
Concept of a project network
Distinguishing between events and activities
Drawing the network
PERT and CPM models
Incorporating the time estimate
Earliest start and finish times
Latest finish and latest start times
Session 5: PERT and CPM exercise
Session guide: PERT and CPM exercise
PERT and CPM exercise: Developing salt-tolerant varieties of paddy
The physical components of a computer are called hardware. The set of instructions given to the computer to accomplish a task is referred to as software.
The hardware components of a computer system consist of the input devices through which data or instructions are entered; the output devices by which the processed results are presented; and the central processing unit (CPU), which receives data or instructions from input devices, processes them, and presents the results to the output devices. The CPU has three primary components: a memory, an arithmetic and logic unit (ALU) and a control unit. Most of the components of a computer (the memory, ALU, control unit and the interfaces between the CPU and the input and output devices) operate through electronic circuitry, which makes it possible to perform processing at extremely high speeds. The speed of a CPU is normally measured in millions of instructions per second (MIPS).
The input and output devices are usually electro-mechanical items. Input devices, which include keyboards, scanners, etc., convert mechanical actions into electrical signals and send them through the interfacing circuitry to the CPU. The output devices are printers, plotters, VDUs, etc. These devices convert the electrical signals received from the CPU into physical movements to generate the output. Certain storage devices working on the principle of electro-magnetic storage, such as disk drives or magnetic tape drives, are also used with computers. These are helpful in storing data and instructions for later use, and are also known as auxiliary storage devices. The input, output and auxiliary storage devices are commonly known as peripheral devices.
The CPU control unit fetches the instructions stored in the memory, one by one, decodes them and executes them with the help of the arithmetic and logic unit (ALU). It also coordinates operations related to the transfer of data to and from input and output devices. The control unit and ALU together are called the processor.
A functional diagram of a computer system shows the input and output devices connected through interfaces to the CPU, which comprises the control unit, the ALU and the main memory.
The electronic circuitry forming the ALU and control unit is called the CPU, or simply the processor. The processor executes instructions stored in the main memory by fetching and decoding them. The instructions must be in a language the processor can understand. Normally these are groups of bits which trigger an appropriate circuitry of ALU or control unit. Each processor has its own convention for using the combinations of bits for executing specific arithmetic (add, multiply, etc.), logic (compare, branch) and control (initiate device, store, retrieve, etc.) operations. Such a convention is known as the machine language. Each processor has its own machine language. The architecture of the processor (instruction set, size of data handled per instruction, unit of data transfer between processor and memory, basic data types handled, etc.) determines its size, power and cost.
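The fetch-decode-execute convention described above can be sketched in a few lines of code. The 'machine language' below is invented purely for illustration (real processors each define their own binary instruction formats), but the control flow mirrors what a processor's control unit and ALU do:

```python
# A toy illustration of the fetch-decode-execute cycle. Each instruction
# is a tuple (operation, operand); the program list plays the role of
# memory in a stored-program computer. The instruction names are invented.

def run(program, accumulator=0):
    """Fetch, decode and execute instructions one by one."""
    pc = 0  # program counter: address of the next instruction
    while pc < len(program):
        op, arg = program[pc]          # fetch
        pc += 1
        if op == "LOAD":               # control: move data into the ALU
            accumulator = arg
        elif op == "ADD":              # arithmetic operation (ALU)
            accumulator += arg
        elif op == "JUMP_IF_NEG":      # logic operation: conditional branch
            if accumulator < 0:
                pc = arg
        elif op == "HALT":             # control operation: stop execution
            break
    return accumulator

result = run([("LOAD", 10), ("ADD", 5), ("HALT", 0)])
print(result)  # 15
```

A real instruction set encodes each of these operations as a fixed pattern of bits rather than a named tuple, which is why each processor family has its own machine language.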
Developments in micro-electronics in the past decade or so have provided tremendous opportunities for the growth of computer technology. Micro-electronic technology has made it feasible to have thousands of electronic components fabricated onto a small area of silicon wafer (approximately thumbnail size). Such fabricated wafers of silicon are popularly known as silicon chips, or simply 'chips.' With these developments, various functional units of computers (CPU, memory, input/output interfaces) which previously required thousands of separate electronic components have become very compact through integration. It has, in fact, become possible to have a CPU on one chip, a large memory capacity on a second, and the input/output interfaces on a third. These developments have brought down the cost of the various components, and have made computer systems more affordable to the end user.
The late 1970s saw the arrival of the home computer, which consisted of a limited-power (8-bit) microprocessor chip with limited memory (64 kb) and inexpensive peripheral devices, such as floppy disk drives, video monitors, keyboards and character printers. The most attractive feature of these systems was the availability of user-friendly software products, such as electronic spreadsheets, data management packages and word processing packages, which made using computers simple. Since then, a large number of vendors have introduced inexpensive computer systems catering for low-volume data processing applications: systems based on microprocessor chips from companies like Intel, Motorola and Rockwell.
The use of micro-computers as end-user computing devices got a boost when the giant, USA-based computer manufacturer International Business Machines Corporation (IBM) entered the micro-computer market. IBM introduced a product called the 'IBM Personal Computer' (IBM PC), based on the Intel 8088, a 16-bit microprocessor chip with an 8-bit external data bus, manufactured by the Intel Corporation. Soon after its introduction, several manufacturers, who had until then been offering products centred around a variety of microprocessors, introduced equivalent PCs in their own product ranges, with hardware specifications similar to those of the IBM PC. Such systems are called IBM-compatible PCs. The reason for this adoption of one specification is simply the market potential, which also accounts for the availability of third-party software (software developed neither by the manufacturer nor the user, but by a commercial software company) on IBM-compatible PCs. Later we shall see how these PCs are superior to earlier mini- and large computers in terms of meeting the information processing needs of end-users, in addition to their price advantage.
Mini-computer manufacturers like Digital, Data-General and Hewlett Packard have also taken advantage of the micro-electronic revolution and have introduced microprocessor-based versions of their earlier computers. This approach gave them advantage of providing readily available and well tested software from their minis for the inexpensive micro-computer systems.
Some manufacturers have adopted a technique called bit slicing, in which a combination of smaller microprocessors is put together to offer the power of a larger processor. For example, using bit-slicing technique, four 4-bit microprocessors can be used to make a 16-bit processor.
Today, the user has a wide choice in the availability and use of computers. Computer systems based on 32-bit microprocessors, offering features superior to those of earlier super-mini-computers are already available as desktop models, with 64-bit machines on their way.
The component in which the instructions and data to be handled by the processor are stored is called main memory. Computer memories today are normally built from electronic circuitry, although in the early computers tiny magnetic cores were used. Since auxiliary storage devices are also used for storing instructions and data, the memory from which the processor takes instructions directly is distinguished as main memory.
The basic unit of storage in memory systems is a bistable device, i.e. a component having two stable states. Conventionally these stable states are used to represent a 0 or a 1. Hence, the unit of storage is known as a binary digit, or bit. Since a bit can represent only two values, a group of bits is needed to store a meaningful instruction or data item. The standard size adopted for such a group is eight, and a group of eight bits is known as a byte. Using standard coding systems, known as ASCII (American Standard Code for Information Interchange) or EBCDIC (Extended Binary Coded Decimal Interchange Code), a byte is used to store a character (typically any keyboard character). Larger memory units are the kilobyte (written kb), a storage unit of 2^10 bytes = 1024 bytes; similarly, a megabyte (Mb) is a unit of 1024 kb (= 2^20 bytes); while gigabyte memories are increasingly common (1 Gb = 1024 Mb = 2^30 bytes). The sizes of main memories normally range from 1 to 2 Mb for home computers, to 16 Mb to 64 Mb for larger computers. These capacities help the processor to readily access the instructions and data. The larger the capacity, the better the utilization of processor power and hence the better the performance. The time taken for a main memory to supply or receive information is measured in nanoseconds (1 nanosecond = 10^-9 seconds, or one thousand-millionth of a second). A typical memory unit may take around 200 nanoseconds to transfer a byte to the processor, i.e. about 5 million bytes per second.
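The unit definitions and the transfer-rate figure above can be checked with a short calculation:

```python
# Worked check of the memory units and transfer rate discussed above.
kb = 2 ** 10            # 1 kilobyte = 1024 bytes
Mb = 2 ** 20            # 1 megabyte = 1024 kb
Gb = 2 ** 30            # 1 gigabyte = 1024 Mb

assert Mb == 1024 * kb and Gb == 1024 * Mb

# If transferring one byte takes 200 nanoseconds (200e-9 seconds),
# the throughput is 1 / 200e-9 = 5,000,000 bytes per second.
bytes_per_second = 1 / 200e-9
print(bytes_per_second)  # 5000000.0
```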
The storage space of a memory system can be used to hold permanent instructions or used as a scratch pad. Since memories are made of electronic circuits, they can be prefabricated with a desired set of frequently used instructions. The information stored in such memory modules can only be read; it cannot be overwritten or erased. Such modules are known as read-only memories (ROM).
A large portion of main memory is normally used as temporary storage space. In that portion of memory - known as random access memory (RAM) - instructions or data are copied from auxiliary storage devices. Once they are used, another set of instructions and data can be copied into the same place. This feature gives us the flexibility to use the same computer for different applications. Since RAM is made of electronic circuitry, but not prerecorded like ROM, the contents of RAM get erased by switching off the power supply.
Computer peripherals have also seen major developments. Input devices like card readers and paper tape readers have become obsolete. Since computers have become inexpensive, direct data-entry systems, by which users directly interact with the computer, have become common. These systems, which are driven by inexpensive processors (quite often they are IBM-compatible PCs) accept data from keyboards. They offer, through resident software, formatting and data validation features, with the help of which the user can design customized screens and incorporate validation checks for data to be entered.
A pointing device called a mouse has become popular for use with PCs. The mouse facilitates the selection of menu and data items displayed on VDU screen without using the conventional keyboard. The user can move the pointer displayed on the VDU screen to a desired position by moving the mouse on a pad, and indicate the selection by pressing the select buttons on the mouse.
Another useful device that helps in capturing graphic information by tracing different points on a map is digitizer. This is an important input device for applications involving spatial planning, as well as in engineering design.
Optical and video scanners are used to capture pictures directly into computer files. Optical scanners create an image of pictures in computer memory by scanning them. Video scanners take the picture of the object kept in front of a video camera. These devices are widely used in desk-top publishing (DTP) applications. They are also used in geographical information system (GIS) applications.
VDUs of different types have become common output devices with PC and mini-computer systems. A wide range is available, either based on cathode-ray tubes (like a TV) or using liquid crystal display (LCD) technology, with a wide choice of screen size, resolution, and colour or black-and-white display.
A wide range of light- to heavy-duty, hard-copy printers offering different character fonts are used with PCs and micro-based mini-computers. These printers, which are now inexpensive, can generate regional-language printouts, since they use a dot-matrix technique to print the characters. Letter-quality printers are also available for use with word processing applications. Laser printers and ink-jet printers, which produce a high-resolution hard-copy image composed by the user in computer memory, are a newer addition to the variety of printers. These are popularly used with DTP applications.
Floppy disks have become common back-up and porting media. The 5¼" disks are becoming obsolete, being replaced by 3½" disks, which are standard equipment for almost all systems. Floppy disk drives record 360 kb to 1.4 Mb of data on these disks. To back up large volumes of hard-disk-resident data, tape drives are available which record of the order of 40 to 60 Mb or more on cartridge tapes. Winchester technology is widely used for hard disks. With improved reliability, Winchester disk drives come as a compact, composite unit of drive and disk, offering storage capacities of the order of 30 Mb to several gigabytes. Compact disc read-only memories (CD-ROMs) are a recent innovation, capable of storing several hundred megabytes of data on an optical disc. Information on such discs is normally pre-recorded by the suppliers, offering data and software for popular applications such as encyclopaedias, dictionaries, literary collections and tutorial material for various subjects, with extensive illustrations. CD write-once read-many (WORM) drives are also available today.
With the availability of different types of processors and peripheral devices, a number of configurations are possible. Typically, they can be classed as:
· Stand-alone, inexpensive systems (e.g., PCs).
· Work stations (powerful processors with large memory and disk capacities, high-resolution graphics and advanced software for specialized applications).
· Mini-computer systems with dumb terminals (terminals having keyboard and VDU only) or intelligent terminals (terminals having some processing capacity) such as PCs.
· PCs interconnected through an inexpensive local area network (LAN).
· Computer systems at different locations connected through a wide area network (WAN). Some of these configurations are considered in the sections below.
The original IBM PC was based on a CPU processor chip called the Intel 8088, and optionally with an Intel 8087 numerical co-processor chip to provide faster computational speed. The motherboard (the main printed circuit board) provided 40 kb of ROM. Using expansion slots, additional RAM memory of up to 640 kb could be added. The PC in its simplest form was interfaced with two 5¼" floppy drives, a keyboard and a monochrome VDU. Additional ports could be used to connect a 10 or 20 Mb Winchester disk (in which case it was called the IBM PC/XT), printers or LAN boards.
The IBM PC/AT (Personal Computer; Advanced Technology) computer used the next generation of Intel microprocessor, the 80286 chip, with a clock frequency of 8 to 10 MHz, RAM memory of around 2 Mb and either colour graphics or enhanced graphic adapter (EGA). This system was normally interfaced with a 1.2 Mb, 5¼" floppy disk drive and a 40 Mb hard disk drive.
The next generation was the IBM PS/2 (Personal System model 2), and used micro-channel architecture for efficient input-output and graphics handling. It was based on Intel 80386 chip as the main processor, supported 8 Mb main memory and worked with a clock speed of 25 MHz, offering a capacity of about 5 MIPS.
The IBM PC and PC/XT models, being inexpensive, were widely used in home-computing and end-user computing applications. Many organizations provided at least a PC/XT to each departmental head to facilitate computing and information processing needs. The operational cost of these systems is minimal, since each unit draws less than 1 kW and requires no air-conditioning.
The technology for small computers similar to the PC has been advancing by leaps and bounds: so fast that it is difficult to keep abreast of the latest developments. User-friendly software to exploit the full potential of these powerful systems is more and more accessible, and their operating environments are making the use of computers simpler. At the same time, international operating modalities, in agricultural research as much as in any other sphere, are based increasingly upon the use of computers in every activity.
Other personal computers
Apple Computer is another popular microcomputer company, which introduced several popular PCs even before IBM entered the scene. It has used Motorola chips as its main processors. The Apple Macintosh, centred around the MC68020 processor and later versions, is its most popular PC: a system with very user-friendly screen management software, together with excellent word processing and desk-top publishing capabilities.
The concept of work stations emerged because dedicated systems can be configured to offer highly efficient problem-solving environments for special applications. Computer aided design (CAD) work stations include digitizers, plotters and high-resolution graphic monitors, in addition to powerful processors supported by large main and auxiliary memories. The user-friendly and comprehensive design software available with work stations enables the designer to solve problems with relative ease. Similarly, work stations for geographic information systems (GIS) can be used for spatial planning applications such as the development of infrastructure facilities. SUN and Apollo work stations are two popular models.
Micro based mini-computers
The current trend is to introduce low cost, mini-computer systems based on advanced microprocessor chips in a multi microprocessor architecture as multiterminal systems. The power offered by these systems is comparable to some of the super-mini-computer systems of not long before. PCs are used as terminals to mini-computers. Such configurations offer the advantages of providing computing facilities in a distributed manner, with scope for centralized processing and storage facilities wherever needed.
Microprocessor-based computer configurations vary from simple, single-user systems such as PCs, to complex multiterminal systems. Apart from these developments, minicomputers and mainframe computers offered by established computer manufacturers have undergone changes. These have become more powerful and compact, and offer powerful software systems.
Local area networks
Personal computers located in close proximity, such as in a suite of offices, can be interconnected through inexpensive hardware using telephone cables. Such an interconnection is known as a local area network (LAN). One of the computers is used as a file server to store commonly used software and data. A LAN reduces software cost, since the installation need not buy multiple copies of the software: each PC user connected through the LAN can access the software from the file server. File transfers, electronic mail, etc., are other benefits of a LAN configuration. The viability of a LAN configuration will have to be evaluated based on the cost of the LAN circuit boards to be installed in each PC, the cost of a file server to offer better performance, and the benefits of such interconnection.
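The viability evaluation suggested above amounts to a simple cost comparison. The sketch below illustrates the shape of that calculation; all prices are hypothetical placeholders, and licensing terms vary, so actual quotations must be substituted before drawing any conclusion:

```python
# Back-of-the-envelope LAN viability check. All figures are invented
# for illustration; substitute current price quotations in practice.

def lan_cost(n_pcs, board_price, server_price, cabling_price):
    """Hardware cost of networking n_pcs machines."""
    return n_pcs * board_price + server_price + cabling_price

def standalone_software_cost(n_pcs, package_price):
    """Cost of buying a separate software copy for every PC."""
    return n_pcs * package_price

n = 10
# With a LAN: network hardware plus one shared copy on the file server
# (assuming the licence permits shared use).
with_lan = lan_cost(n, board_price=150, server_price=3000,
                    cabling_price=500) + 1 * 600
# Without a LAN: one software copy per machine.
without_lan = standalone_software_cost(n, package_price=600)

print(with_lan, without_lan)  # 5600 6000
```

At these (invented) prices the LAN already pays for itself on software alone; the benefits of file transfer and electronic mail come in addition.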
Wide area networks
Computers located at various sites distant to each other can be interconnected through telecommunication networks known as wide area networks (WAN), using communication controllers, modems and associated communications software to facilitate software and data sharing amongst the users of the systems. Pooling of software and hardware resources located at different locations, and transmission of data from originating sources to destinations using them are strengths of WANs. The systems connected to a WAN need not be homogenous. Expensive software systems installed at one of the nodes of the WAN can be used by any user connected through one of the other nodes.
Computers have become powerful and well accepted - if not almost obligatory - tools for decision support, built up on the availability of user-friendly software. Operating systems, language processors, general purpose packages and special purpose packages constitute the main elements in software. General purpose, end-user software systems include electronic spreadsheets, data management packages and integrated software packages. Special purpose software systems include packages for operations research, statistical analysis, project management, computer aided design, computer aided software engineering, presentation, word processing and desk-top publishing.
Software can be classified into systems and applications software. Broadly, software that offers facilities for better utilization of systems resources is called systems software. Software developed for specific application needs is called applications software. Operating systems, language processors and general purpose end-user packages are examples of system software. Financial accounting, payroll, personnel and inventory control packages are examples of applications software. Systems software will have to be procured from the manufacturer or established software houses. Application software can be designed and developed in-house using a team of professionals or with the help of professional software developing companies.
The following sections briefly present such software packages. For more details you should refer to product literature or software reviews in computer periodicals.
The first and most important piece of software needed for computers is called the operating system (OS). This software presents an end-user view of the computer, making many physical characteristics of the computer and its peripheral devices transparent to the user. The OS allows the computer to accept commands in a language close to natural language (rather than in binary code, the native language of the machine), to execute them and thus to offer various services. The services offered by the OS include accepting instructions and data from several types of input devices, presenting results through various types of output devices, organizing the storage space on auxiliary storage devices in the form of files, and loading specified software from these devices into main memory and executing it. The basic operations required to work with different devices are all performed by the OS. Normally the user is required to specify only the operation (read/write) to be performed and the device on which to perform it; the task will be executed by the OS without burdening the user with device-dependent details. Similarly, several software packages can be stored on disk or tape and executed by giving simple instructions to the OS. In large computer systems, the OS also provides a multi-user working environment. Security through passwords, resource sharing and accounting of system utilization are some of the additional tasks performed.
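The device-independence described above is visible in any high-level language: the same open, write and read requests work whatever disk hardware lies underneath, because the OS handles the device-dependent details. A minimal sketch using Python's interface to the operating system's file services:

```python
# The OS file services hide device-dependent details: we name a file and
# an operation, and the OS does the rest (allocating space, locating the
# data on the device, reclaiming storage).
import os
import tempfile

path = os.path.join(tempfile.gettempdir(), "os_demo.txt")

with open(path, "w") as f:       # the OS creates the file and allocates space
    f.write("hello from the operating system\n")

with open(path) as f:            # the OS locates the file and reads the device
    contents = f.read()

os.remove(path)                  # the OS reclaims the storage space
print(contents.strip())          # hello from the operating system
```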
MS-DOS [Microsoft Disc Operating System] on IBM-compatible PCs and UNIX on minis have become de facto industry standards. Because of this, the portability of software and data files has increased enormously. Many computer manufacturers still offer proprietary operating systems on their mini- and large computers.
The CPU executes instructions stored in main memory by fetching and decoding them. Therefore these instructions will have to be in binary code, the machine language of the CPU. However, it is difficult to give instructions in machine language to handle even the simplest of operations. We can express our problems better in natural language closer to our application environment. Taking this into account, software developers have designed higher-level computer languages (machine language being lower-level language) and developed translators which translate instructions given in high-level language into machine language, which then can be executed.
Languages such as FORTRAN (FORmula TRANslation) for scientific applications, COBOL (COmmon Business-Oriented Language) for business data processing, and BASIC (Beginners' All-purpose Symbolic Instruction Code) for general-purpose programming have been developed. The American National Standards Institute (ANSI) has also developed standard specifications for these languages. In addition, a language called PASCAL (named after the scientist) was developed by computer scientists to promote better discipline in program writing, called structured programming. Languages such as Prolog (PROgramming in LOGic) for artificial intelligence applications, and C and C++ for developing applications involving the use of basic system resources, have become popular.
To be able to use any of these languages, we need to have the language translator software - called a compiler or interpreter - needed to execute programs written in a higher level language. Compilers or interpreters for the same language will be different for different computer operating environments. Depending on the requirements, we need to acquire these software systems.
All these languages were offered on the early mini- and large computers. Today they are all available on PCs and micro-based mini-systems. Some of the popular language processors available on PCs are Turbo Pascal, Microsoft C, Quick BASIC, Microsoft COBOL, Micro Focus COBOL, and Turbo Prolog. These language processors offer in-built editing features and efficient compilation techniques to improve programmer productivity and run-time efficiency.
Electronic spreadsheet software is considered a marvel which has brought computers closer to end-users. With a matrix-like column-row interface and cell positioning through directional keys, users can enter data in the form of text, numbers and formulae into specified cell addresses, and specify the relationships between the cells. Financial, statistical and mathematical functions supported by these packages offer model-building capabilities to end-users. Graphics features enable improved presentation of results and data. Data management functions provide good interfaces with spreadsheet databases. Table-handling facilities, such as table look-up and result tabulation by substitution of given values in specified cells, enable the user to perform 'What-if?' analyses. These features qualify spreadsheet packages to be used, in a limited sense, as decision-support system (DSS) generators, i.e. software systems that facilitate the development of DSSs. Today these packages are extensively used in cash flow projections, project investment analysis, budgeting and business planning. They have almost replaced conventional programming languages for those applications which can be modelled as spreadsheets. To give an example from the area of materials management, electronic spreadsheet packages are widely used in the generation of comparative statements, consumption budgeting exercises and A-B-C analysis.
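The spreadsheet idea, cells holding values or formulae, recalculated after any change, is what makes 'What-if?' analysis possible. A minimal sketch (the cell names and cash-flow figures are invented for illustration):

```python
# A toy spreadsheet: value cells hold numbers, formula cells hold
# callables that compute from other cells. Changing an input and
# recalculating gives a 'What-if?' analysis.

def recalculate(cells):
    """Evaluate formula cells (callables) against the value cells."""
    values = {k: v for k, v in cells.items() if not callable(v)}
    for k, v in cells.items():
        if callable(v):
            values[k] = v(values)
    return values

cells = {
    "A1": 1000.0,                       # projected revenue
    "A2": 600.0,                        # projected costs
    "A3": lambda c: c["A1"] - c["A2"],  # profit formula: A1 - A2
}

print(recalculate(cells)["A3"])   # 400.0
cells["A2"] = 700.0               # What if costs rise?
print(recalculate(cells)["A3"])   # 300.0
```

A real spreadsheet also tracks dependencies between cells so that only affected formulae are recomputed; this sketch simply recalculates everything.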
Popular electronic spreadsheet packages are VISI-CALC, Lotus 1-2-3, VP Planner, Multi-Plan, Super Calc, Excel, Quattro, Softpro-456 and IFPS.
Data management systems
Data management software facilitates the development of data processing systems with user-convenient interfaces. Facilities offered by these packages include data creation, manipulation, processing, organization, query processing and report generation. Data management packages available on PCs are directly responsible for the development of effective de-centralized information systems. Users can participate actively in the design, development and use of computer-based information systems because of the simple interfaces provided by these packages. The command- and programming-level features enable the packages to be used as generators for data-oriented, decision-support systems. Popular data management packages on PCs are dBASE IV, RBASE, Reflex, INGRES and ORACLE.
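The facilities listed above (data creation, query processing, report generation) can be illustrated with Python's built-in sqlite3 module; the table and trial figures below are invented for illustration:

```python
# Data creation, query processing and report generation, sketched with
# an in-memory SQLite database. Station names and yields are invented.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE trials (station TEXT, crop TEXT, yield_t_ha REAL)")
con.executemany("INSERT INTO trials VALUES (?, ?, ?)", [
    ("North", "paddy", 4.2),
    ("North", "wheat", 3.1),
    ("South", "paddy", 3.8),
])

# Query processing: average paddy yield across stations.
(avg_yield,) = con.execute(
    "SELECT AVG(yield_t_ha) FROM trials WHERE crop = 'paddy'").fetchone()
print(round(avg_yield, 2))  # 4.0
con.close()
```

Packages such as dBASE IV offered the same capabilities through their own command languages and screen-oriented interfaces rather than SQL.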
In a number of situations, the user is required to use features offered by electronic spreadsheets, data management packages and word processors together to solve a problem. Integrated software systems offer all these features through one package and offer a convenient programming language. They also eliminate the need for transferring data from one package to another. They are the ideal DSS generators. Popular integrated packages are FRAMEWORK II, Symphony, Focus and Farsight.
Operations research and statistics
Packages for operations research and statistics enable the user to solve optimization, forecasting and simulation problems. LINDO, GINO, SPSS/PC+, RATS and SLAM are some of the popular packages.
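Forecasting, one of the problem types these packages address, can be illustrated without any package at all. A simple moving-average forecast over an invented demand series:

```python
# A three-period moving-average forecast: predict the next value as the
# mean of the most recent observations. The demand series is invented.

def moving_average_forecast(series, window=3):
    """Forecast the next value as the mean of the last `window` values."""
    recent = series[-window:]
    return sum(recent) / len(recent)

demand = [120, 130, 125, 140, 135]
print(moving_average_forecast(demand))  # (125 + 140 + 135) / 3 = 133.33...
```

Dedicated packages add far more: exponential smoothing, regression, seasonal adjustment and diagnostic statistics, which is why they remain worth acquiring for serious forecasting work.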
Project-management software packages offer facilities to accept project network data, perform resource analysis, scheduling, cost analysis and generate reports to aid project monitoring. Popular project management packages are Harvard Total Project Management, Time Line, MS Project, INSTAPLAN and PRISM.
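The scheduling these packages perform rests on the forward pass over the project network covered in Session 4: an activity's earliest start is the latest of its predecessors' earliest finishes. A minimal sketch over an invented activity network:

```python
# Forward pass (earliest start/finish times) over a small project
# network. Activity names and durations are invented; the dict is
# assumed to list predecessors before their successors (acyclic).

def earliest_times(activities):
    """activities: {name: (duration, [predecessor names])}."""
    es, ef = {}, {}
    for name, (duration, preds) in activities.items():
        es[name] = max((ef[p] for p in preds), default=0)  # latest pred finish
        ef[name] = es[name] + duration
    return es, ef

network = {
    "A": (3, []),          # e.g., prepare field
    "B": (2, ["A"]),       # lay out plots
    "C": (4, ["A"]),       # install drainage
    "D": (1, ["B", "C"]),  # sow crop (needs both B and C complete)
}
es, ef = earliest_times(network)
print(ef["D"])  # project duration: 3 + 4 + 1 = 8
```

A full package also performs the backward pass (latest times and float), resource levelling and cost analysis over the same network data.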
Computer aided design
CAD packages provide features such as automatic dimensioning, projections, hatching, 3-D visualization and standard libraries of designs. Popular CAD packages are AUTOCAD, PRODESIGN-II and Generic CAD.
Computer aided software engineering
Computer Aided Software Engineering (CASE) packages are tools which improve the productivity of designing and developing information systems. They provide a structured systems analysis and design environment and accept systems specifications in the form of data flow diagrams, record layouts, entity-relationship diagrams, structured charts, systems flow charts and screen layouts. CASE tools automatically document systems specifications entered by the analyst and generate a number of analysis reports and diagrams. They provide features like prototyping, screen painting, validation of data flow diagrams and generation of record layouts in COBOL, dBASE IV or Pascal. Popular CASE tools are Yourdon Tool Kit, Nastec Design Aid, MEGA, Excelerator, Structsoft and TURBO ANALYST.
Presentation software systems assist the user in producing electronic slides involving text and pictures, capturing images from other software packages, and cutting and pasting from picture libraries. These packages can be used for classroom instruction, seminars, workshops and boardroom presentations.
Word processing and desk-top publishing
An application that came into prominence with the availability of inexpensive hardware is word processing. Word processing packages offer facilities to create, edit and present textual information. Cut-and-paste features, underlining, boldfacing and alignment features greatly simplify the preparation of final versions of documents. Facilities such as mail merge, spell checking, generation of table of contents, indexing, etc., greatly enhance the power of these packages. Wordstar, WordPerfect and WORD are some of the popular word processing packages.
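Mail merge, mentioned above, simply fills a form letter from a list of records. A minimal sketch using string formatting (the names and crops are invented):

```python
# Mail merge: one template, many records, one personalized letter per
# record. Recipient names and crops below are invented for illustration.

template = "Dear {name},\nYour {crop} trial report is enclosed.\n"

records = [
    {"name": "Dr. Rao", "crop": "paddy"},
    {"name": "Dr. Singh", "crop": "wheat"},
]

letters = [template.format(**r) for r in records]
print(letters[0].splitlines()[0])  # Dear Dr. Rao,
```

Word processing packages wrap this idea in a document interface, drawing the records from an address file or data management package.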
Desk-top publishing (DTP) is a related application which addresses problems such as development of page layouts, selection of fonts and inclusion of graphic objects in addition to word processing. Pagemaker and Ventura are two popular DTP software systems.
Recent developments in microcomputer technology and user-oriented software products offer enormous opportunities for improving the quality of decision making. They have provided excellent scope for developing convenient interfaces with databases, data analysis models and graphics so that the user can use the computer as a decision-support aid, accommodating personal styles in the analysis and interpretation of data. There is wide scope for using computers as a management tool in any organization.