Management of agricultural research: A training manual. Module 6: Management information systems, computers and network techniques (1997)
Session 3. Computers as management tools
Reading note: Computers as management tools
Computers have been increasingly used in research and commerce over the last three decades. The concept of the stored-program computer, in which instructions and data are stored in a memory unit and fetched and executed by a processor, has not changed, but developments in micro-electronics have brought the size and cost of computer systems down to previously unimaginable figures. In parallel, computers have become more powerful and accessible with the emergence of sophisticated, user-friendly software.
A computer is a piece of equipment which receives information, processes this information in some way according to a given set of instructions, and presents the results in a useful form.
The physical components of a computer are called hardware. The set of instructions given to the computer to accomplish a task is referred to as software.
The hardware components of a computer system consist of the input devices through which data or instructions are entered; the output devices by which the processed results are presented; and the central processing unit (CPU), which receives data or instructions from input devices, processes them, and presents the results to the output devices. The CPU has three primary components: a memory; an arithmetic and logic unit (ALU); and a control unit. Most of the components of a computer, such as the memory, ALU, control unit and interconnections to (interfaces between) input and output devices and the CPU, operate through electronic circuitry, which makes it possible to perform the processing at extremely high speeds. The speed of a CPU is normally measured in millions of instructions per second (MIPS).
The input and output devices are usually electro-mechanical items. Input devices - which include keyboards, scanners, etc. - convert mechanical actions into electrical signals and send them through the interfacing circuitry to the CPU. The output devices are printers, plotters, VDUs, etc. These devices convert the electrical signals received from the CPU into physical movements to generate the output. Certain storage devices working on the principle of electro-magnetic storage, such as disk drives or magnetic tape drives, are also used with computers. These are helpful in storing data and instructions for later use. They are also known as auxiliary storage devices. The input, output and auxiliary storage devices are commonly known as peripheral devices.
The CPU control unit fetches the instructions stored in the memory, one by one, decodes them and executes them with the help of the arithmetic and logic unit (ALU). It also coordinates operations related to transfer of data to and from input and output devices. The control unit and ALU together are called the processor.
The functional diagram of a computer system is given below:
The electronic circuitry forming the ALU and control unit is called the CPU, or simply the processor. The processor executes instructions stored in the main memory by fetching and decoding them. The instructions must be in a language the processor can understand. Normally these are groups of bits which trigger an appropriate circuitry of ALU or control unit. Each processor has its own convention for using the combinations of bits for executing specific arithmetic (add, multiply, etc.), logic (compare, branch) and control (initiate device, store, retrieve, etc.) operations. Such a convention is known as the machine language. Each processor has its own machine language. The architecture of the processor (instruction set, size of data handled per instruction, unit of data transfer between processor and memory, basic data types handled, etc.) determines its size, power and cost.
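The fetch-decode-execute cycle described above can be illustrated with a toy interpreter. This is only a sketch: the opcodes, the accumulator register and the instruction format are invented for illustration, and do not correspond to any real machine language.

```python
# A toy illustration of the fetch-decode-execute cycle. The opcodes
# (LOAD, ADD, STORE) and the single accumulator are invented here.
def run(program, memory):
    """Execute a list of (opcode, operand) pairs against a memory dict."""
    acc = 0          # accumulator register
    pc = 0           # program counter: index of the next instruction
    while pc < len(program):
        opcode, operand = program[pc]   # fetch
        pc += 1
        if opcode == "LOAD":            # decode and execute
            acc = memory[operand]
        elif opcode == "ADD":
            acc += memory[operand]
        elif opcode == "STORE":
            memory[operand] = acc
        else:
            raise ValueError(f"unknown opcode {opcode!r}")
    return memory

# Add the values at addresses 0 and 1, store the sum at address 2.
mem = run([("LOAD", 0), ("ADD", 1), ("STORE", 2)], {0: 5, 1: 7, 2: 0})
print(mem[2])   # 12
```

A real processor does exactly this in circuitry, with the instruction set fixed by its architecture rather than by an `if` chain.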
Developments in micro-electronics in the past decade or so have provided tremendous opportunities for the growth of computer technology. Micro-electronic technology has made it feasible to have thousands of electronic components fabricated into a small area of silicon wafer (approximately thumbnail size). Such fabricated wafers of silicon are popularly known as silicon chips, or simply 'chips.' With these developments, various functional units of computers (CPU, memory, input/output interfaces) which previously required thousands of separate electronic components have become very compact through integration. In fact, it has become possible to have a CPU on one chip, a large memory capacity on another, and input/output interfaces on a third. These developments have brought down the cost of various components, and have made computer systems more affordable to the end user.
The late 1970s saw the arrival of the home computer, which combined a limited-power (8-bit) microprocessor chip with limited memory (64 kb) and inexpensive peripheral devices, such as floppy disk drives, video monitors, keyboards and character printers. The most attractive feature of these systems was the availability of user-friendly software products, such as electronic spreadsheets, data management packages and word processing packages, which made using computers simple. Since then, a large number of vendors have introduced inexpensive computer systems catering for low-volume data processing applications: systems based on microprocessor chips from companies like Intel, Motorola and Rockwell.
The use of micro-computers as end-user computing devices got a boost when the giant, USA-based computer manufacturer International Business Machines Corporation (IBM) entered the micro-computer market. IBM introduced a product called the 'IBM Personal Computer' (IBM PC), based on a 16-bit microprocessor chip with an 8-bit external bus (the Intel 8088) manufactured by the Intel Corporation. Soon after its introduction, several manufacturers, who were otherwise offering different products centred around a variety of microprocessors, introduced equivalent PCs in their own product ranges, with hardware specifications similar to those of the IBM PC. Such systems are called IBM-compatible PCs. The reason for such adoption of one specification is simply the market potential. The same factor is responsible for the availability of third-party software (software developed by neither manufacturer nor user, but by a commercial software company) on IBM-compatible PCs. Later we shall see how these PCs are superior to earlier mini- and large computers in terms of meeting the information processing needs of end-users, in addition to their price advantage.
Mini-computer manufacturers like Digital, Data-General and Hewlett Packard have also taken advantage of the micro-electronic revolution and have introduced microprocessor-based versions of their earlier computers. This approach gave them advantage of providing readily available and well tested software from their minis for the inexpensive micro-computer systems.
Some manufacturers have adopted a technique called bit slicing, in which a combination of smaller microprocessors is put together to offer the power of a larger processor. For example, using bit-slicing technique, four 4-bit microprocessors can be used to make a 16-bit processor.
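The arithmetic behind bit slicing can be sketched in a few lines. Each 4-bit slice adds one nibble of the operands and passes its carry to the next slice; chaining four slices yields a 16-bit adder. The function names below are invented for this sketch.

```python
# Bit-slicing sketch: four 4-bit adders chained through a carry form
# a 16-bit adder. add_4bit stands in for one 4-bit ALU slice.
def add_4bit(a, b, carry_in):
    total = a + b + carry_in
    return total & 0xF, total >> 4      # 4-bit result, carry out

def add_16bit(x, y):
    result, carry = 0, 0
    for slice_no in range(4):           # least-significant nibble first
        a = (x >> (4 * slice_no)) & 0xF
        b = (y >> (4 * slice_no)) & 0xF
        s, carry = add_4bit(a, b, carry)
        result |= s << (4 * slice_no)
    return result & 0xFFFF              # 16-bit result wraps on overflow

print(hex(add_16bit(0x1234, 0x0FFF)))   # 0x2233
```

The same carry-chaining applies to the hardware case: the slices operate in parallel on their nibbles, with only the carry rippling between them.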
Today, the user has a wide choice in the availability and use of computers. Computer systems based on 32-bit microprocessors, offering features superior to those of earlier super-mini-computers are already available as desktop models, with 64-bit machines on their way.
The component in which the instructions and data to be handled by the processor are stored is called main memory. Computer memories today are normally built from electronic circuitry, although in the early computers tiny magnetic cores were used. Since auxiliary storage devices are also used for storing instructions and data, the memory from which the processor takes instructions directly is distinguished as main memory.
The basic unit of storage in memory systems is a bistable device, i.e. a component having two stable states. Conventionally these stable states are used to represent a 0 or a 1. Hence, the unit of storage is known as a binary digit, or bit. Since a bit can represent only two values, we need a group of bits to store a meaningful instruction or data item. The standard size adopted for such a group is eight, and a group of eight bits is known as a byte. Using standard coding systems, known as ASCII (American Standard Code for Information Interchange) or EBCDIC (Extended Binary Coded Decimal Interchange Code), a byte is used to store a character (typically any keyboard character). Larger memory units are the kilobyte (written kb), a storage unit of 2^10 bytes = 1024 bytes; similarly, a megabyte (Mb) is a unit of 1024 kb (= 2^20 bytes); while gigabyte memories are increasingly common (1 Gb = 1024 Mb = 2^30 bytes). The sizes of main memories normally range from 1 to 2 Mb for home computers, to 16 Mb to 64 Mb for larger computers. These capacities help the processor to readily access instructions and data. The larger the capacity, the better the utilization of processor power and hence the better the performance. The time taken for a main memory to supply or receive information is measured in nanoseconds (10^-9 seconds, or one thousand-millionth of a second). A typical memory unit may take around 200 nanoseconds to transfer a byte to the processor, i.e., 5 Mb per second.
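The unit definitions and the transfer-rate arithmetic above can be checked directly:

```python
# Power-of-two memory units, as defined in the text.
KB = 2 ** 10          # 1 kilobyte = 1024 bytes
MB = 2 ** 20          # 1 megabyte = 1024 kb
GB = 2 ** 30          # 1 gigabyte = 1024 Mb

# A memory that transfers one byte every 200 nanoseconds moves
# 1 / (200e-9 s) = 5 million bytes per second, i.e. about 5 Mb/s.
bytes_per_second = 1 / 200e-9

print(KB, MB // KB, GB // MB, bytes_per_second)
```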
The storage space of a memory system can be used to hold permanent instructions or used as a scratch pad. Since memories are made of electronic circuits, they can be prefabricated to hold a desired set of frequently used instructions. The information stored in such memory modules cannot be overwritten; it can only be read. Such memory modules are known as read-only memories (ROM); since their contents are fixed at fabrication, they cannot be erased.
A large portion of main memory is normally used as temporary storage space. In that portion of memory - known as random access memory (RAM) - instructions or data are copied from auxiliary storage devices. Once they are used, another set of instructions and data can be copied into the same place. This feature gives us the flexibility to use the same computer for different applications. Since RAM is made of electronic circuitry, but not prerecorded like ROM, the contents of RAM get erased by switching off the power supply.
Computer peripherals have also seen major developments. Input devices like card readers and paper tape readers have become obsolete. Since computers have become inexpensive, direct data-entry systems, by which users directly interact with the computer, have become common. These systems, which are driven by inexpensive processors (quite often IBM-compatible PCs), accept data from keyboards. Through resident software they offer formatting and data validation features, with the help of which the user can design customized screens and incorporate validation checks for the data to be entered.
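The validation-check idea can be sketched as follows. The field names and rules here are invented examples, not the facilities of any particular data-entry package.

```python
# A minimal sketch of data validation at entry time: each field of an
# entry screen carries a check applied before the record is accepted.
def validate(record, rules):
    """Return a list of error messages; an empty list means the record is valid."""
    errors = []
    for field, check, message in rules:
        if not check(record.get(field)):
            errors.append(f"{field}: {message}")
    return errors

# Hypothetical rules for a field-trial record.
rules = [
    ("plot_no",  lambda v: isinstance(v, int) and v > 0,
                 "must be a positive integer"),
    ("yield_kg", lambda v: isinstance(v, (int, float)) and 0 <= v <= 20000,
                 "must be between 0 and 20000"),
]

print(validate({"plot_no": 12, "yield_kg": 3400.5}, rules))   # []
print(validate({"plot_no": 0,  "yield_kg": 3400.5}, rules))   # one error
```

A real data-entry package wires such checks to screen fields so that an invalid value is rejected before it ever reaches the file.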
A pointing device called a mouse has become popular for use with PCs. The mouse facilitates the selection of menu and data items displayed on VDU screen without using the conventional keyboard. The user can move the pointer displayed on the VDU screen to a desired position by moving the mouse on a pad, and indicate the selection by pressing the select buttons on the mouse.
Another useful device, which helps in capturing graphic information by tracing different points on a map, is the digitizer. This is an important input device for applications involving spatial planning, as well as in engineering design.
Optical and video scanners are used to capture pictures directly into computer files. Optical scanners create an image of pictures in computer memory by scanning them. Video scanners take the picture of the object kept in front of a video camera. These devices are widely used in desk-top publishing (DTP) applications. They are also used in geographical information system (GIS) applications.
VDUs of different types have become common output devices with PC and minicomputer systems. A wide range is available, either cathode-ray tube based (like a TV) or using liquid crystal display (LCD) technology, with a wide range of options in terms of screen size, resolution, and colour or black-and-white display quality.
A wide range of light- to heavy-duty, hard-copy printers offering different character fonts are used with PCs and micro-based mini-computers. These printers, which are now inexpensive, can generate regional-language printouts, since they use a dot matrix technique to print the characters. Letter-quality printers are also available for use with word processing applications. Laser printers and ink-jet printers, which produce a high-resolution hard-copy image composed by the user in computer memory, are a new addition to the variety of printers. These are popularly used with DTP applications.
Floppy disks have become common backup and porting media. The 5¼" disks are becoming obsolete, being replaced by 3½" disks. They are standard equipment for almost all systems. Floppy disk drives record 360 kb to 1.4 Mb of data on these disks. To back up large volumes of hard-disk-resident data, tape drives are available which record to cartridge tapes, holding of the order of 40 to 60 Mb or greater. Winchester technology is widely used for hard disks. With improved reliability, Winchester disk drives come as a compact, composite unit of drive and disk, offering storage capacities of the order of 30 Mb to several gigabytes. Compact disc read-only memories (CD-ROMs) are a recent innovation, capable of storing gigabytes of data on an optical disc. Information on such discs is normally pre-recorded by the suppliers, offering data and software for popular applications such as encyclopaedias, dictionaries, literary collections and tutorial material for various subjects, with extensive illustrations. CD drives with write-once read-many (WORM) capability are also available today.
With the availability of different types of processors and peripheral devices, a number of configurations are possible. Typically, they can be classed as:
· Stand-alone, inexpensive systems (e.g., PCs).
· Work stations (powerful processors with large memory and disk capacities, high-resolution graphics and advanced software for specialized applications).
· Mini-computer systems with dumb terminals (terminals having keyboard and VDU only) or intelligent terminals (terminals having some processing capacity) such as PCs.
· PCs interconnected through an inexpensive local area network (LAN).
· Computer systems at different locations connected through a wide area network (WAN). Some of these configurations are considered in the sections below.
The original IBM PC was based on a CPU chip called the Intel 8088, optionally supplemented by an Intel 8087 numerical co-processor chip to provide faster computational speed. The motherboard (the main printed circuit board) provided 40 kb of ROM. Using expansion slots, RAM could be expanded to up to 640 kb. The PC in its simplest form was interfaced with two 5¼" floppy drives, a keyboard and a monochrome VDU. Additional ports could be used to connect a 10 or 20 Mb Winchester disk (in which case it was called the IBM PC/XT), printers or LAN boards.
The IBM PC/AT (Personal Computer: Advanced Technology) used the next generation of Intel microprocessor, the 80286 chip, with a clock frequency of 8 to 10 MHz, RAM of around 2 Mb, and either a colour graphics adapter or an enhanced graphics adapter (EGA). This system was normally interfaced with a 1.2 Mb, 5¼" floppy disk drive and a 40 Mb hard disk drive.
The next generation was the IBM PS/2 (Personal System/2), which used micro-channel architecture for efficient input-output and graphics handling. It was based on the Intel 80386 chip as the main processor, supported 8 Mb of main memory and worked at a clock speed of 25 MHz, offering a capacity of about 5 MIPS.
The IBM PC and PC/XT models, being inexpensive, were widely used in home-computing and end-user computing applications. Many organizations provided at least a PC/XT to each departmental head to facilitate computing and information processing needs. The operational cost of these systems is minimal, since their power requirements are less than 1 kW per unit and they do not require air-conditioning.
The technology for small computers similar to the PC has been advancing by leaps and bounds: so fast that it is difficult to keep abreast of the latest developments. User-friendly software to exploit the full potential of these powerful systems is more and more accessible, and their operating environments are making the use of computers simpler. At the same time, international operating modalities - in agricultural research as much as in any other sphere - are based increasingly upon the use of computers in every activity.
Other personal computers
Apple Computer is another popular microcomputer company, which introduced several popular PCs even before IBM entered the scene. The company has used Motorola chips as its main processors. The Apple Macintosh, the most popular PC centred around the MC68020 processor and later versions, is a system with very user-friendly screen management software, together with excellent word processing and desk-top publishing capabilities.
The concept of work stations emerged because dedicated systems can be configured to offer highly efficient problem solving environments for special applications. Computer aided design (CAD) applications available on work stations include digitizers, plotters and high resolution graphic monitors, in addition to the powerful processors supported by large main and auxiliary memories. User-friendly and comprehensive design software available with work stations enable the designer to solve problems with relative ease. Similarly work stations for geographic information systems (GIS) can be used for spatial planning applications such as development of infrastructure facilities. SUN and Apollo work stations are two popular models.
Micro-based mini-computers
The current trend is to introduce low cost, mini-computer systems based on advanced microprocessor chips in a multi microprocessor architecture as multiterminal systems. The power offered by these systems is comparable to some of the super-mini-computer systems of not long before. PCs are used as terminals to mini-computers. Such configurations offer the advantages of providing computing facilities in a distributed manner, with scope for centralized processing and storage facilities wherever needed.
Microprocessor-based computer configurations vary from simple, single-user systems such as PCs, to complex multiterminal systems. Apart from these developments, minicomputers and mainframe computers offered by established computer manufacturers have undergone changes. These have become more powerful and compact, and offer powerful software systems.
Local area networks
Personal computers located in close proximity, such as in a suite of offices, can be interconnected through inexpensive hardware using telephone cables. Such an interconnection is known as a local area network (LAN). One of the computers is used as a file server to store commonly used software and data. A LAN reduces software cost, since an installation need not buy multiple copies of the software: each PC user connected through the LAN can access the software from the file server. Apart from this, file transfers, electronic mail, etc., are other benefits of a LAN configuration. The viability of a LAN configuration has to be evaluated against the cost of the LAN circuit boards to be installed in each PC, the cost of a file server to offer better performance, and the benefits of such interconnection.
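The viability comparison described above is simple arithmetic, sketched below. All prices are invented placeholders; an actual evaluation would use quoted costs and would also weigh non-monetary benefits such as file sharing and electronic mail.

```python
# Back-of-envelope LAN viability sketch: compare buying one software
# licence per stand-alone PC against one shared copy on a file server
# plus LAN hardware. All figures are hypothetical.
def lan_saving(n_pcs, sw_licence, lan_board, file_server):
    standalone = n_pcs * sw_licence                      # a licence per PC
    lan = sw_licence + n_pcs * lan_board + file_server   # one shared copy
    return standalone - lan

# 10 PCs, a 500-unit licence, 80-unit LAN boards, a 2000-unit server:
print(lan_saving(10, 500, 80, 2000))   # positive means the LAN pays off
```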
Wide area networks
Computers located at sites distant from each other can be interconnected through telecommunication networks known as wide area networks (WANs), using communication controllers, modems and associated communications software to facilitate software and data sharing amongst the users of the systems. Pooling of software and hardware resources located at different sites, and transmission of data from originating sources to destinations, are strengths of WANs. The systems connected to a WAN need not be homogeneous. Expensive software systems installed at one of the nodes of a WAN can be used by any user connected through one of the other nodes.
Computers have become powerful and well accepted - if not almost obligatory - tools for decision support, built up on the availability of user-friendly software. Operating systems, language processors, general purpose packages and special purpose packages constitute the main elements in software. General purpose, end-user software systems include electronic spreadsheets, data management packages and integrated software packages. Special purpose software systems include packages for operations research, statistical analysis, project management, computer aided design, computer aided software engineering, presentation, word processing and desk-top publishing.
Software can be classified into systems and applications software. Broadly, software that offers facilities for better utilization of systems resources is called systems software. Software developed for specific application needs is called applications software. Operating systems, language processors and general purpose end-user packages are examples of system software. Financial accounting, payroll, personnel and inventory control packages are examples of applications software. Systems software will have to be procured from the manufacturer or established software houses. Application software can be designed and developed in-house using a team of professionals or with the help of professional software developing companies.
The following sections briefly present such software packages. For more details you should refer to product literature or software reviews in computer periodicals.
The first and most important piece of software needed for computers is the operating system (OS). This software presents an end-user view of the computer, making several physical characteristics of the computer and its peripheral devices transparent to the user. The OS allows the computer to accept commands in a near-natural command language (rather than in binary code, the native language of the machine), to execute them, and thus to offer various services. The services offered by the OS include acceptance of instructions and data from several types of input devices, presentation of results through various types of output devices, organization of the storage space on auxiliary storage devices in the form of files, loading of specified software from these devices into main memory for execution, and so forth. The basic operations required to work with different devices are all performed by the OS. Normally the user is required to specify only the operation (read/write) to be performed and the device on which to perform it. The task will be executed by the OS without burdening the user with device-dependent details. Similarly, several software packages can be stored on disk or tape and executed by giving simple instructions to the OS. In large computer systems, the OS also provides a multi-user working environment. Security through passwords, resource sharing, accounting of system utilization, etc., are some of the additional tasks performed.
MS-DOS [Microsoft Disk Operating System] on IBM-compatible PCs and UNIX on minis have become de facto industry standards. Because of this, the portability of software and data files has increased enormously. Many computer manufacturers still offer proprietary operating systems on their mini- and large computers.
The CPU executes instructions stored in main memory by fetching and decoding them. Therefore these instructions will have to be in binary code, the machine language of the CPU. However, it is difficult to give instructions in machine language to handle even the simplest of operations. We can express our problems better in natural language closer to our application environment. Taking this into account, software developers have designed higher-level computer languages (machine language being lower-level language) and developed translators which translate instructions given in high-level language into machine language, which then can be executed.
For scientific and business applications, languages such as FORTRAN (FORmula TRANslation), COBOL (COmmon Business-Oriented Language) and BASIC (Beginners All-purpose Symbolic Instruction Code) have been developed. The American National Standards Institute (ANSI) has also developed standard specifications for these languages. In addition to these, a language called PASCAL (named after the scientist) was developed by computer scientists to promote better discipline in program writing, called structured programming. Languages such as Prolog (PROgramming in LOGic) for artificial intelligence applications, and C/C++ for developing applications involving the use of basic system resources, have become popular.
To use any of these languages, we need the language translator software - called a compiler or interpreter - that executes programs written in a higher-level language. Compilers or interpreters for the same language will differ between computer operating environments. Depending on the requirements, we need to acquire these software systems.
All these languages were offered on the early mini- and large computers. Today they are all available on PCs and micro-based mini-systems. Some of the popular language processors available on PCs are Turbo Pascal, Microsoft C, Quick BASIC, Microsoft COBOL, Micro Focus COBOL, and Turbo Prolog. These language processors offer in-built editing features and efficient compilation techniques to improve programmer productivity and run-time efficiency.
Electronic spreadsheet software is considered a software marvel which has brought computers closer to end-users. With a matrix-like column-row interface and cell positioning through directional keys, users can enter data in the form of text, numbers and formulae into specified cell addresses, and specify the relationships between the cells. Financial, statistical and mathematical functions supported by these packages offer model-building capabilities to end-users. Graphics features enable improved presentation of results and data. Data management functions provide good interfaces with spreadsheet databases. Table-handling facilities, such as table look-up and result tabulation by substitution of given values in specified cells, enable the user to perform 'What-if?' analyses. These features qualify spreadsheet packages to be used as DSS generators (software systems that facilitate the development of decision-support systems) in a limited sense. Today these packages are extensively used in cash flow projections, project investment analysis, budgeting and business planning. They have almost replaced the use of conventional programming languages for those applications which can be modelled as spreadsheets. To give an example in the area of materials management, electronic spreadsheet packages are widely used in the generation of comparative statements, consumption budgeting exercises and A-B-C analysis.
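The core spreadsheet mechanism behind 'What-if?' analysis can be modelled in a few lines: cells hold either values or formulas over other cells, and changing an input recalculates every dependent result. The cell addresses and figures below are invented for illustration.

```python
# A toy spreadsheet: a cell is either a value or a formula (a function
# of the sheet). Reading a formula cell recalculates it, so changing
# an input cell answers a 'What-if?' question immediately.
def get(sheet, addr):
    cell = sheet[addr]
    return cell(sheet) if callable(cell) else cell

sheet = {
    "A1": 1000,                                   # units sold
    "A2": 2.5,                                    # price per unit
    "A3": lambda s: get(s, "A1") * get(s, "A2"),  # revenue formula
}

print(get(sheet, "A3"))        # 2500.0
sheet["A2"] = 3.0              # what if the price rises?
print(get(sheet, "A3"))        # 3000.0
```

A commercial package adds the grid interface, built-in functions and efficient recalculation, but the dependency idea is the same.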
Popular electronic spreadsheet packages are VISI-CALC, Lotus 1-2-3, VP Planner, Multi-Plan, Super Calc, Excel, Quattro, Softpro-456 and IFPS.
Data management systems
Data management software facilitates development of data processing systems with user-convenient interfaces. Facilities offered by these packages include data creation, manipulation, processing, organization, query processing and report generation. Data management packages available on PCs are directly responsible for the development of effective de-centralized information systems. Users can participate actively in the design, development and use of computer-based information systems because of the simple interfaces provided by these packages. The command- and programming-level features enable the packages to be used as generators for data-oriented, decision-support systems. Popular data management packages on PCs are dBASE IV, RBASE, Reflex, INGRES and ORACLE.
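The facilities listed above (data creation, query processing, report generation) can be sketched with Python's built-in sqlite3 module, which speaks the same SQL family of commands as the relational packages named; the table and column names here are invented for illustration.

```python
# Data creation, query processing and a simple report with sqlite3.
import sqlite3

con = sqlite3.connect(":memory:")   # a throwaway in-memory database
con.execute("CREATE TABLE trials (crop TEXT, station TEXT, yield_kg REAL)")
con.executemany("INSERT INTO trials VALUES (?, ?, ?)", [
    ("maize", "north", 4200.0),
    ("maize", "south", 3900.0),
    ("wheat", "north", 2800.0),
])

# Query processing: average yield per crop, as a small report.
rows = con.execute(
    "SELECT crop, AVG(yield_kg) FROM trials GROUP BY crop ORDER BY crop"
).fetchall()
for crop, avg in rows:
    print(f"{crop:8s} {avg:8.1f}")
```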
In a number of situations, the user is required to use features offered by electronic spreadsheets, data management packages and word processors together to solve a problem. Integrated software systems offer all these features through one package and offer a convenient programming language. They also eliminate the need for transferring data from one package to another. They are the ideal DSS generators. Popular integrated packages are FRAMEWORK II, Symphony, Focus and Farsight.
Operations research and statistics
Packages for operations research and statistics enable the user to solve optimization, forecasting and simulation problems. LINDO, GINO, SPSS/PC+, RATS and SLAM are some of the popular packages.
Project-management software packages offer facilities to accept project network data, perform resource analysis, scheduling, cost analysis and generate reports to aid project monitoring. Popular project management packages are Harvard Total Project Management, Time Line, MS Project, INSTAPLAN and PRISM.
Computer aided design
CAD packages provide features such as automatic dimensioning, projections, hatching, 3-D visualization and standard libraries of designs. Popular CAD packages are AUTOCAD, PRODESIGN-II and Generic CAD.
Computer aided software engineering
Computer Aided Software Engineering (CASE) packages are tools which improve the productivity of designing and developing information systems. They provide a structured systems analysis and design environment and accept systems specifications in the form of data flow diagrams, record layouts, entity-relationship diagrams, structured charts, systems flow charts and screen layouts. CASE tools automatically document systems specifications entered by the analyst and generate a number of analysis reports and diagrams. They provide features like prototyping, screen painting, validation of data flow diagrams and generation of record layouts in COBOL, dBASE IV or Pascal. Popular CASE tools are Yourdon Tool Kit, Nastec Design Aid, MEGA, Excelerator, Structsoft and TURBO ANALYST.
Presentation software systems assist the user to produce electronic slides involving text and pictures, to capture pictures from other software packages, and to cut and paste from picture libraries. The packages can be used for classroom instruction, seminars, workshops and boardroom presentations.
Word processing and desk-top publishing
An application that came into prominence with the availability of inexpensive hardware is word processing. Word processing packages offer facilities to create, edit and present textual information. Cut-and-paste features, underlining, boldfacing and alignment features greatly simplify the preparation of final versions of documents. Facilities such as mail merge, spell checking, generation of table of contents, indexing, etc., greatly enhance the power of these packages. Wordstar, WordPerfect and WORD are some of the popular word processing packages.
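The mail-merge facility mentioned above is essentially template substitution: one letter combined with a list of records. The names and fields below are invented examples.

```python
# A minimal mail-merge sketch: one template letter, many records.
from string import Template

template = Template("Dear $name,\nYour $item order is ready for collection.\n")

records = [
    {"name": "Dr Rao",   "item": "seed catalogue"},
    {"name": "Ms Okoye", "item": "field manual"},
]

letters = [template.substitute(r) for r in records]
print(letters[0])
```

A word processing package adds formatting and printing on top, but each personalized letter is produced by exactly this kind of substitution.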
Desk-top publishing (DTP) is a related application which addresses problems such as development of page layouts, selection of fonts and inclusion of graphic objects in addition to word processing. Pagemaker and Ventura are two popular DTP software systems.
Recent developments in microcomputer technology and user-oriented software products offer enormous opportunities for improving the quality of decision making. They have provided excellent scope for developing convenient interfaces with databases, data analysis models and graphics so that the user can use the computer as a decision-support aid, accommodating personal styles in the analysis and interpretation of data. There is wide scope for using computers as a management tool in any organization.
Recent developments in computer technology can be exploited to excellent effect in various facets of agricultural research. Consider the tasks associated with management of agricultural research.
The director of a typical institution conducting agricultural research would have to manage the major areas of:
· research,
· training, and
· administration.
In research, computers can assist in management, primarily in the areas of project management and analysis of research data.
Formulation of projects for agricultural research involves extensive searches of literature and development of a technically feasible proposal.
Computers can offer support in the task of literature search through information retrieval systems in libraries. A good information retrieval system which offers selection and retrieval of related work on the topic of research interest can greatly enhance the work of researchers. If a network service is available connecting the libraries of related organizations, the scope of the search can be extended further. One library network package available for PCs is CDS/ISIS, distributed by UNESCO. Apart from this, there are a number of other packages for this service.
Tasks like working out project budgets, time frames and generation of proposal reports can be aided with the help of electronic spreadsheets, data management systems and word processing, which facilitate development and presentation of different alternatives with relative ease.
Monitoring and bookkeeping activities related to project finances (grants and expenditures) can be aided through accounting packages developed using data management systems. Several useful reports for internal record keeping as well as for submission to funding agencies can be generated by these packages. Any of the project management software systems listed earlier would offer comprehensive analysis and reporting features. Users should be acquainted with formal project management concepts in order to use these packages effectively.
Research data analysis
Use of computers for data analysis is not new to agricultural research scientists. Statistical techniques such as regression analysis, discriminant analysis and factor analysis are used widely in research studies. These were performed using minicomputers or mainframes in the past. Today every researcher can perform these analyses for reasonably large-sized problems more easily on PCs with the help of the user-friendly packages described earlier. Where necessary, more complex analyses can be performed using advanced statistical and operations research packages. Area planning applications can use graphic software. GIS packages can assist researchers to generate alternatives by performing complex data analyses of spatial information and displaying the solutions on maps. Research studies relating to monitoring applications can also benefit from thematic mapping software systems, which reduce cumbersome mapping tasks.
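To illustrate the kind of analysis involved, the sketch below fits a simple linear regression of crop yield on fertilizer dose. The data, variable names and units are invented for illustration; in practice a researcher would use one of the statistical packages mentioned above.

```python
# Ordinary least-squares fit of y = a + b*x for a single predictor.
# Hypothetical data: fertilizer dose (kg/ha) against crop yield (t/ha).
def simple_regression(x, y):
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    # slope = covariance(x, y) / variance(x)
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    b = sxy / sxx
    a = mean_y - b * mean_x  # intercept
    return a, b

dose = [0, 20, 40, 60, 80]
yield_t = [1.0, 1.5, 2.1, 2.4, 3.0]
a, b = simple_regression(dose, yield_t)
print(f"yield = {a:.2f} + {b:.4f} * dose")  # → yield = 1.02 + 0.0245 * dose
```

Statistical packages add to this the significance tests, residual analysis and multi-variable extensions that a research study would normally require.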
Training in the context of agricultural research involves development of case studies based on research and associated experiences in the field. Preparation of teaching material can be assisted by computers through data analysis, data management and packages performing the transfer of data from one package to another, and then using word processing and DTP for reports and didactic material preparation. Cases can be updated easily if they are maintained in computer memories.
By using a computer-connected projection system, presentations in seminars, workshops and classrooms can be made more effective, either by presenting live situations of data analysis, or by presenting an electronic slide show. Such teaching modules can be easily exchanged among instructors and made available for wider dissemination.
A large number of administrative functions related to agricultural research projects can be assisted by computers.
Preparation of financial statements through processing of revenue and expenditure documents on a day-to-day basis can be done by computers. Such systems offer correct and up-to-date statements on the financial position of the institution. A detailed project accounting statement giving the expenditure under different budgetary heads can also be generated using the same data. Such details help the project coordinator to work out a financial plan.
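As a minimal sketch of how such a project accounting statement can be derived from day-to-day expenditure documents (the records and budgetary heads below are invented for illustration):

```python
# Group expenditure records under budgetary heads and total each head.
# The records and head names are invented for illustration.
expenditures = [
    {"head": "Salaries",  "amount": 12000},
    {"head": "Equipment", "amount": 4500},
    {"head": "Salaries",  "amount": 11800},
    {"head": "Travel",    "amount": 900},
]

statement = {}
for rec in expenditures:
    statement[rec["head"]] = statement.get(rec["head"], 0) + rec["amount"]

for head, total in sorted(statement.items()):
    print(f"{head:<12} {total:>8}")
```

The same accumulated data can serve both the day-to-day financial statements and the project-wise accounting statement.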
Generation of employee payslips and associated accounting statements is a fairly common and well accepted application of computers.
Processing of basic salaries, allowances and recoveries to generate pay-slips would also facilitate automatic preparation of various statements to be sent to external agencies like insurance, provident funds and banks. Similar statements can also be prepared for various internal servicing units, including the telephone department for telephone charges recovered, hostel on the amount of hostel bill recovered, cooperative society on loan amounts recovered, and personnel and accounts department on the recovery of various loans sanctioned by the organization.
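The pay-slip computation described above is, in essence, simple arithmetic: net pay is the basic salary plus allowances, less recoveries. A minimal sketch, with hypothetical figures and headings:

```python
# Net pay = basic salary + allowances - recoveries, as described above.
# All figures and headings are hypothetical.
def make_payslip(basic, allowances, recoveries):
    gross = basic + sum(allowances.values())
    total_recovered = sum(recoveries.values())
    return {"gross": gross,
            "recoveries": total_recovered,
            "net": gross - total_recovered}

slip = make_payslip(
    basic=5000,
    allowances={"house rent": 1000, "dearness": 750},
    recoveries={"provident fund": 600, "housing loan": 400},
)
print(slip)  # {'gross': 6750, 'recoveries': 1000, 'net': 5750}
```

The itemized recoveries are exactly the figures needed for the statements to insurance, provident fund and other servicing units.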
Payroll systems can also assist management by facilitating the study of the financial implications of various wage revision schemes. The system, with some extensions, can also be used in wage negotiation exercises.
Personnel information system
Personnel information systems, which contain the basic data on various employees of the organization, can assist the personnel department in planning and execution of human resources development activities and employee welfare schemes by providing vital information on employee background, such as educational qualifications, training and past experience. Availability or gaps in human resources can be estimated easily. Future scenarios can be developed under various policy options to help the organization develop long-range plans.
Administrative services of the library, such as circulation and document acquisition systems, can be effectively aided by computer-based systems. Operational efficiency and user services can be significantly improved through such systems. The circulation system can assist in locating books in circulation, generating loan records, overdue statements and usage frequency of library materials. The acquisition system keeps track of books procurement. In the case of periodicals, the system can keep track of receipt of volumes and assist in follow-up procedures.
Qualitative improvement to the services provided to researchers can be accomplished by using computerized indexing and information retrieval systems. Such systems can provide selected retrieval of information on recent acquisitions as well as acquisitions as of a given date for any given author, subject, publication, keyword, etc. Sharing and access of similar information from other libraries is feasible through computer networks.
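A selected-retrieval query of the kind described above reduces to filtering catalogue records on a keyword and an acquisition date. A minimal sketch, with invented catalogue entries:

```python
# Selected retrieval: filter a catalogue by keyword and acquisition date.
# The catalogue entries are invented for illustration.
from datetime import date

catalogue = [
    {"title": "Soil Fertility Management", "keywords": {"soil", "fertility"},
     "acquired": date(1996, 3, 1)},
    {"title": "Rice Breeding Methods", "keywords": {"rice", "breeding"},
     "acquired": date(1996, 9, 15)},
]

def retrieve(keyword, since):
    """Return titles matching the keyword, acquired on or after `since`."""
    return [b["title"] for b in catalogue
            if keyword in b["keywords"] and b["acquired"] >= since]

print(retrieve("rice", date(1996, 1, 1)))  # ['Rice Breeding Methods']
```

Production packages such as CDS/ISIS index authors, subjects and keywords so that such queries run efficiently over large catalogues.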
Management information system
Information about achievements and use of resources compared with targets and budgets can be prepared through computer-based systems maintaining information on various activities of the organization. Such reports help management in taking timely corrective actions (if needed) and guide it toward efficient use of resources. Systems can be developed to aid in the planning and monitoring tasks of departmental heads by providing them with information on activities desired at specific intervals.
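Such a monitoring report is essentially a comparison of actuals against targets, with shortfalls flagged for corrective action. A minimal sketch, using invented activities and figures:

```python
# Compare achievements against targets and flag shortfalls for review.
# The activities and figures are invented for illustration.
targets = {"field trials": 40, "publications": 10, "training courses": 6}
actuals = {"field trials": 33, "publications": 11, "training courses": 4}

variances = {a: actuals[a] - targets[a] for a in targets}
for activity, v in variances.items():
    flag = "REVIEW" if v < 0 else "ok"
    print(f"{activity:<18} target={targets[activity]:>3} "
          f"actual={actuals[activity]:>3} variance={v:+d}  {flag}")
```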
The applications presented above are not exhaustive. Depending on the volume of data, complexity of procedures or need for quick retrieval of data, computer applications would vary. Systematic development of systems with the correct, open perspective can bring desired results through the use of information technology. The next section deals with approaches to the use of computers in organizations.
With the prices of microcomputers coming down and the systems becoming more and more user-friendly, use of microcomputers in a de-centralized set-up has been steadily increasing. PCs are used in a number of de-centralized data processing and decision-support applications. In this section we discuss two types of use in detail.
Systems for data processing
One of the most common applications of microcomputers in a de-centralized set-up is data processing. These applications are developed for either regular use or prototyping.
Regular data processing systems
Since software available on PCs makes it possible to develop data processing systems with less effort, one is often tempted to develop systems for regular use. These systems can, however, be successfully implemented only if the security and integrity features, which are weak in PCs, are achieved through externally imposed data access discipline, i.e., by following certain norms for accessing database files and by establishing procedures for back-up and recovery. Normally such externally imposed discipline functions well if the systems are managed by individuals or by close-knit groups. Special efforts are, however, required to extend PC-based data processing systems to a general-user environment, because it is difficult to impose security and integrity disciplines externally on a large group of users. Advanced PCs which offer UNIX-like operating systems and advanced database management systems are one solution to this problem. Today, a large number of organizations are adopting this approach, and quite a few data processing applications have been developed and are in regular use on PCs.
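The externally imposed back-up discipline mentioned above can be as simple as copying each database file into a directory named after the date of the back-up. A minimal sketch (the file name is hypothetical):

```python
# Copy data files into a sub-directory named after today's date,
# keeping one dated snapshot per back-up run.
import pathlib
import shutil
import tempfile
from datetime import date

def back_up(files, backup_root):
    """Copy each data file into a directory named after today's date."""
    dest = pathlib.Path(backup_root) / date.today().isoformat()
    dest.mkdir(parents=True, exist_ok=True)
    for f in files:
        shutil.copy2(f, dest)
    return dest

# Demonstration with a throw-away file in a temporary directory.
work = pathlib.Path(tempfile.mkdtemp())
data_file = work / "projects.dbf"
data_file.write_text("sample records")
backup_dir = back_up([data_file], work / "backups")
print("backed up to", backup_dir)
```

The norm to enforce is that the routine is run at fixed intervals and that recovery from the latest snapshot is rehearsed, not merely assumed.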
Prototype information systems
Development of illustrative systems in pilot projects is another popular use of PCs. Using end-user software packages, it is possible to develop, easily and quickly, a live model of an information system, involving all steps of information processing. Such systems can be subjected to field tests through installation in the pilot project areas. Experiences of using the system and suggestions for improvement can be documented. Subsequently, the main system can be developed using the appropriate technology, taking into consideration the experiences and suggestions resulting from use of the prototype. Prototyping in this form also facilitates user education and improves user participation in the computerization process.
Systems for decision support
From earlier discussions, it is evident that there is no dearth of software tools available on PCs to develop systems to assist the decision-maker. In fact, users have to prepare themselves to meet the challenge of utilizing the power offered by inexpensive and yet powerful information technology.
To design systems for decision support in planning, users should acquire model building and optimization skills. To design decision support systems for monitoring, users should acquire a feel for numbers and develop better indicators of performance using advanced statistical techniques. In both cases, development of aesthetic screen interfaces is an art to be acquired through experience. There are a number of instances where users have developed powerful decision support systems using end-user software without the assistance of systems specialists. However, beyond a certain level of complexity, users need to acquire system design and programming skills. More importantly, users should concentrate on acquiring the modern tools of problem solving in their problem domain to use the microcomputer technology to its fullest potential in a de-centralized set-up.
Systems for evaluating alternatives in project appraisals, project monitoring, profitability analysis, market research studies, production scheduling, inventory management, purchasing, portfolio management, advertising and engineering design are some examples of PC-based decision support systems.
System development strategies
Two approaches are generally used in the development of computer-based systems in a decentralized environment.
In the first approach - the traditional - a centralized department, such as a computer services department, develops systems and installs them on PCs. This approach has all the advantages of using expert skills in developing systems which are vital to the successful implementation of complex application systems. If systems so developed meet most of the user requirements, their acceptance should be high, primarily because:
· users interact directly with the systems for solutions; and
· users can enhance the system at local level by developing simple add-ons and peripheral applications with the computer and end-user software on hand.
Major limitations in this approach are:
· it may not ensure total translation of users' requirements into the computer system because of imperfect communication between systems developers and users; and
· users may not wholeheartedly endorse systems which they have not developed themselves.
In the second approach, users themselves develop their applications with the help of end-user software packages like electronic spreadsheets and data management systems. In this case, since the problem context is very well known to the developers, the effort involved in completely translating the users' requirements into a computer-based system is minimal. It can be expected that such systems get implemented smoothly since users are the owners of the system. It should also be possible for users to introduce improvements to the system from time to time. The current trend in a number of organizations is to encourage this approach.
Some problems likely in this approach are:
· excessive de-centralization might lead to data indiscipline, making system integration a difficult task. Each department and individual might develop individual coding schemes, define their own data fields and the types and sizes of data to be handled. Sharing such data would be difficult if proper standards are not evolved and enforced;
· lack of systems analysis and design skills in users might result in the development of half-baked products. Systems which are not thoroughly tested might be put into use while users are totally ignorant of any bugs in the system. Systems developed must therefore be subjected to rigorous checking by others not associated with development before releasing them for regular use;
· de-centralization may lead to disintegration if each individual solves his or her problem in isolation. One might develop an efficient system within the very narrow scope of a department or individual but, in a number of cases, such solutions turn out to be inefficient overall;
· users may tend to be possessive of 'their' systems and databases, and may not share them with others; and
· lack of exposure to decision analysis and model building techniques might result in the development of mundane applications, where the value added to processed data is negligible. Users might waste their energy in developing cosmetic features rather than in objectively analysing results and taking the necessary development action.
In spite of the above dangers, development of applications by users is an ideal solution to increase users' involvement in information processing in organizations. Perhaps a mixed approach is desirable. Based on the organization culture, each organization will have to work out a strategy of information processing and cautiously blend technology with decentralization. A core group from management services, computer services and user departments could analyse the issues and work out a strategy to take advantage of developments in information technology.
In this section we discuss the issues related to the management of computer services. These are covered under three broad headings:
· Organization of computer services.
· Acquisition of computer resources.
· Performance monitoring and expansion.
Organization of computer services
Normally computer services are managed by a professionally trained group within the organization. Such a group would include computer services managers, systems analysts, programmers, computer operators and data-entry operators. These professionals have computer hardware, software and applications backgrounds to provide information processing services to the organization. The department could have various titles, including the Computer Services Department or the Electronic Data Processing Department. The department is normally attached as a staff function to the managing director or general manager. In organizations where the management services function exists, computer professionals are included as a part of this function. A typical staffing pattern appears as Figure 2.
The main function of a computer services department is to analyse the information requirements of the organization, identify the areas where computers can be used to bring tangible or intangible benefits, and design and implement computer-based systems in the identified areas. Training users in data entry and in interpreting results generated by the system would be a major responsibility. This department should also be responsible for maintenance and upgrading of the systems (hardware, system software and application software). Since the technology is developing fast, it is necessary that this department undertake a market survey of information technology from time to time and suggest ways of adopting new technology.
In the staffing chart presented as Figure 2, maintenance of hardware, installation of system software, development of new systems facilities and training of users on system resource utilization could be vested in the systems analyst(s) [systems].
The task of developing new application packages and providing programming assistance to researchers and administrators could be assigned to systems analyst(s) [applications].
These three section heads are assisted by programmers and computer and data-entry operators in accomplishing their tasks. They report to the computer services manager.
The office assistant provides record keeping services to the computer services manager, in addition to assisting in procuring, stocking and issuing consumable items like paper, floppy disks, printer ribbons and cartridges, software manuals and essential computer spares. A separate computer library could be created, if the number of books and software manuals is large.
The precise number of staff members needed for each of these positions will have to be worked out based on the requirements of the organization. All staff must have formal professional qualifications. They should be able to work with computer systems in a methodical way, with perfect clarity. End users developing applications for their own use need not be computer specialists, but must respect the operating protocols and data integrity requirements established by the system controller.
Developments in computer technology today encourage de-centralized use of computers. The computer services department will have to see its changed role as a catalyst in promoting modern techniques in data processing and data analysis, and as auditor of data processing practices in the de-centralized set-up. This is in addition to its role of developing centralized databases and information systems.
A detailed exercise may be necessary for most organizations wishing to evolve a strategy for adoption of information technology. The exercise would include study of requirements, design of a suitable configuration, scheme of acquisition and selection of systems. The following sections describe these processes.
Requirements analysis and configuration
Design of a suitable computer configuration is fundamental to the use of computers in management. Computer services departments, if they exist, or expert consultants should be invited to design a suitable configuration, with approximate cost estimates. The alternatives discussed in the earlier sections, such as stand-alone systems, work stations, minicomputers with terminals, PCs in a LAN or WAN, etc., will have to be evaluated in the context of organizational needs. Application needs and availability of suitable software generally become vital factors in configuration design. It is always desirable to start with a good software base, since it determines the pace at which applications can be developed and users can be involved in the computerization process. A reliable hardware-software combination is essential for successful computerization.
Requirements analysis is an elaborate exercise, involving almost all the members of the organization. The professional group entrusted with the task of designing a suitable computer configuration should hold discussions with all the relevant members of the organization and try to understand the practices of information management as they exist in the institution. Methods to improve these practices with the help of appropriate information technology will be worked out by the group. Any special computing requirements will also be taken into consideration at this stage. A detailed exercise of this nature is necessary to develop a strategy for the use of information technology in the organization as a whole. This ensures proper introduction of information technology and effective use of the system.
A phased approach to acquisition and introduction of computers is normally adopted in cases where experience in the use of computers is limited within the organization. Unless it is estimated that the full configuration will be utilized within a year, it is not desirable to acquire a large configuration of hardware and software, since the rate of developments in technology may render these systems obsolete before the organization is ready to use them.
Selection of computer systems
The effort necessary for the selection of hardware and software depends upon the configuration decided upon. While the prime consideration for selecting a PC may be limited to the availability of good after-sales service, it is quite complex for mini- and large computer systems, and for systems involving use of recent processors, peripheral devices and software products. Even in the case of PCs, the software selection exercise may have to be done extensively for advanced software products.
It is perhaps the power of the machine and how well the operating system and application software can exploit it in a desired configuration which determine the selection of minis and large systems.
Data on comparative performance of hardware and software products can be obtained independently from standard computer magazines. The experience of other users in the neighbourhood, if available, will be a valuable input for evaluation. These inputs, along with vendor-supplied information, may be tabulated to prepare a short list for detailed consideration.
Data may include hardware characteristics such as speed, capacity, expandability and method of interconnection of processor, memory and input-output devices. The model and make of these units are equally important inputs.
Ease of use, features supported, ease of data conversion, efficiency of implementation, etc., are some dimensions by which data on software may be tabulated for comparison. These inputs will have to be collected for each important software package, including operating systems, language processors, end-user packages and special-purpose application software systems.
Vendors offering comparatively superior hardware and software products for the specified computer configuration can be shortlisted for detailed performance evaluation.
To evaluate the performance of shortlisted systems, a detailed study using benchmark test programs will have to be carried out.
These studies involve development of a large number of prototype programs (around 20 to 30) running in the proposed hardware-software environment. These benchmark programs will have to be run in several experiments on the proposed equipment.
Observations such as smooth performance of the system (easy navigation, no breakdowns, no hang-ups, etc.) and the knowledge of the system engineers concerning the hardware and software should be given importance equal to the elapsed and execution times required to run the test programs.
Benchmark data may be used to evaluate the vendors using either of two approaches. Either:
· assign interval scores to each vendor on each criterion and multiply each score by the weight of the criterion. Add these weighted scores to produce a final weighted total. Use these totals to rank-order the systems. This scheme works well when the weights are judiciously chosen and the resultant totals are not too close; or
· prepare a brief scenario of how the organization would function with each proposed system. These scenarios would illustrate efforts required in using the proposed system along with the associated costs and benefits. The decision making body can then rank these scenarios and choose the most desirable one.
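The weighted-score scheme in the first approach can be made concrete with a small calculation. The criteria, weights and vendor scores below are invented for illustration:

```python
# Weighted-total ranking of shortlisted vendors, per the first approach.
# Criteria weights sum to 1; scores are interval scores on a 1-10 scale.
weights = {"speed": 0.4, "software": 0.35, "service": 0.25}

scores = {
    "Vendor A": {"speed": 8, "software": 6, "service": 9},
    "Vendor B": {"speed": 7, "software": 9, "service": 6},
}

totals = {v: sum(weights[c] * s[c] for c in weights)
          for v, s in scores.items()}

# Vendor A ranks first here (7.55 against 7.45) - totals this close
# suggest the weights should be re-examined, as noted above.
for vendor, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{vendor}: {total:.2f}")
```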
In addition to benchmark results, the market image and service reputation of the vendor are other important points to be considered while selecting computer systems. Detailed techno-economic evaluation may be necessary after this stage.
Since computer selection is a specialized task, it is advisable to engage professional consultants to select complex hardware/software systems involving large budgets.
Performance monitoring and expansion
Monitoring the performance of computer services is as important as monitoring other functions. It is perhaps more complex because of several issues, including:
· the manager of the computer services function has a wide variety of subordinates, ranging from highly technical computer personnel to clerical personnel;
· the department is responsible for a broad range of activities, from creative system design to routine clerical tasks;
· the department has an impact on many areas of the organization; and
· the manager is responsible for major investment in hardware and software.
Given these points, top management should see the manager of the computer services department as a change agent and give support to the manager's activities within the organization.
Management will have to evolve policy guidelines for data retention, privacy and security. These areas have significant bearing on cost, legal and societal implications.
Auditing computer procedures is essential to ensure adequate control for computer-based systems. Some illustrations of audit controls are: maintenance of control logs for input and output; records of job run, beginning, ending, errors, re-starts and re-runs; file back-up procedures; program back-up procedures; back-up arrangement for processing with another organization; insurance for re-creating bad files; disk and tape library controls; system documentation; user documentation; and operator documentation.
Performance of the computer services department can be improved by reviewing the performance of existing systems, involving users in the design of new systems, and training the staff of user departments.
Based on these inputs, an expansion plan may be worked out. The plan could include the expansion of the computer services department in terms of augmentation of manpower or equipment. A detailed exercise may be necessary in the case of equipment expansion to take advantage of developments in information technology.
Bodily, E.S. 1985. Modern Decision Making. New York, NY: McGraw-Hill.
Chien. 1989. Introduction to Micro-computers and Applications. Homewood, IL: Irwin.
Condon, J.R. 1987. Data Processing Systems Analysis and Design. New Delhi: Prentice-Hall of India.
Grauer, T.R., & Sugrue, K.P. 1987. Micro Computer Applications. New York, NY: McGraw-Hill.
Haueisen, D.W., & Camp, L.J. 1988. Business Systems for Micro Computers. New Delhi: Prentice-Hall of India.
Lucas, H.C., Jr. 1984. Managing Information Services. London: Macmillan.
Lucas, H.C., Jr. 1984. Information Systems Concepts for Management. New York, NY: McGraw-Hill.
Norton, P. 1989. Inside the IBM PC. New Delhi: Prentice-Hall of India.
Pratt, W.T. 1983. Programming Languages. New Delhi: Prentice-Hall of India.
Sanders, H.D. 1988. Computers Today. 3rd ed. New York, NY: McGraw-Hill.
Senn, J.A. 1989. Analysis and Design of Information Systems. New York, NY: McGraw-Hill.
Byte, published by McGraw-Hill Inc., New York.
Computers Today, published by Living Media Ltd., New Delhi.
Dataquest, published by Cyber Media (India) Pvt Ltd., New Delhi.
PC Magazine, published by Ziff-Davis Publishing Company, New York.