
IGNOU4U.BLOGSPOT.COM

IGNOU BCA
Term-End Examination
December 2009

IGNOU BCA CS-05 Solved Question Papers Dec, 2009


CS-05 ELEMENTS OF SYSTEM ANALYSIS & DESIGN
TIME: 3 Hours Maximum Marks: 100

Note: Question 1 is compulsory. Attempt any three from the rest.

1. (a) Construct an entity-relationship diagram for an Airlines reservation system. Also explain the concept of cardinality through it. Make assumptions, if necessary.

Ans. Draw an entity-relationship diagram for an Airlines reservation system.

(b) Explain different types of file organization. Give at least one advantage and one disadvantage of each.

Ans. In computing, a file system (often also written as filesystem) is a method for storing and organizing computer files and the data they contain, to make it easy to find and access them. File systems may use a data storage device such as a hard disk or CD-ROM and involve maintaining the physical location of the files; they might provide access to data on a file server by acting as clients for a network protocol (e.g. NFS, SMB, or 9P clients); or they may be virtual and exist only as an access method for virtual data (e.g. procfs). More formally, a file system is a set of abstract data types that are implemented for the storage, hierarchical organization, manipulation, navigation, access, and retrieval of data.

File systems share much in common with database technology, but it is debatable whether a file system can be classified as a special-purpose database (DBMS).

Need for file systems: The most familiar file systems make use of an underlying data storage device that offers access to an array of fixed-size blocks, sometimes called sectors, generally 512 bytes each. The file system software is responsible for organizing these sectors into files and directories, and keeping track of which sectors belong to which file and which are not being used.
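The sector bookkeeping described above can be sketched in a few lines. This is a toy model, not any real on-disk format; the sizes and names are illustrative.

```python
# Toy sketch of how a file system maps files to fixed-size sectors.
SECTOR_SIZE = 512          # bytes per sector, as in the text
NUM_SECTORS = 16           # a tiny "disk"

free_sectors = set(range(NUM_SECTORS))
allocation = {}            # filename -> ordered list of sector numbers

def write_file(name, data):
    """Split data into sectors and record which sectors the file owns."""
    needed = -(-len(data) // SECTOR_SIZE)   # ceiling division
    if needed > len(free_sectors):
        raise IOError("disk full")
    sectors = sorted(free_sectors)[:needed]
    for s in sectors:
        free_sectors.remove(s)
    allocation[name] = sectors

write_file("a.txt", b"x" * 1000)   # 1000 bytes needs 2 sectors
print(allocation["a.txt"])          # [0, 1]
print(len(free_sectors))            # 14 sectors remain free
```

The `allocation` table plays the role of the file allocation table mentioned later: it is the only record of which sectors belong to which file.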


However, file systems need not make use of a storage device at all. A file system can be used to organize and represent access to any data, whether it is stored or dynamically generated (e.g. from a network connection).

Whether or not the file system has an underlying storage device, file systems typically have directories which associate file names with files, usually by connecting the file name to an index into a file allocation table of some sort, such as the FAT in an MS-DOS file system, or an inode in a Unix-like file system. Directory structures may be flat, or allow hierarchies where directories may contain subdirectories. In some file systems, file names are structured, with special syntax for filename extensions and version numbers. In others, file names are simple strings, and per-file metadata is stored elsewhere.

Types of file organization: There are various types of files in which the records are collected and maintained. They are categorized as: master file, transaction file, table file, report file, back-up file, archival file, dump file, library file.

Master file: Master files are the most important type of file, and most design activities concentrate here. In a business application they are considered very significant because they contain the essential records for maintenance of the organization's business. A master file can be further categorized. It may be called a reference master file, in which the records are static or unlikely to change frequently; for example, a product file containing descriptions and codes, or a customer file containing name, address and account number. Alternatively, it may be described as a dynamic master file, in which we keep records that are frequently changed (updated) as a result of transactions or other events.

These two types of master file may be kept as separate files or may be combined; for example, a sales ledger file containing reference data such as name, address and account number, together with the current transactions and balance outstanding for each customer.

Transaction file:


A transaction file is a temporary file used for two purposes. First of all, it is used to accumulate data about events as they occur. Secondly, it helps in updating master files to reflect the results of current transactions. The term transaction refers to any business event that affects the organization and about which data is captured. Examples of common transactions in an organization are making purchases, hiring of workers and recording of sales.

Table file: A special type of master file is included in many systems to meet specific requirements where data must be referenced repeatedly. Table files are permanent files containing reference data used in processing transactions, updating master files or producing output. As the name implies, these files store reference data in tabular form. Table files conserve memory space and make program maintenance easier by storing in a file data that would otherwise be included in programs or master file records.

Sequential organization: A sequential file contains records organized in the order they were entered. The order of the records is fixed. The records are stored and sorted in physically contiguous blocks; within each block the records are in sequence. Records in these files can only be read or written sequentially. Once stored in the file, a record cannot be made shorter or longer, or deleted. However, the record can be updated if the length does not change. (This is done by replacing the records, by creating a new file.) New records will always appear at the end of the file. If the order of the records in a file is not important, sequential organization will suffice, no matter how many records you may have. Sequential output is also useful for report printing or sequential reads, which some programs prefer to do.

Line-sequential organization: Line-sequential files are like sequential files, except that the records can contain only characters as data. Line-sequential files are maintained by the native byte-stream files of the operating system. In the COBOL environment, line-sequential files that are created with WRITE statements with the ADVANCING phrase can be directed to a printer as well as to a disk.

Indexed-sequential organization: Key searches are improved by this system too. The single-level indexing structure is the simplest one, where a file whose records are pairs contains a key pointer. This pointer is the position in the data file of the record with the given key. A subset of the records, which are evenly spaced along the data file, is indexed, in order to mark intervals of data records.
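A minimal sketch of a single-level index lookup over a sorted data file, with index entries taken at even intervals as described above. The data values and spacing are illustrative.

```python
# Sorted "data file" of (key, record) pairs, keys 0, 5, 10, ..., 95.
data = [(k, f"record-{k}") for k in range(0, 100, 5)]
# Single-level index: every 4th record's key and its position.
index = [(data[i][0], i) for i in range(0, len(data), 4)]

def lookup(search_key):
    # 1. Find the highest index key not greater than the search key.
    start = 0
    for key, pos in index:
        if key <= search_key:
            start = pos
        else:
            break
    # 2. Linear scan from that position until found or passed.
    for key, record in data[start:]:
        if key == search_key:
            return record
        if key > search_key:
            break
    return None

print(lookup(35))   # record-35
print(lookup(36))   # None (no such key)
```

The index narrows the linear scan to one interval of the data file, which is where the access-time saving over a pure sequential search comes from.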


This is how a key search is performed: the search key is compared with the index keys to find the highest index key coming before the search key. Then a linear search is performed from the record that the index key points to, until the search key is matched or until the record pointed to by the next index entry is reached. Despite the double file access (index + data) required by this sort of search, the access time reduction is significant compared with sequential file searches.

A hierarchical extension of this scheme is possible, since an index is a sequential file in itself, capable of being indexed in turn by a second-level index, and so forth. Exploiting the hierarchical decomposition of the searches further, to decrease the access time, pays increasing dividends in the reduction of processing time. There is, however, a point where this advantage starts to be reduced by the increased cost of storage, which in turn increases the index access time.

Inverted list: In file organization, this is a file that is indexed on many of the attributes of the data itself. The inverted list method has a single index for each key type. The records are not necessarily stored in a sequence. They are placed in the data storage area, but indexes are updated for the record keys and locations.

Direct or hashed access: With direct or hashed access, a portion of disk space is reserved and a hashing algorithm computes the record address, so there is additional space required for this kind of file in the store. Records are placed randomly throughout the file and are accessed by addresses that specify their disk location. This type of file organization requires disk storage rather than tape. It has excellent search and retrieval performance, but care must be taken to maintain the indexes.
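Direct (hashed) placement can be sketched as follows: the hash of the key picks the slot, so a record is fetched without scanning the file. The collision handling here (linear probing) and the slot count are illustrative choices, not part of any particular file system.

```python
NUM_SLOTS = 8
slots = [None] * NUM_SLOTS           # the reserved "disk" area

def put(key, record):
    i = hash(key) % NUM_SLOTS        # hashing algorithm computes the address
    while slots[i] is not None and slots[i][0] != key:
        i = (i + 1) % NUM_SLOTS      # collision: probe the next slot
    slots[i] = (key, record)

def get(key):
    i = hash(key) % NUM_SLOTS
    for _ in range(NUM_SLOTS):
        if slots[i] is None:
            return None
        if slots[i][0] == key:
            return slots[i][1]
        i = (i + 1) % NUM_SLOTS
    return None

put(1042, "customer A")
put(1043, "customer B")
print(get(1042))                     # customer A
```

Note how retrieval cost does not depend on file size, which is why this organization has excellent search performance.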
If the indexes become corrupt, what is left may as well go to the bit-bucket, so it is wise to have regular backups of this kind of file, just as it is for all stored valuable data.

(c) What is preliminary investigation? Explain various data gathering techniques and compare them.

Ans. Preliminary investigation refers to gathering basic information about the nature of the customer, the working of the organization, the existing system and the expectations from the proposed system. In primary data collection, you collect the data yourself using methods such as interviews and questionnaires. The key point here is that the data you collect is unique to you and your research and, until you publish, no one else has access to it. There are many methods of collecting primary data, and the main methods include: questionnaires, interviews,


focus group interviews, observation, case studies, diaries, critical incidents and portfolios.

Questionnaires: Questionnaires are a popular means of collecting data, but are difficult to design and often require many rewrites before an acceptable questionnaire is produced.
Advantages: can be used as a method in its own right, or as a basis for interviewing or a telephone survey; can be posted, e-mailed or faxed; can cover a large number of people or organizations; wide geographic coverage.
Disadvantages: design problems; questions have to be relatively simple; historically low response rate (although inducements may help); time delay whilst waiting for responses to be returned.

Interviews: Interviewing is a technique that is primarily used to gain an understanding of the underlying reasons and motivations for people's attitudes, preferences or behaviour. Interviews can be undertaken on a personal one-to-one basis or in a group. They can be conducted at work, at home, in the street or in a shopping centre, or some other agreed location.
Personal interview. Advantages: relatively cheap; quick; can cover reasonably large numbers of people or organizations; wide geographic coverage.
Disadvantages: often connected with selling; questionnaire required.


Telephone survey. Disadvantages: not everyone has a telephone; repeat calls are inevitable (on average 2.5 calls to get someone).

Diaries: A diary is a way of gathering information about the way individuals spend their time on professional activities. They are not records of engagements or personal journals of thought! Diaries can record either quantitative or qualitative data, and in management research can provide information about work patterns and activities.
Advantages: useful for collecting information from employees; different writers can be compared and contrasted simultaneously; allows the researcher freedom to move from one organization to another; the researcher is not personally involved.
Disadvantages: subjects need to be clear about what they are being asked to do, why, and what you plan to do with the data; diarists need to be of a certain educational level; some structure is necessary to give the diarist focus, for example a list of headings; encouragement and reassurance are needed, as completing a diary is time-consuming and can be irritating after a while.

(d) What is the need of quality assurance in an SDLC? Name and explain a few quality assurance standards available for SDLC.

Ans. The various factors which are responsible for the quality of a system are as follows.
Correctness: the extent to which a program meets system specifications and user objectives.
Reliability: the degree to which the system performs its intended functions over time.
Efficiency: the amount of computer resources required by a program to perform a function.
Usability: the effort required to learn and operate a system.
Maintainability: the ease with which program errors are located and corrected.
Testability: the effort required to test a program to ensure its correct performance.
Portability: the ease of transporting a program from one hardware configuration to another.
Accuracy: the required precision in input editing, computations and output.


Error tolerance: error detection and correction versus error avoidance.
Expandability: ease of adding to or expanding the existing database.
Access control and audit: control of access to the system and the extent to which that access can be audited.
Communicativeness: how descriptive or useful the inputs and outputs of the system are.

Different levels of quality assurance: Analysts use three levels of quality assurance: testing, verification with validation, and certification.

1. Testing: The quality assurance goal of the testing phase is to ensure completeness and accuracy of the system and to eliminate errors, so as to minimize the retesting process. Since designers cannot prove 100 percent accuracy, all that can be done is to put the system through a fail-test cycle to determine what will make it fail. A successful test, then, is one that finds errors.

2. Verification with validation: Like testing, verification is also intended to find errors; it is performed by executing a program in a simulated environment. Validation refers to the process of checking the quality of software in a live environment to find errors. When commercial systems are developed with the main aim of distribution for sale, they first go through verification (alpha testing). The feedback from the validation phase generally brings some changes in the software to deal with errors and failures that are uncovered. Then a set of user sites is selected for putting the system into use on a live basis. Validation may continue for several months; during the course of validating the system, failures may occur and the software will be changed. Continued use may bring more failures and the need for still more changes.

3. Certification: The third level of quality assurance is to certify that the software package developed conforms to standards. With a growing demand for purchasing ready-to-use software, the importance of certification has increased. A package that is certified goes through a team of computer specialists who test, review and determine how well it meets the user's requirements and the vendor's claims. Certification is issued only if the package is successful in all the tests. Certification, however, does not mean that it is the best package to adopt; it only attests that it will perform what the vendor claims.

===============================================================

2. (a) Explain in brief, a few techniques of system analysis.

Ans. System analysis is a management technique which helps us in designing a new system or improving an existing system.


Techniques of systems analysis:

1. Requirement analysis: Requirement determination is generally done through extensive study of the system, including an understanding of the goals, processes and constraints of the system for which information systems are designed. Several forms are also designed and illustrated in the texts of system analysis. In the view of the author, such techniques must be left to the ingenuity of the analyst: there is no straightforward algorithm to elicit the requirements from the user. It is an iterative process which the analyst uses while interviewing several users or user groups. It will continue to remain an art rather than a science. For requirement specification, both at the preliminary as well as the detailed stage, several diagramming techniques have evolved. In fact, such diagrams have become the language of the analyst, just as blueprints have become the language of the designer, or the balance sheet the language of the accountant.

2. Data dictionary: Another powerful tool that is extensively used in system analysis is the data dictionary. A DDS, as it is called, provides a detailed reference to every data item: the different names by which the item is represented in different program modules, the different data structures used to represent the item in different modules, and the modules where the data item is generated, stored and destroyed. In essence it provides a quick snapshot of every data item used by the information system. Needless to say, it is extremely detailed and very useful for consistency checks, system modification and completeness checking.

While these techniques are general in nature and used by the analyst in the different stages of the system life cycle, the following are specific to some stages of the system life cycle.

3. Detailed design: Roughly, the detailed specifications are worked out, followed by a hardware/software plan. This constitutes system design, which once again needs to be vetted by the user. Once this is done, detailed system design starts. Effectively, one can say that the analysis phase ends here and the design


phase begins. Being a detailed design, it may involve substantial effort on the part of technical system analysts and hardware, software and communication specialists. A major component of detailed system design is the database design, covered in the next section. Actual coding is undertaken after the database design is complete.

4. Database design: While DBMSs permit efficient storage and manipulation of data files, they do not cater to the structuring of the databases themselves. After extensive use of databases in real-world applications, several early analysts felt the need for the right abstraction of data into the database, so that any update or query operation captures the spirit of the meaning of the data stored in the databases. One of the basic observations made by the early analysts was the possible loss of information by careless update operations on databases. This led to the concepts of normalization, pioneered by Codd. Intuitively, normalization leads to decomposition in such a way that no information is lost due to processing of data. Normalization ensures no loss of information and avoids insertion, update and other anomalies. A table (relation) is said to be in First Normal Form (1NF) if there is an identifying key and there are no repeating groups of attributes. Intuitively, First Normal Form ensures that all the table entries are atomic.

(b) Give levels of MIS. Explain the role of key players in an organization in all of these levels.

Ans. Transaction Processing System (TPS): A transaction is a business event, such as a sale to a customer. Other transactions include payment to an employee for work performed, purchase of raw materials and payment of an account payable. Transaction processing systems have three basic objectives. First, they must collect and store data concerning business events. Second, they provide the information necessary for the day-to-day control of business events. Finally, they serve as the database for higher-level information systems that are used by managers and executives at the middle and upper levels of an organization.

1. Decision Support Systems:


A decision support system (DSS) is an integrated set of computer tools that allows a decision maker to interface directly with computers to create information useful in making semi-structured and unstructured decisions. These decisions may involve, for example, mergers and acquisitions, plant expansions, new products, stock portfolio management, or marketing. Management information systems in the past have been most successful in providing information for routine, structured and anticipated types of decisions. In addition, they have succeeded in acquiring and storing large quantities of detailed data concerning transaction processing. They have been less successful in providing information for semi-structured or unstructured decisions, particularly those that were not anticipated when the computer information system was designed. The basic idea underlying decision support systems is to provide a set of computer-based tools so that management information systems can produce information to support semi-structured and unanticipated decisions.

2. Office Automation Systems: Office automation systems (OAS) use the computer to automate many of the routine tasks that are performed in a typical office. Perhaps the most widespread type of office automation system is word processing, but there are many other applications in the office, including desktop publishing, electronic mail, facsimile transmission and image processing.

(c) Why is cost benefit analysis important in system design? Explain with an example.

Ans. Developing an IT application is an investment, since after developing that application it provides the organization with profits. Profits can be monetary or in the form of an improved working environment. However, it carries risks, because in some cases an estimate can be wrong, and the project might not actually turn out to be beneficial. Cost benefit analysis helps to give management a picture of the costs, benefits and risks. It usually involves comparing alternative investments. Cost benefit analysis determines the benefits and savings that are expected from the system and compares them with the expected costs.

The cost of an information system involves the development cost and the maintenance cost. The development costs are a one-time investment, whereas maintenance costs are recurring. The development cost is basically the cost incurred during the various stages of system development. Each phase of the life cycle has a cost. Some examples are: personnel,


equipment, supplies, overheads and consultants.
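The cost items above can be tallied in a short sketch. All figures here are hypothetical, chosen only to show how one-time development costs and recurring operating costs combine against expected income.

```python
# Hypothetical cost-benefit figures for a 3-year horizon.
development = {"personnel": 50000, "equipment": 20000,
               "supplies": 3000, "overheads": 5000, "consultants": 8000}
operating_per_year = {"wages": 15000, "supplies": 2000, "overheads": 4000}
income_per_year = 60000
years = 3

# One-time development cost plus recurring operating cost.
total_cost = sum(development.values()) + years * sum(operating_per_year.values())
benefit = years * income_per_year - total_cost   # Benefit = Income - Costs
print(total_cost, benefit)                        # 149000 31000
```

A positive benefit over the chosen horizon supports the investment; repeating the calculation for each alternative is what "comparing alternative investments" means in practice.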

Cost and benefit categories: In performing cost benefit analysis (CBA) it is important to identify the cost and benefit factors. Costs and benefits can be categorized as follows. There are several cost factors/elements: hardware, personnel, facility, operation and supply costs. In a broad sense the costs can be divided into two types:
1. Development costs: costs that are incurred during the development of the system; these are a one-time investment, e.g. wages, equipment.
2. Operating costs, e.g. wages, supplies, overheads.

Another classification of the costs can be:
Hardware/software costs: the cost of purchasing or leasing computers and their peripherals, plus the cost of any required software.
Personnel costs: the money spent on the people involved in the development of the system. These expenditures include salaries and other benefits such as health insurance, conveyance allowance, etc.
Facility costs: expenses incurred during the preparation of the physical site where the system will be operational. These can include wiring, flooring, acoustics, lighting and air conditioning.
Operating costs:


Operating costs are the expenses required for the day-to-day running of the system. This includes the maintenance of the system, which can take the form of maintaining the hardware or application programs, or money paid to professionals responsible for running and maintaining the system.
Supply costs: these are variable costs that vary proportionately with the amount of use of paper, ribbons, disks, and the like. These should be estimated and included in the overall cost of the system.

Benefits: We can define benefit as:
Benefit = Income - Costs
Benefits can be accrued by increasing income, or decreasing costs, or both.

============================================================

3. (a) Distinguish between user acceptance testing and system testing.

Ans. User acceptance testing is an opportunity for the customer to evaluate and accept a system and/or application, based on meeting or exceeding the defined requirements as communicated to the vendor. UAT can vary greatly based on user experience with the application and each person's experience with testing itself. Time is nearly always a factor in testing, and time is a factor in UAT as well. The users often have little time to test. More important is that if the users do find defects with the software, they are sometimes pressured to accept the software as is. In some cases, users have been waiting months for the software and for business needs are anxious to receive the application. These dynamics can add to the pressure for users to accept the software as is, even if there are defects that prevent or inhibit the very functionality they have waited for.


Users often don't know how to test and haven't been trained to think like a tester. They receive the software and are given time to test, but often without any ideas about what to do. Under pressure they will often execute happy-path testing and won't try harder test cases or interesting test conditions, not because they don't have these ideas but because they're not testers. They are not prepared, and time and pressure can be great. The emphasis is more on design, because they will attempt to use the system as if it is in production. So every requirement may not be evaluated in UAT due to limited time and resources. However, requirements testing is basically done in the unit and SIT testing phases. If the new system/application meets the customer's needs, then the system is accepted by the customer and is implemented.

System testing is testing conducted by testers of the application. Testers who have been testing functionality as it has been delivered are usually prepared to see the application function as a whole integrated solution. System testing is often more technical and more prepared, and it is testing designed and executed by testers who have become familiar with the types of defects the application has been prone to throughout the SDLC. System testing is very different from UAT because of the test ideas, experience and point of view of the testers. Also, the gap in technical expertise between users and testers can be significant, so the two teams are likely to find vastly different defects. Occurring late in the SDLC process is often the only common element between the two forms of testing.

(b) What is software reliability? How can it be achieved during SDLC?

Ans. Software reliability: Describe methods of ensuring that software is reliable: alpha testing, beta testing, and agreements between software houses and purchasers for testing. Understand the reasons why fully-tested software may fail to operate successfully when implemented as part of an information technology system. Understand the need for maintenance releases.
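One concrete way to exercise reliability during the SDLC is automated unit testing with normal, boundary and invalid data. A minimal sketch follows; the function under test and its pass mark are hypothetical.

```python
def percentage_to_grade(p):
    """Hypothetical component under test: grade a percentage mark."""
    if not 0 <= p <= 100:
        raise ValueError("out of range")
    return "pass" if p >= 40 else "fail"

# Normal data
assert percentage_to_grade(75) == "pass"
assert percentage_to_grade(20) == "fail"
# Boundary data
assert percentage_to_grade(0) == "fail"
assert percentage_to_grade(40) == "pass"
assert percentage_to_grade(100) == "pass"
# Out-of-range / invalid data must be rejected, not silently graded
try:
    percentage_to_grade(101)
except ValueError:
    print("invalid input rejected")
```

The same pattern scales up the stages listed below: module, subsystem and system tests are the same kind of assertions, written against progressively larger assemblies.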


All software is tested rigorously before it is released. You have done testing as an important part of your projects (you did, didn't you?). You may remember using normal data, boundary data, out-of-range data, or invalid data. There are five key stages:
1. Unit testing: each component (subroutine, code) is tested.
2. Module testing: the module is made up of several components that are put together. These are tested.
3. Subsystem testing: collections of modules are tested. The purchase order system is a subsystem of the accounts system.
4. System testing: testing all the subsystems put together to make up the whole system. The interaction of the subsystems may produce unpredicted results. The testing is also carried out to ensure that the system matches all the requirements in the specification. Do you remember doing that in your project?
5. Acceptance testing: testing as installed at the user's site. Rather than dummy data, real data from the user are used.

(c) Compare and contrast a database system with a file system.

Ans.
1. Files act locally, whereas a DBMS saves directly in a database.
2. Files save in temporary locations, whereas a DBMS uses a well-arranged and permanent database location.
3. In a file system, transactions are not possible, whereas various transactions like insert, delete, view, update, etc. are possible in a DBMS.
4. Data will be accessed through single or various files, whereas in a DBMS tables (schemas) are used to access data.
5. A file manager is used to store all relationships in directories in file systems, whereas a database manager (administrator) stores the relationships in the form of structural tables.
6. Data in databases are more secure compared to data in files.

=============================================================

4. (a) Construct a PERT chart for designing a library management system. Make suitable assumptions wherever required.


Ans. Draw a PERT chart for the library management system.

(b) Explain any four characteristics of output design.

Ans. Characteristics of output design are as follows: output should be concise and precise; output should not be cluttered; output should be self-explanatory; output should be as per user requirements.

(c) What is work breakdown structure? Give an example.

Ans. A work breakdown structure (WBS), in project management and systems engineering, is a tool used to define and group a project's discrete work elements (or tasks) in a way that helps organize and define the total work scope of the project. A work breakdown structure element may be a product, data, a service, or any combination. A WBS also provides the necessary framework for detailed cost estimating and control, along with providing guidance for schedule development and control. Additionally, the WBS is a dynamic tool and can be revised and updated as needed by the project manager.
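Since a WBS is a tree of work elements, it can be sketched as nested dictionaries; numbering the elements then falls out of a simple traversal. The element names here are illustrative, loosely following the aircraft-system example.

```python
# A WBS as a tree: each element maps to its child elements.
wbs = {
    "Aircraft System": {
        "Air Vehicle": {"Airframe": {}, "Propulsion": {}},
        "Training": {"Equipment": {}, "Services": {}},
        "Peculiar Support Equipment": {},
    }
}

def list_elements(tree, prefix=""):
    """Flatten the tree into numbered WBS codes like 1, 1.1, 1.1.2, ..."""
    for i, (name, children) in enumerate(tree.items(), 1):
        code = f"{prefix}{i}"
        print(code, name)
        list_elements(children, code + ".")

list_elements(wbs)
# 1 Aircraft System
# 1.1 Air Vehicle
# 1.1.1 Airframe
# ...
```

Each numbered code identifies one element of the total work scope, which is exactly the framework cost estimating and scheduling hang off.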

Aircraft System: Air Vehicle | Fire Control | Communications | Training Equipment | Peculiar Support Equipment | Services | Depot

Example of a product work breakdown structure of an aircraft system.

===================================================================

5. Write short notes on the following:

(a) 4GL

Ans. One definition of a fourth generation language is that it is non-procedural: the programmer specifies what has to be done, but not how the task is to be performed. In traditional second and third generation programming languages, all the instructions necessary to read each record, test for end of file, place each item of data on screen and then go back and repeat the operation would have to be coded. Thus the command


LIST in the dBASE software package does what a long program would do in third generation languages like BASIC or PASCAL. Some 4GLs are aimed at the end user, and ease of use is then a prime consideration. Others, which could be described as very high level languages, are designed for use by professional computer experts, and their main objective is to cut down on development and maintenance time. Some, such as ORACLE, offer a number of tools (SQL*CALC, SQL*FORMS, SQL*REPORT) suitable for an end user.

(b) Software piracy

Ans. The PC industry is around 20 years old. In this time, both the quality and quantity of available software programs have increased drastically. Although approximately 70 percent of the worldwide market is today supplied by developers in the United States, important development work is carried out in scores of nations around the world. But in both the United States and elsewhere, unauthorized copying of personal computer software is a serious problem. On average, for every authorized copy of PC software in use, at least one unauthorized copy is made. Unauthorized copying is known as software piracy, and in 1993 it cost the software industry in excess of $12.8 billion. Software theft is widely practiced and widely tolerated. In some countries, legal protection for software is nonexistent; in others, laws are unclear or not enforced with sufficient public commitment to cause those making unauthorized copies to take legal prohibitions on copying seriously.

(c) GUI design

Ans. User interface design or interface engineering is the design of computers, appliances, machines, mobile communication devices, software applications, and websites with the focus on the user's experience and interaction. The goal of user interface design is to make the user's interaction as simple and efficient as possible in terms of accomplishing user goals, an approach often called user-centered design. Good user interface design supports usability. The design process must balance technical functionality and visual elements (e.g., mental models) to create a system that is not only operational but also usable and adaptable to changing user needs.

Interface design is involved in a wide range of projects, from computer systems, to cars, to commercial planes; all of these projects involve much of the same basic human interactions, yet also require some unique skills and knowledge. As a result, designers tend to specialize in certain types of projects and have skills centered around their expertise, whether that be software design, user research, web design, or industrial design. There are several phases and processes in user interface design, some of which are more demanded upon than others, depending on the project. (Note: for the remainder of this section, the word system is used to denote any project, whether it is a website, application, or device.)


Functionality requirement gathering: assembling a list of the functionality required of the system to accomplish the goals of the project and the potential needs of the users.
User analysis: analysis of the potential users of the system, either through discussion with people who work with the users and/or the potential users themselves. Typical questions involve:
o What would the user want the system to do?
o How would the system fit in with the user's normal workflow of daily activities?
o How technically savvy is the user, and what similar systems does the user already use?
o What interface look & feel styles appeal to the user?

(d) Categories of documentation

Ans. Characteristics of good documentation are:
a) Availability: it should be accessible to those for whom it is intended.
b) Objectivity: it must be clearly defined in language that is easily understood.
c) Cross-referable: it should be possible to refer to other documents.
d) Easy to maintain: when the system gets modified, it should be easy to update the documentation.
e) Completeness: it should contain everything needed, so that those who have gone through it carefully can understand the system.

(e) CASE tools in SDLC (any two)

Ans. CASE tools improve routine work through the use of automated support. CASE tools can be classified as upper CASE tools and lower CASE tools.

1. Upper CASE tools: Upper CASE tools primarily help analysts and designers. They allow the analyst to create and modify the system design. All the information about the project is stored in an encyclopedia called the CASE repository, a large collection of records, elements, diagrams, screens, reports and other information. Analysis reports may be produced using repository information to show where the design is incomplete or contains errors.

2. Lower CASE tools:


Lower CASE tools are used more often by programmers and workers who must implement the systems designed using upper CASE tools. They are used to generate computer source code, eliminating the need for programming the system.

===============================================================
===========================THE END=============================
