Sunday, December 7, 2008

System Analysis and Design

Analysis
Analysis is the third phase of the SDLC, in which the current system is studied and alternative replacement systems are proposed. It produces a description of the current system and of where its problems or opportunities lie, a general recommendation on how to fix, enhance, or replace the current system, and an explanation of the alternative systems with a justification for the chosen alternative.

Analysis Phase – One phase of the SDLC whose objective is to understand the user’s needs and develop requirements
Gather information
Define system requirements
Build prototypes for discovery of requirements
Prioritize requirements
Generate and evaluate alternatives
Review recommendations with management

Requirement Analysis
Based on the survey that was conducted, several analyses were performed on existing solutions currently in use. The following requirements will be incorporated into the "Modern Acrobatic Centre" system so that it is successfully adopted by users.
a. Easy to use.
The system will not incorporate complicated functions that make it difficult to use. The user interface will be properly designed so that the user can understand it clearly.

b. Easy to maintain.
The system will be developed with existing, widely used technology so that it is easy to maintain and does not depend on proprietary technology.

c. Easy to support.
Developing the system in the Visual Basic environment will make it easy to support. By ensuring that the user interface is clear and that good help modules are designed for the system, we will ensure that minimal support is required.

d. Support multiple levels of security control for access to the different parts of the system (a minimal sketch of this idea follows this list).

e. Easy to update.
The system must be able to be updated frequently and must allow for future enhancement.
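
As an illustration of requirement (d), the sketch below shows one possible way to enforce multiple levels of security control in Python. The role names, numeric levels, and protected actions are hypothetical examples for illustration, not part of the actual system design.

# A minimal sketch of multi-level access control (illustrative only).
# The roles, levels, and protected actions below are hypothetical.

ROLE_LEVELS = {"guest": 0, "staff": 1, "manager": 2, "administrator": 3}

# Each part of the system declares the minimum level needed to reach it.
REQUIRED_LEVEL = {
    "view_timetable": 0,
    "edit_bookings": 1,
    "view_reports": 2,
    "manage_users": 3,
}

def can_access(role, action):
    """Return True if the given role may perform the given action."""
    return ROLE_LEVELS.get(role, -1) >= REQUIRED_LEVEL.get(action, 99)

# Example: a member of staff may edit bookings but not manage users.
assert can_access("staff", "edit_bookings")
assert not can_access("staff", "manage_users")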

Design
The next phase focuses on logical and physical design; it is devoted to designing the new, enhanced system. During this phase, I converted the description of the recommended alternative solution into logical and physical system specifications. These cover all aspects of the system, from input and output screens to log files and computer processes. The physical specification describes the system being designed either as a model or as detailed documentation that can guide those who implement the system.

Logical Design
Logical design covers all the functional features of the system chosen for development. During logical design, the "look and feel" of all system inputs and outputs, as well as the interfaces and dialogues, are defined.

Physical Design
Once the overall high-level design of the system has been worked out, the logical specifications are turned into a physical design. The physical design consists of the various parts of the system that perform the physical operations necessary for data handling, processing, and information output. This can be done in many ways, from creating a working model of the system to be implemented, to writing detailed specifications describing all the different parts of the system and how they should be built. In many cases, the working model becomes the basis for the actual system. During physical design, we must determine many of the physical aspects of the system, from the programming language down to each single data capture within the system.

Design phase – the phase of the SDLC in which the system and programs are designed
Design and integrate the network
Design the application architecture
Design the user interfaces
Design the system interfaces
Design and integrate the database
Prototype for design details
Design and integrate the system controls

Good Design
'Design' could refer to many things, but often refers to 'functional design' or 'internal design'. Good internal design is indicated by software code whose overall structure is clear, understandable, easily modifiable, and maintainable; is robust with sufficient error-handling and status logging capability; and works correctly when implemented. Good functional design is indicated by an application whose functionality can be traced back to customer and end-user requirements. For programs that have a user interface, it's often a good idea to assume that the end user will have little computer knowledge and may not read a user manual or even the on-line help; some common rules-of-thumb include:
the program should act in a way that least surprises the user
it should always be evident to the user what can be done next and how to exit
the program shouldn't let the users do something stupid without warning them
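
As a small illustration of the last rule of thumb, the Python sketch below warns the user and asks for explicit confirmation before carrying out a destructive action. The delete_all_records operation is a hypothetical stand-in for any dangerous action.

# Illustrative sketch: warn before letting the user do something destructive.
# delete_all_records() is a hypothetical stand-in for any dangerous operation.

def delete_all_records():
    print("All records deleted.")  # placeholder for the real operation

def confirm_and_delete():
    answer = input("This will permanently delete ALL records. Type YES to continue: ")
    if answer == "YES":
        delete_all_records()
    else:
        print("Cancelled; nothing was deleted.")  # safe default: do nothing

if __name__ == "__main__":
    confirm_and_delete()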

Project planning
Effective management of a software project depends on thoroughly planning the progress of the project. The project manager must anticipate problems which might arise and prepare tentative solutions to those problems. A plan, drawn up at the start of a project, should be used as the driver for the project. This initial plan is not static but must be modified as the project progresses and better information becomes available.
Quality plan - Describes the quality procedures and standards that will be used in a project.

Validation plan - Describes the approach, resources, and schedule used for system validation.

Configuration management plan - Describes the configuration management procedures and structures to be used.

Maintenance plan - Predicts the maintenance requirements of the system, the maintenance costs, and the effort required.

Staff development plan - Describes how the skills and experience of the project team members will be developed.

Project planning is probably the activity that takes the most management time. Planning is required for development activities from specification through to delivery of the system. Organizations must, of course, have longer-term business and strategic plans; these will be used to guide choices about which projects have the highest priority and to assess whether or not software systems are needed.

Testing
Testing involves operation of a system or application under controlled conditions and evaluating the results (eg, 'if the user is in interface A of the application while using hardware B, and does C, then D should happen'). The controlled conditions should include both normal and abnormal conditions. Testing should intentionally attempt to make things go wrong to determine if things happen when they shouldn't or things don't happen when they should. It is oriented to 'detection'.
Organizations vary considerably in how they assign responsibility for QA and testing. Sometimes they're the combined responsibility of one group or individual. Also common are project teams that include a mix of testers and developers who work closely together, with overall QA processes monitored by project managers. It will depend on what best fits an organization's size and business structure.

Kinds of testing
Black box testing - not based on any knowledge of internal design or code. Tests are based on requirements and functionality.

White box testing - based on knowledge of the internal logic of an application's code. Tests are based on coverage of code statements, branches, paths, conditions.

Unit testing - the most 'micro' scale of testing; to test particular functions or code modules. Typically done by the programmer and not by testers, as it requires detailed knowledge of the internal program design and code. Not always easily done unless the application has a well-designed architecture with tight code; may require developing test driver modules or test harnesses.
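
A minimal sketch of a unit test, written with Python's standard unittest module, is shown below. The apply_discount function under test is a hypothetical example; the two tests illustrate exercising both a normal and an abnormal condition, as described under Testing above.

import unittest

def apply_discount(price, percent):
    """Hypothetical function under test: apply a percentage discount."""
    if percent < 0 or percent > 100:
        raise ValueError("percent must be between 0 and 100")
    return price * (1 - percent / 100)

class TestApplyDiscount(unittest.TestCase):
    def test_normal_condition(self):
        self.assertAlmostEqual(apply_discount(200.0, 25), 150.0)

    def test_abnormal_condition(self):
        # Abnormal input should fail loudly, not return a wrong answer.
        with self.assertRaises(ValueError):
            apply_discount(200.0, 150)

if __name__ == "__main__":
    unittest.main()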

Incremental integration testing - continuous testing of an application as new functionality is added; requires that various aspects of an application's functionality be independent enough to work separately before all parts of the program are completed, or that test drivers be developed as needed; done by programmers or by testers.

Integration testing - testing of combined parts of an application to determine if they function together correctly. The 'parts' can be code modules, individual applications, client and server applications on a network, etc. This type of testing is especially relevant to client/server and distributed systems.

Functional testing - black-box type testing geared to functional requirements of an application; this type of testing should be done by testers. This doesn't mean that the programmers shouldn't check that their code works before releasing it (which of course applies to any stage of testing.)

System testing - black-box type testing that is based on overall requirements specifications; covers all combined parts of a system.

End-to-end testing - similar to system testing; the 'macro' end of the test scale; involves testing of a complete application environment in a situation that mimics real-world use, such as interacting with a database, using network communications, or interacting with other hardware, applications, or systems if appropriate.

Sanity testing or smoke testing - typically an initial testing effort to determine if a new software version is performing well enough to accept it for a major testing effort. For example, if the new software is crashing systems every 5 minutes, bogging down systems to a crawl, or corrupting databases, the software may not be in a 'sane' enough condition to warrant further testing in its current state.

Regression testing - re-testing after fixes or modifications of the software or its environment. It can be difficult to determine how much re-testing is needed, especially near the end of the development cycle. Automated testing tools can be especially useful for this type of testing.
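
One simple way to automate this kind of re-testing is to keep a table of known input/expected-output pairs and re-run it after every change. The sketch below shows the idea; the monthly_fee function and its expected values are hypothetical examples.

# Sketch of a tiny regression harness: saved cases are re-run after each change.
# The function and the expected values are hypothetical examples.

def monthly_fee(age):
    return 20 if age < 18 else 35

REGRESSION_CASES = [
    (10, 20),   # child rate
    (17, 20),   # boundary: still a child
    (18, 35),   # boundary: adult rate begins
    (40, 35),   # adult rate
]

def run_regression():
    failures = [(arg, want, monthly_fee(arg))
                for arg, want in REGRESSION_CASES
                if monthly_fee(arg) != want]
    for arg, want, got in failures:
        print(f"REGRESSION: monthly_fee({arg}) = {got}, expected {want}")
    return not failures

if __name__ == "__main__":
    print("All regression cases passed." if run_regression() else "Failures found.")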

Acceptance testing - final testing based on specifications of the end-user or customer, or based on use by end-users/customers over some limited period of time.

Load testing - testing an application under heavy loads, such as testing of a web site under a range of loads to determine at what point the system's response time degrades or fails.
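
As a rough illustration of the idea, the Python sketch below fires increasing numbers of concurrent requests at a web page using only the standard library and reports how response time changes. The URL and the load steps are placeholders; a real load test would normally use a dedicated tool.

# Rough load-testing sketch using only the standard library.
# The URL and the user counts are placeholders for illustration.

import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "http://localhost:8000/"  # hypothetical system under test

def timed_request(_):
    start = time.perf_counter()
    try:
        urllib.request.urlopen(URL, timeout=10).read()
        return time.perf_counter() - start
    except Exception:
        return None  # failures are counted separately

def run_load(concurrent_users):
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        results = list(pool.map(timed_request, range(concurrent_users)))
    ok = [r for r in results if r is not None]
    avg = sum(ok) / len(ok) if ok else float("nan")
    print(f"{concurrent_users:4d} users: {len(ok)} ok, "
          f"{len(results) - len(ok)} failed, avg {avg:.3f}s")

# Step the load up to see where response time starts to degrade.
for users in (1, 10, 50, 100):
    run_load(users)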

Stress testing - term often used interchangeably with 'load' and 'performance' testing. Also used to describe such tests as system functional testing while under unusually heavy loads, heavy repetition of certain actions or inputs, input of large numerical values, large complex queries to a database system, etc.

Performance testing - term often used interchangeably with 'stress' and 'load' testing. Ideally 'performance' testing (and any other 'type' of testing) is defined in requirements documentation or QA or Test Plans.

Usability testing - testing for 'user-friendliness'. Clearly this is subjective, and will depend on the targeted end-user or customer. User interviews, surveys, video recording of user sessions, and other techniques can be used. Programmers and testers are usually not appropriate as usability testers.

Install/uninstall testing - testing of full, partial, or upgrade install/uninstall processes.

Recovery testing - testing how well a system recovers from crashes, hardware failures, or other catastrophic problems.

Failover testing - typically used interchangeably with 'recovery testing'.

Security testing - testing how well the system protects against unauthorized internal or external access, willful damage, etc; may require sophisticated testing techniques.

Compatibility testing - testing how well software performs in a particular hardware/software/operating system/network/etc. environment.

Exploratory testing - often taken to mean a creative, informal software test that is not based on formal test plans or test cases; testers may be learning the software as they test it.

Ad-hoc testing - similar to exploratory testing, but often taken to mean that the testers have significant understanding of the software before testing it.

Context-driven testing - testing driven by an understanding of the environment, culture, and intended use of software. For example, the testing approach for life-critical medical equipment software would be completely different than that for a low-cost computer game.

User acceptance testing - determining if software is satisfactory to an end-user or customer.

Comparison testing - comparing software weaknesses and strengths to competing products.

Alpha testing - testing of an application when development is nearing completion; minor design changes may still be made as a result of such testing. Typically done by end-users or others, not by programmers or testers.

Beta testing - testing when development and testing are essentially completed and final bugs and problems need to be found before final release. Typically done by end-users or others, not by programmers or testers.

Mutation testing - a method for determining if a set of test data or test cases is useful, by deliberately introducing various code changes ('bugs') and retesting with the original test data/cases to determine if the 'bugs' are detected. Proper implementation requires large computational resources.
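
The sketch below shows the idea behind mutation testing on a very small scale: a 'bug' is introduced by hand and the same test data is re-run to see whether it is detected. Real mutation-testing tools generate and run such mutants automatically; the function and test data here are hypothetical.

# Hand-rolled illustration of mutation testing (real tools automate this).

def is_adult(age):
    return age >= 18          # original code

def is_adult_mutant(age):
    return age > 18           # deliberately introduced 'bug' (>= changed to >)

TEST_DATA = [(17, False), (18, True), (30, True)]

def suite_catches(func):
    """Return True if the test data detects a fault in func."""
    return any(func(age) != expected for age, expected in TEST_DATA)

print("Original passes:", not suite_catches(is_adult))      # True
print("Mutant detected:", suite_catches(is_adult_mutant))   # True -> useful test data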

A sample testing cycle
There is a typical cycle for testing:
Requirements analysis: Testing should begin in the requirements phase of the software development life cycle. During the design phase, testers work with developers to determine what aspects of a design are testable and with what parameters those tests work.

Test planning: Test strategy, test plan, and test bed creation. Many activities will be carried out during testing, so a plan is needed.

Test development: Test procedures, test scenarios, test cases, and test scripts to use in testing the software.

Test execution: Testers execute the software based on the plans and tests and report any errors found to the development team.

Test reporting: Once testing is completed, testers generate metrics and make final reports on their test effort and whether or not the software tested is ready for release.
Test result analysis: Also called defect analysis; done by the development team, usually along with the client, in order to decide which defects should be treated: fixed, rejected (i.e. the software is found to be working properly), or deferred to be dealt with at a later time.

Coding
Coding is the production of machine-comprehensible instructions. Techniques used here include structured programming techniques. Coding may be performed in one of four generations of languages:
a) 1st generation languages - machine code, i.e. 1's and 0's
b) 2nd generation languages - simple instructions such as arithmetic functions, e.g. assembly language
c) 3rd generation languages - languages which use more English-like or scientific constructs, e.g. COBOL, PL/I, FORTRAN
d) 4th generation languages - languages which use powerful commands to perform multiple operations, e.g. FOCUS, POWERHOUSE

Documentation
Documentation may refer to the process of providing evidence ("to document something") or to the communicable material used to provide such documentation (i.e. a document). Documentation may also (seldom) refer to tools aimed at identifying documents (see bibliography) or to the field of study devoted to documents and bibliographies (see documentation (field)). Documentation understood as a document is any communicable material (such as text, video, audio, etc., or combinations thereof) used to explain some attributes of an object, system, or procedure. It is often used to mean engineering documentation or software documentation, which usually consists of paper books or computer-readable files (such as HTML pages) that describe the structure and components, or the operation, of a system or product.
A professional whose field and work is documentation used to be termed a documentalist. Normally, documentalists are trained or have a background both in a specific subject and in the field of documentation (today, information science). A person employed more or less exclusively to write technical documentation is called a technical writer. Technical writers are similarly trained or have a background in technical writing, along with some knowledge of the subject(s) they are documenting. Often, though, they collaborate with subject matter experts (SMEs), such as engineers.

Meeting agenda
An agenda is a list of meeting activities in the order in which they are to be taken up, beginning with the call to order and ending with adjournment. It usually includes one or more specific items of business to be considered. It may, but is not required to, include specific times for one or more activities. The agenda is usually headed with the date, time and location of the meeting, followed by a series of points outlining the order of the meeting.

Points on a typical agenda
Welcome/open meeting
Apologies for absence
Approve minutes of the previous meeting
Matters arising from the previous meeting
A list of specific points to be discussed — this section is where the bulk of the discussion in the meeting usually takes place.
Any other business (AOB) — allowing a participant to raise another point for discussion.
Arrange/announce details of next meeting
Close meeting
In meetings of deliberative bodies, the agenda may also be known as the orders of the day. The agenda is usually distributed to a meeting's participants prior to the meeting, so that they will be aware of the subjects to be discussed and are able to prepare for the meeting accordingly. In parliamentary procedure, an agenda is not binding upon an assembly unless its own rules make it so, or unless it has been adopted as the agenda for the meeting by majority vote at the start of the meeting. Otherwise, it is merely for the guidance of the chair. If an agenda is binding upon an assembly, and a specific time is listed for an item, that item cannot be taken up before that time, and must be taken up when that time arrives even if other business is pending. If it is desired to do otherwise, the rules can be suspended for that purpose.

Validation
Verification typically involves reviews and meetings to evaluate documents, plans, code, requirements, and specifications. This can be done with checklists, issues lists, walkthroughs, and inspection meetings. Validation typically involves actual testing and takes place after verifications are completed. The term 'IV & V' refers to Independent Verification and Validation.

In common usage, validation is the process of checking if something satisfies a certain criterion. Examples would include checking if a statement is true (validity), if an appliance works as intended, if a computer system is secure, or if computer data are compliant with an open standard. Validation implies one is able to document that a solution or process is correct or is suited for its intended use.
In engineering, or as part of a quality management system, validation confirms that the needs of an external customer or user of a product, service, or system are met. Verification is usually an internal quality process of determining compliance with a regulation, standard, or specification. An easy way of recalling the difference between validation and verification is that validation is ensuring "you built the right product" and verification is ensuring "you built the product right." Validation is testing to confirm that the product satisfies the stakeholder's or user's needs.
Validation can also mean to declare or make legally valid, or to prove valid or confirm the validity of data, information, or processes.
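
In the data sense described above, a minimal validation check might look like the following Python sketch. The record fields and the rules are hypothetical examples of confirming that data satisfies its criteria.

# Minimal data-validation sketch; the fields and rules are hypothetical.
import re

def validate_member(record):
    """Return a list of problems; an empty list means the record is valid."""
    problems = []
    if not record.get("name", "").strip():
        problems.append("name is required")
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", record.get("email", "")):
        problems.append("email is not well formed")
    if not (0 < record.get("age", 0) < 120):
        problems.append("age must be between 1 and 119")
    return problems

print(validate_member({"name": "Ann", "email": "ann@example.com", "age": 30}))  # []
print(validate_member({"name": "", "email": "bad", "age": 0}))  # three problems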
By Mr. P. Rajamohan

Accessibility keyboard shortcuts

Right SHIFT for eight seconds - Switch FilterKeys on and off.

Left ALT + left SHIFT + PRINT SCREEN - Switch High Contrast on and off.

Left ALT + left SHIFT + NUM LOCK - Switch MouseKeys on and off.

SHIFT five times - Switch StickyKeys on and off.

NUM LOCK for five seconds - Switch ToggleKeys on and off.

