SOFTWARE TEST ENGINEERING: MANUAL TESTING
INTRODUCTION:
WHAT IS QUALITY?
WHAT IS TESTING?
WHY TESTING?
TESTING METHODOLOGY:
BLACK BOX TESTING
WHITE BOX TESTING
GREY BOX TESTING
LEVELS OF TESTING:
UNIT LEVEL TESTING
MODULE LEVEL TESTING
INTEGRATION LEVEL TESTING
SYSTEM LEVEL TESTING
USER ACCEPTANCE LEVEL TESTING
ENVIRONMENTS:
STAND-ALONE ENVIRONMENT (OR) ONE-TIER ARCHITECTURE
CLIENT-SERVER ENVIRONMENT (OR) TWO-TIER ARCHITECTURE
WEB ENVIRONMENT (OR) THREE-TIER ARCHITECTURE
DISTRIBUTED ENVIRONMENT (OR) N-TIER ARCHITECTURE
TYPES OF TESTING:
BUILD VERIFICATION TESTING/BUILD ACCEPTANCE TESTING/SANITY TESTING
REGRESSION TESTING
RETESTING
ALPHA TESTING
BETA TESTING
STATIC TESTING
DYNAMIC TESTING
INSTALLATION TESTING
COMPATIBILITY TESTING
MONKEY TESTING
USABILITY TESTING
END-TO-END TESTING
EXPLORATORY TESTING
SECURITY TESTING
PORT TESTING
MUTATION TESTING
SOAK TESTING/RELIABILITY TESTING
AD-HOC TESTING
INPUT DOMAIN TESTING
INTER SYSTEM TESTING
PARALLEL TESTING
PERFORMANCE TESTING
LOAD TESTING
STRESS TESTING
STORAGE TESTING
DATA VOLUME TESTING
BIG BANG TESTING/INFORMAL TESTING/SINGLE STAGE TESTING
INCREMENTAL TESTING/FORMAL TESTING
TERMINOLOGY
DEFECT PRODUCT
DEFECTIVE PRODUCT
QUALITY ASSURANCE
QUALITY CONTROL
NCR
INSPECTION
AUDIT
INTERNAL AUDIT
EXTERNAL AUDIT
CAPA (CORRECTIVE ACTIONS & PREVENTIVE ACTIONS)
CORRECTIVE ACTIONS
PREVENTIVE ACTIONS
SCM (SOFTWARE CONFIGURATION MANAGEMENT)
CHANGE CONTROL
VERSION CONTROL
COMMON REPOSITORY
CHECK-IN
CHECK-OUT
BASELINE
PUBLISHING/PINNING
RELEASE
DELIVERY
SRN (SOFTWARE RELEASE NOTE)
SDN (SOFTWARE DEVELOPMENT NOTE)
REVIEW
REVIEW REPORT
COLLEAGUES
PEER
PEER REVIEW
PEER REVIEW REPORT
TEST SUITE
TEST BED
HOT FIX
DEFECT AGE
LATENT DEFECT
SLIPPAGE
ESCALATION
METRICS
TRACEABILITY MATRIX
PROTOTYPE
TEMPLATE
BENCHMARK
CHANGE REQUEST
IMPACT ANALYSIS
WALK THROUGH
CODE WALK THROUGH
CODE OPTIMIZATION/FINE TUNING
PPM (PERIODIC PROJECT MEETING)
PPR (PERIODIC PROJECT REPORT)
MRM (MANAGEMENT REPRESENTATIVE MEETING)
PATCH
WORK AROUND
WAYS OF TESTING
MANUAL TESTING
AUTOMATION TESTING
DRAWBACKS OF MANUAL TESTING
DRAWBACKS OF AUTOMATION TESTING
MANUAL TESTING
What is MANUAL TESTING?
MANUAL TESTING is a process in which all the phases of the STLC (Software Testing Life Cycle), such as test planning, test development, test execution, result analysis, bug tracking and reporting, are accomplished manually with human effort.
Project: A project is something that is developed based on a particular customer’s requirements and for that customer’s usage only.
Product: A product is something that is developed based on the company’s own specifications and used by multiple customers.
Note: A product-based company will first conduct a general survey of the market and gather clear requirements from different customers; based on the common requirements of those many customers, it will decide the specifications (requirements).
Quality:
Classical Definition of Quality: Quality is defined as the justification of all the requirements of a customer in a product.
Note: Quality is not defined in the product; it is defined in the customer’s mind.
Testing: Testing is a process in which defects are identified, isolated and subjected to rectification, ensuring that the product is defect free, in order to produce a quality product and hence customer satisfaction. (Or) Verification & validation of software is called testing.
Bidding: Bidding is the process of requesting a proposal for a project, estimating it, and signing it off.
Kick off meeting: It is an initial meeting conducted in the software company, soon after the project is signed off, in
order to discuss the overview of the project and also to select a project manager.
Usually Project managers, Technical managers, Quality managers, High level management, Test leads,
Development leads and sometimes customer representatives will be involved in this meeting.
Note: Apart from this meeting, any kind of startup meeting during the process can be considered a ‘Kick off Meeting’.
Project Initiation Note (PIN): It is a mail prepared by the project manager and sent to the CEO of the software company as well as to all the core team members, in order to intimate them that they are about to start the actual project activities.
Software Quality:
Technical:
Meeting Customer Requirements
Meeting Customer Expectations (User friendly, Performance, Privacy)
Non-Technical:
Cost of Product
Time to Market
Software Quality Assurance: To monitor and measure the strength of the development process, organizations follow SQA concepts.
Software Project: Software-related problems are solved by software engineers through a software engineering process.
Some companies maintain two documents, one for the overall business flow information and a second for the detailed functional information, but some companies maintain both kinds of information in a single document.
Template: A template is a pre-defined format, which is used for preparing a document very easily and consistently.
Prototype: A prototype is a roughly and rapidly developed model which is used for demonstration to the client, in order to gather clear requirements and also to build the confidence of the customer. Ex: a PowerPoint slide show.
Tasks:
Feasibility study
Tentative planning
Technology selection and Environment confirmation
Requirement analysis
Roles: System Analyst (SA), Project Manager (PM), Technical Manager (TM)
Process:
a. Feasibility study:
It is a detailed study conducted on the requirement documents, in order to confirm whether the given requirements are possible within the given budget, time and available resources or not.
b. Tentative planning:
In this section resource planning and time planning will be done temporarily.
c. Technology selection & environment confirmation:
The list of all technologies required for accomplishing the project successfully will be analyzed, and the environment suitable for the project will be selected and mentioned in this section.
d. Requirement analysis:
The list of all the requirements that the company needs in order to accomplish this project successfully will be analyzed and listed out clearly in this section.
Note: Requirements may be human resources, software, and hardware.
Proof: The proof document of the Analysis phase is the System Requirements Specification (SRS).
Tasks:
High level designing
Low level designing
Roles:
High-level design is done by the Chief Architect (CA).
Low-level design is done by the Technical Lead (TL).
Process:
The chief architect will divide the whole project into modules by drawing some diagrams using the Unified Modeling Language (UML).
The team lead will divide the modules into sub-modules by drawing some diagrams using the same UML.
In this phase they will also design the GUI part of the application, and PSEUDO CODE is also developed.
Proof: The proof document of this phase is the Technical Design Document (TDD).
LLD: Ex: DFD (Data Flow Diagram), E-R Diagram, Class Diagram, Object Diagram.
PSEUDO CODE: Pseudo code is a set of English instructions, which will make the developers more comfortable while developing the actual source code.
Examples of coding standards:
Proper indentation (left margin)
Color coding
Proper commenting
Proof: The proof document of the Coding phase is the Source Code Document (SCD).
BUILD: The finally integrated set of all modules in .EXE form is called a build.
TEST CASES: Implementing the creative ideas of the test engineer on the application for testing, with the help of the requirement document, is known as writing TEST CASES.
Delivery:
Tasks: Hand over the application to the client.
Roles: Deployment engineers (or) installation engineers.
Process: The deployment engineers will go to the customer’s place, install the application into the customer’s environment and hand over the original software to the client.
Proof: The final official agreement made between the customer and the company is the proof document for Delivery.
Maintenance:
Once the application is delivered, the customer will start using it. While using it, if at all they face any problems, then each particular problem will be created as a task. Based on the tasks, corresponding roles will be appointed; they will define the process and solve the problem. This process is known as Normal Maintenance. But some customers may request continuous maintenance; in that case a team of members will be continuously working at the client site in order to take care of their software.
1. Unconventional Testing
2. Conventional Testing
Unconventional testing: It is a sort of testing in which the Quality Assurance people check whether each and every outcome document conforms to the company standards or not, right from the initial phase to the end.
Conventional testing: It is a sort of testing in which one checks whether the developed application or its related parts are working according to the expectations or not, from the coding phase to the end. Usually Quality Control people will do conventional testing.
LEVELS OF TESTING:
There are 5 levels of Testing:
1. Unit level testing (WBT)
2. Module level testing
3. Integration level testing
4. System level testing
5. User acceptance level testing
In this stage the developers will develop interfaces (linking programs), in order to integrate the modules. The white box testers will test whether the interfaces are working fine or not. Developers will integrate the modules by following any one of the following approaches:
TOP-DOWN APPROACH: In this approach parent modules will be developed first, and then the related child modules will be integrated.
(Diagram: module hierarchy with M1 at the top; M2 and M3 below it; M4, M5 and M6 below them; M7 and M8 at the bottom.)
STUB: While integrating the modules in the top-down approach, if a mandatory module is missing, then that module is replaced with a temporary program known as a STUB.
BOTTOM-UP APPROACH: In this approach child modules will be developed first and then integrated back to the
Corresponding parent modules.
(Diagram: bottom-up integration of modules M1–M6.)
DRIVER: While integrating the modules in the bottom-up approach, if a mandatory module is missing, then that module is replaced with a temporary program known as a DRIVER.
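As a sketch (module names and return values are hypothetical), a stub standing in for a missing child module during top-down integration, and a driver calling a finished child module during bottom-up integration, might look like this:

```python
def m2_stub(order_id):
    # Temporary stand-in (STUB) for the missing child module M2:
    # it just returns a canned response so the parent can be tested.
    return {"order_id": order_id, "status": "OK"}

def m1_process(order_id, child=m2_stub):
    # Parent module M1, already developed, delegates to its child
    # module -- the real one once it exists, the stub until then.
    result = child(order_id)
    return result["status"]

def driver(child):
    # In bottom-up integration the roles reverse: a DRIVER is a
    # temporary program that calls a finished child module in place
    # of its missing parent.
    return child(101)

print(m1_process(101))   # -> OK
```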
SANDWICH (OR) HYBRID APPROACH: This is a mixture of both Top-Down and Bottom-Up approach.
(Diagram: sandwich integration of modules M1–M8, combining top-down and bottom-up.)
BIG BANG APPROACH: In this approach one will wait until all the modules are developed and will finally integrate them all at once.
(Diagram: big bang integration of modules M1–M7 all at once.)
CORE LEVEL:
1. Usability Testing
2. Functionality Testing
ADVANCED LEVEL:
3. Performance Testing
4. Security Testing
During Usability Testing, testing team validates User Friendliness of screens.
During Functionality Testing, testing team validates Correctness of Customer
Requirements.
During Performance Testing, testing team estimates Speed of Processing.
During Security Testing, testing team validates Privacy to User Operations.
SYSTEM INTEGRATION TESTING: It is a type of testing in which one will perform some actions in one module and check for the reflections in all the related areas.
Ex: Warehouse, Finance, Inventory, Sales and Purchase modules.
In this stage the black box test engineers will once again test the user-desired areas in the presence of the user, in order to make the user accept the application.
ENVIRONMENT: An environment is defined as a group of hardware components with some basic software, which can hold the business logic, presentation logic and database logic.
(Or)
An environment is a combination of the presentation layer, business layer and database layer, which hold the presentation logic, business logic and database logic respectively.
TYPES OF ENVIRONMENTS:
There are 4 types of environments:
1. STAND-ALONE ENVIRONMENT (OR) ONE-TIER
ARCHITECTURE.
2. CLIENT-SERVER ENVIRONMENT (OR) TWO-TIER
ARCHITECTURE.
3. WEB ENVIRONMENT (OR) THREE-TIER ARCHITECTURE.
4. DISTRIBUTED ENVIRONMENT (OR) N-TIER
ARCHITECTURE.
1. STAND-ALONE ENVIRONMENT (OR) ONE-TIER ARCHITECTURE:
In this environment all three layers, that is the presentation layer (PL), business layer (BL) and database layer (DBL), will be available in a single tier. When the application needs to be used by a single user at a time, one can suggest this environment.
2. CLIENT-SERVER ENVIRONMENT:
In this environment two tiers will be there: one for the clients, the other for the server. The presentation layer and business layer will be available in each and every client; the database layer will be available in the server.
Whenever the application needs to be used by multiple users sharing common data on a single premises, with very fast access to the application and no particular security concerns, one can suggest the client-server environment.
Ex: LAN. (Each client holds PL + BL; the server holds the DBL.)
3. WEB ENVIRONMENT:
This environment contains three tiers: one for the clients, the middle one for the application server and the last one for the database server.
The presentation layer will be available in the client, the business layer will be available in the application server, and the database layer will be available in the database server.
Whenever the application needs to be used all over the world by a limited number of people, this environment can be suggested.
Ex: WAN. (Client PL → application server BL → database server DBL.)
4. DISTRIBUTED ENVIRONMENT:
This environment is similar to the web environment, but a number of application servers (AS) are introduced in individual tiers in order to distribute the business logic (BL), so that the load is distributed and performance is increased.
Whenever the application needs to be used all over the world by a huge number of people, this environment can be suggested.
(Diagram: PL in the clients; several application servers, each holding business logic; DBL in the database server.)
DATABASE: It is a base (or) a place where one can store and retrieve the data.
SOFTWARE PROCESS DEVELOPMENT MODELS:
1. WATERFALL MODEL
2. PROTOTYPE MODEL
3. EVOLUTIONARY MODEL
4. SPIRAL MODEL
5. FISH MODEL
6. V-MODEL
1. WATERFALL MODEL:
(Table: each phase with its activity and outcome; e.g. the initial phase’s activity is requirements gathering, with the BRS as its outcome, and the final phase is delivery & maintenance, whose activity is delivery to the client.)
ADVANTAGES:
1. It is a simple model.
2. Project monitoring and maintenance is very easy.
DISADVANTAGES:
Can’t accept the new requirements in the middle of the process.
2. PROTOTYPE MODEL:
(Diagram: a hardware prototype and a software prototype are each built and demonstrated to the client.)
ADVANTAGES:
Whenever the customers are not clear about their requirements, this is the best suitable model.
DISADVANTAGES:
a. It is not a full-fledged process development model.
b. The prototype needs to be built at the company’s cost.
c. It is a slightly time-consuming model.
d. The user may limit his requirements by sticking to the prototype.
3. EVOLUTIONARY MODEL:
(Flowchart: initial requirements → development and testing → user validation of the application; if the user does not accept it, feedback with new requirements flows back into development; once the user accepts it, the application is baselined.)
ADVANTAGES:
Whenever the customers are evolving the requirements, i.e. adding new requirements after some period of time, this is the best suitable model.
DISADVANTAGES:
1. Project monitoring and maintenance is difficult.
2. Can’t define the deadlines properly.
4. SPIRAL MODEL:
Ex: Risk based scientific projects, Satellite projects.
ADVANTAGES:
Whenever the project is highly risk based this is the best suitable model.
DISADVANTAGES:
1. Time consuming model.
2. Costly model.
3. Risk root cause analysis is not an easy task.
NOTE: The number of cycles depends upon the risk involved in the project and the size of the project. Every cycle has 4 phases, except the last one.
5. FISH MODEL:
ADVANTAGES:
As both verification and validation are implemented, the outcome will be a quality product.
DISADVANTAGES:
1. Time consuming model.
2. Costly model.
VERIFICATION:
Verification is a process of checking each and every role in the organization, in order to confirm whether they are working according to the company’s process guidelines or not.
VALIDATION:
Validation is a process of checking, conducted on the developed product or its related parts, in order to confirm whether they are working according to the expectations or not.
(Diagram: delivery & maintenance activities — port testing, delivering the software, testing software efficiency and software changes, and maintenance.)
DRE (Defect Removal Efficiency) ranges from 0 to 1:
DRE = A / (A + B), where
A = defects found by the testing team, and
B = defects raised by the customer.
Ex: DRE = 80 / (80 + 20) = 80/100 = 0.8 (good software).
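The formula above can be computed directly; a minimal sketch:

```python
def dre(found_by_testing, raised_by_customer):
    # Defect Removal Efficiency: DRE = A / (A + B), where
    # A = defects found by the testing team,
    # B = defects raised by the customer. Result is in the range 0..1.
    return found_by_testing / (found_by_testing + raised_by_customer)

# The example from the text: A = 80, B = 20.
print(dre(80, 20))  # -> 0.8
```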
ADVANTAGES:
As verification, validation and a test management process are all maintained, the outcome will be a quality product.
DISADVANTAGES:
1. Time consuming model.
2. Costly model.
AGILE MODEL: Before development of the application, the testers write the test cases and give them to the development team, so that it becomes easier for the developers to develop defect-free programs.
TERMINOLOGY:
IF A DEVELOPER FINDS A MISTAKE IN THE CODE WHILE DEVELOPING AN APPLICATION, IT IS CALLED AN ERROR.
IF A TESTER FINDS A MISTAKE IN A BUILD WHILE TESTING, IT IS CALLED A DEFECT (or) ISSUE.
IF A DEFECT IS ACCEPTED BY THE DEVELOPER TO BE SOLVED, IT IS CALLED A BUG.
IF A CUSTOMER FINDS ANY MISTAKE WHILE USING THE APPLICATION, IT IS CALLED A MISTAKE (or) FAULT (or) FAILURE.
In short: a mistake in code is called an ERROR; due to errors in coding, test engineers get mismatches in the application, called DEFECTS; a defect accepted by the development team to be solved is called a BUG.
TYPES OF TESTING
1. BUILD ACCEPTANCE TEST/BUILD VERIFICATION TEST/SANITY TESTING:
It is a type of testing in which one will perform overall testing on the released build, in order to confirm whether it is proper for conducting detailed testing or not. Usually during this type of testing they check the following:
Whether the build is properly installed or not
Whether one can navigate to all the pages of the application or not
Whether all the important functionalities are available or not
Whether all the required connections are properly established or not
Some companies even call this SMOKE TESTING, but some companies will say that the developers’ check of whether the build is proper, done before releasing it to the testing department, is known as SMOKE TESTING, and that once the build is released, whatever the test engineers check is known as BAT, BVT or SANITY TESTING (BAT: Build Acceptance Test, BVT: Build Verification Test).
2. REGRESSION TESTING:
It is a type of testing in which one will perform testing on already tested functionality again and again. Usually we do this in 2 scenarios:
Whenever the testers identify defects and raise them to the developers, and the next build is released, the test engineers will check the defect’s functionality as well as the related functionality once again.
Whenever some new features are added and the next build is released to the testing team, the test engineers will check all the features related to those new features once again. This is known as regression testing.
Note: Testing new features for the first time is known as new testing; it is not regression testing.
Note: Regression testing starts from the 2nd build and continues up to the last build.
3. RETESTING:
It is a type of testing in which one will perform testing on the same functionality again and again with different sets of values, in order to confirm whether it is working fine or not.
Note: Retesting starts from the 1st build and continues up to the last build.
Note: During regression testing, retesting will also be conducted.
4. ALPHA TESTING:
It is a type of user acceptance testing conducted in the software company by the test engineers, just before delivering the application to the client.
5. BETA TESTING:
It is also a type of user acceptance testing, conducted at the client’s place either by the end users or by third-party experts, just before the actual implementation of the application.
6. STATIC TESTING (Look and Feel Testing):
It is a type of testing in which one will perform testing on the application or its related factors without doing any actions.
Ex: GUI testing, document testing, code reviews, etc.
7. DYNAMIC TESTING:
It is a type of testing in which one will perform testing on the application or its related factors by doing some
actions.
Ex: Functional Testing.
8. INSTALLATION TESTING:
It is a type of testing in which one will install the application into the environment, following the guidelines provided in the deployment document (installation document), in order to confirm whether these guidelines are really suitable for installing the application into the environment or not.
9. PORT TESTING:
It is a type of testing in which one will install the application into the original client’s environment and check whether it is compatible with that environment or not.
The execution of our application under the customer-expected configuration, to estimate the peak limits of data, is called data volume testing.
PHASES OF THE STLC:
1. Test Planning.
2. Test Development.
3. Test Execution.
4. Result Analysis.
5. Bug Tracking.
6. Report.
1. Test planning:
Plan: A plan is a strategic document which describes how to perform a task in an effective, efficient and optimized way.
Test plan: A test plan is a strategic document which contains information describing how to perform testing on an application in an effective, efficient and optimized way.
Optimization: It is the process of utilizing the available resources to their level best and getting the maximum possible output.
1.0 Introduction
1.1 Objective.
1.2 Reference documents.
2.0 Test coverage
2.1 Features to be tested
2.2 Features not to be tested
3.0 Test strategy
3.1 Levels of testing
3.2 Types of testing
3.3 Test design techniques (BVA, ECP)
3.4 Configuration management
3.5 Test metrics
3.6 Terminology
3.7 Automation plan
3.8 List of automated tools
4.0 Base criteria
4.1 Acceptance criteria
4.2 Suspension criteria
5.0 Test deliverables
6.0 Test environment
7.0 Resource planning
8.0 Scheduling
9.0 Staffing and Training
10.0 Risk and Contingencies
11.0 Assumptions
12.0 Approval information
1.0 INTRODUCTION:
1.1 Objective: The purpose of the document will be described clearly in this section.
1.2 Reference documents: The list of all the documents that are referred while preparing the test plan will be listed
out here in this section.
5.0 TEST DELIVERABLES:
The list of all the documents that are to be prepared during the testing process will be maintained here in this section.
Ex: Test case document, defect profile document, etc.
8.0 SCHEDULING:
The starting and ending dates of each and every task will be clearly planned and maintained here in this section.
11.0 ASSUMPTIONS:
The list of all the assumptions that need to be made by the testing people will be maintained here in this section.
Ex: Test data.
2. Test Development:
Requirement Document:
USE CASE: It describes the functionality of a certain feature of an application in terms of actors, actions and responses.
SNAPSHOT: (image of the login screen)
Functional Requirements:
1. Login screen should contain USER NAME, PASSWORD, CONNECT TO Fields, LOGIN, CLEAR, CANCEL Buttons.
2. Connect to field is not a mandatory field, but it should allow the user to select a database option, if required.
3. Upon entering the user name, password and clicking on login button, the corresponding page must be displayed.
4. Upon entering information into any of the fields and clicking on the clear button, all the fields must be cleared and the cursor must be available in the user name field.
5. Upon clicking on the cancel button, the login screen must be closed.
Special Requirements:
1. Upon invoking the application, the login and clear buttons must be disabled.
2. Cancel button must be always enabled.
3. Upon entering some information into any of the fields, the clear button must be enabled.
4. Upon entering some information into username and password login button must be enabled.
5. Tabbing order must be username, password, connect to, login, clear, and cancel.
1. Implicit Requirements
2. Explicit Requirements
1. Implicit Requirements: The requirements that are analyzed by the Business Analyst and his team, which will add some value to the application without affecting any of the customer’s requirements.
2. Explicit Requirements: The requirements that are explicitly given by the customer are known as explicit requirements.
Explicit Requirements:
1. Upon invoking the application, the login and clear buttons must be disabled.
2. Cancel button must be always enabled.
3. Upon entering some information into any of the fields, the clear button must be enabled.
4. Upon entering some information into both the username and password fields, the login button must be enabled.
5. Tabbing order must be username, password, connect to, login, clear, and cancel.
Implicit Requirements:
1. Upon invoking the application the cursor must be available in the user name field.
2. Upon entering an invalid user name, a valid password, and clicking on the login button, the following error message must be displayed: “INVALID USER NAME Please try again”.
3. Upon entering a valid user name, an invalid password and clicking on the login button, the following error message must be displayed: “INVALID PASSWORD Please try again”.
4. Upon entering an invalid user name, an invalid password and clicking on the login button, the following error message must be displayed: “INVALID USER NAME & INVALID PASSWORD Please try again”.
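A minimal sketch of the login responses these requirements describe. The account table and page name are hypothetical, and for the combined case (requirement 4) a password is treated as “known” if it belongs to any account, purely so the example can distinguish the cases:

```python
# Hypothetical credential table -- illustration only.
USERS = {"admin": "secret1", "guest": "secret2"}

def login(username, password):
    user_ok = username in USERS
    pwd_known = password in USERS.values()  # simplification for req. 4
    if user_ok and USERS[username] == password:
        return "HOME PAGE"  # or the admin page, depending on the actor
    if not user_ok and not pwd_known:
        return "INVALID USER NAME & INVALID PASSWORD Please try again"
    if not user_ok:
        return "INVALID USER NAME Please try again"
    return "INVALID PASSWORD Please try again"

print(login("admin", "secret1"))   # -> HOME PAGE
print(login("nobody", "secret1"))  # -> INVALID USER NAME Please try again
```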
GENERIC REQUIREMENTS: Universal Requirements.
SPECIFIC REQUIREMENTS: Customer Requirements.
Flow of Events:
Main Flow:
ACTION: Actor invokes the application. RESPONSE: Application displays the login screen with the following fields: user name, password, connect to, and the login, clear and cancel buttons.
ACTION: Actor enters a valid user name and valid password and clicks on the login button. RESPONSE: Application authenticates and displays either the home page or the admin page, depending on the actor who entered.
ACTION: Actor enters a valid user name and valid password, selects a database option and clicks on the login button. RESPONSE: Application authenticates and displays either the home page or the admin page, depending on the actor who entered, with the mentioned database connection.
Alternative flow table 1 (INVALID USER NAME):
ACTION: Actor enters an invalid user name and a valid password and clicks on the login button. RESPONSE: Application authenticates and displays the following error message: “INVALID USER NAME Please try again”.
Alternative flow table (CLEAR CLICK):
ACTION: Actor enters some information in any of the fields and clicks on the clear button. RESPONSE: Application clears the fields and makes the cursor available in the user name field.
Alternative flow table 5 (CANCEL CLICK):
ACTION: Actor clicks on the cancel button. RESPONSE: Login screen is closed.
THE GUIDELINES TO BE FOLLOWED BY A TEST ENGINEER, SOON AFTER THE USE CASE DOCUMENT IS RECEIVED:
1. Identify the module to which the use case belongs.
A: Security module.
2. Identify the functionality of the use case with respect to the total functionality.
A: Authentication.
5. Identify whether the use case is linked with other use cases or not.
A: It is linked with the Home page and Admin page use cases.
8. Identify the functional points and prepare the functional point document.
UNDERSTAND:
9. Understand the main flow of the application.
10. Understand the alternative flow of the application.
11. Understand the special requirements.
DOCUMENT:
12. Document the test cases for main flow.
13. Document the test cases for alternative flow.
14. Document the test cases for the special requirements.
15. Prepare the cross-reference matrix or traceability matrix.
Functional Point:
The point at which the user can perform some action in the application can be considered a functional point.
Test Scenario:
TRACEABILITY MATRIX:
It is a document which contains a table of linking information, used for tracing back references in any kind of confusing or questionable situation.
TM:
UCD ID FPD ID TSD ID TCD ID DPD ID
8.1 3 4 26 1
23.2 21 8 86 2
5.4 34 6 44 3
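For illustration, the TM rows above could be held as records so that any ID can be traced back along the chain (the IDs are taken from the table; the helper name is our own):

```python
# Traceability matrix rows: use case (UCD), functional point (FPD),
# test scenario (TSD), test case (TCD) and defect profile (DPD) IDs.
TM = [
    {"UCD": "8.1",  "FPD": 3,  "TSD": 4, "TCD": 26, "DPD": 1},
    {"UCD": "23.2", "FPD": 21, "TSD": 8, "TCD": 86, "DPD": 2},
    {"UCD": "5.4",  "FPD": 34, "TSD": 6, "TCD": 44, "DPD": 3},
]

def trace(dpd_id):
    # Trace a defect (DPD ID) back to the use case it originated from.
    row = next(r for r in TM if r["DPD"] == dpd_id)
    return row["UCD"]

print(trace(2))  # -> 23.2
```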
RTM:
TEST CASE ID   REQUIREMENT ID
1              1.0
2              1.0
3              1.1
4              1.2
5              1.2
6              2.0
Note: Which test cases are prepared for which requirement is mentioned in this table. If we mention the Req. Id in the test case document itself, it is the same as the RTM, so we need not maintain the RTM separately for the application.
DTM:
DEFECT ID   TEST CASE ID
1           23
2           34
3           56
4           44
Note: Which defects are related to which test case is mentioned in this table.
d. Installation Testing.
FUNCTIONALITY TESTING: It is a type of testing in which one will perform testing on the functionality of an application; functionality means the behavior of the application.
Apart from the above guidelines, any idea we get with which we can test something in the application just by look and feel, without performing any actions, can also be considered a GUI test case.
TEST CASE TEMPLATE:
PROJECT NAME:
MODULE:
AUTHOR:
Columns: REQ. ID | TEST CASE ID | CATEGORY | PREREQUISITE | DESCRIPTION/TEST STEPS | TEST DATA | EXPECTED VALUE | ACTUAL VALUE | TEST RESULT (PASS/FAIL/BLOCKED) | BUILD No. | PRIORITY

Test case 9 (POSITIVE, prerequisite NA, build 1): Enter some information into any of the fields and click on the clear button. Expected: all the fields must be cleared and the cursor should be displayed in the username field. Actual: all the fields are cleared but the cursor is not placed in the username field. Result: FAIL.
Test case 10 (POSITIVE, prerequisite NA, test data VIT, build 1): Enter the username and password as per the VIT and click on the login button. Expected: the corresponding page must be displayed as per the VIT. Actual: the corresponding pages are not displayed as per the VIT. Result: FAIL.
Test case 11 (POSITIVE, prerequisite NA, test data VIT, build 1): Enter the username and password as per the VIT and select a database option. Expected: the corresponding page must be displayed as per the VIT with the mentioned database connection. Actual: the corresponding pages are not displayed as per the VIT, but the mentioned database connection is properly established. Result: FAIL.
Test case 12 (POSITIVE, prerequisite NA, build 1): Click on the cancel button. Expected: the login screen must be closed. Actual: the login screen is closed. Result: PASS.
Test case 13 (POSITIVE, prerequisite: login screen must be invoked, build 1): Check the tabbing order. Expected: the tabbing order must be: username, password, connect to, login, clear & cancel. Actual: the tabbing order is: username, password, connect to, login, clear & cancel. Result: PASS.
Test case 14 (NEGATIVE, prerequisite NA, test data IVIT, build 1): Enter the username & password as per the IVIT and click on the login button. Expected: the corresponding error message should be displayed as per the IVIT. Actual: the corresponding messages are not displayed as per the IVIT. Result: FAIL.
Test case 15 (NEGATIVE, prerequisite NA, build 1): Enter some information only into the username field and check the enabled property of the login button. Expected: the login button must be disabled. Actual: the login button is enabled. Result: FAIL.
Test case 16 (NEGATIVE, prerequisite NA, build 1): Enter some information only into the password field and check the enabled property of the login button. Expected: the login button must be disabled. Actual: the login button is disabled. Result: PASS.
Note: The underlined words in the test data column are hyperlinks to the respective data table.
LOGIN SCREEN:
(Snapshot: annotated login screen; 4 – login button, 5 – clear button, 6 – cancel button.)
S.No. | USERNAME | PASSWORD | EXPECTED MESSAGE | ACTUAL VALUE | RESULT
1 | SURI | QTP | INVALID USER NAME PLEASE TRY AGAIN | INVALID USER NAME PLEASE TRY AGAIN | PASS
2 | CHIRUTHA | SRIDEVI | INVALID USER NAME PLEASE TRY AGAIN | INVALID USER NAME PLEASE TRY AGAIN | PASS
3 | VENKI | SAVITHRI | INVALID PASSWORD PLEASE TRY AGAIN | INVALID PASSWORD PLEASE TRY AGAIN | PASS
4 | NTR | BALU | INVALID PASSWORD PLEASE TRY AGAIN | INVALID PASSWORD PLEASE TRY AGAIN | PASS
5 | SRI | JAVA | INVALID USERNAME & PASSWORD PLEASE TRY AGAIN | SRI HOME PAGE | FAIL
6 | RAJA | RANI | INVALID USERNAME & PASSWORD PLEASE TRY AGAIN | INVALID USER NAME PLEASE TRY AGAIN | FAIL
BVA summary:
Min → PASS
Min-1 → FAIL
Min+1 → PASS
Max → PASS
Max-1 → PASS
Max+1 → FAIL
ECP summary: VALID → Pass, INVALID → Fail
Ex:
A login process allows user ID and Password to validate users. User ID allows Alpha Numerics in lower case
from 4 to 16 characters long. Password allows alphabets in lower case 4 to 8 characters long. Prepare BVA and ECP
for user ID and password.
USER ID
BVA:
4 → PASS
3 → FAIL
5 → PASS
16 → PASS
15 → PASS
17 → FAIL
ECP:
VALID: a to z, 0 to 9
INVALID: A to Z, special characters, blank space
PASSWORD
BVA:
4 → PASS
3 → FAIL
5 → PASS
8 → PASS
7 → PASS
9 → FAIL
ECP:
VALID: a to z
INVALID: A to Z, 0 to 9, special characters, blank space
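As a sketch, the two rules above can be expressed as regular-expression validators, with the BVA boundaries checked as assertions (the function names are our own):

```python
import re

def valid_user_id(s):
    # User ID: lower-case alphanumerics, 4 to 16 characters long.
    return re.fullmatch(r"[a-z0-9]{4,16}", s) is not None

def valid_password(s):
    # Password: lower-case alphabets, 4 to 8 characters long.
    return re.fullmatch(r"[a-z]{4,8}", s) is not None

# Boundary checks taken from the BVA tables above.
assert valid_user_id("a" * 4) and valid_user_id("a" * 16)
assert not valid_user_id("a" * 3) and not valid_user_id("a" * 17)
assert valid_password("a" * 4) and valid_password("a" * 8)
assert not valid_password("a" * 3) and not valid_password("a" * 9)
```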
TEST DESIGN TECHNIQUES: They are used to let test engineers write test cases very easily and comfortably, even in complex situations. The 2 famous techniques used by most companies are:
1. BOUNDARY VALUE ANALYSIS (BVA): Whenever there is a range kind of input to be tested, it is suggested to test the boundaries of the range instead of the whole range. Usually one will concentrate on the following values:
LB-1 FAIL
LB PASS
LB+1 MV PASS
UB-1 PASS
UB PASS
UB+1 PASS
FAIL
LB-LOWER BOUNDARY
MV-MEDIUM VALUE
UB-UPPER BOUNDARY
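The seven probe values above can be generated mechanically from any range. A minimal sketch (the function name is ours):

```python
def bva_values(lb, ub):
    """Return the seven classic BVA probe values for the range [lb, ub].

    Real test design also records the expectation for each probe:
    lb..ub should PASS, while lb-1 and ub+1 should FAIL.
    """
    mv = (lb + ub) // 2  # medium value
    return {
        "LB-1": lb - 1,  # expected FAIL
        "LB": lb,        # expected PASS
        "LB+1": lb + 1,  # expected PASS
        "MV": mv,        # expected PASS
        "UB-1": ub - 1,  # expected PASS
        "UB": ub,        # expected PASS
        "UB+1": ub + 1,  # expected FAIL
    }

# For a 4-to-20-character text box this yields 3, 4, 5, 12, 19, 20, 21.
print(bva_values(4, 20))
```

Note how the output for the range 4 to 20 matches the text-box BVA table worked out later in this section.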
2. EQUIVALENCE CLASS PARTITIONING (ECP): Whenever there are many requirements for a particular feature, or a huge range of data needs to be tested, it is suggested to first divide the inputs into equivalence classes and then write the test cases.
Ex: Write the Test Cases for testing a Text Box, whose requirements are as follows:
1. It should accept Min 4 Characters and Max 20 Characters.
2. It should accept only @ and _ Symbols only.
3. It should accept only Small Alphabets.
BVA:
LB-1 -- 3
LB   -- 4
LB+1 -- 5
MV   -- 12
UB-1 -- 19
UB   -- 20
UB+1 -- 21
ECP:
VALID: 4, 5, 12, 19, 20, a-z, @, _
INVALID: 3, 21, A-Z, 0-9, spaces, decimal points, all special characters except @ and _
VIT (VALID INPUT TABLE):
S.No. INPUTS
1 abcd
2 ab@cd
3 abcdabcda__z
4 abcdabcdabcdabcd
5 abcdabcdabzdabcda_@z
IVIT (INVALID INPUT TABLE):
S.No. INPUTS
1 abc
2 ABCD
3 ABCD@__@abcd
4 <>@+-*/,.\abcdzyxw
5 12345
6 5.4
7 abcd ABCD z@/*
8 abcdabcdABCDABCD@<>+-ab
9 ABCD123, abcd123
10 abcdabcdABCD12345@<>ab @abcd
TEST CASE ID | TEST CASE TYPE | DESCRIPTION | EXPECTED VALUE | TEST DATA
1 | Positive | Enter the values into the Text Box as per the VIT | It should accept | VIT
2 | Negative | Enter the values into the Text Box as per the IVIT | It should not accept | IVIT
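The partitions above can be expressed as one executable rule. A minimal sketch of the text-box requirements (4 to 20 characters, lowercase a-z plus only the @ and _ symbols); the regex and function name are ours:

```python
import re

# Valid inputs: 4-20 characters drawn from a-z, @ and _ only.
VALID_RE = re.compile(r"^[a-z@_]{4,20}$")

def partition(value):
    """Classify an input into the valid (VIT) or invalid (IVIT) class."""
    return "VIT" if VALID_RE.fullmatch(value) else "IVIT"

# Spot-checks against the tables above:
assert partition("abcd") == "VIT"
assert partition("ab@cd") == "VIT"
assert partition("abcdabcdabzdabcda_@z") == "VIT"   # exactly 20 characters
assert partition("abc") == "IVIT"                   # too short
assert partition("ABCD") == "IVIT"                  # uppercase
assert partition("12345") == "IVIT"                 # digits
assert partition("abcd ABCD z@/*") == "IVIT"        # spaces and other symbols
```

Writing the rule once and classifying every candidate input against it is exactly what the VIT/IVIT tables do by hand.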
3. Test Execution:
In this phase the test engineer will do the following:
4. Result Analysis:
In this phase the test engineer compares the actual value with the expected value. If both match, he will record the result as PASS; otherwise FAIL.
Note: If a test case cannot be executed for any reason, the test engineer will specify BLOCKED in the result column.
Ex: If an application has 5 pages and each page has a Next button that takes us to the next page, then if the Next button on the 4th page is not working, we cannot enter the 5th page, and none of the test cases for the 5th page can be executed. The result of all those test cases will be BLOCKED.
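The result-analysis rule above fits in a few lines. A minimal sketch (names are ours); BLOCKED records a test case that could not be run at all:

```python
def analyse(expected, actual, executed=True):
    """Decide the result column for one test case."""
    if not executed:
        return "BLOCKED"   # the case could not be run, e.g. a blocked page
    return "PASS" if actual == expected else "FAIL"

print(analyse("ADMIN PAGE", "ADMIN PAGE"))            # PASS
print(analyse("ADMIN PAGE", "PERSONAL HOME PAGE"))    # FAIL
print(analyse("PAGE 5 OPENS", None, executed=False))  # BLOCKED
```

In the 5-page example, every test case for page 5 would be recorded with executed=False and therefore BLOCKED.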
5. Bug Tracking:
It is a process of identifying, isolating and managing the defects.
DEFECT ID: The sequence of defect numbers will be mentioned here in this section.
TEST CASE ID: The test case id based on which defect is found will be mentioned here in this section.
ISSUE DESCRIPTION: What exactly the defect is will be clearly described here in this section.
REPRODUCIBLE STEPS: The list of all the steps the test engineer followed to identify the defect will be listed out here in this section.
DETECTED BY: The name of the test engineer who identified the defect will be mentioned here in this section.
DETECTED DATE: The date on which the defect is identified will be mentioned here in this section.
DETECTED BUILD: The build number in which the defect is identified will be mentioned here in this section.
DETECTED VERSION: The version number in which defect is identified will be mentioned here in this section.
VERSION: The version on which the build was released will be mentioned here in this section.
Note: Whenever the application itself changes, the version changes; the build number simply keeps incrementing with every release. The VERSION is decided by Software Configuration Management (SCM).
Ex: If the application version is 2.3.6 and build 1 is released, the testing team tests the build, identifies the defects and sends it back for the next build. If new requirements from the customer are added to the application, the version changes from 2.3.6 to 2.3.7. When build 2 is then released, the build number keeps on increasing, whether or not the version has changed.
2. MAJOR DEFECTS: If at all the problems are related to working of the main functionality then such types of
defects are treated as MAJOR DEFECTS.
Ex:
3. MINOR DEFECTS: If at all the problems are related to look and feel of the application, then such type of
defects are treated as MINOR DEFECTS.
Ex:
Instead of an ADD button, a BAD button is displayed; and the text-box sizes and label names are not consistent with each other.
4. SUGGESTIONS: If at all the problems are related to value (User friendliness) of the application then such type
of problems are treated as SUGGESTIONS.
Ex:
INVALID USERNAME PLEASE TRY AGAIN----POOR HELP
PLEASE ENTER ALPHANUMERIC ONLY----STRONG HELP
DEFECT PRIORITY: It describes the sequence in which the defects need to be rectified.
1. CRITICAL PRI 1 P1 1
2. HIGH PRI 2 P2 2
3. MEDIUM PRI 3 P3 3
4. LOW PRI 4 P4 4
Usually:
SEVERITY       -- PRIORITY
FATAL DEFECTS  -- CRITICAL
SUGGESTIONS    -- LOW
Sometimes highest Severity defects will be given least Priority and sometimes least Severity defects will be given
highest Priority.
Ex:
Suppose 80% of an application is developed and the remaining 20% is not. The application is released to the testing department. The test engineer will raise the missing 20% as a fatal defect, but the development lead may treat it as the least priority.
Note: Because the highest-severity defect is not always the highest-priority one, both SEVERITY and PRIORITY are recorded for every defect.
TYPES OF DEFECTS:
8. ID Control Bugs: MINOR
Ex: Logo missing, wrong logo, wrong version number, copyright window missing, developer names missing, tester names missing.
FIX-BY/DATE/BUILD: The developer who fixed the defect, the date, and the build number will be mentioned here in this section.
DATE OF CLOSURE: The date on which the defect is rectified will be mentioned here in this section.
DEFECT AGE: The time gap between “Reported on” and “Resolved On”.
Note:
Enter NA in the Reproducible Steps column if the defect is a look-and-feel issue.
The heart of the defect profile is the Issue Description.
The supporting column for the Description is Reproducible Steps.
DEFECT PROFILE TEMPLATE:
DEFECT ID | TEST CASE ID | ISSUE DESCRIPTION | REPRODUCIBLE STEPS | DETECTED BY | DETECTED DATE | DETECTED BUILD | DETECTED VERSION | DEFECT RESOLUTION (OR) STATUS | FIX | DATE OF CLOSURE
1 | 5 | Upon invoking the application, initially the cursor is not positioned in the USER NAME field. | NA | SURI | 16.12.09 | 1 | 2.3.6 | - | - | -
3 | 9 | Upon clicking on CLEAR, all the fields are cleared, but the cursor is not displayed in the USER NAME field. | 1. Enter some information into any of the fields. 2. Click on the CLEAR button. 3. Observe that all the fields are cleared, but the cursor is not placed in the USER NAME field. | SURI | 16.12.09 | 1 | 2.3.6 | - | - | -
4 | 10 | Upon entering SURESH into the USER NAME field and QTP into the PASSWORD field, the personal home page is displayed instead of the admin page. | 1. Enter SURESH into the USER NAME field and QTP into the PASSWORD field. 2. Click on the LOGIN button. 3. Observe that the personal home page is displayed instead of the admin page. | SURI | 16.12.09 | 1 | 2.3.6 | - | - | -
5 | 11 | Upon entering SURESH into the USER NAME field and QTP into the PASSWORD field, selecting a database option and clicking on the LOGIN button, the database connection is properly established, but the personal home page is displayed instead of the admin page. | 1. Enter SURESH into the USER NAME field and QTP into the PASSWORD field. 2. Select a database option. 3. Click on the LOGIN button. 4. Observe that the database connection is properly established, but the personal home page is displayed instead of the admin page. | SURI | 16.12.09 | 1 | 2.3.6 | - | - | -
6 | 12 | Upon entering invalid data into the USER NAME & PASSWORD fields, the expected error messages are not displayed. For details refer to TABLE 1. | 1. Enter the USER NAME & PASSWORD as per TABLE 1. 2. Click on the LOGIN button. 3. Observe the actual value as per TABLE 1. | SURI | 16.12.09 | 1 | 2.3.6 | - | - | -
TABLE 1:
S.No. USER NAME PASSWORD EXPECTED MESSAGE ACTUAL VALUE
NEW: Whenever a defect is newly identified by the tester, he will set the status as NEW.
OPEN: Whenever the developer accepts the defect, he will set the status as OPEN.
DEFERRED: Whenever the developer accepts the defect but wants to rectify it later, he will set the status as DEFERRED.
FIXED: Once the defect is rectified, the developer will set the status as FIXED (also called RECTIFIED).
REOPEN & CLOSED: Once the next build is released, the tester will check whether the defect is really rectified; if it is, he will set the status as CLOSED, otherwise REOPEN.
HOLD: Whenever the developer is unsure whether to accept or reject the defect, he will set the status as HOLD.
Whenever a defect is in HOLD status, there will be a meeting on that defect; if it is decided to be a defect, the developers will set the status as OPEN, otherwise the testers will close it.
REJECTED: Whenever the developers feel that it is not a defect at all, they will set the status as REJECTED.
Whenever a defect is rejected, the test engineers will once again check it; if they also feel it is not a defect, they will set the status as CLOSED, otherwise REOPEN.
AS PER DESIGN: Whenever the developers feel the testers are not aware of the latest requirements, they will set the status as AS PER DESIGN.
Whenever a defect is in AS PER DESIGN status, the testers will once again check it against the latest requirements; if they also feel it is as per design, they will set the status as CLOSED, otherwise REOPEN.
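The defect life cycle above can be sketched as an allowed-transition table. The statuses come from the text; the transition table itself is our reading of it, not a standard from any particular bug tracking tool:

```python
# Which statuses a defect may legally move to from each status.
TRANSITIONS = {
    "NEW": {"OPEN", "REJECTED", "HOLD", "AS PER DESIGN", "DEFERRED"},
    "OPEN": {"FIXED"},
    "DEFERRED": {"OPEN", "FIXED"},
    "FIXED": {"CLOSED", "REOPEN"},     # decided by the tester on the next build
    "REOPEN": {"FIXED"},
    "HOLD": {"OPEN", "CLOSED"},        # decided in the meeting on the defect
    "REJECTED": {"CLOSED", "REOPEN"},  # decided by the testers on re-check
    "AS PER DESIGN": {"CLOSED", "REOPEN"},
}

def move(status, new_status):
    """Advance a defect, refusing transitions the life cycle does not allow."""
    if new_status not in TRANSITIONS.get(status, set()):
        raise ValueError(f"illegal transition {status} -> {new_status}")
    return new_status

# A typical path: found, accepted, fixed, re-checked, fixed again, closed.
status = "NEW"
for nxt in ("OPEN", "FIXED", "REOPEN", "FIXED", "CLOSED"):
    status = move(status, nxt)
assert status == "CLOSED"
```

A table like this is essentially what a bug tracking tool enforces when it greys out the status values a given role is not allowed to set.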
6. Reporting:
CLASSICAL BUG REPORTING PROCESS:
[Diagram: each test engineer (TE) mails his defect report directly to the developers (D).]
DRAWBACKS:
1. Time consuming
2. No transparency
3. Redundancy
4. No security (Hackers may hack the Mails)
[Diagram: the test engineers (TE) report defects to the test lead (TL), who uploads the consolidated report to a common repository; the development lead (DL) downloads it and distributes the defects to the developers (D).]
COMMON REPOSITORY: It is a server which can allow only authorized people to upload and download.
DRAWBACKS:
1. Time consuming
2. No transparency
3. Redundancy (Repeating)
[Diagram: the test engineers (TE1-TE3) and developers (D1-D3) all access the defects directly through a common repository / bug tracking tool.]
DEFECT TEMPLATE
BUG TRACKING TOOL: It is a software application which can be accessed only by authorized people and which provides all the facilities for bug tracking and reporting.
PR-PERFORMANCE REPORTER.
Ex:
No Transparency: The test engineer cannot see what is happening in the development department, and the developer cannot see what is going on in the testing department.
Redundancy: There is a chance that the same defect will be found by all the test engineers.
Ex: Suppose all the test engineers find the defect in the login screen's LOGIN button; then they all raise the same defect.
The test engineer enters the bug tracking tool, adds the defect using the Add Defect feature, and fills in the corresponding columns; the test lead observes it in parallel through the tool and assigns the severity.
The development lead also enters the bug tracking tool; he assigns the priority and assigns the task to a developer. The developer enters the tool, understands the defect, and rectifies it.
Tool: Something that is used to complete the work easily and perfectly.
Note: Some companies use their own bug tracking tool, developed in-house; such tools are called IN-HOUSE TOOLS.
TEST CLOSURE ACTIVITY: This is the final activity in the testing process done by the test lead, where he will
prepare the test summary report, which contains the information like:
Number of cycles of execution,
Number of test cases executed in each cycle,
Number of defects found in each cycle,
Defect ratio, etc.
TERMINOLOGY:
DEFECT PRODUCT: If a product does not satisfy some of the requirements but is still usable, then such products are known as Defect Products.
DEFECTIVE PRODUCT: If a product does not satisfy some of the requirements and is also not usable, then such products are known as Defective Products.
QUALITY ASSURANCE: It is a department which checks each and every role in the organization, in order to confirm whether they are working according to the company's process guidelines or not.
QUALITY CONTROL: It is a department which checks whether the developed products, or their related parts, are working according to the requirements or not.
NCR: If a role is not following the process, the penalty raised against that role is known as an NCR (Non-Conformance Raised).
Ex: IT - NCR; Non-IT - MEMO.
INSPECTION: It is a process of sudden checking conducted on the roles (or) departments, without any prior intimation.
AUDIT: It is a process of checking, conducted on the roles (or) department with prior intimation well in advance.
INTERNAL AUDIT: If the audit is conducted by the internal resources of the company, then the audit is known as an Internal Audit.
EXTERNAL AUDIT: If the audit is conducted by external people, then that audit is known as an External Audit.
AUDITING: To audit Testing Process, Quality people conduct three types of Measurements & Metrics.
[Graph: number of defects found plotted against time.]
Stability:
20% testing → 80% defects
80% testing → 20% defects
Sufficiency:
→ Requirements Coverage
→Type-Trigger analysis
Test Status:
→ Completed
→ In progress
→ Yet to execute
Delays in Delivery:
→ Defect arrival rate
→ Defect resolution rate
→ Defect age
Test Efficiency:
→ Cost to find a defect ( No of defects / Person-Day)
Test Effectiveness:
→ Requirements Coverage
→Type-Trigger analysis
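The "cost to find a defect" metric above is simple arithmetic: defects found per person-day of testing effort. The numbers below are invented purely for illustration:

```python
# Test efficiency = defects found / person-days of testing effort.
defects_found = 48
person_days = 12          # e.g. 3 testers working for 4 days

efficiency = defects_found / person_days
print(efficiency)  # 4.0 defects per person-day
```

Quality auditors track this figure across cycles: it tends to start high and fall as the build stabilizes, which is the same shape the stability graph above describes.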
CORRECTIVE ACTIONS: Whenever a role has committed a repairable mistake, corrective actions are taken in order to correct that mistake.
PREVENTIVE ACTIONS: Whenever a role has committed an irreparable mistake, preventive actions are taken in order to prevent such mistakes in the future.
SCM (SOFTWARE CONFIGURATION MANAGEMENT): It is a process wherein 2 tasks are performed:
1. CHANGE CONTROL
2. VERSION CONTROL
CHANGE CONTROL: It is a process of updating all the related documents whenever some changes are made to the application, in order to keep the documents and the application in sync with each other.
VERSION CONTROL: It is a process in which one will take care of Naming conventions and Versions.
COMMON REPOSITORY: It is basically a server, which can be accessed only by the authorized people, where in they
can store the information and retrieve the information.
CHECK IN: It is a process of uploading information to the Common Repository.
CHECK OUT: It is a process of downloading information from the Common Repository.
BASE LINE: It is a process of finalizing the documents.
PUBLISHING/PINNING: It is a process of making the finalized documents available to the relevant resources.
RELEASE: It is a process of sending the application from the development department to the testing department (or)
from the company to the market.
DELIVERY: It is a process of sending the application from the company to the client (or) from the market to the
client.
SRN (SOFTWARE RELEASE NOTE): It is a note prepared by the development department and sent to the testing
department during the release, it contains the following information:
Build path information.
Deployment document path information.
Test data path information.
Known issues path information.
Release manager name.
Release date.
Build number.
Version number.
Module name, etc.
SDN (SOFTWARE DEVELOPMENT NOTE): It is a note prepared by a team of members under the project manager's guidance and given to the customer during the delivery; it contains the following information:
User manual
Known issues
[Diagram: release vs delivery. The development department releases the build, deployment document, test data and known issues to the testing department along with the SRN (BPI: build path information, DDPI: deployment document path information, TDDI: test data path information, KIPI: known issues path information). The company delivers the product to the client, or releases it to the market, along with the SDN (UM: user manual, KI: known issues, VER: version).]
REVIEW: It is defined as either a process of studying or a process of checking, depending upon the role involved.
REVIEW REPORT: It is an outcome document of review, which may contain either list of doubts (or) list of
comments.
PEER REVIEW REPORT: It is an outcome document of peer review, which contains the list of comments.
Ex:
[Diagram: a document under review, with the review report (RR) listing the comments alongside.]
TEST SUITE: The combination of all the different types of test cases is known as a Test Suite.
DEFECT AGE: The time gap between the opening date and the closing date of a defect is known as the Defect Age.
LATENT DEFECT: A defect that is found late, after some releases, is known as a Latent Defect.
SLIP AGE: The extra time taken to accomplish a task is known as Slip Age.
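Both defect age and slip age are date arithmetic. A minimal sketch; the dates and durations below are invented for the example:

```python
from datetime import date

# Defect age: the gap between the reported ("opened") date
# and the resolved ("closed") date.
reported_on = date(2009, 12, 16)
resolved_on = date(2009, 12, 22)
defect_age = (resolved_on - reported_on).days
print(defect_age)  # 6 days

# Slip age: extra time taken beyond the planned duration of a task.
planned_days, actual_days = 10, 13
slip_age = actual_days - planned_days
print(slip_age)  # 3 days
```

Bug tracking tools compute defect age automatically from the detected-date and date-of-closure columns of the defect profile.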
ESCALATION: It is a process of intimating the issue related information to the next level of authority.
TRACEABILITY MATRIX: It is a document containing a table of linking information, used for tracing back to a reference in any kind of confusing or questionable situation.
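Reduced to its essence, a traceability matrix links requirements to the test cases (and defects) that cover them, so any item can be traced back. All IDs below are invented for the illustration:

```python
# A toy traceability matrix: requirement -> covering test cases and defects.
trace = {
    "REQ-001": {"test_cases": ["TC-5", "TC-9"], "defects": ["DEF-1", "DEF-3"]},
    "REQ-002": {"test_cases": ["TC-10", "TC-11"], "defects": ["DEF-4"]},
}

def requirements_for(test_case_id):
    """Trace a test case back to the requirement(s) it covers."""
    return [req for req, links in trace.items()
            if test_case_id in links["test_cases"]]

assert requirements_for("TC-10") == ["REQ-002"]
assert requirements_for("TC-5") == ["REQ-001"]
```

In practice the same table also answers the reverse question, e.g. which requirements have no covering test case at all.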
PROTOTYPE: It is a roughly and rapidly developed model used for demonstration to the client, in order to gather clear requirements and also to win the confidence of a customer.
TEMPLATE: It is a predefined format, which is used for preparing a document easily and perfectly.
BENCH MARK: The standard with which usually we compare with is known as Bench Mark.
CHANGE REQUEST: It is a process of requesting changes; to do so, customers will use the change request template.
IMPACT ANALYSIS: Whenever the customer proposes some changes, the senior analyst will analyze, How much
impact will fall on the already developed part? This process is known as Impact Analysis.
WALK THROUGH: It is defined as an informal meeting between two or more roles, may be for checking something
(or) for transferring something.
CODE WALK THROUGH: It is a process of checking conducted on the source code document, in order to confirm
whether it is developed according to the coding standards or not.
CODE OPTIMIZATION/FINE TUNING: It is a process of reducing the number of lines of code (or) complexity of code,
in order to increase the performance.
PPM (PERIODIC PROJECT MEETING): It is a meeting conducted periodically, in order to discuss the status of the
project. Usually they discuss the following points:
Percentage covered in the project during the period
Percentage not covered in the project during the period
Tasks completed during that period.
Defects found during that period.
Slip ages.
Reasons for the slip ages.
Technical issues.
HR related issues.
PPR (PERIODIC PROJECT REPORT): It is a report prepared by Team Lead by interacting with the team members
before the PPM is conducted. This document contains the information related to all the above said points.
MRM (MANAGEMENT REPRESENTATIVE MEETING): It is a meeting conducted, in order to discuss the status of the
company. Usually they discuss the following points:
Success rate and growth rate of the company.
Projects that are recently signed off.
Projects that are in the pipeline.
Customers’ appraisals (Good comments).
Future plans.
Internal audit reports.
Individual appraisals.
PATCH: Whenever the test engineers suspend a build, the developers will rectify the problems and release the same build again as a Patch.
WORK AROUND: A workaround is a method, sometimes used temporarily, for achieving a task or goal when the
usual or planned method isn't working. In information technology, a workaround is often used to overcome
hardware, programming, or communication problems. Once a problem is fixed, a workaround is usually abandoned.
WAYS OF TESTING:
1. MANUAL TESTING
2. AUTOMATION TESTING
1. MANUAL TESTING: Manual Testing is a process in which all the phases of the STLC (Software Testing Life Cycle), namely Test Planning, Test Development, Test Execution, Result Analysis, Bug Tracking and Reporting, are accomplished manually with human effort.
DRAWBACKS:
4. Tiredness.
5. Simultaneous actions are almost impossible.
6. Repeating the same task again and again in the same fashion is almost impossible.
2. AUTOMATION TESTING: Automation Testing is a process in which the drawbacks of Manual Testing are addressed properly, adding speed and accuracy to the existing testing process.
DRAWBACKS:
Note: Automation Testing is not a replacement for Manual Testing; it is a continuation of Manual Testing.