Smoke testing is designed to quickly test the application and to identify any potential issues that can
lead to failure of major functionality. Testing teams will have information about which functionalities are
being released in a particular iteration. Only when smoke testing is completed successfully does the testing
team pick up the rest of the functionality.
Accessibility testing will follow the guidelines specified in W3C Web Content Accessibility Guidelines 2.0,
Conformance Level A.
Security testing is a process to determine that an information system protects data and maintains functionality
as intended.
The basic security concepts that need to be covered by security testing are confidentiality, integrity,
authentication, availability, authorization, and non-repudiation, exercised through techniques such as
parameter tampering, cookie poisoning, stealth commanding, and forceful browsing.
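As an illustration of parameter tampering, the sketch below simulates a server-side check that must never trust a client-submitted value. The catalog data, endpoint name, and price field are all hypothetical, assumed only for this example.

```python
# Hypothetical server-side order handler: the submitted price must be
# recomputed from the server's own catalog, never trusted from the client.
CATALOG = {"SKU-1": 49.99}  # illustrative server-side source of truth

def place_order(params):
    """Simulates server-side handling of an order request."""
    sku = params["sku"]
    server_price = CATALOG[sku]
    # Reject the request if the submitted price disagrees with the catalog.
    if float(params["price"]) != server_price:
        return {"status": 400, "error": "price mismatch"}
    return {"status": 200, "charged": server_price}

legit = place_order({"sku": "SKU-1", "price": "49.99"})
tampered = place_order({"sku": "SKU-1", "price": "0.01"})
assert legit["status"] == 200
assert tampered["status"] == 400  # tampered price must be rejected
```

A security test case along these lines submits deliberately altered parameter values and verifies the server rejects them rather than acting on the tampered data.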
Browser compatibility
Though relatively simple to do, it pays to spend enough time testing in this area. Decide on the lowest level of
compatibility to support, and test that the system does indeed work without problems on the earliest supported
as well as the latest browser versions.
Even with the same release version, browsers behave differently on different platforms, and when used with
different language options. Testing should cover at least the main platforms (Unix, Windows, Mac, and Linux)
and the expected language options.
Session Management
Most applications and Web servers configure sessions so that they expire after a set time. Attempting to
access a session object that has expired causes an error, and must be handled within the code. Testing of
session expiration is often overlooked, largely because under normal operational circumstances, session
expiration is unlikely to occur.
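A session-expiry test can be sketched with a minimal in-memory store; the TTL value and class shape are assumptions for illustration, not the application's actual session implementation.

```python
import time

class SessionStore:
    """Minimal in-memory session store with a fixed expiry (illustrative)."""
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.sessions = {}

    def create(self, session_id):
        self.sessions[session_id] = time.monotonic()

    def get(self, session_id):
        created = self.sessions.get(session_id)
        if created is None or time.monotonic() - created > self.ttl:
            # Expired or unknown sessions must surface a handled error,
            # not crash the application.
            raise KeyError("session expired or not found")
        return session_id

store = SessionStore(ttl_seconds=0.05)
store.create("abc")
assert store.get("abc") == "abc"   # valid while fresh
time.sleep(0.1)                    # wait past the expiry window
try:
    store.get("abc")
    handled = False
except KeyError:
    handled = True                 # expiry surfaced as a handled error
assert handled
```

The point of the test is the second half: deliberately waiting past the expiry window and confirming the error path is exercised, which normal operation rarely does.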
Usability
Site navigation is crucial for attracting customers and retaining them. Sophisticated Web sites, such as for
travel booking, need to pay particular attention to navigation issues.
Large entity catalogs are central to many trading systems. Clients should be able to quickly browse and
search through catalogs. Developers can define tests to measure the effectiveness of entity navigation
mechanisms. For example, you could test that a search on particular keywords brings up the correct entities.
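Such a keyword-search test can be sketched as follows; the catalog entries and search function are illustrative stand-ins, since a real test would run against the application's own search facility with representative data.

```python
# Tiny illustrative catalog; a real test would load representative data.
CATALOG = [
    {"id": 1, "name": "London city break"},
    {"id": 2, "name": "Paris weekend"},
    {"id": 3, "name": "London airport hotel"},
]

def search(keyword):
    """Case-insensitive keyword search over entity names."""
    kw = keyword.lower()
    return [e["id"] for e in CATALOG if kw in e["name"].lower()]

# The test encodes the expected result set for each known keyword.
assert search("london") == [1, 3]
assert search("paris") == [2]
assert search("rome") == []   # and checks for false positives
```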
Availability
Before going live, predicted business usage patterns should indicate maximum stress levels. You should test
system availability against the maximum stress levels plus a safety margin for a defined period of time.
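The target stress level is simple arithmetic once the predicted peak is known; the 25% safety margin below is an assumed figure, not a prescribed one.

```python
def target_stress_level(predicted_peak, safety_margin=0.25):
    """Target test load = predicted peak plus a safety margin (assumed 25%)."""
    return predicted_peak * (1 + safety_margin)

# e.g. 400 predicted concurrent users -> stress-test against 500
assert target_stress_level(400) == 500.0
```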
Internationalization
Does the site offer the option to view non-English pages? If the choice of language is based on browser
preferences, does it work on all the desired browsers? Many older browsers do not support language
customization. We need to test all the above aspects.
Test that words are correctly displayed and that sentences are grammatically correct. Use a native speaker to
verify that this is the case.
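When language choice is driven by browser preferences, the negotiation logic itself can be tested. The sketch below is a simplified Accept-Language picker for test purposes, not a full RFC-compliant parser; the supported-language sets and default are assumptions.

```python
def preferred_language(accept_language, supported):
    """Pick the best supported language from an Accept-Language header.
    Simplified q-value parsing for illustration only."""
    ranked = []
    for part in accept_language.split(","):
        pieces = part.strip().split(";q=")
        lang = pieces[0].strip().lower()
        q = float(pieces[1]) if len(pieces) > 1 else 1.0
        ranked.append((q, lang))
    # Highest q-value wins among the languages the site supports.
    for _, lang in sorted(ranked, reverse=True):
        if lang in supported:
            return lang
    return "en"  # assumed site default

assert preferred_language("fr-fr;q=0.9,en;q=0.8", {"en", "fr-fr"}) == "fr-fr"
assert preferred_language("de;q=0.7", {"en"}) == "en"  # falls back to default
```

Tests like these pin down what should happen for each browser preference combination, including older browsers that send no language header at all.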
System Integration
The data interface defines the format of data exchanged by front- and back-end systems. Tools such as XML
(Extensible Markup Language) alleviate data interface problems by providing document type definitions.
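A basic interface test checks that an exchanged message is well-formed and carries the agreed fields. The message shape below is invented for illustration; note that full DTD/schema validation needs a validating parser (e.g. lxml), which the Python standard library does not provide, so this sketch checks well-formedness and required fields only.

```python
import xml.etree.ElementTree as ET

# Hypothetical interface message; the element names are assumptions.
message = "<order><id>42</id><amount>10.00</amount></order>"

root = ET.fromstring(message)       # raises ParseError if not well-formed
assert root.tag == "order"
assert root.find("id") is not None      # required field present
assert root.find("amount") is not None  # required field present
```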
The processing between front- and back-end systems may be time dependent. For example, a back-end
system may need to process a data transmission from the front-end system immediately or within a
defined period. Tests should ascertain whether a system actually observes timeliness constraints and whether
it activates data transmissions at the correct time.
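A timeliness check reduces to comparing timestamps against the agreed window; the 30-second deadline below is an assumed contract value for illustration.

```python
def within_deadline(sent_at, processed_at, max_delay_seconds):
    """True if the back-end processed the transmission within the agreed window."""
    return 0 <= processed_at - sent_at <= max_delay_seconds

# Assumed contract: processing must complete within 30 seconds of sending.
assert within_deadline(100.0, 120.0, 30)       # processed in 20s -> OK
assert not within_deadline(100.0, 140.0, 30)   # 40s -> violates the contract
```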
A low-complexity MDM implementation would contain only the basic features of MDM. Integration with 3rd-party
vendors would be very limited, and there would not be integration with multiple vendors. Any changes to the
application would be straightforward.
Testing of Interface
Testing of integration with standalone applications
Test data management is simpler in these implementations. Since the teams are smaller in size and the content on
the sites is not heavy, the tester will have to manage smaller test data sets.
A medium-complexity MDM implementation would contain all the basic features as well as some advanced features of
e-commerce. There would be multiple touch points with other vendors. The information in these systems is not
hardcoded, and backend systems are used for publishing data on the website.
The testing of such systems would include all that is mentioned under Low complexity implementation and the items
mentioned below:
Test data management would require pre-planning and co-ordination with other vendors. There should be a
communication channel wherein the data coming into the system and going out of it is coordinated with 3rd-party
vendors.
A large-complexity MDM implementation would contain all the advanced features and would involve multiple
integrations with different 3rd-party systems. The implementation would be complex and would require substantial
testing effort. The data from one system would flow to multiple systems.
The testing of such systems would include all that is mentioned in low and medium complexity implementations and
the items mentioned below:
Test data management would require proper planning and co-ordination with other vendors. The ownership of test
data needs to be defined and a central repository needs to be in place from where everyone can pick up data and use
it.
Test Global and Test Distributed: The MDM system is global in spirit and structure. The different underlying systems
may be on different continents, but they appear to integrate seamlessly over large, distributed and non-homogeneous
environments.
Consider User Profile: The user profile varies in terms of role (Admin or Normal user). While testing the application,
ensure that all the access rights of each user profile are taken care of.
In order to ensure defects are on a path to resolution, daily sessions (TAR sessions) will be conducted to
review defects with the Client during the SIT and UAT phases. These sessions are critical to the success of
testing, so it is essential that key lead resources attend. During these sessions, we review the
outstanding issues (New, Open, and Reopen) by the priorities identified below. Some issues may require
breakout sessions with other teams in order to resolve. These defects will be discussed briefly in the
meetings, and follow-up meetings will be held immediately following the session.
The duration of the meeting will vary based upon the number of open defects, and the speed at which defects
are being worked.
The defect triage flow (reconstructed from the workflow diagram) is:
1. Is it a defect? If not (the issue is not a valid defect, or is an enhancement, training, or requirement item),
the status is changed to "Reject – Invalid Defect". If yes, the defect is assigned to the Dev TL with a priority,
and then to a developer with status "Open".
2. Is the defect a duplicate? If yes, the status is changed to "Reject – Duplicate" and the developer mentions the
duplicate defect ID.
3. Is the defect reproducible? If not, the status is changed to "Reject – Can't be Reproduced" and the tester adds
comments in the Description field. If yes, the developer fixes the defect.
4. The fixed defect is assigned to the tester, who retests it for the correct fix. Once verified, the defect status
is changed to "Closed" and the flow stops.
The tester will enter all defects into the defect tracking system and assign them to the QA TL in New
status. A defect will contain:
Title/Summary
Description with steps to reproduce and screen shot or relevant information
Status
Priority
Module Name
Discovered In release
The defect is then discussed in TAR session.
If the defect is acknowledged to be "Invalid" (that is, the defect raised is invalid as per the current
requirement), the defect is marked as Reject – Invalid Defect.
If the defect is acknowledged to be “Valid”, then it is assigned to the Development TL who assigns it to the
concerned developer.
When a developer acknowledges the defect, they will change the status to Open and start working on
the defect.
If the developer finds a defect to be a duplicate of an existing open defect, they will assign it back to the
testers, mentioning the exact duplicate defect ID. Testers re-verify whether it is actually a duplicate.
If it is, the defect is marked as Reject – Duplicate; otherwise it is re-assigned back to the developers with a
proper comment, in Open status.
If the developer cannot reproduce the defect, they will change the status to Reject – Can't be Reproduced
and assign it back to the tester. The tester again tries to reproduce it; if it is reproducible, the tester
assigns it back to the developer with an appropriate comment in Open state, else it is Closed.
If the defect is reproducible, the developer fixes it, changes its status to Fixed, and keeps it until the
next build is provided to QA.
Before every release, developers set the status to Ready to Retest, mention the build in which the
defect has been fixed, and assign it back to the QA TL, who assigns it to the tester.
Once the defect fixes are released, the tester will re-test to verify the defect. If the defect has been fixed,
the status will be changed to Closed. If the defect has not been fixed, s/he will assign it back to the
developer with status Reopen.
If the review team or developer requires further information about the issue, they will change the status to
More Info Required and assign it back to testers. Testers provide required information and assign the
defect back to developer in Open status
Before every release, if there are certain defects known to the developers, they log them with status Known Issue.
When a defect is raised and the review team acknowledges it as a known limitation of the system but
decides against fixing it and releases with the defect, the status will be set to 'Known Issue'.
If the testing team has any suggestion about the functionality or UI which may result in a requirement change, a
"Suggestion" will be logged for the same. 'Suggestion' is a category of defect, and these artifacts
will not be counted in the QA defect report.
The QA team will work only on those defects which are assigned to QA team members and are in
Re-Test/Cannot Reproduce/More Info Required/Duplicate status.
1. New – Indicates new defect is logged on to defect tracking tool along with the Severity of the Defect. All
“New” Defects are assigned to QA TL.
2. Open – Defect is assigned “Open” status when it is assigned to a Developer for fix along with Priority
assigned against it.
3. Fixed – The developer, after fixing the defect, changes its status to "Fixed" and does not assign it to anyone.
4. Ready to Retest – After the defect is fixed by the developer, it is released to the QA environment for retest
once a new build containing the fix is deployed. The defect status is then changed to "Ready to Retest" and
assigned to the QA TL, who in turn assigns it to testers for retest.
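The workflow above can be summarised as a state-transition table, which also makes it testable in code. This is a simplified sketch of the lifecycle described here, not an exact model of the defect tracking tool; the transition set is an assumption distilled from the text.

```python
# Hedged sketch: allowed status transitions distilled from the workflow text.
TRANSITIONS = {
    "New": {"Open", "Reject - Invalid Defect"},
    "Open": {"Fixed", "Reject - Duplicate", "Reject - Can't be Reproduced",
             "More Info Required"},
    "More Info Required": {"Open"},
    "Reject - Can't be Reproduced": {"Open", "Closed"},
    "Fixed": {"Ready to Retest"},
    "Ready to Retest": {"Closed", "Reopen"},
    "Reopen": {"Fixed"},
}

def move(status, new_status):
    """Raise on a transition the workflow does not allow."""
    if new_status not in TRANSITIONS.get(status, set()):
        raise ValueError(f"illegal transition {status} -> {new_status}")
    return new_status

# Walk the happy path: New -> Open -> Fixed -> Ready to Retest -> Closed.
s = move("New", "Open")
s = move(s, "Fixed")
s = move(s, "Ready to Retest")
assert move(s, "Closed") == "Closed"

# Illegal jumps are rejected.
try:
    move("New", "Closed")
    ok = False
except ValueError:
    ok = True
assert ok   # New cannot jump straight to Closed
```

A table like this is useful in review: every status in the lifecycle should appear, and any status with no outgoing transitions other than Closed is a terminal state.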
In normal scenarios, testing will be done only in QA environment. If the QA environment is inaccessible due to
technical issues then testing will be done on Dev environment.
We will use the following client environment setup for our testing:
Software/OS Version
Windows XP SP-4
Linux
Mac
The following status reports will be generated only during UAT phase to track testing and provide visibility
into the status of testing, outstanding issues and risks.
Test Lead (Client)
- Plans and co-ordinates test activities for SIT/UAT testing at the Client test environment
- Ensures testers have access to the System Integration Testing test environment
- Provides test data required from Client internal systems and external systems
- Participates in TAR sessions/defect prioritization meetings during SIT/UAT
Tester(s) (Client)
- Responsible for Security (Cybercom for external security testing), Migration (migration imports performed by
Sapient would be verified by Sapient, whereas overall migration testing would be done by Client), and User
Acceptance testing
- Carries out System Integration testing to verify that Client's internal and external systems work correctly
after integration
Tester(s) (Sapient)
- Understands the requirements/stories and the application
- Creates and updates test cases
- Executes test scripts/cases as per the plan
- Captures defects in the defect tracking tool
- Retests fixed defects and closes them