Thursday, July 15, 2010

Black Box Testing, Its Advantages and Disadvantages

The advantages and disadvantages of black box testing

Black Box Testing is a testing technique where no knowledge of the internal functionality and structure of the system is available. This testing technique treats the system as a black box or closed box. The tester only knows the formal inputs and expected outputs, but does not know how the program actually arrives at those outputs. As a result, all testing must be based on functional specifications. For this reason black box testing is also considered to be functional testing and is also a form of behavioral testing or opaque box testing or simply closed box testing. Although black box testing is behavioral testing, behavioral test design is slightly different from black box test design because internal knowledge may be available in behavioral testing.
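For instance, a black-box test case exercises only the documented inputs and outputs. A minimal sketch in Ruby (the `discount` function and its business rule are invented for illustration):

```ruby
# Hypothetical specification: "orders of 100 or more units get a 10% discount".
# The tester sees only this rule, never the implementation below.
def discount(quantity, unit_price)
  total = quantity * unit_price
  quantity >= 100 ? total * 0.9 : total
end

# Black-box test cases derived purely from the functional specification:
raise "full price expected" unless discount(99, 10.0) == 990.0
raise "discount expected"   unless (discount(100, 10.0) - 900.0).abs < 1e-9
puts "black-box checks passed"
```

Note that the tester chooses 99 and 100 because the specification mentions the boundary at 100, not because of anything visible in the code.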

Advantages of Black Box Testing

  • Efficient when used on large systems.
  • Since the tester and developer are independent of each other, testing is balanced and unprejudiced.
  • Tester can be non-technical.
  • There is no need for the tester to have detailed functional knowledge of the system.
  • Tests are done from the end user's point of view, since it is the end user who must accept the system. (This testing technique is sometimes also called acceptance testing.)
  • Testing helps to identify vagueness and contradictions in functional specifications.
  • Test cases can be designed as soon as the functional specifications are complete.

Disadvantages of Black Box Testing

  • Test cases are challenging to design without having clear functional specifications.
  • It is difficult to identify tricky inputs if the test cases are not developed based on specifications.
  • It is difficult to identify all possible inputs in limited testing time. As a result, writing test cases may be slow and difficult.
  • There are chances of having unidentified paths during the testing process.
  • There is a high probability of repeating tests already performed by the programmer.
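The input-explosion problem above is commonly mitigated with equivalence partitioning and boundary-value analysis: pick one representative input per partition, plus the values at each boundary. A sketch, assuming a hypothetical rule that ages 18 through 65 are valid:

```ruby
# Hypothetical specification: ages 18..65 are accepted.
def valid_age?(age)
  (18..65).cover?(age)
end

# Rather than all possible integers, test one value per equivalence
# partition plus the boundary values on each side.
cases = {
  17 => false,  # just below the lower boundary (invalid partition)
  18 => true,   # lower boundary
  40 => true,   # representative of the valid partition
  65 => true,   # upper boundary
  66 => false   # just above the upper boundary (invalid partition)
}

cases.each do |age, expected|
  raise "age #{age} failed" unless valid_age?(age) == expected
end
puts "boundary checks passed"
```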

Wednesday, July 14, 2010

QTP Material - VBScript

Chapter 03-VBScript.pdf

Scripting QTP - CH04 - Advanced VBScrict.pdf

Scripting QTP - CH05 - Regular Expressions.pdf

CH06 - Error Handling.pdf

Scripting QTP - Chapter 07- Working with Files.pdf



Scripting QTP - CH08 - ADODB.pdf

Scripting QTP - CH09 - WSH.pdf

CH10 - SVG.pdf

Scripting QTP - CH11 - shell32.pdf


CH13 - Win32 API.pdf

Scripting QTP - CH14 - DotNetFactory.pdf


CH16 - Accessing PDF.pdf

Test Automation Architecture

Product Testing Life Cycle

Fields of an Error Tracking Record

Tuesday, July 13, 2010

Checking Error Reports and Developer Feedback

Performance Testing Process Chart

Game Testing Process Chart

QA Life Cycle Model

Verification and validation techniques

Bug-life-cycle.jpg

IEEE 829 standard for software testing documentation

One of the challenges facing software testers has been the availability of an agreed set of document standards and templates for testing. The IEEE 829 provides an internationally recognised set of standards for test planning documentation.

IEEE 829 has been developed specifically with software testing in mind and is applicable to each stage of the testing life cycle including system and acceptance testing.
Types of Document
The IEEE 829 standard covers 8 document types.
Test specification
• Test Plan: Covers how the testing will be managed, scheduled, and executed.
• Test Design Specification: Defines logically what needs to be tested by examining the requirements or features; these requirements can then be converted into test conditions.
• Test Case Specification: Converts the test conditions into test cases by adding real data, pre-conditions, and expected results.
• Test Procedure: Describes in practical terms how the tests are run.
• Test Item Transmittal Report: Specifies the items released for testing.
Test execution
• Test Log: An audit trail that records the details of tests chronologically.
• Test Incident Report: Records details of any unexpected events and behaviours that need to be investigated.
Test reporting
• Test Summary Report: Summarises and evaluates the tests.
Documentation for test specification
The test preparation is by far the most important part of any software testing project. During this stage you must create your tests and define the requirements of your test environment.
IEEE 829 - Test Plan
The Test Plan describes how you will deliver the testing.
1. Test Plan Identifier
2. References
3. Introduction
4. Test Items
5. Software Risk Issues
6. Features to be tested
7. Features not to be tested
8. Approach
9. Item Pass/Fail Criteria
10. Suspension Criteria and Resumption Requirements
11. Test Deliverables
12. Remaining Test Tasks
13. Environmental Needs
14. Staffing and Training Needs
15. Responsibilities
16. Schedule
17. Planning Risks and Contingencies
18. Approvals
19. Glossary
IEEE 829 - Test Design Specification
The test design specification identifies the test conditions from the requirements and functional design documentation.

Let’s use a Banking project example where the following testing requirements have been defined. In this case Bank A is rolling out new “hole in the wall” machines all over the country and the project will include testing the functionality of the ATMs to demonstrate the ability to:
• Complete a valid withdrawal of funds.
• Print a report summary of recent transactions.
• Change passwords.
The test design does not record the values to be entered for a test, but describes the requirements for defining those values. This is done at a logical level and should not be cluttered with individual examples and data. The design specification provides a link between test requirements and test cases.
IEEE 829 - Test Case Specification
The test cases can be produced when the test design is completed. A test case may cover one or more test conditions derived from the test design. A test case should include:
• The precise data that is required to execute the test. This is a combination of input data, and application and system data that is needed.
• The expected results and outputs
• Pre-conditions that will determine the starting point for each test.
A feature (requirement) from the test design may be tested in more than one test case, and a test case may test more than one feature. The aim is for the set of test cases to test each feature (requirement) in the test design at least once. Taking the ATM project example, the first two requirements could be tested using one test case:
• Test case 1 could be defined to include the completion of a cash withdrawal from the ATM and then a printout request to show this withdrawal has been correctly executed and the right amount has been debited.
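That ATM test case can be sketched as a structured record whose fields mirror the IEEE 829 test case specification (the identifiers and data values below are invented for illustration):

```ruby
# Sketch of an IEEE 829-style test case specification as a Ruby hash.
test_case = {
  id:            "TC-ATM-001",
  features:      ["Withdraw funds", "Print transaction summary"],
  preconditions: ["Account balance is 500.00",
                  "Valid card and PIN available"],
  input_data:    { pin: "1234", withdrawal: 100.00 },
  steps:         ["Insert card and enter PIN",
                  "Withdraw 100.00",
                  "Request a printed summary of recent transactions"],
  expected:      { dispensed: 100.00, new_balance: 400.00,
                   receipt_shows: "withdrawal of 100.00" }
}

# One test case covering two test conditions from the test design:
raise unless test_case[:features].size == 2
raise unless test_case[:expected][:new_balance] ==
             500.00 - test_case[:input_data][:withdrawal]
puts "test case #{test_case[:id]} is internally consistent"
```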
IEEE 829 - Test Procedure Specification
The test procedures are developed from both the test design and the test case specification. The procedure document describes how the tester will physically run the test, the physical set-up required, and the procedure steps that need to be followed. The standard defines ten procedure steps that may be applied when running a test.
IEEE 829 - Test Item Transmittal Report
This document is a handover document and provides details of the previous stage of testing.

Similar to a release note this provides a list of what is being delivered and shows any changes and new items contained. It includes the person responsible for each item, its physical location and its status.
Documentation for Test execution
The schedule of what test cases are run and when, is defined in the Test Plan. The test results are recorded in the test log, and in test incident reports.
IEEE 829 - Test Log
The Test Log records the details of which Test Cases have been run, the order in which they ran, and the result of each test. The result is either a pass, meaning the actual and expected results were identical, or a fail, meaning there was a discrepancy. If there is a discrepancy, one or more Test Incident Reports are raised or updated, and their identifiers are recorded in the Test Log.

The Test Log is important as it allows progress of the testing to be checked, as well as providing valuable information for finding out what caused an incident. If an incident is a coding fault, the fault may have occurred not in the Test Case that failed but in one that was run previously. Thus the sequence of the tests enables the fault to be found.
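The chronological record described above can be sketched as an ordered list of entries, each tying a failed run to the incident reports it raised (the field names and identifiers are illustrative, not mandated verbatim by the standard):

```ruby
# Chronological test log: each entry records what ran, when, and the outcome.
log = []
record = lambda do |case_id, passed, incidents = []|
  log << { case_id: case_id, at: Time.now,
           result: passed ? :pass : :fail, incident_reports: incidents }
end

record.call("TC-ATM-001", true)
record.call("TC-ATM-002", false, ["IR-017"])  # discrepancy, incident raised

# The run order lets a coding fault be traced back through earlier tests.
failure = log.find { |e| e[:result] == :fail }
raise unless failure[:incident_reports] == ["IR-017"]
puts "log holds #{log.size} entries in execution order"
```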
IEEE 829 -Test Incident Report
This report documents any event that requires subsequent investigation. An incident should be raised when there is an unexpected result or any unexpected behaviour during testing. At this point it may not always be clear whether there is a bug or fault in the software, since incidents can occur as a result of configuration errors, faults in the software, faults in the requirements, or incorrect expected results recorded in the test case.

The report consists of all details of the incident such as actual and expected results, when it failed, and any supporting evidence that will help in its resolution. The report will also include, if possible, an assessment of the impact upon testing of an incident.
Documentation for test reporting
Eventually testing will be completed according to the criteria specified in the Test Plan. This is when the success or failure of the system is decided based on the results. The Test Summary records this information.
IEEE 829 - Test Summary Report
The summary provides the results of the designated testing activities and an evaluation of these results.

Moreover the summary provides an overall view of the testing and the quality of the software.
Use of the standard
The IEEE 829 standard is a good starting point for creating a sensible structure for your testing documents. However, it should be customised or changed to reflect the needs of your project.

Test Automation Tools Comparison Matrix

Tools compared: HP (Mercury) QuickTest Professional (QTP), IBM Rational Functional Tester (RFT), AutomatedQA TestComplete, WATIR, Selenium, and WET.

Object Recognition
• QTP: Smart object recognition technology combined with Windows-handle-based recognition and operation. Enables easy and robust recognition of objects.
• RFT: ScriptAssure technology, which also enables easy and robust recognition of objects.
• TestComplete: Object recognition as powerful as QTP's (if not more). Supports web-based and desktop applications as well as applications running on portable devices such as PDAs, Pocket PCs, and smartphones. Object name mapping is flexible and customizable (in TestComplete 7).
• WATIR: Fairly good object recognition. However, when accessing dynamic menus, workarounds are needed. For example, the following code clicks a dynamic menu element; the menu is a table that pops up when the mouse hovers over its title:

ie.image(:src, 'path_to_the_menu_tile_image/loanapp_mnu.gif').fire_event("onmouseover")
tempTable1 = ie.table(:id, 'Table_ID')
tempTable1[8][1].click

This code simulates the user action of hovering over the menu title and clicking the specified menu element.
• Selenium: Good object recognition. Objects can be identified by name or ID, or located using XPath.
• WET: Supported.
Browsers Supported
• QTP: IE, Firefox.
• RFT: IE, Firefox.
• TestComplete: IE7, IE8, up to Firefox 3.0.12.
• WATIR: IE, Firefox, and recently Safari.
• Selenium: IE (only partly, because Selenium IDE does not support IE), Firefox, Safari, Chrome, and Opera.
• WET: IE 6.0 (version 6.0.2800.1106).

Recording and Playback
• QTP: Stable recording and playback. Records in VBScript.
• RFT: Stable recording and playback.
• TestComplete: Stable recording and playback. Can record keyword-based scripts, but normally records object-based operations.
• WATIR: Watir Recorder++ is available, but it is a very simple recorder. Playback is stable.
• Selenium: Advanced recorder: Selenium IDE.
• WET: Supported.

Operating Systems
• QTP: Windows.
• RFT: Windows, Linux, and Mac.
• TestComplete: Windows (including 64-bit).
• WATIR: Windows (IE, Firefox), Linux (Firefox), Mac (Safari, Firefox).
• Selenium: Windows, Linux, Mac.
• WET: Windows 98/ME/2000 SP3/XP SP2/Server 2003.

Test Results Reports
• QTP: XML-based execution log; every single step is logged. Viewed as a collapsible/expandable tree. Can be manually transformed into a web-page report (using an XSL transformation script). From QTP 10 the log can be exported into other document formats.
• RFT: Detailed reporting supported.
• TestComplete: Detailed execution log, displayed via a built-in viewer. Can be exported as HTML and XML. The log shows details of checkpoint failures and image snapshots.
• WATIR: No reporting facilities. We pumped our test results dynamically to a database while the script ran; a grid should then be developed to display the results in a user-friendly way.
• Selenium: Reporting is supported through the test runner and various logs; screenshots can also be captured.
• WET: Results are displayed in an easily readable HTML format.

Support for Terminal Applications
• QTP: Supported through an extension (TE add-in).
• RFT: Supported through an extension.
• TestComplete: N/A.
• WATIR: N/A.
• Selenium: N/A.
• WET: Not supported, since WET is mainly for testing web applications.

Programming Language
• QTP: VBScript. Can also include external compiled modules, and can access objects and invoke methods using COM.
• RFT: Java and VB .NET.
• TestComplete: VBScript, DelphiScript, JScript, and JScript variants (e.g., C++Script, C#Script). JScript (a dialect of ECMAScript) is recommended.
• WATIR: Ruby scripting; an easy programming language.
• Selenium: The native language is "Selenese"; scripts can be exported as C++, C#, Java, Ruby, Python, and HTML.
• WET: Scripts are written in Ruby.

Version Control and Script Sharing
• QTP: Scripts can be shared among automation developers, but HP Quality Center is needed to achieve this. A separate version control tool can be used, but many artifacts are binary (e.g., the object repository), so they are hard to merge and diff.
• RFT: Scripts can be shared among developers, using either IBM ClearCase or the free CVS. We used CVS in a previous SLSS project and it proved reliable.
• TestComplete: Scripts can be shared among developers and run on a standalone playback application called TestExecute. Integrates with VSS, ClearCase, and Team Foundation Server via an SCC provider.
• WATIR: Not built in, but something like TortoiseSVN (free and easy to use) works well.
• Selenium: Scripts can be saved in the native language or exported in a supported language for inclusion in test frameworks such as JUnit or NUnit. Version control can be done with any source control tool, such as SVN or CVS.
• WET: Not supported.

Speed of Execution
• QTP: Fast, customizable in many ways.
• RFT: Medium.
• TestComplete: Medium to fast.
• WATIR: Generally fast, but speed varies greatly (by a factor of 10) with network topology, so make sure the Watir machine and the web application under test are network neighbors; arrange that with your network admin. This rule applies to all automation tools.
• Selenium: The Selenium IDE execution speed can be varied from "fast" to "slow"; if scripts are exported to a supported language, execution speed can be adjusted programmatically.
• WET: Medium; however, recording time may vary from application to application.

Data Pools for Data-Driven Testing
• QTP: "DataTable" object (a set of worksheets) and "Environment" object (an XML tree). Can easily create any required data model, or access Excel workbooks and XML trees directly through COM.
• RFT: Supported.
• TestComplete: Supports database tables, files, images, XML, and Excel via Storages. DB, CSV, and Excel can also be accessed using TestComplete's DDT object.
• WATIR: Supported through a third-party driver.
• Selenium: Supported.
• WET: Supported; data tables can use either an Excel spreadsheet or XML.

Database Access
• QTP: Supported, through ADO.
• RFT: Supported.
• TestComplete: Supported via DDT or COM.
• WATIR: Supported through a third-party driver.
• Selenium: Supported.
• WET: Supported through a third-party driver.
Issues and Concerns
• QTP: Problems with terminal application execution. Speed is not optimized, and hardware requirements are demanding. Version-controlling the scripts is very expensive.
• TestComplete: AutomatedQA support is via an outdated, non-standard newsgroup that does not work; they really need a web-based support system. TestComplete has a slew of hard-to-reproduce issues, which makes them difficult for the vendor to fix. Sometimes automated execution works great on one machine but not on another, and the same happens when playing back on different operating systems. TestComplete 7 introduces performance enhancements to the Object Browser (the map of the application's objects), but we found the enhancement breaks parts of our automation framework. Your mileage may vary.
• WATIR: A very basic recorder that requires a lot of editing, which is time-consuming. Crashes occur sporadically. On IE, we recommend the IE Developer Toolbar and the Web Metrics RIA script recorder for analysing frames, indexes, etc.
• Selenium: Components are difficult to set up. Online documents are not overly helpful, as they assume knowledge and omit key details. Some files required for setting up RC and Grid are not part of the Selenium tools (e.g., ANT, JDK), which also means the required set-up files contain errors in the context of Selenium. Selenium does not natively support features testers need, such as iteration or data-driven testing, and there is difficulty testing Flash that has to be overcome by changing ActionScripts. To extend Selenium it is common to combine a Selenium-based automation framework with a language such as Ruby, which provides iteration and access to Gems.
• WET: Handles JavaScript errors poorly; objects implemented using JavaScript are not recognized by the tool. Recording may take too much time. Keyword-driven testing, automatic exception handling, and interactive test debugging are not supported.
• General: If an application crashes and a Dr. Watson window appears, there is no uniform way to close it and continue the automated run; currently you have to create a third-party script or tool to detect Dr. Watson windows and dismiss them.


Price (USD, without tax)
• QTP: Varies according to license.
• RFT (IBM):
o IBM Rational Functional Tester Authorized User License + SW Subscription & Support 12 Months (D53NFLL): 5,821.00
o IBM Rational Functional Tester Floating User License + SW Subscription & Support 12 Months (D530BLL): 11,235.00
o IBM Rational Functional Tester Authorized User Initial Fixed Term License + SW Subscription & Support 12 Months (D54SHLL): 3,146.00
o IBM Rational Functional Tester Extension Floating User License + SW Subscription & Support 12 Months (D59RMLL): 2,846.00
• TestComplete:
o Named user licenses: Enterprise with one year of maintenance support, $1,999; Standard with one year of maintenance support, $999.
o Floating user licenses: Enterprise with one year of maintenance support, $4,499; Standard with one year of maintenance support, $2,999.
• WATIR: Free.
• Selenium: Free, open source.
• WET: Free.

URL
• TestComplete: http://www.automatedqa.com/products/testcomplete/
• WATIR: http://watir.com/
• Selenium: http://seleniumhq.org
Software Test Estimation - 9 General Tips on How to Estimate Testing Time Accurately


Posted In | Testing Life cycle, Test strategy



For the success of any project, test estimation and proper execution are as important as the development cycle. Sticking to the estimate is very important for building a good reputation with the client.

Experience plays a major role in estimating "software testing efforts". Working on varied projects helps you prepare accurate estimates for the testing cycle. Obviously, one cannot just blindly assign some number of days to a testing task. Test estimates should be realistic and accurate.

In this article I try to put down, in a very simple manner, some points that are helpful in preparing good test estimates. I am not going to discuss standard estimation methods such as testing metrics; instead, these are tips on how to estimate testing effort for any testing task, which I learned from my own experience.

Factors Affecting Software Test Estimation, and General Tips to Estimate Accurately:

1) Think of Some Buffer Time
The estimate should include some buffer, but not an unrealistic one. A buffer in the estimate helps the team cope with any delays that may occur, and also helps ensure maximum test coverage.
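As a concrete sketch, a buffer is best applied as a modest fixed percentage on top of the base estimate rather than an arbitrary pad. The task names and the 15% figure below are purely illustrative, not recommendations from this article:

```ruby
# Base estimates in person-days for each testing task (illustrative numbers).
tasks = { test_case_writing: 5.0, execution: 8.0, regression: 3.0 }

BUFFER = 0.15  # example buffer percentage; keep it realistic, not padded

base     = tasks.values.sum
buffered = (base * (1 + BUFFER)).ceil  # round up to whole days

puts "base: #{base} days, with buffer: #{buffered} days"
```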

2) Consider the Bug Cycle
The test estimate should also include the bug cycle. The actual test cycle may take more days than estimated. To avoid this, consider that the test cycle depends on the stability of the build: if the build is not stable, developers will need more time for fixes, and the testing cycle extends accordingly.

3) Availability of All the Resources for Estimated Period
The test estimate should account for all leaves planned by team members (typically long leaves) in the coming weeks or months; this keeps the estimate realistic. The estimate should assume a fixed number of resources for the test cycle. If the number of resources is reduced, the estimate should be revisited and updated accordingly.

4) Can We Do Parallel Testing?
Do you have previous versions of the same product against which you can compare the output? If so, this can make your testing task a bit easier. Base your estimate on the version of the product you are testing.

5) Estimations Can Go Wrong - So Re-visit Them Frequently Before You Commit
In the early stages, we should frequently re-visit the test estimates and make modifications if needed. Once an estimate is frozen, we should not extend it unless there are major changes in the requirements.

6) Think of Your Past Experience to Make Judgments!
Experiences from past projects play a vital role while preparing the time estimates. We can try to avoid all the difficulties or issues that were faced in past projects. We can analyze how the previous estimates were and how much they helped to deliver product on time.

7) Consider the Scope of Project
Know the end objective of the project and the list of all final deliverables. The factors to be considered differ a lot between small and large projects. Large projects typically include setting up a test bed, generating test data, writing test scripts, and so on, so the estimate should be based on all these factors. In small projects, the test cycle typically includes test case writing, execution, and regression.

8) Are You Going to Perform Load Testing?
If you need to spend considerable time on performance testing, estimate accordingly. Estimates for projects that involve load testing should be treated differently.

9) Do You Know Your Team?
If you know the strengths and weaknesses of the individuals working in your team, you can estimate testing tasks more precisely. When estimating, consider that not all resources will yield the same productivity level; some people execute faster than others. Though this is not a major factor, it adds to the total delay in deliverables.

And finally, tip number 10.
Over To You!
This test estimation tip is purposely left blank so that you can share your best estimation techniques in the comments section below.

Software Development Method

• Phase 1 - Architecture
o
Analysis
We don't write a line of code until we know your business as well as the people who keep it running every day. During our initial Analysis stage, we'll interview your key employees and stakeholders to get a thorough sense of all your company's operations—and the pain points we'll need to address in our software. The way your company works will be reflected in the solution we create.
Objectives
 Review existing business process
 Identify organization/brand position
 Identify user profiles
 Create use cases
 Research supporting technologies


o
Interface Design
Once we know exactly how our software fits into your business process, we build a functioning model that allows users to interact with it. This step is a key principle of Interface-Driven Architecture and a critical part of our process. Because we do it early in the development stage, it allows us to ensure your software is intuitive, easy to use, and really does what you need it to do—well before launch.
Objectives
 Define site/application structure
 Create site/application design
 Design user interaction
o
Usability Testing
Once we've designed your prototype, you and your stakeholders take it for a test drive. We assemble focus groups of target users to test the product, leave comments on-screen, and deliver feedback until we get the usability just right. This methodical usability testing process lets us ensure the software is both effective and easy to use—well before implementation.
Objectives
 Assemble virtual focus groups
 Test use cases
• Phase 2 - Fabrication
o
Software Engineering
You can't build a house without a blueprint. And you can't code software without a component model. This is the foundation on which your software will be built, including all best practices, design patterns, UML diagrams, and relational databases.
Objectives
 Create domain model
 Create data schema
 Create API (Application Programming Interface)
 Create user documentation/help system
o
Coding
The Interface-Driven Architecture methodology makes coding faster and easier for us—and less expensive for you. Unlike a traditional development process, ours establishes all the system's requirements before a line of code is written. And because our programmers are knowledgeable in a wide range of programming languages, we have the expertise to choose the language and database that will work best for your project.
Objectives
 Develop system component
 Perform unit testing
 Perform coverage testing
 Perform load testing
o
BETA Testing
During the coding phase, we build the code. In the beta testing phase, we try to break it. Many software developers will leave the beta testing up to the user—so that as you use the "finished" product, you find the errors and request the fixes. We prefer not to deliver the software until it's perfect—so we test in-house. Our dedicated testers work in partnership with coding specialists to find and eliminate problems before the software is launched.
Objectives
 Test all use cases
 Conduct acceptance testing
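The "perform unit testing" objective in the Coding stage can be sketched with Ruby's bundled Minitest framework (the `PriceCalculator` component is a toy stand-in, not part of any real project):

```ruby
require "minitest/autorun"

# Toy component standing in for a unit produced during the Coding stage.
class PriceCalculator
  def total(quantity, unit_price)
    raise ArgumentError, "quantity must be positive" unless quantity > 0
    quantity * unit_price
  end
end

# Unit tests exercise one component in isolation, including its error path.
class PriceCalculatorTest < Minitest::Test
  def test_total
    assert_equal 50, PriceCalculator.new.total(5, 10)
  end

  def test_rejects_nonpositive_quantity
    assert_raises(ArgumentError) { PriceCalculator.new.total(0, 10) }
  end
end
```

Coverage testing, the next objective in the list, would then measure which of the component's branches these tests exercise, for example with a coverage tool such as the SimpleCov gem.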

This Article describes the various types of Responsibilities involved in developing software projects

Product Planning Related
• Manage the vision and strategy of a project
• Control the feature roadmap of the project
• Define integration points within the product which enable internal and external users to provide feedback
• Gather user feedback, communicate with users, and prioritize feature requests
• Prioritize bugs
• Competition review
o Subscribe to news, resources, and competitor blogs/feeds
o Regularly post tidbits of information regarding the competition on internal lists
o Study new releases by competitors and conduct at least one structured presentation for the benefit of all teams on a regular basis (monthly/quarterly)
• Document requirements and specifications
UI Prototyping Related
• Investigate competition and check various alternatives for performing a given task
• Generate prototypes using Axure and Balsamiq
• Usability Testing - Hallway / Focus Groups / Staging etc
UI Design Related
• Design the graphics, Flash, images, icons, HTML, CSS, and JavaScript
UI Text Related
• Write the labels, text, instructions, and error messages on every single page of the UI
• Write every email that will be sent to users from the software
User Experience related
• Reviewing user experience of the product as a whole
• Ensuring UX is intuitive and kick-ass
Architecture Related
• Platform Selection
• Module design
• Define the high level design of the project
• Define the Deployment Architecture
o How will the modules be deployed
o Reverse proxy setup
o Load Balancer Setups
o Vertical/Horizontal Partitioning
• Data structure architecture
o Determine the optimal way in which data will be stored - RDBMS / OODBMS / Proprietary Format etc
o Database structure and architecture
o Clustering setups
o Indexing strategies
o Archival strategies
o CDN
• Caching architecture and platform
• Storage Architecture
o SAN configuration
o RAID configuration
• Conduct relevant tests to determine the most suitable architecture for all of the above
• Perform relevant research to make decisions on all of the above
• Document of all of the above in detail after confirmation
• Document any Complex algorithms
• Conducting generic training sessions on scalability, architecture and design
• Conducting specific training sessions on Project Architecture
Dev Related
• Test driven development
• Writing Unit Tests with close to complete coverage
• High and low level design
• Coding (Java / C++ / C# / Ruby / PHP / etc)
• Frontend Coding (Javascript, Ajax, jQuery, Flex, Silverlight etc)
Testing Related
• Automating Testing
• Planning and writing Functional Tests
• Planning and writing Stress Tests
• Planning and implementing Continuous Builds and Continuous Integration
• Implementing test coverage reports for Unit Testing and Functional Testing
• Defining manual test cases and overseeing Manual Testing process
QA Management Related
• Identifying relevant testing tools and technologies for different test requirements
• Defining testing processes across the organization
• Conducting regular training courses on automated testing
• Inspecting, improving and evolving our Project Development Processes
Delivery and Process Related
• Ensuring all of the below as part of ensuring process adherence:
o Agile dev processes are being followed
o TDD is being followed
o Daily stand-ups and weekly iterations are being conducted
o All tasks, features, bugs are tracked through our Issue Tracker
o Continuous Integration is functional
• Ensuring documentation of Architecture
• Ensuring our Project Development Process is followed to a T
• Ensuring all tasks have estimates associated with them, and all releases have schedules associated with them
• Maintaining the heartbeat of a project
• Ensuring timely delivery of a Project and keeping dev teams and other teams informed of timelines
• Publishing an accurate Release log upon release
• Ensuring test coverage for unit tests and functional tests is in line with our defined standards
User Documentation Related
• Maintaining the User manual
• Defining the structure of the user manual
• Writing articles within the user manual
• Coordinating with support, sales and other teams to identify documentation requirements
• Ensuring all new features are adequately documented upon release
• Encouraging community participation in the documentation and corresponding translation efforts
People Management
• Managing the team
• Resource allocation
• Providing feedback to team members
• Motivating the team regularly
• Being available to resolve any issues of any team member
• Recruitment
• Managing and implementing Training processes for team members
• Appraisal Management
Marketing Related
• Evangelism & Social Media Marketing
• Blogging about the project, its features, releases, usage etc
• Managing relationships with the beta testing community within and outside the organization
System Administration Related
• Defining the server specifications, and network topology of the deployment
• Managing and monitoring server deployments

This Article describes the various roles and respective designations involved in developing software projects
Roles represent a collection of Responsibilities that will be carried out by an individual. Every designation in the product engineering team is a part of one of the below roles.
Prior Reading
• Types of Responsibilities
Product Owner Role
• Intro
o A Product Owner, as the name suggests, owns one or more Projects in their entirety
o Is a hardcore techie and possesses very good business acumen
o Has exceptional people management skills
o Various individuals responsible for various facets of the project report into a Product Owner
o May not actively code unless the team size / project is very small
o Reports into Management and is responsible for driving the project
• Types of Responsibilities Handled
o Direct Responsibilities
 Product Planning Related
 People Management
o Oversight Responsibilities
 UI Prototyping Related
 UI Design Related
 UI Text Related
 User Experience related
 Architecture Related
 Dev Related
 Testing Related
 Test Management Related
 Delivery and Process Related
 User Documentation Related
 People Management
o Oversight responsibility means that the actual responsibility belongs to someone else on the team, under the supervision / guidance / monitoring / mentoring of the Product Owner
o If a specific role for a given type of responsibility is absent in a team, then that responsibility becomes the Direct Responsibility of the Product Owner. For example:
 if a project does not have a Product Analyst / Business Analyst, this responsibility falls on the Tech Lead / GM / VP of that project
 similarly, if the project does not have a QA Lead, a Usability Lead, or an Iteration Manager, all these responsibilities fall on the Product Owner of that project
o The Product Owner role represents the final authority in Project Development. For example:
 even if a Project has a Product Analyst or Business Analyst, the actual vision and feature map will be discussed with and confirmed by the Product Owner of that project
 similarly despite a Usability Lead being assigned to a Project, the Product Owner of that project will be involved in and confirm the UI Prototypes
• Hierarchy of Product Owner Designations
o Tech Lead
o GM, Software Dev
o VP, Software Dev
o SVP, Software Dev
Module Owner Role
• Intro
o A Module owner owns a specific module
o a hardcore techie - possesses good business acumen
o People management skills required - responsible for managing a handful of software developers / testers who work on the module
o Reports into Product Owners
• Types of Responsibilities Handled
o Direct Responsibilities
 Product Planning Related
 People Management
 Dev Related
 Delivery and Process Related
o Ancillary Responsibilities
 UI Prototyping Related
 User Experience related
 Testing Related
 Other responsibilities that Product owners may assign
• Hierarchy of Module Owner Designations
o Module Lead
Product Management Role
• Intro
o Product Management roles assist Product Owners by taking on direct responsibilities involving the product's vision, strategy, path, goals, UI, text, etc.
o Must have been a developer for several years
o Must have excellent business acumen
o Purely an Individual Contributor, does not require People Management Skills
o Reports into relevant Product Owner
• Types of Responsibilities Directly Handled
o Product Planning Related
o UI Prototyping Related
o UI Text Related
o User Experience related
o In the absence of an Iteration Manager, the Product Analyst may manage Delivery and Process related responsibilities
• Hierarchy of Product Visionary Designations
o Product Analyst
o Business Analyst
o GM,
o VP,
o SVP,
Dev and Architect Role
• Intro
o Individual Contributors, do not require People Management Skills
o Report into relevant Product Owner
o Junior designations - Developer with 1-6 yrs experience
o Senior designations
 Very senior techie with a large legacy of diverse experience (8+ years)
 Spends most time on architecture, and some time on hands-on coding of specific important modules
• Types of Responsibilities Handled
o Junior Designations
 Dev Related
 Delivery and Process Related
o Senior Designations
 Dev Related
 Architecture Related
 Delivery and Process Related
• Hierarchy of Dev Designations
o Junior Designations
 Software Developer
 Lead Developer
o Senior Designations
 Principal Developer
 Architect
 Lead Architect
 Principal Architect
Delivery and Process Role
• Intro
o Sr developer with 1+ yrs experience in our organization
o Must possess a clear understanding of our processes
o Individual Contributor with some people management and collaboration skills, assists Product Owner in managing people on the project
o Reports into relevant Product Owner
o Typically the delivery role may be performed by any Sr Developer on the team for a given release (sometimes even part-time)
• Types of Responsibilities Handled
o Delivery and Process Related
• Delivery Designations
o Iteration manager
Testing and QA Role
• Intro
o Well versed with automated testing, defining and writing tests
o Well versed with Unit Testing, TDD, Test driven development, Functional Testing and testing tools
o Well versed with Best practices, coding standards, code coverage tools, code analysis tools etc
o Junior designations
 purely individual contributors
 report into relevant Product Owner or Senior designations
o Senior Designations
 must have significant QA and testing experience (8+ yrs) across diverse platforms
 People management skills required; may have Junior designations as reportees
 report into relevant Product Owner
• Types of Responsibilities Handled
o Junior Designations
 QA Related
o Senior Designations
 QA Related
 QA Management Related
• Testing Designations
o Junior Designations
 Software Developer - Testing
 Lead Software Developer - Testing
o Senior Designations
 QA Lead
 Principal QA
 VP, QA
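The unit-testing and TDD skills expected of this role can be sketched with a minimal example. This is only an illustration: the `slugify` function and its tests are hypothetical, not part of any project described here. In TDD, the test cases would be written first and the function implemented to make them pass.

```python
import unittest

def slugify(title):
    """Hypothetical function under test: turn a bug title into a URL slug."""
    return "-".join(title.lower().split())

class SlugifyTest(unittest.TestCase):
    # In TDD these tests come first; slugify is then written to satisfy them.
    def test_lowercases_and_joins_words(self):
        self.assertEqual(slugify("Save Button Crash"), "save-button-crash")

    def test_single_word_is_unchanged(self):
        self.assertEqual(slugify("crash"), "crash")

# Run the tests programmatically (unittest.main() would call sys.exit()).
suite = unittest.defaultTestLoader.loadTestsFromTestCase(SlugifyTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Coverage and code-analysis tools mentioned above would then be run over such a suite to check it against the team's defined standards.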
Sample Teams
Below we present two sample team configurations, and who fulfills which responsibility.
Small Project Team
• Tech Lead
o Direct Responsibilities
 Product Planning Related
 Architecture Related
 Delivery and Process Related
 People Management
o Oversight Responsibilities
 Dev Related
 Testing Related
 User Documentation Related
• 2 Software Developers
o Dev Related
• 1 Test Engineer
o Testing Related
• Part time UI lead + UI Associate
o UI Related
• Part time Tech writer
o User Documentation Related
• Part time Sysad
o Sysad related
Fully Mature Project Team
• Tech Lead / GM
o Direct Responsibilities
 None
o Oversight Responsibilities
 Dev Related
 Testing Related
 User Documentation Related
 Product Vision Related
 Architecture Related
 Delivery and Process Related
 People Management
• Product Analyst / Business Analyst
o Product Planning Related
• 'x' Software Developers + Architects
o Dev Related
o Architecture Related
• QA Lead + Test Engineers
o Testing Related
• Iteration Manager
o Delivery and Process Related
• Part time UI lead + Fulltime UI Associates
o UI Related
• Part time Tech writer
o User Documentation Related
• Part time Sysad
o Sysad related


Monday, July 12, 2010

How to write a good bug report? Tips and Tricks

Why write a good bug report?
If your bug report is effective, the chances are higher that it will get fixed. Getting a bug fixed thus depends on how effectively you report it. Reporting a bug is a skill, and here is how to develop it.
“The point of writing a problem report (bug report) is to get bugs fixed” – Cem Kaner. If a tester does not report a bug correctly, the programmer will most likely reject it as irreproducible. This can hurt the tester's morale, and sometimes ego as well. (I suggest keeping ego out of it entirely: thoughts like “I reported the bug correctly”, “I can reproduce it”, “Why was my bug rejected?”, or “It's not my fault” help no one.)
What are the qualities of a good software bug report?
Anyone can write a bug report, but not everyone can write an effective one. You should be able to distinguish an average bug report from a good one. How? It's simple: apply the following characteristics and techniques when reporting a bug.
1) Having clearly specified bug number:
Always assign a unique number to each bug report; it identifies the bug record. If you are using an automated bug-reporting tool, this unique number is generated automatically each time you report a bug. Note the number and a brief description of each bug you report.
2) Reproducible:
If your bug is not reproducible, it will never get fixed. Clearly mention the steps to reproduce the bug, and do not assume or skip any step. A bug described step by step is easy to reproduce and fix.
3) Be Specific:
Do not write an essay about the problem. Be specific and to the point. Try to summarize the problem in the fewest words that still convey it effectively. Do not combine multiple problems even if they seem similar; write a separate report for each problem.
How to Report a Bug?
Use following simple Bug report template:
This is a simple bug report format. It may vary depending on the bug-reporting tool you are using. If you are writing the bug report manually, then some fields, such as the bug number, need to be filled in by hand.
Reporter: Your name and email address.
Product: The product in which you found this bug.
Version: The product version, if any.
Component: The major sub-module of the product in which the bug appears.
Platform: The hardware platform on which you found this bug, e.g. 'PC', 'Mac', 'HP', 'Sun'.
Operating system: All operating systems on which you found the bug: Windows, Linux, Unix, SunOS, Mac OS. Mention specific OS versions where applicable, such as Windows NT, Windows 2000, Windows XP.
Priority:
When should the bug be fixed? Priority is generally set from P1 to P5, with P1 meaning “fix the bug with highest priority” and P5 meaning “fix when time permits”.
Severity:
This describes the impact of the bug.
Types of Severity:
• Blocker: No further testing work can be done.
• Critical: Application crash, Loss of data.
• Major: Major loss of function.
• Minor: Minor loss of function.
• Trivial: Some UI enhancements.
• Enhancement: Request for a new feature or an enhancement to an existing one.
Status:
When you log the bug in a bug-tracking system, its status is 'New' by default.
Later the bug goes through various stages such as Fixed, Verified, Reopened, and Won't Fix.
Assign To:
If you know which developer is responsible for the particular module in which the bug occurred, you can specify that developer's email address; otherwise leave it blank, and the bug will be assigned to the module owner, or a manager will assign it to a developer. Possibly add the manager's email address in the CC list.
URL:
The URL of the page on which the bug occurred.
Summary:
A brief summary of the bug, ideally in 60 words or fewer. Make sure your summary reflects what the problem is and where it is.
Description:
A detailed description of the bug. Include the following in the description:
• Reproduce steps: Clearly mention the steps to reproduce the bug.
• Expected result: How the application should behave after the above steps.
• Actual result: What actually happens when the steps are run, i.e. the bug behavior.
These are the important parts of a bug report. You can also add a “Report type” field describing the bug type.
The report types are typically:
1) Coding error
2) Design error
3) New suggestion
4) Documentation issue
5) Hardware problem
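The template above can be modelled in code. Here is a minimal Python sketch; the field names, defaults, and the `is_complete` helper are illustrative choices, not mandated by any particular bug-tracking tool:

```python
from dataclasses import dataclass
from enum import Enum
from typing import List, Optional

class Severity(Enum):
    # Severity values as described in the template above.
    BLOCKER = "Blocker"
    CRITICAL = "Critical"
    MAJOR = "Major"
    MINOR = "Minor"
    TRIVIAL = "Trivial"
    ENHANCEMENT = "Enhancement"

@dataclass
class BugReport:
    reporter: str                     # your name and email address
    product: str
    component: str
    platform: str
    operating_system: str
    summary: str                      # what the problem is and where it is
    steps_to_reproduce: List[str]
    expected_result: str
    actual_result: str
    severity: Severity
    priority: str = "P3"              # P1 (fix first) .. P5 (fix when time permits)
    status: str = "New"               # default status when the bug is logged
    assigned_to: Optional[str] = None # leave empty to let the module owner assign
    version: Optional[str] = None

    def is_complete(self) -> bool:
        """A report is only useful if it can be reproduced and compared."""
        return bool(self.steps_to_reproduce
                    and self.expected_result and self.actual_result)

# Example usage with made-up values:
report = BugReport(
    reporter="you@example.com",
    product="UserAdmin",
    component="Users",
    platform="PC",
    operating_system="Windows 2003",
    summary="Application crash on clicking SAVE while creating a new user",
    steps_to_reproduce=["Log on", "Navigate to USERS > New User",
                        "Fill in the form", "Click Save"],
    expected_result="Success message: New User has been created successfully",
    actual_result="Error page: ORA1090 Exception: Insert values Error",
    severity=Severity.CRITICAL,
    priority="P1",
)
```

A structure like this makes it easy to reject incomplete reports (e.g. missing repro steps) before they ever reach a developer.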
Some Bonus tips to write a good bug report:
1) Report the problem immediately: If you find a bug while testing, do not wait to write a detailed bug report later. Write the bug report immediately; this ensures a good and reproducible report. If you decide to write the report later, chances are high that you will miss important steps.
2) Reproduce the bug three times before writing the report: Your bug should be reproducible. Make sure your steps are robust enough to reproduce the bug without any ambiguity. If your bug is not reproducible every time, you can still file it, mentioning its intermittent nature.
3) Test for the same bug in other similar modules:
Developers sometimes use the same code in different, similar modules, so chances are high that a bug in one module also occurs in similar ones. You can even try to find a more severe version of the bug you found.
4) Write a good bug summary:
The bug summary helps developers quickly analyze the nature of the bug. A poor-quality report unnecessarily increases development and testing time. Communicate well through your bug report summary; keep in mind that the summary is used as a reference when searching for the bug in the bug inventory.
5) Read the bug report before hitting the Submit button:
Read all the sentences, wording, and steps in the bug report. Check whether any sentence creates ambiguity that could lead to misinterpretation. Avoid misleading words or sentences so that the report stays clear.
6) Do not use abusive language:
It's nice that you did good work and found a bug, but do not use this credit to criticize the developer or attack any individual.
Conclusion:
No doubt your bug report should be a high-quality document. Focus on writing good bug reports and spend some time on this task, because the bug report is the main communication point between tester, developer, and manager. Managers should make their teams aware that writing a good bug report is a primary responsibility of any tester. Your efforts towards writing good bug reports will not only save company resources but also create a good relationship between you and the developers.
For better productivity, write a better bug report.

Sample bug report
Below sample bug/defect report will give you exact idea of how to report a bug in bug tracking tool.
Here is the example scenario that caused a bug:
Let's assume that in your application under test you want to create a new user. To do so, you log on to the application and navigate to the USERS menu > New User, then enter all the details in the 'User form': First Name, Last Name, Age, Address, Phone, etc. Once you have entered all this information, you click the 'SAVE' button to save the user, and you should see a success message: “New User has been created successfully”.
But when you logged into the application, navigated to USERS menu > New User, entered all the required information, and clicked the SAVE button - BANG! The application crashed and an error page appeared on screen. (Capture this error message window and save it as an image file, e.g. with Microsoft Paint.)
Now this is the bug scenario and you would like to report this as a BUG in your bug-tracking tool.
How will you report this bug effectively?
Here is the sample bug report for above mentioned example:
(Note that some ‘bug report’ fields might differ depending on your bug tracking system)
SAMPLE BUG REPORT:
Bug Name: Application crash on clicking the SAVE button while creating a new user.
Bug ID: (It will be automatically created by the BUG Tracking tool once you save this bug)
Area Path: USERS menu > New Users
Build Number: Version Number 5.0.1
Severity: HIGH (High/Medium/Low) or 1
Priority: HIGH (High/Medium/Low) or 1
Assigned to: Developer-X
Reported By: Your Name
Reported On: Date
Reason: Defect
Status: New/Open/Active (Depends on the Tool you are using)
Environment: Windows 2003/SQL Server 2005
Description:
Application crash on clicking the SAVE button while creating a new
user, hence unable to create a new user in the application.
Steps To Reproduce:
1) Log on to the application
2) Navigate to the USERS menu > New User
3) Fill in all the user information fields
4) Click the 'Save' button
5) Observe the error page “ORA1090 Exception: Insert values Error…”
6) See the attached logs for more information (attach any logs related to the bug, if available)
7) Also see the attached screenshot of the error page
Expected result: On clicking the SAVE button, the success message “New User has been created successfully” should be displayed.
(Attach a screenshot of the application crash, if available.)
Save the defect/bug in the BUG TRACKING TOOL. You will get a bug id, which you can use for further bug reference.
A default ‘New bug’ mail will go to the respective developer and the default module owner (team leader or manager) for further action.
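Once the bug is fixed, the steps to reproduce translate naturally into an automated regression test. A minimal sketch is shown below; `FakeUserApp` and its `create_user` method are hypothetical stand-ins for the application under test, not a real API:

```python
import unittest

class FakeUserApp:
    """Hypothetical stand-in for the application under test."""
    def __init__(self):
        self.users = []

    def create_user(self, first_name, last_name):
        # The reported bug: the real application crashed here with
        # "ORA1090 Exception: Insert values Error" instead of saving.
        self.users.append((first_name, last_name))
        return "New User has been created successfully"

class NewUserRegressionTest(unittest.TestCase):
    """Regression test derived from the bug report's steps to reproduce."""
    def test_save_new_user_shows_success_message(self):
        app = FakeUserApp()
        # Steps 1-4: log on, navigate to USERS > New User, fill the form, save.
        message = app.create_user("Jane", "Doe")
        # Expected result from the report: success message, no crash.
        self.assertEqual(message, "New User has been created successfully")
        self.assertIn(("Jane", "Doe"), app.users)

# Run programmatically rather than via unittest.main(), which calls sys.exit().
suite = unittest.defaultTestLoader.loadTestsFromTestCase(NewUserRegressionTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Keeping such a test in the suite ensures the crash cannot silently reappear in a later release.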

Independent QA

Our offshore testing services span the entire software lifecycle, including formulating the test plan and test cases, execution, defect reporting, defect analysis, risk assessment, and recommendations. We develop test suites, and tests are carried out to ensure Functionality, Usability, Performance, Regression, Integration, Installation, Compatibility, and User Acceptance. By collaborating with us, you can leverage our experience in testing business applications on a range of platforms and technologies such as .Net, J2EE, and Open Source, to name a few. We also have a team of ISTQB Certified Testers, ensuring that the capabilities of our members are confirmed by international testing bodies.

Domain Experience - The QA Team has so far handled projects in various domains, some of which include:

* Telecom
* Healthcare
* Taxation
* CRM
* BFSI

Assignment Models

* Testing Consulting
* End to End Testing
* Managed Testing

Testing Services

* Usability Testing
* Functionality Testing
* Integration Testing
* Load and Performance Testing
* Volume Testing
* Stress Testing
* Scalability Testing
* System Testing
* Regression Testing
* Compatibility Testing
* Data Validation Testing
* Security Testing
* Black Box Testing
* Support Maintenance Testing
* GUI Testing
* Migration Testing
* Database Testing

Tools Experience

* Rational Robot
* LoadRunner
* WinRunner
* QuickTest Professional
* Test Complete
* Test Director
* Turbo Data Generator
* App Perfect
* Mantis BTS
* BTS by Microsoft SharePoint
* Collaboration Tools
* Configuration Management Tools
* JIRA BTS
* Visual Studio Team System (VSTS)
* Selenium IDE/RC