Friday, January 23, 2009
What is Driver and Stub?
Driver:-
(i) Driver: It is a temporary calling program. It functions like a main module, calling the sub-modules under test.
(ii) Drivers are dummy calling programs (a few lines of code) used in the bottom-up approach.
Stub:-
(i) Stub: It is a temporary called program. It functions like a sub-module when called by the main module.
(ii) Stubs are dummy called programs used in the top-down approach.
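Here is a minimal sketch of both ideas in Python. The module names (generate_invoice, calculate_tax) are invented for illustration: the stub fakes a sub-module that is not ready yet (top-down), and the driver temporarily plays the role of the missing main module (bottom-up).

```python
# Minimal sketch of a stub (top-down) and a driver (bottom-up).
# The module names below are hypothetical, not from any real project.

# --- Top-down: the main module is ready, the sub-module is not ---
def calculate_tax_stub(amount):
    """Stub: temporary called program that fakes the missing sub-module."""
    return 10.0                      # hard-coded dummy value


def generate_invoice(amount, tax_fn=calculate_tax_stub):
    """Main module under test; it calls the (stubbed) sub-module."""
    return amount + tax_fn(amount)


# --- Bottom-up: the sub-module is ready, the main module is not ---
def calculate_tax(amount):
    """Real sub-module, already implemented."""
    return round(amount * 0.18, 2)


def tax_driver():
    """Driver: temporary calling program that stands in for the
    missing main module and exercises the sub-module."""
    assert calculate_tax(100) == 18.0
    assert calculate_tax(0) == 0.0
    print("calculate_tax passed the driver checks")


if __name__ == "__main__":
    print(generate_invoice(100))     # uses the stub: prints 110.0
    tax_driver()
```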
Friday, January 16, 2009
Basic Testing Terminology
White Box Testing:- This type of testing is based on knowledge of the internal programming code of an application. Tests are based on the coverage of code statements, branches, paths and conditions.
Unit Testing:- Testing an individual module in isolation is called unit testing.
This testing is typically done by the developer, not the tester.
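As a rough illustration only (the apply_discount function is a hypothetical unit, not taken from any real project), this is how a developer might unit test one module in isolation using Python's built-in unittest; writing one test per branch of the if statement is also the white box idea of branch coverage.

```python
import unittest


def apply_discount(price, is_member):
    """Hypothetical unit under test: members get 10% off."""
    if is_member:
        return round(price * 0.9, 2)
    return price


class ApplyDiscountTest(unittest.TestCase):
    # One test per branch gives full branch coverage of this unit.
    def test_member_gets_discount(self):
        self.assertEqual(apply_discount(100.0, True), 90.0)

    def test_non_member_pays_full_price(self):
        self.assertEqual(apply_discount(100.0, False), 100.0)


if __name__ == "__main__":
    unittest.main()
```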
Integration Testing:- When a number of modules are executed together after they have been integrated with each other, and they are verified to be working properly and communicating with each other, it is called integration testing.
Functional Testing:- This type of testing is used to check whether the functions of a product/software are working properly and whether the desired specifications are met.
It emphasizes only the input and the output. It ignores the inner parts and focuses on the output as per the requirements.
Gray Box Testing:- It is a combination of black box testing and white box testing. In this testing the tester has some knowledge of the internal programming code of the software under test.
Acceptance Testing:- It is totally based on the user requirements; when all the user requirements meet the specification, it is called acceptance testing.
System Testing:- This testing is based on the overall requirements and specification and covers all the combined parts of a system.
Sanity Testing:- Testing to determine whether a new software version is performing well enough to accept it for a major testing effort.
Regression Testing:- This type of testing is done after a bug fix or a modification of the software or its environment, to make sure the existing functionality still works.
Load Testing:- Testing an application under heavy load, such as testing a website or Windows application to determine at which point the system's response degrades or fails.
Friday, January 9, 2009
Difference between Compatibility & Comparison Testing.
Compatibility Testing:-
(i) It is done to see whether a particular application behaves the same across different browsers, operating systems and databases.
(ii) Testing to ensure compatibility of an application or website with different browsers, operating systems and hardware platforms.
Comparison Testing:-
(i) It is basically done when two similar applications exist: the features and functionalities of both applications are compared.
Advantages and Disadvantages of Black Box Testing?
Advantages of Black Box Testing:-
(i) Testers can be non-technical.
(ii) There is no need to have detailed internal knowledge of the application.
Disadvantages of Black Box Testing:-
(i) It is difficult to find tricky input.
(ii) It is difficult to identify all possible inputs in limited time.
(iii) There are chances of repeating tests.
Difference between Verification & Validation?
Verification:-
(i) In Verification you have to check the process used to build the product.
(ii) It refers to the set of activities that ensure that the software correctly implements a specific function.
(iii) Are we building the product right?
(iv) Verification means checking that the software meets its specification.
(v) Concerned with the correctness of the process.
Validation:-
(i) In Validation you have to check the final product.
(ii) It refers to a different set of activities that ensure that the software that has been built is traceable to customer requirements.
(iii) Are we building the right product?
(iv) Validation means meeting the expectations of the customer for the software.
(v) Concerned with the final product itself.
Difference between Testing & Debugging?
Testing:-
(i) Testing is conducted by the testers in the testing phase.
(ii) In the testing phase, the tester finds bugs; this activity, called testing, is done to improve the quality of the product.
Debugging:-
(i) It is done in the development phase by the developers.
(ii) In the development phase, the developer fixes the bug.
What is Priority and Severity?
Priority:-
(i) How important it is to fix the bug in the product/software.
(ii) Priority is the level which determines which bug needs to be fixed first: whether it is high priority (fix first), can be deferred (not fixed in this release), or can wait (low priority).
(iii) It is usually determined by the development team or the manager.
(iv) Priority defines the importance of a defect from the customer's point of view.
Severity:-
(i) How badly the bug affects the software.
(ii) Severity is the measure which states how bad or critical the bug is.
(iii) It is usually reported by the tester to the development team.
(iv) Severity defines the importance of a defect from the functional point of view.
Thursday, January 8, 2009
Basic Things About Software Testing
1) What is Error?
(i) The difference between the real value and the experimental value is called an Error.
(ii) Any programmatic mistake in a software/product is called an Error.
2) What is Bug?
(i) The non-conformance of a product to its requirements is called a Bug.
(ii) Any deviation from the expected result is called a Bug.
(iii) A persistent error in hardware or software is called a Bug.
3) What is Defect?
(i) Missing functionality of a product is called a Defect.
(ii) It usually occurs when a product no longer works the way it used to.
(iii) A Defect is an error, flaw, mistake, failure, or fault in an application that prevents it from working as intended.
(iv) Faults found in the production environment are called Defects.
4) Write Different types of Defects?
The following are the different types of Defects:-
(i) System Crash.
(ii) Application Crash.
(iii) Functional Major.
(iv) Functional.
(v) UI Defects.
(vi) Usability.
(vii) Functional Enhancement.
(viii) UI Enhancement.
(ix) Design Issue.
(x) Architecture.
(xi) Change Request.
(xii) Database.
(xiii) Documentation.
5) What is the Defect Source or why does software have Bugs?
Source of Defects:-
(i) Miscommunication.
(ii) Software Complexity.
(iii) Programming errors.
(iv) Changing requirements.
(v) Poorly Documented code.
(vi) Software Development tools.
(vii) Time Pressure.
(6) What is Testing?
(i) Testing is the process of executing a program with the intent of finding errors.
(ii) Testing is the process of uncovering the errors present in a program.
(iii) Testing is a destructive process of trying to find the errors in a program.
(iv) Testing is a set of activities that can be planned in advance and conducted systematically.
(v) Testing is the infinite process of comparing the invisible to the ambiguous so as to avoid the unthinkable happening to the anonymous.
(vi) Testing is the Process which helps to identify the Correctness, Completeness and Quality of developing software.
(vii) Testing= Validation + Verification.
(viii) Testing is basically an activity designed for tracking defects, communicating them to the concerned developers and then getting the fixes verified.
(7) What is Testing Life Cycle?
Testing Life Cycle means the different stages that new software goes through while it is being tested. These are the following:-
(i) Requirements stage.
(ii) Test Plan.
(iii) Test Design.
(iv) Design Reviews.
(v) Code Reviews.
(vi) Test Case Preparation.
(vii) Test Execution.
(viii) Test Reports.
(ix) Bug Reporting.
(x) Reworking on Patches.
(xi) Release to Production.
(8) What is a Bug Report and what is its purpose?
The Bug report is a document which explains the gap between the expected result and the actual result, and it also describes how to reproduce the scenario.
Purpose of Bug Report:- The main purpose of the Bug report is to enable the programmers to see the program failing in front of them.
(9) What is the Aim of a Bug Report?
The Bug report should be detailed enough to make the program fail in front of the programmer. It should clearly distinguish between the expected behavior and the actual behavior.
(10) What is Bug Tracking System?
(i) A bug tracking system is a software application that is designed to help quality assurance staff and programmers keep track of reported software bugs in their work. It may be regarded as a sort of issue tracking system.
(ii) A bug tracking system can be used both for software defect tracking and as a general issue management application for helpdesk customer support and trouble ticketing.
(11) Write some important fields in a Bug Tracking System?
The important fields of a Bug Tracking System are the following:-
(i) Bug Heading.
(ii) Status.
(iii) Description.
(iv) Type.
(v) Priority.
(vi) Severity.
(vii) System details like OS, browser etc.
(viii) Suggested fix or expected behavior.
(ix) Attachments if any.
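Purely as an illustration (the BugReport structure and the sample values are made up, not tied to any particular bug tracking tool), the fields listed above could be captured like this:

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class BugReport:
    """Sketch of the fields a bug tracking system typically records."""
    heading: str                  # short one-line summary
    status: str                   # e.g. New, Open, Fixed, Closed
    description: str              # steps to reproduce, expected vs. actual
    bug_type: str                 # e.g. Functional, UI, Application Crash
    priority: str                 # how soon it should be fixed
    severity: str                 # how badly it affects the software
    system_details: str           # OS, browser, build number, etc.
    suggested_fix: str = ""       # expected behavior / proposed fix
    attachments: List[str] = field(default_factory=list)


bug = BugReport(
    heading="Login page crashes on empty password",
    status="New",
    description="1. Open login page 2. Leave password blank 3. Click Login "
                "-> application crashes. Expected: validation message.",
    bug_type="Application Crash",
    priority="High",
    severity="Critical",
    system_details="Windows, Firefox",
    attachments=["crash_screenshot.png"],
)
print(bug.heading, "-", bug.severity)
```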
(12) What is Test Case?
(i) A Test Case is a set of conditions under which a tester determines whether a requirement of an application is satisfied or not.
(ii) A test case is a noted/documented set of steps/activities that are carried out or executed on the software in order to confirm its functionality/behavior for a certain set of inputs.
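For example (the login scenario, ID and field names below are invented for illustration), a documented test case usually pairs numbered steps with an expected result and leaves room for the actual result and status recorded during execution:

```python
# A documented test case captured as a simple data structure
# (hypothetical "valid login" scenario).
test_case = {
    "id": "TC_LOGIN_001",
    "title": "Valid user can log in",
    "precondition": "A registered user account exists",
    "steps": [
        "Open the login page",
        "Enter a valid username and password",
        "Click the Login button",
    ],
    "expected_result": "User is redirected to the home page",
    "actual_result": "",       # filled in during execution
    "status": "Not Run",       # Pass / Fail after execution
}

for number, step in enumerate(test_case["steps"], start=1):
    print(number, step)
print("Expected:", test_case["expected_result"])
```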
Explain Load, Performance, Stress Testing with an example:-
(i) Load Testing :- It is performance testing to check the system's behavior under load. Testing an application under heavy load, such as testing a web site under a range of loads, to determine at what point the system's response time degrades or fails.
(ii) Stress Testing :- The system is stressed beyond its specifications to check how and when it fails. It is performed under heavy load, such as input of data beyond the storage capacity, complex database queries, or continuous input to the system or database.
(iii) Performance Testing :- The term is often used interchangeably with 'stress' and 'load' testing. It checks whether the system meets the performance requirements. Different performance and load tools are used to do this.
Load Testing and Performance Testing are commonly described as positive testing, whereas Stress Testing is described as negative testing.
Say, for example, there is an application which can handle 25 simultaneous user logins at a time.
-> In load testing we will test the application with 25 users and check how the application works at this level.
-> In performance testing we will concentrate on the time taken to perform the operations.
-> In stress testing we will test with more than 25 users, keep increasing the number, and check at what point the application breaks down or exhausts the hardware resources.
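A very rough sketch of the same idea, using only Python's standard library (the URL, the 25-user limit and the user counts are placeholders, not a real benchmark): fire N simultaneous requests, record the average response time and the failures, then push N beyond the supported load to see where the application starts to degrade.

```python
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

URL = "http://example.com/login"    # placeholder endpoint


def one_request(_):
    """Issue a single request and return its response time in seconds."""
    start = time.time()
    try:
        urlopen(URL, timeout=10).read()
        return time.time() - start
    except Exception:
        return None                  # failed requests are counted separately


def run_load(users):
    """Simulate `users` simultaneous requests and report the results."""
    with ThreadPoolExecutor(max_workers=users) as pool:
        results = list(pool.map(one_request, range(users)))
    ok = [r for r in results if r is not None]
    failures = len(results) - len(ok)
    avg = sum(ok) / len(ok) if ok else float("nan")
    print(f"{users:>4} users: avg {avg:.2f}s, {failures} failures")


if __name__ == "__main__":
    run_load(25)                     # load test: the supported level
    for users in (50, 100, 200):     # stress test: push beyond the limit
        run_load(users)
```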
Types of Testing That Come Under the Testing Levels
1. Unit Testing
(i) Unit Testing is primarily carried out by the developers themselves
(ii) Deals with the functional correctness and completeness of individual program units
(iii) White box testing methods are employed
2. Integration Testing
(i) Integration Testing: Deals with testing when several program units are integrated
(ii) Regression Testing: A change of behavior due to a modification or addition is called 'regression'. Regression testing is done to make sure such changes have not adversely affected the existing functionality
(iii) Incremental Integration Testing: Checks for bugs which are encountered when a new module is integrated with the existing ones
(iv) Smoke Testing: It is a battery of tests which checks the basic functionality of the program. If it fails, the program is not sent for further testing (a minimal sketch follows below)
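Here is a minimal, hedged sketch of such a smoke test battery, assuming a hypothetical web application at a placeholder BASE_URL (the URL and the pages are invented): a handful of quick checks that must pass before the build is handed over for deeper testing.

```python
import unittest
from urllib.request import urlopen

BASE_URL = "http://example.com"      # placeholder application under test


class SmokeTests(unittest.TestCase):
    """Battery of quick checks: if any of these fail, the build is
    rejected without running the full test suite."""

    def test_home_page_is_reachable(self):
        self.assertEqual(urlopen(BASE_URL, timeout=10).status, 200)

    def test_login_page_is_reachable(self):
        self.assertEqual(urlopen(BASE_URL + "/login", timeout=10).status, 200)


if __name__ == "__main__":
    # A failure here means "do not send the build for further testing".
    unittest.main()
```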
3. System Testing
(i) System Testing : Deals with testing the whole program system for its intended purpose
(ii) Recovery Testing: The system is forced to fail and it is checked how well the system recovers from the failure
(iii) Security Testing: Checks the capability of system to defend itself from hostile attack on programs and data
(iv) Load & Stress Testing: The system is tested for max load and extreme stress points are figured out
(v) Performance Testing: Used to determine the processing speed
(vi) Installation Testing: Installation and uninstallation are checked on the target platform
4. Acceptance Testing
(i) UAT: ensures that the project satisfies the customer requirements
(ii) Alpha Testing : It is the test done by the client at the developer’s site
(iii) Beta Testing : This is the test done by the end-users at the client’s site
(iv) Long Term Testing : Checks for faults that occur during long-term usage of the product
(v) Compatibility Testing : Determines how well the product handles a product transition, i.e. how well it works across different environments, hardware and software platforms
Friday, January 2, 2009
Difference between QA and QC?
QA (Quality Assurance):-
(i) It is process oriented: it concentrates on the processes used to build the product.
(ii) Its aim is to prevent defects from entering the product.
QC (Quality Control):-
(i) It is product oriented: it concentrates on the built product itself.
(ii) Its aim is to find defects in the product, for example through testing.
Top 20 practical software testing tips you should read before testing any application.
I wish all testers would read these good software testing practices. Read all the points carefully and try to implement them in your day-to-day testing activities. This is what I expect from this article. If you don't understand any testing practice, ask for more clarification in the comments below. After all, you will learn all these testing practices by experience. But then why not learn all these things before making any mistakes?
Here are some of the best testing practices I learned by experience:
1) Learn to analyze your test results thoroughly. Do not ignore the test result. The final test result may be ‘pass’ or ‘fail’ but troubleshooting the root cause of ‘fail’ will lead you to the solution of the problem. Testers will be respected if they not only log the bugs but also provide solutions.
2) Learn to maximize the test coverage every time you test any application. Though 100 percent test coverage might not be possible still you can always try to reach near it.
3) To ensure maximum test coverage, break your application under test (AUT) into smaller functional modules. Write test cases for such individual unit modules. Also, if possible, break these modules into smaller parts. E.g.: let's assume you have divided your website application into modules and 'accepting user information' is one of the modules. You can break this 'user information' screen into smaller parts for writing test cases: parts like UI testing, security testing, functional testing of the 'user information' form etc. Apply all form field type and size tests, negative and validation tests on input fields, and write all such test cases for maximum coverage.
4) While writing test cases, write test cases for the intended functionality first, i.e. for valid conditions according to the requirements. Then write test cases for invalid conditions. This will cover expected as well as unexpected behavior of the application under test.
5) Think positive. Start testing the application with the intent of finding bugs/errors. Don't think beforehand that there will not be any bugs in the application. If you test the application with the intention of finding bugs, you will definitely succeed in finding even the subtle bugs.
6) Write your test cases in requirement analysis and design phase itself. This way you can ensure all the requirements are testable.
7) Make your test cases available to developers prior to coding. Don’t keep your test cases with you waiting to get final application release for testing, thinking that you can log more bugs. Let developers analyze your test cases thoroughly to develop quality application. This will also save the re-work time.
8) If possible, identify and group your test cases for regression testing. This will ensure quick and effective manual regression testing.
9) Applications requiring critical response time should be thoroughly tested for performance. Performance testing is a critical part of many applications. In manual testing this is the part most often ignored by testers, due to the lack of the large data volume required for performance testing. Find out ways to test your application for performance. If it is not possible to create test data manually, then write some basic scripts to create test data for the performance test, or ask the developers to write one for you (a minimal sketch of such a script appears after this list).
10) Programmers should not test their own code. As discussed, the basic unit testing of the developed application should be enough for developers before they release the application to testers. But you (testers) should not force developers to release the product for testing. Let them take their own time. Everyone from lead to manager knows when the module/update is released for testing, and they can estimate the testing time accordingly. This is a typical situation in an agile project environment.
11) Go beyond requirement testing. Test application for what it is not supposed to do.
12) While doing regression testing, use the previous bug graph (bug graph - number of bugs found against time for different modules). This module-wise bug graph can be useful to predict the most probable buggy parts of the application.
13) Note down the new terms and concepts you learn while testing. Keep a text file open while testing an application and note down the testing progress and observations in it. Use these notepad observations while preparing the final test release report. This good habit will help you provide a complete, unambiguous test report and release details.
14) Many times testers or developers make changes in the code base of the application under test. This is a required step in development or testing environments to avoid executing live transaction processing, as in banking projects. Note down all such code changes done for testing purposes, and at the time of the final release make sure you have removed all these changes from the final client-side deployment file resources.
15) Keep developers away from the test environment. This is a required step to detect any configuration changes missing in the release or deployment document. Sometimes developers make system or application configuration changes but forget to mention them in the deployment steps. If developers don't have access to the testing environment, they will not make any such changes accidentally on the test environment, and these missing steps can be caught in the right place.
16) It's a good practice to involve testers right from the software requirement and design phase. This way testers can gain knowledge of application dependability, resulting in detailed test coverage. If you are not being asked to be part of this development cycle, then request your lead or manager to involve your testing team in all decision-making processes or meetings.
17) Testing teams should share best testing practices and experience with other teams in their organization.
18) Increase your conversations with developers to know more about the product. Whenever possible, communicate face-to-face to resolve disputes quickly and to avoid misunderstandings. But when you understand the requirement or resolve a dispute, make sure to also confirm it in written communication such as email. Do not keep anything verbal.
19) Don’t run out of time to do high priority testing tasks. Prioritize your testing work from high to low priority and plan your work accordingly. Analyze all associated risks to prioritize your work.
20) Write clear, descriptive, unambiguous bug report. Do not only provide the bug symptoms but also provide the effect of the bug and all possible solutions.
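Following up on tip 9 above, here is a minimal sketch of such a test data generation script (the user fields, file name and record count are invented placeholders); it uses only Python's standard library to create a large volume of records for a performance test.

```python
import csv
import random
import string


def random_word(length):
    """Return a random lowercase string of the given length."""
    return "".join(random.choice(string.ascii_lowercase) for _ in range(length))


def generate_users(path, count):
    """Write `count` fake user rows to a CSV file for performance tests."""
    with open(path, "w", newline="") as handle:
        writer = csv.writer(handle)
        writer.writerow(["username", "email", "city"])
        for i in range(count):
            name = f"{random_word(8)}{i}"
            writer.writerow([name, f"{name}@example.com", random_word(6).title()])


if __name__ == "__main__":
    generate_users("test_users.csv", 100_000)   # adjust the volume as needed
    print("generated test_users.csv")
```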
Don't forget that testing is a creative and challenging task. Finally, it depends on your skill and experience how you handle this challenge.