Iterative Review Session (IRS): How-To
5/27/14
Abstract/Summary:
Iterative review sessions, or IRSs, are the new procedure for handling any incorrectly failed OR blocked test case in an SWS session. This will be the standard way the SWS project deals with QA results in Colorado, and the procedure below also tells Thailand what needs to be retested.
The process itself is easy to go through, but instructing the testers in Thailand has been very time-consuming, since it is done on a case-by-case basis and there has not been a documented standard procedure for it until now.
Whenever I specify that testers perform an "IRS", I'll be referring to the procedure in this document.
There will be a new box in the SWS section of the Work Assignment that will be used to specify which trackers require IRSs.
Please read the procedure below to understand what is necessary to complete an IRS:
IRS PROCEDURE:
STEP 1: Know how test cases should (AND shouldn't) be failed/blocked
A failed test case will always have a JIRA defect associated with it, and this defect must be:
A) In an open status (usually In Analysis or In Development).
B) For the correct configuration - for example, a defect used to fail an iOS test case cannot be one logged for an Android issue.
If the defect attached to a failed test case is not in an open status AND for the correct configuration, the test case will be highlighted and assigned as an IRS for the tester in Thailand to handle (these two checks are sketched in code at the end of this step).
A test case should ONLY be blocked if the functionality cannot be accessed for testing, or is so impaired by existing problems that testing it would not be meaningful.
A) A good rule of thumb: if you CAN test the functionality, even if multiple bugs are observed, you should still test it and reference every issue you see.
B) Seemingly duplicate test cases need to be inspected in more detail: sometimes they have the same title, summary, and so on, but test different document types (.doc vs. .xls).
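The two checks on a failed test case's defect (open status, correct configuration) can be expressed as a small sketch. This is an illustration only, written in Python; the helper name, the status names, and the configuration field are assumptions, not part of any SWS tooling.

    # Illustration only: the status names and configuration values below are
    # assumptions, not the actual SWS/JIRA setup.
    OPEN_STATUSES = {"Open", "In Analysis", "In Development"}

    def failure_is_valid(defect_status, defect_config, test_case_config):
        """Return True if the failed test case may keep its current JIRA defect."""
        status_ok = defect_status in OPEN_STATUSES
        config_ok = defect_config == test_case_config  # e.g. iOS vs. Android
        return status_ok and config_ok

    # An Android defect attached to an iOS test case fails the check,
    # so the row would be highlighted and assigned as an IRS.
    print(failure_is_valid("In Analysis", "Android", "iOS"))  # prints False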
STEP 2: Review highlighted rows in Trackers for IRS - these are incorrectly failed test cases.
I will direct Thailand to perform an IRS on each tracker listed in the Work Assignment (see the Work Assignment box described above), and when you see a test case row that has been highlighted in the spreadsheet, review it.
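If the tracker is an Excel workbook, the highlighted rows can also be listed programmatically. The following is a minimal sketch using the openpyxl library; the file name, the sheet layout, and the assumption that highlighting is a solid fill on the first cell of the row are all placeholders, not a description of the actual trackers.

    from openpyxl import load_workbook

    wb = load_workbook("tracker.xlsx")      # placeholder file name
    ws = wb.active                          # assumes the test cases are on the first sheet
    for row in ws.iter_rows(min_row=2):     # skip the header row
        cell = row[0]
        # Highlighted rows are assumed to use a solid fill on the first cell
        if cell.fill.patternType == "solid":
            print("Row", cell.row, "is highlighted - perform an IRS on this test case")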
STEP 3: The tester WHO FAILED THE TEST CASE WITH THE INCORRECT DEFECT needs to retest it.
Since the original test failure is not valid, the tester who performed it the first time needs to perform it again.
STEP 4: The tester must then find a new defect that is open AND for the correct configuration to associate with the failure. IT MAY BE NECESSARY TO LOG A NEW DEFECT for the problem.
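Finding a replacement defect amounts to searching JIRA for an open defect logged against the right configuration before deciding to log a new one. The sketch below only illustrates that search through JIRA's standard REST search endpoint; the server URL, project key, status names, component value, and credentials are all assumed placeholders.

    import requests

    JIRA_URL = "https://jira.example.com"   # placeholder server
    # Assumed JQL: open defects for the SWS project on the configuration under test
    jql = 'project = SWS AND status in ("In Analysis", "In Development") AND component = "iOS"'
    resp = requests.get(
        JIRA_URL + "/rest/api/2/search",
        params={"jql": jql, "fields": "key,summary,status"},
        auth=("username", "password"),      # placeholder credentials
    )
    resp.raise_for_status()
    for issue in resp.json()["issues"]:
        fields = issue["fields"]
        print(issue["key"], fields["status"]["name"], fields["summary"])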
STEP 5: Document the test results, including the different JIRA defect that correctly describes the failure, in the same tracker, and send the results to CO as normal.
The CO review team will go over the highlighted test cases the next day to make sure everything was updated.
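For completeness, writing the updated defect reference back into the tracker (STEP 5) could look like the sketch below if the tracker is an Excel workbook; the file name, row number, column number, and defect key are all made-up placeholders.

    from openpyxl import load_workbook

    wb = load_workbook("tracker.xlsx")            # placeholder file name
    ws = wb.active
    # Assumed layout: column 5 holds the JIRA defect key for each test case row
    ws.cell(row=12, column=5, value="SWS-1234")   # placeholder row and defect key
    wb.save("tracker.xlsx")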
This is the entire procedure. It is not long, but it can take some time to retest the cases and find or log the correct information in JIRA to properly document a failed test case.
I do not want to see any test cases being failed a second time for the exact same defect.
The goal is that each time you retest a failed test case, you will come up with a different defect to fail it. If a failed test case is going through an IRS, it absolutely cannot be failed again for the same defect.