✅ Key Concepts – TM 2.3.5 Defect Report Information
📌 Purpose of Defect Report:
A defect report should serve four main purposes:
- Defect lifecycle management
- Project status assessment
- Assessment of product increment quality
- Process capability assessment
📌 Core Principles:
- Tailor the attributes to the project context to avoid overloading the team.
- Use consistent information across SDLC phases and projects.
- Include additional information for non-functional defects (e.g., load conditions for performance issues).
📋 Summary Table: Defect Report Data & Classification
| Purpose | Information Collected | Example |
|---|---|---|
| Defect Lifecycle Management | Title; description & steps to reproduce; severity; priority; owner; state | "Login button not responsive" – Severity: Major, Priority: High |
| Defect Tracking & Status | Unique ID; creation date/time; reporter; project/SDLC phase | ID: DEF-0231, found in System Test phase |
| Defect Resolution Support | Subsystem/component; test item & release; environment | Found in Payment Module on Release 2.3; Env: Android 13, Chrome browser |
| Project Quality & Progress Assessment | Type of defect; work product affected; quality characteristics affected; related requirements | Type: Performance; impacts requirement RQ-102 (response time) |
| Process Capability Assessment | Phase of introduction; detection phase; removal phase; root cause | Introduced: Design; Detected: System Test; Cause: missing validation |
| Project Decision-Making | Workaround availability; cost/risk of fix; references (linked test cases/requirements/defects) | Workaround available (manual entry); linked TC: TC-876, Req: RQ-198 |
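The attributes in the table above can be sketched as a simple data structure. This is a minimal illustrative sketch, not a mandated schema — the field names, enum values, and sample data are assumptions for the example:

```python
from dataclasses import dataclass, field
from enum import Enum

class Severity(Enum):
    MINOR = "minor"
    MAJOR = "major"
    CRITICAL = "critical"

@dataclass
class DefectReport:
    # Defect lifecycle management
    title: str
    description: str                  # includes steps to reproduce
    severity: Severity
    priority: str
    owner: str
    state: str = "open"
    # Defect tracking & status
    defect_id: str = ""
    reporter: str = ""
    sdlc_phase_detected: str = ""
    # Resolution support
    component: str = ""
    release: str = ""
    environment: str = ""
    # Process capability assessment
    phase_introduced: str = ""
    root_cause: str = ""
    # Project decision-making
    workaround: str = ""
    references: list = field(default_factory=list)  # linked TCs, requirements, defects

# Example instance using the table's sample data
report = DefectReport(
    title="Login button not responsive",
    description="Tap Login on the sign-in screen; no response.",
    severity=Severity.MAJOR,
    priority="high",
    owner="dev-team-auth",
    defect_id="DEF-0231",
    sdlc_phase_detected="System Test",
)
print(report.defect_id, report.severity.value)  # DEF-0231 major
```

In a real project the attribute set would be tailored to the context, per the core principles above — e.g., adding load conditions only for performance defects.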
🧠 Mind Map – TM 2.3.5 Defect Report Information (diagram omitted)
🔹 Short Scenario 1 (MCQ):
A tester finds a defect in the login module that occurs only under a rare configuration. Which information must be included for meaningful resolution?
A. Steps to reproduce, test environment, severity
B. Test summary, stakeholder approval, budget impact
C. Tester's resume, module roadmap, deadline
D. Number of lines of code changed, build duration
✅ Answer: A
🔹 Short Scenario 2 (MCQ):
In your project, the customer asks to track defect trends and phases of introduction. What key data should your defect report capture?
A. Browser version and team size
B. SDLC phase of detection and root cause
C. Code coverage metrics
D. Developer comments
✅ Answer: B
🔸 Long Scenario 3 (MCQ):
Scenario: During system testing, you find a performance issue that crashes the mobile app when the server is under high load. A workaround is available through a manual retry.
Which of the following data fields should be captured in the defect report to support both resolution and project-level decisions?
A. Title, severity, workaround, test environment, performance load conditions
B. Budget allocation, feature roadmap, stakeholder list
C. Testing team org chart, country, timezone of tester
D. Version control commits, static code analysis history
✅ Answer: A
🔸 Long Scenario 4 (MCQ):
Scenario: Your defect report process includes capturing the test item, release version, and the test phase in which the issue was found. Your manager wants to improve process effectiveness.
Which field addition would best help with process capability analysis?
A. User feedback score
B. Phase of defect introduction and root cause
C. Priority and workaround availability
D. Link to internal wiki
✅ Answer: B
Q1. Scenario: A defect was found in a high-priority business function. It's causing intermittent failure under heavy load. The fix is costly, and a workaround exists.
What information helps both in defect resolution and stakeholder decision-making?
A. Screenshot, build logs
B. Priority, severity, performance load, workaround
C. Unit test pass status
D. Reviewer notes and peer comments
✅ Answer: B
Q2. Scenario: A tester identifies a UI glitch. It doesn’t block any workflow but breaks visual consistency. The manager wants to track such defects for process analysis.
What data helps process capability assessment?
A. Browser type and screen resolution
B. UI toolkit used
C. Phase of introduction and root cause
D. Component developer’s experience
✅ Answer: C
Q3. Scenario: A customer asks for defect trend analysis across releases. What consistent information is essential?
A. Defect ID, SDLC phase of detection, defect type
B. Developer salary and project budget
C. Change log frequency
D. Daily status reports
✅ Answer: A
Q4. Scenario: A tester enters only the title and severity of a defect. What’s missing for effective resolution?
A. Deadline for fix
B. Detailed description and reproduction steps
C. Root cause
D. Product owner’s name
✅ Answer: B
Q5. Scenario: The team wants to measure how early defects are caught. Which defect report attribute is needed?
A. Owner of defect
B. Date of report
C. Phase of introduction
D. Build release number
✅ Answer: C
Q6. Scenario: A high-risk security flaw is identified post-release. What attributes should be prioritized in the report?
A. Severity, quality characteristic affected, workaround
B. Uptime logs
C. Browser cache size
D. Tester background
✅ Answer: A
Q7. Scenario: Your team struggles with duplicate defect reports. What can help reduce this?
A. Auto-assignment of defects
B. Summary, ID, links to test case and defect taxonomy
C. Color-coded tags
D. Developer time tracking
✅ Answer: B
Q8. Scenario: You need to prioritize defects impacting stakeholders. What fields help?
A. Impacted requirement, severity, stakeholder priority
B. Budget, company vision
C. Number of lines in the module
D. Tester's availability
✅ Answer: A
Q9. Scenario: Defects across teams need comparison. What’s essential?
A. Same naming convention for files
B. Consistent defect attributes and taxonomy
C. Uniform break times
D. Shared chat group
✅ Answer: B
Q10. Scenario: The Test Manager wants to identify the phase in which most defects are introduced. What's relevant?
A. Priority field
B. Introduction phase and root cause
C. Team composition
D. Weekend work hours
✅ Answer: B
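The process-capability theme running through Q5 and Q10 — finding where defects originate — amounts to tallying the phase of introduction across reports. A minimal sketch, with invented sample data purely for illustration:

```python
from collections import Counter

# Hypothetical defect records: (id, phase introduced, phase detected)
defects = [
    ("DEF-1", "Design", "System Test"),
    ("DEF-2", "Coding", "Component Test"),
    ("DEF-3", "Design", "Acceptance Test"),
    ("DEF-4", "Requirements", "System Test"),
]

# Count how many defects originate in each phase
introduced = Counter(phase for _, phase, _ in defects)
print(introduced.most_common(1))  # [('Design', 2)]
```

Comparing introduction and detection phases per defect also reveals how long defects survive before being caught (defect leakage), which is why consistent capture of both fields matters.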