Exam Name | Certified Software Tester - Foundation Level (CSTFL)
---|---
Exam Code | CTFL
Actual Exam Duration | 90 minutes
Expected No. of Questions in Actual Exam | 60
Official Information | https://astqb.org/certifications/foundation-level-certification/
See Expected Questions | GAQM CTFL Expected Questions in Actual Exam
Take Self-Assessment | Use the GAQM CTFL practice test to assess your preparation, save time, and reduce the chance of failure
Section | Objectives
---|---
1 Fundamentals of Testing | Learning Objectives for Fundamentals of Testing<br>1.1 What is Testing?<br>FL-1.1.1 (K1) Identify typical objectives of testing<br>FL-1.1.2 (K2) Differentiate testing from debugging<br>1.2 Why is Testing Necessary?<br>FL-1.2.1 (K2) Give examples of why testing is necessary<br>FL-1.2.2 (K2) Describe the relationship between testing and quality assurance and give examples of how testing contributes to higher quality<br>FL-1.2.3 (K2) Distinguish between error, defect, and failure<br>FL-1.2.4 (K2) Distinguish between the root cause of a defect and its effects<br>1.3 Seven Testing Principles<br>FL-1.3.1 (K2) Explain the seven testing principles<br>1.4 Test Process<br>FL-1.4.1 (K2) Explain the impact of context on the test process<br>FL-1.4.2 (K2) Describe the test activities and respective tasks within the test process<br>FL-1.4.3 (K2) Differentiate the work products that support the test process<br>FL-1.4.4 (K2) Explain the value of maintaining traceability between the test basis and test work products<br>1.5 The Psychology of Testing<br>FL-1.5.1 (K1) Identify the psychological factors that influence the success of testing<br>FL-1.5.2 (K2) Explain the difference between the mindset required for test activities and the mindset required for development activities |
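The distinction drawn in FL-1.2.3 between error, defect, and failure can be made concrete with a small sketch. The function below is purely illustrative and not part of the official syllabus: a programmer's error (a human mistake) introduces a defect (a flaw in the code), and executing the defective code produces a failure (incorrect observable behavior).

```python
# Illustrative only: error (human mistake) -> defect (flaw in the code)
# -> failure (incorrect behavior observed when the code runs).

def average(values):
    # Defect: the programmer's error was dividing by len(values) + 1
    # instead of len(values).
    return sum(values) / (len(values) + 1)

def average_fixed(values):
    # Corrected version: divides by the actual number of values.
    return sum(values) / len(values)

# Failure: executing the defective code yields a wrong result.
print(average([2, 4, 6]))        # prints 3.0 instead of the expected 4.0
print(average_fixed([2, 4, 6]))  # prints 4.0
```

Note that the defect exists in the code whether or not it is ever executed; the failure only appears at runtime, which is why static techniques (Chapter 3) can find defects that dynamic testing has not yet triggered.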
2 Testing Throughout the Software Development Lifecycle | Learning Objectives for Testing Throughout the Software Development Lifecycle<br>2.1 Software Development Lifecycle Models<br>FL-2.1.1 (K2) Explain the relationships between software development activities and test activities in the software development lifecycle<br>FL-2.1.2 (K1) Identify reasons why software development lifecycle models must be adapted to the context of project and product characteristics<br>2.2 Test Levels<br>FL-2.2.1 (K2) Compare the different test levels from the perspective of objectives, test basis, test objects, typical defects and failures, and approaches and responsibilities<br>2.3 Test Types<br>FL-2.3.1 (K2) Compare functional, non-functional, and white-box testing<br>FL-2.3.2 (K1) Recognize that functional, non-functional, and white-box tests occur at any test level<br>FL-2.3.3 (K2) Compare the purposes of confirmation testing and regression testing<br>2.4 Maintenance Testing<br>FL-2.4.1 (K2) Summarize triggers for maintenance testing<br>FL-2.4.2 (K2) Describe the role of impact analysis in maintenance testing |
3 Static Testing | Learning Objectives for Static Testing<br>3.1 Static Testing Basics<br>FL-3.1.1 (K1) Recognize types of software work product that can be examined by the different static testing techniques<br>FL-3.1.2 (K2) Use examples to describe the value of static testing<br>FL-3.1.3 (K2) Explain the difference between static and dynamic techniques, considering objectives, types of defects to be identified, and the role of these techniques within the software lifecycle<br>3.2 Review Process<br>FL-3.2.1 (K2) Summarize the activities of the work product review process<br>FL-3.2.2 (K1) Recognize the different roles and responsibilities in a formal review<br>FL-3.2.3 (K2) Explain the differences between different review types: informal review, walkthrough, technical review, and inspection<br>FL-3.2.4 (K3) Apply a review technique to a work product to find defects<br>FL-3.2.5 (K2) Explain the factors that contribute to a successful review |
4 Test Techniques | Learning Objectives for Test Techniques<br>4.1 Categories of Test Techniques<br>FL-4.1.1 (K2) Explain the characteristics, commonalities, and differences between black-box test techniques, white-box test techniques, and experience-based test techniques<br>4.2 Black-box Test Techniques<br>FL-4.2.1 (K3) Apply equivalence partitioning to derive test cases from given requirements<br>FL-4.2.2 (K3) Apply boundary value analysis to derive test cases from given requirements<br>FL-4.2.3 (K3) Apply decision table testing to derive test cases from given requirements<br>FL-4.2.4 (K3) Apply state transition testing to derive test cases from given requirements<br>FL-4.2.5 (K2) Explain how to derive test cases from a use case<br>4.3 White-box Test Techniques<br>FL-4.3.1 (K2) Explain statement coverage<br>FL-4.3.2 (K2) Explain decision coverage<br>FL-4.3.3 (K2) Explain the value of statement and decision coverage<br>4.4 Experience-based Test Techniques<br>FL-4.4.1 (K2) Explain error guessing<br>FL-4.4.2 (K2) Explain exploratory testing<br>FL-4.4.3 (K2) Explain checklist-based testing |
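The K3 objectives FL-4.2.1 and FL-4.2.2 expect candidates to actually derive test cases, so a worked sketch helps. The requirement and the function below are hypothetical, chosen only to illustrate the two techniques: given a rule that ages 18 to 65 (inclusive) are eligible, equivalence partitioning picks one representative value per partition, while boundary value analysis tests the values on and just outside each edge.

```python
# Hypothetical requirement: applicants aged 18 to 65 (inclusive) are eligible.
def is_eligible(age):
    return 18 <= age <= 65

# Equivalence partitioning: one representative value per partition.
partitions = {
    "invalid_below": 10,   # ages < 18
    "valid":         40,   # 18 <= age <= 65
    "invalid_above": 70,   # ages > 65
}
assert not is_eligible(partitions["invalid_below"])
assert is_eligible(partitions["valid"])
assert not is_eligible(partitions["invalid_above"])

# Boundary value analysis: test on and just outside each boundary,
# where off-by-one defects (e.g. writing < instead of <=) tend to hide.
for age, expected in [(17, False), (18, True), (65, True), (66, False)]:
    assert is_eligible(age) == expected

print("all black-box test cases passed")
```

Running all seven cases also happens to achieve full statement and decision coverage of `is_eligible` (FL-4.3.1 and FL-4.3.2), since both the true and false outcomes of the condition are exercised.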
5 Test Management | Learning Objectives for Test Management<br>5.1 Test Organization<br>FL-5.1.1 (K2) Explain the benefits and drawbacks of independent testing<br>FL-5.1.2 (K1) Identify the tasks of a test manager and tester<br>5.2 Test Planning and Estimation<br>FL-5.2.1 (K2) Summarize the purpose and content of a test plan<br>FL-5.2.2 (K2) Differentiate between various test strategies<br>FL-5.2.3 (K2) Give examples of potential entry and exit criteria<br>FL-5.2.4 (K3) Apply knowledge of prioritization, and technical and logical dependencies, to schedule test execution for a given set of test cases<br>FL-5.2.5 (K1) Identify factors that influence the effort related to testing<br>FL-5.2.6 (K2) Explain the difference between two estimation techniques: the metrics-based technique and the expert-based technique<br>5.3 Test Monitoring and Control<br>FL-5.3.1 (K1) Recall metrics used for testing<br>FL-5.3.2 (K2) Summarize the purposes, contents, and audiences for test reports<br>5.4 Configuration Management<br>FL-5.4.1 (K2) Summarize how configuration management supports testing<br>5.5 Risks and Testing<br>FL-5.5.1 (K1) Define risk level by using likelihood and impact<br>FL-5.5.2 (K2) Distinguish between project and product risks<br>FL-5.5.3 (K2) Describe, by using examples, how product risk analysis may influence the thoroughness and scope of testing<br>5.6 Defect Management<br>FL-5.6.1 (K3) Write a defect report, covering a defect found during testing |
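FL-5.5.1 defines risk level in terms of likelihood and impact. One common way to sketch this, though the syllabus does not mandate any particular scale, is a likelihood-times-impact matrix; the 1-to-3 ratings and thresholds below are arbitrary examples, not official values.

```python
# Illustrative risk matrix: likelihood and impact each rated 1 (low) to 3 (high).
# Risk level = likelihood x impact; the thresholds are example choices.

def risk_level(likelihood, impact):
    score = likelihood * impact
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

# A likely defect (3) with severe impact (3) is a high product risk, so per
# FL-5.5.3 it would get more thorough testing than a low-risk area.
print(risk_level(3, 3))  # high
print(risk_level(1, 2))  # low
```

The same two factors apply to project risks (FL-5.5.2), such as a key tester leaving, but those influence the test plan rather than the scope of individual test cases.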
6 Tool Support for Testing | Learning Objectives for Test Tools<br>6.1 Test Tool Considerations<br>FL-6.1.1 (K2) Classify test tools according to their purpose and the test activities they support<br>FL-6.1.2 (K1) Identify benefits and risks of test automation<br>FL-6.1.3 (K1) Remember special considerations for test execution and test management tools<br>6.2 Effective Use of Tools<br>FL-6.2.1 (K1) Identify the main principles for selecting a tool<br>FL-6.2.2 (K1) Recall the objectives for using pilot projects to introduce tools<br>FL-6.2.3 (K1) Identify the success factors for evaluation, implementation, deployment, and ongoing support of test tools in an organization |
Cramtick's study material includes both practice questions and a practice test. Together, the GAQM CTFL exam questions and practice test are the best way to walk into the exam confident and well prepared. To pass the actual Certified Software Tester - Foundation Level (CSTFL) CTFL exam on the first attempt, work through these GAQM CTFL questions thoroughly; they serve as an updated study guide covering the whole exam syllabus. While studying the actual questions, also take the GAQM CTFL practice test for self-analysis and exam simulation. Repeated runs through the Certified Software Tester - Foundation Level (CSTFL) CTFL practice test will help you eliminate your mistakes. The CTFL practice test is available in online, Windows-based, and Mac-based formats for self-assessment.