
PMI Professional in Business Analysis (PMI-PBA)® Ultimate Cheat Sheet

5 Domains • 55 Concepts • Approx. 7 pages

Your Quick Reference Study Guide

This cheat sheet covers the core concepts, terms, and definitions you need to know for the PMI Professional in Business Analysis (PMI-PBA)®. We've distilled the most important domains, topics, and critical details to help your exam preparation.

💡 Note: While this study guide highlights essential concepts, it's designed to complement—not replace—comprehensive learning materials. Use it for quick reviews, last-minute prep, or to identify areas that need deeper study before your exam.


About This Cheat Sheet: This study guide covers core concepts for PMI Professional in Business Analysis (PMI-PBA)®. It highlights key terms, definitions, common mistakes, and frequently confused topics to support your exam preparation.

Use this as a quick reference alongside comprehensive study materials.

PMI Professional in Business Analysis (PMI-PBA)® Cheat Sheet • Provided by GetMocka.com


Needs Assessment

18%

Solution Scope: Scope Statement & Business Case Inputs

Define the solution's boundaries, measurable deliverables, inclusions/exclusions tied to the business case.

Key Insight

Deliverables are outcomes with acceptance criteria; objectives must be SMART and map to the business case.

Often Confused With

Project Scope Statement • Business Case • Tasks/Activities

Common Mistakes

  • Accepting assumptions or constraints without verifying impacts, dependencies, or risks.
  • Using 'goal' and 'objective' interchangeably; objectives must be specific and measurable.
  • Confusing deliverables with tasks; listing a deliverable without acceptance criteria.

Problem Statement: Situation, Impact & Success Criteria

Solution-neutral summary of the problem/opportunity: who/what's affected, impacts, and measurable success criteria.

Key Insight

Keep it solution-neutral and include measurable success criteria that drive scope and acceptance decisions.

Often Confused With

Root Cause Analysis • Solution Proposal

Common Mistakes

  • Stating a preferred solution or technical details inside the problem statement.
  • Omitting measurable success criteria or restricting them to only financial metrics.
  • Confusing the statement with a root-cause analysis; it describes the situation, not causes.

Value Proposition Analysis (Fin & Non‑Fin)

Estimate and compare initiative value using financial and qualitative metrics to justify, prioritize, and realize gains.

Key Insight

Combine monetary and non‑monetary benefits, state assumptions, and run sensitivity to see which drivers change the decision.

Often Confused With

Financial valuation metrics (NPV, IRR, ROI) • Cost–Benefit Analysis

Common Mistakes

  • Ignoring non‑financial benefits (NPS, customer retention, strategic value).
  • Relying on a single metric (only ROI) without sensitivity or scenario checks.
  • Treating business‑case targets as immutable after scope or market changes.

NPV • IRR • ROI — Core Financial Metrics

NPV = discounted cash‑flow value today; IRR = discount rate that zeros NPV; ROI = simple return ratio (no time value).

Key Insight

Use NPV for absolute value and mutually exclusive choices; IRR is a rate (can be multiple); ROI ignores timing — choose metric per question.

Often Confused With

Value proposition analysis • Payback Period

Common Mistakes

  • Treating ROI as time‑aware (it ignores cash‑flow timing unless adjusted).
  • Picking higher IRR when a lower‑IRR project has higher NPV for mutually exclusive options.
  • Using a single corporate WACC for every project regardless of project risk or horizon.
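The three metrics above can be sketched in a few lines of Python. This is a hypothetical illustration, not part of the exam content: the cash flows and the bisection bounds are invented.

```python
# Illustrative sketch: NPV, ROI, and IRR (found by bisection) for a
# simple yearly cash-flow series. cash_flows[0] is the initial
# investment (negative); later entries are yearly inflows.

def npv(rate, cash_flows):
    """Discounted value today of a series of yearly cash flows."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def roi(cash_flows):
    """Simple return ratio; note it ignores the time value of money."""
    invested = -cash_flows[0]
    gain = sum(cash_flows[1:]) - invested
    return gain / invested

def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-6):
    """Discount rate that zeros NPV, found by bisection."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid          # NPV still positive: rate must rise
        else:
            hi = mid
    return (lo + hi) / 2

flows = [-1000, 400, 400, 400]      # invest 1000, receive 400/yr for 3 years
print(round(npv(0.10, flows), 2))   # -5.26 at a 10% discount rate
print(roi(flows))                   # 0.2 -> 20% simple return
print(round(irr(flows), 4))         # ~0.097: rate where NPV crosses zero
```

Note how the same project shows a positive 20% ROI but a slightly negative NPV at a 10% discount rate, which is exactly why the metric must be chosen per question.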

SMART Goals (Specific, Measurable, Achievable, Relevant, Time-bound)

Checklist that makes objectives testable and ties scope to verifiable acceptance criteria.

Key Insight

Measurable can be numeric or a proxy; 'Achievable' means realistic against capacity/resources, not easiest.

Often Confused With

Acceptance Criteria • KPIs (Key Performance Indicators)

Common Mistakes

  • Treating 'Measurable' as only numeric; proxies or qualitative indicators can satisfy it.
  • Setting 'Achievable' as the easiest target instead of realistic given capability/resources.
  • Adding a deadline that isn't tied to when the outcome can actually be verified.

Organizational Goals & Objectives — Strategic Alignment

High-level strategic targets used to map, prioritize, and validate the business value of requirements.

Key Insight

Map each requirement to the single most relevant prioritized objective and validate it using SMART attributes.

Often Confused With

Project Deliverables • Requirements Traceability Matrix

Common Mistakes

  • Assuming objectives needn't be measurable or time‑bound, ignoring SMART criteria.
  • Attempting to align every initiative to every goal instead of the most relevant prioritized goal.
  • Conflating goals/objectives with deliverables, or treating the traceability matrix as the same artifact.

Change Control & Approval Matrix

Formal workflows, decision rights and criteria that authorize changes to baselined requirements.

Key Insight

Treat approval as a ruleset: who approves, who must be consulted, and the objective criteria plus traceability.

Often Confused With

Change Control Board (CCB) • Configuration Management

Common Mistakes

  • Skipping decision criteria or traceability — makes approvals inconsistent and un‑auditable.
  • Treating consultation as optional — stakeholder feedback often changes approval decisions.
  • Using sign‑off as a signature only — it must record conditions, commitments and context.

Stakeholder ID & Segmentation

Identify and classify people/groups by interest, influence, role and info needs to plan elicitation and representation.

Key Insight

Do it repeatedly — map influence + interest to who needs one‑on‑one, who fits group sessions, and who needs escalation.

Often Confused With

Stakeholder Analysis • Stakeholder Engagement • RACI Matrix

Common Mistakes

  • Ignoring informal influencers — titles don't equal influence.
  • Treating ID as one‑and‑done at kickoff — stakeholders evolve; revisit regularly.
  • Assuming group sessions replace individual interviews — some stakeholders need private elicitation.
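The influence/interest mapping described above can be sketched as a small grid classifier. The thresholds, stakeholder names, and approach labels below are illustrative assumptions, not exam material:

```python
# Hypothetical sketch: classify stakeholders on an influence/interest
# grid to pick an elicitation approach. Scales and labels are invented.

def engagement_approach(influence, interest):
    """Map influence and interest (0-10 scales) to an elicitation style."""
    high_inf, high_int = influence >= 5, interest >= 5
    if high_inf and high_int:
        return "one-on-one interviews, manage closely"
    if high_inf:
        return "keep satisfied, escalation path"
    if high_int:
        return "group workshops, keep informed"
    return "monitor, periodic updates"

stakeholders = {"Sponsor": (9, 8), "Ops lead": (3, 9), "Auditor": (8, 2)}
for name, (inf, intr) in stakeholders.items():
    print(name, "->", engagement_approach(inf, intr))
```

Because stakeholders evolve, re-running this classification as influence and interest shift is the point of the "do it repeatedly" insight.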

Prioritization & Allocation — MoSCoW, RICE, Weighted Scoring

Rank and assign requirements to releases using scoring, trade-offs, and constraints to build a traceable prioritized baseline.

Key Insight

Weighted scores are input, not a mandate — combine value×effort, risks, dependencies and strategy, then validate with stakeholders before allocating.

Often Confused With

Release Planning • Cost-Benefit Analysis

Common Mistakes

  • Treating stakeholder votes as the same as prioritization
  • Assuming weighted scores are fully objective and replace negotiation
  • Prioritizing only by cost; ignoring benefits, risk, or dependencies
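A weighted-scoring pass might look like the sketch below. The criteria names, weights, and scores are invented, and the resulting ranking is an input to stakeholder negotiation, not a mandate:

```python
# Illustrative weighted scoring: criterion weights times per-requirement
# scores give a ranking. Effort has a negative weight so high effort
# lowers the score. All names and numbers are made up.

WEIGHTS = {"value": 0.4, "risk_reduction": 0.2, "effort": -0.25, "strategic_fit": 0.15}

def weighted_score(scores):
    """Sum of weight * score over the shared criteria (1-10 scales)."""
    return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

requirements = {
    "REQ-1 self-service portal": {"value": 9, "risk_reduction": 3, "effort": 8, "strategic_fit": 9},
    "REQ-2 audit logging":       {"value": 5, "risk_reduction": 9, "effort": 3, "strategic_fit": 6},
    "REQ-3 legacy report":       {"value": 4, "risk_reduction": 2, "effort": 6, "strategic_fit": 2},
}

ranked = sorted(requirements, key=lambda r: weighted_score(requirements[r]), reverse=True)
for r in ranked:
    print(f"{r}: {weighted_score(requirements[r]):.2f}")
```

Note the highest-value item is not the top-ranked one once risk reduction and effort are weighed in, which is why scores justify a conversation rather than replace it.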

MultiVoting & Nominal Group Technique (NGT)

Facilitated multi-round method: silent idea capture, clarification, then limited iterative voting to surface ranked priorities.

Key Insight

NGT reveals relative priorities via structured rounds and facilitation — it's consensus-building, not a single yes/no tally.

Often Confused With

Delphi Technique • Simple Majority Vote

Common Mistakes

  • Treating NGT as a one-shot yes/no vote
  • Believing more votes per person always improves discrimination
  • Assuming results are bias-free; framing and facilitation shape outcomes
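A single multivoting round can be sketched as a limited-vote tally; the participants, ideas, and vote limit below are invented:

```python
# Hypothetical sketch of one multivoting round: each participant gets a
# limited number of votes, and the tally surfaces a ranked shortlist for
# the next round rather than a single yes/no winner.

from collections import Counter

def tally_round(ballots, votes_per_person=3):
    """Count limited votes per participant; return ideas ranked by support."""
    counts = Counter()
    for person, picks in ballots.items():
        for idea in picks[:votes_per_person]:   # enforce the vote limit
            counts[idea] += 1
    return counts.most_common()

ballots = {
    "Ana":  ["A", "C", "D"],
    "Ben":  ["C", "D", "E"],
    "Caro": ["C", "A", "B"],
}
print(tally_round(ballots))   # idea C leads with 3 votes
```

Feeding the top few items into another silent round of discussion and voting is what makes this consensus-building rather than a one-shot tally.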

Planning

22%

Business Case — Living Investment Blueprint

Economic and strategic justification for an initiative, updated across the lifecycle to guide investment and alignment.

Key Insight

Approval doesn't freeze assumptions — maintain and govern the business case with stakeholders; traceability alone doesn't prove business relevance.

Often Confused With

Project Charter • Cost–Benefit Analysis

Common Mistakes

  • Treating the business case as final after approval
  • Updating only financials; ignoring scope, risks, assumptions
  • Assuming traceability guarantees business alignment

Benefits Valuation & Realization (KPIs → Value)

Assess expected vs realized benefits using KPIs, acceptance criteria, owners, timing, evidence and corrective actions.

Key Insight

Measure with multiple metrics (financial + nonfinancial), tie each metric to an owner/evidence, and re‑baseline targets when scope or market changes.

Often Confused With

ROI • Benefits Realization Plan • Performance Metrics

Common Mistakes

  • Assuming business-case targets are immutable
  • Relying solely on a single financial metric (e.g., ROI)
  • Ignoring nonfinancial benefits like customer satisfaction or risk reduction

Traceability Matrix (RTM) Baseline Checklist

Tabular map linking requirements to needs, design, tests, and acceptance; baseline for change control and coverage checks.

Key Insight

The baseline is a controlled snapshot—evidence of approved links at a point in time, not automatic proof of coverage; maintain and analyze it.

Often Confused With

Traceability Trees • Requirements Specification • Test Cases

Common Mistakes

  • Treating the RTM as only a test-case map—must include needs, sources, design and acceptance criteria.
  • Creating the RTM once and never updating it; the baseline requires controlled maintenance.
  • Assuming a baseline equals proof of full coverage without doing coverage analysis.
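A minimal RTM sketch with a coverage scan is shown below; the column names and IDs are invented. The point from the insight above is that a baseline snapshot is not proof of coverage, so gaps must be looked for explicitly:

```python
# Illustrative RTM as a list of link rows, plus a coverage check that
# reports requirements lacking any of the expected link types.

rtm = [
    {"req": "REQ-1", "need": "BN-1", "design": "D-4", "test": "TC-11", "acceptance": "AC-1"},
    {"req": "REQ-2", "need": "BN-1", "design": "D-7", "test": None,    "acceptance": "AC-2"},
    {"req": "REQ-3", "need": None,   "design": "D-2", "test": "TC-09", "acceptance": "AC-3"},
]

def coverage_gaps(rows, links=("need", "design", "test", "acceptance")):
    """Return {requirement: [missing link types]} for incomplete rows."""
    gaps = {}
    for row in rows:
        missing = [link for link in links if not row.get(link)]
        if missing:
            gaps[row["req"]] = missing
    return gaps

print(coverage_gaps(rtm))   # {'REQ-2': ['test'], 'REQ-3': ['need']}
```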

Traceability Trees — Upstream & Downstream Paths

Hierarchical diagrams of parent/child and upstream/downstream links that expose dependencies and impact paths for changes.

Key Insight

Use trees to trace origin-to-impact paths—show bi-directional links and IDs, and keep trees synced with the RTM for accurate impact analysis.

Often Confused With

Traceability Matrix (RTM) • Requirements Hierarchy • Dependency Graphs

Common Mistakes

  • Confusing trees with matrices—trees show paths; matrices show many-to-many link tables.
  • Assuming trees only show downstream links; they must show upstream origins too.
  • Using trees to replace requirement IDs or trace tools—trees must reference IDs and be maintained.
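The upstream/downstream walks can be sketched as simple traversals over parent/child links; the IDs below are invented:

```python
# Illustrative traceability tree: walking downstream shows what a change
# impacts; walking upstream shows where an artifact came from.

children = {"BN-1": ["REQ-1", "REQ-2"], "REQ-1": ["TC-11"], "REQ-2": ["TC-12", "TC-13"]}
parents = {c: p for p, kids in children.items() for c in kids}

def downstream(node):
    """All artifacts impacted if `node` changes (children, recursively)."""
    out = []
    for kid in children.get(node, []):
        out.append(kid)
        out.extend(downstream(kid))
    return out

def upstream(node):
    """Origin path from `node` back to its source business need."""
    path = []
    while node in parents:
        node = parents[node]
        path.append(node)
    return path

print(downstream("BN-1"))   # ['REQ-1', 'TC-11', 'REQ-2', 'TC-12', 'TC-13']
print(upstream("TC-13"))    # ['REQ-2', 'BN-1']
```

Both directions traverse the same link data, which is why trees must reference the same IDs as the RTM and stay synced with it.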

RMP — Lifecycle, Traceability & Approval Rules

A blueprint of roles, processes, tools, baselines and sequenced activities from elicitation to evaluation to assure traceability.

Key Insight

RMP = process map + approval gates: sequence activities so every requirement has a traceable path and controlled baselines.

Often Confused With

Requirements management • Traceability matrix

Common Mistakes

  • Thinking approval happens only at project end — RMP mandates ongoing baselining and change control.
  • Stopping elicitation after initial interviews — RMP requires iterative discovery and refinement cycles.
  • Reducing RMP to just a traceability matrix — it also defines roles, tools, processes, sequencing and governance.

Requirements Management — Trace, Control, Sustain Value

Processes, tools and controls to document, trace, change and preserve requirements so the solution sustains business value.

Key Insight

Select traceability depth by risk, cost and governance — management sustains long‑term value, not just version control.

Often Confused With

Requirements elicitation • Requirements management plan • Traceability matrix

Common Mistakes

  • Assuming full bidirectional traceability is mandatory for every project — tailor to risk and cost.
  • Treating requirements management as mere paperwork or versioning, not as value sustainment.
  • Confusing requirements management with elicitation — one governs, the other discovers needs.

Integrated Change Control (ICC) — Baseline Guardrails

Formal cross‑project process to review, approve/reject, and track changes to keep scope/schedule/cost/quality baselines intact.

Key Insight

ICC enforces the RMP‑defined approval path and assesses impacts across all baselines — it controls decisions, not the RMP or config items.

Often Confused With

Configuration Management • Requirements Management Plan (RMP)

Common Mistakes

  • Thinking ICC only applies to technical deliverables, not requirements or documents.
  • Assuming the project manager alone approves changes; ignores governance/stakeholder roles.
  • Confusing ICC with configuration management or assuming RMP is owned by ICC.

Change Control Board (CCB) — Decision Forum

Cross‑functional panel that reviews requirement change requests, weighs impact/risk/stakeholder input, and preserves baselines.

Key Insight

CCB decides using documented impact/risk assessments and RMP criteria; it must include relevant stakeholders/SMEs and does not itself perform detailed impact analysis.

Often Confused With

Project Manager Approval • Configuration Management

Common Mistakes

  • Believing the CCB automatically approves any stakeholder request.
  • Assuming only the project manager or a single role sits on the CCB.
  • Expecting the CCB to perform impact analysis rather than review documented assessments.

Document Control & Versioning — Baselines & Audit Trails

Formal practices to identify, baseline, store, and control requirement docs with auditable change history.

Key Insight

A baseline + configuration control creates the single authoritative version; file timestamps alone do not prove approval or traceability.

Often Confused With

Configuration Management • Change Control • Traceability Matrix

Common Mistakes

  • Using renames/timestamps as a 'version control' shortcut — not an approved baseline.
  • Believing version control is only for source code, not requirements or project documents.
  • Assuming a repository removes need for formal baselining or configuration control.

OPAs & Tailoring: Validate, Adapt, Record

Review organizational templates/assets, tailor them to project needs, and document impacts on governance, traceability, and versioning.

Key Insight

Tailoring is mandatory: applying an OPA without validating/updating approvals, trace links, or versioning creates audit gaps.

Often Confused With

Enterprise Environmental Factors (EEFs) • Standards and Templates

Common Mistakes

  • Assuming an OPA fits—skip applicability checks and miss required governance changes.
  • Restricting tailoring to templates only and ignoring process or approval adjustments.
  • Treating OPAs and EEFs as interchangeable.

Critical Success Factors (CSFs) — Strategic Musts

The few high‑level areas that must go right; they direct which KPIs and acceptance criteria you define.

Key Insight

CSFs are WHAT must succeed; convert each CSF into measurable KPIs and stakeholder‑agreed acceptance criteria.

Often Confused With

KPIs • Acceptance Criteria

Common Mistakes

  • Treating CSFs as KPIs — CSFs are areas of success, KPIs are the measures.
  • Assuming CSFs must be financial — include operational, customer, and compliance CSFs.
  • Defining CSFs solo or never revising them — they must be agreed with stakeholders and refined as needed.

Definition of Done (DoD) — Team Completion Checklist

A team‑owned checklist of criteria an increment must meet to be considered complete; enforces quality, not stakeholder approval.

Key Insight

DoD is a team/process quality gate for increments; it ensures technical completeness but does not replace stakeholder acceptance.

Often Confused With

Acceptance Criteria • Definition of Ready

Common Mistakes

  • Using DoD as stakeholder acceptance — DoD confirms completeness, stakeholders still must accept value.
  • Treating DoD as a single‑owner artifact — DoD should be created and owned by the whole delivery team.
  • Applying DoD only to code or never updating it — include docs/tests/ops and evolve it as practices change.

Analysis

35%

Traceability Map — IDs, Rationale & Test Links

Specify requirement attributes (ID, origin, rationale, priority, acceptance, test approach) and link rules to enable impact analysis.

Key Insight

Tailor link depth by risk/cost/governance — but always use a unique ID and consistent link structure to enable impact analysis.

Often Confused With

Version control • Requirements management plan • Traceability matrix

Common Mistakes

  • Tracing everything instead of selecting trace depth based on risk, cost and governance.
  • Leaving a requirement without a unique ID or consistent links; it is not traceable.
  • Treating traceability as a substitute for version control or document management.

Elicitation Toolkit — Individual & Group Methods

Choose interviews, workshops, observation, surveys or prototyping—and record requirement origin and rationale for trace.

Key Insight

Match technique to the information gap: interviews for tacit knowledge, observation for work-as-done, workshops for alignment; always capture who said what and why.

Often Confused With

Requirements gathering • Stakeholder statements • Stakeholder analysis

Common Mistakes

  • Assuming workshops create consensus; capture conflicts, decisions and action owners.
  • Recording stakeholder quotes as requirements without documenting the underlying rationale.
  • Relying on one-off elicitation; iterate, validate and reconfirm as requirements evolve.

Reqs Decomposition & Interface Mapping

Split stakeholder and solution needs into measurable, testable elements and map all dependencies and interfaces.

Key Insight

Decompose to the smallest testable outcome—but always capture explicit dependencies (team, data, timing, regulatory).

Often Confused With

Scope Definition • Work Breakdown Structure (WBS) • Interface Design

Common Mistakes

  • Decomposing only technical items while ignoring NFRs, business rules and org/process interfaces.
  • Assuming decomposed items are independent; skipping dependency sequencing and coupling risk.
  • Over-decomposing, which creates fragmentation and traceability overhead instead of clarity.

Process Modeling: Decisions, Flows, Handoffs

Use flowcharts/BPMN to expose activities, decisions, roles and handoffs that define requirements and gaps.

Key Insight

Use the simplest notation your audience will validate; focus models on decision points, exceptions, data handoffs and rework loops.

Often Confused With

Data Modeling • Use Case Modeling • System Architecture Diagrams

Common Mistakes

  • Treating modeling as static documentation and skipping stakeholder walkthroughs and validation.
  • Forcing one 'formal' notation on every audience instead of tailoring views.
  • Adding unnecessary micro-detail, which creates analysis paralysis and maintenance burden.

Delivery Lifecycles: Agile, Iterative, Incremental, Waterfall

Match lifecycle to project uncertainty, governance and BA activities; tailor and document scope controls.

Key Insight

Choose by uncertainty/time‑to‑value: Agile for evolving scope/fast feedback; Waterfall for stable scope/regulatory control.

Often Confused With

Tailoring • Hybrid approaches • Governance models

Common Mistakes

  • Treating tailoring as 'no governance' — always document deviations and controls.
  • Assuming Waterfall forbids change — apply formal change control and traceability.
  • Believing Agile means no documentation — record 'just enough' requirements and acceptance criteria.

Options Analysis: Score, Select, Disposition Requirements

Use documented, weighted criteria to compare solution options and decide which requirements to accept/defer/reject.

Key Insight

Decisions come from scored tradeoffs — include cost, benefit, risk, feasibility and strategic fit; scores justify dispositions.

Often Confused With

Cost‑benefit analysis • Feasibility study • Technical design

Common Mistakes

  • Treating options analysis as cost‑only — neglecting risk, feasibility and alignment.
  • Assuming chosen option auto‑accepts all linked requirements — each needs explicit disposition.
  • Relying on qualitative opinion alone — always document criteria, weights and scores for defensibility.

Requirements Baseline (Prioritized Scope)

Approved, prioritized set of validated requirements that defines the authorized scope and control point for changes.

Key Insight

Baseline is a formal control point: frozen for control until changed via formal change control; priority shows value/risk, not execution order.

Often Confused With

Project scope baseline • Schedule baseline • Requirements backlog

Common Mistakes

  • Treating the baseline as permanently fixed — it's changed only through formal change control.
  • Including every proposed requirement instead of only approved, prioritized items.
  • Assuming priority equals implementation order — it reflects value/risk, not sequencing.

Backlog Management — Groom, Prioritize, Ready

Continuous refinement and prioritization of work items to maximize value, readiness and traceability for delivery.

Key Insight

Backlog is dynamic: frequent grooming balances value, cost, risk and dependencies; items need acceptance criteria before scheduling.

Often Confused With

Product backlog • Requirements baseline • Requirements traceability matrix

Common Mistakes

  • Treating the backlog as a static list — it requires regular review and reprioritization.
  • Assuming backlog management only applies to Agile projects.
  • Prioritizing by urgency alone — ignore value, cost, risk and dependencies at your peril.

Delphi: Anonymous Consensus Rounds

Iterative anonymous expert rounds using feedback to converge estimates and secure stakeholder consensus for baseline‑ok.

Key Insight

Anonymity + iterative feedback reduces dominance and groupthink; use defined rounds/stop rules — consensus ≠ correctness.

Often Confused With

Nominal Group Technique • Focus Groups • Expert Surveys

Common Mistakes

  • Treating Delphi as a one‑shot survey instead of multi‑round with feedback.
  • Assuming consensus = correct decision; outputs still need validation.
  • Running open/face‑to‑face sessions or revealing names — destroys the method's bias control.
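A Delphi stop rule might be sketched as below; the spread threshold, round cap, and estimate values are invented for illustration:

```python
# Hypothetical Delphi stop rule: anonymous estimates are collected per
# round, and rounds continue until the relative spread falls below a
# threshold or a round cap is hit (consensus still isn't correctness).

import statistics

def delphi_converged(rounds_of_estimates, spread_limit=0.10, max_rounds=4):
    """Stop when relative spread (stdev / median) drops below the limit."""
    for i, estimates in enumerate(rounds_of_estimates, start=1):
        med = statistics.median(estimates)
        spread = statistics.stdev(estimates) / med
        if spread < spread_limit:
            return i, med          # converged at round i on value med
        if i >= max_rounds:
            break
    return None, None              # no consensus within the round cap

rounds = [
    [40, 90, 55, 120],   # round 1: wide disagreement
    [55, 70, 60, 80],    # round 2: narrowing after shared feedback
    [60, 65, 62, 66],    # round 3: tight enough to stop
]
print(delphi_converged(rounds))   # (3, 63.5)
```

Defining the stop rule up front is what keeps the rounds bounded; the converged value still needs validation afterwards.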

Metric-Driven Acceptance Criteria (AC & RTM)

Define measurable acceptance criteria with numeric targets and map each to specific test evidence in the RTM.

Key Insight

Every acceptance criterion must be measurable, have a numeric target, and link to concrete test evidence.

Often Confused With

Test cases • KPIs (Key Performance Indicators)

Common Mistakes

  • Writing vague or non-measurable criteria that can't be tested
  • Overloading with too many metrics — dilutes decision focus
  • Confusing the metric (measured value) with the numeric target/goal
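One way to keep criteria metric-driven is to store the metric, numeric target, and linked test ID together so the RTM can map each criterion to evidence. The field names and values below are invented:

```python
# Illustrative acceptance criteria: each carries a metric, a numeric
# target with a comparison direction, and a linked test ID.

criteria = [
    {"id": "AC-1", "metric": "search_latency_p95_ms", "target": 300,  "op": "<=", "test": "TC-21"},
    {"id": "AC-2", "metric": "import_success_rate",   "target": 0.99, "op": ">=", "test": "TC-22"},
]

measured = {"search_latency_p95_ms": 240, "import_success_rate": 0.985}

def evaluate(criteria, measured):
    """Compare measured values against each criterion's numeric target."""
    ops = {"<=": lambda a, b: a <= b, ">=": lambda a, b: a >= b}
    return {c["id"]: ops[c["op"]](measured[c["metric"]], c["target"]) for c in criteria}

print(evaluate(criteria, measured))   # {'AC-1': True, 'AC-2': False}
```

Note the separation of metric (the measured value) from target (the goal), the exact confusion the last bullet above warns against.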

Functional Requirements (Behaviors & Rules)

Specify user-visible behaviors, inputs/outputs and business rules — state WHAT must happen, not HOW to build it.

Key Insight

Functional requirements must be verifiable, technology-agnostic, and include acceptance criteria for traceability to tests.

Often Confused With

Non-functional requirements • Design specifications • User stories

Common Mistakes

  • Mixing functional statements with non-functional constraints
  • Embedding design/implementation details (the 'how')
  • Omitting acceptance criteria or testability and trace links

Requirements Quality Checklist (Complete, Consistent, Correct)

Nine attributes to judge if a requirement is handoff-ready: complete, correct, consistent, feasible, measurable, precise, testable, traceable, and unambiguous.

Key Insight

Measurable ≠ testable; traceability must link to source need, rationale and downstream artifacts; completeness = coverage of stakeholder needs and ACs

Often Confused With

Acceptance criteria • Traceability matrix • Verification vs Validation

Common Mistakes

  • Treating measurable and testable as identical
  • Calling a requirement 'traceable' just because it's documented
  • Treating completeness as including implementation details

Requirements Walkthrough — Author‑Led Review

Author-led group review to clarify intent, expose gaps, and gather stakeholder feedback before formal validation or sign-off.

Key Insight

Walkthroughs are informal, learning-focused sessions that expose functional gaps and assumptions; they build understanding and do not provide formal sign-off.

Often Confused With

Formal inspection • Peer review • Prototyping

Common Mistakes

  • Treating walkthroughs as formal inspections with defect-count sign-off
  • Using walkthroughs only to correct grammar or formatting
  • Assuming walkthroughs provide formal approval or replace prototypes/testing

Validation Methods — Reviews, Prototypes & Demos

Select verification/validation methods, set measurable acceptance metrics (MOV), derive test scenarios, and tie evidence to requirements.

Key Insight

Match method to risk and objective: reviews for correctness, prototypes for usability, demos/tests for measurable acceptance; always map evidence to requirements.

Often Confused With

Acceptance testing • Requirements elicitation • Verification methods

Common Mistakes

  • Using vague/high‑level acceptance criteria — they must be measurable and tied to the MOV.
  • Waiting until project close to validate — validate each increment or delivery.
  • Relying on peer walkthroughs/inspection alone without stakeholder or user evidence.

Nonfunctional Requirements (Quality Attributes)

Define measurable quality and environmental conditions (performance, security, availability) with clear acceptance tests.

Key Insight

NFRs are business-facing, testable metrics — decompose into SLAs, response times, throughput, error budgets and acceptance thresholds

Often Confused With

Constraints • Functional requirements • Business rules

Common Mistakes

  • Treating NFRs as optional or lower priority — they drive acceptance and operational risk.
  • Writing vague NFRs — decompose into specific, testable metrics (SLA, MTTR, TPS).
  • Confusing NFRs with constraints or implementation choices instead of measurable qualities.

Traceability and Monitoring

15%

Monitoring & Controlling — BA Role

Continuous oversight of performance, requirement baselines, traceability and change impacts; BAs assess change impacts and protect baselines.

Key Insight

Monitoring is continuous and collaborative: BAs must update requirement status, run impact analyses, and protect baselines.

Often Confused With

Execution (process group) • Change Control • Requirements Management

Common Mistakes

  • Assuming only the PM owns monitoring; BAs must manage requirement status and traceability
  • Limiting monitoring to schedule/cost — requirements and traceability belong here
  • Viewing monitoring as post-execution instead of an ongoing activity

Requirement Dependencies — Sequencing & Risk

A link where one requirement's delivery/verification depends on another or an external event; drives sequencing and risk.

Key Insight

Dependencies are execution constraints and functional risks — trace links alone don't equal a dependency.

Often Confused With

Traceability • Assumptions & Constraints

Common Mistakes

  • Treating traceability links as automatic dependencies
  • Assuming dependencies are always internal to requirements
  • Believing dependencies always mandate serial execution

Formal Requirements Inspection (Peer Review)

Structured peer review to verify artifact quality, acceptance, standards and traceability before handoff.

Key Insight

Defined roles, checklists and cross-discipline reviewers uncover spec defects and missing trace links early; inspections do not replace execution‑based testing.

Often Confused With

Informal walkthroughs • Requirements validation/testing • Change control reviews

Common Mistakes

  • Picking only technical reviewers — misses business alignment and independent review
  • Treating inspections as a replacement for execution-based testing
  • Assuming a completed checklist guarantees readiness; still verify requirements and trace links

Business Analysis Deliverables (Requirements & Artifacts)

Artifacts the BA owns—requirements, models, acceptance criteria and traceability matrices that guide delivery, verification, and validation.

Key Insight

Deliverables include textual and visual artifacts (diagrams, wireframes, matrices); traceability and acceptance criteria are governance essentials, not optional extras.

Often Confused With

Project artifacts • Technical specifications • Test deliverables

Common Mistakes

  • Calling any project document a BA deliverable
  • Ignoring diagrams, wireframes and models as true BA deliverables
  • Skipping the traceability matrix assuming it's optional for governance

Requirement Trace Log (Status • Source • Relations)

Record each requirement’s status, source, version, approvals and transitions with controlled, auditable entries.

Key Insight

Every status change must be authorized, time‑stamped (who/when/why) and preserved for baselining and audits.

Often Confused With

Change Control • Configuration Status Accounting (CSA) • Version Control

Common Mistakes

  • Letting anyone flip status without following workflow approvals.
  • Assuming a traceability tool alone guarantees compliance and skipping human review.
  • Recording only current state and omitting who/when/why or history.
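The "controlled, auditable entries" idea above can be sketched as an append-only log. This is a hypothetical illustration, not a prescribed tool: the field names, requirement ID, and approval ID are invented, and a real workflow would enforce approvals through its own governance process.

```python
# Hypothetical sketch: an append-only requirement trace log that records
# who/when/why for every status transition and refuses unapproved changes.
from datetime import datetime, timezone

trace_log = []  # append-only: history is preserved, never overwritten

def record_status(req_id, new_status, who, why, approved_by):
    """Append an auditable transition; reject entries lacking approval."""
    if not approved_by:
        raise ValueError("status change requires workflow approval")
    trace_log.append({
        "req": req_id,
        "status": new_status,
        "who": who,
        "when": datetime.now(timezone.utc).isoformat(),  # timestamped
        "why": why,
        "approved_by": approved_by,
    })

# Example transition with full who/when/why and an approval reference
record_status("REQ-9", "Approved", "ba.lee", "sponsor sign-off", "CCB-042")
print(len(trace_log), trace_log[-1]["status"])   # 1 Approved
```

Because entries are appended rather than updated in place, the log retains the full history needed for baselining and audits.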

Status Reporting: Cadence, Audience, Message

Deliver timely, audience‑tailored requirement updates (progress, risks, impacts) using the right cadence and channel.

Key Insight

Match content and cadence to stakeholder decisions: execs need impacts/asks; delivery teams need tasks and blockers.

Often Confused With

Change Control • Stakeholder Engagement

Common Mistakes

  • Reporting only approved/not‑approved instead of impacts, conflicts and risks.
  • Sending one generic update to all stakeholders instead of tailored messages.
  • Assuming status communication replaces formal change evaluation and approvals.

Conflict Resolution 5‑Toolbox

Five tactics—Avoid, Accommodate, Compromise, Compete, Collaborate—pick by urgency, impact, and authority.

Key Insight

Match tactic to urgency, impact and decision authority: Compete for urgent/safety decisions with sponsor backing; Collaborate for high‑impact buy‑in; Avoid to buy time or cool tensions.

Often Confused With

Negotiation techniques • Stakeholder engagement strategies

Common Mistakes

  • Thinking 'compete' is always wrong — it's right for urgent/safety decisions with authority.
  • Treating 'avoid' as failure — strategic delay can buy facts or cool tensions.
  • Assuming 'compromise' is best — it can produce diluted, ineffective requirements in high‑stakes cases.

Bi‑directional Traceability (Req ↔ Source/Deliverable)

Links each requirement backward to its source and forward to design/code/tests to prove coverage and enable impact scans.

Key Insight

Traceability is a living map used in status reports: show coverage %, identify impacted requirements/tests for a change, and expose gaps to business reviewers.

Often Confused With

Forward traceability • Traceability matrix • Requirements coverage

Common Mistakes

  • Linking only to test cases — omit backward links to business objectives or design at your peril.
  • Treating traceability as one‑time paperwork — update links with every approved change.
  • Assuming links equal correctness — links show coverage, not validation; run verification too.
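The coverage-and-gaps idea above can be sketched with two small link maps. The requirement, objective, and test-case IDs are invented for illustration; a real project would hold these links in a traceability tool or matrix.

```python
# Hypothetical sketch: bi-directional trace links and the coverage metric a
# status report might cite. All IDs are invented for illustration.

# Forward links: requirement -> downstream artifacts (tests here)
forward = {
    "REQ-1": ["TC-101", "TC-102"],
    "REQ-2": ["TC-201"],
    "REQ-3": [],            # gap: no forward coverage yet
}
# Backward links: requirement -> originating business objective
backward = {
    "REQ-1": ["OBJ-A"],
    "REQ-2": ["OBJ-A"],
    "REQ-3": ["OBJ-B"],
}

def coverage_pct(links):
    """Percent of requirements with at least one link."""
    covered = sum(1 for targets in links.values() if targets)
    return 100.0 * covered / len(links)

def gaps(links):
    """Requirements with no links -- candidates for a status-report flag."""
    return [req for req, targets in links.items() if not targets]

print(f"Forward coverage: {coverage_pct(forward):.0f}%")   # Forward coverage: 67%
print(f"Uncovered requirements: {gaps(forward)}")          # Uncovered requirements: ['REQ-3']
```

Note that a 100% backward coverage here would still only show linkage, not correctness — per the last Common Mistake, verification must still be run.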

Impact Analysis — Change Ripple Map

Identify what a proposed change affects (artifacts, stakeholders, schedule, cost) and estimate effort, time, and risks.

Key Insight

Use trace links to enumerate affected items, then produce informed ranges for cost/schedule and residual risk — iterative, not binary.

Often Confused With

Risk analysis • Traceability matrix

Common Mistakes

  • Checking only technical artifacts; skipping stakeholder or schedule impacts.
  • Running analysis only after approval instead of iteratively during decision-making.
  • Reporting exact cost/schedule values instead of ranges with assumptions.
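"Use trace links to enumerate affected items" is essentially a graph walk. A minimal sketch, assuming invented artifact names and a simple dependency direction (each artifact points to the artifacts that depend on it):

```python
# Hypothetical sketch: treating trace links as a directed graph and walking
# outward from a changed artifact to enumerate the change's ripple.
from collections import deque

# artifact -> artifacts that depend on it (all IDs invented for illustration)
links = {
    "REQ-7":    ["DESIGN-3", "TC-701"],
    "DESIGN-3": ["CODE-12"],
    "CODE-12":  ["TC-702"],
    "TC-701":   [],
    "TC-702":   [],
}

def impacted(change_target, links):
    """Breadth-first walk of trace links from the changed artifact."""
    seen, queue = set(), deque([change_target])
    while queue:
        node = queue.popleft()
        for nxt in links.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return sorted(seen)

print(impacted("REQ-7", links))
# -> ['CODE-12', 'DESIGN-3', 'TC-701', 'TC-702']
```

The walk only enumerates technical artifacts; per the Common Mistakes above, stakeholder and schedule impacts still have to be assessed separately.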

Requirements Change: Impact & Risk Check

Compare the proposed requirement change to the approved baseline; list affected artifacts/stakeholders and assess cost, schedule, and risk impacts.

Key Insight

Baseline + traceability show scope; prioritize using probability × consequence (and detectability) to choose responses.

Often Confused With

Impact analysis • Traceability matrix • Risk analysis

Common Mistakes

  • Ignoring business processes, contracts, or regulatory implications — checking only code/design.
  • Skipping formal baseline comparison because stakeholders 'verbally agree'.
  • Using impact severity alone; ignoring likelihood and detectability when prioritizing risk.
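The probability × consequence (× detectability) scoring in the Key Insight is simple arithmetic, sketched below. The 1–5 scales and the sample changes are invented for illustration; real scales and thresholds come from the project's risk plan.

```python
# Hypothetical sketch: scoring proposed changes by probability x consequence
# x detectability, then ranking them for response prioritization.

def risk_score(probability, consequence, detectability=1):
    """Higher score = higher priority. Detectability: 5 = hard to detect."""
    return probability * consequence * detectability

# (name, probability, consequence, detectability) -- all values invented
changes = [
    ("Rewire payment flow",   4, 5, 3),
    ("Rename report column",  2, 1, 1),
    ("Alter audit retention", 3, 4, 5),
]

ranked = sorted(changes, key=lambda c: risk_score(*c[1:]), reverse=True)
for name, p, c, d in ranked:
    print(f"{risk_score(p, c, d):3d}  {name}")
```

Folding in detectability is what distinguishes this from severity-only ranking — a hard-to-detect medium-severity change can outrank an obvious high-severity one.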

Evaluation

10%

Phase Exit Criteria (Gates & Evidence)

Measurable, evidence-based conditions that must be met to close a phase or test cycle and permit acceptance.

Key Insight

Exit criteria must be measurable, tied to specific artifacts/approvals, and allow risk-based exceptions—not a loose checklist.

Often Confused With

Entry Criteria • Acceptance Criteria • Definition of Done

Common Mistakes

  • Treating exit criteria as informal checklists instead of measurable, evidence-based conditions.
  • Assuming exit criteria apply only to testing or implementation, not planning or design phases.
  • Believing meeting exit criteria guarantees project success or that no further review is needed.

Traceability Artifacts (Evidence Links)

Linked evidence—requirements, tests, logs, screenshots, code—proving lineage, verification, and delivery of requirements.

Key Insight

Traceability is many-to-many; non-document evidence (logs, code, screenshots) is valid if linked, timestamped, and verifiable.

Often Confused With

Traceability Matrix • Requirements Document • Test Cases

Common Mistakes

  • Ignoring non-document evidence—logs, screenshots, code are valid trace artifacts when linked.
  • Assuming traceability is one-to-one; requirements often map many-to-many to tests, code, and defects.
  • Counting only baselined docs; verifiable informal items (emails, prototypes) can also serve as evidence.

Gap/Delta → Prioritize & Communicate

Identify where solutions fall short, assess impact/severity, and recommend prioritized, measurable remediation.

Key Insight

Quantify severity + business impact, not just size; link each gap to a risk-based decision (fix/waive/accept).

Often Confused With

Root Cause Analysis • Defect Management • Requirements Traceability

Common Mistakes

  • Assuming gaps are only coding defects—include requirement, process, design and acceptance issues.
  • Measuring gap size only, omitting severity and business impact when prioritizing.
  • Sending a generic report and expecting decisions—tailor messages and request specific actions.

UAT — Business Sign-off Testing

Business users execute real-world scenarios vs acceptance criteria and produce traceable evidence for formal sign-off.

Key Insight

UAT proves business acceptance at delivery via stakeholder-executed, auditable tests — it doesn't prove long‑term value or defect‑free code.

Often Confused With

System Testing • Integration Testing • Operational Acceptance Testing (OAT)

Common Mistakes

  • Treating UAT as an informal demo, skipping documented evidence and traceability.
  • Equating a UAT pass with realized business value over time.
  • Handing UAT off to QA alone; business stakeholders must drive acceptance tests.

Stakeholder Sign‑Off (Go/No‑Go)

Documented stakeholder approval to deploy after verifying tests, acceptance criteria, gaps and risk evidence.

Key Insight

Sign‑off is evidence‑based consensus with verified delegated authority — not a ceremonial or lone senior signature.

Often Confused With

Change control approval • UAT completion • Project sponsor approval

Common Mistakes

  • Granting sign‑off without reviewing test evidence or acceptance criteria
  • Assuming a single senior signature equals full stakeholder consensus
  • Believing sign‑off forbids all post‑deployment fixes (later fixes go through change control)

Cutover Planning & Downtime Comms (48‑hr)

Map cutover sequence, assign owners, confirm acceptable downtime with business, and run a timed communication/escalation plan.

Key Insight

Confirm acceptable windows with business owners, repeat confirmations, rehearse the cutover, and pre‑define rollback/escalation criteria.

Often Confused With

Release scheduling • Incident/Outage communication

Common Mistakes

  • Sending one notification and assuming stakeholders are informed
  • Letting implementation set downtime without business agreement
  • Skipping contingency/rollback criteria because the schedule is fixed

KPIs (Key Performance Indicators) — Aligned Impact Metrics

Specific, measurable metrics tied to strategic goals used to judge deployed-solution value vs baselines and targets.

Key Insight

A KPI = alignment + baseline + target + timeframe + owner — without all five it's just a metric.

Often Confused With

Metrics • OKRs (Objectives and Key Results) • CSFs (Critical Success Factors)

Common Mistakes

  • Calling any measured value a KPI — not every metric maps to strategic impact.
  • Packing dashboards with KPIs — too many dilute focus and decision value.
  • Using a KPI without baseline, target, timeframe or owner — yields meaningless results.
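The "all five parts or it's just a metric" rule above can be expressed as a completeness check. The field names and the sample checkout-time KPI below are invented for illustration; they are not a standard schema.

```python
# Hypothetical sketch: a KPI record that is only a true KPI when alignment,
# baseline, target, timeframe, and owner are all present.
from dataclasses import dataclass
from typing import Optional

@dataclass
class KPI:
    name: str
    alignment: Optional[str] = None   # strategic goal it maps to
    baseline: Optional[float] = None  # value before the solution
    target: Optional[float] = None    # value that counts as success
    timeframe: Optional[str] = None   # when the target must be met
    owner: Optional[str] = None       # who is accountable

def is_kpi(m: KPI) -> bool:
    """True only if all five defining parts are present."""
    required = ("alignment", "baseline", "target", "timeframe", "owner")
    return all(getattr(m, f) is not None for f in required)

metric = KPI("Avg checkout time (s)", baseline=42.0, target=30.0)
kpi = KPI("Avg checkout time (s)", alignment="Reduce cart abandonment",
          baseline=42.0, target=30.0, timeframe="Q3", owner="Ops lead")

print(is_kpi(metric), is_kpi(kpi))   # False True
```

The first record has a baseline and target but no alignment, timeframe, or owner, so by the Key Insight's definition it stays a metric, not a KPI.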

Comparative Results — Outcome vs Baseline

Quantify gaps by comparing observed outcomes to baselines, expectations, or alternative options to assess effectiveness.

Key Insight

Comparisons reveal gaps, not causation — you must validate baselines, controls, and real-world significance.

Often Confused With

A/B Testing • Benchmarking • Root Cause Analysis

Common Mistakes

  • Treating any difference from baseline as proof the solution caused the change.
  • Using an invalid or inappropriate baseline and trusting the comparison blindly.
  • Focusing only on numeric gap size and ignoring statistical or business significance.

© 2026 Mocka.ai - Your Exam Preparation Partner


Certification Overview

Duration: 120 min
Questions: 200
Passing: 70%
Level: Intermediate

Cheat Sheet Content

55 Key Concepts
5 Exam Domains
