This guide provides quick decision trees and checklists to accelerate your vendor selection process using the Periodic Cube of AI framework.

Decision Tree 1: Should We Source This Externally?
START: Identify the AI component you need
|
├─> Check "Build vs Buy" dimension in framework
|
├─> Is it classified as "Build"?
| ├─> YES → Strong signal to build internally
| | Consider: Strategic differentiation, IP protection
| | ⚠️ Still evaluate: Do we have the capability?
| |
| └─> NO → Continue to next check
|
├─> Is it classified as "Buy" or "Integrate"?
| ├─> YES → Strong signal to source externally
| | Continue to Decision Tree 2
| |
| └─> NO (Hybrid) → Requires deeper analysis
| Continue to Decision Tree 2
|
└─> Cross-check with "Criticality"
|
├─> Mission-Critical + Build → Build internally for control
├─> Mission-Critical + Buy → Vet vendors rigorously
├─> Enhancing/Optional → Prefer Buy to focus resources
└─> High Priority → Depends on TRL and Org capability
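Decision Tree 1 reduces to a small amount of branching logic. The sketch below is illustrative only: the classification labels ("Build", "Buy", "Integrate", "Hybrid") and criticality levels come from the framework, but the function name and returned phrases are ours.

```python
# Sketch of Decision Tree 1. Classification labels come from the
# framework; the function name and return strings are illustrative.

def sourcing_signal(build_vs_buy: str, criticality: str) -> str:
    """Map a component's classification to a sourcing recommendation."""
    if build_vs_buy == "Build":
        # Strong signal to build internally -- still verify capability.
        return "Build internally (confirm capability and IP strategy)"
    if build_vs_buy in ("Buy", "Integrate"):
        if criticality == "Mission-Critical":
            return "Source externally; vet vendors rigorously"
        return "Source externally; continue to Decision Tree 2"
    # Hybrid components need deeper analysis before a sourcing call.
    return "Hybrid: requires deeper analysis; continue to Decision Tree 2"
```

The cross-check with "Criticality" only changes the rigor of the external path, which is why it appears as a nested condition rather than a separate branch.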

Decision Tree 2: How Urgent Is This Decision?
START: Confirmed we need to source externally
|
├─> Check "Criticality" dimension
|
├─> Is it "Mission-Critical"?
| ├─> YES → SLOW TRACK (8-12 weeks)
| | • Full RFP process
| | • Comprehensive due diligence
| | • Multiple vendor evaluations
| | • PoC required
| | • Executive approval needed
| |
| └─> NO → Continue to next check
|
├─> Is it "High Priority"?
| ├─> YES → MEDIUM TRACK (4-6 weeks)
| | • Abbreviated RFP or RFI
| | • Focused due diligence
| | • 2-3 vendor comparison
| | • Demo or trial required
| | • Director-level approval
| |
| └─> NO → Continue to next check
|
└─> Is it "Enhancing" or "Optional"?
└─> YES → FAST TRACK (1-2 weeks)
• Direct vendor outreach
• Basic security review
• Single vendor evaluation
• Free trial if available
• Manager-level approval
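The three tracks can be captured in a lookup table so the process is right-sized mechanically. A minimal sketch: the `TRACKS` name and tuple layout are assumptions, while the durations and approval levels are taken from the tree above.

```python
# The three sourcing tracks as a lookup table. Durations are in weeks
# (min, max); the data layout is illustrative.

TRACKS = {
    "Mission-Critical": ("SLOW", (8, 12), "Executive"),
    "High Priority":    ("MEDIUM", (4, 6), "Director"),
    "Enhancing":        ("FAST", (1, 2), "Manager"),
    "Optional":         ("FAST", (1, 2), "Manager"),
}

def select_track(criticality: str):
    """Return (track name, (min weeks, max weeks), approval level)."""
    return TRACKS[criticality]

track, (weeks_min, weeks_max), approval = select_track("High Priority")
```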

Decision Tree 3: Which Vendors Should We Consider?
START: Ready to create vendor longlist
|
├─> Check "TRL" (Technology Readiness Level) dimension
|
├─> Is it "Established" or "Foundational"?
| ├─> YES → Target: Large, established vendors
| | Examples: AWS, Microsoft, Google, IBM, Oracle
| | Expect: High stability, broad features, higher cost
| |
| └─> NO → Continue to next check
|
├─> Is it "Maturing"?
| ├─> YES → Target: Mid-market specialists + Large vendors
| | Examples: Databricks, Snowflake, Scale AI
| | Expect: Innovation + stability, competitive pricing
| |
| └─> NO → Continue to next check
|
└─> Is it "Emerging"?
└─> YES → Target: Startups + Innovation labs of large vendors
Examples: Early-stage companies, research spinouts
Expect: Cutting-edge, higher risk, flexible pricing
⚠️ Requires extra financial due diligence

Checklist: Pre-RFP Preparation (30 minutes)
Use this checklist before creating your RFP:
Step 1: Framework Analysis
- Identify the component in the Periodic Cube of AI
- Note its classification across all 7 dimensions
- Review the component’s position in the functional group
Step 2: Stakeholder Identification
- Check "Org. Ownership" dimension
- Identify primary owner team
- Identify secondary stakeholder teams
- Schedule kickoff meeting with decision team
Step 3: Requirements Definition
- Review "Human Intensity" → Define automation expectations
- Review "Cost Structure" → Set budget parameters
- Review "SFIA Category" → Identify required skills
- Review "Criticality" → Define SLA requirements
Step 4: Vendor Research
- Review "TRL" → Identify appropriate vendor types
- Review "Build vs Buy" → Clarify integration expectations
- Create initial vendor longlist (5-10 vendors)
- Assign research tasks to team members

Checklist: RFP Question Selection (20 minutes)
Use the RFP Question Bank document and select questions based on:
High Priority Questions (Always Include)
- 5-10 questions from "Cost Structure" section
- 5-10 questions from "TRL" section
- 5-10 questions from "Security & Compliance" section
Conditional Questions (Based on Framework)
If "Criticality" = Mission-Critical:
- All questions from "Criticality / Risk Level" section
- Questions 41-49 (SLA, DR, Business Continuity)
If "Human Intensity" = Human-Driven or Human-Collaborative:
- All questions from "Human-in-the-Loop Intensity" section
- Questions 33-40 (Setup, Maintenance, Automation)
If "TRL" = Emerging:
- Questions 8-16 (Product History, Roadmap, Stability)
- Questions 63-66 (Vendor Stability)
If "Build vs Buy" = Integrate or Hybrid:
- Questions 1-7 (Product Architecture, Integration Points)
- Questions 4-6 specifically (APIs, Standards, Lock-in)
If "SFIA Category" = Data or Technology:
- Questions 50-56 (Skills, Training, Documentation)
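The conditional rules above amount to a mapping from (dimension, classification) pairs to question numbers. A hedged sketch: the data-structure and function names are ours, and only the numbered questions are modeled (the section-level "All questions from …" items are omitted for brevity).

```python
# Conditional RFP question selection. Question numbers refer to the
# RFP Question Bank; the dict and function names are illustrative.

CONDITIONAL_QUESTIONS = {
    ("Criticality", "Mission-Critical"): range(41, 50),        # SLA, DR, BC
    ("Human Intensity", "Human-Driven"): range(33, 41),        # Setup, Maint.
    ("Human Intensity", "Human-Collaborative"): range(33, 41),
    ("TRL", "Emerging"): list(range(8, 17)) + list(range(63, 67)),
    ("Build vs Buy", "Integrate"): range(1, 8),
    ("Build vs Buy", "Hybrid"): range(1, 8),
    ("SFIA Category", "Data"): range(50, 57),
    ("SFIA Category", "Technology"): range(50, 57),
}

def select_questions(classification: dict) -> set:
    """Collect question numbers triggered by a component's classification."""
    selected = set()
    for dimension, value in classification.items():
        selected.update(CONDITIONAL_QUESTIONS.get((dimension, value), []))
    return selected

qs = select_questions({"TRL": "Emerging", "Build vs Buy": "Hybrid"})
```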

Scoring Quick Reference
How to Weight Evaluation Criteria
Use this table to set weights based on the component’s framework classification:
| Framework Dimension | If Classification Is… | Then Weight Should Be… |
|---|---|---|
| Criticality | Mission-Critical | Security & Compliance: 25-30% |
| Criticality | High Priority | Security & Compliance: 15-20% |
| Criticality | Enhancing/Optional | Security & Compliance: 10-15% |
| Cost Structure | CapEx | Upfront Cost: 20-25% |
| Cost Structure | OpEx or Usage-Based | Ongoing Cost & Predictability: 20-25% |
| Technology Readiness | Emerging | Vendor Stability: 20-25% |
| Technology Readiness | Maturing | Vendor Stability: 15-20% |
| Technology Readiness | Established | Vendor Stability: 10-15% |
| Human Intensity | Human-Driven | Ease of Use: 20-25% |
| Human Intensity | Human-Collaborative | Ease of Use: 15-20% |
| Human Intensity | Fully Automated | Ease of Use: 10-15% |
| Build vs Buy | Integrate or Hybrid | Integration & Compatibility: 20-25% |
| Build vs Buy | Buy | Integration & Compatibility: 10-15% |

Total weights must equal 100%.
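A quick way to enforce the 100% rule is a small validation helper. A minimal sketch; the function name and the example weights are illustrative, not values prescribed by the framework.

```python
# Check that evaluation-criterion weights sum to exactly 100%.
# The example weights below are illustrative only.

def validate_weights(weights: dict) -> float:
    """Return the total, raising if the weights do not sum to 100%."""
    total = sum(weights.values())
    if abs(total - 100.0) > 1e-9:
        raise ValueError(f"Weights sum to {total}%, expected 100%")
    return total

validate_weights({
    "Security & Compliance": 25,
    "Ongoing Cost & Predictability": 20,
    "Vendor Stability": 20,
    "Ease of Use": 15,
    "Integration & Compatibility": 20,
})  # sums to 100, so no exception is raised
```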

Red Flags by Framework Dimension
Watch for these warning signs during evaluation:
Build vs Buy Dimension
🚩 Vendor claims "no integration needed" but framework says "Integrate"
🚩 Vendor requires extensive customization when framework says "Buy"
TRL Dimension
🚩 Vendor has <10 production customers but component is "Established"
🚩 Vendor has no enterprise customers but the component is "Mission-Critical"
🚩 Major versions released more often than every 6 months (instability)
Org. Ownership Dimension
🚩 Vendor’s target user doesn’t match framework’s ownership team
🚩 Training is only available for roles we don’t have
Cost Structure Dimension
🚩 Pricing model doesn’t match framework classification
🚩 Cannot provide 3-year TCO estimate
🚩 Hidden costs exceed 30% of base price
Human Intensity Dimension
🚩 Requires >10 hours/week maintenance for "Fully Automated" component
🚩 No automation for "Human-Supervised" component
Criticality Dimension
🚩 SLA <99.9% for "Mission-Critical" component
🚩 No disaster recovery plan for "Mission-Critical" component
🚩 RTO >4 hours for "Mission-Critical" component
SFIA Category Dimension
🚩 Required skills not available in our organization
🚩 No training program for critical skills
🚩 Vendor doesn’t understand the SFIA activities for this category

Time Estimates by Component Criticality
Use these estimates for project planning:
| Criticality | Vendor Research | RFP Process | Evaluation | Due Diligence | Total Timeline |
|---|---|---|---|---|---|
| Mission-Critical | 2 weeks | 3 weeks | 3 weeks | 4 weeks | 12 weeks |
| High Priority | 1 week | 2 weeks | 2 weeks | 2 weeks | 7 weeks |
| Enhancing | 3 days | 1 week | 1 week | 1 week | 3.5 weeks |
| Optional | 2 days | N/A (direct) | 3 days | 3 days | 1.5 weeks |
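The Total Timeline column is just the sum of the stage durations. A small sketch, assuming a 5-working-day week; stage values are copied from the table, and the day-denominated rows sum to 3.6 and 1.6 weeks, which the table rounds to 3.5 and 1.5.

```python
# Recompute the Total Timeline column from per-stage durations given in
# working days (5 working days per week).

def total_weeks(stage_days: list) -> float:
    """Sum stage durations in working days and return weeks."""
    return sum(stage_days) / 5

assert total_weeks([10, 15, 15, 20]) == 12.0  # Mission-Critical
assert total_weeks([5, 10, 10, 10]) == 7.0    # High Priority
assert total_weeks([3, 5, 5, 5]) == 3.6       # Enhancing (table: ~3.5)
assert total_weeks([2, 0, 3, 3]) == 1.6       # Optional (table: ~1.5)
```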

Common Pitfalls to Avoid
Pitfall 1: Ignoring the Framework Classification
❌ Mistake: Treating all components the same regardless of their framework classification.
✅ Solution: Always start with the framework. Let it guide your process rigor and priorities.
Pitfall 2: Over-Engineering for Low-Criticality Components
❌ Mistake: Running a 12-week RFP process for an "Enhancing" component.
✅ Solution: Use Decision Tree 2 to right-size your process.
Pitfall 3: Underestimating Integration Effort
❌ Mistake: Assuming "Buy" means zero internal development.
✅ Solution: Always check whether it’s "Hybrid" or "Integrate" and plan accordingly.
Pitfall 4: Wrong Team Leading the Evaluation
❌ Mistake: Security team leading evaluation of a "Data" SFIA category component.
✅ Solution: Use the "Org. Ownership" dimension to assign the right lead.
Pitfall 5: Focusing Only on Features
❌ Mistake: Choosing vendor with most features without considering TCO or operational burden.
✅ Solution: Use all 7 dimensions for a balanced evaluation.

Emergency Decision Protocol
When you need to make a vendor decision in <1 week:
Day 1: Framework Sprint (2 hours)
- Classify the component across all 7 dimensions
- Identify 2-3 must-have vendors based on TRL
- Assign roles based on Org. Ownership
Day 2-3: Rapid Evaluation (8 hours)
- Send abbreviated RFI with 10 critical questions
- Conduct vendor demos (2 hours each)
- Score using simplified scorecard
Day 4: Due Diligence (4 hours)
- Security questionnaire (if Criticality ≥ High Priority)
- Reference check (1 customer minimum)
- Review contract terms
Day 5: Decision (2 hours)
- Compare scores
- Assess risks
- Make recommendation
- Get approval
⚠️ Only use this protocol for "Enhancing" or "Optional" components

Quick Links to Other Documents
- Periodic Cube of AI homepage
- Full Process Guide: Vendor_Selection_Guide.md
- Detailed Scorecard: Vendor_Evaluation_Scorecard.md
- RFP Questions: RFP_Question_Bank.md
- Due Diligence Checklist: Due_Diligence_Checklist.md
- Framework Data: ai_classification_matrix
- Framework Documentation: Periodic Cube of AI Documentation
Document Version: 1.0
Last Updated: November 2025