Specifications & Vision Documents
Lecture 4
Creating the source of truth for AI-driven projects
2026 WayUp
AI without context is AI without direction
The vision document is the persistent memory AI needs but doesn't have.
What we're building and why it exists
Who uses this and their key needs
Must-achieve objectives (prioritized)
What we explicitly won't build
Languages, frameworks, services
Key choices with reasoning (ADRs)
Performance, security, compliance
How we measure achievement
# TaskFlow - Project Vision

## Overview
TaskFlow is a minimalist task management API for small teams. Focus: simplicity over features.

## Target Audience
- Small development teams (3-10 people)
- Technical users comfortable with APIs
- Teams using CLI/integrations over GUIs

## Primary Goals
1. Fast task creation (< 1 second API response)
2. Simple data model (tasks, tags, assignees)
3. Robust API documentation
4. 99.9% uptime

## Non-Goals
- NO mobile app (API-first only)
- NO complex workflows/automations
- NO real-time collaboration
- NO file attachments

## Tech Stack
- Python 3.12 + FastAPI
- PostgreSQL 16
- Redis for caching
- Docker for deployment

## Architecture Decisions
- ADR-001: REST over GraphQL (simplicity)
- ADR-002: JWT over sessions (stateless)
- ADR-003: Monolith over microservices (team size)
Saying NO is as important as saying YES
AI won't suggest features that are out of scope
When AI faces choices, non-goals help decide
No debates about "nice to have" features
Everyone (human + AI) agrees on boundaries
- NO multi-tenancy (single team per instance)
- NO offline support
- NO IE11 compatibility
- NO real-time sync
- NO third-party integrations (v1)
Document the WHY, not just the WHAT
# ADR-001: Use REST API over GraphQL

## Status
Accepted

## Context
We need to choose an API style for TaskFlow. Options: REST, GraphQL, gRPC

## Decision
We will use REST with OpenAPI specification.

## Reasoning
1. Team has REST experience, no GraphQL expertise
2. Simple CRUD operations don't benefit from GraphQL flexibility
3. Better tooling ecosystem (Postman, curl, etc.)
4. Simpler caching with HTTP standards

## Consequences
- Positive: Faster development, familiar patterns
- Negative: Multiple requests for related data
- Mitigated by: Thoughtful endpoint design with includes

## Alternatives Considered
- GraphQL: Rejected (overkill for simple data model)
- gRPC: Rejected (not web-friendly)
Root-level, always referenced first by AI
Numbered decisions with reasoning
Detailed feature specifications
Detailed requirements for complex features
# Feature: User Authentication

## Overview
Secure user authentication with email/password and JWT tokens.

## User Stories
- As a visitor, I want to register so I can access the app
- As a user, I want to login so I can view my tasks
- As a user, I want to logout to secure my session

## Requirements

### Registration
- Email must be valid format (RFC 5322)
- Password: min 8 chars, 1 uppercase, 1 number
- Duplicate emails return 409 Conflict
- Send verification email on success

### Login
- Return JWT access token (1 hour expiry)
- Return refresh token (7 days expiry)
- Rate limit: 5 attempts per minute

### Security
- Passwords hashed with bcrypt (cost 12)
- Tokens signed with RS256
- HTTPS required in production

## Out of Scope (v1)
- OAuth/social login
- 2FA
- Password reset
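A spec this concrete maps almost directly to code. As a minimal sketch (the function name and regex are illustrative, not part of the spec), the registration password rule could be enforced like this:

```python
import re

# Illustrative check for the Registration rule above:
# min 8 chars, at least 1 uppercase letter and 1 digit.
PASSWORD_RE = re.compile(r"^(?=.*[A-Z])(?=.*\d).{8,}$")

def is_valid_password(password: str) -> bool:
    """Return True if the password meets the spec's minimum rules."""
    return PASSWORD_RE.match(password) is not None
```

Because the spec states the rule precisely, the check is directly testable, which is exactly what "acceptance criteria are testable" means in practice.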
"Read vision.md and docs/specs/user-auth.md first."
"Does this approach align with ADR-002?"
"Is this within the non-goals defined in vision.md?"
"After implementing, update the spec with any changes."
How specs support each phase
| AIDD Phase | Documentation Used |
|---|---|
| Discover | vision.md (goals, audience), existing specs (current state) |
| Plan | Feature specs (requirements), ADRs (constraints) |
| Review | vision.md (alignment check), ADRs (decision validation) |
| Execute | Feature specs (acceptance criteria for tests) |
| Commit | Update specs if implementation differs from plan |
| Test | Feature specs (test scenarios), vision.md (success metrics) |
Specifications evolve with the project
1
Write Spec
Define requirements before code
2
AI Review
Ask AI to find gaps in spec
3
Implement
Code follows spec exactly
4
Validate
Tests match spec requirements
5
Update
Sync spec with any changes
6
Repeat
Next feature, same process
"The app should be fast"
Better: "API response time < 100ms at p95"
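A measurable target like "< 100ms at p95" can be checked mechanically in a test suite. A minimal sketch, using the nearest-rank percentile method (names are illustrative):

```python
import math

def p95(samples_ms: list[float]) -> float:
    """95th percentile of latency samples (nearest-rank method)."""
    ordered = sorted(samples_ms)
    rank = math.ceil(0.95 * len(ordered))  # 1-based nearest rank
    return ordered[rank - 1]
```

A test can then assert `p95(measured_latencies) < 100` against real measurements, turning the vague "fast" into a pass/fail criterion.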
Implementation details in specs
Better: Focus on WHAT, not HOW
Docs written once, never maintained
Better: Update with each PR
Docs in random folders
Better: Consistent, discoverable structure
Review this feature specification for completeness:

1. Are all user stories covered by requirements?
2. Are edge cases and error states defined?
3. Are security implications addressed?
4. Is the scope clear (what's NOT included)?
5. Are acceptance criteria testable?
6. Does it align with vision.md?

Flag any gaps or ambiguities.
## Overview
[What is this feature?]

## User Stories
- As a [user], I want [action]...

## Requirements

### [Category]
- [Requirement 1]
- [Requirement 2]

## Security Considerations
- [Security item]

## Out of Scope
- [Explicit non-requirement]

## Acceptance Criteria
- [ ] [Testable criterion]
## Status
[Proposed/Accepted/Deprecated]

## Context
[Why is this decision needed?]

## Decision
[What did we decide?]

## Reasoning
[Why this choice?]

## Consequences
- Positive: [benefit]
- Negative: [drawback]

## Alternatives Considered
- [Option]: [Why rejected]
vision.md
Every project needs a north star document
Non-Goals
Define what you WON'T build
ADRs
Document decisions with reasoning
Living
Update docs with every change
Good specifications are the difference between AI assistance and AI chaos.
Specifications & Vision Documents
Next: Test-Driven Development with AI
2026 WayUp - way-up.io