@vedovelli
Last active June 19, 2025 12:19

Lessons Learned: Test Suite Recovery and Quality Maintenance

This document captures critical lessons learned from resolving 31 failing tests after a major controller refactoring, ensuring future development maintains code quality standards.

Executive Summary

After implementing Task 10 (lesson management CRUD), the test suite had 31 failures due to:

  • Laravel 12 service registration issues
  • Test infrastructure gaps
  • DTO validation expectation mismatches
  • Controller signature changes from refactoring

Key Insight: Major refactoring work should include immediate test validation to catch breaking changes early.

Critical Discovery (June 19, 2025): Task 10.2 was marked "done" but lesson creation UI was never actually implemented, requiring complete implementation from scratch in Task 21.3. This represents a fundamental failure in task completion verification.

Critical Mistakes and Solutions

1. Service Registration in Laravel 12

Mistake: Attempted to use the withAliases() method, which doesn't exist in Laravel 12.

Solution:

  • Register services as singletons in bootstrap/app.php
  • Add facade aliases in AppServiceProvider using AliasLoader
  • Prefer dependency injection over facades in Action classes

// bootstrap/app.php
->withSingletons([
    'log.service' => \App\Services\LogService::class,
])

// AppServiceProvider.php
public function boot(): void
{
    $loader = \Illuminate\Foundation\AliasLoader::getInstance();
    $loader->alias('LogService', \App\Facades\LogService::class);
}

Lesson: Always verify Laravel version-specific patterns before implementation.

2. Test Infrastructure Assumptions

Mistake: Assumed unit tests would automatically bootstrap the Laravel application.

Solution: Created proper test infrastructure:

  • CreatesApplication trait for consistent app bootstrapping
  • Updated TestCase to use the trait
  • Ensured all tests have access to Laravel services

Lesson: Unit tests require explicit Laravel application bootstrapping.
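
A minimal sketch of such a trait, assuming the standard layout Laravel ships with (the trait name CreatesApplication matches the one mentioned above; paths assume a default Laravel project):

```php
<?php

namespace Tests;

use Illuminate\Contracts\Console\Kernel;
use Illuminate\Foundation\Application;

// Sketch of the stock Laravel CreatesApplication trait. Booting the
// console kernel registers providers and facades, which is what gives
// unit tests access to container services.
trait CreatesApplication
{
    public function createApplication(): Application
    {
        $app = require __DIR__ . '/../bootstrap/app.php';

        $app->make(Kernel::class)->bootstrap();

        return $app;
    }
}
```

The base TestCase then simply `use`s this trait so every test class bootstraps the application the same way.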

3. DTO Exception Type Misunderstanding

Mistake: Expected Laravel's ValidationException instead of Spatie Laravel Data exceptions.

Solution: Updated test expectations to match actual library behavior:

  • CannotCreateData for missing required parameters
  • CannotCastEnum for invalid enum values
  • Focused tests on successful DTO creation rather than internal validation

Lesson: Test against actual library behavior, not assumptions about what exceptions should be thrown.
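
A hedged sketch of what such expectations can look like in Pest; the CreateLessonData DTO is borrowed from the examples later in this document, and the exception classes are the ones spatie/laravel-data actually throws:

```php
<?php

use App\Data\CreateLessonData; // hypothetical DTO from this codebase
use Spatie\LaravelData\Exceptions\CannotCreateData;

// Missing required constructor parameters raise the library's own
// exception, not Laravel's ValidationException.
it('throws CannotCreateData when required fields are missing', function () {
    CreateLessonData::from([]); // no module_id, no title
})->throws(CannotCreateData::class);

// Prefer asserting the happy path over internal validation mechanics.
it('creates the DTO from valid input', function () {
    $data = CreateLessonData::from([
        'module_id' => 1,
        'title' => 'Intro',
    ]);

    expect($data->title)->toBe('Intro');
});
```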

4. Controller Signature Changes Not Reflected in Tests

Mistake: Tests still used old UpdateLessonData constructor with id: parameter after refactoring to remove it.

Solution:

  • Updated all test constructor calls to match new DTO signature
  • Changed execute($data) calls to execute($lesson, $data) pattern
  • Ensured tests match actual implementation

Lesson: When refactoring DTOs or Action signatures, immediately update corresponding tests.
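
The before/after call pattern can be sketched like this (constructor shapes are assumptions for illustration; the names come from the refactoring described above):

```php
<?php

// Before refactoring: the DTO carried the id and the Action received only the DTO.
// $data = new UpdateLessonData(id: $lesson->id, title: 'New title');
// $action->execute($data);

// After refactoring: the model is passed explicitly and the DTO holds
// only the updatable fields.
$data = new UpdateLessonData(title: 'New title');
$action->execute($lesson, $data);
```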

5. Incomplete Validation Rules

Mistake: ReorderLessonsData lacked proper validation rules for array elements.

Solution: Added comprehensive validation:

public static function rules(): array
{
    return [
        'module_id' => ['required', 'integer', 'exists:modules,id'],
        'lesson_ids' => ['present', 'array'],
        'lesson_ids.*' => ['integer'],
    ];
}

Lesson: DTOs should have complete validation rules, including array element validation.

6. Validation at Request Level Instead of DTO Level

Mistake: Relying on controller request validation instead of leveraging Spatie Laravel Data's built-in validation capabilities.

Problem Identified: The architecture uses DTOs for data transfer but validation was happening at the controller/request level, missing the opportunity for centralized, reusable validation logic.

Solution:

  • Move validation logic into DTO rules() methods
  • Use DTO validation attributes (#[Required], #[Exists], etc.)
  • Let DTOs handle their own validation concerns
  • Controllers become thinner and focused on orchestration

// Good: DTO handles its own validation
class CreateLessonData extends Data
{
    public function __construct(
        #[Required, Exists('modules', 'id')]
        public int $module_id,
        
        #[Required, Rule(['string', 'max:255'])]
        public string $title,
        
        #[Rule(['nullable', 'string', 'max:1000'])]
        public ?string $description = null,
    ) {}
    
    public static function rules(): array
    {
        return [
            'module_id' => ['required', 'integer', 'exists:modules,id'],
            'title' => ['required', 'string', 'max:255'],
            'description' => ['nullable', 'string', 'max:1000'],
        ];
    }
}

Lesson: DTOs should be self-validating. Move validation logic from controllers/requests into DTOs for better encapsulation and reusability.

7. Missing Static Analysis During Development

Critical Oversight: PHPStan was not run during the development process, missing type safety issues that could have been caught early.

Problems This Caused:

  • Type mismatches in method signatures
  • Incorrect property types in DTOs
  • Missing return type declarations
  • Unused imports and variables
  • Potential runtime errors from type issues

Solution: Integrate PHPStan into regular development workflow:

# Should be run after every significant change
composer phpstan

# Level 8 analysis for strict type checking
./vendor/bin/phpstan analyse --level=8

What We Missed:

  • DTO constructor parameter types could have been validated
  • Action method signatures could have been verified
  • Service injection types could have been checked
  • Return type consistency could have been enforced

Lesson: Static analysis is not optional. PHPStan should be run as frequently as tests to catch type-related issues before they become runtime problems.

Best Practices Established

1. Comprehensive Quality Workflow

Always run both tests and static analysis after changes:

# Essential quality checks
composer test      # Run full test suite
composer phpstan   # Run static analysis

Never proceed with new features while tests are failing or PHPStan reports errors.

2. Service Registration Pattern

Prefer dependency injection over facades:

// Good: Constructor injection
public function __construct(private LogService $logger) {}

// Avoid: Facade usage in Actions
\App\Facades\LogService::info('message');

3. DTO Development Checklist

  • Constructor parameters match use cases
  • Validation rules cover all scenarios (in DTO, not controller)
  • Array validation includes element rules
  • Validation attributes used appropriately (#[Required], #[Exists], etc.)
  • DTOs are self-validating with rules() method
  • Tests cover both success and failure cases
  • Exception types match library behavior
  • PHPStan passes with no type errors

4. Laravel 12 Specific Patterns

  • Use withSingletons() in bootstrap/app.php for service registration
  • Register facade aliases in AppServiceProvider with AliasLoader
  • Test infrastructure requires explicit app bootstrapping

5. Action Class Standards

  • Always use dependency injection for services
  • Follow execute() → handle() pattern with logging
  • Update tests when changing method signatures
  • Log all business operations with context
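
A sketch of the execute() → handle() pattern with injected logging; the LogService method names are assumptions carried over from the registration examples earlier in this document:

```php
<?php

namespace App\Actions;

use App\Data\UpdateLessonData;
use App\Models\Lesson;
use App\Services\LogService;

// Sketch of an Action following the execute() -> handle() pattern:
// execute() logs the business operation with context, handle() does
// the actual work.
class UpdateLessonAction
{
    // Constructor injection rather than facade access.
    public function __construct(private LogService $logger) {}

    public function execute(Lesson $lesson, UpdateLessonData $data): Lesson
    {
        $this->logger->info('Updating lesson', ['lesson_id' => $lesson->id]);

        return $this->handle($lesson, $data);
    }

    protected function handle(Lesson $lesson, UpdateLessonData $data): Lesson
    {
        $lesson->update($data->toArray());

        return $lesson;
    }
}
```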

Quality Gates for Future Development

Before Major Refactoring

  1. Document current test status
  2. Identify tests that will be affected
  3. Plan test updates alongside code changes

After Any Refactoring

  1. Run full test suite immediately (composer test)
  2. Run static analysis (composer phpstan)
  3. Fix failing tests and type errors before new development
  4. Verify no architectural regressions

DTO Changes

  1. Update validation rules completely (in DTO, not controller)
  2. Use appropriate validation attributes in constructor
  3. Update test expectations immediately
  4. Verify exception types match library behavior
  5. Run PHPStan to verify type safety

Service/Facade Changes

  1. Verify Laravel version compatibility
  2. Update registration patterns appropriately
  3. Test service resolution in various contexts

Git Workflow Best Practices

Committing Changes

Always use --all flag when committing:

# Preferred: Stage all changes including untracked files
git add --all && git commit -m "commit message"

# Alternative: Commit all tracked changes directly
git commit --all -m "commit message"

# Never use selective staging unless specifically required
# git add specific-file.txt  # Avoid this pattern

Why --all is important:

  • Ensures all related changes are included in the commit
  • Prevents leaving task management updates (.taskmaster/tasks/tasks.json) uncommitted
  • Includes build artifacts and generated files (public/build/manifest.json)
  • Maintains complete history of all project state changes
  • Avoids partial commits that break the development workflow

Commit Message Format: Follow the established pattern with detailed technical information and Claude Code attribution.

Tools and Commands

Essential Testing Commands

# Run full test suite
composer test

# Run specific test file
./vendor/bin/pest tests/Feature/SpecificTest.php

# Run tests with coverage
composer test -- --coverage

# Static analysis
composer phpstan

Debugging Failed Tests

# Run a single test by name (Pest filters by test name, not path::name)
./vendor/bin/pest tests/Feature/TestFile.php --filter=test_name

# Debug with dd() or dump() in tests
# Use Mockery::mock() for service mocking
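
For service mocking, a minimal Mockery sketch (the service and the code path exercised are assumptions for illustration):

```php
<?php

use App\Services\LogService;

// Bind a Mockery mock into the container so constructor-injected
// dependencies resolve to the mock during the test.
it('logs when a lesson is updated', function () {
    $logger = Mockery::mock(LogService::class);
    $logger->shouldReceive('info')->once();

    $this->app->instance(LogService::class, $logger);

    // ...exercise the code path that triggers logging...
});
```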

Prevention Strategies

1. Continuous Integration Mindset

  • Never commit failing tests or PHPStan errors
  • Run both tests and static analysis after every significant change
  • Fix broken tests and type errors immediately, don't accumulate technical debt
  • Treat PHPStan level 8 as mandatory, not optional

2. Documentation-Driven Development

  • Update CLAUDE.md when establishing new patterns
  • Document service registration changes
  • Keep architectural decisions visible

3. Incremental Validation

  • Test small changes frequently
  • Don't batch multiple refactoring changes
  • Validate assumptions about library behavior

Key Takeaways

  1. Laravel 12 has different service registration patterns - verify before implementing
  2. Test infrastructure is not automatic - explicit bootstrapping required
  3. Library exceptions differ from expectations - test against actual behavior
  4. Refactoring breaks tests predictably - plan test updates alongside code changes
  5. Quality gates prevent technical debt - never skip test validation or static analysis
  6. DTOs should handle their own validation - move validation from controllers to DTOs
  7. PHPStan is not optional - type safety is as important as functional correctness
  8. Validation belongs in DTOs, not requests - leverage Spatie Laravel Data capabilities
  9. Task documentation is mandatory - always update tasks with implementation summaries before marking as done
  10. CRITICAL: Task completion verification is mandatory - never mark tasks "done" without end-to-end functional verification
  11. Incomplete implementations create cascading failures - missing UI components cause confusing "SQL errors" downstream

CRITICAL: Task Completion Verification Failure

The Problem: Incomplete Implementation Marked as "Done"

Date Discovered: June 19, 2025
Context: While working on Task 21.3 (Fix Lesson Creation SQL Errors)

What Happened:

  • Task 10.2 ("Implement admin UI components for lesson management") was marked as "done"
  • The task description explicitly stated: "Create forms for adding/editing lessons with dynamic fields based on content type selection"
  • However, NO lesson creation form, controller, or route was actually implemented
  • This was only discovered when fixing SQL errors caused by missing routes
  • Required complete implementation from scratch: CreateLessonPageController, routes, LessonForm component, and Create page

Impact:

  • Task 21.3 became significantly more complex than expected
  • "SQL errors" were actually missing fundamental UI infrastructure
  • Wasted development time debugging what appeared to be SQL issues
  • False confidence in system completeness

Root Cause Analysis:

  1. No End-to-End Verification: Task was marked done without verifying the UI actually worked
  2. Missing Integration Testing: Tests may have passed but actual user flow was broken
  3. Incomplete Task Validation: No verification that all stated deliverables were actually delivered
  4. Premature Task Closure: Moved to next tasks without confirming current task functionality

Mandatory Prevention Measures

Before Marking ANY Task as "Done":

  1. Functional Verification: Manually test the complete user workflow end-to-end
  2. Route Verification: Ensure all expected routes are registered and accessible
  3. UI Completeness Check: Verify all mentioned UI components actually exist and function
  4. Integration Testing: Test the full stack from frontend to backend
  5. Documentation Verification: Confirm all stated deliverables in task description are actually delivered

Task Completion Checklist:

  • All backend endpoints work and return expected responses
  • All frontend components exist and render correctly
  • All routes are properly registered and accessible
  • User workflows function end-to-end without errors
  • All tests pass (unit, feature, integration)
  • Static analysis passes (PHPStan)
  • Frontend builds successfully (TypeScript compilation)
  • Manual testing of primary user paths completed
  • Implementation summary documents what was actually built (not what was planned)

Never Mark Complete Unless:

  • You can personally use the feature from the UI
  • All error scenarios are handled gracefully
  • The feature integrates properly with existing functionality
  • No placeholder or TODO code remains

Task Management and Documentation

Implementation Summary Requirement

New Guideline: Before marking any task or subtask as "done", you MUST update it with a comprehensive implementation summary.

Why This Matters:

  • Provides complete traceability of what was actually implemented
  • Documents decisions made during implementation that may differ from original plan
  • Creates valuable reference for future debugging or enhancement work
  • Ensures knowledge transfer for team members
  • Maintains project history and lessons learned

Required Implementation Summary Elements:

  1. Problem Statement: Clear description of what issue was being solved
  2. Solution Approach: High-level overview of the implementation strategy
  3. Files Modified: List of all files changed with brief description of changes
  4. Files Created: List of new files created with their purpose
  5. Technical Details: Key implementation decisions, patterns used, dependencies
  6. Quality Verification: Test results, static analysis status, any issues resolved
  7. Commit Reference: Git commit hash for traceability

Example Implementation Summary Format:

## Implementation Summary

### Problem
Brief description of the issue being solved.

### Solution Implemented
1. **Component/File 1**: Description of changes made
2. **Component/File 2**: Description of changes made

### Files Modified
- path/to/file1.php (description of changes)
- path/to/file2.tsx (description of changes)

### Files Created
- path/to/newfile.tsx (purpose and functionality)

### Quality Assurance
- Tests: X passing, Y failing (if any failures, explain)
- Static Analysis: Clean/Issues resolved
- Following project patterns: Yes/No with details

### Commit
Reference to git commit with detailed message.

Process:

  1. Complete the implementation work
  2. Run quality checks (composer test and composer phpstan)
  3. Update the task/subtask with implementation summary using update_task or update_subtask
  4. Only then mark the task as "done"
  5. Create commit with detailed message using git add --all or git commit --all

This guideline applies to:

  • All main tasks when fully completed
  • All subtasks when individually completed
  • Any significant implementation work tracked in Task Master

Future Reference

When working on this codebase:

  1. Check this document before major changes
  2. Follow established patterns for service registration
  3. Maintain test quality as a non-negotiable standard
  4. Use dependency injection over facades in Action classes
  5. Remember Laravel 12-specific requirements
  6. ALWAYS run composer phpstan alongside composer test
  7. Validate in DTOs, not controllers/requests
  8. Type safety is mandatory, not optional
  9. Document implementation summaries before marking tasks done
  10. Always use git add --all or git commit --all when committing to include all changes
  11. CRITICAL: Verify end-to-end functionality before marking tasks complete
  12. Test user workflows manually, not just unit/feature tests

CRITICAL: Autonomous Development and Process Adherence

The Problem: Process Deviation Despite Clear Documentation

Date: June 19, 2025
Context: Task 17 implementation proceeded without TDD despite explicit requirements

What Happened: Even with comprehensive documentation (LESSONS_LEARNED.md, CLAUDE.md) explicitly stating "TDD Required" and multiple quality gate requirements, development proceeded with an implementation-first approach, completely ignoring established processes. The entire backend infrastructure (DTOs, Actions, Controllers, Routes, Policies) was implemented without writing a single test.

Impact:

  • Zero confidence in actual functionality (tests weren't written to verify it works)
  • Violation of fundamental project development principles
  • Incompatible with autonomous development goals
  • Sets dangerous precedent for future development

Root Cause Analysis:

  1. Documentation Awareness vs. Adherence: Reading documentation but treating requirements as suggestions rather than immutable laws
  2. Implementation Momentum: Getting caught up in "technical flow" while ignoring process requirements
  3. Quality Gate Bypassing: Assuming existing passing tests validate new functionality without writing specific tests
  4. Process Prioritization Failure: Prioritizing feature completion speed over established workflow adherence

Autonomous Development Requirements

For independent Claude Code operation, these are NON-NEGOTIABLE:

1. Documentation is Law

  • LESSONS_LEARNED.md and CLAUDE.md requirements are immutable laws, not suggestions
  • Every requirement must be followed exactly, no exceptions
  • "TDD Required" means tests MUST be written before any implementation
  • Quality gates are blocking - never proceed if tests fail or PHPStan reports errors

2. TDD is Mandatory

  • Write tests BEFORE any implementation, no exceptions
  • No "I'll add tests later" mentality
  • Test each component (DTO, Action, Controller) as it's being developed
  • Verify functionality through tests, not assumptions

3. Process Over Progress

  • Following established workflow is more important than feature completion speed
  • Never skip established patterns for convenience
  • Each step must be completed properly before moving to the next
  • Quality and adherence take precedence over development velocity

4. Self-Regulation Requirements

  • Actively reference project documentation before starting any work
  • Question any impulse to skip established processes
  • Treat process violations as critical failures requiring immediate correction
  • Document and learn from any process deviations

Prevention Measures

Before Starting Any Task:

  1. Re-read relevant sections of LESSONS_LEARNED.md and CLAUDE.md
  2. Confirm understanding of TDD requirements
  3. Plan test-first approach before any implementation
  4. Identify quality gates that must pass before proceeding

During Development:

  1. Write tests before implementation for every component
  2. Run quality checks (composer test and composer phpstan) after every change
  3. Verify end-to-end functionality before marking anything complete
  4. Follow established patterns exactly without deviation

Task Completion:

  1. All tests passing (including new tests for new functionality)
  2. PHPStan clean with no errors
  3. End-to-end functional verification completed
  4. Implementation summary documented
  5. Only then mark task as complete

Key Insight

Autonomous development requires religious adherence to documented processes. Any deviation from established workflows indicates the system is not ready for independent operation.

Bottom Line: Both test suite health AND type safety are critical quality indicators. Task documentation ensures knowledge transfer and project continuity. Most critically: tasks must be functionally verified end-to-end before marking complete. Never compromise on quality, documentation, or functional verification. For autonomous operation: process adherence is not optional - it's the foundation of reliable, independent development.
