
Release v0.1.0 (#1562)

* updated version number

* chore(config): apply ruff formatting to release/v0.1.0

- Run ruff check --fix to remove unused imports and fix code issues
- Run ruff format to apply PEP 8 formatting (4 spaces for Python)
- Fix TemplateRenderError import and remove unused defaults variable

* fix(core): required sections ignore toggle and are always enabled

- Modified VariableSection.is_enabled() to return True for required sections
- Hide toggle variables from display in required sections
- Add warnings when attempting to disable required section toggles via config or CLI
- Updated pihole template with better defaults and required network section
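The toggle-bypass for required sections can be sketched roughly as follows (illustrative names only, not the project's actual `VariableSection` API):

```python
# Hypothetical sketch of the described behavior: is_enabled() short-circuits
# for required sections, so a toggle variable can never disable them.
class VariableSection:
    def __init__(self, name, required=False, toggle=None, variables=None):
        self.name = name
        self.required = required
        self.toggle = toggle            # name of the bool variable gating the section
        self.variables = variables or {}

    def is_enabled(self) -> bool:
        if self.required:
            return True                 # required sections ignore their toggle
        if self.toggle is not None:
            return bool(self.variables.get(self.toggle, False))
        return True                     # no toggle -> always enabled


network = VariableSection(
    "network", required=True, toggle="network_enabled",
    variables={"network_enabled": False},
)
print(network.is_enabled())  # True, even though the toggle is off
```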

Fixes #1411

* feat(core): comprehensive improvements to variable dependencies and prompts

Major enhancements:
- Sort variables by dependencies within sections for logical display/prompt order
- Skip prompting for variables with unsatisfied needs
- Hide toggle variables in required sections from display
- Use standard prompt logic for toggle variables (supports extra text)
- Add warnings when setting values for variables with unsatisfied needs via config/CLI
- Fix section merge to preserve needs from module spec when template doesn't override
- Support semicolon-separated multiple AND conditions in needs syntax
  Example: needs: 'traefik_enabled=true;network_mode=bridge,macvlan'
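A minimal sketch of how such a `needs` expression could be evaluated — `;` separates AND conditions, `,` separates acceptable values for one variable (this is an illustration of the syntax, not the project's actual parser):

```python
# Illustrative evaluator for the `needs` syntax described above.
def needs_satisfied(needs: str, values: dict) -> bool:
    for condition in needs.split(";"):                 # ';' = AND
        var, _, allowed = condition.partition("=")
        allowed_values = [v.strip() for v in allowed.split(",")]  # ',' = OR
        if str(values.get(var.strip())) not in allowed_values:
            return False
    return True


expr = "traefik_enabled=true;network_mode=bridge,macvlan"
print(needs_satisfied(expr, {"traefik_enabled": "true", "network_mode": "bridge"}))  # True
print(needs_satisfied(expr, {"traefik_enabled": "true", "network_mode": "host"}))    # False
```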

Fixes and improvements for issues with:
- Required sections incorrectly showing as disabled
- Variables displayed/prompted in illogical order
- Macvlan variables prompted when network_mode=bridge
- Toggle descriptions not showing extra text
- Section needs being cleared during template merge

Related to #1411

* fix(pihole): handle host network mode correctly

Remove dependency on network_enabled toggle since Network section is now required.
Template now directly checks network_mode value for host/bridge/macvlan logic.

* feat(core): remove Jinja2 default() filter extraction (#1410) (#1416)

- Removed _extract_jinja_default_values() method from Template class
- Removed merge logic for Jinja2 defaults in variables property
- Fixed validate command to handle 3-tuple from LibraryManager.find()
- All defaults must now be explicitly defined in template/module specs
- Updated CHANGELOG.md with removal notice

* feat(core): add --var-file support for loading variables from YAML (#1331)

- Added _load_var_file() method to parse YAML variable files
- Added _apply_var_file() method to apply var file variables with proper precedence
- Added --var-file/-f parameter to generate command
- Supports both flat (var: value) and nested (section: {var: value}) YAML structures
- Proper precedence chain: module < template < config < var-file < CLI --var
- Comprehensive error handling for file not found, invalid YAML, and type errors
- Updated documentation and examples
- All existing templates validate successfully
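The precedence chain and the two accepted var-file shapes can be sketched like this (function names are illustrative, not the actual `_load_var_file()`/`_apply_var_file()` implementations):

```python
# Sketch of the documented precedence chain: later sources override earlier
# ones. A nested var-file shape {section: {var: value}} is flattened to match
# the flat {var: value} shape before merging.
def flatten_var_file(data: dict) -> dict:
    flat = {}
    for key, value in data.items():
        if isinstance(value, dict):   # nested: section -> {var: value}
            flat.update(value)
        else:                         # flat: var -> value
            flat[key] = value
    return flat


def resolve(module, template, config, var_file, cli_vars):
    merged = {}
    # module < template < config < var-file < CLI --var
    for source in (module, template, config, flatten_var_file(var_file), cli_vars):
        merged.update(source)
    return merged


print(resolve({"port": 80}, {"port": 8080}, {}, {"web": {"port": 9090}}, {}))
# {'port': 9090}
```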

* update changelog

* docs(quality): add comprehensive code quality analysis

* refactor(collection): add iter_active_sections() helper method

- Adds centralized iterator for sections with proper filtering
- Eliminates duplicate iteration logic in prompt.py
- Supports include_disabled and include_unsatisfied flags
- Reduces code duplication by ~30 lines

Related to #1364 (High Priority #1)

* refactor(module): deduplicate template loading logic

- Adds _load_all_templates() helper method with optional filtering
- Updates list(), search(), and validate() to use centralized helper
- Eliminates ~90 lines of duplicate code
- Improves maintainability with single source of truth for template loading

Related to #1364 (High Priority #3)

* docs: add comprehensive naming and API improvement analysis

- Identifies 6 major improvement opportunities
- Prioritizes by impact (High/Medium/Low)
- Proposes CRUD standardization across ConfigManager
- Recommends consolidating duplicate methods
- Includes breaking change mitigation strategies
- Estimates +1.0 code quality score improvement

Related to #1364

* docs: remove analysis documents (not needed for PR)

* refactor(display): split DisplayManager into specialized managers

- Refactored monolithic DisplayManager (971 lines) into 4 specialized managers:
  * VariableDisplayManager - variable and section rendering
  * TemplateDisplayManager - template display and file trees
  * StatusDisplayManager - status messages and errors
  * TableDisplayManager - all table types

- Maintained 100% backward compatibility via delegation methods
- All existing code works without modifications
- Follows Single Responsibility Principle
- Updated AGENTS.md with new architecture documentation

Previous improvements included in this commit:
- Renamed display methods for consistency (display_template, display_section)
- Reduced variable map lookups in reset_disabled_bool_variables()
- Improved exception hierarchy (VariableValidationError, VariableError)
- Extracted error context building to TemplateErrorHandler class
- Fixed VariableSection forward reference in variable.py

All ruff checks pass. Tested with compose list and compose show commands.

Relates to #1364

* refactor(display): complete optimization with settings, helpers, and method splitting

Major improvements to display.py architecture:

1. DisplaySettings Class (65 lines):
   - Centralized all hardcoded values (colors, styles, layouts, text labels)
   - Easy customization via single class
   - Constants: colors, styles, padding, sizes, labels, etc.

2. Helper Methods in DisplayManager:
   - _format_library_display() - eliminates duplicate library formatting
   - _truncate_value() - centralized value truncation logic
   - _format_file_size() - human-readable size formatting (B, KB, MB)
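A minimal sketch of such a human-readable size helper, in the spirit of the described `_format_file_size()` (the real method's thresholds and formatting may differ):

```python
# Illustrative B/KB/MB formatter; assumes 1024-based units.
def format_file_size(size: int) -> str:
    if size < 1024:
        return f"{size} B"
    if size < 1024 ** 2:
        return f"{size / 1024:.1f} KB"
    return f"{size / 1024 ** 2:.1f} MB"


print(format_file_size(512))           # 512 B
print(format_file_size(2048))          # 2.0 KB
print(format_file_size(5 * 1024 ** 2)) # 5.0 MB
```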

3. Split render_variables_table() (91 → 56 lines):
   - Extracted _render_section_header() (20 lines)
   - Extracted _render_variable_row() (35 lines)
   - Main method now cleaner coordinator logic

4. Updated All Managers to Use Settings:
   - VariableDisplayManager: uses all style/color/text constants
   - TemplateDisplayManager: uses settings and _format_library_display()
   - StatusDisplayManager: uses color scheme constants
   - TableDisplayManager: uses helpers and settings throughout

5. Removed Code Duplication:
   - Library display logic (was in 2 places)
   - File size formatting (was in 1 place)
   - Value truncation (was in 2 places with different logic)
   - Sensitive masking (consolidated)

Benefits:
- Single source of truth for all display configuration
- Easy to theme/customize CLI appearance
- Better testability (helpers can be unit tested)
- Reduced duplication
- More maintainable (change color scheme in one place)

File stats: 1343 → 1337 lines (-6 lines despite adding 65-line settings class)
All tests pass: compose list, compose show traefik
Linting: ruff checks passed

Relates to #1364

* refactor(display): split display.py into separate manager modules

- Created cli/core/display/ package structure
- Split DisplayManager into specialized managers:
  - VariableDisplayManager: variable rendering
  - TemplateDisplayManager: template display
  - StatusDisplayManager: status messages and errors
  - TableDisplayManager: table rendering
- Moved DisplaySettings and IconManager to __init__.py
- Maintained backward compatibility through delegation methods
- All imports remain unchanged (from cli.core.display import DisplayManager)
- Follows Single Responsibility Principle for better maintainability

* docs(changelog): add display module refactoring entry

* refactor(display): separate DisplaySettings, IconManager, and DisplayManager into individual files

- Moved DisplaySettings to display_settings.py
- Moved IconManager to icon_manager.py
- Moved DisplayManager to display_manager.py
- Updated __init__.py to only handle imports/exports (27 lines vs 526)
- Each file now has single, clear responsibility
- Better adherence to Single Responsibility Principle
- Updated AGENTS.md with new structure and ruff formatting instructions

* style: apply ruff formatting to entire codebase

* fix(display): simplify section disabled label logic and fix table row styling

- Fixed table row styling being added as 5th column instead of style parameter
- Simplified disabled label logic: show (disabled) if section has toggle and is not enabled
- Removed redundant has_dependencies parameter from _render_section_header
- Now all disabled toggle sections consistently show (disabled) label

* refactor(display): standardize table header styling across CLI

- Enforce consistent STYLE_TABLE_HEADER ('bold blue') for all tables via _print_table()
- Remove optional style parameter logic from table header styling
- Remove separate heading() calls before tables for cleaner output
- Ensure uniform table appearance throughout compose, repo, and config commands

* updated changelog

* updated changelog and description for code quality

* feature(ci): Add Ruff linting configuration and GitHub Actions workflow

- Add Ruff configuration to pyproject.toml with PEP 8 compliance
  - Line length: 88 characters
  - Indentation: 4 spaces (PEP 8 standard)
  - Enable comprehensive rule sets (pycodestyle, pyflakes, isort, pylint, etc.)
- Create .github/workflows/codequality-ruff.yaml
  - Runs on PRs to main and pushes to main/release/* branches
  - Checks both linting and formatting (blocking)
- Fix yamllint errors in config.yaml and release workflow
- Remove whitespace from table_display.py

Related to #1318

* fixed some ruff errors

* feat(compose): add --var and --var-file support to show command (#1421)

- Add --var and --var-file options to show command
- Apply same variable precedence as generate command
- Update CHANGELOG.md with feature description
- Users can now preview variable overrides before generating files

* started developing new functions

* critical updates to templates

* updates to the template tags

* n8n template preparation

* traefik security headers

* version pinning for twingate-connector

* template fix

* recent template updates

* traefik template publish

* n8n tags

* prometheus update

* fix(compose): use CF_API_TOKEN_FILE for Cloudflare API token in Traefik

* fix bug in schema 1.1

* renovate draft template

* make updates to renovate

* code quality updates

* feat: Add schema 1.2 with dedicated volume and resources sections

- Add spec_v1_2.py with new volume and resources sections
- Volume section: Replaces swarm_volume_* vars, works universally
- Resources section: CPU/memory limits for production deployments
- Ports section: Add ports_http and ports_https variables
- Update compose module to support schema 1.2
- Create new v2 archetypes:
  - service-volumes-v2.j2: Uses volume_mode
  - volumes-v2.j2: Top-level volumes with new section
  - service-resources-v1.j2: Resource limits

Related: #1519

* fixed #1522 and archetype improvements

* fixed issues in archetypes

* prepare other templates

* prepare other templates

* fix(variable): correct email validation regex

Fixed malformed email validation regex that was matching literal
backslash-s characters instead of whitespace. Changed from
r"^[^@\\s]+@[^@\\s]+\\.[^@\\s]+$" to
r"^[^@\s]+@[^@\s]+\.[^@\s]+$"
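The difference is easy to demonstrate: inside a raw string, `\\s` is a literal backslash plus `s`, so the broken pattern excluded the letter `s` itself and demanded a literal backslash before the TLD:

```python
import re

BROKEN = r"^[^@\\s]+@[^@\\s]+\\.[^@\\s]+$"  # '\\s' = literal backslash and 's'
FIXED = r"^[^@\s]+@[^@\s]+\.[^@\s]+$"       # '\s' = whitespace class

print(bool(re.match(BROKEN, "user@example.com")))  # False: 's' is excluded by the class
print(bool(re.match(FIXED, "user@example.com")))   # True
print(bool(re.match(FIXED, "us er@example.com")))  # False: whitespace rejected
```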

Fixes #1481

* fix(variable): replace regex with RFC-compliant email validation

Replaced regex-based email validation with email-validator library.
Regex cannot properly validate emails per RFC 5322/5321 - it fails
on valid addresses like "John Doe"@example.com, user+tag@example.com.

Changes:
- Added email-validator>=2.0.0 dependency to pyproject.toml
- Removed EMAIL_REGEX constant
- Updated _convert_email() to use validate_email() function
- Returns normalized email addresses
- Provides better error messages for invalid emails

Fixes #1481

* prepare migration for other modules

* ruff fixes

* fix(template): gracefully handle missing 'needs' dependencies

When a section's 'needs' dependency references a non-existent section,
the CLI now logs a warning instead of raising an error. This allows
templates to be modified without breaking when dependencies are removed.

Closes #1428

* fix(template): skip empty files during generation (#1518) (#1530)

* big update

* docs: add markdown support to changelog (#1471)

* feature(install): add auto-install for dependencies on Linux and macOS

- Auto-detects OS and Linux distribution
- Installs python3, pip, git, and pipx if missing
- Supports Ubuntu, Debian, Fedora, RHEL, CentOS, Rocky, AlmaLinux, openSUSE, Arch, Manjaro, Alpine, and macOS
- Adds --no-auto-install flag to skip automatic installation
- Improves error messages with clear installation instructions

Closes #1517

* fix(install): handle PEP 668 externally-managed environments

- Try installing pipx from system packages first
- Fall back to pip with --break-system-packages flag for PEP 668
- Add pipx package names for each distro
- Improve pipx ensurepath handling

* fix(install): improve pipx installation error handling

- Properly suppress stderr when trying system package installation
- Better conditional logic for pip installation with --break-system-packages
- Add success logging for each installation method

* fix(install): suppress pip error output by checking success

- Use grep to check for 'Successfully installed' instead of exit codes
- This suppresses PEP 668 error output when trying pip methods
- Provides helpful error message suggesting manual apt install

* fix(install): handle distros without VERSION_ID in os-release

- Arch Linux and some other rolling distros don't have VERSION_ID
- Use parameter expansion to set empty default
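The fix boils down to one parameter expansion; a sketch of the pattern (field names per os-release(5), the actual installer script may differ):

```shell
#!/bin/sh
# On rolling distros (e.g. Arch), /etc/os-release has no VERSION_ID line, so
# referencing it under `set -u` would abort the script. Supplying an empty
# default via parameter expansion keeps it safe.
set -u

# Normally: . /etc/os-release  (may leave VERSION_ID unset on Arch)
ID="arch"

VERSION_ID="${VERSION_ID:-}"   # empty default when unset

echo "distro=${ID} version=${VERSION_ID:-rolling}"
```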

* fix(install): add support for archarm distribution

- OrbStack Arch Linux uses 'archarm' as distribution ID
- Add it to the Arch Linux case pattern

* chore: add build/ and dist/ to .gitignore

- Ignore Python build artifacts
- Ignore distribution packages

* fix: add 'boilerplates' prefix to command suggestions in help text

- Update help messages to show 'boilerplates repo update' instead of 'repo update'
- Makes it clearer that commands should be run with the boilerplates CLI prefix
- Addresses user feedback from issue #1517

* feat(gitlab): integrate improvements from template/1372 with schema 1.2

- Add .env.j2 for environment variables (root password)
- Add container_hostname, root_email, root_password variables
- Add initial root user configuration to gitlab.rb
- Add default_theme, default_color_mode, disable_usage_data settings
- Improve template description and next_steps documentation
- Add env_file and swarm configs/secrets support
- Update to use schema 1.2 volume section (volume_mode instead of swarm_volume_*)
- Fix registry port from 5678 to standard 5000
- Add swarm placement constraints support
- Update ports section to include ports_https

* archetype validation testing

* schema1.2-traefik_domain

* feat(traefik): add multiple DNS challenge providers

- Add support for Porkbun, GoDaddy, DigitalOcean, Route53 (AWS), Azure, GCP, and Namecheap
- Add provider-specific credential variables with conditional visibility
- Support both standard and Docker Swarm modes for all providers
- Update environment variable handling for each provider

Closes #1478

* fix repo and changelog

* feature(docs): add GitHub Action to auto-generate wiki variable documentation

- Created .github/scripts/generate_wiki_docs.py script

- Generates markdown documentation for all module variables

- Uses latest schema version for each module

- Created workflow to auto-update wiki on schema changes

- Workflow triggers on changes to module specs and script

- Runs on release/v0.1.0 branch (will switch to main later)

Relates to #1316

* docs(wiki): add prominent Contributing section with CONTRIBUTING.md link

- Added Contributing section in Developer Documentation area

- Links directly to CONTRIBUTING.md in repository

- Highlights key points: CLI requires Discord, templates welcome PRs

Relates to #1316

* Documentation

* template updates

* template updates

* fix(compose): add Loki batching configuration to Alloy template

- Add batch_wait (5s) and batch_size (1MB) to reduce request volume
- Add max_backoff (5m) and min_backoff (500ms) for retry reliability
- Prevents ingestion rate limit errors with multiple Alloy instances
- Reduces HTTP overhead and improves compression efficiency
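In a Grafana Alloy `loki.write` component this kind of batching and backoff tuning would look roughly as follows — a hedged fragment only; the exact attribute names and value syntax should be checked against the installed Alloy version and the actual template:

```hcl
loki.write "default" {
  endpoint {
    url = "http://loki:3100/loki/api/v1/push"

    // Batching: wait up to 5s or 1MB of logs before sending,
    // reducing request volume and improving compression.
    batch_wait = "5s"
    batch_size = "1MiB"

    // Retry backoff: spreads retries out so multiple Alloy
    // instances don't hammer Loki's ingestion rate limits.
    min_backoff_period = "500ms"
    max_backoff_period = "5m"
  }
}
```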

Fixes #1556

* updates

* update schema

* update

* fix ruff

* update wiki

* fix wiki

* fix wiki

* refactor(workflows): rename and extend wiki sync workflow

- Rename docs-update-wiki-variables.yaml to docs-update-wiki.yaml
- Add syncing of static wiki pages from .wiki/ directory
- Add .wiki/** to workflow triggers
- Improve commit message and workflow description

* fix(workflows): improve wiki branch detection for new wikis

- Add fallback to current branch if symbolic ref doesn't exist
- Prevents 'invalid refspec' error on newly created wikis

* fix(workflows): simplify wiki workflow by assuming master branch

- Remove complex branch detection that was causing empty variable issues
- Hardcode master branch (GitHub wikis default)
- Remove unnecessary initialization check (wiki must exist for checkout to succeed)
- Simplify commit message

* update email settings

* fix wiki

* big template updates 1

* big template updates 1

* template refactoring 2

* working on templates 2

* working on templates 3

* refactoring updates

* release-test-1

* release-test-2

* fix(templates): resolve yamllint errors - add missing newlines and remove duplicate key

* fix(templates): resolve yamllint line-length warnings in descriptions
Christian Lempa, 2 months ago
Parent
Commit f856aef030
100 changed files with 10,573 additions and 6,579 deletions
  1. +252 −0  .github/scripts/generate_wiki_docs.py
  2. +36 −0  .github/workflows/codequality-ruff.yaml
  3. +11 −8  .github/workflows/docs-update-wiki.yaml
  4. +4 −2  .gitignore
  5. +409 −148  AGENTS.md
  6. +42 −0  CHANGELOG.md
  7. +270 −0  CONTRIBUTING.md
  8. +0 −3  archetypes/__init__.py
  9. +0 −453  archetypes/__main__.py
  10. +0 −20  archetypes/compose/configs-v1.j2
  11. +0 −134  archetypes/compose/extension.yaml
  12. +0 −50  archetypes/compose/networks-v1.j2
  13. +0 −20  archetypes/compose/secrets-v1.j2
  14. +0 −20  archetypes/compose/service-configs-v1.j2
  15. +0 −38  archetypes/compose/service-deploy-v1.j2
  16. +0 −25  archetypes/compose/service-environment-v1.j2
  17. +0 −33  archetypes/compose/service-labels-v1.j2
  18. +0 −30  archetypes/compose/service-networks-v1.j2
  19. +0 −26  archetypes/compose/service-ports-v1.j2
  20. +0 −20  archetypes/compose/service-secrets-v1.j2
  21. +0 −23  archetypes/compose/service-v1.j2
  22. +0 −26  archetypes/compose/service-volumes-v1.j2
  23. +0 −40  archetypes/compose/volumes-v1.j2
  24. +1 −1  cli/__init__.py
  25. +118 −82  cli/__main__.py
  26. +0 −953  cli/core/config.py
  27. +9 −0  cli/core/config/__init__.py
  28. +575 −0  cli/core/config/config_manager.py
  29. +0 −975  cli/core/display.py
  30. +180 −0  cli/core/display/__init__.py
  31. +308 −0  cli/core/display/display_base.py
  32. +193 −0  cli/core/display/display_icons.py
  33. +66 −0  cli/core/display/display_settings.py
  34. +303 −0  cli/core/display/display_status.py
  35. +274 −0  cli/core/display/display_table.py
  36. +158 −0  cli/core/display/display_template.py
  37. +242 −0  cli/core/display/display_variable.py
  38. +46 −27  cli/core/exceptions.py
  39. +11 −0  cli/core/input/__init__.py
  40. +228 −0  cli/core/input/input_manager.py
  41. +37 −0  cli/core/input/input_settings.py
  42. +243 −0  cli/core/input/prompt_manager.py
  43. +70 −117  cli/core/library.py
  44. +0 −1298  cli/core/module.py
  45. +9 −0  cli/core/module/__init__.py
  46. +692 −0  cli/core/module/base_commands.py
  47. +345 −0  cli/core/module/base_module.py
  48. +141 −0  cli/core/module/config_commands.py
  49. +236 −0  cli/core/module/helpers.py
  50. +104 −116  cli/core/prompt.py
  51. +7 −13  cli/core/registry.py
  52. +249 −281  cli/core/repo.py
  53. +17 −0  cli/core/schema/__init__.py
  54. +15 −0  cli/core/schema/ansible/v1.0.json
  55. +229 −0  cli/core/schema/compose/v1.0.json
  56. +312 −0  cli/core/schema/compose/v1.1.json
  57. +528 −0  cli/core/schema/compose/v1.2.json
  58. +202 −0  cli/core/schema/helm/v1.0.json
  59. +247 −0  cli/core/schema/kubernetes/v1.0.json
  60. +220 −0  cli/core/schema/loader.py
  61. +14 −0  cli/core/schema/packer/v1.0.json
  62. +87 −0  cli/core/schema/terraform/v1.0.json
  63. +20 −0  cli/core/template/__init__.py
  64. +376 −414  cli/core/template/template.py
  65. +69 −102  cli/core/template/variable.py
  66. +444 −300  cli/core/template/variable_collection.py
  67. +58 −69  cli/core/template/variable_section.py
  68. +31 −33  cli/core/validators.py
  69. +3 −7  cli/core/version.py
  70. +1 −0  cli/modules/__init__.py
  71. +88 −0  cli/modules/ansible/__init__.py
  72. +132 −12  cli/modules/compose/__init__.py
  73. +0 −278  cli/modules/compose/spec_v1_0.py
  74. +0 −341  cli/modules/compose/spec_v1_1.py
  75. +252 −0  cli/modules/compose/validate.py
  76. +87 −0  cli/modules/helm/__init__.py
  77. +88 −0  cli/modules/kubernetes/__init__.py
  78. +87 −0  cli/modules/packer/__init__.py
  79. +88 −0  cli/modules/terraform/__init__.py
  80. +16 −0  library/ansible/checkmk-install-agent/playbook.yaml.j2
  81. +59 −0  library/ansible/checkmk-install-agent/template.yaml
  82. +16 −0  library/ansible/checkmk-manage-host/playbook.yaml.j2
  83. +65 −0  library/ansible/checkmk-manage-host/template.yaml
  84. +61 −0  library/ansible/docker-certs-enable/playbook.yaml.j2
  85. +35 −0  library/ansible/docker-certs-enable/template.yaml
  86. +167 −0  library/ansible/docker-certs/playbook.yaml.j2
  87. +43 −0  library/ansible/docker-certs/template.yaml
  88. +44 −0  library/ansible/docker-install-ubuntu/playbook.yaml.j2
  89. +27 −0  library/ansible/docker-install-ubuntu/template.yaml
  90. +24 −0  library/ansible/docker-prune/playbook.yaml.j2
  91. +27 −0  library/ansible/docker-prune/template.yaml
  92. +28 −0  library/ansible/ubuntu-add-sshkey/playbook.yaml.j2
  93. +27 −0  library/ansible/ubuntu-add-sshkey/template.yaml
  94. +24 −0  library/ansible/ubuntu-apt-update/playbook.yaml.j2
  95. +29 −0  library/ansible/ubuntu-apt-update/template.yaml
  96. +28 −0  library/ansible/ubuntu-vm-core/playbook.yaml.j2
  97. +27 −0  library/ansible/ubuntu-vm-core/template.yaml
  98. +151 −0  library/compose/adguardhome/compose.yaml.j2
  99. +82 −0  library/compose/adguardhome/template.yaml
  100. +29 −41  library/compose/alloy/compose.yaml.j2

+252 −0  .github/scripts/generate_wiki_docs.py

@@ -0,0 +1,252 @@
+#!/usr/bin/env python3
+"""Generate GitHub Wiki documentation for module variables.
+
+This script auto-generates variable documentation in GitHub Wiki markdown format
+for all registered modules, using the latest schema version for each.
+"""
+
+import sys
+from pathlib import Path
+
+# Add project root to path (script is in .github/scripts, so go up twice)
+project_root = Path(__file__).parent.parent.parent
+sys.path.insert(0, str(project_root))
+
+# ruff: noqa: E402
+# Import all modules to register them
+import cli.modules.ansible
+import cli.modules.compose
+import cli.modules.helm
+import cli.modules.kubernetes
+import cli.modules.packer
+import cli.modules.terraform  # noqa: F401
+from cli.core.registry import registry  # Module import after path manipulation
+
+
+def format_value(value):
+    """Format value for markdown display."""
+    if value is None or value == "":
+        return "_none_"
+    if isinstance(value, bool):
+        return "✓" if value else "✗"
+    if isinstance(value, list):
+        return ", ".join(f"`{v}`" for v in value)
+    return f"`{value}`"
+
+
+def generate_module_docs(module_name: str, output_dir: Path):  # noqa: PLR0912, PLR0915
+    """Generate wiki documentation for a single module."""
+    # Get module class from registry
+    module_classes = dict(registry.iter_module_classes())
+
+    if module_name not in module_classes:
+        sys.stderr.write(f"Warning: Module '{module_name}' not found, skipping\n")
+        return False
+
+    module_cls = module_classes[module_name]
+    schema_version = module_cls.schema_version
+
+    # Get the spec for the latest schema version
+    if hasattr(module_cls, "schemas") and schema_version in module_cls.schemas:
+        spec = module_cls.schemas[schema_version]
+    elif hasattr(module_cls, "spec"):
+        spec = module_cls.spec
+    else:
+        sys.stderr.write(f"Warning: No spec found for module '{module_name}', skipping\n")
+        return False
+
+    # Generate markdown content
+    lines = []
+
+    # Header
+    lines.append(f"# {module_name.title()} Variables")
+    lines.append("")
+    lines.append(f"**Module:** `{module_name}`  ")
+    lines.append(f"**Schema Version:** `{schema_version}`  ")
+    lines.append(f"**Description:** {module_cls.description}")
+    lines.append("")
+    lines.append("---")
+    lines.append("")
+    lines.append(
+        "This page documents all available variables for the "
+        + f"{module_name} module. Variables are organized into sections "
+        + "that can be enabled/disabled based on your configuration needs."
+    )
+    lines.append("")
+
+    # Table of contents
+    lines.append("## Table of Contents")
+    lines.append("")
+    for section_key, section_data in spec.items():
+        section_title = section_data.get("title", section_key)
+        anchor = section_title.lower().replace(" ", "-").replace("/", "")
+        lines.append(f"- [{section_title}](#{anchor})")
+    lines.append("")
+    lines.append("---")
+    lines.append("")
+
+    # Process each section
+    for section_key, section_data in spec.items():
+        section_title = section_data.get("title", section_key)
+        section_desc = section_data.get("description", "")
+        section_toggle = section_data.get("toggle", "")
+        section_needs = section_data.get("needs", "")
+        section_required = section_data.get("required", False)
+        section_vars = section_data.get("vars", {})
+
+        # Section header
+        lines.append(f"## {section_title}")
+        lines.append("")
+
+        # Section metadata
+        metadata = []
+        if section_required:
+            metadata.append("**Required:** Yes")
+        if section_toggle:
+            metadata.append(f"**Toggle Variable:** `{section_toggle}`")
+        if section_needs:
+            if isinstance(section_needs, list):
+                needs_str = ", ".join(f"`{n}`" for n in section_needs)
+            else:
+                needs_str = f"`{section_needs}`"
+            metadata.append(f"**Depends On:** {needs_str}")
+
+        if metadata:
+            lines.append("  \n".join(metadata))
+            lines.append("")
+
+        if section_desc:
+            lines.append(section_desc)
+            lines.append("")
+
+        # Skip sections with no variables
+        if not section_vars:
+            lines.append("_No variables defined in this section._")
+            lines.append("")
+            continue
+
+        # Variables table
+        lines.append("| Variable | Type | Default | Description |")
+        lines.append("|----------|------|---------|-------------|")
+
+        for var_name, var_data in section_vars.items():
+            var_type = var_data.get("type", "str")
+            var_default = format_value(var_data.get("default"))
+            var_description = var_data.get("description", "").replace("\n", " ")
+
+            # Add extra metadata to description
+            extra_parts = []
+            if var_data.get("sensitive"):
+                extra_parts.append("**Sensitive**")
+            if var_data.get("autogenerated"):
+                extra_parts.append("**Auto-generated**")
+            if "options" in var_data:
+                opts = ", ".join(f"`{o}`" for o in var_data["options"])
+                extra_parts.append(f"**Options:** {opts}")
+            if "needs" in var_data:
+                extra_parts.append(f"**Needs:** `{var_data['needs']}`")
+            if "extra" in var_data:
+                extra_parts.append(var_data["extra"])
+
+            if extra_parts:
+                var_description += "<br>" + " • ".join(extra_parts)
+
+            lines.append(f"| `{var_name}` | `{var_type}` | {var_default} | {var_description} |")
+
+        lines.append("")
+        lines.append("---")
+        lines.append("")
+
+    # Footer
+    lines.append("## Notes")
+    lines.append("")
+    lines.append("- **Required sections** must be configured")
+    lines.append("- **Toggle variables** enable/disable entire sections")
+    lines.append("- **Dependencies** (`needs`) control when sections/variables are available")
+    lines.append("- **Sensitive variables** are masked during prompts")
+    lines.append("- **Auto-generated variables** are populated automatically if not provided")
+    lines.append("")
+    lines.append("---")
+    lines.append("")
+    lines.append(f"_Last updated: Schema version {schema_version}_")
+
+    # Write to file
+    output_file = output_dir / f"Variables-{module_name.title()}.md"
+    output_file.write_text("\n".join(lines))
+
+    sys.stdout.write(f"Generated: {output_file.name}\n")
+    return True
+
+
+def generate_variables_index(modules: list[str], output_dir: Path):
+    """Generate index page for all variable documentation."""
+    lines = []
+
+    lines.append("# Variables Documentation")
+    lines.append("")
+    lines.append("This section contains auto-generated documentation for all " + "available variables in each module.")
+    lines.append("")
+    lines.append("## Available Modules")
+    lines.append("")
+
+    for module_name in sorted(modules):
+        lines.append(f"- [{module_name.title()}](Variables-{module_name.title()})")
+
+    lines.append("")
+    lines.append("---")
+    lines.append("")
+    lines.append("Each module page includes:")
+    lines.append("")
+    lines.append("- Schema version information")
+    lines.append("- Complete list of sections and variables")
+    lines.append("- Variable types, defaults, and descriptions")
+    lines.append("- Section dependencies and toggle configurations")
+    lines.append("")
+    lines.append("---")
+    lines.append("")
+    lines.append("_This documentation is auto-generated from module schemas._")
+
+    output_file = output_dir / "Variables.md"
+    output_file.write_text("\n".join(lines))
+
+    sys.stdout.write(f"Generated: {output_file.name}\n")
+
+
+# Minimum required arguments
+MIN_ARGS = 2
+
+
+def main():
+    """Main entry point."""
+    if len(sys.argv) < MIN_ARGS:
+        sys.stderr.write("Usage: python3 scripts/generate_wiki_docs.py <output_directory>\n")
+        sys.exit(1)
+
+    output_dir = Path(sys.argv[1])
+    output_dir.mkdir(parents=True, exist_ok=True)
+
+    sys.stdout.write(f"Generating wiki documentation in: {output_dir}\n")
+    sys.stdout.write("\n")
+
+    # Get all registered modules
+    module_classes = dict(registry.iter_module_classes())
+    successful_modules = []
+
+    for module_name in sorted(module_classes.keys()):
+        if generate_module_docs(module_name, output_dir):
+            successful_modules.append(module_name)
+
+    sys.stdout.write("\n")
+
+    # Generate index page
+    if successful_modules:
+        generate_variables_index(successful_modules, output_dir)
+        sys.stdout.write("\n")
+        sys.stdout.write(f"✓ Successfully generated documentation for {len(successful_modules)} module(s)\n")
+    else:
+        sys.stderr.write("Error: No documentation generated\n")
+        sys.exit(1)
+
+
+if __name__ == "__main__":
+    main()

+36 −0  .github/workflows/codequality-ruff.yaml

@@ -0,0 +1,36 @@
+---
+name: Code Quality - Ruff
+
+'on':
+  pull_request:
+    branches:
+      - main
+  push:
+    branches:
+      - main
+      - 'release/**'
+
+permissions:
+  contents: read
+
+jobs:
+  ruff:
+    name: Python Linting and Formatting
+    runs-on: ubuntu-latest
+    steps:
+      - name: Checkout
+        uses: actions/checkout@v5
+
+      - name: Set up Python
+        uses: actions/setup-python@v5
+        with:
+          python-version: '3.9'
+
+      - name: Install Ruff
+        run: pip install ruff
+
+      - name: Run Ruff Linting
+        run: ruff check .
+
+      - name: Run Ruff Formatting Check
+        run: ruff format --check .

+ 11 - 8
.github/workflows/docs-update-wiki.yaml

@@ -6,11 +6,10 @@ name: Docs - Update Wiki
     branches:
       - main
     paths:
-      - 'cli/modules/*/spec_*.py'
-      - 'cli/modules/*/__init__.py'
-      - '.wiki/**'
-      - '.github/scripts/generate_wiki_docs.py'
-      - '.github/workflows/docs-update-wiki.yaml'
+      - 'cli/core/schema/**/*.json'  # JSON schema files
+      - '.wiki/**'  # Static wiki pages
+      - '.github/scripts/generate_wiki_docs.py'  # Wiki generation script
+      - '.github/workflows/docs-update-wiki.yaml'  # This workflow
   workflow_dispatch:  # Allow manual trigger
 
 permissions:
@@ -40,7 +39,7 @@ jobs:
       - name: Install dependencies
         run: |
           python -m pip install --upgrade pip
-          pip install typer rich pyyaml jinja2
+          pip install -e .
 
       - name: Generate variable documentation
         run: |
@@ -82,8 +81,12 @@ jobs:
         run: |
           git config user.name "github-actions[bot]"
           git config user.email "github-actions[bot]@users.noreply.github.com"
-          git commit -m "Auto-update variable documentation from schema changes"
-          git push
+          git commit -m "Auto-update wiki pages"
+
+          # Pull with rebase to handle any remote changes, then push
+          # GitHub wikis use master as default branch
+          git pull --rebase origin master
+          git push origin master
 
       - name: Summary
         run: |

+ 4 - 2
.gitignore

@@ -6,6 +6,7 @@
 **/secret.*
 **/.env
 **/.envrc
+**/.direnv
 
 # Ignore Ansible
 **/.ansible
@@ -17,9 +18,12 @@
 **/*.pyd
 **/.venv
 **/venv/
+**/.ruff_cache/
 
 # Packaging
 *.egg-info/
+build/
+dist/
 
 # Installation tracking
 .installed-version
@@ -27,5 +31,3 @@
 # Test outputs
 tests/
 config.yaml
-
-*~

+ 409 - 148
AGENTS.md

@@ -17,11 +17,23 @@ python3 -m cli
 python3 -m cli --log-level DEBUG compose list
 ```
 
+### Production-Ready Testing
+
+For detailed information about testing boilerplates in a production-like environment before release, see **WARP-LOCAL.md** (local file, not in git). This document covers:
+- Test server infrastructure and Docker contexts
+- Step-by-step testing procedures for Docker Compose, Swarm, and Kubernetes
+- Comprehensive testing checklists
+- Production release criteria
+
 ### Linting and Formatting
 
 Should **always** happen before pushing anything to the repository.
 
-- Use `yamllint` for YAML files and `ruff` for Python code.
+- Use `yamllint` for YAML files
+- Use `ruff` for Python code:
+  - `ruff check --fix .` - Check and auto-fix linting errors (including unused imports)
+  - `ruff format .` - Format code according to style guidelines
+  - Both commands must be run before committing
 
 ### Project Management and Git
 
@@ -38,7 +50,8 @@ The project is stored in a public GitHub Repository, use issues, and branches fo
 
 - `cli/` - Python CLI application source code
   - `cli/core/` - Core Components of the CLI application
-  - `cli/modules/` - Modules implementing variable specs and technology-specific functions
+  - `cli/core/schema/` - JSON schema definitions for all modules
+  - `cli/modules/` - Modules implementing technology-specific functions
   - `cli/__main__.py` - CLI entry point, auto-discovers modules and registers commands
 - `library/` - Template collections organized by module
   - `library/ansible/` - Ansible playbooks and configurations
@@ -58,15 +71,27 @@ The project is stored in a public GitHub Repository, use issues, and branches fo
   - **Key Attributes**: `_sections` (dict of VariableSection objects), `_variable_map` (flat lookup dict)
   - **Key Methods**: `get_satisfied_values()` (returns enabled variables), `apply_defaults()`, `sort_sections()`
 - `cli/core/config.py` - Configuration management (loading, saving, validation)
-- `cli/core/display.py` - Centralized CLI output rendering (**Always use this to display output - never print directly**)
+- `cli/core/display/` - Centralized CLI output rendering package (**Always use DisplayManager - never print directly**)
+  - `__init__.py` - Package exports (DisplayManager, DisplaySettings, IconManager)
+  - `display_manager.py` - Main DisplayManager facade
+  - `display_settings.py` - DisplaySettings configuration class
+  - `icon_manager.py` - IconManager for Nerd Font icons
+  - `variable_display.py` - VariableDisplayManager for variable rendering
+  - `template_display.py` - TemplateDisplayManager for template display
+  - `status_display.py` - StatusDisplayManager for status messages
+  - `table_display.py` - TableDisplayManager for table rendering
 - `cli/core/exceptions.py` - Custom exceptions for error handling (**Always use this for raising errors**)
 - `cli/core/library.py` - LibraryManager for template discovery from git-synced libraries and static file paths
 - `cli/core/module.py` - Abstract base class for modules (defines standard commands)
 - `cli/core/prompt.py` - Interactive CLI prompts using rich library
 - `cli/core/registry.py` - Central registry for module classes (auto-discovers modules)
 - `cli/core/repo.py` - Repository management for syncing git-based template libraries
+- `cli/core/schema/` - Schema management package (**JSON-based schema system**)
+  - `loader.py` - SchemaLoader class for loading and validating JSON schemas
+  - `<module>/` - Module-specific schema directories (e.g., `compose/`, `terraform/`)
+  - `<module>/v*.json` - Version-specific JSON schema files (e.g., `v1.0.json`, `v1.2.json`)
 - `cli/core/section.py` - VariableSection class (stores section metadata and variables)
-  - **Key Attributes**: `key`, `title`, `toggle`, `required`, `needs`, `variables` (dict of Variable objects)
+  - **Key Attributes**: `key`, `title`, `toggle`, `needs`, `variables` (dict of Variable objects)
 - `cli/core/template.py` - Template Class for parsing, managing and rendering templates
 - `cli/core/variable.py` - Variable class (stores variable metadata and values)
   - **Key Attributes**: `name`, `type`, `value` (stores default or current value), `description`, `sensitive`, `needs`
@@ -88,52 +113,117 @@ Modules can be either single files or packages:
 - Call `registry.register(YourModule)` at module bottom
 - Auto-discovered and registered at CLI startup
 
-**Module Spec:**
-Module-wide variable specification defining defaults for all templates of that kind.
+**Module Discovery and Registration:**
 
-**Important**: The `spec` variable is an **OrderedDict** (or regular dict), NOT a VariableCollection object. It's converted to VariableCollection when needed.
+The system automatically discovers and registers modules at startup:
 
-Example:
-```python
-from collections import OrderedDict
-
-# Spec is a dict/OrderedDict, not a VariableCollection
-spec = OrderedDict({
-  "general": {
-    "title": "General",
-    "vars": {
-      "common_var": {
+1. **Discovery**: CLI `__main__.py` imports all Python files in `cli/modules/` directory
+2. **Registration**: Each module file calls `registry.register(ModuleClass)` at module level
+3. **Storage**: Registry stores module classes in a central dictionary by module name
+4. **Command Generation**: CLI framework auto-generates subcommands for each registered module
+5. **Instantiation**: Modules are instantiated on-demand when commands are invoked
+
+**Benefits:**
+- No manual registration needed - just add a file to `cli/modules/`
+- Modules are self-contained - can be added/removed without modifying core code
+- Type-safe - registry validates module interfaces at registration time
+
+**Module Schema System:**
+
+**JSON Schema Architecture** (Refactored from Python specs):
+
+All module schemas are now defined as **JSON files** in `cli/core/schema/<module>/v*.json`. This provides:
+- **Version control**: Easy schema comparison and diffs in git
+- **Language-agnostic**: Schemas can be consumed by tools outside Python
+- **Validation**: Built-in JSON schema validation
+- **Documentation**: Self-documenting schema structure
+
+**Schema File Location:**
+```
+cli/core/schema/
+  compose/
+    v1.0.json
+    v1.1.json
+    v1.2.json
+  terraform/
+    v1.0.json
+  ansible/
+    v1.0.json
+  ...other modules...
+```
+
+**JSON Schema Structure:**
+
+Schemas are arrays of section objects, where each section contains:
+
+```json
+[
+  {
+    "key": "section_key",
+    "title": "Section Title",
+    "description": "Optional section description",
+    "toggle": "optional_toggle_variable_name",
+    "needs": "optional_dependency",
+    "required": true,
+    "vars": [
+      {
+        "name": "variable_name",
         "type": "str",
-        "default": "value",
-        "description": "A common variable"
+        "description": "Variable description",
+        "default": "default_value",
+        "required": true,
+        "sensitive": false,
+        "autogenerated": false,
+        "options": ["option1", "option2"],
+        "needs": "other_var=value",
+        "extra": "Additional help text"
       }
-    }
-  },
-  "networking": {
-    "title": "Network",
-    "toggle": "net_enabled",
-    "vars": {...}
+    ]
   }
-})
-
-# To use the spec, convert it to VariableCollection:
-from cli.core.collection import VariableCollection
-variable_collection = VariableCollection(spec)
+]
 ```
 
-**Multi-Schema Modules:**
-For modules supporting multiple schema versions, use package structure:
-```
-cli/modules/compose/
-  __init__.py          # Module class, loads appropriate spec
-  spec_v1_0.py         # Schema 1.0 specification
-  spec_v1_1.py         # Schema 1.1 specification
+**Schema Loading in Modules:**
+
+Modules load JSON schemas on-demand using the SchemaLoader:
+
+```python
+from cli.core.schema import load_schema, has_schema, list_versions
+
+class MyModule(Module):
+    name = "mymodule"
+    schema_version = "1.2"  # Latest version supported
+    
+    def get_spec(self, template_schema: str) -> OrderedDict:
+        """Load JSON schema and convert to dict format."""
+        json_spec = load_schema(self.name, template_schema)
+        # Convert JSON array to OrderedDict format
+        return self._convert_json_to_dict(json_spec)
 ```
 
+**Schema Design Principles:**
+- **Backward compatibility**: Newer module versions can load older template schemas
+- **Auto-created toggle variables**: Sections with `toggle` automatically create boolean variables
+- **Conditional visibility**: Variables use `needs` constraints to show/hide based on other variable values
+- **Mode-based organization**: Related settings grouped by operational mode (e.g., network_mode, volume_mode)
+- **Incremental evolution**: New schemas add features without breaking existing templates
+
+**Working with Schemas:**
+- **View available versions**: Check `cli/core/schema/<module>/` directory or use `list_versions(module)`
+- **Add new schema version**: Create new JSON file following naming convention (e.g., `v1.3.json`)
+- **Update module**: Increment `schema_version` in module class when adding new schema
+- **Validate schemas**: SchemaLoader automatically validates JSON structure on load
+
+**Migration from Python Specs:**
+
+Older Python-based `spec_v*.py` files have been migrated to JSON. The module `__init__.py` now:
+1. Loads JSON schemas using SchemaLoader
+2. Converts JSON array format to OrderedDict for backward compatibility
+3. Provides lazy loading via `_SchemaDict` class
+
 **Existing Modules:**
-- `cli/modules/compose/` - Docker Compose package with schema 1.0 and 1.1 support
-  - `spec_v1_0.py` - Basic compose spec
-  - `spec_v1_1.py` - Extended with network_mode, swarm support
+- `cli/modules/compose/` - Docker Compose (JSON schemas: v1.0, v1.1, v1.2)
+- Other modules (ansible, terraform, kubernetes, helm, packer) - Work in Progress
 
 **(Work in Progress):** terraform, docker, ansible, kubernetes, packer modules
 
@@ -179,52 +269,135 @@ libraries:
 
 ### DisplayManager and IconManager
 
-External code should NEVER directly call `IconManager` or `console.print`, instead always use `DisplayManager` methods.
+**CRITICAL RULE - NEVER violate this:**
+- NEVER use `console.print()` outside of display manager classes (`cli/core/display/` directory)
+- NEVER import `Console` from `rich.console` except in display manager classes or `cli/__main__.py`
+- ALWAYS use `module_instance.display.display_*()` or `display.display_*()` methods for ALL output
+- Display managers (`cli/core/display/*.py`) are the ONLY exception - they implement console output
+
+**Rationale:**
+- `DisplayManager` provides a **centralized interface** for ALL CLI output rendering
+- Direct console usage bypasses formatting standards, icon management, and output consistency
+- `IconManager` provides **Nerd Font icons** internally for DisplayManager - never use emojis or direct icons
 
-- `DisplayManager` provides a **centralized interface** for ALL CLI output rendering (Use `display_***` methods from `DisplayManager` for ALL output)
-- `IconManager` provides **Nerd Font icons** internally for DisplayManager, don't use Emojis or direct console access
+**DisplayManager Architecture** (Refactored for Single Responsibility Principle):
+
+`DisplayManager` acts as a facade that delegates to specialized manager classes:
+
+1. **VariableDisplayManager** - Handles all variable-related rendering
+   - `render_variable_value()` - Variable value formatting with context awareness
+   - `render_section()` - Section header display
+   - `render_variables_table()` - Complete variables table with dependencies
+
+2. **TemplateDisplayManager** - Handles all template-related rendering
+   - `render_template()` - Main template display coordinator
+   - `render_template_header()` - Template metadata display
+   - `render_file_tree()` - Template file structure visualization
+   - `render_file_generation_confirmation()` - Files preview before generation
+
+3. **StatusDisplayManager** - Handles status messages and error display
+   - `display_message()` - Core message formatting with level-based routing
+   - `display_error()`, `display_warning()`, `display_success()`, `display_info()` - Convenience methods
+   - `display_template_render_error()` - Detailed render error display
+   - `display_warning_with_confirmation()` - Interactive warning prompts
+
+4. **TableDisplayManager** - Handles table rendering
+   - `render_templates_table()` - Templates list with library indicators
+   - `render_status_table()` - Status tables with success/error indicators
+   - `render_config_tree()` - Configuration tree visualization
+
+**Usage Pattern:**
+```python
+# External code uses DisplayManager methods (backward compatible)
+display = DisplayManager()
+display.display_template(template, template_id)
+
+# Internally, DisplayManager delegates to specialized managers
+# display.templates.render_template(template, template_id)
+```
+
+**Design Principles:**
+- External code calls `DisplayManager` methods only
+- `DisplayManager` delegates to specialized managers internally
+- Each specialized manager has a single, focused responsibility
+- Backward compatibility maintained through delegation methods
+- All managers can access parent DisplayManager via `self.parent`
 
 ## Templates
 
 Templates are directory-based. Each template is a directory containing all the necessary files and subdirectories for the boilerplate.
 
+### Template Rendering Flow
+
+**How templates are loaded and rendered:**
+
+1. **Discovery**: LibraryManager finds template directories containing `template.yaml`/`template.yml`
+2. **Parsing**: Template class loads and parses the template metadata and spec
+3. **Schema Resolution**: Module's `get_spec()` loads appropriate spec based on template's `schema` field
+4. **Variable Inheritance**: Template inherits ALL variables from module schema
+5. **Variable Merging**: Template spec overrides are merged with module spec (precedence: module < template < user config < CLI)
+6. **Collection Building**: VariableCollection is constructed with merged variables and sections
+7. **Dependency Resolution**: Sections are topologically sorted based on `needs` constraints
+8. **Variable Resolution**: Variables with `needs` constraints are evaluated for visibility
+9. **Jinja2 Rendering**: Template files (`.j2`) are rendered with final variable values
+10. **Sanitization**: Rendered output is cleaned (whitespace, blank lines, trailing newline)
+11. **Validation**: Optional semantic validation (YAML structure, Docker Compose schema, etc.)
+
+**Key Architecture Points:**
+- Templates don't "call" module specs - they declare a schema version and inherit from it
+- Variable visibility is dynamic based on `needs` constraints (evaluated at prompt/render time)
+- Jinja2 templates support `{% include %}` and `{% import %}` for composition
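The precedence merge in step 5 can be sketched as a simple ordered dict update. Function and variable names here are illustrative, not the CLI's actual API:

```python
# Minimal sketch of the variable precedence merge (step 5 above);
# later sources override earlier ones: module < template < config < CLI.
def merge_variables(module_spec, template_spec, user_config, cli_vars):
    merged = {}
    for source in (module_spec, template_spec, user_config, cli_vars):
        merged.update(source)  # each later source wins on conflicts
    return merged


result = merge_variables(
    {"service_name": "app", "network_mode": "bridge"},  # module defaults
    {"service_name": "whoami"},                         # template override
    {"network_mode": "host"},                           # user config.yaml
    {},                                                 # no --var overrides
)
assert result == {"service_name": "whoami", "network_mode": "host"}
```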
+
+### Template Structure
+
 Requires `template.yaml` or `template.yml` with metadata and variables:
 
 ```yaml
 ---
 kind: compose
-schema: "1.0"  # Optional: Defaults to 1.0 if not specified
+schema: "X.Y"  # Optional: Defaults to "1.0" if not specified (e.g., "1.0", "1.2")
 metadata:
-  name: My Nginx Template
-  description: >
-    A template for a simple Nginx service.
-
-
-    Project: https://...
-
-    Source: https://
-
-    Documentation: https://
-  version: 0.1.0
-  author: Christian Lempa
-  date: '2024-10-01'
+  name: My Service Template
+  description: A template for a service.
+  version: 1.0.0
+  author: Your Name
+  date: '2024-01-01'
 spec:
   general:
     vars:
-      nginx_version:
-        type: string
-        description: The Nginx version to use.
-        default: latest
+      service_name:
+        type: str
+        description: Service name
 ```
 
+### Template Metadata Versioning
+
+**Template Version Field:**
+The `metadata.version` field in `template.yaml` should reflect the version of the underlying application or resource:
+- **Compose templates**: Match the Docker image version (e.g., `nginx:1.25.3` → `version: 1.25.3`)
+- **Terraform templates**: Match the provider version (e.g., AWS provider 5.23.0 → `version: 5.23.0`)
+- **Other templates**: Match the primary application/tool version being deployed
+- Use `latest` or increment a template-specific version (e.g., `0.1.0`, `0.2.0`) only when no specific application version applies
+
+**Rationale:** This helps users identify which version of the application/provider the template is designed for and ensures template versions track upstream changes.
+
+**Application Version Variables:**
+- **IMPORTANT**: Application/image versions should be **hardcoded** in template files (e.g., `image: nginx:1.25.3`)
+- Do NOT create template variables for application versions (e.g., no `nginx_version` variable)
+- Users should update the template file directly when they need a different version
+- This prevents version mismatches and ensures templates are tested with specific, known versions
+- Exception: Only create version variables if there's a strong technical reason (e.g., multi-component version pinning)
+
 ### Template Schema Versioning
 
+**Version Format:** Schemas use 2-level versioning in `MAJOR.MINOR` format (e.g., "1.0", "1.2", "2.0").
+
 Templates and modules use schema versioning to ensure compatibility. Each module defines a supported schema version, and templates declare which schema version they use.
 
 ```yaml
 ---
 kind: compose
-schema: "1.0"  # Defaults to 1.0 if not specified
+schema: "X.Y"  # Optional: Defaults to "1.0" if not specified (e.g., "1.0", "1.2")
 metadata:
   name: My Template
   version: 1.0.0
@@ -234,7 +407,7 @@ spec:
 ```
 
 **How It Works:**
-- **Module Schema Version**: Each module defines `schema_version` (e.g., "1.1")
+- **Module Schema Version**: Each module defines `schema_version` (e.g., "1.0", "1.2", "2.0")
 - **Module Spec Loading**: Modules load appropriate spec based on template's schema version
 - **Template Schema Version**: Each template declares `schema` at the top level (defaults to "1.0")
 - **Compatibility Check**: Template schema ≤ Module schema → Compatible
@@ -242,39 +415,39 @@ spec:
 
 **Behavior:**
 - Templates without `schema` field default to "1.0" (backward compatible)
-- Old templates (schema 1.0) work with newer modules (schema 1.1)
-- New templates (schema 1.2) fail on older modules (schema 1.1) with clear error
-- Version comparison uses 2-level versioning (major.minor format)
+- Older templates work with newer module versions (backward compatibility)
+- Templates with newer schema versions fail on older modules with `IncompatibleSchemaVersionError`
+- Version comparison uses MAJOR.MINOR format (e.g., "1.0" < "1.2" < "2.0")
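A hedged sketch of the compatibility check described above (the real implementation raises `IncompatibleSchemaVersionError` rather than returning a boolean):

```python
# Sketch of the MAJOR.MINOR compatibility rule: template schema <= module
# schema means compatible. Comparing integer tuples avoids the lexicographic
# pitfall where "1.10" would sort before "1.9" as a string.
def is_compatible(template_schema: str, module_schema: str) -> bool:
    def as_tuple(version: str):
        return tuple(int(part) for part in version.split("."))

    return as_tuple(template_schema) <= as_tuple(module_schema)


assert is_compatible("1.0", "1.2")      # older template, newer module: OK
assert not is_compatible("2.0", "1.2")  # newer template fails on older module
```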
 
 **When to Use:**
 - Increment module schema version when adding new features (new variable types, sections, etc.)
-- Set template schema when using features from a specific schema
-- Example: Template using new variable type added in schema 1.1 should set `schema: "1.1"`
+- Set template schema when using features from a specific schema version
+- Templates using features from newer schemas must declare the appropriate schema version
 
 **Single-File Module Example:**
 ```python
 class SimpleModule(Module):
   name = "simple"
   description = "Simple module"
-  schema_version = "1.0"
+  schema_version = "X.Y"  # e.g., "1.0", "1.2"
   spec = VariableCollection.from_dict({...})  # Single spec
 ```
 
 **Multi-Schema Module Example:**
 ```python
-# cli/modules/compose/__init__.py
-class ComposeModule(Module):
-  name = "compose"
-  description = "Manage Docker Compose configurations"
-  schema_version = "1.1"  # Highest schema version supported
+# cli/modules/modulename/__init__.py
+import importlib
+
+class ExampleModule(Module):
+  name = "modulename"
+  description = "Module description"
+  schema_version = "X.Y"  # Highest schema version supported (e.g., "1.2", "2.0")
   
   def get_spec(self, template_schema: str) -> VariableCollection:
     """Load spec based on template schema version."""
-    if template_schema == "1.0":
-      from .spec_v1_0 import get_spec
-    elif template_schema == "1.1":
-      from .spec_v1_1 import get_spec
-    return get_spec()
+    # Dynamically load the appropriate spec version
+    # template_schema will be like "1.0", "1.2", etc.
+    version_file = f"spec_v{template_schema.replace('.', '_')}"
+    spec_module = importlib.import_module(f".{version_file}", package=__package__)
+    return spec_module.get_spec()
 ```
 
 **Version Management:**
@@ -287,6 +460,7 @@ class ComposeModule(Module):
 - **Jinja2 Templates (`.j2`)**: Rendered by Jinja2, `.j2` extension removed in output. Support `{% include %}` and `{% import %}`.
 - **Static Files**: Non-`.j2` files copied as-is.
 - **Sanitization**: Auto-sanitized (single blank lines, no leading blanks, trimmed whitespace, single trailing newline).
+- **Shortcodes**: Template descriptions support emoji-style shortcodes (e.g., `:warning:`, `:info:`, `:docker:`) which are automatically replaced with Nerd Font icons during display. Add new shortcodes to `IconManager.SHORTCODES` dict.
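The sanitization rules above can be sketched as a small pure function; this is a minimal illustration, not the CLI's actual code:

```python
# Sketch of output sanitization: single blank lines, no leading blanks,
# trimmed trailing whitespace, exactly one trailing newline.
def sanitize(text: str) -> str:
    lines = [line.rstrip() for line in text.splitlines()]  # trim whitespace
    out = []
    for line in lines:
        # drop leading blanks and collapse runs of blanks to a single one
        if line == "" and (not out or out[-1] == ""):
            continue
        out.append(line)
    while out and out[-1] == "":  # no blank lines at the end
        out.pop()
    return "\n".join(out) + "\n"  # exactly one trailing newline


assert sanitize("\n\na: 1  \n\n\nb: 2\n\n") == "a: 1\n\nb: 2\n"
```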
 
 ### Docker Compose Best Practices
 
@@ -295,18 +469,46 @@ class ComposeModule(Module):
 When using Traefik with Docker Compose, the `traefik.docker.network` label is **CRITICAL** for stacks with multiple networks. When containers are connected to multiple networks, Traefik must know which network to use for routing.
 
 **Implementation:**
-- ALL templates using Traefik MUST follow the patterns in `archetypes/compose/traefik-v1.j2` (standard mode) and `archetypes/compose/swarm-v1.j2` (swarm mode)
-- These archetypes are the authoritative reference for correct Traefik label configuration
+- Review `archetypes/compose/` directory for reference implementations of Traefik integration patterns
 - The `traefik.docker.network={{ traefik_network }}` label must be present in both standard `labels:` and `deploy.labels:` sections
+- Standard mode and Swarm mode require different label configurations - check archetypes for examples
 
 ### Variables
 
-**Precedence** (lowest to highest):
+**How Templates Inherit Variables:**
+
+Templates automatically inherit ALL variables from the module schema version they declare. The template's `schema: "X.Y"` field determines which module spec is loaded, and all variables from that schema are available.
+
+**When to Define Template Variables:**
+
+You only need to define variables in your template's `spec` section when:
+1. **Overriding defaults**: Change default values for module variables (e.g., hardcode `service_name` for your specific app)
+2. **Adding custom variables**: Define template-specific variables not present in the module schema
+3. **Upgrading to newer schema**: To use new features, update `schema: "X.Y"` to a higher version - no template spec changes needed
+
+**Variable Precedence** (lowest to highest):
 1. Module `spec` (defaults for all templates of that kind)
 2. Template `spec` (overrides module defaults)
 3. User `config.yaml` (overrides template and module defaults)
 4. CLI `--var` (highest priority)
 
+**Template Variable Override Rules:**
+- **Override module defaults**: Only specify properties that differ from module spec (e.g., change `default` value)
+- **Create new variables**: Define template-specific variables not in module spec
+- **Minimize duplication**: Do NOT re-specify `type`, `description`, or other properties if they remain unchanged from module spec
+
+**Example:**
+```yaml
+# Template declares schema: "1.2" → inherits ALL variables from compose schema 1.2
+# Template spec ONLY needs to override specific defaults:
+spec:
+  general:
+    vars:
+      service_name:
+        default: whoami  # Only override the default, type already defined in module
+      # All other schema 1.2 variables (network_mode, volume_mode, etc.) are automatically available
+```
+
 **Variable Types:**
 - `str` (default), `int`, `float`, `bool`
 - `email` - Email validation with regex
@@ -324,9 +526,32 @@ When using Traefik with Docker Compose, the `traefik.docker.network` label is **
 - `options` - List of valid values (for enum type)
 
 **Section Features:**
-- **Required Sections**: Mark with `required: true` (general is implicit). Users must provide all values.
 - **Toggle Settings**: Conditional sections via `toggle: "bool_var_name"`. If false, section is skipped.
-- **Dependencies**: Use `needs: "section_name"` or `needs: ["sec1", "sec2"]`. Dependent sections only shown when dependencies are enabled. Auto-validated (detects circular/missing/self dependencies). Topologically sorted.
+  - **IMPORTANT**: When a section has `toggle: "var_name"`, that boolean variable is AUTO-CREATED by the system
+  - Toggle variable behavior may vary by schema version - check current schema documentation
+  - Example: `ports` section with `toggle: "ports_enabled"` automatically provides `ports_enabled` boolean
+- **Dependencies**: Use `needs: "section_name"` or `needs: ["sec1", "sec2"]`. Dependent sections only shown when dependencies are enabled.
+
+**Dependency Resolution Architecture:**
+
+Sections and variables support `needs` constraints to control visibility based on other variables.
+
+**Section-Level Dependencies:**
+- Format: `needs: "section_name"` or `needs: ["sec1", "sec2"]`
+- Section only appears when all required sections are enabled (their toggle variables are true)
+- Automatically validated: detects circular, missing, and self-dependencies
+- Topologically sorted: ensures dependencies are prompted/processed before dependents
+
+**Variable-Level Dependencies:**
+- Format: `needs: "var_name=value"` or `needs: "var1=val1;var2=val2"` (semicolon-separated)
+- Variable only visible when constraint is satisfied (e.g., `needs: "network_mode=bridge"`)
+- Supports multiple values: `needs: "network_mode=bridge,macvlan"` (comma = OR)
+- Evaluated dynamically at prompt and render time
+
+**Validation:**
+- Circular dependencies: Raises error if A needs B and B needs A
+- Missing dependencies: Raises error if referencing non-existent sections/variables
+- Self-dependencies: Raises error if section depends on itself
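A variable-level `needs` check (semicolon = AND, comma = OR) can be sketched as follows; this is an illustrative parser, not the CLI's actual implementation:

```python
# Evaluate a `needs` constraint string against current variable values.
# Every semicolon-separated clause must hold (AND); within a clause, any
# comma-separated value may match (OR).
def needs_satisfied(needs: str, values: dict) -> bool:
    for clause in needs.split(";"):
        name, _, allowed = clause.partition("=")
        if str(values.get(name)) not in allowed.split(","):
            return False
    return True


values = {"traefik_enabled": "true", "network_mode": "macvlan"}
assert needs_satisfied("traefik_enabled=true;network_mode=bridge,macvlan", values)
assert not needs_satisfied("network_mode=host", values)
```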
 
 **Example Section with Dependencies:**
 
@@ -389,96 +614,132 @@ To skip the prompt use the `--no-interactive` flag, which will use defaults or e
 - `list` - List all templates
 - `search <query>` - Search templates by ID
 - `show <id>` - Show template details
-- `generate <id> [directory]` - Generate from template (supports `--dry-run`, `--var`, `--no-interactive`)
-- `validate [id]` - Validate templates (Jinja2 + semantic)
+- `generate <id> -o <directory>` - Generate from template (supports `--dry-run`, `--var`, `--no-interactive`)
+- `validate [template_id]` - Validate template(s) (Jinja2 + semantic). Omit template_id to validate all templates
 - `defaults` - Manage config defaults (`get`, `set`, `rm`, `clear`, `list`)
 
 **Core Commands:**
 - `repo sync` - Sync git-based libraries
 - `repo list` - List configured libraries
 
-## Archetypes Testing Tool
+## Archetypes
 
-The `archetypes` package provides a testing tool for developing and testing individual template snippets (Jinja2 files) without needing a full template directory structure.
+The `archetypes` package provides reusable, standardized template building blocks for creating boilerplates. Archetypes are modular Jinja2 snippets that represent specific configuration sections.
 
 ### Purpose
 
-Archetypes are template "snippets" or "parts" that can be tested in isolation. This is useful for:
-- Developing specific sections of templates (e.g., network configurations, volume mounts)
-- Testing Jinja2 logic with different variable combinations
-- Validating template rendering before integrating into full templates
+1. **Template Development**: Provide standardized, tested building blocks for creating new templates
+2. **Testing & Validation**: Enable testing of specific configuration sections in isolation with different variable combinations
 
 ### Usage
 
 ```bash
-# Run the archetypes tool
-python3 -m archetypes
-
-# List all archetypes for a module
+# List available archetypes for a module
 python3 -m archetypes compose list
 
-# Show details of an archetype (displays variables and content)
-python3 -m archetypes compose show network-v1
+# Preview an archetype component
+python3 -m archetypes compose generate <archetype-name>
+
+# Test with variable overrides
+python3 -m archetypes compose generate <archetype-name> \
+  --var traefik_enabled=true \
+  --var swarm_enabled=true
+
+# Validate templates against archetypes
+python3 -m archetypes compose validate            # All templates
+python3 -m archetypes compose validate <template> # Single template
+```
+
+### Archetype Validation
+
+The `validate` command compares templates against archetypes to measure coverage and identify which archetype patterns are being used.
 
-# Preview generated output (always in preview mode - never writes files)
-python3 -m archetypes compose generate network-v1
+**What it does:**
+- Compares each template file against all available archetypes using **structural pattern matching**
+- Abstracts away specific values to focus on:
+  - **Jinja2 control flow**: `{% if %}`, `{% elif %}`, `{% else %}`, `{% for %}` structures
+  - **YAML structure**: Key names, indentation, and nesting patterns
+  - **Variable usage patterns**: Presence of `{{ }}` placeholders (not specific names)
+  - **Wildcard placeholders**: `__ANY__`, `__ANYSTR__`, `__ANYINT__`, `__ANYBOOL__`
+  - **Repeat markers**: `{# @repeat-start #}` / `{# @repeat-end #}`
+  - **Optional markers**: `{# @optional-start #}` / `{# @optional-end #}`
+- This allows detection of archetypes even when specific values differ (e.g., `grafana_data` vs `alloy_data`)
+- Calculates **containment ratio**: what percentage of each archetype structure is found within the template
+- Reports usage status: **exact** (≥95%), **high** (≥70%), **partial** (≥30%), or **none** (<30%)
+- Provides coverage metrics: (exact + high matches) / total archetypes
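The threshold logic above can be sketched in a few lines of Python (an illustrative sketch of the documented bands only, not the tool's actual implementation; the function names are hypothetical):

```python
def classify_usage(containment: float) -> str:
    """Map a containment ratio (0.0-1.0) to a usage status.

    Thresholds follow the documented bands: exact >= 95%,
    high >= 70%, partial >= 30%, otherwise none.
    """
    if containment >= 0.95:
        return "exact"
    if containment >= 0.70:
        return "high"
    if containment >= 0.30:
        return "partial"
    return "none"


def coverage(statuses: list[str]) -> float:
    """Coverage = (exact + high matches) / total archetypes."""
    if not statuses:
        return 0.0
    return sum(s in ("exact", "high") for s in statuses) / len(statuses)
```

For example, a template whose four archetype comparisons come back as exact, high, partial, and none would report 50% coverage.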
 
-# Preview with variable overrides
-python3 -m archetypes compose generate network-v1 \
-  --var network_mode=macvlan \
-  --var network_macvlan_ipv4_address=192.168.1.100
+### Advanced Pattern Matching in Archetypes
 
-# Preview with reference directory (for context only - no files written)
-python3 -m archetypes compose generate network-v1 /tmp/output --var network_mode=host
+Archetypes support special annotations for flexible pattern matching:
+
+**Wildcard Placeholders** (match any value):
+- `__ANY__` - Matches anything
+- `__ANYSTR__` - Matches any string
+- `__ANYINT__` - Matches any integer
+- `__ANYBOOL__` - Matches any boolean
+
+**Repeat Markers** (pattern can appear 1+ times):
+```yaml
+{# @repeat-start #}
+  pattern
+{# @repeat-end #}
 ```
 
-### Structure
+**Optional Markers** (section may or may not exist):
+```yaml
+{# @optional-start #}
+  pattern
+{# @optional-end #}
+```
 
+**Example:**
+```yaml
+volumes:
+  {# @repeat-start #}
+  __ANY__:
+    driver: local
+  {# @repeat-end #}
 ```
-archetypes/
-  __init__.py           # Package initialization
-  __main__.py           # CLI tool (auto-discovers modules)
-  compose/              # Module-specific archetypes
-    network-v1.j2       # Archetype snippet (just a .j2 file)
-    volumes-v1.j2       # Another archetype
-  terraform/            # Another module's archetypes
-    vpc.j2
+This matches any number of volumes, each with `driver: local`.
+
+**Usage:**
+```bash
+# Validate all templates in library - shows summary table
+python3 -m archetypes compose validate
+
+# Validate specific template - shows detailed archetype breakdown
+python3 -m archetypes compose validate whoami
+
+# Validate templates in custom location
+python3 -m archetypes compose validate --library /path/to/templates
 ```
 
-### Key Features
+**Output:**
+- **Summary mode** (all templates): Table showing exact/high/partial/none counts and coverage % per template
+- **Detail mode** (single template): Table showing each archetype's status, similarity %, and matching file
 
-- **Auto-discovers modules**: Scans `archetypes/` for subdirectories (module names)
-- **Reuses CLI components**: Imports actual CLI classes (Template, VariableCollection, DisplayManager) for identical behavior
-- **Loads module specs**: Pulls variable specifications from `cli/modules/<module>/spec_v*.py` for defaults
-- **Full variable context**: Provides ALL variables with defaults (not just satisfied ones) for complete rendering
-- **Three commands**: `list`, `show`, `generate`
-- **Testing only**: The `generate` command NEVER writes files - it always shows preview output only
+**Use cases:**
+- **Quality assurance**: Ensure templates follow established patterns
+- **Refactoring**: Identify templates that could benefit from archetype alignment
+- **Documentation**: Track which archetypes are most/least used across templates
 
-### Implementation Details
+### Template Development Workflow
+
+1. **Discover**: Use `list` command to see available archetype components for your module
+2. **Review**: Preview archetypes to understand implementation patterns
+3. **Copy**: Copy relevant archetype components to your template directory
+4. **Customize**: Modify as needed (hardcode image, add custom labels, etc.)
+5. **Validate**: Use `compose validate` to check Jinja2 syntax and semantic correctness
+
+### Architecture
+
+**Key Concepts:**
+- Each module can have its own `archetypes/<module>/` directory with reusable components
+- `archetypes.yaml` configures schema version and variable overrides for testing
+- Components are modular Jinja2 files that can be tested in isolation or composition
+- **Testing only**: The `generate` command NEVER writes files - always shows preview output
 
 **How it works:**
-1. Module discovery: Finds subdirectories in `archetypes/` (e.g., `compose`)
-2. For each module, creates a Typer sub-app with list/show/generate commands
-3. Archetype files are simple `.j2` files (no `template.yaml` needed)
-4. Variable defaults come from module spec: `cli/modules/<module>/spec_v*.py`
-5. Rendering uses Jinja2 with full variable context from spec
-
-**ArchetypeTemplate class:**
-- Simplified template wrapper for single .j2 files
-- Loads module spec and converts to VariableCollection
-- Extracts ALL variables (not just satisfied) from spec sections
-- Merges user overrides (`--var`) on top of spec defaults
-- Renders using Jinja2 FileSystemLoader
-
-**Variable defaults source:**
-```python
-# Defaults come from module spec files
-from cli.modules.compose import spec  # OrderedDict with variable definitions
-vc = VariableCollection(spec)         # Convert to VariableCollection
-
-# Extract all variables with their default values
-for section_name, section in vc._sections.items():
-    for var_name, var in section.variables.items():
-        if var.value is not None:  # var.value stores the default
-            render_context[var_name] = var.value
-```
+- Loads module spec based on schema version from `archetypes.yaml`
+- Merges variable sources: module spec → archetypes.yaml → CLI --var
+- Renders using Jinja2 with support for `{% include %}` directives
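The merge precedence can be illustrated with plain dictionary merging, where later sources win (a minimal sketch; the variable names below are hypothetical):

```python
# Precedence, lowest to highest: module spec -> archetypes.yaml -> CLI --var
spec_defaults = {"network_mode": "bridge", "traefik_enabled": False}
archetype_overrides = {"traefik_enabled": True}  # from archetypes.yaml
cli_vars = {"network_mode": "host"}              # from --var flags

# For duplicate keys, later dicts override earlier ones
render_context = {**spec_defaults, **archetype_overrides, **cli_vars}
# render_context now has network_mode from the CLI and
# traefik_enabled from archetypes.yaml
```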

+ 42 - 0
CHANGELOG.md

@@ -7,6 +7,48 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 
 ## [Unreleased]
 
+### Added
+- Variable file support with `--var-file` flag (#1331) - Load variables from YAML file for non-interactive deployments
+- Variable override support for `show` command with `--var` and `--var-file` flags (#1421) - Preview variable overrides before generating
+- Terraform template support (#1422) - Manage Terraform configurations with schema 1.0
+- Kubernetes template support (#1423) - Manage Kubernetes configurations with schema 1.0
+- Helm template support (#1424) - Manage Helm charts with schema 1.0
+- Ansible template support (#1426) - Manage Ansible playbooks with schema 1.0
+- Packer template support (#1427) - Manage Packer templates with schema 1.0
+- Alphabetically sorted commands in help output with grouped panels for better organization
+- Separate help panels for "Template Commands" and "Configuration Commands"
+- Compose Schema 1.2: Port variables (http, https, ssh, dns, dhcp, smtp) - Templates only prompt for ports they use
+- Compose Schema 1.2: Dedicated `volume` section for storage configuration (replaces swarm_volume_* variables)
+- Compose Schema 1.2: `resources` section for CPU and memory limits
+- Compose Schema 1.2: `traefik_domain` variable for base domain configuration (#1362) - Set once, use across all services
+- Compose Schema 1.2: `database_host` now requires `database_external=true`
+- Compose Schema 1.2: `email_encryption` replaces `email_tls` and `email_ssl` with options: none, ssl, tls
+- Markdown formatting support for template descriptions and next steps (#1471)
+- Output directory flag `--output`/`-o` for `generate` command (#1534) - Replaces positional directory argument
+- Variable property `autogenerated_length` to specify custom length for auto-generated values (default: 32 characters)
+- Nerd Font icon support with shortcode replacement in template descriptions - Rich visual feedback using standardized icon system
+
+### Changed
+- Schema is now managed in JSON for better standardization and clarity (#1555)
+- Compose Schema 1.2: Removed `traefik_entrypoint` and `traefik_tls_entrypoint` variables
+- Removed Jinja2 `| default()` filter extraction and merging (#1410) - All defaults must now be defined in template/module specs
+- Refactored all core modules (#1364) from single files into a package structure with dedicated submodules
+- Improved debug logging to capture module discovery and registration during initialization
+- Enhanced debug logging for better troubleshooting
+- Simplified dry-run output to show only essential information (files, sizes, status)
+- Traefik template now uses module spec variable `authentik_traefik_middleware` instead of template-specific `traefik_authentik_middleware_name`
+- `validate` command now accepts template ID as positional argument (e.g., `compose validate netbox`) - Consistent with archetypes command pattern
+- Sections can no longer be required, only variables can be - Simplifies logic and improves usability
+- Variables are now optional by default - only explicitly marked `required: true` variables are required, display shows `(*)` indicator instead of `(required)`
+
+### Deprecated
+- Positional directory argument for `generate` command (#1534) - Use `--output`/`-o` flag instead (will be removed in v0.2.0)
+
+### Fixed
+- CLI --var flag now properly converts boolean and numeric strings to appropriate Python types (#1522)
+- Empty template files are no longer created during generation (#1518)
+- Enhanced user confirmation flow for template generation (#1428)
+
 ## [0.0.7] - 2025-10-28
 
 ### Added

+ 270 - 0
CONTRIBUTING.md

@@ -0,0 +1,270 @@
+# Contributing to Boilerplates
+
+Thank you for your interest in contributing to the Boilerplates project! This document provides guidelines and instructions for contributing.
+
+## Table of Contents
+
+- [Code of Conduct](#code-of-conduct)
+- [How to Contribute](#how-to-contribute)
+- [CLI Development](#cli-development)
+- [Template Contributions](#template-contributions)
+- [Development Setup](#development-setup)
+- [Code Standards](#code-standards)
+- [Testing Guidelines](#testing-guidelines)
+- [Pull Request Process](#pull-request-process)
+
+## Code of Conduct
+
+Be respectful and constructive in all interactions. We're here to build great tools together.
+
+## How to Contribute
+
+### CLI Development
+
+**IMPORTANT:** Any changes to the CLI application (`cli/` directory) require coordination.
+
+**Before making CLI changes:**
+1. Join the [Discord server](https://christianlempa.de/discord)
+2. Reach out to discuss your proposed changes
+3. Wait for approval before opening a PR
+
+**Rationale:** The CLI architecture is complex and tightly integrated. Coordinating changes ensures consistency and prevents conflicts.
+
+### Template Contributions
+
+Template contributions are welcome and encouraged! You can:
+- Add new templates to `library/`
+- Improve existing templates
+- Fix bugs in templates
+- Update template documentation
+
+**Process:**
+1. Read the [Developer Documentation](../../wiki/Developers) in the Wiki
+2. Create a new branch: `feature/###-template-name` or `problem/###-fix-description`
+3. Add or modify templates following the structure in `library/`
+4. Test your template thoroughly
+5. Open a pull request
+
+**No prior approval needed** for template contributions, but feel free to open an issue first to discuss larger changes.
+
+## Development Setup
+
+### Prerequisites
+
+- Python 3.10 or higher
+- Git
+- pipx (recommended) or pip
+
+### Installation
+
+1. Clone the repository:
+```bash
+git clone https://github.com/ChristianLempa/boilerplates.git
+cd boilerplates
+```
+
+2. Create a virtual environment:
+```bash
+python3 -m venv venv
+source venv/bin/activate  # On Windows: venv\Scripts\activate
+```
+
+3. Install dependencies:
+```bash
+pip install -e .
+```
+
+4. Run the CLI in development mode:
+```bash
+python3 -m cli --help
+```
+
+### Development Commands
+
+```bash
+# Run CLI with debug logging
+python3 -m cli --log-level DEBUG compose list
+
+# Test template generation
+python3 -m cli compose generate template-name --dry-run
+
+# Validate templates
+python3 -m cli compose validate
+```
+
+## Code Standards
+
+### Python Style Guide
+
+- Follow PEP 8 conventions
+- Use **4-space indentation** (PEP 8, enforced by `ruff format`)
+- Maximum line length: 100 characters
+- Use type hints where appropriate
+
+### Naming Conventions
+
+- **Files:** lowercase with underscores (`variable_display.py`)
+- **Classes:** PascalCase (`VariableCollection`, `DisplayManager`)
+- **Functions/Methods:** snake_case (`render_template`, `get_spec`)
+- **Constants:** UPPER_SNAKE_CASE (`DEFAULT_TIMEOUT`, `MAX_RETRIES`)
+- **Private methods:** prefix with underscore (`_parse_section`)
+
+### Comment Anchors
+
+Use standardized comment anchors for important notes:
+
+```python
+# TODO: Implement feature X
+# FIXME: Bug in validation logic
+# NOTE: This is a workaround for issue #123
+# LINK: https://docs.python.org/3/library/typing.html
+```
+
+### DisplayManager Usage
+
+**CRITICAL RULE:**
+- NEVER use `console.print()` outside of display manager classes
+- NEVER import `Console` from `rich.console` except in display manager classes
+- ALWAYS use `display.display_*()` methods for ALL output
+
+```python
+# GOOD
+display = DisplayManager()
+display.display_success("Template generated successfully")
+
+# BAD
+from rich.console import Console
+console = Console()
+console.print("Template generated")  # Don't do this!
+```
+
+### Docstrings
+
+Use docstrings for all public classes and methods:
+
+```python
+def render_template(self, template: Template, template_id: str) -> None:
+    """Render a complete template display.
+
+    Args:
+        template: The Template object to render
+        template_id: The template identifier
+    """
+    pass
+```
+
+## Testing Guidelines
+
+### Linting and Formatting
+
+**REQUIRED before committing:**
+
+```bash
+# YAML files
+yamllint library/
+
+# Python code - check and auto-fix
+ruff check --fix .
+
+# Python code - format
+ruff format .
+```
+
+### Validation Commands
+
+```bash
+# Validate all templates
+python3 -m cli compose validate
+
+# Validate specific template
+python3 -m cli compose validate template-name
+
+# Validate with semantic checks
+python3 -m cli compose validate --semantic
+```
+
+### Manual Testing
+
+Before submitting a PR, test your changes:
+
+```bash
+# Test template generation
+python3 -m cli compose generate your-template --dry-run
+
+# Test interactive mode
+python3 -m cli compose generate your-template
+
+# Test non-interactive mode
+python3 -m cli compose generate your-template --output output-dir \
+  --var service_name=test \
+  --no-interactive
+```
+
+## Pull Request Process
+
+### Branch Naming
+
+- **Features:** `feature/###-description` (e.g., `feature/1234-add-nginx-template`)
+- **Bug fixes:** `problem/###-description` (e.g., `problem/1235-fix-validation`)
+
+### Commit Messages
+
+Follow the format: `type(scope): subject`
+
+**Types:**
+- `feat`: New feature
+- `fix`: Bug fix
+- `docs`: Documentation changes
+- `refactor`: Code refactoring
+- `test`: Adding tests
+- `chore`: Maintenance tasks
+
+**Examples:**
+```
+feat(compose): add nginx template
+fix(display): correct variable rendering for enum types
+docs(wiki): update installation instructions
+refactor(template): simplify Jinja2 rendering logic
+```
+
+### PR Checklist
+
+Before submitting a pull request:
+
+- [ ] Code follows style guidelines (run `ruff check` and `ruff format`)
+- [ ] YAML files pass `yamllint`
+- [ ] All templates validate successfully
+- [ ] Changes are tested manually
+- [ ] Commit messages follow conventions
+- [ ] PR description explains the changes
+- [ ] Related issues are referenced (e.g., "Closes #1234")
+
+### PR Review
+
+- PRs require approval before merging
+- Address review comments promptly
+- Keep PRs focused and reasonably sized
+- Squash commits if requested
+
+## Issue Labels
+
+When creating issues, use appropriate labels:
+
+- `feature` - New feature requests
+- `problem` - Bug reports
+- `discussion` - General discussions
+- `question` - Questions about usage
+- `documentation` - Documentation improvements
+
+## Getting Help
+
+- Check the [Wiki](../../wiki) for documentation
+- Join [Discord](https://christianlempa.de/discord) for discussions
+- Open an issue for bugs or feature requests
+- Watch [YouTube tutorials](https://www.youtube.com/@christianlempa)
+
+## License
+
+By contributing, you agree that your contributions will be licensed under the same license as the project.
+
+Thank you for contributing to Boilerplates!

+ 0 - 3
archetypes/__init__.py

@@ -1,3 +0,0 @@
-"""Archetypes testing package for template snippet development."""
-
-__version__ = "0.1.0"

+ 0 - 453
archetypes/__main__.py

@@ -1,453 +0,0 @@
-#!/usr/bin/env python3
-"""
-Archetypes testing tool - for developing and testing template snippets.
-Usage: python3 -m archetypes <module> <command>
-"""
-
-from __future__ import annotations
-
-import logging
-import sys
-from pathlib import Path
-from typing import Optional, Dict, Any, List
-
-from typer import Typer, Argument, Option
-from rich.console import Console
-from rich.table import Table
-from rich.panel import Panel
-
-# Import CLI components
-from cli.core.collection import VariableCollection
-from cli.core.display import DisplayManager
-from cli.core.exceptions import (
-    TemplateRenderError,
-)
-
-app = Typer(
-    help="Test and develop template snippets (archetypes) without full template structure.",
-    add_completion=True,
-    rich_markup_mode="rich",
-)
-console = Console()
-display = DisplayManager()
-
-# Base directory for archetypes
-ARCHETYPES_DIR = Path(__file__).parent
-
-
-def setup_logging(log_level: str = "WARNING") -> None:
-    """Configure logging for debugging."""
-    numeric_level = getattr(logging, log_level.upper(), None)
-    if not isinstance(numeric_level, int):
-        raise ValueError(f"Invalid log level: {log_level}")
-
-    logging.basicConfig(
-        level=numeric_level,
-        format="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
-        datefmt="%Y-%m-%d %H:%M:%S",
-    )
-
-
-class ArchetypeTemplate:
-    """Simplified template for testing individual .j2 files."""
-
-    def __init__(self, file_path: Path, module_name: str):
-        self.file_path = file_path
-        self.module_name = module_name
-        self.id = file_path.stem  # Filename without extension
-        self.template_dir = file_path.parent
-
-        # Create a minimal template.yaml in memory
-        self.metadata = type(
-            "obj",
-            (object,),
-            {
-                "name": f"Archetype: {self.id}",
-                "description": f"Testing archetype from {file_path.name}",
-                "version": "0.1.0",
-                "author": "Testing",
-                "library": "archetype",
-                "tags": ["archetype", "test"],
-            },
-        )()
-
-        # Parse spec from module if available
-        self.variables = self._load_module_spec()
-
-    def _load_module_spec(self) -> Optional[VariableCollection]:
-        """Load variable spec from the module and merge with extension.yaml if present."""
-        try:
-            # Import the module to get its spec
-            if self.module_name == "compose":
-                from cli.modules.compose import spec
-                from collections import OrderedDict
-                import yaml
-
-                # Convert spec to dict if needed
-                if isinstance(spec, (dict, OrderedDict)):
-                    spec_dict = OrderedDict(spec)
-                elif isinstance(spec, VariableCollection):
-                    # Extract dict from existing VariableCollection (shouldn't happen)
-                    spec_dict = OrderedDict()
-                else:
-                    logging.warning(
-                        f"Spec for {self.module_name} has unexpected type: {type(spec)}"
-                    )
-                    return None
-
-                # Check for extension.yaml in the archetype directory
-                extension_file = self.template_dir / "extension.yaml"
-                if extension_file.exists():
-                    try:
-                        with open(extension_file, "r") as f:
-                            extension_vars = yaml.safe_load(f)
-
-                        if extension_vars:
-                            # Apply extension defaults to existing variables in their sections
-                            # Extension vars that don't exist will be added to a "testing" section
-                            applied_count = 0
-                            new_vars = {}
-
-                            for var_name, var_spec in extension_vars.items():
-                                found = False
-                                # Search for the variable in existing sections
-                                for section_name, section_data in spec_dict.items():
-                                    if (
-                                        "vars" in section_data
-                                        and var_name in section_data["vars"]
-                                    ):
-                                        # Update the default value for existing variable
-                                        if "default" in var_spec:
-                                            section_data["vars"][var_name][
-                                                "default"
-                                            ] = var_spec["default"]
-                                            applied_count += 1
-                                            found = True
-                                            break
-
-                                # If variable doesn't exist in spec, add it to testing section
-                                if not found:
-                                    new_vars[var_name] = var_spec
-
-                            # Add new test-only variables to testing section
-                            if new_vars:
-                                if "testing" not in spec_dict:
-                                    spec_dict["testing"] = {
-                                        "title": "Testing Variables",
-                                        "description": "Additional variables for archetype testing",
-                                        "vars": {},
-                                    }
-                                spec_dict["testing"]["vars"].update(new_vars)
-
-                            logging.debug(
-                                f"Applied {applied_count} extension defaults, added {len(new_vars)} new test variables from {extension_file}"
-                            )
-                    except Exception as e:
-                        logging.warning(f"Failed to load extension.yaml: {e}")
-
-                return VariableCollection(spec_dict)
-        except Exception as e:
-            logging.warning(f"Could not load spec for module {self.module_name}: {e}")
-            return None
-
-    def render(self, variables: Optional[Dict[str, Any]] = None) -> Dict[str, str]:
-        """Render the single .j2 file using CLI's Template class."""
-        # Create a minimal template directory structure in memory
-        # by using the Template class's rendering capabilities
-        from jinja2 import Environment, FileSystemLoader, StrictUndefined
-
-        # Set up Jinja2 environment with the archetype directory
-        env = Environment(
-            loader=FileSystemLoader(str(self.template_dir)),
-            undefined=StrictUndefined,
-            trim_blocks=True,
-            lstrip_blocks=True,
-            keep_trailing_newline=True,
-        )
-
-        # Get variable values
-        if variables is None:
-            variables = {}
-
-        # Get default values from spec if available
-        if self.variables:
-            # Get ALL variable values, not just satisfied ones
-            # This is needed for archetype testing where we want full template context
-            # Include None values so templates can properly handle optional variables
-            spec_values = {}
-            for section_name, section in self.variables._sections.items():
-                for var_name, var in section.variables.items():
-                    # Include ALL variables, even if value is None
-                    # This allows Jinja2 templates to handle optional variables properly
-                    spec_values[var_name] = var.value
-            # Merge: CLI variables override spec defaults
-            final_values = {**spec_values, **variables}
-        else:
-            final_values = variables
-
-        try:
-            # Load and render the template
-            template = env.get_template(self.file_path.name)
-            rendered_content = template.render(**final_values)
-
-            # Remove .j2 extension for output filename
-            output_filename = self.file_path.name.replace(".j2", "")
-
-            return {output_filename: rendered_content}
-        except Exception as e:
-            raise TemplateRenderError(f"Failed to render {self.file_path.name}: {e}")
-
-
-def find_archetypes(module_name: str) -> List[Path]:
-    """Find all .j2 files in the module's archetype directory."""
-    module_dir = ARCHETYPES_DIR / module_name
-
-    if not module_dir.exists():
-        console.print(f"[red]Module directory not found: {module_dir}[/red]")
-        return []
-
-    # Find all .j2 files
-    j2_files = list(module_dir.glob("*.j2"))
-    return sorted(j2_files)
-
-
-def create_module_commands(module_name: str) -> Typer:
-    """Create a Typer app with commands for a specific module."""
-    module_app = Typer(help=f"Manage {module_name} archetypes")
-
-    @module_app.command()
-    def list() -> None:
-        """List all archetype files for this module."""
-        archetypes = find_archetypes(module_name)
-
-        if not archetypes:
-            display.display_warning(
-                f"No archetypes found for module '{module_name}'",
-                context=f"directory: {ARCHETYPES_DIR / module_name}",
-            )
-            return
-
-        # Create table
-        table = Table(
-            title=f"Archetypes for '{module_name}'",
-            show_header=True,
-            header_style="bold cyan",
-        )
-        table.add_column("ID", style="cyan")
-        table.add_column("Filename", style="white")
-        table.add_column("Size", style="dim")
-
-        for archetype_path in archetypes:
-            file_size = archetype_path.stat().st_size
-            if file_size < 1024:
-                size_str = f"{file_size}B"
-            else:
-                size_str = f"{file_size / 1024:.1f}KB"
-
-            table.add_row(
-                archetype_path.stem,
-                archetype_path.name,
-                size_str,
-            )
-
-        console.print(table)
-        console.print(f"\n[dim]Found {len(archetypes)} archetype(s)[/dim]")
-
-    @module_app.command()
-    def show(
-        id: str = Argument(..., help="Archetype ID (filename without .j2)"),
-    ) -> None:
-        """Show details of an archetype file."""
-        archetypes = find_archetypes(module_name)
-
-        # Find the archetype
-        archetype_path = None
-        for path in archetypes:
-            if path.stem == id:
-                archetype_path = path
-                break
-
-        if not archetype_path:
-            display.display_error(
-                f"Archetype '{id}' not found", context=f"module '{module_name}'"
-            )
-            return
-
-        # Load archetype
-        archetype = ArchetypeTemplate(archetype_path, module_name)
-
-        # Display details
-        console.print()
-        console.print(
-            Panel(
-                f"[bold]{archetype.metadata.name}[/bold]\n"
-                f"{archetype.metadata.description}\n\n"
-                f"[dim]Module:[/dim] {module_name}\n"
-                f"[dim]File:[/dim] {archetype_path.name}\n"
-                f"[dim]Path:[/dim] {archetype_path}",
-                title="Archetype Details",
-                border_style="cyan",
-            )
-        )
-
-        # Show variables if spec is loaded
-        if archetype.variables:
-            console.print("\n[bold]Available Variables:[/bold]")
-
-            # Access the private _sections attribute
-            for section_name, section in archetype.variables._sections.items():
-                if section.variables:
-                    console.print(
-                        f"\n[cyan]{section.title or section_name.capitalize()}:[/cyan]"
-                    )
-                    for var_name, var in section.variables.items():
-                        default = (
-                            var.value if var.value is not None else "[dim]none[/dim]"
-                        )
-                        console.print(f"  {var_name}: {default}")
-        else:
-            console.print("\n[yellow]No variable spec loaded for this module[/yellow]")
-
-        # Show file content
-        console.print("\n[bold]Template Content:[/bold]")
-        console.print("─" * 80)
-        with open(archetype_path, "r") as f:
-            console.print(f.read())
-        console.print()
-
-    @module_app.command()
-    def generate(
-        id: str = Argument(..., help="Archetype ID (filename without .j2)"),
-        directory: Optional[str] = Argument(
-            None, help="Output directory (for reference only - no files are written)"
-        ),
-        var: Optional[List[str]] = Option(
-            None,
-            "--var",
-            "-v",
-            help="Variable override (KEY=VALUE format)",
-        ),
-    ) -> None:
-        """Generate output from an archetype file (always in preview mode)."""
-        # Archetypes ALWAYS run in dry-run mode with content display
-        # This is a testing tool - it never writes actual files
-
-        archetypes = find_archetypes(module_name)
-
-        # Find the archetype
-        archetype_path = None
-        for path in archetypes:
-            if path.stem == id:
-                archetype_path = path
-                break
-
-        if not archetype_path:
-            display.display_error(
-                f"Archetype '{id}' not found", context=f"module '{module_name}'"
-            )
-            return
-
-        # Load archetype
-        archetype = ArchetypeTemplate(archetype_path, module_name)
-
-        # Parse variable overrides
-        variables = {}
-        if var:
-            for var_option in var:
-                if "=" in var_option:
-                    key, value = var_option.split("=", 1)
-                    variables[key] = value
-                else:
-                    console.print(
-                        f"[yellow]Warning: Invalid --var format '{var_option}' (use KEY=VALUE)[/yellow]"
-                    )
-
-        # Render the archetype
-        try:
-            rendered_files = archetype.render(variables)
-        except Exception as e:
-            display.display_error(
-                f"Failed to render archetype: {e}", context=f"archetype '{id}'"
-            )
-            return
-
-        # Determine output directory (for display purposes only)
-        if directory:
-            output_dir = Path(directory)
-        else:
-            output_dir = Path.cwd()
-
-        # Always show preview (archetypes never write files)
-        console.print()
-        console.print("[bold cyan]Archetype Preview (Testing Mode)[/bold cyan]")
-        console.print(
-            "[dim]This tool never writes files - it's for testing template snippets only[/dim]"
-        )
-        console.print()
-        console.print(f"[dim]Reference directory:[/dim] {output_dir}")
-        console.print(f"[dim]Files to preview:[/dim] {len(rendered_files)}")
-        console.print()
-
-        for filename, content in rendered_files.items():
-            full_path = output_dir / filename
-            status = "Would overwrite" if full_path.exists() else "Would create"
-            size = len(content.encode("utf-8"))
-            console.print(f"  [{status}] {filename} ({size} bytes)")
-
-        console.print()
-        console.print("[bold]Rendered Content:[/bold]")
-        console.print("─" * 80)
-        for filename, content in rendered_files.items():
-            console.print(content)
-
-        console.print()
-        display.display_success("Preview complete - no files were written")
-
-    return module_app
-
-
-def init_app() -> None:
-    """Initialize the application by discovering modules and registering commands."""
-    # Find all module directories in archetypes/
-    if ARCHETYPES_DIR.exists():
-        for module_dir in ARCHETYPES_DIR.iterdir():
-            if module_dir.is_dir() and not module_dir.name.startswith(("_", ".")):
-                module_name = module_dir.name
-                # Register module commands
-                module_app = create_module_commands(module_name)
-                app.add_typer(module_app, name=module_name)
-
-
-@app.callback(invoke_without_command=True)
-def main(
-    log_level: Optional[str] = Option(
-        None,
-        "--log-level",
-        help="Set logging level (DEBUG, INFO, WARNING, ERROR)",
-    ),
-) -> None:
-    """Archetypes testing tool for template snippet development."""
-    if log_level:
-        setup_logging(log_level)
-    else:
-        logging.disable(logging.CRITICAL)
-
-    import click
-
-    ctx = click.get_current_context()
-
-    if ctx.invoked_subcommand is None:
-        console.print(ctx.get_help())
-        sys.exit(0)
-
-
-if __name__ == "__main__":
-    try:
-        init_app()
-        app()
-    except KeyboardInterrupt:
-        console.print("\n[yellow]Operation cancelled[/yellow]")
-        sys.exit(130)
-    except Exception as e:
-        console.print(f"[bold red]Error:[/bold red] {e}")
-        sys.exit(1)
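
The removed tool's `--var KEY=VALUE` handling above splits on the first `=` only, so values may themselves contain `=` (useful for connection strings). A standalone sketch of that parsing logic, with hypothetical names not taken from the codebase:

```python
def parse_var_options(options: list[str]) -> tuple[dict[str, str], list[str]]:
    """Parse --var KEY=VALUE options into a dict, collecting malformed entries.

    Mirrors the split("=", 1) approach in the deleted code: only the first
    '=' separates key from value, so values may contain further '=' signs.
    """
    variables: dict[str, str] = {}
    invalid: list[str] = []
    for opt in options:
        if "=" in opt:
            key, value = opt.split("=", 1)
            variables[key] = value
        else:
            invalid.append(opt)
    return variables, invalid
```

A caller would warn on each entry in `invalid` rather than aborting, as the deleted command did.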

+ 0 - 20
archetypes/compose/configs-v1.j2

@@ -1,20 +0,0 @@
-{#
-  Archetype: toplevel-configs-v1
-  
-  Description:
-    Swarm configs definition from file source.
-  
-  Approach:
-    - Only applies to swarm mode
-    - Reads config from file at deploy time
-    - Configs are immutable once created
-  
-  Usage:
-    Use with service-configs-v1 for configuration file management.
-    Create configuration file before deploying stack.
-#}
-{% if swarm_enabled %}
-configs:
-  {{ config_name }}:
-    file: ./config/app.yaml
-{% endif %}

+ 0 - 134
archetypes/compose/extension.yaml

@@ -1,134 +0,0 @@
----
-# Extension variables for archetype testing
-# These variables provide defaults for variables that have no default in the module spec
-# or add custom variables specifically needed for archetype testing
-
-# Variables from spec that need defaults for testing
-service_name:
-  type: str
-  description: Service name for testing
-  default: testapp
-
-container_name:
-  type: str
-  description: Container name for testing
-  default: testapp-container
-
-container_hostname:
-  type: str
-  description: Container hostname for testing
-  default: testapp-host
-
-traefik_host:
-  type: hostname
-  description: Traefik host for testing
-  default: app.example.com
-
-database_port:
-  type: int
-  description: Database port for testing
-  default: 5432
-
-database_name:
-  type: str
-  description: Database name for testing
-  default: testdb
-
-database_user:
-  type: str
-  description: Database user for testing
-  default: dbuser
-
-database_password:
-  type: str
-  description: Database password for testing
-  default: secretpassword123
-  sensitive: true
-
-email_host:
-  type: str
-  description: Email server host for testing
-  default: smtp.example.com
-
-email_username:
-  type: str
-  description: Email username for testing
-  default: noreply@example.com
-
-email_password:
-  type: str
-  description: Email password for testing
-  default: emailpass123
-  sensitive: true
-
-email_from:
-  type: str
-  description: Email from address for testing
-  default: noreply@example.com
-
-authentik_url:
-  type: url
-  description: Authentik URL for testing
-  default: https://auth.example.com
-
-authentik_slug:
-  type: str
-  description: Authentik application slug for testing
-  default: testapp
-
-authentik_client_id:
-  type: str
-  description: Authentik client ID for testing
-  default: client_id_12345
-
-authentik_client_secret:
-  type: str
-  description: Authentik client secret for testing
-  default: client_secret_abcdef
-  sensitive: true
-
-# Custom variables specific to archetype testing (not in module spec)
-network_enabled:
-  type: bool
-  description: Enable network configuration for testing
-  default: true
-
-volume_external:
-  type: bool
-  description: Use external volume for testing
-  default: false
-
-ports_http:
-  type: int
-  description: HTTP port for testing
-  default: 8080
-
-secret_name:
-  type: str
-  description: Secret name for testing
-  default: app_secret
-
-config_name:
-  type: str
-  description: Config name for testing
-  default: app_config
-
-service_image:
-  type: str
-  description: Service image for testing
-  default: nginx:alpine
-
-service_port:
-  type: int
-  description: Service port for testing
-  default: 8080
-
-volume_name:
-  type: str
-  description: Volume name for testing
-  default: app_data
-
-traefik_middleware:
-  type: str
-  description: Traefik middleware for testing
-  default: auth@file

+ 0 - 50
archetypes/compose/networks-v1.j2

@@ -1,50 +0,0 @@
-{#
-  Archetype: networks-v1
-  
-  Description:
-    Consolidated top-level networks section supporting multiple modes:
-    - Bridge: Simple bridge network for standalone deployments
-    - External: Reference pre-existing networks
-    - Macvlan: L2 network access with static IP assignment
-    - Swarm: Overlay networks for multi-node swarm clusters
-  
-  Approach:
-    - Conditionally creates network based on network_mode or network_enabled
-    - Supports external networks (network_external flag)
-    - Macvlan includes IPAM configuration
-    - Swarm mode uses overlay driver with attachable option
-    - Always includes Traefik network as external when enabled
-  
-  Usage:
-    Use as the single networks archetype for all deployment types.
-    Adapts based on network_mode, swarm_enabled, and network_external variables.
-#}
-{% if network_enabled or traefik_enabled %}
-networks:
-  {% if network_enabled %}
-  {{ network_name }}:
-    {% if network_external %}
-    external: true
-    {% else %}
-    {% if network_mode == 'macvlan' %}
-    driver: macvlan
-    driver_opts:
-      parent: {{ network_macvlan_parent_interface }}
-    ipam:
-      config:
-        - subnet: {{ network_macvlan_subnet }}
-          gateway: {{ network_macvlan_gateway }}
-    name: {{ network_name }}
-    {% elif swarm_enabled %}
-    driver: overlay
-    attachable: true
-    {% else %}
-    driver: bridge
-    {% endif %}
-    {% endif %}
-  {% endif %}
-  {% if traefik_enabled %}
-  {{ traefik_network }}:
-    external: true
-  {% endif %}
-{% endif %}
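
The branching in the deleted `networks-v1` template above can be summarized as a small decision function. This is an illustrative Python sketch, not code from the repo; it omits the macvlan IPAM block and the Traefik network that the real template also emits:

```python
def resolve_network_driver(network_mode: str, swarm_enabled: bool, network_external: bool) -> dict:
    """Reproduce the template's precedence: external beats everything,
    then macvlan, then swarm overlay, with bridge as the fallback."""
    if network_external:
        return {"external": True}
    if network_mode == "macvlan":
        return {"driver": "macvlan"}
    if swarm_enabled:
        return {"driver": "overlay", "attachable": True}
    return {"driver": "bridge"}
```

Note that macvlan wins even in swarm mode, matching the template's `{% if %}`/`{% elif %}` ordering.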

+ 0 - 20
archetypes/compose/secrets-v1.j2

@@ -1,20 +0,0 @@
-{#
-  Archetype: toplevel-secrets-v1
-  
-  Description:
-    Swarm secrets definition from file source.
-  
-  Approach:
-    - Only applies to swarm mode
-    - Reads secret from file at deploy time
-    - Secrets are encrypted in swarm
-  
-  Usage:
-    Use with service-secrets-v1 for secure credential management.
-    Create .env.secret file containing the secret value.
-#}
-{% if swarm_enabled %}
-secrets:
-  {{ secret_name }}:
-    file: ./.env.secret
-{% endif %}

+ 0 - 20
archetypes/compose/service-configs-v1.j2

@@ -1,20 +0,0 @@
-{#
-  Archetype: service-configs-v1
-  
-  Description:
-    Swarm configs reference for configuration files.
-  
-  Approach:
-    - Only applies to swarm mode
-    - References configs defined in top-level configs section
-    - Configs mounted at specified target path
-  
-  Usage:
-    Use for application configuration files in swarm.
-    Requires corresponding toplevel-configs-v1 archetype.
-#}
-    {% if swarm_enabled %}
-    configs:
-      - source: {{ config_name }}
-        target: /etc/app/config.yaml
-    {% endif %}

+ 0 - 38
archetypes/compose/service-deploy-v1.j2

@@ -1,38 +0,0 @@
-{#
-  Archetype: service-deploy-traefik-v1
-  
-  Description:
-    Swarm deployment with Traefik labels in deploy section.
-  
-  Approach:
-    - Labels must be in deploy section for swarm mode
-    - Includes full HTTP + HTTPS Traefik configuration
-    - Critical: traefik.docker.network label for multi-network containers
-  
-  Usage:
-    Use for swarm services exposed through Traefik.
-    Combines with service-labels-traefik-https-v1 for standalone mode.
-#}
-    {% if swarm_enabled and traefik_enabled %}
-    deploy:
-      mode: {{ swarm_placement_mode }}
-      {% if swarm_placement_mode == 'replicated' %}
-      replicas: {{ swarm_replicas }}
-      {% endif %}
-      restart_policy:
-        condition: on-failure
-      labels:
-        - traefik.enable=true
-        - traefik.docker.network={{ traefik_network }}
-        - traefik.http.services.{{ service_name }}-web.loadBalancer.server.port={{ service_port }}
-        - traefik.http.routers.{{ service_name }}-http.service={{ service_name }}-web
-        - traefik.http.routers.{{ service_name }}-http.rule=Host(`{{ traefik_host }}`)
-        - traefik.http.routers.{{ service_name }}-http.entrypoints={{ traefik_entrypoint }}
-        {% if traefik_tls_enabled %}
-        - traefik.http.routers.{{ service_name }}-https.service={{ service_name }}-web
-        - traefik.http.routers.{{ service_name }}-https.rule=Host(`{{ traefik_host }}`)
-        - traefik.http.routers.{{ service_name }}-https.entrypoints={{ traefik_tls_entrypoint }}
-        - traefik.http.routers.{{ service_name }}-https.tls=true
-        - traefik.http.routers.{{ service_name }}-https.tls.certresolver={{ traefik_tls_certresolver }}
-        {% endif %}
-    {% endif %}

+ 0 - 25
archetypes/compose/service-environment-v1.j2

@@ -1,25 +0,0 @@
-{#
-  Archetype: service-environment-v1
-  
-  Description:
-    Environment variables for common container configurations.
-  
-  Approach:
-    - Sets standard environment variables (timezone, UID/GID)
-    - Demonstrates secret handling: file-based for swarm, env var for standalone
-    - Uses user_uid/user_gid from module spec general section
-  
-  Usage:
-    Use for services that need timezone and user/group configuration.
-    Adapt the secret handling pattern for your specific secret variables.
-    Replace SECRET example with actual secret variable names as needed.
-#}
-    environment:
-      - TZ={{ container_timezone }}
-      - UID={{ user_uid }}
-      - GID={{ user_gid }}
-      {% if swarm_enabled %}
-      - SECRET=/run/secrets/{{ secret_name }}
-      {% else %}
-      - SECRET=${SECRET}
-      {% endif %}

+ 0 - 33
archetypes/compose/service-labels-v1.j2

@@ -1,33 +0,0 @@
-{#
-  Archetype: service-labels-traefik-middleware-v1
-  
-  Description:
-    Traefik labels with middleware support for authentication, headers, etc.
-  
-  Approach:
-    - Extends HTTPS configuration with middleware assignment
-    - Middlewares applied to both HTTP and HTTPS routers
-    - Supports chaining multiple middlewares (comma-separated)
-  
-  Usage:
-    Use when you need authentication, rate limiting, headers, or other
-    Traefik middleware features. Define middlewares in Traefik config or labels.
-#}
-    {% if traefik_enabled %}
-    labels:
-      - traefik.enable=true
-      - traefik.docker.network={{ traefik_network }}
-      - traefik.http.services.{{ service_name }}-web.loadBalancer.server.port={{ service_port }}
-      - traefik.http.routers.{{ service_name }}-http.service={{ service_name }}-web
-      - traefik.http.routers.{{ service_name }}-http.rule=Host(`{{ traefik_host }}`)
-      - traefik.http.routers.{{ service_name }}-http.entrypoints={{ traefik_entrypoint }}
-      - traefik.http.routers.{{ service_name }}-http.middlewares={{ traefik_middleware }}
-      {% if traefik_tls_enabled %}
-      - traefik.http.routers.{{ service_name }}-https.service={{ service_name }}-web
-      - traefik.http.routers.{{ service_name }}-https.rule=Host(`{{ traefik_host }}`)
-      - traefik.http.routers.{{ service_name }}-https.entrypoints={{ traefik_tls_entrypoint }}
-      - traefik.http.routers.{{ service_name }}-https.tls=true
-      - traefik.http.routers.{{ service_name }}-https.tls.certresolver={{ traefik_tls_certresolver }}
-      - traefik.http.routers.{{ service_name }}-https.middlewares={{ traefik_middleware }}
-      {% endif %}
-    {% endif %}

+ 0 - 30
archetypes/compose/service-networks-v1.j2

@@ -1,30 +0,0 @@
-{#
-  Archetype: service-networks-macvlan-v1
-  
-  Description:
-    Network configuration supporting host, bridge, and macvlan modes.
-  
-  Approach:
-    - Host mode: Uses network_mode: host (no networks section)
-    - Macvlan mode: Assigns static IP address
-    - Bridge mode: Simple network attachment
-    - Always includes Traefik network if enabled
-  
-  Usage:
-    Use for services that need specific network modes (e.g., Pi-hole with macvlan).
-    Requires network_mode variable ('host', 'bridge', or 'macvlan').
-#}
-    {% if network_mode == 'host' %}
-    network_mode: host
-    {% else %}
-    networks:
-      {% if traefik_enabled %}
-      {{ traefik_network }}:
-      {% endif %}
-      {% if network_mode == 'macvlan' %}
-      {{ network_name }}:
-        ipv4_address: {{ network_macvlan_ipv4_address }}
-      {% elif network_mode == 'bridge' %}
-      {{ network_name }}:
-      {% endif %}
-    {% endif %}

+ 0 - 26
archetypes/compose/service-ports-v1.j2

@@ -1,26 +0,0 @@
-{#
-  Archetype: service-ports-conditional-v1
-  
-  Description:
-    Port mappings that are only exposed when Traefik is disabled.
-  
-  Approach:
-    - Swarm mode: Uses long syntax with mode:host for proper host binding
-    - Standalone mode: Uses short syntax for simplicity
-    - Conditionally skipped if Traefik handles routing
-  
-  Usage:
-    Use for HTTP/HTTPS services that can be proxied through Traefik.
-    Ports are only exposed directly when traefik_enabled=false.
-#}
-    {% if not traefik_enabled %}
-    ports:
-      {% if swarm_enabled %}
-      - target: {{ service_port }}
-        published: {{ ports_http }}
-        protocol: tcp
-        mode: host
-      {% else %}
-      - "{{ ports_http }}:{{ service_port }}"
-      {% endif %}
-    {% endif %}

+ 0 - 20
archetypes/compose/service-secrets-v1.j2

@@ -1,20 +0,0 @@
-{#
-  Archetype: service-secrets-v1
-  
-  Description:
-    Swarm secrets reference for sensitive data.
-  
-  Approach:
-    - Only applies to swarm mode
-    - References secrets defined in top-level secrets section
-    - Secrets mounted at /run/secrets/<secret_name>
-  
-  Usage:
-    Use for passwords, API keys, certificates in swarm.
-    Requires corresponding secrets-v1 (top-level) archetype.
-    Must be used within a service definition.
-#}
-{% if swarm_enabled %}
-    secrets:
-      - {{ secret_name }}
-{% endif %}

+ 0 - 23
archetypes/compose/service-v1.j2

@@ -1,23 +0,0 @@
-{#
-  Archetype: service-basic-v1
-  
-  Description:
-    Basic service definition with image, container name (non-swarm), and hostname.
-    This is the foundation for any Docker Compose service.
-  
-  Approach:
-    - Defines the service name and image
-    - Conditionally adds container_name only for non-swarm deployments
-    - Sets hostname for service identification
-  
-  Usage:
-    Use this as the starting point for any service definition.
-#}
-services:
-  {{ service_name }}:
-    image: {{ service_image }}
-    {% if not swarm_enabled %}
-    restart: {{ restart_policy }}
-    container_name: {{ container_name }}
-    {% endif %}
-    hostname: {{ container_hostname }}

+ 0 - 26
archetypes/compose/service-volumes-v1.j2

@@ -1,26 +0,0 @@
-{#
-  Archetype: service-volumes-v1
-  
-  Description:
-    Service volume mounts supporting standalone and swarm modes.
-  
-  Approach:
-    - Standalone mode: Uses named volumes
-    - Swarm mount mode: Uses bind mounts from swarm_volume_mount_path
-    - Swarm local/nfs mode: Uses named volumes
-  
-  Usage:
-    Use for services that need persistent storage.
-    Follows the pattern from pihole template.
-    Uses volume_name variable for named volumes.
-#}
-    volumes:
-      {% if not swarm_enabled %}
-      - {{ volume_name }}:/data
-      {% else %}
-      {% if swarm_volume_mode == 'mount' %}
-      - {{ swarm_volume_mount_path }}/data:/data:rw
-      {% elif swarm_volume_mode in ['local', 'nfs'] %}
-      - {{ volume_name }}:/data
-      {% endif %}
-      {% endif %}

+ 0 - 40
archetypes/compose/volumes-v1.j2

@@ -1,40 +0,0 @@
-{#
-  Archetype: volumes-v1
-  
-  Description:
-    Consolidated top-level volumes section supporting multiple modes:
-    - Simple: Basic local volumes for standalone deployments
-    - External: Reference pre-existing volumes
-    - NFS: Network filesystem for shared storage in swarm
-    - Swarm: Flexible mode supporting mount/local/NFS strategies
-  
-  Approach:
-    - External volumes: No definition needed (external: true not used at top-level)
-    - Standalone mode: Always uses local volumes
-    - Swarm mode with mount: No volume definition (uses bind mounts)
-    - Swarm mode with local: Simple local volumes
-    - Swarm mode with NFS: Network filesystem with driver options
-  
-  Usage:
-    Use as the single volumes archetype for all deployment types.
-    Adapts based on volume_external, swarm_enabled, and swarm_volume_mode variables.
-#}
-{% if not volume_external %}
-{% if swarm_enabled %}
-{% if swarm_volume_mode in ['local', 'nfs'] %}
-volumes:
-  {{ volume_name }}:
-    {% if swarm_volume_mode == 'nfs' %}
-    driver: local
-    driver_opts:
-      type: nfs
-      o: addr={{ swarm_volume_nfs_server }},{{ swarm_volume_nfs_options }}
-      device: ":{{ swarm_volume_nfs_path }}"
-    {% endif %}
-{% endif %}
-{% else %}
-volumes:
-  {{ volume_name }}:
-    driver: local
-{% endif %}
-{% endif %}

+ 1 - 1
cli/__init__.py

@@ -2,6 +2,6 @@
 Boilerplates CLI - A sophisticated command-line tool for managing infrastructure boilerplates.
 """
 
-__version__ = "0.0.7"
+__version__ = "0.1.0"
 __author__ = "Christian Lempa"
 __description__ = "CLI tool for managing infrastructure boilerplates"

+ 118 - 82
cli/__main__.py

@@ -11,21 +11,41 @@ import logging
 import pkgutil
 import sys
 from pathlib import Path
-from typing import Optional
-from typer import Typer, Option
+
+import click
 from rich.console import Console
+from typer import Option, Typer
+from typer.core import TyperGroup
+
 import cli.modules
-from cli.core.registry import registry
-from cli.core import repo
 from cli import __version__
-# Using standard Python exceptions instead of custom ones
+from cli.core import repo
+from cli.core.display import DisplayManager
+from cli.core.registry import registry
+
+
+class OrderedGroup(TyperGroup):
+    """Typer Group that lists commands in alphabetical order."""
+
+    def list_commands(self, ctx: click.Context) -> list[str]:
+        return sorted(super().list_commands(ctx))
+
 
 app = Typer(
-    help="CLI tool for managing infrastructure boilerplates.\n\n[dim]Easily generate, customize, and deploy templates for Docker Compose, Terraform, Kubernetes, and more.\n\n [white]Made with 💜 by [bold]Christian Lempa[/bold]",
+    help=(
+        "CLI tool for managing infrastructure boilerplates.\n\n"
+        "[dim]Easily generate, customize, and deploy templates for Docker Compose, "
+        "Terraform, Kubernetes, and more.\n\n "
+        "[white]Made with 💜 by [bold]Christian Lempa[/bold]"
+    ),
     add_completion=True,
     rich_markup_mode="rich",
+    pretty_exceptions_enable=False,
+    no_args_is_help=True,
+    cls=OrderedGroup,
 )
 console = Console()
+display = DisplayManager()
 
 
 def setup_logging(log_level: str = "WARNING") -> None:
@@ -40,9 +60,7 @@ def setup_logging(log_level: str = "WARNING") -> None:
     """
     numeric_level = getattr(logging, log_level.upper(), None)
     if not isinstance(numeric_level, int):
-        raise ValueError(
-            f"Invalid log level '{log_level}'. Valid levels: DEBUG, INFO, WARNING, ERROR, CRITICAL"
-        )
+        raise ValueError(f"Invalid log level '{log_level}'. Valid levels: DEBUG, INFO, WARNING, ERROR, CRITICAL")
 
     try:
         logging.basicConfig(
@@ -54,27 +72,24 @@ def setup_logging(log_level: str = "WARNING") -> None:
         logger = logging.getLogger(__name__)
         logger.setLevel(numeric_level)
     except Exception as e:
-        raise RuntimeError(f"Failed to configure logging: {e}")
+        raise RuntimeError(f"Failed to configure logging: {e}") from e
 
 
 @app.callback(invoke_without_command=True)
 def main(
-    version: Optional[bool] = Option(
+    _version: bool | None = Option(
         None,
         "--version",
         "-v",
         help="Show the application version and exit.",
         is_flag=True,
-        callback=lambda v: console.print(f"boilerplates version {__version__}")
-        or sys.exit(0)
-        if v
-        else None,
+        callback=lambda v: console.print(f"boilerplates version {__version__}") or sys.exit(0) if v else None,
         is_eager=True,
     ),
-    log_level: Optional[str] = Option(
+    log_level: str | None = Option(
         None,
         "--log-level",
-        help="Set the logging level (DEBUG, INFO, WARNING, ERROR, CRITICAL). If omitted, logging is disabled.",
+        help=("Set the logging level (DEBUG, INFO, WARNING, ERROR, CRITICAL). If omitted, logging is disabled."),
     ),
 ) -> None:
     """CLI tool for managing infrastructure boilerplates."""
@@ -88,8 +103,6 @@ def main(
         logging.disable(logging.CRITICAL)
 
     # Get context without type annotation (compatible with all Typer versions)
-    import click
-
     ctx = click.get_current_context()
 
     # Store log level in context for potential use by other commands
@@ -102,6 +115,67 @@ def main(
         sys.exit(0)
 
 
+def _import_modules(modules_path: Path, logger: logging.Logger) -> list[str]:
+    """Import all modules and return list of failures."""
+    failed_imports = []
+    for _finder, name, ispkg in pkgutil.iter_modules([str(modules_path)]):
+        if not name.startswith("_") and name != "base":
+            try:
+                logger.debug(f"Importing module: {name} ({'package' if ispkg else 'file'})")
+                importlib.import_module(f"cli.modules.{name}")
+            except ImportError as e:
+                error_info = f"Import failed for '{name}': {e!s}"
+                failed_imports.append(error_info)
+                logger.warning(error_info)
+            except Exception as e:
+                error_info = f"Unexpected error importing '{name}': {e!s}"
+                failed_imports.append(error_info)
+                logger.error(error_info)
+    return failed_imports
+
+
+def _register_repo_command(logger: logging.Logger) -> list[str]:
+    """Register repo command and return list of failures."""
+    failed = []
+    try:
+        logger.debug("Registering repo command")
+        repo.register_cli(app)
+    except Exception as e:
+        error_info = f"Repo command registration failed: {e!s}"
+        failed.append(error_info)
+        logger.warning(error_info)
+    return failed
+
+
+def _register_module_classes(logger: logging.Logger) -> tuple[list, list[str]]:
+    """Register template-based modules and return (module_classes, failures)."""
+    failed_registrations = []
+    module_classes = list(registry.iter_module_classes())
+    logger.debug(f"Registering {len(module_classes)} template-based modules")
+
+    for _name, module_cls in module_classes:
+        try:
+            logger.debug(f"Registering module class: {module_cls.__name__}")
+            module_cls.register_cli(app)
+        except Exception as e:
+            error_info = f"Registration failed for '{module_cls.__name__}': {e!s}"
+            failed_registrations.append(error_info)
+            logger.warning(error_info)
+            display.warning(error_info)
+
+    return module_classes, failed_registrations
+
+
+def _build_error_details(failed_imports: list[str], failed_registrations: list[str]) -> str:
+    """Build detailed error message from failures."""
+    error_details = []
+    if failed_imports:
+        error_details.extend(["Import failures:"] + [f"  - {err}" for err in failed_imports])
+    if failed_registrations:
+        error_details.extend(["Registration failures:"] + [f"  - {err}" for err in failed_registrations])
+    return "\n".join(error_details) if error_details else ""
+
+
 def init_app() -> None:
     """Initialize the application by discovering and registering modules.
 
@@ -117,102 +191,64 @@ def init_app() -> None:
         # Auto-discover and import all modules
         modules_path = Path(cli.modules.__file__).parent
         logger.debug(f"Discovering modules in {modules_path}")
-
-        for finder, name, ispkg in pkgutil.iter_modules([str(modules_path)]):
-            # Import both module files and packages (for multi-schema modules)
-            if not name.startswith("_") and name != "base":
-                try:
-                    logger.debug(
-                        f"Importing module: {name} ({'package' if ispkg else 'file'})"
-                    )
-                    importlib.import_module(f"cli.modules.{name}")
-                except ImportError as e:
-                    error_info = f"Import failed for '{name}': {str(e)}"
-                    failed_imports.append(error_info)
-                    logger.warning(error_info)
-                except Exception as e:
-                    error_info = f"Unexpected error importing '{name}': {str(e)}"
-                    failed_imports.append(error_info)
-                    logger.error(error_info)
+        failed_imports = _import_modules(modules_path, logger)
 
         # Register core repo command
-        try:
-            logger.debug("Registering repo command")
-            repo.register_cli(app)
-        except Exception as e:
-            error_info = f"Repo command registration failed: {str(e)}"
-            failed_registrations.append(error_info)
-            logger.warning(error_info)
-
-        # Register template-based modules with app
-        module_classes = list(registry.iter_module_classes())
-        logger.debug(f"Registering {len(module_classes)} template-based modules")
+        repo_failures = _register_repo_command(logger)
 
-        for name, module_cls in module_classes:
-            try:
-                logger.debug(f"Registering module class: {module_cls.__name__}")
-                module_cls.register_cli(app)
-            except Exception as e:
-                error_info = (
-                    f"Registration failed for '{module_cls.__name__}': {str(e)}"
-                )
-                failed_registrations.append(error_info)
-                # Log warning but don't raise exception for individual module failures
-                logger.warning(error_info)
-                console.print(f"[yellow]Warning:[/yellow] {error_info}")
+        # Register template-based modules
+        module_classes, failed_registrations = _register_module_classes(logger)
+        failed_registrations.extend(repo_failures)
 
-        # If we have no modules registered at all, that's a critical error
+        # Validate we have modules
         if not module_classes and not failed_imports:
             raise RuntimeError("No modules found to register")
 
         # Log summary
         successful_modules = len(module_classes) - len(failed_registrations)
-        logger.info(
-            f"Application initialized: {successful_modules} modules registered successfully"
-        )
-
+        logger.info(f"Application initialized: {successful_modules} modules registered successfully")
         if failed_imports:
             logger.info(f"Module import failures: {len(failed_imports)}")
         if failed_registrations:
             logger.info(f"Module registration failures: {len(failed_registrations)}")
 
     except Exception as e:
-        error_details = []
-        if failed_imports:
-            error_details.extend(
-                ["Import failures:"] + [f"  - {err}" for err in failed_imports]
-            )
-        if failed_registrations:
-            error_details.extend(
-                ["Registration failures:"]
-                + [f"  - {err}" for err in failed_registrations]
-            )
-
-        details = "\n".join(error_details) if error_details else str(e)
-        raise RuntimeError(f"Application initialization failed: {details}")
+        details = _build_error_details(failed_imports, failed_registrations) or str(e)
+        raise RuntimeError(f"Application initialization failed: {details}") from e
 
 
 def run() -> None:
     """Run the CLI application."""
+    # Configure logging early if --log-level is provided
+    if "--log-level" in sys.argv:
+        try:
+            log_level_index = sys.argv.index("--log-level") + 1
+            if log_level_index < len(sys.argv):
+                log_level = sys.argv[log_level_index]
+                logging.disable(logging.NOTSET)
+                setup_logging(log_level)
+        except (ValueError, IndexError):
+            pass  # Let Typer handle argument parsing errors
+
     try:
         init_app()
         app()
     except (ValueError, RuntimeError) as e:
         # Handle configuration and initialization errors cleanly
-        console.print(f"[bold red]Error:[/bold red] {e}")
+        display.error(str(e))
         sys.exit(1)
     except ImportError as e:
         # Handle module import errors with detailed info
-        console.print(f"[bold red]Module Import Error:[/bold red] {e}")
+        display.error(f"Module Import Error: {e}")
         sys.exit(1)
     except KeyboardInterrupt:
         # Handle Ctrl+C gracefully
-        console.print("\n[yellow]Operation cancelled by user[/yellow]")
+        display.warning("Operation cancelled by user")
         sys.exit(130)
     except Exception as e:
         # Handle unexpected errors - show simplified message
-        console.print(f"[bold red]Unexpected error:[/bold red] {e}")
-        console.print("[dim]Use --log-level DEBUG for more details[/dim]")
+        display.error(str(e))
+        display.info("Use --log-level DEBUG for more details")
         sys.exit(1)
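
Several hunks in this file switch bare re-raises to `raise ... from e`. A minimal standalone illustration of what that changes (the wrapped exception stays attached as `__cause__`, and tracebacks read "The above exception was the direct cause of the following exception" instead of "During handling of the above exception, another exception occurred"); the function name is hypothetical:

```python
def configure() -> None:
    """Stand-in for setup_logging's failure path."""
    try:
        raise ValueError("invalid log level 'VERBOSE'")
    except ValueError as e:
        # "from e" chains the triggering exception as __cause__ on the new one.
        raise RuntimeError(f"Failed to configure logging: {e}") from e

try:
    configure()
except RuntimeError as err:
    assert isinstance(err.__cause__, ValueError)
```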
 
 

+ 0 - 953
cli/core/config.py

@@ -1,953 +0,0 @@
-from __future__ import annotations
-
-import logging
-import re
-import shutil
-import tempfile
-from pathlib import Path
-from typing import Any, Dict, Optional, Union
-
-import yaml
-from rich.console import Console
-
-from .exceptions import ConfigError, ConfigValidationError, YAMLParseError
-
-logger = logging.getLogger(__name__)
-console = Console()
-
-# Valid Python identifier pattern for variable names
-VALID_IDENTIFIER_PATTERN = re.compile(r"^[a-zA-Z_][a-zA-Z0-9_]*$")
-
-# Valid path pattern - prevents path traversal attempts
-VALID_PATH_PATTERN = re.compile(r'^[^\x00-\x1f<>:"|?*]+$')
-
-# Maximum allowed string lengths to prevent DOS attacks
-MAX_STRING_LENGTH = 1000
-MAX_PATH_LENGTH = 4096
-MAX_LIST_LENGTH = 100
-
-
-class ConfigManager:
-    """Manages configuration for the CLI application."""
-
-    def __init__(self, config_path: Optional[Union[str, Path]] = None) -> None:
-        """Initialize the configuration manager.
-
-        Args:
-            config_path: Path to the configuration file. If None, auto-detects:
-                        1. Checks for ./config.yaml (local project config)
-                        2. Falls back to ~/.config/boilerplates/config.yaml (global config)
-        """
-        if config_path is None:
-            # Check for local config.yaml in current directory first
-            local_config = Path.cwd() / "config.yaml"
-            if local_config.exists() and local_config.is_file():
-                self.config_path = local_config
-                self.is_local = True
-                logger.debug(f"Using local config: {local_config}")
-            else:
-                # Fall back to global config
-                config_dir = Path.home() / ".config" / "boilerplates"
-                config_dir.mkdir(parents=True, exist_ok=True)
-                self.config_path = config_dir / "config.yaml"
-                self.is_local = False
-        else:
-            self.config_path = Path(config_path)
-            self.is_local = False
-
-        # Create default config if it doesn't exist (only for global config)
-        if not self.config_path.exists():
-            if not self.is_local:
-                self._create_default_config()
-            else:
-                raise ConfigError(f"Local config file not found: {self.config_path}")
-        else:
-            # Migrate existing config if needed
-            self._migrate_config_if_needed()
-
-    def _create_default_config(self) -> None:
-        """Create a default configuration file."""
-        default_config = {
-            "defaults": {},
-            "preferences": {"editor": "vim", "output_dir": None, "library_paths": []},
-            "libraries": [
-                {
-                    "name": "default",
-                    "type": "git",
-                    "url": "https://github.com/christianlempa/boilerplates.git",
-                    "branch": "main",
-                    "directory": "library",
-                    "enabled": True,
-                }
-            ],
-        }
-        self._write_config(default_config)
-        logger.info(f"Created default configuration at {self.config_path}")
-
-    def _migrate_config_if_needed(self) -> None:
-        """Migrate existing config to add missing sections and library types."""
-        try:
-            config = self._read_config()
-            needs_migration = False
-
-            # Add libraries section if missing
-            if "libraries" not in config:
-                logger.info("Migrating config: adding libraries section")
-                config["libraries"] = [
-                    {
-                        "name": "default",
-                        "type": "git",
-                        "url": "https://github.com/christianlempa/boilerplates.git",
-                        "branch": "refactor/boilerplates-v2",
-                        "directory": "library",
-                        "enabled": True,
-                    }
-                ]
-                needs_migration = True
-            else:
-                # Migrate existing libraries to add 'type' field if missing
-                # For backward compatibility, assume all old libraries without 'type' are git libraries
-                libraries = config.get("libraries", [])
-                for library in libraries:
-                    if "type" not in library:
-                        logger.info(
-                            f"Migrating library '{library.get('name', 'unknown')}': adding type: git"
-                        )
-                        library["type"] = "git"
-                        needs_migration = True
-
-            # Write back if migration was needed
-            if needs_migration:
-                self._write_config(config)
-                logger.info("Config migration completed successfully")
-        except Exception as e:
-            logger.warning(f"Config migration failed: {e}")
-
-    @staticmethod
-    def _validate_string_length(
-        value: str, field_name: str, max_length: int = MAX_STRING_LENGTH
-    ) -> None:
-        """Validate string length to prevent DOS attacks.
-
-        Args:
-            value: String value to validate
-            field_name: Name of the field for error messages
-            max_length: Maximum allowed length
-
-        Raises:
-            ConfigValidationError: If string exceeds maximum length
-        """
-        if len(value) > max_length:
-            raise ConfigValidationError(
-                f"{field_name} exceeds maximum length of {max_length} characters "
-                f"(got {len(value)} characters)"
-            )
-
-    @staticmethod
-    def _validate_path_string(path: str, field_name: str) -> None:
-        """Validate path string for security concerns.
-
-        Args:
-            path: Path string to validate
-            field_name: Name of the field for error messages
-
-        Raises:
-            ConfigValidationError: If path contains invalid characters or patterns
-        """
-        # Check length
-        if len(path) > MAX_PATH_LENGTH:
-            raise ConfigValidationError(
-                f"{field_name} exceeds maximum path length of {MAX_PATH_LENGTH} characters"
-            )
-
-        # Check for null bytes and control characters
-        if "\x00" in path or any(ord(c) < 32 for c in path if c not in "\t\n\r"):
-            raise ConfigValidationError(
-                f"{field_name} contains invalid control characters"
-            )
-
-        # Check for path traversal attempts
-        if ".." in path.split("/"):
-            logger.warning(
-                f"Path '{path}' contains '..' - potential path traversal attempt"
-            )
-
-    @staticmethod
-    def _validate_list_length(
-        lst: list, field_name: str, max_length: int = MAX_LIST_LENGTH
-    ) -> None:
-        """Validate list length to prevent DOS attacks.
-
-        Args:
-            lst: List to validate
-            field_name: Name of the field for error messages
-            max_length: Maximum allowed length
-
-        Raises:
-            ConfigValidationError: If list exceeds maximum length
-        """
-        if len(lst) > max_length:
-            raise ConfigValidationError(
-                f"{field_name} exceeds maximum length of {max_length} items (got {len(lst)} items)"
-            )
-
-    def _read_config(self) -> Dict[str, Any]:
-        """Read configuration from file.
-
-        Returns:
-            Dictionary containing the configuration.
-
-        Raises:
-            YAMLParseError: If YAML parsing fails.
-            ConfigValidationError: If configuration structure is invalid.
-            ConfigError: If reading fails for other reasons.
-        """
-        try:
-            with open(self.config_path, "r") as f:
-                config = yaml.safe_load(f) or {}
-
-            # Validate config structure
-            self._validate_config_structure(config)
-
-            return config
-        except yaml.YAMLError as e:
-            logger.error(f"Failed to parse YAML configuration: {e}")
-            raise YAMLParseError(str(self.config_path), e)
-        except ConfigValidationError:
-            # Re-raise validation errors as-is
-            raise
-        except (IOError, OSError) as e:
-            logger.error(f"Failed to read configuration file: {e}")
-            raise ConfigError(
-                f"Failed to read configuration file '{self.config_path}': {e}"
-            )
-
-    def _write_config(self, config: Dict[str, Any]) -> None:
-        """Write configuration to file atomically using temp file + rename pattern.
-
-        This prevents config file corruption if write operation fails partway through.
-
-        Args:
-            config: Dictionary containing the configuration to write.
-
-        Raises:
-            ConfigValidationError: If configuration structure is invalid.
-            ConfigError: If writing fails for any reason.
-        """
-        tmp_path = None
-        try:
-            # Validate config structure before writing
-            self._validate_config_structure(config)
-
-            # Ensure parent directory exists
-            self.config_path.parent.mkdir(parents=True, exist_ok=True)
-
-            # Write to temporary file in same directory for atomic rename
-            with tempfile.NamedTemporaryFile(
-                mode="w",
-                delete=False,
-                dir=self.config_path.parent,
-                prefix=".config_",
-                suffix=".tmp",
-            ) as tmp_file:
-                yaml.dump(config, tmp_file, default_flow_style=False)
-                tmp_path = tmp_file.name
-
-            # Atomic rename (overwrites existing file on POSIX systems)
-            shutil.move(tmp_path, self.config_path)
-            logger.debug(f"Configuration written atomically to {self.config_path}")
-
-        except ConfigValidationError:
-            # Re-raise validation errors as-is
-            if tmp_path:
-                Path(tmp_path).unlink(missing_ok=True)
-            raise
-        except (IOError, OSError, yaml.YAMLError) as e:
-            # Clean up temp file if it exists
-            if tmp_path:
-                try:
-                    Path(tmp_path).unlink(missing_ok=True)
-                except (IOError, OSError):
-                    logger.warning(f"Failed to clean up temporary file: {tmp_path}")
-            logger.error(f"Failed to write configuration file: {e}")
-            raise ConfigError(
-                f"Failed to write configuration to '{self.config_path}': {e}"
-            )
-
-    def _validate_config_structure(self, config: Dict[str, Any]) -> None:
-        """Validate the configuration structure with comprehensive checks.
-
-        Args:
-            config: Configuration dictionary to validate.
-
-        Raises:
-            ConfigValidationError: If configuration structure is invalid.
-        """
-        if not isinstance(config, dict):
-            raise ConfigValidationError("Configuration must be a dictionary")
-
-        # Check top-level structure
-        if "defaults" in config and not isinstance(config["defaults"], dict):
-            raise ConfigValidationError("'defaults' must be a dictionary")
-
-        if "preferences" in config and not isinstance(config["preferences"], dict):
-            raise ConfigValidationError("'preferences' must be a dictionary")
-
-        # Validate defaults structure
-        if "defaults" in config:
-            for module_name, module_defaults in config["defaults"].items():
-                if not isinstance(module_name, str):
-                    raise ConfigValidationError(
-                        f"Module name must be a string, got {type(module_name).__name__}"
-                    )
-
-                # Validate module name length
-                self._validate_string_length(module_name, "Module name", max_length=100)
-
-                if not isinstance(module_defaults, dict):
-                    raise ConfigValidationError(
-                        f"Defaults for module '{module_name}' must be a dictionary"
-                    )
-
-                # Validate number of defaults per module
-                self._validate_list_length(
-                    list(module_defaults.keys()), f"Defaults for module '{module_name}'"
-                )
-
-                # Validate variable names are valid Python identifiers
-                for var_name, var_value in module_defaults.items():
-                    if not isinstance(var_name, str):
-                        raise ConfigValidationError(
-                            f"Variable name must be a string, got {type(var_name).__name__}"
-                        )
-
-                    # Validate variable name length
-                    self._validate_string_length(
-                        var_name, "Variable name", max_length=100
-                    )
-
-                    if not VALID_IDENTIFIER_PATTERN.match(var_name):
-                        raise ConfigValidationError(
-                            f"Invalid variable name '{var_name}' in module '{module_name}'. "
-                            f"Variable names must be valid Python identifiers (letters, numbers, underscores, "
-                            f"cannot start with a number)"
-                        )
-
-                    # Validate variable value types and lengths
-                    if isinstance(var_value, str):
-                        self._validate_string_length(
-                            var_value, f"Value for '{module_name}.{var_name}'"
-                        )
-                    elif isinstance(var_value, list):
-                        self._validate_list_length(
-                            var_value, f"Value for '{module_name}.{var_name}'"
-                        )
-                    elif var_value is not None and not isinstance(
-                        var_value, (bool, int, float)
-                    ):
-                        raise ConfigValidationError(
-                            f"Invalid value type for '{module_name}.{var_name}': "
-                            f"must be string, number, boolean, list, or null (got {type(var_value).__name__})"
-                        )
-
-        # Validate preferences structure and types
-        if "preferences" in config:
-            preferences = config["preferences"]
-
-            # Validate known preference types
-            if "editor" in preferences:
-                if not isinstance(preferences["editor"], str):
-                    raise ConfigValidationError("Preference 'editor' must be a string")
-                self._validate_string_length(
-                    preferences["editor"], "Preference 'editor'", max_length=100
-                )
-
-            if "output_dir" in preferences:
-                output_dir = preferences["output_dir"]
-                if output_dir is not None:
-                    if not isinstance(output_dir, str):
-                        raise ConfigValidationError(
-                            "Preference 'output_dir' must be a string or null"
-                        )
-                    self._validate_path_string(output_dir, "Preference 'output_dir'")
-
-            if "library_paths" in preferences:
-                if not isinstance(preferences["library_paths"], list):
-                    raise ConfigValidationError(
-                        "Preference 'library_paths' must be a list"
-                    )
-
-                self._validate_list_length(
-                    preferences["library_paths"], "Preference 'library_paths'"
-                )
-
-                for i, path in enumerate(preferences["library_paths"]):
-                    if not isinstance(path, str):
-                        raise ConfigValidationError(
-                            f"Library path must be a string, got {type(path).__name__}"
-                        )
-                    self._validate_path_string(path, f"Library path at index {i}")
-
-        # Validate libraries structure
-        if "libraries" in config:
-            libraries = config["libraries"]
-
-            if not isinstance(libraries, list):
-                raise ConfigValidationError("'libraries' must be a list")
-
-            self._validate_list_length(libraries, "Libraries list")
-
-            for i, library in enumerate(libraries):
-                if not isinstance(library, dict):
-                    raise ConfigValidationError(
-                        f"Library at index {i} must be a dictionary"
-                    )
-
-                # Validate name field (required for all library types)
-                if "name" not in library:
-                    raise ConfigValidationError(
-                        f"Library at index {i} missing required field 'name'"
-                    )
-                if not isinstance(library["name"], str):
-                    raise ConfigValidationError(
-                        f"Library 'name' at index {i} must be a string"
-                    )
-                self._validate_string_length(
-                    library["name"], f"Library 'name' at index {i}", max_length=500
-                )
-
-                # Validate type field (default to "git" for backward compatibility)
-                lib_type = library.get("type", "git")
-                if lib_type not in ("git", "static"):
-                    raise ConfigValidationError(
-                        f"Library type at index {i} must be 'git' or 'static', got '{lib_type}'"
-                    )
-
-                # Type-specific validation
-                if lib_type == "git":
-                    # Git libraries require: url, directory
-                    required_fields = ["url", "directory"]
-                    for field in required_fields:
-                        if field not in library:
-                            raise ConfigValidationError(
-                                f"Git library at index {i} missing required field '{field}'"
-                            )
-
-                        if not isinstance(library[field], str):
-                            raise ConfigValidationError(
-                                f"Library '{field}' at index {i} must be a string"
-                            )
-
-                        self._validate_string_length(
-                            library[field],
-                            f"Library '{field}' at index {i}",
-                            max_length=500,
-                        )
-
-                    # Validate optional branch field
-                    if "branch" in library:
-                        if not isinstance(library["branch"], str):
-                            raise ConfigValidationError(
-                                f"Library 'branch' at index {i} must be a string"
-                            )
-                        self._validate_string_length(
-                            library["branch"],
-                            f"Library 'branch' at index {i}",
-                            max_length=200,
-                        )
-
-                elif lib_type == "static":
-                    # Static libraries require: path
-                    if "path" not in library:
-                        raise ConfigValidationError(
-                            f"Static library at index {i} missing required field 'path'"
-                        )
-
-                    if not isinstance(library["path"], str):
-                        raise ConfigValidationError(
-                            f"Library 'path' at index {i} must be a string"
-                        )
-
-                    self._validate_path_string(
-                        library["path"], f"Library 'path' at index {i}"
-                    )
-
-                # Validate optional enabled field (applies to all types)
-                if "enabled" in library and not isinstance(library["enabled"], bool):
-                    raise ConfigValidationError(
-                        f"Library 'enabled' at index {i} must be a boolean"
-                    )
-
-    def get_config_path(self) -> Path:
-        """Get the path to the configuration file being used.
-
-        Returns:
-            Path to the configuration file (global or local).
-        """
-        return self.config_path
-
-    def is_using_local_config(self) -> bool:
-        """Check if a local configuration file is being used.
-
-        Returns:
-            True if using local config, False if using global config.
-        """
-        return self.is_local
-
-    def get_defaults(self, module_name: str) -> Dict[str, Any]:
-        """Get default variable values for a module.
-
-        Returns defaults in a flat format:
-        {
-            "var_name": "value",
-            "var2_name": "value2"
-        }
-
-        Args:
-            module_name: Name of the module
-
-        Returns:
-            Dictionary of default values (flat key-value pairs)
-        """
-        config = self._read_config()
-        defaults = config.get("defaults", {})
-        return defaults.get(module_name, {})
-
-    def set_defaults(self, module_name: str, defaults: Dict[str, Any]) -> None:
-        """Set default variable values for a module with comprehensive validation.
-
-        Args:
-            module_name: Name of the module
-            defaults: Dictionary of defaults (flat key-value pairs):
-                      {"var_name": "value", "var2_name": "value2"}
-
-        Raises:
-            ConfigValidationError: If module name or variable names are invalid.
-        """
-        # Validate module name
-        if not isinstance(module_name, str) or not module_name:
-            raise ConfigValidationError("Module name must be a non-empty string")
-
-        self._validate_string_length(module_name, "Module name", max_length=100)
-
-        # Validate defaults dictionary
-        if not isinstance(defaults, dict):
-            raise ConfigValidationError("Defaults must be a dictionary")
-
-        # Validate number of defaults
-        self._validate_list_length(list(defaults.keys()), "Defaults dictionary")
-
-        # Validate variable names and values
-        for var_name, var_value in defaults.items():
-            if not isinstance(var_name, str):
-                raise ConfigValidationError(
-                    f"Variable name must be a string, got {type(var_name).__name__}"
-                )
-
-            self._validate_string_length(var_name, "Variable name", max_length=100)
-
-            if not VALID_IDENTIFIER_PATTERN.match(var_name):
-                raise ConfigValidationError(
-                    f"Invalid variable name '{var_name}'. Variable names must be valid Python identifiers "
-                    f"(letters, numbers, underscores, cannot start with a number)"
-                )
-
-            # Validate value types and lengths
-            if isinstance(var_value, str):
-                self._validate_string_length(var_value, f"Value for '{var_name}'")
-            elif isinstance(var_value, list):
-                self._validate_list_length(var_value, f"Value for '{var_name}'")
-            elif var_value is not None and not isinstance(
-                var_value, (bool, int, float)
-            ):
-                raise ConfigValidationError(
-                    f"Invalid value type for '{var_name}': "
-                    f"must be string, number, boolean, list, or null (got {type(var_value).__name__})"
-                )
-
-        config = self._read_config()
-
-        if "defaults" not in config:
-            config["defaults"] = {}
-
-        config["defaults"][module_name] = defaults
-        self._write_config(config)
-        logger.info(f"Updated defaults for module '{module_name}'")
-
-    def set_default_value(self, module_name: str, var_name: str, value: Any) -> None:
-        """Set a single default variable value with comprehensive validation.
-
-        Args:
-            module_name: Name of the module
-            var_name: Name of the variable
-            value: Default value to set
-
-        Raises:
-            ConfigValidationError: If module name or variable name is invalid.
-        """
-        # Validate inputs
-        if not isinstance(module_name, str) or not module_name:
-            raise ConfigValidationError("Module name must be a non-empty string")
-
-        self._validate_string_length(module_name, "Module name", max_length=100)
-
-        if not isinstance(var_name, str):
-            raise ConfigValidationError(
-                f"Variable name must be a string, got {type(var_name).__name__}"
-            )
-
-        self._validate_string_length(var_name, "Variable name", max_length=100)
-
-        if not VALID_IDENTIFIER_PATTERN.match(var_name):
-            raise ConfigValidationError(
-                f"Invalid variable name '{var_name}'. Variable names must be valid Python identifiers "
-                f"(letters, numbers, underscores, cannot start with a number)"
-            )
-
-        # Validate value type and length
-        if isinstance(value, str):
-            self._validate_string_length(value, f"Value for '{var_name}'")
-        elif isinstance(value, list):
-            self._validate_list_length(value, f"Value for '{var_name}'")
-        elif value is not None and not isinstance(value, (bool, int, float)):
-            raise ConfigValidationError(
-                f"Invalid value type for '{var_name}': "
-                f"must be string, number, boolean, list, or null (got {type(value).__name__})"
-            )
-
-        defaults = self.get_defaults(module_name)
-        defaults[var_name] = value
-        self.set_defaults(module_name, defaults)
-        logger.info(f"Set default for '{module_name}.{var_name}' = '{value}'")
-
-    def get_default_value(self, module_name: str, var_name: str) -> Optional[Any]:
-        """Get a single default variable value.
-
-        Args:
-            module_name: Name of the module
-            var_name: Name of the variable
-
-        Returns:
-            Default value or None if not set
-        """
-        defaults = self.get_defaults(module_name)
-        return defaults.get(var_name)
-
-    def clear_defaults(self, module_name: str) -> None:
-        """Clear all defaults for a module.
-
-        Args:
-            module_name: Name of the module
-        """
-        config = self._read_config()
-
-        if "defaults" in config and module_name in config["defaults"]:
-            del config["defaults"][module_name]
-            self._write_config(config)
-            logger.info(f"Cleared defaults for module '{module_name}'")
-
-    def get_preference(self, key: str) -> Optional[Any]:
-        """Get a user preference value.
-
-        Args:
-            key: Preference key (e.g., 'editor', 'output_dir', 'library_paths')
-
-        Returns:
-            Preference value or None if not set
-        """
-        config = self._read_config()
-        preferences = config.get("preferences", {})
-        return preferences.get(key)
-
-    def set_preference(self, key: str, value: Any) -> None:
-        """Set a user preference value with comprehensive validation.
-
-        Args:
-            key: Preference key
-            value: Preference value
-
-        Raises:
-            ConfigValidationError: If key or value is invalid for known preference types.
-        """
-        # Validate key
-        if not isinstance(key, str) or not key:
-            raise ConfigValidationError("Preference key must be a non-empty string")
-
-        self._validate_string_length(key, "Preference key", max_length=100)
-
-        # Validate known preference types
-        if key == "editor":
-            if not isinstance(value, str):
-                raise ConfigValidationError("Preference 'editor' must be a string")
-            self._validate_string_length(value, "Preference 'editor'", max_length=100)
-
-        elif key == "output_dir":
-            if value is not None:
-                if not isinstance(value, str):
-                    raise ConfigValidationError(
-                        "Preference 'output_dir' must be a string or null"
-                    )
-                self._validate_path_string(value, "Preference 'output_dir'")
-
-        elif key == "library_paths":
-            if not isinstance(value, list):
-                raise ConfigValidationError("Preference 'library_paths' must be a list")
-
-            self._validate_list_length(value, "Preference 'library_paths'")
-
-            for i, path in enumerate(value):
-                if not isinstance(path, str):
-                    raise ConfigValidationError(
-                        f"Library path must be a string, got {type(path).__name__}"
-                    )
-                self._validate_path_string(path, f"Library path at index {i}")
-
-        # For unknown preference keys, apply basic validation
-        else:
-            if isinstance(value, str):
-                self._validate_string_length(value, f"Preference '{key}'")
-            elif isinstance(value, list):
-                self._validate_list_length(value, f"Preference '{key}'")
-
-        config = self._read_config()
-
-        if "preferences" not in config:
-            config["preferences"] = {}
-
-        config["preferences"][key] = value
-        self._write_config(config)
-        logger.info(f"Set preference '{key}' = '{value}'")
-
-    def get_all_preferences(self) -> Dict[str, Any]:
-        """Get all user preferences.
-
-        Returns:
-            Dictionary of all preferences
-        """
-        config = self._read_config()
-        return config.get("preferences", {})
-
-    def get_libraries(self) -> list[Dict[str, Any]]:
-        """Get all configured libraries.
-
-        Returns:
-            List of library configurations
-        """
-        config = self._read_config()
-        return config.get("libraries", [])
-
-    def get_library_by_name(self, name: str) -> Optional[Dict[str, Any]]:
-        """Get a specific library by name.
-
-        Args:
-            name: Name of the library
-
-        Returns:
-            Library configuration dictionary or None if not found
-        """
-        libraries = self.get_libraries()
-        for library in libraries:
-            if library.get("name") == name:
-                return library
-        return None
-
-    def add_library(
-        self,
-        name: str,
-        library_type: str = "git",
-        url: Optional[str] = None,
-        directory: Optional[str] = None,
-        branch: str = "main",
-        path: Optional[str] = None,
-        enabled: bool = True,
-    ) -> None:
-        """Add a new library to the configuration.
-
-        Args:
-            name: Unique name for the library
-            library_type: Type of library ("git" or "static")
-            url: Git repository URL (required for git type)
-            directory: Directory within repo (required for git type)
-            branch: Git branch (for git type)
-            path: Local path to templates (required for static type)
-            enabled: Whether the library is enabled
-
-        Raises:
-            ConfigValidationError: If library with the same name already exists or validation fails
-        """
-        # Validate name
-        if not isinstance(name, str) or not name:
-            raise ConfigValidationError("Library name must be a non-empty string")
-
-        self._validate_string_length(name, "Library name", max_length=100)
-
-        # Validate type
-        if library_type not in ("git", "static"):
-            raise ConfigValidationError(
-                f"Library type must be 'git' or 'static', got '{library_type}'"
-            )
-
-        # Check if library already exists
-        if self.get_library_by_name(name):
-            raise ConfigValidationError(f"Library '{name}' already exists")
-
-        # Type-specific validation and config building
-        if library_type == "git":
-            if not url:
-                raise ConfigValidationError("Git libraries require 'url' parameter")
-            if not directory:
-                raise ConfigValidationError(
-                    "Git libraries require 'directory' parameter"
-                )
-
-            # Validate git-specific fields
-            if not isinstance(url, str) or not url:
-                raise ConfigValidationError("Library URL must be a non-empty string")
-            self._validate_string_length(url, "Library URL", max_length=500)
-
-            if not isinstance(directory, str) or not directory:
-                raise ConfigValidationError(
-                    "Library directory must be a non-empty string"
-                )
-            self._validate_string_length(directory, "Library directory", max_length=200)
-
-            if not isinstance(branch, str) or not branch:
-                raise ConfigValidationError("Library branch must be a non-empty string")
-            self._validate_string_length(branch, "Library branch", max_length=200)
-
-            library_config = {
-                "name": name,
-                "type": "git",
-                "url": url,
-                "branch": branch,
-                "directory": directory,
-                "enabled": enabled,
-            }
-
-        else:  # static
-            if not path:
-                raise ConfigValidationError("Static libraries require 'path' parameter")
-
-            # Validate static-specific fields
-            if not isinstance(path, str) or not path:
-                raise ConfigValidationError("Library path must be a non-empty string")
-            self._validate_path_string(path, "Library path")
-
-            # For backward compatibility with older CLI versions,
-            # add dummy values for git-specific fields
-            library_config = {
-                "name": name,
-                "type": "static",
-                "url": "",  # Empty string for backward compatibility
-                "branch": "main",  # Default value for backward compatibility
-                "directory": ".",  # Default value for backward compatibility
-                "path": path,
-                "enabled": enabled,
-            }
-
-        config = self._read_config()
-
-        if "libraries" not in config:
-            config["libraries"] = []
-
-        config["libraries"].append(library_config)
-
-        self._write_config(config)
-        logger.info(f"Added {library_type} library '{name}'")
-
-    def remove_library(self, name: str) -> None:
-        """Remove a library from the configuration.
-
-        Args:
-            name: Name of the library to remove
-
-        Raises:
-            ConfigError: If library is not found
-        """
-        config = self._read_config()
-        libraries = config.get("libraries", [])
-
-        # Find and remove the library
-        new_libraries = [lib for lib in libraries if lib.get("name") != name]
-
-        if len(new_libraries) == len(libraries):
-            raise ConfigError(f"Library '{name}' not found")
-
-        config["libraries"] = new_libraries
-        self._write_config(config)
-        logger.info(f"Removed library '{name}'")
-
-    def update_library(self, name: str, **kwargs: Any) -> None:
-        """Update a library's configuration.
-
-        Args:
-            name: Name of the library to update
-            **kwargs: Fields to update (url, branch, directory, enabled)
-
-        Raises:
-            ConfigError: If library is not found
-            ConfigValidationError: If validation fails
-        """
-        config = self._read_config()
-        libraries = config.get("libraries", [])
-
-        # Find the library
-        library_found = False
-        for library in libraries:
-            if library.get("name") == name:
-                library_found = True
-
-                # Update allowed fields
-                if "url" in kwargs:
-                    url = kwargs["url"]
-                    if not isinstance(url, str) or not url:
-                        raise ConfigValidationError(
-                            "Library URL must be a non-empty string"
-                        )
-                    self._validate_string_length(url, "Library URL", max_length=500)
-                    library["url"] = url
-
-                if "branch" in kwargs:
-                    branch = kwargs["branch"]
-                    if not isinstance(branch, str) or not branch:
-                        raise ConfigValidationError(
-                            "Library branch must be a non-empty string"
-                        )
-                    self._validate_string_length(
-                        branch, "Library branch", max_length=200
-                    )
-                    library["branch"] = branch
-
-                if "directory" in kwargs:
-                    directory = kwargs["directory"]
-                    if not isinstance(directory, str) or not directory:
-                        raise ConfigValidationError(
-                            "Library directory must be a non-empty string"
-                        )
-                    self._validate_string_length(
-                        directory, "Library directory", max_length=200
-                    )
-                    library["directory"] = directory
-
-                if "enabled" in kwargs:
-                    enabled = kwargs["enabled"]
-                    if not isinstance(enabled, bool):
-                        raise ConfigValidationError("Library enabled must be a boolean")
-                    library["enabled"] = enabled
-
-                break
-
-        if not library_found:
-            raise ConfigError(f"Library '{name}' not found")
-
-        config["libraries"] = libraries
-        self._write_config(config)
-        logger.info(f"Updated library '{name}'")
-
-    def get_libraries_path(self) -> Path:
-        """Get the path to the libraries directory.
-
-        Returns:
-            Path to the libraries directory (same directory as config file)
-        """
-        return self.config_path.parent / "libraries"

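The removed `add_library` took the library fields as individual parameters (per its docstring above); the replacement in `cli/core/config/config_manager.py` bundles them into a single `LibraryConfig` dataclass. A minimal sketch of the call-shape change, using a standalone replica of the dataclass so the snippet is self-contained (the library name `community` is illustrative, not from the diff):

```python
from __future__ import annotations

from dataclasses import dataclass


@dataclass
class LibraryConfig:  # standalone replica of the class added in this diff
    name: str
    library_type: str = "git"
    url: str | None = None
    directory: str | None = None
    branch: str = "main"
    path: str | None = None
    enabled: bool = True


# Old API (removed): manager.add_library("community", "git", url=..., directory=...)
# New API:           manager.add_library(LibraryConfig(name="community", url=..., directory=...))
cfg = LibraryConfig(
    name="community",
    url="https://github.com/christianlempa/boilerplates.git",
    directory="library",
)
```

Unset fields keep their dataclass defaults (`branch="main"`, `enabled=True`), which is what lets the new `add_library` drop the per-parameter defaulting logic of the old signature.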
+ 9 - 0
cli/core/config/__init__.py

@@ -0,0 +1,9 @@
+"""Config package for configuration management.
+
+This package provides the ConfigManager class for managing application configuration,
+including defaults, preferences, and library configurations.
+"""
+
+from .config_manager import ConfigManager, LibraryConfig
+
+__all__ = ["ConfigManager", "LibraryConfig"]

+ 575 - 0
cli/core/config/config_manager.py

@@ -0,0 +1,575 @@
+from __future__ import annotations
+
+import logging
+import shutil
+import tempfile
+from dataclasses import dataclass
+from pathlib import Path
+from typing import Any
+
+import yaml
+
+from ..exceptions import ConfigError, ConfigValidationError, YAMLParseError
+
+logger = logging.getLogger(__name__)
+
+
+@dataclass
+class LibraryConfig:
+    """Configuration for a template library."""
+
+    name: str
+    library_type: str = "git"
+    url: str | None = None
+    directory: str | None = None
+    branch: str = "main"
+    path: str | None = None
+    enabled: bool = True
+
+
+class ConfigManager:
+    """Manages configuration for the CLI application."""
+
+    def __init__(self, config_path: str | Path | None = None) -> None:
+        """Initialize the configuration manager.
+
+        Args:
+            config_path: Path to the configuration file. If None, auto-detects:
+                        1. Checks for ./config.yaml (local project config)
+                        2. Falls back to ~/.config/boilerplates/config.yaml (global config)
+        """
+        if config_path is None:
+            # Check for local config.yaml in current directory first
+            local_config = Path.cwd() / "config.yaml"
+            if local_config.exists() and local_config.is_file():
+                self.config_path = local_config
+                self.is_local = True
+                logger.debug(f"Using local config: {local_config}")
+            else:
+                # Fall back to global config
+                config_dir = Path.home() / ".config" / "boilerplates"
+                config_dir.mkdir(parents=True, exist_ok=True)
+                self.config_path = config_dir / "config.yaml"
+                self.is_local = False
+        else:
+            self.config_path = Path(config_path)
+            self.is_local = False
+
+        # Create default config if it doesn't exist (only for global config)
+        if not self.config_path.exists():
+            if not self.is_local:
+                self._create_default_config()
+            else:
+                raise ConfigError(f"Local config file not found: {self.config_path}")
+        else:
+            # Migrate existing config if needed
+            self._migrate_config_if_needed()
+
+    def _create_default_config(self) -> None:
+        """Create a default configuration file."""
+        default_config = {
+            "defaults": {},
+            "preferences": {"editor": "vim", "output_dir": None, "library_paths": []},
+            "libraries": [
+                {
+                    "name": "default",
+                    "type": "git",
+                    "url": "https://github.com/christianlempa/boilerplates.git",
+                    "branch": "main",
+                    "directory": "library",
+                    "enabled": True,
+                }
+            ],
+        }
+        self._write_config(default_config)
+        logger.info(f"Created default configuration at {self.config_path}")
+
+    def _migrate_config_if_needed(self) -> None:
+        """Migrate existing config to add missing sections and library types."""
+        try:
+            config = self._read_config()
+            needs_migration = False
+
+            # Add libraries section if missing
+            if "libraries" not in config:
+                logger.info("Migrating config: adding libraries section")
+                config["libraries"] = [
+                    {
+                        "name": "default",
+                        "type": "git",
+                        "url": "https://github.com/christianlempa/boilerplates.git",
+                        "branch": "main",
+                        "directory": "library",
+                        "enabled": True,
+                    }
+                ]
+                needs_migration = True
+            else:
+                # Migrate existing libraries to add 'type' field if missing
+                # For backward compatibility, assume all old libraries without
+                # 'type' are git libraries
+                libraries = config.get("libraries", [])
+                for library in libraries:
+                    if "type" not in library:
+                        lib_name = library.get("name", "unknown")
+                        logger.info(f"Migrating library '{lib_name}': adding type: git")
+                        library["type"] = "git"
+                        needs_migration = True
+
+            # Write back if migration was needed
+            if needs_migration:
+                self._write_config(config)
+                logger.info("Config migration completed successfully")
+        except Exception as e:
+            logger.warning(f"Config migration failed: {e}")
+
+    def _read_config(self) -> dict[str, Any]:
+        """Read configuration from file.
+
+        Returns:
+            Dictionary containing the configuration.
+
+        Raises:
+            YAMLParseError: If YAML parsing fails.
+            ConfigValidationError: If configuration structure is invalid.
+            ConfigError: If reading fails for other reasons.
+        """
+        try:
+            with self.config_path.open() as f:
+                config = yaml.safe_load(f) or {}
+
+            # Validate config structure
+            self._validate_config_structure(config)
+
+            return config
+        except yaml.YAMLError as e:
+            logger.error(f"Failed to parse YAML configuration: {e}")
+            raise YAMLParseError(str(self.config_path), e) from e
+        except ConfigValidationError:
+            # Re-raise validation errors as-is
+            raise
+        except OSError as e:
+            logger.error(f"Failed to read configuration file: {e}")
+            raise ConfigError(f"Failed to read configuration file '{self.config_path}': {e}") from e
+
+    def _write_config(self, config: dict[str, Any]) -> None:
+        """Write configuration to file atomically using temp file + rename pattern.
+
+        This prevents config file corruption if write operation fails partway through.
+
+        Args:
+            config: Dictionary containing the configuration to write.
+
+        Raises:
+            ConfigValidationError: If configuration structure is invalid.
+            ConfigError: If writing fails for any reason.
+        """
+        tmp_path = None
+        try:
+            # Validate config structure before writing
+            self._validate_config_structure(config)
+
+            # Ensure parent directory exists
+            self.config_path.parent.mkdir(parents=True, exist_ok=True)
+
+            # Write to temporary file in same directory for atomic rename
+            with tempfile.NamedTemporaryFile(
+                mode="w",
+                delete=False,
+                dir=self.config_path.parent,
+                prefix=".config_",
+                suffix=".tmp",
+            ) as tmp_file:
+                yaml.dump(config, tmp_file, default_flow_style=False)
+                tmp_path = tmp_file.name
+
+            # Atomic rename (overwrites existing file on POSIX systems)
+            shutil.move(tmp_path, self.config_path)
+            logger.debug(f"Configuration written atomically to {self.config_path}")
+
+        except ConfigValidationError:
+            # Re-raise validation errors as-is
+            if tmp_path:
+                Path(tmp_path).unlink(missing_ok=True)
+            raise
+        except (OSError, yaml.YAMLError) as e:
+            # Clean up temp file if it exists
+            if tmp_path:
+                try:
+                    Path(tmp_path).unlink(missing_ok=True)
+                except OSError:
+                    logger.warning(f"Failed to clean up temporary file: {tmp_path}")
+            logger.error(f"Failed to write configuration file: {e}")
+            raise ConfigError(f"Failed to write configuration to '{self.config_path}': {e}") from e
+
+    def _validate_config_structure(self, config: dict[str, Any]) -> None:
+        """Validate the configuration structure - basic type checking.
+
+        Args:
+            config: Configuration dictionary to validate.
+
+        Raises:
+            ConfigValidationError: If configuration structure is invalid.
+        """
+        if not isinstance(config, dict):
+            raise ConfigValidationError("Configuration must be a dictionary")
+
+        # Validate top-level types
+        self._validate_top_level_types(config)
+
+        # Validate defaults structure
+        self._validate_defaults_types(config)
+
+        # Validate libraries structure
+        self._validate_libraries_fields(config)
+
+    def _validate_top_level_types(self, config: dict[str, Any]) -> None:
+        """Validate top-level config section types."""
+        if "defaults" in config and not isinstance(config["defaults"], dict):
+            raise ConfigValidationError("'defaults' must be a dictionary")
+
+        if "preferences" in config and not isinstance(config["preferences"], dict):
+            raise ConfigValidationError("'preferences' must be a dictionary")
+
+        if "libraries" in config and not isinstance(config["libraries"], list):
+            raise ConfigValidationError("'libraries' must be a list")
+
+    def _validate_defaults_types(self, config: dict[str, Any]) -> None:
+        """Validate defaults section has correct types."""
+        if "defaults" not in config:
+            return
+
+        for module_name, module_defaults in config["defaults"].items():
+            if not isinstance(module_defaults, dict):
+                raise ConfigValidationError(f"Defaults for module '{module_name}' must be a dictionary")
+
+    def _validate_libraries_fields(self, config: dict[str, Any]) -> None:
+        """Validate libraries have required fields."""
+        if "libraries" not in config:
+            return
+
+        for i, library in enumerate(config["libraries"]):
+            if not isinstance(library, dict):
+                raise ConfigValidationError(f"Library at index {i} must be a dictionary")
+
+            if "name" not in library:
+                raise ConfigValidationError(f"Library at index {i} missing required field 'name'")
+
+            lib_type = library.get("type", "git")
+            if lib_type == "git" and ("url" not in library or "directory" not in library):
+                raise ConfigValidationError(
+                    f"Git library at index {i} missing required fields 'url' and/or 'directory'"
+                )
+            if lib_type == "static" and "path" not in library:
+                raise ConfigValidationError(f"Static library at index {i} missing required field 'path'")
+
+    def get_config_path(self) -> Path:
+        """Get the path to the configuration file being used.
+
+        Returns:
+            Path to the configuration file (global or local).
+        """
+        return self.config_path
+
+    def is_using_local_config(self) -> bool:
+        """Check if a local configuration file is being used.
+
+        Returns:
+            True if using local config, False if using global config.
+        """
+        return self.is_local
+
+    def get_defaults(self, module_name: str) -> dict[str, Any]:
+        """Get default variable values for a module.
+
+        Returns defaults in a flat format:
+        {
+            "var_name": "value",
+            "var2_name": "value2"
+        }
+
+        Args:
+            module_name: Name of the module
+
+        Returns:
+            Dictionary of default values (flat key-value pairs)
+        """
+        config = self._read_config()
+        defaults = config.get("defaults", {})
+        return defaults.get(module_name, {})
+
+    def set_defaults(self, module_name: str, defaults: dict[str, Any]) -> None:
+        """Set default variable values for a module with comprehensive validation.
+
+        Args:
+            module_name: Name of the module
+            defaults: Dictionary of defaults (flat key-value pairs):
+                      {"var_name": "value", "var2_name": "value2"}
+
+        Raises:
+            ConfigValidationError: If module name or variable names are invalid.
+        """
+        # Basic validation
+        if not isinstance(module_name, str) or not module_name:
+            raise ConfigValidationError("Module name must be a non-empty string")
+
+        if not isinstance(defaults, dict):
+            raise ConfigValidationError("Defaults must be a dictionary")
+
+        config = self._read_config()
+
+        if "defaults" not in config:
+            config["defaults"] = {}
+
+        config["defaults"][module_name] = defaults
+        self._write_config(config)
+        logger.info(f"Updated defaults for module '{module_name}'")
+
+    def set_default_value(self, module_name: str, var_name: str, value: Any) -> None:
+        """Set a single default variable value with comprehensive validation.
+
+        Args:
+            module_name: Name of the module
+            var_name: Name of the variable
+            value: Default value to set
+
+        Raises:
+            ConfigValidationError: If module name or variable name is invalid.
+        """
+        # Basic validation
+        if not isinstance(module_name, str) or not module_name:
+            raise ConfigValidationError("Module name must be a non-empty string")
+
+        if not isinstance(var_name, str) or not var_name:
+            raise ConfigValidationError("Variable name must be a non-empty string")
+
+        defaults = self.get_defaults(module_name)
+        defaults[var_name] = value
+        self.set_defaults(module_name, defaults)
+        logger.info(f"Set default for '{module_name}.{var_name}' = '{value}'")
+
+    def get_default_value(self, module_name: str, var_name: str) -> Any | None:
+        """Get a single default variable value.
+
+        Args:
+            module_name: Name of the module
+            var_name: Name of the variable
+
+        Returns:
+            Default value or None if not set
+        """
+        defaults = self.get_defaults(module_name)
+        return defaults.get(var_name)
+
+    def clear_defaults(self, module_name: str) -> None:
+        """Clear all defaults for a module.
+
+        Args:
+            module_name: Name of the module
+        """
+        config = self._read_config()
+
+        if "defaults" in config and module_name in config["defaults"]:
+            del config["defaults"][module_name]
+            self._write_config(config)
+            logger.info(f"Cleared defaults for module '{module_name}'")
+
+    def get_preference(self, key: str) -> Any | None:
+        """Get a user preference value.
+
+        Args:
+            key: Preference key (e.g., 'editor', 'output_dir', 'library_paths')
+
+        Returns:
+            Preference value or None if not set
+        """
+        config = self._read_config()
+        preferences = config.get("preferences", {})
+        return preferences.get(key)
+
+    def set_preference(self, key: str, value: Any) -> None:
+        """Set a user preference value with comprehensive validation.
+
+        Args:
+            key: Preference key
+            value: Preference value
+
+        Raises:
+            ConfigValidationError: If key or value is invalid for known preference types.
+        """
+        # Basic validation
+        if not isinstance(key, str) or not key:
+            raise ConfigValidationError("Preference key must be a non-empty string")
+
+        config = self._read_config()
+
+        if "preferences" not in config:
+            config["preferences"] = {}
+
+        config["preferences"][key] = value
+        self._write_config(config)
+        logger.info(f"Set preference '{key}' = '{value}'")
+
+    def get_all_preferences(self) -> dict[str, Any]:
+        """Get all user preferences.
+
+        Returns:
+            Dictionary of all preferences
+        """
+        config = self._read_config()
+        return config.get("preferences", {})
+
+    def get_libraries(self) -> list[dict[str, Any]]:
+        """Get all configured libraries.
+
+        Returns:
+            List of library configurations
+        """
+        config = self._read_config()
+        return config.get("libraries", [])
+
+    def get_library_by_name(self, name: str) -> dict[str, Any] | None:
+        """Get a specific library by name.
+
+        Args:
+            name: Name of the library
+
+        Returns:
+            Library configuration dictionary or None if not found
+        """
+        libraries = self.get_libraries()
+        for library in libraries:
+            if library.get("name") == name:
+                return library
+        return None
+
+    def add_library(self, lib_config: LibraryConfig) -> None:
+        """Add a new library to the configuration.
+
+        Args:
+            lib_config: Library configuration
+
+        Raises:
+            ConfigValidationError: If library with the same name already exists or validation fails
+        """
+        # Basic validation
+        if not isinstance(lib_config.name, str) or not lib_config.name:
+            raise ConfigValidationError("Library name must be a non-empty string")
+
+        if lib_config.library_type not in ("git", "static"):
+            raise ConfigValidationError(f"Library type must be 'git' or 'static', got '{lib_config.library_type}'")
+
+        if self.get_library_by_name(lib_config.name):
+            raise ConfigValidationError(f"Library '{lib_config.name}' already exists")
+
+        # Type-specific validation
+        if lib_config.library_type == "git":
+            if not lib_config.url or not lib_config.directory:
+                raise ConfigValidationError("Git libraries require 'url' and 'directory' parameters")
+
+            library_dict = {
+                "name": lib_config.name,
+                "type": "git",
+                "url": lib_config.url,
+                "branch": lib_config.branch,
+                "directory": lib_config.directory,
+                "enabled": lib_config.enabled,
+            }
+
+        else:  # static
+            if not lib_config.path:
+                raise ConfigValidationError("Static libraries require 'path' parameter")
+
+            # For backward compatibility with older CLI versions,
+            # add dummy values for git-specific fields
+            library_dict = {
+                "name": lib_config.name,
+                "type": "static",
+                "url": "",  # Empty string for backward compatibility
+                "branch": "main",  # Default value for backward compatibility
+                "directory": ".",  # Default value for backward compatibility
+                "path": lib_config.path,
+                "enabled": lib_config.enabled,
+            }
+
+        config = self._read_config()
+
+        if "libraries" not in config:
+            config["libraries"] = []
+
+        config["libraries"].append(library_dict)
+
+        self._write_config(config)
+        logger.info(f"Added {lib_config.library_type} library '{lib_config.name}'")
+
+    def remove_library(self, name: str) -> None:
+        """Remove a library from the configuration.
+
+        Args:
+            name: Name of the library to remove
+
+        Raises:
+            ConfigError: If library is not found
+        """
+        config = self._read_config()
+        libraries = config.get("libraries", [])
+
+        # Find and remove the library
+        new_libraries = [lib for lib in libraries if lib.get("name") != name]
+
+        if len(new_libraries) == len(libraries):
+            raise ConfigError(f"Library '{name}' not found")
+
+        config["libraries"] = new_libraries
+        self._write_config(config)
+        logger.info(f"Removed library '{name}'")
+
+    def update_library(self, name: str, **kwargs: Any) -> None:
+        """Update a library's configuration.
+
+        Args:
+            name: Name of the library to update
+            **kwargs: Fields to update (url, branch, directory, enabled)
+
+        Raises:
+            ConfigError: If library is not found
+            ConfigValidationError: If validation fails
+        """
+        config = self._read_config()
+        libraries = config.get("libraries", [])
+
+        # Find the library
+        library_found = False
+        for library in libraries:
+            if library.get("name") == name:
+                library_found = True
+
+                # Update allowed fields
+                if "url" in kwargs:
+                    library["url"] = kwargs["url"]
+
+                if "branch" in kwargs:
+                    library["branch"] = kwargs["branch"]
+
+                if "directory" in kwargs:
+                    library["directory"] = kwargs["directory"]
+
+                if "enabled" in kwargs:
+                    library["enabled"] = kwargs["enabled"]
+
+                break
+
+        if not library_found:
+            raise ConfigError(f"Library '{name}' not found")
+
+        config["libraries"] = libraries
+        self._write_config(config)
+        logger.info(f"Updated library '{name}'")
+
+    def get_libraries_path(self) -> Path:
+        """Get the path to the libraries directory.
+
+        Returns:
+            Path to the libraries directory (same directory as config file)
+        """
+        return self.config_path.parent / "libraries"
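The temp-file-plus-rename pattern used by `_write_config` above can be exercised in isolation. This sketch substitutes `json` for `yaml` purely to avoid the PyYAML dependency; the structure (write to a temp file in the target's directory, then rename over the original) is the same:

```python
import json
import shutil
import tempfile
from pathlib import Path


def write_atomically(target: Path, data: dict) -> None:
    # Write to a temp file in the same directory, then rename: if the
    # dump fails partway, the existing file is left untouched.
    with tempfile.NamedTemporaryFile(
        mode="w", delete=False, dir=target.parent, suffix=".tmp"
    ) as tmp:
        json.dump(data, tmp)
        tmp_path = tmp.name
    shutil.move(tmp_path, target)  # atomic rename on POSIX (same filesystem)


with tempfile.TemporaryDirectory() as d:
    cfg_file = Path(d) / "config.json"
    write_atomically(cfg_file, {"defaults": {}, "libraries": []})
    # Overwriting is safe: the old contents survive until the rename lands.
    write_atomically(cfg_file, {"defaults": {}, "libraries": [{"name": "default"}]})
    result = json.loads(cfg_file.read_text())
```

Writing the temp file into `target.parent` rather than the system temp dir matters: `rename` is only atomic within one filesystem, which is why `_write_config` passes `dir=self.config_path.parent`.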

+ 0 - 975
cli/core/display.py

@@ -1,975 +0,0 @@
-from __future__ import annotations
-
-import logging
-from pathlib import Path
-from typing import TYPE_CHECKING
-
-from rich.console import Console
-from rich.table import Table
-from rich.tree import Tree
-
-if TYPE_CHECKING:
-    from .exceptions import TemplateRenderError
-    from .template import Template
-
-logger = logging.getLogger(__name__)
-console = Console()
-console_err = Console(stderr=True)
-
-
-class IconManager:
-    """Centralized icon management system for consistent CLI display.
-
-    This class provides standardized icons for file types, status indicators,
-    and UI elements. Icons use Nerd Font glyphs for consistent display.
-
-    Categories:
-        - File types: .yaml, .j2, .json, .md, etc.
-        - Status: success, warning, error, info, skipped
-        - UI elements: folders, config, locks, etc.
-    """
-
-    # File Type Icons
-    FILE_FOLDER = "\uf07b"  #
-    FILE_DEFAULT = "\uf15b"  #
-    FILE_YAML = "\uf15c"  #
-    FILE_JSON = "\ue60b"  #
-    FILE_MARKDOWN = "\uf48a"  #
-    FILE_JINJA2 = "\ue235"  #
-    FILE_DOCKER = "\uf308"  #
-    FILE_COMPOSE = "\uf308"  #
-    FILE_SHELL = "\uf489"  #
-    FILE_PYTHON = "\ue73c"  #
-    FILE_TEXT = "\uf15c"  #
-
-    # Status Indicators
-    STATUS_SUCCESS = "\uf00c"  #  (check)
-    STATUS_ERROR = "\uf00d"  #  (times/x)
-    STATUS_WARNING = "\uf071"  #  (exclamation-triangle)
-    STATUS_INFO = "\uf05a"  #  (info-circle)
-    STATUS_SKIPPED = "\uf05e"  #  (ban/circle-slash)
-
-    # UI Elements
-    UI_CONFIG = "\ue5fc"  #
-    UI_LOCK = "\uf084"  #
-    UI_SETTINGS = "\uf013"  #
-    UI_ARROW_RIGHT = "\uf061"  #  (arrow-right)
-    UI_BULLET = "\uf111"  #  (circle)
-    UI_LIBRARY_GIT = "\uf418"  #  (git icon)
-    UI_LIBRARY_STATIC = "\uf07c"  #  (folder icon)
-
-    @classmethod
-    def get_file_icon(cls, file_path: str | Path) -> str:
-        """Get the appropriate icon for a file based on its extension or name.
-
-        Args:
-            file_path: Path to the file (can be string or Path object)
-
-        Returns:
-            Unicode icon character for the file type
-
-        Examples:
-            >>> IconManager.get_file_icon("config.yaml")
-            '\uf15c'
-            >>> IconManager.get_file_icon("template.j2")
-            '\ue235'
-        """
-        if isinstance(file_path, str):
-            file_path = Path(file_path)
-
-        file_name = file_path.name.lower()
-        suffix = file_path.suffix.lower()
-
-        # Check for Docker Compose files
-        compose_names = {
-            "docker-compose.yml",
-            "docker-compose.yaml",
-            "compose.yml",
-            "compose.yaml",
-        }
-        if file_name in compose_names or file_name.startswith("docker-compose"):
-            return cls.FILE_DOCKER
-
-        # Check by extension
-        extension_map = {
-            ".yaml": cls.FILE_YAML,
-            ".yml": cls.FILE_YAML,
-            ".json": cls.FILE_JSON,
-            ".md": cls.FILE_MARKDOWN,
-            ".j2": cls.FILE_JINJA2,
-            ".sh": cls.FILE_SHELL,
-            ".py": cls.FILE_PYTHON,
-            ".txt": cls.FILE_TEXT,
-        }
-
-        return extension_map.get(suffix, cls.FILE_DEFAULT)
-
-    @classmethod
-    def get_status_icon(cls, status: str) -> str:
-        """Get the appropriate icon for a status indicator.
-
-        Args:
-            status: Status type (success, error, warning, info, skipped)
-
-        Returns:
-            Unicode icon character for the status
-
-        Examples:
-            >>> IconManager.get_status_icon("success")
-            '✓'
-            >>> IconManager.get_status_icon("warning")
-            '⚠'
-        """
-        status_map = {
-            "success": cls.STATUS_SUCCESS,
-            "error": cls.STATUS_ERROR,
-            "warning": cls.STATUS_WARNING,
-            "info": cls.STATUS_INFO,
-            "skipped": cls.STATUS_SKIPPED,
-        }
-        return status_map.get(status.lower(), cls.STATUS_INFO)
-
-    @classmethod
-    def folder(cls) -> str:
-        """Get the folder icon."""
-        return cls.FILE_FOLDER
-
-    @classmethod
-    def config(cls) -> str:
-        """Get the config icon."""
-        return cls.UI_CONFIG
-
-    @classmethod
-    def lock(cls) -> str:
-        """Get the lock icon (for sensitive variables)."""
-        return cls.UI_LOCK
-
-    @classmethod
-    def arrow_right(cls) -> str:
-        """Get the right arrow icon (for showing transitions/changes)."""
-        return cls.UI_ARROW_RIGHT
-
-
-class DisplayManager:
-    """Handles all rich rendering for the CLI.
-
-    This class is responsible for ALL display output in the CLI, including:
-    - Status messages (success, error, warning, info)
-    - Tables (templates, summaries, results)
-    - Trees (file structures, configurations)
-    - Confirmation dialogs and prompts
-    - Headers and sections
-
-    Design Principles:
-    - All display logic should go through DisplayManager methods
-    - IconManager is ONLY used internally by DisplayManager
-    - External code should never directly call IconManager or console.print
-    - Consistent formatting across all display types
-    """
-
-    def __init__(self, quiet: bool = False):
-        """Initialize DisplayManager.
-
-        Args:
-            quiet: If True, suppress all non-error output
-        """
-        self.quiet = quiet
-
-    def display_templates_table(
-        self, templates: list, module_name: str, title: str
-    ) -> None:
-        """Display a table of templates with library type indicators.
-
-        Args:
-            templates: List of Template objects
-            module_name: Name of the module
-            title: Title for the table
-        """
-        if not templates:
-            logger.info(f"No templates found for module '{module_name}'")
-            return
-
-        logger.info(f"Listing {len(templates)} templates for module '{module_name}'")
-        table = Table(title=title)
-        table.add_column("ID", style="bold", no_wrap=True)
-        table.add_column("Name")
-        table.add_column("Tags")
-        table.add_column("Version", no_wrap=True)
-        table.add_column("Schema", no_wrap=True)
-        table.add_column("Library", no_wrap=True)
-
-        for template in templates:
-            name = template.metadata.name or "Unnamed Template"
-            tags_list = template.metadata.tags or []
-            tags = ", ".join(tags_list) if tags_list else "-"
-            version = (
-                str(template.metadata.version) if template.metadata.version else ""
-            )
-            schema = (
-                template.schema_version
-                if hasattr(template, "schema_version")
-                else "1.0"
-            )
-
-            # Show library with type indicator and color
-            library_name = template.metadata.library or ""
-            library_type = template.metadata.library_type or "git"
-
-            if library_type == "static":
-                # Static libraries: yellow/amber color with folder icon
-                library_display = (
-                    f"[yellow]{IconManager.UI_LIBRARY_STATIC} {library_name}[/yellow]"
-                )
-            else:
-                # Git libraries: blue color with git icon
-                library_display = (
-                    f"[blue]{IconManager.UI_LIBRARY_GIT} {library_name}[/blue]"
-                )
-
-            # Display qualified ID if present (e.g., "alloy.default")
-            display_id = template.id
-
-            table.add_row(display_id, name, tags, version, schema, library_display)
-
-        console.print(table)
-
-    def display_template_details(self, template: Template, template_id: str) -> None:
-        """Display template information panel and variables table.
-
-        Args:
-            template: Template instance to display
-            template_id: ID of the template
-        """
-        self._display_template_header(template, template_id)
-        self._display_file_tree(template)
-        self._display_variables_table(template)
-
-    def display_section_header(self, title: str, description: str | None) -> None:
-        """Display a section header."""
-        if description:
-            console.print(
-                f"\n[bold cyan]{title}[/bold cyan] [dim]- {description}[/dim]"
-            )
-        else:
-            console.print(f"\n[bold cyan]{title}[/bold cyan]")
-        console.print("─" * 40, style="dim")
-
-    def display_validation_error(self, message: str) -> None:
-        """Display a validation error message."""
-        self.display_message("error", message)
-
-    def display_message(
-        self, level: str, message: str, context: str | None = None
-    ) -> None:
-        """Display a message with consistent formatting.
-
-        Args:
-            level: Message level (error, warning, success, info)
-            message: The message to display
-            context: Optional context information
-        """
-        # Errors and warnings always go to stderr, even in quiet mode
-        # Success and info respect quiet mode and go to stdout
-        if level in ("error", "warning"):
-            output_console = console_err
-            should_print = True
-        else:
-            output_console = console
-            should_print = not self.quiet
-
-        if not should_print:
-            return
-
-        icon = IconManager.get_status_icon(level)
-        colors = {
-            "error": "red",
-            "warning": "yellow",
-            "success": "green",
-            "info": "blue",
-        }
-        color = colors.get(level, "white")
-
-        # Format message based on context
-        if context:
-            text = (
-                f"{level.capitalize()} in {context}: {message}"
-                if level in ("error", "warning")
-                else f"{context}: {message}"
-            )
-        else:
-            text = (
-                f"{level.capitalize()}: {message}"
-                if level in ("error", "warning")
-                else message
-            )
-
-        output_console.print(f"[{color}]{icon} {text}[/{color}]")
-
-        # Log appropriately
-        log_message = f"{context}: {message}" if context else message
-        log_methods = {
-            "error": logger.error,
-            "warning": logger.warning,
-            "success": logger.info,
-            "info": logger.info,
-        }
-        log_methods.get(level, logger.info)(log_message)
-
-    def display_error(self, message: str, context: str | None = None) -> None:
-        """Display an error message."""
-        self.display_message("error", message, context)
-
-    def display_warning(self, message: str, context: str | None = None) -> None:
-        """Display a warning message."""
-        self.display_message("warning", message, context)
-
-    def display_success(self, message: str, context: str | None = None) -> None:
-        """Display a success message."""
-        self.display_message("success", message, context)
-
-    def display_info(self, message: str, context: str | None = None) -> None:
-        """Display an informational message."""
-        self.display_message("info", message, context)
-
-    def display_version_incompatibility(
-        self, template_id: str, required_version: str, current_version: str
-    ) -> None:
-        """Display a version incompatibility error with upgrade instructions.
-
-        Args:
-            template_id: ID of the incompatible template
-            required_version: Minimum CLI version required by template
-            current_version: Current CLI version
-        """
-        console_err.print()
-        console_err.print(
-            f"[bold red]{IconManager.STATUS_ERROR} Version Incompatibility[/bold red]"
-        )
-        console_err.print()
-        console_err.print(
-            f"Template '[cyan]{template_id}[/cyan]' requires CLI version [green]{required_version}[/green] or higher."
-        )
-        console_err.print(f"Current CLI version: [yellow]{current_version}[/yellow]")
-        console_err.print()
-        console_err.print("[bold]Upgrade Instructions:[/bold]")
-        console_err.print(
-            f"  {IconManager.UI_ARROW_RIGHT} Run: [cyan]pip install --upgrade boilerplates[/cyan]"
-        )
-        console_err.print(
-            f"  {IconManager.UI_ARROW_RIGHT} Or install specific version: [cyan]pip install boilerplates=={required_version}[/cyan]"
-        )
-        console_err.print()
-
-        logger.error(
-            f"Template '{template_id}' requires CLI version {required_version}, "
-            f"current version is {current_version}"
-        )
-
-    def _display_template_header(self, template: Template, template_id: str) -> None:
-        """Display the header for a template with library information."""
-        template_name = template.metadata.name or "Unnamed Template"
-        version = (
-            str(template.metadata.version)
-            if template.metadata.version
-            else "Not specified"
-        )
-        schema = (
-            template.schema_version if hasattr(template, "schema_version") else "1.0"
-        )
-        description = template.metadata.description or "No description available"
-
-        # Get library information
-        library_name = template.metadata.library or ""
-        library_type = template.metadata.library_type or "git"
-
-        # Format library display with icon and color
-        if library_type == "static":
-            library_display = (
-                f"[yellow]{IconManager.UI_LIBRARY_STATIC} {library_name}[/yellow]"
-            )
-        else:
-            library_display = (
-                f"[blue]{IconManager.UI_LIBRARY_GIT} {library_name}[/blue]"
-            )
-
-        console.print(
-            f"[bold blue]{template_name} ({template_id} - [cyan]{version}[/cyan] - [magenta]schema {schema}[/magenta]) {library_display}[/bold blue]"
-        )
-        console.print(description)
-
-    def _build_file_tree(
-        self, root_label: str, files: list, get_file_info: callable
-    ) -> Tree:
-        """Build a file tree structure.
-
-        Args:
-            root_label: Label for root node
-            files: List of files to display
-            get_file_info: Function that takes a file and returns (path, display_name, color, extra_text)
-
-        Returns:
-            Tree object ready for display
-        """
-        file_tree = Tree(root_label)
-        tree_nodes = {Path("."): file_tree}
-
-        for file_item in sorted(files, key=lambda f: get_file_info(f)[0]):
-            path, display_name, color, extra_text = get_file_info(file_item)
-            parts = path.parts
-            current_path = Path(".")
-            current_node = file_tree
-
-            # Build directory structure
-            for part in parts[:-1]:
-                current_path = current_path / part
-                if current_path not in tree_nodes:
-                    new_node = current_node.add(
-                        f"{IconManager.folder()} [white]{part}[/white]"
-                    )
-                    tree_nodes[current_path] = new_node
-                current_node = tree_nodes[current_path]
-
-            # Add file
-            icon = IconManager.get_file_icon(display_name)
-            file_label = f"{icon} [{color}]{display_name}[/{color}]"
-            if extra_text:
-                file_label += f" {extra_text}"
-            current_node.add(file_label)
-
-        return file_tree
-
-    def _display_file_tree(self, template: Template) -> None:
-        """Display the file structure of a template."""
-        console.print()
-        console.print("[bold blue]Template File Structure:[/bold blue]")
-
-        def get_template_file_info(template_file):
-            display_name = (
-                template_file.output_path.name
-                if hasattr(template_file, "output_path")
-                else template_file.relative_path.name
-            )
-            return (template_file.relative_path, display_name, "white", None)
-
-        file_tree = self._build_file_tree(
-            f"{IconManager.folder()} [white]{template.id}[/white]",
-            template.template_files,
-            get_template_file_info,
-        )
-
-        if file_tree.children:
-            console.print(file_tree)
-
-    def _display_variables_table(self, template: Template) -> None:
-        """Display a table of variables for a template.
-
-        All variables and sections are always shown. Disabled sections/variables
-        are displayed with dimmed styling.
-
-        Args:
-            template: Template instance
-        """
-        if not (template.variables and template.variables.has_sections()):
-            return
-
-        console.print()
-        console.print("[bold blue]Template Variables:[/bold blue]")
-
-        variables_table = Table(show_header=True, header_style="bold blue")
-        variables_table.add_column("Variable", style="white", no_wrap=True)
-        variables_table.add_column("Type", style="magenta")
-        variables_table.add_column("Default", style="green")
-        variables_table.add_column("Description", style="white")
-
-        first_section = True
-        for section in template.variables.get_sections().values():
-            if not section.variables:
-                continue
-
-            if not first_section:
-                variables_table.add_row("", "", "", "", style="bright_black")
-            first_section = False
-
-            # Check if section is enabled AND dependencies are satisfied
-            is_enabled = section.is_enabled()
-            dependencies_satisfied = template.variables.is_section_satisfied(
-                section.key
-            )
-            is_dimmed = not (is_enabled and dependencies_satisfied)
-
-            # Only show (disabled) if section has no dependencies (dependencies make it obvious)
-            # Empty list means no dependencies (same as None)
-            has_dependencies = section.needs and len(section.needs) > 0
-            disabled_text = (
-                " (disabled)" if (is_dimmed and not has_dependencies) else ""
-            )
-
-            # For disabled sections, make entire heading bold and dim (don't include colored markup inside)
-            if is_dimmed:
-                # Build text without internal markup, then wrap entire thing in bold bright_black (dimmed appearance)
-                required_part = " (required)" if section.required else ""
-                header_text = f"[bold bright_black]{section.title}{required_part}{disabled_text}[/bold bright_black]"
-            else:
-                # For enabled sections, include the colored markup
-                required_text = (
-                    " [yellow](required)[/yellow]" if section.required else ""
-                )
-                header_text = (
-                    f"[bold]{section.title}{required_text}{disabled_text}[/bold]"
-                )
-            variables_table.add_row(header_text, "", "", "")
-            for var_name, variable in section.variables.items():
-                # Skip toggle variable in required sections (always enabled, no need to show)
-                if section.required and section.toggle and var_name == section.toggle:
-                    continue
-
-                # Check if variable's needs are satisfied
-                var_satisfied = template.variables.is_variable_satisfied(var_name)
-
-                # Dim the variable if section is dimmed OR variable needs are not satisfied
-                row_style = "bright_black" if (is_dimmed or not var_satisfied) else None
-
-                # Build default value display
-                # Special case: disabled bool variables show as "original → False"
-                if (is_dimmed or not var_satisfied) and variable.type == "bool":
-                    # Show that disabled bool variables are forced to False
-                    if (
-                        hasattr(variable, "_original_disabled")
-                        and variable._original_disabled is not False
-                    ):
-                        orig_val = str(variable._original_disabled)
-                        default_val = f"{orig_val} {IconManager.arrow_right()} False"
-                    else:
-                        default_val = "False"
-                # If origin is 'config' and original value differs from current, show: original → config_value
-                # BUT only for enabled variables (don't show arrow for disabled ones)
-                elif (
-                    not (is_dimmed or not var_satisfied)
-                    and variable.origin == "config"
-                    and hasattr(variable, "_original_stored")
-                    and variable.original_value != variable.value
-                ):
-                    # Format original value (use same display logic, but shorter)
-                    if variable.sensitive:
-                        orig_display = "********"
-                    elif (
-                        variable.original_value is None or variable.original_value == ""
-                    ):
-                        orig_display = "[dim](none)[/dim]"
-                    else:
-                        orig_val_str = str(variable.original_value)
-                        orig_display = (
-                            orig_val_str[:15] + "..."
-                            if len(orig_val_str) > 15
-                            else orig_val_str
-                        )
-
-                    # Get current (config) value display (without showing "(none)" since we have the arrow)
-                    config_display = variable.get_display_value(
-                        mask_sensitive=True, max_length=15, show_none=False
-                    )
-                    # If still empty even with show_none=False, show the actual value
-                    if not config_display:
-                        config_display = (
-                            str(variable.value) if variable.value else "(empty)"
-                        )
-
-                    # Highlight the arrow and config value in bold yellow to show it's a custom override
-                    default_val = f"{orig_display} [bold yellow]{IconManager.arrow_right()} {config_display}[/bold yellow]"
-                else:
-                    # Use variable's native get_display_value() method (shows "(none)" for empty)
-                    default_val = variable.get_display_value(
-                        mask_sensitive=True, max_length=30, show_none=True
-                    )
-
-                # Add lock icon for sensitive variables
-                sensitive_icon = f" {IconManager.lock()}" if variable.sensitive else ""
-                # Add required indicator for required variables
-                required_indicator = (
-                    " [yellow](required)[/yellow]" if variable.required else ""
-                )
-                var_display = f"  {var_name}{sensitive_icon}{required_indicator}"
-
-                variables_table.add_row(
-                    var_display,
-                    variable.type or "str",
-                    default_val,
-                    variable.description or "",
-                    style=row_style,
-                )
-
-        console.print(variables_table)
-
-    def display_file_generation_confirmation(
-        self,
-        output_dir: Path,
-        files: dict[str, str],
-        existing_files: list[Path] | None = None,
-    ) -> None:
-        """Display files to be generated with confirmation prompt."""
-        console.print()
-        console.print("[bold]Files to be generated:[/bold]")
-
-        def get_file_generation_info(file_path_str):
-            file_path = Path(file_path_str)
-            file_name = file_path.parts[-1] if file_path.parts else file_path.name
-            full_path = output_dir / file_path
-
-            if existing_files and full_path in existing_files:
-                return (file_path, file_name, "yellow", "[red](will overwrite)[/red]")
-            else:
-                return (file_path, file_name, "green", None)
-
-        file_tree = self._build_file_tree(
-            f"{IconManager.folder()} [cyan]{output_dir.resolve()}[/cyan]",
-            files.keys(),
-            get_file_generation_info,
-        )
-
-        console.print(file_tree)
-        console.print()
-
-    def display_config_tree(
-        self, spec: dict, module_name: str, show_all: bool = False
-    ) -> None:
-        """Display configuration spec as a tree view.
-
-        Args:
-            spec: The configuration spec dictionary
-            module_name: Name of the module
-            show_all: If True, show all details including descriptions
-        """
-        if not spec:
-            console.print(
-                f"[yellow]No configuration found for module '{module_name}'[/yellow]"
-            )
-            return
-
-        # Create root tree node
-        tree = Tree(
-            f"[bold blue]{IconManager.config()} {module_name.capitalize()} Configuration[/bold blue]"
-        )
-
-        for section_name, section_data in spec.items():
-            if not isinstance(section_data, dict):
-                continue
-
-            # Determine if this is a section with variables
-            # Guard against None from empty YAML sections
-            section_vars = section_data.get("vars") or {}
-            section_desc = section_data.get("description", "")
-            section_required = section_data.get("required", False)
-            section_toggle = section_data.get("toggle", None)
-            section_needs = section_data.get("needs", None)
-
-            # Build section label
-            section_label = f"[cyan]{section_name}[/cyan]"
-            if section_required:
-                section_label += " [yellow](required)[/yellow]"
-            if section_toggle:
-                section_label += f" [dim](toggle: {section_toggle})[/dim]"
-            if section_needs:
-                needs_str = (
-                    ", ".join(section_needs)
-                    if isinstance(section_needs, list)
-                    else section_needs
-                )
-                section_label += f" [dim](needs: {needs_str})[/dim]"
-
-            if show_all and section_desc:
-                section_label += f"\n  [dim]{section_desc}[/dim]"
-
-            section_node = tree.add(section_label)
-
-            # Add variables
-            if section_vars:
-                for var_name, var_data in section_vars.items():
-                    if isinstance(var_data, dict):
-                        var_type = var_data.get("type", "string")
-                        var_default = var_data.get("default", "")
-                        var_desc = var_data.get("description", "")
-                        var_sensitive = var_data.get("sensitive", False)
-
-                        # Build variable label
-                        var_label = f"[green]{var_name}[/green] [dim]({var_type})[/dim]"
-
-                        if var_default is not None and var_default != "":
-                            display_val = (
-                                "********" if var_sensitive else str(var_default)
-                            )
-                            if not var_sensitive and len(display_val) > 30:
-                                display_val = display_val[:27] + "..."
-                            var_label += f" = [yellow]{display_val}[/yellow]"
-
-                        if show_all and var_desc:
-                            var_label += f"\n    [dim]{var_desc}[/dim]"
-
-                        section_node.add(var_label)
-                    else:
-                        # Simple key-value pair
-                        section_node.add(
-                            f"[green]{var_name}[/green] = [yellow]{var_data}[/yellow]"
-                        )
-
-        console.print(tree)
-
-    def display_next_steps(self, next_steps: str, variable_values: dict) -> None:
-        """Display next steps after template generation, rendering them as a Jinja2 template.
-
-        Args:
-            next_steps: The next_steps string from template metadata (may contain Jinja2 syntax)
-            variable_values: Dictionary of variable values to use for rendering
-        """
-        if not next_steps:
-            return
-
-        console.print("\n[bold cyan]Next Steps:[/bold cyan]")
-
-        try:
-            from jinja2 import Template as Jinja2Template
-
-            next_steps_template = Jinja2Template(next_steps)
-            rendered_next_steps = next_steps_template.render(variable_values)
-            console.print(rendered_next_steps)
-        except Exception as e:
-            logger.warning(f"Failed to render next_steps as template: {e}")
-            # Fallback to plain text if rendering fails
-            console.print(next_steps)
-
-    def display_status_table(
-        self,
-        title: str,
-        rows: list[tuple[str, str, bool]],
-        columns: tuple[str, str] = ("Item", "Status"),
-    ) -> None:
-        """Display a status table with success/error indicators.
-
-        Args:
-            title: Table title
-            rows: List of tuples (name, message, success_bool)
-            columns: Column headers (name_header, status_header)
-        """
-        table = Table(title=title, show_header=True)
-        table.add_column(columns[0], style="cyan", no_wrap=True)
-        table.add_column(columns[1])
-
-        for name, message, success in rows:
-            status_style = "green" if success else "red"
-            status_icon = IconManager.get_status_icon("success" if success else "error")
-            table.add_row(
-                name, f"[{status_style}]{status_icon} {message}[/{status_style}]"
-            )
-
-        console.print(table)
-
-    def display_summary_table(self, title: str, items: dict[str, str]) -> None:
-        """Display a simple two-column summary table.
-
-        Args:
-            title: Table title
-            items: Dictionary of key-value pairs to display
-        """
-        table = Table(title=title, show_header=False, box=None, padding=(0, 2))
-        table.add_column(style="bold")
-        table.add_column()
-
-        for key, value in items.items():
-            table.add_row(key, value)
-
-        console.print(table)
-
-    def display_file_operation_table(self, files: list[tuple[str, int, str]]) -> None:
-        """Display a table of file operations with sizes and statuses.
-
-        Args:
-            files: List of tuples (file_path, size_bytes, status)
-        """
-        table = Table(
-            show_header=True, header_style="bold cyan", box=None, padding=(0, 1)
-        )
-        table.add_column("File", style="white", no_wrap=False)
-        table.add_column("Size", justify="right", style="dim")
-        table.add_column("Status", style="yellow")
-
-        for file_path, size_bytes, status in files:
-            # Format size
-            if size_bytes < 1024:
-                size_str = f"{size_bytes}B"
-            elif size_bytes < 1024 * 1024:
-                size_str = f"{size_bytes / 1024:.1f}KB"
-            else:
-                size_str = f"{size_bytes / (1024 * 1024):.1f}MB"
-
-            table.add_row(str(file_path), size_str, status)
-
-        console.print(table)
-
-    def display_heading(
-        self, text: str, icon_type: str | None = None, style: str = "bold"
-    ) -> None:
-        """Display a heading with optional icon.
-
-        Args:
-            text: Heading text
-            icon_type: Type of icon to display (e.g., 'folder', 'file', 'config')
-            style: Rich style to apply
-        """
-        if icon_type:
-            icon = self._get_icon_by_type(icon_type)
-            console.print(f"[{style}]{icon} {text}[/{style}]")
-        else:
-            console.print(f"[{style}]{text}[/{style}]")
-
-    def display_warning_with_confirmation(
-        self, message: str, details: list[str] | None = None, default: bool = False
-    ) -> bool:
-        """Display a warning message with optional details and get confirmation.
-
-        Args:
-            message: Warning message to display
-            details: Optional list of detail lines to show
-            default: Default value for confirmation
-
-        Returns:
-            True if user confirms, False otherwise
-        """
-        icon = IconManager.get_status_icon("warning")
-        console.print(f"\n[yellow]{icon} {message}[/yellow]")
-
-        if details:
-            for detail in details:
-                console.print(f"[yellow]  {detail}[/yellow]")
-
-        from rich.prompt import Confirm
-
-        return Confirm.ask("Continue?", default=default)
-
-    def display_skipped(self, message: str, reason: str | None = None) -> None:
-        """Display a skipped/disabled message.
-
-        Args:
-            message: The main message to display
-            reason: Optional reason why it was skipped
-        """
-        icon = IconManager.get_status_icon("skipped")
-        if reason:
-            console.print(f"\n[dim]{icon} {message} (skipped - {reason})[/dim]")
-        else:
-            console.print(f"\n[dim]{icon} {message} (skipped)[/dim]")
-
-    def get_lock_icon(self) -> str:
-        """Get the lock icon for sensitive variables.
-
-        Returns:
-            Lock icon unicode character
-        """
-        return IconManager.lock()
-
-    def _get_icon_by_type(self, icon_type: str) -> str:
-        """Get icon by semantic type name.
-
-        Args:
-            icon_type: Type of icon (e.g., 'folder', 'file', 'config', 'lock')
-
-        Returns:
-            Icon unicode character
-        """
-        icon_map = {
-            "folder": IconManager.folder(),
-            "file": IconManager.FILE_DEFAULT,
-            "config": IconManager.config(),
-            "lock": IconManager.lock(),
-            "arrow": IconManager.arrow_right(),
-        }
-        return icon_map.get(icon_type, "")
-
-    def display_template_render_error(
-        self, error: "TemplateRenderError", context: str | None = None
-    ) -> None:
-        """Display a detailed template rendering error with context and suggestions.
-
-        Args:
-            error: TemplateRenderError exception with detailed error information
-            context: Optional context information (e.g., template ID)
-        """
-        from rich.panel import Panel
-        from rich.syntax import Syntax
-
-        # Always display errors to stderr
-        # Display main error header
-        icon = IconManager.get_status_icon("error")
-        if context:
-            console_err.print(
-                f"\n[red bold]{icon} Template Rendering Error[/red bold] [dim]({context})[/dim]"
-            )
-        else:
-            console_err.print(f"\n[red bold]{icon} Template Rendering Error[/red bold]")
-
-        console_err.print()
-
-        # Display error message
-        if error.file_path:
-            console_err.print(
-                f"[red]Error in file:[/red] [cyan]{error.file_path}[/cyan]"
-            )
-            if error.line_number:
-                location = f"Line {error.line_number}"
-                if error.column:
-                    location += f", Column {error.column}"
-                console_err.print(f"[red]Location:[/red] {location}")
-
-        console_err.print(
-            f"[red]Message:[/red] {str(error.original_error) if error.original_error else str(error)}"
-        )
-        console_err.print()
-
-        # Display code context if available
-        if error.context_lines:
-            console_err.print("[bold cyan]Code Context:[/bold cyan]")
-
-            # Build the context text
-            context_text = "\n".join(error.context_lines)
-
-            # Display in a panel with syntax highlighting if possible
-            file_ext = Path(error.file_path).suffix if error.file_path else ""
-            if file_ext == ".j2":
-                # Strip .j2 and check for a base extension; only bare .j2 files
-                # get the Jinja2 lexer, others fall back to a plain panel below
-                base_name = Path(error.file_path).stem
-                base_ext = Path(base_name).suffix
-                lexer = "jinja2" if not base_ext else None
-            else:
-                lexer = None
-
-            try:
-                if lexer:
-                    syntax = Syntax(
-                        context_text, lexer, line_numbers=False, theme="monokai"
-                    )
-                    console_err.print(Panel(syntax, border_style="red", padding=(1, 2)))
-                else:
-                    console_err.print(
-                        Panel(context_text, border_style="red", padding=(1, 2))
-                    )
-            except Exception:
-                # Fallback to plain panel if syntax highlighting fails
-                console_err.print(
-                    Panel(context_text, border_style="red", padding=(1, 2))
-                )
-
-            console_err.print()
-
-        # Display suggestions if available
-        if error.suggestions:
-            console_err.print("[bold yellow]Suggestions:[/bold yellow]")
-            for i, suggestion in enumerate(error.suggestions, 1):
-                bullet = IconManager.UI_BULLET
-                console_err.print(f"  [yellow]{bullet}[/yellow] {suggestion}")
-            console_err.print()
-
-        # Display variable context in debug mode
-        if error.variable_context:
-            console_err.print("[bold blue]Available Variables (Debug):[/bold blue]")
-            var_list = ", ".join(sorted(error.variable_context.keys()))
-            console_err.print(f"[dim]{var_list}[/dim]")
-            console_err.print()

+ 180 - 0
cli/core/display/__init__.py

@@ -0,0 +1,180 @@
+"""Display module for CLI output rendering.
+
+This package provides centralized display management built on composition.
+DisplayManager composes specialized display components behind a flat, cohesive API.
+"""
+
+from __future__ import annotations
+
+from rich.console import Console
+
+from .display_base import BaseDisplay
+from .display_icons import IconManager
+from .display_settings import DisplaySettings
+from .display_status import StatusDisplay
+from .display_table import TableDisplay
+from .display_template import TemplateDisplay
+from .display_variable import VariableDisplay
+
+# Console instances for stdout and stderr
+console = Console()
+console_err = Console(stderr=True)
+
+
+class DisplayManager:
+    """Main display coordinator using composition.
+
+    This class composes specialized display components to provide a unified API.
+    Each component handles a specific concern (status, tables, templates, variables).
+
+    Design Principles:
+    - Composition over inheritance
+    - Explicit dependencies
+    - Clear separation of concerns
+    - Easy to test and extend
+    """
+
+    def __init__(self, quiet: bool = False, settings: DisplaySettings | None = None):
+        """Initialize DisplayManager with composed display components.
+
+        Args:
+            quiet: If True, suppress all non-error output
+            settings: Optional DisplaySettings instance for customization
+        """
+        self.quiet = quiet
+        self.settings = settings or DisplaySettings()
+
+        # Create base display component (includes utilities)
+        self.base = BaseDisplay(self.settings, quiet)
+
+        # Create specialized display components
+        self.status = StatusDisplay(self.settings, quiet, self.base)
+        self.variables = VariableDisplay(self.settings, self.base)
+        self.templates = TemplateDisplay(self.settings, self.base, self.variables, self.status)
+        self.tables = TableDisplay(self.settings, self.base)
+
+    # ===== Delegate to base display =====
+    def text(self, text: str, style: str | None = None) -> None:
+        """Display plain text."""
+        return self.base.text(text, style)
+
+    def heading(self, text: str, style: str | None = None) -> None:
+        """Display a heading."""
+        return self.base.heading(text, style)
+
+    def section(self, title: str, description: str | None = None) -> None:
+        """Display a section header with optional description.
+
+        Args:
+            title: Section title
+            description: Optional section description
+        """
+        self.base.text("")
+        self.base.text(f"[bold cyan]{title}[/bold cyan]")
+        if description:
+            self.base.text(f"[dim]{description}[/dim]")
+
+    def table(
+        self,
+        headers: list[str] | None = None,
+        rows: list[tuple] | None = None,
+        title: str | None = None,
+        show_header: bool = True,
+        borderless: bool = False,
+    ) -> None:
+        """Display a table."""
+        return self.base.table(headers, rows, title, show_header, borderless)
+
+    def tree(self, root_label: str, nodes: dict | list) -> None:
+        """Display a tree."""
+        return self.base.tree(root_label, nodes)
+
+    def code(self, code_text: str, language: str | None = None) -> None:
+        """Display code."""
+        return self.base.code(code_text, language)
+
+    def progress(self, *columns):
+        """Create a progress bar."""
+        return self.base.progress(*columns)
+
+    def file_tree(
+        self,
+        root_label: str,
+        files: list,
+        file_info_fn: callable,
+        title: str | None = None,
+    ) -> None:
+        """Display a file tree structure."""
+        return self.base.file_tree(root_label, files, file_info_fn, title)
+
+    # ===== Formatting utilities =====
+    def truncate(self, value: str, max_length: int | None = None) -> str:
+        """Truncate string value."""
+        return self.base.truncate(value, max_length)
+
+    def format_file_size(self, size_bytes: int) -> str:
+        """Format file size in human-readable format."""
+        return self.base.format_file_size(size_bytes)
+
+    def data_table(
+        self,
+        columns: list[dict],
+        rows: list,
+        title: str | None = None,
+        row_formatter: callable | None = None,
+    ) -> None:
+        """Display a data table with configurable columns."""
+        return self.tables.data_table(columns, rows, title, row_formatter)
+
+    def display_status_table(
+        self,
+        title: str,
+        rows: list[tuple[str, str, bool]],
+        columns: tuple[str, str] = ("Item", "Status"),
+    ) -> None:
+        """Display a status table with success/error indicators."""
+        return self.tables.render_status_table(title, rows, columns)
+
+    # ===== Delegate to status display =====
+    def error(self, message: str, context: str | None = None, details: str | None = None) -> None:
+        """Display an error message."""
+        return self.status.error(message, context, details)
+
+    def warning(self, message: str, context: str | None = None, details: str | None = None) -> None:
+        """Display a warning message."""
+        return self.status.warning(message, context, details)
+
+    def success(self, message: str, context: str | None = None) -> None:
+        """Display a success message."""
+        return self.status.success(message, context)
+
+    def info(self, message: str, context: str | None = None) -> None:
+        """Display an info message."""
+        return self.status.info(message, context)
+
+    def skipped(self, message: str, reason: str | None = None) -> None:
+        """Display skipped message."""
+        return self.status.skipped(message, reason)
+
+    # ===== Helper methods =====
+    def get_lock_icon(self) -> str:
+        """Get lock icon."""
+        return self.base.get_lock_icon()
+
+    def print_table(self, table) -> None:
+        """Print a pre-built Rich Table object.
+
+        Args:
+            table: Rich Table object to print
+        """
+        return self.base._print_table(table)
+
+
+# Export public API
+__all__ = [
+    "DisplayManager",
+    "DisplaySettings",
+    "IconManager",
+    "console",
+    "console_err",
+]
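Reviewer note: the composition/delegation shape of `DisplayManager` can be sketched in a few lines. The names below are illustrative stand-ins, not the real components from this diff:

```python
# Sketch of the composition pattern: a facade owns specialized
# components and forwards calls to them via thin delegation.
# StatusComponent/Facade are hypothetical stand-ins for illustration.
class StatusComponent:
    """Collects messages; honors a quiet flag, as StatusDisplay does."""

    def __init__(self, quiet: bool):
        self.quiet = quiet
        self.messages: list[str] = []

    def info(self, message: str) -> None:
        if not self.quiet:
            self.messages.append(f"Info: {message}")


class Facade:
    """Composes components and exposes a flat public API."""

    def __init__(self, quiet: bool = False):
        self.status = StatusComponent(quiet)

    def info(self, message: str) -> None:
        # Delegation keeps each concern isolated and testable.
        return self.status.info(message)
```

The payoff of this shape is that each component can be unit-tested without constructing the whole facade.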

+ 308 - 0
cli/core/display/display_base.py

@@ -0,0 +1,308 @@
+"""Base display methods for DisplayManager."""
+
+from __future__ import annotations
+
+import logging
+from pathlib import Path
+
+from rich.console import Console
+from rich.progress import Progress
+from rich.syntax import Syntax
+from rich.table import Table
+from rich.tree import Tree
+
+from .display_icons import IconManager
+from .display_settings import DisplaySettings
+
+logger = logging.getLogger(__name__)
+console = Console()
+
+
+class BaseDisplay:
+    """Base display methods and utilities.
+
+    Provides fundamental display methods (text, heading, table, tree, code, progress)
+    and utility/helper methods for formatting.
+    """
+
+    def __init__(self, settings: DisplaySettings, quiet: bool = False):
+        """Initialize BaseDisplay.
+
+        Args:
+            settings: Display settings for formatting
+            quiet: If True, suppress non-error output
+        """
+        self.settings = settings
+        self.quiet = quiet
+
+    def heading(self, text: str, style: str | None = None) -> None:
+        """Display a standardized heading.
+
+        Args:
+            text: Heading text
+            style: Optional style override (defaults to STYLE_HEADER from settings)
+        """
+        if style is None:
+            style = self.settings.STYLE_HEADER
+        console.print(f"[{style}]{text}[/{style}]")
+        console.print("")  # Add newline after heading
+
+    def text(self, text: str, style: str | None = None) -> None:
+        """Display plain text with optional styling.
+
+        Args:
+            text: Text to display
+            style: Optional Rich style markup
+        """
+        if style:
+            console.print(f"[{style}]{text}[/{style}]")
+        else:
+            console.print(text)
+
+    def table(
+        self,
+        headers: list[str] | None = None,
+        rows: list[tuple] | None = None,
+        title: str | None = None,
+        show_header: bool = True,
+        borderless: bool = False,
+    ) -> None:
+        """Display a standardized table.
+
+        Args:
+            headers: Column headers (if None, no headers)
+            rows: List of tuples, one per row
+            title: Optional table title
+            show_header: Whether to show header row
+            borderless: If True, use the wider borderless padding (all tables render with box=None)
+        """
+        table = Table(
+            title=title,
+            show_header=show_header and headers is not None,
+            header_style=self.settings.STYLE_TABLE_HEADER,
+            box=None,
+            padding=self.settings.PADDING_TABLE_NORMAL if borderless else self.settings.PADDING_TABLE_COMPACT,
+        )
+
+        # Add columns
+        if headers:
+            for header in headers:
+                table.add_column(header)
+        elif rows and len(rows) > 0:
+            # No headers, but need columns for data
+            for _ in range(len(rows[0])):
+                table.add_column()
+
+        # Add rows
+        if rows:
+            for row in rows:
+                table.add_row(*[str(cell) for cell in row])
+
+        console.print(table)
+
+    def tree(self, root_label: str, nodes: dict | list | Tree) -> None:
+        """Display a tree structure.
+
+        Args:
+            root_label: Label for the root node
+            nodes: Hierarchical structure (dict, list, or pre-built Tree)
+        """
+        if isinstance(nodes, Tree):
+            console.print(nodes)
+        else:
+            tree = Tree(root_label)
+            self._build_tree_nodes(tree, nodes)
+            console.print(tree)
+
+    def _build_tree_nodes(self, parent, nodes):
+        """Recursively build tree nodes.
+
+        Args:
+            parent: Parent tree node
+            nodes: Dict or list of child nodes
+        """
+        if isinstance(nodes, dict):
+            for key, value in nodes.items():
+                if isinstance(value, (dict, list)):
+                    branch = parent.add(str(key))
+                    self._build_tree_nodes(branch, value)
+                else:
+                    parent.add(f"{key}: {value}")
+        elif isinstance(nodes, list):
+            for item in nodes:
+                if isinstance(item, (dict, list)):
+                    self._build_tree_nodes(parent, item)
+                else:
+                    parent.add(str(item))
+
+    def _print_tree(self, tree) -> None:
+        """Print a pre-built Rich Tree object.
+
+        Args:
+            tree: Rich Tree object to print
+        """
+        console.print(tree)
+
+    def _print_table(self, table) -> None:
+        """Print a pre-built Rich Table object.
+
+        Enforces consistent header styling for all tables.
+
+        Args:
+            table: Rich Table object to print
+        """
+        # Enforce consistent header style for all tables
+        table.header_style = self.settings.STYLE_TABLE_HEADER
+        console.print(table)
+
+    def _print_markdown(self, markdown) -> None:
+        """Print a pre-built Rich Markdown object.
+
+        Args:
+            markdown: Rich Markdown object to print
+        """
+        console.print(markdown)
+
+    def code(self, code_text: str, language: str | None = None) -> None:
+        """Display code with optional syntax highlighting.
+
+        Args:
+            code_text: Code to display
+            language: Programming language for syntax highlighting
+        """
+        if language:
+            syntax = Syntax(code_text, language, theme="monokai", line_numbers=False)
+            console.print(syntax)
+        else:
+            # Plain code block without highlighting
+            console.print(f"[dim]{code_text}[/dim]")
+
+    def progress(self, *columns):
+        """Create a Rich Progress context manager with standardized console.
+
+        Args:
+            *columns: Progress columns (e.g., SpinnerColumn(), TextColumn())
+
+        Returns:
+            Progress context manager
+
+        Example:
+            with display.progress(
+                SpinnerColumn(), TextColumn("[progress.description]{task.description}")
+            ) as progress:
+                task = progress.add_task("Processing...", total=None)
+                # do work
+                progress.remove_task(task)
+        """
+        return Progress(*columns, console=console)
+
+    # ===== Formatting Utilities =====
+
+    def truncate(self, value: str, max_length: int | None = None) -> str:
+        """Truncate a string value if it exceeds maximum length.
+
+        Args:
+            value: String value to truncate
+            max_length: Maximum length (uses default if None)
+
+        Returns:
+            Truncated string with suffix if needed
+        """
+        if max_length is None:
+            max_length = self.settings.VALUE_MAX_LENGTH_DEFAULT
+
+        if max_length > 0 and len(value) > max_length:
+            return value[: max_length - len(self.settings.TRUNCATION_SUFFIX)] + self.settings.TRUNCATION_SUFFIX
+        return value
+
+    def format_file_size(self, size_bytes: int) -> str:
+        """Format file size in human-readable format (B, KB, MB).
+
+        Args:
+            size_bytes: Size in bytes
+
+        Returns:
+            Formatted size string (e.g., "1.5KB", "2.3MB")
+        """
+        if size_bytes < self.settings.SIZE_KB_THRESHOLD:
+            return f"{size_bytes}B"
+        if size_bytes < self.settings.SIZE_MB_THRESHOLD:
+            kb = size_bytes / self.settings.SIZE_KB_THRESHOLD
+            return f"{kb:.{self.settings.SIZE_DECIMAL_PLACES}f}KB"
+        mb = size_bytes / self.settings.SIZE_MB_THRESHOLD
+        return f"{mb:.{self.settings.SIZE_DECIMAL_PLACES}f}MB"
+
+    def file_tree(
+        self,
+        root_label: str,
+        files: list,
+        file_info_fn: callable,
+        title: str | None = None,
+    ) -> None:
+        """Display a file tree structure.
+
+        Args:
+            root_label: Label for root node (e.g., "📁 my-project")
+            files: List of file items to display
+            file_info_fn: Function that takes a file and returns
+                         (path, display_name, color, extra_text) where:
+                         - path: Path object for directory structure
+                         - display_name: Name to show for the file
+                         - color: Rich color for the filename
+                         - extra_text: Optional additional text
+            title: Optional heading to display before tree
+        """
+        if title:
+            self.heading(title)
+
+        tree = Tree(root_label)
+        tree_nodes = {Path(): tree}
+
+        for file_item in sorted(files, key=lambda f: file_info_fn(f)[0]):
+            path, display_name, color, extra_text = file_info_fn(file_item)
+            parts = path.parts
+            current_path = Path()
+            current_node = tree
+
+            # Build directory structure
+            for part in parts[:-1]:
+                current_path = current_path / part
+                if current_path not in tree_nodes:
+                    new_node = current_node.add(f"{IconManager.folder()} [white]{part}[/white]")
+                    tree_nodes[current_path] = new_node
+                current_node = tree_nodes[current_path]
+
+            # Add file
+            icon = IconManager.get_file_icon(display_name)
+            file_label = f"{icon} [{color}]{display_name}[/{color}]"
+            if extra_text:
+                file_label += f" {extra_text}"
+            current_node.add(file_label)
+
+        console.print(tree)
+
+    def _get_icon_by_type(self, icon_type: str) -> str:
+        """Get icon by semantic type name.
+
+        Args:
+            icon_type: Type of icon (e.g., 'folder', 'file', 'config', 'lock')
+
+        Returns:
+            Icon unicode character
+        """
+        icon_map = {
+            "folder": IconManager.folder(),
+            "file": IconManager.FILE_DEFAULT,
+            "config": IconManager.config(),
+            "lock": IconManager.lock(),
+            "arrow": IconManager.arrow_right(),
+        }
+        return icon_map.get(icon_type, "")
+
+    def get_lock_icon(self) -> str:
+        """Get the lock icon for sensitive variables.
+
+        Returns:
+            Lock icon unicode character
+        """
+        return IconManager.lock()
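One note for reviewers: `format_file_size` is pure arithmetic and easy to verify in isolation. A standalone sketch with the `DisplaySettings` defaults from this diff inlined (not the real API):

```python
# Standalone mirror of BaseDisplay.format_file_size with the
# DisplaySettings defaults (1024/1024*1024 thresholds, 1 decimal) inlined.
SIZE_KB_THRESHOLD = 1024
SIZE_MB_THRESHOLD = 1024 * 1024
SIZE_DECIMAL_PLACES = 1


def format_file_size(size_bytes: int) -> str:
    """Format a byte count as B, KB, or MB."""
    if size_bytes < SIZE_KB_THRESHOLD:
        return f"{size_bytes}B"
    if size_bytes < SIZE_MB_THRESHOLD:
        return f"{size_bytes / SIZE_KB_THRESHOLD:.{SIZE_DECIMAL_PLACES}f}KB"
    return f"{size_bytes / SIZE_MB_THRESHOLD:.{SIZE_DECIMAL_PLACES}f}MB"
```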

+ 193 - 0
cli/core/display/display_icons.py

@@ -0,0 +1,193 @@
+"""Icon management for consistent CLI display."""
+
+from __future__ import annotations
+
+from pathlib import Path
+from typing import ClassVar
+
+
+class IconManager:
+    """Centralized icon management system for consistent CLI display.
+
+    This class provides standardized icons for file types, status indicators,
+    and UI elements. Icons use Nerd Font glyphs for consistent display.
+
+    Categories:
+        - File types: .yaml, .j2, .json, .md, etc.
+        - Status: success, warning, error, info, skipped
+        - UI elements: folders, config, locks, etc.
+    """
+
+    # File Type Icons
+    FILE_FOLDER = "\uf07b"
+    FILE_DEFAULT = "\uf15b"
+    FILE_YAML = "\uf15c"
+    FILE_JSON = "\ue60b"
+    FILE_MARKDOWN = "\uf48a"
+    FILE_JINJA2 = "\ue235"
+    FILE_DOCKER = "\uf308"
+    FILE_COMPOSE = "\uf308"
+    FILE_SHELL = "\uf489"
+    FILE_PYTHON = "\ue73c"
+    FILE_TEXT = "\uf15c"
+
+    # Status Indicators
+    STATUS_SUCCESS = "\uf00c"  #  (check)
+    STATUS_ERROR = "\uf00d"  #  (times/x)
+    STATUS_WARNING = "\uf071"  #  (exclamation-triangle)
+    STATUS_INFO = "\uf05a"  #  (info-circle)
+    STATUS_SKIPPED = "\uf05e"  #  (ban/circle-slash)
+
+    # UI Elements
+    UI_CONFIG = "\ue5fc"
+    UI_LOCK = "\uf084"
+    UI_SETTINGS = "\uf013"
+    UI_ARROW_RIGHT = "\uf061"  #  (arrow-right)
+    UI_BULLET = "\uf111"  #  (circle)
+    UI_LIBRARY_GIT = "\uf418"  #  (git icon)
+    UI_LIBRARY_STATIC = "\uf07c"  #  (folder icon)
+
+    # Shortcode Mappings (emoji-style codes to Nerd Font icons)
+    # Format: ":code:" -> "\uf000"
+    #
+    # Usage:
+    # 1. In regular text: ":mycode: Some text" - icon replaces shortcode inline
+    # 2. In markdown lists: "- :mycode: List item" - icon replaces bullet with color
+    #
+    # To add new shortcodes:
+    # 1. Add entry to this dict: ":mycode:": "\uf000"
+    # 2. Use in template descriptions or markdown content
+    # 3. Shortcodes are automatically replaced when markdown is rendered
+    # 4. List items starting with shortcodes get colored icons instead of bullets
+    #
+    # Find Nerd Font codes at: https://www.nerdfonts.com/cheat-sheet
+    SHORTCODES: ClassVar[dict[str, str]] = {
+        ":warning:": "\uf071",  #  (exclamation-triangle)
+        ":info:": "\uf05a",  #  (info-circle)
+        ":check:": "\uf00c",  #  (check)
+        ":error:": "\uf00d",  #  (times/x)
+        ":lock:": "\uf084",  #  (lock)
+        ":folder:": "\uf07b",  #  (folder)
+        ":file:": "\uf15b",  #  (file)
+        ":gear:": "\uf013",  #  (settings/gear)
+        ":rocket:": "\uf135",  #  (rocket)
+        ":star:": "\uf005",  #  (star)
+        ":lightning:": "\uf0e7",  #  (bolt/lightning)
+        ":cloud:": "\uf0c2",  #  (cloud)
+        ":database:": "\uf1c0",  #  (database)
+        ":network:": "\uf6ff",  #  (network)
+        ":docker:": "\uf308",  #  (docker)
+        ":kubernetes:": "\ue287",  #  (kubernetes/helm)
+    }
+
+    @classmethod
+    def get_file_icon(cls, file_path: str | Path) -> str:
+        """Get the appropriate icon for a file based on its extension or name.
+
+        Args:
+            file_path: Path to the file (can be string or Path object)
+
+        Returns:
+            Unicode icon character for the file type
+
+        Examples:
+            >>> IconManager.get_file_icon("config.yaml")
+            '\uf15c'
+            >>> IconManager.get_file_icon("template.j2")
+            '\ue235'
+        """
+        if isinstance(file_path, str):
+            file_path = Path(file_path)
+
+        file_name = file_path.name.lower()
+        suffix = file_path.suffix.lower()
+
+        # Check for Docker Compose files
+        compose_names = {
+            "docker-compose.yml",
+            "docker-compose.yaml",
+            "compose.yml",
+            "compose.yaml",
+        }
+        if file_name in compose_names or file_name.startswith("docker-compose"):
+            return cls.FILE_DOCKER
+
+        # Check by extension
+        extension_map = {
+            ".yaml": cls.FILE_YAML,
+            ".yml": cls.FILE_YAML,
+            ".json": cls.FILE_JSON,
+            ".md": cls.FILE_MARKDOWN,
+            ".j2": cls.FILE_JINJA2,
+            ".sh": cls.FILE_SHELL,
+            ".py": cls.FILE_PYTHON,
+            ".txt": cls.FILE_TEXT,
+        }
+
+        return extension_map.get(suffix, cls.FILE_DEFAULT)
+
+    @classmethod
+    def get_status_icon(cls, status: str) -> str:
+        """Get the appropriate icon for a status indicator.
+
+        Args:
+            status: Status type (success, error, warning, info, skipped)
+
+        Returns:
+            Unicode icon character for the status
+
+        Examples:
+            >>> IconManager.get_status_icon("success")
+            '\uf00c'
+            >>> IconManager.get_status_icon("warning")
+            '\uf071'
+        """
+        status_map = {
+            "success": cls.STATUS_SUCCESS,
+            "error": cls.STATUS_ERROR,
+            "warning": cls.STATUS_WARNING,
+            "info": cls.STATUS_INFO,
+            "skipped": cls.STATUS_SKIPPED,
+        }
+        return status_map.get(status.lower(), cls.STATUS_INFO)
+
+    @classmethod
+    def folder(cls) -> str:
+        """Get the folder icon."""
+        return cls.FILE_FOLDER
+
+    @classmethod
+    def config(cls) -> str:
+        """Get the config icon."""
+        return cls.UI_CONFIG
+
+    @classmethod
+    def lock(cls) -> str:
+        """Get the lock icon (for sensitive variables)."""
+        return cls.UI_LOCK
+
+    @classmethod
+    def arrow_right(cls) -> str:
+        """Get the right arrow icon (for showing transitions/changes)."""
+        return cls.UI_ARROW_RIGHT
+
+    @classmethod
+    def replace_shortcodes(cls, text: str) -> str:
+        """Replace emoji-style shortcodes with Nerd Font icons.
+
+        Args:
+            text: Text containing shortcodes like :warning:, :info:, etc.
+
+        Returns:
+            Text with shortcodes replaced by Nerd Font icons
+
+        Examples:
+            >>> IconManager.replace_shortcodes(":warning: This is a warning")
+            '\uf071 This is a warning'
+            >>> IconManager.replace_shortcodes(":docker: :kubernetes: Stack")
+            '\uf308 \ue287 Stack'
+        """
+        result = text
+        for shortcode, icon in cls.SHORTCODES.items():
+            result = result.replace(shortcode, icon)
+        return result
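The shortcode substitution above is plain sequential string replacement. A trimmed, self-contained sketch (illustrative three-entry map, not the full table):

```python
# Minimal standalone equivalent of IconManager.replace_shortcodes,
# with a trimmed shortcode map for illustration.
SHORTCODES = {
    ":warning:": "\uf071",
    ":check:": "\uf00c",
    ":docker:": "\uf308",
}


def replace_shortcodes(text: str) -> str:
    # Each shortcode is replaced everywhere it appears, in dict order.
    for shortcode, icon in SHORTCODES.items():
        text = text.replace(shortcode, icon)
    return text
```

Since no shortcode here is a substring of another, iteration order does not affect the result.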

+ 66 - 0
cli/core/display/display_settings.py

@@ -0,0 +1,66 @@
+"""Display configuration settings for the CLI."""
+
+
+class DisplaySettings:
+    """Centralized display configuration settings.
+
+    This class holds all configurable display parameters including colors,
+    styles, layouts, and formatting options. Modify these values to customize
+    the CLI appearance.
+    """
+
+    # === Color Scheme ===
+    COLOR_ERROR = "red"
+    COLOR_WARNING = "yellow"
+    COLOR_SUCCESS = "green"
+    COLOR_INFO = "blue"
+    COLOR_MUTED = "dim"
+
+    # Library type colors
+    COLOR_LIBRARY_GIT = "blue"
+    COLOR_LIBRARY_STATIC = "yellow"
+
+    # === Style Constants ===
+    STYLE_HEADER = "bold white underline"
+    STYLE_HEADER_ALT = "bold cyan"
+    STYLE_DISABLED = "bright_black"
+    STYLE_SECTION_TITLE = "bold cyan"
+    STYLE_SECTION_DESC = "dim"
+    STYLE_TEMPLATE_NAME = "bold white"
+
+    # Table styles
+    STYLE_TABLE_HEADER = "bold blue"
+    STYLE_VAR_COL_NAME = "white"
+    STYLE_VAR_COL_TYPE = "magenta"
+    STYLE_VAR_COL_DEFAULT = "green"
+    STYLE_VAR_COL_DESC = "white"
+
+    # === Text Labels ===
+    LABEL_REQUIRED = " [yellow](*)[/yellow]"
+    LABEL_DISABLED = " (disabled)"
+    TEXT_EMPTY_VALUE = "(none)"
+    TEXT_EMPTY_OVERRIDE = "(empty)"
+    TEXT_UNNAMED_TEMPLATE = "Unnamed Template"
+    TEXT_NO_DESCRIPTION = "No description available"
+    TEXT_VERSION_NOT_SPECIFIED = "Not specified"
+
+    # === Value Formatting ===
+    SENSITIVE_MASK = "********"
+    TRUNCATION_SUFFIX = "..."
+    VALUE_MAX_LENGTH_SHORT = 15
+    VALUE_MAX_LENGTH_DEFAULT = 30
+
+    # === Layout Constants ===
+    SECTION_SEPARATOR_CHAR = "─"
+    SECTION_SEPARATOR_LENGTH = 40
+    VAR_NAME_INDENT = "  "  # 2 spaces
+
+    # === Size Formatting ===
+    SIZE_KB_THRESHOLD = 1024
+    SIZE_MB_THRESHOLD = 1024 * 1024
+    SIZE_DECIMAL_PLACES = 1
+
+    # === Table Padding ===
+    PADDING_PANEL = (1, 2)
+    PADDING_TABLE_COMPACT = (0, 1)
+    PADDING_TABLE_NORMAL = (0, 2)
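Because every knob is a plain class attribute, downstream customization is just a subclass override. A sketch using a trimmed stand-in for the class above, with a `truncate` helper mirroring `BaseDisplay.truncate` for demonstration:

```python
# Trimmed stand-in for DisplaySettings (two knobs only) showing the
# intended customization path: subclass and override class attributes.
class DisplaySettings:
    TRUNCATION_SUFFIX = "..."
    VALUE_MAX_LENGTH_DEFAULT = 30


class WideSettings(DisplaySettings):
    # Only the overridden attribute changes; the rest is inherited.
    VALUE_MAX_LENGTH_DEFAULT = 60


def truncate(value: str, settings: type[DisplaySettings]) -> str:
    """Mirror of BaseDisplay.truncate, parameterized by a settings class."""
    limit = settings.VALUE_MAX_LENGTH_DEFAULT
    if limit > 0 and len(value) > limit:
        return value[: limit - len(settings.TRUNCATION_SUFFIX)] + settings.TRUNCATION_SUFFIX
    return value
```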

+ 303 - 0
cli/core/display/display_status.py

@@ -0,0 +1,303 @@
+from __future__ import annotations
+
+import logging
+import re
+from typing import TYPE_CHECKING
+
+from rich import box
+from rich._loop import loop_first
+from rich.console import Console, ConsoleOptions, RenderResult
+from rich.markdown import Heading, ListItem, Markdown
+from rich.panel import Panel
+from rich.segment import Segment
+from rich.text import Text
+
+from .display_icons import IconManager
+from .display_settings import DisplaySettings
+
+if TYPE_CHECKING:
+    from .display_base import BaseDisplay
+
+logger = logging.getLogger(__name__)
+console_err = Console(stderr=True)  # Keep for error output
+
+
+class LeftAlignedHeading(Heading):
+    """Custom Heading element with left alignment and no extra spacing."""
+
+    def __rich_console__(self, console: Console, options: ConsoleOptions) -> RenderResult:
+        text = self.text
+        text.justify = "left"  # Override center justification
+        if self.tag == "h1":
+            # Draw a border around h1s (left-aligned)
+            yield Panel(
+                text,
+                box=box.HEAVY,
+                style="markdown.h1.border",
+            )
+        else:
+            # Styled text for h2 and beyond (no blank line before h2)
+            yield text
+
+
+class IconListItem(ListItem):
+    """Custom list item that replaces bullets with colored icons from shortcodes."""
+
+    def render_bullet(self, console: Console, options: ConsoleOptions) -> RenderResult:
+        """Render list item with icon replacement if text starts with :shortcode:."""
+        # Get the text content from elements
+        text_content = ""
+        for element in self.elements:
+            if hasattr(element, "text"):
+                text_content = element.text.plain
+                break
+
+        icon_used = None
+        icon_color = "cyan"  # Default color for icons
+        shortcode_found = None
+
+        # Scan for shortcode at the beginning
+        for shortcode, icon in IconManager.SHORTCODES.items():
+            if text_content.strip().startswith(shortcode):
+                icon_used = icon
+                shortcode_found = shortcode
+
+                # Map shortcodes to colors
+                shortcode_colors = {
+                    ":warning:": "yellow",
+                    ":error:": "red",
+                    ":check:": "green",
+                    ":success:": "green",
+                    ":info:": "blue",
+                    ":docker:": "blue",
+                    ":kubernetes:": "blue",
+                    ":rocket:": "magenta",
+                    ":star:": "yellow",
+                    ":lightning:": "yellow",
+                }
+                icon_color = shortcode_colors.get(shortcode, "cyan")
+                break
+
+        if icon_used and shortcode_found:
+            # Remove the shortcode from the text in all elements
+            for element in self.elements:
+                if hasattr(element, "text"):
+                    # Replace the shortcode in the Text object
+                    plain_text = element.text.plain
+                    new_text = plain_text.replace(shortcode_found, "", 1).lstrip()
+                    # Reconstruct the Text object with the same style
+                    element.text = Text(new_text, style=element.text.style)
+
+            # Render with custom colored icon instead of bullet
+            render_options = options.update(width=options.max_width - 3)
+            lines = console.render_lines(self.elements, render_options, style=self.style)
+            bullet_style = console.get_style(icon_color, default="none")
+
+            bullet = Segment(f" {icon_used} ", bullet_style)
+            padding = Segment(" " * 3)
+            new_line = Segment("\n")
+
+            for first, line in loop_first(lines):
+                yield bullet if first else padding
+                yield from line
+                yield new_line
+        else:
+            # No icon found, use default list item rendering
+            yield from super().render_bullet(console, options)
+
+
+class LeftAlignedMarkdown(Markdown):
+    """Custom Markdown renderer with left-aligned headings and icon list items."""
+
+    def __init__(self, markup: str, **kwargs):
+        """Initialize with custom heading and list item elements."""
+        super().__init__(markup, **kwargs)
+
+        # Replace heading element to use left alignment
+        self.elements["heading_open"] = LeftAlignedHeading
+
+        # Replace list item element to use icon replacement
+        self.elements["list_item_open"] = IconListItem
+
+
+class StatusDisplay:
+    """Status messages and error display.
+
+    Provides methods for displaying success, error, warning,
+    and informational messages with consistent formatting.
+    """
+
+    def __init__(self, settings: DisplaySettings, quiet: bool, base: BaseDisplay):
+        """Initialize StatusDisplay.
+
+        Args:
+            settings: Display settings for formatting
+            quiet: If True, suppress non-error output
+            base: BaseDisplay instance
+        """
+        self.settings = settings
+        self.quiet = quiet
+        self.base = base
+
+    def _display_message(self, level: str, message: str, context: str | None = None) -> None:
+        """Display a message with consistent formatting.
+
+        Args:
+            level: Message level (error, warning, success, info)
+            message: The message to display
+            context: Optional context information
+        """
+        # Errors and warnings always go to stderr, even in quiet mode
+        # Success and info respect quiet mode and go to stdout
+        use_stderr = level in ("error", "warning")
+        should_print = use_stderr or not self.quiet
+
+        if not should_print:
+            return
+
+        settings = self.settings
+        colors = {
+            "error": settings.COLOR_ERROR,
+            "warning": settings.COLOR_WARNING,
+            "success": settings.COLOR_SUCCESS,
+        }
+        color = colors.get(level)
+
+        # Format message based on context
+        if context:
+            text = (
+                f"{level.capitalize()} in {context}: {message}"
+                if level in {"error", "warning"}
+                else f"{context}: {message}"
+            )
+        else:
+            text = f"{level.capitalize()}: {message}" if level in {"error", "warning"} else message
+
+        # Only use icons and colors for actual status indicators (error, warning, success)
+        # Plain info messages use default terminal color (no markup)
+        if level in {"error", "warning", "success"}:
+            icon = IconManager.get_status_icon(level)
+            formatted_text = f"[{color}]{icon} {text}[/{color}]"
+        else:
+            formatted_text = text
+
+        if use_stderr:
+            console_err.print(formatted_text)
+        else:
+            self.base.text(formatted_text)
+
+        # Log appropriately
+        log_message = f"{context}: {message}" if context else message
+        log_methods = {
+            "error": logger.error,
+            "warning": logger.warning,
+            "success": logger.info,
+            "info": logger.info,
+        }
+        log_methods.get(level, logger.info)(log_message)
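The stream-routing rule in `_display_message` can be sketched in isolation (a minimal standalone sketch of the two booleans it computes, not the actual method):

```python
def route_message(level: str, quiet: bool) -> tuple[bool, bool]:
    """Mirror _display_message's routing: errors and warnings always reach
    the user on stderr; success and info go to stdout and honor quiet mode."""
    use_stderr = level in ("error", "warning")
    should_print = use_stderr or not quiet
    return use_stderr, should_print

print(route_message("error", quiet=True))    # (True, True)
print(route_message("info", quiet=True))     # (False, False)
print(route_message("success", quiet=False)) # (False, True)
```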
+
+    def error(self, message: str, context: str | None = None, details: str | None = None) -> None:
+        """Display an error message.
+
+        Args:
+            message: Error message
+            context: Optional context
+            details: Optional additional details (shown in dim style on same line)
+        """
+        if details:
+            # Combine message and details on same line with different formatting
+            settings = self.settings
+            color = settings.COLOR_ERROR
+            icon = IconManager.get_status_icon("error")
+
+            # Format: Icon Error: Message (details in dim)
+            formatted = f"[{color}]{icon} Error: {message}[/{color}] [dim]({details})[/dim]"
+            console_err.print(formatted)
+
+            # Log at debug level to avoid duplicate console output (already printed to stderr)
+            logger.debug(f"Error displayed: {message} ({details})")
+        else:
+            # No details, use standard display
+            self._display_message("error", message, context)
+
+    def warning(self, message: str, context: str | None = None, details: str | None = None) -> None:
+        """Display a warning message.
+
+        Args:
+            message: Warning message
+            context: Optional context
+            details: Optional additional details (shown in dim style on same line)
+        """
+        if details:
+            # Combine message and details on same line with different formatting
+            settings = self.settings
+            color = settings.COLOR_WARNING
+            icon = IconManager.get_status_icon("warning")
+
+            # Format: Icon Warning: Message (details in dim)
+            formatted = f"[{color}]{icon} Warning: {message}[/{color}] [dim]({details})[/dim]"
+            console_err.print(formatted)
+
+            # Log at debug level to avoid duplicate console output (already printed to stderr)
+            logger.debug(f"Warning displayed: {message} ({details})")
+        else:
+            # No details, use standard display
+            self._display_message("warning", message, context)
+
+    def success(self, message: str, context: str | None = None) -> None:
+        """Display a success message.
+
+        Args:
+            message: Success message
+            context: Optional context
+        """
+        self._display_message("success", message, context)
+
+    def info(self, message: str, context: str | None = None) -> None:
+        """Display an informational message.
+
+        Args:
+            message: Info message
+            context: Optional context
+        """
+        self._display_message("info", message, context)
+
+    def skipped(self, message: str, reason: str | None = None) -> None:
+        """Display a skipped/disabled message.
+
+        Args:
+            message: The main message to display
+            reason: Optional reason why it was skipped
+        """
+        if reason:
+            self.base.text(f"\n{message} (skipped - {reason})", style="dim")
+        else:
+            self.base.text(f"\n{message} (skipped)", style="dim")
+
+    def markdown(self, content: str) -> None:
+        """Render markdown content with left-aligned headings.
+
+        Replaces emoji-style shortcodes (e.g., :warning:, :info:) with Nerd Font icons
+        before rendering, EXCEPT for shortcodes at the start of list items which are
+        handled by IconListItem to replace the bullet.
+
+        Args:
+            content: Markdown-formatted text to render (may contain shortcodes)
+        """
+        if not self.quiet:
+            # Replace shortcodes with Nerd Font icons, but preserve list item shortcodes
+            # Pattern: "- :shortcode:" at start of line should NOT be replaced
+            lines = content.split("\n")
+            processed_lines = []
+
+            for line in lines:
+                # Check if line is a list item starting with a shortcode
+                if re.match(r"^\s*-\s+:[a-z]+:", line):
+                    # Keep the line as-is, IconListItem will handle it
+                    processed_lines.append(line)
+                else:
+                    # Replace shortcodes normally
+                    processed_lines.append(IconManager.replace_shortcodes(line))
+
+            processed_content = "\n".join(processed_lines)
+            self.base._print_markdown(LeftAlignedMarkdown(processed_content))
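The shortcode-preservation logic in `markdown()` above can be exercised on its own; the sketch below reproduces the line filter, with a stub standing in for `IconManager.replace_shortcodes`:

```python
import re

def replace_shortcodes(line: str) -> str:
    # Stub for IconManager.replace_shortcodes: swap known shortcodes for glyphs.
    return line.replace(":warning:", "\uf071").replace(":info:", "\uf05a")

def preprocess_markdown(content: str) -> str:
    """Replace shortcodes everywhere EXCEPT at the start of list items,
    where the bullet renderer (IconListItem) is expected to consume them."""
    processed = []
    for line in content.split("\n"):
        if re.match(r"^\s*-\s+:[a-z]+:", line):
            processed.append(line)  # left as-is for the bullet renderer
        else:
            processed.append(replace_shortcodes(line))
    return "\n".join(processed)

print(preprocess_markdown("Note :info:\n- :warning: keep bullet"))
```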

cli/core/display/display_table.py (+274 -0)

@@ -0,0 +1,274 @@
+from __future__ import annotations
+
+import logging
+from collections.abc import Callable
+from typing import TYPE_CHECKING
+
+from rich.table import Table
+from rich.tree import Tree
+
+from .display_icons import IconManager
+from .display_settings import DisplaySettings
+
+if TYPE_CHECKING:
+    from .display_base import BaseDisplay
+
+logger = logging.getLogger(__name__)
+
+
+class TableDisplay:
+    """Table rendering.
+
+    Provides methods for displaying data tables with flexible formatting.
+    """
+
+    def __init__(self, settings: DisplaySettings, base: BaseDisplay):
+        """Initialize TableDisplay.
+
+        Args:
+            settings: Display settings for formatting
+            base: BaseDisplay instance for utility methods
+        """
+        self.settings = settings
+        self.base = base
+
+    def data_table(
+        self,
+        columns: list[dict],
+        rows: list,
+        title: str | None = None,
+        row_formatter: Callable[..., tuple] | None = None,
+    ) -> None:
+        """Display a data table with configurable columns and formatting.
+
+        Args:
+            columns: List of column definitions, each dict with:
+                     - name: Column header text
+                     - style: Optional Rich style (e.g., "bold", "cyan")
+                     - no_wrap: Optional bool to prevent text wrapping
+                     - justify: Optional justify ("left", "right", "center")
+            rows: List of data rows (dicts, tuples, or objects)
+            title: Optional table title
+            row_formatter: Optional function(row) -> tuple to transform row data
+        """
+        table = Table(title=title, show_header=True)
+
+        # Add columns
+        for col in columns:
+            table.add_column(
+                col["name"],
+                style=col.get("style"),
+                no_wrap=col.get("no_wrap", False),
+                justify=col.get("justify", "left"),
+            )
+
+        # Add rows
+        for row in rows:
+            if row_formatter:
+                formatted_row = row_formatter(row)
+            elif isinstance(row, dict):
+                formatted_row = tuple(str(row.get(col["name"], "")) for col in columns)
+            else:
+                formatted_row = tuple(str(cell) for cell in row)
+
+            table.add_row(*formatted_row)
+
+        self.base._print_table(table)
+
+    def render_templates_table(self, templates: list, module_name: str, _title: str) -> None:
+        """Display a table of templates with library type indicators.
+
+        Args:
+            templates: List of Template objects
+            module_name: Name of the module
+            _title: Title for the table (unused, kept for API compatibility)
+        """
+        if not templates:
+            logger.info(f"No templates found for module '{module_name}'")
+            return
+
+        logger.info(f"Listing {len(templates)} templates for module '{module_name}'")
+        table = Table()
+        table.add_column("ID", style="bold", no_wrap=True)
+        table.add_column("Name")
+        table.add_column("Tags")
+        table.add_column("Version", no_wrap=True)
+        table.add_column("Schema", no_wrap=True)
+        table.add_column("Library", no_wrap=True)
+
+        settings = self.settings
+
+        for template in templates:
+            name = template.metadata.name or settings.TEXT_UNNAMED_TEMPLATE
+            tags_list = template.metadata.tags or []
+            tags = ", ".join(tags_list) if tags_list else "-"
+            version = str(template.metadata.version) if template.metadata.version else ""
+            schema = template.schema_version if hasattr(template, "schema_version") else "1.0"
+
+            # Format library with icon and color
+            library_name = template.metadata.library or ""
+            library_type = template.metadata.library_type or "git"
+            icon = IconManager.UI_LIBRARY_STATIC if library_type == "static" else IconManager.UI_LIBRARY_GIT
+            color = "yellow" if library_type == "static" else "blue"
+            library_display = f"[{color}]{icon} {library_name}[/{color}]"
+
+            table.add_row(template.id, name, tags, version, schema, library_display)
+
+        self.base._print_table(table)
+
+    def render_status_table(
+        self,
+        _title: str,
+        rows: list[tuple[str, str, bool]],
+        columns: tuple[str, str] = ("Item", "Status"),
+    ) -> None:
+        """Display a status table with success/error indicators.
+
+        Args:
+            _title: Table title (unused, kept for API compatibility)
+            rows: List of tuples (name, message, success_bool)
+            columns: Column headers (name_header, status_header)
+        """
+        table = Table(show_header=True)
+        table.add_column(columns[0], style="cyan", no_wrap=True)
+        table.add_column(columns[1])
+
+        for name, message, success in rows:
+            status_style = "green" if success else "red"
+            status_icon = IconManager.get_status_icon("success" if success else "error")
+            table.add_row(name, f"[{status_style}]{status_icon} {message}[/{status_style}]")
+
+        self.base._print_table(table)
+
+    def render_summary_table(self, title: str, items: dict[str, str]) -> None:
+        """Display a simple two-column summary table.
+
+        Args:
+            title: Table title
+            items: Dictionary of key-value pairs to display
+        """
+        settings = self.settings
+        table = Table(
+            title=title,
+            show_header=False,
+            box=None,
+            padding=settings.PADDING_TABLE_NORMAL,
+        )
+        table.add_column(style="bold")
+        table.add_column()
+
+        for key, value in items.items():
+            table.add_row(key, value)
+
+        self.base._print_table(table)
+
+    def render_file_operation_table(self, files: list[tuple[str, int, str]]) -> None:
+        """Display a table of file operations with sizes and statuses.
+
+        Args:
+            files: List of tuples (file_path, size_bytes, status)
+        """
+        settings = self.settings
+        table = Table(
+            show_header=True,
+            header_style=settings.STYLE_TABLE_HEADER,
+            box=None,
+            padding=settings.PADDING_TABLE_COMPACT,
+        )
+        table.add_column("File", style="white", no_wrap=False)
+        table.add_column("Size", justify="right", style=settings.COLOR_MUTED)
+        table.add_column("Status", style=settings.COLOR_WARNING)
+
+        for file_path, size_bytes, status in files:
+            size_str = self.base.format_file_size(size_bytes)
+            table.add_row(str(file_path), size_str, status)
+
+        self.base._print_table(table)
+
+    def _build_section_label(
+        self,
+        section_name: str,
+        section_data: dict,
+        show_all: bool,
+    ) -> str:
+        """Build section label with metadata."""
+        section_desc = section_data.get("description", "")
+        section_toggle = section_data.get("toggle")
+        section_needs = section_data.get("needs")
+
+        label = f"[cyan]{section_name}[/cyan]"
+        if section_toggle:
+            label += f" [dim](toggle: {section_toggle})[/dim]"
+        if section_needs:
+            needs_str = ", ".join(section_needs) if isinstance(section_needs, list) else section_needs
+            label += f" [dim](needs: {needs_str})[/dim]"
+        if show_all and section_desc:
+            label += f"\n  [dim]{section_desc}[/dim]"
+
+        return label
+
+    def _build_variable_label(
+        self,
+        var_name: str,
+        var_data: dict,
+        show_all: bool,
+    ) -> str:
+        """Build variable label with type and default value."""
+        var_type = var_data.get("type", "string")
+        var_default = var_data.get("default", "")
+        var_desc = var_data.get("description", "")
+        var_sensitive = var_data.get("sensitive", False)
+
+        label = f"[green]{var_name}[/green] [dim]({var_type})[/dim]"
+
+        if var_default is not None and var_default != "":
+            settings = self.settings
+            display_val = settings.SENSITIVE_MASK if var_sensitive else str(var_default)
+            if not var_sensitive:
+                display_val = self.base.truncate(display_val, settings.VALUE_MAX_LENGTH_DEFAULT)
+            label += f" = [{settings.COLOR_WARNING}]{display_val}[/{settings.COLOR_WARNING}]"
+
+        if show_all and var_desc:
+            label += f"\n    [dim]{var_desc}[/dim]"
+
+        return label
+
+    def _add_section_variables(self, section_node, section_vars: dict, show_all: bool) -> None:
+        """Add variables to a section node."""
+        for var_name, var_data in section_vars.items():
+            if isinstance(var_data, dict):
+                var_label = self._build_variable_label(var_name, var_data, show_all)
+                section_node.add(var_label)
+            else:
+                # Simple key-value pair
+                section_node.add(f"[green]{var_name}[/green] = [yellow]{var_data}[/yellow]")
+
+    def render_config_tree(self, spec: dict, module_name: str, show_all: bool = False) -> None:
+        """Display configuration spec as a tree view.
+
+        Args:
+            spec: The configuration spec dictionary
+            module_name: Name of the module
+            show_all: If True, show all details including descriptions
+        """
+        if not spec:
+            self.base.text(f"No configuration found for module '{module_name}'", style="yellow")
+            return
+
+        # Create root tree node
+        icon = IconManager.config()
+        tree = Tree(f"[bold blue]{icon} {module_name.capitalize()} Configuration[/bold blue]")
+
+        for section_name, section_data in spec.items():
+            if not isinstance(section_data, dict):
+                continue
+
+            # Build and add section
+            section_label = self._build_section_label(section_name, section_data, show_all)
+            section_node = tree.add(section_label)
+
+            # Add variables to section
+            section_vars = section_data.get("vars") or {}
+            if section_vars:
+                self._add_section_variables(section_node, section_vars, show_all)
+
+        self.base._print_tree(tree)
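The row-handling contract of `data_table` (formatter wins, then dict lookup by column name, then positional conversion) can be sketched as a standalone function — an illustration of the precedence, not the shipped code:

```python
def normalize_row(row, columns: list[dict], row_formatter=None) -> tuple:
    """Mirror data_table's row handling: an explicit formatter takes
    precedence, dicts are looked up by column name, and anything else
    is converted positionally."""
    if row_formatter:
        return row_formatter(row)
    if isinstance(row, dict):
        return tuple(str(row.get(col["name"], "")) for col in columns)
    return tuple(str(cell) for cell in row)

cols = [{"name": "ID"}, {"name": "Name"}]
print(normalize_row({"ID": 1, "Name": "pihole"}, cols))  # ('1', 'pihole')
print(normalize_row((2, "traefik"), cols))               # ('2', 'traefik')
```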

cli/core/display/display_template.py (+158 -0)

@@ -0,0 +1,158 @@
+from __future__ import annotations
+
+from pathlib import Path
+from typing import TYPE_CHECKING
+
+from rich import box
+from rich.console import Console
+from rich.panel import Panel
+from rich.text import Text
+
+from .display_icons import IconManager
+from .display_settings import DisplaySettings
+
+if TYPE_CHECKING:
+    from ..template import Template
+    from .display_base import BaseDisplay
+    from .display_status import StatusDisplay
+    from .display_variable import VariableDisplay
+
+
+class TemplateDisplay:
+    """Template-related rendering.
+
+    Provides methods for displaying template information,
+    file trees, and metadata.
+    """
+
+    def __init__(
+        self,
+        settings: DisplaySettings,
+        base: BaseDisplay,
+        variables: VariableDisplay,
+        status: StatusDisplay,
+    ):
+        """Initialize TemplateDisplay.
+
+        Args:
+            settings: Display settings for formatting
+            base: BaseDisplay instance
+            variables: VariableDisplay instance for rendering variables
+            status: StatusDisplay instance for markdown rendering
+        """
+        self.settings = settings
+        self.base = base
+        self.variables = variables
+        self.status = status
+
+    def render_template(self, template: Template, template_id: str) -> None:
+        """Display template information panel and variables table.
+
+        Args:
+            template: Template instance to display
+            template_id: ID of the template
+        """
+        self.render_template_header(template, template_id)
+        self.render_file_tree(template)
+        self.variables.render_variables_table(template)
+
+    def render_template_header(self, template: Template, template_id: str) -> None:
+        """Display the header for a template with library information.
+
+        Args:
+            template: Template instance
+            template_id: ID of the template
+        """
+        settings = self.settings
+
+        template_name = template.metadata.name or settings.TEXT_UNNAMED_TEMPLATE
+        version = str(template.metadata.version) if template.metadata.version else settings.TEXT_VERSION_NOT_SPECIFIED
+        schema = template.schema_version if hasattr(template, "schema_version") else "1.0"
+        description = template.metadata.description or settings.TEXT_NO_DESCRIPTION
+
+        # Get library information and format with icon/color
+        library_name = template.metadata.library or ""
+        library_type = template.metadata.library_type or "git"
+        icon = IconManager.UI_LIBRARY_STATIC if library_type == "static" else IconManager.UI_LIBRARY_GIT
+        color = "yellow" if library_type == "static" else "blue"
+
+        # Create custom H1-style header with Rich markup support
+
+        # Build header content with Rich formatting
+        header_content = Text()
+        header_content.append(template_name, style="bold white")
+        header_content.append(" (", style="white")
+        header_content.append("id:", style="white")
+        header_content.append(template_id, style="dim")
+        header_content.append(" │ ", style="dim")
+        header_content.append("version:", style="white")
+        header_content.append(version, style="cyan")
+        header_content.append(" │ ", style="dim")
+        header_content.append("schema:", style="white")
+        header_content.append(schema, style="magenta")
+        header_content.append(" │ ", style="dim")
+        header_content.append("library:", style="white")
+        header_content.append(icon + " ", style=color)
+        header_content.append(library_name, style=color)
+        header_content.append(")", style="white")
+
+        panel = Panel(header_content, box=box.HEAVY, style="markdown.h1.border")
+        Console().print(panel)
+        self.base.text("")
+        self.status.markdown(description)
+
+    def render_file_tree(self, template: Template) -> None:
+        """Display the file structure of a template.
+
+        Args:
+            template: Template instance
+        """
+        self.base.text("")
+        self.base.heading("Template File Structure")
+
+        def get_template_file_info(template_file):
+            display_name = (
+                template_file.output_path.name
+                if hasattr(template_file, "output_path")
+                else template_file.relative_path.name
+            )
+            return (template_file.relative_path, display_name, "white", None)
+
+        if template.template_files:
+            self.base.file_tree(
+                f"{IconManager.folder()} [white]{template.id}[/white]",
+                template.template_files,
+                get_template_file_info,
+            )
+
+    def render_file_generation_confirmation(
+        self,
+        output_dir: Path,
+        files: dict[str, str],
+        existing_files: list[Path] | None = None,
+    ) -> None:
+        """Display files to be generated with confirmation prompt.
+
+        Args:
+            output_dir: Output directory path
+            files: Dictionary of file paths to content
+            existing_files: List of existing files that will be overwritten
+        """
+        self.base.text("")
+        self.base.heading("Files to be Generated")
+
+        def get_file_generation_info(file_path_str):
+            file_path = Path(file_path_str)
+            file_name = file_path.parts[-1] if file_path.parts else file_path.name
+            full_path = output_dir / file_path
+
+            if existing_files and full_path in existing_files:
+                return (file_path, file_name, "yellow", "[red](will overwrite)[/red]")
+            return (file_path, file_name, "green", None)
+
+        self.base.file_tree(
+            f"{IconManager.folder()} [cyan]{output_dir.resolve()}[/cyan]",
+            files.keys(),
+            get_file_generation_info,
+        )
+        self.base.text("")
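The overwrite-detection rule inside `render_file_generation_confirmation` can be lifted out as a small pure function (a sketch assuming the same tuple shape the file-tree callback returns):

```python
from pathlib import Path

def file_generation_info(file_path_str: str, output_dir: Path, existing_files):
    """Mirror the per-file coloring rule: files that already exist render
    yellow with an overwrite warning, new files render green."""
    file_path = Path(file_path_str)
    full_path = output_dir / file_path
    if existing_files and full_path in existing_files:
        return (file_path, file_path.name, "yellow", "[red](will overwrite)[/red]")
    return (file_path, file_path.name, "green", None)

out = Path("/tmp/stack")
existing = [out / "compose.yaml"]
print(file_generation_info("compose.yaml", out, existing)[2])  # yellow
print(file_generation_info(".env", out, existing)[2])          # green
```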

cli/core/display/display_variable.py (+242 -0)

@@ -0,0 +1,242 @@
+from __future__ import annotations
+
+import logging
+from typing import TYPE_CHECKING
+
+from rich.table import Table
+
+from .display_icons import IconManager
+from .display_settings import DisplaySettings
+
+logger = logging.getLogger(__name__)
+
+if TYPE_CHECKING:
+    from ..template import Template
+    from .display_base import BaseDisplay
+
+
+class VariableDisplay:
+    """Variable-related rendering.
+
+    Provides methods for displaying variables, sections,
+    and their values with appropriate formatting based on context.
+    """
+
+    def __init__(self, settings: DisplaySettings, base: BaseDisplay):
+        """Initialize VariableDisplay.
+
+        Args:
+            settings: Display settings for formatting
+            base: BaseDisplay instance
+        """
+        self.settings = settings
+        self.base = base
+
+    def render_variable_value(
+        self,
+        variable,
+        _context: str = "default",
+        is_dimmed: bool = False,
+        var_satisfied: bool = True,
+    ) -> str:
+        """Render variable value with appropriate formatting based on context.
+
+        Args:
+            variable: Variable instance to render
+            _context: Display context (unused, kept for API compatibility)
+            is_dimmed: Whether the variable should be dimmed
+            var_satisfied: Whether the variable's dependencies are satisfied
+
+        Returns:
+            Formatted string representation of the variable value
+        """
+        # Handle disabled bool variables
+        if (is_dimmed or not var_satisfied) and variable.type == "bool":
+            if hasattr(variable, "_original_disabled") and variable._original_disabled is not False:
+                return f"{variable._original_disabled} {IconManager.arrow_right()} False"
+            return "False"
+
+        # Handle config overrides with arrow
+        if (
+            variable.origin == "config"
+            and hasattr(variable, "_original_stored")
+            and variable.original_value != variable.value
+        ):
+            settings = self.settings
+            orig = self._format_value(
+                variable,
+                variable.original_value,
+                max_length=settings.VALUE_MAX_LENGTH_SHORT,
+            )
+            curr = variable.get_display_value(
+                mask_sensitive=True,
+                max_length=settings.VALUE_MAX_LENGTH_SHORT,
+                show_none=False,
+            )
+            if not curr:
+                curr = str(variable.value) if variable.value else settings.TEXT_EMPTY_OVERRIDE
+            arrow = IconManager.arrow_right()
+            color = settings.COLOR_WARNING
+            return f"[dim]{orig}[/dim] [bold {color}]{arrow} {curr}[/bold {color}]"
+
+        # Default formatting
+        settings = self.settings
+        value = variable.get_display_value(
+            mask_sensitive=True,
+            max_length=settings.VALUE_MAX_LENGTH_DEFAULT,
+            show_none=True,
+        )
+        if not variable.value:
+            return f"[{settings.COLOR_MUTED}]{value}[/{settings.COLOR_MUTED}]"
+        return value
+
+    def _format_value(self, variable, value, max_length: int | None = None) -> str:
+        """Helper to format a specific value.
+
+        Args:
+            variable: Variable instance
+            value: Value to format
+            max_length: Maximum length before truncation
+
+        Returns:
+            Formatted value string
+        """
+        settings = self.settings
+
+        if variable.sensitive:
+            return settings.SENSITIVE_MASK
+        if value is None or value == "":
+            return f"[{settings.COLOR_MUTED}]({settings.TEXT_EMPTY_VALUE})[/{settings.COLOR_MUTED}]"
+
+        val_str = str(value)
+        return self.base.truncate(val_str, max_length)
+
+    def render_section(self, title: str, description: str | None) -> None:
+        """Display a section header.
+
+        Args:
+            title: Section title
+            description: Optional section description
+        """
+        settings = self.settings
+        if description:
+            self.base.text(
+                f"\n{title} - {description}",
+                style=f"{settings.STYLE_SECTION_TITLE} {settings.STYLE_SECTION_DESC}",
+            )
+        else:
+            self.base.text(f"\n{title}", style=settings.STYLE_SECTION_TITLE)
+        self.base.text(
+            settings.SECTION_SEPARATOR_CHAR * settings.SECTION_SEPARATOR_LENGTH,
+            style=settings.COLOR_MUTED,
+        )
+
+    def _render_section_header(self, section, is_dimmed: bool) -> str:
+        """Build section header text with appropriate styling.
+
+        Args:
+            section: VariableSection instance
+            is_dimmed: Whether section is dimmed (disabled)
+
+        Returns:
+            Formatted header text with Rich markup
+        """
+        settings = self.settings
+        # Show (disabled) label if section has a toggle and is not enabled
+        disabled_text = settings.LABEL_DISABLED if (section.toggle and not section.is_enabled()) else ""
+
+        if is_dimmed:
+            style = settings.STYLE_DISABLED
+            return f"[bold {style}]{section.title}{disabled_text}[/bold {style}]"
+        return f"[bold]{section.title}{disabled_text}[/bold]"
+
+    def _render_variable_row(self, var_name: str, variable, is_dimmed: bool, var_satisfied: bool) -> tuple:
+        """Build variable row data for table display.
+
+        Args:
+            var_name: Variable name
+            variable: Variable instance
+            is_dimmed: Whether containing section is dimmed
+            var_satisfied: Whether variable dependencies are satisfied
+
+        Returns:
+            Tuple of (var_display, type, default_val, description, row_style)
+        """
+        settings = self.settings
+
+        # Build row style
+        row_style = settings.STYLE_DISABLED if (is_dimmed or not var_satisfied) else None
+
+        # Build default value
+        default_val = self.render_variable_value(variable, is_dimmed=is_dimmed, var_satisfied=var_satisfied)
+
+        # Build variable display name
+        sensitive_icon = f" {IconManager.lock()}" if variable.sensitive else ""
+        # Only show required indicator if variable is enabled (not dimmed and dependencies satisfied)
+        required_indicator = settings.LABEL_REQUIRED if variable.required and not is_dimmed and var_satisfied else ""
+        var_display = f"{settings.VAR_NAME_INDENT}{var_name}{sensitive_icon}{required_indicator}"
+
+        return (
+            var_display,
+            variable.type or "str",
+            default_val,
+            variable.description or "",
+            row_style,
+        )
+
+    def render_variables_table(self, template: Template) -> None:
+        """Display a table of variables for a template.
+
+        All variables and sections are always shown. Disabled sections/variables
+        are displayed with dimmed styling.
+
+        Args:
+            template: Template instance
+        """
+        if not (template.variables and template.variables.has_sections()):
+            return
+
+        settings = self.settings
+        self.base.text("")
+        self.base.heading("Template Variables")
+
+        variables_table = Table(show_header=True, header_style=settings.STYLE_TABLE_HEADER)
+        variables_table.add_column("Variable", style=settings.STYLE_VAR_COL_NAME, no_wrap=True)
+        variables_table.add_column("Type", style=settings.STYLE_VAR_COL_TYPE)
+        variables_table.add_column("Default", style=settings.STYLE_VAR_COL_DEFAULT)
+        variables_table.add_column("Description", style=settings.STYLE_VAR_COL_DESC)
+
+        first_section = True
+        for section in template.variables.get_sections().values():
+            if not section.variables:
+                continue
+
+            if not first_section:
+                variables_table.add_row("", "", "", "", style=settings.STYLE_DISABLED)
+            first_section = False
+
+            # Check if section is enabled AND dependencies are satisfied
+            is_enabled = section.is_enabled()
+            dependencies_satisfied = template.variables.is_section_satisfied(section.key)
+            is_dimmed = not (is_enabled and dependencies_satisfied)
+
+            # Render section header
+            header_text = self._render_section_header(section, is_dimmed)
+            variables_table.add_row(header_text, "", "", "")
+
+            # Render variables
+            for var_name, variable in section.variables.items():
+                # Check if variable's needs are satisfied
+                var_satisfied = template.variables.is_variable_satisfied(var_name)
+
+                # Build and add row
+                (
+                    var_display,
+                    var_type,
+                    default_val,
+                    description,
+                    row_style,
+                ) = self._render_variable_row(var_name, variable, is_dimmed, var_satisfied)
+                variables_table.add_row(var_display, var_type, default_val, description, style=row_style)
+
+        self.base._print_table(variables_table)

+ 46 - 27
cli/core/exceptions.py

@@ -4,7 +4,9 @@ This module defines specific exception types for better error handling
 and diagnostics throughout the application.
 """
 
-from typing import Optional, List, Dict
+from __future__ import annotations
+
+from dataclasses import dataclass, field
 
 
 class BoilerplatesError(Exception):
@@ -34,7 +36,7 @@ class TemplateError(BoilerplatesError):
 class TemplateNotFoundError(TemplateError):
     """Raised when a template cannot be found."""
 
-    def __init__(self, template_id: str, module_name: Optional[str] = None):
+    def __init__(self, template_id: str, module_name: str | None = None):
         self.template_id = template_id
         self.module_name = module_name
         msg = f"Template '{template_id}' not found"
@@ -64,7 +66,7 @@ class TemplateLoadError(TemplateError):
 class TemplateSyntaxError(TemplateError):
     """Raised when a Jinja2 template has syntax errors."""
 
-    def __init__(self, template_id: str, errors: List[str]):
+    def __init__(self, template_id: str, errors: list[str]):
         self.template_id = template_id
         self.errors = errors
         msg = f"Jinja2 syntax errors in template '{template_id}':\n" + "\n".join(errors)
@@ -101,37 +103,43 @@ class IncompatibleSchemaVersionError(TemplateError):
         super().__init__(msg)
 
 
+@dataclass
+class RenderErrorContext:
+    """Context information for template rendering errors."""
+
+    file_path: str | None = None
+    line_number: int | None = None
+    column: int | None = None
+    context_lines: list[str] = field(default_factory=list)
+    variable_context: dict[str, str] = field(default_factory=dict)
+    suggestions: list[str] = field(default_factory=list)
+    original_error: Exception | None = None
+
+
 class TemplateRenderError(TemplateError):
     """Raised when template rendering fails."""
 
-    def __init__(
-        self,
-        message: str,
-        file_path: Optional[str] = None,
-        line_number: Optional[int] = None,
-        column: Optional[int] = None,
-        context_lines: Optional[List[str]] = None,
-        variable_context: Optional[Dict[str, str]] = None,
-        suggestions: Optional[List[str]] = None,
-        original_error: Optional[Exception] = None,
-    ):
-        self.file_path = file_path
-        self.line_number = line_number
-        self.column = column
-        self.context_lines = context_lines or []
-        self.variable_context = variable_context or {}
-        self.suggestions = suggestions or []
-        self.original_error = original_error
+    def __init__(self, message: str, context: RenderErrorContext | None = None):
+        self.context = context or RenderErrorContext()
+
+        # Expose context fields as instance attributes for backward compatibility
+        self.file_path = self.context.file_path
+        self.line_number = self.context.line_number
+        self.column = self.context.column
+        self.context_lines = self.context.context_lines
+        self.variable_context = self.context.variable_context
+        self.suggestions = self.context.suggestions
+        self.original_error = self.context.original_error
 
         # Build enhanced error message
         parts = [message]
 
-        if file_path:
-            location = f"File: {file_path}"
-            if line_number:
-                location += f", Line: {line_number}"
-                if column:
-                    location += f", Column: {column}"
+        if self.context.file_path:
+            location = f"File: {self.context.file_path}"
+            if self.context.line_number:
+                location += f", Line: {self.context.line_number}"
+                if self.context.column:
+                    location += f", Column: {self.context.column}"
             parts.append(location)
 
         super().__init__("\n".join(parts))
@@ -190,6 +198,17 @@ class ModuleLoadError(ModuleError):
     pass
 
 
+class SchemaError(BoilerplatesError):
+    """Raised when schema operations fail."""
+
+    def __init__(self, message: str, details: str | None = None):
+        self.details = details
+        msg = message
+        if details:
+            msg += f" ({details})"
+        super().__init__(msg)
+
+
 class FileOperationError(BoilerplatesError):
     """Raised when file operations fail."""
 
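The `RenderErrorContext` refactor above trades eight optional constructor arguments for a single dataclass while keeping the old attribute access working. A minimal, self-contained sketch of the pattern (not the real class hierarchy — `BoilerplatesError`/`TemplateError` and several context fields are omitted):

```python
from __future__ import annotations

from dataclasses import dataclass, field


@dataclass
class RenderErrorContext:
    """Trimmed-down version of the context dataclass added in this diff."""

    file_path: str | None = None
    line_number: int | None = None
    suggestions: list[str] = field(default_factory=list)


class TemplateRenderError(Exception):
    def __init__(self, message: str, context: RenderErrorContext | None = None):
        self.context = context or RenderErrorContext()
        # Backward compatibility: old call sites read err.file_path directly
        self.file_path = self.context.file_path
        self.line_number = self.context.line_number
        parts = [message]
        if self.context.file_path:
            location = f"File: {self.context.file_path}"
            if self.context.line_number:
                location += f", Line: {self.context.line_number}"
            parts.append(location)
        super().__init__("\n".join(parts))


err = TemplateRenderError("undefined variable", RenderErrorContext("app/compose.yaml", 12))
# str(err) → "undefined variable\nFile: app/compose.yaml, Line: 12"
```

Callers that only have a message can still raise `TemplateRenderError("msg")`; the empty default context keeps every legacy attribute defined.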

+ 11 - 0
cli/core/input/__init__.py

@@ -0,0 +1,11 @@
+"""Input management package for CLI user input operations.
+
+This package provides centralized input handling with standardized styling
+and validation across the entire CLI application.
+"""
+
+from .input_manager import InputManager
+from .input_settings import InputSettings
+from .prompt_manager import PromptHandler
+
+__all__ = ["InputManager", "InputSettings", "PromptHandler"]

+ 228 - 0
cli/core/input/input_manager.py

@@ -0,0 +1,228 @@
+"""Input Manager for standardized user input handling.
+
+This module provides a centralized interface for all user input operations,
+ensuring consistent styling and validation across the CLI.
+"""
+
+from __future__ import annotations
+
+import logging
+import re
+from typing import Callable
+
+from rich.console import Console
+from rich.prompt import Confirm, IntPrompt, Prompt
+
+from .input_settings import InputSettings
+
+logger = logging.getLogger(__name__)
+console = Console()
+
+
+class InputManager:
+    """Manages all user input operations with standardized styling.
+
+    This class provides primitives for various types of user input including
+    text, passwords, confirmations, choices, and validated inputs.
+    """
+
+    def __init__(self, settings: InputSettings | None = None):
+        """Initialize InputManager.
+
+        Args:
+            settings: Input configuration settings (uses default if None)
+        """
+        self.settings = settings or InputSettings()
+
+    def text(
+        self,
+        prompt: str,
+        default: str | None = None,
+        password: bool = False,
+        validator: Callable[[str], bool] | None = None,
+        error_message: str | None = None,
+    ) -> str:
+        """Prompt for text input.
+
+        Args:
+            prompt: Prompt message to display
+            default: Default value if user presses Enter
+            password: If True, mask the input
+            validator: Optional validation function
+            error_message: Custom error message for validation failure
+
+        Returns:
+            User input string
+        """
+        if password:
+            return self.password(prompt, default)
+
+        while True:
+            result = Prompt.ask(
+                f"[{self.settings.PROMPT_STYLE}]{prompt}[/{self.settings.PROMPT_STYLE}]",
+                default=default or "",
+                console=console,
+            )
+
+            if validator and not validator(result):
+                msg = error_message or "Invalid input"
+                console.print(f"[{self.settings.PROMPT_ERROR_STYLE}]{msg}[/{self.settings.PROMPT_ERROR_STYLE}]")
+                continue
+
+            return result
+
+    def password(self, prompt: str, default: str | None = None) -> str:
+        """Prompt for password input (masked).
+
+        Args:
+            prompt: Prompt message to display
+            default: Default value if user presses Enter
+
+        Returns:
+            User input string (masked during entry)
+        """
+        return Prompt.ask(
+            f"[{self.settings.PROMPT_STYLE}]{prompt}[/{self.settings.PROMPT_STYLE}]",
+            default=default or "",
+            password=True,
+            console=console,
+        )
+
+    def confirm(self, prompt: str, default: bool | None = None) -> bool:
+        """Prompt for yes/no confirmation.
+
+        Args:
+            prompt: Prompt message to display
+            default: Default value if user presses Enter
+
+        Returns:
+            True for yes, False for no
+        """
+        if default is None:
+            default = self.settings.DEFAULT_CONFIRM_YES
+
+        return Confirm.ask(
+            f"[{self.settings.PROMPT_STYLE}]{prompt}[/{self.settings.PROMPT_STYLE}]",
+            default=default,
+            console=console,
+        )
+
+    def integer(
+        self,
+        prompt: str,
+        default: int | None = None,
+        min_value: int | None = None,
+        max_value: int | None = None,
+    ) -> int:
+        """Prompt for integer input with optional range validation.
+
+        Args:
+            prompt: Prompt message to display
+            default: Default value if user presses Enter
+            min_value: Minimum allowed value
+            max_value: Maximum allowed value
+
+        Returns:
+            Integer value
+        """
+        while True:
+            if default is not None:
+                result = IntPrompt.ask(
+                    f"[{self.settings.PROMPT_STYLE}]{prompt}[/{self.settings.PROMPT_STYLE}]",
+                    default=default,
+                    console=console,
+                )
+            else:
+                try:
+                    result = IntPrompt.ask(
+                        f"[{self.settings.PROMPT_STYLE}]{prompt}[/{self.settings.PROMPT_STYLE}]",
+                        console=console,
+                    )
+                except ValueError:
+                    console.print(
+                        f"[{self.settings.PROMPT_ERROR_STYLE}]{self.settings.MSG_INVALID_INTEGER}[/{self.settings.PROMPT_ERROR_STYLE}]"
+                    )
+                    continue
+
+            # Validate range
+            if min_value is not None and result < min_value:
+                error_style = self.settings.PROMPT_ERROR_STYLE
+                console.print(f"[{error_style}]Value must be at least {min_value}[/{error_style}]")
+                continue
+
+            if max_value is not None and result > max_value:
+                error_style = self.settings.PROMPT_ERROR_STYLE
+                console.print(f"[{error_style}]Value must be at most {max_value}[/{error_style}]")
+                continue
+
+            return result
+
+    def choice(self, prompt: str, choices: list[str], default: str | None = None) -> str:
+        """Prompt user to select one option from a list.
+
+        Args:
+            prompt: Prompt message to display
+            choices: List of valid options
+            default: Default choice if user presses Enter
+
+        Returns:
+            Selected choice
+        """
+        if not choices:
+            raise ValueError("Choices list cannot be empty")
+
+        # Escape the opening bracket so Rich does not parse "[a, b]" as markup
+        choices_display = f"\\[{', '.join(choices)}]"
+        full_prompt = f"{prompt} {choices_display}"
+
+        while True:
+            result = Prompt.ask(
+                f"[{self.settings.PROMPT_STYLE}]{full_prompt}[/{self.settings.PROMPT_STYLE}]",
+                default=default or "",
+                console=console,
+            )
+
+            if result in choices:
+                return result
+
+            console.print(
+                f"[{self.settings.PROMPT_ERROR_STYLE}]{self.settings.MSG_INVALID_CHOICE}[/{self.settings.PROMPT_ERROR_STYLE}]"
+            )
+
+    def validate_email(self, email: str) -> bool:
+        """Validate email address format.
+
+        Args:
+            email: Email address to validate
+
+        Returns:
+            True if valid, False otherwise
+        """
+        pattern = r"^[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}$"
+        return bool(re.match(pattern, email))
+
+    def validate_url(self, url: str) -> bool:
+        """Validate URL format.
+
+        Args:
+            url: URL to validate
+
+        Returns:
+            True if valid, False otherwise
+        """
+        pattern = r"^https?://[^\s/$.?#].[^\s]*$"
+        return bool(re.match(pattern, url, re.IGNORECASE))
+
+    def validate_hostname(self, hostname: str) -> bool:
+        """Validate hostname/domain format.
+
+        Args:
+            hostname: Hostname to validate
+
+        Returns:
+            True if valid, False otherwise
+        """
+        pattern = (
+            r"^(?:[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?\.)*"
+            r"[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?$"
+        )
+        return bool(re.match(pattern, hostname))
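The three `validate_*` helpers are pure regex checks, so they can be exercised without a terminal. The email and URL patterns below are copied verbatim from the diff; note the URL pattern requires at least two characters after the scheme, so `https://a` is rejected:

```python
import re

EMAIL_PATTERN = r"^[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}$"
URL_PATTERN = r"^https?://[^\s/$.?#].[^\s]*$"


def validate_email(email: str) -> bool:
    return bool(re.match(EMAIL_PATTERN, email))


def validate_url(url: str) -> bool:
    return bool(re.match(URL_PATTERN, url, re.IGNORECASE))


assert validate_email("admin@example.com")
assert not validate_email("not-an-email")
assert validate_url("HTTPS://example.com/path")  # scheme match is case-insensitive
assert not validate_url("ftp://example.com")     # only http/https accepted
assert not validate_url("https://a")             # pattern needs two chars after ://
```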

+ 37 - 0
cli/core/input/input_settings.py

@@ -0,0 +1,37 @@
+"""Input configuration settings for the CLI.
+
+This module defines all configurable input parameters including prompt styles,
+colors, and default behaviors.
+"""
+
+
+class InputSettings:
+    """Centralized input configuration settings.
+
+    This class holds all configurable input parameters including prompt styles,
+    colors, validation messages, and default behaviors.
+    """
+
+    # === Prompt Styles ===
+    PROMPT_STYLE = "white"
+    PROMPT_DEFAULT_STYLE = "dim"
+    PROMPT_ERROR_STYLE = "red"
+    PROMPT_SUCCESS_STYLE = "green"
+
+    # === Validation Messages ===
+    MSG_INVALID_INTEGER = "Please enter a valid integer"
+    MSG_INVALID_FLOAT = "Please enter a valid number"
+    MSG_INVALID_EMAIL = "Please enter a valid email address"
+    MSG_INVALID_URL = "Please enter a valid URL"
+    MSG_INVALID_HOSTNAME = "Please enter a valid hostname"
+    MSG_REQUIRED = "This field is required"
+    MSG_INVALID_CHOICE = "Please select a valid option"
+
+    # === Default Values ===
+    DEFAULT_CONFIRM_YES = True
+    DEFAULT_PASSWORD_MASK = "•"
+
+    # === Prompt Labels ===
+    LABEL_DEFAULT = "default"
+    LABEL_AUTO = "*auto"
+    LABEL_OPTIONAL = "optional"
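`InputSettings` holds everything as plain class attributes, so an alternate theme only needs a subclass that overrides what differs. The `HighContrastSettings` name below is hypothetical, not part of this diff:

```python
class InputSettings:
    """Subset of the attributes introduced above."""

    PROMPT_STYLE = "white"
    PROMPT_ERROR_STYLE = "red"
    DEFAULT_CONFIRM_YES = True


class HighContrastSettings(InputSettings):
    PROMPT_STYLE = "bold cyan"  # override one style, inherit the rest


settings = HighContrastSettings()
assert settings.PROMPT_STYLE == "bold cyan"
assert settings.PROMPT_ERROR_STYLE == "red"
```

Passing such a subclass via `InputManager(settings=...)` would then pick up the new styles without touching any prompt code.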

+ 243 - 0
cli/core/input/prompt_manager.py

@@ -0,0 +1,243 @@
+"""Interactive prompt handling for template variables.
+
+This module collects variable values from the user, honoring section
+toggles, required sections, and variable dependencies.
+"""
+
+from __future__ import annotations
+
+import logging
+from typing import Any, Callable
+
+from rich.console import Console
+from rich.prompt import IntPrompt, Prompt
+
+from ..display import DisplayManager
+from ..template import Variable, VariableCollection
+from .input_manager import InputManager  # sibling import avoids a cycle through the package __init__
+
+logger = logging.getLogger(__name__)
+
+
+class PromptHandler:
+    """Simple interactive prompt handler for collecting template variables."""
+
+    def __init__(self) -> None:
+        self.console = Console()
+        self.display = DisplayManager()
+
+    def _handle_section_toggle(self, section, collected: dict[str, Any]) -> bool:
+        """Handle section toggle variable and return whether section should be enabled."""
+        if section.required:
+            logger.debug(f"Processing required section '{section.key}' without toggle prompt")
+            return True
+
+        if not section.toggle:
+            return True
+
+        toggle_var = section.variables.get(section.toggle)
+        if not toggle_var:
+            return True
+
+        current_value = toggle_var.convert(toggle_var.value)
+        new_value = self._prompt_variable(toggle_var, _required=section.required)
+
+        if new_value != current_value:
+            collected[toggle_var.name] = new_value
+            toggle_var.value = new_value
+
+        return section.is_enabled()
+
+    def _should_skip_variable(
+        self,
+        var_name: str,
+        section,
+        variables: VariableCollection,
+        section_enabled: bool,
+    ) -> bool:
+        """Determine if a variable should be skipped during collection."""
+        if section.toggle and var_name == section.toggle:
+            return True
+
+        if not variables.is_variable_satisfied(var_name):
+            logger.debug(f"Skipping variable '{var_name}' - needs not satisfied")
+            return True
+
+        if not section_enabled:
+            logger.debug(f"Skipping variable '{var_name}' from disabled section '{section.key}'")
+            return True
+
+        return False
+
+    def _collect_variable_value(self, variable: Variable, section, collected: dict[str, Any]) -> None:
+        """Collect a single variable value and update if changed."""
+        current_value = variable.convert(variable.value)
+        new_value = self._prompt_variable(variable, _required=section.required)
+
+        if variable.autogenerated and new_value is None:
+            collected[variable.name] = None
+            variable.value = None
+        elif new_value != current_value:
+            collected[variable.name] = new_value
+            variable.value = new_value
+
+    def collect_variables(self, variables: VariableCollection) -> dict[str, Any]:
+        """Collect values for variables by iterating through sections.
+
+        Args:
+            variables: VariableCollection with organized sections and variables
+
+        Returns:
+            Dict of variable names to collected values
+        """
+        input_mgr = InputManager()
+        if not input_mgr.confirm("Customize any settings?", default=False):
+            logger.info("User opted to keep all default values")
+            return {}
+
+        collected: dict[str, Any] = {}
+
+        for _section_key, section in variables.get_sections().items():
+            if not section.variables:
+                continue
+
+            self.display.section(section.title, section.description)
+            section_enabled = self._handle_section_toggle(section, collected)
+
+            for var_name, variable in section.variables.items():
+                if self._should_skip_variable(var_name, section, variables, section_enabled):
+                    continue
+
+                self._collect_variable_value(variable, section, collected)
+
+        logger.info(f"Variable collection completed. Collected {len(collected)} values")
+        return collected
+
+    def _prompt_variable(self, variable: Variable, _required: bool = False) -> Any:
+        """Prompt for a single variable value based on its type.
+
+        Args:
+            variable: The variable to prompt for
+            _required: Whether the containing section is required
+                      (unused, kept for API compatibility)
+
+        Returns:
+            The validated value entered by the user
+        """
+        logger.debug(f"Prompting for variable '{variable.name}' (type: {variable.type})")
+
+        # Use variable's native methods for prompt text and default value
+        prompt_text = variable.get_prompt_text()
+        default_value = variable.get_normalized_default()
+
+        # Add lock icon before default value for sensitive or autogenerated variables
+        if variable.sensitive or variable.autogenerated:
+            # Format: "Prompt text 🔒 (default)"
+            # The lock icon goes between the text and the default value in parentheses
+            prompt_text = f"{prompt_text} {self.display.get_lock_icon()}"
+
+        # Check if this specific variable is required (has no default and not autogenerated)
+        var_is_required = variable.is_required()
+
+        # If variable is required, mark it in the prompt
+        if var_is_required:
+            prompt_text = f"{prompt_text} [bold red]*required[/bold red]"
+
+        handler = self._get_prompt_handler(variable)
+
+        # Add validation hint (includes both extra text and enum options)
+        hint = variable.get_validation_hint()
+        if hint:
+            # Show options/extra inline inside parentheses, before the default
+            prompt_text = f"{prompt_text} [dim]({hint})[/dim]"
+
+        while True:
+            try:
+                raw = handler(prompt_text, default_value)
+                # Use Variable's centralized validation method that handles:
+                # - Type conversion
+                # - Autogenerated variable detection
+                # - Required field validation
+                return variable.validate_and_convert(raw, check_required=True)
+            except ValueError as exc:
+                # Conversion/validation failed — show a consistent error message and retry
+                self._show_validation_error(str(exc))
+            except Exception as e:
+                # Unexpected error — log and retry using the stored (unconverted) value
+                logger.error(f"Error prompting for variable '{variable.name}': {e!s}")
+                default_value = variable.value
+                handler = self._get_prompt_handler(variable)
+
+    def _get_prompt_handler(self, variable: Variable) -> Callable:
+        """Return the prompt function for a variable type."""
+        handlers = {
+            "bool": self._prompt_bool,
+            "int": self._prompt_int,
+            # For enum prompts we pass the variable.extra through so options and extra
+            # can be combined into a single inline hint.
+            "enum": lambda text, default: self._prompt_enum(
+                text,
+                variable.options or [],
+                default,
+                _extra=getattr(variable, "extra", None),
+            ),
+        }
+        return handlers.get(
+            variable.type,
+            lambda text, default: self._prompt_string(text, default, is_sensitive=variable.sensitive),
+        )
+
+    def _show_validation_error(self, message: str) -> None:
+        """Display validation feedback consistently."""
+        self.display.error(message)
+
+    def _prompt_string(self, prompt_text: str, default: Any = None, is_sensitive: bool = False) -> str | None:
+        value = Prompt.ask(
+            prompt_text,
+            default=str(default) if default is not None else "",
+            show_default=True,
+            password=is_sensitive,
+        )
+        stripped = value.strip() if value else None
+        return stripped if stripped else None
+
+    def _prompt_bool(self, prompt_text: str, default: Any = None) -> bool | None:
+        input_mgr = InputManager()
+        if default is None:
+            return input_mgr.confirm(prompt_text, default=None)
+        converted = default if isinstance(default, bool) else str(default).lower() in ("true", "1", "yes", "on")
+        return input_mgr.confirm(prompt_text, default=converted)
+
+    def _prompt_int(self, prompt_text: str, default: Any = None) -> int | None:
+        converted = None
+        if default is not None:
+            try:
+                converted = int(default)
+            except (ValueError, TypeError):
+                logger.warning(f"Invalid default integer value: {default}")
+        return IntPrompt.ask(prompt_text, default=converted)
+
+    def _prompt_enum(
+        self,
+        prompt_text: str,
+        options: list[str],
+        default: Any = None,
+        _extra: str | None = None,
+    ) -> str:
+        """Prompt for enum selection with validation.
+
+        Note: prompt_text should already include the hint from
+        variable.get_validation_hint(); this path is kept for backward
+        compatibility and as a fallback.
+        """
+        if not options:
+            return self._prompt_string(prompt_text, default)
+
+        # Validate default is in options
+        if default and str(default) not in options:
+            default = options[0]
+
+        while True:
+            value = Prompt.ask(
+                prompt_text,
+                default=str(default) if default else options[0],
+                show_default=True,
+            )
+            if value in options:
+                return value
+            self.console.print(f"[red]Invalid choice. Select from: {', '.join(options)}[/red]")
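The `_should_skip_variable()` check above defers to `VariableCollection.is_variable_satisfied()`, which per the release notes now understands semicolon-separated AND conditions with comma-separated alternatives (`needs: 'traefik_enabled=true;network_mode=bridge,macvlan'`). A hypothetical standalone evaluator for that syntax — a sketch, not the project's actual implementation:

```python
def needs_satisfied(needs: str, values: dict) -> bool:
    """Evaluate 'var=a,b;other=c' as (var in {a, b}) AND (other == c)."""
    for condition in needs.split(";"):          # ';' separates AND conditions
        name, _, allowed = condition.partition("=")
        alternatives = [v.strip() for v in allowed.split(",")]  # ',' separates OR values
        if str(values.get(name.strip())) not in alternatives:
            return False
    return True


assert needs_satisfied(
    "traefik_enabled=true;network_mode=bridge,macvlan",
    {"traefik_enabled": "true", "network_mode": "macvlan"},
)
assert not needs_satisfied("network_mode=bridge,macvlan", {"network_mode": "host"})
```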

+ 70 - 117
cli/core/library.py

@@ -1,21 +1,23 @@
 from __future__ import annotations
 
-from pathlib import Path
 import logging
-from typing import Optional
+from pathlib import Path
+
 import yaml
 
-from .exceptions import LibraryError, TemplateNotFoundError, DuplicateTemplateError
+from .config import ConfigManager
+from .exceptions import DuplicateTemplateError, LibraryError, TemplateNotFoundError
 
 logger = logging.getLogger(__name__)
 
+# Qualified ID format: "template_id.library_name"
+QUALIFIED_ID_PARTS = 2
+
 
 class Library:
     """Represents a single library with a specific path."""
 
-    def __init__(
-        self, name: str, path: Path, priority: int = 0, library_type: str = "git"
-    ) -> None:
+    def __init__(self, name: str, path: Path, priority: int = 0, library_type: str = "git") -> None:
         """Initialize a library instance.
 
         Args:
@@ -25,9 +27,7 @@ class Library:
           library_type: Type of library ("git" or "static")
         """
         if library_type not in ("git", "static"):
-            raise ValueError(
-                f"Invalid library type: {library_type}. Must be 'git' or 'static'."
-            )
+            raise ValueError(f"Invalid library type: {library_type}. Must be 'git' or 'static'.")
 
         self.name = name
         self.path = path
@@ -45,12 +45,10 @@ class Library:
             return False
 
         try:
-            with open(template_file, "r", encoding="utf-8") as f:
+            with template_file.open(encoding="utf-8") as f:
                 docs = [doc for doc in yaml.safe_load_all(f) if doc]
-                return (
-                    docs[0].get("metadata", {}).get("draft", False) if docs else False
-                )
-        except (yaml.YAMLError, IOError, OSError) as e:
+                return docs[0].get("metadata", {}).get("draft", False) if docs else False
+        except (yaml.YAMLError, OSError) as e:
             logger.warning(f"Error checking draft status for {template_path}: {e}")
             return False
 
@@ -67,9 +65,7 @@ class Library:
         Raises:
             FileNotFoundError: If the template ID is not found in this library or is marked as draft
         """
-        logger.debug(
-            f"Looking for template '{template_id}' in module '{module_name}' in library '{self.name}'"
-        )
+        logger.debug(f"Looking for template '{template_id}' in module '{module_name}' in library '{self.name}'")
 
         # Build the path to the specific template directory
         template_path = self.path / module_name / template_id
@@ -85,9 +81,7 @@ class Library:
         logger.debug(f"Found template '{template_id}' at: {template_path}")
         return template_path, self.name
 
-    def find(
-        self, module_name: str, sort_results: bool = False
-    ) -> list[tuple[Path, str]]:
+    def find(self, module_name: str, sort_results: bool = False) -> list[tuple[Path, str]]:
         """Find templates in this library for a specific module.
 
         Excludes templates marked as draft.
@@ -102,27 +96,21 @@ class Library:
         Raises:
             FileNotFoundError: If the module directory is not found in this library
         """
-        logger.debug(
-            f"Looking for templates in module '{module_name}' in library '{self.name}'"
-        )
+        logger.debug(f"Looking for templates in module '{module_name}' in library '{self.name}'")
 
         # Build the path to the module directory
         module_path = self.path / module_name
 
         # Check if the module directory exists
         if not module_path.is_dir():
-            raise LibraryError(
-                f"Module '{module_name}' not found in library '{self.name}'"
-            )
+            raise LibraryError(f"Module '{module_name}' not found in library '{self.name}'")
 
         # Track seen IDs to detect duplicates within this library
         seen_ids = {}
         template_dirs = []
         try:
             for item in module_path.iterdir():
-                has_template = item.is_dir() and any(
-                    (item / f).exists() for f in ("template.yaml", "template.yml")
-                )
+                has_template = item.is_dir() and any((item / f).exists() for f in ("template.yaml", "template.yml"))
                 if has_template and not self._is_template_draft(item):
                     template_id = item.name
 
@@ -137,7 +125,7 @@ class Library:
         except PermissionError as e:
             raise LibraryError(
                 f"Permission denied accessing module '{module_name}' in library '{self.name}': {e}"
-            )
+            ) from e
 
         # Sort if requested
         if sort_results:
@@ -152,11 +140,38 @@ class LibraryManager:
 
     def __init__(self) -> None:
         """Initialize LibraryManager with git-based libraries from config."""
-        from .config import ConfigManager
-
         self.config = ConfigManager()
         self.libraries = self._load_libraries_from_config()
 
+    def _resolve_git_library_path(self, name: str, lib_config: dict, libraries_path: Path) -> Path:
+        """Resolve path for a git-based library."""
+        directory = lib_config.get("directory", ".")
+        library_base = libraries_path / name
+        if directory and directory != ".":
+            return library_base / directory
+        return library_base
+
+    def _resolve_static_library_path(self, name: str, lib_config: dict) -> Path | None:
+        """Resolve path for a static library."""
+        path_str = lib_config.get("path")
+        if not path_str:
+            logger.warning(f"Static library '{name}' has no path configured")
+            return None
+
+        library_path = Path(path_str).expanduser()
+        if not library_path.is_absolute():
+            library_path = (self.config.config_path.parent / library_path).resolve()
+        return library_path
+
+    def _warn_missing_library(self, name: str, library_path: Path, lib_type: str) -> None:
+        """Log warning about missing library."""
+        if lib_type == "git":
+            logger.warning(
+                f"Library '{name}' not found at {library_path}. Run 'boilerplates repo update' to sync libraries."
+            )
+        else:
+            logger.warning(f"Static library '{name}' not found at {library_path}")
+
     def _load_libraries_from_config(self) -> list[Library]:
         """Load libraries from configuration.
 
@@ -165,8 +180,6 @@ class LibraryManager:
         """
         libraries = []
         libraries_path = self.config.get_libraries_path()
-
-        # Get library configurations from config
         library_configs = self.config.get_libraries()
 
         for i, lib_config in enumerate(library_configs):
@@ -176,58 +189,25 @@ class LibraryManager:
                 continue
 
             name = lib_config.get("name")
-            lib_type = lib_config.get(
-                "type", "git"
-            )  # Default to "git" for backward compat
+            lib_type = lib_config.get("type", "git")
 
-            # Handle library type-specific path resolution
+            # Resolve library path based on type
             if lib_type == "git":
-                # Existing git logic
-                directory = lib_config.get("directory", ".")
-
-                # Build path to library: ~/.config/boilerplates/libraries/{name}/{directory}/
-                # For sparse-checkout, files remain in the specified directory
-                library_base = libraries_path / name
-                if directory and directory != ".":
-                    library_path = library_base / directory
-                else:
-                    library_path = library_base
-
+                library_path = self._resolve_git_library_path(name, lib_config, libraries_path)
             elif lib_type == "static":
-                # New static logic - use path directly
-                path_str = lib_config.get("path")
-                if not path_str:
-                    logger.warning(f"Static library '{name}' has no path configured")
+                library_path = self._resolve_static_library_path(name, lib_config)
+                if not library_path:
                     continue
-
-                # Expand ~ and resolve relative paths
-                library_path = Path(path_str).expanduser()
-                if not library_path.is_absolute():
-                    # Resolve relative to config directory
-                    library_path = (
-                        self.config.config_path.parent / library_path
-                    ).resolve()
-
             else:
-                logger.warning(
-                    f"Unknown library type '{lib_type}' for library '{name}'"
-                )
+                logger.warning(f"Unknown library type '{lib_type}' for library '{name}'")
                 continue
 
             # Check if library path exists
             if not library_path.exists():
-                if lib_type == "git":
-                    logger.warning(
-                        f"Library '{name}' not found at {library_path}. "
-                        f"Run 'repo update' to sync libraries."
-                    )
-                else:
-                    logger.warning(
-                        f"Static library '{name}' not found at {library_path}"
-                    )
+                self._warn_missing_library(name, library_path, lib_type)
                 continue
 
-            # Create Library instance with type and priority based on order (first = highest priority)
+            # Create Library instance with priority based on order
             priority = len(library_configs) - i
             libraries.append(
                 Library(
@@ -237,18 +217,14 @@ class LibraryManager:
                     library_type=lib_type,
                 )
             )
-            logger.debug(
-                f"Loaded {lib_type} library '{name}' from {library_path} with priority {priority}"
-            )
+            logger.debug(f"Loaded {lib_type} library '{name}' from {library_path} with priority {priority}")
 
         if not libraries:
-            logger.warning("No libraries loaded. Run 'repo update' to sync libraries.")
+            logger.warning("No libraries loaded. Run 'boilerplates repo update' to sync libraries.")
 
         return libraries
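The static-path resolution that the extracted `_resolve_static_library_path` helper now encapsulates (expand `~`, anchor relative paths at the config directory) can be sketched as follows. The function name and the `config_dir` parameter are illustrative, not the actual helper signature:

```python
from pathlib import Path


def resolve_static_path(path_str: str, config_dir: Path) -> Path:
    """Expand ~ and anchor relative paths at the config directory (sketch)."""
    path = Path(path_str).expanduser()
    if not path.is_absolute():
        # Relative paths are resolved against the directory holding config.yaml
        path = (config_dir / path).resolve()
    return path
```

For example, `resolve_static_path("libs", Path("/home/user/.config/boilerplates"))` yields an absolute path under the config directory, while an already-absolute input is returned unchanged.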
 
-    def find_by_id(
-        self, module_name: str, template_id: str
-    ) -> Optional[tuple[Path, str]]:
+    def find_by_id(self, module_name: str, template_id: str) -> tuple[Path, str] | None:
         """Find a template by its ID across all libraries.
 
         Supports both simple IDs and qualified IDs (template.library format).
@@ -260,34 +236,24 @@ class LibraryManager:
         Returns:
             Tuple of (template_path, library_name) if found, None otherwise
         """
-        logger.debug(
-            f"Searching for template '{template_id}' in module '{module_name}' across all libraries"
-        )
+        logger.debug(f"Searching for template '{template_id}' in module '{module_name}' across all libraries")
 
         # Check if this is a qualified ID (contains '.')
         if "." in template_id:
             parts = template_id.rsplit(".", 1)
-            if len(parts) == 2:
+            if len(parts) == QUALIFIED_ID_PARTS:
                 base_id, requested_lib = parts
-                logger.debug(
-                    f"Parsing qualified ID: base='{base_id}', library='{requested_lib}'"
-                )
+                logger.debug(f"Parsing qualified ID: base='{base_id}', library='{requested_lib}'")
 
                 # Try to find in the specific library
                 for library in self.libraries:
                     if library.name == requested_lib:
                         try:
-                            template_path, lib_name = library.find_by_id(
-                                module_name, base_id
-                            )
-                            logger.debug(
-                                f"Found template '{base_id}' in library '{requested_lib}'"
-                            )
+                            template_path, lib_name = library.find_by_id(module_name, base_id)
+                            logger.debug(f"Found template '{base_id}' in library '{requested_lib}'")
                             return template_path, lib_name
                         except TemplateNotFoundError:
-                            logger.debug(
-                                f"Template '{base_id}' not found in library '{requested_lib}'"
-                            )
+                            logger.debug(f"Template '{base_id}' not found in library '{requested_lib}'")
                             return None
 
                 logger.debug(f"Library '{requested_lib}' not found")
@@ -297,9 +263,7 @@ class LibraryManager:
         for library in sorted(self.libraries, key=lambda x: x.priority, reverse=True):
             try:
                 template_path, lib_name = library.find_by_id(module_name, template_id)
-                logger.debug(
-                    f"Found template '{template_id}' in library '{library.name}'"
-                )
+                logger.debug(f"Found template '{template_id}' in library '{library.name}'")
                 return template_path, lib_name
             except TemplateNotFoundError:
                 # Continue searching in next library
@@ -308,9 +272,7 @@ class LibraryManager:
         logger.debug(f"Template '{template_id}' not found in any library")
         return None
 
-    def find(
-        self, module_name: str, sort_results: bool = False
-    ) -> list[tuple[Path, str, bool]]:
+    def find(self, module_name: str, sort_results: bool = False) -> list[tuple[Path, str, bool]]:
         """Find templates across all libraries for a specific module.
 
         Handles duplicates by qualifying IDs with library names when needed.
@@ -323,9 +285,7 @@ class LibraryManager:
             List of tuples (template_path, library_name, needs_qualification)
             where needs_qualification is True if the template ID appears in multiple libraries
         """
-        logger.debug(
-            f"Searching for templates in module '{module_name}' across all libraries"
-        )
+        logger.debug(f"Searching for templates in module '{module_name}' across all libraries")
 
         all_templates = []
 
@@ -334,16 +294,12 @@ class LibraryManager:
             try:
                 templates = library.find(module_name, sort_results=False)
                 all_templates.extend(templates)
-                logger.debug(
-                    f"Found {len(templates)} templates in library '{library.name}'"
-                )
+                logger.debug(f"Found {len(templates)} templates in library '{library.name}'")
             except (LibraryError, DuplicateTemplateError) as e:
                 # DuplicateTemplateError from library.find() should propagate up
                 if isinstance(e, DuplicateTemplateError):
                     raise
-                logger.debug(
-                    f"Module '{module_name}' not found in library '{library.name}'"
-                )
+                logger.debug(f"Module '{module_name}' not found in library '{library.name}'")
                 continue
 
         # Track template IDs and their libraries to detect cross-library duplicates
@@ -360,10 +316,7 @@ class LibraryManager:
             if len(occurrences) > 1:
                 # Duplicate across libraries - mark for qualified IDs
                 lib_names = ", ".join(lib for _, lib in occurrences)
-                logger.info(
-                    f"Template '{template_id}' found in multiple libraries: {lib_names}. "
-                    f"Using qualified IDs."
-                )
+                logger.info(f"Template '{template_id}' found in multiple libraries: {lib_names}. Using qualified IDs.")
                 for template_path, library_name in occurrences:
                     # Mark that this ID needs qualification
                     result.append((template_path, library_name, True))
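The two mechanisms in this hunk, priority from list order (first-listed library wins) and cross-library duplicate detection, can be sketched in isolation; the function names here are illustrative:

```python
from collections import defaultdict


def assign_priorities(names: list[str]) -> dict[str, int]:
    """First-listed library gets the highest priority number."""
    return {name: len(names) - i for i, name in enumerate(names)}


def find_duplicate_ids(entries: list[tuple[str, str]]) -> set[str]:
    """Return template IDs that appear in more than one library."""
    by_id: dict[str, set[str]] = defaultdict(set)
    for template_id, library in entries:
        by_id[template_id].add(library)
    return {tid for tid, libs in by_id.items() if len(libs) > 1}
```

IDs in the returned set are the ones that would be displayed in qualified `template.library` form.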

+ 0 - 1298
cli/core/module.py

@@ -1,1298 +0,0 @@
-from __future__ import annotations
-
-import logging
-from abc import ABC
-from pathlib import Path
-from typing import Any, Optional, List, Dict
-
-from rich.console import Console
-from rich.panel import Panel
-from rich.prompt import Confirm
-from typer import Argument, Option, Typer, Exit
-
-from .display import DisplayManager
-from .exceptions import (
-    TemplateRenderError,
-    TemplateSyntaxError,
-    TemplateValidationError,
-)
-from .library import LibraryManager
-from .prompt import PromptHandler
-from .template import Template
-
-logger = logging.getLogger(__name__)
-console = Console()
-console_err = Console(stderr=True)
-
-
-def parse_var_inputs(var_options: List[str], extra_args: List[str]) -> Dict[str, Any]:
-    """Parse variable inputs from --var options and extra args.
-
-    Supports formats:
-        --var KEY=VALUE
-        --var KEY VALUE
-
-    Args:
-        var_options: List of variable options from CLI
-        extra_args: Additional arguments that may contain values
-
-    Returns:
-        Dictionary of parsed variables
-    """
-    variables = {}
-
-    # Parse --var KEY=VALUE format
-    for var_option in var_options:
-        if "=" in var_option:
-            key, value = var_option.split("=", 1)
-            variables[key] = value
-        else:
-            # --var KEY VALUE format - value should be in extra_args
-            if extra_args:
-                variables[var_option] = extra_args.pop(0)
-            else:
-                logger.warning(f"No value provided for variable '{var_option}'")
-
-    return variables
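The behavior of `parse_var_inputs` above can be exercised with a simplified copy (logging removed); `=` only splits on the first occurrence, so values may themselves contain `=`:

```python
def parse_vars(var_options: list[str], extra_args: list[str]) -> dict[str, str]:
    """Parse --var KEY=VALUE and --var KEY VALUE forms (simplified sketch)."""
    variables: dict[str, str] = {}
    for opt in var_options:
        if "=" in opt:
            # Split only on the first '=', so the value may contain '='
            key, value = opt.split("=", 1)
            variables[key] = value
        elif extra_args:
            # --var KEY VALUE: the value arrives as the next positional arg
            variables[opt] = extra_args.pop(0)
    return variables
```

So `parse_vars(["a=1", "b"], ["2"])` produces `{"a": "1", "b": "2"}`, consuming one extra arg per bare key.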
-
-
-class Module(ABC):
-    """Streamlined base module that auto-detects variables from templates."""
-
-    # Schema version supported by this module (override in subclasses)
-    schema_version: str = "1.0"
-
-    def __init__(self) -> None:
-        if not all([self.name, self.description]):
-            raise ValueError(
-                f"Module {self.__class__.__name__} must define name and description"
-            )
-
-        logger.info(f"Initializing module '{self.name}'")
-        logger.debug(
-            f"Module '{self.name}' configuration: description='{self.description}'"
-        )
-        self.libraries = LibraryManager()
-        self.display = DisplayManager()
-
-    def list(
-        self,
-        raw: bool = Option(
-            False, "--raw", help="Output raw list format instead of rich table"
-        ),
-    ) -> list[Template]:
-        """List all templates."""
-        logger.debug(f"Listing templates for module '{self.name}'")
-        templates = []
-
-        entries = self.libraries.find(self.name, sort_results=True)
-        for entry in entries:
-            # Unpack entry - now returns (path, library_name, needs_qualification)
-            template_dir = entry[0]
-            library_name = entry[1]
-            needs_qualification = entry[2] if len(entry) > 2 else False
-
-            try:
-                # Get library object to determine type
-                library = next(
-                    (
-                        lib
-                        for lib in self.libraries.libraries
-                        if lib.name == library_name
-                    ),
-                    None,
-                )
-                library_type = library.library_type if library else "git"
-
-                template = Template(
-                    template_dir, library_name=library_name, library_type=library_type
-                )
-
-                # Validate schema version compatibility
-                template._validate_schema_version(self.schema_version, self.name)
-
-                # If template ID needs qualification, set qualified ID
-                if needs_qualification:
-                    template.set_qualified_id()
-
-                templates.append(template)
-            except Exception as exc:
-                logger.error(f"Failed to load template from {template_dir}: {exc}")
-                continue
-
-        filtered_templates = templates
-
-        if filtered_templates:
-            if raw:
-                # Output raw format (tab-separated values for easy filtering with awk/sed/cut)
-                # Format: ID\tNAME\tTAGS\tVERSION\tLIBRARY
-                for template in filtered_templates:
-                    name = template.metadata.name or "Unnamed Template"
-                    tags_list = template.metadata.tags or []
-                    tags = ",".join(tags_list) if tags_list else "-"
-                    version = (
-                        str(template.metadata.version)
-                        if template.metadata.version
-                        else "-"
-                    )
-                    library = template.metadata.library or "-"
-                    print(f"{template.id}\t{name}\t{tags}\t{version}\t{library}")
-            else:
-                # Output rich table format
-                self.display.display_templates_table(
-                    filtered_templates, self.name, f"{self.name.capitalize()} templates"
-                )
-        else:
-            logger.info(f"No templates found for module '{self.name}'")
-
-        return filtered_templates
-
-    def search(
-        self, query: str = Argument(..., help="Search string to filter templates by ID")
-    ) -> list[Template]:
-        """Search for templates by ID containing the search string."""
-        logger.debug(
-            f"Searching templates for module '{self.name}' with query='{query}'"
-        )
-        templates = []
-
-        entries = self.libraries.find(self.name, sort_results=True)
-        for entry in entries:
-            # Unpack entry - now returns (path, library_name, needs_qualification)
-            template_dir = entry[0]
-            library_name = entry[1]
-            needs_qualification = entry[2] if len(entry) > 2 else False
-
-            try:
-                # Get library object to determine type
-                library = next(
-                    (
-                        lib
-                        for lib in self.libraries.libraries
-                        if lib.name == library_name
-                    ),
-                    None,
-                )
-                library_type = library.library_type if library else "git"
-
-                template = Template(
-                    template_dir, library_name=library_name, library_type=library_type
-                )
-
-                # Validate schema version compatibility
-                template._validate_schema_version(self.schema_version, self.name)
-
-                # If template ID needs qualification, set qualified ID
-                if needs_qualification:
-                    template.set_qualified_id()
-
-                templates.append(template)
-            except Exception as exc:
-                logger.error(f"Failed to load template from {template_dir}: {exc}")
-                continue
-
-        # Apply search filtering
-        filtered_templates = [t for t in templates if query.lower() in t.id.lower()]
-
-        if filtered_templates:
-            logger.info(
-                f"Found {len(filtered_templates)} templates matching '{query}' for module '{self.name}'"
-            )
-            self.display.display_templates_table(
-                filtered_templates,
-                self.name,
-                f"{self.name.capitalize()} templates matching '{query}'",
-            )
-        else:
-            logger.info(
-                f"No templates found matching '{query}' for module '{self.name}'"
-            )
-            self.display.display_warning(
-                f"No templates found matching '{query}'",
-                context=f"module '{self.name}'",
-            )
-
-        return filtered_templates
-
-    def show(
-        self,
-        id: str,
-    ) -> None:
-        """Show template details."""
-        logger.debug(f"Showing template '{id}' from module '{self.name}'")
-        template = self._load_template_by_id(id)
-
-        if not template:
-            self.display.display_error(
-                f"Template '{id}' not found", context=f"module '{self.name}'"
-            )
-            return
-
-        # Apply config defaults (same as in generate)
-        # This ensures the display shows the actual defaults that will be used
-        if template.variables:
-            from .config import ConfigManager
-
-            config = ConfigManager()
-            config_defaults = config.get_defaults(self.name)
-
-            if config_defaults:
-                logger.debug(f"Loading config defaults for module '{self.name}'")
-                # Apply config defaults (this respects the variable types and validation)
-                successful = template.variables.apply_defaults(
-                    config_defaults, "config"
-                )
-                if successful:
-                    logger.debug(
-                        f"Applied config defaults for: {', '.join(successful)}"
-                    )
-
-            # Re-sort sections after applying config (toggle values may have changed)
-            template.variables.sort_sections()
-
-            # Reset disabled bool variables to False to prevent confusion
-            reset_vars = template.variables.reset_disabled_bool_variables()
-            if reset_vars:
-                logger.debug(
-                    f"Reset {len(reset_vars)} disabled bool variables to False"
-                )
-
-        self._display_template_details(template, id)
-
-    def _apply_variable_defaults(self, template: Template) -> None:
-        """Apply config defaults and CLI overrides to template variables.
-
-        Args:
-            template: Template instance with variables to configure
-        """
-        if not template.variables:
-            return
-
-        from .config import ConfigManager
-
-        config = ConfigManager()
-        config_defaults = config.get_defaults(self.name)
-
-        if config_defaults:
-            logger.info(f"Loading config defaults for module '{self.name}'")
-            successful = template.variables.apply_defaults(config_defaults, "config")
-            if successful:
-                logger.debug(f"Applied config defaults for: {', '.join(successful)}")
-
-    def _apply_cli_overrides(
-        self, template: Template, var: Optional[List[str]], ctx=None
-    ) -> None:
-        """Apply CLI variable overrides to template.
-
-        Args:
-            template: Template instance to apply overrides to
-            var: List of variable override strings from --var flags
-            ctx: Context object containing extra args (optional, will get current context if None)
-        """
-        if not template.variables:
-            return
-
-        # Get context if not provided (compatible with all Typer versions)
-        if ctx is None:
-            import click
-
-            try:
-                ctx = click.get_current_context()
-            except RuntimeError:
-                ctx = None
-
-        extra_args = list(ctx.args) if ctx and hasattr(ctx, "args") else []
-        cli_overrides = parse_var_inputs(var or [], extra_args)
-
-        if cli_overrides:
-            logger.info(f"Received {len(cli_overrides)} variable overrides from CLI")
-            successful_overrides = template.variables.apply_defaults(
-                cli_overrides, "cli"
-            )
-            if successful_overrides:
-                logger.debug(
-                    f"Applied CLI overrides for: {', '.join(successful_overrides)}"
-                )
-
-    def _collect_variable_values(
-        self, template: Template, interactive: bool
-    ) -> Dict[str, Any]:
-        """Collect variable values from user prompts and template defaults.
-
-        Args:
-            template: Template instance with variables
-            interactive: Whether to prompt user for values interactively
-
-        Returns:
-            Dictionary of variable names to values
-        """
-        variable_values = {}
-
-        # Collect values interactively if enabled
-        if interactive and template.variables:
-            prompt_handler = PromptHandler()
-            collected_values = prompt_handler.collect_variables(template.variables)
-            if collected_values:
-                variable_values.update(collected_values)
-                logger.info(
-                    f"Collected {len(collected_values)} variable values from user input"
-                )
-
-        # Add satisfied variable values (respects dependencies and toggles)
-        if template.variables:
-            variable_values.update(template.variables.get_satisfied_values())
-
-        return variable_values
-
-    def _check_output_directory(
-        self, output_dir: Path, rendered_files: Dict[str, str], interactive: bool
-    ) -> Optional[List[Path]]:
-        """Check output directory for conflicts and get user confirmation if needed.
-
-        Args:
-            output_dir: Directory where files will be written
-            rendered_files: Dictionary of file paths to rendered content
-            interactive: Whether to prompt user for confirmation
-
-        Returns:
-            List of existing files that will be overwritten, or None to cancel
-        """
-        dir_exists = output_dir.exists()
-        dir_not_empty = dir_exists and any(output_dir.iterdir())
-
-        # Check which files already exist
-        existing_files = []
-        if dir_exists:
-            for file_path in rendered_files.keys():
-                full_path = output_dir / file_path
-                if full_path.exists():
-                    existing_files.append(full_path)
-
-        # Warn if directory is not empty
-        if dir_not_empty:
-            if interactive:
-                details = []
-                if existing_files:
-                    details.append(
-                        f"{len(existing_files)} file(s) will be overwritten."
-                    )
-
-                if not self.display.display_warning_with_confirmation(
-                    f"Directory '{output_dir}' is not empty.",
-                    details if details else None,
-                    default=False,
-                ):
-                    self.display.display_info("Generation cancelled")
-                    return None
-            else:
-                # Non-interactive mode: show warning but continue
-                logger.warning(f"Directory '{output_dir}' is not empty")
-                if existing_files:
-                    logger.warning(f"{len(existing_files)} file(s) will be overwritten")
-
-        return existing_files
-
-    def _get_generation_confirmation(
-        self,
-        output_dir: Path,
-        rendered_files: Dict[str, str],
-        existing_files: Optional[List[Path]],
-        dir_not_empty: bool,
-        dry_run: bool,
-        interactive: bool,
-    ) -> bool:
-        """Display file generation confirmation and get user approval.
-
-        Args:
-            output_dir: Output directory path
-            rendered_files: Dictionary of file paths to content
-            existing_files: List of existing files that will be overwritten
-            dir_not_empty: Whether output directory already contains files
-            dry_run: Whether this is a dry run
-            interactive: Whether to prompt for confirmation
-
-        Returns:
-            True if user confirms generation, False to cancel
-        """
-        if not interactive:
-            return True
-
-        self.display.display_file_generation_confirmation(
-            output_dir, rendered_files, existing_files if existing_files else None
-        )
-
-        # Final confirmation (only if we didn't already ask about overwriting)
-        if not dir_not_empty and not dry_run:
-            if not Confirm.ask("Generate these files?", default=True):
-                self.display.display_info("Generation cancelled")
-                return False
-
-        return True
-
-    def _execute_dry_run(
-        self,
-        id: str,
-        output_dir: Path,
-        rendered_files: Dict[str, str],
-        show_files: bool,
-    ) -> None:
-        """Execute dry run mode with comprehensive simulation.
-
-        Simulates all filesystem operations that would occur during actual generation,
-        including directory creation, file writing, and permission checks.
-
-        Args:
-            id: Template ID
-            output_dir: Directory where files would be written
-            rendered_files: Dictionary of file paths to rendered content
-            show_files: Whether to display file contents
-        """
-        import os
-
-        console.print()
-        console.print(
-            "[bold cyan]Dry Run Mode - Simulating File Generation[/bold cyan]"
-        )
-        console.print()
-
-        # Simulate directory creation
-        self.display.display_heading("Directory Operations", icon_type="folder")
-
-        # Check if output directory exists
-        if output_dir.exists():
-            self.display.display_success(
-                f"Output directory exists: [cyan]{output_dir}[/cyan]"
-            )
-            # Check if we have write permissions
-            if os.access(output_dir, os.W_OK):
-                self.display.display_success("Write permission verified")
-            else:
-                self.display.display_warning("Write permission may be denied")
-        else:
-            console.print(
-                f"  [dim]→[/dim] Would create output directory: [cyan]{output_dir}[/cyan]"
-            )
-            # Check if parent directory exists and is writable
-            parent = output_dir.parent
-            if parent.exists() and os.access(parent, os.W_OK):
-                self.display.display_success("Parent directory writable")
-            else:
-                self.display.display_warning("Parent directory may not be writable")
-
-        # Collect unique subdirectories that would be created
-        subdirs = set()
-        for file_path in rendered_files.keys():
-            parts = Path(file_path).parts
-            for i in range(1, len(parts)):
-                subdirs.add(Path(*parts[:i]))
-
-        if subdirs:
-            console.print(
-                f"  [dim]→[/dim] Would create {len(subdirs)} subdirectory(ies)"
-            )
-            for subdir in sorted(subdirs):
-                console.print(f"    [dim]📁[/dim] {subdir}/")
-
-        console.print()
-
-        # Display file operations in a table
-        self.display.display_heading("File Operations", icon_type="file")
-
-        total_size = 0
-        new_files = 0
-        overwrite_files = 0
-        file_operations = []
-
-        for file_path, content in sorted(rendered_files.items()):
-            full_path = output_dir / file_path
-            file_size = len(content.encode("utf-8"))
-            total_size += file_size
-
-            # Determine status
-            if full_path.exists():
-                status = "Overwrite"
-                overwrite_files += 1
-            else:
-                status = "Create"
-                new_files += 1
-
-            file_operations.append((file_path, file_size, status))
-
-        self.display.display_file_operation_table(file_operations)
-        console.print()
-
-        # Summary statistics
-        if total_size < 1024:
-            size_str = f"{total_size}B"
-        elif total_size < 1024 * 1024:
-            size_str = f"{total_size / 1024:.1f}KB"
-        else:
-            size_str = f"{total_size / (1024 * 1024):.1f}MB"
-
-        summary_items = {
-            "Total files:": str(len(rendered_files)),
-            "New files:": str(new_files),
-            "Files to overwrite:": str(overwrite_files),
-            "Total size:": size_str,
-        }
-        self.display.display_summary_table("Summary", summary_items)
-        console.print()
-
-        # Show file contents if requested
-        if show_files:
-            console.print("[bold cyan]Generated File Contents:[/bold cyan]")
-            console.print()
-            for file_path, content in sorted(rendered_files.items()):
-                console.print(f"[cyan]File:[/cyan] {file_path}")
-                print(f"{'─' * 80}")
-                print(content)
-                print()  # Add blank line after content
-            console.print()
-
-        self.display.display_success("Dry run complete - no files were written")
-        console.print(f"[dim]Files would have been generated in '{output_dir}'[/dim]")
-        logger.info(
-            f"Dry run completed for template '{id}' - {len(rendered_files)} files, {total_size} bytes"
-        )
-
-    def _write_generated_files(
-        self, output_dir: Path, rendered_files: Dict[str, str], quiet: bool = False
-    ) -> None:
-        """Write rendered files to the output directory.
-
-        Args:
-            output_dir: Directory to write files to
-            rendered_files: Dictionary of file paths to rendered content
-            quiet: Suppress output messages
-        """
-        output_dir.mkdir(parents=True, exist_ok=True)
-
-        for file_path, content in rendered_files.items():
-            full_path = output_dir / file_path
-            full_path.parent.mkdir(parents=True, exist_ok=True)
-            with open(full_path, "w", encoding="utf-8") as f:
-                f.write(content)
-            if not quiet:
-                console.print(
-                    f"[green]Generated file: {file_path}[/green]"
-                )  # Keep simple per-file output
-
-        if not quiet:
-            self.display.display_success(
-                f"Template generated successfully in '{output_dir}'"
-            )
-        logger.info(f"Template written to directory: {output_dir}")
-
-    def generate(
-        self,
-        id: str = Argument(..., help="Template ID"),
-        directory: Optional[str] = Argument(
-            None, help="Output directory (defaults to template ID)"
-        ),
-        interactive: bool = Option(
-            True,
-            "--interactive/--no-interactive",
-            "-i/-n",
-            help="Enable interactive prompting for variables",
-        ),
-        var: Optional[list[str]] = Option(
-            None,
-            "--var",
-            "-v",
-            help="Variable override (repeatable). Supports: KEY=VALUE or KEY VALUE",
-        ),
-        dry_run: bool = Option(
-            False, "--dry-run", help="Preview template generation without writing files"
-        ),
-        show_files: bool = Option(
-            False,
-            "--show-files",
-            help="Display generated file contents in plain text (use with --dry-run)",
-        ),
-        quiet: bool = Option(
-            False, "--quiet", "-q", help="Suppress all non-error output"
-        ),
-    ) -> None:
-        """Generate from template.
-
-        Variable precedence chain (lowest to highest):
-        1. Module spec (defined in cli/modules/*.py)
-        2. Template spec (from template.yaml)
-        3. Config defaults (from ~/.config/boilerplates/config.yaml)
-        4. CLI overrides (--var flags)
-
-        Examples:
-            # Generate to directory named after template
-            cli compose generate traefik
-
-            # Generate to custom directory
-            cli compose generate traefik my-proxy
-
-            # Generate with variables
-            cli compose generate traefik --var traefik_enabled=false
-
-            # Preview without writing files (dry run)
-            cli compose generate traefik --dry-run
-
-            # Preview and show generated file contents
-            cli compose generate traefik --dry-run --show-files
-        """
-        logger.info(
-            f"Starting generation for template '{id}' from module '{self.name}'"
-        )
-
-        # Create a display manager with quiet mode if needed
-        display = DisplayManager(quiet=quiet) if quiet else self.display
-
-        template = self._load_template_by_id(id)
-
-        # Apply defaults and overrides
-        self._apply_variable_defaults(template)
-        self._apply_cli_overrides(template, var)
-
-        # Re-sort sections after all overrides (toggle values may have changed)
-        if template.variables:
-            template.variables.sort_sections()
-
-            # Reset disabled bool variables to False to prevent confusion
-            reset_vars = template.variables.reset_disabled_bool_variables()
-            if reset_vars:
-                logger.debug(
-                    f"Reset {len(reset_vars)} disabled bool variables to False"
-                )
-
-        if not quiet:
-            self._display_template_details(template, id)
-            console.print()
-
-        # Collect variable values
-        variable_values = self._collect_variable_values(template, interactive)
-
-        try:
-            # Validate and render template
-            if template.variables:
-                template.variables.validate_all()
-
-            # Check if we're in debug mode (logger level is DEBUG)
-            debug_mode = logger.isEnabledFor(logging.DEBUG)
-
-            rendered_files, variable_values = template.render(
-                template.variables, debug=debug_mode
-            )
-
-            if not rendered_files:
-                display.display_error(
-                    "Template rendering returned no files",
-                    context="template generation",
-                )
-                raise Exit(code=1)
-
-            logger.info(f"Successfully rendered template '{id}'")
-
-            # Determine output directory
-            if directory:
-                output_dir = Path(directory)
-                # Check if path looks like an absolute path but is missing the leading slash
-                # This handles cases like "Users/username/path" which should be "/Users/username/path"
-                if not output_dir.is_absolute() and str(output_dir).startswith(
-                    ("Users/", "home/", "usr/", "opt/", "var/", "tmp/")
-                ):
-                    output_dir = Path("/") / output_dir
-                    logger.debug(
-                        f"Normalized relative-looking absolute path to: {output_dir}"
-                    )
-            else:
-                output_dir = Path(id)
-
-            # Check for conflicts and get confirmation (skip in quiet mode)
-            if not quiet:
-                existing_files = self._check_output_directory(
-                    output_dir, rendered_files, interactive
-                )
-                if existing_files is None:
-                    return  # User cancelled
-
-                # Get final confirmation for generation
-                dir_not_empty = output_dir.exists() and any(output_dir.iterdir())
-                if not self._get_generation_confirmation(
-                    output_dir,
-                    rendered_files,
-                    existing_files,
-                    dir_not_empty,
-                    dry_run,
-                    interactive,
-                ):
-                    return  # User cancelled
-            else:
-                # In quiet mode, just check for existing files without prompts
-                existing_files = []
-
-            # Execute generation (dry run or actual)
-            if dry_run:
-                if not quiet:
-                    self._execute_dry_run(id, output_dir, rendered_files, show_files)
-            else:
-                self._write_generated_files(output_dir, rendered_files, quiet=quiet)
-
-            # Display next steps (not in quiet mode)
-            if template.metadata.next_steps and not quiet:
-                display.display_next_steps(
-                    template.metadata.next_steps, variable_values
-                )
-
-        except TemplateRenderError as e:
-            # Display enhanced error information for template rendering errors (always show errors)
-            display.display_template_render_error(e, context=f"template '{id}'")
-            raise Exit(code=1)
-        except Exception as e:
-            display.display_error(str(e), context=f"generating template '{id}'")
-            raise Exit(code=1)
-
-    def config_get(
-        self,
-        var_name: Optional[str] = Argument(
-            None, help="Variable name to get (omit to show all defaults)"
-        ),
-    ) -> None:
-        """Get default value(s) for this module.
-
-        Examples:
-            # Get all defaults for module
-            cli compose defaults get
-
-            # Get specific variable default
-            cli compose defaults get service_name
-        """
-        from .config import ConfigManager
-
-        config = ConfigManager()
-
-        if var_name:
-            # Get specific variable default
-            value = config.get_default_value(self.name, var_name)
-            if value is not None:
-                console.print(f"[green]{var_name}[/green] = [yellow]{value}[/yellow]")
-            else:
-                self.display.display_warning(
-                    f"No default set for variable '{var_name}'",
-                    context=f"module '{self.name}'",
-                )
-        else:
-            # Show all defaults (flat list)
-            defaults = config.get_defaults(self.name)
-            if defaults:
-                console.print(
-                    f"[bold]Config defaults for module '{self.name}':[/bold]\n"
-                )
-                for var_name, var_value in defaults.items():
-                    console.print(
-                        f"  [green]{var_name}[/green] = [yellow]{var_value}[/yellow]"
-                    )
-            else:
-                console.print(
-                    f"[yellow]No defaults configured for module '{self.name}'[/yellow]"
-                )
-
-    def config_set(
-        self,
-        var_name: str = Argument(..., help="Variable name or var=value format"),
-        value: Optional[str] = Argument(
-            None, help="Default value (not needed if using var=value format)"
-        ),
-    ) -> None:
-        """Set a default value for a variable.
-
-        This only sets the DEFAULT VALUE, not the variable spec.
-        The variable must be defined in the module or template spec.
-
-        Supports both formats:
-          - var_name value
-          - var_name=value
-
-        Examples:
-            # Set default value (format 1)
-            cli compose defaults set service_name my-awesome-app
-
-            # Set default value (format 2)
-            cli compose defaults set service_name=my-awesome-app
-
-            # Set author for all compose templates
-            cli compose defaults set author "Christian Lempa"
-        """
-        from .config import ConfigManager
-
-        config = ConfigManager()
-
-        # Parse var_name and value - support both "var value" and "var=value" formats
-        if "=" in var_name and value is None:
-            # Format: var_name=value
-            parts = var_name.split("=", 1)
-            actual_var_name = parts[0]
-            actual_value = parts[1]
-        elif value is not None:
-            # Format: var_name value
-            actual_var_name = var_name
-            actual_value = value
-        else:
-            self.display.display_error(
-                f"Missing value for variable '{var_name}'", context="config set"
-            )
-            console.print(
-                "[dim]Usage: defaults set VAR_NAME VALUE or defaults set VAR_NAME=VALUE[/dim]"
-            )
-            raise Exit(code=1)
-
-        # Set the default value
-        config.set_default_value(self.name, actual_var_name, actual_value)
-        self.display.display_success(
-            f"Set default: [cyan]{actual_var_name}[/cyan] = [yellow]{actual_value}[/yellow]"
-        )
-        console.print(
-            "\n[dim]This will be used as the default value when generating templates with this module.[/dim]"
-        )
-
-    def config_remove(
-        self,
-        var_name: str = Argument(..., help="Variable name to remove"),
-    ) -> None:
-        """Remove a specific default variable value.
-
-        Examples:
-            # Remove a default value
-            cli compose defaults rm service_name
-        """
-        from .config import ConfigManager
-
-        config = ConfigManager()
-        defaults = config.get_defaults(self.name)
-
-        if not defaults:
-            console.print(
-                f"[yellow]No defaults configured for module '{self.name}'[/yellow]"
-            )
-            return
-
-        if var_name in defaults:
-            del defaults[var_name]
-            config.set_defaults(self.name, defaults)
-            self.display.display_success(f"Removed default for '{var_name}'")
-        else:
-            self.display.display_error(f"No default found for variable '{var_name}'")
-
-    def config_clear(
-        self,
-        var_name: Optional[str] = Argument(
-            None, help="Variable name to clear (omit to clear all defaults)"
-        ),
-        force: bool = Option(False, "--force", "-f", help="Skip confirmation prompt"),
-    ) -> None:
-        """Clear default value(s) for this module.
-
-        Examples:
-            # Clear specific variable default
-            cli compose defaults clear service_name
-
-            # Clear all defaults for module
-            cli compose defaults clear --force
-        """
-        from .config import ConfigManager
-
-        config = ConfigManager()
-        defaults = config.get_defaults(self.name)
-
-        if not defaults:
-            console.print(
-                f"[yellow]No defaults configured for module '{self.name}'[/yellow]"
-            )
-            return
-
-        if var_name:
-            # Clear specific variable
-            if var_name in defaults:
-                del defaults[var_name]
-                config.set_defaults(self.name, defaults)
-                self.display.display_success(f"Cleared default for '{var_name}'")
-            else:
-                self.display.display_error(
-                    f"No default found for variable '{var_name}'"
-                )
-        else:
-            # Clear all defaults
-            if not force:
-                detail_lines = [
-                    f"This will clear ALL defaults for module '{self.name}':",
-                    "",
-                ]
-                for var_name, var_value in defaults.items():
-                    detail_lines.append(
-                        f"  [green]{var_name}[/green] = [yellow]{var_value}[/yellow]"
-                    )
-
-                self.display.display_warning("Warning: This will clear ALL defaults")
-                console.print()
-                for line in detail_lines:
-                    console.print(line)
-                console.print()
-                if not Confirm.ask("[bold red]Are you sure?[/bold red]", default=False):
-                    console.print("[green]Operation cancelled.[/green]")
-                    return
-
-            config.clear_defaults(self.name)
-            self.display.display_success(
-                f"Cleared all defaults for module '{self.name}'"
-            )
-
-    def config_list(self) -> None:
-        """Display the defaults for this specific module in YAML format.
-
-        Examples:
-            # Show the defaults for the current module
-            cli compose defaults list
-        """
-        from .config import ConfigManager
-        import yaml
-
-        config = ConfigManager()
-
-        # Get only the defaults for this module
-        defaults = config.get_defaults(self.name)
-
-        if not defaults:
-            console.print(
-                f"[yellow]No configuration found for module '{self.name}'[/yellow]"
-            )
-            console.print(
-                f"\n[dim]Config file location: {config.get_config_path()}[/dim]"
-            )
-            return
-
-        # Create a minimal config structure with only this module's defaults
-        module_config = {"defaults": {self.name: defaults}}
-
-        # Convert config to YAML string
-        yaml_output = yaml.dump(
-            module_config, default_flow_style=False, sort_keys=False
-        )
-
-        console.print(
-            f"[bold]Configuration for module:[/bold] [cyan]{self.name}[/cyan]"
-        )
-        console.print(f"[dim]Config file: {config.get_config_path()}[/dim]\n")
-        console.print(
-            Panel(
-                yaml_output,
-                title=f"{self.name.capitalize()} Config",
-                border_style="blue",
-            )
-        )
-
-    def validate(
-        self,
-        template_id: str = Argument(
-            None, help="Template ID to validate (if omitted, validates all templates)"
-        ),
-        path: Optional[str] = Option(
-            None,
-            "--path",
-            "-p",
-            help="Validate a template from a specific directory path",
-        ),
-        verbose: bool = Option(
-            False, "--verbose", "-v", help="Show detailed validation information"
-        ),
-        semantic: bool = Option(
-            True,
-            "--semantic/--no-semantic",
-            help="Enable semantic validation (Docker Compose schema, etc.)",
-        ),
-    ) -> None:
-        """Validate templates for Jinja2 syntax, undefined variables, and semantic correctness.
-
-        Validation includes:
-        - Jinja2 syntax checking
-        - Variable definition checking
-        - Semantic validation (when --semantic is enabled):
-          - Docker Compose file structure
-          - YAML syntax
-          - Configuration best practices
-
-        Examples:
-            # Validate all templates in this module
-            cli compose validate
-
-            # Validate a specific template
-            cli compose validate gitlab
-
-            # Validate a template from a specific path
-            cli compose validate --path /path/to/template
-
-            # Validate with verbose output
-            cli compose validate --verbose
-
-            # Skip semantic validation (only Jinja2)
-            cli compose validate --no-semantic
-        """
-        from .validators import get_validator_registry
-
-        # Validate from path takes precedence
-        if path:
-            try:
-                template_path = Path(path).resolve()
-                if not template_path.exists():
-                    self.display.display_error(f"Path does not exist: {path}")
-                    raise Exit(code=1)
-                if not template_path.is_dir():
-                    self.display.display_error(f"Path is not a directory: {path}")
-                    raise Exit(code=1)
-
-                console.print(
-                    f"[bold]Validating template from path:[/bold] [cyan]{template_path}[/cyan]\n"
-                )
-                template = Template(template_path, library_name="local")
-                template_id = template.id
-            except Exception as e:
-                self.display.display_error(
-                    f"Failed to load template from path '{path}': {e}"
-                )
-                raise Exit(code=1)
-        elif template_id:
-            # Validate a specific template by ID
-            try:
-                template = self._load_template_by_id(template_id)
-                console.print(
-                    f"[bold]Validating template:[/bold] [cyan]{template_id}[/cyan]\n"
-                )
-            except Exception as e:
-                self.display.display_error(
-                    f"Failed to load template '{template_id}': {e}"
-                )
-                raise Exit(code=1)
-        else:
-            # Validate all templates - handled separately below
-            template = None
-
-        # Single template validation
-        if template:
-            try:
-                # Trigger validation by accessing used_variables
-                _ = template.used_variables
-                # Trigger variable definition validation by accessing variables
-                _ = template.variables
-                self.display.display_success("Jinja2 validation passed")
-
-                # Semantic validation
-                if semantic:
-                    console.print(
-                        "\n[bold cyan]Running semantic validation...[/bold cyan]"
-                    )
-                    registry = get_validator_registry()
-                    has_semantic_errors = False
-
-                    # Render template with default values for validation
-                    debug_mode = logger.isEnabledFor(logging.DEBUG)
-                    rendered_files, _ = template.render(
-                        template.variables, debug=debug_mode
-                    )
-
-                    for file_path, content in rendered_files.items():
-                        result = registry.validate_file(content, file_path)
-
-                        if (
-                            result.errors
-                            or result.warnings
-                            or (verbose and result.info)
-                        ):
-                            console.print(f"\n[cyan]File:[/cyan] {file_path}")
-                            result.display(f"{file_path}")
-
-                            if result.errors:
-                                has_semantic_errors = True
-
-                    if not has_semantic_errors:
-                        self.display.display_success("Semantic validation passed")
-                    else:
-                        self.display.display_error("Semantic validation found errors")
-                        raise Exit(code=1)
-
-                if verbose:
-                    console.print(
-                        f"\n[dim]Template path: {template.template_dir}[/dim]"
-                    )
-                    console.print(
-                        f"[dim]Found {len(template.used_variables)} variables[/dim]"
-                    )
-                    if semantic:
-                        console.print(
-                            f"[dim]Generated {len(rendered_files)} files[/dim]"
-                        )
-
-            except TemplateRenderError as e:
-                # Display enhanced error information for template rendering errors
-                self.display.display_template_render_error(
-                    e, context=f"template '{template_id}'"
-                )
-                raise Exit(code=1)
-            except (TemplateSyntaxError, TemplateValidationError, ValueError) as e:
-                self.display.display_error(f"Validation failed for '{template_id}':")
-                console.print(f"\n{e}")
-                raise Exit(code=1)
-            except Exception as e:
-                self.display.display_error(
-                    f"Unexpected error validating '{template_id}': {e}"
-                )
-                raise Exit(code=1)
-
-            return
-        else:
-            # Validate all templates
-            console.print(f"[bold]Validating all {self.name} templates...[/bold]\n")
-
-            entries = self.libraries.find(self.name, sort_results=True)
-            total = len(entries)
-            valid_count = 0
-            invalid_count = 0
-            errors = []
-
-            for template_dir, library_name in entries:
-                template_id = template_dir.name
-                try:
-                    template = Template(template_dir, library_name=library_name)
-                    # Trigger validation
-                    _ = template.used_variables
-                    _ = template.variables
-                    valid_count += 1
-                    if verbose:
-                        self.display.display_success(template_id)
-                except ValueError as e:
-                    invalid_count += 1
-                    errors.append((template_id, str(e)))
-                    if verbose:
-                        self.display.display_error(template_id)
-                except Exception as e:
-                    invalid_count += 1
-                    errors.append((template_id, f"Load error: {e}"))
-                    if verbose:
-                        self.display.display_warning(template_id)
-
-            # Summary
-            summary_items = {
-                "Total templates:": str(total),
-                "[green]Valid:[/green]": str(valid_count),
-                "[red]Invalid:[/red]": str(invalid_count),
-            }
-            self.display.display_summary_table("Validation Summary", summary_items)
-
-            # Show errors if any
-            if errors:
-                console.print("\n[bold red]Validation Errors:[/bold red]")
-                for template_id, error_msg in errors:
-                    console.print(
-                        f"\n[yellow]Template:[/yellow] [cyan]{template_id}[/cyan]"
-                    )
-                    console.print(f"[dim]{error_msg}[/dim]")
-                raise Exit(code=1)
-            else:
-                self.display.display_success("All templates are valid!")
-
-    @classmethod
-    def register_cli(cls, app: Typer) -> None:
-        """Register module commands with the main app."""
-        logger.debug(f"Registering CLI commands for module '{cls.name}'")
-
-        module_instance = cls()
-
-        module_app = Typer(help=cls.description)
-
-        module_app.command("list")(module_instance.list)
-        module_app.command("search")(module_instance.search)
-        module_app.command("show")(module_instance.show)
-        module_app.command("validate")(module_instance.validate)
-
-        module_app.command(
-            "generate",
-            context_settings={"allow_extra_args": True, "ignore_unknown_options": True},
-        )(module_instance.generate)
-
-        # Add defaults commands (simplified - only manage default values)
-        defaults_app = Typer(help="Manage default values for template variables")
-        defaults_app.command("get", help="Get default value(s)")(
-            module_instance.config_get
-        )
-        defaults_app.command("set", help="Set a default value")(
-            module_instance.config_set
-        )
-        defaults_app.command("rm", help="Remove a specific default value")(
-            module_instance.config_remove
-        )
-        defaults_app.command("clear", help="Clear default value(s)")(
-            module_instance.config_clear
-        )
-        defaults_app.command(
-            "list", help="Display the config for this module in YAML format"
-        )(module_instance.config_list)
-        module_app.add_typer(defaults_app, name="defaults")
-
-        app.add_typer(module_app, name=cls.name, help=cls.description)
-        logger.info(f"Module '{cls.name}' CLI commands registered")
-
-    def _load_template_by_id(self, id: str) -> Template:
-        """Load a template by its ID, supporting qualified IDs.
-
-        Supports both formats:
-        - Simple: "alloy" (uses priority system)
-        - Qualified: "alloy.default" (loads from specific library)
-
-        Args:
-            id: Template ID (simple or qualified)
-
-        Returns:
-            Template instance
-
-        Raises:
-            FileNotFoundError: If template is not found
-        """
-        logger.debug(f"Loading template with ID '{id}' from module '{self.name}'")
-
-        # find_by_id now handles both simple and qualified IDs
-        result = self.libraries.find_by_id(self.name, id)
-
-        if not result:
-            raise FileNotFoundError(
-                f"Template '{id}' not found in module '{self.name}'"
-            )
-
-        template_dir, library_name = result
-
-        # Get library type
-        library = next(
-            (lib for lib in self.libraries.libraries if lib.name == library_name), None
-        )
-        library_type = library.library_type if library else "git"
-
-        try:
-            template = Template(
-                template_dir, library_name=library_name, library_type=library_type
-            )
-
-            # Validate schema version compatibility
-            template._validate_schema_version(self.schema_version, self.name)
-
-            # If the original ID was qualified, preserve it
-            if "." in id:
-                template.id = id
-
-            return template
-        except Exception as exc:
-            logger.error(f"Failed to load template '{id}': {exc}")
-            raise FileNotFoundError(
-                f"Template '{id}' could not be loaded: {exc}"
-            ) from exc
-
-    def _display_template_details(self, template: Template, id: str) -> None:
-        """Display template information panel and variables table.
-
-        Args:
-            template: Template instance to display
-            id: Template ID
-        """
-        self.display.display_template_details(template, id)
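The four-level precedence chain documented in the `generate` docstring (module spec, then template spec, then config defaults, then `--var` overrides) amounts to a last-writer-wins merge. A minimal illustrative sketch; `resolve_variables` and the sample dicts are hypothetical, not the CLI's actual internals:

```python
def resolve_variables(module_spec, template_spec, config_defaults, cli_overrides):
    """Merge variable sources; later (higher-priority) sources win."""
    resolved = {}
    for source in (module_spec, template_spec, config_defaults, cli_overrides):
        resolved.update(source)
    return resolved

# module spec < template spec < config defaults < --var flags
print(resolve_variables(
    {"service_name": "app", "port": "80"},  # 1. module spec
    {"port": "8080"},                       # 2. template spec
    {"service_name": "my-proxy"},           # 3. config defaults
    {"port": "9090"},                       # 4. CLI --var overrides
))
# → {'service_name': 'my-proxy', 'port': '9090'}
```

Each source only overrides keys it actually sets, so a config default for `service_name` survives a `--var port=9090` override untouched.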

cli/core/module/__init__.py  (+9, −0)

@@ -0,0 +1,9 @@
+"""Module package for template management.
+
+This package provides the base Module class and related functionality for managing
+template modules in the boilerplates CLI.
+"""
+
+from .base_module import Module
+
+__all__ = ["Module"]

cli/core/module/base_commands.py  (+692, −0)

@@ -0,0 +1,692 @@
+"""Base commands for module: list, search, show, validate, generate."""
+
+from __future__ import annotations
+
+import logging
+from dataclasses import dataclass
+from pathlib import Path
+
+from jinja2 import Template as Jinja2Template
+from typer import Exit
+
+from ..config import ConfigManager
+from ..display import DisplayManager, IconManager
+from ..exceptions import (
+    TemplateRenderError,
+    TemplateSyntaxError,
+    TemplateValidationError,
+)
+from ..input import InputManager
+from ..template import Template
+from ..validators import get_validator_registry
+from .helpers import (
+    apply_cli_overrides,
+    apply_var_file,
+    apply_variable_defaults,
+    collect_variable_values,
+)
+
+logger = logging.getLogger(__name__)
+
+# File size thresholds for display formatting
+BYTES_PER_KB = 1024
+BYTES_PER_MB = 1024 * 1024
+
+
+@dataclass
+class GenerationConfig:
+    """Configuration for template generation."""
+
+    id: str
+    directory: str | None = None
+    output: str | None = None
+    interactive: bool = True
+    var: list[str] | None = None
+    var_file: str | None = None
+    dry_run: bool = False
+    show_files: bool = False
+    quiet: bool = False
+
+
+@dataclass
+class ConfirmationContext:
+    """Context for file generation confirmation."""
+
+    output_dir: Path
+    rendered_files: dict[str, str]
+    existing_files: list[Path] | None
+    dir_not_empty: bool
+    dry_run: bool
+    interactive: bool
+    display: DisplayManager
+
+
+def list_templates(module_instance, raw: bool = False) -> list:
+    """List all templates."""
+    logger.debug(f"Listing templates for module '{module_instance.name}'")
+
+    # Load all templates using centralized helper
+    filtered_templates = module_instance._load_all_templates()
+
+    if filtered_templates:
+        if raw:
+            # Output raw format (tab-separated values for easy filtering with awk/sed/cut)
+            # Format: ID\tNAME\tTAGS\tVERSION\tLIBRARY
+            for template in filtered_templates:
+                name = template.metadata.name or "-"
+                tags_list = template.metadata.tags or []
+                tags = ",".join(tags_list) if tags_list else "-"
+                version = str(template.metadata.version) if template.metadata.version else "-"
+                library = template.metadata.library or "-"
+                print(f"{template.id}\t{name}\t{tags}\t{version}\t{library}")
+        else:
+            # Output rich table format
+            def format_template_row(template):
+                name = template.metadata.name or "Unnamed Template"
+                tags_list = template.metadata.tags or []
+                tags = ", ".join(tags_list) if tags_list else "-"
+                version = str(template.metadata.version) if template.metadata.version else ""
+                schema = template.schema_version if hasattr(template, "schema_version") else "1.0"
+                library_name = template.metadata.library or ""
+                library_type = template.metadata.library_type or "git"
+                # Format library with icon and color
+                icon = IconManager.UI_LIBRARY_STATIC if library_type == "static" else IconManager.UI_LIBRARY_GIT
+                color = "yellow" if library_type == "static" else "blue"
+                library_display = f"[{color}]{icon} {library_name}[/{color}]"
+                return (template.id, name, tags, version, schema, library_display)
+
+            module_instance.display.data_table(
+                columns=[
+                    {"name": "ID", "style": "bold", "no_wrap": True},
+                    {"name": "Name"},
+                    {"name": "Tags"},
+                    {"name": "Version", "no_wrap": True},
+                    {"name": "Schema", "no_wrap": True},
+                    {"name": "Library", "no_wrap": True},
+                ],
+                rows=filtered_templates,
+                row_formatter=format_template_row,
+            )
+    else:
+        logger.info(f"No templates found for module '{module_instance.name}'")
+        module_instance.display.info(
+            f"No templates found for module '{module_instance.name}'",
+            context="Use 'bp repo update' to update libraries or check library configuration",
+        )
+
+    return filtered_templates
+
+
+def search_templates(module_instance, query: str) -> list:
+    """Search for templates by ID containing the search string."""
+    logger.debug(f"Searching templates for module '{module_instance.name}' with query='{query}'")
+
+    # Load templates with search filter using centralized helper
+    filtered_templates = module_instance._load_all_templates(lambda t: query.lower() in t.id.lower())
+
+    if filtered_templates:
+        logger.info(f"Found {len(filtered_templates)} templates matching '{query}' for module '{module_instance.name}'")
+
+        def format_template_row(template):
+            name = template.metadata.name or "Unnamed Template"
+            tags_list = template.metadata.tags or []
+            tags = ", ".join(tags_list) if tags_list else "-"
+            version = str(template.metadata.version) if template.metadata.version else ""
+            schema = template.schema_version if hasattr(template, "schema_version") else "1.0"
+            library_name = template.metadata.library or ""
+            library_type = template.metadata.library_type or "git"
+            # Format library with icon and color
+            icon = IconManager.UI_LIBRARY_STATIC if library_type == "static" else IconManager.UI_LIBRARY_GIT
+            color = "yellow" if library_type == "static" else "blue"
+            library_display = f"[{color}]{icon} {library_name}[/{color}]"
+            return (template.id, name, tags, version, schema, library_display)
+
+        module_instance.display.data_table(
+            columns=[
+                {"name": "ID", "style": "bold", "no_wrap": True},
+                {"name": "Name"},
+                {"name": "Tags"},
+                {"name": "Version", "no_wrap": True},
+                {"name": "Schema", "no_wrap": True},
+                {"name": "Library", "no_wrap": True},
+            ],
+            rows=filtered_templates,
+            row_formatter=format_template_row,
+        )
+    else:
+        logger.info(f"No templates found matching '{query}' for module '{module_instance.name}'")
+        module_instance.display.warning(
+            f"No templates found matching '{query}'",
+            context=f"module '{module_instance.name}'",
+        )
+
+    return filtered_templates
+
+
+def show_template(module_instance, id: str, var: list[str] | None = None, var_file: str | None = None) -> None:
+    """Show template details with optional variable overrides."""
+    logger.debug(f"Showing template '{id}' from module '{module_instance.name}'")
+    template = module_instance._load_template_by_id(id)
+
+    if not template:
+        module_instance.display.error(f"Template '{id}' not found", context=f"module '{module_instance.name}'")
+        return
+
+    # Apply defaults and overrides (same precedence as generate command)
+    if template.variables:
+        config = ConfigManager()
+        apply_variable_defaults(template, config, module_instance.name)
+        apply_var_file(template, var_file, module_instance.display)
+        apply_cli_overrides(template, var)
+
+        # Re-sort sections after applying overrides (toggle values may have changed)
+        template.variables.sort_sections()
+
+        # Reset disabled bool variables to False to prevent confusion
+        reset_vars = template.variables.reset_disabled_bool_variables()
+        if reset_vars:
+            logger.debug(f"Reset {len(reset_vars)} disabled bool variables to False")
+
+    # Display template header
+    module_instance.display.templates.render_template_header(template, id)
+    # Display file tree
+    module_instance.display.templates.render_file_tree(template)
+    # Display variables table
+    module_instance.display.variables.render_variables_table(template)
+
+
+def check_output_directory(
+    output_dir: Path,
+    rendered_files: dict[str, str],
+    interactive: bool,
+    display: DisplayManager,
+) -> list[Path] | None:
+    """Check output directory for conflicts and get user confirmation if needed."""
+    dir_exists = output_dir.exists()
+    dir_not_empty = dir_exists and any(output_dir.iterdir())
+
+    # Check which files already exist
+    existing_files = []
+    if dir_exists:
+        for file_path in rendered_files:
+            full_path = output_dir / file_path
+            if full_path.exists():
+                existing_files.append(full_path)
+
+    # Warn if directory is not empty
+    if dir_not_empty:
+        if interactive:
+            display.text("")  # Add newline before warning
+            # Combine directory warning and file count on same line
+            warning_msg = f"Directory '{output_dir}' is not empty."
+            if existing_files:
+                warning_msg += f" {len(existing_files)} file(s) will be overwritten."
+            display.warning(warning_msg)
+            display.text("")  # Add newline after warning
+
+            input_mgr = InputManager()
+            if not input_mgr.confirm("Continue?", default=False):
+                display.info("Generation cancelled")
+                return None
+        else:
+            # Non-interactive mode: show warning but continue
+            logger.warning(f"Directory '{output_dir}' is not empty")
+            if existing_files:
+                logger.warning(f"{len(existing_files)} file(s) will be overwritten")
+
+    return existing_files
+
+
+def get_generation_confirmation(_ctx: ConfirmationContext) -> bool:
+    """Display file generation confirmation and get user approval."""
+    # No confirmation needed - either non-interactive, dry-run, or already confirmed during directory check
+    return True
+
+
+def _collect_subdirectories(rendered_files: dict[str, str]) -> set[Path]:
+    """Collect unique subdirectories from file paths."""
+    subdirs = set()
+    for file_path in rendered_files:
+        parts = Path(file_path).parts
+        for i in range(1, len(parts)):
+            subdirs.add(Path(*parts[:i]))
+    return subdirs
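A standalone sketch of what the helper above computes — every proper prefix of each rendered file path becomes a directory that must exist before writing (the mirror below is for illustration; the real helper is the `_collect_subdirectories` in this diff):

```python
from pathlib import Path


def collect_subdirectories(rendered_files: dict[str, str]) -> set[Path]:
    """Collect unique parent directories implied by the rendered file paths."""
    subdirs = set()
    for file_path in rendered_files:
        parts = Path(file_path).parts
        # range(1, len(parts)) yields every proper prefix of the path,
        # i.e. each ancestor directory but not the file itself.
        for i in range(1, len(parts)):
            subdirs.add(Path(*parts[:i]))
    return subdirs


files = {"config/app.yaml": "", "config/env/dev.env": "", "README.md": ""}
# "config/env/dev.env" contributes both "config" and "config/env";
# top-level files like "README.md" contribute nothing.
print(sorted(str(p) for p in collect_subdirectories(files)))
```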
+
+
+def _analyze_file_operations(
+    output_dir: Path, rendered_files: dict[str, str]
+) -> tuple[list[tuple[str, int, str]], int, int, int]:
+    """Analyze file operations and return statistics."""
+    total_size = 0
+    new_files = 0
+    overwrite_files = 0
+    file_operations = []
+
+    for file_path, content in sorted(rendered_files.items()):
+        full_path = output_dir / file_path
+        file_size = len(content.encode("utf-8"))
+        total_size += file_size
+
+        if full_path.exists():
+            status = "Overwrite"
+            overwrite_files += 1
+        else:
+            status = "Create"
+            new_files += 1
+
+        file_operations.append((file_path, file_size, status))
+
+    return file_operations, total_size, new_files, overwrite_files
+
+
+def _format_size(total_size: int) -> str:
+    """Format byte size into human-readable string."""
+    if total_size < BYTES_PER_KB:
+        return f"{total_size}B"
+    if total_size < BYTES_PER_MB:
+        return f"{total_size / BYTES_PER_KB:.1f}KB"
+    return f"{total_size / BYTES_PER_MB:.1f}MB"
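The size formatter thresholds on two constants that are defined elsewhere in the module; assuming the conventional values of 1024 and 1024², the behavior can be sketched and checked in isolation:

```python
# Assumed values — the real constants live outside this diff hunk.
BYTES_PER_KB = 1024
BYTES_PER_MB = 1024 * 1024


def format_size(total_size: int) -> str:
    """Format a byte count as B, KB, or MB with one decimal place."""
    if total_size < BYTES_PER_KB:
        return f"{total_size}B"
    if total_size < BYTES_PER_MB:
        return f"{total_size / BYTES_PER_KB:.1f}KB"
    return f"{total_size / BYTES_PER_MB:.1f}MB"


print(format_size(512))              # sub-KB sizes stay in bytes
print(format_size(2048))             # 2048 / 1024 -> "2.0KB"
print(format_size(3 * 1024 * 1024))  # "3.0MB"
```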
+
+
+def execute_dry_run(
+    id: str,
+    output_dir: Path,
+    rendered_files: dict[str, str],
+    show_files: bool,
+    display: DisplayManager,
+) -> tuple[int, int, str]:
+    """Execute dry run mode - preview files without writing.
+
+    Returns:
+        Tuple of (total_files, overwrite_files, size_str) for summary display
+    """
+    _file_operations, total_size, _new_files, overwrite_files = _analyze_file_operations(output_dir, rendered_files)
+    size_str = _format_size(total_size)
+
+    # Show file contents if requested
+    if show_files:
+        display.text("")
+        display.heading("File Contents")
+        for file_path, content in sorted(rendered_files.items()):
+            display.text(f"\n[cyan]{file_path}[/cyan]")
+            display.text(f"{'─' * 80}")
+            display.text(content)
+        display.text("")
+
+    logger.info(f"Dry run completed for template '{id}' - {len(rendered_files)} files, {total_size} bytes")
+    return len(rendered_files), overwrite_files, size_str
+
+
+def write_rendered_files(
+    output_dir: Path,
+    rendered_files: dict[str, str],
+    _quiet: bool,
+    _display: DisplayManager,
+) -> None:
+    """Write rendered files to the output directory."""
+    output_dir.mkdir(parents=True, exist_ok=True)
+
+    for file_path, content in rendered_files.items():
+        full_path = output_dir / file_path
+        full_path.parent.mkdir(parents=True, exist_ok=True)
+        with full_path.open("w", encoding="utf-8") as f:
+            f.write(content)
+
+    logger.info(f"Template written to directory: {output_dir}")
+
+
+def _prepare_template(
+    module_instance,
+    id: str,
+    var_file: str | None,
+    var: list[str] | None,
+    display: DisplayManager,
+):
+    """Load template and apply all defaults/overrides."""
+    template = module_instance._load_template_by_id(id)
+    config = ConfigManager()
+    apply_variable_defaults(template, config, module_instance.name)
+    apply_var_file(template, var_file, display)
+    apply_cli_overrides(template, var)
+
+    if template.variables:
+        template.variables.sort_sections()
+        reset_vars = template.variables.reset_disabled_bool_variables()
+        if reset_vars:
+            logger.debug(f"Reset {len(reset_vars)} disabled bool variables to False")
+
+    return template
+
+
+def _render_template(template, id: str, display: DisplayManager, interactive: bool):
+    """Validate, render template and collect variable values."""
+    variable_values = collect_variable_values(template, interactive)
+
+    if template.variables:
+        template.variables.validate_all()
+
+    debug_mode = logger.isEnabledFor(logging.DEBUG)
+    rendered_files, variable_values = template.render(template.variables, debug=debug_mode)
+
+    if not rendered_files:
+        display.error(
+            "Template rendering returned no files",
+            context="template generation",
+        )
+        raise Exit(code=1)
+
+    logger.info(f"Successfully rendered template '{id}'")
+    return rendered_files, variable_values
+
+
+def _determine_output_dir(directory: str | None, output: str | None, id: str) -> tuple[Path, bool]:
+    """Determine and normalize output directory path.
+
+    Returns:
+        Tuple of (output_dir, used_deprecated_arg) where used_deprecated_arg indicates
+        if the deprecated positional directory argument was used.
+    """
+    used_deprecated_arg = False
+
+    # Priority: --output flag > positional directory argument > template ID
+    if output:
+        output_dir = Path(output)
+    elif directory:
+        output_dir = Path(directory)
+        used_deprecated_arg = True
+        logger.debug(f"Using deprecated positional directory argument: {directory}")
+    else:
+        output_dir = Path(id)
+
+    # Normalize paths that look like absolute paths but are relative
+    if not output_dir.is_absolute() and str(output_dir).startswith(("Users/", "home/", "usr/", "opt/", "var/", "tmp/")):
+        output_dir = Path("/") / output_dir
+        logger.debug(f"Normalized relative-looking absolute path to: {output_dir}")
+
+    return output_dir, used_deprecated_arg
+
+
+def _display_template_error(display: DisplayManager, template_id: str, error: TemplateRenderError) -> None:
+    """Display template rendering error with clean formatting."""
+    display.text("")
+    display.text("─" * 80, style="dim")
+    display.text("")
+
+    # Build details if available
+    details = None
+    if error.file_path:
+        details = error.file_path
+        if error.line_number:
+            details += f":line {error.line_number}"
+
+    # Display error with details
+    display.error(f"Failed to generate boilerplate from template '{template_id}'", details=details)
+
+
+def _display_generic_error(display: DisplayManager, template_id: str, error: Exception) -> None:
+    """Display generic error with clean formatting."""
+    display.text("")
+    display.text("─" * 80, style="dim")
+    display.text("")
+
+    # Truncate long error messages
+    max_error_length = 100
+    error_msg = str(error)
+    if len(error_msg) > max_error_length:
+        error_msg = f"{error_msg[:max_error_length]}..."
+
+    # Display error with details
+    display.error(f"Failed to generate boilerplate from template '{template_id}'", details=error_msg)
+
+
+def generate_template(module_instance, config: GenerationConfig) -> None:  # noqa: PLR0912, PLR0915
+    """Generate from template."""
+    logger.info(f"Starting generation for template '{config.id}' from module '{module_instance.name}'")
+
+    display = DisplayManager(quiet=config.quiet) if config.quiet else module_instance.display
+    template = _prepare_template(module_instance, config.id, config.var_file, config.var, display)
+
+    # Determine output directory early to check for deprecated argument usage
+    output_dir, used_deprecated_arg = _determine_output_dir(config.directory, config.output, config.id)
+
+    if not config.quiet:
+        # Display template header
+        module_instance.display.templates.render_template_header(template, config.id)
+        # Display file tree
+        module_instance.display.templates.render_file_tree(template)
+        # Display variables table
+        module_instance.display.variables.render_variables_table(template)
+        module_instance.display.text("")
+
+        # Show deprecation warning BEFORE any user interaction
+        if used_deprecated_arg:
+            module_instance.display.warning(
+                "Using positional argument for output directory is deprecated and will be removed in v0.2.0",
+                details="Use --output/-o flag instead",
+            )
+            module_instance.display.text("")
+
+    try:
+        rendered_files, variable_values = _render_template(template, config.id, display, config.interactive)
+
+        # Check for conflicts and get confirmation (skip in quiet mode)
+        if not config.quiet:
+            existing_files = check_output_directory(output_dir, rendered_files, config.interactive, display)
+            if existing_files is None:
+                return  # User cancelled
+
+            dir_not_empty = output_dir.exists() and any(output_dir.iterdir())
+            ctx = ConfirmationContext(
+                output_dir=output_dir,
+                rendered_files=rendered_files,
+                existing_files=existing_files,
+                dir_not_empty=dir_not_empty,
+                dry_run=config.dry_run,
+                interactive=config.interactive,
+                display=display,
+            )
+            if not get_generation_confirmation(ctx):
+                return  # User cancelled
+
+        # Execute generation (dry run or actual)
+        dry_run_stats = None
+        if config.dry_run:
+            if not config.quiet:
+                dry_run_stats = execute_dry_run(config.id, output_dir, rendered_files, config.show_files, display)
+        else:
+            write_rendered_files(output_dir, rendered_files, config.quiet, display)
+
+        # Display next steps (not in quiet mode)
+        if template.metadata.next_steps and not config.quiet:
+            display.text("")
+            display.heading("Next Steps")
+            try:
+                next_steps_template = Jinja2Template(template.metadata.next_steps)
+                rendered_next_steps = next_steps_template.render(variable_values)
+                display.status.markdown(rendered_next_steps)
+            except Exception as e:
+                logger.warning(f"Failed to render next_steps as template: {e}")
+                # Fallback to plain text if rendering fails
+                display.status.markdown(template.metadata.next_steps)
+
+        # Display final status message at the end
+        if not config.quiet:
+            display.text("")
+            display.text("─" * 80, style="dim")
+
+            if config.dry_run and dry_run_stats:
+                total_files, overwrite_files, size_str = dry_run_stats
+                if overwrite_files > 0:
+                    display.warning(
+                        f"Dry run complete: {total_files} files ({size_str}) would be written to '{output_dir}' "
+                        f"({overwrite_files} would be overwritten)"
+                    )
+                else:
+                    display.success(
+                        f"Dry run complete: {total_files} files ({size_str}) would be written to '{output_dir}'"
+                    )
+            else:
+                # Actual generation completed
+                display.success(f"Boilerplate generated successfully in '{output_dir}'")
+
+    except TemplateRenderError as e:
+        _display_template_error(display, config.id, e)
+        raise Exit(code=1) from None
+    except Exception as e:
+        _display_generic_error(display, config.id, e)
+        raise Exit(code=1) from None
+
+
+def validate_templates(
+    module_instance,
+    template_id: str,
+    path: str | None,
+    verbose: bool,
+    semantic: bool,
+) -> None:
+    """Validate templates for Jinja2 syntax, undefined variables, and semantic correctness."""
+    # Load template based on input
+    template = _load_template_for_validation(module_instance, template_id, path)
+
+    if template:
+        _validate_single_template(module_instance, template, template_id, verbose, semantic)
+    else:
+        _validate_all_templates(module_instance, verbose)
+
+
+def _load_template_for_validation(module_instance, template_id: str, path: str | None):
+    """Load a template from path or ID for validation."""
+    if path:
+        template_path = Path(path).resolve()
+        if not template_path.exists():
+            module_instance.display.error(f"Path does not exist: {path}")
+            raise Exit(code=1) from None
+        if not template_path.is_dir():
+            module_instance.display.error(f"Path is not a directory: {path}")
+            raise Exit(code=1) from None
+
+        module_instance.display.info(f"[bold]Validating template from path:[/bold] [cyan]{template_path}[/cyan]")
+        try:
+            return Template(template_path, library_name="local")
+        except Exception as e:
+            module_instance.display.error(f"Failed to load template from path '{path}': {e}")
+            raise Exit(code=1) from None
+
+    if template_id:
+        try:
+            template = module_instance._load_template_by_id(template_id)
+            module_instance.display.info(f"Validating template: {template_id}")
+            return template
+        except Exception as e:
+            module_instance.display.error(f"Failed to load template '{template_id}': {e}")
+            raise Exit(code=1) from None
+
+    return None
+
+
+def _validate_single_template(module_instance, template, template_id: str, verbose: bool, semantic: bool) -> None:
+    """Validate a single template."""
+    try:
+        # Jinja2 validation
+        _ = template.used_variables
+        _ = template.variables
+        module_instance.display.success("Jinja2 validation passed")
+
+        # Semantic validation
+        if semantic:
+            _run_semantic_validation(module_instance, template, verbose)
+
+        # Verbose output
+        if verbose:
+            _display_validation_details(module_instance, template, semantic)
+
+    except TemplateRenderError as e:
+        module_instance.display.error(str(e), context=f"template '{template_id}'")
+        raise Exit(code=1) from None
+    except (TemplateSyntaxError, TemplateValidationError, ValueError) as e:
+        module_instance.display.error(f"Validation failed for '{template_id}':")
+        module_instance.display.info(f"\n{e}")
+        raise Exit(code=1) from None
+    except Exception as e:
+        module_instance.display.error(f"Unexpected error validating '{template_id}': {e}")
+        raise Exit(code=1) from None
+
+
+def _run_semantic_validation(module_instance, template, verbose: bool) -> None:
+    """Run semantic validation on rendered template files."""
+    module_instance.display.info("")
+    module_instance.display.info("Running semantic validation...")
+
+    registry = get_validator_registry()
+    debug_mode = logger.isEnabledFor(logging.DEBUG)
+    rendered_files, _ = template.render(template.variables, debug=debug_mode)
+
+    has_semantic_errors = False
+    for file_path, content in rendered_files.items():
+        result = registry.validate_file(content, file_path)
+
+        if result.errors or result.warnings or (verbose and result.info):
+            module_instance.display.info(f"\nFile: {file_path}")
+            result.display(file_path)
+
+            if result.errors:
+                has_semantic_errors = True
+
+    if has_semantic_errors:
+        module_instance.display.error("Semantic validation found errors")
+        raise Exit(code=1) from None
+
+    module_instance.display.success("Semantic validation passed")
+
+
+def _display_validation_details(module_instance, template, semantic: bool) -> None:
+    """Display verbose validation details."""
+    module_instance.display.info(f"\nTemplate path: {template.template_dir}")
+    module_instance.display.info(f"Found {len(template.used_variables)} variables")
+    if semantic:
+        debug_mode = logger.isEnabledFor(logging.DEBUG)
+        rendered_files, _ = template.render(template.variables, debug=debug_mode)
+        module_instance.display.info(f"Generated {len(rendered_files)} files")
+
+
+def _validate_all_templates(module_instance, verbose: bool) -> None:
+    """Validate all templates in the module."""
+    module_instance.display.info(f"Validating all {module_instance.name} templates...")
+
+    valid_count = 0
+    invalid_count = 0
+    errors = []
+
+    all_templates = module_instance._load_all_templates()
+    total = len(all_templates)
+
+    for template in all_templates:
+        try:
+            _ = template.used_variables
+            _ = template.variables
+            valid_count += 1
+            if verbose:
+                module_instance.display.success(template.id)
+        except ValueError as e:
+            invalid_count += 1
+            errors.append((template.id, str(e)))
+            if verbose:
+                module_instance.display.error(template.id)
+        except Exception as e:
+            invalid_count += 1
+            errors.append((template.id, f"Load error: {e}"))
+            if verbose:
+                module_instance.display.warning(template.id)
+
+    # Display summary
+    module_instance.display.info("")
+    module_instance.display.info(f"Total templates: {total}")
+    module_instance.display.info(f"Valid: {valid_count}")
+    module_instance.display.info(f"Invalid: {invalid_count}")
+
+    if errors:
+        module_instance.display.info("")
+        for template_id, error_msg in errors:
+            module_instance.display.error(f"{template_id}: {error_msg}")
+        raise Exit(code=1)
+
+    if total > 0:
+        module_instance.display.info("")
+        module_instance.display.success("All templates are valid")

+ 345 - 0
cli/core/module/base_module.py

@@ -0,0 +1,345 @@
+"""Base module class for template management."""
+
+from __future__ import annotations
+
+import logging
+from abc import ABC
+from typing import Annotated
+
+from typer import Argument, Option, Typer
+
+from ..display import DisplayManager
+from ..library import LibraryManager
+from ..template import Template
+from .base_commands import (
+    GenerationConfig,
+    generate_template,
+    list_templates,
+    search_templates,
+    show_template,
+    validate_templates,
+)
+from .config_commands import (
+    config_clear,
+    config_get,
+    config_list,
+    config_remove,
+    config_set,
+)
+
+logger = logging.getLogger(__name__)
+
+# Minimum length of a library entry tuple: (path, library_name); a third
+# element, needs_qualification, is optional
+LIBRARY_ENTRY_MIN_LENGTH = 2
+
+
+class Module(ABC):
+    """Streamlined base module that auto-detects variables from templates.
+
+    Subclasses must define:
+    - name: str (class attribute)
+    - description: str (class attribute)
+    """
+
+    # Class attributes that must be defined by subclasses
+    name: str
+    description: str
+
+    # Schema version supported by this module (override in subclasses)
+    schema_version: str = "1.0"
+
+    def __init__(self) -> None:
+        # Validate required class attributes
+        if not hasattr(self.__class__, "name") or not hasattr(self.__class__, "description"):
+            raise TypeError(f"Module {self.__class__.__name__} must define 'name' and 'description' class attributes")
+
+        logger.info(f"Initializing module '{self.name}'")
+        logger.debug(f"Module '{self.name}' configuration: description='{self.description}'")
+        self.libraries = LibraryManager()
+        self.display = DisplayManager()
+
+    def _load_all_templates(self, filter_fn=None) -> list:
+        """Load all templates for this module with optional filtering."""
+        templates = []
+        entries = self.libraries.find(self.name, sort_results=True)
+
+        for entry in entries:
+            # Unpack entry - returns (path, library_name, needs_qualification)
+            template_dir = entry[0]
+            library_name = entry[1]
+            needs_qualification = entry[2] if len(entry) > LIBRARY_ENTRY_MIN_LENGTH else False
+
+            try:
+                # Get library object to determine type
+                library = next(
+                    (lib for lib in self.libraries.libraries if lib.name == library_name),
+                    None,
+                )
+                library_type = library.library_type if library else "git"
+
+                template = Template(template_dir, library_name=library_name, library_type=library_type)
+
+                # Validate schema version compatibility
+                template._validate_schema_version(self.schema_version, self.name)
+
+                # If template ID needs qualification, set qualified ID
+                if needs_qualification:
+                    template.set_qualified_id()
+
+                # Apply filter if provided
+                if filter_fn is None or filter_fn(template):
+                    templates.append(template)
+
+            except Exception as exc:
+                logger.error(f"Failed to load template from {template_dir}: {exc}")
+                continue
+
+        return templates
+
+    def _load_template_by_id(self, id: str):
+        """Load a template by its ID, supporting qualified IDs."""
+        logger.debug(f"Loading template with ID '{id}' from module '{self.name}'")
+
+        # find_by_id now handles both simple and qualified IDs
+        result = self.libraries.find_by_id(self.name, id)
+
+        if not result:
+            raise FileNotFoundError(f"Template '{id}' not found in module '{self.name}'")
+
+        template_dir, library_name = result
+
+        # Get library type
+        library = next((lib for lib in self.libraries.libraries if lib.name == library_name), None)
+        library_type = library.library_type if library else "git"
+
+        try:
+            template = Template(template_dir, library_name=library_name, library_type=library_type)
+
+            # Validate schema version compatibility
+            template._validate_schema_version(self.schema_version, self.name)
+
+            # If the original ID was qualified, preserve it
+            if "." in id:
+                template.id = id
+
+            return template
+        except Exception as exc:
+            logger.error(f"Failed to load template '{id}': {exc}")
+            raise FileNotFoundError(f"Template '{id}' could not be loaded: {exc}") from exc
+
+    def list(
+        self,
+        raw: Annotated[bool, Option("--raw", help="Output raw list format instead of rich table")] = False,
+    ) -> list:
+        """List all templates."""
+        return list_templates(self, raw)
+
+    def search(
+        self,
+        query: Annotated[str, Argument(help="Search string to filter templates by ID")],
+    ) -> list:
+        """Search for templates by ID containing the search string."""
+        return search_templates(self, query)
+
+    def show(
+        self,
+        id: str,
+        var: Annotated[
+            list[str] | None,
+            Option(
+                "--var",
+                "-v",
+                help="Variable override (repeatable). Supports: KEY=VALUE or KEY VALUE",
+            ),
+        ] = None,
+        var_file: Annotated[
+            str | None,
+            Option(
+                "--var-file",
+                "-f",
+                help="Load variables from YAML file (overrides config defaults)",
+            ),
+        ] = None,
+    ) -> None:
+        """Show template details with optional variable overrides."""
+        return show_template(self, id, var, var_file)
+
+    def generate(
+        self,
+        id: Annotated[str, Argument(help="Template ID")],
+        directory: Annotated[
+            str | None, Argument(help="[DEPRECATED: use --output] Output directory (defaults to template ID)")
+        ] = None,
+        *,
+        output: Annotated[
+            str | None,
+            Option(
+                "--output",
+                "-o",
+                help="Output directory (defaults to template ID)",
+            ),
+        ] = None,
+        interactive: Annotated[
+            bool,
+            Option(
+                "--interactive/--no-interactive",
+                "-i/-n",
+                help="Enable interactive prompting for variables",
+            ),
+        ] = True,
+        var: Annotated[
+            list[str] | None,
+            Option(
+                "--var",
+                "-v",
+                help="Variable override (repeatable). Supports: KEY=VALUE or KEY VALUE",
+            ),
+        ] = None,
+        var_file: Annotated[
+            str | None,
+            Option(
+                "--var-file",
+                "-f",
+                help="Load variables from YAML file (overrides config defaults, overridden by --var)",
+            ),
+        ] = None,
+        dry_run: Annotated[
+            bool,
+            Option("--dry-run", help="Preview template generation without writing files"),
+        ] = False,
+        show_files: Annotated[
+            bool,
+            Option(
+                "--show-files",
+                help="Display generated file contents in plain text (use with --dry-run)",
+            ),
+        ] = False,
+        quiet: Annotated[bool, Option("--quiet", "-q", help="Suppress all non-error output")] = False,
+    ) -> None:
+        """Generate from template.
+
+        Variable precedence chain (lowest to highest):
+        1. Module spec (defined in cli/modules/*.py)
+        2. Template spec (from template.yaml)
+        3. Config defaults (from ~/.config/boilerplates/config.yaml)
+        4. Variable file (from --var-file)
+        5. CLI overrides (--var flags)
+        """
+        config = GenerationConfig(
+            id=id,
+            directory=directory,
+            output=output,
+            interactive=interactive,
+            var=var,
+            var_file=var_file,
+            dry_run=dry_run,
+            show_files=show_files,
+            quiet=quiet,
+        )
+        return generate_template(self, config)
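The five-layer precedence chain documented in `generate`'s docstring behaves like a left-to-right dict merge, with later layers winning. A minimal illustration — the variable names and values here are hypothetical, not taken from any shipped template:

```python
# Layers listed lowest-precedence first, matching the documented order.
module_spec     = {"network_mode": "bridge", "port": "8080"}
template_spec   = {"port": "80"}           # template.yaml overrides the module spec
config_defaults = {"timezone": "UTC"}       # ~/.config/boilerplates/config.yaml
var_file        = {"timezone": "Europe/Berlin"}  # --var-file
cli_overrides   = {"port": "8081"}          # --var flags win over everything

resolved = {**module_spec, **template_spec, **config_defaults, **var_file, **cli_overrides}
print(resolved)
# port comes from the CLI, timezone from the var file, network_mode from the module spec.
```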
+
+    def validate(
+        self,
+        template_id: Annotated[
+            str | None,
+            Argument(help="Template ID to validate (omit to validate all templates)"),
+        ] = None,
+        *,
+        path: Annotated[
+            str | None,
+            Option("--path", help="Path to template directory for validation"),
+        ] = None,
+        verbose: Annotated[bool, Option("--verbose", "-v", help="Show detailed validation information")] = False,
+        semantic: Annotated[
+            bool,
+            Option(
+                "--semantic/--no-semantic",
+                help="Enable semantic validation (Docker Compose schema, etc.)",
+            ),
+        ] = True,
+    ) -> None:
+        """Validate templates for Jinja2 syntax, undefined variables, and semantic correctness.
+
+        Examples:
+            # Validate specific template
+            cli compose validate netbox
+
+            # Validate all templates
+            cli compose validate
+
+            # Validate with verbose output
+            cli compose validate netbox --verbose
+        """
+        return validate_templates(self, template_id, path, verbose, semantic)
+
+    def config_get(
+        self,
+        var_name: str | None = None,
+    ) -> None:
+        """Get default value(s) for this module."""
+        return config_get(self, var_name)
+
+    def config_set(
+        self,
+        var_name: str,
+        value: str | None = None,
+    ) -> None:
+        """Set a default value for a variable."""
+        return config_set(self, var_name, value)
+
+    def config_remove(
+        self,
+        var_name: Annotated[str, Argument(help="Variable name to remove")],
+    ) -> None:
+        """Remove a specific default variable value."""
+        return config_remove(self, var_name)
+
+    def config_clear(
+        self,
+        var_name: str | None = None,
+        force: bool = False,
+    ) -> None:
+        """Clear default value(s) for this module."""
+        return config_clear(self, var_name, force)
+
+    def config_list(self) -> None:
+        """Display the defaults for this specific module in YAML format."""
+        return config_list(self)
+
+    @classmethod
+    def register_cli(cls, app: Typer) -> None:
+        """Register module commands with the main app."""
+        logger.debug(f"Registering CLI commands for module '{cls.name}'")
+
+        module_instance = cls()
+
+        module_app = Typer(help=cls.description)
+
+        module_app.command("list")(module_instance.list)
+        module_app.command("search")(module_instance.search)
+        module_app.command("show")(module_instance.show)
+        module_app.command("validate")(module_instance.validate)
+
+        module_app.command(
+            "generate",
+            context_settings={"allow_extra_args": True, "ignore_unknown_options": True},
+        )(module_instance.generate)
+
+        # Add defaults commands (simplified - only manage default values)
+        defaults_app = Typer(help="Manage default values for template variables")
+        defaults_app.command("get", help="Get default value(s)")(module_instance.config_get)
+        defaults_app.command("set", help="Set a default value")(module_instance.config_set)
+        defaults_app.command("rm", help="Remove a specific default value")(module_instance.config_remove)
+        defaults_app.command("clear", help="Clear default value(s)")(module_instance.config_clear)
+        defaults_app.command("list", help="Display defaults for this module as a table")(
+            module_instance.config_list
+        )
+        module_app.add_typer(defaults_app, name="defaults")
+
+        app.add_typer(
+            module_app,
+            name=cls.name,
+            help=cls.description,
+            rich_help_panel="Template Commands",
+        )
+        logger.info(f"Module '{cls.name}' CLI commands registered")

+ 141 - 0
cli/core/module/config_commands.py

@@ -0,0 +1,141 @@
+"""Config/defaults management commands for module."""
+
+from __future__ import annotations
+
+import logging
+
+from typer import Exit
+
+from cli.core.config import ConfigManager
+from cli.core.input import InputManager
+
+logger = logging.getLogger(__name__)
+
+
+def config_get(module_instance, var_name: str | None = None) -> None:
+    """Get default value(s) for this module."""
+    config = ConfigManager()
+
+    if var_name:
+        # Get specific variable default
+        value = config.get_default_value(module_instance.name, var_name)
+        if value is not None:
+            module_instance.display.info(f"[green]{var_name}[/green] = [yellow]{value}[/yellow]")
+        else:
+            module_instance.display.warning(
+                f"No default set for variable '{var_name}'",
+                context=f"module '{module_instance.name}'",
+            )
+    else:
+        # Show all defaults (flat list)
+        defaults = config.get_defaults(module_instance.name)
+        if defaults:
+            module_instance.display.info(f"[bold]Config defaults for module '{module_instance.name}':[/bold]")
+            for config_var_name, var_value in defaults.items():
+                module_instance.display.info(f"  [green]{config_var_name}[/green] = [yellow]{var_value}[/yellow]")
+        else:
+            module_instance.display.warning(f"No defaults configured for module '{module_instance.name}'")
+
+
+def config_set(module_instance, var_name: str, value: str | None = None) -> None:
+    """Set a default value for a variable."""
+    config = ConfigManager()
+
+    # Parse var_name and value - support both "var value" and "var=value" formats
+    if "=" in var_name and value is None:
+        # Format: var_name=value
+        parts = var_name.split("=", 1)
+        actual_var_name = parts[0]
+        actual_value = parts[1]
+    elif value is not None:
+        # Format: var_name value
+        actual_var_name = var_name
+        actual_value = value
+    else:
+        module_instance.display.error(f"Missing value for variable '{var_name}'", context="config set")
+        module_instance.display.info("[dim]Usage: defaults set VAR_NAME VALUE or defaults set VAR_NAME=VALUE[/dim]")
+        raise Exit(code=1)
+
+    # Set the default value
+    config.set_default_value(module_instance.name, actual_var_name, actual_value)
+    module_instance.display.success(f"Set default: [cyan]{actual_var_name}[/cyan] = [yellow]{actual_value}[/yellow]")
+    module_instance.display.info(
+        "[dim]This will be used as the default value when generating templates with this module.[/dim]"
+    )
+
+
+def config_remove(module_instance, var_name: str) -> None:
+    """Remove a specific default variable value."""
+    config = ConfigManager()
+    defaults = config.get_defaults(module_instance.name)
+
+    if not defaults:
+        module_instance.display.warning(f"No defaults configured for module '{module_instance.name}'")
+        return
+
+    if var_name in defaults:
+        del defaults[var_name]
+        config.set_defaults(module_instance.name, defaults)
+        module_instance.display.success(f"Removed default for '{var_name}'")
+    else:
+        module_instance.display.error(f"No default found for variable '{var_name}'")
+
+
+def config_clear(module_instance, var_name: str | None = None, force: bool = False) -> None:
+    """Clear default value(s) for this module."""
+    config = ConfigManager()
+    defaults = config.get_defaults(module_instance.name)
+
+    if not defaults:
+        module_instance.display.warning(f"No defaults configured for module '{module_instance.name}'")
+        return
+
+    if var_name:
+        # Clear specific variable
+        if var_name in defaults:
+            del defaults[var_name]
+            config.set_defaults(module_instance.name, defaults)
+            module_instance.display.success(f"Cleared default for '{var_name}'")
+        else:
+            module_instance.display.error(f"No default found for variable '{var_name}'")
+    else:
+        # Clear all defaults
+        if not force:
+            detail_lines = [
+                f"This will clear ALL defaults for module '{module_instance.name}':",
+                "",
+            ]
+            for clear_var_name, var_value in defaults.items():
+                detail_lines.append(f"  [green]{clear_var_name}[/green] = [yellow]{var_value}[/yellow]")
+
+            module_instance.display.warning("Warning: This will clear ALL defaults")
+            module_instance.display.info("")
+            for line in detail_lines:
+                module_instance.display.info(line)
+            module_instance.display.info("")
+            input_mgr = InputManager()
+            if not input_mgr.confirm("Are you sure?", default=False):
+                module_instance.display.info("[green]Operation cancelled.[/green]")
+                return
+
+        config.clear_defaults(module_instance.name)
+        module_instance.display.success(f"Cleared all defaults for module '{module_instance.name}'")
+
+
+def config_list(module_instance) -> None:
+    """Display the defaults for this specific module as a table."""
+    config = ConfigManager()
+
+    # Get only the defaults for this module
+    defaults = config.get_defaults(module_instance.name)
+
+    if not defaults:
+        module_instance.display.warning(f"No defaults configured for module '{module_instance.name}'")
+        return
+
+    # Display defaults using DisplayManager
+    module_instance.display.heading(f"Defaults for module '{module_instance.name}':")
+
+    # Convert defaults to display format (rows for table)
+    rows = [(f"{var_name}:", str(var_value)) for var_name, var_value in defaults.items()]
+    module_instance.display.table(headers=None, rows=rows, title="", show_header=False, borderless=True)

+ 236 - 0
cli/core/module/helpers.py

@@ -0,0 +1,236 @@
+"""Helper methods for module variable application and template generation."""
+
+from __future__ import annotations
+
+import logging
+from pathlib import Path
+from typing import Any
+
+import click
+import yaml
+from typer import Exit
+
+from ..display import DisplayManager
+from ..input import PromptHandler
+
+logger = logging.getLogger(__name__)
+
+
+def parse_var_inputs(var_options: list[str], extra_args: list[str]) -> dict[str, Any]:
+    """Parse variable inputs from --var options and extra args with type conversion.
+
+    Supports formats:
+      --var KEY=VALUE
+      --var KEY VALUE
+
+    Values are automatically converted to appropriate types:
+      - 'true', 'yes', '1' → True
+      - 'false', 'no', '0' → False
+      - Numeric strings → int or float
+      - Everything else → string
+
+    Args:
+      var_options: List of variable options from CLI
+      extra_args: Additional arguments that may contain values
+
+    Returns:
+      Dictionary of parsed variables with converted types
+    """
+    variables = {}
+
+    # Parse --var KEY=VALUE format
+    for var_option in var_options:
+        if "=" in var_option:
+            key, value = var_option.split("=", 1)
+            variables[key] = _convert_string_to_type(value)
+        # --var KEY VALUE format - value should be in extra_args
+        elif extra_args:
+            value = extra_args.pop(0)
+            variables[var_option] = _convert_string_to_type(value)
+        else:
+            logger.warning(f"No value provided for variable '{var_option}'")
+
+    return variables
+
+
+def _convert_string_to_type(value: str) -> Any:
+    """Convert string value to appropriate Python type.
+
+    Args:
+        value: String value to convert
+
+    Returns:
+        Converted value (bool, int, float, or str)
+    """
+    # Boolean conversion
+    if value.lower() in ("true", "yes", "1"):
+        return True
+    if value.lower() in ("false", "no", "0"):
+        return False
+
+    # Integer conversion
+    try:
+        return int(value)
+    except ValueError:
+        pass
+
+    # Float conversion
+    try:
+        return float(value)
+    except ValueError:
+        pass
+
+    # Return as string
+    return value
+
+
+def load_var_file(var_file_path: str) -> dict:
+    """Load variables from a YAML file.
+
+    Args:
+        var_file_path: Path to the YAML file containing variables
+
+    Returns:
+        Dictionary of variable names to values (flat structure)
+
+    Raises:
+        FileNotFoundError: If the var file doesn't exist
+        ValueError: If the file is not valid YAML or has invalid structure
+    """
+    var_path = Path(var_file_path).expanduser().resolve()
+
+    if not var_path.exists():
+        raise FileNotFoundError(f"Variable file not found: {var_file_path}")
+
+    if not var_path.is_file():
+        raise ValueError(f"Variable file path is not a file: {var_file_path}")
+
+    try:
+        with var_path.open(encoding="utf-8") as f:
+            content = yaml.safe_load(f)
+    except yaml.YAMLError as e:
+        raise ValueError(f"Invalid YAML in variable file: {e}") from e
+    except OSError as e:
+        raise ValueError(f"Error reading variable file: {e}") from e
+
+    if not isinstance(content, dict):
+        raise ValueError(f"Variable file must contain a YAML dictionary, got {type(content).__name__}")
+
+    logger.info(f"Loaded {len(content)} variables from file: {var_path.name}")
+    logger.debug(f"Variables from file: {', '.join(content.keys())}")
+
+    return content
+
+
+def apply_variable_defaults(template, config_manager, module_name: str) -> None:
+    """Apply config defaults to template variables.
+
+    Args:
+        template: Template instance with variables to configure
+        config_manager: ConfigManager instance
+        module_name: Name of the module
+    """
+    if not template.variables:
+        return
+
+    config_defaults = config_manager.get_defaults(module_name)
+
+    if config_defaults:
+        logger.info(f"Loading config defaults for module '{module_name}'")
+        successful = template.variables.apply_defaults(config_defaults, "config")
+        if successful:
+            logger.debug(f"Applied config defaults for: {', '.join(successful)}")
+
+
+def apply_var_file(template, var_file_path: str | None, display: DisplayManager) -> None:
+    """Apply variables from a YAML file to template.
+
+    Args:
+        template: Template instance to apply variables to
+        var_file_path: Path to the YAML file containing variables
+        display: DisplayManager for error messages
+
+    Raises:
+        Exit: If the file cannot be loaded or contains invalid data
+    """
+    if not var_file_path or not template.variables:
+        return
+
+    try:
+        var_file_vars = load_var_file(var_file_path)
+        if var_file_vars:
+            # Get list of valid variable names from template
+            valid_vars = set()
+            for section in template.variables.get_sections().values():
+                valid_vars.update(section.variables.keys())
+
+            # Warn about unknown variables
+            unknown_vars = set(var_file_vars.keys()) - valid_vars
+            if unknown_vars:
+                for var_name in sorted(unknown_vars):
+                    logger.warning(f"Variable '{var_name}' from var-file does not exist in template '{template.id}'")
+
+            successful = template.variables.apply_defaults(var_file_vars, "var-file")
+            if successful:
+                logger.debug(f"Applied var-file overrides for: {', '.join(successful)}")
+    except (FileNotFoundError, ValueError) as e:
+        display.error(
+            f"Failed to load variable file: {e}",
+            context="variable file loading",
+        )
+        raise Exit(code=1) from e
+
+
+def apply_cli_overrides(template, var: list[str] | None, ctx=None) -> None:
+    """Apply CLI variable overrides to template.
+
+    Args:
+        template: Template instance to apply overrides to
+        var: List of variable override strings from --var flags
+        ctx: Context object containing extra args (optional, will get current context if None)
+    """
+    if not template.variables:
+        return
+
+    # Get context if not provided (compatible with all Typer versions)
+    if ctx is None:
+        try:
+            ctx = click.get_current_context()
+        except RuntimeError:
+            ctx = None
+
+    extra_args = list(ctx.args) if ctx and hasattr(ctx, "args") else []
+    cli_overrides = parse_var_inputs(var or [], extra_args)
+
+    if cli_overrides:
+        logger.info(f"Received {len(cli_overrides)} variable overrides from CLI")
+        successful_overrides = template.variables.apply_defaults(cli_overrides, "cli")
+        if successful_overrides:
+            logger.debug(f"Applied CLI overrides for: {', '.join(successful_overrides)}")
+
+
+def collect_variable_values(template, interactive: bool) -> dict[str, Any]:
+    """Collect variable values from user prompts and template defaults.
+
+    Args:
+        template: Template instance with variables
+        interactive: Whether to prompt user for values interactively
+
+    Returns:
+        Dictionary of variable names to values
+    """
+    variable_values = {}
+
+    # Collect values interactively if enabled
+    if interactive and template.variables:
+        prompt_handler = PromptHandler()
+        collected_values = prompt_handler.collect_variables(template.variables)
+        if collected_values:
+            variable_values.update(collected_values)
+            logger.info(f"Collected {len(collected_values)} variable values from user input")
+
+    # Add satisfied variable values (respects dependencies and toggles)
+    if template.variables:
+        variable_values.update(template.variables.get_satisfied_values())
+
+    return variable_values
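The `--var` parsing and type coercion above have one subtle consequence worth noting: the boolean branch in `_convert_string_to_type` runs before the numeric branches, so the literal strings `'1'` and `'0'` become `True`/`False` rather than integers. A dependency-free sketch of both helpers (the names `convert` and `parse_vars` are illustrative):

```python
def convert(value: str):
    """Mirror of _convert_string_to_type: boolean words win before numeric parsing."""
    if value.lower() in ("true", "yes", "1"):
        return True
    if value.lower() in ("false", "no", "0"):
        return False
    try:
        return int(value)
    except ValueError:
        pass
    try:
        return float(value)
    except ValueError:
        pass
    return value  # fall back to the raw string


def parse_vars(var_options: list[str], extra_args: list[str]) -> dict:
    """Mirror of parse_var_inputs: supports --var KEY=VALUE and --var KEY VALUE."""
    variables = {}
    for opt in var_options:
        if "=" in opt:
            key, raw = opt.split("=", 1)
            variables[key] = convert(raw)
        elif extra_args:
            # KEY VALUE form: the value arrives via the extra-args list
            variables[opt] = convert(extra_args.pop(0))
    return variables


print(parse_vars(["port=8080", "debug=yes", "mode"], ["host"]))
# {'port': 8080, 'debug': True, 'mode': 'host'}
```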

+ 104 - 116
cli/core/prompt.py

@@ -1,13 +1,14 @@
 from __future__ import annotations
 
-from typing import Dict, Any, Callable
 import logging
+from typing import Any, Callable
+
 from rich.console import Console
-from rich.prompt import Prompt, Confirm, IntPrompt
+from rich.prompt import Confirm, IntPrompt, Prompt
 
+from .collection import VariableCollection
 from .display import DisplayManager
 from .variable import Variable
-from .collection import VariableCollection
 
 logger = logging.getLogger(__name__)
 
@@ -29,13 +30,12 @@ class PromptHandler:
             Dict of variable names to collected values
         """
         if not Confirm.ask("Customize any settings?", default=False):
+            self.console.print("")  # Add blank line after prompt
             logger.info("User opted to keep all default values")
             return {}
+        self.console.print("")  # Add blank line after prompt
 
-        collected: Dict[str, Any] = {}
-        prompted_variables: set[str] = (
-            set()
-        )  # Track which variables we've already prompted for
+        collected: dict[str, Any] = {}
 
         # Process each section
         for section_key, section in variables.get_sections().items():
@@ -43,114 +43,115 @@ class PromptHandler:
                 continue
 
             # Check if dependencies are satisfied
-            if not variables.is_section_satisfied(section_key):
-                # Get list of unsatisfied dependencies for better user feedback
-                unsatisfied_keys = [
-                    dep
-                    for dep in section.needs
-                    if not variables.is_section_satisfied(dep)
-                ]
-                # Convert section keys to titles for user-friendly display
-                unsatisfied_titles = []
-                for dep_key in unsatisfied_keys:
-                    dep_section = variables.get_section(dep_key)
-                    if dep_section:
-                        unsatisfied_titles.append(dep_section.title)
-                    else:
-                        unsatisfied_titles.append(dep_key)
-                dep_names = (
-                    ", ".join(unsatisfied_titles) if unsatisfied_titles else "unknown"
-                )
-                self.display.display_skipped(
-                    section.title, f"requires {dep_names} to be enabled"
-                )
-                logger.debug(
-                    f"Skipping section '{section_key}' - dependencies not satisfied: {dep_names}"
-                )
+            if not self._check_section_dependencies(variables, section_key, section):
                 continue
 
             # Always show section header first
             self.display.display_section_header(section.title, section.description)
 
-            # Track whether this section will be enabled
-            section_will_be_enabled = True
-
-            # Handle section toggle - skip for required sections
-            if section.required:
-                # Required sections are always processed, no toggle prompt needed
-                logger.debug(
-                    f"Processing required section '{section.key}' without toggle prompt"
-                )
-            elif section.toggle:
-                toggle_var = section.variables.get(section.toggle)
-                if toggle_var:
-                    # Prompt for toggle variable using standard variable prompting logic
-                    # This ensures consistent handling of description, extra text, validation hints, etc.
-                    current_value = toggle_var.convert(toggle_var.value)
-                    new_value = self._prompt_variable(
-                        toggle_var, required=section.required
-                    )
-
-                    if new_value != current_value:
-                        collected[toggle_var.name] = new_value
-                        toggle_var.value = new_value
-
-                    # Use section's native is_enabled() method
-                    if not section.is_enabled():
-                        section_will_be_enabled = False
+            # Handle section toggle and determine if enabled
+            section_will_be_enabled = self._handle_section_toggle(section, collected)
 
             # Collect variables in this section
-            for var_name, variable in section.variables.items():
-                # Skip toggle variable (already handled)
-                if section.toggle and var_name == section.toggle:
-                    continue
-
-                # Skip variables with unsatisfied needs (similar to display logic)
-                if not variables.is_variable_satisfied(var_name):
-                    logger.debug(
-                        f"Skipping variable '{var_name}' - needs not satisfied"
-                    )
-                    continue
-
-                # Skip all variables if section is disabled
-                if not section_will_be_enabled:
-                    logger.debug(
-                        f"Skipping variable '{var_name}' from disabled section '{section_key}'"
-                    )
-                    continue
-
-                # Prompt for the variable
-                current_value = variable.convert(variable.value)
-                # Pass section.required so _prompt_variable can enforce required inputs
-                new_value = self._prompt_variable(variable, required=section.required)
-
-                # Track that we've prompted for this variable
-                prompted_variables.add(var_name)
-
-                # For autogenerated variables, always update even if None (signals autogeneration)
-                if variable.autogenerated and new_value is None:
-                    collected[var_name] = None
-                    variable.value = None
-                elif new_value != current_value:
-                    collected[var_name] = new_value
-                    variable.value = new_value
+            self._collect_section_variables(section, section_key, section_will_be_enabled, variables, collected)
 
         logger.info(f"Variable collection completed. Collected {len(collected)} values")
         return collected
 
-    def _prompt_variable(self, variable: Variable, required: bool = False) -> Any:
+    def _check_section_dependencies(self, variables: VariableCollection, section_key: str, section) -> bool:
+        """Check if section dependencies are satisfied and display skip message if not."""
+        if not variables.is_section_satisfied(section_key):
+            # Get list of unsatisfied dependencies for better user feedback
+            unsatisfied_keys = [dep for dep in section.needs if not variables.is_section_satisfied(dep)]
+            # Convert section keys to titles for user-friendly display
+            unsatisfied_titles = []
+            for dep_key in unsatisfied_keys:
+                dep_section = variables.get_section(dep_key)
+                unsatisfied_titles.append(dep_section.title if dep_section else dep_key)
+
+            dep_names = ", ".join(unsatisfied_titles) if unsatisfied_titles else "unknown"
+            self.display.display_skipped(section.title, f"requires {dep_names} to be enabled")
+            logger.debug(f"Skipping section '{section_key}' - dependencies not satisfied: {dep_names}")
+            return False
+        return True
+
+    def _handle_section_toggle(self, section, collected: dict[str, Any]) -> bool:
+        """Handle section toggle prompt and return whether section will be enabled."""
+        # Required sections are always enabled
+        if section.required:
+            logger.debug(f"Processing required section '{section.key}' without toggle prompt")
+            return True
+
+        # Handle optional sections with toggle
+        if not section.toggle:
+            return True
+
+        toggle_var = section.variables.get(section.toggle)
+        if not toggle_var:
+            return True
+
+        # Prompt for toggle variable
+        current_value = toggle_var.convert(toggle_var.value)
+        new_value = self._prompt_variable(toggle_var, section.required)
+
+        if new_value != current_value:
+            collected[toggle_var.name] = new_value
+            toggle_var.value = new_value
+
+        # Return whether section is enabled
+        return section.is_enabled()
+
+    def _collect_section_variables(
+        self,
+        section,
+        section_key: str,
+        section_enabled: bool,
+        variables: VariableCollection,
+        collected: dict[str, Any],
+    ) -> None:
+        """Collect values for all variables in a section."""
+        for var_name, variable in section.variables.items():
+            # Skip toggle variable (already handled)
+            if section.toggle and var_name == section.toggle:
+                continue
+
+            # Skip variables with unsatisfied needs
+            if not variables.is_variable_satisfied(var_name):
+                logger.debug(f"Skipping variable '{var_name}' - needs not satisfied")
+                continue
+
+            # Skip all variables if section is disabled
+            if not section_enabled:
+                logger.debug(f"Skipping variable '{var_name}' from disabled section '{section_key}'")
+                continue
+
+            # Prompt for the variable and update if changed
+            self._prompt_and_update_variable(variable, collected)
+
+    def _prompt_and_update_variable(self, variable: Variable, collected: dict[str, Any]) -> None:
+        """Prompt for a variable and update collected values if changed."""
+        current_value = variable.convert(variable.value)
+        new_value = self._prompt_variable(variable)
+
+        # For autogenerated variables, always update even if None (signals autogeneration)
+        if variable.autogenerated and new_value is None:
+            collected[variable.name] = None
+            variable.value = None
+        elif new_value != current_value:
+            collected[variable.name] = new_value
+            variable.value = new_value
+
+    def _prompt_variable(self, variable: Variable, _required: bool = False) -> Any:
         """Prompt for a single variable value based on its type.
 
         Args:
             variable: The variable to prompt for
-            required: Whether the containing section is required (for context/display)
+            _required: Whether the containing section is required (unused, kept for API compatibility)
 
         Returns:
             The validated value entered by the user
         """
-        logger.debug(
-            f"Prompting for variable '{variable.name}' (type: {variable.type})"
-        )
+        logger.debug(f"Prompting for variable '{variable.name}' (type: {variable.type})")
 
         # Use variable's native methods for prompt text and default value
         prompt_text = variable.get_prompt_text()
@@ -184,18 +185,13 @@ class PromptHandler:
                 # - Type conversion
                 # - Autogenerated variable detection
                 # - Required field validation
-                converted = variable.validate_and_convert(raw, check_required=True)
-
-                # Return the converted value (caller will update variable.value)
-                return converted
+                return variable.validate_and_convert(raw, check_required=True)
             except ValueError as exc:
                 # Conversion/validation failed — show a consistent error message and retry
                 self._show_validation_error(str(exc))
             except Exception as e:
                 # Unexpected error — log and retry using the stored (unconverted) value
-                logger.error(
-                    f"Error prompting for variable '{variable.name}': {str(e)}"
-                )
+                logger.error(f"Error prompting for variable '{variable.name}': {e!s}")
                 default_value = variable.value
                 handler = self._get_prompt_handler(variable)
 
@@ -215,18 +213,14 @@ class PromptHandler:
         }
         return handlers.get(
             variable.type,
-            lambda text, default: self._prompt_string(
-                text, default, is_sensitive=variable.sensitive
-            ),
+            lambda text, default: self._prompt_string(text, default, is_sensitive=variable.sensitive),
         )
 
     def _show_validation_error(self, message: str) -> None:
         """Display validation feedback consistently."""
         self.display.display_validation_error(message)
 
-    def _prompt_string(
-        self, prompt_text: str, default: Any = None, is_sensitive: bool = False
-    ) -> str | None:
+    def _prompt_string(self, prompt_text: str, default: Any = None, is_sensitive: bool = False) -> str | None:
         value = Prompt.ask(
             prompt_text,
             default=str(default) if default is not None else "",
@@ -239,11 +233,7 @@ class PromptHandler:
     def _prompt_bool(self, prompt_text: str, default: Any = None) -> bool | None:
         if default is None:
             return Confirm.ask(prompt_text, default=None)
-        converted = (
-            default
-            if isinstance(default, bool)
-            else str(default).lower() in ("true", "1", "yes", "on")
-        )
+        converted = default if isinstance(default, bool) else str(default).lower() in ("true", "1", "yes", "on")
         return Confirm.ask(prompt_text, default=converted)
 
     def _prompt_int(self, prompt_text: str, default: Any = None) -> int | None:
@@ -260,7 +250,7 @@ class PromptHandler:
         prompt_text: str,
         options: list[str],
         default: Any = None,
-        extra: str | None = None,
+        _extra: str | None = None,
     ) -> str:
         """Prompt for enum selection with validation.
 
@@ -282,6 +272,4 @@ class PromptHandler:
             )
             if value in options:
                 return value
-            self.console.print(
-                f"[red]Invalid choice. Select from: {', '.join(options)}[/red]"
-            )
+            self.console.print(f"[red]Invalid choice. Select from: {', '.join(options)}[/red]")
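Per the release notes, `needs` expressions now support semicolon-separated AND conditions with comma-separated allowed values (e.g. `needs: 'traefik_enabled=true;network_mode=bridge,macvlan'`). A minimal sketch of how such an expression could be evaluated — `check_needs` is a hypothetical helper, not the `VariableCollection` API:

```python
def check_needs(needs: str, values: dict) -> bool:
    """Evaluate a needs expression: ';' joins AND conditions, ',' lists allowed values."""
    for cond in needs.split(";"):
        var, _, allowed = cond.partition("=")
        allowed_values = [v.strip().lower() for v in allowed.split(",")]
        # Compare case-insensitively so True matches 'true'; missing vars never match
        if str(values.get(var.strip())).lower() not in allowed_values:
            return False
    return True


needs = "traefik_enabled=true;network_mode=bridge,macvlan"
print(check_needs(needs, {"traefik_enabled": True, "network_mode": "macvlan"}))  # True
print(check_needs(needs, {"traefik_enabled": True, "network_mode": "host"}))     # False
```

This matches the behavior described above: a macvlan-only variable is skipped (not prompted or displayed) when `network_mode=bridge`, because its condition evaluates to False.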

+ 7 - 13
cli/core/registry.py

@@ -3,7 +3,7 @@
 from __future__ import annotations
 
 import logging
-from typing import Iterator, Type
+from collections.abc import Iterator
 
 logger = logging.getLogger(__name__)
 
@@ -15,25 +15,19 @@ class ModuleRegistry:
         self._modules = {}
         logger.debug("Initializing module registry")
 
-    def register(self, module_class: Type) -> None:
+    def register(self, module_class: type) -> None:
         """Register a module class."""
         # Module class defines its own name attribute
         logger.debug(f"Attempting to register module class '{module_class.name}'")
 
         if module_class.name in self._modules:
-            logger.warning(
-                f"Module '{module_class.name}' already registered, replacing with new implementation"
-            )
+            logger.warning(f"Module '{module_class.name}' already registered, replacing with new implementation")
 
         self._modules[module_class.name] = module_class
-        logger.info(
-            f"Registered module '{module_class.name}' (total modules: {len(self._modules)})"
-        )
-        logger.debug(
-            f"Module '{module_class.name}' details: description='{module_class.description}'"
-        )
-
-    def iter_module_classes(self) -> Iterator[tuple[str, Type]]:
+        logger.info(f"Registered module '{module_class.name}' (total modules: {len(self._modules)})")
+        logger.debug(f"Module '{module_class.name}' details: description='{module_class.description}'")
+
+    def iter_module_classes(self) -> Iterator[tuple[str, type]]:
         """Yield registered module classes without instantiating them."""
         logger.debug(f"Iterating over {len(self._modules)} registered module classes")
         for name in sorted(self._modules.keys()):
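The registry semantics above (replace-on-collision, name-sorted iteration without instantiating modules) can be sketched dependency-free; logging is omitted and the class names here are illustrative:

```python
class ModuleRegistrySketch:
    """Minimal sketch of ModuleRegistry's behavior."""

    def __init__(self):
        self._modules = {}

    def register(self, module_class) -> None:
        # A later registration with the same name silently replaces the earlier one
        self._modules[module_class.name] = module_class

    def iter_module_classes(self):
        # Deterministic, name-sorted iteration; classes are never instantiated here
        for name in sorted(self._modules):
            yield name, self._modules[name]


class Pihole:
    name = "pihole"


class Traefik:
    name = "traefik"


registry = ModuleRegistrySketch()
registry.register(Traefik)
registry.register(Pihole)
print([name for name, _ in registry.iter_module_classes()])  # ['pihole', 'traefik']
```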

+ 249 - 281
cli/core/repo.py

@@ -3,30 +3,25 @@
 from __future__ import annotations
 
 import logging
+import shutil
 import subprocess
 from pathlib import Path
-from typing import Optional
 
-from rich.console import Console
-from rich.progress import Progress, SpinnerColumn, TextColumn
+from rich.progress import SpinnerColumn, TextColumn
 from rich.table import Table
 from typer import Argument, Option, Typer
 
-from ..core.config import ConfigManager
-from ..core.display import DisplayManager
+from ..core.config import ConfigManager, LibraryConfig
+from ..core.display import DisplayManager, IconManager
 from ..core.exceptions import ConfigError
 
 logger = logging.getLogger(__name__)
-console = Console()
-console_err = Console(stderr=True)
 display = DisplayManager()
 
 app = Typer(help="Manage library repositories")
 
 
-def _run_git_command(
-    args: list[str], cwd: Optional[Path] = None
-) -> tuple[bool, str, str]:
+def _run_git_command(args: list[str], cwd: Path | None = None) -> tuple[bool, str, str]:
     """Run a git command and return the result.
 
     Args:
@@ -38,7 +33,8 @@ def _run_git_command(
     """
     try:
         result = subprocess.run(
-            ["git"] + args,
+            ["git", *args],
+            check=False,
             cwd=cwd,
             capture_output=True,
             text=True,
@@ -57,8 +53,8 @@ def _clone_or_pull_repo(
     name: str,
     url: str,
     target_path: Path,
-    branch: Optional[str] = None,
-    sparse_dir: Optional[str] = None,
+    branch: str | None = None,
+    sparse_dir: str | None = None,
 ) -> tuple[bool, str]:
     """Clone or pull a git repository with optional sparse-checkout.
 
@@ -73,124 +69,147 @@ def _clone_or_pull_repo(
         Tuple of (success, message)
     """
     if target_path.exists() and (target_path / ".git").exists():
-        # Repository exists, pull updates
-        logger.debug(f"Pulling updates for library '{name}' at {target_path}")
+        return _pull_repo_updates(name, target_path, branch)
+    return _clone_new_repo(name, url, target_path, branch, sparse_dir)
 
-        # Determine which branch to pull
-        pull_branch = branch if branch else "main"
 
-        # Pull updates from specific branch
-        success, stdout, stderr = _run_git_command(
-            ["pull", "--ff-only", "origin", pull_branch], cwd=target_path
-        )
+def _pull_repo_updates(name: str, target_path: Path, branch: str | None) -> tuple[bool, str]:
+    """Pull updates for an existing repository."""
+    logger.debug(f"Pulling updates for library '{name}' at {target_path}")
+
+    pull_branch = branch if branch else "main"
+    success, stdout, stderr = _run_git_command(["pull", "--ff-only", "origin", pull_branch], cwd=target_path)
+
+    if not success:
+        error_msg = stderr or stdout
+        logger.error(f"Failed to pull library '{name}': {error_msg}")
+        return False, f"Pull failed: {error_msg}"
+
+    if "Already up to date" in stdout or "Already up-to-date" in stdout:
+        return True, "Already up to date"
+    return True, "Updated successfully"
 
+
+def _clone_new_repo(
+    name: str, url: str, target_path: Path, branch: str | None, sparse_dir: str | None
+) -> tuple[bool, str]:
+    """Clone a new repository, optionally with sparse-checkout."""
+    logger.debug(f"Cloning library '{name}' from {url} to {target_path}")
+    target_path.parent.mkdir(parents=True, exist_ok=True)
+
+    use_sparse = sparse_dir and sparse_dir != "."
+
+    if use_sparse:
+        return _clone_sparse_repo(url, target_path, branch, sparse_dir)
+    return _clone_full_repo(name, url, target_path, branch)
+
+
+def _clone_sparse_repo(url: str, target_path: Path, branch: str | None, sparse_dir: str) -> tuple[bool, str]:
+    """Clone repository with sparse-checkout."""
+    logger.debug(f"Using sparse-checkout for directory: {sparse_dir}")
+    target_path.mkdir(parents=True, exist_ok=True)
+
+    # Define git operations to perform sequentially
+    operations = [
+        (["init"], "Failed to initialize repo"),
+        (["remote", "add", "origin", url], "Failed to add remote"),
+        (["sparse-checkout", "init", "--no-cone"], "Failed to enable sparse-checkout"),
+        (
+            ["sparse-checkout", "set", f"{sparse_dir}/*"],
+            "Failed to set sparse-checkout directory",
+        ),
+    ]
+
+    # Execute initial operations
+    for cmd, error_msg in operations:
+        success, stdout, stderr = _run_git_command(cmd, cwd=target_path)
+        if not success:
+            return False, f"{error_msg}: {stderr or stdout}"
+
+    # Fetch and checkout
+    fetch_branch = branch if branch else "main"
+    success, stdout, stderr = _run_git_command(["fetch", "--depth", "1", "origin", fetch_branch], cwd=target_path)
+    if not success:
+        return False, f"Fetch failed: {stderr or stdout}"
+
+    success, stdout, stderr = _run_git_command(["checkout", fetch_branch], cwd=target_path)
+    result_success = success
+    result_msg = "Cloned successfully (sparse)" if success else f"Checkout failed: {stderr or stdout}"
+
+    return result_success, result_msg
+
+
+def _clone_full_repo(name: str, url: str, target_path: Path, branch: str | None) -> tuple[bool, str]:
+    """Clone full repository."""
+    clone_args = ["clone", "--depth", "1"]
+    if branch:
+        clone_args.extend(["--branch", branch])
+    clone_args.extend([url, str(target_path)])
+
+    success, stdout, stderr = _run_git_command(clone_args)
+
+    if success:
+        return True, "Cloned successfully"
+    error_msg = stderr or stdout
+    logger.error(f"Failed to clone library '{name}': {error_msg}")
+    return False, f"Clone failed: {error_msg}"
+
+
+def _process_library_update(lib: dict, libraries_path: Path, progress, verbose: bool) -> tuple[str, str, bool]:
+    """Process a single library update and return result."""
+    name = lib.get("name")
+    lib_type = lib.get("type", "git")
+    enabled = lib.get("enabled", True)
+
+    if not enabled:
+        if verbose:
+            display.text(f"Skipping disabled library: {name}", style="dim")
+        return (name, "Skipped (disabled)", False)
+
+    if lib_type == "static":
+        if verbose:
+            display.text(f"Skipping static library: {name} (no sync needed)", style="dim")
+        return (name, "N/A (static)", True)
+
+    # Handle git libraries
+    url = lib.get("url")
+    branch = lib.get("branch")
+    directory = lib.get("directory", "library")
+
+    task = progress.add_task(f"Updating {name}...", total=None)
+    target_path = libraries_path / name
+    success, message = _clone_or_pull_repo(name, url, target_path, branch, directory)
+    progress.remove_task(task)
+
+    if verbose:
         if success:
-            # Check if anything was updated
-            if "Already up to date" in stdout or "Already up-to-date" in stdout:
-                return True, "Already up to date"
-            else:
-                return True, "Updated successfully"
+            display.success(f"{name}: {message}")
         else:
-            error_msg = stderr or stdout
-            logger.error(f"Failed to pull library '{name}': {error_msg}")
-            return False, f"Pull failed: {error_msg}"
-    else:
-        # Repository doesn't exist, clone it
-        logger.debug(f"Cloning library '{name}' from {url} to {target_path}")
-
-        # Ensure parent directory exists
-        target_path.parent.mkdir(parents=True, exist_ok=True)
-
-        # Determine if we should use sparse-checkout
-        use_sparse = sparse_dir and sparse_dir != "."
-
-        if use_sparse:
-            # Use sparse-checkout to clone only specific directory
-            logger.debug(f"Using sparse-checkout for directory: {sparse_dir}")
-
-            # Initialize empty repo
-            success, stdout, stderr = _run_git_command(["init"], cwd=None)
-            if success:
-                # Create target directory
-                target_path.mkdir(parents=True, exist_ok=True)
-
-                # Initialize git repo
-                success, stdout, stderr = _run_git_command(["init"], cwd=target_path)
-                if not success:
-                    return False, f"Failed to initialize repo: {stderr or stdout}"
-
-                # Add remote
-                success, stdout, stderr = _run_git_command(
-                    ["remote", "add", "origin", url], cwd=target_path
-                )
-                if not success:
-                    return False, f"Failed to add remote: {stderr or stdout}"
-
-                # Enable sparse-checkout (non-cone mode to exclude root files)
-                success, stdout, stderr = _run_git_command(
-                    ["sparse-checkout", "init", "--no-cone"], cwd=target_path
-                )
-                if not success:
-                    return (
-                        False,
-                        f"Failed to enable sparse-checkout: {stderr or stdout}",
-                    )
-
-                # Set sparse-checkout to specific directory (non-cone uses patterns)
-                success, stdout, stderr = _run_git_command(
-                    ["sparse-checkout", "set", f"{sparse_dir}/*"], cwd=target_path
-                )
-                if not success:
-                    return (
-                        False,
-                        f"Failed to set sparse-checkout directory: {stderr or stdout}",
-                    )
-
-                # Fetch specific branch (without attempting to update local ref)
-                fetch_args = ["fetch", "--depth", "1", "origin"]
-                if branch:
-                    fetch_args.append(branch)
-                else:
-                    fetch_args.append("main")
-
-                success, stdout, stderr = _run_git_command(fetch_args, cwd=target_path)
-                if not success:
-                    return False, f"Fetch failed: {stderr or stdout}"
-
-                # Checkout the branch
-                checkout_branch = branch if branch else "main"
-                success, stdout, stderr = _run_git_command(
-                    ["checkout", checkout_branch], cwd=target_path
-                )
-                if not success:
-                    return False, f"Checkout failed: {stderr or stdout}"
-
-                # Done! Files are in target_path/sparse_dir/
-                return True, "Cloned successfully (sparse)"
-            else:
-                return False, f"Failed to initialize: {stderr or stdout}"
-        else:
-            # Regular full clone
-            clone_args = ["clone", "--depth", "1"]
-            if branch:
-                clone_args.extend(["--branch", branch])
-            clone_args.extend([url, str(target_path)])
+            display.error(f"{name}: {message}")
 
-            success, stdout, stderr = _run_git_command(clone_args)
+    return (name, message, success)
 
-            if success:
-                return True, "Cloned successfully"
-            else:
-                error_msg = stderr or stdout
-                logger.error(f"Failed to clone library '{name}': {error_msg}")
-                return False, f"Clone failed: {error_msg}"
+
+def _display_update_summary(results: list[tuple[str, str, bool]]) -> None:
+    """Display update summary."""
+    total = len(results)
+    successful = sum(1 for _, _, success in results if success)
+
+    display.text("")
+    if successful == total:
+        display.text(f"All libraries updated successfully ({successful}/{total})", style="green")
+    elif successful > 0:
+        display.text(
+            f"Partially successful: {successful}/{total} libraries updated",
+            style="yellow",
+        )
+    else:
+        display.text("Failed to update libraries", style="red")
 
 
 @app.command()
 def update(
-    library_name: Optional[str] = Argument(
-        None, help="Name of specific library to update (updates all if not specified)"
-    ),
+    library_name: str | None = Argument(None, help="Name of specific library to update (updates all if not specified)"),
     verbose: bool = Option(False, "--verbose", "-v", help="Show detailed output"),
 ) -> None:
     """Update library repositories by cloning or pulling from git.
@@ -202,95 +221,93 @@ def update(
     libraries = config.get_libraries()
 
     if not libraries:
-        display.display_warning("No libraries configured")
-        console.print(
-            "Libraries are auto-configured on first run with a default library."
-        )
+        display.warning("No libraries configured")
+        display.text("Libraries are auto-configured on first run with a default library.")
         return
 
     # Filter to specific library if requested
     if library_name:
         libraries = [lib for lib in libraries if lib.get("name") == library_name]
         if not libraries:
-            console_err.print(
-                f"[red]Error:[/red] Library '{library_name}' not found in configuration"
-            )
+            display.error(f"Library '{library_name}' not found in configuration")
             return
 
     libraries_path = config.get_libraries_path()
-
-    # Create results table
     results = []
 
-    with Progress(
-        SpinnerColumn(),
-        TextColumn("[progress.description]{task.description}"),
-        console=console,
-    ) as progress:
+    with display.progress(SpinnerColumn(), TextColumn("[progress.description]{task.description}")) as progress:
         for lib in libraries:
-            name = lib.get("name")
-            lib_type = lib.get("type", "git")
-            enabled = lib.get("enabled", True)
-
-            if not enabled:
-                if verbose:
-                    console.print(f"[dim]Skipping disabled library: {name}[/dim]")
-                results.append((name, "Skipped (disabled)", False))
-                continue
-
-            # Skip static libraries (no sync needed)
-            if lib_type == "static":
-                if verbose:
-                    console.print(
-                        f"[dim]Skipping static library: {name} (no sync needed)[/dim]"
-                    )
-                results.append((name, "N/A (static)", True))
-                continue
-
-            # Handle git libraries
-            url = lib.get("url")
-            branch = lib.get("branch")
-            directory = lib.get("directory", "library")
-
-            task = progress.add_task(f"Updating {name}...", total=None)
-
-            # Target path: ~/.config/boilerplates/libraries/{name}/
-            target_path = libraries_path / name
-
-            # Clone or pull the repository with sparse-checkout if directory is specified
-            success, message = _clone_or_pull_repo(
-                name, url, target_path, branch, directory
-            )
-
-            results.append((name, message, success))
-            progress.remove_task(task)
-
-            if verbose:
-                if success:
-                    display.display_success(f"{name}: {message}")
-                else:
-                    display.display_error(f"{name}: {message}")
+            result = _process_library_update(lib, libraries_path, progress, verbose)
+            results.append(result)
 
     # Display summary table
     if not verbose:
-        display.display_status_table(
-            "Library Update Summary", results, columns=("Library", "Status")
-        )
-
-    # Summary
-    total = len(results)
-    successful = sum(1 for _, _, success in results if success)
+        display.display_status_table("Library Update Summary", results, columns=("Library", "Status"))
+
+    _display_update_summary(results)
+
+
+def _get_library_path_for_git(lib: dict, libraries_path: Path, name: str) -> Path:
+    """Get library path for git library type."""
+    directory = lib.get("directory", "library")
+    library_base = libraries_path / name
+    if directory and directory != ".":
+        return library_base / directory
+    return library_base
+
+
+def _get_library_path_for_static(lib: dict, config: ConfigManager) -> Path:
+    """Get library path for static library type."""
+    url_or_path = lib.get("path", "")
+    library_path = Path(url_or_path).expanduser()
+    if not library_path.is_absolute():
+        library_path = (config.config_path.parent / library_path).resolve()
+    return library_path
+
+
+def _get_library_info(lib: dict, config: ConfigManager, libraries_path: Path) -> tuple[str, str, str, str, str, str]:
+    """Extract library information based on type."""
+    name = lib.get("name", "")
+    lib_type = lib.get("type", "git")
+    enabled = lib.get("enabled", True)
+
+    if lib_type == "git":
+        url_or_path = lib.get("url", "")
+        branch = lib.get("branch", "main")
+        directory = lib.get("directory", "library")
+        library_path = _get_library_path_for_git(lib, libraries_path, name)
+        exists = library_path.exists()
+        type_icon = IconManager.UI_LIBRARY_GIT
+
+    elif lib_type == "static":
+        url_or_path = lib.get("path", "")
+        branch = "-"
+        directory = "-"
+        library_path = _get_library_path_for_static(lib, config)
+        exists = library_path.exists()
+        type_icon = IconManager.UI_LIBRARY_STATIC
 
-    if successful == total:
-        console.print(
-            f"\n[green]All libraries updated successfully ({successful}/{total})[/green]"
-        )
-    elif successful > 0:
-        console.print(
-            f"\n[yellow]Partially successful: {successful}/{total} libraries updated[/yellow]"
-        )
     else:
-        console.print("\n[red]Failed to update libraries[/red]")
+        # Unknown type
+        url_or_path = "<unknown type>"
+        branch = "-"
+        directory = "-"
+        exists = False
+        type_icon = "?"
+
+    # Build status string
+    status_parts = []
+    if not enabled:
+        status_parts.append("[dim]disabled[/dim]")
+    elif exists:
+        status_parts.append("[green]available[/green]")
+    else:
+        status_parts.append("[yellow]not found[/yellow]")
+
+    status = " ".join(status_parts)
+    type_display = f"{type_icon} {lib_type}"
+
+    return url_or_path, branch, directory, type_display, type_icon, status
 
 
 @app.command()
@@ -300,10 +317,15 @@ def list() -> None:
     libraries = config.get_libraries()
 
     if not libraries:
-        console.print("[yellow]No libraries configured.[/yellow]")
+        display.text("No libraries configured.", style="yellow")
         return
 
-    table = Table(title="Configured Libraries", show_header=True)
+    settings = display.settings
+    table = Table(
+        title="Configured Libraries",
+        show_header=True,
+        header_style=settings.STYLE_TABLE_HEADER,
+    )
     table.add_column("Name", style="cyan", no_wrap=True)
     table.add_column("URL/Path", style="blue")
     table.add_column("Branch", style="yellow")
@@ -315,78 +337,24 @@ def list() -> None:
 
     for lib in libraries:
         name = lib.get("name", "")
-        lib_type = lib.get("type", "git")
-        enabled = lib.get("enabled", True)
-
-        if lib_type == "git":
-            url_or_path = lib.get("url", "")
-            branch = lib.get("branch", "main")
-            directory = lib.get("directory", "library")
-
-            # Check if library exists locally
-            library_base = libraries_path / name
-            if directory and directory != ".":
-                library_path = library_base / directory
-            else:
-                library_path = library_base
-            exists = library_path.exists()
-
-        elif lib_type == "static":
-            url_or_path = lib.get("path", "")
-            branch = "-"
-            directory = "-"
-
-            # Check if static path exists
-            from pathlib import Path
-
-            library_path = Path(url_or_path).expanduser()
-            if not library_path.is_absolute():
-                library_path = (config.config_path.parent / library_path).resolve()
-            exists = library_path.exists()
-
-        else:
-            # Unknown type
-            url_or_path = "<unknown type>"
-            branch = "-"
-            directory = "-"
-            exists = False
-
-        type_display = lib_type
-
-        status_parts = []
-        if not enabled:
-            status_parts.append("[dim]disabled[/dim]")
-        elif exists:
-            status_parts.append("[green]available[/green]")
-        else:
-            status_parts.append("[yellow]not found[/yellow]")
-
-        status = " ".join(status_parts)
-
+        url_or_path, branch, directory, type_display, _type_icon, status = _get_library_info(
+            lib, config, libraries_path
+        )
         table.add_row(name, url_or_path, branch, directory, type_display, status)
 
-    console.print(table)
+    display.print_table(table)
 
 
 @app.command()
 def add(
     name: str = Argument(..., help="Unique name for the library"),
-    library_type: str = Option(
-        "git", "--type", "-t", help="Library type (git or static)"
-    ),
-    url: Optional[str] = Option(
-        None, "--url", "-u", help="Git repository URL (for git type)"
-    ),
-    branch: str = Option("main", "--branch", "-b", help="Git branch (for git type)"),
-    directory: str = Option(
-        "library", "--directory", "-d", help="Directory in repo (for git type)"
-    ),
-    path: Optional[str] = Option(
-        None, "--path", "-p", help="Local path (for static type)"
-    ),
-    enabled: bool = Option(
-        True, "--enabled/--disabled", help="Enable or disable the library"
-    ),
+    *,
+    library_type: str | None = None,
+    url: str | None = None,
+    branch: str = "main",
+    directory: str = "library",
+    path: str | None = None,
+    enabled: bool = Option(True, "--enabled/--disabled", help="Enable or disable the library"),
     sync: bool = Option(True, "--sync/--no-sync", help="Sync after adding (git only)"),
 ) -> None:
     """Add a new library to the configuration.
@@ -403,10 +371,10 @@ def add(
     try:
         if library_type == "git":
             if not url:
-                display.display_error("--url is required for git libraries")
+                display.error("--url is required for git libraries")
                 return
-            config.add_library(
-                name,
+            lib_config = LibraryConfig(
+                name=name,
                 library_type="git",
                 url=url,
                 branch=branch,
@@ -415,32 +383,34 @@ def add(
             )
         elif library_type == "static":
             if not path:
-                display.display_error("--path is required for static libraries")
+                display.error("--path is required for static libraries")
                 return
-            config.add_library(name, library_type="static", path=path, enabled=enabled)
-        else:
-            display.display_error(
-                f"Invalid library type: {library_type}. Must be 'git' or 'static'."
+            lib_config = LibraryConfig(
+                name=name,
+                library_type="static",
+                path=path,
+                enabled=enabled,
             )
+        else:
+            display.error(f"Invalid library type: {library_type}. Must be 'git' or 'static'.")
             return
 
-        display.display_success(f"Added {library_type} library '{name}'")
+        config.add_library(lib_config)
+        display.success(f"Added {library_type} library '{name}'")
 
         if library_type == "git" and sync and enabled:
-            console.print(f"\nSyncing library '{name}'...")
+            display.text(f"\nSyncing library '{name}'...")
             update(library_name=name, verbose=True)
         elif library_type == "static":
-            display.display_info(f"Static library points to: {path}")
+            display.info(f"Static library points to: {path}")
     except ConfigError as e:
-        display.display_error(str(e))
+        display.error(str(e))
 
 
 @app.command()
 def remove(
     name: str = Argument(..., help="Name of the library to remove"),
-    keep_files: bool = Option(
-        False, "--keep-files", help="Keep the local library files (don't delete)"
-    ),
+    keep_files: bool = Option(False, "--keep-files", help="Keep the local library files (don't delete)"),
 ) -> None:
     """Remove a library from the configuration and delete its local files."""
     config = ConfigManager()
@@ -448,7 +418,7 @@ def remove(
     try:
         # Remove from config
         config.remove_library(name)
-        display.display_success(f"Removed library '{name}' from configuration")
+        display.success(f"Removed library '{name}' from configuration")
 
         # Delete local files unless --keep-files is specified
         if not keep_files:
@@ -456,17 +426,15 @@ def remove(
             library_path = libraries_path / name
 
             if library_path.exists():
-                import shutil
-
                 shutil.rmtree(library_path)
-                display.display_success(f"Deleted local files at {library_path}")
+                display.success(f"Deleted local files at {library_path}")
             else:
-                display.display_info(f"No local files found at {library_path}")
+                display.info(f"No local files found at {library_path}")
     except ConfigError as e:
-        display.display_error(str(e))
+        display.error(str(e))
 
 
 # Register the repo command with the CLI
 def register_cli(parent_app: Typer) -> None:
     """Register the repo command with the parent Typer app."""
-    parent_app.add_typer(app, name="repo")
+    parent_app.add_typer(app, name="repo", rich_help_panel="Configuration Commands")

cli/core/schema/__init__.py (+17, -0)

@@ -0,0 +1,17 @@
+"""Schema loading and management for boilerplate modules."""
+
+from .loader import (
+    SchemaLoader,
+    get_loader,
+    has_schema,
+    list_versions,
+    load_schema,
+)
+
+__all__ = [
+    "SchemaLoader",
+    "get_loader",
+    "has_schema",
+    "list_versions",
+    "load_schema",
+]

cli/core/schema/ansible/v1.0.json (+15, -0)

@@ -0,0 +1,15 @@
+[
+  {
+    "key": "general",
+    "title": "General",
+    "required": true,
+    "vars": [
+      {
+        "name": "target_hosts",
+        "description": "Target hosts",
+        "type": "str",
+        "required": true
+      }
+    ]
+  }
+]
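The new schema files share a simple shape: a JSON array of sections, each with `key`, `title`, optional `required`/`toggle`/`needs`, and a `vars` list of typed variables. A minimal structural check for that shape (a hypothetical helper for illustration, not the project's actual loader in `cli/core/schema/loader.py`, which may enforce more):

```python
import json

# Minimal keys assumed required; the real loader may check more fields.
REQUIRED_SECTION_KEYS = {"key", "vars"}
REQUIRED_VAR_KEYS = {"name", "type"}


def validate_schema(text: str) -> list[str]:
    """Return a list of problems found in a schema document (empty = valid)."""
    problems: list[str] = []
    sections = json.loads(text)
    if not isinstance(sections, list):
        return ["top-level value must be a JSON array of sections"]
    for i, section in enumerate(sections):
        missing = REQUIRED_SECTION_KEYS - section.keys()
        if missing:
            problems.append(f"section {i}: missing {sorted(missing)}")
            continue
        for var in section["vars"]:
            missing = REQUIRED_VAR_KEYS - var.keys()
            if missing:
                problems.append(f"section {section['key']}: var missing {sorted(missing)}")
    return problems


# The ansible v1.0 schema from this diff, inlined for the example.
ANSIBLE_V1 = """
[
  {"key": "general", "title": "General", "required": true,
   "vars": [{"name": "target_hosts", "description": "Target hosts",
             "type": "str", "required": true}]}
]
"""
```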

cli/core/schema/compose/v1.0.json (+229, -0)

@@ -0,0 +1,229 @@
+[
+  {
+    "key": "general",
+    "title": "General",
+    "required": true,
+    "vars": [
+      {
+        "name": "service_name",
+        "description": "Service name",
+        "type": "str"
+      },
+      {
+        "name": "container_name",
+        "description": "Container name",
+        "type": "str"
+      },
+      {
+        "name": "container_timezone",
+        "description": "Container timezone (e.g., Europe/Berlin)",
+        "type": "str",
+        "default": "UTC"
+      },
+      {
+        "name": "user_uid",
+        "description": "User UID for container process",
+        "type": "int",
+        "default": 1000
+      },
+      {
+        "name": "user_gid",
+        "description": "User GID for container process",
+        "type": "int",
+        "default": 1000
+      },
+      {
+        "name": "restart_policy",
+        "description": "Container restart policy",
+        "type": "enum",
+        "options": ["unless-stopped", "always", "on-failure", "no"],
+        "default": "unless-stopped"
+      }
+    ]
+  },
+  {
+    "key": "network",
+    "title": "Network",
+    "toggle": "network_enabled",
+    "vars": [
+      {
+        "name": "network_enabled",
+        "description": "Enable custom network block",
+        "type": "bool",
+        "default": false
+      },
+      {
+        "name": "network_name",
+        "description": "Docker network name",
+        "type": "str",
+        "default": "bridge"
+      },
+      {
+        "name": "network_external",
+        "description": "Use existing Docker network",
+        "type": "bool",
+        "default": true
+      }
+    ]
+  },
+  {
+    "key": "ports",
+    "title": "Ports",
+    "toggle": "ports_enabled",
+    "vars": [
+      {
+        "name": "ports_enabled",
+        "description": "Expose ports via 'ports' mapping",
+        "type": "bool",
+        "default": true
+      }
+    ]
+  },
+  {
+    "key": "traefik",
+    "title": "Traefik",
+    "toggle": "traefik_enabled",
+    "description": "Traefik routes external traffic to your service.",
+    "vars": [
+      {
+        "name": "traefik_enabled",
+        "description": "Enable Traefik reverse proxy integration",
+        "type": "bool",
+        "default": false
+      },
+      {
+        "name": "traefik_network",
+        "description": "Traefik network name",
+        "type": "str",
+        "default": "traefik"
+      },
+      {
+        "name": "traefik_host",
+        "description": "Domain name for your service (e.g., app.example.com)",
+        "type": "str"
+      },
+      {
+        "name": "traefik_entrypoint",
+        "description": "HTTP entrypoint (non-TLS)",
+        "type": "str",
+        "default": "web"
+      }
+    ]
+  },
+  {
+    "key": "traefik_tls",
+    "title": "Traefik TLS/SSL",
+    "toggle": "traefik_tls_enabled",
+    "needs": "traefik",
+    "description": "Enable HTTPS/TLS for Traefik with certificate management.",
+    "vars": [
+      {
+        "name": "traefik_tls_enabled",
+        "description": "Enable HTTPS/TLS",
+        "type": "bool",
+        "default": true
+      },
+      {
+        "name": "traefik_tls_entrypoint",
+        "description": "TLS entrypoint",
+        "type": "str",
+        "default": "websecure"
+      },
+      {
+        "name": "traefik_tls_certresolver",
+        "description": "Traefik certificate resolver name",
+        "type": "str",
+        "default": "cloudflare"
+      }
+    ]
+  },
+  {
+    "key": "swarm",
+    "title": "Docker Swarm",
+    "toggle": "swarm_enabled",
+    "description": "Deploy service in Docker Swarm mode with replicas.",
+    "vars": [
+      {
+        "name": "swarm_enabled",
+        "description": "Enable Docker Swarm mode",
+        "type": "bool",
+        "default": false
+      },
+      {
+        "name": "swarm_replicas",
+        "description": "Number of replicas in Swarm",
+        "type": "int",
+        "default": 1
+      },
+      {
+        "name": "swarm_placement_mode",
+        "description": "Swarm placement mode",
+        "type": "enum",
+        "options": ["global", "replicated"],
+        "default": "replicated"
+      },
+      {
+        "name": "swarm_placement_host",
+        "description": "Limit placement to specific node",
+        "type": "str"
+      }
+    ]
+  },
+  {
+    "key": "database",
+    "title": "Database",
+    "toggle": "database_enabled",
+    "description": "Connect to external database (PostgreSQL or MySQL)",
+    "vars": [
+      {
+        "name": "database_enabled",
+        "description": "Enable external database integration",
+        "type": "bool",
+        "default": false
+      },
+      {
+        "name": "database_type",
+        "description": "Database type",
+        "type": "enum",
+        "options": ["postgres", "mysql"],
+        "default": "postgres"
+      },
+      {
+        "name": "database_external",
+        "description": "Use an external database server?",
+        "extra": "skips creation of internal database container",
+        "type": "bool",
+        "default": false
+      },
+      {
+        "name": "database_host",
+        "description": "Database host",
+        "type": "str",
+        "default": "database"
+      },
+      {
+        "name": "database_port",
+        "description": "Database port",
+        "type": "int"
+      },
+      {
+        "name": "database_name",
+        "description": "Database name",
+        "type": "str"
+      },
+      {
+        "name": "database_user",
+        "description": "Database user",
+        "type": "str"
+      },
+      {
+        "name": "database_password",
+        "description": "Database password",
+        "type": "str",
+        "default": "",
+        "sensitive": true,
+        "autogenerated": true
+      }
+    ]
+  }
+]
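Per the changelog, `needs` expressions support semicolon-separated AND conditions, with comma-separated values on the right of `=` acting as OR (e.g. `traefik_enabled=true;network_mode=bridge,macvlan`), while a bare name such as `"needs": "traefik"` requires that toggle/section to be enabled. A sketch of those semantics (the CLI's real evaluator may differ in details such as case handling):

```python
def needs_satisfied(expr: str, values: dict) -> bool:
    """Evaluate a 'needs' expression against current variable values.

    ';' separates conditions that must ALL hold (AND); to the right of
    '=', ',' separates allowed values (OR). A bare name with no '='
    requires that variable (e.g. a section toggle) to be truthy.
    """
    for cond in expr.split(";"):
        cond = cond.strip()
        if "=" in cond:
            name, _, allowed = cond.partition("=")
            allowed_values = {v.strip().lower() for v in allowed.split(",")}
            # Compare stringified values so bool True matches "true"
            actual = str(values.get(name.strip(), "")).lower()
            if actual not in allowed_values:
                return False
        elif not values.get(cond):
            return False
    return True
```

Under these rules, the macvlan-only variables above are skipped when `network_mode=bridge`, which is the fix described for the pihole template.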

cli/core/schema/compose/v1.1.json (+312, -0)

@@ -0,0 +1,312 @@
+[
+  {
+    "key": "general",
+    "title": "General",
+    "required": true,
+    "vars": [
+      {
+        "name": "service_name",
+        "description": "Service name",
+        "type": "str"
+      },
+      {
+        "name": "container_name",
+        "description": "Container name",
+        "type": "str"
+      },
+      {
+        "name": "container_hostname",
+        "description": "Container internal hostname",
+        "type": "str"
+      },
+      {
+        "name": "container_timezone",
+        "description": "Container timezone (e.g., Europe/Berlin)",
+        "type": "str",
+        "default": "UTC"
+      },
+      {
+        "name": "user_uid",
+        "description": "User UID for container process",
+        "type": "int",
+        "default": 1000
+      },
+      {
+        "name": "user_gid",
+        "description": "User GID for container process",
+        "type": "int",
+        "default": 1000
+      },
+      {
+        "name": "container_loglevel",
+        "description": "Container log level",
+        "type": "enum",
+        "options": ["debug", "info", "warn", "error"],
+        "default": "info"
+      },
+      {
+        "name": "restart_policy",
+        "description": "Container restart policy",
+        "type": "enum",
+        "options": ["unless-stopped", "always", "on-failure", "no"],
+        "default": "unless-stopped"
+      }
+    ]
+  },
+  {
+    "key": "network",
+    "title": "Network",
+    "vars": [
+      {
+        "name": "network_mode",
+        "description": "Docker network mode",
+        "type": "enum",
+        "options": ["bridge", "host", "macvlan"],
+        "default": "bridge",
+        "extra": "bridge=default Docker networking, host=use host network stack, macvlan=dedicated MAC address on physical network"
+      },
+      {
+        "name": "network_name",
+        "description": "Docker network name",
+        "type": "str",
+        "default": "bridge",
+        "needs": "network_mode=bridge,macvlan"
+      },
+      {
+        "name": "network_external",
+        "description": "Use existing Docker network (external)",
+        "type": "bool",
+        "default": false,
+        "needs": "network_mode=bridge,macvlan"
+      },
+      {
+        "name": "network_macvlan_ipv4_address",
+        "description": "Static IP address for container",
+        "type": "str",
+        "default": "192.168.1.253",
+        "needs": "network_mode=macvlan"
+      },
+      {
+        "name": "network_macvlan_parent_interface",
+        "description": "Host network interface name",
+        "type": "str",
+        "default": "eth0",
+        "needs": "network_mode=macvlan"
+      },
+      {
+        "name": "network_macvlan_subnet",
+        "description": "Network subnet in CIDR notation",
+        "type": "str",
+        "default": "192.168.1.0/24",
+        "needs": "network_mode=macvlan"
+      },
+      {
+        "name": "network_macvlan_gateway",
+        "description": "Network gateway IP address",
+        "type": "str",
+        "default": "192.168.1.1",
+        "needs": "network_mode=macvlan"
+      }
+    ]
+  },
+  {
+    "key": "ports",
+    "title": "Ports",
+    "needs": "network_mode=bridge",
+    "vars": []
+  },
+  {
+    "key": "traefik",
+    "title": "Traefik",
+    "toggle": "traefik_enabled",
+    "needs": "network_mode=bridge",
+    "description": "Traefik routes external traffic to your service.",
+    "vars": [
+      {
+        "name": "traefik_enabled",
+        "description": "Enable Traefik reverse proxy integration",
+        "type": "bool",
+        "default": false
+      },
+      {
+        "name": "traefik_network",
+        "description": "Traefik network name",
+        "type": "str",
+        "default": "traefik"
+      },
+      {
+        "name": "traefik_host",
+        "description": "Domain name for your service (e.g., app.example.com)",
+        "type": "str"
+      },
+      {
+        "name": "traefik_entrypoint",
+        "description": "HTTP entrypoint (non-TLS)",
+        "type": "str",
+        "default": "web"
+      }
+    ]
+  },
+  {
+    "key": "traefik_tls",
+    "title": "Traefik TLS/SSL",
+    "toggle": "traefik_tls_enabled",
+    "needs": "traefik_enabled=true;network_mode=bridge",
+    "description": "Enable HTTPS/TLS for Traefik with certificate management.",
+    "vars": [
+      {
+        "name": "traefik_tls_enabled",
+        "description": "Enable HTTPS/TLS",
+        "type": "bool",
+        "default": true
+      },
+      {
+        "name": "traefik_tls_entrypoint",
+        "description": "TLS entrypoint",
+        "type": "str",
+        "default": "websecure"
+      },
+      {
+        "name": "traefik_tls_certresolver",
+        "description": "Traefik certificate resolver name",
+        "type": "str",
+        "default": "cloudflare"
+      }
+    ]
+  },
+  {
+    "key": "swarm",
+    "title": "Docker Swarm",
+    "toggle": "swarm_enabled",
+    "needs": "network_mode=bridge",
+    "description": "Deploy service in Docker Swarm mode.",
+    "vars": [
+      {
+        "name": "swarm_enabled",
+        "description": "Enable Docker Swarm mode",
+        "type": "bool",
+        "default": false
+      },
+      {
+        "name": "swarm_placement_mode",
+        "description": "Swarm placement mode",
+        "type": "enum",
+        "options": ["replicated", "global"],
+        "default": "replicated"
+      },
+      {
+        "name": "swarm_replicas",
+        "description": "Number of replicas",
+        "type": "int",
+        "default": 1,
+        "needs": "swarm_placement_mode=replicated"
+      },
+      {
+        "name": "swarm_placement_host",
+        "description": "Target hostname for placement constraint",
+        "type": "str",
+        "default": "",
+        "optional": true,
+        "needs": "swarm_placement_mode=replicated",
+        "extra": "Constrains service to run on specific node by hostname"
+      },
+      {
+        "name": "swarm_volume_mode",
+        "description": "Swarm volume storage backend",
+        "type": "enum",
+        "options": ["local", "mount", "nfs"],
+        "default": "local",
+        "extra": "WARNING: 'local' only works on single-node deployments!"
+      },
+      {
+        "name": "swarm_volume_mount_path",
+        "description": "Host path for bind mount",
+        "type": "str",
+        "default": "/mnt/storage",
+        "needs": "swarm_volume_mode=mount",
+        "extra": "Useful for shared/replicated storage"
+      },
+      {
+        "name": "swarm_volume_nfs_server",
+        "description": "NFS server address",
+        "type": "str",
+        "default": "192.168.1.1",
+        "needs": "swarm_volume_mode=nfs",
+        "extra": "IP address or hostname of NFS server"
+      },
+      {
+        "name": "swarm_volume_nfs_path",
+        "description": "NFS export path",
+        "type": "str",
+        "default": "/export",
+        "needs": "swarm_volume_mode=nfs",
+        "extra": "Path to NFS export on the server"
+      },
+      {
+        "name": "swarm_volume_nfs_options",
+        "description": "NFS mount options",
+        "type": "str",
+        "default": "rw,nolock,soft",
+        "needs": "swarm_volume_mode=nfs",
+        "extra": "Comma-separated NFS mount options"
+      }
+    ]
+  },
+  {
+    "key": "database",
+    "title": "Database",
+    "toggle": "database_enabled",
+    "description": "Connect to external database (PostgreSQL or MySQL)",
+    "vars": [
+      {
+        "name": "database_enabled",
+        "description": "Enable external database integration",
+        "type": "bool",
+        "default": false
+      },
+      {
+        "name": "database_type",
+        "description": "Database type",
+        "type": "enum",
+        "options": ["default", "sqlite", "postgres", "mysql"],
+        "default": "default"
+      },
+      {
+        "name": "database_external",
+        "description": "Use an external database server?",
+        "extra": "skips creation of internal database container",
+        "type": "bool",
+        "default": false
+      },
+      {
+        "name": "database_host",
+        "description": "Database host",
+        "type": "str",
+        "default": "database"
+      },
+      {
+        "name": "database_port",
+        "description": "Database port",
+        "type": "int"
+      },
+      {
+        "name": "database_name",
+        "description": "Database name",
+        "type": "str"
+      },
+      {
+        "name": "database_user",
+        "description": "Database user",
+        "type": "str"
+      },
+      {
+        "name": "database_password",
+        "description": "Database password",
+        "type": "str",
+        "default": "",
+        "sensitive": true,
+        "autogenerated": true
+      }
+    ]
+  }
+]

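The `database_password` variables above are marked both `sensitive: true` and `autogenerated: true` with an empty default. The actual generation logic lives in the CLI core and is not part of this diff; a sketch of how such a flag might be honored using Python's `secrets` module (the `generate_if_empty` helper and the 24-character length are assumptions):

```python
import secrets
import string

def generate_if_empty(var: dict, value: str) -> str:
    """Hypothetical helper: fill an empty value for variables flagged
    autogenerated (e.g. database_password in the schemas above)."""
    if value or not var.get("autogenerated"):
        return value  # user-supplied values and non-autogenerated vars pass through
    alphabet = string.ascii_letters + string.digits
    # cryptographically secure random password; the length is an assumption
    return "".join(secrets.choice(alphabet) for _ in range(24))

var = {"name": "database_password", "sensitive": True,
       "autogenerated": True, "default": ""}
password = generate_if_empty(var, var["default"])
```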
+ 528 - 0
cli/core/schema/compose/v1.2.json

@@ -0,0 +1,528 @@
+[
+  {
+    "key": "general",
+    "title": "General",
+    "vars": [
+      {
+        "name": "service_name",
+        "description": "Service name",
+        "type": "str",
+        "required": true
+      },
+      {
+        "name": "container_name",
+        "description": "Container name",
+        "type": "str"
+      },
+      {
+        "name": "container_hostname",
+        "description": "Container internal hostname",
+        "type": "str"
+      },
+      {
+        "name": "container_timezone",
+        "description": "Container timezone (e.g., Europe/Berlin)",
+        "type": "str"
+      },
+      {
+        "name": "user_uid",
+        "description": "User UID for container process",
+        "type": "int",
+        "default": 1000
+      },
+      {
+        "name": "user_gid",
+        "description": "User GID for container process",
+        "type": "int",
+        "default": 1000
+      },
+      {
+        "name": "container_loglevel",
+        "description": "Container log level",
+        "type": "enum",
+        "options": ["debug", "info", "warn", "error"]
+      },
+      {
+        "name": "restart_policy",
+        "description": "Container restart policy",
+        "type": "enum",
+        "options": ["unless-stopped", "always", "on-failure", "no"],
+        "default": "unless-stopped",
+        "required": true
+      }
+    ]
+  },
+  {
+    "key": "network",
+    "title": "Network",
+    "vars": [
+      {
+        "name": "network_mode",
+        "description": "Docker network mode",
+        "type": "enum",
+        "options": ["bridge", "host", "macvlan"],
+        "extra": "bridge=default Docker networking, host=use host network stack, macvlan=dedicated MAC address on physical network"
+      },
+      {
+        "name": "network_name",
+        "description": "Docker network name",
+        "type": "str",
+        "default": "bridge",
+        "needs": ["network_mode=bridge,macvlan"],
+        "required": true
+      },
+      {
+        "name": "network_external",
+        "description": "Use existing Docker network (external)",
+        "type": "bool",
+        "default": false,
+        "needs": ["network_mode=bridge,macvlan"]
+      },
+      {
+        "name": "network_macvlan_ipv4_address",
+        "description": "Static IP address for container",
+        "type": "str",
+        "default": "192.168.1.253",
+        "needs": ["network_mode=macvlan"],
+        "required": true
+      },
+      {
+        "name": "network_macvlan_parent_interface",
+        "description": "Host network interface name",
+        "type": "str",
+        "default": "eth0",
+        "needs": ["network_mode=macvlan"],
+        "required": true
+      },
+      {
+        "name": "network_macvlan_subnet",
+        "description": "Network subnet in CIDR notation",
+        "type": "str",
+        "default": "192.168.1.0/24",
+        "needs": ["network_mode=macvlan"],
+        "required": true
+      },
+      {
+        "name": "network_macvlan_gateway",
+        "description": "Network gateway IP address",
+        "type": "str",
+        "default": "192.168.1.1",
+        "needs": ["network_mode=macvlan"],
+        "required": true
+      }
+    ]
+  },
+  {
+    "key": "ports",
+    "title": "Ports",
+    "needs": ["network_mode!=host,macvlan"],
+    "description": "Expose service ports to the host.",
+    "vars": [
+      {
+        "name": "ports_http",
+        "description": "HTTP port on host",
+        "type": "int",
+        "needs": ["traefik_enabled=false"],
+        "default": 8080,
+        "required": true
+      },
+      {
+        "name": "ports_https",
+        "description": "HTTPS port on host",
+        "type": "int",
+        "needs": ["traefik_enabled=false"],
+        "default": 8443,
+        "required": true
+      },
+      {
+        "name": "ports_ssh",
+        "description": "SSH port on host",
+        "type": "int",
+        "default": 22,
+        "required": true
+      },
+      {
+        "name": "ports_dns",
+        "description": "DNS port on host",
+        "type": "int",
+        "default": 53,
+        "required": true
+      },
+      {
+        "name": "ports_dhcp",
+        "description": "DHCP port on host",
+        "type": "int",
+        "default": 67,
+        "required": true
+      },
+      {
+        "name": "ports_smtp",
+        "description": "SMTP port on host",
+        "type": "int",
+        "default": 25,
+        "required": true
+      },
+      {
+        "name": "ports_snmp",
+        "description": "SNMP trap port",
+        "type": "int",
+        "default": 162,
+        "required": true
+      }
+    ]
+  },
+  {
+    "key": "traefik",
+    "title": "Traefik",
+    "toggle": "traefik_enabled",
+    "needs": ["network_mode!=host,macvlan"],
+    "description": "Traefik routes external traffic to your service.",
+    "vars": [
+      {
+        "name": "traefik_enabled",
+        "description": "Enable Traefik reverse proxy integration",
+        "type": "bool",
+        "default": false
+      },
+      {
+        "name": "traefik_network",
+        "description": "Traefik network name",
+        "type": "str",
+        "default": "traefik",
+        "required": true
+      },
+      {
+        "name": "traefik_host",
+        "description": "Service subdomain or full hostname (e.g., 'app' or 'app.example.com')",
+        "type": "str",
+        "required": true
+      },
+      {
+        "name": "traefik_domain",
+        "description": "Base domain (e.g., example.com)",
+        "type": "str",
+        "default": "home.arpa",
+        "required": true
+      }
+    ]
+  },
+  {
+    "key": "traefik_tls",
+    "title": "Traefik TLS/SSL",
+    "toggle": "traefik_tls_enabled",
+    "needs": ["traefik_enabled=true", "network_mode!=host,macvlan"],
+    "description": "Enable HTTPS/TLS for Traefik with certificate management.",
+    "vars": [
+      {
+        "name": "traefik_tls_enabled",
+        "description": "Enable HTTPS/TLS",
+        "type": "bool",
+        "default": true
+      },
+      {
+        "name": "traefik_tls_certresolver",
+        "description": "Traefik certificate resolver name",
+        "type": "str",
+        "default": "cloudflare",
+        "required": true
+      }
+    ]
+  },
+  {
+    "key": "volume",
+    "title": "Volume Storage",
+    "description": "Configure persistent storage for your service.",
+    "vars": [
+      {
+        "name": "volume_mode",
+        "description": "Volume storage backend",
+        "type": "enum",
+        "options": ["local", "mount", "nfs"],
+        "default": "local",
+        "required": true
+      },
+      {
+        "name": "volume_mount_path",
+        "description": "Host path for bind mounts",
+        "type": "str",
+        "default": "/mnt/storage",
+        "needs": ["volume_mode=mount"],
+        "required": true
+      },
+      {
+        "name": "volume_nfs_server",
+        "description": "NFS server address",
+        "type": "str",
+        "default": "192.168.1.1",
+        "needs": ["volume_mode=nfs"],
+        "required": true
+      },
+      {
+        "name": "volume_nfs_path",
+        "description": "NFS export path",
+        "type": "str",
+        "default": "/export",
+        "needs": ["volume_mode=nfs"],
+        "required": true
+      },
+      {
+        "name": "volume_nfs_options",
+        "description": "NFS mount options (comma-separated)",
+        "type": "str",
+        "default": "rw,nolock,soft",
+        "needs": ["volume_mode=nfs"],
+        "required": true
+      }
+    ]
+  },
+  {
+    "key": "resources",
+    "title": "Resource Limits",
+    "toggle": "resources_enabled",
+    "description": "Set CPU and memory limits for the service.",
+    "vars": [
+      {
+        "name": "resources_enabled",
+        "description": "Enable resource limits",
+        "type": "bool",
+        "default": false
+      },
+      {
+        "name": "resources_cpu_limit",
+        "description": "Maximum CPU cores (e.g., 0.5, 1.0, 2.0)",
+        "type": "str",
+        "default": "1.0",
+        "required": true
+      },
+      {
+        "name": "resources_cpu_reservation",
+        "description": "Reserved CPU cores",
+        "type": "str",
+        "default": "0.25",
+        "needs": ["swarm_enabled=true"],
+        "required": true
+      },
+      {
+        "name": "resources_memory_limit",
+        "description": "Maximum memory (e.g., 512M, 1G, 2G)",
+        "type": "str",
+        "default": "1G",
+        "required": true
+      },
+      {
+        "name": "resources_memory_reservation",
+        "description": "Reserved memory",
+        "type": "str",
+        "default": "512M",
+        "needs": ["swarm_enabled=true"],
+        "required": true
+      }
+    ]
+  },
+  {
+    "key": "swarm",
+    "title": "Docker Swarm",
+    "toggle": "swarm_enabled",
+    "needs": ["network_mode!=host,macvlan"],
+    "description": "Deploy service in Docker Swarm mode.",
+    "vars": [
+      {
+        "name": "swarm_enabled",
+        "description": "Enable Docker Swarm mode",
+        "type": "bool",
+        "default": false
+      },
+      {
+        "name": "swarm_placement_mode",
+        "description": "Swarm placement mode",
+        "type": "enum",
+        "options": ["replicated", "global"],
+        "default": "replicated",
+        "required": true
+      },
+      {
+        "name": "swarm_replicas",
+        "description": "Number of replicas",
+        "type": "int",
+        "default": 1,
+        "needs": ["swarm_placement_mode=replicated"],
+        "required": true
+      },
+      {
+        "name": "swarm_placement_host",
+        "description": "Target hostname for placement constraint",
+        "type": "str",
+        "default": "",
+        "needs": ["swarm_placement_mode=replicated"],
+        "extra": "Constrains service to run on specific node by hostname"
+      }
+    ]
+  },
+  {
+    "key": "database",
+    "title": "Database",
+    "toggle": "database_enabled",
+    "description": "Connect to external database (PostgreSQL or MySQL)",
+    "vars": [
+      {
+        "name": "database_enabled",
+        "description": "Enable external database integration",
+        "type": "bool",
+        "default": false
+      },
+      {
+        "name": "database_type",
+        "description": "Database type",
+        "type": "enum",
+        "options": ["sqlite", "postgres", "mysql"],
+        "default": "sqlite",
+        "required": true
+      },
+      {
+        "name": "database_external",
+        "description": "Use an external database server?",
+        "extra": "skips creation of internal database container",
+        "type": "bool",
+        "needs": ["database_type=postgres,mysql"],
+        "default": false
+      },
+      {
+        "name": "database_host",
+        "description": "Database host",
+        "type": "str",
+        "needs": ["database_external=true;database_type=postgres,mysql"],
+        "required": true
+      },
+      {
+        "name": "database_port",
+        "description": "Database port",
+        "type": "int",
+        "needs": ["database_external=true;database_type=postgres,mysql"],
+        "required": true
+      },
+      {
+        "name": "database_name",
+        "description": "Database name",
+        "type": "str",
+        "needs": ["database_type=postgres,mysql"],
+        "required": true
+      },
+      {
+        "name": "database_user",
+        "description": "Database user",
+        "type": "str",
+        "needs": ["database_type=postgres,mysql"],
+        "required": true
+      },
+      {
+        "name": "database_password",
+        "description": "Database password",
+        "type": "str",
+        "needs": ["database_type=postgres,mysql"],
+        "sensitive": true,
+        "autogenerated": true,
+        "required": true
+      }
+    ]
+  },
+  {
+    "key": "email",
+    "title": "Email Server",
+    "toggle": "email_enabled",
+    "description": "Configure email server for notifications and user management.",
+    "vars": [
+      {
+        "name": "email_enabled",
+        "description": "Enable email server configuration",
+        "type": "bool",
+        "default": false
+      },
+      {
+        "name": "email_host",
+        "description": "SMTP server hostname",
+        "type": "str",
+        "required": true
+      },
+      {
+        "name": "email_port",
+        "description": "SMTP server port",
+        "type": "int",
+        "default": 25,
+        "required": true
+      },
+      {
+        "name": "email_username",
+        "description": "SMTP username",
+        "type": "str",
+        "required": true
+      },
+      {
+        "name": "email_password",
+        "description": "SMTP password",
+        "type": "str",
+        "sensitive": true,
+        "required": true
+      },
+      {
+        "name": "email_from",
+        "description": "From email address",
+        "type": "str",
+        "required": true
+      },
+      {
+        "name": "email_encryption",
+        "description": "Email encryption method to use",
+        "type": "enum",
+        "options": ["none", "starttls", "ssl"]
+      }
+    ]
+  },
+  {
+    "key": "authentik",
+    "title": "Authentik SSO",
+    "toggle": "authentik_enabled",
+    "description": "Integrate with Authentik for Single Sign-On authentication.",
+    "vars": [
+      {
+        "name": "authentik_enabled",
+        "description": "Enable Authentik SSO integration",
+        "type": "bool",
+        "default": false
+      },
+      {
+        "name": "authentik_url",
+        "description": "Authentik base URL (e.g., https://auth.example.com)",
+        "type": "url",
+        "required": true
+      },
+      {
+        "name": "authentik_slug",
+        "description": "Authentik application slug",
+        "type": "str",
+        "required": true
+      },
+      {
+        "name": "authentik_traefik_middleware",
+        "description": "Traefik middleware name for Authentik authentication",
+        "type": "str",
+        "default": "authentik-middleware@file",
+        "needs": ["traefik_enabled=true"],
+        "required": true
+      },
+      {
+        "name": "authentik_client_id",
+        "description": "Authentik OAuth2 client ID",
+        "type": "str",
+        "sensitive": true,
+        "required": true
+      },
+      {
+        "name": "authentik_client_secret",
+        "description": "Authentik OAuth2 client secret",
+        "type": "str",
+        "sensitive": true,
+        "required": true
+      }
+    ]
+  }
+]

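The v1.2 schema above exercises the full `needs` syntax: semicolons separate AND conditions (`database_external=true;database_type=postgres,mysql`), commas separate OR values, `!=` negates (`network_mode!=host,macvlan`), and conditions may be a single string (as in v1.1) or a list of strings that are all ANDed. The real evaluator is in the CLI core, not this diff; a minimal sketch of the semantics:

```python
def needs_satisfied(needs, values: dict) -> bool:
    """Sketch of the `needs` syntax used in these schemas:
    ';' separates AND conditions, ',' separates OR values,
    '!=' negates. A list of condition strings is ANDed (v1.2 form)."""
    if not needs:
        return True
    if isinstance(needs, str):
        needs = [needs]
    for entry in needs:
        for cond in entry.split(";"):
            if "!=" in cond:
                key, _, raw = cond.partition("!=")
                negated = True
            else:
                key, _, raw = cond.partition("=")
                negated = False
            # does the current value match any of the comma-separated options?
            match = str(values.get(key.strip())).lower() in {
                v.strip().lower() for v in raw.split(",")
            }
            if match == negated:  # condition fails: no match, or a negated match
                return False
    return True

# e.g. the Ports section in v1.2 declares needs: ["network_mode!=host,macvlan"]
needs_satisfied(["network_mode!=host,macvlan"], {"network_mode": "bridge"})  # True
```

Note that boolean values stringify to `"true"`/`"false"` after lowercasing, so `traefik_enabled=true` matches a Python `True` as well as the literal string.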
+ 202 - 0
cli/core/schema/helm/v1.0.json

@@ -0,0 +1,202 @@
+[
+  {
+    "key": "general",
+    "title": "General",
+    "required": true,
+    "vars": [
+      {
+        "name": "release_name",
+        "description": "Helm release name",
+        "type": "str",
+        "required": true
+      },
+      {
+        "name": "namespace",
+        "description": "Kubernetes namespace",
+        "type": "str"
+      }
+    ]
+  },
+  {
+    "key": "networking",
+    "title": "Networking",
+    "vars": [
+      {
+        "name": "network_mode",
+        "description": "Kubernetes service type",
+        "type": "enum",
+        "options": ["ClusterIP", "NodePort", "LoadBalancer"],
+        "default": "ClusterIP"
+      }
+    ]
+  },
+  {
+    "key": "database",
+    "title": "Database Configuration",
+    "toggle": "database_enabled",
+    "vars": [
+      {
+        "name": "database_enabled",
+        "description": "Enable external database configuration",
+        "type": "bool",
+        "default": false
+      },
+      {
+        "name": "database_type",
+        "description": "Database type",
+        "type": "enum",
+        "options": ["postgres", "mysql", "mariadb"],
+        "default": "postgres"
+      },
+      {
+        "name": "database_host",
+        "description": "Database hostname",
+        "type": "hostname"
+      },
+      {
+        "name": "database_port",
+        "description": "Database port",
+        "type": "int",
+        "default": 5432
+      },
+      {
+        "name": "database_name",
+        "description": "Database name",
+        "type": "str"
+      },
+      {
+        "name": "database_user",
+        "description": "Database username",
+        "type": "str"
+      },
+      {
+        "name": "database_password",
+        "description": "Database password",
+        "type": "str",
+        "sensitive": true
+      }
+    ]
+  },
+  {
+    "key": "email",
+    "title": "Email Configuration",
+    "toggle": "email_enabled",
+    "vars": [
+      {
+        "name": "email_enabled",
+        "description": "Enable email configuration",
+        "type": "bool",
+        "default": false
+      },
+      {
+        "name": "email_host",
+        "description": "SMTP server hostname",
+        "type": "hostname"
+      },
+      {
+        "name": "email_port",
+        "description": "SMTP server port",
+        "type": "int",
+        "default": 587
+      },
+      {
+        "name": "email_username",
+        "description": "SMTP username",
+        "type": "str"
+      },
+      {
+        "name": "email_password",
+        "description": "SMTP password",
+        "type": "str",
+        "sensitive": true
+      },
+      {
+        "name": "email_from",
+        "description": "From email address",
+        "type": "email"
+      },
+      {
+        "name": "email_use_tls",
+        "description": "Use TLS encryption",
+        "type": "bool",
+        "default": true
+      },
+      {
+        "name": "email_use_ssl",
+        "description": "Use SSL encryption",
+        "type": "bool",
+        "default": false
+      }
+    ]
+  },
+  {
+    "key": "traefik",
+    "title": "Traefik Ingress",
+    "toggle": "traefik_enabled",
+    "vars": [
+      {
+        "name": "traefik_enabled",
+        "description": "Enable Traefik ingress",
+        "type": "bool",
+        "default": false
+      },
+      {
+        "name": "traefik_host",
+        "description": "Ingress hostname (FQDN)",
+        "type": "hostname"
+      }
+    ]
+  },
+  {
+    "key": "traefik_tls",
+    "title": "Traefik TLS/SSL",
+    "needs": "traefik",
+    "toggle": "traefik_tls_enabled",
+    "vars": [
+      {
+        "name": "traefik_tls_enabled",
+        "description": "Enable TLS for ingress",
+        "type": "bool",
+        "default": true
+      },
+      {
+        "name": "traefik_tls_certmanager",
+        "description": "Use cert-manager for TLS certificates",
+        "type": "bool",
+        "default": false
+      },
+      {
+        "name": "certmanager_issuer",
+        "description": "Cert-manager cluster issuer name",
+        "type": "str",
+        "needs": "traefik_tls_certmanager=true",
+        "default": "letsencrypt-prod"
+      },
+      {
+        "name": "traefik_tls_secret",
+        "description": "TLS secret name",
+        "type": "str"
+      }
+    ]
+  },
+  {
+    "key": "volumes",
+    "title": "Persistent Volumes",
+    "vars": [
+      {
+        "name": "volumes_mode",
+        "description": "Volume configuration mode",
+        "type": "enum",
+        "options": ["dynamic-pvc", "existing-pvc"],
+        "default": "dynamic-pvc",
+        "extra": "dynamic-pvc=auto-provision storage, existing-pvc=use existing PVC"
+      },
+      {
+        "name": "volumes_pvc_name",
+        "description": "Existing PVC name",
+        "type": "str",
+        "needs": "volumes_mode=existing-pvc"
+      }
+    ]
+  }
+]

+ 247 - 0
cli/core/schema/kubernetes/v1.0.json

@@ -0,0 +1,247 @@
+[
+  {
+    "key": "general",
+    "title": "General",
+    "required": true,
+    "vars": [
+      {
+        "name": "resource_name",
+        "description": "Kubernetes resource name",
+        "type": "str",
+        "required": true
+      },
+      {
+        "name": "namespace",
+        "description": "Kubernetes namespace",
+        "type": "str",
+        "default": "default",
+        "required": true
+      }
+    ]
+  },
+  {
+    "key": "resources",
+    "title": "Resource Limits",
+    "toggle": "resources_enabled",
+    "description": "Set CPU and memory limits for the resource.",
+    "vars": [
+      {
+        "name": "resources_enabled",
+        "description": "Enable resource limits",
+        "type": "bool",
+        "default": false
+      },
+      {
+        "name": "resources_cpu_limit",
+        "description": "Maximum CPU cores (e.g., 100m, 500m, 1, 2)",
+        "type": "str",
+        "default": "1",
+        "required": true
+      },
+      {
+        "name": "resources_cpu_request",
+        "description": "Requested CPU cores",
+        "type": "str",
+        "default": "250m",
+        "required": true
+      },
+      {
+        "name": "resources_memory_limit",
+        "description": "Maximum memory (e.g., 512Mi, 1Gi, 2Gi)",
+        "type": "str",
+        "default": "1Gi",
+        "required": true
+      },
+      {
+        "name": "resources_memory_request",
+        "description": "Requested memory",
+        "type": "str",
+        "default": "512Mi",
+        "required": true
+      }
+    ]
+  },
+  {
+    "key": "traefik",
+    "title": "Traefik",
+    "toggle": "traefik_enabled",
+    "description": "Traefik routes external traffic to your service.",
+    "vars": [
+      {
+        "name": "traefik_enabled",
+        "description": "Enable Traefik ingress configuration",
+        "type": "bool",
+        "default": false
+      },
+      {
+        "name": "traefik_host",
+        "description": "Service subdomain or full hostname (e.g., 'app' or 'app.example.com')",
+        "type": "str",
+        "required": true
+      },
+      {
+        "name": "traefik_domain",
+        "description": "Base domain (e.g., example.com)",
+        "type": "str",
+        "default": "home.arpa",
+        "required": true
+      }
+    ]
+  },
+  {
+    "key": "traefik_tls",
+    "title": "Traefik TLS/SSL",
+    "toggle": "traefik_tls_enabled",
+    "needs": ["traefik"],
+    "description": "Enable HTTPS/TLS for Traefik with certificate management.",
+    "vars": [
+      {
+        "name": "traefik_tls_enabled",
+        "description": "Enable HTTPS/TLS",
+        "type": "bool",
+        "default": true
+      },
+      {
+        "name": "traefik_tls_certresolver",
+        "description": "Traefik certificate resolver name",
+        "type": "str",
+        "default": "cloudflare",
+        "required": true
+      }
+    ]
+  },
+  {
+    "key": "database",
+    "title": "Database",
+    "toggle": "database_enabled",
+    "description": "Connect to external database (PostgreSQL or MySQL)",
+    "vars": [
+      {
+        "name": "database_enabled",
+        "description": "Enable external database integration",
+        "type": "bool",
+        "default": false
+      },
+      {
+        "name": "database_type",
+        "description": "Database type",
+        "type": "enum",
+        "options": ["sqlite", "postgres", "mysql", "mariadb"],
+        "default": "postgres",
+        "required": true
+      },
+      {
+        "name": "database_host",
+        "description": "Database host",
+        "type": "str",
+        "default": "database",
+        "required": true
+      },
+      {
+        "name": "database_port",
+        "description": "Database port",
+        "type": "int",
+        "required": true
+      },
+      {
+        "name": "database_name",
+        "description": "Database name",
+        "type": "str",
+        "required": true
+      },
+      {
+        "name": "database_user",
+        "description": "Database user",
+        "type": "str",
+        "required": true
+      },
+      {
+        "name": "database_password",
+        "description": "Database password",
+        "type": "str",
+        "default": "",
+        "sensitive": true,
+        "autogenerated": true,
+        "required": true
+      }
+    ]
+  },
+  {
+    "key": "email",
+    "title": "Email Server",
+    "toggle": "email_enabled",
+    "description": "Configure email server for notifications and user management.",
+    "vars": [
+      {
+        "name": "email_enabled",
+        "description": "Enable email server configuration",
+        "type": "bool",
+        "default": false
+      },
+      {
+        "name": "email_host",
+        "description": "SMTP server hostname",
+        "type": "str",
+        "required": true
+      },
+      {
+        "name": "email_port",
+        "description": "SMTP server port",
+        "type": "int",
+        "default": 25,
+        "required": true
+      },
+      {
+        "name": "email_username",
+        "description": "SMTP username",
+        "type": "str",
+        "required": true
+      },
+      {
+        "name": "email_password",
+        "description": "SMTP password",
+        "type": "str",
+        "sensitive": true,
+        "required": true
+      },
+      {
+        "name": "email_from",
+        "description": "From email address",
+        "type": "str",
+        "required": true
+      },
+      {
+        "name": "email_encryption",
+        "description": "Email encryption method to use",
+        "type": "enum",
+        "options": ["none", "starttls", "ssl"]
+      }
+    ]
+  },
+  {
+    "key": "authentik",
+    "title": "Authentik SSO",
+    "toggle": "authentik_enabled",
+    "description": "Integrate with Authentik for Single Sign-On authentication.",
+    "vars": [
+      {
+        "name": "authentik_enabled",
+        "description": "Enable Authentik SSO integration",
+        "type": "bool",
+        "default": false
+      },
+      {
+        "name": "authentik_url",
+        "description": "Authentik base URL (e.g., https://auth.example.com)",
+        "type": "url",
+        "required": true
+      },
+      {
+        "name": "authentik_slug",
+        "description": "Authentik application slug",
+        "type": "str",
+        "required": true
+      }
+    ]
+  }
+]

+ 220 - 0
cli/core/schema/loader.py

@@ -0,0 +1,220 @@
+"""JSON Schema Loading and Validation.
+
+This module provides functionality to load, cache, and validate JSON schemas
+for boilerplate modules. Schemas are stored in cli/core/schema/<module>/v*.json files.
+"""
+
+import json
+from pathlib import Path
+from typing import Any
+
+from cli.core.exceptions import SchemaError
+
+
+class SchemaLoader:
+    """Loads and validates JSON schemas for modules."""
+
+    def __init__(self, schema_dir: Path | None = None):
+        """Initialize schema loader.
+
+        Args:
+            schema_dir: Directory containing schema files. If None, uses cli/core/schema/
+        """
+        if schema_dir is None:
+            # Use path relative to this file (in cli/core/schema/)
+            # __file__ is cli/core/schema/loader.py, parent is cli/core/schema/
+            self.schema_dir = Path(__file__).parent
+        else:
+            self.schema_dir = schema_dir
+
+    def load_schema(self, module: str, version: str) -> list[dict[str, Any]]:
+        """Load a JSON schema from file.
+
+        Args:
+            module: Module name (e.g., 'compose', 'ansible')
+            version: Schema version (e.g., '1.0', '1.2')
+
+        Returns:
+            Schema as list of section specifications
+
+        Raises:
+            SchemaError: If schema file not found or invalid JSON
+        """
+        schema_file = self.schema_dir / module / f"v{version}.json"
+
+        if not schema_file.exists():
+            raise SchemaError(
+                f"Schema file not found: {schema_file}",
+                details=f"Module: {module}, Version: {version}",
+            )
+
+        try:
+            with schema_file.open(encoding="utf-8") as f:
+                schema = json.load(f)
+        except json.JSONDecodeError as e:
+            raise SchemaError(
+                f"Invalid JSON in schema file: {schema_file}",
+                details=f"Error: {e}",
+            ) from e
+        except Exception as e:
+            raise SchemaError(
+                f"Failed to read schema file: {schema_file}",
+                details=f"Error: {e}",
+            ) from e
+
+        # Validate schema structure
+        self._validate_schema_structure(schema, module, version)
+
+        return schema
+
+    def _validate_schema_structure(self, schema: Any, module: str, version: str) -> None:
+        """Validate that schema has correct structure.
+
+        Args:
+            schema: Schema to validate
+            module: Module name for error messages
+            version: Version for error messages
+
+        Raises:
+            SchemaError: If schema structure is invalid
+        """
+        if not isinstance(schema, list):
+            raise SchemaError(
+                f"Schema must be a list, got {type(schema).__name__}",
+                details=f"Module: {module}, Version: {version}",
+            )
+
+        for idx, section in enumerate(schema):
+            if not isinstance(section, dict):
+                raise SchemaError(
+                    f"Section {idx} must be a dict, got {type(section).__name__}",
+                    details=f"Module: {module}, Version: {version}",
+                )
+
+            # Check required fields
+            if "key" not in section:
+                raise SchemaError(
+                    f"Section {idx} missing required field 'key'",
+                    details=f"Module: {module}, Version: {version}",
+                )
+
+            if "vars" not in section:
+                raise SchemaError(
+                    f"Section '{section.get('key')}' missing required field 'vars'",
+                    details=f"Module: {module}, Version: {version}",
+                )
+
+            if not isinstance(section["vars"], list):
+                raise SchemaError(
+                    f"Section '{section['key']}' vars must be a list",
+                    details=f"Module: {module}, Version: {version}",
+                )
+
+            # Validate variables
+            for var_idx, var in enumerate(section["vars"]):
+                if not isinstance(var, dict):
+                    raise SchemaError(
+                        f"Variable {var_idx} in section '{section['key']}' must be a dict",
+                        details=f"Module: {module}, Version: {version}",
+                    )
+
+                if "name" not in var:
+                    raise SchemaError(
+                        f"Variable {var_idx} in section '{section['key']}' missing 'name'",
+                        details=f"Module: {module}, Version: {version}",
+                    )
+
+                if "type" not in var:
+                    raise SchemaError(
+                        f"Variable '{var.get('name')}' in section '{section['key']}' missing 'type'",
+                        details=f"Module: {module}, Version: {version}",
+                    )
+
+    def list_versions(self, module: str) -> list[str]:
+        """List available schema versions for a module.
+
+        Args:
+            module: Module name
+
+        Returns:
+            List of version strings (e.g., ['1.0', '1.1', '1.2'])
+        """
+        module_dir = self.schema_dir / module
+
+        if not module_dir.exists():
+            return []
+
+        versions = []
+        for file in module_dir.glob("v*.json"):
+            # Extract version from filename (v1.0.json -> 1.0)
+            version = file.stem[1:]  # Remove 'v' prefix
+            versions.append(version)
+
+        return sorted(versions)
+
+    def has_schema(self, module: str, version: str) -> bool:
+        """Check if a schema exists.
+
+        Args:
+            module: Module name
+            version: Schema version
+
+        Returns:
+            True if schema exists
+        """
+        schema_file = self.schema_dir / module / f"v{version}.json"
+        return schema_file.exists()
+
+
+# Global schema loader instance
+_loader: SchemaLoader | None = None
+
+
+def get_loader() -> SchemaLoader:
+    """Get global schema loader instance.
+
+    Returns:
+        SchemaLoader instance
+    """
+    global _loader  # noqa: PLW0603
+    if _loader is None:
+        _loader = SchemaLoader()
+    return _loader
+
+
+def load_schema(module: str, version: str) -> list[dict[str, Any]]:
+    """Load a schema using the global loader.
+
+    Args:
+        module: Module name
+        version: Schema version
+
+    Returns:
+        Schema as list of section specifications
+    """
+    return get_loader().load_schema(module, version)
+
+
+def list_versions(module: str) -> list[str]:
+    """List available versions for a module.
+
+    Args:
+        module: Module name
+
+    Returns:
+        List of version strings
+    """
+    return get_loader().list_versions(module)
+
+
+def has_schema(module: str, version: str) -> bool:
+    """Check if a schema exists.
+
+    Args:
+        module: Module name
+        version: Schema version
+
+    Returns:
+        True if schema exists
+    """
+    return get_loader().has_schema(module, version)

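As a quick illustration of the version-discovery convention used by `list_versions()` above, here is a self-contained sketch that mirrors its logic (`SchemaLoader` itself is not importable from this snippet; the directory names are made up for the demo):

```python
import tempfile
from pathlib import Path


# Mirror of SchemaLoader.list_versions(): schemas live at <dir>/<module>/v<ver>.json
def list_versions(schema_dir: Path, module: str) -> list[str]:
    module_dir = schema_dir / module
    if not module_dir.exists():
        return []
    # Strip the leading 'v' from each filename stem (v1.0.json -> 1.0)
    return sorted(f.stem[1:] for f in module_dir.glob("v*.json"))


with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    (root / "compose").mkdir()
    for ver in ("1.0", "1.2"):
        (root / "compose" / f"v{ver}.json").write_text("[]", encoding="utf-8")
    print(list_versions(root, "compose"))  # ['1.0', '1.2']
```

Note that `sorted()` compares version strings lexicographically, so a hypothetical `v1.10` would sort before `v1.2`; callers needing numeric ordering would have to split on the dot themselves.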
+ 14 - 0
cli/core/schema/packer/v1.0.json

@@ -0,0 +1,14 @@
+[
+  {
+    "key": "general",
+    "title": "General",
+    "required": true,
+    "vars": [
+      {
+        "name": "playbook_name",
+        "description": "Ansible playbook name",
+        "type": "str"
+      }
+    ]
+  }
+]

+ 87 - 0
cli/core/schema/terraform/v1.0.json

@@ -0,0 +1,87 @@
+[
+  {
+    "key": "general",
+    "title": "General",
+    "required": true,
+    "vars": [
+      {
+        "name": "resource_name",
+        "description": "Terraform resource name (alphanumeric and underscores only)",
+        "type": "str",
+        "default": "resource"
+      }
+    ]
+  },
+  {
+    "key": "depends_on",
+    "title": "Dependencies",
+    "toggle": "depends_on_enabled",
+    "required": false,
+    "vars": [
+      {
+        "name": "depends_on_enabled",
+        "description": "Enable resource dependencies",
+        "type": "bool",
+        "default": false
+      },
+      {
+        "name": "dependencies",
+        "description": "Comma-separated list of resource dependencies",
+        "type": "str",
+        "default": ""
+      }
+    ]
+  },
+  {
+    "key": "lifecycle",
+    "title": "Lifecycle",
+    "toggle": "lifecycle_enabled",
+    "required": false,
+    "vars": [
+      {
+        "name": "lifecycle_enabled",
+        "description": "Enable lifecycle rules",
+        "type": "bool",
+        "default": false
+      },
+      {
+        "name": "prevent_destroy",
+        "description": "Prevent resource destruction",
+        "type": "bool",
+        "default": false
+      },
+      {
+        "name": "create_before_destroy",
+        "description": "Create replacement before destroying",
+        "type": "bool",
+        "default": false
+      },
+      {
+        "name": "ignore_changes",
+        "description": "Comma-separated list of attributes to ignore changes for",
+        "type": "str",
+        "default": ""
+      }
+    ]
+  },
+  {
+    "key": "tags",
+    "title": "Tags",
+    "toggle": "tags_enabled",
+    "required": false,
+    "vars": [
+      {
+        "name": "tags_enabled",
+        "description": "Enable resource tags",
+        "type": "bool",
+        "default": false
+      },
+      {
+        "name": "tags_json",
+        "description": "Resource tags in JSON format (e.g., {\"Environment\": \"Production\"})",
+        "type": "str",
+        "default": "{}"
+      }
+    ]
+  }
+]

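The `tags_json` variable above carries a JSON object as a plain string with a `{}` default; a consumer of the rendered template would parse it along these lines (illustrative only — the actual rendering code is not part of this diff):

```python
import json

# Parse a tags_json value from the Tags section; the "{}" default yields an
# empty dict, so the no-tags case needs no special handling.
tags = json.loads('{"Environment": "Production"}')
print(tags.get("Environment"))  # Production
print(json.loads("{}"))         # {}
```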
+ 20 - 0
cli/core/template/__init__.py

@@ -0,0 +1,20 @@
+"""Template package for template and variable management.
+
+This package provides Template, VariableCollection, VariableSection, and Variable
+classes for managing templates and their variables.
+"""
+
+from .template import Template, TemplateErrorHandler, TemplateFile, TemplateMetadata
+from .variable import Variable
+from .variable_collection import VariableCollection
+from .variable_section import VariableSection
+
+__all__ = [
+    "Template",
+    "TemplateErrorHandler",
+    "TemplateFile",
+    "TemplateMetadata",
+    "Variable",
+    "VariableCollection",
+    "VariableSection",
+]

(Diff not shown because of its large size)
+ 376 - 414
cli/core/template/template.py


+ 69 - 102
cli/core/variable.py → cli/core/template/variable.py

@@ -1,18 +1,20 @@
 from __future__ import annotations
 
-from typing import TYPE_CHECKING, Any, Dict, List, Optional, Set
-from urllib.parse import urlparse
 import logging
-import re
+from typing import TYPE_CHECKING, Any
+from urllib.parse import urlparse
+
+from email_validator import EmailNotValidError, validate_email
 
 if TYPE_CHECKING:
-    from cli.core.section import VariableSection
+    from cli.core.template.variable_section import VariableSection
 
 logger = logging.getLogger(__name__)
 
+# Constants
+DEFAULT_AUTOGENERATED_LENGTH = 32
 TRUE_VALUES = {"true", "1", "yes", "on"}
 FALSE_VALUES = {"false", "0", "no", "off"}
-EMAIL_REGEX = re.compile(r"^[^@\\s]+@[^@\\s]+\\.[^@\\s]+$")
 
 
 class Variable:
@@ -22,8 +24,9 @@ class Variable:
         """Initialize Variable from a dictionary containing variable specification.
 
         Args:
-            data: Dictionary containing variable specification with required 'name' key
-                  and optional keys: description, type, options, prompt, value, default, section, origin
+            data: Dictionary containing variable specification with required
+                  'name' key and optional keys: description, type, options,
+                  prompt, value, default, section, origin
 
         Raises:
             ValueError: If data is not a dict, missing 'name' key, or has invalid default value
@@ -36,36 +39,38 @@ class Variable:
             raise ValueError("Variable data must contain 'name' key")
 
         # Track which fields were explicitly provided in source data
-        self._explicit_fields: Set[str] = set(data.keys())
+        self._explicit_fields: set[str] = set(data.keys())
 
         # Initialize fields
         self.name: str = data["name"]
         # Reference to parent section (set by VariableCollection)
-        self.parent_section: Optional["VariableSection"] = data.get("parent_section")
-        self.description: Optional[str] = data.get("description") or data.get(
-            "display", ""
-        )
+        self.parent_section: VariableSection | None = data.get("parent_section")
+        self.description: str | None = data.get("description") or data.get("display", "")
         self.type: str = data.get("type", "str")
-        self.options: Optional[List[Any]] = data.get("options", [])
-        self.prompt: Optional[str] = data.get("prompt")
+        self.options: list[Any] | None = data.get("options", [])
+        self.prompt: str | None = data.get("prompt")
         if "value" in data:
             self.value: Any = data.get("value")
         elif "default" in data:
             self.value: Any = data.get("default")
         else:
-            self.value: Any = None
-        self.origin: Optional[str] = data.get("origin")
+            # RULE: If bool variables don't have any default value or value at all,
+            # automatically set them to false
+            self.value: Any = False if self.type == "bool" else None
+        self.origin: str | None = data.get("origin")
         self.sensitive: bool = data.get("sensitive", False)
         # Optional extra explanation used by interactive prompts
-        self.extra: Optional[str] = data.get("extra")
+        self.extra: str | None = data.get("extra")
         # Flag indicating this variable should be auto-generated when empty
         self.autogenerated: bool = data.get("autogenerated", False)
-        # Flag indicating this variable is required even when section is disabled
+        # Length of auto-generated value
+        self.autogenerated_length: int = data.get("autogenerated_length", DEFAULT_AUTOGENERATED_LENGTH)
+        # Flag indicating if autogenerated value should be base64 encoded
+        self.autogenerated_base64: bool = data.get("autogenerated_base64", False)
+        # Flag indicating this variable is required (must have a value)
         self.required: bool = data.get("required", False)
-        # Flag indicating this variable can be empty/optional
-        self.optional: bool = data.get("optional", False)
         # Original value before config override (used for display)
-        self.original_value: Optional[Any] = data.get("original_value")
+        self.original_value: Any | None = data.get("original_value")
         # Variable dependencies - can be string or list of strings in format "var_name=value"
         # Supports semicolon-separated multiple conditions: "var1=value1;var2=value2,value3"
         needs_value = data.get("needs")
@@ -73,24 +78,20 @@ class Variable:
             if isinstance(needs_value, str):
                 # Split by semicolon to support multiple AND conditions in a single string
                 # Example: "traefik_enabled=true;network_mode=bridge,macvlan"
-                self.needs: List[str] = [
-                    need.strip() for need in needs_value.split(";") if need.strip()
-                ]
+                self.needs: list[str] = [need.strip() for need in needs_value.split(";") if need.strip()]
             elif isinstance(needs_value, list):
-                self.needs: List[str] = needs_value
+                self.needs: list[str] = needs_value
             else:
-                raise ValueError(
-                    f"Variable '{self.name}' has invalid 'needs' value: must be string or list"
-                )
+                raise ValueError(f"Variable '{self.name}' has invalid 'needs' value: must be string or list")
         else:
-            self.needs: List[str] = []
+            self.needs: list[str] = []
 
         # Validate and convert the default/initial value if present
         if self.value is not None:
             try:
                 self.value = self.convert(self.value)
             except ValueError as exc:
-                raise ValueError(f"Invalid default for variable '{self.name}': {exc}")
+                raise ValueError(f"Invalid default for variable '{self.name}': {exc}") from exc
 
     def convert(self, value: Any) -> Any:
         """Validate and convert a raw value based on the variable type.
@@ -154,24 +155,16 @@ class Variable:
 
         # Special handling for autogenerated variables
         # Allow empty values as they will be auto-generated later
-        if self.autogenerated and (
-            converted is None
-            or (
-                isinstance(converted, str) and (converted == "" or converted == "*auto")
-            )
-        ):
+        if self.autogenerated and (converted is None or (isinstance(converted, str) and (converted in {"", "*auto"}))):
             return None  # Signal that auto-generation should happen
 
-        # Allow empty values for optional variables
-        if self.optional and (
-            converted is None or (isinstance(converted, str) and converted == "")
-        ):
-            return None
-
         # Check if this is a required field and the value is empty
-        if check_required and self.is_required():
-            if converted is None or (isinstance(converted, str) and converted == ""):
-                raise ValueError("This field is required and cannot be empty")
+        if (
+            check_required
+            and self.is_required()
+            and (converted is None or (isinstance(converted, str) and converted == ""))
+        ):
+            raise ValueError("This field is required and cannot be empty")
 
         return converted
 
@@ -187,7 +180,7 @@ class Variable:
                 return False
         raise ValueError("value must be a boolean (true/false)")
 
-    def _convert_int(self, value: Any) -> Optional[int]:
+    def _convert_int(self, value: Any) -> int | None:
         """Convert value to integer."""
         if isinstance(value, int):
             return value
@@ -198,7 +191,7 @@ class Variable:
         except (TypeError, ValueError) as exc:
             raise ValueError("value must be an integer") from exc
 
-    def _convert_float(self, value: Any) -> Optional[float]:
+    def _convert_float(self, value: Any) -> float | None:
         """Convert value to float."""
         if isinstance(value, float):
             return value
@@ -209,7 +202,7 @@ class Variable:
         except (TypeError, ValueError) as exc:
             raise ValueError("value must be a float") from exc
 
-    def _convert_enum(self, value: Any) -> Optional[str]:
+    def _convert_enum(self, value: Any) -> str | None:
         if value == "":
             return None
         val = str(value)
@@ -230,11 +223,14 @@ class Variable:
         val = str(value).strip()
         if not val:
             return None
-        if not EMAIL_REGEX.fullmatch(val):
-            raise ValueError("value must be a valid email address")
-        return val
+        try:
+            # Validate email using RFC 5321/5322 compliant parser
+            validated = validate_email(val, check_deliverability=False)
+            return validated.normalized
+        except EmailNotValidError as exc:
+            raise ValueError(f"value must be a valid email address: {exc}") from exc
 
-    def to_dict(self) -> Dict[str, Any]:
+    def to_dict(self) -> dict[str, Any]:
         """Serialize Variable to a dictionary for storage."""
         result = {}
 
@@ -256,10 +252,14 @@ class Variable:
             result["sensitive"] = True
         if self.autogenerated:
             result["autogenerated"] = True
+            # Only include length if not default
+            if self.autogenerated_length != DEFAULT_AUTOGENERATED_LENGTH:
+                result["autogenerated_length"] = self.autogenerated_length
+            # Include base64 flag if enabled
+            if self.autogenerated_base64:
+                result["autogenerated_base64"] = True
         if self.required:
             result["required"] = True
-        if self.optional:
-            result["optional"] = True
         if self.options is not None:  # Allow empty list
             result["options"] = self.options
 
@@ -269,9 +269,7 @@ class Variable:
 
         return result
 
-    def get_display_value(
-        self, mask_sensitive: bool = True, max_length: int = 30, show_none: bool = True
-    ) -> str:
+    def get_display_value(self, mask_sensitive: bool = True, max_length: int = 30, show_none: bool = True) -> str:
         """Get formatted display value with optional masking and truncation.
 
         Args:
@@ -314,20 +312,14 @@ class Variable:
 
         # Type-specific handlers
         if self.type == "enum":
-            if not self.options:
-                return typed
             return (
-                self.options[0]
-                if typed is None or str(typed) not in self.options
-                else str(typed)
+                typed
+                if not self.options
+                else (self.options[0] if typed is None or str(typed) not in self.options else str(typed))
             )
 
         if self.type == "bool":
-            return (
-                typed
-                if isinstance(typed, bool)
-                else (None if typed is None else bool(typed))
-            )
+            return typed if isinstance(typed, bool) else (None if typed is None else bool(typed))
 
         if self.type == "int":
             try:
@@ -352,7 +344,7 @@ class Variable:
 
         return prompt_text
 
-    def get_validation_hint(self) -> Optional[str]:
+    def get_validation_hint(self) -> str | None:
         """Get validation hint for prompts (e.g., enum options).
 
         Returns:
@@ -373,43 +365,17 @@ class Variable:
     def is_required(self) -> bool:
         """Check if this variable requires a value (cannot be empty/None).
 
-        A variable is considered required if:
-        - It has an explicit 'required: true' flag (highest precedence)
-        - OR it doesn't have a default value (value is None)
-          AND it's not marked as autogenerated (which can be empty and generated later)
-          AND it's not marked as optional (which can be empty)
-          AND it's not a boolean type (booleans default to False if not set)
+        A variable is considered required ONLY if it has an explicit 'required: true' flag.
+        All other variables are optional by default.
 
         Returns:
             True if the variable must have a non-empty value, False otherwise
         """
-        # Optional variables can always be empty
-        if self.optional:
-            return False
-
-        # Explicit required flag takes highest precedence
-        if self.required:
-            # But autogenerated variables can still be empty (will be generated later)
-            if self.autogenerated:
-                return False
-            return True
-
-        # Autogenerated variables can be empty (will be generated later)
-        if self.autogenerated:
-            return False
-
-        # Boolean variables always have a value (True or False)
-        if self.type == "bool":
-            return False
-
-        # Variables with a default value are not required
-        if self.value is not None:
-            return False
-
-        # No default value and not autogenerated = required
-        return True
+        # Only explicitly marked required variables are required
+        # Autogenerated variables can still be empty (will be generated later)
+        return self.required and not self.autogenerated
 
-    def get_parent(self) -> Optional["VariableSection"]:
+    def get_parent(self) -> VariableSection | None:
         """Get the parent VariableSection that contains this variable.
 
         Returns:
@@ -417,7 +383,7 @@ class Variable:
         """
         return self.parent_section
 
-    def clone(self, update: Optional[Dict[str, Any]] = None) -> "Variable":
+    def clone(self, update: dict[str, Any] | None = None) -> Variable:
         """Create a deep copy of the variable with optional field updates.
 
         This is more efficient than converting to dict and back when copying variables.
@@ -442,8 +408,9 @@ class Variable:
             "sensitive": self.sensitive,
             "extra": self.extra,
             "autogenerated": self.autogenerated,
+            "autogenerated_length": self.autogenerated_length,
+            "autogenerated_base64": self.autogenerated_base64,
             "required": self.required,
-            "optional": self.optional,
             "original_value": self.original_value,
             "needs": self.needs.copy() if self.needs else None,
             "parent_section": self.parent_section,

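The semicolon/comma `needs` syntax handled above can be summarized in a self-contained sketch. Each semicolon-separated clause is an AND condition, and comma-separated values on the right of `=` are OR alternatives. The helper names here are illustrative — the real satisfaction check lives elsewhere (in `VariableCollection`, not shown in this diff):

```python
def parse_needs(needs_value: str) -> list[str]:
    # "a=1;b=2,3" -> ["a=1", "b=2,3"]  (each entry is one AND clause)
    return [need.strip() for need in needs_value.split(";") if need.strip()]


def needs_satisfied(needs: list[str], values: dict) -> bool:
    # Every clause must hold; a clause holds when the variable's current
    # value matches any of its comma-separated alternatives.
    for need in needs:
        var_name, _, allowed = need.partition("=")
        alternatives = {a.strip().lower() for a in allowed.split(",")}
        if str(values.get(var_name)).lower() not in alternatives:
            return False
    return True


clauses = parse_needs("traefik_enabled=true;network_mode=bridge,macvlan")
print(clauses)  # ['traefik_enabled=true', 'network_mode=bridge,macvlan']
print(needs_satisfied(clauses, {"traefik_enabled": True, "network_mode": "bridge"}))  # True
print(needs_satisfied(clauses, {"traefik_enabled": True, "network_mode": "host"}))    # False
```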
(Diff not shown because of its large size)
+ 444 - 300
cli/core/template/variable_collection.py


+ 58 - 69
cli/core/section.py → cli/core/template/variable_section.py

@@ -1,8 +1,9 @@
 from __future__ import annotations
 
 from collections import OrderedDict
-from typing import Any, Dict, List, Optional
+from typing import Any
 
+from ..exceptions import VariableError
 from .variable import Variable
 
 
@@ -16,23 +17,21 @@ class VariableSection:
             data: Dictionary containing section specification with required 'key' and 'title' keys
         """
         if not isinstance(data, dict):
-            raise ValueError("VariableSection data must be a dictionary")
+            raise VariableError("VariableSection data must be a dictionary")
 
         if "key" not in data:
-            raise ValueError("VariableSection data must contain 'key'")
+            raise VariableError("VariableSection data must contain 'key'")
 
         if "title" not in data:
-            raise ValueError("VariableSection data must contain 'title'")
+            raise VariableError("VariableSection data must contain 'title'")
 
         self.key: str = data["key"]
         self.title: str = data["title"]
         self.variables: OrderedDict[str, Variable] = OrderedDict()
-        self.description: Optional[str] = data.get("description")
-        self.toggle: Optional[str] = data.get("toggle")
+        self.description: str | None = data.get("description")
+        self.toggle: str | None = data.get("toggle")
         # Track which fields were explicitly provided (to support explicit clears)
         self._explicit_fields: set[str] = set(data.keys())
-        # Default "general" section to required=True, all others to required=False
-        self.required: bool = data.get("required", data["key"] == "general")
         # Section dependencies - can be string or list of strings
         # Supports semicolon-separated multiple conditions: "var1=value1;var2=value2,value3"
         needs_value = data.get("needs")
@@ -40,22 +39,17 @@ class VariableSection:
             if isinstance(needs_value, str):
                 # Split by semicolon to support multiple AND conditions in a single string
                 # Example: "traefik_enabled=true;network_mode=bridge,macvlan"
-                self.needs: List[str] = [
-                    need.strip() for need in needs_value.split(";") if need.strip()
-                ]
+                self.needs: list[str] = [need.strip() for need in needs_value.split(";") if need.strip()]
             elif isinstance(needs_value, list):
-                self.needs: List[str] = needs_value
+                self.needs: list[str] = needs_value
             else:
-                raise ValueError(
-                    f"Section '{self.key}' has invalid 'needs' value: must be string or list"
-                )
+                raise VariableError(f"Section '{self.key}' has invalid 'needs' value: must be string or list")
         else:
-            self.needs: List[str] = []
+            self.needs: list[str] = []
 
-    def to_dict(self) -> Dict[str, Any]:
+    def to_dict(self) -> dict[str, Any]:
         """Serialize VariableSection to a dictionary for storage."""
         section_dict = {
-            "required": self.required,
             "vars": {name: var.to_dict() for name, var in self.variables.items()},
         }
 
@@ -66,9 +60,7 @@ class VariableSection:
 
         # Store dependencies (single value if only one, list otherwise)
         if self.needs:
-            section_dict["needs"] = (
-                self.needs[0] if len(self.needs) == 1 else self.needs
-            )
+            section_dict["needs"] = self.needs[0] if len(self.needs) == 1 else self.needs
 
         return section_dict
 
@@ -76,12 +68,8 @@ class VariableSection:
         """Check if section is currently enabled based on toggle variable.
 
         Returns:
-            True if section is enabled (required, no toggle, or toggle is True), False otherwise
+            True if section is enabled (no toggle, or toggle is True), False otherwise
         """
-        # Required sections are always enabled, regardless of toggle
-        if self.required:
-            return True
-
         if not self.toggle:
             return True
 
@@ -94,7 +82,7 @@ class VariableSection:
         except Exception:
             return False
 
-    def clone(self, origin_update: Optional[str] = None) -> "VariableSection":
+    def clone(self, origin_update: str | None = None) -> VariableSection:
         """Create a deep copy of the section with all variables.
 
         This is more efficient than converting to dict and back when copying sections.
@@ -115,7 +103,6 @@ class VariableSection:
                 "title": self.title,
                 "description": self.description,
                 "toggle": self.toggle,
-                "required": self.required,
                 "needs": self.needs.copy() if self.needs else None,
             }
         )
@@ -123,51 +110,35 @@ class VariableSection:
         # Deep copy all variables
         for var_name, variable in self.variables.items():
             if origin_update:
-                cloned.variables[var_name] = variable.clone(
-                    update={"origin": origin_update}
-                )
+                cloned.variables[var_name] = variable.clone(update={"origin": origin_update})
             else:
                 cloned.variables[var_name] = variable.clone()
 
         return cloned
 
-    def sort_variables(self, is_need_satisfied_func=None) -> None:
-        """Sort variables within section for optimal display and user interaction.
-
-        Current sorting strategy:
-        - Variables with no dependencies come first
-        - Variables that depend on others come after their dependencies (topological sort)
-        - Original order is preserved for variables at the same dependency level
-
-        Future sorting strategies can be added here (e.g., by type, required first, etc.)
-
-        Args:
-            is_need_satisfied_func: Optional function to check if a variable need is satisfied
-                                   (reserved for future use in conditional sorting)
-        """
-        if not self.variables:
-            return
-
-        # Build dependency graph
-        var_list = list(self.variables.keys())
+    def _build_dependency_graph(self, var_list: list[str]) -> dict[str, list[str]]:
+        """Build dependency graph for variables in this section."""
         var_set = set(var_list)
-
-        # For each variable, find which OTHER variables in THIS section it depends on
         dependencies = {var_name: [] for var_name in var_list}
+
         for var_name in var_list:
             variable = self.variables[var_name]
-            if variable.needs:
-                for need in variable.needs:
-                    # Parse need format: "variable_name=value"
-                    dep_var = need.split("=")[0] if "=" in need else need
-                    # Only track dependencies within THIS section
-                    if dep_var in var_set and dep_var != var_name:
-                        dependencies[var_name].append(dep_var)
-
-        # Topological sort using Kahn's algorithm
+            if not variable.needs:
+                continue
+
+            for need in variable.needs:
+                # Parse need format: "variable_name=value"
+                dep_var = need.split("=")[0] if "=" in need else need
+                # Only track dependencies within THIS section
+                if dep_var in var_set and dep_var != var_name:
+                    dependencies[var_name].append(dep_var)
+
+        return dependencies
+
+    def _topological_sort(self, var_list: list[str], dependencies: dict[str, list[str]]) -> list[str]:
+        """Perform topological sort using Kahn's algorithm."""
         in_degree = {var_name: len(deps) for var_name, deps in dependencies.items()}
         queue = [var for var, degree in in_degree.items() if degree == 0]
-        # Preserve original order for variables with same dependency level
         queue.sort(key=lambda v: var_list.index(v))
         result = []
 
@@ -185,12 +156,30 @@ class VariableSection:
 
         # If not all variables were sorted (cycle), append remaining in original order
         if len(result) != len(var_list):
-            for var_name in var_list:
-                if var_name not in result:
-                    result.append(var_name)
+            result.extend(var_name for var_name in var_list if var_name not in result)
+
+        return result
+
+    def sort_variables(self, _is_need_satisfied_func=None) -> None:
+        """Sort variables within section for optimal display and user interaction.
+
+        Current sorting strategy:
+        - Variables with no dependencies come first
+        - Variables that depend on others come after their dependencies (topological sort)
+        - Original order is preserved for variables at the same dependency level
+
+        Future sorting strategies can be added here (e.g., by type, required first, etc.)
+
+        Args:
+            _is_need_satisfied_func: Optional function to check if a variable need is satisfied
+                                    (unused, reserved for future use in conditional sorting)
+        """
+        if not self.variables:
+            return
+
+        var_list = list(self.variables.keys())
+        dependencies = self._build_dependency_graph(var_list)
+        result = self._topological_sort(var_list, dependencies)
 
         # Rebuild variables OrderedDict in new order
-        sorted_vars = OrderedDict()
-        for var_name in result:
-            sorted_vars[var_name] = self.variables[var_name]
-        self.variables = sorted_vars
+        self.variables = OrderedDict((var_name, self.variables[var_name]) for var_name in result)
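The refactor above splits `sort_variables` into graph construction plus a Kahn's-algorithm pass. A standalone sketch of the same approach, with hypothetical variable names (the real method works on `self.variables`; this version takes plain arguments so it can run in isolation):

```python
def topo_sort(var_list: list[str], dependencies: dict[str, list[str]]) -> list[str]:
    """Kahn's algorithm: emit variables only after their in-section dependencies."""
    in_degree = {v: len(deps) for v, deps in dependencies.items()}
    # Start with dependency-free variables, preserving their original order
    queue = sorted((v for v, d in in_degree.items() if d == 0), key=var_list.index)
    result = []
    while queue:
        current = queue.pop(0)
        result.append(current)
        # Unlock variables whose last remaining dependency was just emitted
        for var, deps in dependencies.items():
            if current in deps:
                in_degree[var] -= 1
                if in_degree[var] == 0:
                    queue.append(var)
    # Cycle fallback, as in the diff: append the remainder in original order
    result.extend(v for v in var_list if v not in result)
    return result

# Hypothetical section: network_ip needs network_mode, the others are free
order = topo_sort(
    ["network_ip", "network_mode", "network_name"],
    {"network_ip": ["network_mode"], "network_mode": [], "network_name": []},
)
print(order)  # → ['network_mode', 'network_name', 'network_ip']
```

Note the stdlib also offers `graphlib.TopologicalSorter` (Python 3.9+), but it does not guarantee the original-order tie-breaking this commit preserves.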

+ 31 - 33
cli/core/validators.py

@@ -9,22 +9,25 @@ from __future__ import annotations
 import logging
 from abc import ABC, abstractmethod
 from pathlib import Path
-from typing import Any, List, Optional
+from typing import Any, ClassVar
 
 import yaml
-from rich.console import Console
+
+from .display import DisplayManager
 
 logger = logging.getLogger(__name__)
-console = Console()
 
 
 class ValidationResult:
     """Represents the result of a validation operation."""
 
     def __init__(self):
-        self.errors: List[str] = []
-        self.warnings: List[str] = []
-        self.info: List[str] = []
+        self.errors: list[str] = []
+        self.warnings: list[str] = []
+        self.info: list[str] = []
 
     def add_error(self, message: str) -> None:
         """Add an error message."""
@@ -52,36 +55,38 @@ class ValidationResult:
         return len(self.warnings) > 0
 
     def display(self, context: str = "Validation") -> None:
-        """Display validation results to console."""
+        """Display validation results using DisplayManager."""
+        display = DisplayManager()
+
         if self.errors:
-            console.print(f"\n[red]✗ {context} Failed:[/red]")
+            display.error(f"\n✗ {context} Failed:")
             for error in self.errors:
-                console.print(f"  [red]• {error}[/red]")
+                display.error(f"  • {error}")
 
         if self.warnings:
-            console.print(f"\n[yellow]⚠ {context} Warnings:[/yellow]")
+            display.warning(f"\n⚠ {context} Warnings:")
             for warning in self.warnings:
-                console.print(f"  [yellow]• {warning}[/yellow]")
+                display.warning(f"  • {warning}")
 
         if self.info:
-            console.print(f"\n[blue]ℹ {context} Info:[/blue]")
+            display.text(f"\n[blue]i {context} Info:[/blue]")
             for info_msg in self.info:
-                console.print(f"  [blue]• {info_msg}[/blue]")
+                display.text(f"  [blue]• {info_msg}[/blue]")
 
         if self.is_valid and not self.has_warnings:
-            console.print(f"\n[green]✓ {context} Passed[/green]")
+            display.text(f"\n[green]✓ {context} Passed[/green]")
 
 
 class ContentValidator(ABC):
     """Abstract base class for content validators."""
 
     @abstractmethod
-    def validate(self, content: str, file_path: str) -> ValidationResult:
+    def validate(self, content: str, _file_path: str) -> ValidationResult:
         """Validate content and return results.
 
         Args:
             content: The file content to validate
-            file_path: Path to the file (for error messages)
+            _file_path: Path to the file (unused in base class, kept for API compatibility)
 
         Returns:
             ValidationResult with errors, warnings, and info
@@ -104,7 +109,7 @@ class ContentValidator(ABC):
 class DockerComposeValidator(ContentValidator):
     """Validator for Docker Compose files."""
 
-    COMPOSE_FILENAMES = {
+    COMPOSE_FILENAMES: ClassVar[set[str]] = {
         "docker-compose.yml",
         "docker-compose.yaml",
         "compose.yml",
@@ -116,7 +121,7 @@ class DockerComposeValidator(ContentValidator):
         filename = Path(file_path).name.lower()
         return filename in self.COMPOSE_FILENAMES
 
-    def validate(self, content: str, file_path: str) -> ValidationResult:
+    def validate(self, content: str, _file_path: str) -> ValidationResult:
         """Validate Docker Compose file structure."""
         result = ValidationResult()
 
@@ -130,9 +135,7 @@ class DockerComposeValidator(ContentValidator):
 
             # Check for version (optional in Compose v2, but good practice)
             if "version" not in data:
-                result.add_info(
-                    "No 'version' field specified (using Compose v2 format)"
-                )
+                result.add_info("No 'version' field specified (using Compose v2 format)")
 
             # Check for services (required)
             if "services" not in data:
@@ -170,9 +173,7 @@ class DockerComposeValidator(ContentValidator):
 
         return result
 
-    def _validate_service(
-        self, name: str, config: Any, result: ValidationResult
-    ) -> None:
+    def _validate_service(self, name: str, config: Any, result: ValidationResult) -> None:
         """Validate a single service configuration."""
         if not isinstance(config, dict):
             result.add_error(f"Service '{name}': configuration must be a dictionary")
@@ -203,9 +204,8 @@ class DockerComposeValidator(ContentValidator):
                 keys = [e.split("=")[0] for e in env if isinstance(e, str) and "=" in e]
                 duplicates = {k for k in keys if keys.count(k) > 1}
                 if duplicates:
-                    result.add_warning(
-                        f"Service '{name}': duplicate environment variables: {', '.join(duplicates)}"
-                    )
+                    dups = ", ".join(duplicates)
+                    result.add_warning(f"Service '{name}': duplicate environment variables: {dups}")
 
         # Check for ports
         if "ports" in config:
@@ -221,7 +221,7 @@ class YAMLValidator(ContentValidator):
         """Check if file is a YAML file."""
         return Path(file_path).suffix.lower() in [".yml", ".yaml"]
 
-    def validate(self, content: str, file_path: str) -> ValidationResult:
+    def validate(self, content: str, _file_path: str) -> ValidationResult:
         """Validate YAML syntax."""
         result = ValidationResult()
 
@@ -238,7 +238,7 @@ class ValidatorRegistry:
     """Registry for content validators."""
 
     def __init__(self):
-        self.validators: List[ContentValidator] = []
+        self.validators: list[ContentValidator] = []
         self._register_default_validators()
 
     def _register_default_validators(self) -> None:
@@ -255,7 +255,7 @@ class ValidatorRegistry:
         self.validators.append(validator)
         logger.debug(f"Registered validator: {validator.__class__.__name__}")
 
-    def get_validator(self, file_path: str) -> Optional[ContentValidator]:
+    def get_validator(self, file_path: str) -> ContentValidator | None:
         """Get the most appropriate validator for a file.
 
         Args:
@@ -288,9 +288,7 @@ class ValidatorRegistry:
 
         # No validator found - return empty result
         result = ValidationResult()
-        result.add_info(
-            f"No semantic validator available for {Path(file_path).suffix} files"
-        )
+        result.add_info(f"No semantic validator available for {Path(file_path).suffix} files")
         return result
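The validator classes above use a first-match registry dispatch (`can_validate` gates which validator handles a file). A compressed sketch of the same shape — the `JSONValidator` is hypothetical and not part of this commit, and errors are returned as a plain list rather than a `ValidationResult` to keep the example self-contained:

```python
import json
from abc import ABC, abstractmethod
from pathlib import Path


class ContentValidator(ABC):
    @abstractmethod
    def can_validate(self, file_path: str) -> bool: ...

    @abstractmethod
    def validate(self, content: str, _file_path: str) -> list[str]: ...


class JSONValidator(ContentValidator):
    """Hypothetical validator: checks JSON syntax only."""

    def can_validate(self, file_path: str) -> bool:
        return Path(file_path).suffix.lower() == ".json"

    def validate(self, content: str, _file_path: str) -> list[str]:
        try:
            json.loads(content)
            return []
        except json.JSONDecodeError as exc:
            return [f"Invalid JSON: {exc}"]


validators: list[ContentValidator] = [JSONValidator()]


def validate_file(path: str, content: str) -> list[str]:
    # First matching validator wins, mirroring ValidatorRegistry.get_validator
    for validator in validators:
        if validator.can_validate(path):
            return validator.validate(content, path)
    return []  # no validator found → empty result, as in the diff


print(validate_file("config.json", "{bad"))  # one-element error list
```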
 
 

+ 3 - 7
cli/core/version.py

@@ -6,14 +6,13 @@ Supports version strings in the format: major.minor (e.g., "1.0", "1.2")
 
 from __future__ import annotations
 
-import re
-from typing import Tuple
 import logging
+import re
 
 logger = logging.getLogger(__name__)
 
 
-def parse_version(version_str: str) -> Tuple[int, int]:
+def parse_version(version_str: str) -> tuple[int, int]:
     """Parse a semantic version string into a tuple of integers.
 
     Args:
@@ -42,10 +41,7 @@ def parse_version(version_str: str) -> Tuple[int, int]:
     match = re.match(pattern, version_str)
 
     if not match:
-        raise ValueError(
-            f"Invalid version format '{version_str}'. "
-            "Expected format: major.minor (e.g., '1.0', '1.2')"
-        )
+        raise ValueError(f"Invalid version format '{version_str}'. Expected format: major.minor (e.g., '1.0', '1.2')")
 
     major, minor = match.groups()
     return (int(major), int(minor))
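The hunk above shows `parse_version` matching against a `pattern` whose definition is outside the diff; the regex below is an assumption consistent with the documented `major.minor` format. Returning an `(int, int)` tuple means versions compare correctly with plain tuple ordering:

```python
import re


def parse_version(version_str: str) -> tuple[int, int]:
    """Parse 'major.minor' into ints; pattern assumed from the docstring."""
    match = re.match(r"^(\d+)\.(\d+)$", version_str)
    if not match:
        raise ValueError(
            f"Invalid version format '{version_str}'. Expected format: major.minor (e.g., '1.0', '1.2')"
        )
    major, minor = match.groups()
    return (int(major), int(minor))


print(parse_version("1.2"))  # → (1, 2)
# Tuple comparison avoids the string-comparison trap where "1.2" > "1.10"
print(parse_version("1.2") < parse_version("1.10"))  # → True
```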

+ 1 - 0
cli/modules/__init__.py

@@ -0,0 +1 @@
+"""Modules package."""

+ 88 - 0
cli/modules/ansible/__init__.py

@@ -0,0 +1,88 @@
+"""Ansible module with multi-schema support."""
+
+import logging
+from collections import OrderedDict
+
+from ...core.module import Module
+from ...core.registry import registry
+from ...core.schema import has_schema, list_versions, load_schema
+
+logger = logging.getLogger(__name__)
+
+
+def _load_json_spec_as_dict(version: str) -> OrderedDict:
+    """Load JSON schema and convert to dict format for backward compatibility.
+
+    Args:
+        version: Schema version
+
+    Returns:
+        OrderedDict in the same format as Python specs
+    """
+    logger.debug(f"Loading ansible schema {version} from JSON")
+    json_spec = load_schema("ansible", version)
+
+    # Convert JSON array format to OrderedDict format
+    spec_dict = OrderedDict()
+    for section_data in json_spec:
+        section_key = section_data["key"]
+
+        # Build section dict
+        section_dict = {}
+        if "title" in section_data:
+            section_dict["title"] = section_data["title"]
+        if "description" in section_data:
+            section_dict["description"] = section_data["description"]
+        if "toggle" in section_data:
+            section_dict["toggle"] = section_data["toggle"]
+        if "required" in section_data:
+            section_dict["required"] = section_data["required"]
+        if "needs" in section_data:
+            section_dict["needs"] = section_data["needs"]
+
+        # Convert vars array to dict
+        vars_dict = OrderedDict()
+        for var_data in section_data["vars"]:
+            var_name = var_data["name"]
+            var_dict = {k: v for k, v in var_data.items() if k != "name"}
+            vars_dict[var_name] = var_dict
+
+        section_dict["vars"] = vars_dict
+        spec_dict[section_key] = section_dict
+
+    return spec_dict
+
+
+# Schema version mapping - loads JSON schemas on-demand
+class _SchemaDict(dict):
+    """Dict subclass that loads JSON schemas on-demand."""
+
+    def __getitem__(self, version):
+        if not has_schema("ansible", version):
+            raise KeyError(
+                f"Schema version {version} not found for ansible module. "
+                f"Available: {', '.join(list_versions('ansible'))}"
+            )
+        return _load_json_spec_as_dict(version)
+
+    def __contains__(self, version):
+        return has_schema("ansible", version)
+
+
+# Initialize schema dict
+SCHEMAS = _SchemaDict()
+
+# Default spec - load latest version
+spec = _load_json_spec_as_dict("1.0")
+
+
+class AnsibleModule(Module):
+    """Ansible module."""
+
+    name = "ansible"
+    description = "Manage Ansible configurations"
+    schema_version = "1.0"  # Current schema version supported by this module
+    schemas = SCHEMAS  # Available schema versions
+
+
+registry.register(AnsibleModule)
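`_SchemaDict` above defers JSON loading until a version is actually requested, so importing the module stays cheap. The same lazy-dict pattern in isolation, with a hypothetical in-memory loader standing in for `load_schema` (no file I/O):

```python
from collections.abc import Callable


class LazyDict(dict):
    """dict subclass that builds values on demand instead of at import time."""

    def __init__(self, loader: Callable[[str], dict], available: set[str]):
        super().__init__()
        self._loader = loader          # callable: version -> spec dict
        self._available = set(available)

    def __getitem__(self, key: str) -> dict:
        if key not in self._available:
            raise KeyError(
                f"Schema version {key} not found. "
                f"Available: {', '.join(sorted(self._available))}"
            )
        return self._loader(key)       # loaded fresh on every access

    def __contains__(self, key: str) -> bool:
        return key in self._available


schemas = LazyDict(lambda v: {"version": v}, {"1.0", "1.1"})
print("1.0" in schemas)  # → True
print(schemas["1.1"])    # → {'version': '1.1'}
```

One trade-off of this sketch (and of the commit's `_SchemaDict`): each `__getitem__` re-runs the loader, so callers that access the same version repeatedly may want to cache the result.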

+ 132 - 12
cli/modules/compose/__init__.py

@@ -1,29 +1,149 @@
 """Docker Compose module with multi-schema support."""
 
+import logging
+from collections import OrderedDict
+from typing import Annotated
+
+from typer import Argument, Option
+
 from ...core.module import Module
+from ...core.module.base_commands import validate_templates
 from ...core.registry import registry
+from ...core.schema import has_schema, list_versions, load_schema
+from .validate import run_docker_validation
+
+logger = logging.getLogger(__name__)
+
+
+def _load_json_spec_as_dict(version: str) -> OrderedDict:
+    """Load JSON schema and convert to dict format for backward compatibility.
+
+    Args:
+        version: Schema version
+
+    Returns:
+        OrderedDict in the same format as Python specs
+    """
+    logger.debug(f"Loading compose schema {version} from JSON")
+    json_spec = load_schema("compose", version)
+
+    # Convert JSON array format to OrderedDict format
+    spec_dict = OrderedDict()
+    for section_data in json_spec:
+        section_key = section_data["key"]
+
+        # Build section dict
+        section_dict = {}
+        if "title" in section_data:
+            section_dict["title"] = section_data["title"]
+        if "description" in section_data:
+            section_dict["description"] = section_data["description"]
+        if "toggle" in section_data:
+            section_dict["toggle"] = section_data["toggle"]
+        if "required" in section_data:
+            section_dict["required"] = section_data["required"]
+        if "needs" in section_data:
+            section_dict["needs"] = section_data["needs"]
+
+        # Convert vars array to dict
+        vars_dict = OrderedDict()
+        for var_data in section_data["vars"]:
+            var_name = var_data["name"]
+            var_dict = {k: v for k, v in var_data.items() if k != "name"}
+            vars_dict[var_name] = var_dict
+
+        section_dict["vars"] = vars_dict
+        spec_dict[section_key] = section_dict
 
-# Import schema specifications
-from .spec_v1_0 import spec as spec_1_0
-from .spec_v1_1 import spec as spec_1_1
+    return spec_dict
 
-# Schema version mapping
-SCHEMAS = {
-    "1.0": spec_1_0,
-    "1.1": spec_1_1,
-}
 
-# Default spec points to latest version
-spec = spec_1_1
+# Schema version mapping - loads JSON schemas on-demand
+class _SchemaDict(dict):
+    """Dict subclass that loads JSON schemas on-demand."""
+
+    def __getitem__(self, version):
+        if not has_schema("compose", version):
+            raise KeyError(
+                f"Schema version {version} not found for compose module. "
+                f"Available: {', '.join(list_versions('compose'))}"
+            )
+        return _load_json_spec_as_dict(version)
+
+    def __contains__(self, version):
+        return has_schema("compose", version)
+
+
+# Initialize schema dict
+SCHEMAS = _SchemaDict()
+
+# Default spec - load latest version
+spec = _load_json_spec_as_dict("1.2")
 
 
 class ComposeModule(Module):
-    """Docker Compose module."""
+    """Docker Compose module with extended validation."""
 
     name = "compose"
     description = "Manage Docker Compose configurations"
-    schema_version = "1.1"  # Current schema version supported by this module
+    schema_version = "1.2"  # Current schema version supported by this module
     schemas = SCHEMAS  # Available schema versions
 
+    def validate(  # noqa: PLR0913
+        self,
+        template_id: Annotated[
+            str | None,
+            Argument(help="Template ID to validate (omit to validate all templates)"),
+        ] = None,
+        *,
+        path: Annotated[
+            str | None,
+            Option("--path", help="Path to template directory for validation"),
+        ] = None,
+        verbose: Annotated[bool, Option("--verbose", "-v", help="Show detailed validation information")] = False,
+        semantic: Annotated[
+            bool,
+            Option(
+                "--semantic/--no-semantic",
+                help="Enable semantic validation (Docker Compose schema, etc.)",
+            ),
+        ] = True,
+        docker: Annotated[
+            bool,
+            Option(
+                "--docker/--no-docker",
+                help="Enable Docker Compose validation using 'docker compose config'",
+            ),
+        ] = False,
+        docker_test_all: Annotated[
+            bool,
+            Option(
+                "--docker-test-all",
+                help="Test all variable combinations (minimal, maximal, each toggle). Requires --docker",
+            ),
+        ] = False,
+    ) -> None:
+        """Validate templates for Jinja2 syntax, undefined variables, and semantic correctness.
+
+        Extended for Docker Compose with optional docker compose config validation.
+        Use --docker for single config test, --docker-test-all for comprehensive testing.
+
+        Examples:
+            # Validate specific template
+            compose validate netbox
+
+            # Validate all templates
+            compose validate
+
+            # Validate with Docker Compose config check
+            compose validate netbox --docker
+        """
+        # Run standard validation first
+        validate_templates(self, template_id, path, verbose, semantic)
+
+        # If docker validation is enabled and we have a specific template
+        if docker and (template_id or path):
+            run_docker_validation(self, template_id, path, docker_test_all, verbose)
+
 
 registry.register(ComposeModule)
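The release notes describe the new `needs` syntax as semicolon-separated AND conditions, each allowing a comma-separated OR of values (e.g. `traefik_enabled=true;network_mode=bridge,macvlan`). The evaluator itself is not shown in these hunks; the sketch below is an assumption matching that description:

```python
def needs_satisfied(needs_expr: str, values: dict[str, str]) -> bool:
    """AND over ';'-separated conditions; each 'var=a,b' is an OR over values."""
    for condition in needs_expr.split(";"):
        if "=" in condition:
            var, _, allowed = condition.partition("=")
            if str(values.get(var)) not in allowed.split(","):
                return False
        elif not values.get(condition):  # bare variable name: truthy check
            return False
    return True


expr = "traefik_enabled=true;network_mode=bridge,macvlan"
print(needs_satisfied(expr, {"traefik_enabled": "true", "network_mode": "bridge"}))  # → True
print(needs_satisfied(expr, {"traefik_enabled": "true", "network_mode": "host"}))    # → False
```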

+ 0 - 278
cli/modules/compose/spec_v1_0.py

@@ -1,278 +0,0 @@
-"""Compose module schema version 1.0 - Original specification."""
-
-from collections import OrderedDict
-
-spec = OrderedDict(
-    {
-        "general": {
-            "title": "General",
-            "vars": {
-                "service_name": {
-                    "description": "Service name",
-                    "type": "str",
-                },
-                "container_name": {
-                    "description": "Container name",
-                    "type": "str",
-                },
-                "container_hostname": {
-                    "description": "Container internal hostname",
-                    "type": "str",
-                },
-                "container_timezone": {
-                    "description": "Container timezone (e.g., Europe/Berlin)",
-                    "type": "str",
-                    "default": "UTC",
-                },
-                "user_uid": {
-                    "description": "User UID for container process",
-                    "type": "int",
-                    "default": 1000,
-                },
-                "user_gid": {
-                    "description": "User GID for container process",
-                    "type": "int",
-                    "default": 1000,
-                },
-                "container_loglevel": {
-                    "description": "Container log level",
-                    "type": "enum",
-                    "options": ["debug", "info", "warn", "error"],
-                    "default": "info",
-                },
-                "restart_policy": {
-                    "description": "Container restart policy",
-                    "type": "enum",
-                    "options": ["unless-stopped", "always", "on-failure", "no"],
-                    "default": "unless-stopped",
-                },
-            },
-        },
-        "network": {
-            "title": "Network",
-            "toggle": "network_enabled",
-            "vars": {
-                "network_enabled": {
-                    "description": "Enable custom network block",
-                    "type": "bool",
-                    "default": False,
-                },
-                "network_name": {
-                    "description": "Docker network name",
-                    "type": "str",
-                    "default": "bridge",
-                },
-                "network_external": {
-                    "description": "Use existing Docker network",
-                    "type": "bool",
-                    "default": True,
-                },
-            },
-        },
-        "ports": {
-            "title": "Ports",
-            "toggle": "ports_enabled",
-            "vars": {
-                "ports_enabled": {
-                    "description": "Expose ports via 'ports' mapping",
-                    "type": "bool",
-                    "default": True,
-                }
-            },
-        },
-        "traefik": {
-            "title": "Traefik",
-            "toggle": "traefik_enabled",
-            "description": "Traefik routes external traffic to your service.",
-            "vars": {
-                "traefik_enabled": {
-                    "description": "Enable Traefik reverse proxy integration",
-                    "type": "bool",
-                    "default": False,
-                },
-                "traefik_network": {
-                    "description": "Traefik network name",
-                    "type": "str",
-                    "default": "traefik",
-                },
-                "traefik_host": {
-                    "description": "Domain name for your service (e.g., app.example.com)",
-                    "type": "str",
-                },
-                "traefik_entrypoint": {
-                    "description": "HTTP entrypoint (non-TLS)",
-                    "type": "str",
-                    "default": "web",
-                },
-            },
-        },
-        "traefik_tls": {
-            "title": "Traefik TLS/SSL",
-            "toggle": "traefik_tls_enabled",
-            "needs": "traefik",
-            "description": "Enable HTTPS/TLS for Traefik with certificate management.",
-            "vars": {
-                "traefik_tls_enabled": {
-                    "description": "Enable HTTPS/TLS",
-                    "type": "bool",
-                    "default": True,
-                },
-                "traefik_tls_entrypoint": {
-                    "description": "TLS entrypoint",
-                    "type": "str",
-                    "default": "websecure",
-                },
-                "traefik_tls_certresolver": {
-                    "description": "Traefik certificate resolver name",
-                    "type": "str",
-                    "default": "cloudflare",
-                },
-            },
-        },
-        "swarm": {
-            "title": "Docker Swarm",
-            "toggle": "swarm_enabled",
-            "description": "Deploy service in Docker Swarm mode with replicas.",
-            "vars": {
-                "swarm_enabled": {
-                    "description": "Enable Docker Swarm mode",
-                    "type": "bool",
-                    "default": False,
-                },
-                "swarm_replicas": {
-                    "description": "Number of replicas in Swarm",
-                    "type": "int",
-                    "default": 1,
-                },
-                "swarm_placement_mode": {
-                    "description": "Swarm placement mode",
-                    "type": "enum",
-                    "options": ["global", "replicated"],
-                    "default": "replicated",
-                },
-                "swarm_placement_host": {
-                    "description": "Limit placement to specific node",
-                    "type": "str",
-                },
-            },
-        },
-        "database": {
-            "title": "Database",
-            "toggle": "database_enabled",
-            "description": "Connect to external database (PostgreSQL or MySQL)",
-            "vars": {
-                "database_enabled": {
-                    "description": "Enable external database integration",
-                    "type": "bool",
-                    "default": False,
-                },
-                "database_type": {
-                    "description": "Database type",
-                    "type": "enum",
-                    "options": ["postgres", "mysql"],
-                    "default": "postgres",
-                },
-                "database_external": {
-                    "description": "Use an external database server?",
-                    "extra": "skips creation of internal database container",
-                    "type": "bool",
-                    "default": False,
-                },
-                "database_host": {
-                    "description": "Database host",
-                    "type": "str",
-                    "default": "database",
-                },
-                "database_port": {"description": "Database port", "type": "int"},
-                "database_name": {
-                    "description": "Database name",
-                    "type": "str",
-                },
-                "database_user": {
-                    "description": "Database user",
-                    "type": "str",
-                },
-                "database_password": {
-                    "description": "Database password",
-                    "type": "str",
-                    "default": "",
-                    "sensitive": True,
-                    "autogenerated": True,
-                },
-            },
-        },
-        "email": {
-            "title": "Email Server",
-            "toggle": "email_enabled",
-            "description": "Configure email server for notifications and user management.",
-            "vars": {
-                "email_enabled": {
-                    "description": "Enable email server configuration",
-                    "type": "bool",
-                    "default": False,
-                },
-                "email_host": {
-                    "description": "SMTP server hostname",
-                    "type": "str",
-                },
-                "email_port": {
-                    "description": "SMTP server port",
-                    "type": "int",
-                    "default": 587,
-                },
-                "email_username": {
-                    "description": "SMTP username",
-                    "type": "str",
-                },
-                "email_password": {
-                    "description": "SMTP password",
-                    "type": "str",
-                    "sensitive": True,
-                },
-                "email_from": {
-                    "description": "From email address",
-                    "type": "str",
-                },
-                "email_use_tls": {
-                    "description": "Use TLS encryption",
-                    "type": "bool",
-                    "default": True,
-                },
-                "email_use_ssl": {
-                    "description": "Use SSL encryption",
-                    "type": "bool",
-                    "default": False,
-                },
-            },
-        },
-        "authentik": {
-            "title": "Authentik SSO",
-            "toggle": "authentik_enabled",
-            "description": "Integrate with Authentik for Single Sign-On authentication.",
-            "vars": {
-                "authentik_enabled": {
-                    "description": "Enable Authentik SSO integration",
-                    "type": "bool",
-                    "default": False,
-                },
-                "authentik_url": {
-                    "description": "Authentik base URL (e.g., https://auth.example.com)",
-                    "type": "str",
-                },
-                "authentik_slug": {
-                    "description": "Authentik application slug",
-                    "type": "str",
-                },
-                "authentik_client_id": {
-                    "description": "OAuth client ID from Authentik provider",
-                    "type": "str",
-                },
-                "authentik_client_secret": {
-                    "description": "OAuth client secret from Authentik provider",
-                    "type": "str",
-                    "sensitive": True,
-                },
-            },
-        },
-    }
-)

+ 0 - 341
cli/modules/compose/spec_v1_1.py

@@ -1,341 +0,0 @@
-"""Compose module schema version 1.1 - Enhanced with network_mode and improved swarm.
-
-Changes from 1.0:
-- network: Added network_mode (bridge/host/macvlan) with conditional macvlan fields
-- swarm: Added volume modes (local/mount/nfs) and conditional placement constraints
-- traefik_tls: Updated needs format from 'traefik' to 'traefik_enabled=true'
-"""
-
-from collections import OrderedDict
-
-spec = OrderedDict(
-    {
-        "general": {
-            "title": "General",
-            "vars": {
-                "service_name": {
-                    "description": "Service name",
-                    "type": "str",
-                },
-                "container_name": {
-                    "description": "Container name",
-                    "type": "str",
-                },
-                "container_hostname": {
-                    "description": "Container internal hostname",
-                    "type": "str",
-                },
-                "container_timezone": {
-                    "description": "Container timezone (e.g., Europe/Berlin)",
-                    "type": "str",
-                    "default": "UTC",
-                },
-                "user_uid": {
-                    "description": "User UID for container process",
-                    "type": "int",
-                    "default": 1000,
-                },
-                "user_gid": {
-                    "description": "User GID for container process",
-                    "type": "int",
-                    "default": 1000,
-                },
-                "container_loglevel": {
-                    "description": "Container log level",
-                    "type": "enum",
-                    "options": ["debug", "info", "warn", "error"],
-                    "default": "info",
-                },
-                "restart_policy": {
-                    "description": "Container restart policy",
-                    "type": "enum",
-                    "options": ["unless-stopped", "always", "on-failure", "no"],
-                    "default": "unless-stopped",
-                },
-            },
-        },
-        "network": {
-            "title": "Network",
-            "vars": {
-                "network_mode": {
-                    "description": "Docker network mode",
-                    "type": "enum",
-                    "options": ["bridge", "host", "macvlan"],
-                    "default": "bridge",
-                },
-                "network_name": {
-                    "description": "Docker network name",
-                    "type": "str",
-                    "default": "bridge",
-                    "needs": "network_mode=bridge,macvlan",
-                },
-                "network_external": {
-                    "description": "Use existing Docker network (external)",
-                    "type": "bool",
-                    "default": False,
-                    "needs": "network_mode=bridge,macvlan",
-                },
-                "network_macvlan_ipv4_address": {
-                    "description": "Static IP address for container",
-                    "type": "str",
-                    "default": "192.168.1.253",
-                    "needs": "network_mode=macvlan",
-                },
-                "network_macvlan_parent_interface": {
-                    "description": "Host network interface name",
-                    "type": "str",
-                    "default": "eth0",
-                    "needs": "network_mode=macvlan",
-                },
-                "network_macvlan_subnet": {
-                    "description": "Network subnet in CIDR notation",
-                    "type": "str",
-                    "default": "192.168.1.0/24",
-                    "needs": "network_mode=macvlan",
-                },
-                "network_macvlan_gateway": {
-                    "description": "Network gateway IP address",
-                    "type": "str",
-                    "default": "192.168.1.1",
-                    "needs": "network_mode=macvlan",
-                },
-            },
-        },
-        "ports": {
-            "title": "Ports",
-            "toggle": "ports_enabled",
-            "needs": "network_mode=bridge",
-            "vars": {},
-        },
-        "traefik": {
-            "title": "Traefik",
-            "toggle": "traefik_enabled",
-            "needs": "network_mode=bridge",
-            "description": "Traefik routes external traffic to your service.",
-            "vars": {
-                "traefik_enabled": {
-                    "description": "Enable Traefik reverse proxy integration",
-                    "type": "bool",
-                    "default": False,
-                },
-                "traefik_network": {
-                    "description": "Traefik network name",
-                    "type": "str",
-                    "default": "traefik",
-                },
-                "traefik_host": {
-                    "description": "Domain name for your service (e.g., app.example.com)",
-                    "type": "str",
-                },
-                "traefik_entrypoint": {
-                    "description": "HTTP entrypoint (non-TLS)",
-                    "type": "str",
-                    "default": "web",
-                },
-            },
-        },
-        "traefik_tls": {
-            "title": "Traefik TLS/SSL",
-            "toggle": "traefik_tls_enabled",
-            "needs": "traefik_enabled=true;network_mode=bridge",
-            "description": "Enable HTTPS/TLS for Traefik with certificate management.",
-            "vars": {
-                "traefik_tls_enabled": {
-                    "description": "Enable HTTPS/TLS",
-                    "type": "bool",
-                    "default": True,
-                },
-                "traefik_tls_entrypoint": {
-                    "description": "TLS entrypoint",
-                    "type": "str",
-                    "default": "websecure",
-                },
-                "traefik_tls_certresolver": {
-                    "description": "Traefik certificate resolver name",
-                    "type": "str",
-                    "default": "cloudflare",
-                },
-            },
-        },
-        "swarm": {
-            "title": "Docker Swarm",
-            "needs": "network_mode=bridge",
-            "toggle": "swarm_enabled",
-            "description": "Deploy service in Docker Swarm mode.",
-            "vars": {
-                "swarm_enabled": {
-                    "description": "Enable Docker Swarm mode",
-                    "type": "bool",
-                    "default": False,
-                },
-                "swarm_placement_mode": {
-                    "description": "Swarm placement mode",
-                    "type": "enum",
-                    "options": ["replicated", "global"],
-                    "default": "replicated",
-                },
-                "swarm_replicas": {
-                    "description": "Number of replicas",
-                    "type": "int",
-                    "default": 1,
-                    "needs": "swarm_placement_mode=replicated",
-                },
-                "swarm_placement_host": {
-                    "description": "Target hostname for placement constraint",
-                    "type": "str",
-                    "default": "",
-                    "optional": True,
-                    "needs": "swarm_placement_mode=replicated",
-                    "extra": "Constrains service to run on specific node by hostname",
-                },
-                "swarm_volume_mode": {
-                    "description": "Swarm volume storage backend",
-                    "type": "enum",
-                    "options": ["local", "mount", "nfs"],
-                    "default": "local",
-                    "extra": "WARNING: 'local' only works on single-node deployments!",
-                },
-                "swarm_volume_mount_path": {
-                    "description": "Host path for bind mount",
-                    "type": "str",
-                    "default": "/mnt/storage",
-                    "needs": "swarm_volume_mode=mount",
-                    "extra": "Useful for shared/replicated storage",
-                },
-                "swarm_volume_nfs_server": {
-                    "description": "NFS server address",
-                    "type": "str",
-                    "default": "192.168.1.1",
-                    "needs": "swarm_volume_mode=nfs",
-                    "extra": "IP address or hostname of NFS server",
-                },
-                "swarm_volume_nfs_path": {
-                    "description": "NFS export path",
-                    "type": "str",
-                    "default": "/export",
-                    "needs": "swarm_volume_mode=nfs",
-                    "extra": "Path to NFS export on the server",
-                },
-                "swarm_volume_nfs_options": {
-                    "description": "NFS mount options",
-                    "type": "str",
-                    "default": "rw,nolock,soft",
-                    "needs": "swarm_volume_mode=nfs",
-                    "extra": "Comma-separated NFS mount options",
-                },
-            },
-        },
-        "database": {
-            "title": "Database",
-            "toggle": "database_enabled",
-            "vars": {
-                "database_type": {
-                    "description": "Database type",
-                    "type": "enum",
-                    "options": ["default", "sqlite", "postgres", "mysql"],
-                    "default": "default",
-                },
-                "database_external": {
-                    "description": "Use an external database server?",
-                    "extra": "skips creation of internal database container",
-                    "type": "bool",
-                    "default": False,
-                },
-                "database_host": {
-                    "description": "Database host",
-                    "type": "str",
-                    "default": "database",
-                },
-                "database_port": {"description": "Database port", "type": "int"},
-                "database_name": {
-                    "description": "Database name",
-                    "type": "str",
-                },
-                "database_user": {
-                    "description": "Database user",
-                    "type": "str",
-                },
-                "database_password": {
-                    "description": "Database password",
-                    "type": "str",
-                    "default": "",
-                    "sensitive": True,
-                    "autogenerated": True,
-                },
-            },
-        },
-        "email": {
-            "title": "Email Server",
-            "toggle": "email_enabled",
-            "description": "Configure email server for notifications and user management.",
-            "vars": {
-                "email_enabled": {
-                    "description": "Enable email server configuration",
-                    "type": "bool",
-                    "default": False,
-                },
-                "email_host": {
-                    "description": "SMTP server hostname",
-                    "type": "str",
-                },
-                "email_port": {
-                    "description": "SMTP server port",
-                    "type": "int",
-                    "default": 587,
-                },
-                "email_username": {
-                    "description": "SMTP username",
-                    "type": "str",
-                },
-                "email_password": {
-                    "description": "SMTP password",
-                    "type": "str",
-                    "sensitive": True,
-                },
-                "email_from": {
-                    "description": "From email address",
-                    "type": "str",
-                },
-                "email_use_tls": {
-                    "description": "Use TLS encryption",
-                    "type": "bool",
-                    "default": True,
-                },
-                "email_use_ssl": {
-                    "description": "Use SSL encryption",
-                    "type": "bool",
-                    "default": False,
-                },
-            },
-        },
-        "authentik": {
-            "title": "Authentik SSO",
-            "toggle": "authentik_enabled",
-            "description": "Integrate with Authentik for Single Sign-On authentication.",
-            "vars": {
-                "authentik_enabled": {
-                    "description": "Enable Authentik SSO integration",
-                    "type": "bool",
-                    "default": False,
-                },
-                "authentik_url": {
-                    "description": "Authentik base URL (e.g., https://auth.example.com)",
-                    "type": "str",
-                },
-                "authentik_slug": {
-                    "description": "Authentik application slug",
-                    "type": "str",
-                },
-                "authentik_client_id": {
-                    "description": "OAuth client ID from Authentik provider",
-                    "type": "str",
-                },
-                "authentik_client_secret": {
-                    "description": "OAuth client secret from Authentik provider",
-                    "type": "str",
-                    "sensitive": True,
-                },
-            },
-        },
-    }
-)

+ 252 - 0
cli/modules/compose/validate.py

@@ -0,0 +1,252 @@
+"""Docker Compose validation functionality."""
+
+import logging
+import subprocess
+import tempfile
+from pathlib import Path
+
+from typer import Exit
+
+from ...core.template import Template
+
+logger = logging.getLogger(__name__)
+
+
+def run_docker_validation(
+    module_instance,
+    template_id: str | None,
+    path: str | None,
+    test_all: bool,
+    verbose: bool,
+) -> None:
+    """Run Docker Compose validation using docker compose config.
+
+    Args:
+        module_instance: The module instance (for display and template loading)
+        template_id: Template ID to validate
+        path: Path to template directory
+        test_all: Test all variable combinations
+        verbose: Show detailed output
+
+    Raises:
+        Exit: If validation fails or docker is not available
+    """
+    try:
+        # Load the template
+        if path:
+            template_path = Path(path).resolve()
+            template = Template(template_path, library_name="local")
+        else:
+            template = module_instance._load_template_by_id(template_id)
+
+        module_instance.display.info("")
+        module_instance.display.info("Running Docker Compose validation...")
+
+        # Test multiple combinations or single configuration
+        if test_all:
+            _test_variable_combinations(module_instance, template, verbose)
+        else:
+            # Single configuration with template defaults
+            success = _validate_compose_files(
+                module_instance, template, template.variables, verbose, "Template defaults"
+            )
+            if success:
+                module_instance.display.success("Docker Compose validation passed")
+            else:
+                module_instance.display.error("Docker Compose validation failed")
+                raise Exit(code=1) from None
+
+    except FileNotFoundError as e:
+        module_instance.display.error(
+            "Docker Compose CLI not found",
+            context="Install Docker Desktop or Docker Engine with Compose plugin",
+        )
+        raise Exit(code=1) from e
+    except Exception as e:
+        module_instance.display.error(f"Docker validation failed: {e}")
+        raise Exit(code=1) from e
+
+
+def _validate_compose_files(module_instance, template, variables, verbose: bool, config_name: str) -> bool:
+    """Validate rendered compose files using docker compose config.
+
+    Args:
+        module_instance: The module instance
+        template: The template object
+        variables: VariableCollection with configured values
+        verbose: Show detailed output
+        config_name: Name of this configuration (for display)
+
+    Returns:
+        True if validation passed, False otherwise
+    """
+    try:
+        # Render the template
+        debug_mode = logger.isEnabledFor(logging.DEBUG)
+        rendered_files, _ = template.render(variables, debug=debug_mode)
+
+        # Find compose files
+        compose_files = [
+            (filename, content)
+            for filename, content in rendered_files.items()
+            if filename.endswith(("compose.yaml", "compose.yml", "docker-compose.yaml", "docker-compose.yml"))
+        ]
+
+        if not compose_files:
+            module_instance.display.warning(f"[{config_name}] No Docker Compose files found")
+            return True
+
+        # Validate each compose file
+        has_errors = False
+        for filename, content in compose_files:
+            if verbose:
+                module_instance.display.info(f"[{config_name}] Validating: {filename}")
+
+            # Write to temporary file
+            with tempfile.NamedTemporaryFile(mode="w", suffix=".yaml", delete=False) as tmp_file:
+                tmp_file.write(content)
+                tmp_path = tmp_file.name
+
+            try:
+                # Run docker compose config
+                result = subprocess.run(
+                    ["docker", "compose", "-f", tmp_path, "config", "--quiet"],
+                    capture_output=True,
+                    text=True,
+                    check=False,
+                )
+
+                if result.returncode != 0:
+                    has_errors = True
+                    module_instance.display.error(f"[{config_name}] Docker validation failed for {filename}")
+                    if result.stderr:
+                        module_instance.display.info(f"\n{result.stderr}")
+                elif verbose:
+                    module_instance.display.success(f"[{config_name}] Docker validation passed: {filename}")
+
+            finally:
+                # Clean up temporary file
+                Path(tmp_path).unlink(missing_ok=True)
+
+        return not has_errors
+
+    except Exception as e:
+        module_instance.display.error(f"[{config_name}] Validation failed: {e}")
+        return False
+
+
+def _test_variable_combinations(module_instance, template, verbose: bool) -> None:
+    """Test multiple variable combinations intelligently.
+
+    Tests:
+    1. Minimal config (all toggles OFF)
+    2. Maximal config (all toggles ON)
+    3. Each toggle individually ON (to isolate toggle-specific issues)
+
+    Args:
+        module_instance: The module instance
+        template: The template object
+        verbose: Show detailed output
+
+    Raises:
+        Exit: If any validation fails
+    """
+    module_instance.display.info("Testing multiple variable combinations...")
+    module_instance.display.info("")
+
+    # Find all boolean toggle variables
+    toggle_vars = _find_toggle_variables(template)
+
+    if not toggle_vars:
+        module_instance.display.warning("No toggle variables found - testing default configuration only")
+        success = _validate_compose_files(module_instance, template, template.variables, verbose, "Default")
+        if not success:
+            raise Exit(code=1) from None
+        module_instance.display.success("Docker Compose validation passed")
+        return
+
+    module_instance.display.info(f"Found {len(toggle_vars)} toggle variable(s): {', '.join(toggle_vars)}")
+    module_instance.display.info("")
+
+    all_passed = True
+    test_count = 0
+
+    # Test 1: Minimal (all OFF)
+    module_instance.display.info("[1/3] Testing minimal configuration (all toggles OFF)...")
+    toggle_config = dict.fromkeys(toggle_vars, False)
+    variables = _get_variables_with_toggles(module_instance, template, toggle_config)
+    if not _validate_compose_files(module_instance, template, variables, verbose, "Minimal"):
+        all_passed = False
+    test_count += 1
+    module_instance.display.info("")
+
+    # Test 2: Maximal (all ON)
+    module_instance.display.info("[2/3] Testing maximal configuration (all toggles ON)...")
+    toggle_config = dict.fromkeys(toggle_vars, True)
+    variables = _get_variables_with_toggles(module_instance, template, toggle_config)
+    if not _validate_compose_files(module_instance, template, variables, verbose, "Maximal"):
+        all_passed = False
+    test_count += 1
+    module_instance.display.info("")
+
+    # Test 3: Each toggle individually
+    module_instance.display.info(f"[3/3] Testing each toggle individually ({len(toggle_vars)} tests)...")
+    for i, toggle in enumerate(toggle_vars, 1):
+        # Set all OFF except the current one
+        toggle_config = {t: t == toggle for t in toggle_vars}
+        variables = _get_variables_with_toggles(module_instance, template, toggle_config)
+        config_name = f"{toggle}=true"
+        if not _validate_compose_files(module_instance, template, variables, verbose, config_name):
+            all_passed = False
+        test_count += 1
+        if verbose and i < len(toggle_vars):
+            module_instance.display.info("")
+
+    # Summary
+    module_instance.display.info("")
+    module_instance.display.info("─" * 80)
+    if all_passed:
+        module_instance.display.success(f"All {test_count} configuration(s) passed Docker Compose validation")
+    else:
+        module_instance.display.error("Some configurations failed Docker Compose validation")
+        raise Exit(code=1) from None
+
+
+def _find_toggle_variables(template) -> list[str]:
+    """Find all boolean toggle variables in a template.
+
+    Args:
+        template: The template object
+
+    Returns:
+        List of toggle variable names
+    """
+    toggle_vars = []
+    for var_name, var in template.variables._variable_map.items():
+        if var.type == "bool" and var_name.endswith("_enabled"):
+            toggle_vars.append(var_name)
+    return sorted(toggle_vars)
+
+
+def _get_variables_with_toggles(module_instance, template, toggle_config: dict[str, bool]):  # noqa: ARG001
+    """Get VariableCollection with specific toggle settings.
+
+    Args:
+        module_instance: The module instance (unused, for signature consistency)
+        template: The template object
+        toggle_config: Dict mapping toggle names to boolean values
+
+    Returns:
+        VariableCollection with configured toggle values
+    """
+    # Reload template to get fresh VariableCollection
+    # (template.variables is mutated by previous calls)
+    fresh_template = Template(template.template_dir, library_name=template.metadata.library)
+    variables = fresh_template.variables
+
+    # Apply toggle configuration
+    for toggle_name, toggle_value in toggle_config.items():
+        if toggle_name in variables._variable_map:
+            variables._variable_map[toggle_name].value = toggle_value
+
+    return variables
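The test plan in `_test_variable_combinations` (all toggles OFF, all ON, then each toggle individually) amounts to building N+2 named toggle maps. A standalone sketch of that matrix, with an illustrative function name not present in the module:

```python
def build_toggle_matrix(toggle_vars: list[str]) -> list[tuple[str, dict[str, bool]]]:
    """Return (config_name, toggle_config) pairs mirroring the validation plan:
    minimal (all OFF), maximal (all ON), then each toggle enabled on its own."""
    matrix = [
        ("Minimal", dict.fromkeys(toggle_vars, False)),
        ("Maximal", dict.fromkeys(toggle_vars, True)),
    ]
    for toggle in toggle_vars:
        # Isolate one toggle to pinpoint toggle-specific template breakage
        matrix.append((f"{toggle}=true", {t: t == toggle for t in toggle_vars}))
    return matrix


for name, config in build_toggle_matrix(["traefik_enabled", "swarm_enabled"]):
    print(name, config)
```

With N toggles this yields N + 2 configurations, which keeps the run linear in N rather than testing all 2^N combinations.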

+ 87 - 0
cli/modules/helm/__init__.py

@@ -0,0 +1,87 @@
+"""Helm module with multi-schema support."""
+
+import logging
+from collections import OrderedDict
+
+from ...core.module import Module
+from ...core.registry import registry
+from ...core.schema import has_schema, list_versions, load_schema
+
+logger = logging.getLogger(__name__)
+
+
+def _load_json_spec_as_dict(version: str) -> OrderedDict:
+    """Load JSON schema and convert to dict format for backward compatibility.
+
+    Args:
+        version: Schema version
+
+    Returns:
+        OrderedDict in the same format as Python specs
+    """
+    logger.debug(f"Loading helm schema {version} from JSON")
+    json_spec = load_schema("helm", version)
+
+    # Convert JSON array format to OrderedDict format
+    spec_dict = OrderedDict()
+    for section_data in json_spec:
+        section_key = section_data["key"]
+
+        # Build section dict
+        section_dict = {}
+        if "title" in section_data:
+            section_dict["title"] = section_data["title"]
+        if "description" in section_data:
+            section_dict["description"] = section_data["description"]
+        if "toggle" in section_data:
+            section_dict["toggle"] = section_data["toggle"]
+        if "required" in section_data:
+            section_dict["required"] = section_data["required"]
+        if "needs" in section_data:
+            section_dict["needs"] = section_data["needs"]
+
+        # Convert vars array to dict
+        vars_dict = OrderedDict()
+        for var_data in section_data["vars"]:
+            var_name = var_data["name"]
+            var_dict = {k: v for k, v in var_data.items() if k != "name"}
+            vars_dict[var_name] = var_dict
+
+        section_dict["vars"] = vars_dict
+        spec_dict[section_key] = section_dict
+
+    return spec_dict
+
+
+# Schema version mapping - loads JSON schemas on-demand
+class _SchemaDict(dict):
+    """Dict subclass that loads JSON schemas on-demand."""
+
+    def __getitem__(self, version):
+        if not has_schema("helm", version):
+            raise KeyError(
+                f"Schema version {version} not found for helm module. Available: {', '.join(list_versions('helm'))}"
+            )
+        return _load_json_spec_as_dict(version)
+
+    def __contains__(self, version):
+        return has_schema("helm", version)
+
+
+# Initialize schema dict
+SCHEMAS = _SchemaDict()
+
+# Default spec - load latest version
+spec = _load_json_spec_as_dict("1.0")
+
+
+class HelmModule(Module):
+    """Helm module."""
+
+    name = "helm"
+    description = "Manage Helm configurations"
+    schema_version = "1.0"  # Current schema version supported by this module
+    schemas = SCHEMAS  # Available schema versions
+
+
+registry.register(HelmModule)
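The JSON-array-to-OrderedDict conversion that `_load_json_spec_as_dict` performs (and which the kubernetes, packer, and terraform modules below repeat) can be exercised against a hand-written spec fragment. The section data here is illustrative, not a real schema file:

```python
from collections import OrderedDict

# A made-up fragment in the JSON schema's array format
json_spec = [
    {
        "key": "general",
        "title": "General",
        "vars": [
            {"name": "service_name", "description": "Service name", "type": "str"},
            {"name": "user_uid", "description": "User UID", "type": "int", "default": 1000},
        ],
    },
]

spec_dict = OrderedDict()
for section_data in json_spec:
    # Copy optional section keys only when present, as the loader does
    section_dict = {
        k: section_data[k]
        for k in ("title", "description", "toggle", "required", "needs")
        if k in section_data
    }
    # Vars move from a list of {"name": ..., ...} dicts to a name-keyed mapping
    section_dict["vars"] = OrderedDict(
        (v["name"], {k: val for k, val in v.items() if k != "name"})
        for v in section_data["vars"]
    )
    spec_dict[section_data["key"]] = section_dict

print(spec_dict["general"]["vars"]["user_uid"]["default"])  # 1000
```

The result matches the shape of the removed Python specs above, which is what lets the JSON schemas stay backward compatible.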

+ 88 - 0
cli/modules/kubernetes/__init__.py

@@ -0,0 +1,88 @@
+"""Kubernetes module with multi-schema support."""
+
+import logging
+from collections import OrderedDict
+
+from ...core.module import Module
+from ...core.registry import registry
+from ...core.schema import has_schema, list_versions, load_schema
+
+logger = logging.getLogger(__name__)
+
+
+def _load_json_spec_as_dict(version: str) -> OrderedDict:
+    """Load JSON schema and convert to dict format for backward compatibility.
+
+    Args:
+        version: Schema version
+
+    Returns:
+        OrderedDict in the same format as Python specs
+    """
+    logger.debug(f"Loading kubernetes schema {version} from JSON")
+    json_spec = load_schema("kubernetes", version)
+
+    # Convert JSON array format to OrderedDict format
+    spec_dict = OrderedDict()
+    for section_data in json_spec:
+        section_key = section_data["key"]
+
+        # Build section dict
+        section_dict = {}
+        if "title" in section_data:
+            section_dict["title"] = section_data["title"]
+        if "description" in section_data:
+            section_dict["description"] = section_data["description"]
+        if "toggle" in section_data:
+            section_dict["toggle"] = section_data["toggle"]
+        if "required" in section_data:
+            section_dict["required"] = section_data["required"]
+        if "needs" in section_data:
+            section_dict["needs"] = section_data["needs"]
+
+        # Convert vars array to dict
+        vars_dict = OrderedDict()
+        for var_data in section_data["vars"]:
+            var_name = var_data["name"]
+            var_dict = {k: v for k, v in var_data.items() if k != "name"}
+            vars_dict[var_name] = var_dict
+
+        section_dict["vars"] = vars_dict
+        spec_dict[section_key] = section_dict
+
+    return spec_dict
+
+
+# Schema version mapping - loads JSON schemas on-demand
+class _SchemaDict(dict):
+    """Dict subclass that loads JSON schemas on-demand."""
+
+    def __getitem__(self, version):
+        if not has_schema("kubernetes", version):
+            raise KeyError(
+                f"Schema version {version} not found for kubernetes module. "
+                f"Available: {', '.join(list_versions('kubernetes'))}"
+            )
+        return _load_json_spec_as_dict(version)
+
+    def __contains__(self, version):
+        return has_schema("kubernetes", version)
+
+
+# Initialize schema dict
+SCHEMAS = _SchemaDict()
+
+# Default spec - load latest version
+spec = _load_json_spec_as_dict("1.0")
+
+
+class KubernetesModule(Module):
+    """Kubernetes module."""
+
+    name = "kubernetes"
+    description = "Manage Kubernetes configurations"
+    schema_version = "1.0"  # Current schema version supported by this module
+    schemas = SCHEMAS  # Available schema versions
+
+
+registry.register(KubernetesModule)

+ 87 - 0
cli/modules/packer/__init__.py

@@ -0,0 +1,87 @@
+"""Packer module with multi-schema support."""
+
+import logging
+from collections import OrderedDict
+
+from ...core.module import Module
+from ...core.registry import registry
+from ...core.schema import has_schema, list_versions, load_schema
+
+logger = logging.getLogger(__name__)
+
+
+def _load_json_spec_as_dict(version: str) -> OrderedDict:
+    """Load JSON schema and convert to dict format for backward compatibility.
+
+    Args:
+        version: Schema version
+
+    Returns:
+        OrderedDict in the same format as Python specs
+    """
+    logger.debug(f"Loading packer schema {version} from JSON")
+    json_spec = load_schema("packer", version)
+
+    # Convert JSON array format to OrderedDict format
+    spec_dict = OrderedDict()
+    for section_data in json_spec:
+        section_key = section_data["key"]
+
+        # Build section dict
+        section_dict = {}
+        if "title" in section_data:
+            section_dict["title"] = section_data["title"]
+        if "description" in section_data:
+            section_dict["description"] = section_data["description"]
+        if "toggle" in section_data:
+            section_dict["toggle"] = section_data["toggle"]
+        if "required" in section_data:
+            section_dict["required"] = section_data["required"]
+        if "needs" in section_data:
+            section_dict["needs"] = section_data["needs"]
+
+        # Convert vars array to dict
+        vars_dict = OrderedDict()
+        for var_data in section_data["vars"]:
+            var_name = var_data["name"]
+            var_dict = {k: v for k, v in var_data.items() if k != "name"}
+            vars_dict[var_name] = var_dict
+
+        section_dict["vars"] = vars_dict
+        spec_dict[section_key] = section_dict
+
+    return spec_dict
+
+
+# Schema version mapping - loads JSON schemas on-demand
+class _SchemaDict(dict):
+    """Dict subclass that loads JSON schemas on-demand."""
+
+    def __getitem__(self, version):
+        if not has_schema("packer", version):
+            raise KeyError(
+                f"Schema version {version} not found for packer module. Available: {', '.join(list_versions('packer'))}"
+            )
+        return _load_json_spec_as_dict(version)
+
+    def __contains__(self, version):
+        return has_schema("packer", version)
+
+
+# Initialize schema dict
+SCHEMAS = _SchemaDict()
+
+# Default spec - load latest version
+spec = _load_json_spec_as_dict("1.0")
+
+
+class PackerModule(Module):
+    """Packer module."""
+
+    name = "packer"
+    description = "Manage Packer configurations"
+    schema_version = "1.0"  # Current schema version supported by this module
+    schemas = SCHEMAS  # Available schema versions
+
+
+registry.register(PackerModule)

+ 88 - 0
cli/modules/terraform/__init__.py

@@ -0,0 +1,88 @@
+"""Terraform module with multi-schema support."""
+
+import logging
+from collections import OrderedDict
+
+from ...core.module import Module
+from ...core.registry import registry
+from ...core.schema import has_schema, list_versions, load_schema
+
+logger = logging.getLogger(__name__)
+
+
+def _load_json_spec_as_dict(version: str) -> OrderedDict:
+    """Load JSON schema and convert to dict format for backward compatibility.
+
+    Args:
+        version: Schema version
+
+    Returns:
+        OrderedDict in the same format as Python specs
+    """
+    logger.debug(f"Loading terraform schema {version} from JSON")
+    json_spec = load_schema("terraform", version)
+
+    # Convert JSON array format to OrderedDict format
+    spec_dict = OrderedDict()
+    for section_data in json_spec:
+        section_key = section_data["key"]
+
+        # Build section dict
+        section_dict = {}
+        if "title" in section_data:
+            section_dict["title"] = section_data["title"]
+        if "description" in section_data:
+            section_dict["description"] = section_data["description"]
+        if "toggle" in section_data:
+            section_dict["toggle"] = section_data["toggle"]
+        if "required" in section_data:
+            section_dict["required"] = section_data["required"]
+        if "needs" in section_data:
+            section_dict["needs"] = section_data["needs"]
+
+        # Convert vars array to dict
+        vars_dict = OrderedDict()
+        for var_data in section_data["vars"]:
+            var_name = var_data["name"]
+            var_dict = {k: v for k, v in var_data.items() if k != "name"}
+            vars_dict[var_name] = var_dict
+
+        section_dict["vars"] = vars_dict
+        spec_dict[section_key] = section_dict
+
+    return spec_dict
+
+
+# Schema version mapping - loads JSON schemas on-demand
+class _SchemaDict(dict):
+    """Dict subclass that loads JSON schemas on-demand."""
+
+    def __getitem__(self, version):
+        if not has_schema("terraform", version):
+            raise KeyError(
+                f"Schema version {version} not found for terraform module. "
+                f"Available: {', '.join(list_versions('terraform'))}"
+            )
+        return _load_json_spec_as_dict(version)
+
+    def __contains__(self, version):
+        return has_schema("terraform", version)
+
+
+# Initialize schema dict
+SCHEMAS = _SchemaDict()
+
+# Default spec - load latest version
+spec = _load_json_spec_as_dict("1.0")
+
+
+class TerraformModule(Module):
+    """Terraform module."""
+
+    name = "terraform"
+    description = "Manage Terraform configurations"
+    schema_version = "1.0"  # Current schema version supported by this module
+    schemas = SCHEMAS  # Available schema versions
+
+
+registry.register(TerraformModule)
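Both modules pass a section's `needs` string through unchanged. Per the release notes, the syntax supports multiple AND conditions separated by semicolons, each with a comma-separated list of accepted values, e.g. `traefik_enabled=true;network_mode=bridge,macvlan`. A hedged sketch of an evaluator for that syntax (the function name and the string normalization are illustrative, not the project's actual implementation):

```python
def needs_satisfied(needs: str, values: dict) -> bool:
    """Evaluate a 'needs' expression such as
    'traefik_enabled=true;network_mode=bridge,macvlan'.

    Semicolons separate conditions that must all hold (AND);
    commas list accepted values for one variable (OR).
    """
    for condition in needs.split(";"):
        var, _, allowed = condition.partition("=")
        accepted = [v.strip().lower() for v in allowed.split(",")]
        # Compare case-insensitively so booleans like True match 'true'.
        if str(values.get(var.strip(), "")).lower() not in accepted:
            return False
    return True


# Only satisfied when traefik is enabled AND the network mode matches.
print(needs_satisfied(
    "traefik_enabled=true;network_mode=bridge,macvlan",
    {"traefik_enabled": True, "network_mode": "bridge"},
))  # True
```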

+ 16 - 0
library/ansible/checkmk-install-agent/playbook.yaml.j2

@@ -0,0 +1,16 @@
+---
+- name: Install Checkmk agent on all hosts
+  hosts: {{ target_hosts }}
+  become: true
+  roles:
+    - checkmk.general.agent
+  vars:
+    checkmk_agent_version: "2.4.0p15"
+    checkmk_agent_server: {{ checkmk_server }}
+    checkmk_agent_server_protocol: {{ checkmk_protocol }}
+    checkmk_agent_site: {{ checkmk_site }}
+    checkmk_agent_auto_activate: {{ checkmk_auto_activate }}
+    checkmk_agent_tls: {{ checkmk_tls }}
+    checkmk_agent_user: {{ checkmk_user }}
+    checkmk_agent_pass: {{ checkmk_pass }}
+    checkmk_agent_host_name: {{ checkmk_host }}

+ 59 - 0
library/ansible/checkmk-install-agent/template.yaml

@@ -0,0 +1,59 @@
+---
+kind: ansible
+metadata:
+  name: Install Checkmk Agent
+  description: |-
+    Ansible playbook to install Checkmk monitoring agent on hosts. Uses the checkmk.general.agent role with automatic registration.
+    ## References
+    - **Project**: https://github.com/Checkmk/ansible-collection-checkmk.general
+    - **Documentation**: https://docs.checkmk.com/
+  version: 2.4.0
+  author: Christian Lempa
+  date: "2025-11-11"
+  tags: []
+  icon:
+    provider: selfh
+    id: checkmk
+  draft: false
+  next_steps: ""
+schema: "1.0"
+spec:
+  checkmk:
+    title: Checkmk Configuration
+    vars:
+      checkmk_server:
+        type: str
+        description: Checkmk Server
+        required: true
+      checkmk_protocol:
+        type: str
+        description: Checkmk Server Protocol
+        enum:
+          - http
+          - https
+        default: https
+        required: true
+      checkmk_site:
+        type: str
+        description: Checkmk Site
+        default: cmk
+        required: true
+      checkmk_auto_activate:
+        type: bool
+        description: Auto Activate Agent
+      checkmk_tls:
+        type: bool
+        description: Use TLS for Agent Communication
+      checkmk_user:
+        type: str
+        description: Checkmk Automation User
+        required: true
+      checkmk_pass:
+        type: str
+        description: Checkmk Automation User Password
+        required: true
+        sensitive: true
+      checkmk_host:
+        type: str
+        description: Checkmk Host Name
+        required: true

+ 16 - 0
library/ansible/checkmk-manage-host/playbook.yaml.j2

@@ -0,0 +1,16 @@
+---
+- name: Manage Checkmk host
+  hosts: localhost
+  gather_facts: false
+  tasks:
+    - name: "Create or update host in Checkmk"
+      checkmk.general.host:
+        server_url: "{{ checkmk_protocol }}://{{ checkmk_server }}"
+        site: {{ checkmk_site }}
+        automation_user: {{ checkmk_user }}
+        automation_secret: {{ checkmk_pass }}
+        name: {{ host_name }}
+        attributes:
+          ipaddress: {{ host_ip }}
+        folder: {{ host_folder }}
+        state: "present"

+ 65 - 0
library/ansible/checkmk-manage-host/template.yaml

@@ -0,0 +1,65 @@
+---
+kind: ansible
+metadata:
+  name: Manage Checkmk Host
+  description: |-
+    Ansible playbook to manage hosts in Checkmk monitoring. Uses the checkmk.general.host module to create or update host configuration.
+    ## References
+    - **Project**: https://github.com/Checkmk/ansible-collection-checkmk.general
+    - **Documentation**: https://docs.checkmk.com/
+  version: 2.4.0
+  author: Christian Lempa
+  date: "2025-11-11"
+  tags: []
+  icon:
+    provider: selfh
+    id: checkmk
+  draft: false
+  next_steps: ""
+schema: "1.0"
+spec:
+  checkmk:
+    title: Checkmk Configuration
+    vars:
+      checkmk_server:
+        type: str
+        description: Checkmk Server
+        required: true
+      checkmk_protocol:
+        type: str
+        description: Checkmk Server Protocol
+        enum:
+          - http
+          - https
+        default: https
+        required: true
+      checkmk_site:
+        type: str
+        description: Checkmk Site
+        default: cmk
+        required: true
+      checkmk_user:
+        type: str
+        description: Checkmk Automation User
+        required: true
+      checkmk_pass:
+        type: str
+        description: Checkmk Automation User Password
+        required: true
+        sensitive: true
+  host:
+    title: Host Configuration
+    vars:
+      host_name:
+        type: str
+        description: Hostname to add to Checkmk
+        required: true
+      host_ip:
+        type: str
+        description: IP address of the host
+        required: true
+      host_folder:
+        type: str
+        description: Folder path in Checkmk
+        default: /
+        required: true

+ 61 - 0
library/ansible/docker-certs-enable/playbook.yaml.j2

@@ -0,0 +1,61 @@
+---
+- name: {{ playbook_name }}
+  hosts: {{ target_hosts }}
+{% if become %}
+  become: true
+{% endif %}
+{% if options_enabled and not gather_facts %}
+  gather_facts: false
+{% endif %}
+{% if secrets_enabled %}
+  vars_files:
+    - {{ secrets_file }}
+{% endif %}
+  vars:
+    certs_path: {{ certs_path }}
+
+  tasks:
+    - name: Check if docker certs exist
+      ansible.builtin.stat:
+        path: "{{ '{{' }} certs_path {{ '}}' }}"
+      register: certs_dir
+
+    - name: Fail if docker certs do not exist
+      ansible.builtin.fail:
+        msg: "Docker certificates do not exist in {{ '{{' }} certs_path {{ '}}' }}."
+      when: not certs_dir.stat.exists
+
+    - name: Gather facts to determine the primary internal ip address
+      ansible.builtin.setup:
+      register: ip_address
+
+    - name: Set machine's primary internal ip address
+      ansible.builtin.set_fact:
+        ip_address: "{{ '{{' }} ip_address.ansible_facts.ansible_default_ipv4.address {{ '}}' }}"
+
+    - name: Check if ip_address is a valid ip address
+      ansible.builtin.assert:
+        that:
+          - ip_address is match("^(?:[0-9]{1,3}\\.){3}[0-9]{1,3}$")
+        fail_msg: "ip_address is not a valid ip address."
+        success_msg: "ip_address is a valid ip address."
+
+    - name: Change docker daemon to use certs
+      ansible.builtin.lineinfile:
+        path: /lib/systemd/system/docker.service
+        line: >
+          ExecStart=/usr/bin/dockerd -H fd:// --containerd=/run/containerd/containerd.sock
+          -H tcp://{{ '{{' }} ip_address {{ '}}' }}:2376 --tlsverify --tlscacert={{ '{{' }} certs_path {{ '}}' }}/ca.pem
+          --tlscert={{ '{{' }} certs_path {{ '}}' }}/server-cert.pem --tlskey={{ '{{' }} certs_path {{ '}}' }}/server-key.pem
+        regexp: '^ExecStart='
+        state: present
+
+    - name: Reload systemd daemon
+      ansible.builtin.systemd:
+        daemon_reload: true
+
+    - name: Restart docker daemon
+      ansible.builtin.systemd:
+        name: docker
+        state: restarted
+        enabled: true
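These playbook templates are themselves Jinja2 files, so Ansible expressions that must survive rendering are escaped as `{{ '{{' }} … {{ '}}' }}`: the inner string literals emit literal braces, leaving `{{ … }}` for Ansible to evaluate at run time. A small sketch of the idea, assuming the project renders with stock Jinja2 (the sample strings are illustrative):

```python
from jinja2 import Template

# Template-time variable ({{ mode }}): substituted when the playbook is generated.
# Escaped braces: passed through so Ansible evaluates them later.
source = "path: \"{{ '{{' }} certs_path {{ '}}' }}\"\nmode: '{{ mode }}'"

rendered = Template(source).render(mode="0700")
print(rendered)
# path: "{{ certs_path }}"
# mode: '0700'
```

Note that the rendered value starts with `{{`, which is why it must be quoted to stay valid YAML for Ansible.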

+ 35 - 0
library/ansible/docker-certs-enable/template.yaml

@@ -0,0 +1,35 @@
+---
+kind: ansible
+schema: "1.0"
+metadata:
+  icon:
+    provider: selfh
+    id: docker
+  name: Enable Docker TLS
+  description: >
+    Ansible playbook to enable TLS on Docker daemon using existing certificates.
+    Configures Docker to use TLS for secure remote access.
+
+
+    Project: https://www.docker.com
+
+    Documentation: https://docs.docker.com/engine/security/protect-access/
+  version: 1.0.0
+  author: Christian Lempa
+  date: '2025-11-11'
+  draft: true
+spec:
+  general:
+    vars:
+      playbook_name:
+        default: Docker Certs enable
+      become:
+        default: true
+  certificates:
+    title: Certificate Configuration
+    required: true
+    vars:
+      certs_path:
+        type: str
+        description: Path where certificates are stored
+        default: /root/docker-certs

+ 167 - 0
library/ansible/docker-certs/playbook.yaml.j2

@@ -0,0 +1,167 @@
+---
+- name: {{ playbook_name }}
+  hosts: {{ target_hosts }}
+{% if become %}
+  become: true
+{% endif %}
+{% if options_enabled and not gather_facts %}
+  gather_facts: false
+{% endif %}
+{% if secrets_enabled %}
+  vars_files:
+    - {{ secrets_file }}
+{% endif %}
+  vars:
+    certs_path: {{ certs_path }}
+    cert_validity_days: {{ cert_validity_days }}
+    cn_domain: {{ cn_domain }}
+
+  tasks:
+    - name: Check if docker certs exist
+      ansible.builtin.stat:
+        path: "{{ '{{' }} certs_path {{ '}}' }}"
+      register: certs_dir
+
+    - name: Create docker certs directory (if needed)
+      ansible.builtin.file:
+        path: "{{ '{{' }} certs_path {{ '}}' }}"
+        state: directory
+        mode: '0700'
+      when: not certs_dir.stat.exists
+
+    - name: Check if docker certs directory is empty
+      ansible.builtin.command: ls -A {{ '{{' }} certs_path {{ '}}' }}
+      register: certs_list
+      when: certs_dir.stat.exists
+      changed_when: false
+      ignore_errors: true
+
+    - name: Fail if docker certs already exist
+      ansible.builtin.fail:
+        msg: "Docker certificates already exist in {{ '{{' }} certs_path {{ '}}' }}."
+      when: certs_list.stdout | default('') != ''
+
+    - name: Gather facts to determine the primary internal ip address
+      ansible.builtin.setup:
+      register: ip_address
+
+    - name: Set machine's primary internal ip address
+      ansible.builtin.set_fact:
+        ip_address: "{{ '{{' }} ip_address.ansible_facts.ansible_default_ipv4.address {{ '}}' }}"
+
+    - name: Check if ip_address is a valid ip address
+      ansible.builtin.assert:
+        that:
+          - ip_address is match("^(?:[0-9]{1,3}\\.){3}[0-9]{1,3}$")
+        fail_msg: "ip_address is not a valid ip address."
+        success_msg: "ip_address is a valid ip address."
+
+    - name: Generate CA private key
+      ansible.builtin.command:
+        cmd: >
+          openssl genrsa -out {{ '{{' }} certs_path {{ '}}' }}/ca-key.pem 4096
+      args:
+        creates: "{{ '{{' }} certs_path {{ '}}' }}/ca-key.pem"
+
+    - name: Generate CA certificate
+      ansible.builtin.command:
+        cmd: >
+          openssl req -sha256 -new -x509
+            -subj "/CN={{ '{{' }} cn_domain {{ '}}' }}"
+            -days {{ '{{' }} cert_validity_days {{ '}}' }}
+            -key {{ '{{' }} certs_path {{ '}}' }}/ca-key.pem
+            -out {{ '{{' }} certs_path {{ '}}' }}/ca.pem
+      args:
+        creates: "{{ '{{' }} certs_path {{ '}}' }}/ca.pem"
+
+    - name: Generate server private key
+      ansible.builtin.command:
+        cmd: >
+          openssl genrsa -out {{ '{{' }} certs_path {{ '}}' }}/server-key.pem 4096
+        creates: "{{ '{{' }} certs_path {{ '}}' }}/server-key.pem"
+
+    - name: Generate server certificate signing request
+      ansible.builtin.command:
+        cmd: >
+          openssl req -sha256 -new
+            -subj "/CN={{ '{{' }} inventory_hostname {{ '}}' }}"
+            -key {{ '{{' }} certs_path {{ '}}' }}/server-key.pem
+            -out {{ '{{' }} certs_path {{ '}}' }}/server.csr
+        creates: "{{ '{{' }} certs_path {{ '}}' }}/server.csr"
+
+    - name: Generate server certificate extension file
+      ansible.builtin.shell: |
+        echo "subjectAltName = DNS:{{ '{{' }} inventory_hostname {{ '}}' }},IP:{{ '{{' }} ip_address {{ '}}' }},IP:127.0.0.1" >> {{ '{{' }} certs_path {{ '}}' }}/extfile.cnf
+        echo "extendedKeyUsage = serverAuth" >> {{ '{{' }} certs_path {{ '}}' }}/extfile.cnf
+      args:
+        creates: "{{ '{{' }} certs_path {{ '}}' }}/extfile.cnf"
+
+    - name: Generate server certificate
+      ansible.builtin.command:
+        cmd: >
+          openssl x509 -req -days {{ '{{' }} cert_validity_days {{ '}}' }} -sha256
+            -in {{ '{{' }} certs_path {{ '}}' }}/server.csr
+            -CA {{ '{{' }} certs_path {{ '}}' }}/ca.pem
+            -CAkey {{ '{{' }} certs_path {{ '}}' }}/ca-key.pem
+            -CAcreateserial -out {{ '{{' }} certs_path {{ '}}' }}/server-cert.pem
+            -extfile {{ '{{' }} certs_path {{ '}}' }}/extfile.cnf
+        creates: "{{ '{{' }} certs_path {{ '}}' }}/server-cert.pem"
+
+    - name: Generate client private key
+      ansible.builtin.command:
+        cmd: >
+          openssl genrsa -out {{ '{{' }} certs_path {{ '}}' }}/key.pem 4096
+        creates: "{{ '{{' }} certs_path {{ '}}' }}/key.pem"
+
+    - name: Generate client certificate signing request
+      ansible.builtin.command:
+        cmd: >
+          openssl req -sha256 -new
+            -subj "/CN=client"
+            -key {{ '{{' }} certs_path {{ '}}' }}/key.pem
+            -out {{ '{{' }} certs_path {{ '}}' }}/client.csr
+        creates: "{{ '{{' }} certs_path {{ '}}' }}/client.csr"
+
+    - name: Generate client certificate extension file
+      ansible.builtin.shell: |
+        echo "extendedKeyUsage = clientAuth" >> {{ '{{' }} certs_path {{ '}}' }}/client-extfile.cnf
+      args:
+        creates: "{{ '{{' }} certs_path {{ '}}' }}/client-extfile.cnf"
+
+    - name: Generate client certificate
+      ansible.builtin.command:
+        cmd: >
+          openssl x509 -req -days {{ '{{' }} cert_validity_days {{ '}}' }}
+            -sha256 -in {{ '{{' }} certs_path {{ '}}' }}/client.csr
+            -CA {{ '{{' }} certs_path {{ '}}' }}/ca.pem
+            -CAkey {{ '{{' }} certs_path {{ '}}' }}/ca-key.pem
+            -CAcreateserial -out {{ '{{' }} certs_path {{ '}}' }}/cert.pem
+            -extfile {{ '{{' }} certs_path {{ '}}' }}/client-extfile.cnf
+        creates: "{{ '{{' }} certs_path {{ '}}' }}/cert.pem"
+
+    - name: Remove server certificate signing request
+      ansible.builtin.file:
+        path: "{{ '{{' }} certs_path {{ '}}' }}/server.csr"
+        state: absent
+
+    - name: Remove client certificate signing request
+      ansible.builtin.file:
+        path: "{{ '{{' }} certs_path {{ '}}' }}/client.csr"
+        state: absent
+
+    - name: Remove server certificate extension file
+      ansible.builtin.file:
+        path: "{{ '{{' }} certs_path {{ '}}' }}/extfile.cnf"
+        state: absent
+
+    - name: Remove client certificate extension file
+      ansible.builtin.file:
+        path: "{{ '{{' }} certs_path {{ '}}' }}/client-extfile.cnf"
+        state: absent
+
+    - name: Set permissions for docker certs
+      ansible.builtin.file:
+        path: "{{ '{{' }} certs_path {{ '}}' }}"
+        mode: '0700'
+        recurse: true
+        follow: true

+ 43 - 0
library/ansible/docker-certs/template.yaml

@@ -0,0 +1,43 @@
+---
+kind: ansible
+schema: "1.0"
+metadata:
+  icon:
+    provider: selfh
+    id: docker
+  name: Generate Docker TLS Certificates
+  description: >
+    Ansible playbook to generate TLS certificates for Docker daemon.
+    Creates CA, server, and client certificates for secure Docker remote access.
+
+
+    Project: https://www.docker.com
+
+    Documentation: https://docs.docker.com/engine/security/protect-access/
+  version: 1.0.0
+  author: Christian Lempa
+  date: '2025-11-11'
+  draft: true
+spec:
+  general:
+    vars:
+      playbook_name:
+        default: Docker Certs
+      become:
+        default: true
+  certificates:
+    title: Certificate Configuration
+    required: true
+    vars:
+      certs_path:
+        type: str
+        description: Path where certificates will be stored
+        default: /root/docker-certs
+      cert_validity_days:
+        type: int
+        description: Certificate validity period in days
+        default: 3650
+      cn_domain:
+        type: hostname
+        description: Common Name (CN) for the CA certificate
+        default: your-domain.tld

+ 44 - 0
library/ansible/docker-install-ubuntu/playbook.yaml.j2

@@ -0,0 +1,44 @@
+---
+- name: {{ playbook_name }}
+  hosts: {{ target_hosts }}
+{% if become %}
+  become: true
+{% endif %}
+{% if options_enabled and not gather_facts %}
+  gather_facts: false
+{% endif %}
+{% if secrets_enabled %}
+  vars_files:
+    - {{ secrets_file }}
+{% endif %}
+
+  tasks:
+    - name: Install docker dependencies
+      ansible.builtin.apt:
+        name:
+          - apt-transport-https
+          - ca-certificates
+          - curl
+          - gnupg-agent
+          - software-properties-common
+        update_cache: true
+
+    - name: Add docker gpg key
+      ansible.builtin.apt_key:
+        url: https://download.docker.com/linux/ubuntu/gpg
+        state: present
+        keyring: /etc/apt/keyrings/docker.gpg
+
+    - name: Add docker repository
+      ansible.builtin.apt_repository:
+        filename: docker
+        repo: deb [arch=amd64 signed-by=/etc/apt/keyrings/docker.gpg] https://download.docker.com/linux/ubuntu {{ '{{' }} ansible_lsb.codename | lower {{ '}}' }} stable
+        state: present
+
+    - name: Install docker engine
+      ansible.builtin.apt:
+        name:
+          - docker-ce
+          - docker-buildx-plugin
+          - docker-compose-plugin
+        update_cache: true

+ 27 - 0
library/ansible/docker-install-ubuntu/template.yaml

@@ -0,0 +1,27 @@
+---
+kind: ansible
+schema: "1.0"
+metadata:
+  icon:
+    provider: selfh
+    id: docker
+  name: Install Docker on Ubuntu
+  description: >
+    Ansible playbook to install Docker Engine on Ubuntu systems.
+    Includes Docker CE, Buildx plugin, and Compose plugin.
+
+
+    Project: https://www.docker.com
+
+    Documentation: https://docs.docker.com/engine/install/ubuntu/
+  version: 27.5.1
+  author: Christian Lempa
+  date: '2025-11-11'
+  draft: true
+spec:
+  general:
+    vars:
+      playbook_name:
+        default: Install docker
+      become:
+        default: true

+ 24 - 0
library/ansible/docker-prune/playbook.yaml.j2

@@ -0,0 +1,24 @@
+---
+- name: {{ playbook_name }}
+  hosts: {{ target_hosts }}
+{% if become %}
+  become: true
+{% endif %}
+{% if options_enabled and not gather_facts %}
+  gather_facts: false
+{% endif %}
+{% if secrets_enabled %}
+  vars_files:
+    - {{ secrets_file }}
+{% endif %}
+
+  tasks:
+    - name: Prune non-dangling images
+      community.docker.docker_prune:
+        containers: false
+        images: true
+        images_filters:
+          dangling: false
+        networks: false
+        volumes: false
+        builder_cache: false

+ 27 - 0
library/ansible/docker-prune/template.yaml

@@ -0,0 +1,27 @@
+---
+kind: ansible
+schema: "1.0"
+metadata:
+  icon:
+    provider: selfh
+    id: docker
+  name: Docker Prune
+  description: >
+    Ansible playbook to clean up Docker resources.
+    Prunes non-dangling images to free up disk space.
+
+
+    Project: https://www.docker.com
+
+    Documentation: https://docs.docker.com/engine/reference/commandline/system_prune/
+  version: 1.0.0
+  author: Christian Lempa
+  date: '2025-11-11'
+  draft: true
+spec:
+  general:
+    vars:
+      playbook_name:
+        default: Clean docker
+      become:
+        default: false

+ 28 - 0
library/ansible/ubuntu-add-sshkey/playbook.yaml.j2

@@ -0,0 +1,28 @@
+---
+- name: {{ playbook_name }}
+  hosts: {{ target_hosts }}
+{% if become %}
+  become: true
+{% endif %}
+{% if options_enabled and not gather_facts %}
+  gather_facts: false
+{% endif %}
+{% if secrets_enabled %}
+  vars_files:
+    - {{ secrets_file }}
+{% endif %}
+
+  tasks:
+    - name: Install public keys
+      ansible.posix.authorized_key:
+        user: "{{ '{{' }} lookup('env', 'USER') {{ '}}' }}"
+        state: present
+        key: "{{ '{{' }} lookup('file', '~/.ssh/id_rsa.pub') {{ '}}' }}"
+
+    - name: Change sudoers file
+      ansible.builtin.lineinfile:
+        path: /etc/sudoers
+        state: present
+        regexp: '^%sudo'
+        line: '%sudo ALL=(ALL) NOPASSWD: ALL'
+        validate: /usr/sbin/visudo -cf %s

+ 27 - 0
library/ansible/ubuntu-add-sshkey/template.yaml

@@ -0,0 +1,27 @@
+---
+kind: ansible
+schema: "1.0"
+metadata:
+  icon:
+    provider: selfh
+    id: ansible
+  name: Add SSH Key and Configure Sudoers
+  description: >
+    Ansible playbook to add SSH public key to authorized_keys.
+    Also configures passwordless sudo for sudo group.
+
+
+    Project: https://www.openssh.com
+
+    Documentation: https://www.openssh.com/manual.html
+  version: 1.0.0
+  author: Christian Lempa
+  date: '2025-11-11'
+  draft: true
+spec:
+  general:
+    vars:
+      playbook_name:
+        default: Add ssh key
+      become:
+        default: true

+ 24 - 0
library/ansible/ubuntu-apt-update/playbook.yaml.j2

@@ -0,0 +1,24 @@
+---
+- name: {{ playbook_name }}
+  hosts: {{ target_hosts }}
+{% if become %}
+  become: true
+{% endif %}
+{% if options_enabled and not gather_facts %}
+  gather_facts: false
+{% endif %}
+{% if secrets_enabled %}
+  vars_files:
+    - {{ secrets_file }}
+{% endif %}
+
+  tasks:
+    - name: Update packages with apt
+      when: ansible_pkg_mgr == 'apt'
+      ansible.builtin.apt:
+        update_cache: true
+
+    - name: Upgrade packages with apt
+      when: ansible_pkg_mgr == 'apt'
+      ansible.builtin.apt:
+        upgrade: dist

+ 29 - 0
library/ansible/ubuntu-apt-update/template.yaml

@@ -0,0 +1,29 @@
+---
+kind: ansible
+schema: "1.0"
+metadata:
+  icon:
+    provider: selfh
+    id: ansible
+  name: Update and Upgrade Ubuntu Packages
+  description: >
+    Ansible playbook to update and upgrade APT packages on Ubuntu systems.
+    Performs apt update and dist-upgrade.
+
+
+    Project: https://ubuntu.com
+
+    Documentation: https://ubuntu.com/server/docs
+  version: 1.0.0
+  author: Christian Lempa
+  date: '2025-11-11'
+  draft: true
+spec:
+  general:
+    vars:
+      playbook_name:
+        default: Update and upgrade apt packages
+      target_hosts:
+        default: all
+      become:
+        default: false

+ 28 - 0
library/ansible/ubuntu-vm-core/playbook.yaml.j2

@@ -0,0 +1,28 @@
+---
+- name: {{ playbook_name }}
+  hosts: {{ target_hosts }}
+{% if become %}
+  become: true
+{% endif %}
+{% if options_enabled and not gather_facts %}
+  gather_facts: false
+{% endif %}
+{% if secrets_enabled %}
+  vars_files:
+    - {{ secrets_file }}
+{% endif %}
+
+  tasks:
+    - name: Install packages
+      ansible.builtin.apt:
+        name:
+          - prometheus-node-exporter
+          - nfs-common
+          - qemu-guest-agent
+        update_cache: true
+
+    - name: Start guest qemu-guest-agent
+      ansible.builtin.service:
+        name: qemu-guest-agent
+        state: started
+        enabled: true

+ 27 - 0
library/ansible/ubuntu-vm-core/template.yaml

@@ -0,0 +1,27 @@
+---
+kind: ansible
+schema: "1.0"
+metadata:
+  icon:
+    provider: selfh
+    id: ansible
+  name: Install Ubuntu VM Core Packages
+  description: >
+    Ansible playbook to install essential packages for Ubuntu virtual machines.
+    Includes Prometheus node exporter, NFS client, and QEMU guest agent.
+
+
+    Project: https://ubuntu.com
+
+    Documentation: https://ubuntu.com/server/docs
+  version: 1.0.0
+  author: Christian Lempa
+  date: '2025-11-11'
+  draft: true
+spec:
+  general:
+    vars:
+      playbook_name:
+        default: Install core packages for virtual machines
+      become:
+        default: true

+ 151 - 0
library/compose/adguardhome/compose.yaml.j2

@@ -0,0 +1,151 @@
+---
+services:
+  {{ service_name }}:
+    image: docker.io/adguard/adguardhome:v0.107.69
+    restart: {{ restart_policy }}
+    {% if network_mode == 'host' %}
+    network_mode: host
+    {% elif network_mode == 'bridge' or network_mode == 'macvlan' or traefik_enabled %}
+    networks:
+      {% if traefik_enabled %}
+      {{ traefik_network }}:
+      {% endif %}
+      {% if network_mode == 'macvlan' %}
+      {{ network_name }}:
+        ipv4_address: {{ network_macvlan_ipv4_address }}
+      {% elif network_mode == 'bridge' %}
+      {{ network_name }}:
+      {% endif %}
+    {% endif %}
+    {#
+      Port mappings (only in bridge mode or default network):
+      - HTTP/HTTPS (80/443) ports are only exposed when Traefik is disabled
+      - Initial setup port 3000 is exposed during first-time setup
+      - DNS and related ports (53, 853, 5443) are always exposed
+      - In host or macvlan mode, ports are bound directly to host network
+    #}
+    {% if network_mode == '' or network_mode == 'bridge' or traefik_enabled %}
+    ports:
+      {% if not traefik_enabled %}
+      - "{{ ports_http }}:80/tcp"
+      - "{{ ports_https }}:443/tcp"
+      {% if initial_setup %}
+      - "{{ ports_initial }}:3000/tcp"
+      {% endif %}
+      {% endif %}
+      - "{{ ports_https }}:443/udp"
+      - "{{ ports_dns }}:53/tcp"
+      - "{{ ports_dns }}:53/udp"
+      - "{{ ports_tls }}:853/tcp"
+      - "{{ ports_dnscrypt }}:5443/tcp"
+      - "{{ ports_dnscrypt }}:5443/udp"
+    {% endif %}
+    volumes:
+      {% if volume_mode == 'mount' %}
+      - {{ volume_mount_path }}/work:/opt/adguardhome/work:rw
+      - {{ volume_mount_path }}/conf:/opt/adguardhome/conf:rw
+      {% else %}
+      - {{ service_name }}_work:/opt/adguardhome/work
+      - {{ service_name }}_conf:/opt/adguardhome/conf
+      {% endif %}
+    cap_add:
+      - NET_ADMIN
+      - NET_BIND_SERVICE
+      - NET_RAW
+    {% if traefik_enabled %}
+    labels:
+      - traefik.enable=true
+      - traefik.docker.network={{ traefik_network }}
+      - traefik.http.services.{{ service_name }}_web.loadBalancer.server.port=80
+      - traefik.http.routers.{{ service_name }}_http.service={{ service_name }}_web
+      - traefik.http.routers.{{ service_name }}_http.rule=Host(`{{ traefik_host }}.{{ traefik_domain }}`)
+      - traefik.http.routers.{{ service_name }}_http.entrypoints=web
+      {% if traefik_tls_enabled %}
+      - traefik.http.routers.{{ service_name }}_https.service={{ service_name }}_web
+      - traefik.http.routers.{{ service_name }}_https.rule=Host(`{{ traefik_host }}.{{ traefik_domain }}`)
+      - traefik.http.routers.{{ service_name }}_https.entrypoints=websecure
+      - traefik.http.routers.{{ service_name }}_https.tls=true
+      - traefik.http.routers.{{ service_name }}_https.tls.certresolver={{ traefik_tls_certresolver }}
+      {% endif %}
+      {#
+        Initial setup routing (port 3000):
+        Routes setup wizard through separate Traefik service.
+        Note: Setup wizard is available at http://<host>.<domain>/setup during initial configuration.
+      #}
+      {% if initial_setup %}
+      - traefik.http.services.{{ service_name }}_setup.loadBalancer.server.port=3000
+      - traefik.http.routers.{{ service_name }}_setup.service={{ service_name }}_setup
+      - traefik.http.routers.{{ service_name }}_setup.rule=Host(`{{ traefik_host }}.{{ traefik_domain }}`) && PathPrefix(`/setup`)
+      - traefik.http.routers.{{ service_name }}_setup.entrypoints=web
+      - traefik.http.middlewares.{{ service_name }}_setup-strip.stripprefix.prefixes=/setup
+      - traefik.http.routers.{{ service_name }}_setup.middlewares={{ service_name }}_setup-strip
+      {% endif %}
+    {% endif %}
+
+{% if network_mode == 'bridge' or network_mode == 'macvlan' or traefik_enabled %}
+{#
+  Network definitions:
+  - 'bridge' mode: creates custom bridge network
+  - 'macvlan' mode: creates macvlan network with static IP assignment
+    (requires manual network creation in Swarm mode)
+  - Swarm overlay: used when swarm_enabled=true with bridge mode
+  - Traefik network: always external (managed separately by Traefik stack)
+  - Default mode (network_mode=''): uses Docker's default bridge (no definition needed)
+  - Host mode: no network definition (container uses host network stack directly)
+#}
+networks:
+  {% if network_mode == 'bridge' or network_mode == 'macvlan' %}
+  {{ network_name }}:
+    {% if network_external %}
+    external: true
+    {% else %}
+    {% if network_mode == 'macvlan' %}
+    driver: macvlan
+    driver_opts:
+      parent: {{ network_macvlan_parent_interface }}
+    ipam:
+      config:
+        - subnet: {{ network_macvlan_subnet }}
+          gateway: {{ network_macvlan_gateway }}
+    name: {{ network_name }}
+    {% elif swarm_enabled %}
+    driver: overlay
+    attachable: true
+    {% else %}
+    driver: bridge
+    {% endif %}
+    {% endif %}
+  {% endif %}
+  {% if traefik_enabled %}
+  {{ traefik_network }}:
+    external: true
+  {% endif %}
+{% endif %}
+
+{% if volume_mode == 'local' %}
+{#
+  Volume definitions:
+  - 'local' mode: Docker-managed local volumes
+  - 'nfs' mode: NFS-backed volumes for shared storage
+  - 'mount' mode: bind mounts (no volume definition needed)
+#}
+volumes:
+  {{ service_name }}_work:
+    driver: local
+  {{ service_name }}_conf:
+    driver: local
+{% elif volume_mode == 'nfs' %}
+volumes:
+  {{ service_name }}_work:
+    driver: local
+    driver_opts:
+      type: nfs
+      o: addr={{ volume_nfs_server }},nfsvers=4,{{ volume_nfs_options }}
+      device: ":{{ volume_nfs_path }}/work"
+  {{ service_name }}_conf:
+    driver: local
+    driver_opts:
+      type: nfs
+      o: addr={{ volume_nfs_server }},nfsvers=4,{{ volume_nfs_options }}
+      device: ":{{ volume_nfs_path }}/conf"
+{% endif %}

+ 82 - 0
library/compose/adguardhome/template.yaml

@@ -0,0 +1,82 @@
+---
+kind: compose
+metadata:
+  name: AdGuard Home
+  description: |-
+    Network-wide software for blocking ads and tracking. AdGuard Home operates as a DNS server that
+    re-routes tracking domains to a "black hole", thus preventing your devices from connecting to those servers.
+    It features advanced DNS filtering, parental controls, safe browsing, and HTTPS/DNS-over-TLS/DNS-over-QUIC support.
+    ## Prerequisites
+    - :info: During the initial setup, AdGuard Home runs an HTTP server on port 3000 to guide you through configuration.
+    After completing the setup, AdGuard Home switches to the configured HTTP port and port 3000 becomes inactive;
+    consider re-deploying the service with `initial_setup=false`.
+    - :warning: If you require DHCP functionality or want AdGuard Home to bind directly to port 53,
+    you must set `network_mode` to `host` or `macvlan`. Note that host mode exposes all container ports directly on the host.
+    You can't use `traefik_enabled` in this case!
+    ## References
+    - **Project:** https://adguard.com/adguard-home/overview.html
+    - **Documentation:** https://github.com/AdguardTeam/AdGuardHome/wiki
+    - **GitHub:** https://github.com/AdguardTeam/AdGuardHome
+  icon:
+    provider: selfh
+    id: adguard-home
+  version: v0.107.69
+  author: Christian Lempa
+  date: '2025-11-13'
+  tags:
+    - traefik
+    - network
+    - volume
+  next_steps:
+  draft: true
+schema: 1.2
+spec:
+  general:
+    vars:
+      service_name:
+        default: "adguardhome"
+      initial_setup:
+        description: "Enable initial setup wizard on port 3000"
+        type: bool
+        default: true
+        extra: >
+          Port 3000 is only used during the initial setup wizard.
+          After completing setup, AdGuard Home switches to the configured HTTP port and port 3000 becomes inactive.
+          Set to false if you've already completed the initial setup.
+  traefik:
+    vars:
+      traefik_host:
+        default: "adguardhome"
+  network:
+    vars:
+      network_mode:
+        extra: >
+          Use 'host' mode if you need DHCP functionality or want AdGuard Home to bind directly to port 53.
+          NOTE: Swarm only supports 'bridge' mode!
+      network_name:
+        default: "adguardhome_network"
+  ports:
+    vars:
+      ports_http:
+        default: 80
+      ports_https:
+        default: 443
+      ports_initial:
+        description: "Initial setup wizard port"
+        type: int
+        default: 3000
+        needs: ["traefik_enabled=false", "initial_setup=true"]
+        extra: >
+          Only used during first-time setup. After configuration, the port becomes inactive.
+      ports_dns:
+        description: "DNS port"
+        type: int
+        default: 53
+      ports_tls:
+        description: "DNS over TLS Port"
+        type: int
+        default: 853
+      ports_dnscrypt:
+        description: "DNSCrypt Port"
+        type: int
+        default: 5443

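The `ports_initial` variable above is gated by `needs: ["traefik_enabled=false", "initial_setup=true"]`. The actual evaluator lives in the core and is not shown in this diff; the following is a hedged sketch of the semantics described in the release notes, assuming list entries are ANDed and a comma-separated right-hand side (e.g. `network_mode=bridge,macvlan`) ORs the allowed values:

```python
# Sketch of a `needs` evaluator: every condition in the list must hold (AND),
# and each condition's right-hand side may list alternatives (OR).
# `needs_satisfied` is a hypothetical helper, not the project's real API.
def needs_satisfied(needs, values):
    for cond in needs:
        var, _, allowed = cond.partition("=")
        choices = [v.strip().lower() for v in allowed.split(",")]
        if str(values.get(var)).lower() not in choices:
            return False
    return True

# ports_initial is only prompted when Traefik is off AND the setup wizard is on:
print(needs_satisfied(
    ["traefik_enabled=false", "initial_setup=true"],
    {"traefik_enabled": False, "initial_setup": True},
))  # True

# OR semantics on the right-hand side:
print(needs_satisfied(["network_mode=bridge,macvlan"], {"network_mode": "macvlan"}))  # True
```

This matches the behavior described in the release notes: variables with unsatisfied needs are skipped during prompting, and setting them via config or CLI triggers a warning.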
+ 29 - 41
library/compose/alloy/compose.yaml.j2

@@ -1,20 +1,21 @@
 services:
-  {{ service_name | default("alloy") }}:
-    image: grafana/alloy:v1.11.3
-    container_name: {{ container_name | default("alloy") }}
+  {{ service_name }}:
+    image: docker.io/grafana/alloy:v1.12.0
+    restart: {{ restart_policy }}
+    {% if container_hostname %}
     hostname: {{ container_hostname }}
-    command:
-      - run
-      - --server.http.listen-addr=0.0.0.0:{{ ports_main | default("12345") }}
-      - --storage.path=/var/lib/alloy/data
-      - /etc/alloy/config.alloy
-    {% if ports_enabled %}
+    {% endif %}
+    {% if traefik_enabled %}
+    networks:
+      {{ traefik_network }}:
+    {% endif %}
+    {% if not traefik_enabled %}
     ports:
-      - "{{ ports_main | default("12345") }}:12345"
+      - "{{ ports_webui }}:12345"
     {% endif %}
     volumes:
-      - ./config/config.alloy:/etc/alloy/config.alloy
-      - alloy_data:/var/lib/alloy/data
+      - {{ service_name }}_data:/alloy/data
+      - ./config.alloy:/etc/alloy/config.alloy:ro
       {% if logs_enabled or metrics_enabled %}
       - /:/rootfs:ro
       - /sys:/sys:ro
@@ -29,42 +30,29 @@ services:
       {% if metrics_enabled and metrics_system %}
       - /run/udev/data:/run/udev/data:ro
       {% endif %}
-    {% if network_enabled %}
-    networks:
-      - {{ network_name | default("bridge") }}
-    {% endif %}
     {% if traefik_enabled %}
     labels:
       - traefik.enable=true
-      - traefik.docker.network={{ traefik_network | default("traefik") }}
-      - traefik.http.services.{{ service_name | default("alloy") }}.loadbalancer.server.port=12345
-      - traefik.http.services.{{ service_name | default("alloy") }}.loadbalancer.server.scheme=http
-      - traefik.http.routers.{{ service_name | default("alloy") }}-http.service={{ service_name | default("alloy") }}
-      - traefik.http.routers.{{ service_name | default("alloy") }}-http.rule=Host(`{{ traefik_host }}`)
-      - traefik.http.routers.{{ service_name | default("alloy") }}-http.entrypoints={{ traefik_entrypoint | default("web") }}
+      - traefik.docker.network={{ traefik_network }}
+      - traefik.http.services.{{ service_name }}-web.loadBalancer.server.port=12345
+      - traefik.http.routers.{{ service_name }}-http.service={{ service_name }}-web
+      - traefik.http.routers.{{ service_name }}-http.rule=Host(`{{ traefik_host }}.{{ traefik_domain }}`)
+      - traefik.http.routers.{{ service_name }}-http.entrypoints=web
       {% if traefik_tls_enabled %}
-      - traefik.http.routers.{{ service_name | default("alloy") }}-https.service={{ service_name | default("alloy") }}
-      - traefik.http.routers.{{ service_name | default("alloy") }}-https.rule=Host(`{{ traefik_host }}`)
-      - traefik.http.routers.{{ service_name | default("alloy") }}-https.entrypoints={{ traefik_tls_entrypoint | default("websecure") }}
-      - traefik.http.routers.{{ service_name | default("alloy") }}-https.tls=true
-      - traefik.http.routers.{{ service_name | default("alloy") }}-https.tls.certresolver={{ traefik_tls_certresolver }}
+      - traefik.http.routers.{{ service_name }}-https.service={{ service_name }}-web
+      - traefik.http.routers.{{ service_name }}-https.rule=Host(`{{ traefik_host }}.{{ traefik_domain }}`)
+      - traefik.http.routers.{{ service_name }}-https.entrypoints=websecure
+      - traefik.http.routers.{{ service_name }}-https.tls=true
+      - traefik.http.routers.{{ service_name }}-https.tls.certresolver={{ traefik_tls_certresolver }}
       {% endif %}
     {% endif %}
-    restart: {{ restart_policy | default("unless-stopped") }}
-
-volumes:
-  alloy_data:
-    driver: local
 
-{% if network_enabled or traefik_enabled %}
+{% if traefik_enabled %}
 networks:
-  {% if network_enabled %}
-  {{ network_name | default("bridge") }}:
-    {% if network_external %}
-    external: true
-    {% endif %}
-  {% elif traefik_enabled %}
-  {{ traefik_network | default("traefik") }}:
+  {{ traefik_network }}:
     external: true
-  {% endif %}
 {% endif %}
+
+volumes:
+  {{ service_name }}_data:
+    driver: local

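The alloy diff above strips every `| default(...)` filter, in line with the removal of Jinja2 default() extraction noted in the release. One practical consequence: templates can now be rendered strictly, so a missing variable fails fast instead of silently falling back. A sketch, assuming the renderer opts into `StrictUndefined` (whether the project does so is not shown in this diff):

```python
# With default() filters removed, StrictUndefined turns a missing variable
# into an immediate error rather than an empty or defaulted value.
from jinja2 import Environment, StrictUndefined, UndefinedError

env = Environment(undefined=StrictUndefined)
tmpl = env.from_string(
    "image: docker.io/grafana/alloy:v1.12.0\nrestart: {{ restart_policy }}"
)

ok = tmpl.render(restart_policy="unless-stopped")  # renders fine
print(ok)

try:
    tmpl.render()  # restart_policy missing -> UndefinedError
    failed = False
except UndefinedError:
    failed = True
print("missing variable detected:", failed)
```

This is why the diff can drop lines like `restart: {{ restart_policy | default("unless-stopped") }}`: the default now lives in the template spec, and the renderer no longer needs an inline fallback.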
Some files were not shown because too many files have changed