nbkelley / homelab

PowerShell Local PC Scripting Patterns

PowerShell Local PC Scripting Patterns#

What Was Established#

A collection of PowerShell scripting patterns for local PC automation tasks, including CSV-driven folder renaming and basic system administration.

Key Decisions#

  • CSV-driven automation: Use Import-Csv with Rename-Item for bulk folder renaming driven by spreadsheet data.
  • Path validation: Always Test-Path before operating to avoid errors on missing sources or existing destinations.
  • Error handling: Wrap rename operations in conditional checks; log warnings rather than failing silently.

Folder Renaming from CSV#

CSV format (folder_renaming.csv):
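The CSV sample and rename script were not captured in this note; the following is a minimal sketch, assuming column headers OldName and NewName and a hypothetical base path:

```powershell
# folder_renaming.csv (assumed layout; actual headers not captured):
#   OldName,NewName
#   Smith John,Smith_John

# Hypothetical root containing the folders to rename.
$base = "C:\Data\Folders"

Import-Csv ".\folder_renaming.csv" | ForEach-Object {
    $src = Join-Path $base $_.OldName
    $dst = Join-Path $base $_.NewName
    if (-not (Test-Path $src)) {
        # Log and continue rather than failing silently.
        Write-Warning "Source missing: $src"
    }
    elseif (Test-Path $dst) {
        Write-Warning "Destination already exists: $dst"
    }
    else {
        Rename-Item -Path $src -NewName $_.NewName
    }
}
```

This follows the decisions above: Test-Path guards both source and destination, and failures surface as warnings instead of terminating the run.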

What Was Established#

  • Standardized PowerShell commands for local file operations and MSI package management.
  • A streamlined, linear pattern for automating software installation, execution, and uninstallation on local Windows hosts without over-engineering modular functions.

Key Decisions#

  • Prefer Get-ChildItem with wildcards (C:\Users\*\Downloads\*.msi) for rapid file discovery over manual profile loops.
  • Use Start-Process with msiexec.exe for silent MSI installs (/qn) to keep scripts concise for linear workflows.
  • Simplified error handling and removed verbose logging for basic administrative tasks, reserving complex wrappers for larger automation frameworks.

Current Configuration / Patterns#

File Listing & Discovery#

# List files recursively, sorted newest first
Get-ChildItem -Path "C:\Path" -File -Recurse | Sort-Object LastWriteTime -Descending

# Find first matching file across user profiles
Get-ChildItem "C:\Users\*\Downloads\CleanUpTool.msi" -ErrorAction SilentlyContinue | Select-Object -First 1

File Copying#

# Copy single file, force overwrite
Copy-Item -Path "Source\File.ext" -Destination "Target\" -Force

# Copy entire directory tree
Copy-Item -Path "C:\Source\*" -Destination "D:\Destination\" -Recurse

MSI Install / Run / Uninstall Pattern#

#Requires -RunAsAdministrator

# 1. Find and Install
$cleanupPath = Get-ChildItem "C:\Users\*\Downloads\CleanUpTool.msi" -ErrorAction SilentlyContinue | Select-Object -First 1
if ($cleanupPath) {
    Start-Process "msiexec.exe" -ArgumentList "/i `"$($cleanupPath.FullName)`" /qn" -Wait

    # 2. Run (locate executable in Program Files)
    $exePath = Get-ChildItem "C:\Program Files\", "C:\Program Files (x86)\" -Recurse -Filter "*CleanUpTool*.exe" -ErrorAction SilentlyContinue | Select-Object -First 1
    if ($exePath) { Start-Process $exePath.FullName -Wait }

    # 3. Uninstall
    # Select-Object -First 1 guards against multiple matches; .Uninstall() fails on an array.
    $product = Get-WmiObject -Class Win32_Product | Where-Object { $_.Name -like "*CleanUpTool*" } | Select-Object -First 1
    if ($product) { $product.Uninstall() }
}

# 4. Install Next Package
$foxitPath = Join-Path $PSScriptRoot "Foxit.msi"
if (Test-Path $foxitPath) {
    Start-Process "msiexec.exe" -ArgumentList "/i `"$foxitPath`" /qn" -Wait
}

Historical Notes#

  • Conversation dated 2025-06-11. The simplified linear script approach was explicitly chosen over a modular function-based approach to reduce overhead for routine local PC tasks.
  • Get-WmiObject -Class Win32_Product is a legacy enumeration method. While functional for basic scripts, it is known to trigger package repairs and is slow on large systems. Get-CimInstance -ClassName Win32_Product is the modern cmdlet but queries the same class, so the repair side effect remains; for production use, prefer querying HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall.
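A sketch of the registry-based lookup mentioned above (the DisplayName filter and product-code handling are assumptions; adjust to the actual package):

```powershell
# Look up the product in the Uninstall registry keys (native and 32-bit views);
# this avoids the repair side effect of enumerating Win32_Product.
$keys = 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\*',
        'HKLM:\SOFTWARE\WOW6432Node\Microsoft\Windows\CurrentVersion\Uninstall\*'
$app = Get-ItemProperty $keys -ErrorAction SilentlyContinue |
    Where-Object { $_.DisplayName -like '*CleanUpTool*' } |
    Select-Object -First 1

# MSI-installed products use their product-code GUID as the key name,
# which msiexec accepts directly for uninstall.
if ($app -and $app.PSChildName -match '^\{[0-9A-Fa-f-]+\}$') {
    Start-Process msiexec.exe -ArgumentList "/x $($app.PSChildName) /qn" -Wait
}
```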

Open Questions#

  • Does the local PC environment still rely on Get-WmiObject for software inventory, or has it migrated to Get-CimInstance / registry queries?
  • Are there standardized locations for MSI installers beyond user Downloads and script root for future automation?

Related#

  • wiki/scripts/powershell-cleanup-patterns.md
  • wiki/scripts/bulk-folder-rename-script.md
  • wiki/administration/powershell-administration-commands.md

Sources#

  • ingested/chats/056-List Files in Folder Using PowerShell.md
  • DeepSeek conversation: “List Files in Folder Using PowerShell” (2025-06-11)

n8n

n8n#

What Was Established#

n8n is the automation and orchestration hub for the homelab. It runs as an LXC on Proxmox, connected to PostgreSQL for persistent workflow state and execution history. Community edition is sufficient for current use.

Deployment#

| Detail        | Value                          |
| ------------- | ------------------------------ |
| LXC host      | n8n                            |
| IP            | 192.168.1.169                  |
| Port          | 5678                           |
| URL           | http://192.168.1.169:5678      |
| Version       | 2.15.1 (Self Hosted)           |
| Installed via | tteck Proxmox helper scripts   |
| OS            | Debian 13 (unprivileged LXC)   |
| Config file   | /opt/n8n.env                   |

Configuration (/opt/n8n.env)#

N8N_SECURE_COOKIE=false
N8N_PORT=5678
N8N_PROTOCOL=http
N8N_HOST=192.168.1.169
DB_TYPE=postgresdb
DB_POSTGRESDB_HOST=192.168.1.57
DB_POSTGRESDB_PORT=5432
DB_POSTGRESDB_DATABASE=homelab
DB_POSTGRESDB_USER=homelab
DB_POSTGRESDB_PASSWORD=<password>

After editing: systemctl restart n8n
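A quick post-restart reachability check from a PowerShell host (sketch; assumes the LXC is up and port 5678 is reachable):

```powershell
# Confirm the n8n UI answers on the configured host and port.
$r = Invoke-WebRequest -Uri "http://192.168.1.169:5678" -UseBasicParsing
"n8n responded with HTTP $($r.StatusCode)"
```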

Book Discovery Pipeline

Book Discovery Pipeline#

What Was Established#

A multi-agent, multi-node pipeline designed to identify high-prestige, upcoming literary works by analyzing critical reviews before they hit the mainstream.

Key Decisions#

  • Architecture: Two-tier agent system. Lightweight models (E4B) on the Orchestrator (T480) handle routing/filtering; heavier models (26B) on the Inference node (Pavilion) handle deep analysis.
  • Tech Stack: n8n (Orchestration), PostgreSQL (Data Storage), Hugo (Static Site Generation), Python/JS (Custom Logic).
  • Data Sources: RSS feeds (Literary Hub, etc.), Web Scraping (for indie blogs), and Goodreads API/Scraping for popularity comparison.

Current Configuration#

The Pipeline Chain#

  1. Ingestion: n8n fetches RSS feeds and scrapes blogs.
  2. Filtering (E4B): Classifies content (Review vs. News). Discards non-reviews.
  3. Extraction (26B): Extracts title, author, publisher, and critical language.
  4. Scoring (26B): Analyzes “prestige signals” (e.g., phrases like “formally ambitious”) and compares them against Goodreads popularity.
  5. Aggregation: Aggregates data into PostgreSQL.
  6. Publication: n8n generates a Markdown file and commits it to a Hugo repository.

Data Schema (PostgreSQL)#

  • sources: RSS metadata.
  • articles: Raw ingested content.
  • books: Normalized book records.
  • reviews: Links articles to books + extracted critical language.
  • prestige_scores: Historical scoring for trend tracking.
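The tables above could be sketched as PostgreSQL DDL; every column beyond the relationships described is an assumption:

```sql
-- Sketch only; column names not listed above are assumptions.
CREATE TABLE sources (
    id       SERIAL PRIMARY KEY,
    name     TEXT,
    feed_url TEXT NOT NULL          -- RSS metadata
);

CREATE TABLE articles (
    id          SERIAL PRIMARY KEY,
    source_id   INTEGER REFERENCES sources(id),
    raw_content TEXT,               -- raw ingested content
    fetched_at  TIMESTAMPTZ DEFAULT now()
);

CREATE TABLE books (
    id        SERIAL PRIMARY KEY,
    title     TEXT NOT NULL,        -- normalized book record
    author    TEXT,
    publisher TEXT
);

CREATE TABLE reviews (
    article_id        INTEGER REFERENCES articles(id),
    book_id           INTEGER REFERENCES books(id),
    critical_language TEXT,         -- extracted critical language
    PRIMARY KEY (article_id, book_id)
);

CREATE TABLE prestige_scores (
    book_id   INTEGER REFERENCES books(id),
    scored_at TIMESTAMPTZ DEFAULT now(),
    score     NUMERIC               -- historical scoring for trend tracking
);
```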

Open Questions#

  • How to effectively scrape Substack/paywalled content without high costs.
  • Determining the optimal frequency for the pipeline run (Weekly vs. Bi-weekly).

Open WebUI Deployment, Ollama Configuration, AI-Driven Monitoring Pipeline