My Q2 2025 Homelab: AI-Enhanced Self-Hosting with a Massive DIY NAS
Written by Jeff on April 13, 2025
Introduction
Welcome to my homelab! It's a constantly evolving environment where I experiment with new technologies, self-host essential services, and manage a growing collection of data. My homelab is structured into three core sections: an AI experimentation platform, a robust self-hosting infrastructure, and a massive DIY Network Attached Storage (NAS) solution. Each component is designed to work in synergy, unlocking new possibilities for personal projects and learning.
AI Section Update
My AI section leverages a powerful gaming computer repurposed as a local AI server. The core components include:
- Processor: Intel® Core™ i9-13900KS
- Mainboard: ASUS ROG Strix Z790-E Gaming WiFi II LGA 1700
- RAM: 128GB DDR5 (unchanged from last quarter)
- GPU: NVIDIA RTX 4090
- SSD: 1TB (Samsung 980 Pro)
- Network: 10Gbit
The key update isn't the hardware itself but a more mature understanding and optimized utilization of the 128GB of RAM at my disposal. This lets me comfortably work with larger models and datasets. I've also been focusing on optimizing the software stack to improve inference speeds.
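As a rough rule of thumb, a model's memory footprint is its parameter count times bytes per weight, plus runtime overhead for the KV cache and the inference stack. This back-of-the-envelope sketch (the 20% overhead factor is my own assumption, not a measured value) shows why 128GB comfortably fits a quantized 70B-90B model:

```python
def estimate_ram_gb(params_billions: float, bits_per_weight: int,
                    overhead: float = 1.2) -> float:
    """Rough RAM needed to hold a model's weights in memory.

    The 1.2x overhead factor is an assumption covering KV cache and
    runtime buffers; real usage varies with context length.
    """
    weight_gb = params_billions * bits_per_weight / 8  # 1B params at 8-bit = ~1 GB
    return weight_gb * overhead


# Rule-of-thumb estimates only:
print(estimate_ram_gb(70, 4))   # Llama 3.3 70B at 4-bit: ~42 GB
print(estimate_ram_gb(90, 8))   # Llama 3.2 Vision 90B at 8-bit: ~108 GB
```

Numbers like these make it clear that the 90B model at higher precision only fits because of the 128GB of system RAM.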
I primarily use the following AI models, deployed via Ollama, for various tasks within my homelab:
- Llama 3.2 Vision (90B Parameters): Used for resource-intensive tasks and advanced experimentation. The 128GB RAM is critical for running this model locally.
- Llama 3.3 (70B Parameters): Ideal for advanced natural language processing tasks, providing robust and reliable performance for daily use.
- Llama 3.2 Vision (11B Parameters): Used for day-to-day multimodal processing, striking a balance between performance and efficiency.
- InternVL2 (26B Parameters): With advanced vision-language capabilities, this model excels at complex multimodal tasks while maintaining efficiency for regular use.
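Ollama exposes a simple REST API (by default on port 11434), which is how other services in the homelab can talk to these models. A minimal sketch using only the standard library; the model tag and prompt are illustrative, and the endpoint assumes a default local install:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint


def build_payload(model: str, prompt: str) -> dict:
    """Build a non-streaming request body for Ollama's /api/generate."""
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str) -> str:
    """POST the request and return the model's response text."""
    data = json.dumps(build_payload(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Example (requires a running Ollama instance with the model pulled):
# print(generate("llama3.3:70b", "Summarize my backup strategy in one line."))
```

With `stream` set to `False`, the server returns one JSON object whose `response` field holds the full completion, which is easier to consume from scripts than the default streaming mode.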
My focus has shifted from simply running the models to optimizing their performance for specific homelab applications.
Self-Hosted Applications (Proxmox Cluster)
My self-hosting infrastructure primarily relies on a two-node Proxmox cluster built on Dell PowerEdge R720 servers. Proxmox provides excellent virtualization capabilities and high availability. This cluster hosts all my self-hosted applications, including:
- MailCow: My personal mail server, giving me complete control over my email.
- Nextcloud: My online office suite and personal cloud storage solution, replacing Google Drive.
- Home Assistant: For smart home automation, connecting and controlling all my smart devices.
- Emby: My media center, streaming movies and TV shows from my NAS.
- Unifi Network Controller: To manage my Ubiquiti network devices.
- Nginx Proxy Manager: For managing reverse proxies and SSL certificates.
- Forgejo: A self-hosted software forge, providing Git repository management.
- Paperless-ngx: To archive and organize my documents, going paperless.
- Immich: A Google Photos alternative for storing and organizing my photos and videos.
- pfSense: Acting as my firewall, IDS, and IPS, protecting my homelab from threats.
- Vaultwarden: A Bitwarden-compatible password storage solution.
- Prometheus/Grafana: For monitoring and graphing the performance of my homelab.
- Uptime Kuma: Alerts me if any of my services go down.
- Dashy: A personal dashboard to centralize access to all my self-hosted services.
- Flatnotes: A database-less note-taking web app, perfect for quick notes and documentation.
- Portainer: For managing my Docker containers across the homelab.
- Syncthing: To sync data from my travel setup, ensuring I have access to important files wherever I am.
- Plausible: Analytics for my projects and applications.
Proxmox's built-in features for backup and replication are essential for ensuring data protection. I carefully allocate resources to each VM to ensure optimal performance and stability.
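Proxmox's backup defaults live in `/etc/vzdump.conf`. A minimal sketch of the kind of settings involved; the values here are illustrative, not an exact copy of my configuration:

```
# /etc/vzdump.conf -- illustrative defaults, not an exact production config
storage: local        # where backup archives are written
mode: snapshot        # back up running VMs via snapshot, no downtime
compress: zstd        # fast compression with good ratios
mailto: admin@example.com   # hypothetical notification address
```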
Kubernetes Cluster (Raspberry Pi 5s)
This cluster of ten Raspberry Pi 5s is purely for research, testing, and development. I'm fully aware that building a Kubernetes cluster with Raspberry Pi 5s comes with challenges: resource constraints, managing SD card failures, and ARM architecture compatibility issues are all realities. I accept these limitations because this cluster is a sandbox for experimentation, and the hands-on experience I gain with Kubernetes concepts is invaluable. The cluster is primarily used for testing work-related items that cannot be shared publicly.
NAS Section (DIY with 320TB)
Data is at the heart of my homelab, and my DIY NAS server provides the foundation for storage. The NAS offers a total raw capacity of 320TB, built using 16 x 20TB Seagate Exos hard drives configured in a RAID 6 array. It's powered by an AMD Ryzen 5 5600G CPU with 32GB of DDR4 ECC RAM running TrueNAS Scale with the ZFS file system.
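For clarity on the headline number: 320TB is the raw total, and RAID 6 spends two drives' worth of space on parity (the same double-parity math applies to a ZFS RAIDZ2 layout). A quick sketch of the arithmetic:

```python
def raid6_capacity_tb(num_drives: int, drive_tb: float) -> tuple[float, float]:
    """Return (raw, usable) capacity in TB for a RAID 6 / RAIDZ2 array.

    Double parity consumes the equivalent of two drives, so the array
    survives any two simultaneous drive failures.
    """
    raw = num_drives * drive_tb
    usable = (num_drives - 2) * drive_tb
    return raw, usable


raw, usable = raid6_capacity_tb(16, 20)
print(raw, usable)  # 320 TB raw, 280 TB usable before filesystem overhead
```

Actual usable space lands a bit lower still once ZFS metadata and recommended free-space headroom are accounted for.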
Building my own NAS allowed me to customize the hardware and software to meet my specific needs. While it required more initial effort, I've gained a deeper understanding of storage technologies and the ability to scale the system as needed. I've implemented a comprehensive backup strategy that includes regular snapshots stored locally, offsite backups to Backblaze B2, and a 1:1 backup to my DIY NAS with 500TB in my house in Jakarta. The NAS is connected to the network via a dual 10GbE connection using link aggregation (LACP).
The NAS integrates with the AI and self-hosted application sections. Specifically, it stores all the media files for Emby and all user data and files for Nextcloud. This provides ample storage and ensures these applications function seamlessly.
Integration and Synergy
Currently, the synergy between the sections is primarily through data storage. The NAS acts as a central repository for data used by the applications hosted on Proxmox. However, the data for the AI models is not yet stored on the NAS, as the AI machine has sufficient local storage and I want to ensure optimal performance.
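One common way to wire this kind of sharing up is NFS exports from TrueNAS mounted inside the relevant VMs. A sketch of what the fstab entries might look like; the hostnames, dataset paths, and mount points are hypothetical:

```
# /etc/fstab inside a Proxmox VM -- hostnames and paths are hypothetical
nas.lan:/mnt/tank/media      /mnt/media      nfs  defaults,_netdev,ro  0 0
nas.lan:/mnt/tank/nextcloud  /mnt/nextcloud  nfs  defaults,_netdev     0 0
```

Mounting the Emby media share read-only is a small safety margin: the media server can stream but never modify the library.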
Challenges and Future Directions
Managing such a diverse homelab environment presents several challenges. Keeping everything running at this scale, and making sure all the pieces play nicely together, takes constant attention.
A key area I want to explore is integrating the AI section more directly with my self-hosted applications running on Proxmox. In the future, I plan to research and develop a "Travel Lab" or "Travel Server" concept: a homelab setup that can be easily transported for on-the-go development and experimentation.
Conclusion
My Q2 2025 homelab is a constantly evolving project, fueled by a passion for experimentation and self-hosting. By combining AI, a robust application infrastructure, and a massive NAS, I'm able to unlock new possibilities for learning, automation, and personal projects. I encourage others to share their own homelab setups and ideas, as we can all learn from each other's experiences.