Using WireGuard to Access Self Hosted Local AI Anywhere, Securely
Plus Reflections on Choosing Local AI Over Hosted AI
Today we are going to continue our series on local AI and make it available outside of our home network. This lets the personal AI models privately hosted on our own hardware work with us at the coffee shop, at the airport, and anywhere else we travel outside of our home network. We can keep our privacy-oriented setup, which protects us from the data mining and intrusion into our personal lives that come with the larger hosted AI companies, while ensuring we have a secure connection to our home network that other people can’t access.
This allows us to keep using AI productively without feeling the need to reach for other, less privacy-focused services while we are on the go. Most commonly that means using AI on our phones, but this works for any device that can run WireGuard. As a bonus, the same technique can reach any other self-hosted software while you are away from home, but in this article we are going to focus on locally hosted AI.
Prerequisites
This article builds on a few others that you should complete before jumping in here.
First, you need to get Ollama running on the server you have chosen; see Running AI Locally and Privately for Free. After setup, make sure Ollama is bound to the 0.0.0.0 IP address so that it serves on the network and not just on the loopback interface. You can test this by accessing the Ollama API from a different device using the server’s IP address and confirming a successful connection.
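As a sketch of what that looks like on a Linux server running Ollama under systemd: Ollama reads its bind address from the OLLAMA_HOST environment variable, and its API listens on port 11434 by default. The address 192.168.1.50 below is a placeholder for your server’s actual LAN IP.

```shell
# On the server: tell Ollama to listen on all interfaces, not just loopback.
# One common approach is a systemd override for the ollama service:
sudo systemctl edit ollama.service
# In the editor that opens, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"
# Then reload and restart so the change takes effect:
sudo systemctl daemon-reload
sudo systemctl restart ollama

# From a different device on your network (192.168.1.50 is a placeholder
# for your server's LAN IP; 11434 is Ollama's default port):
curl http://192.168.1.50:11434/api/tags
# A JSON response listing your installed models means Ollama is reachable
# over the network, not just on the server itself.
```

If the curl request times out, check that the server’s firewall allows inbound connections on port 11434 from your local network.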