App Simplifies Using Ollama Local AI on macOS Devices
May 19, 2025
Paul Martínez
If you're looking to keep your data private and avoid feeding third-party profiles or training sets, using a locally installed AI for your research is a smart move. I've been using the open-source Ollama on my Linux system, enhanced with a handy browser extension for a smoother experience. But when I switch to macOS, I opt for a straightforward, free app called Msty.
Also: How to turn Ollama from a terminal tool into a browser-based AI with this free extension
Msty is versatile, allowing you to tap into both locally installed and online AI models. Personally, I stick with the local option for maximum privacy. What sets Msty apart from other Ollama tools is its simplicity—no need for containers, terminals, or additional browser tabs.
The app is packed with features that make it a breeze to use. You can run multiple queries simultaneously with split chats, regenerate model responses, clone chats, and even add multiple models. There's real-time data summoning (though it's model-specific), and you can create Knowledge Stacks to ground your local model in your own files, folders, Obsidian vaults, notes, and more. Plus, there's a prompt library to help you get the most out of your queries.
Msty is hands down one of the best ways to interact with Ollama. Here's how you can get started:
Installing Msty
What you'll need: Just a macOS device and Ollama installed and running. If you haven't set up Ollama yet, follow the steps here. Don't forget to download a local model as well.
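Before installing Msty, you can confirm Ollama is actually up and see which models you've pulled by querying its local REST API (it listens on port 11434 by default). Here's a minimal sketch using only the Python standard library:

```python
import json
import urllib.request

# Ollama's REST API listens on localhost:11434 by default.
OLLAMA_URL = "http://localhost:11434"

try:
    # /api/tags lists the models you've pulled locally.
    with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags", timeout=5) as resp:
        models = json.load(resp).get("models", [])
    if models:
        print("Ollama is running. Installed models:")
        for model in models:
            print(f"  - {model['name']}")
    else:
        print("Ollama is running, but no models are pulled yet.")
        print("Grab one first, e.g.: ollama pull llama3.2")
except OSError:
    print("Could not reach Ollama -- is it installed and running?")
```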
Download the Installer
Head over to the Msty website, click on the Download Msty dropdown, choose Mac, and then pick either Apple Silicon or Intel based on your device.
Install Msty
Once the download finishes, double-click the file and drag the Msty icon to the Applications folder when prompted.
Using Msty
Open Msty
Launch Msty from Launchpad on your Mac.
Connect Your Local Ollama Model
When you first open Msty, click Setup Local AI. Msty will download the necessary components and configure everything for you, including a default local model that's separate from your existing Ollama setup.
To link Msty with Ollama, navigate to Local AI Models in the sidebar and click the download button next to Llama 3.2. Once the download finishes, select it from the models dropdown. For online models, you'll need an API key from your account with that provider. Msty should now be connected to your local Ollama LLM.

I prefer sticking with the Ollama local model for my queries.
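If you want to sanity-check that the same local model responds outside of Msty, you can send a prompt straight to Ollama's generate endpoint. A minimal sketch, assuming you've already pulled llama3.2:

```python
import json
import urllib.request

# One-off prompt to a local model through Ollama's REST API.
# Assumes Ollama is running and llama3.2 is pulled (ollama pull llama3.2).
payload = json.dumps({
    "model": "llama3.2",
    "prompt": "In one sentence, why run an LLM locally?",
    "stream": False,  # return one complete response instead of chunks
}).encode()

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req, timeout=120) as resp:
    print(json.load(resp)["response"])
```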
Model Instructions
One of my favorite features in Msty is the ability to customize model instructions. Whether you need the AI to act as a doctor, a writer, an accountant, an alien anthropologist, or an artistic advisor, Msty has you covered.
To tweak the model instructions, click on Edit Model Instructions in the center of the app, then hit the tiny chat button to the left of the broom icon. From the popup menu, pick the instructions you want and click "Apply to this chat" before you run your first query.

There are plenty of model instructions to choose from, helping you refine your queries for specific needs.
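Under the hood, a model instruction is essentially a system prompt. If you're curious what Msty is doing when you apply one, here's a rough equivalent against Ollama's chat endpoint (the editor persona below is just an illustration):

```python
import json
import urllib.request

# Ollama's /api/chat accepts a "system" message, which is roughly what
# a Msty model instruction does: it sets the persona before the chat.
payload = json.dumps({
    "model": "llama3.2",
    "messages": [
        {"role": "system", "content": "You are a careful technical editor."},
        {"role": "user", "content": "Tighten this: 'The app is very unique.'"},
    ],
    "stream": False,
}).encode()

req = urllib.request.Request(
    "http://localhost:11434/api/chat",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req, timeout=120) as resp:
    print(json.load(resp)["message"]["content"])
```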
This guide should help you get up and running with Msty quickly. Start with the basics and, as you become more comfortable with the app, explore its more advanced features. It's a powerful tool that can really enhance your local AI experience on macOS.