MLX on Apple Silicon as local AI compared with Ollama & Co.

Local AI on the Mac with MLX

At a time when centralized AI services such as ChatGPT, Claude or Gemini dominate the headlines, many professional users are increasingly looking for an alternative - a local AI infrastructure under their own control. Especially for creative processes, sensitive data or recurring workflows, a local solution is often the more sustainable and secure option.

Anyone working with a Mac - especially one with Apple Silicon (M1, M2, M3 or M4) - now has astonishingly powerful tools for running language models directly on the device. At the center of this is a still largely unknown component: MLX, a machine learning framework developed by Apple that is likely to play an increasingly central role in the company's AI ecosystem in the coming years.
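To give a first impression of how little code this takes, here is a minimal sketch using the mlx-lm package; the model name is an example, and any MLX-converted model from the mlx-community hub should work:

```python
# Minimal text generation with MLX on Apple Silicon.
# Requires: pip install mlx-lm (Apple Silicon Macs only).
from mlx_lm import load, generate

# Downloads the model from the Hugging Face hub on first use;
# the model name is illustrative - any MLX-converted model works.
model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.3-4bit")

response = generate(
    model,
    tokenizer,
    prompt="Explain in one sentence what MLX is.",
    max_tokens=100,
)
print(response)
```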

Read more

RAG with Ollama and Qdrant as a universal search engine for your own data

Extend local AI with databases using RAG, Ollama and Qdrant

In an increasingly confusing world of information, it is becoming ever more important to search your own databases in a targeted way - not via classic full-text search, but through semantically relevant answers. This is exactly where the principle of the RAG database comes into play - an AI-supported search solution consisting of two central components: a vector database such as Qdrant for semantic retrieval, and a local language model served by Ollama that generates the answers.
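As a rough sketch of how the two components interact, the following Python snippet embeds a question via Ollama, retrieves matching passages from Qdrant and lets a local model answer. The collection name "docs", the payload field "text" and the model names are illustrative assumptions:

```python
# Sketch of a minimal RAG query: embed the question with Ollama,
# retrieve similar passages from Qdrant, answer with a local LLM.
# Assumes Ollama runs on localhost:11434, Qdrant on localhost:6333,
# and a collection "docs" was already filled with embedded passages.
import requests
from qdrant_client import QdrantClient

OLLAMA = "http://localhost:11434"
client = QdrantClient(url="http://localhost:6333")

question = "What does our vacation policy say about carry-over days?"

# 1. Turn the question into a vector (nomic-embed-text is one option)
emb = requests.post(f"{OLLAMA}/api/embeddings",
                    json={"model": "nomic-embed-text",
                          "prompt": question}).json()

# 2. Fetch the most similar passages from the vector database
hits = client.search(collection_name="docs",
                     query_vector=emb["embedding"], limit=3)
context = "\n".join(hit.payload["text"] for hit in hits)

# 3. Let the language model answer, grounded in the retrieved context
answer = requests.post(f"{OLLAMA}/api/generate",
                       json={"model": "mistral",
                             "prompt": f"Context:\n{context}\n\nQuestion: {question}",
                             "stream": False}).json()
print(answer["response"])
```

Indexing the documents themselves - chunking, embedding and upserting them into the collection - happens in a separate preparation step.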

Read more

Ollama meets Qdrant: A local memory for your AI on the Mac

Memory for local AI with Ollama and Qdrant

Local AI with memory - without cloud, without subscription, without detours

In a previous article I explained how to install and configure Ollama on the Mac. If you have already completed this step, you now have a powerful local language model - such as Mistral, LLaMA3 or another compatible model - that can be addressed via a REST API.

On its own, however, the model only "knows" what is in the current prompt. It does not remember previous conversations. What is missing is a memory.
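A minimal sketch of such a memory, assuming a Qdrant collection named "memory" already exists with a matching vector size; the embedding model and field names are illustrative:

```python
# Sketch: give the model a long-term memory by storing every exchange
# in Qdrant and retrieving related ones before each new prompt.
import uuid
import requests
from qdrant_client import QdrantClient
from qdrant_client.models import PointStruct

OLLAMA = "http://localhost:11434"
client = QdrantClient(url="http://localhost:6333")

def embed(text: str) -> list[float]:
    r = requests.post(f"{OLLAMA}/api/embeddings",
                      json={"model": "nomic-embed-text", "prompt": text})
    return r.json()["embedding"]

def remember(text: str) -> None:
    # Persist one conversation turn as an embedded point
    client.upsert(collection_name="memory", points=[
        PointStruct(id=str(uuid.uuid4()), vector=embed(text),
                    payload={"text": text})
    ])

def recall(query: str, limit: int = 3) -> list[str]:
    # Retrieve the most similar past turns for the next prompt
    hits = client.search(collection_name="memory",
                         query_vector=embed(query), limit=limit)
    return [hit.payload["text"] for hit in hits]
```

Before each new prompt, the results of recall() are prepended as context, and after each answer the exchange is written back with remember().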

Read more

Local AI on the Mac: How to install a language model with Ollama

Local AI on the Mac has long been practical - especially on Apple Silicon computers (M series). With Ollama you get a lean runtime environment for many open-source language models (e.g. Llama 3.1/3.2, Mistral, Gemma, Qwen). The current Ollama version also ships with a user-friendly app that lets you set up a local language model on your Mac in a few clicks. This article offers a pragmatic guide from installation to the first prompt - with practical tips on where things typically go wrong.
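Once Ollama is running, the first prompt can also be sent programmatically via its REST API on the default port 11434. A minimal sketch in Python, assuming a model such as llama3.1 has already been pulled:

```python
# A first prompt against a locally running Ollama, via its REST API.
# Assumes "ollama pull llama3.1" (or another model) was run beforehand.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.1",   # any pulled model works here
        "prompt": "Say hello from my local Mac!",
        "stream": False,       # return one complete response
    },
)
print(resp.json()["response"])
```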

Read more

Through the crisis with clarity: how AI opens up new perspectives

Book 'Crises as turning points - learn, grow, shape'

Crises are part of life. But while many experience them as a painful interruption or even as a personal setback, Markus Schall shows a different perspective in this extraordinary book: Understood correctly, crises can become profound turning points. Opportunities that enable growth. And moments in which our thoughts, actions and feelings can be reorganized.

With a keen sense for human experience, combined with practical life wisdom and a reflective use of AI, the author takes readers on a journey through inner landscapes - not with a wagging finger, but with sincere encouragement. The book is not intended as an academic work, but as a true-to-life companion for anyone who wants to realign themselves - be it after a separation, illness, financial difficulties or social upheaval.

Read more