When people hear the word lithium, many first think of modern rechargeable batteries and battery technology or - with a skeptical eye - of psychotropic drugs. But lithium is much more than that: it is a naturally occurring trace element that has been present in our environment since time immemorial - in rocks, in water and, in small quantities, in plant-based foods.
Tips & instructions
In this category you will find practical tips, step-by-step instructions and hands-on guides on various topics from technology, business and everyday life. The articles show in a clear and comprehensible way how to solve typical problems, use software efficiently or try out new ways of working. Whether it's about Apple devices, FileMaker, ERP software, health topics or other practical issues - the focus here is on concrete benefits. The instructions are structured so that they can be implemented directly and provide quick help in everyday life or in the company.
Integration of MLX in FileMaker 2025: Local AI as the new standard
While MLX originally started as an experimental framework from Apple Research, a quiet but significant development has taken place in recent months: with the release of FileMaker 2025, Claris has integrated MLX directly into FileMaker Server as native AI infrastructure for Apple Silicon. This means that anyone working on a Mac with Apple Silicon can not only run MLX models locally, but also use them directly in FileMaker - with native functions and without any intermediate layers.
Why ERP software alone is not enough - and how to really understand processes
In many companies, the same pattern repeats itself: at some point, management realizes that "something is no longer running smoothly". Perhaps processes have become too slow, errors are accumulating, or the company is increasingly losing track of figures, customers or internal workflows. The call for a new software solution grows loud - preferably a modern, powerful ERP system that "can do everything". But this is often where a fatal fallacy begins.
How I wrote five books in two languages in four months
...and why this is not a miracle, but the result of a clear strategy
For a long time, writing books was seen as something tedious - a lonely project that drags on for months or even years. But what if you let go of this image? What if you rethink writing - with a clear focus, well thought-out processes and targeted use of AI?
In my new book "Writing books 2.0 - a practical guide for authors in the age of AI", I describe exactly this path: a path that has enabled me to write five books in just four months and publish them in two languages - without any loss of quality compared to traditional publishing.
Writing books 2.0 - practical guide for authors in the age of AI
This book does not provide speculative visions of the future, but a well-founded, practical answer from the everyday life of an author who writes with AI himself - and deliberately focuses not on automation, but on autonomy. "Writing books 2.0" is aimed at anyone who is thinking about writing their own book - or is already in the process of doing so - and is wondering how modern tools such as ChatGPT, AI image generators or translation aids can be put to good use. The focus is not on "finished AI products", but on a holistic approach: AI as a tool, humans as authors.
Ollama meets Qdrant: A local memory for your AI on the Mac
Local AI with memory - without cloud, without subscription, without detour
In a previous article I explained how to install Ollama on the Mac. If you have already completed this step, you now have a powerful local language model - such as Mistral, LLaMA3 or another compatible model - that can be addressed via its REST API.
On its own, however, the model only "knows" what is in the current prompt. It does not remember previous conversations. What is missing is a memory.
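The principle behind such a memory can be illustrated with a tiny, self-contained sketch: past exchanges are stored as vectors, and the entries most similar to the current question are retrieved to enrich the next prompt. Everything below is a stand-in for illustration only - a real setup would obtain dense embeddings from a model (e.g. via Ollama's embeddings endpoint) and store them in Qdrant via the `qdrant-client` library; here a toy bag-of-words "embedding" and a plain Python list play those roles.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy embedding: bag-of-words counts. A real setup would ask an
    embedding model (e.g. via Ollama) for a dense vector instead."""
    cleaned = text.lower().replace("?", "").replace(".", "")
    return Counter(cleaned.split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class Memory:
    """Minimal stand-in for a vector store such as Qdrant."""
    def __init__(self):
        self.entries = []  # list of (text, vector) pairs

    def store(self, text: str) -> None:
        self.entries.append((text, embed(text)))

    def recall(self, query: str, k: int = 1) -> list[str]:
        """Return the k stored texts most similar to the query."""
        qv = embed(query)
        ranked = sorted(self.entries, key=lambda e: cosine(qv, e[1]), reverse=True)
        return [text for text, _ in ranked[:k]]

memory = Memory()
memory.store("User prefers answers in German")
memory.store("Project deadline is next Friday")
context = memory.recall("When is the deadline?")
print(context[0])  # → Project deadline is next Friday
```

The retrieved snippets would then be prepended to the prompt that is sent to the local model - exactly the role Qdrant plays in the setup described in the article, only with real embeddings and persistent storage.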
Local AI on the Mac: How to install a language model with Ollama
Local AI on the Mac has long been practical - especially on Apple Silicon computers (M series). With Ollama you get a lean runtime environment for many open-source language models (e.g. Llama 3.1/3.2, Mistral, Gemma, Qwen). The current Ollama version also comes with a user-friendly app that lets you set up a local language model on your Mac with a few clicks. In this article you will find a pragmatic guide from installation to the first prompt - with practical tips on where things typically go wrong.
Health tips (not only) for FileMaker users
The development of the gFM-Business ERP software has been a little slower than usual over the last year and a half because, as the developer of the software, I had to deal with some puzzling symptoms of illness whose cause was not easy to find. Even though conventional medicine was no real help in my case, with the active support of a few alternative practitioners and a holistic dentist I was able to find the causes and eliminate my problems. Because the results of this research are so interesting, I have decided to write a short article about it here so that you as a reader can also benefit from it. Below you will find my top recommendations as plain tips; I earn nothing from these referrals and do not use any affiliate links.