Current contributions

Writing Books 2.0 - a practical guide for authors in the age of AI

Cover: Writing Books 2.0 - a practical guide for authors in the age of AI

This book does not offer speculative visions of the future, but a well-founded, practical answer drawn from the everyday work of an author who writes with AI himself, and who deliberately focuses not on automation but on autonomy. "Writing Books 2.0" is aimed at anyone who is thinking about writing their own book, or is already in the process of doing so, and who wonders how modern tools such as ChatGPT, AI image generators, or translation aids can be put to good use. The focus is not on "finished AI products" but on a holistic approach: AI as the tool, the human as the author.

Read more

See you at the FileMaker conference FMK 2025 in Hamburg?

FileMaker Conference FMK2025

From September 15 to 17, 2025, the German-speaking FileMaker community will meet at the JUFA Hotel Hamburg HafenCity to discuss the latest developments, trends and best practices at the FileMaker Conference (FMK 2025). For over ten years, this conference has been regarded as the most important event for developers, users and decision-makers in the Claris FileMaker sector - and I am delighted to be there in person again this year.

Read more

The Unconventional Hemorrhoid Book - open, honest and holistic

Haemorrhoids cover

Hemorrhoids are a taboo subject. Although millions of people are affected, they are rarely talked about openly. Many suffer in silence, putting off visits to the doctor and trying to get by with creams and home remedies. But it is precisely this silence that can inadvertently prolong the suffering.

With "The Unconventional Hemorrhoid Book - My Way Through Hemorrhoid Hell - and Back", Markus Schall sends a clear signal against this silence and shows new ways to better understand, classify, and holistically alleviate the condition.

Read more

Ollama meets Qdrant: A local memory for your AI on the Mac

Memory for local AI with Ollama and Qdrant

Local AI with memory - without cloud, without subscription, without detour

In a previous article I explained how to install Ollama on the Mac. If you have already completed that step, you now have a powerful local language model, such as Mistral, LLaMA 3, or another compatible model, that can be addressed via a REST API.

However, on its own the model only "knows" what is in the current prompt. It does not remember previous conversations. What is missing is a memory.
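The retrieval idea behind such a memory can be sketched in a few lines. This is a minimal illustration only: a plain Python list and cosine similarity stand in for Qdrant, and the hard-coded vectors stand in for embeddings that would normally come from Ollama's embedding endpoint. All names here are hypothetical, not from the article.

```python
import math

# Stand-in vector store: Qdrant would persist these and index them;
# here a plain list of (vector, text) pairs illustrates the principle.
memory = []

def cosine(a, b):
    # Cosine similarity, the ranking metric a vector database typically uses.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def remember(vector, text):
    # Store an embedding together with its original text.
    memory.append((vector, text))

def recall(query_vector, top_k=1):
    # Return the stored texts most similar to the query vector.
    ranked = sorted(memory, key=lambda e: cosine(query_vector, e[0]), reverse=True)
    return [text for _, text in ranked[:top_k]]

# Toy 2-dimensional "embeddings" purely for demonstration.
remember([1.0, 0.0], "Note about Ollama setup")
remember([0.0, 1.0], "Note about Qdrant collections")
print(recall([0.9, 0.1]))  # → ['Note about Ollama setup']
```

In the real setup, `remember` would upsert points into a Qdrant collection and `recall` would run a similarity search there; the retrieved texts are then prepended to the next prompt, which is what gives the local model its "memory".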

Read more

TMD - The forgotten problem of modern medicine

Book: TMD - The forgotten problem of modern medicine

Millions of people all over the world suffer from complaints such as tinnitus, dizziness, back pain or tension - and yet cannot find a clear diagnosis. What many do not know: There may be a single, often overlooked cause behind all these symptoms - craniomandibular dysfunction, or TMD for short.

With my book "TMD - The forgotten problem of modern medicine" I would like to start right here: I explain what TMD is, how it manifests itself, why it is so difficult to recognize - and how those affected can finally achieve greater clarity and quality of life.

Read more

"The Unconventional Database Book" introduces a process-oriented way of thinking.

The Unconventional Database Book

What do cell phone contacts, to-do lists, calendars, and even your own closet have in common? That's right: they can all be represented as tables - and that is no coincidence. Data has long since become a basic building block of our everyday lives. If you understand it, you understand the world a little better. This is exactly where "The Unconventional Database Book" comes in.

Anyone who can understand the processes and background of everyday life is automatically able to design software workflows quickly and intuitively and implement them in practice.

Read more

Local AI on the Mac: How to install a language model with Ollama

Installing local AI with Ollama on the Mac

Local AI on the Mac has long since become practical, especially on Apple Silicon machines (M series). With Ollama you get a lean runtime environment for many open-source language models (e.g. Llama 3.1/3.2, Mistral, Gemma, Qwen). The current Ollama version now also comes with a user-friendly app that lets you set up a local language model on your Mac with a few clicks. In this article you will find a pragmatic guide from installation to the first prompt, with practical tips on where things traditionally go wrong.
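Once Ollama is running, the model can be addressed over its REST API. The sketch below builds a request against Ollama's default local endpoint (`http://localhost:11434/api/generate`); the model name `llama3.1` is an example and must match a model you have actually pulled. The network call itself is left commented out so the snippet runs even without a server.

```python
import json
import urllib.request

def build_prompt_request(model, prompt):
    # /api/generate expects a JSON body with "model" and "prompt";
    # "stream": False requests one complete JSON response instead of chunks.
    body = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

req = build_prompt_request("llama3.1", "Say hello in one word.")
print(req.full_url)  # → http://localhost:11434/api/generate

# Uncomment once Ollama is running locally:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```

Using only the standard library keeps the example dependency-free; in practice a client library or a simple `curl` call works just as well.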

Read more