If, like me, you have been working with layout and typesetting programs for decades, you tend to notice such changes more clearly than those who have only recently entered this world. I have seen many things come and go over the years: in the early nineties, I worked on the Atari ST with Calamus SL and later, under Windows, with CorelDRAW! Then came QuarkXPress, then iCalamus, Adobe InDesign - and finally, a few years ago, Affinity Publisher. Since then, the Affinity suite has accompanied me through almost all my book projects. It has been a reliable tool, pleasantly straightforward, clearly structured and free of the ballast that many large software packages have accumulated over the years.
Apple macOS
This category is all about the professional use of Apple macOS in the company. The specialist articles show how the Mac makes a convincing workplace in everyday business life - from integration into company networks and secure data management to productivity apps and ERP software. Find out what advantages macOS offers the self-employed, small businesses and larger companies, how workflows can be optimized, and which tools make the Mac a reliable companion in everyday office life. Whether in project management, in accounting or for creative work - the Mac has been a strong foundation for efficient working for many years.
Apple MLX vs. NVIDIA: How local AI inference works on the Mac
Anyone working with artificial intelligence today often first thinks of ChatGPT or similar online services. You type in a question, wait a few seconds - and receive an answer as if a very well-read, patient conversation partner were sitting at the other end of the line. But what is easily forgotten: Every input, every sentence, every word travels via the Internet to external servers. That's where the real work is done - on huge computers that you never get to see yourself.
In principle, a local language model works in exactly the same way - but without the Internet. The model is stored as a file on the user's own computer, is loaded into memory (RAM) at startup and answers questions directly on the device. The technology behind it is the same: a neural network that understands language, generates text and recognizes patterns. The only difference is that the entire computation stays on your own machine. You could say: ChatGPT without the cloud.
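To make this concrete, here is a minimal sketch of how such a local model can be queried when it is served by a local runtime such as Ollama. The endpoint path and the model name "mistral" are assumptions for illustration; nothing in this snippet leaves the machine.

```python
import json
import urllib.request

def build_generate_request(model: str, prompt: str,
                           host: str = "http://localhost:11434"):
    """Build the URL and JSON body for a local /api/generate call."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return f"{host}/api/generate", json.dumps(payload).encode("utf-8")

def ask_local_model(model: str, prompt: str) -> str:
    """Send the prompt to the locally running model and return its answer."""
    url, body = build_generate_request(model, prompt)
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Usage (requires a running local Ollama with the model pulled):
# answer = ask_local_model("mistral", "What is a language model?")
```

The point of the sketch is the address: everything goes to localhost, so every word of the conversation stays on the device.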
Electronic invoices for SMEs: XRechnung, ZUGFeRD and ERP at a glance
Germany did not invent the e-invoice overnight - it is the result of years of standardization work (EN 16931), federal and state regulations (B2G) and now, via the Growth Opportunities Act, the gradual expansion into everyday B2B life. Since January 1, 2025, a new legal situation has applied: an "electronic invoice" is only an e-invoice if it is structured and machine-readable - pure PDF attachments by email are no longer an e-invoice according to the definition. This sounds technical, but has operational consequences from invoice receipt to accounting and archiving.
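The definitional point - structured and machine-readable, rather than a plain PDF - can be illustrated with a rough first-pass triage at invoice receipt. An XRechnung is a pure XML file, while a ZUGFeRD invoice is a PDF container with an embedded XML part. The function below is my own simplified sketch, not a validator, and does not replace checking against EN 16931.

```python
def classify_invoice(data: bytes) -> str:
    """Rough triage of an incoming invoice file by its leading bytes.

    XRechnung is pure structured XML; ZUGFeRD is a PDF with an
    embedded XML part; a plain PDF does not meet the e-invoice
    definition. Heuristic only - real validation goes further.
    """
    head = data.lstrip()[:64]
    if head.startswith(b"<?xml") or head.startswith(b"<"):
        return "xml"  # candidate XRechnung (validate the content!)
    if data.startswith(b"%PDF"):
        # A ZUGFeRD PDF carries an attached XML file; scanning for
        # such a marker is a heuristic, not proof of conformance.
        if b"factur-x.xml" in data or b"zugferd" in data.lower():
            return "pdf-with-xml"
        return "plain-pdf"
    return "unknown"

print(classify_invoice(b'<?xml version="1.0"?><Invoice/>'))  # xml
```

A plain PDF scan of an invoice would land in the "plain-pdf" bucket - exactly the case that no longer counts as an e-invoice under the new definition.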
Digital dependency: how we have lost our self-determination to the cloud
I've always thought it was a mistake for people to hand over their data - be it in the cloud, via apps or with any "free" services. For me, data sovereignty has never been a buzzword, but a question of self-respect. Anyone who uses technology without considering the consequences is entering into a dependency that often only becomes noticeable years later - but then has an even deeper impact.
gFM-Business and the future of ERP: local intelligence instead of cloud dependency
For over a decade, the gFM-Business software has stood for something special in the German ERP market: it is not based on a cumbersome, difficult-to-maintain system, but on the lightweight, customizable and visually modelled FileMaker platform. This has many advantages: gFM-Business can be individually expanded, runs on Windows, macOS and iOS, and can be customized by both developers and ambitious power users.
With the advent of artificial intelligence (AI) - especially through so-called language models such as ChatGPT - new opportunities are now emerging that go far beyond traditional automation. gFM-Business is actively preparing for this future: with the aim of not only managing data, but also unlocking knowledge.
MLX on Apple Silicon as local AI in comparison with Ollama & Co.
At a time when centralized AI services such as ChatGPT, Claude or Gemini are dominating the headlines, many professional users are increasingly looking for an alternative - a local, self-controllable AI infrastructure. Especially for creative processes, sensitive data or recurring workflows, a local solution is often the more sustainable and secure option.
Anyone working with a Mac - especially with Apple Silicon (M1, M2, M3 or M4) - can now find amazingly powerful tools to run their own language models directly on the device. At the center of this is a new, largely unknown component: MLX, a machine learning framework developed by Apple that is likely to play an increasingly central role in the company's AI ecosystem in the coming years.
gofilemaker.de is closing its online store - and celebrating with a 15 % discount!
Sometimes a step back is exactly what leads forward. That's why gofilemaker.de is saying goodbye to the classic online store on September 30, 2025. Sounds crazy? But it's well thought out. And to celebrate this liberation, there will be a 15 % discount on all purchase licenses until then - for the last time. After that: individual licensing, a clear line, no more compromises.
Ollama meets Qdrant: A local memory for your AI on the Mac
Local AI with memory - no cloud, no subscription, no detours
In a previous article, I explained how to install and configure Ollama on the Mac. If you have already completed that step, you now have a powerful local language model - such as Mistral, Llama 3 or another compatible model - that can be addressed via a REST API.
However, on its own the model only "knows" what is in the current prompt. It does not remember previous conversations. What is missing is a memory.
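The idea behind pairing the model with a vector store such as Qdrant can be sketched in a few lines: store past exchanges as vectors, and before each new prompt retrieve the most similar ones and prepend them as context. In the toy sketch below, a bag-of-words count stands in for a real embedding model and a plain Python list stands in for Qdrant - both are deliberate simplifications to show only the mechanism.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding'. A real setup would ask an
    embedding model for a dense vector instead."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class Memory:
    """Stand-in for a vector database such as Qdrant."""
    def __init__(self):
        self.items = []  # list of (vector, original text)

    def store(self, text: str):
        self.items.append((embed(text), text))

    def recall(self, query: str, k: int = 2):
        qv = embed(query)
        ranked = sorted(self.items,
                        key=lambda it: cosine(qv, it[0]), reverse=True)
        return [text for _, text in ranked[:k]]

memory = Memory()
memory.store("The customer prefers invoices as ZUGFeRD PDFs.")
memory.store("My favorite editor is vim.")
context = memory.recall("Which invoice format does the customer prefer?", k=1)
# `context` would be prepended to the next prompt sent to the local model.
```

A real pipeline replaces `embed` with an embedding endpoint and `Memory` with Qdrant collections, but the loop stays the same: embed, search, prepend, generate.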