LoRA training: How FileMaker 2025 simplifies the fine-tuning of large language models

LoRA Fine tuning - FileMaker 2025

The world of artificial intelligence is moving fast. New models, new methods and, above all, new possibilities emerge almost weekly - and yet one thing remains constant: not every technical innovation automatically improves everyday work. Much remains experimental, complex or simply too costly for productive use. This is particularly evident in the so-called fine-tuning of large language models - a method of specializing generative AI on one's own content, terminology and tone.

I have followed this process closely over the last few months - first in the classic form, with Python, the terminal, error messages and nerve-wracking setup loops. And then with FileMaker 2025. A step that surprised me - not because it was loud, but because it was clear. And because it showed that there is another way.
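The core idea behind LoRA (low-rank adaptation) can be sketched in a few lines: instead of updating a layer's full weight matrix W during fine-tuning, only two small low-rank factors B and A are trained, and their scaled product is added to the frozen weights. A minimal illustrative sketch, using pure Python and toy shapes - the dimensions, values and scaling below are assumptions for demonstration, not FileMaker's or any library's actual implementation:

```python
def matmul(a, b):
    """Naive matrix multiplication for small illustrative matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

d, r = 4, 1                      # model dimension and LoRA rank (r << d)

# "Frozen" weight matrix W of shape d x d (stand-in for a model layer).
W = [[1.0 if i == j else 0.0 for j in range(d)] for i in range(d)]

# LoRA trains only the low-rank factors B (d x r) and A (r x d),
# shrinking the trainable parameter count from d*d to 2*d*r.
B = [[0.5] for _ in range(d)]    # d x r
A = [[0.1, 0.2, 0.3, 0.4]]       # r x d
alpha = 2.0                      # scaling: the update is weighted by alpha/r

# Effective weights at inference time: W' = W + (alpha / r) * B @ A
delta = matmul(B, A)
W_eff = [[W[i][j] + (alpha / r) * delta[i][j] for j in range(d)]
         for i in range(d)]

print(d * d, "full parameters vs.", 2 * d * r, "LoRA parameters")
```

Because only B and A are trained, the adapter is tiny compared with the base model - which is precisely why fine-tuning on one's own content becomes feasible on local hardware at all.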


How AI specialists can be trained today - opportunities for companies and trainees

Train an AI specialist

Just a few years ago, artificial intelligence was a topic for research institutes and large corporations. People talked about neural networks, deep learning and speech recognition - but it hardly played a role in everyday life. Today, AI is no longer a topic for the future, but a reality: it writes texts, creates images, analyzes data and controls production processes. Whether in administration, trade or industry - it can now be found everywhere.


Digital dependency: how we have lost our self-determination to the cloud

Digital dependency with cloud systems

I've always considered it a mistake for people to hand over their data - whether to the cloud, via apps or to any "free" service. For me, data sovereignty has never been a buzzword, but a question of self-respect. Anyone who uses technology without considering the consequences enters into a dependency that often only becomes noticeable years later - and then all the more deeply.


gFM-Business and the future of ERP: local intelligence instead of cloud dependency

gFM-Business and AI + knowledge graph

For over a decade, the gFM-Business software has stood for something special in the German ERP market: it is built not on a cumbersome, hard-to-maintain system, but on the lightweight, customizable and visually modelled FileMaker platform. This brings many advantages: gFM-Business can be extended individually, runs on Windows, macOS and iOS, and can be customized by developers and ambitious power users alike.

With the advent of artificial intelligence (AI) - especially through so-called language models such as ChatGPT - new opportunities are now emerging that go far beyond traditional automation. gFM-Business is actively preparing for this future: with the aim of not only managing data, but also unlocking knowledge.


Artificial intelligence: which jobs are at risk and how we can prepare now

Which jobs will be eliminated by AI in the future

Hardly any other technological shift has entered our everyday lives as quickly as artificial intelligence. What was considered visionary future technology yesterday is already reality today - whether in writing, programming, diagnosis, translation, or even the creation of music, art and legal briefs.


FileMaker Conference 2025: AI, community and an unexpected incident

FileMaker Conference 2025: Fire alarm with fire department

The FileMaker Conference 2025 in Hamburg is over - and in many respects it was a special one. Not only because this year's program focused on artificial intelligence, performance and modern workflows, but also because the personal exchange and the "family atmosphere" of the FileMaker community once again shone through. For me personally, it was an intensive, inspiring and thoroughly enriching time - right from the first evening.


Integration of MLX in FileMaker 2025: Local AI as the new standard

Local AI with MLX and FileMaker

While MLX originally started as an experimental framework from Apple Research, a quiet but significant development has taken place in recent months: with the release of FileMaker 2025, Claris has integrated MLX into the server as a native AI infrastructure for Apple Silicon. Anyone working on a Mac with Apple Silicon can therefore not only run MLX models locally, but also use them directly in FileMaker - with native functions and without intermediate layers.


MLX on Apple Silicon as local AI in comparison with Ollama & Co.

Local AI on the Mac with MLX

At a time when centralized AI services such as ChatGPT, Claude or Gemini are dominating the headlines, many professional users are increasingly looking for an alternative - a local, self-controllable AI infrastructure. Especially for creative processes, sensitive data or recurring workflows, a local solution is often the more sustainable and secure option.

Anyone working on a Mac - especially with Apple Silicon (M1, M2, M3 or M4) - can now find astonishingly powerful tools for running their own language models directly on the device. At the center is a new, still largely unknown component: MLX, a machine-learning framework developed by Apple that is likely to play an increasingly central role in the company's AI ecosystem in the coming years.
