MLX

Beginner’s guide to fine-tuning models using MLX on Apple Silicon.

This article is also available in Simplified Chinese. Popular Python fine-tuning packages for large language models (LLMs), such as Unsloth and Lamini, do not support GPU acceleration on Apple M-series chips. Using MLX for fine-tuning on a Mac with Apple Silicon is a great alternative. MLX is a machine learning framework developed by Apple, specifically optimized for Apple Silicon.
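As a rough sketch of the workflow the article walks through, the snippet below prepares a tiny dataset in the JSONL layout that mlx-lm's LoRA tooling accepts and notes, in a comment, the command that typically launches training. The example records, the data directory, the base model name, and the flags are illustrative assumptions rather than values taken from the article.

```python
# Minimal sketch: prepare data for LoRA fine-tuning with mlx-lm on Apple Silicon.
# The example records, paths, and model name below are placeholders (assumptions),
# not values from the article.
import json
from pathlib import Path

# mlx_lm.lora expects a data directory containing train.jsonl and valid.jsonl,
# where each line is a JSON object such as {"text": "..."}.
data_dir = Path("data")
data_dir.mkdir(exist_ok=True)

examples = [
    {"text": "Question: What is MLX?\nAnswer: A machine learning framework for Apple Silicon."},
    {"text": "Question: Does MLX use the GPU on M-series chips?\nAnswer: Yes, via unified memory."},
]

# Write the same tiny set to both splits just to make the sketch self-contained.
for name in ("train.jsonl", "valid.jsonl"):
    with open(data_dir / name, "w") as f:
        for ex in examples:
            f.write(json.dumps(ex) + "\n")

# With the data in place, LoRA training is typically launched from the shell, e.g.:
#   python -m mlx_lm.lora --model mlx-community/Mistral-7B-Instruct-v0.2-4bit \
#       --train --data data --iters 600 --batch-size 4
# (illustrative command; check the mlx-lm documentation for the options in your version).
```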



MLX Framework FAQ Explained: Model Support, Fine-Tuning, Conversion, and MLX Community

1. What machine learning models does MLX support? The MLX framework supports a variety of popular machine learning and deep learning models, primarily including large language models (LLMs) and text generation models, such as LLaMA, Mistral, Phi-2, and Qwen; image generation models like Stable Diffusion; speech recognition models such as OpenAI’s Whisper; and models for …
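To make the model-support list concrete, here is a minimal sketch of loading one of the supported LLM families through mlx-lm and generating text. The Hugging Face repository name, prompt, and generation parameters are illustrative assumptions; any MLX-format model from the mlx-community organization should work the same way, and models in other formats can usually be converted first with mlx_lm.convert.

```python
# Minimal sketch: load a community-converted model and generate text with mlx-lm.
# Assumes mlx-lm is installed on an Apple Silicon Mac; the repository name and
# generation parameters are illustrative, not prescribed by the FAQ.
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.2-4bit")

prompt = "Explain what the MLX framework is in one sentence."
text = generate(model, tokenizer, prompt=prompt, max_tokens=100)
print(text)

# Models published in standard Hugging Face format can typically be converted
# (and optionally quantized) beforehand, e.g.:
#   python -m mlx_lm.convert --hf-path mistralai/Mistral-7B-Instruct-v0.2 -q
# (illustrative command; see the mlx-lm documentation for current options).
```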

