Swift 6's Puzzling `@isolated(any)`: What You Need to Know

2025-09-01

Swift 6 introduces the `@isolated(any)` attribute, which describes the isolation of asynchronous functions and can seem confusing at first: the attribute always takes an argument, yet that argument can only ever be `any`. The article explains why it exists: to solve the problem of isolation information being lost when asynchronous functions are scheduled. `@isolated(any)` gives callers access to a function value's isolation property, enabling smarter scheduling, particularly when working with `Task` and `TaskGroup`, and preserving the enqueue order of tasks on the MainActor. Although most code can ignore it, understanding `@isolated(any)` matters for writing efficient and reliable concurrent code whenever the isolation and scheduling of asynchronous functions come into play.
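
As a rough illustration (the helper below is hypothetical and not taken from the article), a scheduling function that accepts an `@isolated(any)` closure can inspect its `isolation` property before deciding how to run the work:

```swift
// A minimal sketch: `runLater` is an illustrative name, not an API from the article.
func runLater(_ work: @escaping @isolated(any) @Sendable () async -> Void) {
    // An `@isolated(any)` function value exposes an `isolation` property
    // of type `(any Actor)?`, so the callee can see where the work will run.
    if work.isolation === MainActor.shared {
        print("MainActor-isolated work: could be enqueued directly to preserve order")
    }
    Task {
        await work()
    }
}

func example() {
    runLater { @MainActor in
        // Explicitly MainActor-isolated; its isolation travels with the function value.
        print("Running on the main actor")
    }
}
```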


Saying Goodbye to Certainty: Probabilistic Programming in Swift

2025-08-29

This article introduces a novel approach to handling uncertain data in Swift: `Uncertain`. By encoding probability directly into the type system, it elegantly addresses problems such as the imprecision of GPS coordinates. Using probability distributions and Monte Carlo sampling, developers can model real-world uncertainty more faithfully and build more robust, reliable applications. The article presents a Swift library built around the `Uncertain` idea and includes examples of working with various probability distributions and performing statistical analysis.
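
To give a feel for the underlying idea, here is a toy sketch (not the library's actual API): a value is represented by a sampling function, and questions about it are answered by Monte Carlo sampling rather than a single point estimate:

```swift
import Foundation

// Toy stand-in for the concept behind `Uncertain`: a value defined by how to sample it.
struct UncertainValue<T> {
    let sample: () -> T
}

extension UncertainValue where T == Double {
    // A normally distributed value, e.g. a noisy GPS or speed reading.
    static func normal(mean: Double, standardDeviation: Double) -> UncertainValue {
        UncertainValue {
            // Box-Muller transform to draw from a normal distribution.
            let u1 = Double.random(in: Double.ulpOfOne..<1)
            let u2 = Double.random(in: 0..<1)
            let z = (-2 * log(u1)).squareRoot() * cos(2 * .pi * u2)
            return mean + standardDeviation * z
        }
    }

    // Estimate the probability that the value exceeds a threshold via Monte Carlo sampling.
    func probability(exceeds threshold: Double, samples: Int = 10_000) -> Double {
        let hits = (0..<samples).filter { _ in sample() > threshold }.count
        return Double(hits) / Double(samples)
    }
}

// Usage: "Am I really moving faster than 4 m/s?" asked of a noisy reading.
let speed = UncertainValue<Double>.normal(mean: 4.2, standardDeviation: 0.5)
print("P(speed > 4 m/s) ≈ \(speed.probability(exceeds: 4.0))")
```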


Run LLMs Locally on Your Mac with Ollama

2025-02-16

Apple announced Apple Intelligence at WWDC 2024, promising "AI for the rest of us," but its arrival feels distant. Meanwhile, Ollama lets you run large language models (LLMs) like llama3.2 locally on your Mac. Think of it as 'Docker for LLMs' – easy to pull, run, and manage models. Powered by llama.cpp, Ollama uses Modelfiles for configuration and the OCI standard for distribution. Running models locally offers advantages in privacy, cost, latency, and reliability. Ollama exposes an HTTP API for easy integration into apps, as demonstrated by Nominate.app, which uses it for intelligent PDF renaming. The article encourages developers to build the next generation of AI-powered apps now with Ollama, instead of waiting for Apple's promises.
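
As a sketch of that kind of integration (the helper below is illustrative, assuming Ollama's documented `/api/generate` endpoint on the default port 11434 and an already-pulled `llama3.2` model), calling a local model from Swift is a plain HTTP request:

```swift
import Foundation

// Request/response shapes for Ollama's /api/generate endpoint (non-streaming).
struct GenerateRequest: Encodable {
    let model: String
    let prompt: String
    let stream: Bool
}

struct GenerateResponse: Decodable {
    let response: String
}

// Send a prompt to the local Ollama server and return the generated text.
func generate(prompt: String, model: String = "llama3.2") async throws -> String {
    var request = URLRequest(url: URL(string: "http://localhost:11434/api/generate")!)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(
        GenerateRequest(model: model, prompt: prompt, stream: false)
    )
    let (data, _) = try await URLSession.shared.data(for: request)
    return try JSONDecoder().decode(GenerateResponse.self, from: data).response
}

// Usage:
// let answer = try await generate(prompt: "Suggest a descriptive file name for this invoice PDF.")
// print(answer)
```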

Development