Use when implementing maps, annotations, search, directions, or debugging MapKit display/performance issues - SwiftUI Map, MKMapView, MKLocalSearch, clustering, Look Around
Provides MapKit implementation guidance, performance optimization, and SwiftUI vs MKMapView decision trees for iOS apps.
Install:

```
npx claudepluginhub charleswiltgen/axiom
```

This skill inherits all available tools. When active, it can use any tool Claude has access to.
MapKit patterns and anti-patterns for iOS apps. Prevents common mistakes: using MKMapView when SwiftUI Map suffices, annotations in view bodies, setRegion loops, and performance issues with large annotation counts.
- axiom-mapkit-ref — Complete API reference
- axiom-mapkit-diag — Symptom-based troubleshooting
- axiom-core-location — Location authorization and monitoring

| Anti-Pattern | Time Cost | Fix |
|---|---|---|
| Using MKMapView when SwiftUI Map suffices | 2-4 hours UIViewRepresentable boilerplate | Use SwiftUI Map {} for standard map features (iOS 17+) |
| Creating annotations in SwiftUI view body | UI freeze with 100+ items, view recreation on every update | Move annotations to model, use @State or @Observable |
| No annotation view reuse (MKMapView) | Memory spikes, scroll lag with 500+ annotations | dequeueReusableAnnotationView(withIdentifier:for:) |
| setRegion in updateUIView without guard | Infinite loop — region change triggers update, update sets region | Guard with an approximate region comparison or a flag (MKCoordinateRegion is not Equatable) |
| Ignoring MapCameraPosition (SwiftUI) | Can't programmatically control camera, broken "center on user" | Bind position parameter to @State var cameraPosition |
| Synchronous geocoding on main thread | UI freeze for 1-3 seconds per geocode | Use CLGeocoder().geocodeAddressString with async/await |
| Not filtering annotations to visible region | Loading all 10K annotations at once | Use mapView.annotations(in:) or fetch by visible region |
| Ignoring resultTypes in MKLocalSearch | Irrelevant results, slow search | Set .resultTypes = [.pointOfInterest] or .address to filter |
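The setRegion fix from the table, sketched in code. `MapViewRepresentable` and its `region` binding are illustrative names; the guard uses an approximate comparison because MKCoordinateRegion is not Equatable:

```swift
import SwiftUI
import MapKit

// Sketch: break the setRegion loop by only pushing the region when it
// actually differs from what the map is already showing.
struct MapViewRepresentable: UIViewRepresentable {
    @Binding var region: MKCoordinateRegion

    func makeUIView(context: Context) -> MKMapView {
        let mapView = MKMapView()
        mapView.delegate = context.coordinator
        return mapView
    }

    func updateUIView(_ mapView: MKMapView, context: Context) {
        // Guard: without this, mapView(_:regionDidChangeAnimated:) writes back
        // to `region`, which re-triggers updateUIView, which calls setRegion…
        let delta = 0.0001 // tolerance in degrees, an assumed value
        guard abs(mapView.region.center.latitude - region.center.latitude) > delta ||
              abs(mapView.region.center.longitude - region.center.longitude) > delta else { return }
        mapView.setRegion(region, animated: true)
    }

    func makeCoordinator() -> Coordinator { Coordinator(self) }

    final class Coordinator: NSObject, MKMapViewDelegate {
        var parent: MapViewRepresentable
        init(_ parent: MapViewRepresentable) { self.parent = parent }

        func mapView(_ mapView: MKMapView, regionDidChangeAnimated animated: Bool) {
            parent.region = mapView.region
        }
    }
}
```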
```dot
digraph {
    "Need map in app?" [shape=diamond];
    "iOS 17+ target?" [shape=diamond];
    "Need custom tile overlay?" [shape=diamond];
    "Need fine-grained delegate control?" [shape=diamond];
    "Use SwiftUI Map" [shape=box];
    "Use MKMapView\nvia UIViewRepresentable" [shape=box];
    "Need map in app?" -> "iOS 17+ target?" [label="yes"];
    "iOS 17+ target?" -> "Need custom tile overlay?" [label="yes"];
    "iOS 17+ target?" -> "Use MKMapView\nvia UIViewRepresentable" [label="no"];
    "Need custom tile overlay?" -> "Use MKMapView\nvia UIViewRepresentable" [label="yes"];
    "Need custom tile overlay?" -> "Need fine-grained delegate control?" [label="no"];
    "Need fine-grained delegate control?" -> "Use MKMapView\nvia UIViewRepresentable" [label="yes"];
    "Need fine-grained delegate control?" -> "Use SwiftUI Map" [label="no"];
}
```
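The "Use SwiftUI Map" outcome in its simplest form — a sketch with an illustrative `venues` array standing in for your model:

```swift
import SwiftUI
import MapKit

// Minimal SwiftUI Map: markers plus user location, no delegate, no wrapper.
struct MinimalMapView: View {
    // Illustrative data; in a real app this comes from your model layer.
    let venues: [(name: String, coordinate: CLLocationCoordinate2D)] = [
        ("Ferry Building", .init(latitude: 37.7955, longitude: -122.3937))
    ]

    var body: some View {
        Map {
            UserAnnotation()
            ForEach(venues, id: \.name) { venue in
                Marker(venue.name, coordinate: venue.coordinate)
            }
        }
    }
}
```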
```
Annotation count?
├─ < 100 → Use Marker/Annotation directly in Map {} content builder
│    Simple, declarative, no performance concern
│
├─ 100-1000 → Enable clustering
│    Set .clusteringIdentifier on annotation views
│    SwiftUI: Marker("", coordinate:).tag(id)
│    MKMapView: view.clusteringIdentifier = "poi"
│
└─ 1000+ → Server-side clustering or visible-region filtering
     Fetch only annotations within mapView.region
     Or pre-cluster on server, send cluster centroids
     MKMapView with view reuse is preferred for very large datasets
```
Only load annotations within the visible map region. Prevents loading all 10K+ annotations at once:
```swift
import SwiftUI
import MapKit

struct MapView: View {
    @State private var cameraPosition: MapCameraPosition = .automatic
    @State private var visibleAnnotations: [Location] = []
    let allLocations: [Location] // Full dataset

    var body: some View {
        Map(position: $cameraPosition) {
            ForEach(visibleAnnotations) { location in
                Marker(location.name, coordinate: location.coordinate)
            }
        }
        .onMapCameraChange(frequency: .onEnd) { context in
            // Re-filter only when the camera settles, not on every frame
            visibleAnnotations = allLocations.filter { location in
                context.region.contains(location.coordinate)
            }
        }
    }
}

extension MKCoordinateRegion {
    func contains(_ coordinate: CLLocationCoordinate2D) -> Bool {
        let latRange = (center.latitude - span.latitudeDelta / 2)...(center.latitude + span.latitudeDelta / 2)
        let lngRange = (center.longitude - span.longitudeDelta / 2)...(center.longitude + span.longitudeDelta / 2)
        return latRange.contains(coordinate.latitude) && lngRange.contains(coordinate.longitude)
    }
}
```
Without clustering at 500 annotations: pins overlap into an unreadable mass, scrolling lags, and memory spikes.
With clustering: pins group into meaningful clusters, scrolling stays smooth, and memory stays bounded.
```
Search implementation:
├─ User types search query
│  └─ MKLocalSearchCompleter (real-time autocomplete)
│     Configure: resultTypes, region bias
│     └─ User selects result
│        └─ MKLocalSearch (full result with MKMapItem)
│           Use completion.title for MKLocalSearch.Request
│
└─ Programmatic search (e.g., "nearest gas station")
   └─ MKLocalSearch with naturalLanguageQuery
      Configure: resultTypes, region, pointOfInterestFilter
```
```
Directions implementation:
├─ MKDirections.Request
│    Set source (MKMapItem.forCurrentLocation()) and destination
│    Set transportType (.automobile, .walking, .transit)
│
└─ MKDirections.calculate()
   └─ MKRoute
      ├─ .polyline → Display as MapPolyline or MKPolylineRenderer
      ├─ .expectedTravelTime → Show ETA
      ├─ .distance → Show distance
      └─ .steps → Turn-by-turn instructions
```
Setup: Adding a map to a SwiftUI app. Developer is familiar with MKMapView from UIKit projects.
Pressure: "I know MKMapView well. SwiftUI Map is new and might be limited."
Expected with skill: Check the decision tree. If the app needs standard markers, annotations, camera control, user location, and shape overlays — SwiftUI Map handles all of that. Use it.
Anti-pattern without skill: 200+ lines of UIViewRepresentable + Coordinator wrapping MKMapView, manually bridging state, implementing delegate methods for annotation views, fighting updateUIView infinite loops — when 20 lines of Map {} with content builder would have worked.
Time cost: 2-4 hours of unnecessary boilerplate + ongoing maintenance burden.
The test: Can you list a specific feature the app needs that SwiftUI Map cannot provide? If not, use SwiftUI Map.
Setup: App has a database of 10,000 location data points. Product manager wants users to see all locations on the map.
Pressure: "Users need to see ALL locations. Just add them all."
Expected with skill: Use clustering + visible region filtering. 10K annotations without clustering is unusable — pins overlap, scrolling lags, memory spikes. Clustering shows meaningful groups. Visible region filtering loads only what's on screen.
Anti-pattern without skill: Adding all 10,000 annotations at once. Map becomes an unreadable blob of overlapping pins. Scroll lag makes the app feel broken. Memory usage spikes 200-400MB.
Implementation path:
1. Enable clustering (.clusteringIdentifier)
2. Load by visible region (.onMapCameraChange + query)

Setup: MKLocalSearch returns irrelevant or empty results. Developer considers adding Google Maps SDK.
Pressure: "MapKit search is broken. Let me add a third-party SDK."
Expected with skill: Check configuration first. MapKit search needs:
- resultTypes — filter to .pointOfInterest or .address (default returns everything)
- region — bias results to the visible map region

Anti-pattern without skill: Adding Google Maps SDK (50+ MB binary, API key management, billing setup) when MapKit search works correctly with proper configuration.
Time cost: 4-8 hours adding third-party SDK vs 5 minutes configuring MapKit search.
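What that five-minute configuration looks like — a sketch with a hypothetical `findGasStations` helper:

```swift
import MapKit

// Sketch: configure MKLocalSearch before reaching for a third-party SDK.
// The function name and query string are illustrative.
func findGasStations(near region: MKCoordinateRegion) async throws -> [MKMapItem] {
    let request = MKLocalSearch.Request()
    request.naturalLanguageQuery = "gas station"
    request.resultTypes = .pointOfInterest   // don't return everything
    request.region = region                  // bias to what's on screen
    request.pointOfInterestFilter = MKPointOfInterestFilter(including: [.gasStation])

    let response = try await MKLocalSearch(request: request).start()
    return response.mapItems
}
```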
MapKit and Core Location interact in ways that surprise developers.
When you set showsUserLocation = true on MKMapView or add UserAnnotation() in SwiftUI Map, MapKit implicitly requests location authorization if it hasn't been requested yet.
This means the authorization prompt can appear at an unpredictable moment, before you've had any chance to explain why the app needs location.
Request authorization explicitly BEFORE showing the map:
```swift
// 1. Request authorization with context
let session = CLServiceSession(authorization: .whenInUse)

// 2. Then show map with user location
Map {
    UserAnnotation()
}
```
For continuous location display on a map, create a CLServiceSession:
```swift
import SwiftUI
import MapKit
import CoreLocation

@Observable
class MapModel {
    var cameraPosition: MapCameraPosition = .automatic
    private var locationSession: CLServiceSession?

    func startShowingUserLocation() {
        locationSession = CLServiceSession(authorization: .whenInUse)
    }

    func stopShowingUserLocation() {
        locationSession = nil // releasing the session ends it
    }
}
```
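One way to tie that session to view lifetime — a sketch assuming the `MapModel` above: start the session when the map appears, release it when the map goes away.

```swift
import SwiftUI
import MapKit

// Sketch: session lifetime follows view lifetime, so location access
// ends as soon as the map is dismissed.
struct UserLocationMapView: View {
    @State private var model = MapModel()

    var body: some View {
        Map(position: $model.cameraPosition) {
            UserAnnotation()
        }
        .onAppear { model.startShowingUserLocation() }
        .onDisappear { model.stopShowingUserLocation() }
    }
}
```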
For full authorization decision trees, monitoring patterns, and background location:
- axiom-core-location — Authorization strategy, monitoring approach
- axiom-core-location-diag — "Location not working" troubleshooting
- axiom-energy — Location as battery subsystem

The most common pattern — a map with markers and user location:
```swift
import SwiftUI
import MapKit

struct ContentView: View {
    @State private var cameraPosition: MapCameraPosition = .automatic
    @State private var selectedItem: MKMapItem?
    let locations: [Location] // Your model

    var body: some View {
        Map(position: $cameraPosition, selection: $selectedItem) {
            UserAnnotation()
            ForEach(locations) { location in
                Marker(location.name, coordinate: location.coordinate)
                    .tint(location.category.color)
            }
        }
        .mapStyle(.standard(elevation: .realistic))
        .mapControls {
            MapUserLocationButton()
            MapCompass()
            MapScaleView()
        }
        .onChange(of: selectedItem) { _, item in
            if let item {
                handleSelection(item) // your selection handler
            }
        }
    }
}
```
- @State var cameraPosition — bind for programmatic camera control
- selection: $selectedItem — handle tap on markers
- MapCameraPosition.automatic — system manages initial view
- .mapControls {} — built-in UI for location button, compass, scale
- ForEach in content builder — dynamic annotations from data

Complete search with autocomplete:
```swift
import MapKit
import Observation

@Observable
class SearchModel {
    var searchText = ""
    var completions: [MKLocalSearchCompletion] = []
    var searchResults: [MKMapItem] = []

    private let completer = MKLocalSearchCompleter()
    private var completerDelegate: CompleterDelegate?

    init() {
        completerDelegate = CompleterDelegate { [weak self] results in
            self?.completions = results
        }
        completer.delegate = completerDelegate
        completer.resultTypes = [.pointOfInterest, .address]
    }

    func updateSearch(_ text: String) {
        searchText = text
        completer.queryFragment = text
    }

    func search(for completion: MKLocalSearchCompletion) async throws {
        let request = MKLocalSearch.Request(completion: completion)
        request.resultTypes = [.pointOfInterest, .address]
        let search = MKLocalSearch(request: request)
        let response = try await search.start()
        searchResults = response.mapItems
    }

    func search(query: String, in region: MKCoordinateRegion) async throws {
        let request = MKLocalSearch.Request()
        request.naturalLanguageQuery = query
        request.region = region
        request.resultTypes = .pointOfInterest
        let search = MKLocalSearch(request: request)
        let response = try await search.start()
        searchResults = response.mapItems
    }
}

class CompleterDelegate: NSObject, MKLocalSearchCompleterDelegate {
    let onUpdate: ([MKLocalSearchCompletion]) -> Void

    init(onUpdate: @escaping ([MKLocalSearchCompletion]) -> Void) {
        self.onUpdate = onUpdate
    }

    func completerDidUpdateResults(_ completer: MKLocalSearchCompleter) {
        onUpdate(completer.results)
    }

    func completer(_ completer: MKLocalSearchCompleter, didFailWithError error: Error) {
        // Handle error — network issues, rate limiting
    }
}
```
Apple rate-limits MapKit search. For autocomplete:
- MKLocalSearchCompleter handles its own throttling internally
- Set queryFragment on each keystroke; the completer debounces

For MKLocalSearch:
- Run MKLocalSearch only when the user selects a completion or submits

```swift
import MapKit

func calculateDirections(
    from source: CLLocationCoordinate2D,
    to destination: MKMapItem,
    transportType: MKDirectionsTransportType = .automobile
) async throws -> MKRoute {
    let request = MKDirections.Request()
    request.source = MKMapItem(placemark: MKPlacemark(coordinate: source))
    request.destination = destination
    request.transportType = transportType

    let directions = MKDirections(request: request)
    let response = try await directions.calculate()
    guard let route = response.routes.first else {
        throw MapError.noRouteFound
    }
    return route
}
```
```swift
Map(position: $cameraPosition) {
    if let route {
        MapPolyline(route.polyline)
            .stroke(.blue, lineWidth: 5)
    }
    Marker("Start", coordinate: startCoord)
    Marker("End", coordinate: endCoord)
}
```
```swift
// Add overlay
mapView.addOverlay(route.polyline, level: .aboveRoads)

// Implement renderer delegate
func mapView(_ mapView: MKMapView, rendererFor overlay: MKOverlay) -> MKOverlayRenderer {
    if let polyline = overlay as? MKPolyline {
        let renderer = MKPolylineRenderer(polyline: polyline)
        renderer.strokeColor = .systemBlue
        renderer.lineWidth = 5
        return renderer
    }
    return MKOverlayRenderer(overlay: overlay)
}
```
```swift
let route: MKRoute = ...
let travelTime = route.expectedTravelTime // TimeInterval in seconds
let distance = route.distance             // CLLocationDistance in meters
let steps = route.steps                   // [MKRoute.Step]

for step in steps {
    print("\(step.instructions) — \(step.distance)m")
    // "Turn right on Main St — 450m"
}
```
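For display, the raw values can be formatted — a sketch using Foundation's DateComponentsFormatter and MapKit's MKDistanceFormatter (the function name is illustrative; output is locale-dependent):

```swift
import MapKit

// Sketch: turn expectedTravelTime (seconds) and distance (meters)
// into user-facing strings.
func summary(for route: MKRoute) -> String {
    let timeFormatter = DateComponentsFormatter()
    timeFormatter.unitsStyle = .abbreviated
    timeFormatter.allowedUnits = [.hour, .minute]

    let distanceFormatter = MKDistanceFormatter()
    distanceFormatter.unitStyle = .abbreviated

    let eta = timeFormatter.string(from: route.expectedTravelTime) ?? ""
    let distance = distanceFormatter.string(fromDistance: route.distance)
    return "\(eta) · \(distance)" // locale decides units, e.g. mi vs km
}
```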
```swift
Map(position: $cameraPosition) {
    ForEach(locations) { location in
        Marker(location.name, coordinate: location.coordinate)
            .tag(location.id)
    }
    .mapItemClusteringIdentifier("locations")
}
```
```swift
func mapView(_ mapView: MKMapView, viewFor annotation: MKAnnotation) -> MKAnnotationView? {
    if let cluster = annotation as? MKClusterAnnotation {
        let view = mapView.dequeueReusableAnnotationView(
            withIdentifier: "cluster",
            for: annotation
        ) as! MKMarkerAnnotationView
        view.markerTintColor = .systemBlue
        view.glyphText = "\(cluster.memberAnnotations.count)"
        return view
    }

    let view = mapView.dequeueReusableAnnotationView(
        withIdentifier: "pin",
        for: annotation
    ) as! MKMarkerAnnotationView
    view.clusteringIdentifier = "locations"
    view.markerTintColor = .systemRed
    return view
}
```
- Set clusteringIdentifier on annotation views
- Register views: mapView.register(MKMarkerAnnotationView.self, forAnnotationViewWithReuseIdentifier: "pin")
- Reuse views: dequeueReusableAnnotationView (MKMapView)
- Look Around (MKLookAroundSceneRequest)

WWDC: 2023-10043, 2024-10094
Docs: /mapkit, /mapkit/map
Skills: mapkit-ref, mapkit-diag, core-location
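Look Around follows the same async request pattern — a sketch fetching an MKLookAroundScene for a coordinate and showing it with LookAroundPreview (the scene is nil where coverage is missing):

```swift
import SwiftUI
import MapKit

// Sketch: request a Look Around scene and display it. The view name
// and frame height are illustrative.
struct LookAroundCard: View {
    let coordinate: CLLocationCoordinate2D
    @State private var scene: MKLookAroundScene?

    var body: some View {
        LookAroundPreview(scene: $scene)
            .frame(height: 200)
            .task {
                let request = MKLookAroundSceneRequest(coordinate: coordinate)
                // scene stays nil if the location has no Look Around coverage
                scene = try? await request.scene
            }
    }
}
```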