Accessibility Setting Nuggets from iOS 18
This post is brought to you by Emerge Tools, the best way to build on mobile.
As usual, Cupertino & Friends©️ continue to lead the way in accessibility features on mobile platforms. Though they may not be marquee features meriting keynote time, they are critical to know about. Here’s some of what I found new in iOS 18 after browsing through the documentation (in no particular order).
let concatenatedThoughts = """
Though I won't cover it here, Apple also added the ability to detect whether Assistive Access is running in iOS 18. I cover that in my post on the matter here.
"""
Deep Linking to Accessibility Settings
How many times have you been testing some obscure (at least, to you) accessibility feature — but you needed to tweak some setting for it after it’s up and running? Aside from testing, what about as an end user — where it can be paramount to tweak these things efficiently?
Now, Apple has API to do just that:
import Accessibility

private func openAXSettings(for feature: AccessibilitySettings.Feature) {
    Task {
        do {
            try await AccessibilitySettings.openSettings(for: feature)
        } catch {
            print("Unable to open AX Settings for \(feature): \(error)")
        }
    }
}
Much like how we’ve had ways to direct users to specific parts of the iOS Settings app (think push notification permissions), we can now send people straight to accessibility sections. For now, there is only one destination supported — and that’s allowing an app to use Personal Voice (for use with the AVSpeechSynthesizer API).
Of course, we should expect more destinations to come in future iOS versions. There’s a reason it’s built with an enum for features and isn’t called openPersonalVoiceSettings() or something similar.
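Here’s a quick sketch of what the call site might look like for today’s single destination. The Feature case name below is an assumption on my part, so verify it against AccessibilitySettings.Feature before leaning on it:

import Accessibility
import SwiftUI

struct PersonalVoicePromptView: View {
    var body: some View {
        Button("Allow Personal Voice") {
            Task {
                // Assumed case name; check AccessibilitySettings.Feature for the shipped spelling.
                try? await AccessibilitySettings.openSettings(for: .allowAppsToRequestToUsePersonalVoice)
            }
        }
    }
}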
Respecting Blinking Cursor Settings
People have the ability to toggle the blinking cursor in text fields within Settings. Now, if you’re rolling your own text view or equivalent — you can query for it:
private func checkAXCursorPreference() -> Bool {
    let axSettings = AccessibilitySettings.self
    let prefersNonBlink = axSettings.prefersNonBlinkingTextInsertionIndicator
    return prefersNonBlink
}
As expected, there is also a corresponding accessibility notification to ping you about any changes to said preference. Gear up, it’s a mouthful:
struct AXBlinkingCursor: View {
    @State private var hideCursorBlink: Bool = false

    private static let AXCursorNoteName =
        AccessibilitySettings.prefersNonBlinkingTextInsertionIndicatorDidChangeNotification
    private static let AXCursorPublisher = NotificationCenter
        .default
        .publisher(for: AXBlinkingCursor.AXCursorNoteName)

    var body: some View {
        CustomTextView()
            .onReceive(AXBlinkingCursor.AXCursorPublisher) { _ in
                hideCursorBlink = prefersNonBlinkingCursor()
            }
    }

    private func prefersNonBlinkingCursor() -> Bool {
        let axSettings = AccessibilitySettings.self
        let prefersNonBlink = axSettings.prefersNonBlinkingTextInsertionIndicator
        return prefersNonBlink
    }
}
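To close the loop, the CustomTextView above is just a placeholder. Here’s one way a hand-rolled view might consume that flag; a minimal sketch, assuming you’re drawing your own caret in SwiftUI rather than using the system text views:

import SwiftUI

// Hypothetical custom input view: when blinking is disabled, keep the caret solid.
struct CustomTextView: View {
    var disableCaretBlink: Bool = false
    @State private var caretVisible = true

    var body: some View {
        HStack(spacing: 2) {
            Text("Type away")
            Rectangle() // a stand-in caret
                .frame(width: 2, height: 20)
                .opacity(disableCaretBlink || caretVisible ? 1 : 0)
                .animation(
                    disableCaretBlink ? nil : Animation.easeInOut(duration: 0.5).repeatForever(),
                    value: caretVisible
                )
                .onAppear { caretVisible.toggle() }
        }
    }
}

From there, the observer above would simply pass hideCursorBlink in as disableCaretBlink.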
SwiftUI-Specific Additions
If you have a custom tab bar in iOS, you can now indicate that to the accessibility engine. Even better, it appears backported to iOS 17:
var body: some View {
    CustomTabBar()
        .accessibilityAddTraits(.isTabBar)
}
You can also interpolate localized descriptions of a Color:
@State private var selectedColor: Color = .blue

var body: some View {
    CustomThemePicker(baseColor: $selectedColor)
        .accessibilityLabel(Text("Your custom color palette. Based on \(accessibilityName: selectedColor)"))
}
Curious how this stacks up against AXNameFromColor or UIColor.accessibilityName? Check out this useful gist from Bas.
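For a rough point of comparison, the UIKit route has looked like this since iOS 14, bridging a SwiftUI Color into UIColor to read its localized name:

import SwiftUI
import UIKit

let selectedColor: Color = .blue
// UIColor.accessibilityName returns a localized, human-readable name (e.g. "Blue").
let colorName = UIColor(selectedColor).accessibilityName

The new interpolation keeps you in Text, so the description composes with the rest of your localized label instead of requiring a detour through UIKit.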
Another neat one? You can now conditionally apply accessibility labels:
@State private var isDemoEnabled: Bool = false

var body: some View {
    VStack {
        DemoPickerView()
        Button {
            playSelectedDemo()
        } label: {
            Image(systemName: "play.fill")
        }
        .accessibilityLabel(Text("Play demo."), isEnabled: isDemoEnabled)
    }
}
And of course, here are some honorable mentions:
- Support for haptics to play along with music tracks.
- Attach custom accessibility actions to widgets or a custom view, which fire an Intent (see the sketch after this list).
- Drag and Drop improvements with VoiceOver.
- We can have more control over how we structure accessibility labels.
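On the custom actions point, here’s roughly what the shape of it could look like. This sketch assumes the new accessibilityAction overload that accepts an App Intent, with RefreshScoresIntent standing in for your own intent:

import AppIntents
import SwiftUI

// A stand-in intent; swap in whatever your widget actually needs to do.
struct RefreshScoresIntent: AppIntent {
    static var title: LocalizedStringResource = "Refresh Scores"

    func perform() async throws -> some IntentResult {
        // Kick off the refresh work here.
        return .result()
    }
}

struct ScoresWidgetView: View {
    var body: some View {
        Text("Latest Scores")
            // Assumed overload: an accessibility action that fires an App Intent,
            // which is what lets the action run from a widget.
            .accessibilityAction(named: "Refresh", intent: RefreshScoresIntent())
    }
}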
Until next time ✌️