Visual Effect Views in SwiftUI

One of the key elements of the new UI styles introduced in iOS 8 and macOS 10.10 was the use of transparency, specifically a blurred semitransparent background through which some of the underlying content could be seen. On iOS, this is implemented using a UIVisualEffectView, while on macOS the same can be obtained using an NSVisualEffectView. The two are somewhat different in the types of inputs they take, however, and their APIs don’t quite match up with one another.

This seems like something SwiftUI could fix, though, right? Much like the way List, Form, and Button in SwiftUI each describe the same behavior but have different implementations on different operating systems. What’s more, it really sounds like an ideal modifier rather than a plain view type: you’re more likely to want to do this:

VStack {
    Text("Hello, World")
    ...
}
.visualEffect(.systemBlur)

…than this:

VisualEffect(.systemBlur) {
    VStack {
        Text("Hello, World")
        ...
    }
}

Sadly, SwiftUI doesn’t provide us with an API for this.

Happily, that gives us a chance to delve into the implementation of a nicely complex View type with multi-platform support. We can even look at the use of custom environment values and preference keys as a means of modifying a visual effect applied elsewhere in the view hierarchy.

The ideal API is the one described above, where a visual effect is applied to an existing view as a modifier. Such modifiers are actually quite straightforward to add, requiring little more than an extension on View. Since visual effects are generally used as backgrounds, ours will ultimately wrap the .background() modifier, supplying some private view type:

extension View {
    public func visualEffect(_ effect: VisualEffect) -> some View {
        background(VisualEffectView(effect: effect))
    }
}

That’s a pretty straightforward API, I think, but there’s one visible wrinkle: what’s a VisualEffect? When we look at NSVisualEffectView and UIVisualEffectView, they take quite different parameters:

macOS

open class NSVisualEffectView : NSView {
    open var material: NSVisualEffectView.Material
    open var blendingMode: NSVisualEffectView.BlendingMode
    open var state: NSVisualEffectView.State
    open var isEmphasized: Bool
}

iOS

open class UIVisualEffectView: UIView, NSSecureCoding {
    @NSCopying open var effect: UIVisualEffect?
}

open class UIVisualEffect : NSObject, NSCopying, NSSecureCoding {
}

open class UIBlurEffect : UIVisualEffect {
    public init(style: UIBlurEffect.Style)
}

open class UIVibrancyEffect: UIVisualEffect {
    public init(blurEffect: UIBlurEffect)
    public init(blurEffect: UIBlurEffect, style: UIVibrancyEffect.Style)
}

Those are quite different, and we don’t particularly want to expose that at the level of the view modifier API, so we’re going to encapsulate them inside a new VisualEffect type.

A cross-platform VisualEffect

Our VisualEffect type will be an enum, and it will contain a nested enum defining the available material types. On iOS there are five thickness-based system materials, while on macOS the underlying system uses semantic materials. Several of those semantic values refer to items SwiftUI already provides, such as menus, popovers, tooltips, and the like, so here we’ll include only the ones that lend themselves to other uses.

public enum VisualEffect: Equatable, Hashable {
    public enum Material: Equatable, Hashable {
        // Available on all platforms.
        case `default`
        
        // Only on iOS-based platforms.
        case ultraThin
        case thin
        case thick
        case chrome
        
        // Only on macOS.
        case titlebar
        case windowBackground
        case headerView(behindWindow: Bool)
        case contentBackground(behindWindow: Bool)
        case behindPageBackground(behindWindow: Bool)
    }
    
    // Some default types.
    case system
    case systemLight
    case systemDark
    
    // Configurable types.
    case adaptive(Material)
    case light(Material)
    case dark(Material)
}

Here we have the basics down. The VisualEffect type contains three catch-all cases that cover most eventualities: system uses a basic material that adapts to the current colorScheme, while systemLight and systemDark use the same basic material but fix a light or dark color scheme for their own appearance.

Beyond this, there are three primary cases: adaptive, light, and dark. The three system cases essentially map to adaptive(.default), light(.default), and dark(.default). These three allow the specification of particular material types.

The materials themselves are defined in a nested enum, Material. It contains a single case that provides a useful default on all platforms, named default, appropriately enough. The various thickness-based materials from iOS are only available on iOS (tvOS doesn’t actually support these, but we’ll just silently use .default on that platform), while macOS’s semantic materials follow. If you look at the full implementation you’ll see a lot of @available attributes marking all these platform-specific pieces; for the sake of readability I’ve left these out of the samples here.
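Purely as an illustration (the attributes shown here are my guess at the shape, not copied from the real source), the platform-specific cases end up annotated along these lines:

public enum VisualEffect: Equatable, Hashable {
    public enum Material: Equatable, Hashable {
        // Available on all platforms.
        case `default`
        
        // Hypothetical annotations, shown only to illustrate the idea.
        @available(macOS, unavailable)
        case ultraThin
        
        @available(iOS, unavailable)
        @available(tvOS, unavailable)
        case titlebar
        
        // ...and so on for the remaining platform-specific cases.
    }
    // ...
}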

So, now we have a way for clients of our API to specify an effect. How do we go from that to a platform-specific type? We’ll implement an extension on VisualEffect for each platform that vends an appropriate platform-specific value. These are all internal implementation details, and will consist exclusively of internal and private members.

iOS

We’ll sidestep the whole issue of UIVibrancyEffect for now, and leave that for a future update. Right now we’ll stick to using UIBlurEffect, and we can map the materials directly across. Since UIBlurEffect takes a single input parameter, most of the work falls to a single private property to map from our enum to the appropriate UIBlurEffect.Style.

extension VisualEffect {
    var parameters: UIVisualEffect { UIBlurEffect(style: self.blurStyle) }
    
    private var blurStyle: UIBlurEffect.Style {
        switch self {
        case .system:      return .systemMaterial
        case .systemLight: return .systemMaterialLight
        case .systemDark:  return .systemMaterialDark
        case .adaptive(let material):
            switch material {
            case .ultraThin:    return .systemUltraThinMaterial
            case .thin:         return .systemThinMaterial
            case .default:      return .systemMaterial
            case .thick:        return .systemThickMaterial
            case .chrome:       return .systemChromeMaterial
            }
        case .light(let material):
            switch material {
            case .ultraThin:    return .systemUltraThinMaterialLight
            case .thin:         return .systemThinMaterialLight
            case .default:      return .systemMaterialLight
            case .thick:        return .systemThickMaterialLight
            case .chrome:       return .systemChromeMaterialLight
            }
        case .dark(let material):
            switch material {
            case .ultraThin:    return .systemUltraThinMaterialDark
            case .thin:         return .systemThinMaterialDark
            case .default:      return .systemMaterialDark
            case .thick:        return .systemThickMaterialDark
            case .chrome:       return .systemChromeMaterialDark
            }
        }
    }
}

The three ‘default’ values are quickly handled, while the remainder have their constituent materials broken out and mapped to the appropriate UIKit values.
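To see how that mapping gets consumed, here’s a tiny sketch (it assumes the VisualEffect type and the extension above are in scope; the values in the comment follow directly from the table of cases):

// On iOS, .light(.thin) resolves to a UIBlurEffect built from
// the .systemThinMaterialLight style.
let effect: UIVisualEffect = VisualEffect.light(.thin).parameters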

tvOS

It turns out that tvOS doesn’t support the various systemMaterial styles available on iOS; only the original regular, light, and dark options are available. In the interest of code sharing between iOS and tvOS, we’ll accept the same inputs on tvOS but map them all down to those three basic styles, ignoring the attached Material when the OS doesn’t support it.

extension VisualEffect {
    var parameters: UIVisualEffect {
        switch self {
        case .adaptive, .system:   return UIBlurEffect(style: .regular)
        case .light, .systemLight: return UIBlurEffect(style: .light)
        case .dark, .systemDark:   return UIBlurEffect(style: .dark)
        }
    }
}

macOS

On macOS there’s a little more work to be done. The visual effect is defined not only by a material, but also by a choice of what content to use when calculating the blur: should it use content from within the window, like the Safari titlebar, or content from behind the window, like the Finder sidebar? Beyond this, there are no material values tied to color schemes, so specifying an explicit light or dark style requires attaching a specific NSAppearance instance. There are more appearance options than just ‘light’ and ‘dark,’ but until SwiftUI makes the various contrast options available for inspection we’ll stick to those two, via the .aqua and .darkAqua values.

All together, that’s three separate parameters that need to be passed to the underlying NSVisualEffectView. Our extension, then, will start by defining a type for that:

extension VisualEffect {
    struct NSEffectParameters {
        var material: NSVisualEffectView.Material = .contentBackground
        var blendingMode: NSVisualEffectView.BlendingMode = .behindWindow
        var appearance: NSAppearance? = nil
    }
    
    // ...
}

By declaring a structure with default values for all its properties, we simplify the declaration of the system family of effects: they simply fall back to the defaults defined by NSEffectParameters.

Next we need to extract an appropriate NSVisualEffectView.Material from our VisualEffect:

private var material: NSVisualEffectView.Material {
    switch self {
    case .system, .systemLight, .systemDark:
        return .contentBackground
    case .adaptive(let material), .light(let material), .dark(let material):
        switch material {
        case .default, .contentBackground: return .contentBackground
        case .titlebar: return .titlebar
        case .headerView: return .headerView
        case .behindPageBackground: return .underPageBackground
        case .windowBackground: return .windowBackground
        }
    }
}

This makes use of Swift’s ability to bind identically typed associated values from multiple matched cases in a single pattern. It lets us ignore whether the effect is adaptive, light, or dark and look only at the attached material, without duplicating any code.
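If you haven’t used this form of pattern matching before, here’s a minimal standalone example (the enum is purely illustrative):

enum Connection {
    case wired(host: String)
    case wireless(host: String)
    case offline
}

func hostName(for connection: Connection) -> String? {
    switch connection {
    // A single pattern binds `host` from either case, as long as every
    // listed case binds the same names with the same types.
    case .wired(let host), .wireless(let host):
        return host
    case .offline:
        return nil
    }
}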

Next is the blending mode. This is implicit in some places (titlebar uses within-window content like Safari, while windowBackground uses behind-window content like the Finder sidebar), but explicit in most:

private var blendingMode: NSVisualEffectView.BlendingMode {
    switch self {
    case .system, .systemLight, .systemDark:
        return .behindWindow
    case .adaptive(let material),
         .light(let material),
         .dark(let material):
        switch material {
        case .default, .windowBackground:
            return .behindWindow
        case .titlebar:
            return .withinWindow
        case .contentBackground(let b),
             .headerView(let b),
             .behindPageBackground(let b):
            return b ? .behindWindow : .withinWindow
        }
    }
}

With these pieces in place, we can define the parameters property itself:

var parameters: NSEffectParameters {
    switch self {
    case .system:
        return NSEffectParameters()
    case .systemLight:
        return NSEffectParameters(appearance: NSAppearance(named: .aqua))
    case .systemDark:
        return NSEffectParameters(appearance: NSAppearance(named: .darkAqua))
    case .adaptive:
        return NSEffectParameters(material: self.material,
                                  blendingMode: self.blendingMode)
    case .light:
        return NSEffectParameters(material: self.material,
                                  blendingMode: self.blendingMode,
                                  appearance: NSAppearance(named: .aqua))
    case .dark:
        return NSEffectParameters(material: self.material,
                                  blendingMode: self.blendingMode,
                                  appearance: NSAppearance(named: .darkAqua))
    }
}
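For example, tracing one of the configurable cases through these properties (the values in the comment follow directly from the code above):

// .dark(.titlebar) yields:
//   material:     .titlebar
//   blendingMode: .withinWindow   (titlebar content blends within the window)
//   appearance:   NSAppearance(named: .darkAqua)
let params = VisualEffect.dark(.titlebar).parameters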

Now we’re ready to tackle the view itself.

Defining environment values and preference keys

We’d like our VisualEffect settings to propagate downwards using the SwiftUI environment, and at the same time we’d like descendants to be able to request a change from any visual-effect ancestor when necessary. The former requires two steps; the latter just one.

To see what’s required to create new environment values, take a look at the EnvironmentValues type itself:

public struct EnvironmentValues {
    public init()
    public subscript<K>(key: K.Type) -> K.Value where K : EnvironmentKey
}

The EnvironmentValues type handily provides key-value storage for us, using types as the keys. You define a type conforming to EnvironmentKey; that type itself serves as the key, while the value it stores is described by an associated type:

public protocol EnvironmentKey {
    associatedtype Value
    static var defaultValue: Self.Value { get }
}

To house a VisualEffect within the environment then, we first need to define the key type. We’ll use a Value of VisualEffect?, and a default value of nil. Our views will interpret a nil value to mean that the visual effect should be disabled.

struct VisualEffectKey: EnvironmentKey {
    typealias Value = VisualEffect?
    static var defaultValue: Value = nil
}

To actually store a value, it needs to be written into the EnvironmentValues instance via that type’s subscript. This is easy to accomplish with an extension:

extension EnvironmentValues {
    public var visualEffect: VisualEffect? {
        get { self[VisualEffectKey.self] }
        set { self[VisualEffectKey.self] = newValue }
    }
}

The preference key type is similar to the environment key, with one key difference: preference keys are designed to be reduced from a whole subtree of values down to a single value. This allows a single ancestor to adopt one value even when several descendants each publish their own. The PreferenceKey type, then, needs to implement a reduce method. Our policy is simple: a non-nil value overrides a nil value, and once set, a non-nil value can’t be overridden.

struct VisualEffectPreferenceKey: PreferenceKey {
    typealias Value = VisualEffect?
    static var defaultValue: VisualEffect? = nil
    
    static func reduce(value: inout VisualEffect?, nextValue: () -> VisualEffect?) {
        guard value == nil else { return }
        value = nextValue()
    }
}
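To make that policy concrete, here’s how successive values combine under the reduce above (a sketch of the semantics, not code from the library):

var resolved: VisualEffect? = nil

// The first non-nil value wins…
VisualEffectPreferenceKey.reduce(value: &resolved) { VisualEffect.systemDark }
// resolved == .systemDark

// …and later values can no longer override it.
VisualEffectPreferenceKey.reduce(value: &resolved) { VisualEffect.systemLight }
// resolved is still .systemDark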

With these items in place, we’re ready to tackle the view itself.

Wrapping a platform VisualEffectView

Our approach is going to use a plain SwiftUI View type that itself wraps a platform-specific ViewRepresentable type. The outer view will hold onto a copy of the specified VisualEffect in a @State property, and will then place that value into the environment of the platform-specific view in its body. It will also install a preference monitor using the onPreferenceChange(_:perform:) modifier to pick up any changes to the effect requested by descendants. These changes will be used to update its @State property, triggering a refresh of the view and a corresponding change to the environment.

struct VisualEffectView: View {
    @State private var effect: VisualEffect?
    private let content: _PlatformVisualEffectView
    
    var body: some View {
        content
            .environment(\.visualEffect, effect)
            .onPreferenceChange(VisualEffectPreferenceKey.self) {
                self.effect = $0
            }
    }
    
    fileprivate init(effect: VisualEffect) {
        self._effect = State(wrappedValue: effect)
        self.content = _PlatformVisualEffectView()
    }
    
    // ...
}

Note that the initializer here has to initialize the State<VisualEffect?> property named _effect; the compiler would generally synthesize this, but since we’re defining our own initializer we have to reach behind the curtain a little to get things set up properly.

The _PlatformVisualEffectView will be implemented as an inner type separately for macOS and iOS/tvOS, each conforming to its platform’s representable protocol (NSViewRepresentable or UIViewRepresentable). The implementations are selected via #if os() conditions, which determine which one gets compiled.
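If you haven’t used this arrangement before, here’s a small self-contained illustration of the same #if os() pattern. The view here is a throwaway placeholder, not the _PlatformVisualEffectView from this article:

import SwiftUI

#if os(macOS)
import AppKit

private struct PlatformLabel: NSViewRepresentable {
    func makeNSView(context: Context) -> NSTextField {
        NSTextField(labelWithString: "Hello")
    }
    func updateNSView(_ nsView: NSTextField, context: Context) {}
}
#elseif os(iOS) || os(tvOS)
import UIKit

private struct PlatformLabel: UIViewRepresentable {
    func makeUIView(context: Context) -> UILabel {
        let label = UILabel()
        label.text = "Hello"
        return label
    }
    func updateUIView(_ uiView: UILabel, context: Context) {}
}
#endif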

Let’s start with iOS, which is simpler.

iOS/tvOS

private struct _PlatformVisualEffectView: UIViewRepresentable {
    func makeUIView(context: Context) -> UIVisualEffectView {
        let effect = context.environment.visualEffect ?? .system
        
        let view = UIVisualEffectView(effect: effect.parameters)
        view.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        return view
    }
    
    func updateUIView(_ uiView: UIVisualEffectView, context: Context) {
        guard let effect = context.environment.visualEffect else {
            // disable the effect
            uiView.isHidden = true
            return
        }
        
        uiView.isHidden = false
        uiView.effect = effect.parameters
    }
}

We’ve implemented the two required methods, makeUIView(context:) and updateUIView(_:context:). In both cases, the effect to use is read from the environment. In makeUIView a valid effect is required to initialize the underlying view, so a default is used if the environment contains nil. In updateUIView a nil value from the environment causes the effect view to be hidden. Any non-nil value is passed on to the underlying UIVisualEffectView instance.

macOS

On macOS there’s a little more work to do, but not much:

private struct _PlatformVisualEffectView: NSViewRepresentable {
    func makeNSView(context: Context) -> NSVisualEffectView {
        let view = NSVisualEffectView()
        view.autoresizingMask = [.width, .height]
        return view
    }
    
    func updateNSView(_ nsView: NSVisualEffectView, context: Context) {
        guard let params = context.environment.visualEffect?.parameters else {
            // disable the effect
            nsView.isHidden = true
            return
        }
        nsView.isHidden = false
        nsView.material = params.material
        nsView.blendingMode = params.blendingMode
        nsView.appearance = params.appearance
        
        // mark emphasized if it contains the first responder
        if let resp = nsView.window?.firstResponder as? NSView {
            nsView.isEmphasized = resp === nsView || resp.isDescendant(of: nsView)
        }
        else {
            nsView.isEmphasized = false
        }
    }
}

In makeNSView(context:) the setup is more straightforward than for iOS, as the NSVisualEffectView doesn’t require its effects to be specified at initialization. That means that the bulk of the work happens in updateNSView(_:context:). Here again the effect is read from the environment, and any nil value found causes the effect view to be hidden. If a value is present, its constituents—material, blendingMode, and optional appearance—are assigned to the underlying view.

Effect views on macOS also have an isEmphasized property, designed to indicate when a view contains the first responder. We hide this complexity from our SwiftUI clients by inspecting the view hierarchy to determine whether the first responder sits inside the effect view: if it does, the view adopts the emphasized appearance; if not, the emphasis is switched off.

At this point, our implementation is just about complete. All that remains is to implement the view modifier APIs for assigning an effect and for publishing an effect preference.

Wrapping it up

The actual public API may seem like an anticlimax. It ultimately consists of six lines of code, two of which are function signatures and two more closing braces. That speaks to how thoroughly the complexity of the underlying platform-specific interfaces has been tucked away out of sight: each entry point is a single one-line call, even in the internal interface.

Without further ado, here’s what you need to put it all together:

extension View {
    public func visualEffect(_ effect: VisualEffect = .system) -> some View {
        background(VisualEffectView(effect: effect))
    }
    
    public func visualEffectPreference(_ effect: VisualEffect) -> some View {
        preference(key: VisualEffectPreferenceKey.self, value: effect)
    }
}

And that’s it.
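For completeness, here’s roughly how a client might put the two modifiers to work together (the view content is just an example):

import SwiftUI

struct Sidebar: View {
    var body: some View {
        SidebarContent()
            // Back the content with the default, scheme-adaptive material.
            .visualEffect()
    }
}

struct SidebarContent: View {
    var body: some View {
        List {
            Text("Favorites")
            Text("Recents")
        }
        // Ask whichever visual-effect ancestor is listening to switch
        // to a dark material instead.
        .visualEffectPreference(.systemDark)
    }
}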

The full source, with all its documentation and copious availability attributes, is available as part of AQUI on GitHub. If you run into any trouble, please file an issue there and I’ll take a look as soon as I can.

