Get all available cameras in iOS using AVCaptureDevice.DiscoverySession

I’m working on an iOS project that requires finding all of the user’s available cameras on their device. This used to be simple enough prior to iOS 10, as you could just call:

AVCaptureDevice.devices()

This would retrieve all available cameras on the user’s device. Sadly, that’s been deprecated.

The confusing world of virtual devices

With the introduction of a wide variety of cameras in iOS devices, Apple now combines certain focal lengths into one virtual camera type. For example, if you’re looking to get all the cameras on the back of a modern iPhone Pro, you could just search for the device type below, using the DiscoverySession on AVCaptureDevice:

let deviceDiscoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInTripleCamera],
                                                                mediaType: AVMediaType.video,
                                                                 position: .back)

let foundCameras = deviceDiscoverySession.devices

Of course, this only works if you know the exact device the user is on, and it could prove very tedious to check for all the known iPhone/iPad models and match them to their camera type in AVFoundation.

It gets even more cumbersome if you want to find a specific lens within that camera system. DiscoverySession can be used with both individual camera lenses and virtual cameras. So say you want to find the wide-angle lens on the back of an iPhone 15 Pro: both of the below have the potential to work, so which do you choose?

let deviceDiscoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInWideAngleCamera],
                                                                      mediaType: AVMediaType.video,
                                                                      position: .back)

let deviceDiscoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInTripleCamera],
                                                                      mediaType: AVMediaType.video,
                                                                      position: .back)

What we can do instead is ask the discovery session to search for multiple device types, in multiple positions on the device (i.e. front or back). And because the DiscoverySession returns the devices in priority order, we can also include the single camera systems, like the one on the iPad Air for example, and the OS will return the camera systems with multiple cameras closer to the top of the array.
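
To see that ordering in practice, a rough sketch like the one below (the exact device-type list here is just illustrative) prints the discovered cameras in the order the session returns them, with the multi-lens systems appearing first:

let discoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInTripleCamera,
                                                                      .builtInDualWideCamera,
                                                                      .builtInWideAngleCamera],
                                                          mediaType: AVMediaType.video,
                                                           position: .unspecified)

// Print each camera in priority order – virtual multi-lens systems should appear before single lenses.
for (index, camera) in discoverySession.devices.enumerated() {
    print("\(index): \(camera.localizedName), position: \(camera.position == .back ? "back" : "front")")
}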

Switching between lenses on a virtual device

If you opt for returning a virtual device with multiple cameras, you have the ability to switch between those cameras seamlessly by setting the zoom factor.

Doing the below on the triple camera system of the iPhone 15 Pro would switch you to the ultra-wide lens:

do {
    try device.lockForConfiguration()
    device.videoZoomFactor = 1.0
    device.unlockForConfiguration()
} catch {
    // Handle the failure to lock the device for configuration.
}
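
If you want to see which physical lenses make up a virtual device, and the zoom factors at which it switches between them, a small sketch like the one below can help (assuming `device` is a virtual camera returned by a discovery session):

if device.isVirtualDevice {
    // The physical lenses that make up this virtual camera system.
    for lens in device.constituentDevices {
        print("Lens: \(lens.localizedName)")
    }
    // The zoom factors at which the virtual device switches between those lenses,
    // e.g. two values on a triple camera system: ultra-wide -> wide, and wide -> telephoto.
    print("Switch-over zoom factors: \(device.virtualDeviceSwitchOverVideoZoomFactors)")
}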

The Solution

To find all available cameras on a modern iOS device, we use the discovery session to search for every possible camera device type. This ends up looking like the below:

let deviceDiscoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInDualCamera,
                                                                            .builtInDualWideCamera,
                                                                            .builtInTripleCamera,
                                                                            .builtInWideAngleCamera,
                                                                            .builtInTelephotoCamera,
                                                                            .builtInUltraWideCamera],
                                                                mediaType: AVMediaType.video,
                                                                 position: .unspecified)
let foundCameras = deviceDiscoverySession.devices

Virtual camera systems with more than one lens are returned near the top of the array, and you’re able to switch cameras in that camera system by adjusting the zoom factor.
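
As a quick sanity check, a sketch like the one below (using the `foundCameras` array from above) splits the results into virtual multi-lens systems and single physical lenses:

// Virtual multi-lens systems, which the discovery session places near the top of the array.
let virtualSystems = foundCameras.filter { $0.isVirtualDevice }

// Single physical lenses, such as the lone back camera on an iPad Air.
let singleLenses = foundCameras.filter { !$0.isVirtualDevice }

print("Virtual camera systems: \(virtualSystems.map(\.localizedName))")
print("Single lenses: \(singleLenses.map(\.localizedName))")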

Stitching it all together

Using all of the above, if I wanted to get all the cameras on a device and be able to select the wide-angle back camera from that array of devices, the code I’d end up using might look something like this:

func getWideAngleCamera() -> AVCaptureDevice? {
    // Start a discovery session to find all available cameras on the device.
    let deviceDiscoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInDualCamera,
                                                                                .builtInDualWideCamera,
                                                                                .builtInTripleCamera,
                                                                                .builtInWideAngleCamera,
                                                                                .builtInTelephotoCamera,
                                                                                .builtInUltraWideCamera],
                                                                    mediaType: AVMediaType.video,
                                                                     position: .unspecified)

    // Get the first camera at the back. This may either be a single camera lens or a camera system,
    // depending on the device you're using.
    guard let camera = deviceDiscoverySession.devices.first(where: { $0.position == .back }) else {
        return nil
    }

    // If the camera is a single-lens device (.builtInWideAngleCamera), we can return it directly.
    // If it's a `.builtInDualCamera` system (iPhone X class devices), we can also return it as-is,
    // because the wide-angle lens is selected by default.
    if camera.deviceType == .builtInDualCamera || camera.deviceType == .builtInWideAngleCamera {
        return camera
    }

    // For any other virtual camera system, `virtualDeviceSwitchOverVideoZoomFactors` tells us the
    // zoom factors at which the system switches between its lenses.
    //
    // The camera system starts on its widest lens (the ultra-wide, shown as 0.5x in the Camera app),
    // so the first switch-over factor in the array moves us to the wide-angle lens.
    if let wideAngleZoom = camera.virtualDeviceSwitchOverVideoZoomFactors.first {
        do {
            try camera.lockForConfiguration()
            camera.videoZoomFactor = CGFloat(truncating: wideAngleZoom)
            camera.unlockForConfiguration()
            return camera
        } catch {
            return nil
        }
    }

    return nil
}
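
And as a rough usage sketch (assuming camera permission has already been granted, and that in a real app you'd start the session off the main thread), the returned device can then be wired into a capture session:

let session = AVCaptureSession()

// Wrap the wide-angle camera in a device input and attach it to the session.
if let wideAngleCamera = getWideAngleCamera(),
   let input = try? AVCaptureDeviceInput(device: wideAngleCamera),
   session.canAddInput(input) {
    session.addInput(input)
    session.startRunning()
}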