ImageSequencer – Build a video from a collection of images in iOS/macOS/tvOS

I’ve been working on Lapsey – an app for making beautiful timelapses – and whilst I won’t be open-sourcing the entire project, I am trying to open-source the various components of the app that feel like they might be worthwhile to others. One of the larger packages in the app is the “engine” that builds a video from a sequence of images. Enter ImageSequencer.

ImageSequencer is a well-tested Swift framework for iOS, macOS, and tvOS that allows you to create videos from a selection of images. It is a Swift Package and is available on GitHub here. It is memory efficient, which is helpful when stitching together a lot of 4K images, and allows for various options such as framerate and bitrate adjustment. The full API can be found in the ImageSequencerController interface.

Install

Go to File > Swift Packages > Add Package Dependency and add the following URL:

https://github.com/samst-one/ImageSequencer
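
Alternatively, if you’re adding ImageSequencer to another Swift Package rather than an Xcode project, you can declare it in your Package.swift. A minimal sketch – the version requirement here is an assumption, so check the repository for the latest release tag:

```swift
// swift-tools-version:5.5
import PackageDescription

let package = Package(
    name: "MyTimelapseApp",  // Hypothetical package name.
    dependencies: [
        // "1.0.0" is an assumed version; use the latest tag from the repository.
        .package(url: "https://github.com/samst-one/ImageSequencer", from: "1.0.0")
    ],
    targets: [
        .target(name: "MyTimelapseApp",
                dependencies: ["ImageSequencer"])
    ]
)
```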

Usage

  1. First, we need to import ImageSequencer into our project. We do this by importing the framework:
import ImageSequencer
  2. Next, we need to create an ImageSequencerController object. The ImageSequencerController acts as the API for the package. We use the make function on the ImageSequencerFactory to do this; note that make can throw. We also pass in the settings for the video we want to create, so the full code for creating the ImageSequencerController looks like:
let outputUrl = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!.appendingPathComponent("\(UUID().uuidString).mp4")

let renderSettings = RenderSettings(bitrate: 10000000,
                                    size: CGSize(width: 1920, height: 1080),
                                    fps: 24,
                                    outputUrl: outputUrl)

let controller = try? ImageSequencerFactory.make(settings: renderSettings)
  3. With the controller, we can now access the API. We first must start up some internal ImageSequencer processes before rendering. To do this, call:
controller.start()
  4. When you have the images you want to render into a video, call the render function below. A breakdown of the parameters is as follows.

    • Parameters:
      • images: The collection of images you wish to render in URL format.
      • didAddFrame: A closure that returns a double representing the percentage of rendering completed.
      • completion: A closure that’s called when all the images have been rendered out. It returns an optional Error.

So the code looks a bit like this:

controller?.render(images: images) { percent in

} completion: { error in

}
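
The images parameter is a collection of file URLs, and how you build that collection is up to you. As a minimal sketch – assuming your frames are JPEGs named so that alphabetical order matches frame order, and using a hypothetical helper name – you could gather them from a directory like this:

```swift
import Foundation

// Hypothetical helper: collect frame URLs from a directory, sorted by filename,
// so the frames are rendered in the intended order.
func frameURLs(in directory: URL) throws -> [URL] {
    try FileManager.default
        .contentsOfDirectory(at: directory, includingPropertiesForKeys: nil)
        .filter { $0.pathExtension.lowercased() == "jpg" }
        .sorted { $0.lastPathComponent < $1.lastPathComponent }
}
```

The sorted step matters: contentsOfDirectory makes no ordering guarantees, and an out-of-order array would produce a scrambled timelapse.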
  5. Once the completion handler has been called without an error, call the finish() method to produce the video. The video can be found at the output URL that was provided in the render settings.
controller?.finish {

}

Putting it all together

In conclusion, to render out a sequence of images, the full code is below:

let outputUrl = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!.appendingPathComponent("\(UUID().uuidString).mp4")

let renderSettings = RenderSettings(bitrate: 10000000,
                                    size: CGSize(width: 1920, height: 1080),
                                    fps: 24,
                                    outputUrl: outputUrl)

let controller = try? ImageSequencerFactory.make(settings: renderSettings)
controller?.start()

controller?.render(images: images) { percent in
    // Update the user on progress here.
} completion: { error in
    if error == nil {
        controller?.finish {
            // Video now available at the output URL provided.
        }
    }
}

A sample app is included that generates some images on the fly, and hooks them into ImageSequencer to render out. Feel free to experiment with that to see if ImageSequencer is the right tool for your project. And do let me know if you make use of it at all!
