I’ve been working on Lapsey – an app for making beautiful timelapses – and whilst I won’t be open-sourcing the entire project, I am trying to open-source the components of the app that feel like they might be worthwhile to others. One of the larger packages in the app is the “engine” that builds out a video from a sequence of images. Enter ImageSequencer.
ImageSequencer is a well-tested Swift framework for iOS, macOS, and tvOS that allows you to create videos from a selection of images. It is a Swift Package and is available on GitHub here. It is memory efficient, which is helpful when stitching together a lot of 4K images, and allows for various options such as framerate and bitrate adjustment. The full API can be found in the ImageSequencerController interface.
Install
Go to File > Swift Packages > Add Package Dependency and add the following URL:
https://github.com/samst-one/ImageSequencer
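Alternatively, if you declare dependencies in a Package.swift manifest, something along these lines should work (the version here is an assumption – pin to the latest tagged release on GitHub):

dependencies: [
    // The version is illustrative; check the repository for the latest release.
    .package(url: "https://github.com/samst-one/ImageSequencer", from: "1.0.0")
]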
Usage
- First we need to import the ImageSequencer framework into our project:
import ImageSequencer
- Next we need to create an ImageSequencerController object. The ImageSequencerController acts as the API for the package. We use the make function on the ImageSequencerFactory to do this. Note that the make method can throw. We also pass in the settings for the video we want to create, so the full code for creating the ImageSequencerController looks like:
let outputUrl = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!.appendingPathComponent("\(UUID().uuidString).mp4")
let renderSettings = RenderSettings(bitrate: 10000000,
                                    size: CGSize(width: 1920,
                                                 height: 1080),
                                    fps: 24,
                                    outputUrl: outputUrl)
let controller = try? ImageSequencerFactory.make(settings: renderSettings)
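Since make can throw, you may prefer a do/catch over try? so a failure isn’t silently discarded. A minimal sketch:

do {
    let controller = try ImageSequencerFactory.make(settings: renderSettings)
    // Use the controller as shown in the following steps.
} catch {
    // Surface the error however suits your app; printing is just for illustration.
    print("Failed to create ImageSequencerController: \(error)")
}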
- With the controller, we can now access the API. We first must start up some internal ImageSequencer processes before rendering. To do this, call:
controller?.start()
- When you have the images you want to render to a video, we can call the render function below. A breakdown of the parameters is as follows:
  - images: The collection of images you wish to render, in URL format.
  - didAddFrame: A closure that returns a double representing the percentage of rendering completed.
  - completion: A closure that’s called when all the images have been rendered out. It returns an optional Error.
So the code looks a bit like this:
controller?.render(images: images) { percent in
    // didAddFrame: called as frames are added, with the percentage completed.
} completion: { error in
    // Called once all images have been rendered; error is nil on success.
}
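The images parameter expects an array of file URLs, ordered however you want the frames to appear. As a sketch of one way the images array above might be built (the frames directory and the jpg extension are assumptions for illustration):

import Foundation

let framesDirectory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!.appendingPathComponent("frames")
// Gather the frame files and sort by filename so playback order matches capture order.
let frameUrls = (try? FileManager.default.contentsOfDirectory(at: framesDirectory,
                                                              includingPropertiesForKeys: nil)) ?? []
let images = frameUrls
    .filter { $0.pathExtension == "jpg" }
    .sorted { $0.lastPathComponent < $1.lastPathComponent }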
- Once the completion handler has been called without an error, you call the finish() method to produce the video. The video can be found at the output URL that was provided in the render settings.
controller?.finish {
    // The finished video is now at the output URL from the render settings.
}
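As a quick sanity check, you can confirm that something was written to the output URL (purely illustrative):

if FileManager.default.fileExists(atPath: outputUrl.path) {
    print("Video written to \(outputUrl)")
}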
Putting it all together
In conclusion, to render out a sequence of images, the full code is below:
let outputUrl = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!.appendingPathComponent("\(UUID().uuidString).mp4")
let renderSettings = RenderSettings(bitrate: 10000000,
                                    size: CGSize(width: 1920,
                                                 height: 1080),
                                    fps: 24,
                                    outputUrl: outputUrl)
let controller = try? ImageSequencerFactory.make(settings: renderSettings)
controller?.start()
controller?.render(images: images) { percent in
    // Update the user on progress here.
} completion: { error in
    if error == nil {
        controller?.finish {
            // URL now available at output URL provided.
        }
    }
}
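If your call site uses Swift concurrency, the callback flow wraps neatly in a continuation. A minimal sketch, assuming the API behaves as described above (renderVideo is a hypothetical helper, not part of the package):

import ImageSequencer

func renderVideo(with controller: ImageSequencerController, images: [URL]) async throws {
    controller.start()
    // Bridge the callback-based render/finish flow into async/await.
    try await withCheckedThrowingContinuation { (continuation: CheckedContinuation<Void, Error>) in
        controller.render(images: images) { percent in
            // Forward progress to the UI here if needed.
        } completion: { error in
            if let error = error {
                continuation.resume(throwing: error)
            } else {
                controller.finish {
                    continuation.resume()
                }
            }
        }
    }
}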
A sample app is included that generates some images on the fly and hooks them into ImageSequencer to render out. Feel free to experiment with that to see if ImageSequencer is the right tool for your project. And do let me know if you make use of it at all!