App killed by OS for using too much memory!

Hi,

I am writing an app that needs to get a video from Photos, work through the frames in the video, pass each one off to a function that processes it, and then put the newly processed frame into an Image.

The processing function could potentially take a while to run, as it is made up of a lot of CV analysis. For this reason I am keen to process only one frame at a time, to minimise the memory I need.

My code has no problem handling short videos (15 seconds), but when I pass in a 60-second video I get an error: The app “MyApp” has been killed by the operating system because it is using too much memory.

On testing, I find I can repeat runs of the 15-second video many times, which suggests that the problem is not a permanent memory leak. I am attempting to read out only one frame at a time from the video, but I wonder if something below the surface is queuing more frames up for me and, because I take a long time to process each frame, this queue is getting quite long and requiring too much memory.
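To be clear about the behaviour I am aiming for: request a frame, process it, and only then request the next one. Here is a rough sketch of that idea (the function name is illustrative), using the async image(at:) API added in iOS 16 (which my target supports) rather than the callback API my sample below actually uses:

// Sketch only: strictly serial extraction, so at most one frame is
// in flight at any time.
func extractFramesSerially(from videoURL: URL) async throws
{
    let asset = AVAsset(url: videoURL)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.requestedTimeToleranceBefore = .zero
    generator.requestedTimeToleranceAfter = .zero
    generator.appliesPreferredTrackTransform = true

    let duration = try await asset.load(.duration)
    let fps = 30.0
    let totalFrames = Int(CMTimeGetSeconds(duration) * fps)

    for i in 0..<totalFrames
    {
        let time = CMTime(seconds: Double(i) / fps, preferredTimescale: 600)

        // Await each frame, so the next request is not issued until
        // this one has been generated and processed.
        let (frame, _) = try await generator.image(at: time)
        _ = convertToGrayscale(image: frame)
    }
}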

I have pared the code down to a small app which exhibits the problem. Rather than calling the real CV processing function, it is more convenient to process each frame simply by converting it to greyscale, and to do that multiple times to increase the processing time.

Any help on this would be gratefully received!

Simon.

I am using Xcode 15.4 and my target is an iPad Air running iPadOS 17.5.1.

import SwiftUI
import AVFoundation
import PhotosUI

struct ContentView: View 
{

    @State private var myFrame: CGImage?
    @State private var pickedVideoURL: URL?
    @State private var showVideoPicker: Bool = false
    @State private var selectedItem: PhotosPickerItem?
    @State private var statusText = ""

    private let label = Text("frame")

    var body: some View 
    {
        VStack 
        {
            Button("Pick a Video")
            {
                showVideoPicker.toggle()
            }
            
            Spacer()
            
            Text("\(statusText)")

            Spacer()

            if let image = myFrame
            {
                Image(image, scale: 1.0, orientation: .up, label: label)
                    .resizable()
                    .aspectRatio(contentMode: .fit)
            }
            else
            {
                Color.blue
            }
        }
        .padding()
        .photosPicker(isPresented: $showVideoPicker, selection: $selectedItem, matching: .videos)
        .onChange(of: selectedItem)
        { oldValue, newValue in
            if let newValue
            {
                Task
                {
                    do
                    {
                        // Copy the video from Photos.
                        if let movie = try await newValue.loadTransferable(type: Video.self)
                        {
                            pickedVideoURL = movie.url

                            // Work through it.
                            extractFrames(from: movie.url)
                            {
                                print("extractFrames completion code")
                            }
                        }
                    }
                    catch
                    {
                        print("Error: \(error.localizedDescription)")
                    }
                }
            }
        }
    }
    
    func extractFrames(from videoURL: URL, completion: @escaping () -> Void)
    {
            
        let asset = AVAsset(url: videoURL)
        let imageGenerator = AVAssetImageGenerator(asset: asset)
        imageGenerator.requestedTimeToleranceBefore = .zero
        imageGenerator.requestedTimeToleranceAfter = .zero
        imageGenerator.appliesPreferredTrackTransform = true

        // Calculate frame times.
        var times: [CMTime] = []
        let duration = asset.duration
        let fps = 30.0
        let totalFrames = Int(CMTimeGetSeconds(duration) * fps)
        for i in 0..<totalFrames
        {
            let time = CMTime(seconds: Double(i) / fps, preferredTimescale: 600)
            times.append(time)
        }
        
        var frameCount = 0
        for time in times
        {
            // Generate images one at a time.
            imageGenerator.generateCGImagesAsynchronously(forTimes: [NSValue(time: time)])
            { requestedTime, image, actualTime, result, error in

                frameCount += 1
                let progress = "\(frameCount)   of   \(times.count)"

                // The handler runs on a background queue, so @State
                // properties must be updated on the main thread.
                DispatchQueue.main.async
                {
                    statusText = progress
                }

                if let error = error
                {
                    print("Error: \(error.localizedDescription)")
                }
                else if let image = image
                {
                    // Increase repeatCount to add to the processing delay.
                    let repeatCount = 10
                    var processed: CGImage?
                    for _ in 1...repeatCount
                    {
                        processed = convertToGrayscale(image: image)
                    }
                    DispatchQueue.main.async
                    {
                        myFrame = processed
                    }
                }
                else
                {
                    print("No image generated at time \(requestedTime).")
                }

                // Check if we've processed all frames.
                if frameCount >= times.count
                {
                    completion()
                }
            }
        }
    }
}


struct Video: Transferable
{
    let url: URL
    
    static var transferRepresentation: some TransferRepresentation
    {
        FileRepresentation(contentType: .movie)
        { movie in
            return .init(movie.url)
        } importing:
        { received in
            let original = received.file
            let copy = URL.documentsDirectory.appending(path: "FromPhotos.mp4")
            if FileManager.default.fileExists(atPath: copy.path())
            {
                try FileManager.default.removeItem(at: copy)
            }
            try FileManager.default.copyItem(at: original, to: copy)
            return .init(url: copy)
        }
    }
    
}

func convertToGrayscale(image: CGImage) -> CGImage?
{
    let width = image.width
    let height = image.height
    
    let colorSpace = CGColorSpaceCreateDeviceGray()
    guard let context = CGContext(data: nil, width: width, height: height, bitsPerComponent: 8, bytesPerRow: width, space: colorSpace, bitmapInfo: CGImageAlphaInfo.none.rawValue) else
    {
        return nil
    }
    
    context.draw(image, in: CGRect(x: 0, y: 0, width: width, height: height))
    
    return context.makeImage()
}

#Preview 
{
    ContentView()
}

I recommend profiling your app with Instruments to find the cause of the problem.

Choose Product > Profile in Xcode to launch Instruments. You will be asked to choose a template. Choose the Leaks template, which includes the Allocations instrument. Click the Run button to start profiling. Run your app. Click the Pause button when you are done.

Interpreting the results is the difficult part of using Instruments. The following article should help:

Measuring Your App’s Memory Usage with Instruments

One thing not mentioned in the article that could help you: press Cmd-3 to show the allocations list, then sort the list by size to find the largest allocations.
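If it would help to correlate what Instruments shows with what the app is doing, you could also log the app's physical memory footprint from code as each frame is processed. Here is a minimal sketch (the function name is mine) using the Mach task_info call with the TASK_VM_INFO flavor; phys_footprint is the figure the system's memory limit tracks:

import Darwin

// Returns the app's physical memory footprint in bytes, or nil if
// the Mach call fails.
func currentMemoryFootprint() -> UInt64?
{
    var info = task_vm_info_data_t()
    var count = mach_msg_type_number_t(MemoryLayout<task_vm_info_data_t>.size / MemoryLayout<natural_t>.size)
    let kr = withUnsafeMutablePointer(to: &info)
    {
        $0.withMemoryRebound(to: integer_t.self, capacity: Int(count))
        {
            task_info(mach_task_self_, task_flavor_t(TASK_VM_INFO), $0, &count)
        }
    }
    guard kr == KERN_SUCCESS else { return nil }
    return info.phys_footprint
}

Printing this value every few frames from the completion handler would show whether the footprint climbs steadily as pending frames accumulate.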