I’ve received an iOS app that’s using MetalKit to show raw video frames coming in from a network source. I read the pixel data in the packets into a single MTLTexture a few rows at a time, which is drawn into an MTKView each time a frame has been completely sent over the network. The app works, but only for a few seconds (a seemingly random duration), before the MTKView seemingly freezes (while packets are still being received).
Watching the debugger while my app was running revealed that the freezing of the display occurred when there was a large spike in memory. Inspecting the memory profile in Instruments revealed that the spike was related to a rapid creation of many IOSurfaces and IOAccelerators.
Being a complete newcomer to iOS programming as a whole, I’m wondering if this issue comes from a misuse of the MetalKit library. Below is the code that I am using to render the video frames themselves:
class MTKViewController: UIViewController, MTKViewDelegate {

    /// Metal texture to be drawn whenever the view controller is asked to render its view.
    private var metalView: MTKView!
    private var device = MTLCreateSystemDefaultDevice()
    private var commandQueue: MTLCommandQueue?
    private var renderPipelineState: MTLRenderPipelineState?
    private var texture: MTLTexture?
    private var networkListener: NetworkListener!
    private var textureGenerator: TextureGenerator!

    override public func loadView() {
        super.loadView()

        assert(device != nil, "Failed creating a default system Metal device. Please, make sure Metal is available on your hardware.")

        initializeMetalView()
        initializeRenderPipelineState()

        networkListener = NetworkListener()
        textureGenerator = TextureGenerator(width: streamWidth, height: streamHeight, bytesPerPixel: 4, rowsPerPacket: 8, device: device!)

        networkListener.start(port: NWEndpoint.Port(8080))
        networkListener.dataRecievedCallback = { data in
            self.textureGenerator.process(data: data)
        }
        textureGenerator.onTextureBuiltCallback = { texture in
            self.texture = texture
            self.draw(in: self.metalView)
        }

        commandQueue = device?.makeCommandQueue()
    }
    public func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {
        /// need implement?
    }

    public func draw(in view: MTKView) {
        guard
            let texture = texture,
            let _ = device
        else { return }

        let commandBuffer = commandQueue!.makeCommandBuffer()!

        guard
            let currentRenderPassDescriptor = metalView.currentRenderPassDescriptor,
            let currentDrawable = metalView.currentDrawable,
            let renderPipelineState = renderPipelineState
        else { return }

        currentRenderPassDescriptor.renderTargetWidth = streamWidth
        currentRenderPassDescriptor.renderTargetHeight = streamHeight

        let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: currentRenderPassDescriptor)!

        encoder.pushDebugGroup("RenderFrame")
        encoder.setRenderPipelineState(renderPipelineState)
        encoder.setFragmentTexture(texture, index: 0)
        encoder.drawPrimitives(type: .triangleStrip, vertexStart: 0, vertexCount: 4, instanceCount: 1)
        encoder.popDebugGroup()
        encoder.endEncoding()

        commandBuffer.present(currentDrawable)
        commandBuffer.commit()
    }
    private func initializeMetalView() {
        metalView = MTKView(frame: CGRect(x: 0, y: 0, width: streamWidth, height: streamWidth), device: device)
        metalView.delegate = self
        metalView.framebufferOnly = true
        metalView.colorPixelFormat = .bgra8Unorm
        metalView.contentScaleFactor = UIScreen.main.scale
        metalView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.insertSubview(metalView, at: 0)
    }
    /// Initializes render pipeline state with a default vertex function mapping the texture to the view's frame and a simple fragment function returning the texture pixel's value.
    private func initializeRenderPipelineState() {
        guard let device = device, let library = device.makeDefaultLibrary() else {
            return
        }

        let pipelineDescriptor = MTLRenderPipelineDescriptor()
        pipelineDescriptor.rasterSampleCount = 1
        pipelineDescriptor.colorAttachments[0].pixelFormat = .bgra8Unorm
        pipelineDescriptor.depthAttachmentPixelFormat = .invalid

        /// Vertex function to map the texture to the view controller's view
        pipelineDescriptor.vertexFunction = library.makeFunction(name: "mapTexture")
        /// Fragment function to display the texture's pixels in the area bounded by the vertices of the `mapTexture` shader
        pipelineDescriptor.fragmentFunction = library.makeFunction(name: "displayTexture")

        do {
            renderPipelineState = try device.makeRenderPipelineState(descriptor: pipelineDescriptor)
        }
        catch {
            assertionFailure("Failed creating a render state pipeline. Cannot render the texture without one.")
            return
        }
    }
}
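TextureGenerator isn't shown above; it just assembles the `rowsPerPacket` rows carried by each packet into the frame texture and fires `onTextureBuiltCallback` once a whole frame has arrived. In case the bookkeeping matters, here is a stripped-down stand-in for it (the actual `MTLTexture.replaceRegion(_:mipmapLevel:withBytes:bytesPerRow:)` upload is swapped for a plain byte array so the snippet is self-contained; everything except the parameter names above is simplified):

```swift
import Foundation

/// Simplified stand-in for the real TextureGenerator: accumulates
/// fixed-size row chunks into a frame buffer and reports each completed frame.
final class FrameAssembler {
    let width: Int, height: Int, bytesPerPixel: Int, rowsPerPacket: Int
    private var frame: [UInt8]
    private var nextRow = 0
    var onFrameBuilt: (([UInt8]) -> Void)?

    init(width: Int, height: Int, bytesPerPixel: Int, rowsPerPacket: Int) {
        self.width = width; self.height = height
        self.bytesPerPixel = bytesPerPixel; self.rowsPerPacket = rowsPerPacket
        frame = [UInt8](repeating: 0, count: width * height * bytesPerPixel)
    }

    /// Each packet carries `rowsPerPacket` rows of pixels. The real class
    /// calls texture.replaceRegion(...) with these bytes instead of copying
    /// into a Swift array.
    func process(data: [UInt8]) {
        let bytesPerRow = width * bytesPerPixel
        let offset = nextRow * bytesPerRow
        frame.replaceSubrange(offset..<offset + data.count, with: data)
        nextRow += rowsPerPacket
        if nextRow >= height {          // full frame received
            onFrameBuilt?(frame)
            nextRow = 0
        }
    }
}
```

The callback at the end of `process(data:)` is what ends up invoking `self.draw(in: self.metalView)` in the view controller above.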
My question is simply: what gives?
EDIT: Profiling CPU usage shows that CAMetalLayerPrivateNextDrawableLocked is what happens during this rapid creation of surfaces. What does this function do?