I'm trying to integrate the FFTView from AudioKit into my app to display a nice audio visualiser.
The problem I'm facing is that the audio I'm playing comes from a 2014 library (TritonSDK), which streams the audio from somewhere to give you a radio station to listen to. So I have no control over the audio whatsoever; everything is handled by the SDK.
The SDK does, however, expose an AudioQueueRef property, but from what I can see in Apple's docs there is no way to subscribe to the audio data of a queue if you didn't create it yourself.
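The closest thing I could find in AudioToolbox is AudioQueueProcessingTapNew, which attaches a processing tap to an existing AudioQueueRef, but I'm not sure it's meant to work on a queue that the SDK created and manages internally. Roughly what I mean by "subscribing" (sketch only, untested against the SDK's queue):

import AudioToolbox

// Sketch only: attach a processing tap to an existing AudioQueueRef and pull
// the samples flowing through it (untested against TritonSDK's queue).
func attachTap(to queue: AudioQueueRef) {
    var maxFrames: UInt32 = 0
    var processingFormat = AudioStreamBasicDescription()
    var tap: AudioQueueProcessingTapRef?

    let status = AudioQueueProcessingTapNew(
        queue,
        { _, tap, frameCount, timeStamp, flags, outFrameCount, bufferList in
            // Copy the queue's current audio into bufferList so it can be inspected.
            _ = AudioQueueProcessingTapGetSourceAudio(tap, frameCount, timeStamp, flags, outFrameCount, bufferList)
        },
        nil,            // no client data
        .postEffects,   // tap the audio after the queue's effects
        &maxFrames,
        &processingFormat,
        &tap
    )
    print("AudioQueueProcessingTapNew:", status)
}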
So I thought, plan B: use AVAudioEngine to pick up the sound that's already playing, but unfortunately I don't get any data for the visualiser (I get back an array of Floats that are all 0).
Any idea what I might be doing wrong, or is what I'm attempting even possible? Thanks in advance!
My setup:
import AudioKit
import AVFAudio

final class AudioCapture: ObservableObject {
    // Node that FFTView reads from; wraps the engine's main mixer.
    @Published private(set) var node: Node

    private let engine = AVAudioEngine()

    init() {
        self.node = NodeWrapper(avAudioNode: engine.mainMixerNode)
    }

    func start() {
        do {
            try engine.start()
        } catch {
            print("Error starting AVAudioEngine: \(error.localizedDescription)")
        }
        // Re-wrap the mixer in case the engine replaced its main mixer node.
        if node.avAudioNode != engine.mainMixerNode {
            node = NodeWrapper(avAudioNode: engine.mainMixerNode)
        }
    }

    func stop() {
        engine.stop()
    }
}
// MARK: - Utility

// Minimal AudioKit Node conformance so FFTView can be pointed at an arbitrary AVAudioNode.
private final class NodeWrapper: Node {
    var connections: [Node] = []
    let avAudioNode: AVAudioNode

    init(avAudioNode: AVAudioNode) {
        self.avAudioNode = avAudioNode
    }
}
import SwiftUI
import AudioKitUI

struct RadioView: View {
    @ObservedObject var viewModel: RadioViewModel
    @Environment(\.safeAreaInsets) private var safeAreaInsets: EdgeInsets

    var body: some View {
        FFTView(viewModel.audioCapture.node)
    }
}
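For what it's worth, here's a minimal sanity check (debug-only code, not part of the app above) that taps engine.mainMixerNode directly, independently of AudioKit's FFTView, to see whether any samples reach the mixer at all:

import AVFAudio

// Debug-only sketch: tap the main mixer and print the peak sample value.
// If the SDK's audio actually reaches the engine, this should print non-zero values.
func installDebugTap(on engine: AVAudioEngine) {
    let mixer = engine.mainMixerNode
    let format = mixer.outputFormat(forBus: 0)
    mixer.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
        guard let channelData = buffer.floatChannelData else { return }
        let samples = UnsafeBufferPointer(start: channelData[0], count: Int(buffer.frameLength))
        print("peak sample:", samples.map(abs).max() ?? 0)
    }
}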