How to continuously record Audio and play Audio from same file in Swift?
I want to create something like a hearing aid app: once I hit the "startRecording" UIButton, it should continuously record what I'm saying and simultaneously play it back to me, at the same instant, through my earphones. It's basically to help people with hearing disabilities hear the sounds of the surrounding environment better and louder through earphones.
I am trying to implement it with AVFoundation, with an AVAudioRecorder and an AVAudioPlayer working together on the same file URL (filename), inside a while loop.
I get this error on the line audioPlayer.delegate = self:
Thread 1: Fatal error: Unexpectedly found nil while unwrapping an Optional value
@IBOutlet weak var startRecording: UIButton!

var recordingSession: AVAudioSession!
var audioRecorder: AVAudioRecorder!
var audioPlayer: AVAudioPlayer!
var fileNameString: String = "test.m4a"

@IBAction func buttonPressed(_ sender: Any) {
    print("button pressed")
    let filename = getDirectory().appendingPathComponent("\(fileNameString)")

    if audioRecorder == nil { // DAF needs to be started
        let settings = [AVFormatIDKey: Int(kAudioFormatAppleLossless),
                        AVEncoderAudioQualityKey: AVAudioQuality.max.rawValue,
                        AVEncoderBitRateKey: 320000,
                        AVNumberOfChannelsKey: 1,
                        AVSampleRateKey: 12000.0] as [String: Any]
        do {
            audioRecorder = try AVAudioRecorder(url: filename, settings: settings)
            audioRecorder.delegate = self
            //audioRecorder.record()
            do {
                audioPlayer = try AVAudioPlayer(contentsOf: filename, fileTypeHint: nil)
            } catch let error {
                print("\(error)")
            }
            audioPlayer.delegate = self
            audioPlayer.prepareToPlay()
            while true {
                audioRecorder.record()
                sleep(1)
                audioPlayer.play()
            }
            //startRecording.setTitle("Stop ", for: .normal)
        } catch {
            print("failed")
        }
    } else { // DAF started, needs to stop
        audioRecorder.stop()
        audioRecorder = nil
        startRecording.setTitle("Start", for: .normal)
        playRecording()
    }
}
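For context, the recordingSession property declared above is never configured in this snippet. A typical setup before recording would look roughly like the sketch below; the category, options, and permission handling here are generic assumptions for illustration, not code taken from the question:

    // Sketch only: typical AVAudioSession setup for simultaneous record + playback.
    // Assumed to run once, before the first call to buttonPressed(_:).
    func configureRecordingSession() {
        recordingSession = AVAudioSession.sharedInstance()
        do {
            // .playAndRecord lets the recorder and the player run at the same time;
            // .defaultToSpeaker / .allowBluetooth are optional routing choices.
            try recordingSession.setCategory(.playAndRecord, mode: .default,
                                             options: [.defaultToSpeaker, .allowBluetooth])
            try recordingSession.setActive(true)
        } catch {
            print("Failed to set up the recording session: \(error)")
        }
        // Microphone access must be granted before AVAudioRecorder can capture anything.
        recordingSession.requestRecordPermission { granted in
            if !granted {
                print("Microphone permission denied")
            }
        }
    }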
Tags: ios, swift, avaudioplayer
asked Nov 16 '18 at 17:00 by Sharan Narasimhan
Writing to a file seems like an unnecessary step, take a look at AVAudioEngine.
– Craig Siemens, Nov 16 '18 at 17:13

AVAudioEngine seems to use buffers that are too long for real-time audio latencies.
– hotpaw2, Nov 16 '18 at 17:50

@CleverError Thanks, I will refer to it. Please share a link or some pseudo code if you are already familiar with this type of application using AVAudioEngine :)
– Sharan Narasimhan, Nov 16 '18 at 17:57

@hotpaw2 A little latency is fine with me as long as it gets the job done. Do you have any references for this type of app that I can make use of?
– Sharan Narasimhan, Nov 16 '18 at 17:58
1 Answer
Recording to a file with AVAudioRecorder and reading from that file to play it back will result in too much latency for real-time audio, because the API writes and reads files in fairly large blocks or buffers of samples.
A better iOS API for your purpose is the Audio Unit API with the RemoteIO Audio Unit. Using the RemoteIO Audio Unit can result in very low latencies from microphone to speaker (or headset). However, this is a C callback API, as Apple currently does not recommend using Swift inside a real-time audio context.
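As a rough illustration of the file-free route Craig Siemens suggests in the comments (not the RemoteIO Audio Unit approach recommended above), a minimal AVAudioEngine passthrough sketch would look something like this; the function name is just for illustration, and it assumes headphones are connected (to avoid feedback) and that microphone permission has already been granted:

    import AVFoundation

    func startPassthrough() throws -> AVAudioEngine {
        let session = AVAudioSession.sharedInstance()
        // Ask for small IO buffers to keep mic-to-headphone latency down
        // (the hardware may not honor 5 ms exactly).
        try session.setCategory(.playAndRecord, mode: .measurement, options: [.allowBluetooth])
        try session.setPreferredIOBufferDuration(0.005)
        try session.setActive(true)

        // Wire the microphone input straight into the output mixer: no file, no while loop.
        let engine = AVAudioEngine()
        let input = engine.inputNode
        engine.connect(input, to: engine.mainMixerNode, format: input.outputFormat(forBus: 0))

        engine.prepare()
        try engine.start() // audio now flows mic -> mixer -> headphones until engine.stop()
        return engine
    }

Keep the returned engine alive (for example in a property on the view controller) for as long as monitoring should continue; without headphones the mic-to-speaker path will feed back.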
answered Nov 16 '18 at 17:49 by hotpaw2
Latency is okay for the scope of this application. Could you please help me rectify the error in this code? I mean without the use of an audio engine. Worst comes to worst, I will take your advice and use the API.
– Sharan Narasimhan, Nov 16 '18 at 18:01