I am streaming an ArrayBuffer that I need to convert to an AudioBuffer in order to play it. I am receiving the stream via a websocket event:
```
retrieveAudioStream() {
  this.socket.on('stream', (buffer) => {
    console.log('buffer', buffer);
  });
}
```
The buffer is an ArrayBuffer, and I need it to be an AudioBuffer so that I can play it in my application. How can I do this?
Asked May 24, 2018 at 14:55 by Stretch0

- Take a look at this: developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/… – Get Off My Lawn, May 24, 2018 at 15:00
- According to this SO post (stackoverflow.com/questions/38589614/…), "AudioContext.decodeAudioData just isn't designed to decode partial files". Since my stream arrives as ArrayBuffer chunks, I am not able to decode it with this method. Any other suggestions? – Stretch0, May 24, 2018 at 15:39
- There is a great example that uses MediaSource (which works the same way for audio streams); I use it for streaming audio chunk-wise from a 206 response, and it works very well. – Jankapunkt, Jun 23, 2019 at 8:37
- This might help as well. It uses decodeAudioData to decode the ArrayBuffer into an AudioBuffer and appends the chunks together: stackoverflow.com/questions/14143652/… – Bart Van Remortele, Jun 27, 2019 at 8:50
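The MediaSource approach mentioned in the comments can be sketched as follows. This is a browser-only sketch, and it assumes the server streams a container format that MediaSource can parse (here `audio/webm; codecs=opus`, an assumption); raw PCM chunks will not work with this API. The function name `createStreamingAudio` and the `'stream'` event are illustrative, matching the question's socket handler.

```javascript
// Browser-only sketch: feed websocket audio chunks into a MediaSource.
// Assumes chunks arrive in a supported container, e.g. audio/webm; codecs=opus.
function createStreamingAudio(socket) {
  const mediaSource = new MediaSource();
  const audioEl = new Audio();
  audioEl.src = URL.createObjectURL(mediaSource);

  mediaSource.addEventListener('sourceopen', () => {
    const sourceBuffer = mediaSource.addSourceBuffer('audio/webm; codecs=opus');
    socket.on('stream', (arrayBuffer) => {
      // appendBuffer throws if a previous append is still pending; a real
      // implementation should queue chunks and drain the queue on 'updateend'.
      if (!sourceBuffer.updating) {
        sourceBuffer.appendBuffer(arrayBuffer);
      }
    });
  });

  audioEl.play();
  return audioEl;
}
```

In practice you would keep a chunk queue and append from it inside the SourceBuffer's `updateend` handler, rather than dropping chunks that arrive mid-append as this sketch does.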
2 Answers
You can use the BaseAudioContext.createBuffer() method. It is used to:

"create a new, empty AudioBuffer object, which can then be populated by data, and played via an AudioBufferSourceNode"

See MDN for more info: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createBuffer
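A minimal sketch of that approach, assuming the socket delivers raw 16-bit signed PCM. The sample rate (44100 Hz) and channel count (mono) are assumptions that must match whatever the server actually sends, and `int16ToFloat32` / `playChunk` are hypothetical helper names:

```javascript
// Convert a chunk of raw 16-bit signed PCM into Float32 samples in [-1, 1).
function int16ToFloat32(arrayBuffer) {
  const int16 = new Int16Array(arrayBuffer);
  const float32 = new Float32Array(int16.length);
  for (let i = 0; i < int16.length; i++) {
    float32[i] = int16[i] / 32768;
  }
  return float32;
}

// Browser-only: build an AudioBuffer from one chunk and play it immediately.
// 1 channel at 44100 Hz is an assumption; it must match the server's format.
function playChunk(audioCtx, arrayBuffer) {
  const samples = int16ToFloat32(arrayBuffer);
  const audioBuffer = audioCtx.createBuffer(1, samples.length, 44100);
  audioBuffer.copyToChannel(samples, 0);
  const source = audioCtx.createBufferSource();
  source.buffer = audioBuffer;
  source.connect(audioCtx.destination);
  source.start();
}
```

Wired into the question's handler this would look like `this.socket.on('stream', (buffer) => playChunk(ctx, buffer))`, with `ctx = new AudioContext()` created once. Note that playing chunks back-to-back this way can produce audible gaps; the usual fix is to schedule each `source.start(when)` against `audioCtx.currentTime` so consecutive buffers line up.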
Since you're streaming media rather than downloading a file and then decoding the audio data, AudioContext.createMediaStreamSource() is much better suited to your use case.

Read more here: https://developer.mozilla.org/en-US/docs/Web/API/AudioContext/createMediaStreamSource
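One caveat worth noting: createMediaStreamSource() consumes a MediaStream (for example from getUserMedia or a WebRTC track), not an ArrayBuffer, so the audio would have to reach the page as a MediaStream for this to apply. A browser-only sketch of the API shape, with `playLiveStream` as a hypothetical function name and getUserMedia as the assumed stream source:

```javascript
// Browser-only sketch: route a live MediaStream through the Web Audio graph.
// getUserMedia is just one way to obtain a MediaStream; a WebRTC
// RTCPeerConnection audio track would be consumed the same way.
async function playLiveStream() {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const ctx = new AudioContext();
  const source = ctx.createMediaStreamSource(stream);
  const gain = ctx.createGain();
  gain.gain.value = 0.8; // arbitrary volume for the example
  source.connect(gain);
  gain.connect(ctx.destination);
  return ctx;
}
```

The advantage over the createBuffer() approach is that the browser handles buffering and timing for you; the trade-off is that you need a MediaStream in the first place, which a plain websocket of ArrayBuffer chunks does not provide.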