javascript - How to convert ArrayBuffer to AudioBuffer? - Stack Overflow


I am streaming an arrayBuffer to convert to an audioBuffer in order to be able to listen to it.

I am receiving a stream via a websocket event

retrieveAudioStream(){
  this.socket.on('stream', (buffer) => {
    console.log('buffer', buffer)
  })
}

The buffer is an arrayBuffer and I need it to be an audioBuffer in order to be able to listen to it in my application.

How can I do this?


asked May 24, 2018 at 14:55 by Stretch0
  • 2 Take a look at this: developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/… – Get Off My Lawn Commented May 24, 2018 at 15:00
  • 1 According to this SO post stackoverflow.com/questions/38589614/… "AudioContext.decodeAudioData just isn't designed to decode partial files". Since my stream arrives as arrayBuffer chunks, I am not able to decode it with this method. Any other suggestions? – Stretch0 Commented May 24, 2018 at 15:39
  • 1 There is a great example that uses MediaSource (which works the same way for audio streams); I use it for streaming audio chunkwise from a 206 response, and it works very well. – Jankapunkt Commented Jun 23, 2019 at 8:37
  • This might help as well. It uses decodeAudioData to decode the ArrayBuffer into an AudioBuffer and appends the chunks together: stackoverflow.com/questions/14143652/… – Bart Van Remortele Commented Jun 27, 2019 at 8:50
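
A minimal sketch of the decodeAudioData approach suggested in the comments above. It assumes each chunk arriving on the socket is a complete, decodable audio payload (e.g. an entire WAV or MP3 blob); as noted above, decodeAudioData cannot handle partial files, so arbitrary slices of a single file will fail:

// Sketch only: each incoming chunk must be a complete, decodable audio file
const audioCtx = new AudioContext();

retrieveAudioStream(){
  this.socket.on('stream', (arrayBuffer) => {
    // Decode the ArrayBuffer into an AudioBuffer, then play it immediately
    audioCtx.decodeAudioData(arrayBuffer)
      .then((audioBuffer) => {
        const source = audioCtx.createBufferSource();
        source.buffer = audioBuffer;
        source.connect(audioCtx.destination);
        source.start();
      })
      .catch((err) => console.error('decodeAudioData failed', err));
  });
}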

2 Answers


You can use the BaseAudioContext.createBuffer() method. It is used to

create a new, empty AudioBuffer object, which can then be populated by data, and played via an AudioBufferSourceNode

See MDN for more info: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createBuffer
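
Here is a minimal sketch of that approach. It assumes the incoming chunks are raw 32-bit float PCM samples, mono, at 44100 Hz; if the stream is an encoded format such as MP3 or Ogg, this will not work and the data has to be decoded first:

// Sketch only: assumes raw Float32 PCM, one channel, 44100 Hz
const audioCtx = new AudioContext();

function playChunk(arrayBuffer) {
  const samples = new Float32Array(arrayBuffer);
  // Create an empty mono AudioBuffer and copy the samples into channel 0
  const audioBuffer = audioCtx.createBuffer(1, samples.length, 44100);
  audioBuffer.copyToChannel(samples, 0);

  // Play the AudioBuffer via an AudioBufferSourceNode
  const source = audioCtx.createBufferSource();
  source.buffer = audioBuffer;
  source.connect(audioCtx.destination);
  source.start();
}

You could then call it from your existing handler, e.g. this.socket.on('stream', (buffer) => playChunk(buffer)).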

Since you're streaming media rather than downloading the file and then decoding the audio data, AudioContext.createMediaStreamSource() will be much better suited for your use case.

Read more here: https://developer.mozilla.org/en-US/docs/Web/API/AudioContext/createMediaStreamSource
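
Note that createMediaStreamSource() takes a MediaStream, not an ArrayBuffer, so this route only applies if the audio actually arrives as a MediaStream (e.g. from getUserMedia or a WebRTC connection) rather than over the WebSocket. A minimal sketch:

// Sketch only: requires a real MediaStream, here taken from the microphone
const audioCtx = new AudioContext();

navigator.mediaDevices.getUserMedia({ audio: true })
  .then((mediaStream) => {
    // Wrap the live MediaStream in a source node and route it to the speakers
    const sourceNode = audioCtx.createMediaStreamSource(mediaStream);
    sourceNode.connect(audioCtx.destination);
  })
  .catch((err) => console.error('getUserMedia failed', err));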
