javascript - How does ShaderToy load sounds into a texture - Stack Overflow


I've been trying to do the same things shadertoy does for passing audio frequency/waveform into the shader with three.js.

https://www.shadertoy.com/view/Xds3Rr

In this example it seems that IQ is putting frequency/waveform audio data into an image and then sampling that as a texture in the shader. How would I create that audio texture in JavaScript?

To be clear, I don't need help loading the texture uniform into the shader. I just don't know how to create the audio texture from an audio file.

var texture = new THREE.Texture();

shader.uniforms = {
     iChannel0:  { type: 't', value: texture }
};

I'm guessing I'll need to somehow put audio data into the texture; I just don't know how to do that.


1 Answer


You can get audio data from the Web Audio API by creating an analyser node:

const audioContext = new window.AudioContext();
const analyser = audioContext.createAnalyser();
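
If you want a specific number of samples you can also set the analyser's fftSize first; frequencyBinCount is always half of it. This step is optional and the values here are just illustrative:

// frequencyBinCount is fftSize / 2, so this gives 512 frequency bins.
analyser.fftSize = 1024;
// Smoothing between successive frames (0 = none, 1 = max); 0.8 is the default.
analyser.smoothingTimeConstant = 0.8;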

Then create a buffer to receive the data:

const numSamples = analyser.frequencyBinCount;
const audioData = new Uint8Array(numSamples);

Then, in your render loop, get the data and put it in a texture:

// Copy the current frequency data (one byte per bin, 0 to 255) into the buffer.
analyser.getByteFrequencyData(audioData);
...
// Upload the samples as a numSamples x 1 single-channel texture.
gl.texImage2D(gl.TEXTURE_2D, 0, gl.LUMINANCE, numSamples, 1, 0,
              gl.LUMINANCE, gl.UNSIGNED_BYTE, audioData);

or, in three.js, use a DataTexture.
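
For example, here's a rough sketch of the DataTexture route. It assumes a standard three.js setup where renderer, scene, camera, and the shader material from the question already exist, and it uses THREE.LuminanceFormat, which newer three.js releases replace with THREE.RedFormat (you'd then sample the .r channel in the shader):

const audioTexture = new THREE.DataTexture(
    audioData, numSamples, 1,
    THREE.LuminanceFormat, THREE.UnsignedByteType);

shader.uniforms.iChannel0 = { value: audioTexture };

function render() {
  // Refresh the frequency data and flag the texture so three.js re-uploads it.
  analyser.getByteFrequencyData(audioData);
  audioTexture.needsUpdate = true;

  renderer.render(scene, camera);
  requestAnimationFrame(render);
}
requestAnimationFrame(render);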

That's the short version. The longer version is that the audio needs to be served from the same domain (or with the right CORS headers) or you'll run into cross-origin issues. To get data for an audio stream, like an <audio> tag's, you'd call

const source = audioContext.createMediaElementSource(audio);
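
For the analyser to see anything, that source also has to be wired through it, roughly:

// Route the element's audio through the analyser and on to the speakers;
// the analyser only receives data that actually flows through it.
source.connect(analyser);
analyser.connect(audioContext.destination);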

That doesn't work in mobile Chrome or mobile Safari at the moment, and there are bugs in Safari.

Here's a working sample
