X-Y oscilloscope music #31
Not sure it's 100% relevant, but could be interesting:
My two cents: I wouldn't use ipycanvas for that, for performance reasons. You'd get much better performance using the Web Canvas API directly in JavaScript.
Hmm
Yes, I was rather thinking of using ipycanvas to draw a shape "by hand", get the canvas image, and do some computation on the Python side to convert it into an "X-Y-oscilloscope-compatible" audio signal generated with ipytone... However, there's some way to go before even trying something like that! Do you know if the Canvas API is performant enough for responsive drawing at the audio sample rate?
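The Python-side conversion mentioned above could be sketched roughly like this: resample a hand-drawn polyline of pixel coordinates into a stereo buffer, with X on the left channel and Y on the right. All names here are hypothetical, and this is plain linear interpolation rather than anything ipytone provides:

```python
# Hedged sketch: turn a hand-drawn canvas path into a stereo X-Y buffer.
# Function and parameter names are assumptions, not ipytone/ipycanvas API.

def path_to_xy_signal(points, n_samples, width, height):
    """Resample a polyline of (x, y) pixel points into n_samples stereo
    samples in [-1, 1], ready to feed the X (left) and Y (right) channels."""
    left, right = [], []
    for i in range(n_samples):
        # Position along the path, measured in segments.
        t = i * (len(points) - 1) / (n_samples - 1)
        k = min(int(t), len(points) - 2)
        frac = t - k
        x0, y0 = points[k]
        x1, y1 = points[k + 1]
        x = x0 + frac * (x1 - x0)
        y = y0 + frac * (y1 - y0)
        # Map pixels to [-1, 1]; flip Y so "up" on screen is positive.
        left.append(2 * x / width - 1)
        right.append(1 - 2 * y / height)
    return left, right

# Trace a square drawn on a 200x200 canvas over one 60 Hz frame at 44.1 kHz.
square = [(0, 0), (200, 0), (200, 200), (0, 200), (0, 0)]
xs, ys = path_to_xy_signal(square, 44_100 // 60, 200, 200)
```

Looping that buffer at the audio rate would then trace the drawn shape continuously on an X-Y scope.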
The Canvas API is quite fast; as long as you don't draw tens of thousands of shapes and use the API wisely, it should be fine. Of course, WebGL shaders are faster, but they are also a lot more complicated. I am not an audio expert, though: what is the order of magnitude of the audio rate? I am not sure you want to match the audio frequency, since you'll be limited by your screen refresh rate anyway (60 fps on my laptop). You can definitely make 60 fps animations with a 2D Web canvas.
Sorry, my comment was dumb (typical audio rate is 44.1 kHz vs. a 60 Hz screen rate, haha).
It'll be wiser to start with the Canvas API then, for a basic rendering (without trying to emulate the nice rendering of an analog oscilloscope). Thanks!
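The rate mismatch discussed above works out as follows: at 44.1 kHz audio and a 60 Hz refresh, each rendered frame has to cover a whole chunk of samples rather than one sample per draw call.

```python
# Quick arithmetic on the rates mentioned in the thread.
sample_rate = 44_100   # Hz, typical audio sample rate
frame_rate = 60        # Hz, typical screen refresh rate
samples_per_frame = sample_rate // frame_rate
print(samples_per_frame)  # 735 samples to draw (or summarize) per frame
```

So a renderer would batch roughly 735 line segments per frame, well within what a 2D canvas can handle at 60 fps.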
I'd love to be able to experiment with something like this (well, perhaps less ambitious animations) in a notebook at some point.
This would require some X-Y oscilloscope widget to which we can plug in an ipytone audio node. I've found some examples either using custom shaders (https://www.shadertoy.com/view/XttSzf, https://github.com/m1el/woscope) or based on the Canvas API (https://github.com/Sean-Bradley/Oscilloscope).
I guess it shouldn't be too hard to implement a basic version in ipytone, so that we can have a very flexible emulator (code + widgets). I have zero experience with shaders / the Canvas API, though.
With both ipytone and ipycanvas, I imagine real-time drawing -> generate an audio signal from the drawing -> feed the X-Y oscilloscope emulator with the audio signal, like this: https://www.youtube.com/watch?v=AGeHwNEwbZk. :-)
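The rendering half of that pipeline could be sketched as a per-frame step: take one frame's worth of stereo samples and turn them into canvas line segments, with brightness falling off as the beam moves faster (the trick the woscope shader uses to mimic phosphor). Everything here is a hypothetical sketch, not the API of any of the linked projects:

```python
# Hedged sketch of a per-frame X-Y renderer. The exact brightness model
# of an analog scope is more involved; this is a simple approximation.
import math

def frame_segments(xs, ys, width, height):
    """Map [-1, 1] stereo samples to canvas line segments with an alpha
    that decreases with beam speed, imitating phosphor persistence."""
    segments = []
    for i in range(len(xs) - 1):
        x0 = (xs[i] + 1) * width / 2
        y0 = (1 - ys[i]) * height / 2
        x1 = (xs[i + 1] + 1) * width / 2
        y1 = (1 - ys[i + 1]) * height / 2
        speed = math.hypot(x1 - x0, y1 - y0)
        # Slower beam -> brighter trace, like phosphor on a real scope.
        alpha = min(1.0, 2.0 / (1.0 + speed))
        segments.append(((x0, y0), (x1, y1), alpha))
    return segments

# One frame of a cosine/sine pair traces a circle.
n = 735  # samples per 60 Hz frame at 44.1 kHz
xs = [math.cos(2 * math.pi * i / n) for i in range(n)]
ys = [math.sin(2 * math.pi * i / n) for i in range(n)]
segs = frame_segments(xs, ys, 400, 400)
```

Each segment would then be drawn on the canvas (or passed to a shader) with its alpha, frame after frame, as new audio chunks arrive.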
Cc @martinRenou (in case you have any thoughts / suggestions for the X-Y oscilloscope widget, that would be really helpful!).