in accordance with sentiments put forth at the CreativePact hub this blog charts my first explorations into the murky mark-up of HTML 5 and the audio visual possibilities it may offer…
integrating the spinning shapes, the spiroid, the iterated-ripple noise with its pitch synch'ed amplitude modulation, and an array based solution to polyphony.
Parameters you can control in this version:
number of equidistant divisions per octave
number of octaves
octave offset
output gain
Parameters in use which could easily be mapped to mouse, keys, or inputs:
number of sides to the spinning shape
speed of spinning shape
size of spinning shape
rate of amplitude modulation
depth of amplitude modulation
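To illustrate the tuning parameters listed above, here is a minimal sketch of mapping a step index to frequency using equal divisions of the octave with an octave offset. The names (`edoFrequency`, `baseFreq`, etc.) are my own for illustration, not taken from the actual sdfah.js source:

```javascript
// Hypothetical sketch of the tuning parameters above: equal divisions
// of the octave (EDO) with a whole-octave offset. Not the real sdfah.js code.
function edoFrequency(step, divisions, octaveOffset, baseFreq) {
  // each step is 1/divisions of an octave; the offset shifts whole octaves
  return baseFreq * Math.pow(2, octaveOffset + step / divisions);
}

var f0 = edoFrequency(0, 12, 0, 440);   // base frequency: 440 Hz
var f12 = edoFrequency(12, 12, 0, 440); // one octave up: 880 Hz
```

The number of octaves parameter would then just bound the range of `step` values offered by the interface.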
I will continue to post developments in this blog as and when they happen, but this concludes my 30 day CreativePact project. Thank you for reading.
I was going to do a demo video, but Vimeo only permit one HD upload per week, so I will wait a few days for the final version before doing a screencast : )
might seem like a small step, but it took ages - much longer than it should have: there were two failed attempts at a more sophisticated behaviour on the interface before scrapping it all to implement something simpler...
didn't quite complete the integration of these two elements.
http://sdfphd.net/creativepact2010/sdfahDev_0002.html
hope i get to implement even half the stuff i have in mind over the next few days!
just to add that the synthesis technique I've used today is based on the iterated-ripple noise (IRN) pitch-evoking stimulus described by Deborah A. Hall and Christopher J. Plack in 'Pitch Processing Sites in the Human Auditory Brain', Cerebral Cortex 19/3 (March 2009), pp. 576-585
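For the curious, the delay-and-add process behind IRN can be sketched in a few lines. This is a generic reconstruction of the technique, assuming a simple feedforward delay-and-add with unity gain, not the code actually running on the page:

```javascript
// Generic iterated-ripple noise sketch: white noise passed repeatedly
// through a delay-and-add stage. Each pass reinforces periodicity at
// delaySamples, evoking a pitch at roughly sampleRate / delaySamples.
function iteratedRippleNoise(length, delaySamples, iterations, gain) {
  var buf = new Array(length);
  for (var i = 0; i < length; i++) buf[i] = Math.random() * 2 - 1; // white noise
  for (var it = 0; it < iterations; it++) {
    var out = new Array(length);
    for (var n = 0; n < length; n++) {
      var delayed = n >= delaySamples ? buf[n - delaySamples] : 0;
      out[n] = buf[n] + gain * delayed; // delay-and-add
    }
    buf = out;
  }
  return buf;
}

// ~441 Hz pitch at a 44.1 kHz sample rate (delay of 100 samples)
var irn = iteratedRippleNoise(4410, 100, 4, 1.0);
```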
I'm working on sdfah.js (Samuel D Freeman's Audio Hack JavaScript utility).
I'm taking my time to crawl through the primary examples of the Audio API and implement my own reusable library with them. So far so good. In the past hour I have reached a comfortable understanding of the function relationships at play within the mozToneGen Example which I hacked at for a few days at the start of this month.
If I do end up with a working example tonight then I will edit this post and tweet about it; if not, I will report again tomorrow.
so yes, it's just day 15 plugged into a different processing.js script, and yes there is zero error handling so the animation will hang in chrome as soon as it tries to play Audio()....
I set myself this challenge: to implement Gareth Loy's 'Simple Envelope Generator' with GUI and commentary in 2.25 hours
This here blog post is the commentary. At the time of writing these words the envelope //seems// to have been implemented. Let audio testing commence... (50min to go!)...
...it's not working!!! (30min to go)...
it works! (25min to go) but I will have to limit the behaviour to repeat the enveloped noise sample every 5sec. No gui yet, just number boxes...
I take it back; better testing shows that it is not working after all (the envelope is being applied on top of the previous envelope each time - fixing it by reseeding the noise each time the env is updated) (12min to go) ...
all working, happy enough with audio test. (10min to go) had a great idea for graphical representation but no time before midnight, would prob take 20min...
I did implement some user controls, but they didn't have very much effect on the sound, so I took a step back to version 'c' and now I'm going to go to sleep...
Here's an extract from code (view source on the page to see it all):
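In case the page source ever goes offline, here is my own minimal reconstruction of the idea: a linear attack/decay envelope applied to a fresh copy of the sample each time, which avoids the envelope-on-top-of-envelope bug described above. This is a sketch, not the code from the page:

```javascript
// Minimal attack/decay envelope sketch (my reconstruction, not the
// actual page source). attack and decay are lengths in samples.
function applyEnvelope(samples, attack, decay) {
  // applied to a fresh copy, so envelopes never stack on each other
  var out = new Array(samples.length);
  for (var n = 0; n < samples.length; n++) {
    var amp = 1;
    if (n < attack) amp = n / attack;                    // ramp up
    else if (n > samples.length - decay)
      amp = (samples.length - n) / decay;                // ramp down
    out[n] = samples[n] * amp;
  }
  return out;
}
```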
I've been at a symposium today, and not had much food due to a rough morning at home (son had really bad cramp in both legs, poor guy), so brain is tired this evening.
I did manage to write this:
which seems not to work very well on the hascanvas.com page, but seems to be ok in the embedded version here and is fine in firefox 4 and in processing → code below:
I actually hacked this together last night (post midnight), which means I now have a few hours left of today with which to look over what Scott McLaughlin did for CreativePact today, as he has been html5ing with my post from yesterday, adding in his envelopes, which I remember him talking about not working all those months ago : )
In terms of building on day14c.html, my next move would have been to implement a crude form of polyphony by instantiating multiple Audio() objects...
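The idea, roughly, would be a pool of Audio() elements allocated round-robin, one per voice. A hypothetical sketch, guarded so that the Firefox-4-only `mozSetup`/`mozWriteAudio` calls are skipped outside the browser; all names here are mine, not from any existing script:

```javascript
// Hypothetical array-based polyphony sketch: one Audio() element per
// voice, allocated round-robin. Requires Firefox 4's Audio Data API.
function makeVoicePool(numVoices, sampleRate) {
  var voices = [];
  if (typeof Audio !== 'undefined') {       // browser-only guard
    for (var i = 0; i < numVoices; i++) {
      var a = new Audio();
      a.mozSetup(1, sampleRate);            // mono voice
      voices.push(a);
    }
  }
  var next = 0;
  return {
    size: voices.length,
    play: function (samples) {
      if (!voices.length) return -1;        // no audio backend available
      var v = next;
      next = (next + 1) % voices.length;    // round-robin allocation
      voices[v].mozWriteAudio(samples);
      return v;                             // index of the voice used
    }
  };
}
```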
as it goes, who knows what's next, who ever knows?
mouse controlling audio visual synth toy thing created with html and javascript for Firefox 4
a few more techy details:
significant edits to Lee's Processing script:
the mouse position over the canvas affects the speed of the balls ±x ±y about the centre
i.e. they should stop if you hover at the middlest pixel
added a conditional to the bounce() function using a frameCount to prevent rapid firing of the audio beep()
this is important because the current implementation of writing samples to the audio API can create a backlog of commands (i.e. it will play all the samples it has been asked to play, even if you ask it to stop before you've heard them all)
a mouse click will reseed the position and speed of the balls
(but not their colour so that you can get used to which colour has which pitch)
this function also helps get out of the situation where a ball is stuck to the edge and thus beep()ing non-stop - although that can sound good at times!
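The frameCount guard mentioned above can be sketched like this; the names are illustrative, not taken from Lee's actual script:

```javascript
// Sketch of the frameCount guard: only let beep() fire if enough
// frames have passed since the last one, so the audio API never
// builds up a backlog of queued samples. Names are illustrative.
var lastBeepFrame = -Infinity;
var minFramesBetweenBeeps = 10;

function maybeBeep(frameCount, beep) {
  if (frameCount - lastBeepFrame >= minFramesBetweenBeeps) {
    lastBeepFrame = frameCount;
    beep();           // fire the beep
    return true;
  }
  return false;       // too soon after the last bounce, skip it
}
```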
as promised, have been looking over a few more examples...
Here's a nice little tutorial posted on Sept 2nd:
This very short video tutorial shows you how to generate sound with JavaScript in Firefox 4... in under 2 mins! Seven lines of JavaScript code is all it takes to begin programming audio in Firefox 4.
a few minor attempts towards this today didn't get far....
stepped up to the tweet box to admit defeat having not even taken a photo to share, and out pops a haiku (well more of a senryu, but who's heard of that? ; )
had a good day of phd report writing (not had one of those for too long again). This is good, but it did mean that I didn't do very much on the #creativepact...
I was going to have a go at a rough Karplus-Strong type thing, but by the time I'd played around with different scales of the noise() function, and normalisation of values in an array, I became distracted by wavetables with not enough time to make it sound today... (not that anyone can hear it anyway, without the special browser version!)
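For reference, the rough Karplus-Strong thing I had in mind would look something like this generic sketch (my own, under the usual textbook formulation): a noise burst recirculated through a delay line with two-point averaging feedback.

```javascript
// Generic Karplus-Strong sketch: a delay line seeded with noise, with
// two-point averaging in the feedback path. The averaging acts as a
// lowpass, decaying the burst into a plucked-string tone at roughly
// sampleRate / delayLength.
function karplusStrong(delayLength, numSamples) {
  var delay = [];
  for (var i = 0; i < delayLength; i++) delay.push(Math.random() * 2 - 1);
  var out = new Array(numSamples);
  for (var n = 0; n < numSamples; n++) {
    var first = delay.shift();
    out[n] = first;
    // feed back the average of the two oldest samples
    delay.push(0.5 * (first + delay[0]));
  }
  return out;
}

var pluck = karplusStrong(100, 44100); // ~441 Hz at a 44.1 kHz rate
```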
I did say I'd next go back and look at more examples, but that would take up all of the day's creativePact time, so instead I'll implement a different synthesis method and maybe some interactivity on the canvas too.
(it will only work in the 'nightly build' of firefox from http://nightly.mozilla.org/ -> I'm running from firefox-4.0b5pre.en-US.mac.dmg )
However, the audio visual synchrony is . . . just not there! The sound is programmed to change when the shape changes its number of sides; however, the sound remains unchanged until after the shape has changed several times. By the time the audio has changed three times, the shape has been through its entire sequence more than twice.
Tomorrow I will try again without using DSP.js to see if the a/v synch can be improved. The next day I will look at some more examples to see how people who know what they are doing are doing things : ) in other news, I was in the studio today going through some patches from the past year and made this video with an edit of one of them:
this started out as a model of a gramophone: writing and reading audio data to/from a spiral path on a 2D surface.
here the surface has been covered with noise and the spiral path's reading is being modified far beyond the original specifications.
The basis of this patch was put together in April 2010, subsequent patches have taken up similar ideas. The control sequences running to make this video were put in today (7 Sept 2010).
in the image: grey scale noise = the surface being read; red and green = pixels being read and converted to sound; blue = phase scope type analysis of the stereo output.
(as usual, sound and colour have been altered considerably by the host's compression)
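The spiral-read idea translates to code quite directly. The original is a MaxMSP/Jitter patch, but as a hypothetical sketch (parameter names mine), mapping a sample index to a point on an Archimedean spiral looks like this:

```javascript
// Hypothetical sketch of the gramophone model: map a sample index to
// (x, y) on an Archimedean spiral over a 2D surface. Reading audio
// then means sampling the surface pixel at each successive point.
// The original was a Max patch; these names are illustrative.
function spiralPoint(index, centerX, centerY, spacing, samplesPerTurn) {
  var theta = (index / samplesPerTurn) * 2 * Math.PI;  // angle travelled
  var radius = spacing * theta / (2 * Math.PI);        // grows each turn
  return {
    x: centerX + radius * Math.cos(theta),
    y: centerY + radius * Math.sin(theta)
  };
}

var p = spiralPoint(0, 256, 256, 4, 1000); // index 0 sits at the centre
```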
Today I have been playing with Processing. I perhaps should have just gone with this as the creativepact topic of choice because it has been on my to do list for about 3 years...
I have been writing the code in Processing itself -- to alleviate some of the distractions of working via JavaScript via html...
I have not written in Java before, but have used C and C++, so the learning curve is not too bad: mostly a case of responding to the errors. I gave up on a multidimensional array after three attempts, and wrote a class instead, perhaps a more useful thing to practice.
When I was finally happy that it worked in a complete enough way to call it finished for the day, an 'Assertion failure' error started flashing up. I think it's related to this:
so I am just ignoring it for now; it's definitely not caused by my code, because changing one value over and over will sometimes throw the error and sometimes not, for a particular value given...
Although I have been writing JavaScript for Max over the past few years, and while I was fairly happy using it with html prior to that, getting into these examples is proving more of a challenge than I had expected. Or maybe I knew exactly how hard I would find it, and that's why I have been putting it off. Either way I have made little progress today, despite having spent at least an hour (combined) looking at bits through the day. The reality is that sitting down for a few solid hours to progress with this is just not happening.
I'm determined to keep on with the CreativePact of documenting this endeavour daily, and I hope that there will be some actual creative outcomes at some point.
My ideas for low level synthesis fun will go on hold and I will focus on processing.js alone for a few days.
Tried swapping the ogg soundfile I put in place yesterday for a mono one, on the off-chance that that would magically fix the problem. I didn't think it would work and it didn't, so I will return to synthesis and see where we can get to with that. Tomorrow. Just not in the mood for focussing on code text.
the picture is a photo taken on Monday with a Nikon compact thing, and colour adjustment was done in Preview...
I've put a 4sec clip on vimeo: http://vimeo.com/14684892
but thought I'd also see what happens if you add a video of unusual dimensions (512 by 30 pixels) here on blogger:
what happens is that the audio gets completely wrecked by their compression, no surprise I guess. Oh, and the size of the embedded player was set by editing the html -- adding 41 pixels for their playbar thing.
from https://wiki.mozilla.org/Audio_Data_API_JS_Library
from there I went to the github page linked, and clicked 'download source' to get corbanbrook-dsp.js-0acac23.zip with the DSP.js file and a bunch of examples and stuff inside.
some of them are looking to play back an audio file, so I have created an audio folder in my copy of the directory and added an ogg.
I only have one hour to play (although I might make more time later!), therefore I will only aim to get all of the examples working by changing the audio file source to a valid path . . . the pure synthesis patches are running right away . . .
. . . so far, the audio tags are loading and playing the ogg sound file, but the processing is not happening. This might take longer than I thought.
Instead I will take a look at how the different .js files are working together in some of these examples:
• dsp.js
/*
* DSP.js - a comprehensive digital signal processing library for javascript
Processing.js is an open programming language for people who want to program images, animation, and interactions for the web without using Flash or Java applets. Processing.js uses Javascript to draw shapes and manipulate images on the HTML5 Canvas element.
My excitement was much in line with the spirit in which those three were written, and enhanced by my personal circumstances; there follows a fragmented snapshot of where I'm coming from on this:
In May 2010 I was just over halfway through the first year of a PhD project looking at sound through software for composing new music. One premise of my compositional aesthetic is that the software itself is an intrinsic part of a composition. I am a MaxMSP-aholic: this is relevant here for two reasons: first, I am looking to move away from my Max dependency, and second, I have always known that writing HTML in the late 1990s was the gateway pseudo-programming practice that led directly to developing a serious Max habit. My current research is focused on circular forms and how they manifest everywhere in music and life, but not enough in visual representations of sound. I had previously proposed a PhD project which would have looked at issues of portability and obsolescence in computer music.
I'm being vague and obtuse because I want to get on with it. One last thing to deal with though: the dreadful feeling of overreaching which sweeps across the waves of excitement. Perhaps it's enough to have said that.
On with the show
Here's a short (5min) video which I came to today via
http://www.reddit.com/r/programming/comments/d5vu6/html5_audio_demo_wow
The first comment on that reddit page points out that the Audio API is not actually a part of HTML5. We shall redress that point now and move towards what I made today with a few quotes from https://wiki.mozilla.org/Audio_Data_API
Abstract
The HTML5 specification introduces the <audio> and <video> media elements, and with them the opportunity to dramatically change the way we integrate media on the web. The current HTML5 media API provides ways to play and get limited information about audio and video, but gives no way to programatically access or create such media. We present a new Mozilla extension to this API, which allows web developers to read and write raw audio data.
[…]
API Tutorial
This API extends the HTMLMediaElement and HTMLAudioElement (e.g., affecting <video> and <audio>), and implements the following basic API for reading and writing raw audio data
[…]
Writing Audio
[…]
Complete Example: Creating a Web Based Tone Generator
This example creates a simple tone generator, and plays the resulting tone.
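The heart of the example is small: fill a buffer with sine samples and hand it to the Audio Data API. The sketch below follows that pattern in my own words (guarded so the Firefox-4-only calls are skipped elsewhere); it is not the verbatim example from the wiki:

```javascript
// Tone generator pattern from the Audio Data API example, paraphrased.
// mozSetup(channels, rate) configures the element; mozWriteAudio(buf)
// queues raw samples. Both are Firefox 4 only, hence the guard.
var sampleRate = 44100;
var frequency = 440;

function makeSineBuffer(numSamples, freq, rate) {
  var samples = new Array(numSamples);
  for (var i = 0; i < numSamples; i++) {
    samples[i] = Math.sin(2 * Math.PI * freq * i / rate); // one sine sample
  }
  return samples;
}

var buffer = makeSineBuffer(4096, frequency, sampleRate);

if (typeof Audio !== 'undefined') {   // browser-only guard
  var output = new Audio();
  output.mozSetup(1, sampleRate);     // mono at 44.1 kHz
  output.mozWriteAudio(buffer);       // returns samples actually written
}
```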
To run this example I am downloading the latest 'nightly build' of firefox from http://nightly.mozilla.org/ -- firefox-4.0b5pre.en-US.mac.dmg
And it works!
well, it works running the saved html file from the local disk: minefield seems unwilling to connect to the internet, but that is not important at this stage. I will put the examples online anyway:
This is the un-edited example: http://sdfphd.net/creativepact2010/mozToneGenExample.html
seems to be working, but it is going to get more difficult to hear objectively the changes in timbre -- it's too easy to convince oneself that what you expected to happen is happening -- I want to see the sound!!
I will get round to doing this within the browser over the next few weeks, it's part of the pact, but for now I will use SoundFlower to route the audio generated by the JavaScript in the browser into MaxMSP for analysis display with the spectroscope~ object: