Simple Harmonic Motion is an ongoing research and series of projects exploring the nature of complex patterns created from the interaction of multilayered rhythms.
This version was designed for and shown at Ron Arad's Curtain Call at the Roundhouse.
This ultra-wide video is mapped around the 18m-wide, 8m-tall cylindrical display made from 5,600 silicone rods, allowing the audience to view it from both inside and outside. http://www.roundhouse.org.uk/?ron-arads-curtain-call
Video of the event is coming soon; photos are on Flickr.
Visuals made with openFrameworks, which also sends MIDI to Ableton Live to create the sounds.
(sounds much better with headphones, seriously)
Here 180 balls attached to (invisible) springs are bouncing, each oscillating at a steady frequency slightly different from its neighbour's. A sound is triggered each time a ball hits the floor, with the pitch proportional to the frequency of that ball's oscillation. The total loop cycle of the system lasts exactly 6 minutes.
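For the loop to close exactly, every ball must complete a whole number of oscillations within the 6-minute cycle. A minimal Python sketch of that constraint (the base cycle count is my assumption, not the project's actual value):

```python
# Sketch only: 180 balls, 6-minute loop as described above.
# BASE_CYCLES is a guess, not taken from the original project.
LOOP_SECONDS = 6 * 60    # total loop duration: 360 s
NUM_BALLS = 180
BASE_CYCLES = 60         # oscillations the slowest ball completes per loop (assumed)

def frequency(i):
    """Ball i completes BASE_CYCLES + i full oscillations per loop,
    so every ball realigns exactly at LOOP_SECONDS."""
    return (BASE_CYCLES + i) / LOOP_SECONDS   # Hz

# neighbouring balls differ by exactly one oscillation per loop
assert abs(frequency(1) - frequency(0) - 1 / LOOP_SECONDS) < 1e-12
```

Because each frequency is an integer multiple of 1/360 Hz, all 180 phases line up again at the 6-minute mark, however chaotic the pattern looks in between.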
Visuals made with Cinema4D + COFFEE (a C-like scripting language for C4D), audio with SuperCollider.
I prefer the look, sound and behaviour of the previous test (see below), though this one has interesting patterns too.
Here 30 nodes oscillate in simple harmonic motion with fixed periods, each node having a slightly different frequency. The total loop cycle lasts exactly 120 seconds (60 s for the audio).
I recently came across this beautiful video.
Fifteen pendulums, all with precisely calculated string lengths, start at the same time, slowly go out of sync, create beautiful complex rhythmic patterns, and exactly 60 seconds later come back into sync to repeat the cycle. These techniques of creating complex patterns from the interaction of multilayered rhythms have been explored by composers such as Steve Reich, György Ligeti, Brian Eno and many others; but the patterns in this particular video seemed so simple yet complex that I wondered what it would sound like. The exact periods of the pendulums are described in detail on the project page, so I was able to recreate the experiment quite precisely in Processing, shown in the video below.
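The arithmetic behind the resync is simple: if pendulum i completes a whole number of swings per 60-second cycle, its period, and therefore its string length, follows directly from the pendulum formula. A Python sketch (the figure of 51 oscillations for the slowest pendulum is the value commonly quoted for this type of demonstration; check the project page for the exact numbers):

```python
import math

# Sketch of the pendulum-wave maths. BASE_OSCILLATIONS = 51 is the commonly
# quoted figure for this kind of demo, not verified against the project page.
G = 9.81                 # m/s^2
CYCLE = 60.0             # full cycle duration in seconds
NUM_PENDULUMS = 15
BASE_OSCILLATIONS = 51   # swings of the longest pendulum per cycle (assumed)

def period(i):
    """Period of pendulum i, which completes BASE_OSCILLATIONS + i swings per cycle."""
    return CYCLE / (BASE_OSCILLATIONS + i)

def string_length(i):
    """Invert T = 2*pi*sqrt(L/g) to get the required string length in metres."""
    t = period(i)
    return G * (t / (2 * math.pi)) ** 2

# every pendulum completes a whole number of swings in 60 s, so they all resync
assert all(abs(CYCLE / period(i) - round(CYCLE / period(i))) < 1e-9
           for i in range(NUM_PENDULUMS))
```

Each successive pendulum fits exactly one more swing into the cycle, which is what produces the travelling-wave and beating patterns before everything snaps back into phase.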
I've also started playing with SuperCollider, an environment and programming language for real-time audio synthesis and algorithmic composition. As an exercise I wondered if I could re-create the demo in SuperCollider, and the source code below seems to do the job pretty nicely. I was trying to fit the code under 140 characters so I could tweet it, so I took a few shortcuts.
The sounds in the video above are NOT from SuperCollider; they are triggered from the Processing sketch as MIDI notes sent to Ableton Live. The notes in the Processing sketch are selected from a pentatonic scale. I wanted the SuperCollider code to fit in a single tweet (fewer than 140 characters), so I omitted the scale and instead picked notes spaced at minor 3rd intervals, creating a diminished 7th arpeggio. The base note is C#. Toccata and Fugue in D minor, anyone?
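The tweet-sized note selection is easy to sketch outside SuperCollider too. Assuming C#4 = MIDI note 61 (the octave is my guess), stacking minor 3rds gives the diminished 7th arpeggio:

```python
# Sketch of the note selection described above: stacking minor thirds
# (3 semitones) on C# yields a diminished 7th arpeggio (C#, E, G, A#).
# MIDI note 61 = C#4; the octave choice is an assumption.
CSHARP = 61

def note(i):
    """i-th note of the arpeggio, spaced at minor 3rd intervals."""
    return CSHARP + 3 * i

# four distinct pitch classes, then the pattern repeats an octave up
assert [note(i) % 12 for i in range(4)] == [1, 4, 7, 10]
assert note(4) == CSHARP + 12
```

This is why the shortcut works so well in so few characters: a single multiplication by 3 replaces an entire scale lookup table.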
Recently I've been playing a lot with OpenCL, the new API/framework designed to handle cross-platform parallel computing (i.e. a simple way of running code simultaneously on all cores of your CPU, GPU or other processors). Implementations have been cropping up this year in NVIDIA and ATI drivers, but most famously it's included with Mac OS X 10.6 Snow Leopard.
To cut a long story short, I've been working on a simple-to-use C++ wrapper for some of the most common functions, imaginatively called ofxOpenCL, and here is a little demo of 1 million particles running at 100-200fps.
NOTE: The Vimeo compression destroys most of the particles, so I suggest downloading the QuickTime file directly from the Vimeo page at http://www.vimeo.com/7332496
This is 1,000,000 particles interacted on by the mouse and updated on the GPU (with springy behaviours) via an OpenCL kernel, with the data written straight to a VBO and rendered - without ever coming back to the host (i.e. main memory + CPU).
Frame-rate is around 100-200fps running on a MacBook Pro with a GeForce 9600GT. That's 100-200fps on a laptop! (albeit a pretty decent one). I'm dying to try this on a GeForce GTX 285 - which has 7.5x the number of cores, 2.5x the fillrate and 3.5x the memory bandwidth - for only £250!!
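The kernel itself isn't shown here, but the kind of per-particle "springy" update it runs in parallel can be sketched in plain Python. All constants and the home-position behaviour are my assumptions, not the demo's actual code:

```python
# Rough sketch of a per-particle springy update of the kind an OpenCL kernel
# would run once per particle. Constants are illustrative guesses.
DAMPING = 0.95
STIFFNESS = 0.05

def update_particle(px, py, vx, vy, hx, hy, dt=1.0):
    """Spring the particle back towards its home position (hx, hy)."""
    ax = (hx - px) * STIFFNESS          # Hooke's law pull towards home
    ay = (hy - py) * STIFFNESS
    vx = (vx + ax * dt) * DAMPING       # integrate velocity, apply damping
    vy = (vy + ay * dt) * DAMPING
    return px + vx * dt, py + vy * dt, vx, vy

# a displaced particle oscillates and settles back towards home
x, y, vx, vy = 10.0, 0.0, 0.0, 0.0
for _ in range(200):
    x, y, vx, vy = update_particle(x, y, vx, vy, 0.0, 0.0)
assert abs(x) < 0.1
```

The point of the OpenCL version is that this tiny function runs for all million particles at once on the GPU, and the resulting positions go straight into a VBO for rendering without a round trip to the CPU.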
This is a set of C++ classes for solving and displaying real-time fluid dynamics simulations based on the Navier-Stokes equations and Jos Stam's paper on Real-Time Fluid Dynamics for Games. The solver class has no dependencies on openFrameworks and can be used in any C++ project. The drawer class extends ofBaseDraws and contains an ofTexture for seamless integration with openFrameworks drawing routines. Also included in the addon is an ofxMSAParticleUpdater class which allows the fluid solver to be easily plugged into ofxMSAPhysics as a force field.
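For a flavour of what the solver does internally, here is a minimal Python sketch of the diffusion step from Stam's paper (Gauss-Seidel relaxation on an N x N grid with a one-cell boundary). This is an illustration of the technique, not the addon's code:

```python
# Minimal sketch of the diffusion step from Jos Stam's "Real-Time Fluid
# Dynamics for Games": implicit diffusion solved by Gauss-Seidel relaxation.
def diffuse(x, x0, n, diff, dt, iters=20):
    """Diffuse field x0 into x over one timestep on an n x n interior grid."""
    a = dt * diff * n * n
    for _ in range(iters):
        for i in range(1, n + 1):
            for j in range(1, n + 1):
                x[i][j] = (x0[i][j] + a * (x[i - 1][j] + x[i + 1][j] +
                                           x[i][j - 1] + x[i][j + 1])) / (1 + 4 * a)
    return x

# a density spike spreads to its neighbours over one step
n = 4
x0 = [[0.0] * (n + 2) for _ in range(n + 2)]
x = [[0.0] * (n + 2) for _ in range(n + 2)]
x0[2][2] = 1.0
diffuse(x, x0, n, diff=0.1, dt=0.1)
```

The implicit formulation is what makes the solver stable at any timestep, which is the whole appeal of Stam's method for real-time use.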
ofxMSAPhysics is a C++ 3D particle/constraint-based physics library for openFrameworks. It uses an API very similar to the traer.physics library for Processing, to make getting into it as easy as possible.
Version 2.0a is now available for testing.
Main features include:
attractions (+ve or -ve)
replay saving and loading from disk (temporarily disabled in the current alpha release)
custom particles (extend ofxMSAParticle and add to the system)
custom constraints (extend ofxMSAConstraint and add to the system)
custom force fields (extend ofxMSAParticleUpdater and add to the system)
custom drawing (extend ofxMSAParticleDrawer and add to the system)
"Reincarnation" is an offshoot from a visual performance for the Rambert Dance Company's "Iatrogenesis" at the Queen Elizabeth Hall, South Bank, London UK - visuals for the latter directed by my good friends at flat-e.
The project began when I started working with footage of the Rambert dancers. Inspired by the footage, I wrote custom software to track the motion and generate the visuals you see in the video below... and "Reincarnation" was born.
For the Rambert's "Iatrogenesis" piece, I created similar visuals: abstract visual layers containing subtle hints of human forms and motion, which tie in with the movement of the dancers on stage. These were composited by flat-e with various other layers and projected onto a see-through gauze in front of the stage.
The video (and music) you see below is not representative of the visuals (and music) of the Rambert's and flat-e's "Iatrogenesis". This is just a standalone piece born from working with the Rambert choreography.
When the clip starts, you probably won't recognize a human shape at first, but your eyes and mind will be searching, seeking mental connections between abstract shapes and recognizable patterns, like looking for shapes in clouds. You'll be questioning what you see: is that it? Is it sitting? Is it crouching? Is it kneeling? Then all of a sudden, it'll be crystal clear. Then you'll try and keep it in focus, following it as it moves around, tracking each limb, using the motion to construct an image of the parts you can't see. It'll fade in and out of clarity. At times you'll be clinging onto just the tip of its hand swinging round, trying to identify any other recognizable parts. You might see another arm or leg and grab onto it, fighting not to lose it. Then it'll be crystal clear again, and then all of a sudden vanish, literally in a puff of smoke, and your eyes will start searching again.
VDMX unfortunately doesn't have quad warping built in, but it does have beautiful integration with Quartz Composer - allowing me to build a quad warper in QC using a GLSL vertex shader, which should be super fast.
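The idea behind a quad warper is simple to sketch: map the normalized coordinates of the unit square onto an arbitrary quad, which a GLSL vertex shader can do per vertex. Here is a bilinear version in Python (a perspective-correct warp would use a homography instead; the corner ordering is my assumption):

```python
# Bilinear quad warp sketch: interpolate along the top and bottom edges,
# then between them. Corner order is an assumed convention, not QC's.
def warp(u, v, quad):
    """quad = [top-left, top-right, bottom-right, bottom-left] as (x, y);
    u, v are normalized coordinates in [0, 1]."""
    (ax, ay), (bx, by), (cx, cy), (dx, dy) = quad
    top = (ax + (bx - ax) * u, ay + (by - ay) * u)
    bottom = (dx + (cx - dx) * u, dy + (cy - dy) * u)
    return (top[0] + (bottom[0] - top[0]) * v,
            top[1] + (bottom[1] - top[1]) * v)

corners = [(0, 0), (100, 10), (110, 90), (-5, 80)]
assert warp(0, 0, corners) == (0, 0)     # corners map to corners
assert warp(1, 1, corners) == (110, 90)
```

Doing this in a vertex shader means the warp costs essentially nothing per frame, since the GPU is interpolating vertices anyway.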
Also, around the 4:30 mark you'll see me masking the video on the box in the back. This also uses a custom Quartz Composition which allows 4-point mask creation. Usage is almost identical to the QuadWarper, but instead of warping the image it just applies a mask; or you can invert the mask and it cuts a chunk out. You could do the same by creating new layers, grouping, using layer masks etc., but I think that's a bit more hassle. Using the QuadMask is a lot quicker, and you can put multiple QuadMasks on the same layer to draw more complex masks.
This is a demo of a Traer-like physics library for C++/openFrameworks.
I wrote the lib for a little project with Todd Vanderlin while in Linz, Austria at Ars Electronica 2008 (vimeo.com/1707467). I tried to keep the same API as Traer (so Processing code using Traer can easily be ported to oF); it is basically a particle system with springs, attractions, gravity etc., but uses Verlet integration instead of RK4.
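Position Verlet, the integration scheme mentioned above, derives the next position from the current and previous positions plus acceleration, with no explicit velocity state. A minimal 1D sketch (an illustration of the scheme, not the library's code):

```python
# Position Verlet: x_{n+1} = 2*x_n - x_{n-1} + a*dt^2.
# Velocity is implicit in the pair (current, previous) of positions.
def verlet_step(pos, prev_pos, accel, dt):
    """Return (new current, new previous) positions after one step."""
    new_pos = 2 * pos - prev_pos + accel * dt * dt
    return new_pos, pos

# constant acceleration (gravity) from rest: should approach s = 0.5*g*t^2
g, dt = -9.81, 0.01
pos, prev = 0.0, 0.0
for _ in range(100):                  # simulate 1 second
    pos, prev = verlet_step(pos, prev, g, dt)
assert abs(pos - 0.5 * g) < 0.1       # close to the analytic -4.905 m
```

The appeal for a particle/constraint system is that constraints can simply move positions and the implicit velocity follows along, which is much harder to do cleanly with RK4.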
In this demo I am interacting with the app using the keyboard.
A set of C++ template classes for doing various types of interpolation on data with any number of dimensions. You can feed the system an arbitrary number of data points (simple types like float or int, or complex types like structs and classes), then resample at any resolution, ask for the value at any percentage along the data, or just draw it - including splines in 3D.
This is useful for creating and drawing splines (in any dimensions), or creating smooth animation paths from keyframes (again, in any dimensions).
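Sampling "at any percentage along the data" can be sketched with plain linear interpolation over tuples of any dimension. The real classes also support splines; this helper is purely illustrative:

```python
# Sketch: interpolate a list of equal-length tuples (any dimension)
# at a normalized position t in [0, 1]. Linear only; the actual template
# classes also offer spline interpolation.
def sample_at(data, t):
    f = t * (len(data) - 1)            # fractional index into the data
    i = min(int(f), len(data) - 2)     # segment start (clamped for t == 1)
    u = f - i                          # position within the segment
    return tuple(a + (b - a) * u for a, b in zip(data[i], data[i + 1]))

path = [(0.0, 0.0), (1.0, 2.0), (2.0, 0.0)]   # 2D keyframes
assert sample_at(path, 0.5) == (1.0, 2.0)      # midpoint lands on middle key
assert sample_at(path, 0.25) == (0.5, 1.0)
```

Because the interpolation works component-wise over tuples, the same function handles 2D paths, 3D splines, colours, or any struct-like data with no changes.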
In this previous post I mention creating data-source plugins, or processing existing data sources, for VDMX using Quartz Composer. The specific example I gave in that post isn't that useful though; it simply computes 1 - x^2.
This is a little test using GLSL in Quartz Composer 3.0, controlled via VDMX. It's all happening in realtime and is completely audio-reactive, with no post-production or timeline animations. The potential is humongous and very exciting!!
Soundtrack "Caliper Remote" by Autechre (from LP5 - 1998)
Who needs Autechre when you have a bunch of mad girls!
P.S. I have hours of footage of this if anyone is interested :P
Another example of an AS3 3D-sprite-based particle system. This one uses my own 3D sprite engine, but I will port it to Papervision3D asap. See it here.
The source code is by no means generic enough to be considered an engine or library, but it is a work in progress. I will extend the Node3D and SpringyNode3D classes to use Papervision3D and its Billboards for the rendering, and will definitely carry on working on the particle systems. I've also been converting a lot of noise functions to AS3 and will post them soon... it's all getting quite exciting in the Flash world! :P
This article starts with the basic definitions of bits, bytes etc., goes through how data is stored in bits and bytes, and eventually discusses how the binary and hexadecimal number systems work, how to convert between them and decimal, and how that can be useful. It lays a foundation for later articles...
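As a quick taste of the conversions the article covers: one hexadecimal digit encodes exactly 4 bits, so a byte (8 bits) is always two hex digits. In Python:

```python
# One hex digit = 4 bits, so one byte = two hex digits.
value = 0b10111010           # binary literal: 128 + 32 + 16 + 8 + 2 = 186
assert value == 186
assert hex(value) == '0xba'  # 1011 -> b, 1010 -> a

# converting the other way, from strings back to integers
assert int('ba', 16) == 186
assert int('10111010', 2) == 186
assert format(186, '08b') == '10111010'   # decimal back to 8-bit binary
```

Splitting a byte into two 4-bit nibbles is exactly why hex is so convenient for reading raw memory and file dumps.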