xDA media lab, first project: GroundSpines Graffiti

xDA is a new media lab located in Coimbra, Portugal. Here’s its first project, using a customized version of (hmm 3.x), a.k.a. GroundSpines. Enjoy the show, and if you by any chance visit Coimbra, get in touch. We’ll have a beer and talk crazy!

(hmm) Arcs

I was looking at a thread on Processing’s forum (processing.org/discourse/yabb2/YaBB.pl?num=1252940559/9#9) about creating some kind of flower-like forms. Looking at the pictures, I suggested a method for creating something similar, and later on I gave it a try. So, here is an implementation of the method presented on the forum to create those kinds of forms/shapes.

The idea is to use particles that are shot in different directions at different speeds, save each particle’s path to a buffer, and then render a mesh around that path. I already had ribbon-like ways to render a mesh around a path, so the rest was easy to implement.
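The path-buffer idea can be sketched roughly like this (plain Java rather than the original Processing sketch, and all names here are hypothetical, not from the actual project): each particle integrates its position every frame and appends it to a buffer, which a ribbon renderer could later sweep a mesh around.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the buffered-path idea: a particle launched with some
// velocity records every position it visits; the stored path is what the
// ribbon mesh would be built around.
class ArcParticle {
    float x, y, z;          // current position
    float vx, vy, vz;       // velocity (direction * speed, set at launch)
    final List<float[]> path = new ArrayList<>();

    ArcParticle(float vx, float vy, float vz) {
        this.vx = vx; this.vy = vy; this.vz = vz;
    }

    // One simple Euler step; a little gravity bends the path into an arc.
    void update(float dt) {
        vy -= 9.8f * dt;
        x += vx * dt;
        y += vy * dt;
        z += vz * dt;
        path.add(new float[] { x, y, z });
    }
}

public class ArcDemo {
    public static void main(String[] args) {
        ArcParticle p = new ArcParticle(2.0f, 5.0f, 0.0f);
        for (int i = 0; i < 60; i++) p.update(1f / 30f);
        System.out.println("path points: " + p.path.size());
    }
}
```

From here, a ribbon is just a matter of walking the buffered points and extruding a cross-section perpendicular to each segment.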

Includes: sub-scatter lighting, planar shadows, ribbons and some nice random colors.

After rendering the video above, I kept changing the code and adding more goodies I thought could become interesting, thus complicating things. So, I’ve implemented a tree-like structure; in other words, each arc can give birth to new arcs. It still needs some more work: it would be nice if a child could only be born within a limited angle and couldn’t curve back toward its parent’s direction, etc., but… this is a Hmm experiment. If it becomes a product I’ll surely think about that. Meanwhile, enjoy.



Source code available here

(hmm) 3d stereoscopic view

I have played with 3D stereoscopy in the past but never got to make something good. This is still not the time, sorry, but… I think it’s worth the post and the time. So what do we have here?

The technique used is called the ‘off-axis frustum’, a.k.a. “the right way”, courtesy of Paul Bourke. If you want to read more about it, you should pay a visit to his website.

Two images are rendered from two different points of view, producing two images with small differences between them. ‘Off-axis frustum’ means the point of view might not lie on the line perpendicular to the ‘view area’. These two images are then sent to a simple shader that takes the R channel from the left-eye buffer and the GB channels from the right-eye buffer, and mixes them into a single stereo image. This is the color-glasses (anaglyph) compositing method, but it is certainly possible to just send both images down the two outputs of your video card and get the same 3D feeling (with full colors) using a head-mounted display or a multi-projector system of some kind.
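The anaglyph compositing step above is just a per-pixel channel merge. As a minimal sketch (done on the CPU over packed ARGB ints instead of in a shader, though the shader performs the same mix per pixel):

```java
// Hypothetical CPU version of the anaglyph mix described in the post:
// red from the left-eye buffer, green and blue from the right-eye buffer.
// Pixels are packed as 0xAARRGGBB.
public class AnaglyphMix {
    static int mix(int left, int right) {
        return (left & 0xFFFF0000)      // alpha + R channel from the left eye
             | (right & 0x0000FFFF);    // G + B channels from the right eye
    }

    public static void main(String[] args) {
        int left  = 0xFF801020;  // left-eye pixel  (R = 0x80)
        int right = 0xFF104060;  // right-eye pixel (G = 0x40, B = 0x60)
        System.out.printf("%08X%n", mix(left, right));
    }
}
```

Viewed through red/cyan glasses, each eye then sees only the buffer rendered from its own point of view.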

Source code available here.

You will need the ‘Vitamin’ library. Just copy it to Processing’s libraries folder as usual, and then run the project.

Lens flares in screen space

I found a post from someone on Processing’s forums asking how to render a lens flare in Processing. Since I had created that specific effect for one of my latest experiments, realtime cloud rendering, I packed it up as a single effect and made it available.

The technique is simple and is well explained at gamedev.net.
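The core of this kind of screen-space flare can be sketched as follows (plain Java, with illustrative values rather than the project's actual ones): project the light into screen space, take the line from the light through the screen centre, and place the flare sprites at fixed fractions along that line.

```java
// Hypothetical sketch of screen-space flare placement. The fractions and
// positions are illustrative, not taken from the actual project.
public class FlarePlacement {
    // Fractions along the light->centre axis where sprites sit (t > 1 lands
    // past the centre, on the opposite side of the screen from the light).
    static final float[] SPOTS = { 0.0f, 0.35f, 0.6f, 1.0f, 1.4f };

    // Returns {x, y} sprite centres given the light's projected screen position.
    static float[][] place(float lightX, float lightY, float centerX, float centerY) {
        float dx = centerX - lightX, dy = centerY - lightY;
        float[][] out = new float[SPOTS.length][2];
        for (int i = 0; i < SPOTS.length; i++) {
            out[i][0] = lightX + dx * SPOTS[i];
            out[i][1] = lightY + dy * SPOTS[i];
        }
        return out;
    }

    public static void main(String[] args) {
        // Light at the top-left of a 640x480 screen; sprites march toward
        // (and past) the screen centre.
        float[][] sprites = place(100, 80, 320, 240);
        for (float[] s : sprites) System.out.printf("(%.0f, %.0f)%n", s[0], s[1]);
    }
}
```

Each sprite then gets its own texture, size and tint, and the whole set is faded out as the light moves off-screen or gets occluded.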


Download the project here.

As always, you will need to copy Vitamin into your Processing libraries folder.