POST IN PROGRESS: Some nice experiments using Kineme's OpenGL and 3D tools… onedotzero
Great when sound activated – good for graphing!
A harmonograph generator made with Quartz Composer. Using a transparent PNG as a texture, the composition modulates the x, y position with sin / cos waves while the image scales. Expensive and a little slow, but the process of watching the images being made is nice. Some of the final images – produced by changing the x, y position, the wave parameters, switching presets, alpha and colours (using a MIDI controller) – were usable as stills and textural elements.
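The sin / cos modulation can be sketched outside Quartz Composer. Here is a minimal harmonograph trace in Python – the frequencies, phases and decay constant are illustrative, not the values used in the composition:

```python
import math

def harmonograph_points(steps=2000, dt=0.01,
                        fx=2.0, fy=3.0, px=0.0, py=math.pi / 2,
                        decay=0.02):
    """Trace of a simple two-pendulum harmonograph:
    x driven by a damped sine, y by a damped cosine."""
    points = []
    for i in range(steps):
        t = i * dt
        damp = math.exp(-decay * t)  # amplitude decays over time
        x = math.sin(fx * t + px) * damp
        y = math.cos(fy * t + py) * damp
        points.append((x, y))
    return points

pts = harmonograph_points()
```

In the composition the equivalent waves drive the X / Y Position inputs of a sprite each frame, so the figure is drawn by the texture's trail rather than plotted all at once.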
Download Link: harmonogaphs_001.zip
The zip contains a version controllable by the keyboard and another with controls in the parameters; it is designed to be controlled with a MIDI desk. You will need to dig into the patch to explore the principles of the harmonograph generators. You may wish to add a Clear patch to see things better. I used the included QCToMovie application to render movies (with alpha) of the animated textures for use in After Effects.
Please check back – I am currently updating the site with new images, commentary, compositions and movies!
I’ve written a chapter, “Experiments in Digital Puppetry: Video Hybrids in Quartz Composer”, in the following book. It draws on the material on this site and elsewhere describing real-time video processing and the use of Quartz Composer in performance. I feel quite proud to be in such interesting and diverse company. The chapters on digital puppetry are really welcome – the subject deserves a book all of its own!
I have a chapter on Digital Puppetry and real-time performance systems (including Quartz Composer) in the following book:
Transdisciplinary Digital Art. Sound, Vision and the New Screen
Digital Art Weeks and Interactive Futures 2006/2007, Zurich, Switzerland and Victoria, BC, Canada. Selected Papers
Editors: Randy Adams, Steve Gibson and Stefan Müller Arisona
I’ll post more details and some excerpts asap.
If you are in a position to, please order the book for your college or local library!
A second version of my ‘Tiger reflections’ patch – now with a simpler interface and ready for Noise Industries’ FxFactory!
The patch is simplified and slightly more reliable than the previous version. The generated image can now rotate without producing artifacts, and the gradient setup is now automated – less control, but simpler. I did this to create a test composition that works in FxFactory, which is very simple to use and really unlocks another realm for Quartz Composer developers.
The patch produces an ‘aquafied’ circle. I created it while exploring masks and how best to create a circle using the current Quartz Composer toolset. It contains some useful techniques for working between pixels and Quartz Composer units, and is also an exercise in ‘pixel-based positioning’. I am sure the visual effect can be improved upon, but I quite like the principle of instantly produced graphics using Quartz Composer. I followed several online tutorials for producing Aqua-styled graphics, then simplified the process. I reckon quite complex Aqua graphics could be produced this way – widget graphical elements and the like.
The accuracy of positioning of the bottom gradient may need improving – hence the ‘tweaks’.
Better control over how the bottom gradient is generated. The rules seem to break down at smaller sizes. Fun, though.
I’ve composed a patch that creates an iris transition. You can position it in x – y space, control the iris size and feather the edge of the mask.
Download Link: iris_quartz_composer.zip
The patch is quite simple to use. For a recent digital puppetry project, I connected the iris radius to the output of a MIDI foot pedal so I could create scene transitions in real time. More on that elsewhere.
The patch demonstrates a number of useful ideas: pixel to unit conversion, creating a circle (using the hole distort patch), masks and the ‘source atop-in-out-wave-it-all-about’ patches.
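Two of those ideas can be sketched in a few lines. Assuming the usual QC convention that the rendering destination spans 2 units across its width, and a simple linear feather (the patch itself uses a blur for the soft edge):

```python
def pixels_to_units(px, viewport_width_px):
    """Convert a pixel measurement to Quartz Composer units,
    where the rendering destination is 2 units wide (x runs
    from -1 to 1). Assumes square pixels."""
    return px * 2.0 / viewport_width_px

def iris_mask(dist, radius, feather):
    """Alpha of a feathered circular iris at distance `dist`
    from the centre: opaque inside `radius`, fading linearly
    to transparent over `feather` units."""
    if dist <= radius:
        return 1.0
    if dist >= radius + feather:
        return 0.0
    return 1.0 - (dist - radius) / feather
```

A 640-pixel offset in a 1280-pixel-wide viewport is exactly one unit; sweeping `radius` from 0 to past the corner distance gives the open / close of the transition.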
Some Notes on Custom Patches, Hidden Patches and Core Image Units
My original patch used one of the hidden patches, ‘CICheapBlur’, which you can enable by following the instructions here.
For the download here, I have replaced the ‘CICheapBlur’ patch with a Gaussian blur, so there shouldn’t be errors if you do not have the ‘hidden’ patches activated. This affects performance slightly. With custom Core Image filters appearing in the QC patches pane, I am sure it is quite possible to distribute compositions that will not run on other people’s machines without the custom CI filter. More useful custom patches keep appearing on the scene, including a number of excellent ones from Boinx. I hope a useful installation mechanism can be constructed to distribute the prerequisite elements for custom patches.
I have recently completed the first prototype of a major digital puppetry project that relies heavily on Quartz Composer in an arena of live performance familiar to VJs, visual artists and visualists. I have a set of predefined visuals (‘scenes’) and effects, and a complex mechanism that lets me composite my real-time singing mouth onto a character that I (or someone else) can manipulate with a Nintendo Wii Remote. Moving eyes are pre-recorded and, in future versions, I intend for the eye movements to be controllable from the Wii. The source code for the character control can be found elsewhere on this blog.
I do not intend to post the full patch as it is very dependent on other media, the Wii controller and the Behringer MIDI controller, but you can view the root of the composition in the image below. I have split some of the more useful elements up and will be sharing them in other posts. I attempted (and will develop further) the idea of having separate ‘buses’ for scenes, effects and transitions – a little like the way (I think) Quartonian (and other VJ-ing tools) work.
Image (1.8mb large): Screenshot of Root of Performance Composition
The project used some of the following ideas:
Some Images and Commentary
A fuller walkthrough of the final images with a commentary can be found here:
Screenshots and Scene-by-Scene Descriptions Link: http://www.daisyrust.com/quartzcomposer/moocher/
Image (above): Prototype of the Garbage Matte and Chroma-Keying Patch
Download Link: garbage_matte_bluescreen_demo_002.qtz.zip
This demo patch makes use of Sam Kass’s excellent Core Image kernels, available here:
Images (above): Behringer BCF2000 MIDI controller controls scene sequencing and properties of various screen objects and parameters in real time
Image: Real-time Mouth / Recorded (controllable) Eyes Composited into an Image in Real-Time
Full credit and copyright acknowledgment to the Fleischer Brothers Estate for frame grabs and stylistic inspiration.
Here’s a tool I have made that I use to create an instant sheet of thumbnails from a movie. I use it to analyze animations that I have digitised from my own video collection.
Download Link: movie_thumbnail_viewer_001.qtz.zip
It works in principle but there are a number of issues I’ll go into in a moment. It is an early version of an idea that could produce instant time-lines, retrospective after-the-fact storyboards, onion skinning and other video frame manipulations. I use it to visualize the flow and movement of time-based imagery and to produce illustrations for lectures. It is the kind of process that makes Quartz Composer a pleasure to use. To produce such a layout in Photoshop would take a good deal of preparation and layout work. I love the grid layout and the visual effects produced by this patch – as images in their own right.
Set the path to a QuickTime movie.
Make some basic choices about the number of frames per row and the number of rows (you need to enable / disable each row as needed).
Set the time interval / frame shift to jump on each row.
Image (above): Frames demonstrating early pioneering rotoscoping from Dave Fleischer’s “Snow White” (1933) with Betty Boop and Cab Calloway (full copyright acknowledged).
Image (above): Sequences from Bill Plympton’s “Your Face” (1987) (full copyright acknowledged)
Here are some of the important issues that need to be improved:
The patch runs slowly. I am sure you could do a similar thing programmatically with QTKit that would be much faster in the generation of the rows / thumbnails. I may even have a bug / design flaw where each row gets iterated more than once.
It is always a design goal for me to require no manual tweaking – for a composition to do its work with the minimum amount of set-up. This patch needs more work in this regard.
It seems some codecs (I forget which from my tests – I mentioned it on the qc-dev list) produce unexpected results: only two different thumbnails are generated, and they then alternate across the sequence. There are also some sizing / aspect-ratio issues with the thumbnails when the viewer is resized.
I have fixed this issue in other, similar patches I have made, but have yet to implement the fix here. This patch would benefit from a rigorous going-over. I’d love to hear from anyone who finds it useful.
Images (above): the optional ‘info’ panel and maths patch. This part of the composition is interesting, and the maths contained therein could form the basis for a future, improved implementation.
Image (above): The basic macro with most of the important published ports
QCWii is an application that lets a user connect a Nintendo Wii Remote to the Mac and control a simple teapot.
It is a proof of concept for a digital puppet controller. The final project controls a face, where the mouth is real-time video and the eyes are pre-recorded video loops that can be controlled from the Wii Remote’s buttons. Please see the following website for more details and the project write-up.
The source code demonstrates the following:
Many thanks to Jasen Jacobsen for advice on how to make the animation smooth and to Hiroaki for the ‘Darwiin-Remote’ framework and project.
To connect to the Wii Remote: press buttons 1 and 2 on the Wii Remote so the lights flash, then click ‘Connect with wii remote’ in the preference pane:
To activate sensor tracking: click ‘Track Motion Sensors’ in the same preference pane.
That should be it. Most of the buttons on the Wii Remote are connected to do something in the Quartz Composer composition – if only to signal a connection to the patch.
It is acknowledged that both scaling and translating on the Z axis are probably not as useful as moving up and down.
To exit full-screen mode, press Shift-F on the keyboard.
Some Additional Information if you wish to edit the Quartz Composer Patch to do something other than move a teapot around
To find the QC composition:
Ctrl-click on the QCWii application, choose ‘Show Package Contents’ and dig down to ‘Resources’ – that is where the QC patch, wii_to_qc.qtz, lives. You can (carefully) edit that composition to do something other than trigger the text display and move the teapot… Just don’t change the name of any of the root-level published port ‘keys’:
Image: The crucial published keys that bind the patch to the Wii Remote via the application. Do not change the published names of these, or the application will break.
If you edit that composition, save it; when you re-launch the application, it will use the edited composition as its source.
This way you don’t need to use Xcode or rebuild the application.
The QC patch needs to run inside an application wrapper. The application handles all the Bluetooth connection wizardry provided by the WiiRemote framework, the calibration preferences and the toggling of full screen, so you can’t simply edit the composition, preview it in Quartz Composer itself and expect the Wii Remote to connect.
Ian Grant January, 2007
ian [dop ] grant [at ] mac [dop ] com
Here we go! I did this once then failed to remember how I did it – and I’m not surprised – because the functionality to attach a speech command to an application specific key-press can only be accessed by SPEAKING the command “Define a Keyboard Command” – highlighted below in the Speech Commands window. Read more…
Here’s a screenshot of a demo patch that tiles an image (still or video) onto a user selected number of tiles. The rows/columns tiles can be fixed or animated. Three test patches are available for download.
Download link: quartz composer video matrix test files 001 – 003
Notes: these are patches in development and (except patch 001) need some tidying – the published parameters are slightly out of order in 002 and 003, but you should get the principle from patch 001.
I have been wanting a patch that does this for a while and these are really just a proof of concept and may not be the most elegant solution. There are some limitations.
The number of X tiles (columns) can be set interactively. Unfortunately, at the moment the number of Y tiles is fixed. If you wish to add a row, you must edit the contents of the “3D Transformation” patch, add new copies of the highlighted patches (indicated below), edit the names of the published keys, plumb them in (following the pattern of the others, at this and the parent level) and set the Y position of the new row. A pain, but okay if you know what size of matrix you need.
For convenience, there is an image source that switches between camera input, images and a couple of movies – you will need to remove my movies and add your own.
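The layout maths behind such a matrix is simple enough to sketch. This is my own illustration, not the patch’s internals – the canvas dimensions in units are assumptions:

```python
def tile_layout(columns, rows, width_units=2.0, height_units=1.5):
    """Centre position and size of each tile in a columns x rows
    video matrix, over a canvas measured in QC-style units
    (x running from -1 to 1, 4:3 height assumed)."""
    tile_w = width_units / columns
    tile_h = height_units / rows
    tiles = []
    for r in range(rows):          # top row first
        for c in range(columns):   # left to right
            x = -width_units / 2 + tile_w * (c + 0.5)
            y = height_units / 2 - tile_h * (r + 0.5)
            tiles.append((x, y, tile_w, tile_h))
    return tiles
```

Making the Y side fully data-driven like this – rather than hand-copied transformation patches per row – is exactly what the iterator-free QC patch can’t easily do.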
Development – motion detection / computer vision
I have an idea (maybe wrong) that a static matrix like this could be used as the starting point for a motion detection / computer vision thing. Imagine if you processed the video (desaturated it, detected outlines, or used the “flood fill” that appeared on the quartzcomposer-dev list recently) and had something like a kernel that could sample each cell in the matrix and output a value or a boolean dependent on the white / black balance. I don’t think a Core Image kernel in Quartz Composer can output anything other than an image – but a kernel possibly can when used with Core Image in a Cocoa application. Interesting.
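The per-cell sampling idea might look like this in plain Python over a grayscale frame – a real implementation would read back the processed video texture instead of a nested list:

```python
def cell_activity(frame, columns, rows, threshold=0.5):
    """frame: 2D list of grayscale values in [0, 1].
    Returns a rows x columns grid of booleans: True where the
    mean brightness of the cell exceeds `threshold`."""
    h, w = len(frame), len(frame[0])
    cell_h, cell_w = h // rows, w // columns
    out = []
    for r in range(rows):
        row = []
        for c in range(columns):
            cell = [frame[y][x]
                    for y in range(r * cell_h, (r + 1) * cell_h)
                    for x in range(c * cell_w, (c + 1) * cell_w)]
            row.append(sum(cell) / len(cell) > threshold)
        out.append(row)
    return out
```

Each boolean could then trigger per-cell behaviour – exactly the kind of non-image output a QC Core Image kernel can’t produce on its own.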
A Quartz Composer patch that will automatically produce a cool, adjustable reflection from an image, text or video input. The style is familiar from Tiger applications, particularly “Front Row” and Keynote. It is a visual style echoed across Apple’s branding and is slightly cooler than a simple drop shadow.
Most are self explanatory but here are some notes:
Color: this controls the background, the ‘fade to color’ of the reflection gradient and the ‘fade to color’ of the block that fades the reflections
Gradient Point 1 and 2: these numbers control the falloff / distance of the reflection. They relate to the size of the image measured in pixels. For recommendations on what values to use, select Gradient Advice.
Currently the recommended values are calculated like so:
Gradient Point 1: image height (px) / 1.7
Gradient Point 2: image height (px) * 1.5
Gradient Point 1 should be smaller than Gradient Point 2. Gradient Point 2 should be around the height of the image, but different effects can be achieved by varying it. Likewise, increasing Gradient Point 1 from 0 increases the brightness of the first part of the reflection, creating a nice, controllable reflection fall-off.
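The ‘Gradient Advice’ arithmetic above, as a tiny helper:

```python
def recommended_gradient_points(image_height_px):
    """The Gradient Advice rule of thumb from the patch:
    point 1 = height / 1.7, point 2 = height * 1.5, so
    point 1 is always smaller than point 2."""
    point_1 = image_height_px / 1.7
    point_2 = image_height_px * 1.5
    return point_1, point_2
```

In a host application these two values would be the natural defaults for the sliders mentioned below.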
In a cocoa application the reflection gradient parameters could be more usefully mapped to a couple of sliders. If a simpler interface is needed, you could hard wire the ‘recommended’ values into the relevant inputs.
Gap: Makes a gap between the reflection and the image. This improves the illusion that the image is sitting on a surface.
- With a little effort the patch could be used in a Cocoa application or, I think, converted to an Image Unit for use in other applications.
- Thorough testing of how images with alpha channels work is still needed.
Key Quartz Composer Techniques
- mask and alpha manipulations.
- blending modes
Download Application: QCStereoscopicRecorder 0.1 (Universal Binary) OS 10.4 required
Download Source: QCStereoscopicRecorder 0.1 Source (4.0mb)
QC Stereoscopic Recorder is my first Cocoa app! It is part of my work towards an MA in Computer Arts. The app is a component of a larger project called “Anamorphica” in a class called “Experimental Digital Media”. Further documentation of this project will appear over the next few weeks. Basically, I am aiming to make a low-cost, open-source anaglyph recording and ‘performance’ system.
To use it you will need two FireWire cameras.
Performance is improved if each camera comes in on a separate FireWire bus. To achieve this, I use a LaCie FireWire PCMCIA card in my G4 laptop.
I needed a simple application that would capture dual camera input, export to quicktime and make an anaglyph.
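The anaglyph step can be sketched as the classic red-cyan mix: the red channel from the left eye, green and blue from the right. This is the standard recipe, not necessarily the exact matrix the app uses:

```python
def anaglyph(left, right):
    """Combine two same-sized RGB frames (nested lists of
    (r, g, b) tuples) into a red-cyan anaglyph frame:
    red from the left eye, green and blue from the right."""
    return [[(l_px[0], r_px[1], r_px[2])
             for l_px, r_px in zip(l_row, r_row)]
            for l_row, r_row in zip(left, right)]
```

In the app the same mix is done per-pixel on the GPU (a Core Image kernel is the natural fit), rather than in Python.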
I have included “stereo-pair” generation. Stereo-pair generation will come into its own when the application can handle full-screen and dual-monitor support. Then I may create a ‘Wheatstone’ device (see http://www.stereoscopy.com/library/wheatstone-paper1838.html).
QC Stereoscopic Recorder would not have happened if it wasn’t for the wonderful Quartz Composer – easy and a joy to use. As I am learning Cocoa and Objective-C, the project is indebted to sample code and open-source initiatives.
Image: early recursive anaglyph experiment made with QC Stereoscopic Recorder