A second version of my ‘tiger reflections patch’. Now with a simpler interface, and ready for Noise Industries’ ‘FXFactory’!
The patch is simplified and slightly more reliable than the previous version. The generated image can now rotate without producing artifacts, and the gradient setup is now automated – less control, but simpler. I have done this to create a test composition that works in ‘FXFactory’, which is very simple to use and really unlocks another realm for Quartz Composer developers.
I’ve composed a patch that creates an iris transition. You can position it in x – y space, control the iris size and feather the edge of the mask.
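The mask logic behind the iris is simple enough to sketch outside Quartz Composer. Here is a minimal Python sketch of a feathered circular mask – the function name and parameters are my own illustration, not the patch’s internals:

```python
# Feathered iris mask: alpha is 1 inside the iris, 0 outside,
# with a smooth ramp across the feather band at the edge.
def iris_alpha(px, py, cx, cy, radius, feather):
    """Return mask alpha (0..1) for a pixel at (px, py),
    given an iris centred at (cx, cy) with the given radius
    and feather width."""
    dist = ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5
    if feather <= 0:
        return 1.0 if dist <= radius else 0.0
    # Linear ramp from 1 at (radius - feather) to 0 at radius
    t = (radius - dist) / feather
    return max(0.0, min(1.0, t))

# Fully inside, inside the feather band, fully outside:
print(iris_alpha(0, 0, 0, 0, 100, 20))    # 1.0
print(iris_alpha(95, 0, 0, 0, 100, 20))   # 0.25
print(iris_alpha(150, 0, 0, 0, 100, 20))  # 0.0
```

Animating `radius` from larger than the frame down to zero gives the classic iris-out; the feather softens the edge of the mask as in the patch.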
Download Link: iris_quartz_composer.zip
The patch is quite simple to use. For a recent digital puppetry project, I connected the iris radius to the output of a MIDI foot pedal so I could create scene transitions in real time. More on that elsewhere.
The patch demonstrates a number of useful ideas: pixel-to-unit conversion, creating a circle (using the hole distort patch), masks and the ‘source atop-in-out-wave-it-all-about’ patches.
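On the pixel-to-unit conversion: Quartz Composer’s coordinate system spans 2 units across the width of the rendering destination regardless of its pixel resolution, which makes the conversion a one-liner. A quick sketch:

```python
# Quartz Composer units: the rendering destination is always
# 2 units wide (-1 to +1), whatever its width in pixels.
def pixels_to_units(pixels, render_width_px):
    return pixels * 2.0 / render_width_px

def units_to_pixels(units, render_width_px):
    return units * render_width_px / 2.0

# On an 800px-wide viewport, 400px is exactly 1 unit:
print(pixels_to_units(400, 800))   # 1.0
print(units_to_pixels(0.5, 800))   # 200.0
```

The vertical extent follows from the aspect ratio (2 × height / width units), so the same functions work for Y once you know the viewport dimensions.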
Some Notes on Custom Patches, Hidden Patches and Core Image Units
My original patch used one of the hidden patches, ‘CICheapBlur’, which you can enable by following the instructions here.
For the download here, I have replaced the ‘CICheapBlur’ patch with a Gaussian blur, so there shouldn’t be errors if you do not have the ‘hidden’ patches activated. This affects performance slightly. With custom Core Image filters appearing in the QC patches pane, I am sure it is quite possible to distribute compositions that will not run on other people’s machines without the custom CI filter installed. There seem to be more useful custom patches appearing on the scene, with a number of excellent ones from Boinx. I hope a useful installation mechanism can be constructed to distribute the prerequisite elements for custom patches.
I have recently completed the first prototype of a major digital puppetry project that relies heavily on Quartz Composer in an arena of live performance familiar to vj-ers, visual artists and visualists. I have a set of predefined visuals (‘scenes’) and effects, and a complex mechanism that lets me composite my real-time singing mouth onto a character that I (or someone else) can manipulate with a Nintendo Wii remote. Moving eyes are pre-recorded and, in future versions, I intend for the eye movements to be controllable by the Wii. The source code for the character control can be found elsewhere on this blog.
I do not intend to post the full patch as it is very dependent on other media, the Wii controller and the Behringer MIDI controller, but you can view the root of the composition in the image below. I have split some of the more useful elements up and will be sharing them in other posts. I attempted (and will develop further) the idea of having separate ‘buses’ for scenes, effects and transitions – a little like the way (I think) quartonian (and other vj-ing tools) work.
Image (1.8mb large): Screenshot of Root of Performance Composition
The project used some of the following ideas:
Some Images and Commentary
A fuller walkthrough of the final images with a commentary can be found here:
Screenshots and Scene-by-Scene Descriptions Link: http://www.daisyrust.com/quartzcomposer/moocher/
Image (above): Prototype of the Garbage Matte and Chroma-Keying Patch
Download Link: garbage_matte_bluescreen_demo_002.qtz.zip
This demo patch makes use of Sam Kass’s excellent Core Image kernels, available here:
Images (above): Behringer BCF2000 MIDI controller controls scene sequencing and properties of various screen objects and parameters in real time
Image: Real-time Mouth / Recorded (controllable) Eyes Composited into an Image in Real-Time
Full credit and copyright acknowledgment to the Fleischer Brothers Estate for frame grabs and stylistic inspiration.
Here’s a tool I have made that I use to create an instant sheet of thumbnails from a movie. I use it to analyze animations that I have digitised from my own video collection.
Download Link: movie_thumbnail_viewer_001.qtz.zip
It works in principle, but there are a number of issues I’ll go into in a moment. It is an early version of an idea that could produce instant timelines, retrospective storyboards, onion skinning and other video frame manipulations. I use it to visualize the flow and movement of time-based imagery and to produce illustrations for lectures. It is the kind of process that makes Quartz Composer a pleasure to use: producing such a layout in Photoshop would take a good deal of preparation and layout work. I love the grid layout and the visual effects produced by this patch – as images in their own right.
You set the path to a QuickTime movie.
Make some basic choices about the number of frames per row and the number of rows (you need to enable/disable each row as needed).
Set the time interval / frame shift to jump on each row.
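The steps above boil down to a simple sampling grid: each row starts one ‘row shift’ later into the movie, and frames step across the row by the frame interval. A hypothetical sketch of that arithmetic (names and parameters are my own, not the patch’s published ports):

```python
# Sketch of the thumbnail-sheet sampling pattern: a grid of
# movie times (in seconds), one row per list.
def thumbnail_times(rows, frames_per_row, frame_interval, row_shift):
    """Return a list of rows, each a list of sample times."""
    grid = []
    for r in range(rows):
        row_start = r * row_shift          # each row jumps ahead
        grid.append([row_start + c * frame_interval
                     for c in range(frames_per_row)])
    return grid

for row in thumbnail_times(rows=2, frames_per_row=4,
                           frame_interval=0.5, row_shift=2.0):
    print(row)
# [0.0, 0.5, 1.0, 1.5]
# [2.0, 2.5, 3.0, 3.5]
```

Setting `row_shift` equal to `frames_per_row * frame_interval` gives a continuous timeline reading left-to-right, top-to-bottom; other values give overlapping or skipping rows.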
Image (above): Frames demonstrating early pioneering rotoscoping from Dave Fleischer’s “Snow White” (1933) with Betty Boop and Cab Calloway (full copyright acknowledged).
Image (above): Sequences from Bill Plympton’s “Your Face” (1987) (full copyright acknowledged)
Here are some of the important issues that need to be improved:
The patch runs slowly. I am sure you could do a similar thing programmatically with QTKit that would be much faster at generating the rows/thumbnails. I may even have a bug / design flaw where each row gets iterated more than once.
It is always a design goal for me to have no manual tweaking necessary – for a composition to do its work with the minimum amount of set-up. This patch needs more work in this regard.
It seems some codecs (I forget which from my tests – I mentioned it on the qc-dev list) produce unexpected results: only two different thumbnails are generated and they then alternate across the sequence. There are also some sizing / aspect-ratio issues with the thumbnails when the viewer is resized.
I have fixed this issue on other similar patches I have made, but have yet to implement the fix here. This patch would benefit from a rigorous going over. I’d love to hear from anyone who finds it useful.
Images (above): the optional ‘info’ panel and math patch. This part of the composition is interesting, and the maths contained therein could form the basis for a future improved implementation.
Image (above): The basic macro with most of the important published ports
Here’s a screenshot of a demo patch that tiles an image (still or video) into a user-selected number of tiles. The rows/columns of tiles can be fixed or animated. Three test patches are available for download.
Download link: quartz composer video matrix test files 001 – 003
Notes: These are patches in development and (except patch 001) need some tidying – the published parameters are slightly out of order on 002 and 003, but you should get the principle from patch 001.
I have been wanting a patch that does this for a while; these are really just a proof of concept and may not be the most elegant solution. There are some limitations.
The number of X tiles (columns) can be determined interactively. Unfortunately, at the moment, the number of Y tiles is fixed. If you wish to add a row, you must edit the contents of the “3D Transformation” patch, add new copies of the highlighted patches (indicated below), edit the names of the published keys, plumb them in (following the pattern of the others – at this and the parent level) and set the Y position of the new row. A pain, but okay if you know what size of matrix you need.
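Setting the Y position of each new row is the same arithmetic every time, so here is a small sketch of it – purely illustrative, assuming rows of equal height centred about the origin (as a QC layout in units would be):

```python
# Illustrative only: the Y positions for N rows of tiles of a
# given height, centred vertically about the origin - the value
# you set by hand for each duplicated row inside the
# "3D Transformation" patch.
def row_y_positions(num_rows, tile_height):
    total = num_rows * tile_height
    top = total / 2.0 - tile_height / 2.0   # centre of top row
    return [top - r * tile_height for r in range(num_rows)]

print(row_y_positions(3, 0.5))  # [0.5, 0.0, -0.5]
print(row_y_positions(2, 1.0))  # [0.5, -0.5]
```

If the Y-tile count were ever made interactive, an iterator driven by this formula would replace the hand-duplicated patches entirely.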
For convenience, there is an image source that switches between camera input, images and a couple of movies – you will need to remove my movies and add your own.
Development – motion detection / computer vision
I have an idea (maybe wrong) that a static matrix like this could be used as the starting point for a motion detection / computer vision thing. Imagine if you processed the video (desaturate and detect outlines, for example, or use the “flood fill” that appeared on the quartzcomposer-dev list recently) and had something like a kernel that could sample each cell in the matrix and output a value or a boolean dependent on the white / black balance. I don’t think a Core Image kernel in Quartz Composer can output anything other than an image – but a kernel possibly can when used with Core Image in a Cocoa application. Interesting.
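To make the cell-sampling idea concrete, here is a rough Python sketch of the logic such a kernel (or host application) would implement – average the luminance of each grid cell and threshold it to a boolean. The frame here is just a 2D list of luminance values; everything else is my own illustration:

```python
# Sketch of per-cell sampling for the grid: average the
# luminance of each cell in a cols x rows matrix and output a
# boolean per cell based on the white/black balance.
def cell_booleans(gray, cols, rows, threshold=0.5):
    """gray: row-major 2D list of luminance values in 0..1.
    Returns rows x cols booleans (True = mostly white)."""
    h, w = len(gray), len(gray[0])
    ch, cw = h // rows, w // cols   # cell size in pixels
    out = []
    for r in range(rows):
        row = []
        for c in range(cols):
            cell = [gray[y][x]
                    for y in range(r * ch, (r + 1) * ch)
                    for x in range(c * cw, (c + 1) * cw)]
            row.append(sum(cell) / len(cell) > threshold)
        out.append(row)
    return out

# A tiny 4x2 "frame": white on the left, black on the right.
frame = [[1.0, 1.0, 0.0, 0.0],
         [1.0, 1.0, 0.0, 0.0]]
print(cell_booleans(frame, cols=2, rows=1))  # [[True, False]]
```

Comparing the booleans from one frame to the next would then give crude per-cell motion detection.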