Software

A Simple Eye Tracker For Jitter

2011/10/20

A few years ago, I made a rough eye tracker in Jitter with cv.jit for my own use. It wasn’t the most advanced approach to the problem, but it did the job, and it’s relatively lightweight.

I moved on to other projects and forgot about the patch almost entirely, but from time to time people write to me asking questions about face tracking (which is different from face detection) and eye tracking. And so, here it is, my eye tracker, for anyone to use.

Download “Eye tracker” and necessary abstractions.

As a bonus, you also get a number of useful abstractions, although they’re undocumented and will remain this way. Also, don’t ask me how they work; I don’t remember. The names are fairly descriptive, though.

You will need the standard cv.jit objects for this patch to work.

As always, hope this helps.

jit.freenect.grab Release Candidate 4

2011/10/15

A number of people have been writing me about what appeared to be a memory leak in jit.freenect.grab. Indeed, there was. It’s been found, and as far as I can tell, it’s been squashed. If any of you wish to compile the external on your own, please beware that it is the OSX version of libusb that leaks. Be sure to read the libfreenect README carefully to find out how to patch the problem.

I also get people asking me about a Windows external. Unfortunately, it’s not going to happen soon, at least not unless some kind soul does the work for us; I just don’t have enough time to dedicate to a port.

jit.freenect.grab

jit.freenect.grab Release Candidate 3

2011/01/20

Release candidate 3 is out!

jit.freenect.grab

It’s now possible to output the raw image from the infrared camera instead of the RGB image. This is the raw data the Kinect uses to estimate depth. This means that by blocking the infrared projector (on the left side of the device) it’s now possible to use the Kinect as an infrared camera. Note, however, that the infrared camera was calibrated to work with the infrared projector, and you may find ambient infrared light to be either too dim or too bright to be usable.

Even with RC2, some users reported stalling issues. It appears that stalling occurs under heavy CPU load – although I haven’t verified that this is true in all cases.

jit.freenect.grab Release Candidate 2 is out (finally)

2011/01/10

Taking a short break from what has been a truly hectic and near-sleepless month, I found the time to upload a slightly improved jit.freenect.grab. Several users have reported to me that jit.freenect.grab stops working after a while. I think I’ve identified the culprit and this new version may solve the problem. I hope so.

jit.freenect.grab

Some people have also asked me about a Windows version of the external. libfreenect, on which this external is based, now works on Windows and a port should be possible. Or at least, it would be if I had the time. Currently, my main priority is getting enough sleep to keep a sane mind. However, the code is on GitHub and if anyone wants to take the time to work on a Windows version, that would be fantastic.

Some people are also interested, of course, in NITE, a skeleton-tracking system developed by PrimeSense, the company that designed the Kinect’s depth sensor. As far as I know, NITE requires PrimeSense’s own drivers to work, which would require a new external altogether to work in Jitter. It would be a very useful project but unfortunately, my current engagements don’t allow me the time to work on something like this at the moment.

Here’s hoping rc2 solves everybody’s problems.

jit.freenect.grab page now live

2010/12/03

I just published a page for the jit.freenect.grab external I have been working on. It allows users of Max/MSP/Jitter to get depth and image data from their Kinect.

Head over to http://jmpelletier.com/freenect/ for more info!

Minor update to cv.jit

2010/08/16

I have just released cv.jit version 1.7.1, which fixes two issues. A bug in cv.jit.shift, which also affected cv.jit.touches, was causing memory leaks. Furthermore, some Windows users were seeing 14001 errors when trying to use externals that make use of OpenCV functions. Both problems have now been fixed.

Using cv.jit.touches: recycling index values

2010/06/29

cv.jit.touches, a new object added in the 1.7.0 release, tracks the movement of bright spots in a greyscale image and reports “touch”, “drag” and “release” information – when regions appear, move or disappear. Each region is assigned a unique ID number, so that even if they move, you can always know where an object is from frame to frame. This is unlike using cv.jit.label-based blob analysis, where you are never guaranteed that the same object in a scene will end up with the same index.

By default, cv.jit.touches outputs ever-increasing indices, although the counter is always reset to 0 when there are no active regions. For some applications, this is a reasonable approach: every touch event has its own unique ID. However, some may wish to recycle IDs. If you’re in this situation, I made an abstraction that renumbers the IDs coming out of cv.jit.touches to re-use released IDs and keep their values as low as possible. It will always output the smallest value that isn’t currently assigned to another active region.
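The renumbering rule itself is simple to express in code. Here is a minimal C sketch of the “smallest free ID” idea; the function name and signature are mine for illustration, not part of the abstraction, which implements this logic as a Max patch:

```c
#include <stddef.h>

/* Return the smallest non-negative ID that does not appear in `active`,
   the array of IDs currently assigned to tracked regions. */
int smallest_free_id(const int *active, size_t count)
{
    int candidate = 0;
    for (;;) {
        size_t i;
        for (i = 0; i < count; i++) {
            if (active[i] == candidate)
                break;  /* candidate is taken, try the next one */
        }
        if (i == count)
            return candidate;  /* no active region uses this value */
        candidate++;
    }
}
```

For example, if regions 0, 1 and 3 are active, the next touch gets ID 2 rather than 4.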

Download cv.jit.touches.recycle.

In order to use this file, just copy it to the “Max5/Cycling ’74/cv.jit-support/Abstractions” folder.

You can see how it works by copying the following code and selecting the “new from clipboard” option from Max’s file menu.


----------begin_max5_patcher----------
911.3oc0XssbhBDD8Y8qfhmcslaL.6a6Ovturus0VoPXhRJfwBFiIap7uuLW
zPRDbTPzTQfLyfNm9zce5FdY5D2E7mXUtNe24ONSl7xzISTSImXhY7D27nmh
yhpT2laLOOmUHbmoWSvdRnl+2qJYLmEY7EU6VqXSdZQFSn9hv2ljuQraVfY1
0Qh3UoEKuqjEKzvgR7lCl4.AgxK9pyH3bfyeMemzD0FyW7v27g61y64Ehhnb
lZoeTlFk0bkpz+oVAhmCjy95zoxSy5qsukOjVNwm1zxwcZ4fqpk+qBsOeXLb
LDYsgSCupF9OGTONBCs2vCFECOmUUEsj8ICONiEUNyYcTZgf+XTliGP9ABTG
MW.Jmt9.ATGebIpdEHrdkEQEKadd2tlkVvh4aJTaM4PTM5fTMrKYEcxEQGpE
n3ZJoEtldbtVushmWyzafq69eoAvMTv1VCjO4Ex3QIMIJKB+5fSLJs5rOIWz
QzG9LXDERGAVI9w4OjJlK3ahWwplmTFs8fDD4TIHOOUhIlpqBA75jhfmAEUi
66xiDkoOMnDkjNVuMsHgumIVWxEb4NuGYXDXwyHB3DBlPGkqH.UlU8ushqHf
VHKT64QyT4RWrviZH+bsZ1AsZ7AsZ7wiPvgZ0ktkvAmirxr8TxW2DIDJnQhD
zDbzBMAA2NIRGobnUEvPCRALCEZJfoaUD2R8qvqc4qqf9igeH.pM5OAim7ij
KJWt.ksIOZPMUDQ08nePmOnP+xkFZgm1xmVTyPN.4eybtW9+HOucGtCR9ioY
GS+e95QgsHTesyet.cgakTEdXnZC6piOgT0HRKbs2WUtdD09wfvlAuldNZQ7
mbi9rKRcvr3DGhS7pnRoBsyPK9iLuHAbm8WftsjDspWrSflvGmlBTYjAzN6B
6R2qp5dTYKe3EQp.tb92SZU7Mkw61IixgyaPOgUIRKhDo7hF2C5c2ypzjDVQ
yWJSdZxZdcBtABj.xbOYcAO4ED1e+nV5HvVrRFdrhgAJPBTw7FrpF0SrRGdr
1SDgr.Q9iJhBGdNpoGbP8m9VfU4yb4fOcv56qe72f8i5IXg1B1wyWKgDzFHA
GUHgrARnwCR.K7bAipiyp.+QEQTaBuonaNVRg6yPdfFJu36Q1OpuDH1FzNt9
zKP8RpmtADPPi9PTi5KXAVnbPgipxAEXg.KENpBrTfsYE2V0gniecHfMPBbS
UZj1mRi0Cdc5+MONptI
-----------end_max5_patcher-----------

cv.jit 1.7 is finally out!

2010/06/07

The latest version of cv.jit, 1.7.0, is finally out. I say finally because it’s been on the brink of release for several months now, but life being what it is, I only now managed to put the finishing touches on it.

The most obvious change is that the help files have been completely re-written in Max 5 format. cv.jit 1.6 and earlier help files did not display properly in Max 5, owing to some issue with double-byte comments.

A few objects have also been added. cv.jit.opticalflow combines the functionality of cv.jit.LKflow and cv.jit.HSflow. These were two of the earliest externals I wrote, and I now somewhat regret the decision to keep them separate: they essentially do the same thing, albeit in different ways. cv.jit.opticalflow also adds support for two newer optical flow estimation algorithms: block-matching and a bleeding-edge real-time algorithm by Gunnar Farnebäck. To go along with this new external (and the two older optical flow objects), I also added a drawing utility, cv.jit.flow.draw, which displays the optical flow field using hue for direction and saturation for distance.

Farnebäck optical flow, visualized with cv.jit.flow.draw

I often get questions about tracking blobs, or about dealing with the fact that cv.jit.label doesn’t always give the same label to what we would perceive as being the same object. The new object cv.jit.touches sort of addresses these issues. It’s a greyscale/binary region tracker. It assumes that the regions are of roughly the same size and don’t overlap. As the name implies, it was designed with multi-touch interfaces in mind, and it outputs information such as “touch”, “drag” and “release”, but it can be used with other kinds of inputs.

cv.jit.threshold implements adaptive thresholding, in which each pixel is compared to the average brightness of its surroundings instead of a fixed value. This is especially useful when working under slightly uneven lighting situations.
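To make the idea concrete, here is a minimal C sketch of mean-based adaptive thresholding. This is not cv.jit.threshold’s actual implementation; the function name, the clamped-border handling and the `offset` parameter are my own illustrative choices. Each pixel is compared to the mean of a (2r+1) by (2r+1) neighbourhood.

```c
#include <stdint.h>

/* Adaptive threshold on an 8-bit greyscale image: a pixel becomes 255
   if it is brighter than its local mean minus `offset`, else 0.
   Neighbourhood coordinates are clamped at the image borders. */
void adaptive_threshold(const uint8_t *src, uint8_t *dst,
                        int width, int height, int r, int offset)
{
    int x, y, i, j;
    for (y = 0; y < height; y++) {
        for (x = 0; x < width; x++) {
            int sum = 0, n = 0;
            for (j = -r; j <= r; j++) {
                for (i = -r; i <= r; i++) {
                    int yy = y + j;
                    int xx = x + i;
                    if (yy < 0) yy = 0;
                    if (yy >= height) yy = height - 1;
                    if (xx < 0) xx = 0;
                    if (xx >= width) xx = width - 1;
                    sum += src[yy * width + xx];
                    n++;
                }
            }
            dst[y * width + x] = (src[y * width + x] > sum / n - offset) ? 255 : 0;
        }
    }
}
```

Because the threshold is local, a bright pixel surrounded by dark ones passes even if it is dimmer than bright pixels elsewhere in the image, which is exactly what helps under uneven lighting.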

Finally, cv.jit.grab is a ridiculously simple but very useful abstraction that wraps jit.qt.grab and jit.dx.grab depending on your platform. This allows you to write cross-platform patches.

Another big change, apart from the help files, is that I moved the cv.jit site to my own domain. As mentioned in every help file and abstraction, IAMAS, the great media art institution in Gifu, Japan, has provided support for my work on cv.jit – in the form of computers, software, time, unwitting testers and advice from teachers and colleagues. Alas, my contract having reached its end, I no longer work there, and I thought it might be best to gather all my work under the same roof, here. The actual files are hosted on SourceForge, so that those who are interested in actually doing some development can dig into the SVN repository.

Head over to the new cv.jit page for downloads!

A Ruby script for generating Jitter attributes

2010/01/03

Writing your own Max or Jitter externals in C or C++ isn’t terribly hard, once you’ve wrapped your head around the API’s C approach to object-oriented programming. However, it does involve a fair bit of boilerplate. This is especially true for adding attributes to an object, and triply so if the attribute has custom getter and setter methods.

The cv.jit collection now contains more than a few externals, and I find myself spending more time trying to find ways to automate some of the repetitive tasks that are required for keeping it up to date. One of the tools I just made is a nifty Ruby script for automatically generating all the necessary attribute-related boilerplate. Simply invoke it at the command line with only a few arguments and it generates a .c file containing the necessary code. It parses the arguments “-c”, “-l”, “-f”, “-d”, “-s” and “-a” as “char”, “long”, “float32”, “float64”, “symbol” and “atom” types, and numbers (if there are any) as the number of elements in a list. The arguments “-get” and “-set” specify that the attribute has a custom getter or setter, while “-clip” will add a filter to clip argument values. Any other argument is parsed as the name of the attribute, unless it begins with a “-”, in which case it’s interpreted as your external’s name (periods are automatically converted to underscores).

For example:

ruby ./jitargs.rb -f -cv.jit.bigbrother foo

This generates a file “jitter_args.c” in the current directory that looks like this:

//setter/getter declarations

//attribute variables
float foo;

//setters/getters

//attribute registration
attr = (t_jit_object *)jit_object_new(_jit_sym_jit_attr_offset,"foo",_jit_sym_float32,
attrflags,(method)0L,(method)0L,calcoffset(t_cv_jit_bigbrother,foo));
jit_attr_addfilterset_clip(attr,0,1,TRUE,TRUE);
jit_class_addattr(_cv_jit_bigbrother_class,attr);

//attribute initialization
x->foo = 0;

If you wish to add more attributes, just run the script again with different arguments, new code will be inserted in the appropriate place. For example, by running the following:


ruby ./jitargs.rb -a -get -set 2 -cv.jit.bigbrother bar

The file above is modified to:

//setter/getter declarations
t_jit_err cv_jit_bigbrother_set_bar(t_cv_jit_bigbrother *x, void *attr, long ac, t_atom *av);
t_jit_err cv_jit_bigbrother_get_bar(t_cv_jit_bigbrother *x, void *attr, long *ac, t_atom **av);

//attribute variables
long barcount;
t_atom bar[2];
float foo;

//setters/getters
t_jit_err cv_jit_bigbrother_set_bar(t_cv_jit_bigbrother *x, void *attr, long ac, t_atom *av){
if(ac < 2){
//Not enough parameters?
return JIT_ERR_NONE;
}

return JIT_ERR_NONE;
}

t_jit_err cv_jit_bigbrother_get_bar(t_cv_jit_bigbrother *x, void *attr, long *ac, t_atom **av){

int i;
if ((*ac)&&(*av)) {
//memory passed in, use it
} else {
//otherwise allocate memory
*ac = 2;
if (!(*av = jit_getbytes(sizeof(t_atom)*(*ac)))) {
*ac = 0;
return JIT_ERR_OUT_OF_MEM;
}
}

for(i=0;i<2;i++) (*av)[i] = x->bar[i];

return JIT_ERR_NONE;
}

//attribute registration
attr = (t_jit_object *)jit_object_new(_jit_sym_jit_attr_offset_array, "bar", _jit_sym_atom, 2,
attrflags, (method)cv_jit_bigbrother_get_bar,(method)cv_jit_bigbrother_set_bar,
calcoffset(t_cv_jit_bigbrother, barcount),calcoffset(t_cv_jit_bigbrother,bar));
jit_class_addattr(_cv_jit_bigbrother_class,attr);

attr = (t_jit_object *)jit_object_new(_jit_sym_jit_attr_offset,"foo",_jit_sym_float32,
attrflags,(method)0L,(method)0L,calcoffset(t_cv_jit_bigbrother,foo));
jit_attr_addfilterset_clip(attr,0,1,TRUE,TRUE);
jit_class_addattr(_cv_jit_bigbrother_class,attr);

//attribute initialization
jit_atom_setlong(&x->bar[0],0);
jit_atom_setlong(&x->bar[1],0);
x->foo = 0;

All you need to do now is copy and paste the code at the appropriate places. Of course, if I was really crazy, I would write a script that parses and modifies the actual external source but I’ll leave that as an exercise for the reader.

Download the script.

cv.jit – New update available

2008/07/07

A new update to cv.jit, a collection of Max/Jitter externals for computer vision, is now available for download. There is only one new object, cv.jit.snake, which is an implementation of the active contour (“snake”) algorithm.

Starting with this release, cv.jit is now open source. You can now download the source code and project files from the download pages.