A Simple Eye Tracker For Jitter

2011/10/20

A few years ago, I made a rough eye tracker in Jitter with cv.jit for my own use. It wasn’t the most advanced approach to the problem, but it did the job, and it’s relatively lightweight.

I moved on to other projects and forgot about the patch almost entirely, but from time to time people write to me asking questions about face tracking (which is different from face detection) and eye tracking. And so, here it is, my eye tracker, for anyone to use.

Download “Eye tracker” and the necessary abstractions.

As a bonus, you also get a number of useful abstractions, although they’re undocumented and will remain that way. Also, don’t ask me how they work; I don’t remember. The names are fairly descriptive, though.

You will need the standard cv.jit objects for this patch to work.

As always, hope this helps.

jit.freenect.grab Release Candidate 4

2011/10/15

A number of people have been writing me about what appeared to be a memory leak in jit.freenect.grab. Indeed, there was. It’s been found and, as far as I can tell, squashed. If any of you wish to compile the external on your own, please be aware that it is the OS X version of libusb that leaks. Be sure to read the libfreenect README carefully to find out how to patch the problem.

I also get people asking me about a Windows external. Unfortunately, it’s not going to happen soon, at least not unless some kind soul does the work for us: I just don’t have enough time to dedicate to a port.

jit.freenect.grab

jit.freenect.grab Release Candidate 3

2011/01/20

Release candidate 3 is out!

jit.freenect.grab

It’s now possible to output the raw image from the infrared camera instead of the RGB image. This is the raw data the Kinect uses to estimate depth. This means that, by blocking the infrared projector (on the left side of the device), it’s now possible to use the Kinect as an infrared camera. Note, however, that the infrared camera was calibrated to work with the infrared projector, and you may find ambient infrared light to be either too dim or too bright to be usable.

Even with RC2, some users reported stalling issues. It appears that stalling occurs under heavy CPU load, although I haven’t verified that this is true in all cases.

jit.freenect.grab Release Candidate 2 is out (finally)

2011/01/10

Taking a short break from what has been a truly hectic and near-sleepless month, I found the time to upload a slightly improved jit.freenect.grab. Several users have reported to me that jit.freenect.grab stops working after a while. I think I’ve identified the culprit and this new version may solve the problem. I hope so.

jit.freenect.grab

Some people have also asked me about a Windows version of the external. libfreenect, on which this external is based, now works on Windows, and a port should be possible. Or at least, it would be if I had the time. Currently, my main priority is getting enough sleep to keep a sane mind. However, the code is on GitHub, and if anyone wants to take the time to work on a Windows version, that would be fantastic.

Some people are also interested, of course, in NITE, a skeleton tracking system developed by Primesense, the company that designed the Kinect’s depth sensor. As far as I know, NITE requires Primesense’s own drivers, which means a whole new external would be needed to use it in Jitter. It would be a very useful project, but unfortunately my current engagements don’t leave me the time to work on something like this at the moment.

Here’s hoping RC2 solves everybody’s problems.

jit.freenect.grab page now live

2010/12/03

I just published a page for the jit.freenect.grab external I have been working on. It allows users of Max/MSP/Jitter to get depth and image data from their Kinect.

Head over to http://jmpelletier.com/freenect/ for more info!

Minor update to cv.jit

2010/08/16

I have just released cv.jit version 1.7.1, which fixes two issues. A bug in cv.jit.shift, which also affected cv.jit.touches, was causing memory leaks. Furthermore, some Windows users were seeing 14001 errors when trying to use externals that make use of OpenCV functions. Both problems have now been fixed.

Using cv.jit.touches: recycling index values

2010/06/29

cv.jit.touches, a new object added in the 1.7.0 release, tracks the movement of bright spots in a greyscale image and reports “touch”, “drag” and “release” information: when regions appear, move or disappear, respectively. Each region is assigned a unique ID number, so that even as regions move, you always know where each object is from frame to frame. This is unlike cv.jit.label-based blob analysis, where you are never guaranteed that the same object in a scene will end up with the same index.

By default, cv.jit.touches outputs ever-increasing indices, although the counter is always reset to 0 when there are no active regions. For some applications, this is a reasonable approach: every touch event has its own unique ID. However, some may wish to recycle IDs. If you’re in this situation, I made an abstraction that renumbers the IDs coming out of cv.jit.touches to re-use released IDs and keep their values as low as possible. It will always output the smallest value that isn’t currently assigned to another active region.
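For the curious, here is a minimal C++ sketch of the same smallest-free-ID idea. This is only an illustration of the logic, not the abstraction’s actual implementation, and all the names are made up:

#include <map>
#include <set>

// Hypothetical sketch: remap ever-increasing raw IDs to the smallest free slot.
class IdRecycler {
    std::map<int, int> remap; // raw ID from the tracker -> recycled ID
    std::set<int> inUse;      // recycled IDs currently assigned
public:
    // On "touch": hand out the smallest value not assigned to an active region.
    int acquire(int rawId) {
        int id = 0;
        while (inUse.count(id)) ++id;
        inUse.insert(id);
        remap[rawId] = id;
        return id;
    }
    // On "drag": report the recycled ID for an active region.
    int lookup(int rawId) const { return remap.at(rawId); }
    // On "release": free the ID so it can be re-used.
    void release(int rawId) {
        inUse.erase(remap.at(rawId));
        remap.erase(rawId);
    }
};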

Download cv.jit.touches.recycle.

In order to use this file, just copy it to the “Max5/Cycling ’74/cv.jit-support/Abstractions” folder.

You can see how it works by copying the following code and selecting the “new from clipboard” option from Max’s file menu.


----------begin_max5_patcher----------
911.3oc0XssbhBDD8Y8qfhmcslaL.6a6Ovturus0VoPXhRJfwBFiIap7uuLW
zPRDbTPzTQfLyfNm9zce5FdY5D2E7mXUtNe24ONSl7xzISTSImXhY7D27nmh
yhpT2laLOOmUHbmoWSvdRnl+2qJYLmEY7EU6VqXSdZQFSn9hv2ljuQraVfY1
0Qh3UoEKuqjEKzvgR7lCl4.AgxK9pyH3bfyeMemzD0FyW7v27g61y64Ehhnb
lZoeTlFk0bkpz+oVAhmCjy95zoxSy5qsukOjVNwm1zxwcZ4fqpk+qBsOeXLb
LDYsgSCupF9OGTONBCs2vCFECOmUUEsj8ICONiEUNyYcTZgf+XTliGP9ABTG
MW.Jmt9.ATGebIpdEHrdkEQEKadd2tlkVvh4aJTaM4PTM5fTMrKYEcxEQGpE
n3ZJoEtldbtVushmWyzafq69eoAvMTv1VCjO4Ex3QIMIJKB+5fSLJs5rOIWz
QzG9LXDERGAVI9w4OjJlK3ahWwplmTFs8fDD4TIHOOUhIlpqBA75jhfmAEUi
66xiDkoOMnDkjNVuMsHgumIVWxEb4NuGYXDXwyHB3DBlPGkqH.UlU8ushqHf
VHKT64QyT4RWrviZH+bsZ1AsZ7AsZ7wiPvgZ0ktkvAmirxr8TxW2DIDJnQhD
zDbzBMAA2NIRGobnUEvPCRALCEZJfoaUD2R8qvqc4qqf9igeH.pM5OAim7ij
KJWt.ksIOZPMUDQ08nePmOnP+xkFZgm1xmVTyPN.4eybtW9+HOucGtCR9ioY
GS+e95QgsHTesyet.cgakTEdXnZC6piOgT0HRKbs2WUtdD09wfvlAuldNZQ7
mbi9rKRcvr3DGhS7pnRoBsyPK9iLuHAbm8WftsjDspWrSflvGmlBTYjAzN6B
6R2qp5dTYKe3EQp.tb92SZU7Mkw61IixgyaPOgUIRKhDo7hF2C5c2ypzjDVQ
yWJSdZxZdcBtABj.xbOYcAO4ED1e+nV5HvVrRFdrhgAJPBTw7FrpF0SrRGdr
1SDgr.Q9iJhBGdNpoGbP8m9VfU4yb4fOcv56qe72f8i5IXg1B1wyWKgDzFHA
GUHgrARnwCR.K7bAipiyp.+QEQTaBuonaNVRg6yPdfFJu36Q1OpuDH1FzNt9
zKP8RpmtADPPi9PTi5KXAVnbPgipxAEXg.KENpBrTfsYE2V0gniecHfMPBbS
UZj1mRi0Cdc5+MONptI
-----------end_max5_patcher-----------

cv.jit 1.7 is finally out!

2010/06/07

The latest version of cv.jit, 1.7.0, is finally out. I say finally because it’s been on the brink of release for several months now, but life being what it is, I only now managed to put the finishing touches on it.

The most obvious change is that the help files have been completely re-written in Max 5 format. cv.jit 1.6 and earlier help files did not display properly in Max 5, owing to some issue with double-byte comments.

A few objects have also been added. cv.jit.opticalflow combines the functionality of cv.jit.LKflow and cv.jit.HSflow. These were two of the earliest externals I wrote, and I now somewhat regret the decision to keep them separate: they essentially do the same thing, albeit in different ways. cv.jit.opticalflow also adds support for two newer optical flow estimation algorithms: block-matching and a bleeding-edge real-time algorithm by Gunnar Farnebäck. To go along with this new external (and the two older optical flow objects), I also added a drawing utility, cv.jit.flow.draw, that displays the optical flow field using hue for direction and saturation for distance.

Farnebäck optical flow, visualized with cv.jit.flow.draw
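The direction-to-hue, magnitude-to-saturation mapping is a common way to colour flow fields. Here is a rough C++ sketch of the per-pixel conversion; it shows the general idea only, not cv.jit.flow.draw’s actual code, and the function name and normalization constant are my own:

#include <algorithm>
#include <cmath>

struct HSV { float h, s, v; }; // hue in degrees, saturation and value in [0, 1]

// Map one flow vector (dx, dy) to a colour: hue encodes direction,
// saturation encodes magnitude, value is left at full brightness.
HSV flowToColour(float dx, float dy, float maxMagnitude) {
    const float PI = 3.14159265f;
    float angle = std::atan2(dy, dx);            // -PI..PI
    float magnitude = std::sqrt(dx * dx + dy * dy);
    HSV out;
    out.h = (angle + PI) / (2.0f * PI) * 360.0f; // 0..360 degrees
    out.s = std::min(magnitude / maxMagnitude, 1.0f);
    out.v = 1.0f;
    return out;
}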

I often get questions about tracking blobs, or about dealing with the fact that cv.jit.label doesn’t always give the same label to what we would perceive as being the same object. The new object cv.jit.touches addresses these issues, after a fashion. It’s a greyscale/binary region tracker that assumes the regions are of roughly the same size and don’t overlap. As the name implies, it was packaged with multi-touch interfaces in mind, and it outputs information such as “touch”, “drag” and “release”, but it can be used with other kinds of input.
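To give a feel for what a region tracker like this does internally, here is a simplified C++ sketch of greedy nearest-neighbour matching between frames. This is my own illustration of the general technique, not cv.jit.touches’s actual code:

#include <cmath>
#include <vector>

struct Region { float x, y; int id; };

// Match last frame's regions to this frame's. Matched regions keep their
// ID ("drag"), unmatched new regions get a fresh ID ("touch"), and any
// previous region left unmatched counts as a "release".
void matchRegions(const std::vector<Region>& previous,
                  std::vector<Region>& current,
                  float maxDistance, int& nextId) {
    std::vector<bool> taken(current.size(), false);
    for (const Region& p : previous) {
        int best = -1;
        float bestDist = maxDistance;
        for (std::size_t i = 0; i < current.size(); ++i) {
            if (taken[i]) continue;
            float d = std::hypot(current[i].x - p.x, current[i].y - p.y);
            if (d < bestDist) { bestDist = d; best = static_cast<int>(i); }
        }
        if (best >= 0) {             // "drag": carry the ID over
            current[best].id = p.id;
            taken[best] = true;
        }                            // else: "release" for p.id
    }
    for (std::size_t i = 0; i < current.size(); ++i)
        if (!taken[i]) current[i].id = nextId++;  // "touch": new region
}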

cv.jit.threshold implements adaptive thresholding, in which each pixel is compared to the average brightness of its surroundings instead of a fixed value. This is especially useful when working under slightly uneven lighting situations.
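The idea is simply to threshold each pixel against a local mean. A naive C++ sketch follows; it illustrates the technique only (a real implementation would use an integral image for speed), and the parameter names are my own, not cv.jit.threshold’s attributes:

#include <cstdint>
#include <vector>

// Naive mean adaptive threshold on a greyscale image: a pixel is white if
// it is brighter than the average of its (2 * radius + 1)^2 neighbourhood
// minus a small bias, black otherwise.
std::vector<uint8_t> adaptiveThreshold(const std::vector<uint8_t>& img,
                                       int width, int height,
                                       int radius, int bias) {
    std::vector<uint8_t> out(img.size(), 0);
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            int sum = 0, count = 0;
            for (int dy = -radius; dy <= radius; ++dy) {
                for (int dx = -radius; dx <= radius; ++dx) {
                    int nx = x + dx, ny = y + dy;
                    if (nx < 0 || ny < 0 || nx >= width || ny >= height)
                        continue;
                    sum += img[ny * width + nx];
                    ++count;
                }
            }
            int localMean = sum / count;
            if (img[y * width + x] > localMean - bias)
                out[y * width + x] = 255;
        }
    }
    return out;
}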

Finally, cv.jit.grab is a ridiculously simple but very useful abstraction that wraps jit.qt.grab and jit.dx.grab depending on your platform. This allows you to write cross-platform patches.

Another big change, apart from the help files, is that I moved the cv.jit site to my own domain. As mentioned in every help file and abstraction, IAMAS, the great media art institution in Gifu, Japan, has provided support for my work on cv.jit, in the form of computers, software, time, unwitting testers and advice from teachers and colleagues. Alas, my contract having reached its end, I no longer work there, and I thought it might be best to gather all my work under the same roof, here. The actual files are hosted on SourceForge, so that those who are interested in actually doing some development can dig into the SVN repository.

Head over to the new cv.jit page for downloads!

Gragra DSP Exhibition 2010 – Kyoto

2010/03/22

Shinobu Toma – Ghost in the Space

Kazuomi Eshima – Remind

Kei Shiratori – twist suburbia

Daichi Misawa – Skies

Leo Kikuchi – Landscape in my Arms

Kaori Takemoto – Hunter-gatherer Colorist

Kanna Komaki – Utopian Babble

IAMAS 2010 Graduation Exhibition

2010/02/18

The IAMAS 2010 Graduation Exhibition opened this morning. Graduating students from the Academy and the Institute will be showing their works until Sunday February 21st.

After the traditional ribbon ceremony, a busload of high school students poured into Softopia’s Sophia Hall.

Mitsuru Tokisato – What Could Have or Can Happen?

DSP course student Mitsuru Tokisato’s piece “What Could Have or Can Happen?” is a photographic record of his practice of surrounding random objects with white tape.

Yuuya Ito – Cell #00

Yuuya Ito, the other DSP student exhibiting, created and performed in a short play titled “Cell #00”. He is exhibiting the device he built for his performance: a large faucet that he plants in the sand. He can control video projections by turning the faucet and by planting the pipe in various places.

Reinhard Gupfinger – Singing Robot Cricket

Reinhard Gupfinger, an exchange student from the University of Art and Industrial Design Linz, in Austria, built a robotic chirping cricket.