GLipKit Information

This page contains some basic information about GLipKit.

  1. What is GLipKit?
     GLipKit (OpenGL Image Processing Kit) is a library for image processing that exploits programmable 3D graphics hardware. Its main focus is on shape-from-stereo and structure-from-motion algorithms. It is a descendant of an experimental epipolar matching procedure that ran mostly on 3D graphics hardware. Currently the library mainly addresses structure from motion, i.e. the estimation of image disparities (optical flow).

  2. What are the basic concepts? What is unique about this approach?
     The texturing capability of 3D graphics hardware has always allowed image processing tasks like image warping to be done in fast hardware. Vertex programs and especially pixel programs now enable accelerated general image processing on commodity 3D hardware. Of course, there are limitations: the image resolution is usually bounded by 2kx2k, the precision of the color channels is very limited (exception: DX9-class hardware), etc. Furthermore, the image processing pipeline can be reordered for better performance. Changing OpenGL contexts between different pixel buffers is rather expensive, so reducing the number of context switches is a central goal of this library.
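     As an illustration of the kind of per-pixel work such hardware can do (this fragment is not taken from GLipKit itself), a pixel program in the ARB_fragment_program assembly language, available on DX9-class hardware, can compute the absolute difference of two images in a single rendering pass:

     ```
     !!ARBfp1.0
     # Sample the two input images bound to texture units 0 and 1.
     TEMP a, b;
     TEX a, fragment.texcoord[0], texture[0], 2D;
     TEX b, fragment.texcoord[0], texture[1], 2D;
     # Per-pixel absolute difference, e.g. as a simple matching cost.
     SUB a, a, b;
     ABS result.color, a;
     END
     ```

     Rendering a screen-aligned quad with this program applies the operation to every pixel at once; the result can stay in a pixel buffer for the next processing stage, avoiding a round trip to the CPU.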

  3. Why Objective C?
     There are mainly two reasons for implementing GLipKit in Objective C:

     1. It must be a language derived from C, so that new OpenGL extensions can be used easily without the need to write custom language bindings for them.
     2. Objective C is better suited for rapid prototyping than C++, and I like the language. Since MinGW supports Objective C, it is possible to use it under Win32. Otherwise I would use plain C.

  4. License?
     This code is published under the GNU General Public License (see COPYING).

  5. Where is the source code?
     The source is available at the SourceForge project page.

  6. Documentation?
     This code is hardly documented at all. On the other hand, the source should be relatively easy to understand ;-)

  7. Available Algorithms

  8. Near Future Plans?
     There are a number of smaller items on the to-do list. A general epipolar matcher working on non-rectified images has high priority and should be rather easy to implement. On my Radeon 9000 hardware, the epipolar lines for every pixel in the key image will be calculated as a preprocessing step on the CPU. The epipolar constraints will then be encoded in a texture, which is used as a look-up table during matching.

    The focus will be on computational stereo in the near future. Image filters will be implemented only as needed.
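    For non-rectified images, the epipolar line in the second image for a key-image pixel x = (u, v, 1)^T is l' = F x, where F is the fundamental matrix. A minimal CPU-side sketch of this preprocessing step (the function name and the packing convention are my own, not GLipKit's):

    ```c
    #include <assert.h>
    #include <math.h>

    /* Compute the epipolar line l' = F * x for key-image pixel (u, v),
       given a 3x3 fundamental matrix F stored row-major.  The line
       coefficients (a, b, c) satisfy a*u' + b*v' + c = 0 for the
       corresponding pixel (u', v') in the other image. */
    static void epipolar_line(const float F[9], float u, float v,
                              float line[3])
    {
        float a = F[0]*u + F[1]*v + F[2];
        float b = F[3]*u + F[4]*v + F[5];
        float c = F[6]*u + F[7]*v + F[8];
        /* Normalize so (a, b) is a unit normal; this bounds the
           coefficients, which helps when quantizing them into the
           color channels of a look-up texture. */
        float n = sqrtf(a*a + b*b);
        line[0] = a / n;
        line[1] = b / n;
        line[2] = c / n;
    }

    int main(void)
    {
        /* For a purely horizontal stereo setup the fundamental matrix
           is [0 0 0; 0 0 -1; 0 1 0]; every epipolar line is then the
           horizontal scanline v' = v. */
        const float F[9] = { 0, 0, 0,   0, 0, -1,   0, 1, 0 };
        float line[3];
        epipolar_line(F, 10.0f, 42.0f, line);
        /* Resulting line: 0*u' - 1*v' + 42 = 0, i.e. v' = 42. */
        assert(fabsf(line[0]) < 1e-6f);
        assert(fabsf(line[1] + 1.0f) < 1e-6f);
        assert(fabsf(line[2] - 42.0f) < 1e-6f);
        return 0;
    }
    ```

    Running this per key-image pixel yields three floats per pixel, which map naturally onto the RGB channels of the constraint texture described above.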

  9. Acknowledgements
     The synthetic dino and bowl (schale) datasets were generated with virtual turntable software written by Bernhard Reitinger.

  10. Further Links

© 2003 by Christopher Zach