Additional information about our display technology can be found in
the following selected publications.
2006
D. Cotting and M. Gross,
Interactive Environment-Aware Display Bubbles,
Proc. of ACM UIST 2006,
ACM Press, pp. 245-254.
We present a novel display metaphor which extends traditional tabletop
projections in collaborative environments by introducing freeform,
environment-aware display representations and a matching set of
interaction schemes. For that purpose, we map personalized widgets or
ordinary computer applications that have been designed for a
conventional, rectangular layout into space-efficient bubbles whose
warping is performed with a potential-based physics approach. With a set
of interaction operators based on laser pointer tracking, these freeform
displays can be transformed and elastically deformed using focus and
context visualization techniques. We also provide interactive operations
for instantiating, cloning, cutting & pasting, deleting, and grouping
bubbles, and we allow for user-drawn annotations
and text entry using a projected keyboard. Additionally, an optional
environment-aware adaptivity of the displays is achieved by
imperceptible, real-time scanning of the projection geometry.
Subsequently, collision responses of the bubbles with non-optimal
surface parts are computed in a rigid body simulation. The extraction of
the projection surface properties runs concurrently with the main
application of the system. Our approach is entirely based on
off-the-shelf, low-cost hardware including DLP-projectors and FireWire
cameras.
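The potential-based warping and collision response described above can be illustrated, in heavily simplified form, as pairwise repulsion between circular bubbles. This is a toy sketch, not the paper's method; the function name and force model are hypothetical:

```python
import math

def relax_bubbles(centers, radii, iterations=100, step=0.1):
    """Push overlapping circular 'bubbles' apart with a simple pairwise
    repulsion force proportional to their overlap (toy potential model)."""
    centers = [list(c) for c in centers]
    for _ in range(iterations):
        forces = [[0.0, 0.0] for _ in centers]
        for i in range(len(centers)):
            for j in range(i + 1, len(centers)):
                dx = centers[i][0] - centers[j][0]
                dy = centers[i][1] - centers[j][1]
                dist = math.hypot(dx, dy) or 1e-9
                overlap = radii[i] + radii[j] - dist
                if overlap > 0:  # bubbles intersect: repel along center line
                    fx, fy = dx / dist * overlap, dy / dist * overlap
                    forces[i][0] += fx
                    forces[i][1] += fy
                    forces[j][0] -= fx
                    forces[j][1] -= fy
        for c, f in zip(centers, forces):
            c[0] += step * f[0]
            c[1] += step * f[1]
    return [tuple(c) for c in centers]
```

Two unit bubbles placed one unit apart separate until they just touch, i.e. their centers converge to a distance of two radii.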
2005
D. Cotting, R. Ziegler, M. Gross, and H. Fuchs,
Adaptive Instant Displays: Continuously Calibrated Projections Using
Per-Pixel Light Control,
Proc. of Eurographics 2005,
Eurographics Association, pp. 705-714.
We present a framework for achieving user-defined on-demand displays
in setups containing bricks of movable cameras and DLP-projectors. A
dynamic calibration procedure is introduced, which handles cameras and
projectors in a unified way and allows continuous flexible setup
changes, while seamless projection alignment and blending are performed
simultaneously. For interaction, an intuitive laser pointer based
technique is developed, which can be combined with real-time 3D
information acquired from the scene. All these tasks can be performed
concurrently with the display of a user-chosen application in a
non-disturbing way. This is achieved by using an imperceptible
structured light approach enabling pixel-based surface light control
suited for a wide range of computer graphics and vision algorithms. To
ensure scalability of light control in the same working space, multiple
projectors are multiplexed.
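Calibrating projectors and cameras against a planar surface commonly reduces to estimating a homography between the device image planes. A minimal Direct Linear Transform sketch of that step (function names are hypothetical; the paper's unified, continuously updated calibration is considerably more involved):

```python
import numpy as np

def estimate_homography(src_pts, dst_pts):
    """Direct Linear Transform: estimate the 3x3 homography H that maps
    src_pts onto dst_pts, given >= 4 point correspondences."""
    rows = []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # H (flattened) is the right null vector of the stacked system.
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def warp_point(H, pt):
    """Apply H to a 2D point via homogeneous coordinates."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)
```

With exact correspondences, four non-collinear points determine H up to scale; in practice one would use many noisy correspondences and a robust estimator.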
D. Cotting, M. Naef, M. Gross, and H. Fuchs,
Imperceptible Patterns for Reliable Acquisition of Mixed Reality
Environments,
Proc. of Intl. Workshop on Image Analysis for Multimedia Interactive
Services 2005.
Projection-based mixed and augmented reality settings often require
concurrent optical camera acquisition. Unfortunately, the grabbed images
frequently capture the projected imagery in addition to the desired
scenery, introducing undesired interference and complicating image
analysis. To efficiently improve signal-to-noise ratio, we present a
method allowing the acquisition to take place under controlled
illumination conditions. By exploiting the micro-mirror modulation
pattern used by Digital Light Processing (DLP) projectors, a pixel-level
control of light can be achieved. Since the patterns are imperceptible
to the human eye and only slightly degrade the projected images,
structured light techniques are introduced into human-inhabited mixed and
augmented reality environments, where they were previously often too
intrusive. This extended abstract gives an overview of the proposed
embedding and illustrates feasibility and usefulness of the approach
with representative example applications.
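On the acquisition side, a camera exposed only during the embedded pattern's time slot sees the controlled illumination. One simple, purely illustrative way to classify pixels is to difference that synchronized frame against a reference exposure (function name and threshold are hypothetical, not the paper's pipeline):

```python
def extract_binary_pattern(pattern_frame, reference_frame, threshold=30):
    """Classify each camera pixel as embedded bit 1 or 0 by comparing a
    frame exposed during the pattern's time slot against a reference frame.
    Frames are 2D lists of grayscale values; purely illustrative."""
    return [
        [1 if abs(a - b) > threshold else 0 for a, b in zip(row_p, row_r)]
        for row_p, row_r in zip(pattern_frame, reference_frame)
    ]
```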
2004
D. Cotting, M. Naef, M. Gross, and H. Fuchs,
Embedding Imperceptible Patterns into Projected Images for
Simultaneous Acquisition and Display,
Proc. of IEEE/ACM International Symposium on Mixed and Augmented Reality
2004,
IEEE Computer Society Press, pp. 100-109.
We introduce a method to imperceptibly embed arbitrary binary
patterns into ordinary color images displayed by unmodified
off-the-shelf Digital Light Processing (DLP) projectors. The encoded
images are visible only to cameras synchronized with the projectors and
exposed for a short interval, while the original images appear only
minimally degraded to the human eye. To achieve this goal, we analyze
and exploit the micro-mirror modulation pattern used by the projection
technology to generate intensity levels for each pixel and color
channel. Our real-time embedding process maps the user’s original color
image values to the nearest values whose camera-perceived intensities
are the ones desired by the binary image to be embedded. The color
differences caused by this mapping process are compensated by
error-diffusion dithering. The non-intrusive nature of our novel
approach allows simultaneous (immersive) display and acquisition under
controlled lighting conditions, as defined on a pixel level by the
binary patterns. We therefore introduce structured light techniques into
human-inhabited mixed and augmented reality environments, where they
were previously often too intrusive.
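The abstract compensates the color shift introduced by the embedding with error-diffusion dithering. A standard Floyd-Steinberg sketch of that idea follows; here `quantize` is an arbitrary callback standing in for the paper's mapping to the nearest embeddable intensity value:

```python
def error_diffusion(image, quantize):
    """Floyd-Steinberg error diffusion: quantize each pixel and spread the
    residual error to the unprocessed neighbors (7/16, 3/16, 5/16, 1/16)."""
    h, w = len(image), len(image[0])
    work = [list(map(float, row)) for row in image]  # mutable working copy
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            new = quantize(work[y][x])
            err = work[y][x] - new
            out[y][x] = new
            if x + 1 < w:
                work[y][x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    work[y + 1][x - 1] += err * 3 / 16
                work[y + 1][x] += err * 5 / 16
                if x + 1 < w:
                    work[y + 1][x + 1] += err * 1 / 16
    return out
```

Diffusing the quantization error preserves the average intensity of a region even though each pixel is forced to one of the allowed values, which is why the embedded image appears only minimally degraded.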