
We introduce a new data structure, the bilateral grid,
which enables real-time, edge-aware image processing.
2D images are lifted onto a coarse, 3D grid,
where the z-component corresponds to intensity.
As a result, points across a strong edge
are distant in the grid.
We introduce slicing, an operation that
exploits this property to extract
a discontinuous, 2D image from a smooth bilateral grid.
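To make the lifting and slicing steps concrete, here is a minimal NumPy sketch under simplifying assumptions: a grayscale image with intensities in [0, 1], nearest-neighbor splatting and slicing instead of the trilinear interpolation a real implementation would use, and illustrative helper and parameter names (splat, slice_grid, s_sigma, r_sigma) that are not taken from the paper.

import numpy as np

def splat(values, reference, weights=None, s_sigma=16, r_sigma=0.1):
    # Lift 2D data onto the coarse 3D grid. The z index of each pixel comes
    # from the reference image's intensity, so pixels on opposite sides of a
    # strong edge land in distant grid cells.
    h, w = reference.shape
    gh, gw = int(np.ceil(h / s_sigma)) + 1, int(np.ceil(w / s_sigma)) + 1
    gd = int(np.ceil(1.0 / r_sigma)) + 1
    grid = np.zeros((gh, gw, gd, 2))              # homogeneous (sum, weight)
    if weights is None:
        weights = np.ones_like(reference, dtype=float)
    ys, xs = np.mgrid[0:h, 0:w]
    gi = np.round(ys / s_sigma).astype(int)
    gj = np.round(xs / s_sigma).astype(int)
    gk = np.round(reference / r_sigma).astype(int)
    np.add.at(grid, (gi, gj, gk, 0), values * weights)
    np.add.at(grid, (gi, gj, gk, 1), weights)
    return grid

def slice_grid(grid, reference, s_sigma=16, r_sigma=0.1):
    # Read the grid back at every pixel's (x/s, y/s, intensity/r) position.
    # Because the lookup follows the reference intensity, the sliced image
    # can be discontinuous across edges even when the grid itself is smooth.
    h, w = reference.shape
    ys, xs = np.mgrid[0:h, 0:w]
    gi = np.round(ys / s_sigma).astype(int)
    gj = np.round(xs / s_sigma).astype(int)
    gk = np.round(reference / r_sigma).astype(int)
    sums = grid[gi, gj, gk, 0]
    wts = grid[gi, gj, gk, 1]
    return np.where(wts > 0, sums / np.maximum(wts, 1e-8), 0.0)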
We demonstrate a variety of real-time, edge-aware
algorithms using the bilateral grid.
We demonstrate real-time, bilateral filtering
on a noisy, 12-megapixel input.
The user can explore the parameter space
while getting real-time feedback on the whole image.
We zoom in to show the full resolution of the image.
Real-time feedback makes it easy for the user
to fine-tune filter parameters.
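Given the illustrative helpers sketched earlier, the grid-based bilateral filter itself amounts to splat, blur, slice. SciPy's gaussian_filter stands in for the separable blur on the coarse grid; this CPU version only illustrates the structure, not the GPU implementation shown in the video.

from scipy.ndimage import gaussian_filter

def grid_bilateral_filter(image, s_sigma=16, r_sigma=0.1):
    # Splat the image into the grid, blur the small 3D grid, then slice.
    grid = splat(image, image, s_sigma=s_sigma, r_sigma=r_sigma)
    # One grid cell of blur corresponds to (s_sigma, r_sigma) in image space;
    # sums and weights are blurred identically to keep the average unbiased.
    grid[..., 0] = gaussian_filter(grid[..., 0], sigma=1.0)
    grid[..., 1] = gaussian_filter(grid[..., 1], sigma=1.0)
    return slice_grid(grid, image, s_sigma, r_sigma)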
We further accelerate the bilateral filter on video
by sampling a random 10% of the input pixels.
Sub-sampling causes swimming artifacts, shown on the left.
We eliminate these artifacts by applying
a temporal exponential filter, shown on the right.
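One way such a temporal exponential filter could look, applied to the per-frame grids before slicing; alpha is an illustrative smoothing factor rather than a value from the paper.

def smooth_grids_over_time(grids, alpha=0.3):
    # Exponentially weighted running average of the coarse per-frame grids.
    # Averaging in the grid domain is cheap and suppresses the flicker that
    # pixel sub-sampling would otherwise introduce.
    running = None
    for grid in grids:
        if running is None:
            running = grid.copy()
        else:
            running = alpha * grid + (1 - alpha) * running
        yield running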
We demonstrate the real-time video abstraction technique
by Winnemöller and colleagues on HD video.
We adapt real-time video abstraction
to follow a point of focus, here controlled
by the mouse pointer.
Elements that are farther away from the point of focus
are more abstracted.
We first compute the distance to the point of focus
and cross bilateral filter it with the input image
to create an adapted importance map that respects
the edges of the image.
We use this map to composite the levels of a bilateral pyramid.
The rest of the abstraction pipeline remains unchanged.
Note that our method is efficient enough
to compute five bilateral filters per frame
and allow parameter adjustment while the video plays.
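The cross bilateral filtering in this step can be sketched with the same illustrative helpers: the distance-to-focus map supplies the values, while the image supplies the range coordinate, so the smoothed importance map snaps to the image's edges. Function and parameter names are again assumptions, not the paper's code.

from scipy.ndimage import gaussian_filter

def cross_bilateral(data, edge_image, s_sigma=16, r_sigma=0.1):
    # Splat the data (e.g. distance to the point of focus) into a grid whose
    # range axis is driven by the image intensity, blur, and slice with the
    # image, so the result is smooth except across the image's edges.
    grid = splat(data, edge_image, s_sigma=s_sigma, r_sigma=r_sigma)
    grid[..., 0] = gaussian_filter(grid[..., 0], sigma=1.0)
    grid[..., 1] = gaussian_filter(grid[..., 1], sigma=1.0)
    return slice_grid(grid, edge_image, s_sigma, r_sigma)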
We transfer the look of a model photograph
to an input video in real time.
Our result captures the tonal balance and level
of detail of the model in the spirit of the work
by Bae and colleagues.
Our method enables on-the-fly adjustment of parameters
on live, HD video.
Compared to existing techniques that
require offline processing, our approach
makes it easy to modify the level of detail
and adjust the overall tonal balance of this shot
as it plays.
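As a loose sketch of the idea only: the transfer can be viewed as a two-scale decomposition with the bilateral filter, where the input's base and detail layers are remapped toward the model's. The simple mean and standard-deviation matching below stands in for the richer histogram transfer of Bae and colleagues, and grid_bilateral_filter is the illustrative helper above.

def match_stats(src, ref):
    # Shift and scale src so its mean and standard deviation match ref's.
    return (src - src.mean()) / (src.std() + 1e-8) * ref.std() + ref.mean()

def transfer_look(inp, model):
    # Base layer = bilateral-filtered image; detail layer = residual.
    inp_base = grid_bilateral_filter(inp)
    model_base = grid_bilateral_filter(model)
    inp_detail = inp - inp_base
    model_detail = model - model_base
    # Match tonal balance on the base and level of detail on the residual.
    return match_stats(inp_base, model_base) + match_stats(inp_detail, model_detail)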
We demonstrate bilateral grid painting
by locally modifying the hue while respecting
strong intensity edges.
The initial mouse click locks the brush
to an intensity level.
The brush is aware of edges and does not
paint across intensity discontinuities.
For instance, the brush does not affect the white wall
by the door.
Our method runs in real time on the GPU.
We update the entire 2-megapixel input on every frame.
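One simplified way to express such an edge-aware brush with the helpers above: stroke pixels whose intensity is close to the level sampled on the initial click are splatted into the grid, the grid is blurred, and slicing with the image gives an influence mask that does not cross intensity edges. The names paint_mask, stroke_mask, and click_yx are illustrative, and this is a sketch of the behavior rather than the paper's GPU brush.

import numpy as np
from scipy.ndimage import gaussian_filter

def paint_mask(image, stroke_mask, click_yx, s_sigma=16, r_sigma=0.1):
    # stroke_mask: boolean mask of brushed pixels; click_yx: (row, col) of
    # the initial click. The brush is locked to the intensity sampled there,
    # so only stroke pixels near that level contribute paint to the grid.
    locked_level = image[click_yx]
    on_level = np.abs(image - locked_level) < r_sigma
    painted = (stroke_mask & on_level).astype(float)
    grid = splat(painted, image, s_sigma=s_sigma, r_sigma=r_sigma)
    grid[..., 0] = gaussian_filter(grid[..., 0], sigma=1.0)
    grid[..., 1] = gaussian_filter(grid[..., 1], sigma=1.0)
    # Slicing with the image keeps the influence from leaking across edges,
    # e.g. onto the white wall by the door.
    return np.clip(slice_grid(grid, image, s_sigma, r_sigma), 0.0, 1.0)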
We apply the same technique to locally modify
tone-mapping parameters.
We adjust exposure parameters on this 15-megapixel HDR panorama.
We paint over the gate to correct the overexposed region.
Notice how the bars are unaffected.
Similarly, the windows are unaffected
while the overexposed dome is corrected.
Our GPU algorithm updates the entire 15-megapixel image
on every frame.
We demonstrate scribble interpolation
using the bilateral grid.
We scribble on the input image, shown on the left.
The extracted influence map is shown on the right.
We use the influence map to drive a color shift.
The white scribbles cause the cloth
to change from red to blue, while the black scribbles
protect the candies.
We zoom in to show the full image resolution.
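Finally, a sketch of how scribble interpolation can be expressed with the same helpers: white and black scribbles are splatted with values 1 and 0 and unit weight, the grid is blurred, and slicing yields an influence map that follows the image's edges; the map can then weight the red-to-blue color shift. The masks and names are illustrative assumptions.

import numpy as np
from scipy.ndimage import gaussian_filter

def influence_map(image, white_mask, black_mask, s_sigma=16, r_sigma=0.1):
    # white_mask / black_mask: boolean masks of the scribbled pixels.
    # White scribbles carry value 1, black scribbles value 0; un-scribbled
    # pixels carry no weight, so blurring interpolates between scribbles
    # while the range axis keeps the interpolation from crossing edges.
    values = white_mask.astype(float)
    weights = (white_mask | black_mask).astype(float)
    grid = splat(values, image, weights=weights,
                 s_sigma=s_sigma, r_sigma=r_sigma)
    grid[..., 0] = gaussian_filter(grid[..., 0], sigma=2.0)
    grid[..., 1] = gaussian_filter(grid[..., 1], sigma=2.0)
    influence = np.clip(slice_grid(grid, image, s_sigma, r_sigma), 0.0, 1.0)
    return influence    # near 1 where the color shift should apply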