Painting a New Reality: Wetbrush at The Tech
Imagine dipping your brush into a dollop of oil paint, lifting it to the canvas and composing layers of pigment until they match the image in your mind. Is it possible to have an immersive, tactile experience like this in the digital world?
Virtual Painting in Three Dimensions
Wetbrush recreates the textures and colors of real-life oil painting with a physics-based brush and particle simulation on a pressure-sensitive tablet. Users choose their brush size and colors, blend their paints and add complex textures in ways that feel shockingly like the real world.
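Wetbrush's actual solver is far more sophisticated, but the core idea of a particle simulation for paint — particles that carry pigment and blend where a brush dab overlaps them — can be sketched in a few lines. This is a toy illustration only, not Adobe's code; `Particle`, `dab`, and the simple linear color blend are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Particle:
    x: float
    y: float
    color: tuple  # (r, g, b), each channel in 0..1

def dab(canvas, x, y, color, radius=1.0, mix=0.5):
    """Deposit one brush dab: particles within the brush radius
    blend toward the brush color, and a new particle is added
    carrying the fresh pigment."""
    for p in canvas:
        if (p.x - x) ** 2 + (p.y - y) ** 2 <= radius ** 2:
            p.color = tuple(
                (1 - mix) * old + mix * new
                for old, new in zip(p.color, color)
            )
    canvas.append(Particle(x, y, color))

canvas = []
dab(canvas, 0.0, 0.0, (1.0, 0.0, 0.0))  # a red dab
dab(canvas, 0.5, 0.0, (0.0, 0.0, 1.0))  # an overlapping blue dab
# the first particle's pigment is now a red-blue mix: (0.5, 0.0, 0.5)
```

A real system like Wetbrush simulates fluid dynamics, brush-bristle physics, and paint volume on top of this, which is why it demands far more computing power than a typical home tablet.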
“One of our goals in building Wetbrush was to make a system that takes our intuition from the natural world and maps it into a computer. You can create highly detailed ripples and ridges and bumps that can even be 3D printed to get the natural lighting effects,” explains Nathan Carr, Principal Scientist for Adobe Research.
Still in its developmental phase, Wetbrush uses complex algorithms that would overwhelm the tablets and computers most of us have at home, but it offers a glimpse into the future of immersive, mixed reality creative tools.
“We hope the computing technology will get faster so Wetbrush will be accessible to everyone,” says Nathan. “But in the meantime, we’re putting it in the museum to give people a hands-on experience with the future.”
And this is the heart of the Reboot Reality exhibit—to allow patrons to see what’s around the corner in technology and discover their own power to solve problems and create with new tools. The Wetbrush interaction is ideal for the museum setting. There’s no headset or special gear, and users can figure it out by doing it, without a tutorial or guide.
“Wetbrush is an excellent example of the creative design process that we love to get our visitors engaged in,” says Nadav Hochman, Experience Developer and Program Manager, Art & Technology at the Tech Museum. “It is extremely easy to get started by just playing around, but it quickly morphs into an intentional exercise that visitors can stay at for 10, 20, 30 minutes or longer.”
Museum Visitors See the Future, and Shape It
Putting Wetbrush at The Tech gives patrons access to cutting-edge technology, but it also gives researchers an opportunity to see their inventions in the hands of users, and to work their reactions into the development process.
For example, while watching people use Wetbrush, the engineering team observed that people want to collaborate on their art. That points to a significant shift.
“There’s some sort of generational shift happening,” notes Nathan. “When we interview professional artists, they want to do their own work in isolation. But when you look at the younger generation, they use tools differently. They want to collaborate.” The team is taking the observation to heart, with upcoming plans to investigate the technologies that can make collaborative painting possible.
They also plan to add to the exhibit examples of 3D-printed Wetbrush paintings. Nathan explains the idea: “You’ll be able to walk up and actually feel the paintings, making the process much more visceral. When you’re painting on a computer, you realize that you’re not just creating colors, you’re creating a volume of paint. I think that if visitors can actually feel the ridges of paint, it’ll increase their appreciation for all of the possibilities.”
Bringing to life a complex system like Wetbrush required a team of dedicated research scientists – Zhili Chen, Byungmoon Kim, and Xin Sun – and close guidance from a professional technical artist, Daichi Ito. One thing that really excites the team behind Wetbrush is how broad the technology’s impact has already been. “There are times when great research happens in a lab but never reaches the hands of users. With Wetbrush, we’re seeing it all the way through to the public, and that’s just thrilling,” says Nathan.
If you’d like to try out Wetbrush yourself, visit Reboot Reality at The Tech when it opens on Friday, May 26. At the exhibit, you’ll also be able to test drive a range of immersive digital and VR projects from Google, Oculus from Facebook, and Stanford.