OMEagination: gesture-based brain visualization at Oblong (2013)

OMEagination: a gesture-based 3D visualization of brain structures and activity, created in collaboration with the University of California San Francisco and Lawrence Berkeley National Laboratory as part of the OME Precision Medicine Summit. Built with the Oblong Greenhouse SDK, FSLView, and a consumer depth sensor.

Project collaborators:
UCSF Memory and Aging Center: Bill Seeley, Jesse Brown, and Andrew Trujillo.
Lawrence Berkeley National Laboratory: Leonid Oliker (Future Technologies Group), Gunther Weber (Visualization Group and the NERSC Analytics Team), Aydin Buluc (Applied Mathematics & Scientific Computing), and Daniela Ushizima (Vis/Analytics Group).
IDEO: Stacey Chang (Health & Wellness practice).
Oblong Industries: Kwin Kramer, David Kung, Sarah Vieweg, John Carpenter, Corey Porter, Mattie Ruth Kramer Backman, and Michael Schuresko.


Seismo: gesture-based earthquake visualization at Oblong (2013)

Oblong Industries: Seismo (2013). A gesture-based earthquake data visualization built with g-speak, Greenhouse, and a low-cost depth sensor.