A ‘living’ underwater world driven by activity data within the USC Cinematic Arts SCI building.
Multi-User Public Visualization
A large lobby touchscreen visualizes user and building activity as a collective dynamic world. Each user has an underwater avatar assigned by their role (student rank, etc.) that ages with the amount of time they spend in the building. For example, jellyfish grow from tiny babies to adults. As users move through the physical building, their avatar's virtual location updates via indoor wifi localization. A wireframe of the building provides the virtual map coordinates.
Avatars generate flora as they spend time and perform specific activities in particular physical locations. For example, if a user studies for an extended period in a certain room, a lily pad will grow in that virtual room. The building's smart sensors feed temperature and CO2 data into the visualization, which transforms the virtual lights within the scene (warm versus cool for temperature, and light versus dark for CO2).
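The sensor-to-lighting mapping described above can be sketched as a simple normalization. This Python sketch is illustrative only (the installation itself runs in Unity 3D), and the temperature and CO2 ranges are assumed calibration values, not the project's actual thresholds:

```python
def sensor_to_lighting(temp_c, co2_ppm,
                       temp_range=(15.0, 30.0),      # assumed indoor range, deg C
                       co2_range=(400.0, 2000.0)):   # assumed indoor range, ppm
    """Map sensor readings to two normalized lighting parameters.

    Returns (warmth, brightness), each in [0, 1]:
    warmth     -> 0 = cool light, 1 = warm light (driven by temperature)
    brightness -> 0 = dark scene, 1 = bright scene (inverted: more CO2
                  darkens the scene)
    """
    def normalize(value, lo, hi):
        # Clamp to the calibration range, then scale to [0, 1].
        return min(max((value - lo) / (hi - lo), 0.0), 1.0)

    warmth = normalize(temp_c, *temp_range)
    brightness = 1.0 - normalize(co2_ppm, *co2_range)
    return warmth, brightness
```

In the engine, the two normalized values would then drive an interpolation between a cool and a warm light color and between low and high scene illumination.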
Every visualized element is dynamic and living, growing from embryo to adult and then dissolving.
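The embryo-to-adult-to-dissolution life cycle can be modeled as a time-driven stage lookup. A minimal Python sketch, with hypothetical stage thresholds (the installation's actual timings are not specified here):

```python
import bisect

# Hypothetical stage boundaries in hours of accumulated activity.
STAGES = [(0, "embryo"), (5, "juvenile"), (20, "adult"), (60, "dissolving")]
MAX_AGE = 100  # after this the element is removed from the scene

def life_stage(age_hours):
    """Return the current life stage of a visualized element,
    or None once it has fully dissolved."""
    if age_hours >= MAX_AGE:
        return None
    thresholds = [t for t, _ in STAGES]
    # Find the last stage whose threshold has been passed.
    index = bisect.bisect_right(thresholds, age_hours) - 1
    return STAGES[index][1]
```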
Users (or the public) can explore the visualization on the touchscreens. They can zoom and rotate to view different areas and select different layers of information via a key on the side of the screen. Tapping on an object shows a call-out card with information about the user or the meaning of the visualized object.
Single User Private Visualization
The individual app feeds data into the aquarium and serves as a window onto the virtual world from a first-person point of view. Here, users actively track their activities to influence the multi-user virtual world. They can also tag notes to specific areas of the building and leave them for other users.
Project Leader | Art Direction | Concept Development: Amanda Tasse
Database Development: Charles Fox Haskins, Qiaosong Wei
Sensor Hacking: Soren Massoumi
Unity3D Development: Will Hellwarth, Isaac Steele
3D Modeling | Textures: Maureen Lu, Amanda Tasse
Android App Development: Xiao Yang, Suvil Singh
Proof-of-Concept Research Goals
Interactivity: This project grew out of a series of experiments integrating real-world data into interactive environments using the Unity 3D game engine. In this case, the world is both generated by and responsive to data parameters. It includes a unique navigation system: a first-person camera view mapped to real-world coordinates running simultaneously with a multi-user shared view. Both are tied to real-world information, update in real time, and provide a new model for systemic location-based play.
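Both views depend on registering physical building coordinates onto the virtual scene. The mapping can be as simple as a scale-and-offset calibration, sketched below in Python; the origin and scale values are hypothetical, and the real registration between the wifi localization and the building wireframe may be more involved:

```python
def building_to_virtual(x_m, y_m, origin=(0.0, 0.0), scale=2.0):
    """Map a physical position in meters (e.g., from indoor wifi
    localization) to virtual-world coordinates.

    origin and scale are hypothetical calibration values aligning the
    localization frame with the building's wireframe model.
    """
    ox, oy = origin
    return ((x_m - ox) * scale, (y_m - oy) * scale)
```

The same transform can position an avatar in the shared view and place the first-person camera, so both stay consistent with the user's real location.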
Visualization: I experimented with living expressions of data that take on a life of their own, starting from a data point and then evolving through their own life cycle. The visualized objects build up a coordinated ecosystem, which I call data landscapes or portraits. This alternative method of visualizing information uses narrative symbols, characters, and metaphors to playfully represent and reflect on activity within a specific location. An aquarium metaphor, for instance, might suit a kids' aquarium or a marine biology building. The approach could easily be modified to reflect the needs, interests, and style of a particular population.
Engineering: My primary collaborator, Computer Science PhD candidate Suvil Singh, developed a novel indoor wifi triangulation method for this project and co-led the development of a server system for collecting the information and integrating it into the app and game engine in real time.
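The details of the project's triangulation method are not given here, but a standard approach combines an RSSI path-loss model with trilateration from three access points. The Python sketch below illustrates that generic technique, not the project's actual algorithm; the calibration constants (tx_power, path-loss exponent) are hypothetical:

```python
import math

def rssi_to_distance(rssi_dbm, tx_power=-40.0, path_loss_exp=2.5):
    """Estimate distance (meters) to an access point using the
    log-distance path-loss model. tx_power is the expected RSSI at
    1 m; both constants are hypothetical calibration values."""
    return 10 ** ((tx_power - rssi_dbm) / (10 * path_loss_exp))

def trilaterate(p1, r1, p2, r2, p3, r3):
    """Solve for (x, y) from three access-point positions and their
    estimated distances by linearizing the circle equations."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting the circle equations pairwise cancels the quadratic
    # terms, leaving a 2x2 linear system in x and y.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y
```

In practice, RSSI noise indoors makes raw trilateration jittery, so a deployed system would typically smooth the estimates or fingerprint the space.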