|3DS|VR| Voices of Country
3D Scanning & Unreal Engine 360 Video
View on YouTube in VR
Project developed by Dr. Rochus Urban Hinkel, in collaboration with NExT Lab, MSD, The University of Melbourne (AU); Uwe Rieger, The University of Auckland (NZ); Büro Achter April (DE). Artists: Dr. Rochus Urban Hinkel (AU); in collaboration with MSD, NExT Lab, Melissa Iraheta and Tony Yu; Lia Bach and Michael Fragstein, Büro Achter April; and Dr. Hélène Frichot.
As part of the Ars Electronica: In Kepler's Garden Melbourne - Past and Future Utopia Program:
"The projects present descriptions, stories and experiences from pre-settlement, colonial and utopian post-colonial perspectives, establishing new relationships to this rural site in country Victoria, in order to enable us to rethink the site's role, its use and its current and future meaning. The projects imagine the site's connections with the past, its potential future, and open up a set of questions: Can we relate to and learn from the pre-colonial history of first peoples' relation to Country? Can we critique and reflect on the European colonization of the longest continuing culture in the world? Can we imagine a post-colonial future that overcomes the neoliberal concept of land as commodity? Could a reintroduction of an indigenous biodiversity, a return to local food production, a rethinking of community and its legal frameworks, be a utopian response to the challenges we face? Will a return to a holistic understanding of Country allow us to develop a deeper understanding of it that results in a sustainable response to climate change, develops an alternative to the logic of continuous growth, and reflects on the extinction of species and the industrialization of food production? We invite our audience to immerse themselves in Virtual Reality worlds of past and future utopias."
A standard scanning path was followed. The main obstacle was occlusion, so scans were captured from various angles to account for it.
Aligning and registering scans in natural environments is usually difficult, as there are very few distinguishing features. To assist with this, markers were placed in the scene to aid alignment and were removed from the point cloud afterwards.
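Marker-based registration of this kind amounts to estimating a rigid transform between the same markers' coordinates in two scans. As a minimal illustrative sketch (not the actual registration software used in the project), the rotation and translation can be recovered from corresponding marker points with the Kabsch algorithm; the function name and data here are hypothetical:

```python
import numpy as np

def rigid_align(src, dst):
    """Estimate rotation R and translation t such that dst ~= src @ R.T + t,
    from corresponding marker coordinates (Kabsch algorithm)."""
    # Center both marker sets on their centroids.
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    # Cross-covariance and its SVD give the optimal rotation.
    H = src_c.T @ dst_c
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in degenerate cases.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

Once the transform is known, it can be applied to one whole scan to bring it into the other scan's coordinate frame, after which the marker points themselves can be deleted from the merged cloud.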
The Z+F scanner was chosen specifically for its range: given the sheer scale of the field, it would have been impractical to scan at the ideal 15-20 m spacing.
The videos were produced in Unreal Engine using its Point Cloud and 360 Panorama plugins. Each 360 image was individually rendered in a stereoscopic format and the frames were later sequenced together, totalling nearly 70,000 frames.
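For context on the stereoscopic format: VR video players such as YouTube expect equirectangular frames with the two eye views packed into a single image, commonly stacked top-bottom. A minimal sketch of that packing step, assuming each per-eye render is available as a NumPy image array; this is illustrative only, not the Unreal plugin's internal pipeline:

```python
import numpy as np

def stack_top_bottom(left_eye, right_eye):
    """Pack two per-eye equirectangular renders into one top-bottom
    stereoscopic frame (left eye on top, right eye below)."""
    if left_eye.shape != right_eye.shape:
        raise ValueError("per-eye renders must have identical dimensions")
    return np.vstack([left_eye, right_eye])
```

The resulting per-frame images can then be sequenced into a video file with an external encoder and tagged with the appropriate spherical/stereo metadata before upload.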
More detail about Unreal Engine Point Clouds can be found here: