We built a few VR experiences and experiments for the Symposium, listed below. Since they are web-based, they should work on most headsets.
Please give feedback by clicking a person’s name below (names link to their Twitter profiles) or in person during the Symposium. Dialogue is important, but please remember these are experiments, not polished final experiences.
The Symposium will start with a session on these VR experiences, where you will be able to use one of the provided Meta Quest headsets. If you have your own headset and have done work in VR, please feel free to bring it and show your work.
‘Simple’ Mural (By Brandel) A simple and powerful introduction to VR, this shows a single mural by Bob Horn, which you can interact with using your hands: pinch to ‘hold’ the mural and move it around as you see fit. If someone says VR is just the same as a big monitor, show them this!
Basic Author Map of the Future of Text (By Brandel) Open this URL in your headset and in a desktop browser, then drag in an Author document to see a Map of all of the contributors to The Future of Text book.
Simple Linnean Library (By Frode) A rough-and-ready room made by a novice; this is something you can also do. I used Mozilla Spoke to build an experience which can be viewed in any browser, in 2D or in VR via Mozilla Hubs.
Self Editing Tool (By Fabien) In this environment you can directly manipulate text and even execute the text as code by pinching short snippets. Fabien recorded a walkthrough video:
For professional work, also check out our presenter Mez Breeze’s work: https://mezbreeze.itch.io/portraits-volume-one
These are some of the results of our work at the Future Text Lab.