Research Question Prototype
This project revisits the sketch format, but with a different purpose: exploring the hardware. While sketches are explorations of ideas, this project refines that approach into an exploration of capabilities.
Create 1-3 sketches for the system, each with a unique goal. Keep a single hardware/technical question in mind, and explore it until you are satisfied you can present some insight.
Title the project with the constraint/question you are exploring, and take screenshots or record video capture.
Example Questions (Lightship)
- How does meshing work for physics?
- How can we use nav meshes? What are they?
- What type of content can we put in an AR space?
- What kinds of objects can we use Object Detection for? Can we detect things in motion? Images on screens? Pets?
- How closely aligned are shared AR spaces?
- What is the smallest image we can use for tracking based localization?
- How can I get two phones to acknowledge each other?
- What are the near and far limits of occlusion? How large and small of an object can I put in a scene, using occlusion effectively?
- What limits are there to the depth texture being sampled and used by other systems, like the visual effect graph?
- How accurate is the WPS system? How low is its latency? How often does the WPS lat/long data update?
Example Questions (Tilt5)
- How does MixCast work? What are its limitations?
- Does video playback work in the Tilt Five?
- How hard is it to track multiple wands?
- Can we use multiple wands with one headset? More than two?
- Can we align the Tilt Five with other tracking technology, like the Vive?
- Can we show unique views or secret information to different headsets?
- How small can we effectively render?
- How far away can we make the player feel?
- What sort of other real-world items could we put on the Tilt Five mat?
- How close can we put the player's face? How can we encourage head movement?
- What sorts of environments look good here?
- Also see: perception!
Submission
Before class, upload your screenshots and the project title.