
Amber Dell

July 2024

Unity

Meta SDK

Inviting participants to inhabit non-human forms, this VR experience explores embodied performativity through distinct animal locomotion systems and affordances.
Why is homuncular flexibility a dangerous idea? Because the more flexible the human brain turns out to be when it comes to adapting to weirdness, the weirder a ride it will be able to keep up with as technology changes in the coming decades and centuries.
-Jaron Lanier
'Amber Dell' is a VR experience that invites participants to explore embodied performativity in shared virtual spaces by inhabiting non-human forms. Upon entering, visitors select from an array of non-human body types, each with its own locomotion system and affordances. A surreal landscape serves as a playground for exploration and interaction, challenging participants to navigate it through unfamiliar movement.

The experience serves a dual purpose: as an artistic exploration of consciousness and embodiment, and as a tool for connecting with our own bodies. I have long been interested in what the VR field calls homuncular flexibility: the idea that users can learn to control bodies different from their own by changing the relationship between tracked and rendered motion. We experience a version of this effect in daily life, for example when riding a bicycle or driving a car. Inspired by authors such as James Bridle, María Puig de la Bellacasa and Jaron Lanier, my initial intuition was that this phenomenon could be useful in exploring the more-than-human world, and that the combination could produce an engaging non-linear narrative experience.
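As a concrete illustration of the tracked-versus-rendered split, here is a minimal sketch (my own hypothetical component, not the project's code) that remaps a tracked hand onto a rendered limb with a configurable gain and a mirrored axis, so the avatar's motion deliberately diverges from the body's while staying controllable:

```csharp
using UnityEngine;

// Hypothetical sketch: drive a rendered limb from a tracked hand, scaling and
// re-orienting the motion so the avatar no longer matches the body 1:1
// (the core idea behind homuncular flexibility).
public class RemappedLimb : MonoBehaviour
{
    [SerializeField] Transform trackedHand;   // e.g. an XR controller transform
    [SerializeField] Transform limbRoot;      // origin of the rendered limb
    [SerializeField] float gain = 1.5f;       // >1 exaggerates, <1 dampens motion
    [SerializeField] Vector3 axisScale = new Vector3(1f, 1f, -1f); // mirror one axis

    void LateUpdate()
    {
        // Offset of the tracked hand relative to the limb root, remapped.
        Vector3 offset = limbRoot.InverseTransformPoint(trackedHand.position);
        offset = Vector3.Scale(offset, axisScale) * gain;
        transform.position = limbRoot.TransformPoint(offset);
        transform.rotation = trackedHand.rotation;
    }
}
```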

An early prototype of the avatar creation flow, sketched in Bezi. This first prototype relied heavily on 3D models to illustrate the different rigs the player could use. After selecting a rig, the player could pick from several cosmetic options.

Initial Concept
I have worked with interactive experiences in general, and in the XR space in particular, for quite a while now, and I have implemented various avatar systems. One thing that always surprised me (especially in VR) was how similar they all were. When Stray came out, I wrote a piece about the new paradigms that were starting to appear in this regard.

This year, while having a conversation with a product designer from an XR company who was working on an avatar system, the topic came to mind again. After a brief discussion with her and other colleagues about the main aspects of an avatar creation user flow, I decided to play around with the idea of a flow for non-human avatars. I prototyped a simple version in Bezi, and that prototype became the starting point for this project.

The first Unity prototype I built. I kept using 3D models for the avatar creation, with hand-gesture recognition to swipe between them. The video shows an early version of the flight locomotion. I spent an evening with a group of colleagues gathering feedback and first impressions of the experience.

Inspired by Gorilla Tag, I built an initial prototype with a physics-based locomotion system. I showed this prototype to some people whose feedback I usually find helpful: designers, artists, developers, curators.

Here I received feedback regarding the locomotion mechanics, the aesthetic decisions and some interesting directions in which I could develop the work further:
  • I started with a low poly aesthetic, which was very well received. Users generally agreed that the focus of the experience should not be its visual style but its mechanics, although at this stage it was clear the visual quality still needed improvement.
  • The first two locomotion systems I prototyped were easy to understand, suggesting that additional challenge should come from level design rather than from the locomotion system itself.
  • Multiplayer could be a great driver for challenges and objectives: someone mentioned video games based on the Olympic Games, which are built for multiplayer and are very easy to pick up and learn.

Another instance of sharing my development with colleagues.

    Usability and self-perception
    Even though this first audience was composed mostly of young, tech-savvy designers, it became clear that the gesture interaction was quite unintuitive. At the same time, controller tracking turned out to be much more precise than hand tracking, which proved essential for an experience that relies so heavily on fast movements.

    It also became clear that users wanted to perceive themselves, so I sketched a mirror interaction where they could see their own avatar. Since this early prototype accommodated only two avatar styles, I simply added a button to swap between them.
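A minimal sketch of that swap, assuming the two avatar rigs are pre-instantiated GameObjects (the component and field names are hypothetical):

```csharp
using UnityEngine;

// Hypothetical sketch: toggle between two pre-instantiated avatar rigs
// when a button (UI or poke interactable) is pressed.
public class AvatarSwapper : MonoBehaviour
{
    [SerializeField] GameObject avatarA;
    [SerializeField] GameObject avatarB;

    // Hook this up to a UI Button's OnClick event.
    public void Swap()
    {
        bool aActive = avatarA.activeSelf;
        avatarA.SetActive(!aActive);
        avatarB.SetActive(aActive);
    }
}
```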

    It became clear to me that an avatar creation user flow would have to rely much more on 2D panel interfaces, especially if the experience was to become more complex. More on that later.
    Technical Approach
    Physics Based Locomotion
    In my opinion, a physics-based approach brings something extra to the table: it creates a sense of agency that is hard to achieve with purely scripted or animated movement. VR is also still prone to tracking glitches, and I find that letting the physics engine resolve these cases works best.
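As a rough illustration of this approach (a sketch under my own assumptions, not the project's actual implementation), a Gorilla Tag-style push can be expressed as: while a tracked hand overlaps the terrain, the hand's velocity becomes an opposing force on the player's rigidbody, so physical commitment translates directly into momentum:

```csharp
using UnityEngine;

// Hypothetical sketch of hand-driven, physics-based locomotion: while a hand
// touches the environment, push the player body opposite to the hand's motion.
[RequireComponent(typeof(Rigidbody))]
public class HandPushLocomotion : MonoBehaviour
{
    [SerializeField] Transform hand;          // tracked controller transform
    [SerializeField] float pushStrength = 60f;
    [SerializeField] float touchRadius = 0.08f;
    [SerializeField] LayerMask terrainMask;

    Rigidbody body;
    Vector3 lastHandPosition;

    void Awake()
    {
        body = GetComponent<Rigidbody>();
        lastHandPosition = hand.position;
    }

    void FixedUpdate()
    {
        Vector3 handVelocity = (hand.position - lastHandPosition) / Time.fixedDeltaTime;
        lastHandPosition = hand.position;

        // Only push while the hand overlaps the terrain.
        if (Physics.CheckSphere(hand.position, touchRadius, terrainMask))
        {
            body.AddForce(-handVelocity * pushStrength, ForceMode.Acceleration);
        }
    }
}
```

The push strength and touch radius would need tuning per avatar, since each body implies a different reach and feel.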

    The challenge in designing this kind of locomotion system is finding uncommon yet easy-to-understand movements. I wanted to balance the alien nature of the movements with human performability, demanding physical commitment from players while keeping a learning curve that allows for skill mastery.
    Avatar Animation
    A big part of this experience relies on self-perception. During initial testing it became apparent that one of the key challenges when onboarding users is getting them to construct an accurate mental model of their avatar, and of how their body movements map onto the avatar's motion.

    So when building an avatar with many bones and joints but only three control points (the head and two controllers), I needed a way to procedurally generate motion across the rest of the rig. For this I used Unity's Animation Rigging package. This helps users form an image of how they look to an outside viewer; in the game, that image is provided by mirror objects placed in the scene and in the avatar creation flow.
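A minimal sketch of that setup, assuming the rig already has Animation Rigging constraints (for example a TwoBoneIKConstraint per limb) configured in the editor, and that their target transforms simply need to follow the three tracked points each frame; the names and offsets here are illustrative, not the project's code:

```csharp
using UnityEngine;

// Hypothetical sketch: feed the three tracked points (head + two controllers)
// into the target transforms referenced by Animation Rigging constraints, so
// the rest of the rig is solved procedurally every frame.
public class TrackedRigTargets : MonoBehaviour
{
    [System.Serializable]
    public struct Mapping
    {
        public Transform tracked;          // XR head or controller transform
        public Transform constraintTarget; // target used by a rig constraint
        public Vector3 positionOffset;     // avatar-specific offset (e.g. longer limbs)
        public Vector3 rotationOffset;
    }

    [SerializeField] Mapping[] mappings;

    // Runs in Update so the targets are in place before the animation
    // and rigging evaluation of the same frame.
    void Update()
    {
        foreach (var m in mappings)
        {
            m.constraintTarget.position = m.tracked.TransformPoint(m.positionOffset);
            m.constraintTarget.rotation = m.tracked.rotation * Quaternion.Euler(m.rotationOffset);
        }
    }
}
```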

    At the same time, this cannot be the image users see when they look down at their own hands. From the first-person view, users should see their hands in the new skin but not much more, or it breaks both the immersion and the mental model of the self.
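One common way to achieve this split, sketched below under my own assumptions about layer names (not necessarily how Amber Dell does it), is to place the full third-person body on a layer that the wearer's camera culls while mirrors and other cameras keep rendering it:

```csharp
using UnityEngine;

// Hypothetical sketch: hide the avatar's full body from the wearer while
// keeping it visible in mirrors and to other players, using layers and
// the player camera's culling mask.
public class FirstPersonVisibility : MonoBehaviour
{
    [SerializeField] Camera playerCamera;
    [SerializeField] Renderer[] fullBodyRenderers;           // torso, head, tail, etc.
    [SerializeField] string hiddenLayerName = "ThirdPersonBody"; // assumed layer name

    void Start()
    {
        int hiddenLayer = LayerMask.NameToLayer(hiddenLayerName);

        // Move the body meshes onto the hidden layer...
        foreach (var r in fullBodyRenderers)
            r.gameObject.layer = hiddenLayer;

        // ...and stop the wearer's own camera from rendering that layer.
        // Mirror cameras and remote players' cameras keep it enabled.
        playerCamera.cullingMask &= ~(1 << hiddenLayer);
    }
}
```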
    Visual Style & Performance Optimization
    The visual aspect of the experience was low poly from the beginning, and this decision has been tested and validated throughout development. The obvious reason is that the focus of the experience should not be on its visuals but on its mechanics and gameplay. Low poly also allows for larger map sizes without the need for advanced hardware or complicated asset streaming.

    Low poly models also make collision behaviour legible: their physical boundaries are visible, with flat planes and well-defined normals, so they pair well with a physics-based approach. In practice this means authoring low poly assets with primitive colliders aligned to their model faces, since primitives are much cheaper than mesh colliders.

    On top of this, I implemented various optimization techniques including:

  • Occlusion Culling
  • LODs (Level of Detail systems)
  • Draw Call Batching
  • Object Pooling (a minimal sketch follows this list)
  • Lightmapping
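Of these, object pooling is the simplest to show in isolation. A generic, minimal sketch (not the project's code) that recycles instances instead of instantiating and destroying them during play:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch of a simple object pool: pre-instantiate a prefab and
// recycle instances, avoiding allocation spikes and GC hitches in VR.
public class SimplePool : MonoBehaviour
{
    [SerializeField] GameObject prefab;
    [SerializeField] int initialSize = 16;

    readonly Queue<GameObject> available = new Queue<GameObject>();

    void Awake()
    {
        for (int i = 0; i < initialSize; i++)
        {
            var instance = Instantiate(prefab, transform);
            instance.SetActive(false);
            available.Enqueue(instance);
        }
    }

    public GameObject Get(Vector3 position, Quaternion rotation)
    {
        var instance = available.Count > 0
            ? available.Dequeue()
            : Instantiate(prefab, transform); // grow if the pool runs dry
        instance.transform.SetPositionAndRotation(position, rotation);
        instance.SetActive(true);
        return instance;
    }

    public void Release(GameObject instance)
    {
        instance.SetActive(false);
        available.Enqueue(instance);
    }
}
```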

    User Research & Feedback

    I presented the same project (with slight variations) on two different headsets: a Quest 2 and a Quest Pro.

    In October 2024 this project was selected for the interactive storytelling section of the London Breeze Film Festival, and I thought the two days of showing the project were an exceptional occasion to carry out some qualitative user research.
    I established clear objectives to validate key aspects of the experience:

    1. Locomotion System Usability
  • Evaluate the intuitiveness of different movement systems
  • Assess learning curves across different user groups
  • Observe user adaptation strategies

    2. Technical Performance Analysis
  • Compare user experiences across Quest 2 and Quest Pro hardware
  • Identify performance bottlenecks during extended use
  • Evaluate impact of hardware differences on user comfort

    3. Level Design Validation
  • Observe user navigation patterns through different difficulty zones
  • Assess user responses to increasing environmental challenges
  • Evaluate the effectiveness of spatial cues and guidance

    4. Game Design Pattern Recognition
  • Analyze user reactions to familiar gaming conventions
  • Identify which established patterns translated well to VR
  • Document unexpected user behaviors

    5. UI/UX Optimization
  • Test a simplified onboarding flow
  • Evaluate user comprehension of interface elements
  • Measure time-to-first-successful-interaction

    A new approach to avatar creation
    This was the first instance where I designed an avatar creation user flow that relied on 2D panels. I tried to keep it as simple as possible and shifted the focus from body types and locomotion systems to, simply, animals. The driving question when creating an avatar would not be 'which kind of body do you want to have?' or 'how do you want to move in the world?' but rather the more direct 'which animal would you like to be?'
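A minimal sketch of how such an animal-first selection panel could be wired up; the data structure, names and example animals are my own assumptions, not the project's:

```csharp
using UnityEngine;

// Hypothetical sketch: an animal-first avatar selection model. Buttons on a
// 2D panel call Select(index); the chosen rig prefab replaces the current one.
public class AnimalSelectionPanel : MonoBehaviour
{
    [System.Serializable]
    public struct AnimalOption
    {
        public string displayName;   // e.g. "Heron", "Fox" (illustrative names)
        public GameObject rigPrefab; // avatar rig with its own locomotion system
    }

    [SerializeField] AnimalOption[] options;
    [SerializeField] Transform playerOrigin;

    GameObject currentAvatar;

    // Hook this to each button's OnClick with the matching index.
    public void Select(int index)
    {
        if (currentAvatar != null)
            Destroy(currentAvatar);

        currentAvatar = Instantiate(
            options[index].rigPrefab,
            playerOrigin.position,
            playerOrigin.rotation);
    }
}
```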

    One of the great things about showing this work in a place like Breeze was the variety of spectators I got to share the experience with.

    Research Methodology
    I implemented a multi-layered approach to gather user insights:

    1. Contextual Inquiry
  • Observed natural user movement and exploration patterns
  • Documented initial reactions and discovery moments
  • Recorded common points of confusion or delight
    2. Directed Tasks
  • Assigned specific objectives to willing participants
  • Measured completion rates and time-to-completion
  • Observed problem-solving strategies
    3. Post-Experience Interviews
  • Conducted structured interviews about user experience
  • Gathered feedback on specific design elements
  • Explored user expectations versus actual experience
    4. Hardware Comparison Testing
  • Alternated between Quest 2 and Quest Pro platforms
  • Monitored performance metrics during extended sessions
  • Documented user comfort and fatigue levels

    After two days of user testing and interviews, I gained a key insight: introducing a user to this kind of experience has to be done gradually. There have to be moments where the user can build mental models of their limbs, their size in the world and their movements; only then can they explore and make sense of the environment they are placed in.
    Key Findings and Insights

    Core User Experience
    1. Initial Onboarding
  • Users needed visual reference guides for body movement
  • Clear need for real-time movement tutorials for first steps
  • UI failed to effectively communicate required movements
    2. Learning Progression
  • Quick adoption after initial success
  • Steady learning curves observed across users
  • Progressive difficulty proved effective
  • Different areas of the map accommodated varying skill levels
    3. User Approaches When Faced with a Challenge
  • Mastery-focused: Practice to improve skills
  • Exploration-focused: Search for alternative routes
  • Level design successfully served both styles

    Technical Performance
    Hardware Comparison
  • Both Quest 2 and Quest Pro ran 6-hour sessions
  • Quest 2 required a reduction in post-processing effects to reach the target framerate (see the sketch after this list)
  • Core experience maintained across hardware
  • Stable performance during extended use
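As a sketch of how that kind of per-headset adjustment can be handled (the device-name check below is my own assumption and would need verification on hardware; the quality tiers are whatever the project defines), a quality level can be chosen at startup:

```csharp
using UnityEngine;

// Hypothetical sketch: pick a quality tier at startup depending on the headset.
// The device-name substring is an assumption, not a verified value.
public class HeadsetQualitySelector : MonoBehaviour
{
    [SerializeField] int quest2QualityLevel = 0;   // e.g. post-processing reduced
    [SerializeField] int defaultQualityLevel = 1;  // e.g. full effects (Quest Pro)

    void Awake()
    {
        bool isQuest2 = SystemInfo.deviceName.Contains("Quest 2");
        // Second argument applies expensive changes (e.g. anti-aliasing) immediately.
        QualitySettings.SetQualityLevel(
            isQuest2 ? quest2QualityLevel : defaultQualityLevel, true);
    }
}
```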



    Environmental Design
    Successes
  • Terrain variety provided appropriate challenge levels
  • Alternative paths accommodated different skills
  • Starting area effectively facilitated basic learning
  • Visual markers successfully guided exploration
    Areas for Improvement
  • Some features such as rivers acted as unintended traps
  • Users struggled to escape certain areas
  • Visual highlights lacked gameplay function
  • Need for better alignment of visual cues with interactions

    User Feedback
    Immediate Requests
  • More avatar options
  • Better self-body awareness
  • Ways to visualize own body besides hands
  • Clearer movement instructions
    Confusion Points
  • Avatar size relative to environment
  • Required body movements
  • Purpose of visual highlights
  • Navigation in challenging areas

    Design Priorities
    1. High Priority (addressed immediately)
  • Movement tutorial implementation
  • Recovery mechanics for challenging areas
  • UI improvements for avatar creation
  • Basic body-awareness features

    2. Medium Priority (in development)
  • Visual promise/reward alignment
  • Level design refinements
  • Enhanced visual feedback
  • Additional avatar options

    3. Long-Term Development (currently scoping)
  • Sophisticated onboarding system
  • Advanced self-visualization tools
  • Expanded avatar ecosystem
  • Enhanced social features

    Conclusions
    While VR is usually employed as a medium for visual narration, Amber Dell's focus is not on what happens on the screens, but on what happens in the body.

    Virtual reality's capacity to trick our senses presents immense opportunities to explore our perception and reevaluate our fundamental understanding of the human condition. Used with care, XR can be an amazing tool for unlocking positive change in our cognition, something I believe we as a species are in dire need of.

    It is for this reason that projects like Amber Dell benefit from constant exposure: the cycles of development and presentation reinforce each other and help me make sure development focuses on the features of the experience that matter.

    ©2024 Nahuel Basterretche