6 minute read 2 Nov 2022
[Image: Automotive engineer using VR software to work on an electric motor]

How will the metaverse be designed as it disrupts design?

By Rob Tannen

User Experience Psychologist, EY Design Studio

Human-centered researcher. Design essayist. Patent aficionado. FIFA gamer.


As design teams create experiences for the metaverse, they will be disrupting the very skills, tools and organization of design itself.

In brief:

  • The metaverse will significantly increase the scale and complexity of experience design, changing how design organizations operate and collaborate.
  • Creating these experiences will blend new and legacy capabilities, providing growth opportunities but placing greater demands on design management.
  • Human-centered approaches, notably service design, will be essential to understand user needs in the context of a widening range of interactions.

From tangible, to digital, to virtual and beyond

Over the last couple of decades, we have seen the maturity and integration of design practices in organizations across industries. From the creation of leadership roles, such as chief design officer, to the shift toward more creative corporate cultures, design has influenced not just what is created, but how.

During this span, many organizations shifted from creating primarily physical products to digital experiences. Whether integrating digital interfaces within existing products or creating supporting apps for emerging smartphones, it was essential to expand user experience capabilities to remain relevant, let alone competitive. 

Adapting to digital design required not only hiring and training for new skills but also a reconsideration of the design organization itself. For example, traditional product development teams consisting of industrial designers and engineers use different tools, methods and workflows than interactive digital designers. 

Early on, projects were often pragmatically viewed from the lens of either physical or digital to determine how to staff and plan them. But it became clear that this was a false dichotomy — all projects needed to be experience led, driven by user needs and enabled by the respective design capabilities. As a result, design teams became organized in service of user experience, regardless of what would be delivered.

The growing interest in, and evolving technologies of, the nascent metaverse bring a further blurring and mixing of physical and digital design and beyond. Objects and environments initially created with tangible materials can be realized virtually, while the virtual can interact with the tangible via augmented reality. Consider the following forward-looking scenarios:

  • A medical device company that primarily develops physical surgical instruments wants to provide virtual training, requiring the design of both the virtual experience and the specialized hardware to enable these interactions.
  • A retailer with traditional physical and online shopping experiences wishes to provide its customers a virtual shopping offering and an augmented reality experience for in-store patrons.
  • An education provider is striving to improve collaboration between in-person and online students and is creating a mixed AR/VR solution for both types of students to interact in real time.

Each of these examples presents a different set of design opportunities and challenges. The medical device company requires synergistic virtual and physical instruments. The retailer wants to provide comparable commerce functionality and continuity across contexts while maximizing the unique benefits of each experience’s modality. The education scenario aims to offer engaging interactions for all students without negatively impacting the respective advantages of live and remote learning formats. 

These scenarios collectively illustrate the exponential growth and complexity that the metaverse will bring to experience ecosystems, with varying combinations and interdependencies across physical, digital, virtual and mixed touch points. Consequently, organizations must rethink how their design teams operate and collaborate under these new regimes.

Design operations for the metaverse

Design operations (DesignOps) include the skills, tools, structure and governance that enable design teams to operate effectively and efficiently. DesignOps, and its adjacencies, including ResearchOps and DevOps, have matured as design organizations have scaled and become integral to business operations. But like much of business and technology, the metaverse will also disrupt design.

Skill sets for designing the metaverse

In basic terms, design skills can be framed along two dimensions: depth and breadth. A digital designer may have deep knowledge of the design discipline, including information architecture, visual design and design prototyping. While depth is about an individual's capabilities, breadth speaks to the ability to collaborate and communicate with adjacent disciplines across the design process. For instance, for a traditional industrial designer, breadth might extend to the knowledge of mechanical engineering necessary to translate designs to production.

As design teams become more responsible for virtual and augmented reality, team skills will continue to consist of both current and new specialized capabilities. For example, some designers may focus on metaverse design (depth), while others will learn VR/AR-related skills as adjacencies to their "legacy" strengths. Additionally, new roles and expertise will come into play.

Changes to research and insight methods

Above all else, design operations should be organized around delivering value to end users — a tenet that can get lost among the expanding variety of capabilities and systems. The metaverse has the potential to connect design teams with users in ways that will dramatically increase understanding and empathy; consider:

  • Utilizing remote, unobtrusive observation literally from a participant’s point of view for ethnographic or contextual research
  • Empowering customers with interactive visualization tools to generate and share their ideas
  • Using augmented reality or virtual environments for concept and prototype feedback and testing

The diversity of interactions enabled by the metaverse will significantly increase the scale and complexity of documenting and analyzing the user experience. For example, in the earlier retail scenario, a single user journey could include in-person, augmented reality, online and virtual touch points — not even considering the potential for multiple or customized versions of those channels. Service design frameworks, particularly service blueprints, provide a near-term approach to managing this complexity. They can detail not just the frontstage user activities but also the backstage systems that enable those interactions, making them a powerful tool for visualizing experiences across touch points.

Adapting or extending service blueprints might include highlighting transitions or breakpoints among physical, digital, VR and AR interactions (or combinations of those interactions). This will be critical to prioritize opportunities where potential immersive or augmented reality interactions would add significant value to users. Such opportunity touch points or “moments that matter” might be reconsidered as “moments that meta.”
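To make the idea concrete, a service blueprint extended in this way can be thought of as an ordered sequence of touch points, each tagged with its channel and backstage enabler, where the "moments that meta" surface wherever the channel changes between consecutive steps. The sketch below is purely illustrative — the step names, channels and systems are hypothetical, not part of any EY framework:

```python
# Illustrative sketch (hypothetical names): a cross-channel user journey
# modeled as a simple service blueprint, with channel transitions
# ("moments that meta") flagged wherever consecutive steps change modality.
from dataclasses import dataclass

@dataclass
class Touchpoint:
    step: str          # frontstage user activity
    channel: str       # "physical", "digital", "ar", or "vr"
    backstage: str     # backstage system enabling the interaction

def channel_transitions(journey):
    """Return (from_step, to_step) pairs where the channel changes."""
    return [
        (a.step, b.step)
        for a, b in zip(journey, journey[1:])
        if a.channel != b.channel
    ]

# Example journey loosely based on the retail scenario above
journey = [
    Touchpoint("Browse catalog online", "digital", "e-commerce platform"),
    Touchpoint("Preview item at home", "ar", "AR rendering service"),
    Touchpoint("Try item in store", "physical", "inventory system"),
    Touchpoint("Attend virtual styling session", "vr", "VR event platform"),
]

for frm, to in channel_transitions(journey):
    print(f"Transition: {frm} -> {to}")
```

Even a minimal model like this makes it easy to count and prioritize the transition points where an immersive or augmented interaction might add the most value.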

Evolving tools and design technology

Despite the growing integration of physical and digital design over the years, the supporting tool sets have mainly remained separate, with product designers typically using 3D design and engineering software and user experience (UX) designers working with dedicated UX design and prototyping tools. Similarly, virtual environment design is currently done on purpose-built platforms. As tool consolidation is unlikely, compatibility and interoperability across tools will become critical to supporting cross-team workflows. This may influence alliances, mergers and acquisitions among design system providers.

Most intriguing will be the potential for design tools within the metaverse itself. For example, imagine medical device designers "feeling" the weight and balance of surgical instruments as they create them, or virtual environment designers building a world while immersed within it. Such possibilities could transform the speed, creativity and collaborative nature of the design process in ways yet to be understood.

Summary

Adapting design operations to manage people, tools and methods will provide critical structure as the metaverse exponentially increases the scale and complexity of design practices. Human-centered research should remain at the core for the benefit of design teams and the users for whom they are creating a new world.
