Emerging Wearable Technology Signals the Next Chapter in Immersive Sport

Recent hands-on exposure to the latest Ray-Ban Meta smart glasses in the United States has reinforced a growing industry sentiment: immersive sport is entering a new phase — one defined less by large-scale virtual reality hardware and more by natural, wearable integration.

Developed by Meta in partnership with EssilorLuxottica, the Ray-Ban Meta smart glasses currently enable first-person photo and video capture, spatial audio recording, voice activation and livestreaming to select Meta-owned platforms.

Importantly, the glasses do not currently support direct streaming of third-party application content, and approval within one Meta ecosystem (such as a Meta Quest app) does not automatically extend compatibility across all Meta hardware. At the time of writing, broader international availability — including Australia — remains limited.

While the current capabilities are deliberately limited in scope, the broader implications for sport are significant.

From Broadcast to Presence

Traditional sports media has historically centred on broadcast production: multi-camera angles, commentary overlays and replay systems. Immersive sport, by contrast, shifts focus toward proximity and presence.

Wearable smart glasses demonstrate how first-person capture can reposition the viewer closer to the moment. Rather than observing an athlete’s walk through a tunnel or a pre-game interaction from a fixed lens, the perspective becomes human, eye-level and natural.

The technology does not yet replace high-end broadcast. Instead, it introduces a complementary layer: authentic, lived perspective.

The Evolution of the Immersive Ecosystem

Within this broader landscape, the IN SPORT App is already advancing immersive capabilities through structured 360° content capture. Recent recordings surrounding the Las Vegas launch have included spatially immersive podcast sessions and stadium environments designed to allow users to move through digital spaces rather than passively consume footage.

Through 360° integration, users are able to:

  • Sit inside a recorded podcast environment.

  • Stand within a stadium setting.

  • Navigate a scene interactively.

This approach represents an immediate, scalable form of immersion that does not depend on wearable hardware adoption. It establishes the foundational ecosystem required for deeper integration as device capabilities evolve globally.

A Measured Outlook on Integration

At present, there is no confirmed functionality enabling Ray-Ban Meta smart glasses to stream or display IN SPORT App content directly. However, wearable technology and immersive application development are advancing in parallel.

Industry trends indicate increasing convergence between:

  • First-person capture

  • Cloud-based distribution

  • Spatial content layering

  • Platform interoperability

As hardware ecosystems mature and developer access expands, the opportunity for deeper immersive integration across wearable devices and application platforms becomes more technically viable.

Such developments would require formal SDK compatibility, regional compliance approval and structured ecosystem alignment. No confirmed timeline has been announced for these advancements.

Why This Matters for Rugby League and Community Sport

Rugby league, in particular, has always functioned as a connective force — linking generations, communities and cultures. Immersive technology enhances that connective potential by expanding access.

For fans unable to attend international fixtures, immersive digital environments offer a sense of presence. For regional communities, 360° content provides proximity to moments that would otherwise remain distant. For ambassadors and sporting legends, immersive capture enables legacy storytelling in a lived format rather than a static interview structure.

Technology does not replace the emotion of sport.

It amplifies access to it.

The Direction of Travel

Wearable smart glasses represent one stage in a larger progression:

  1. POV capture and spatial audio

  2. Structured immersive application environments

  3. Layered digital ecosystems

  4. Cross-device immersive interoperability

The IN SPORT App’s current 360° framework positions it within that evolution, establishing immersive foundations while remaining adaptable to emerging hardware ecosystems.

As global rollout of wearable devices continues and platform integration standards mature, immersive sport is expected to move beyond passive viewing into interactive participation.

The trajectory is clear: the future of sport will not only be watched.

It will be experienced.

Disclaimer

The IN SPORT App is currently available as a Meta Quest application; however, it is not presently compatible with Ray-Ban Meta smart glasses.

While the products operate within the broader Meta ecosystem, device compatibility and cross-platform functionality are determined by separate hardware and software integration frameworks.

Any future integration between wearable smart glasses and the IN SPORT App would be subject to official platform support, technical development, regulatory approval and formal announcement. No confirmed timeline for such compatibility has been released at this time.
