
ICME 2020 XR Workshop

First International Workshop on
Tools for Creating XR Media Experiences

In conjunction with IEEE ICME 2020


Extended Reality (XR), which includes Virtual Reality (VR), Augmented Reality (AR) and Mixed Reality (MR), creates entirely new ways for consumers to experience the world around them and interact with it. Within the last few years, improvements in sensor technology and processing power have led to tremendous advances in all aspects of XR hardware, and thanks to the economies of scale of the rapidly growing XR market, these devices are now available at a reasonable price point. On the production side, powerful low-cost systems for capturing 3D objects, volumetric video and 360° video make budget VR/AR productions possible. The same applies to the consumption side, where VR headsets like the Oculus Go or PlayStation VR provide a highly immersive VR experience that is affordable for everyone.

Unfortunately, the development of tools and technologies for authoring, processing and delivering interactive XR experiences lags considerably behind the hardware development, which is a significant hurdle for the cost-effective mass production of appealing XR content and scenarios. Lack of content in turn hinders broader adoption and acceptance of XR technologies by the consumer. For all these aspects, new approaches and technologies are needed in order to overcome the specific challenges of XR content creation (multimodal data, non-linear interactive storytelling, annotation and metadata models, novel compression techniques, bandwidth requirements, etc.).

This workshop asks for original contributions on new approaches, technologies and tools for creating, processing and delivering interactive XR media (3D/CGI content/point clouds, 360° video, 3DoF+/6DoF video, volumetric video, spatial audio…).

Topics of Interest

Topics of interest include, but are not limited to:

  • Efficient XR content acquisition and representation
  • Compression and delivery to various platforms (HMD, smartphones, SmartTV / HbbTV, Web, …)
  • Subjective and objective assessment of XR scenarios (content quality, experiences…)
  • Semantic understanding of XR content (depth estimation, semantic segmentation, object recognition, pose estimation, action recognition, audio analysis, etc.)
  • Automating the XR content authoring process (e.g. providing automatic content annotation / storytelling)
  • Authoring interactions and navigation aids (e.g., elements for moving in time and space, avatars)
  • Authoring accessible XR experiences (e.g. subtitles, audio description, audio subtitling, sign language, …)

Submission and Important Dates

Author information, submission instructions and the document templates are given at

The link to the electronic submission system can be found at

Papers must be no longer than 6 pages, including all text, figures and references, and must be submitted under the track for this workshop in the electronic submission system. The review process is single-blind, so authors must include their names and affiliations in the submitted paper.

Paper submission deadline (has been extended): 27 March 2020
Notification of acceptance: 15 April 2020
Camera-ready paper due: 29 April 2020
Workshop date: 6 July 2020 (first conference day), 14:00-17:20, British Summer Time (GMT+1)

Workshop Format and Schedule

The workshop is organized in a half-day format, split into three sessions.
All sessions are fully virtual. See the instructions for uploading a video at

Workshop Schedule (Monday, 6 July)
Keynote session
Session Chair: Pablo Cesar, Centrum Wiskunde & Informatica and Delft University of Technology
14:00-14:05 – Welcome Message from the Workshop Organizers
14:05-15:00 – Keynote by Professor Aljosa Smolic (Trinity College Dublin)
Title: “Volumetric Video Content Creation for Immersive AR/VR Experiences”
15:00-15:10 – Break
Session 1
Session Chair: Mario Montagud (i2CAT & University of Valencia)
15:10-15:30 – XR360: A Toolkit for Mixed 360 and 3D Productions
15:30-15:50 – An authoring model for interactive 360 videos
15:50-16:10 – Towards Neural AR: Unsupervised Object Segmentation with 3D Scanned Model Through ReLaTIVE
16:10-16:20 – Break
Session 2
Session Chair: Antonis Karakottas (Centre for Research & Technology Hellas)
16:20-16:40 – Simplifying the process of creating augmented outdoor scenes
16:40-17:00 – Interactive 360° narrative for TV use
17:00-17:20 – Invited talk by Leen Segers (LucidWeb) on the XR4ALL project
Title: “XR4ALL – Moving the European XR tech industry forward”

Workshop Organizers

Dimitrios Zarpalas (Centre for Research & Technology Hellas)
Pablo Cesar (Centrum Wiskunde & Informatica and Delft University of Technology)
Mario Montagud (i2CAT & University of Valencia)

Technical Program Committee

  • Werner Bailer, JOANNEUM RESEARCH, Austria
  • Gianluca Cernigliaro, i2CAT, Spain
  • Louay Bassbouss, Fraunhofer FOKUS, Germany
  • Christian Fuhrhop, Fraunhofer FOKUS, Germany
  • Nikolaos Zioulis, Centre for Research & Technology Hellas, Greece
  • Dorothea Tsatsou, Centre for Research & Technology Hellas, Greece
  • Antonis Karakottas, Centre for Research & Technology Hellas, Greece
  • Jie Li, Centrum Wiskunde & Informatica, Netherlands
  • Peter M. Roth, Technical University Graz, Austria
  • Rene Kaiser, Know-Center, Austria


The workshop is organized as a collaboration of the H2020 projects Hyper360, ImAc and VRTogether.


Hannes Fassold