First International Workshop on
Tools for Creating XR Media Experiences

In conjunction with IEEE ICME 2020 (http://www.2020.ieeeicme.org/)


Introduction

Extended Reality (XR), which encompasses Virtual Reality (VR), Augmented Reality (AR) and Mixed Reality (MR), creates entirely new ways for consumers to experience the world around them and interact with it. Within the last few years, improvements in sensor technology and processing power have led to tremendous advances in all aspects of XR hardware, and thanks to the economies of scale of the rapidly growing XR market, these devices are now available at a reasonable price point. On the production side, powerful low-cost systems for capturing 3D objects, volumetric video and 360° video make budget VR/AR productions possible. The same applies to the consumption side, where VR headsets such as the Oculus Go or PlayStation VR provide a highly immersive VR experience at a price affordable for everyone.

Unfortunately, the development of tools and technologies for authoring, processing and delivering interactive XR experiences lags considerably behind the hardware development, which is a significant hurdle for the cost-effective mass production of appealing XR content and scenarios. The lack of content in turn hinders broader adoption and acceptance of XR technologies by consumers. For all these aspects, new approaches and technologies are needed to overcome the specific challenges of XR content creation (multimodal data, non-linear interactive storytelling, annotation and metadata models, novel compression techniques, bandwidth requirements, etc.).

This workshop solicits original contributions on new approaches, technologies and tools for creating, processing and delivering interactive XR media (3D/CGI content, point clouds, 360° video, 3DoF+/6DoF video, volumetric video, spatial audio, …).


Topics of Interest

Topics of interest include, but are not limited to:

  • Efficient XR content acquisition and representation
  • Compression and delivery to various platforms (HMD, smartphones, SmartTV / HbbTV, Web, …)
  • Subjective and objective assessment of XR scenarios (content quality, experiences…)
  • Semantic understanding of XR content (depth estimation, semantic segmentation, object recognition, pose estimation, action recognition, audio analysis, etc.)
  • Automating the XR content authoring process (e.g. providing automatic content annotation / storytelling)
  • Authoring interactions and navigation aids (e.g., elements for moving in time and space, avatars)
  • Authoring accessible XR experiences (e.g. subtitles, audio description, audio subtitling, sign language, …)

Submission and Important Dates

Author information, submission instructions and the document templates are given at http://www.2020.ieeeicme.org/index.php/author-information-and-submission-instructions/

The link to the electronic submission system can be found at https://www.2020.ieeeicme.org/index.php/call-for-workshop-papers/

Papers must be no longer than 6 pages, including all text, figures and references, and must be submitted under the track for this workshop in the electronic submission system. The review process is single-blind, so authors must include their names and affiliations in the submitted paper.

Paper submission deadline: 13 March 2020
Notification of acceptance: 15 April 2020
Camera-ready paper due: 29 April 2020


Workshop Format and Schedule

The workshop is organized as a half-day event, split into three sessions.

The first session will be the keynote. The second session is dedicated to oral presentations of the submissions accepted as oral talks, whereas the last session will be a poster session for the submissions accepted as posters.

_Tentative_ Workshop Schedule:
09:00-09:15 – Welcome Message from the Workshop Organizers
09:15-09:45 – Keynote by Professor Aljosa Smolic (Trinity College Dublin)
09:45-10:00 – Coffee break
10:00-10:20 – Oral Presenter 1
10:20-10:40 – Oral Presenter 2
10:40-11:00 – Oral Presenter 3
11:00-11:20 – Oral Presenter 4 (or invited talk)
11:20-11:35 – Coffee Break
11:35-13:00 – Poster Session


Workshop Organizers

Hannes Fassold (JOANNEUM RESEARCH)
Dimitrios Zarpalas (Centre for Research & Technology Hellas)
Pablo Cesar (Centrum Wiskunde & Informatica and Delft University of Technology)
Mario Montagud (i2CAT & University of Valencia)


Technical Program Committee

  • Werner Bailer, JOANNEUM RESEARCH, Austria
  • Gianluca Cernigliaro, i2CAT, Spain
  • Jaume Segura, i2CAT, Spain
  • Louay Bassbouss, Fraunhofer FOKUS, Germany
  • Christian Fuhrhop, Fraunhofer FOKUS, Germany
  • Nikolaos Zioulis, Centre for Research & Technology Hellas, Greece
  • Dorothea Tsatsou, Centre for Research & Technology Hellas, Greece
  • Jie Li, Centrum Wiskunde & Informatica, Netherlands
  • Peter M. Roth, Technical University Graz, Austria
  • Alexander Grabner, Technical University Graz, Austria
  • Rene Kaiser, Know-Center, Austria

Projects

The workshop is organized as a collaboration of the H2020 projects Hyper360 (http://www.hyper360.eu/), ImAc (https://www.imac-project.eu/) and VRTogether (https://vrtogether.eu/).


Contact

Hannes Fassold
hannes.fassold(at)joanneum.at
