Special Sessions
QoMEX 2014 is inviting submissions to the following special sessions:
- Experiencing Multimedia Quality Fluctuations
- Media Synchronization and QoE
- Perceptual Graphics, Rendering, and High-dynamic Range Imaging
- Quantitative QoE Assessment: Augmenting Psychophysical Experiments with Psychophysiological Measurements
The purpose of these special sessions is to complement the regular program with new or emerging topics of particular interest to the community.
The submission deadlines and review process for papers in the special sessions are the same as for regular papers. To submit your contribution to a special session, follow the submission process for regular papers and select the session title as one of the paper’s topics.
Experiencing Multimedia Quality Fluctuations
Organizers
Raimund Schatz, FTW, Austria
Tobias Hossfeld, University of Würzburg, Germany
Topics of Interest
- Measurement studies and models on typical quality fluctuations (e.g. in wireless networks)
- Subjective assessment methods for session-level quality fluctuations
- Understanding of human perception of stimulus variations (in general and modality-specific)
- Measurement and modeling frameworks for quality variations
- Extension of objective metrics and integration of variability-related factors
- Inclusion of context-related (e.g. lighting, location) and content-related (e.g. scene changes, program changes) factors causing fluctuations
- Evaluation of fluctuation-aware network and application management approaches
- Databases featuring representative temporal variations of key variables
Motivation and Objectives
Research on Quality of Experience (QoE) of networked multimedia applications addresses the influence of user-, system- and context-related factors, with the majority of work focusing on static configurations of quality-related parameters and stimuli. Typical examples are studies aiming to quantify the influence of constant bandwidth bottlenecks, packet-loss and delay settings or of a specific encoding or content type on the perceived quality of video, voice, Web and other applications.
However, with the proliferation of mobile wireless use cases and quality management technologies (such as adaptive bitrate streaming for video), quality as experienced by end users is increasingly subject to temporal variations and fluctuations. In the wireless mobile domain, this trend towards increased dynamics is driven by variations of underlying technical performance parameters such as available bandwidth and packet loss, caused by changing cell load, network congestion or even outages. In addition, user mobility not only leads to quality fluctuations related to handovers, signal strength changes, etc., but also to shifts of contextual influence factors such as location, lighting and background noise. Even in static environments, such context-related factors may cause quality fluctuations. Analogously, content-related factors like scene changes or channel switching can manifest as quality variations.
Similarly, quality management mechanisms and technologies (such as MPEG-DASH, HAS and Smooth Streaming in the case of Internet video) do not completely eliminate the impairments they target (such as rebuffering and freezing), but rather trade them off against other ones (e.g. switches in video encoding quality). The resulting quality fluctuations are prone to become QoE impairments in their own right, because of the quality dynamics they add to the experience.
All of the above developments call for intensified efforts towards investigating the impact of quality fluctuations on QoE. This concerns not only improved subjective assessment methods (beyond existing continuous quality evaluation methods such as SSCQE), but also a better understanding of how stimulus variations affect human quality perception in general. Furthermore, improved methods are required for adequately describing or modeling quality fluctuations (on the technology as well as the perception level) and for integrating them into objective quality metrics and measurement techniques. Finally, QoE-based network and application management approaches stand to benefit particularly from these efforts, as they need to adequately address (and mitigate) temporal variations of quality-related parameters to reach their optimization goals.
Given these challenges, the main goal of this special session is to raise awareness of this topic within the QoMEX community by bundling existing efforts that already relate to it. In this context, outreach to related scientific communities (networking, speech and psychology) and the realization of synergies between the different strands of QoE research (video, image, audio, etc.) constitute important elements in setting up this special session.
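To make the idea of integrating variability-related factors into objective metrics more concrete, the following sketch shows one naive, purely illustrative way to fold quality fluctuations into a session-level score. It is not a metric proposed by the organizers; the function name, penalty weights and example values are all hypothetical, and per-interval quality values are assumed to lie on a MOS-like 1-5 scale.

    # Purely illustrative sketch: a naive fluctuation-aware session score.
    # All weights and names are hypothetical, not an endorsed QoE model.
    from statistics import mean, pstdev

    def session_quality(per_interval_quality, switch_penalty=0.3, spread_penalty=0.5):
        """Aggregate per-interval quality values (assumed MOS-like, 1-5)
        into a single session-level score that is reduced by fluctuations."""
        q = list(per_interval_quality)
        # Count quality switches between consecutive intervals.
        switches = sum(1 for a, b in zip(q, q[1:]) if a != b)
        score = (mean(q)
                 - switch_penalty * switches / max(len(q) - 1, 1)
                 - spread_penalty * pstdev(q))
        return max(1.0, min(5.0, score))  # clamp to the MOS range

    # Example: a session oscillating between two encoding quality levels.
    print(session_quality([4, 4, 2, 4, 2, 4]))

The per-switch and spread penalties are merely the simplest possible stand-ins for the perceptual cost of fluctuations discussed above.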
Media Synchronization and QoE
Organizers
Pablo Cesar, CWI, Netherlands
Christian Timmerer, Alpen-Adria-Universität, Austria
Hans Stokking, TNO, Netherlands
Fernando Boronat, Polytechnic University of Valencia, Spain
Topics of Interest
- QoE for audio-visual synchronization
- QoE for synchronization of multi-sensory experiences (e.g., sensory effects, mulsemedia)
- Intra-media synchronization modelling
- QoE for inter-destination multimedia synchronization
- QoE media synchronization testbeds
- QoE media synchronization datasets
- QoE evaluation metrics for media synchronization
- QoE evaluation best practices and standardization efforts for media synchronization
Scenarios of Interest
- Hybrid broadcast/broadband scenarios
- Multi-party videoconferencing
- 3D tele-immersion
- Distributed performing arts
Motivation and Objectives
Media synchronization has been a challenge for quite some time. Over the years, many techniques have been developed to achieve intra- and inter-media audiovisual synchronization under various network conditions. On the one hand, novel media technologies such as HTTP streaming protocols, media encoders and HDTV often require new synchronization techniques. On the other hand, new patterns in media consumption often introduce specific synchronization issues. For example, Internet applications revolving around broadcast TV content may need synchronization between the application and the broadband stream. Synchronization between different TV receivers may be needed in Social TV (inter-destination media synchronization). In some interactive TV cases, synchronization between handheld devices and the TV screen may be needed. Moreover, 3D technologies for TV broadcasting and telepresence (3D tele-immersion) may require the adoption of several of these synchronization techniques to achieve a satisfying user experience. Finally, research on multi-sensory experiences brings the promise of novel and enriched media consumption scenarios.
Fortunately for the research community, media synchronization will remain a challenge for some time to come (unless the rules of the Universe change overnight and instant delivery of packets becomes a reality). Undoubtedly, different techniques and strategies for achieving media synchronization have an impact on the Quality of Experience. This special session addresses exactly that: QoE and media synchronization in our multi-sensory, multi-device, and multi-protocol world. Its purpose is to bring together experts working in the field of media synchronization in order to discuss recent advances, with a special focus on QoE. The special session continues a series of successful workshops on Media Synchronization (MediaSync) held in Berlin in 2012 and Nantes in 2013.
Perceptual Graphics, Rendering, and High-dynamic Range Imaging
Organizers
Yuming Fang, Jiangxi University of Finance and Economics, China
Weisi Lin, Nanyang Technological University, Singapore
Irene Cheng, University of Alberta, Canada
Topics of Interest
- Psychovisual modeling for graphics processing/high-dynamic range imaging
- Perceptual quality metrics and user study techniques for 3D meshes/rendered images/high-dynamic range imaging
- New psychophysical experiments for graphics/rendered images/high-dynamic range imaging
- In-depth analysis of perceptual-based graphics processing/high-dynamic range imaging systems
- Perceptually-driven photorealistic and non-photorealistic rendering
- Perceptually-driven compression and transmission for graphics and animation
- Perceptual watermarking in graphics/rendered images
- Research related to virtual, augmented, mixed realities and games
- Model validation for graphics rendering/high-dynamic range imaging
- Applications and databases related to perceptual-based graphics processing/high-dynamic range imaging
- Content- and QoE-aware delivery for graphics and animations
Motivation and Objectives
With the advancement of computing technology, a large amount of graphics and high-dynamic range imaging (HDRI) data is deployed in various applications, such as digital entertainment, the Internet, scientific visualization, computer-aided design, virtual reality, telepresence, etc. The delivery pipeline for these data includes rendering, compression, watermarking, transmission, and so on. Since humans are the ultimate consumers of the majority of (if not all) produced graphics and HDRI data, and since the human perceptual system has, as a result of evolution, developed unique physiological and psychological mechanisms for processing and understanding visual information, it would be beneficial to incorporate visual perception-based criteria in the design and optimization of these processes. Additionally, since these processes may introduce visual artifacts and degrade the delivered quality, it is important to evaluate the artifacts introduced into the original visual content and to make the relevant decisions in graphics/HDRI algorithms and systems. Objective quality metrics that correlate well with human visual perception need to be designed to enable such quantitative analysis.
This special session therefore seeks submissions that advance knowledge and represent the latest technology developments in the area of perceptual graphics, transmission and rendering, with a particular focus on the Quality of Experience (QoE) of these processes, as well as innovative applications and system integration.
Quantitative QoE Assessment: Augmenting Psychophysical Experiments with Psychophysiological Measurements
Organizers
Ulrich Engelke, CSIRO, Australia
Hantao Liu, University of Hull, UK
Topics of Interest
- QoE assessment based on human psychophysiology, including eye gaze, EEG, EKG, EMG, GSR, etc.
- Computational QoE models based on psychophysiological measurements
- Signal processing and machine learning techniques for psychophysiology-based QoE assessment
- Experimental design for psychophysiological assessment
- Correlates of psychophysics and psychophysiology
Motivation and Objectives
Quality of Experience (QoE) assessment of multimedia applications typically relies on responses to questionnaires, either open-ended or based on psychometric scales such as n-point Likert scales. As valuable as these studies are, they are based on conscious responses by the participants and often do not provide sufficiently deep insight into the underlying perceptual and cognitive processes. In addition, psychometric scales typically assume equal perceptual distance between any two adjacent items on the scale, which is not necessarily the case. In order to gain a deeper understanding of the perceptual and cognitive processes underlying QoE, psychophysiological measurements can be performed. For instance, eye gaze tracking provides valuable information about overt attention in visual space. Electroencephalography (EEG) measurements inform about cognitive activity, such as cognitive load, situational awareness, and covert attention. Galvanic skin response (GSR) provides insight into arousal and hence emotional states. These psychophysiological responses are not intended to replace well-established psychophysical assessment techniques, but to augment them and provide additional subconscious information. This special session aims to bring together experts in the field of psychophysiological measurements in the context of QoE assessment.
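As a small, purely illustrative example of relating psychophysiological measurements to psychophysical ratings (in the spirit of the "correlates of psychophysics and psychophysiology" topic above), the sketch below computes a rank correlation between per-stimulus MOS values and a hypothetical GSR feature (peak counts). All data values and the choice of feature are invented for illustration.

    # Purely illustrative sketch: rank correlation between subjective ratings
    # and a hypothetical psychophysiological feature. All values are invented.
    from scipy.stats import spearmanr

    # Hypothetical per-stimulus data: mean opinion scores and GSR peak counts.
    mos       = [4.5, 3.8, 2.1, 1.7, 3.2, 4.0]
    gsr_peaks = [1,   2,   6,   7,   3,   2]

    rho, p_value = spearmanr(mos, gsr_peaks)
    print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")

A rank correlation is used here because, as noted above, equal perceptual distances between scale items cannot be assumed.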