Towards a unique standard for 3D Object Compression
Marius Preda (Institut Mines Telecom)
Neil Trevett (Khronos Consortium)
Don Brutzman, Anita Havele (Web3D)
Christine Perey (ARStandards)
Summary: Currently there are several formalisms for representing a 3D graphics scene graph and object graph. MPEG is the only standards organisation dealing with compression of 3D assets (scenes/objects).
AP: Establish a liaison with Web3D and Khronos (probably also with ARStandards). Check if we already have one.
Session
Graphics Tool Library
16h to 18h
m25074
The generic parser FU concept
A mechanism based on a formalism can be used to instantiate the parser and the entropy decoder. The advantage of using such a formalism is that it has more expressivity than the network configuration language, which merely connects the FUs.
AP: Set up the core experiment: extend BSDL to provide the same mechanism.
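To illustrate the idea (the description format and field names here are hypothetical, not BSDL syntax), a generic parser instantiated from a declarative bitstream description, rather than hard-coded parsing logic, might be sketched as:

```python
# Hypothetical sketch: a parser FU driven by a declarative description of
# the bitstream syntax. The description format and field names are
# illustrative only; BSDL itself is XML-based and richer than this.

SYNTAX = [
    ("version", 4),   # 4-bit version field
    ("flags", 4),     # 4-bit flags field
    ("length", 8),    # 8-bit payload length
]

def parse(data: bytes, syntax=SYNTAX) -> dict:
    """Read fixed-width fields from `data` as described by `syntax`."""
    result, bitpos = {}, 0
    for name, nbits in syntax:
        value = 0
        for _ in range(nbits):
            byte = data[bitpos // 8]
            bit = (byte >> (7 - bitpos % 8)) & 1
            value = (value << 1) | bit
            bitpos += 1
        result[name] = value
    return result
```

Changing `SYNTAX` re-instantiates the parser without touching any code, which is the expressivity advantage over a language that only wires FUs together.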
m25206
Suggestions
The bit size of the float type is not specified (we may need better precision).
AP: add the "double" type in the table of types. Specify the mathematical operations for double.
AP: add support for sqrt and log.
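The precision concern behind the "double" AP can be seen with a quick check: round-tripping a value through 32-bit storage loses information that a 64-bit double preserves.

```python
import struct

x = 0.1
# Round-trip through a 32-bit float loses precision ...
as_float32 = struct.unpack("f", struct.pack("f", x))[0]
# ... while a 64-bit double preserves the value exactly.
as_float64 = struct.unpack("d", struct.pack("d", x))[0]

print(as_float32 == x)  # False: 0.1 changes when stored in 32 bits
print(as_float64 == x)  # True
```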
1.8.2 Tuesday
Session
MPEG-V
09h to 11h
Number
Title
Source
m24258
Additional Scents for ISO/IEC 23005 - Part 6
Resolution: accepted
Markus Waltl, Benjamin Rainer, Christian Timmerer
m24259
Proposed Updates for ISO/IEC 23005-7
Resolution: accepted
Markus Waltl, Christian Timmerer
m24722
Editor's input on SoCD of 23005-1
- the virtual-to-virtual (V-to-V) adaptation is done internally by a virtual world; there is no external adaptation engine between the two virtual worlds
- AP: change "Sensory Device" to "Actuator" in all textual descriptions. Issue a study of Parts 1, 2, 3, 5 and 6 and a resolution asking NBs to consider this change during the ballot.
Proposal to modify scent effect (Part 3) Summary: proposal of the hedonicToneEffect. It is not needed because the hedonic tone will be encoded in the user preferences.
Proposes a qualification of the intensity levels (from "no odor" to "very strong"). Not needed because it can be handled by the authoring tool.
Jeong-Do Kim, Hyung-Gi Byun, Hae-Ryong Lee, Sanghyun Joo
m24776
Proposal to modify scent capability and preference types (Part 2) Summary:
Modification of the Scent Capability Type: (1) change ml/h to ppm; (2) add new attributes: maxIntensityType, maxHedonicIntensity, maxHedonicTone, secondOrderDelayTime.
Modifications of ScentPreferenceType
Resolution: existing ScentDevice Capability types can be used as they are defined.
AP: ScentPreferenceType: change UnfavorableScent to HedonicScent
Jeong-Do Kim, Hyung-Gi Byun, Hae-Ryong Lee, Sanghyun Joo
m24777
A technical background of the make-up avatar usage scenario Summary:
The requirements on realistic avatar appearance were introduced in m22615.
Spectrum of skin, cosmetics and illuminants should be combined in order to produce realistic effects.
A new type: color spectral data is needed.
Skin spectrum should go in the scene definition, illuminant spectrum in the sensor and cosmetics metadata in MPEG-V part 4.
Cosmetics are a virtual object (with no geometry, only appearance properties) that, when connected to an object in the scene, changes its appearance.
Sang-Kyun Kim, Yong Soo Joo, In-Su Jang, Jin-Seo Kim, Soon-Young Kwon
m24778
Proposition of make-up avatar schemes for ISO/IEC 23005-4 Summary: MakeupAvatarType, an extension of AvatarType adding SkinSpectrum and MakeupInfo.
Open question: should Spectrum be part of the avatar definition (that is, part of the object graph)?
The spectrum is represented as a 3D volume corresponding to the picture of the face and is defined per pixel. Question: should this be defined per vertex instead?
MakeupInfo contains a color defined as a spectrum, characteristics like form, pearl, … and some metadata.
Definition of the regions on the avatar where the makeup should be applied.
Sang-Kyun Kim, Yong Soo Joo, In-Su Jang, Jin-Seo Kim, Soon-Young Kwon
m24779
Proposition of spectrum camera sensor schemes for ISO/IEC 23005-5 Summary: this camera takes an image and associates a spectrum with each pixel.
It uses the DoubleMatrixType from MPEG-7 for encoding.
AP: investigate general camera sensors; the spectrum camera will be part of this framework (together with the camera, stereo camera, depth camera, …)
Sang-Kyun Kim, Yong Soo Joo, In-Su Jang, Jin-Seo Kim, Soon-Young Kwon
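One way to picture the per-pixel spectrum data discussed in m24779 (variable names here are illustrative): an H×W image with N wavelength samples per pixel forms a 3D cube, which can be flattened into an (H·W)×N matrix of doubles, matching the shape a DoubleMatrixType would carry.

```python
def flatten_spectral_cube(cube):
    """Flatten an H x W x N spectral cube (nested lists) into an
    (H*W) x N matrix: one row of N wavelength samples per pixel."""
    return [spectrum for row in cube for spectrum in row]

# A 2x2 image with 3 wavelength samples per pixel (dummy values).
cube = [
    [[0.1, 0.2, 0.3], [0.4, 0.5, 0.6]],
    [[0.7, 0.8, 0.9], [1.0, 1.1, 1.2]],
]
matrix = flatten_spectral_cube(cube)
print(len(matrix), len(matrix[0]))  # 4 pixels, 3 samples each
```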
m25090
Proposal to modify Scent Command type (Part 5) Summary: propose to add the duration of the command. Duration is not needed since it can be encoded by sending the "inactivate" command at a specific time stamp (equal to the duration).
Jeong-Do Kim, Hyung-Gi Byun, Hae-Ryong Lee, Chung-Hyun Ahn
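The resolution on m25090 can be sketched as follows (command and field names are hypothetical): a desired duration d is expressed without any duration attribute, as an "activate" command at t and an "inactivate" command at t + d.

```python
def scent_commands(start_ts: float, duration: float):
    """Express a timed scent effect without a duration attribute:
    activate at start_ts, inactivate at start_ts + duration.
    Command and field names are illustrative, not normative."""
    return [
        {"command": "activate", "timestamp": start_ts},
        {"command": "inactivate", "timestamp": start_ts + duration},
    ]

# A 5-second scent effect starting at t=10.
print(scent_commands(10.0, 5.0))
```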
m25091
Proposal to add Intensity Category CS with Scent (Part 6) Not needed because it is an authoring issue.
Overall recommendation: prepare a contribution on the ScentSensor.
Jeong-Do Kim, Hyung-Gi Byun, Hae-Ryong Lee, Ji-Hoon Choi
Liaison
Summary: George Percivall (OGC) presented the liaison answer from OGC related to GPS data types.
GML already contains global positioning system data (gml:Point) and includes references to the several coordinate reference systems used for such data. OGC also works on SWE (Sensor Web Enablement), a framework defining several sensor types.
AP: check if gml:Point & related fields can be directly used by MPEG-V GPS data types.
AP: study the SWE and identify compatibilities/incompatibilities with MPEG-V types for sensors.
AP: Francisco to prepare the two answer letters.
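To make the gml:Point AP concrete: a gml:Point carries a coordinate pair in gml:pos plus a reference-system attribute (srsName), and extracting the values is straightforward. The sample document below is illustrative.

```python
import xml.etree.ElementTree as ET

GML_NS = "http://www.opengis.net/gml"

# Illustrative gml:Point; srsName references WGS 84 (EPSG:4326).
doc = """<gml:Point xmlns:gml="http://www.opengis.net/gml"
             srsName="urn:ogc:def:crs:EPSG::4326">
  <gml:pos>45.188 5.724</gml:pos>
</gml:Point>"""

root = ET.fromstring(doc)
lat, lon = map(float, root.find(f"{{{GML_NS}}}pos").text.split())
print(lat, lon)  # 45.188 5.724
```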
Session
Augmented Reality
10h to 13h
Number
Title
Source
m24724
Proposal of a node for Augmentation Region Summary: definition of the augmentation region: a shape in the real image. Examples are given for broadcast AR based on user preferences.
ToDo: add the augmentedScene, include the animation of the transform, show how each actor (broadcaster, ARprovider, user) should interact with the system.
Tuesday update:
- need to have a node in the scene specifying the potential service provider for that content + a default provider
Proposal of a node for Reference Signal ToDo: add ReferenceSignalUrl, remove the fields related to the analysis tool, remove the children part.
B. S. Choi, Young Ho Jeong, Jin Woo Hong
m25211
Augmented Reality Reference Model Summary: an abstract level providing independence from the algorithms, sensors, scene description, … used to coordinate the AR community and improve communication between different SDOs.
Includes user scenarios, a glossary of terms, …
The RM is intended to receive contributions from other relevant SDOs.
Enterprise viewpoint is presented:
- actors and their role
- AR scenarios: guide, create and play.
Information viewpoint:
- describes the content, interaction, …
Computation viewpoint.
Engineering viewpoint:
Several schemes are presented.
Glossary: 40 terms that may be interconnected.
Use cases: several use cases are specified.
Resolution: ARAF to give input to the RefModel. Khronos, OGC and other contributors will look for an organization to publish.
Resolution: output a document called WD of the AR Reference Model. Check how to organize it: public, free, possibly co-published with other SDOs.
- ARLocator (BIFS) contains calibration information, location, 3D scene description, background information, foreground information, AR control information.
- Media
ARContent:
- Graphics, media, characteristics, widgets,
- Add a metadata node that will have the ARobject (Object Graph) as a child.
Resolution:
- design the metadata node (MPEG-7 SMP)
- design the calibration node
- change the connection to the camera sensor so that it can provide not only the color map but also background/foreground color maps.
- investigate how to connect widgets to the ARContent
Minsu Ahn, Seungju Han, Jae-Joon Han, Yong Beom Lee, James D.K. Kim, Yiling Xu, Sungryeul Rhyu, Kyungmo Stanley Park, Jaeyeon Song
m24485
Proposal for ARAF Metadata and IPMP
Metadata:
- create a list of needed attributes for characterizing ARContent (check W3C POI)
- check if MPEG-7 SMP covers these attributes.
Protection: encrypting the data in the access unit in GPAC (check if GPAC is able to encrypt BIFS access units). Check also whether GPAC is able to extract any type of AU.
Streaming (with DASH): a DASH segment contains an integer number of AUs.
Management:
Using REL:
1. Check if, when we have different licences for the initial scene and the augmented object, the player knows how to handle them.
Check the "grants" elements and whether they should be extended.
2. Licence node/predefined proto. AP: to propose the interface based on the analysis of point 1.
3. is supported.
Protection:
The three scenarios are supported by existing tools.
Davide Bertola (CEDEO.net), Angelo Difino (CEDEO.net), Marius Preda (INT)
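The streaming note for m24485 (a DASH segment contains an integer number of AUs) amounts to never splitting an access unit across a segment boundary. A sketch of that packing rule (sizes and the target segment size are illustrative):

```python
def segment_aus(au_sizes, target_segment_size):
    """Group access units (AUs) into DASH-like segments so that every
    segment contains a whole number of AUs: an AU is never split,
    even when a segment overflows the target size as a result."""
    segments, current, current_size = [], [], 0
    for size in au_sizes:
        if current and current_size + size > target_segment_size:
            segments.append(current)
            current, current_size = [], 0
        current.append(size)
        current_size += size
    if current:
        segments.append(current)
    return segments

# AUs of varying byte sizes, packed into ~1000-byte segments.
print(segment_aus([400, 400, 400, 300, 900], 1000))
```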
m24813
Augmented Reality use case for remote laboratories
Benjamin JAILLY, Marius Preda
m24814
Games in mixed and augmented reality - Use cases for ARAF
Richard Wetzel, Marius Preda, Christina Moreno, Veronica Scurtu (on behalf of TOTEM consortium)
m24812
ARQuiz: an ARAF game
Traian Lavric, Adrian Gabrielli, Ivica Arsov, Marius Preda
m24811
XML Schema for Augmented Reality Application Format
Traian Lavric, Adrian Gabrielli, Ivica Arsov, Marius Preda