Organisation internationale de normalisation




Standards from 3DGC

| Std | Pt | Edit. | A/E | Description | CfP | WD | CD | DIS | FDIS | Gr. |
| 4 | 5 | 2001 | A36 | Pattern-based 3D mesh coding RS | | | 13/11 | 14/04 | 15/06 | 3 |
| 4 | 16 | 2011 | A3 | Web3DG coding | | 14/10 | 15/02 | 15/10 | 16/01 | 3 |
| 4 | 16 | 2011 | A4 | Pattern-based 3D mesh coding | | | 13/08 | 14/07 | 15/02 | 3 |
| 4 | 25 | 2011 | A4 | 3rd Edition | | 14/10 | 15/02 | 15/10 | 16/01 | 3 |
| 4 | 27 | 2009 | A6 | Pattern-based 3D mesh coding C | | | 13/11 | 14/04 | 15/06 | 3 |
| A | 13 | 2013 | A1 | ARAF RS & C | | | 13/04 | 13/11 | 15/02 | 3 |
| A | 13 | 2013 | E2 | ARAF 2nd Edition | | | 14/10 | 15/06 | 15/10 | 3 |
| V | 1 | 201x | E3 | Architecture | | | 13/11 | 14/07 | 15/02 | 3 |
| V | 2 | 201x | E3 | Control information | | | 13/11 | 14/10 | 15/06 | 3 |
| V | 3 | 201x | E3 | Sensory information | | | 13/11 | 14/10 | 15/06 | 3 |
| V | 4 | 201x | E3 | Virtual world object characteristics | | | 13/11 | 14/10 | 15/06 | 3 |
| V | 5 | 201x | E3 | Data formats for interaction devices | | | 13/11 | 14/10 | 15/06 | 3 |
| V | 6 | 201x | E3 | Common types and tools | | | 13/11 | 14/10 | 15/06 | 3 |
| V | 7 | 201x | E3 | RS & C | | | 14/07 | 15/02 | 15/10 | 3 |

    1. Room allocation


3DGC: Room Rohan
    2. Joint meetings


During the week, 3DG had several joint meetings with Requirements, Video, Audio and Systems.


| Groups | What | Day | Start | End | Where |
| R, 3 | MIoT | Tue | 09:00 | 10:00 | 3 |
| R, 3 | Multisensory AF | Tue | 10:00 | 11:00 | 3 |
| A, 3 | Audio in AR | Wed | 14:00 | 15:00 | A |
| S, 3 | MPEG-M | Thu | 10:00 | 11:00 | Rohan |
| JPEG, 3 | ARAF & JPEG AR | Thu | 11:00 | 12:00 | Rohan |
| R, 3 | IoT | Thu | 15:30 | 16:00 | Rohan |
| All | Communication | Thu | 16:00 | 18:00 | Boston |

    3. Schedule at a glance





| Day | Session | Topic | Time |
| Mon | | MPEG Plenary | 09h00 to 13h00 |
| Mon | | Lunch | 13h00 to 14h00 |
| Mon | Session | Agenda/status/preparation of the week | 14h40 to 15h00 |
| Mon | Session | 3DG Plenary, review of 3DG contributions (m35134, m35196, m35197, m35198, m35349) | 15h00 to 18h00 |
| Tue | JSession with Req. | MIoT (m34963, m34964, m35211) | 09h00 to 10h00 |
| Tue | JSession with Req. | Multi-sensorial application format (m34965, m34975) | 10h00 to 11h00 |
| Tue | Session | MPEG-V (all other MPEG-V contributions) | 11h00 to 13h00 |
| Tue | | Lunch | 13h00 to 14h00 |
| Tue | JSession | ARAF | 14h00 to 18h00 |
| Wed | | MPEG Plenary | 09h00 to 11h00 |
| Wed | Session | 3DG compression – Web3D coding | 11h00 to 12h00 |
| Wed | | Lunch | 13h00 to 14h00 |
| Wed | JSession | Audio in AR | 14h00 to 15h00 |
| Wed | Session | ARAF (& MPEG-V sensors for ARAF) | 15h00 to 17h00 |
| Wed | Session | PC compression | 17h00 to 18h00 |
| Thu | Session | MPEG-V 3rd Ed. revision | 09h00 to 10h00 |
| Thu | JSession with Systems | MPEG-M for MPEG-V engine(s) | 10h00 to 11h00 |
| Thu | JSession with JPEG | JPEG AR and ARAF | 11h00 to 12h00 |
| Thu | Session | IoT Exploration editing | 12h00 to 13h00 |
| Thu | | Lunch | 13h00 to 14h00 |
| Thu | Session | 3DG output documents | 14h00 to 15h30 |
| Thu | JSession | IoT Exploration | 15h30 to 16h00 |
| Thu | Session | AR output documents (BoG) | 16h00 to 18h00 |
| Thu | JSession | Communication | 16h00 to 18h00 |
| Fri | Session | 3DG Plenary, 3DG output documents review, liaisons, resolutions | 09h00 to 13h00 |
| Fri | | MPEG Plenary | 14h00 to xxh00 |
  1. AhG reports

    1. MPEG-V



http://wg11.sc29.org/doc_end_user/documents/110_Strasbourg/ahg_presentations/MPEG-VAhGreport.ppt-1414164659-MPEG-VAhGreport.ppt
    2. Augmented Reality


http://wg11.sc29.org/doc_end_user/documents/110_Strasbourg/ahg_presentations/ARAhGreport.ppt-1414164644-ARAhGreport.ppt
    3. 3D Graphics compression

http://wg11.sc29.org/doc_end_user/ahg_presentations.php?id_meeting=162


    4. 3DG activities are reported in the Wednesday and Friday plenaries


Wednesday: http://wg11.sc29.org/doc_end_user/documents/110_Strasbourg/presentations/MPEG3DGraphicsMercredi.ppt-1413966087-MPEG3DGraphicsMercredi.ppt
Friday:

http://wg11.sc29.org/doc_end_user/documents/110_Strasbourg/presentations/MPEG3DGraphicsVendredi.ppt-1414161510-MPEG3DGraphicsVendredi.ppt
  1. Analysis of contributions




3D Graphics Compression

m35197

Mesh to FAMC converter

The OpenCTM project was updated to include the FAMC encoder.

AP: commit the updated project to the MPEG SVN, in the AFX Exploration directory.


Rufael Mekuria

m35198

Experimental datasets of 3D Point Clouds captured with Kinect v2

A collection of point cloud data is contributed. The data is not of very good quality, but it is an initial contribution.

AP: put the data set in the MPEG Content Repository.

AP: copy content from Francisco web server to the MPEG Content Repository




Zhongyi Xu, Qianni Zhang, Ebroul Izquierdo, Rufael Mekuria, Pablo Cesar

m35196

3DTI Mesh Coding Architecture and Basic implementation

The top-level architecture for point clouds and meshes is proposed.

The analysis is along two axes:

- point cloud (static and dynamic): use PCL as a starting point and improve it;

- mesh with changing connectivity.

AP: check whether the PC&CvM C (Point Cloud and Connectivity-varying Mesh Compression) software can be provided to MPEG.

AP: collect a set of static and/or time-varying point clouds.

AP: improve the PCL-based PCC.




Rufael Mekuria, Pablo Cesar




Updates on Web3D Coding

AP: investigate where the glTF parser interprets .bin and replace this code with the .s3d decoder.

AP: Build a C encoder and decoder for TFAN by replacing the arithmetic encoding with gzip.

AP: Build a JS decoder implementation of the current TFAN by only using gzip.
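Assuming a symbol-stream representation similar to TFAN's connectivity symbols, the gzip-for-arithmetic-coding substitution named in the action points can be sketched in a few lines. This is a hypothetical illustration, not the reference software; a JS decoder would use a zlib/pako equivalent of `gzip.decompress`:

```python
import gzip

def encode_symbols(symbols):
    # One byte per connectivity symbol; gzip replaces the arithmetic coder.
    return gzip.compress(bytes(symbols))

def decode_symbols(blob):
    # Recover the original symbol stream.
    return list(gzip.decompress(blob))

stream = [0, 1, 1, 2, 0, 0, 1, 2, 2, 2] * 50  # dummy, highly repetitive symbols
blob = encode_symbols(stream)
assert decode_symbols(blob) == stream
assert len(blob) < len(stream)  # gzip exploits the redundancy
```

The trade-off is the one implied by the APs: gzip is slightly less efficient than a dedicated arithmetic coder but is available natively in every browser stack.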



Christian Tulvan, Marius Preda



ARAF










m35222

ARAF: Remote Audio Recognition

The use case described is that of second-screen applications: the microphone captures the audio and the ARAF browser sends PCM audio data to a recognition server. The server has three types of answer: recognizing the audio against a server-side database, computing an approximation of the timestamp within the current audio, and optionally sending links to augmentation content.


AP: add the microphone sensor in MPEG-V, specifying its capabilities and data format
AP: update the schema and the corresponding text to reflect the discussion
AP: continue the PROTO implementation

Traian Lavric, Marius Preda



















MPEG-V




m34963

Brief summary on media-centric-IoT (Internet of Things) with MPEG-V
A set of definitions is provided: devices, entities, resources, intelligence, data related to humans.

Sang-Kyun Kim, Jaejoon Han, Seungju Hun, Seoung-Jun Oh




m34964

A usage scenario for media-centric IoT services
The focus is on media-centric IoT.

Scenario 1: multiple cameras capturing the world. Several types of data are needed: descriptions of the things captured in the video or images, descriptions of the video, and other sensor data.


Two kinds of scenarios:

1) centralized intelligence and distributed sensors/actuators;

2) distributed intelligence in various devices (sensors/actuators/devices).
Lightweight video coding located in the camera. New video coders should be easy to implement in sensors. A set of parameters to control the sensor network.
Make a concrete use case around car traffic or natural disasters and specify what can be reused from MPEG-7, MPEG-V …



Sang-Kyun Kim, Jaejoon Han, Seungju Hun, Kyoungro Yoon, Jaegon Kim, Seoung-Jun Oh




m35211

IoT basic terminology definitions
A set of definitions from the IETF is presented.


Md. Jalil Piran, Doug Young Suh




Synthesis

Low complexity codec (an existing profile, but to be put into an informative annex)
API for initiating the set of images to be recognized (à la CDVS) and for initiating with a set of conditions coming from the MPEG-7 descriptors (video and audio)








m34965

SEM document fragmentation using Fragment Request Unit (FRU)
Report on SEM fragmentation results using Fragment Request Unit (MPEG-B Part 2).
Resolution: use MPEG-21 XML Streaming Instruction instead.
AP: initiate Multi-sensory application format.

Sang-Kyun Kim, Jaejoon Han, Seungju Han, Jungyup Oh

m34966

Report on reference SW for Makeup Skin Model in MPEG-4 Part 16
BifsEncoder was updated to include the set of nodes related to makeup. Work in progress for the BifsDecoder and the 3D player.

Jin-Seo Kim, In-Su Jang, Soon-Young Kwon, Yoon-Seok Choi, Minwoo Kim, Jungyup Oh, Yong Soo Joo, Sang-Kyun Kim

m34972

Proposal on the IndexedMaterialSet for the 3D printing
Definition based on IndexedRegionSet, adding PrintingMaterials per region.

A reference software implementation is needed that is able to parse this node and print the object with multiple materials.



Seung Wook Lee, Chang Jun Park, Jin Sung Choi,

m34974

Proposal of SEP Engine APIs
A set of APIs is proposed related to SE manipulation, SE retrieval, SE adaptation and SE actuation. For the moment, the sensed information (coming from sensors) is not considered.
Thursday: the SE Engine APIs are proposed to be reformatted to comply with the MXM API (MPEG-M Part 2).
Thursday, with Systems: services and aggregation of services. Look forward to MPEG-V usage as a service, in a manner similar to MPEG-M Part 4.

Jae-Kwan Yun, Yoonmee Doh, Hyun-Woo Oh, Jae-Doo Huh, Jong-Hyun Jang, Sang-Kyun Kim

m34975

Function prototype of SEM manipulation and retrieval
Details for the functions parameters are presented.

Jae-Kwan Yun, Yoonmee Doh, Hyun-Woo Oh, Jae-Doo Huh, Jong-Hyun Jang, Sang-Kyun Kim

m34976

Web interfaces for SEPengine
This is a proprietary implementation. It was presented as an example.

Jae-Kwan Yun, Yoonmee Doh, Hyun-Woo Oh, Jae-Doo Huh, Jong-Hyun Jang, Sang-Kyun Kim

m35209

Proposal of Binary Representation of 3D Printer User Preference Description
Adds the binary representation for ThreeDPrintingPreferenceType and the material-type CS, and likewise for the MaterialCharacteristicsType CS.
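As an illustration of what such a binary representation achieves (the CS terms and bit width below are hypothetical, not the ones proposed in the contribution), a classification-scheme term can be replaced by a fixed-width index:

```python
# Hypothetical material terms; the normative terms live in the MPEG-V schemas.
MATERIAL_CS = ["ABS", "PLA", "Nylon", "Resin"]

def encode_term(term, cs):
    # Encode a CS term as its index, on ceil(log2(len(cs))) bits.
    bits = max(1, (len(cs) - 1).bit_length())
    return format(cs.index(term), f"0{bits}b")

def decode_term(code, cs):
    # Map the fixed-width index back to the CS term.
    return cs[int(code, 2)]

assert encode_term("Nylon", MATERIAL_CS) == "10"   # 2 bits instead of a string
assert decode_term("10", MATERIAL_CS) == "Nylon"
```

The gain is the usual one for binary representations: a URN-length string collapses to a couple of bits.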

Seungwook Lee, Jinsung Choi, Kyoungro Yoon, Min-Uk Kim, HyoChul Bae

m35210

Proposal of Binary Representation of 3D Printer Capability Description
Binary representations for ThreeDPrintingCapabilityType, 3DPrinterFileFormatCS and ThreeDPrinterTypeCS.


Seungwook Lee, Jinsung Choi, Kyoungro Yoon, Min-Uk Kim, HyoChul Bae,

m35234

3D mesh file format used in the 3D printer industry
Identifies several 3D mesh file formats, thereby validating the file formats used in the CS.

Seung Wook Lee, Chang Jun Park, Jin Sung Choi

m35237

Proposal of usage scenarios about virtual panoramic vision for MPEG-V
Use case related to automobile sensors and visualisation devices. Some additional sensors should be defined.

Saim Shin, Jong-Seol James Lee, DalWon Jang, Kyoungro Yoon

m35238

Proposal of binary representations for automobile-related sensor capability descriptions
The syntax for the binary representations of automobile-related sensor capabilities is proposed for Part 2.

Kyoungro Yoon, Min-Uk Kim, HyoChul Bae, Jong-Seol James Lee

m35265

Interaction-driven Sensory Effect with MPEG-V
Definition of a collection of effects that are used multiple times. When using an effect from the collection, it should be possible to overwrite some attributes.
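The override mechanism discussed can be sketched as follows (effect and attribute names are illustrative, not taken from the SEM schema):

```python
# Hypothetical effect library; a real one would be declared in the SEM document.
EFFECT_LIBRARY = {
    "breeze": {"type": "Wind", "intensity": 30, "duration": 2000},
}

def instantiate(effect_id, **overrides):
    # Use a stored effect, overwriting selected attributes at the point of use.
    effect = dict(EFFECT_LIBRARY[effect_id])  # copy the library definition
    effect.update(overrides)                  # per-use attribute overrides
    return effect

strong_breeze = instantiate("breeze", intensity=80)
assert strong_breeze == {"type": "Wind", "intensity": 80, "duration": 2000}
```

The library entry itself is left untouched, so the same effect can be reused with different overrides each time.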

Hyunjin Yoon, Cheol-Min Kim, Jong-Hyun Jang

m35268

Proposal of device command type for 3D Printer
A basic syntax for the 3D printer device command, including the choice of material, the object to print, and parameters related to the printing process (thickness, holes, …)
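A minimal sketch of such a command structure (the field names are illustrative approximations of the elements discussed, not the proposed syntax):

```python
from dataclasses import dataclass

# Illustrative only: fields approximate the elements discussed in m35268
# (material choice, object to print, process parameters).
@dataclass
class PrintCommand:
    material: str            # term from the printing-material CS
    object_uri: str          # reference to the mesh to print
    layer_thickness_mm: float
    fill_holes: bool         # whether the printer should close small holes

cmd = PrintCommand("PLA", "models/vase.s3d", 0.2, True)
assert cmd.material == "PLA"
```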

Seung Wook Lee, Chang Jun Park, Jinsung Choi, Kyoungro Yoon, Min-Uk Kim, HyoChul Bae

Output document review

Microphone sensor
Use the same mechanism as CameraSensorType (Part 5)

Missing parts
    1. Joint meeting with 3DV

No joint meeting was organized at this meeting.


    2. Joint meeting with Systems on MSE AF

Multisensorial Media Application Format was presented.

Resolution: accepted.

    3. Joint meeting with Requirements on IoT

There is a similarity between the definition of IoT provided by several sources and the main aim of MPEG-V.


A use case of Remote Home Control is presented. Sensor capabilities and sensors are covered, but actuators are not: MPEG-V only covers those related to sensory effects, while the Remote Home Control example involves more.
Resolution 1: We need a plan for how to get additional companies involved. We should identify application fields that desperately need standardization.

Resolution 2: Establish an AhG in Requirements on IoT. Issues on Internet of Things (draft)


The Exploration on Media-centric IoT document was produced. It details the standardization needs based on the use case of a large, distributed system of cameras.
    4. Joint meeting with 3DA

Traian Lavric, Telecom SudParis, gave a presentation that explained the relevant use case for discussion during this joint meeting.

The use case is for second screen applications with runtime delivery of interactive content.

• The main screen plays out an audio program.

• The second screen has a microphone that receives the main screen audio program and computes a fingerprint from the audio.

• The fingerprint is sent to a server, which identifies the main screen program (via a database search of stored watermarks). The second screen also sends the local timestamp associated with the captured audio.

• The server sends the second screen the relevant content (e.g. interactive content), along with the second-screen local timestamp at which it should be presented.
Juergen Herre, FhG-IIS/IAL, noted that the MPEG-7 Audio Signature descriptor and the Enhanced Audio Signature have been demonstrated to implement robust audio program detection. MPEG-4 audio synchronization technology has been demonstrated to achieve robust timeline detection once the program is known.

Use case requirements are:

• Identify microphone sensor attributes

• Program identification from within a large number of possible programs

• Timestamp within the identified program
Microphone data buffer parameters

• Sampling rate (enumerated list)

• Bits per word (16, 24 or 32)

• Two’s complement representation (no other choice)

• Big-endian or Little-endian
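Decoding a buffer described by these parameters can be sketched as follows (an illustrative helper, not part of any MPEG specification; the 24-bit case needs manual sign extension because `struct` has no 3-byte format code):

```python
import struct

def decode_pcm(buffer, bits_per_word=16, big_endian=False):
    """Interpret a two's-complement PCM byte buffer (16, 24 or 32 bits per word)."""
    if bits_per_word == 24:
        # Widen each 3-byte word to 4 bytes, sign-extending the MSB.
        samples = []
        for i in range(0, len(buffer), 3):
            word = buffer[i:i + 3]
            msb = word[0] if big_endian else word[-1]
            pad = b"\x00" if msb < 0x80 else b"\xff"
            word = pad + word if big_endian else word + pad
            samples.append(struct.unpack(">i" if big_endian else "<i", word)[0])
        return samples
    code = {16: "h", 32: "i"}[bits_per_word]
    count = len(buffer) // (bits_per_word // 8)
    return list(struct.unpack((">" if big_endian else "<") + str(count) + code, buffer))

# Two little-endian 16-bit words: -1 and 256
assert decode_pcm(b"\xff\xff\x00\x01") == [-1, 256]
```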
MPEG-7 tool: Enhanced Audio Signature, in the Low Level Audio Descriptors, for program identification.
A Matlab implementation of the Audio Signature is available.

MPEG-4 Audio synchronization: AMD 5 (fingerprints encapsulated in MPEG-4 Systems).


    5. Joint Session with JPEG on JPEGAR

A set of XML types is defined supporting communication between image capture and image processing components: openSession, query, response and closeSession.
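For illustration, one of these messages could be assembled as below (element and attribute names are placeholders; the normative names are defined by the JPEG AR specification, not reproduced here):

```python
import xml.etree.ElementTree as ET

def make_query(session_id, image_ref):
    # Build a hypothetical "query" message carrying a captured-image reference.
    query = ET.Element("query", attrib={"session": session_id})
    ET.SubElement(query, "image").text = image_ref
    return ET.tostring(query, encoding="unicode")

msg = make_query("s42", "capture-001.jpg")
assert msg == '<query session="s42"><image>capture-001.jpg</image></query>'
```

A transcoder between ARAF and JPEG AR, as in the AP below the ARAF discussion, would produce and parse messages of this shape on the ARAF browser's behalf.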


ARAF main concepts were also introduced. ARAF also uses APIs, but at a lower level (as a set of parameters).
The JPEGAR scenario is supported in ARAF by the ImageRecognitionServer PROTO.
AP: start an experiment showing how a JPEGAR server can be used by the ARAF browser. A transcoding step is required to format/interpret the ARAF requests/updates into the XML form specified by JPEGAR.
    6. Session 3DG Plenary

Review of all the output documents. Details are provided in the documents listed in section 6.


  1. General issues

    1. General discussion

      1. Reference Software


It is recalled that the source code of both decoder AND encoder should be provided as part of the Reference Software for all technologies to be adopted in MPEG standards. Moreover, failing to provide the complete software for a published technology shall lead to the removal of the corresponding technical specification from the standard.

Currently, all the AFX tools published in the third edition are supported by both encoder and decoder implementations.


      1. Web site


The new web site is available at http://wg11.sc29.org/3dgraphics/. It is a blog-based web site and all members are allowed to post.
  1. General 3DG related activities

    1. Promotions

      1. Web Site


        Title

        Status of http://wg11.sc29.org/3dgraphics/

        Authors




        Summary

        The web site is currently empty; contributions are needed.

        Resolution

        Action Point: Transfer the old web-site data to the new one.
    2. Press Release

Announcement of the CD for ARAF:


The 2nd Edition of MPEG’s Augmented Reality Application Format (ARAF) has reached CD status at the 110th MPEG meeting. The 1st Edition of ARAF was published as ISO/IEC 23000-13:2014 in May 2014, and the publication of this new 2nd Edition is foreseen at the end of 2015. ARAF enables augmentation of the real world with synthetic media objects by combining multiple existing standards within a single specific application format addressing certain industry needs. In particular, ARAF comprises three components referred to as scene, sensor/actuator, and media. The scene component is represented using a subset of MPEG-4 Part 11 (BIFS), the sensor/actuator component is defined within MPEG-V, and the media component may comprise various types of compressed (multi-)media assets using different modalities and codecs. The target applications include geo-location based services, image-based object detection and tracking, mixed and augmented reality games, and real/virtual interactive scenarios.
    3. Liaison

Answer to the liaison statement to ITU-T SG 9 related to ARAF.


  1. Resolutions from 3DG




    1. Resolutions related to MPEG-4

      1. Part 16 – Animation Framework eXtension (AFX)

        1. The 3DG subgroup recommends approval of the following documents


          No.

          Title

          TBP

          Available




          14496-16 – Animation Framework eXtension (AFX)







          15001

          Core Experiments Description for 3DG

          N

          14/10/24

          15002

          TuC for 3DG: Indexed Printing Region Set

          N

          14/10/24
      2. Part 25 – 3D Graphics Compression Model

        1. The 3DG subgroup recommends approval of the following documents


          No.

          Title

          TBP

          Available




          14496-25 – 3D Graphics Compression Model







          15003

          WD for 3rd Edition of 3D Graphics Compression Model (Web3D graphics coding support)

          Y

          14/11/01
    2. Resolutions related to MPEG-A

      1. Part 1 – General

        1. The Requirements and 3DG subgroups recommend approval of the following documents


          No.

          Title

          TBP

          Available




          23000-1 – General







          15004

          WD 2.0 of Multiple Sensorial Media Application Format

          N

          14/10/24
      2. Part 13 – Augmented Reality Application Format

        1. The 3DG subgroup recommends approval of the following documents


          No.

          Title

          TBP

          Available




          23000-13 – Augmented Reality Application Format







          15017

          Text of ISO/IEC CD 23000-13 2nd Edition ARAF

          Y

          14/11/15
      3. Part 14 – Mixed and Augmented Reality Reference Model

        1. The 3DG group recommends removing Part 14 of ISO/IEC 23000 because its content is now handled in ISO/IEC 18039.

    3. Resolutions related to MPEG-M

      1. Part 2 – MXM API

        1. The 3DG subgroup recommends approval of the following documents


          No.

          Title

          TBP

          Available




          23006-2 – MXM API







          xxx

          This resolution was handled by Systems

          N



    4. Resolutions related to MPEG-V

      1. General

        1. The 3DG subgroup recommends approval of the following documents


          No.

          Title

          TBP

          Available




          23005 – General







          15005

          Technology under consideration




          14/10/24
      2. Part 2 – Control Information

        1. The 3DG subgroup recommends approval of the following documents


          No.

          Title

          TBP

          Available




          23005-2 – Control Information







          15006

          Text of ISO/IEC DIS 23005-2 3rd Edition Control Information

          N

          14/11/15
      3. Part 3 – Sensory Information

        1. The 3DG subgroup recommends approval of the following documents


          No.

          Title

          TBP

          Available




          23005-3 – Sensory Information







          15007

          Text of ISO/IEC DIS 23005-3 3rd Edition Sensory Information

          N

          14/11/15
      4. Part 4 – Virtual World Object Characteristics

        1. The 3DG subgroup recommends approval of the following documents


          No.

          Title

          TBP

          Available




          23005-4 – Virtual World Object Characteristics







          15008

          Text of ISO/IEC DIS 23005-4 3rd Edition Virtual World Object Characteristics

          N

          14/11/15
      5. Part 5 – Data Formats for Interaction Devices

        1. The 3DG subgroup recommends approval of the following documents


          No.

          Title

          TBP

          Available




          23005-5 – Data Formats for Interaction Devices







          15009

          Text of ISO/IEC DIS 23005-5 3rd Edition Data Formats for Interaction Devices

          N

          14/11/15
      6. Part 6 – Common types and tools

        1. The 3DG subgroup recommends approval of the following documents


          No.

          Title

          TBP

          Available




          23005-6 – Common types and tools







          15010

          Text of ISO/IEC DIS 23005-6 3rd Edition Common types and tools

          N

          14/11/15
    5. Exploration

      1. Exploration on Media-centric Internet of Things (draft)

Resolution handled by Requirements.


    6. Management

      1. Liaisons

        1. 3DG recommends approval of the following document





No.

Title

TBP

Available




Liaisons







15011

Liaison Statement to ITU-T SG 9 on AR

N

14/10/24
  1. Establishment of 3DG Ad-Hoc Groups




15012

AHG on AR

Mandates

  1. Edit the ARAF and MAR RM documents

  2. Identify additional needs for ARAF

  3. Identify the MPEG technologies that shall be part of ARAF

  4. Continue actions to implement the collaborative work-plan for MAR RM with SC24

  5. Investigate the usage of 3DV, 3DA, CDVA and UD in Augmented Reality

Chairmen

Marius Preda (Institut Mines Telecom), Traian Lavric (Institut Mines Telecom), B.S. Choi (ETRI)

Duration

Until 111th MPEG Meeting

Reflector

mpeg-3dgc AT gti.ssr.upm.es

Subscribe

To subscribe, use https://mx.gti.ssr.upm.es/mailman/listinfo/mpeg-3dgc

Meeting

During weekend prior to MPEG

Room size 20

Date February 15, 09h00-13h00

Before weekend prior to MPEG

Place /Date/Logistics






15013

AHG on MPEG-V

Mandates

  1. Edit the MPEG-V documents

  2. To further develop tools for MPEG-V and discuss related contributions.

  3. Collect material for reference software and conformance of MPEG-V 3rd Edition

Chairmen

Kyoungro Yoon (Konkuk University), Marius Preda (Institut Mines Telecom)

Duration

Until 111th MPEG Meeting

Reflector

metaverse@lists.uni-klu.ac.at

Subscribe

http://lists.uni-klu.ac.at/mailman/listinfo/metaverse

Meeting

During weekend prior to MPEG

Room size 20

Date February 15, 14h00-16h00

Before weekend prior to MPEG

Place /Date/Logistics







15014

AHG on Graphics compression

Mandates

  1. To refine use cases and requirements of 3D tele-immersive coding

  2. To identify compression technologies for point clouds and meshes with time varying connectivity

  3. To prepare test data for point clouds and meshes with time varying connectivity

  4. To progress the specification and continue implementation of Web3D coding

  5. Collect contributions and maintain the utility software

Chairmen

Lazar Bivolarski (LZ), Rufael Mekuria (CWI), Christian Tulvan (Institut Mines Telecom)

Duration

Until 111th MPEG Meeting

Reflector

mpeg-3dgc AT gti.ssr.upm.es

Subscribe

To subscribe, use https://mx.gti.ssr.upm.es/mailman/listinfo/mpeg-3dgc

Meeting

During weekend prior to MPEG

Room size 20

Date February 15, 16h00-18h00

Before weekend prior to MPEG

Place /Date/Logistics






  1. Closing of the Meeting


See you in Geneva.



