<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>CERTH &#8211; VRTogether</title>
	<atom:link href="https://vrtogether.eu/author/certh/feed/" rel="self" type="application/rss+xml" />
	<link>https://vrtogether.eu</link>
	<description>An end-to-end system for the production and delivery of photorealistic and social virtual reality experiences</description>
	<lastBuildDate>Mon, 21 Dec 2020 18:01:32 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	

<image>
	<url>https://vrtogether.eu/wp-content/uploads/2018/04/cropped-iconavr-32x32.png</url>
	<title>CERTH &#8211; VRTogether</title>
	<link>https://vrtogether.eu</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>The final release of the Volumetric Video Capturing system of  CERTH and VRTogether is here!</title>
		<link>https://vrtogether.eu/2020/12/21/the-final-release-of-the-volumetric-video-capturing-system-of-certh-and-vrtogether-is-here/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=the-final-release-of-the-volumetric-video-capturing-system-of-certh-and-vrtogether-is-here</link>
		
		<dc:creator><![CDATA[CERTH]]></dc:creator>
		<pubDate>Mon, 21 Dec 2020 18:01:32 +0000</pubDate>
				<category><![CDATA[News]]></category>
		<guid isPermaLink="false">https://vrtogether.eu/?p=3237</guid>

					<description><![CDATA[<p>The post <a rel="nofollow" href="https://vrtogether.eu/2020/12/21/the-final-release-of-the-volumetric-video-capturing-system-of-certh-and-vrtogether-is-here/">The final release of the Volumetric Video Capturing system of  CERTH and VRTogether is here!</a> appeared first on <a rel="nofollow" href="https://vrtogether.eu">VRTogether</a>.</p>
]]></description>
										<content:encoded><![CDATA[<div class="vc_row wpb_row vc_row-fluid attachment-full size-full wp-post-image"><div class="wpb_column vc_column_container vc_col-sm-12"><div class="vc_column-inner"><div class="wpb_wrapper"><div class="vc_row wpb_row vc_inner vc_row-fluid"><div class="wpb_column vc_column_container vc_col-sm-12"><div class="vc_column-inner"><div class="wpb_wrapper">
	<div class="wpb_text_column wpb_content_element " >
		<div class="wpb_wrapper">
			<p>The Volumetric Capturing (VolCap) system is a toolset designed to orchestrate the capturing, streaming and recording of the data acquired from a multi-sensor infrastructure.</p>
<p>A number of processing units each manage and collect data from a single sensor using a headless application called Eye.</p>
<p>A set of sensors is orchestrated by a centralized UI application, VolCap, which is also the delivery point of the connected sensor streams.</p>
<p>VolCap allows multi-stream synchronization and data-driven, globally optimized volumetric alignment (calibration), so that the data and metadata needed for a 3D mesh reconstruction can be captured, encoded and streamed. It also offers extensive real-time parameterization of the RGBD data from the user interface, covering image resolutions, compression parameters, sensor presets and so on.</p>
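<p>The kind of per-sensor parameterization described above can be pictured as a small configuration structure. The following is a hypothetical illustration only — the field names are invented for the sketch and do not reflect VolCap's actual configuration format:</p>

```python
# Hypothetical illustration only: these field names are invented and do not
# reflect VolCap's real configuration format. The sketch just mirrors the
# kinds of per-sensor parameters the UI exposes.
capture_config = {
    "color_resolution": (1280, 720),  # RGB stream resolution
    "depth_preset": "NFOV_UNBINNED",  # sensor preset (a K4A-style mode name)
    "jpeg_quality": 85,               # compression parameter for color frames
    "fps": 30,                        # capture rate shared by all streams
}

def validate(cfg: dict) -> bool:
    """Basic sanity checks before starting a capture session."""
    w, h = cfg["color_resolution"]
    return w > 0 and h > 0 and 1 <= cfg["jpeg_quality"] <= 100 and cfg["fps"] > 0
```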
<p>The technical offerings of the new version include:</p>
<ul>
<li>Efficient, scalable and low-resource multi-stream live sensor data acquisition and recording</li>
<li>Integration (and mixing) of Kinect 4 Azure and Intel RealSense 2.0 D415 devices</li>
<li>Combined hardware (device-specific) and software (IEEE 1588 PTP) multi-stream synchronization</li>
<li>Data-driven and globally optimized volumetric alignment</li>
</ul>
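<p>The software side of the synchronization listed above is based on IEEE 1588 (PTP). As a rough sketch of the idea — not VolCap's code — the standard Sync/Delay_Req timestamp exchange yields the slave clock's offset from the master and the mean path delay:</p>

```python
# Sketch of the IEEE 1588 (PTP) offset/delay computation that underlies
# software multi-stream synchronization. The timestamps follow the standard
# Sync / Delay_Req exchange; the example values below are illustrative.

def ptp_offset_and_delay(t1: float, t2: float, t3: float, t4: float):
    """t1: master sends Sync, t2: slave receives it,
    t3: slave sends Delay_Req, t4: master receives it (seconds)."""
    offset = ((t2 - t1) - (t4 - t3)) / 2.0  # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2.0   # one-way mean path delay
    return offset, delay

# Example: slave clock runs 5 ms ahead of the master, symmetric 2 ms path.
offset, delay = ptp_offset_and_delay(t1=0.000, t2=0.007, t3=0.010, t4=0.007)
```

<p>With the offset known, each capture node can translate its local timestamps into the master's timebase, so frames from different sensors can be matched in time.</p>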
<p>The volumetric capture software has been used in various activities such as <strong>Live tele-presence in Augmented VR </strong>or <strong>Mixed/Augmented Reality </strong>settings<strong>, Performance Capture, Free Viewpoint Video (FVV), Immersive Applications (i.e. events and/or gaming) </strong>and <strong>Motion Capture.</strong></p>
<p>In CERTH premises, we had the chance to test the final software release and simultaneously capture high-quality human user representations with 4 Microsoft Kinect4Azure (K4A) and 4 Intel RealSense D415 sensors. The 4D reconstruction shows promising results that can increase the sense of immersion in any application.</p>
<p>The new release comes with a new, organized documentation page available at <a href="https://vcl3d.github.io/VolumetricCapture" target="_blank" rel="noopener noreferrer">https://vcl3d.github.io/VolumetricCapture</a></p>

		</div>
	</div>
</div></div></div></div>
	<div class="wpb_video_widget wpb_content_element vc_clearfix   vc_video-aspect-ratio-169 vc_video-el-width-100 vc_video-align-left" >
		<div class="wpb_wrapper">
			
			<div class="wpb_video_wrapper"><iframe title="Testing the final release of the Volumetric Video Capturing system of  CERTH and VRTogether" width="1170" height="658" src="https://www.youtube.com/embed/zUeE_a4pSWQ?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe></div>
		</div>
	</div>
</div></div></div></div><div class="vc_row wpb_row vc_row-fluid"><div class="wpb_column vc_column_container vc_col-sm-12"><div class="vc_column-inner"><div class="wpb_wrapper"><div class="vc_row wpb_row vc_inner vc_row-fluid"><div class="wpb_column vc_column_container vc_col-sm-12"><div class="vc_column-inner"><div class="wpb_wrapper"><div class="vc_separator wpb_content_element vc_separator_align_center vc_sep_width_100 vc_sep_pos_align_center vc_separator_no_text vc_sep_color_grey" ><span class="vc_sep_holder vc_sep_holder_l"><span  class="vc_sep_line"></span></span><span class="vc_sep_holder vc_sep_holder_r"><span  class="vc_sep_line"></span></span>
</div></div></div></div></div><div class="vc_empty_space"   style="height: 20px"><span class="vc_empty_space_inner"></span></div>
	<div class="wpb_text_column wpb_content_element " >
		<div class="wpb_wrapper">
			<p>Author: CERTH</p>
<p>Come and follow us in this VR journey with i2CAT, CWI, TNO, CERTH, Artanim, Viaccess-Orca, TheMo and Motion Spell.</p>

		</div>
	</div>
</div></div></div></div><section class="vc_section"><div class="vc_row wpb_row vc_row-fluid vc_column-gap-10"><div class="wpb_column vc_column_container vc_col-sm-12"><div class="vc_column-inner"><div class="wpb_wrapper"><div class="vc_row wpb_row vc_inner vc_row-fluid vc_custom_1547799200116 vc_row-has-fill vc_row-o-content-middle vc_row-flex"><div class="wpb_column vc_column_container vc_col-sm-2"><div class="vc_column-inner"><div class="wpb_wrapper">
	<div  class="wpb_single_image wpb_content_element vc_align_center">
		
		<figure class="wpb_wrapper vc_figure">
			<div class="vc_single_image-wrapper   vc_box_border_grey"><img width="75" height="50" src="https://vrtogether.eu/wp-content/uploads/2019/01/EU_flag_75px.png" class="vc_single_image-img attachment-thumbnail" alt="" loading="lazy" /></div>
		</figure>
	</div>
</div></div></div><div class="wpb_column vc_column_container vc_col-sm-8"><div class="vc_column-inner"><div class="wpb_wrapper">
	<div class="wpb_text_column wpb_content_element " >
		<div class="wpb_wrapper">
			<p>This project has been funded by the European Commission as part of the H2020 program, under the grant agreement 762111.</p>

		</div>
	</div>
</div></div></div><div class="wpb_column vc_column_container vc_col-sm-2"><div class="vc_column-inner"><div class="wpb_wrapper"></div></div></div></div></div></div></div></div></section>
<p>The post <a rel="nofollow" href="https://vrtogether.eu/2020/12/21/the-final-release-of-the-volumetric-video-capturing-system-of-certh-and-vrtogether-is-here/">The final release of the Volumetric Video Capturing system of  CERTH and VRTogether is here!</a> appeared first on <a rel="nofollow" href="https://vrtogether.eu">VRTogether</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Pre-Pilot Technology Test in CERTH premises</title>
		<link>https://vrtogether.eu/2020/12/11/pre-pilot-technology-test-in-certh-premises/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=pre-pilot-technology-test-in-certh-premises</link>
		
		<dc:creator><![CDATA[CERTH]]></dc:creator>
		<pubDate>Fri, 11 Dec 2020 11:42:04 +0000</pubDate>
				<category><![CDATA[News]]></category>
		<guid isPermaLink="false">https://vrtogether.eu/?p=3231</guid>

					<description><![CDATA[<p>The post <a rel="nofollow" href="https://vrtogether.eu/2020/12/11/pre-pilot-technology-test-in-certh-premises/">Pre-Pilot Technology Test in CERTH premises</a> appeared first on <a rel="nofollow" href="https://vrtogether.eu">VRTogether</a>.</p>
]]></description>
										<content:encoded><![CDATA[<div class="vc_row wpb_row vc_row-fluid attachment-full size-full wp-post-image"><div class="wpb_column vc_column_container vc_col-sm-12"><div class="vc_column-inner"><div class="wpb_wrapper"><div class="vc_row wpb_row vc_inner vc_row-fluid"><div class="wpb_column vc_column_container vc_col-sm-12"><div class="vc_column-inner"><div class="wpb_wrapper">
	<div class="wpb_text_column wpb_content_element " >
		<div class="wpb_wrapper">
<p>Pilots are checkpoints for evaluating the creative and technical challenges of the project. They aim to assess the performance of the technological developments, to validate and refine the defined evaluation methodology, and to assess whether the technology and the created scenarios and content can provide truly realistic and interactive social VR experiences.</p>
<p>CERTH successfully completed 2 full 2-user sessions with the TVM v3 (mesh-based volumetric video) representation, with 4 users in total. In Pilot 3, the experience attempts to increase the sense of immersion and togetherness. While supporting a maximum of five users with a live representation and a variety of representation options (3D avatar, 2D video, single- and multi-camera point clouds, TVMs), users also have the ability to join a session as spectators or even with no representation at all. The Pilot 3 <strong>VRTogether </strong>scenario resumes and concludes the same storyline, while taking place in a highly detailed virtual apartment. It also includes three 3D avatars which interact with the users through voice instructions for specific actions using the Oculus Rift controllers and a simple form of dialogue with yes/no questions.</p>
<p>In CERTH premises, we set up two capturing nodes with four Kinect4Azure depth sensors each. Following the Pilot protocol and all the hygiene protection rules designed by our consortium partner, CWI, we were able to conduct two 2-user sessions using the Pilot 3 VR scenario of <strong>VRTogether</strong>’s platform. Both users were represented as TVMs and were able to successfully complete the scene’s interactions through the Oculus Rift’s controllers and its microphone.</p>
<p>Unfortunately, we were not able to proceed with a larger number of participants due to the local COVID-19 restrictions. Participants filled in questionnaires before and after the sessions, as instructed by the protocol, while all metrics integrated in <strong>VRTOGETHER</strong>’s platform were captured.</p>

		</div>
	</div>
</div></div></div></div></div></div></div></div><div class="vc_row wpb_row vc_row-fluid"><div class="wpb_column vc_column_container vc_col-sm-12"><div class="vc_column-inner"><div class="wpb_wrapper"><div class="vc_row wpb_row vc_inner vc_row-fluid"><div class="wpb_column vc_column_container vc_col-sm-12"><div class="vc_column-inner"><div class="wpb_wrapper"><div class="vc_separator wpb_content_element vc_separator_align_center vc_sep_width_100 vc_sep_pos_align_center vc_separator_no_text vc_sep_color_grey" ><span class="vc_sep_holder vc_sep_holder_l"><span  class="vc_sep_line"></span></span><span class="vc_sep_holder vc_sep_holder_r"><span  class="vc_sep_line"></span></span>
</div></div></div></div></div><div class="vc_empty_space"   style="height: 20px"><span class="vc_empty_space_inner"></span></div>
	<div class="wpb_text_column wpb_content_element " >
		<div class="wpb_wrapper">
			<p>Author: <a href="https://vrtogether.eu/consortium/certh/">CERTH</a></p>
<p>Come and follow us in this VR journey with <a href="https://vrtogether.eu/consortium/i2cat/">i2CAT</a>, <a href="https://vrtogether.eu/consortium/cwi/">CWI</a>, <a href="https://vrtogether.eu/consortium/tno/">TNO</a>, <a href="https://vrtogether.eu/consortium/certh/">CERTH</a>, <a href="https://vrtogether.eu/consortium/artanim/">Artanim</a>, <a href="https://vrtogether.eu/consortium/viaccess-orca/">Viaccess-Orca</a>, <a href="https://vrtogether.eu/consortium/the_mo/">TheMo</a> and <a href="https://vrtogether.eu/consortium/motion-spell/">Motion Spell</a>.</p>

		</div>
	</div>
</div></div></div></div><section class="vc_section"><div class="vc_row wpb_row vc_row-fluid vc_column-gap-10"><div class="wpb_column vc_column_container vc_col-sm-12"><div class="vc_column-inner"><div class="wpb_wrapper"><div class="vc_row wpb_row vc_inner vc_row-fluid vc_custom_1547799200116 vc_row-has-fill vc_row-o-content-middle vc_row-flex"><div class="wpb_column vc_column_container vc_col-sm-2"><div class="vc_column-inner"><div class="wpb_wrapper">
	<div  class="wpb_single_image wpb_content_element vc_align_center">
		
		<figure class="wpb_wrapper vc_figure">
			<div class="vc_single_image-wrapper   vc_box_border_grey"><img width="75" height="50" src="https://vrtogether.eu/wp-content/uploads/2019/01/EU_flag_75px.png" class="vc_single_image-img attachment-thumbnail" alt="" loading="lazy" /></div>
		</figure>
	</div>
</div></div></div><div class="wpb_column vc_column_container vc_col-sm-8"><div class="vc_column-inner"><div class="wpb_wrapper">
	<div class="wpb_text_column wpb_content_element " >
		<div class="wpb_wrapper">
			<p>This project has been funded by the European Commission as part of the H2020 program, under the grant agreement 762111.</p>

		</div>
	</div>
</div></div></div><div class="wpb_column vc_column_container vc_col-sm-2"><div class="vc_column-inner"><div class="wpb_wrapper"></div></div></div></div></div></div></div></div></section>
<p>The post <a rel="nofollow" href="https://vrtogether.eu/2020/12/11/pre-pilot-technology-test-in-certh-premises/">Pre-Pilot Technology Test in CERTH premises</a> appeared first on <a rel="nofollow" href="https://vrtogether.eu">VRTogether</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>HUMAN4D: Human-Centric Multimodal Dataset for Motions and Immersive Media created by VRTOGETHER’s consortium members CERTH, CWI, Artanim</title>
		<link>https://vrtogether.eu/2020/11/06/human4d-human-centric-multimodal-dataset-for-motions-and-immersive-media-created-by-vrtogethers-consortium-members-certh-cwi-artanim/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=human4d-human-centric-multimodal-dataset-for-motions-and-immersive-media-created-by-vrtogethers-consortium-members-certh-cwi-artanim</link>
		
		<dc:creator><![CDATA[CERTH]]></dc:creator>
		<pubDate>Fri, 06 Nov 2020 14:19:22 +0000</pubDate>
				<category><![CDATA[Events]]></category>
		<category><![CDATA[News]]></category>
		<guid isPermaLink="false">https://vrtogether.eu/?p=3099</guid>

					<description><![CDATA[<p>The post <a rel="nofollow" href="https://vrtogether.eu/2020/11/06/human4d-human-centric-multimodal-dataset-for-motions-and-immersive-media-created-by-vrtogethers-consortium-members-certh-cwi-artanim/">HUMAN4D: Human-Centric Multimodal Dataset for Motions and Immersive Media created by VRTOGETHER’s consortium members CERTH, CWI, Artanim</a> appeared first on <a rel="nofollow" href="https://vrtogether.eu">VRTogether</a>.</p>
]]></description>
										<content:encoded><![CDATA[<div class="vc_row wpb_row vc_row-fluid"><div class="wpb_column vc_column_container vc_col-sm-12"><div class="vc_column-inner"><div class="wpb_wrapper"><div class="vc_row wpb_row vc_inner vc_row-fluid"><div class="wpb_column vc_column_container vc_col-sm-8"><div class="vc_column-inner"><div class="wpb_wrapper">
	<div class="wpb_text_column wpb_content_element " >
		<div class="wpb_wrapper">
			<p>HUMAN4D is a new multimodal human-centric 4D dataset containing a large corpus with more than 50K samples. By capturing 2 female and 2 male professional actors performing various full-body movements and expressions, HUMAN4D provides a diverse set of motions and poses encountered as part of single- and multi-person daily, physical and social activities (jumping, dancing, etc.), along with multi-RGBD (mRGBD), volumetric and audio data.</p>
<p>Despite the existence of multi-view color datasets captured with the use of hardware (HW) synchronization, HUMAN4D is the first and only public resource that provides volumetric depth maps with high synchronization precision due to the use of intra- and inter-sensor HW-SYNC.</p>
<p>VRTogether consortium members <a href="https://vrtogether.eu/consortium/certh/"><strong>CERTH</strong></a>, <a href="https://vrtogether.eu/consortium/cwi/"><strong>CWI</strong></a> and <a href="https://vrtogether.eu/consortium/artanim/"><strong>Artanim</strong></a> made all the data (<a href="http://dx.doi.org/10.21227/xjzb-4y45">http://dx.doi.org/10.21227/xjzb-4y45</a>) and code (<a href="https://github.com/tofis/human4d_dataset">https://github.com/tofis/human4d_dataset</a>)  available online, including the respective synchronization, calibration and camera parameters, along with data loaders and other processing, visualization and evaluation tools, for academic use and further research.</p>
<p>The involved consortium members commit to continuously maintaining the dataset for the community by adding new tools, baselines and captures. While the dataset will be continuously maintained, its benchmarking subsets will remain constant to allow the assessment and comparison of new state-of-the-art methods on the same data.</p>
<p>HUMAN4D and its associated tools will stimulate further research in computer vision and data-driven approaches, enabling research on human pose estimation and real-time volumetric video reconstruction and compression with the use of consumer-grade RGBD sensors.</p>
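<p>To give a feel for what the HW-SYNCed multi-view data enables, here is a minimal sketch — not part of the HUMAN4D toolchain, and with invented sensor names — of how frames from multiple RGBD streams can be grouped into temporally aligned sets: for each frame of a reference sensor, pick the temporally closest frame from every other sensor.</p>

```python
# Minimal sketch (not the HUMAN4D tools): group multi-view RGBD streams into
# aligned frame sets by nearest capture timestamp. Sensor ids and timestamps
# below are invented for illustration.

def group_frames(streams: dict, ref: str) -> list:
    """streams maps sensor id -> sorted list of capture timestamps (seconds).
    Returns one dict per reference frame, mapping each sensor to the
    timestamp of its temporally closest frame."""
    groups = []
    for t_ref in streams[ref]:
        group = {ref: t_ref}
        for sensor, stamps in streams.items():
            if sensor == ref:
                continue
            group[sensor] = min(stamps, key=lambda t: abs(t - t_ref))
        groups.append(group)
    return groups

streams = {"k4a_0": [0.000, 0.033, 0.066],
           "k4a_1": [0.001, 0.034, 0.067]}
aligned = group_frames(streams, ref="k4a_0")
```

<p>With hardware synchronization, the residual timestamp differences within each group are bounded by the sensors' trigger offsets, which is what makes the depth maps usable for volumetric reconstruction.</p>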

		</div>
	</div>
</div></div></div><div class="wpb_column vc_column_container vc_col-sm-4"><div class="vc_column-inner"><div class="wpb_wrapper">
	<div  class="wpb_single_image wpb_content_element vc_align_left">
		
		<figure class="wpb_wrapper vc_figure">
			<a href="https://vrtogether.eu/wp-content/uploads/2020/11/Fig1.jpg" target="_self" class="vc_single_image-wrapper   vc_box_border_grey"><img width="550" height="443" src="https://vrtogether.eu/wp-content/uploads/2020/11/Fig1.jpg" class="vc_single_image-img attachment-full" alt="" loading="lazy" srcset="https://vrtogether.eu/wp-content/uploads/2020/11/Fig1.jpg 550w, https://vrtogether.eu/wp-content/uploads/2020/11/Fig1-300x242.jpg 300w, https://vrtogether.eu/wp-content/uploads/2020/11/Fig1-410x330.jpg 410w, https://vrtogether.eu/wp-content/uploads/2020/11/Fig1-100x81.jpg 100w, https://vrtogether.eu/wp-content/uploads/2020/11/Fig1-275x222.jpg 275w" sizes="(max-width: 550px) 100vw, 550px" /></a><figcaption class="vc_figure-caption">Using a custom photogrammetry rig with 96 cameras, photos were taken of the actor (left) and reconstructed into a 3D textured mesh using Agisoft Metashape (right)</figcaption>
		</figure>
	</div>
</div></div></div></div></div></div></div></div><div class="vc_row wpb_row vc_row-fluid"><div class="wpb_column vc_column_container vc_col-sm-12"><div class="vc_column-inner"><div class="wpb_wrapper">
	<div  class="wpb_single_image wpb_content_element vc_align_left">
		
		<figure class="wpb_wrapper vc_figure">
			<a href="https://vrtogether.eu/wp-content/uploads/2020/11/Fig-2-1024x460.png" target="_self" class="vc_single_image-wrapper   vc_box_border_grey"><img width="1041" height="468" src="https://vrtogether.eu/wp-content/uploads/2020/11/Fig-2.png" class="vc_single_image-img attachment-full" alt="" loading="lazy" srcset="https://vrtogether.eu/wp-content/uploads/2020/11/Fig-2.png 1041w, https://vrtogether.eu/wp-content/uploads/2020/11/Fig-2-300x135.png 300w, https://vrtogether.eu/wp-content/uploads/2020/11/Fig-2-1024x460.png 1024w, https://vrtogether.eu/wp-content/uploads/2020/11/Fig-2-768x345.png 768w, https://vrtogether.eu/wp-content/uploads/2020/11/Fig-2-700x315.png 700w, https://vrtogether.eu/wp-content/uploads/2020/11/Fig-2-410x184.png 410w, https://vrtogether.eu/wp-content/uploads/2020/11/Fig-2-100x45.png 100w, https://vrtogether.eu/wp-content/uploads/2020/11/Fig-2-275x124.png 275w" sizes="(max-width: 1041px) 100vw, 1041px" /></a><figcaption class="vc_figure-caption">HW-SYNCed multi-view RGBD samples (4 RGBD frames each) from ‘‘stretching_n_talking’’ (top) and ‘‘basketball_dribbling’’ (bottom) activities. The depth maps are colorized using TURBO colormap</figcaption>
		</figure>
	</div>
</div></div></div></div><div class="vc_row wpb_row vc_row-fluid"><div class="wpb_column vc_column_container vc_col-sm-12"><div class="vc_column-inner"><div class="wpb_wrapper"><div class="vc_row wpb_row vc_inner vc_row-fluid"><div class="wpb_column vc_column_container vc_col-sm-12"><div class="vc_column-inner"><div class="wpb_wrapper"><div class="vc_separator wpb_content_element vc_separator_align_center vc_sep_width_100 vc_sep_pos_align_center vc_separator_no_text vc_sep_color_grey" ><span class="vc_sep_holder vc_sep_holder_l"><span  class="vc_sep_line"></span></span><span class="vc_sep_holder vc_sep_holder_r"><span  class="vc_sep_line"></span></span>
</div></div></div></div></div><div class="vc_empty_space"   style="height: 20px"><span class="vc_empty_space_inner"></span></div>
	<div class="wpb_text_column wpb_content_element " >
		<div class="wpb_wrapper">
			<p>Come and follow us in this VR journey with <a href="https://vrtogether.eu/consortium/i2cat/">i2CAT</a>, <a href="https://vrtogether.eu/consortium/cwi/">CWI</a>, <a href="https://vrtogether.eu/consortium/tno/">TNO</a>, <a href="https://vrtogether.eu/consortium/certh/">CERTH</a>, <a href="https://vrtogether.eu/consortium/artanim/">Artanim</a>, <a href="https://vrtogether.eu/consortium/viaccess-orca/">Viaccess-Orca</a>, <a href="https://vrtogether.eu/consortium/the_mo/">TheMo</a> and <a href="https://vrtogether.eu/consortium/motion-spell/">Motion Spell</a>.</p>

		</div>
	</div>
</div></div></div></div><section class="vc_section"><div class="vc_row wpb_row vc_row-fluid vc_column-gap-10"><div class="wpb_column vc_column_container vc_col-sm-12"><div class="vc_column-inner"><div class="wpb_wrapper"><div class="vc_row wpb_row vc_inner vc_row-fluid vc_custom_1547799200116 vc_row-has-fill vc_row-o-content-middle vc_row-flex"><div class="wpb_column vc_column_container vc_col-sm-2"><div class="vc_column-inner"><div class="wpb_wrapper">
	<div  class="wpb_single_image wpb_content_element vc_align_center">
		
		<figure class="wpb_wrapper vc_figure">
			<div class="vc_single_image-wrapper   vc_box_border_grey"><img width="75" height="50" src="https://vrtogether.eu/wp-content/uploads/2019/01/EU_flag_75px.png" class="vc_single_image-img attachment-thumbnail" alt="" loading="lazy" /></div>
		</figure>
	</div>
</div></div></div><div class="wpb_column vc_column_container vc_col-sm-8"><div class="vc_column-inner"><div class="wpb_wrapper">
	<div class="wpb_text_column wpb_content_element " >
		<div class="wpb_wrapper">
			<p>This project has been funded by the European Commission as part of the H2020 program, under the grant agreement 762111.</p>

		</div>
	</div>
</div></div></div><div class="wpb_column vc_column_container vc_col-sm-2"><div class="vc_column-inner"><div class="wpb_wrapper"></div></div></div></div></div></div></div></div></section>
<p>The post <a rel="nofollow" href="https://vrtogether.eu/2020/11/06/human4d-human-centric-multimodal-dataset-for-motions-and-immersive-media-created-by-vrtogethers-consortium-members-certh-cwi-artanim/">HUMAN4D: Human-Centric Multimodal Dataset for Motions and Immersive Media created by VRTOGETHER’s consortium members CERTH, CWI, Artanim</a> appeared first on <a rel="nofollow" href="https://vrtogether.eu">VRTogether</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>New Kinect4Azure-based volumetric capture system for top quality user representation</title>
		<link>https://vrtogether.eu/2020/04/30/new-kinect4azure-based-volumetric-capture-system-for-top-quality-user-representation/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=new-kinect4azure-based-volumetric-capture-system-for-top-quality-user-representation</link>
		
		<dc:creator><![CDATA[CERTH]]></dc:creator>
		<pubDate>Thu, 30 Apr 2020 14:04:23 +0000</pubDate>
				<category><![CDATA[News]]></category>
		<guid isPermaLink="false">https://vrtogether.eu/?p=2075</guid>

					<description><![CDATA[<p>The post <a rel="nofollow" href="https://vrtogether.eu/2020/04/30/new-kinect4azure-based-volumetric-capture-system-for-top-quality-user-representation/">New Kinect4Azure-based volumetric capture system for top quality user representation</a> appeared first on <a rel="nofollow" href="https://vrtogether.eu">VRTogether</a>.</p>
]]></description>
										<content:encoded><![CDATA[<div class="vc_row wpb_row vc_row-fluid"><div class="wpb_column vc_column_container vc_col-sm-12"><div class="vc_column-inner"><div class="wpb_wrapper">
	<div class="wpb_text_column wpb_content_element " >
		<div class="wpb_wrapper">
			<p><span style="font-weight: 400;">Pilot 3 of VRTogether is closer and closer to its release! Volumetric Capture (publicly available on <a href="https://github.com/VCL3D" target="_blank" rel="noopener">GitHub</a></span><span style="font-weight: 400;">) has been released, with more functionalities coming soon. Higher-quality human user representations based on the ToF technology of the Kinect4Azure (K4A) are now available, leading to a top-quality, fully immersive experience.</span></p>
<p><a href="https://vrtogether.eu/wp-content/uploads/2020/04/volcap_k4a.png"><img loading="lazy" class="wp-image-2078 aligncenter" src="https://vrtogether.eu/wp-content/uploads/2020/04/volcap_k4a.png" alt="" width="860" height="564" srcset="https://vrtogether.eu/wp-content/uploads/2020/04/volcap_k4a.png 1341w, https://vrtogether.eu/wp-content/uploads/2020/04/volcap_k4a-300x197.png 300w, https://vrtogether.eu/wp-content/uploads/2020/04/volcap_k4a-768x503.png 768w, https://vrtogether.eu/wp-content/uploads/2020/04/volcap_k4a-1024x671.png 1024w, https://vrtogether.eu/wp-content/uploads/2020/04/volcap_k4a-700x459.png 700w, https://vrtogether.eu/wp-content/uploads/2020/04/volcap_k4a-410x269.png 410w, https://vrtogether.eu/wp-content/uploads/2020/04/volcap_k4a-100x66.png 100w, https://vrtogether.eu/wp-content/uploads/2020/04/volcap_k4a-275x180.png 275w" sizes="(max-width: 860px) 100vw, 860px" /></a></p>
<h6 style="text-align: center;">Point Cloud capture and visualization</h6>
<p><span style="font-weight: 400;">VRTogether was excited to hear about Microsoft’s new depth sensor Kinect4Azure (K4A). This sensor has now been successfully integrated into VRTogether by CERTH. The new volumetric capture will offer high-quality representations, which is critical for immersing each user of VRTogether in a realistic virtual experience. Fine details, such as facial expressions, will boost the emotional engagement of each user, significantly contributing to VRTogether’s objective of providing an interactive experience for multiple remote users inside the VR space.</span></p>
<p><a href="https://vrtogether.eu/wp-content/uploads/2020/04/fig1.png"><img loading="lazy" class="wp-image-2077 aligncenter" src="https://vrtogether.eu/wp-content/uploads/2020/04/fig1.png" alt="" width="260" height="413" srcset="https://vrtogether.eu/wp-content/uploads/2020/04/fig1.png 422w, https://vrtogether.eu/wp-content/uploads/2020/04/fig1-189x300.png 189w, https://vrtogether.eu/wp-content/uploads/2020/04/fig1-410x652.png 410w, https://vrtogether.eu/wp-content/uploads/2020/04/fig1-100x159.png 100w, https://vrtogether.eu/wp-content/uploads/2020/04/fig1-275x437.png 275w" sizes="(max-width: 260px) 100vw, 260px" /></a></p>
<h6 style="text-align: center;">3D Geometry of Time Varying Mesh</h6>
<p><span style="font-weight: 400;">We will now let the software speak for itself and invite everyone to download and test our new release here: </span><a href="https://github.com/VCL3D" target="_blank" rel="noopener"><span style="font-weight: 400;">https://github.com/VCL3D</span></a><span style="font-weight: 400;">  </span></p>

		</div>
	</div>
</div></div></div></div><div class="vc_row wpb_row vc_row-fluid"><div class="wpb_column vc_column_container vc_col-sm-12"><div class="vc_column-inner"><div class="wpb_wrapper"><div class="vc_row wpb_row vc_inner vc_row-fluid"><div class="wpb_column vc_column_container vc_col-sm-12"><div class="vc_column-inner"><div class="wpb_wrapper"><div class="vc_separator wpb_content_element vc_separator_align_center vc_sep_width_100 vc_sep_pos_align_center vc_separator_no_text vc_sep_color_grey" ><span class="vc_sep_holder vc_sep_holder_l"><span  class="vc_sep_line"></span></span><span class="vc_sep_holder vc_sep_holder_r"><span  class="vc_sep_line"></span></span>
</div></div></div></div></div>
	<div class="wpb_text_column wpb_content_element " >
		<div class="wpb_wrapper">
			<p><i>Author: <a href="https://vrtogether.eu/consortium/certh/" target="_blank" rel="noopener">CERTH</a></i></p>
<p>Come and follow us in this VR journey with <a href="http://www.i2cat.net/en" target="_blank" rel="noopener">i2CAT</a>, <a href="https://www.cwi.nl/" target="_blank" rel="noopener">CWI</a>, <a href="https://www.tno.nl/en/" target="_blank" rel="noopener">TNO</a>, <a href="https://www.certh.gr/root.en.aspx" target="_blank" rel="noopener">CERTH</a>, <a href="http://www.artanim.ch/" target="_blank" rel="noopener">Artanim</a>, <a href="https://www.viaccess-orca.com/" target="_blank" rel="noopener">Viaccess-Orca</a>, <a href="https://www.entropystudio.net/" target="_blank" rel="noopener">Entropy Studio</a> and <a href="https://www.gpac-licensing.com/" target="_blank" rel="noopener">Motion Spell</a>.</p>

		</div>
	</div>
</div></div></div></div><section class="vc_section"><div class="vc_row wpb_row vc_row-fluid vc_column-gap-10"><div class="wpb_column vc_column_container vc_col-sm-12"><div class="vc_column-inner"><div class="wpb_wrapper"><div class="vc_row wpb_row vc_inner vc_row-fluid vc_custom_1547799200116 vc_row-has-fill vc_row-o-content-middle vc_row-flex"><div class="wpb_column vc_column_container vc_col-sm-2"><div class="vc_column-inner"><div class="wpb_wrapper">
	<div  class="wpb_single_image wpb_content_element vc_align_center">
		
		<figure class="wpb_wrapper vc_figure">
			<div class="vc_single_image-wrapper   vc_box_border_grey"><img width="75" height="50" src="https://vrtogether.eu/wp-content/uploads/2019/01/EU_flag_75px.png" class="vc_single_image-img attachment-thumbnail" alt="" loading="lazy" /></div>
		</figure>
	</div>
</div></div></div><div class="wpb_column vc_column_container vc_col-sm-8"><div class="vc_column-inner"><div class="wpb_wrapper">
	<div class="wpb_text_column wpb_content_element " >
		<div class="wpb_wrapper">
			<p>This project has been funded by the European Commission as part of the H2020 program, under the grant agreement 762111.</p>

		</div>
	</div>
</div></div></div><div class="wpb_column vc_column_container vc_col-sm-2"><div class="vc_column-inner"><div class="wpb_wrapper"></div></div></div></div></div></div></div></div></section>
<p>The post <a rel="nofollow" href="https://vrtogether.eu/2020/04/30/new-kinect4azure-based-volumetric-capture-system-for-top-quality-user-representation/">New Kinect4Azure-based volumetric capture system for top quality user representation</a> appeared first on <a rel="nofollow" href="https://vrtogether.eu">VRTogether</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Kinect4Azure for VRTogether Volumetric Capture</title>
		<link>https://vrtogether.eu/2019/12/04/kinect4azure-for-vrtogether-volumetric-capture/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=kinect4azure-for-vrtogether-volumetric-capture</link>
		
		<dc:creator><![CDATA[CERTH]]></dc:creator>
		<pubDate>Wed, 04 Dec 2019 13:10:06 +0000</pubDate>
				<category><![CDATA[News]]></category>
		<guid isPermaLink="false">https://vrtogether.eu/?p=1876</guid>

					<description><![CDATA[<p>Since Microsoft announced their plans for the release of a new Kinect RGBD sensor, the entire technological community has been waiting patiently. It was unfortunate that the production of the previous Kinect v2 halted, as it was considered one of the best, with respect to the depth estimation quality, low-cost RGBD sensors. The continuation of [&#8230;]</p>
<p>The post <a rel="nofollow" href="https://vrtogether.eu/2019/12/04/kinect4azure-for-vrtogether-volumetric-capture/">Kinect4Azure for VRTogether Volumetric Capture</a> appeared first on <a rel="nofollow" href="https://vrtogether.eu">VRTogether</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Ever since Microsoft announced plans for a new Kinect RGBD sensor, the technology community had been waiting eagerly. It was unfortunate that production of the previous Kinect v2 was halted, as it was considered one of the best low-cost RGBD sensors with respect to depth estimation quality. The continuation of the MS Kinect series thus came as welcome news to the computer vision community.</p>
<p><img loading="lazy" class="wp-image-1877 alignright" src="https://vrtogether.eu/wp-content/uploads/2019/12/1.png" alt="" width="350" height="357" srcset="https://vrtogether.eu/wp-content/uploads/2019/12/1.png 644w, https://vrtogether.eu/wp-content/uploads/2019/12/1-294x300.png 294w, https://vrtogether.eu/wp-content/uploads/2019/12/1-410x418.png 410w, https://vrtogether.eu/wp-content/uploads/2019/12/1-100x102.png 100w, https://vrtogether.eu/wp-content/uploads/2019/12/1-275x281.png 275w" sizes="(max-width: 350px) 100vw, 350px" /></p>
<p>The new <b>Kinect4Azure (K4A)</b> kit includes a 12 MPixel RGB camera complemented by a 1 MPixel depth camera. It also provides supplementary utilities such as real-time skeleton tracking, a 360-degree seven-microphone array and an inertial measurement unit (IMU). These utilities, together with the top-notch depth quality of the <b>Kinect4Azure (K4A)</b>, are expected to boost a wide spectrum of computer vision research fields.</p>
<p>The EU project VRTogether provides an immersive experience for multiple remote users. Its objective is a VR experience in which users can see themselves (user representation), see the other users and interact with them inside the VR space. Reconstructing the users in real time is a prerequisite for achieving this.</p>
<p>Since the announcement of the new <strong>Kinect4Azure (K4A)</strong>, VRTogether, and CERTH in particular, has awaited the sensor’s release. The quality of the representations is critical for immersing each user in the virtual experience: better quality makes the experience more lifelike, as fine details of the representations, such as facial expressions, convey the emotional state of each user.</p>
<p>Pilot 3 is due for release during the third year of the project. Compared with the volumetric videos of users produced by the software of previous pilots, we expect higher-quality representations of human users thanks to the time-of-flight (ToF) technology of the <strong>Kinect4Azure (K4A)</strong>, bringing the immersive experience to its peak by the end of the project.</p>
<p>Beyond the EU VRTogether project, applications of the <strong>Kinect4Azure (K4A)</strong> such as real-time human 3D reconstruction, volumetric capture and volumetric video are already on the horizon.</p>
<p><img loading="lazy" class="aligncenter size-full wp-image-1880" src="https://vrtogether.eu/wp-content/uploads/2019/12/2.png" alt="" width="508" height="626" srcset="https://vrtogether.eu/wp-content/uploads/2019/12/2.png 508w, https://vrtogether.eu/wp-content/uploads/2019/12/2-243x300.png 243w, https://vrtogether.eu/wp-content/uploads/2019/12/2-410x505.png 410w, https://vrtogether.eu/wp-content/uploads/2019/12/2-100x123.png 100w, https://vrtogether.eu/wp-content/uploads/2019/12/2-275x339.png 275w" sizes="(max-width: 508px) 100vw, 508px" /><div class="vc_row wpb_row vc_row-fluid"><div class="wpb_column vc_column_container vc_col-sm-12"><div class="vc_column-inner"><div class="wpb_wrapper"><div class="vc_separator wpb_content_element vc_separator_align_center vc_sep_width_100 vc_sep_pos_align_center vc_separator_no_text vc_sep_color_grey" ><span class="vc_sep_holder vc_sep_holder_l"><span  class="vc_sep_line"></span></span><span class="vc_sep_holder vc_sep_holder_r"><span  class="vc_sep_line"></span></span>
</div>
	<div class="wpb_text_column wpb_content_element " >
		<div class="wpb_wrapper">
			<p><i>Authors: <a href="https://vrtogether.eu/team/anargyros-chatzitofis/">Anargyros Chatzitofis</a>, <a href="https://vrtogether.eu/team/vasileios-magoulianitis/">Vasilios Magoulianitis</a></i><i> – <a href="https://vrtogether.eu/consortium/CERTH/">CERTH</a></i></p>
<p>Come and follow us in this VR journey with <a href="http://www.i2cat.net/en" target="_blank" rel="noopener">i2CAT</a>, <a href="https://www.cwi.nl/" target="_blank" rel="noopener">CWI</a>, <a href="https://www.tno.nl/en/" target="_blank" rel="noopener">TNO</a>, <a href="https://www.certh.gr/root.en.aspx" target="_blank" rel="noopener">CERTH</a>, <a href="http://www.artanim.ch/" target="_blank" rel="noopener">Artanim</a>, <a href="https://www.viaccess-orca.com/" target="_blank" rel="noopener">Viaccess-Orca</a>, <a href="https://www.entropystudio.net/" target="_blank" rel="noopener">Entropy Studio</a> and <a href="https://www.gpac-licensing.com/" target="_blank" rel="noopener">Motion Spell</a>.</p>

		</div>
	</div>
</div></div></div></div><section class="vc_section"><div class="vc_row wpb_row vc_row-fluid vc_column-gap-10"><div class="wpb_column vc_column_container vc_col-sm-12"><div class="vc_column-inner"><div class="wpb_wrapper"><div class="vc_row wpb_row vc_inner vc_row-fluid vc_custom_1547799200116 vc_row-has-fill vc_row-o-content-middle vc_row-flex"><div class="wpb_column vc_column_container vc_col-sm-2"><div class="vc_column-inner"><div class="wpb_wrapper">
	<div  class="wpb_single_image wpb_content_element vc_align_center">
		
		<figure class="wpb_wrapper vc_figure">
			<div class="vc_single_image-wrapper   vc_box_border_grey"><img width="75" height="50" src="https://vrtogether.eu/wp-content/uploads/2019/01/EU_flag_75px.png" class="vc_single_image-img attachment-thumbnail" alt="" loading="lazy" /></div>
		</figure>
	</div>
</div></div></div><div class="wpb_column vc_column_container vc_col-sm-8"><div class="vc_column-inner"><div class="wpb_wrapper">
	<div class="wpb_text_column wpb_content_element " >
		<div class="wpb_wrapper">
			<p>This project has been funded by the European Commission as part of the H2020 program, under the grant agreement 762111.</p>

		</div>
	</div>
</div></div></div><div class="wpb_column vc_column_container vc_col-sm-2"><div class="vc_column-inner"><div class="wpb_wrapper"></div></div></div></div></div></div></div></div></section>
<p>The post <a rel="nofollow" href="https://vrtogether.eu/2019/12/04/kinect4azure-for-vrtogether-volumetric-capture/">Kinect4Azure for VRTogether Volumetric Capture</a> appeared first on <a rel="nofollow" href="https://vrtogether.eu">VRTogether</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>VR-Together at CERTH-ITI Open Day 2019</title>
		<link>https://vrtogether.eu/2019/05/23/vr-together-at-certh-iti-open-day-2019/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=vr-together-at-certh-iti-open-day-2019</link>
		
		<dc:creator><![CDATA[CERTH]]></dc:creator>
		<pubDate>Thu, 23 May 2019 07:01:05 +0000</pubDate>
				<category><![CDATA[Events]]></category>
		<category><![CDATA[News]]></category>
		<guid isPermaLink="false">https://vrtogether.eu/?p=1462</guid>

					<description><![CDATA[<p>The post <a rel="nofollow" href="https://vrtogether.eu/2019/05/23/vr-together-at-certh-iti-open-day-2019/">VR-Together at CERTH-ITI Open Day 2019</a> appeared first on <a rel="nofollow" href="https://vrtogether.eu">VRTogether</a>.</p>
]]></description>
										<content:encoded><![CDATA[<div class="vc_row wpb_row vc_row-fluid"><div class="wpb_column vc_column_container vc_col-sm-8"><div class="vc_column-inner"><div class="wpb_wrapper">
	<div class="wpb_text_column wpb_content_element " >
		<div class="wpb_wrapper">
			<p>VR-Together was present at <strong>CERTH-ITI Open Day 2019</strong>, held on Friday 10/5 in Thessaloniki, Greece. The event is organized every year as part of the “EU in my region” campaign and includes two sessions, one public and one industrial. Throughout the Open Day, tour guides led the guests through all the presented ICT technologies, which are grouped by research field.</p>
<p><strong>VR-Together</strong> participated under the theme “Social VR Technology”, where guests had the chance to immerse themselves in the Pilot 1 – Police Interrogation scenario using the <strong>Time-Varying Mesh</strong> 3D reconstruction setup. More than 200 people were informed about <strong>VR-Together</strong> <strong>technologies</strong> and objectives.</p>

		</div>
	</div>
</div></div></div><div class="wpb_column vc_column_container vc_col-sm-4"><div class="vc_column-inner"><div class="wpb_wrapper">
	<div  class="wpb_single_image wpb_content_element vc_align_left">
		
		<figure class="wpb_wrapper vc_figure">
			<div class="vc_single_image-wrapper   vc_box_border_grey"><img width="1544" height="1180" src="https://vrtogether.eu/wp-content/uploads/2019/05/open_day_brochure.jpg" class="vc_single_image-img attachment-full" alt="" loading="lazy" srcset="https://vrtogether.eu/wp-content/uploads/2019/05/open_day_brochure.jpg 1544w, https://vrtogether.eu/wp-content/uploads/2019/05/open_day_brochure-300x229.jpg 300w, https://vrtogether.eu/wp-content/uploads/2019/05/open_day_brochure-768x587.jpg 768w, https://vrtogether.eu/wp-content/uploads/2019/05/open_day_brochure-1024x783.jpg 1024w, https://vrtogether.eu/wp-content/uploads/2019/05/open_day_brochure-700x535.jpg 700w, https://vrtogether.eu/wp-content/uploads/2019/05/open_day_brochure-410x313.jpg 410w, https://vrtogether.eu/wp-content/uploads/2019/05/open_day_brochure-100x76.jpg 100w, https://vrtogether.eu/wp-content/uploads/2019/05/open_day_brochure-275x210.jpg 275w" sizes="(max-width: 1544px) 100vw, 1544px" /></div>
		</figure>
	</div>
</div></div></div></div><div class="vc_row wpb_row vc_row-fluid"><div class="wpb_column vc_column_container vc_col-sm-12"><div class="vc_column-inner"><div class="wpb_wrapper"><div class="vc_separator wpb_content_element vc_separator_align_center vc_sep_width_100 vc_sep_pos_align_center vc_separator_no_text vc_sep_color_grey" ><span class="vc_sep_holder vc_sep_holder_l"><span  class="vc_sep_line"></span></span><span class="vc_sep_holder vc_sep_holder_r"><span  class="vc_sep_line"></span></span>
</div>
	<div class="wpb_text_column wpb_content_element " >
		<div class="wpb_wrapper">
			<p><em>Text: Spyridon Thermos, Argyris Chatzitofis – CERTH-ITI</em></p>
<p>Come and follow us in this VR journey with <a href="http://www.i2cat.net/en" target="_blank" rel="noopener">i2CAT</a>, <a href="https://www.cwi.nl/" target="_blank" rel="noopener">CWI</a>, <a href="https://www.tno.nl/en/" target="_blank" rel="noopener">TNO</a>, <a href="https://www.certh.gr/root.en.aspx" target="_blank" rel="noopener">CERTH</a>, <a href="http://www.artanim.ch/" target="_blank" rel="noopener">Artanim</a>, <a href="https://www.viaccess-orca.com/" target="_blank" rel="noopener">Viaccess-Orca</a>, <a href="https://www.entropystudio.net/" target="_blank" rel="noopener">Entropy Studio</a> and <a href="https://www.gpac-licensing.com/" target="_blank" rel="noopener">Motion Spell</a>.</p>

		</div>
	</div>
</div></div></div></div><section class="vc_section"><div class="vc_row wpb_row vc_row-fluid vc_column-gap-10"><div class="wpb_column vc_column_container vc_col-sm-12"><div class="vc_column-inner"><div class="wpb_wrapper"><div class="vc_row wpb_row vc_inner vc_row-fluid vc_custom_1547799200116 vc_row-has-fill vc_row-o-content-middle vc_row-flex"><div class="wpb_column vc_column_container vc_col-sm-2"><div class="vc_column-inner"><div class="wpb_wrapper">
	<div  class="wpb_single_image wpb_content_element vc_align_center">
		
		<figure class="wpb_wrapper vc_figure">
			<div class="vc_single_image-wrapper   vc_box_border_grey"><img width="75" height="50" src="https://vrtogether.eu/wp-content/uploads/2019/01/EU_flag_75px.png" class="vc_single_image-img attachment-thumbnail" alt="" loading="lazy" /></div>
		</figure>
	</div>
</div></div></div><div class="wpb_column vc_column_container vc_col-sm-8"><div class="vc_column-inner"><div class="wpb_wrapper">
	<div class="wpb_text_column wpb_content_element " >
		<div class="wpb_wrapper">
			<p>This project has been funded by the European Commission as part of the H2020 program, under the grant agreement 762111.</p>

		</div>
	</div>
</div></div></div><div class="wpb_column vc_column_container vc_col-sm-2"><div class="vc_column-inner"><div class="wpb_wrapper"></div></div></div></div></div></div></div></div></section>
<p>The post <a rel="nofollow" href="https://vrtogether.eu/2019/05/23/vr-together-at-certh-iti-open-day-2019/">VR-Together at CERTH-ITI Open Day 2019</a> appeared first on <a rel="nofollow" href="https://vrtogether.eu">VRTogether</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>VRTogether wins the Best Demo Award at MMM2019</title>
		<link>https://vrtogether.eu/2019/01/18/vrtogether-wins-the-best-demo-award-at-mmm2019/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=vrtogether-wins-the-best-demo-award-at-mmm2019</link>
		
		<dc:creator><![CDATA[CERTH]]></dc:creator>
		<pubDate>Fri, 18 Jan 2019 08:20:55 +0000</pubDate>
				<category><![CDATA[Events]]></category>
		<category><![CDATA[News]]></category>
		<category><![CDATA[Award]]></category>
		<category><![CDATA[Conference]]></category>
		<category><![CDATA[TVM]]></category>
		<category><![CDATA[VR]]></category>
		<guid isPermaLink="false">http://vrtogether.eu/?p=788</guid>

					<description><![CDATA[<p>The post <a rel="nofollow" href="https://vrtogether.eu/2019/01/18/vrtogether-wins-the-best-demo-award-at-mmm2019/">VRTogether wins the Best Demo Award at MMM2019</a> appeared first on <a rel="nofollow" href="https://vrtogether.eu">VRTogether</a>.</p>
]]></description>
										<content:encoded><![CDATA[<div class="vc_row wpb_row vc_row-fluid"><div class="wpb_column vc_column_container vc_col-sm-12"><div class="vc_column-inner"><div class="wpb_wrapper">
	<div class="wpb_text_column wpb_content_element " >
		<div class="wpb_wrapper">
			<p>VRTogether project partner <a href="https://www.certh.gr/root.en.aspx" target="_blank" rel="noopener">CERTH</a> participated in the 25th <b>International MultiMedia Modeling Conference (MMM)</b> with the SpaceWars demo. MMM is a leading international forum for researchers and industry practitioners to share new ideas, original research results and practical development experiences from all multimedia-related areas. The MMM 2019 program comprised several regular oral sessions, five special sessions, two poster and demo sessions, one industry session, the Video Browser Showdown session, one workshop, three invited keynote talks and two tutorials.</p>
<p>On the fourth day of the conference, the team of CERTH’s Visual Computing Lab (VCL) presented a poster depicting the <b>Time-Varying Mesh</b> (TVM) reconstruction pipeline used in <b>VRTogether</b>, as well as a game application called “<b>SpaceWars</b>” that leverages the real-time reconstruction of the users.</p>
<p><iframe width="1170" height="658" src="https://www.youtube.com/embed/nK7pC41YjZY?feature=oembed" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe></p>
<p>SpaceWars is a tele-immersive game in which multiple players are placed in the same virtual arena on top of futuristic hovercrafts and engage each other in a Capture-the-Flag style match. It utilizes the latest version of the VRTogether platform developed at CERTH (VCL) and was built to stress-test the technology with respect to real-time interactions between remote users, under the challenging responsiveness requirements of a multiplayer game.</p>
<p>The current version of VRTogether’s TVM-based system consists of only 4 Microsoft Kinect sensors surrounding the user. The platform is portable and can easily be deployed and operated as a local capturing station, and its use of consumer-grade vision sensors keeps the equipment cost low. Additionally, it offers an easy-to-use calibration scheme based on a custom-built structure made from commercially available materials.</p>
<p>The SpaceWars system also includes a VR spectator mode, allowing users who are not playing to “live” the experience inside the virtual environment and watch the ongoing game.</p>
<p>During the demo session, all attendees had the opportunity to vote for the best presented demo, and the committee subsequently convened to take these votes into account. SpaceWars was voted the best demo of the conference, winning the <b>Best Demo Award</b> along with the corresponding prize from Springer.</p>
<p>The award is available at the <b>Visual Computing Lab</b> official site (<a href="http://vcl.iti.gr/new/best-demo-award-mmm-2019/">click here</a>).</p>

		</div>
	</div>
</div></div></div></div><section class="vc_section"><div class="vc_row wpb_row vc_row-fluid"><div class="wpb_column vc_column_container vc_col-sm-12"><div class="vc_column-inner"><div class="wpb_wrapper"><div class="vc_separator wpb_content_element vc_separator_align_center vc_sep_width_100 vc_sep_pos_align_center vc_separator_no_text vc_sep_color_grey" ><span class="vc_sep_holder vc_sep_holder_l"><span  class="vc_sep_line"></span></span><span class="vc_sep_holder vc_sep_holder_r"><span  class="vc_sep_line"></span></span>
</div>
	<div class="wpb_text_column wpb_content_element " >
		<div class="wpb_wrapper">
			<p><em>Text and figures: Spyridon Thermos, Anargyros Chatzitofis &#8211; CERTH</em></p>
<p>Come and follow us in this VR journey with <a href="http://www.i2cat.net/en" target="_blank" rel="noopener">i2CAT</a>, <a href="https://www.cwi.nl/" target="_blank" rel="noopener">CWI</a>, <a href="https://www.tno.nl/en/" target="_blank" rel="noopener">TNO</a>, <a href="https://www.certh.gr/root.en.aspx" target="_blank" rel="noopener">CERTH</a>, <a href="http://www.artanim.ch/" target="_blank" rel="noopener">Artanim</a>, <a href="https://www.viaccess-orca.com/" target="_blank" rel="noopener">Viaccess-Orca</a>, <a href="https://www.entropystudio.net/" target="_blank" rel="noopener">Entropy Studio</a> and <a href="https://www.gpac-licensing.com/" target="_blank" rel="noopener">Motion Spell</a>.</p>

		</div>
	</div>
</div></div></div></div></section><section class="vc_section"><div class="vc_row wpb_row vc_row-fluid vc_column-gap-10"><div class="wpb_column vc_column_container vc_col-sm-12"><div class="vc_column-inner"><div class="wpb_wrapper"><div class="vc_row wpb_row vc_inner vc_row-fluid vc_custom_1547799200116 vc_row-has-fill vc_row-o-content-middle vc_row-flex"><div class="wpb_column vc_column_container vc_col-sm-2"><div class="vc_column-inner"><div class="wpb_wrapper">
	<div  class="wpb_single_image wpb_content_element vc_align_center">
		
		<figure class="wpb_wrapper vc_figure">
			<div class="vc_single_image-wrapper   vc_box_border_grey"><img width="75" height="50" src="https://vrtogether.eu/wp-content/uploads/2019/01/EU_flag_75px.png" class="vc_single_image-img attachment-thumbnail" alt="" loading="lazy" /></div>
		</figure>
	</div>
</div></div></div><div class="wpb_column vc_column_container vc_col-sm-8"><div class="vc_column-inner"><div class="wpb_wrapper">
	<div class="wpb_text_column wpb_content_element " >
		<div class="wpb_wrapper">
			<p>This project has been funded by the European Commission as part of the H2020 program, under the grant agreement 762111.</p>

		</div>
	</div>
</div></div></div><div class="wpb_column vc_column_container vc_col-sm-2"><div class="vc_column-inner"><div class="wpb_wrapper"></div></div></div></div></div></div></div></div></section>
<p>The post <a rel="nofollow" href="https://vrtogether.eu/2019/01/18/vrtogether-wins-the-best-demo-award-at-mmm2019/">VRTogether wins the Best Demo Award at MMM2019</a> appeared first on <a rel="nofollow" href="https://vrtogether.eu">VRTogether</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>New capturing experiment for HMD removal research</title>
		<link>https://vrtogether.eu/2018/07/03/new-capturing-experiment-for-hmd-removal-research/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=new-capturing-experiment-for-hmd-removal-research</link>
		
		<dc:creator><![CDATA[CERTH]]></dc:creator>
		<pubDate>Tue, 03 Jul 2018 10:49:35 +0000</pubDate>
				<category><![CDATA[News]]></category>
		<category><![CDATA[capturing]]></category>
		<category><![CDATA[certh]]></category>
		<category><![CDATA[facial capture]]></category>
		<category><![CDATA[HMD]]></category>
		<guid isPermaLink="false">http://vrtogether.eu/?p=594</guid>

					<description><![CDATA[<p>VRTogether partner CERTH will soon start capturing face content in 3D (RGB-D), with and without a head-mounted display (HMD), to be used for HMD-removal and any other research purpose of the project. In this post, we present the key components of the experiment and provide shots of the user lab located at CERTH. The photos [&#8230;]</p>
<p>The post <a rel="nofollow" href="https://vrtogether.eu/2018/07/03/new-capturing-experiment-for-hmd-removal-research/">New capturing experiment for HMD removal research</a> appeared first on <a rel="nofollow" href="https://vrtogether.eu">VRTogether</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>VRTogether partner <strong>CERTH</strong> will soon start capturing face content in 3D (RGB-D), with and without a <strong>head-mounted display (HMD)</strong>, to be used for HMD-removal and any other research purpose of the project. In this post, we present the key components of the experiment and provide shots of the user lab located at CERTH.</p>
<p>The photos below depict the capturing environment as well as the experimental setup. The user lab is located at the Visual Computing Lab of CERTH. The data will be recorded under controlled illumination and against a static background.</p>
<p>Each subject will be asked to complete two sessions of changing <strong>facial expressions</strong>, with and without wearing an HMD. Three different HMDs will be used: HTC Vive, Oculus and FOVE.</p>
<p>Three calibrated RGB-D sensors will capture both <strong>colour and depth facial information</strong> from different points of view. The goal is to capture 60-100 subjects, building a significant amount of <strong>data</strong> for research purposes.</p>
<p>We’ll keep you updated on the developments and results of these experiments!</p>
<p><img loading="lazy" class="aligncenter wp-image-596 " src="http://vrtogether.eu/wp-content/uploads/2018/07/hmd-removal-2-1024x465.png" alt="" width="850" height="386" srcset="https://vrtogether.eu/wp-content/uploads/2018/07/hmd-removal-2-1024x465.png 1024w, https://vrtogether.eu/wp-content/uploads/2018/07/hmd-removal-2-300x136.png 300w, https://vrtogether.eu/wp-content/uploads/2018/07/hmd-removal-2-768x349.png 768w, https://vrtogether.eu/wp-content/uploads/2018/07/hmd-removal-2-700x318.png 700w, https://vrtogether.eu/wp-content/uploads/2018/07/hmd-removal-2-410x186.png 410w, https://vrtogether.eu/wp-content/uploads/2018/07/hmd-removal-2-100x45.png 100w, https://vrtogether.eu/wp-content/uploads/2018/07/hmd-removal-2-275x125.png 275w, https://vrtogether.eu/wp-content/uploads/2018/07/hmd-removal-2.png 2016w" sizes="(max-width: 850px) 100vw, 850px" /></p>
<p><img loading="lazy" class="aligncenter wp-image-595 " src="http://vrtogether.eu/wp-content/uploads/2018/07/hmd-removal-1-1024x582.png" alt="" width="851" height="484" srcset="https://vrtogether.eu/wp-content/uploads/2018/07/hmd-removal-1-1024x582.png 1024w, https://vrtogether.eu/wp-content/uploads/2018/07/hmd-removal-1-300x171.png 300w, https://vrtogether.eu/wp-content/uploads/2018/07/hmd-removal-1-768x437.png 768w, https://vrtogether.eu/wp-content/uploads/2018/07/hmd-removal-1-700x398.png 700w, https://vrtogether.eu/wp-content/uploads/2018/07/hmd-removal-1-410x233.png 410w, https://vrtogether.eu/wp-content/uploads/2018/07/hmd-removal-1-100x57.png 100w, https://vrtogether.eu/wp-content/uploads/2018/07/hmd-removal-1-275x156.png 275w, https://vrtogether.eu/wp-content/uploads/2018/07/hmd-removal-1.png 2016w" sizes="(max-width: 851px) 100vw, 851px" /></p>
<p><img loading="lazy" class="aligncenter wp-image-597 " src="http://vrtogether.eu/wp-content/uploads/2018/07/hmd-removal-1024x414.png" alt="" width="851" height="344" srcset="https://vrtogether.eu/wp-content/uploads/2018/07/hmd-removal-1024x414.png 1024w, https://vrtogether.eu/wp-content/uploads/2018/07/hmd-removal-300x121.png 300w, https://vrtogether.eu/wp-content/uploads/2018/07/hmd-removal-768x311.png 768w, https://vrtogether.eu/wp-content/uploads/2018/07/hmd-removal-700x283.png 700w, https://vrtogether.eu/wp-content/uploads/2018/07/hmd-removal-410x166.png 410w, https://vrtogether.eu/wp-content/uploads/2018/07/hmd-removal-100x40.png 100w, https://vrtogether.eu/wp-content/uploads/2018/07/hmd-removal-275x111.png 275w" sizes="(max-width: 851px) 100vw, 851px" /></p>
<p>The post <a rel="nofollow" href="https://vrtogether.eu/2018/07/03/new-capturing-experiment-for-hmd-removal-research/">New capturing experiment for HMD removal research</a> appeared first on <a rel="nofollow" href="https://vrtogether.eu">VRTogether</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>New VR-Together Capture System based on cutting-edge technologies</title>
		<link>https://vrtogether.eu/2018/01/24/new-vr-together-capture-system-based-on-cutting-edge-technologies/?utm_source=rss&#038;utm_medium=rss&#038;utm_campaign=new-vr-together-capture-system-based-on-cutting-edge-technologies</link>
		
		<dc:creator><![CDATA[CERTH]]></dc:creator>
		<pubDate>Wed, 24 Jan 2018 07:52:22 +0000</pubDate>
				<category><![CDATA[News]]></category>
		<category><![CDATA[3D]]></category>
		<category><![CDATA[capture]]></category>
		<category><![CDATA[sensor]]></category>
		<guid isPermaLink="false">http://vrtogether.eu/?p=355</guid>

					<description><![CDATA[<p>Microsoft announced the discontinuation of the Kinect for Xbox One device, the integrated depth-sensing device of the initial people 3D capture system. However, Intel recently released new high-tech depth-sensing devices, Intel RealSense D400-Series, providing depth data of high quality and accuracy as well as of high frame rate. CERTH has purchased 4 x D415 and [&#8230;]</p>
<p>The post <a rel="nofollow" href="https://vrtogether.eu/2018/01/24/new-vr-together-capture-system-based-on-cutting-edge-technologies/">New VR-Together Capture System based on cutting-edge technologies</a> appeared first on <a rel="nofollow" href="https://vrtogether.eu">VRTogether</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>Microsoft</strong> announced the discontinuation of the <strong>Kinect for Xbox One</strong>, the depth-sensing device integrated in the initial people 3D capture system. However, Intel recently released new high-tech depth-sensing devices, the <a href="https://software.intel.com/en-us/realsense/d400" target="_blank" rel="noopener noreferrer"><strong>Intel RealSense D400-Series</strong></a>, which provide high-quality, accurate depth data at a high frame rate. <a href="https://www.certh.gr/root.en.aspx" target="_blank" rel="noopener noreferrer"><strong>CERTH</strong></a> has purchased <strong>4 x D415</strong> and <strong>4 x D435</strong> devices to integrate into the system, bringing higher hardware (HW) specifications to the <strong>VR-Together capture system</strong>.</p>
<p><img loading="lazy" class="aligncenter wp-image-357 size-large" src="http://vrtogether.eu/wp-content/uploads/2018/01/image2_small-1-1024x588.png" alt="" width="1024" height="588" srcset="https://vrtogether.eu/wp-content/uploads/2018/01/image2_small-1-1024x588.png 1024w, https://vrtogether.eu/wp-content/uploads/2018/01/image2_small-1-300x172.png 300w, https://vrtogether.eu/wp-content/uploads/2018/01/image2_small-1-768x441.png 768w, https://vrtogether.eu/wp-content/uploads/2018/01/image2_small-1-700x402.png 700w, https://vrtogether.eu/wp-content/uploads/2018/01/image2_small-1-410x235.png 410w, https://vrtogether.eu/wp-content/uploads/2018/01/image2_small-1-100x57.png 100w, https://vrtogether.eu/wp-content/uploads/2018/01/image2_small-1-275x158.png 275w, https://vrtogether.eu/wp-content/uploads/2018/01/image2_small-1.png 1412w" sizes="(max-width: 1024px) 100vw, 1024px" /></p>
<p><em>Figure 1: Intel RealSense D415 devices in CERTH’s laboratory, with the Intel RealSense SDK viewer visualizing the RGB and depth streams at Full High Definition and 1280&#215;720 resolution, respectively.</em></p>
<p>Moreover, <strong>Intel</strong> provides a <strong>Low-Level Device API</strong> with the means to take direct control of the individual device sensors.</p>
<ul>
<li>Each sensor has its own power management and control.</li>
<li>Different sensors can be safely used from different applications and can only influence each other indirectly.</li>
<li>Each sensor can offer one or more streams of data. Streams must be configured together and are usually dependent on each other.</li>
<li>All sensors provide streaming as minimal capability, but each individual sensor can be extended to offer additional functionality.</li>
<li>The Intel RealSense D400 stereo module offers Advanced Mode functionality, letting you control the various ASIC registers responsible for depth generation.</li>
</ul>
<p><img loading="lazy" class="aligncenter wp-image-358 size-large" src="http://vrtogether.eu/wp-content/uploads/2018/01/image3_proccessed_small-1024x523.png" alt="" width="1024" height="523" srcset="https://vrtogether.eu/wp-content/uploads/2018/01/image3_proccessed_small-1024x523.png 1024w, https://vrtogether.eu/wp-content/uploads/2018/01/image3_proccessed_small-300x153.png 300w, https://vrtogether.eu/wp-content/uploads/2018/01/image3_proccessed_small-768x392.png 768w, https://vrtogether.eu/wp-content/uploads/2018/01/image3_proccessed_small-700x358.png 700w, https://vrtogether.eu/wp-content/uploads/2018/01/image3_proccessed_small-410x209.png 410w, https://vrtogether.eu/wp-content/uploads/2018/01/image3_proccessed_small-100x51.png 100w, https://vrtogether.eu/wp-content/uploads/2018/01/image3_proccessed_small-275x140.png 275w, https://vrtogether.eu/wp-content/uploads/2018/01/image3_proccessed_small.png 1423w" sizes="(max-width: 1024px) 100vw, 1024px" /></p>
<p><em>Figure 2: A set of image sensors that capture disparity between images, a dedicated RGB image signal processor for image adjustment and color scaling, and an active infrared projector (emitter) that illuminates objects to enhance the depth data.</em></p>
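<p>To make the &#8220;streams must be configured together&#8221; point concrete, here is a minimal, hypothetical sketch in plain Python (not the RealSense SDK API): given the resolution/frame-rate profiles each sensor advertises, pick a depth and color profile pair that can run at a common frame rate.</p>

```python
def pick_profiles(depth_profiles, color_profiles):
    """Pick a (depth, color) profile pair that shares a frame rate.

    Each profile is a (width, height, fps) tuple, modeling what a sensor
    advertises. Prefers the highest resolution at the highest common
    frame rate; returns None if the two streams cannot run together.
    """
    common_fps = {d[2] for d in depth_profiles} & {c[2] for c in color_profiles}
    if not common_fps:
        return None
    fps = max(common_fps)

    def best(profiles):
        # Highest-resolution profile at the chosen frame rate.
        return max((p for p in profiles if p[2] == fps),
                   key=lambda p: p[0] * p[1])

    return best(depth_profiles), best(color_profiles)

# Example: a device advertising 1280x720 depth and Full HD color at 30 fps
depth = [(1280, 720, 30), (848, 480, 90)]
color = [(1920, 1080, 30), (1280, 720, 60)]
# pick_profiles(depth, color) -> ((1280, 720, 30), (1920, 1080, 30))
```

<p>The profile tuples and selection policy are illustrative only; the actual SDK exposes stream profiles through its sensor objects and rejects incompatible combinations at configuration time.</p>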
<p>Beyond the hardware updates, CERTH has started to design and implement <strong>new versions</strong> of the sub-components used in the people 3D capture system.</p>
<p>New synchronization methods will be developed to <strong>better group and process</strong> the multiple RGB-D frames, and <strong>software-</strong> and <strong>hardware (HW)-based approaches</strong> will be investigated to more accurately extract <strong>the user&#8217;s 3D point cloud</strong>, allowing for a higher-quality 3D representation of the user in the <strong>VR-Together platform</strong>.</p>
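<p>As a rough illustration of the grouping idea, the sketch below (hypothetical, not the VR-Together implementation) collects per-device RGB-D frames into synchronized sets by matching capture timestamps against a reference device within a tolerance; the field names and tolerance value are assumptions for the example.</p>

```python
def group_frames(streams, tolerance_ms=8.0):
    """Group per-device RGB-D frames whose timestamps lie within a tolerance.

    streams: dict mapping device id -> list of frames, each frame a dict
             {"ts": capture timestamp in ms, "rgbd": payload}, sorted by ts.
    Returns a list of groups; each group maps device id -> frame, and only
    complete groups (one frame from every device) are kept.
    """
    groups = []
    cursors = {dev: 0 for dev in streams}      # per-device scan position
    ref_dev = next(iter(streams))              # first device acts as reference
    for ref in streams[ref_dev]:
        group = {ref_dev: ref}
        for dev, frames in streams.items():
            if dev == ref_dev or not frames:
                continue
            # Advance to the frame closest to the reference timestamp.
            i = cursors[dev]
            while (i + 1 < len(frames) and
                   abs(frames[i + 1]["ts"] - ref["ts"])
                   <= abs(frames[i]["ts"] - ref["ts"])):
                i += 1
            cursors[dev] = i
            if abs(frames[i]["ts"] - ref["ts"]) <= tolerance_ms:
                group[dev] = frames[i]
        if len(group) == len(streams):         # complete set across devices
            groups.append(group)
    return groups

# Two devices capturing at ~30 fps with slightly offset clocks:
streams = {
    "dev-a": [{"ts": 0.0, "rgbd": "a0"}, {"ts": 33.3, "rgbd": "a1"}],
    "dev-b": [{"ts": 2.1, "rgbd": "b0"}, {"ts": 40.0, "rgbd": "b1"}],
}
# group_frames(streams) -> two groups: {a0, b0} and {a1, b1}
```

<p>A real multi-sensor pipeline would additionally have to handle clock drift between devices and dropped frames, which is part of what the new synchronization methods aim to address.</p>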
<p><em>Text and pictures: <a href="https://twitter.com/tofis3d" target="_blank" rel="noopener noreferrer">Argyris Chatzitofis</a>, <a href="https://www.certh.gr/root.en.aspx" target="_blank" rel="noopener noreferrer">CERTH</a></em></p>
<p>The post <a rel="nofollow" href="https://vrtogether.eu/2018/01/24/new-vr-together-capture-system-based-on-cutting-edge-technologies/">New VR-Together Capture System based on cutting-edge technologies</a> appeared first on <a rel="nofollow" href="https://vrtogether.eu">VRTogether</a>.</p>
]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
