Encoding VR and 360 Immersive Video for Meta Quest Headsets

This article was originally published in April 2023. It was updated Nov 14, 2023 with information about Quest 3 AV1 support.

Whether you call it Virtual Reality (VR), 360 video or Metaverse content, there are a lot of details to take into consideration in order to guarantee a good immersive experience. Video resolution, bitrates and codec settings all need to be chosen so that they create a high quality of experience for viewers, while staying conscious of the storage and delivery costs that come with these huge files. Although all of this has been widely discussed for 2D displays like mobile phones and TVs, VR streaming differs enormously from those traditional screens: it uses different display technology that drastically shortens the viewing distance from eye to screen. In addition, VR headset specs may differ from one device to another, so the same video may produce a different visual experience depending on the model or device. In this post we share the things you need to consider, along with tips and best practices for encoding great-looking VR content, specifically for playback on Meta Quest (formerly Oculus) headsets.

Visual quality requirements of 3D-VR vs 2D videos

Unlike traditional 2D screens, where viewers sit at a considerable distance from the display, VR viewers look at a smaller screen much closer to the eyes. This drastically changes the way a video should be encoded in order to guarantee good visual quality for an immersive 3D experience. For the same reason, traditional 2D video quality metrics such as VMAF and PSNR are usually not useful for measuring the perceived quality of 3D VR content. For instance:

VMAF for 3D-VR

VMAF assumes 2D viewers located at a viewing distance on the order of the screen size, for example:

  • 4K VMAF model – vmaf_4k_v0.6.1 assumes the viewer is located at 1.5H from the screen, where H is the height of the TV screen.
  • HD VMAF model – vmaf_v0.6.1 assumes a viewer located at 3H from the screen.

The previous models result in a pixel density of about 60 pixels per degree (ppd) and 75 ppd – for 4K and HD respectively. However, for VR video the effective pixel density is far lower: the Meta Quest 2 specs, for instance, mention a pixel density of 20 ppd. Therefore, the predefined VMAF models are not suitable. In fact, if you do use VMAF to measure the visual quality (VQ) of a VR video intended for headset playback, you will probably find it does not look good enough even though it has a high VMAF score – this is because of the “zoom in” that Quest applies compared to traditional screens.
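
To make the viewing-distance argument concrete, here is a rough back-of-the-envelope sketch in plain JavaScript (flat screen viewed head-on, ignoring lens optics and screen curvature) showing how the 4K VMAF viewing-distance assumption translates into roughly 60 ppd, compared to the ~20 ppd quoted for Quest 2:

// Approximate pixels per degree for a flat screen viewed head-on.
// pixels: pixel count along one screen dimension (e.g. 2160 for 4K height)
// distanceInScreenHeights: viewing distance expressed in multiples of that dimension
function pixelsPerDegree(pixels, distanceInScreenHeights) {
  // Total angle subtended by the screen dimension, in degrees
  const fovDegrees = 2 * Math.atan(0.5 / distanceInScreenHeights) * (180 / Math.PI);
  return pixels / fovDegrees;
}

console.log(pixelsPerDegree(2160, 1.5).toFixed(0)); // ≈ 59 ppd for the 4K VMAF model
// Compare with the ~20 ppd quoted in the Meta Quest 2 specs – roughly a 3x
// coarser pixel grid per degree of the viewer's field of view.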

PSNR for 3D-VR

While it is not a strict rule, 2D videos are generally expected to have good VQ when PSNR values are between 39 dB and 42 dB for average- to high-complexity content (see [1], [2]). However, this PSNR range is usually not enough to create a good immersive experience on Quest headsets. In empirical tests we ran, we found that a PSNR of at least 48 dB is required for good VQ on Quest devices.
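
To put those numbers in perspective, PSNR follows directly from the mean squared error between the source and the encode. A quick sketch (plain JavaScript, 8-bit video assumed) shows that the jump from 42 dB to 48 dB corresponds to roughly a 4x reduction in MSE:

// PSNR for 8-bit video: PSNR = 10 * log10(MAX^2 / MSE), with MAX = 255.
function psnr(mse, maxPixelValue = 255) {
  return 10 * Math.log10((maxPixelValue * maxPixelValue) / mse);
}

console.log(psnr(4.1).toFixed(1));  // ≈ 42.0 dB – good VQ on a typical 2D screen
console.log(psnr(1.03).toFixed(1)); // ≈ 48.0 dB – the level we found necessary for Quest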

[Image (source: Meta Quest Blog)]

The Best Encoding Settings for Meta Quest devices

A general overview of the Video Requirements can be found at the Meta Quest website. Additionally, the following encoding settings may be useful when building your encoding workflow: 

Resolution

The minimum resolution suggested by Meta is 3840 x 3840 px for stereoscopic content and 3840 x 1920 px for monoscopic content, which is much higher than what earlier headset generations or mobile devices require.

H265 Video Codec Settings 

Video Codec – Meta Quest devices support the H.264 (AVC) and H.265 (HEVC) codecs; however, given that resolutions above 3840 px are required, we strongly recommend H.265 because of its much higher encoding efficiency compared to H.264.

GOP Length – In our tests we achieved good VQ within the recommended bitrate range using a 2-second GOP length for 30 fps content. However, since video-on-demand VR playback is not particularly latency sensitive, longer GOP lengths can be used to further improve encoding efficiency if needed.

Target bitrate and CRF – Meta suggests a target bitrate between 25-60 Mbps and, as mentioned, we strongly suggest using the H.265 codec to maintain high visual quality within that range. If the bitrate goes too far above the suggested maximum, viewers may experience slow playback or stalling due to device performance limitations.

Having said all that, it is worth mentioning that setting a proper bitrate to meet VQ expectations is really challenging, mainly because the necessary bitrate changes from one piece of content to another depending on its visual complexity. Because of that, we suggest using CRF-based encoding instead of a fixed bitrate. Specifically for H.265, we found that a CRF of 17-18 produces videos suitable for viewing on Quest headsets without excessively high bitrates.

Building 360-VR encoding workflows with Bitmovin VOD Encoding

Bitmovin’s VOD Encoding provides a set of highly flexible APIs for creating workflows that fully meet Meta Quest encoding requirements. For instance:

  • If adaptive bitrate streaming is required at the output, Bitmovin Per-Title encoding can be used to automatically create the ABR ladder with the top rendition driven by the desired CRF target.
  • If progressive file output is required, a traditional CRF encoding can be used by capping the bitrates properly.
  • Additionally, Bitmovin filters can be used to create monoscopic content from a stereoscopic input, for instance by cropping the original stereoscopic video to convert it from a top-and-bottom or side-by-side arrangement into a single view. Monoscopic outputs can be viewed on 2D displays, extending the reach of your 360 content beyond headsets.

Per-Title Encoding configuration for VR

The following Per-Title configuration may be used as a reference for encoding VR content. Depending on the content complexity, the output may include from 4 to 7 renditions, with the top rendition targeting a CRF value of 17.

perTitle: {
  h265Configuration: {
    minBitrate: 5000000,
    maxBitrate: 60000000,
    targetQualityCrf: 17,
    minBitrateStepSize: 1.5,
    maxBitrateStepSize: 2,
    codecMinBitrateFactor: 0.6,
    codecMaxBitrateFactor: 1.4,
    codecBufsizeFactor: 2,
    autoRepresentations: {
      adoptConfigurationThreshold: 0,
    },
  },
}

There are also full code samples here if you would like to dig deeper.

The same configuration can be used to encode any VR format, such as top-and-bottom, side-by-side or monoscopic 360 content. The Per-Title algorithm will automatically propose a proper bitrate and resolution for each VR format based on the input details. Additionally, it is strongly recommended to use VOD_HIGH_QUALITY as the encoding preset and THREE_PASS as the encoding mode. This ensures the Bitmovin Encoder delivers the best possible visual quality.
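
As a rough illustration of how this ties together in code, the sketch below starts an encoding with the Per-Title configuration from above in three-pass mode, using the @bitmovin/api-sdk JavaScript SDK. The exact class and field names are our best reconstruction and should be treated as assumptions; the full code samples linked above are the authoritative reference.

// Sketch only – class/field names are assumptions based on the Bitmovin Open API SDK.
const { default: BitmovinApi, EncodingMode, PerTitle, H265PerTitleConfiguration,
        AutoRepresentation, StartEncodingRequest } = require('@bitmovin/api-sdk');

const bitmovinApi = new BitmovinApi({ apiKey: 'YOUR-API-KEY' });

// VOD_HIGH_QUALITY is set as presetConfiguration on the H.265 codec configuration
// attached to the stream (not shown here); THREE_PASS is requested at start time.
async function startPerTitleVrEncoding(encodingId) {
  const startRequest = new StartEncodingRequest({
    encodingMode: EncodingMode.THREE_PASS,
    perTitle: new PerTitle({
      h265Configuration: new H265PerTitleConfiguration({
        targetQualityCrf: 17,
        minBitrate: 5000000,
        maxBitrate: 60000000,
        autoRepresentations: new AutoRepresentation({ adoptConfigurationThreshold: 0 }),
      }),
    }),
  });
  return bitmovinApi.encoding.encodings.start(encodingId, startRequest);
}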

In our tests using typical medium-high complexity content, we found that using a CRF of 17 produces good VQ for Meta Quest playback, with PSNR values above 48 dB and bitrates that are usually below the suggested maximum of 60 Mbps. 

Alternatively, traditional CRF encoding can be used instead of Per-title, for instance if only one rendition is desired at the output – with no ABR.
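
For that single-rendition case, a minimal codec configuration might look like the following sketch, again using the @bitmovin/api-sdk JavaScript SDK; field names such as crf, maxBitrate and presetConfiguration are assumptions to be verified against the API reference.

// Sketch of a single-rendition, CRF-driven H.265 configuration (names assumed).
const { H265VideoConfiguration, PresetConfiguration } = require('@bitmovin/api-sdk');

const h265SingleRendition = new H265VideoConfiguration({
  name: 'vr-h265-crf17-single',
  crf: 17,                    // quality-driven instead of a fixed target bitrate
  maxBitrate: 60000000,       // cap at Meta's suggested 60 Mbps maximum
  bufsize: 120000000,         // 2x maxBitrate, mirroring codecBufsizeFactor above
  width: 3840,
  height: 3840,               // stereoscopic top-and-bottom input
  presetConfiguration: PresetConfiguration.VOD_HIGH_QUALITY,
});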

Creating monoscopic outputs from stereoscopic inputs

Usually, VR 360 cameras record content in stereoscopic format, in either top-and-bottom or side-by-side arrangements. However, depending on the use case, it may be necessary to convert the content from stereoscopic to monoscopic format. This is easily solved with the Bitmovin VOD Encoding API by applying a cropping filter that removes the required pixels or frame percentage from the stereoscopic content, i.e., removing the left/right or the top/bottom half of the input asset.

[Image: Top-Bottom Stereoscopic Format (source: Blender Foundation)]

For instance, the following JavaScript snippet removes the top half of a 3840 x 3840 stereoscopic input:

.....
.....
// Crop filter definition (CropFilter and StreamFilter come from the Bitmovin API SDK)
const cropFilterDefinition = new CropFilter({
  name: "stereo-to-mono-filter-example",
  left: 0,
  right: 0,
  bottom: 0,
  top: 1920, // remove the top 1920 px so only the bottom half remains
})

// Crop filter creation
const cropTopSideFilter = await bitmovinApi.encoding.filters.crop.create(cropFilterDefinition)

// Stream filter definition
const cropTopSideStreamFilter = new StreamFilter({
  id: cropTopSideFilter.id,
  position: 0,
})

// StreamFilter creation – replace <encoding.id> and <videoStream.id> with your encoding and stream IDs
await bitmovinApi.encoding.encodings.streams.filters.create(<encoding.id>, <videoStream.id>, [cropTopSideStreamFilter])

AV1 Codec Support on Meta Quest 3

In the recommended settings above, we strongly suggested using HEVC over H.264 because the newer-generation codec offers greater compression efficiency, which translates into bandwidth savings and a better quality of experience for users. With the Quest 3 you can now take advantage of AV1, an even newer codec that outperforms HEVC. On average, our testing has shown that you can maintain equivalent quality with around 30% lower bitrate using AV1. This will depend on the type of content you’re working with, so if you’re experimenting with AV1 for the Quest 3, choosing a bitrate ~25% lower than your HEVC encoding is a good place to start. DEOVR shared a 2900p sample .mp4 file encoded with AV1, but you can also create your own with a Bitmovin trial account.
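
If you already have an HEVC ladder, a simple way to pick AV1 starting points that follow the ~25% guidance above is to scale the existing bitrates down and tune from there; the values below are just examples:

// Derive AV1 starting bitrates from an existing HEVC ladder (example values, kbps).
const hevcLadderKbps = [60000, 40000, 25000];
const av1StartingPointsKbps = hevcLadderKbps.map((kbps) => Math.round(kbps * 0.75));

console.log(av1StartingPointsKbps); // [45000, 30000, 18750]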

Ready to start encoding your own 360 content for Meta Quest headsets? Sign up for a free trial and get going today! 

Related links:

Bitmovin Docs – Encoding Tutorials | Per-Title Configuration Options explained

Bitmovin Player 360 video demo

Bitmovin Receives Excellence in DASH Award for Tile-Based Streaming of VR and 360° Video

Tile-Based Streaming could be the future of VR and 360 video

Tile-Based Streaming is set to play a major role in delivering VR and 360 video to mainstream audiences by reducing bandwidth requirements, reducing costs and vastly increasing accessibility.

Bitmovin engineers and co-founders Mario Graf (@grafmar_io), Christian Timmerer (@timse7), and Christopher Mueller (@chris_bitmovin) have been awarded the Excellence in DASH Award at ACM Multimedia Systems 2017 in Taipei, Taiwan.

The Bitmovin team received the third-place award for their paper “Towards Bandwidth Efficient Adaptive Streaming of Omnidirectional Video over HTTP: Design, Implementation, and Evaluation” [PDF], [Session Slides]. In the paper, the researchers analyze adaptive bitrate streaming of VR and 360-degree video over HTTP and describe the use of tiles, as specified within modern video codecs such as HEVC/H.265 and VP9, to achieve bitrate savings of 40-65%.
These findings establish a baseline for advanced streaming techniques for immersive video such as VR and 360-degree video, including real-life applications and an outline of the research roadmap. Bitmovin is committed to shaping the future of online video and building streaming solutions for commercial applications that enhance the end-user experience and reduce friction for video developers. You can learn more about Bitmovin's end-to-end support for immersive video in this tutorial – VR & 360° Video and Adaptive Bitrate Streaming.

The award is established by the DASH Industry Forum. The DASH-IF “creates interoperability guidelines on the usage of the MPEG-DASH streaming standard, promotes and catalyze the adoption of MPEG-DASH and help transition it from a specification into a real business. It consists of the major streaming and media companies, such as Microsoft, Netflix, Google, Ericsson, Samsung and Adobe.” Bitmovin was among the first to deploy DASH in accordance with the DASH-IF open standard guidelines. We are an active member of the DASH-IF and actively contribute research and testing data to the DASH community.

What is Tile-Based Streaming and Why Does it Matter?

The nature of 360 video creates much larger file sizes, simply due to the extra pixels required for a spherical image. But many of the pixels delivered in the video stream are outside of the viewport and are never seen. This causes unnecessarily high bandwidth requirements and CDN costs.
Tile-based streaming is a technique that solves this problem by breaking a 360° video into “tiles” and streaming the highest quality only for the visible sections of the video, while using lower-quality (smaller) files for the unseen tiles. As a result, tile-based streaming saves bandwidth and reduces CDN costs.
This technique will be among the next major innovations in 360 and VR video, offering huge cost savings and quality improvements, and Bitmovin is leading the way towards making this technology available for commercial applications.
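
As a simplified illustration of the core idea, the JavaScript sketch below picks a quality per tile based on whether the tile overlaps the viewer's current viewport; the tile count, 90° field of view and quality labels are assumptions made purely for this example and do not reflect any particular implementation:

// Viewport-adaptive tile selection (simplified, yaw only).
const TILES = 8;                 // tiles across the 360° equirectangular frame
const TILE_WIDTH = 360 / TILES;  // yaw degrees covered by each tile
const VIEWPORT_FOV = 90;         // assumed horizontal field of view of the headset

// Shortest angular distance between two yaw angles, in degrees.
function angularDistance(a, b) {
  const d = ((a - b) % 360 + 360) % 360;
  return Math.min(d, 360 - d);
}

// Returns the quality to request for each tile given the viewer's current yaw.
function selectTileQualities(viewerYaw) {
  return Array.from({ length: TILES }, (_, i) => {
    const tileCenter = i * TILE_WIDTH + TILE_WIDTH / 2;
    const visible =
      angularDistance(tileCenter, viewerYaw) <= (VIEWPORT_FOV + TILE_WIDTH) / 2;
    return { tile: i, quality: visible ? 'high' : 'low' };
  });
}

console.log(selectTileQualities(0)); // facing yaw 0°: nearby tiles high, the rest low
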
To see tile-based streaming in action, request a demo with our video solutions experts.


360 degree (live) adaptive streaming with RICOH THETA S and Bitmovin

Recently I got the RICOH THETA S 360-degree camera and asked myself how to set up a (live) adaptive streaming session using Bitmovin cloud encoding and the HTML5 player. I quickly found some general guidelines on the internet, but before giving step-by-step instructions there are a few things to consider:

  • Update the firmware of your RICOH THETA S by downloading the basic app, starting it (while the camera is connected via USB), going to File -> Firmware Update… and following the steps on the screen. It’s pretty easy and mine got updated from v1.11 to v1.82.
  • Decide on a storage solution for the files generated by the Bitmovin cloud encoding; possible options are FTP, Amazon S3, Google Cloud Storage, and Dropbox. I used Amazon S3 for this setup, which provides a bucket name, “AWS Access Key”, and “AWS Secret Key”.
  • Set up a basic website and make sure it works with the Bitmovin HTML5 player for video on demand, with the content hosted on the previously selected storage solution (i.e., avoid any CORS issues). In my setup I used WordPress and the Bitmovin WordPress plugin, which makes it very easy…

Step 1: Follow steps 1-4 from here.

Follow steps 1-4 from the general guidelines. Basically, install the live-streaming app, register the device, and install/configure OBS. Enable live streaming on the RICOH THETA S and, within OBS, use the “Custom Streaming Server” option in the “Stream” settings. That connects the RICOH THETA S with OBS on your local computer. The next step is forwarding this stream to the Bitmovin cloud encoding service for DASH/HLS streaming.

Step 2: Create a new Bitmovin Output

  1. Login to the Bitmovin portal and go to Encoding -> Outputs -> Create Output
  2. Select Amazon S3 and use any “Output Profile name”, e.g., ricoh-livestream-test
  3. Enter the name of your Bucket from Amazon S3
  4. The prefix is not needed
  5. Select any “Host-Region” (preferably one close to where you are)
  6. Enter the ”AWS Access Key” and the “AWS Secret Key” from Amazon S3
  7. Make sure the “Create Public S3 URLs” checkbox is enabled

An example screenshot is shown below.
[Screenshot: Amazon S3 output configuration]
Finally, click the “+” sign to create the output. If everything is correct, the output will be created; otherwise an error message will be shown. In that case, make sure the bucket name and keys match what was provided when creating the bucket on Amazon S3.
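
If you prefer to script this step rather than clicking through the portal, the same output can be created with the Bitmovin API SDK. The sketch below is a rough equivalent in JavaScript; the SDK class and method names are assumptions based on today's @bitmovin/api-sdk and should be checked against the API reference.

// Hedged sketch: creating the same Amazon S3 output via the @bitmovin/api-sdk
// JavaScript SDK instead of the portal. Class and method names are assumptions
// and may differ from the portal workflow shown above.
const { default: BitmovinApi, S3Output } = require('@bitmovin/api-sdk');

const bitmovinApi = new BitmovinApi({ apiKey: 'YOUR-API-KEY' });

async function createS3Output() {
  return bitmovinApi.encoding.outputs.s3.create(new S3Output({
    name: 'ricoh-livestream-test',
    bucketName: 'YOUR-BUCKET-NAME',
    accessKey: 'YOUR-AWS-ACCESS-KEY',
    secretKey: 'YOUR-AWS-SECRET-KEY',
    // Public read access (the "Create Public S3 URLs" checkbox) is configured separately.
  }));
}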

Step 3: Create a new Bitmovin Livestream

  1. Login to the Bitmovin portal and go to Live (beta) -> Create Livestream
  2. Select “Encoding-Profile”: bitcodin fullHD is sufficient (4K not needed as the device provides only fullHD)
  3. Select “Output-Profile”: select the output you’ve created in previous step (ricoh-livestream-test)
  4. Add a “Livestream-Name” (any string works here), e.g., ricoh-livestream-test
  5. Add a “Stream-Key” (any string works here), e.g., ricohlivestreamtest
  6. Click “Create Live Stream”, an “Important Notice” shows up & click “Create Live Stream”
  7. Wait for the RTMP PUSH URL to be used in OBS (this could take some time; you can reload the page or go to the “Overview”)

An example screenshot is shown below which displays the RTMP PUSH URL, Stream Key, MPD URL, and HLS URL to be used in the next steps.
[Screenshot: livestream overview showing the RTMP PUSH URL, Stream Key, MPD URL and HLS URL]

Step 4: Start Streaming in OBS

  1. Go to OBS -> Settings
  2. In section “Stream”, select “Custom Streaming Server”
  3. Enter the RTMP PUSH URL from Bitmovin in the “URL” field of OBS
  4. Enter the Stream Key from Bitmovin in the “Stream key” field of OBS
  5. Click “OK” and then click “Start Streaming” in OBS

An example screenshot is shown below and if everything works fine OBS will stream to the Bitmovin cloud encoding service.

[Screenshot: OBS stream settings]

Step 5: Setup the HTML5 Player

Basically, follow the instructions here; in my case I simply used WordPress and the Bitmovin WordPress plugin. In that case…

  1. Within WordPress, create a post or page and go to the Bitmovin WP plugin
  2. Select “Add New Video”
  3. Enter any name/title of the new video
  4. In the “Video” section, enter the “DASH URL” and “HLS URL” from the Bitmovin livestream provided in step 3 (i.e., the MPD URL and the HLS URL)
  5. In the “Player” section, select latest stable (in my case this was latest version 7)
  6. In the “VR” section, select startup mode “2d” and leave the rest as is

An example screenshot is shown below.

[Screenshot: Bitmovin WordPress plugin video settings]

Finally, click “Publish” in WordPress, which gives you a shortcode to copy/paste into your site or post, and you’re done…!
A similar approach can be used for video-on-demand content, but in that case you don’t need OBS, as you simply encode your content using the Bitmovin cloud encoding and use the HTML5 player for the actual streaming.
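
If you are not using WordPress, a minimal embed with the Bitmovin HTML5 player (version 7, as used above) might look like the sketch below. The player key, manifest URLs and especially the shape of the vr options are placeholders and assumptions; the player documentation linked in step 5 is the authoritative source.

// Sketch only – assumes a <div id="player-container"> on the page and the
// Bitmovin player v7 script already included; the vr option keys are assumptions.
var player = bitmovin.player('player-container');

player.setup({
  key: 'YOUR-PLAYER-KEY',
  source: {
    dash: 'https://<your-storage>/path/to/stream.mpd', // MPD URL from step 3
    hls: 'https://<your-storage>/path/to/stream.m3u8', // HLS URL from step 3
    vr: {
      startupMode: '2d' // mirrors the "2d" startup mode chosen in the WP plugin
    }
  }
}).then(function () {
  console.log('360° live stream loaded');
});
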

Build a Virtual Reality Website for HTML5 Browsers

Virtual Reality in HTML5 browsers

Thanks to MOZVR and their A-Frame JavaScript framework, you can now create a completely immersive web experience in standard web browsers.

Usually when you think Virtual Reality or 360° you expect to open a video or some sort of software application. Thanks to MozVR’s A-FRAME, the internet is now available in Virtual Reality format, which means you can enter a virtual reality environment simply by opening a normal website.

An HTML5/JavaScript VR Environment

Imagine creating a virtual cinema for your video portal. Allow users to “walk” through the foyer, browsing film posters and viewing trailers right on the wall. Once the user selects a film to watch, she can walk into the cinema, maybe even choose the best seat, before sitting down to watch the film on an enormous cinema screen.
Obviously this sort of immersive experience is easy enough to create (if you have the skills) in a host of Virtual Reality software environments and computer game frameworks, but up until now, not in a normal web browser.
MozVR and their new A-Frame framework are changing all that. It is now relatively simple to create a three-dimensional environment that loads directly into the user’s browser without any interruption to their experience: a smooth transition from online browsing to VR.
Looking ahead a little further, the steady push towards HTML5 compliance from all the major browsers means it is becoming relatively easy to embed an HTML5 VR player or an HTML5 360 video player within the HTML5 VR environment. Theoretically you could embed VR movies and experiences within your web-based VR cinema environment and create a browser-based “Virtual VR cinema.”

VR Browser Demo

We have created a very simple demonstration to show how easy it is to get started with Virtual Reality in HTML5 and JavaScript. The demo contains fewer than 40 lines of code in one HTML file, linking to the a-frame.js JavaScript file and two images. You can enter the VR environment simply by following the link below, and control your position and view using your mouse and the W, A, S and D keys on your keyboard.
Virtual reality technology

Getting Started with Browser Based Virtual Reality Technology

A-Frame is a JavaScript framework for building three-dimensional environments within the browser. It wraps three.js and WebGL into HTML elements, creating an insanely easy-to-use framework that lets you build a three-dimensional web space within minutes.
A scene like the demonstration above can be created in a few minutes. The first step is to create a basic HTML file and include the JavaScript plugin in the head of your HTML document, the same way you would include jQuery or any other JavaScript file.

<script src="https://aframe.io/releases/0.2.0/aframe.min.js"></script>

Creating the 3D Scene

Once that is in place you can create the 3D scene using the a-scene tags. Within this tag, you can create elements, set up your camera positioning and define the way the background looks. In my simple demonstration I created a sky, four shapes, some simple animations and activated the cursor so that the user can trigger the animations using a hover event.
The first section of my code defines some assets, namely the Bitmovin logo (logo.png) and the 360° background image (image.jpg). I assign IDs to these assets so that I can reference them later.

<a-assets>
<img id="logo" src="logo.png">
<img id="bg" src="image.jpg">
</a-assets>

With the assets in place, creating the background is as easy as using the a-sky tag.

<a-sky src="#bg"></a-sky>

There are also a number of other primitive shapes available.

Spheres, Boxes, Cones and Other Shapes

The logo shapes are slightly more complicated, but still very quick to set up. Each shape is treated as an element with a range of configurable attributes. Using the “a-box” tag I can create a square or rectangular shape and control its appearance and behavior by adding various attributes, such as height and width. The “a-animation” tag, set within the “a-box” tag, allows me to assign an animation to that particular element. The other three shapes are controlled in exactly the same way, but use the tags a-sphere, a-cone and a-torus.

<a-box color="#ffffff" width="6" height="4" depth="4" position="0 0 -5" rotation="40 50 10" scale="0.5 0.5 0.5" src="#logo">
<a-animation attribute="rotation" begin="hovered" repeat="0" to="270 360 360"></a-animation>
</a-box>

You will notice that many of the attributes contain three values, basically one for each dimension: the X, Y and Z axes. Rather than overthinking it, I found that the best way to get most of the settings right was trial and error, especially when working out things like how a box should rotate or where each of the boxes should be positioned.
One of the really fun things about A-Frame is that the 3D environment is available by default. As soon as you have a scene up and running you will be able to “walk” around it and turn your view in all directions, controlling your scene the same way you control a first-person computer game. The primitives mentioned above allow you to start building the environment immediately. A-Frame also offers an impressive set of demos and a very well organised documentation section to help you get up and running quickly.
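
If you prefer to experiment from JavaScript rather than by editing attribute strings, A-Frame entities are regular DOM elements, so you can tweak the same three-value components programmatically; the selector and numbers below are just examples:

// Adjust an entity's position and rotation from JavaScript while experimenting.
var box = document.querySelector('a-box');
box.setAttribute('position', { x: 0, y: 0, z: -5 });   // X, Y and Z axes
box.setAttribute('rotation', { x: 40, y: 50, z: 10 }); // degrees around each axis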

The Live Code

This is the final HTML page as it sits on the site. It pulls the A-Frame JavaScript directly from the aframe.io server, and the only extra files required are the two images.

<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<title>Bitmovin</title>
<meta name="description" content="Bitmovin 3D World Demo">
<script src="https://aframe.io/releases/0.2.0/aframe.min.js"></script>
</head>
<body>
<a-scene>
<a-assets>
<img id="logo" src="logo.png">
<img id="bg" src="image.jpg">
</a-assets>
<a-sky src="#bg"></a-sky>
<a-box color="#ffffff" width="6" height="4" depth="4" position="0 0 -8" rotation="40 50 10" scale="0.5 0.5 0.5" src="#logo">
<a-animation attribute="rotation" begin="hovered" repeat="0" to="270 360 360"></a-animation>
</a-box>
<a-sphere color="#ffffff" width="6" height="4" depth="4" position="5 0 2" rotation="40 50 10" scale="0.5 0.5 0.5" src="#logo">
<a-animation attribute="rotation" begin="hovered" repeat="0" to="270 360 360"></a-animation>
</a-sphere>
<a-cone color="#ffffff" width="6" height="4" depth="4" position="-5 0 2" rotation="40 50 10" scale="0.5 0.5 0.5" src="#logo">
<a-animation attribute="rotation" begin="hovered" repeat="0" to="360 180 360"></a-animation>
</a-cone>
<a-torus color="#ffffff" width="6" height="4" depth="4" position="0 0 6" rotation="40 50 10" scale="0.5 0.5 0.5" src="#logo">
<a-animation attribute="rotation" begin="hovered" repeat="0" to="360 180 360"></a-animation>
</a-torus>
<a-camera position="0 0 0">
<a-cursor color="#ccc"></a-cursor>
</a-camera>
</a-scene>
</body>
</html>

A good way to get an idea of what this framework is capable of is the A-Frame blog, where you can follow their weekly progress as well as some of the best virtual reality websites that have been produced using A-Frame. We have also created a short list of our favorite A-Frame VR examples and listed them below.

5 of the Best VR Websites Built with A-Frame

A-frame example 1

Where is Piers Morgan disliked the most?

This is a humorous 3D environment showing a map of the UK with a three-dimensional graph of the number of “dislikes” for Piers Morgan (a British television personality) per city. Fly over the country to get a visual impression of this celebrity’s popularity levels.


A-frame example 2

360 Syria – Fear of the Sky

This website, built for Amnesty International, uses A-Frame to present a confronting 360° view of life as a Syrian living in a city ravaged by war. The presentation and use of video in this example is very well done and creates a very powerful message. The video is delivered directly through standard HTML5 <video> tags in MP4 format.


A-frame example 4

Mars, an interactive journey

This is a fairly simple 360° view of Mars, including a model of the Curiosity rover and a series of informational images with accompanying audio.


A-frame example 5

VR Wiki Museum

This is a very interesting attempt to create a virtual museum version of Wikipedia. Walk through the halls of the Wiki Museum, see three-dimensional statues, read information and look at photographs on the walls. Walking through doors takes you to new areas of the website.

Bitmovin and the HTML5 VR Player

Virtual Reality is a fast-growing space. That is why Bitmovin is spending a lot of time making sure that both our HTML5 Adaptive Streaming Player and our Cloud Encoding Service are completely ready to handle 360° video streaming and video within VR environments. We are constantly working on solutions to deliver video everywhere, browser-based VR included. Our experiments with A-Frame and other browser-based virtual reality technologies are just beginning. Look out for our follow-up articles on VR in the browser and the HTML5 VR Player.
Get access to our 360° video enabled HTML5 Player or Encoding service with a free account today!
