• Home
  • Knowledge Base
    • Licensing New (VIOSO 6, EXAPLAY)
    • Operation
    • Quality Improvement
    • References
    • Licensing Old (Anyblend, Player)
  • Documentation
    • System Preparation
    • VIOSO 6
    • VIOSO 6 Integration
    • EXAPLAY
    • VIOSO Core 5
    • VIOSO Anyblend 5
    • VIOSO Anyblend VR&SIM 5
    • VIOSO Integrate 5
    • VIOSO Player 2
  • FAQ
    • Error and other feedback
    • Licensing
    • System & Requirements
  • Videos

VIOSO Anyblend VR&SIM 5

  • VIOSO Anyblend VR&SIM Software Overview
  • 3D calibration
    • 3D model creation
    • 3D Alignment (MRD Adjustment)
    • Re-calculate calibration blending (optional)
  • Multi-client Calibration
    • Method 1: Abstract Displays
    • Method 2: Legacy
  • Multi-Camera Calibration
    • Method 1: 3D Alignment based multi-cam
    • Method 2: Marker based multi-cam
  • Intrinsic and Positions
  • Content space management
  • Content space transformation
  • Observer Conversion for Static Eye-Point
  • Dynamic eye-point correction
  • Calibration Export
  • Anyblend VR&SIM Examples
    • Export for Barco WB2560 (MIPS)
    • Calibration of a partial dome screen with an off-centre camera

VIOSO Anyblend VR&SIM Software Overview


Emanuel
March 26, 2020

VIOSO Anyblend VR & SIM is an extension of VIOSO Anyblend that enables you to blend multiple views created by image generators (IGs), whether the views are rendered by one client machine, spread across multiple client machines, or a mix of both.

The software covers the full workflow: calibrating and blending the projection, applying the calibration in different ways, and playing back content on the calibrated projection based on a 3D workflow.

Table of Contents

  • Extensions added
    • Use Cases
  • Main concepts of Anyblend VR&SIM
    • Cluster calibration
    • Camera to screen
    • Projector pose to screen
    • Content Spaces
    • Projection mapping
    • Virtual cameras

Extensions added

  • Cluster calibration
  • Mapping conversions: create texture maps from a camera scan and a screen model.
  • Model-based warp: create warp and blend from a model and projector intrinsic parameters without a camera.
  • Calculation of viewports according to screen and eye point: create mappings for every IG channel to the screen.
  • Templated viewport export: export a settings file that can be included in an IG configuration for automatic viewport setup.
  • 3D Mapping: create sweet-spot-independent warp maps for head-tracked CAVE™ solutions.
  • Multi-camera calibration: use fast re-calibration on screens that one camera can’t see all at once.
  • Master export: collect all warp/blend maps on the master machine and copy them to the appropriate location on each client for VIOSO-plugin-enabled IGs.

Use Cases

  • Simulator setup on half dome, cylindrical panorama, or irregular screens.
  • Projective Interior (virtual museum).
  • Façade projections.
  • Tracked CAVE™ setups.
  • Projection mapping, augmented reality using textures projected on surfaces.

Main concepts of Anyblend VR&SIM

Cluster calibration

Use one or more cameras to calibrate a projection system that spans several computers. In most simulator or CAVE setups, there are several renderers with single or multiple outputs.

To calibrate this kind of system, we use an Ethernet connection between the machines to calibrate them as a cluster. It is important that:

  • Every projector is “seen” completely by one camera.
  • The camera used to calibrate a projector is connected to the PC driving that projector. Use network camera(s) if one camera must serve multiple PCs.

Camera to screen

Like every camera-based projector calibration, the basis of all calculations is a camera scan, which registers every projector pixel to the camera. With a few extra parameters, this scan enables us to register the screen to the camera and the projectors to the actual screen.

What benefits arise from that? Well, if you know every projector pixel’s 3D coordinate in the real world, you can calculate viewports, warp and blend maps, and eye-point-dependent transformations.
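To make the idea concrete, here is a minimal sketch (illustrative only, not VIOSO’s actual implementation): assuming the screen is a dome modelled as a sphere centred on the camera, each projector pixel’s camera coordinate from the scan can be lifted to a 3D point by intersecting the camera ray with the screen model. The camera intrinsics and pixel correspondences below are made-up values.

```python
import numpy as np

def camera_ray(u, v, fx, fy, cx, cy):
    """Back-project a camera pixel (u, v) to a unit direction ray
    using a simple pinhole model."""
    d = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    return d / np.linalg.norm(d)

def lift_to_dome(u, v, fx=800.0, fy=800.0, cx=320.0, cy=240.0, radius=1.0):
    """3D point where the camera ray through (u, v) hits a spherical dome
    of the given radius centred on the camera position (origin)."""
    d = camera_ray(u, v, fx, fy, cx, cy)
    # Camera at the sphere centre: the hit point is simply radius * direction.
    return radius * d

# Hypothetical projector-pixel -> camera-pixel correspondences from a scan:
proj_to_cam = {(0, 0): (320.0, 240.0), (1920, 1080): (500.0, 400.0)}

# Every projector pixel now gets a 3D coordinate on the screen model.
points = {p: lift_to_dome(*c) for p, c in proj_to_cam.items()}
```

With every projector pixel placed on the screen in 3D like this, viewports, warp maps, and eye-point transformations become straightforward geometry.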

Projector pose to screen

This is the opposite of a camera scan. What if we already know where every projector pixel points? We can take advantage of that: place a 3D model in the ray fields of the projectors and calculate warping and blending from it. So all features known from Anyblend can be applied to a screen, or to any other surface, without any cameras!
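The camera-less case can be sketched the same way (a hypothetical example, not the product’s internal code): with known projector intrinsics and pose, each projector pixel defines a ray, and intersecting that ray with the screen model yields the pixel’s 3D hit point, from which warping can be derived. Here the screen model is a flat wall at z = 0 and the projector sits two metres in front of it; all numbers are illustrative.

```python
import numpy as np

def ray_plane(origin, direction, plane_point, plane_normal):
    """Intersection point of a ray with a plane."""
    t = np.dot(plane_point - origin, plane_normal) / np.dot(direction, plane_normal)
    return origin + t * direction

# Assumed projector pose and pinhole intrinsics (illustrative values):
proj_origin = np.array([0.0, 0.0, -2.0])   # projector 2 m in front of the wall
fx = fy = 1000.0
cx, cy = 960.0, 540.0                      # principal point of a 1920x1080 output

def pixel_hit(u, v):
    """3D point on the screen (plane z = 0) hit by projector pixel (u, v)."""
    d = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    d /= np.linalg.norm(d)
    return ray_plane(proj_origin, d, np.array([0.0, 0.0, 0.0]),
                     np.array([0.0, 0.0, 1.0]))

centre = pixel_hit(960.0, 540.0)   # the principal ray hits the screen centre
```

Doing this for every pixel of every projector produces the same per-pixel 3D registration a camera scan would, which is why the usual warping and blending features still apply.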

Content Spaces

A content space is the image to be projected. There can be several content spaces in one setup. Content spaces include texture maps and viewports.

In a basic projection setup, the content space is the camera image plus the warping grid. Once the setup is done, all projectors are blended and a rectangular content image is back-projected through all projectors. This is the standard case for all flat-surface projections where the camera is positioned where the audience is or will be. You can also adapt the projection using our virtual canvas warp grid (VC) to match slightly curved screens, or move the viewer’s position by adding or removing keystone characteristics.

But what if that is not enough? Content spaces enable you to transform a mapping to fit all kinds of special purposes.
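As a small illustration of such a transformation (a sketch under assumed conventions, not the VC implementation): applying a 2D homography to the rectangular content space shifts the implied viewing position, which is one way keystone can be added or removed after calibration. The matrix below is a made-up example.

```python
import numpy as np

def apply_homography(H, x, y):
    """Map a normalized content coordinate (x, y) through homography H."""
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]

# The identity leaves the content space untouched:
I = np.eye(3)

# A mild keystone: one edge of the content rectangle is squeezed
# relative to the opposite edge (made-up coefficient 0.3).
K = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.3, 1.0]])

corners = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
warped = [apply_homography(K, x, y) for x, y in corners]
```

Composing a transform like this with the existing calibration is conceptually what a content-space transformation does: the warp maps stay valid while the content is re-mapped.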

Projection mapping

How can I project something onto a specified area of the screen? This technique is called “projection mapping”: an image, or “texture”, is warped so that it projects correctly onto the screen surface. The method is similar to texturing 3D objects in virtual worlds.

For that we need a model of the screen with texture coordinates for every surface. Once we have matched the model with the camera and the projectors, we can read the texture coordinate from the 3D surface, sample the image at that coordinate, and project the result back onto the real surface.
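The chain described above (projector pixel to surface point, surface point to texture coordinate, texture coordinate to colour) can be sketched as follows. The lookup table and texture are hypothetical stand-ins for the data a real calibration would produce.

```python
import numpy as np

def sample_nearest(texture, u, v):
    """Nearest-neighbour sample of an (H, W, 3) texture at normalized (u, v)."""
    h, w = texture.shape[:2]
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return texture[y, x]

# Made-up mapping: projector pixel -> texture coordinate on the screen model.
pixel_to_uv = {(100, 50): (0.25, 0.75), (101, 50): (0.26, 0.75)}

# A tiny content texture with one red texel at uv ~ (0.25, 0.75):
texture = np.zeros((64, 64, 3), dtype=np.uint8)
texture[48, 16] = (255, 0, 0)

# The projector output is built by sampling the texture per pixel.
frame = {p: sample_nearest(texture, u, v) for p, (u, v) in pixel_to_uv.items()}
```

A real pipeline would do this on the GPU for every pixel with filtered sampling, but the principle is exactly this per-pixel texture lookup.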

Virtual cameras

Virtual cameras are viewports into virtual-reality environments. They are defined by an eye position, view direction, rotation, and fields of view relative to the virtual world.

Our projection screen can be seen as a window into a virtual world. The first step is to match the screen with the simulated environment. This is done by matching a virtual screen model to the actual screen (an augmented-reality approach). In other words, if we can project a virtual screen onto the real screen and they match, we know which projector pixel is affected by each viewport’s virtual camera image.

Then we can create a mapping that transforms a viewport’s image so it can be displayed by a projector.
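For illustration, a virtual camera as described above could be assembled like this: an eye position, view direction, and field of view define a view-projection transform that maps world points into a viewport. The conventions and numbers here are assumptions for the sketch, not an exported IG configuration.

```python
import numpy as np

def look_at(eye, target, up):
    """View matrix for a camera at `eye` looking toward `target`."""
    f = target - eye; f = f / np.linalg.norm(f)     # forward
    r = np.cross(f, up); r = r / np.linalg.norm(r)  # right
    u = np.cross(r, f)                              # true up
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = r, u, -f
    view[:3, 3] = -view[:3, :3] @ eye
    return view

def perspective(fov_y_deg, aspect, near, far):
    """OpenGL-style perspective projection from a vertical field of view."""
    t = 1.0 / np.tan(np.radians(fov_y_deg) / 2.0)
    m = np.zeros((4, 4))
    m[0, 0] = t / aspect
    m[1, 1] = t
    m[2, 2] = (far + near) / (near - far)
    m[2, 3] = 2 * far * near / (near - far)
    m[3, 2] = -1.0
    return m

# Assumed eye point 5 m back from the world origin, 60-degree vertical FOV:
eye = np.array([0.0, 0.0, 5.0])
vp = perspective(60.0, 16 / 9, 0.1, 100.0) @ look_at(
    eye, np.array([0.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]))

p = vp @ np.array([0.0, 0.0, 0.0, 1.0])   # project the world origin
ndc = p[:3] / p[3]                        # it lands at the viewport centre
```

Once the screen model is matched to the real screen, a transform like this per viewport, combined with the per-pixel screen registration, is what lets each projector pixel pull the right sample from the right virtual camera image.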


  • © 2020-now VIOSO GmbH. All Rights Reserved.
