joycefarrell edited this page May 15, 2018 · 4 revisions

iset360 is a project that uses open-source software to simulate camera arrays for virtual and augmented reality applications. More specifically, it uses physically based ray tracing to simulate a 3D virtual spectral scene, and it traces these rays through multi-element spherical lenses to calculate the irradiance at the imaging sensor. The software then simulates imaging sensors to predict the captured images. The sensor data can be processed to produce the stereo and monoscopic 360° panoramas commonly used in virtual reality applications. By simulating the entire capture pipeline, we can visualize how changes in the system components influence the system performance.

iset360 uses code that is available in two other Github repositories:

  • iset3d (see https://github.com/ISET/iset3d/wiki) includes PBRT code that has been augmented to calculate the irradiance at the sensor as light travels from the 3D scene, through the lens, and onto the sensor surface. We augmented the PBRT code to return multispectral images, model lens diffraction and simulate light fields. To promote platform independent sharing, the augmented PBRT code is compiled into a machine-independent Docker container.

  • isetcam (see https://github.com/ISET/isetcam/wiki) includes code that converts the sensor irradiance into the expected image (rgb) data. The Matlab code in isetcam models the geometric, colorimetric, and electrical properties of the pixels and sensor.

This repository (iset360) contains a small library of pre-converted and formatted scenes. The scene files describe the size and distance of objects in meters, material properties such as image textures and BRDFs, and the placement and type of lights. In addition, one can specify the spectral power distribution of the lights, as well as the spectral reflectance of objects. Many of these values can be changed programmatically once the scene has been imported into iset3d.

In the expected usage, the user controls the iset360 simulation pipeline using a Matlab script. Typically, the user imports the PBRT scene data using the command recipe = piRead(sceneFile). The return is a Matlab object (recipe) whose parameters specify the scene data and camera settings. We loop through each camera in the rig, setting its position, lens and sensor parameters, before calculating its sensor irradiance with irradiance = piRender(recipe) (units: photons/s/nm/m²).
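The loop described above can be sketched in Matlab as follows. This is an illustrative outline, not a verbatim iset360 script: the scene path, the six-camera ring geometry, and the exact recipe fields (e.g., recipe.lookAt) are assumptions, and the precise field names may differ across iset3d versions.

```matlab
% Sketch of the iset360 capture loop (assumes iset3d/iset360 on the path).
sceneFile = fullfile(piRootPath, 'data', 'myScene', 'myScene.pbrt');  % hypothetical scene
recipe = piRead(sceneFile);          % import PBRT scene into a Matlab recipe

nCameras = 6;                        % illustrative rig with six cameras on a ring
for ii = 1:nCameras
    theta = 2*pi*(ii-1)/nCameras;
    % Place each camera on a 10 cm circle, looking outward (assumed geometry)
    recipe.lookAt.from = [0.1*cos(theta), 0, 0.1*sin(theta)];
    recipe.lookAt.to   = recipe.lookAt.from + [cos(theta), 0, sin(theta)];
    recipe.lookAt.up   = [0 1 0];

    oi = piRender(recipe);           % sensor irradiance for this camera position
end
```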

A large number of parameters can be specified by adjusting the iset360 recipe and the isetcam sensor parameters. These range from scene properties (e.g., lighting, materials, geometry), to camera array properties (positions, lens prescriptions), to sensor properties (e.g., pixel size, color filter array, sensor and pixel noise).

For example, we specify a camera position and viewing direction using a LookAt parameterization which describes an origin (from), a target (to) and an up direction (up). We describe multi-element lenses by their component positions, curvatures, thicknesses, diameters, and wavelength-dependent indices of refraction.
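To make the lens description concrete, a multi-element lens prescription is typically stored as one row per surface, read from front to back. The example below is a hypothetical two-surface (single biconvex element) prescription; the exact column order and file format (tab-separated .dat vs. JSON) varies across PBRT versions and iset3d releases, so check the lens files shipped with iset360 before authoring your own.

```
# radius(mm)  thickness(mm)  index-of-refraction  semi-diameter(mm)
  35.98        1.21           1.54                  9.0
 -35.98        40.00          1.00                  9.0
```

A positive radius curves toward the scene, a negative radius away from it; the final thickness entry gives the distance from the last surface to the sensor plane.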

iset360 also includes a small library of spherical lenses, and additional ones can be added when the lens prescription is known. Within the iset360 camera structure, we specify the aperture, diameter and the position of the sensor. Within the isetcam structures we specify the color filter array, pixel sizes, and electrical noise.
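A minimal sketch of the isetcam side of the pipeline is shown below, assuming isetcam is on the Matlab path and that `oi` holds the irradiance returned by piRender. The specific pixel size, and the choice of the RGGB Bayer pattern, are illustrative values, not iset360 defaults.

```matlab
% Sketch: convert sensor irradiance (oi) into rgb image data with isetcam.
sensor = sensorCreate('bayer (rggb)');                 % RGGB color filter array
sensor = sensorSet(sensor, 'pixel size constant fill factor', 1.4e-6);  % 1.4 um pixels (example)
sensor = sensorCompute(sensor, oi);                    % simulate capture, including noise

ip  = ipCreate;                                        % image-processing pipeline object
ip  = ipCompute(ip, sensor);                           % demosaic and color-process
rgb = ipGet(ip, 'result');                             % processed rgb image data
```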
