1. MCAM Usage Overview

The two main objects in the owl module are the MCAM class, which is used to control the instrument itself, and the mcam_dataset, which holds the raw image data as well as the metadata associated with the acquisition.

MCAM Instrument Object

To start interacting with the MCAM, a connection must first be established with the instrument:

from owl.instruments import MCAM
mcam = MCAM()

The call to the constructor MCAM() will attempt to connect to an MCAM instrument if one is found.

To take an image with the MCAM, execute the following code:

mcam.acquire_full_field_of_view()

The acquired data will be stored in the dataset property of the mcam object. The dataset is formatted nicely in the Python terminal for rapid inspection.

>>> mcam.dataset
<xarray.Dataset>
Dimensions:                                      (image_x: 6, image_y: 9, reflection_illumination.led_number: 377, reflection_illumination.rgb: 3, reflection_illumination.yx: 2, transmission_illumination.led_number: 377, transmission_illumination.rgb: 3, transmission_illumination.yx: 2, x: 4096, y: 3120)
Coordinates:
  * reflection_illumination.led_number           (reflection_illumination.led_number) int64 ...
  * reflection_illumination.rgb                  (reflection_illumination.rgb) object ...
  * transmission_illumination.led_number         (transmission_illumination.led_number) int64 ...
  * transmission_illumination.rgb                (transmission_illumination.rgb) object ...
  * image_x                                      (image_x) int64 0 1 2 3 4 5
  * image_y                                      (image_y) int64 0 1 2 ... 7 8
  * y                                            (y) int64 0 1 2 ... 3118 3119
  * x                                            (x) int64 0 1 2 ... 4094 4095
  * transmission_illumination.yx                 (transmission_illumination.yx) <U1 ...
    transmission_illumination.led_positions      (transmission_illumination.led_number, transmission_illumination.yx) float64 ...
    transmission_illumination.chroma             (transmission_illumination.led_number) <U3 ...
  * reflection_illumination.yx                   (reflection_illumination.yx) <U1 ...
    reflection_illumination.led_positions        (reflection_illumination.led_number, reflection_illumination.yx) float64 ...
    reflection_illumination.chroma               (reflection_illumination.led_number) <U3 ...
    exif_orientation                             int64 8
Data variables:
    images                                       (image_y, image_x, y, x) uint8
    acquisition_count                            (image_y, image_x) int64 0
    trigger                                      (image_y, image_x) int64 0
    exposure                                     (image_y, image_x) float64
    bayer_pattern                                (image_y, image_x) <U4
    software_timestamp                           (image_y, image_x) datetime64[ns]
    digital_red_gain                             (image_y, image_x) float64
    digital_green1_gain                          (image_y, image_x) float64
    digital_blue_gain                            (image_y, image_x) float64
    digital_green2_gain                          (image_y, image_x) float64
    analog_gain                                  (image_y, image_x) float64
    digital_gain                                 (image_y, image_x) float64
    acquisition_index                            (image_y, image_x) int64
    latest_acquisition_index                     int64 0
    z_stage                                      float64 0.0
    transmission_illumination.state              (transmission_illumination.led_number, transmission_illumination.rgb) float64
    reflection_illumination.state                (reflection_illumination.led_number, reflection_illumination.rgb) float64

Details of the metadata available in the dataset structure can be found in MCAM Dataset. Briefly, the information contained in MCAM.dataset describes the following (a short access example follows this list):

  • The raw data from the image sensors: mcam.dataset['images'].

  • The sensor parameters that are used during the acquisition: see exposure and analog_gain for example.

  • The stage parameters used during the acquisition: see z_stage for example.

  • The illumination parameters used during the acquisition: see parameters starting with transmission_illumination.
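
As an illustration, individual entries can be pulled out of the dataset with standard xarray indexing (a minimal sketch based on the data variables shown in the output above):

>>> images = mcam.dataset['images']                   # raw sensor data with dimensions (image_y, image_x, y, x)
>>> single_frame = images.sel(image_y=0, image_x=0)   # one sensor's frame, selected by label
>>> mcam.dataset['exposure']                          # per-sensor exposure used during the acquisition
>>> mcam.dataset['transmission_illumination.state']   # LED state used during the acquisition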

To change the exposure, z-stage position, or LED settings, use the following code to start:

>>> mcam.exposure = 50E-3            # In units of seconds.
>>> mcam.z_stage.position = 10.1E-3  # In units of meters.
>>> mcam.transmission_illumination.color = mcam.white_light_transmission * 5
>>> mcam.transmission_illumination.fill_array()

More details about how to use the MCAM class can be found in subsequent chapters.

Saving Data

To save data from the MCAM, use the save method:

>>> from owl.instruments import MCAM
>>> mcam = MCAM()
>>> mcam.acquire_full_field_of_view()
>>> mcam.save('my_dataset')
PosixPath('my_dataset_20201230_141013_107.nc')

This will save the acquired data and metadata available in the dataset attribute of the MCAM in a single NetCDF4 file with a .nc extension.

The method will also automatically add a timestamp with microsecond precision to the filename to ensure that it does not overwrite any other dataset.
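
Since save returns the full timestamped path (a pathlib.Path in the example above), one convenient pattern, sketched here, is to keep that return value rather than reconstructing the filename later:

>>> saved_path = mcam.save('my_dataset')   # keep the returned, timestamped path
>>> saved_path.suffix                      # saved as a NetCDF4 file
'.nc'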

To read the dataset, use the load function available in the mcam_data submodule.

>>> from owl import mcam_data
>>> dataset = mcam_data.load('my_dataset_20201230_141013_107.nc')

The new dataset variable will contain the exact same data that was available in the mcam.dataset property of the MCAM instrument.
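
To check the round trip, the reloaded dataset can be compared against the one still held by the instrument with xarray's built-in comparison (a quick sketch; equals is part of the standard xarray.Dataset API):

>>> dataset.equals(mcam.dataset)   # expected to be True, per the statement above
True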

Computational Illumination Unit

Ramona Optics also provides a library to help with the custom control of its illumination technology. The Illumination units provided by Ramona Optics enable individual addressing of LEDs and control over their output intensity.

We believe that users familiar with the Python programming language should be able to learn the API by reading up on a few important entry points (a usage sketch follows this list):

  • The constructor: Illuminate

  • The property: Illuminate.led

  • The property: Illuminate.brightness

  • The property: Illuminate.color
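
As a minimal sketch of how these entry points fit together (the values and semantics assigned to led, brightness, and color below are assumptions for illustration, not documented behavior):

>>> from owl.instruments import Illuminate
>>> light = Illuminate()
>>> light.brightness = 0.5         # assumed: overall output intensity (scale not specified here)
>>> light.color = (1.0, 1.0, 1.0)  # assumed: an RGB triple selecting white light
>>> light.led = [0, 1, 2]          # assumed: the LED indices to switch on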

The Illuminate module also provides information about each LED location and its specific capabilities. We point users to the led_positions attribute.

It contains a DataArray with two dimensions: the first is the led_number, the second is the coordinate in z, y, x. Given the code:

>>> from owl.instruments import Illuminate
>>> light = Illuminate()
>>> led_positions = light.led_positions
>>> led_positions
<xarray.DataArray (led_number: 377, zyx: 3)>
array([[0.     , 0.0021 , 0.01   ],
       [0.     , 0.0021 , 0.016  ],
       [0.     , 0.00305, 0.00305],
       ...,
       [0.     , 0.11195, 0.07395],
       [0.     , 0.1129 , 0.061  ],
       [0.     , 0.1129 , 0.067  ]])
Coordinates:
  * led_number  (led_number) int64 0 1 2 3 4 5 6 ... 370 371 372 373 374 375 376
  * zyx         (zyx) <U1 'z' 'y' 'x'
    chroma      (led_number) <U3 'rgb' 'rgb' 'rgb' 'rgb' ... 'rgb' 'rgb' 'rgb'

  • led_positions[9, 0] refers to the z coordinate of the LED with index 9.

  • led_positions[0, 1] refers to the y coordinate of the LED with index 0.

  • led_positions[3, 2] refers to the x coordinate of the LED with index 3.

  • The chromaticity of each LED can be determined by looking at the chroma attribute.

The led_positions attribute may also be sliced like a NumPy array or any other DataArray.
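
For example, a few slicing patterns that follow directly from the coordinates shown in the output above (a sketch using standard DataArray indexing):

>>> led_positions[:10]                        # positions of the first ten LEDs
>>> led_positions.sel(zyx=['y', 'x'])         # drop the z coordinate, keeping only y and x
>>> led_positions.sel(led_number=9, zyx='z')  # the z coordinate of LED 9, selected by label
>>> led_positions.chroma                      # chromaticity of every LED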

Using the LED positions, you can compute interesting illumination patterns.
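
As one illustrative sketch, the positions can be used to pick out the LEDs near the center of the array; the geometry below uses only the led_positions DataArray shown above, while the final assignment to light.led assumes that property accepts a list of LED indices, which is not documented here:

>>> import numpy as np
>>> yx = led_positions.sel(zyx=['y', 'x'])                  # ignore z; work in the LED plane
>>> center = yx.mean(dim='led_number')                      # geometric center of the LED array
>>> radius = np.sqrt(((yx - center) ** 2).sum(dim='zyx'))   # distance of each LED from the center
>>> inner = led_positions.led_number[radius < 0.02].values  # LED indices within 2 cm of the center
>>> light.led = list(inner)                                 # assumed semantics: switch on only those LEDs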