Main Content

Hardware-Triggered Acquisition with a GigE Vision Camera

Image Acquisition Toolbox™ provides functionality for hardware-triggered acquisition from GigE Vision® cameras. This is useful when camera acquisition must be synchronized with another device (such as a DAQ device or another camera) by means of an external triggering signal. Other applications include controlling the acquisition frame rate with an external signal and acquiring a multiple-exposure image sequence for high dynamic range (HDR) imaging. This example shows how to use the videoinput gige interface to configure an acquisition so that an external triggering signal controls the frame rate and the exposure time.

Requirements and Hardware Setup

  • MATLAB® R2016a or later, Image Acquisition Toolbox, and Image Acquisition Toolbox Support Package for GigE Vision Hardware.

  • GigE Vision compliant camera with hardware triggering capability. This example uses a Basler acA1300 GigE camera.

  • Gigabit Ethernet adapter that provides a direct camera network connection, configured as described in GigE Vision Image Acquisition Quick Start Guide.

  • External triggering setup that can provide a triggering signal to the camera trigger line input. For example, a DAQ device with a digital output, an Arduino® board, or a function generator instrument can be used to output a custom triggering signal. Refer to the camera user manual for the triggering signal voltage-level and current requirements and for the correct signal connections to the camera input lines. This example uses an Arduino Mega 2560 board that has already been configured to send a triggering signal.

Connect to Camera

Create a videoinput object with the desired video format and access the camera's device-specific properties. When using the videoinput gige adaptor, the camera's GenICam features and parameter values are exposed as videoinput source properties.

v = videoinput("gige","1","Mono8");
s = v.Source;

You can determine optimum streaming parameters as described in GigE Vision Image Acquisition Quick Start Guide. These values will be different for your setup.

s.PacketSize = 9000;
s.PacketDelay = 17327;

Capture Frames with Immediate Acquisition

By default, when no hardware triggering configuration is specified, an immediate acquisition takes place when you call the videoinput start function. For simplicity, this example acquires a finite number of frames and stores them in the MATLAB workspace.

triggerconfig(v,"immediate");

Set exposure time and mode.

s.ExposureMode = "Timed";
s.ExposureTimeAbs = 4000;

Specify the number of frames to acquire.

v.FramesPerTrigger = 30;
v.TriggerRepeat = 0;

Start continuous buffered acquisition and wait for acquisition to complete.

start(v);
wait(v,10);

Save the acquired frames from the acquisition input buffer to the workspace.

data = getdata(v,v.FramesAvailable);

Display acquired frames.

figure;
imaqmontage(data)

Stop the acquisition.

stop(v)

Capture Frames with FrameStart Trigger

Most GigE Vision cameras support a FrameStart hardware trigger mode, which configures the camera to acquire one frame for each rising-edge (or falling-edge) signal applied to a camera line input. In this example, an Arduino supplies a periodic square-wave signal to the camera's Line1 input. The signal frequency effectively controls the camera frame rate.
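For reference, a periodic triggering signal like this can also be generated from MATLAB using the MATLAB Support Package for Arduino Hardware. The sketch below is illustrative only: the serial port, board type, and pin name "D9" are assumptions for your wiring, and pause-based timing is only approximate (a few milliseconds of jitter).

```matlab
% Generate an approximate 10 Hz square wave on an Arduino digital pin.
% Requires MATLAB Support Package for Arduino Hardware.
% Port "COM3", board "Mega2560", and pin "D9" are assumptions.
a = arduino("COM3","Mega2560");
for k = 1:100                     % 100 pulses, about 10 seconds total
    writeDigitalPin(a,"D9",1);    % rising edge triggers the camera
    pause(0.05);                  % high for ~50 ms
    writeDigitalPin(a,"D9",0);
    pause(0.05);                  % low for ~50 ms -> ~100 ms period
end
clear a
```

Run this on a second MATLAB session or after starting the acquisition, so the camera receives pulses while waiting for triggers.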

When using the gige adaptor to configure a hardware triggered acquisition, the videoinput trigger type must be set to hardware. The remaining configuration is done using the videoinput source properties, which represent the corresponding camera features and parameters.

Specify the total number of frames to be acquired. One frame is acquired for each external signal pulse.

numFrames = 30;
v.FramesPerTrigger = 1;
v.TriggerRepeat = numFrames - 1;

Specify the videoinput trigger type as hardware.

triggerconfig(v,"hardware","DeviceSpecific","DeviceSpecific");

Configure the camera for FrameStart trigger mode and specify the external triggering signal input line and desired trigger condition.

s.TriggerSelector = "FrameStart";
s.TriggerSource = "Line1";
s.TriggerActivation = "RisingEdge";
s.TriggerMode = "on";

Set the exposure time and mode.

s.ExposureMode = "Timed";
s.ExposureTimeAbs = 4000;

Start continuous buffered acquisition and wait for acquisition to complete.

start(v);
wait(v,10);

Save the acquired frames and timestamps from the acquisition input buffer to the workspace.

[data2,ts2] = getdata(v,v.FramesAvailable);

Display acquired frames.

figure;
imaqmontage(data2)

Plot the differences between consecutive frame timestamps. They are approximately 0.1 s, corresponding to the 10 Hz frequency of the external triggering signal.

figure;
plot(diff(ts2),"-x")
ylim([0 0.2]);
xlabel("Frame index");
ylabel("diff(Timestamp) (s)");

Stop the acquisition.

stop(v)

Capture Frames by Controlling Exposure Time

Certain GigE Vision camera models can control the exposure time of each frame with the pulse-width duration of an external signal. This configuration is achieved by combining the FrameStart hardware trigger mode with the TriggerWidth exposure mode.

Possible applications include high dynamic range (HDR) imaging, where the external triggering signal is a sequence of pulses with different pulse-width durations. In this example, the triggering signal is generated by an Arduino programmed to output a repeating sequence of pulses of different widths (5, 10, 20, and 50 ms) with a 50 ms delay between pulses. For a simple example of how to generate a custom triggering signal using an Arduino, see How to configure a hardware-triggered acquisition from a GigE Vision camera?
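A pulse sequence of this shape can be sketched with the MATLAB Support Package for Arduino Hardware as well. As above, the port, board, and pin names are assumptions, and pause-based timing is only approximate, so treat this as an illustration rather than a precise signal source.

```matlab
% Output a repeating sequence of pulses with widths 5, 10, 20, and
% 50 ms, separated by 50 ms gaps. Requires MATLAB Support Package for
% Arduino Hardware; port, board, and pin "D9" are assumptions.
a = arduino("COM3","Mega2560");
widths = [0.005 0.010 0.020 0.050];    % pulse widths in seconds
for k = 1:8                            % 8 cycles -> 32 pulses
    for w = widths
        writeDigitalPin(a,"D9",1);     % rising edge starts the exposure
        pause(w);                      % pulse width sets exposure time
        writeDigitalPin(a,"D9",0);     % falling edge ends the exposure
        pause(0.05);                   % 50 ms delay between pulses
    end
end
clear a
```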

Specify the total number of frames to be acquired. One frame is acquired for each external signal pulse.

numFrames = 30;
v.FramesPerTrigger = 1;
v.TriggerRepeat = numFrames - 1;

Specify the videoinput trigger type as hardware.

triggerconfig(v,"hardware","DeviceSpecific","DeviceSpecific");

Configure the camera for FrameStart trigger mode and specify the external triggering signal input line and desired trigger condition.

s.TriggerSelector = "FrameStart";
s.TriggerSource = "Line1";
s.TriggerActivation = "RisingEdge";
s.TriggerMode = "on";

For exposure time control, configure a TriggerWidth exposure mode.

s.ExposureMode = "TriggerWidth";

Specify the camera ExposureOverlapTimeMaxAbs value in microseconds. This limits how long an exposure is allowed to overlap with the sensor readout of the previous frame.

s.ExposureOverlapTimeMaxAbs = 5000;

Start continuous buffered acquisition and wait for acquisition to complete.

start(v);
wait(v,10);

Save the acquired frames and timestamps from the acquisition input buffer to the workspace.

[data3,ts3] = getdata(v,v.FramesAvailable);

Display acquired frames.

figure;
imaqmontage(data3)

Plot the timestamp differences. They correspond to the varying pulse widths (5, 10, 20, and 50 ms) plus the 50 ms delay between pulses.

figure;
plot(diff(ts3),"-x");
ylim([0.05 0.11]);
xlabel("Frame index");
ylabel("diff(Timestamp) (s)");

Stop the acquisition.

stop(v)

Each acquired multiple-exposure image sequence can be further processed into a high dynamic range image, for example with the makehdr function in Image Processing Toolbox™.
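As a rough sketch of that post-processing step, the frames from one four-exposure cycle can be combined with makehdr. The assumptions here are loud: frames 1-4 of data3 are taken to correspond to the 5, 10, 20, and 50 ms exposures, the Mono8 frames are replicated to three channels because makehdr and tonemap are typically used with RGB images, and the frames are written to temporary files because makehdr operates on image files.

```matlab
% Combine one 4-exposure sequence from data3 into an HDR image.
% Assumption: data3(:,:,1,1:4) are the 5, 10, 20, and 50 ms exposures.
relExp = [5 10 20 50] / 5;             % exposures relative to shortest
files = cell(1,4);
for k = 1:4
    files{k} = fullfile(tempdir, sprintf("exp%d.tif",k));
    rgbFrame = repmat(data3(:,:,1,k), [1 1 3]);   % Mono8 -> 3 channels
    imwrite(rgbFrame, files{k});
end
hdr = makehdr(files, "RelativeExposure", relExp);
ldr = tonemap(hdr);                    % tone map for display
figure; imshow(ldr)
```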

Clean Up

When you are done working with your camera, clean up the workspace.

delete(v)
clear v