
im2int16

Convert image to 16-bit signed integers

Syntax

I2 = im2int16(I)
RGB2 = im2int16(RGB)
I = im2int16(BW)
gpuarrayB = im2int16(gpuarrayA,___)

Description

I2 = im2int16(I) converts the intensity image I to int16, rescaling the data if necessary. If the input image is of class int16, the output image is identical to it.
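
For example, a uint8 intensity image is rescaled so that its values span the full int16 range. A minimal sketch (the endpoint mapping follows from the rescaling; intermediate values are rounded by the conversion):

I  = uint8([0 64 128 192 255]);   % uint8 intensity values
I2 = im2int16(I);                 % rescaled so 0 -> -32768 and 255 -> 32767
class(I2)                         % returns 'int16'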

RGB2 = im2int16(RGB) converts the truecolor image RGB to int16, rescaling the data if necessary.

I = im2int16(BW) converts the binary image BW to an int16 intensity image, changing false-valued elements to -32768 and true-valued elements to 32767.
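
A minimal sketch of this mapping, using a small hypothetical binary mask:

BW = logical([0 1; 1 0]);   % hypothetical 2-by-2 binary mask
I  = im2int16(BW)           % false -> -32768, true -> 32767,
                            % so I = [-32768 32767; 32767 -32768]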

gpuarrayB = im2int16(gpuarrayA,___) performs the conversion on a GPU. The input image and output image are gpuArrays. This syntax requires the Parallel Computing Toolbox™.

Class Support

Intensity and truecolor images can be uint8, uint16, int16, single, double, or logical. Binary images must be logical. The output image is int16.

Intensity and truecolor gpuArray images can be uint8, uint16, int16, logical, single, or double. Binary gpuArray images must be logical. The output gpuArray image is int16.

Examples


Create an array of class double.

I = reshape(linspace(0,1,20),[5 4])
I = 

         0    0.2632    0.5263    0.7895
    0.0526    0.3158    0.5789    0.8421
    0.1053    0.3684    0.6316    0.8947
    0.1579    0.4211    0.6842    0.9474
    0.2105    0.4737    0.7368    1.0000

Convert the array to class int16.

I2 = im2int16(I)
I2 = 5x4 int16 matrix

   -32768   -15522     1724    18970
   -29319   -12073     5173    22419
   -25870    -8624     8623    25869
   -22420    -5174    12072    29318
   -18971    -1725    15521    32767

Create a gpuArray of class double.

I1 = gpuArray(reshape(linspace(0,1,20),[5 4]))

Convert the gpuArray to int16.

I2 = im2int16(I1)

Extended Capabilities

GPU Arrays
Accelerate code by running on a graphics processing unit (GPU) using Parallel Computing Toolbox™.

Introduced before R2006a
