This section describes how to interface a Simulink® block diagram to sensors in a virtual reality scene. It also describes how to programmatically input signals from the virtual world into a simulation model.
Virtual reality scenes can contain sensors: nodes that generate events and output values depending on time, user navigation, and actions and distance changes in the scene. These nodes add interactivity to the virtual world. You can use Simulink 3D Animation™ functions to read sensor field values into simulation models and to control the simulation based on user interaction with the virtual scene.
You can define the following VRML sensors in the scene:
| Sensor | Description |
| --- | --- |
| CylinderSensor | Maps pointer motion (for example, a mouse or wand) into a rotation on an invisible cylinder that is aligned with the y-axis of the local coordinate system. |
| PlaneSensor | Maps pointing device motion into two-dimensional translation in a plane parallel to the z=0 plane of the local coordinate system. |
| ProximitySensor | Generates events when the viewer enters, exits, and moves within a region in space (defined by a box). |
| SphereSensor | Maps pointing device motion into spherical rotation about the origin of the local coordinate system. |
| TimeSensor | Generates events as time passes. |
| TouchSensor | Tracks the location and state of the pointing device and detects when you point at geometry contained by the TouchSensor node parent group. |
| VisibilitySensor | Detects visibility changes of a rectangular box as you navigate the world. |
Interactive mode allows clients to modify a remote virtual world via events from sensor nodes defined in the virtual world. Interactive mode is useful when a virtual world includes a sensor.
Interactive mode is disabled by default on clients. You can enable (or later disable) interactive mode on a client by using the context menu in the Web Viewer or by pressing the I key.
You can disable interactive mode for a particular virtual world on the host computer. For details, see the vrworld/get and vrworld/set functions.
To read the value of a readable VRML field (either an eventOut or an exposedField), first synchronize that field with the sync function. After synchronization, each time the field changes in the scene, the field value updates on the host. You can then read the value of the field with the getfield function, or access it directly using dot notation.
The virtual scene for the Magnetic Levitation Model example, maglev.wrl, contains a PlaneSensor (with the DEF name Grab_Sensor). The PlaneSensor is attached to the ball geometry to register your attempts to move the ball up or down when you grab it with the mouse. The example uses the sensor fields minPosition and maxPosition to restrict movement in other directions. You can use the output of the sensor translation field as the new setpoint for the ball position controller. You can read the sensor output value into a MATLAB® variable setpoint as follows:
% Create the vrworld object and open the world
wh = vrworld('maglev.wrl');
open(wh);

% Get the node handle
nh = vrnode(wh, 'Grab_Sensor');

% Synchronize the translation field
sync(nh, 'translation', 'on');

% Three alternative ways to read the synchronized field value
setpoint = getfield(nh, 'translation');
setpoint = nh.translation;
setpoint = wh.Grab_Sensor.translation;
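When you finish reading values, you can turn the synchronization off and close the world. A minimal cleanup sketch, assuming the wh and nh variables from the example above:

% Stop synchronizing the translation field
sync(nh, 'translation', 'off');

% Close and remove the vrworld object
close(wh);
delete(wh);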
The vrmaglev/VR Sensor Reader model displays. This model contains the vrextin block, which is an S-function block. The vrextin S-function synchronizes the sensor field and periodically reads its value during simulation. To examine the S-function parameters, right-click vrextin and select S-Function Parameters. The parameters defined in the mask supply the sample time, the virtual world, and the node/field to read.
Note the following about the vrextin block: instead of setting its own block outputs, the vrextin S-function sets the value of the adjacent Constant block. This approach makes the VR Sensor Reader block compatible with Simulink Coder™ code generation, so that the model can run on Simulink Coder targets.
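Outside a Simulink model, you can approximate what the vrextin S-function does by polling the synchronized field at a fixed period. A rough sketch, assuming the nh node handle from the earlier example and an assumed sample time Ts:

Ts = 0.1;                        % assumed sample time, in seconds
for k = 1:50
    setpoint = nh.translation;   % latest synchronized sensor value
    % ... use setpoint, for example as a controller reference ...
    pause(Ts);
end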
The signal loop between the user action (grabbing the ball to a desired position using the mouse) and the ball position closes through the associated Simulink model. As a result, grabbing the ball to a new position works only when the model is running and when the model sets the blue selection method switch to the virtual reality sensor signal path. To experience the behavior of the PlaneSensor using the virtual scene only, save the maglev.wrl file under a new name and remove the comment symbol (#) to enable the last line of that file. This action activates direct routing of the sensor output to the ball translation. You can then experiment with the newly created scene instead of the original maglev.wrl file:
ROUTE Grab_Sensor.translation_changed TO Ball.translation
You can use this method to input information from all VRML node fields of type eventOut or exposedField, not only a Sensor eventOut field. See VRML Data Class Types for more information about VRML data class types. For fields of class exposedField, you can also use an alternate name formed by appending the suffix _changed to the field name. For example, translation and translation_changed are alternate names for requesting the translation field value of the Grab_Sensor node.
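For example, assuming the nh node handle from the earlier maglev example, both names refer to the same synchronized value:

% For exposedField fields, the plain name and the name with the
% _changed suffix request the same field value
p1 = nh.translation;
p2 = nh.translation_changed;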