Supervisory and low-level tracking control of a 5-bar or articulated gantry robot, with the desired position derived from video input processed in real time.
This model simulates a robot in an environment that includes a video camera source, used to obtain the desired position of the robot end effector, and a user interface that lets the user command the robot to pick up the tracked object and return it to a home location.
This model demonstrates (from left to right in the model):
1. Inputting a video file to Simulink
2. Running an object detection algorithm on the video to find a target point
3. Passing that location to a supervisory controller, which filters the input from the video tracking code using a Kalman filter
4. The supervisory controller selects the robot's mode and the state of the electromagnetic pickup end effector, and passes that data to the low-level controller
5. The low-level controller runs a PD controller that is tuned for the chosen plant (a 5-bar linkage and an articulated robot arm are available in the configurable subsystem block called "gantry plants")
6. The control inputs are fed to a mechanical model of the plant
7. The mechanical model outputs the updated positions of the plant, and an estimator processes them (analytical and SimMechanics estimators for the 5-bar; raw data for the robot arm)
8. Those positions are fed back into the supervisory control loop.
9. The whole operation is controlled and visualized through a GUI that displays the video being processed and the desired and actual positions of the robot end effector, and that lets the user interact with the model to make the robot track the moving target and "pick up" the target using an unmodeled electromagnetic end effector.
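The object detection step (item 2) is not spelled out in the model description. One common minimal approach is intensity thresholding followed by a centroid computation over the detected blob; the sketch below illustrates the idea in Python/NumPy under that assumption (the function name and threshold are illustrative, not the model's actual code):

```python
import numpy as np

def find_target(frame, threshold=200):
    """Return the (row, col) centroid of bright pixels, or None if no target.

    frame: 2-D NumPy array of grayscale intensities (0-255).
    """
    mask = frame > threshold            # binary mask of candidate target pixels
    if not mask.any():
        return None                     # nothing bright enough in this frame
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()     # centroid of the detected blob
```

The resulting pixel coordinates would then be converted to workspace coordinates before being passed to the supervisory controller.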
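The Kalman filtering in the supervisory controller (item 3) smooths the noisy target positions coming from the video tracker. A standard choice for this kind of tracking is a constant-velocity Kalman filter over the 2-D target position; the sketch below shows that technique, with placeholder noise parameters (the model's actual state layout and tuning are not given in this description):

```python
import numpy as np

class KalmanTracker:
    """Constant-velocity Kalman filter smoothing noisy 2-D position measurements."""

    def __init__(self, dt=1.0 / 30, q=1e-2, r=1.0):
        # State vector: [x, y, vx, vy]
        self.F = np.eye(4)
        self.F[0, 2] = self.F[1, 3] = dt        # position += velocity * dt
        self.H = np.zeros((2, 4))
        self.H[0, 0] = self.H[1, 1] = 1.0       # we only measure position
        self.Q = q * np.eye(4)                  # process noise covariance (placeholder)
        self.R = r * np.eye(2)                  # measurement noise covariance (placeholder)
        self.x = np.zeros(4)
        self.P = np.eye(4)

    def update(self, z):
        """Fuse one position measurement z = [x_meas, y_meas]; return filtered position."""
        # Predict step
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Correct step
        y = np.asarray(z, dtype=float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]
```

Feeding successive detections through `update` yields a smoothed desired position for the low-level controller, and the velocity states let the filter coast briefly through missed detections.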
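The low-level PD control law (item 5) can be summarized in a few lines. This is a generic discrete-time PD sketch, not the model's tuned controller; the gains are per-plant placeholders (the actual values are tuned inside the "gantry plants" subsystems):

```python
def pd_control(error, prev_error, dt, kp, kd):
    """Discrete PD law: effort proportional to the position error and to its
    finite-difference rate of change. kp and kd are plant-specific gains."""
    derivative = (error - prev_error) / dt   # approximate error rate
    return kp * error + kd * derivative
```

The same law runs per joint, with gains retuned when switching between the 5-bar linkage and the articulated arm, since the two plants have different dynamics.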
Originally created in R14SP2; tested successfully in R2007b.