The sl3dex_uav example shows how virtual collision sensors can be used to interactively control a simulation and to change the appearance of virtual world objects using Simulink® 3D Animation™. The example represents a simple unmanned aerial vehicle (UAV) challenge.
The UAV competition scene is based on the IMAV Flight Competition held in Toulouse, France, in 2013 (http://www.imav2013.org).
The competition task is to fly through the larger window in the blue wall, pass the obstacles, and land at the helipad in the blue landing zone. The flight timer starts when the UAV crosses the starting window, and the simulation stops when the UAV lands at the helipad. During the flight, the number of collisions with the obstacle poles is counted. Hitting the blue start wall before crossing the starting window is registered without penalty; you can still proceed to the starting window.
By default, the model is set up for the UAV to follow a predefined trajectory. You can also use a SpaceMouse® to control the UAV and manually fly it through the obstacles onto the landing pad. To change the navigation source, toggle the SpaceMouse / Predefined Navigation Switch block in the model.
In the associated virtual world, four PrimitivePickSensor nodes are defined that detect collisions of the UAV with various target geometries: the starting wall, the starting window, the orange obstacle poles, and the landing helipad. The sensors provide the following feedback to the simulation:
Hitting the start wall only changes the wall color.
Passing the starting window starts the elapsed time of the challenge.
Collisions with the obstacle poles are registered and counted for scoring.
Landing at the helipad stops the simulation and sets the visual appearance of the UAV rotor blades to their static state.
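The feedback above amounts to a small piece of stateful logic: the timer runs only between the window crossing and the landing, and obstacle hits are scored only while the timer runs. A minimal sketch of that logic in Python (the class and all names are illustrative, not part of the Simulink model):

```python
class UavChallenge:
    """Sketch of the challenge scoring driven by the four pick sensors.

    The boolean arguments of update() mirror the sensor outputs; this is
    a hypothetical reimplementation, not code from the actual model.
    """

    def __init__(self):
        self.started = False    # set when the starting window is crossed
        self.finished = False   # set when the UAV lands at the helipad
        self.collisions = 0     # obstacle-pole hits, counted after start
        self.start_time = None
        self.elapsed = 0.0

    def update(self, t, hit_wall, crossed_window, hit_obstacle, landed):
        if self.finished:
            return
        if not self.started:
            # Hitting the start wall before the window carries no penalty.
            if crossed_window:
                self.started = True
                self.start_time = t
            return
        self.elapsed = t - self.start_time
        if hit_obstacle:            # assumes one call per collision event
            self.collisions += 1
        if landed:
            self.finished = True    # would stop the simulation


challenge = UavChallenge()
challenge.update(0.0, hit_wall=True, crossed_window=False,
                 hit_obstacle=False, landed=False)
challenge.update(1.0, hit_wall=False, crossed_window=True,
                 hit_obstacle=False, landed=False)
challenge.update(5.0, hit_wall=False, crossed_window=False,
                 hit_obstacle=True, landed=False)
challenge.update(9.0, hit_wall=False, crossed_window=False,
                 hit_obstacle=False, landed=True)
print(challenge.collisions, challenge.elapsed, challenge.finished)
```

In the model itself this logic is realized with Simulink blocks rather than code; the sketch only makes the sequencing explicit.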
The collision detection is approximate: the PrimitivePickSensor nodes detect collisions of a transparent box (UAV_Collision_Box) that is wrapped around the UAV body.
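In X3D/VRML terms, each such sensor pairs a picking geometry with a set of pick targets. A schematic fragment of one sensor might look like the following (node names and field values here are illustrative assumptions, not copied from the example's virtual world file):

```
DEF Collision_Sensor_Poles PrimitivePickSensor {
  enabled TRUE
  intersectionType "BOUNDS"              # coarse bounds-based test
  pickingGeometry USE UAV_Collision_Box  # transparent box around the UAV
  pickTarget [ USE Obstacle_Poles ]      # geometry tested for collisions
}
```

The sensor's isActive output then feeds the simulation through the VR Source block.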
In the model, there are VR Sink, VR Source, and VR Text Output blocks associated with the same virtual world. The VR Source block reads the sensor signals. The VR Sink block sets the UAV position and rotation and the visual properties of virtual world objects. The VR Text Output block updates the HUD display text.