Digital Shadow for a Line Following Robot, using Unity3D and ROS2
Note: This repository is a "fork" of Edward's work. Because of certain git configurations, an actual fork could not be created. The work in this repository should be more platform-independent.
This repo contains the development of a Digital Shadow (DS) in Unity3D. It concerns a LEGO Line Following Robot (LFR), for which the mechanics and a similar line-following control algorithm are implemented. The control is essentially a red sensor that follows the left side of a white line (i.e., the robot drives anti-clockwise).
Remarks: the mechanics have been altered for the control strategy to be successful. The location of the sensor has been changed: physically, a sensor between the two driven wheels will not result in a feasible line follower.
The sensor is realised with a camera with a limited field of view (70x70 pixels), over which all "red" pixels are averaged. This yields a single value between 0 (no red) and 1 (fully red). In addition, part of the sensor is a red spotlight.
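For reference, the averaging boils down to something like the following. This is a Python/numpy illustration only; the actual sensor is implemented in C# inside Unity, and `frame` is a hypothetical 70x70 RGB capture:

```python
import numpy as np

def red_sensor_value(frame: np.ndarray) -> float:
    """Average the red channel of a 70x70 RGB capture.

    Returns a value between 0 (no red) and 1 (fully red).
    """
    return float(frame[:, :, 0].mean()) / 255.0
```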
Additionally a normal camera image (jpeg stream) is created, as well as a depth image (image stream). The images are streamed over DDS and can be observed in any application that uses DDS. The images are sent as standardised ROS messages.
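A minimal rclpy sketch of watching the jpeg stream could look like the node below. The topic name `/camera/image/compressed` is an assumption; check `ros2 topic list` for the names the Unity side actually publishes:

```python
import cv2
import numpy as np
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import CompressedImage

class JpegViewer(Node):
    def __init__(self):
        super().__init__('jpeg_viewer')
        # Topic name is a placeholder; use `ros2 topic list` to find the real one.
        self.create_subscription(CompressedImage, '/camera/image/compressed',
                                 self.on_image, 10)

    def on_image(self, msg):
        # Decode the jpeg payload into an OpenCV BGR image and display it.
        frame = cv2.imdecode(np.frombuffer(bytes(msg.data), np.uint8), cv2.IMREAD_COLOR)
        cv2.imshow('camera', frame)
        cv2.waitKey(1)

def main():
    rclpy.init()
    rclpy.spin(JpegViewer())

if __name__ == '__main__':
    main()
```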
The full system should work on Windows 10 with Unity3D 2020+. While it has not (yet) been tested on Linux systems (currently being investigated), it should work out-of-the-box there as well.
Whereas the old repository used VortexDDS and a detailed configuration for DDS communication, this is no longer required in this new version. All of this is handled by the Robotec.ai plugin for ROS2.
ROS2 needs to be installed on the machine.
I installed ROS2 Galactic (https://docs.ros.org/en/galactic/index.html), but newer versions are also fine. ROS2 versions are named alphabetically: "F" for Foxy is older than "G" for Galactic. Be aware that you DO NOT INSTALL ROS; it must be ROS2.
Galactic works with Ubuntu Focal (20.04.3 LTS); installing this Ubuntu version is recommended.
Note: _Fedora 36 currently has no way of using ROS2, due to compilation issues with ament. See also this StackOverflow issue_.
The package in the ROS directory contains Python files and C++ files. You need to build the files the "ROS2 way" (with colcon and a number of supporting files). The Python files can also be run without building; they will not be "known" as part of the package, but they work.
In a terminal you can already watch some topics: `ros2 topic list` will list all available topics.
If you installed rqt (done by default, I think) you can run its GUI, which is easier:
`ros2 run rqt_gui rqt_gui`
If you have installed the CV bridge you can use OpenCV in ROS2. The file "pcd_publisher_node.py" uses the CV bridge: it subscribes to the depth_image topic (coming from Unity) and applies a colormap to it. It then shows the image with that colormap (depth-2-colormap).
Additionally we could re-publish the topic under another topic name, so that it can be shown in rqt_gui.
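A rough sketch of such a colormap-and-republish node is shown below. This is not the actual pcd_publisher_node.py (see the ROS directory for that); the topic names `depth_image` and `depth_colormap`, and a depth encoding that cv_bridge can convert directly, are assumptions:

```python
import cv2
import numpy as np
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

class DepthColormap(Node):
    def __init__(self):
        super().__init__('depth_colormap')
        self.bridge = CvBridge()
        self.pub = self.create_publisher(Image, 'depth_colormap', 10)
        self.create_subscription(Image, 'depth_image', self.on_depth, 10)

    def on_depth(self, msg):
        depth = self.bridge.imgmsg_to_cv2(msg)
        # Normalise to 8 bit and apply a colormap (depth-2-colormap).
        depth8 = cv2.normalize(depth, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
        color = cv2.applyColorMap(depth8, cv2.COLORMAP_JET)
        # Re-publish under another topic name so it shows up in rqt_gui.
        self.pub.publish(self.bridge.cv2_to_imgmsg(color, encoding='bgr8'))

def main():
    rclpy.init()
    rclpy.spin(DepthColormap())

if __name__ == '__main__':
    main()
```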
The repository comes with a simple-to-use dashboard for identifying the DS of the LFR. Take a look in the /dashboard folder for more information.
A MongoDB is used to store time data (currently only the position and orientation of the LFR, once every second).
Install MongoDB and read up on how to do that (a database can be hosted locally or in the cloud).
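Once data is being written, a quick way to inspect it is a few lines of pymongo. The connection string, database and collection below match the example config.json further down; the exact field names written by the Unity app are not documented here, so the sketch just prints whole documents:

```python
from pymongo import MongoClient

# Connection string, database and collection as in the example config.json.
client = MongoClient('mongodb://127.0.0.1:27017')
collection = client['test']['LineRobot2']

# Print the first few stored samples (position/orientation of the LFR);
# inspect these to see the actual schema the Unity app writes.
for doc in collection.find().limit(5):
    print(doc)
```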
A Unity executable can be built from within the Unity Editor. A scene (or multiple scenes) must be used. There are 3 scenes defined in the project. They work stand-alone (they are not linked to each other, as is typically done in a game with levels, where each level is a scene). That is why you should only compile one scene at a time. For design of experiments, use the scene 'SceneLineRenderer'.
For design of experiments a configuration file is read: config.json, located in the main directory of the Editor project (the parent of the Assets folder) or in the main executable directory (the one containing the bat file that starts the executable).
The json configuration file looks like this:

```json
{
  "Database" : {
    "ClientConnection" : "mongodb://127.0.0.1:27017",
    "DatabaseName" : "test",
    "DBCollection" : "LineRobot2"
  },
  "Kspeedcontroller" : 1.45,
  "SimulationTime" : 10.0,
  "VelocitySetpointGenerator" : {
    "Dark" : 0.6,
    "Bright" : 1.0,
    "Threshold1" : 0.78,
    "Threshold2" : 0.83,
    "DistanceCenterToWheel" : 0.07,
    "K" : 2.1,
    "ForwardSpeed" : 0.1
  },
  "Parcours" : {
    "LineWidth" : 0.05,
    "Kind" : "Trail",
    "LinePieces" : [
      { "x" : 0.0, "y" : -0.1 },
      { "x" : 0.0, "y" : 0.0 },
      { "x" : -1.0, "y" : 0.0 },
      { "x" : -1.0, "y" : 1.0 },
      { "x" : 0.0, "y" : -0.1 }
    ]
  }
}
```
Let's analyse the content:
"ClientConnection" : "mongodb://127.0.0.1:27017" "DatabaseName" : "test" "DBCollection" : "LineRobot2"
Tip: download MongoDB Compass Community (free) from the MongoDB website and create a database (locally or in the cloud); Compass also shows the connection string, which can be copied into the json file. The ClientConnection is that connection string.
A database MUST be present at the start of the application and must therefore be created beforehand; the Unity app does not create one.
A DBCollection does NOT need to be present beforehand; the Unity app will create the collection if it does not exist. Typically you would store each experiment in a different collection.
Kspeedcontroller: the left and right motor each receive a speed setpoint from the VelocitySetpointGenerator, and Kspeedcontroller is the P-action of each speed controller. If you do not know what this does, keep it at 1.45.
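As an illustration of what a P-action means here (hypothetical Python; the actual controller runs in the Unity C# scripts and the function name is made up):

```python
def motor_command(speed_setpoint, measured_speed, k_speedcontroller=1.45):
    # Pure P-action: the motor command is proportional to the speed error.
    return k_speedcontroller * (speed_setpoint - measured_speed)
```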
SimulationTime: the time in seconds after which the application stops. So if you want to store 20 seconds of line-robot data, set the simulation time to 20 seconds.
VelocitySetpointGenerator: these settings use the red sensor value to determine the speed setpoints for both wheels.
The red value is a value from 0 to 1. It is mapped to a different linear scale, given the four values Dark, Bright, Threshold1 and Threshold2.
Constraint: Dark < Threshold1 < Threshold2 < Bright.
The Dark value is mapped to -1, values between Threshold1 and Threshold2 are mapped to 0, the Bright value is mapped to 1, and any values in between are linearly interpolated.
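Put together, the mapping can be sketched as a small piecewise-linear function (Python illustration using the defaults from the example config.json):

```python
def map_red_value(red, dark=0.6, bright=1.0, threshold1=0.78, threshold2=0.83):
    """Map the 0..1 red value onto the -1..1 scale described above."""
    if red <= dark:
        return -1.0
    if red >= bright:
        return 1.0
    if threshold1 <= red <= threshold2:
        return 0.0
    if red < threshold1:
        # linear between (dark, -1) and (threshold1, 0)
        return -1.0 + (red - dark) / (threshold1 - dark)
    # linear between (threshold2, 0) and (bright, 1)
    return (red - threshold2) / (bright - threshold2)
```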
DistanceCenterToWheel is half the distance between the two wheels. Do not alter this unless you really change the wheel base; change K instead.
K is a proportional action used to determine the required rotation. ForwardSpeed is the forward speed in m/s.
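One common differential-drive way to combine these parameters is shown below. This is an assumption about the implementation, not taken from the Unity scripts, but it illustrates the roles of K, ForwardSpeed and DistanceCenterToWheel:

```python
def wheel_setpoints(error, k=2.1, forward_speed=0.1, distance_center_to_wheel=0.07):
    """error: the mapped red value in [-1, 1]."""
    omega = k * error                                         # required rotation
    v_left = forward_speed - omega * distance_center_to_wheel   # m/s
    v_right = forward_speed + omega * distance_center_to_wheel  # m/s
    return v_left, v_right
```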
LineWidth determines the (constant) width of the lines. Kind has two options (case-sensitive): Trail and PieceWise.
LinePieces is an array of x and y coordinates: +x is to the right and +y is up (seen from the camera; x and y lie on the flat horizontal plane).
In the Trail case the array of line pieces forms one uninterrupted line from each point to the next.
In the PieceWise case the array of line pieces must have an even number of entries. If there are n entries then n/2 line pieces are made, going from the 1st to the 2nd entry, the 3rd to the 4th, the 5th to the 6th, and so on.
The lines are drawn with a LineRenderer in Unity; in the PieceWise case each line piece has its own GameObject with a LineRenderer attached.
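The difference between the two kinds can be summarised in a few lines of Python (illustration only; the actual parcours generation happens in Unity):

```python
# The points from the example config.json above, as (x, y) tuples.
points = [(0.0, -0.1), (0.0, 0.0), (-1.0, 0.0), (-1.0, 1.0), (0.0, -0.1)]

def segments(points, kind):
    if kind == "Trail":
        # one uninterrupted line: every point connects to the next one
        return list(zip(points, points[1:]))
    if kind == "PieceWise":
        # points pair up: (1st, 2nd), (3rd, 4th), ... so n must be even
        assert len(points) % 2 == 0, "PieceWise needs an even number of entries"
        return [(points[i], points[i + 1]) for i in range(0, len(points), 2)]
    raise ValueError("unknown Kind: " + kind)
```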