Design of Remote Driving Wheeled Robot

Abstract: With the continuous advancement of technologies such as cloud computing, artificial intelligence, modern sensing, information fusion, communication, and automatic control, the development of unmanned vehicles will continue to accelerate, and their acceptance and demand will gradually grow. Although unmanned vehicles have not yet been widely deployed, their applicable fields are already emerging: transportation, logistics, environmental sanitation, and security inspection will be the main application areas. This project designs and builds an unmanned vehicle based on embedded systems and the Internet of Things, and adds a machine vision system so that the vehicle can complete the required operations in different scenarios. To meet those operational requirements, the vehicle can be controlled remotely through a driving simulator to move forward, reverse, and steer.


Introduction
This design combines, for the first time, remote driving with advanced sensing technologies such as computer vision and with unmanned vehicles, achieving remote intelligent unmanned-vehicle control in a practically feasible way with a very low operational threshold, reliable equipment, and low cost. The paper draws on a range of current technologies, including robot mechanical structure optimization, energy and power matching, remote communication, sensor decision algorithms, closed-loop motion control, computer vision and deep learning, and cloud computing. The design has five core components: the remote driving system, the modular mechanical structure, the distributed electronic architecture, the machine vision system, and the risk-control safety system.

Related Works
In 2014, Google announced the formation of the Open Automotive Alliance (OAA) to develop driverless cars in collaboration with GM, Honda, Audi, Hyundai, and chipmaker NVIDIA. In 2016, Japan's Ministry of Economy, Trade and Industry, Ministry of Land, Infrastructure, Transport and Tourism, and the Japan Automobile Manufacturers Association established the "Institute of Autonomous Driving". In 2018, the European Commission announced a timetable for autonomous driving, aiming to achieve autonomous driving on highways and low-speed autonomous driving on urban roads by 2020, and to reach a fully autonomous society by 2030. To this end, the EU will invest 450 million euros to support digitization and automation. Through this strategy, the EU hopes to take the initiative in the global competition in the field of autonomous driving.
China has successively introduced policies such as "Made in China 2025", the "Energy-saving and New Energy Vehicle Technology Roadmap", and the "Management Code for Public Road Testing of Intelligent Networked Vehicles" to promote the development of the driverless industry. The development of China's intelligent networked vehicles continues to accelerate, and cross-border cooperation between the automobile industry and the electronics, communications, and Internet sectors has been strengthened, with positive progress in key technology development, industry chain layout, and testing demonstrations. Nine cities have introduced policies on autonomous driving road tests, successively opening relevant roads and actively promoting semi-closed and open road tests.

Design of unmanned vehicle related system
The design of the unmanned vehicle is divided into three parts: body hardware design, electronic control system design, and additional function design. The body hardware design focuses on the modular mechanical structure; the electronic control system design introduces the electronic control architecture and the allocation of computing resources in detail; and the remote driving, risk control, and machine vision systems are introduced in detail under the additional function design. The system structure block diagram is shown in Figure 1.

Vehicle hardware design
For the mechanical structure, this unmanned vehicle adopts a modular design. In the frame and body system, the frame spans the front and rear axles of the unmanned cart; it supports and connects the cart's assemblies, keeps them in their correct relative positions, and bears the various loads inside and outside the vehicle. The frame must have enough strength and stiffness to withstand the operating loads and the impacts transmitted from the wheels, while also remaining lightweight. A truss-type high-stiffness frame is therefore adopted to combine high stiffness with low mass.
For the drive and brake system, a single motor drives the rear wheels. Under rear-wheel drive, the vehicle's weight transfers backward during acceleration, increasing the load on the rear wheels; the rear wheels therefore gain more grip, improving the vehicle's acceleration. Rear-wheel drive also simplifies the transmission structure at the front wheels, distributes the vehicle's mass more evenly, and allows the front wheels a larger steering angle, which reduces the turning radius and improves the vehicle's agility. It further avoids the torque-steer bias that uneven front-wheel torque distribution can cause, improving maneuverability, and its simpler mechanical layout is easier to disassemble and maintain. Braking uses motor reverse braking: the control circuit is simple, and the braking is rapid and effective.

For the steering system, front-axle Ackermann steering is adopted. Based on the Ackermann steering geometry, the vehicle uses the equal cranks of a four-bar linkage so that, when turning along a curve, the steering angle of the inner wheel is larger than that of the outer wheel. The centers of the four wheel paths then roughly meet at the instantaneous steering center on the extension of the rear axle, allowing the vehicle to turn smoothly.
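The Ackermann condition above can be stated numerically: for a turn of radius R (measured at the rear-axle midpoint), wheelbase L, and track width W, the ideal front-wheel angles satisfy tan(δ_inner) = L/(R − W/2) and tan(δ_outer) = L/(R + W/2). The following sketch illustrates the geometry; the dimensions are illustrative placeholders, not the actual vehicle's parameters.

```python
import math

def ackermann_angles(wheelbase_m: float, track_m: float, turn_radius_m: float):
    """Ideal Ackermann steering angles (radians) for a turn radius measured
    at the rear-axle midpoint. Dimensions here are illustrative only."""
    inner = math.atan(wheelbase_m / (turn_radius_m - track_m / 2))
    outer = math.atan(wheelbase_m / (turn_radius_m + track_m / 2))
    return inner, outer

# Example with placeholder dimensions: 0.30 m wheelbase, 0.20 m track, 1.0 m turn.
inner, outer = ackermann_angles(0.30, 0.20, 1.0)
# The inner wheel turns more sharply than the outer one, as the geometry requires.
```

With these angles, the perpendiculars from both front wheels pass through the same instantaneous center on the rear-axle line, which is exactly the smooth-turning condition described above.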
For the power battery system, in order to improve safety and prolong service life, this project adopts a vehicle-grade lithium battery with an IP67 waterproof rating. A vehicle-grade lithium battery can be used in more scenarios, has a longer service life, and tolerates harsher working environments. It uses multiple cells in series and parallel, which improves reliability, and it can be mass-produced with highly consistent quality: the selection standards for raw materials are stricter, manufacturing and process control are tighter, and safety is better guaranteed.

Electronic control system design
The embedded bottom layer can be controlled with simple code to achieve forward, brake, steering, backward, and other functions, while the top layer is responsible for the computer vision algorithms, sensor fusion algorithms, path-assisted decision-making, and closed-loop control algorithms. The demanding high-performance processing can thus be integrated into the operator console; with the precision systems removed from the robot body, the robot itself can be designed for better resistance to harsh environments.
For the chassis electronic architecture, given the large number of chassis modules (and hence the demand for I/O port resources), the Infineon TC1782 is selected as the main controller. The communication module passes commands from the upper system down to the embedded controller; the embedded I/O ports output PWM waves that drive two half-bridge chips forming an H-bridge circuit for the motor drive. An encoder provides speed feedback, and an analog circuit connected to the A/D module provides current feedback; both signals are fed back into the embedded controller to close the control loop. The control strategy is shown in Figure 2 and the implementation process in Figure 7. The control program is implemented by defining a structure that continuously adjusts the PWM output.
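The "structure that continuously adjusts the PWM output" can be sketched as a simple PI speed loop fed by the encoder. The Python below is only a sketch of the control logic with hypothetical gains; the actual firmware would be an equivalent C struct and update routine running on the TC1782.

```python
from dataclasses import dataclass

@dataclass
class SpeedLoop:
    """Closed-loop speed controller state (hypothetical gains and names);
    the firmware equivalent is a C struct updated each control tick."""
    kp: float
    ki: float
    target_rpm: float
    integral: float = 0.0

    def update(self, measured_rpm: float, dt: float) -> float:
        """Return a PWM duty cycle in [0, 1] from encoder speed feedback."""
        error = self.target_rpm - measured_rpm
        self.integral += error * dt
        duty = self.kp * error + self.ki * self.integral
        return max(0.0, min(1.0, duty))  # clamp to the valid duty range

# One control tick with placeholder values: 300 rpm target, 250 rpm measured.
loop = SpeedLoop(kp=0.002, ki=0.001, target_rpm=300.0)
duty = loop.update(measured_rpm=250.0, dt=0.01)
```

Each tick, the encoder reading updates the error and integral terms, and the resulting duty cycle is written to the PWM peripheral that switches the H-bridge.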

Machine vision system
A machine vision system underlies face recognition, item recognition, and unmanned driving. This unmanned vehicle platform carries a machine vision system and reserves a port for further development. A complete machine vision system includes a lighting source, optical lens, CCD camera, image acquisition card, image detection software, a monitor, a communication unit, and so on. The working process of the machine vision system used in this project is shown in Figure 3 and proceeds as follows.
1. When the sensor detects that the object to be inspected has moved close to the camera's shooting center, a trigger pulse is sent to the image acquisition card.
2. According to the preset program and delay, the acquisition card sends start pulses to the lighting system and the camera, respectively.
3. On receiving its start pulse, the camera ends the current frame and starts a new one (or, if it was waiting, begins immediately), opening its exposure components for the preset exposure time; the light source's on-time is matched to the camera's exposure time; the camera then scans and outputs an image.
4. The acquisition card receives the signal and digitizes it through A/D conversion, or directly receives digital video data already digitized by the camera.
5. The acquisition card stores the digital image in the computer's memory.
6. The computer processes, analyzes, and identifies the image to obtain a detection result.
7. The result is fed back, and the processor further controls and corrects the unmanned vehicle.
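The seven steps above can be sketched as a single trigger-to-feedback cycle. The classes below are simplified stand-ins, not the actual camera or frame-grabber API; the "analysis" step is a toy placeholder.

```python
# Minimal sketch of the trigger -> capture -> process -> feedback cycle.
# Camera is a stand-in for the real hardware, not an actual driver API.

class Camera:
    def __init__(self, exposure_ms: int):
        self.exposure_ms = exposure_ms   # exposure time is set in advance (step 3)

    def capture(self):
        """Return a placeholder image (a 4x4 grid of zero pixels)."""
        return [[0] * 4 for _ in range(4)]

def run_cycle(object_detected: bool, camera: Camera):
    """One machine-vision cycle: returns a detection result, or None if
    the sensor has not triggered."""
    if not object_detected:              # step 1: wait for the trigger pulse
        return None
    frame = camera.capture()             # steps 2-4: strobe, exposure, A/D
    stored = frame                       # step 5: image lands in host memory
    result = sum(sum(row) for row in stored)  # step 6: toy "analysis"
    return result                        # step 7: fed back to the controller

cam = Camera(exposure_ms=5)
```

In the real system the trigger comes from the proximity sensor, the capture path runs through the acquisition card, and step 6 is the recognition software described in the next paragraph.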
To reduce operational difficulty and improve overall safety, a basic recognition system and a risk control system are built on top of the machine vision hardware. The recognition system relies on machine vision and machine learning; after a target object is recognized, path planning and on-screen display are performed according to the corresponding function. The risk control system is also based on machine vision. After recognizing obstacles such as gullies, trees, and pedestrians, if the system judges that the robot's trajectory will intersect the obstacle's location, the operator console raises an alarm to prompt the driver to steer away; if the driver does not respond, the robot automatically brakes and locks out operation.

Remote driving system
The remote driving system is realized over the 5G mobile network; the main technical challenges are described below.
1. Collecting the control actions at the console: To capture the driver's actions at the remote console, the steering wheel rotation angle must be read. An incremental encoder accumulates zero-point error, has poor interference immunity, and requires power-off memory when the receiving equipment shuts down and a zero or reference position at power-on; we therefore choose an absolute encoder, which converts the steering wheel's rotational position into binary Gray code stably and reliably. Pedal travel is measured with pressure-sensitive (or photosensitive) components and a corresponding analog-to-digital conversion circuit. The collected information is output as digital signals to simplify subsequent processing.
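The Gray code read from the absolute encoder must be decoded back to an ordinary binary position before use. The standard reflected-binary conversion is shown below; this is the general algorithm, not code taken from the project's firmware.

```python
def binary_to_gray(n: int) -> int:
    """Encode an integer position as reflected binary (Gray) code."""
    return n ^ (n >> 1)

def gray_to_binary(gray: int) -> int:
    """Decode a Gray-code word, as output by an absolute encoder,
    into an ordinary binary position value."""
    binary = gray
    while gray:
        gray >>= 1
        binary ^= gray
    return binary

# Position 5 (0b101) encodes to Gray 7 (0b111) and decodes back.
pos = gray_to_binary(binary_to_gray(5))
```

Because adjacent Gray codes differ in exactly one bit, a read taken while the steering wheel moves between two positions is off by at most one step, which is the robustness property motivating the absolute encoder choice above.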
2. Using the 5G network for information transmission: To reproduce the environment around the robot, three displays at the operator console receive real-time images captured by the robot's cameras; video of the vehicle's surroundings is transmitted back over the 5G network with high definition and low latency. This part must also transmit the control commands precisely (the data volume is small and is not a criterion for module selection). The project requires two 5G modules in total: one connects to the serial port and camera of the microcontroller on the robot, and the other connects to the video and command transmission ports of the operator console. Both modules exchange input/output data through the interface protocol.
3. Real-time, precise drive control of the robot chassis: Basic action instructions for the vehicle robot, such as start, brake, acceleration, deceleration, gear shifting, and steering, have been encapsulated in advance, with support for both CAN control and remote control, so the chassis drive can be commanded through the embedded control unit. The data sent by the operator console must correspond one-to-one with the instructions understood by the embedded controller: the console sends the corresponding message to the embedded control unit, which then executes the control on the robot.
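The one-to-one correspondence between console actions and chassis instructions can be sketched as a lookup table plus a framing function. The opcode values, frame layout, and checksum below are hypothetical illustrations; the real command set is defined by the vehicle firmware and its CAN protocol.

```python
# Hypothetical mapping between console actions and chassis command bytes;
# the actual opcodes are defined by the embedded control unit's firmware.
COMMANDS = {
    "start": 0x01,
    "brake": 0x02,
    "accelerate": 0x03,
    "decelerate": 0x04,
    "shift": 0x05,
    "steer": 0x06,
}

def encode_frame(action: str, value: int) -> bytes:
    """Pack one console action into an illustrative 3-byte frame:
    [opcode, value, checksum], with a simple additive checksum."""
    op = COMMANDS[action]        # one-to-one lookup, as described in the text
    value &= 0xFF                # clamp payload to a single byte
    checksum = (op + value) & 0xFF
    return bytes([op, value, checksum])

# Example: a steering command with a payload of 30 (e.g. degrees).
frame = encode_frame("steer", 30)
```

On the receiving side, the embedded control unit would verify the checksum, look the opcode back up, and invoke the corresponding encapsulated chassis action.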

Experimental device and result analysis

Speed control test
Since it is difficult to measure the vehicle speed directly and accurately, the wheel speed is tested instead. After building the circuit, the wheel speed test data are shown in Table 1. From the test results, the performance of the motor drive circuit meets the requirements.

Range/speed measurement module test
The circuit was built, an obstacle was placed in front of the vehicle, and the distance detected by the sensor was read through the serial port; the test data are shown in Table 2.

Overall test
The overall system test is a joint power-up test of the whole system; the relevant test indexes for each module are listed in Table 3, and each module is tested in turn. The network connection delay is less than 30 ms, operations at the operator console are relayed to the robot without error, and the robot performs the corresponding actions.

Module mounting
Each module operates with the console at full functionality; the modules are firmly mounted, and disassembly does not damage them.

Assisted driving algorithm
The accuracy rate of the guidance line displayed on the screen during remote operation is greater than 90%, and the robot can effectively correct deviations while driving, with a correction success rate greater than 95%.

Risk control safety system
The system alarms when positioning is lost; when rushing toward a pedestrian or losing control, it brakes rapidly, alarms, and powers off automatically.

Harsh environment test
The vehicle operates normally under a daily rainfall of 50 mm to 99.9 mm (heavy rain).

Conclusion
This paper designs and builds an unmanned vehicle applicable in several fields, enabling it to complete the required operations in different scenarios, improving the functional unmanned vehicle system, and developing a mobile-network-based remote driving robot that fills a gap in unmanned vehicle operations across several fields, in step with the current rapid development of intelligent unmanned vehicles; the work is both practical and relevant. Compared with current, still-immature unmanned vehicle technology, the remote driving robot designed in this paper has the advantages of economy, high equipment stability, and broad applicability of the solution. At the same time, this project introduces vehicle-based robots as control objects, which is groundbreaking and instructive for both the unmanned robot field and the remote driving field.