A ROS1 self-driving car for the "Autonomous Driving Competition". The vehicle itself is a DonkeyCar with RaspberryPi and Raspberry Pi Camera.
rosRoboCar for the "Autonomous Driving Competition"

Competition task: a real small car has to complete at least 3 loops in a row without human intervention (either a long lane-keeping loop or a loop with obstacle avoidance).

Robocar uses ROS1 Noetic and DonkeyCar. For testing and training it can be run across multiple machines: the car publishes camera images, and a PC subscribes to visualize and process them.

Current configuration

  • Sensors:
    • RaspberryPi Camera
  • Motors (receive only PWM signals):
    • Throttle
    • Steering

Run on PC

Currently, without a car, everything can be run on a PC from a recorded bagfile.

  1. Install on PC

  2. Download bagfiles directory to robocar_ws/src/path_from_image

  3. Install tensorflow 2.2 (not the latest) with pip install tensorflow==2.2

  4. For GPU acceleration also install CUDA Toolkit 10.1 and cuDNN. Unzip cuDNN to a suitable location and copy its files into the corresponding folders of NVIDIA GPU Computing Toolkit\CUDA\v10.1. Check with python -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"

  5. Some further dependencies, e.g. scikit-learn, may also need to be installed.

  6. Make all .py files executable, e.g. find src -name '*.py' -exec chmod +x {} + (some files may have been committed from a Windows PC without the executable bit).

  7. Add an empty CATKIN_IGNORE file to raspicam_node (e.g. touch src/raspicam_node/CATKIN_IGNORE) so it will not be built.

  8. Launch in robocar_ws

# make sure catkin_make was executed prior
source devel/setup.bash

# launch bag-play, image-view and rviz
# to change bag-file just edit src/donkeycar/launch/bagdonkey.launch file
roslaunch donkeycar bagdonkey.launch

# in separate terminal 
source devel/setup.bash
# and launch lane detection, which needs GPU acceleration
roslaunch donkeycar lanedetect.launch

To be updated later...

Steps to make a car

  1. Installation on PC and on a car

    • Resolve connectivity problems if there are any.
  2. (optional) Running ROS across multiple machines

  3. Camera calibration. All data should be added to robocar_ws/src/donkeycar/config/camera_info.yaml

  4. Steering and throttle calibration. All channels and PWM values should be added to robocar_ws/src/donkey_actuator/config/servos.yaml

  5. Setup joystick control

  6. Record bagfiles (e.g. with rosbag record while driving manually)

Lane Keeping Pipeline

  • !!! On the competition site resolve all connectivity problems
  • publish TF messages from URDF model
    • URDF model of a car is in src/donkeycar/config/car.xacro
    • add optical_camera_link to the model, because optical frames use a different axis orientation
    • add joint_state_publisher and robot_state_publisher nodes to the launch file to publish static TF messages, so that the transformation between base_footprint and camera_link_optical can be looked up
  • send image from camera
    • raspicam_node publishes CompressedImage and CameraInfo messages at a 30 Hz rate; with a flag it can also publish raw Image messages
  • image processing
    • make undistortion run faster than 20 Hz, because it slows down everything else
  • Lane detection, 2 variants
    • Neural network
      • get lots of images (bag)
      • train network
    • Traditional Computer Vision approach
      • get matrices for the transformation into 'top-view'
      • make a binary threshold
      • with a sliding window get lane points, faster than 25 Hz
      • handle frames where the lanes are not fully visible in the image
      • fit polynomials
      • draw a filled polygon
      • get middle line points and draw them
      • unwarp the polygon image
      • blend the filled polygon with the original image using weights
  • publish the image with the lane polygon and middle line points in the camera_link_optical frame
  • transform the middle line points from camera_link_optical to base_footprint and publish a Path message
  • controller_node gets the first point from the Path message and converts it into a Twist message (linear.x and angular.z)
  • donkey_actuator_node transforms Twist message to PWM signals
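The optical link mentioned above exists because ROS optical frames (z forward, x right, y down) use a different axis convention than body frames (x forward, y left, z up, per REP 103). A minimal sketch of that fixed rotation, with a placeholder camera offset standing in for the real TF lookup:

```python
import numpy as np

# Optical frame: z forward, x right, y down.
# Body frame (base_footprint): x forward, y left, z up (REP 103).
# Columns map optical x, y, z axes into body coordinates.
R_BODY_FROM_OPTICAL = np.array([[ 0.0,  0.0, 1.0],
                                [-1.0,  0.0, 0.0],
                                [ 0.0, -1.0, 0.0]])

def optical_to_body(p_optical, t_camera_in_body):
    # Rotate into the body orientation, then offset by the camera's
    # position in the body frame (placeholder for the TF transform).
    return R_BODY_FROM_OPTICAL @ np.asarray(p_optical) + np.asarray(t_camera_in_body)
```

In the actual nodes this transform is obtained from tf2 rather than hard-coded; the sketch only shows why the extra optical link is needed.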
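The sliding-window step of the traditional approach can be sketched as follows. This is a single-lane toy version with hypothetical window parameters; the real node tracks both lanes and must stay above 25 Hz:

```python
import numpy as np

def sliding_window_lane(binary, n_windows=9, margin=40, min_pix=30):
    """Collect lane-pixel centers with a sliding window.

    binary: 2-D array of 0/1 (thresholded 'top-view' image).
    Returns (xs, ys) of window centers that contained enough pixels.
    """
    h, w = binary.shape
    # Start where the lower half of the image has the most "on" pixels
    histogram = binary[h // 2:].sum(axis=0)
    x_current = int(np.argmax(histogram))
    win_h = h // n_windows
    nonzero_y, nonzero_x = binary.nonzero()
    xs, ys = [], []
    for i in range(n_windows):
        y_low, y_high = h - (i + 1) * win_h, h - i * win_h
        inside = ((nonzero_y >= y_low) & (nonzero_y < y_high) &
                  (nonzero_x >= x_current - margin) &
                  (nonzero_x < x_current + margin))
        if inside.sum() >= min_pix:
            # Re-center the next window on the detected pixels
            x_current = int(nonzero_x[inside].mean())
            xs.append(x_current)
            ys.append((y_low + y_high) // 2)
    return np.array(xs), np.array(ys)

# The "fit polynomials" step is then e.g.: coeffs = np.polyfit(ys, xs, 2)
```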
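The last two steps (Path point to Twist, Twist to PWM) might look roughly like this; the pure-pursuit rule, speed, and PWM endpoints are placeholders, not the controller's actual gains or the values in servos.yaml:

```python
def point_to_twist(x, y, speed=0.5):
    """Map a lookahead point in base_footprint (x forward, y left)
    to (linear.x, angular.z) with a pure-pursuit style rule."""
    # Curvature of the arc through the origin and (x, y) is 2y / d^2,
    # so angular.z = linear.x * curvature.
    d2 = x * x + y * y
    return speed, (speed * 2.0 * y / d2 if d2 > 0 else 0.0)

def twist_to_pwm(value, low=-1.0, high=1.0, pwm_low=220, pwm_high=420):
    """Clamp a normalized command and map it linearly onto the servo's
    PWM range (endpoints are placeholders for servos.yaml values)."""
    value = max(low, min(high, value))
    return int(round(pwm_low + (value - low) * (pwm_high - pwm_low) / (high - low)))
```

A point straight ahead yields zero angular velocity; a point to the left yields a positive (left) turn, which the second function then scales into the servo's calibrated pulse range.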

Nodes and topics

(rqt_graph diagram of nodes and topics)

TF frames tree

(TF frames tree diagram)
