Python Publisher Node for Vehicle Control

Hello everyone,

I have set up a simulation environment in BeamNG/ROS2 with a custom vehicle and a custom track featuring walls on both sides.

Now, I want to implement a simple PID controller that maintains a safe distance between the vehicle and the walls while moving forward at a constant velocity.

My questions:

  1. Which sensor should I use to measure the distance between the vehicle and the walls?
    I saw that the ultrasonic sensor provides the distance (I assume in meters) between the sensor and the wall using the range field:
header:
  stamp:
    sec: 1739373523
    nanosec: 323095157
  frame_id: ego
radiation_type: 0
field_of_view: 0.10000000149011612
min_range: 0.15000000596046448
max_range: 5.0
range: 0.9443764090538025
---

Based on this, I intend to use two ultrasonic sensors, one oriented to the left and one to the right. Am I on the right track?
(Also, we already know about the bug related to the ‘network_visualization’: ‘on’ setting and the LiDAR crash bug.)

  2. How can I control the vehicle using a Python publisher node?
    I’d like to send commands (throttle, brake, and steering) from a Python script to control the vehicle in the simulation.

Any guidance or example scripts would be greatly appreciated!

Hello Daddysublime,

You’re on the right track with using ultrasonic sensors for wall distance measurement. Placing one on each side will give you the data needed for PID-based lateral control.

For vehicle commands, publishing a ROS2 Twist message to /vehicle/cmd allows control over throttle and steering. You can reference beamng_teleop_control (for teleop using Twist) and beamng_agent to structure your implementation.

Next steps: Subscribe to sensor data, implement PID for steering correction, and publish adjusted commands. Let us know if you need help fine-tuning the setup! :rocket:
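
As a starting point, the steering correction could be sketched as plain Python that drops into any ROS2 callback. This is only a sketch: the gains, the sign convention (positive steering = left), and the use of the left/right range difference as the error term are all placeholders to tune for your vehicle.

```python
class WallCenteringPID:
    """Keeps the vehicle centered between two walls using the
    difference of the left and right ultrasonic ranges as the error."""

    def __init__(self, kp=0.5, ki=0.0, kd=0.1):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def step(self, left_range, right_range, dt):
        # error > 0 means the vehicle is closer to the right wall, so
        # we steer left (positive, by our assumed sign convention;
        # flip the sign if your vehicle behaves the other way).
        error = left_range - right_range
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(-1.0, min(1.0, out))  # clamp to the [-1, 1] steering range
```

Feed `step()` with the two latest `range` values from the Range callbacks and publish the result as the steering command.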

Also, you mentioned a bug with network_visualization: 'on' and a LiDAR crash; could you provide more details? If it’s consistent, reporting logs and reproduction steps could help us.


Hello,

Thank you for your detailed explanation! We will start working on the small controller following the proposed guidelines, and we will keep you updated on the project!

Regarding the bug with "network_visualization": "on", we have documented and solved it in this topic:

:link: Custom level not loading in BeamNG.tech (started with terminal)

As for the LiDAR issue, based on our tests it does not crash the integration or the simulation, but it does create problems with the topics. Specifically, when LiDAR is used alongside other sensors, the standard command

ros2 topic echo "topic name"

stops working for every sensor present. We saw in the GitHub repository that LiDAR can cause crashes, so we assumed this might be one of those issues. We managed to solve the problem simply by removing the LiDAR from the scenario.

These issues should not represent obstacles for our project, but let us know if you need more details.

Thanks again for the support!

Hello @Daddysublime,

Thank you for the update! Glad to hear that you were able to resolve the issue with your custom level.

Regarding the LiDAR sensor problem: we’ll look further into the issue and fix it as soon as possible.

Let us know if you encounter any other obstacles. We’re happy to assist!

Best regards,
BeamNG.tech Support Team


Hello everyone,

We are currently working on vehicle control in BeamNG.tech using ROS2. Our setup consists of a single ROS2 node that subscribes to various vehicle sensors (IMU, ultrasonic, time) and applies control using a PID-based approach.

1. At the moment, we are sending commands directly to the vehicle through the game client, as shown in the following part of our Python script:

import sys

import beamngpy as bngpy
from rclpy.node import Node
from sensor_msgs.msg import Imu, Range
# Message packages below are assumed from our workspace layout.
from beamng_msgs.msg import TimeSensor, VehicleControl

NODE_NAME = "vehicle_control_node"

class VehicleControlNode(Node):
    def __init__(self):
        super().__init__(NODE_NAME)

        host = self.declare_parameter("host", "192.168.1.217").value
        port = self.declare_parameter("port", 25252).value
        vehicle_id = self.declare_parameter("vehicle_id", "ego").value

        if not vehicle_id:
            self.get_logger().fatal("No Vehicle ID given, shutting down node.")
            sys.exit(1)

        self.game_client = bngpy.BeamNGpy(host, port)

        try:
            self.game_client.open(listen_ip="*", launch=False, deploy=False)
            self.get_logger().info("Successfully connected to BeamNG.tech.")
        except TimeoutError:
            self.get_logger().error("Could not establish game connection, check whether BeamNG.tech is running.")
            sys.exit(1)

        current_vehicles = self.game_client.get_current_vehicles()
        assert vehicle_id in current_vehicles.keys(), f"No vehicle with id {vehicle_id} exists"

        self.vehicle_client = current_vehicles[vehicle_id]
        try:
            self.vehicle_client.connect(self.game_client)
            self.get_logger().info(f"Successfully connected to vehicle client with id {self.vehicle_client.vid}")
        except TimeoutError:
            self.get_logger().fatal("Could not establish vehicle connection, system exit.")
            sys.exit(1)

        self.subscription = self.create_subscription(VehicleControl, "/control", self.send_control_signal, 10)

        # Sensor Subscriptions
        self.create_subscription(Imu, 'vehicles/ego/sensors/imu0', self.imu_listener_callback, 10)
        self.create_subscription(TimeSensor, 'vehicles/ego/sensors/time0', self.time_listener_callback, 10)
        self.create_subscription(Range, 'vehicles/ego/sensors/ultrasonic_left', self.left_listener_callback, 10)
        self.create_subscription(Range, 'vehicles/ego/sensors/ultrasonic_right', self.right_listener_callback, 10)
        self.create_subscription(Range, 'vehicles/ego/sensors/ultrasonic_front', self.front_listener_callback, 10)
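
For context, the `send_control_signal` callback referenced above looks roughly like the following. The field names on `msg` (`throttle`, `brake`, `steering`) are how we defined our `VehicleControl` message, so treat them as assumptions; the clamping keeps the values in the ranges that BeamNGpy's `Vehicle.control()` expects.

```python
def send_control_signal(self, msg):
    # Clamp each command into the range Vehicle.control() accepts:
    # throttle/brake in [0, 1], steering in [-1, 1].
    clamp = lambda value, lo, hi: max(lo, min(hi, value))
    self.vehicle_client.control(
        throttle=clamp(msg.throttle, 0.0, 1.0),
        brake=clamp(msg.brake, 0.0, 1.0),
        steering=clamp(msg.steering, -1.0, 1.0),
    )
```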

This approach is working quite well in our scenario; however, we were unable to use the method you previously suggested (using /vehicle/cmd for throttle and steering, beamng_agent, and beamng_teleop_control).
Would it be necessary to create an additional node to convert Twist messages into BeamNG control commands? Our goal is to have our node publish commands in the correct format.

2. Additionally, we have encountered the following error in the bridge terminal, but it does not seem to affect functionality. We suspect it might be related to the sensor orientation when not in its default position:

[ERROR] [1740049677.547881833] [vehicles.ego]: Fatal error [ego]: Rotation.as_quat() takes no arguments (1 given)
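
As a possible hint (we are not sure the bridge uses SciPy internally, so this is only a guess on our side): older SciPy versions of `Rotation.as_quat()` accept no arguments, while newer versions added a `canonical` keyword, and calling e.g. `rot.as_quat(True)` against an old SciPy raises exactly "as_quat() takes no arguments (1 given)". The argument-free call works on every version:

```python
from scipy.spatial.transform import Rotation

# A 90-degree rotation about z; as_quat() with no arguments is
# compatible with both old and new SciPy releases.
rot = Rotation.from_euler("z", 90, degrees=True)
quat = rot.as_quat()  # returned in [x, y, z, w] order
print(quat)
```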

Do you have any insights on what might be causing this?

3. Finally, we have a question regarding vehicle colors in a scenario. Currently, our test scenario defines the vehicle color like this:

"name": "ego",  
"model": "Desert_Buggy_Light",  
"color": "white"

However, in the info_Desert_Buggy.json file, the vehicle has the following configuration:

"Configuration": "Desert Buggy",  
"Drivetrain": "4WD",  
"defaultPaintName1": "Black",  
"defaultPaintName2": "Red"

where defaultPaintName1 and defaultPaintName2 define the primary and secondary colors, which are associated with specific materials as described in the Vehicle Tutorial of BeamNG (e.g., chassis and rims).
How can we make the scenario use this predefined color combination instead of setting a single "color" value?

Any example or clarification would be greatly appreciated!

Thanks in advance for your help, have a good day!


Hello @Kiriko, regarding issue #1 (Vehicle Control via ROS2 /vehicle/cmd):
You don’t necessarily need to convert the message type from VehicleControl to Twist unless it’s required for another part of your setup. If your approach of directly sending commands to the game client is working well, you can continue using it.
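
That said, if you ever do want to bridge Twist into BeamNG control inputs, the mapping itself is small. The conventions below are assumptions, not a fixed API: `linear.x` in [-1, 1] acting as throttle (negative values braking) and `angular.z` in [-1, 1] as steering.

```python
def twist_to_control(linear_x: float, angular_z: float) -> dict:
    """Map Twist-style fields to BeamNG control inputs (assumed conventions)."""
    clamp = lambda value, lo, hi: max(lo, min(hi, value))
    return {
        "throttle": clamp(linear_x, 0.0, 1.0),   # forward motion
        "brake": clamp(-linear_x, 0.0, 1.0),     # negative linear.x brakes
        "steering": clamp(angular_z, -1.0, 1.0),
    }
```

Inside a Twist subscriber callback this reduces to `self.vehicle_client.control(**twist_to_control(msg.linear.x, msg.angular.z))`.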

For issue #2 (Rotation.as_quat() error):
Could you provide the steps to reproduce this issue?

For issue #3 (vehicle colors in scenarios):
I’ll ask the developers and get back to you with an update.

Thanks!


The issue seems to occur when using a sensor with a left or right orientation, as in the test scenario below:

{
  "version": 1.0,
  "level": "gridmap_legacy",
  "name": "example_tech_ground",
  "mode": "None",
  "network_visualization": "off",
  "vehicles": [
    {
      "name": "ego",
      "model": "Desert_Buggy",
      "pos": [490.680, 418.68, 1.100],
      "rot": [0.0000, 0.0000, 0.9000, 0.0100],
      "sensors": [
        {
          "name": "time0",
          "type": "Timer"
        },
        {
          "name": "imu0",
          "type": "IMU",
          "pos": [0.0, 0.0, 0.2],
          "is_send_immediately": true
        },
        {
          "name": "left_cam",
          "type": "Camera.default",
          "pos": [-0.074246, 0.052684, 0.4],
          "dir": [-1, 0, 0],
          "resolution": [1080, 720],
          "requested_update_time": 0.02
        }
      ]
    }
  ]
}

In this case, the left_cam sensor is oriented to the left (dir: [-1, 0, 0]). We suspect that the issue might be related to how the sensor direction is handled when it is not in its default orientation. The vehicle model used is “Desert_Buggy,” which we developed.

The problem doesn’t seem to be a real obstacle: the simulation doesn’t stop and the sensors appear to work correctly, so we don’t have any further information about it. We also saw that the problem occurs with ultrasonic sensors as well, not just with the cameras.

Let me know if you need any clarifications, and please keep us posted on the color problem!

Hello @Kiriko, regarding issue #3, you can specify both primary and secondary colors in your scenario by using the "color2" attribute along with "color". BeamNGpy supports multiple color attributes:

  • "color": The primary vehicle color.
  • "color2": The secondary vehicle color.
  • "color3": The tertiary vehicle color.

Since the info_Desert_Buggy.json file defines "defaultPaintName1": "Black" (primary) and "defaultPaintName2": "Red" (secondary), you can update your scenario definition like this:

{
    "name": "ego",
    "model": "Desert_Buggy_Light",
    "color": "Black",
    "color2": "Red"
}

This will ensure that the predefined primary and secondary colors from the vehicle configuration are reflected in your scenario.

For reference, you can check the BeamNGpy documentation.

Hope this helps! Have a great day!

Hello @Kiriko,

Regarding issue #2, I tested your scenario using the etk800 vehicle on the gridmap_v2 map. Everything appears to be functioning correctly, with no errors or warnings encountered.

Let me know if you need any further verification or additional details.

Best regards,