High CPU usage even when IDLE #4
Comments
Resolved in the mentioned PR, by toggling aruco_detect via …
Hi @youliangtan, I don't think we should close this issue just yet. Like I mentioned in the other PR, this is because the …
Yes, I think we should keep this open. I briefly spent some time investigating this, but had no luck identifying the main cause of the high CPU load during idle. I suspect it is related to the action server creation and/or the tf_listener message subscription, but I have no concrete answer. Certainly, rewriting this in C++ would significantly boost performance.
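Not from the repo, but a stdlib-only sketch of why a node can burn CPU "doing nothing": an unthrottled spin loop consumes a full core even when idle, while sleeping between iterations (which is what `rospy.Rate.sleep` effectively does) consumes almost none. This is the kind of difference to look for when narrowing down the cause:

```python
import time

def cpu_seconds(step, wall_seconds=0.5):
    """Run `step` in a loop for `wall_seconds` of wall time
    and return the CPU time the loop consumed."""
    wall_start = time.monotonic()
    cpu_start = time.process_time()
    while time.monotonic() - wall_start < wall_seconds:
        step()
    return time.process_time() - cpu_start

# An unthrottled spin loop burns close to a full core with no work to do...
busy = cpu_seconds(lambda: None)
# ...while sleeping between iterations (as a rate-limited loop does) stays near zero.
throttled = cpu_seconds(lambda: time.sleep(0.01))

print(f"busy: {busy:.2f}s CPU, throttled: {throttled:.2f}s CPU")
```

Running this locally shows the busy variant consuming roughly the full half second of CPU while the throttled one stays near zero.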
I'm on a similar journey to reduce CPU usage for …. Some things I've tried to modify on …:

Combining the above, the idle CPU usage for the …. My current working theory is that it is due to using …. Undoing the above changes and setting … renders the …. Unfortunately, I do not have a live system to validate this theory at the moment.
@LKSeng Thanks for the in-depth explanation. I also did the exact same long experiment weeks ago by commenting out each component one by one, and derived the same conclusion. Yes, running in sim will create some overhead due to the subscription of ….

Just one thing to note: when running in sim mode, it is crucial to have the same "real_time_factor" for all test scenarios to effectively compare their performance. This can be achieved by changing "max_step_size" and "realtime_update_rate" in the Gazebo GUI.
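For reference, the target real-time factor Gazebo aims for is just the product of those two physics parameters, so keeping it equal across test scenarios means keeping that product fixed. A quick check with Gazebo's default values:

```python
# Gazebo's target real_time_factor is the product of the physics step size
# and the physics update rate (the values below are Gazebo's defaults):
max_step_size = 0.001          # seconds of simulated time per physics step
realtime_update_rate = 1000.0  # physics steps attempted per wall-clock second

real_time_factor = max_step_size * realtime_update_rate
print(real_time_factor)  # → 1.0: sim time advances at wall-clock speed
```

Halving `realtime_update_rate` without touching `max_step_size` would drop the target factor to 0.5, which is why both knobs have to move together when comparing runs.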
@youliangtan I am today years old to realise that the …. Indeed, my comparisons were done with a final "real_time_factor" of 1 in all cases. I had changed the settings to reflect "real_time_factor = 1" some time back on my local copy and forgot about it. Repeating the comparison on the …
I saw a Discourse thread about high CPU usage in Python nodes related to gazebo/sim_time (albeit in ROS 2) and was immediately reminded of this issue. On my end, on a live robot, the node takes around 0.31% CPU at idle, while during parallel correction it was observed to peak at up to 7% CPU, monitored via ….
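For anyone reproducing these numbers without htop, a stdlib sketch of the same per-process measurement: `os.times()` exposes the user+system CPU time that htop's per-process %CPU column is derived from. The sleeping `work` callback below is a stand-in for an idle node, not code from this repo:

```python
import os
import time

def process_cpu_percent(work, interval=0.5):
    """Approximate this process's CPU usage over `interval` seconds of
    wall time, i.e. the per-process figure htop reports."""
    t0, w0 = os.times(), time.monotonic()
    while time.monotonic() - w0 < interval:
        work()
    t1, w1 = os.times(), time.monotonic()
    cpu = (t1.user - t0.user) + (t1.system - t0.system)
    return 100.0 * cpu / (w1 - w0)

# Idle stand-in: sleeping between iterations should read near 0% CPU.
pct = process_cpu_percent(lambda: time.sleep(0.01))
print(f"{pct:.1f}% CPU while idle")
```

An idle-but-sleeping loop should read close to 0% here; a value persistently near 100% on one core points at an unthrottled loop somewhere in the process.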
@LKSeng There could be some overhead in the communication layer of ROS 2 DDS; this application, which uses ROS 1, might differ in that regard. As for the CPU usage, it might be due to the activation of the third-party package aruco_detect during docking, which is totally understandable if that is the case.
Hi!
Thanks for this project! We have successfully deployed this algorithm long term and it has been working well :). Over time, we have observed that the simple_autodock node seems to be taking a lot of CPU resources even when IDLE. A sample htop screenshot is shown below, taken when the TB3 sim is at the dock.

Since this is just a state machine, could we figure out if there is something in the implementation that is causing the high CPU usage? I have done some preliminary investigation to look for unthrottled loops or high-frequency timer callbacks, but could not find any egregious issues. We are running this on a moderately resource-limited computer (4 cores/8 GB RAM), and sometimes we see other issues because the dock node overloads the system. If you have ideas on how to debug, I will be more than happy to try them out and report back/make a PR for review. FWIW, this issue has been observed on both 18.04/Melodic and 20.04/Noetic.

Best,
Swarooph
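One hedged suggestion for the "ideas on how to debug" question: wrapping the node's main loop in the stdlib profiler shows which callbacks accumulate the most time. `state_machine_tick` below is a hypothetical stand-in for one iteration of the dock node's loop, not a function from this repo:

```python
import cProfile
import io
import pstats
import time

def state_machine_tick():
    """Hypothetical stand-in for one iteration of the dock node's idle loop."""
    time.sleep(0.001)

profiler = cProfile.Profile()
profiler.enable()
for _ in range(200):
    state_machine_tick()
profiler.disable()

# Report the five functions with the most cumulative time; a hot spot here
# points at the loop or callback responsible for the idle CPU load.
buf = io.StringIO()
pstats.Stats(profiler, stream=buf).sort_stats("cumulative").print_stats(5)
report = buf.getvalue()
print(report)
```

On a live node, a sampling profiler attached to the running process would avoid modifying the code, but the cProfile approach above needs nothing beyond the standard library.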