I can't get Docker to work (on Ubuntu 20.04). The curl install does not work, but the "sudo apt install..." method did. I had to add "sudo" to the "./start.sh" command. It downloaded the Docker container, but I can't run the "roslaunch ... ...". It complains about networking. Someone had asked about that on the ROS Agriculture Slack, but that has all been deleted. Even if that worked, I see there are apparently two different ways to start a Docker package...
So I decided to go back to what I know. I loaded VirtualBox on Ubuntu 20.04 and created an Ubuntu 16.04 virtual machine. I am loading ROS Kinetic on that. Then I will clone whichever repository seems appropriate. Maybe I will clone the Ubuntu/Kinetic machine and have both simulators loaded so I can choose between them.
*Thread Reply:* Q: Re: networking error,
$ docker run -p 6080:80 --network="host" --device=/dev/input/js0 rosagriculture/lawn_tractor
I had trouble with the above statement and removed the '--network="host"' portion
Q: Are you following someone's instructions? It would be interesting to see how they have documented their steps.
*Thread Reply:* hi @JeffS
If you would like to use Docker as a non-root user, you should now consider adding your user to the “docker” group with something like:
sudo usermod -aG docker <your-user>
Remember to log out and back in for this to take effect!
See the screencap or the link..
I got VirtualBox loaded on 20.04. I created an image with Ubuntu 16.04 and ROS Kinetic. I will follow the Docker instructions again. Here: https://github.com/ros-agriculture/lawn_tractor/blob/master/simulator.md I will see if these instructions work under 16.04. (Mainly as an experiment.)
But, I don't plan to put much effort into Docker.
I ordered a new hard drive yesterday. I plan to swap the 20.04/Noetic/Foxy hard drive in my robot laptop with the new one and build a fresh install of 16.04/Kinetic. Then clone the actual repositories and run it without virtual machines or Docker. I'll let you know when I have that loaded.
I loaded Docker from this page: https://github.com/ros-agriculture/lawn_tractor/blob/master/simulator.md on 16.04, under VirtualBox, under 20.04.
The curl command works under 16.04, but did not work under 20.04. If you install with curl it gives you the message about:
sudo usermod -aG docker <your-user>
I didn't see that when I loaded using the "sudo apt install" method. I may have just missed it.
I still had to add sudo to the ./start.sh command. Looks like somebody did something cute inside the start.sh script.
When I did the roslaunch... it took about 10 minutes before ROS time started counting and the tractor turned red. Then I could select a destination and it would drive. But all of the other windows were running really slowly and the whole thing was unusable.
So I loaded the "Local" version from the install page (with 16.04 under VirtualBox under 20.04). That seemed to work and behave much better.
So tomorrow when I get my new drive I will load that and hopefully have a native install that works well enough to do development.
I just started it again. It still takes 3-1/2 minutes to start up (the local install under the virtual machine). So hopefully a fresh install with native Ubuntu 16.04 (on an SSD) will improve it.
Well, at least we are all near the same place. I'm working with the ROS-in-Spanish community; we are working on translating the different types of install scripts. There are a lot of options, but one thing that makes all this very confusing is figuring out which problems are yours alone and which are common to everybody. The trouble you had with the installation is an example of that. I'm working on documenting all types of installations..
My new (cheap, 250GB) SSD showed up. In two hours I had it installed and loaded Ubuntu 16.04, ROS Kinetic, and the lawn_tractor simulator (local version).
It works. It boots from power-off in 50 seconds (due to SSD). When I run
roslaunch lawn_tractor_sim lawn_tractor_sim.launch
it takes about 15 seconds until it is driveable.
I'm pretty happy with this setup. My 20.04 setup is slow. Everything about it is slow. I may have to downgrade to something else if I can't figure out what is wrong with it...
*Thread Reply:* So you are running ROS Kinetic with Ubuntu 16.04. Will that be your final setup? Because if that's right we can begin loading it into the spreadsheet..
*Thread Reply:* Juan said: "So you are running ROS Kinetic with Ubuntu 16.04. Will that be your final setup? Because if that's right we can begin loading it into the spreadsheet." (I don't know how to do a quote...)
That is my final setup for today. I'm not really interested in updating the spreadsheet or even being in the spreadsheet.
*Thread Reply:* I am running 16.04/Kinetic. I don't know about "final". At the moment I don't have a need to change so I guess I will stay where I'm at until there is a reason to switch. I'll use whatever works reliably and has the most support.
[Does anybody know how to prevent Slack from appending messages together? Or do I just have to wait 10 minutes before posting another topic?]
I have a thought on joysticks. Both of my computers install the joystick as js1 and not js0. So I always have to hack the config files.
What if a guy were to create an alias for a particular computer? Maybe alias js0 (or js1) to a name, say, JOYSTICKALIAS. That could be done in the .bashrc file. Then edit all of the config files to use JOYSTICKALIAS instead of js0 (or js1).
*Thread Reply:* Naaa, I hate Slack, it makes me feel old.. but it seems to be the only tool possible.. 🤷♂️
*Thread Reply:* Oh, I have a new thought. But look... it appends it to the previous message...
*Thread Reply:* Of course, now it didn't. Maybe it is context sensitive. This was in a thread. Let me go try it in the main channel.
*Thread Reply:* for me it is incomprehensible, and the only way to see what happens is to follow the threads of the conversations; but doing that, I have several times lost the context and order of the conversations
Oh, the alias thing doesn't fix my problems. Alias seems to work for commands but not for variables...
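For what it's worth, the reason the alias doesn't work: bash aliases only expand when used in command position, not inside file paths or config values. An exported environment variable works anywhere, including in roslaunch files via the $(env ...) substitution. A sketch (the variable name here is made up):

```shell
# In ~/.bashrc: export a variable instead of an alias.
# Unlike an alias, this can be referenced in scripts and launch files.
export JOYSTICK_DEV=/dev/input/js1
# A launch file could then use: <param name="dev" value="$(env JOYSTICK_DEV)"/>
echo "$JOYSTICK_DEV"
```

This still doesn't fix the js0/js1 shuffle by itself, but it gives all the config files one place to change.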
A bigger problem is that my joystick shows up as js0 or js1, depending if the joystick is plugged in when I turn the computer on.
I should look at udev rules to identify my Logitech gamepad. Then create a device name like "JOYSTICK". Then use that as /dev/input/JOYSTICK instead of /dev/input/js0.
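A sketch of what such a udev rule might look like. The vendor ID 046d is Logitech's; the product ID and the symlink name are placeholders I made up, so check `lsusb` for the real vendor:product pair:

```
# /etc/udev/rules.d/99-joystick.rules (sketch -- product ID c21d is a
# placeholder; get the real pair from `lsusb`)
# Creates /dev/input/joystick whether the kernel assigns js0 or js1.
KERNEL=="js*", SUBSYSTEM=="input", ATTRS{idVendor}=="046d", ATTRS{idProduct}=="c21d", SYMLINK+="input/joystick"
```

After `sudo udevadm control --reload-rules` and replugging the gamepad, /dev/input/joystick should be a stable name the config files can use.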
*Thread Reply:* You have started a new message. I am replying to the thread.
This is a totally new thought... Does it get appended or does it start a new message?
*Thread Reply:* Well that is more Slack nonsense. If you don't check the box it is only appended to the thread. A user will be notified if "they are following the thread" (whatever that means). But if you check the box then it gets added (as a duplicate) to the main channel and everybody gets notified. I personally would not use threads since they just add to the confusion, but that is the way Slack is written. But if someone starts a thread then I will reply to the thread if I want to answer them.
*Thread Reply:* The bottom line is... Slack is free and it mostly works. So I am willing to put up with it.
Now if I wait 10 minutes (or maybe 3 or 5 minutes) it will start a new message. Or if somebody else posts before I do.
So the moral of the story is... You just have to guess at what the results will be. I'm assuming the people that write the Slack software don't actually use it...
Just a suggestion:
I'm looking at this file: https://docs.google.com/document/d/1uXMrZL0K-PJ7fLSx8w51RZotoYZD70tybRhmNpjhyyU/edit Named: Using Matt's waypoint publisher
It would be real handy to have a synopsis at the top. Maybe something like this:
Waypoint Publisher This will read waypoints from a file and send them to ... (movebase_flex?) to drive a path. This consists of two python scripts and a text file containing a list of waypoints.
I extracted this evaluation from digging through your steps.
I assume if I build the python scripts and create a waypoint file that I should be able to follow a path without using Docker or VS-Code or Terminator. But I will have to try it to see...
*Thread Reply:* I can add it or you can add it; You should have edit rights. Re: "without using Docker...", I assume so. It is simply meant to help people go fast; Not a hard requirement.
Why does Slackbot keep suggesting that I load the GitHub app and the Google Drive Link app? Do I need those? What would they do to help me?
fyi, looks interesting to me - I've spoken to him before about RTK....https://deepsouthrobotics.com/2019/08/23/anatomy-of-a-huge-self-driving-mower/
I got my tractor to follow the path as Al described in his instructions. I read through his instructions and pulled out the pieces I thought I needed.
I created a package named "jeff" to put my new files in. So I put the files here:
~/catkin_ws/src/jeff/python/runmission
The 3 files are:
generated_points.txt, path_follower_node.py, path_publisher.py
I noticed several things. I have to execute "rosrun jeff path_publisher.py"
from the runmission directory (~/catkin_ws/src/jeff/python/runmission). Otherwise path_publisher does not find the generated_points.txt file. I will have to figure that out.
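A likely cause: the script probably opens "generated_points.txt" with a bare relative path, which is resolved against the current working directory rather than the script's own directory. A minimal sketch of a fix (assuming that is indeed what path_publisher.py does; the helper name is my own):

```python
import os

def data_file_path(filename):
    """Resolve a data file relative to this script's own directory,
    so the node works no matter which directory rosrun is called from."""
    script_dir = os.path.dirname(os.path.abspath(__file__))
    return os.path.join(script_dir, filename)

# Hypothetical usage inside path_publisher.py:
# points_file = data_file_path('generated_points.txt')
```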
It does not start moving on the first path generated. It starts moving when the second path is published. It generates more paths while it is moving, but they are ignored. When it finishes I get "GOAL Reached!" and it stops. It will not start again until I shut down the simulator launch file and restart it. So I will have to figure that out.
It doesn't drive very smoothly and puts out warnings. But it goes where it is supposed to. That is probably tuning...
I also notice the packages I downloaded using git clone don't show up as packages. (when I do rospack find ...) So I should look at that.
But since I got it to move I now have things to look at...
*Thread Reply:* "...It will not start again until I shut down the simulator launch file and restart it...", I have the same experience. I'm guessing there are some path cancel commands needed. "...It doesn't drive very smoothly...", I have some notes in this file. Basically I think I learned the system wants a point at least every meter.
*Thread Reply:* Just posting this here for future reference https://github.com/carla-simulator/ros-bridge/blob/master/carla_ackermann_control/README.md
*Thread Reply:* This is newer than what I found. And it is not an endless derivation of this original I found: https://github.com/jbpassot/ackermann_vehicle This is the newest version of that I found: https://gitlab.yonohub.com/mohamedahmed/picar/tree/master/src Which is the one with red wheels.
FYI: I am not suggesting that anyone else go down the Gazebo path. Maybe if I get something usable...
That is not using cmd_vel. It is using ackermann_cmd (as shown in the image).
BTW, if you try to run lawn_tractor/lawn_tractor_navigation/launch/send_goals.launch you will find that the file "follow_gps_waypoints" is missing. I searched the original Slack channel and did not find it.
*Thread Reply:* I thought maybe the docker image did not match the main repository, but they do match and it is missing in both places.
I did not try to upgrade the Docker image. I see there is a page that suggests that may be possible: https://hub.docker.com/u/rosagriculture
I don't know if it is newer or not.
*Thread Reply:* I'm not sure it's going to work but I'm uploading everything I have so you can see it.
*Thread Reply:* Maybe I found the solution. This video points to gps_goal_server: https://www.youtube.com/watch?v=lTqUMAUM1cc https://github.com/ros-agriculture/gps_goal_server I'll check it out...
*Thread Reply:* Hey, this is a perfect time to use the video index I am creating:
12052020 Lawn tractor meeting https://www.youtube.com/watch?v=lTqUMAUM1cc
00:00 Juan discusses documentation
12:30 Al talking about adding a satellite image to RVIZ.
15:20 Jonathan announces a video presentation
20:20 Doug has a new Oculus Quest 2 headset. Mentions microcontrollers.
23:55 Matt asks if people are having problems with Docker networking. Lots of discussion about Docker and the simulator.
41:10 Al requests a model for an Ackermann vehicle.
45:50 Jeff asks about Matt's drone.
*Thread Reply:* I didn't hear a mention of gps_goal_server in the video. But there is a link in the comments below the video.
Might be some interesting code here: https://hackaday.io/project/25406-wild-thumper-based-ros-robot I cannot open the git repository referenced from work.
*Thread Reply:* He does have a gps_follow_waypoints.py script. Looks like it reads waypoints from a file, sends them to move_base, and waits for each one to complete.
*Thread Reply:* $ git clone https://defiant.homedns.org/git/ros_wild_thumper.git
Al, are any of these towns close to you in Texas?
jeff@jeff-hp-pavilion:~$ grep -i tx rtk2gosourcetable.txt
STR;AUSDUVAL;Austin, TX;RTCM 3.2;1005(1),1074(1),1084(1),1124(1),1230(1);0;GPS+GLO+BDS;SNIP;USA;30.30;-97.73;1;0;sNTRIP;none;N;N;0;none;
STR;AUSLOFTTX;Austin, Tx;RTCM 3.2;1005(1),1074(1),1077(1),1084(1),1087(1),1094(1),1097(1),1124(1),1127(1),1230(1),4072(1);;GPS+GLO+GAL+BDS;SNIP;USA;30.28;-97.69;1;0;sNTRIP;none;N;N;10860;none;
STR;EPCWID-Fabens;Fabens, TX;RTCM 3.1;1004(1),1006(10),1008(10),1012(1),1033(10),4091(10);2;GPS+GLO;SNIP;USA;31.50;-106.16;1;0;sNTRIP;none;N;N;1940;none;
STR;PROVEN;Wichita Falls, Tx;RTCM 3.1;1004(1),1006(10),1008(10),1012(1),1033(10),4091(10);2;GPS+GLO;SNIP;USA;33.89;-98.53;1;0;sNTRIP;none;N;N;2540;;
STR;VN1-legacy;Dallas, Tx;RTCM 3.2;1004(1),1005(1),1012(1),1033(1),1230(1);2;GPS+GLO;SNIP;USA;32.89;-96.70;1;0;sNTRIP;none;N;N;3140;none;
STR;VN1;Dallas, Tx;RTCM 3.2;1005(1),1077(1),1087(1),1097(1),1127(1),1230(1);;GPS+GLO+GAL+BDS;SNIP;USA;32.89;-96.70;1;0;sNTRIP;none;N;N;7120;;
STR;VN1v2;Dallas, Tx;RTCM 2;;0;;SNIP;USA;32.89;-96.70;1;0;sNTRIP;none;N;N;0;none;
STR;ZUPT6818;Houston, Tx;RTCM 3.2;1005(1),1077(1),1087(1),1097(1),1127(1);0;GPS+GLO+GAL+BDS;SNIP;USA;29.94;-95.53;1;0;sNTRIP;none;N;N;0;none;
I pasted the sourcetable from here into a file and did grep. http://www.rtk2go.com:2101/
Nothing seems to show up in PA or OH.
*Thread Reply:* Checking now....trying to figure out the distance. Austin will be West and Houston will be East. (link)
*Thread Reply:* It is fascinating that I was just looking at RTK reviews of the ublox devices:
*Thread Reply:* Since I don't know how the position accuracy degrades with distance from a base station, I always think it would be good to experiment with that. But I don't have a convenient way to do that. But it just occurred to me that I could tell the CORS system that I am near any random base station and it might send me the data from that location. I might try that if I get around to it. I can just dump the binary data to my screen and it has a few readable messages in it. One message is the name/location of the base station. So I could verify that I can actually pick and choose base station sites.
*Thread Reply:* This does seem to work. I can get data from Cologne, MN (32.5 miles) or Hollywood, MN (37.2 miles) or Brownton, MN (57.5 miles). So I can click on the map and it tells me where the base station is. I'll have to figure out how to test this. I'm using this map: http://mncors.dot.state.mn.us/Map/SensorMap.aspx
Oh, if you put in "penn" it shows a couple in Pennsylvania. Are those close to you?
jeff@jeff-hp-pavilion:~$ grep -i penn rtk2gosourcetable.txt
STR;Base9557;Is near: Doylestown, Pennsylvania;RTCM 3;PENDING;;;SNIP;USA;0.00;0.00;1;0;sNTRIP;none;N;N;0;none;
STR;MAMCCMR1;Is near: Doylestown, Pennsylvania;RTCM 3;PENDING;;;SNIP;USA;0.00;0.00;1;0;sNTRIP;none;N;N;0;none;
I found the path generator code. It is under Juan's files. https://github.com/rje1974/Proyecto-Pochito/blob/master/software/base/tractor/catkin_ws/src/pochito/scripts/path_generator.py It was referenced in this video: https://www.youtube.com/watch?v=wnXnZdxiRt8 at around 23:50.
*Thread Reply:* I just got "path_planner.py" to run. And it does work.
I loaded it and ran it under python3, so it does work under python3 also.
I copied it into my home directory. I had to load "dubins". pip3 install dubins
I had to change the path for the "waypoint.txt" input file. I just made up a random waypoint file. And ran it. I was surprised that it created a graph for me.
It also created a "generated_points.txt" output file.
So, @Al Jones you can modify paths all you want...
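For anyone else trying this: I couldn't find the input file documented, but from the generated_points.txt output below, the format appears to be one "x y heading" triple per line (heading in radians). A hypothetical waypoint.txt along those lines:

```
100.0 0.0 0.0
110.0 0.0 0.0
110.0 2.0 3.14
100.0 2.0 3.14
```

These values are my guess at an input that would produce an out-and-back path like the output shown; the real file I used was just random points.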
*Thread Reply:* This is the output file.
jeff@jeff-hp-pavilion:~$ cat generated_points.txt
100.0 0.0 0.0
101.0 0.0 0.0
102.0 0.0 0.0
103.0 0.0 0.0
104.0 0.0 0.0
105.0 0.0 0.0
106.0 0.0 0.0
107.0 0.0 0.0
108.0 0.0 0.0
109.0 0.0 0.0
110.0 0.0 0.0
110.95885107720841 -0.24483487621925448 5.783185307179586
111.78366617911476 -0.8053070937595033 5.836654403443426
112.75423179462344 -0.9985475938345956 0.05346909626383933
113.6986276847929 -0.7028181439855021 0.5534690962638393
114.38563279875153 0.009476372539585998 1.0534690962638393
114.64704432446128 0.963941415955595 1.5534690962638393
114.4188596033827 1.9268906555021608 2.0534690962638393
113.75694621345619 2.662560533309475 2.5534690962638393
112.8233636375723 2.990833405906349 3.0534690962638393
111.84668545013798 2.831336625164389 3.5534690962638393
111.01861614211509 2.2769477785279775 3.6741235767316827
110.06823567732195 2.00105563109248 3.1741235767316827
110.0 2.0 3.14
109.00000000134673 2.000002030529716 3.141593161222221
108.00000000134688 2.000001522897288 3.141593161222221
107.000000001347 2.0000010152648597 3.141593161222221
106.00000000134713 2.000000507632432 3.141593161222221
105.00000000134726 2.000000000000004 3.141593161222221
104.00000000134739 1.999999492367576 3.141593161222221
103.00000000134752 1.999998984735148 3.141593161222221
102.00000000134764 1.99999847710272 3.141593161222221
101.00000000134777 1.999997969470292 3.141593161222221
100.0000000026945 1.999999999995715 3.1400000013472558
jeff@jeff-hp-pavilion:~$
*Thread Reply:* nice. I am a bit lost; when I try to come back to all this, it gets complicated.. Today I am going to try again. I would like to get to Friday with this ready, plus a short text where I explain what my goals for the robot are. Your question left me wondering: what did we think we were going to achieve, and were we confused by the goals that Matt had said were possible?
*Thread Reply:* Looking forward to working on it with you guys on Friday.
*Thread Reply:* Al and Jeff, I'm in for the meeting, unless it gets complicated with my sister-in-law, who is ill; I may have to help my brother.
I think last week's meeting was very good, although we got tangled up with Jitsi.. How about extending the meeting a little so we can work with the simulators together?
With all these Spanish translations I have a lot of work and have not been able to dedicate myself to the simulators. I'm also going to start reporting in Slack when I'm working in the simulator, in case anyone wants to do something together.
*Thread Reply:* My idea is that the robot travels a path where it stops several times; each time that happens it takes a reading and continues its journey until it ends where it started. Building on those steps, I plan to advance to a robot that does recognition over long distances and large areas autonomously. For those goals I need to bring to the robot the following points, which I have achieved in the simulator.
I achieved those things in the simulator, and if it weren't for my tractor being broken, I could be trying to reproduce them. Since I can't fix the tractor for now, the points I am looking to work on are those in the worksheet we shared with Al, to which last week I added making movements with Ackermann commands.
*Thread Reply:* Keep in mind this expects your tractor to be at the starting location and pointed in the right direction. Also, the point spacing may need to match the speed of the vehicle, so 1 meter spacing may be too far for it to keep up.
*Thread Reply:* Yeah the simulator does not end very elegantly so I have to restart it each time; It always starts at that same spot
*Thread Reply:* I don't know what you mean by "simulator does not end very elegantly". I would think it should stop at the end of the path you defined.
But after it stops you can do a 2D Nav Goal to have it drive back to the start. But I don't think the "2D Nav Goal Rotational Tolerance" is set to a reasonable value. So it is not pointing in the correct direction when it stops.
*Thread Reply:* re: not ending elegantly, I can't just run another set of waypoints
*Thread Reply:* I also have the same problem with needing to restart the sim after each time
*Thread Reply:* Doesn't Move Base Flex have a "cancel navigation" function? I don't recall if it is a topic or a service. But I think that would reset the state machine so you can send it the next path.
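A sketch of what that might look like from the command line: if move_base_flex exposes exe_path as an actionlib action, publishing an empty GoalID on its cancel topic should abort the current path. The topic name below is an assumption based on the /move_base_flex/exe_path namespace mentioned in this thread, so check `rostopic list` for the real one:

```shell
# Abort the in-progress exe_path action by cancelling all of its goals
# (an empty GoalID cancels everything on an actionlib cancel topic).
rostopic pub --once /move_base_flex/exe_path/cancel actionlib_msgs/GoalID '{}'
```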
*Thread Reply:* I have an instruction file in the Google folder called, "Creating a waypoint plan using path_generator.py". It has a few different path images created by a few different waypoint files. Worth checking out IMO.
*Thread Reply:* Another way to do a turn on a square is to pick a point early, then specify the next point the same distance up the next line. I thought it would be one "radius" distance, but it looks like that may not be completely right. The first image looks like it did an arc, a line, and another arc. But it works. If you push it too far it throws in an extra circle, so someone would have to figure out the "magic number". Plus, I am pushing this back to the main messages, since it is buried in a thread from 21 days ago.
This is really random..
The other day we were talking about how to find the folders of the files we were looking for. It is a problem I have had several times. This is one way: you search, and when you select the result with the mouse you see a bar with the path to the file; you click on the directory and it takes you there.
This tells how to display the stage simulator screen when running the simulator: @16:00 https://www.youtube.com/watch?v=-JQ77rXgAoc
Nice, when you say left-right speed.. is that from the wheel encoders?
*Thread Reply:* Yes, these are the sensors to detect speed mounted by the rear wheel axle.
@Al Jones Sorry, but in my drunken stupor I let on that people have moved beyond the ROS Agriculture lawn tractor meetings... But I could not help myself...
*Thread Reply:* I'm happy for anyone to join. I updated the meeting notice. Feel free to forward it.
So if I get requests for this, should I forward them on? In particular I am thinking of Jono (Jonathan, and his brother) and Doug and Bob (who are both local to Minneapolis).
So, today, I had a burst of energy. During last Tuesday's HBRobotics Zoom meeting I started thinking about my lawn tractor power steering unit. The first problem was I could not find my power steering motor. Later I found it in my garage and gave it some attention.

I was thinking I would make a chain drive to run my existing steering wheel shaft, adding a 2:1 or 4:1 reduction between the power steering motor and the steering shaft. After adding a "flag" to the steering motor (a ty-wrap) and connecting it to a 12v battery, I decided I would be better off with a 1:1 connection (like everybody else has done). So I thought I should just put the steering motor on top, replacing the steering wheel (like Juan did).

I found that I have an aluminum tube that will press-fit onto the splined shaft of the power steering motor. The outside of the tube is 3/4". So I could either put a sprocket on that, or couple it directly to my steering shaft. I had to remove my steering wheel to determine the shaft diameter. The existing steering shaft is also 3/4". So if I get a pipe with an inside diameter of 3/4" I can directly connect the steering motor in place of the steering wheel.

I don't have a pipe with a 3/4" I.D., but I found that 3/4" electrical conduit is close enough. That is what I am using to hold up my "green screen" for my Zoom videos, so I can cut about 3" from that. I may do that tomorrow and see how far I get. If that works then I can have a steering motor attached like Juan has. Then I will have to make a bracket to hold the motor in place.

It is supposed to warm up in the next week or two. If it stays above the freezing point I may be able to get something done.
If anyone is interested, the Home Brew Robotics ROS discussion groups starts in one hour:
What: This is an open discussion of all things ROS.
When: Tuesday Mar 2 @ 7:00PM PST
Where: Zoom https://zoom.us/j/92376743464
Why: Homebrewed Robots! Beginner or expert, if you're interested in ROS please join us.
Thanks, Camp
I poked at the "one radius" idea a little more. I thought maybe the heading value in radians needed more precision. That didn't fix it. But if I specify a point exactly 1 radius short of the corner, and a point just a hair more than 1 radius around the corner on the next line, it works. And you can see I added 0.001 to the radius value. The resulting curve looks good and appears to be a single arc.
*Thread Reply:* I'll give it some thought (maybe not a whole lot). Seems like eventually you will want to have a program or a script that will cycle through and generate corner points. (Oh, I forgot to send this...)
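As a starting point for such a script, here is a sketch of the "stop short by one radius" geometry discussed above. The function name and interface are my own invention, not anything from path_planner.py:

```python
import math

def corner_points(p_prev, corner, p_next, r):
    """Given an incoming leg (p_prev -> corner) and an outgoing leg
    (corner -> p_next), return the two 'x y heading' waypoints that stop
    one turning radius short of the corner and resume one radius past it.
    The Dubins planner should then connect them with a single arc."""
    def unit(a, b):
        dx, dy = b[0] - a[0], b[1] - a[1]
        d = math.hypot(dx, dy)
        return dx / d, dy / d

    ux, uy = unit(p_prev, corner)   # direction of the incoming leg
    vx, vy = unit(corner, p_next)   # direction of the outgoing leg
    entry = (corner[0] - r * ux, corner[1] - r * uy, math.atan2(uy, ux))
    exit_ = (corner[0] + r * vx, corner[1] + r * vy, math.atan2(vy, vx))
    return entry, exit_

# Example: a right-angle corner at (10, 0), turning from +x onto +y,
# radius 2 -> entry (8.0, 0.0, 0.0) and exit (10.0, 2.0, pi/2).
```

Cycling this over every corner of a field polygon would generate the corner points automatically instead of computing them by hand.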
*Thread Reply:* So I came up with this:
0.0 0.0 0.0
10.0 0.0 0.0
10.0 10.0 3.14
0.0 10.0 3.14
0 2 0
8 2 0
8 8 3.14
2 8 3.14
2 4 0
6 4 0
6 6 3.14
4 6 3.14
But without controlling the vertical lines I think you won't have enough control. You can see that as it gets smaller the end coverage isn't ideal, and the turn degraded to a "light bulb" turn.
*Thread Reply:* No matter, it is going to get funny as your area tightens up at the end.
*Thread Reply:* Al, do you know what the turning radius of your lawn tractor is? And do you know what the turning radius of your full-size tractor will be when running in the hay field? You can change the radius inside path_planner.py.
*Thread Reply:* I played with this again. If you are doing a large area it behaves fairly well (by just defining the horizontal rows), as long as your row spacing matches your turning radius. So here the row spacing is 2 meters and the turning radius is set to 2 meters. You could set your radius to 3 meters, but then you have to space your horizontal rows by 3 meters. Otherwise you would have to define both horizontal and vertical rows and use the more complex "stop short by 1 radius" method.
0.0 0.0 0.0
20.0 0.0 0.0
20.0 20.0 3.14
0 20 3.14
0 2 0
18 2 0
18 18 3.14
2 18 3.14
2 4 0
16 4 0
16 16 3.14
4 16 3.14
@Al Jones, I was watching this video:
04172020 Lawn Tractor Meeting https://www.youtube.com/watch?v=jsqcQD7W01M
This was one of the many times that you seemed confused about getting the correct TF tree for your physical lawn tractor as opposed to the simulator. Do you want to go through this sometime? I think I have seen this come up enough times that I know how to explain the difference and how to fix it.
*Thread Reply:* https://www.worldtimebuddy.com/?qm=1&lid=5128581,3435910,3117735,5368361&h=3435910&date=2021-3-5&sln=14-15&hf=2
*Thread Reply:* Unfortunately, won't be there this week but have fun all 🙂
Thanks JeffS..
VUELTAS LOCAS - crazy turns
-14 0 0
0 0 0
40 0 0
40 2.5 3.1415
0 2.5 3.1415
0 5 0
40 5 0
40 7.5 3.1415
0 7.5 3.1415
0 10 0
40 10 0
40 12.5 3.1415
0 12.5 3.1415
0 15 0
40 15 0
40 17.5 3.1415
0 17.5 3.1415
0 20 0
40 20 0
40 22.5 3.1415
0 22.5 3.1415
0 25 0
40 25 0
40 27.5 3.1415
0 27.5 3.1415
0 30 0
40 30 0
40 32.5 3.1415
0 32.5 3.1415
0 35 0
40 35 0
40 37.5 3.1415
0 37.5 3.1415
0 40 0
40 40 0
47 40 4.7123
47 1.25 4.7123
43.5 -2.25 3.1415
-2.5 -2.25 3.1415
-5 0 1.57075
-5 40 1.57075
0 42.5 0
40 42.5 0
42.25 40 4.7123
42.25 1.25 4.7123
*Thread Reply:* Nahhhh, @JeffS explained this subject very well and the rest is practice..
Where is cmd_vel linear x coming from? I ran this path, captured the bag file, and noticed cmd_vel linear vel.x varies from ~1.6 to ~1.9. @vinny ruia I will keep searching, but I'm wondering if you know offhand where "lawn_tractor_sim lawn_tractor_sim.launch" determines the speed to tell the tractor to run when waypoints are sent to it using path_publisher.py and path_follower_node.py, which use /move_base_flex/exe_path? I'm only giving it x,y coordinates in the waypoint planning process.
*Thread Reply:* thanks. We basically need to work through how to program the speeds we want the tractor to travel. Knowing where the default is set seems like a good place to start.
*Thread Reply:* I believe it is being set by the local planner. On the develop branch, you can see that the config file teb_local_planner_params_carlike.yaml has max_vel_x set to 2.0
*Thread Reply:* yeah I just pulled up the bag and I am confident that is the relevant parameter
*Thread Reply:* and, if not too much effort, can you tell me how to recreate it? (how are you generating this bag file?)
*Thread Reply:* uploading now, same folder as above; using command al@al-ThinkPad-W530:~/bagfiles$ rosbag record -a
*Thread Reply:* sorry, I meant: how are you going to get the robot to drive that path?
*Thread Reply:* @vinny ruia re: recreate, do you want to start from the very beginning and create the waypoint file?
*Thread Reply:* Ok got these, no I didn't need to create the waypoint file
*Thread Reply:* This is also an important piece to the overall puzzle if you want the full picture. The python source is referenced in that link.
*Thread Reply:* sorry, it is taking a bit longer than expected to get set up
*Thread Reply:* not sure if path_publisher is working correctly. I keep getting the message "Sending Path" when I run it, but when I echo the topic it is publishing on, /drive_path, it is just empty poses:
^Croot@pop-os:~/tractor_ws# rostopic echo /drive_path
WARNING: no messages received and simulated time is active.
Is /clock being published?
header:
seq: 28
stamp:
secs: 0
nsecs: 0
frame_id: "map"
poses: []
*Thread Reply:* looks like path_publisher is looking for a message on a topic called got_path .. not sure where that is supposed to be coming from
*Thread Reply:* yeah, good question. Not the author, so a bit of a challenge; Any other basic path publishing routines around you know of?
*Thread Reply:* I am digging into it and trying to delete reference to got_path
*Thread Reply:* I am surprised you got it working without modification .. maybe there is something going on I am missing
*Thread Reply:* Do you notice the following warnings on the stdout of the window where you ran roslaunch lawn_tractor_sim lawn_tractor_sim.launch?
[ WARN] [1615069628.314063033, 18.700000000]: TebLocalPlannerROS: trajectory is not feasible. Resetting planner...
*Thread Reply:* If it were an 8 cylinder engine I think we would be firing on about 5 1/2.
*Thread Reply:* Re: your original question: I was able to recreate the behavior and limit the speed by changing the parameter in the config file I mentioned above
*Thread Reply:* Remember that you have to edit the config file and then run "catkin build && source devel/setup.bash" in the container or window that calls roslaunch lawn_tractor_sim lawn_tractor_sim.launch
*Thread Reply:* Also, note that the above warning messages disappeared when I made max_vel_x: 0.0
*Thread Reply:* So you changed /move_base_flex/TebLocalPlannerROS/max_vel_x and the vehicle did not exceed the value you set?
*Thread Reply:* let me confirm with a bag ... but the robot was visually a lot slower
*Thread Reply:* I used $ rosparam set /move_base_flex/TebLocalPlannerROS/max_vel_x 0.75
*Thread Reply:* and then verified with $ rosparam get /move_base_flex/TebLocalPlannerROS/max_vel_x
*Thread Reply:* Ah.... I am not sure if that parameter can be changed at runtime
*Thread Reply:* I was not clear on how to change a param like that in Docker
*Thread Reply:* there are a few ways, you can use nano or vim to edit the file in the container... or you can use the vscode docker extension to remote into the container
*Thread Reply:* you might have to CTRL^C out of the launch of that container before you do these edits
*Thread Reply:* and then as always, remember to build your workspace and source your setups
*Thread Reply:* I believe you that the parameter "takes", I just have the funny feeling that the underlying planner will not "listen" to parameter changes
*Thread Reply:* in the lawn_tractor
container, the full path to the parent dir is /root/tractor_ws/src/lawn_tractor_navigation/config
*Thread Reply:* See if I have the same as you /root/tractor_ws/src/lawn_tractor/lawn_tractor_navigation/config
*Thread Reply:* here is my yaml
```
# file: teb_local_planner_params.yaml
TebLocalPlannerROS:

  # Trajectory
  teb_autosize: True
  dt_ref: 0.3
  dt_hysteresis: 0.1
  global_plan_overwrite_orientation: True
  allow_init_with_backwards_motion: False
  max_global_plan_lookahead_dist: 3.0
  feasibility_check_no_poses: 5

  # Robot
  max_vel_x: 0.50
  min_vel_x: 0.0
  max_vel_x_backwards: 1.0
  max_vel_theta: 2.0
  acc_lim_x: 2.5
  acc_lim_theta: 2.5
  min_turning_radius: 1.0  # diff-drive robot (can turn in place)
  wheelbase: 0.4
  cmd_angle_instead_rotvel: True
  weight_kinematics_turning_radius: 1.5  # increase if min_turning_radius is not enough
  footprint_model:
    type: "line"             # include robot radius in min_obstacle_dist
    line_start: [0.0, 0.0]   # include robot expanse in min_obstacle_dist
    line_end: [0.4, 0.0]

  # Goal Tolerance
  xy_goal_tolerance: 0.4
  yaw_goal_tolerance: 0.2

  # Obstacles
  min_obstacle_dist: 0.25
  costmap_obstacles_behind_robot_dist: 1.0
  obstacle_poses_affected: 10

  # Optimization
  no_inner_iterations: 5
  no_outer_iterations: 4
  optimization_activate: True
  optimization_verbose: False
```
*Thread Reply:* I agree that it is confusing that setting the rosparam
didn't work, but I am sure that setting that parameter at run time is not something the developers imagined people would ever do
*Thread Reply:* so I bet they probably just take the value of that parameter when the planner first initializes, and sticks with it.
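*Thread Reply:* The "reads it once at startup" behavior can be sketched in plain Python. This is only an illustration of the pattern, not the actual teb_local_planner code:

```python
# Hypothetical illustration: a planner that caches its config at construction,
# the way many ROS nodes read parameters only once in their constructor.
param_server = {"/move_base_flex/TebLocalPlannerROS/max_vel_x": 0.5}

class CachingPlanner:
    def __init__(self, params):
        # The value is copied once; later changes to the param server are ignored.
        self.max_vel_x = params["/move_base_flex/TebLocalPlannerROS/max_vel_x"]

    def velocity_limit(self):
        return self.max_vel_x

planner = CachingPlanner(param_server)

# Equivalent of "rosparam set ... 0.75" after the planner is already running:
param_server["/move_base_flex/TebLocalPlannerROS/max_vel_x"] = 0.75

# "rosparam get" sees the new value, but the planner still uses the old one.
print(param_server["/move_base_flex/TebLocalPlannerROS/max_vel_x"])  # 0.75
print(planner.velocity_limit())  # still 0.5
```

That is why editing the yaml and rebuilding (so the planner restarts and re-reads its config) works, while rosparam set at runtime does not.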
*Thread Reply:* I am going to have to sign off soon, any other doubts I can help with?
*Thread Reply:* sudo apt-get update && sudo apt-get install nano
*Thread Reply:* https://github.com/ros-agriculture/lawn_tractor/tree/feature/add-path-publisher
*Thread Reply:* the latest is on that branch, I am PR-ing it with Matt as reviewer... but not sure if he is available.
In that case who shall I assign to review?
*Thread Reply:* @vinny ruia Can you assign 3 names with "or's"? Meaning any of @JeffS, @Juan Eduardo Riva or @Al Jones could approve?
*Thread Reply:* I gave it to @Juan Eduardo Riva but next time I will do that
I have my first group of meetings indexed. I have 100 of 250 done. At various times I will just overwrite the file as I get more done. This is just a temporary location. Eventually I will find a permanent home for it. I may try to figure out how to get the actual links for the "Chat:" sections. I may figure out a way to convert to a different format so the links are actually clickable. I will put a README file in the same directory explaining what I did and how to use it. Eventually that type of stuff should be merged into the index file. This seems to be accessible with Ubuntu 20.04/Firefox and on Windows 10/Google Chrome. sampson-jeff.com/RosAgriculture
It turns out if I create a file named README or README.txt that the browser does not see it. But if I call it readme.txt then it shows up in the browser. So there is a readme.txt file in the same directory.
I experimented by copying your long text file into my iPad. Then I pasted that into my iPad’s word processor called PAGES. So far so good. Next I tried highlighting a couple of the http addresses and made them links. It worked. Reading thru your document, I’ll be able to click on the new http link and go right to the YouTube meeting. Guess I know what I’ll be doing in my free time. Thanks Jeff for making it possible.
If you are modifying the file, keep in mind I will totally replace it when I do updates. So give some thought as to how much work you plan to put into that.
This message about Docker showed up on the HBRobotics group. I don't know anything about Docker and I don't plan to learn it.... But this may mean something to people that use Docker.
In this meeting, during the Docker discussion, the question was raised of what happens if changes are made in a Docker container while it is running, such as revising a script in the ROS workspace src or doing a sudo update, then shutting down and restarting. Are the changes persistent? In answer to that, Steve referred to a docs.docker.com bind-mounts document that described using the -v flag, which the docker run in this case uses. Following up with the developer, Raffaello sent me the following reference that seems to further address the question: "You need to terminal from the nanosaur host to store the changes", https://phoenixnap.com/kb/how-to-commit-changes-to-docker-image. "If you want to share part of the code between volumes, you need to run the container with a volume; in this case you will be able to move files inside and outside the container". I haven't tried this but it's on my ToDo list. Ross
*Thread Reply:* yep, it's a pretty clear document on how to deal with Docker. I'm going to translate it into Spanish to have it handy for the ROS wiki tutorials. Thank you very much
In the last few days I was indexing an old meeting video and we were talking about Al's absolute sensor, which is single turn. Yet it has to turn 1-1/2 to 2 turns in normal operation. It was suggested to add an additional multi-turn pot to determine what the overall absolute position is. I thought of another way that may work. Use those magnetic sensors like Matt is using on his back wheels. Strategically mount them above the "half moon" gear. If it is centered then neither sensor is on. If turned some amount to the right then the right sensor will turn on. If turned some amount to the left then the left sensor will turn on. If neither sensor is on then you know you are in the center section. But I can't visualize from memory if you would be able to mount them properly.
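The two-sensor centering idea can be sketched as simple logic (the function name and sensor polarity here are hypothetical, just to show the three zones):

```python
# Hypothetical sketch: two digital magnetic sensors mounted over the steering
# gear, one triggering when turned right of center, one when turned left.
def steering_zone(left_on, right_on):
    """Classify the steering position from two on/off magnetic sensors."""
    if not left_on and not right_on:
        return "center"   # neither sensor sees the gear: the straight zone
    if right_on and not left_on:
        return "right"
    if left_on and not right_on:
        return "left"
    return "fault"        # both on at once should be mechanically impossible

print(steering_zone(False, False))  # center
print(steering_zone(False, True))   # right
```

This only gives three coarse zones; the single-turn absolute sensor would still provide the fine position within the center zone.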
I: 06122020 Lawn Tractor Meeting https://www.youtube.com/watch?v=OED92oZw9ts At the very beginning of the video is where we are talking about this.
*Thread Reply:* For me, the problem statement is I need to know the wheels are pointing straight before engaging the magnetic sensor since the magnetic sensor does not retain its position data on reboot. The solution has been a "low resolution" analog encoder and some code to test if the angle is within the "straight zone". btw, I'm still not sure having a "high resolution" magnetic sensor is worth the effort yet. Below is a photo clipped from the video.
*Thread Reply:* @JeffS Came across this...https://youtu.be/JyIkXSQGX_8?t=162
*Thread Reply:* I did see this, even though it was buried in a thread, since you put my name on it. I see part two is talking about the electronics and remote control.
I thought of a new concept. "Words of Wisdom"...
Words of wisdom: "Try something quick and cheap to verify that is what you want to do". Words of wisdom: "You are going to need a bigger box than you think".
*Thread Reply:* "You are going to need a bigger box than you think", that happened to me because I'm cheap..
and..
"Try something quick and cheap to verify that is what you want to do". This is a motto in life.. but this one has a sister.. I don't know how to put this in a way that is well understood, but we are going to try: "accept that when you buy bad things it is because they are cheap, and don't waste time trying to resuscitate them". Also, here in Argentina it happens that we buy "what there is" and not "what we should", so we end up working with things that are not optimal..
I was watching another video and I was reminded that we should go over A/D conversion sometime. Which reminds me, I need to find an op-amp that will run on either 3.3V or 5V and give the correct output.
I was looking at the latest Zoom meeting. Someone was looking for ideas to lift up their heavy lawn tractor. Maybe use a modified sheep grooming stand... like this crank-up one, or a hydraulic lift. Just narrow the area that the sheep stands on. Match it to the underside of the lawn tractor..... https://www.valleyvet.com/ctdetail.html?pgguid=88bbbde3-9bde-47fc-8850-9e73e487870e&itemguid=a17c98ac-9374-4513-bd0e-892521a97888&sfb=1&grp=L000&grpc=L600&grpsc=L640&sp=f&utmcontent=35673&ccd=IFF003&gclid=CjwKCAiA4rGCBhAQEiwAelVti7sz-FdyRvEtaUR1cLIk2rsX1coh5I6V4-GqFaglTOR3z4DjQdjyBoCkS8QAvDBw
I'll go look at it later. It is time for the Home Brew Robotics Group meeting. Or I may look at it while that meeting is going on...
Okay, so I am looking at Al's RPi catkin_ws. I can see files. If I click on one it goes down that path of "what do you want to open it with..." I say text editor... It wants me to agree to endless questions, as if it wants me to sign away my life... About 3 steps in I gave up. I can download the file, then force it to open. Is there a quick way to view a file?
Hi Jeff, here is the tool to mount the google drive directories on your computer and use the editor you want https://support.google.com/a/answer/7491144?hl=en#zippy=%2Cwindows
*Thread Reply:* I was about five minutes late for the meeting and some of Jeff's news may be missing; but unless I misunderstood or forgot something, things went this way...
Jeff showed the model he is assembling in particle board, on which he is going to mount the steering of the vehicle and the other accessories and electronics, as a way to start his tests. I found it very interesting and I'm sure I'll copy it shortly. He explained how he is going to put together the entire steering column, and he showed us the gear where he is going to connect the steering motor and the potentiometer. We talked a little about the files that are in Google Drive and how to use them better; Google Drive opens a program to view the code of the Python files. I told them that if you mount the directories locally with the specific app for W10 you can use vscode. Then Al showed us his new 3D prints for the wheel magnetometer sensor from which he gets odometry. It was very good and super neat. It remains to be seen how long these gears last with a little dirt and the other elements, but clearly enough to test in the field. The meeting lasted about 25 minutes, because Al was about to catch a plane. I said that a boy has begun to help me fix the tractor (they deliver it to me on Monday) and that I have started laying out the electronics again with my assistant.
*Thread Reply:* Jeff, any chance you could post pics of your model Juan is talking about?
*Thread Reply:* @Bob Hassett To get @JeffS’s attention you want to put "@" in front of his name.
*Thread Reply:* Bob doesn't have to add my name to flag me. If he posts in a thread then he clicks the button that says "Also send to #..." and it gets pushed back to the main channel. That does several things. It keeps people happy who want messages organized into threads. It posts to a main channel so everyone sees it. When it gets posted to a main channel it is flagged as a new message. And if I get notified that a new message was posted then I am happy. But I'd rather not see a thread of 60 messages both as a thread and as separate messages in a main channel. But since Slack doesn't work very well, it is up to each individual to use their own discretion on how to do it. (Here, I'll send it to the main channel as well...)
*Thread Reply:* And I see I can't reply to a reply in a thread... Bob, no I can't (conveniently) post a pic of what Juan was talking about. I did show it at both the ROS Agriculture meeting Friday morning and at the TCRobots meeting the night before. Both of which you belong to. As lazy as I am I will probably not get the motor mounted before next Friday's meeting, so I may show it again there.
*Thread Reply:* ok if we don't like the thread idea we can scrap it... I won't be offended 🙂
*Thread Reply:* I'm not saying it should not be done. I am just saying that Bob has found a way around it.
No, we could not figure out a reasonable way to record it and it lasted 30 minutes.
https://us02web.zoom.us/j/82088036016?pwd=K2lLc1FiWm9MU0dzRStxM2J2b3dpQT09 Bob, we will try this one, if it does not work we will use other one and i will send it here.. It's ok..?
Can not make it work.. Please use this link https://us02web.zoom.us/j/84606675957?pwd=S25tRVpsb2tZUnQ1SGFXdEdPbjkzZz09
*Thread Reply:* I had to leave the meeting early to get my second COVID shot. I left shortly after Jeff finished talking about his steering motor. Juan was about to show something on his phone. What did I miss?
*Thread Reply:* Thanks for asking Juan. First few hours I was a little dizzy with flu like symptoms. Saturday morning woke up like nothing happened. Back to my ditzy self ;-)
This is the page that inspired me to make my steering motor frame for my lawn tractor. https://www.sparkfun.com/news/3329
I watched a video about the Bosch BNO055 IMU module: https://www.youtube.com/watch?v=2AO_Gmh5K3Q I was inspired. I had two sensors that were cheap crap that I bought on eBay or Amazon. I wired one up and could not figure out how to get the Adafruit libraries to work. So I tried this library: https://github.com/kriswiner/BNO055 I spent hours trying to get anything out of it. After switching to a second board and soldering two jumper pads it seems to be working. I think the first board has a short; the regulator got hot enough to burn my finger... Once it was running, the yaw appeared to be stable. But it did not come up with the same orientation each time I started it. And this particular code had me wave the board around like an idiot. It seems like it would be hard to convince your robot to wave it around when it starts up. Maybe I can figure out how to store the calibration values so it does not need to be done each time.
But it is running. I need to read about it and see what can be done to make it usable.
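For what it's worth, a minimal sketch of persisting the calibration between boots. Per the BNO055 datasheet the calibration lives in a 22-byte block of offset/radius registers (0x55-0x6A); actually reading/writing those registers over I2C is hardware-specific, so only the save/restore to disk is shown here:

```python
# Sketch: save the BNO055's 22-byte calibration blob once after a good
# calibration, then restore it on every boot instead of waving the board.
import os
import tempfile

CALIB_LEN = 22  # accel/mag/gyro offsets + accel/mag radius registers

def save_calibration(path, blob):
    # blob would come from reading the offset registers while in CONFIG mode
    if len(blob) != CALIB_LEN:
        raise ValueError("expected %d calibration bytes" % CALIB_LEN)
    with open(path, "wb") as f:
        f.write(blob)

def load_calibration(path):
    if not os.path.exists(path):
        return None  # first boot: calibrate by hand, then save
    with open(path, "rb") as f:
        blob = f.read()
    return blob if len(blob) == CALIB_LEN else None

path = os.path.join(tempfile.mkdtemp(), "bno055.cal")
save_calibration(path, bytes(range(CALIB_LEN)))
print(load_calibration(path) == bytes(range(CALIB_LEN)))  # True
```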
I got a "Tindie" IMU board to run on a Teensy 3.2. It took forever... https://www.tindie.com/products/onehorse/ultimate-sensor-fusion-solution-lsm6dsm-lis2md/ Now I have to decide which IMU to waste my time on. I need to figure out one of the two so I can get one that runs correctly.
*Thread Reply:* Join Zoom Meeting: https://us02web.zoom.us/j/82088036016?pwd=K2lLc1FiWm9MU0dzRStxM2J2b3dpQT09
$15K https://directedmachines.com/index.html
This is the write-up I followed to get the "Tindie" IMU to work on a Raspberry Pi: https://github.com/droter/em7180_imu
Here is a random series on Cub Cadet mowers. https://www.youtube.com/watch?v=KopYghpcjIk
Here is a video about sensors on cell phones: https://www.youtube.com/watch?v=C7JQ7Rpwn2k A couple more that may be related: https://www.youtube.com/watch?v=nOQxq2YpEjQ https://www.youtube.com/watch?v=OPsVr44uCb8
Guys I need to be with my son at his school at Noon so will be unable to be on the call today. I am hoping to get my tractor out this weekend (Pittsburgh) and test a fully planned mission. I was in Texas last week and worked on installing the steering motor and transmission control servo on Tractor II. Something new I did was wire an LED and reset button to the Nano that I install inside the SuperServo for the transmission.
*Thread Reply:* Al, the super servo is for the steering wheel..?
I have not "looked under the hood", pun intended, but thought I would share... https://github.com/udacity/self-driving-car/tree/master/steering-models/steering-node
In the real world, where should the Teb Planner run? I'm trying to run a simple path. I'm thinking I need to update my RPi on the tractor with the latest version that matches the code running in the Docker container and continue to use "export ROS_MASTER_URI=http://tractor_aj:11311", or set "export ROS_MASTER_URI=http://localhost:11311" and have the RPi just be a slave. Any thoughts?
*Thread Reply:* Not good. Updated the RPi bashrc with ROS_MASTER_URI=http://localhost:11311. The result is a "rostopic list" command does not return the topics running in the Docker container. I can ping the laptop (aka controller_aj) from the RPi.
*Thread Reply:* reviewing: https://answers.ros.org/question/298602/how-to-have-topics-publish-and-subscribe-both-ways-between-host-and-docker-container/
*Thread Reply:* @JeffS Do you launch the "develop" branch of lawn_tractor? How do you do that in a roslaunch lawn_tractor_sim lawn_tractor_sim_real_world.launch command? The reason I ask is because I think there are changes in develop that are needed that are not in master.
*Thread Reply:* @Al Jones I don't run anything. I don't know what you are asking here: "How do you do that in a roslaunch lawn_tractor_sim lawn_tractor_sim_real_world.launch command?" I can see changes on this page: https://github.com/ros-agriculture/lawn_tractor/compare/develop What are the errors you are getting? Vinny's name is on the "develop" branch. He probably knows how this works.
*Thread Reply:* @JeffS Thanks. Re: error messages, I'm not getting any. My problem statement at the moment is the RPi (192.168.193.95) is not seeing the topics published by the ROS master, which is the instance of ROS running in the Docker container on my Ubuntu laptop (192.168.193.96). I think my next step will be to add the IP address of the laptop to the ROS_MASTER_URI statement on the RPi. If that does not work I will try and run the simple mission on my native laptop without Docker. I have forgotten why that did not work the last time.
*Thread Reply:* On Laptop: export ROS_MASTER_URI=http://localhost:11311; on RPi: export ROS_MASTER_URI=http://controller_aj:11311 seems to do the trick.
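*Thread Reply:* A small sketch of what those two settings mean: each machine's ROS_MASTER_URI environment variable tells its nodes where roscore lives, so pointing the RPi at the laptop's hostname makes both machines share one master (plain Python illustration; `controller_aj` is the hostname from the messages above):

```python
# Parse a ROS_MASTER_URI the way a node effectively does: extract the
# host and port of the machine running roscore.
from urllib.parse import urlparse

def master_endpoint(env):
    uri = env.get("ROS_MASTER_URI", "http://localhost:11311")
    parsed = urlparse(uri)
    return parsed.hostname, parsed.port

# Laptop runs roscore itself:
print(master_endpoint({"ROS_MASTER_URI": "http://localhost:11311"}))
# ('localhost', 11311)

# RPi points at the laptop by hostname, so both talk to the same master:
print(master_endpoint({"ROS_MASTER_URI": "http://controller_aj:11311"}))
# ('controller_aj', 11311)
```

Both machines also need name resolution in both directions (hence the ping test), since topic connections go peer-to-peer after the master handshake.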
I updated the instructions a bit on using the waypoint publisher. Mostly adding samples and reference to the video where Jeff spoke about creating his model.
Had a minor setback last night. I spent so much time making sure my control box was wired correctly, but I forgot to set my go/no-go toggle properly. I blew one channel of my motor controller, so I guess I need to order a new board.
I don't see a problem. Your schematic isn't quite complete, but I get the idea. I don't see any combination that could blow anything out. I think you have a wiring error. Whichever side blew out, I think the switch it is connected to is wired wrong.
I’ll go thru the wiring again and use a test light as I go. Thanks for the feedback.
Everything working as it should be. Switches were OK, so I must have mixed up the primary power lines somehow. Reminded to always use a test light first and assume NOTHING. Still need to replace the driver.
I’m all for robotics in agriculture BUT it should not be used to promote gigantic destructive monoculture farming systems. Robotics should be used to create “a revolution in farming practices”. Just saying. https://www.geekwire.com/2021/isilon-founder-lifts-hood-farming-startup-carbon-robotics-weed-zapping-machine/?fbclid=IwAR3wT7-fPhT9pPhPYbATmXiS4U8LgCpZuxsVtNht7HFYECJLpHEsDa7ctf4
*Thread Reply:* I may be wrong, but what I see is that robots are going to enable a virtuous consociation of crops that is otherwise impossible. Today agriculture already has very modern techniques, such as integrated insect management, that robots can take to another level. All these techniques are process-based, and the decision-making draws on a lot of information gathered in the field. Unfortunately the path that was chosen for development was wrong at first, but fortunately that is being rectified.
*Thread Reply:* Thank you Juan for restoring positive thinking! I can really get resentful of big corporate farming sometimes. I’d like to see more affordable robotics for the small farmer. Maybe small farmers can get a chance to “look under the hood” of the expensive machines for inspiration and adaptability? Btw, can you help me see the “otherwise impossible”? On another note, if I remember correctly, it seemed that universities in Europe are a hotbed of ag robot development. I’ll have to converse with Dr. Google about it!
*Thread Reply:* I think we have to wait a little longer; the issue runs through the artificial intelligence that is being developed. You have to look at that: the robots will only be the means for executing that artificial intelligence. Look at the new data sets: none of them are from the big companies and most of them are free. For example, the only case of a useful robot in the agricultural sector of Argentina is one that does plant physiology.
*Thread Reply:* I like free! How can I learn more? Not that I am anywhere near competent enough to implement. But one can drool.
Yes, link is at the top of the General channel and I will email you an invite.
I assume there may be a meeting today. Unless people start canceling. As always, the link is still in the pinned message at the top of the General channel, same place it was the last two times you asked.
Oh, I see, you were asking Bob (due to multiple messages Slack confused me again).
Everyone should consider that you can send a private message to someone. In case you don't want your email address (or anything else) to be public. When Al sends you the invite you can go back and delete your message if you don't want it to be public...
So to understand the link in General? I just click on the link and wait?
Here is an article on Farmhand robot: https://www.sparkfun.com/news/3780
So the spark fun extended range they are talking about is up to 100 ft of wire?
The term “extended range”? I’m not sure what they are trying to do? Just use a distributed system for a mobile machine? So they are using Ethernet wire on a single “tractor” ? Instead of using usb and a hub?
It all comes down to the fact that they are trying to use a communication bus for the wrong thing. It goes back to the fact that I2C was designed as a bus between two or more chips on a single printed circuit board. But it is real handy to get a sensor that uses I2C and connect it to a microcontroller over a length of wire. The I2C drivers pull the bus low, and pull-up resistors return it to the inactive state. That works fine when the chips are close together on a printed circuit board. But when you separate them with wire, you're adding inductance and capacitance which affect the signal propagation. If the wire gets too long the signal degrades and you will start getting random errors. If the wire gets even longer it won't work at all. If you try this (connecting I2C devices over medium to long wires) on a vehicle (or any other electrically noisy environment) then you are introducing electrical noise to your bus, which it was never designed to handle.
To further complicate matters, I2C has two wires: a clock line and a data line. I think the clock is always driven by the master. The data line is bidirectional. So if you want to add line drivers/receivers to the existing signals you have to account for which device is in control. And I2C used some clever (foolish) tricks to make it work: basically "clock stretching" was added as an afterthought to accommodate newer, slower parts. And devices watch the state of the bus to decide if they are allowed to transmit or not, and to decide if the bus is hung. So basically, none of this was designed to run over an extended length of wire.
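A back-of-the-envelope calculation shows the wire-length problem numerically. The pull-up resistor and total bus capacitance form an RC circuit, and the I2C spec (standard mode) allows at most 1000 ns for the line to rise from 0.3*Vdd to 0.7*Vdd. The ~100 pF/m figure for cable capacitance is an assumed ballpark:

```python
# Estimate I2C rise time from the pull-up resistance and bus capacitance.
import math

def rise_time_ns(pullup_ohms, bus_capacitance_pf):
    # Time for an RC charging curve to go from 0.3*Vdd to 0.7*Vdd.
    rc = pullup_ohms * bus_capacitance_pf * 1e-12
    return rc * math.log(0.7 / 0.3) * 1e9

# Chips an inch apart on a PCB (~20 pF total): well within spec.
print(round(rise_time_ns(4700, 20)))    # 80 (ns)

# Same 4.7k pull-up plus 5 m of cable at ~100 pF/m:
print(round(rise_time_ns(4700, 520)))   # 2071 (ns) -- over the 1000 ns limit
```

You can compensate a little with stronger pull-ups, but that raises the sink current the chips must handle, which is why the edge cases pile up fast on long wires.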
So I didn't realize until I saw this article that someone (Philips?) designed a chip to get around some (maybe all) of these problems. The PCA9615 is designed to be I2C on one side and has differential drivers/receivers on the other side. It apparently has all kinds of magic designed inside the chip to make it all work. So they have provided a method to get an I2C signal to work over an arbitrary distance.
To relate a similar story: at Cray they were building supercomputers. Not the old vector processors, but the new "lots of CPU cores in a box" concept. One of the latest cabinets I was associated with (from memory) had 96 nodes. Each node was a dual-processor SMP (symmetric multiprocessing) configuration using AMD Athlon server CPUs. So that meant it had 192 AMD server-class CPUs in a single cabinet. Each node was basically a full dual-CPU desktop or server computer, minus about half of the circuitry because you don't need keyboard/monitor/disk/USB/networking. But they had their own big networking for each node. It had big power supplies, redundant power supplies. I think each cabinet drew about 54KW when it was thinking hard. And the cooling fan took 5KW to keep it from melting down. Well, with big 3-phase cooling fans and huge power supplies it was relatively electrically noisy. So someone decided to put a microcontroller in the bottom of the cabinet, then run a cable to the top of the cabinet to connect to a random I2C sensor (I think it was a temperature sensor). Guess what? It didn't work well. So here was a real-world example of a solid brick about the size of a refrigerator drawing 50KW, and it had some of the same problems you will have on a lawn tractor...
Bottom line, I don't use I2C unless I have to. A reason I would have to use it is if I buy a sensor that has that interface. But if I have to use it, I would put the sensor and the microcontroller as close together as possible and only add a length of wire if I need to. An example of that would be if I have a magnetometer that I don't want close to anything which would affect the magnetic field. If that becomes a problem then you could put a microcontroller next to each sensor and connect everything together with USB, which is what Al has done.
But basically, I2C started as a cute/simple/elegant solution to put multiple slave peripherals on a single two-wire bus on a localized circuit board. Then it grew wildly out of control. Now people are trying to bastardize it even further by trying to create a long-range general-purpose communication bus...
To answer your other question, "So they are using Ethernet wire on a single 'tractor'? Instead of using USB and a hub?" Yes. SparkFun gave them the Qwiic bus (a big hammer), so everything else looks like a nail. So rather than switching to a more appropriate method to connect distant nodes together, they are going to extend their Qwiic bus and live with it. In my opinion they are trying to force something they shouldn't. Although I now know about the PCA9615 chips, which will allow more reasonable solutions in real-world situations. I will have to see if I can find that chip on a simple/cheap breakout board, because in some cases it would be useful.
[I'm tired of proof reading, here you go.]
If anyone thinks about going down this path, keep in mind the "differential" board (BOB-14589) is different than the "endpoint" board (COM-16988). So make sure you understand the difference before you order anything. https://www.sparkfun.com/search/results?term=pca9615
Not for me. I’ll go with the usb connections. Not knowing anything I really appreciate the explanation! Thanks for taking the time to explain stuff Jeff!
Getting MoveBaseFlex to run on RPi... with tremendous help from @vinny ruia I have cloned the development branch of LawnTractor to my RPi and ran catkin_make cleanly. I have ver. 0 of an RPi launch file (attached). I am getting this error, but will continue to try and figure this out. The rough steps I followed are here.
*Thread Reply:* yeah this would have been faster if we had gotten rosdep update to work
*Thread Reply:* remember the patterns I described when you get package failures like that: it’s always ros-<distro>-pkg
*Thread Reply:* yes. that might be easier than manually installing like we are
*Thread Reply:* this is done at the root of the workspace, after a “rosdep update”
*Thread Reply:* ok now try the rosdep install command from a few messages up
*Thread Reply:* this one: rosdep install --from-paths src --ignore-src
*Thread Reply:* rosdep update updates the internal sources list , and install goes and installs the dependencies that you want (takes it from the package.xml in your workspace )
*Thread Reply:* Oh for some reason you have the old state machine getting launched
*Thread Reply:* <node pkg="lawn_tractor_navigation" name="mbf_behavior_tree" type="movebaseflex_bt_node" respawn="false" output="screen" />
*Thread Reply:* You can reference this (this is for SIM)
*Thread Reply:* So you are saying line 29 in my launch file is incorrect? <node pkg="lawn_tractor_navigation" name="mbf_state_machine" type="mbf_state_machine.py" respawn="false" output="screen"/>
*Thread Reply:* Yeah. I am not sure if that will fix the problem., but it is a good place to start.
*Thread Reply:* This is better?
<node pkg="lawn_tractor_navigation" name="mbf_behavior_tree" type="movebaseflex_bt_node" respawn="false" output="screen" />
*Thread Reply:* What about this? Need it?
<!-- ******** Global Parameters ********** --> <param name="/use_sim_time" value="true"/>
*Thread Reply:* @vinny ruia Thank you again...no way would I have made any of this progress without your help; My promise is to write it down and share it with others....
Trying to start up an old Raspberry image... Buster, but when asked for a password the default... raspberry... doesn’t work. I’ve tried holding down the shift key as well as trying shift-alt-delete. Looked on Google and the Pi forums but no help. I suppose I’ll have to reflash it? I would lose two other programs on it... Python and OpenCV.
*Thread Reply:* Yes it is connected to a monitor...it shows user..pi...then asks for password...I type in the default...raspberry...comes back not correct.
So I gave up, now I’ve switched sd cards to the ubiquity image. Finally found out how to open a terminal..
So now what? I’ve been spoiled using gui. How do I get to the installed stuff...how can I just snoop around? Do I use Linux commands in the terminal or?
*Thread Reply:* I’m in. Found a lot of applications. Tried exploring ros. I’ll have to dig into the tutorials to get around. First project will be to interface pi with teensy 3.2. I see a lot of late nights after tomorrow!
*Thread Reply:* @Bob Hassett Another option, if the RPi is on your network is to use FileZilla from another computer to connect to the RPi if you want to review the files that are on the RPi.
Just for clarity using ubiquity.....I disable their machine.....now I can start going thru the ros tutorials at ros.org? Create a workspace for learning? Create a different workspace for my tractor?
I don't know. What do the tutorials tell you to do? I would just start at the beginning and continue until something conflicts.
Just purchased a Lenovo Think Pad X1 Carbon for a robot development work station. It’ll be here in a couple of days. Comes with Windows 10. I’ll be using a raspberry pi in a similar capacity as Lawn Tractor projects. Soooo I’m trying to decide how to set it up.
Partition to go between Windows and Ubuntu?
Dump windows and just do Ubuntu?
Put either of them on a start up thumb drive?
Pros and cons?
The best is to try to get dual boot working. Having both systems is best, and my favorite configuration is Ubuntu native, plus W10 + Ubuntu virtualization..
My Lenovo ThinkPad X1 Carbon came this afternoon, a day early. Got it connected to our WiFi and the net using the default Bing. Seems I have 5 GB of free RAM and 214 GB free of disk space. So I’m considering Juan and Jeff’s suggestions for downloading as discussed at the meeting today. Is there a link yet of today’s recording?
*Thread Reply:* I moved to a bit more open space, gave the tractor a simple 2D nav goal, but it was not able to navigate straight, forward about 5 meters. After a few minutes I stopped it. It would move back and forth and in a circle, but not straight to the goal. Bagfile is here.
*Thread Reply:* • rostopic echo /move_base_flex/get_path/goal
• rostopic echo /move_base_simple/goal
• rostopic echo /move_base_flex/current_goal
All return the same goal of
pose:
position:
x: 21.1926879883
y: -1.35362887383
z: 0.0
orientation:
x: 0.0
y: 0.0
z: 0.0124892463647
w: 0.999922006321
*Thread Reply:* Congratulations @Al Jones. Maybe in the near future you could try letting the tractor only move forward in the teb planner, as a test.
Congratulations! You have advanced to the point where you are dangerous! 🙂
*Thread Reply:* You are right, and here is where all the talk about safety begins to have meaning. As we all move forward I hope we talk a lot about this issue in 2021, but mainly I think the best approach is not to build something that you can't kick and stop; it's faster than a one-year-old kid and weighs more than 25 kilos.
*Thread Reply:* For safety I'm starting the navigation tests in Gazebo first, and next with a small robot with the same hardware (electronics).
I just stumbled on something interesting. I was just playing with my power steering motor and the Cytron 30A power driver board. I realized the trim pot on the board controls BOTH the speed when you push the "jog" pushbuttons and also the speed when you apply an external PWM signal. I'll see if I can track down some documentation on it.
Here is some documentation: https://www.amazon.com/Cytron-30A-Motor-Driver-MD30C/dp/B07L6HGFWY https://www.cytron.io/p-30amp-5v-30v-dc-motor-driver A manual here: https://docs.google.com/document/d/178uDa3dmoG0ZX859rWUOS2Xyafkd8hSsSET5-ZLXMYQ/view#
I still don't see anything saying the pot also controls the external PWM input, but it does. I'll continue playing with it... But this is a really fabulous and affordable motor driver board.
*Thread Reply:* Jeff, I think this is what I was talking about the other day at the meeting, about the issue of the potentiometer, the motor, and force. Or are you talking about something different?
*Thread Reply:* I'll have to re-watch the video to see what exactly we were talking about.
*Thread Reply:* I watched the video. I'm not sure how this compares. You stated that you need to verify that the actual mechanical system is correct. This is exactly right. Various people tried to confuse the issue. (Including me)
But yes, adjusting the pot on the driver board should control how much power (force) is applied to the motor. But controlling the PWM number sent from the Arduino to the motor would also control how much power (force) is applied to the motor.
Maybe we need to document this in more detail.
*Thread Reply:* Let's see if this is going to append this message or start a new message...
Yes, you need to determine correct functionality at a very basic mechanical level. Below ROS on Linux, and below a microcontroller. You need to verify the output voltage of a potentiometer by reading the voltage output (assuming the potentiometer is wired as a voltage divider). This can be done with a multimeter (i.e., read the voltage) without any conversion using a microcontroller. If the voltage is within an acceptable range, then you can move on to a microcontroller and verify the ADC reading that it is presenting. (I don't know how well this is translating to Spanish; also, I have started drinking already...) But if it passes the voltage readings, then does it pass the microcontroller ADC readings? Keep in mind that the range has to cover the mechanical left-to-right range that is acceptable. If any of this does not match the accepted range then you have a problem. And yes, you have to control the applied power so you do not smash anything mechanically as you do your testing...
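The bench check described above can be sketched as a quick calculation. This is a minimal sketch only; the resistor values, reference voltage, and ADC resolution below are made-up examples, not measurements from the tractor:

```python
# Sanity-check a potentiometer wired as a voltage divider, then the ADC count
# a microcontroller should report for that voltage. Example values only.

def divider_vout(vin, r_top, r_bottom):
    """Expected output voltage of a resistive voltage divider."""
    return vin * r_bottom / (r_top + r_bottom)

def expected_adc(vout, vref=3.3, bits=10):
    """ADC count a 10-bit converter (common on Arduinos) should read."""
    return round(vout / vref * (2**bits - 1))

v = divider_vout(3.3, 10_000, 10_000)  # pot at mid-travel: about 1.65 V
print(v)
print(expected_adc(v))                  # mid-scale: 511 or 512
```

If the multimeter reading disagrees with `divider_vout`, the wiring is suspect; if the microcontroller's ADC count disagrees with `expected_adc`, the conversion stage is suspect.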
I made a slight amount of progress. I am able to control my power steering motor position by turning an extra potentiometer. https://www.youtube.com/watch?v=MHd0v3Arq_g https://www.youtube.com/watch?v=EkU9EA50GnY
The weekly meeting is on Friday at 12:00 (Noon) Eastern time, 9:00AM Pacific time). About 12 hours and 45 minutes from whenever you receive this message. The Zoom meeting link is in the pinned message at the top of the General channel. (If I didn't screw this up...)
#general The ROS Agriculture meeting starts in 1 hour. The ZOOM meeting link is in the pinned message at the top of the General channel.
*Thread Reply:* @JeffS I'm on the run because it's my youngest son's birthday and I have a few kids on the farm. I think I'll be at the meeting, but I wanted to ask: is this a message that you made automatic? Because I'm interested in doing the same in other groups. On the other hand, I'm going to upload to the projects channel, and generally what we were working on this week.
*Thread Reply:* This was not automatic. I just sent it 1 hour before the meeting so people have a time reference. Since we have several new people and the pinned message has not been updated with the schedule, I thought I would post this today only.
*Thread Reply:* ahh, I was hoping that it could be automated because I have the same problem in the group in Spanish, I will continue to investigate it anyway. Thanks, see you in a bit.
*Thread Reply:* Recording from yesterday's meeting has been uploaded to the Google drive. The link is at the top of the General channel.
The diameter of the turning circle measured by the outer wheels of my lawn tractor while making a complete turn is ~13-14 feet (~4-4.3 meters). I know this because I executed cmd_vel commands linear.x (speed) 0.8, angular.z (steer angle) -1.0 and +1.0, distance (in meters) 80, and measured the results. Now, is that the "min_turning_radius" in teb_local_planner_params_carlike.yaml in /home/ubuntu/catkin_ws/src/lawn_tractor/lawn_tractor_navigation/config? Anyone? Ferris? @vinny ruia?
*Thread Reply:* not able to make a turn less than the value stated... (not a fan of negatives) - so 4.3/2 or min_turning_radius = 2.15
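The arithmetic above can be written out in a couple of lines. The conversion factor is standard; the 14-foot figure is the one measured above:

```python
# Convert a measured turning-circle diameter (outer wheels, in feet) to the
# min_turning_radius value (in meters) that teb_local_planner expects.
FEET_TO_M = 0.3048

def min_turning_radius_m(diameter_ft):
    return (diameter_ft * FEET_TO_M) / 2  # radius = diameter / 2

print(min_turning_radius_m(14))  # 14 ft diameter -> about 2.13 m
```

Note the planner wants the radius at the vehicle's reference point (usually the rear axle), so a value derived from the outer-wheel circle is a slightly conservative estimate.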
*Thread Reply:* does this wiki page help? http://wiki.ros.org/teb_local_planner/Tutorials/Setup%20and%20test%20Optimization
*Thread Reply:* I saw that earlier and wondered if it could be used in the physical world.
*Thread Reply:* just be careful and keep your hand on the estop if it does some automatic tuning
*Thread Reply:* for future ref: https://wiki.ros.org/teb_local_planner/Tutorials/Obstacle%20Avoidance%20and%20Robot%20Footprint%20Model
*Thread Reply:* I was able to run a simple 2D Nav goal straight ahead. I adjusted teb_local_planner_params_carlike.yaml in /home/ubuntu/catkin_ws/src/lawn_tractor/lawn_tractor_navigation/config in the tractor RPi:
min_turning_radius: 2.2 # 14 feet diameter
wheelbase: 1.27 # 50 inches front axle to rear axle
xy_goal_tolerance: 1.0 # from 0.4 , just trying 1.0
yaw_goal_tolerance: 1.0 # from 0.2, just trying 1.0
*Thread Reply:* In other tests it was still doing a whole lot of reversing, so it would still be good to figure out how to reduce the tendency to go into reverse.
*Thread Reply:* not certain but you MIGHT be able to prevent the tractor going backwards by setting backwards vel to 0
*Thread Reply:* Here is another thing to try. Section 6 on this page
https://mowito-navstack.readthedocs.io/en/latest/step_5c.html
says:
weight_kinematics_forward_drive (number): Optimization weight for forcing the robot to choose only forward directions (positive transl. velocities). A small weight (e.g. 1.0) still allows driving backwards. A value around 1000 almost prevents backward driving (but cannot be guaranteed).
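In the yaml file being edited above, that suggestion would look something like the fragment below. The parameter names come from the teb_local_planner documentation; the value 1000 is the "almost prevents backward driving" setting quoted above, and the max_vel_x_backwards line is an assumption about how one might pair the two (the teb docs advise against setting it to exactly zero):

```yaml
# e.g. in teb_local_planner_params_carlike.yaml
TebLocalPlannerROS:
  weight_kinematics_forward_drive: 1000  # ~1000 strongly discourages reversing
  max_vel_x_backwards: 0.2               # keep small but nonzero per the teb docs
```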
*Thread Reply:* Found this which further suggests the weight_kinematics_forward_drive setting helps backwards and forwards oscillation. Hoping to test tomorrow.
*Thread Reply:* for future ref: https://medium.com/@subodh.malgonde/building-an-actual-self-driving-car-53f67ca41566
I have solved the squealing steering motor problem. The answer is to switch to the TimerOne library instead of using the analogWrite() function: https://www.pjrc.com/teensy/td_libs_TimerOne.html Or the TimerThree library would probably also work. TimerOne says it works on Arduinos and Teensys, but the pin numbers that are available may vary.
It allows you to set the PWM frequency. I attempted to set it to 20 kHz (50 µs); I haven't verified it with a scope yet. But that frequency is not audible.
The other implication is that the pulse width range is 0-1023 instead of the 0-255 of the analogWrite() command. I haven't tracked down any other side effects it may have.
I can give a demo between the two methods on the video meeting tomorrow.
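Since TimerOne's duty range is 0-1023 while analogWrite() is 0-255, any existing duty values need rescaling when switching over. A sketch of the arithmetic (plain Python here just to show the mapping; on the Teensy it would be the same integer math in C):

```python
# Map an 8-bit analogWrite() duty cycle (0-255) onto TimerOne's
# 10-bit range (0-1023), preserving the endpoints.
def duty_8_to_10(duty8):
    return duty8 * 1023 // 255

print(duty_8_to_10(0))    # 0    -> fully off
print(duty_8_to_10(255))  # 1023 -> fully on
print(duty_8_to_10(128))  # 513  -> roughly half duty
```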
Hi, here it's the gist of my main.cpp code.. https://gist.github.com/rje1974/48f9bc3dc2685634ecaca59a7748b60b
@JeffS See if you can access this: https://gist.github.com/jones2126/bfebe7879e83ad2a7121bd7cace6f393
A tool for what we were talking about, organizing the Google Drive... This way you mount it on your Ubuntu: https://linuxhint.com/google_drive_installation_ubuntu/
Rowbot just sent me an email newsletter. You can see it here: https://mailchi.mp/050aaafc6357/field_report_cover_crop_seeding
*Thread Reply:* GPS antenna problems and navigation in tall crops; I really hope that these will soon be common problems for all of us. Interesting application, even though I hate cover crops.
*Thread Reply:* This is something that @Javier has been working on
*Thread Reply:* Yes Juan, in the case of corn the problem is complex, because when the robot is far from the base station it loses the clear signal path
I was just reviewing a couple of the original ROS Agriculture videos on YouTube. I see they are now set to "unlisted". So that is why they disappeared from the ROS Agriculture YouTube index page, but they still exist if you know the URL. That doesn't answer whether they will disappear completely some day.
Let me try this again... (Slack really pisses me off)
Okay, I'm not quite sure what your confusion is. I searched the index that I posted: http://sampson-jeff.com/RosAgriculture/
grep ' Meeting ' ros-agriculture-youtube.txt > all_meeting
That yields the list below. Not exactly, because I have more entries than what were posted. But I am going to ignore any further questions or statements from Bob...
I: 11302017 Community Meeting https://www.youtube.com/watch?v=mYeBaRp7DNc 12122017 Community Meeting https://www.youtube.com/watch?v=MSyXIZTZML4 01092018 Community Meeting https://www.youtube.com/watch?v=GrPFrgsnQsk I: 01162018 Community Meeting https://www.youtube.com/watch?v=BlDYHg9DOI4 ...
Oh, it won't let me post that . Because it is tons of information. So you have to do the search yourself.
*Thread Reply:* Maybe list them as private. Do we know which license they originally had? Because we really don't know who the 'rights owner' of the videos is. Coming from an inhabitant of a third-world country maybe this sounds exaggerated, but I don't want others to have problems.
I have a copy of all 250 meeting videos. I sent a copy to Al and to Bob. But it did not include all of the videos. You have a copy of everything.
I haven't decided if these videos should be reposted... Or where.
My videos used 90.1GB. Maybe it depends on which server the videos were downloaded from? Or the time of day?
I keep thinking we should start over. But there was a lot of good data presented in the last 3 years.
I updated my index here: http://sampson-jeff.com/RosAgriculture/ I just added a new text file with the date 20210513 appended to it. As people have noted, if you view it in a browser you get extra characters. If you download the file and view it with a text editor then the extra characters go away. And view the readme.txt file for extra info...
Oh, I screwed up. I uploaded a bogus file. But I think I have it fixed now.
This was just posted to Home Brew Robotics email list: https://luci.com/wp-content/uploads/2021/05/LUCI-Safety-Report-2021-st.pdf Parts of this probably apply to any autonomous vehicle.
*Thread Reply:* I found the document very interesting, and although I have to read more about the Luci standard, I think you are right that it is something valuable, and something we can surely take as a basis in many instances. Being a product directly related to a human, it surely takes much more into account than other products. Thank you very much Jeff.
Someday I am going to have to figure out how to efficiently edit videos for upload. I have a couple of pictures and three videos from today. I may just upload the videos to YouTube and post the photos here.
I put my lawn tractor up on a jack stand to relieve the torque on the front end. Then I used the push-buttons on the Cytron 30-amp driver board to run the steering back and forth. It takes much more force than I expected when on the ground. And I decided that it requires almost exactly one turn of the steering wheel to go from left to right. These positions were not up against any hard stop, but close to full travel.
Here are photos of the setup: I need to figure out how to edit the size of photos in Linux...
I got my videos uploaded. Here is a shot from the front on low power. A shot from the back on low power. And a shot from the back on higher power. I need to get the frame attached solidly so it can withstand the torque (as opposed to me holding it). https://www.youtube.com/watch?v=hokrLYEN9XQ https://www.youtube.com/watch?v=9OzLPtF8f2k https://www.youtube.com/watch?v=uIF9KCYWabU
Hi Al. If you use the wheel odom source only, do you get the same result for a straight line over a small distance?
*Thread Reply:* @Javier Although I have wheel odom data I only use it to measure speed. Odom position comes from GPS via a conversion program. I will post the link.
*Thread Reply:* I get odom by using this conversion program "gps_odom.py" https://gist.github.com/jones2126/0f98af1cad3ca73df3b0e759cc30dc3f
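Al's actual gps_odom.py is in the gist above. For anyone curious about the general idea, one common way to turn GPS fixes into local odom x/y is a flat-earth (equirectangular) approximation around a datum point. This is an illustrative sketch only, not Al's code:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def gps_to_local_xy(lat, lon, datum_lat, datum_lon):
    """Flat-earth approximation: returns (x east, y north) in meters
    from the datum. Good enough over field-sized distances."""
    d_lat = math.radians(lat - datum_lat)
    d_lon = math.radians(lon - datum_lon)
    x = EARTH_RADIUS_M * d_lon * math.cos(math.radians(datum_lat))
    y = EARTH_RADIUS_M * d_lat
    return x, y

# One degree of latitude is ~111 km, so a 0.0001-degree offset north:
x, y = gps_to_local_xy(40.0001, -75.0, 40.0, -75.0)
print(round(y, 1))  # ~11.1 m north of the datum
```

The error of this approximation grows with distance from the datum, which is one reason real conversions often use a proper UTM or local-tangent-plane projection instead.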
Did you fix the high publish rate that the URDF devices were publishing? I think they were publishing 10000/second.
*Thread Reply:* @JeffS Give me a hint, where am I looking for publishing rate? No I have only updated dimensions and tolerance related items.
*Thread Reply:* It was on the transform tree you would show. The devices listed in the URDF said they were publishing 10000/second. I would assume that would be controlled when you launch robot_state_publisher, or it is an entry in the URDF file.
*Thread Reply:* My urdf: https://gist.github.com/jones2126/ea6e3a2393ffc6f55a32ef9e22c2fe06
*Thread Reply:* Here is the launch file where two of the tf's are set up: https://gist.github.com/jones2126/e6334ce6d4fa1dbc79e83940216bc6b6
*Thread Reply:* for my future ref: http://www.zdome.net/wiki/index.php/Setting_up_the_ROS_Transforms
*Thread Reply:* Just a quick observation. Maybe you didn't notice the qx,qy,qz,qw fields. That is specifying a quaternion. I "think" if you supply 3 values it treats it as x,y,z (or whatever the order of values to get pitch, roll, yaw). Then you won't have to figure out a quaternion.
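For a ground robot the only rotation is usually yaw, so the quaternion in those args reduces to a one-line formula. A quick sketch of the conversion (equivalent to what tf's quaternion_from_euler gives with roll = pitch = 0):

```python
import math

def yaw_to_quaternion(yaw):
    """Quaternion (qx, qy, qz, qw) for a pure rotation about Z by yaw radians."""
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))

print(yaw_to_quaternion(0.0))  # (0.0, 0.0, 0.0, 1.0) -- the identity rotation
qx, qy, qz, qw = yaw_to_quaternion(math.pi / 2)  # 90 degrees left
print(round(qz, 4), round(qw, 4))  # 0.7071 0.7071
```

Notice the small qz/qw values in the goal pose posted earlier in this channel have exactly this shape: a near-zero yaw with qx = qy = 0.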
*Thread Reply:* I see in your launch file that the rate for robot_state_publisher is 30 (30 Hz). But your transform tree says a rate of 10000 Hz. I'm guessing you simply have something broken. Possibly your "rate" value is being ignored (probably because your names do not match), or something is just specified incorrectly.
I still haven't figured out why you don't just throw out your URDF and robot_state_publisher and just put the two static transform publishers in your launch file. If you want to keep your URDF, you are going to have to throw out the extraneous crap in the URDF. And figure out why the values from the URDF are not being used.
*Thread Reply:* This is suggesting 10K is sort of a default. Are you sure it is a problem? https://answers.ros.org/question/357030/problem-with-static-transform-publisher/
*Thread Reply:* Regarding the qx, qy, etc. there seem to be two formats for node pkg="tf". There also appears to be a node pkg="tf2_ros" which is more modern.
*Thread Reply:* That's true about the two different formats for static_transform_publisher. I just wanted you to be aware, in the future when you pull up that note, that you are not required to supply a quaternion.
*Thread Reply:* Counting elements in my statement.....seem to be 6
<node pkg="tf" type="static_transform_publisher" name="map_to_odom" args="0.0 0.0 0.0 0 0 0.0 map odom 100"/>
*Thread Reply:* Okay "Bob", do what you want. But you have raised a very valid point. About publish rate, TF vs. TF2, URDF vs. static_transform_broadcasters. I will publish something in a main channel so Slack doesn't hide it from everyone...
2nd Attempt at running 2D Nav Goal via RViz
I'm starting to test teb + move_base with odom from the wheels to debug the software.
The first problem I resolved is the acceleration limits. Maybe this parameter causes an oscillation.
*Thread Reply:* Thanks for looking at it. Are you looking at teb_local_planner_params_carlike.yaml?
*Thread Reply:* So this is my current yaml file: https://gist.github.com/jones2126/3155dd13cf19ed10f8357a163b2a9333
and urdf: https://gist.github.com/jones2126/ea6e3a2393ffc6f55a32ef9e22c2fe06
1st Attempt at running 2D nav goal (gps, RViz, move_base_flex)
The movements are very fast; maybe the feedback rate is causing the problem.
[Just to clarify. Slack bit me again and this will be a continuation of what I started...] [Again, Slack is fighting me with everything I type. So I will edit this in a text file and post it...] [I will post this in a main thread as opposed to a thread, so it is not hidden.]
Al has raised a very valid point. I see his transform tree says the values defined in his URDF file are publishing at 10000/second. That looked wrong to me. But he posted a link stating that it is normal if you are running TF2. https://answers.ros.org/question/357030/problem-with-static-transform-publisher/ (Point of order - TF2 was adopted with the introduction of Hydro, 8 YEARS AGO!!!!)
They posted a couple of links (which are left up to the user to track down) that talk about using TF2 as opposed to TF. So they are saying that you should not use:
<node pkg="tf" type="static_transform_publisher" name="map_to_odom" args="0.0 0.0 0.0 0 0 0.0 map odom 100"/>
(which is TF). But instead you should use:
<node pkg="tf2_ros" type="static_transform_publisher" name="link1_broadcaster" args="1 0 0 0 0 0 1 link1_parent link1" />
The first method uses TF, which hasn't been the supported method for 8 YEARS!!! The second method uses TF2, the modern standard, which has been the preferred (supported) method for the last 8 YEARS!!! It uses the "static_transform_broadcaster" from the TF2 package as opposed to the original one from the TF package. It also does not use the publish rate anymore. [This one also takes a quaternion as opposed to pitch/roll/yaw, but that is a different issue that you are supposed to inherently know.]
So bottom line, don't use the original method anymore for command-line (or launch file) specification. Use the "new" method (which may result in all of your topics indicating they are being published at 10000/second).
The second point, if you are going to use a URDF file for a physical vehicle, know it and understand it. The lines:
<!-- define base footprint (virtual link) --> <link name="base_footprint" />
Are broken, for two reasons. The URDF file uses XML formatting. A "<link" tag requires a terminating "</link>" to close the block. So that breaks it. Secondly, there is no code associated with the "<link name="base_footprint" />" block. So that breaks it. I don't know what the implications of these are in Al's case, but it is wrong.
I assume you have to have a "link" and a "joint" for each object in a URDF file. And maybe other thing are required, but I don't use a URDF for a physical vehicle so I don't know the requirements.
I see some of the blocks are commented out (which was not immediately obvious). So I don't know if that particular URDF file is valid. (There is probably some form of "check this URDF file" tool available, but I don't know what it is called.)
I will go back to what I should be doing, making sure my taxes go out tomorrow. I can do a Zoom meeting or other direct interaction. But these random text interchanges on Slack are killing me...
And base footprint is connected to base link by L72 : https://github.com/ros-agriculture/lawntractor/blob/b9efa3345985e2135cf4db40a3e3265b8ad9d5a0/lawntractorsim/urdf/lawntractor.urdf.xacro#L72
I do agree that likely using the tf2 static transform publisher is a good idea
Oh... It is because it is: <link name="base_footprint" /> Instead of: <link name="base_footprint" > which would require: </link> But as I said, I don't use URDF. So pardon my ignorance.
Not exactly ROS robotics, yet. My son has a bad back and cannot twist around and open the seed gate of a tow-behind broadcaster while driving his lawn tractor. So I used a programmable remote FOB. I tried to follow the instructions but had problems. Seems one has to wait 25 seconds for the programming function to time out before it saves the selections. Anyway, it works. Now to mount everything on the tow-behind spreader. I’ll order another FOB unit for a kill switch on my project, the reason being that it can connect to other devices by way of I2C. I might need more auto-shutdown commands from future sensors?
*Thread Reply:* I understand that it would not be recommended for security reasons. any automated process that you cannot stop is dangerous
Anyone looked at this? https://docs.github.com/en/codespaces/about-codespaces I have tractor-specific configurations I would like to easily share. I also have code that runs on the Teensys on the tractor. At the moment I have been using gist to share that, and although that works, I'm wondering if "codespaces" would be better. Hence the question: has anyone looked at this?
Not ROS control but hey, gotta start somewhere. A programmable FOB activates the actuator which is connected to the seed gate/fertilizer gate. Pressing A on the FOB opens the gate. Pressing B closes the gate. The sliding limit switch more or less sets the application rate. BTW this is son Nate’s tow behind broadcaster spreader. Battery is behind the aluminum plate.
Nice video about something I think about a lot: getting more from car parts, especially the power steering motor. https://www.youtube.com/watch?v=R6vzCr-iPoQ
Thanks Juan for the heads up! Nice video and very interesting. I’ll have to review it a couple of times in slow motion.
If you are up, there is a lunar eclipse in 4 hours: https://www.timeanddate.com/live/eclipse-lunar-2021-may-26
[Will this get appended to my previous message?] I attended the Home Brew Robotics ROS Zoom meeting tonight. They went through converting a ROS1 Python program to ROS2. It was interesting to see what changes were needed. Mainly it was replacing rospy with rclpy. And each call to rclpy needed to be converted to the new format.
*Thread Reply:* Did that group discuss which text editors they use? Sublime or pycharm or? Which editors do people here use?
*Thread Reply:* They didn't discuss editors... Camp was using vi and he made it work most of the time. Michael suggested several times that Camp should look into VS Code. I use gedit.
Pods detection on soybean. Model trained using yoloV5. https://github.com/ultralytics/yolov5
https://discourse.ros.org/t/first-beta-of-ros1-integration-for-farming-simulator19/20625
This is the library that I'm using to control the motors of the wheels and the steering. It works very well and is stable using an Arduino UNO board connected over USB to a computer with ROS. In my case, as I said in the meeting, I used an ODROID board with 4GB of RAM
*Thread Reply:* Always nice to see different boards! Makes one think.
This is the tutorial I followed to get the Blue Pill to work like any other Arduino. I didn't test it a lot, but I can directly upload code from the Arduino IDE over USB, and I could do a print statement and it would show up in the Arduino terminal screen. I didn't check to see how many Arduino libraries are available. https://www.youtube.com/watch?v=Myon8H111PQ
I'm not sure exactly what your question is, or how it relates to your Jeep. But we put our relay between the battery and the ignition switch. If the relay opens it is like turning the key off, so the engine would die and all of the auxiliary electronics on the tractor shut down.
*Thread Reply:* Thanks for the suggestion about moving the kill circuit. Also thanks for Camp’s link.
*Thread Reply:* I don't recall any suggestions about moving eStop circuits. I just told you what we did on the eStop circuit on the Danfoss project.
*Thread Reply:* I should have been more accurate. I should have said, I’m going to borrow that Danfoss idea.
*Thread Reply:* I have noticed a disconnect about "eStop". There are actually several layers to interrupting your robot. One level is to say "pause". Another level is to say "cancel the current navigation attempt". Another level is to say "totally shut down any physical movement and cancel all navigation attempts" (maybe even killing the control software completely). It depends what you want to do.
I was watching these videos, and that is where I noticed his statement that a button on his remote was an eStop. It wasn't; it simply interrupted navigation and came to an orderly halt. In my opinion, an eStop kills any movement, and hopefully kills the navigation process (and maybe all software), so when you release your "eStop" it does not resume motion. https://www.youtube.com/playlist?list=PLmb0WjGtyZZTXYqRmrmorwfBD2mkl4XiM
So for everybody to consider... Don't assume a word or phrase is all-encompassing. You have several levels: eStop (stops it completely to prevent death), kill navigation (because you changed your mind or because it can't make the navigation goal), or a simple pause (that will cause the software to sit and wait until you release it, maybe because someone was in the path). Also consider a simple pause may cause other complications, i.e., it may start recovery behaviors since it cannot continue forward.
I will push this to the main channel since this is a concept that everybody needs to be aware of. Not all interruptions are eStop. But preventing death (to yourself or others) is definitely an eStop condition. A pause is a convenience.
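The levels described above can be made concrete as distinct commands rather than one overloaded "stop". This is an illustrative sketch only, not code from any of the tractors; the state fields are invented for the example:

```python
from enum import Enum

class StopLevel(Enum):
    PAUSE = 1       # hold position; navigation can resume when released
    CANCEL_NAV = 2  # abandon the current goal; vehicle stays powered
    ESTOP = 3       # cut motion and the nav stack; never auto-resume

def handle_stop(level, state):
    """Apply a stop level to a simple robot-state dict and return it."""
    state["moving"] = False                      # every level halts motion
    if level in (StopLevel.CANCEL_NAV, StopLevel.ESTOP):
        state["goal"] = None                     # goal is abandoned
    if level is StopLevel.ESTOP:
        state["nav_stack_running"] = False       # must be restarted deliberately
    return state

s = {"moving": True, "goal": (21.19, -1.35), "nav_stack_running": True}
print(handle_stop(StopLevel.PAUSE, dict(s)))  # goal kept, just stopped
print(handle_stop(StopLevel.ESTOP, dict(s)))  # goal cleared, nav stack down
```

The key property: only PAUSE preserves the goal, and only ESTOP requires a deliberate restart, which matches the distinction between convenience and safety drawn above.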
*Thread Reply:* Thanks for the clarification. I’ve relabeled my circuit as “ interrupt circuit” the FOB toggles on....or toggles off ...the rear axle motors, at this point.
Camp Peavy is starting a robot similar to Bob's new jeep. Here is a link to his description. https://groups.google.com/g/hbrobotics/c/jiGSzRuWG-U/m/9TOaWjmBAwAJ
You guys are likely better Linux admins than me, so maybe you know this, but I thought I would pass it along. I was trying to update something on my RPi ($ sudo apt-get update); I had the error "...signatures were invalid: KEYEXPIRED 1622248854...". After running $ apt-key list
I found an Open Robotics key had expired, causing an issue with http://packages.ros.org/ros/ubuntu xenial InRelease
pub 4096R/AB17C654 2019-05-30 [expired: 2021-05-29]
uid Open Robotics <info@osrfoundation.org>
After research, the fix was $ sudo apt-key adv --keyserver keys.gnupg.net --recv-keys AB17C654
I'm assuming your key will be different than AB17C654
*Thread Reply:* It's a very powerful command; I solved a lot of problems with dpkg
*Thread Reply:* Maybe this applies: https://groups.google.com/g/hbrobotics/c/uWisDR7quYE/m/pjxrbyB1AwAJ
I got the FOB interrupt working. It has a range of 300+ feet, BUT the slightest blade of grass will interfere. So my lesson is to be careful; it may not work when I need it. I have to have the FOB receiver high AND totally unobstructed, if possible. I am satisfied at this point. All I wanted at this point is some control within a few feet.
*Thread Reply:* This is a development for the project you are working on ..?
Hi Juan, yes. We are working to build another robot; the simulation helps with the design, and we test everything using Gazebo before building.
*Thread Reply:* NICE, I really think it is a very good business model.
It has been really hot the last few days. So I decided to move downstairs and play with some YouTube publishing software. Here is a test result from today: https://youtu.be/4fW7nESS6BQ I am fighting with the audio. Is it usable? Too loud? Too soft?
I have come to the conclusion that the green screen in Zoom is FAR superior to OBS software. In Zoom, you just turn it on and it works. In OBS there are many parameters you can change, and none of them work very well. Maybe I will hang a neutral back drop and not worry about it.
This description at the top of this page may fix your expired keys, if people are still having problems... https://discourse.ros.org/ This was just mentioned in the Home Brew Robotics ROS meeting...
*Thread Reply:* Which links are broken?
I can get to https://discourse.ros.org/
I can execute
curl -s https://raw.githubusercontent.com/ros/rosdistro/master/ros.asc | sudo apt-key add -
*Thread Reply:* FYI: I didn't actually read it. Somebody said it fixed their problem. When I just skimmed through it now, it sounds like they may have it fixed anyway.
Finished the 4 Teensy tutorials using the 3.2 board. In tutorial 4 https://www.pjrc.com/teensy/tutorial4.html they used a simple 10k pot.
So now I want to experiment with maybe 2 joy sticks like these https://www.adafruit.com/product/3102?gclid=Cj0KCQjwzYGGBhCTARIsAHdMTQyrLOql8cxIEMF175jRD8JHfWouM4K57wC9q97ZjztEN5-bEvc124aAoZkEALwwcBf
Joystick #1 would control forward/reverse (up/down on the stick) and steering (left/right on the stick).
Anyone using these? Any recommendations for a different product?
I’ll go from analog to PWM into my Cytron motor controller. I’ll have to figure out how to change the direction on the motor controller.
Those joysticks are almost identical to the ones I used.
Al points out that you may want to disable the spring-return-to-center (for one or both axes). Some joysticks have a small lever that enables or disables the spring. Or, on the larger ones, you can probably use small needle-nose pliers to pop the spring off, which disables it. (Or maybe not...) But don't create a situation where taking your hand off the joystick leaves the machine in a dangerous state. I.e., you probably want forward speed to always return to zero if you take your hand off. The steering axis can be your preference.
If you are just experimenting, you can get cheaper/smaller joysticks. https://www.adafruit.com/product/512 https://www.adafruit.com/product/245 https://www.amazon.com/s?k=thumb+joystick+arduino&ref=nb_sb_noss_1 https://www.mpja.com/Mini-X-Y-Joystick-10K-with-Mushroom-Knob/productinfo/34792+MI/ https://www.ebay.com/sch/i.html?_from=R40&_trksid=p2380057.m570.l1313&_nkw=joystick+arduino&_sacat=0 If you get a small one, you want one soldered to a board. It will be easier to mount and easier to connect.
Another option is a 3 axis joystick: https://www.mpja.com/X-Y-Z-Axis-Motion-Control-Joystick-10K-Resistance/productinfo/34791+MI/ But to me those look like they would be hard to use.
And of course the other approach is to use a USB joystick through your ROS computer. You can get a hardwired joystick/gamepad, or get a wireless gamepad to connect to your ROS computer. Al switched over to a wireless gamepad directly into his ROS computer on his lawn tractor.
That has the advantage that you are running through the same path that the autonomous path will be using. The disadvantage is that you have to have the ROS computer booted up and running ROS to get this to work.
The main advantage to connecting a joystick directly to your Teensy is that you can drive your vehicle directly with ROS not running, or even without a ROS computer connected. My remote control bypasses ROS completely, so I can turn on the main power switch on the robot and then turn on the remote and start driving. I find that very handy. But I have less incentive to get my ROS based control working... 🙂
In the long run, it would be handy to have both types of joystick available. So that is something to think about in the future.
For now I want to bypass ROS. I decided against the Bluetooth controllers because eventually I want to use RF with the joysticks. I like the alternatives you suggested for the small size. I just don't know how long they would last with repeated use outside.
Another thing to consider with any joystick control...
Have a way to qualify the control. Have a physical switch to switch between manual control or automatic control. (Which doesn't solve the real problem.)
The real problem is that you may accidentally bump a joystick and your machine will move unexpectedly.
So you can add a "dead man" function. Typically this is an extra push button on your remote control device. You must hold down the push button for the joysticks to be honored. If you are not purposely pressing the dead-man button, then the machine can't move unexpectedly.
Or another approach is to put a guard around your joysticks. (Or recess them down into your box so you have to reach inside with your thumbs to push the joysticks. Which is what I did because I was copying someone else's product.)
Or combine two or three of these methods...
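The dead-man gating itself is tiny; here is a minimal Python sketch of the idea (the function name and values are made up for illustration, not from anyone's actual firmware):

```python
def gated_command(joystick_value, deadman_pressed):
    """Pass the joystick command through only while the dead-man
    button is held down; otherwise force the output to zero (stop).
    A bumped joystick with no button press then does nothing."""
    return joystick_value if deadman_pressed else 0.0
```

The same check works whether the output goes straight to a motor driver or into a cmd_vel message.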
Here Kyler demonstrates what happens if you bump a joystick and you did not implement a dead-man button: https://youtu.be/TatrSXJiq0I?t=390 You can watch that video from the beginning to see his first attempt at a remote control.
https://link.medium.com/paJfJnSwXgb
If you want to use ROS with ps3 joystick I wrote an article about how connect to ROS with a raspberry pi 4. I have an image of the sd card with ubuntu and the ROS packages installed and ready to use to share.
https://agrirobotics.wordpress.com/
I hate to impose my bias here. But if you were just bitten by Microsoft's latest update non-sense (Microsoft News Feed), here is a way to fix it... https://www.youtube.com/watch?v=_CpCkFa5Gi0
My power is fluctuating. So I may not make the meeting tomorrow (later today). Either because my alarm clock may not go off, or simply because I may not have power. We'll see...
Hello, I stopped by the meeting and I see that it was complicated for everyone .. I leave you my update of the week ..
Al said he is going to be late. So Bob and I are waiting for the meeting to start.
I got stuff to do outside before the humidity gets bad. C u next week.
*Thread Reply:* @vinny ruia or @Javier or anyone that wants to comment about move_base_flex.... When move_base_flex sends a cmd_vel Twist message, what basic instruction is it asking the low level steering controller to execute? At the moment, a cmd_vel angular z of -1 would equate to a right turn at 156 degrees and cmd_vel angular z = +1 would equate to a left turn at 162 degrees on my tractor
*Thread Reply:* I guess I should add, it appears the lawn_tractor package is expecting move_base/cmd_vel to send values between -0.3 and 0.3 and then remaps those values to -1 and 1, but the basic question is still the same: when move_base/cmd_vel sends values between -0.3 and 0.3, what physical behavior is it expecting the low level steering controller to deliver? (E.g., if move_base/cmd_vel sends -0.3, is that an expectation that the physical steering controller turn right at 0.3 radians/second?)
*Thread Reply:* It seems a little fishy to me that you have 156+162=318 degrees of steering. From your protractor photos, maybe you have 30 degrees one direction and 25 degrees the other direction. But I don't know what it is you are measuring there.
By definition, ROS planners output rotational velocity for steering.
But... if you are using the TEB local planner, and you set the parameter to use steering angle instead of rotation velocity, then it will be an angle.
*Thread Reply:* You may want to be skeptical about the existing code. To me it looks like a hack. Probably to get the Stage simulator to work.
The code in the command velocity mux you are using is artificially limiting the output of the TEB planner to +/-0.3. Since you told the TEB planner to output an angle instead of rotational velocity, the value is interpreted as an angle of +/-0.3 radians. One radian is 57.3 degrees (57.3 degrees * 0.3 = 17.19 degrees). So right off you are artificially limiting the output of the TEB planner to +/-17 degrees. That seems like an awfully small angle to me. It should work if you are only attempting to drive straight lines.
Then the code blatantly multiplies that by 3 (the 0.3 to 1.0 translation). So the value that was clipped at +/-17 degrees is scaled to your full value. Seems to me that will cause massive oversteer.
But the bottom line is: you have to know whether your controller wants to see an angle or rotational velocity. (Hint: you told it to expect an angle because the TEB planner was told to output angles.)
So, by definition, you want your low level controller to expect its input as an angle (in radians) and scale that to your actual steering angle.
Currently your code is scaling an input of +/-1.0 to -14003 to 13683. (Or maybe you are doing +/-1.0 to 1013 to 17.) Instead, scale the actual measured angle to your sensor value. Let's assume you have a more realistic value like +/-30 degrees (0.5236 radians). So scale +/-0.5236 to -14000 to 13683. [Regardless of what your planner is putting out or what your joystick is putting out. Because by definition, the input value is an angle that should match your physical steering.]
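To make that concrete, the scaling is just a clamp plus a linear interpolation. A sketch in Python, using the example numbers from above (the +/-30 degree limit and the -14000/13683 sensor endpoints are assumed illustration values, not anyone's measured hardware):

```python
def angle_to_sensor(angle_rad,
                    max_angle=0.5236,     # +/-30 degrees full lock (assumed)
                    sensor_left=-14000,   # sensor reading at full left (assumed)
                    sensor_right=13683):  # sensor reading at full right (assumed)
    """Map a commanded steering angle in radians onto the steering
    sensor's raw range, clamping at the physical steering limits."""
    # Clamp the commanded angle to the physical limits.
    angle_rad = max(-max_angle, min(max_angle, angle_rad))
    # Linear interpolation from [-max_angle, +max_angle]
    # onto [sensor_left, sensor_right].
    span = sensor_right - sensor_left
    return sensor_left + (angle_rad + max_angle) / (2 * max_angle) * span
```

Whatever publishes the angle (planner or joystick) is irrelevant to this function; it only sees radians in, sensor counts out.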
If you get that working, then the TEB planner is happy; it just outputs angles and expects your vehicle to listen and do it. You may have to rescale the joystick: either hack the joystick node, which was probably hacked already to get +/-1.0 (it is probably based on teleop_joystick) [it may have input parameters to do the scaling for you].
Or hack the command velocity mux to scale it. Or create a node that goes between the joystick node and the command velocity mux node.
Now I will stop and look at what you just posted...
*Thread Reply:* Al said "Ai would be 162-90 or 72." Can you define that a little harder. Do you mean steering of 0 is 0? And full left is -90? And full right is 72?
*Thread Reply:* Another issue is that Ackermann is not linear. Sometimes in descriptions people will talk about "an imaginary center wheel". That center wheel is the actual angle you request. But the left wheel and right wheel will each be at different angles. The inside wheel turns sharper and the outside wheel will be less than the center wheel. I am sure there is a transfer function for that, but I don't know what it is. Maybe I will go look for it.
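The standard Ackermann geometry does give a transfer function for the inner and outer wheel angles in terms of the imaginary center wheel. A sketch, assuming the usual cotangent relation (the wheelbase and track values are placeholder dimensions, not any particular tractor's):

```python
import math

def ackermann_wheel_angles(center_angle, wheelbase=1.0, track=0.8):
    """Inner and outer wheel angles (radians) for a requested
    'imaginary center wheel' angle. Uses the classic relation
    cot(delta_out) = cot(delta) + track/(2*wheelbase) and
    cot(delta_in)  = cot(delta) - track/(2*wheelbase)."""
    if abs(center_angle) < 1e-9:
        return 0.0, 0.0          # driving straight: both wheels at zero
    cot_c = 1.0 / math.tan(abs(center_angle))
    inner = math.atan(1.0 / (cot_c - track / (2 * wheelbase)))
    outer = math.atan(1.0 / (cot_c + track / (2 * wheelbase)))
    sign = 1.0 if center_angle > 0 else -1.0
    return sign * inner, sign * outer
```

As expected, the inner wheel comes out sharper than the requested center angle and the outer wheel shallower.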
*Thread Reply:* I think somebody is still confused on your measured angles (maybe me). I am assuming you have bars on the ground. One represents the centered line and the other is parallel to your tire. (I.e., the two lines defining Ai above.) Looking at your protractor for the left side, the protractor says either 30 degrees or 152 degrees. (And 180-152=28.) What does 90 have to do with it?
*Thread Reply:* @vinny ruia I have two questions.
*Thread Reply:* Hi Al, in my case I calculated the turning radius of the wheels in the controller. If the move_base package sends an angular velocity of 0.3, that means 0.3 radians per second, and depending on the geometry of your robot you need to calculate the ICC to respond to the requirements of move_base.
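The ICC calculation Javier describes can be sketched in a few lines: the requested linear and angular velocity imply a turning radius, and the wheelbase turns that radius into an equivalent steering angle (the wheelbase here is an arbitrary example value):

```python
import math

def steering_from_cmd_vel(v, omega, wheelbase=1.27):
    """Turning radius and equivalent bicycle-model steering angle
    implied by a cmd_vel pair: v in m/s, omega in rad/s.
    The ICC sits at distance v/omega from the rear axle."""
    if abs(omega) < 1e-9:
        return float('inf'), 0.0      # straight line: infinite radius
    radius = v / omega                # distance to the ICC
    steer = math.atan(wheelbase / radius)
    return radius, steer
```

So a cmd_vel of v = 1.0 m/s with omega = 0.3 rad/s implies a radius of about 3.3 m and a steering angle of roughly 0.36 rad on this example geometry.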
What is the quickest way to read data from a USB/serial port and publish as a ROS topic?
I want to plug in a Sparkfun Razor IMU (probably with the Sparkfun code). Set it to put out (at least) the raw (degrees/sec) values from the gyroscopes. Then I want something quick and dirty to read the strings of data and publish them as a ROS topic so I can record them in a bagfile.
Yes, I could set up a full ROS driver for an IMU. It does publish rotational rates in the IMU message. But I don't know if those are raw unadulterated values, or if somebody played with them (in their infinite wisdom). And my experience is that it would take days to get REAL values from any IMU.
*Thread Reply:* Jeff, good morning.. Nice sensor.. Did you see this video..? https://www.youtube.com/watch?v=we5FBjllVn0 and this driver.. https://github.com/mitll-ros-pkg/razor_imu_m0_driver
*Thread Reply:* Well, yes and no. The video is for an old board, with old software, that may or may not work adequately.
Back story... Sparkfun made an IMU board with a random processor and random IMU components. Then over the years they repeatedly changed the design and continued to keep the name "Sparkfun Razor IMU". So you will find code which they have hacked on for years to account for the changes of processors and IMU components. And of the hundreds of firmware and ROS packages that are available, I haven't found one yet that gave me good results.
Although, that second link you posted looks promising (when I want to spend a week or two playing and experimenting to verify it works). I'll look at that one when I get back to IMUs.
But at this point I just want a rotational reference that has been factory calibrated. So I just want the raw yaw rotational rate from a 3-axis gyroscope and hopefully I can trust that. I don't want full IMU results or AHRS results. Because then I would have to figure out what they have done to mask the actual results that I am looking for...
I didn't find a quick/easy way to publish a serial stream. So I hacked one together. It creates a topic like this:
```
data: "11127603, 14.45, -2.62, 39.57"
```
I assume most people aren't subscribed to Home Brew Robotics Group and probably won't be. So here is a quick update on Camp Peavy's plastic car robot: https://groups.google.com/g/hbrobotics/c/jiGSzRuWG-U/m/9TOaWjmBAwAJ https://photos.app.goo.gl/Qh9i4nj63wHx3d1R8 https://youtu.be/NhnkrQyU0Us https://photos.app.goo.gl/rze1biRVWu3W39vQA https://youtu.be/AdPY_YmWC3k https://photos.app.goo.gl/FNE3WCXTYx44phgr8 https://youtu.be/xhwEK7FCC-8 This leaves out a lot of the commentary, but if you want more info you can go track it down. I probably won't post any more updates on this.
The weather is supposed to cool off in the future. So I was planning to configure my robot today so I can take it outside and do some steering angle verification. But my Internet provider decided to make my life a living hell the last couple days, so I haven't gotten anything done. But I need to configure the wheel odometry, the dual GPS and the gyroscope rotational velocity so I can verify if I know what I am doing with steering angle...
But instead I have been fighting with the decision of DSL, cable, wireless, or cell modem in the brief times I have an Internet connection... I was wondering if I can run my whole house Internet on a cell phone modem... So I haven't gotten anything done.
I tried to get uBlox ucenter to run under Wine on Linux. It wasn't obvious how to translate the COM port names to make it work.
I reverted back to u-center on an actual Windows laptop. I was able to load both configuration files (one for the main RTK receiver and one for the "moving baseline" "rover" receiver on the robot). So the main receiver generates the precision lat/lon position and the second receiver gives me a static heading (whether I am moving or not). I need to get it put back on the robot to verify it works there.
I need to take it outside and verify I can get GPS data, IMU data and wheel odometry data and log the data. Then I need to verify I can get "recovered" "rotational velocity" from IMU, wheel odometry and from GPS. If I can get all three sensor to agree, then I have a way to verify the actual path/rotation of the vehicle. Then I can attempt to give it a steering command (either steering angle or rotational velocity) and verify I can get the input to match the recovered output.
Maybe I can try that tomorrow...
So I got things configured and I went outside to make a sample bag file. I wanted to know where it puts the bag files, whether the data is usable, and what other problems I will have to fix.
I found out a cable was in the way and I couldn't easily wedge my laptop into place. With my GPS receivers inside of a box I have no idea what the status is. But I drove around anyway. So now I have a bag file.
For some reason rqt_plot will not work on my kinetic laptop. Plotjuggler will not load on my kinetic laptop.
I copied my bagfile over to the noetic laptop. I loaded Plotjuggler. I can't figure out how to do an X/Y plot. It just doesn't seem to work.
My topics that are just a string of numbers are not recognized. So I have to decide how to fix that.
So now I have several things to look at and try to fix...
So out of desperation... I shut down my Ubuntu 20.04/Noetic laptop and restarted it. Now I can do X/Y plots...
After starting Plot Juggler multiple times, I have gotten 3 different initial screens. It has settled down now. I am able to plot the topics that have individual fields. Here is an X/Y plot of odometry and GPS. The odometry is rotated since I didn't think to align my vehicle before I started ROS. The only thing I could get rotational velocity from was odometry. Looks like I am getting 0.8 radians/second when driving in the tight circle. I need to work on the other two sensors to present rotational velocity. (Or I can hack on the strings to separate them...)
I looked at my numbers some more. Rotational velocity from RTK GPS (I took the change of heading per sample, scaled to degrees/sec, divided by the update rate and converted to radians/sec): it says approximately 0.35 rad/sec (full left or full right). I plotted rotational velocity from my wheel odometry message and it is approximately 0.8 rad/sec. One, or both, are wrong. I then plotted the heading from GPS and odometry (the blue curves). They should look the same but they don't. The one from GPS is probably correct. The one from odometry is obviously wrong. (But I looked at an old plot of heading from odometry and it looks right.) Maybe it is in my interface node where I take values from the microcontroller and build/publish an odometry topic.
I didn't plot the rotational velocity from my IMU. It appeared to die shortly after I started the bag file. I think the cable was pulled loose.
I think I solved the mystery of the odometry heading value not looking like the GPS heading value (in the plot above). It looks like my ROS node which takes raw data from the microcontroller and builds the odometry message is wrong. I am simply pulling input fields and plugging them into output fields. I probably picked a value for the quaternion and plugged it into an inappropriate field in the new odometry message. So I can probably fix that.
I still haven't figured out why odometry says a rotational velocity of 0.8 and GPS says 0.35. Maybe if I run again with the IMU I can see whether it matches either one. If so I will know which one is wrong. If it doesn't match either of them then I am totally baffled for the moment. But it is raining now, so I may try the IMU tomorrow.
*Thread Reply:* Perhaps the difference between the GPS and the wheel odometry is related to the calculation that translates the ticks from the encoders to odom.
*Thread Reply:* Hey, that is a good point. I can plot odometry linear speed against the GPS linear speed. I'll go try that.
I modified my gyro ROS logging to output a single Float32 instead of 3 values in a string. So I can now plot that directly in Plot Juggler. I also scaled it by converting from degrees/sec to radians/sec and multiplied by -1 to flip it. My first run today was baffling. I drove in a circle one direction and then drove the other direction. And the second circle was at a much faster forward speed. Only the first circle looked right. When I plotted the rotational velocity and forward speed, I saw the bad circle had a forward speed of around 2 m/s. That is way too fast. And apparently the tires were slipping on my dry grass.
Then I drove around again at a more controlled speed. And this time I drove in circles of several different diameters. These plots look much better. The odometry path looks close to the GPS path. But one is rotated and maybe flipped. The GPS has at least one jump... The rotational velocity from the odometry matches the rotational velocity from the gyroscope. It tracks well enough to give me confidence that I did it right. And it also tells me that I should not go above 1 m/s forward speed...
Now I have to figure out how to convert the GPS lat/lon to offset in meters in real time. Then I can plot those two on top of each other in Plot Juggler.
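One common shortcut for the lat/lon-to-meters conversion is an equirectangular approximation, which is plenty accurate over a yard-sized area. A sketch (the reference point would be wherever you zero your plot; the function name is made up):

```python
import math

def latlon_to_meters(lat, lon, lat0, lon0):
    """Offset in meters of (lat, lon) from a reference point
    (lat0, lon0), all in degrees. Equirectangular approximation:
    fine over small areas, returns (east, north)."""
    R = 6371000.0  # mean Earth radius in meters
    east = math.radians(lon - lon0) * R * math.cos(math.radians(lat0))
    north = math.radians(lat - lat0) * R
    return east, north
```

Publish east/north as two topic fields and Plot Juggler can X/Y plot them directly against the odometry position.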
At one time we discussed how you know if your tires are slipping. The fact that my two rotational velocities don't match above indicates that something is wrong. You could have a separate node to watch for these things. Then you have to think about what to do once you detect it.
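The watchdog comparison itself is trivial; a sketch (the tolerance is an arbitrary example value, and what to do when it fires is the hard part):

```python
def slip_detected(odom_rot_vel, gyro_rot_vel, tolerance=0.2):
    """Flag probable wheel slip when the rotational velocity implied
    by wheel odometry disagrees with the gyroscope by more than a
    tolerance (rad/s). Both inputs in rad/s, same sign convention."""
    return abs(odom_rot_vel - gyro_rot_vel) > tolerance
```

For example, the 0.8 vs 0.35 rad/s disagreement above would trip this check.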
I'm getting closer at converting my GPS heading to rotational velocity. My biggest problem was I was dividing by 10000 instead of 100000. I take the heading output and find the delta from the current sample to the last sample: Delta = Heading(T1) - Heading(T0). Then divide by a large number to scale to actual degrees: Degrees = Delta/100000. Then convert degrees to radians by dividing by 180/PI: Radians = Degrees/(180/PI). Then account for time units by dividing by 5 samples/second. But that is off by a factor of 2, so I divided by 10 instead. Then threw in a * -1 to flip it to match the gyroscope output: Rotational Velocity = Radians/10 * -1.
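One way to sidestep guessing the sample rate (and the mystery fudge factor) is to compute dt from the message timestamps instead of assuming 5 Hz. A sketch of the same pipeline (the 1e-5 degree scale matches the division by 100000 above; the flip sign is the same * -1):

```python
import math

def heading_to_rot_vel(heading_now, heading_prev, t_now, t_prev,
                       scale=1e-5, flip=-1.0):
    """Rotational velocity (rad/s) from two consecutive heading
    samples reported in units of 1e-5 degrees. dt comes from the
    sample timestamps (seconds) rather than an assumed sample rate."""
    delta_deg = (heading_now - heading_prev) * scale  # scale to degrees
    dt = t_now - t_prev                               # actual interval
    return flip * math.radians(delta_deg) / dt        # rad/s, flipped
```

If the timestamps show the messages actually arriving at a different rate than expected, that would explain a constant scale error.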
So I still have an "off by 2 times" error. I don't know why. Maybe the GPS is not sampling at 5 Hz like I thought it was. Maybe the GPS has multiple sample rates internally? So I have to track that down.
*Thread Reply:* Hi Jeff, can you record a bag file with all the topics? The command is rosbag record -a. Then you can use rqt_bag to load the recorded file and see the topics at the same time. It's a good way to see if there is a problem with the sampling. If you share the bag I can check it.
*Thread Reply:* When you obtain the heading from the GPS, the delta is represented by two measurements (T0, T1); perhaps the problem comes from there. I'm not sure...
*Thread Reply:* I am recording all topics to a bagfile. I can plot the topics directly from the bagfile with Plot Juggler (the screen shot above on the left). I can also use either of two different programs to break the bagfile up into individual CSV files. Then I can load CSV files into Open Office Calc (or Excel) and do calculations and plots. (the screen shot above on the right)
I can't figure out how to get Plot Juggler to plot bagfile values AND additional CSV data at the same time. (That may cause an issue of sample rate of the CSV vs. bagfile topics.)
Apparently I can do extra calculations in Plot Juggler. But I haven't tried to understand that yet. It looks like it uses LUA.
Another problem is that I have to use two computers to make this work. I record the original bag file on the robot computer. That computer will run the bagfiletocsv programs. But I can only get Plot Juggler to run on the other computer. So I am always copying files back and forth. It gets tedious...
After doing about 6 steps to convert heading to rotational velocity, it could be any step. I'm just missing a scale factor of 2:1 somewhere. If I just ignore the fact that I don't understand, it is giving me the correct data. Eventually it may become obvious.
Could you be a little more specific about what you expect out of your joystick, or what you are expecting it to do?
*Thread Reply:* The joystick controls 2 motors. I’m focused on the forward/reverse motor pot. When the stick is relaxed sitting in the neutral position I’d like no movement on a motor. Using a vom I set the wiper at about the midpoint of the pot. I’ll copy then paste the code. I’m using write statements and delays (5000) for troubleshooting.
After a couple of days of fighting with Python I have a ROS node that subscribes to the uBlox F9P /gps1/navrelposned message and converts the static heading into a rotational velocity as it runs. I plotted gps_rot_vel with gyro_rot_vel and they appear to track each other. I would have plotted the wheel odometry rotational velocity, but my robot failed to log any data for it when I drove it around. I have to do some playing around to eliminate those spikes. They are caused by the rollover between 0 and 360 degrees.
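A common fix for those rollover spikes is to wrap each heading delta into +/-180 degrees before converting it to a rate, so a step from 359 to 1 reads as +2 instead of -358. A sketch:

```python
def wrapped_delta(h_now, h_prev):
    """Smallest signed heading change in degrees, handling the
    0/360 rollover. Result is always in (-180, 180]."""
    d = (h_now - h_prev) % 360.0
    if d > 180.0:
        d -= 360.0
    return d
```

Feeding this wrapped delta into the degrees-to-rad/s conversion removes the spikes without filtering away real motion.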
*Thread Reply:* Excellent result Jeff. So your rotational velocity from the GPS heading matches the measurements of your gyroscope.
*Thread Reply:* Did you test the navigation stack? For example: if you graph the angular and linear velocity from ROS on the cmd_vel topic, and then graph the angular and linear velocity from the GPS, comparing the two graphs is a way to test the response of the robot to the requirements of the navigation stack.
*Thread Reply:* If you put the two graphs on the same axis (time) you will see a delay between the commanded w and v and the final w and v of the robot. If this delay is too high, the navigation stack may not work properly.
*Thread Reply:* This delay can be reduced with acc_lim_x and acc_lim_theta from teb_local_planner_params.yaml.
*Thread Reply:* and depends of the configuration of your low level controller too
*Thread Reply:* I haven't compared it to any inputs yet. I have been driving this with the joystick. One of the next experiments may be to load the cmd_vel to angle converter: https://github.com/rst-tu-dortmund/teb_local_planner/blob/0d242c0aff4f81b4ff5d67d7e4f970a8f454ad4a/scripts/cmd_vel_to_ackermann_drive.py Then I can specify an input rotational velocity and scale the angle out so my vehicle turns at the same rate.
```
/* Example code for testing the Cytron Technologies 10A Rev 2.0 Motor Driver.
 * Each motor uses two pins: one for direction and one for speed (PWM).
 * Two potentiometers control speed and direction: forward/reverse on one
 * analog pin and left/right steering on another.
 * This code is meant to be run on the Arduino Uno hardware. */

// Pins used to control direction and speed of each motor. Speed pins must be PWM pins.
// (The original post omitted these declarations; the assignments below are
// example values -- adjust them to match your wiring.)
const int FRMotorDirection = 8;
const int FRMotorSpeed     = 9;   // PWM
const int LRMotorDirection = 10;
const int LRMotorSpeed     = 11;  // PWM
const int FRPotPin = A0;          // forward/reverse pot
const int LRPotPin = A1;          // left/right steering pot

int FRSpeedVal = 0;
int LRSpeedVal = 0;

void setup() {
  // Declaration for the pins used; all motor control pins are outputs.
  pinMode(FRMotorDirection, OUTPUT);
  pinMode(FRMotorSpeed, OUTPUT);
  pinMode(LRMotorDirection, OUTPUT);
  pinMode(LRMotorSpeed, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  FRSpeedVal = analogRead(FRPotPin);  // Position of the forward/reverse pot (0-1023)
  LRSpeedVal = analogRead(LRPotPin);  // Position of the left/right pot (0-1023)

  // Makes the forward/reverse motor rotate clockwise: go forward.
  if (FRSpeedVal > 550) {
    digitalWrite(FRMotorDirection, LOW);
    FRSpeedVal = map(FRSpeedVal, 550, 1023, 0, 255);
    analogWrite(FRMotorSpeed, FRSpeedVal);
    Serial.print("Moving Forward -- Motor speed = ");
    Serial.println(FRSpeedVal);
    delay(5000);  // long delay for troubleshooting only
  }
  // Makes the forward/reverse motor rotate counter-clockwise: go in reverse.
  else if (FRSpeedVal < 499) {
    digitalWrite(FRMotorDirection, HIGH);
    FRSpeedVal = map(FRSpeedVal, 498, 0, 0, 255);
    analogWrite(FRMotorSpeed, FRSpeedVal);
    Serial.print("Moving in Reverse -- Motor Speed = ");
    Serial.println(FRSpeedVal);
    delay(5000);
  }

  // Makes the left/right steering motor rotate clockwise: steer left.
  if (LRSpeedVal > 500) {
    digitalWrite(LRMotorDirection, LOW);
    LRSpeedVal = map(LRSpeedVal, 501, 1023, 0, 255);
    analogWrite(LRMotorSpeed, LRSpeedVal);
    Serial.print("Steering Left -- Motor speed = ");
    Serial.println(LRSpeedVal);
    delay(5000);
  }
  // Makes the left/right steering motor rotate counter-clockwise: steer right.
  else if (LRSpeedVal < 500) {
    digitalWrite(LRMotorDirection, HIGH);
    LRSpeedVal = map(LRSpeedVal, 498, 0, 0, 255);
    analogWrite(LRMotorSpeed, LRSpeedVal);
    Serial.print("Steering Right -- Motor speed = ");
    Serial.println(LRSpeedVal);
    delay(5000);
  }
}
```
There is a wealth of information in stored rosbag files. I was getting ready to attempt to figure out the relationship between turning diameter/curvature/steer angle/rotational velocity... I know the diameter of the circle that baselink is tracking. (Because I can measure it on the plots.) But it occurred to me that the front of the vehicle traces out a different path than the baselink in the back. Do I know what that is? Oh, I have a GPS on the back AND a GPS on the front. Would that tell me anything? I plotted the back ground plot vs. the front ground plot. It does!!!! What is the distance between the back antenna and the front antenna? I could go get a tape measure and check... Or plot the moving baseline distance to see what it thinks.
The top plot is back GPS vs. front GPS. The bottom plot is the moving baseline distance between antennas. It thinks they are 75cm apart.
I found this Adafruit BNO085 IMU in my "one of these days I'm gonna get around to using it" shelf. Any suggestions on the better connection approach (I2C or UART)? https://learn.adafruit.com/adafruit-9-dof-orientation-imu-fusion-breakout-bno085/arduino
*Thread Reply:* @Al Jones What does this have to do with anything? Just read the rotational rate value out of the gyroscope and use it.
Tonight I am going to post the code to upload to the Arduino and the ROS node to obtain the heading.
*Thread Reply:* @Javier I didn't see your post for the Arduino and ROS node. I have a BNO085 module (from eBay) wired to a Teensy 3.2. I would like to try your code since it is working for you.
*Thread Reply:* @Al Jones If you are talking to your sensor with the Adafruit libraries, run the example named
read_all_data
and see what you get.
*Thread Reply:* you can print Serial.println(orientationData.orientation.x) and get the heading
*Thread Reply:* then with a basic node take the value of the heading to insert into odom
*Thread Reply:*
```
#!/usr/bin/env python
import rospy
from std_msgs.msg import String
import serial

ser = serial.Serial('/dev/ttyACM1', 115200)

def talker():
    while not rospy.is_shutdown():
        data = ser.readline()
        decoded_bytes = data[0:len(data) - 2].decode("utf-8")
        rospy.loginfo(decoded_bytes)
        pub.publish(String(decoded_bytes))
        rospy.sleep(0.1)

if __name__ == '__main__':
    try:
        pub = rospy.Publisher('heading', String, queue_size=1)
        rospy.init_node('imu')
        talker()
    except rospy.ROSInterruptException:
        pass
```
*Thread Reply:* This is my basic heading node. I used a String message to send the heading; pretty basic but it works.
My free cell modem account finally caught up to me. They said several months ago that my current account was being upgraded and I had to buy a new sim card. I ignored it and it continued working the same way at $0. I just got billed for $3.99. I looked at the account and my limit is raised from 200MB to 350MB, but it will cost me $3.99/month. And for the moment it is still working with the old sim card...
Is anyone else here using the uBlox F9P GPS receivers? I had problems with massive serial port log errors using the hacked ublox driver that I posted to ros-agriculture github. I switched over to the official driver http://wiki.ros.org/ublox The official one appears to work.
I was also digging through the documentation for other features. It looks like it "may" be capable of publishing an IMU message (I assume they convert internal data to look like an IMU). But it is not obvious if I need a different receiver model or if it generates data from the "moving baseline" mode. I will have to attempt to enable that and go outside and try it...
*Thread Reply:* Hi Jeff, I used an Ardusimple board that has an F9P GPS, and I used the official drivers too.
I played with the "publish IMU data" option; it looks like it is based on receiver type. So it currently doesn't work. Maybe if I dig through the source I can find a way to enable it...
In my case I connected the Arduino board with the BNO055 (using I2C) to an Odroid board (similar to a Raspberry Pi 4) and it works.
I previously tried to compile Adafruit code for the BNO085. I couldn't figure out how to change the I2C pins. The example program I had said to wire to pins 4 and 5. The I2C pins on the Teensy 3.2 are different. I assume the definition is buried deep down, but I can't find it...
I had something running before. Maybe I went down this path: https://forums.adafruit.com/viewtopic.php?t=92153 I will look down this path...
Or maybe I was using this code: https://github.com/kriswiner/BNO055 This looks real familiar. I now have something to search for in my Arduino IDE to find my original attempt...
This is the value that i read with the heading node to transform into a ros topic
So basically, from the Arduino+BNO055 I obtained orientation.x, and with head_read.py I transform this value into a ROS topic using a String message type.
I found this code the author refers to as "rock bottom" to extract data from the Adafruit BNO085. The code is small enough to run on a small Nano. At the moment it is reading the rotation vector "report" (page 64 of the reference manual). I'm thinking I need to pull Linear Acceleration (0x04).
Ah Hah!!! He says. I did lots of searching on how to change I2C pins when using Arduino code on Teensy boards. Turns out... The I2C interface definition is fixed for the board type. If you just use the correct pins then it works. I had it wired to pins 16,17 for previous software (probably the i2c_t3 library that is specifically for Teensy). I moved the wires to 18,19 and now it works! So I can now run any random Arduino code that is for the BNO085. Running read_all_data I get this:
```
Orient: x= 330.06 | y= 7.38 | z= 2.38
Rot:    x= -0.13 | y= -0.06 | z= -0.06
Linear: x= -0.03 | y= 0.02 | z= -0.04
Mag:    x= -5.88 | y= 40.00 | z= -43.00
Accl:   x= 1.22 | y= -0.38 | z= 9.66
Accl:   x= 1.26 | y= -0.40 | z= 9.71
temperature: 30
Calibration: Sys=0 Gyro=3 Accel=1 Mag=0
```
This is worth looking at... This section of code that Al just posted says you can disable background calibration. Up to this point everybody has said you can't disable it, which may randomly bite you. Or maybe I am thinking of something else...
```
//**********************************************************************************************
/* This code disables the calibration running in the background of the
 * accelerometer, gyro and magnetometer.
 * Sensors can be set individually on and off (chapter 6.4.7.1)
 * P0 = accelerometer, P1 = gyro; P2 = magnetometer; P4 = planar accelerometer (without Z axis)
 * 0 = disable; 1 = enable
 */
void ME_cal(uint8_t P0, uint8_t P1, uint8_t P2, uint8_t P4){
  uint8_t me_cal[15] = {0, 2, 0, 0xF2, 0, 0x07, P0, P1, P2, 0, P4, 0, 0, 0, 0};
  I2c.write(BNO_ADDRESS, sizeof(me_cal) + 1, me_cal, sizeof(me_cal));
}
//**********************************************************************************************
```
Out of town for 2 weeks. Very poor internet access. I'll have to wait for the recording :-(
I figured out how to scale a plot in Plot Juggler. You right-click on the plot and select "Apply Filter". One of the options is "Scale/Offset". My static heading has a range of 0-36000000 and if I multiply it by 0.00001 I get the 0-360 that the new IMU puts out.
This is a plot of the static heading vs. the BNO055 module (it is not a BNO085 module, as I claimed before). I didn't do any extra calibration, and the IMU probably needs to be rotated slightly so they both line up initially. (Or I could just add a rotation offset to correct for it.) But in general they track very closely.
You can see a further explanation when the Zoom video from today gets posted.
It just occurred to me that I can add an offset in Plot Juggler (instead of re-running everything). After it stopped, the red line (BNO055 IMU) says 2.000 degrees. The blue line (uBlox moving baseline) says 8.554 degrees. So I told Plot Juggler to add an offset of 6.554 degrees to the red line. Now the plots are generally on top of each other. Of course it would be better to go back and mechanically align everything...
I see Al has today's video uploaded already if anyone wants more background on what I have been doing. https://drive.google.com/drive/folders/1H_Jyx46SWu0-FB8WD-7FELhuQ3a3OBSh
This can be of use for debugging an Arduino connected to ROS. It inputs a string from the Arduino (that is a list of comma separated numbers) and breaks it into fields. One example selects a single field and publishes it as a Float32. The other example publishes all the fields as an array of Float32. This array can be opened and plotted with Plot Juggler.
```
#!/usr/bin/env python
import serial
import rospy
from std_msgs.msg import Float32
from jeff.msg import Float32List  # custom message holding a float32 array

def talker():
    # ser = serial.Serial('/dev/ttyACM0')
    ser = serial.Serial('/dev/ttyBNO055')  # Port was renamed with udev rules
    ser.flushInput()
    pub = rospy.Publisher('bno055_heading', Float32, queue_size=1)
    pubList = rospy.Publisher('bno055_array', Float32List, queue_size=1)
    rospy.init_node('bno055_node', anonymous=True)
    while not rospy.is_shutdown():
        ser_bytes = ser.readline()     # Wait for a line to be received
        # rospy.loginfo(len(ser_bytes))
        # rospy.loginfo(ser_bytes)
        ser_bytes = ser_bytes.strip()  # Get rid of extra characters
        parts = ser_bytes.split(",")   # Split the string into a list using comma separator
        # Publish a single field from the list as a Float32
        heading = float(parts[0])
        pub.publish(heading)
        # Now publish the entire list as a Float32 array
        a = Float32List()
        a.data = [float(i) for i in parts]
        pubList.publish(a)

if __name__ == '__main__':
    try:
        talker()
    except rospy.ROSInterruptException:
        pass
```
It occurred to me that if you run the program above (bno055_node.py), and you are running a ttyUSBx type port, then you probably have to specify baud rate somewhere. If it is ttyACMx type port then it doesn't care about baud rate. I am currently using a Teensy 3.2 which creates a ttyACMx port and my udev rule renames it to /dev/ttyBNO055.
Javier's code above (read_head.py) does this on line 6:
ser = serial.Serial('/dev/ttyACM1', 115200)
So I guess all that is required is to append the baud rate to the end of the serial open.
More interesting tid-bits... I notice that when I run the BNO055 it never achieves magnetic calibration. I have it fixed directly to the frame so I have no way to "wave it around." But (if I start with it pointed North) it does a good job of reporting an accurate heading. I wanted to experiment with starting out pointed in a random "non-North" direction. I tried a couple of tests that way, but I haven't analyzed those yet. But...
I noticed a couple of things...
I pointed my vehicle North and drove around two times clockwise, turned around and drove two times counter-clockwise. The BNO055 did a good job of reporting heading. The dual antenna moving baseline did a good job of reporting heading. I noticed the RTK correction dropped out at one point, which did not affect the moving baseline heading (but see below). I also noticed the magnetic calibration level from the BNO055 was either 0 or 1. The data sheet says to "ignore the heading completely if calibration level is 0". So it never did achieve a decent calibration level, but it was giving usable values.
I added an RGB LED to my BNO055/Teensy3.2 board. It will give me a quick indication of what the calibration level is. I haven't driven it yet.
Here are some random plots. (You can look at the labels to see which plot applies.) One has BNO055 heading (blue), moving baseline heading (purple), BNO055 calibration level (red), GPS0 RTK status (green), GPS0 number of satellites (yellowish). Notice the RTK drop out did not affect the moving baseline heading. Also notice the low BNO055 calibration level did not affect the BNO055 heading, even though I was told to ignore it under those conditions.
The other plot has the GPS ground plot and the gps_odom.py plot. The green is the RTK status. When the RTK dropped out, gps_odom.py dropped back to 0.0, but the real RTK GPS was unaffected. (I can't highlight this because I am on my Linux machine and I don't have access to Paint...) So there is a problem in gps_odom.py; maybe it is being too cautious?
So the bottom line is, plot out all of your data and see what modules have problems... And what you can trust...
Regarding BNO085 IMU, some progress to report. I need to test on my tractor and make a ROS node out of it, but I seem to be able to read the IMU and get both heading and rotational velocity. I have the high level code from Sparkfun working, but will try and pull out just what I need. Code link.
I found this video to be informative. And it is quite recent. https://www.youtube.com/watch?v=BQzmCmSijaA
A few updates...
On the last Zoom meeting I was telling Al that I screwed up my ESP32-CAM ntrip receiver. I attempted to add security to the SSID that the ESP was broadcasting. After that I couldn't log in anymore. After about a day of fighting with it I was able to reprogram two ESP32-CAM boards and add a lower security level to the SSID. I put it on the robot and configured it to connect to my cell modem hotspot. When I turned everything on I could see the "data used" incrementing on my hotspot, and the F9P indicated it had an RTK fix. So I packed it up to take it outside and drive around the block (since I wouldn't be limited to line of sight to my living room window for WiFi).
When I got outside and powered everything up, I didn't seem to be getting an RTK fix. The cell modem hotspot "data used" was incrementing VERY slowly, so it didn't seem to be supplying correction data. I poked at it for quite a while but couldn't figure out what was wrong with it. Finally I went in the house and told my laptop to show me correction data. The DOT (Minnesota Department of Transportation) system was down. So in the time it took me to move my robot outside, their system went down. (When you use free correction you can't expect it to work all of the time. Sometimes it comes back in a short time. Sometimes it may be down for days.)
I didn't have correction but I decided to drive around the block anyway. Maybe some things would not work without RTK fix. I plotted both the front and back GPS ground path. You don't see much difference since I am zoomed out so far. But that is what it looks like with no correction data.
On the left, the pink line is the dual antenna moving baseline heading. (It seems to work well even without correction data.) The peach colored line is the heading from the BNO055 IMU. You can see that it drifts over that long distance. (Someone is going to ask... I think that is 311 or 318 meters all the way around.) I didn't do any initial calibration on the BNO055. The blue line across the bottom is the calibration level (value of 0), and it stays at 0 the entire run. Adafruit says if you bolt down the BNO055 and drive around, it will automatically calibrate itself. It didn't. I will have to experiment...
I added an RGB LED to my Teensy/BNO055 board which gives real time magnetic calibration status. So I decided to test it. I decided I would try various manual maneuvers to see if the IMU magnetic would calibrate. Then I could drive around, after it thought it was calibrated. The blue plot is the heading from the BNO055.
So I pointed the robot North and started up the launch file. The first plot shows the entire ground path. The green plot is the GPS moving baseline heading. The next plot has several lines, but the blue line is the pitch value from the BNO055. You can see the pitch change as I pick up the front end and set it down. Then I thought I might be able to get pitch from the dual antenna GPS. I found the relPosD field in the NAV-RELPOSNED message. It compares the height of the front antenna to the back antenna, from which pitch can be calculated. It is inverted, but it follows the IMU pitch. It appears to be in degrees.
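The relPosD-to-pitch relationship is just trigonometry. A sketch under assumptions: I'm taking relPosD in cm (like the other NAV-RELPOSNED fields, per the u-blox interface description), the 1 m antenna spacing is a made-up number, and the sign may need flipping depending on which antenna is the moving base:

```python
import math

def pitch_from_relposd(rel_pos_d_cm, antenna_spacing_m):
    """Estimate pitch (degrees) from the down component of the
    moving-baseline vector. Positive relPosD means the rover antenna
    is below the base antenna, so negate to get nose-up pitch."""
    rel_pos_d_m = rel_pos_d_cm / 100.0
    return math.degrees(math.atan2(-rel_pos_d_m, antenna_spacing_m))

# Example: rover antenna 5 cm below the base, antennas 1 m apart
print(round(pitch_from_relposd(5.0, 1.0), 2))  # about -2.86 degrees
```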
So to sum up, I pointed the vehicle North and moved it around until the magnetic calibration was at 3 (best). Then I drove around. It dropped to 2, then dropped to 1, then went back to 2, popped up to 3 and dropped back to 2. I will have to figure out what that is telling me... But once calibrated, it does not stay calibrated....
*Thread Reply:* Impressive work Jeff. I believe that the final solution is fusing the heading data from the GPS, the BNO055 and wheel odometry to obtain good accuracy over time. Now I'm working on that.
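A minimal sketch of that kind of fusion, for illustration only (the gain is arbitrary, and a real system would more likely use something like robot_localization's EKF):

```python
def fuse_heading(gyro_heading_deg, gps_heading_deg, gain=0.02):
    """One step of a complementary filter: trust the gyro-integrated
    heading short-term, nudge it toward the GPS heading long-term.
    Handles the 359 -> 0 degree wrap-around."""
    error = (gps_heading_deg - gyro_heading_deg + 180.0) % 360.0 - 180.0
    return (gyro_heading_deg + gain * error) % 360.0

# Gyro says 359, GPS says 2: the filter moves across the wrap, not backward
print(round(fuse_heading(359.0, 2.0, gain=0.5), 1))  # 0.5
```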
This is a zoomed in view of the initial calibration experiment. You can see the interaction of the top blue plot (heading/yaw) and the 4th blue plot which is the pitch. You can see when I picked up the front end and then see the heading change as I walk it around in circles around the back wheels. I didn't have a plan when I started, I just wanted to see what I had to do to get it to calibrate.
Since I had a lot of information recorded, I plotted some other things. The top plot is vehicle speed. The teal line is from the wheel odometry. The yellowish line is ground speed from the GPS.
The bottom plot is rotational velocities. Blue - wheel odometry, red - Razor IMU gyroscope Z velocity, teal - BNO055 gyroscope Z velocity. The GPS recovered rotational velocity is turned off since it has big spikes.
I notice that I don't get an output from gps_odom.py since I didn't have RTK correction data. Matt chose to ignore everything if the solution value is not 2 (RTK fix). I have patched that but I haven't tested it yet.
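I'll sketch the patch as I remember it; `position_usable` and the status constants are my own names for illustration, not the actual gps_odom.py code:

```python
# u-blox carrier solution status: 0 = none, 1 = float, 2 = fixed
RTK_FLOAT = 1
RTK_FIXED = 2

def position_usable(status, require_fix=False):
    """Original behavior dropped everything unless status == 2.
    The patched version also accepts a float solution, which is
    degraded but far better than reporting 0.0."""
    if require_fix:
        return status == RTK_FIXED
    return status in (RTK_FLOAT, RTK_FIXED)

print(position_usable(1))                    # True  (float accepted)
print(position_usable(1, require_fix=True))  # False (old behavior)
```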
I connected an Arduino Nano to the position pot in the Open Frame Giant R/C servo. It just outputs the raw value from AD0. Then I have a simple ROS node to publish that as a float32. It is quite clean (to my surprise). But it has something weird. It is supposed to go from about 250 to about 750 (from manual measurements), but there is a big spike below 250 that I don't think should be there. I will have to look at that. I also added a 10K resistor in series with the AD0 input, because I knew that if I powered down the Arduino it would drag down the pot voltage in the servo, which would cause it to violently smash my steering into a hard stop and break something. With the 10K resistor it moves 5-10 degrees to the right when I power down the Arduino, but it doesn't break anything. That is close enough for testing...
PS: My correction data came back Monday morning when I checked it. I have one more major thing that I want to do before I run another test. I need to have my STM32F4Discovery board driver publish its mass of data as an array instead of a string. Plot Juggler will understand the float array, but it does not understand the single string. That data has the 5 values I need to publish the odometry message. Plus I blatantly added a bunch of debug data to that message. Things like... left encoder value, right encoder value, total linear distance traveled, the PWM value being sent to the steering servo. And more stuff I don't remember right now. So I have to go do that...
Other than the things I listed, I haven't been doing anything... 🙂
https://github.com/felias-fogg/SoftI2CMaster
This was just posted to the Home Brew Robotics group email server. I'm not sure exactly what it is, but it looks interesting. ```Hi everyone,
I'm the author of ROSshow (https://github.com/dheera/rosshow/), which lets you visualize ROS topics with ASCII art.
I wanted to introduce ROSboard: https://github.com/dheera/rosboard/ which simply runs on your robot as a ROS node, and serves up live-streamed visualizations on https://your-robot-ip:8888/. It already supports most common types, including Image, CompressedImage, time-series data of all std_msgs types, NavSatFix, /rosout, LaserScan, and even lets you stream the output of 'dmesg'.
This has been a long-running project of mine (I started working on it before WebViz) but I've been looking to pick it up again due to various inadequacies in WebViz.
A couple of the most important things I'm hoping to achieve with this:
ROS1/ROS2 compatible – it should work in both ROS versions! Tested in noetic, foxy, galactic, it should work in kinetic and melodic as long as you pip3 install rospkg. By the way, it makes use of my library "rospy2" which allows for the same code to work in ROS1 and ROS2.
Mobile-friendly – one of my preferred ways of debugging (especially outdoor) robots is to walk around with the robot and a phone. Being able to answer questions like "how much does the motor current jump if I put a load on top of the robot" in a few seconds with a phone-based visualization in one hand and a tool in the other.
Easily extensible – creating a custom visualization involves only adding ONE .js file and adding a reference to it in the main .js file.
Happy to hear any experiences / suggestions / what you would like to see in something like this!
Dheera Venkatraman http://dheera.net``` https://github.com/dheera/rosboard/
@JeffS or @Javier Ever seen non-linear output from an IMU? I'm getting 180 degrees from East->North->West, but from East->South-West the output is not linear. Video
Hi Al, I don't see any non-linearity... can you share the code that you uploaded to the control board?
I saw an offset of about 3 degrees when I had the IMU running for a long time.
The non-linearity: moving from 270 degrees to ~225 (i.e. 45 degrees) equates to a yaw reading change from 90 to 180.
Sparkfun Yaw = -150 when pointing South; Yaw = 0 (North), -90 (East), +90 (West)
Several things... Are those trash barrels empty? Do those rocks have any magnetic properties? Can you move the laptop further away to see if that affects the heading? Why are you mentioning +90 and -90? Doesn't your IMU put out 0-360 degrees? Have you done any magnetic calibration on the IMU? (Verifiable by reading out the calibration status from the IMU.) If your magnetic calibration is off, or you have extra magnetic fields, then your heading is very likely to be non-linear. Any reason why you have the IMU mounted at a weird angle on your white board? Why do you mention Sparkfun? Is that where the code came from?
I'll look at your code to see if you are expecting NED or ENU. I'll get back to you after I look at the code.
barrel empty; put the rocks there after I noticed the issue so the rocks did not seem to cause any variation; I moved the laptop away the distance of the usb cable (~4ft) with no apparent change.
Regarding yaw, in the Sparkfun library: float yaw = (myIMU.getYaw()) * 180.0 / PI;
Returns Sparkfun Yaw = -150 when pointing South; Yaw = 0 (North), -90 (East), +90 (West)
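If those reference points are right, the library's yaw maps to a 0-360 compass heading by negating and wrapping. A sketch (my own conversion, not Sparkfun code):

```python
def yaw_to_compass(yaw_deg):
    """Convert yaw (0 = North, -90 = East, +90 = West, +/-180 = South)
    to a 0-360 compass heading (0 = N, 90 = E, 180 = S, 270 = W)."""
    return (-yaw_deg) % 360.0

print(yaw_to_compass(0))     # 0.0   North
print(yaw_to_compass(-90))   # 90.0  East
print(yaw_to_compass(90))    # 270.0 West
print(yaw_to_compass(-150))  # 150.0 -- South should read 180, which hints at a calibration offset
```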
The board is mounted that way, to give me a North orientation. I know it seems odd.
At first glance at your code I see it is from Sparkfun, so that answers my question about Sparkfun. I'll dig into the code. I want to see what myIMU.getYaw is supposed to return, because it looks like they are converting from radians to degrees. But my biggest question is "have you done any magnetic calibration?"
Without spending the time to change your code to read calibration level, just pick up the IMU and wave it around for 30 seconds or so. If it is going to calibrate then it should do it in that time. Then rerun the test.
Oh, when you get a chance (not a big hurry) can you rename the meeting video from yesterday (zoom_0.mp4) to have the date, to match the other videos? @Javier We were discussing magnetic calibration in the video yesterday, if you are interested.
As something to back up your observations... The test I did driving around the block shows a drift in yaw as I was driving East and West but not driving North and South. The blue line across the bottom indicates the sensor was not calibrated. And the resulting magnetic lack of calibration was causing problems with the yaw output. https://tractorautomation.slack.com/archives/C8X6X6H7D/p1626206269064500
@Javier Oh, so you don't waste a lot of time, we were talking about magnetic calibration at the end of the video. (Let me see if i can find that...) Oh, now I can't get the video to play after it was renamed...
Back to Al's immediate problem. It looks like you are getting valid heading values from the Sparkfun library. So I think it comes down to getting a valid magnetic calibration to get consistent heading values.
I am going to watch the video about the magnetic calibration. I have some troubles with that.
I was thinking about putting a push button on my Teensy/BNO055 board. If I push the button it would store the current calibration values internally (or just print it out the serial port so I can write it down). But instead I can just look for an input message on the serial port to do the same thing. So I can connect my Windows laptop, start the Arduino serial monitor, man-handle the robot until my LED indicates it is calibrated, then send it a command saying to store the calibration data.
Maybe I will work on that right now.
I downloaded the renamed video from this week and it plays... At 47:45 I mention a new Sparkfun board that has addressable LEDs, which is just a more solid version of using the flexible LED strips. (Either would work.) Then use these to display any random real-time status that you want to know about. (Then some nonsense about wire length.)
At 52:00 we start talking about IMU and calibration and storing/retrieving values. (At one point I said "on my BNO085", but I actually have a BNO055.)
Might be useful some time: https://wiki.ros.org/ros_imu_bno055
What's everybody up to?
On the meeting tomorrow I may comment on my grass fed beef purchase, my new Internet access, my thoughts on BNO055 IMU calibration. I didn't get much done this week...
*Thread Reply:* Well, I did not want to bring my hardships here, but the reality is that for work they sent me to the USA (Miami, Atlanta, New York, Houston and Charlotte) supposedly for seven days, and I have been trying to return to Argentina for a month. That is why I was not coming to the meetings. Today I am in Paraguay, which is a bordering country with Argentina, and supposedly next Tuesday I have a private flight to Argentina.. Crazy, but it's my reality.. As for robots, what I was doing adds little to what happens here.. I kept translating the ROS wiki into Spanish and I am trying to install a virtualization of the other robot (the little one) that I am building.. I was remembering you, Jeff, these last weeks because I am stuck in a JavaScript course in the object-oriented programming part..
*Thread Reply:* @Juan Eduardo Riva If you go here, you can walk across the bridge and be home! https://en.wikipedia.org/wiki/Triple_Frontier
*Thread Reply:* @Al Jones of course; but I need my passport signed.. The only legal way to access Argentina is by plane.. Believe me, I know all the other ways
*Thread Reply:* I can't even imagine the situation you are in. My company sent me to Europe twice. But I had no doubt that I would be able to get back home. I can't imagine being in an adjacent country and having to sneak back into my home country.
If you have to sneak in with no signature... I can imagine you getting stopped for a traffic violation in 10 years. They will turn to the "deportation wheel" and give it a spin... It may turn up Siberia, or Antarctica, or "United States, Alabama". (I don't mean to offend Al, but it might turn up "United States, Texas".) If you get deported to the United States, I will try to clean out some room and you can stay with me...
So I hope you can get back home. Legally or illegally.
(We discussed this on the unrecorded Zoom meeting. I hope you were vaccinated before they sent you on a world tour.)
*Thread Reply:* Yes, I was vaccinated.. Thanks for worrying, and now it's all OK. See you today
I told Al I may attempt to get my LED status strip to work. It works... I gave a description of the concept here (@21:00): https://www.youtube.com/watch?v=kZkRv-ihn9o I know I showed the LED status indicator working on a video meeting, but I can't find that. The strip/Arduino accepts a string of 8 comma separated numbers. Each number is a color for that particular LED. Then I created a ROS node that subscribes to whatever I want to monitor. It fills in the field for that LED and sends it out. As a test, I am subscribing to GPS0 and GPS1 and displaying the solution status on two separate LEDs. And I subscribe to the output array from my BNO055 IMU and display the value of the magnetic calibration level on a third LED. It all works, but I need to add a timeout for each topic so it turns the LED off if that topic stops publishing.
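The timeout I still need boils down to tracking a last-update time per LED slot. Here is a sketch of that logic with no ROS dependencies; the class name, slot assignments and color numbers are made up for illustration:

```python
import time

OFF = 0  # color code for "no data"

class LedStrip:
    """Track the last update time per LED slot; stale slots go dark."""
    def __init__(self, num_leds=8, timeout_s=2.0):
        self.colors = [OFF] * num_leds
        self.last_update = [0.0] * num_leds
        self.timeout_s = timeout_s

    def update(self, led, color, now=None):
        now = time.time() if now is None else now
        self.colors[led] = color
        self.last_update[led] = now

    def frame(self, now=None):
        """Return the comma-separated string the Arduino expects,
        blanking any LED whose topic has stopped publishing."""
        now = time.time() if now is None else now
        out = [c if now - t <= self.timeout_s else OFF
               for c, t in zip(self.colors, self.last_update)]
        return ",".join(str(c) for c in out)

strip = LedStrip()
strip.update(0, 3, now=100.0)   # e.g. GPS0 solution status
print(strip.frame(now=101.0))   # LED 0 lit:  3,0,0,0,0,0,0,0
print(strip.frame(now=105.0))   # stale after 2 s: all zeros
```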
@Al Jones Do you have a bagfile with your back wheel position encoders? I was just looking at my wheel odometry calibration. I realized we can convert your back wheel encoders to wheel odometry. (I would do it in Excel, to verify it works.)
I found the YouTube page for OpenCV-AI: https://www.youtube.com/c/OpenCVAI/featured Lots of good info... Did anybody get one of these cameras through the Kickstarter campaign? If so, have you done anything with it? I got the 3D version of the board (OAK-3D?), but I haven't done anything with it.
While I still have an Internet connection... Someone on Home Brew Robotics Group used the worn out phrase (paraphrasing: GPS is noisy and inaccurate). There was a comment about altitude from GPS being unusable... I decided to plot my altitude. I see with RTK correction (and single antenna) I get a difference in altitude of 1.5 meters from one corner of my area (90 meters x 60 meters) to the opposite corner. (North East corner to South West corner) But I did get a massive jump in altitude 🙂 (0.2 meters) when it switches from "fix" to "float". So I normally don't look at altitude, but I did because of the comment. It looks relatively acceptable to me if you are using a modern, multiband, RTK receiver, with correction data. Now I may have to add the Tindie IMU which has a barometric pressure sensor to see what it thinks about altitude...
On the second plot, the bottom plot (greenish) is the fix status. The top line is "fix", the middle line is "float", the bottom line is "single point". (Single point is equivalent to a cell phone or the cheap $20 modules you can buy on eBay.)
I probably won't be on the meeting Friday. No Internet...
I had another revelation. Since I have no Internet, and there is nothing worth watching on over-the-air television, I can do some more ROS-Agriculture video indexing. One down and I have started on a second one. Only another 112 more to go...
I had my first run-away robot. (Well, not the first one ever. But the first one for this robot.) It is still a mystery.
It was driving forward and the steering was not responding. I hesitated and then pressed the eStop on the remote. The beacon did not change from orange to red. And it was still moving. I started to chase it down, hoping the manual eStop would kill it. It stopped before I got to it.
I assume the processor on the remote control receiver died. But it gives me a list of things to check and improve. Like make sure I have a timeout on the motor drive. So if there is no speed request it will stop. (Like the ROS side does.) I think I will remove the 24-volt boost converter. That will reduce the top speed from 2 m/s to 1 m/s. Easier to catch (and do less damage if I don't catch it).
I need to swap the master/slave. Currently the remote is the master, so if the robot loses communication it can't do anything about it. If the robot is the master, then a timeout can stop the robot. I should also consider adding a watch-dog timer to the remote control receiver, so it will reset the processor if the code dies.
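The motor-drive timeout is the same pattern the ROS side already uses. A sketch of the fail-safe logic (the 0.5 second timeout is an arbitrary example, and the class name is mine):

```python
class MotorWatchdog:
    """Stop the drive if no speed request arrives within timeout_s."""
    def __init__(self, timeout_s=0.5):
        self.timeout_s = timeout_s
        self.last_cmd_time = None
        self.speed = 0.0

    def on_speed_request(self, speed, now):
        self.last_cmd_time = now
        self.speed = speed

    def output(self, now):
        """Speed to send to the motor controller right now."""
        if self.last_cmd_time is None or now - self.last_cmd_time > self.timeout_s:
            return 0.0  # fail safe: no recent command, stop
        return self.speed

wd = MotorWatchdog()
wd.on_speed_request(1.0, now=10.0)
print(wd.output(now=10.3))  # 1.0  (fresh command)
print(wd.output(now=11.0))  # 0.0  (command is stale -> stop)
```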
I need to add more debug info so I can better analyze these failures in the future.
I also replaced the eStop switch on my remote control transmitter. It had a fairly large plastic switch. I replaced it with a fancy metal one, like the one on the robot. I haven't run it long enough to see if it solves the intermittent problem.
My Internet is working again. Apparently it was all screwed up.
I did get 12 more ROS-Agriculture meetings indexed. Only 100 more to go...
I did some more testing. I can post something on the dramatic effects of tire pressure on wheel odometry.
And, driving around the block twice (around 600 meters) the BNO055 IMU was tracking well (better than before). And that was with no magnetic calibration. If I load the calibration data at startup, it won't output any data. So I don't do that anymore.
I was able to get the BNO085 to publish to the Float32 array. I went to test and was having issues getting my Ubuntu 20 laptop set up with my Tkinter GUI. Anyway, I started with what Jeff had provided and added some more notes. Still rough. (link)
In the meeting I mentioned changing your print statements for debug or normal output. I was talking about standard C preprocessor directives (here is random documentation on that): https://docs.microsoft.com/en-us/cpp/preprocessor/preprocessor-directives?view=msvc-160 You can mix and match these to do lots of things. Keep in mind this actually controls which lines of code get compiled. So if you change from one format to the other, you have to recompile and download for it to work.
The other option is to just create a variable. So you can do this:
```
if (debug == 1) {
  printf("Fancy text: %d %d\n", x, y);
} else {
  printf("%d,%d\n", x, y);
}
```
Then you can change it on the fly without recompiling.
As I write this... The Adafruit BNO085 board is currently in stock. Maybe not for long. If you have been tracking these, this would be the time to get one before they are gone again. http://www.adafruit.com/products/4754
Apologies, but I'm unavailable today for a call. My day-job is causing the problem. @Juan Eduardo Riva if you are able to have a Google call with @JeffS can you let him know?
Just a couple of comments. I couldn't sleep last night. So I worked through various concepts. First, I worked through how to mount 3 IMUs on my mini-tractor. (The protoboards are now mounted.) I worked through the whole steering definition that Al keeps agonizing over. (Down to lines of code.)
And a third major concept, which I can't remember... (Maybe it was automatic calibration... Or maybe navigation experiments.) According to my USPS tracking info, my Adafruit BNO085 boards should show up tomorrow. I found a new Teensy 3.2 that I should solder the pins on tonight. If I get far enough, I should download and compile some test software for the BNO085. I have a third (well, fourth, if you count the Sparkfun Razor IMU) IMU on my mini-tractor. So with any luck I can get the BNO055 and BNO085 and EM-7180 IMUs talking some time tomorrow. I may unplug the Razor IMU if I need the USB port. If I determine that a single IMU will give adequate results then I can remove the other ones. The BNO055 was giving me good heading info the last time I ran it. (It shouldn't have. But it was.)
I was also contemplating some simple navigation experiments. I may step through those...
I think I have another 27 ROS-Agriculture videos indexed. I think that leaves another 81 or so to go.
My BNO085 boards showed up today. The default address is opposite between the Sparkfun board and the Adafruit board. The direction is 90 degrees from what I wanted, so I will have to rethink the mounting. The examples worked right out of the box (except for the different address). My North angle is off by about 45 degrees. I will have to look into the calibration steps.
All night I fought with the Adafruit software for "THEIR" BNO085 IMU boards. It will not work. It won't run on a Teensy. It looks like they got cute and added another layer on top of the I2C drivers. I can't figure it out...
The Sparkfun code runs. I will go back to that.
*Thread Reply:* Will prices come down? Seems like there is a lot of support on GitHub
https://discourse.ros.org/t/roscon-2021-cancelled-ros-world-2021-announcement/21929
Some info on my Tkinter gui to help start launch files and display topics on my "operator" laptop https://gist.github.com/jones2126/3460e9a51e1bb943eef714edf1c3eb3b
My first results with the BNO085 IMU. The first image shows the GPS Static heading, BNO055 heading and the BNO085 heading. The BNO085 isn't tracking as well as I hoped, but much better than expected. It is doing a pretty good job considering the half-ass attempt at magnetic calibration. This is driving around the block like I usually do.
The second image shows GPS static heading and BNO085 heading. The middle plot shows the "quaternion accuracy level", whatever that is. It doesn't seem to correspond to anything. The bottom plot is the BNO085 magnetic calibration level (0-3). So it is changing while I am driving around.
Now I have something to look at...
It occurred to me that I have a second BNO085 sensor. So I added it to my robot. This image shows the BNO055, BNO085, a second BNO085 and the GPS static heading for reference. Now I have to figure out how to do the calibration. It is not real apparent from the available documentation. The bottom plot shows the calibration level as I drive around.
The BNO055 sensor needs to be moved into four or more directions/angles and held there for 10 seconds each. Depending on the library you can save the calibration. Saving didn't work for me through I2C.
I'm afraid to touch the BNO055 at the moment. It is working the best. I don't think the BNO055 saves calibration to the chip. I think you have to store it locally.
I was just playing with a BNO085. I can't get it to calibrate outside. It seems to calibrate indoors. And I can save to the chip. And everyone is happy. If I take it outside the indoor calibration does not work. But I can't get it to calibrate outdoors...
So that is where I am.
I told the BNO085 to not do background calibration, but it was still changing as I was driving around.
I finally got around to measuring a survey marker. I found a converter on a web page that didn't allow me to type in enough digits. Another page took all of my numbers. I had to convert because I get degrees (dd.ddddd....) and the survey report is in degrees, minutes, seconds (dd mm ss.sss...)
I also found two web pages to calculate distance between two points. Neither of them allow enough resolution. According to them the two points are identical.
MNDOT tells me the surveyed point is: 45 00 21.55959, 93 14 48.90473, which converts to: 45.005988775, -93.2469179806. I measure: 45.00598905, -93.24691679. (This was just a random point and not the average.) That is a difference of 0.000000275 degrees by 0.000001191 degrees. Off hand, I don't know how to convert that to millimeters. I am going to stop worrying about my accuracy until I find a better way to calculate the differences.
I searched for "how many meters in 1 degree of latitude" and it turned up this page: https://www.usna.edu/Users/oceano/pguth/md_help/html/approx_equivalents.htm I thought 1 degree = 1 km seemed awfully small. I knew there was more than 180 km from North pole to South pole... 🙂
It looks like you can use the latitude values directly. If the 7th decimal place is 1.1 cm, then I think I have 2.75 * 1.1 = 3.025 cm of difference in the latitude direction. I don't have the patience to figure out the longitude difference.
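A hedged sketch of the conversion using a spherical-Earth approximation (the mean radius constant is standard; the error is well under a percent at these distances, and longitude degrees shrink by the cosine of latitude):

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius

def deltas_to_meters(dlat_deg, dlon_deg, lat_deg):
    """Convert small latitude/longitude differences to meters north/east."""
    m_per_deg_lat = math.pi * EARTH_RADIUS_M / 180.0
    north = dlat_deg * m_per_deg_lat
    east = dlon_deg * m_per_deg_lat * math.cos(math.radians(lat_deg))
    return north, east

north, east = deltas_to_meters(0.000000275, 0.000001191, 45.006)
print(round(north * 100, 2), "cm north")  # about 3.1 cm
print(round(east * 100, 2), "cm east")    # about 9.4 cm
```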
But considering I am using correction data with an unknown baseline length. And the survey pin is bent at about a 45 degree angle. And I eyeballed the placement of the antenna/tripod over the pin. I'm real happy with a 3 cm error.
I was using the newer correction data which has GPS, Glonass and Galileo data. If I switch to the old version it only has GPS and Glonass. It may be worth a test sometime to see if that makes a difference. And if I had thought ahead I probably could have displayed a screen that would have told me the baseline length...
Does anybody know how to get LEDs to show up in digital photographs? If I take a picture or make a movie with my Nikon camera, the LEDs don't show up. Even though they are plenty bright. Do I need filter on my camera? Or a diffuser in front of the LEDs? Just a strip of white paper will diffuse them. Here is an example photo of my status strip:
When I look closer I can see 4 greens, a blank, 2 blues and one red. I don’t have experience trying to photograph LEDs. I do use a temporary backup camera on my tractor and truck. On the tractor, the monitor is hard to see. I’ve had to experiment with monitor angle, elevation and especially providing shade for the images to stand out. The experimental cardboard shade is kinda like a hood. Btw, your status LED concept is really cool and worth bookmarking for future reference!
*Thread Reply:* I sent him an email and tried to connect, but I lost the opportunity on my side because I was too busy. After that, he never replied to me again.. 🤷♂️
@Al Jones I have been looking at this steering angle non-sense. I thought I was getting reasonable results with my old bagfiles. So I looked at your numbers... Here are some computations. The diameter of the circle measured on the screen isn't real close to the calculated diameter. 1.587 vs. 2.278. But what I am doing isn't real precise. I picked what looked like the smallest complete circle. My comments:
```@ 13:15:57
Diameter from screen: x1 - x2 = (@13:15:57) 18.337 - (@13:16:00) 16.059 = 2.278 m diameter
gps_forward_velocity: sqrt(/vel/twist/linear/x ^ 2 + /vel/twist/linear/y ^ 2) = sqrt(0.4925^2 + 0.7140^2) = 0.714 m/s
bno085_rotvel: ~0.9 r/s
Using equations from this page: http://wiki.ros.org/teb_local_planner/Tutorials/Planning%20for%20car-like%20robots
calculated radius: forward_velocity / rotational_velocity = radius, so 0.714 / 0.9 = 0.7933 m radius = 1.587 m diameter
steer_angle: arctan(wheelbase / radius) = steer_angle (but I don't know your wheel base...)```
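The computations above can be wrapped in a small helper (relations are from the teb_local_planner page linked above; the 1.27 m wheelbase below is just a guess, since Al's actual wheelbase is unknown here):

```python
import math

def recover_radius_and_angle(v, omega, wheelbase):
    """Turn radius (m) and steer angle (rad) from forward velocity v (m/s)
    and rotational velocity omega (rad/s):
    radius = v / omega, steer_angle = atan(wheelbase / radius)."""
    if v == 0.0 or omega == 0.0:
        return 0.0, 0.0  # avoid the divide-by-zero case
    radius = v / omega
    return radius, math.atan(wheelbase / radius)

# Values from the notes above, with a guessed 1.27 m wheelbase:
radius, angle = recover_radius_and_angle(0.714, 0.9, 1.27)
print(radius * 2)  # ~1.587 m diameter
```

Run against any point in a bagfile, this gives the "calculated" diameter to compare against the circle measured on the screen.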
Anyone recommend a web camera, USB or wireless? The RealSense cameras are end of life.
This is where the new Lawn Tractor Automation YouTube channel will be: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ It won't let me change the URL to something reasonable. So we will have this URL for the foreseeable future. The videos are taking forever to upload. I will finish the upload process.
I didn't get too far. My second video upload aborted. It ran for about 1 hour and 45 minutes. It said it had 3 minutes remaining. Then it aborted and told me it exceeded some arbitrary limit. It told me I could "verify my account". That involves supplying them with a phone number (because Google and Facebook can make a fortune by linking a phone number to online data). I don't have a phone, I just had it disconnected. For the moment I am done...
Enough people have run into this problem. There are apparently ways to get around it. I'll track those down...
I have to verify my YouTube account by phone with either a text message or a voice mail. I don't have a phone. There are web sites for verification, like: https://oksms.org/ But the US numbers don't work, I assume they have been used too many times. I can get it to accept a Canadian phone number, but the text message never shows up. So I am still playing with this.
Hey, I got it to work!!! Both my personal account and the new Lawn Tractor Automation channel are now authorized. I can go back to uploading videos...
I got the first actual meeting video uploaded: (actually I have two now) https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/featured I was watching the first one on my TV to make sure it was going to work, and "DOh!!!" I realized I could be indexing it while I was watching it. So I made notes, started over and finished my notes. In the description for the first video you can see how I integrated the chat. And I did the index correctly. So an index entry has a time stamp, the time stamp shows up in the video slider bar. If you click on an index entry it should start playing at that point.
While I was doing that the second video finished. (It took about 1-1/2 hours to upload and another ~1/2 hour to convert.) I can go watch that one now.
@Al Jones The next time you are driving around, can you get a picture of your lawn tractor, in the grass, in front of a soothing background (a hedge or flowers, or just a wall if that is all you have). And an angle that shows that it is a modified lawn tractor with some control electronics on it. I need a photo to replace the "L" icon for Lawn Tractor Automation identity.
Oh, I am trying to get the playlists figured out too. To make organization work out better.
This is what I am thinking for the channel icon. But with Al's lawn tractor (since he started this whole concept). I just used this image as a test. It should have been zoomed out a little more. Because you can crop it down to a circle when it is published. And it will look like this... https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ
Oh, it shows up as a square here. But it is cropped to a circle on the actual page.
It's always somethun... YouTube will not let you put < and > in the description. One guy says to copy the characters from his tutorial video and paste them into your description. That fixes it. Apparently he is using special characters that look like < >. I'll track it down some other day...
Warning!!! If you see comments on the new Lawn Tractor Automation YouTube channel, don't click on any links that may be there.
We seem to have "SPAM Bots" randomly dumping links into the comments. Searching says "Do NOT click on these links!" I may turn off comments completely, if I can figure that out.
I found something in YouTube that says "Hold all comments for review". But setting the global checkbox doesn't affect previous uploads. So I had to change each one separately. And the global setting doesn't seem to work on new uploads. So I will fight with it some more...
I got caught up with the Lawn Tractor Automation videos. The one from last week was on my Windows computer. The Windows computer seems to upload faster than a Linux computer. But while that is running my Roku television refuses to work. I'm not sure why.
I am missing two videos. They were audio only since the videos got lost. If I upload an audio only MP4 file, YouTube refuses to process it. I'll have to look at that.
I had another video that did not finish processing the HD video part. Then a SPAM Bot posted a comment to that video. I can't delete that comment. I assume because it did not finish processing. So I just deleted the entire video and I am re-uploading the video.
I am working on indexing the videos.
*Thread Reply:* That is not the board that I have. Your board looks like this one: https://www.amazon.com/STM32F407VGT6-System-Development-Single-Chip-Learning/dp/B081BCDZ1N/ref=sr113?dchild=1&keywords=stm32f4xx+board&qid=1631210752&sr=8-13 Did you get documentation with this board?
Digikey says it is obsolete: https://www.digikey.com/en/products/detail/stmicroelectronics/STM32F4DISCOVERY/2711743?s=N4IgTCBcDaIM4BcC2BmMAzALAEwJZwGMB7ANwFMAnATxAF0BfIA
Try this link for todays meeting. It is working from my second computer. https://us02web.zoom.us/j/82358108849?pwd=aFJ0MmFHdERmZXdtTGRMZWdmaGlVQT09
I was having runaway problems. And I blew a fuse on my main motor drive. (See the last two meeting videos.)
I have added a USB/CAN adapter to monitor the internal CAN bus. And I am putting the velocity and steering values out to my normal ROS topic. So I can see if the remote control receiver is not transmitting CAN messages. Or the main microcontroller is not receiving CAN messages.
And I added a current monitor to my main motor drive. It gets published along with my BNO055 values to a ROS topic.
Maybe tomorrow I can drive it around and get some data.
*Thread Reply:* @JeffS good morning.. Later yesterday I was thinking.. You collect the data in a bag file, then play that file back and see it plotted. You also have access to the topics and nodes through the console. Is there something I'm not seeing about how you analyze your work and the robot..? @Al Jones you do the same, don't you...?
*Thread Reply:* I have been doing three things.
I have a couple of different programs that will break a bagfile into separate CSV files. I can then open those with Excel or Open Office Calc. Or use the CSV values for anything I need.
I can open a bagfile directly with Plotjuggler. Then I can plot any combination of data. I can scale any plot and apply an offset to any plot. Two big advantages of Plotjuggler... It lines up all plots in time. Excel won't do that. The other very important feature is you can use the slider bar to run the plots forward and backwards.
I have also used a bagfile playback to get realtime data to play with. I wrote a node to subscribe to /gps/vel. Then started the bagfile playback. It extracted forward velocity and heading, and I converted heading to rotational velocity. Then the next time I ran the robot, it created the new topics as it ran.
I did something I thought was clever. I started a bagfile playback, started another bagfile record, then started my new node I was writing. So when it was done, I had a new bagfile with the original data plus the new data I just created. There are times when that may be handy.
I need to look to see if I can edit a bagfile. I would like to go back and delete the ros_out topic from an existing file.
And if you want to know where to look in the last videos, look in the video index in the description of each video.
*Thread Reply:* Nice job Jeff! I like the individual chapters presentation ability!
Thanks..
I did something I thought was clever. I started a bagfile playback, started another bagfile record, then started my new node I was writing. So when it was done, I had a new bagfile with the original data plus the new data I just created. There are times when that may be handy.
when was that case scenario..?
*Thread Reply:* Well, I started a bagfile playback. My /gps/vel to heading-velocity-rotational_velocity node created new topics and the rosbag record created a new bagfile with the original data and the new topics I created.
I added my current sensor to my main motor. Actually it was between my battery and my 12V to 24V up-converter. I drove around. The top plot is motor current. The bottom plot is vehicle velocity. The two big spikes on the left are hard acceleration. The first one is forward. The second one is reverse. The big spike on the right is when I drove from my sidewalk up onto the grass. The zoomed in view shows the current maxed out the A/D. When I calculate current, that level is 30A. So I guess my 20A fuse had a right to complain. I took my 12V to 24V converter out and I will try it again tomorrow to see what the numbers are.
I did not get a run-away condition to analyze. But my steering did something stupid, so I have a new thing to look at.
If you have a flux core welder, this video is worth watching: https://www.youtube.com/watch?v=AIWnygcnGBY
Here is the Wiki page I was trying to find in the Zoom meeting today. http://wiki.ros.org/teb_local_planner/Tutorials/Planning%20for%20car-like%20robots
It has two key points. First, it has a diagram to show the relationship of velocity, rotational velocity, steer angle, turn radius and wheel base. It also has a version of a program to convert from rotational velocity to steer angle. (This is the same code that is inside of the TEB planner to do the same conversion.) https://www.youtube.com/watch?v=nAHbvCe2bTU
This is a code snippet from gps_odom.py which takes lat/lon and publishes odom:
```
if data.status.status == 2:
    # Convert from lat/lon to x/y
    _xg, _yg = gc.ll2xy(data.latitude, data.longitude, _GPS_origin_lat, _GPS_origin_lon)
    # Build Odometry Message
    self.odom_msg.header.stamp = rospy.Time.now()
    self.odom_msg.header.frame_id = "odom"
    # self.odom_msg.child_frame_id = "base_link"
    self.odom_msg.child_frame_id = "base_footprint"
    # set the position
    self.odom_msg.pose.pose = Pose(Point(_xg, _yg, 0.), Quaternion(*self.odom_quat))
    self.odom_msg.pose.covariance[0] = data.position_covariance[0]  # x
    self.odom_msg.pose.covariance[4] = data.position_covariance[4]  # y
    self.odom_msg.pose.covariance[7] = data.position_covariance[8]  # yaw
    # Publish Transform between odom and base_link
    odom_broadcaster = tf.TransformBroadcaster()
    # odom_broadcaster.sendTransform((_xg, _yg, 0.0), self.odom_quat, rospy.Time.now(), "base_link", "odom")
    odom_broadcaster.sendTransform((_xg, _yg, 0.0), self.odom_quat, rospy.Time.now(), "base_footprint", "odom")
```
```
<!-- define gps joint -->
<joint name="gps_joint" type="fixed">
  <parent link="base_link"/>
  <child link="gps"/>
  <origin xyz="0.46 0 0.61" rpy="0 0 0" />
</joint>
```
Fyi, this is how the base_footprint joint is defined:
```
<!-- define base footprint joint -->
<joint name="base_footprint_joint" type="fixed">
  <parent link="base_footprint"/>
  <child link="base_link"/>
  <origin xyz="0 0 0.25" rpy="0 0 0" />
</joint>
```
Another issue at some point to deal with is the GPS is sitting .86 meters above ground. On a hill, without adjusting for the angle, there will be an offset that should be included.
*Thread Reply:* is there a reason you all didn’t choose navsat transform
*Thread Reply:* navsat transform takes in IMU and GPS and publishes odometry http://docs.ros.org/jade/api/robot_localization/html/navsat_transform_node.html
*Thread Reply:* @Al Jones I have been thinking about this. I have bookmarked a bunch of videos on TF I want to watch. But I think what needs to (or could) be done is to subtract the difference between GPS and base_link before using the data. Something like this:
_xg, _yg = gc.ll2xy(data.latitude, data.longitude, _GPS_origin_lat, _GPS_origin_lon)
_xg = _xg - X_DELTA
_yg = _yg - Y_DELTA
Where X_DELTA and Y_DELTA are constants from the GPS to base_link transform.
A better way would be to query the GPS to base_link transform for those values at runtime. I'll look at that...
*Thread Reply:* "Another issue at some point to deal with is the GPS is sitting .86 meters above ground. On a hill, without adjusting for the angle, there will be an offset that should be included."
This may be as simple as adding pitch and roll from the IMU into the GPS to base_link transform. Instead of using a static transform (either a static transform broadcaster or a static link in a URDF) you could create a dynamic transform broadcaster. Supply the x/y/z offset which was previously defined. Then add in the pitch and roll from the IMU. Maybe it is more complicated than that...
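To get a feel for how big that slope error is, here is a rough sketch (the simple height-times-sine projection is my assumption, not tested code; the 0.86 m antenna height is from the message above):

```python
import math

ANTENNA_HEIGHT = 0.86  # meters above ground, from the URDF discussion above

def tilt_offset(height, pitch_rad):
    """Approximate horizontal displacement of an antenna mounted
    `height` meters above the base when the vehicle pitches by `pitch_rad`."""
    return height * math.sin(pitch_rad)

# On a 10-degree slope the antenna reads ~15 cm away from the true base position:
print(tilt_offset(ANTENNA_HEIGHT, math.radians(10)))  # ~0.149 m
```

So on any real hill the offset is several centimeters, which is the same order as the RTK accuracy being chased here.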
I am finally up to date on the new You Tube channel. I got the audio only videos to upload. I had to convert them with this program: https://www.openshot.org/
Not exactly on topic, but... This is not a refurbished steam tractor. They built it from scratch from old drawings. And it shows the power of steam. https://www.youtube.com/watch?v=FuJbfrcFKyU Now if they would add some GPS, a computer and some steam powered actuators, they could have an autonomous steam tractor.
*Thread Reply:* Also add a Stirling engine on the wasted heat to generate electricity for technology.
*Thread Reply:* Upon further review.. use thermoelectric materials to generate power for electronics? Come to think of it.. utilizing heat from a muffler/exhaust pipe of a lawn tractor, farm tractor?
*Thread Reply:* Uploaded where? In Slack I get two images that have the bagfile name. I assume you do too if you look in the Random channel.
*Thread Reply:* Oh, I see. You put it in the Google drive. (If I had read your new image.) I will look at it.
Okay, first thing... What are /speed, /left_speed, /right_speed? I assume left_speed and right_speed are scaled values from your wheel sensors. From my test, /speed does not appear to be forward speed (average of left_speed and right_speed).
Forward speed @00:00::50.092
/speed = 0.887
/left_speed = 0.848
/right_speed = 1.578
Does (left_speed + right_speed) / 2 equal /speed?
(0.848+1.578)/2 = 1.213 which does NOT equal /speed
Now I will go and attempt to extract forward speed from GPS...
Checking your GPS speed at the same instant in time I get:
GPS forward speed @00:00::50.092
To get forward speed from GPS, calculate length of vector from /vel at this point.
x = -0.757
y = -0.462
GPS forward velocity = sqrt(x^2 + y^2)
sqrt(-0.757 * -0.757 + -0.462 * -0.462) = 0.8868 m/s (which is close to /speed)
Next I will attempt to extract one or more rotational velocities...
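The forward-speed calculation above is just the length of the /vel vector; a one-liner helps for checking points like this:

```python
import math

def gps_forward_speed(vx, vy):
    """Forward speed (m/s) as the magnitude of the GPS velocity vector."""
    return math.hypot(vx, vy)

# The sample above:
print(gps_forward_speed(-0.757, -0.462))  # ~0.8868 m/s, close to /speed
```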
Another point to remember. I created a second plot (of a heading) and it screwed up my location. I had selected a location with the slider at the bottom of the screen. Now when I go back to reset the slider, I can't hit the same spot. But I will try to get close... (Actually it shouldn't matter since you have a constant speed and constant rotational velocity) (and at the moment I am just trying to figure out the steps)
While I am typing, I have a question. In the area I am in, /bno085_rotvel = 0.9 and I see the other values from the bno085 all have just one place past the decimal point. Was that intentional?
Oh, now I see a 0.92 on /bno085_rotvel. Maybe it was just a coincidence that it only had 1 place past the decimal point.
Here is another interesting point about Plot Juggler (somebody should make a list of interesting points...)
You can open a plot. Zoom in to what you want to look at. Then drag the slider back and forth to select a point in time. A little marker follows the path so you know which sample you are looking at. But... if you just have a smooth curve you can't tell where you are. So... right-click on the curve and select Line Styles. Under that select "Dots" or "Lines and Dots". Now when you step through time with the slider you can see the individual samples.
More importantly, you can click the button on the slider to get focus. Then you can use the right and left arrows on your keyboard to step forward or backward one sample at a time. That eliminates a lot of aggravation.
So below I am looking at Al's plot to extract rotational velocity from GPS. I didn't know his GPS sample rate. (I assumed it was 10 hertz) I go to the sample I want, look on the left under /vel/header and I have sequence number and time stamp. The point I'm on has seq of 240 and a time of 1632692473.68. If I back up one step I see seq of 239 and time stamp of 1632692473.478. If I subtract the times I get 0.202, which is 5 hertz. (not the 10 hertz I had guessed at)
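That rate check is just the reciprocal of the difference between two consecutive header stamps (stamps copied from above):

```python
def sample_rate_hz(t_prev, t_curr):
    """Approximate topic rate (Hz) from two consecutive header time stamps."""
    return 1.0 / (t_curr - t_prev)

# seq 239 -> seq 240 from the /vel/header stamps above:
print(sample_rate_hz(1632692473.478, 1632692473.68))  # ~5 Hz (0.202 s period)
```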
I don't have the ambition to fix this right now. I had my television plugged in to give me a dual screen. It takes a screen capture of both screens. I unplug the HDMI but it still thinks the second screen is there. I may have to reboot to get rid of it...
I'm tired of fighting to get rotational velocity from GPS. So let's go with what we know. From the TEB planner code:
```
double TebLocalPlannerROS::convertTransRotVelToSteeringAngle(double v, double omega, double wheelbase, double min_turning_radius) const
{
  if (omega == 0 || v == 0)
    return 0;
  double radius = v / omega;
  if (fabs(radius) < min_turning_radius)
    radius = double(g2o::sign(radius)) * min_turning_radius;
  return std::atan(wheelbase / radius);
}
```
Ignore the nonsense about minimum turning radius. That is just setting a limit.
Your IMU says either 0.9 or 0.92 radians/second at the points that I randomly chose. For forward velocity we could use either 0.8868m/s from GPS or 0.887m/s from /speed. From memory I think your wheelbase is 1.27m. So:
radius = 0.887 / 0.9 = 0.9856 m (I measure 1.243 m on the Plot Juggler screen from the GPS ground plot)
angle = atan(1.27 / 0.9856) = 52.18628454 degrees (0.91082249 radians)
Does that seem at all close?
PS: I use this calculator since my Linux calculator does not seem to have atan. https://www.rapidtables.com/calc/math/Arctan_Calculator.html
To sum up. To get recovered radius and/or recovered angle, read instantaneous forward velocity and instantaneous rotational velocity from a random point in time. Then use the two equations: radius = f_velocity / r_velocity and angle = atan(wheelbase / radius). (But keep in mind that if r_velocity = 0, then it will die with a divide-by-zero error.) And keep in mind you seem to have two sources for both f_velocity and r_velocity. You can get speed from your wheel sensors and from your GPS. You can get r_velocity from your IMU or tease it out of GPS.
Just adding (subtracting) an x/y offset won't work. The values derived from GPS are in World coordinates and don't account for rotation of the vehicle. So you would have to do the "translate point" thing.
_xg = (X_DELTA * sin(heading)) + _xg
_yg = (Y_DELTA * cos(heading)) + _yg
(The sin and cos may have to be reversed)
This node already subscribes to a heading value. So this may work.
There may be a clever way to ask the transform system to subtract the offset.
I should dig through the navsat_transform node to see how they do it.
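To make the "translate point" idea concrete, here is a sketch (the sign convention and which axis gets sin vs. cos are my assumptions and may need swapping, as noted above):

```python
import math

def correct_antenna_offset(xg, yg, heading_rad, x_delta, y_delta):
    """Shift a GPS-derived world x/y back toward base_link by rotating the
    antenna mounting offset through the vehicle heading before subtracting."""
    xc = xg - (x_delta * math.sin(heading_rad))
    yc = yg - (y_delta * math.cos(heading_rad))
    return xc, yc

# With heading = 0 the x value is untouched and y is pulled back by the offset:
print(correct_antenna_offset(10.0, 20.0, 0.0, 0.67, 0.67))  # ~(10.0, 19.33)
```

The point is that the correction rotates with the vehicle, so a fixed x/y subtraction in world coordinates can never be right.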
*Thread Reply:* Well, the circles I posted are the result; they are ~2.2 to 2.4 meters in diameter
I moved my back GPS ahead by 8cm. That should put it directly over my base_link. I was thinking I could feed my front GPS into a second copy of gps_odom.py and tweak the code until it reports the same as base_link.
Then I just had an inspiration. I can play back a random bagfile, feed the front GPS into a second copy of gps_odom.py and tweak the code until it reports the same as base_link. So if I playback this file: https://tractorautomation.slack.com/files/U01LCR21M8B/F0266G8GB8D/screenshotfrom2021-06-2520-26-05.png?originteam=T8WP3RHH7&originchannel=Pbrowse-channel-files And when I get it tweaked, those plots should be on top of each other. I can test it without driving...
I have been fighting with tools again.
Plot Juggler has decided that the time slider bar will not let me scroll back and forth in a plot. (sometimes) It may be because I am recording a new bagfile from a bagfile playback. Or because I am specifying a start time and duration in the bagfile playback when I convert. Or my new node (a second copy of gps_odom.py) is screwing up everything. It works just fine on the original bagfile I started with. So it is something I did that breaks it.
I am getting some correction of the GPS mounting location to work. I seem to be able to get the vertical (latitude direction) to work. But my horizontal (longitude direction) is off. It might be because my front GPS is running moving-baseline instead of RTK. I have to move one wire to switch from moving-baseline to RTK. (selects source of correction data) But I then lose my source of heading that I was using. My bno055_heading message is in the same format, so I may be able to substitute that.
Or I may just be confused and I am missing something...
Here are the results from my attempt to offset the GPS based on location.
The first plot is the back and front antenna zoomed out. The second plot is zoomed in to the bottom. The third plot is zoomed in to the top.
As I was typing this I realize what the problem is. I specified the horizontal offset as 0. So it did no correction in the horizontal direction. So I am missing something...
This is what I attempted to do. Which only corrects for front to back distance and not side to side. (As shown in plots.) But I know what to look at now.
The values of 0.0 and 0.67 are the offsets from the base_link to GPS transform.
```
# Convert from lat/lon to x/y (x is East, y is North)
_xg0, _yg0 = gc.ll2xy(data.latitude, data.longitude, _GPS_origin_lat, _GPS_origin_lon)
_xg = _xg0 - (0.0 * math.sin(self.heading))   # Longitude
_yg = _yg0 - (0.67 * math.cos(self.heading))  # Latitude
```
I wrote a node to convert forward_velocity, rotational_velocity and wheel_base to radius and angle. I ran this on Al's latest bagfile. I will put the code in a thread. And put some plots in the thread.
*Thread Reply:* Here is the code I ran
```
#!/usr/bin/env python
import rospy
import math
from std_msgs.msg import Float32

WHEELBASE = 1.27 # In radians

class MainClass():
    def __init__(self):
        # rospy.init_node('fvel_rvel_to_angle', anonymous=True)
        self.relpos_sub = rospy.Subscriber('/speed', Float32, self.speed_callback)
        self.relpos_sub = rospy.Subscriber('/bno085_rotvel', Float32, self.rotvel_callback)
        self.pub_angle = rospy.Publisher('recovered_angle', Float32, queue_size=1)
        self.pub_radius = rospy.Publisher('recovered_radius', Float32, queue_size=1)
        self.rotvel = 0.0
        while not rospy.is_shutdown():
            pass
        # rospy.spin()

    def rotvel_callback(self, msg):
        self.rotvel = msg.data  # Save for later

    def speed_callback(self, msg):
        speed = msg.data
        rotvel = self.rotvel  # Get a local copy
        if (speed == 0.0) | (rotvel == 0.0):
            angle = 0.0
            radius = 0.0
        else:
            radius = speed / rotvel
            angle = math.atan(WHEELBASE / radius)
        self.pub_angle.publish(angle)
        self.pub_radius.publish(radius)

if __name__ == '__main__':
    rospy.loginfo("fvel_rvel_to_angle.py")
    rospy.init_node('fvel_rvel_to_angle')
    node_object = MainClass()
    # try:
    #     talker()
    # except rospy.ROSInterruptException:
    #     pass
```
I ran this against the bagfile. (2021-09-26-17-40-23.bag) Al should be able to add this to his launch file and create this in real time.
*Thread Reply:* Here are some plots. The first one shows the output of recovered_angle and recovered_radius. They are much noisier, but I zoomed in... The second plot shows joystick steer request, front angle target, front angle raw, and recovered angle.
At the time I have selected, your joy is asking for 1.0. You scaled that to 14000. The steering looks like it is tracking at 13946. The recovered angle at that time is 0.8885 radians (50.91 degrees).
Coincidentally, when I manually converted a single point of the previous bagfile, I got the exact same number. They should not match that close but it shows I apparently got this right.
So that says 1.0 -> 14000 -> 0.8885 radians
So I would scale the modified_cmd_vel of +/-0.8885 to +/-14000. Or more specifically:
if cmd_vel.z < 0 map(0, -0.8885, 0, -14000) else map(0, 0.8885, 0, 14000)
Or however that map command works. You already have a command mapping +/-1.0 to +/- some steering value. Just use the new numbers.
And make sure you limit the travel of the steering if TEB planner puts out too big of steering angle number...
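The map() above is the Arduino-style linear remap. A Python sketch of that mapping plus the travel limit (function names and numbers here are just illustrations):

```python
def map_range(x, in_min, in_max, out_min, out_max):
    """Arduino-style linear remap of x from [in_min, in_max] to [out_min, out_max]."""
    return (x - in_min) * (out_max - out_min) / (in_max - in_min) + out_min

def clamp(x, lo, hi):
    """Limit steering travel so an oversized planner angle can't over-drive it."""
    return max(lo, min(hi, x))

# Scale a recovered steer angle (radians) to steering counts, then limit travel:
counts = map_range(0.8885, 0.0, 0.8885, 0.0, 14000.0)
print(clamp(counts, -14000.0, 14000.0))  # ~14000
```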
*Thread Reply:* @Al Jones This is the thread with the fvel_rvel_to_angle.py node.
*Thread Reply:* @Al Jones In the code above, change
while not rospy.is_shutdown():
pass
# rospy.spin()
to
# while not rospy.is_shutdown():
# pass
rospy.spin()
otherwise it chews up massive amounts of CPU time.
*Thread Reply:* @JeffS In "fvel_rvel_to_angle.py" you have
WHEELBASE = 1.27 # In radians
Is the comment "In radians" correct? I'm not sure how to translate a linear measurement (e.g. 48" wheelbase) to an angular measurement in radians. Any suggestions?
I figured out why the time slider on Plot Juggler does not work on my new bag files. I think the time stamp on my new topic I created has a time/date that is days or weeks newer than all of the previous data. I have tried several things to fix it. But I'm done fighting for tonight. Maybe tomorrow...
I still haven't figured out the time slider on Plot Juggler. But it does have something to do with merging old and new time stamps.
I made a breakthrough on remapping the GPS antenna location. It looks right but I haven't verified that mapping GPS antenna to base_link is really what I should be doing...
```
# Convert from lat/lon to x/y (x is East, y is North)
_xg0, _yg0 = gc.ll2xy(data.latitude, data.longitude, _GPS_origin_lat, _GPS_origin_lon)
_xg = _xg0 - (0.67 * math.sin(self.heading))  # Longitude
_yg = _yg0 - (0.67 * math.cos(self.heading))  # Latitude
```
I put the offset distance into both lines of code instead of just one (like I posted on Friday). I currently have an offset error of around 7 - 8 cm. But I told it the distance from the front antenna to base_link (0.67m). But I am actually using the back antenna and front antenna as reference, so the distance should have been 0.75m. So that accounts for the 8cm of error.
The first plot shows the front antenna with NO correction mapped to the back antenna. The second plot is the front antenna WITH correction mapped to the back antenna. The second image is zoomed in versions of the top area and the bottom area.
So I think I have gps_odom reporting the location of base_link even though the antenna is offset. I haven't checked what this does to transforms...
Looks like the Sun is up. I should go do some driving around. Or take a nap.
When I zoom in I see my GPS (uBlox F9P with RTK corrections) has more jitter than I would prefer. I was concerned that gps_odom.py was introducing the jitter. So I plotted the direct GPS Lat/Lon and the output of gps_odom.py side by side. I turned the dots on so I could see the individual samples. I decided they look the same.
It would help to speed up the output rate of my GPS. It is currently set to 5Hz. I think I can just tell it to run at 8Hz and it should work with 4 satellite constellations. I think it said somewhere that if I run 3 constellations then I can run at 10Hz. (So I could throw out BeiDou. Or it may be disabled anyway.) And I "think" it said that if I run just the GPS satellites then I can set it to 20Hz. I'll have to verify those references to see if it really says those things. And it would be interesting to see how accurate it will be with just GPS satellites enabled.
I gave a description at the end of last Friday's meeting of what I was trying to do with the GPS location remapping. It was after Juan and Al had signed out, so I am the only one that has seen it. The time stamp for that section should be on the index in the video description.
Here is a link to the YouTube channel, in case people didn't know we have a new Lawn Tractor Automation YouTube channel. https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/featured
*Thread Reply:* sorry for having to retire last Friday. I'm really going to have to spend a lot of time on Plotjuggler because I see that it is very useful.. So.. As main debugging tools.. would they be rosbag and Plotjuggler only, or is there something else..?
*Thread Reply:* I use whatever seems appropriate.
RVIZ, either live or a bagfile playback.
The RQT plot programs. (either live or bagfile playback)
Convert bagfile to .CSV files. This is good for viewing/plotting a single topic at a time. https://pypi.org/project/cnspy-rosbag2csv/ https://github.com/AtsushiSakai/rosbag_to_csv (One of these works better than the other. But I forgot which one.) Then you can view/plot them in Excel. And you can do further calculations with the CSV values as input. So, for instance, we could take Al's bagfile and create wheel odometry from his wheel encoder messages.
I use Plot Juggler to view bagfiles. There may be a way to do more calculations in Plot Juggler (like you would do in Excel) but I haven't figured that out.
I think Al said he is able to watch real time ROS topics in Plot Juggler. I haven't tried that.
And sometimes you just have to watch topics scroll by when using "rostopic echo".
@Bob Hassett I found this image in a video https://drive.google.com/drive/folders/1u2zG9WfJiiIcmby0f9IYj108OB1lsngI In this directory https://drive.google.com/drive/folders/1eOakFydReiihAvkAY_bPpzF-xGEUHlpZ
Here is the link to the OpenCV AI YouTube channel: https://www.youtube.com/c/OpenCVAI
If you search the channel for "agriculture", several videos show up.
@JeffS thank you. Very timely, at least for me. I’m starting to look at CV on the Jetson Nano 4 G. Just need a couple little items before I start smoke testing ;-)
Here are some images for reference. This one shows the front GPS antenna remapped over base_link. The blue line is the back antenna, which has been remounted directly over base_link. The red line is the actual location of the front antenna. The green line is the front antenna remapped over base_link. So you can see both antennas are reporting as the base_link location.
Hi all/Bob! Jeff thanks for the invite to get back involved in this group.
I have figured out several things. Depending on which [GPS] driver you pick, you will get different topics published. That is obvious up front.
The nmea_navsat driver publishes a /heading topic which is a quaternion. The ublox driver does not. So I had to create a /heading topic. More subtle, the topics may not be defined the same. The nmea_navsat driver publishes /vel and /heading in NED format. The ublox driver publishes /vel and /navheading in ENU format. So you have to do some digging to figure this out.
I told Al that the new Plot Juggler will convert a quaternion to Euler angles, but it is a pain and not worth doing. I played with it some more and figured out how to use it. So I guess it is usable.
I was also complaining that if I do a screen dump I don't know which file it was created from. The new version of Plot Juggler fixed that. The file name is now shown at the top of the list of topics. (See below)
And I found out that if you have a subset of topics selected and want some more, you can reopen the file and click a few more topics and they will be added to your list. I always assumed I had to close everything and reopen the file...
And... I found out I can run a second copy of Plot Juggler at the same time. (See below) I have my bagfile on the left and Al's bagfile on the right.
I just got an email that says: ```The lawn_tractor team on the "Robot Agriculture" organization has been deleted.
Cheers & Octocats, GitHub Support``` So it looks like the systematic deletion of ROS Agriculture is continuing... The original YouTube videos are still there, for the moment.
*Thread Reply:* Can’t help but think some people want to monetize available information?
This might be already known, but for YouTube it's possible to download the videos; however, the copyright part could be an issue unless there is a written transfer/permission by the creator.
I went through the git page and cloned things I thought I might want. The page is still there if you want to clone or fork images.
A quick look at a couple of repositories shows they have been forked numerous times. So it might be possible to search in the future and find copies of things.
Ardusimple announced a new RTK receiver. It seems to have much better specs than the F9P-based board. It also appears to be about 3x the cost. But I no longer have money to spend on the latest toys. https://www.ardusimple.com/simplertk3b-receivers/
This is the link to the wheel odometry message that I mentioned in the meeting. https://answers.ros.org/question/249001/yaw-problem-for-imu-fusinon-in-robot_localization/
Anyone know of any foss raspberry pi board hats that would make a good integrated ag display (preferably with canbus)? I was looking for something similar to https://i-carus.com/shop/126/
That is an interesting point. I didn't know that type of integration existed. I knew you could piece modules together to come up with the equivalent, i.e., a random RPi, an LCD display, a power supply, various interface modules. It seems that if you had the right search terms you might find tons of solutions. But I can't find search terms to locate anything. I tried HMI (Human Machine Interface) but that didn't bring up much (some software solutions). I tried "raspberry pi car computer" and didn't find anything good. Anybody have suggestions for search terms?
How about https://linuxgizmos.com/automotive-hat-offers-canbus-and-12v-5v-converter/
That looks pretty nice. What I think would be a low cost shortcut, but probably not doable, is one of those low cost Atoto A8 car double-DIN radios and somehow getting Ubuntu to run on them. They typically have a touch screen, video in, a couple of GPIO, canbus (for steering wheel controls), etc., and since they are mass produced they can be purchased for about 200 USD. I guess for now I'll stick with my tablet and a USB-to-CAN adapter. Maybe I'll roll a Raspberry board for a winter project once the crops are all cut.
I used Mat Sadowski's write-up for robot_localization using GPS only: https://msadowski.github.io/ardusimple-ros-integration/ https://github.com/msadowski/ardusimple_rover (I'm not sure he had the firmest grasp on the F9P and its ROS driver. Or I may be missing something.) I copied his config files and changed a couple of names to match, and copied a couple of things from his launch file.

It was starting to rain so I quickly drove around the front yard: two circles and two squares. The first image is the back antenna, the front antenna and something called /gps/filtered (I'm not sure what that is, but I was publishing it). The second image shows the offset in meters. The purple line is /odometry/filtered. The blue line is /odometry/gps. I "think" the blue line is the EKF-filtered position, but I'm not sure. The whole plot has an odd rotation, and the origin appears to have started where I turned on the robot. It may have set the rotation based on that.

So I got past the first step of making something work. Next steps: I see I can add a parameter to set an origin where I prefer, which may also fix the overall rotation. I need to turn my gpsodom.py back on so I can compare against that. And I need to check whether the GPS antenna is being mapped over base_link.
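For the "set an origin where I prefer" part, the parameter I was reading about is the datum on navsat_transform_node. A hedged sketch of what that might look like in a launch file (the coordinates are made-up placeholders, and the datum list format differs between robot_localization releases, so check the docs for your version before trusting this):

```xml
<!-- Sketch only: pin the navsat_transform origin instead of letting it use
     wherever the robot powered on. Lat/lon here are placeholders. Older
     robot_localization versions take [lat, lon, yaw, world_frame,
     base_link_frame]; newer ones take just [lat, lon, yaw]. -->
<node pkg="robot_localization" type="navsat_transform_node" name="navsat_transform" respawn="true">
  <param name="wait_for_datum" value="true"/>
  <rosparam param="datum">[45.0000, -93.0000, 0.0, map, base_link]</rosparam>
</node>
```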
Maybe I can do some more testing today...
I just noticed something... I don't get the zoom scroll bar on Slack when I zoom in on an image. Maybe if I restart the computer that will come back? Or maybe they thought people don't really want the ability to view an image the way they want?
*Thread Reply:* I downloaded the images. I will try to repeat your work in the sim.
Sometimes it pays to look at the obvious. I had numbers in the wrong fields in the static transform publisher. Maybe next time I run it, it will work better.
static_transform_publisher x y z yaw pitch roll frame_id child_frame_id period_in_ms
I had
<node pkg="tf" type="static_transform_publisher" name="base_link_to_gps0" respawn="true" args="0.0 0.0 0.0 -0.08 0 0 base_link gps0 100"/>
<node pkg="tf" type="static_transform_publisher" name="base_link_to_gps1" respawn="true" args="0.0 0.0 0.0 0.67 0 0 base_link gps1 100"/>
It should be
<node pkg="tf" type="static_transform_publisher" name="base_link_to_gps0" respawn="true" args="-0.08 0.0 0.0 0.0 0.0 0.0 base_link gps0 100"/>
<node pkg="tf" type="static_transform_publisher" name="base_link_to_gps1" respawn="true" args="0.67 0.0 0.0 0.0 0.0 0.0 base_link gps1 100"/>
My latest run. The first image on the left is the back GPS, the front GPS and the GPS output from robot_localization. The right side is the 4 outputs from my gpsodom nodes. The middle image on the left is the incoming GPS converted to offset in meters, plus the ekf output. The right side is the ekf output and the back antenna remapped to base_link. You may see an offset when moving vertically but not horizontally. The third image on the left side is zoomed in so you can see the offset. The right side shows that the path is referenced to the origin I defined in my yard.
After I fixed the static transform publisher values I may get better results next time. At this point I am not sure if the navsat_transform/robot_localization pair is remapping the GPS antenna to base_link.
I changed the parameters on my two ublox F9Ps to output at 10 Hz instead of 5 Hz, and cut the constellations down to just GPS and GLONASS. The output looks awfully noisy. I will probably set it back to 5 Hz with GPS+GLONASS+Galileo and see if it calms down. (I am so confused...)
*Thread Reply:* I don't know why it is doing that. I will switch it back to GPS+GLONASS+Galileo at 5 Hz (I can get corrections for those 3) and see if it goes back to something usable. If I want to play, I could change just the sample rate and see if it improves, or run a whole series of experiments. But I know it was working well before, so I will probably just go back to that.
I added the latest index for ROS Agriculture here. (166 of 250) And I am going to try to keep the Lawn Tractor Automation file updated weekly. That way there is a central site to search the Lawn Tractor meetings. http://sampson-jeff.com/RosAgriculture/
Anyone seen this before? https://mushr.io/ @Juan Eduardo Riva Note there is a reference to "classroom ready". Made me think of some of the things you have mentioned before.
A couple of things. I changed my GPS back to 5Hz update rate. It didn't completely fix my problems. So maybe the sample rate wasn't really a problem. The first image shows the 5Hz which can be compared to the 10Hz image I posted last week.
I still have a weird offset between the gps_odom output and the ekf output. At the moment I am totally baffled as to why. The second image shows that.
*Thread Reply:* Jeff, excuse me, but next time could you do left, right and straight? Because looking at it, it seems it only does it when it turns, or am I wrong?
The offset I am seeing is in the image at the far right. The offset is there when it is going straight (sometimes) and when it is turning (sometimes). I will have to dig some more.
It is getting cold outside. I want to have a simulator for this Winter. Nine months ago I played with the picar Gazebo simulator. This might be what I started with: https://gitlab.yonohub.com/mohamedahmed/picar My post from 9 months ago: https://tractorautomation.slack.com/files/U01LCR21M8B/F01LRAR0FPG/peek_2021-02-03_22-47.gif
I played awhile tonight. I got it running again. Here is a screen shot. One window is Gazebo with the car driving in a circle. The bottom window is a command line publishing an Ackermann message to make it move.
I guess the next step is to resize things (wheels and frame) to match my mini tractor. If I get that to work and be controllable, then I will add in the sensors (dual GPS, IMUs...)
I decided the URDF file for that example is more complicated than I would prefer. I found a couple other cleaner examples: https://www.youtube.com/watch?v=gPNLd4F7HPA https://github.com/robustify/audibot
https://www.youtube.com/watch?v=m8aK8_DIIeo https://github.com/mfilipen/autonomous-mobile-car-like-robot
I think I will play with these when I get a chance. And at the moment I am just looking at examples for a URDF file...
I can make complicated URDF files for you if required. I am an expert in that. Just tell me what type of URDF you want.
At this point I am not looking for a complicated URDF file. Actually, I want the simplest file I can get by with. At this point I just want 4 wheels that are steerable. And then there is the extra support code to make it work. The two examples I quoted are worth looking at for me. One has a single file with the macros embedded; the other has the macros separated out. I will look to see which makes the most sense to me and which I think would be easiest for someone else to modify.
So the most important thing to me right now is just figuring out how to do this so I can change and add to it. Hopefully, if I get something that I am happy with, then somebody else can measure their machine and use those numbers to create a simulation close to their own machine. People can always expand their model with bells and whistles. And fancy meshes to make them pretty.
But I will keep in mind that you can answer questions on this...
I don't mean to complain... But I am going to...
A few days ago GoDaddy sent me an email saying they are moving my email to a new system. It was free for 10-20 years. But after a trial period, it will now have a monthly fee. I spent an hour or two trying to set up a free gmail account. Yes, I know they are going to steal my information. But after going around in circles for a long time, I now have a brand new gmail account. I told my pobox.com (jsampson@pobox.com) to forward to my new gmail account. That seemed to work. Now I have to figure out how to get Thunderbird to download from my gmail account. Then I have to figure out how to delete my GoDaddy email account.
Years ago GoDaddy told me that they will silently delete random messages and not bother to tell me about it. (And they were proud of that.) So I added a Yahoo email account as a backup. So basically both my GoDaddy and Yahoo accounts were redundant backups. Occasionally I would start an old copy of Thunderbird to download all of the email for archival backup.
Now I need to configure Thunderbird to download the gmail account. THEN I can delete both GoDaddy and Yahoo accounts...
I hate computers!!! And the idiots behind them... (On Twin Cities Robotics Group they have heard for 20 years that I hate computers.)
So if I sound grumpy that is why. (Oh, and the Minneapolis city elections, which is now out of the way for me.)
*Thread Reply:* @vinny ruia I really appreciate the work you did with this. It is the tool that most helped me to learn
To discuss today https://gist.github.com/jones2126/a9575cdda87b2b3c8ba431b9f38b8449
Here are some of the things I was talking about in today's meeting:
Video to verify Ackermann steering: https://www.youtube.com/watch?v=WlFdLkD3wa8
The one where he says "I get tired of no documentation": https://github.com/chrissunny94/ackerman_ros_robot_gazebo_simulation
Let's try this. A post for our last meeting. 11/05/2021 https://youtu.be/d6QbVnTnmNg
Chat: Jeff: Al's code for today: https://gist.github.com/jones2126/a9575cdda87b2b3c8ba431b9f38b8449
Index:
00:00 Jeff: Talks about the meeting recording process.
01:25 Juan: Shows his small model robot.
03:15 Al: Asks if Juan's new robot will have GPS. Juan talks about his GPS setup. (Birds eating cables.)
06:15 Juan: Shows his ROS-Agriculture simulator running again. Talks about simulated GPS in the Stage simulator.
09:25 Juan: Talks about mixing physical hardware with a simulated vehicle.
11:30 Juan: Talks about connecting a physical folder to a docker image.
15:00 Al: Talks about his transmission control software. (See link in chat.)
24:45 Jeff: Talks about creating a 4-wheel Ackermann vehicle for Gazebo.
33:25 Discussion about random ROS versions.
34:55 Al: Asks if anyone has found the one video or tutorial that explains everything about URDF files.
46:10 Different levels of simulation.
51:45 Juan: Talks about using a physical accelerometer as an input controller.
I have been playing with my Gazebo model. I removed the shock absorbers and made the front wheels smaller. When it complains, and I patch things out, I notice this model is 4-wheel drive. I may try to remove the drive from the front wheels and let them freewheel.
I was trying to level the chassis by raising and lowering the front end. I was baffled that whether I made the number positive or negative, it still looked the same. Then I noticed the vehicle was flipping upside down when it started. It is probably something to do with weight distribution.
So I will keep playing with it.
@channel We just had a time change in the USA. We set our clocks back 1 hour. So the Zoom meeting next Friday will be off by one hour. If you are not from the USA, check the time on the East coast to verify the time. The meetings are at 12:00 noon Eastern time. https://time.is/New_York
not joining the meetings in time (fast forwarding on YouTube) right now, but good to know the silly time change also happens over there 😉
I want to create an articulated front axle in Gazebo. Lawn tractors and full size tractors have a front axle that pivots in the center. So as you drive over uneven ground the front wheels follow the contour of the ground. If you don't have the pivot, and you define a rigid frame, then only 3 wheels will stay in contact with the ground. If the front axle can pivot then all wheels will stay in contact with the ground.
I assume I can create a "revolute" joint to allow the axle to rotate. But I want it to have zero resistance. So I don't have to specify a speed or position or force, I just want the wheels to follow the ground due to physics.
Can I specify a control method (like force) and set the gain value to 0?
I can't seem to find a search term that gives me what I want...
*Thread Reply:* I am currently using Ubuntu 16.04, ROS Kinetic, Gazebo 7. So I am trying to define this in a urdf (xacro) file.
*Thread Reply:* Thanks @JeffS; but I was asking what software do you use to make the drawing and create the element in the simulator.
*Thread Reply:* I think you only need to put a revolute joint in xacro file and it will behave like a passive joint.
*Thread Reply:* @Hugo Rafacho I will try creating a solid axle and adding a joint. I will test it to see if it works.
*Thread Reply:* @Juan Eduardo Riva I use a text editor on raw URDF/XACRO text files. You add text, then load it in to Gazebo to see what damage you created.
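For the articulated front axle question, a hedged URDF sketch of what Hugo describes: a revolute joint with no transmission attached, which Gazebo treats as passive. All the link names, origin, and limit numbers below are placeholders for illustration, and the damping value is a guess to tune:

```xml
<!-- Passive center pivot for the front axle. With no <transmission> block,
     nothing drives the joint; it just swings within its limits under
     physics as the wheels follow the ground. Names/numbers are placeholders. -->
<joint name="front_axle_pivot" type="revolute">
  <parent link="chassis"/>
  <child link="front_axle"/>
  <origin xyz="0.6 0 0.15" rpy="0 0 0"/>
  <axis xyz="1 0 0"/>  <!-- pivots about the fore-aft (roll) axis -->
  <limit lower="-0.26" upper="0.26" effort="100" velocity="10"/>
  <dynamics damping="0.5" friction="0.0"/>
</joint>
```

The limits (about ±15 degrees here) mimic the mechanical stops a real tractor axle has, so the chassis can't roll over on the pivot.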
The meeting starts in 20 minutes, if you haven't figured out the time change...
Here is the info from today's meeting.
11122021 Lawn Tractor Automation meeting https://youtu.be/eRP3C9cVFws
Chat:
Jeff: Code I started with: https://gitlab.yonohub.com/mohamedahmed/picar
Al: https://github.com/prl-mushr/mushr/search?q=urdf
Index:
00:00 Al: Shows an R/C car he is looking at.
02:20 Al: Talks about lawn tractor progress.
06:10 Jeff: Talks about models created for Gazebo. Lots of rambling conversation on Gazebo and versions, and how the navigation software fits in with the simulation.
You'll have to excuse me, but I need to rant about my journey with the Jetson Nano. I found what I thought was a good series of lessons called "AI on the Jetson Nano". I watched all the lessons several times and thought they were just what I needed. Working through them was frustrating and wrong. The lessons started 2 years ago. The author steps through all the lines of instruction... do this, do that. The problem was he didn't talk much about which versions he was using. So naturally I was out of sync using current versions.
So now I’ll lick my wounds and start from scratch and just use the Nvidia site while thinking about the concepts from the lessons. Things like using Python, curl, OpenCV and other programs he demonstrated.
End of rant and end of pulling out my hair!
We had a meeting today, but we didn't record it. We did talk about why Al's lawn tractor won't start after he drove it in the rain.
*Thread Reply:* ok, been there with rain and electronics. Check if the USB ports on the RPi are burned out..
I found a meeting invite in one of my SPAM folders...
For everyone else, @channel I interpret this invite to say:
"No meeting next week, due to Thanksgiving holiday in the USA. Meetings will be Thursdays at 12:00 (Noon) USA Eastern time."
I assume the meeting link is the same. (I didn't compare them.)
There probably won't be a meeting Christmas week, but we can decide later.
The message pinned at the top of the General channel should be updated. I don't have access. It could also have the link to the new YouTube channel added.
Making progress on my NVIDIA Jetson Nano 4G. The Nano is configured and I can ssh into it from my iPad. Now how much I can actually accomplish/experience is yet to be seen. But it’s a start. I’d like to do some computer vision stuff down the road.
Question….what do you guys use docker containers for? Is it worth the time to learn?
*Thread Reply:* I know that long term, learning and using it will pay back, but I spent more time than I would like getting it working, and that time could have been spent on ROS or other things that actually made progress on my project. There are a couple of example projects which use it, and they can be easy to set up and run; if they are close to what you're doing, they are probably a nice way to jumpstart. Probably not exactly what you're looking for, but that is my 2 cents.
There are many reasons why Docker is used, and several are personal preferences, as with any tool. Personally, I started using it mainly because the ROS version we use (Kinetic) is based on Python 2, and today everything is Python 3 at least. That version conflict was a mess on my computer; every time I installed a package, something would break.

Docker encapsulates instances in things called containers, and what happens there stays there, as with Las Vegas. It is a one-way trip and you have to learn several things. What I learned over time: work until your current tool breaks, and when you see that you need something else to avoid problems, that is when Docker gives you a hand. Because that problem happened to many people, and because of the difficulty of those configurations, other tools tended to be built on Docker, which is why you see it in several places. But the reality is that if you do not have problems, you do not need to base your system on it.

In my case, I ended up with my robot base completely in Docker. Also, since ROS is tricky with network configurations, you can assign a specific network configuration to a Docker container, making life a little easier. I hope I have explained myself... good luck.
*Thread Reply:* This guy shows how to test the coil https://www.youtube.com/watch?v=TlMqyNVNM6U
Do you have a wiring diagram for your lawn tractor? And possibly a diagram/drawing of the wiring harnesses?
Do you know what the ignition coil wiring looks like where it comes out of the engine? You know what the other end looks like, because you have it connected to your relay board to kill the engine. Can you disconnect the wire coming out of the engine? Being efficient, they probably grouped it with other wires. I might be able to find something on YouTube...
*Thread Reply:* How did you connect your wire? Were you able to pop the pin out of the connector? If you disconnect the wire from the magneto at that connector, that eliminates everything beyond that point. Well, except for the wire you added.
So if it is convenient, pop that out and check for spark. If no spark, disconnect your wire and check for spark.
If you get the cover off, and there is a plug on the magneto for that wire, unplug that and check for spark.
*Thread Reply:* I assume you have tried it without your relay board connected???
*Thread Reply:* Interesting there is no yellow wire into the coil, yet that is the wire I have been grounding for e-stop.
*Thread Reply:* @JeffS Unplugged the wiring harness going into the coil, turned the engine over, still no spark.
*Thread Reply:* Well, that does not appear to be a magneto, just a "coil" like you would have on a car or a motorcycle. So either it works like an old car with contact points, condenser and is powered from your battery. Or it is using a more modern "electronic ignition" system which would have a "control module" somewhere.
The little black thing in the photo may be a pickup sensor to trigger the spark.
So probably shorting something to ground was not the appropriate way to kill the engine.
*Thread Reply:* I guess the thing to do now is to search for [mower model]/[engine brand]/electronic ignition. And see if anything turns up.
*Thread Reply:* So I'm trying to decipher your spreadsheet here: https://docs.google.com/spreadsheets/d/13iu1SKzC3u8UJB-Z2nUe2jxEUZqns0Dpvobv8dYRnRs/edit#gid=700333408
I see two numbers 13apa1ct056 and 13a9a1cs256. On the parts site I see one has an ignition system page, the other one does not. The page it has shows a magneto. I assume that is your older one (non-EFI). The other has nothing about engine parts. I assume that is your newer one.
So that doesn't tell me anything. So you will either have to find it documented somewhere or trace those wires from the coil to see where they go. They will either go to a module on the engine somewhere or out through the main cable harness. Either way, I would cut your yellow wire that you added in to kill the engine. (because it looks like it killed your engine)
If the wires go all the way back to the "ignition switch" module then electronics may be there (which may fix the problem if that module is replaced). Or you may have damaged your coil assembly or a control module on the engine.
But you should get a better idea of what is there and how it works before you start swapping stuff out.
*Thread Reply:* 13apa1ct056 (Texas, no EFI, Kohler engine); 13a9a1cs256 (PA, EFI, Cub Cadet/China engine)
Does it have wiring diagrams and cable harness diagrams? Or just tell you how to change oil and don't run over your kids? 🙂
*Thread Reply:* I'm googling "cub cadet wiring diagram model 13A9A1CS256"
As for Al's problem,,, I have a few thoughts:
Think of your engine like a black box. You have (at least) two electrical plugs coming out of the engine. It may be appropriate to try to figure out what each wire does. (More about that below)
On the inside of your black box, figure out what is wired where. The wires from your ignition coil assembly probably go to an electronics module on the engine somewhere. There are probably wires from your fuel injector assembly. They may or may not go to the same (proposed) electronics module as the ignition coil. Somewhere (a promotional video from Cub Cadet) they suggested you will have an oxygen sensor in your exhaust pipe. That will probably go to the same place the injector wires go. You also have an alternator that will have a few wires. Those may go directly out of the engine black box, or you may have a voltage regulator on the engine somewhere. There may be other things I haven't thought of. (Maybe a low oil sensor.)
But the point is you will have surprisingly few connections from the inside to the outside of your black box. Since your fuel and ignition are both electronic, they have to have power supplied. You also need at least one ground wire across the boundary. Your alternator has to get out to your battery somehow. But if you have a voltage regulator inside, then maybe there will be a single power/charging wire coming out. (Which may or may not go through your ignition switch.)
So what I would do is get a big sheet of paper and draw the connectors. Then draw wires between each connector, note the pin numbers, and label the colors. Look on the key switch and any other modules and see if the pins are labeled. If they are, write that down also. Then try to figure out what each wire might be.
The engine needs a 12V input, and may have another wire indicating it is authorized to run. It looks like your starter solenoid is outside of the black box, and besides it has a huge wire which goes directly to your starter.
The alternator may have a cable harness that directly exits the engine, or as I said before, it may have a voltage regulator on the engine somewhere. Whatever the output of the alternator is, it probably goes through your ignition switch.
But back to your safety system, any one of the safety switches and associated wiring may prevent the 12V or the authorize signal to get to the engine. So check all of those switches and any hacking you have done.
One more thing... It looks like Cub Cadet is trying to "John Deere" people (i.e., not let the customer repair things). It may or may not be intentional. It is probably because they could get a cheap engine directly from China, which saves them money. But it means third party vendors (like Partstree) haven't caught up and don't have parts and diagrams available. Or maybe Cub Cadet will not release that information. But... you may be able to contact Cub Cadet and get wiring diagrams directly. Or go to a local dealer and ask to look at their documentation; maybe they'll be willing to help (if they are bored or think an autonomous lawn tractor is a cool idea).
But for the moment do some poking at the wiring and see what you can figure out.
@Al Jones I had another thought. How do you set the engine speed? Is it a throttle lever like the older mowers had? Or is it some fancy electronic thing?
Here are a couple of searches you may want to dig through. Some claim to have free manuals. I think you probably want an engine manual or a service manual. An operator's manual may not have enough information. Although the manual Juan has shows wiring diagrams...
I also notice that MTD has a 547cc EFI engine. What are the odds they use the same one?
https://duckduckgo.com/?q=Cub+Cadet+547cc+efi+engine+Manual&t=ffab&ia=web https://www.google.com/search?q=Cub+Cadet+547cc+efi+engine+Manual
Note: These manuals seem a little poorly written... This manual (on page 15) says if it won't start that you replace a fuse. https://manualzz.com/doc/52743160/cub-cadet-lt42-efi-xt1-enduro-series-lt-42-in.-547-cc-fue... It also mentions something during starting about reverse mode operation. What did you do with your reverse switch on your transmission?
This manual (on page 26) says your fuse is in the wiring harness under your seat. https://www.manualslib.com/manual/1146415/Cub-Cadet-Xt1.html?page=26#manual
@Al Jones Hey, check to see if you have Cub Connect App. If you do, see if it tells you anything. https://www.youtube.com/watch?v=x2zrlpJNZoM
*Thread Reply:* that is a great idea. I never tried to load that before. I will give that a try.
*Thread Reply:* Still working through my issue of "no spark". The app seems to no longer be available in the US. There is an EU version, but after installing it, it seems only EU models are listed and my model triggered an error. I've determined the engine ignition and control system is built by Walbro.com. I have a new coil to try and am currently working on installing it. If that does not work, I need to wait for Cub Cadet to be online again to get the wiring diagram.
Migrating an STM32 project is not easy.. https://community.st.com/s/question/0D53W00000RQMZr/how-to-migrate-the-cubemx-project-ioc-file-to-a-different-mcu
I will look at the ST Micro parts after I finish indexing today's meeting.
*Thread Reply:* Thanks for the help. I will install the IDE in W10; I think there is something wrong with the app in Linux
*Thread Reply:* I think the problem is that I was using the wrong software.. you need to use STM32CubeMX
*Thread Reply:* Yep, I think I was able to compile.. Tomorrow I will have more news
If I go to this page: https://www.st.com/en/microcontrollers-microprocessors/stm32f407-417.html#2 (you "may" have to search for stm32f407 on that page), the stm32f407vg and stm32f407ve look identical to me, except for flash size and operating temperature. Somewhere there is info on what the last two characters of the part number mean, but I can't find it. Unless you see an obvious conflict on an I/O pin between the two parts (the stm32f407vg on the Discovery board vs the stm32f407ve on your board), I would assume they are the same. But you may have to go into the IDE and change the CPU type from stm32f407vg to stm32f407ve. It may compile as a vg, but it may not download and run if you don't change it.
@Bob Hassett It appears to work. I had a problem with it pausing for long periods. That went away after I replaced the crappy eStop switch.
*Thread Reply:* Just ordered 2 Adafruit radios, PID 4074, RFM95W @ 915 MHz. I wanted to see a display on each end for confirmation and troubleshooting.
The only range test I did was inside the building at work. I think it was about 100 feet through a closed door. In real life I am usually within a few feet of the vehicle when it is running.
Until I realize that I drove it 50 feet or so down the sidewalk, and that if it starts ignoring me... I won't be able to chase it down and hit the manual eStop. That is a sinking feeling.
You can send anything you want, but it is low speed. I am sending two dual-axis joysticks and several switches.
What else do you plan to use with your boards? I see these are designed to plug into a Raspberry Pi. If you use a Raspberry Pi, how do you plan to read analog data at the hand-held remote end?
On the hand held side, I'll use 2 dual-axis joysticks. Once that's ironed out, I'll switch to low profile thumb joysticks. For the hand held processor I might use my Teensy 3.2. The other radio will go on the Barbie Jeep. Hopefully in the future I'll migrate the electronics onto my tractor. I'm experimenting with the Nvidia Jetson Nano for the vehicle, but development with the Nano is at a standstill. My laptop crashed. I heard a rumor I might get a new laptop for Christmas. I have to wait until the 25th. :-(( Meanwhile, for the tractor I'm rewiring the control box with smaller gauge wire so I can make more space available. I'm also replacing the slow moving actuators with motors, but that has a lot of positioning challenges because I have to use wire and small pulleys.
I looked at those boards you ordered. I tried to find a schematic, but again I was reminded that Adafruit has lost control of their process. You can find a schematic as the last page of this PDF: https://cdn-learn.adafruit.com/downloads/pdf/adafruit-radio-bonnets.pdf It's not very readable, because they probably pasted it in as a JPEG instead of a PNG. But you can make it out if you have a lot of patience. There is a complete .SCH file somewhere, but you have to download Eagle CAD to view it. I tried to complain when I got their BNO055 module, but they made it clear they don't want my advice.
But... the radio is separate, with an SPI interface. The OLED is separate, with an I2C interface. The push buttons are separate, on individual I/O pins. You can make it work with a Teensy 3.2 if you pay attention to what you are doing. Assuming you can find an OLED driver for the Teensy 3.2.
Let me know when you get the boards and I may be able to guide you...
This looks interesting but they aren’t built yet. https://www.sparkfun.com/products/14048
*Thread Reply:* There is some discussion on the Adafruit forum that I'll start reading.
This is the info from this week's video meeting:
12022021 Lawn Tractor Automation meeting https://youtu.be/h7t-wrM4FQE
Chat: Al: Walbro EFI overview: https://www.youtube.com/watch?v=Q9agc-7_jqs&t=148s Juan: stm32f407vbtx Juan: Stm32f407vet6 Juan: https://tienda.ityt.com.ar/plataforma-stm32/11345-mini-stm32f407-stm32f407vet6-desarrollo-arm-cortex-m4-itytarg.html?search_query=Stm32f407vet6&results=2
Index: 00:00 Jeff: Says he is too lazy to do anything. 00:30 Al: Shows his "recovered angle" data from his lawn tractor. 05:25 Al: Moves on to why his lawn tractor would not start. 06:00 Jeff: Mentions that PlotJuggler "might" have an option to "average" the values for a plot. 07:00 Al: Asks how accurate the measurements have to be. 07:35 Al: Asks about what he needs to change in his code. 19:10 Juan joins the meeting. 20:15 Juan: Asks if Al can get his lawn tractor to stop. Turns into a discussion of vehicle wiring diagrams. 23:45 A discussion of how to stop the engine (on the new fuel-injected models). 22:45 Jeff: Questions the original question. Was it about stopping from lack of cmd_vel or about e-Stop? 26:45 Al: Describes his engine with fuel injection and electronic ignition. 29:50 Juan: Talks about the ST Micro CPU he is using.
Tested a map function to use in the steering control low level code. I wanted to desk check the code to make sure it was behaving the way I thought it should. Gist code link
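For reference, the classic Arduino map() that this kind of steering code is usually built on can be desk-checked like this. (A Python sketch of the same integer arithmetic for non-negative ranges, not Al's actual gist; the function name is my own.)

```python
def arduino_map(x, in_min, in_max, out_min, out_max):
    """Re-range an integer, same arithmetic as Arduino's map() for non-negative ranges."""
    return (x - in_min) * (out_max - out_min) // (in_max - in_min) + out_min

# Desk check: a 10-bit ADC reading mapped onto an 8-bit PWM range.
print(arduino_map(0, 0, 1023, 0, 255))     # low end
print(arduino_map(512, 0, 1023, 0, 255))   # mid-scale
print(arduino_map(1023, 0, 1023, 0, 255))  # high end
```

Running the endpoints and mid-scale through the function on paper is usually enough to catch a reversed range or an off-by-one.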
@Al Jones Hey, I just had a thought. Since you have fuel injection, you can't short a magneto to ground to stop the engine (you don't have a magneto). But you could use one of your safety switches instead. I assume originally your engine would stop if you got off the seat, so you used a shorting jumper to bypass the seat switch. If so...
Remove the shorting jumper from your seat switch and connect it to relay contacts. So when the relay contacts are closed the engine will run. If the relay contacts open then the engine will quit. Look at your original remote control relay board and see if has both "normally open" and "normally closed" contacts. Just use the normally closed contacts and it should work like it did before, without destroying anything. That won't be any "less safe" than your original setup.
I have 4 motors I'm running on 24 volts. I'm thinking of trying to provide 2 (selectable) sources of 24 volts. One source is from panel-mount on-off-on switches. The second source of 24 volts would come from my MDD motor controllers. Sooo, I'm thinking I need a DPDT relay for each motor. Ideally they would be activated by GPIO pins from a Teensy (3.2)? Less than 20mA? How about https://www.tindie.com/products/bugrovs2012/8-channel-i2c-solid-state-relay-module-v20/?
So my question is what kind of relays are people using? Is there a better solution than DPDT relays?
Upon further review… maybe I could just swap my manual on-off-on toggle with the panel on-off-on switch. So the off on the toggle goes to the motor.
If you look a little harder, you will notice these are for AC and not DC. Those modules are probably equivalent to SPST. Also that board has an I2C interface and an I/O decoder chip to turn individual relays on and off.
You can buy relay modules that are buffered so you can drive them from logic signals, on places like eBay and Amazon and lots of others. https://www.ebay.com/sch/i.html?_from=R40&_trksid=p2380057.m570.l1311&_nkw=arduino+relay+module&_sacat=0
https://www.amazon.com/s?k=arduino+relay+module&sbstp=n&ref=nb_sb_noss_1
Or you can get bare relays, but you might have to wire up a driver circuit to convert 3.3V logic to the proper voltage and current to drive the relay coil.
Another option is to connect your motors directly to your driver boards and just control the input to the driver.
I didn't realize we are designing this on the fly... I'll read what you just posted and then wait awhile for it to calm down.
Sorry about the rush to judgement. ….I’ll have a need for more relays later on so all of your suggestions are good to know ahead of time. I’ll definitely use them. Thanks!
But, back to the point of connecting your motors directly to your power drivers... That eliminates high power switching (either switches or relays). Connect the motor driver input directly to your Teensy (or whatever you end up using). Then your Teensy decides where to get its inputs (switches, joysticks, USB commands).
The downside is that the Teensy will always have to be powered up. But it eliminates big parts and extra large wires.
@Bob Hassett how big are your motors? These are 12VDC coils and decent amperage rating in a single bridge package. https://www.waytekwire.com/item/77010/Cole-Hersee-24452-Forward-Reverse-Relay-Module/
They might also make them in other coil voltages
*Thread Reply:* Hello Ross. Thank you for the suggestion. They are kinda pricey for my limited budget. The motors are rated at 36 volts and I'm trying to run them at 24 volts by way of a 12-volt-to-24-volt step-up converter. The problem seems to be that the step-up is only putting out 14 volts. The motors are just being used in a limited capacity, not heavy continuous duty.
@Bob Hassett To address your question (that you deleted)... It is very likely your 12v to 24v converter doesn't like the startup surge from your motor. I assume you have a toggle switch between your converter and the motor. When you flip the switch on (at zero speed) you probably get a 20-30 amp surge and the converter thinks it is a short circuit. So it goes into fold-back current limiting. Or the converter has shut down and is just passing the input voltage through. You didn't mention if you are connected to an actual battery or a power supply.
I had a 12v to 24v converter and I didn't see it shutting down. But I wasn't logging output voltage. I took off the converter to slow down the motor max speed.
If that is the problem, you might get around it by ramping the motor power so you don't get a huge surge. You can use your Teensy and power driver to do that.
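The ramp is just a slew-rate limit on the PWM command. A minimal sketch of the logic (in Python for desk-checking; the function name and step size are my own, and on the Teensy the same few lines would go in the Arduino loop()):

```python
def ramp_toward(current, target, max_step):
    """Move the PWM command toward the target, limited to max_step counts per tick."""
    if target > current:
        return min(current + max_step, target)
    return max(current - max_step, target)

# Desk check: starting from 0, ramp toward full power (255) in steps of 10.
pwm = 0
history = []
for _ in range(30):
    pwm = ramp_toward(pwm, 255, 10)
    history.append(pwm)
```

Called every 10-20 ms, a step of 10 counts spreads the 0-to-255 swing over a quarter second or so, which may be enough to keep the converter out of fold-back.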
@JeffS thanks for the suggestions and insight. I'll dig deeper. BTW my radios arrived! Now the big hair-pulling time begins. I'll use my Pi and follow Adafruit's setup at first. Just hope we continue to get really cold temps so I am "forced" to stay in the house.
Video meeting today:
12092021 Lawn Tractor Automation meeting https://youtu.be/FxT_4zGoBmw
Chat: Juan: https://www.linkedin.com/in/lucas-daniel-larroque-mart%C3%ADnez-2aa76a15b/ Al: https://gist.github.com/jones2126/78fc852caee9d10e5108e64422993657 Al: https://docs.google.com/document/d/19BwoQ0tGbLQzXHREdpmU91daD2YybY-LSbKjXs8yGow/edit?usp=sharing Al: that is the Arduino-cli note Al: 2021 RPi image build https://docs.google.com/document/d/1e_G5wbIG-_NvdfubZtinnoX8fGwul4fCja07e2X2Yew/edit?usp=sharing
Index: 00:00 00:40 Al: Testing his steering map function. Talks about PlatformIO problems. And he may switch to the Arduino CLI interface instead. 08:40 Jeff: Says he is not doing anything. 08:50 Juan: Describes his processor conversion effort. He has code designed for the STM32F4-Discovery board (stm32f407vg) and he wants to run it on a board with a stm32f407ve processor. For the next hour we go through and try to figure out the simplest way to make the conversion. We got something that looks like it will work. Juan will try to recreate the process later with a fresh copy of the repository and see if it works...
@Juan Eduardo Riva I know why you opened your project several times and it opened correctly. Then several times you opened it and it was broken. That is because at time 56:45 in the video, you deleted half of your project information. So don't do that. 🙂
Hi Jeff, I'm going to take it into account; but I don't think that's the problem. I think the main problem is that I have no idea what I am doing and I have to read the documentation more deeply. This is a situation where one takes a path that, rather than solving something, complicates my life. It is important to recognize my stupidity, although now I have to find a way out.
I do not cry, because I am a big man and I understand that what I am doing is going to help me solve other problems arising from the lack of chips
Well, in a couple of days I may know the secrets. I just ordered two of the STM32F407VET6 boards on Amazon. I only wanted 1, but due to the usual factors, and that I had to have a minimum of $25 to get free shipping, I ordered 2 boards. In their deceptive process I am now a proud member of Amazon Prime (which I didn't want).
So tomorrow I should have 2 boards to play with. They say it will do DFU programming. But other notes say I have to contact them directly through Amazon (which is nonsense).
I ordered 2 boards (after they forced me into Amazon Prime; otherwise I would have only ordered 1 board). Tomorrow I should have 2 boards.
They say if I want info (schematics and DFU programming info) I should contact them. That was a waste of time.
I'm still the at the point where I don't know if this will append to my previous message or start a new message...
I attempted to send a message to these clowns through Amazon to get more info. We will see if that worked.
But... I want to see if I can program the board through DFU, serial, ST-Link or other.
If I can program it I want to see if I can do debug (maybe only through ST-Link).
If you are going to go by that card, notice the green box that is blurred out by your flash. It says SPI port.
Below it shows pins 10,11,12,13 as the primary port. It shows pins 7,8,9,14 as the alternate port. It shows other pins as CS. So I assume you can specify which pin to use as chip select.
@Juan Eduardo Riva Ahh-Hah!!! He says. This may be as good as it gets. https://github.com/mcauser/MCUDEV_DEVEBOX_F407VET6 I was trying all day to figure out which pin the LED is connected to. Then I found that page... I'll do some more digging.
I received two boards. They both act a little differently. They had some test code that blinks the LED and acts like a USB flash drive. I was able to get both of them to go into DFU mode. Now one of them is refusing. I did not try actual DFU programming. I connected a cheap ST-Link V2 module to one of them. (Maybe both, I forget.) STM32CubeProgrammer recognizes the board. Both DFU mode and ST-Link mode report a chip ID of 0x413 and a flash size of 1MB. They should be 512KB. Maybe just an old version of STM32CubeProgrammer?
I connected my original STM32F4-Discovery board to STM32CubeIDE and it worked as usual. (It has the ST-Link hardware built in.) I could say "debug", it compiled and downloaded the code and hit a breakpoint at "main". I could single-step.
Then I connected the new STM32F407VET6 board with the ST-Link module and connected it to STM32CubeIDE. I clicked "debug" and it compiled and downloaded and hit a breakpoint at "main". I could single-step, but it would crash into the "fatal error" handler. I assume it has floating input pins causing an unhandled interrupt. But I was surprised that it got that far.
It didn't occur to me that running the debugger would write over the original test code, but it did. I guess that means I have to move forward and not backward. 🙂
Okay Bob, what were you asking? "Do people use the Arduino library?" If I want quick and easy then I would use an Arduino library. Assuming you have found an example that is based on Arduino libraries. I also assume you mean "Arduino libraries for Teensy 3.2" since that is the board you mention. (In other words, go directly to the Teensy Arduino site to look up commands and don't look at generic commands/libraries.)
I think I used the Arduino libraries for the LoRa radio. Either directly from the Teensy libraries or maybe they were from the Ada Fruit libraries. I did this on a Teensy 3.2 and it worked.
"what is the common practice here?" I don't know what common practice is, but I will take the defaults unless something conflicts. So for instance it appears the "default" is pins 10,11,12,13. But if any of those pins conflict, say you want to use TX2 as a UART, then that guy says you can individually swap that pin with any other pin they have defined. Also note that the "LED" is on pin 13. So if you don't change that then you can't use the LED in your code. It also means the LED lights up as your SPI transfer is happening. Pros and cons...
Also consider you have pins on the back of the board that may or may not help out. I personally don't use those unless I absolutely have to. It is a pain to have to solder on extra wires and then you have to run them out from underneath to use them. But if that is the place I can get to a signal that I need, then I'll do it.
That guy is claiming you can remap any or all of the pins on a particular peripheral. (assuming the reference card says there are alternate pins) I didn't realize you could do that. I assumed you either used all pins of the default or you told it to use all pins of the alternate mapping.
being so inexperienced as I am, I really appreciate your clarity. I’m going to just dive in and RTFM more until I break something!
*Thread Reply:* I read on the internet, so it must be true, there is magic smoke inside those boards. That is why they stop working if you ever release the magic smoke, but I learn a lot in the process.
@Juan Eduardo Riva Here is another reference. https://stm32-base.org/boards/STM32F407VGT6-STM32F4XX-M.html
A little off topic... But if your bored and want to watch Mother Nature. https://www.youtube.com/watch?v=0rXHF-YAUJ0 There is a storm about the size of Europe moving across the US. They say you can get 75-100 MPH wind gusts and they are tracking about 6 tornadoes.
*Thread Reply:* Gusts between 70 and 100 through part of the day today. My roof is roughed up. Gutters gone. I'm in the middle of Kansas. Power out since about 2:00.
*Thread Reply:* Windy does no justice to actual wind speed it seems
Here are some tweets during this latest storm. https://twitter.com/ryanhallyall Or the original live video will probably still be there after it finishes. If you want to watch all 6+ hours. I haven't looked but I assume this guy has a video covering the Kentucky tornadoes that just happened. Yes, here is the video from last week. https://www.youtube.com/watch?v=M7Ga1P1Zmyg
12162021 Lawn Tractor Automation meeting https://youtu.be/0uhaa5hTrT8
Chat: Jeff: A project to run ROS Navigation2 on the AWS DeepRacer. https://discourse.ros.org/t/integrating-ros-nav2-stack-with-aws-deepracer/23481 Jeff: https://github.com/mcauser/MCUDEV_DEVEBOX_F407VET6 Jeff: https://stm32-base.org/boards/STM32F407VGT6-STM32F4XX-M.html Juan: https://github.com/Itamare4/ROS_stm32f1_rosserial_USB_VCP Juan: https://github.com/SyrianSpock/stm32f4discovery_rosserial
Index: 00:00 Juan: Talks about STM32 boards he is working with. Juan gives a BluePill-to-ROS demo. Some discussion about STM32 boards and ST-Link modules. 14:35 Al: Is in the process of updating his low-level controller code. 16:30 Jeff: Posts a link to ROS Discourse about running ROS2 Navigation2 on the AWS DeepRacer. 18:35 Jeff: Talks about two new STM32F407vet6 boards like Juan has. Discussion of different board types. 40:55 The discussion turns to getting the STM boards to talk to ROS. 53:00 Juan: Posts some links for ROS serial on STM.
I made some progress on the new STM32F407VET boards. I tried putting a bootloader on the boards. That didn't work.
I can program both of them directly from Arduino with a ST-Link module. Or I can program one board directly from Arduino with DFU and just a USB cable. The other board will not go into DFU mode.
Either way, the code loads and my LED blinks and I get messages printed out through the USB port. Before, I assumed the USB/serial was part of the bootloader. But now I assume it is code that Arduino compiles and loads.
Info from today's video meeting: 12232021 Lawn Tractor Automation meeting https://youtu.be/wiivncpLJHY
Chat: None
Index: 00:00 Jeff: Goes through his setup to program the STM32F407VET board from Arduino. 14:15 Discussion of hands-free in-circuit programming. So you don't have to open the robot to reprogram. 19:00 Juan: Demonstrates programming and using the Blue Pill (STM32F103) board. 24:00 Discussion about which IDE to use. STM32CubeIDE or Arduino IDE. Pros and cons. 25:30 Jeff: Mentions a PDF from a presentation. It details how to use the STM32CubeMX tool to configure the chip and generate code. Then move the code into the Arduino IDE. PDF: https://www.dprg.org/wp-content/uploads/2019/09/Moving-Up-To-ARM.pdf Youtube presentation: https://www.youtube.com/watch?v=xGFnFWJXC8M DPRG on Youtube: https://www.youtube.com/user/DPRGclips 31:10 Suggestions of "extra" things to include when publishing/creating documentation. 33:30 Juan: Points out that you can find lots of articles telling you how to avoid using the ST-Link programmer. But he suggests you should just spend the ~$10 and get the ST-Link module. 34:20 Jeff: Points out that the ST-Link module can also be used for debugging. 35:20 Juan: Explains that it is hard to buy parts in his country. 41:30 Jeff: Mentions that the Arduino IDE may or may not support all of the peripherals on the STM32 chips. 42:35 Juan: Recaps his serial communication issues. 44:50 Jeff: Offers some debugging suggestions to verify each step of communication to Linux.
Info from today's video meeting: Neither Al nor I had anything to talk about. So we did not record the meeting. But we still spent over an hour talking about nothing...
Anyone have thoughts or experiences with https://www.ifm.com/us/en/product/O3D303
Maybe collision avoidance. I also want to follow hay windrows in a field and possibly identify and correct a swath heading based on the swath edge. I think lower-cost cameras and image processing would probably also work, but I am looking at what 3D options are out there. All the lidar options seem pretty pricey as well if they have any range.
Info from today's video meeting:
Chat:
O3D303 3D Camera https://www.ifm.com/us/en/product/O3D303
Jeff: 3D Camera ROS Agriculture presentation https://www.youtube.com/watch?v=-VELjDwsgoo
Al: https://store.opencv.ai/products/oak-d
Al: https://docs.opencv.org/4.x/d9/df8/tutorial_root.html
Al: Auto-Focus -- Fixed Focus for Color Camera on OAK-D-Lite
We're thinking we should change the OAK-D-Lite to use a fixed-focus color camera instead of an auto-focus camera. The why is that fixed focus allows use in high-vibration environments, with little loss of functionality for all other applications. The only real change to the end-user of OAK-D-Lite would be that the min focus distance for OAK-D-Lite would move from about ~pi inches (~3.14) to about 20 inches. And for most applications, it is seeming that this is totally acceptable - in return for the camera being usable in places where there are high vibrations (like drones, but also on machinery, RC-cars, etc.).
Al: https://www.meetup.com/ai-ml-for-robotics-and-iot/events/282811556/
Index: 00:00 Jeff: Mentions the 3D camera he bought a couple of years ago. He posts a link to the ROS Agriculture presentation. 02:20 Discussion of specs on various cameras. Al posts links to Oak-D cameras. 04:30 Discussion of what to look for when considering any sensor. 06:45 Jeff: Goes into a rant about the proliferation of AI as the answer to everything. More talk about Oak-D cameras. 10:25 Fixed focus vs. auto focus. 13:00 Al: Is working on his remote firmware update for his Teensys. 13:40 Al: Found another ROS group through Meetup that has video meetings. 20:50 Al: Goes to his to-do list to remember what he wanted to ask. 26:50 Al finally decides he wanted to ask about LORA radios. Discussion of Jeff's LORA radios.
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos Index for the original ROS Agriculture group (~250 videos) and the new Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/
This camera sounds too good to be true. Do you think OpenCV can work with this? https://fb.watch/aofTwHefZg/
*Thread Reply:* The low price smells like they are trying to dump them as fast as possible. A lot of 5-star reviews but not much detailed info. Still can't tell if OpenCV will work on them.
01132022 Lawn Tractor Automation meeting https://youtu.be/xN4Qe9fCXp0
Chat: Juan: https://github.com/ros-drivers/rosserial/issues/518 Juan: Remove the folder ros_lib inside the libraries directory. Then go to Arduino IDE - Sketch - Include Library - Manage Libraries... (Ctrl+Shift+I) and search for "rosserial" and install: Rosserial Arduino Library by Michael Ferguson
Using version 0.7.9, worked with fresh new Arduino Uno and Arduino Mega. Also able to compile for Seeeduino Zero (integrated on Odyssey X86J4105) Ross: Hi guys. I'm in transit today. Just thought I would drop on for 5 min and say 👋
Index: 00:00 Juan: Talks about remote networks using cell modems. 03:50 Juan: Shows the parts he has to work with in the USA. 04:30 Juan: Asks for a recommendation for a cheap GPS for testing. 08:00 Juan: Talks about version issues with ROS Serial. And talks about using ROS Serial. 15:15 Ross: Joins the meeting. 17:30 Ross: Says "hi". We talk about his wireless networks for his fields. 21:20 Juan: Suggests cell modems. 23:30 Discussion of what data to send and how much to send over the network. 30:00 Jeff: Says he may document his LED status strip for ROS. 35:25 Jeff: Talks about his LORA radio remote control. And plans to make a new version.
How can farmers, ranchers survive with those temperatures? https://www.facebook.com/100000952810697/posts/6877969435578082/
*Thread Reply:* Our west coast experienced many out-of-control wildfires because of extreme temperatures. How is Argentina faring?
If two of the regular people are out then I am going to skip it too. So I guess the answer is no.
I'm just now starting to add radios. One will reside in my tractor control box. Another will be a battery-powered remote control. For a quick connect I bent some header pins to make it work for now. I looked at the Pi Zero and possibly a Feather board. But I like the Teensy because of the low power requirement. Any suggestions?
One other issue. I didn't write down my Pi 3 password so… I followed the recommended way to recover or reset by adding init=/bin/sh. But when typing "su" as suggested I get an error message. Long story short, I need to understand the cmdline sentence better. Can someone parse out the wording and explain it?
Did you purposely change your password? If not, on the Raspberry Pi versions of OS it is probably "raspberry". If you are running Ubuntu then the default is "ubuntu". Here is a random page that talks about it: https://tutorials-raspberrypi.com/raspberry-pi-default-login-password/ They also have some instructions on how to change it.
Back to your question... it would be helpful to see what error message you get.
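For reference, here is how the cmdline sentence parses out. (A sketch of the usual init=/bin/sh reset; these are boot-time steps, not something to run from a normal session, and the user name "pi" is an assumption.)

```shell
# In cmdline.txt (everything stays on the existing single line), append:
#     init=/bin/sh
# "init=" tells the kernel which program to run first. Pointing it at /bin/sh
# gives you a bare root shell instead of the normal init system, so no login
# happens and no "su" is needed (which may be why "su" errored for you).
#
# At that bare shell, the root filesystem is mounted read-only, so:
#     mount -o remount,rw /     # make the filesystem writable
#     passwd pi                 # set a new password (user name assumed "pi")
#     sync                      # flush the writes to the SD card
# Then power-cycle, and remove init=/bin/sh from cmdline.txt again.
```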
*Thread Reply:* I changed the default password :-( I'll need a couple of minutes to set up the Pi 3 again. I'll be back.
*Thread Reply:* Thanks for the link. I will spend a lot of time learning it.
That appears to be an error during startup. Nothing to do with the command line.
Are you in a position where you could do a Zoom call? If so I can attempt to start a meeting.
Can't do zoom we are going out the door. Maybe later when it's convenient for you?
Okay. For anybody, it is possible to have private Zoom calls at any time. So let me know when you want to do it.
On your question about radios... Keep in mind that none of the Raspberry Pis (that I know of) have A/D inputs. So you will have to add an external board to get A/D.
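One common external choice (my suggestion, not something the group has settled on) is an SPI ADC like the MCP3008. The exchange is three bytes each way; here is a sketch of building the request frame and decoding the 10-bit reply, with the hardware transfer left as a comment:

```python
def mcp3008_request(channel):
    """Three-byte SPI frame for a single-ended read of the given channel (0-7)."""
    assert 0 <= channel <= 7
    return [0x01, (0x08 | channel) << 4, 0x00]

def mcp3008_decode(reply):
    """Extract the 10-bit result from the three bytes clocked back."""
    return ((reply[1] & 0x03) << 8) | reply[2]

# On a real Pi you would pass the frame to spidev, e.g.:
#   import spidev
#   spi = spidev.SpiDev(); spi.open(0, 0); spi.max_speed_hz = 1_350_000
#   reply = spi.xfer2(mcp3008_request(0))
#   value = mcp3008_decode(reply)   # 0..1023, your joystick position
```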
I was just going through the radios and Teensys last night.
A little update..
I dumped all the Adafruit RFM9x documents onto my whiteboard.
Because I'm using a "HAT" radio I can use the Pi GPIO to map (wire) to the Teensy. I've made a best guess of where the Pi GPIO wires should go on the Teensy. Again, that is just a guess.
Anyone care to take a look and correct me? The viewer can zoom in almost infinitely for more detail. https://miro.com/app/board/o9J_kzvV0Pw=/
Let's see if this works... Jeff Sampson is inviting you to a scheduled Zoom meeting.
Topic: Jeff Sampson's Personal Meeting Room
Join Zoom Meeting https://us02web.zoom.us/j/4310467829
Meeting ID:
Dial by your location
This is what Bob and Jeff talked about tonight: (In the Shorts and Tutorials playlist)
========================== https://youtu.be/MrHonCkq5Tk
Jeff and Bob discuss Bob's LORA radios. He may use them on a Teensy 3.2 or a Raspberry Pi. Bob has these: https://www.adafruit.com/product/4074 Jeff has these: https://www.adafruit.com/product/3178 https://www.adafruit.com/product/3231 https://www.adafruit.com/product/3072 https://www.adafruit.com/product/3200
Jeff: Posted this table of proposed connections for Teensy 3.2

Pin  Digital  Used for
===  =======  ================================
 1            GND
 2    0       sw1
 3    1       sw2
 4    2       sw3
 5    3       sw4
 6    4       sw5
 7    5       eStop1
 8    6       eStop2
 9    7       DOUT (MOSI - Radio)
10    8       DIN (MISO - Radio)
11    9       CS (Radio)
12   10       D10 - (RST - Radio)
13   11       D11 - (IRQ - Go on Radio)
14   12       n/u
15   13       LED (In parallel with panel LED)
16   14       SCK (Clock - Radio)
17   15       pb1 A1 - Reserved
18   16       pb2 A2 - Reserved
19   17       pb3 A3 - Reserved
20   18       n/u SDA0 - Reserved (or A4)
21   19       n/u SCL0 - Reserved (or A5)
22   20       A6 - Joystick
23   21       A7 - Joystick
24   22       A8 - Joystick
25   23       A9 - Joystick
26            3.3V Out
27            AGND
28            Vin
*Thread Reply:* So the column headers for Jeff’s pin out…1st Board Pin #, 2nd Logic(?) Pin #, 3rd…..,4th…
This video: https://youtu.be/86hm15g--nY
Chat: Al: https://www.alibaba.com/product-detail/2-4G-Wireless-Remote-Control-Wireless_60636492203.html Al: aka "2.4Ghz TGZ-850M Gamepad" Al: https://docs.google.com/document/d/19EID7QT9J-TceXtfbqKu_0GrIiIGkpM6COMi5lEBcc0/edit
Index: (I'll do the index later...) In general we talked about remote control, LORA radio, safety. I don't remember until I watch it again...
Index: 00:00 Jeff: Recap of Bob's project that we talked about on the previous video here: https://youtu.be/MrHonCkq5Tk 01:10 Jeff: Shows his LORA radio collection. 03:15 Jeff: Complains about soldering SMA connectors to the Adafruit boards. 06:35 Discussion of the 3 different antenna connection options. 10:40 Jeff: Talks about using the Adafruit all-in-one LORA board. 14:50 Jeff: Again mentions he may eventually make tutorial videos. Remote control, for instance. 15:10 Moves into discussion of gamepads on ROS. And mention of Ross's long range WiFi network. 17:15 Back to tutorial topics. Discussion of uses of remote control and eStop. Comments on safety. 23:50 Al: Posts a link to the wireless gamepad he bought. 25:25 Jeff: Says you should also have a "Dead Man" switch. 28:15 Al: Asks about remotely programming the Arduino/Teensy type boards. 29:00 Jeff: Finds his small joystick and shows it. 31:15 Al: Asks more about remote programming. But runs into Slack/Zoom/YouTube maintenance issues being discussed... 36:05 Back to discussion of the Arduino CLI interface and PlatformIO programming. We talk about why neither one works for Teensy and try to track down ways to fix either one. 50:45 Jeff: Makes a feeble attempt to show pictures of his robot. 51:50 We start updating the "Links" pinned message at the top of the General channel on Slack. It is updated now.
@Al Jones This may be the solution to your PlatformIO problem: https://community.platformio.org/t/can-i-install-an-older-platformio/21533/7 The very last entry says to use the latest version of PlatformIO but specifies a link to make it work with the correct Python version.
It's been frustrating lately. I tried using the Adafruit guides to get my RFM9x going on my Pi 3. Things seemed to go well until I started downloading libraries and radio packages. Then problems. So I shut down. I tried to restart but an error message pops up: "job control turned off". Also seems to have something to do with safe mode etc. So I'm just going to reflash my card and start over. I have to do more research on how job control works and about safe modes, and how to put what where, to avoid problems. I'm just venting for now. I'll be back after I reflash.
Making some progress. Rpi.org has a one-button formatter and flasher, nice and smooth. Using the recommended Adafruit Mu editor, looks like Blinka is loaded and things are turned on. Next up, what can I break this time ;-)
*Thread Reply:* Upon further review, Adafruit docs say this message means the radio is working. Now on to the Teensy and matching radio. Going to be awhile.
While I was trying to get the RFM9x to run on a Pi, I would try to use "find", like find python. I can see the search scrolling through files and I see python several times. But the scrolling goes so fast, I can't get a good read of the displayed path. So… how do I slow down the scroll or stop it and step through it? Can I capture it somehow and put it in a generic text file for analysis?
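You can do both. Pipe the output through a pager to step through it, or redirect it into a file (a sketch assuming a standard Linux shell; the file name and search path are just examples):

```shell
# Step through the results one screen at a time (space = next page, q = quit):
# find / -name "python*" 2>/dev/null | less

# Or capture everything into a text file to study later.
# 2>/dev/null discards the "Permission denied" noise from protected directories.
find /usr -name "python*" 2>/dev/null > python_paths.txt

# Then read it at your own pace:
# less python_paths.txt
```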
This video: https://youtu.be/8lsFr6vn7Hs
Chat: Jeff: https://www.adafruit.com/ Al: YouTube video I mentioned earlier that references Meshtastic. https://www.youtube.com/watch?v=TY6m6fS8bxU Al: https://www.amazon.com/868MHz-915MHz-Bluetooth-Development-Transceiver-WIshioT/dp/B07HJ49VN8/ref=pd_vtp_1/147-7220866-7036043?pd_rd_w=9AZvZ&pf_rd_p=96226b5f-2d9a-439b-be45-97603787c682&pf_rd_r=N6MTV0XQ3VSYTWT34Z86&pd_rd_r=8b0a4b1a-3eac-4234-9d8e-9a94f97aa681&pd_rd_wg=Nv3rB&pd_rd_i=B07HJ49VN8&psc=1 Jeff: https://forum.pjrc.com/threads/63528-MISO-and-MOSI-ports-for-connecting-LoRa-to-Teensy-3-2
Index: 00:00 I think we talked about LORA radios for remote control the whole time. I will view it and may update the index.
Another interesting link about LoRa radios: https://forum.pjrc.com/threads/41878-Probable-race-condition-in-Radiohead-library
Another interesting link. I like the charts as a beginning point to understand the process in general. https://www.diva-portal.org/smash/get/diva2:1443524/FULLTEXT01.pdf
Another "concept" for LoRa radio including a "repeater". I'm not advocating Pi. I'm just interested in the concepts. I would like to use a Teensy instead of all the Pi "concepts" I'm finding. https://www.quora.com/How-can-I-send-and-receive-data-between-two-Raspberry-Pi-3-using-Python-and-the-RFM69HCW-tranceiver
Oops, I should have referenced this link with the "repeater" concept: https://digitalcommons.calpoly.edu/cgi/viewcontent.cgi?article=1532&context=eesp
*Thread Reply:* Here is their video with the repeater concept and mapping: https://youtu.be/hHjIfwgMbhI
Current experimental board. I wanted to make use of some Pis that were not doing anything. This setup is just for learning how to send packets. It will not go on the tractor. My current thinking (always subject to change) is using a Teensy 3.2 for the remote control and a Teensy 3.5 in the control box on the tractor.
I've ordered a 3.5, more breadboard wires, and misc items.
@JeffS and @Juan Eduardo Riva I will be driving back from Texas tomorrow and unable to join at Noon.
Okay, thanks for the notice. If anybody else has a paid Zoom account and wants to start a meeting, feel free, and post a link to Slack. If not I will poke at Zoom and try to start a free meeting at the usual time. I will post a link if I do that.
See if this link works for today's meeting: https://us02web.zoom.us/s/86042560843
Jeff Sampson is inviting you to a scheduled Zoom meeting.
Topic: My Meeting Time: Feb 10, 2022 11:00 PM Central Time (US and Canada)
Join Zoom Meeting https://us02web.zoom.us/j/82000854448?pwd=UXFBNnpzSG5zaWk2L3VOdUpmSUp1QT09
Meeting ID: 820 0085 4448 Passcode: 545076
Info from today's video meeting:
===========================================
02102022 Lawn Tractor Automation meeting This video: https://youtu.be/kO4d7yYuVpY
Chat: None
Index: 00:00 Bob: Posted images to Slack and talks about parts he needs to acquire. He talks about his approach for his radios. And talks about his spreadsheets for pin numbers. 08:55 Bob: Shows his tractor control box and describes it. 12:30 Bob: Talks about a Raspberry Pi display board. 14:55 Bob: Shows his "too small box". And talks about adding connectors instead of hard-wired. 17:15 Bob: Talks about his prototyping layout. 21:10 Jeff: Talks about antennas and antenna connectors. 23:50 Jeff: Mentions that the "extra" GPIO pins on the radio do not have to be connected back to the processor board. Comments on the OLED display connections. 26:10 Bob: Talks about formatting data for LORA. 28:00 Jeff: Says he is too lazy to work on anything. 28:25 We discuss Al's plans. And "over-the-air" reprogramming. 35:00 Bob: Talks about real-time clocks and temperature sensors. 35:20 Bob: Questions the amount of processing power on the Teensy boards. 36:50 Bob: Talks about not having enough desk space for all of his display monitors.
I was noticing that Bob has lots of wires between his Raspberry Pi and a protoboard. You can save a lot of wires and time by getting a custom interface board, like this: https://www.amazon.com/Haitronic-Breakout-Expansion-Assembled-Raspberry/dp/B076T2WJSS/ref=sr113?crid=3IH5BA0BDWWXY&keywords=raspberry+pi+gpio+breakout+board&qid=1644529392&sprefix=raspberry+pi+gpio+breakout+board%2Caps%2C495&sr=8-13
Or you can get different variations, with or without protoboards and/or jumpers: https://www.amazon.com/s?k=raspberry+pi+gpio+breakout+board&crid=3IH5BA0BDWWXY&sprefix=raspberry+pi+gpio+breakout+board%2Caps%2C495&ref=nb_sb_noss_1
*Thread Reply:* Sure, now I find out after I ordered ;-) just kidding! Might get some?
I just noticed that the 40-pin protoboard adapter probably won't work on the original Pi B. It has a 26-pin connector and the audio connector will probably be in the way. (Whoops, I meant the yellow video connector, not an audio connector. [Is there an emoji of shame?]) If you have a 26-pin cable handy you could probably use that with the 40-pin adapter. https://static.techspot.com/images/products/2012/desktop-pcs/desktop-pcs/org/2014-07-14-product-1.jpg
What is the quote from Monty Python? "Now for something completely different..." Has anybody found a use for the original Raspberry Pi Zero? Or for the Raspberry Pi Zero-W? I never figured out what to use a Pi Zero for. The Pi Zero-W has Wifi and Bluetooth so I can see a use for that. I may have several around that I have never used. But the fact that they take multiple adapter cables has always put me off. Although I think you can set up an SD Flash card on an original Pi B and then transfer it to a Pi Zero. That may avoid having to buy all of the adapter cables, maybe... And now there is a new Pi Zero 2-W, but I don't know what it would be good for.
*Thread Reply:* I have seen the Pi zero stuff and cameras used for remote security. Maybe some sensors and IOT?
*Thread Reply:* https://digitalcommons.calpoly.edu/cgi/viewcontent.cgi?article=1532&context=eesp These guys put a Zero on a drone. How about using an anchored weather balloon with a Zero for extending range? I don't know about power requirements.
*Thread Reply:* New Raspberry Pi Boot loader. https://apple.news/A7y2yHFFfTu2LUdiwMnF4qg
*Thread Reply:* Once upon a time in a land far far away (Allen’s RSOH), we talked about using radio beacons for navigation. Somewhere recently, I saw some radio boards that can select which radios to talk with. I wonder if time of flight could be used to plot location? Maybe not very accurate? Maybe ok with streaming video or on-board sensing of its environment? Just throwing it up on the wall to see what might stick. 🤪
*Thread Reply:* How about a mesh network? https://www.iottrends.tech/blog/diy-how-to-create-a-home-mesh-wifi-using-raspberry-pi/
I just ran across this on ROS Discourse. It is interviews with robotics people. I haven't watched any yet but it may be worth watching. https://www.youtube.com/channel/UC2_89LMusoMaEol3MHCeQ_A
DIY flat ribbon cables … 7:56 How to Make Custom Ribbon Cables for My Delta Robot https://www.youtube.com/watch?v=Xbtyf5WqQ-I
Things I would find useful if others could share them.
Is anybody going to start a meeting (@Al Jones)? Or should I start a free meeting?
The meeting is live on the usual link. https://us02web.zoom.us/j/82088036016?pwd=K2lLc1FiWm9MU0dzRStxM2J2b3dpQT09
This video: https://youtu.be/S635Kqplqn0
Chat: Al: Google doc that goes through the various software "stacks" https://docs.google.com/spreadsheets/d/13iu1SKzC3u8UJB-Z2nUe2jxEUZqns0Dpvobv8dYRnRs/edit#gid=1906044958 Jeff: https://www.youtube.com/watch?v=TKJ9fyqLX1s&list=PLdWg7SF1vdl0lg4vYOa5wIAXLIfPErG4_&index=3 Al: Transforms "helper" https://docs.google.com/spreadsheets/d/1ygpZITLZh5PuPgFR0frdsZWDlsJZkBNAy8hnVjMy4qs/edit#gid=0 Al: https://gist.github.com/jones2126?page=1 Al: non-contact rotary position sensor with J1939 output https://www.cw-industrialgroup.com/Products/Sensors/Rotary-Position-Sensors/NRH27C-No-Contact-CANbus-Rotary-Position-Sensor Al: another non-contact sensor https://www.directindustry.com/prod/sensata-technologies/product-37684-1876062.html Al: sensor on digi-key https://www.digikey.com/en/products/detail/sensata-bei-sensors/ACW400-12BT-001/9921158?WT.z_cid=ref_Mectronic&s=N4IgTCBcDaIIIGEDqAWADGgtARjAIQBVMNsQBdAXyA&utm_campaign=buynow&utm_medium=aggregator&utm_source=mectronic
Index: (I will index it later.)
#random I see a new channel has been added. If you want to see other channels you are not on... click the three dots next to "Channels" and click "Browse Channels".
"camp ." <camp@camppeavy.com>: Feb 15 11:05PM
Feb 15, 2022 7:00pm Pacific Time Beginner or expert, if you're interested in ROS please join us. Zoom: https://us06web.zoom.us/j/87058611464?pwd=b09pSTluVDdpbW9xeHRpdUMrUnlwdz09 Meeting ID: 870 5861 1464, Passcode: 403987 Thanks, Camp
I found the github that goes with the HBRC Tuesday night meetings: https://github.com/cpeavy2/botvac_node
I was slightly bored and I haven't done anything useful for months. So I decided to go back to my Gazebo simulation. As luck would have it, when I started my robot laptop all of the simulation screens were still on screen from when I put it to sleep. I picked the correct roslaunch command line and got a tractor on a Gazebo screen. Then I clicked a command line to generate ackermann messages and my tractor was driving in a circle.
I copied the files from the robot computer (Ubuntu 16.04, Kinetic) and put them on my main computer (Ubuntu 20.04, Noetic, Foxy). After much playing around I got the tractor to drive in ROS1 Noetic. But I want to play with ROS2, so I started digging to see how to get Gazebo to work with either ROS1 or ROS2. It looks like it will be easiest to run Gazebo from ROS1 (for various reasons) and use ros1_bridge to let ROS1 and ROS2 talk to each other.
I will go down this path and see how far I get...
(The simulated tractor will drive around under Kinetic, with an error or two, but it still works.)
From "Weekly Robotics" newsletter, https://github.com/Developer-Y/cs-video-courses/blob/master/README.md This look like a wealth of info, but I didn't dig through it.
This is the index from the video meeting last week:
Index: 00:00 Start of discussion on where documentation is located. 10:20 We look at Al's spreadsheet for documentation. 15:40 Some talk about URDF files. 26:30 We talk about code on low level microcontrollers. 44:45 Al: Talks about which code base to run on his ROS computer. 48:00 Ross: Asks about simplifying or "dumbing down" low level control. 50:35 Ross: Brings up CAN bus. 51:20 Jeff: Goes back to simplistic low level control. 53:50 Al: Talks about position encoders. AMS-5048B magnetic encoder 57:05 Ross: Mentions using J1939 CAN standards for standardized messages. 1:19:10 Jeff: Mentions the Home Brew Robotics Group "Anything ROS" Zoom meetings. 1:24:45 Some comments about Docker.
@Al Jones Here is the thread talking about the PlatformIO version problem. https://community.platformio.org/t/can-i-install-an-older-platformio/21533
The last message in the post says the following. I haven't looked at it yet, I will look.
I looked at it and it is not obvious to me. Maybe with enough reading...
Here is another reference. https://github.com/platformio/platformio-docs/issues/176
Has anybody else out there solved this (running the latest PlatformIO under ROS Kinetic, which uses Python 2.7)?
02242022 Lawn Tractor Automation meeting - ROS This video: https://youtu.be/iAakH-jup-o
Jeff: My YouTube channel: https://www.youtube.com/user/jws8675309/videos Jeff: https://www.youtube.com/watch?v=HB9Z1SIBHn0 Al: https://www.aliexpress.com/item/32840238513.html?spm=a2g0o.9042311.0.0.6be54c4dMsc92F Al: https://randomnerdtutorials.com/ttgo-lora32-sx1276-arduino-ide/
Index: 00:00 Jeff: Shows some diagrams of a hypothetical physical ROS vehicle. Then switches to Gazebo instead of an actual vehicle. 31:00 Jeff: Holds up his small robot and compares it to his larger robot. 32:55 Al: Asks about the lineage of the small robot. 33:30 Jeff: Digs up a link to his YouTube channel. And shows a quick excerpt for the small robot. The date was 2016. 40:45 Al: Shows his ESP32/LORA/Display board. 44:00 Al: Posts link to sample code for the board. 52:35 We talk about Al's problem that PlatformIO no longer works with ROS Kinetic. 55:50 Jeff: Complains about his weather.
Here is another random snippet from the PlatformIO documentation: https://docs.platformio.org/en/latest/core/migration.html#drop-support-for-python-2-and-3-5 If I knew more about Python then this would not be so mysterious to me...
@JeffS, thanks for laying out simple block diagrams for a complicated system. @Al Jones thanks for the heads up about your LoRa and OLED. It looks interesting. Even the $40 tutorial package.
On a side note... I have a problem with my [new] neighbors behind me. They have 2 big dogs. Really nice dogs. But they bark when outside. The owners have the nerve to tell me that it is okay if they bark during the day. So I just ordered an air horn: https://www.amazon.com/SeaSense-SL52272-Air-Horn/dp/B0019M3FKA/ref=sr13?crid=2UBZ8F9508ZVZ&keywords=air+horn&qid=1646094381&sprefix=air+horn%2Caps%2C183&sr=8-3 We'll see where this goes.
How about “Bark recognition software” triggering one of those frequencies only dogs hear….said with an evil grin.
That's easy. Attach an electric dog whistle to a "bark collar". You might get a positive feedback loop though vice the desired negative one.
I don't want to project the wrong impression. It is not the dogs. It is the owner that thinks it is cute that his dogs bark. He has told me at least twice that it is okay if his dogs bark during the day. We'll see what happens after a couple of times of the air horn and me pointing out that it is okay if I do that during the day...
This video: https://youtu.be/wn3YTXuOhdc
Chat:
Jeff: How do we describe a robot? With URDF! | Getting Ready to build Robots with ROS #7 https://github.com/joshnewans/urdf_example https://www.youtube.com/watch?v=CwdbsvcpOHM
Creating a rough 3D model of our robot with URDF https://articulatedrobotics.xyz/mobile-robot-2-concept-urdf/ https://www.youtube.com/watch?v=BcjHyhV0kIs
NAV2 - Setting Up The URDF https://navigation.ros.org/setup_guides/urdf/setup_urdf.html
linorobot2 https://linorobot.org/ https://github.com/linorobot/linorobot2
Al: https://www.osrfoundation.org/simulated-car-demo/ Al: https://gist.github.com/jones2126
Index: 00:00 Jeff: Talks about URDF tutorials Later... Al: Gives update on his lawn tractor project.
@JeffS, I don't suppose there is a recording of your conversation with Camp? I'm curious as to why he chose the Pi 4. Is there a link to his build project?
@Bob Hassett The Tuesday night meetings do not get recorded. You can search here: (Maybe search for botvac or botvac2) https://groups.google.com/g/hbrobotics Or check the monthly meeting videos: https://www.youtube.com/user/hbrobotics
https://github.com/cpeavy2/botvac_node Looking through that might take you to the original (ROS1) project.
The second video for creating a robot for Gazebo is out: https://www.youtube.com/watch?v=IjFcr5r0nMs
If you notice, at the start the steering is pointed straight. When I start the ackermann message publisher the steering turns to the left and it starts driving forward. When it gets back to the start point I do a Ctrl-C and it stops. The rest of the video is dead space...
*Thread Reply:* When you have a moment can you echo cmd_vel angular z? I would be curious how it changes.
*Thread Reply:* I could echo it, but it doesn't exist... If it were there then it would be a constant value.
I have no idea how to program a LoRa9x. Lucky for me (and others), Adafruit sponsors a forum, so I posted for some help. I thought I should share my research and stumbling down this road to understanding how to implement LoRa.
Thanks to Jerry Needell on the Adafruit Forum for suggesting the following:
https://github.com/adafruit/Adafruit_CircuitPython_RFM9x/tree/main/examples
rfm9x_header.py
rfm9x_node1.py
rfm9x_node1_ack.py
rfm9x_node1_bonnet.py
rfm9x_node2.py
rfm9x_node2_ack.py
rfm9x_rpi_interrupt.py
rfm9x_rpi_simpletest.py
rfm9x_simpletest.py
rfm9x_transmit.py
https://docs.circuitpython.org/projects/rfm9x/en/latest/api.html
When copying .py files off of github is it good enough to just press “raw” and then save them?
Well... If you just need one or two files you can do that. You may have to update the permissions after you save them.
Or there should be a download button. You can click on that and it will ask if you want a ZIP file or if you want a download link. If you get a ZIP file you can extract that to your computer and have the entire repository. If instead you copy the download link then you can clone the entire repository to your local computer. If you do that then it will track all of your changes and you can back up to an arbitrary version and restore it. I can't tell you how to do that off the top of my head. (Is that a valid cliche?)
There should be tons of pages and/or video tutorials that tell how to do these things.
Okay, apparently it is a snow day for everyone. I'm tired of playing with my video settings. So I am going to cancel my meeting with myself...
I did google how to get files and they told about what you said. But I didn't see anything like that. Here are photos. I'm able to save the “raw file” and run it in Thonny and it works. I can see the print statements in the Thonny “shell” window. I've recently found out most of the examples don't use the OLED. That's why I thought something was wrong. BTW I switched over to ribbon cable directly from the Pis to the LoRa boards. One of the “node” examples does utilize the OLED. I have to look closer. I'll print a couple of the examples out on paper and compare OLED use.
None of these photos are at the top level. If you back up one level you will see a green button that says "Code". If you click that it gives the option of "Zip" or "Clone".
Daylight saving time changes in the US (I think in the middle of the night tonight). So if you are not in the US that will affect the video meetings. It probably does not affect anything else...
I decided to replace wires and ribbon cables with stacking headers for now. Also been comparing and playing with examples. I'm beginning to understand, but I wish I could find one “man page” that's as easy to use as the Linux man pages. I ordered a WiFi dongle for the Model 3 but it's the wrong one. So I returned it and will get one recommended for the Pi. After that comes in I'll convert both Pis to headless operation to get around my one TV and switching cables all the time. Anyone know of good and complete easy-access man pages?
A couple of things...
If you want more range from the LORA board with the wire antenna, you may want to solder the wire to the antenna connection instead of a ground connection.
I'm not sure what you are saying about a WiFi dongle. Do you want two WiFi connections on a RPi3? Or are you trying to get more range than the built-in WiFi? Or are you talking about something totally different? Or is a "Model 3" something different than a Raspberry Pi 3?
I'm not sure what you mean by "good and complete easy access man pages".
*Thread Reply:* A couple of things...
If you want more range from the LORA board with the wire antenna, you may want to solder the wire to the antenna connection instead of a ground connection.
• Ooops, sloppy soldering. I was trying for quick and easy replacement. I think I'll replace the wire with a proper normal coil antenna.
I'm not sure what you are saying about a WiFi dongle. Do you want two WiFi connections on a RPi3? Or are you trying to get more range than the built-in WiFi? Or are you talking about something totally different? Or is a "Model 3" something different than a Raspberry Pi 3?
• The Pi 3 Model B+ has WiFi built in, but my Pi 3 is from 2015. WiFi was added in 2016.
I'm not sure what you mean by "good and complete easy access man pages".
• When using Linux in a terminal window it is so easy to look up a command and the needed arguments.
Here is a sample that I would like to look up to determine what arguments are required.
rfm9x.send( bytes( "message number {} from node {} button {}".format( counter, rfm9x.node, button_pressed ), "UTF-8", ) )
rfm9x.send(bytes("Hello world!\r\n", "utf-8")) print("Sent Hello World message!")
I get that it's sending an array, but {}, backslash-r, backslash-n? One example was clear: the {} returned a derived number. I'd like a better description of ways to format. How flexible is that format argument?
My ultimate goal is to send a joystick value and print out the received value.
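For the joystick goal, one common approach is to pack the reading into a small fixed-size binary payload instead of a formatted string. This is just a sketch using Python's standard struct module; the field layout and node id are made-up assumptions, and the rfm9x calls noted in the comments are the ones from the examples above:

```python
import struct

# Assumed payload layout: one unsigned byte for a node id,
# two signed 16-bit ints for the joystick X/Y axes (little-endian).
PACKET_FORMAT = "<Bhh"

def pack_joystick(node_id, x, y):
    """Build the bytes you would hand to rfm9x.send(...)."""
    return struct.pack(PACKET_FORMAT, node_id, x, y)

def unpack_joystick(payload):
    """Decode the bytes the receiver would get back from rfm9x.receive()."""
    return struct.unpack(PACKET_FORMAT, payload)

# Transmit side would be: rfm9x.send(pack_joystick(2, -512, 1023))
packet = pack_joystick(2, -512, 1023)
print(len(packet))               # 5 -- five bytes on the air instead of a long string
print(unpack_joystick(packet))   # (2, -512, 1023)
```

A fixed layout like this also sidesteps the string-parsing step on the receive end; the receiver just unpacks and prints the values.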
I don't know where you got your info about RPi3B from 2015 not having WiFi. I just dug three of them out. All three say "Raspberry Pi 3 Model B V1.2 {Copyright} Raspberry Pi 2015". All three of them have the WiFi antenna. I'm sure I used at least one of them on WiFi...
As far as your code... There seem to be several issues. First, I can't identify that language. Second, you need to look up rfm9x.send in whatever library you are using to find out about it. Third, the \r \n are carriage return and line feed (new line).
The {} is dependent on the language you are using. Something about substituting in the statement. So "message number" will be "counter". "from node" will be "rfm9x.node". And "button" will be "button_pressed". So it is probably going to send a string that looks something like "message number 10 from node 23 button 3" But the bytes function may convert all of that text to bytes. But I'm just guessing.
The control character definitions are standard in C. So this language apparently adopted that format. So look at something like this: https://www.tutorialspoint.com/csharp/csharp_character_escapes.htm
But you have to look at the rfm9x library to find out what those functions do and what they are expecting. You have to look at your language to see what syntax it is looking for. Which is probably where you have to look for the {}. Maybe the {} is specific to the bytes() function?
And the control character thing is sort of standard across several languages.
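To make the formatting part concrete, here it is on its own in plain Python, with made-up values and no radio involved:

```python
# The {} placeholders are filled in order by .format();
# \r and \n are the carriage-return and newline control characters.
message = "message number {} from node {} button {}".format(10, 23, 3)
print(message)    # message number 10 from node 23 button 3

# bytes(..., "utf-8") converts the text into raw bytes,
# which is what rfm9x.send() wants as a payload.
payload = bytes("Hello world!\r\n", "utf-8")
print(payload)    # b'Hello world!\r\n'

# The format argument is quite flexible, e.g. width and decimal places:
print("battery {:5.2f} V".format(12.3456))   # battery 12.35 V
```

So the send call in the example really does transmit a single human-readable line with the three values substituted in.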
*Thread Reply:* Finally got WiFi and Bluetooth working on my slave Rasp Pi 3 B. Also I can hear YouTube audio playing from the iPad over on the Pi monitor (TV) but NO video. Somewhere I read I can't do that over Bluetooth and I'd be better off using VNC. So that's next on a growing to-do list. If I had a camera on a Pi I'd like to see it on my iPad.
#random I forgot to post this... The meeting is in 15 minutes, due to our time change.
I get "The host has another meeting in progress" when I try to join meeting...
03172022 ROS Lawn Tractor Automation meeting This Video: https://youtu.be/99Z3pIJqOJQ
Chat: Al: https://gist.github.com/jones2126/bfebe7879e83ad2a7121bd7cace6f393 Al: https://docs.google.com/document/d/1mW2v-a57ChIMwLSJhm87rMTcmUPB0fjNsEGPqLDi37Y/edit Al: https://gqrx.dk/ Al: https://www.wsprnet.org/drupal/ Al: https://www.rtl-sdr.com/building-a-ham-tranceiver-with-an-rtl-sdr-raspberry-pi-and-rpitx/
Index: 00:00 Al: Gives updates. Steering scaling. He needs to put his tractor back together and make it work. udev rules. IP address conflicts. 05:45 Al: Explains his IP address problem. 15:35 Al: Talks about udev rules (that don't work). 22:15 Comments that everybody needs to get their machines put back together and able to run outside. 23:25 Jeff: Goes off-topic by talking about SDR (Software Defined Radios). 45:00 Jeff: Talks about publishing odometry messages from a Gazebo simulator.
@Al Jones I was listening to your conversation with @JeffS about conflicting IP addresses. Could you point me back to your docs describing your goals/implementation of wireless operation, i.e. VPN and headless operation, and more about ham radio interfacing? I'd like to understand more of what you are accomplishing.
*Thread Reply:* My ubiquityrobot would pull two IP addresses from the router. One for the on-board wi-fi chip and one for the USB Ethernet add-on antenna that I added. I only wanted the USB to get an IP address. I disabled the onboard wi-fi following the link below. Previously I was using a VPN from ZeroTier which made it easier, but that has failed for unknown reasons. https://raspberrypi.stackexchange.com/questions/61760/disable-onboard-wifi-bluetooth-raspberry-pi-3 You can edit /boot/config.txt and add these two lines: dtoverlay=pi3-disable-wifi dtoverlay=pi3-disable-bt
*Thread Reply:* May I ask why you wanted a hot spot at the expense of WiFi? What are your plans for a hot spot? For WiFi?
*Thread Reply:* @Bob Hassett I'm not sure I understand. At the moment my "tractor" and laptop both get their separate and distinct IP addresses from a single router wirelessly. Not sure if you consider that a hot spot or wi-fi.
*Thread Reply:* Could you pass video with that setup? Video from the tractor to the laptop and vice versa? Can you program and control the tractor over WiFi?
Just as a reminder, a UDEV rule for a USB connection with no serial number where you use the device ID can look like this:
KERNEL=="ttyUSB[0-9]*", SUBSYSTEM=="tty", ATTRS{idVendor}=="1a86", ATTRS{idProduct}=="7523", ENV{USB_HUB_TYPE}="1a86:7523"
ENV{USB_HUB_TYPE}=="1a86:7523", KERNEL=="ttyUSB[0-9]*", SUBSYSTEM=="tty", KERNELS=="1-1.4:1.0", SYMLINK+="ttyIMUNano"
In retrospect, I was dumb enough to use a nano clone that has a CH341 chip and no serial number. (annoying). To deal with that my udev rule is fragile because I have to use the device id of 1-1.4:1.0 as the unique identifier. ref
The biggest problem I see with your blank rviz screen is that you don't have anything turned on to display. Set the same display options that the working screen has and you should see something. Do you have a saved "rviz configuration file" somewhere? If so, start it with that or load the config file after it is running.
*Thread Reply:* Hmmm. I’m guessing then I configured tractor_vis.rviz sometime before...
*Thread Reply:* I'm actually not sure why that launch file is failing. It should have worked since that package is installed on the RPi. Maybe the sequence to logon to the RPi first is broken and it tried to launch from the operator laptop instead. More testing required....
*Thread Reply:* Best I can do so far with having the package lawn_tractor_sim only installed on the RPi (i.e. tractor) is take out the reference to the .dae file and replace it with a generic box as part of the robot description and a white sphere as the GPS. When I leave the .dae file in the URDF on the RPi and start RVIZ on the laptop I continue to get errors about not being able to find the package/file. You might ask what is different than before? Previously I was running Kinetic on the laptop and had lawn_tractor_sim on both the laptop and the RPi. I'm just guessing that prevented this issue previously.
I'm considering using a VNC. For example, if I go static on my Pis and adjust my router, do I lose protection? But gain access from anywhere on the internet? So I have to consider the pros and cons of dynamic vs. static. Any comments are appreciated.
From your comments I assume you mean VPN instead of VNC. Do you need access to your base or your tractor from anywhere on the Internet? I think Al uses VPN because his base station goes out over the Internet and comes back in over a cell modem to talk to his tractor. But I may be wrong.
I personally don't open my home network to the Internet (not intentionally, anyway).
If you are only running your computers on your local network then the choice of dynamic vs. static IP is up to you. Making them static gets you away from getting a random IP address when you turn them on. But when running dynamic IP the router usually remembers your devices and you will usually get the same IP address each time (at least mine does). But occasionally a different IP address may come up...
It would be fun to operate or update software over the internet. But which is the safest mode? I'd rather not use a cell modem because of the fees involved.
Well, I got RealVNC Home Edition (free service but no tech support) working between my iPad and my Pi 3 B+. Now I can really screw things up changing IPs, usernames, passwords, etc. ;-)
I'm on the original link waiting for it to start. https://us02web.zoom.us/j/82088036016?pwd=K2lLc1FiWm9MU0dzRStxM2J2b3dpQT09
I saw a message in the last few minutes from Al saying he is running late. He didn't say how late. And I can't seem to find the message now. So I am going to wait and see what happens...
Al just buried a response in a thread saying he is trying to start the meeting.
This video: https://youtu.be/6M6iLDKJ8B0
Chat: Jeff: https://github.com/rst-tu-dortmund/teb_local_planner/blob/melodic-devel/scripts/cmd_vel_to_ackermann_drive.py Jeff : https://www.youtube.com/watch?v=I_5leJK8vhQ
Index: 00:00 Jeff: Gives a demo of the cmd_vel_to_ackermann_drive converter. 06:30 Jeff: Gives a demo of his progress with a 4-wheel Ackermann vehicle in Gazebo. 16:30 Al: Asks how you get odom messages from Gazebo. 21:50 Bob: Asks if the simulator and physical machine are interchangeable. 28:45 Jeff: Talks about trying to photograph addressable LED strips. 37:55 Bob: Talks about his radio and Raspberry Pi experiments. Wireless communication setup. VNC vs. remote desktop.
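The cmd_vel-to-Ackermann conversion demoed at 00:00 boils down to a little trigonometry: a Twist's linear speed v and angular rate omega imply a turning radius v/omega, and the steering angle follows from the wheelbase. A standalone sketch (the 1.0 m wheelbase is a made-up value; the script linked above does essentially this calculation):

```python
import math

def steering_angle(v, omega, wheelbase):
    """Convert a Twist (v, omega) to an Ackermann steering angle in radians."""
    if omega == 0 or v == 0:
        return 0.0                  # driving straight (or stopped): no steering
    radius = v / omega              # turning radius implied by the Twist
    return math.atan(wheelbase / radius)

WHEELBASE = 1.0  # meters -- made-up number, use your vehicle's measurement
print(steering_angle(1.0, 0.5, WHEELBASE))   # ~0.4636 rad, about 26.6 degrees
print(steering_angle(1.0, 0.0, WHEELBASE))   # 0.0
```

The real node wraps this in a subscriber on cmd_vel and a publisher of AckermannDrive messages, but the math is the whole trick.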
@Al Jones I figured out why my rviz would not display today. My command line said this: rviz -d catkin_ws/src/picar/src/ackermann_vehicle_gazebo/ack_new.rviz So if I start it from my home directory it finds the .rviz file and works. If I start it anywhere else it does not work. I added the "~/" in front of catkin_ws and it is fixed. Like this: rviz -d ~/catkin_ws/src/picar/src/ackermann_vehicle_gazebo/ack_new.rviz That forces the search path to start from the home directory. So you may have a similar problem in your launch file.
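You can see the same effect in plain Python: a bare relative path resolves against whatever directory you happen to launch from, while a ~/ prefix always expands to the home directory (the path below is just a stand-in):

```python
import os

rel = "catkin_ws/src/picar/src/ackermann_vehicle_gazebo/ack_new.rviz"

# A bare relative path depends entirely on the current working directory,
# so the result changes depending on where you started rviz from:
print(os.path.abspath(rel))

# A ~/ prefix pins the path to the home directory no matter where you start:
print(os.path.expanduser("~/" + rel))
```

That is why the launch worked from the home directory but nowhere else.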
Here is a screen shot. Image on the left is rviz. You can see the joints connecting the pieces to base_link in the back. The lower right is the Gazebo screen with the laser scan visualization turned off. And upper right is an actual photo. I added the IMU and it is publishing. I don't know if it is valid data. I added a non-functional GPS antenna in the front just for bling...
I just ran into the fact that this no longer works in Ubuntu 20.04/Noetic:
rosrun tf view_frames
You have to do this to install tf2_tools:
sudo apt install ros-noetic-tf2-tools
Then you can do this:
rosrun tf2_tools view_frames.py
You can still do this to display it:
evince frames.pdf
I think I have my Ackermann robot driving with move_base and teb_local_planner. At least it will drive to 4 points and come back close to the starting point. I'm sure it needs lots of tweaking. I just now disabled AMCL and replaced it with a static transform publisher to put odom on top of map. And all of a sudden it was working.
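For reference, pinning odom on top of map like that can be a one-line tf2_ros node. A launch-file fragment might look like this (the node name and the all-zero offsets are my choices, standing in for AMCL while testing):

```xml
<!-- Identity transform: publish map -> odom with zero offset,
     taking the place of AMCL's localization while testing -->
<node pkg="tf2_ros" type="static_transform_publisher" name="map_to_odom"
      args="0 0 0 0 0 0 map odom" />
```

The args are x y z yaw pitch roll followed by the parent and child frame ids.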
Ubiquity for ROS Noetic: https://groups.google.com/g/hbrobotics/c/xbzBRjZG9yQ
03312022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/WIWNgo5GFBQ
Chat: None
Index: 00:00 Al: Gives progress report. Steering, RVIZ not working correctly, some outdoor testing 09:05 Jeff: Gives demo of Gazebo progress. 11:25 Jeff: Shows the output of view_frames. 12:10 Starting Gazebo. 14:00 Driving with joy->cmd_vel->ackermann launch file. 16:50 Driving with joy->ackermann launch file. 18:30 Adding some obstacles to the Gazebo world, just for fun. 19:00 Starting move_base launch file with the standard differential drive local planner. 21:55 Starting move_base launch file with the TEB local planner. And some obstacle avoidance. 32:50 Discussion on what to do next. 34:00 Jeff: Again fights with the Linux version of Zoom to stop screen sharing. Then just kills the whole thing... After watching the video it looks like screen sharing stopped as soon as I clicked "New share". So I probably could have just exited at that point and the screen sharing would be done. 35:25 Discussion of a "flat" URDF file vs. a file with macros. 40:30 Al: Talks about changing values in his URDF and expecting that to be the robot footprint (it's not). 44:45 Restating of how to add footprint to RVIZ. Jeff then wastes a lot of time trying to post an image of the topic screen. 52:20 Jeff finally displays the image to select footprint... 53:45 Talk about getting dimensions of other vehicles to see if they can be reproduced.
Al was whining about seeing my URDF file. So I hacked and slashed and created a separate workspace. I copied it to a new hard drive and resolved the dependencies. And it seems to work. So maybe tomorrow (actually later today) I may post this to GitHub. So if you have a computer with Ubuntu 20.04 and Noetic then you can play with it.
ROS 2 on Raspberry Pi how to guide https://groups.google.com/g/hbrobotics/c/dQ29v3OShb0
*Thread Reply:* @JeffS Thank you for finding that. It looks very interesting.
*Thread Reply:* I've been reading up on Docker containers, advantages/disadvantages. ROS has a lot of resources that can be incorporated, but ROS 2 has a small fraction of that. I'm wondering about the role of containers in automating our small machines. What would a software platform look like if containers are used in our small machines? Is it worth the learning curve, given the learning curves needed for ROS, Python, etc.?
I can't figure out how to use GitHub. So I am giving up on that. If I want to upload my catkin_ws_sim to here: https://drive.google.com/drive/folders/1t2P2sfiranPe10fmUBUnnVvKNgrfybAV How do I do that? Can I upload directly? Or do I have to upload to my "My Drive" and share it? Or should I just ZIP it and upload to my web site?
I see there is a "+ New" button on the Google Drive page. (Instead of calling it "Upload".) I uploaded my catkin_ws_sim directory. It said 1 file failed, "CMakeLists.txt", but it doesn't tell me which "CMakeLists.txt" file it is talking about. I see I should have added a higher-level directory so I could name it. So maybe I will delete it and try a few more times...
Okay, I finally got it to work... If you want to try to install this: I used Ubuntu 20.04 and ROS Noetic. If you did a desktop-full ROS install then Gazebo should already be loaded. Then go here: https://drive.google.com/drive/folders/1t2P2sfiranPe10fmUBUnnVvKNgrfybAV Go into jeffs_ackermann_simulation and you should see a catkin_ws_sim directory. If you click on that and do a download (under the 3 dots by the trash can in the upper right-hand corner) it will generate a ZIP file and download it to your Downloads directory. You can unzip it there. Go into the resulting directory and copy the catkin_ws_sim directory to your home directory.
I assume your .bashrc has:
source ~/catkin_ws/devel/setup.bash
and replace it with:
source ~/catkin_ws_sim/devel/setup.bash
Restart all of your terminal windows.
The README.md file has a list of dependencies to load. (If you know what you are doing you could probably fix these with rosdep. But I just loaded them.)
Open a window, go to the catkin_ws_sim directory, and do:
catkin_make
If that works, the next issue: the ZIP file transfer deleted the execute permissions on the Python files, so you have to fix those.
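One way to restore them, assuming the workspace landed in your home directory and only the .py scripts lost their bits (a sketch, not part of the original instructions):

```python
import os
import stat

def restore_exec_bits(root):
    """Re-add execute permission to every .py file under root."""
    fixed = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if name.endswith(".py"):
                path = os.path.join(dirpath, name)
                mode = os.stat(path).st_mode
                # Add user/group/other execute bits, keep everything else.
                os.chmod(path, mode | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH)
                fixed.append(path)
    return fixed
```

Run it as `restore_exec_bits(os.path.expanduser("~/catkin_ws_sim"))`; a `find ... -exec chmod +x` one-liner does the same job.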
If you get that far then you can execute the launch files listed in the README.md file.
Or, if you don't want to waste a lot of time and just want to look at the files, you can look at them on Google Drive. Or download them to your computer and dig through them there.
There is no guarantee this is going to work...
*Thread Reply:* roslaunch nav joyop.launch is working and able to drive around, but have not yet figured out why roslaunch nav joy.launch is not publishing from /joy to /cmd_vel.
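For debugging this kind of thing, it helps to remember that the whole joy-to-cmd_vel job is just axis scaling; a minimal sketch of the mapping (the axis indices and scale factors here are assumptions for illustration, not the values in joy.launch):

```python
# Minimal joystick-axes -> velocity mapping, the kind of work a
# joy -> /cmd_vel teleop node does. Axis indices and scale factors
# below are assumptions, not the actual joy.launch configuration.
LINEAR_AXIS = 1      # e.g. left stick, up/down
ANGULAR_AXIS = 0     # e.g. left stick, left/right
LINEAR_SCALE = 1.5   # m/s at full deflection
ANGULAR_SCALE = 0.7  # rad/s at full deflection

def joy_to_cmd_vel(axes):
    """Map a Joy message's axes list to (linear.x, angular.z)."""
    linear = LINEAR_SCALE * axes[LINEAR_AXIS]
    angular = ANGULAR_SCALE * axes[ANGULAR_AXIS]
    return linear, angular
```

If joy_node is publishing but /cmd_vel stays silent, echoing /joy first tells you which half of this chain is broken.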
*Thread Reply:* You could skip joy.launch for the moment. And move on (since joyop.launch works).
*Thread Reply:* It has probably been doing that all along. Mine does.
In Gazebo (which is what you are showing) something is trying to access the model which apparently hasn't been loaded yet. But it still catches up and runs.
The first time (and other random times) it seems to take forever for Gazebo to start up.
The maker of LinoRobot will be at the HBRobotics April 27th general meeting. To get a Zoom link you can check here https://groups.google.com/g/hbrobotics closer to the meeting date. (I will post again if I don't forget.)
Steve Macenski, who is in charge of ROS2 Nav2, is giving a presentation April 9th at the RSSC group. https://www.rssc.org/ To get the Zoom link you have to sign up for their Google group, which is listed on the RSSC web site. It was mentioned tonight that Nav2 already has waypoints built in and you can create a "keep out" layer to go along with navigation.
Or you can go here to get the Zoom link for the Nav2 presentation on April 9th. https://groups.google.com/g/hbrobotics/c/79CbH7J_2nc
This is interesting, in case your tractor drives off a cliff... https://www.youtube.com/watch?v=cI1SXZQk_H8
I was able to put macros back into my URDF file for my 4-wheel Ackermann vehicle. Now I can define a vehicle like this: (These are numbers for Al's lawn tractor.)
```
<?xml version="1.0"?>
<robot xmlns:xacro="http://www.ros.org/wiki/xacro" >
  <!-- Wheel and frame properties for Al's Cub Cadet -->
  <xacro:property name="hub_dia" value="0.15"/>

  <!-- chassis_length is measured along the x axis, chassis_width along
       the y axis, and chassis_height along the z axis. -->
  <xacro:property name="chassis_length" value="0.75"/> <!-- 0.505 -->
  <xacro:property name="chassis_width" value="0.33"/> <!-- 0.25 -->
  <xacro:property name="chassis_height" value="0.01"/>
  <xacro:property name="chassis_mass" value="2.788"/>

  <xacro:property name="main_link" value="base_link" />

  <!-- hub_dia and tire_dia are the diameters of the hub and tire, respectively. -->
  <xacro:property name="tire_front_dia" value="0.356"/>
  <xacro:property name="tire_front_width" value="0.159"/>
  <xacro:property name="tire_rear_dia" value="0.483"/>
  <xacro:property name="tire_rear_width" value="0.229"/>

  <!-- Wheel spacing is center to center -->
  <xacro:property name="wheel_spacing_front" value="0.851"/>
  <xacro:property name="wheel_spacing_rear" value="0.737"/>
  <xacro:property name="wheelbase" value="1.219"/>

  <xacro:property name="wheel_mass" value="0.29"/> <!-- ?????? -->
  <xacro:property name="axle_length" value="0.03"/> <!-- 0.05 -->

  <xacro:property name="axle_eff_limit" value="5.12766"/>
  <xacro:property name="axle_vel_limit" value="244.8696"/>
  <xacro:property name="servo_stall_torque" value="0.5649"/>
  <xacro:property name="servo_no_load_speed" value="4.553"/>
</robot>
```
The values that actually define his vehicle are:
```
<xacro:property name="tire_front_dia" value="0.356"/>
<xacro:property name="tire_front_width" value="0.159"/>
<xacro:property name="tire_rear_dia" value="0.483"/>
<xacro:property name="tire_rear_width" value="0.229"/>
<!-- Wheel spacing is center to center -->
<xacro:property name="wheel_spacing_front" value="0.851"/>
<xacro:property name="wheel_spacing_rear" value="0.737"/>
<xacro:property name="wheelbase" value="1.219"/>
```
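As a sanity check on numbers like these, the classic Ackermann relationship ties the wheelbase and front wheel spacing to the inner and outer steering angles. A rough sketch (plain geometry, not code from the simulator):

```python
import math

def ackermann_angles(steer, wheelbase, track):
    """Inner/outer front wheel angles (rad) for a central 'bicycle'
    steering angle, given wheelbase and front center-to-center track."""
    if steer == 0.0:
        return 0.0, 0.0
    radius = wheelbase / math.tan(steer)   # turn radius at the rear axle
    inner = math.atan(wheelbase / (radius - track / 2.0))
    outer = math.atan(wheelbase / (radius + track / 2.0))
    return inner, outer

# Al's Cub Cadet numbers: wheelbase 1.219 m, front spacing 0.851 m.
inner, outer = ackermann_angles(math.radians(30), 1.219, 0.851)
```

At a 30-degree central angle the inner wheel ends up near 36 degrees and the outer near 26, which is the spread the steering macros have to reproduce.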
I added my sensors on his vehicle so it would do something interesting. Here are some pictures.
04072022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/Mqk1Mmqk24g
Chat: Jeff: The original R/C car I started with: https://gitlab.yonohub.com/mohamedahmed/picar
Index: 00:00 Al: Gives progress report. He is testing his GPS outdoors as a standalone device. He talks about how to fix view_frames for Noetic. 02:15 Al: Has installed the new Ubiquity image for Noetic on Raspberry Pi. 04:15 Jeff: Points out they are about to "fix" the Ubiquity image. Then Al will have his problems back. 08:20 Discussion of what Al's GPS base station consists of. 13:40 Jeff: Starts to explain his new phase-2 URDF files that use macros. 18:50 Start of live demos. Demos of the new URDF with Al's Cub Cadet dimensions. 25:00 The parameter file is changed to Jeff's Sammy robot. A demo of that. 27:50 Talk about tweaking parameters. 28:40 Al: Asks if the obstacle avoidance works. So we test it out. 33:00 Attempt to cancel navigation (since it can't reach goal). A new goal is selected. 35:25 Question about which versions of ROS this will work on. 36:45 Jeff: Complains about GitHub. "Maybe" eventually release this on GitHub. 38:15 Al: Summarizes the way he thinks it works. Then asks how difficult it was to make this work. 39:45 More in-depth view of what is required. 45:40 Several changes were made to Al's vehicle which may improve steering angle. That is tested. 49:00 Jeff: Explains some of the strange issues he was fighting with. More details on how the macros work. 56:20 Jeff: Fights with Linux/Zoom again to shut down screen sharing. And Al asks about link to the original R/C car model. 58:35 History on the R/C car URDF. 59:50 A search is started to find the R/C car link that Jeff started with. 1:04:00 Jeff: Posts the link to the original (that he started with) R/C car. 1:06:15 Jeff: Says he cut the sensor code from the URDF and put it in an include file. 1:07:15 More talk about GitHub. 1:10:50 Al notices that Jeff put all this code out on the shared Google drive. 1:15:40 Bling...
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos Index for the original ROS Agriculture group (~250 videos) and the new Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/
Reminder: The Navigation2 presentation is today. The main meeting starts in 2 hours. (10AM Pacific, 12PM Central, 1PM Eastern) The first hour is something about AI. The second hour is the ROS2 Navigation2 presentation. https://groups.google.com/g/hbrobotics/c/79CbH7J_2nc https://www.rssc.org/
I spent hours yesterday trying to figure out ROS2 and Navigation2. ROS2 is different enough that it is going to be tedious to figure out the differences. The launch files are now Python scripts instead of XML. Most commands start with "ros2". For instance, instead of "rostopic list" it is now "ros2 topic list". And I assume most commands work that way. Navigation2 is greatly expanded over the ROS1 navigation stack. It has lots of plug-ins and lots of planners...
I spent enough time to know what I don't know. But I now have enough background to at least be able to follow today's presentation on Navigation2.
Hopefully by our next video meeting I will have nav2 running under ROS2 talking to the Gazebo simulator running under ROS1. (Mainly as an experiment.)
By request... I uploaded my phase 2 simulation, the one that uses macros to define the vehicle. I put it here again: https://drive.google.com/drive/folders/1t2P2sfiranPe10fmUBUnnVvKNgrfybAV There are various info/instruction files under the main catkin_ws_sim_phase2 directory. Now I will go back to my ROS2 experiments.
*Thread Reply:* I kind of figured that is what you meant when you said you commented everything out. But... the main code is in ack_new_macros.xacro. You can't get rid of that. And it won't run if you don't have the properties defined (cub_cadet_param.xacro).
So as a bare minimum you need
ack_new_macros.xacro
cub_cadet_param.xacro
or
ack_new_macros.xacro
sammy_param.xacro
and you can also add sammy_sensors.xacro if you want the sensors to be created.
If you have either of the above uncommented, what does it do then?
*Thread Reply:* If I add sammy_sensors.xacro I get the error "GetModelState: model [ackermann_vehicle] does not exist", but as you have said before, I can ignore that and the model loads.
I moved sammy_sensors.xacro to load at the end and the errors are removed. I also spent time to see how you separated the base machine, from the robot params and the sensors - very nice.
I thought of one more thing that may improve the turning angle for the Cub Cadet. If you search for:
<limit lower="${-degrees_45}" upper="${degrees_45}"
In:
ack_new_macros.xacro
You can open up that min/max steering angle to whatever physical angle your lawn tractor will do. I should make that a property...
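The steering limit translates directly into minimum turning radius; for a simple bicycle model it is the wheelbase divided by the tangent of the maximum steer angle. A quick sketch with the Cub Cadet wheelbase (an illustration only; real geometry also depends on track width):

```python
import math

def min_turn_radius(wheelbase, max_steer_deg):
    """Bicycle-model minimum turning radius, measured at the rear axle (m)."""
    return wheelbase / math.tan(math.radians(max_steer_deg))

r45 = min_turn_radius(1.219, 45)  # the current +/-45 degree limit
r30 = min_turn_radius(1.219, 30)  # a tamer limit, for comparison
```

At 45 degrees the radius equals the wheelbase (about 1.22 m); capping the limit at 30 degrees nearly doubles it, which is why opening up that angle matters.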
Hi everyone, I have seen that soon you may need a coverage path planner to find the best path to cover a field.
I am right now doing a PhD on coverage path planning: https://www.wur.nl/en/project/Fields2cover-Robust-and-efficient-coverage-paths-for-autonomous-agricultural-vehicles.htm . Part of my PhD is creating an open-source offline Coverage Path Planning library (in C++). I plan to release it on the 16th of May. I hope it will be useful for this project, and I could even join a meeting for a demo when the day gets closer, if you want.
If you have any questions, I'm open to talk :)
I'm giving up on my ROS1/ROS2 integration, for the moment... Maintaining several terminals in ROS1 and several terminals in ROS2 gets REAL tedious. My biggest problem seems to be that I can't pass the odom frame from ROS1 to ROS2. Maybe none of the frames are getting passed.
By about 4AM I decided I had fought with it long enough. So I gave up. Maybe I will go back to it someday.
Now I have to decide what I should be working on...
Info for today's video meeting. I haven't done the index yet. I will add that later.
This file: https://youtu.be/U7FOJbe8x3A
Chat: Jeff: youtube-dl to download videos: https://askubuntu.com/questions/334081/downloading-multiple-files-with-youtube-dl
Index: To be added...
Index: 00:00 Al: Talks about replicating Jeff's Gazebo simulation experiments. 10:45 Al: Asks about the difference between a physical GPS and a simulated GPS. 15:05 Jeff: Proposes creating diagrams to illustrate the code organization. 16:40 Al: Points out the latest meeting video is getting views. 17:15 We discuss the ability to download all of the meeting videos so you can watch them any time. 20:20 Jeff: Posts the link to download the videos. 21:50 Discussion of how YouTube searches work. 24:15 We go back to discussing physical vehicle vs. simulated vehicle. 25:55 Jeff: Tells story of trying to get ROS1 and ROS2 running at the same time. 27:25 Jeff: Points out an Ackermann controller plug-in for Gazebo (instead of running an external Python script). 31:00 Jeff: Also found a pure pursuit local planner plug-in for move_base. 41:00 Jeff: Reiterates the steps for the ROS1/ROS2 integration for simulation. 48:30 Al: Questions the stability. 50:00 Al: Talks about moving his code from Kinetic to Noetic for physical vehicle. 50:15 Discussion of pieces involved. Like GPS. And rosserial.
@JeffS Looking forward to your boxes concept! I think it would help beginning HGAC (haven't got a clue) people like me! @Al Jones How would you incorporate a laptop on your tractor if you go that way? I read some place the new Raspberry Pi 4 with up to 8 GB is as powerful as a laptop? I haven't really dove into that comparison yet.
*Thread Reply:* @Bob Hassett Re: incorporating a laptop I think Jeff runs a laptop on his today so it would be something similar. I would have a platform I can strap it down to and connect my USB hub into it which is how my RPi drives the various micro controllers today.
On February 28th I said: "On a side note... I have a problem with my [new] neighbors behind me. They have 2 big dogs. Really nice dogs. But they bark when outside. The owners have the nerve to tell me that it is okay if they bark during the day. So I just ordered an air horn: https://www.amazon.com/SeaSense-SL52272-Air-Horn/dp/B0019M3FKA/ref=sr13?crid=2UBZ8F9508ZVZ&keywords=air+horn&qid=1646094381&sprefix=air+horn%2Caps%2C183&sr=8-3 We'll see where this goes. (edited)"
and then I said: "I don't want to project the wrong impression. It is not the dogs. It is the owner that thinks it is okay if his dogs bark during the day. We'll see what happens after a couple of times of the air horn and me pointing out that it is okay if I do that during the day..."
I decided that provoking my neighbors wasn't the best idea...
But tonight I got fed up. (They have visitors with an extra big dog.) I let out a blast with my air horn. A woman came out and quickly let the dogs into the house. She was ignoring me because she knew they were being assholes. We will see where this goes...
Here is a different shaft encoder than what Al is using. I assume it gives the same results. https://www.instructables.com/Magnetic-Shaft-Encoder/
The grandkids spent the night with us and got curious about my radio project. So we decided to do some distance tests. The master node was in the house and the slave node was handheld, carried outside. In the yard we got RSSI in the -60s. The longest test was some 960 feet with RSSI -130. Even at that distance and signal weakness, the messages could go back and forth. We also noticed we had to hold down the buttons on the radios for a count of two before a message was received. I could not help remembering @JeffS's conversation about having too many layers of software/hardware to deal with. Btw, the master node was a Raspberry Pi 3 B+. The remote was a Pi 3. Each one was using the Adafruit RFM9x OLED Bonnet. All in all I was pleasantly surprised at the radio reception at 960 feet. I think I could use the radios to pause a machine. As a next safe step, I'll have to test the radios on the Barbie Jeep.
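Those two data points (roughly -60 dBm in the yard, -130 dBm at 960 feet) fit a standard log-distance path-loss model; a sketch, where the reference values and the exponent n are assumptions I tuned to the numbers above, not measured constants:

```python
def estimate_range_m(rssi_dbm, rssi_ref_dbm=-60.0, d_ref_m=10.0, n=4.8):
    """Log-distance path-loss range estimate.

    rssi_ref_dbm is an assumed RSSI at d_ref_m, and n is the path-loss
    exponent (4-5 is plausible near the ground with obstructions).
    Real LoRa links need calibration before trusting any of this.
    """
    return d_ref_m * 10 ** ((rssi_ref_dbm - rssi_dbm) / (10.0 * n))
```

With these assumed constants, -130 dBm comes out near 290 m, close to the 960-foot test. A model like this is only good for rough "pause the machine if the link weakens" thresholds, not precise ranging.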
04212022 ROS Lawn Tractor Automation meeting This file: https://youtu.be/nVq63fToc4I
Chat: None
Index: 00:00 Al: Talks about replicating Jeff's Gazebo simulation experiments. 09:00 Brief discussion on modifying some things in the URDF file. 13:20 Long discussion about diagrams. 52:45 Jeff: Talks about a Gazebo simulation of a radio direction finding module. 57:00 Discussion switches over to differences between ROS1 and ROS2.
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos Index for the original ROS Agriculture group (~250 videos) and the new Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/
*Thread Reply:* Very interesting discussion this week. About that cave simulation… I get the need to designate the location of the transmitter. I can't understand why the students are having an issue with that. For me the big issue is how to simulate an actively transmitting signal in a 3-dimensional space. How could signal strength be simulated in a 3-dimensional space? Any suggestions?
*Thread Reply:* Here is an interesting paper. I have not read it. I have requested a copy. https://www.researchgate.net/publication/344312748AGazeboROS-basedCommunication-RealisticSimulatorforNetworked_sUAS
I see YouTube is getting more clever. Now I have to turn off my ad blocker before I can sign in or sign out. (I usually run signed out)
But I just noticed that the description of the video is now available in the first level search. So if you do this: https://www.youtube.com/results?search_query=lawn+tractor+automation
It has a tab/button that will display the description index before you try to view the video (on the recent ones). But at the moment the thumbnail preview doesn't show up for today's video...
The presentation for Navigation2 has been posted. It is in this video at about the 1 hour mark. I suggest everybody watch the Foxglove presentation. https://www.youtube.com/watch?v=3dqKXSAblvE 0:00 Some AI thing. 1:00 Steve Macenski - ROS2 Navigation2 2:11 Camp Peavy - LinoRobot2 Demo 2:25 Business meeting 2:45 Ross Lunan - OakD Lite camera, Jetson Nanosaur 3:00 Jim - "Orange" robot, speech recognition 3:17 Jim - ROS2 Turtlebot3 simulation demo 3:24 Roman - Foxglove Studio
I requested a paper once... and never heard back. But one trick you can try is to copy the title of the paper and put it in a search engine. Sometimes the author will have the PDF on their personal page. That is the case this time. But the PDF link just points back to IEEE.
Other people are indexing these (to get clicks/views) and just point back to the "technical paper brokers".
But if you are persistent you may find it. This seems to be the paper: https://sci-hub.se/10.1109/ICUAS48674.2020.9213892
*Thread Reply:* I was able to find some tables/drawings… https://www.semanticscholar.org/paper/A-Gazebo%2FROS-based-Communication-Realistic-for-sUAS-Moon-Bird/5ddb22ff9b97712b0f02af227d9124d153c1108c
04282022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/BCIWRfd7L7A
Info: ROS2 TurtleBot3_World tutorial: https://navigation.ros.org/getting_started/index.html LinoRobot: https://linorobot.org/ LinoRobot2: https://github.com/linorobot/linorobot2 HBRC meeting with LinoRobot2: https://www.youtube.com/watch?v=Hj7m2xwlhWY RSSC meeting with Steve Macenski (@ 1 hour mark): https://www.youtube.com/watch?v=3dqKXSAblvE
Index: 00:00 Jeff: Gives a progress update and review. (turtlebot3_world, ackermann vehicle in ROS1) 06:15 Jeff: Reasons to not use the turtlebot3 code for my Ackermann vehicle. 08:10 Jeff: Decided to look at Linorobot2 project for my Ackermann vehicle. 09:45 Live demo of stock Linorobot2 2-wheel drive robot. With Gazebo, RVIZ, Slam, navigation and mapping. 19:30 Switch the stock 2-wheel drive robot to Jeff's Ackermann vehicle. 27:40 Mention of the Home Brew Robotics meeting from the night before. There was a presentation of LinoRobots2. 28:15 What steps are next. 29:00 Why ROS2 at this point? 30:10 Jeff's experience/impression of ROS2 so far. (Just a few weeks of hack and slash.) 39:40 Mention of the RSSC meeting with presentation by Steve Macenski on ROS2-Navigation2.
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos Index for the original ROS Agriculture group (~250 videos) and the new Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/
Do we still have any ROS1/ROS2 experts on the list?
Can someone give me a simple/quick explanation of overlays? Overlays may fix my problems...
Can someone tell me how to install/remove packages without installing all of the associated pieces that are included with meta-packages?
I am currently trying to remove gazebo_ros_pkgs. It wants to remove other things. Then I want to install ros-simulation as a GitHub clone in my workspace. But then it will complain that things are missing. If I try to install one of them, it will insist on reinstalling everything. Which will bite me...
Maybe "overlays" will get me around this. But they don't want to explain "overlays" to me. So I am screwed...
If no one here knows, I will elevate this to the Home Brew Robotics Group. If they don't know, I will elevate it to the world-wide ROS community.
So I may end up looking like a fool, or it will point out that the documentation is inadequate.
I will start here..
Cute story about John Deere and "Right to repair" https://www.youtube.com/watch?v=h0kbx8zJZvQ
The ONLY benefit of closed/proprietary Ag software! I'd love to see JD drive their stuff back to Ukraine!
Yes, I don't like that either, but driving the stuff back to where it belongs is really what should happen, and that would be the only valid reason for such a backdoor.
I have this week's Zoom meeting indexed and ready to release, but Google/YouTube has decided to "mess" with me. (I can't say what I want to...) So if the second attempt to upload it works, then I will post it...
I am calmer today... I will delete my rants from yesterday. But it had pushed me to the edge. After several hours the YouTube upload was still at 39%. So I cancelled that and started it over. Slack was giving me grief. Probably my own fault. And my Internet was giving me fits. But I fixed everything and added a huge block of links (that may be useful...). So I will post the announcement.
05122022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/3RJmOFJivNo
Info: (Other websites mentioned)
picar (What I started with) https://gitlab.yonohub.com/mohamedahmed/picar/-/tree/master/ (Newer version of Picar) https://github.com/aizzat/ackermann_vehicle
control commands: ackermann_msg
model: URDF
control software: ackermann_controller.py
linkage: topics -> joint transmission
Linorobot2 https://github.com/linorobot/linorobot2
Linorobot2 - 2wd
control commands: cmd_vel
model: URDF
control software: standard Gazebo differential drive plug-in
linkage: cmd_vel -> Gazebo differential drive plug-in
Linorobot2 - ack (my hack)
control commands: ackermann_msg
model: URDF
control software: ackermann_controller.py
linkage: topics -> joint transmission
olinrobotics This uses a modified version of ackermann_controller from the R/C car (Apparently this is where the tractor model in ROS Agriculture stage simulator came from) https://github.com/olinrobotics/tractor_sim https://github.com/olinrobotics/gravl https://github.com/olinrobotics/gravl/wiki
ROS1 (Kinetic) Prius simulation https://github.com/osrf/car_demo/blob/master/prius_description/urdf/prius.urdf https://github.com/osrf/car_demo/blob/master/car_demo/plugins/PriusHybridPlugin.cc
control commands: custom (velocity, steer, shift, brakes)
model: URDF
control software: Gazebo plug-in
linkage: Gazebo plug-in (tf directly to joints?)
ROS2 Prius simulation https://github.com/ros-simulation/gazebo_ros_pkgs/blob/foxy/gazebo_plugins/worlds/gazebo_ros_ackermann_drive_demo.world https://github.com/ros-simulation/gazebo_ros_pkgs/blob/foxy/gazebo_plugins/src/gazebo_ros_ackermann_drive.cpp
control commands: cmd_vel
model: SDF
control software: Gazebo plug-in
linkage: Gazebo plug-in (tf directly to joints?)
Other links https://github.com/chrissunny94/ackerman_ros_robot_gazebo_simulation
https://github.com/benwarrick/simple_ackermann
https://github.com/RobotnikAutomation/rbcar_sim/blob/melodic-devel/README.md
https://github.com/mfilipen/autonomous-mobile-car-like-robot
https://github.com/robustify/audibot
https://github.com/srmainwaring/steer_bot
http://wiki.ros.org/steer_bot_hardware_gazebo
http://wiki.ros.org/cirkit_unit03_simulator
https://github.com/froohoo/ackermansteer
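Under the hood, every cmd_vel-driven Ackermann controller in the list above has to perform the same kinematic conversion: turn a (linear, angular) velocity pair into a steering angle plus a drive speed. A minimal bicycle-model sketch (my naming; not code from any of these repos):

```python
import math

def cmd_vel_to_ackermann(v, omega, wheelbase):
    """Convert a Twist-style command (v m/s, omega rad/s) into a
    bicycle-model steering angle (rad) and drive speed (m/s)."""
    if v == 0.0 or omega == 0.0:
        # Stopped, or driving straight: no steering angle needed.
        return 0.0, v
    radius = v / omega                     # signed turn radius
    steer = math.atan(wheelbase / radius)  # virtual front wheel angle
    return steer, v
```

One consequence this makes obvious: with v = 0 an Ackermann vehicle cannot turn in place, which is why differential-drive local planners misbehave on these robots.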
Index: 00:00 Al: Talks about replacing his Raspberry Pi with a laptop computer. He shows his parts he has collected. 04:10 Al: Mentions his LoRa radios. 05:00 Jeff: Mentions that his laptop looks just like Al's laptop. 05:25 Jeff: Talks about collecting parts and next steps. 15:20 Al: Asks about using LoRa radios for remote control. 23:50 Jeff: Gives a recap of software progress. Simulators of various versions... 34:30 Jeff: Talks about trying to convert the ROS1 Ackermann controller Python script to ROS2. 37:00 Jeff: Goes down the path of complaining about his inexperience with ROS2. 44:20 More comments on Olin Robotics. And documentation in general.
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos Index for the original ROS Agriculture group (~250 videos) and the new Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/
So the "Algorithm" (the phrase weak-minded people use) decided I was annoyed, so it appended my last two messages together to get in one last jab...
Zoom has me baffled again. I am watching the replay of today's video meeting. I see the pictures of the "talking heads" seem to be larger than they usually are. It is covering up what people are talking about. I don't know what controls that. Maybe we can't control it? It just seems to randomly change as they do updates.
05192022 ROS Lawn Tractor Automation meeting https://youtu.be/aXO2RU56RVw
Al: https://www.pcbway.com/blog/technology/How_to_get_started_with_LoRa_LPWAN_using_TTGO_LoRa32.html Al: https://www.instructables.com/TTGO-ESP32-LoRa-Board-With-DHT22-Temperature-and-H/
Index: 00:00 Al: Talks about his ESP-32 LoRa boards. 04:00 Al: Says his code will not compile. We start a debug process. 25:15 Discussion of how many I/O pins are available on this particular board. 32:20 Jeff: Brings up a comment that someone made about "courteous" use of the LoRa band. Discussion of transmit power. Channel selection. 45:50 Jeff: Reiterates "who is in control?" 49:00 More talk about library maintenance. 1:01:00 Jeff: Makes suggestions about design considerations. Specifically, mounting laptop on lawn tractor.
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos Index for the original ROS Agriculture group (~250 videos) and the new Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/
Here is some information on Oak-D cameras. https://groups.google.com/g/hbrobotics/c/UMF7TA1gTXs
Info from this week's video meeting. They just posted the HBRobotics meeting so I added that on to the chat section. I am about to watch that.
===========================================
05262022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/lXCd9lwkwkk
Chat: We were talking about Bob's giant sheep. So I posted a link Robot Shop of Horrors and a link to the day at Bob's house. Jeff: http://sampson-jeff.com/rsoh/ Jeff: http://sampson-jeff.com/rsoh/010617/ Al posted a video of what he wants to do. Al: https://www.youtube.com/watch?v=dePHLU3Oof8 Link to Ava-V2 write-up. Jeff: https://www.robotshop.com/community/robots/show/ava-v2 Link to Articulated Robots YouTube page. Jeff: https://www.youtube.com/channel/UCx9vSJTSZGFrErfPtut5GNQ Link to Home Brew Robotics Group YouTube meetings. Jeff: https://www.youtube.com/user/hbrobotics Link to the meeting with the Ava-V2 robot. https://www.youtube.com/watch?v=ijS7iwU9IfQ
Index: 00:00 Al: Talks about his ESP-32 LoRa boards. 02:45 Al: Talks about errors he is getting trying to compile. 27:00 Jeff: "I have nothing..." 27:20 Jeff: Mentions the HBRobotics meeting from yesterday. Posts a link to the "Ava" robot. 31:40 Jeff: Posts link to Articulated Robots on Youtube which has new videos added.
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos Index for the original ROS Agriculture group (~250 videos) and the new Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/
The Ava-V2 presentation starts at 48:20 in this video: https://www.youtube.com/watch?v=ijS7iwU9IfQ
Able to get two TTGO ESP32 LoRa boards talking and sharing data. The receiving board also pushes the data to a web server. Tutorial is here
*Thread Reply:* @Al Jones Very interesting. Anyway you could measure response times for remote control?
*Thread Reply:* @Bob Hassett I may not be exactly answering your question, but I'm transmitting about 30 characters of data (e.g. Data received: 15/78.24&35.59#981.38@2911$1184*0), still close to one another, with a delay of 500 ms (i.e. 2 Hz), with CRC enabled, and not seeing any corrupted data. The RSSI is -48 so things are looking OK. I need to move farther apart and see what happens. My goal is 500 feet because I'm not really interested in much further distance than that. I have the power set at 20 and a SyncWord of 0xF1.
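That delimiter-packed string is compact enough to parse on either end with a one-line split; a sketch, where the field groupings are my guesses and only the sample string comes from the message above:

```python
import re

SAMPLE = "15/78.24&35.59#981.38@2911$1184*0"

def parse_packet(pkt):
    """Split a '/&#@$*'-delimited sensor packet into ordered fields.
    Which reading each field holds is an assumption for illustration."""
    parts = re.split(r"[/&#@$*]", pkt)
    return {
        "seq": int(parts[0]),
        "values": [float(p) for p in parts[1:4]],
        "counts": [int(p) for p in parts[4:6]],
        "flag": int(parts[6]),
    }
```

At ~30 bytes per packet and 2 Hz, even a CRC-checked link like this has plenty of headroom left for a remote-pause command.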
This video: https://youtu.be/7TXn6GktyYw
Chat: Jeff: Ardusimple review: https://www.ardusimple.com/field-test-evaluation-of-the-ardusimple-simplertk2blite-zed-f9p-operating-in-rtk-mode-on-the-south-carolina-virtual-reference-network Jeff: https://www.tandfonline.com/doi/full/10.1080/15481603.2020.1822588
Index: 00:00 Al: Talks about his ESP-32 LoRa boards. Shows his prototype box and talks about which pins he is using. He says he wants to use LiPo batteries since his "Power Bank" battery won't stay powered up. Lists his requirements and a test plan. 08:35 Al: Asks why his pots give him an over-range value. 09:45 Jeff: Asks how Al will use LiPo batteries. And LiPo charging. Discussion of using a USB charging battery through the USB connector. 25:20 Jeff: Talks about an article on GPS receivers. And posts a couple of links. 30:30 Mention of fire ants... 31:10 Discussion of distance from correction server. Discussion of survey markers. 34:15 Al: Asks if there were any new messages on Slack channel.
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos Index for the original ROS Agriculture group (~250 videos) and the new Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/
We didn't record the video meeting today. But Al told me about his plans to manage his cows. And then we spent a long time brainstorming ways to cut and transport grass using lawn tractors.
We talked about pulling a small lawn trailer, cutting grass without pulverizing it (sickle blade, hedge trimmer, rotating blades), and how to get grass from ground level to the trailer (conveyor, vacuum blower, auger, oscillating fingers, rotating brushes). And it would be handy to figure out a way to unload the grass at the feeders.
Hmm, sounds like green chopping for cattle? What kind of volume is produced, and how high does it have to be elevated? Maybe something like the old grain binders: sickle up front, rotating sweep, then a small elevator on the side or directed behind into a small manure-spreader-like wagon. So the sickle, sweep, and catching canvas are in front of the garden tractor. The canvas rotates left or right into a small elevator that's on the side, going up and into a pull-behind spreader-like wagon. The spreader/forage-box type that has chains that move the pile to the rear of the wagon?
I found this: Using Windows Subsystem for Linux (WSL2) with ROS https://groups.google.com/g/hbrobotics/c/9_MtHpycjIw
Info from today's video meeting:
===========================================
06162022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/rzZ7O4wVHAU
Chat: Al: https://www.aliexpress.com/item/2251832810444969.html?spm=a2g0o.order_list.0.0.6dbf18021y06fP Jeff: https://learn.adafruit.com/adafruit-feather-m0-radio-with-lora-radio-module/using-the-rfm-9x-radio Al: https://www.youtube.com/watch?v=hL5wzYBbuYU Al: https://github.com/sandeepmistry/arduino-LoRa/blob/master/examples/LoRaDuplexCallback/LoRaDuplexCallback.ino Jeff: rf95.waitAvailableTimeout Al: https://github.com/CytronTechnologies/RFM-LoRa-Shield-Examples/wiki/Arduino-Sketches-Overview Al: https://www.seeedstudio.com/blog/2020/08/03/lorapedia-an-introduction-of-lora-and-lorawan-technology/
Index: 00:00 Al: Talks about his ESP-32 LoRa boards. Al asks if Jeff's communication is bi-directional. We track down some examples... 11:40 Al: Shows some code he found. 20:00 Jeff: Goes looking for his code... 23:40 Short discussion of old vs. new laptops. 26:00 Al: Logs into a remote computer looking for his Arduino code. 38:05 Some digging through the Cytron website.
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos Index for the original ROS Agriculture group (~250 videos) and the new Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/
06232022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/dOWqqKUjeBA
Chat: (I lost the link to Al's GIST page.) Al: https://randomnerdtutorials.com/power-esp32-esp8266-solar-panels-battery-level-monitoring/
Index: 00:00 Al: Talks about his ESP-32 LoRa boards. He goes through some code. 04:45 Short discussion of pre-compiler options. 08:25 Mixing code examples. 11:00 Al: Talks about powering his LoRa board with solar panels. 13:00 Jeff: Talks about measuring engine RPM on a small engine. 14:20 Method to read alternator waveform. 22:15 Method to read the magnets on the flywheel. 28:40 Demos of the proximity sensor and the Hall effect sensors. 40:45 If you have a magneto then the transistor circuit that was shown may work on the primary side. 42:25 Optical sensing. 45:50 Bob had asked about cows...
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos Index for the original ROS Agriculture group (~250 videos) and the new Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/
I have been able to get the send and receive example from Adafruit working with the TTGO ESP32 Lora boards
06302022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/EbD1vzlYa-g
Chat: None
Index: 00:00 Al: Talks about his ESP-32 LoRa boards. 06:15 Al: Goes through some of his code. He talks about an addressable LED strip for status display. 08:55 Al: Talks about the code to run the LoRa radio. 12:45 Jeff: Shows his (remote control end) code and talks about it. 32:45 Some discussion of e-Stop condition vs. pause condition. 36:20 Discussion of timeouts. 44:20 Jeff: Shows his (vehicle end) code and talks about it.
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos Index for the original ROS Agriculture group (~250 videos) and the new Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/
07072022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/f_n1r3Hva0U
Chat: Al: https://gist.github.com/jones2126/73eb93df2c8eead98ce4fab4d6cfe5fe Al: https://gist.github.com/jones2126/fd3a5d0ae8eac66f42afc491b2490573 Al's version 2 LoRa remote: https://youtu.be/Parj9fLj3Oo
Index: 00:00 Al: Talks about his ESP-32 LoRa boards. Posts links to code. 09:15 Jeff: Talks about his lawn tractor. 19:25 Al: Talks about using Google Remote Desktop. 21:30 Jeff: Talks about steering control and extra sensors.
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos Index for the original ROS Agriculture group (~250 videos) and the new Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/
Here is a direct link to Al's LoRa remote version 2. I see that if I delete a YouTube reference from a post, there doesn't seem to be a way to add it back... https://youtu.be/Parj9fLj3Oo
Info from today's video meeting
===========================================
07142022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/UYDreA0QmGg
Chat: Al: https://github.com/jones2126/ttgo_projects Al: How to use GIT / Github in VSCODE for PlatformIO | Teensy | Arduino | Embedded System Al: https://www.youtube.com/watch?v=13wmV6gcb2A&t=329s Al: From https://randomnerdtutorials.com/vs-code-platformio-ide-esp32-esp8266-arduino/#2
Index: 00:00 Al: Explains that he has figured out git-hub. He is using it with VS-Code. He is also using PlatformIO. 11:30 Jeff: Asks about some finer points on building/maintaining the git-hub page. 17:15 Jeff: Gives an update on his lack of progress. 20:15 Al: Goes over a wish list. 20:55 Jeff: Asks about Lithium Ion battery holders. And then some talk about 3D printers.
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos Index for the original ROS Agriculture group (~250 videos) and the new Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/
After watching the last 2 weekly meetings about LoRa, I ran across this YouTube video about LoRa and GPS. Kinda interesting. Not sure I like the distance accuracy of zero to 2 meters. Maybe a better GPS module would work?
https://youtube.com/watch?v=90ziiwkOJFw&feature=share
I don't mean to be rude... But you need to listen to what they are saying. She goes on and on about batteries and 3D printing (and something else I missed). But at no point does she mention precision GPS. She is throwing a bunch of random concepts together to make it sound like she knows what she is talking about.
You have been following this long enough to say "Maybe a better GPS module would work?" So yes, you saw through what she was saying...
She is presenting lots of facts that mean nothing. I am proud of you, Bob, for seeing through her nonsense!
Non-military GPS will never be more accurate. If you need that, look for RTK, which is basically two GPS receivers with interpolated correction values.
Thanks for the feedback! Like you said, Jeff, I was just very interested in the basic concepts like sending GPS over LoRa and the discussion on batteries. Just wondering in general what kind of accuracy folks ARE getting and what equipment is used? If I remember, Jeff was using differential?
I had to look up the mention of haversine calculations: http://www.movable-type.co.uk/scripts/latlong.html Seems rather complicated. How are other people calculating GPS distance?
I am running RTK on a u-blox F9P with free correction data from the Minnesota Department of Transportation. I get 1-2 centimeter accuracy. The easiest way to convert lat/lon to a usable position is to just grab someone's code that does it.
*Thread Reply:* Thanks for the info! Obviously that seems the best route to follow!
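For completeness, the haversine formula from the movable-type page linked above is only a few lines once written out. A sketch in plain C++ (nothing here is from anyone's tractor code):

```cpp
#include <cmath>

// Great-circle distance between two lat/lon points (in degrees),
// using the haversine formula from movable-type.co.uk. Returns meters.
double haversineMeters(double lat1, double lon1, double lat2, double lon2) {
    const double R  = 6371000.0;               // mean Earth radius, meters
    const double PI = 3.14159265358979323846;
    const double toRad = PI / 180.0;
    double dLat = (lat2 - lat1) * toRad;
    double dLon = (lon2 - lon1) * toRad;
    double a = std::sin(dLat / 2.0) * std::sin(dLat / 2.0) +
               std::cos(lat1 * toRad) * std::cos(lat2 * toRad) *
               std::sin(dLon / 2.0) * std::sin(dLon / 2.0);
    return 2.0 * R * std::asin(std::sqrt(a));
}
```

At lawn-scale distances a flat-earth approximation works just as well, which is why most people convert lat/lon to a local x/y frame once and do ordinary 2-D math after that.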
Info from today's video meeting:
===========================================
07212022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/iDeE69qRH3k
Chat: Al: https://www.amazon.com/Leviathan-Penguin-Classics-Thomas-Hobbes/dp/0141395095/ref=sr_1_2?dchild=1&keywords=Thomas+Hobbes&qid=1620765914&sr=8-2 Al: https://youtu.be/faOb2YV_-Tg Al:https://github.com/DSG-UPC/Arduino-LoRa-Mesh
Index: 00:00 Al: Is talking about his LoRa project. He found a new LoRa library. 08:25 Jeff: Talks about his lawn tractor progress.
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos Index for the original ROS Agriculture group (~250 videos) and the new Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/
After much screwing around, much guessing, and much anxiety... I drilled down through my floor pan. (You can see yesterday's video for the background on this.) Both sides did hit the center of the frame rails. The holes were a little further forward than I expected. But there was a lot of guesswork that led up to this. I am (mostly) happy with the results I got. After I got everything bolted back together and clamped together, it looks like it may (might?) work. Just grabbing it and shaking it, it seems fairly sturdy. Maybe I will dig out my welder tomorrow and make it permanent.
I got my bottom brace welded today. I will have to see if it is adequate... (Note to self: Buy a bigger welder. Or watch YouTube videos to see how to do a better job...)
If it's MIG, your machine might have a feed/temp/material-thickness chart to help get close to a setting.
It is the cheapest flux-core wire-feed welder. I realized after watching a couple of YouTube videos that I should go look to see if it has a chart somewhere. I will go do that now...
If you need to wing it and you're on something like 1/4" material, turn the welder up to 3/4 power and the feed up to 3/4. Start a test weld, and if the wire isn't melting (pushing), start turning down the feed speed until it stops pushing/bunching the wire and it sounds like a smooth burning noise. If you're burning through the material, turn them both down in tandem about the same amount.
Clean material (wire brush it first) helps a lot.
Putting the material flat and in an easy position helps even more.
I never have liked flux core; IMO the gas add-on is well worth it.
Also, dragging a bead instead of pushing one is much easier if you're new at it.
I'm assuming it's a 110V unit. If you have an extension cord, make sure it's heavy gauge, not 14 or 16 AWG, especially if it's very long.
This is a $89 90-amp Harbor Freight special. This is 1/8" material.
To address the original point: there is no chart. It says the "specs" are 55 amps or 90 amps. It has a switch that says min/max. I started with it set to max. When I just now looked, it was set at min, which explains why I was getting blobs and no penetration. (At one point I bumped it and it felt like I had changed a switch position.) The first welds looked better.
So I probably need to get some sacrificial stock and practice.
The other points... It is 110VAC and plugged directly into an outlet. I have min and max current setting. I have 1-10 feed rate. And then you hope for the best.
I welded it a couple of times and ground it a couple of times.
Info from today's video meeting:
===========================================
07282022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/YktscLCvVrc
Chat: Al: https://www.youtube.com/watch?v=dLCRxTQ73FA https://www.johnnyseeds.com/tools-supplies/harvesting-tools/greens-harvesters/baby-leaf-harvester-7980.html https://www.youtube.com/results?search_query=quick+cut+greens+harvester
Index: 00:00 Jeff: Talks about his lawn tractor progress. Mechanical stories. Welding stories. 12:10 Jeff: Talks about the impending "Loss of Data" on the Slack channel. 22:15 Al: Talks about his TTGO/LoRa boards. His Platform_IO has stopped working. We talk about his code. 34:05 Al: Shows a small greens harvester.
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos Index for the original ROS Agriculture group (~250 videos) and the new Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/
I got my steering to move. As a first test it is working fairly well. I can probably give it way more power. But I didn't want to break anything. Next step is a feedback pot and a microcontroller. https://www.youtube.com/watch?v=_uxsaOqgomU https://www.youtube.com/watch?v=SxwuK75CF2U I had to add an adjustment bolt to keep the motor from pulling up as it actuates. Here is a picture of that.
*Thread Reply:* It is smooth because it is not being controlled. It is a constant PWM so the motor is getting constant power. I'm sure once I add closed loop feedback that it won't look so smooth.
@Al Jones Maybe adapt/modify the leafy harvest or you found to include some features like the old time “binders” https://youtube.com/watch?v=NSLp7ciddCg&feature=share
Initial success using the RadioLib library and data Structures sending and receiving data between two TTGO LoRa ESP32s. Code here with hardcoded sensor values:
Slave/radio control: https://github.com/jones2126/ttgo_board/blob/main/radio_control_slave_v1/src/main.cpp Master/tractor: https://github.com/jones2126/ttgo_board/blob/main/radio_control_tractor_v1/src/main.cpp
*Thread Reply:* Thanks @Al Jones for posting your code! Makes me want to program my stuff again!
Info from today's video meeting. I do recommend digging through the GOAT channel. (Although his comments about multiple LoRa radios seem a little fishy.)
===============
08042022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/Evp_g1RZaSg
Chat: Jeff: Jeff's steering motor videos https://www.youtube.com/watch?v=_uxsaOqgomU https://www.youtube.com/watch?v=SxwuK75CF2U Al: Code https://gist.github.com/jones2126/11557c73abbde5d32298918804950342 Jeff: The GOAT channel https://www.youtube.com/channel/UC_fquXcal5kvQXsbDxuiJhA
Index: 00:00 Al: Talks about his radios and computer configuration. 06:30 Jeff: Talks about Slack messages and the 90-day message limit. 13:40 Jeff: Talks about his power steering motor on his lawn tractor. 34:50 Jeff: Talks about adding videos to his personal page instead of the Lawn Tractor page. 38:40 Jeff: Talks about YouTube maintenance. 40:00 Jeff: Mentions the exposure on YouTube, specifically the old ROS Agriculture page. 41:20 Jeff: Complains about his lighting for Zoom videos. 42:30 Al: Talks about his C code. 44:45 Jeff: talks about "GOAT Industries" that he ran across.
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos Index for the original ROS Agriculture group (~250 videos) and the new Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/
*Thread Reply:* I know next to nothing about LoRa Spreading Factors.
I took a look at the GOAT's 2 videos and came away with more questions than answers. I am surprised he is using RPis instead of something faster.
This https://support.machineq.com/s/article/What-is-the-Spreading-Factor-SF implies it's for a SINGLE radio.
But he implies that using 4 radios in parallel he can get more SF. I don't understand his reasoning for why he can boost SF. If all the radios are set at 7, how does going parallel multiply the SF? Wouldn't they all be the same at 7? Therefore the end result of the 4 is still 7?
How can that be as good an SF as using 4 radios to boost the SF?
*Thread Reply:* I took another look at https://youtu.be/ckd_9kuyRCI.
BTW one can follow the video with the transcript on the side. That helps a lot.
At about 5 minutes he states that with 4 modules he gets zero errors. So does that mean that if you add enough radios you get no errors? So he is blasting out a signal with four radios, like a shotgun saturating a space on a target? I wonder how his code can look at multiple radio signals and report getting zero errors? Seems like a rather expensive solution?
Info from today's video meeting
===========================================
08112022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/fu09BCyIdWM
Chat: https://github.com/jones2126/ttgo_board https://www.cubcadet.com/en_US/snow-blowers/3x-28inch-snow-blower%C2%A0/31AH5DVB710.html#start=0 Kyler Laird's remote control (At around 6:30 is where it lurches): https://www.youtube.com/watch?v=TatrSXJiq0I All of Kyler's Tractobots: https://www.youtube.com/c/KylerLaird/videos
Index: 00:00 Al: Talks about his new 3D printed box for his LoRa remote. 02:00 Al: Presents some code. 07:30 More discussion about the box. 09:00 Jeff: Tells story about garage clean out. And a story about a snow blower. 13:10 Al: Suggests making a snow blower computer controlled. 17:20 Jeff: Talks about his "green screen" and controlling background images. 20:15 Al: Shows an industrial remote control he found. 37:05 Jeff: Comments on where videos were posted on YouTube, and the results.
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos Index for the original ROS Agriculture group (~250 videos) and the new Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/
Here is something interesting: https://www.youtube.com/watch?v=qWp_-F_zcwo
Just pocket change for you oil executives @Al Jones https://www.lely.com/exos/ ;-))
Info from today's video meeting.
===========================================
08182022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/1YoKlHvj5hI
Chat: Jeff: https://esp32io.com/tutorials/esp32-button Jeff: batteryhookup.com
Index: 00:00 Al: Talks about progress for his LoRa remote control. He points out some switches and talks about his status LEDs. And battery monitor. 06:25 Al: Talks about his eStop switch. 07:10 Jeff: Asks if the OLED is viewable in sun light. 08:30 Al: Talks about displaying the battery status on his status LEDs. 11:35 Al: Points out where to get cheap batteries. 13:05 Jeff: Isn't doing anything. Then launches into a story about a dirty workbench.
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos Index for the original ROS Agriculture group (~250 videos) and the new Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/
Here is something posted on "Weekly Robotics". If anyone has a Raspberry Pi Pico and is running ROS2 you could try it. https://robofoundry.medium.com/raspberry-pi-pico-ros2-via-micro-ros-actually-working-in-1-hr-9f7a3782d3e3
@Al Jones I think this is the reference to fiber optic gyros you were asking about. You have to read through to see how long his cable was... https://hackaday.io/project/2744-1-d-laser-ring-gyroscope
Info from this week's video meeting
===========================================
08252022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/V_MBBNDW6WQ
Chat: None
Index: 00:00 Jeff: Tells story about unloading junk on some scrap guys. 03:40 Jeff: Mentions some thoughts on testing. 06:30 Al: Talks about I/O pins on his TTGO ESP32 board. 15:15 Al: Talks about fiber optic gyros. (from an experimental point of view) 23:25 Jeff: Mentions email from Ardusimple. 24:40 Jeff: Talks about YouTube exposure and Slack channel members. 27:50 Conversation moves to the impending Slack channel changes. (and if they are actually valid) Note: They are really going to do this: https://slack.com/help/articles/7050776459923-Pricing-changes-for-the-Pro-plan-and-updates-to-the-Free-plan#free-plan-changes
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos Index for the original ROS Agriculture group (~250 videos) and the new Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/
Kinda wish we had this back in our strawberry farming days ;-) https://www.youtube.com/watch?v=M3SGScaShhw
These are the robots we have built since 2014: https://youtu.be/ilNIkZJ3NwY
This is our latest robot: https://youtu.be/idD_xI074bY
Very interesting bots you have, Behzad! I like the customer-selected modules idea and Big Top for blueberries! Can you talk more about the available modules? How the bots sense and navigate their environment? How they know what, when, where, and how to do their tasks?
Here is your green chopper @Al Jones https://www.youtube.com/watch?v=ayLcdy5LPmE
Info from today's video meeting.
===========================================
09012022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/HVwxZ48opYE
Chat: Jeff : Index of ROS Agriculture and Lawn Tractor meetings: http://sampson-jeff.com/RosAgriculture/ Jeff: My YouTube page: https://www.youtube.com/user/jws8675309/videos Jeff : Lawn Tractor meetings: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/featured Jeff: GOAT: https://www.youtube.com/watch?v=flXaL9d_CaM&list=PLYSpcfRJNQHPACEoVBRTK7H3WjZ0e59Wu&index=2 Al: https://www.powersportsmax.com/product_info.php/products_id/20784&utm_source=shopping&utm_medium=cpc&utm_id=atv-t045?msclkid=73aa11ec951015fade8572fc707f0a6a
Index: 00:00 Al: Gives a re-cap of his remote control system. 04:30 Al: Talks about learning Fritzing. 05:50 More discussion of the radio control architecture. And the OLED display. 07:50 Al: Shows some info in Fritzing. Making printed circuit boards with Fritzing. 15:45 Jeff: Mentions other PCB software. 18:00 Al: Shows where is going to mount a push button on his box. 19:30 Jeff: Talks about go-cart parts for parts. 29:10 Jeff: Talks about the mini quad bike that G.O.A.T. Industries is using. 33:40 Jeff: Mentions using two motors in the rear instead of using a differential. 39:00 Jeff: Asks Matt if he wants an invite to the Slack channel. 39:50 Discussion of the Slack "improvements". 40:35 Brief re-cap of how to access our group.
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos Index for the original ROS Agriculture group (~250 videos) and the new Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/
If you don't get the Weekly Robotics newsletter, there was a link to a remote-controlled lawn tractor: https://offthegrid.dev/blog/remotely-controlled-lawn-mower You can see how he solved making the lawn tractor "drive by wire".
Info from today's video meeting.
===========================================
09082022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/kmN7HNAVxSw
Chat: None
Index: 00:00 Al: Talks about ordering a PCB that he created using Fritzing. 30:20 Jeff: Talks about reusing R/C cars. 37:40 Jeff: Talks about Go Cart parts and/or fabricating parts.
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos Index for the original ROS Agriculture group (~250 videos) and the new Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/
Info from today's video meeting.
===========================================
09152022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/NKd29mZpR1Q
Chat: Fancy/expensive robots on eBay: Search eBay for "ros ackermann robot"
Index: 00:00 Al: Talks about his LoRa remote. Shows his new PC board made with Fritzing. 13:00 Jeff: Suggestions for holes that are too small. 19:40 Al: Goes over a TO-DO list. 21:20 Jeff: Talks about random things he is thinking about or researching. Differentials, transaxles... 25:20 Jeff: Creating steering to match the size and scale of the backend. 40:40 Jeff: Using off-the-shelf kid's cars. 48:30 Differences between Go Carts and robots. (precision) 50:30 Jeff: Fancy/expensive "ROS Ackermann" robots on eBay. 1:05:25 Trying to search for "ros ackerman robot gps". Cheap vs. precise GPS. 1:09:00 Thoughts on R/C cars for ROS robots.
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos Index for the original ROS Agriculture group (~250 videos) and the new Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/
Has anyone modified kiddie cars or go-carts to replace the Ackermann steering with large pneumatic swivel tires? Has anyone seen robotic zero-turn kiddie cars or go-carts? Something like this: https://hackaday.com/2020/01/10/diy-autonomous-mower-in-the-wild/
I'm not quite sure what you are asking... If your current "kid's car" has two independent drive motors in the back, then just do it. But the cheap plastic "tires" they use will not have enough traction.

But... Amazon lists a couple of differential drive "kid's cars":
https://www.amazon.com/Kidzone-Electric-Vehicle-Control-ASTM-Certified/dp/B08BL2HFY6
https://www.amazon.com/Huffy-Ride-Green-Machine-Vortex/dp/B07YBVLG16

Camp Peavy from Home Brew Robotics found a differential drive kid's car for Robomagellan: https://www.youtube.com/watch?v=MBptwD1ODKo I don't recall what product he bought, but I can find out if you want to know.

But you might want to search for wheelchair robots for inspiration: https://www.youtube.com/results?search_query=wheel+chair+robot That turns up a video of a "Power Wheels": https://www.youtube.com/watch?v=TwCL0BEUXjs I don't see that available for purchase any more... But back to the "wheelchair robots"... You may find a used wheelchair on Craigslist. Or you can buy wheelchair motors on eBay or Amazon. (But I recommend getting the motor and wheel as a unit, so you don't have to mix and match.)

But you imply converting an overpriced cheap kid's car into a very expensive (industrial/commercial) "zero turn" lawn mower. That seems like a leap in logic to me...
What are you really asking?
First off, thank you for all the useful links. Maybe I should have asked the question with more context. I watched the latest YouTube video and the discussion of Ackermann linkages and such. Also about all the videos you watched. And there was a comment about making something that was smaller for a back yard. So the zero-turn concept came to mind. Just speculating which method, Ackermann vs. zero turn, would be less complex to make and program? Just wondering. I don't plan on using zero turn.

Also, I do not want to spend a lot of money on the Jeep. Dealing with its Ackermann steering will help me understand the Ackermann on my tractor. The Ackermann vs. zero turn question was just a mental exercise about which method is more complex, hardware- and software-wise.

About the Jeep... there are 2 independent 12 V motors driving the thing. I could drive them independently or together. Together would mimic my tractor better. About the plastic wheels slipping... I thought I could cannibalize a car tire: just cut out the tread part and glue and fasten it onto the plastic tire, if slipping is a problem.

So for a small area to mow, which method of steering would be the simplest in hardware and software to implement?
First point... You can easily convert your Jeep to differential drive. Since it has independent rear drive, you just have to add castor wheels. Just get a board (plywood or a 1x6 pine board) and screw one or two castor wheels to it. Then bolt the board to the bottom of your Jeep. Just make sure it raises the front end enough that the existing front wheels no longer touch the ground. So at a moment's notice you can install or uninstall the castor wheels. Then drive the back wheels independently to get differential drive. So this is a relatively uncomplicated approach.
Another thing to keep in mind: when you use a castor wheel, make sure the pivot point is vertical after you get it mounted. Also use a fairly large-diameter castor wheel if you want to drive through the grass. Or just run on flat surfaces.
"Just speculating which method, Ackerman vs zero turn, would be less complex to make and program? Just wondering."
Mechanically, as I just mentioned, you can easily convert your Jeep back and forth between Ackermann and differential. To control Ackermann steering, you need some form of position-control actuator, because it depends on setting the steering angle to what the software specifies. On my mini-tractor, or on your Jeep, the giant R/C servo seems to be a good fit. But they are somewhat expensive. Otherwise you have to create your own closed-loop position actuator. For differential drive, you just control the speed of the two independent back wheels.
Now on the software side, if you are running ROS, it will generate velocity commands to drive a path of its choosing. It is up to you to get your vehicle to respond to the requests. Traditionally ROS has generated a "standard" command velocity message. So the result of the multiple layers of software is an instantaneous forward velocity and rotational velocity. For differential drive, this converts very easily to a left wheel speed and a right wheel speed. If you are going to use Ackermann steering, then you have to convert forward velocity and rotational velocity to forward velocity and steering angle. That is just a few lines of code and is supplied by an extra convertor node in the TEB Planner package.
So the bottom line is... it doesn't matter whether you use differential drive or Ackermann steering. You just need to supply the appropriate convertor nodes between ROS navigation and the low-level control in your vehicle. And you can instantly switch back and forth by starting the appropriate launch file.
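The convertor math described above is short enough to sketch out. This is not the actual TEB Planner convertor node, just the standard kinematics it implements; the track-width and wheelbase values you'd plug in are vehicle-specific:

```cpp
#include <cmath>

// Differential drive: convert a ROS-style command velocity
// (forward v in m/s, rotational omega in rad/s) into left and
// right wheel speeds. "track" is the distance between the drive
// wheels -- a vehicle-specific number.
void cmdVelToDiffDrive(double v, double omega, double track,
                       double &leftSpeed, double &rightSpeed) {
    leftSpeed  = v - omega * track / 2.0;
    rightSpeed = v + omega * track / 2.0;
}

// Ackermann: convert the same (v, omega) into a steering angle in
// radians using the bicycle-model relation tan(steer) = wheelbase *
// omega / v. The forward velocity passes through unchanged.
double cmdVelToSteerAngle(double v, double omega, double wheelbase) {
    if (std::fabs(v) < 1e-6) return 0.0;  // Ackermann can't turn in place
    return std::atan(wheelbase * omega / v);
}
```

The zero-velocity guard is the one real difference between the two: a differential robot can spin in place, while an Ackermann vehicle has a minimum turning radius, which the planner has to be told about.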
As far as gluing tire treads to the plastic Jeep wheels... that seems extreme to me. But maybe it would work.
Speaking of castor wheels... this guy did something I never would have thought of. He is using castor wheels as the front wheels of an Ackermann-steered vehicle. I think he welded a tab to each wheel. https://www.youtube.com/watch?v=WO-9hQtDoHw
*Thread Reply:* Looks like a good solution to be able to switch back and forth from Ackermann to zero turn, and vice versa.
I watched this video: https://www.youtube.com/watch?v=zuXqg4q3cQ8 I noticed they modified the back tires in some way. Some searching turns up something called "Traction Bands". https://backyartisan.com/power-wheels-traction-bands/ This seems easier than hacking car tires.
Testing 4 spim08hp batteries wired together as a replacement for my lead acid battery.
*Thread Reply:* Test results, and @JeffS is correct, there is no BMS. I think the term @Matt Droter uses is "hillbilly". At 14.2 V that means each cell is at 3.6 V. If I drained it to 12.5 V, then each cell would be 3.1 V. Video: https://1drv.ms/v/s!AlSRhxzgYnAliItv9fmR7tZQRqsI4w?e=61QaDh
Testing the Cytron motor controller with a PID. Overview video: https://1drv.ms/v/s!AlSRhxzgYnAliIwLRvWLATq63dUP6w?e=AMH1cK (Forgive the sound quality. I need to work on that.) Gist with code: link
*Thread Reply:* PID with Kp at 5, no Ki or Kd: https://drive.google.com/file/d/1TJiW9MKhBLfOiDYUtvogqIX9Pd8HZgs_/view?usp=sharing
@Al Jones Here is a complete example of a scratch built steering servo. Both a video and a web page... https://dronebotworkshop.com/custom-servo-motor/ https://www.youtube.com/watch?v=vFMwAy7BaH0
Here is another example of making a servo motor. https://www.youtube.com/watch?v=3pYWLF8qw-g
Info from today's video meeting.
===========================================
09292022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/ZACEihL96Zo
Chat: Motor running on table: https://youtu.be/MHd0v3Arq_g Plot of motor running: https://youtu.be/EkU9EA50GnY
Index: 00:00 Al: Talks about his new code for controlling his power steering motor. 06:10 Jeff: Has comments on what Al is doing. 06:15 - Convert pot value to an intermediate angle value. 10:50 - Discussion of center steer value. 14:15 - Just use P out of P, I, D. Unless you have problems. 18:45 - Ways to debug strange results. 23:30 - Setting artificial end limits in code so don't smash anything 34:50 - Debug print statements may screw up your timing. 44:25 Al: Says he may post updated code. 45:35 More random comments. 48:20 Jeff: Gives another run-down on parts to build a steering system. 53:00 Jeff: Complains that his robot is too heavy. So we talk about Al's Lithium batteries. And voltage regulators (or lack of regulator).
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos Index for the original ROS Agriculture group (~250 videos) and the new Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/
Updated testing program to figure out steering PID values. Included in this version is changing to "angles" per @JeffS’s suggestion. The next issue I want to resolve is the high-pitched whining coming from the motor. I think I've seen some videos explaining the frequency issue. Gist link.
I think this qualifies as a blooper... https://1drv.ms/v/s!AlSRhxzgYnAliI9zmLaqsFv0lIB3CA?e=hnmkPL (if it does not play, try downloading it) - testing motor controller, code being used: link
@Al Jones You might want to rethink what you are doing with the output of your PID function. Normally you just use that as your power level for your motors. But you seem to be doing some magical remapping, which probably renders the PID function useless...
*Thread Reply:* Thanks for looking at the code. Last Thursday I understood you to suggest using angle because later on angle is what will be fed from the ROS steering algorithm. This is my attempt to do that. Its now not clear to me what you are suggesting to feed as input into the PID and how to manipulate the motor power to reach the target angle. 🤔
*Thread Reply:* I am busy until about 3:30 Central time today. After that I may make some comments and tell some stories...
*Thread Reply:* Well, it is now 4:15 PM (my time). Nobody showed up...
Let's start at the beginning. A PID function expects you to supply two input variables. One is usually the current state, the other is a new requested state.
This can be temperature, position, speed, amount of light. Anything...
But it expects the two inputs to be in the same units and in the same scale.
It will attempt to make the current state match the new requested state.
So in this case, you are providing the current steering position and you are requesting a new steering position. The PID algorithm will calculate the "error" by simply subtracting the two inputs to determine the difference between the two. Then it applies a proportional correction, a derivative correction and an integral correction. The sum of these three is the output. The output is used to move the current value towards the requested value.
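For reference, the loop Jeff describes fits in a few lines. A minimal sketch (all names here are illustrative, not Al's actual code):

```python
class SimplePID:
    """Minimal PID: error = setpoint - current; output is the sum of the
    proportional, integral, and derivative corrections."""

    def __init__(self, kp, ki=0.0, kd=0.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.last_error = None

    def update(self, setpoint, current, dt=0.02):
        # Both inputs must be in the same units and scale (counts or degrees).
        error = setpoint - current
        self.integral += error * dt
        derivative = 0.0 if self.last_error is None else (error - self.last_error) / dt
        self.last_error = error
        # The output is the correction: motor power and direction.
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

With Ki and Kd left at 0 this reduces to plain proportional control, which is where the thread suggests starting.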
Oh, the clowns are here, an hour and a half late. I'll continue this when they are gone.
*Thread Reply:* Well, false alarm, sort of... If you want to hear the whole story then tune in for the weekly meeting before we start recording...
So, the PID takes two input values and produces an output value. The output value is considered to be a correction factor. If the inputs are temperature values then you either turn on a heater or turn on cooling. In the case of steering position it is treated as a power value to your steering motor. It will drive the steering motor in the direction that makes the error value 0.
So you don't want to remap that value to anything else, just feed it directly to your motor as a power/direction value.
So at that point, just adjust the PID values to give a result that works.
Let me start another rant/soapbox...
*Thread Reply:* So, my original point was... Just convert the target pot to an angle. Then it will match the value that ROS will send you (assuming you are running TEB planner with the flag set for "produce steer angle instead of rotational velocity". So my thought was you can substitute the ROS angle for the target pot angle.
But I see you converted everything to an angle. That will work, as long as both inputs to the PID are an angle. So the PID can accept either A/D counts or an angle. As long as both inputs are the same units and the same range.
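As an illustration, converting A/D counts to an angle is just a linear map (all range values below are made-up placeholders; calibrate against your own pot):

```python
def counts_to_angle(counts, counts_min=0, counts_max=1023,
                    angle_min=-45.0, angle_max=45.0):
    # Linear map from A/D counts to a steering angle in degrees.
    # The count and angle ranges here are assumptions, not measured values.
    span = (counts - counts_min) / (counts_max - counts_min)
    return angle_min + span * (angle_max - angle_min)
```

Feed both the target and the feedback reading through the same conversion, so both PID inputs stay in the same units and range.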
Then the output should be treated as power. And yes, you should clip/clamp the output value to whatever the PWM driver wants to see. But don't try to scale that to something else. Feed it to your motor (and honor the direction based on a positive/negative value).
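Clipping could look something like this (a sketch; the ±255 range is an assumed 8-bit PWM scale, substitute whatever your driver expects):

```python
def clamp_to_pwm(pid_output, max_pwm=255):
    # Clip the PID output to the PWM range; magnitude becomes duty cycle,
    # sign becomes direction. max_pwm=255 is an assumption (8-bit PWM).
    power = max(-max_pwm, min(max_pwm, pid_output))
    direction = 1 if power >= 0 else -1
    return abs(power), direction
```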
*Thread Reply:* Now, I said to look at your position feedback pot and kill the output if you went too far. If you take those extra lines of code (that you added) and merge them into the "if error > 0" or "if error < 0" blocks, then you will be able to recover from going too far. So figure out which block (i.e., error > ?) is right and which is left, and put the check in the appropriate block. Then, if you go too far, you can back out by specifying the opposite direction. What you have now will lock up the motor completely and you are done, until you manually back the motor off.
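The recovery idea in sketch form (made-up limit values; the point is that each limit only blocks motion *further* in that direction, so the opposite direction can still back out):

```python
# Hypothetical feedback-pot limits; substitute your own measured values.
POT_MIN, POT_MAX = 200, 800

def apply_soft_limits(output, feedback_pot):
    # Positive output assumed to drive toward POT_MAX, negative toward POT_MIN.
    if output > 0 and feedback_pot >= POT_MAX:
        return 0  # past the right limit: block right, still allow left
    if output < 0 and feedback_pot <= POT_MIN:
        return 0  # past the left limit: block left, still allow right
    return output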
*Thread Reply:* Another issue at this point. Don't compare against error. Error is local to your PID function. Use the output of your PID ("output") instead. When you get "output" from your PID function, that is the power value to send to your motor. You don't have to remap anything.
*Thread Reply:* Another thing I noticed... You are sampling your "target pot" at a 5 second interval. Speed that up. In fact, sample the target pot and your feedback pot at the same time. You will get much better results for testing. And I notice that you are averaging your feedback pot, but not averaging the target pot (or PID parameter pots). Is the feedback pot that much worse? Or didn't you notice?
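If you want the same averaging on every pot, a tiny helper can wrap it (a sketch; `read_fn` is a hypothetical stand-in for an analogRead on a given pin):

```python
def averaged_reading(read_fn, samples=8):
    # Average several back-to-back readings to smooth analog noise.
    # read_fn is a placeholder for whatever reads the pot.
    return sum(read_fn() for _ in range(samples)) / samples
```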
I probably had a couple more comments, but I started drinking when the "clowns" didn't show up when they were supposed to...
Let me know if I confused you even more. (Oh damn, now I have to get up early again tomorrow for the video meeting... See you then.)
*Thread Reply:* Oh, another comment. Eventually you may want to do "slew rate limiting". When you specify a new target position that is not close (i.e., a location that is a considerable distance away), you may want to control how fast it attempts to reach that position. You can check each time through the loop and incrementally increase the position each time. Then you are always attempting to reach a nearby position. That allows you to crank up the motor power to reach a close position, instead of specifying a far location and slamming the motor to full power, which would cause it to overshoot.
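A minimal sketch of the idea (hypothetical names): each loop iteration, the commanded position creeps toward the real target by at most a fixed step:

```python
def slew_limit(target, commanded, max_step):
    # Move the commanded position toward the target by at most max_step
    # per loop, so a far-away target never slams the motor to full power.
    if target > commanded:
        return min(commanded + max_step, target)
    return max(commanded - max_step, target)
```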
We can talk about this more later...
@channel Just a reminder... The way Slack works, threads can never be archived (unless you are much smarter than me). So in 90 days, this last thread will disappear forever.
Info from today's video meeting.
===========================================
10062022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/GzAQU0ZhkrA
Chat: None
Index: 00:00 Al: Goes over version 3 code for his power steering motor. Mention of "ledc" which is a high-frequency PWM function. 04:20 Conversions between counts and angles. 28:45 Discussion of noisy analog and resulting jitter. 37:00 Tour of Al's physical layout. 48:00 Jeff: Shows the results of using plastic blocks for steering components. 54:40 Jeff: Points out ways to make a back end for testing. 57:20 Jeff: Size consideration. 1:00:20 Al: Asks about naming and asks what is next.
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos Index for the original ROS Agriculture group (~250 videos) and the new Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/
Updated PID configuration helper link. I now have settings that can be used for a field test. Full left to right takes about 2 seconds while sitting still on concrete with the max power setting at 150. The output of the PID is now directly fed into the motor power of the Cytron motor controller. Thanks @JeffS
Currently working on getting my LoRa handheld radio control to be the source of data for moving the steering and throttle instead of the hard wired potentiometers I used for getting the PID settings defined. Just started. I noticed that even though I have the Tractor sending packets back to the Radio Control 2x/sec, and the rssi is ok, there is a long lag between packets received at the Radio Control. Continuing to debug....
Radio Control code link; Tractor code (initial basic shell): link
Info from this week's video meeting
===========================================
10132022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/4_HCT0AA8cw
Chat: Al's remote code: https://github.com/jones2126/ttgo_board/blob/main/radio_control_v2.1/src/main.cpp Al's vehicle controller: https://github.com/jones2126/ttgo_board/blob/main/radio_control_tractor_v1/src/main.cpp Video about Cub Cadet seat switch: https://www.youtube.com/watch?v=qwqORlyu7r8
Index: 00:00 Al: Gives progress update on his lawn tractor controller and the remote controller. 05:20 Jeff: Shows how he mounted his laptop computer. 08:30 Al: Asks how to secure the screen. I say I just close the lid. Al asks how to prevent it from sleeping. We talk about where to find the setting to control that. 13:45 Al: Mentions powering the laptop from the vehicle battery. 16:30 Jeff: Asks if we are going to get a demo of the tractor moving. 16:50 Jeff: Asks if Al has the eStop figured out with the fuel injected version of the tractor. 22:35 Jeff: Briefly mentions his lack of progress on a new smaller tractor.
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos Index for the original ROS Agriculture group (~250 videos) and the new Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/
Info from today's video meeting
===========================================
10202022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/bkBEDjLSARU
Chat: Al: https://clonezilla.org/downloads.php Al: Radio control software https://github.com/jones2126/ttgo_board/blob/main/radio_control_v2.1/src/main.cpp Al: Tractor software: https://github.com/jones2126/ttgo_board/blob/main/radio_control_tractor_v1/src/main.cpp Al: https://www.youtube.com/watch?v=qwqORlyu7r8 Al: https://www.youtube.com/watch?v=ArEKafd0oBM
Index: 00:00 Al: Gives progress update on his lawn tractor controller and the remote controller. 02:45 We see a video of Al's lawn tractor driving. 07:00 Jeff: Talks about his progress on building a smaller robot. 21:00 Al: Asks about data logging. 34:50 Discussion of the seat switch and wiring. 42:45 Jeff: Asks if the laptop has a mechanical drive or a solid state drive. (for vibration) 47:15 Jeff: Complains about the way Zoom works... 49:00 Laptop mounting. 50:15 More video demos.
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos Index for the original ROS Agriculture group (~250 videos) and the new Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/
*Thread Reply:* @JeffS watching the latest video where you are discussing your next small bot…..I missed how this new one will be different than the other small bots you have created in the past? What new sensors and/or mission capabilities will it have? I thought I heard camera?, vacuum cleaner scanner? …?
Well, I mentioned more than once that I want something smaller, lighter, less sloppy steering and two drive motors instead of a single drive motor with a differential.
As far as "mission capabilities", I just want to prove that two drive motors instead of one will work.
I'm assuming we will not have a video meeting this week. The meeting has not been started and I can't log in to start it...
I was viewing ROS Agriculture meetings for references to my various steering stages. Before mounting any servo: https://youtu.be/SsqqjrB5-0U?t=1439 After mounting the Super Servo200: https://www.youtube.com/watch?v=-BewtBuwrtw After upgrading to the open frame "giant servo": https://www.youtube.com/watch?v=Xr0XDVY8wpU
When I turned on my computer today I had problems with Yahoo mail, Youtube preferences, Slack and Firefox updated and forced me to restart. It could be that the Firefox update and restart deleted my cookies.
But when I tried to sign in to Slack it presented me with a page saying they sent me a secret code that I must enter. But they never emailed me the code. If you click at the bottom on "click here to sign on with password" WITHOUT entering your email address, it takes you back to the normal sign-in with password. But if you enter your email address and then click on "click here to sign on with password", you will be stuck in a loop going back to the magic code.
A guy at Slack is looking at why the magic code is not being sent out.
It turns out the problem logging in was (ultimately) my problem. One of my SPAM filters decided (incorrectly) that the emails from Slack were SPAM, and flagged them as such. It then bounced the message back to Slack saying I will not accept SPAM. So they conveniently marked me as a problem and refused to send me anything (all without my knowledge, you know, a Skynet thing). So I went through and attempted to stop all of my SPAM filters from doing stupid things. After a couple of days it is working again. I may have figured out how to prevent gmail from screwing me over. I need to poke harder at Yahoo mail, because they are still marking things as SPAM. But I think I convinced pobox not to silently bounce messages, which was causing major problems...
Have I mentioned lately that I hate computers? In reality I just hate stupid people that make computers do stupid things...
Info from today's video meeting.
===========================================
11032022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/WEJI0AgxsK8
Chat: Al: https://docs.google.com/document/d/1iQgvms-inNsP_-TtOGxOjQipT-mIIxHGuhmcW948AYo/edit Al: https://docs.google.com/document/d/1w-nADgWnSuUZCkjbIYt2L3B5GJlkTXXfgxCJSgD9p9k/edit Al: https://www.mcmaster.com/90447A160/
Index: 00:00 Al: Gives progress update on his lawn tractor controller. Mention of remote login. Disk imaging. 04:30 Al: Has his GPS board installed. 05:40 Discussion of udev rules. 13:50 GPS and NavSat driver. 15:45 Jeff: Quick review of the pieces for a a new small robot. 16:20 Jeff: Some thoughts on reducing the size. And methods of fabrication. 21:20 Jeff: Talk about making the front end match the width of the back end. Discussion of bolt sizes. 24:50 Talk about different ball joints from Amazon. 26:00 Talk about cutting threads. 28:10 Pros and cons of metric vs. SAE hardware. 31:35 Prices at hardware stores vs. Menard's vs. Fleet Farm. More random discussions and complaints...
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos
Index for Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/LawnTractorMeetingNotes.txt
Index for the original ROS Agriculture group (~250 videos): http://sampson-jeff.com/RosAgriculture/readme.txt http://sampson-jeff.com/RosAgriculture/ros-agriculture-youtube20211026.txt
I probably won't be available for the video meeting on Thursday 11/10. I "MAY" be getting a new water heater.
The line that says: /usr/bin/env: 'python': No such file or directory probably indicates you have a mismatch between python and python3. Did you get the new version of nmea_navsat that is for Noetic? Does your launch file say something like "python nmea_navsat..."? Maybe changing that to "python3 nmea_navsat..." will get past that. Otherwise, I assume you have a reference to python buried somewhere.
I switched away from the ROSAG specific version and I have now cloned https://github.com/ros-drivers/nmea_navsat_driver.git, fought through a few issues and have nmea sentences coming into the tractor laptop now.
Info from today's video meeting.
===========================================
11102022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/gvmQ-t7D3Cc
Chat: None
Index: 00:00 Al: Talks about updating his GPS driver to work with ROS Noetic. 06:15 Al: Talks about Odom. (looks like left and right wheel sensors) 07:10 He is installing simulator on his lawn tractor laptop (for some reason).
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos
Index for Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/LawnTractorMeetingNotes.txt
Index for the original ROS Agriculture group (~250 videos): http://sampson-jeff.com/RosAgriculture/readme.txt http://sampson-jeff.com/RosAgriculture/ros-agriculture-youtube20211026.txt
I was trying to make the YouTube channel more obvious. I thought I could add a banner image and put text on the image that says "Click About tab for more info". I couldn't figure that out. But I was able to add a link that says "More Info" which takes you to the About page.
The banner image is just a random image from pixabay.com.
I don't understand why you are starting with anything to do with a simulator if you are trying to get your physical lawn tractor to run.
It appears that the code you are showing extracts a location from Gazebo and publishes something. I would think what you want is the original code that subscribes to your GPS messages and publishes results (offset in meters? transform?)
If you are going to start with code that was made to keep the simulation world happy, then you will have to understand how that code is different from the physical world code.
(I don't mean to be short, but I have been drinking and my sewer backed up. So I had to do an emergency sewerectomy.)
I can't find anything anymore. But I was digging through here: ROS Agriculture Slack General Channel 7-15-2019 to 12-24-2020.pdf (I have a local copy on my hard drive. The original appears to have been under https://github.com/ros-agriculture/ros_lawn_tractor/wiki/, but the wiki is gone.) I found this link: https://gist.github.com/droter/c96e12deea000cac4da566dac2c91c23 You need some version of this for a physical vehicle. It subscribes to navsat/fix to get lat and lon. It also subscribes to heading, which may be from IMU or from a modified GPS message. But you should be able to get all three from GPS and not require IMU or Odometry or LIDAR. Then it needs to publish an odometry message for Move Base and an odom to base_link transform. (which I think it does)
If you get heading from GPS, remember the heading is not valid if you are not moving.
Here is a description of using just GPS: https://youtu.be/kHBzMp5E-lU?t=2132
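For reference, the heart of such a GPS-only node is converting lat/lon into local meters from a fixed origin. A rough sketch of that conversion (equirectangular approximation, fine over a lawn-sized area; gps_odom.py actually uses the geonav_transform package for this, and the names below are made up):

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius; close enough at lawn scale

def latlon_to_xy(lat, lon, origin_lat, origin_lon):
    # Equirectangular approximation: x east, y north, in meters
    # relative to a fixed origin fix.
    x = math.radians(lon - origin_lon) * EARTH_RADIUS_M * math.cos(math.radians(origin_lat))
    y = math.radians(lat - origin_lat) * EARTH_RADIUS_M
    return x, y
```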
Long way from working, but hoping to use ekf_localization_node (e.g. https://blog.abdurrosyid.com/2021/07/22/fusing-wheel-odometry-imu-data-and-gps-data-using-robot_localization-in-ros/) #random Anyone done this in the physical world?
@Al Jones I am watching today's meeting video. On your Teensy encoder problem... it is more likely your encoder is putting out something different than it used to. Maybe check for range and verify it goes from 0 to max in a linear ramp.
I got the video meeting for this week published. The indexing was particularly tedious this time since we were all over the place. I'll give it a few minutes since I can't figure out how to force Slack to start a new message...
Info from today's video meeting.
===========================================
11172022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/up1TMz-FOi0
Chat: Al: https://automaticaddison.com/how-to-publish-wheel-encoder-tick-data-using-ros-and-arduino/ Al: https://chiprobotics.gitbook.io/chip-robotics/sensors/imu-sensor/getting-started-guide Al: https://chiprobotics.gitbook.io/chip-robotics/sensors/imu-sensor/getting-started-guide Al: https://www.youtube.com/watch?v=dEPvZZ4luV4 Al: https://gist.github.com/jones2126/0f98af1cad3ca73df3b0e759cc30dc3f Here is a description of using just GPS and gps_odom.py to do localization. https://youtu.be/kHBzMp5E-lU?t=2132
Index: 00:00 Jeff: Shows his axle that he cut to reduce the length. 01:50 Al: Gives a status update. Comments on GPS, encoder. 03:45 Al: Talks about trying to get his Chips IMU running again. 05:30 Al: Watched some GPS videos. 05:55 Al: Mentions the original gps_odom.py code. Posts another link to a navigation video. 06:10 He is thinking about jumping into EKF and applying all of his sensors. 08:45 Al: More details about what he plans to do. PlatformIO, another video 10:00 Jeff: Suggests not jumping into EKF and just using GPS. 12:05 Discussing the "modified" NMEA NavSat driver. 14:50 Jeff: Complains about trying to use Google Drive. 17:20 Jeff: Mentions a script that extracts velocity and heading from a standard navsat/vel message. Back to gps_odom.py Mentions this meeting where the multiple copies of gps_odom.py were running: 10082021 Lawn Tractor Automation meeting https://www.youtube.com/watch?v=GddLZDXx75Q 20:20 Talk about fixing or removing behavior trees that were added for the stage simulator. 22:35 Jeff: Mentions difference between various navigation packages. 23:20 Jeff: Again complains that he can't find any of the old data. 29:15 Snow??? 29:45 Al: Points out he has his own backups. 33:50 Jeff: Babbles incoherently... 34:35 Al: Says his Teensy code no longer compiles. 41:30 Al: Mentions he is going to use the URDF file from my Gazebo simulator code. I missed it when he said that, but I caught it when he says it again later... 42:10 Jeff: More excuses for a small robot. 43:25 More reiteration on what Al is about to do. 44:30 Jeff: Asks which computer is running which nodes. 46:30 We start looking at launch files and I suggest going back to the original launch files and nodes from his RPi. 55:20 Al: Points out he is determined to use some of the shiny new code from my simulator project, that does not apply. And this is where he says he wants to use the URDF from the simulator. I said "Don't". Part of the confusion may be due to missing ROS packages.
59:30 "The Punch Line"
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos
Index for Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/LawnTractorMeetingNotes.txt
Index for the original ROS Agriculture group (~250 videos): http://sampson-jeff.com/RosAgriculture/readme.txt http://sampson-jeff.com/RosAgriculture/ros-agriculture-youtube20211026.txt
Does anybody know what this means? "That looks like a GitHub Gist link. Do you want us to import it and all future GitHub Gist links from you?"
It asks me that repeatedly. And I have no idea what they are asking me.
*Thread Reply:* Sorry, I've not seen that. I googled it, but no decent hits.
@JeffS Regarding the code for tracking encoder rollover, here are some results and test code. “Movement” is calculated correctly in 3 of the 4 examples, but when last position is 20 and current position is 16383, the movement is incorrect (i.e. -21 vs 16,363 (16,383 – 20)).
uint16_t current_position = 10;
uint16_t last_position = 1;
int16_t position_movement;
int32_t rotational_position = 0;
uint16_t bitshift_cur_pos = 0;
uint16_t bitshift_last_pos = 0;
int16_t bitshift_pos_delta = 0;
void setup() {
  Serial.begin(115200);
  Serial.print("code tester program has started ");
  Serial.println();
  Serial.println();
  perform_calc();
  print_info();
  current_position = 20;
  perform_calc();
  print_info();
  current_position = 16383; // (pow(2,14)-1) - the highest value before rollover
  perform_calc();
  print_info();
  current_position = 1;
  perform_calc();
  print_info();
}
void loop() {
  // put your main code here, to run repeatedly:
}
void print_info() {
  Serial.print("last pos: "); Serial.print(last_position);
  Serial.print(", bitshift_last_pos: "); Serial.print(bitshift_last_pos);
  Serial.print(", current pos: "); Serial.print(current_position);
  Serial.print(", bitshift_cur_pos: "); Serial.print(bitshift_cur_pos);
  Serial.print(", movement: "); Serial.print(position_movement);
  Serial.print(", total movement: "); Serial.print(rotational_position);
  Serial.println();
  last_position = current_position;
}
void perform_calc() {
  bitshift_cur_pos = current_position << 2;  // Bitshift left - the leftmost 2 bits in current_position are shifted out of existence
  bitshift_last_pos = last_position << 2;
  bitshift_pos_delta = (bitshift_cur_pos - bitshift_last_pos);
  position_movement = bitshift_pos_delta >> 2;  // Bitshift right 2 bits
  rotational_position += position_movement;  // Update the absolute position values. (Position of this encoder since CPU reset.)
}
Well, all 4 steps here look right to me. Keep in mind that 16383 is actually -1 as a signed number.
Is your encoder still putting out a 14-bit unsigned number through one full rotation? If you watch the output of the encoder and run it through a full revolution (or possibly 3 revolutions), does it start at 0 and ramp up to 16383 and then jump back to 0 and repeat?
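The wraparound arithmetic above can be sanity-checked outside the Arduino. A small Python sketch of the same trick (illustrative, not Al's code): shift the 14-bit readings left 2 so they fill a 16-bit register, let the subtraction wrap, then arithmetic-shift back down:

```python
def movement_14bit(last_pos, cur_pos):
    # Shift both 14-bit readings left 2 to fill 16 bits, subtract with
    # 16-bit wraparound, reinterpret as signed, then shift back down.
    delta = ((cur_pos << 2) - (last_pos << 2)) & 0xFFFF
    if delta >= 0x8000:       # treat as a signed 16-bit value
        delta -= 0x10000
    return delta >> 2         # Python's >> is arithmetic on negatives
```

This reproduces the case from the thread: going from 20 back to 16383 is a small backwards move of -21 counts, not a forward jump of 16,363.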
Still not quite sure what is different, but I have it working again. source The encoder is still 14 bit (i.e. rolls over at 16,384.)
@JeffS I can't get gps_odom.py to start. Actually I can't get an example from geonav_transform to work either. Both attempts fail on the line
import geonav_transform.geonav_conversions as gc
I followed these instructions and there was no error:
$ cd /home/tractor/catkin_ws/src
$ git clone https://github.com/bsb808/geonav_transform.git
$ cd /home/tractor/catkin_ws/
$ catkin_make
$ source devel/setup.bash;
The error for gps_odom.py is
rosrun beginner_tutorials gps_odom.py
Traceback (most recent call last):
File "/home/tractor/catkin_ws/src/beginner_tutorials/scripts/gps_odom.py", line 12, in <module>
import geonav_transform.geonav_conversions as gc
ModuleNotFoundError: No module named 'geonav_transform'
tractor@tractor-HP-ProBook-645-G1:~/catkin_ws$
I'm going to try and hard code the path in gps_odom.py to see if I can force it to find geonav_transform.
*Thread Reply:* moving geonav_conversions.py to /home/tractor/catkin_ws/src/geonav_transform/src/ and then including the statement below in gps_odom.py helped find the module: sys.path.append('/home/tractor/catkin_ws/src/geonav_transform/src/')
Now the statement below fails: reload(gc)
rosrun beginner_tutorials gps_odom.py
tractor@tractor-HP-ProBook-645-G1:~$ rosrun beginner_tutorials gps_odom.py
Traceback (most recent call last):
File "/home/tractor/catkin_ws/src/beginner_tutorials/scripts/gps_odom.py", line 15, in <module>
reload(gc)
NameError: name 'reload' is not defined
tractor@tractor-HP-ProBook-645-G1:~$
*Thread Reply:* Maybe try commenting out that reload statement. I normally don't see that used. Plus, I don't know what it is supposed to do. Or maybe you need to import another module to support the reload statement?
*Thread Reply:* @Al Jones These may apply: https://answers.ros.org/question/380106/how-to-import-modules-using-ros-noetic-with-python3/ https://answers.ros.org/question/269701/import-python-modules-failedwhat-am-i-missing/ https://stackoverflow.com/questions/12229580/python-importing-a-sub-package-or-sub-module https://roboticsbackend.com/ros-import-python-module-from-another-package/
*Thread Reply:* @Al Jones Here is the issue with reload. First it looks like it is used to allow you to change the source code and automatically continue on. So it sounds like it is not strictly required. But this page shows different formats depending on which version of Python you are running. https://www.geeksforgeeks.org/reloading-modules-python/
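For what it's worth, under Python 3 (which Noetic uses) `reload` has to be imported rather than used as a builtin. A minimal sketch (using `json` as a stand-in for the actual geonav module):

```python
import importlib
import json  # stand-in here for geonav_transform.geonav_conversions

# Python 2 had reload() as a builtin; Python 3 moved it to importlib:
json = importlib.reload(json)
```

Since the reload appears to exist only to support interactive development, commenting it out is likely just as good.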
This page: https://get-help.robotigniteacademy.com/t/no-module-name-geonav-transform-geonav-conversions/14494/6
Says
cd /home/user/catkin_ws/src
git clone https://github.com/bsb808/geonav_transform.git
cd /home/user/catkin_ws/
catkin_make
source devel/setup.bash; rospack profile
Maybe adding the
; rospack profile
will help?
error with gps_odom.py after resolving the reload statement by using "from imp import reload"
[ERROR] [1668989152.252446]: bad callback: <bound method MainClass.gps_callback of <__main__.MainClass object at 0x7f6ac4acc160>>
Traceback (most recent call last):
File "/opt/ros/noetic/lib/python3/dist-packages/rospy/topics.py", line 750, in _invoke_callback
cb(msg)
File "/home/tractor/catkin_ws/src/beginner_tutorials/scripts/gps_odom.py", line 82, in gps_callback
odom_broadcaster.sendTransform((_xg, _yg, 0.0), self.odom_quat, rospy.Time.now(), "base_footprint", "odom")
File "/opt/ros/noetic/lib/python3/dist-packages/tf/broadcaster.py", line 63, in sendTransform
t.transform.rotation.x = rotation[0]
IndexError: tuple index out of range
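That `IndexError` suggests `self.odom_quat` is not a 4-element tuple: `tf`'s `sendTransform` expects a quaternion as `(x, y, z, w)`. For a heading-only rotation the quaternion can be built by hand (a pure-Python sketch, equivalent to `tf.transformations.quaternion_from_euler(0, 0, yaw)`; the function name is made up):

```python
import math

def quat_from_yaw(yaw):
    # Rotation about Z only: quaternion (x, y, z, w) with x = y = 0.
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))
```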
*Thread Reply:* You got an error so gps_odom died. You try to echo /heading but apparently it is not being published. You did an info and it said /nmea_serial_driver_node is supposed to be publishing it. It says it has no subscribers since gps_odom died.
But yes, /heading should be getting published at this point. You could put debug statements in /nmea_serial_driver_node to see why it isn't publishing.
*Thread Reply:* I would still need improvements to get heading from GPS when the tractor is moving. Right now the driver is just taking the IMU data.
*Thread Reply:* I can get to this by adding the code below to the startup launch file.
<!-- ************ Fake localization for map and gps ********** -->
<node pkg="tf" type="static_transform_publisher" name="map_to_odom" args="0.0 0.0 0.0 0 0 0.0 map odom 100"/>
<node pkg="tf" type="static_transform_publisher" name="map_to_utm" args="0.0 0.0 0.0 0 0 0.0 map utm 100"/>
*Thread Reply:* I see you are putting your odom directly on map. That should be correct for what you are doing now.
Get rid of the map to utm. You don't have utm. As far as I know you have never had utm. I think it came from an alternative to the gps_odom node. Something that was associated with the EKF stuff.
I see odom to base_footprint is being supplied by gps_odom, which should be correct.
When you start your robot_state_publisher (which reads your URDF file), you should get the base_footprint to base_link transform. And you should get your base_link to gps transform.
*Thread Reply:* A little more progress. frames.pdf is looking better to my untrained eye.
updated launch file link
This launch file seems to be successfully launching mbf_costmap_nav and map_server. My issue at the moment is I am unable to get this urdf and mesh filename argument to enable rviz to load the lawn_tractor.dae file. I also did not include "pkg="lawn_tractor_navigation" name="mbf_behavior_tree"" which was in the legacy lawn_tractor.launch The red sphere is the gps.
I would suggest not spending a lot of time trying to get the robot model to display. It is just there for decoration and it hides the real information you need to see to debug (transforms).
PS: This statement doesn't make sense to me:
"file:/home...." works, "
*Thread Reply:* The statement below in the urdf resolved displaying the dae file. Using file: 2 slashes, which was referred to in some posts, does not work.
<mesh filename="file:/home/tractor/catkin_ws/src/lawn_tractor_sim/meshes/lawn_tractor.dae" scale='1 1 1'/>
My current problem statement is that no cmd_vel topic is published after initiating a 2D Nav Goal in RViz.
*Thread Reply:* Interesting link that shows remapping: https://uos.github.io/mbf_docs/tutorials/beginner/basic_navigation/
We used Move Base Flex by relaying mb_msgs/MoveBaseAction to mbf_msgs/MoveBaseAction in a standard Move Base goal callback. This is useful in case you want to use Move Base Flex as a drop-in replacement for Move Base and want to take advantage of continuous replanning, which is built into Move Base Flex, but not Move Base.
*Thread Reply:* Looks like Vinny monitored move_base_simple/goal here
*Thread Reply:* More sleuthing needed. After doing these things:
• Make a package "lawn_tractor_navigation "
◦ $ cd ..
◦ $ cd src
◦ $ catkin_create_pkg lawn_tractor_navigation rospy
• Copy down the src folder
• Make .py executable, $ chmod +x mbf_state_machine.py
• copy down the file lawn_tractor/lawn_tractor_navigation/CMakeLists.txt
• copy down the file lawn_tractor/lawn_tractor_navigation/package.xml
• copy down the folder lawn_tractor/lawn_tractor_navigation/include
I'm getting a clean catkin_make, but during start up this error is being generated:
[mbf_behavior_tree-2] process has died [pid 80213, exit code -6, cmd /home/tractor/catkin_ws/devel/lib/lawn_tractor_navigation/movebaseflex_bt_node __name:=mbf_behavior_tree __log:=/home/tractor/.ros/log/54da8f4a-725e-11ed-951f-ab510d278eeb/mbf_behavior_tree-2.log].
log file: /home/tractor/.ros/log/54da8f4a-725e-11ed-951f-ab510d278eeb/mbf_behavior_tree-2.log
The referenced log file is empty.
master.log shows:
[roslaunch][ERROR] 2022-12-02 11:28:31,000: [mbf_behavior_tree-2] process has died [pid 80213, exit code -6, cmd /home/tractor/catkin_ws/devel/lib/lawn_tractor_navigation/movebaseflex_bt_node __name:=mbf_behavior_tree __log:=/home/tractor/.ros/log/54da8f4a-725e-11ed-951f-ab510d278eeb/mbf_behavior_tree-2.log].
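One clue worth noting: roslaunch reports a negative exit code when a node is killed by a signal, so "exit code -6" means signal 6 (SIGABRT), which lines up with a C++ terminate after an uncaught exception rather than a ROS networking problem. A small decoding sketch:

```python
import signal

def describe_exit(code):
    """Decode a roslaunch-style exit code: negative means killed by that signal number."""
    if code < 0:
        return "killed by %s" % signal.Signals(-code).name
    return "exited with status %d" % code

print(describe_exit(-6))   # killed by SIGABRT (abort(), e.g. an uncaught C++ exception)
```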
@vinny ruia You around? Any suggestions? (hope all is well)
*Thread Reply:* Trying to build movebaseflex_bt_node
Using these steps:
• $ cd /home/tractor/catkin_ws/src/lawn_tractor_navigation
• $ code .
• Confirm CMakeLists.txt is present and that the statement below is present
◦ add_executable(movebaseflex_bt_node src/mbf_behaviorTree.cpp)
• Confirm mbf_behaviorTree.cpp is present
• In VS Code, select Terminal, New Terminal
• $ mkdir build
• $ cd build/
• $ cmake ..
• $ cmake --build .
I seem to have gotten a clean compile...
[100%] Linking CXX executable devel/lib/lawn_tractor_navigation/movebaseflex_bt_node
[100%] Built target movebaseflex_bt_node
tractor@tractor-HP-ProBook-645-G1:~/catkin_ws/src/lawn_tractor_navigation/build$
*Thread Reply:* No joy, still not finding movebaseflex_bt_node
/
mbf_behavior_tree (lawn_tractor_navigation/movebaseflex_bt_node)
ROS_MASTER_URI=http://localhost:11311
process[mbf_behavior_tree-1]: started with pid [30317]
terminate called after throwing an instance of 'std::runtime_error'
what(): Internal error in realpath(): No such file or directory
[mbf_behavior_tree-1] process has died [pid 30317, exit code -6, cmd /home/tractor/catkin_ws/devel/lib/lawn_tractor_navigation/movebaseflex_bt_node __name:=mbf_behavior_tree __log:=/home/tractor/.ros/log/5a8aa39e-7358-11ed-89f9-492abb57fe04/mbf_behavior_tree-1.log].
log file: /home/tractor/.ros/log/5a8aa39e-7358-11ed-89f9-492abb57fe04/mbf_behavior_tree-1.log
all processes on machine have died, roslaunch will exit
shutting down processing monitor...
... shutting down processing monitor complete
done
tractor@tractor-HP-ProBook-645-G1:~$
but it is actually here:
/home/tractor/catkin_ws/src/lawn_tractor_navigation/build/devel/lib/lawn_tractor_navigation/movebaseflex_bt_node
*Thread Reply:* I'm looking at your steps... I don't know why you created a build directory. I don't recall doing that step in anything that I have done. That might explain why your executable is under an extra "build" directory level.
*Thread Reply:* So instead of doing these steps:
$ mkdir build
$ cd build/
$ cmake ..
$ cmake --build .
I would just switch to your catkin_ws directory and do a catkin_make...
Or maybe I am missing something.
*Thread Reply:* Well, I would do the obvious... Delete the build directory that you created. Then delete catkin_ws/devel and catkin_ws/build. Then switch to catkin_ws and do a catkin_make. See what that gives you.
*Thread Reply:* @JeffS Followed the steps above. Unfortunately same results.
/
mbf_behavior_tree (lawn_tractor_navigation/movebaseflex_bt_node)
ROS_MASTER_URI=http://localhost:11311
process[mbf_behavior_tree-1]: started with pid [38998]
terminate called after throwing an instance of 'std::runtime_error'
what(): Internal error in realpath(): No such file or directory
[mbf_behavior_tree-1] process has died [pid 38998, exit code -6, cmd /home/tractor/catkin_ws/devel/lib/lawn_tractor_navigation/movebaseflex_bt_node __name:=mbf_behavior_tree __log:=/home/tractor/.ros/log/c355d7f6-7379-11ed-89f9-492abb57fe04/mbf_behavior_tree-1.log].
log file: /home/tractor/.ros/log/c355d7f6-7379-11ed-89f9-492abb57fe04/mbf_behavior_tree-1.log
all processes on machine have died, roslaunch will exit
shutting down processing monitor...
... shutting down processing monitor complete
done
tractor@tractor-HP-ProBook-645-G1:~$
I only have one copy now.
Info from today's video meeting.
===========================================
12012022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/DD53JInlFTY
Chat: None
Index: 00:00 Jeff: Gives a tour of his new small robot progress. 10:50 Al: Gives a status update. Points out he can't get his vehicle to move. 13:30 Al: Can't get current stuff to work so he is looking at other navigation packages. 14:05 Jeff: Says maybe he was too hasty in recommending stripping down the navigation stack. 18:45 Leading up to a live debug session... 24:30 Discussion of what RVIZ publishes when you click 2D Nav Goal. 31:30 Some searching to see what topic name move_base_flex wants to see. 39:00 Start of live debugging. 1:01:30 Jeff: Suggests to also check cmd_vel to see if it needs to be renamed.
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos
Index for Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/LawnTractorMeetingNotes.txt
Index for the original ROS Agriculture group (~250 videos): http://sampson-jeff.com/RosAgriculture/readme.txt http://sampson-jeff.com/RosAgriculture/ros-agriculture-youtube20211026.txt
Current problem statement: When I run this https://github.com/jones2126/tractor_laptop_ROS_workspace/blob/master/src/lawn_tractor_sim/launch/mbf.launch
Which launches
<node pkg="lawn_tractor_navigation" name="mbf_behavior_tree" type="movebaseflex_bt_node"
It fails with an error:
process[mbf_behavior_tree-1]: started with pid [38998]
terminate called after throwing an instance of 'std::runtime_error'
what(): Internal error in realpath(): No such file or directory
[mbf_behavior_tree-1] process has died [pid 38998, exit code -6, cmd /home/tractor/catkin_ws/devel/lib/lawn_tractor_navigation/movebaseflex_bt_node __name:=mbf_behavior_tree __log:=/home/tractor/.ros/log/c355d7f6-7379-11ed-89f9-492abb57fe04/mbf_behavior_tree-1.log].
movebaseflex_bt_node exists in
*Thread Reply:* @JeffS Was correct. The executable is being found. I added two ROS_INFO statements. The first one is printed, but the second one is not.
```
int main(int argc, char **argv)
{
    ros::init(argc, argv, "mbf_behaviorTree");
    ros::NodeHandle nh("~");
    ROS_INFO("In main() of mbf_behaviorTree.cpp");
    std::string xml_file;
    std::string pkgpath = ros::package::getPath("lawn_tractor_navigation"); // TODO: make this a param
    std::string filepathprefix = pkgpath + "/config/behavior_tree/";
    nh.param<std::string>("xml_file", xml_file, "movebaseflex_tree.xml");

    std::string completeFilepath = filepathprefix + xml_file;

    BehaviorTreeFactory factory;

    factory.registerNodeType<WaitForGoal>("WaitForGoal");
    factory.registerNodeType<GetPathActionClient>("GetPath");
    factory.registerNodeType<ExePathActionClient>("ExePath");
    factory.registerNodeType<RecoveryActionClient>("Recovery");
    auto tree = factory.createTreeFromFile(completeFilepath);
    ROS_INFO("mbf_behaviorTree.cpp - after createTreeFromFile xml");
    NodeStatus status = NodeStatus::IDLE;
```
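Since the first ROS_INFO prints and the second one never does, the crash is inside createTreeFromFile. BehaviorTree.CPP canonicalizes the XML file path, and "Internal error in realpath(): No such file or directory" is what you would expect if the assembled path doesn't exist on disk. A minimal Python sketch of that failure mode (the package layout and XML file name here are assumptions, not verified values):

```python
from pathlib import Path

# Hypothetical path assembled the same way the node does:
# <package path> + "/config/behavior_tree/" + <xml file>  (names are assumptions)
pkg_path = Path("/home/tractor/catkin_ws/src/lawn_tractor_navigation")
complete_file_path = pkg_path / "config" / "behavior_tree" / "movebaseflex_tree.xml"

try:
    # strict=True behaves like realpath(): it raises if the file doesn't exist
    complete_file_path.resolve(strict=True)
except FileNotFoundError:
    print("behavior tree XML not found:", complete_file_path)
```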
Sharing FYI. Links to Ackermann projects: https://discourse.ros.org/t/nav2-reference-ackermann-robot-simulation/24901
*Thread Reply:* ANY chance you could show a line up of all your bots?
This weeks video meeting was on Friday instead of Thursday. Here is the info.
===========================================
12092022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/vGMxhmObOtA
Chat: Jeff: My older robots: http://www.tcrobots.org/members/jsamp.htm Al: https://www.youtube.com/watch?v=ySlU5CIXUKE
Index: 00:00 Jeff: Gives a second tour of his new small robot progress. 05:25 Bottom view showing servo mounted to axle. Steering control arms. 08:00 Discussion of weight. 10:50 Image for size comparison. (Wrong screen confusion.) 15:05 What about batteries? 17:10 Robot line-ups. 20:15 What GPS to use. uBlox M8T and RTK-Lib? 22:00 Al: Gives a status update. 23:05 Al: Suggests loading a new computer with his tractor_laptop_ROS_workspace and see if it still works. 23:45 Al: Explains his current problem. And adding a map. 33:30 Google Maps vs. Google Earth. 34:05 Discussion/confusion about various origins. 56:20 Origin vs. origin. [Pause] 57:35 Al gives a recap of what we hashed out. 1:00:45 Jeff gives a recap of his interpretation.
Intrinsic acquires ROS maker Open Source Robotics Corp: https://www.therobotreport.com/intrinsic-acquires-ros-maker-open-source-robotics-corp/ The for-profit part of Open Source Robotics Foundation, which is the developer of the Robot Operating System (ROS).
Info for today's video meeting.
===========================================
12162022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/PmTr2Jwofrw
Chat: Al: https://github.com/jones2126/tractor_laptop_ROS_workspace
Index: 00:00 Jeff: Shows the mostly completed steering on his new small robot. 10:45 Al: Starts with a status update. New bag file, RVIZ problems. New battery management system for his LiPo battery. He has pots for real-time control of his PID parameters. He is consolidating his code, pictures and documentation on his github site. 12:30 Al: Shows his development roadmap. 13:20 Al: Shows his RVIZ screen. 13:50 Al: Shows pictures of his electronics on his lawn tractor. 15:25 Al: Goes back to a live RVIZ screen. 20:55 Jeff: Asks about battery monitoring. (with the fancy new BMS) 22:55 Al: Shows pictures of his phone app to show battery monitoring. 30:50 Al: Sums up where he is.
On your TF problem, the three frames listed are supplied from your URDF file. Maybe you are not loading that file, or it had errors.
Or you don't have a connection from odom to base_footprint (which is supplied by gps_odom.py). So that may not be running or it has errors.
Maybe run "view_frames" (rosrun tf view_frames) to see where the TF is broken. Or look at the startup messages in your terminal windows to see what might be complaining.
After Terry talked about using cordless drill batteries (on today's video that I haven't posted yet) I pulled out the two drills I have. They are both 18V, maybe NiMh or NiCad. One was dead, the other had a good charge. I connected the charged one to my giant R/C servo (which wants 12-24V). It runs on that and is pretty snappy. It is not chattering and shutting down like it was on my random lead acid battery.
So I will have to look into buying (or making) an adapter for one or both of those. One battery foolishly has unprotected tabs sticking out, so it would be easy to short it out and watch the flames. But it is also easy to clip on alligator clips. The other one is close to Terry's Milwaukee batteries and I have to shove metal tabs in to get a connection.
*Thread Reply:* oops, pressed Enter when I shouldn't have
Info from today's video meeting. This turned into a massive debug session. Which turned out to be a typo/oversight at 1:06:20.
===========================================
12222022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/ecwIJu_NSH8 Chat: Al: https://github.com/jones2126/tractor_laptop_ROS_workspace/tree/master/project_support_files Al: https://automaticaddison.com
Index: 00:00 Jeff: Shows latest progress on his new steering. 05:55 Al: Starts with a status update. Shows some RVIZ screens to show positions. 08:00 Al: Has modified his geonav software. 10:20 Al: Is looking at using parameters. 11:50 Al: Added PID values to his output screen. 12:10 Al: List of things to do. 12:40 We start looking at Al's current problems. URDF file. 15:15 Jeff: Asks if it will run right now to do real-time debugging. 16:40 We go to a saved RVIZ screen to look at TF errors. 22:25 We decide to pause the video and Al runs out and sets up his lawn tractor. [Pause] Now the GPS is broken. 24:30 Terry joined us. Talks about autonomous snow removal. Discussion of using Milwaukee M18 drill batteries. 31:05 Al: Posts a link to his github for his documentation. 31:50 Al: Was doing experimentation on his GPS driver code (while Terry was talking). We do a lot more digging through code. 37:30 Al: Restates the obvious goal "just get it to move from point A to point B". More digging through code... We try lots of things. 52:05 We finally resort to deleting build and devel directories and doing catkin_make. 53:15 While it is rebuilding we talk about Terry's project some more. Various ways to remove snow. 55:45 Al: Tries to give up again. But we won't let him. 56:45 We go back to reading the actual messages from the GPS to compare. 1:02:50 Al: Mentions various links as a learning source. 1:04:35 We decide to verify that the working program and the non-working program are both talking to the same port. 1:06:20 We find that the two programs are pointing to different ports (which is bad). Just as the catkin_make is finishing. 1:09:30 He fires it up again and the GPS works. So we spend a couple minutes deducing why the port changed. 1:12:00 Al starts up everything and the TF errors are gone. 1:16:30 The tractor isn't where expected. So Al is going to run outside and carry the antenna around. [Pause] 1:17:15 We are back and can see the antenna moving.
1:18:50 Terry: Asks about the accuracy of Al's GPS. 1:21:05 Analysis of why the tractor position has the wrong offset. 1:22:50 Terry: Asks "what is TF?"
@Al Jones It concerns me that in RVIZ your maps and grid and local costmap are not square with the world.
In the panel on the right (Views) there is Angle,X,Y. That angle always shows 0.22 on your screens. Try double-clicking on the parameter for Angle (the 0.22), enter 0, then press Enter. See if that squares up your map on the view. If it does, try saving the RVIZ configuration and then see if it comes up squared next time you start it (or if you normally load a configuration file each time, see if that comes up with a 0 angle then).
@Al Jones Oh, I see on today's meeting video your maps and grid are square to N/S. You must have already figured it out.
just so the URL is nearby for Zoom call started now: https://us02web.zoom.us/j/82088036016?pwd=K2lLc1FiWm9MU0dzRStxM2J2b3dpQT09
I think this resolved my "driving below ground" issue
<joint name="base_footprint_joint" type="fixed">
<parent link="base_footprint"/>
<child link="base_link"/>
<!-- <origin xyz="0.0 0 0.1525"/> -->
<!-- <origin xyz="0.0 0 ${tire_rear_dia}"/> wheels are still off the ground -->
<origin xyz="0.0 0 ${tire_rear_dia / 2}"/>
</joint>
You may have to adjust that on each wheel and when you specify base_footprint to base_link.
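The reasoning behind ${tire_rear_dia / 2}: base_link has to sit one wheel radius above base_footprint (the ground plane), not a full diameter. A quick sanity check, assuming a hypothetical 0.305 m rear tire diameter (an assumption, but it would reproduce the 0.1525 m from the first commented-out attempt):

```python
# base_link must sit one wheel radius above base_footprint (the ground).
tire_rear_dia = 0.305          # hypothetical rear tire diameter in meters (assumption)
z_offset = tire_rear_dia / 2   # the ${tire_rear_dia / 2} value from the joint above
print(z_offset)                # 0.1525, matching the hard-coded first attempt
```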
If you search this: https://duckduckgo.com/?t=ffab&q=what+controls+forward+speed+in+teb+panner&atb=v343-1&ia=web You get various references to adjusting forward velocity and rotational velocity/angle using TEB planner.
Let's try this... Since Slack doesn't work very well any more. I'll see if I can repost the pinned message from General.
Useful links: • Video conference link (Zoom): https://us02web.zoom.us/j/82088036016?pwd=K2lLc1FiWm9MU0dzRStxM2J2b3dpQT09 ◦ (The Zoom video meetings are Thursday at Noon (USA ET); ◦ Saved videos (link); Saved chats (link) • You Tube Video of Meetings: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ • Indexes made by Jeff: http://sampson-jeff.com/RosAgriculture/ • Google drive folder (link); Sheet with current configurations (link); Juan's action list (link); Rough work plan (link); Shared catkin_ws copies (link) • Robot Agriculture: https://github.com/ros-agriculture • Simulator: https://github.com/ros-agriculture/lawn_tractor/blob/master/simulator.md • Discourse: https://discourse.ros.org/c/ros-agriculture/63 • Discourse – Perception: https://discourse.ros.org/t/perception-project/17621 • Archived RosAg Slack content (link) That seems to have worked. Maybe I should consider doing this once a month...
Info from Today's video meeting.
===========================================
12292022 ROS Lawn Tractor Automation meeting This video: https://youtu.be/gDjq6gSq1j8
Chat: Al: https://protonvpn.com/support/how-to-set-up-protonvpn-on-openwrt-routers/ Jeff: rostopic pub -1 /turtle1/command_velocity turtlesim/Velocity -- 2.0 1.8 Jeff: http://wiki.ros.org/ROS/Tutorials/UnderstandingTopics Jeff: rostopic pub [topic] [msg_type] [args] Jeff: https://github.com/jones2126/tractor_laptop_ROS_workspace Terry: https://www.adafruit.com/product/3800
Index: 00:00 Jeff: Gives a demo of the mostly completed steering on his new small robot. 05:15 Jeff: Talks about drill batteries. 10:10 Al: Compares his lithium batteries. 12:20 Al: Talks about Jeff's steering again. Questions about the position sensor in these R/C servos 17:45 Jeff: Talks about possibly adjusting the pot and getting better tracking. 20:15 Al: Shows a simulated vehicle so he can see the cmd_vel that is coming out. 36:50 Al: Reiterates what he thinks he is getting. 40:00 Jeff: Suggests directly publishing a cmd_vel to verify that works. 49:00 Okay, we finally figured out how to publish a cmd_vel. 53:00 We proved the vehicle will drive faster. So the problem seems to be TEB planner parameters. 54:30 Terry: Gives a quick update and asks how to proceed. Discussion of simple motor control. And an Ada Fruit Itsy-Bitsy board. SAMD PWM function. Suggestion of rosserial. 1:02:15 Al: Gives a brief overview of how he does low-level motor control. 1:03:45 Jeff: Says where to find this github. But it also gets posted to chat. 1:04:10 More talk about rosserial. 1:05:00 Jeff: Talks about quadrature decode. 1:05:55 Al: Asks about Terry's Itsy-Bitsy board. 1:10:00 Al: Talks about simulation. Some exploration of his simulation shows up. Attempting to fix the vehicle being below the ground plane. 1:28:30 Some talk about how to create a map.
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos
Index for Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/LawnTractorMeetingNotes.txt
Index for the original ROS Agriculture group (~250 videos): http://sampson-jeff.com/RosAgriculture/readme.txt http://sampson-jeff.com/RosAgriculture/ros-agriculture-youtube20211026.txt
Info from today's video meeting.
===========================================
20230105 ROS Lawn Tractor Automation meeting Length 27:15 This video: https://youtu.be/4Y3q4dTHyOg
Chat: Al: https://www.aliexpress.us/item/2251832568712028.html Al: 180kg.cm Parameters: - Model: XJD-X1 Al: https://servodatabase.com/compare
Index: 00:00 Jeff: Mentions shorter screws and bolts. 00:35 Jeff: Talks about R/C servos, wire ferrules, cable clamps, grounding the frame 02:45 Jeff: Gives a status update on what Terry is doing. He has his dual channel Cytron driver board working with an Ada Fruit ItsyBitsy board. Possibly weird problems with ground wires. 05:10 Jeff: Talks about snow blower concepts. 05:50 Jeff: Is back to indexing ROS Agriculture videos. 200 of 250 completed. 07:20 Al: Talks about file maintenance. 09:50 Al: Talks about workspace maintenance. 13:30 Al: Found out why his internet connection was being rejected. 15:25 Al: Is trying to find old code/videos. We go through how to find the ROS Lawn Tractor Meeting index and the ROS Agriculture meeting index. Turns out the description of creating a path with Dubins curves is what he was looking for.
Just a side note... As I said on today's meeting video, I have an updated ROS Agriculture video index. Before I had 166 videos indexed. Now I have 200 videos indexed. That leaves 50 videos which seem to be averaging about 2 hours per video to create. So I'm not real motivated to jump back in right now to finish them...
*Thread Reply:* Thanks for putting the investment in time you have done so far.
I've made it easier to run the lawn tractor in simulation and published the instructions on github. https://github.com/jones2126/ros1_lawn_tractor_ws
Suggestions welcome. #random
Info from today's video meeting.
===========================================
20230112 ROS Lawn Tractor Automation meeting Length 52:57 This video: https://youtu.be/V3robck4V5c
Chat: Al's github page: https://github.com/jones2126 https://github.com/jones2126/tractor_laptop_ROS_workspace Al's instructions for installing a simulator: https://github.com/jones2126/ros1_lawn_tractor_ws
Index: 00:00 Al: Gives a status update. Modified some dimensions on his simulator model. 01:20 Al: Talks about path generation. Referring to this video: https://youtu.be/F3ycvmWkUl8?t=300 03:40 Al: Talks about his speed problem. 04:25 Discussion of velocity units. 07:20 Al: Talks about his github organization. And more comments on paths. 10:35 Jeff: Shows the crimp-on wire ferrules he mentioned last week. 13:55 Jeff: Holds up a single channel Cytron board. And Terry's grounding issue is discussed. 16:25 Jeff: Random updates on what he did (or did not) do. 17:15 Terry: Talks about the communication protocol for his drill batteries. 24:05 Terry: Talks about controlling his motors with a PID function. 28:40 Terry: Asks if ROS will control accel/decel or do you have to do it yourself. 30:20 Al: Brings up processor speed for quadrature decode. 33:15 Jeff: Again brings up the concept of using a cheap Blue Pill board as a smart peripheral. And a discussion about SAMD21 vs. SAMD51. Because Jeff was confused. 34:45 Terry: Plans to address his motor control. Discussion of the Milwaukee drill battery. 38:40 Al: Shows his battery management software monitor. 40:30 Jeff: Asks about the battery adapter for the drill battery. 43:30 Terry: Shows a voltage convertor he ordered. 44:55 More discussion on decoding the battery protocol. 48:15 Terry: Points out where he is loading software. He is going to put Zener diodes on his voltage dividers. 49:15 Al: Points out the obvious. 49:50 Jeff: Talks again about automatically verifying his R/C servos.
Here is some discussion on the BNO055 IMU: https://groups.google.com/g/hbrobotics/c/3VC2sGo3qDM
@JeffS when do you think you might get to making a video verifying servos?
I don't know. Whenever I get around to it.
I just did a foolish thing and ordered another giant servo: https://www.amazon.com/gp/product/B0B1DX3RFW/ref=ppx_od_dt_b_asin_title_s00?ie=UTF8&psc=1 This new version has the pot on top of the board so it is easier to solder on a feedback wire. But I notice it has a position output pin. Which I assume is the voltage from the pot. (maybe buffered)
I should open a "normal" servo and see if the pot has a feedback voltage, or if it has some weird digital signal...
Looking at the data sheet…it says remote control? Is that only RC control?
Without looking at the data sheet, the previous versions were selectable between R/C pulse width or a voltage input.
Well, you need something to supply R/C pulses or a control voltage. I don't see what that has to do with Bluetooth or WiFi.
I mistook "remote control" to mean more than just R/C, that is, also capable of control over Bluetooth or WiFi somehow. Oops. I suppose one would need to add additional circuits to control it over WiFi.
@Al Jones When I post the video, I made a comment at 1:48:06. As I say there, it looks like the global path is constantly replanning, so it may confuse the local planner. As an experiment, go into nav_newteb.launch and change the planner frequency from 5.0 to 0.0 and see if that makes a difference.
I have finished uploading and indexing the video but I have a few more steps to go...
Info from today's video meeting. This one got quite long...
===========================================
20230119 ROS Lawn Tractor Automation meeting Length 2:27:36 This video: https://youtu.be/pina3Rx4kMI
Chat: Articulated Robotics: https://www.youtube.com/@ArticulatedRobotics Robot Shop of Horrors: http://sampson-jeff.com/rsoh/ Twin Cities Robotics meetings: http://www.tcrobots.org/mtgpics/picindex.htm Jeff's old member page: http://www.tcrobots.org/members/jsamp.htm
Index: 00:00 Jeff: Shows his dual motor drive train. 10:20 Jeff: Talks about selecting a microcontroller board. 20:50 Jeff: Talks about the latest version of the giant servo he ordered. And general R/C servo testing. 24:20 Jeff: Talks about some optoisolators he ordered. 30:30 Jeff: Stories of Robot Shop of Horrors and Twin Cities Robotics meetings. 32:10 Jeff: Servo testing, buying parts. 36:10 Terry: Talks about Articulated Robotics channel. 38:35 Terry: Gives an update on his motor control progress. 41:55 Al: Gives a quick demo of his simulated vehicle to show the speed control problem. [Warning, we spend the next 1-1/2 hours trying to figure out what the problem is.] 53:45 We fire up dynamic_reconfigure to view and change parameters on the fly. We verify that we can view and change values. We can save all parameters and load them back in. 1:13:30 We come to the conclusion that base_local_planner parameters are probably not used at all. 1:16:30 We print out the loop rate that the planner is running at. 1:25:05 The question is asked if the forward velocity is reducing if the vehicle veers right and left. Yes, it looks like that is a problem. 1:40:40 Jeff's package just showed up. 1:44:10 Maybe the (modified) cmd_vel to Ackermann translator is changing values? 1:45:15 Terry leaves (so he knows where to pick this up if he wants to) 1:48:06 So in re-watching the video, I see at this point that I can see the global plan keeps replanning. That pulls the path out from underneath and hands it a new one. That could confuse it. You may be able to disable the global replanning. I think it is either a TRUE/FALSE in the global planner parameters. Or it is a global replan rate in the same place. 1:52:50 The concept of switching from Al's lawn tractor back to Jeff's mini-tractor comes up. 2:12:50 move_base_flex? 2:15:25 We go look at parameters after switching vehicles. 2:15:50 It just occurred to me that the TEB local planner needs to know wheelbase.
So it is probably defined in the teb local planner yaml file. So that is another place to change values if switching between vehicles. 2:20:00 Discussion of how to get all these random variables to match up automatically. 2:23:10 Jeff: Shows his new ASMC-04B servo he just got in the mail.
I was confused in the meeting today. When Terry mentioned Articulated Robotics, I was thinking of this guy: https://www.youtube.com/@automaticaddison9709/videos I was quoting Articulated Robotics in this meeting: 03032022 ROS Lawn Tractor Automation meeting https://youtu.be/wn3YTXuOhdc
I got a ASMC-04B servo last Thursday. I assumed the "B" version was newer. But of course, I was wrong. The "A" and "B" version are different speed/torque versions:
ASMC-04A 110kg/cm - 0.12sec/60deg - $53 with no "servo plate" ASMC-04B 180kg/cm - 0.5sec/60deg - $57 with "servo plate"
So the "A" version turns 4 times faster but has lower torque. I got a ASMC-04A today. They both track quite well (just playing with a servo tester). The speed difference is quite noticeable.
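Sanity-checking the "4 times faster" claim against the transit times from the listings above:

```python
# Transit times for 60 degrees, from the Amazon listings quoted above
asmc_04a = 0.12   # seconds per 60 deg (the fast "A" version)
asmc_04b = 0.50   # seconds per 60 deg (the high-torque "B" version)

speed_ratio = asmc_04b / asmc_04a
print(round(speed_ratio, 2))   # 4.17, i.e. the "A" is roughly 4x faster
```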
Also, the ASMC-04 has a position output pin. So you don't have to solder a wire onto the pot to get the position voltage.
Info from today's video meeting:
===========================================
20230126 ROS Lawn Tractor Automation meeting Length 47:53 This video: https://youtu.be/xbPi8kgM8h4
Chat: Jeff: Giant R/C servos ordered from Amazon ASMC-04A 110kg/cm - 0.12sec/60deg - $53 with no "servo plate" ASMC-04B 180kg/cm - 0.5sec/60deg - $57 with "servo plate" Jeff: https://github.com/jones2126/tractor_laptop_ROS_workspace/blob/master/project_support_files/embedded_programs/tractor_control_2022/src/main.cpp Jeff: https://github.com/jones2126/ros1_lawn_tractor_ws/blob/master/project_notes/vehicle_dimensions_helper.ods Jeff: https://groups.google.com/g/hbrobotics
Index: 00:00 Jeff: Gives a demo of his giant R/C servos moving and talks about them. 06:50 Terry: Talks about PID libraries for Arduino-type boards. Interrupts and volatile variables. 09:25 Al: Says he has a PID example in his github. We post a link in chat to Al's code. 14:30 Al: Presents his "Vehicle Dimension Helper" file. Footprint definition. Talks about how to create the footprint. We post a link in chat to Al's spreadsheet. 22:30 Al: Mentions a Zoom meeting he listened in on. 23:45 Talk about all the various file locations for numbers. 25:10 We track down Al's spreadsheet on his github. 35:10 We track down the files that have the PID code. 40:45 Mention of how remote control ties into this. 43:30 Jeff: Asks what this is used for. It is the steering control. Some discussion of which sensors are being used for steering.
I posted this issue on ROS Answers today. Any advice is welcome.
Info from today's video meeting.
===========================================
20230202 ROS Lawn Tractor Automation meeting Length 1:01:33 This video: https://youtu.be/JTdYo8IAF5Q
Chat: Jeff: Giant R/C servos ordered from Amazon ASMC-04A 110kg/cm - 0.12sec/60deg - $53 with no "servo plate"
Jeff: Code change in odom.py to fix velocity. Was:
    odom.child_frame_id = 'base_footprint'
    odom.pose.pose = result.pose
    # odom.twist.twist = result.twist
Now:
    odom.child_frame_id = 'base_footprint'
    odom.pose.pose = result.pose
    # odom.twist.twist = result.twist
    x = float(result.twist.linear.x)
    y = float(result.twist.linear.y)
    odom.twist.twist.linear.x = float(sqrt(x ** 2 + y ** 2))
    # odom.twist.twist.linear.z = float(atan2(y, x))
    odom.twist.twist.angular.z = result.twist.angular.z
And you have to add this at the top somewhere:
    from math import atan2, sqrt
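The speed part of the fix above can be sanity-checked off-robot; a minimal sketch (mine, not from odom.py) using math.hypot, which is equivalent to sqrt(x**2 + y**2):

```python
from math import hypot

def planar_speed(vx, vy):
    """Magnitude of the planar velocity vector, as computed in the
    odom.twist fix. hypot(x, y) == sqrt(x**2 + y**2)."""
    return hypot(vx, vy)

# e.g. planar_speed(3.0, 4.0) -> 5.0
```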
Jeff: Thread on HBRC about logic analyzers: https://groups.google.com/g/hbrobotics/c/vgeu9af-vUw/m/CtUhESNuCgAJ
A reference to the analyzers on eBay: https://www.ebay.com/sch/i.html?_from=R40&_trksid=p2510209.m570.l1313&_nkw=USB-SALEAE-24M-8CH-Logic-Analyzer-24M-8-Channel-with-Buffer-Support&_sacat=0
Big selection from Amazon: https://www.amazon.com/s?k=logic+analyzer&crid=1F5WHD0NPYMI5&sprefix=logic+%2Caps%2C151&ref=nb_sb_ss_ts-doa-p_10_6
Index: 00:00 Jeff: Gives a demo of his new fast giant R/C servos on his robot. 03:05 Jeff: Says he played with the simulator package to fix the speed. Describes the problem and possible fix. Demo coming up later. 06:45 Terry: Is working on his PID code. Shows some logic analyzer plots. Battery monitoring with A/D. Starting with ROS. Vacation plans. 11:30 Al: Random comments on code and PID. 12:15 Al: Skipped over the speed problems and is working with Move Base Flex and waypoints. 16:00 Al: Demo of what he is working on. 18:10 A question comes up about driving backwards. 22:10 More comments on purposely driving backwards. 24:40 Jeff: Gives an explanation for his changes. 29:30 Jeff: Starts a demo. 46:45 Al: Asks about Terry's logic analyzer. Terry describes his USBee-SX. 57:50 Some discussion on the current Gazebo simulation model and switching different models.
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos
Index for Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/LawnTractorMeetingNotes.txt
Index for the original ROS Agriculture group (~250 videos): http://sampson-jeff.com/RosAgriculture/readme.txt http://sampson-jeff.com/RosAgriculture/ros-agriculture-youtube20230104.txt
Just a side note... I looked out the front door and decided I can't run my robot in my front yard. I was just playing with my simulated 4-wheel Ackermann vehicle. So I started up the simulator and published an Ackermann message with 0.5 m/s speed and 0.25 radian steer angle. The vehicle drives around in a circle. The Ackermann command is at the top. Gazebo ground path shows it driving in a 4 meter diameter circle. The window on the right is the /odom topic created from Gazebo. The big window on the left is the /imu/data topic. The small window in the lower left is the /gps0/fix_velocity topic. The rotational velocity from /odom and /imu are showing 0.25 rad/s. The forward velocity is showing 0.5 m/s. The /gps is showing the published velocity vector. (I can extract velocity and heading directly from that. Then create rotational velocity from heading.) All of the numbers seem appropriate all the way through. So this should be valid to experiment with velocity and steering.
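The numbers above are self-consistent: for steady circular motion the turn radius is R = v / omega, so the diameter is 2v / omega. A quick check in plain Python (not part of the simulator):

```python
def circle_diameter(speed, yaw_rate):
    """Diameter of the steady-state circle driven by a vehicle
    moving at `speed` (m/s) with yaw rate `yaw_rate` (rad/s).
    R = v / omega, D = 2R."""
    return 2.0 * speed / yaw_rate

# 0.5 m/s at 0.25 rad/s -> 4.0 m diameter, matching the Gazebo ground path
```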
That's some serious snow. Thanks for doing the testing. It's really insightful. I'm trying to run a Dubins path using teb_planner and starting with trying to run a circle using this program. Struggling a bit. Trying to work through it.
Trying to tune my teb_planner. Using this script to run a consistent mission each time. At the moment it is a square. Performance is really bad. Lots of backing up. Still at it....https://github.com/jones2126/ros1_lawn_tractor_ws/blob/master/src/ackermann_vehicle/nodes/driving_square_example.py
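A square mission like this boils down to a closed list of corner waypoints. A hedged sketch of the idea (function name and layout are mine, not taken from driving_square_example.py):

```python
def square_waypoints(side, origin=(0.0, 0.0)):
    """Corners of a square mission starting at `origin`,
    ending back where it started so the loop closes."""
    x0, y0 = origin
    return [(x0, y0),
            (x0 + side, y0),
            (x0 + side, y0 + side),
            (x0, y0 + side),
            (x0, y0)]
```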
Link to screencast showing simulated robot running this mission https://github.com/jones2126/ros1_lawn_tractor_ws/blob/master/src/ackermann_vehicle/nodes/driving_star_example.py Although it's better, I'm wondering if there is a way for the speed to be more consistent (i.e. eliminate slowing down to a crawl as the robot gets closer to a waypoint).
Zoom meeting (Thursdays at Noon (ET)) <https://us02web.zoom.us/j/82088036016?pwd=K2lLc1FiWm9MU0dzRStxM2J2b3dpQT09#success>
This is the Dubins-RRT global planner I found. I don't know anything about it, but has nice pictures. https://github.com/jhan15/dubins_path_planning/
Info from this week's video meeting.
===========================================
20230209 ROS Lawn Tractor Automation meeting This video: https://youtu.be/rMfrhUc0AgY
Chat: Al: https://fritzing.org/ Jeff: https://fritzing.org/download/ Al: https://github.com/adafruit/Fritzing-Library/blob/master/parts/Adafruit%20ItsyBitsy%20M4.fzpz Terry: https://www.reddit.com/r/MilwaukeeTool/comments/108d3j9/m18_fuel_battery_protocol/ Jeff: Set GPS to ideal to closer represent RTK: See note below... Al: https://ieeexplore.ieee.org/document/8918671 Al: Path Planning for Mobile Robot using Dubins-curve based RRT Algorithm with Differential Constraints https://github.com/FelicienC/RRT-Dubins Jeff: The global Dubins-RRT planner I found: https://github.com/jhan15/dubins_path_planning/ (Turns out this is not made for ROS)
Note: YouTube doesn't like the greater-than and less-than characters in the description. So I deleted the URDF GPS noise changes from chat. See the changes @43:00 in video.
Index: 00:00 Terry: Mentions his PID. He is loading ROS on a laptop. He found more information on Milwaukee drill batteries. He burned up his Itsy-Bitsy board. 02:00 Terry: Talks about power distribution. He is using Eagle to create schematics for documentation. 04:30 Jeff: Suggests looking at Fritzing for creating wiring diagrams. 05:20 Terry: Found some information on the drill battery data protocol. 09:50 Al: Points out you can also create PC boards from Fritzing. And prebuilt libraries are available. 10:45 Terry: Posts a link to the drill battery thread on Reddit. 11:10 Al: Talks about orientations in Gazebo and RVIZ. 15:30 Jeff: Questions sensor placement. 21:15 Al: Has a video of path following. 23:50 Al: Found an implementation for Pure Pursuit. 24:55 Al: Shows a small computer he found. 30:25 Al: Asks for suggestions on speed at waypoints. Jeff suggests turning on odom markers when driving around in RVIZ. 30:55 Discussion on Pure Pursuit and where it has been used. 35:15 Discussion of speed at waypoints. 41:05 Jeff: Talks about removing the "noise" from the simulated GPS. 49:55 Jeff: Mentions quality of messages from Gazebo. 52:50 More discussion about reverse engineering drill batteries. 56:20 Reminder that the method in ROS is not sacred. You can replace the navigation stack completely. You may not want to plan around obstacles. 57:45 Mention of Dubins-RRT planner.
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos
Index for Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/LawnTractorMeetingNotes.txt
Index for the original ROS Agriculture group (~250 videos): http://sampson-jeff.com/RosAgriculture/readme.txt http://sampson-jeff.com/RosAgriculture/ros-agriculture-youtube20211026.txt
I posted this over at ROS Ag Discourse. Just trying to get some feedback. https://discourse.ros.org/t/ackermann-steering-bot-using-teb-planner-mbf-and-dubins-path-slows-down-at-each-goal-how-to-avoid-that/29731
Back in 2019 we had Dr. El Houssein Chouaib Harik speak to us from https://precisionag.no/ I sent an email over there this morning to see if anyone will take a call to talk again about their ROS experience. Thanks @JeffS for the video index. It helped me find https://www.youtube.com/watch?v=zOgJNwGXBSg
Video link of running this simple Dubins path (link) in simulation. Not terribly efficient, but it's a start.
Info from today's video meeting.
===========================================
20230216 ROS Lawn Tractor Automation meeting Length 1:02:30 This video: https://youtu.be/YMndWJp7Pls
Chat: Path to Al's stuff. https://github.com/jones2126/ros1_lawn_tractor_ws/tree/master/src/ackermann_vehicle/nodes/missions Jeff: Path to saved workspaces on Google drive. https://drive.google.com/drive/folders/1t2P2sfiranPe10fmUBUnnVvKNgrfybAV?usp=sharing Al: Some reference to path name. python script sub/pub
rospy.Subscriber('got_path', Float64, path_callback)
path_pub = rospy.Publisher('/drive_path', Path, queue_size=10)
https://drive.google.com/file/d/1oHgs4eQg1ZwQ_JoquGgf-rm78N-NoccJ/view?usp=share_link Jeff: Link to YouTube video about using path_generator.py. 03052021 Lawn Tractor Meeting https://www.youtube.com/watch?v=F3ycvmWkUl8 Al: Instructions on using the path generator from above. https://docs.google.com/document/d/17bPVy9IUBp3xqllMGpGTsoqm8cyIUm37bdLj_oB9NDE/edit Jeff: Link to the F1tenth Pure Pursuit https://github.com/f1tenth-dev/pure_pursuit
Index: 00:00 Al: Talks about his progress with path following. Shows a video. Explains some code. 09:50 Jeff: Talks about path planning and suggests that MoveBaseFlex/TEBplanner may never give adequate results. Mentions Pure Pursuit. 11:20 Jeff: Brings up Matt and Juan's Pure Pursuit experiments. 12:00 Jeff: Goes searching for Juan's original source code. Found the Pure Pursuit package. We look at: Pure Pursuit, gps_heading.py, path_generator.py, path_publisher.py 22:40 Jeff: Incorrectly opens path_generator.py when he should have opened path_publisher.py. But no one noticed... 24:25 Jeff: Tracks down the YouTube video where we were explaining some variation of path_generator.py. 28:50 Look at launch files. 34:50 We go back and look at Pure Pursuit code again. 39:40 Jeff: Goes looking for the Pure Pursuit code from last week. 43:15 Finally found the repository. 46:43 Finally found the Pure Pursuit controller code. 50:00 We go back and look at path_publisher from Juan's repository. 1:01:25 Al: Says he is going to spend some time on his remote control.
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos
Index for Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/LawnTractorMeetingNotes.txt
Index for the original ROS Agriculture group (~250 videos): http://sampson-jeff.com/RosAgriculture/readme.txt http://sampson-jeff.com/RosAgriculture/ros-agriculture-youtube20230104.txt
This is why you need an e-Stop. And make sure everybody knows how to use it. (Russian robot clobbering a reporter.) https://twitter.com/i/status/1626877928444231680
Info from today's video meeting.
===========================================
20230223 Lawn Tractor Automation meeting Length 2:00:26 This video: https://youtu.be/3fwYPfUF6-o
Chat: Al: https://github.com/jones2126/lawn_tractor_embedded_code Story about Alan's scooter project: http://www.tcrobots.org/members/akili.htm#robot3 Search for "28 May 1999" And note the phrase "Arghhhhhhhh. I look at the switch, and I can see really pretty green explosions going on INSIDE the switch that is lighting up all the way through the switch's plastic parts." These old accounts of experiments are hilarious. That's why it was called "Robot Shop of Horrors"! Al: Jeff, is this the package you used? http://wiki.ros.org/purepursuit_planner Jeff: == Matt's demo of his Pure Pursuit == Quick mention Lawn Tractor Meeting 08232019 @56:00 https://youtu.be/zOgJNwGXBSg?t=3359 Lawn tractor meeting 11062020 @29:50 https://youtu.be/lq0xQs77Mmc?t=1787 Lawn Tractor Meeting-v4zdeQC9RCg @12:16 https://youtu.be/v4zdeQC9RCg?t=736
Index: 00:00 Terry: Got replacements for his Itsy-Bitsy board he blew out. He talks about wiring and relays. 02:05 Mechanical relays or solid state relays? Story of abusing switches. 06:20 When do you have to worry about your robots killing people? 07:30 Terry is working on wiring. 07:45 Terry: Talks about drilling stainless steel. And next steps. 10:10 Jeff: Mentions the Milwaukee drill battery project. 12:55 Al: GitHub - speed, physical implementation, tuning. Organizing embedded software. Remote control. ESP32 communication. 16:25 Al: chatGPT 18:25 Terry: Asks where the 0.73 comes from? 20:05 Differences between two-wheel differential drive and 4-wheel differential drive. 21:15 Jeff: Presentation on Pure Pursuit control. (For the next 1-1/2 hours) Hacking of existing code. 36:00 Terry: Asks if you can also control external functions. 40:35 Running multiple planners in parallel. 41:30 Creating actual scenarios to create path planning methods. 43:55 Creating maps to represent the real world. More description of Pure Pursuit. 53:55 Cross track error. 57:35 Starting in the proper location. 59:40 cmd_vel, ackermann_vel 1:02:05 Does Pure Pursuit currently control speed? 1:03:40 Demo time... 1:37:10 Running Pure Pursuit and Move Base at the same time. 1:42:55 Terry: Asks if rviz can plot the area covered by his snow plow. 1:45:10 Al: Asks what the launch file for Pure Pursuit looks like. 1:51:15 Jeff: Says this is how you fix the steering angle that he screwed up. The comment is how to fix the "right_steering_joint", but you also have to fix "left_steering_joint". 1:53:30 Al: Asks about "/got_path", Jeff says "throw it out". 1:56:40 Simulated vehicle vs. real vehicle. 1:57:40 ROS2 NAV2?
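For reference, the core of the Pure Pursuit steering law discussed in the presentation is only a few lines. This is a generic bicycle-model sketch (mine, not the code from the f1tenth package or Juan's repository):

```python
from math import atan2, sin, hypot

def pure_pursuit_steer(goal_x, goal_y, wheelbase):
    """Steering angle toward a lookahead point given in the vehicle
    frame (x forward, y left), for a bicycle/Ackermann model."""
    lookahead = hypot(goal_x, goal_y)   # distance to the lookahead point
    alpha = atan2(goal_y, goal_x)       # bearing to the lookahead point
    return atan2(2.0 * wheelbase * sin(alpha), lookahead)
```

A point straight ahead gives zero steering; a point to the left gives a positive (left) angle.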
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos
Index for Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/LawnTractorMeetingNotes.txt
Index for the original ROS Agriculture group (~250 videos): http://sampson-jeff.com/RosAgriculture/readme.txt http://sampson-jeff.com/RosAgriculture/ros-agriculture-youtube20230104.txt
Pretty cool to watch them progress.....https://www.farm-equipment.com/articles/21047-sabanto-releases-first-autonomy-kit-for-kubota-m5-tractors
The message Al just posted has a side link. This is troubling. I guess all lawn equipment will have to switch to diesel engines (because they are not spark ignition). Because an uninformed person funded by money from anti-gasoline lobbies told this person to do so... https://www.farm-equipment.com/articles/21137-minnesota-to-consider-ban-on-gas-powered-lawn-equipment
Trying to get rosserial to work on my TTGO ESP32 board. Hoping the solution is similar to these folks. https://github.com/espressif/arduino-esp32/issues/4807
*Thread Reply:* Unfortunately no luck so far. I made this post...https://community.lilygo.cc/topic/179/lora32-v1-0-esp32-development-board-continually-rebooting-when-using-ros
This video is quite informative for doing auto steer. https://www.youtube.com/watch?v=kEL2vY1CA8s
* REPOST *
Useful links: (The pinned message from the General channel.) • Video conference link (Zoom): https://us02web.zoom.us/j/82088036016?pwd=K2lLc1FiWm9MU0dzRStxM2J2b3dpQT09 ◦ (The Zoom video meetings are Thursday at Noon (USA ET); ◦ Saved videos (link); Saved chats (link) • YouTube Video of Meetings: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ • Indexes made by Jeff: http://sampson-jeff.com/RosAgriculture/ • Google drive folder (link); Sheet with current configurations (link); Juan's action list (link); Rough work plan (link); Shared catkin_ws copies (link) • Robot Agriculture: https://github.com/ros-agriculture • Simulator: https://github.com/ros-agriculture/lawn_tractor/blob/master/simulator.md • Discourse: https://discourse.ros.org/c/ros-agriculture/63 • Discourse – Perception: https://discourse.ros.org/t/perception-project/17621 • Archived RosAg Slack content (link)
I asked ChatGPT for a response to the question I posted on ROS Discourse. You can read the response here: https://discourse.ros.org/t/ackermann-steering-bot-using-teb-planner-mbf-and-dubins-path-slows-down-at-each-goal-how-to-avoid-that/29731/2
Info from today's video meeting.
===========================================
20230302 Lawn Tractor Automation meeting Length 46:47 This video: https://youtu.be/Rsmwa3C2OAc
Chat: Al: https://chat.openai.com/chat
Index: 00:00 Terry: Is again having trouble drilling his stainless steel panel. 02:05 Terry: Wiring plans. 03:15 More about drilling stainless steel and tempered steel. 06:00 Al: Talks about his embedded code. ROS Serial. 10:20 Jeff: A quick way to show path coverage in RVIZ. 22:10 Jeff: Talks about cross track error. 29:00 Jeff: Talks about choosing a processor board (yet again). Wiring methods. 35:55 Jeff: Comments about using a Blue Pill board as an Arduino. 36:50 Using uart communication between low level boards. 43:25 Discussion about remote control.
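Cross track error (22:10 in the index) is just the signed lateral distance from the robot to the current path segment. A minimal sketch of the calculation (mine, not the code shown in the meeting):

```python
from math import hypot

def cross_track_error(px, py, ax, ay, bx, by):
    """Signed lateral distance from point P to the line through A->B.
    Positive when P is to the left of the direction of travel."""
    dx, dy = bx - ax, by - ay
    # 2D cross product of AB with AP, normalized by |AB|
    return (dx * (py - ay) - dy * (px - ax)) / hypot(dx, dy)
```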
===============
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos
Index for Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/LawnTractorMeetingNotes.txt
Index for the original ROS Agriculture group (~250 videos): http://sampson-jeff.com/RosAgriculture/readme.txt http://sampson-jeff.com/RosAgriculture/ros-agriculture-youtube20230104.txt (or newest file)
Just a reminder, meeting in 58 mins.
Zoom meeting (Thursdays at Noon (ET)) <https://us02web.zoom.us/j/82088036016?pwd=K2lLc1FiWm9MU0dzRStxM2J2b3dpQT09#success>
Info from today's video meeting.
===========================================
20230309 Lawn Tractor Automation meeting - Length 1:32:06 This video: https://youtu.be/D0T9628aaZ8
Chat: Jeff: Ubuntu Mate 20.04 Raspberry Pi 3B (This wasn't obvious, so I'm not going to make any recommendations.) Al: Prebuilt Raspberry Pi image notes. https://docs.google.com/document/d/1QFE9bjuWu2TFPzNrz0wPjaYWnLHPU8j6e5EXUDqmovQ/edit Jeff: Ubiquity images : https://learn.ubiquityrobotics.com/noetic_pi_image_downloads LinoRobot code: https://github.com/linorobot/linorobot/wiki Presentation: https://www.youtube.com/watch?v=Hj7m2xwlhWY Al: sample udev rules https://github.com/jones2126/lawn_tractor_embedded_code/blob/main/05-serial.rules Jeff: Linux command to show secrets... dmesg OR dmesg --follow OR dmesg -w Jeff: The quadrature decode example for STM32. https://github.com/chrisalbertson/quadratureBluePill Al: $65 Cortex M4 https://www.thanksbuyer.com/stm32-development-board-cortex-m4-small-system-board-stm32f429igt6-core-board-61091 Al: Roger Clark site https://www.rogerclark.net/ Jeff: Other Roger Clark / STM32duino links: https://github.com/rogerclarkmelbourne https://www.stm32duino.com/ https://github.com/stm32duino/Arduino_Core_STM32/wiki/Upload-methods#stm32cubeprogrammer https://github.com/rogerclarkmelbourne/Arduino_STM32/wiki/Bootloader
Index: 00:00 Terry: Battery mounting. Wiring - switches, power converters, fuse block. 02:10 Terry: Loading ROS on Raspberry Pi. Desktop vs. server. Ubiquity image? LinoRobot image? Discussion on installing different images. 09:45 Terry: Talks about his ROS to microcontroller communication method. ROS Serial or hand built protocol. UART vs. USB. 15:40 Terry: Wants some way to identify which port is plugged in. We discuss udev rules. 23:55 Jeff: Talks more about serial interfaces. 29:00 Terry: Asks how to power the boards. 33:45 Back to serial discussion. 36:30 Terry: Asks which diode to use. 38:10 Jeff: Mentions bootloaders. Then talks about a quadrature decode example. Will not compile on Windows. Loaded Arduino and an STM32 core on Linux. That compiles. Long story of loading drivers to get the STM32 ST-Link module to work. 42:00 Jeff: Goes back to Windows machine to fix it. Replacing the STM32 core fixed it. But ST-Link module no longer works. 43:05 Jeff: "Now I can compile and upload under Arduino on Linux." 43:15 Jeff: Back to bootloaders. 46:00 Back to the quadrature story. 47:05 Back to bootloaders again. 50:50 The ultimate goal of total in-circuit programming. 51:25 Down the stm32duino rabbit hole. 56:20 Other than that, Jeff hasn't been doing anything, except thinking about wiring up a physical board. Can be compiled and programmed with either Arduino IDE or ST Micro IDE. 59:15 Al: Asks for confirmation that this is all figured out. 1:05:00 Jeff: Talks about difference between ROS Serial and hand built protocol. 1:14:40 Al: Goes over changes he is making. Messages, servo resolution, mode switch, test driving, PID settings. Next steps. 1:17:15 Al: Shows his current board and goes through code changes. Changing from ESP32Servo.h to ledc.h to improve servo resolution. Other changes and description.
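For the udev discussion at 15:40: a rules file can pin a USB serial adapter to a fixed name so launch files don't depend on enumeration order. The vendor/product IDs below are placeholders; read the real ones from `dmesg` or `udevadm info` (Al's 05-serial.rules link above has working examples):

```
# placeholder IDs - substitute the values reported for your adapter
SUBSYSTEM=="tty", ATTRS{idVendor}=="1a86", ATTRS{idProduct}=="7523", SYMLINK+="steer_mcu"
```

Reload with `sudo udevadm control --reload-rules && sudo udevadm trigger`, then the device also appears as /dev/steer_mcu.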
===============
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos
Index for Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/LawnTractorMeetingNotes.txt
Index for the original ROS Agriculture group (~250 videos): http://sampson-jeff.com/RosAgriculture/readme.txt http://sampson-jeff.com/RosAgriculture/ros-agriculture-youtube20230104.txt (or newest file)
I have been playing with the Blue Pill STM32F103C8 boards for Arduino for about a week and a half. I can't recommend them (yet).
In the last couple of days I have gotten an HID bootloader working that appears to be stable (even without the infamous "change the resistor value" fix). I wasted a lot of time on the Roger Clark STM32duino bootloader; I haven't gotten it to download even once.
Depending which "Arduino Core" you run, they act differently. The "official" STM Arduino Core (which is supported by ST Micro) appears to use HAL libraries (Hardware Abstraction Layer - which people refer to as "big and bloated"). The Roger Clark STM Arduino Core (and derivatives) uses something else (CMSIS libraries?) These are different ways ST Micro has developed over the years to "make programming easier". But if you find sample code then it may or may not work with the different "core" types.
I had found a hardware quadrature encoder example that will compile with the Roger Clark flavor of "core", but not on the ST Micro flavor of "core". After digging through the forums I found an example of hardware quadrature decode (in HAL?) that compiles and works with the ST Micro flavor. So I have something to proceed with.
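The STM32 timer does this decode in silicon, but the logic itself is easy to state. A software sketch of the Gray-code transition table (illustration only, not the STM32 example code):

```python
# Encoder state is (A << 1) | B; adjacent valid states differ by one bit.
_STEP = {
    (0, 1): +1, (1, 3): +1, (3, 2): +1, (2, 0): +1,   # forward: 00->01->11->10
    (1, 0): -1, (3, 1): -1, (2, 3): -1, (0, 2): -1,   # reverse
}

def quadrature_count(samples):
    """Accumulate position from a sequence of (A, B) pin samples."""
    count = 0
    prev = (samples[0][0] << 1) | samples[0][1]
    for a, b in samples[1:]:
        cur = (a << 1) | b
        if cur != prev:
            count += _STEP.get((prev, cur), 0)  # illegal two-bit jumps ignored
            prev = cur
    return count
```

One full forward cycle of both channels produces four counts, which is why these are called 4x decoders.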
If you don't do "special stuff" that requires direct register access, then either version of the "core" should work for "standard" Arduino code. But if you need to play with registers then you have to pick one or the other.
So in my frustration, I ordered 3 Black Pill STM32F401 boards on Saturday (@$5.30 each). They showed up today. I can't talk to them because it says I have to load the STM32CubeProgrammer package (as opposed to the ST-Link binaries that work with the STM32F103). So I have to figure that out next.
But the Black Pill boards have 4 times the code space (256KB) and three times the RAM space (64KB) and have a built-in DFU USB bootloader in addition to the serial bootloader. So I will let you know how that works out after I get the STM32CubeProgrammer loaded.
The encoder code I have compiled for the Blue Pill is taking about 24K out of 62K (64K total - 2k bootloader). So the extra code space on the Black Pill may be needed.
In case you can't tell by my comments... I am quite annoyed at this point.
So I got the STM32CubeProgrammer loaded. It would have been handy if they had usable documentation. But a little guessing got me running.
It works in both ST-Link SWD mode and in USB DFU mode. You have to change a switch to put it into DFU mode.
The blink program seems to coincidentally take 24KB of space but the new chip has 256KB total with no space required for a user bootloader. Both download modes are quite fast. The SWD is about 0.75 seconds and the DFU is about 1.018 seconds.
Now I will see if I can get the encoder to run on the new board.
I'm much happier now. I just now got the first encoder readings out of my Black Pill STM32F401 board. I built the code with the STM32CubeIDE instead of Arduino. The final thing that made it work was to slow down the clocks for the board and now it works. I still haven't gotten it to work with the STM32ArduinoCore. I will have to dig through their clock setup and see if I can modify it.
Just a reminder, meeting in 56 mins.
Zoom meeting (Thursdays at Noon (ET)) <https://us02web.zoom.us/j/82088036016?pwd=K2lLc1FiWm9MU0dzRStxM2J2b3dpQT09#success>
Info from today's video meeting.
===========================================
20230316 Lawn Tractor Automation meeting - Length 1:39:53 This video: https://youtu.be/LavOitWonAo
Chat: Al: GitHub initial use instructions (rough) https://docs.google.com/document/d/1SSOxppbL0xE6TJTNi88JQj_6uKqhQ5eyVVkR58dwTBE/edit?usp=sharing
Index: 00:00 Al: Talks about speed control. Speed sensing, PID control. 03:20 Al: Playback of speed control testing. 06:25 Al: Shows the code he is using. 07:55 Al: Next steps. 08:25 Terry: Asks about one wheel sensor vs. two. 09:15 Al: Goes over how his encoders work. 12:45 Terry: Talks about his progress. Battery mounting, wiring. 15:25 Terry: Loading software on RPi and laptop. 16:00 Terry: Next steps. Battery monitoring, RPi to laptop communication, mount circuitry on robot. VNC server on RPi. Start going through ROS tutorials. 19:30 Bump sensors and IR distance sensors. 21:40 A discussion about GitHub. 25:15 Al: Shows a quick demo of VS Code integration with GitHub. 34:40 Jeff: Launches into his experiments/progress on Blue Pill and Black Pill boards. Blue Pill with Arduino and bootloader. Differences in Arduino cores. 40:45 Gave up on DFU bootloader and switched to an HID bootloader that seems to work. 43:00 Trying to get encoder code to run. 44:55 Out of frustration, the Black Pill boards were ordered. Some description of Black Pill. Differences between programming the Blue Pill and Black Pill boards. 47:50 Black Pill board will not run the encoder code from the Blue Pill. 49:00 Installed STM32CubeIDE to attack the problem a different way. Figured out how to get the USB port to act like a CDC serial port. Encoder and USB working for STM32F103, but not on STM32F401. 50:40 Assuming the clocks are wrong, played with clock configuration. 55:30 Encoder now working on Black Pill STM32F401. Comparison of usage and programming. 58:45 Standard Arduino functions vs. non-standard functions. 59:30 What next? 1:00:00 Physical prototyping and development. Which peripherals is Arduino using? What peripherals do I need to make my robot work? 1:08:00 Communication between low level controller and high level controller. ROS Serial? 1:08:55 Terry: Suggests adding limit switches to steering (or whatever you are moving). 1:16:20 Al: Suggests monitoring current to detect a mechanical fault condition. 1:18:45 Al: Asks about setting programming mode on Black Pill board. 1:21:05 Al: Asks about using interrupts instead of the I2C interface. Some discussion of his magnetic absolute shaft encoder chip. 1:34:50 Discussion on transporting your vehicle to a testing location. 1:36:00 Comments on Terry's automated snow blower.
=====
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos
Index for Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/LawnTractorMeetingNotes.txt
Index for the original ROS Agriculture group (~250 videos): http://sampson-jeff.com/RosAgriculture/readme.txt http://sampson-jeff.com/RosAgriculture/ros-agriculture-youtube20230104.txt (or newest file)
Info from today's video meeting.
===========================================
20230323 Lawn Tractor Automation meeting - Length 45:20 This video: https://youtu.be/YyTRvCFoUL4
Chat: None
Index: 00:00 Terry: Talks about a dead laptop. Loading a spare laptop. 04:30 Terry: Is documenting his plans/progress. 07:00 Jeff: Continuing saga of ST Micro processor boards. Probably giving up on Arduino for STM32. The Arduino for STM32 has various problems that I can't figure out. Bluepill (STM32F103), Blackpill (STM32F401), STM32F407VE, F4-Discovery (STM32F407VG). 11:50 Moving to STM32CubeIDE, libraries, USB. 13:45 Comments about using the A/D inputs. 14:10 Comments about encoders and other timers. 15:00 Giving up on Arduino IDE. 15:20 Setting the clock speed the same across boards to allow cut and paste between projects. 16:00 Testing and verifying the 3 "extra" timers. 16:20 Showing a couple of encoders used for testing. 18:10 Prototyping methods. 19:25 UARTs for remote control and debug. 20:25 Configuring the processor. Peripherals vs. pins. 21:15 Adding the LORA radio board. 22:00 Al: Asks which board will be used. Size vs. pins vs. peripherals vs. pin configuration. 24:20 Al: Asks about encoders. 25:20 Header pins. 27:00 Al: Shows his status. He is working on wheel odometry. Problems with IMU. 29:10 Terry: About heading vs. Yaw vs. rotational rate. More on Al's code. 34:40 ROSSerial. 38:30 Jeff: More comments about analog inputs on ST Micro parts. 40:50 Al: Shows how he tracks his pins. 42:35 Pin remapping on modern processors. One more advantage/headache.
=====
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos
Index for Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/LawnTractorMeetingNotes.txt
Index for the original ROS Agriculture group (~250 videos): http://sampson-jeff.com/RosAgriculture/readme.txt http://sampson-jeff.com/RosAgriculture/ros-agriculture-youtube20230104.txt (or newest file)
Wow! This is worth looking at... https://groups.google.com/g/hbrobotics/c/L_MW26-Q1ac This video demonstration is referenced, but I think it is someone else's: https://www.youtube.com/watch?v=NUBm9b8YQMQ
@JeffS Thanks for the clue. I now have an improved, but not perfect tf tree attached. Wheel odom code is here: https://github.com/jones2126/ros1_lawn_tractor_ws/blob/master/src/ackermann_vehicle/nodes/odom_wheel_230329.py
Info from today's video meeting.
===========================================
20230330 Lawn Tractor Automation meeting - Length 50:45 This video: https://youtu.be/lRiSvOvub88
Chat: Jeff: SerialTransfer for Arduino communications: https://github.com/PowerBroker2/SerialTransfer
Index: 00:00 Jeff: Continuing saga of ST Micro processor boards. Bluepill (STM32F103), Blackpill (STM32F401), STM32F407VE, F4-Discovery (STM32F407VG). Overall description of pros and cons. Shows the dual processor master/slave prototype. 10:20 Al: Asks about the visualizer software that was posted to slack. 11:25 Al: Talks about implementing wheel odometry. 14:50 Jeff: Mentions difference between "real" wheel odometry and what Gazebo is providing. 17:55 Terry: Suggests periodically "syncing" the wheel odometry to the GPS. 24:05 Jeff: Mentions that Ros Serial also works on the Blue Pill board. And that you could use something different from a Blue Pill board as your master processor. SerialTransfer can use UART or I2C or SPI for a connection. 26:00 Terry: Talks about loading Ubuntu on laptops. Future plans. 33:15 Mention of zones and speeds. 35:35 Enabling VNC. 37:30 Jeff: Asks about the steps Terry used when loading Ubuntu. Discussion of Terry's several computers. Jeff is confused about Terry's Windows computer. Laptop memory.
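Links like SerialTransfer boil down to framed packets with a length and a checksum. A generic Python sketch of that idea (this is not the SerialTransfer wire format, just an illustration of why framing beats raw bytes over a UART):

```python
START = 0x7E  # arbitrary start-of-frame marker (placeholder value)

def frame(payload: bytes) -> bytes:
    """Wrap a payload as [start][len][payload][checksum]."""
    checksum = sum(payload) & 0xFF
    return bytes([START, len(payload)]) + payload + bytes([checksum])

def unframe(packet: bytes):
    """Return the payload, or None if the frame is malformed or corrupt."""
    if len(packet) < 3 or packet[0] != START:
        return None
    length = packet[1]
    if len(packet) != length + 3:
        return None
    payload, checksum = packet[2:-1], packet[-1]
    return payload if (sum(payload) & 0xFF) == checksum else None
```

The receiver can resynchronize on the start byte after line noise, and the checksum rejects corrupted frames instead of feeding garbage to the motor controller.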
=====
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos
Index for Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/LawnTractorMeetingNotes.txt
Index for the original ROS Agriculture group (~250 videos): http://sampson-jeff.com/RosAgriculture/readme.txt http://sampson-jeff.com/RosAgriculture/ros-agriculture-youtube20230104.txt (or newest file)
@Al Jones (from meeting today) If you look at this page: https://docs.ros.org/en/noetic/api/sensor_msgs/html/msg/Imu.html the z field of angular_velocity is your rotational velocity.
Info from today's video meeting.
===========================================
20230406 Lawn Tractor Automation meeting - Length 42:14 This video: https://youtu.be/C2KhruVG8N0
Chat: Al: https://www.amazon.com/gp/product/B07DW646GY/ref=ppx_yo_dt_b_search_asin_title?ie=UTF8&psc=1 Al: https://frame.work/products/laptop-diy-12-gen-intel
Index: 00:00 Jeff: Continuing saga of ST Micro processor boards. Talks about powering boards from either USB or externally, or both. USB hub vs. hacking USB cables. Modular design. Physical relays vs. solid state relays for eStop circuit. 13:10 Al: Asks what the extra lines were on the diagrams. Which USB hub to use? 17:00 Terry: Laptop stories. Buying a new laptop. 23:55 Terry: Is working with the tutorials and writing new code. 25:10 Terry: Currently: VNC, prototyping, ROS on RPi. 25:40 Al: Asks about Terry's code experiments. 27:55 Al: Posts a link to configuring/building laptops and a link to a USB hub. 28:55 Al: Talks about odometry/location progress. 34:40 Al: Ordered two of the Black Pill F411 boards. STM32 bootloaders. 40:20 Al: Asks Jeff where he got his LORA boards.
==
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos
Index for Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/LawnTractorMeetingNotes.txt
Index for the original ROS Agriculture group (~250 videos): http://sampson-jeff.com/RosAgriculture/readme.txt http://sampson-jeff.com/RosAgriculture/ros-agriculture-youtube20230104.txt (or newest file)
@JeffS I was following this week's video. Could you post a good picture of the various power and comm options you discussed? BTW I hope you guys don't mind, I downloaded the transcript and intend to deep dive into what you guys were talking about. All this is just for my personal use.
@Bob Hassett I assume you mean "zoomed in" for "good picture". These were random concepts out of my notebook for my own benefit. Basically "cartoons" of what I was thinking. They were never meant to be accurate. So I am not going to post a zoomed in version of these (unless I add a disclaimer that says these are nonsense).
What I should do is create a detailed version of this presentation to point out the problems and the solutions. Maybe I will do something for the next meeting. Or maybe I should post it to the real world, on my YouTube page. Or maybe do nothing...
The image below represents the course travelled by my lawn tractor testing the odom publishing program using wheel odometry plus getting heading from my IMU. Previously I was trying to calculate heading based on meters travelled by the left and right wheel, but decided the IMU data has more accuracy.
# Calculate the change in x and y position of the robot
#delta_x = distance * cos(self.heading_radians_wheels)
#delta_y = distance * sin(self.heading_radians_wheels)
delta_x = distance * cos(self.heading_radians_imu)
delta_y = distance * sin(self.heading_radians_imu)
As mentioned on our calls, I'm hoping to use this odom data when I lose GPS RTK fix and then return to RTK derived odom when fix is returned. That is code yet to be written. Link https://github.com/jones2126/ros1_lawn_tractor_ws/blob/master/src/ackermann_vehicle/nodes/odom_wheel_230329.py
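The delta calculation above is the core of a dead-reckoning pose update; here is a minimal standalone sketch of the idea (function name and signature are mine, not from the repo):

```python
from math import cos, sin

def update_pose(x, y, distance, heading_radians):
    """Advance a 2D pose by `distance` metres along `heading_radians`."""
    delta_x = distance * cos(heading_radians)
    delta_y = distance * sin(heading_radians)
    return x + delta_x, y + delta_y
```

With heading coming from the IMU, `distance` is the metres travelled since the last update (e.g. averaged from the left and right wheel encoders).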
Testing my odom publishing program that uses wheel odometry when RTK Fix is not available. The biggest shift comes at the beginning when Fix first starts and wheel odom has been running for about 20 seconds. So far I'm pretty happy with it. More testing needed.
Bagfile here: https://drive.google.com/file/d/1PJ63h97h0YQ2P90CkHfBvK5tNtmY4UlF/view?usp=share_link Odom publishing code here: https://github.com/jones2126/ros1_lawn_tractor_ws/blob/master/src/ackermann_vehicle/nodes/odom_from_wheel_and_gps.py
Reminder...
Zoom meeting (Thursdays at Noon (ET)) <https://us02web.zoom.us/j/82088036016?pwd=K2lLc1FiWm9MU0dzRStxM2J2b3dpQT09#success>
Info from today's video.
===========================================
20230413 Lawn Tractor Automation meeting - Length 1:03:28 This video: https://youtu.be/_SUk36E5LGM
Chat: Al: https://www.uugear.com/product/7-port-usb-hub-for-raspberry-pi/ Jeff: https://abra-electronics.com/robotics-embedded-electronics/arm-development/stm32f411ceu6.html Jeff: https://docs.zephyrproject.org/3.1.0/boards/arm/blackpill_f411ce/doc/index.html Al: STM32F411CEU6 core board V3.0 Freq. 100MHZ ROM:512KB RAM:128KB Jeff: https://stm32-base.org/boards/STM32F401CEU6-WeAct-Black-Pill-V3.0.html Jeff: https://stm32-base.org/boards/
Index: 00:00 Terry: Goes over progress learning TurtleSim and writing code to drive it. 04:50 Al: Shows his wheel odom and GPS odom test results. 09:40 Al: Talks about his Pure Pursuit experiments on a simulator. 13:20 Al: Issues with git. 14:20 Al: What's next. 14:40 Jeff: Talks about his project powering multiple Arduino-type boards. 21:00 Jeff: Says his snow is gone so he could start playing with his lawn tractor again. 22:35 Jeff: Mentions the variations of cmd_vel, modified cmd_vel, ackermann_cmd_stamped, and ackermann_cmd. 36:15 Al: Shows his USB hub. 38:25 Al: Talks about his new Black Pill F411 boards. (For rest of meeting)
==
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos
Index for Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/LawnTractorMeetingNotes.txt
Index for the original ROS Agriculture group (~250 videos): http://sampson-jeff.com/RosAgriculture/readme.txt http://sampson-jeff.com/RosAgriculture/ros-agriculture-youtube20230104.txt (or newest file)
I see that our last video meeting put us at the 100-video mark. A few of those are shorts and tutorials, but most are meetings. 💯
There are a few things going on here: first, the path from the garage to the starting point of a Dubins test; then running the Dubins test using Pure Pursuit; a manually driven hard right turn; then continuing with a manual hard left turn and returning to the garage. So this was the first real-world test of letting the lawn tractor be guided by Pure Pursuit. Lots to adjust and refine.
Crazy prices on GNSS equipment. https://www.aliexpress.com/store/1101386350?spm=a2g0o.detail.1000007.1.7fde29fctLYsr9 No, of course I don't need another receiver. Just looking at the Base Station container https://www.aliexpress.com/store/1102602874?spm=a2g0o.detail.1000007.1.7f5933c7zuuZQV
@Al Jones At around 40:00 in today's video Al says he is using pose.orientation.z, but that is actually part of a quaternion. If you want z rotation, use the yaw_deg value (or convert that to radians). Or convert the quaternion to Euler x,y,z, but that z should be the same value as yaw_deg...
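For reference, pulling yaw out of the quaternion is a standard conversion; a minimal sketch (pure math, no ROS dependency):

```python
import math

def quaternion_to_yaw(x, y, z, w):
    """Yaw (rotation about Z, in radians) from a unit quaternion."""
    return math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
```

Note that for small rotations orientation.z happens to be close to yaw/2, which is why plotting it directly can look plausible but is still wrong.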
Info from today's video meeting.
===========================================
20230420 Lawn Tractor Automation meeting - Length 43:08 This video: https://youtu.be/B6ITWNVfY1M
Chat: None
Index: 00:00 Jeff: Random updates. Cell modem doesn't work. 3G vs. 4G? Screwed up his cell phone during debug. Various alternate methods of transferring correction data. 03:00 Jeff: New fiber optic Internet (that isn't hooked up yet). 04:25 Al: Has a list of things he is doing. Pure Pursuit, vehicle speed. Status lights. IMU problems. Black Pill won't program. Ideas for organizing a GPS base station. 07:25 Al: Shows some plots. 11:35 Jeff: Asks about turning radius. 17:55 Al: Now has a mysterious offset when he starts up. 19:10 Al: Talks about his status LEDs. 21:20 Al: Safety features. 23:55 Al: Ardusimple survey package. (As an example) 26:05 Discussion about programming a Black Pill board. 30:15 Al: Next steps. 31:25 Jeff: Asks about Al's IMU. 37:50 git problems. Back to the problem. 39:30 More analysis of plots.
==
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos
Index for Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/LawnTractorMeetingNotes.txt
Index for the original ROS Agriculture group (~250 videos): http://sampson-jeff.com/RosAgriculture/readme.txt http://sampson-jeff.com/RosAgriculture/ros-agriculture-youtube20230104.txt (or newest file)
Controlling my speed has become an issue. Part of that problem is the connection between my SuperServo200 and the transmission. It appears the shaft coupler that I had welded to a connecting strip is wearing on the shaft and has become a little loose. I might need to change to a clamp-style coupler to get more surface area to hold on to the servo shaft.
Or you could just put removable Loctite on the set screw. https://www.loctiteproducts.com/en/products/specialty-products/specialty/loctite_threadlockerblue242.html or Harbor Freight is cheaper. https://www.harborfreight.com/02-oz-removable-threadlock-96059.html?_br_psugg_q=loctite
Thanks for weighing in. The control arm is not loose because the set screw was absent or became loose. I thought I was being clever the last time the set screw was loose: I tightened it down as far as I could and used Loctite RED. Now I've bent an allen wrench trying to get the set screw out. The set screw has actually gouged the shaft somewhat. I was able to get the control arm off with force. Now it seems that to get the set screw out I would need to heat it up. Or make a new control arm. A new collar is en route from McMaster-Carr.
Do you still have the round plate that came with the servo? You're not going to get a much better fit than that. You can bolt a metal arm to that.
Here is an article on lithium battery management: https://www.synopsys.com/glossary/what-is-a-battery-management-system.html from this email thread: https://groups.google.com/g/hbrobotics/c/UigClwx-wwI
Info from today's video meeting.
===========================================
20230427 Lawn Tractor Automation meeting - Length 43:55 This video: https://youtu.be/-mRuvFu-ro8
Chat: None
Index: 00:00 Jeff: Cell modem account cancelled. Now what? 02:50 Terry: Has another dead Ubuntu install. Need backup? 07:30 Jeff: Asks about updates on using drill batteries. 08:15 Al: Talks about his giant R/C servo for transmission control. Shows pictures of his servo arm. Shows graph of speeds. 17:00 Jeff: Asks about GPS speeds vs. wheel speeds. Calculated angular rotation vs. rotation directly from IMU. Mounting plate that came with Super Servo. 23:25 Removing the spring from inside of the transmission. 25:40 Jeff: Asks about Al's big tractor. 26:00 Software speed control loop fighting the engine governor speed control loop. 32:00 We start looking at actual data from odometry and IMU. 38:20 Quaternions in Plot Juggler. And custom series.
==
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos
Index for Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/LawnTractorMeetingNotes.txt
Index for the original ROS Agriculture group (~250 videos): http://sampson-jeff.com/RosAgriculture/readme.txt (Following link, or newest file in that directory) http://sampson-jeff.com/RosAgriculture/ros-agriculture-youtube20230104.txt
Info from today's video meeting.
===========================================
20230504 Lawn Tractor Automation meeting - Length 43:33 This video: https://youtu.be/qgptDyHIDdw
Chat: Al: Something to share https://github.com/yoneken/rosserial_stm32 # might be useful if trying to use STM32 board http://wormfood.net/avrbaudcalc.php https://github.com/ClemensElflein/open_mower_ros https://wiki.openmower.de/index.php?title=FAQ#What_is_OpenMower? http://wiki.ros.org/wit-imu-driver https://wiki.openmower.de/index.php?title=FAQ#Who_created_OpenMower? https://github.com/Inspur-ROS/WIT-IMU https://allentown.craigslist.org/grd/d/macungie-craftsman-trac-drive/7606249290.html ESP-NOW https://www.youtube.com/watch?v=bEKjCDDUPaU (DroneBot Workshop) https://www.youtube.com/watch?v=6NsBN42B80Q (Andreas Spiess) https://www.youtube.com/watch?v=oz0a7Ur7nko (packet loss analysis) Robo Tank https://github.com/SensorsIot/RoboTank
Index: 00:00 Terry: Updates on laptops and software experiments. 02:20 Jeff: Talks about telemetry radios. 14:30 Also have LORA and WiFi to try instead. 15:00 Maybe make a GPS base station. 16:45 Jeff: Pulled an EarthWise reel type push mower out of the trash. 20:15 Al: Talks about sending parameters to his vehicle microcontroller. 22:20 Al: Talks about different versions of ROS-Serial he has tried. 27:15 Al: Safety switches. (Stopping on loss of signal) 32:20 Al: Interesting links to share. 37:00 Al: Next steps - Pure Pursuit and velocities. 39:40 Jeff: We need to create a list to modify on the "standard" Pure Pursuit package.
==
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos
Index for Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/LawnTractorMeetingNotes.txt
Index for the original ROS Agriculture group (~250 videos): http://sampson-jeff.com/RosAgriculture/readme.txt (Following link, or newest file in that directory) http://sampson-jeff.com/RosAgriculture/ros-agriculture-youtube20230104.txt
A couple of videos of RVIZ results: one is a simulated 3-turn mission and the second is a failed attempt at running in the real world. Next step is to debug what the heck went wrong. https://drive.google.com/drive/folders/1u2zG9WfJiiIcmby0f9IYj108OB1lsngI?usp=share_link
Here is my current test bed. I’m combining an old PIC-based Vex EDR system with high-current motor drivers. Right now I’m using 12 volts to drive the motors. Eventually I’ll use 24 V. Vex uses 8 volts to power their motors. I don’t presently have a 5 volt source to provide logic high to the high-current motor drivers. Do I tap the battery pack or hang a step-down off the 12 volt source?
Good design says keep the 2 circuits separate. But I have to compromise.
So what to do?
I’m now thinking of some kind of converter from R/C PWM to a motor-driver PWM... something like https://www.hicomponent.com/4-channel-pwm-to-0-5v-converter-module.html Any opinions?
I don't think that board does what you want. It looks like it converts 100% PWM to a voltage.
You could create your own converter with an Arduino type board. Put code to decode R/C PWM to get a number. Then use that number to create PWM and DIR to run your Cytron board.
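The decode-then-re-emit logic is simple; here is the mapping expressed in Python for clarity (pin handling omitted; the 1000-2000 µs range, 1500 µs neutral point, and deadband value are typical R/C assumptions, not measured values):

```python
def rc_to_pwm_dir(pulse_us, deadband_us=20):
    """Map an R/C pulse width (1000-2000 us, 1500 = neutral) to (duty 0-255, direction)."""
    pulse_us = max(1000, min(2000, pulse_us))    # clamp out-of-range pulses
    offset = pulse_us - 1500
    if abs(offset) <= deadband_us:
        return 0, 1                              # inside deadband: stopped
    duty = min(255, round(abs(offset) * 255 / 500))
    return duty, (1 if offset > 0 else 0)        # 1 = forward, 0 = reverse
```

On an Arduino the same math would sit between pulseIn() (or an interrupt handler) and analogWrite()/digitalWrite() driving the Cytron board's PWM and DIR pins.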
While following up on Jeff’s suggestion, I stumbled across this one. Looks interesting and flexible for now and future adaptations. https://www.dimensionengineering.com/products/sabertooth2x12rc
Info from today's video meeting.
===========================================
20230511 Lawn Tractor Automation meeting - Length 40:30 This video: https://youtu.be/PmbywXZOvFM
Chat: None
Index: 00:00 Terry: Discusses his laptops. Dual boot, replacing hard drives, external hard drives. 06:30 Jeff: Adding old scrap lawn mowers to the Sammy2 mini-tractor. 09:15 Jeff: Talks about his telemetry radio experiments. 14:00 Al: Talks about ground plots. 15:35 Al: Radio problems. Blocking calls affecting loop rate. 28:30 Jeff: Breaks in with urgent news... 28:45 Jeff: Makes comments on the ground plots from earlier in this video. 33:15 Al: Shows some connectors and asks for some advice.
==
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos
Index for Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/LawnTractorMeetingNotes.txt
Index for the original ROS Agriculture group (~250 videos): http://sampson-jeff.com/RosAgriculture/readme.txt (Following link, or newest file in that directory) http://sampson-jeff.com/RosAgriculture/ros-agriculture-youtube20230104.txt
Finally got the Sabertooth 2x12 controller to work. Now I can remote control 2 motors: one for steering and one for fwd/reverse. I’ll have to put 2 12 V batteries in series if I want 24 V to drive the motors…go faster. Found out I can’t use a step-up for 24 V because of the regeneration feature.
Still testing LoRa communications between my DIY purpose built radio control and my tractor using the jgromes radiolib library on ESP32 boards. At the moment I can send a 30 character packet to the tractor and get a 22 character packet back from the tractor at a rate of 20 messages every 10 seconds (i.e. ~2 Hz). So far that is about the best I can do. Code for testing is:
Initiator (for me this would be my radio control): https://github.com/jones2126/lawn_tractor_embedded_code/blob/main/ttgo_LoRa_initiator_2/src/main.cpp
Receiver (for me this would be my tractor): https://github.com/jones2126/lawn_tractor_embedded_code/blob/main/ttgo_LoRa_receiver_2/src/main.cpp
I have been playing with the telemetry radios. The short story, they work. I haven't done any range testing. I have checked the received data by decoding it with the RTKlib "convbin" software with no complaints. And the F9P does not report any checksum errors. I can get a "float" and "fix" solution.
But I notice my F9P GPS is acting goofy. The last thing I did was disable all of the Galileo satellites and it seems to have settled down.
Here is an image showing the incoming RTCM3 status. You can also see the Galileo satellites popping in and out.
Info from today's video meeting.
===========================================
20230518 Lawn Tractor Automation meeting - Length 1:06:12 This video: https://youtu.be/yQh7S0jG4tc
Chat:
Command to fetch correction data and send it to the radio.
str2str -in ntrip://'USERID':'PASSWORD'@mncors.dot.state.mn.us:9000/'RTCM32NAD83(2011)'#rtcm3 -out
Command to fetch correction data and save it to a file
str2str -in ntrip://'USERID':'PASSWORD'@mncors.dot.state.mn.us:9000/'RTCM32NAD83(2011)'#rtcm3 -out
Command to convert correction data to a readable format. convbin usb1.bin -r rtcm3
Index: 00:00 Jeff: Talks about his telemetry radio experiments. Verifying data. Logging with Putty or writing to file using RTKlib str2str tool. Shows wiring and explains connections. Explanation of more experiments. 32:15 Thoughts on range checking. 35:10 Terry: Asks about correction data reliability. 39:00 Jeff: Shows the two types of radios that he has. 42:55 Jeff: Mentions also trying to get cell phone to provide correction data. 47:55 Terry: Talks about laptops. And he has another new laptop. 54:20 Al: Talks about his LORA radios. 59:00 Discussion of loop rates and stopping distance. 1:01:00 Another discussion about stopping distance. Some final comments about safety.
==
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos
Index for Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/LawnTractorMeetingNotes.txt
Index for the original ROS Agriculture group (~250 videos): http://sampson-jeff.com/RosAgriculture/readme.txt (Following link, or newest file in that directory) http://sampson-jeff.com/RosAgriculture/ros-agriculture-youtube20230104.txt
I mentioned on the video meeting that I was going to make another attempt to get my cell phone to provide GPS correction data. The phone is a Samsung A20 with a Tracfone cell phone plan. They have the Wifi and Bluetooth tethering locked out.
I loaded this software: https://www.ardusimple.com/introducing-gnss-master-app/ And this adapter cable: https://www.amazon.com/JSAUX-Adapter-Compatible-MacBook-Samsung/dp/B07BS8SRWH And this FTDI adapter: https://www.amazon.com/HiLetgo-FT232RL-Converter-Adapter-Breakout/dp/B00IJXZQ7C
And much to my surprise, it works. It successfully bypasses whatever lockout Tracfone attempted.
I have only run it on Wifi, but I assume it will work on cell network data...
I was struggling to get my purpose-built LoRa radio-control ESP32 board, my tractor's LoRa ESP32, and the laptop running ROS (via rosserial) to work together, so I rolled my own serial data-passing test code to see how that might work. At first I struggled, and then @JeffS and @TERRY SCHUMACHER set me straight today. Just testing, I set the laptop-to-ESP32 rate at 100 Hz and the ESP32-to-laptop rate at 50 Hz, and it looked like it was able to handle 99 and 49 Hz respectively. I will likely only need 10 and 5 Hz. Code below:
ESP32 side:
```
#include <Arduino.h>
#include <Wire.h>
#include <Adafruit_GFX.h>
#include <Adafruit_SSD1306.h>
//#define BATTERY_PIN A0 // analog pin for battery voltage
float linear_x = 0.0;
float angular_z = 0.0;
float voltage = 0;
float incomingHz = 0.0;
unsigned long currentMillis = millis();
unsigned long previousSendVoltage = 0;
unsigned long previousDisplay = 0;
unsigned long previousHzCount = 0;
const long intervalSend = 200;    // 5 Hz = 200 ms; 50 Hz would be 20 ms (i.e. (1/50)*1000 = 20) - rate for sending battery voltage
const long intervalDisplay = 1000;
const long intervalHzCount = 5000;
int incomingCount = 0;
// Screen size and pin values below assume a TTGO LoRa32-style board; adjust for your hardware
#define SCREEN_WIDTH 128
#define SCREEN_HEIGHT 64
#define OLED_SDA 4
#define OLED_SCL 15
#define OLED_RST 16
const int row_1 = 0, row_4 = 24, row_6 = 40, row_7 = 48; // assumed pixel rows for the display cursor
Adafruit_SSD1306 display(SCREEN_WIDTH, SCREEN_HEIGHT, &Wire, -1);
void startOLED(){
  pinMode(OLED_RST, OUTPUT);
  digitalWrite(OLED_RST, LOW);
  delay(20);
  digitalWrite(OLED_RST, HIGH);
  Wire.begin(OLED_SDA, OLED_SCL);
  if(!display.begin(SSD1306_SWITCHCAPVCC, 0x3c, false, false)) { // 0x3C is the common SSD1306 address
    while(1) delay(1000); // SSD1306 allocation failed - loop forever and don't continue
  }
  display.clearDisplay();
  display.setTextColor(WHITE);
  display.setTextSize(1);
  display.setCursor(0, row_1);
  display.print("start OLED");
  display.display();
  delay(2000);
}
void updateDisplay(){
  if (currentMillis - previousDisplay >= intervalDisplay) {
    display.clearDisplay();
    display.setTextSize(1);
    display.setTextColor(WHITE);
    display.setCursor(0, row_1);
    display.println("INCOMING READINGS");
    display.setCursor(0, row_4);
    display.print("Incoming Hz: ");
    display.print(incomingHz);
    display.setCursor(0, row_6);
    display.print("linear x: ");
    display.print(linear_x);
    display.setCursor(0, row_7);
    display.print("angular z: ");
    display.print(angular_z);
    display.display();
    previousDisplay = currentMillis;
  }
}
void sendVoltage(){ // 5Hz rate for sending battery voltage or other tractor data
  if (currentMillis - previousSendVoltage >= intervalSend) {
    //float voltage = analogRead(BATTERY_PIN) * (3.3 / 4095.0); // Replace with your specific conversion function
    voltage = voltage + 1; // just using as a placeholder
    Serial.println(voltage);
    previousSendVoltage = currentMillis;
  }
}
void checkSerial(){ // Receiving linear_x and angular_z data at 10Hz
  if (Serial.available() > 0) {
    String incomingData = Serial.readStringUntil('\n');
    int commaIndex = incomingData.indexOf(',');
    if (commaIndex != -1) {
      linear_x = incomingData.substring(0, commaIndex).toFloat();
      angular_z = incomingData.substring(commaIndex + 1).toFloat();
      // add a safety feature to make sure the values are within an acceptable range or confirm it's done elsewhere
      incomingCount++; // increment the count of incoming messages
    }
  }
}
void calcIncomingHz(){
  if (currentMillis - previousHzCount >= intervalHzCount) {
    incomingHz = incomingCount / (intervalHzCount / 1000.0);
    incomingCount = 0; // reset the counter every 5 seconds
    previousHzCount = currentMillis;
  }
}
void setup() {
  Serial.begin(115200);
  startOLED();
}
void loop() {
  currentMillis = millis();
  sendVoltage();
  checkSerial();
  calcIncomingHz();
  updateDisplay();
}
```
Python laptop side:
```
import serial
import time
ser = serial.Serial('/dev/ttyACM0', 115200, timeout=1)
ser.flush()
voltage = 0.0
linear_x = 0.0
angular_z = 0.0
previous_time_send = time.time()
previous_time_receive = time.time()
previous_time_display = time.time()
voltage_count = 0
while True:
    current_time = time.time()

    # Receiving voltage data at 5Hz
    if ser.in_waiting > 0:
        line = ser.readline().decode('utf-8').rstrip()
        voltage = float(line)
        #print(f"Voltage: {voltage}")
        voltage_count += 1

    # Sending linear_x and angular_z data at 10Hz
    if (current_time - previous_time_send) >= 0.1:
        previous_time_send = current_time
        ser.write(f"{linear_x},{angular_z}\n".encode('utf-8'))

    # For testing, we can simulate linear_x and angular_z changes
    if (current_time - previous_time_receive) >= 1.0:
        previous_time_receive = current_time
        linear_x += 0.1
        angular_z += 0.2  # Adjust the increment to suit your needs

    # Displaying the voltage count every 10 seconds
    if (current_time - previous_time_display) >= 10.0:
        previous_time_display = current_time
        print(f"Number of voltage messages received: {voltage_count}, incoming Hz: {(voltage_count/10)}")
        voltage_count = 0  # reset counter
```
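The comma-separated wire format above can be unit-tested without any hardware; a small helper sketch (these helper names are mine, not from the repo):

```python
def pack_cmd(linear_x, angular_z):
    """Encode a command the way the laptop side writes it."""
    return f"{linear_x},{angular_z}\n".encode('utf-8')

def unpack_cmd(raw):
    """Decode a command the way the ESP32 side parses it."""
    text = raw.decode('utf-8').strip()
    left, right = text.split(',', 1)
    return float(left), float(right)
```

A round-trip through these two functions is a quick sanity check before involving the serial link.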
Info from today's video meeting. I added a link to the Slack channel in the description. We'll see if that attracts anyone new.
===========================================
20230525 Lawn Tractor Automation meeting - Length 35:30 This video: https://youtu.be/Jk5CwawWj7k
Chat: The new Ardusimple GNSS Master cell phone app. https://www.ardusimple.com/introducing-gnss-master-app/
Index: 00:00 Jeff: Got his cell phone to provide RTK correction data for the F9P GNSS receiver. 08:50 Terry: Talks about laptops. And where he is going next. And the Linux history command. 11:40 Al: Talks about his LORA radios. Println vs. ROS Serial. 18:20 Discussion about sending messages both directions. 27:00 Al: Has another dead laptop. 27:30 Mentions of Linux swap file size. 29:00 Al: Mentions "ESP Now" for communications. 32:35 Jeff: Asks about Al's previous experiments.
==
Link to access the Slack channel: https://tractorautomation.slack.com/
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos
Index for Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/LawnTractorMeetingNotes.txt
Index for the original ROS Agriculture group (~250 videos): http://sampson-jeff.com/RosAgriculture/readme.txt (Use following link, or newest file in that directory) http://sampson-jeff.com/RosAgriculture/ros-agriculture-youtube20230104.txt
Info from today's video meeting.
===========================================
20230601 Lawn Tractor Automation meeting - Length 47:41 This video: https://youtu.be/SaqsdcoYjq8
Chat: None
Index: 00:00 Jeff: Quick mention of the cell phone CORS RTK correction data. 01:00 Jeff: Yes, after all this time, the Sammy2 robot still moves... And next steps. 02:00 Jeff: Steering calibration, yet again. Doing everything from remote control box and not using ROS command line. Rotational velocity vs. steering angle for control. 09:00 Using the switches on the remote for other control situations. 12:00 Using the switches to specify a path to be run. 15:50 Or, if running a navigation stack, have it post preprogrammed nav goals. More random discussion about procedures and remote control modifications. 38:00 Al: Status and ToDo list. LORA radio libraries. Some description of how his code works.
==
Link to access the Slack channel: https://tractorautomation.slack.com/
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos
Index for Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/LawnTractorMeetingNotes.txt
Index for the original ROS Agriculture group (~250 videos): http://sampson-jeff.com/RosAgriculture/readme.txt (Use following link, or newest file in that directory) http://sampson-jeff.com/RosAgriculture/ros-agriculture-youtube20230104.txt
I had a thought yesterday. What if you convert the power steering motor on a lawn tractor directly to R/C control? There are lots of people on YouTube showing how to do that. They just use a controller board from a scrap R/C servo, a power driver board, and a pot attached to your motor.
I haven't thought through all of the advantages and disadvantages. But it might be something to consider.
I had a thought yesterday. What if you use your actual vehicle as a debug tool? Start your simulator and navigation software. Give it a path to follow and watch it run. Then have the steering controller on your vehicle subscribe to the same movement commands you are using (cmd_vel or ackermann_cmd). Then watch the steering on the vehicle as it just sits there. You can compare the physical vehicle to the simulated vehicle. The simulator is in complete control. The physical vehicle just mirrors the steering.
*Thread Reply:* @JeffS Thanks for giving it your thoughts. Both good ideas. I am currently struggling getting my low level ESP32 controller to work with LoRa and passing data back and forth to ROS. We can chat tomorrow if you are available at our regular time.
Info from today's meeting.
===========================================
20230608 Lawn Tractor Automation meeting - Length 51:17 This video: https://youtu.be/NtOn3eJ0cZk
Chat: Jeff: Examples of driving big motors from R/C signals. Using R/C servo https://www.youtube.com/watch?v=bURsvNdro4o Using Arduino https://www.youtube.com/watch?v=3pYWLF8qw-g Driver boards from videos: https://www.amazon.com/Driver-Channel-Current-Stepper-Controls/dp/B0C6JZFQP8 https://www.amazon.com/BTS7960-H-bridge-Double-Current-Diagnostic/dp/B09W8VV6RH
Index: 00:00 Jeff: Possibly convert big (power steering) motors into R/C servos. 19:20 Jeff: Ponders running your vehicle steering in parallel with a simulator. 27:25 Al: Says he may modify his firmware just to listen to cmd_vel for this test. 27:45 Al: Explains his current control architecture. 30:25 ROS Serial vs. custom serial. 31:10 Diagrams of Al's control system. 44:15 Visio vs. Freeform vs. hand drawn 47:00 Al: No GPS, udev rules, communications.
==
Link to access the Slack channel: https://tractorautomation.slack.com/
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos
Index for Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/LawnTractorMeetingNotes.txt
Index for the original ROS Agriculture group (~250 videos): http://sampson-jeff.com/RosAgriculture/readme.txt (Use following link, or newest file in that directory) http://sampson-jeff.com/RosAgriculture/ros-agriculture-youtube20230104.txt
I'm running a path in Gazebo using Pure Pursuit and getting some cmd_vel.angular.z values > steering_angle_limit: 0.785, which is meant to be set in the Pure Pursuit yaml file. Starting to investigate....
path:
23.5 3.3 1.64
23.5 13.3 1.64
21.5 13.3 3.21
21.5 3.3 4.78
19.5 3.3 1.64
19.5 13.3 1.64
17.5 13.3 3.21
17.5 3.3 4.78
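One way to verify the limit independently is to clamp (or at least flag) angular values before they reach the tractor; a minimal sketch using the 0.785 rad limit quoted above:

```python
def clamp_steer(angle, limit=0.785):
    """Clamp a steering command to +/- limit (radians)."""
    return max(-limit, min(limit, angle))
```

Logging whenever the clamp actually fires would show how often the follower is exceeding the yaml limit.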
*Thread Reply:* Link to video of running the path in simulation: https://drive.google.com/file/d/1W-K6tMXfqNShEgWeFHk8hTltnQy45dUe/view?usp=drive_link
*Thread Reply:* First, my code does have the lines that you showed commented out. They are commented out in mine also. More importantly, those are not in the original repository, so somebody added them after the fact and then commented them out. I didn't intentionally make changes to the Pure Pursuit code. I did modify the ackermann_cmd type, then switched it back because I converted it externally.
*Thread Reply:* I see you are playing with the cmd_vel topic. I shouldn't even mention this again. But keep in mind you probably don't want the rotational velocity output, but the steer angle instead. My stuff was using the steer angle that was output in an ackermann topic. But basically I'm not sure what you are trying to do. Maybe you are just distracted by the funny rotational velocity you were plotting out.
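For what it's worth, converting a cmd_vel-style rotational velocity to a steer angle is a one-line bicycle-model calculation; a sketch (the wheelbase argument is whatever your vehicle's value is):

```python
import math

def steer_angle_from_twist(linear_x, angular_z, wheelbase):
    """Bicycle-model steer angle (radians) from linear and angular velocity."""
    if abs(linear_x) < 1e-6:
        return 0.0  # undefined at zero speed; return neutral
    return math.atan(wheelbase * angular_z / linear_x)
```

This is the standard way to drive an Ackermann steering actuator from a cmd_vel topic when no ackermann_cmd is published.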
*Thread Reply:* You mention in passing that the Dubins planner adds extra intermediate data points, and that you just created a path with corner points only. That's not really a valid path for the Pure Pursuit follower; it uses those intermediate points to do its job. But I'm not sure what you are testing for.
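Since the follower depends on those intermediate points, a corner-points-only path can be densified before handing it over; a rough sketch (the spacing value is arbitrary):

```python
import math

def densify(path, spacing=0.5):
    """Insert intermediate points so consecutive points are at most `spacing` apart."""
    out = [path[0]]
    for (x0, y0), (x1, y1) in zip(path, path[1:]):
        d = math.hypot(x1 - x0, y1 - y0)
        n = max(1, math.ceil(d / spacing))
        for i in range(1, n + 1):
            out.append((x0 + (x1 - x0) * i / n, y0 + (y1 - y0) * i / n))
    return out
```

The posted path also carries a heading column; headings for the inserted points would need to be interpolated (or recomputed from the segment direction).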
*Thread Reply:* I also noticed in passing that you did not update the wheelbase value in your parameter file. You still have my value of 0.505 instead of your value (which I thought was 1.26).
*Thread Reply:* I couldn't get your video to play. I finally had to wait for Google drive to say it was done converting. Even downloading ahead of time did not work. After it thought it was done converting it seemed to play okay.
*Thread Reply:* Link to bagfile running this path in the real world outside. On a scale of 1-10 I would put it at about a 4. Speed still needs adjusting and steering is "wobbly" so I'm hoping PID tuning will improve that. At least the tractor generally ran the path and finished.
I was watching my transmission servo arm while the tractor was running its path and it would frequently move to neutral for a brief moment. I'm thinking my LoRa signal was dropping or the overload protection on the servo was kicking in to protect the motor. My transmission connecting rod is ~2" from the servo center of rotation. That equates to about 44 lbs of force capacity from the Super200 servo, which is about what the rough bench testing showed the hydrostatic transmission control arm required. I still need to look at the data closer, but I may need to see if I can reduce the distance from the center of rotation down to about 1.5-1.3".
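The arm-length trade-off is just torque divided by lever radius; a quick sketch (the 88 lb-in figure is back-computed from the ~44 lb at 2 in numbers in the post, not a servo spec):

```python
def force_at_arm(torque_lb_in, radius_in):
    """Linear force available at a given lever radius for a given torque."""
    return torque_lb_in / radius_in
```

So moving the connecting rod from 2 in to 1.5 in would raise the available force from about 44 lb to roughly 59 lb, at the cost of less linear travel.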
Info from today's video meeting.
===========================================
20230615 Lawn Tractor Automation meeting - Length 44:05 This video: https://youtu.be/krF_3aVBGQU
Chat: None
Index: 00:00 Terry: Gives quick update on his status. 00:35 Jeff: Makes comments about his attempts to convert a power steering motor to R/C control. 06:40 Jeff: Shows some AS5600 magnetic shaft encoders from Amazon. Monitor carburetor throttle plate, for example. 12:20 Al: Gives update on his path following experiments. Comments about his transmission not maintaining constant speed. 25:00 Jeff: Mentions providing a valid path that Pure Pursuit expects. 25:50 Jeff: cmd_vel vs. ackermann_cmd. 39:20 Jeff: Makes comments on the difference between rotational velocity and steer angle.
==
Link to access the Slack channel: https://tractorautomation.slack.com/
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos
Index for Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/LawnTractorMeetingNotes.txt
Index for the original ROS Agriculture group (~250 videos): http://sampson-jeff.com/RosAgriculture/readme.txt (Use following link, or newest file in that directory) http://sampson-jeff.com/RosAgriculture/ros-agriculture-youtube20230104.txt
Info from today's video meeting.
===========================================
20230622 Lawn Tractor Automation meeting - Length 18:43 This video: https://youtu.be/ucg0tIzKKco
Chat: None
Index: 00:00 Jeff: Says he hasn't gotten anything done. (comments about making power steering motor into R/C servo) 01:00 Terry: Also says he hasn't gotten anything done either. 01:20 Al: Shows his giant R/C servo he built from scratch. 03:10 Al: Shows a new giant R/C servo (Model DH-03x). And a 3D printed mounting bracket. 04:40 Terry: Asks Al about making counter sunk (beveled?) holes with the 3D printer. 06:10 Jeff: Says the DH-03x servo is the same one he used on his Sammy2 Mini-tractor for steering. 07:15 Jeff: Adds more info about the "giant servos". 12:00 Jeff: Mentions a YouTube video he just watched: https://www.youtube.com/watch?v=0nhxqqtfhoc Here is another one: https://www.youtube.com/watch?v=bRsKDwj_J3o 16:05 Jeff: Asks about Al's numbers for steer angle vs. rotational velocity. 16:30 Comments about using a small(ish) robot to do something useful.
Just a reminder,
Zoom meeting (Thursdays at Noon (ET)) <https://us02web.zoom.us/j/82088036016?pwd=K2lLc1FiWm9MU0dzRStxM2J2b3dpQT09#success>
*Thread Reply:* So it’s 11 m Minnesota time? We are out of town right now. @Al Jones was curious about my latest postings. All I have to work with is an iPhone right now and fat fingers. I do have Zoom loaded, maybe I can log in? If someone can display my photos maybe I could explain that way? Spoiler alert. This will not be a permanent solution. I just got impatient for remote control.
Bob said: "So it’s 11 m Minnesota time?" I assume you mean 11 AM... If so, yes it would be 11AM Central time. I have never logged in with a phone, but I assume it works.
My old white board. I have to update it to include remote control progress. Can people see it? You will have to zoom in to see details. https://miro.com/app/board/o9J_kzvV0Pw=/
*Thread Reply:* Thanks @Bob Hassett for introducing this tool. I've started converting my current diagram here: https://miro.com/app/board/uXjVM1yzdFo=/?share_link_id=869571444353
Info from today's video meeting.
===========================================
20230713 Lawn Tractor Automation meeting - Length 1:23:45 This video: https://youtu.be/n_e7xKt9wZs
Chat: Jash: remoteteleop.com Bob's strawberry farm from about 20 years ago: http://sampson-jeff.com/rsoh/010617/ Software for robot arms: https://moveit.ros.org/ Al: https://github.com/jones2126/ros1_lawn_tractor_ws/tree/master
Index: 00:00 Jeff: Explains why we didn't have meetings. 01:00 Jeff: Talks about the state of his software and ways to fix it. 05:50 Jash: Joins the meeting. He introduces himself and his project. 07:00 Jeff, Al and Bob give short description on their projects. 13:10 Jash and Bob discuss strawberries. And other applications. 17:50 Jash: Posts a link to his project. 19:00 General discussion of operator assisted autonomous agriculture vehicles. 20:40 Short discussion about Internet connectivity in a typical farm field. 27:45 Short discussion of public disclosure about projects. 30:00 How do you test/train a remote operator. 32:30 Jeff: Gives a brief description of his "Sammy2" mini-tractor. 37:00 ROS1 vs. ROS2. 39:40 Jash: Asks if the Lawn Tractor Automation group has a documentation repository. Al: Posts a link to his github. Some talk about the video indexes and the way Slack and Zoom work. 51:00 Al: Gives a status update. Talks about his R/C servo for his hydrostatic transmission. 54:00 Al: Talks about publishing topics in a format that Plot Juggler will recognize. 54:50 Jeff: States that you should take advantage of "ros bags", if you are not already doing that. 56:50 Al: Makes comments about finding things in his github. 1:00:00 Bob talks about his tractor and control systems. 1:17:40 Jeff: Asks about Al's new giant R/C servo. Comments about buffering the feedback signal.
*Thread Reply:* @Jash Mota here is a video of autonomous berry picking. Maybe your anywhere-in-the-world driving software could interface with it: https://m.youtube.com/watch?v=M3SGScaShhw&feature=youtu.be
I tried attaching the graphic to comments of the video meeting. No go, only takes text?
My LoRa project is on hold for now, depending on my portable case system development. I’m thinking of adding another processor and “dumb” vision.
I think I am running these: https://www.adafruit.com/product/3231 https://www.adafruit.com/product/3178 It is not obvious to me what you mean by "not providing reliable results".
*Thread Reply:* My LoRa safety switch triggers if the Tractor does not receive a LoRa message within one second. (code) In a 15 minute test run the safety trigger was set off 7 times.
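The safety switch described above (trip if no LoRa message arrives within one second) is a classic heartbeat watchdog. A hypothetical sketch, not the linked code; class and method names are illustrative:

```python
import time

# Hypothetical 1-second heartbeat watchdog: if no LoRa message is received
# within 'timeout' seconds, tripped() goes True and the caller can command
# the transmission servo to neutral. The 'now' parameter allows injecting
# a fake clock for testing.
class LoraWatchdog:
    def __init__(self, timeout=1.0, now=time.monotonic):
        self.timeout = timeout
        self.now = now
        self.last_msg = now()

    def feed(self):
        """Call whenever a LoRa message is received."""
        self.last_msg = self.now()

    def tripped(self):
        """True if no message has arrived within the timeout window."""
        return (self.now() - self.last_msg) > self.timeout
```

Logging a timestamp each time the watchdog trips would make it easier to correlate those 7 triggers with distance, terrain, or RF conditions.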
*Thread Reply:* @Al Jones just curious about your testing environment? Outside how far away?
*Thread Reply:* The distance from my Tractor and DIY radio control is typically 5-15 feet. I'm typically walking behind or beside the Tractor until I get more comfortable with it.
Messing around with pure pursuit and I need to set steer angle limits. Currently this statement (link) uses one value for 'steering_angle_limit' which I need to set to 48 degrees, the measured angle change from straight to a hard left turn. A hard right turn can go a bit higher, but I'll just stick with 48 degrees or 0.84 radians.
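Applying a single symmetric limit like that is just a clamp on the commanded angle. A minimal sketch, assuming the 48-degree figure above (the function name is illustrative, not the node's actual code):

```python
import math

# Symmetric steer limit from the measured hard-left angle (~0.84 rad).
STEERING_ANGLE_LIMIT = math.radians(48)

def clamp_steer_angle(angle_rad, limit=STEERING_ANGLE_LIMIT):
    """Clamp a commanded steer angle to the physical limit in both directions."""
    return max(-limit, min(limit, angle_rad))
```

Using the smaller of the two measured limits (left vs. right) keeps the clamp conservative on both sides.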
@Al Jones Yes that sounds correct. I'll make note of this and we can talk about it in the meeting tomorrow.
Info from today's video meeting.
===========================================
20230720 Lawn Tractor Automation meeting - Length 1:04:30 This video: https://youtu.be/QxoeBne2Xx0
Chat: Al: https://miro.com/app/board/uXjVM1yzdFo=/?share_link_id=539014993193
Index: 00:00 Jeff: Hasn't done anything. 00:20 Al: Publishing topics as arrays. Miro white board. Testing, changing steering parameters. 03:10 Al: Posts a link to his white board. Shows a few details. 09:50 Al: Has a clever way to measure wheel steer angle. And how he uses those numbers. 12:50 Difference between rotational velocity and steer angle. 16:45 Al: Talks about his LoRa timeouts. Some analysis of steering control. 32:00 Jeff: Suggests controlling the steering velocity. 40:40 Jeff: Talks about low level limit checking. 52:50 Jeff: The dividing line of control between ROS and your vehicle.
As I am screwing circuit boards down onto painted plywood, it occurred to me: maybe the solder side shouldn't be allowed to touch the painted plywood?
*Thread Reply:* Latex paint is a polymer (such as acrylic, vinyl acrylic, or styrene acrylic) suspended in water. These components do not conduct electricity well. If you used some other type of paint you can Google its components.
I would not expect paint to be a problem. But if it acts funny then maybe it is a problem.
@JeffS I could use your help when you have a moment. I'm field testing and the calculations for steering angle by pure_pursuit.cpp are going up and down. If you have a moment to jump on a call so I can explain I would appreciate it. I've taken some notes here: https://miro.com/app/board/uXjVM1yzdFo=/?moveToWidget=3458764560052849149&cot=14
@Al Jones Before I dig any further, I'm going to have to insist you give this a valid path to follow. Either fill in the "extra" points between your sparse points, or just create a path with that generate_points.py (or whatever it is called).
@Al Jones I'm looking at your Miro page. You have a block labeled "CALCULATING PURE PURSUIT PATH". It has some Python code and some C code (that is labeled pseudo code). What is that stuff? Did you run Python code or C code on the tractor? Did you run the same thing on both the simulation and the real tractor?
I'll have to find a different computer that has PlotJuggler loaded on it.
*Thread Reply:* "pseudo code" was not run. Those are just notes trying to look at different ways to calculate a look ahead value. 'pure_pursuit.cpp' is located here. pure_pursuit.cpp was run on both simulation and field test. Although pure_pursuit was the same in both settings, one significant difference is how the odom message is created. In the field test the odom comes from the gps. I think the tf too.
The video meeting is starting now. https://us02web.zoom.us/j/82088036016?pwd=K2lLc1FiWm9MU0dzRStxM2J2b3dpQT09
Info from today's meeting.
===========================================
20230727 Lawn Tractor Automation meeting - Length 1:08:43 This video: https://youtu.be/UV85342JqBE
Chat: Al: https://miro.com/app/board/uXjVM1yzdFo=/?share_link_id=10226424350 Al: https://github.com/jones2126/ros1_lawn_tractor_ws/blob/master/src/ackermann_vehicle/nodes/odom_from_wheel_and_gps.py Al: How did you plot it out in degrees instead of radians?
Index: 00:00 Jeff: Talks about Al's current experiments (Pure Pursuit) and shows plots. 33:30 Jeff: Suggests analyzing control inputs to Pure Pursuit controller. 34:50 Some discussion of where heading is coming from in both cases. 39:20 Al: Goes over some issues. 49:50 Tips/Hints when running PlotJuggler.
*Thread Reply:* Dog gone it @Al Jones ! Why do you have to find such complicated problems! Now I have to research all those terms so I can understand the language you guys are using!
Another live field test. I would score this run about 80 out of 100. Green is planned path, orange is actual. Bagfile link Corrections that I had to make were updating the IMU driver to better adjust for initial heading, changes to odom_from_wheel_and_gps.py to use the updated IMU orientation, and changing the launch file so pure pursuit used base_footprint instead of base_link. Video link
That looks much better. I don't know what you expect 100 out of 100 to look like... But if you would generate your path with a 1/10 meter spacing (instead of 1 meter spacing) and set your lookahead to 1 meter instead of 2 meters, then it will follow your path much closer.
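Filling in the intermediate points at a fixed spacing can be sketched as a simple linear interpolation between waypoints (a generic densifier, not the group's actual path generator):

```python
import math

def densify(points, spacing=0.1):
    """Insert intermediate points so consecutive waypoints are at most
    'spacing' meters apart. 'points' is a list of (x, y) tuples."""
    out = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dist = math.hypot(x1 - x0, y1 - y0)
        steps = max(1, int(math.ceil(dist / spacing)))
        for i in range(steps):
            t = i / steps
            out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    out.append(points[-1])  # keep the final waypoint
    return out
```

A 1-meter segment at 0.1 m spacing becomes 11 points, which gives a pure pursuit follower with a 1 m lookahead plenty of candidates to aim at.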
I finally got your video to play. It does look like it is very deliberate about what it is doing. If you turn the blade on it will leave a history of where it has been.
I tried reducing the step size in the path generator from 1 meter to 0.1 meters and reduced the look ahead from 2 meters to 1 meter in the pure pursuit .yaml file. What is odd is the first 'straight' tracked well, but the others did not perform as well. Will try different config settings to see the results.
*Thread Reply:* Couldn’t help noticing the similarities of the corrections during travel. Reminds me of a line following bot hunting back and forth because of the distance between sensors and speed of travel.
*Thread Reply:* Hunting back and forth seems to be exactly what it is doing.
I saw a YouTube video someplace about a guy using his iPhone and iPad over FaceTime. He used his R/C controller to control his lawn mower. His iPhone was on the mower and his iPad was with his R/C controller. He could "see" where the mower was going. It took me awhile to figure out how he did it. I ended up using two different emails in Gmail. Both are registered in my contacts. Long story short, it works. I'm itching to try it out when we get back home. We are at the lake. Next I want to try some kind of Zoom just over our home router. Just on our home network. Not going back over the modem to the outside world.
*Thread Reply:* @Bob Hassett If I go into Zoom and copy the meeting invitation below is what I get. So I don't think Zoom uses your phone number or email. You only need the URL below to join.
> AlJones is inviting you to a scheduled Zoom meeting. > > Topic: LawnTractorAutomation > > Join Zoom Meeting > https://us02web.zoom.us/j/82088036016?pwd=K2lLc1FiWm9MU0dzRStxM2J2b3dpQT09 > > Meeting ID: 820 8803 6016 > Passcode: 172310
*Thread Reply:* Thanks @Al Jones from what I’ve read, invitees don’t need a Zoom account. So I’m thinking I have to setup an account for the VexTractor with iPhone/iPad combo and then invite my device. I’ll have to decide which device will go on the tractor. Maybe the pad and then phone next to remote controller. I just keep experimenting.
More testing earlier today.
@Al Jones When you are running the Dubins path planner, what are you specifying as the radius? Was it the same every time you ran it?
I'm going to dig into your bag files, but I thought I would ask before I start.
*Thread Reply:* @JeffS turning_radius was 1.2, the same each time. I was trying to force a tight circle.
After the 8 tries I tried 3 more with these parameters: turning_radius = 2.0, step_size = 1.0, lookahead_distance = 2.0; turning_radius = 1.8, step_size = 1.0, lookahead_distance = 2.0; turning_radius = 1.6, step_size = 1.0, lookahead_distance = 2.0
None of those were any better. :(
*Thread Reply:* So what is the actual turning radius of your vehicle? If you set the steering to max right or max left, what is the resulting radius?
It doesn't do any good to tell it to turn sharper than what it is capable of doing.
That would be like asking it to drive forward at 100 mph, it can't do that. You have to stay within the physical limits.
*Thread Reply:* Based on tracking the x, y coordinates it appears the actual turning radius is 2 meters.
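For a sanity check, the bicycle model relates wheelbase and maximum steer angle to a minimum turning radius. A sketch using values mentioned earlier in this thread (wheelbase ~1.26 m, steer limit ~48 degrees); the model gives a lower bound, and real vehicles with tire slip and actual steering geometry typically turn wider, which is consistent with the ~2 m observed:

```python
import math

def min_turn_radius(wheelbase_m, max_steer_rad):
    """Bicycle-model minimum turning radius: R = L / tan(delta_max).
    This is a lower bound; real vehicles usually turn wider."""
    return wheelbase_m / math.tan(max_steer_rad)

# Values from this thread (assumed, not measured here):
print(min_turn_radius(1.26, math.radians(48)))  # ~1.13 m vs. ~2 m observed
```

If the Dubins planner is told turning_radius = 1.2 but the vehicle actually needs 2 m, the planned arcs are physically unachievable, which matches Jeff's point above.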
@Al Jones In your Miro board, you have a plot of yaw of GPS vs. IMU. Where is the GPS coming from? It does seem awfully noisy. And it is not following the IMU yaw closely. Is that being derived from /gps/vel, or somewhere else? Is it being created in gps_odom.py (or whatever it is called now)?
@Al Jones I'm sitting here watching the Ukrainian war update and casually browsing your documentation. It appears you are running ackermann_controller.py. That is specifically for the Gazebo simulation. I also see you are running vehicle_state_publisher, so I assume you are running the full URDF from the simulator. That may work, but 99% of that data is not needed. It may not be a problem, or it may be...
I came up with various thoughts... First, maybe your GPS yaw (is that COG [course over ground]?) is noisy simply because your engine is running and vibrating. Next time you are testing, come to a stop, turn off the engine, set the free wheel lever in the back, and just push the thing some distance across the yard. Then you can start the engine and get back to testing. Later check the bag file to see if the noise went away.
Another thing you could do is use a different pattern. Put your 3 vertical lines on top of each other so it drives up and down on the same exact path. (i.e., you drive back and forth between the same two points)
Make another pattern that is a square. That might tell you if the direction is causing some weird problem. Or maybe make a triangle. First leg north/south, next leg east/west and the final leg is a diagonal back to the start point. So that gives you 3 angles to try to follow.
Another thing you could try is to mark the points in your yard. Either just lay something there, or put a squirt of spray paint to mark it. Or push in some "big nail like things" at each location and stretch a string between them. With the stakes and string you have both end points marked and a visible line you can manually follow.
So let's assume you have a string between two points. Start your lawn tractor without starting Pure Pursuit. Manually drive up and down the line a few times. Then start over and start Pure Pursuit and leave it in manual. Then drive up and down the line a few times. That means you are manually driving on the line and you can see how Pure Pursuit is reacting to any errors you have trying to stay on the line. Then maybe start a new test with Pure Pursuit and put it in auto (just so you have a record of how it acted on the same day).
Another thing I thought of... You said you did something to artificially move the GPS location. Maybe you did that wrong. Try patching that out and see if it affects the waviness. I was going to suggest you temporarily move your GPS antenna directly over base_link, but I see you have a laptop computer in the way.
@JeffS Thanks for looking at it:
Messing around with wiper motors. This is running 12v continuous. I'm guessing it rotates back and forth about 120 degrees. It runs too fast. So maybe I step down the voltage supply to 3 volts to slow it down for my controller. Then figure out how to tell it where to go and hold. But it just seems like a lot of trouble to make it into a servo.
I suppose I could cannibalize an RC servo. Mount the pot on top of the yellow shaft. Or go on the top side and belt the rotating shaft over to a pot stem. Maybe add a small H bridge? Then there is figuring current draw. I don't know. PID control is getting too fancy I think? I want KISS.
The video meeting starts in one hour (from now). Noon Eastern time, USA. https://us02web.zoom.us/j/82088036016?pwd=K2lLc1FiWm9MU0dzRStxM2J2b3dpQT09
Useful links: • Video conference link (Zoom): https://us02web.zoom.us/j/82088036016?pwd=K2lLc1FiWm9MU0dzRStxM2J2b3dpQT09 ◦ (The Zoom video meetings are Thursday at Noon (USA ET); ◦ Saved videos (link); Saved chats (link) • You Tube Video of Meetings: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ • Indexes made by Jeff: http://sampson-jeff.com/RosAgriculture/ • Google drive folder (link); Sheet with current configurations (link); Juan's action list (link); Rough work plan (link); Shared catkin_ws copies (link) • Robot Agriculture: https://github.com/ros-agriculture • Simulator: https://github.com/ros-agriculture/lawn_tractor/blob/master/simulator.md • Discourse: https://discourse.ros.org/c/ros-agriculture/63 • Discourse – Perception: https://discourse.ros.org/t/perception-project/17621 • Archived RosAg Slack content (link) (edited)
Info from today's video meeting.
===========================================
20230803 Lawn Tractor Automation meeting - Length 1:09:39 This video: https://youtu.be/mtfZVkfVc6g
Chat: Al's Miro: https://miro.com/app/board/uXjVM1yzdFo=/?share_link_id=10226424350 Video where we talked about creating paths with the Dubins path planner tool. 03052021 Lawn Tractor Meeting https://www.youtube.com/watch?v=F3ycvmWkUl8 If you have access to the Google Drive folders... Jeff: Random scripts https://drive.google.com/drive/folders/1ZMUC3L8v2PiuP4K4nJRjU3QYYqJ0b65F I can't get a direct link to gps_heading_new.py. Just click on it...
Index: 00:00 Al: Starts status update. Talks about his Miro white board. 04:55 Al: Talks about his field testing results. 11:30 Jeff: Suggests the error cause might be cumulative. 17:35 Jeff: Asks where the GPS heading in Miro came from. Some confusion about the GPS predefined COG (Course Over Ground) and Al doing his own calculations and calling it COG. 20:45 Jeff: Mentions deriving SPEED and HEADING from the gps/vel topic. See link to scripts on Google drive (in chat section above). 24:00 Jeff: Suggests specific tests on a single vertical line pattern. 30:00 Jeff: Asks about Al's process for deriving GPS heading. Al pulls up Plot Juggler. 38:50 We go over how Al calculated what he is calling COG vs. "Real" COG that is provided directly in a GPS NMEA sentence. 41:30 Jeff: Attempts to do fancy math corrections in Plot Juggler, but fails. 48:20 We go look at Jeff's code to convert /vel topic back to speed and heading. 1:04:30 Question if Al is saving raw NMEA sentences. 1:07:25 Jeff: Starts playing around with Zoom settings and talks about bandwidth.
*Thread Reply:* Recommended parameters for wiper motors http://www.scary-terry.com/wipmtr/wipmtr.htm
Heading out to run a test that Jeff suggested to go up and down the same path.
21.5 3.3 1.64
21.5 13.3 1.64
21.51 13.3 4.78
21.51 3.3 4.78
21.49 3.3 1.64
21.49 13.3 1.64
21.5 13.3 4.78
21.5 3.3 4.78
I hooked up a Windows laptop with the OEM GPS software since I was confused by having "RTK Fix" yet the ROS lat/lon was jumping around. The lower right circle shows that the radius of the circle while the tractor is sitting still is 0.012 meters (i.e. 0.024 m, or about a 1" diameter). Something odd going on between the OEM software view and the ROS lat/lon view.
@Al Jones I'm just getting started for the day, I will think about this while I eat my lunch.
My theory on the circle is that you are attempting to correct your x/y position by multiplying the values by heading. So the noisy heading when you are stopped gives you that cute pattern.
And BTW, your OEM program seems to say you have a 1" circle, not a 9" circle.
*Thread Reply:* odom x and y are calculated using the 'll2xy' utility from
import geonav_transform.geonav_conversions as gc
...
self.x_gps, self.y_gps = gc.ll2xy(data.latitude, data.longitude, self.GPS_origin_lat, self.GPS_origin_lon)
I'm not sure whether heading is used by 'll2xy'.
I do adjust the position based on the physical position of the gps
x_offset = 0.51 # 20 inches in front of the rear axle
y_offset = 0.03 # 1 inch to the right of the center line
*Thread Reply:* This looks awfully suspicious:
"I do adjust the position based on the physical position of the gps
x_offset = 0.51 # 20 inches in front of the rear axle
y_offset = 0.03 # 1 inch to the right of the center line
"
You are trying to offset by 0.5 meter and your result is swinging around a point by 0.5 m (probably based on the incoming heading, which is noisy because you are stopped).
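Projecting the antenna position back to base_link requires rotating the body-frame offset by the vehicle heading before subtracting it. A hedged sketch of that transform (assuming an ENU yaw and a body frame with x forward, y left; the function name is illustrative, not the node's code):

```python
import math

def antenna_to_base_link(x_gps, y_gps, yaw, x_off, y_off):
    """Subtract the body-frame antenna offset (x_off forward, y_off left),
    rotated into the map frame by the current yaw, from the measured
    antenna position. Returns the estimated base_link (x, y)."""
    dx = x_off * math.cos(yaw) - y_off * math.sin(yaw)
    dy = x_off * math.sin(yaw) + y_off * math.cos(yaw)
    return x_gps - dx, y_gps - dy
```

One consequence: when the tractor is stopped and yaw is noisy, (dx, dy) swings around the antenna point, which would draw exactly the kind of circle discussed here. Latching the heading below a speed threshold is one way to suppress that.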
*Thread Reply:* Did not work so well to simply take the x, and y without offsetting for where the GPS actually sits....
*Thread Reply:* I assume now if you come to a stop it does not create that circle of spider webs.
So maybe if you fix the code that was meant to do the offset then maybe it will fix everything.
*Thread Reply:* When I stopped for ~7 seconds the drift was < 0.5 inches
I uploaded the fixed path_planner (the program that generates points for a Dubins path) to this directory: https://drive.google.com/drive/folders/1t2P2sfiranPe10fmUBUnnVvKNgrfybAV
It is in directory: path_planner_fixed_20230806
Unfortunately no improvement with tweaks to the offset algorithm.
Problem statement: The robot follows a path reasonably well on the first segment, but on subsequent segments the robot struggles to stay on path.
Miro whiteboard - GPS to base_link offset testing: https://miro.com/app/board/uXjVM1yzdFo=/?moveToWidget=3458764561123820016&cot=10 - changing the offset calculation slightly is the only recent change. Bagfile link
My first thought is that your offsets don't appear to be right.
I didn't have the geonav stuff loaded on this computer. So I just hard-coded current x/y as 0,0 since the GPS doesn't matter. But I am confused on the different coordinate frames.
(in bagfile 2023-08-08-13-25-19.bag) So I wanted to see if you have any rotation between your lat/lon and x/y. But your /fix appears to have invalid lat/lon; what happened to that? So I could not compare the two. So I am assuming the x/y plot from odom is the correct rotation.
But, rotations of 0 and 3.14 seem to be correct. I think 1.57 should be +x and +y. I think -1.57 should be -x, -y. (It didn't help that your diagrams in Miro are in a different order, and unlabeled)
Here are the numbers with GPS forced to 0,0.
```
jeff@jeff-hp-pavilion:~/Downloads$ python3 path_test_offset_1.py
Testing Yaw (in radians): 0
GPS coordinates (X, Y): 0 0

Testing Yaw (in radians): 1.57
GPS coordinates (X, Y): 0 0

Testing Yaw (in radians): 3.14
GPS coordinates (X, Y): 0 0

Testing Yaw (in radians): -1.57
GPS coordinates (X, Y): 0 0
Base link coordinates (X, Y): 0.1 0.51
------------------------------------------------
```
*Thread Reply:* Re: invalid /fix lat and lon....certainly for the first 2 minutes the lat and lon are invalid because they are zero. Valid data comes in after RTK Fix is achieved. Did you see other time periods with invalid data?
*Thread Reply:* Fair point on documentation...improved version...I'm still looking at your calculations
*Thread Reply:* I just realized I am wrong... 1.57 is up, not down. I was going the wrong direction.
*Thread Reply:* "Did you see other time periods with invalid data?"
I didn't see any valid lat/lon values. Maybe I misinterpreted what should be there.
I have a new change to try. I was comparing my pure pursuit code with https://github.com/larics/pure_pursuit/blob/master/src/pure_pursuit.cpp For some reason that I do not remember, it appears I commented out the statement below in June.
*Thread Reply:* That change did not help. Recently the steering of the first segment was not wavy and the subsequent segments were wavy. Now the first segment is wavy too. Which is maybe a clue. Continuing to debug.... :(
I added Al's lawn tractor video. I tried to hit all the keywords in the title. https://www.youtube.com/watch?v=HJcv3TwLYb8&list=PLdWg7SF1vdl0lg4vYOa5wIAXLIfPErG4_&index=7
The weekly video meeting starts in one hour. (Noon Eastern time) https://us02web.zoom.us/j/82088036016?pwd=K2lLc1FiWm9MU0dzRStxM2J2b3dpQT09
Info from today's video meeting.
===========================================
20230810 Lawn Tractor Automation meeting - Length 43:46 This video: https://youtu.be/4y8eaCSiYpU
Chat: https://f1tenth.org/ Al's Miro page: https://miro.com/app/board/uXjVM1yzdFo=/?share_link_id=10226424350
Index: 00:00 Jeff: Says he finally got his fiber Internet connected. Says he should be installing a laptop back on his robot. Suggests doing creative things with the simulator for Al's project. 00:50 Al: Talks about his experiments to solve his "wavy" path. 01:50 Al: Mentions his GPS remapping code. Ideas for fixing the path. 04:20 Al: Is the GPS accurate? Steering? 06:30 The original Pure Pursuit paper from 1992. 07:10 Jeff: Asks if Al has tried driving manually around the path. 09:45 Jeff: What are the inputs to Pure Pursuit? 11:10 Al has compared his Pure Pursuit code to Juan's code and the original code. 11:40 The F1TENTH project. 13:10 We look at Al's Miro page. 15:40 Extra plotting values for debug. 17:05 We go look at GPS values in the bag files. 36:45 Jeff: Says he will try harder to get his robot running again. 38:00 Jeff: Asks about cell phone viruses. 42:00 Jeff: Again mentions trying to break the simulator for testing purposes.
Continuing to debug. Not much luck. I spent yesterday writing a little script that hopefully follows the pure pursuit conventions. It is neither elegant nor compact. It is written to show step by step how pure pursuit works and to visualize the key data elements. (link) I did this so I could compare the steer angle being published by the pure pursuit (link). I already knew there was a problem, but I wanted to know what the correct values should be. Here is a screencast of walking through some PlotJuggler data (link) that includes a screen print of the pure pursuit 'helper' that I uploaded. Now I'm looking at how the steer angle is being calculated and working backwards to see where the calculation is going awry.
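For reference while comparing values, the textbook pure pursuit steering law is compact. A sketch of the standard formulation (not Al's helper script or pure_pursuit.cpp; ENU yaw assumed):

```python
import math

def pure_pursuit_steer(pose, goal, wheelbase, lookahead):
    """Classic pure pursuit steering law.
    pose = (x, y, yaw) of the vehicle; goal = the lookahead point (x, y)
    on the path. Returns the steer angle (rad) that drives an arc
    through the goal point."""
    x, y, yaw = pose
    # Angle from the vehicle's heading to the lookahead point
    alpha = math.atan2(goal[1] - y, goal[0] - x) - yaw
    # delta = atan(2 * L * sin(alpha) / lookahead)
    return math.atan2(2.0 * wheelbase * math.sin(alpha), lookahead)
```

Feeding the same pose/goal pairs from a bag file through a reference like this, and diffing against the published steer angle, is a quick way to localize where a real implementation diverges.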
*Thread Reply:* @Al Jones Thanks for the exceptional documentation and extensive details of your project!
*Thread Reply:* @Bob Hassett No prob. Miro is a nice tool for laying info out. It still feels cluttered so if you have a suggestion on a high level table of contents or structure I'm looking for input.
One of my issues at the moment is the steering performance in simulation is fine, but in the real world it's not. A bit of a walk through of that challenge is attached. In summary, the statement from simulation 'result = get_model_srv(model)' populates a transform. The 'y' value is used in calculating the steering angle and I need to replicate that in my real world calculations. At least that is what I think my key challenge is at the moment.
The weekly video meeting starts in one hour. (Noon, Eastern time, USA) https://us02web.zoom.us/j/82088036016?pwd=K2lLc1FiWm9MU0dzRStxM2J2b3dpQT09 (Reminders won't always be posted, just when someone remembers to do it..)
I did find out that gedit does not do automatic periodic backups unless you specifically turn it on. So now I have to re-do about 1/2 hour of my video indexing...
*Thread Reply:* Try revoldiv.com for next meeting! Works great for me. Or otter.ai, but that would be paid for longer meetings.
So the brain-bending conversation yesterday about odom statements, tf's and KDL frames has me looking at the python script 'odomfromgazebo.py' which is used in the Gazebo simulation. I'm noticing how 'odom.twist.twist.linear.x' is set in line 79. The formula 'odom.twist.twist.linear.x = float(sqrt(x ** 2 + y ** 2))' suggests the vehicle's velocity in the Gazebo simulation is not purely along the x or y axis but a combination of both. I'll call it a magnitude of velocity that is assigned to the odom.twist.twist.linear.x field. I am definitely not doing that in my real world equivalent script that builds the odom statement (odom_from_wheel_and_gps.py). See line 258 - I'm sooo twisted up (pun intended) 🙂 .....
You are mixing two different things here. Your code at line 258 should be correct assuming you created the forward velocity (of the robot, in the robot frame) correctly in your wheel odometry code
But the lines at 77 and 78 are a "velocity vector - in the world (map) frame". So that says the point is moving x velocity in the x direction and y velocity in the y direction (in the world). Line 79 takes the length of the vector. Which is the velocity in the direction you are moving. You can also extract the direction of travel in the world.
If you look at this code gps_heading_new.py
in this directory https://drive.google.com/drive/folders/1ZMUC3L8v2PiuP4K4nJRjU3QYYqJ0b65F it subscribes to /vel which is published by nmea_navsat_driver. It extracts the forward velocity from the vector (m/s) and extracts heading from the vector (rad). Then it does various other things, like latching the heading at low speed. If you want to use this, keep in mind it may need degree to radian conversions and NED to ENU conversions. I don't get line numbers but here is an extract:
# New heading degrees
# yaw_d = float(atan2(y,x))
yaw_d = float(atan2(x,y))
yaw_d = self.yaw_to_degree(yaw_d)
speed = sqrt(x ** 2 + y ** 2)
self.speed_pub.publish(speed)
self.head_deg_pub.publish(yaw_d)
if speed > 0.1:
    self.head_deg_latched_pub.publish(yaw_d)
    self.last_yaw = yaw_d
else:
    self.head_deg_latched_pub.publish(self.last_yaw)
But these lines do the actual conversion:
yaw_d = float(atan2(x,y))
speed = sqrt(x ** 2 + y ** 2)
This whole thing needs to be rewritten because it is a total hack.
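A cleaned-up sketch of just the extraction step (assumptions: vx, vy are ENU world-frame velocity components in m/s; note the extract above uses atan2(x, y), which is the NED convention, while ENU/REP-103 heading is atan2(vy, vx)):

```python
from math import atan2, sqrt

def vel_to_speed_heading(vx, vy):
    """Convert a world-frame (ENU) velocity vector into speed (m/s) and
    heading (rad, 0 = east, counterclockwise positive per REP-103)."""
    speed = sqrt(vx ** 2 + vy ** 2)  # length of the velocity vector
    yaw = atan2(vy, vx)              # direction of travel in the world frame
    return speed, yaw

speed, yaw = vel_to_speed_heading(3.0, 4.0)
print(speed)  # 5.0
print(yaw)    # ~0.927 rad
```

The low-speed latching from the extract above would wrap this: only republish yaw when speed is above some threshold (0.1 m/s in the original), otherwise hold the last good value.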
I have to decide what to do today. I worked on my robot all day yesterday. I got numbers coming out. I need to fire up Plot Juggler and see if ANY of the headings it is producing are correct. If not, I will have to cobble something together to get one that is valid. Then see if my modified cmd_vel makes it all the way to the steering and drive motor. So I could look at that.
When I was indexing the meeting video yesterday my computer went belly up (actually more of a belly flop). Turned out the AC plug was loose. So I see it doesn't die gracefully as the power gets low (which is bad for my robot). So I backed up to recover my indexing and quit at about the halfway point (of the 2 hour video). So I could finish that.
Another interesting point... I assumed my new fiber optic Internet would upload to YouTube somewhat faster. The 2 hour video yesterday normally would have taken about 2 hours to upload. It took less than 2 minutes. So I'm pretty happy about that. 😍
@Jash Mota I briefly looked at those two web sites. But I don't see how they would help with my indexing task. Quite some time ago @Juan Eduardo Riva had suggested automatically producing transcripts, but at that time I couldn't see how transcripts would help.
Info from this week's video meeting. Mostly debug session for Al's lawn tractor using Pure Pursuit. So you can decide if that is of interest to you.
===========================================
20230817 Lawn Tractor Automation meeting - Length 1:54:05 This video: https://youtu.be/dsDderfhk_M
Chat: Al: https://drive.google.com/drive/folders/1nos6IM9X7Go81RT0_Dm4SR7izqPRxe90
Index: 00:00 Jeff: Updates - Says he is going through the steps to run his robot outside. GPS correction data, marked points in yard, move path to the robot computer. 03:35 Jeff: Has given some thought to simulator changes. 07:00 Jeff: Thoughts on documenting the simulator package we have. 08:40 Jeff: Tells about old guys mowing the yard. 09:30 Al: Tells us where he is in his development journey. We look at some plots, comparing actual vs. simulation. 18:40 Al: Shows his program that graphs the Pure Pursuit values. 30:00 We are discussing how Off-Track-Error is calculated. 40:00 We are now trying to interpret what the transform between baselink and lookahead actually represents. 55:40 Jeff: Warns that not all functions in ROS treat quaternion order as x,y,z,w. He has seen a function or two that use the order w,x,y,z. (Or maybe it is ROS vs. non-ROS functions.) So that is something to watch out for. 1:04:40 Jeff: Suggests plotting the baselink to lookahead transform. 1:09:00 We finally get around to starting Plot Juggler. We get a plot of simulated data showing that base_link to lookahead transform is a difference between the two points. Much analysis ensues. 1:18:40 We now go look at data from the physical lawn tractor. Much confusion because the screen dump wasn't saved at time 0. 1:24:15 "Oh, there's yer problem!" We noticed the scale at the bottom is wrong. We re-plot and screen dump. We do more analysis before moving on to the non-simulated data. 1:32:30 We switch to the non-simulated data. We move the vehicle to the point where RTK kicks in. Much analysis as the vehicle approaches the initial path point. 1:41:55 Jeff: Asks if Al has verified if the vehicle is doing what he asks. 1:45:30 Jeff: Asks if the wheels are actually wandering back and forth when going straight. 1:48:10 Jeff: Asks about plotting real time data in Plot Juggler. 1:49:55 Jeff: Asks where Al's bag files are stored.
==
Link to access the Slack channel: https://tractorautomation.slack.com/
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos
Index for Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/LawnTractorMeetingNotes.txt
Index for the original ROS Agriculture group (~250 videos): http://sampson-jeff.com/RosAgriculture/readme.txt (Use following link, or newest file in that directory) http://sampson-jeff.com/RosAgriculture/ros-agriculture-youtube20230104.txt
Yesterday I plotted my recent test run. Since it was probably a year since I had driven it I wanted to see what still works.
Of my headings, my best source is the GPS static base line output (as expected). The second best is the BNO055 IMU. They are both outputting NED format as degrees. The BNO055 has an extra 185 degree offset.
So I need to fix one (or both) of those to give ROS compatible heading. Then I need to modify (or rewrite) one of my gpstoodom.py programs to work. Then I can take it outside to see what the next thing is that doesn't work. Maybe that will happen tomorrow...
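The fix-up I have in mind could be sketched like this (assumptions: the heading arrives as NED degrees with a constant mounting offset in degrees, and ROS wants an ENU yaw in radians per REP-103; 185 is just the BNO055 offset mentioned above):

```python
from math import radians, pi

def ned_deg_to_enu_rad(heading_deg, offset_deg=0.0):
    """Convert a NED compass heading in degrees (0 = north, clockwise)
    plus a constant mounting offset into an ENU yaw in radians
    (0 = east, counterclockwise), normalized to [-pi, pi)."""
    corrected = heading_deg - offset_deg   # remove the mounting offset
    enu_deg = 90.0 - corrected             # NED -> ENU axis swap
    yaw = radians(enu_deg)
    return (yaw + pi) % (2 * pi) - pi      # normalize to [-pi, pi)

print(ned_deg_to_enu_rad(90.0))               # heading east -> 0.0 rad
print(ned_deg_to_enu_rad(0.0))                # heading north -> pi/2 (~1.5708)
print(ned_deg_to_enu_rad(275.0, offset_deg=185.0))  # offset removed -> east, 0.0
```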
I had it all put together and my steering doesn't work in ROS mode. After a bunch of digging, I found in my "vehicle interface node" that it was blatantly multiplying the steer angle by 400; I don't know why. Maybe because it was originally a value of rotational velocity. I will try it again before I go to bed to see if I have any steering now.
What are the pros and cons of using ArduPilot? $400 is kinda steep but seems it shortens development time?
*Thread Reply:* I used ArduRover for my JD 5055 tractor before trying ROS. Pros - You get an integrated package with the hardware, Mission Planner and ArduRover. Cons - Once I had it working the path tracking was "weavy" and trying to change the codebase was beyond my skills. If you google ArduRover you will find others have seemingly gotten it to work. https://mowerproject.com/page/2/
*Thread Reply:* Any interest in discussing this gps problem tomorrow? Compare Jeff’s and Al’s experience and hardware/software solutions?
*Thread Reply:* We can discuss anything... I don't know what you mean by "this gps problem". I'm having problems with my cell phone GPS correction data. But I will put that in a different message.
*Thread Reply:* I was referring to the gps problems/solutions described on page 2 of https://mowerproject.com/page/2/ Scroll down to his "Better RTK fix": a local RTK base versus remote sources like what Jeff is using.
*Thread Reply:* I must have missed your source in your question. I see the guy at https://mowerproject.com has lots to say. But I don't have the ambition to read all of it before a meeting tomorrow. I will go into a meeting tomorrow with an open mind.
*Thread Reply:* He was talking about a u-Blox NEO-M8N and a ZED-F9P. What are folks using on a base station and on their machines? Any problems if the F9P drops out? How quickly can it lock on to another signal?
Earlier, maybe Sunday, I noticed my remote control box was flakey. The ROS/manual switch was flicking back and forth. (I could see my beacon changing colors.) If I tapped on the panel it would randomly change, which indicated a loose connection. I later opened it up to see what was up. My toggle switch for ROS/manual has screw terminals. They were both loose. I tightened those and the problem seems to have gone away.
I went out Monday to see if any of my new code works. My cell phone correction data kept randomly disconnecting. It gave no indication or error messages. It just quit. So everything failed. A week or two earlier I was able to drive an entire path but it took about 3 times of reconnecting until I thought it was stable.
So what is wrong? It could be the cell connection is dropping out. It could be the USB connection to my FTDI adapter is flakey. Maybe their software just doesn't work.
I checked and there is a software update for GNSS Master available so I loaded that. I could wiggle the USB cables. I could switch from cell data to WiFi to see if that makes a difference.
If I don't find any immediate solutions I may switch over to my telemetry radios.
The weekly video meeting starts in one hour. (Noon, Eastern time, USA) https://us02web.zoom.us/j/82088036016?pwd=K2lLc1FiWm9MU0dzRStxM2J2b3dpQT09 (Reminders won't always be posted, just when someone remembers to do it..)
Info on using MNDOT for correction https://www.dot.state.mn.us/surveying/cors/
After the Zoom meeting I started rethinking my tractor project. Chances are we'll sell the tractor next year. So I'm going to redirect my efforts to a Barbie Jeep or smaller. Not going the riding mower route. So I'll take everything off the tractor. I'll start shopping for an F9P and a surveyor grade antenna. For correction data maybe I'll look into using internet and WiFi.
Info from today's video meeting.
===========================================
20230824 Lawn Tractor Automation meeting - Length 55:21 This video: https://youtu.be/p-y0u9rHv_Y
Chat: Jeff: https://www.ardusimple.com/introducing-gnss-master-app/ Jeff: http://lefebure.com/software/android-ntripclient/ Jeff: https://rtklibexplorer.wordpress.com/2016/09/08/adding-a-radio-link/ Jeff: AT commands for the telemetry radios. https://ardupilot.org/copter/docs/common-3dr-radio-advanced-configuration-and-technical-information.html
Index: 00:00 Jeff: Current quest for GPS correction data. Cell phone method doesn't work. Other cell phone options. 04:20 Jeff: Telemetry radios with RTKlib ntrip utility. Power problems. (And fixed.) Baud rates. 07:00 Jeff: ESP32-CAM board. 09:00 Jeff: Or just use the laptop computer directly. 09:20 Jeff: Tweaking software to make this work. Static heading from second GPS. 12:30 Jeff: Maybe make sure BNO055 heading is also valid as a backup. 13:00 Jeff: Actual testing stories. A quick hack to always take forward speed from joystick. 14:40 Al: Asks about the AT commands for the radios. (Jeff eventually posts a link in chat.) 15:30 Bob: Asks what GPS hardware is being used. 16:55 Al: Asks about the joystick speed control hack. 18:05 Bob: Asks about using free public correction data as opposed to creating a base station. 20:20 Al: Asks about viewing correction data that has been received at the GPS. 23:45 Bob: Asks for a description of Al's GPS base station. 27:10 It is suggested to search the index (for both Lawn Tractor Automation and ROS Agriculture) and find Al's previous base station presentations. The links are in the description below. Discussion about using cheap non-RTK receivers. 33:30 We talk about the mower bot article. 40:10 LORA vs. Telemetry radios. 43:00 Al: Gives his update. Talks about Pure Pursuit. 48:45 We talk about creating location marks in your yard. Or using existing lines in parking lots.
==
I got my telemetry radios working again so I will go outside tomorrow and see if I have decent GPS data...
In general my GPS "appears" to work better with cell phone correction data as opposed to laptop correction data. Maybe just my imagination? The other problem is that the cell phone keeps dropping the connection. I found that when the screen goes blank, it drops the connection. A work-around is to set the screen timer to 10 minutes. It won't let me disable it completely.
So I will go play with that for a couple of hours and see how well everything is working...
@Al Jones I finally got my mini-tractor to work as badly as Al's lawn tractor. But that is a massive step up from not working at all. I'll post some plots when I get them done.
So this is what my first plots look like. The path is hacked up for some reason. My vehicle is also steering full left and full right. I was controlling forward velocity with my joystick. I had just replaced my pure_pursuit.py with Al's version because he outputs more debug data.
Now I want to run it in RVIZ and see what I get.
Ya know Bob... This is a good time to get your Barbie Jeep running. Since the Barbie movie is doing so well.
This is what RVIZ thinks about my test. The path is not hacked up. My lookahead transform is bouncing around (which you can't see here). But is much better than it was before. My lookahead transform is advancing now but it gets lost a few times.
But this gives me stuff to dig through and I can make my own tweaks and rerun it... And feed these values into Al's Pure Pursuit Helper (or whatever he called it) to see if they compare.
The weekly video meeting starts in one hour. (Noon, Eastern time, USA) https://us02web.zoom.us/j/82088036016?pwd=K2lLc1FiWm9MU0dzRStxM2J2b3dpQT09 (Reminders won't always be posted, just when someone remembers to do it..)
Info from today's video meeting.
===========================================
20230831 Lawn Tractor Automation meeting - Length 59:12 This video: https://youtu.be/bo36buodKUk
Chat: None
Index: 00:00 Jeff: Quick story about telemetry radio range test. 01:30 Jeff: Cell phone correction data testing. 02:20 Jeff: Outdoor test results. (Physical vehicle) 08:30 Jeff: These are the same results in RVIZ. 11:10 Jeff: Results from simulated vehicle. 14:40 Jeff: Switch to Al's Pure Pursuit code. 15:40 Jeff: Switch from gazeboodom.py to gpsodom.py. 21:35 Jeff: Reiterates that switching the simulator to its GPS output may make things clear. 23:30 Jeff: Still need to experiment to see which heading to use from the simulator. 25:00 Al: Asks Bob to go ahead. Bob talks about his Barbie Jeep. Adding extra traction strips to the plastic wheels. 30:30 We attempt to look at Bob's pictures. 41:10 Al: Shows an adapter he made for a Barbie motor for his tractor steering. 43:45 Al: Shows a GPS accuracy plot. 45:10 Al: Shows an outdoor vehicle plot. And explains a problem. 49:45 Jeff: Asks about bad/intermittent connections with USB connectors. 55:50 Jeff: Mentions GPS drop-outs.
==
I wanted to confirm I understood the geonav.ll2xy function that converts lat lon positions to x, y positions. In addition to the geonav function I found two other approaches and compared them using this code. The results below were very similar. I think if I tested on a football field with markings I might be able to tell which I preferred.
Lat, Lon and corresponding X, Y coordinates:
=============================================
Lat: 40.34530097, Lon: -80.12894139
latlon_to_xy => X: 0.391, Y: -4.434
geonav.ll2xy => X: 0.435, Y: -4.417
cmurphy_ll2xy => X: 0.392, Y: -4.422
-------------------------------
Lat: 40.34530663, Lon: -80.12887741
latlon_to_xy => X: 5.820, Y: -3.804
geonav.ll2xy => X: 5.863, Y: -3.736
cmurphy_ll2xy => X: 5.828, Y: -3.795
Distance between this point and previous using latlon_to_xy: 5.465 meters (215.160 inches)
Distance between this point and previous using geonav.ll2xy: 5.471 meters (215.380 inches)
Distance between this point and previous using cmurphy_ll2xy: 5.473 meters (215.457 inches)
-------------------------------
Lat: 40.34526831, Lon: -80.12887049
latlon_to_xy => X: 6.406, Y: -8.069
geonav.ll2xy => X: 6.492, Y: -7.983
cmurphy_ll2xy => X: 6.416, Y: -8.049
Distance between this point and previous using latlon_to_xy: 4.305 meters (169.504 inches)
Distance between this point and previous using geonav.ll2xy: 4.294 meters (169.036 inches)
Distance between this point and previous using cmurphy_ll2xy: 4.295 meters (169.090 inches)
-------------------------------
Lat: 40.34526175, Lon: -80.12895261
latlon_to_xy => X: -0.561, Y: -8.800
geonav.ll2xy => X: -0.475, Y: -8.780
cmurphy_ll2xy => X: -0.561, Y: -8.778
Distance between this point and previous using latlon_to_xy: 7.005 meters (275.792 inches)
Distance between this point and previous using geonav.ll2xy: 7.012 meters (276.076 inches)
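For reference, a minimal equirectangular approximation (a hypothetical stand-in; the actual latlon_to_xy / cmurphy_ll2xy implementations may differ, and the origin here is just an arbitrary datum near the test points) lands within a few millimeters of the distances in the table above:

```python
from math import cos, radians, hypot

R_EARTH = 6378137.0  # WGS-84 equatorial radius, meters

def latlon_to_xy(lat, lon, origin_lat, origin_lon):
    """Approximate local x (east), y (north) in meters relative to an
    origin, using an equirectangular projection. Fine for small areas."""
    x = radians(lon - origin_lon) * R_EARTH * cos(radians(origin_lat))
    y = radians(lat - origin_lat) * R_EARTH
    return x, y

origin = (40.34530555, -80.12894600)  # hypothetical datum near the test points
p1 = latlon_to_xy(40.34530097, -80.12894139, *origin)
p2 = latlon_to_xy(40.34530663, -80.12887741, *origin)
print(hypot(p2[0] - p1[0], p2[1] - p1[1]))  # ~5.465 m, matching the first table entry
```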
Here is a cute write up that was posted to Home Brew Robotics Group: https://blog.arduino.cc/2023/09/05/tiny-tesla-go-kart-gains-self-driving-autopilot/
The weekly video meeting starts in 30 minutes. (Noon, Eastern time, USA) https://us02web.zoom.us/j/82088036016?pwd=K2lLc1FiWm9MU0dzRStxM2J2b3dpQT09
Info from today's meeting.
===========================================
20230907 Lawn Tractor Automation meeting - Length 56:15 This video: https://youtu.be/RlrscSc8OtE
Chat: https://miro.com/app/board/uXjVM1yzdFo=/?share_link_id=10226424350 https://github.com/jones2126/ https://github.com/jones2126/ros1_lawn_tractor_ws/blob/master/src/ackermann_vehicle/nodes/odom_from_wheel_and_gps.py
Index: 00:00 Al: Shows some plots from his path following experiments. 02:00 Al: Talks about testing various lat/lon to x/y programs. 06:40 Jeff: Talks about experiments with Pure Pursuit. Differences between location on the simulator and the real world. 11:00 Jeff: Suggests making a video of this debugging process. 12:30 Jeff: Lookahead transform jumps around when played back on RVIZ. Some description of how the code works. 23:45 Jeff: Discusses his multiple sources of heading info. And compatibility. 27:20 Jeff: Dealing with different data types for heading info. 31:30 Jeff: Single GPS heading vs. dual GPS heading (just an observation). 32:55 Jeff: A comment about trusting static heading from dual GPS. 33:40 Jeff: Summation of current status and where to go from here. 34:00 Al: Shares some info from his Miro white board. And the data that he collects. 35:30 Al: Mentions a new IMU board he was notified about. (Sparkfun BNO086) 36:40 Jeff: Suggests adding a second IMU to complement the first one. 41:20 Jeff: Asks how Al created the pose values that he published with his augmented code. 46:50 Where is the source code for TransformToBaselink? 55:15 Parting thoughts. (YouTube maintenance)
==
Interesting paper regarding an adaptation of pure pursuit named Regulated Pure Pursuit which seems to also be incorporated into ROS2 Nav2 https://github.com/ros-planning/navigation2/tree/main/nav2_regulated_pure_pursuit_controller
https://arxiv.org/pdf/2305.20026.pdf
It still doesn't solve the problem of not being able to drive a straight line consistently, but it is interesting to see their code.
Jeep Project …..Current Plan subject to changes at any time ;-) https://miro.com/app/board/o9JkzvV0Pw=/?sharelink_id=866627361216
The weekly video meeting starts in one hour. (Noon, Eastern time, USA) https://us02web.zoom.us/j/82088036016?pwd=K2lLc1FiWm9MU0dzRStxM2J2b3dpQT09 (Reminders won't always be posted, just when someone remembers to do it..)
Info from today's video meeting.
===========================================
20230914 Lawn Tractor Automation meeting - Length 35:28 This video: https://youtu.be/c2cYz8fxTKw
Chat: https://miro.com/app/board/uXjVM1yzdFo=/?share_link_id=10226424350 https://github.com/jones2126/ Regulated Pure Pursuit https://arxiv.org/pdf/2305.20026.pdf
Index: 00:00 Jeff: Comments and thoughts on Pure Pursuit. 04:30 We exchange ideas on speed control. And transforms. 13:00 Al: Status update. We look at notes on his Miro board. Some plots from a couple of simulated tests. 29:30 Al: Mentions reference to Regulated Pure Pursuit (used in ROS2). 31:20 Jeff: Mentions how to fix what Slack just forced on us. 34:25 Next testing steps.
==
Here is an interesting thread: Future of ROS 2 GPS Support I found this reference on Home Brew Robotics list.
What a coincidence, I have been looking at this stuff.
I notice that a slower GPS update rate makes it worse. My real vehicle is set at 5 Hz. I have run the simulator at 5, 10, and 20 Hz.
But I still have the same issue that it works better on the simulator (as opposed to my real vehicle).
I'm at the point where I think the steering speed may also add to the problem. The only way to speed up the steering on my real vehicle is to swap out the R/C servo.
But I'm about to try slowing down the steering on the simulator to see if I can break that.
By slowing down your lawn tractor forward speed you are leaving the GPS rate fixed. But you are making your steering appear faster. So these different observations are starting to come together...
So I looked at plots for my vehicle. It looks like it goes from full right to full left (which is currently limited to +/-0.4 radians). And it looks like it takes 3 seconds to do that. (That seems way too slow.) So I divided 0.8 radians by 3 seconds to get 0.2666 rad/s.
I added a new line: steeringanglevelocity to the config file. And ackermann_controller.py seems to honor that in Gazebo.
I ran steering velocities of 0.25, 0.5 and 1.0. It's really broken at 0.25 and gets almost usable at 1.0. I'm assuming if I change either GPS update rate and/or forward velocity that it will interact with the steering velocity to affect overall results.
I'm thinking that any of the following will help: Increase GPS update rate (if possible), slow down the vehicle (maybe slower than what you want), increase steering speed (again, maybe not possible).
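Rough numbers behind those options (a sketch with assumed values; the 5 Hz rate, 0.4 rad steering limit, and ~0.2666 rad/s slew rate come from the posts above, the speeds are just examples):

```python
def fix_spacing_m(speed_mps, gps_hz):
    """Distance traveled between GPS fixes (meters)."""
    return speed_mps / gps_hz

def lock_to_lock_s(max_steer_rad, steer_rate_rad_s):
    """Time to swing the wheels from full right to full left (seconds)."""
    return (2 * max_steer_rad) / steer_rate_rad_s

print(fix_spacing_m(0.9, 5))        # 0.18 m between fixes at 0.9 m/s, 5 Hz
print(fix_spacing_m(0.9, 20))       # 0.045 m between fixes at 20 Hz
print(lock_to_lock_s(0.4, 0.2666))  # ~3 s, matching the observed slew time
```

So raising the GPS rate or lowering the speed shrinks the distance between steering updates, while a faster servo shrinks how stale each commanded angle is; all three attack the same lag.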
But I'm thinking that there may be something fundamentally wrong with the existing Pure Pursuit code. Or we may have to do what everyone expects and fire up robot localization to fill in the extra points between GPS updates.
Maybe I will look at the original Pure Pursuit paper and compare that to what I know about the existing code. It seems like the existing code is highly reliant on the GPS update rate, which directly controls the odom update rate, which directly controls the Pure Pursuit calculation rate. It doesn't seem like that is what the original paper expected... It had lots of right triangles and a big circle...
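For reference, the geometry in the 1992 paper reduces to one step that has no notion of update rate: the curvature of the circle through a lookahead point at (x, y) in the robot frame is 2y/Ld^2. A minimal sketch (the wheelbase value is just an example, and the Ackermann conversion at the end is an assumption about how the curvature gets turned into a steer angle):

```python
from math import atan, hypot

def pure_pursuit_steer(x, y, wheelbase):
    """Steering angle (rad) to drive an arc through a lookahead point at
    (x, y) in the robot frame: curvature = 2*y / Ld^2 (the circle from the
    paper), then Ackermann: delta = atan(wheelbase * curvature)."""
    ld = hypot(x, y)                 # lookahead distance
    curvature = 2.0 * y / (ld ** 2)  # circle through robot and lookahead point
    return atan(wheelbase * curvature)

print(pure_pursuit_steer(2.5, 0.0, 1.2))  # point dead ahead -> 0.0
print(pure_pursuit_steer(2.0, 1.0, 1.2))  # point to the left -> positive steer
```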
The weekly video meeting starts in one hour. (Noon, Eastern time, USA) https://us02web.zoom.us/j/82088036016?pwd=K2lLc1FiWm9MU0dzRStxM2J2b3dpQT09 (Reminders won't always be posted, just when someone remembers to do it..)
Video of running test path (link) using Larics pure pursuit path tracking. Speed ~ 0.75 m/s, Look ahead @ 2.5 meters.
Info from today's video meeting.
===========================================
20230921 Lawn Tractor Automation meeting - Length 52:17 This video: https://youtu.be/0IJGgtXB_y4
Chat: https://miro.com/app/board/uXjVM1yzdFo=/?share_link_id=10226424350 https://github.com/jones2126/ Al's lawn tractor video. https://youtu.be/R6Qf-qMG5DY
Index: 00:00 Jeff: Comments and thoughts on Pure Pursuit. Simulator vs. real vehicle. Effects of GPS update rate. 03:00 Jeff: Steering speed. Simulator results from changing steering rate. 10:55 Jeff: Why slowing down GPS update rates affects the path. 14:00 Jeff: I'm still confused why it works better on the simulator than with a real vehicle. 16:40 Jeff: Summary of changes that may help. 22:30 Jeff: Another summary. 22:50 Jeff: Bite the bullet and run Robot Localization. 24:35 Jeff: Quick hacks to increase GPS update rates. 26:55 Jeff: Praise for the simulator. 38:10 Al: Notes and examples from his Miro Board. Speeds and resolutions. 45:10 Al: Mentions his latest video. 45:30 Al: Next steps. 46:25 Al: Asks how to get position values on Plot Juggler. 49:00 Al: Wants to create a coverage plot for testing. GPS rates. Adaptive look-ahead in Pure Pursuit. 50:45 Jeff: Comments on Al's latest video. (I will upload it to YouTube and add a link in chat.)
==
Planned and actual path driven below. I've seen a couple of different path planning tools that I will have to see if I can dig up again. The plan below was built without automation so it is pretty basic.
Video of a longer path I ran today. Maybe tomorrow I'll turn the blade on and cut some grass. There is a brief display of the electrical components of my system at the end. https://drive.google.com/file/d/1B_2FpcaEXHaq5vypOjC5pVmJ9xynpHql/view?usp=sharing
Here is Al's latest video on YouTube. (Nice dog. It's good you are getting your dog interested in robotics.) https://www.youtube.com/watch?v=FbUeUImJztY
Still working on my path planning. I've written a couple of programs with different approaches.
My path planning process is still clumsy, but this is what I have so far. I ran the mission in the real world at ~0.9 m/s shown in Step 5. It is more "wavey" than I would like, but is reasonable. When I convert my John Deere tractor I will be able to drive the mission and capture the path. That will simplify things. I also have to work on the path to end the mission (i.e. the white space in the middle).
I'm baffled...
At line 171 of https://github.com/larics/pure_pursuit/blob/master/src/pure_pursuit.cpp
```
if (!path_.poses.empty() && idx_ >= path_.poses.size())
{
  // We are approaching the goal,
  // which is closer than ld

  // This is the pose of the goal w.r.t. the base_link frame
  KDL::Frame F_bl_end = transformToBaseLink(path_.poses.back().pose, tf.transform);
```
I see reference to
path_.poses.empty() path_.poses.size() path_.poses.back().pose
Somewhere else it says
path_.poses[idx_].pose
I assume that refers to a specific entry in poses
I assume empty and size refer to the array poses inside the path.
But I can't figure out what
path_.poses.back().pose
is doing, since it doesn't reference a specific pose location (idx_ is not specified).
So my question is, can anybody tell me where empty, size and back are defined? Or what exactly they are doing? (edited)
*Thread Reply:* Maybe I'm responding too simply, but path_.poses.back().pose is the pose of the final goal in the path, and KDL::Frame F_bl_end = transformToBaseLink(path_.poses.back().pose, tf.transform); is transforming this pose to the robot's base_link frame and storing the result in a KDL::Frame. If you are asking how 'transformToBaseLink' does that, I really don't understand vector math very well. I'm thinking the nav_msgs::Path message type uses std::vector<geometry_msgs::PoseStamped> and empty(), size(), and back() come from C++ as part of working with vectors. https://www.geeksforgeeks.org/queue-cpp-stl/
*Thread Reply:* No, my entire question is, what does this statement do?
path_.poses.back().pose
path_ comes from
nav_msgs::Path path_;
https://docs.ros.org/en/api/nav_msgs/html/msg/Path.html https://docs.ros.org/en/api/geometry_msgs/html/msg/PoseStamped.html https://docs.ros.org/en/api/geometry_msgs/html/msg/Pose.html
But I don't see anything that allows me to apply .back() or .size() or .empty().
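For what it's worth, a Python analogy may help (not the actual C++; in roscpp the poses field is a std::vector, and empty(), size() and back() are std::vector members, which is why the message docs don't list them):

```python
# Hypothetical minimal stand-in for nav_msgs/Path: a plain list of poses.
poses = ["pose0", "pose1", "pose2"]

# C++ path_.poses.empty()  ->  Python: len(poses) == 0
print(len(poses) == 0)  # False

# C++ path_.poses.size()   ->  Python: len(poses)
print(len(poses))       # 3

# C++ path_.poses.back()   ->  Python: poses[-1] (last element, no index needed)
print(poses[-1])        # pose2
```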
Speaking of KDL, I found a modified version of Larics Pure Pursuit where they have ripped out the KDL parts. I just have to find it again.
This is the project where they removed KDL.
https://github.com/howde-robotics/pure_pursuit
https://github.com/larics/pure_pursuit/compare/master...howde-robotics:pure_pursuit:master
A video of Pure Pursuit on my physical vehicle. https://youtu.be/HVeilflPQX8
A video of Pure Pursuit on a simulated vehicle. https://youtu.be/teJGOXcylVI
The weekly video meeting starts in one hour. (Noon, Eastern time, USA) https://us02web.zoom.us/j/82088036016?pwd=K2lLc1FiWm9MU0dzRStxM2J2b3dpQT09 (Reminders won't always be posted, just when someone remembers to do it..)
Soon I want to be able to put some type of servo on it to control the throttle.
Info from today's video meeting.
===========================================
20230928 Lawn Tractor Automation meeting - Length 57:19 This video: https://youtu.be/mbJtmhh7Io4
Chat/Notes: Al's Miro board: https://miro.com/app/board/uXjVM1yzdFo=/?share_link_id=10226424350 Al's github: https://github.com/jones2126/ A video of Pure Pursuit on a simulated vehicle in RVIZ: https://youtu.be/teJGOXcylVI A video of Pure Pursuit on my physical vehicle in RVIZ: https://youtu.be/HVeilflPQX8 Original Pure Pursuit description paper: https://www.ri.cmu.edu/pub_files/pub3/coulter_r_craig_1992_1/coulter_r_craig_1992_1.pdf An expanded analysis of the 1992 paper: https://vinesmsuic.github.io/robotics-purepersuit/ A repo where they have removed the KDL library: It adds rotate in place, deleted KDL, the calc loop runs from a timer instead of odom callback. https://github.com/howde-robotics/pure_pursuit https://github.com/larics/pure_pursuit/compare/master...howde-robotics:pure_pursuit:master A repo with Adaptive Lookahead experiments. But they removed the Ackermann output: https://github.com/6RiverSystems/pure_pursuit The video on GPS Jump Detection: https://www.youtube.com/watch?v=XkOkvx9N6r8
Index: 00:00 Jeff: Comments and thoughts on Pure Pursuit. Simulator vs. real vehicle. 04:40 Jeff: Video of a simulated vehicle in RVIZ. 09:40 Jeff: Video of an actual vehicle in RVIZ. 22:50 Short discussion of Larics Pure Pursuit code. And other versions. 28:15 Jeff: Forks of the Larics Pure Pursuit repository. 37:00 Al: Digs through some code. 42:40 Jeff: Frustrations with github. 45:25 Al: Talks about his path planning experiments. 51:25 Al: A video on GPS Jump Detection. 52:05 Al: Notes from his Miro board. 55:15 Jeff: Says he should run one of Al's bagfiles through RVIZ.
==
Link to access the Slack channel: https://tractorautomation.slack.com/
Lawn Tractor Automation YouTube page: https://www.youtube.com/channel/UCGfslJO8yD3iz7u_nNFCMnQ/videos
Index for Lawn Tractor Automation group: http://sampson-jeff.com/RosAgriculture/LawnTractorMeetingNotes.txt
Index for the original ROS Agriculture group (~250 videos): http://sampson-jeff.com/RosAgriculture/readme.txt (Use following link, or newest file in that directory) http://sampson-jeff.com/RosAgriculture/ros-agriculture-youtube20230104.txt
I created additional steps in my plan to complete the path to 'fill in the white space in the middle'. The odd thing is that when I ran the plan, the turnaround points, although they were in the .txt input file, never became lookahead points. Note the half circles along the orange line in PlotJuggler.
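A minimal sketch of how that can happen, assuming the lookahead search works the way most pure-pursuit code does (the function and names here are made up, not from the actual program): the search walks forward along the path and takes the first point at least one lookahead distance away, so turnaround points packed inside that radius never get selected as targets.

```python
import math

def find_lookahead_point(path, position, last_index, lookahead):
    """Return (index, point) of the first path point at least `lookahead`
    away from `position`, searching forward from `last_index`.
    Points closer than `lookahead` (e.g. a tight turnaround) get skipped."""
    px, py = position
    for i in range(last_index, len(path)):
        x, y = path[i]
        if math.hypot(x - px, y - py) >= lookahead:
            return i, path[i]
    return len(path) - 1, path[-1]  # fall back to the final point

# Straight segment followed by a tight turnaround (arc points ~0.2 m apart):
path = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0),
        (2.2, 0.1), (2.3, 0.3), (2.2, 0.5),   # turnaround arc
        (1.0, 0.6), (0.0, 0.6)]
idx, target = find_lookahead_point(path, (2.0, 0.0), 2, lookahead=1.0)
# With a 1.0 m lookahead, all three turnaround points are inside the
# radius, so the search jumps straight to the far side of the turn (index 6).
```

If that is what the tracker is doing, the turnaround points would show up in the path display but never in the lookahead-point plot, which matches the half circles along the orange line.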
Although this is not a path-tracking paper, it does discuss geometrically calculating a Dubins path and then using vector math. Sharing as a reference. https://gieseanw.files.wordpress.com/2012/10/dubins.pdf
This was run in simulation, but it seems a lookahead of 1.5 clears the earlier problem of losing half the circle mentioned above, which happened when the lookahead was 2.0. Now, how to dynamically adjust the lookahead for straights and curves?
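One possible approach (just a sketch of my own, not from any of the repos mentioned): estimate the local curvature from three consecutive waypoints, then interpolate the lookahead between a maximum on straights and a minimum in tight curves. All the names and tuning constants below are made up.

```python
import math

def menger_curvature(p1, p2, p3):
    """Curvature (1/radius) of the circle through three points; 0 if collinear."""
    # |cross product| of (p2-p1) and (p3-p1) = twice the triangle area
    cross = abs((p2[0] - p1[0]) * (p3[1] - p1[1])
                - (p3[0] - p1[0]) * (p2[1] - p1[1]))
    d12, d23, d13 = math.dist(p1, p2), math.dist(p2, p3), math.dist(p1, p3)
    if d12 * d23 * d13 == 0.0:
        return 0.0
    return 2.0 * cross / (d12 * d23 * d13)

def adaptive_lookahead(curvature, ld_min=1.0, ld_max=2.5, k_max=0.5):
    """Shrink the lookahead linearly from ld_max (straight) down to ld_min
    (at or above curvature k_max, i.e. a 2 m turn radius here)."""
    frac = min(abs(curvature) / k_max, 1.0)
    return ld_max - frac * (ld_max - ld_min)

straight = menger_curvature((0, 0), (1, 0), (2, 0))   # 0.0 on a straight
turn = menger_curvature((0, 0), (1, 1), (2, 0))       # 1.0, a 1 m radius arc
```

The same `frac` could also scale the speed command, which would cover the adaptive-speed question at the same time.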
From Weekly Robotics. A highly documented autonomous test vehicle. https://arxiv.org/abs/2205.04454?utm_source=weekly_robotics&utm_medium=newsletter&utm_campaign=weekly-robotics-267 https://github.com/OpenPodcar/OpenPodcar?utm_source=weekly_robotics&utm_medium=newsletter&utm_campaign=weekly-robotics-267
Looking for truly free VNC-type software. No free-trial stuff. Any recommendations?
*Thread Reply:* Raspberry Pi to Windows, Apple devices, other Pis, maybe a Teensy.
*Thread Reply:* I can do a full-GUI remote into my RPi from Ubuntu using RealVNC, but I sort of think I got RealVNC before they started charging and they have not kicked me off yet. For a non-GUI solution I use PuTTY, and for file transfer I use FileZilla. Here is an article with other options that I have not tried. https://raspberrytips.com/remote-desktop-raspberry-pi/
The weekly video meeting starts in one hour. (Noon, Eastern time, USA) https://us02web.zoom.us/j/82088036016?pwd=K2lLc1FiWm9MU0dzRStxM2J2b3dpQT09 (Reminders won't always be posted, just when someone remembers to do it..)
Looks interesting. Not that I need a GPS.
Info from today's video meeting.
===========================================
20231005 Lawn Tractor Automation meeting - Length 1:05:57 This video: https://youtu.be/WsZLcQ7PXK0
Chat/Notes: Al's Miro board: https://miro.com/app/board/uXjVM1yzdFo=/ Al's github: https://github.com/jones2126/ Links to Pure Pursuit githubs that were discussed. https://github.com/larics/pure_pursuit https://github.com/howde-robotics/pure_pursuit https://github.com/6RiverSystems/pure_pursuit https://github.com/f1tenth-dev/pure_pursuit https://mowito-navstack.readthedocs.io/en/latest/config_pp.html https://github.com/arimb/PurePursuit https://github.com/raphaelkba/pure_pursuit/blob/master/src/pure_pursuit_planner.cpp
Index: 00:00 Jeff: Is testing the Pure Pursuit code from Howde-Robotics. 04:30 Jeff: Describes some steps to get the Howde-Robotics code to run. 06:15 Jeff: Removing the rest of the KDL code. Now figure out how it works. 07:10 Jeff: Difference between odom trigger and timer trigger. 07:50 Jeff: Adaptive lookahead and velocity control. 08:50 Jeff: Pure Pursuit code from 6RiverSystems. 09:50 Jeff: Revisit last week's complaints about using github. 26:10 Jeff: What's next? 27:15 Jeff: More comments about "rotate in place". 31:55 Al: Shows his Miro board, path planning and links. 33:55 Al: Results of testing paths. 36:45 Al: Adaptive speed and lookahead distance. 41:15 Al: The problem with the configuration program not running. 51:05 Al: Talks about adding throttle control to his lawn tractor engine. 58:10 Al: Again asks about references to Adaptive Lookahead and Speed control. 1:03:00 Jeff: Talks about the audio settings on Zoom.
==
An article on creating a custom R/C servo. If you want to duplicate this, or just use the documentation as a tutorial, this looks very useful. https://hackaday.com/2023/10/04/roll-your-own-servo/
The weekly video meeting starts in one hour. (Noon, Eastern time, USA) https://us02web.zoom.us/j/82088036016?pwd=K2lLc1FiWm9MU0dzRStxM2J2b3dpQT09 (Reminders won't always be posted, just when someone remembers to do it..)
Something I'll talk about in the meeting today. I was getting strange results for the transform_lookup. I modified the time value and it jitters less, but is always one sample behind.
Info from today's video meeting.
===========================================
20231012 Lawn Tractor Automation meeting - Length 34:28 This video: https://youtu.be/sdzdF8YMv54
Chat/Notes: Al's Miro board: https://miro.com/app/board/uXjVM1yzdFo=/ Al's github: https://github.com/jones2126/ Al's Pure Pursuit code: https://github.com/jones2126/ros1_lawn_tractor_ws/blob/master/src/pure_pursuit/src/pure_pursuit.cpp Al's current bagfiles: https://drive.google.com/drive/folders/1qbbjFHqPQJPcxEXKm8okCXo87eeH0TLW Top level directory on Google Drive: https://drive.google.com/drive/folders/1eOakFydReiihAvkAY_bPpzF-xGEUHlpZ Al: GitHub for Python Pure Pursuit (non ROS): https://github.com/AtsushiSakai/PythonRobotics/blob/35f4f61e44ad9c3f8fc6584e3d0d7ab7889bd866/PathTracking/pure_pursuit/pure_pursuit.py Al: F1Tenth Pure Pursuit: https://github.com/f1tenth-dev/pure_pursuit
Index: 00:00 Jeff: Is testing the Pure Pursuit code from Howde-Robotics. 01:35 Jeff: Does the cmd_vel output work better than the Ackermann output? 02:45 Jeff: Maybe just drive to the target point based on angle. 05:10 Jeff: Output the actual heading derived from the odom heading. Which pointed out a problem. 14:50 Al: Asks which of those 3 heading lines are actually used for steering. 16:45 Jeff: Summary. 19:15 Al: Shows his path experiments. Simulation vs. actual. 22:50 Al: Has changed the way he reads a path. He can now specify a speed and lookahead for each path segment. 26:55 Al: Is tweaking his speed value lookup table. 27:35 Al: Added more display points to the path visualization in the path generator program. 29:00 Al: Bagfile location. 32:00 Al: Posts a link to his top level Google Drive. 32:50 Jeff: Explains his hesitation to hand over code.
==
I see my Slack is all screwed up today. I may end up posting several test messages, or just give up.
(I'll try this again.) I modified the Larics Pure Pursuit (PP) code to simply drive to a point instead of the circles and triangles. It runs on the simulator and sort of works on the actual vehicle. The first half of the path looks good. The second half falls apart. I was controlling the speed manually.
For all of these tests the actual vehicle is using:
GPS update rate: 5 Hz
Point spacing: 0.5 m
Lookahead distance: 1 m
Speed: manually controlled
Simulator:
GPS update rate: 10 Hz
Point spacing: 0.5 m
Lookahead distance: 1 m
Speed: 1 m/s
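A rough sketch of the "drive to a point based on angle" idea (my reconstruction with made-up names, not the actual modified Larics code): steer proportionally to the heading error between the vehicle's yaw and the bearing to the lookahead point, clamped to the steering limit.

```python
import math

def steer_to_point(pose, target, k=1.0, max_steer=0.6):
    """Proportional steering toward a target point.
    pose = (x, y, yaw in radians); returns a steering angle in radians."""
    x, y, yaw = pose
    bearing = math.atan2(target[1] - y, target[0] - x)
    # wrap the heading error into [-pi, pi] so it never spins the long way
    error = math.atan2(math.sin(bearing - yaw), math.cos(bearing - yaw))
    return max(-max_steer, min(max_steer, k * error))

# Vehicle at the origin facing +x, target 2 m ahead and 1 m to the left:
angle = steer_to_point((0.0, 0.0, 0.0), (2.0, 1.0))
# bearing = atan2(1, 2) ~ 0.46 rad, inside the clamp, so that is the command
```

One limitation of this scheme is that it ignores how sharply the vehicle can actually turn, which may be part of why it falls apart at higher speeds.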
So I spent several days looking at plots and rviz playbacks and it looked like my steering was just too slow. I decided I had various ways to speed up the steering:
Increase the voltage on the servo from 12V to 24V. Replace the steering servo with the faster one. Or, just drive REALLY slow.
So I went out today and drove REALLY slowly. That seems to work. Here is a plot from my "PP with angles". (The bag file name is wrong on the PlotJuggler screen):
In attempting to post the second message, my computer locked up. I finally turned on my Windows computer to see if I could recover anything. Slack was screwed up there too. Much to my surprise, there was my unsent message on Slack on Windows. I typed in the last few words and sent it...
Okay, final report. After seeing that driving REALLY slowly makes my "PP with angles" work, I changed the PP code to use their solution instead of mine. That works too if you drive REALLY slowly. It is much more jittery than mine, but it does work. So it does look like I will have to pick an option to speed up my steering and retest. That's the end of my testing, except I will post a couple of videos of my rviz screen. They are quite boring as slowly as I was driving... Result from real PP:
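For reference, "their solution" in classic pure pursuit (per the 1992 Coulter paper linked earlier) commands a curvature from the lateral offset of the goal point in the vehicle frame. A sketch of that, assuming an Ackermann steering model with a known wheelbase (names here are mine, not from the repo):

```python
import math

def pure_pursuit_steering(pose, goal, wheelbase):
    """Classic pure pursuit: curvature = 2 * y_local / Ld^2, then
    steering = atan(wheelbase * curvature) for an Ackermann vehicle."""
    x, y, yaw = pose
    dx, dy = goal[0] - x, goal[1] - y
    # lateral offset of the goal point, expressed in the vehicle frame
    y_local = -math.sin(yaw) * dx + math.cos(yaw) * dy
    ld_sq = dx * dx + dy * dy          # squared lookahead distance
    curvature = 2.0 * y_local / ld_sq
    return math.atan(wheelbase * curvature)

# Goal 2 m ahead and 1 m to the left, with a 1 m wheelbase:
delta = pure_pursuit_steering((0.0, 0.0, 0.0), (2.0, 1.0), wheelbase=1.0)
```

Because the command scales with 1/Ld^2, small position jitter in the goal point gets amplified at short lookaheads, which might explain why it looked jumpier than the plain angle-chasing version.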
Video from rviz running the "PP with angles" playback. I didn't start this first video right at the beginning.
I see the videos won't play. If I download the videos to my computer then they will play.
The weekly video meeting starts in one hour. (Noon, Eastern time, USA) https://us02web.zoom.us/j/82088036016?pwd=K2lLc1FiWm9MU0dzRStxM2J2b3dpQT09 (Reminders won't always be posted, just when someone remembers to do it..)
Info from today's video meeting.
===========================================
20231019 Lawn Tractor Automation meeting - Length 1:08:35 This video: https://youtu.be/qYmGLgDgBXI
Chat/Notes: Al's Miro board: https://miro.com/app/board/uXjVM1yzdFo=/ Al's github: https://github.com/jones2126/ Al: Link to Greenzie video on GPS: https://www.youtube.com/watch?v=XkOkvx9N6r8&t=1s Al: Link to Weekly Robotics: https://www.weeklyrobotics.com/ Jeff: From Weekly Robotics: https://discourse.ros.org/t/rosbag-tools-a-ros-agnostic-toolbox-for-common-rosbag-operations/34035?utm_source=weekly_robotics&utm_medium=newsletter&utm_campaign=weekly-robotics-269 Al: World's first millimeter-wave RFID tag to improve drone navigation accuracy https://www.youtube.com/watch?v=7SezvxK_vrY&t=1
Index: 00:00 Jeff: Revisits last meeting about "drive to point" for Pure Pursuit (PP), 02:30 Jeff: Screen capture of first outdoor test. 05:55 Jeff: Why suspect steering speed? 09:30 Jeff: How can I get faster steering rate (vs. forward speed)? 11:15 Jeff: Screen capture of my code driving slowly. 11:15 Jeff: Screen capture of original PP code driving slowly. 24:40 Jeff: A video of the rviz screen while driving. This turned out to be the original PP results instead of my code, which I was claiming. 34:00 Jeff: Back to the question: How do I speed up the steering rate? 36:15 Al: Asks what the new code looks like. 38:00 Jeff: What it will take to fix it right. And Al's steering on his vehicle. 42:30 Al: Talks about new menus for testing. 49:55 Jeff: Asks why Al's GPS is in a different location each time. Discussion of GPS base station configuration and start up procedure.
==
I have a new problem. The first bag file I created last Friday and the first bag file I created today seem to be corrupt. Linux will not copy them to a flash drive. The other files both days seem to be okay...
*Thread Reply:* I'm not sure I recall not being able to copy a bagfile, but I have had them be corrupt and not able to play back. For me, I try and ensure the 'rosbag record -a' is started from a new gnome terminal window and I ctrl-c that process before stopping any other ros processes. If I do that I can't recall the last corruption I've had.
*Thread Reply:* I don't recall what my exact order was. I wasn't paying attention since it hadn't bitten me yet...
As I mentioned in the last video meeting, I could add an extra 12V battery to make my steering faster. I did that on Friday, and it did seem to help. But I also noticed my steering appeared to be more jittery. I later looked at the plots to see if I could find a correlation to anything else. I tried plotting the requested steering angle against my steering feedback, but I noticed my steering feedback had died. The steering feedback is a wire soldered to the position pot on the R/C servo; I read that voltage with an Arduino Nano that is taped to the frame. So it looks like the Arduino Nano died (reset?) during testing. I went out today and ran more tests specifically to capture the steering feedback, and it died during every test. So I am assuming I am getting (excessive) electrical noise. I need to pull up some earlier plots to see if it was doing this before I added the extra battery...
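If part of the problem is noise on the pot reading itself (rather than the Nano resetting), a simple moving average over the last few samples is a cheap thing to try. A sketch of the kind of filter I mean, with made-up names:

```python
from collections import deque

class MovingAverage:
    """Fixed-window moving average for a noisy analog reading."""
    def __init__(self, window=3):
        self.samples = deque(maxlen=window)

    def update(self, value):
        self.samples.append(value)
        return sum(self.samples) / len(self.samples)

filt = MovingAverage(window=3)
readings = [512, 520, 900, 515, 518]   # 900 is a noise spike
smoothed = [filt.update(r) for r in readings]
# The spike is diluted across the window instead of passed straight through.
```

A median filter over the same window would reject a single spike completely instead of just diluting it, and is nearly as cheap on an Arduino.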
The weekly video meeting starts in one hour. (Noon, Eastern time, USA) https://us02web.zoom.us/j/82088036016?pwd=K2lLc1FiWm9MU0dzRStxM2J2b3dpQT09 (Reminders won't always be posted, just when someone remembers to do it..)
Info from today's video meeting.
===========================================
20231026 Lawn Tractor Automation meeting - Length 42:52 This video: https://youtu.be/HpB56y1KsVQ
Chat/Notes: Al's Miro board: https://miro.com/app/board/uXjVM1yzdFo=/ Al's github: https://github.com/jones2126/
Index: 00:00 Jeff: Recounts Pure Pursuit experiments. Driving slowly. Increasing servo voltage. Attempt to average steering angle command. Compare steering request to steering feedback value. Story of higher voltage on main motor causing problems. When current sensing was added to main motor and the R/C servo. Analyzing the current of the servo as it moves. 15:30 Plots indicate that averaging of the steering request may help. I probably screwed up the averaging code. 18:40 To-Do list. 19:40 Al: Asks if the actual steering rate is known. 21:15 Al: Asks about electrical noise effects on the USB system. 25:00 Al: Asks if the forward speed is known. I.e., what works and what doesn't. 27:40 Jeff: More To-Do items. 30:50 Al: Is moving to a larger test yard. 32:20 Al: Steps for start-up. 33:20 Al: Example of his USB problem. 34:45 Al: Setup for new location. 38:15 Maps and backgrounds. 41:15 Al: List of things to take along.
==
I have been concerned about losing all the valuable information as it is shared in the “Random” thread. So I have started copying the text and pasting it into my Miro whiteboard. https://miro.com/app/board/o9JkzvV0Pw=/?moveToWidget=3458764564091828511&cot=14
But the problem has been trying to capture all the pictures. One way is to do a screen capture, edit it to size, then paste it back into the text. But that is a hassle. Any suggestions on how to better streamline the process?
I went through this process when ROS Agriculture was about to vaporize. At that time you could "export" to a ZIP file. You would get text, links to files, and pseudo-links to threads. I could never find where threads are stored. But links to files (images, for instance) would work, and you could paste one into a browser and display the image.
I haven't looked at this in a while since they don't seem to want people to actually do this. But at the time I found two or three public attempts (githubs) to make this work. I haven't checked whether those have been updated.
So today this page turned up: https://robots.net/tech/how-to-export-slack-conversation/ It says there is now a PDF option. I'll go try that and see what that does.
*Thread Reply:* This is a test reply...
This to test if threads can be downloaded
And include an image in the thread
Ahhh!!! It just blew my message away (again).
@Bob Hassett I will make this brief... Your Miro page seems to be a copy of meeting posts from Slack which is the same as the YouTube description which is the same as: http://sampson-jeff.com/RosAgriculture/LawnTractorMeetingNotes.txt If that is what you want, just use the LawnTractorMeetingNotes.txt data.
I assumed that you were talking about all of the other messages and images on Slack...
It seems the page I posted, https://robots.net/tech/how-to-export-slack-conversation/, doesn't match Slack. The only export I can find in Slack is the cryptic/useless option that was there originally...
Maybe I will dig further if I get (really) bored.