Thoughts on DIY Lane detection CNNs for openpilot

One of the most limiting factors of openpilot for developers is the SNPE’ified TensorFlow vision model. Qualcomm SNPE (the Snapdragon Neural Processing Engine SDK) is a toolkit that converts existing TensorFlow, Caffe, and possibly a few other model formats into a DLC container format for accelerated inference on Qualcomm hardware, most commonly found in mobile phones.

This is limiting because running openpilot on other hardware, such as x86, a Google Coral, or a Raspberry Pi with a TPU, is not immediately possible: there is no GPU-accelerated SNPE runtime for non-Qualcomm platforms. Newer platforms such as the Snapdragon 855 benchmark at roughly a 5x SNPE speedup over the comparatively ancient 821 in the LEX727 and OnePlus 3T.

With all this said, there’s really nothing stopping you from running OP on another SNPE-supported device except camera interfacing. It’s rumored that getting visiond to work with a phone’s camera is the reason we still use IMX298-based hardware, as camera manufacturers don’t publish specs, drivers, or configs.

Training your own model has become more complicated, as comma’s models now handle vision radar, laneless path prediction, and (coming soon) lane changes; none of which is trivial to develop, since the subject matter is not well published online or in academia. You may want to target openpilot 0.5.8 instead.

I’m not going to go into much detail on training your own models, but I will point you to https://paperswithcode.com/task/lane-detection, an excellent resource for lane models, complete with papers. Your objective is to take a camera image as input and output lane polys and probabilities.

lane_planner.py model inputs. If you can push 256×512 pixels in and get these out, you have an OP-compatible lane-finding model.

Specifically, all you need is a left lane poly, a right lane poly, and the probability of each. You don’t even really need polys; 50 points per lane line will suffice, as the conversion from points to polys happens in code, not in the model.
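
Since that conversion is just a least-squares fit, here is a minimal sketch of what it looks like, assuming 50 evenly spaced points per lane line out to roughly 50 meters (my own illustrative helper, not the actual lane_planner code):

    import numpy as np

    def points_to_poly(y_points, x_points=None, order=3):
      # Fit lane-line points to a cubic, similar in spirit to what the
      # openpilot side does before handing polys to the planner.
      # y_points: lateral offsets (m) of the lane line at each sample.
      # x_points: longitudinal distances (m); 50 evenly spaced points out
      # to ~50 m is my assumption here, not a spec.
      if x_points is None:
        x_points = np.linspace(0.0, 50.0, len(y_points))
      # np.polyfit returns the highest-order coefficient first, i.e. a
      # [c3, c2, c1, c0] layout with the constant (offset) term last.
      return np.polyfit(x_points, y_points, order)

    # Example: a gently curving lane line about 1.8 m to the left
    x = np.linspace(0.0, 50.0, 50)
    left_poly = points_to_poly(1.8 + 0.001 * x**2, x)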

visiond model inputs and outputs. Input tensor size is 256×512 for the 0.6.4 model (medmodel).

Here’s a good dataset, or create your own with OP data: https://bdd-data.berkeley.edu

I’m excited to see your results!

Old OP PID tuning guide

This is from opc.ai back when it was around. RIP Oppey.

• Feel free to use your own parameters if you have some that are already working, but it’s also fun to start from “scratch” and experiment to learn the full impact of each change.

• Start with Kp = 0.25

• Start with Ki = 0.03 (Set low on purpose)

• Start with Kf = 0.00003 (Set low on purpose)

• Start with steerActuatorDelay = 0.1 (unless someone else has already found a closer value for your car)

• Start with steerRatio = factory spec

• Start with steerRateCost = 0.5

• Start with tireStiffness values of whatever you have stock in interfaces.py. Honda Civic defaults are around 200,000. (See the sketch after this list.)
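
For orientation, here is a hypothetical helper showing roughly where those starting values live on the CarParams object built in a car’s interface.py. The field names follow the 0.6.x-era layout I’ve seen; verify against your own fork before copying anything, and treat the steerRatio value as a placeholder:

    # Hypothetical helper: where the starting values above would be set on
    # the CarParams object ("ret") built in a car's interface.py. Field
    # names follow the 0.6.x-era layout; check your own fork.
    def apply_starting_tune(ret):
      ret.steerActuatorDelay = 0.1            # seconds
      ret.steerRatio = 15.3                   # placeholder: use your factory spec
      ret.steerRateCost = 0.5
      ret.lateralTuning.pid.kpBP, ret.lateralTuning.pid.kiBP = [[0.], [0.]]
      ret.lateralTuning.pid.kpV, ret.lateralTuning.pid.kiV = [[0.25], [0.03]]
      ret.lateralTuning.pid.kf = 0.00003
      ret.tireStiffnessFront = 200000.        # stock-Civic ballpark
      ret.tireStiffnessRear = 200000.
      return ret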

Drive on a straight road with OP engaged.

Kp will be the first parameter that we adjust, starting from Kp = 0.25

• If OP is oscillating back and forth around center, it will either be a slow or fast oscillation depending on how high/low your Kp is. We’re looking for a small amount of slow-ish oscillation.

• Increase Kp to 0.5, then decrease to 0.1. Choose the “best” one and split the difference between it and 0.25. Set Kp to the new value in the middle.

• You’re looking for no fast wobble around center. You want a lazy wobble around center, or ideally a fairly solid hold on center without any abrupt movements.

• Choose the best of 0.25, the extreme value, and the new “split difference” value. Split the difference again in the best direction and continue this process until Kp is optimized.

• You may need to re-tune this at slow speeds (10mph) and fast speeds (60+mph) and possibly even use speed-dependent Kp values as mentioned in the “Background” section above.

• If there are high winds or the road has significant camber then you may experience the car drifting to one side and OP adjusting the wheel only in one direction (always adjusts to the right, etc). This is okay. Keep focused on back AND forth oscillations of the wheel that take you across the center of the lane (slow or fast).

• Don’t drive yourself too crazy on this, as we still need to tune steerActuatorDelay to get the best performance.

Adjust steerActuatorDelay:

• steerActuatorDelay adjusts the starting orientation of the car that is fed into the MPC, in order to account for the delay between the measurements and the response to a commanded output. The delay * curvature_factor * steerRatio product needs to match the response of your car (see the sketch after this list).

• steerActuatorDelay should be adjusted in the range of 0.025 to 0.200 in 0.025 increments (0.025 – 0.050 – 0.075 – 0.100 – 0.125, etc)

• Find the setting that provides the least amount of wobble around center. You should see a dramatic difference between 0.025 and 0.200, and somewhere in-between should be a “best” setting.

• Some truly bad MDPS systems may even need delays above 0.200?
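
To make the delay compensation above concrete, here is a sketch of how the actuator delay feeds the MPC’s initial state, paraphrased from my reading of a 0.6.x-era pathplanner.py (the stand-in state class is mine, not openpilot’s):

    import math

    class _MPCState:
      # Stand-in for the MPC's initial-state struct (the real one lives in
      # the compiled MPC bindings); x is distance ahead, psi is heading.
      x = 0.0
      psi = 0.0

    def calc_states_after_delay(state, v_ego, steer_angle, curvature_factor, steer_ratio, delay):
      # Project the car forward over the actuator delay so the MPC plans
      # from roughly where the car will be when the command takes effect.
      state.x = v_ego * delay
      # This is where delay, curvature_factor, and steerRatio couple:
      # heading change accumulated over the delay at the current angle.
      state.psi = v_ego * curvature_factor * math.radians(steer_angle) / steer_ratio * delay
      return state

    # Example: 30 m/s, 5 degrees of wheel angle, 0.1 s actuator delay
    state = calc_states_after_delay(_MPCState(), 30.0, 5.0, 0.26, 15.0, 0.1)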

Adjust Kp slightly up and down to see if you can further optimize it now that the ActuatorDelay is set.

Adjust Ki to help with constant offsets like wind, cambered roads, etc.

• Increase Ki to 0.2, then try 0.1, 0.05, etc. Find the value just before it begins to do a slow overshoot, correct, overshoot, correct pattern.

• Again, you’re looking for no movement around center or a very slow and gentle movement around center.

Now try taking some turns and adjust Kf.  

• First make sure that OP is properly identifying the lane lines in the turn you are attempting to take. Not just the green path, but the actual lane lines on each side as well.

• If you’re having an issue initiating and holding turns, try increasing Kf. 0.00006 is the typical value used, and somewhere between 0.00003 and 0.00010 will probably be appropriate.

• If you increase Kf to help in turns, you may need to decrease Kp slightly if oscillations have increased. If you have increased Kf too much then it may not be possible to compensate with Kp changes.

• There is a balancing act between Kf/Kp/Ki that you are trying to find.

Camera Offset:

• If the car is too far right in the lane, try decreasing camera offset from the stock value of 0.06 to something like 0.03 or 0.0.

• If the car is too far left in the lane, try increasing camera offset by 0.03 at a time.

Adjusting tireStiffness changes the curvature_factor used in the MPC (a sketch of that relationship follows the list below).

• Feel free to play with the tireStiffness values. The stock Honda Civic values are around 200,000. Yours may be slightly higher or lower.

• Values between 50,000 and 300,000 are probably worth playing with.

• While tuning, try to keep the front and rear values equal, as they generally end up pretty close to each other anyway (within 5-10%).
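
For the curious, tire stiffness enters the picture through the vehicle model’s slip factor, roughly as sketched below. This is a paraphrase based on my reading of openpilot’s vehicle_model.py; the names, signatures, and example numbers are illustrative only:

    def calc_slip_factor(m, cF, cR, aF, aR, l):
      # m: mass, aF/aR: CG-to-front/rear-axle distance, l: wheelbase,
      # cF/cR: front/rear tire stiffness (the values being tuned here)
      return m * (cF * aF - cR * aR) / (l**2 * cF * cR)

    def curvature_factor(u, m, cF, cR, aF, aR, l, chi=0.0):
      # u: speed in m/s; chi: rear-steer factor (0 for cars without rear steering)
      sf = calc_slip_factor(m, cF, cR, aF, aR, l)
      return (1.0 - chi) / (1.0 - sf * u**2) / l

    # Compare the factor at 30 m/s for two stiffness values
    # (rough, Civic-like mass and geometry, purely for illustration)
    print(curvature_factor(30.0, 1326.0, 200000.0, 200000.0, 1.08, 1.62, 2.70))
    print(curvature_factor(30.0, 1326.0, 100000.0, 100000.0, 1.08, 1.62, 2.70))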

Adjusting steerRateCost will affect how eager the car is to make sudden direction changes.

• steerRateCost around 0.7-1.0 will feel very sluggish and unwilling to make direction changes.

• steerRateCost around 0.5 is a nice median.

• steerRateCost around 0.3 or less will feel extremely darty when the lane has minor deviations or the path changes.

Adjusting steerRatio:

• steerRatio will have a large impact, since it essentially scales Kf/Kp/Ki together (steerRatio is multiplied by the MPC’s calculated steering angle delta, and that result is then multiplied by the gains). See the sketch after this list.

• Changing steerRatio will require scaling Kf/Kp/Ki as well in order to regain the tuned performance. It will be a dramatic change.

• I don’t bother changing steerRatio from spec once Kp and Kf have been tuned; its only other minor impact on lateral control is on the VehicleModel slip factor / curvature_factor calculation, which feeds the MPC and the actuator-delay orientation calculation.

• My advice: leave it alone
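
As a rough, self-contained illustration of that scaling (my own helper, not openpilot code):

    import math

    def wheel_angle_from_mpc(delta_desired_rad, steer_ratio, angle_offset_deg=0.0):
      # Illustrative only: convert the MPC's desired tire-angle delta
      # (radians) into a steering-wheel angle (degrees). steerRatio scales
      # the whole command, which is why changing it means rescaling
      # Kp/Ki/Kf as well.
      return math.degrees(delta_desired_rad * steer_ratio) + angle_offset_deg

    # The same tire angle commands ~50% more wheel angle if steerRatio
    # goes from 13.0 to 19.5
    print(wheel_angle_from_mpc(0.01, 13.0))   # ~7.4 degrees
    print(wheel_angle_from_mpc(0.01, 19.5))   # ~11.2 degrees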

latPidDeadzone shouldn’t need to be messed with (defaults to 0).  

Understanding openpilot: CAMERA_OFFSET (lane centering)

EDIT: Angle offsetting is broken in 0.6.3, should be fixed by next release. Until then, https://github.com/commaai/openpilot/pull/798

EDIT 2: Current model lane fitting (0.6.3) isn’t perfect, which can cause oversteer in corners.

Hugging left? Hugging right? Hugging in curves? Re-mounted your Eon so many times that you’ve invested in a gallon container of goo-gone? Fear not, you’ll become a CAMERA_OFFSET variable pro in no time.

What is CAMERA_OFFSET?

CAMERA_OFFSET: The distance, in meters, from the center of vehicle to the openpilot device’s camera module.

This variable can be found in:

  • selfdrive/controls/lib/lane_planner.py in OP 0.6.3 and above (as of this article date)
  • selfdrive/controls/lib/model_parser.py in OP 0.6.2 and below
CAMERA_OFFSET in the wild

The default CAMERA_OFFSET is 0.06 meters, meaning your openpilot device should be mounted dead center, which leaves the camera module sitting 0.06 meters to the driver’s side; lane centering is tuned around that.
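
Under the hood, the offset is applied by shifting the fitted lane polys laterally before the path is planned. A minimal sketch of the idea, assuming the 0.6.x-era convention where index 3 of each cubic poly is the constant (lateral-offset-at-the-car) term; this is a paraphrase, not a copy of lane_planner.py:

    import numpy as np

    CAMERA_OFFSET = 0.06  # meters, stock value

    def apply_camera_offset(l_poly, r_poly, offset=CAMERA_OFFSET):
      # Shift both lane polys laterally by the camera offset so the
      # planned path is centered on the lane rather than on the camera.
      # Index 3 is the constant term of the cubic, i.e. the lateral
      # position of the lane line right at the car (0.6.x-era assumption).
      l_poly = np.array(l_poly, dtype=float)
      r_poly = np.array(r_poly, dtype=float)
      l_poly[3] += offset
      r_poly[3] += offset
      return l_poly, r_poly

    # Example: lane lines 1.8 m either side of the camera
    left, right = apply_camera_offset([0., 0., 0., 1.8], [0., 0., 0., -1.8])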

But openpilot hugs too far left, or right!

Calm down now, it’s alright. Most people drive a bit right on two lane, opposed traffic roads (such as state highways) for comfort. openpilot, on the other hand, wants to drive in the exact center of the lane, which is handy for its intended use case: on the interstate.

Stock CAMERA_OFFSET value is tuned for the OP device’s camera module to be 0.06 meters to the driver’s side of the car for the best performance.

But, I want Openpilot to drive further to the passenger side anyway. I do most of my driving on two lane, opposed traffic highways, local roads, etc!

Fine. Lower the CAMERA_OFFSET value to move the car within the lane toward the passenger side, and raise it to move the car toward the driver’s side of the lane (for left-hand-drive cars). Negative values are okay and work perfectly fine (such as -0.06). The only issue is that it impacts corner centering and you could find yourself hugging in curves. It’s best to just get used to being centered in the lane if you don’t feel like maintaining your own fork (comma, PLEASE parameterize this value, and make it accessible in the settings UI).

Corner hugging fix example for those with OP dev experience (I use this with INDI, which always hugs; but the same concept can be used for any control scheme):

    if v_ego < 0.3 or not active:
      # Below ~0.3 m/s, or when disengaged, command no steering at all
      indi_log.active = False
      self.output_steer = 0.0
      self.delayed_output = 0.0
    else:
      # Corner-hugging hack: divide the desired angle down in curves.
      # interp comes from common.numpy_fast; the divisor ramps with the
      # magnitude of the desired angle (1.1 -> 1.2 for positive angles,
      # 1.0 -> 1.2 for negative). Note this compares the previous cycle's
      # desired angle before overwriting it.
      if self.angle_steers_des > 0.5:
        self.angle_steers_des = round(path_plan.angleSteers / interp(abs(path_plan.angleSteers), [0., 10.], [1.1, 1.2]), 2)
      elif self.angle_steers_des < -0.5:
        self.angle_steers_des = round(path_plan.angleSteers / interp(abs(path_plan.angleSteers), [0., 10.], [1.0, 1.2]), 2)
      else:
        self.angle_steers_des = round(path_plan.angleSteers, 2)

But, I don’t know how to code!

Sure thing. Move your Eon’s mount further to the driver’s side to move the car towards the passenger side. Move the mount further to the passenger’s side to move the car towards the driver side. Keep in mind the 0.06 meter factory offset, and you’ll be good.

Doesn’t this value “learn” over time anyway?

No, and it never has (at least since I’ve been involved with OP, starting in version 0.5.8).

Long story short, you can go nuts with the measuring tape, lasers, and all sorts of stuff; but unless you are blind, just mount the OP device itself (NOT THE CAMERA MODULE) to the center of the windscreen as best as your eyeballs can get it. If it’s off, it’s likely going to be off by a few cm anyway, a little more at the most; but are you really going to notice being an inch or two closer or further from center of the lane anyway?

But, I still hug in curves!

Yep. Follow my blog, I’ll have an article out about steerRatio and tuning sometime soon. In the meantime, check out my OP observations after my 3,000 mile road trip here. It explains, briefly, why OP hugs curves.

Misc OpenPilot notes after a 3,000 mile road trip (Part 2)

I’m clever, clever enough to title my last post as Part 1 so I’d be forced to write a Part 2 before moving on to other topics. I’ll focus more on the human factor of car autonomy for this post.

We decided to drive from Fort Myers, FL to Indianapolis, IN all in one shot. It ended up being a ~20 hour drive, ~1,100 miles. It’s the longest, most exhausting drive I’ve ever been on, and it wouldn’t have been possible without OpenPilot.

Some of the drive data

Vehicle autonomy is interesting. On one hand, it frees the human from having to perform many observations and actions a second. On the other, it could lull you into a mindset of reduced situational awareness. There is no doubt that mental and physical fatigue is reduced with autonomy, but I feel as if it comes at a cost.

Look closely on the left. There’s a car stopped, blocking the two left lanes

It only takes a second of any drive to be involved in a fatal accident. The above video was taken during a test loop of OpenPilot on the interstate. Earlier on that drive, I was messing about with my cell phone to dial in live-tuning settings for PID; my awareness was not on the road.

OpenPilot, at the time, did not have stopped vehicle detection, and even if it had on this drive, it’s unlikely that vision and radar would have picked up a black car at night, and in side profile at that. The collision would have been catastrophic, possibly fatal at 75 MPH, not only for me but for the driver of the spun-out car.

Luckily, there was other traffic ahead and I noticed the brake lights of the cars ahead, causing me to pay attention. Even then, the vehicle was difficult to spot. Notice the swerving semi by the end of the video.

Do me a favor and stop reading for a moment to consider: how do you perceive OpenPilot? Is it some whiz-bang self-driving thing that you look for every opportunity to share with friends and family? Be honest: do you pay less attention, or play with your car’s infotainment system or cell phone, while OP is engaged? Do you frequently drive with your hands off, and away from, the steering wheel?

If you see it as anything other than an enhanced cruise control system, you are gravely mistaken, and will be for a long time, as “self driving” is much, much further out.

Why?

  • Radar can’t be used to detect static objects, like stopped cars. Otherwise, every sign post, telephone pole, and manhole cover would cause your brakes to slam on.
  • Vision radar (new in 0.6.x) CAN detect stopped cars, but shouldn’t be relied upon. The training set isn’t fit to detect stopped cars in all situations, and certainly not situations like where a kid on a bicycle darts in front of your car (true story).
  • The cameras in our obsolete phones that run OP can often defocus and not see anything at all in front of you, especially when it’s raining (I had several times where OP focused on the rain beads on the windshield and not the road).
  • OP currently relies upon lane lines, with some limited “laneless” path prediction. Count on slamming into a concrete barrier if you go through a construction zone and lane lines are misidentified (and you are fucking around, staring at your crotch on your phone).

The most dangerous aspect of OpenPilot is how good it is, and how much better it keeps getting. It’s to the point where you can achieve a ~80% engagement rate over 3,000 miles, and it’s the slim chance that something CAN go wrong with OP engaged that’ll get you. Humans are interesting creatures. If you drive a few thousand miles over many months with OpenPilot and it behaves a certain way every time, at what point do you lose sight of the possibility that OP might not detect a stopped car ahead of you (while you just so happen to be playing on your cellphone)?

At what point do you sit back and slip into pure observation, no longer actively interfacing with the car? Or what if you come across a situation on the interstate where a car is spun out ahead of you, and a snap judgement is the only thing that will save you? A few people have fallen asleep with OP engaged; one woke up some hours later and was fine, while another smashed into the car in front of them, the accelerator still applying gas after the crash.

One thing became immediately apparent to me: during the last few hours of the drive, neither of us was fit to be driving. Our awareness had lapsed from fatigue, our eyes were tired, and I was even a little dizzy. Still, we pressed on. It got to the point where it felt dangerous to NOT have OpenPilot engaged, as it was driving better than I was.

I don’t want to come across as overly alarmist, but I am trying to scare you a little. You must remain ready and willing to take over the controls at any moment, especially when driving with OpenPilot. You must know your limits and be able to make the judgment call to find somewhere to get some rest; fatigue will creep up on you, trust me.

With all of that said, I still greatly enjoy OpenPilot. I wouldn’t have preferred the trip without it. However, one must remain aware, and engaged with their vehicle, ready to take over at a moment’s notice. My advice? Disengage OP and drive yourself every so often to stay acquainted with the car’s controls, especially when tired. It’ll become immediately apparent just how disoriented you are at any given moment.

Misc OpenPilot notes after a 3,000 mile road trip (Part 1)

Indiana to Key West! Not only the longest road trip I’ve been on, but the longest trip I’ve had with OpenPilot! This is Part 1, as we will be driving back to Indiana in a few days.

I am running my highly customized 0.6.2 fork that is available here:

https://github.com/commaai/openpilot/compare/devel…zorrobyte:devel_ZSS

This trip wouldn’t have been nearly as comfortable, or even possible, without my custom steer angle sensor (ZSS – Zorro Steer Sensor). The stock Toyota steer angle sensor on TSS1 cars is garbage: not only is it only 0.5 degrees precise, it can also lag by up to 2.5 degrees in any given timestep due to torsional effects and backlash in the steering column between the sensor and the EPS assembly.

jsh348 has designed a custom PCB and he just so happens to live down the road from me. Small world! It’s absolutely insane that something like this is being manufactured based on such humble beginnings.

On to the random observations (on 0.6.2)!

  • The stock MPC costs are fine. A rate cost lower than 1.0 on Toyota is uncomfortable on the interstate (even 0.7). The same goes for the Path, Heading, and Lane costs.
  • INDI always oversteers in curves. You can fix this with my desired-angle hack, but even then it needs more work, as longer ~3-degree curves are still ridden (especially lefts, for whatever reason)
  • CAMERA_OFFSET may be a factor in riding curves. I have mine set to 0.0 which is comfortable in two lane opposed traffic (further right), but may be a contributor to curve hugging
  • Static, accurate steer ratios are critical for good lateral performance, if you have a static-ratio rack! Most cars do, since it’s EPS torque that is modulated with speed and angle. Why? OP needs to know how much steering wheel angle corresponds to how much tire angle; the stock OP values massively oversteer in curves due to invalid MPC estimation / vehicle-model curvature planning. This has also been verified on a 2020 Corolla on 0.6.3
  • The stock INDI values are fine, but you can drop timeConstant to 0.1 if running with ZSS (4,3,0,1). I’d imagine on TSS1.5/2 vehicles with 0.056 precise sensors, you could also run with a lower timeConstant
  • My fixes for lane width and right-exit diving on 0.6.2 still work better than 0.6.3’s exit diving
  • 0.6.3’s lane-less model can lead to erratic behavior when lane confidence is low. I’m not too hard on it as it is the first release of the path planning/laneless model
  • Road camber handling is still a problem. Maybe this’ll be enhanced soon
  • Camera focus in heavy rain can be an issue as it focuses on the rain droplets on the windshield instead of the road, leading to vision failure/disengagement
  • My battery-less setup at 3.8 V slowly “discharged” over the course of 13 hours, leading to a “Low Battery” disengagement for the last hour of the drive. Power cycling reset the “timer”. A higher voltage may be needed
  • Openpilot’s long control leaves much to be desired. Stop and go, even with the pedal, is unusable. Much effort went into rewriting some of the long code with good results, but it then stopped far too close for comfort in some instances. I’m running stock long (DSU connected) and you should, too (apart from the extremely irritating Cruise Fault regression, which requires a full reboot of the Eon, and the fact that Automatic Emergency Braking is disabled with the DSU disconnected)
  • Use a deadzone, or round the desired angle (and the steering angle, if using ZSS) to two decimal places. If using a deadzone, set it to your steering sensor’s factor. Desired angles carried out to many decimal places just act as noise to the torque controller (PID, INDI, LQR). See the sketch after this list.
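
A minimal sketch of that conditioning, assuming a 0.5-degree sensor factor; the helper name is mine, not openpilot’s:

    def condition_angle(angle_deg, deadzone_deg=0.0, decimals=2):
      # Either snap tiny angles to zero (deadzone set to your steering
      # sensor's resolution) or round to two decimals, so sub-resolution
      # jitter never reaches the PID/INDI/LQR torque controller as noise.
      if deadzone_deg > 0.0 and abs(angle_deg) < deadzone_deg:
        return 0.0
      return round(angle_deg, decimals)

    # Example with the stock Toyota TSS1 sensor's 0.5-degree resolution
    print(condition_angle(0.31, deadzone_deg=0.5))   # 0.0
    print(condition_angle(1.23456))                  # 1.23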

I look forward to digging into 0.6.3 in depth once I return, especially to see if my static steerRatio hacks still hold water; apart from good angle data, they have proven to be the greatest enhancement for OpenPilot.

I have plans to get my hands on a 2018 C-HR actuator assembly to compare the differences, and to see if I can dump its firmware to bring good stock angle data to Priuses. The part numbers are the same; at the very least I’ll perform a swap of the units and write up my experiences.

Some of my custom cases for Eon (OnePlus 3t)

First post on the new blog! It feels good to be back at it again. I’m going to go ahead and stick these in a Category called OpenPilot.

These designs are for the OnePlus 3t. I’m unsure whether they would fit a LeEco; however, my LeEco arrives today (and I’ll be designing cases for it).

First off, we have what I call ZorroCase Minimal. It’s a clamp of sorts that fits into a $9 car mount. The goal here was to come up with an alternate way of mounting Eon that couldn’t fall off my windscreen.

Mounted to Prius Prime
Design Detail

You can download the source files here: https://a360.co/2xNt9pl

Next up is a case I call ZorroCase Naked. It’s a clip based on the FrEon 24 degree mount back when the repo was labeled “Open Source”.

This mount directly attaches to the heatsink and then to a GoPro mount. This means that you’ll need to affix your heatsink using thermal adhesive or some other method, as the heatsink becomes a part of the phone’s structure.

This design was created with a built-in tolerance of 0.3 mm, plus further tolerance for being printed in PLA and annealed (which causes some contraction). I baked this clear-PLA print in the oven at 200°F for 30 minutes to heat-treat it (greatly increasing its heat resistance), and it fits perfectly. YMMV.

You can download this case here: https://a360.co/32E5I07

These cases were created out of a desire to reduce complexity and print time. The OpenFrEon, for example, has about a 12 hour print time, which is brutal when you need to print several to find tolerances when annealing.