odins-eye

Odin's eye is a DIY speed camera. It uses a TF-Luna lidar to detect passing vehicles and measure their speed, which triggers the raspi camera module 3 to take a picture of the vehicle; the picture is then analyzed by a pi zero 2w. From the picture it identifies the vehicle and, in the case of a car, reads its license plate. Then, using the pi's wifi, it sends the data to your phone.


Timeline

pro-grammer šŸš€ added to the journal

Wiring the Schematic

I started by laying out the different parts of the project, placing the inputs and power connections on the left, the processing in the middle, and the outputs on the right. This setup made it easier to visualize how data and power would flow through the system. Before wiring the sections, I spent some time researching and watching a few tutorials to make sure I understood everything. I also looked at a few example schematics to double-check pin configurations.

For the plug, I found that I would actually need a USB-C receptacle because I just needed it for charging the circuit. I only connected the necessary power and ground pins and included the proper CC resistors for charging negotiation, then moved on to the 6-pin connector.
The 6-pin connector was fairly easy because, using the pinout diagram from the TF-Luna we had bought, I connected 5V, RXD, TXD, and GND. The last two pins aren't used in this model.
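Those RXD/TXD lines are the TF-Luna's UART, so on the software side reading it should look roughly like the sketch below (this assumes the sensor's default 115200-baud UART mode and the Pi's /dev/serial0; it isn't the project's firmware):

```python
# Rough sketch: read distance frames from the TF-Luna over the Pi's UART.
# Assumes the default UART mode (115200 baud) and that the sensor's TXD/RXD
# are wired to the Pi's UART, which shows up as /dev/serial0.
import serial

FRAME_HEADER = 0x59  # every TF-Luna frame starts with 0x59 0x59

def read_frame(port):
    """Return (distance_cm, signal_strength) from the next valid 9-byte frame."""
    while True:
        if port.read(1) != bytes([FRAME_HEADER]):
            continue
        if port.read(1) != bytes([FRAME_HEADER]):
            continue
        payload = port.read(7)  # dist L/H, strength L/H, temp L/H, checksum
        if len(payload) < 7:
            continue
        distance = payload[0] | (payload[1] << 8)   # cm
        strength = payload[2] | (payload[3] << 8)
        checksum = (2 * FRAME_HEADER + sum(payload[:6])) & 0xFF
        if checksum == payload[6]:
            return distance, strength

if __name__ == "__main__":
    with serial.Serial("/dev/serial0", 115200, timeout=1) as port:
        while True:
            dist, strength = read_frame(port)
            print(f"distance={dist} cm  strength={strength}")
```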

At this point, I connected 5V, GND, RXD, and TXD to pins 2, 6, 8, and 10 respectively on the raspi's GPIO header. I moved on to the 30-pin connector, and here I had some problems. I originally wanted this 30-pin connector because it has no extra PCB border and is great for custom enclosures. The problem is I would have to connect around 20–25 of the 30 pins.

30-pin-connector

I would also have to add capacitors with different values and deal with a large number of pins. This is highly error-prone, and iteration with hardware isn't quick. In the end, I decided to delete the 30-pin connector and use the 7-pin connector, which would represent a header for an OLED like this:
https://www.amazon.es/-/en/HiLetgo-SSD1331-Display-3-3V-5V-Colorful/dp/B0813BB3K7

I wired this 7-pin connector and, lastly, wired the LED, which I decided would be green, using a 1 kΩ resistor. Once everything was wired, I added some no-connect flags on a few unused pins, and a small note on the USB-C. I finished by fixing two or three warnings from the ERC.
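For the display itself, driving the SSD1331 from the Pi should be simple; here is a minimal sketch using the luma.oled library (just one possible option, and the DC/RST GPIOs are placeholders that would have to match the final schematic):

```python
# Minimal sketch: drive the 7-pin SSD1331 OLED over SPI with luma.oled.
# The DC and RST pin numbers are placeholders, not the final schematic's.
from luma.core.interface.serial import spi
from luma.core.render import canvas
from luma.oled.device import ssd1331

serial = spi(port=0, device=0, gpio_DC=24, gpio_RST=25)  # example GPIOs
device = ssd1331(serial)

with canvas(device) as draw:
    draw.text((0, 0), "odins-eye", fill="white")
    draw.text((0, 20), "speed: -- km/h", fill="white")
```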

schematic2

pro-grammer šŸš€ added to the journal

Starting The Schematic

The TF Luna arrived, but because of an ordering mistake, it didn't come with the right cables, so I couldn't connect it to the breadboard. I could buy the cable separately, but I found a TF-Luna that comes with all the right cables and is quite cheap, so I decided to buy that one. Unfortunately, it will take around two weeks to arrive, so I decided to get started with the schematics.

TF Luna: https://www.amazon.com/Wishiot-TF-Luna-Single-Point-Terminal-WiFi_Lora_32/dp/B09PZ559ZB

This is my first time designing schematics, so to begin with, I looked at some of the guides Blueprint offers. I began by trying EasyEDA: I created an account, started the project, and got a better idea of what components I needed. I started inserting some of the components into EasyEDA. At this point, I needed to get the basics down, so I looked at some tutorials. One that helped a lot was this one: https://www.youtube.com/watch?v=R_Ud-FxUw0g

After these tutorials, I thought that I might also have a look at KiCad, so I downloaded it, created the odins-eye project, and having a better idea of what I was doing, I started to get a good feel for it. After following this tutorial for a bit: https://docs.kicad.org/6.0/en/getting_started_in_kicad/getting_started_in_kicad.html, I decided that I was ready to begin designing the speed camera.

The first thing that I looked at was the raspi's 40-pin GPIO header (2x20). Because the exact model I needed wasn't available, and also because I needed to get familiar with KiCad, I decided to design my own symbol, which I am quite proud of.

Next, I inserted a 6-pin connector which represents the JST connector. I also added the USB-C plug, a 30-pin connector which represents the ZIF connector for the 0.96 oled, one LED with a corresponding resistor, and two more resistors for a voltage divider to measure the battery.
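For the battery voltage divider, the math is just Vout = Vin Ā· R2 / (R1 + R2); here is a quick sanity check (the resistor values below are example values only, not the ones in the schematic):

```python
# Sanity check of the battery-sense divider: Vout = Vin * R2 / (R1 + R2).
# The resistor values are placeholders, not the schematic's real values.
def divider_out(v_in, r1, r2):
    return v_in * r2 / (r1 + r2)

# Example: a 5 V rail halved by two equal resistors (e.g. 100k over 100k)
print(divider_out(5.0, 100_000, 100_000))  # -> 2.5
```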

This is where I’m up to so far:

kicad2

pro-grammer šŸš€ added to the journal

Updated Speed Measurement Method

I realize that measuring the speed of a vehicle from a video is extremely difficult. There are so many variables that come into play. Now I see how silly my idea was. Here are some of the things that make my original idea so challenging:

  • You need to know the real-world distance the car actually travelled
  • The video's timing is often unreliable because the frame rate isn't constant
  • The camera's angle and lens bend the image

So... I decided to have a long chat with Gemini about other options. I looked into Doppler radar but realized that for it to work I would also need an amplifier, a separate analog circuit, because the signal would be too weak and noisy for the raspi to read directly. I also looked at a microwave motion sensor, but that didn't work because it detects in 360 degrees, so it would also be triggered by objects moving behind it.

More Messy Iteration:
more-messy-iteration.jpg

Then I had a eureka moment; I realized that maybe we could somehow use the TF-Luna lidar to measure the speed of vehicles. Very soon I realized that this could work; here are some reasons why:

  • It uses a narrow laser beam, so it will only detect cars passing in front of it
  • It's electronically simple: you can connect it directly to the raspi's GPIO pins (via UART or I2C)
  • It has very low power consumption and won't put any strain on the power bank (much more efficient than the camera-only method)
  • It's extremely fast: at 100 Hz, the sensor can check the distance 100 times per second
  • It's very compact and it was already in the original design

https://www.amazon.es/-/en/Benewake-TF-Luna-Compatible-Pixhawk-Raspberry/dp/B0DKCSF8C5

Now I would like to briefly explain what a lidar is and what it does. A lidar works by shooting a tiny beam of light and measuring exactly how long it takes for that light to bounce off an object and return.
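Put as a formula, the distance is just the speed of light times half the round-trip time; for example:

```python
# Time-of-flight relation: the light travels out and back,
# so distance = speed_of_light * round_trip_time / 2.
SPEED_OF_LIGHT = 299_792_458  # m/s

def tof_distance_m(round_trip_s):
    return SPEED_OF_LIGHT * round_trip_s / 2

print(tof_distance_m(13.3e-9))  # a ~13.3 ns round trip is roughly 2 m
```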

My first idea for getting the speed was to use the Pythagorean theorem. There would be one fixed distance in the code, (a), the perpendicular distance to the side of the road, and we would use the lidar to measure the diagonal distance to the car, (c). Using the Pythagorean theorem, we would solve for the car's true position along the road (b = sqrt(c^2 - a^2)). By taking two of these measurements, say 0.1 seconds apart, we could determine the distance the car travelled and divide by the time to find the speed.
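As a rough sketch of that idea (the numbers in the example are made up, and this isn't the code from the repo):

```python
# Pythagorean-theorem idea: a is the fixed perpendicular distance to the road,
# c1 and c2 are two diagonal lidar readings taken dt seconds apart, and
# b = sqrt(c^2 - a^2) is the car's position along the road.
import math

def speed_from_two_readings(a_m, c1_m, c2_m, dt_s):
    b1 = math.sqrt(c1_m**2 - a_m**2)
    b2 = math.sqrt(c2_m**2 - a_m**2)
    return abs(b2 - b1) / dt_s  # m/s

# Example: road 5 m away, diagonal shrinks from 7.0 m to 6.2 m in 0.1 s
v = speed_from_two_readings(5.0, 7.0, 6.2, 0.1)
print(f"{v:.1f} m/s ({v * 3.6:.1f} km/h)")
```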

The only problem with this method is that it wouldn't be fully drop-and-go, because you would have to make sure that (a) is the same in the code as in real life, and if not, change one of the two. What I am looking for is a completely stand-alone speed camera that starts working without having to be placed at any specific distance.

Knowing this, I looked into some other options and found a very promising one: the least-squares fit option. In this one, the lidar is placed facing the road at roughly a 45-degree angle (I plan on designing a lock on the case). The lidar sends a distance reading every 0.1 s until it detects an object in its path; once this happens, it starts sending distance readings every 0.001 s or 0.005 s (I haven't decided yet), and because the object is getting closer at an angle, the distance will decrease as long as the lidar isn't hitting the side of the vehicle. Then it fits a straight line through the readings. The slope of that line tells how fast the distance is shrinking. Because the sensor is tilted, it divides that slope by cos(Īø) to get the real speed of the object moving across its view.
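To make the math concrete, here is a minimal sketch of that idea (not the actual code linked below; the 45-degree angle and the sample spacing are just example values):

```python
# Least-squares idea: fit a straight line to (time, distance) samples, take the
# slope (how fast the distance is shrinking), and divide by cos(theta) to undo
# the mounting angle. The angle and samples here are example values only.
import math
import numpy as np

MOUNT_ANGLE_DEG = 45.0  # assumed lidar angle relative to the road

def speed_from_samples(times_s, distances_m, angle_deg=MOUNT_ANGLE_DEG):
    slope, _intercept = np.polyfit(times_s, distances_m, 1)  # m/s along the beam
    return abs(slope) / math.cos(math.radians(angle_deg))    # true speed, m/s

# Example: fake samples of a car closing in at about 10 m/s along the beam
t = np.arange(0.0, 0.05, 0.005)
d = 12.0 - 10.0 * t
v = speed_from_samples(t, d)
print(f"{v:.1f} m/s ({v * 3.6:.1f} km/h)")
```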

To prove this concept, I built some simple example code in Python (which I might have to rewrite in C++ if speed is a problem) to test the basic idea. I asked ChatGPT to generate a set of distances with corresponding signal strengths and tested against those; honestly, I think this might actually work:

test-car.png

50-kph.png

Code:
https://github.com/adrirubio/odins-eye/blob/main/firmware/speed-calculation-code.py

To see how this works, I bought a TF-Luna lidar to try both of these methods out. It still hasn't arrived, but when it does, I plan on seeing which of the two speed measurement methods works better and doing a proof of concept.

pro-grammer šŸš€ added to the journal

Research and Project BOM

I started brainstorming how a DIY speed camera could be possible 3 days ago, and I am quite content with the basic idea I have after a few hours of project and parts research.

My original idea was for the speed camera to have a stand that could rotate and follow vehicles as they pass, but I quickly realized that this would overcomplicate the project too much because I would have to design a stand from scratch. I settled for a universal stand where the speed camera and sensor sit inside a 3d-printed case that screws onto it. I also wanted a touch screen where you could do some sort of onboarding every time you turned the speed camera on, but this idea was also quickly discarded because the onboarding would be unnecessary and small OLED displays usually don't have touch capabilities.

I needed the project to be standalone because my idea was for it to sit in front of my house and gather data from passing vehicles. I thought that having a battery holder and a slide switch to turn it on was a good idea, but, after talking it over with my dad, we decided that it was better to have a power bank that connects to the pcb/carrier board inside the case through a USB-C plug.

After lots of research and modifying the design several times I ended up with the following base idea that I will build on top of:

The system is controlled by a raspi zero 2w that has a raspi camera module 3, a TF-Luna lidar, and a small (possibly 0.95-inch) OLED display connected. It works by detecting a vehicle before it passes using the lidar, which triggers the camera module 3 to start recording. From the recording it can get the speed using the frames per second and distance, and it uses AI to understand what object passed and, in the case of a car, to read its license plate. It'll get power from a power bank that can be mounted on top of the 3d-printed case. The lidar and camera will be situated on the front side of the 3d-printed case, and on the back will be the OLED display and a small green LED that turns on when the pcb/carrier board gets power. Lastly, using the pi's wifi, it can send notifications to my phone.

I made the project BOM, which you can check out here: https://github.com/adrirubio/odins-eye/blob/main/bom.csv. I also made some rough sketches with the base idea I have up until now, but none of these are permanent.

Messy Iteration:
messy-iteration.jpg

Complete Design:
design.jpg

PCB/Carrier Board:
pcb-carrier-board.jpg

pro-grammer šŸš€ started odins-eye