Penn Electric Racing - GeT Cameras collaboration

Penn Electric Racing is a student-run organization dedicated to building electric racecars. Every year, the team competes with other student-built racecars in Formula SAE, an international competition in which university teams design and manufacture the best-performing racecars.

Their ambition is to gain experience and, in the future, compete in the autonomous competition as well. To achieve this, Penn Electric Racing reached out to GeT Cameras. While searching online for compact industrial cameras, they came across our global shutter MER2-160-227U3C with the SONY IMX273 image sensor.

Sponsorship

Penn Electric Racing got in touch with GeT Cameras because of the clear website and competitive pricing. “GeT Cameras had the camera prices displayed on the website which was nice for orientation. Because of the transparency we decided it was worth reaching out.”

The vision components

After discussing the project with Penn Electric Racing, we concluded that their vision system needed the following components:
  • MER2-160-227U3C (4 pcs.): A 1.6 MP global shutter USB3 camera with a C-mount, able to reach 227 frames per second at full resolution.
  • LM12-5MP-02MM-F2.0-3-MD1 (2 pcs.): Two wide-angle M12 lenses. With these lenses the team can gather images and detect traffic cones over a wide angle, at the cost of a medium distortion rate. Their horizontal field of view is 116° on a 2/3-inch sensor.
  • LM12-5MP-06MM-F2.8-1.8-ND1 (2 pcs.): Two M12 lenses with a narrower field of view to detect cones at longer distances. These low-distortion lenses result in good depth perception (a rough field-of-view estimate for both lens types follows this list).
  • LADAP-C-TO-M12-V2 (4 pcs.): A C to M12 mount adapter created and patented by GeT Cameras. This adapter is specially designed so that the M12 lens can be fixed in the lens holder after focusing; the focus ring can also be locked in place.
  • CABLE-D-USB3-5M (4 pcs.): Four cables to connect the USB3 cameras to the on-board computer of the racecar.
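As a rough sanity check on the lens choice, the horizontal field of view can be estimated with a simple pinhole model. The short Python sketch below is ours, not the team's; it assumes the IMX273's published geometry (1456 × 1088 pixels at 3.45 µm, about 5.0 mm of active width) and ignores lens distortion, so the wide-angle figure is only an approximation.

```python
import math

def horizontal_fov_deg(focal_length_mm: float, sensor_width_mm: float) -> float:
    """Pinhole-model horizontal field of view in degrees."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Approximate active width of the SONY IMX273: 1456 px * 3.45 um ~= 5.0 mm
SENSOR_WIDTH_MM = 1456 * 3.45e-3

print(f"2 mm wide-angle lens: {horizontal_fov_deg(2.0, SENSOR_WIDTH_MM):.0f}° HFOV")
print(f"6 mm narrow lens:     {horizontal_fov_deg(6.0, SENSOR_WIDTH_MM):.0f}° HFOV")
```

This gives roughly 103° for the 2 mm lens and 45° for the 6 mm lens on the IMX273; the 116° quoted above is specified on the larger 2/3-inch sensor format.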

The testing

Since the car is not finished yet, Penn Electric Racing has only been able to test the cameras indoors. However, their first impression of the image quality and the high-speed performance was even better than expected. This is mainly because the image sensor is larger than the ones they were used to.

The team used our ‘QuickStart Guide’ to download the SDK and set the camera parameters. “Thanks to the QuickStart Guide we were able to easily gather the first pictures. The best part was the Python and C++ sample programs. This way we could easily get an impression of what the cameras were capable of.”
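To give an idea of what such a first test looks like, here is a minimal capture sketch. It assumes the gxipy Python bindings that ship with the camera's SDK; the parameter values are illustrative placeholders, not the team's actual configuration.

```python
# Minimal first-image test, assuming the gxipy Python bindings from the camera SDK.
# Exposure and gain values below are illustrative placeholders.
import gxipy as gx

device_manager = gx.DeviceManager()
dev_num, dev_info_list = device_manager.update_device_list()
if dev_num == 0:
    raise RuntimeError("No camera found - check the USB3 connection")

cam = device_manager.open_device_by_index(1)   # first camera found
cam.TriggerMode.set(gx.GxSwitchEntry.OFF)      # free-running acquisition
cam.ExposureTime.set(2000.0)                   # 2 ms exposure, in microseconds
cam.Gain.set(6.0)                              # analog gain in dB

cam.stream_on()
raw_image = cam.data_stream[0].get_image()     # grab one frame
rgb_image = raw_image.convert("RGB")           # demosaic the raw color data
frame = rgb_image.get_numpy_array()            # H x W x 3 numpy array
print("Captured frame with shape:", frame.shape)

cam.stream_off()
cam.close_device()
```

From such a numpy array, the images can be fed straight into OpenCV or the team's own Python and C++ processing code.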

The car

The fully electric racecar is designed to deliver high performance on a narrow track full of tight turns. Focused on handling and acceleration, the car accelerates from 0 to 100 km/h in 3 seconds, which is also its maximum speed.

Autonomous

Building an autonomous car isn’t an easy assignment. The software department of Penn Electric Racing must work on perception, localization, and control:
  • Perception. To drive autonomously, the car must be able to recognize the track. With two machine vision cameras, stereo vision techniques can be used to create a point cloud. The team uses the two narrow-angle lenses to detect cones far ahead on the track and the two wide-angle lenses to detect cones in corners (see the stereo sketch after this list).
Because the car is moving and vibrating at high speeds, the industrial cameras must be able to gather as many images per second as possible. The SONY IMX273 captures 227 images per second at 1.6 MP. At that frame rate the exposure time per frame is necessarily short (at most roughly 4.4 ms), which keeps motion blur close to zero.

The MER2-160-227U3C, equipped with this SONY IMX273 sensor, has a global shutter. Every pixel of a global shutter camera starts and ends its exposure at the same time, freezing the motion with very few artefacts.

Rolling shutters expose image lines at different times as they are read out. At high speeds, the first line is captured earlier than the last line, resulting in a distorted image from which vision software cannot precisely determine the location of the cones. Read more about the difference between global and rolling shutters in our knowledge center.
  • Localization. After processing the images, the car needs to know where it is on the track. By using racing trajectory algorithms, the car can calculate how to drive this track as fast as possible.
  • Control. Once the images have been processed and the car has figured out where to go, the team needs to make sure that the software output can control the mechanical drive-by-wire systems. These drive-by-wire systems take care of braking, accelerating and steering the car.
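As an illustration of the stereo-vision step in the perception bullet above, the sketch below turns a rectified left/right image pair into a 3D point cloud with OpenCV. This is not Penn Electric Racing's pipeline: the focal length, baseline and matcher settings are placeholder assumptions, and in practice the reprojection matrix comes from a full stereo calibration.

```python
# Sketch: depth from a rectified stereo pair with OpenCV.
# Not the team's actual pipeline; calibration values and matcher settings are placeholders.
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # hypothetical rectified left frame
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)  # hypothetical rectified right frame

# Semi-global block matching; numDisparities and blockSize must be tuned per setup.
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # SGBM output is fixed-point

# Q is the 4x4 reprojection matrix normally produced by cv2.stereoRectify()
# from the calibrated intrinsics and the baseline between the two cameras.
f = 1200.0        # placeholder focal length in pixels
baseline = 0.12   # placeholder camera baseline in meters
cx, cy = left.shape[1] / 2.0, left.shape[0] / 2.0
Q = np.array([
    [1.0, 0.0, 0.0, -cx],
    [0.0, 1.0, 0.0, -cy],
    [0.0, 0.0, 0.0,   f],
    [0.0, 0.0, 1.0 / baseline, 0.0],
])

points_3d = cv2.reprojectImageTo3D(disparity, Q)  # H x W x 3 point cloud in meters
valid = disparity > 0
print("Reconstructed", int(valid.sum()), "3D points")
```

The resulting 3D cone positions would then feed the localization and control steps described above.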

The competition

Penn Electric Racing enrolls its electric vehicle in the Formula SAE racing competition and will thereby be competing with other student-built cars on the 17th of May in Brooklyn, Michigan. The car will first be judged in a static event based on how the car is built, a business pitch and a cost calculation. The car must then perform in the dynamic events, where it is judged on its achievements on the track.

First is the acceleration test: the car drives a 75-meter strip, starting from a standstill, in as little time as possible. The second trial, the autocross event, is a short timed track. Third, the skid pad event is a figure-eight track that tests how much grip the car has when handling corners at higher speeds. The most important one is the endurance trial: the racecar must drive 20 laps to prove it is reliable enough to sustain these extreme speeds, forces and environmental factors.

Report card

Support: 
Penn Electric Racing received close support from GeT Cameras and is grateful for it. The help with lens selection was especially appreciated: the suggested lenses gave the team exactly the field of view they needed.

Durability:
The team has only used the machine vision products indoors so far, but even indoors they are noticing the sturdiness and industrial quality. The metal housing feels robust, and the screw-lock cable ensures a stable USB3 connection that will not be interrupted if the cable is pulled by accident.

Image Quality: 
Before switching to industrial cameras, the team was using phone cameras. Since the industrial camera's sensor is much larger and more light-sensitive, the image quality is better. In addition, manual control over settings such as the exposure time (shutter speed) makes for a high-quality image.

Flexibility: 
Penn Electric Racing granted 5 out of 5 stars for flexibility because the camera runs on multiple operating systems. Thanks to its USB3 conformity, it can be used with many standardized vision programs. “Since the camera has drivers and APIs with many programming languages you can go whatever way needed with the GeT Cameras products.”

Implementation process:  
The student engineers are pleased with how easily they could implement their vision system of cameras, lenses, and cables. The QuickStart Guide and the software samples that come with the SDK were helpful.

While mostly happy with the implementation, they also ran into some trouble. The camera housing is so compact that it is hard to mount to the car. In addition, the programming examples are fairly basic; more advanced examples for low-latency environments would be preferred.

Pricing: 
Thanks to the transparent online store of GeT Cameras, the team was able to see the very competitive pricing of the products. It is also convenient that delivery times are shown, and that the pricing adjusts accordingly when a different delivery option is selected.

Follow us

Did you enjoy learning about Penn Electric Racing? Make sure to follow them on LinkedIn. Later this year we will also release a second article focused on the technical side of implementing a low-latency vision system with the Robot Operating System (ROS). Follow us on LinkedIn to get notified when that article drops.

Do you have any questions about this sponsorship, machine vision cameras or industrial vision? Get in touch with us. We strive to answer all questions within 24 hours.

 