GET racing - GeT Cameras Collaboration

GET racing, the Formula Student team of the Technical University of Dortmund, joined a collaboration with GeT Cameras in early 2022 to enhance its autonomous racing capabilities. The team spent its first years competing with combustion vehicles and has since transitioned to electric racing cars. To improve performance on the race track, two high-speed MER2-160-75GC-P cameras with the Sony IMX273 sensor were installed on top of the car.

The Formula Student competition

Formula Student is an international engineering competition in which university teams design and build race cars to compete in different disciplines. The GET racing student team has taken part in this competition since 2005. With a total of eleven cars built, the team introduced its second electric vehicle in 2023.

Switching from combustion engines to electric motors was a big step for the team. It was not the only change: the car also received its first full aerodynamics package, comprising a front wing, rear wing and rear diffuser. These developments show the team's dedication to innovation, further amplified by the step into the world of autonomous racing, where new challenges arise.

Sponsorship

GET racing's electric vehicle, equipped with GeT Cameras' vision system, is set to compete in the Formula Student engineering competition. Over the course of one week, the team is judged on static events: members must convince expert judges in a technical interview as well as by presenting a business pitch and a cost calculation. They also demonstrate the car's capabilities in dynamic events on the race track, covering the disciplines acceleration, autocross and skid pad.

Machine vision cameras are crucial for addressing challenges in perception, localization and control. They help the team perceive and digitize the track in order to find optimal trajectories for the car to follow. This year the team incorporated a vision system built around GeT Cameras' MER2-160-75GC-P, featuring the Sony IMX273 sensor. Combining a LiDAR sensor with two wide-angle lenses on the 1.6MP cameras added an extra layer to the perception system, increasing both reliability and range.

The vision hardware

The following machine vision components are implemented in the vision system on the car’s frame:
  • MER2-160-75GC-P: a 1.6MP GigE camera with a C-mount, capable of capturing 75 frames per second at full resolution.
  • LM12-5MP-03MM-F2.8-2-ND1: a 3.3mm M12-mount lens with optical distortion below 1% and a horizontal field of view of 73°.
An overview of all machine vision products is shown on the website. The MER2-160-75GC-P camera, equipped with the Sony IMX273 sensor, delivers 75 images per second at 1.6MP, ensuring minimal motion blur. Its global shutter freezes images without artifacts, which is crucial for accurate computation results.
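The quoted 73° field of view can be sanity-checked with a simple pinhole-camera calculation. The sketch below is illustrative: the sensor figures (1456 × 1088 pixels at 3.45 µm pixel pitch) come from Sony's public IMX273 specifications, and a real lens deviates slightly from the ideal pinhole model, so the result is only an approximation:

```python
import math

# Approximate horizontal field of view from the pinhole-camera model.
# Sensor figures are taken from Sony's public IMX273 specifications
# (1456 x 1088 pixels, 3.45 um square pixels); the 3.3 mm focal
# length comes from the lens listed above.
PIXEL_SIZE_MM = 3.45e-3
H_PIXELS = 1456
FOCAL_LENGTH_MM = 3.3

def horizontal_fov_deg(h_pixels, pixel_size_mm, focal_mm):
    """HFOV = 2 * atan(sensor_width / (2 * focal_length))."""
    sensor_width = h_pixels * pixel_size_mm
    return math.degrees(2 * math.atan(sensor_width / (2 * focal_mm)))

hfov = horizontal_fov_deg(H_PIXELS, PIXEL_SIZE_MM, FOCAL_LENGTH_MM)
print(f"Estimated HFOV: {hfov:.1f} deg")  # ~74.5 deg, close to the quoted 73 deg
```

The small gap between the ideal 74.5° and the specified 73° is expected, since real lens designs trade a little field of view for low distortion.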

A central processing unit in the car handles data from the cameras and the LiDAR sensor, while also making decisions and controlling steering, throttle and brake actuation. Continuous research and development goes into optimizing the unit's performance while minimizing its size, making it easier to integrate and further improving the race car's performance.

Testing of the camera set-up

After receiving the cameras, the team made significant progress in testing and development. Initially, they conducted static tests on a test stand to evaluate the performance of the autonomous system without any external influences.

Once these tests were completed successfully, they moved on to dynamic tests: driving the car with the fully activated autonomous system. These tests brought the biggest challenges, such as varying lighting conditions and sensor movement. However, the Sony IMX273 sensor's global shutter ensured minimal motion blur at high speeds, providing precise data for perception. Overall, this testing approach, combined with the two machine vision cameras, resulted in a robust autonomous solution.
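To see why a short global-shutter exposure keeps motion blur low at racing speeds, a rough order-of-magnitude estimate can be made with the same pinhole model. All scenario numbers below (relative speed, exposure time, object distance) are hypothetical, chosen only for illustration, not measured values from the GET racing car:

```python
# Rough motion-blur estimate under the pinhole-camera model. The
# sensor/lens figures match the hardware above; the driving scenario
# itself is an illustrative assumption.
PIXEL_SIZE_MM = 3.45e-3   # Sony IMX273 pixel pitch (public spec)
FOCAL_LENGTH_MM = 3.3     # lens focal length from the list above

def motion_blur_px(speed_mps, exposure_s, distance_m):
    """Image-plane blur in pixels for an object moving laterally."""
    focal_px = FOCAL_LENGTH_MM / PIXEL_SIZE_MM  # focal length in pixels
    ground_motion_m = speed_mps * exposure_s    # movement during exposure
    return ground_motion_m * focal_px / distance_m

# A track cone 8 m ahead, 15 m/s relative speed, 0.5 ms exposure:
blur = motion_blur_px(15.0, 0.0005, 8.0)
print(f"Blur: {blur:.2f} px")  # ~0.90 px -> effectively frozen
```

Under these assumed conditions the blur stays below one pixel, which is consistent with the team's observation that the global shutter delivers sharp images at speed.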

Report Card

Support: 
GET racing was pleased with all the support offered by the GeT Cameras team. Apart from one minor driver problem that could not be solved, they were thankful for the close support, especially around lens selection, which ensured the right field of view.

Durability: 
After many hours of testing, the team reports that the cameras still perform as they did on the first day of use.

Image Quality:
The team rates the image quality 4.5 out of 5 stars. The resolution of the 1.6MP cameras allowed them to extend their sensing range from around 3 meters to around 8 meters while the image quality remained excellent.
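That range figure can be made plausible with the pinhole model used above: assuming a track cone roughly 0.3 m tall (a hypothetical size, not stated in this article), one can estimate how many pixels a cone spans at each distance:

```python
# Apparent cone height in pixels at a given distance, pinhole model.
# Sensor/lens figures as before (Sony IMX273, 3.3 mm lens); the 0.3 m
# cone height is an assumed, illustrative value.
PIXEL_SIZE_MM = 3.45e-3
FOCAL_LENGTH_MM = 3.3
CONE_HEIGHT_M = 0.3

def cone_height_px(distance_m):
    """Vertical extent of the cone on the sensor, in pixels."""
    focal_px = FOCAL_LENGTH_MM / PIXEL_SIZE_MM
    return CONE_HEIGHT_M * focal_px / distance_m

for d in (3.0, 8.0):
    print(f"{d:.0f} m -> {cone_height_px(d):.0f} px")  # 3 m -> ~96 px, 8 m -> ~36 px
```

Even at 8 m a cone would still span tens of pixels vertically, which is consistent with the extended detection range the team reports.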

Flexibility: 
The cameras earn a perfect score for flexibility: they run on multiple operating systems and support many parameter adjustments for the different kinds of environments in which the cameras must perform.

Implementation Process: 
While the implementation process was generally smooth, some challenges were encountered. The support offered by GeT Cameras' technical team helped the students resolve these issues.

Pricing: 
The team's decision to work with GeT Cameras was initially based not on pricing but on specifications and support. Judging by market prices, however, they believe the cameras are fairly priced and cheaper than comparable models.

The collaboration between GET racing and GeT Cameras marks a significant stride in the world of autonomous racing. With a well-designed vision system and a shared passion for innovation, the partnership can be called a success. Want to keep up to date with the student team's latest achievements? Follow GET racing on LinkedIn or take a look at their website.
 
  Machine vision market & applications     26-02-2024 15:03