3D Depth Perception
Nerian's 3D Depth Perception in Science
Our stereo vision technology has already been used in numerous research projects. Below you will find a list of scientific publications by us and our customers in which our technology has been applied.
Of course, we are always interested in participating in or supporting research projects whenever possible.
Publications by Nerian
Real-Time Stereo Vision on FPGAs with SceneScan
We present a flexible FPGA stereo vision implementation that is capable of processing up to 100 frames per second or image resolutions up to 3.4 megapixels, while consuming only 8 W of power. The implementation uses a variation of the Semi-Global Matching (SGM) algorithm, which provides superior results compared to many simpler approaches. The stereo matching results are improved significantly through a post-processing chain that operates on the computed cost cube and the disparity map. With this implementation we have created two stand-alone hardware systems for stereo vision, called SceneScan and SceneScan Pro. Both systems have been developed to market maturity and are available from Nerian Vision GmbH.
Published at Bildverarbeitung 2018, pp. 339–350.
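The abstract above refers to the Semi-Global Matching (SGM) algorithm. As a rough illustration of the idea (not Nerian's FPGA implementation), the following sketch applies the standard SGM cost-aggregation recurrence along a single scanline in NumPy; the cost array, penalties `p1` and `p2`, and function name are assumptions for this example:

```python
import numpy as np

def aggregate_path(cost, p1=10, p2=120):
    """Aggregate a matching-cost slice along one scanline, using the
    textbook SGM recurrence:
      L(x, d) = C(x, d) + min(L(x-1, d),
                              L(x-1, d-1) + P1,
                              L(x-1, d+1) + P1,
                              min_d' L(x-1, d') + P2) - min_d' L(x-1, d')
    cost: (width, num_disparities) array of per-pixel matching costs.
    Returns the aggregated cost L with the same shape.
    """
    w, dmax = cost.shape
    L = np.empty((w, dmax), dtype=np.float64)
    L[0] = cost[0]
    for x in range(1, w):
        prev = L[x - 1]
        prev_min = prev.min()
        # Four candidate transitions per disparity level.
        candidates = np.full((4, dmax), np.inf)
        candidates[0] = prev                    # same disparity, no penalty
        candidates[1, 1:] = prev[:-1] + p1      # disparity decreases by 1
        candidates[2, :-1] = prev[1:] + p1      # disparity increases by 1
        candidates[3] = prev_min + p2           # larger disparity jump
        L[x] = cost[x] + candidates.min(axis=0) - prev_min
    return L
```

A full SGM pipeline aggregates such paths from several directions (typically 4 or 8) and sums them before selecting the winning disparity per pixel; the post-processing chain mentioned in the abstract then operates on this cost cube and the resulting disparity map.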
SP1: Stereo Vision in Real Time
Stereo vision is a compelling technology for depth perception. Unlike other methods for depth sensing, such as time-of-flight or structured light cameras, stereo vision is a passive approach. This makes stereo vision suitable for environments with bright ambient lighting, or for situations with multiple sensors in close proximity to one another. The reason why stereo vision is not used more widely is that it requires a vast amount of computation. To overcome this burden, Nerian Vision Technologies introduces the SP1 stereo vision system. This stand-alone device is able to handle the required processing by relying on a built-in FPGA.
Published at MuSRobS@IROS 2015, pp. 40–41.
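Once a stereo system has computed a disparity map, depth follows from simple triangulation over a rectified camera pair. A minimal sketch, with illustrative (assumed) focal length and baseline values:

```python
def disparity_to_depth(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Convert a disparity value to metric depth for a rectified stereo pair.

    Triangulation relation: Z = f * B / d, where f is the focal length in
    pixels, B the camera baseline in meters, and d the disparity in pixels.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Example with assumed parameters: f = 800 px, B = 0.25 m, d = 40 px -> 5 m
depth_m = disparity_to_depth(40.0, 800.0, 0.25)
```

The inverse relation between disparity and depth also explains why depth resolution degrades with distance: a one-pixel disparity error causes a much larger depth error far from the cameras than close to them.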
Customer publications referencing SceneScan
- Hinze, C., Zürn, M., Wnuk, M., Lechler, A. & Verl, A. (2020). Nonlinear Trajectory Control for Deformable Linear Objects based on Physics Simulation. In 46th Annual Conference of the IEEE Industrial Electronics Society (IECON) (pp. 310–316). [Link]
- Strobel, K., Zhu, S., Chang, R. & Koppula, S. (2019). Accurate, Low-Latency Visual Perception for Autonomous Racing: Challenges, Mechanisms, and Practical Solutions. Technical report. [Link]
- Hinze, C., Wnuk, M. & Lechler, A. (2019). Harte Echtzeit für weiche Materialien. atp magazin, 61(11-12), 112–119. [Link]
- Vrba, M., Heřt, D. & Saska, M. (2019). Onboard Marker-Less Detection and Localization of Non-Cooperating Drones for Their Safe Interception by an Autonomous Aerial System. IEEE Robotics and Automation Letters, 4(4), 3402-3409. IEEE. [Link]
Customer publications referencing Stereo Vision IP Core
- Kartha, A., Sadeghi, R., Barry, M. P., Bradley, C., Gibson, P., Caspi, A., Roy, A. & Dagnelie, G. (2020). Prosthetic Visual Performance Using a Disparity-Based Distance-Filtering System. Translational Vision Science & Technology, 9(12), 27. [Link]
- Fütterer, R., Schellhorn, M., & Notni, G. (2019). Implementation of a multiview passive-stereo-imaging system with SoC technology. In Photonics and Education in Measurement Science 2019 (Vol. 11144, p. 111440Q). International Society for Optics and Photonics. [Link]
Customer publications referencing SP1
- Erz, M. (2018). Computer vision based pose detection of agricultural implements without a priori knowledge of their geometry and visual appearance. In 2018 15th IEEE International Conference on Advanced Video and Signal Based Surveillance (AVSS) (pp. 1-6). IEEE. [Link]
- Buck, S., & Zell, A. (2019). CS::APEX: A Framework for Algorithm Prototyping and Experimentation with Robotic Systems. Journal of Intelligent & Robotic Systems, 94(2), 371–387. [Link]
- Hanten, R., Kuhlmann, P., Otte, S., & Zell, A. (2018). Robust Real-Time 3D Person Detection for Indoor and Outdoor Applications. In 2018 IEEE International Conference on Robotics and Automation (ICRA) (pp. 2000-2006). IEEE. [Link]
- Dubey, G., Madaan, R., & Scherer, S. (2018). DROAN: Disparity-Space Representation for Obstacle Avoidance: Enabling Wire Mapping & Avoidance. In 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (pp. 6311–6318). IEEE. [Link] [PDF]
New stereo vision project?
We are happy to help you with the evaluation!
If you would like to stay informed about our stereo vision products, subscribe to our mailing list: