Other Research and Fun Projects
Self-Parking Car
A vision system with multiple cameras provides a bird's-eye view of the surroundings of the autonomous vehicle for self-parking and proximity sensing. This is an ongoing project in the Robotic Vision Lab.
The Hummingbird project uses a drone, Raspberry Pi, and Pi camera to track and catch a foam basketball. The system detects, tracks, and estimates the trajectory of the basketball. Trajectory estimation is critical because the camera moves and loses track of the basketball when the drone tilts and maneuvers. This project can be expanded to track, recognize, and chase birds away from farms, orchards, wind turbines, or restricted airfields.
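The trajectory-estimation step can be sketched as fitting a ballistic parabola to the most recent ball detections and extrapolating while the ball is out of view. This is a minimal illustration, not the project's actual code; the sample detections below are made up:

```python
def fit_parabola(pts):
    """Fit y(t) = a*t^2 + b*t + c exactly through three (t, y) detections
    (a parabola is the ballistic model for the ball's height)."""
    (t0, y0), (t1, y1), (t2, y2) = pts
    d0 = (t0 - t1) * (t0 - t2)
    d1 = (t1 - t0) * (t1 - t2)
    d2 = (t2 - t0) * (t2 - t1)
    a = y0 / d0 + y1 / d1 + y2 / d2
    b = -(y0 * (t1 + t2) / d0 + y1 * (t0 + t2) / d1 + y2 * (t0 + t1) / d2)
    c = y0 * t1 * t2 / d0 + y1 * t0 * t2 / d1 + y2 * t0 * t1 / d2
    return a, b, c

# Illustrative (time, height) detections captured before the drone tilts away:
a, b, c = fit_parabola([(0.0, 1.00), (0.1, 1.25), (0.2, 1.40)])

def predict_height(t):
    """Extrapolate the ball's height while it is out of view."""
    return a * t * t + b * t + c

print(round(predict_height(0.5), 2))
```

In practice the system would fit many noisy detections (e.g. by least squares) and work in 3-D; the exact three-point fit above just keeps the idea dependency-free.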
An efficient target detection and tracking algorithm was implemented on an FPGA to perform ultra-low-latency visual servoing for high-speed target tracking using multi-focal-length camera arrays. This system was designed for tracking bright objects in the dark sky. In the video, the green dot simulates the randomly moving bright object, and the red dot indicates the center of the image frame. The object is detected and tracked by a low-resolution camera system on a high-powered, high-payload gimbal, which guides a high-resolution, high-frame-rate camera, also mounted on the gimbal, to record high-quality videos for analysis.
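The servo loop described above amounts to steering the gimbal so the tracked object (green dot) moves toward the image center (red dot). A minimal proportional-control sketch, with made-up resolution and gain rather than the project's actual parameters:

```python
WIDTH, HEIGHT = 640, 480   # assumed low-resolution tracker frame
KP = 0.05                  # assumed proportional gain, deg/s per pixel

def servo_command(target_x, target_y):
    """Return (pan_rate, tilt_rate) driving the pixel error (target
    minus image center) to zero; zero error means the target is centered."""
    err_x = target_x - WIDTH / 2
    err_y = target_y - HEIGHT / 2
    return KP * err_x, KP * err_y

print(servo_command(320, 240))  # centered target -> zero commanded motion
```

A real FPGA implementation computes this error every frame at very low latency; a deployed controller would also add damping or feed-forward terms, which are omitted here.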
Safense is an embedded vision sensor designed to be mounted on the back of a bicycle, motorcycle, or helmet to detect approaching vehicles or objects of threat (OOT) and provide an early warning to the cyclist to avoid accidents. It sends the warning signal to a mobile App through Bluetooth or directly to an indicator light. Watch this video to see the statistics on cyclist safety.
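One common cue for "approaching vehicle" from a single rear-facing camera — an assumption here, not necessarily Safense's actual method — is time-to-contact estimated from how fast a tracked bounding box expands between frames:

```python
def time_to_contact(width_prev, width_curr, dt):
    """Approximate time-to-contact in seconds: an object closing at
    roughly constant speed grows in the image, and TTC ~ w / (dw/dt)."""
    if width_curr <= width_prev:
        return float("inf")   # not approaching
    return dt * width_curr / (width_curr - width_prev)

# A box growing from 50 to 55 px in 0.1 s implies contact in about 1.1 s
ttc = time_to_contact(50, 55, 0.1)
WARN_THRESHOLD = 3.0          # assumed warning threshold, seconds
print(ttc < WARN_THRESHOLD)   # trigger the early warning
```

The appeal of this cue is that it needs no distance calibration; only the relative expansion rate of the tracked object matters.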
All smartphones are equipped with most of the sensors needed for a self-driving car, including a camera, Bluetooth, compass, GPS, gyroscope, and accelerometer. We developed a mobile vision App and used the smartphone to navigate a truck through obstacles autonomously.
Ball Hawk is a mobile vision App designed to improve basketball free-throw accuracy. NBA teams use similar but much more expensive tools for training. Our App detects and tracks the basketball and calculates its trajectory to report its approach angle in real time. This real-time feedback helps train the player's muscle memory to improve shooting accuracy.
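The approach-angle report can be sketched as a single arctangent: given the ball's horizontal and vertical velocity components near the rim (both obtainable from the fitted trajectory), the entry angle below horizontal follows directly. The velocity values below are illustrative, not app data:

```python
import math

def approach_angle(vx, vy):
    """Entry angle in degrees below horizontal for a descending ball with
    horizontal speed vx and vertical velocity vy (negative = falling)."""
    return math.degrees(math.atan2(-vy, abs(vx)))

# e.g. 4 m/s toward the hoop while falling at 5 m/s
print(round(approach_angle(4.0, -5.0), 1))  # -> 51.3
```

Reporting this number shot after shot is the real-time feedback the blurb describes; shooting coaches commonly cite entry angles in the mid-40s as a target.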
Computers have been beating humans at board games like chess and Go. But now they've conquered the basement and bar game of foosball too. A group of undergrads at Brigham Young University in Utah has built an AI machine that can play on a modified foosball table. In a recent game, the machine defeated a human player four to one. BYU News. CNN Business. KSL TV. Fox News. Deseret News.
This is a next-generation automobile headlamp. It has 24 high-powered LEDs that can be controlled and dimmed individually. A forward-looking camera mounted on the rear-view mirror detects oncoming vehicles' headlights. This vision sensor detects and tracks the oncoming vehicle and dims the LEDs that could blind its driver, increasing road safety.
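A hypothetical sketch of the selective-dimming logic: map the image column of a tracked headlight to the LED whose beam covers that direction, and dim it along with its neighbors. The 24-LED count comes from the description above; the camera resolution and neighbor margin are assumptions:

```python
NUM_LEDS = 24        # from the description above
IMAGE_WIDTH = 1280   # assumed camera resolution

def leds_to_dim(headlight_x, margin=1):
    """Indices of LEDs to dim for a headlight detected at column x,
    assuming the LED array spans the camera's horizontal field of view."""
    led = int(headlight_x * NUM_LEDS / IMAGE_WIDTH)
    led = min(max(led, 0), NUM_LEDS - 1)
    return list(range(max(0, led - margin), min(NUM_LEDS, led + margin + 1)))

print(leds_to_dim(640))  # headlight near the image center
```

A deployed headlamp would calibrate this pixel-to-LED mapping against the actual beam pattern rather than assume a uniform linear split, but the tracking-to-dimming flow is the same.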
The Brigham Young Urban Challenge (BYUC) team entered its vehicle, Ynot, in the 2007 DARPA Urban Challenge. The DARPA Grand Challenge (www.darpa.mil/grandchallenge) is an unmanned vehicle competition sponsored by the Defense Advanced Research Projects Agency (DARPA), the agency of the Department of Defense responsible for developing cutting-edge technologies for the military.
It looks like a cross between a lawn mower and a snowmobile, but the Brigham Young University Y-Clops is all high-tech. Designed and built by a team of 12 electrical and computer engineering students, the mobile robot uses a color camera for an eye, an old wheelchair for a body and a custom-built circuit board running artificial intelligence algorithms for a brain as it navigates a course filled with barrels, buckets and cones all by itself. BYU News.