<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>autonomous vehicles | UCSC OSPO</title><link>https://deploy-preview-1007--ucsc-ospo.netlify.app/tag/autonomous-vehicles/</link><atom:link href="https://deploy-preview-1007--ucsc-ospo.netlify.app/tag/autonomous-vehicles/index.xml" rel="self" type="application/rss+xml"/><description>autonomous vehicles</description><generator>Wowchemy (https://wowchemy.com)</generator><language>en-us</language><lastBuildDate>Mon, 07 Nov 2022 10:15:56 -0700</lastBuildDate><image><url>https://deploy-preview-1007--ucsc-ospo.netlify.app/media/logo_hub6795c39d7c5d58c9535d13299c9651f_74810_300x300_fit_lanczos_3.png</url><title>autonomous vehicles</title><link>https://deploy-preview-1007--ucsc-ospo.netlify.app/tag/autonomous-vehicles/</link></image><item><title>Open Source Autonomous Vehicle Controller</title><link>https://deploy-preview-1007--ucsc-ospo.netlify.app/project/osre22/ucsc/osavc/</link><pubDate>Mon, 07 Nov 2022 10:15:56 -0700</pubDate><guid>https://deploy-preview-1007--ucsc-ospo.netlify.app/project/osre22/ucsc/osavc/</guid><description>&lt;p>The OSAVC is a vehicle-agnostic open source hardware and software project. It is designed to provide a real-time hardware controller adaptable to any vehicle type: aerial, terrestrial, marine, or extraterrestrial. It allows control researchers to develop state estimation algorithms, sensor calibration algorithms, and vehicle control models in a modular fashion, such that once the hardware set has been developed, switching algorithms requires only modifying one C function and recompiling.&lt;/p>
&lt;p>Lead mentor: &lt;a href="mailto:aamuhunt@ucsc.edu">Aaron Hunter&lt;/a>&lt;/p>
&lt;p>Projects for the OSAVC:&lt;/p>
&lt;h3 id="vehiclecraft-sensor-driver-development">Vehicle/Craft sensor driver development&lt;/h3>
&lt;ul>
&lt;li>&lt;strong>Topics&lt;/strong>: Driver code to integrate sensors with a microcontroller&lt;/li>
&lt;li>&lt;strong>Skills&lt;/strong>: C, I2C, SPI, UART interfaces&lt;/li>
&lt;li>&lt;strong>Size&lt;/strong>: 175 hours&lt;/li>
&lt;li>&lt;strong>Difficulty&lt;/strong>: Medium&lt;/li>
&lt;li>&lt;strong>Mentor&lt;/strong>: Aaron Hunter&lt;/li>
&lt;/ul>
&lt;p>Help develop a sensor library for use in autonomous vehicles. Possible sensors include range finders, ping sensors, IMUs, GPS receivers, RC receivers, barometers, airspeed sensors, etc. Code will be written in C using a state-machine methodology and non-blocking algorithms. Test the drivers on a Microchip microcontroller.&lt;/p>
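&lt;p>The state-machine, non-blocking pattern described above can be sketched as follows. This is a minimal illustration in Python (the real drivers would be written in C for the target Microchip microcontroller), and the framing protocol, sync byte, and payload size are hypothetical:&lt;/p>

```python
# Minimal sketch of a non-blocking, state-machine sensor driver.
# The framing protocol (0xA5 sync byte, two-byte payload) is hypothetical.

IDLE, HEADER_SEEN, PAYLOAD = range(3)

class SensorDriver:
    def __init__(self):
        self.state = IDLE
        self.buf = []
        self.reading = None  # last complete measurement, if any

    def poll(self, byte):
        """Consume at most one byte per call; return immediately."""
        if self.state == IDLE:
            if byte == 0xA5:              # hypothetical sync byte
                self.state = HEADER_SEEN
        elif self.state == HEADER_SEEN:
            self.buf = [byte]             # first payload byte
            self.state = PAYLOAD
        elif self.state == PAYLOAD:
            self.buf.append(byte)
            # assemble a 16-bit big-endian value from the two payload bytes
            self.reading = self.buf[0] * 256 + self.buf[1]
            self.state = IDLE
        return self.reading

driver = SensorDriver()
for b in [0x00, 0xA5, 0x01, 0x02]:        # noise byte, sync, two payload bytes
    driver.poll(b)
```

&lt;p>Each call to poll() consumes at most one byte and returns at once, so the main control loop is never blocked waiting on a sensor.&lt;/p>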
&lt;h3 id="path-finding-algorithm-using-opencv-and-machine-learning">Path finding algorithm using OpenCV and machine learning&lt;/h3>
&lt;ul>
&lt;li>&lt;strong>Topics&lt;/strong>: Computer vision, blob detection&lt;/li>
&lt;li>&lt;strong>Skills&lt;/strong>: C/Python, OpenCV&lt;/li>
&lt;li>&lt;strong>Size&lt;/strong>: 175 or 350 hours&lt;/li>
&lt;li>&lt;strong>Difficulty&lt;/strong>: Medium&lt;/li>
&lt;li>&lt;strong>Mentor&lt;/strong>: Aaron Hunter&lt;/li>
&lt;/ul>
&lt;p>Use OpenCV to identify a track for an autonomous vehicle to follow. Build on previous work by developing a new model using EfficientDet and an existing training set of images. Port the model to TFlite and implement on the Coral USB Accelerator. Evaluate its performance against our previous efforts.&lt;/p>
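&lt;p>As a much simpler stand-in for the EfficientDet pipeline named above, the sketch below illustrates the basic idea of track detection: find where the bright track pixels sit in each image row, giving a line a vehicle controller could steer toward. It is Python with NumPy, and the threshold and synthetic image are illustrative assumptions:&lt;/p>

```python
import numpy as np

def track_centroids(gray, thresh=200):
    """Per-row centroid column of pixels brighter than thresh,
    or -1 for rows containing no track pixels (illustrative threshold)."""
    mask = gray > thresh
    cols = np.arange(gray.shape[1])
    out = []
    for row in mask:
        # centroid of bright pixels in this row, truncated to an integer column
        out.append(int(cols[row].mean()) if row.any() else -1)
    return out

# Synthetic 4x8 image with a bright vertical "track" in columns 3-4.
img = np.zeros((4, 8), dtype=np.uint8)
img[:, 3:5] = 255
print(track_centroids(img))  # [3, 3, 3, 3] after integer truncation of 3.5
```

&lt;p>A learned detector replaces the fixed threshold with a model, but the downstream use, reducing an image to a steerable track location, is the same.&lt;/p>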
&lt;h3 id="state-estimationsensor-fusion-algorithm-development">State estimation/sensor fusion algorithm development&lt;/h3>
&lt;ul>
&lt;li>&lt;strong>Topics&lt;/strong>: Kalman filtering, Mahony filtering&lt;/li>
&lt;li>&lt;strong>Skills&lt;/strong>: C/Python, Matlab/Simulink, numerical optimization algorithms&lt;/li>
&lt;li>&lt;strong>Size&lt;/strong>: 350 hours&lt;/li>
&lt;li>&lt;strong>Difficulty&lt;/strong>: Challenging&lt;/li>
&lt;li>&lt;strong>Mentor&lt;/strong>: Aaron Hunter&lt;/li>
&lt;/ul>
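&lt;p>As a toy version of the state estimation this project targets, the sketch below implements a scalar Kalman filter in Python, estimating a constant quantity from noisy readings. The process and measurement noise values are illustrative assumptions:&lt;/p>

```python
def kalman_1d(measurements, q=1e-4, r=0.25, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a constant state.
    q: process noise, r: measurement noise (illustrative values)."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                 # predict: state is constant, uncertainty grows
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # update with the measurement residual
        p = (1.0 - k) * p         # uncertainty shrinks after the update
        estimates.append(x)
    return estimates

# Noisy readings around a true value of 5.0
est = kalman_1d([4.8, 5.2, 5.1, 4.9, 5.0])
```

&lt;p>The full project replaces this scalar recursion with a vector state (attitude, velocity, position) and a sensor model, but the predict/update structure is the same.&lt;/p>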
&lt;p>Implement an optimal state estimation algorithm from a model. The model can be derived from a Kalman filter or another state estimation filter (e.g., the Mahony filter). The model takes sensor readings as input and provides an estimate of the state of the vehicle. Finally, convert the model to standard C using Simulink code generation, or implement it in Python (for use on a single-board computer, e.g., a Raspberry Pi).&lt;/p></description></item><item><title>Open Source Autonomous Vehicle Controller</title><link>https://deploy-preview-1007--ucsc-ospo.netlify.app/project/osre23/ucsc/osavc/</link><pubDate>Mon, 07 Nov 2022 10:15:56 -0700</pubDate><guid>https://deploy-preview-1007--ucsc-ospo.netlify.app/project/osre23/ucsc/osavc/</guid><description>&lt;p>The OSAVC is a vehicle-agnostic open source hardware and software project. It is designed to provide a real-time hardware controller adaptable to any vehicle type: aerial, terrestrial, marine, or extraterrestrial. It allows control researchers to develop state estimation algorithms, sensor calibration algorithms, and vehicle control models in a modular fashion, such that once the hardware set has been developed, switching algorithms requires only modifying one C function and recompiling.&lt;/p>
&lt;p>Lead mentor: &lt;a href="mailto:aamuhunt@ucsc.edu">Aaron Hunter&lt;/a>&lt;/p>
&lt;p>Projects for the OSAVC:&lt;/p>
&lt;h3 id="vehiclecraft-sensor-driver-development">Vehicle/Craft sensor driver development&lt;/h3>
&lt;ul>
&lt;li>&lt;strong>Topics&lt;/strong>: Driver code to integrate sensors with a microcontroller&lt;/li>
&lt;li>&lt;strong>Skills&lt;/strong>: C, I2C, SPI, UART interfaces&lt;/li>
&lt;li>&lt;strong>Size&lt;/strong>: 175 hours&lt;/li>
&lt;li>&lt;strong>Difficulty&lt;/strong>: Medium&lt;/li>
&lt;li>&lt;strong>Mentor&lt;/strong>: &lt;a href="mailto:aamuhunt@ucsc.edu">Aaron Hunter&lt;/a>, &lt;a href="mailto:caiespin@ucsc.edu">Carlos Espinosa&lt;/a>, Pavlo Vlastos&lt;/li>
&lt;/ul>
&lt;p>Help develop sensor libraries for use in autonomous vehicles. We are particularly interested in sensors for UAVs: airspeed sensors (pitot tubes) and barometers, as well as proximity detectors (ultrasonic) and range sensors. Code will be written in C using a state-machine methodology and non-blocking algorithms. Test the drivers on a Microchip microcontroller.&lt;/p>
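&lt;p>For the airspeed (pitot tube) sensors mentioned above, the core conversion from differential pressure to airspeed follows Bernoulli's equation, v = sqrt(2q/rho). A minimal Python sketch, where the air density default and the clamping of negative readings are illustrative assumptions:&lt;/p>

```python
import math

def airspeed_from_pitot(dp_pa, rho=1.225):
    """Indicated airspeed (m/s) from pitot differential pressure (Pa),
    via Bernoulli: v = sqrt(2 * dp / rho).
    rho defaults to sea-level air density (illustrative assumption)."""
    dp = max(dp_pa, 0.0)  # clamp: sensor noise can read slightly negative at rest
    return math.sqrt(2.0 * dp / rho)

# 612.5 Pa of dynamic pressure is about 31.6 m/s at sea level
print(airspeed_from_pitot(612.5))
```

&lt;p>A real driver would also calibrate out the sensor's zero-pressure offset before applying this conversion.&lt;/p>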
&lt;h3 id="technical-documentation">Technical Documentation&lt;/h3>
&lt;ul>
&lt;li>&lt;strong>Topics&lt;/strong>: Documentation&lt;/li>
&lt;li>&lt;strong>Skills&lt;/strong>: Technical writing, Markdown, website creation&lt;/li>
&lt;li>&lt;strong>Size&lt;/strong>: 175 hours&lt;/li>
&lt;li>&lt;strong>Difficulty&lt;/strong>: Medium&lt;/li>
&lt;li>&lt;strong>Mentor&lt;/strong>: &lt;a href="mailto:aamuhunt@ucsc.edu">Aaron Hunter&lt;/a>, &lt;a href="mailto:caiespin@ucsc.edu">Carlos Espinosa&lt;/a>, Pavlo Vlastos&lt;/li>
&lt;li>&lt;strong>Contributor(s)&lt;/strong>: &lt;a href="https://deploy-preview-1007--ucsc-ospo.netlify.app/author/aniruddha-thakre/">Aniruddha Thakre&lt;/a>&lt;/li>
&lt;/ul>
&lt;p>Write a tutorial demonstrating how to start with an OSAVC and program it with the robotic equivalent of HelloWorld, moving on to more sophisticated applications. Create a web page interface to the OSAVC repo highlighting this tutorial. In this project you will start from scratch with an OSAVC PCB and bring it to life, documenting the process in a way that helps new users.&lt;/p>
&lt;h3 id="rosgazebo-robot-simulation">ROS/Gazebo Robot Simulation&lt;/h3>
&lt;ul>
&lt;li>&lt;strong>Topics&lt;/strong>: Robot simulation with ROS/Gazebo&lt;/li>
&lt;li>&lt;strong>Skills&lt;/strong>: ROS/Gazebo, Python&lt;/li>
&lt;li>&lt;strong>Size&lt;/strong>: 175 or 350 hours&lt;/li>
&lt;li>&lt;strong>Difficulty&lt;/strong>: Medium to Hard&lt;/li>
&lt;li>&lt;strong>Mentor&lt;/strong>: &lt;a href="mailto:aamuhunt@ucsc.edu">Aaron Hunter&lt;/a>, &lt;a href="mailto:caiespin@ucsc.edu">Carlos Espinosa&lt;/a>, Pavlo Vlastos&lt;/li>
&lt;li>&lt;strong>Contributor(s)&lt;/strong>: &lt;a href="https://deploy-preview-1007--ucsc-ospo.netlify.app/author/damodar-datta-kancharla/">Damodar Datta Kancharla&lt;/a>&lt;/li>
&lt;/ul>
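&lt;p>One small, self-contained piece of the MAVLink-to-ROS bridge this project involves is attitude conversion: MAVLink reports attitude as a quaternion, while many ROS tools expect Euler angles. The sketch below is standalone Python math, not the mavros API:&lt;/p>

```python
import math

def quat_to_euler(w, x, y, z):
    """Convert an attitude quaternion to (roll, pitch, yaw) in radians,
    using the aerospace ZYX convention. Standalone math, not the mavros API."""
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    sinp = 2.0 * (w * y - x * z)
    # guard against numerical drift pushing asin's argument out of range
    pitch = math.copysign(math.pi / 2, sinp) if abs(sinp) >= 1 else math.asin(sinp)
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw

# Identity quaternion -> level attitude
print(quat_to_euler(1.0, 0.0, 0.0, 0.0))  # (0.0, 0.0, 0.0)
```

&lt;p>In the full pipeline, mavros performs conversions like this (plus the ENU/NED frame change) when relaying vehicle state between the autopilot and ROS topics.&lt;/p>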
&lt;p>Generate a simulated world and a quadcopter model in ROS/Gazebo. Provide a link from MAVLink to ROS using the mavros package, and simulate a real vehicle data stream to command the simulated quadcopter in Gazebo. At the same time, return the image stream from Gazebo to allow offline processing of ML models on the images.&lt;/p></description></item></channel></rss>