Using robotics as an autonomous crop-field monitoring system is a cost-effective way to acquire images of different types (e.g., RGB and infrared). Furthermore, onboard autonomous intelligence helps optimize the robot's sensing activity by pre-processing the collected data and deciding navigation paths based on information such as visual markers (e.g., ArUco/fiducial markers). However, a generic, cost-effective robotic solution that autonomously navigates a rugged outdoor environment is still an open problem. Given these challenges, the OpenIoT research group proposes to develop a ROS-based rover, built on existing open-source projects, to navigate and acquire images using a camera mounted on a 4-DoF robotic arm.

The goal of this project is to develop ROS modules that allow the JPL Open Source Rover (a 6-wheeled, ROS-compatible mobile robot) to autonomously execute a pre-programmed trajectory (defined by waypoints) and collect data using a camera (or other sensors) mounted on the 4-DoF arm, optimizing the poses of both the rover and the arm.
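As a minimal, hypothetical sketch of the waypoint-following logic described above: given the rover's current pose and an ordered list of (x, y) waypoints, compute the heading error and distance to the next goal. In the actual ROS system these values would come from localization (e.g., odometry or SLAM) and be turned into velocity commands or `move_base` goals; the function and parameter names here are illustrative, not part of the project's codebase.

```python
import math

WAYPOINT_REACHED_RADIUS = 0.25  # metres; an assumed arrival tolerance


def next_command(pose, waypoints):
    """pose = (x, y, yaw); waypoints = ordered list of (x, y).

    Returns (heading_error, distance, remaining_waypoints),
    or None when the trajectory is complete.
    """
    x, y, yaw = pose
    # Drop waypoints the rover has already reached.
    while waypoints and math.hypot(waypoints[0][0] - x,
                                   waypoints[0][1] - y) < WAYPOINT_REACHED_RADIUS:
        waypoints = waypoints[1:]
    if not waypoints:
        return None  # trajectory complete
    gx, gy = waypoints[0]
    distance = math.hypot(gx - x, gy - y)
    # Heading error, wrapped to [-pi, pi].
    error = math.atan2(gy - y, gx - x) - yaw
    error = math.atan2(math.sin(error), math.cos(error))
    return error, distance, waypoints
```

A steering controller could then, for example, rotate in place while the heading error is large and drive forward otherwise.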


Persee Depth Mapping

Scene mapping test using the Persee depth camera
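For context on what depth mapping involves, the sketch below back-projects a depth image into a 3-D point cloud using a pinhole camera model. This is a generic illustration, not the Persee pipeline itself; the intrinsics (`fx`, `fy`, `cx`, `cy`) are assumed to come from the camera's calibration.

```python
import numpy as np


def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (metres) into an N x 3 point cloud.

    depth: (H, W) array; fx, fy, cx, cy: pinhole intrinsics.
    Pixels with zero (invalid) depth are discarded.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]
```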

Suspension Test

Indoor suspension testing

Lidar SLAM

SLAM testing using Hector SLAM with the RPLidar
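For reference, a setup like this is typically wired together with a ROS launch file that starts the lidar driver and the Hector SLAM mapping node. The fragment below is a hedged, minimal sketch based on the standard `rplidar_ros` and `hector_mapping` packages; parameter values (serial port, frame names, resolution) are assumptions that would need adjusting for the actual rover.

```xml
<launch>
  <!-- Lidar driver (assumed serial port and frame name) -->
  <node pkg="rplidar_ros" type="rplidarNode" name="rplidar">
    <param name="serial_port" value="/dev/ttyUSB0"/>
    <param name="frame_id"    value="laser"/>
  </node>

  <!-- Hector SLAM: builds a map from laser scans without wheel odometry -->
  <node pkg="hector_mapping" type="hector_mapping" name="hector_mapping">
    <param name="base_frame"     value="base_link"/>
    <param name="odom_frame"     value="base_link"/>
    <param name="scan_topic"     value="scan"/>
    <param name="map_resolution" value="0.05"/>
  </node>
</launch>
```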

Outdoor Test 1

Suspension testing in an outdoor environment (not full speed)

Outdoor Test 2

Full motion testing on a sloped road and rocky terrain (not full speed)

Outdoor Test 3

Outdoor motion testing on grass, mud, and sloped terrain (not full speed)

OpenIoT Rover: Environment Characterization Outdoor Test 1

Outdoor test at the FBK simulated field.

OpenIoT Rover: Environment Characterization Outdoor Test 2

Outdoor test at the Tres apple field location.

OpenIoT Rover: Dashboard Outdoor Test

Dashboard outdoor test at the Tres apple field.