August 2024
Tools Used: Python, Robosuite, Reinforcement Learning
My master's research project was to set up and integrate a learning task into the lab's cognitive architecture. Overall, this consisted of creating a Tower of Hanoi problem in Robosuite, training operators for each action in the plan, disrupting the plan with a novel scenario, and finally learning a new operator for the failed action using a custom reinforcement learning agent. All of these components operate within the lab's cognitive architecture and demonstrate that a custom reinforcement learning agent can learn within the architecture while passing symbolic information back and forth with it. The environment was created in Robosuite and uses a Kinova arm for manipulation tasks.
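The Tower of Hanoi environment itself is custom, but the sketch below shows how a Robosuite environment with a Kinova arm is typically instantiated; the built-in "Lift" task stands in for the custom environment, and the random-action loop stands in for the trained operators and RL agent.

```python
import numpy as np
import robosuite as suite

# Stand-in for the custom Tower of Hanoi environment: the built-in "Lift"
# task with a Kinova Gen3 arm ("Kinova3" in Robosuite's robot registry).
env = suite.make(
    env_name="Lift",
    robots="Kinova3",
    has_renderer=True,
    has_offscreen_renderer=False,
    use_camera_obs=False,
)

obs = env.reset()
low, high = env.action_spec  # per-dimension action bounds

# Random actions just to exercise the simulation loop; in the project,
# actions came from the trained operators / the RL agent.
for _ in range(200):
    action = np.random.uniform(low, high)
    obs, reward, done, info = env.step(action)
    env.render()
```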
August 2024
Tools Used: Python, Machine Learning, scikit-learn
In this project, I analyzed the sentiment—positive or negative—of reviews from Yelp, IMDb, and Amazon using various machine learning models. The models were trained on labeled review data, with all input data preprocessed to enhance accuracy. I used feature vectors that included both unigrams (single words) and bigrams (two-word combinations). The models employed were logistic regression, neural networks, and decision tree classifiers. For each model, I tuned hyperparameters and performed cross-validation on the training data to identify the best-performing models on new test data. This project involved model training, data preprocessing, model validation, and hyperparameter tuning.
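A minimal sketch of this kind of pipeline in scikit-learn, using a toy dataset in place of the Yelp, IMDb, and Amazon reviews and logistic regression as the representative model (the data and parameter grid here are placeholders, not the project's actual values):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline

# Toy stand-in for the labeled review data (1 = positive, 0 = negative).
positive = ["great food and friendly staff", "loved every minute of this film",
            "works exactly as advertised", "excellent value, would buy again",
            "the acting was superb", "fast shipping and solid build quality"]
negative = ["terrible movie, total waste of time", "the battery died within a week",
            "rude service and cold food", "boring plot and flat characters",
            "broke on the first use", "would not recommend to anyone"]
reviews = positive + negative
labels = [1] * len(positive) + [0] * len(negative)

X_train, X_test, y_train, y_test = train_test_split(
    reviews, labels, test_size=0.25, stratify=labels, random_state=0)

# Unigram + bigram features feeding a logistic regression classifier.
pipeline = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2), lowercase=True)),
    ("clf", LogisticRegression(max_iter=1000)),
])

# Hyperparameter tuning with cross-validation on the training split only.
grid = GridSearchCV(pipeline, {"clf__C": [0.1, 1.0, 10.0]}, cv=3)
grid.fit(X_train, y_train)
print("best params:", grid.best_params_)
print("held-out accuracy:", grid.score(X_test, y_test))
```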
May 2024
Tools Used: Python, OpenCV, ROS
This project utilized an Interbotix Locobot to track an orange Sphero as it navigated an enclosed map. The Locobot was equipped with an Intel RealSense Depth Camera D435 to measure the distance of the Sphero from the robot upon detection. The robot autonomously tracked the Sphero by initially moving along predefined points on the map until the Sphero was detected. Once detected, the Locobot maintained a distance of 0.6 meters from the Sphero and attempted to relocate it if it went out of sight. To relocate the Sphero, the Locobot first predicted its location based on its trajectory on the map, and if unable to locate it, moved to its last observed position. To detect the Sphero in the camera frame, I used OpenCV to extract pixels within a certain HSV range and found the centroid of these pixels. I then extracted the depth of the centroid from the Locobot's depth camera and transformed the Sphero's coordinates into a location on the map using ROS' tf library. Using this method, I was able to track the Sphero as it moved around the map and successfully find it if it left the camera frame.
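A minimal sketch of the color-detection step, with the HSV thresholds shown as rough placeholders rather than the tuned values used on the robot:

```python
import cv2

def find_sphero_centroid(bgr_frame):
    """Return the (u, v) pixel centroid of orange pixels, or None if none are found."""
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
    # Rough HSV range for an orange Sphero; the real thresholds were tuned by hand.
    mask = cv2.inRange(hsv, (5, 120, 120), (20, 255, 255))
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return None  # no orange pixels in this frame
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
```

The returned pixel coordinates were paired with the aligned depth image from the RealSense to get a 3-D point, which was then transformed into the map frame with tf as described above.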
December 2023
Tools Used: Python, PyBullet, Reinforcement Learning (HER and PPO), Matplotlib, OpenAI Gym
Throughout this project, I conducted tests using various reinforcement learning algorithms in multiple Gym-based environments. The aim was to investigate how the observation space, action space, and reward function influence training performance on a task. This exploration culminated in training a kinematic model of a Kinova arm in a PyBullet simulation to execute a reach task. Key considerations included effectively shaping the reward function, choosing between sparse and dense reward functions, and selecting suitable algorithms. Following the Kinova arm training, I assessed training effectiveness across models by plotting cumulative rewards with Matplotlib. You can find the detailed write-up for this project below.
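As an illustration of the sparse-versus-dense trade-off, here is a minimal sketch of two reward functions for a reach task, assuming ee_pos and goal_pos are 3-D NumPy positions read from the simulation (names and the threshold are placeholders):

```python
import numpy as np

def sparse_reward(ee_pos, goal_pos, threshold=0.05):
    """0 on success, -1 otherwise: easy to specify, but gives the agent little signal."""
    return 0.0 if np.linalg.norm(ee_pos - goal_pos) < threshold else -1.0

def dense_reward(ee_pos, goal_pos):
    """Negative distance to the goal: provides a gradient toward the target at every step."""
    return -float(np.linalg.norm(ee_pos - goal_pos))
```

HER is designed to cope with the sparse variant by relabeling goals in replay, while a dense shaped reward gives an on-policy algorithm like PPO useful feedback from the very first episodes.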
December 2023
Tools Used: Python, PyCharm, ESP32, ESP32 Camera, PS4 Controller, C++, 3D Printing
In this project, my responsibilities included configuring electronic components and designing the software architecture. The robot was operated using a PS4 controller, communicating via Bluetooth with the ESP32 to control both speed and direction. I established a camera stream on an ESP32 camera mounted at the front of the robot, enabling remote control of the entire system. Additionally, I incorporated a WT901 gyroscopic sensor to measure the tilt angle of the robot, particularly when ascending stairs. If the robot detected that it was climbing stairs, it autonomously took over from the user until it returned to a flat surface, at which point control was handed back.
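The firmware was written in C++, but the mode-switching idea is simple to illustrate; below is a small Python sketch of it, where the pitch thresholds and names are placeholders rather than the values used on the robot:

```python
STAIR_PITCH_DEG = 15.0  # placeholder threshold for "started climbing"
FLAT_PITCH_DEG = 3.0    # placeholder threshold for "back on flat ground"

class DriveModeSelector:
    """Switch between user and autonomous commands based on the WT901 pitch reading."""

    def __init__(self):
        self.climbing = False

    def select(self, pitch_deg, user_cmd, autonomous_cmd):
        if not self.climbing and abs(pitch_deg) > STAIR_PITCH_DEG:
            self.climbing = True    # robot has started up the stairs: take over
        elif self.climbing and abs(pitch_deg) < FLAT_PITCH_DEG:
            self.climbing = False   # flat again: hand control back to the user
        return autonomous_cmd if self.climbing else user_cmd
```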
November 2023
Tools Used: C++, ESP32, ESP32 Camera, Computer Vision, Machine Learning, Motor Control (Stepper & Servo)
In this project, my responsibilities included configuring electronic components and training a computer vision model to detect a player's hand gesture (rock, paper, or scissors) using Edge Impulse's embedded machine learning tools. This process involved labeling image data and training an effective model for accurate gesture recognition. While the player made their gesture, the robot simultaneously mimicked the gesture using its onboard 3D-printed arm and pinwheel. Once the player's gesture was recognized, the robot determined the winning state and updated its expression accordingly. This operation required precise timing of the onboard stepper motors and servo motors to provide a seamless Rock, Paper, Scissors playing experience. Additionally, the robot could navigate to a player who raised their hand to initiate the game.
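Once the classifier returns a gesture label, scoring the round is plain game logic. A small sketch of that step, written in Python for readability (the robot itself ran C++, and the label strings here are assumptions):

```python
BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

def round_outcome(robot_move, player_gesture):
    """Return 'win', 'lose', or 'draw' from the robot's point of view."""
    if robot_move == player_gesture:
        return "draw"
    return "win" if BEATS[robot_move] == player_gesture else "lose"
```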
November 2023
Tools Used: C++, ESP32, PD Control, Motor Control (Stepper & Servo), Gyroscope & Accelerometer, 3D Printing
In this project, my responsibilities included configuring electronic components and implementing the PD controller necessary to maintain the robot's balance. The robot was equipped with an onboard accelerometer and gyroscope, providing the tilt angle and angular velocity required for the PD controller. The controller continuously adjusted the motor output of the wheels using the PD control expression:
output = Kp * (balance_angle - tilt_angle) + Kd * angular_velocity
Using this straightforward method, the robot could regulate its tilt angle about its balance angle by tuning the gains Kp and Kd.
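The controller ran on the ESP32 in C++; the sketch below restates the same update in Python, with the gain values shown purely as placeholders rather than the constants tuned on the robot:

```python
KP = 25.0            # placeholder proportional gain (real values were tuned on the robot)
KD = 0.6             # placeholder derivative gain
BALANCE_ANGLE = 0.0  # tilt angle (degrees) at which the robot balances

def pd_output(tilt_deg, angular_velocity_dps):
    """PD control law from above: correct the tilt error and account for the rotation rate."""
    return KP * (BALANCE_ANGLE - tilt_deg) + KD * angular_velocity_dps
```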
October 2023
Tools Used: C++, ESP32, Kinematics, Computer Vision (OpenCV), Machine Learning, Motor Control (Stepper & Servo), CAD (SolidWorks), 3D Printing
In this project, my responsibilities encompassed designing the linkage system, 3D printing and laser cutting the corresponding parts, configuring the electronics, and developing the software architecture for the writing robot. The robot, a 5-link, 3-degree-of-freedom arm, was capable of writing letters and drawing user emotions. I initiated the design process in SolidWorks and then 3D printed the necessary components for the arms and the stepper motor mount. To enable the arm to write letters on paper, I devised and implemented an inverse kinematic model tailored to our robot. This kinematic model allowed us to determine the arm angles required to position the end effector at a specific point in the X-Y grid. Additionally, we employed OpenCV on a laptop to detect whether a user was smiling or frowning and then drew the corresponding emotion under the user's initials.
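As a rough illustration of the kind of calculation involved, here is the standard planar two-link inverse kinematics that such a model builds on; the link lengths and elbow convention are placeholders, and the real 5-link model adds the extra linkage constraints on top of this:

```python
import math

def two_link_ik(x, y, l1, l2, elbow_up=True):
    """Joint angles (radians) placing a planar two-link arm's tip at (x, y).

    Returns None when the target is outside the reachable workspace.
    """
    r2 = x * x + y * y
    cos_q2 = (r2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if abs(cos_q2) > 1.0:
        return None  # target unreachable with these link lengths
    q2 = math.acos(cos_q2)
    if elbow_up:
        q2 = -q2
    q1 = math.atan2(y, x) - math.atan2(l2 * math.sin(q2), l1 + l2 * math.cos(q2))
    return q1, q2
```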