Publication Date

Fall 2023

Degree Type

Thesis - Campus Access Only

Degree Name

Master of Science (MS)

Department

Computer Engineering

Advisor

Stas Tiomkin; Haonan Wang; Wencen Wu

Abstract

Humans are considered highly skilled object manipulators. Their capability to manipulate relies on their adaptability: they can change their grasp while interacting with an object, they use the sensation of touch to judge how much pressure is required to grasp the object, and they use vision to decide where to place their fingers to grasp effectively. In some situations, humans can even perform manipulation blindly. Even though manipulation is natural for humans, it remains a challenging task in the field of robotics. The challenges associated with manipulation include the design of the robot's end effector, grasp planning, capturing useful data from sensors, and the formulation of an effective control strategy. End-effector design has progressed toward anthropomorphic designs that have the dexterity of human hands. With the plethora of recent developments in image processing and computer vision, many researchers have utilized the visual perception of robots to perform object manipulation. The use of tactile perception to formulate a control strategy, however, is still relatively unexplored. A few researchers have suggested that using tactile information together with visual perception can improve the performance of object manipulation. In this thesis, we explore various ways to merge data from tactile perception with proprioception data to develop object manipulation control strategies. We sought to determine whether it is possible to complement partial proprioception information with tactile data for manipulation. We provide substantial results indicating that tactile data can be used to classify objects with distinct geometric shapes and to estimate the relative object position during interaction between an object and the robot's end effector.

Available for download on Monday, February 26, 2029
