Virtual Cooking 2
The Virtual Cooking project has been running since 2019 and aims to teach robots to grasp like humans so that they can assist in cooking.
On the one hand, this Master’s project focused on adding new objects to a virtual kitchen environment. To this end, the process was automated using photogrammetry (taking pictures of an object and processing them in Meshroom to model a virtual 3D representation), and further by using a robot arm to take those pictures.
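The reconstruction step of such a pipeline can be scripted around Meshroom’s batch command-line interface. The sketch below is a minimal illustration, assuming the `meshroom_batch` executable is installed and on the PATH; flag names may differ between Meshroom versions, so check your local installation.

```python
import subprocess
from pathlib import Path


def build_meshroom_command(image_dir: str, output_dir: str) -> list[str]:
    """Assemble a command line for Meshroom's batch CLI.

    Assumption: the installed Meshroom version provides a
    `meshroom_batch` executable with `--input`/`--output` flags.
    """
    return [
        "meshroom_batch",
        "--input", str(Path(image_dir)),
        "--output", str(Path(output_dir)),
    ]


def reconstruct(image_dir: str, output_dir: str) -> None:
    # Run the full photogrammetry pipeline on one batch of photos
    # taken by the robot arm; raises if Meshroom reports an error.
    subprocess.run(build_meshroom_command(image_dir, output_dir), check=True)
```

With images captured automatically by the robot arm, a call such as `reconstruct("photos/pan", "models/pan")` would produce the textured 3D mesh for one kitchen object without manual interaction.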
On the other hand, the project aimed at analysing grasping within the kitchen context: detecting which of the 33 most common grasp types (according to the “GRASP Taxonomy”) was performed, and visualizing the contact between hand and object by generating heat maps and force maps that show where the object was gripped and how much force was applied. In our user study we recorded grasping data using a CyberGlove and an OptiTrack system and compared real-life grasping behaviour to virtual grasping with and without physical objects as haptic feedback.
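The core of such a force map is simple: accumulate the measured contact forces onto the nearest vertices of the object mesh, then normalize for colouring. The following is a minimal sketch under assumed data layouts (vertex indices per contact, force magnitudes per contact, e.g. from the glove’s pressure sensors), not the project’s actual implementation.

```python
import numpy as np


def force_map(num_vertices: int, contact_vertex_ids, contact_forces) -> np.ndarray:
    """Accumulate per-contact forces into a per-vertex force map.

    contact_vertex_ids: for each recorded hand-object contact, the index
    of the closest mesh vertex (assumed precomputed, e.g. via
    nearest-neighbour lookup against the tracked hand pose).
    contact_forces: force magnitude measured at that contact.
    """
    fmap = np.zeros(num_vertices)
    # np.add.at handles repeated vertex indices correctly,
    # summing all contacts that map to the same vertex.
    np.add.at(fmap, contact_vertex_ids, contact_forces)
    return fmap


def normalize(fmap: np.ndarray) -> np.ndarray:
    """Scale to [0, 1] so the map can be rendered as heat-map colours."""
    peak = fmap.max()
    return fmap / peak if peak > 0 else fmap
```

Rendering the normalized values as vertex colours on the object mesh then yields the kind of heat map described above, with hot spots where the fingers pressed hardest.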
To the project website