About the course
Scenario
The scenario enables interaction between an operator and the Robotic arm NED2 through interfaces based on artificial intelligence. The operator uses gestures to designate the part to be picked up by the robot, each gesture corresponding to a part from a collection arranged in the alpha zone. After locating the operator's hand above the loading zone with the vision set camera, the Robotic arm NED2 places the gripped part in the operator's hand. This is a pick-and-place sequence in which the pick and place points are provided in real time by the operator's gesture commands. Gesture and hand-position recognition is performed with deep learning tools.
The objective of the scenario is to perform this operation by following the steps of the algorithm illustrated below:
Lab Contents
Chapter 1: Pick and Place
- Define points of interest
- Create movements for pick and place
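The points of interest and movements of Chapter 1 can be sketched with plain Python data structures. The pose format, names, and coordinate values below are illustrative assumptions, not the lab's actual teaching points:

```python
# Points of interest as a dictionary of named poses.
# Poses are (x, y, z, roll, pitch, yaw) tuples in metres/radians;
# names and values here are illustrative, not the lab's data.
POINTS_OF_INTEREST = {
    "observation": (0.20, 0.00, 0.30, 0.0, 1.57, 0.0),  # camera view over the loading zone
    "pick_part_1": (0.25, 0.10, 0.05, 0.0, 1.57, 0.0),  # first part in the alpha zone
    "pick_part_2": (0.25, 0.00, 0.05, 0.0, 1.57, 0.0),  # second part in the alpha zone
}

def pick_and_place_waypoints(pick_name, place_pose, approach_height=0.10):
    """Return the ordered waypoints of one pick-and-place movement:
    approach above the part, descend to pick, retreat, then approach
    above the place pose (the operator's hand) and descend to place."""
    x, y, z, roll, pitch, yaw = POINTS_OF_INTEREST[pick_name]
    px, py, pz, proll, ppitch, pyaw = place_pose
    return [
        (x, y, z + approach_height, roll, pitch, yaw),        # approach pick
        (x, y, z, roll, pitch, yaw),                          # pick (close gripper here)
        (x, y, z + approach_height, roll, pitch, yaw),        # retreat
        (px, py, pz + approach_height, proll, ppitch, pyaw),  # approach place
        (px, py, pz, proll, ppitch, pyaw),                    # place (open gripper here)
    ]
```

In the scenario, `place_pose` would come in real time from the hand-detection subprogram rather than being hard-coded.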
Chapter 2: Gesture recognition
- Know how to use Teachable Machine to train a model
- Obtain predictions based on gestures
- Create a filter to validate a gesture
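The gesture-validation filter of Chapter 2 can be sketched as a simple debouncer: a gesture is accepted only when the model predicts the same label with sufficient confidence over several consecutive frames. The window size and threshold below are illustrative assumptions:

```python
from collections import deque

class GestureFilter:
    """Validate a gesture only when the same label is predicted with
    high enough confidence over `window` consecutive frames.
    Window size and confidence threshold are illustrative values."""

    def __init__(self, window=5, min_confidence=0.8):
        self.window = window
        self.min_confidence = min_confidence
        self.history = deque(maxlen=window)

    def update(self, label, confidence):
        """Feed one model prediction; return the label once validated, else None."""
        if confidence < self.min_confidence:
            self.history.clear()  # a weak prediction breaks the streak
            return None
        self.history.append(label)
        if len(self.history) == self.window and len(set(self.history)) == 1:
            self.history.clear()  # reset so the gesture fires only once
            return label
        return None
```

This keeps a single noisy frame from triggering a pick, while still reacting within a fraction of a second at typical camera frame rates.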
Chapter 3: Hand detection
- Detect the hand in the camera image
- Calibrate the camera
- Obtain the coordinates of the drop point in the middle of the hand
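Chapter 3's calibration step can be sketched as a linear mapping from image pixels to workspace coordinates, built from two reference points whose pixel and robot coordinates are both known. This assumes the camera axes are aligned with the workspace axes (no rotation); all numbers are illustrative:

```python
def make_pixel_to_workspace(p1_px, p1_ws, p2_px, p2_ws):
    """Build a function mapping (u, v) pixel coordinates to (x, y)
    workspace coordinates from two calibration points, assuming the
    image axes are parallel to the workspace axes (scale + offset only)."""
    (u1, v1), (u2, v2) = p1_px, p2_px
    (x1, y1), (x2, y2) = p1_ws, p2_ws
    sx = (x2 - x1) / (u2 - u1)  # metres per pixel along u
    sy = (y2 - y1) / (v2 - v1)  # metres per pixel along v

    def pixel_to_workspace(u, v):
        return (x1 + sx * (u - u1), y1 + sy * (v - v1))

    return pixel_to_workspace
```

A 2D camera gives no depth, which is one of the limitations studied in the lab: the drop height above the hand must come from elsewhere (for example a fixed height over the loading zone).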
Chapter 4: Integration
- Integrate the subprograms into a complex and functional program
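Chapter 4's integration can be sketched as a main loop that wires the subprograms together. The four callables below are stubs standing in for the subprograms built in the earlier chapters:

```python
def run_cycle(get_gesture, get_hand_position, pick, place):
    """One cycle of the scenario: wait for a validated gesture that
    designates a part, pick it, locate the operator's hand, place.
    The four callables stand in for the chapter subprograms."""
    gesture = get_gesture()        # Chapter 2: validated gesture -> part name
    if gesture is None:
        return False               # no validated gesture this cycle
    pick(gesture)                  # Chapter 1: pick the designated part
    hand_xy = get_hand_position()  # Chapter 3: hand centre in workspace coords
    place(hand_xy)                 # Chapter 1: place into the operator's hand
    return True

# Example wiring with trivial stubs that just record the calls:
log = []
done = run_cycle(
    get_gesture=lambda: "part_1",
    get_hand_position=lambda: (0.30, -0.05),
    pick=lambda name: log.append(("pick", name)),
    place=lambda xy: log.append(("place", xy)),
)
```

Passing the subprograms as functions keeps each chapter's code testable on its own before the robot is connected.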
Prerequisite knowledge
Python: basic syntax, simple data and control structures, loops, and simple function calls
Reference frames and transformations: understanding how Cartesian coordinate systems work and the principle of coordinate transformations
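As a refresher on the transformation prerequisite, the sketch below expresses a 2D point given in a local frame (such as a tool or camera frame) in the base frame, using a rotation plus a translation. The frame values are illustrative:

```python
import math

def to_base_frame(point, frame_origin, frame_angle):
    """Express a 2D point given in a local frame in the base frame.
    The local frame sits at `frame_origin` (base coordinates) and is
    rotated by `frame_angle` radians relative to the base frame."""
    x, y = point
    ox, oy = frame_origin
    c, s = math.cos(frame_angle), math.sin(frame_angle)
    # Rotate into the base orientation, then translate by the frame origin.
    return (ox + c * x - s * y, oy + s * x + c * y)
```

For example, the point (1, 0) in a frame at (0.5, 0.5) rotated 90° lands at (0.5, 1.5) in the base frame.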
Required Equipment
What you'll learn
- Practice the concepts of reference frames and points associated with a tool and a base in robotics.
- Perform the steps of picking and placing in a workspace with a specific tool.
- Deepen programming concepts with the implementation of lists and/or dictionaries in Python.
- Practice using the Keras library.
- Understand the constraints and limitations of a 2D camera.
- Implement a calibration system for a camera in a robotic environment.
- Train and evaluate a machine learning model for gesture recognition.
- Use Python libraries to recognize the center of the hand.
Exploring AI-Driven Collaboration in Robotics
- Length: 8h
- Content Type: Lab
- Programming: Python
- Equipment: Bundle discovery