🤖 AI Learning Companion
Capstone Project: The Autonomous Humanoid
Goal: A final project where a simulated robot receives a voice command, plans a path, navigates obstacles, identifies an object using computer vision, and manipulates it.
Project Steps
- Voice Command: "Go to the kitchen and pick up the apple."
- Planning: The LLM decomposes this into `Navigate(Kitchen) -> Detect(Apple) -> Grasp(Apple)`.
- Execution:
  - Nav2 handles navigation.
  - YOLO/Isaac ROS handles object detection.
  - MoveIt handles manipulation.
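The steps above can be sketched as a plan-then-dispatch loop. This is a minimal, hypothetical Python sketch: `plan` stands in for the LLM planner (a real system would prompt a model), and the three skill handlers are stubs where calls into Nav2, YOLO/Isaac ROS, and MoveIt would go. All function and skill names here are illustrative, not an actual API.

```python
# Hypothetical control loop for the capstone pipeline.
# In the real project each handler would invoke Nav2, YOLO/Isaac ROS,
# or MoveIt via their ROS 2 interfaces; here they are print-style stubs.

def plan(command: str) -> list[tuple[str, str]]:
    """Stand-in for the LLM planner: maps a voice command to a
    sequence of (skill, argument) actions."""
    # A real system would prompt an LLM; this hard-codes the example task.
    if "kitchen" in command.lower() and "apple" in command.lower():
        return [("Navigate", "Kitchen"), ("Detect", "Apple"), ("Grasp", "Apple")]
    raise ValueError(f"no plan available for: {command!r}")

def navigate(goal: str) -> str:
    return f"Nav2 drove to {goal}"        # stub for a Nav2 navigation action

def detect(obj: str) -> str:
    return f"YOLO detected {obj}"         # stub for YOLO/Isaac ROS detection

def grasp(obj: str) -> str:
    return f"MoveIt grasped {obj}"        # stub for a MoveIt pick action

# Dispatch table mapping plan steps to skill handlers.
SKILLS = {"Navigate": navigate, "Detect": detect, "Grasp": grasp}

def execute(command: str) -> list[str]:
    """Plan the command, then run each step in order, collecting a log."""
    return [SKILLS[skill](arg) for skill, arg in plan(command)]

if __name__ == "__main__":
    for step in execute("Go to the kitchen and pick up the apple."):
        print(step)
```

The dispatch-table design mirrors how the real stack is wired: the planner only emits symbolic steps, and each symbol is bound to one subsystem, so swapping the LLM or a skill implementation does not change the loop.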