🤖 AI Learning Companion

Capstone Project: The Autonomous Humanoid

Goal: A final project in which a simulated robot receives a voice command, plans a path, navigates around obstacles, identifies a target object using computer vision, and manipulates it.
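Before wiring up any subsystems, it helps to fix a data model for the plan the LLM will produce. A minimal sketch in plain Python, where the class and field names (Navigate, Detect, Grasp, location, target) are illustrative assumptions rather than part of any ROS or MoveIt API:

```python
from dataclasses import dataclass

# Hypothetical typed primitives the planner can emit; names are assumptions
# chosen for this sketch, not part of any robotics library.
@dataclass
class Navigate:
    location: str

@dataclass
class Detect:
    target: str

@dataclass
class Grasp:
    target: str

# The capstone command, decomposed into an ordered plan:
plan = [Navigate("Kitchen"), Detect("Apple"), Grasp("Apple")]
```

Typed steps like these give the executor something unambiguous to dispatch on, and make a malformed LLM output fail loudly at parse time instead of mid-run.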

Project Steps

  1. Voice Command: "Go to the kitchen and pick up the apple."
  2. Planning: The LLM decomposes the command into Navigate(Kitchen) -> Detect(Apple) -> Grasp(Apple).
  3. Execution:
    • Nav2 handles path planning and navigation.
    • YOLO or Isaac ROS handles object detection.
    • MoveIt handles motion planning for manipulation.
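The three steps above can be sketched end-to-end in plain Python. Here decompose is a toy stand-in for the LLM planner, and the handler stubs stand in for Nav2, YOLO/Isaac ROS, and MoveIt; in the real project each handler would call the corresponding ROS 2 interface, which is omitted here. All function names and return strings are assumptions for illustration:

```python
def decompose(command: str):
    """Toy stand-in for the LLM planner: map a command to (action, argument) pairs.

    A real system would prompt an LLM with a skill schema; here the capstone
    command is matched by keyword for illustration.
    """
    text = command.lower()
    if "kitchen" in text and "apple" in text:
        return [("Navigate", "Kitchen"), ("Detect", "Apple"), ("Grasp", "Apple")]
    raise ValueError(f"No plan for command: {command!r}")

# Stubs standing in for the real subsystems; the real project would invoke
# Nav2, a YOLO/Isaac ROS detector, and MoveIt through ROS 2 interfaces.
HANDLERS = {
    "Navigate": lambda arg: f"Nav2: navigated to {arg}",
    "Detect":   lambda arg: f"YOLO/Isaac ROS: detected {arg}",
    "Grasp":    lambda arg: f"MoveIt: grasped {arg}",
}

def execute(plan):
    # Run the plan sequentially, dispatching each step to its subsystem stub.
    return [HANDLERS[action](arg) for action, arg in plan]

log = execute(decompose("Go to the kitchen and pick up the apple"))
```

Keeping planning (decompose) separate from execution (execute) mirrors the project architecture: the LLM only emits symbolic steps, while deterministic subsystems carry them out.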