AR-Based Human Interaction Enabled Application for Marine Life
Duration: 175 Hours | Project Size: Medium | Difficulty: Advanced
Mentors
Somya Barolia, Nikhil Ranjan Rajhans
Required Skills
- Python
- YOLO
- MediaPipe
- Pose Detection
- Machine Learning
- C#
- Unity
- AR Foundation
- Vuforia SDK
- Firebase
- Cloud
- Git
- REST API
- CI/CD
- Blender
Expected Outcome
A gesture-driven AR marine learning experience.
Description
The project introduces human interaction through hand gestures and pose-based controls instead of a traditional UI. Users can steer creature movement, trigger behaviors (e.g., octopus camouflage or ink defense), and influence the ecosystem through real-time camera-based interaction.
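As a minimal sketch of the gesture-recognition side, the snippet below classifies a coarse hand gesture (fist vs. open palm) from MediaPipe-style hand landmarks: 21 `(x, y)` points using the MediaPipe Hands indexing, where fingertip landmarks are 8, 12, 16, 20 and the corresponding PIP joints are 6, 10, 14, 18. The upright-hand heuristic and the gesture-to-behavior mapping (fist → ink defense, open palm → camouflage) are illustrative assumptions, not part of the project specification:

```python
# Sketch: coarse gesture classification from MediaPipe-style hand landmarks.
# Assumes image coordinates (y grows downward) and an upright hand, so an
# extended finger has its tip *above* (smaller y than) its PIP joint.

FINGER_TIPS = [8, 12, 16, 20]   # index, middle, ring, pinky fingertips
FINGER_PIPS = [6, 10, 14, 18]   # matching PIP joints (MediaPipe Hands indexing)

def count_extended_fingers(landmarks):
    """Count fingers whose tip lies above its PIP joint."""
    return sum(
        1
        for tip, pip in zip(FINGER_TIPS, FINGER_PIPS)
        if landmarks[tip][1] < landmarks[pip][1]
    )

def classify_gesture(landmarks):
    """Map the extended-finger count to a coarse gesture label."""
    n = count_extended_fingers(landmarks)
    if n == 0:
        return "fist"       # hypothetical mapping: trigger ink defense
    if n >= 4:
        return "open_palm"  # hypothetical mapping: trigger camouflage
    return "unknown"
```

In the full pipeline, `landmarks` would come from MediaPipe's hand-tracking output on each camera frame, and the resulting label would be sent to the Unity AR scene (e.g., over a REST endpoint or a Firebase channel) to drive creature behavior.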