Project scope
Categories
Information technology, Software development, Machine learning, Artificial intelligence, Hardware

Skills
Object tracking, full stack development, video streaming, object detection, scalability, application programming interface (API), web applications, internet of things (IoT), movieland

This project challenges students to build a system that allows real-time object interaction within live video streams. The core deliverable is a web-based interface that detects and tracks physical objects through a webcam/IP stream, enabling user interaction via overlays that can trigger logic inside the FreeFuse platform—or even control IoT devices.
The team will explore real-time video pipelines, object tracking, and full-stack orchestration that routes interaction data to backend triggers or hardware responses.
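The overlay interaction described above boils down to hit-testing a user's click against the bounding boxes of the objects tracked in the current frame. The sketch below is illustrative only: the `Detection` shape and function names are assumptions, not part of the FreeFuse platform or any detector's API.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One tracked object in the current video frame (illustrative shape)."""
    label: str
    x1: int
    y1: int
    x2: int
    y2: int

def hit_test(detections, cx, cy):
    """Return the detection whose bounding box contains the click (cx, cy),
    or None. When boxes overlap, prefer the smallest box, which is usually
    the foreground object the user meant to select."""
    hits = [d for d in detections
            if d.x1 <= cx <= d.x2 and d.y1 <= cy <= d.y2]
    if not hits:
        return None
    return min(hits, key=lambda d: (d.x2 - d.x1) * (d.y2 - d.y1))

# Example: a click at (120, 90) falls inside both boxes; the smaller
# "cup" box wins over the enclosing "person" box.
frame_detections = [
    Detection("person", 0, 0, 300, 300),
    Detection("cup", 100, 80, 160, 140),
]
clicked = hit_test(frame_detections, 120, 90)
```

In a real pipeline the detections would come from the object detector each frame (e.g. YOLOv8 results converted to this shape), and the click coordinates from the web overlay, scaled to the video's resolution.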
- Object detection & tagging pipeline (YOLOv8, MediaPipe)
- Real-time video overlay UI
- Trigger logic API that communicates with FreeFuse or simulated IoT
- Demonstration video showing live interaction
- Architecture documentation with scalability paths
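The trigger logic API in the deliverables above can be thought of as a router from detected-object labels to registered actions. A minimal sketch, assuming two simulated sinks; the `TriggerRouter` class, the FreeFuse event format, and the IoT device names are all hypothetical stand-ins, not the platform's actual API.

```python
from typing import Callable, Dict, List

class TriggerRouter:
    """Maps a detected object's label to one or more registered actions."""

    def __init__(self) -> None:
        self._routes: Dict[str, List[Callable[[str], str]]] = {}

    def on(self, label: str, action: Callable[[str], str]) -> None:
        """Register an action to run whenever `label` is interacted with."""
        self._routes.setdefault(label, []).append(action)

    def fire(self, label: str) -> List[str]:
        """Invoke every action registered for `label`; return their results."""
        return [action(label) for action in self._routes.get(label, [])]

# Simulated sinks — in the real system these would be an HTTP call to the
# FreeFuse platform and an MQTT/GPIO message to an IoT device.
def freefuse_event(label: str) -> str:
    return f"freefuse:interaction:{label}"

def iot_lamp(label: str) -> str:
    return f"iot:lamp:on ({label} detected)"

router = TriggerRouter()
router.on("cup", freefuse_event)
router.on("cup", iot_lamp)
results = router.fire("cup")
```

Keeping the router decoupled from the detector this way also gives the architecture documentation a natural scalability path: the same routing table can later be backed by a message queue instead of in-process callbacks.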
- Providing specialized, in-depth knowledge and general industry insights for a comprehensive understanding.
- Sharing knowledge of the specific technical skills, techniques, and methodologies required for the project.
- Direct involvement in project tasks, offering guidance and demonstrating techniques.
- Providing access to the tools, software, and resources required for project completion.
- Scheduled check-ins to discuss progress, address challenges, and provide feedback.
Supported causes
This project addresses global challenges aligned with the United Nations Sustainable Development Goals (SDGs).
About the company
FreeFuse aims to solve low engagement, weak conversions, and limited brand recognition and loyalty by delivering personalized, engaging, and relevant digital experiences to users.