Real-Time and Accurate Full-Body Multi-Person Pose Estimation & Tracking System
Awesome work on hand pose estimation/tracking
A computer vision library for human-computer interaction. It implements head pose and gaze direction estimation using convolutional neural networks, skin detection through histogram backprojection, motion detection and tracking, and saliency maps.
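As a rough illustration of the backprojection idea mentioned above (a sketch of the general OpenCV technique, not this library's actual API), a skin mask can be derived by backprojecting a hue-saturation histogram of a known skin patch onto new frames:

```python
import cv2

# Sketch of skin detection via histogram backprojection with OpenCV.
# `skin_sample_bgr` is assumed to be a small BGR crop known to contain skin.
def skin_mask(skin_sample_bgr, frame_bgr):
    # Hue-saturation histogram of the skin sample
    sample_hsv = cv2.cvtColor(skin_sample_bgr, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([sample_hsv], [0, 1], None, [30, 32], [0, 180, 0, 256])
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)

    # Backproject the histogram onto the current frame
    frame_hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    backproj = cv2.calcBackProject([frame_hsv], [0, 1], hist, [0, 180, 0, 256], 1)

    # Smooth and threshold to obtain a binary skin mask
    disc = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (9, 9))
    backproj = cv2.filter2D(backproj, -1, disc)
    _, mask = cv2.threshold(backproj, 50, 255, cv2.THRESH_BINARY)
    return mask
```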
Orchestra is a human-in-the-loop AI system for orchestrating project teams of experts and machines.
👀 Use machine learning in JavaScript to detect eye movements and build gaze-controlled experiences.
AAAI 2024 Papers: a comprehensive collection of research papers presented at one of the premier artificial intelligence conferences, with linked code implementations for easier understanding. ⭐ Experience the forefront of progress in artificial intelligence with this repository!
Introducing Venocyber MD bot, the personal chuddybuddy MD you were looking for: a powerful all-in-one WhatsApp chat bot built to cover your personal WhatsApp needs. ✍️👋👋
Code for CVPR'18 spotlight "Weakly and Semi Supervised Human Body Part Parsing via Pose-Guided Knowledge Transfer"
Clojure(Script) library for phrasing spec problems.
Custom control of desktop programs through gesture recognition | an application based on gesture recognition for controlling desktop software, developed with MediaPipe + Electron + React
Quickly add MediaPipe pose estimation and detection to your iOS app, enabling powerful features driven by body or hand tracking.
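For the MediaPipe-based entries above, a minimal Python sketch of the same underlying landmark pipeline (using MediaPipe's classic Python solutions API, not these repos' Electron or iOS code) looks roughly like this:

```python
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

cap = cv2.VideoCapture(0)
with mp_pose.Pose(min_detection_confidence=0.5, min_tracking_confidence=0.5) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures BGR
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            # Normalized coordinates of the nose landmark, for example
            nose = results.pose_landmarks.landmark[mp_pose.PoseLandmark.NOSE]
            print(f"nose: ({nose.x:.2f}, {nose.y:.2f})")
cap.release()
```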
A curated list of awesome affective computing 🤖❤️ papers, software, open-source projects, and resources
VR driving 🚙 + eye tracking 👀 simulator based on CARLA for driving interaction research
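For context on the CARLA side of that simulator, connecting to a running server and spawning an ego vehicle looks roughly like the sketch below (standard CARLA Python client API; the VR headset and eye-tracking integration are specific to the repo and not shown):

```python
import carla

# Connect to a CARLA server assumed to be running locally on the default port
client = carla.Client("localhost", 2000)
client.set_timeout(5.0)
world = client.get_world()

# Spawn an ego vehicle at the first predefined spawn point
blueprint = world.get_blueprint_library().filter("vehicle.*")[0]
spawn_point = world.get_map().get_spawn_points()[0]
vehicle = world.spawn_actor(blueprint, spawn_point)

try:
    vehicle.set_autopilot(True)  # let the traffic manager drive for a quick test
    world.wait_for_tick()        # observe one simulation frame
finally:
    vehicle.destroy()
```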
Easy-to-use Python command-line tool to generate a gaze-point heatmap from a CSV file. 👁️
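The general recipe such a tool follows can be sketched in a few lines (the column names "x"/"y", screen size, and file names here are assumptions for illustration, not this tool's actual CSV schema or CLI):

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from scipy.ndimage import gaussian_filter

# Bin gaze points on an assumed 1920x1080 screen into a coarse grid, then blur
points = pd.read_csv("gaze.csv")  # assumed columns: "x", "y" in pixels
heat, _, _ = np.histogram2d(points["y"], points["x"],
                            bins=(108, 192), range=[[0, 1080], [0, 1920]])
heat = gaussian_filter(heat, sigma=3)  # smooth the discrete fixation counts

plt.imshow(heat, cmap="jet", extent=[0, 1920, 1080, 0])
plt.axis("off")
plt.savefig("heatmap.png", bbox_inches="tight", dpi=150)
```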
openEMSstim: open-hardware module to adjust the intensity of EMS/TENS stimulators.
Toolkits for creating a human-in-the-loop approval layer to monitor and guide AI agent workflows in real time.
Code and data belonging to our CSCW 2019 paper: "Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites".
The official implementation for ICMI 2020 Best Paper Award "Gesticulator: A framework for semantically-aware speech-driven gesture generation"
Wearable computing software framework for intelligence augmentation research and applications. Easily build smart glasses apps with built-in voice commands, speech recognition, computer vision, UI, sensors, smartphone connectivity, NLP, facial recognition, database, cloud connectivity, and more. This repo is in beta.