Webcam Paint Application Using Python
Abstract
Writing is a cohesive form of communication that can effectively convey our thoughts, and typing and handwriting are the standard methods of recording information today. In air writing, characters or words are traced in free space with a marker or a finger; it differs from traditional writing in that the pen never touches a surface or moves up and down. With the development of intelligent wearable devices, the digital world can now be controlled by human gestures, and these devices can recognize and interpret our actions. Gesture recognition is the task of recognizing and interpreting a continuous sequential stream of gestures from a given set of input data. Gestures are non-verbal information that can improve how computers understand human intent, and because gestures are perceived visually, computer vision is the natural tool for analyzing them. This project takes advantage of that capability and focuses on developing a motion-to-text converter that could serve as software for intelligent wearable devices, allowing users to write in the air. The system uses computer vision to trace the path of the fingertip, so that text can be written without touching a surface. The generated text can then be used for various purposes, such as sending messages or e-mails. It can be a powerful means of communication for the deaf, and an effective input method that reduces cellphone and laptop usage by eliminating the need to type or write on a device.
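The fingertip-tracing step described above can be sketched with OpenCV in Python. The snippet below is a minimal illustration under stated assumptions, not the exact system of this paper: it assumes the fingertip carries a coloured marker cap that is segmented by HSV thresholding (the LOWER_HSV/UPPER_HSV values are placeholders to be tuned), and it only draws the tracked path onto a canvas overlaid on the webcam feed; markerless fingertip detection and the motion-to-text conversion stage are beyond this sketch.

```python
import cv2
import numpy as np

# Assumed HSV range for a blue marker cap held at the fingertip;
# these thresholds are illustrative and must be tuned for the marker and lighting.
LOWER_HSV = np.array([100, 150, 50])
UPPER_HSV = np.array([130, 255, 255])

cap = cv2.VideoCapture(0)          # default webcam
canvas = None                      # drawing layer, created on the first frame
prev_point = None                  # last tracked fingertip position

while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.flip(frame, 1)     # mirror the feed so motion feels natural
    if canvas is None:
        canvas = np.zeros_like(frame)

    # Isolate the marker colour and clean up the binary mask.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    point = None
    if contours:
        largest = max(contours, key=cv2.contourArea)
        if cv2.contourArea(largest) > 500:    # ignore small noise blobs
            (x, y), _ = cv2.minEnclosingCircle(largest)
            point = (int(x), int(y))

    # Connect consecutive detections so the traced stroke is continuous.
    if point is not None and prev_point is not None:
        cv2.line(canvas, prev_point, point, (255, 0, 0), 4)
    prev_point = point

    # Overlay the accumulated strokes on the live feed.
    output = cv2.addWeighted(frame, 1.0, canvas, 1.0, 0)
    cv2.imshow("Air Writing", output)

    key = cv2.waitKey(1) & 0xFF
    if key == ord('c'):            # clear the canvas
        canvas = np.zeros_like(frame)
    elif key == ord('q'):          # quit
        break

cap.release()
cv2.destroyAllWindows()
```

In such a pipeline, pressing c clears the canvas and q quits; a separate recognition stage (for example, a character classifier applied to the drawn strokes) would be needed to turn the traced path into text for messages or e-mails.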