
International Journal of Scientific & Technology Research
Volume 9 - Issue 4, April 2020 Edition

Website: http://www.ijstr.org

ISSN 2277-8616



Controlling Mouse Navigation Through 3D Head Movement


 

AUTHOR(S)

Pradeep V, Jogesh Motwani

 

KEYWORDS

3D head movement, Assistive technology, Camera mouse, Controlling mouse cursor, Hands-free computing, People with movement disabilities.

 

ABSTRACT

Over the past decades, various approaches have been proposed to capture the user's head motion through a camera and use it to drive the mouse pointer, enabling people with movement disabilities to interact with computers. The movement of a facial feature is tracked to estimate the corresponding movement of the mouse cursor on the screen. Synchronizing the rate of head movement with cursor movement is the central challenge, because head movement is three-dimensional while the image sequence captured by a web camera is two-dimensional. The proposed system is an approach for recovering three-dimensional head rotation from an ordinary web camera that captures only two-dimensional images.
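
The abstract does not spell out the implementation; the sketch below shows one common way to obtain 3D head rotation from an ordinary 2D webcam: detect facial landmarks, fit them to a generic 3D face model with OpenCV's solvePnP, and convert the resulting yaw and pitch into relative cursor motion. The choice of dlib's 68-point landmark detector, the 3D model coordinates, the landmark indices, and the gain value are all illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch (not the authors' implementation): recover 3D head
# rotation from 2D webcam frames and map it to relative cursor motion.
import cv2
import dlib
import numpy as np
import pyautogui

detector = dlib.get_frontal_face_detector()
# Assumes the standard dlib 68-point landmark model file is available locally.
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

# Approximate 3D positions (in mm) of six facial points on a generic head model.
MODEL_POINTS = np.array([
    (0.0, 0.0, 0.0),           # nose tip
    (0.0, -330.0, -65.0),      # chin
    (-225.0, 170.0, -135.0),   # left eye, outer corner
    (225.0, 170.0, -135.0),    # right eye, outer corner
    (-150.0, -150.0, -125.0),  # left mouth corner
    (150.0, -150.0, -125.0),   # right mouth corner
], dtype=np.float64)
LANDMARK_IDS = [30, 8, 36, 45, 48, 54]  # matching indices in the 68-point scheme
GAIN = 2.0  # cursor pixels per degree of head rotation (tuning assumption)

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    h, w = frame.shape[:2]
    # Rough pinhole intrinsics: focal length ~ image width, principal point at centre.
    camera_matrix = np.array([[w, 0, w / 2],
                              [0, w, h / 2],
                              [0, 0, 1]], dtype=np.float64)
    faces = detector(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    if faces:
        shape = predictor(frame, faces[0])
        image_points = np.array(
            [(shape.part(i).x, shape.part(i).y) for i in LANDMARK_IDS],
            dtype=np.float64)
        # Perspective-n-Point: fit the 3D model to the 2D landmarks to get head pose.
        found, rvec, tvec = cv2.solvePnP(MODEL_POINTS, image_points,
                                         camera_matrix, np.zeros((4, 1)))
        if found:
            rot, _ = cv2.Rodrigues(rvec)
            angles, *_ = cv2.RQDecomp3x3(rot)   # Euler angles in degrees
            pitch, yaw, _ = angles
            # Signs may need flipping depending on camera orientation.
            pyautogui.moveRel(GAIN * yaw, GAIN * pitch)
    cv2.imshow("head pose", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to exit
        break
cap.release()
```

In practice, a dead zone and some temporal smoothing of the estimated angles are usually added so that small involuntary head movements do not jitter the cursor.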

 
