C. J. Perera, T. D. Lalitharatne, and Kazuo Kiguchi, “EEG-Controlled Meal Assistance Robot with Camera-Based Automatic Mouth Position Tracking and Mouth Open Detection,” in International Conference on Robotics and Automation (ICRA), Singapore, 2017. Abstract

A meal assistance robot is an assistive device used to aid individuals who cannot independently direct food to their mouths. For individuals who have lost upper limb function due to amputation, spinal cord injury or cerebral palsy, self-feeding can be impossible, and meal assistance robots have been introduced to help such individuals regain their independence. In this paper we propose a meal assistance robot that is controlled by user intentions derived from Electroencephalography (EEG) signals, incorporating camera-based automatic mouth position tracking and mouth open detection systems. In the proposed system, users select any solid food item that they desire to consume from three different containers by looking at the corresponding flickering LED matrices. User intentions are identified from the EEG signals using a Steady State Visual Evoked Potential (SSVEP) based intention detection method. In this first stage, initial motion commands for scooping food from the containers are generated and sent to the meal assistance robot. In the second stage, a camera-based mouth position tracking method is proposed for automatically detecting the user’s mouth position and moving the spoon, or end-effector, of the meal assistance robot towards the user’s mouth. This method automatically tracks the mouth position of users irrespective of their individual body differences and seating positions. A mouth open/closed recognition method is implemented in the final stage so that food is fed to users when they indicate, by opening their mouth, that they wish to consume it. A set of experiments was carried out with healthy subjects to validate the proposed system, and the results are presented.
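The mouth open/closed recognition step can be illustrated with a minimal sketch. The abstract does not specify the paper's actual algorithm, so this assumes a generic mouth-aspect-ratio test on four hypothetical mouth landmarks (the two mouth corners plus the upper and lower lip midpoints) that any off-the-shelf face-landmark detector could supply:

```python
import numpy as np

def mouth_aspect_ratio(landmarks):
    """Ratio of vertical lip opening to horizontal mouth width.
    `landmarks` is an iterable of four (x, y) points ordered as
    [left_corner, right_corner, top_lip, bottom_lip]."""
    left, right, top, bottom = (np.asarray(p, dtype=float) for p in landmarks)
    width = np.linalg.norm(right - left)      # corner-to-corner distance
    opening = np.linalg.norm(bottom - top)    # lip-to-lip distance
    return opening / width

def is_mouth_open(landmarks, threshold=0.4):
    # Threshold is illustrative; a real system would calibrate per user.
    return mouth_aspect_ratio(landmarks) > threshold

# Synthetic landmark sets (pixel coordinates)
closed = [(0, 0), (60, 0), (30, -5), (30, 5)]    # small opening: MAR ≈ 0.17
opened = [(0, 0), (60, 0), (30, -20), (30, 20)]  # large opening: MAR ≈ 0.67
```

The width normalization makes the test roughly invariant to the user's distance from the camera, which matters given the varying seating positions mentioned above.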

Kalinga Nisal, Isuru Ruhunge, Janaka Subodha, Chamika Janith Perera, and Thilina Dulantha Lalitharatne, “Design, Implementation and Performance Validation of UOMPro Artificial Hand: Towards Affordable Hand Prostheses,” in 39th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Jeju Island, Korea, 2017 [accepted]. Abstract

In this paper, an anthropomorphic, active prosthetic hand named UOMPro (University of Moratuwa Prosthetic) is proposed. The UOMPro hand was realized in an effort to develop affordable hand prostheses, especially for people in developing countries for whom high-cost, state-of-the-art commercial hand prostheses are beyond reach. The proposed hand is developed at an affordable cost (< 850 USD) and has 6 Degrees of Freedom (DOF), comprising flexion/extension of the five fingers and abduction/adduction of the thumb. The underactuated fingers are fabricated from a combination of 3D printed parts and CNC machined aluminum to address the drawbacks of fully 3D printed hands. All components of the electronic control circuit responsible for low-level control of the hand are placed inside the hand, and a simple serial communication interface is provided to link with high-level control methods. The implemented low-level controller can communicate with either a high-level controller that sends individual finger position commands or one that sends hand grip pattern commands. A set of experiments is conducted to validate the performance of the overall system, and results are presented with potential future directions.
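The serial link between a high-level controller and the in-hand low-level board could be framed as shown below. The paper's actual protocol is not published in this abstract, so the framing here (start byte, finger id, position, checksum) is entirely hypothetical and only illustrates how individual finger position commands might be packed for the hand's 6 DOF:

```python
import struct

def encode_finger_command(finger_id, position):
    """Pack one finger position command into a 4-byte frame.
    Hypothetical framing, not the UOMPro protocol: a 0xAA start
    byte, the finger id (0-5, one per DOF), the target position
    as percent of full flexion, and a simple additive checksum."""
    if not 0 <= finger_id <= 5:
        raise ValueError("UOMPro has 6 DOF: finger ids 0-5")
    if not 0 <= position <= 100:
        raise ValueError("position is percent of full flexion")
    payload = struct.pack("<BB", finger_id, position)
    checksum = (finger_id + position) & 0xFF
    return b"\xaa" + payload + bytes([checksum])
```

A grip-pattern command set could reuse the same frame shape with a pattern id in place of the finger id, which is one reason such fixed-length frames are a common choice for low-level prosthesis controllers.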



C. J. Perera, I. Naotunna, C. Sandaruwan, N. J. Kelly-Boxall, R. A. R. C. Gopura, and T. D. Lalitharatne, “Electrooculography signal based control of a meal assistance robot,” in IASTED International Conference on Biomedical Engineering, Innsbruck, Austria, 2016, pp. 133-139. Abstract

The purpose of a meal assistance robot is to aid individuals who cannot consume food independently. In this paper, a meal assistance robot controlled by Electrooculography (EOG) signals is proposed. In the proposed method, eye blinks and horizontal eye movements detected from the EOG signals are used to control a 4-degree-of-freedom meal assistance robot. Experiments are carried out to validate the system, and the results show the effectiveness of the proposed method.
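Blink and horizontal-saccade detection of the kind described above is often a matter of simple amplitude thresholds on the EOG channels. The sketch below is an assumption-laden illustration, not the paper's method; the channel layout, thresholds, and command names are all invented for the example and would need per-user calibration in practice:

```python
import numpy as np

def classify_eog(window, blink_thresh=200.0, saccade_thresh=80.0):
    """Classify one EOG window (microvolts) into a robot command.
    `window` is a (vertical, horizontal) pair of channel arrays.
    A large vertical spike reads as a blink; a sustained positive
    or negative horizontal deflection reads as a right/left gaze."""
    vertical, horizontal = window
    if np.max(np.abs(vertical)) > blink_thresh:
        return "blink"           # e.g. confirm the current selection
    mean_h = np.mean(horizontal)
    if mean_h > saccade_thresh:
        return "right"           # e.g. move to the next container
    if mean_h < -saccade_thresh:
        return "left"
    return "rest"

# Synthetic one-window examples
blink_win = (np.array([0.0, 250.0, 0.0]), np.zeros(3))
right_win = (np.zeros(3), np.full(3, 100.0))
rest_win = (np.zeros(3), np.zeros(3))
```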

C. J. Perera, I. Naotunna, C. Sandaruwan, R. A. R. C. Gopura, and T. D. Lalitharatne, “SSVEP Based BMI for a Meal Assistance Robot,” in IEEE International Conference on Systems, Man, and Cybernetics (SMC), Budapest, Hungary, 2016, pp. 2295-2300. Abstract

Meal assistance robots give disabled individuals access to one of the important activities of daily living: self-feeding. This paper proposes a Steady State Visual Evoked Potential (SSVEP) based Brain Machine Interface (BMI) for controlling a meal assistance robot. In the proposed system, the user can select any solid food item that he or she would like to eat from three different bowls simply by looking at the respective LED matrices blinking at different frequencies. The SSVEP responses generated while looking at the LEDs are extracted from EEG signals acquired using the OpenBCI EEG signal acquisition system. The extracted SSVEP signals are used to identify the intention of the user, and the detected intentions are then used to operate the meal assistance robot. Experiments are carried out to validate the system, and the results indicate the effectiveness of the proposed method.
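The core of SSVEP detection is to find which stimulus frequency dominates the EEG spectrum. The minimal sketch below assumes a plain FFT power comparison; the paper's actual pipeline (filtering, electrode montage, classifier) is not reproduced here, and the sampling rate and stimulus frequencies are illustrative:

```python
import numpy as np

def detect_ssvep(eeg, fs, stim_freqs):
    """Return the stimulus frequency whose FFT bin has the most power.
    `eeg` is a 1-D single-channel signal, `fs` the sampling rate in Hz,
    `stim_freqs` the candidate LED flicker frequencies."""
    spectrum = np.abs(np.fft.rfft(eeg - np.mean(eeg)))  # remove DC, magnitude
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in stim_freqs]
    return stim_freqs[int(np.argmax(powers))]

# Synthetic 4-second trial: a 10 Hz flicker response buried in noise
fs = 250
t = np.arange(0, 4, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(len(t))
```

With three bowls, the detected frequency maps directly to one food container; longer windows sharpen the frequency resolution at the cost of selection latency.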



I. Naotunna, C. J. Perera, C. Sandaruwan, R. A. R. C. Gopura and T. D. Lalitharatne, ”Meal assistance robots: A review on current status, challenges and future directions,” 2015 IEEE/SICE International Symposium on System Integration (SII), Nagoya, Dec 2015, pp. 211-216. Abstract

The need for assistive robots to perform activities of daily living is increasing as the labor force in welfare and nursing care shrinks. Self-feeding, or eating, is one of the primary activities of a person's daily life, and assistive robots for self-feeding have been developed in response. The purpose of this paper is to review existing meal assistance robots. Important design features such as feeding techniques, the advantages and limitations of the control methods of meal assistance robots, and the different input signals used are comprehensively discussed. Challenges in developing meal assistance robots and potential future directions are also discussed at the end.