The Active Appearance Model (AAM) has been widely used in recent years for face detection and facial feature extraction. Based on the AAM, this thesis proposes a new algorithm for tracking face orientation, and applies the resulting orientation information to a vestibular (balance) rehabilitation exercise and a human-computer interface. The thesis first uses the AAM to coarsely locate the five facial features required for computing the face orientation angle: the outer corners of the two eyes, the tip of the nose, and the two corners of the mouth. Skin-color information is then used to fine-tune the detected feature points. Moreover, because the AAM fails when the face is turned too far away, an optical flow algorithm tracks the feature points from the previous frame to ensure that the features can still be detected. Based on the relative distances of these five features in the image and the structure of the human face, the thesis computes the rotation angle of the face relative to the camera, thereby obtaining the face orientation. First, the orientation information is combined with interactive game elements and applied to the head-movement portion of the vestibular rehabilitation exercise: patients need not wear any device or be accompanied by a physician, and can perform the exercise at home with a simple setup. The system also records each session so that rehabilitation progress can be evaluated later. The second application proposed in this thesis is a head-controlled mouse, which lets users whose hands cannot operate a mouse control a computer by turning their heads. Combined with the communication aid software previously developed by our laboratory, it further enables people with disabilities to perform basic daily functions such as typing, audio/video entertainment, and home appliance control. Experiments were designed for both applications to verify their effectiveness and limitations.
Detection of Face Orientation and its Applications in Vestibular Rehabilitation and Human-Computer Interface
In recent years, the Active Appearance Model (AAM) has been widely applied to the detection of human faces and the extraction of facial features. Based on the AAM, this thesis proposes a flexible algorithm for tracking facial orientation in face images. The facial orientation information is then applied to the development of a vestibular rehabilitation exercise system and a human-computer interface. The thesis first adopts the AAM to coarsely detect the five facial features needed to compute the facial orientation: the outer corners of the two eyes, the tip of the nose, and the two corners of the mouth. Information on skin and non-skin regions is then used to fine-tune the locations of the five features coarsely detected by the AAM. Because the traditional AAM fails when the face is turned too far from the camera, an optical flow tracker follows the features from the previous frame whenever the AAM cannot find them in the current frame, ensuring that the features remain detected. The facial orientation is then computed from a geometric analysis of these five facial features. First, the orientation information is combined with an interactive game to build a vestibular rehabilitation exercise system: patients can perform the exercise at home with a simple setup, without wearing any device or being accompanied by a physician. In addition, the system records corresponding evaluation parameters during each exercise session. The second application of the facial orientation information is a human-computer interface called a "head mouse", which allows a person whose hands cannot operate a mouse to manipulate a computer simply by rotating his or her head.
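To illustrate the kind of geometric analysis described above, the following sketch estimates the horizontal rotation (yaw) from the five landmarks. The exact formula used in the thesis is not given in the abstract; the version here, which measures the nose tip's normalized offset from the midline between the outer eye corners under a simple cylindrical head model, is an assumption for illustration only.

```python
import math

def estimate_yaw(left_eye, right_eye, nose, left_mouth, right_mouth):
    """Estimate horizontal head rotation (yaw, in degrees) from five
    facial landmarks, each given as an (x, y) pixel coordinate.

    Illustrative sketch only: the offset-from-midline formula below is
    an assumed stand-in for the thesis's geometric analysis.
    """
    # Midpoint between the outer eye corners approximates the face midline.
    mid_x = (left_eye[0] + right_eye[0]) / 2.0
    eye_span = right_eye[0] - left_eye[0]
    if eye_span <= 0:
        raise ValueError("degenerate eye-corner positions")
    # Normalized horizontal displacement of the nose tip from the midline,
    # in the range [-1, 1] after clamping.
    offset = (nose[0] - mid_x) / (eye_span / 2.0)
    offset = max(-1.0, min(1.0, offset))
    # Under a cylindrical head model, the offset is roughly sin(yaw).
    return math.degrees(math.asin(offset))
```

For a frontal face the nose tip lies on the midline and the estimated yaw is 0; as the head turns, the nose tip shifts toward one eye corner and the estimate grows accordingly.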
In addition, the head mouse, combined with the communication aid previously developed by our laboratory, further allows people with severe disabilities to type, surf the Web, enjoy audio/video entertainment, control home appliances, and so on. Experiments were designed for both applications to verify their effectiveness and limitations.
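A head mouse needs some mapping from head rotation to cursor movement. The mapping below is a hypothetical sketch, not the thesis's actual design: it assumes a dead zone so that small involuntary head movements do not move the cursor, with linear gain beyond it.

```python
def head_to_cursor_delta(yaw_deg, pitch_deg, dead_zone=5.0, gain=2.0):
    """Map head rotation angles (degrees) to a cursor delta (dx, dy) in
    pixels per update.

    Hypothetical mapping for a head mouse: angles within the dead zone
    are ignored; beyond it, displacement grows linearly with the angle.
    """
    def axis(angle):
        if abs(angle) <= dead_zone:
            return 0.0
        sign = 1.0 if angle > 0 else -1.0
        return sign * (abs(angle) - dead_zone) * gain
    # Positive yaw (face turned right) moves the cursor right;
    # positive pitch (face tilted down) moves the cursor down.
    return axis(yaw_deg), axis(pitch_deg)
```

The dead zone and gain would in practice be tuned per user, since users with limited neck mobility may need a larger gain over a smaller range of motion.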
Keywords: detection of face orientation, vestibular rehabilitation, human-computer interface, active appearance model, feature extraction, optical flow tracking