A traditional war game uses a table, a map, and pieces to build a combat-simulation environment, with the pieces moved manually to deduce, analyze, and record tactical changes for the purpose of tactical planning. Because manual war gaming is time-consuming and its deduction space is limited, computers are now commonly used in place of traditional manual war games. Although computer war games outperform manual ones, they require system-operation knowledge and lack hands-on, physical deduction. This thesis combines a Microsoft Kinect 3D depth camera with a projector to build a mixed-reality interactive tabletop environment for war gaming. The interactive table offers many appealing features, such as intuitive embodied operation, large-scale touch capability, and real-time geographic map data that satisfies different war-gaming needs. The system automatically records the deduction paths of the war game and, after the exercise, replays them as animations; these historical trajectories serve as a basis for reviewing scenario deductions and force deployment. Furthermore, we use the Kinect depth camera as a touch sensor, turning the tabletop into a touch panel; this not only removes the inconvenience of relocating a large touch screen but also achieves the development goal without additional equipment and within a limited budget. Finally, each function of the system was verified through experiments: in the touch-accuracy experiment, the instruction accuracy reaches 97.1%; in the object detection and recognition experiments, the average recognition accuracy is 97.42% for objects at different angles and 87.49% for moving objects. In addition, we evaluated the system with the System Usability Scale and obtained a score of 77.14.
A Mixed-Reality-Based Interactive War-Gaming Platform
A traditional war simulation, or war game, uses a game table, a map, and pieces representing different forces to build a combat-simulation environment; staff officers move the pieces around the table to deduce, analyze, and record changes resulting from the simulated military strategies, thereby achieving tactical-planning purposes. Because conventional war gaming is very time-consuming and its operating environment has many limitations, computer-based war games are now commonly adopted in its place. Although computer-based war games outperform traditional ones, they require operators with expertise in the war-game system and lack the intuitive physical actions of manual deduction. This thesis integrates a Microsoft Kinect 3D depth camera with a projector to create a mixed-reality interactive war-simulation (war-game) platform. The platform offers many appealing characteristics, such as intuitive embodied controls, large-scale touch-screen functionality, and real-time provision of geographic map information that meets the requirements of different military exercises. The proposed system automatically records the entire deduction procedure of a war game and replays it as an animation when the exercise ends; the recorded history serves as a basis for reviewing the military exercise and troop-deployment deductions. With this environment, users can accomplish a military deduction through embodied control. In addition, we use the Kinect depth camera as a touch sensor so that the tabletop itself serves as a large-scale touch panel. This arrangement not only avoids the inconvenience of relocating a large touch screen but also meets the development goal without additional equipment and under a small budget.
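The core of the depth-camera-as-touch-sensor idea is to compare each depth frame against a pre-captured depth map of the empty table and treat pixels that sit a few millimetres above the surface as fingertip contacts. The following is a minimal sketch of that depth-thresholding approach; the function names, thresholds, and the pure-NumPy connected-component labelling are illustrative assumptions, not the thesis's actual implementation.

```python
import numpy as np

def detect_touches(depth_frame, background, near_mm=5, far_mm=25, min_area=30):
    """Detect touch points on a tabletop from a depth image (illustrative sketch).

    A pixel is a touch candidate when it lies slightly above the pre-captured
    background (table) surface: closer to the camera than the table by between
    `near_mm` and `far_mm` millimetres. Connected candidate regions covering at
    least `min_area` pixels are reported by their (x, y) centroid.
    """
    # Height of each pixel above the table, in millimetres (signed arithmetic).
    height_above = background.astype(np.int32) - depth_frame.astype(np.int32)
    mask = (height_above >= near_mm) & (height_above <= far_mm)

    # Simple 4-connected component labelling via iterative flood fill.
    labels = np.zeros(mask.shape, dtype=np.int32)
    touches = []
    next_label = 0
    for y, x in zip(*np.nonzero(mask)):
        if labels[y, x]:
            continue  # already assigned to a component
        next_label += 1
        labels[y, x] = next_label
        stack, pixels = [(y, x)], []
        while stack:
            cy, cx = stack.pop()
            pixels.append((cy, cx))
            for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = next_label
                    stack.append((ny, nx))
        if len(pixels) >= min_area:
            ys, xs = zip(*pixels)
            touches.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return touches
```

In a real deployment the background map would be captured once at calibration time, the touch centroids would then be mapped into the projector's coordinate system via a homography, and a production system would use an optimized labelling routine (e.g. from OpenCV) rather than the flood fill shown here.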
Finally, several experiments were designed to evaluate the functionalities of the proposed mixed-reality war-simulation platform. In the touch experiments, the instruction-recognition accuracy is 97.1%. In the object detection and recognition experiments, the average recognition accuracy is 97.42% for objects placed at different angles and 87.49% for moving objects. In addition, we evaluated the system with the System Usability Scale and obtained a score of 77.14.
Keywords: depth camera, mixed reality, touch-screen interface, war game, three-dimensional object recognition