

Please use this permanent URL to cite or link to this item: https://irlib.pccu.edu.tw/handle/987654321/44502


Title: Implementation of Pet Robot Interaction Design Based on Neural Network
Authors: 蘇國和
郭重顯
Contributors: Master's Program in Digital Mechatronics, Department of Mechanical Engineering
Keywords: Pet robot
Convolutional neural network
YOLO image recognition
Wireless sensor network
Recurrent neural network
Genetic algorithm
Emotional state parameter module
Date: 2019-2020
Upload time: 2019-06-11 11:59:04 (UTC+8)
Abstract: Under a social structure of declining birth rates, aging, and more people living alone, companionship is a dream for many. Through a pet robot, this project aims to take the place of real family and friends in the user's heart and to integrate into daily life. Current pet robots, however, have shortcomings such as fixed behaviors and fixed types, which lead to poor interaction: users cannot adjust the robot's behavior or switch the pet type, so the robot fails to meet their expectations. To improve interaction between the pet robot and the user, the architecture of this project's pet robot comprises four parts.

(1) Facial expression, gaze, and posture recognition: humans show anger, happiness, sadness, and expectation, and reading these expressions and gazes is a key consideration for a new generation of pet robots. This project uses YOLO for this task. From image input to output, the computation is performed by a convolutional neural network (CNN); the hardware is an Intel i7-8700K processor with a GTX 1080 Ti GPU, and the YOLOv3 or YOLOv3-tiny model is adopted for accurate, real-time recognition. The result is converted into the mental state parameters [anger, happiness, sadness, expectation] (see the first sketch after the abstract).

(2) Wireless sensing IoT: the robot's owner can wear a heart-rate monitor, a sphygmomanometer, a thermometer, and a motion-detection wristband. These sensors transmit the user's physiological state [heart rate, diastolic pressure, systolic pressure, body temperature] to the pet robot through a wireless transmitter module, from which a physiological state parameter module is built. In addition, when an elderly person or a young child falls, the wristband detects the large impact, issues an alert, passes the person's location to the pet robot, and wirelessly relays the scene to remote medical staff or family members (see the second sketch below).

(3) Two emotional state parameter modules between human and pet robot, built with neural network techniques: from the perspective of artificial intelligence, the emotional relationship between a person and a pet robot consists of two parameter modules, (a) a physiological state parameter module and (b) a mental state parameter module. This project uses a recurrent neural network (RNN) and a genetic algorithm (GA) to model both modules from the collected data, so as to realize a pet robot with an emotional level comparable to a human's that can interact attentively with its owner (see the third sketch below).

(4) Voice output and limb movements to comfort the user: according to the user's current emotional parameters, the neural network selects suitable comforting speech from a voice database and actuates eye, hand, and foot movements to soothe the user (see the fourth sketch below).

The project will complete the following functions: (1) design, fabrication, and integration of the pet robot controller hardware; (2) construction of the controller's image-recognition software, using YOLOv3 or YOLOv3-tiny real-time recognition to obtain the parameters; (3) establishment of the controller's emotional state parameter modules, including the physiological and mental state parameter modules; (4) establishment of the voice database and the corresponding limb-movement (eye, hand, foot) planning, together with daisy-chained driving of the AI servo motors.
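The abstract describes part (1) only at a high level, so here is a minimal sketch of the expression-recognition step, assuming a YOLOv3 network already trained on the four expression classes and a CUDA-enabled OpenCV build. The file names, confidence threshold, and class order are hypothetical.

```python
# Sketch of part (1): YOLOv3 expression detection -> mental state parameters.
# "yolov3-expr.cfg" / "yolov3-expr.weights" are hypothetical file names, and
# 0.5 is an assumed confidence threshold.
import cv2
import numpy as np

CLASSES = ["anger", "happiness", "sadness", "expectation"]

net = cv2.dnn.readNetFromDarknet("yolov3-expr.cfg", "yolov3-expr.weights")
net.setPreferableBackend(cv2.dnn.DNN_BACKEND_CUDA)  # e.g. the GTX 1080 Ti
net.setPreferableTarget(cv2.dnn.DNN_TARGET_CUDA)

def mental_state(frame: np.ndarray) -> np.ndarray:
    """Map one camera frame to a normalized [anger, happiness, sadness,
    expectation] vector using the strongest detection per class."""
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True)
    net.setInput(blob)
    scores = np.zeros(len(CLASSES))
    for out in net.forward(net.getUnconnectedOutLayersNames()):
        for det in out:                 # det = [cx, cy, w, h, obj, classes...]
            cls = int(np.argmax(det[5:]))
            conf = float(det[4] * det[5 + cls])
            if conf > 0.5:
                scores[cls] = max(scores[cls], conf)
    total = scores.sum()
    return scores / total if total > 0 else scores
```

YOLOv3-tiny would swap in a smaller cfg/weights pair behind the same interface, trading some accuracy for speed.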
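For part (2), a hedged sketch of the robot-side decoding of one wristband packet into the physiological state parameters, with the impact-based fall check. The packet layout, units, and the 3 g impact threshold are assumptions; a real wristband defines its own wire format.

```python
# Sketch of part (2): decode a wristband packet into the physiological state
# [heart rate, diastolic, systolic, temperature] and flag a fall on impact.
# PACKET_FMT and the 3.0 g threshold are assumptions for illustration.
import struct
from dataclasses import dataclass

@dataclass
class PhysioState:
    heart_rate: float   # beats per minute
    diastolic: float    # mmHg
    systolic: float     # mmHg
    temperature: float  # degrees Celsius

PACKET_FMT = "<7f"      # 4 physiological floats + 3-axis acceleration in g

def parse_packet(payload: bytes):
    """Return (PhysioState, fell) for one received packet."""
    hr, dia, sys_, temp, ax, ay, az = struct.unpack(PACKET_FMT, payload)
    impact = (ax * ax + ay * ay + az * az) ** 0.5
    fell = impact > 3.0  # assumed "large impact" threshold
    return PhysioState(hr, dia, sys_, temp), fell
```

On a flagged fall, the robot would forward the wearer's location and camera feed to the care center or family, as the abstract describes.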
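For part (3), an illustrative sketch of the RNN-plus-GA idea: a small recurrent network maps a sequence of the combined eight physiological and mental parameters to four emotion outputs, while a genetic algorithm searches the network weights. The layer sizes, GA settings, and fitness interface are assumptions, not the project's actual design.

```python
# Sketch of part (3): a tanh RNN scored by a user-supplied fitness function
# and tuned by a simple genetic algorithm (truncation selection, Gaussian
# mutation, elitism). All sizes and GA settings are illustrative.
import numpy as np

IN, HID, OUT = 8, 16, 4          # 4 physiological + 4 mental inputs; 4 outputs
DIM = IN * HID + HID * HID + HID * OUT

def unpack(genome):
    """Split a flat genome into the three RNN weight matrices."""
    i, j = IN * HID, IN * HID + HID * HID
    return (genome[:i].reshape(IN, HID),
            genome[i:j].reshape(HID, HID),
            genome[j:].reshape(HID, OUT))

def rnn_forward(genome, seq):
    """Run the RNN over a (T, IN) sequence; return the final (OUT,) output."""
    Wx, Wh, Wo = unpack(genome)
    h = np.zeros(HID)
    for x in seq:
        h = np.tanh(x @ Wx + h @ Wh)
    return h @ Wo

def evolve(fitness, pop=50, gens=100, sigma=0.1):
    """Genetic algorithm: keep the top 20%, mutate copies, preserve elites."""
    population = np.random.randn(pop, DIM) * 0.1
    for _ in range(gens):
        scores = np.array([fitness(g) for g in population])
        elite = population[np.argsort(scores)[-pop // 5:]]
        children = elite[np.random.randint(len(elite), size=pop)]
        population = children + np.random.randn(pop, DIM) * sigma
        population[: len(elite)] = elite
    return max(population, key=fitness)
```

In the project's setting, fitness would score how well rnn_forward reproduces emotion labels from the collected sensor data, for example a negative mean-squared error against recorded sequences.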
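Finally, a small sketch of part (4): picking a comforting utterance and a limb gesture from the dominant component of the emotion vector. The phrases and gesture names are placeholders for the project's voice database and its eye/hand/foot servo routines.

```python
# Sketch of part (4): dominant emotion -> comfort phrase + gesture name.
# VOICE_DB entries and gesture labels are placeholders, not project data.
import random

CLASSES = ["anger", "happiness", "sadness", "expectation"]
VOICE_DB = {
    "anger": ["Take a deep breath; I am right here with you."],
    "happiness": ["I am glad you are feeling good today!"],
    "sadness": ["I am sorry you feel down. Shall we take a walk?"],
    "expectation": ["Something nice is coming; let's get ready together."],
}
GESTURES = {"anger": "slow_nuzzle", "happiness": "tail_wag",
            "sadness": "head_rest", "expectation": "head_tilt"}

def comfort(emotion):
    """Choose speech and gesture for the strongest emotion component."""
    dominant = CLASSES[max(range(len(CLASSES)), key=lambda i: emotion[i])]
    return random.choice(VOICE_DB[dominant]), GESTURES[dominant]

speech, gesture = comfort([0.1, 0.1, 0.7, 0.1])   # -> a "sadness" response
```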
Appears in Collections: [Department of Mechanical Engineering / Graduate Institute of Digital Mechatronics] Research Projects
