Curriculum Vitae

Kazuhiko Shinozawa

  (篠澤 一彦)

Profile Information

Affiliation
Professor, Division of Math, Sciences, and Information Technology in Education, Osaka Kyoiku University
Degree
Bachelor of Engineering (Keio University)
Master of Engineering (Keio University)
Ph.D. in Informatics (Kyoto University)

Researcher number
80395160
J-GLOBAL ID
202101010709748024
researchmap Member ID
R000029802

Committee Memberships

 1

Awards

 3

Papers

 53
  • Shohei Yamashita, Tomohiro Kurihara, Tetsushi Ikeda, Kazuhiko Shinozawa, Satoshi Iwaki
    Advanced Robotics, 34(20) 1309-1323, Oct, 2020  Peer-reviewed
  • 長谷川孔明, 古谷誠悟, 金井祐輔, 篠沢一彦, 今井倫太
    知能と情報 (Journal of the Japan Society for Fuzzy Theory and Intelligent Informatics), 30(4) 634-642, Aug, 2018  Peer-reviewed
  • Yoichi Morales, Atsushi Watanabe, Florent Ferreri, Jani Even, Kazuhiko Shinozawa, Norihiro Hagita
    Robotics and Autonomous Systems, 108 13-26, May, 2018  Peer-reviewed
  • Reo Matsumura, Masahiro Shiomi, Kayako Nakagawa, Kazuhiko Shinozawa, Takahiro Miyashita
    Journal of Robotics and Mechatronics, 28(1) 107-108, 2016  Peer-reviewed
    We developed robovie-mR2, a desktop-sized communication robot that incorporates a “Kawaii” design to create a familiar appearance, since a familiar appearance is an important acceptance factor for both researchers and users. It can interact with people using multiple sensors, including a camera and microphones, expressive gestures, and an information display. We believe that robovie-mR2 will become a useful robot platform for advancing human-robot interaction research, and we give examples of human-robot interaction studies that use robovie-mR2.
  • Masahiro Shiomi, Kayako Nakagawa, Kazuhiko Shinozawa, Reo Matsumura, Hiroshi Ishiguro, Norihiro Hagita
    International Journal of Social Robotics, 9(1) 5-15, 2016  Peer-reviewed
    This paper investigated whether being touched by a robot motivates a person. The human science literature has shown that touching others facilitates the efforts of the people being touched. In the human-robot interaction research field, however, past research has rarely focused on the effects of such touches from robots to people; a few studies reported negative impressions, even when a touch from a person to a robot left a positive impression. To reveal whether a robot's touch positively affects humans, we conducted an experiment in which a robot asked participants to perform a simple, monotonous task with or without touch interaction between the robot and the participants. The results showed that both touches from the robot to the participants and touches from the participants to the robot facilitated their efforts.

Misc.

 8
  • 西尾拓真, 宮下敬宏, 篠澤一彦, 萩田紀博, 安藤英由樹
    Proceedings of the Annual Conference of the Virtual Reality Society of Japan (CD-ROM), 28th, 2023  
  • 大野凪, 宮下敬宏, 篠澤一彦, 萩田紀博, 安藤英由樹
    Proceedings of the Annual Conference of the Virtual Reality Society of Japan (CD-ROM), 27th, 2022  
  • Kayako Nakagawa, Masahiro Shiomi, Kazuhiko Shinozawa, Reo Matsumura, Hiroshi Ishiguro, Norihiro Hagita
    Proceedings of the 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI 2011), 465-472, 2011  
    This paper presents the effect of a robot's active touch on improving people's motivation. For services in the education and healthcare fields, a robot might be useful for improving motivation to perform such repetitive and monotonous tasks as exercising or taking medicine. Previous research demonstrated that touching a robot improves users' impressions of it, but it did not clarify whether a robot's touch, especially an active touch, has enough influence on people's motivation. We implemented an active touch behavior and experimentally investigated its effect on motivation. In the experiment, a robot asked participants to perform a monotonous task with an active touch from the robot, a passive touch, or no touch. The results showed that an active touch by the robot increased the number of working actions and the amount of working time for the task. This suggests that a robot's active touch can help improve people's motivation, and we believe such behavior is useful for robot services in education and healthcare.
  • Shuichi Nishio, Takayuki Kanda, Takahiro Miyashita, Kazuhiko Shinozawa, Norihiro Hagita, Tatsuya Yamazaki
    Journal of the Robotics Society of Japan, 26(5) 427-430, Jul 15, 2008  
  • 飯尾尊優, 篠沢一彦, 塩見昌裕, 宮下敬宏, 秋本高明, 萩田紀博
    Proceedings of the Annual Conference of the Robotics Society of Japan (CD-ROM), 26th, 2008  
  • Norihiro Hagita, Takahiro Miyashita, Takayuki Kanda, Kazuhiko Shinozawa, Takaaki Akimoto
    Journal of the Robotics Society of Japan, 25(4) 509-513, 2007  
  • Takahiro Miyashita, Kazuhiko Shinozawa, Norihiro Hagita, Hiroshi Ishiguro
    IEEE International Conference on Intelligent Robots and Systems, 3468-3473, Dec 1, 2006  
    This paper describes a method, based on sensor histories, for recognizing the environment and selecting behaviors for a humanoid robot that are appropriate to that environment. For humanoid robots to move around in many kinds of environments, one key is selecting the appropriate behavior. Because features of an environment that strongly influence a robot's motion, such as viscous friction or floor elasticity, are difficult to measure before moving, it is difficult for robots to plan behaviors that take these factors into account. In our method, the robot first records a long time series of data from sensors attached to its body while performing each behavior in each environment, and it builds a decision tree from this data for each behavior, enabling it to recognize the current environment from the sensor history. Using the trees, the robot selects the behavior that will be most effective for recognizing the environment, acquires a sensor history while performing it, and selects likely candidates for the environment. By iterating over these steps, the robot can recognize the environment and select an appropriate behavior. To verify the validity of the method, we used a small humanoid robot and conducted an experiment to recognize environments in a family's house. © 2006 IEEE. (A minimal illustrative sketch of this decision-tree recognition step appears after this list.)
  • Takahiro Miyashita, Kazuhiko Shinozawa, Norihiro Hagita
    2006 6th IEEE-RAS International Conference on Humanoid Robots, Vols 1 and 2, 462+, 2006  
    In this paper, we propose a method for translating human gestures to those of various robots that have different numbers and arrangements of degrees of freedom. A key idea of the method is to simplify a human gesture by using the relevant movements of two body parts while the gesture is being made, and to explicitly reduce the gesture's degrees of freedom with principal component analysis. Experiments were conducted with three different types of robots to confirm that this method can produce motions similar to human gestures. The results show that subjects had the impression that the produced motions appeared similar to the corresponding human hand motions. (A minimal illustrative sketch of this dimensionality-reduction step appears after this list.)
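
  The environment-recognition abstract above (IROS 2006) can be illustrated with a short sketch. The following is a minimal Python example, not the paper's implementation: it assumes each sensor history is a NumPy array, summarizes it with per-sensor means and standard deviations (an assumed feature choice), and trains one scikit-learn decision tree per behavior to label the environment. All function names, features, and parameters are hypothetical.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def summarize(sensor_history):
        # sensor_history: (T, S) time series from S body sensors recorded during one behavior.
        # Collapse it into a fixed-length feature vector (per-sensor mean and standard deviation).
        return np.concatenate([sensor_history.mean(axis=0), sensor_history.std(axis=0)])

    def train_tree_for_behavior(histories, env_labels):
        # One classifier per behavior, trained on histories recorded in known environments.
        X = np.stack([summarize(h) for h in histories])
        tree = DecisionTreeClassifier(max_depth=5, random_state=0)
        tree.fit(X, env_labels)
        return tree

    def recognize_environment(tree, new_history):
        # Predict the environment label for a freshly recorded sensor history.
        return tree.predict(summarize(new_history).reshape(1, -1))[0]

    # Toy usage: two environments ("carpet", "wood") observed while performing one behavior.
    rng = np.random.default_rng(1)
    histories = [rng.normal(loc=0.0, size=(200, 6)) for _ in range(10)] + \
                [rng.normal(loc=1.0, size=(200, 6)) for _ in range(10)]
    labels = ["carpet"] * 10 + ["wood"] * 10
    tree = train_tree_for_behavior(histories, labels)
    print(recognize_environment(tree, rng.normal(loc=1.0, size=(200, 6))))  # most likely "wood"

  In this sketch, iterating the recognize-then-act loop from the abstract would simply mean choosing the next behavior whose tree best separates the remaining candidate environments, which is left out here for brevity.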
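
  The gesture-translation abstract above (Humanoids 2006) can be illustrated in the same spirit. The following minimal Python sketch shows only the dimensionality-reduction idea: compress a human joint-angle trajectory with principal component analysis and linearly remap the reduced trajectory onto a robot's joints. The array shapes, the linear mixing matrix, and the function names are assumptions for illustration, not the authors' method.

    import numpy as np
    from sklearn.decomposition import PCA

    def compress_gesture(human_angles, n_components=3):
        # human_angles: (T, D) time series of D human joint angles for one gesture.
        # Returns the fitted PCA model and the (T, n_components) reduced trajectory.
        pca = PCA(n_components=n_components)
        reduced = pca.fit_transform(human_angles)
        return pca, reduced

    def map_to_robot_joints(reduced, mixing):
        # Linearly mix the principal-component trajectory into a robot's joint space.
        # mixing: (n_components, robot_dof) matrix chosen per robot (an assumption here).
        return reduced @ mixing

    # Toy usage: a 100-frame gesture over 10 human joint angles, mapped to a 4-DOF robot arm.
    rng = np.random.default_rng(0)
    gesture = rng.normal(size=(100, 10))
    pca, low_dim = compress_gesture(gesture, n_components=3)
    robot_traj = map_to_robot_joints(low_dim, mixing=rng.normal(size=(3, 4)))
    print(robot_traj.shape)  # (100, 4)

  A per-robot mixing matrix is the simplest way to express "different numbers and arrangements of degrees of freedom": the same low-dimensional gesture trajectory can be projected onto robots with different joint counts just by changing the matrix's shape.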

Presentations

 1

Professional Memberships

 2

Research Projects

 3