Professor Adrian David Cheok, Chair Professor at City, University of London, has been invited to exhibit at the Ars Electronica Festival 2017. His work, Kissenger, has been selected by the Ars Electronica Festival committee for a five-day showcase at one of the most prestigious media arts events, to be held on 7–11 September 2017 in…
Date: August 7, 2017 Adrian David Cheok, Kasun Karunanayaka, Surina Hariri, Hanis Camelia, and Sharon Kalu Ufere Imagineering Institute, Iskandar Puteri, Malaysia & City, University of London, UK. Email: firstname.lastname@example.org Phone: +607 509 6568 Fax: +607 509 6713 We are excited to introduce the world's first computer-controlled digital device developed to stimulate olfactory receptor neurons…
The Universiti Sains Malaysia (USM) Senate, at its 245th Meeting on 24 May 2017, appointed Professor Adrian David Cheok as a committee member of the Board of Studies for the Cognitive Neuroscience Postgraduate Degree Programme at the USM School of Medical Sciences.
I recently accepted an invitation to serve as the Guest Editor for a Special Issue of the journal Multimodal Technologies and Interaction on the subject of “Love and Sex with Robots”. It is my pleasure to invite all researchers to submit an article on this topic.
The article may be a full paper or a communication based on your own research in this area, or a focused review of some aspect of the subject. MTI is an open access, peer-reviewed journal, edited by…
Adrian David Cheok has been invited to be the Editor-in-Chief of the new journal Multimodal Technologies and Interaction (MTI).
Multimodal Technologies and Interaction (ISSN 2414-4088) is an international, multi/interdisciplinary, open access, peer-reviewed journal which publishes original articles, critical reviews, research notes, and short communications on this subject. MTI focuses on fundamental and applied research dealing with all kinds of technologies that can acquire and/or reproduce unimodal and multimodal digital content that supports interaction (e.g. human–computer, human–robot and animal–computer). Such technologies may produce visual, tactile, sonic, taste, smell, flavor or any other kind of content that can enrich consumer/user experience.
Our aim is to encourage scientists to publish experimental, theoretical, and computational results in as much detail as possible, so that results can be easily reproduced.