Kia Motors is previewing a range of new technologies developed for the post-autonomous driving era at the Consumer Electronics Show (CES) 2019, to be held in Las Vegas from January 8 to 11.
Building on the brand’s ‘Beyond Autonomous Driving’ vision for connected vehicles presented at CES 2018, Kia is looking ahead to a time when autonomous driving has become the norm with an interactive ‘Space of Emotive Driving’ exhibit at this year’s show. Central to the CES 2019 presentation of this ‘Space of Emotive Driving’ vision is Kia’s new Real-time Emotion Adaptive Driving (R.E.A.D.) System, a world-first application of emotional AI that creates an optimized, interactive in-cabin space centered on the human senses.
The R.E.A.D. System can optimize and personalize a vehicle cabin space by analyzing a driver’s emotional state in real time through AI-based bio-signal recognition technology. The technology monitors a driver’s emotional state using sensors that read their facial expressions, heart rate and electrodermal activity. It then tailors the interior environment according to its assessment, potentially altering conditions relating to the five senses within the cabin to create a more joyful mobility experience. AI deep-learning technology enables the system to establish a baseline of user behavior, then identify patterns and trends to customize the cabin accordingly.
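The pipeline described above can be pictured in three stages: sense bio-signals, infer an emotional state, and adapt the cabin. The sketch below illustrates that flow in Python; every name, threshold, and cabin preset here is a hypothetical illustration, not Kia's actual implementation, which relies on trained deep-learning models rather than fixed rules.

```python
from dataclasses import dataclass

# Illustrative sensor readout; field names and units are assumptions,
# not Kia's actual sensor interface.
@dataclass
class BioSignals:
    heart_rate_bpm: float        # from a heart-rate sensor
    skin_conductance_us: float   # electrodermal activity, in microsiemens
    expression: str              # e.g. "neutral", "frowning", "smiling"

def classify_emotion(s: BioSignals) -> str:
    """Toy stand-in for the AI-based bio-signal recognition step.
    The real system would use a learned model and a per-driver baseline."""
    if s.heart_rate_bpm > 100 or s.skin_conductance_us > 10:
        return "stressed"
    if s.expression == "smiling":
        return "joyful"
    return "neutral"

def adapt_cabin(emotion: str) -> dict:
    """Map the inferred emotion to cabin settings touching several senses.
    These presets are purely illustrative."""
    presets = {
        "stressed": {"lighting": "soft blue", "music": "calm", "temp_c": 21},
        "joyful":   {"lighting": "warm", "music": "upbeat", "temp_c": 22},
        "neutral":  {"lighting": "neutral", "music": "ambient", "temp_c": 22},
    }
    return presets[emotion]

# Example: an elevated heart rate and skin conductance read as stress,
# so the cabin shifts to a calming preset.
signals = BioSignals(heart_rate_bpm=108, skin_conductance_us=12.5, expression="frowning")
state = classify_emotion(signals)
print(state, adapt_cabin(state))
```

In a production system the rule-based classifier would be replaced by a model trained on each driver's baseline, which is the role the press release assigns to AI deep learning.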
Albert Biermann, President and Head of the Research and Development Division of Hyundai Motor Group, said, “Kia considers the interactive cabin a focal point for future mobility, and the R.E.A.D. System represents a convergence of cutting-edge vehicle control technology and AI-based emotional intelligence. The system enables continuous communication between driver and vehicle through the unspoken language of ‘feeling’, thereby providing an optimal, human-sense oriented space for the driver in real time.”
Kia will have specially designed experiential modules installed at its CES booth to demonstrate the potential of the R.E.A.D. System. The public will be able to experience vehicle technology that recognizes their emotional state from physiological signals such as facial expressions, electrodermal activity, and heart rate.