Published: December 24, 2015 | Munich
BMW, Samsung and Panasonic have joined forces to develop “intelligent assistants” for connected car technology. The giant corporations are working with Nuance – a speech recognition and artificial intelligence company – to enable speech systems to better understand and process regional or country-specific accents.
The first speech recognition systems, launched in cars more than ten years ago, were known for their difficulty understanding commands from drivers who spoke with a regional accent.
As ‘connected cars’ now enter more widespread development and production, the need to provide speech recognition capable of understanding the nuances of human idiomatic language becomes more important.
In-car computerized speech systems now need to be able to distinguish between words spoken as commands the car should execute and words that relate to entirely unrelated sources, such as conversation between passengers or a radio talk show.
Nuance Mix gives device makers and developers the ability to create customized voice and natural language interfaces for the Internet of Things (IoT), including segments such as smart home, gaming, robotics and consumer health and fitness. The rapidly evolving ecosystem of specialized devices and services for the IoT and industrial Internet will be defined by the user experience – and the stakes are high in bringing these innovations into an incredibly competitive market.
This software lets developers define their use cases, concepts, parameters and the variety of ways consumers will interact with their device or app through voice. In other words,
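To make the workflow concrete, here is a minimal sketch of what defining voice use cases and matching spoken phrases against them might look like. The class names, structure, and matching logic are purely illustrative assumptions for this article; they are not the actual Nuance Mix API.

```python
# Hypothetical sketch of an intent-definition workflow, loosely modelled on
# the idea described above. NOT the real Nuance Mix API -- all names here
# are invented for illustration.
from dataclasses import dataclass, field


@dataclass
class Intent:
    """A command the car can execute, plus sample phrasings for it."""
    name: str
    utterances: list = field(default_factory=list)


def match_intent(spoken: str, intents):
    """Return the name of the first intent whose sample utterance appears
    in the spoken text; return None when nothing matches (i.e. treat the
    speech as ordinary conversation rather than a command)."""
    text = spoken.lower()
    for intent in intents:
        for sample in intent.utterances:
            if sample in text:
                return intent.name
    return None


# A developer would register each use case with several phrasing variants.
intents = [
    Intent("navigate_home", ["take me home", "navigate home"]),
    Intent("set_temperature", ["set the temperature", "make it warmer"]),
]

print(match_intent("Please take me home", intents))                  # navigate_home
print(match_intent("He took the long way home, said the radio host", intents))  # None
```

In a production system the naive substring match would be replaced by a trained natural-language model, which is precisely the part platforms like Nuance Mix aim to provide, including the accent handling discussed earlier.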