In Silicon Valley, chatbots are everywhere. Advances in artificial intelligence have made these conversational assistants a reality, and they are flourishing in many forms. Facebook has added bot support to Messenger, its messaging app. In a sense it was voice-controlled chatbots that put personal assistants such as Siri on our phones and brought Amazon's Echo smart speaker into our living rooms. All of this is enough to convince you that chatbots are beginning to seep into our daily lives.
But they have not yet gone mainstream. Before chatbots become a standard user interface, the technologies behind them, including machine learning and artificial intelligence, must make much greater progress. Google realized this: this week it launched its People + AI Research initiative, whose purpose is to promote the development of "human-centered" AI systems. Computers need to understand human language, emotion, and intent far better than they do today. Some big companies have begun dipping their toes into chatbots to make sure they don't fall behind, but AI must advance in several important areas before chatbots can be widely adopted. Some of the requirements are obvious, such as better speech recognition; others are subtler, such as how a chatbot tells users what services it can provide.
Conversational chatbots must make major progress in the following five areas before most users will truly accept them.
Remember the early Internet? Pages were full of flashing neon graphics and blue links. Today's chatbots are the equivalent of those early websites. If chatbots are to catch on, people must be able to ask questions and place orders in natural language. Whether users speak or type, you cannot expect them to master a special vocabulary before they can communicate with a bot. If you ask the voice assistant Alexa to play a song and she misunderstands your request at first, that is forgivable: frequent users have built a "relationship" with her and are willing to overlook such slips. But if users fail to book a movie ticket on their first attempt with a new chatbot, they will probably go buy the ticket somewhere else.
Natural language processing now plays a very important role, but it still struggles with dialects, slang, and jargon. Speech recognition can gradually learn the way a person speaks over time, but if you call a business only once a year, its chatbot has little opportunity to learn how you talk. We are still in the early days of this kind of human-machine interaction.
All of this reflects on the brand. Chatbots cannot simply replicate today's automated phone menus. Because social media tends to amplify users' negative reviews, businesses need to get things right. Everything people now do through websites and mobile apps should eventually be possible in natural language, but we have not reached that point yet.
The key to making AI useful is understanding context. Just as marketing and sales teams build a 360-degree view of consumers, chatbots need a deeper understanding of the people they interact with: who they are, how they got to where they are, what they are looking for, and what they have done in the past. This information must be collected and shared among bots. Only with this context can a chatbot meet people's needs reliably and continuously.
For example, the college counseling service AdmitHub partnered with Georgia State University last year to use a chatbot to support incoming freshmen. Early on, the bot helped the school handle questions about enrollment, financial aid, and housing, and it significantly improved student enrollment. The school hopes that over time the bot will come to understand each student's academic and financial situation; by the time those students graduate, it will know everything about them.
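The idea of pooling what different bots learn into one shared view of the user can be sketched in a few lines. This is a minimal illustration under assumed names (`update_context`, `full_context`, and the profile keys are all hypothetical, not any real bot platform's API):

```python
# Sketch of shared conversational context: each bot contributes
# the facts it learns about a user to one common profile, and any
# bot can read the merged "360-degree" view. Keys are illustrative.

profile = {}

def update_context(bot_name, facts):
    """Record facts learned by one bot, keyed by that bot's name."""
    profile.setdefault(bot_name, {}).update(facts)

def full_context():
    """Merge every bot's contribution into a single user view."""
    merged = {}
    for facts in profile.values():
        merged.update(facts)
    return merged

# An enrollment bot and a financial-aid bot each add what they know.
update_context("enrollment_bot", {"name": "Ada", "major": "CS"})
update_context("aid_bot", {"aid_status": "approved"})
print(full_context())
```

In a real deployment the shared profile would live in a database with access controls rather than an in-memory dict, but the principle of collecting context in one place and sharing it across bots is the same.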
The Internet is a magical place of interconnection. Type any product name into the Google search engine and you will quickly be connected with a vendor that sells it. Chatbots need to develop in the same direction: they should be able to hand users off to one another flexibly, so a conversation can be completed smoothly.
If I type "I want a hamburger" into Facebook Messenger, the bot should be able to pass that request on to other bots, which can then arrange the corresponding service to fulfill my order. On the Web, this kind of interoperation is achieved through REST APIs. Many bot APIs are now competing for attention, but chatbots need mature conversational APIs to interact with one another.
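The handoff the article describes can be sketched as a simple intent router. This is an illustrative toy, not a real Messenger or bot-platform API; the names (`IntentRouter`, `burger_bot`) are invented for the example:

```python
# A minimal sketch of bot-to-bot handoff: downstream bots register
# the intents they can fulfill, and the router dispatches a parsed
# user request to whichever bot claimed that intent.

class IntentRouter:
    def __init__(self):
        self._handlers = {}

    def register(self, intent, bot_name, handler):
        # Each downstream bot advertises the intents it handles.
        self._handlers[intent] = (bot_name, handler)

    def dispatch(self, intent, payload):
        if intent not in self._handlers:
            return "Sorry, no bot can handle that yet."
        bot_name, handler = self._handlers[intent]
        # Hand the conversation off, preserving the user's request.
        return f"[handed off to {bot_name}] " + handler(payload)


def burger_bot(payload):
    return f"Order placed: {payload['item']}"


router = IntentRouter()
router.register("order_food", "BurgerBot", burger_bot)

# "I want a hamburger" would be parsed upstream into an intent + slots.
reply = router.dispatch("order_food", {"item": "hamburger"})
print(reply)  # [handed off to BurgerBot] Order placed: hamburger
```

In practice the `handler` call would be an HTTP request to another bot's REST endpoint rather than a local function call, but the routing logic is the same.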
When I interact with an app or a web page, I can quickly see what services it offers from its links and the other information on screen. A chatbot has no such visual language. Talking to a chatbot is like talking with your eyes closed: you have to think ahead. What should you ask it? What can it do? Microsoft and Amazon have been trying to educate consumers about what Cortana and Echo can do, and many articles on the subject have appeared. On a first interaction, a bot should make clear whether it can let me choose a seat or only buy a ticket. Can I change an appointment, or only make one? Can I customize my order? Without visual cues, chatbots need a new way of telling users what services they provide.
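One simple substitute for visual cues is explicit capability discovery: the bot answers a "help" query by enumerating what it supports. A minimal sketch, with hypothetical command names:

```python
# Sketch of capability discovery for a text-only bot: since users
# cannot "see" what the bot offers, it lists its commands on request.

CAPABILITIES = {
    "book": "Buy a movie ticket",
    "seat": "Choose a specific seat",
    "cancel": "Cancel or change a booking",
}

def reply(message):
    text = message.strip().lower()
    if text in ("help", "what can you do?"):
        lines = [f"- {cmd}: {desc}" for cmd, desc in CAPABILITIES.items()]
        return "Here is what I can do:\n" + "\n".join(lines)
    first_word = text.split()[0] if text else ""
    if first_word in CAPABILITIES:
        return f"OK, let's {CAPABILITIES[first_word].lower()}."
    # Unknown input: point the user back to the capability list.
    return "I didn't understand. Type 'help' to see what I can do."

print(reply("help"))
```

Real assistants handle this more gracefully with guided prompts and suggested replies, but even this crude menu removes the need for the user to guess blindly.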
If chatbots could read facial expressions or changes in tone of voice to sense the emotional state of the person they are talking to, they would undoubtedly provide better service. Today's bots can handle only simple customer service; if a user grows disappointed or annoyed, the bot may need to hand the conversation over to a human agent. Meanwhile, there is an entire category of services, such as counseling or therapy, that is based entirely on emotional interaction with the user. Advances in artificial intelligence and computer vision will make all of this possible, but they still have a long way to go.
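The "hand over to a human" behavior can be illustrated with a deliberately naive frustration check. Real systems would use a trained sentiment model over text, voice, or video; the keyword list and threshold below are purely illustrative:

```python
# Naive sketch of emotion-aware escalation: score a message for
# frustration keywords and hand the chat to a human past a threshold.

FRUSTRATION_WORDS = {"annoyed", "useless", "terrible", "angry", "frustrated"}

def route_message(message, threshold=1):
    # Normalize words by stripping common punctuation and lowercasing.
    words = {w.strip(".,!?").lower() for w in message.split()}
    score = len(words & FRUSTRATION_WORDS)
    if score >= threshold:
        return "human_agent"   # escalate the conversation
    return "bot"               # the bot keeps handling it

print(route_message("This is useless, I'm so frustrated!"))  # human_agent
print(route_message("Can you book me a ticket?"))            # bot
```

The design point is the routing decision itself, not the detector: however frustration is measured, the bot needs a defined path for passing an unhappy user to a person.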
Chatbots have broad application prospects, both at work and in our personal lives. But these problems must be solved before they can go mainstream. Once they are, chatbots will bring us greater convenience and a better experience.