In manipulation tasks, skills are usually modeled as continuous motion trajectories in the task space. The trajectories obtained from multiple human demonstrations can be broadly divided into four types of segments, according to the spatial variation across demonstrations and the time spent in each segment: segments in which a long or short time is spent, and segments in which the spatial variation is large or small. Among these, segments in which a long time is spent but the spatial variation is small (e.g., passing a thread through the eye of a needle) are usually modeled with only a few parameters, even though such segments often represent the movement that is essential for achieving the task. This is because these segments change little in the task space compared with the other segments. In fact, such segments should be densely modeled with more parameters (i.e., deliberately overfitted) to improve the performance of the skill, because their movements must be executed accurately for the task to succeed.
In this paper, we propose a method that adaptively fits these skills based on temporal and spatial entropies computed from a Gaussian mixture model. We found that the proposed method retrieves more accurate motion trajectories than a well-fitted model, while its estimation performance is generally higher than that of an overfitted model. To validate the proposed method, we present experimental results and evaluations with a robot arm performing two tasks.
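The paper does not give the entropy definitions here, but the idea of separating temporal and spatial variation per Gaussian component can be sketched with the standard differential entropy of a Gaussian marginal, H = ½ log((2πe)^d det Σ). The following minimal example is an illustration under that assumption, not the authors' exact formulation: each GMM component is defined over (time, position) dimensions, and its covariance is split into a temporal block and a spatial block. A segment with a long dwell time but little spatial variation then shows high temporal entropy and low spatial entropy, which is the kind of segment the abstract argues should be modeled more densely.

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy of a Gaussian: 0.5 * log((2*pi*e)^d * det(cov))."""
    d = cov.shape[0]
    _, logdet = np.linalg.slogdet(cov)
    return 0.5 * (d * np.log(2 * np.pi * np.e) + logdet)

def split_entropies(cov, n_time_dims=1):
    """Temporal and spatial marginal entropies of one GMM component whose
    covariance is ordered as [time dims, position dims].
    (Hypothetical helper; the paper's exact entropy definition may differ.)"""
    t_entropy = gaussian_entropy(cov[:n_time_dims, :n_time_dims])
    s_entropy = gaussian_entropy(cov[n_time_dims:, n_time_dims:])
    return t_entropy, s_entropy

# Two illustrative components over (t, x, y):
# a "precise" segment with a long dwell (large time variance) and tiny
# spatial variance, vs. a "coarse" segment with the opposite profile.
precise = np.diag([4.0, 1e-4, 1e-4])
coarse = np.diag([0.25, 0.5, 0.5])

tH_p, sH_p = split_entropies(precise)
tH_c, sH_c = split_entropies(coarse)

# The precise segment has higher temporal entropy and lower spatial
# entropy, flagging it as a candidate for denser modeling.
print(tH_p > tH_c, sH_p < sH_c)  # → True True
```

In a full pipeline, these per-component entropies would be computed from a GMM fitted to the time-stamped demonstration trajectories, and components flagged this way would receive additional Gaussians.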