This work proposes a framework that enables arbitrary robots with unknown kinematic models to acquire a skill by imitating human demonstrations and to reproduce it in real time. The diversity of robots operating outside laboratory environments is growing constantly, and we therefore present an approach that allows users to easily teach a skill to a robot with any body configuration. Our method takes a motion trajectory obtained from human demonstrations via a Kinect sensor and projects it onto a corresponding human skeleton model. The kinematic mapping between the robot and the human model is learned using Local Procrustes Analysis, which enables the transfer of the demonstrated trajectory from the human model to the robot. Finally, the transferred trajectory is encoded as Dynamic Movement Primitives, allowing it to be reproduced in real time. Simulation experiments on a 4-degree-of-freedom robot show that our method correctly imitates various skills demonstrated by a human.
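As background on the mapping step: Local Procrustes Analysis fits ordinary Procrustes alignments in local neighborhoods of the workspace and blends them, so the building block is a single similarity alignment between corresponding point sets. The sketch below shows one such fit between human-skeleton and robot points; it is a minimal illustration under stated assumptions, not the paper's implementation, and the names (`procrustes_align`, `X`, `Y`) are hypothetical.

```python
import numpy as np

def procrustes_align(X, Y):
    """Least-squares similarity alignment mapping points X onto Y.

    X, Y: (N, 3) arrays of corresponding 3-D points (e.g., human-skeleton
    and robot points). Returns scale s, rotation R, translation t such
    that Y is approximated by s * X @ R.T + t.
    """
    mu_x, mu_y = X.mean(axis=0), Y.mean(axis=0)
    Xc, Yc = X - mu_x, Y - mu_y
    # SVD of the cross-covariance gives the optimal rotation (Kabsch).
    U, S, Vt = np.linalg.svd(Yc.T @ Xc)
    D = np.eye(3)
    D[2, 2] = np.sign(np.linalg.det(U @ Vt))  # guard against reflections
    R = U @ D @ Vt
    s = np.trace(np.diag(S) @ D) / np.sum(Xc ** 2)  # optimal isotropic scale
    t = mu_y - s * R @ mu_x
    return s, R, t

# Map a new human-skeleton point into the robot's frame:
# y_new = s * R @ x_new + t
```

A local variant would partition the demonstration space, fit one such alignment per region, and weight the resulting maps by proximity to each region's center.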
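For reference, the reproduction step relies on the standard discrete Dynamic Movement Primitive formulation (following Ijspeert et al.; the paper's exact parameterization may differ), in which a learned forcing term $f(x)$ shapes a stable point attractor toward the goal $g$:

```latex
\tau \dot{z} = \alpha_z \bigl( \beta_z (g - y) - z \bigr) + f(x), \qquad
\tau \dot{y} = z, \qquad
\tau \dot{x} = -\alpha_x x
```

Here $y$ is the trajectory state, $z$ its scaled velocity, $x$ a phase variable that decays monotonically, and $\tau$ a temporal scaling factor, which is what allows the transferred trajectory to be replayed at different speeds in real time.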