The number and variety of robots active in real-world environments are growing, as are the skills they are expected to acquire. To this end, we present an approach that enables users without robotics expertise to easily teach skills, via demonstration, to robots whose kinematics may differ from those of humans and are not known in advance. Our method takes a motion trajectory obtained from human demonstrations with a vision-based system and projects it onto a corresponding human skeletal model. A kinematic mapping between the human model and the robot is then learned using Local Procrustes Analysis, a manifold alignment technique that enables the demonstrated trajectory to be transferred from the human model to the robot. Finally, the transferred trajectory is encoded as a parameterized motion skill using Dynamic Movement Primitives, allowing it to be generalized to different situations. Experiments in simulation on the PR2 and Meka robots show that our method correctly imitates various skills demonstrated by a human, and we provide an analysis of the transfer of the acquired skills between the two robots.
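To make the mapping step concrete, the following is a minimal sketch of a Local Procrustes Analysis-style alignment: paired workspace points from the human model and the robot are clustered, a Procrustes transform (rotation, scale, translation) is fitted per cluster, and a demonstrated trajectory is mapped point-by-point with the transform of its nearest cluster. This is an illustrative sketch under these assumptions, not the paper's implementation; the function names `fit_local_procrustes` and `map_trajectory` are hypothetical.

```python
# Sketch of a local Procrustes mapping between human-model and robot workspaces.
# Assumes paired correspondences (X_human[i] <-> Y_robot[i]); names are illustrative.
import numpy as np
from sklearn.cluster import KMeans
from scipy.linalg import orthogonal_procrustes


def fit_local_procrustes(X_human, Y_robot, n_clusters=8):
    """Fit one Procrustes transform per local cluster of paired 3-D points.

    X_human, Y_robot: (N, 3) arrays of corresponding workspace positions.
    Returns the k-means model and a list of (R, s, t) transforms, one per cluster.
    """
    km = KMeans(n_clusters=n_clusters, n_init=10).fit(X_human)
    transforms = []
    for c in range(n_clusters):
        idx = km.labels_ == c
        Xc, Yc = X_human[idx], Y_robot[idx]
        mu_x, mu_y = Xc.mean(axis=0), Yc.mean(axis=0)
        A, B = Xc - mu_x, Yc - mu_y
        R, _ = orthogonal_procrustes(A, B)              # rotation aligning A to B
        s = np.trace(B.T @ A @ R) / np.trace(A.T @ A)   # least-squares scale
        t = mu_y - s * mu_x @ R                         # translation
        transforms.append((R, s, t))
    return km, transforms


def map_trajectory(traj_human, km, transforms):
    """Map a demonstrated trajectory from the human model to the robot frame."""
    labels = km.predict(traj_human)
    out = np.empty_like(traj_human)
    for i, (p, c) in enumerate(zip(traj_human, labels)):
        R, s, t = transforms[c]
        out[i] = s * p @ R + t
    return out
```

In practice the mapped trajectory would then be encoded with Dynamic Movement Primitives so that it can be reproduced with different start and goal positions.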