Owing to their human-like bodies and their ability to communicate via body language, humanoid robots are widely considered well suited for applications in service robotics. An essential precondition for their deployment in human environments is the generation and control of human-like, and thus predictable, motions. This contribution introduces new motion control strategies based on the "Rapid Upper Limb Assessment" (RULA), a method for analyzing ergonomic conditions at manual workplaces. We have previously adapted RULA to work with the Virtual Human, a simulated anthropomorphic multiagent system for the analysis of human motions and manipulations. Building on the control framework behind the Virtual Human, we transfer RULA to humanoids in general and propose it as a well-defined and transparent heuristic for the control of human-like motions.