Restoring the functional independence of people with severe motor paralysis in activities of daily living (ADLs) would substantially improve their quality of life. Telepresence is one robot-assisted approach, in which a human acts as an operator, sending high-level instructions to an assistive robot while receiving sensory feedback. For severely motor-impaired people, however, conventional interaction modalities may be unusable because of complete paralysis, so alternative interfaces such as Brain-Computer Interfaces (BCIs) are essential to enable telepresence. We propose a novel framework that integrates a BCI system with a humanoid robot to realize a brain-controlled telepresence system with multimodal control features. In particular, low-level control is executed by Programming by Demonstration (PbD) models, while the higher-level cognitive commands needed to perform vital ADLs are produced by the BCI. The system is based on real-time decoding of attention-modulated neural responses elicited in electroencephalographic (EEG) signals, from which multiple control commands are generated. As a result, a user can interact with a humanoid robot while receiving auditory and visual feedback from the robot's sensors. We validated the system with ten subjects in a realistic scenario. The experimental results demonstrate the feasibility of the approach and high BCI decoding performance in the design of a telepresence robot.
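The abstract does not specify the decoding pipeline, so the following is only a minimal illustrative sketch, not the authors' method. It assumes a P300-style paradigm in which attended stimuli evoke a positive deflection around 300 ms post-stimulus in single-channel EEG epochs; decoding is done by nearest-template matching, and the decoded class is mapped to a placeholder robot command. All signal parameters, function names, and command labels are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed acquisition parameters (hypothetical, for illustration only).
FS = 256                       # sampling rate (Hz)
N_SAMPLES = FS                 # 1-second epochs
t = np.arange(N_SAMPLES) / FS
# P300-like evoked response: Gaussian bump centered at 300 ms.
p300 = np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))

def make_epochs(n, attended):
    """Synthesize n single-channel EEG epochs with additive Gaussian noise."""
    noise = rng.normal(0.0, 1.0, size=(n, N_SAMPLES))
    return noise + (3.0 * p300 if attended else 0.0)

# "Calibrate" by averaging training epochs into one template per class.
train_att = make_epochs(40, attended=True)
train_un = make_epochs(40, attended=False)
templates = np.stack([train_un.mean(0), train_att.mean(0)])  # [class, time]

def decode(epoch):
    """Nearest-template decoding: pick the class whose mean response is closest."""
    dists = np.linalg.norm(templates - epoch, axis=1)
    return int(np.argmin(dists))   # 0 = unattended, 1 = attended

# Map the decoded attention target to a hypothetical high-level robot command.
COMMANDS = {0: "idle", 1: "grasp_cup"}   # command names are placeholders
test_epoch = make_epochs(1, attended=True)[0]
print(COMMANDS[decode(test_epoch)])
```

In a real system, template matching would typically be replaced by a trained classifier over multi-channel features, and each decoded command would trigger a pre-learned PbD motion primitive on the robot.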