The eyes are attracting increasing interest within the human-computer interaction (HCI) community, as they offer a fast and accurate input modality for interactive systems. However, the applicability of mobile eye-based HCI has so far been restricted by several issues, such as the need for calibration or the Midas Touch problem [Jac91]. In this thesis, I propose the concept of contour-guided gaze gestures, which overcome these problems by relying on relative eye movements: users trace the contours of (interactive) objects within a smart environment. Matching the trajectory of the eye movements against the contour shapes makes it possible to estimate which object was interacted with and to trigger the corresponding actions. The interaction concept and the design of the system are described, along with the influence of several parameters on gesture detection performance, which was evaluated in a first user study. Building on these findings, a second study with three application scenarios demonstrated that users were able to trace object contours to trigger actions from various positions and on multiple different objects. The results further indicate that the proposed method is an easy-to-learn, hands-free interaction technique that is robust against false-positive activations. User study results show low demand ratings and suggest that the method holds potential for further exploration, while also revealing areas for refinement.
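To make the matching step concrete, the following is a minimal sketch of how a relative gaze trajectory could be compared against stored object contours. It is not the thesis's actual implementation: the use of position/scale normalization and dynamic time warping, the rejection threshold, and all names (`match_object`, `dtw_distance`, etc.) are illustrative assumptions.

```python
import numpy as np

def normalize(points: np.ndarray) -> np.ndarray:
    """Translate to the centroid and scale to unit RMS radius, so matching
    is invariant to where the object appears and to its apparent size
    (an assumption; the thesis may normalize differently)."""
    centered = points - points.mean(axis=0)
    scale = np.sqrt((centered ** 2).sum(axis=1).mean())
    return centered / (scale + 1e-9)

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Dynamic time warping distance between two 2-D polylines; tolerates
    the uneven tracing speed typical of eye movements."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],
                                 cost[i, j - 1],
                                 cost[i - 1, j - 1])
    return cost[n, m] / (n + m)  # length-normalized

def match_object(gaze_trajectory, object_contours, threshold=0.25):
    """Return the name of the contour best matching the gaze trajectory,
    or None if nothing is close enough (guarding against false-positive
    activations). `threshold` is a hypothetical tuning parameter."""
    traj = normalize(np.asarray(gaze_trajectory, dtype=float))
    best_name, best_dist = None, np.inf
    for name, contour in object_contours.items():
        dist = dtw_distance(traj, normalize(np.asarray(contour, dtype=float)))
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist < threshold else None
```

Because both the trajectory and the contours are normalized before comparison, only the relative shape of the eye movement matters, which is what lets such an approach work without per-user calibration.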