Job Description

  • Steinhausen (Zug), Switzerland
Workload: 100%
Starting: Immediately / Upon agreement

We are a sharing community for vacation rentals. Together we own 56 resorts with over 5,000 vacation rentals in popular destinations across Europe – by the sea, in the mountains, or in cities. Why? Because we want to enjoy relaxed and responsible vacations.

Our team is highly committed to simplifying the vacation experience of our shareholders and members – from booking to the stay to planning the next vacation. All processes for guests and employees should be so intuitive and efficient that our guests not only look forward to their next stay but also take away many wonderful memories and enjoy coming back.

To fulfill our mission, we are looking for an enthusiastic and energetic person to join our IT team!

What you can expect

  • Development and maintenance of scalable, high-performance data pipelines on the Google Cloud Platform (GCP).
  • Design and implementation of native GCP data solutions, adhering to best practices for performance, security, and cost-efficiency.
  • Working with GCP services such as BigQuery, Cloud Storage, Dataflow, and Cloud Data Fusion for data ingestion, processing, and storage.
  • Implementation of ETL/ELT processes and data modeling.
  • Collaboration with data scientists, analysts, and stakeholders to understand data requirements and deliver efficient data products.
  • Ensuring data quality and availability.
  • Implementation of data security best practices, IAM roles, and encryption techniques.
  • Monitoring and troubleshooting of data pipelines using tools such as Cloud Logging and Cloud Monitoring.
  • Fostering a data-driven corporate culture.

What you bring along

  • A degree in Computer Science, Data Engineering, Data Science, or a comparable technical field.
  • Several years of professional experience in Data Engineering, including at least two years focused on GCP solutions.
  • Comprehensive knowledge of GCP data services (BigQuery, Cloud Storage, Dataflow, Cloud Composer, Cloud Data Fusion).
  • Strong programming skills in Python and SQL for data processing.
  • Experience with Big Data technologies such as Apache Airflow and Apache Spark.
  • Experience implementing CI/CD pipelines and Infrastructure as Code (IaC) with tools such as Terraform/OpenTofu.
  • Strong analytical skills and a solution-oriented approach.
  • Excellent communication skills and a team-player mentality.
  • Fluent in German and English.

What we offer

Local benefits
  • Challenging tasks and a diverse role with the opportunity to develop and realize your potential in an international environment
  • A modern open-plan office at Hapimag headquarters in Steinhausen (ZG) with good transport links, a friendly atmosphere, and a dedicated, agile team with a great sense of humor
  • A hybrid work model (up to 40% remote work) and the option to occasionally work from one of our resorts
  • Attractive employee benefits for holidays at Hapimag resorts, plus perks for public transport travel
  • We promote personal responsibility and the professional development of our employees
Benefits at Hapimag
  • 30% discounted membership
  • 30% discount in restaurants
  • Work across Europe, regardless of the season
  • Workation within the EU/EFTA region
  • Learn languages for free
  • Exciting team events and activities

Apply now

For this position, we currently consider only direct applications from candidates with an existing, valid residence and work permit for Switzerland. Please also indicate your salary expectations and your earliest possible start date.

Have we convinced you to become part of our Hapimag family and success story? Then we look forward to receiving your application by email at career@hapimag.com.

Hapimag AG
Recruitment & Talent Acquisition
Sumpfstrasse 18
6312 Steinhausen
www.hapimag.com/jobs