Business Line Data, Analytics and AI, a team within ELCA responsible for delivering key technology platforms that support our customers' growth, is seeking a Big Data Engineer to provide hands-on development within our delivery team. As a key contributor, this individual will work closely with data engineers, data architects, and data scientists across multiple teams to build and operate best-in-class data analytics platforms.
The successful candidate will participate in projects to build scalable, reliable technologies and infrastructure for our customers' data analytics platforms. This role combines proven technical skills across the Big Data ecosystem, with a strong focus on open-source software (Apache Spark, Kafka, Hadoop, CDP…) and cloud infrastructure (Azure, AWS, Google).
Our ideal candidate is familiar with several big data technologies (Hadoop, Spark, MPP databases, and NoSQL databases) and has experience implementing complex distributed computing environments that ingest, process, and surface large amounts of data.
In this role
Across customers’ projects:
- You will take part in multiple projects in the field of Big Data and act as a Data Engineer
- You will contribute to the design, coding, configuration, and documentation of components that manage data ingestion, real-time streaming, batch processing, and data extraction, transformation, and loading across multiple data sources and usages.
- You will cross-train other team members on technologies being developed, while also continuously learning new technologies from other team members.
- You will interact with engineering teams across projects and ensure that solutions meet customer requirements in terms of functionality, performance, availability, scalability, security, and reliability.
- You will work directly with business analysts and data scientists to understand and support their use cases.
What we offer
- A key role in a multidisciplinary service company, a Swiss leader in the field of new technologies
- Attractive development opportunities in various technological fields with further responsibilities
- Challenging work close to customers
- A very pleasant working environment in a young and motivated team where team spirit is a core value
- Time for self-development (certifications, conferences, …)
About your profile
- 3+ years of experience coding in Java, Python, or Scala, with solid fundamentals in cloud services, data structures, and algorithm design
- 2+ years of hands-on implementation experience with a combination of the following technologies: Spark, Kafka, Hadoop, MapReduce, Pig, Hive, Impala, Presto, Storm, HBase, or Cassandra
- 2+ years of experience deploying and automating solutions in the cloud; Azure experience is a strong asset
- Knowledge of SQL and MPP databases (e.g., Vertica, Netezza, Greenplum, Aster Data, Snowflake)