EPAM Systems

Striving for excellence is in our DNA. Since 1993, we have been helping the world's leading companies imagine, design, engineer, and deliver software and digital experiences that change the world. We are more than just specialists; we are experts.

EPAM Systems Zürich, ZH, Switzerland
06/07/2019
Full time
Currently we are looking for a Big Data Engineer with Kafka knowledge for our Zurich office to make the team even stronger. Do you want to work on the cutting edge? Do you want to expand your knowledge and abilities? Are you passionate about data? Have you worked in similar positions before? We're looking for someone like you who can help us:
– Engineer and integrate the platform from a technology point of view
– Engineer core Big Data platform capabilities

YOU ARE
– A passionate Big Data Engineer looking for new challenges
– Proactive, looking for creative and innovative solutions
– A flexible, open-minded and cooperative person
– Interested in working in a fast-paced international environment as part of an international team

YOUR TEAM
You'll be working in close association with the Advanced Analytics (Big Data Analytics) team in Zurich. We offer you a highly motivated and experienced team with in-depth knowledge of Advanced Analytics, building models and algorithms. Our goal is to enable the business for a digital and analytical future! You'll operate and maintain the whole platform end to end and work closely with the development teams. This role includes significant responsibilities and development possibilities.

Requirements
– Bachelor's degree in computer science, computer engineering, management information systems or a related discipline, or equivalent experience
– At least 3 years of experience designing and operating Kafka clusters (Confluent and Apache Kafka) on-premise; a hedged configuration sketch follows this posting
– At least 5 years of experience in the design, sizing, implementation and maintenance of Hortonworks-based Hadoop clusters
– At least 3 years of experience securing and protecting Hadoop clusters (Ranger, Atlas, Kerberos, Knox, SSL)
– At least 5 years of experience designing Big Data architectures
– At least 5 years of demonstrated experience gathering and understanding customer business requirements to introduce Big Data technologies
– At least 5 years of experience configuring tools from the Hadoop ecosystem, such as Hadoop, Hive, Spark, Kafka, Solr and NiFi
– Experience with IBM Watson Studio Local integration
– Experience with IBM DB2 is a plus
– Experience with IBM Power Systems is a plus
– Experience implementing complex security requirements in the financial industry
– Good abstraction and conceptual skills combined with a self-reliant, team-minded, communicative, proactive personality

We offer
– Experience exchange with colleagues all around the world
– Competitive compensation depending on experience and skills
– Regular assessments and salary reviews
– Opportunities to develop integration modules for interacting with new systems and applications
– Opportunities for self-realization
– A friendly team and an enjoyable working environment
– Corporate and social events

Please note that any offers will be subject to appropriate background checks. We do not accept CVs from recruiting or staffing agencies. Due to Swiss labour legislation, we can only accept EU candidates and applicants who hold an open work permit for Switzerland.
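To make the Kafka requirement above concrete, here is a minimal, hedged sketch of producing to a Kerberos- and TLS-secured topic with the confluent-kafka Python client (Confluent's librdkafka bindings). The broker address, topic name, keytab path and principal are hypothetical placeholders, not part of the posting; the exact settings depend on how a given cluster is secured.

# Minimal sketch: producing to a Kerberized, TLS-protected Kafka topic.
# Assumes the confluent-kafka package is installed; broker, topic, keytab
# and principal below are hypothetical placeholders.
from confluent_kafka import Producer

conf = {
    "bootstrap.servers": "broker1.example.com:9093",   # hypothetical broker
    "security.protocol": "SASL_SSL",                   # TLS transport + SASL auth
    "sasl.mechanism": "GSSAPI",                        # Kerberos
    "sasl.kerberos.service.name": "kafka",
    "sasl.kerberos.keytab": "/etc/security/keytabs/app.keytab",  # hypothetical keytab
    "sasl.kerberos.principal": "app@EXAMPLE.COM",                # hypothetical principal
    "ssl.ca.location": "/etc/pki/tls/certs/ca-bundle.crt",
}

def on_delivery(err, msg):
    # Report per-message delivery outcome so failed produces are visible.
    if err is not None:
        print(f"delivery failed: {err}")
    else:
        print(f"delivered to {msg.topic()} [{msg.partition()}] @ {msg.offset()}")

producer = Producer(conf)
producer.produce("events.demo", key=b"k1", value=b"hello", on_delivery=on_delivery)
producer.flush(10)  # wait up to 10 seconds for outstanding deliveries

SASL_SSL with GSSAPI is the combination typically used on Kerberized clusters of the kind the posting describes (Ranger/Kerberos/SSL), which is why it is shown here.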
EPAM Systems Zürich, ZH, Switzerland
06/07/2019
Full time
Do you want to work on the cutting edge? Do you want to expand your knowledge and abilities? Are you passionate about data? Have you worked in similar positions before? We're looking for someone like you who can help us:
– Engineer and integrate the platform from a technology point of view
– Engineer core Big Data platform capabilities

Responsibilities
You'll be working in the Treasury IT (Big Data Analytics) team in Zürich. We offer you a highly motivated and experienced team with in-depth knowledge of Big Data and Data Analytics, building models and algorithms. Our goal is to enable the client's business for a digital and analytical future! You'll implement new logic and enhance existing logic in our newly built DataLake platform; a hedged sketch of such work follows this posting. This role includes significant responsibilities and development possibilities.

Requirements
– Experience in DevOps best practices, building automation pipelines, CI/CD
– Experience in Hadoop administration
– Data governance implementation
– Experience with Oracle PL/SQL is a plus
– Experience with Python and Scala is a plus
– Experience with cloud environments such as Microsoft Azure
– Good abstraction and conceptual skills combined with a self-reliant, team-minded, communicative, proactive personality

You are
– A passionate Big Data Engineer and developer who is looking for new challenges
– Proactive, looking for creative and innovative solutions
– A flexible, open-minded and cooperative person
– Interested in working in a fast-paced international environment as part of an international team

We offer
– Experience exchange with colleagues all around the world
– Competitive compensation depending on experience and skills
– Regular assessments and salary reviews
– Opportunities for self-realization
– A friendly team and an enjoyable working environment
– Corporate and social events
– Unlimited access to LinkedIn Learning solutions

Additional
Please note that any offers will be subject to appropriate background checks. We do not accept CVs from recruiting or staffing agencies. Due to Swiss labour legislation, we can only accept EU candidates and applicants who hold an open work permit for Switzerland.
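As a hedged illustration of "implementing new logic in the DataLake platform", here is a minimal PySpark sketch, assuming a Hive-backed lake; every table, column and path name is a hypothetical placeholder, not something named in the posting.

# Minimal sketch: enhance existing data-lake logic with a derived column,
# then write the result back as partitioned Parquet.
# Table, column and path names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("datalake-enrichment-sketch")
    .enableHiveSupport()          # read Hive-managed tables in the lake
    .getOrCreate()
)

# Extract: an existing curated table (hypothetical name).
trades = spark.table("curated.treasury_trades")

# Transform: join a rates table and add a derived notional-in-CHF column.
fx = spark.table("curated.fx_rates")
enriched = (
    trades.join(fx, on=["currency", "trade_date"], how="left")
          .withColumn("notional_chf", F.col("notional") * F.col("chf_rate"))
)

# Load: append to a partitioned Parquet area of the lake.
(enriched.write
         .mode("append")
         .partitionBy("trade_date")
         .parquet("/data/lake/enriched/treasury_trades"))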
EPAM Systems Zürich, ZH, Switzerland
15/06/2019
Full time
Currently we are looking for a Big Data Engineer with Clustered Solutions for our Zurich office to make the team even stronger. Do you want to work on the cutting edge? Do you want to expand your knowledge and abilities? Are you passionate about data? Have you worked in similar positions before? We're looking for someone like you who can help us:
– Engineer and integrate the platform from a technology point of view
– Engineer core Big Data platform capabilities

YOU ARE
– A passionate Big Data Engineer looking for new challenges
– Proactive, looking for creative and innovative solutions
– A flexible, open-minded and cooperative person
– Interested in working in a fast-paced international environment as part of an international team

YOUR TEAM
You'll be working in close association with the Advanced Analytics (Big Data Analytics) team in Zurich. We offer you a highly motivated and experienced team with in-depth knowledge of Advanced Analytics, building models and algorithms. Our goal is to enable the business for a digital and analytical future! You'll operate and maintain the whole platform end to end and work closely with the development teams. This role includes significant responsibilities and development possibilities.

Requirements
– Bachelor's degree in computer science, computer engineering, management information systems or a related discipline, or equivalent experience
– At least 5 years of experience in DevOps best practices, building automation pipelines and CI/CD using Ansible for platforms; a hedged rollout sketch follows this posting
– At least 5 years of experience creating clustered solutions and integrating third-party tools
– At least 5 years of experience integrating security best practices in enterprise environments, covering both system security and network security for Big Data systems
– At least 5 years of experience in DevOps automation with Ansible
– 5 years of experience in rollouts/deployments of layered environments in large enterprises
– At least 8 years of experience in system architecture, troubleshooting and operation, *nix-focused
– Experience with IBM DB2 is a plus
– Experience with IBM Power Systems is a plus
– Experience implementing complex security requirements in the financial industry
– Good abstraction and conceptual skills combined with a self-reliant, team-minded, communicative, proactive personality

We offer
– Experience exchange with colleagues all around the world
– Competitive compensation depending on experience and skills
– Regular assessments and salary reviews
– Opportunities to develop integration modules for interacting with new systems and applications
– Opportunities for self-realization
– A friendly team and an enjoyable working environment
– Corporate and social events

Please note that any offers will be subject to appropriate background checks. We do not accept CVs from recruiting or staffing agencies. Due to Swiss labour legislation, we can only accept EU candidates and applicants who hold an open work permit for Switzerland.
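As a hedged sketch of the Ansible-based, layered rollout these requirements describe: a small Python wrapper that dry-runs a playbook against one environment layer at a time before applying it. The inventory path, playbook name and layer group names are hypothetical; only standard ansible-playbook flags (-i, --limit, --check) are used.

# Minimal sketch: layered Ansible rollout driven from Python.
# Dry-run (--check) each layer before applying; abort on the first failure.
# Inventory path, playbook name and group names are hypothetical placeholders.
import subprocess
import sys

LAYERS = ["dev", "test", "prod"]  # hypothetical inventory groups, rolled out in order

def run_playbook(layer: str, check: bool) -> int:
    cmd = [
        "ansible-playbook",
        "-i", "inventory/hosts.ini",   # hypothetical inventory
        "site.yml",                    # hypothetical playbook
        "--limit", layer,              # restrict the run to one layer
    ]
    if check:
        cmd.append("--check")          # dry run: report changes without applying them
    return subprocess.run(cmd).returncode

for layer in LAYERS:
    if run_playbook(layer, check=True) != 0:
        sys.exit(f"dry run failed for layer {layer!r}; aborting rollout")
    if run_playbook(layer, check=False) != 0:
        sys.exit(f"apply failed for layer {layer!r}; aborting rollout")
    print(f"layer {layer!r} rolled out successfully")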
EPAM Systems Zürich, ZH, Switzerland
15/06/2019
Full time
Currently we are looking for a Big Data Engineer with ETL knowledge for our Zurich office to make the team even stronger. Do you want to work on the cutting edge? Do you want to expand your knowledge and abilities? Are you passionate about data? Have you worked in similar positions before? We're looking for someone like you who can help us:
– Engineer and integrate the platform from a technology point of view
– Engineer core Big Data platform capabilities

YOU ARE
– A passionate Big Data Engineer looking for new challenges
– Proactive, looking for creative and innovative solutions
– A flexible, open-minded and cooperative person
– Interested in working in a fast-paced international environment as part of an international team

Requirements
– Bachelor's degree in computer science, computer engineering, management information systems or a related discipline, or equivalent experience
– At least 5 years of experience in Big Data platform and data lake architectures
– At least 5 years of experience in the planning, realization and operation of Big Data platforms
– At least 5 years of experience in system integration
– At least 5 years of experience building automation pipelines and CI/CD using Ansible for platforms
– 3-5 years of experience with Hortonworks HDP and Cloudera CDH
– Experience with IBM Power Systems is an advantage
– Experience with IBM Watson Studio Local and its integration with IBM DB2, IBM Big SQL, HDFS, Ranger, Hive and YARN is an advantage
– Experience with IBM DB2 is a plus
– Experience implementing complex security requirements in the financial industry
– Expert understanding of ETL principles and how to apply them within Hadoop is a plus; a hedged ETL sketch follows this posting
– Good abstraction and conceptual skills combined with a self-reliant, team-minded, communicative, proactive personality

We offer
– Experience exchange with colleagues all around the world
– Competitive compensation depending on experience and skills
– Regular assessments and salary reviews
– Opportunities to develop integration modules for interacting with new systems and applications
– Opportunities for self-realization
– A friendly team and an enjoyable working environment
– Corporate and social events

Please note that any offers will be subject to appropriate background checks. We do not accept CVs from recruiting or staffing agencies. Due to Swiss labour legislation, we can only accept EU candidates and applicants who hold an open work permit for Switzerland.
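As a hedged sketch of ETL principles applied within Hadoop, per the requirement above: extract raw files from HDFS, transform them with PySpark, and load the result into a Hive table. All paths, schemas and table names are hypothetical placeholders.

# Minimal ETL sketch within Hadoop: HDFS CSV -> cleaned DataFrame -> Hive table.
# All paths, column names and table names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("hadoop-etl-sketch")
    .enableHiveSupport()
    .getOrCreate()
)

# Extract: raw delimited files landed on HDFS.
raw = spark.read.csv("hdfs:///data/raw/payments/", header=True, inferSchema=True)

# Transform: normalize types, drop duplicate records, keep only valid rows.
clean = (
    raw.withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .withColumn("booking_date", F.to_date("booking_date", "yyyy-MM-dd"))
       .dropDuplicates(["payment_id"])
       .filter(F.col("amount").isNotNull())
)

# Load: a simple managed-table overwrite keeps the sketch short;
# partition-wise loading would be the usual choice at scale.
clean.write.mode("overwrite").saveAsTable("curated.payments")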