Big Data Engineer - Analytics wanted for our Basel-based client in the financial sector.

Your profile:
- Software engineering background with experience in agile SDLC tools and practices, ideally including continuous delivery
- Proven experience implementing solutions to process large volumes of data in a Hadoop ecosystem using Apache Spark, as well as generic components for the ingestion, validation, and structuring of disparate data sources and for complex event processing patterns (see the sketch after this list)
- Strong programming and scripting skills in Java, Scala, Python, and R
- In-depth knowledge of Hadoop-ecosystem technologies such as MapReduce, HDFS, HBase, Hive, Sqoop, Flume, and Kafka
- Experience with end-user notebook tools such as Jupyter and Zeppelin, as well as with BRMS-driven solutions and master data management (MDM)
- Languages: fluent English, both written and spoken
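To give a flavour of the ingestion and validation work described above, here is a minimal Scala sketch of a Spark job that reads a feed from Kafka, validates it, and lands it in Hive. All specifics (broker address, topic, schema fields, table and quarantine paths) are illustrative assumptions, not details of the client's platform:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types._

object TradeIngestion {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("trade-ingestion")
      .enableHiveSupport()
      .getOrCreate()

    // Batch-read raw events from a Kafka topic (broker and topic are placeholders).
    val raw = spark.read
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "trades.raw")
      .option("startingOffsets", "earliest")
      .option("endingOffsets", "latest")
      .load()

    // Parse the JSON payload against an expected schema (fields are assumptions).
    val schema = StructType(Seq(
      StructField("trade_id", StringType),
      StructField("amount", DoubleType),
      StructField("currency", StringType)
    ))
    val parsed = raw
      .select(from_json(col("value").cast("string"), schema).as("t"))
      .select("t.*")

    // Validate: keep well-formed records, quarantine the rest for inspection.
    val valid = parsed.filter(col("trade_id").isNotNull && col("amount") > 0)
    val rejected = parsed.exceptAll(valid)

    valid.write.mode("append").partitionBy("currency").saveAsTable("staging.trades")
    rejected.write.mode("append").parquet("/data/quarantine/trades")

    spark.stop()
  }
}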
Your responsibilities:
- Implementing solutions on the existing Hadoop platform in collaboration with a global team
- Liaising with Platform Engineers to ensure dependent components are provisioned
- Providing input on the design and architecture of the envisaged solution
- Implementing business rules for streamlining data feeds, and a rule-based framework that abstracts complex technical implementations into reusable, generic components (a sketch follows this list)
- Interacting with the architecture team to refine requirements
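As an illustration of the rule-based framework mentioned above, here is a short Scala sketch of a generic component that applies named business rules to any Spark DataFrame, in the style one might prototype in a Zeppelin notebook. The rule names and example feed are hypothetical:

import org.apache.spark.sql.{Column, DataFrame}
import org.apache.spark.sql.functions._

// A named, declarative business rule: the technical predicate is hidden
// behind a simple case class, so feeds are configured with rule names
// rather than bespoke code.
final case class Rule(name: String, predicate: Column)

object RuleEngine {
  // Flags each record against every rule and adds an overall verdict.
  // One generic component serves any feed that supplies its own rule set
  // (assumes the set is non-empty).
  def applyRules(df: DataFrame, rules: Seq[Rule]): DataFrame = {
    val flagged = rules.foldLeft(df) { (acc, r) =>
      acc.withColumn(s"rule_${r.name}", r.predicate)
    }
    val verdict = rules.map(r => col(s"rule_${r.name}")).reduce(_ && _)
    flagged.withColumn("passed_all_rules", verdict)
  }
}

// Hypothetical rule set for a trade feed:
// RuleEngine.applyRules(trades, Seq(
//   Rule("positive_amount", col("amount") > 0),
//   Rule("known_currency", col("currency").isin("CHF", "EUR", "USD"))
// ))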
Does this sound like an interesting and challenging opportunity to you? Then take the next step by sending us your CV as a Word document, together with a contact telephone number.
Due to work-permit restrictions, we can unfortunately only consider applications from EU or Swiss citizens, or from current holders of a Swiss work permit.
Going the extra mile…
New to Switzerland? In case of successful placement, we support you with:
- All administrative matters
- Finding an apartment
- Health and social insurance
- Work permit and much more