Senior Data Engineer

Full time · Data Engineering · Artificial Intelligence · Software Engineering · Data Warehouse

Job Description

Job Category: Data Engineer
Contract type: CDI (permanent contract)
Work location: Lausanne
Teleworking (2 days max/week): Yes

At BS-Team, we are committed to supporting our employees throughout their careers by offering them opportunities for development. We are convinced that expanding knowledge and skills is essential for their professional success and daily fulfillment.

Mission:

Our client is looking for an experienced or senior Data Engineer to develop and maintain data products as part of the client's various IT projects.

Your role:

  • Design and build data products for the client's various services, or for general public use, to a high standard of quality on the client's technical stack and following a DevOps methodology.
  • Join different projects and work within the Data teams.
  • The work will mainly involve providing APIs, ingesting data streams, and transformation processing of high data volumes on an on-premises infrastructure.
  • Participate in the team's Agile sessions.
  • Work in conjunction with enterprise architecture, integration and infrastructure teams to ensure consistent design and stable operations.
  • Ensure that data is processed appropriately, and ensure the quality and documentation of the data captured.
  • Listen to customers and users to ensure the right functionality and a quality user experience.
  • Be proactive in making proposals. Know how to get on board with ideas and bring others on board.
  • Support other engineers in improving their practices and facilitate the transfer of skills.

Your profile:

  • Degree in computer science from a higher-education institution (HES, University, EPF) or training deemed equivalent.
  • Minimum 3 years of professional experience in data engineering projects.
  • Technical skills:
    • DevOps Tools: Gitlab, VSCode, Docker
    • Testing tools: PyTest / UnitTest, SonarQube, Postman
    • Transformation tools: DBT, Python, Debezium, other ETLs
    • Data sources: Parquet on S3, PostgreSQL, DuckDB, ArangoDB, REST (OpenAPI), GraphQL, XML, Kafka, SQL Server, time-series DBMS (Prometheus, InfluxDB), ELK
    • Scheduling and monitoring: Prefect
  • Experience delivering projects in Agile mode.
  • ITIL and Hermes knowledge desired.
  • Experience in the public sector is an asset.

The advantages:

  • Long-term assignment
  • 2 days of teleworking per week
  • Very attractive remuneration
  • Training and certification opportunities
  • Regular follow-up with our teams and strong integration

We will respond only to applications that best match the requirements.
Your file will not be forwarded to other companies.