Big Data Architect

  • EPAM Systems
  • Zürich, Switzerland
  • 31/08/2018
Full time · Data Science, Data Engineering, Data Analytics, Big Data, Data Management, Statistics, Software Engineering

DESCRIPTION

We are currently looking for a permanent, technically hands-on Big Data Architect to join our Zurich team and lead various strategic client projects.

This is a high-profile, visible role, both within EPAM and onsite with our clients, in which you will have a high degree of flexibility to own and enhance the technical landscape.

RESPONSIBILITIES

  • Design data analytics solutions by utilizing the Big Data technology stack
  • Create and present solution architecture documents with deep technical details
  • Work closely with the business to identify solution requirements and key case studies/scenarios for future solutions
  • Conduct solution architecture reviews/audits; calculate and present ROI
  • Lead solution implementation, from establishing project requirements and goals through to "go-live"
  • Participate in the full cycle of pre-sales activities: direct communication with customers, RFP processing, development of implementation proposals and solution designs, presentation of the proposed solution architecture to the customer, and technical meetings with customer representatives
  • Create and follow a personal education plan in the technology stack and solution architecture
  • Maintain a strong understanding of industry trends and best practices
  • Get involved in engaging new clients to further drive EPAM's business in the Big Data space

REQUIREMENTS

  • Strong hands-on experience as a Big Data Architect, with a solid design/development background in Java, Scala, or Python
  • Experience delivering data analytics projects and architecture guidelines
  • Experience with Big Data solutions on-premises and in the cloud (Amazon Web Services, Microsoft Azure, Google Cloud)
  • Production project experience in at least one of the following Big Data technologies:
      • Batch processing: Hadoop and MapReduce / Spark / Hive
      • NoSQL databases: Cassandra / HBase / Accumulo / Kudu
  • Knowledge of Agile Development methodology, Scrum in particular
  • Experience in direct customer communication and in pre-selling business-consulting engagements to clients within large enterprise environments
  • Fluent English

NICE TO HAVE

  • Practical experience in performance tuning, optimization, and bottleneck analysis
  • Experience in Linux-based environments
  • Understanding of data modelling challenges and techniques in an enterprise environment
  • Stream processing: Kafka / Flink / Spark Streaming / Storm
  • Background in traditional Data Warehouse and Business Intelligence stacks (ETL, MPP databases, Tableau, Microsoft Power BI, SAP BusinessObjects)

WE OFFER

  • Experience exchange with colleagues all around the world
  • Competitive compensation depending on experience and skills
  • Regular assessments and salary reviews
  • Opportunities for self-realization
  • Friendly team and enjoyable working environment
  • Corporate and social events
  • Please note that any offers will be subject to appropriate background checks
  • We do not accept CVs from recruiting or staffing agencies