Job Description:
Experience: 5+ years
Role: Data Engineer
Mandatory Skills: Spark, Scala, Azure cloud, Azure Databricks, Data Lake/Blob Storage, Azure Data Factory, CI/CD tools, Elasticsearch
Good to have: Kubernetes, Docker, Key Vault, MySQL, graph machine learning, Angular & NodeJS, modern development tooling (e.g. Git, Gradle, Jenkins, Nexus)
Responsibilities:
- Working with clients to solve business problems in the areas of fraud, compliance, and financial crime
- Using leading open-source big-data tools such as Spark, Hadoop, Scala, and Elasticsearch; you should be comfortable working with high-profile clients on their sites
- Providing technical leadership to a team of Data Engineers and Data Scientists to ensure efficient and effective delivery of reusable solutions.
- Managing, transforming, and cleansing high-volume data
- Writing defensive, fault-tolerant, and efficient code for data processing
- Developing scoring algorithms to identify high-risk activities
- Using emerging and open source technologies such as Spark, Hadoop, and Scala
- Implementing systems that deliver automated data ingestion, scoring, and alert generation
- Implementing deployment pipelines using modern CI/CD and deployment technologies such as Jenkins/Bamboo/Kubernetes/OpenShift
- Presenting project results to clients
Requirements:
- Proven big data experience, either from an implementation or a data science perspective
- Excellent technical skills, including expert knowledge of at least one big-data technology such as Spark, Hadoop, or Elasticsearch
- Experience building data-processing pipelines for use in production hands-off batch systems, including traditional ETL pipelines, analytics pipelines, or preferably both
- Experience working with Azure cloud, Databricks, Azure Data Factory, and Data Lake/Blob Storage
Behavioral Skills:
- Strong client-facing, communication, and presentation skills
- Enthusiasm for learning and developing with emerging technologies and techniques
- Exhibit strong technical communication skills with demonstrable experience of working in rapidly changing client environments.
- Strong analytical and problem-solving skills, with the ability to debug and solve technical challenges using sometimes unfamiliar technologies