Data Engineer Cloud/AWS / Remote / 6 months + / Start ASAP
Must have AWS, Databricks and DevOps experience.
Inside IR35 if based in UK
RESPONSIBILITIES:
As a Senior Data Engineer on the Data Platform team, you will focus on maintaining the data transformation architecture that forms the backbone of the analytical data platform at Customer, and on implementing new features for it. By constantly challenging the status quo, you will deliver high-performance data processing solutions that are efficient and reliable at scale.
Where your experience is needed:
Operational Excellence:
- Troubleshooting user issues related to Databricks (Spark) and suggesting optimisations for long-running or resource-intensive jobs
- Guiding users on best practices of Databricks cluster management
- Consulting with and helping our diverse teams to develop, implement, and maintain sustainable, high-performance, growth-ready data-processing and data-integration systems
Delivering New Features:
- Implementing services that improve the CI/CD experience of users, with an emphasis on self-service
- Working together with other teams in Customer Data Infrastructure to deliver services that serve as a backbone for Customer's central data lake
Compliance & Security:
- Improving data access methods to provide a bulletproof, secure, and compliant-by-default self-service platform
Cost Savings:
- Guiding and supporting users in finding a cost-effective setup while keeping efficiency high
- Implementing observability solutions in Databricks to reduce slack (idle-resource) costs
- Driving users to adopt cutting-edge Databricks features such as Photon and Graviton-based instances
- Reviewing current infrastructure to find cost saving opportunities
SKILL SET - Technical
Mandatory:
- Proficiency with distributed data processing frameworks such as Apache Spark, and a good understanding of relational database management systems
- Extensive hands-on experience with Databricks
- Hands-on experience with Python and Scala combined with SQL knowledge
- Hands-on experience in cloud technologies (AWS services like IAM, S3, Lambda, EC2)
- Engineering craftsmanship, with experience in software development processes focused on testing, continuous integration/continuous delivery (CI/CD), monitoring, and documentation
Nice to have:
- AWS networking, or at least the basics of networking on the AWS cloud
- Hands-on experience with Terraform
SKILL SET - Soft skill:
English B2 (able to communicate fluently with English-speaking stakeholders, share ideas, and provide reasoning)
Team player (easy and respectful communication; shares responsibility for the team's overall success)
Communication skills
Mastery of engineering practices
Reference
CR/108857_1692166290
