Senior Data Engineer (AWS): Job Description
🌍 Where you’ll do it: Chennai (HO)
🏁 An initial screening of the resume leads to a 3-stage interview process lasting at least a week: ➡️ 15-minute HR chat ➡️ 30-minute Technical Round ➡️ 1-hour final interview with the CEO/CFO to assess culture fit
What will make your journey with us amazing?
🏆 You will work with a supportive manager who cares about your well-being and invests in your development to help you achieve your full potential and grow your career with us.
You will engage in continuous learning, with clear targets and a culture of feedback.
🌱 You will join a company that is passionate about its people, values their contribution and strives for a fair and inclusive workplace.
What You'll Do
- Design, develop, and maintain scalable data pipelines and ETL processes using Python, SQL, and Spark.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver efficient solutions.
- Implement and manage data storage solutions on AWS, ensuring high availability, scalability, and security.
- Containerize data applications using Docker for consistent deployment across different environments.
- Monitor and optimize the performance of data pipelines and storage systems to ensure timely and reliable data delivery.
- Implement best practices for data governance, data quality, and data security.
- Troubleshoot and resolve data-related issues in a timely manner.
- Stay updated with the latest trends and technologies in data engineering and propose improvements to existing processes.
What You'll Bring
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering or a similar role.
- Proficiency in Python and SQL for data processing and manipulation.
- Hands-on experience with Apache Spark for big data processing.
- Strong experience with AWS services such as S3, Redshift, Glue, EMR, Lambda, and RDS.
- Proficient in using Docker for containerization and deployment.
- Experience with data pipeline orchestration tools like Apache Airflow or AWS Step Functions.
- Solid understanding of data warehousing concepts and data modeling.
- Familiarity with version control systems (e.g., Git) and CI/CD pipelines.
- Excellent problem-solving skills and the ability to work independently and as part of a team.
- Strong communication skills with the ability to convey complex technical concepts to non-technical stakeholders.
What's in it for you?
⏰ Work-Life Balance
🏖 Flexible holidays
📚 Robust L&D programs
🤝 People-centric culture and practices
💰 Competitive package
Multi-domain experience
❇️ Community contribution programs
💡 Attend hackathons and conferences
💼 Health insurance plan for the whole family + accidental and life coverage