About the position
Job Description
Develop and deliver data engineering solutions, primarily using Python or other relevant programming languages.
Participate actively in agile ceremonies such as sprint planning, refinement, and retrospectives.
Collaborate with stakeholders and team members to clarify requirements and refine user stories.
Share knowledge and learn from team members; contribute to team collaboration and improvement.
Identify and help resolve open points, proposing solutions or prototypes when appropriate.
Provide support and maintenance for existing data engineering solutions following DevOps practices.
Be flexible to take on different tasks within projects and support continuous improvement efforts.
Minimum Requirements:
Qualifications/Experience:
Relevant IT, Mathematics, Engineering, or Business degree or equivalent practical work experience.
Bachelor’s degree in Computer Science, Data Engineering, Information Systems, or a related field preferred.
Minimum 2–4 years of hands-on data engineering experience in complex data environments.
Some experience designing and implementing cloud-based data solutions.
Relevant cloud certifications (e.g., Azure AZ-900, AWS Certified Cloud Practitioner) are advantageous but not mandatory.
Essential Skills Requirements:
Experience designing, building, and optimizing scalable data pipelines and data models using big data frameworks (e.g., Apache Spark, Flink, or equivalents).
Proficient programming skills in Python or similar languages for data processing and automation.
Exposure to cloud data platforms and object storage solutions (e.g., Azure, AWS, Google Cloud) for enterprise data engineering.
Understanding of data governance, data quality, lineage, and compliance principles.
Experience or familiarity with CI/CD pipelines and orchestration tools for data workflows (e.g., GitHub Actions, Jenkins, or cloud-native tools).
Good analytical and problem-solving skills with attention to detail.
Ability to collaborate effectively within cross-functional and distributed teams.
Advantageous Skills Requirements:
Working knowledge of cloud services and serverless architectures across Azure, AWS, or Google Cloud.
Experience with monitoring, logging, and data exploration tools (e.g., Splunk, Azure Data Explorer, ELK stack) is beneficial.
Exposure to streaming platforms or messaging systems (e.g., Kafka, MQTT, RabbitMQ) is an advantage.
Basic understanding of frontend frameworks (e.g., React, [URL Removed]) is a plus.
Experience supporting users and managing tickets is beneficial.
Solution-oriented mindset with strong communication and teamwork skills.
Ability to understand business requirements and translate them into technical tasks.
Willingness to engage with international customers and navigate language or cultural differences.
Self-motivated, flexible, and ready to learn and take on diverse tasks.
Willingness to travel internationally (up to 2 weeks at a time).
Agile methodology experience and ITIL process knowledge are advantageous.
Basic German language skills are a plus but not required.
Desired Skills:
- scalable data pipelines and data models
- Apache Spark
- Flink
- Python