
Data Engineer

Cyprus | Information and Technology | 1 Vacancy | Bachelor | Advanced

Job Description

  • Design, develop, and optimize data pipelines, ETL processes, and data architectures that meet non-functional and functional business requirements, with a focus on handling both batch and streaming data.
  • Implement data acquisition processes and integrate data from various sources including databases and public data APIs using cloud tools and technologies. Build and maintain scalable data storage solutions, including data warehouses, data lakes, and data streaming systems.
  • Ensure the accuracy, completeness, and reliability of data through proper validation and error-handling processes. Monitor and maintain data pipeline performance, troubleshooting issues as they arise.
  • Identify, design, and implement internal process improvements, including re-designing infrastructure for greater scalability, reliability, and performance.
  • Work closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver solutions that meet their needs.
  • Continuously evaluate and recommend new technologies and tools to improve data processing, storage, and analysis capabilities.
  • Stay up-to-date with industry trends and best practices to ensure our data engineering practices remain cutting-edge.

What we are looking for

  • At least 2 years of proven experience as a Data Engineer or in a similar role, with a focus on building and optimizing data pipelines, including real-time and high-volume data pipelines.
  • Strong hands-on experience with SQL and database management systems (e.g., MySQL, MS SQL, PostgreSQL).
  • Experience with cloud-based data platforms (AWS, Azure, GCP) and related services. Experience with big data processing tools (Spark).
  • Experience with Databricks and analytical databases such as ClickHouse is highly desirable.
  • Proficiency in programming languages such as Python or Scala.
  • Knowledge of CI/CD methodologies and cloud data warehousing concepts and tools.
  • Experience with streaming data platforms such as Apache Kafka, Apache Flink, AWS Kinesis, or similar.
  • Strong problem-solving skills and attention to detail.
  • Excellent communication and collaboration skills.