Senior Databricks Engineer
Kraków, Lesser Poland Voivodeship, Poland
ENTRADA AI SPÓŁKA Z OGRANICZONĄ ODPOWIEDZIALNOŚCIĄ
6. 11. 2025
Job information

Technologies expected:

  • Databricks
  • Python
  • SQL
  • ETL
  • Apache Spark

Technologies optional:

  • Big Data
  • Azure
  • Snowflake
  • DBT
  • Airflow

About the project:

  • Entrada AI, Inc. is looking for a highly skilled Senior Databricks Engineer to join our growing team of consultants. In this role, you will be responsible for designing, constructing, and optimizing large-scale data processing systems that enable advanced analytics and business intelligence. You will collaborate closely with data scientists, analysts, and other engineers to ensure the availability, reliability, and efficiency of our client’s Databricks solution.

Responsibilities:

  • Design, develop, and maintain scalable and robust data pipelines that collect, process, and store large volumes of structured and unstructured data.
  • Architect and implement data warehouses, data lakes, and other storage solutions that support analytics and reporting needs.
  • Optimize data architectures and workflows for performance, scalability, and cost-efficiency.
  • Collaborate with data scientists and analysts to understand data requirements and ensure that data systems meet their needs.
  • Ensure data quality, integrity, and security by implementing best practices in data governance and management.
  • Develop and maintain documentation for data systems, including data models, flow diagrams, and operational procedures.
  • Mentor and guide junior data engineers, providing technical leadership and support.
  • Stay current with emerging technologies and trends in data engineering, and apply them to improve existing systems.

Requirements expected:

  • Bachelor’s or Master’s degree in Computer Science, Information Systems, Engineering, or a related field.
  • 5+ years of experience in data engineering or related fields, with a focus on large-scale data systems.
  • 3+ years of experience with Databricks.
  • Proficiency in SQL and experience with relational databases such as MySQL or PostgreSQL, as well as NoSQL databases.
  • Strong programming skills in languages such as Python, Java, or Scala.
  • Experience with ETL tools and frameworks, and with data pipeline orchestration.
  • Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka) and cloud platforms (AWS, GCP, Azure).
  • Strong understanding of data warehousing concepts, data modeling, and schema design.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication skills and ability to work effectively in a team-oriented environment.
