Working at T Hub will offer you a unique and highly rewarding experience in the IT market. As a leader in the telecommunications industry, we not only provide a platform to hone your technical skills but also empower you to be a catalyst for innovation.
You'll have the opportunity to work at the forefront of modern technologies, from 5G to IoT and AI, shaping the future of connectivity.
Responsibilities:
Orchestrate Data Pipelines:
Design, build, and maintain robust data workflows using Dagster, ensuring reliable and observable data delivery across development, testing, and production environments.
Pipeline Configuration & Deployment:
Manage configuration-driven pipelines, support dynamic partitioning, and integrate Dagster with version control, CI/CD tools, and containerized environments (e.g., Kubernetes, Docker).
Data Services Development:
Develop scalable, configurable services for common orchestration tasks such as data profiling and aggregation.
Data Quality and Testing:
Implement data validation, unit and integration testing within Dagster workflows to ensure correctness and maintain trust in the data.
Collaboration & Integration:
Work closely with business analysts, data architects, and business stakeholders.
Monitoring and Observability:
Set up logging, alerting, and monitoring for workflows using Dagster’s observability features and external tools (e.g., ELK Stack).
Documentation & Best Practices:
Document pipelines, configurations, and workflows thoroughly. Promote best practices for reproducibility, lineage, and maintainability.
Platform & Tooling Development:
Support improvements to our data platform by integrating new tools, automating deployment, and contributing to shared libraries and reusable components.

Qualifications:
Must-Have:
Solid knowledge of Python and experience with writing production-grade code in data workflows.
Strong SQL skills and a solid understanding of relational databases and data warehouse concepts.
Experience with ETL/ELT development using tools like Spark, Pandas, or custom scripts.
Familiarity with data pipeline testing, quality gates, and configuration validation (e.g., Pydantic, Dagster config schemas).
Understanding of data lineage, metadata tracking, and reproducibility in workflow systems.
Strong collaboration and communication skills, with the ability to work cross-functionally.
Nice-to-Have:
Hands-on experience with Dagster or a modern orchestration tool (e.g., Prefect, Airflow, Mage).
Experience deploying Dagster with Kubernetes, Docker, and Helm.
Familiarity with CI/CD for data workflows and automated testing (e.g., GitHub Actions, GitLab CI).
Knowledge of data modeling and data warehouse design (e.g., dimensional modeling, star/snowflake schemas).
Experience with data catalog tools and metadata APIs.
Offered:
Employment contract
Additional day off for your birthday or nameday
Medical package and life insurance
Benefits platform - you choose the perks that suit you
Access to training platforms to improve your knowledge
"I Know Talent" referral program - training or a cash bonus for referring friends to work
In addition, you can count on access to our products and services on preferential terms, along with the benefits listed below.
Benefits:
sharing the costs of sports activities
private medical care
sharing the costs of foreign language classes
sharing the costs of professional training & courses
life insurance
flexible working time
corporate products and services at discounted prices
integration events
mobile phone available for private use
no dress code
parking space for employees
extra social benefits
sharing the costs of cinema and theater tickets
holiday funds
birthday celebration
sharing the costs of a streaming platform subscription