We’re looking for a hands-on Technical Leader who takes ownership of designing, developing, and driving modern, scalable data ecosystems in Google Cloud. You’ll define the vision, set the standards, and lead teams to deliver data solutions that transform how our clients operate.
If you’re someone who thrives on turning business challenges into data-driven strategies, leading technical direction, and taking accountability from concept to delivery - this is your place.
Responsibilities:
Take end-to-end ownership of client data strategies - from discovery and architecture design to development and adoption.
Define and champion best practices for data architecture, modeling, and engineering in Google Cloud (BigQuery, Airflow, Dataflow, Pub/Sub, Composer, Dataproc).
Design modern data platforms from scratch, built on heavy use of metadata and aimed at AI enablement (data consumption by LLMs).
Build high-quality data warehouses (DWH) with built-in logging, auditing, monitoring, and CI/CD processes.
Lead technical design sessions and guide engineers on architectural direction, implementation standards, and solution development.
Drive architectural governance: ensure solutions are scalable, secure, cost-optimized, and aligned with business value.
Act as the trusted advisor to senior stakeholders - translating goals into actionable data roadmaps and clear architectural blueprints.
Proactively identify opportunities for innovation, efficiency, and modernization across client data platforms, including GCP, Azure, and multi-cloud environments.
Design and develop Data Warehouse and Data Lake architectures tailored to client needs.
Lead performance reviews and cost optimization initiatives across BigQuery, GCP, and Azure to ensure scalability and long-term sustainability.
Collaborate with presales and account teams to co-create proposals, solution outlines, and client presentations.
Ensure high standards of data quality, security, and governance across all solutions (bonus points if you know SOX compliance).
Define and implement policies for IAM, DLP, and Data Catalog in GCP.
Collaborate closely with data engineers, analysts, and business stakeholders to ensure consistency and scalability.
Requirements:
At least 4-5 years of experience in data architecture or advanced data engineering, ideally in GCP / BigQuery environments.
Advanced knowledge of Google BigQuery – schema design, partitioning, clustering, and query optimization.
Hands-on experience designing and deploying solutions using Cloud Storage, Dataflow, Composer (Airflow), Pub/Sub, and Dataproc.
Strong background in data modeling, ETL/ELT design, and metadata-driven frameworks.
Proficiency in SQL; Python experience is a plus.
Understanding of Data Governance, Security, and Data Management best practices.
Strong analytical and problem-solving skills, with the ability to think strategically and act pragmatically.
Full English proficiency (B2/C1).
What we offer:
We’re a growing company with global clients and ambitious goals - which means real challenges, variety, and opportunities to grow.
If you’re ready to shape how data drives modern business, we’ll make sure you have everything you need to succeed.
Remote-first, flexible working model
Greenfield project: building a DWH from scratch
Private healthcare, insurance, and Multisport
Full working equipment provided
1,000 PLN annual development budget for training, certifications, or conferences
Regular knowledge-sharing sessions and mentoring
Work with international clients from Switzerland, France, UK, the US, UAE, and more
A real team culture, where collaboration and learning go hand in hand
Team integrations that build connection - without awkward icebreakers