*We are looking for Poland-based candidates.
Joining Capco means joining an organisation that is committed to an inclusive working environment where you’re encouraged to #BeYourselfAtWork. We celebrate individuality and recognize that diversity and inclusion, in all forms, is critical to success. It’s important to us that we recruit and develop as diverse a range of talent as we can and we believe that everyone brings something different to the table – so we’d love to know what makes you different. Such differences may mean we need to make changes to our process to allow you the best possible platform to succeed, and we are happy to cater to any reasonable adjustments you may require. You will find the section to let us know of these at the bottom of your application form or you can mention it directly to your recruiter at any stage and they will be happy to help.
Capco Poland is a global technology and management consultancy specializing in driving digital transformation across the financial services industry. We are passionate about helping our clients succeed in an ever-changing industry.
We are also experts in Java, Python, Spring, Hadoop, Angular, React, Android, Google Cloud, Selenium, SQL, Docker, and Kubernetes, focused on development, automation, innovation, and long-term projects in financial services. At Capco, you can code, write, create, and live at your maximum capabilities without getting dull, tired, or foggy.
We're seeking a skilled Senior Big Data Engineer to join our team. The ideal candidate will be responsible for designing, implementing, and maintaining scalable data pipelines and solutions on on-prem, migration, and cloud projects for large-scale data processing and analytics.
BIG DATA ENGINEER @ CAPCO - WHAT TO EXPECT
WHY JOIN CAPCO?
ONLINE RECRUITMENT PROCESS STEPS*
We have been informed of several recruitment scams targeting the public. We strongly advise you to verify identities before engaging in recruitment related communication. All official Capco communication will be conducted via a Capco recruiter.
SKILLS & EXPERIENCES YOU NEED TO GET THE JOB DONE
TECH STACK: Python, OOP, Spark, SQL, Hadoop
Nice to have: GCP, Pub/Sub, BigQuery, Kafka, Juniper, Apache NiFi, Hive, Impala, Cloudera, CI/CD
Responsibilities:
- Work alongside clients to interpret requirements and define industry-leading solutions
- Design and develop robust, well-tested data pipelines
- Demonstrate and help clients adhere to best practices in engineering and the SDLC
- Apply excellent knowledge of building event-driven, loosely coupled distributed applications
- Develop both on-premise and cloud-based solutions
- Build and improve strong relationships with peers, senior stakeholders, and clients
- Lead and mentor a team of junior and mid-level engineers
- Contribute to security designs, drawing on advanced knowledge of key security technologies (e.g. TLS, OAuth, encryption)
- Support internal Capco capabilities by sharing insight, experience, and credentials

Requirements: GCP, Python, Hadoop, Spark, PySpark, DataProc, Airflow, Oozie, AWS S3, Terraform, data structures, Spark Streaming, Kinesis, SQL, NoSQL, ETL, Docker, Kubernetes, Data Lake, Redshift, Snowflake, Git, CD pipelines, Jenkins, CircleCI, OOP, Scala, Java, BigQuery, Kafka, NiFi, Juniper Networks, Hive, Impala, Cloudera, CI/CD

Additionally: private healthcare, employee referral bonus, MyBenefit, Udemy for Business.