Our unique, proprietary data platform tracks real-time signals on millions of companies globally, delivering best-in-class monitoring and insight into global supply chains. Through our configurable Software-as-a-Service portal, our customers can monitor any company they work with and execute critical actions in real-time. We are a post-Series B high-growth technology company backed by top-tier investors in Silicon Valley and Europe, headquartered in San Francisco with hubs in Seattle, London, and Warsaw. We support remote and hybrid work, with team members across North America and Europe.
We're looking for innovative and driven people who are passionate about building the future of Enterprise Intelligence to join our growing team!
Craft is looking for an experienced and motivated senior-level Data Engineer to join one of the teams responsible for a key product within the organization. As a core member of this team, you will have a strong say in how solutions are engineered and delivered. Craft gives engineers significant responsibility and authority, matched by our investment in their growth and development.
We’re growing quickly and looking to hire data engineers for multiple teams. We’re always looking for folks with strong data engineering experience, Python coding experience, Pandas expertise, and solid software engineering practices. In addition to those skills, we’re looking for two different categories of data engineers: those with experience in data lakes and data warehousing, and those with experience in graph databases and knowledge graphs.
# Keep track of emerging technologies and trends in the Data Engineering world, incorporating modern tooling and best practices at Craft.
# Apply machine learning techniques such as anomaly detection, clustering, regression, classification, and summarization to extract value from our data sets.
~ 4+ years of experience in Data Engineering.
~ 4+ years of experience with Python.
~ Have fundamental knowledge of data engineering techniques: ETL/ELT, batch and streaming, DWH, Data Lakes, distributed processing.
~ Strong knowledge of SDLC and solid software engineering practices.
~ Demonstrated curiosity through asking questions, digging into new technologies, and always trying to grow.
~ Familiarity with at least some of the technologies in our current tech stack:
~ Python, PySpark, Pandas, SQL (PostgreSQL), Elasticsearch, Airflow, Docker
~ Databricks, AWS (S3, Batch, Athena, RDS, DynamoDB, Glue, ECS, Amazon Neptune)
~ Ability to rapidly scaffold components and APIs
# Option to work as a B2B contractor or full-time employee
# Competitive salary at a well-funded, fast-growing startup
# Full-time employees: 100% remote work (or hybrid if you prefer!)