DataOps Engineer
CoreWeave Europe is seeking a DataOps Engineer to join the Monolith AI Platform Engineering Team, responsible for building and scaling the data and workflow backbone that powers advanced engineering simulation and AI workflows. The team aims to become the super-intelligent AI test lab for the engineering industry, helping customers ship science faster. CoreWeave delivers a platform of technology, tools, and teams that enable innovators to build and scale AI with confidence.
In this role, the DataOps Engineer will own and drive all aspects of data observability and operations across the client estate. Responsibilities include designing and implementing the end-to-end observability stack for data workloads; defining and maintaining operational SLOs/SLAs for critical data flows; building dashboards, alerts, and runbooks for rapid incident response; and standardizing instrumentation practices for data pipelines.
The ideal candidate will have 5–6+ years of experience in DataOps, Data Engineering, or DevOps/SRE for data platforms, including end-to-end ownership of production data pipelines. Required skills include strong hands-on experience designing, deploying, and operating data pipelines in production; practical experience with data orchestration and ETL/ELT tooling; solid SQL and/or Spark skills; and extensive experience implementing data observability. Proficiency in Python for building tooling and platform integrations is also essential.
CoreWeave offers a variety of benefits, including family-level medical and dental insurance, generous pension contributions, life assurance at 4x salary, critical illness cover, an employee assistance programme, and tuition reimbursement. The company fosters a work culture focused on innovative disruption and supports an entrepreneurial outlook and independent thinking. Growth opportunities within the organization are constantly expanding, providing an environment that encourages collaboration and the development of innovative solutions to complex problems.