Lead Machine Learning Engineer

Jozu

Software Engineering
Remote
Posted on Dec 26, 2025
Jozu is using machine learning and large language models to find patterns in structured and unstructured data, helping our customers understand the business impact of their engineering, marketing, and sales decisions. We build this with operational and analytical databases, ML algorithms, LLMs, queue systems, and data monitoring tools. Our software engineering focus spans data replication, storage, centralized computation, and data APIs. We are a small but passionate team of people from different nationalities working together toward a single goal.

We're looking for an experienced machine learning engineer to help Jozu redefine how product, user, and business analysis is done. You'll help define the vision and leverage some of the latest thinking and technology for structured and unstructured data analysis. You'll turn that vision into code and models that help our customers find new and unexpected ways to delight their customers and grow their businesses. You'll do all this while keeping a constant eye on the security, performance, and operational stability of our product.

Candidates should enjoy working in a collaborative, analytical, and fast-paced environment and be comfortable interacting with highly technical cross-functional teams. Ideal candidates dive deep into data, have strong communication skills, and can confidently prioritize and choose the right tradeoffs.

We are a remote-first team that meets quarterly for in-person working sessions. We welcome applicants from anywhere between the Pacific and Eastern time zones. We value critical thinking, curiosity, problem-solving, and an open and empathetic communication style. Personal and team growth is an integral part of Jozu. We love to learn and challenge ourselves, and we always support each other as we grow.
We are excited to take on tough challenges and celebrate together when we achieve our goals.

In this role, you'll be able to:

- Help refine our vision for how and where AI/ML will benefit our customers most.
- Build, administer, and scale new and existing AI/ML models and pipelines.
- Design, build, test, and deploy new libraries, frameworks, or full systems for our core platform while keeping to the highest standards of testing and code quality.
- Work with experienced engineers and product owners to identify and build tools that automate large-scale data management and analysis tasks.
- Collaborate with data engineers to ensure robust data pipelines.
- Take end-to-end ownership of ML models, from definition through development to production.
- Implement data governance and quality control measures.
- Do your work with minimal meetings and remember what it's like to do consistent deep work.

What you'll need to succeed:

- A bachelor's degree in computer science, information systems, engineering, or a related field.
- A strong understanding of Python.
- A proven track record of developing and deploying machine learning models.
- Experience with machine learning frameworks (e.g., TensorFlow, PyTorch).
- Excellent communication and collaboration skills, and the ability to work effectively in a team environment.
- A love of code simplicity and performance.

Ideally, candidates will also have some of the following:

- An understanding of data architecture principles.
- Experience with Git and, ideally, GitOps.
- Experience handling multi-tenanted data models and ML systems.
- An understanding of the ML project lifecycle.
- Experience with Kubernetes, time series databases, Apache Pulsar, and APIs.

This role will allow you to accelerate your career growth, experiment with new technologies, and build your proficiency across multiple disciplines. It will enhance your analytical thinking and give you a voice in the first and most critical decisions for a brand-new product in a rapidly growing market.
The ideal candidate feels at home with cloud-native technologies running across public, private, and hybrid clouds. We are using Kubernetes, eBPF, Kafka, serverless, gateways and data meshes, microservices, and declarative APIs.