Domino Data Lab tightens MLOps integration with Git repositories
Domino Data Lab, a pioneering provider of a machine learning operations (MLOps) platform, is making it easier for data scientists to manage code at a time when providers of DevOps platforms are starting to treat AI models as just another software artifact to be managed within the context of an application development project.
Version 4.4 of the Domino platform adds a CodeSync capability, integrated with Git repositories, that makes it easier to track all aspects of experimentation, said Nick Elprin, Domino Data Lab CEO. While Domino Data Lab sees data science teams employing Git repositories to manage the artifacts that make up an AI model, the processes for building those models will remain distinct from the DevOps processes developers employ to build applications, Elprin added. “Models are fundamentally different,” he said.
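The article doesn't detail CodeSync's API, but the underlying pattern it automates, snapshotting an experiment's code and metadata as a Git commit so every run is reproducible and auditable, can be sketched generically. The helper below is illustrative only (the function name and JSON layout are my own, not Domino's), using plain `git` commands via `subprocess`:

```python
import json
import pathlib
import subprocess

def commit_experiment(repo_dir, params, metrics, message):
    """Record an experiment run by writing its parameters and metrics
    to a JSON file, committing the snapshot to the local Git repo,
    and returning the commit SHA that identifies this exact run."""
    repo = pathlib.Path(repo_dir)
    snapshot = {"params": params, "metrics": metrics}
    (repo / "experiment.json").write_text(json.dumps(snapshot, indent=2))
    subprocess.run(["git", "-C", str(repo), "add", "-A"], check=True)
    subprocess.run(["git", "-C", str(repo), "commit", "-m", message], check=True)
    result = subprocess.run(["git", "-C", str(repo), "rev-parse", "HEAD"],
                            check=True, capture_output=True, text=True)
    return result.stdout.strip()
```

Because each run maps to a commit SHA, comparing two experiments reduces to an ordinary `git diff` between their commits, which is the kind of traceability a Git-integrated tracking feature provides.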
As part of an effort to make that tracking simpler for MLOps teams, Domino Data Lab has also added a Durable Workspaces capability that makes it possible to run multiple sandboxed environments simultaneously to improve productivity. Durable Workspaces also reduces infrastructure costs by enabling data scientists to stop, edit, and resume workspace configurations as required, Elprin said.
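The cost saving comes from separating a workspace's durable configuration from its running compute: stopping releases the compute while the configuration persists and can be edited before resuming. As a toy model of that lifecycle (all class and field names here are hypothetical, not Domino's API):

```python
import json
from dataclasses import asdict, dataclass

@dataclass
class WorkspaceConfig:
    # Hypothetical fields standing in for a real workspace definition.
    image: str
    cpu_cores: int
    memory_gb: int

class Workspace:
    """Toy stop/edit/resume lifecycle: compute only runs while the
    workspace is active, while the config survives as durable state."""
    def __init__(self, config: WorkspaceConfig):
        self.config = config
        self.running = True

    def stop(self) -> str:
        # Release compute; return durable state to persist cheaply.
        self.running = False
        return json.dumps(asdict(self.config))

    @classmethod
    def resume(cls, saved_state: str) -> "Workspace":
        # Recreate compute from the persisted configuration.
        return cls(WorkspaceConfig(**json.loads(saved_state)))
```

The design point is that the persisted state is small and cheap to store, so only active sessions consume compute resources.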
Finally, Domino 4.4 adds support for the Transport Layer Security (TLS) protocol to enable encryption in transit and the ability to mount external Network File System (NFS) volumes from within the Domino file system.
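TLS support of this kind is server-side configuration rather than application code, but as a generic illustration of what "encryption in transit" means to a connecting client, Python's standard `ssl` module builds a verifying TLS context in two lines (this is not Domino-specific configuration):

```python
import ssl

# A client-side TLS context with certificate verification and hostname
# checking enabled by default; pinning the minimum protocol version to
# TLS 1.2 rules out legacy versions with known weaknesses.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2
```

On the storage side, mounting an external NFS volume on Linux is conventionally done with something like `mount -t nfs fileserver:/export/data /mnt/data`; Domino's feature exposes such volumes from within its own file system.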
The move to tighten integration with Git repositories such as GitHub and GitLab comes at a time when the providers of those repositories are signaling their intention to enable DevOps and data science teams to build and deploy AI models more collaboratively. DevOps teams are moving toward incorporating AI models into their workflows to accelerate deployment of applications infused with AI capabilities. It’s not yet clear whether DevOps and MLOps best practices will simply converge or whether the tasks currently handled by MLOps platforms will be absorbed by the continuous integration/continuous delivery (CI/CD) platforms many organizations already have in place.
Elprin noted that most organizations are already challenged when it comes to hiring both data scientists and DevOps engineers. The odds they will find a DevOps specialist who also knows the intricacies of MLOps are very slim, he added.
One way or another, though, organizations are looking to accelerate the rate at which AI-enabled applications are deployed. Today it’s not uncommon for data science teams to take several months to create an AI model that needs to be deployed in a production environment. The challenge is that many application development teams deploy and update applications at a frequency that makes it difficult to align the efforts of data science teams with the pace of application development. As such, DevOps advocates are now making a case for making DevOps platforms more accessible to data science teams.
It’s still early days as far as the building of AI models within enterprise IT organizations is concerned. Most of the processes being employed to build AI models are far from mature. At some point, however, there will need to be more integrated processes spanning data science, developer, and IT operations teams as the number of AI models being deployed and updated in production environments continues to steadily increase.
VentureBeat