NVIDIA AI Workbench emerges as a comprehensive solution designed to support development efforts in both enterprise and private environments, offering a versatile and powerful platform for driving innovation in artificial intelligence.

NVIDIA is not simply launching another tool to power private artificial intelligence: in this space it is essential to have tools that are effective not only in terms of performance and capacity, but also in terms of privacy protection and control over sensitive enterprise data.

In this regard, NVIDIA AI Workbench not only provides the tools needed to develop and deploy advanced AI models, but also puts "start local, scale global" at the center of its strategy.

In this article, we will explore AI Workbench's features in detail, highlighting how the platform can help companies drive innovation, improve operational efficiency, and protect data privacy, while leveraging the full potential of AI to achieve their business goals by scaling and publishing to Git environments.
Downloading and installing AI Workbench is a simple, straightforward process, designed so that users can start using the platform quickly and without unnecessary complications.

First, download the installer from the product page (Tools for AI Workstations | NVIDIA) and review the requirements; on Windows, the main one is a WSL2 distribution.
After installing the WSL2 distribution, you can choose between different container runtimes. In my case I chose Docker.
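If you also pick Docker, it is worth confirming that the daemon is reachable before going further. A minimal sketch using the `docker` Python SDK (my own addition; the official tooling relies on Docker Desktop and the CLI instead):

```python
import docker  # pip install docker

# Connect to the local Docker daemon using the default environment settings
client = docker.from_env()

# ping() returns True when the daemon answers, which means the runtime is usable
print("Docker reachable:", client.ping())
print("Server version:", client.version()["Version"])
```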
After execution, the required container is brought up on port 10000. If this step fails, check that no other process is already using that port; if one is, end that task and reopen AI Workbench.
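To diagnose the port conflict quickly, you can try to bind port 10000 yourself; if the bind fails, something else already owns it. A small sketch (again my own addition, not part of the installer):

```python
import socket

# Try to bind the port AI Workbench uses; failure means another process holds it
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
    try:
        s.bind(("127.0.0.1", 10000))
        print("Port 10000 is free")
    except OSError:
        print("Port 10000 is in use; stop the conflicting process and reopen AI Workbench")
```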
AI Workbench lets us run our projects in both local and remote environments, making it easier for users to scale their processes.

In local environments, teams keep full control and can start projects quickly and with agility.

Remote environments, on the other hand, offer greater flexibility and make it easier to scale workloads by centralizing the workspace in either a private or a public cloud.
In private cloud environments such as those provided by VMware, the workloads stay in the datacenter itself, taking advantage of local servers or dedicated GPU clusters and ensuring complete control of the development cycle.

Integration with public cloud environments, in turn, lets users take advantage of the scalability and flexibility of cloud services to run AI workloads efficiently, without having to worry about infrastructure management.
AI Workbench is organized around projects, which can be created from scratch or cloned.

For new projects, NVIDIA offers a small selection of containers already prepared to run in the tool, or you can use a custom container.
The NGC Catalog offers a wide variety of containers, created and supported by NVIDIA, which include deep learning frameworks, software libraries, optimization tools, and pre-trained models, all designed to take full advantage of GPU performance and ease of deployment. For now, quite a few of these options are not yet available for AI Workbench; they may be adapted in the future.
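As a point of reference, NGC containers are ordinary Docker images hosted on nvcr.io, so outside the tool you can pull one directly; the tag below follows NGC's YY.MM-py3 scheme but is an assumption on my part and should be checked against the catalog:

```python
import docker  # pip install docker

client = docker.from_env()

# Pull an NGC deep learning container; the tag here is illustrative only,
# so check the NGC Catalog for the current release before running this.
image = client.images.pull("nvcr.io/nvidia/pytorch", tag="24.01-py3")
print(image.tags)
```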
Within a project we have different administration and management options. The Environment section gathers the most relevant information:
Packages: add packages to the environment; in this test case I added keras-core to be able to use Keras on PyTorch (see the sketch after this list).
Environment variables
Secrets
Applications: both JupyterLab and custom applications, such as VS Code.
Mounts: provide a way to access and keep files outside the project container; when a container is stopped, any data not written to a mount is lost.
Hardware: number of GPUs and shared memory.
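Since I added keras-core above, here is a minimal sketch of how it runs Keras on top of PyTorch: the backend is selected through the KERAS_BACKEND environment variable before the import (the model shapes are arbitrary placeholders):

```python
import os

# keras-core reads its backend from this variable; "torch" runs Keras on PyTorch
os.environ["KERAS_BACKEND"] = "torch"

import keras_core as keras

# A tiny model just to confirm the backend works end to end
model = keras.Sequential([
    keras.Input(shape=(32,)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(10),
])
model.compile(
    optimizer="adam",
    loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)

print(keras.backend.backend())  # -> "torch"
```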
NVIDIA also offers several example projects of its own that showcase the tool's capabilities through open-source developments.
In the first project, NVIDIA provides a Gradio interface for performing RAG against Mistral or Llama, either through API calls or locally, in a simple and visual way; you can add your own documents and adjust values such as the maximum response length or the temperature.
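To give an idea of what such an interface looks like, here is a minimal Gradio sketch in the same spirit; the answer function is a hypothetical stand-in for the project's actual RAG pipeline, and the parameter ranges are assumptions:

```python
import gradio as gr


def answer(question: str, max_tokens: int, temperature: float) -> str:
    # Hypothetical stand-in: a real pipeline would retrieve passages from the
    # user's documents and call Mistral or Llama (via API or locally).
    return f"(demo) '{question}' with max_tokens={max_tokens}, temperature={temperature}"


demo = gr.Interface(
    fn=answer,
    inputs=[
        gr.Textbox(label="Question"),
        gr.Slider(16, 1024, value=256, step=16, label="Max response tokens"),
        gr.Slider(0.0, 1.0, value=0.7, step=0.05, label="Temperature"),
    ],
    outputs=gr.Textbox(label="Answer"),
    title="RAG demo",
)

demo.launch()  # serves the interface locally
```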
NVIDIA AI Workbench is a promising tool that offers a convenient platform for AI application development and deployment, especially given its integration with the vendor's other resources. However, a number of unpolished details suggest there is still room for improvement in usability and functionality. In addition, broader compatibility with NGC Catalog containers could significantly expand its usefulness and convenience for developers.
In summary, although it is already a valuable tool, it still has some way to go to reach its full potential, and it should keep promoting open-source projects that build a large community around it.