Docker Debuts Generative AI Stack and Docker AI

Docker's GenAI stack is free of charge and can run locally on developers' systems.

Docker Inc., the company behind the open-source Docker container technology, has unveiled a series of initiatives aimed at helping developers quickly build generative AI applications. The announcement was made at the DockerCon 2023 conference in Los Angeles. Central to these initiatives is the Docker GenAI stack, which integrates Docker with the Neo4j graph database, the LangChain model-chaining framework, and Ollama for running large language models (LLMs).

What the Docker GenAI Stack Offers Developers

The Docker GenAI stack is designed to simplify the development of GenAI applications. Such applications typically need a vector database, which Neo4j now provides as part of its graph database platform. They also need an LLM, which Ollama supplies by making it easy to run models such as Llama 2 locally. And because modern applications often chain multiple steps together, LangChain's framework rounds out the stack.
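To make the division of labor concrete, here is a minimal, dependency-free Python sketch of the pattern these three components implement: a toy in-memory vector store standing in for Neo4j, a stub function standing in for an Ollama-served LLM, and a two-step chain in the spirit of LangChain. All names and data here are illustrative assumptions, not the stack's actual API.

```python
from math import sqrt

# Toy corpus with hand-made 3-d embeddings. In the real stack, an embedding
# model would produce these vectors and Neo4j's vector index would store them.
DOCS = {
    "Docker packages apps into containers.": [1.0, 0.0, 0.0],
    "Neo4j is a graph database.":            [0.0, 1.0, 0.0],
    "Ollama runs LLMs locally.":             [0.0, 0.0, 1.0],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b))
    return dot / norm

def retrieve(query_vec):
    """Vector-database step: return the document nearest the query vector."""
    return max(DOCS, key=lambda doc: cosine(DOCS[doc], query_vec))

def stub_llm(prompt):
    """LLM step: a deterministic stand-in for a model served by Ollama."""
    return f"Answer based on: {prompt}"

def chain(query_vec, question):
    """Chaining step: retrieval output feeds the prompt, the prompt feeds the LLM."""
    context = retrieve(query_vec)
    prompt = f"Context: {context}\nQuestion: {question}"
    return stub_llm(prompt)

print(chain([0.9, 0.1, 0.0], "What does Docker do?"))
```

The point of the sketch is the shape of the pipeline, not the pieces themselves: swap the dictionary for Neo4j, the stub for an Ollama endpoint, and the hand-rolled chain for LangChain, and you have the architecture the stack pre-wires.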

The Docker GenAI stack is designed to alleviate the complexities of configuring these diverse elements in containers, making the entire process more streamlined. Docker's CEO, Scott Johnston, emphasized the stack's readiness, stating, “It's pre-configured, it's ready to go and they [developers] can start coding and experimenting to help get the ball rolling.”
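The kind of pre-configured Compose project Johnston describes might look something like the following sketch. The `neo4j` and `ollama/ollama` images are the publicly published ones; the service names, ports, credentials, and `app` service are illustrative assumptions, not the stack's actual configuration.

```yaml
# Illustrative sketch only; not the GenAI stack's real Compose file.
services:
  neo4j:                            # graph database with vector search
    image: neo4j:5
    environment:
      - NEO4J_AUTH=neo4j/password   # placeholder credentials
    ports:
      - "7687:7687"                 # Bolt protocol
  ollama:                           # serves local LLMs such as Llama 2
    image: ollama/ollama
    ports:
      - "11434:11434"               # Ollama HTTP API
  app:                              # hypothetical LangChain application
    build: .
    depends_on:
      - neo4j
      - ollama
```

With a file like this, a single `docker compose up` brings up all three pieces together, which is the configuration burden the stack aims to take off developers' plates.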

The full Docker GenAI stack is free of charge and can run locally on developers' systems. Docker intends to make deployment and commercial support options available to enterprises as their applications progress.

Docker AI: Developers' ‘Mech Suit'

Setting itself apart from the abundance of generative AI developer tools on the market, Docker is launching its own specialized GenAI tool, known as Docker AI. While other vendors use the term ‘copilot,' Docker prefers the metaphor of a ‘mech suit,' emphasizing the extra power and proficiency developers can attain.

Docker AI is trained on Docker's proprietary data accumulated from millions of Dockerfiles, Compose files, and error logs, providing a resource capable of rectifying errors directly within developers' workflows. The tool aims to make troubleshooting and fixing issues more manageable, improving the developer experience.

Johnston explained that Docker AI plays a unique role in container development, drawing on extensive Docker data that no other LLM has been trained on. With a dedicated focus on developers over the past few years, Docker is positioning itself to support the future of GenAI application development.