
TAIA - TNG’s Secure AI Assistant

TNG Technology Consulting

TAIA is a secure AI Assistant for enterprises. It serves as a central hub for accessing the best commercial and/or privately hosted Large Language Models. With its strong focus on privacy, it lets you seamlessly integrate the latest AI-based productivity tools without compromising data security. Make sure to try out our demo at https://register.taia.stackit.gg/ (or click the link at the bottom of this page).



Overview

Vendor

TNG Technology Consulting

Delivery method

Kubernetes

Categories

Business Applications, Machine Learning

Product description

TAIA is a cutting-edge AI assistant for enterprises, enabling knowledge workers to interact with a variety of Large Language Models (LLMs) through a central hub. Users can choose from top-tier commercial models to stay up to date with the latest developments, or opt for best-in-class self-hosted open-source models for enhanced data privacy. Accessible directly from the browser, TAIA supports multiple input modalities, including text, speech, and images.

TAIA offers many features to enhance interactions:

AI-based productivity tools: Summarize or translate texts, understand or generate images, write or refactor code, and more with TAIA's specialized tools.
Augmented by external data: Extend the LLM's knowledge by referencing external sources such as PDFs, Excel files, websites, or Google Search results.
Text and audio interfaces: Process prompts in audio and text form for convenient use on laptops and mobile devices.
Image generation and editing: Create images in various styles from natural-language prompts using state-of-the-art models. TAIA optimizes image prompts for better results and offers an "Edit Image" tool to modify specific parts of an image.

TAIA consists of several components. Through a web frontend, users initiate conversations and access the latest AI-based productivity tools centrally from a browser. In an additionally provisioned GPU cluster, freely available LLMs from huggingface.co can be self-hosted; for this, TAIA uses TGI or vLLM as inference services. An additional proxy service routes inference requests between the web frontend and the GPU cluster. In this setup, user data is stored solely in the end users' local browser cache, ensuring a high level of data protection. Furthermore, access to commercial model providers or other third-party services can be configured with separate API credentials.

TAIA is deployed as a Kubernetes deployment via Helm charts. All resources used are created directly in the running STACKIT account.
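For orientation only: vLLM, one of the inference services mentioned above, exposes an OpenAI-compatible chat completions endpoint. The minimal Python sketch below shows what a request to such a self-hosted endpoint behind an inference proxy could look like. The proxy URL, model name, and credentials are placeholders and assumptions for illustration, not TAIA's documented interface.

import requests

# Hypothetical endpoint of an inference proxy in front of the GPU cluster;
# TAIA's actual routes and authentication scheme may differ.
PROXY_URL = "https://taia-proxy.example.internal/v1/chat/completions"
API_KEY = "replace-with-your-credentials"  # placeholder

payload = {
    # Placeholder name for a self-hosted open-source model served via vLLM or TGI.
    "model": "meta-llama/Meta-Llama-3-8B-Instruct",
    "messages": [
        {"role": "user", "content": "Summarize the attached meeting notes in three bullet points."},
    ],
    "temperature": 0.2,
}

response = requests.post(
    PROXY_URL,
    json=payload,
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=60,
)
response.raise_for_status()

# An OpenAI-compatible response carries the generated text under choices[0].message.content.
print(response.json()["choices"][0]["message"]["content"])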




About the vendor

TNG is a values-based consulting partnership focused on high-end information technology.