YG3 Wiki

Welcome to the YG3 Wiki, your guide to creating the impossible.


What Is YG3?


YG3 is what you use when you’ve outgrown generic AI. It’s the platform you reach for when you want to customize intelligence: when your business needs more nuanced reasoning, domain-specific context, or tightly integrated automation that goes beyond prompt engineering.

At the core of YG3 is the ability to build, own, and deploy Concentrated Language Models (CLMs): focused intelligence engines designed specifically for your workflows.

YG3 transforms AI from a tool you borrow… into an asset you control.


What Are CLMs?

A Concentrated Language Model (CLM) is a purpose-trained intelligence system built for a specific business domain, operational workflow, or knowledge environment.


Where general-purpose LLMs spread their intelligence wide, a CLM concentrates it on your specific domain.

Think of CLMs as your internal brain, not just another AI endpoint.

Feature    | General LLM                | CLM (YG3)
Scope      | Broad and generic          | Narrow, domain-focused
Memory     | Ephemeral                  | Persistent workspace memory
Ownership  | Vendor-locked              | You can own and deploy your CLM
Stability  | Drifts with model updates  | Stable, instance-locked behavior

Multiple Models, One Intelligence Layer

YG3 provides several CLM-powered models tuned for different modalities.

These models power both the platform and the API, depending on how you choose to build.

You don’t need to learn different systems — every model flows into the same workspace intelligence layer that learns from you over time.


Two Ways to Use YG3

1. The Platform

YG3 provides a visual environment where you can run and customize prebuilt workflows powered by CLMs.

Once you create an account and log in, you’ll have access to modules across suites like Marketing, Analytics, and Operations.

All of these modules are driven by your workspace’s CLM — so it learns your style, context, and priorities.


2. The API

For direct integration, YG3 provides an OpenAI-compatible API powered by the same CLMs that run the platform.

Base URL: https://elysia-api.ngrok.io/api/public/v1
Model ID: select from the available models (the examples below use elysia)

The full list of available model IDs is documented in the YG3 API Documentation.

You can use these models inside your own products, agents, automations, and backend systems with just a few lines of code.

Example (Python):

from openai import OpenAI

# Point the standard OpenAI client at the YG3 endpoint
client = OpenAI(api_key="YOUR-API-KEY", base_url="https://elysia-api.ngrok.io/api/public/v1")

# Ask the elysia model a question and print the reply
response = client.chat.completions.create(
    model="elysia",
    messages=[{"role": "user", "content": "Help me optimize my marketing funnel"}],
)
print(response.choices[0].message.content)

The API supports LangChain, LlamaIndex, streaming, and structured outputs — everything you’d expect from a production-grade intelligence layer.
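
For example, streaming works the same way it does against any other OpenAI-compatible endpoint. The sketch below is a minimal example, assuming the standard OpenAI Python SDK streaming interface and reusing the elysia model ID from the example above.

Example (Python, streaming):

from openai import OpenAI

client = OpenAI(api_key="YOUR-API-KEY", base_url="https://elysia-api.ngrok.io/api/public/v1")

# stream=True yields incremental chunks instead of one final message
stream = client.chat.completions.create(
    model="elysia",
    messages=[{"role": "user", "content": "Draft a launch email for our new feature"}],
    stream=True,
)

# Print tokens as they arrive
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)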


🚀 Getting Started

  1. Go to app.yg3.ai

  2. Create your account and log in

  3. Explore prebuilt workflows under the Marketing, Analytics, or Operations suites

  4. Or grab your API key and start building your own agents; see the YG3 API Documentation and the sketch after this list

  5. Your CLM will begin learning from everything you create, upload, and automate
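
Because the API is OpenAI-compatible, the same endpoint plugs straight into agent frameworks. The sketch below is a minimal LangChain example, assuming the langchain-openai package and the elysia model ID from the API example above; swap in your own key and prompt.

Example (Python, LangChain):

from langchain_openai import ChatOpenAI

# Point LangChain's OpenAI-compatible chat model at the YG3 endpoint
llm = ChatOpenAI(
    model="elysia",
    api_key="YOUR-API-KEY",
    base_url="https://elysia-api.ngrok.io/api/public/v1",
)

# invoke() sends a single chat turn and returns an AIMessage
reply = llm.invoke("Summarize this quarter's support tickets into three action items")
print(reply.content)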


🔗 Learn More


YG3 exists to make AI yours — controllable, reliable, and truly intelligent.
