Sovereign AI for Every Region
RCT Ecosystem integrates Typhoon (by SCB10X) as G38 — the Regional Thai model — and provides a plug-in architecture for developers to swap in any country's sovereign LLM.
Typhoon G38 — Regional Thai Model
Typhoon v2 70B by SCB10X (Thailand's Frontier AI Research Lab) is the most capable Thai LLM. Typhoon-S (4B) scored 78.02% on Thai legal tasks vs GPT-5's 75.34%. Integrated as model_id="scb10x/typhoon-v2-70b-instruct" with proficiency th=0.99.
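The registration described above can be sketched as a small config object. This is a hypothetical illustration, not the RCT Ecosystem API: only the model_id and the th=0.99 proficiency come from the text; the RegionalModel class and its field names are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class RegionalModel:
    """Hypothetical record for a sovereign/regional LLM entry."""
    model_id: str                # HuggingFace repo or API endpoint
    role: str                    # e.g. "regional_thai" (G38)
    proficiency: dict = field(default_factory=dict)  # language code -> score in [0, 1]

# G38 entry, using the values stated in the text
typhoon_g38 = RegionalModel(
    model_id="scb10x/typhoon-v2-70b-instruct",
    role="regional_thai",
    proficiency={"th": 0.99},
)
```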
Supported Regional LLMs
G38 Typhoon is active. "Pluggable" models can be added in 3 steps using the open plug-in architecture.
How to Plug in Your Regional LLM
The AdapterRegistry supports max_adapters=50 with runtime register() / unregister() — no restart required.
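A minimal sketch of the registry behavior described above. Only the class name, the register()/unregister() methods, and the max_adapters=50 cap come from the text; everything else (internal dict, error handling) is an assumption about how such a registry could work.

```python
class AdapterRegistry:
    """Hypothetical in-process registry; adapters change at runtime, no restart."""
    MAX_ADAPTERS = 50
    _adapters: dict = {}

    @classmethod
    def register(cls, model_id: str, task_types=None, max_adapters: int = MAX_ADAPTERS):
        # Enforce the capacity cap before adding a new adapter
        if len(cls._adapters) >= max_adapters:
            raise RuntimeError(f"adapter limit {max_adapters} reached")
        cls._adapters[model_id] = {"task_types": list(task_types or [])}

    @classmethod
    def unregister(cls, model_id: str):
        # Remove at runtime; unknown ids are ignored
        cls._adapters.pop(model_id, None)
```

Usage: AdapterRegistry.register("elyza/Llama-3-ELYZA-JP-8B", task_types=["regional_jp"]) adds the adapter immediately, and unregister() with the same id removes it, with no process restart in between.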
1. Add your regional model to the ModelRole enum in hexa_core_registry.py — e.g. ModelRole.REGIONAL_JP = "regional_jp".
2. Call AdapterRegistry.register(model_id, max_adapters=50) with your model's HuggingFace or API endpoint.
3. Add routing rules to routing_map — JITNA automatically selects your model for tasks that match the configured task types.
# hexa_core_registry.py — Add your regional model
from enum import Enum

class ModelRole(str, Enum):
    REGIONAL_THAI = "regional_thai"  # G38 — already active
    REGIONAL_JP = "regional_jp"      # Add this for Japan

# Register the adapter at runtime (no restart needed)
AdapterRegistry.register(
    model_id="elyza/Llama-3-ELYZA-JP-8B",
    task_types=[TaskType.REGIONAL_JP],
    max_adapters=50,
)

Why Typhoon + RCT = Complementary, Not Competing
Typhoon trains models
SCB10X trains LLMs specifically for Thai language and local context — producing models that understand Thai nuances better than general-purpose models.
RCT orchestrates models
RCT uses JITNA + SignedAI + RCTDB to route, verify, and persist results from Typhoon — putting Thai-language AI intelligence inside enterprise-grade infrastructure.
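The routing step JITNA performs can be sketched as a lookup over routing_map. This is a simplified illustration under assumptions: routing_map and the route() helper are hypothetical; only the Typhoon model_id, the ELYZA example id, and the task-type names come from the text.

```python
# Hypothetical routing table: task type -> model_id
routing_map = {
    "regional_thai": "scb10x/typhoon-v2-70b-instruct",  # G38, active
    "regional_jp": "elyza/Llama-3-ELYZA-JP-8B",         # pluggable example
}

def route(task_type: str, default: str = "general") -> str:
    """Return the model_id configured for a task type, else a general model."""
    return routing_map.get(task_type, default)
```

Usage: route("regional_thai") returns "scb10x/typhoon-v2-70b-instruct", so Thai-language tasks land on Typhoon while unmatched task types fall back to a general-purpose model.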