ChatTogetherAI
This will help you get started with ChatTogetherAI chat
models. For detailed documentation of all
ChatTogetherAI features and configurations, head to the API
reference.
Overview
Integration details
| Class | Package | Local | Serializable | PY support | Package downloads | Package latest | 
|---|---|---|---|---|---|---|
| ChatTogetherAI | @langchain/community | ❌ | ✅ | ✅ |  |  | 
Model features
| Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Token usage | Logprobs | 
|---|---|---|---|---|---|---|---|---|
| ✅ | ✅ | ✅ | ✅ | ❌ | ❌ | ✅ | ✅ | ✅ | 
Setup
To access ChatTogetherAI models you’ll need to create a Together
account, get an API key, and install
the @langchain/community integration package.
Credentials
Head to api.together.ai to sign up for
TogetherAI and generate an API key. Once you’ve done this, set the
TOGETHER_AI_API_KEY environment variable:
export TOGETHER_AI_API_KEY="your-api-key"
If you want automated tracing of your model calls, you can also set your LangSmith API key by uncommenting the lines below:
# export LANGCHAIN_TRACING_V2="true"
# export LANGCHAIN_API_KEY="your-api-key"
Installation
The LangChain ChatTogetherAI integration lives in the
@langchain/community package:
- npm: npm i @langchain/community
- yarn: yarn add @langchain/community
- pnpm: pnpm add @langchain/community
Instantiation
Now we can instantiate our model object and generate chat completions:
import { ChatTogetherAI } from "@langchain/community/chat_models/togetherai";
const llm = new ChatTogetherAI({
  model: "mistralai/Mixtral-8x7B-Instruct-v0.1",
  temperature: 0,
  maxTokens: undefined,
  timeout: undefined,
  maxRetries: 2,
  // other params...
});
Invocation
const aiMsg = await llm.invoke([
  [
    "system",
    "You are a helpful assistant that translates English to French. Translate the user sentence.",
  ],
  ["human", "I love programming."],
]);
aiMsg;
AIMessage {
  "id": "chatcmpl-9rT9qEDPZ6iLCk6jt3XTzVDDH6pcI",
  "content": "J'adore la programmation.",
  "additional_kwargs": {},
  "response_metadata": {
    "tokenUsage": {
      "completionTokens": 8,
      "promptTokens": 31,
      "totalTokens": 39
    },
    "finish_reason": "stop"
  },
  "tool_calls": [],
  "invalid_tool_calls": [],
  "usage_metadata": {
    "input_tokens": 31,
    "output_tokens": 8,
    "total_tokens": 39
  }
}
console.log(aiMsg.content);
J'adore la programmation.
Chaining
We can chain our model with a prompt template like so:
import { ChatPromptTemplate } from "@langchain/core/prompts";
const prompt = ChatPromptTemplate.fromMessages([
  [
    "system",
    "You are a helpful assistant that translates {input_language} to {output_language}.",
  ],
  ["human", "{input}"],
]);
const chain = prompt.pipe(llm);
await chain.invoke({
  input_language: "English",
  output_language: "German",
  input: "I love programming.",
});
AIMessage {
  "id": "chatcmpl-9rT9wolZWfJ3xovORxnkdf1rcPbbY",
  "content": "Ich liebe das Programmieren.",
  "additional_kwargs": {},
  "response_metadata": {
    "tokenUsage": {
      "completionTokens": 6,
      "promptTokens": 26,
      "totalTokens": 32
    },
    "finish_reason": "stop"
  },
  "tool_calls": [],
  "invalid_tool_calls": [],
  "usage_metadata": {
    "input_tokens": 26,
    "output_tokens": 6,
    "total_tokens": 32
  }
}
Tool calling & JSON mode
The TogetherAI chat model supports JSON mode and tool calling.
Tool calling
import { ChatTogetherAI } from "@langchain/community/chat_models/togetherai";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { convertToOpenAITool } from "@langchain/core/utils/function_calling";
import { Calculator } from "@langchain/community/tools/calculator";
// Use a pre-built tool
const calculatorTool = convertToOpenAITool(new Calculator());
const modelWithCalculator = new ChatTogetherAI({
  temperature: 0,
  // This is the default env variable name it will look for if none is passed.
  apiKey: process.env.TOGETHER_AI_API_KEY,
  // Together JSON mode/tool calling only supports a select number of models
  model: "mistralai/Mixtral-8x7B-Instruct-v0.1",
}).bind({
  // Bind the tool to the model.
  tools: [calculatorTool],
  tool_choice: calculatorTool, // Specify what tool the model should use
});
const promptForTools = ChatPromptTemplate.fromMessages([
  ["system", "You are a super not-so-smart mathmatician."],
  ["human", "Help me out, how can I add {math}?"],
]);
// Use LCEL to chain the prompt to the model.
const responseWithTool = await promptForTools.pipe(modelWithCalculator).invoke({
  math: "2 plus 3",
});
console.dir(responseWithTool.tool_calls, { depth: null });
[
  {
    name: 'calculator',
    args: { input: '2 + 3' },
    type: 'tool_call',
    id: 'call_nhtnmganqJPAG9I1cN8ULI9R'
  }
]
Behind the scenes, TogetherAI uses the OpenAI SDK and an OpenAI-compatible API, with some caveats:
- Certain properties are not supported by the TogetherAI API, see here.
API reference
For detailed documentation of all ChatTogetherAI features and configurations head to the API reference: https://api.js.langchain.com/classes/langchain_community_chat_models_togetherai.ChatTogetherAI.html