Tekimax SDK

The SDK ships two React hooks from tekimax-omat/react:

  • useChat — streaming chat interface with tool loops, abort, and optimistic updates
  • useAssessment — rubric-grounded formative feedback with streaming and history

Installation

Code
npm install tekimax-omat react

useChat

Manages message state, streaming, tool execution, and abort handling for chat interfaces.

Setup

Initialize the provider outside your component (or memoize) to avoid re-creating the client on every render.

Code
import { Tekimax, OpenAIProvider } from 'tekimax-omat';

const provider = new OpenAIProvider({
  apiKey: process.env.NEXT_PUBLIC_OPENAI_API_KEY!,
  // Required for browser usage. In production, proxy through your backend.
  dangerouslyAllowBrowser: true,
});

const client = new Tekimax({ provider });

Security: In production, proxy AI requests through your backend. Never expose secret API keys in client-side code.
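As a hedged illustration of that proxy pattern, the helper below shows the secret key being attached server-side only. The `buildUpstreamRequest` name and the request shape are our own assumptions for demonstration (modeled on a standard chat-completions endpoint), not part of the SDK:

```typescript
// Hypothetical server-side proxy helper: the browser calls your own
// /api/chat route, and the server adds the secret key before forwarding.
type ChatBody = { model: string; messages: { role: string; content: string }[] };

export function buildUpstreamRequest(body: ChatBody, apiKey: string) {
  return {
    url: 'https://api.openai.com/v1/chat/completions',
    init: {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        Authorization: `Bearer ${apiKey}`, // secret never leaves the server
      },
      body: JSON.stringify(body),
    },
  };
}
```

With a route like this in place, the client can be constructed without `dangerouslyAllowBrowser` and without any key in the bundle.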

Basic Chat UI

Code
import { useChat } from 'tekimax-omat/react';

export function ChatComponent() {
  const {
    messages,          // Message[] — full conversation history
    input,             // Current input string
    handleInputChange, // onChange handler for <input> / <textarea>
    handleSubmit,      // onSubmit handler for <form>
    append,            // Programmatically add a message
    isLoading,         // true while streaming
    stop,              // Abort the current request
    setMessages,       // Override message history (e.g. clear chat)
  } = useChat({
    client,
    model: 'gpt-4o',
    onFinish: (message) => console.log('Done:', message.content),
    onError: (error) => console.error('Chat error:', error),
  });

  return (
    <div className="flex flex-col h-[500px]">
      <div className="flex-1 overflow-y-auto p-4 space-y-4">
        {messages.map((m, i) => (
          <div
            key={i}
            className={`p-4 rounded-lg ${
              m.role === 'user'
                ? 'bg-blue-100 ml-auto max-w-[80%]'
                : 'bg-gray-100 max-w-[80%]'
            }`}
          >
            <strong className="text-sm text-gray-500">
              {m.role === 'user' ? 'You' : 'AI'}
            </strong>
            <p className="whitespace-pre-wrap mt-1">{m.content as string}</p>
          </div>
        ))}
      </div>
      <form onSubmit={handleSubmit} className="p-4 border-t flex gap-2">
        <input
          className="flex-1 p-2 border rounded"
          value={input}
          onChange={handleInputChange}
          placeholder="Ask something…"
          disabled={isLoading}
        />
        <button
          type="submit"
          disabled={isLoading}
          className="px-4 py-2 bg-blue-600 text-white rounded"
        >
          Send
        </button>
        {isLoading && (
          <button type="button" onClick={stop} className="px-4 py-2 border rounded">
            Stop
          </button>
        )}
      </form>
    </div>
  );
}

Programmatic Messages with append

Code
// String shorthand — auto-wrapped as a user message
await append('What job training programs do you offer?');

// Full Message object
await append({ role: 'user', content: 'I have a background in food service' });
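The string shorthand can be understood as a simple normalization step. This sketch is our own illustration of the assumed behavior, not the hook's actual source:

```typescript
// Illustrative normalization: strings become user messages, full
// Message objects pass through unchanged. The Message type here is a
// simplified assumption; the SDK's type may carry more fields.
type Message = { role: 'user' | 'assistant' | 'system' | 'tool'; content: string };

export function normalizeMessage(input: string | Message): Message {
  return typeof input === 'string' ? { role: 'user', content: input } : input;
}
```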

Tool Calling

Code
const { messages, handleSubmit, input, handleInputChange, isLoading } = useChat({
  client,
  model: 'gpt-4o',
  tools: {
    search_programs: {
      type: 'function',
      function: {
        name: 'search_programs',
        description: 'Search available training programs by criteria',
        parameters: {
          type: 'object',
          properties: {
            category: { type: 'string', description: 'e.g. technology, healthcare, trades' },
            city: { type: 'string' },
            costLimit: { type: 'number', description: 'Max cost in USD' },
          },
          required: ['category'],
        },
      },
      execute: async ({ category, city, costLimit }) => {
        const res = await fetch(`/api/programs?category=${category}&city=${city ?? ''}`);
        return res.json();
      },
    },
  },
});

When the model returns tool calls, the hook executes them, appends results, and re-sends — all automatically.
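That execute-append-resend cycle can be sketched as a small loop. Everything below (the `ModelTurn` shape, `runToolLoop`, the transcript format) is an assumption made for illustration, not the SDK's internals:

```typescript
// Illustrative tool loop: call the model, execute any tool calls it
// requests, append the results to the transcript, and call again until
// the model produces a final answer.
type ToolCall = { name: string; args: Record<string, unknown> };
type ModelTurn =
  | { type: 'tool_calls'; calls: ToolCall[] }
  | { type: 'answer'; content: string };
type Tools = Record<string, (args: Record<string, unknown>) => unknown>;

export function runToolLoop(
  model: (transcript: unknown[]) => ModelTurn,
  tools: Tools,
  transcript: unknown[] = [],
): string {
  for (;;) {
    const turn = model(transcript);
    if (turn.type === 'answer') return turn.content; // done
    for (const call of turn.calls) {
      const result = tools[call.name](call.args);     // execute the tool
      transcript.push({ role: 'tool', name: call.name, result }); // append
    }
    // loop re-sends the enriched transcript
  }
}
```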

Reasoning Models

Code
const { messages } = useChat({
  client,
  model: 'deepseek-r1',
  think: true, // Capture chain-of-thought during streaming
});

// Access reasoning separately from the final answer
messages.forEach((m) => {
  if (m.thinking) console.log('Reasoning:', m.thinking);
  console.log('Answer:', m.content);
});
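Many reasoning models emit their chain-of-thought wrapped in `<think>` tags, which is plausibly what the `thinking` field is parsed from. The splitter below is a hedged sketch of that idea, not the SDK's actual parsing:

```typescript
// Hypothetical splitter: separates <think>…</think> reasoning from the
// final answer text. Real model output framing may differ.
export function splitThinking(raw: string): { thinking: string; content: string } {
  const match = raw.match(/<think>([\s\S]*?)<\/think>/);
  if (!match) return { thinking: '', content: raw.trim() };
  return {
    thinking: match[1].trim(),            // the captured reasoning
    content: raw.replace(match[0], '').trim(), // everything else
  };
}
```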

Provider Options

Code
// Option A: Tekimax client (recommended — full namespace access)
useChat({ client: new Tekimax({ provider }), model: 'gpt-4o' });

// Option B: Raw provider (advanced — custom providers without the Tekimax wrapper)
useChat({ adapter: provider, model: 'gpt-4o' });

useAssessment

Wraps AssessmentPipeline with React state, streaming support, and abort handling. For full OMAT setup (rubric design, pipeline config, plugins), see the Assessment Guide.

Code
import { useState } from 'react';
import { useAssessment } from 'tekimax-omat/react';
import { AssessmentPipeline, OpenAIProvider } from 'tekimax-omat';

const pipeline = new AssessmentPipeline({
  provider: new OpenAIProvider({
    apiKey: process.env.NEXT_PUBLIC_OPENAI_API_KEY!,
    dangerouslyAllowBrowser: true,
  }),
  rubric: myRubric,
  model: 'gpt-4o',
});

export function AssessmentWidget() {
  const [text, setText] = useState('');
  const { feedback, isAssessing, streamedText, history, assess, stop, reset } = useAssessment({
    pipeline,
    streaming: false, // true = real-time text stream; false = structured feedback on completion
    onFeedback: (f) => console.log('Score:', f.normalizedScore),
    onError: (e) => console.error(e),
  });

  const handleSubmit = () => {
    assess({ id: crypto.randomUUID(), modality: 'text', text });
  };

  return (
    <div>
      <textarea
        value={text}
        onChange={(e) => setText(e.target.value)}
        placeholder="Enter your response…"
        rows={8}
      />
      <div className="flex gap-2 mt-2">
        <button onClick={handleSubmit} disabled={isAssessing || !text.trim()}>
          {isAssessing ? 'Assessing…' : 'Get Feedback'}
        </button>
        {isAssessing && <button onClick={stop}>Stop</button>}
        {history.length > 0 && <button onClick={reset}>Reset</button>}
      </div>
      {feedback && (
        <div className="mt-4 space-y-4">
          <div className="p-4 bg-blue-50 rounded">
            <p className="font-medium">{feedback.overall}</p>
            <p className="text-sm text-gray-600 mt-1">
              Score: {(feedback.normalizedScore * 100).toFixed(0)}%
            </p>
          </div>
          <div>
            <h4 className="font-medium mb-1">What's working</h4>
            <ul className="list-disc list-inside text-sm space-y-1">
              {feedback.strengths.map((s, i) => <li key={i}>{s}</li>)}
            </ul>
          </div>
          <div>
            <h4 className="font-medium mb-1">Next steps</h4>
            <ul className="list-disc list-inside text-sm space-y-1">
              {feedback.nextSteps.map((s, i) => <li key={i}>{s}</li>)}
            </ul>
          </div>
          <p className="text-sm italic text-blue-700">{feedback.encouragement}</p>
        </div>
      )}
    </div>
  );
}

useAssessment Options

| Option     | Type                            | Default  | Description                      |
| ---------- | ------------------------------- | -------- | -------------------------------- |
| pipeline   | AssessmentPipeline              | required | Configured pipeline instance     |
| streaming  | boolean                         | false    | Stream text chunks in real time  |
| onFeedback | (f: FormativeFeedback) => void  |          | Called when feedback is complete |
| onError    | (e: Error) => void              |          | Called on error                  |

useAssessment Return Values

| Value        | Type                                         | Description                             |
| ------------ | -------------------------------------------- | --------------------------------------- |
| feedback     | FormativeFeedback \| null                    | Latest complete feedback                |
| isAssessing  | boolean                                      | True while a request is in flight       |
| streamedText | string                                       | Accumulated stream (streaming mode only)|
| history      | FormativeFeedback[]                          | All feedback this session               |
| assess       | (response: StudentResponse) => Promise<void> | Submit a response                       |
| stop         | () => void                                   | Cancel the in-flight request            |
| reset        | () => void                                   | Clear all state                         |
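Both hooks expose a `stop()` that cancels the in-flight request. A plausible way to implement that pairing is with `AbortController`; this is an assumption about the internals, sketched for illustration:

```typescript
// Hedged sketch: start work immediately, hand back a stop() that aborts
// the shared signal. The makeStoppable name is ours, not the SDK's.
export function makeStoppable<T>(
  start: (signal: AbortSignal) => Promise<T>,
): { result: Promise<T>; stop: () => void } {
  const controller = new AbortController();
  return {
    result: start(controller.signal), // work begins right away
    stop: () => controller.abort(),   // cancels the in-flight work
  };
}
```

Any signal-aware API (like `fetch(url, { signal })`) passed through `start` is then cancelled when `stop()` fires.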
