1.7.4 (2026-03-23)
1.7.3 (2026-03-21)
1.7.2 (2026-03-20)
ResilientLLM.chat() now always returns a consistent envelope object: { content, toolCalls?, metadata } (metadata is no longer gated by returnOperationMetadata).
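One way to handle the optional toolCalls field is sketched below. The envelope shape { content, toolCalls?, metadata } comes from the note above; the per-call { name, arguments } fields and the get_capital tool are assumptions for illustration, not part of the documented API:

```javascript
// Sketch: dispatch tool calls found in a chat() envelope.
// The envelope shape { content, toolCalls?, metadata } is from the
// 1.7.2 notes; the per-call { name, arguments } fields are assumptions.
const tools = {
  get_capital: ({ country }) => (country === 'France' ? 'Paris' : 'unknown'),
};

function runToolCalls(envelope) {
  // toolCalls may be absent, so fall back to an empty list.
  return (envelope.toolCalls ?? []).map((call) =>
    tools[call.name]?.(call.arguments)
  );
}
```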
Example usage as of v1.7.2:
import { ResilientLLM } from 'resilient-llm';
const llm = new ResilientLLM({
aiService: 'openai',
model: 'gpt-5-nano',
});
const conversationHistory = [
{ role: 'system', content: 'You are a helpful assistant.' },
{ role: 'assistant', content: 'Hi, I am here to help.' },
{ role: 'user', content: 'What is the capital of France?' }
];
try {
const { content, toolCalls, metadata } = await llm.chat(conversationHistory);
console.log('LLM response:', content);
} catch (err) {
console.error('Error:', err);
}
1.7.1 (2026-03-16)
1.7.0 (2026-03-16)
1.6.0 (2026-03-08)
1.4.2 (2026-03-01)
1.4.1 (2026-01-05)
1.4.0 (2026-01-04)
- use node 24 in npm publish action (#50) (9cac13f)
- replace ollamaUrl with baseUrl (#45) (d8c692f)
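The #45 change means self-hosted endpoints are configured via baseUrl rather than ollamaUrl. A minimal sketch of the options object, where aiService and model mirror the example above, and the 'ollama' service name and default port are assumptions about the constructor surface:

```javascript
// Constructor options using the renamed baseUrl field (formerly
// ollamaUrl, per #45). The 'ollama' service value and the default
// Ollama port are assumptions for illustration.
const options = {
  aiService: 'ollama',
  model: 'llama3',
  baseUrl: 'http://localhost:11434',
};

// Then: const llm = new ResilientLLM(options);
```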