# Changelog

## 0.8.4

- Add OpenTelemetry integration for AI.JSX render tracing, which can be enabled by setting the `AIJSX_ENABLE_OPENTELEMETRY` environment variable.
- Throw validation errors when invalid elements (like bare strings) are passed to `ChatCompletion` components.
- Reduce log spam from memoization.
- Fix an issue where the `description` field wasn't passed to function definitions.
- Add support for token-based conversation shrinking via `<Shrinkable>`.
- Move `MdxChatCompletion` to be `MdxSystemMessage`. You can now put this `SystemMessage` in any `ChatCompletion` to prompt the model to give MDX output.
- Add `Converse` and `ShowConversation` components to facilitate streaming conversations.
- Change `ChatCompletion` components to render to `<AssistantMessage>` and `<FunctionCall>` elements.
- Move `memo` to `AI.RenderContext` to ensure that memoized components render once, even if placed under a different context provider.
- Add the `AIJSX_LOG` environment variable to control log level and output location.
- Update `<UseTools>` to take a complete conversation as a `children` prop, rather than as a string `query` prop.
- Update `toTextStream` to accept a `logger`, so you can now see log output when running AI.JSX on the server and outputting to a stream. See AI + UI and Observability.

## 0.5.12

- Update `readme.md` in the `ai-jsx` package to fix bugs on the npm landing page.
- Make JIT UI stream rather than appear all at once.
- Use `openai-edge` instead of `@nick.heiner/openai-edge`.
- `ImageGen` now produces an `Image` object, which renders to a URL on the command line but returns an `<img />` tag when used in the browser (React/Next).
- Fix a build-system issue that caused problems for some consumers.
- Remove the need for projects consuming AI.JSX to set `"moduleResolution": "esnext"` in their `tsconfig`.
- Add Weights and Biases integration.
- Fix how environment variables are read: read from both `VAR_NAME` and `REACT_APP_VAR_NAME`, which makes your env vars available to projects using `create-react-app`.
- Add OpenAI client proxy.