BoxLang 🚀 A New JVM Dynamic Language

⚡︎ BoxLang ⚡︎
Dynamic : Modular : Productive

Copyright Since 2023 by Ortus Solutions, Corp
www.boxlang.io | www.ortussolutions.com
Welcome to the BoxLang AI Module. It provides AI capabilities to your BoxLang applications. Several AI providers are supported, such as OpenAI and DeepSeek, with more coming soon.
Here are the settings you can place in your boxlang.json file:
{
    "modules" : {
        "bxai" : {
            // The provider of the AI: openai, deepseek, etc
            "provider" : "openai",
            // The API Key for the provider
            "apiKey" : "",
            // The provider model to use, if any
            "model" : "gpt-4o-mini",
            // The provider properties according to provider, if any
            "properties" : {}
        }
    }
}
This module exposes the following BoxLang functions:

aiChat( messages, model, struct data={}, boolean verbose=false )
: This function allows you to chat with the AI provider and get responses back.

aiChatAsync( messages, model, struct data={}, boolean verbose=false )
: This function allows you to chat with the AI provider asynchronously and gives you back a BoxLang Completable Future.

The arguments are:

messages
: The messages to chat with the AI. This is provider dependent. Please see each provider section for more information.

model
: The model to use for the AI provider. This is provider dependent. Please see each provider section for more information.

data
: The data to pass to the AI provider. This is provider dependent. Please see each provider section for more information.

verbose
: A flag to output verbose information about the AI chat or just the response message.

// Chat with the AI
aiChat( "What is the meaning of life?" );
The OpenAI provider allows you to interact with the OpenAI APIs. Please see the OpenAI API documentation for more information.
You can use the aiChat() function to chat with the OpenAI API. You can find more documentation here: https://platform.openai.com/docs/guides/text-generation
The messages argument can be any of the following:

- A string, in which case a role of user will be used
- A struct with a role and a content key
- An array of structs, each with role and content keys

// Chat with the AI
aiChat( "What is the meaning of life?" );
// Chat with the AI
aiChat( { role="developer", content="What is the meaning of life?" } );
// Chat with the AI
aiChat( [
{ role="developer", content="Be a helpful assistant" },
{ role="user", content="What is the meaning of life?" }
] );
The supported models for OpenAI are:

gpt-4o
: The large model

gpt-4o-mini
: The more affordable but slower model

gpt-4o-turbo
: The turbo model

You can find more information here: https://platform.openai.com/docs/models
The data argument is an arbitrary structure that will be passed to the OpenAI API alongside the top-level body.
// Chat with the AI, passing a model and extra provider options
aiChat( "What is BoxLang?", "gpt-4o-mini", { temperature=0.5, max_tokens=100 } );
Here are some examples of chatting with the AI:
aiChat( "Write a haiku about recursion in programming." );
aiChat( {
"role": "user",
"content": "Write a haiku about recursion in programming."
} );
aiChat( [
{
"role": "developer",
"content": "You are a helpful assistant."
},
{
"role": "user",
"content": "Write a haiku about recursion in programming."
}
] );
// Analyze an image
aiChat( [
{
"role": "user",
"content": [
{
"type": "text",
"text": "What is in this image?"
},
{
"type": "image_url",
"image_url": {
"url": "https://upload.wikimedia.org/wikipedia/commons/thumb/d/dd/Gfp-wisconsin-madison-the-nature-boardwalk.jpg/2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg"
}
}
]
}
] );
Using tools
// Define a tool the AI can call to look up data
tool = new bxmodules.bxai.models.Tool();
tool.setName( "get_weather" )
    .describe( "Get current temperature for a given location." )
    .describeLocation( "City and country e.g. Bogotá, Colombia" )
    .setFunc( ( location ) => {
        if( location contains "Kansas City" ) {
            return "85";
        }
        if( location contains "San Salvador" ) {
            return "90";
        }
        return "unknown";
    } );

// Pass the tool to the AI via the data argument
result = aiChat( messages = "How hot is it in Kansas City? What about San Salvador? Answer with only the name of the warmer city, nothing else.", data = {
    tools: [ tool ]
} );

// San Salvador
println( result );
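If you need more than just the final message, set the verbose argument to true to get the full chat information back. Below is a minimal sketch; the exact structure of the verbose result depends on the provider:

// Ask for verbose output instead of just the response message
// (the shape of the returned data is provider dependent)
fullResponse = aiChat(
    messages = "What is BoxLang?",
    verbose  = true
);
println( fullResponse );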
BoxLang is a professional open-source project and it is completely funded by the community and Ortus Solutions, Corp. Ortus Patreons get many benefits like a cfcasts account, a FORGEBOX Pro account and so much more. If you are interested in becoming a sponsor, please visit our patronage page: https://patreon.com/ortussolutions
"I am the way, and the truth, and the life; no one comes to the Father, but by me (JESUS)" Jn 14:1-12
To install the module, use CommandBox:

box install bx-ai