Add support for Anthropic API
dspinellis committed Mar 6, 2024
1 parent fedeec8 commit 45934fe
Showing 12 changed files with 448 additions and 57 deletions.
16 changes: 11 additions & 5 deletions README.md
@@ -5,7 +5,8 @@ The __ai-cli__
library detects programs that offer interactive command-line editing
through the __readline__ library,
and modifies their interface to allow obtaining help from a GPT
large language model, such as OpenAI's or one provided through a
large language model server, such as Anthropic's or OpenAI's,
or one provided through a
[llama.cpp](https://github.com/ggerganov/llama.cpp) server.
Think of it as a command line copilot.

@@ -85,15 +86,20 @@ make install PREFIX=~
the Homebrew library directory, e.g.
`export DYLD_LIBRARY_PATH=/opt/homebrew/lib:$DYLD_LIBRARY_PATH`.
* Perform one of the following.
* [Obtain your OpenAI API key](https://platform.openai.com/api-keys),
* Obtain your
[Anthropic API key](https://console.anthropic.com/settings/keys)
or
[OpenAI API key](https://platform.openai.com/api-keys)
and configure it in the `.aicliconfig` file in your home directory.
This is done with a `key={key}` entry in the file's `[openai]` section.
In addition, add `api=openai` in the file's `[general]` section.
This is done with a `key={key}` entry in the file's
`[anthropic]` or `[openai]` section.
In addition, add `api=anthropic` or `api=openai` in the file's
`[general]` section.
See the file [ai-cli-config](src/ai-cli-config) to understand how configuration
files are structured; a minimal example appears after this list.
Anthropic currently provides free trial credits to new users.
Note that OpenAI API access requires a different (usage-based)
subscription from the ChatGPT one.
OpenAI currently provides free trial credits to new users.
* Configure a [llama.cpp](https://github.com/ggerganov/llama.cpp) server
and list its `endpoint` (e.g. `endpoint=http://localhost:8080/completion`)
in the configuration file's `[llamacpp]` section.
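For concreteness, here is a minimal sketch of a `~/.aicliconfig` that selects
the Anthropic API (the key value is a placeholder; keep the file private,
because the key authenticates you):

```ini
[general]
api=anthropic

[anthropic]
key=sk-ant-xxxxxxxx
```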
3 changes: 2 additions & 1 deletion src/Makefile
@@ -6,7 +6,8 @@ MANPREFIX ?= "$(PREFIX)/share/man/"
SHAREPREFIX ?= "$(PREFIX)/share/ai-cli"

PROGS=rl_driver $(SHARED_LIB)
RL_SRC=ai_cli.c config.c ini.c openai_fetch.c llamacpp_fetch.c support.c
RL_SRC=ai_cli.c config.c ini.c anthropic_fetch.c openai_fetch.c \
llamacpp_fetch.c support.c
TEST_SRC=$(wildcard *_test.c)
LIB=-lcurl -ljansson

9 changes: 9 additions & 0 deletions src/ai-cli-config
@@ -13,6 +13,15 @@ temperature = 0.7
; User-specific: add it in a local protected file
; key =

[anthropic]
endpoint = https://api.anthropic.com/v1/messages
version = 2023-06-01
; Most capable. Cost (3/2024): Input: $15 / MTok, Output: $75 / MTok
model = claude-3-opus-20240229
; User-specific: add it in a local protected file
; key =
max_tokens = 256

[llamacpp]
endpoint = http://localhost:8080/completion

41 changes: 39 additions & 2 deletions src/ai_cli.5
@@ -101,9 +101,46 @@ followed by a space, before being added to the edit buffer.
\fItimestamp=\fR
.RS 4
Setting \fItimestamp\fP to \fItrue\fP will cause the log file
to include the timesamp (in ISO format) of each request or response.
to include the timestamp (in ISO format) of each request or response.
.RE

.SH [ANTHROPIC] SECTION OPTIONS
These options tailor the behavior of queries made to the Anthropic servers.
Refer to the
.UR "https://docs.anthropic.com/claude/reference/messages_post"
Anthropic API documentation
.UE
for more details.

.PP
\fImodel=\fR
.RS 4
The Anthropic model to use, e.g.
.IR claude-3-opus-20240229 .
This affects performance and pricing.
.RE
.PP
\fIendpoint=\fR
.RS 4
The URL of the API endpoint, e.g. \fChttps://api.anthropic.com/v1/messages\fP.
.RE
.PP
\fImax_tokens=\fR
.RS 4
The maximum number of tokens to generate in the response.
.RE
.PP
\fItemperature=\fR
.RS 4
The sampling temperature; higher values make the output more random.
.RE
.PP
\fItop_k=\fR
.RS 4
Only sample from the top \fIk\fP options for each subsequent token.
.RE
.PP
\fItop_p=\fR
.RS 4
Use nucleus sampling with the specified cumulative probability cutoff.
.RE
.PP
\fIversion=\fR
.RS 4
The version to supply to the
.I anthropic-version
HTTP header field.
.RE

.SH [LLAMACPP] SECTION OPTIONS
These options tailor the behavior of the llama.cpp
queries.
@@ -264,7 +301,7 @@ The names and order of configuration files are documented in
Diomidis Spinellis ([email protected])

.SH COPYRIGHT
Copyright 2023 Diomidis Spinellis
Copyright 2023-2024 Diomidis Spinellis

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
4 changes: 2 additions & 2 deletions src/ai_cli.7
@@ -12,7 +12,7 @@ Set the
.B DYLD_INSERT_LIBRARIES
(under macOS)
environment variable to load the library using its full path.
Obtain your OpenAI API key and configure it in the
Obtain your Anthropic or OpenAI API key and configure it in the
.B .aicliconfig
file.
Alternatively, set up a running
@@ -115,7 +115,7 @@ configuration files.
Diomidis Spinellis ([email protected])

.SH COPYRIGHT
Copyright 2023 Diomidis Spinellis
Copyright 2023-2024 Diomidis Spinellis

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
7 changes: 7 additions & 0 deletions src/ai_cli.c
@@ -28,6 +28,8 @@
#include <readline/history.h>

#include "config.h"

#include "anthropic_fetch.h"
#include "llamacpp_fetch.h"
#include "openai_fetch.h"

@@ -133,6 +135,11 @@ setup(void)
fetch = openai_fetch;
REQUIRE(openai, key);
REQUIRE(openai, endpoint);
} else if (strcmp(config.general_api, "anthropic") == 0) {
fetch = anthropic_fetch;
REQUIRE(anthropic, key);
REQUIRE(anthropic, endpoint);
REQUIRE(anthropic, version);
} else if (strcmp(config.general_api, "llamacpp") == 0) {
fetch = llamacpp_fetch;
REQUIRE(llamacpp, endpoint);
22 changes: 22 additions & 0 deletions src/all_tests.c
@@ -1,8 +1,29 @@
/*-
*
* ai-cli - readline wrapper to obtain a generative AI suggestion
* Unit test driver
*
* Copyright 2023-2024 Diomidis Spinellis
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

#include <stdio.h>

#include "CuTest.h"

CuSuite* cu_config_suite();
CuSuite* cu_fetch_anthropic_suite();
CuSuite* cu_fetch_openai_suite();
CuSuite* cu_fetch_llamacpp_suite();
CuSuite* cu_support_suite();
@@ -14,6 +35,7 @@ run_all_tests(void)
CuSuite* suite = CuSuiteNew();

CuSuiteAddSuite(suite, cu_config_suite());
CuSuiteAddSuite(suite, cu_fetch_anthropic_suite());
CuSuiteAddSuite(suite, cu_fetch_openai_suite());
CuSuiteAddSuite(suite, cu_fetch_llamacpp_suite());
CuSuiteAddSuite(suite, cu_support_suite());
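The matching `cu_fetch_anthropic_suite` lives in a new test file that this page
does not display. Here is a minimal sketch of the shape such a suite could take,
assuming the `STATIC` qualifier exposes `anthropic_get_response_content()` to
the test build; the test name and canned values are illustrative, not the
repository's actual tests:

```c
#include "CuTest.h"

/* Declared non-static in test builds via the STATIC macro (assumption) */
char *anthropic_get_response_content(const char *json_response);

static void
test_response_content(CuTest *tc)
{
	/* Parse a canned success response in the Anthropic messages format */
	char *text = anthropic_get_response_content(
	    "{\"content\": [{\"type\": \"text\", \"text\": \"hello\"}]}");
	CuAssertStrEquals(tc, "hello", text);
}

CuSuite *
cu_fetch_anthropic_suite(void)
{
	CuSuite *suite = CuSuiteNew();
	SUITE_ADD_TEST(suite, test_response_content);
	return suite;
}
```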
188 changes: 188 additions & 0 deletions src/anthropic_fetch.c
@@ -0,0 +1,188 @@
/*-
*
* ai-cli - readline wrapper to obtain a generative AI suggestion
* anthropic access function
*
* Copyright 2023-2024 Diomidis Spinellis
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <curl/curl.h>
#include <readline/history.h>
#include <jansson.h>

#include "config.h"
#include "support.h"
#include "anthropic_fetch.h"

// HTTP headers
static char *key_header;
static char *version_header;

// Return the response content from an Anthropic JSON response
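// A successful reply typically carries the text under
// {"content": [{"type": "text", "text": "..."}], ...}, while a failure
// reports {"type": "error", "error": {"type": "...", "message": "..."}}
// (shapes abbreviated; values illustrative).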
STATIC char *
anthropic_get_response_content(const char *json_response)
{
json_error_t error;
json_t *root = json_loads(json_response, 0, &error);
if (!root) {
readline_printf("\nanthropic JSON error: on line %d: %s\n", error.line, error.text);
return NULL;
}

char *ret;
json_t *content = json_object_get(root, "content");
if (content) {
json_t *first_content = json_array_get(content, 0);
json_t *text = json_object_get(first_content, "text");
ret = safe_strdup(json_string_value(text));
} else {
json_t *error = json_object_get(root, "error");
if (error) {
json_t *message = json_object_get(error, "message");
readline_printf("\nAnthropic invocation error: %s\n", json_string_value(message));
} else
readline_printf("\nAnthropic invocation error: %s\n", json_response);
ret = NULL;
}

json_decref(root);
return ret;
}

/*
* Initialize the Anthropic connection.
* Return 0 on success, -1 on error.
*/
static int
initialize(config_t *config)
{
if (config->general_verbose)
fprintf(stderr, "\nInitializing Anthropic, program name [%s] system prompt to use [%s]\n",
short_program_name(), config->prompt_system);
safe_asprintf(&key_header, "x-api-key: %s", config->anthropic_key);
safe_asprintf(&version_header, "anthropic-version: %s", config->anthropic_version);
return curl_initialize(config);
}

/*
* Fetch response from the anthropic API given the provided prompt.
* Provide context in the form of n-shot prompts and history prompts.
*/
char *
anthropic_fetch(config_t *config, const char *prompt, int history_length)
{
CURLcode res;

if (!curl && initialize(config) < 0)
return NULL;

if (config->general_verbose)
fprintf(stderr, "\nContacting Llamacpp API...\n");

struct curl_slist *headers = NULL;
headers = curl_slist_append(headers, "content-type: application/json");
headers = curl_slist_append(headers, key_header);
headers = curl_slist_append(headers, version_header);

struct string json_response;
string_init(&json_response, "");

struct string json_request;
string_init(&json_request, "{\n");

string_appendf(&json_request, " \"model\": %s,\n",
json_escape(config->anthropic_model));
string_appendf(&json_request, " \"max_tokens\": %d,\n",
config->anthropic_max_tokens);

char *system_role = system_role_get(config);
string_appendf(&json_request, " \"system\": %s,\n",
json_escape(system_role));
free(system_role);

// Add configuration settings
if (config->anthropic_temperature_set)
string_appendf(&json_request, " \"temperature\": %g,\n", config->anthropic_temperature);
if (config->anthropic_top_k_set)
string_appendf(&json_request, " \"top_k\": %d,\n", config->anthropic_top_k);
if (config->anthropic_top_p_set)
string_appendf(&json_request, " \"top_p\": %g,\n", config->anthropic_top_p);

string_append(&json_request, " \"messages\": [\n");

// Add user and assistant n-shot prompts
for (int i = 0; i < NPROMPTS; i++) {
if (config->prompt_user[i])
string_appendf(&json_request,
" {\"role\": \"user\", \"content\": %s},\n",
json_escape(config->prompt_user[i]));
if (config->prompt_assistant[i])
string_appendf(&json_request,
" {\"role\": \"assistant\", \"content\": %s},\n",
json_escape(config->prompt_assistant[i]));
}

// Add history prompts as context
bool context_explained = false;
for (int i = config->prompt_context - 1; i >= 0; --i) {
HIST_ENTRY *h = history_get(history_length - 1 - i);
if (h == NULL)
continue;
if (!context_explained) {
context_explained = true;
string_appendf(&json_request,
" {\"role\": \"user\", \"content\": \"Before my final prompt to which I expect a reply, I am also supplying you as context with one or more previously issued commands, to which you simply reply OK\"},\n");
string_appendf(&json_request,
" {\"role\": \"assistant\", \"content\": \"OK\"},\n");
}
string_appendf(&json_request,
" {\"role\": \"user\", \"content\": %s},\n",
json_escape(h->line));
string_appendf(&json_request,
" {\"role\": \"assistant\", \"content\": \"OK\"},\n");
}

// Finally, add the user prompt
string_appendf(&json_request,
" {\"role\": \"user\", \"content\": %s}\n", json_escape(prompt));
string_append(&json_request, " ]\n}\n");

write_log(config, json_request.ptr);

curl_easy_setopt(curl, CURLOPT_URL, config->anthropic_endpoint);
curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headers);
curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, string_write);
curl_easy_setopt(curl, CURLOPT_WRITEDATA, &json_response);
curl_easy_setopt(curl, CURLOPT_POSTFIELDS, json_request.ptr);

res = curl_easy_perform(curl);

if (res != CURLE_OK) {
free(json_request.ptr);
readline_printf("\nAnthropic API call failed: %s\n",
curl_easy_strerror(res));
return NULL;
}

write_log(config, json_response.ptr);

char *text_response = anthropic_get_response_content(json_response.ptr);
free(json_request.ptr);
free(json_response.ptr);
return text_response;
}