
Deploy LangChain App to Railway and Fly with Node and Docker


Local Development

Copy .env.example to .env and add your OpenAI API key:

cp .env.example .env
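The env file only needs the OpenAI key. A minimal sketch, assuming the variable name OPENAI_API_KEY (the same name the Fly deploy step below passes with -e):

```shell
# .env
OPENAI_API_KEY=YOUR_KEY_HERE
```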

Install dependencies and start the server with Yarn:

yarn
node src/index.mjs

Or with pnpm:

pnpm i
node src/index.mjs

Test the local server:

curl \
  -H 'Content-Type: application/json' \
  -d '{"input": "hi there"}' \
  'http://localhost:8080/chat'

Deployment

Deploy to Edgio

Install the Edgio CLI and log in to your account:

npm i -D @edgio/cli @edgio/core @edgio/devtools @edgio/prefetch
edgio login

Install dependencies and build the project:

npm install && edg build

Deploy with the command:

edg deploy --site=my-open-api-project-on-edgio

Test the deployment (EDGIO_CURL_SHOW_BODY is an environment variable, so it goes before the command):

EDGIO_CURL_SHOW_BODY=true edgio curl \
  -H 'Content-Type: application/json' \
  -d '{"input": "hi there"}' \
  'https://anthony-campolo-langchain-template-node-default.edgio.link/chat'

Deploy to Railway

Install the Railway CLI and log in to your account:

railway login

Initialize the project, link it, and deploy the Docker image:

railway init -n langchain-template-node-railway
railway link
railway up

Add your OPENAI_API_KEY to the project's environment variables in the Railway dashboard, then test the deployed endpoint:

curl \
  -H 'Content-Type: application/json' \
  -d '{"input": "hi there"}' \
  'https://langchain-template-node-railway-production.up.railway.app/chat'

Run railway logs to see the model's responses in the deployment logs.

Deploy to Fly

Install the flyctl CLI and log in to your account:

fly auth login

Launch and deploy application:

fly launch --now \
  -e OPENAI_API_KEY=YOUR_KEY_HERE \
  --name langchain-template-node-fly

Check the application's logs and status, then test the deployed endpoint:

fly logs -a langchain-template-node-fly
fly status -a langchain-template-node-fly
curl \
  -H 'Content-Type: application/json' \
  -d '{"input": "hi there"}' \
  'https://langchain-template-node-fly.fly.dev/chat'

Code

Dockerfile

# Dockerfile

FROM debian:bullseye AS builder

ARG NODE_VERSION=19.4.0
ARG YARN_VERSION=3.4.1

RUN apt-get update && apt-get install -y curl python-is-python3 pkg-config build-essential
RUN curl https://get.volta.sh | bash
ENV VOLTA_HOME /root/.volta
ENV PATH /root/.volta/bin:$PATH
RUN volta install node@${NODE_VERSION} yarn@${YARN_VERSION}

RUN mkdir /app
WORKDIR /app
COPY . .
RUN yarn

FROM debian:bullseye
LABEL fly_launch_runtime="nodejs"

COPY --from=builder /root/.volta /root/.volta
COPY --from=builder /app /app

WORKDIR /app
ENV NODE_ENV production
ENV PATH /root/.volta/bin:$PATH

ENTRYPOINT [ "node", "src/index.mjs" ]
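Before handing the image to Railway or Fly, it can be sanity-checked locally. A sketch, assuming Docker is installed; the image tag langchain-template-node is arbitrary:

```shell
docker build -t langchain-template-node .
docker run --rm -p 8080:8080 -e OPENAI_API_KEY=YOUR_KEY_HERE langchain-template-node
```

With the container running, the same curl test from the Local Development section works against http://localhost:8080/chat.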

Server

// src/index.mjs

import express from 'express'
import { OpenAI } from 'langchain/llms/openai'
import { ConversationChain } from 'langchain/chains'

const app = express()
const port = process.env.PORT || 8080

// The OpenAI wrapper reads OPENAI_API_KEY from the environment
const model = new OpenAI({})

app.post('/chat', express.json(), async (req, res) => {
  // A fresh chain per request keeps conversations independent
  const chain = new ConversationChain({ llm: model })
  const input = req.body.input
  const result = await chain.call({ input })
  console.log(result.response)

  res.send({ body: result.response })
})

app.listen(port, () => console.log(`Listening on port ${port}`))
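The /chat contract above (request body {"input": ...} in, {"body": ...} out) can be exercised without Express or an OpenAI key by swapping in a stub chain. Everything here is a hypothetical sketch — chatHandler and stubChain are illustrative names, with the stub's synchronous call standing in for ConversationChain.call:

```javascript
// Mirrors the /chat handler's data flow: parsed JSON body in,
// { body: reply } out. `call` stands in for the chain's call method.
function chatHandler(body, call) {
  const result = call({ input: body.input })
  return { body: result.response }
}

// Stub chain that echoes its input instead of calling OpenAI
const stubChain = ({ input }) => ({ response: `echo: ${input}` })

const reply = chatHandler({ input: 'hi there' }, stubChain)
console.log(reply.body) // echo: hi there
```

The same shape is what the curl commands above exercise end to end, with the real model producing the response string.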