Langchain Prompt Templates from Robocorp Asset Storage

Maintain prompt templates in Robocorp Asset Storage and improve them without touching the code.

Langchain
Python
OpenAI

Prompt templates are a neat way of managing the prompts that go to large language models, but keeping the templates in the code often means you'll need to redeploy your project for changes to take effect. You might also want different people working on the prompt templates than on the actual code.

💡 SOLUTION: Let's put the prompt templates to Robocorp Asset Storage! 💡

Apart from being able to iterate on the prompt templates quickly, using Robocorp for developing and running Langchain workloads gives you several benefits:

  • Curated collection of Python libraries built for automating typical data sources, e.g. for RAG data loaders (websites with Playwright, legacy desktop apps, any cloud platform, documents with OCR, Excel and much more).
  • Amazing environment control - define dependencies once, and tooling takes care of environment builds and a whole lot more when developing and running workflows.
  • Run anywhere - Robocorp offers zero-infra workers in the cloud, or you can self-host on-demand containers, Windows VMs or dedicated machines on any (common) OS. Why this matters: your RAG data loaders can work where your data and apps are.

Use case

The example code uses an LLM to compare whether two addresses are the same, as introduced in Benjamin Stein's blog post. So if we give the LLM an address pair like 1540 Battery St. CA and 1540 Battery Street, San Francisco, the answer should be YES.

Step-by-step

Follow this guide to get going. This assumes that you have not previously used Robocorp, so we go through everything step-by-step.

1️⃣ Install VS Code (well, I bet you might have that already;)

2️⃣ Install Robocorp Code extension - this one connects your dev environment with the Robocorp Control Room

3️⃣ Create a Robocorp Control Room account - free accounts available, no credit card needed!

4️⃣ In Control Room, create a Vault entry for OpenAI API credentials. Use the name OpenAI and api-key unless you want to edit the code.

5️⃣ In Control Room, create an Asset for the prompt template. Use the name example_prompt_template unless you want to edit the code.

6️⃣ Clone this example's Git repository to your own machine - use whichever way is most familiar to you!

7️⃣ Open the cloned project folder in VS Code, and our extension gets to work automatically building the Python environment. It'll take a few minutes the first time.

8️⃣ Link VS Code with your Control Room account and workspace from the bottom left corner of the extension.

9️⃣ Hit the Command Palette with Cmd-Shift-P (macOS) or Ctrl-Shift-P (Windows) and find Robocorp: Run Robot. Voila!

The bot code explained

This chapter walks through some of the key features of the code, which is already simple and documented to start with.

Apart from importing the relevant LangChain "things", we are using Robocorp's Python Framework and automation toolkit that makes it easy to use capabilities from the Control Room, as well as interact with popular tools like Excel. Learn more about the framework here.

To create runnable portions of a Python project, use the @task decorator. It tells the library that the function implements one specific automation task (here called compare_addresses), which can then later be scheduled to run in Control Room. In addition, the decorator automatically initializes logging from robocorp-log.
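A minimal sketch of the decorator usage, assuming the robocorp-tasks package (the fallback no-op decorator is only there to keep the snippet self-contained in environments without Robocorp installed):

```python
# Minimal sketch of a Robocorp task entry point, assuming robocorp-tasks.
try:
    from robocorp.tasks import task
except ImportError:
    # No-op stand-in so the sketch runs without Robocorp installed.
    def task(func):
        return func

@task
def compare_addresses():
    # The real task reads address pairs from Excel and asks the LLM
    # whether each pair refers to the same address.
    print("comparing addresses...")
```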

This line does a few things in one, using Robocorp's built-in Excel automation library: it opens the workbook, selects the given sheet, and reads the data as a list of rows keyed by the headers.
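A hedged sketch of that read, assuming the robocorp-excel library; the file and sheet names here are illustrative stand-ins, not necessarily the ones in the repository:

```python
# Sketch of the Excel read, assuming robocorp-excel. The guarded import
# keeps the snippet self-contained where Robocorp is not installed.
try:
    from robocorp import excel
except ImportError:
    excel = None

def read_address_pairs(path="addresses.xlsx", sheet="Sheet1"):
    # Opens the workbook, picks the sheet, and returns a list of dicts,
    # each mapping a header (e.g. address_one) to the cell value.
    return excel.open_workbook(path).worksheet(sheet).as_list(header=True)
```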

The example uses Robocorp's secure Vault from Control Room to store credentials. The secrets do not need to be exposed to the developers, yet they are available both when developing and when executing workflows in cloud runtimes.
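Reading the key can be sketched like this, assuming the robocorp-vault library and the Vault entry created in step 4 (name OpenAI, item api-key):

```python
# Sketch of reading the OpenAI credential from Control Room Vault,
# assuming robocorp-vault; guarded import keeps the snippet runnable
# without Robocorp installed.
try:
    from robocorp import vault
except ImportError:
    vault = None

def get_openai_api_key():
    # "OpenAI" is the Vault entry name, "api-key" the item inside it.
    return vault.get_secret("OpenAI")["api-key"]
```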

When creating the Chat Prompt Template, the human message template is read from Asset Storage. It gets the latest version every time the code is run, and editing and developing the prompt is isolated away from the code.
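A sketch of that step, assuming the robocorp-storage library and langchain; the asset name matches the one created in step 5:

```python
# Sketch of building the chat prompt from Asset Storage, assuming
# robocorp-storage and langchain; the guarded imports keep the snippet
# self-contained where those packages are missing.
try:
    from robocorp import storage
    from langchain.prompts import ChatPromptTemplate
except ImportError:
    storage = ChatPromptTemplate = None

def build_prompt():
    # Fetches the latest asset text on every run, so prompt edits in
    # Control Room take effect without redeploying the code.
    human_template = storage.get_text("example_prompt_template")
    return ChatPromptTemplate.from_messages([("human", human_template)])
```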

When prompting the LLM, the template is simply injected with the required variables coming from the list we got from the Excel file. In this case the variables are address_one and address_two, which appear as {address_one} and {address_two} in the template.
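The injection itself behaves like Python string formatting; a plain-Python illustration (the template wording below is hypothetical - the real text lives in the example_prompt_template asset, and the real code fills it via ChatPromptTemplate with the same variable names):

```python
# Plain-Python illustration of the variable injection. The template
# wording is a made-up stand-in for the asset stored in Control Room.
template = (
    "Are these two addresses the same?\n"
    "Address one: {address_one}\n"
    "Address two: {address_two}\n"
    "Answer YES or NO."
)

# One row as read from the Excel file, keyed by the header names.
row = {
    "address_one": "1540 Battery St. CA",
    "address_two": "1540 Battery Street, San Francisco",
}

prompt = template.format(**row)
print(prompt)
```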

What next

Technical information

Last updated

25 September 2023

License

Apache License 2.0