Sharing libraries

One of the choices you will need to make when developing your automation with Robocorp is how to structure your code into robots and tasks.

Many processes targeting one system: one robot with multiple tasks

For example, if you are developing a robot that interacts with a web application, one of the first steps will be adding functionality to log in and out of the system. If you need to create another automated process that targets that same application, it's much more efficient to reuse that functionality instead of duplicating it.

In these cases, the best solution is to create a single robot, containing as many tasks as you need. This way, part of the functionality and the library dependencies can easily be shared and reused when running the individual task locally or in Robocorp Cloud. You can add as many tasks as you need in the robot's robot.yaml file.


Here is a simple example of how you can structure your robot in this case:


*** Keywords ***
Keyword Used By Multiple Tasks
    [Arguments]    ${message}
    Log    ${message}

*** Tasks ***
Task A
    Keyword Used By Multiple Tasks    Message from Task A

Task B
    Keyword Used By Multiple Tasks    Message from Task B


And the corresponding robot.yaml, declaring both tasks:

tasks:
  Task A:
    robotTaskName: Task A
  Task B:
    robotTaskName: Task B
condaConfigFile: conda.yaml
artifactsDir: output
PATH:
  - .
PYTHONPATH:
  - .
ignoreFiles:
  - .gitignore

Note: This is the simplest possible example. You can also place each task in its own file, move the shared keywords to a resource file, implement the shared code as a custom Python library, and so on.
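As a sketch of the resource-file variant mentioned above (file names here are illustrative), the shared keyword could live in a keywords.robot resource file that each task file imports:

```robotframework
# keywords.robot
*** Keywords ***
Keyword Used By Multiple Tasks
    [Arguments]    ${message}
    Log    ${message}
```

```robotframework
# task_a.robot
*** Settings ***
Resource    keywords.robot

*** Tasks ***
Task A
    Keyword Used By Multiple Tasks    Message from Task A
```

Each task file then gets its own entry in robot.yaml, while the keyword implementation exists in one place only.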

Learn more about the robot structure and configuration and the specifics of the robot.yaml configuration file format.

Robots spanning multiple systems and resource sharing

If you have two robots that automate different target systems (for example, a CRM and SAP), and you want them to use shared resources (we will call this part of the code COMMON), you have two different possibilities.

Sharing parts of the code as a Git submodule

One option is to have the CRM robot, the SAP robot, and the COMMON module as independent Git repositories. Then, both the CRM robot and the SAP robot's repositories can include the COMMON module as a Git submodule.

Learn more about Git submodules on the Atlassian blog.

This method allows you to keep the code separate and reusable. One thing to keep in mind is that you are responsible for keeping the submodule references up-to-date when changes are made to the COMMON repository. If you don't do anything, the CRM and SAP robots will keep using the version of COMMON that was current when it was added as a submodule.
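The workflow can be sketched with plain Git commands. The repositories below are local stand-ins created on the fly, not real Robocorp repositories, so the example is self-contained:

```shell
set -e
work=$(mktemp -d)
cd "$work"

# A local stand-in for the COMMON repository with some shared code
git init -q common
git -C common config user.email dev@example.com
git -C common config user.name dev
echo "shared keywords" > common/shared.robot
git -C common add -A
git -C common commit -qm "Add shared keywords"

# The CRM robot repository pulls COMMON in as a submodule
git init -q crm-robot
git -C crm-robot config user.email dev@example.com
git -C crm-robot config user.name dev
git -C crm-robot -c protocol.file.allow=always \
    submodule add "$work/common" common
git -C crm-robot commit -qm "Add COMMON as a submodule"

# Later, after COMMON changes, the reference must be updated by hand;
# until then, crm-robot keeps pointing at the old COMMON commit.
git -C crm-robot -c protocol.file.allow=always \
    submodule update --remote common
```

The `-c protocol.file.allow=always` option is only needed because this sketch uses local file paths; with hosted repositories you would pass the repository URL to `git submodule add` instead.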

If you want to keep the COMMON repository private, you will have to make sure that the environment where the robot is packaged has access to it. The details on how to accomplish this vary depending on your code hosting infrastructure.

Sharing parts of the code as a PyPI or Conda package

If the shared code between your robots is Python, you can publish the code in COMMON as a Python Package Index (PyPI) package and then refer to it in each robot's conda.yaml file as a dependency, just as you would with any other library.
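For example, a robot's conda.yaml could declare the shared package as a pip dependency. The shared package name and the version pins below are illustrative, not real packages you should copy verbatim:

```yaml
channels:
  - conda-forge
dependencies:
  - python=3.9.13
  - pip=22.1.2
  - pip:
      - rpaframework==15.6.0
      - my-company-common==1.2.0  # hypothetical shared package published to PyPI
```

Pinning an exact version keeps the robot's environment reproducible, but it also means each robot must be updated explicitly when a new version of the shared package is released.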

Note that PyPI is a public repository, so anyone will have access to your shared code. You can also set up your own private PyPI repository. However, be aware that, at the moment, Robocorp Cloud does not include an easy way to specify credentials for such a setup, so you might need to run your robots with a self-hosted runner instead.

In addition to PyPI, Robocorp supports and encourages the use of Conda. You can create your package and add it to the public Conda Forge channel, or host your own custom channel.
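With a custom channel, the channel URL goes at the top of the channels list in conda.yaml. The channel URL and package name below are illustrative placeholders:

```yaml
channels:
  - https://conda.example.com/my-company-channel  # hypothetical private channel
  - conda-forge
dependencies:
  - python=3.9.13
  - my-company-common=1.2.0  # shared package published to the custom channel
```

Channels are searched in order, so listing the custom channel first ensures your own build of the shared package is preferred over any same-named package elsewhere.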

Just like with submodules, when using PyPI or Conda, it is your responsibility to specify and keep the package version of your shared module up-to-date in each of the robots that make use of it.