System requirements for development use
For a multiplatform application, there are many permutations of system setups to test against. The following are the systems that Robocorp tests on. Other operating systems may also work, but we cannot guarantee that.
We recommend keeping Windows systems up to date according to Microsoft's release information, as older versions always carry security concerns.
For development tasks, the user usually needs some elevated permissions (to install the software), but Robocorp applications avoid requiring admin permissions at runtime.
Windows

- We recommend the Professional and Enterprise editions (the Home edition should work, but is not tested).
- We recommend updating to version 20H2 (build 19042) before May 11th, 2021.
- The minimum requirement is version 1909 (Microsoft's end-of-service date for it is May 11th, 2021; Microsoft no longer supports older versions).
- The latest Windows Server 2019 version is supported, as it is the only Server variant with long filename support.
- Unfortunately, Windows Server 2016 does not truly support long filenames, despite what its documentation says. That version also reaches the end of mainstream support on October 12th, 2021.
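Long filename support on Windows can also be checked programmatically by reading the `LongPathsEnabled` value from the registry. A minimal sketch (the registry path is the standard Windows filesystem key; the helper function itself is our own illustration, not part of the Robocorp tooling):

```python
import sys

def long_paths_enabled():
    """Return True/False for the Windows LongPathsEnabled registry value,
    or None when running on a non-Windows system."""
    if sys.platform != "win32":
        return None
    import winreg  # Windows-only standard library module
    key = winreg.OpenKey(
        winreg.HKEY_LOCAL_MACHINE,
        r"SYSTEM\CurrentControlSet\Control\FileSystem",
    )
    try:
        value, _ = winreg.QueryValueEx(key, "LongPathsEnabled")
        return bool(value)
    except FileNotFoundError:
        # Value missing means long paths have not been enabled.
        return False
    finally:
        winreg.CloseKey(key)

print(long_paths_enabled())
```

Note that even when the registry value is enabled, individual applications still need to opt in to long path support on Windows.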
macOS

There is no clear statement on which macOS versions are officially supported, but the general rule is that the latest two releases are supported, so users should migrate to those. The build pipelines for some libraries and tools have already dropped builds for older macOS versions.

For Robocorp tools, we recommend the latest versions of macOS.
We have done only limited testing on Apple M1 chips, but the tools appear to work under emulation.
Linux

- Ubuntu 18.04 is recommended.
- 2020-12-01: We are moving to 20.04, but have run only limited tests so far.
Browsers

- The latest version of Chrome is the recommended browser.
- Tools and libraries support other browsers as well, but we limit our testing to Chrome.
Minimum system requirements
Minimum system requirements are always somewhat subjective, because even a slow system can edit and execute a robot.
For our tooling, the baseline was to install Robocorp Lab and run an example robot within 30 minutes, starting from a clean Windows machine. This target gives a relatively good overview of processor, memory, and disk usage.
The content of the automation robot probably affects the system requirements more than the tooling does: legacy applications can demand a lot of local resources, and browsers tend to eat up all the memory you throw at them.
- At least two CPU cores:
  - RCC benefits heavily from parallel execution.
- At least 4 GB of memory:
  - The target browser probably uses more memory than the tooling.
  - For development work, 8 GB of memory is strongly recommended.
- At least 20 GB of free disk space:
  - The environment cache uses a lot of disk space.
  - An SSD is recommended: Python processes handle a lot of small files, so file I/O speed directly affects execution times.
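The CPU and disk minimums above can be checked with the Python standard library alone; physical memory is only straightforward to read on POSIX systems, so the sketch below skips it elsewhere. The function name and thresholds are our own illustration of the list above:

```python
import os
import shutil

GiB = 1024 ** 3

def check_minimums(path="."):
    """Compare this machine against the minimums listed above:
    2 CPU cores, 4 GB memory, 20 GB free disk at `path`."""
    results = {
        "cpus_ok": (os.cpu_count() or 1) >= 2,
        "disk_ok": shutil.disk_usage(path).free >= 20 * GiB,
    }
    # Physical memory via sysconf is POSIX-only; skipped where unavailable.
    if hasattr(os, "sysconf") and "SC_PHYS_PAGES" in os.sysconf_names:
        mem_bytes = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES")
        results["memory_ok"] = mem_bytes >= 4 * GiB
    return results

print(check_minimums())
```

Passing these checks only means the tooling baseline is met; as noted above, the robot's own workload (legacy applications, browsers) usually dominates.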
For reference, we could achieve the target on a virtual machine on AWS Lightsail with the following instance: