Queue Performance and Usability

The Queue screen lists the samples that are queued for a step and provides a table from which samples are selected for placement into the Ice Bucket.

Samples listed in the Sample table are grouped by container by default. Groups are initially collapsed and can be expanded as required by selecting the arrows.

Lab scientists can also choose to group samples by project, submitted sample, or previous step.
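
For scripted access, the queue shown on this screen can also be read through the Clarity LIMS REST API queues resource. The following is a minimal sketch, assuming a v2 endpoint at /api/v2/queues/{step-id}; the hostname, credentials, and step ID are placeholders.

```python
import requests
from xml.etree import ElementTree

BASE_URI = "https://clarity.example.com/api/v2"  # placeholder hostname
AUTH = ("apiuser", "apipassword")                # placeholder credentials
STEP_ID = "123"                                  # placeholder step ID

# Fetch the queue for one step; the response is XML listing the queued artifacts.
response = requests.get(f"{BASE_URI}/queues/{STEP_ID}", auth=AUTH, timeout=30)
response.raise_for_status()

# Each <artifact> element is one queued derived sample. The {*} wildcard
# matches the element regardless of the XML namespace the server applies.
root = ElementTree.fromstring(response.content)
for artifact in root.iter("{*}artifact"):
    print(artifact.get("uri"))
```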

Performance and Usability

In the past, some performance and usability issues were encountered when viewing large data sets in the Queue screen. Clarity LIMS now includes performance enhancements that speed up Sample table loading, allowing users to interact with the data more quickly.

Clarity LIMS development teams measured performance for various numbers of samples queued for a step using the Time to Interactive (TTI) metric. This metric defines the time it takes for a page to become fully interactive and for functionality (eg, selecting and scrolling) to start working. The measured values vary based on the following factors; a rough measurement sketch follows the list.

  • Server specification.

  • Amount of data stored in the server database.

  • Client hardware specifications and the browser type used to access Clarity LIMS.

  • Network conditions between the server and client.
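
As an illustration only, the following sketch approximates a TTI-style measurement for the Queue screen using Playwright. This is not the harness Illumina used; the URL and the table selector are placeholders for whatever marks the page as interactive in a given deployment, and a real deployment would also require signing in first.

```python
import time
from playwright.sync_api import sync_playwright

QUEUE_URL = "https://clarity.example.com/clarity/queue/123"  # placeholder URL

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    start = time.monotonic()
    # Wait for network activity to settle, then for the sample table to
    # render; treat that point as "interactive" for this rough measurement.
    page.goto(QUEUE_URL, wait_until="networkidle")
    page.wait_for_selector("table tbody tr")  # placeholder selector
    print(f"Approximate TTI: {time.monotonic() - start:.1f} s")
    browser.close()
```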

Performance Test Setup

The following table shows the server and client specifications used for the performance test.

The two client types demonstrate different network setups.

| Hardware | Specification | Additional Notes |
| --- | --- | --- |
| Server | 3.1 GHz Intel Xeon Platinum processor (8-core), 32 GB RAM, Oracle Linux v8.8, PostgreSQL 15.2 database | The server database is loaded with 200,000 submitted sample records, 2,000,000 derived sample records, and 500 projects. |
| Client A | 3.1 GHz Intel Xeon Platinum processor (8-core), 32 GB RAM | Accesses Clarity LIMS within the same network in the lab. |
| Client B | 2.3 GHz Intel Core i9 (8-core), 32 GB RAM | Accesses Clarity LIMS on the cloud. The VPN access and different network region setup result in high network latency, demonstrating the worst case for performance. |

Performance Test Results

The following tables show the results of two performance tests conducted on a Clarity LIMS system on which performance enhancements had been implemented. In both tests, samples were grouped by container.

The tables show how the response time changes as the number of samples in the queue increases.

Test 1: Container = Tube

Test 1 Performance Results

| Number of Samples | Response Time (seconds), Client Type A | Response Time (seconds), Client Type B |
| --- | --- | --- |
| 1000 | 2.0 | 4.5 |
| 3000 | 2.5 | 5.0 |
| 7000 | 3.5 | 6.5 |
| 10,000 | 5.0 | 7.5 |
| 15,000 | 7.0 | 10.5 |
| 20,000 | 9.0 | 12.5 |

Test 2: Container = 96-well plate

Test 2 Performance Results

| Number of Samples | Response Time (seconds), Client Type A | Response Time (seconds), Client Type B |
| --- | --- | --- |
| 1000 | 2.0 | 4.5 |
| 3000 | 3.0 | 5.5 |
| 7000 | 5.0 | 7.5 |
| 10,000 | 7.0 | 9.5 |
| 15,000 | 9.5 | 12.5 |
| 20,000 | 12.5 | 15.0 |
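
In both tests, response time grows roughly linearly with queue size. The following sketch is a hypothetical analysis (not part of the original testing) that fits a line to the Client A numbers from the tables above to make the per-1000-sample cost explicit.

```python
import numpy as np

samples = np.array([1000, 3000, 7000, 10000, 15000, 20000])
tube_times = np.array([2.0, 2.5, 3.5, 5.0, 7.0, 9.0])    # Test 1, Client A
plate_times = np.array([2.0, 3.0, 5.0, 7.0, 9.5, 12.5])  # Test 2, Client A

for label, times in [("tube", tube_times), ("96-well plate", plate_times)]:
    # Degree-1 least-squares fit: time ~ slope * samples + intercept
    slope, intercept = np.polyfit(samples, times, 1)
    print(f"{label}: ~{slope * 1000:.2f} s per 1000 samples, {intercept:.1f} s baseline")
```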
