Testing

Tests should be part of all user stories for a hub instance or a connector.
For a quick start, take a look at the Instance template and Connector template.
A full example with non-trivial tests can be found in the Connector tutorial OTP.

The best practice is to keep anything that calls 3rd party services in the integration folder instead of the test folder, because
you have no control over these services: they can be down, and calling them can incur costs. Tests in the test folder should be
as complete as possible and should use mocks for connectors to external services. Running the tests in the test folder should be part
of the build/release action, while tests in the integration folder should be run manually by a developer as needed.
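
As a sketch, the resulting source layout might look like this (the folder names follow the sourceSets configuration shown later on this page):

```shell
# Sketch of the suggested layout; folder names follow the sourceSets
# configuration shown later on this page.
mkdir -p src/main/groovy          # production DSL code and components
mkdir -p src/test/groovy          # mocked tests, run as part of build/release
mkdir -p src/integration/groovy   # tests calling real 3rd party services, run manually
ls src
```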

Another best practice is to split your workflow into smaller pieces: ideally the main workflow should be just a chain of calls to
single-purpose sub-workflows and functions, with occasional updates of attributes. This is much more readable and more easily testable
than one huge workflow hundreds of lines long. Also, whenever possible you should first create tests for each individual
sub-workflow and sub-function.

Example of a workflow that is split into smaller, more testable parts:

workflow('neo') {
    workflow('document-check') {
        namespace document
    }
    
    person << [
        firstName    : document.firstName,
        lastName     : document.lastName,
        fullName     : document.idp.biographic.fullName,
        dateOfBirth  : document.idp.biographic.birthDate
    ]

    function('create-lead') {
        namespace lead
        input person: person
    } 
    
    function('create-verification') {
        namespace verification
        input entityId: lead.id,
              verificationRequirementId: env.salesforce.verificationId
    }
    
    function('lookup-verification-document-ids') {
        input verificationId: verification.uuid
        namespace documentIds
    }
    
    function('document-verification') {                    
        input entity: lead, 
              idp: document.idp, 
              upload: document.upload.personalId,
              verificationDocumentId: documentIds.idDocument
    }

    workflow('liveness-check') {
        namespace liveness
        input upload: document.upload,
              documentId: documentIds.selfie
    }
}

Configuration of your project

The Zenoo Hub provides extensive support for testing, and you should take full advantage of it.
Add the hub-test-starter dependency to your project's build.gradle to access the whole testing support part of the Zenoo Hub:

ext {
    hubBackendVersion = '2.135.0'
}

dependencies {
    testImplementation group: 'com.zenoo.hub', name: 'backend-spring-boot-starter-test', version: hubBackendVersion
}

You also need to fine-tune the setup of the test and integrationTest tasks:

sourceSets {
    integration {
        groovy.srcDir "$projectDir/src/integration/groovy"
        resources.srcDir "$projectDir/src/integration/resources"
        compileClasspath += main.output
        runtimeClasspath += main.output
    }
}

configurations {
    integrationRuntime.extendsFrom testRuntime
    integrationImplementation.extendsFrom testImplementation
}

test {
    useJUnitPlatform()
    testLogging {
        events "passed", "skipped", "failed"
    }
}

task integrationTest(type: Test) {
    useJUnitPlatform()
    testClassesDirs = sourceSets.integration.output.classesDirs
    classpath = sourceSets.integration.runtimeClasspath
}

processIntegrationResources {
    setDuplicatesStrategy(DuplicatesStrategy.WARN)
}
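
With this setup, the two test suites can be run independently (task names follow the Gradle configuration above):

```shell
# run mocked tests (part of the build/release pipeline)
./gradlew test

# run integration tests against real services, manually as needed
./gradlew integrationTest
```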

Setup of the Zenoo Hub for Tests

You will need a separate hub configuration for tests. Usually you should set the ComponentConfigurer to return an empty list, because you will register
components as needed for individual tests. The HubConfigurer should have all the necessary connectors: mocked ones for test folder tests
and real ones for integration folder tests.

Example of a TestConfig class:

@Configuration
class TestConfig {

    @Bean
    @Primary
    ComponentConfigurer componentConfigurer() {
        () -> List.of()
    }

    @Bean
    @Primary
    static HubConfigurer hubConfigurer(
            HttpConnectorMock httpConnectorMock
    ) {
        return new HubConfigurer() {

            @Override
            List<ConnectorActivator> connectors() {
                return of(
                        ConnectorActivator.of(ComponentId.from('http'), httpConnectorMock as Connector<HttpConnectorSpec>)
                )
            }
        }
    }

}

Writing a test

Tests in the Zenoo Hub use Spock as the test framework; you can learn the basics in a tutorial on Baeldung.
The easiest way to write a test is to extend WorkflowTestSpecification, which has all the methods necessary to test a DSL workflow or function.

1. Prepare mocks

Use Spock's given block to set up mocks as needed. The Zenoo Hub provides the MockConnectorExchange class to easily create connector mocks ([see below](./hub-backend-testing#Mocking connectors)).
The MockConnectorExchange class provides the withResult, withError and withDelay methods, which you can use to configure a connector mock.

Example:

def "verify code should pass mock call"() {
    given:
    httpConnectorMock.mockExchange.withResult([
            "status"      : "approved",
            "date_updated": "2022-07-21T05:19:21Z",
            "account_sid" : "AC1df896fc9f8d4c30b31490b5303e925e",
            "to"          : "+420123456789",
            "valid"       : true,
            "sid"         : "VE39811dee2cfdfc3b65466f44e07a8dc0",
            "date_created": "2021-07-22T05:17:44Z",
            "service_sid" : "VAXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX",
            "channel"     : "whatsapp"
    ])

}

2. Register components and start workflow or function

WorkflowTestSpecification contains a testBuilder attribute that helps with registering and configuring components for a test.
testBuilder implements several methods to serve that end:

  • setWorkflow - the workflow that will be called to run the test.
  • setFunction - similar to the above, but for a function instead of a workflow. You can set either setWorkflow or setFunction, but not both.
  • setInput - sets the input for the function or workflow when it is called. See DSL workflow or DSL function for details.
  • setConfig - sets the config for the component of the function or workflow. See component configuration for details.
  • addDependency - adds a component dependency for the test. Don't forget
    to add the component that contains the workflow or function itself.

The build method generates a testing component, registers it and its dependencies,
and finally starts a testing workflow from the testing component.

Example of testBuilder usage to set up the test:

        expect:
        def result = testBuilder.with {
            function = 'send-code@otp'
            input = ['phoneNumber': '+420123456789', 'channel': 'whatsapp']
            config = [
                    serviceSid: 'VAXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX',
                    accountSid: 'AC1df896fc9f8d4c30b31490b5303e925e',
                    authToken: 'lwqIK1nsxcaBwwv7Yuja5PTpdbD7czaI'
            ]
            addDependency(OtpConnector.otpConnector)
            build()
        }.getResult()

3. Check workflow steps and results

Once the workflow has started, it will pause on each route DSL command, waiting for the Hub Client to submit user input.
The submit function inherited from WorkflowTestSpecification lets you simulate user data entry.
You should also check that the workflow stopped on the right route at each step; for that, inspect the route part of the response function.

Example of checking route and submitting user data:

response().route.uri == '/otp'
submit([code: 123456])

The response method returns a WorkflowTesterResponse which,
depending on the state of the execution, can be one of these types:

  • route - workflow execution is paused and awaits user input. See RouteResource. Available attributes:

    • uuid,
    • uri - the identifying location from the route definition,
    • terminal - whether the route is terminal,
    • backEnabled - whether going back is enabled at this route,
    • export - object or list of attributes exported for the Hub Client,
    • payload.

  • result - the workflow/function has finished and the result of the execution is returned. See ResultResource. There are no attributes; it just contains the object that was returned from the callable.

  • error - there has been an error while executing DSL code or a connector. See ErrorResponseResource. Available attributes:

    • code,
    • message.

  • validation error - a DSL code fails to be validated. This can be caused by invalid DSL, a missing attribute or invalid input.
    See ValidationResult.
    There is just one attribute, errors, which contains a list of ValidationError; ValidationError has just one attribute, message.

Another useful method inherited from WorkflowTestSpecification is upload. It allows you to simulate a user uploading a
file through the Hub Client. The method uploads a file to the test hub instance and returns a FileDescriptor
that you can use as a parameter for the submit method.

Example:

    @Value("classpath:test-files/idFront.jpg")
    Resource idFrontResource

    def "should pass document check"() {
        given:
        testBuilder.with {
            workflow = '[email protected]'
            addDependency(NEO_WORKFLOWS)
            build()
        }

        expect:
        response().route.uri == '/id-upload'
        def idFrontUpload = upload(idFrontResource)
        submit(personalId: [idFrontUpload])
        def checkOCR = response().route
        checkOCR.uri == '/check-idp'
        submit(retry: false)
    }

In addition to testing the happy path for workflows and smoke tests for connectors, you should test for common error responses and invalid data inputs.

A connector usually does not handle an error itself; it just passes it on to a workflow, which should know how to resolve it.
So when testing the connector itself, you need to write DSL code just to test the different scenarios.

Example of check for error in connector:

Function to test a connector:

function('test-document') {
    input ->
        exchange('RDP document') {
            connector('document')
            config input
            fallback {
                'error'
            }
        }
}

Spock test for invalid data response:

    def "front document verification error"() {
        given:
        def uploadIdFront = upload(frontError)
        testBuilder.with {
            function = 'test-document@rdp'
            input = [idFront: uploadIdFront, defaultValidationBypass: false]
            addDependency(RDPComponent.rdp)
            build()
        }
        expect:
        response().result == 'error'
    }

Workflows should either recover from an error, retry, or notify the user about it, usually on an error page. You should test
that these errors are handled properly, e.g. that the user is sent to the right error page and is notified about what has gone wrong.

Example of check for error in a workflow:

The part of the workflow to test:

exchange('IDEMIA - Create Identity') {
    fallback {
        route('error') {
            export error_step: 'processing'
        }
    }
}

The part of the workflow test to check an error:

given:
...
createIdentityConnectorMock.mockExchange.withError()

expect:
...
def errorResponseRoute = response().route
errorResponseRoute.uri == "/error"
errorResponseRoute.export.error_step == "processing"
errorResponseRoute.terminal

Mocking connectors

In most cases it is enough to use the MockConnector class to create new beans in your TestConfig and pass them on to the hubConfigurer.
This way you configure the Zenoo Hub to work with mocks instead of the real connectors.

Example of bean creation:

    @Bean
    MockConnector<DocumentConnector> documentConnectorMock(DocumentConnector documentConnector) {
        new MockConnector<DocumentConnector>(documentConnector)
    }

Example of using them in the hubConfigurer:

    @Bean
    static HubConfigurer hubConfigurer(
            MockConnector<DocumentConnector> documentConnectorMock,
            MockConnector<LivenessConnector> livenessConnectorMock,
            MockConnector<IdentityConnector> identityConnectorMock
    ) {
        return new HubConfigurer() {

            @Override
            List<ConnectorActivator> connectors() {
                return of(
                        ConnectorActivator.of("[email protected]", documentConnectorMock),
                        ConnectorActivator.of("[email protected]", livenessConnectorMock),
                        ConnectorActivator.of("[email protected]", identityConnectorMock)
                )
            }
        }
    }

MockConnector contains an attribute mockExchange of type MockConnectorExchange that is meant to be used to set mock responses for the connector.

  • withConfigConsumer(Consumer<CustomConnectorConfig> consumer) - adds a consumer for the connector config, which is useful
    to verify the configuration that was passed on to the connector.

  • withError() - sets a simple ConnectorException("Error") as the mocked response of the connector.

  • withResult(Object result) - sets the return value of the mock.

  • withDelay(int delay) - adds a delay in seconds before the response is returned from the mock when executed. You can use this method to check the behaviour
    of your flow when a response from a connector takes some time.
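
These methods can be chained; as a sketch, combining withResult and withDelay simulates a slow connector (the mock bean name follows the examples above):

```groovy
// simulate a connector that returns a result only after a 5 second delay
documentConnectorMock.mockExchange
        .withResult([status: 'approved'])
        .withDelay(5)
```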

Example:

given:
...
identityConnectorMock.mockExchange
        .withConfigConsumer({ identityConfig = it })
        .withResult([countryCode: "AU", transactionId: "e850891a-6a57-4d5f-b499-3c7d891a0cef", overallStatus: "MATCH"])

expect:
...
response().route.uri == '/address'
submit([location: [locality    : null,
                   sublocality : 'BARCELONA',
                   area1       : 'BARCELONA',
                   street      : 'C/MEDES 4-10',
                   country     : 'HongKong',
                   countryCode : 'HK',
                   streetNumber: '10-Apr']])

identityConfig.address.addressLine1 == 'C/MEDES 4-10 10-Apr'
identityConfig.address.countryCode == 'HK'