fix(driver): fixed an issue where the Flutter driver connection attempt to a running app would not retry before throwing a connection error

feat(config): added `onBeforeFlutterDriverConnect` and `onAfterFlutterDriverConnect` callback properties to the `FlutterTestConfiguration` test configuration to enable custom logic before and after a Flutter driver connection attempt.
Jon Samwell 2020-05-11 12:20:35 +10:00
parent 2205ce1215
commit aad6af0ab1
7 changed files with 216 additions and 156 deletions


@ -1,3 +1,7 @@
## [1.1.8+2] - 11/05/2020
* Fixed an issue where the Flutter driver connection attempt would not retry before throwing a connection error. This was causing an error on some machines when connecting to an Android emulator (x86 & x86_64) running the Google APIs image (see https://github.com/flutter/flutter/issues/42433)
* Added `onBeforeFlutterDriverConnect` and `onAfterFlutterDriverConnect` Flutter driver connection callback properties to the test configuration `FlutterTestConfiguration` to enable custom logic before and after a driver connection attempt.
## [1.1.8+1] - 09/05/2020
* Updated the Gherkin library version to fix an issue with the JSON reporter throwing an error when an exception is logged before any features have run

README.md

@ -26,13 +26,12 @@ Available as a Dart package https://pub.dartlang.org/packages/flutter_gherkin
Then I end up with 2
```
NOTE: If you are using a Flutter branch other than the current stable version 1.12.x you will need to use the release candidate version of this library due to a breaking change with the way the flutter driver logs output.
## Table of Contents
<!-- TOC -->
- [Getting Started](#getting-started)
- [Configuration](#configuration)
* [Getting Started](#getting-started)
+ [Configuration](#configuration)
- [features](#features)
- [tagExpression](#tagexpression)
- [order](#order)
@ -42,39 +41,44 @@ NOTE: If you are using a Flutter branch other than the current stable version 1.
- [hooks](#hooks)
- [reporters](#reporters)
- [createWorld](#createworld)
- [logFlutterProcessOutput](#logFlutterProcessOutput)
- [flutterBuildTimeout](#flutterBuildTimeout)
- [exitAfterTestRun](#exitaftertestrun)
- [Flutter specific configuration options](#flutter-specific-configuration-options)
+ [Flutter specific configuration options](#flutter-specific-configuration-options)
- [restartAppBetweenScenarios](#restartappbetweenscenarios)
- [build](#build)
- [buildFlavor](#buildFlavor)
- [flutterBuildTimeout](#flutterBuildTimeout)
- [logFlutterProcessOutput](#logFlutterProcessOutput)
- [targetDeviceId](#targetDeviceId)
- [targetAppPath](#targetapppath)
- [Features Files](#features-files)
- [Steps Definitions](#steps-definitions)
- [runningAppProtocolEndpointUri](#runningAppProtocolEndpointUri)
- [onBeforeFlutterDriverConnect](#onBeforeFlutterDriverConnect)
- [onAfterFlutterDriverConnect](#onAfterFlutterDriverConnect)
- [flutterDriverMaxConnectionAttempts](#flutterDriverMaxConnectionAttempts)
- [flutterDriverReconnectionDelay](#flutterDriverReconnectionDelay)
* [Features Files](#features-files)
+ [Steps Definitions](#steps-definitions)
- [Given](#given)
- [Then](#then)
- [Step Timeout](#step-timeout)
- [Multiline Strings](#multiline-strings)
- [Data tables](#data-tables)
- [Well known step parameters](#well-known-step-parameters)
- [Pluralisation](#pluralisation)
- [Pluralization](#pluralization)
- [Custom Parameters](#custom-parameters)
- [World Context (per test scenario shared state)](#world-context-per-test-scenario-shared-state)
- [Assertions](#assertions)
- [Tags](#tags)
- [Languages](#languages)
- [Hooks](#hooks)
- [Attachements](#attachments)
- [Screenshot on step failure](#screenshot)
- [Reporting](#reporting)
- [Flutter](#flutter)
- [Restarting the app before each test](#restarting-the-app-before-each-test)
+ [Tags](#tags)
+ [Languages](#languages)
* [Hooks](#hooks)
* [Attachments](#attachments)
+ [Screenshot on step failure](#screenshot)
* [Reporting](#reporting)
* [Flutter](#flutter)
+ [Restarting the app before each test](#restarting-the-app-before-each-test)
- [Flutter World](#flutter-world)
- [Pre-defined Steps](#pre-defined-steps)
+ [Pre-defined Steps](#pre-defined-steps)
- [Flutter Driver Utilities](#flutter-driver-utilities)
- [Debugging](#debugging)
+ [Debugging](#debugging)
- [Debugging the app under test](#debugging-the-app-under-test)
<!-- /TOC -->
@ -256,9 +260,9 @@ Future<void> main() {
Defaults to `en`
This specifies the default langauge the feature files are written in. See https://cucumber.io/docs/gherkin/reference/#overview for supported languages.
This specifies the default language the feature files are written in. See https://cucumber.io/docs/gherkin/reference/#overview for supported languages.
Note that this can be overriden in the feature itself by the use of a language block.
Note that this can be overridden in the feature itself by the use of a language block.
```
# language: de
@ -345,10 +349,9 @@ class AttachScreenshotOnFailedStepHook extends Hook {
```
##### screenshot
To take a screenshot on a step failing you can used the pre-defined hook `AttachScreenshotOnFailedStepHook` and include it in the hook configuration of the tests config. This hook will take a screenshot and add it as an attachment to the scenerio. If the `JsonReporter` is being used the screenshot will be embedded in the report which can be used to generate a HTML report which will ultimately display the screenshot under the failed step.
To take a screenshot when a step fails you can use the pre-defined hook `AttachScreenshotOnFailedStepHook` and include it in the hook configuration of the test config. This hook will take a screenshot and add it as an attachment to the scenario. If the `JsonReporter` is being used the screenshot will be embedded in the report, which can be used to generate an HTML report that will ultimately display the screenshot under the failed step.
```
import 'dart:async';
@ -382,9 +385,9 @@ Future<void> main() {
Reporters are classes that are able to report on the status of the test run. This could be as simple as merely logging scenario results to the console. There are a number of built-in reporters:
- `StdoutReporter` : Logs all messages from the test run to the standard output (console).
- `ProgressReporter` : Logs the progress of the test run marking each step with a scenario as either passed, skipped or failed.
- `JsonReporter` - creates a JSON file with the results of the test run which can then be used by 'https://www.npmjs.com/package/cucumber-html-reporter.' to create a HTML report. You can pass in the file path of the json file to be created.
* `StdoutReporter` : Logs all messages from the test run to the standard output (console).
* `ProgressReporter` : Logs the progress of the test run marking each step with a scenario as either passed, skipped or failed.
* `JsonReporter` - creates a JSON file with the results of the test run which can then be used by 'https://www.npmjs.com/package/cucumber-html-reporter.' to create an HTML report. You can pass in the file path of the JSON file to be created.
You should provide at least one reporter in the configuration otherwise it'll be hard to know what is going on.
@ -449,6 +452,26 @@ Defaults to `90 seconds`
Specifies the period of time to wait for the Flutter build to complete and the app to be installed and in a state to be tested. Slower machines may need longer than the default 90 seconds to complete this process.
#### onBeforeFlutterDriverConnect
An async method that is called before any attempt by the Flutter driver to connect to the app under test.
#### onAfterFlutterDriverConnect
An async method that is called after the Flutter driver successfully connects to the app under test.
#### flutterDriverMaxConnectionAttempts
Defaults to `3`
Specifies the number of Flutter driver connection attempts to a running app before the test is aborted.
#### flutterDriverReconnectionDelay
Defaults to `2 seconds`
Specifies the amount of time to wait after a failed Flutter driver connection attempt to the running app before retrying.
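The sketch below shows one way these options could be wired together in a test configuration; the callback bodies, attempt count and delay are illustrative only, and `targetAppPath` simply mirrors the example app used elsewhere in this repository.
```
import 'package:flutter_driver/flutter_driver.dart';
import 'package:flutter_gherkin/flutter_gherkin.dart';

final config = FlutterTestConfiguration()
  ..targetAppPath = 'test_driver/app.dart'
  // Called just before the Flutter driver tries to connect to the app.
  ..onBeforeFlutterDriverConnect = () async {
    print('About to connect the Flutter driver');
  }
  // Called once a driver connection has been established.
  ..onAfterFlutterDriverConnect = (FlutterDriver driver) async {
    print('Flutter driver connected');
  }
  // Give slower emulators more attempts and a longer pause between them.
  ..flutterDriverMaxConnectionAttempts = 5
  ..flutterDriverReconnectionDelay = const Duration(seconds: 3);
```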
#### exitAfterTestRun
Defaults to `true`
@ -686,11 +709,11 @@ In most scenarios theses parameters will be enough for you to write quite advanc
Note that you can combine these well-known parameters in any step. For example `Given I {word} {int} worm(s)` would match `Given I "see" 6 worms` and also match `Given I "eat" 1 worm`
#### Pluralisation
#### Pluralization
As the aim of a feature is to convey human readable tests it is often desirable to optionally have some word pluaralised so you can use the special pluralisation syntax to do simple pluralisation of some words in your step definition. For example:
As the aim of a feature is to convey human readable tests it is often desirable to optionally have some word pluralized so you can use the special pluralization syntax to do simple pluralization of some words in your step definition. For example:
The step string `Given I see {int} worm(s)` has the pluralisation syntax on the word "worm" and thus would be matched to both `Given I see 1 worm` and `Given I see 4 worms`.
The step string `Given I see {int} worm(s)` has the pluralization syntax on the word "worm" and thus would be matched to both `Given I see 1 worm` and `Given I see 4 worms` .
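Putting the well-known parameters and the pluralization syntax together, a step definition for `Given I {word} {int} worm(s)` might look like the sketch below. It assumes the class-based step API of the `gherkin` package used by this version (a `Given2` base class, `executeStep` and a `pattern` getter); the class name and step body are made up for the example.
```
import 'dart:async';
import 'package:gherkin/gherkin.dart';

/// Matches `Given I "see" 6 worms` as well as `Given I "eat" 1 worm`;
/// the library expands {word}, {int} and the (s) plural before matching.
class WormActionStep extends Given2<String, int> {
  @override
  Future<void> executeStep(String action, int count) async {
    // Replace with real assertions or driver interactions.
    print('$action $count worm(s)');
  }

  @override
  RegExp get pattern => RegExp(r'I {word} {int} worm(s)');
}
```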
#### Custom Parameters
@ -743,7 +766,7 @@ This customer parameter would be used like this: `Given I pick the colour red`.
### Tags
Tags are a great way of organising your features and marking them with filterable information. Tags can be uses to filter the scenarios that are run. For instance you might have a set of smoke tests to run on every check-in as the full test suite is only ran once a day. You could also use an `@ignore` or `@todo` tag to ignore certain scenarios that might not be ready to run yet.
Tags are a great way of organizing your features and marking them with filterable information. Tags can be used to filter the scenarios that are run. For instance you might have a set of smoke tests to run on every check-in as the full test suite is only run once a day. You could also use an `@ignore` or `@todo` tag to ignore certain scenarios that might not be ready to run yet.
You can filter the scenarios by providing a tag expression to your configuration file. Tag expressions are simple infix expressions such as:
@ -765,7 +788,7 @@ Also see <https://docs.cucumber.io/cucumber/api/#tags>
### Languages
In order to allow features to be written in a number of languages, you can now write the keywords in languages other than English. To improve readability and flow, some languages may have more than one translation for any given keyword. See https://cucumber.io/docs/gherkin/reference/#overview for a list of supported langauges.
In order to allow features to be written in a number of languages, you can now write the keywords in languages other than English. To improve readability and flow, some languages may have more than one translation for any given keyword. See https://cucumber.io/docs/gherkin/reference/#overview for a list of supported languages.
You can set the default language of feature files in your project via the configuration setting; see [defaultLanguage](#defaultLanguage).
@ -814,10 +837,10 @@ Please note the language data is take and attributed to the cucumber project htt
A hook is a point in the execution at which custom code can be run. Hooks can be run at the points below in the test run.
- Before any tests run
- After all the tests have run
- Before each scenario
- After each scenario
* Before any tests run
* After all the tests have run
* Before each scenario
* After each scenario
Creating a hook is easy: inherit from `Hook` and override the method(s) that signify the points in the process at which you want to run code. Note that not all methods need to be overridden, just the ones for the points at which you want to run custom code.
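For example, a hook that simply logs before and after every scenario could look like the sketch below; the class name is made up, and the method signatures are assumptions based on the `Hook` base class in the `gherkin` package, so verify them against the version you have installed.
```
import 'dart:async';
import 'package:gherkin/gherkin.dart';

/// Logs a message before and after each scenario runs.
class LoggingHook extends Hook {
  /// Used to order this hook relative to other registered hooks.
  @override
  int get priority => 1;

  @override
  Future<void> onBeforeScenario(
      TestConfiguration config, String scenario) async {
    print('Starting scenario: $scenario');
  }

  @override
  Future<void> onAfterScenario(
      TestConfiguration config, String scenario) async {
    print('Finished scenario: $scenario');
  }
}
```
The hook would then be registered via the `hooks` property of the test configuration, e.g. `..hooks = [LoggingHook()]`.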
@ -887,25 +910,25 @@ Future<void> main() {
A reporter is a class that is able to report on the progress of the test run. In its simplest form it could just print messages to the console or be used to tell a build server such as TeamCity of the progress of the test run. The library has a number of built-in reporters.
- `StdoutReporter` - prints all messages from the test run to the console.
- `ProgressReporter` - prints the result of each scenario and step to the console - colours the output.
- `TestRunSummaryReporter` - prints the results and duration of the test run once the run has completed - colours the output.
- `JsonReporter` - creates a JSON file with the results of the test run which can then be used by 'https://www.npmjs.com/package/cucumber-html-reporter.' to create a HTML report. You can pass in the file path of the json file to be created.
- `FlutterDriverReporter` - prints the output from Flutter Driver. Flutter driver logs all messages to the stderr stream by default so most CI servers would mark the process as failed if anything is logged to the stderr stream (even if the Flutter driver logs are only info messages). This reporter ensures the log messages are output to the most appropiate stream depending on their log level.
* `StdoutReporter` - prints all messages from the test run to the console.
* `ProgressReporter` - prints the result of each scenario and step to the console - colours the output.
* `TestRunSummaryReporter` - prints the results and duration of the test run once the run has completed - colours the output.
* `JsonReporter` - creates a JSON file with the results of the test run which can then be used by 'https://www.npmjs.com/package/cucumber-html-reporter.' to create an HTML report. You can pass in the file path of the JSON file to be created.
* `FlutterDriverReporter` - prints the output from Flutter Driver. Flutter driver logs all messages to the stderr stream by default so most CI servers would mark the process as failed if anything is logged to the stderr stream (even if the Flutter driver logs are only info messages). This reporter ensures the log messages are output to the most appropriate stream depending on their log level.
You can create your own custom reporter by inheriting from the base `Reporter` class and overriding one or many of its methods to direct the output messages. The `Reporter` defines the following methods that can be overridden. All methods must return a `Future<void>` and can be async.
- `onTestRunStarted`
- `onTestRunFinished`
- `onFeatureStarted`
- `onFeatureFinished`
- `onScenarioStarted`
- `onScenarioFinished`
- `onStepStarted`
- `onStepFinished`
- `onException`
- `message`
- `dispose`
* `onTestRunStarted`
* `onTestRunFinished`
* `onFeatureStarted`
* `onFeatureFinished`
* `onScenarioStarted`
* `onScenarioFinished`
* `onStepStarted`
* `onStepFinished`
* `onException`
* `message`
* `dispose`
Once you have created your custom reporter, don't forget to add it to the `reporters` configuration property.
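As a sketch, a reporter that only reports when the run starts and finishes could look like the following; only the methods you care about need to be overridden. The zero-argument signatures of `onTestRunStarted` and `onTestRunFinished` follow the method list above, but confirm them against your installed version of the `gherkin` package.
```
import 'dart:async';
import 'package:gherkin/gherkin.dart';

/// Logs coarse progress markers; override more of the methods as required.
class SimpleConsoleReporter extends Reporter {
  @override
  Future<void> onTestRunStarted() async {
    print('Test run started');
  }

  @override
  Future<void> onTestRunFinished() async {
    print('Test run finished');
  }
}
```
It would then be added alongside the built-in reporters, e.g. `..reporters = [ProgressReporter(), SimpleConsoleReporter()]`.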
@ -931,7 +954,7 @@ For convenience the library defines a number of pre-defined steps so you can get
| I fill the {string} field with {string} | Fills the element with the provided key with the given value (given by the second input parameter) | `When I fill the "email" field with "someone@gmail.com"` |
| I expect the {string} to be {string} | Asserts that the element with the given key has the given string value | `Then I expect the "cost" to be "£10.95"` |
| I (open\|close) the drawer | Opens or closes the application default drawer | `When I open the drawer` , `And I close the drawer` |
| I expect the [button\|element\|label\|icon\|field\|text\|widget] {string} to be present within {int} second(s) | Expects a widget with the given key to be present within n secondss | `Then I expect the widget 'notification' to be present within 10 seconds`, `Then I expect the icon 'notification' to be present within 1 second` |
| I expect the [button\|element\|label\|icon\|field\|text\|widget] {string} to be present within {int} second(s) | Expects a widget with the given key to be present within n seconds | `Then I expect the widget 'notification' to be present within 10 seconds` , `Then I expect the icon 'notification' to be present within 1 second` |
| I pause for {int} seconds | Pauses the test execution for the given seconds. Only use in debug scenarios or to inspect the state of the app | `Then I pause for 20 seconds` |
| I restart the app | Restarts the app under test | `Then I restart the app` |
| I tap the back button | Taps the page default back button widget | `Then I tap the back button` |


@ -36,7 +36,7 @@ Future<void> main() {
..targetAppPath = 'test_driver/app.dart'
// ..buildFlavor = "staging" // uncomment when using build flavor and check android/ios flavor setup see android file android\app\build.gradle
// ..targetDeviceId = "all" // uncomment to run tests on all connected devices or set specific device target id
// ..tagExpression = "@smoke" // uncomment to see an example of running scenarios based on tag expressions
// ..tagExpression = '@smoke' // uncomment to see an example of running scenarios based on tag expressions
// ..logFlutterProcessOutput = true // uncomment to see command invoked to start the flutter test app
// ..verboseFlutterProcessLogs = true // uncomment to see the verbose output from the Flutter process
// ..flutterBuildTimeout = Duration(minutes: 3) // uncomment to change the default period that flutter is expected to build and start the app within

File diff suppressed because one or more lines are too long


@ -1,3 +1,4 @@
import 'dart:async';
import 'dart:io';
import 'package:flutter_gherkin/flutter_gherkin.dart';
import 'package:flutter_gherkin/src/flutter/flutter_world.dart';
@ -71,13 +72,42 @@ class FlutterTestConfiguration extends TestConfiguration {
/// You will have to add the `--verbose` flag to the command to start your flutter app to see this output and ensure `enableFlutterDriverExtension()` is called by the running app
String runningAppProtocolEndpointUri;
/// Called before any attempt to connect the Flutter driver to the running application. Depending on your configuration this
/// method will be called before each scenario is run.
Future<void> Function() onBeforeFlutterDriverConnect;
/// Called after the successful connection of Flutter driver to the running application. Depending on your configuration this
/// method will be called on each new connection, usually before each scenario is run.
Future<void> Function(FlutterDriver driver) onAfterFlutterDriverConnect;
void setObservatoryDebuggerUri(String uri) => _observatoryDebuggerUri = uri;
Future<FlutterDriver> createFlutterDriver([String dartVmServiceUrl]) async {
final completer = Completer<FlutterDriver>();
dartVmServiceUrl = (dartVmServiceUrl ?? _observatoryDebuggerUri) ??
Platform.environment['VM_SERVICE_URL'];
return await _attemptDriverConnection(dartVmServiceUrl, 1, 3);
await runZonedGuarded(
() async {
if (onBeforeFlutterDriverConnect != null) {
await onBeforeFlutterDriverConnect();
}
final driver = await _attemptDriverConnection(dartVmServiceUrl, 1, 3);
if (onAfterFlutterDriverConnect != null) {
await onAfterFlutterDriverConnect(driver);
}
completer.complete(driver);
},
(Object e, StackTrace st) {
if (e is DriverError) {
completer.completeError(e, st);
}
},
);
return completer.future;
}
Future<FlutterWorld> createFlutterWorld(
@ -93,6 +123,7 @@ class FlutterTestConfiguration extends TestConfiguration {
? flutterConfig.runningAppProtocolEndpointUri
: null,
);
world.setFlutterDriver(driver);
return world;
@ -122,7 +153,7 @@ class FlutterTestConfiguration extends TestConfiguration {
WhenPauseStep(),
WhenFillFieldStep(),
ThenExpectWidgetToBePresent(),
RestartAppStep()
RestartAppStep(),
]);
}
@ -131,15 +162,16 @@ class FlutterTestConfiguration extends TestConfiguration {
int attempt,
int maxAttempts,
) async {
try {
return await FlutterDriver.connect(
dartVmServiceUrl: dartVmServiceUrl,
);
} catch (e) {
).catchError(
(e, st) async {
if (attempt > maxAttempts) {
rethrow;
throw e;
} else {
print(e);
print(
'Flutter driver error connecting to application at `$dartVmServiceUrl`, retrying after delay of $flutterDriverReconnectionDelay',
);
await Future<void>.delayed(flutterDriverReconnectionDelay);
return _attemptDriverConnection(
@ -148,7 +180,8 @@ class FlutterTestConfiguration extends TestConfiguration {
maxAttempts,
);
}
}
},
);
}
void _ensureCorrectConfiguration() {


@ -118,7 +118,7 @@ packages:
name: gherkin
url: "https://pub.dartlang.org"
source: hosted
version: "1.1.8"
version: "1.1.8+1"
glob:
dependency: "direct main"
description:


@ -1,6 +1,6 @@
name: flutter_gherkin
description: A Gherkin / Cucumber parser and test runner for Dart and Flutter
version: 1.1.8+1
version: 1.1.8+2
homepage: https://github.com/jonsamwell/flutter_gherkin
environment:
@ -16,7 +16,7 @@ dependencies:
sdk: flutter
glob: ^1.1.7
meta: ">=1.1.6 <2.0.0"
gherkin: ^1.1.8
gherkin: ^1.1.8+1
# gherkin:
# path: ../dart_gherkin