Cloud and Software Architecture, Soft skills, IOT and embedded
BDD with SpecFlow - Features Scenarios Steps and Contexts
SpecFlow is a tool that bridges the gap between business-level behavior
specification and the technical implementation of automated testing. SpecFlow
is an acceptance-criteria definition and testing tool that makes it easier to
integrate Behavior Driven specifications into software projects earlier, in a
shift-left fashion.
Business value is defined and modeled as Business Features. Those features are
built from more granular components, often called User Stories. The
Feature and User Stories contain sets of Acceptance Criteria that represent
the target state for the Feature.
The Feature is the top-level construct in SpecFlow/BDD. Gherkin Business
Features are made up of a set of Scenarios. Each Scenario
represents one or more acceptance criteria. Each Scenario is validated
as a single automated unit test or as a parameter-driven unit test with a list of
parameter sets. One or more Features and their associated Scenarios
define the criteria needed to verify that some business functionality
implements the desired business behavior.
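As a concrete sketch, a hypothetical Feature file might pair a plain Scenario with a parameter-driven Scenario Outline. The feature name and the second search engine row are illustrative assumptions; the search steps follow the Bing example used later in this post.

```gherkin
Feature: Internet Search
  Search engines return links to trademark holder sites

  Scenario: Search with Bing
    Given I search the internet using site "bing"
    When I use the term "facebook"
    Then There should be at least 1 trademark holder site link "facebook.com"

  Scenario Outline: Search with multiple engines
    Given I search the internet using site "<engine>"
    When I use the term "<term>"
    Then There should be at least 1 trademark holder site link "<site>"

    Examples:
      | engine | term     | site         |
      | bing   | facebook | facebook.com |
      | google | facebook | facebook.com |
```

SpecFlow generates one unit test for the plain Scenario and one parameter-driven test, with a parameter set per Examples row, for the Scenario Outline.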
Example Walkthrough
This Feature contains three Scenarios. The Feature is described in a SpecFlow
feature file. The Feature and its Scenarios are materialized into unit
tests via SpecFlow's code-behind generated .cs file for the Feature. Scenario
results are verified every time the unit test set executes a test run.
Features and Scenarios
Features represent some business outcome or value stream. Scenarios act
as the acceptance criteria for the implementation of that business outcome.
Features should contain as many Scenarios as required to describe the desired
business behavior. This includes positive and negative outcomes.
Some teams break the Functional Behavior and the Non-Functional Behavior
into separate Features, but the end result is the same: each Feature contains
enough Scenarios to define the behavior in enough detail to agree that the
requirements have been met when the Scenarios (BDD tests) pass.
SpecFlow implements Features via Gherkin-syntax Feature files. SpecFlow
generates code-behind .cs files that provide unit test scaffolding, setting up
each individual Scenario as its own unit test. Features contain
Scenarios. Each Feature file results in a set of unit tests, one for each Scenario
in the Feature.
A Scenario in a Feature is written something like:

Scenario: Example - Search with Bing
  Given I search the internet using site "bing"
  When I use the term "facebook"
  Then There should be at least 1 trademark holder site link "facebook.com"
Scenarios and Steps
Scenarios are made up of multiple steps. SpecFlow, and Cucumber, use
Gherkin Given/When/Then syntax for this specification. This is
similar to other Arrange, Act, Assert frameworks. Developers
implement the steps with the appropriate assertions around the acceptance
criteria. Each Given/When/Then clause is implemented as an atomic
step. Steps are essentially global functions, visible to any test that wishes
to use them. Steps are grouped into step definition files whose
organization does not affect where the steps can be used. A Scenario (test) can
mix and match previously created steps with new steps created just for that
scenario.
The code-generated Scenario unit tests in the Feature file call each Scenario
step in turn, in the order specified by the Gherkin Scenario. SpecFlow, and
Cucumber, implement Steps as standalone functions. Steps are stateless.
Steps operate against input parameters and context. They store state in
the context so that it can be used by later steps.
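The pattern above can be sketched as a SpecFlow step definition class. This is a minimal sketch, not a full implementation: the steps match the Bing search example from this post, the context keys and the `RunSearch` helper are hypothetical, and the class only compiles inside a real SpecFlow/NUnit project.

```csharp
using System;
using System.Collections.Generic;
using NUnit.Framework;
using TechTalk.SpecFlow;

[Binding]
public class SearchSteps
{
    private readonly ScenarioContext _scenarioContext;

    // SpecFlow injects the current ScenarioContext via the constructor
    public SearchSteps(ScenarioContext scenarioContext)
    {
        _scenarioContext = scenarioContext;
    }

    [Given(@"I search the internet using site ""(.*)""")]
    public void GivenISearchTheInternetUsingSite(string engine)
    {
        // The step is stateless; the chosen engine is stored in the
        // context so that later steps can use it
        _scenarioContext["engine"] = engine;
    }

    [When(@"I use the term ""(.*)""")]
    public void WhenIUseTheTerm(string term)
    {
        _scenarioContext["results"] =
            RunSearch((string)_scenarioContext["engine"], term);
    }

    [Then(@"There should be at least 1 trademark holder site link ""(.*)""")]
    public void ThenThereShouldBeATrademarkHolderSiteLink(string site)
    {
        var results = (IEnumerable<string>)_scenarioContext["results"];
        Assert.That(results, Has.Some.Contains(site));
    }

    // Hypothetical helper; a real implementation would drive a browser or API
    private IEnumerable<string> RunSearch(string engine, string term)
        => throw new NotImplementedException();
}
```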
Scenario Steps and Contexts
Steps are grouped in Step Definition Files. Each step is an
independent function that is globally visible across all BDD
tests. Steps can be reused across an unlimited number of Scenarios. Step
Definition classes are instantiated for each individual test, making their
Step functions visible inside that test. If a Scenario contains
steps from multiple Step Definition Files, then all of the relevant Step
Definition Files are instantiated to bring those Steps into context.
Scenario Steps operate, at run time, within a Scenario Context or scope.
The scope can contain data such as the results of a previous step, intermediate
values, calculated values, security credentials, configuration information, or
other data. Steps receive or access that context via invocation parameters
or dependency-injected scope objects. SpecFlow creates the context objects
referenced in Step Definition File constructors and injects those same
instantiated objects across all the Step Definition Files that contain Steps
referenced in a Scenario. This means the same context is available to
Scenario Steps no matter which Step Definition File they are implemented in, as
long as the same data type is injected via the Step Definition
constructor.
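A minimal sketch of that injection mechanism, assuming a hypothetical `SearchContext` class and two step definition files; SpecFlow's built-in container creates one context instance per scenario and hands the same instance to both constructors.

```csharp
using TechTalk.SpecFlow;

// Plain context class; SpecFlow instantiates one per scenario and
// shares it with every step definition class that requests it
public class SearchContext
{
    public string Engine { get; set; }
    public string Term { get; set; }
}

[Binding]
public class EngineSteps
{
    private readonly SearchContext _context;

    public EngineSteps(SearchContext context) => _context = context;

    [Given(@"I search the internet using site ""(.*)""")]
    public void GivenISearchTheInternetUsingSite(string engine)
        => _context.Engine = engine;
}

[Binding]
public class TermSteps
{
    private readonly SearchContext _context;

    // Receives the same SearchContext instance as EngineSteps
    // for the duration of one scenario
    public TermSteps(SearchContext context) => _context = context;

    [When(@"I use the term ""(.*)""")]
    public void WhenIUseTheTerm(string term) => _context.Term = term;
}
```

Because both classes depend on the same injected type, a Scenario can freely mix steps from both files while operating on a single shared state object.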
Context design and standardization is an ongoing concern and a source of
refactoring work as BDD testing projects grow and mature.
Step Visibility
Step Definitions are global by default. All of the Step Definitions in
all of the Step Definition Files in a project can be used in any of the
Scenarios. There are some design considerations in that not all Steps
may accept or expect context state in the same form or location. Step
re-use considerations can drive refactoring efforts across BDD testing
projects.
In some cases, it is better to assume that Steps are global but are normally
used in pools of steps for certain types of scenario interactions. In
that case, the Step Definition Files contain groups of related Steps rather
than the Steps for one specific Scenario.
Step Visibility and reusability design is an ongoing concern in large BDD
projects.
Execution
The IDE or CI/CD process runs the tests using the standard test runner. The
code-behind .cs file for a Feature is recognized as a unit test file by the
runner (NUnit/xUnit) that you are using. Each Scenario is modeled as its
own test. The standard test runner invokes the unit tests via the code-behind.
The Feature code-behind file loads all the step definition files, bringing them
into scope, and then runs each Scenario in turn. Each Scenario is actually
a unit test from the runner's point of view, made up of calls to the
appropriate Given/When/Then steps. Test failures are
bubbled up in the usual fashion for that test library.
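Assuming a standard SpecFlow project wired to NUnit or xUnit, the generated Scenario tests run with the usual .NET test tooling; the filter name below is illustrative, matching whatever class name SpecFlow generates for your feature.

```shell
# Run every generated Scenario test in the project
dotnet test

# Run only the Scenarios from one (hypothetical) generated feature class
dotnet test --filter "FullyQualifiedName~InternetSearchFeature"
```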