Frequently Asked Questions
- How can I make sure every requirement gets verified on my product?
- How can I put together a set of specification documents?
- What should I do when a specification gets modified?
- Do I need to re-test when a new specification comes out?
- How can I avoid losing important specification documents?
- Why is test repeatability so important?
- How do you test things that cannot be automated?
- How do you label requirements and testcases?
- How do you create requirements?
- What do I need to show to an auditor?
- How do you produce a certification report for an auditor?
- How do you keep track of what has been done versus what is still unproven?
- How do you deal with changing requirements?
- How do you manage multiple specification documents?
- What is the meaning of red-green?
You can highlight your specifications so that all the text is identified as a Shall, Should, May or Not-To-Test area. Then it is a matter of having every highlighted requirement validated by at least one testcase (with a validating script or a validating e-form). Derogations can also be used instead, for specific conditions.
Coverage can easily be assessed through the dashboard.
Integrator lets you simply upload them to your project. They are then vaulted and revision controlled for the duration of your project.
The important part to remember is that, by providing a pdf version of all your specifications in Integrator, you will be able to extract the requirements from the uploaded specification and bring them into Integrator.
A few benefits include:
- The complete text of your specifications will be available in Integrator for searches.
- Synchronisation between the document and Integrator will highlight the document with the latest test status directly in place. This is a very compelling feature to demonstrate your quality process to an auditor or a customer.
Integrator has an upgrade feature where the new spec is inserted to replace an older version. This process is meant to save you time by updating only the modified requirements.
Whenever a part of a requirement chain is modified, the status will clearly indicate that some areas need to be re-tested. So, the simple answer is: yes, you need to re-test. However, Integrator will let you see clearly which areas are affected and will help you put the resources where they have the most impact on the project status.
By uploading them to Integrator, you automatically benefit from all the features of a secure vault.
Test repeatability is important for many reasons, the most important being that it establishes trust in your test system. Repeatability gives statistical power to your test results: your confidence level rises as the same stimuli, applied to a system in given initial conditions, give the same expected results.
There are many factors driving the repeatability of tests:
- Automation, as it can be used to minimize variability in the measurement process itself.
- Configuration control, as it documents the initial configuration of your test and helps identify minute discrepancies when measurements or results start to shift.
- Simplicity, since a short test that measures a single element is easier to write, debug and test (yes, tests should be debugged! Search for red-green for more information on this).
Test repeatability is a key driving factor behind all agile, test driven or extreme development methods.
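As a toy illustration of the point above (hypothetical code, not part of Integrator), a repeatable test is one where the same stimuli under the same initial conditions yield identical results on every run:

```python
# Toy illustration of test repeatability (hypothetical; not Integrator code).
# A measurement whose initial conditions are fully controlled (here, a fixed
# seed) must return the same result on every repetition.
import random

def measure(seed: int) -> float:
    """Simulated measurement; its only source of variability is 'seed'."""
    rng = random.Random(seed)  # fixed initial conditions
    return round(sum(rng.random() for _ in range(10)), 6)

# Five repetitions with the same setup should agree exactly.
runs = [measure(seed=42) for _ in range(5)]
repeatable = len(set(runs)) == 1
print("repeatable:", repeatable)
```

If a real measurement started to drift between runs, configuration control is what lets you spot which initial condition changed.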
Certain tasks are more difficult to automate. They might involve manual intervention, could be too costly, or would simply be unreliable when performed automatically. We have created e-forms that can be used to request specific information from a human in a structured way.
These data entry e-forms are fully editable and use the same primitives as an automated script within a test harness. This means that the data gathered using these e-forms is connected to Integrator seamlessly.
Moreover, you can assemble e-forms into a questionnaire that can very easily be re-used in other projects, letting you create a library of standard user interactions.
So, even for manual data entry, you benefit from a formal level of data validation and formatting. This easy-to-deploy approach gets you all the benefits of structured data entry while greatly minimizing trivial errors.
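To make the idea of structured data entry concrete, here is a minimal sketch (hypothetical field names and rules, not Integrator's actual e-form API) of how declaring a type and a constraint per field catches trivial entry errors before the data is recorded:

```python
# Hypothetical sketch of validated data entry, in the spirit of e-forms.
# Each field declares its expected type and a simple constraint.
FIELDS = {
    "serial_number": (str, lambda v: v.startswith("SN-")),
    "voltage_v":     (float, lambda v: 0.0 <= v <= 48.0),
    "operator":      (str, lambda v: len(v) > 0),
}

def validate(entry: dict) -> list[str]:
    """Return a list of human-readable errors; an empty list means valid."""
    errors = []
    for name, (typ, check) in FIELDS.items():
        if name not in entry:
            errors.append(f"missing field: {name}")
        elif not isinstance(entry[name], typ) or not check(entry[name]):
            errors.append(f"invalid value for {name}: {entry[name]!r}")
    return errors

print(validate({"serial_number": "SN-001", "voltage_v": 12.5, "operator": "amy"}))
print(validate({"serial_number": "001", "voltage_v": 99.0}))
```

The first entry passes cleanly; the second is rejected for a malformed serial number, an out-of-range voltage and a missing operator.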
Requirement and Testcase numbering schemes were once a crucial traceability issue, especially with older tools (spreadsheets, custom or in-house systems) that did not offer real referential integrity. Requirement numbers and Testcase numbers were then used to synchronize entire teams, so they had to be unique, consistent and revision controlled.
Since Integrator has complete referential integrity (who did what, when and why), the actual naming convention for Requirements and Testcases is less of a critical point. The point is, by looking at the latest data from the web interface, it is clear what is connected together.
In fact, we offer automatic renumbering so that you can line up Requirements and Testcases in the order they would appear in a spec, which is much more convenient as the requirement base grows. It also simplifies the incorporation of requirements from various sources, which would naturally come with different numbering schemes. You just pick the scheme you like and adapt as you discover better solutions. Everybody stays on the same page all the time.
We usually like a short mnemonic with a counter. For example R_ELECTR_xxxx would generate electrical requirements from 0000 to 9999. For folks who absolutely want a specific revision identification directly in a requirement, something like R_ELECTR_xxxx.yy would do the trick.
Again, the point is if you later change your mind, it is easy to renumber everything and still keep everybody using the right information.
Requirements are largely taken from reference text in specification documents selected for a project.
So, Integrator lets you highlight official text in the pdf version of your specification and will keep this highlighted text as a reference throughout your project.
Two important benefits emerge from this approach:
- You can comfortably highlight requirements directly in the specification document, right from within Acrobat(™) Standard*.
- Since the highlighted portion of the specification remains active, we can keep the color of the highlight synchronized with the status of the requirement: your specification will be painted in the colors of your project's status, for instant visual feedback!
*Adobe Acrobat, Acrobat Standard are trademarks of Adobe.
In a nutshell, an auditor wants to assess if you said what you did and if you did what you said.
Well, it might be a bit more complex in certain situations, but it mostly boils down to this most of the time. Of course, each audit situation requires analysis and preparation, and far be it from us to take this subject lightly.
So, since Integrator is a requirement driven testing framework, we suggest you put the audit section of your certification right into your project, highlighting each audit requirement as if it were any other requirement of your project (because that is exactly what it is, after all!). You can use tags to label these requirements with something like "Certification", "AuditReq" or the like.
From that point, you have a clear, impossible-to-miss checklist of things to perform to get certified, and a great section to print or show to an auditor. Moreover, you can record your intent for each of these audit requirements.
On top of that, by using Integrator for the project, you will have collected:
- A definition of the project (using the description fields: what does it apply to, how long is it expected to take, who works on this, what is the desired outcome, who are the end customers).
- A list of all the reference documents you have used (the specifications with their provenance and versions used).
- A list of all the requirements you annotated directly in the spec, with the reference text.
- A list of all the reference text that you explicitly will not test (e.g. table of contents, annexes).
- A list of all the test cases you developed to prove all the requirements.
- A list of any derogation you might have put in place.
- A complete set of test results, with the configurations used (serial numbers, versions of everything).
- A list of test equipment used for the project with their pedigree.
We are not aware of any certification entity that would not consider these as important items to collect!
Producing a complete certification report for an audit is simply a matter of clicking on 'produce report' for the selected project. You can print the report as an html document or as a pdf file.
Integrator computes a certification status in real time. Your responsibility is to provide the specification documents and highlight what is important for you in them. From that point in your project, you get instant coverage status.
Performing tests to validate these requirements will have Integrator track the status of every requirement in real time. The tree-view and the dashboard display very valuable information for a quick assessment of where you are in your project.
There are two ways of providing test results:
Scripts are grouped into a Test Harness that knows how to collect results and synchronize them with your Integrator server. The test harness runs on any machine running Perl (Linux, Windows, Mac) and integrates with a comprehensive library of tests (Test::More) that has been used by tens of thousands of people for more than 20 years.
E-forms (electronic forms) can be created, edited and filled with data right from the Integrator web interface to let you, or people you select, gather and submit data in a structured way when things are more difficult to automate completely.
Integrator has been designed to let you easily cope with change at all levels. A requirement change can have a drastic impact on a project. It is thus important to first assess the impact on the project, then start modifying the requirement chain to adapt to the new reality. Perhaps some tests need their parameters modified? Maybe some product areas will need more specification details? You can easily answer these questions while looking at the most up-to-date information directly in your project within Integrator.
A key element in change management is the necessity to keep everybody on the same page. In theory this is trivial but, in practice, this is often the area that is the most challenging. Questions like:
- What are the new specs?
- Who proposed the modification?
- Which units have been re-tested so far?
can be answered in a few simple mouse clicks. If you have been through this dance before, you know how difficult it is to keep everything together unless you have a tool in place to assist you with referential integrity.
Integrator is very easy to update and the incentives for keeping Integrator up-to-date are far greater than the perceived burden of performing the update. This is a major advantage in favor of Integrator when dealing with change.
In fact, you can directly edit a requirement, follow the cascading changes down to test cases, and spot all the units that need to be re-tested. Using Integrator to assess the impact on the status of your project is trivial with the data drill-down in the tree-view.
Integrator also features a specification upgrade mechanism that helps you deal with modifications within specification documents themselves. This feature is there to help you replace an old specification with a completely revised one and, yet, save time by re-using all unmodified requirement statements.
In Integrator, you don't have to manage multiple specifications. You simply add them to your project. They are vaulted and revision controlled from the moment you upload them and you can focus on requirement highlighting and testing right from the start.
Requirements always remember from which document they have been extracted, so you don't have to be bothered with this aspect either, it is all internal and automatic. This is a great time saver if you ever had to manage documentation for a complex project (either manually or even with a version control software).
In a general test context, red-green means that a test condition has been cycled at least once from Fail to Pass. Red stands for Fail and Green for Pass, following the traffic light metaphor.
Some design philosophies (TDD, Test Driven Development) refer to it directly in the short expression "red, green, refactor". The mantra goes like this: when designing a piece of software, first write a test that will fail, hence start with "red". Then write some minimal (some even say canonical) piece of code that makes this test pass without breaking any other test: this is the "green" part. Then "refactor" your code according to general software principles to clean it up. This simple process is repeated until the tests written cover the entire desired design space. In essence, the test suite becomes a specification in itself when no better document describes what was intended to be built in the first place.
This process is very well adapted to unit test development, as it provides a surefire way to build a set of unit tests that are individually proven.
When a specification with testable requirements is available in Integrator, red-green-refactor becomes a very effective way of implementing automated testing. Try it!
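As a minimal illustration of one red-green cycle (sketched in Python for brevity; the Integrator harness itself uses Perl's Test::More, and the function under test here is our own example):

```python
# Minimal red-green illustration (hypothetical example code).
# Step 1 ("red"): write the test first; with no implementation, it fails.
# Step 2 ("green"): write the minimal code that makes it pass.

def run_test(impl) -> bool:
    """The test: the implementation must map 0 C -> 32 F and 100 C -> 212 F."""
    try:
        return impl(0) == 32 and impl(100) == 212
    except Exception:
        return False

def not_implemented(celsius):
    """Stand-in for the state before any code exists."""
    raise NotImplementedError

def celsius_to_fahrenheit(celsius):
    """Minimal code that satisfies the test."""
    return celsius * 9 / 5 + 32

print("red:", run_test(not_implemented))          # the test fails first
print("green:", run_test(celsius_to_fahrenheit))  # then it passes
```

The "refactor" step would then clean up the passing code while re-running the same test to prove nothing broke, which is exactly the Fail-to-Pass cycling that red-green status tracking captures.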